fs.watch fired twice when I change the watched file

#11
## My custom solution
I personally like using `return` to stop a block of code from running when a check fails, so here is my method:

```javascript
let watching = false;
fs.watch('./file.txt', () => {
    if (watching) return;
    watching = true;

    // do something

    // The timeout prevents the handler from running twice for short functions;
    // the delay can be made longer to disable the handler for a set time.
    setTimeout(() => {
        watching = false;
    }, 100);
});
```

Feel free to use this example to simplify your code. It may **NOT** be better than using a module, but it works pretty well!

#12
I was downloading a file with Puppeteer, and once the file was saved I sent an automatic email. Due to the problem above, I noticed I was sending 2 emails. I solved it by stopping my application with `process.exit()` and auto-restarting it with pm2. Using flags in the code didn't save me.

If anyone has this problem in the future, this solution can be used as well: exit the program and restart it automatically with a monitoring tool.
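For reference, a minimal sketch of that setup, assuming pm2 with an ecosystem file (the app name, script name, and delay here are placeholders, not from the post):

```javascript
// ecosystem.config.js — hypothetical pm2 config for the exit-and-restart approach.
module.exports = {
    apps: [{
        name: 'watcher',        // placeholder process name
        script: 'app.js',       // placeholder entry point
        autorestart: true,      // restart whenever the process exits (pm2 default)
        restart_delay: 1000,    // wait 1 s before restarting, so the duplicate
                                // fs.watch event fires while the process is down
    }],
};
```

Start it with `pm2 start ecosystem.config.js`; each `process.exit()` then triggers a clean restart.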

#13
Here's my simple solution. It works well every time.

```javascript
// Keep obj in sync as the file updates
let obj = JSON.parse(fs.readFileSync('./file.json', 'utf-8'));
fs.watch('./file.json', () => {
    const data = JSON.parse(fs.readFileSync('./file.json', 'utf-8') || '{}');
    if (Object.entries(data).length > 0) { // This checks that fs.watch() isn't false-firing
        obj = data;
        console.log('File actually changed:', obj);
    }
});
```

#14
I came across the same issue. If you don't want the handler to trigger multiple times, you can use a **debounce** function (here from lodash).
```js
const _ = require('lodash');

fs.watch('example.xml', _.debounce(function (curr, prev) {
    // on file change we can read the new xml
    fs.readFile('example.xml', 'utf8', function (err, data) {
        if (err) throw err;
        console.dir(data);
        console.log('Done');
    });
}, 100));
```

#15
## Debouncing The Observer
A solution I arrived at has two parts: (a) a workaround for the problem in question, and (b) a way to ensure that multiple rapid `Ctrl+S` presses do not cause race conditions. Here's what I have...

##### `./**/utilities.js` (somewhere)
```
export default {
    ...
    debounce(fn, delay) { // #thxRemySharp
        let timer = null;

        return function execute(...args) {
            const context = this;
            clearTimeout(timer);
            timer = setTimeout(fn.bind(context, ...args), delay);
        };
    },
    ...
};
```

##### `./**/file.js` (elsewhere)
```
import utilities from './**/utilities.js'; // somewhere
...
function watch(server) {
    const debounced = utilities.debounce(observeFilesystem.bind(this, server), 1000 * 0.25);
    const observers = new Set()
        .add(fs.watch('./src', debounced))
        .add(fs.watch('./index.html', debounced));
    console.log(`watching... (${observers.size})`);

    return observers;
}

function observeFilesystem(server, type, filename) {
    if (!filename) console.warn(`Transfer Dev Server: filesystem observation made without filename for type ${type}`);
    console.log(`Filesystem event occurred:`, type, filename);
    server.close(handleClose);
}
...
```

This way, the observation handler that we pass into `fs.watch` is (in this case) a bound function that gets _debounced_ if multiple calls are made less than `1000 * 0.25` milliseconds (250 ms) apart from one another.

It may be worth noting that I have also devised a pipeline of `Promise`s to help avoid other types of race conditions, as the code also leverages other callbacks. Please also note the attribution to Remy Sharp, whose debounce function has repeatedly proven very useful over the years.
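The Promise pipeline itself isn't shown in the post, but as a rough sketch (the `serialize` helper is my own illustrative name, not part of the post's code), one way to keep async handlers from overlapping is to chain each run onto the previous one:

```javascript
// Sketch only: queue each handler run behind the previous one, so that rapid
// saves never execute the async work concurrently.
function serialize(fn) {
    let queue = Promise.resolve();
    return (...args) => {
        queue = queue.then(() => fn(...args)).catch(console.error);
        return queue;
    };
}

// Usage: wrap the (possibly debounced) fs.watch handler, e.g.
// fs.watch('./src', serialize(async (type, filename) => { /* rebuild */ }));
```

Each call appends to the tail of the chain, so even if a handler is invoked again before the previous run's async work finishes, the runs execute strictly one after another.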

#16
```javascript
const watcher = fs.watch('example.xml', function (curr, prev) {
    watcher.close(); // stop watching before reading, so the event can't fire again
    fs.readFile('example.xml', 'utf8', function (err, data) {
        if (err) throw err;
        console.dir(data);
        console.log('Done');
    });
});
```

I had a similar problem, but I was also reading the file in the callback, which caused a loop. Closing the watcher with `watcher.close()` before reading solves this.

#17
Node.js does not fire multiple events for a single change; it is the editor you are using that updates the file multiple times.

Editors use the stream API for efficiency: they read and write data in chunks, which causes multiple updates depending on the chunk size and the amount of content. Here is a snippet to test whether `fs.watch` fires multiple events:

```javascript
const http = require('http');
const fs = require('fs');
const path = require('path');

const host = 'localhost';
const port = 3000;

const file = path.join(__dirname, 'config.json');

const requestListener = function (req, res) {
    const data = new Date().toString();
    fs.writeFileSync(file, data, { encoding: 'utf-8' });
    res.end(data);
};

const server = http.createServer(requestListener);

server.listen(port, host, () => {
    fs.watch(file, (eventType, filename) => {
        console.log({ eventType });
    });
    console.log(`Server is running on http://${host}:${port}`);
});
```

I believe a simple solution would be to check the last-modified timestamp:

```javascript
const { stat } = require('fs/promises');

let lastModified;

fs.watch(file, (eventType, filename) => {
    stat(file).then(({ mtimeMs }) => {
        if (lastModified !== mtimeMs) {
            lastModified = mtimeMs;
            console.log({ eventType, filename });
        }
    });
});
```

Please note that you need to use all-sync or all-async methods; otherwise you will have issues. Update the file in an editor and you will see that only a single event is logged:

```javascript
const http = require('http');
const fs = require('fs');
const path = require('path');

const host = 'localhost';
const port = 3000;
const file = path.join(__dirname, 'config.json');

let lastModified;
const requestListener = function (req, res) {
    const data = Date.now().toString();

    fs.writeFileSync(file, data, { encoding: 'utf-8' });
    lastModified = fs.statSync(file).mtimeMs;

    res.end(data);
};

const server = http.createServer(requestListener);

server.listen(port, host, () => {
    fs.watch(file, (eventType, filename) => {
        const mtimeMs = fs.statSync(file).mtimeMs;
        if (lastModified !== mtimeMs) {
            lastModified = mtimeMs;
            console.log({ eventType });
        }
    });

    console.log(`Server is running on http://${host}:${port}`);
});
```

A few notes on the alternative solutions: storing file contents for comparison is memory-inefficient, especially with large files; taking file hashes is expensive; custom flags are hard to keep track of, especially if you want to detect changes made by other applications; and lastly, unsubscribing and re-subscribing requires unnecessary juggling.

If you don't need an instant result, you can use `setTimeout` to debounce successive events:

```javascript
let timeoutId;
fs.watch(file, (eventType, filename) => {
    clearTimeout(timeoutId);
    timeoutId = setTimeout(() => {
        console.log({ eventType });
    }, 100);
});
```

#18
I debounce this with a `setTimeout`:

```javascript
let fsTimeout;

fs.watch('file.js', function (e) {
    if (!fsTimeout) {
        console.log('file.js %s event', e);
        fsTimeout = setTimeout(function () { fsTimeout = null; }, 5000); // give 5 seconds for multiple events
    }
});
```


