javascript - Solving the Node error: EMFILE, too many open files
For some days I have searched for a working solution to an error
Error: EMFILE, too many open files
It seems that many people have the same problem. The usual answer involves increasing the number of file descriptors. So, I've tried this:
sysctl -w kern.maxfiles=20480
The default value is 10240. This is a little strange in my eyes, because the number of files I'm handling in the directory is under 10240. Even stranger, I still receive the same error after I've increased the number of file descriptors.
Second question:
After a number of searches I found a workaround for the "too many open files" problem:
var fs = require('fs');
var FS = fs; // the snippet below uses both spellings
var requestBatches = {};

function batchingReadFile(filename, callback) {
    // First check to see if there is already a batch
    if (requestBatches.hasOwnProperty(filename)) {
        requestBatches[filename].push(callback);
        return;
    }
    // Otherwise start a new one and make a real request
    var batch = requestBatches[filename] = [callback];
    FS.readFile(filename, onRealRead);
    // Flush out the batch on complete
    function onRealRead() {
        delete requestBatches[filename];
        for (var i = 0, l = batch.length; i < l; i++) {
            batch[i].apply(null, arguments);
        }
    }
}

function printFile(file){
    console.log(file);
}

dir = "/Users/xaver/Downloads/xaver/xxx/xxx/"

var files = fs.readdirSync(dir);

for (i in files){
    filename = dir + files[i];
    console.log(filename);
    batchingReadFile(filename, printFile);
}
Unfortunately I still receive the same error. What is wrong with this code? One last question (I'm new to javascript and node): I'm in the process of developing a web application with a lot of requests for about 5000 daily users. I have many years of experience programming in other languages such as Python and Java, so originally I thought I would build it with one of those. Then I discovered node, and I must say that the idea of a non-blocking I/O model is really nice, seductive, and most of all very fast!
But what kind of problems should I expect with node? Is it a production-proven web server? What are your experiences?
javascript node.js file-descriptor | edited Jan 23 '12 at 5:05 jlafay | asked Jan 22 '12 at 23:18 xaverras | "Is it a production proven web server?" Maybe a bit pedantic, but node isn't a web server as such. – UpTheCreek Oct 2 '13 at 19:45
|
8 Answers
Using the graceful-fs module by Isaac Schlueter is probably the most appropriate solution. It does incremental back-off when EMFILE is encountered, and it can be used as a drop-in replacement for the built-in fs module.
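For illustration, a minimal sketch of the drop-in usage, assuming graceful-fs has been installed from npm (the directory path is the one from the question; printing each file's length is just a placeholder action):

var fs = require('graceful-fs'); // same API as the built-in fs, but queues and retries opens when EMFILE is hit

var dir = '/Users/xaver/Downloads/xaver/xxx/xxx/';
fs.readdir(dir, function(err, files) {
    if (err) throw err;
    files.forEach(function(file) {
        fs.readFile(dir + file, function(err, data) {
            if (err) throw err;
            console.log(file, data.length);
        });
    });
});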
answered Apr 10 '13 at 19:32 Braveg1rl | Saved me, why is this not the node default? Why do I need to install some 3rd-party plugin to solve the issue? – Anthony Webb Aug 14 '13 at 14:44 | I think, generally speaking, Node tries to expose as much to the user as possible. This gives everyone (not just Node core developers) the opportunity to solve any problems stemming from its use, and to publish solutions and download those published by others through npm. Don't expect a lot of smarts from Node itself. Instead, expect to find the smarts in packages published on npm. – Braveg1rl Aug 14 '13 at 15:00 | That's fine if it's your own code, but plenty of npm modules don't use this. – UpTheCreek Oct 2 '13 at 19:43 | This module solved all my issues! I agree that node appears to be a little raw still, but mainly because it's really hard to understand what is going wrong with so little documentation and accepted solutions to known issues. – sidonaldson 31 '13 at 12:39 | how do you npm it? how do I combine this in my code instead of the regular fs? – Aviram Netanel Feb 4 '14 at 11:45 | show more comments
For when graceful-fs doesn't work, or you just want to understand where the leak is coming from, follow this process.
(E.g. graceful-fs isn't going to fix your wagon if your issue is with sockets.)
From my blog article: http://www.blakerobertson.com/devlog/2014/1/11/how-to-determine-whats-causing-error-connect-emfile-nodejs.html
How To Isolate
This command will output the number of open handles for nodejs processes:
lsof -i -n -P | grep nodejs
COMMAND PID USER FD TYPE DEVICE SIZE/OFF NODE NAME
...
nodejs 12211 root 1012u IPv4 151317015 0t0 TCP 10.101.42.209:40371->54.236.3.170:80 (ESTABLISHED)
nodejs 12211 root 1013u IPv4 151279902 0t0 TCP 10.101.42.209:43656->54.236.3.172:80 (ESTABLISHED)
nodejs 12211 root 1014u IPv4 151317016 0t0 TCP 10.101.42.209:34450->54.236.3.168:80 (ESTABLISHED)
nodejs 12211 root 1015u IPv4 151289728 0t0 TCP 10.101.42.209:52691->54.236.3.173:80 (ESTABLISHED)
nodejs 12211 root 1016u IPv4 151305607 0t0 TCP 10.101.42.209:47707->54.236.3.172:80 (ESTABLISHED)
nodejs 12211 root 1017u IPv4 151289730 0t0 TCP 10.101.42.209:45423->54.236.3.171:80 (ESTABLISHED)
nodejs 12211 root 1018u IPv4 151289731 0t0 TCP 10.101.42.209:36090->54.236.3.170:80 (ESTABLISHED)
nodejs 12211 root 1019u IPv4 151314874 0t0 TCP 10.101.42.209:49176->54.236.3.172:80 (ESTABLISHED)
nodejs 12211 root 1020u IPv4 151289768 0t0 TCP 10.101.42.209:45427->54.236.3.171:80 (ESTABLISHED)
nodejs 12211 root 1021u IPv4 151289769 0t0 TCP 10.101.42.209:36094->54.236.3.170:80 (ESTABLISHED)
nodejs 12211 root 1022u IPv4 151279903 0t0 TCP 10.101.42.209:43836->54.236.3.171:80 (ESTABLISHED)
nodejs 12211 root 1023u IPv4 151281403 0t0 TCP 10.101.42.209:43930->54.236.3.172:80 (ESTABLISHED)
....
Notice the 1023u (last line) - that's the 1024th file handle, which is the default maximum. Now, look at the last column. That indicates which resource is open. You'll probably see a number of lines all with the same resource name. Hopefully, that now tells you where in your code to look for the leak.
If you're running multiple node processes and don't know which one this is, first look up which process has pid 12211. That'll tell you the process.
In my case above, I noticed that there were a bunch of very similar IP addresses. They were all
54.236.3.###
By doing an IP address lookup, I was able to determine that in my case it was pubnub related.
Command Reference
Use this syntax to determine how many open handles a process has open…
To get a count of open files for a certain pid
I used this command to test the number of files that were opened after doing various events in my app.
lsof -i -n -P | grep "8465" | wc -l
# lsof -i -n -P | grep "nodejs.*8465" | wc -l
28
# lsof -i -n -P | grep "nodejs.*8465" | wc -l
31
# lsof -i -n -P | grep "nodejs.*8465" | wc -l
34
What is your process limit?
ulimit -a
The line you want will look like this:
open files (-n) 1024
edited Nov 7 '16 at 21:17 | answered Jan 12 '14 at 2:27 blak3r | How can you change the open files limit? – 2619 May 29 '14 at 9:35 | ulimit -n 2048 to allow 2048 open files – Gael Nov 11 '14 at 1:10 |
I ran into this problem today, and finding no good solutions for it, I created a module to address it. I was inspired by @fbartho's snippet, but wanted to avoid overwriting the fs module.
The module I wrote is Filequeue, and you use it just like fs:
var Filequeue = require('filequeue');
var fq = new Filequeue(200); // max number of files to open at once

fq.readdir('/Users/xaver/Downloads/xaver/xxx/xxx/', function(err, files) {
    if(err) {
        throw err;
    }
    files.forEach(function(file) {
        fq.readFile('/Users/xaver/Downloads/xaver/xxx/xxx/' + file, function(err, data) {
            // do something here
        });
    });
});
answered Mar 8 '13 at 1:50 Trey Greffith |
You're reading too many files at once. Node reads files asynchronously, so you'll be reading all the files at once. So you're probably reading 10240 at once.
See if this works:
var fs = require('fs')
var events = require('events')
var util = require('util')
var path = require('path')

var FsPool = module.exports = function(dir) {
    events.EventEmitter.call(this)
    this.dir = dir;
    this.files = [];
    this.active = [];
    this.threads = 1;
    this.on('run', this.runQuta.bind(this))
};
// So will act like an event emitter
util.inherits(FsPool, events.EventEmitter);

FsPool.prototype.runQuta = function() {
    if(this.files.length === 0 && this.active.length === 0) {
        return this.emit('done');
    }
    if(this.active.length < this.threads) {
        var name = this.files.shift()
        this.active.push(name)
        var fileName = path.join(this.dir, name);
        var self = this;
        fs.stat(fileName, function(err, stats) {
            if(err) throw err;
            if(stats.isFile()) {
                fs.readFile(fileName, function(err, data) {
                    if(err) throw err;
                    self.active.splice(self.active.indexOf(name), 1)
                    self.emit('file', name, data);
                    self.emit('run');
                });
            } else {
                self.active.splice(self.active.indexOf(name), 1)
                self.emit('dir', name);
                self.emit('run');
            }
        });
    }
    return this
};

FsPool.prototype.init = function() {
    var dir = this.dir;
    var self = this;
    fs.readdir(dir, function(err, files) {
        if(err) throw err;
        self.files = files
        self.emit('run');
    })
    return this
};

var fsPool = new FsPool(__dirname)

fsPool.on('file', function(fileName, fileData) {
    console.log('file name: ' + fileName)
    console.log('file data: ', fileData.toString('utf8'))
})
fsPool.on('dir', function(dirName) {
    console.log('dir name: ' + dirName)
})
fsPool.on('done', function() {
    console.log('done')
});

fsPool.init()
edited Jul 25 '16 at 11:19 atc | answered Jan 23 '12 at 0:30 Tim P. |
I just finished writing a little snippet of code to solve this problem myself; all of the other solutions appear way too heavyweight and require you to change your program structure.
This solution just stalls any fs.readFile or fs.writeFile calls so that there are no more than a set number in flight at any given time.
// Queuing reads and writes, so your nodejs script doesn't overwhelm system limits catastrophically
var fs = require('fs'); // added: the original snippet assumes fs is already required

global.maxFilesInFlight = 100; // Set this value to some number safeish for your system
var origRead = fs.readFile;
var origWrite = fs.writeFile;

var activeCount = 0;
var pending = [];

var wrapCallback = function(cb){
    return function(){
        activeCount--;
        cb.apply(this,Array.prototype.slice.call(arguments));
        if (activeCount < global.maxFilesInFlight && pending.length){
            console.log("Processing Pending read/write");
            pending.shift()();
        }
    };
};

fs.readFile = function(){
    var args = Array.prototype.slice.call(arguments);
    if (activeCount < global.maxFilesInFlight){
        if (args[1] instanceof Function){
            args[1] = wrapCallback(args[1]);
        } else if (args[2] instanceof Function) {
            args[2] = wrapCallback(args[2]);
        }
        activeCount++;
        origRead.apply(fs,args);
    } else {
        console.log("Delaying read:",args[0]);
        pending.push(function(){
            fs.readFile.apply(fs,args);
        });
    }
};

fs.writeFile = function(){
    var args = Array.prototype.slice.call(arguments);
    if (activeCount < global.maxFilesInFlight){
        if (args[1] instanceof Function){
            args[1] = wrapCallback(args[1]);
        } else if (args[2] instanceof Function) {
            args[2] = wrapCallback(args[2]);
        }
        activeCount++;
        origWrite.apply(fs,args);
    } else {
        console.log("Delaying write:",args[0]);
        pending.push(function(){
            fs.writeFile.apply(fs,args);
        });
    }
};
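A rough usage sketch, assuming the snippet above is saved in a file named fs-throttle.js (a hypothetical name) and required once at startup; because require('fs') is cached, fs.readFile calls made anywhere in the process are then throttled automatically:

require('./fs-throttle'); // hypothetical filename for the snippet above; patches fs.readFile and fs.writeFile process-wide
var fs = require('fs');

var dir = '/Users/xaver/Downloads/xaver/xxx/xxx/';
fs.readdir(dir, function(err, files) {
    if (err) throw err;
    files.forEach(function(file) {
        // at most global.maxFilesInFlight reads run at once; the rest wait in the pending queue
        fs.readFile(dir + file, function(err, data) {
            if (err) throw err;
            console.log(file, data.length);
        });
    });
});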
answered Dec 2 '12 at 23:28 fbartho | You should make a repo for this on github. – Nick Sep 4 '14 at 3:06 | This works very well if graceful-fs is not working for you. – Cekay Nov 8 '16 at 18:08 |
With bagpipe, you just need to change
FS.readFile(filename, onRealRead);
=>
var bagpipe = new Bagpipe(10);
bagpipe.push(FS.readFile, filename, onRealRead);
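Applied to the question's loop, a rough sketch might look like this (assuming the module is installed as bagpipe from npm; the limit of 10 and the directory path are illustrative):

var Bagpipe = require('bagpipe');
var fs = require('fs');

var bagpipe = new Bagpipe(10); // at most 10 fs.readFile calls in flight at a time
var dir = '/Users/xaver/Downloads/xaver/xxx/xxx/';

fs.readdir(dir, function(err, files) {
    if (err) throw err;
    files.forEach(function(file) {
        // bagpipe queues the call and runs fs.readFile when a slot frees up
        bagpipe.push(fs.readFile, dir + file, function(err, data) {
            if (err) throw err;
            console.log(file, data.length);
        });
    });
});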
Bagpipe helps you limit the parallelism. More details: https://github.com/JacksonTian/bagpipe
answered Nov 20 '12 at 4:22 アメリカ1837639 | It's all in Chinese or another Asian language. Is there any documentation written in English? – Fatih Aslan Feb 20 '13 at 23:16 | @FatihAslan English docs are available now. – アメリカ1837639 Jul 30 '13 at 12:12 | or use async.js – メルボルン2991 Mar 3 '15 at 5:03
|
Had the same problem when running the nodemon command, so I reduced the number of files open in Sublime Text and the error disappeared.
answered Dec 9 '15 at 7:24 Bhiire Keneth | I, too, was getting
EMFILE
errors, and through trial and error noticed that closing some Sublime windows resolved the issue. I still don't know why. I tried adding ulimit -n 2560
to my .bash_profile, but that didn't solve the issue. Does this indicate a need to change to Atom instead? – The Qodesmith Jan 26 '16 at 13:38 |
cwait is a general solution for limiting concurrent executions of any functions that return promises.
In your case the code could be something like:
var Promise = require('bluebird');
var cwait = require('cwait');

// Allow max. 10 concurrent file reads.
var queue = new cwait.TaskQueue(Promise, 10);
var read = queue.wrap(Promise.promisify(batchingReadFile));

Promise.map(files, function(filename) {
    console.log(filename);
    return(read(filename));
})
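For a self-contained variant of the same idea using plain fs instead of the question's batchingReadFile (the directory path and the limit of 10 are illustrative):

var Promise = require('bluebird');
var cwait = require('cwait');
var fs = require('fs');

var dir = '/Users/xaver/Downloads/xaver/xxx/xxx/';
var queue = new cwait.TaskQueue(Promise, 10); // at most 10 reads in flight
var readFile = queue.wrap(Promise.promisify(fs.readFile));
var readdir = Promise.promisify(fs.readdir);

readdir(dir).map(function(file) {
    return readFile(dir + file).then(function(data) {
        console.log(file, data.length);
    });
});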
answered May 10 '16 at 13:46 jjrv |
Reposted from: https://www.cnblogs.com/zzsdream/p/11140512.html