
Fails to retrieve directory list of big (>4000 file) directories #135

@hershyheilpern

Description

The script crashes with "Maximum call stack size exceeded" when retrieving huge directories.

I made the following changes in ftpd.js:

    // Original: for (i = 0; i < files.length && i < CONC; ++i)
    handleFile(i);
    j = --i;

    function handleFile(ii) {
      if (i >= files.length)
        return finished();
        // Original: return i == files.length + j ? finished() : null;

      self.server.getUsernameFromUid(files[ii].stats.uid, function(e1, uname) {
        self.server.getGroupFromGid(files[ii].stats.gid, function(e2, gname) {
          if (e1 || e2) {
            self._logIf(3, "Error getting user/group name for file: " + util.inspect(e1 || e2));
            fileInfos.push({ file: files[ii],
              uname: null,
              gname: null });
          }
          else {
            fileInfos.push({ file: files[ii],
              uname: uname,
              gname: gname });
          }
          // Every 3000 files, defer the next call via setTimeout so the
          // call stack can unwind before recursing further.
          if ((i % 3000) == 2999) {
            setTimeout(function() {
              console.log("giving 10 ms for node to finish");
              handleFile(++i);
            }, 10);
          } else {
            handleFile(++i);
          }
        });
      });
    }

But I cannot understand why you were using both a for loop, as if the calls were synchronous, and recursive calls through handleFile.

The point of the solution is to give Node a 10 ms break every 3000 files so it can finish pending jobs and unwind the call stack.
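The technique above can be sketched in isolation. This is a minimal, self-contained illustration (not the actual ftpd.js code): `processFiles` and `lookup` are hypothetical names, and the lookup callback is assumed to fire synchronously (as a cached uid/gid lookup might), which is exactly the case where unbroken recursion grows the stack until it overflows.

```javascript
// Sketch: process a large array via callback-driven recursion, but
// yield to the event loop every 3000 items so the stack can unwind.
// "lookup" stands in for getUsernameFromUid/getGroupFromGid; if its
// callback fires synchronously, naive recursion would nest one stack
// frame per file and eventually exceed the maximum call stack size.
function processFiles(files, lookup, done) {
  var results = [];
  var i = 0;

  function handleFile() {
    if (i >= files.length) return done(results);

    lookup(files[i], function (info) {
      results.push(info);
      i++;
      if (i % 3000 === 0) {
        // Defer: setTimeout schedules the next call on a fresh stack.
        setTimeout(handleFile, 10);
      } else {
        // Recurse directly; depth stays bounded by 3000 iterations.
        handleFile();
      }
    });
  }

  handleFile();
}
```

With, say, 7000 files and a synchronous `lookup`, the direct-recursion depth never exceeds 3000 iterations between deferrals, so the stack stays bounded regardless of directory size.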
