
Stream large collections #6

Open
erik-stephens opened this issue Jun 11, 2013 · 1 comment

Comments

@erik-stephens

Summary

As a developer of an API, I would like large collections streamed in chunks in order to minimize the memory footprint required to serve such requests.

Notes

Looks like Transfer-Encoding: chunked is already baked into express.js (or node.js). However, express.js serializes the entire JSON response in one shot before sending it:

var body = JSON.stringify(obj, replacer, spaces);

Fortunately, it looks like there is a stream-friendly mongoose API. We will just need to handle the serialization in crudify ourselves. JSONStream looks promising for that, but if it doesn't work out, we can do it in crudify with something like this in sendResult.coffee:

res.write('[')
delim = ''
for entity in data
  res.write(delim + JSON.stringify(entity))
  delim = ','
res.write(']')
res.end()

Tasks

  • Use the Mongoose.QueryStream API
  • Handle the serialization ourselves instead of relying on express.js.

@yocontra (Member)

We had this before but removed it when we added the authorization in. I need to figure out a way to do streaming auth without clogging the CPU.
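One possible shape for that, entirely hypothetical: apply a per-document `isAuthorized` predicate as each document streams past, so nothing is buffered, though every document still pays the check's CPU cost. The predicate name and rule below are assumptions for the demo, not crudify's actual auth hook:

```javascript
// Hypothetical per-document authorization filter for a streamed response.
// Unauthorized documents are dropped from the output as they stream by.
function streamAuthorized(docs, isAuthorized, out) {
  let delim = '';
  out.write('[');
  for (const doc of docs) {
    if (!isAuthorized(doc)) continue;  // skip docs the caller may not see
    out.write(delim + JSON.stringify(doc));
    delim = ',';
  }
  out.write(']');
}

// Demo with an assumed ownership rule and an in-memory sink.
let body = '';
streamAuthorized(
  [{ id: 1, owner: 'a' }, { id: 2, owner: 'b' }],
  (doc) => doc.owner === 'a',          // assumed auth rule for the demo
  { write: (s) => { body += s; } }
);
console.log(body);  // [{"id":1,"owner":"a"}]
```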
