python - Iterate over large collection in Django - cache problem


I need to iterate over a large collection (3 * 10^6 elements) in Django to do a kind of analysis that can't be done with a single SQL statement.

  • Is it possible to turn off queryset caching in Django? (Caching the data is not acceptable: the data is around 0.5 GB.)
  • Is it possible to make Django fetch the collection in chunks? It seems to pre-fetch the whole collection into memory and only then iterate over it. I think so because of the observed execution speed:
    • iter(coll.objects.all()).next() - takes forever
    • iter(coll.objects.all()[:10000]).next() - takes less than a second
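The second timing hints at a workaround for Django versions that lack a lazy iterator: fetch the collection in fixed-size slices so only one slice is in memory at a time. A minimal sketch of that pattern; `fetch_slice` is a hypothetical stand-in for `coll.objects.all()[start:stop]`, demonstrated here with a plain list:

```python
def in_chunks(fetch_slice, total, chunk_size=10000):
    """Yield items slice by slice, so at most `chunk_size` rows
    are materialised in memory at any moment.

    fetch_slice(start, stop) stands in for the ORM slice
    coll.objects.all()[start:stop] (which issues LIMIT/OFFSET SQL).
    """
    for start in range(0, total, chunk_size):
        for item in fetch_slice(start, start + chunk_size):
            yield item

# Usage with a plain list standing in for the queryset:
data = list(range(25))
out = list(in_chunks(lambda a, b: data[a:b], len(data), chunk_size=10))
assert out == data
```

Note that LIMIT/OFFSET slicing gets slower as the offset grows; filtering on `pk > last_seen_pk` per chunk is the usual refinement for very large tables.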

Use queryset.iterator() to walk over the results instead of loading them all first.
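In Django terms that is `for obj in coll.objects.all().iterator():` (on Django 2.0 and later you can also pass `chunk_size`), which streams rows without filling the queryset's result cache. The pure-Python sketch below, with a generator standing in for the database cursor (no Django required), shows why this keeps memory flat:

```python
def db_cursor(n):
    """Stand-in for a server-side database cursor: yields one
    row at a time instead of materialising the whole result set."""
    for pk in range(n):
        yield {"pk": pk}

# Plain iteration over a queryset fills an internal result cache,
# which is equivalent to materialising everything up front:
eager = list(db_cursor(3000))  # scaled down for the demo

# queryset.iterator() instead consumes the cursor lazily; only the
# current row is held in memory, so the first row arrives at once:
lazy = db_cursor(3000)
first = next(lazy)

assert len(eager) == 3000
assert first == {"pk": 0}
```

This also explains the timings in the question: slicing first (`[:10000]`) bounds how much the cache can hold, while `.iterator()` avoids the cache entirely.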

