Replies: 2 comments
-
I see similar issues as well. I was wondering why they never provided a way to test large datasets in the examples they shipped; now it makes sense. I have a very powerful laptop with more than enough memory, yet it slows down once we go to a large data set. I found that, with Lazy enabled, a simple three-property class runs out of memory and crashes the application after about 45 million items. I even created many small Lazy lists and cleaned each one up after use, as they suggest, and it still crashes.
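For anyone trying the same workaround: the "many small Lazy lists, cleared after use" pattern looks roughly like the sketch below. This is a self-contained, simplified stand-in (the real EclipseStore `Lazy` API lives in `org.eclipse.serializer.reference.Lazy` and is backed by the storage engine; `SimpleLazy` here is purely illustrative).

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Supplier;

// Simplified stand-in for a lazy reference. The real EclipseStore
// Lazy reloads from storage; here a Supplier plays that role.
final class SimpleLazy<T> {
    private final Supplier<T> loader; // recreates the value on demand
    private T value;                  // strong reference while loaded

    SimpleLazy(Supplier<T> loader) { this.loader = loader; }

    T get() {
        if (value == null) value = loader.get();
        return value;
    }

    // Dropping the strong reference lets the GC reclaim the data;
    // "clean up after using the list" maps to this step.
    void clear() { value = null; }

    boolean isLoaded() { return value != null; }
}

public class LazyChunks {
    public static void main(String[] args) {
        // Split one large logical list into small lazy chunks and
        // clear each chunk after use, so only one is resident.
        List<SimpleLazy<List<Integer>>> chunks = new ArrayList<>();
        for (int c = 0; c < 10; c++) {
            final int base = c * 1000;
            chunks.add(new SimpleLazy<>(() -> {
                List<Integer> data = new ArrayList<>();
                for (int i = 0; i < 1000; i++) data.add(base + i);
                return data;
            }));
        }

        long sum = 0;
        for (SimpleLazy<List<Integer>> chunk : chunks) {
            for (int v : chunk.get()) sum += v;
            chunk.clear(); // release before touching the next chunk
        }
        System.out.println(sum); // sum of 0..9999 = 49995000
    }
}
```

In principle only one chunk should be strongly reachable at a time; if the heap still grows, something else (e.g. the storage layer or an object cache) must be holding references.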
-
Using 2.0.1 and seeing the same behaviour. From 1 to 100,000 records all seems fine. Over 150,000 records (slightly complex graphs) performance drops. It seems memory is not released somehow, and the process eventually stalls with an OOM. Perhaps I'm using this wrong, or I need to be more diligent about housekeeping? Perhaps related: #353
-
I am fascinated by the object-persistence technology of EclipseStore because it is simple and convenient and supports the rapid implementation of small programs. However, as I have used it longer and encountered larger and larger data sets, I have run into scalability issues.
I found that EclipseStore struggles with large-scale data. Even when I make full use of LazyHashMap and Lazy references, it is still difficult to process and store large-scale data with limited memory, for example using 16GB of memory to handle and store 500GB of data.
Here are a few findings:
I wonder whether the EclipseStore team could provide best practices for handling large-scale data, and especially a reference implementation of a high-performance key-value store, to resolve users' doubts.
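To make the request concrete, the kind of layout I have in mind is a partitioned key-value store where keys hash into fixed buckets, each bucket held behind its own lazily loadable handle, so only the touched buckets occupy heap. Below is a minimal in-memory sketch of that idea; all names are hypothetical and it does not use the real EclipseStore API (in the real thing each bucket would be faulted in via a Lazy reference and persisted before eviction).

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of a partitioned key-value layout. Keys hash
// into a fixed number of buckets; each bucket is loaded on first
// touch and can be evicted individually to bound heap usage.
final class LazyBucketStore<K, V> {
    private final int bucketCount;
    private final Map<Integer, Map<K, V>> loaded = new HashMap<>();

    LazyBucketStore(int bucketCount) { this.bucketCount = bucketCount; }

    private int bucketOf(K key) {
        return Math.floorMod(key.hashCode(), bucketCount);
    }

    // In a real store this would be a Lazy.get() that faults the
    // bucket in from disk; here we just materialize an empty map.
    private Map<K, V> bucket(K key) {
        return loaded.computeIfAbsent(bucketOf(key), b -> new HashMap<>());
    }

    void put(K key, V value) { bucket(key).put(key, value); }

    V get(K key) { return bucket(key).get(key); }

    // Housekeeping step: in this in-memory sketch eviction simply
    // discards the bucket; a real store would persist it first and
    // then drop the strong reference so the GC can reclaim it.
    void evict(K key) { loaded.remove(bucketOf(key)); }

    int loadedBuckets() { return loaded.size(); }
}
```

With 500GB of data and a 16GB heap, the point of such a design is that the resident set is proportional to the number of loaded buckets, not to the total data size; an official reference implementation along these lines would answer a lot of the questions in this thread.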
Sincere thanks.