Is RNeo4j Transactional Endpoint slow? #52
Comments
Can you show me your code?
Sure.
Now if I run the following code for the same size:
I get
Any news about this? I might be doing something wrong, but from what I understood, this is the way to use the transactional endpoint. Still, the performance is worrisome.
Sorry, thought I had responded to you. The problem is that you're committing in batches of 1000 in LOAD CSV and in a single batch of 12613 in the R code. It's not really a fair comparison. Can you commit in batches of 1000 in your R code and get back to me?
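For reference, a minimal sketch of what batched commits look like with RNeo4j's transactional-endpoint functions (`newTransaction`, `appendCypher`, `commit`). The data frame `df`, the `Person` label, and the `name` property are placeholders, not the original poster's code:

```r
library(RNeo4j)

graph <- startGraph("http://localhost:7474/db/data/")

# df stands in for the ~12613-row data frame discussed above (placeholder).
batch_size <- 1000
tx <- newTransaction(graph)

for (i in seq_len(nrow(df))) {
  # Hypothetical query and parameter; substitute your own Cypher.
  appendCypher(tx, "MERGE (p:Person {name: {name}})", name = df$name[i])

  # Commit every 1000 statements and open a fresh transaction,
  # mirroring LOAD CSV's batched commits.
  if (i %% batch_size == 0) {
    commit(tx)
    tx <- newTransaction(graph)
  }
}
commit(tx)  # commit whatever is left in the final partial batch
```

Committing in smaller batches keeps each HTTP payload and server-side transaction bounded, which is what LOAD CSV's periodic commit does internally.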
OK, will do that and will get back to you.
Any workaround for this? Thanks.
Apologies for the long delay; some other projects took my time.
I think this makes a fair comparison (loading the data in batches of 1000).
I have the same problem. Neither createNode/createRel nor appendCypher is fast enough to use. My workaround is to use getNode and cypher with normal queries. I also create CSV files and import them via LOAD CSV. Both have the disadvantage that the R code is not really understandable if the reader doesn't know what Cypher/Neo4j is, and creating the CSV files needs storage. Thanks for your hard work.
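A minimal sketch of that CSV workaround, assuming a data frame `df` with a `name` column and a Neo4j server that can read `/tmp` (both assumptions, not from the thread):

```r
library(RNeo4j)

graph <- startGraph("http://localhost:7474/db/data/")

# Write the data frame to disk so Neo4j can bulk-load it;
# the path, label, and column name are illustrative.
write.csv(df, "/tmp/people.csv", row.names = FALSE)

# Hand the heavy lifting to LOAD CSV instead of per-row REST calls.
query <- "
LOAD CSV WITH HEADERS FROM 'file:///tmp/people.csv' AS row
MERGE (p:Person {name: row.name})
RETURN count(p) AS imported
"
cypher(graph, query)
```

The trade-off is exactly what the comment describes: the import logic moves out of R into a Cypher string, and the intermediate CSV file takes up disk space.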
Sorry, I don't think the transactional endpoint will ever be as fast as LOAD CSV.
I got used to importing a CSV via the Neo4j console. I had 50,000 rows.
After setting up an index, I imported them in about 0.8 seconds.
I tried the same thing today with the transactional endpoint, and it took 3 minutes.
Is it that slow, or am I doing something wrong?
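For context, a sketch of the fast path being described, using RNeo4j's `addIndex` from R; the `Person` label, `name` property, and file path are illustrative, since the original post doesn't show its schema:

```r
library(RNeo4j)

graph <- startGraph("http://localhost:7474/db/data/")

# The schema index on the merged property is what makes the 50,000-row
# import fast; without it every MERGE scans all existing :Person nodes.
addIndex(graph, "Person", "name")

# The console import itself would then look something like this
# (illustrative Cypher, committed in batches of 1000 rows):
#   USING PERIODIC COMMIT 1000
#   LOAD CSV WITH HEADERS FROM 'file:///tmp/people.csv' AS row
#   MERGE (p:Person {name: row.name});
```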