Connector does not work in local Spark mode #15
Comments
Hello!
Reading data from the Greenplum container to the Spark executor works, but for some reason it takes a minute to read a table with just 4 rows.
Oh, I see, you are right: reading "somehow" works, and writing doesn't work at all.
Any updates?
Description
The Spark-Greenplum connector does not work correctly in local Spark mode (`local` master).
Read operations take a significantly long time, and write operations crash with a timeout error.
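For reference, a minimal sketch of the kind of local-mode read and write that exhibits this behavior. The format name (`its-greenplum`), JDBC URL, table names, and credentials are assumptions for illustration, not taken from the report:

```scala
import org.apache.spark.sql.{SaveMode, SparkSession}

object LocalModeRepro {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("greenplum-local-repro")
      .master("local[*]") // local Spark mode, where the issue reproduces
      .getOrCreate()

    // Read: completes, but takes around a minute even for a 4-row table.
    val df = spark.read
      .format("its-greenplum") // assumed format name registered by the connector
      .option("url", "jdbc:postgresql://localhost:5432/testdb") // hypothetical connection settings
      .option("user", "gpadmin")
      .option("password", "gpadmin")
      .option("dbtable", "test_table")
      .load()
    df.show()

    // Write: crashes with a timeout error in local mode.
    df.write
      .format("its-greenplum")
      .option("url", "jdbc:postgresql://localhost:5432/testdb")
      .option("user", "gpadmin")
      .option("password", "gpadmin")
      .option("dbtable", "test_table_copy")
      .mode(SaveMode.Overwrite)
      .save()

    spark.stop()
  }
}
```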
The problem is reproduced on Spark 3.x versions when writing data to Greenplum via spark-greenplum-connector_2.12-3.1.jar.
Steps to reproduce
To reproduce this issue you need:
- a Greenplum container started from docker-compose.yml;
- spark-greenplum-connector_2.12-3.1.jar built from source. Out of the box it does not work in local mode; the problem is described in issue "The show method for the query data is stuck" #14, and I have applied the changes suggested in that issue to the source code:
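A minimal docker-compose.yml sketch for the Greenplum container referenced above. The image name, port, and credentials are assumptions, not taken from the original report:

```yaml
version: "3"
services:
  greenplum:
    image: datagrip/greenplum:6.8   # assumed image; substitute the image used in the report
    ports:
      - "5432:5432"                 # Greenplum master port exposed to the local Spark driver
    environment:
      - GREENPLUM_USER=gpadmin      # hypothetical credentials
      - GREENPLUM_PASSWORD=gpadmin
```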
Logs
Logs while trying to read data from the table:
As can be seen from the logs, the interaction between RMISlave and RMIMaster takes more than 30 seconds.
Logs while trying to write data to the table:
Environment
Connector version: spark-greenplum-connector_2.12-3.1.jar
Java version, Scala version: Java 1.8.0, Scala 2.12
OS: macOS 13.4.1 (22F82)