Merged the suggested scenarios.
zx-zx committed Jan 17, 2025
1 parent 9c57d24 commit c4e93b9
Showing 1 changed file with 2 additions and 41 deletions.
43 changes: 2 additions & 41 deletions src/e2e-test/features/gcs/sink/GCSSink.feature
@@ -282,6 +282,7 @@ Feature: GCS sink - Verification of GCS Sink plugin
 Then Enter GCS sink property path
 Then Select GCS property format "<FileFormat>"
 Then Select GCS sink property contentType "<contentType>"
+Then Enter GCS File system properties field "gcsCSVFileSysProperty"
 Then Validate "GCS" plugin properties
 Then Close the GCS properties
 Then Save and Deploy Pipeline
@@ -295,47 +296,7 @@ Feature: GCS sink - Verification of GCS Sink plugin
 | csv | text/csv |
 | tsv | text/plain |
 
-@BQ_SOURCE_DATATYPE_TEST @GCS_SINK_TEST
-Scenario:Validate successful records transfer from BigQuery to GCS with advanced file system properties field
-Given Open Datafusion Project to configure pipeline
-Then Select plugin: "BigQuery" from the plugins list as: "Source"
-When Expand Plugin group in the LHS plugins list: "Sink"
-When Select plugin: "GCS" from the plugins list as: "Sink"
-Then Open BigQuery source properties
-Then Enter BigQuery property reference name
-Then Enter BigQuery property projectId "projectId"
-Then Enter BigQuery property datasetProjectId "projectId"
-Then Override Service account details if set in environment variables
-Then Enter BigQuery property dataset "dataset"
-Then Enter BigQuery source property table name
-Then Validate output schema with expectedSchema "bqSourceSchemaDatatype"
-Then Validate "BigQuery" plugin properties
-Then Close the BigQuery properties
-Then Open GCS sink properties
-Then Override Service account details if set in environment variables
-Then Enter the GCS sink mandatory properties
-Then Enter GCS File system properties field "gcsCSVFileSysProperty"
-Then Validate "GCS" plugin properties
-Then Close the GCS properties
-Then Connect source as "BigQuery" and sink as "GCS" to establish connection
-Then Save the pipeline
-Then Preview and run the pipeline
-Then Wait till pipeline preview is in running state
-Then Open and capture pipeline preview logs
-Then Verify the preview run status of pipeline in the logs is "succeeded"
-Then Close the pipeline logs
-Then Click on preview data for GCS sink
-Then Verify preview output schema matches the outputSchema captured in properties
-Then Close the preview data
-Then Deploy the pipeline
-Then Run the Pipeline in Runtime
-Then Wait till pipeline is in running state
-Then Open and capture logs
-Then Verify the pipeline status is "Succeeded"
-Then Verify data is transferred to target GCS bucket
-Then Validate the values of records transferred to GCS bucket is equal to the values from source BigQuery table
-
-@GCS_AVRO_FILE @GCS_SINK_TEST @GCS_Source_Required
+@GCS_AVRO_FILE @GCS_SINK_TEST
 Scenario Outline: To verify data transferred successfully from GCS Source to GCS Sink with write header true at Sink
 Given Open Datafusion Project to configure pipeline
 When Select plugin: "GCS" from the plugins list as: "Source"
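For context, the GCS sink's "File System Properties" field accepts a JSON map of key/value pairs passed through to the underlying file-system configuration, and `gcsCSVFileSysProperty` names an entry in the test framework's plugin-parameters file that supplies such a map. The actual value is not part of this diff; a purely hypothetical entry might look like:

```
# Hypothetical illustration only — the real entry lives in the repo's
# plugin-parameters file and is not shown in this commit.
gcsCSVFileSysProperty={"fs.gs.block.size": "134217728"}
```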
