
Commit 713ed95 (1 parent: 64891be)

Commit message: Additional e2e tests for bq sink

4 files changed: 224 additions & 0 deletions


src/e2e-test/features/bigquery/sink/BigQuerySinkError.feature

Lines changed: 14 additions & 0 deletions
@@ -62,3 +62,17 @@ Feature: BigQuery sink - Validate BigQuery sink plugin error scenarios
     Then Enter BigQuery sink property table name
     Then Enter BigQuery property temporary bucket name "bqInvalidTemporaryBucket"
     Then Verify the BigQuery validation error message for invalid property "bucket"
+
+  @BQ_SINK_TEST
+  Scenario: Verify BigQuery Sink properties validation errors for incorrect value of reference name
+    Given Open Datafusion Project to configure pipeline
+    When Sink is BigQuery
+    Then Open BigQuery sink properties
+    And Enter input plugin property: "referenceName" with value: "bqInvalidReferenceName"
+    Then Enter BigQuery property projectId "projectId"
+    Then Enter BigQuery property datasetProjectId "projectId"
+    Then Override Service account details if set in environment variables
+    Then Enter BigQuery property dataset "dataset"
+    Then Enter BigQuery sink property table name
+    Then Click on the Validate button
+    Then Verify that the Plugin Property: "referenceName" is displaying an in-line error message: "errorMessageIncorrectReferenceName"
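The scenario above expects an in-line error when the reference name is malformed. As an illustrative sketch of the kind of check behind that step, assuming a conventional letters/digits/underscore/hyphen rule (the plugin's actual validation pattern may differ):

```python
import re

# Assumed allowed character set (letters, digits, '_', '-'); the plugin's
# real validation pattern may differ.
REFERENCE_NAME_RE = re.compile(r"^[A-Za-z0-9_-]+$")

def reference_name_error(name):
    """Return an in-line error message for an invalid reference name, else None."""
    if REFERENCE_NAME_RE.fullmatch(name):
        return None
    return ("Invalid reference name '%s'. Only letters, numbers, "
            "'_' and '-' are allowed." % name)

# A name with spaces or punctuation triggers the in-line error.
assert reference_name_error("bqSinkRef_01") is None
assert reference_name_error("bad name!") is not None
```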

src/e2e-test/features/bigquery/sink/GCSToBigQuery_WithMacro.feature renamed to src/e2e-test/features/bigquery/sink/BigQuerySink_WithMacro.feature

Lines changed: 50 additions & 0 deletions
@@ -77,3 +77,53 @@ Feature: BigQuery sink - Verification of GCS to BigQuery successful data transfer
     Then Verify the pipeline status is "Succeeded"
     Then Get count of no of records transferred to target BigQuery Table
     Then Validate the cmek key "cmekBQ" of target BigQuery table if cmek is enabled
+
+  @BQ_INSERT_INT_SOURCE_TEST @BQ_SINK_TEST @BigQuery_Sink_Required
+  Scenario: Validate successful records transfer from BigQuery to BigQuery with macro arguments for Advanced and Auto Create sections
+    Given Open Datafusion Project to configure pipeline
+    When Source is BigQuery
+    When Sink is BigQuery
+    Then Open BigQuery source properties
+    Then Enter the BigQuery source mandatory properties
+    Then Validate "BigQuery" plugin properties
+    Then Close the BigQuery properties
+    Then Open BigQuery sink properties
+    Then Enter BigQuery property reference name
+    Then Enter BigQuery property projectId "projectId"
+    Then Enter BigQuery property datasetProjectId "projectId"
+    Then Override Service account details if set in environment variables
+    Then Enter BigQuery property dataset "dataset"
+    Then Enter BigQuery sink property table name
+    Then Enter BiqQuery property encryption key name "cmekBQ" if cmek is enabled
+    Then Toggle BigQuery sink property truncateTable to true
+    Then Toggle BigQuery sink property updateTableSchema to true
+    Then Click on the Macro button of Property: "operation" and set the value to: "BqOperationType"
+    Then Click on the Macro button of Property: "relationTableKey" and set the value to: "tableKey"
+    Then Click on the Macro button of Property: "partitioningType" and set the value to: "BqPartioningType"
+    Then Click on the Macro button of Property: "rangeStart" and set the value to: "BqRangeStart"
+    Then Click on the Macro button of Property: "rangeEnd" and set the value to: "BqRangeEnd"
+    Then Click on the Macro button of Property: "rangeInterval" and set the value to: "BqRangeInterval"
+    Then Click on the Macro button of Property: "partitionByField" and set the value to: "BqPartitionByField"
+    Then Click on the Macro button of Property: "partitionFilter" and set the value to: "BqPartitionFilter"
+    Then Click on the Macro button of Property: "clusteringOrder" and set the value to: "BqClusteringOrder"
+    Then Validate "BigQuery2" plugin properties
+    Then Close the BigQuery properties
+    Then Connect plugins: "BigQuery" and "BigQuery2" to establish connection
+    Then Save the pipeline
+    Then Deploy the pipeline
+    Then Click on the Runtime Arguments Dropdown button
+    Then Enter runtime argument value "bqOperationType" for key "BqOperationType"
+    Then Enter runtime argument value "TableKey" for key "tableKey"
+    Then Enter runtime argument value "bqPartioningType" for key "BqPartioningType"
+    Then Enter runtime argument value "rangeStartValue" for key "BqRangeStart"
+    Then Enter runtime argument value "rangeEndValue" for key "BqRangeEnd"
+    Then Enter runtime argument value "rangeIntervalValue" for key "BqRangeInterval"
+    Then Enter runtime argument value "partitionByFieldValue" for key "BqPartitionByField"
+    Then Enter runtime argument value "bqPartitionFilterMacro" for key "BqPartitionFilter"
+    Then Enter runtime argument value "BqclusterValue" for key "BqClusteringOrder"
+    Then Run the Pipeline in Runtime with runtime arguments
+    Then Wait till pipeline is in running state
+    Then Open and capture logs
+    Then Verify the pipeline status is "Succeeded"
+    Then Get count of no of records transferred to target BigQuery Table
+    Then Validate the cmek key "cmekBQ" of target BigQuery table if cmek is enabled
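The macro steps above defer property values to run time: each property holds a `${key}` placeholder that is substituted from the runtime arguments entered before the run. A simplified sketch of that substitution (real CDAP macro evaluation also supports nesting and macro functions such as `${secure(...)}`):

```python
import re

def resolve_macros(config, runtime_args):
    """Substitute ${key} placeholders in plugin config values with runtime
    argument values. Simplified sketch of CDAP-style macro evaluation."""
    def substitute(value):
        if isinstance(value, str):
            return re.sub(r"\$\{([^}]+)\}",
                          lambda m: str(runtime_args[m.group(1)]), value)
        return value
    return {key: substitute(value) for key, value in config.items()}

# Values mirror the runtime arguments used in the scenario above.
config = {"operation": "${BqOperationType}", "relationTableKey": "${tableKey}"}
args = {"BqOperationType": "Insert", "tableKey": "PersonID"}
assert resolve_macros(config, args) == {"operation": "Insert",
                                        "relationTableKey": "PersonID"}
```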

src/e2e-test/features/bigquery/sink/BigQueryToBigQuerySink.feature

Lines changed: 151 additions & 0 deletions
@@ -345,3 +345,154 @@ Feature: BigQuery sink - Verification of BigQuery to BigQuery successful data transfer
     Then Close the pipeline logs
     Then Verify the pipeline status is "Succeeded"
     Then Validate the values of records transferred to BQ sink is equal to the values from source BigQuery table
+
+  @BQ_INSERT_SOURCE_TEST @BQ_SINK_TEST @EXISTING_BQ_CONNECTION @BigQuery_Sink_Required @ITN_TEST
+  Scenario Outline: Validate successful records transfer from BigQuery to BigQuery with different time partitioning type options
+    Given Open Datafusion Project to configure pipeline
+    When Expand Plugin group in the LHS plugins list: "Source"
+    When Select plugin: "BigQuery" from the plugins list as: "Source"
+    When Expand Plugin group in the LHS plugins list: "Sink"
+    When Select plugin: "BigQuery" from the plugins list as: "Sink"
+    Then Connect plugins: "BigQuery" and "BigQuery2" to establish connection
+    Then Navigate to the properties page of plugin: "BigQuery"
+    Then Click plugin property: "switch-useConnection"
+    Then Click on the Browse Connections button
+    Then Select connection: "bqConnectionName"
+    Then Click on the Browse button inside plugin properties
+    Then Select connection data row with name: "dataset"
+    Then Select connection data row with name: "bqSourceTable"
+    Then Wait till connection data loading completes with a timeout of 60 seconds
+    Then Verify input plugin property: "dataset" contains value: "dataset"
+    Then Verify input plugin property: "table" contains value: "bqSourceTable"
+    Then Click on the Get Schema button
+    Then Validate "BigQuery" plugin properties
+    And Close the Plugin Properties page
+    Then Navigate to the properties page of plugin: "BigQuery2"
+    Then Click plugin property: "useConnection"
+    Then Click on the Browse Connections button
+    Then Select connection: "bqConnectionName"
+    Then Enter input plugin property: "referenceName" with value: "BQSinkReferenceName"
+    Then Click on the Browse button inside plugin properties
+    Then Click SELECT button inside connection data row with name: "dataset"
+    Then Wait till connection data loading completes with a timeout of 60 seconds
+    Then Verify input plugin property: "dataset" contains value: "dataset"
+    Then Enter input plugin property: "table" with value: "bqTargetTable"
+    And Select radio button plugin property: "operation" with value: "upsert"
+    And Select radio button plugin property: "timePartitioningType" with value: "<options>"
+    Then Click on the Add Button of the property: "relationTableKey" with value:
+      | TableKey |
+    Then Validate "BigQuery" plugin properties
+    And Close the Plugin Properties page
+    Then Save the pipeline
+    Then Preview and run the pipeline
+    Then Wait till pipeline preview is in running state
+    Then Open and capture pipeline preview logs
+    Then Verify the preview run status of pipeline in the logs is "succeeded"
+    Then Close the pipeline logs
+    Then Close the preview
+    Then Deploy the pipeline
+    Then Run the Pipeline in Runtime
+    Then Wait till pipeline is in running state
+    Then Open and capture logs
+    Then Close the pipeline logs
+    Then Verify the pipeline status is "Succeeded"
+    Then Validate the values of records transferred to BQ sink is equal to the values from source BigQuery table
+    Examples:
+      | options |
+      | DAY     |
+      | HOUR    |
+      | MONTH   |
+      | YEAR    |
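Each `<options>` value in the Examples table is one of BigQuery's time partitioning granularities. In DDL terms, the sink's target table would be partitioned roughly as below; this is only an illustrative sketch assuming a TIMESTAMP column, since the sink configures partitioning through the API rather than DDL:

```python
# The four <options> rows from the Examples table are exactly BigQuery's
# time partitioning granularities.
TIME_PARTITIONING_TYPES = ("DAY", "HOUR", "MONTH", "YEAR")

def partition_clause(column, option):
    """Build an illustrative PARTITION BY clause for a TIMESTAMP column.
    (A DATE column would use DATE_TRUNC instead.)"""
    if option not in TIME_PARTITIONING_TYPES:
        raise ValueError("Unsupported time partitioning type: %s" % option)
    return "PARTITION BY TIMESTAMP_TRUNC(%s, %s)" % (column, option)

assert partition_clause("transaction_date", "DAY") == \
    "PARTITION BY TIMESTAMP_TRUNC(transaction_date, DAY)"
```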
+
+  @BQ_SOURCE_DATATYPE_TEST @BQ_SINK_TEST
+  Scenario: Validate successful records transfer from BigQuery to BigQuery with BQ Job Labels with Key and Value pairs
+    Given Open Datafusion Project to configure pipeline
+    When Expand Plugin group in the LHS plugins list: "Source"
+    When Select plugin: "BigQuery" from the plugins list as: "Source"
+    When Expand Plugin group in the LHS plugins list: "Sink"
+    When Select plugin: "BigQuery" from the plugins list as: "Sink"
+    Then Connect plugins: "BigQuery" and "BigQuery2" to establish connection
+    Then Navigate to the properties page of plugin: "BigQuery"
+    And Enter input plugin property: "referenceName" with value: "Reference"
+    And Replace input plugin property: "project" with value: "projectId"
+    And Enter input plugin property: "datasetProject" with value: "projectId"
+    And Replace input plugin property: "dataset" with value: "dataset"
+    Then Override Service account details if set in environment variables
+    And Enter input plugin property: "table" with value: "bqSourceTable"
+    Then Click on the Get Schema button
+    Then Validate "BigQuery" plugin properties
+    And Close the Plugin Properties page
+    Then Navigate to the properties page of plugin: "BigQuery2"
+    Then Replace input plugin property: "project" with value: "projectId"
+    Then Override Service account details if set in environment variables
+    Then Enter input plugin property: "datasetProject" with value: "projectId"
+    Then Enter input plugin property: "referenceName" with value: "BQReferenceName"
+    Then Enter input plugin property: "dataset" with value: "dataset"
+    Then Enter input plugin property: "table" with value: "bqTargetTable"
+    Then Click plugin property: "truncateTable"
+    Then Click plugin property: "updateTableSchema"
+    Then Click on the Add Button of the property: "jobLabels" with value:
+      | jobLabelKey |
+    Then Click on the Add Button of the property: "jobLabels" with value:
+      | jobLabelValue |
+    Then Enter BigQuery sink property partition field "bqPartitionFieldTime"
+    Then Validate "BigQuery" plugin properties
+    Then Close the BigQuery properties
+    Then Save the pipeline
+    Then Preview and run the pipeline
+    Then Wait till pipeline preview is in running state
+    Then Open and capture pipeline preview logs
+    Then Verify the preview run status of pipeline in the logs is "succeeded"
+    Then Close the pipeline logs
+    Then Close the preview
+    Then Deploy the pipeline
+    Then Run the Pipeline in Runtime
+    Then Wait till pipeline is in running state
+    Then Open and capture logs
+    Then Verify the pipeline status is "Succeeded"
+
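The `jobLabelKey` and `jobLabelValue` inputs in the scenario above come from pluginParameters.properties (e.g. `jobLabelValue=transaction_uid:redis`). A small sketch of parsing that `key:value` convention into the labels map attached to BigQuery jobs:

```python
def parse_job_labels(raw):
    """Parse comma-separated key:value BigQuery job labels, e.g. the test
    property jobLabelValue=transaction_uid:redis. A key with no ':' gets an
    empty value, matching BigQuery's key-only labels."""
    labels = {}
    for pair in raw.split(","):
        key, _, value = pair.partition(":")
        labels[key.strip()] = value.strip()
    return labels

assert parse_job_labels("transaction_uid:redis") == {"transaction_uid": "redis"}
assert parse_job_labels("env:test,team:data") == {"env": "test", "team": "data"}
```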
+  @BQ_SOURCE_DATATYPE_TEST @BQ_SINK_TEST
+  Scenario: Validate successful records transfer from BigQuery to BigQuery with Partition Filter
+    Given Open Datafusion Project to configure pipeline
+    When Expand Plugin group in the LHS plugins list: "Source"
+    When Select plugin: "BigQuery" from the plugins list as: "Source"
+    When Expand Plugin group in the LHS plugins list: "Sink"
+    When Select plugin: "BigQuery" from the plugins list as: "Sink"
+    Then Connect plugins: "BigQuery" and "BigQuery2" to establish connection
+    Then Navigate to the properties page of plugin: "BigQuery"
+    And Enter input plugin property: "referenceName" with value: "Reference"
+    And Replace input plugin property: "project" with value: "projectId"
+    And Enter input plugin property: "datasetProject" with value: "projectId"
+    And Replace input plugin property: "dataset" with value: "dataset"
+    Then Override Service account details if set in environment variables
+    And Enter input plugin property: "table" with value: "bqSourceTable"
+    Then Click on the Get Schema button
+    Then Validate "BigQuery" plugin properties
+    And Close the Plugin Properties page
+    Then Navigate to the properties page of plugin: "BigQuery2"
+    Then Replace input plugin property: "project" with value: "projectId"
+    Then Override Service account details if set in environment variables
+    Then Enter input plugin property: "datasetProject" with value: "projectId"
+    Then Enter input plugin property: "referenceName" with value: "BQReferenceName"
+    Then Enter input plugin property: "dataset" with value: "dataset"
+    Then Enter input plugin property: "table" with value: "bqTargetTable"
+    Then Click plugin property: "truncateTable"
+    Then Click plugin property: "updateTableSchema"
+    And Select radio button plugin property: "operation" with value: "upsert"
+    Then Click on the Add Button of the property: "relationTableKey" with value:
+      | bqTableKey |
+    Then Enter input plugin property: "partitionFilter" with value: "bqPartitionFilter"
+    Then Validate "BigQuery" plugin properties
+    Then Close the BigQuery properties
+    Then Save the pipeline
+    Then Preview and run the pipeline
+    Then Wait till pipeline preview is in running state
+    Then Open and capture pipeline preview logs
+    Then Verify the preview run status of pipeline in the logs is "succeeded"
+    Then Close the pipeline logs
+    Then Close the preview
+    Then Deploy the pipeline
+    Then Run the Pipeline in Runtime
+    Then Wait till pipeline is in running state
+    Then Open and capture logs
+    Then Verify the pipeline status is "Succeeded"
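With `operation` set to upsert, `relationTableKey` supplies the join key(s) and `partitionFilter` restricts which target partitions are scanned. The plugin generates its SQL internally; the MERGE below is only an illustrative sketch of how those two settings combine, and the column names are hypothetical:

```python
def build_upsert_merge(target, source, keys, partition_filter=None):
    """Illustrative MERGE for an upsert: join target and source on the
    relation table key(s); the partition filter narrows the target scan so
    BigQuery can prune partitions. Column names are hypothetical."""
    on = " AND ".join("T.%s = S.%s" % (k, k) for k in keys)
    if partition_filter:
        on = "(%s) AND (%s)" % (on, partition_filter)
    return ("MERGE `%s` T USING `%s` S ON %s "
            "WHEN MATCHED THEN UPDATE SET T.name = S.name "
            "WHEN NOT MATCHED THEN INSERT ROW" % (target, source, on))

sql = build_upsert_merge("proj.dataset.target", "proj.dataset.source",
                         ["unique_key"], "transaction_date >= '2023-01-01'")
assert "T.unique_key = S.unique_key" in sql
assert "transaction_date >= '2023-01-01'" in sql
```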

src/e2e-test/resources/pluginParameters.properties

Lines changed: 9 additions & 0 deletions
@@ -188,6 +188,9 @@ bqSourceSchema=[{"key":"Id","value":"long"},{"key":"Value","value":"long"},{"key
 bqPartitionSourceSchema=[{"key":"transaction_id","value":"long"},{"key":"transaction_uid","value":"string"},\
   {"key":"transaction_date","value":"date"}]
 bqMandatoryProperties=referenceName, dataset, table
+jobLabelKey=transaction_uid
+jobLabelValue=transaction_uid:redis
+jsonStringValue=transaction_uid
 bqIncorrectProjectId=incorrectprojectid
 bqIncorrectDatasetProjectId=incorrectdatasetprojectid
 bqIncorrectFormatProjectId=INCORRECTFORMAT
@@ -201,7 +204,11 @@ bqFuturePartitionEndDate=2099-10-02
 bqTruncateTableTrue=True
 bqUpdateTableSchemaTrue=True
 clusterValue=transaction_date
+BqclusterValue=Name
 TableKey=PersonID
+bqPartioningType=INTEGER
+bqPartitionFilter=transaction_uid
+bqPartitionFilterMacro=Name
 bqSourceTable=dummy
 bqCreateTableQueryFile=testdata/BigQuery/BigQueryCreateTableQuery.txt
 bqInsertDataQueryFile=testdata/BigQuery/BigQueryInsertDataQuery.txt
@@ -215,6 +222,7 @@ bqSourceSchemaDatatype=[{"key":"transaction_info","value":"boolean"},{"key":"tra
 {"key":"difference","value":"array"},{"key":"Userdata","value":"record"}]
 bqPartitionField=Month_of_Joining
 bqPartitionFieldTime=transaction_date
+bqTableKey=unique_key
 bqRangeStart=1
 bqRangeEnd=10
 bqRangeInterval=2
@@ -245,6 +253,7 @@ rangeIntervalValue=1
 partitionByFieldValue=ID
 bqPartitionFieldDateTime=transaction_dt
 bqPartitionFieldTimeStamp=updated_on
+bqOperationType=Insert
 bqSourceTable2=dummy
 dedupeBy=DESC
 TableKeyDedupe=Name
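The test properties above follow Java `.properties` conventions, including backslash line continuations (see `bqPartitionSourceSchema`). A minimal sketch of reading such key=value pairs in Python:

```python
def load_properties(text):
    """Minimal Java-style .properties parser: key=value pairs, '#' comments,
    and backslash line continuations. A sketch only; it skips escapes and
    other features of the full java.util.Properties format."""
    props, pending = {}, ""
    for raw in text.splitlines():
        line = pending + raw.strip()
        pending = ""
        if not line or line.startswith("#"):
            continue
        if line.endswith("\\"):
            pending = line[:-1]  # join with the next line
            continue
        key, _, value = line.partition("=")
        props[key.strip()] = value.strip()
    return props

sample = "bqOperationType=Insert\nTableKey=PersonID\n"
assert load_properties(sample) == {"bqOperationType": "Insert",
                                   "TableKey": "PersonID"}
```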
