org.apache.beam.sdk.extensions.sql.BeamSqlDslSqlStdOperatorsTest > testArithmeticOperator failed due to a breaking change in apache/calcite#3481. Solved: updated the test.
apache/calcite@a326bd2#diff-9cda2b29a1b9206e0daa6e6d722eb476575f83eadea1a6e302225afd08d9f0d2 made `datetime` a reserved keyword, which failed several Nexmark tests. Solved: updated the tests.
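Since `datetime` is now reserved, any query using it as a plain identifier must quote it. A minimal sketch of the kind of fix this requires (table and column names here are invented for illustration, not taken from the Nexmark tests):

```sql
-- Fails to parse in Calcite 1.39+: datetime is now a reserved keyword.
SELECT datetime FROM events;

-- Works: quote the identifier to keep using it as a column name.
SELECT "datetime" FROM events;
```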
BeamSqlDslExistsTest failed because the query is now parsed differently. Previously, `NOT EXISTS` was parsed into a plan that invoked a logical join; now it becomes:
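For context, this is the general shape of a `NOT EXISTS` query whose planning regressed; the table and column names below are invented for illustration and are not from the actual test:

```sql
-- Older Calcite rewrote this subquery into a logical (anti-)join,
-- which Beam could translate. Newer Calcite produces a different
-- plan that Beam's rules do not yet handle.
SELECT o.id
FROM orders o
WHERE NOT EXISTS (
  SELECT 1 FROM shipments s WHERE s.order_id = o.id
);
```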
testProjectArrayFieldWithCoGBKJoin failed for a similar reason. I did a bisect (using un-vendored Calcite: Abacn@d2aff89). Previously (Calcite 1.38 and below), the parsed query converted successfully to a Beam plan. In Calcite 1.39+, the parsed query changes and can no longer be converted, for the same reason: LogicalFilter (convention: None -> BEAM_LOGICAL) is not implemented.
Opened https://issues.apache.org/jira/browse/CALCITE-7101. One can reproduce with Abacn@bdd6440.
I'm able to get all tests passing (other than ZetaSQL), except that the Iceberg SQL tests fail. These tests were added in #34799. @ahmedabu98 @talatuyarer would you mind taking a look? They can be reproduced locally.
- Fix sql/jdbc integration test
- Fix Iceberg tests
Looks like Calcite 1.40 is much stricter about nested row types; see the needed changes in 069f6b2. Given a type `c_arr_struct ARRAY<ROW<c_arr_struct_arr ARRAY<VARCHAR>, c_arr_struct_integer INTEGER>>`, this insert value no longer works: `ROW(ARRAY['abc', 'xyz'], 123)`. It now has to be cast explicitly: `CAST(ROW(ARRAY['abc', 'xyz'], 123) AS ROW(c_arr_struct_arr VARCHAR ARRAY, c_arr_struct_integer INTEGER))`.
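To make the change concrete, a full INSERT using that cast might look like the sketch below. The table name `my_table` is hypothetical; only the `c_arr_struct` column and its type come from the report above:

```sql
-- Hypothetical table holding the c_arr_struct column described above.
-- The column type is ARRAY<ROW<...>>, so the value is an ARRAY of cast ROWs.
INSERT INTO my_table (c_arr_struct)
VALUES (ARRAY[
  CAST(ROW(ARRAY['abc', 'xyz'], 123)
       AS ROW(c_arr_struct_arr VARCHAR ARRAY, c_arr_struct_integer INTEGER))
]);
```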
Thanks @ahmedabu98! I was able to bisect the Calcite version to 1.33 (passing) and 1.34 (all 3 integration tests failing). I use https://github.com/Abacn/beam/tree/unvendor-calcite-test for testing; it contains commits using different versions of Apache Calcite.
testSQLReadWithProjectAndFilterPushDown (org.apache.beam.sdk.extensions.sql.meta.provider.iceberg.IcebergReadWriteIT) failed with `org.apache.beam.sdk.Pipeline$PipelineExecutionException: org.apache.iceberg.exceptions.ValidationException: Cannot find field 'C_BOOLEAN' in struct: struct<1: c_integer: optional int, 2: c_float: optional float, 3: c_boolean: optional boolean, 4: c_timestamp: optional timestamptz, 5: c_varchar: optional string>`. Looks like a case-sensitivity issue?
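A sketch of why this looks like case sensitivity: Calcite typically normalizes unquoted identifiers to upper case, while the Iceberg schema stores lower-case field names. The table name below is hypothetical:

```sql
-- Unquoted: the parser normalizes c_boolean to C_BOOLEAN, which does not
-- match the lower-case field name in Iceberg's struct.
SELECT c_boolean FROM iceberg_table;

-- Quoted: the identifier's case is preserved, so it resolves to c_boolean.
SELECT "c_boolean" FROM iceberg_table;
```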
that should fix it |
Test #35483 #26403
Needs #35586. There are dependencies that ship multi-release JARs containing Java 21 classes.