
Daily build fails on standalone cluster due to NoSuchMethodError #547

Closed
jeremyprime opened this issue Jun 2, 2023 · 2 comments · Fixed by #550
Labels: bug (Something isn't working), ci (Management of the continuous integration pipeline), High Priority

Comments


jeremyprime commented Jun 2, 2023

Problem Description

Daily tests on a standalone cluster have been failing since May 29.

No code changes occurred recently, so the failure is likely due to an updated Docker image or third-party dependency. The most likely culprit is the Bitnami Spark image, which appears to have recently moved to Spark 3.4.0. Supporting Spark 3.4.0 may become a larger story (see #549), so we may want to pin the tests to Spark 3.3.x for now.


Spark Connector Logs

Complex Types Tests failure:

23/06/02 09:32:35 ERROR Main$: Uncaught exception from tests: 'java.lang.String org.apache.spark.sql.execution.datasources.PartitionedFile.filePath()'
java.lang.NoSuchMethodError: 'java.lang.String org.apache.spark.sql.execution.datasources.PartitionedFile.filePath()'
	at com.vertica.spark.datasource.wrappers.VerticaScanWrapper.$anonfun$planInputPartitions$1(VerticaScanWrapper.scala:40)
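For context, Spark 3.4.0 changed the return type of `PartitionedFile.filePath` from `String` to `SparkPath`, so connector bytecode compiled against the 3.3.x signature fails to link at runtime with exactly this `NoSuchMethodError`. One common way connectors paper over such signature changes is a reflection-based accessor that does not hard-code the return type. The sketch below uses a hypothetical stub class rather than the real Spark classes, purely to illustrate the pattern:

```java
import java.lang.reflect.Method;

public class FilePathCompat {
    // Stub standing in for org.apache.spark.sql.execution.datasources.PartitionedFile
    // as it looks in Spark 3.3.x, where filePath() returns String. In Spark 3.4.0
    // the same method returns org.apache.spark.paths.SparkPath instead, so a call
    // site compiled against 3.3.x no longer matches any method descriptor at runtime.
    static class PartitionedFileStub {
        public String filePath() { return "hdfs:///data/part-0000.parquet"; }
    }

    // Resolve filePath() reflectively and stringify whatever it returns, so the
    // call site no longer depends on the exact return type baked into bytecode.
    static String filePathOf(Object pf) {
        try {
            Method m = pf.getClass().getMethod("filePath");
            return String.valueOf(m.invoke(pf));
        } catch (ReflectiveOperationException e) {
            throw new IllegalStateException("no usable filePath() method", e);
        }
    }

    public static void main(String[] args) {
        System.out.println(filePathOf(new PartitionedFileStub()));
    }
}
```

This is only a sketch of the compatibility pattern; the actual fix tracked in #549 may instead build the connector against the Spark 3.4 API directly.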
@jeremyprime added the bug, ci, and High Priority labels on Jun 2, 2023
@jeremyprime (Collaborator, Author) commented:

If pinning the version to 3.3.x, update the following files, which use the latest tag (or reference that fact):

  • docker/docker-compose.yml (see client, spark-master, and spark-worker)
  • docker/README.md
  • .github/workflows/nightly.yml (see run-integration-tests-on-standalone-cluster, since it uses the default version from docker-compose.yml)

Updating docker-compose.yml and README.md is enough, since the nightly tests rely on the default version in docker-compose.yml.

The latest Spark 3.3.x tag for the Bitnami Spark image can be found on Docker Hub.
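A minimal sketch of what the pin might look like in docker/docker-compose.yml. The service names come from the list above, but the exact image reference and 3.3.x tag are assumptions; take the real tag from Docker Hub:

```yaml
# Hypothetical excerpt of docker/docker-compose.yml.
# The exact 3.3.x patch tag is an assumption; check Docker Hub
# for the newest bitnami/spark 3.3.x tag before pinning.
services:
  spark-master:
    image: docker.io/bitnami/spark:3.3   # was: bitnami/spark:latest
  spark-worker:
    image: docker.io/bitnami/spark:3.3   # was: bitnami/spark:latest
```

Pinning to the minor version (3.3) rather than an exact patch tag keeps patch updates flowing while avoiding the 3.4.0 API break.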

@jbobson98 jbobson98 self-assigned this Jun 29, 2023
@jbobson98 jbobson98 linked a pull request Jun 29, 2023 that will close this issue