Update versions of Scala, Spark and sbt for builds and CI/CD. #383

Merged: 3 commits, Mar 22, 2024
8 changes: 5 additions & 3 deletions .github/workflows/release.yml
@@ -109,10 +109,12 @@ jobs:
         with:
           ref: main

-      - name: Setup Scala
-        uses: olafurpg/setup-scala@v14
+      - name: Setup JDK and sbt
+        uses: actions/setup-[email protected]
         with:
-          java-version: "[email protected]"
+          distribution: temurin
+          java-version: 8
+          cache: sbt

       - name: Import GPG Key
         run: |
22 changes: 13 additions & 9 deletions .github/workflows/scala.yml
@@ -18,26 +18,30 @@ jobs:
     strategy:
       fail-fast: false
       matrix:
-        scala: [2.11.12, 2.12.18, 2.13.12]
-        spark: [2.4.8, 3.2.4, 3.3.3, 3.4.1]
+        scala: [2.11.12, 2.12.19, 2.13.13]
Comment (Contributor):
Wouldn't an include syntax be better than exclude in the case of Scala 2.11?
https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idstrategymatrixinclude

Reply (Collaborator, PR author):
I'm going to check and compare, thanks!
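The reviewer's suggestion could be sketched along these lines. This is a hypothetical alternative, not what this PR merged: the base matrix lists only the Scala versions that are valid for every listed Spark version, and `include` adds the single legacy Scala 2.11 + Spark 2.4.8 combination, replacing the five `exclude` entries.

```yaml
# Hypothetical include-based matrix (a sketch, not part of this PR).
strategy:
  fail-fast: false
  matrix:
    scala: [2.12.19, 2.13.13]
    spark: [3.3.4, 3.4.2, 3.5.1]
    include:
      # This entry matches none of the base combinations, so GitHub
      # Actions adds it as a new combination rather than merging it.
      - scala: 2.11.12
        spark: 2.4.8
```

Since the `include` object cannot be merged into any base combination without overwriting its values, it creates a new one, yielding the same seven Scala/Spark builds as the exclude-based matrix in this PR, with fewer lines to keep in sync.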

+        spark: [2.4.8, 3.3.4, 3.4.2, 3.5.1]
         exclude:
           - scala: 2.11.12
-            spark: 3.2.4
+            spark: 3.3.4
           - scala: 2.11.12
-            spark: 3.3.3
+            spark: 3.4.2
           - scala: 2.11.12
-            spark: 3.4.1
-          - scala: 2.13.12
+            spark: 3.5.1
+          - scala: 2.12.19
             spark: 2.4.8
+          - scala: 2.13.13
+            spark: 2.4.8
     name: Test Spark ${{matrix.spark}} on Scala ${{matrix.scala}}
     steps:
       - name: Checkout code
         uses: actions/checkout@v2
       - uses: coursier/cache-action@v5
-      - name: Setup Scala
-        uses: olafurpg/setup-scala@v10
+      - name: Setup JDK and sbt
+        uses: actions/setup-[email protected]
         with:
-          java-version: "[email protected]"
+          distribution: temurin
+          java-version: 8
+          cache: sbt
       - name: Build and run unit tests
         working-directory: ./pramen
         run: sbt ++${{matrix.scala}} test -DSPARK_VERSION=${{matrix.spark}}
4 changes: 2 additions & 2 deletions pramen/build.sbt
@@ -20,8 +20,8 @@ import BuildInfoTemplateSettings._
 import com.github.sbt.jacoco.report.JacocoReportSettings

 val scala211 = "2.11.12"
-val scala212 = "2.12.18"
-val scala213 = "2.13.12"
+val scala212 = "2.12.19"
+val scala213 = "2.13.13"

ThisBuild / organization := "za.co.absa.pramen"

2 changes: 1 addition & 1 deletion pramen/examples/combined_example.sh
@@ -3,7 +3,7 @@
 # Prerequisites:
 # 1. Download Spark 3.4.1 (Scala 2.12) and install it in /opt/spark/spark-3.4.1 or some other directory
 # 2. At repo_root/pramen, run
-#    sbt -DSPARK_VERSION="3.4.1" ++2.12.18 assembly
+#    sbt -DSPARK_VERSION="3.4.1" ++2.12.19 assembly
 # 3. Run
 #    ./examples/combined_example.sh

2 changes: 1 addition & 1 deletion pramen/project/Dependencies.scala
@@ -50,7 +50,7 @@ object Dependencies {
     "org.apache.httpcomponents" % "httpclient" % httpClientVersion,
     "org.scalatest" %% "scalatest" % scalatestVersion % Test
   ) ++ Seq(
-    getAbrisDependency(scalaVersion),
+    getAbrisDependency(sparkVersion(scalaVersion)),
     getDeltaDependency(sparkVersion(scalaVersion), isCompile = false, isTest = true)
   )

20 changes: 11 additions & 9 deletions pramen/project/Versions.scala
@@ -18,8 +18,8 @@ import sbt.*

 object Versions {
   val defaultSparkVersionForScala211 = "2.4.8"
-  val defaultSparkVersionForScala212 = "3.3.3"
-  val defaultSparkVersionForScala213 = "3.4.1"
+  val defaultSparkVersionForScala212 = "3.3.4"
+  val defaultSparkVersionForScala213 = "3.4.2"

val typesafeConfigVersion = "1.4.0"
val postgreSqlDriverVersion = "42.3.8"
@@ -93,13 +93,15 @@
     }
   }

-  def getAbrisDependency(scalaVersion: String): ModuleID = {
-    // According to this: https://docs.delta.io/latest/releases.html
-    val abrisVersion = scalaVersion match {
-      case version if version.startsWith("2.11.") => "5.1.1"
-      case version if version.startsWith("2.12.") => "5.1.1"
-      case version if version.startsWith("2.13.") => "6.0.0"
-      case _ => throw new IllegalArgumentException(s"Scala $scalaVersion not supported for Abris dependency.")
+  def getAbrisDependency(sparkVersion: String): ModuleID = {
+    // According to this: https://github.com/AbsaOSS/ABRiS?tab=readme-ov-file#supported-versions
+    val abrisVersion = sparkVersion match {
+      case version if version.startsWith("2.4.") => "5.1.1"
+      case version if version.startsWith("3.0.") => "5.1.1"
+      case version if version.startsWith("3.1.") => "5.1.1"
+      case version if version == "3.2.0" => "6.1.1"
+      case version if version.startsWith("3.") => "6.4.0"
+      case _ => throw new IllegalArgumentException(s"Spark $sparkVersion not supported for Abris dependency.")
     }

     println(s"Using Abris version $abrisVersion")
2 changes: 1 addition & 1 deletion pramen/project/build.properties
@@ -13,4 +13,4 @@
 # limitations under the License.
 #

-sbt.version=1.9.7
+sbt.version=1.9.9