
Commit

adds support for the Dataproc on GDC SparkApplication resource (#12237) (#850)

[upstream:f954b7c9dab564bf88a97b24f94e9d795889faaf]

Signed-off-by: Modular Magician <[email protected]>
modular-magician authored Nov 8, 2024
1 parent 71de6e8 commit 7f43847
Showing 24 changed files with 713 additions and 0 deletions.
15 changes: 15 additions & 0 deletions dataprocgdc_sparkapplication/backing_file.tf
@@ -0,0 +1,15 @@
# This file has some scaffolding to make sure that names are unique and that
# a region and zone are selected when you try to create your Terraform resources.

locals {
  name_suffix = random_pet.suffix.id
}

resource "random_pet" "suffix" {
  length = 2
}

provider "google" {
  region = "us-central1"
  zone   = "us-central1-c"
}
32 changes: 32 additions & 0 deletions dataprocgdc_sparkapplication/main.tf
@@ -0,0 +1,32 @@
resource "google_dataproc_gdc_application_environment" "app_env" {
  application_environment_id = "tf-e2e-spark-app-env-${local.name_suffix}"
  serviceinstance            = "do-not-delete-dataproc-gdc-instance"
  project                    = "my-project-${local.name_suffix}"
  location                   = "us-west2"
  namespace                  = "default"
}

resource "google_dataproc_gdc_spark_application" "spark-application" {
  spark_application_id = "tf-e2e-spark-app-${local.name_suffix}"
  serviceinstance      = "do-not-delete-dataproc-gdc-instance"
  project              = "my-project-${local.name_suffix}"
  location             = "us-west2"
  namespace            = "default"
  labels = {
    "test-label" = "label-value"
  }
  annotations = {
    "an_annotation" = "annotation_value"
  }
  properties = {
    "spark.executor.instances" = "2"
  }
  application_environment = google_dataproc_gdc_application_environment.app_env.name
  version                 = "1.2"
  spark_application_config {
    main_jar_file_uri = "file:///usr/lib/spark/examples/jars/spark-examples.jar"
    jar_file_uris     = ["file:///usr/lib/spark/examples/jars/spark-examples.jar"]
    archive_uris      = ["file:///usr/lib/spark/examples/jars/spark-examples.jar"]
    file_uris         = ["file:///usr/lib/spark/examples/jars/spark-examples.jar"]
  }
}
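After `terraform apply`, the identifiers of the created resources can be surfaced with `output` blocks. A minimal sketch (the output names are illustrative; `name` is the exported attribute already referenced by the config above, and `spark_application_id` simply echoes the configured argument):

```hcl
# Illustrative outputs; print them with `terraform output` after apply.
output "app_env_name" {
  value = google_dataproc_gdc_application_environment.app_env.name
}

output "spark_application_id" {
  value = google_dataproc_gdc_spark_application.spark-application.spark_application_id
}
```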
7 changes: 7 additions & 0 deletions dataprocgdc_sparkapplication/motd
@@ -0,0 +1,7 @@
===

These examples use real resources that will be billed to the
Google Cloud Platform project you use - so make sure that you
run "terraform destroy" before quitting!

===
79 changes: 79 additions & 0 deletions dataprocgdc_sparkapplication/tutorial.md
@@ -0,0 +1,79 @@
# Dataproc on GDC Spark Application - Terraform

## Setup

<walkthrough-author name="[email protected]" analyticsId="UA-125550242-1" tutorialName="dataprocgdc_sparkapplication" repositoryUrl="https://github.com/terraform-google-modules/docs-examples"></walkthrough-author>

Welcome to Terraform in Google Cloud Shell! First, let us know which project you'd like to use with Terraform.

<walkthrough-project-billing-setup></walkthrough-project-billing-setup>

Terraform provisions real GCP resources, so anything you create in this session will be billed against this project.

## Terraforming!

Let's use {{project-id}} with Terraform! Click the Cloud Shell icon below to copy the command
to your shell, and then run it from the shell by pressing Enter/Return. Terraform will pick up
the project name from the environment variable.

```bash
export GOOGLE_CLOUD_PROJECT={{project-id}}
```

After that, let's get Terraform started. Run the following to pull in the providers.

```bash
terraform init
```

With the providers downloaded and a project set, you're ready to use Terraform. Go ahead!

```bash
terraform apply
```

Terraform will show you what it plans to do, and prompt you to accept. Type "yes" to accept the plan.

```bash
yes
```


## Post-Apply

### Editing your config

Now you've provisioned your resources in GCP! If you run a "plan", you should see no changes needed.

```bash
terraform plan
```

So let's make a change! Try editing a number, or appending a value to the name in the editor. Then,
run a 'plan' again.

```bash
terraform plan
```

Afterwards you can run an apply, which implicitly does a plan and shows you the intended changes
at the 'yes' prompt.

```bash
terraform apply
```

```bash
yes
```

## Cleanup

Run the following to remove the resources Terraform provisioned:

```bash
terraform destroy
```
```bash
yes
```
15 changes: 15 additions & 0 deletions dataprocgdc_sparkapplication_basic/backing_file.tf
@@ -0,0 +1,15 @@
# This file has some scaffolding to make sure that names are unique and that
# a region and zone are selected when you try to create your Terraform resources.

locals {
  name_suffix = random_pet.suffix.id
}

resource "random_pet" "suffix" {
  length = 2
}

provider "google" {
  region = "us-central1"
  zone   = "us-central1-c"
}
12 changes: 12 additions & 0 deletions dataprocgdc_sparkapplication_basic/main.tf
@@ -0,0 +1,12 @@
resource "google_dataproc_gdc_spark_application" "spark-application" {
  spark_application_id = "tf-e2e-spark-app-basic-${local.name_suffix}"
  serviceinstance      = "do-not-delete-dataproc-gdc-instance"
  project              = "my-project-${local.name_suffix}"
  location             = "us-west2"
  namespace            = "default"
  spark_application_config {
    main_class    = "org.apache.spark.examples.SparkPi"
    jar_file_uris = ["file:///usr/lib/spark/examples/jars/spark-examples.jar"]
    args          = ["10000"]
  }
}
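In the basic example above, `main_class` points at Spark's bundled SparkPi example, and `args = ["10000"]` sets its partition count: SparkPi estimates π by sampling random points in the unit square and spreads that sampling across the given number of partitions. A minimal pure-Python sketch of the same estimator (no Spark involved; the function name, seed, and sample count are illustrative):

```python
import random


def estimate_pi(samples: int, seed: int = 42) -> float:
    """Monte Carlo pi estimate, mirroring what SparkPi distributes across partitions."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(samples):
        x, y = rng.random(), rng.random()
        # A point lands inside the quarter circle with probability pi/4.
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / samples


print(estimate_pi(100_000))
```

SparkPi's `args` value scales the work the same way `samples` does here, except each partition's sampling runs on a different executor.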
7 changes: 7 additions & 0 deletions dataprocgdc_sparkapplication_basic/motd
@@ -0,0 +1,7 @@
===

These examples use real resources that will be billed to the
Google Cloud Platform project you use - so make sure that you
run "terraform destroy" before quitting!

===
79 changes: 79 additions & 0 deletions dataprocgdc_sparkapplication_basic/tutorial.md
@@ -0,0 +1,79 @@
# Dataproc on GDC Spark Application (Basic) - Terraform

## Setup

<walkthrough-author name="[email protected]" analyticsId="UA-125550242-1" tutorialName="dataprocgdc_sparkapplication_basic" repositoryUrl="https://github.com/terraform-google-modules/docs-examples"></walkthrough-author>

Welcome to Terraform in Google Cloud Shell! First, let us know which project you'd like to use with Terraform.

<walkthrough-project-billing-setup></walkthrough-project-billing-setup>

Terraform provisions real GCP resources, so anything you create in this session will be billed against this project.

## Terraforming!

Let's use {{project-id}} with Terraform! Click the Cloud Shell icon below to copy the command
to your shell, and then run it from the shell by pressing Enter/Return. Terraform will pick up
the project name from the environment variable.

```bash
export GOOGLE_CLOUD_PROJECT={{project-id}}
```

After that, let's get Terraform started. Run the following to pull in the providers.

```bash
terraform init
```

With the providers downloaded and a project set, you're ready to use Terraform. Go ahead!

```bash
terraform apply
```

Terraform will show you what it plans to do, and prompt you to accept. Type "yes" to accept the plan.

```bash
yes
```


## Post-Apply

### Editing your config

Now you've provisioned your resources in GCP! If you run a "plan", you should see no changes needed.

```bash
terraform plan
```

So let's make a change! Try editing a number, or appending a value to the name in the editor. Then,
run a 'plan' again.

```bash
terraform plan
```

Afterwards you can run an apply, which implicitly does a plan and shows you the intended changes
at the 'yes' prompt.

```bash
terraform apply
```

```bash
yes
```

## Cleanup

Run the following to remove the resources Terraform provisioned:

```bash
terraform destroy
```
```bash
yes
```
15 changes: 15 additions & 0 deletions dataprocgdc_sparkapplication_pyspark/backing_file.tf
@@ -0,0 +1,15 @@
# This file has some scaffolding to make sure that names are unique and that
# a region and zone are selected when you try to create your Terraform resources.

locals {
  name_suffix = random_pet.suffix.id
}

resource "random_pet" "suffix" {
  length = 2
}

provider "google" {
  region = "us-central1"
  zone   = "us-central1-c"
}
17 changes: 17 additions & 0 deletions dataprocgdc_sparkapplication_pyspark/main.tf
@@ -0,0 +1,17 @@
resource "google_dataproc_gdc_spark_application" "spark-application" {
  spark_application_id = "tf-e2e-pyspark-app-${local.name_suffix}"
  serviceinstance      = "do-not-delete-dataproc-gdc-instance"
  project              = "my-project-${local.name_suffix}"
  location             = "us-west2"
  namespace            = "default"
  display_name         = "A PySpark application for a Terraform create test"
  dependency_images    = ["gcr.io/some/image"]
  pyspark_application_config {
    main_python_file_uri = "gs://goog-dataproc-initialization-actions-us-west2/conda/test_conda.py"
    jar_file_uris        = ["file:///usr/lib/spark/examples/jars/spark-examples.jar"]
    python_file_uris     = ["gs://goog-dataproc-initialization-actions-us-west2/conda/get-sys-exec.py"]
    file_uris            = ["file:///usr/lib/spark/examples/jars/spark-examples.jar"]
    archive_uris         = ["file:///usr/lib/spark/examples/jars/spark-examples.jar"]
    args                 = ["10"]
  }
}
7 changes: 7 additions & 0 deletions dataprocgdc_sparkapplication_pyspark/motd
@@ -0,0 +1,7 @@
===

These examples use real resources that will be billed to the
Google Cloud Platform project you use - so make sure that you
run "terraform destroy" before quitting!

===
79 changes: 79 additions & 0 deletions dataprocgdc_sparkapplication_pyspark/tutorial.md
@@ -0,0 +1,79 @@
# Dataproc on GDC Spark Application (PySpark) - Terraform

## Setup

<walkthrough-author name="[email protected]" analyticsId="UA-125550242-1" tutorialName="dataprocgdc_sparkapplication_pyspark" repositoryUrl="https://github.com/terraform-google-modules/docs-examples"></walkthrough-author>

Welcome to Terraform in Google Cloud Shell! First, let us know which project you'd like to use with Terraform.

<walkthrough-project-billing-setup></walkthrough-project-billing-setup>

Terraform provisions real GCP resources, so anything you create in this session will be billed against this project.

## Terraforming!

Let's use {{project-id}} with Terraform! Click the Cloud Shell icon below to copy the command
to your shell, and then run it from the shell by pressing Enter/Return. Terraform will pick up
the project name from the environment variable.

```bash
export GOOGLE_CLOUD_PROJECT={{project-id}}
```

After that, let's get Terraform started. Run the following to pull in the providers.

```bash
terraform init
```

With the providers downloaded and a project set, you're ready to use Terraform. Go ahead!

```bash
terraform apply
```

Terraform will show you what it plans to do, and prompt you to accept. Type "yes" to accept the plan.

```bash
yes
```


## Post-Apply

### Editing your config

Now you've provisioned your resources in GCP! If you run a "plan", you should see no changes needed.

```bash
terraform plan
```

So let's make a change! Try editing a number, or appending a value to the name in the editor. Then,
run a 'plan' again.

```bash
terraform plan
```

Afterwards you can run an apply, which implicitly does a plan and shows you the intended changes
at the 'yes' prompt.

```bash
terraform apply
```

```bash
yes
```

## Cleanup

Run the following to remove the resources Terraform provisioned:

```bash
terraform destroy
```
```bash
yes
```
15 changes: 15 additions & 0 deletions dataprocgdc_sparkapplication_sparkr/backing_file.tf
@@ -0,0 +1,15 @@
# This file has some scaffolding to make sure that names are unique and that
# a region and zone are selected when you try to create your Terraform resources.

locals {
  name_suffix = random_pet.suffix.id
}

resource "random_pet" "suffix" {
  length = 2
}

provider "google" {
  region = "us-central1"
  zone   = "us-central1-c"
}