---
subcategory: "Workspace"
---
This resource allows you to manage Databricks workspace files.

You can declare a Terraform-managed workspace file by specifying the `source` attribute pointing to a corresponding local file.
```hcl
data "databricks_current_user" "me" {
}

resource "databricks_workspace_file" "module" {
  source = "${path.module}/module.py"
  path   = "${data.databricks_current_user.me.home}/AA/BB/CC"
}
```
You can also create a managed workspace file with inline source through the `content_base64` attribute.
```hcl
resource "databricks_workspace_file" "init_script" {
  content_base64 = base64encode(<<-EOT
    #!/bin/bash
    echo "Hello World"
    EOT
  )
  path = "/Shared/init-script.sh"
}
```
-> **Note** Files in the Databricks workspace are only changed when the Terraform state changes. This means that manual changes to managed workspace files won't be overwritten by Terraform if there is no local change to the file sources. Workspace files are identified by their path, so renaming a file manually in the workspace and then applying the Terraform state will re-create the workspace file from the Terraform state.
The size of a workspace file's source code must not exceed a few megabytes.

The following arguments are supported:
- `path` - (Required) The absolute path of the workspace file, beginning with "/", e.g. `/Demo`.
- `source` - Path to a file on the local filesystem. Conflicts with `content_base64`.
- `content_base64` - The base64-encoded file content. Conflicts with `source`. Use of `content_base64` is discouraged, as it increases the memory footprint of the Terraform state and should only be used in exceptional circumstances, like creating a workspace file with configuration properties for a data pipeline.
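One legitimate use of `content_base64` mentioned above is shipping a small configuration file for a data pipeline. A minimal sketch, assuming `jsonencode` for the file body; the file name and property values here are made up for illustration:

```hcl
resource "databricks_workspace_file" "pipeline_config" {
  # Hypothetical configuration properties for a data pipeline;
  # replace the keys and paths with your own.
  content_base64 = base64encode(jsonencode({
    input_path  = "/mnt/raw/events"
    output_path = "/mnt/curated/events"
  }))
  path = "/Shared/pipeline-config.json"
}
```

Because the encoded content is stored in state, keep such files small, as the argument description above advises.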
In addition to all arguments above, the following attributes are exported:
- `id` - Path of the workspace file.
- `url` - Routable URL of the workspace file.
- `object_id` - Unique identifier for the workspace file.
- `workspace_path` - Path on the Workspace File System (WSFS) in the form of `/Workspace` + `path`.
- databricks_permissions can control which groups or individual users can access the workspace file.
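As a hedged sketch of the access control above, assuming the `workspace_file_path` argument of `databricks_permissions` and a hypothetical group name:

```hcl
resource "databricks_permissions" "workspace_file_usage" {
  # Assumes `workspace_file_path` is accepted by databricks_permissions;
  # "Data Scientists" is a made-up group name for illustration.
  workspace_file_path = databricks_workspace_file.module.path

  access_control {
    group_name       = "Data Scientists"
    permission_level = "CAN_READ"
  }
}
```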
The workspace file resource can be imported using the workspace file path:

```bash
terraform import databricks_workspace_file.this /path/to/file
```
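On Terraform 1.5 or later, the same import can alternatively be expressed declaratively with an `import` block; a sketch assuming the same resource address and path as the CLI command:

```hcl
import {
  to = databricks_workspace_file.this
  id = "/path/to/file"
}
```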
The following resources are often used in the same context:
- End to end workspace management guide.
- databricks_cluster to create Databricks Clusters.
- databricks_directory to manage directories in Databricks Workspace.
- databricks_job to manage Databricks Jobs to run non-interactive code in a databricks_cluster.
- databricks_pipeline to deploy Delta Live Tables.
- databricks_repo to manage Databricks Repos.
- databricks_secret to manage secrets in Databricks workspace.
- databricks_secret_acl to manage access to secrets in Databricks workspace.
- databricks_secret_scope to create secret scopes in Databricks workspace.
- databricks_user to manage users that can be added to databricks_group within the workspace.
- databricks_user data to retrieve information about databricks_user.