Databricks Provider
Use the Databricks Terraform provider to interact with almost all Databricks resources.
Compute resources
Deploy databricks_cluster on selected databricks_node_type
Schedule automated databricks_job
Control cost and data access with databricks_cluster_policy
Speed up job & cluster startup with databricks_instance_pool
Customize clusters with databricks_global_init_script
Manage databricks_notebook resources, and even list them
Manage databricks_repo
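For example, a minimal sketch of deploying a cluster on the smallest available node type, using the databricks_node_type and databricks_spark_version data sources (the cluster name, autoscaling bounds, and auto-termination timeout are illustrative):

```hcl
data "databricks_node_type" "smallest" {
  local_disk = true
}

data "databricks_spark_version" "latest_lts" {
  long_term_support = true
}

resource "databricks_cluster" "shared_autoscaling" {
  cluster_name            = "Shared Autoscaling"
  spark_version           = data.databricks_spark_version.latest_lts.id
  node_type_id            = data.databricks_node_type.smallest.id
  autotermination_minutes = 20

  autoscale {
    min_workers = 1
    max_workers = 5
  }
}
```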
Storage
Manage JAR, Wheel & Egg libraries through databricks_dbfs_file
List entries on DBFS with databricks_dbfs_file_paths data source
Get contents of small files with databricks_dbfs_file data source
Mount storage with databricks_mount resource
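As a sketch, uploading a library to DBFS with databricks_dbfs_file looks roughly like this (the local file path and DBFS destination are placeholders):

```hcl
resource "databricks_dbfs_file" "app_wheel" {
  # Local artifact to upload; path is illustrative
  source = "${path.module}/dist/app-0.1.0-py3-none-any.whl"
  path   = "/FileStore/wheels/app-0.1.0-py3-none-any.whl"
}
```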
Security
Organize databricks_user into databricks_group through databricks_group_member, also reading metadata
Create databricks_service_principal with databricks_obo_token to enable even more restricted access control.
Create databricks_service_principal with databricks_service_principal_secret to authenticate with service principal OAuth tokens (AWS deployments only)
Manage data access with databricks_instance_profile, which can be assigned through databricks_group_instance_profile and databricks_user_instance_profile
Control which networks can access the workspace with databricks_ip_access_list
Generically manage databricks_permissions
Manage data object access control lists with databricks_sql_permissions
Keep sensitive elements like passwords in databricks_secret, grouped into databricks_secret_scope and controlled by databricks_secret_acl
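A minimal sketch of the secrets pattern above: a scope holding a secret, with read access granted to a group (the scope name, variable, and group name are illustrative assumptions):

```hcl
resource "databricks_secret_scope" "app" {
  name = "application-secrets"
}

resource "databricks_secret" "db_password" {
  key          = "db-password"
  string_value = var.db_password # supplied elsewhere; illustrative
  scope        = databricks_secret_scope.app.id
}

resource "databricks_secret_acl" "readers" {
  principal  = "data-engineers" # assumed group name
  permission = "READ"
  scope      = databricks_secret_scope.app.name
}
```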
Create workspaces in your VPC with DBFS, using cross-account IAM roles, with notebooks encrypted with a customer-managed key (CMK).
Use predefined AWS IAM Policy Templates: databricks_aws_assume_role_policy, databricks_aws_crossaccount_policy, databricks_aws_bucket_policy
Configure billing and audit databricks_mws_log_delivery
Databricks SQL
Create databricks_sql_endpoint controlled by databricks_permissions.
Manage queries and their visualizations.
Manage dashboards and their widgets.
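A sketch of a SQL endpoint with permissions managed through databricks_permissions (the endpoint name, sizing, and group name are illustrative):

```hcl
resource "databricks_sql_endpoint" "this" {
  name             = "Primary endpoint"
  cluster_size     = "Small"
  max_num_clusters = 1
}

resource "databricks_permissions" "endpoint_usage" {
  sql_endpoint_id = databricks_sql_endpoint.this.id

  access_control {
    group_name       = "analysts" # assumed group name
    permission_level = "CAN_USE"
  }
}
```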
Machine Learning
Create models in Unity Catalog.
Create MLflow experiments.
Create model serving endpoints.
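As a sketch, an MLflow experiment can be declared with databricks_mlflow_experiment (the workspace path is a placeholder):

```hcl
resource "databricks_mlflow_experiment" "this" {
  name        = "/Shared/experiments/churn-model" # illustrative path
  description = "Churn model training runs"
}
```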
Want to learn more about Databricks + Resourcely? Get in touch.