Get Pipeline Executions by Pipeline ID: A Complete Guide
Modern data platforms rely heavily on pipelines to orchestrate workflows, automate tasks, and ensure seamless data processing. Monitoring these pipelines is just as important as building them. That’s where the Get Pipeline Executions by Pipeline ID API comes in.
In this blog, we’ll walk through how this API works, its parameters, and how you can use it effectively to track execution history.
API Overview
Endpoint:

```
GET /api/v1/pipelines/{pipelineId}
```
This API retrieves execution details for a specific pipeline using its unique pipelineId. It also supports filtering, sorting, and pagination, making it highly flexible for real-world use cases.
Input Parameters
Path Parameter
- pipelineId (Required): the unique identifier of the pipeline whose executions you want to fetch. Type: int32.
Query Parameters
1. status (Optional)
Filter executions based on their current state.
| Status | Code |
|---|---|
| RUNNING | 0 |
| STOPPED | 1 |
| COMPLETED | 2 |
| FAILED | 3 |
| STARTING | 4 |
| STOP | 5 |
| KILLED | 6 |
| QUEUED | 7 |
| TRIGGERED | 8 |
Default: -1 (returns all statuses)
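When post-processing responses, a small lookup table keeps the numeric status codes readable. This is a minimal sketch that simply mirrors the table above:

```python
# Status-code labels, taken from the status table above, for decoding
# the numeric "status" field in API responses.
PIPELINE_STATUS = {
    0: "RUNNING",
    1: "STOPPED",
    2: "COMPLETED",
    3: "FAILED",
    4: "STARTING",
    5: "STOP",
    6: "KILLED",
    7: "QUEUED",
    8: "TRIGGERED",
}

def status_label(code: int) -> str:
    """Return the human-readable name for a status code."""
    return PIPELINE_STATUS.get(code, "UNKNOWN")
```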
2. page (Optional)
Defines which page of results to retrieve.
- Default: 0
3. size (Optional)
Specifies the number of records per page.
- Default: 500
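Because results are paged, pulling a full execution history means looping over pages until a short page comes back. The sketch below keeps the HTTP call abstract: `fetch_page` is a hypothetical callable you supply (anything that returns the `content` list for a given page and size), so the paging logic itself stays testable.

```python
from typing import Callable, Iterator

def iter_executions(fetch_page: Callable[[int, int], list],
                    size: int = 500) -> Iterator[dict]:
    """Yield execution records across all pages.

    fetch_page(page, size) is a placeholder for whatever function you
    use to call the API; it should return the "content" list for that
    page. Iteration stops when a page comes back shorter than `size`.
    """
    page = 0
    while True:
        records = fetch_page(page, size)
        yield from records
        if len(records) < size:  # last (possibly empty) page reached
            break
        page += 1
```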
4. sortBy (Optional)
Field used to sort execution results.
| Field | Description |
|---|---|
| startTime | Execution start time |
| id | Execution ID |
| name | Pipeline name |
| category | Pipeline category |
| description | Pipeline description |
| username | Executed by user |
| status | Execution status |
Default: startTime
5. sortOrder (Optional)
Sorting direction.
- asc → Ascending
- dsc → Descending (Default)
6. filterBy (Optional)
Additional filtering criteria.
- myExecution → Only your executions
- allExecution → All executions
- '' (empty) → Returns only your executions (Default)
7. searchParam (Optional)
Search keyword to filter results by name, description, or other metadata.
- Type: string
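Putting the parameters together, here is a minimal sketch of calling the API with Python's standard library. `BASE_URL` and the bearer-token header are placeholders: substitute your deployment's host and whatever authentication scheme your server uses.

```python
import json
import urllib.parse
import urllib.request

BASE_URL = "https://your-host"   # placeholder: your server's base URL
TOKEN = "YOUR_API_TOKEN"         # placeholder: your auth credential

def build_url(pipeline_id, status=-1, page=0, size=500,
              sort_by="startTime", sort_order="dsc",
              filter_by="", search_param=""):
    """Build the request URL with the query parameters described above."""
    params = {"status": status, "page": page, "size": size,
              "sortBy": sort_by, "sortOrder": sort_order,
              "filterBy": filter_by, "searchParam": search_param}
    query = urllib.parse.urlencode(params)
    return f"{BASE_URL}/api/v1/pipelines/{pipeline_id}?{query}"

def get_pipeline_executions(pipeline_id, **kwargs):
    """Fetch one page of executions for a pipeline and parse the JSON."""
    req = urllib.request.Request(
        build_url(pipeline_id, **kwargs),
        headers={"Authorization": f"Bearer {TOKEN}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

For example, `get_pipeline_executions(4253, status=3, size=100)` requests the first 100 FAILED executions of pipeline 4253.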
Example Request
```
GET /api/v1/pipelines/4253?status=3&sortBy=startTime&sortOrder=dsc&page=0&size=100
```
This request retrieves:
- Executions for pipeline ID 4253
- Only FAILED executions (status=3)
- Sorted by start time, latest first
- The first page of up to 100 records
Example Response
{
"content": [
{
"id": 79651,
"pipelineId": 4253,
"userId": 1,
"projectId": 6703,
"pipelineScheduleId": null,
"status": 3,
"name": "databricks_create_cluster_run_notebook_pipeline",
"category": "airflow-databricks",
"description": "FailedReason: Cluster id is not added",
"analysisflowExecutionIds": null,
"connectionId": 2953,
"connectionName": "LOCAL-AIRFLOWS-Sparkflows-2026",
"startTime": 1777248306104,
"endTime": 1777248306318,
"emailOnFailure": "",
"emailOnSuccess": "",
"username": "admin",
"content": "{\"name\":\"databricks_create_cluster_run_notebook_pipeline\",\"uuid\":\"cc4d2f3d-d517-41c0-bc1c-2a9d88457ce1\",\"category\":\"airflow-databricks\",\"description\":\"\",\"parameters\":\"\",\"nodes\":[{\"id\":\"1\",\"path\":\"/12-Databricks/\",\"name\":\"Create Cluster\",\"description\":\"This node creates a new Cluster in Databricks by using details in configuration and passes the cluster ID to the next step.\",\"details\":\"\\u003ch2\\u003eCreate Databricks Cluster\\u003c/h2\\u003e\\n\\u003cbr\\u003e\\nThis node creates a new Cluster in Databricks by using details in configuration and passes the cluster ID to the next step.\\u003cbr\\u003e\",\"examples\":\"\\u003ch2\\u003e Create Databricks Cluster Examples\\u003c/h2\\u003e\\n\\u003cbr\\u003e\\n new_cluster\\u003d{\\u003cbr\\u003e\\n \\\"spark_version\\\": \\\"11.3.x-scala2.12\\\",\\u003cbr\\u003e\\n \\\"node_type_id\\\": \\\"Standard_DS3_v2\\\",\\u003cbr\\u003e\\n \\\"num_workers\\\": 2,\\u003cbr\\u003e\\n}\\u003cbr\\u003e\",\"type\":\"databricks\",\"nodeClass\":\"fire.pipelineNodes.DatabricksCreateCluster\",\"x\":\"239.4px\",\"y\":\"287.4px\",\"fields\":[{\"name\":\"Name\",\"value\":\"create_cluster\",\"widget\":\"textfield\",\"title\":\"Task Name\",\"description\":\"Unique name of the task in airflow DAG.\",\"required\":true,\"display\":true,\"editable\":true,\"expandable\":false,\"disableRefresh\":false,\"keyValue\":false},{\"name\":\"SparkVersion\",\"value\":\"10.4.x-scala2.12\",\"widget\":\"array\",\"title\":\"Databricks Runtime Version\",\"description\":\"Databricks Runtime Version like 
11.3.x-scala2.12\",\"optionsArray\":[\"12.2.x-scala2.12\",\"11.3.x-photon-scala2.12\",\"15.3.x-cpu-ml-photon-scala2.12\",\"10.4.x-cpu-ml-scala2.12\",\"15.4.x-photon-scala2.12\",\"15.4.x-cpu-ml-photon-scala2.12\",\"9.1.x-photon-scala2.12\",\"15.3.x-photon-scala2.12\",\"10.4.x-scala2.12\",\"13.3.x-scala2.12\",\"16.0.x-scala2.12\",\"11.3.x-cpu-ml-scala2.12\",\"11.3.x-scala2.12\",\"13.3.x-cpu-ml-scala2.12\",\"10.4.x-photon-scala2.12\",\"14.3.x-photon-scala2.12\",\"15.4.x-gpu-ml-scala2.12\",\"16.0.x-cpu-ml-photon-scala2.12\",\"14.1.x-scala2.12\",\"14.3.x-cpu-ml-scala2.12\",\"9.1.x-scala2.12\",\"15.2.x-gpu-ml-scala2.12\",\"16.0.x-gpu-ml-scala2.12\",\"12.2.x-photon-scala2.12\",\"12.2.x-cpu-ml-scala2.12\",\"15.2.x-scala2.12\",\"16.0.x-cpu-ml-scala2.12\",\"15.3.x-cpu-ml-scala2.12\",\"15.4.x-scala2.12\",\"11.3.x-gpu-ml-scala2.12\",\"15.3.x-scala2.12\",\"9.1.x-cpu-ml-scala2.12\",\"14.3.x-scala2.12\",\"15.3.x-gpu-ml-scala2.12\",\"15.4.x-cpu-ml-scala2.12\",\"15.2.x-cpu-ml-scala2.12\",\"16.0.x-photon-scala2.12\",\"14.1.x-gpu-ml-scala2.12\",\"9.1.x-gpu-ml-scala2.12\",\"13.3.x-gpu-ml-scala2.12\",\"16.1.x-scala2.12\",\"14.3.x-gpu-ml-scala2.12\",\"14.1.x-cpu-ml-scala2.12\",\"16.1.x-cpu-ml-photon-scala2.12\",\"16.1.x-cpu-ml-scala2.12\",\"16.1.x-photon-scala2.12\",\"16.1.x-gpu-ml-scala2.12\",\"12.2.x-gpu-ml-scala2.12\",\"15.2.x-photon-scala2.12\",\"13.3.x-photon-scala2.12\",\"10.4.x-gpu-ml-scala2.12\",\"14.1.x-photon-scala2.12\"],\"required\":true,\"display\":true,\"editable\":true,\"expandable\":false,\"disableRefresh\":false,\"keyValue\":false},{\"name\":\"NodeTypeId\",\"value\":\"Standard_D4plds_v6\",\"widget\":\"textfield\",\"title\":\"Worker Type\",\"description\":\"Worker Type\",\"required\":true,\"display\":true,\"editable\":true,\"expandable\":false,\"disableRefresh\":false,\"keyValue\":false},{\"name\":\"DriverTypeId\",\"value\":\"Standard_D4plds_v6\",\"widget\":\"textfield\",\"title\":\"Driver Type\",\"description\":\"Driver 
Type\",\"required\":true,\"display\":true,\"editable\":true,\"expandable\":false,\"disableRefresh\":false,\"keyValue\":false},{\"name\":\"MinWorkers\",\"value\":\"1\",\"widget\":\"textfield\",\"title\":\"Number of minimum Workers\",\"description\":\"Number of minimum Workers\",\"required\":false,\"display\":true,\"editable\":true,\"expandable\":false,\"disableRefresh\":false,\"keyValue\":false},{\"name\":\"MaxWorkers\",\"value\":\"1\",\"widget\":\"textfield\",\"title\":\"Number of maximum Workers\",\"description\":\"Number of maximum Workers\",\"required\":false,\"display\":true,\"editable\":true,\"expandable\":false,\"disableRefresh\":false,\"keyValue\":false},{\"name\":\"SparkConf\",\"value\":\"[{\\\"Key\\\":\\\"spark.sql.shuffle.partitions\\\",\\\"Value\\\":\\\"200\\\"}]\",\"widget\":\"variablesList\",\"title\":\"Spark Config\",\"description\":\"Add spark config values\",\"optionsArray\":[\"Key\",\"Value\"],\"required\":false,\"display\":true,\"editable\":true,\"expandable\":true,\"disableRefresh\":false,\"keyValue\":false},{\"name\":\"CustomTags\",\"value\":\"[{\\\"Key\\\":\\\"environment\\\",\\\"Value\\\":\\\"production\\\"}]\",\"widget\":\"variablesList\",\"title\":\"Custom Tags\",\"description\":\"Add custom tag values\",\"optionsArray\":[\"Key\",\"Value\"],\"required\":false,\"display\":true,\"editable\":true,\"expandable\":false,\"disableRefresh\":false,\"keyValue\":false},{\"name\":\"InitScripts\",\"value\":\"[{\\\"Type\\\":\\\"Workspace\\\",\\\"File path\\\":\\\"/Users/jayant@sparkflows.io/python_dependency/init_script.sh\\\"}]\",\"widget\":\"variablesList\",\"title\":\"Init Scripts Path\",\"description\":\"Add Init Scripts Path\",\"optionsArray\":[\"Type\",\"File path\"],\"required\":false,\"display\":true,\"editable\":true,\"expandable\":false,\"disableRefresh\":false,\"keyValue\":false}]},{\"id\":\"2\",\"path\":\"/12-Databricks/\",\"name\":\"Run notebook\",\"description\":\"This node use to submit a new Databricks job to Cluster by using details in 
configuration\",\"details\":\"\\u003ch2\\u003eSubmit Spark Job to Cluster\\u003c/h2\\u003e\\n\\u003cbr\\u003e\\nThis node use to submit a new Databricks job to Cluster by using details in configuration.\\u003cbr\\u003e\",\"examples\":\"\\u003ch2\\u003e Submit Spark Job to Cluster Examples\\u003c/h2\\u003e\\n\\u003cbr\\u003e\\nnew_cluster \\u003d {\\u003cbr\\u003e\\n \\\"spark_version\\\": \\\"9.1.x-scala2.12\\\",\\u003cbr\\u003e\\n \\\"node_type_id\\\": \\\"r3.xlarge\\\",\\u003cbr\\u003e\\n \\\"aws_attributes\\\": {\\\"availability\\\": \\\"ON_DEMAND\\\"},\\u003cbr\\u003e\\n \\\"num_workers\\\": 8,\\u003cbr\\u003e\\n}\\u003cbr\\u003e\\n\\u003cbr\\u003e\\nnotebook_task_params \\u003d {\\u003cbr\\u003e\\n \\\"new_cluster\\\": new_cluster,\\u003cbr\\u003e\\n \\\"notebook_task\\\": {\\u003cbr\\u003e\\n \\\"notebook_path\\\": \\\"/Users/airflow@example.com/PrepareData\\\",\\u003cbr\\u003e\\n },\\u003cbr\\u003e\\n}\\u003cbr\\u003e\\n\\u003cbr\\u003e\\nnotebook_task \\u003d DatabricksSubmitRunOperator(task_id\\u003d\\\"notebook_task\\\", json\\u003dnotebook_task_params)\\u003cbr\\u003e\",\"type\":\"databricks\",\"nodeClass\":\"fire.pipelineNodes.DatabricksSubmitRun\",\"x\":\"389.6px\",\"y\":\"296.6px\",\"fields\":[{\"name\":\"Name\",\"value\":\"submit_job_task\",\"widget\":\"textfield\",\"title\":\"Task Name\",\"description\":\"Unique name of the task in airflow DAG.\",\"required\":true,\"display\":true,\"editable\":true,\"expandable\":false,\"disableRefresh\":false,\"keyValue\":false},{\"name\":\"ClusterId\",\"value\":\"\",\"widget\":\"textfield\",\"title\":\"Cluster Id\",\"description\":\"If Cluster ID is empty, the step tries to pick the cluster ID from previous create cluster node(task).\",\"required\":false,\"display\":true,\"editable\":true,\"expandable\":false,\"disableRefresh\":false,\"keyValue\":false},{\"name\":\"NotebookPath\",\"value\":\"/Workspace/Users/jayant@sparkflows.io/AnalyticalApp/api-examples\",\"widget\":\"textfield\",\"title\":\"Notebook 
Path\",\"description\":\"Notebook Path\",\"required\":true,\"display\":true,\"editable\":true,\"expandable\":false,\"disableRefresh\":false,\"keyValue\":false},{\"name\":\"BaseParameters\",\"value\":\"[]\",\"widget\":\"variablesList\",\"title\":\"Base Parameters\",\"description\":\"Base Parameters\",\"optionsArray\":[\"Key\",\"Value\"],\"required\":false,\"display\":true,\"editable\":true,\"expandable\":false,\"disableRefresh\":false,\"keyValue\":false},{\"name\":\"timeout\",\"value\":\"120\",\"widget\":\"textfield\",\"title\":\"Timeout\",\"description\":\"Timeout for your Databricks task in Airflow to give it more time to complete, especially if it\\u0027s waiting for the cluster to reach the RUNNING state(In seconds).\",\"required\":true,\"display\":true,\"editable\":true,\"expandable\":false,\"disableRefresh\":false,\"keyValue\":false},{\"name\":\"DatabricksConnectionId\",\"value\":\"DATABRICKS_SPARKFLOWS\",\"widget\":\"connections\",\"title\":\"Databricks Connection\",\"description\":\"Databricks Connection\",\"required\":true,\"display\":true,\"editable\":true,\"expandable\":false,\"disableRefresh\":false,\"keyValue\":false},{\"name\":\"trigger_rule\",\"value\":\"all_success\",\"widget\":\"array\",\"title\":\"Trigger Rule\",\"description\":\"Trigger Rule to be used\",\"optionsArray\":[\"all_success\",\"all_failed\",\"all_done\",\"all_skipped\",\"one_failed\",\"one_success\",\"none_failed\",\"none_failed_min_one_success\",\"none_skipped\",\"always\"],\"required\":true,\"display\":true,\"editable\":true,\"expandable\":false,\"disableRefresh\":false,\"keyValue\":false}]},{\"id\":\"3\",\"path\":\"/07-TriggerNextDag/\",\"name\":\"notebook_run\",\"description\":\"This node triggers the next pipeline to run.\",\"details\":\"\\u003ch2\\u003eTrigger next dag run\\u003c/h2\\u003e\\n\\u003cbr\\u003e\\nThis node triggers the next pipeline to run.\\u003cbr\\u003e\",\"examples\":\"\\u003ch2\\u003e Trigger Next DAG 
Examples\\u003c/h2\\u003e\\n\\u003cbr\\u003e\\n\\u003ch4\\u003e Example of Creating a Trigger Next DAG Node\\u003c/h4\\u003e\\n\\u003cbr\\u003e\\n{\\u003cbr\\u003e\\n\\\"Name\\\": \\\"TriggerNextPipeline\\\",\\u003cbr\\u003e\\n\\\"Select Pipeline\\\": \\\"DataProcessingPipeline\\\"\\u003cbr\\u003e\\n}\\u003cbr\\u003e\\n\\u003cbr\\u003e\\n\\u003ch4\\u003e Explanation\\u003c/h4\\u003e\\n\\u003cul\\u003e\\n\\u003cli\\u003e This example demonstrates how to create a Trigger Next DAG node that triggers the `DataProcessingPipeline` as the next pipeline to run.\\u003c/li\\u003e\\n\\u003c/ul\\u003e\\n\\u003ch4\\u003e Usage\\u003c/h4\\u003e\\n\\u003cul\\u003e\\n\\u003cli\\u003e Task ID: \\\"trigger_next_pipeline_task\\\"\\u003c/li\\u003e\\n\\u003c/ul\\u003e\\n\\u003ch4\\u003e Additional Example of Creating a Trigger Next DAG Node with a Different Pipeline\\u003c/h4\\u003e\\n\\u003cbr\\u003e\\n{\\u003cbr\\u003e\\n\\\"Name\\\": \\\"TriggerAnotherPipeline\\\",\\u003cbr\\u003e\\n\\\"Select Pipeline\\\": \\\"ReportingPipeline\\\"\\u003cbr\\u003e\\n}\\u003cbr\\u003e\\n\\u003cbr\\u003e\\n\\u003ch4\\u003e Explanation\\u003c/h4\\u003e\\n\\u003cul\\u003e\\n\\u003cli\\u003e This example illustrates how to create a Trigger Next DAG node that triggers the `ReportingPipeline` instead, allowing for flexibility in the next pipeline execution.\\u003c/li\\u003e\\n\\u003c/ul\\u003e\\n\\u003ch4\\u003e Usage\\u003c/h4\\u003e\\n\\u003cul\\u003e\\n\\u003cli\\u003e Task ID: \\\"trigger_another_pipeline_task\\\"\\u003c/li\\u003e\\n\\u003c/ul\\u003e\",\"type\":\"TriggerDagRun\",\"nodeClass\":\"fire.pipelineNodes.TriggerDagRun\",\"x\":\"546.4px\",\"y\":\"299.4px\",\"fields\":[{\"name\":\"Name\",\"value\":\"child_notebook_run\",\"widget\":\"textfield\",\"title\":\"Task Name\",\"description\":\"Unique name of the task in Airflow DAG\",\"required\":true,\"display\":true,\"editable\":true,\"expandable\":false,\"disableRefresh\":false,\"keyValue\":false},{\"name\":\"Select 
Pipeline\",\"value\":\"11514\",\"widget\":\"pipeline\",\"title\":\"Select Pipeline\",\"description\":\"Select pipeline to trigger next\",\"required\":true,\"display\":true,\"editable\":true,\"expandable\":false,\"disableRefresh\":false,\"keyValue\":false},{\"name\":\"wait_for_completion\",\"value\":\"False\",\"widget\":\"array\",\"title\":\"Wait for completion\",\"description\":\"Waits for triggered DAG to finish\",\"optionsArray\":[\"True\",\"False\"],\"required\":false,\"display\":true,\"editable\":true,\"expandable\":false,\"disableRefresh\":false,\"keyValue\":false},{\"name\":\"poke_interval\",\"value\":\"\",\"widget\":\"textfield\",\"title\":\"Poke Interval\",\"description\":\"Poke Interval in seconds\",\"required\":false,\"display\":true,\"editable\":true,\"expandable\":false,\"disableRefresh\":false,\"keyValue\":false},{\"name\":\"parameters\",\"value\":\"[{\\\"Key\\\":\\\"cluster_id\\\",\\\"Value\\\":\\\"{{task_instance.xcom_pull(task_ids\\u003d\\u0027create_cluster\\u0027,key\\u003d\\u0027return_value\\u0027)}}\\\"}]\",\"widget\":\"variablesList\",\"title\":\"Parameters\",\"description\":\"List of parameters to be passed to the next pipeline\",\"optionsArray\":[\"Key\",\"Value\"],\"required\":false,\"display\":true,\"editable\":true,\"expandable\":true,\"disableRefresh\":false,\"keyValue\":false}]}],\"edges\":[{\"source\":\"1\",\"target\":\"2\",\"id\":1},{\"source\":\"2\",\"target\":\"3\",\"id\":2}]}",
"applicationId": null,
"airflowResponse": null,
"dagFilePath": "/home/sparkflows/airflow/dags/databricks_create_cluster_run_notebook_pipeline.py",
"metrics": null,
"pipelineParameters": "",
"fireJobId": null
},
{
"id": 79650,
"pipelineId": 4253,
"userId": 1,
"projectId": 6703,
"pipelineScheduleId": null,
"status": 3,
"name": "databricks_create_cluster_run_notebook_pipeline",
"category": "airflow-databricks",
"description": "FailedReason: Cluster id is not added",
"analysisflowExecutionIds": null,
"connectionId": 2953,
"connectionName": "LOCAL-AIRFLOWS-Sparkflows-2026",
"startTime": 1777248199575,
"endTime": 1777248199791,
"emailOnFailure": "",
"emailOnSuccess": "",
"username": "admin",
"content": "{\"name\":\"databricks_create_cluster_run_notebook_pipeline\",\"uuid\":\"cc4d2f3d-d517-41c0-bc1c-2a9d88457ce1\",\"category\":\"airflow-databricks\",\"description\":\"\",\"parameters\":\"\",\"nodes\":[{\"id\":\"1\",\"path\":\"/12-Databricks/\",\"name\":\"Create Cluster\",\"description\":\"This node creates a new Cluster in Databricks by using details in configuration and passes the cluster ID to the next step.\",\"details\":\"\\u003ch2\\u003eCreate Databricks Cluster\\u003c/h2\\u003e\\n\\u003cbr\\u003e\\nThis node creates a new Cluster in Databricks by using details in configuration and passes the cluster ID to the next step.\\u003cbr\\u003e\",\"examples\":\"\\u003ch2\\u003e Create Databricks Cluster Examples\\u003c/h2\\u003e\\n\\u003cbr\\u003e\\n new_cluster\\u003d{\\u003cbr\\u003e\\n \\\"spark_version\\\": \\\"11.3.x-scala2.12\\\",\\u003cbr\\u003e\\n \\\"node_type_id\\\": \\\"Standard_DS3_v2\\\",\\u003cbr\\u003e\\n \\\"num_workers\\\": 2,\\u003cbr\\u003e\\n}\\u003cbr\\u003e\",\"type\":\"databricks\",\"nodeClass\":\"fire.pipelineNodes.DatabricksCreateCluster\",\"x\":\"239.4px\",\"y\":\"287.4px\",\"fields\":[{\"name\":\"Name\",\"value\":\"create_cluster\",\"widget\":\"textfield\",\"title\":\"Task Name\",\"description\":\"Unique name of the task in airflow DAG.\",\"required\":true,\"display\":true,\"editable\":true,\"expandable\":false,\"disableRefresh\":false,\"keyValue\":false},{\"name\":\"SparkVersion\",\"value\":\"10.4.x-scala2.12\",\"widget\":\"array\",\"title\":\"Databricks Runtime Version\",\"description\":\"Databricks Runtime Version like 
11.3.x-scala2.12\",\"optionsArray\":[\"12.2.x-scala2.12\",\"11.3.x-photon-scala2.12\",\"15.3.x-cpu-ml-photon-scala2.12\",\"10.4.x-cpu-ml-scala2.12\",\"15.4.x-photon-scala2.12\",\"15.4.x-cpu-ml-photon-scala2.12\",\"9.1.x-photon-scala2.12\",\"15.3.x-photon-scala2.12\",\"10.4.x-scala2.12\",\"13.3.x-scala2.12\",\"16.0.x-scala2.12\",\"11.3.x-cpu-ml-scala2.12\",\"11.3.x-scala2.12\",\"13.3.x-cpu-ml-scala2.12\",\"10.4.x-photon-scala2.12\",\"14.3.x-photon-scala2.12\",\"15.4.x-gpu-ml-scala2.12\",\"16.0.x-cpu-ml-photon-scala2.12\",\"14.1.x-scala2.12\",\"14.3.x-cpu-ml-scala2.12\",\"9.1.x-scala2.12\",\"15.2.x-gpu-ml-scala2.12\",\"16.0.x-gpu-ml-scala2.12\",\"12.2.x-photon-scala2.12\",\"12.2.x-cpu-ml-scala2.12\",\"15.2.x-scala2.12\",\"16.0.x-cpu-ml-scala2.12\",\"15.3.x-cpu-ml-scala2.12\",\"15.4.x-scala2.12\",\"11.3.x-gpu-ml-scala2.12\",\"15.3.x-scala2.12\",\"9.1.x-cpu-ml-scala2.12\",\"14.3.x-scala2.12\",\"15.3.x-gpu-ml-scala2.12\",\"15.4.x-cpu-ml-scala2.12\",\"15.2.x-cpu-ml-scala2.12\",\"16.0.x-photon-scala2.12\",\"14.1.x-gpu-ml-scala2.12\",\"9.1.x-gpu-ml-scala2.12\",\"13.3.x-gpu-ml-scala2.12\",\"16.1.x-scala2.12\",\"14.3.x-gpu-ml-scala2.12\",\"14.1.x-cpu-ml-scala2.12\",\"16.1.x-cpu-ml-photon-scala2.12\",\"16.1.x-cpu-ml-scala2.12\",\"16.1.x-photon-scala2.12\",\"16.1.x-gpu-ml-scala2.12\",\"12.2.x-gpu-ml-scala2.12\",\"15.2.x-photon-scala2.12\",\"13.3.x-photon-scala2.12\",\"10.4.x-gpu-ml-scala2.12\",\"14.1.x-photon-scala2.12\"],\"required\":true,\"display\":true,\"editable\":true,\"expandable\":false,\"disableRefresh\":false,\"keyValue\":false},{\"name\":\"NodeTypeId\",\"value\":\"Standard_D4plds_v6\",\"widget\":\"textfield\",\"title\":\"Worker Type\",\"description\":\"Worker Type\",\"required\":true,\"display\":true,\"editable\":true,\"expandable\":false,\"disableRefresh\":false,\"keyValue\":false},{\"name\":\"DriverTypeId\",\"value\":\"Standard_D4plds_v6\",\"widget\":\"textfield\",\"title\":\"Driver Type\",\"description\":\"Driver 
Type\",\"required\":true,\"display\":true,\"editable\":true,\"expandable\":false,\"disableRefresh\":false,\"keyValue\":false},{\"name\":\"MinWorkers\",\"value\":\"1\",\"widget\":\"textfield\",\"title\":\"Number of minimum Workers\",\"description\":\"Number of minimum Workers\",\"required\":false,\"display\":true,\"editable\":true,\"expandable\":false,\"disableRefresh\":false,\"keyValue\":false},{\"name\":\"MaxWorkers\",\"value\":\"1\",\"widget\":\"textfield\",\"title\":\"Number of maximum Workers\",\"description\":\"Number of maximum Workers\",\"required\":false,\"display\":true,\"editable\":true,\"expandable\":false,\"disableRefresh\":false,\"keyValue\":false},{\"name\":\"SparkConf\",\"value\":\"[{\\\"Key\\\":\\\"spark.sql.shuffle.partitions\\\",\\\"Value\\\":\\\"200\\\"}]\",\"widget\":\"variablesList\",\"title\":\"Spark Config\",\"description\":\"Add spark config values\",\"optionsArray\":[\"Key\",\"Value\"],\"required\":false,\"display\":true,\"editable\":true,\"expandable\":true,\"disableRefresh\":false,\"keyValue\":false},{\"name\":\"CustomTags\",\"value\":\"[{\\\"Key\\\":\\\"environment\\\",\\\"Value\\\":\\\"production\\\"}]\",\"widget\":\"variablesList\",\"title\":\"Custom Tags\",\"description\":\"Add custom tag values\",\"optionsArray\":[\"Key\",\"Value\"],\"required\":false,\"display\":true,\"editable\":true,\"expandable\":false,\"disableRefresh\":false,\"keyValue\":false},{\"name\":\"InitScripts\",\"value\":\"[{\\\"Type\\\":\\\"Workspace\\\",\\\"File path\\\":\\\"/Users/jayant@sparkflows.io/python_dependency/init_script.sh\\\"}]\",\"widget\":\"variablesList\",\"title\":\"Init Scripts Path\",\"description\":\"Add Init Scripts Path\",\"optionsArray\":[\"Type\",\"File path\"],\"required\":false,\"display\":true,\"editable\":true,\"expandable\":false,\"disableRefresh\":false,\"keyValue\":false}]},{\"id\":\"2\",\"path\":\"/12-Databricks/\",\"name\":\"Run notebook\",\"description\":\"This node use to submit a new Databricks job to Cluster by using details in 
configuration\",\"details\":\"\\u003ch2\\u003eSubmit Spark Job to Cluster\\u003c/h2\\u003e\\n\\u003cbr\\u003e\\nThis node use to submit a new Databricks job to Cluster by using details in configuration.\\u003cbr\\u003e\",\"examples\":\"\\u003ch2\\u003e Submit Spark Job to Cluster Examples\\u003c/h2\\u003e\\n\\u003cbr\\u003e\\nnew_cluster \\u003d {\\u003cbr\\u003e\\n \\\"spark_version\\\": \\\"9.1.x-scala2.12\\\",\\u003cbr\\u003e\\n \\\"node_type_id\\\": \\\"r3.xlarge\\\",\\u003cbr\\u003e\\n \\\"aws_attributes\\\": {\\\"availability\\\": \\\"ON_DEMAND\\\"},\\u003cbr\\u003e\\n \\\"num_workers\\\": 8,\\u003cbr\\u003e\\n}\\u003cbr\\u003e\\n\\u003cbr\\u003e\\nnotebook_task_params \\u003d {\\u003cbr\\u003e\\n \\\"new_cluster\\\": new_cluster,\\u003cbr\\u003e\\n \\\"notebook_task\\\": {\\u003cbr\\u003e\\n \\\"notebook_path\\\": \\\"/Users/airflow@example.com/PrepareData\\\",\\u003cbr\\u003e\\n },\\u003cbr\\u003e\\n}\\u003cbr\\u003e\\n\\u003cbr\\u003e\\nnotebook_task \\u003d DatabricksSubmitRunOperator(task_id\\u003d\\\"notebook_task\\\", json\\u003dnotebook_task_params)\\u003cbr\\u003e\",\"type\":\"databricks\",\"nodeClass\":\"fire.pipelineNodes.DatabricksSubmitRun\",\"x\":\"394px\",\"y\":\"292px\",\"fields\":[{\"name\":\"Name\",\"value\":\"submit_job_task\",\"widget\":\"textfield\",\"title\":\"Task Name\",\"description\":\"Unique name of the task in airflow DAG.\",\"required\":true,\"display\":true,\"editable\":true,\"expandable\":false,\"disableRefresh\":false,\"keyValue\":false},{\"name\":\"ClusterId\",\"value\":\"\",\"widget\":\"textfield\",\"title\":\"Cluster Id\",\"description\":\"If Cluster ID is empty, the step tries to pick the cluster ID from previous create cluster node(task).\",\"required\":false,\"display\":true,\"editable\":true,\"expandable\":false,\"disableRefresh\":false,\"keyValue\":false},{\"name\":\"NotebookPath\",\"value\":\"/Workspace/Users/jayant@sparkflows.io/AnalyticalApp/api-examples\",\"widget\":\"textfield\",\"title\":\"Notebook 
Path\",\"description\":\"Notebook Path\",\"required\":true,\"display\":true,\"editable\":true,\"expandable\":false,\"disableRefresh\":false,\"keyValue\":false},{\"name\":\"BaseParameters\",\"value\":\"[]\",\"widget\":\"variablesList\",\"title\":\"Base Parameters\",\"description\":\"Base Parameters\",\"optionsArray\":[\"Key\",\"Value\"],\"required\":false,\"display\":true,\"editable\":true,\"expandable\":false,\"disableRefresh\":false,\"keyValue\":false},{\"name\":\"timeout\",\"value\":\"120\",\"widget\":\"textfield\",\"title\":\"Timeout\",\"description\":\"Timeout for your Databricks task in Airflow to give it more time to complete, especially if it\\u0027s waiting for the cluster to reach the RUNNING state(In seconds).\",\"required\":true,\"display\":true,\"editable\":true,\"expandable\":false,\"disableRefresh\":false,\"keyValue\":false},{\"name\":\"DatabricksConnectionId\",\"value\":\"DATABRICKS_WORKSPACE_3018\",\"widget\":\"connections\",\"title\":\"Databricks Connection\",\"description\":\"Databricks Connection\",\"required\":true,\"display\":true,\"editable\":true,\"expandable\":false,\"disableRefresh\":false,\"keyValue\":false},{\"name\":\"trigger_rule\",\"value\":\"all_success\",\"widget\":\"array\",\"title\":\"Trigger Rule\",\"description\":\"Trigger Rule to be used\",\"optionsArray\":[\"all_success\",\"all_failed\",\"all_done\",\"all_skipped\",\"one_failed\",\"one_success\",\"none_failed\",\"none_failed_min_one_success\",\"none_skipped\",\"always\"],\"required\":true,\"display\":true,\"editable\":true,\"expandable\":false,\"disableRefresh\":false,\"keyValue\":false}]}],\"edges\":[{\"source\":\"1\",\"target\":\"2\",\"id\":1}]}",
"applicationId": null,
"airflowResponse": null,
"dagFilePath": "/home/sparkflows/airflow/dags/databricks_create_cluster_run_notebook_pipeline.py",
"metrics": null,
"pipelineParameters": "",
"fireJobId": null
}
],
"totalElements": 2
}
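The startTime and endTime values appear to be epoch-millisecond timestamps (13 digits). A small sketch for turning a record into a readable one-line summary, using field names taken from the response above and only the fields needed here:

```python
from datetime import datetime, timezone

# One record trimmed to the fields used below; the real response
# carries many more, as shown in the example above.
execution = {
    "id": 79651,
    "status": 3,
    "description": "FailedReason: Cluster id is not added",
    "startTime": 1777248306104,   # epoch milliseconds
    "endTime": 1777248306318,
}

def summarize(rec: dict) -> str:
    """Format an execution record as a single human-readable line."""
    start = datetime.fromtimestamp(rec["startTime"] / 1000, tz=timezone.utc)
    duration_ms = rec["endTime"] - rec["startTime"]
    return (f"#{rec['id']} at {start.isoformat()} "
            f"({duration_ms} ms): {rec['description']}")
```

Looping `summarize` over `response["content"]` gives a quick failure digest; here the description field carries the failure reason for each FAILED run.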