
Share cluster params between jobs

I have a workflow WF1 in which I trigger another workflow WF2 from task T2. Here is the thing: I know I can pass the output of os.environ.copy() as a parameter to task T2 and call os.environ.update() inside WF2. But imagine any task of WF2 fails. When I Repair & Run WF2, it will no longer have WF1's environment variables. So my question is: is there any other way to copy environment variables from WF1 to WF2 that survives repair runs without losing all the variables?
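For context, a minimal sketch of the approach described above might look like this. It assumes WF2 exposes a parameter named wf1_env that the Run Job task in WF1 fills in; the parameter name and widget setup are assumptions for illustration, and dbutils is only defined inside Databricks notebooks:

    import json
    import os

    # In WF1, task T2: snapshot the current environment as a JSON string.
    # This string is what would be supplied as the hypothetical "wf1_env"
    # parameter of the Run Job task that triggers WF2.
    env_snapshot = json.dumps(os.environ.copy())

    # In WF2: read the parameter back and restore the variables.
    # dbutils.widgets.get() reads a notebook parameter on Databricks.
    wf1_env = json.loads(dbutils.widgets.get("wf1_env"))
    os.environ.update(wf1_env)

This only illustrates the setup the question describes; whether that parameter is still available on a Repair & Run of WF2 is exactly the open question.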