PySpark saveAsTable gives an error on overwrite for a PySpark DataFrame
In my PySpark code I perform more than 10 join operations with several groupBy operations in between. To avoid a very large DAG and the re-computation that comes with it, I decided to persist the intermediate DataFrame as a table. So I created a database and started saving my DataFrame into it, roughly as in the sketch below.
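This is an illustrative sketch of the pattern I'm describing, not my exact code; the database name `my_db`, the table name `stage_joined`, and the sample DataFrame are placeholders:

    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("truncate-lineage-via-table")
        .enableHiveSupport()   # persistent tables are backed by the metastore
        .getOrCreate()
    )

    # Database that holds the intermediate results
    spark.sql("CREATE DATABASE IF NOT EXISTS my_db")

    # Stand-in for the intermediate DataFrame produced by the joins/groupBys
    df = spark.range(100).withColumnRenamed("id", "key")

    # Save the intermediate result as a managed table, replacing any previous run
    df.write.mode("overwrite").saveAsTable("my_db.stage_joined")

    # Read it back so downstream transformations start from a short lineage
    df = spark.table("my_db.stage_joined")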