Facing duplicate column error while writing data from Spark dataframe to Oracle table
I am facing a weird issue while writing data from a Spark dataframe to an Oracle table. After reading the data from a parquet file, a temp table is created on top of the dataframe and the to_date function below is applied. I am using the same alias name because the parquet file has the same column names as the Oracle table, so I am not renaming the column.