Parquet column cannot be converted in file (…) Expected decimal, Found: FIXED_LEN_BYTE_ARRAY
After adding a column of type decimal(16,8) to the schema of a Hive table that already contained other decimal(38,18) columns, we’re facing the following error when trying to read data from it in Spark:
Parquet column cannot be converted in file (…) Expected decimal, Found: FIXED_LEN_BYTE_ARRAY
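For reference, a minimal sketch of the kind of statements involved. The database, table, and column names below are placeholders (not from our actual setup), and the ALTER is shown through Spark SQL here, though the same applies if the column is added directly in Hive:

```scala
import org.apache.spark.sql.SparkSession

// Placeholder names: "my_db.sales" and "adjustment" are illustrative only.
val spark = SparkSession.builder()
  .enableHiveSupport()
  .getOrCreate()

// The table already holds decimal(38,18) columns stored as Parquet.
// Add the new decimal(16,8) column to the Hive table schema.
spark.sql("ALTER TABLE my_db.sales ADD COLUMNS (adjustment DECIMAL(16,8))")

// Reading the table afterwards is the step that fails with
// "Parquet column cannot be converted ... Expected decimal, Found: FIXED_LEN_BYTE_ARRAY"
spark.sql("SELECT * FROM my_db.sales").show()
```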