Spark JDBC Postgres with partitionColumn yields duplicate and missing rows
I am reading a Postgres table through Spark JDBC with `numPartitions=10` (so 10 parallel connections), using `RNO` as the `partitionColumn`, with `lowerBound=0` and `upperBound` set to the number of rows in the table. The resulting DataFrame contains some rows twice and is missing others.
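For context on why this can happen: Spark does not read the table once and split it; it issues one SQL query per partition, each with its own `WHERE` range on the partition column. If `RNO` is not a stable, stored value (for example, if it is computed per query with `row_number()` in a subquery without a deterministic `ORDER BY`), the same physical row can fall into different ranges across the 10 queries, producing duplicates and gaps. The sketch below is a simplified, illustrative version of how Spark derives the per-partition predicates from `lowerBound`, `upperBound`, and `numPartitions`; it is not Spark's exact code.

```python
# Simplified sketch of Spark's JDBC range partitioning
# (modeled on JDBCRelation.columnPartition; illustrative only).
def partition_predicates(column, lower, upper, num_partitions):
    stride = (upper - lower) // num_partitions
    preds = []
    bound = lower
    for i in range(num_partitions):
        lb = f"{column} >= {bound}" if i > 0 else None
        bound += stride
        ub = f"{column} < {bound}" if i < num_partitions - 1 else None
        if lb and ub:
            preds.append(f"{lb} AND {ub}")
        elif lb:
            # last partition is unbounded above, so no rows beyond
            # upperBound are dropped
            preds.append(lb)
        else:
            # first partition also collects NULLs, as Spark does
            preds.append(f"{ub} OR {column} IS NULL")
    return preds

# e.g. 10 partitions over RNO in [0, 1000)
preds = partition_predicates("RNO", 0, 1000, 10)
```

Each predicate becomes a separate `SELECT ... WHERE <predicate>` against Postgres, so correctness depends on every query seeing the same `RNO` for the same row. A stored, indexed column (such as a primary key) avoids the problem.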
Spark 3.5.0 postgresql bpchar OutOfMemoryError
Calling `show()` (or other similar actions) on a Spark 3.5.0 (PySpark) DataFrame that includes a column read from PostgreSQL with datatype `bpchar` throws an `OutOfMemoryError`. The same read works fine on an earlier Spark version (3.4.1).
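One plausible cause, stated with appropriate caution: Spark 3.5 tightened CHAR/VARCHAR handling, and a `bpchar` column declared without a length can be mapped to a fixed-width char type with an enormous declared length, so padding values to that length exhausts memory. A common workaround pattern is to cast the column to `text` inside the pushed-down subquery, so Spark sees a plain `StringType`. The table name, column names, URL, and credentials below are placeholders, not from the question:

```python
# Hypothetical names (my_table, code_col, mydb) -- adjust for your setup.
# The cast happens inside Postgres, so Spark never sees the bpchar type.
jdbc_options = {
    "url": "jdbc:postgresql://localhost:5432/mydb",
    "dbtable": "(SELECT code_col::text AS code_col, other_col "
               "FROM my_table) AS t",
    "user": "spark",
    "password": "secret",
    "driver": "org.postgresql.Driver",
}

# With a live SparkSession this would be used as:
# df = spark.read.format("jdbc").options(**jdbc_options).load()
# df.show()
```

Another knob worth trying, if this diagnosis fits, is the real Spark config `spark.sql.legacy.charVarcharAsString=true`, which makes Spark treat CHAR/VARCHAR columns as plain strings; whether it covers this exact `bpchar` case would need to be verified against your version.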