Convert a Spark DataFrame to a slightly different case class?
I have some data in HDFS that is in parquet-protobuf.
Due to some project constraints, I want to read that data into a Spark DataFrame (easy) and then convert it to a case class that is slightly different (i.e. it holds the same data, but some fields have a different name and some need to be translated with a dictionary), roughly as sketched below.
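For concreteness, this is roughly the shape of what I'm trying to do (the field names, HDFS path, and dictionary here are made up purely to illustrate the problem):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

// Hypothetical target case class: field names differ from the parquet schema
case class Event(deviceId: String, status: String)

object ConvertExample {
  // Hypothetical dictionary for translating a coded field into readable values
  val statusDict = Map("0" -> "inactive", "1" -> "active")

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("convert-example").getOrCreate()
    import spark.implicits._

    // UDF that looks up the code in the dictionary, keeping the raw code if unknown
    val translateStatus = udf((code: String) => statusDict.getOrElse(code, code))

    // Read the parquet-protobuf data as a plain DataFrame (hypothetical path)
    val df = spark.read.parquet("hdfs:///path/to/data")

    // Rename columns to match the case class, translate the coded field,
    // then switch to a typed Dataset of the target case class
    val events = df
      .withColumnRenamed("device_id", "deviceId")            // hypothetical source column
      .withColumn("status", translateStatus(col("status_code")))
      .select("deviceId", "status")
      .as[Event]

    events.show()
    spark.stop()
  }
}
```

Is this the idiomatic way to do it, or is there a cleaner approach for mapping a DataFrame onto a differently named case class?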