Access SparkContext inside mapPartitions in PySpark

I have some code written in Scala that I want to use in PySpark (3.1.1) via Py4J:
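(The original Scala snippet is not shown here. As a rough sketch of the kind of setup the question describes — using a hypothetical Scala object `com.example.ScalaCode` as a stand-in — a Scala method can be reached through the Py4J gateway on the driver, but that gateway is only available in the driver process:)

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
sc = spark.sparkContext

# Calling a (hypothetical) Scala object through the Py4J gateway works on
# the driver, where sc._jvm is available:
result = sc._jvm.com.example.ScalaCode.process("input")

# mapPartitions functions, however, run on executors, which have no Py4J
# gateway and no SparkContext, so sc (and sc._jvm) cannot be used there:
# rdd.mapPartitions(lambda part: (sc._jvm.com.example.ScalaCode.process(x)
#                                 for x in part))  # fails on executors
```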