The spark-shell and pyspark commands are not initializing even after the prerequisites have been completed
spark-shell
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
ReplGlobal.abort: bad constant pool index: 0 at pos: 49842
[init] error:
bad constant pool index: 0 at pos: 49842
while compiling:
during phase: globalPhase=, enteringPhase=
library version: version 2.12.17
compiler version: version 2.12.17
reconstructed args: -classpath -Yrepl-class-based -Yrepl-outdir C:\TEMP\spark-8c7c7071-0844-45a1-92e7-e97996bb80b8\repl-c344ae8d-0919-4569-97d2-0cc91ad23a37
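For context while diagnosing: a common trigger for this Scala 2.12 `bad constant pool index` REPL failure is launching spark-shell/pyspark on a newer JDK than the Spark build supports (Spark 3.0–3.2 supports Java 8/11; 3.3+ adds 17). The sketch below is an assumption-laden diagnostic, not a confirmed fix: the `SUPPORTED` set and helper names are mine, and the parsing only covers the usual `java -version` banner formats.

```python
import re
import subprocess

# Assumption: Spark 3.x with Scala 2.12; 8/11/17 are the commonly
# supported Java major versions (17 only from Spark 3.3 onward).
SUPPORTED = {8, 11, 17}

def major_java_version(banner: str) -> int:
    """Parse the major version from a `java -version` banner line,
    e.g. 'openjdk version "21.0.2"' -> 21, 'java version "1.8.0_392"' -> 8."""
    m = re.search(r'version "(\d+)(?:\.(\d+))?', banner)
    if not m:
        raise ValueError(f"unrecognized banner: {banner!r}")
    major = int(m.group(1))
    if major == 1:  # legacy 1.x scheme: Java 8 reports itself as 1.8
        major = int(m.group(2))
    return major

def check_java_for_spark() -> None:
    # `java -version` writes its banner to stderr, not stdout.
    banner = subprocess.run(
        ["java", "-version"], capture_output=True, text=True
    ).stderr.splitlines()[0]
    v = major_java_version(banner)
    verdict = "OK" if v in SUPPORTED else "likely incompatible with Scala 2.12"
    print(f"Java {v}: {verdict}")
```

Running `check_java_for_spark()` on the machine above would show which JDK spark-shell actually picks up; if it reports, say, Java 21, pointing `JAVA_HOME` at a Java 11 or 17 install is worth trying.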