PySpark gives ModuleNotFoundError when run from the command line
After installing Spark 3.4.4, JDK 1.8.0_152, Python 3.13.1, and Hadoop, I ran pyspark from the command line and got this error:

ModuleNotFoundError: No module named 'typing.io'; 'typing' is not a package

I double- and triple-checked the environment variables and confirmed that JAVA_HOME, SPARK_HOME, and HADOOP_HOME are set correctly. Any help is really appreciated.
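In case it helps, here is roughly how I verified the setup (shown for a Unix-style shell; on Windows cmd the equivalent would be echo %JAVA_HOME% and so on; the paths below are just examples, not my exact install locations):

```
# Confirm the environment variables point at the installs (example paths)
echo "$JAVA_HOME"     # e.g. /usr/lib/jvm/jdk1.8.0_152
echo "$SPARK_HOME"    # e.g. /opt/spark-3.4.4-bin-hadoop3
echo "$HADOOP_HOME"   # e.g. /opt/hadoop

# Confirm the versions that actually resolve on PATH
java -version                              # reports 1.8.0_152
python --version                           # reports Python 3.13.1
"$SPARK_HOME"/bin/spark-submit --version   # reports 3.4.4
```

All of these print what I expect, yet pyspark still fails with the error above.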