
I installed pyspark 3.2.0 via pip install pyspark, into a conda environment named pyspark. I cannot find spark-defaults.conf. I have been searching for it in ~/miniconda3/envs/pyspark/lib/python3.9/site-packages/pyspark, since my understanding is that this is what SPARK_HOME should point to.

  1. Where can I find spark-defaults.conf? I want to modify it.
  2. Am I right in setting SPARK_HOME to the installation location of pyspark, ~/miniconda3/envs/pyspark/lib/python3.9/site-packages/pyspark? (A quick check is sketched below.)
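For reference, a quick check of where pip actually placed the package (a minimal sketch, assuming it is run inside the pyspark conda environment); the printed path is the usual SPARK_HOME candidate for a pip install:

    import os
    import pyspark

    # Directory containing the installed pyspark package; for a pip
    # install this is the usual candidate for SPARK_HOME.
    print(os.path.dirname(pyspark.__file__))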
MiloMinderbinder

1 Answer


2. Yes, SPARK_HOME is configured correctly: for a pip installation, the pyspark package directory is the right value.

1. A pip installation does not create a $SPARK_HOME/conf directory, so it has to be created manually; then copy the configuration file templates into it and modify each configuration file as needed (see the sketch below).
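A minimal sketch of that step, assuming a standard pip-installed pyspark inside the conda environment. Rather than copying templates (the pip package may not ship them), it writes spark-defaults.conf directly; the two properties shown are placeholder examples, not required settings:

    import os
    import pyspark

    # Derive SPARK_HOME from the installed pyspark package (pip layout).
    spark_home = os.path.dirname(pyspark.__file__)
    conf_dir = os.path.join(spark_home, "conf")

    # The pip package does not include conf/, so create it manually.
    os.makedirs(conf_dir, exist_ok=True)

    # Write spark-defaults.conf with example settings; adjust as needed.
    defaults = (
        "spark.driver.memory              4g\n"
        "spark.sql.shuffle.partitions     64\n"
    )
    with open(os.path.join(conf_dir, "spark-defaults.conf"), "w") as f:
        f.write(defaults)

    print("Wrote", os.path.join(conf_dir, "spark-defaults.conf"))

Spark reads spark-defaults.conf from $SPARK_HOME/conf (or from the directory pointed to by SPARK_CONF_DIR) when a session starts.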

过过招