
In an Azure Databricks notebook I use:

spark.sparkContext.addPyFile("path/to/module.py")
import module

But when I overwrite the file, the notebook still uses the old version.

Classic Python reload() doesn't work at all. The module is only reloaded when I restart the whole cluster.

Krukosz
  • I am not sure what type of Python module you are trying to overwrite. A similar issue with a solution is here: https://stackoverflow.com/questions/51450462/pyspark-addpyfile-to-add-zip-of-py-files-but-module-still-not-found – Karthikeyan Rasipalay Durairaj Nov 18 '21 at 21:29

0 Answers