In an Azure Databricks notebook I use:
spark.sparkContext.addPyFile("path/to/module.py")
import module
But when I overwrite the file, the notebook still remembers the old one.
The classic Python reload() doesn't work at all.
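For reference, this is roughly the reload attempt that has no effect (a minimal sketch; in Python 3 the old built-in reload() lives in importlib):

import importlib
import module

importlib.reload(module)  # runs without error, but the notebook still sees the old code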
The module is only reloaded when I restart the whole cluster.