Best practice to set up loggers with two different outputs in Databricks
I want to set up a logger in Databricks that does two things every time it is called:
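A common pattern for a logger with two outputs is to attach two handlers to the same logger. The sketch below is one plausible setup, not the asker's actual code: it assumes the two outputs are the notebook's stdout and a file under a hypothetical `/dbfs` path.

```python
import logging

def build_logger(name: str, logfile: str) -> logging.Logger:
    """One logger, two handlers: console and file."""
    logger = logging.getLogger(name)
    logger.setLevel(logging.INFO)
    if not logger.handlers:  # avoid duplicate handlers when the notebook cell re-runs
        fmt = logging.Formatter("%(asctime)s %(name)s %(levelname)s %(message)s")
        console = logging.StreamHandler()  # output 1: notebook / driver stdout
        console.setFormatter(fmt)
        logger.addHandler(console)
        to_file = logging.FileHandler(logfile)  # output 2: a log file
        to_file.setFormatter(fmt)
        logger.addHandler(to_file)
    return logger

logger = build_logger("job1", "/dbfs/tmp/job1.log")  # hypothetical DBFS path
logger.info("this record goes to both outputs")
```

The `if not logger.handlers` guard matters in notebooks, where re-running a cell would otherwise add a fresh pair of handlers and duplicate every record.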
Logs not being written if parent directory included in filename
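A common cause of this symptom is that `logging.FileHandler` expects the parent directory to exist already; it will not create it. A minimal sketch of the usual fix, assuming a hypothetical log path:

```python
import logging
import os

logfile = "/dbfs/tmp/logs/job1.log"  # hypothetical path with a parent directory
os.makedirs(os.path.dirname(logfile), exist_ok=True)  # create the directory first
handler = logging.FileHandler(logfile)  # now the handler can open the file
```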
Updated with new code: I have a main notebook in Databricks that creates the root logger, imports the configuration from a logger.py, and calls other modules which contain their own loggers:

```python
# main notebook
import logging
from logger import get_logfile_name, configure_logger

NAME = 'job1'
from logger import […]
```
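A minimal sketch of what that logger.py might contain: only the names `get_logfile_name`, `configure_logger`, and `NAME` come from the question; the bodies, paths, and naming convention below are assumptions.

```python
# logger.py (sketch; function bodies are assumptions, not the asker's code)
import logging
from datetime import datetime

def get_logfile_name(name: str) -> str:
    # Hypothetical convention: one timestamped file per job under /dbfs
    return f"/dbfs/tmp/logs/{name}_{datetime.now():%Y%m%d_%H%M%S}.log"

def configure_logger(name: str, logfile: str) -> logging.Logger:
    logger = logging.getLogger(name)
    logger.setLevel(logging.INFO)
    if not logger.handlers:  # guard against duplicates on re-import
        handler = logging.FileHandler(logfile)
        handler.setFormatter(
            logging.Formatter("%(asctime)s %(name)s %(levelname)s %(message)s")
        )
        logger.addHandler(handler)
    return logger
```

With a setup along these lines, the imported modules do not need their own configuration: each can call `logging.getLogger(__name__)` and its records propagate up to the logger configured in the main notebook.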