qat.purr.utils.logger module
- class BasicLogger(name, _log_folder=None)
Bases: Logger
The basic logger class that should be used. Upon setup, it is registered with the built-in logging package by calling logging.setLoggerClass, so every new logger created will be of this class. This allows us to define the custom fields and functions we want to use with our loggers. This class should not be instantiated separately; only call logging.getLogger("qat.purr.some_name"), and this will return an instance of BasicLogger (see the usage sketch at the end of this class entry).
Initialize the logger with a name and an optional level.
- close()
Closes this logger, cleans up the file handles and appropriate folder.
- code(source)
Outputs a section of executable code (used for Jupyter files). This logging function is PuRR specific.
- Parameters:
source (List[str]) – The list of code lines as strings
- property logs_path
- makeRecord(name, level, fn, lno, msg, args, exc_info, func=None, extra=None, sinfo=None)
Override that allows us to override record values via the extras dictionary. Initially built to allow for printing out messages that look like they come from different places in the source code than where the logger was called.
- output(data, *args, cell_type=None, fit_type=None, msg='', section_level=1)
Displays the result of some experiment or computation. This logging function is PuRR specific.
The output depends on the cell_type parameter:
- table – it creates a table in markdown syntax. The data needs to be a list of lists of strings, where the first list is the header of the table and each following list represents a row.
- fit – it displays the fit function in LaTeX syntax. It also requires the fit_type parameter.
- new_section – it creates a new section in markdown syntax. The title is the string provided by data.
- Parameters:
data – The data to output based on the rest of the parameters.
cell_type (Optional[str]) – It can have one of the following values: table, fit, new_section.
fit_type (Optional[str]) – Required when cell_type is fit. It can be one of the following types: SINUSOID, SINUSOID_EXPONENTIAL, POLYNOMIAL, EXPONENTIAL, LORENTZIAN.
msg (str) – Message to insert in front of the table if cell_type is table.
section_level – The level of the new section in markdown, if cell_type is new_section.
- record_override_key = '$_enable_record_override'
- save_object(obj, name, numpy_arr=False)
Serializes the specified object. This logging function is PuRR specific.
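A minimal usage sketch (the logger name, code lines and table contents below are illustrative, and it is assumed that importing qat.purr.utils.logger has already registered BasicLogger via logging.setLoggerClass):
```python
import logging

# Importing the module is assumed to have called
# logging.setLoggerClass(BasicLogger), so getLogger returns a BasicLogger.
import qat.purr.utils.logger  # noqa: F401

log = logging.getLogger("qat.purr.some_name")
log.info("Starting the experiment")

# PuRR-specific helpers: a code cell and a markdown table
# (for cell_type="table" the first inner list is the header row).
log.code(["result = run_experiment()", "print(result)"])
log.output(
    [["qubit", "T1 (us)"], ["Q0", "48.2"], ["Q1", "51.7"]],
    cell_type="table",
    msg="Measured relaxation times",
)
```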
- class CompositeLogger(loggers_or_names=None, _log_folder=None)
Bases: BasicLogger
The default logger class of PuRR. It is intended to store all the loggers in a list; when logging, the functions defined here should be called, and they iterate through the list and apply the logging call to each logger separately. This way, only one function needs to be called when logging, and it is ensured that all the enabled loggers log the message (see the usage sketch at the end of this class entry).
Creates the list of loggers over which the logging functions will iterate.
- Parameters:
loggers_or_names – List of loggers given by their names (e.g. ["qat.purr.json", "qat.purr.file"]) or actual logger instances.
- add_loggers(loggers_or_names=())
- close()
Closes this logger, cleans up the file handles and appropriate folder.
- code(source)
Outputs a section of executable code (used for Jupyter files). This logging function is PuRR specific.
- Parameters:
source (List[str]) – The list of code lines as strings
- critical(msg, *args, **kwargs)
Log ‘msg % args’ with severity ‘CRITICAL’.
To pass exception information, use the keyword argument exc_info with a true value, e.g.
logger.critical(“Houston, we have a %s”, “major disaster”, exc_info=1)
- debug(msg, *args, **kwargs)
Log ‘msg % args’ with severity ‘DEBUG’.
To pass exception information, use the keyword argument exc_info with a true value, e.g.
logger.debug(“Houston, we have a %s”, “thorny problem”, exc_info=1)
- error(msg, *args, **kwargs)
Log ‘msg % args’ with severity ‘ERROR’.
To pass exception information, use the keyword argument exc_info with a true value, e.g.
logger.error(“Houston, we have a %s”, “major problem”, exc_info=1)
- exception(msg, *args, **kwargs)
Convenience method for logging an ERROR with exception information.
- info(msg, *args, **kwargs)
Log ‘msg % args’ with severity ‘INFO’.
To pass exception information, use the keyword argument exc_info with a true value, e.g.
logger.info(“Houston, we have a %s”, “interesting problem”, exc_info=1)
- log(level, msg, *args, **kwargs)
Log ‘msg % args’ with the integer severity ‘level’.
To pass exception information, use the keyword argument exc_info with a true value, e.g.
logger.log(level, “We have a %s”, “mysterious problem”, exc_info=1)
- output(data, **kwargs)
Displays the result of some experiment or computation. This logging function is PuRR specific.
The output depends on the cell_type parameter:
- table – it creates a table in markdown syntax. The data needs to be a list of lists of strings, where the first list is the header of the table and each following list represents a row.
- fit – it displays the fit function in LaTeX syntax. It also requires the fit_type parameter.
- new_section – it creates a new section in markdown syntax. The title is the string provided by data.
- Parameters:
data – The data to output based on the rest of the parameters.
cell_type – It can have one of the following values: table, fit, new_section.
fit_type – Required when cell_type is fit. It can be one of the following types: SINUSOID, SINUSOID_EXPONENTIAL, POLYNOMIAL, EXPONENTIAL, LORENTZIAN.
msg – Message to insert in front of the table if cell_type is table.
section_level – The level of the new section in markdown, if cell_type is new_section.
- save_object(obj, name, numpy_arr=False, overwrite=False)
Iterates through the list of loggers and calls the corresponding save_object implementations for each of them. This is the method that should be called from the code.
- warning(msg, *args, **kwargs)
Log ‘msg % args’ with severity ‘WARNING’.
To pass exception information, use the keyword argument exc_info with a true value, e.g.
logger.warning(“Houston, we have a %s”, “bit of a problem”, exc_info=1)
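A minimal sketch of using CompositeLogger to fan a single call out to several loggers (the logger names are the illustrative ones from the parameter description above):
```python
from qat.purr.utils.logger import CompositeLogger

# Build the composite from logger names or actual logger instances.
log = CompositeLogger(["qat.purr.json", "qat.purr.file"])

# Each call below is forwarded to every enabled logger in the list.
log.info("Calibration run started")
log.warning("Readout fidelity below target on %s", "Q1")
log.close()  # closes file handles and the associated log folder
```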
- class ConsoleLoggerHandler(stream=<_io.TextIOWrapper name='<stdout>' mode='w' encoding='utf-8'>)
Bases: StreamHandler
Basic console handler for the logger. It defaults to stdout.
Initialize the handler. If stream is not specified, this handler defaults to sys.stdout (the base StreamHandler would default to sys.stderr).
- class FileLoggerHandler(file_path)
Bases: FileHandler
Basic file handler for the logger. A file path must be provided. The log file is created with a delay, so the stream is None until the first emit. This also makes it possible to write some initial content to the log file when it is created (see the sketch after this class entry).
Open the specified file and use it as the stream for logging.
- create_initial_file()
Implement this method in the derived class to insert some initial text in the log file. Use emit and flush while writing directly to the stream.
- emit(record)
Emit a record.
If the stream was not opened because ‘delay’ was specified in the constructor, open it before calling the superclass’s emit.
If stream is not open, current mode is ‘w’ and _closed=True, record will not be emitted (see Issue #42378).
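A sketch of a derived handler that fills in create_initial_file; whether the stream is already open when the hook runs, and the header text itself, are assumptions here:
```python
from qat.purr.utils.logger import FileLoggerHandler

class HeaderedFileLoggerHandler(FileLoggerHandler):
    """Hypothetical handler that writes a short header when the log file is created."""

    def create_initial_file(self):
        # Assumes the delayed stream has been opened by the time this hook is
        # called; write directly to it and flush, as the docstring suggests.
        self.stream.write("=== PuRR log file ===\n")
        self.flush()
```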
- class JsonHandler(file_path)
Bases: FileLoggerHandler
The JSON file handler needed for logging in JSON format. In logging.FileHandler, each time something is written to the file, the emit function is called. By overriding that method, the JSON format can be ensured on every write.
Open the specified file and use it as the stream for logging.
- create_initial_file()
Implement this method in the derived class to insert some initial text in the log file. Use emit and flush while writing directly to the stream.
- emit(record)
Emit a record.
If the stream was not opened because ‘delay’ was specified in the constructor, open it before calling the superclass’s emit.
If stream is not open, current mode is ‘w’ and _closed=True, record will not be emitted (see Issue #42378).
- class JsonLoggerHandler(file_path)
Bases: JsonHandler
The basic JSON file handler logger. It is intended to generate the same output as FileLoggerHandler, but in JSON format.
Open the specified file and use it as the stream for logging.
- create_initial_file()
Implement this method in the derived class to insert some initial text in the log file. Use emit and flush while writing directly to the stream.
- class KeywordFilter(keyword='')
Bases: Filter
A customized keyword filter that can be added to a log handler or a logger. It filters all the log messages, and if the message content contains the keyword, the log will not be printed (see the sketch after this class entry).
Initialize a filter.
Initialize with the name of the logger which, together with its children, will have its events allowed through the filter. If no name is specified, allow every event.
- filter(record)
Determine if the specified record is to be logged.
Returns True if the record should be logged, or False otherwise. If deemed appropriate, the record may be modified in-place.
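A minimal sketch of attaching a KeywordFilter to a handler; the keyword and logger name are illustrative:
```python
import logging
from qat.purr.utils.logger import ConsoleLoggerHandler, KeywordFilter

handler = ConsoleLoggerHandler()
handler.addFilter(KeywordFilter("heartbeat"))  # drop records containing the keyword

log = logging.getLogger("qat.purr.some_name")
log.addHandler(handler)
log.setLevel(logging.INFO)

log.info("heartbeat tick")        # filtered out: message contains the keyword
log.info("calibration finished")  # passes through to stdout
```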
- class LevelFilter(level)
Bases: Filter
Filters out the debug messages from the Jupyter logs. This is needed because the specialized logging functions, like code or output, have a lower level than the DEBUG logging level (so that handlers other than the Jupyter ones do not process them).
Initialize a filter.
Initialize with the name of the logger which, together with its children, will have its events allowed through the filter. If no name is specified, allow every event.
- filter(record)
Determine if the specified record is to be logged.
Returns True if the record should be logged, or False otherwise. If deemed appropriate, the record may be modified in-place.
- class LogFolder(base_folder_path=None, labber_style=None, cleanup=None, folder_name=None, prefix=None, suffix=None)
Bases: object
The main log folder in which all the log files are saved. It can be configured in multiple ways: the base folder path can be a specified path on disk, or None to save the logs in the system temporary folder. The labber_style flag specifies whether or not to create a Labber-style folder hierarchy for the logs. If not, then depending on the folder_name parameter it will either create a random folder for each run, or, if folder_name is not None, create a sub-folder named by folder_name. A prefix and a suffix can also be specified to prepend and append to the generated log folder name (if labber_style is not True). See the usage sketch at the end of this class entry.
The constructor for the LogFolder. It can be configured by the parameters. If the parameters are not provided, the module-level defaults are used (which can also be set by importing a configuration).
- Parameters:
base_folder_path (Optional[str]) – Specifies the base directory where the new log folder will be created. If it is None, it is set to the default value default_logger_base_directory, which is set by the imported configuration file or otherwise defined at the top of this module. If the default value is also None, the log folder is created in the system's TMP folder.
labber_style (Optional[bool]) – If it is true, a Labber-style log folder hierarchy is created.
cleanup (Optional[bool]) – If it is true, the log folder together with the logs is removed at the end of execution.
folder_name (Optional[str]) – If labber_style is false, folder_name is used as the name of the new log folder instead of generating a random one.
prefix (Optional[str]) – Prepended to the front of the generated folder name.
suffix (Optional[str]) – Appended to the end of the generated folder name.
- close()
- create_sub_folder_labber_style(base_folder, folder_name=None)
- get_log_file_path(file_name='log', over_write=True)
- static get_main_folder_path_labber_style(base_folder)
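A minimal sketch of creating and using a LogFolder directly; the base path and folder name are illustrative:
```python
from qat.purr.utils.logger import LogFolder

# A named (non-random, non-Labber-style) folder under ./logs that is kept
# after execution.
folder = LogFolder(
    base_folder_path="./logs",
    labber_style=False,
    cleanup=False,
    folder_name="rabi_scan",
)
log_file = folder.get_log_file_path("log")  # path to a log file inside the folder
print(log_file)
folder.close()
```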
- class LoggerLevel(value)
Bases: Enum
An enumeration of the logging levels used by the PuRR loggers, including the custom CODE and OUTPUT levels used by the PuRR-specific logging functions.
- CODE = 1
- CRITICAL = 50
- DEBUG = 10
- ERROR = 40
- INFO = 20
- NOTSET = 0
- OUTPUT = 2
- WARNING = 30
- class ModuleFilter(module_name='')
Bases: Filter
A customized module filter that can be added to a log handler or a logger. It filters all the log messages, and if the log was produced by a module with the specified module name, the record will not pass.
Initialize a filter.
Initialize with the name of the logger which, together with its children, will have its events allowed through the filter. If no name is specified, allow every event.
- filter(record)
Determine if the specified record is to be logged.
Returns True if the record should be logged, or False otherwise. If deemed appropriate, the record may be modified in-place.
- close_logger()
This method is executed upon exit, and it closes all the file handlers of the default loggers.
This allows the log folder to be removed after execution has finished. It is needed because otherwise the handlers would only be closed after everything else, i.e. after the log folder has already been removed.
- get_default_logger()
Initializes the global logger or fetches one if it already exists.
- get_logger_config(config_file=None, log_folder=None)
Imports the logger configuration from the provided JSON file. If this is not provided, the current directory is searched for a logger_settings.json configuration file. If none is found, the default JSON file is read from qat/purr/logger_settings.json.
- Returns:
A DefaultLogger instance configured with the names of the imported loggers
- import_logger_configuration(logger_config, log_folder=None)
Imports the configuration of the loggers from a JSON data structure. This must be in the format described by the built-in logging.config module.
It can also contain some additional settings:
- default_logger_directory: This is where a new log folder will be created for each execution. If it is set to None, the system's temp folder is used.
- default_logger_cleanup: Specifies whether the log folders should be removed after execution or not.
- default_logger_labber_style: If this is true, a Labber-style log folder hierarchy is created at the specified default_logger_directory.
The logger list in the config file may also contain some additional settings:
- class: If this is specified, the logger is of a custom class not included in the built-in logging package, similar to how custom handlers are defined by the ‘()’ key.
- active: If this is false, the corresponding logger will not be imported. This is an easier way to exclude a logger than removing it from the config file: if the logger is needed again later, the config file does not have to be rewritten, only active changed from false to true.
The configuration may also contain the starting log folder settings (log_folder). Each time the logging configuration is imported, the log folder will be set up as specified there. If this is not provided in the JSON structure, the created log folder will use the default settings (which can also be specified in the configuration, as described above). A hedged example configuration is sketched below.
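A sketch of such a configuration expressed as a Python dictionary and passed to import_logger_configuration. The handler and logger entries follow the standard logging.config dictionary schema; the concrete handler class, logger names and levels are illustrative, and whether every field shown is required is an assumption:
```python
from qat.purr.utils.logger import import_logger_configuration

logger_config = {
    # Standard logging.config dictionary-schema fields (illustrative values).
    "version": 1,
    "handlers": {
        "console": {"class": "logging.StreamHandler", "level": "INFO"},
    },
    "loggers": {
        "qat.purr.console": {
            "handlers": ["console"],
            "level": "DEBUG",
            "active": True,  # additional setting: set to false to skip this logger
        },
    },
    # Additional settings described above.
    "default_logger_directory": None,      # None -> system temp folder
    "default_logger_cleanup": True,        # remove the log folder after execution
    "default_logger_labber_style": False,  # no Labber-style hierarchy
}

import_logger_configuration(logger_config)
```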
- load_object_from_log_folder(file_path)
Loads and deserializes an object from its JSON representation from the disk.
- Parameters:
file_path (str) – The JSON file on the disk
- Returns:
The loaded object after deserialization
- save_object_to_log_folder(obj, sub_folder_path)
Serializes the specified object.
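A small sketch of the two module-level helpers; the object, sub-folder path and the exact on-disk location are illustrative assumptions:
```python
from qat.purr.utils.logger import (
    load_object_from_log_folder,
    save_object_to_log_folder,
)

# Serialize a plain Python object into the log folder as JSON.
results = {"qubit": "Q0", "t1_us": 48.2}
save_object_to_log_folder(results, "fits/t1_results.json")

# Later, load it back from the concrete JSON file path on disk.
restored = load_object_from_log_folder("/path/to/log_folder/fits/t1_results.json")
```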