LoggingExtras.jl

LoggingExtras.ActiveFilteredLoggerType
ActiveFilteredLogger(filter, logger)

Wraps logger in an active filter. While loggers intrinsically have built-in filtering mechanisms, wrapping one in an ActiveFilteredLogger allows for extra control, at the cost of a bit of overhead.

The ActiveFilteredLogger has full control over what is logged, as it sees the full message. This does mean, however, that it determines what to log at runtime, which is the source of the overhead. The EarlyFilteredLogger has less control, but decides whether to log before the message is computed.

The filter should be a function that returns a boolean: true if the message should be logged and false if not. As input it will be given a named tuple with the following fields: (level, message, _module, group, id, file, line, kwargs). See LoggingExtras.handle_message_args for more information on what each field represents.
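
For example, a minimal sketch of an active filter (assuming a hypothetical rule that drops any message containing the word "heartbeat"):

using Logging, LoggingExtras

# Hypothetical filter: drop any message whose text contains "heartbeat".
# Because this is an active filter, it sees the fully computed message.
drop_heartbeats(log_args) = !occursin("heartbeat", log_args.message)

filtered_logger = ActiveFilteredLogger(drop_heartbeats, global_logger())

with_logger(filtered_logger) do
    @info "heartbeat received"  # filtered out
    @info "user logged in"      # passed through to the wrapped logger
end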

LoggingExtras.DatetimeRotatingFileLoggerType
DatetimeRotatingFileLogger(dir, file_pattern; always_flush=true, rotation_callback=identity)
DatetimeRotatingFileLogger(f::Function, dir, file_pattern; always_flush=true, rotation_callback=identity)

Construct a DatetimeRotatingFileLogger that rotates its file based on the current date. The constructor takes a log output directory, dir, and a filename pattern. The smallest time resolution in the format string determines the frequency of log file rotation, allowing for yearly all the way down to minute-level log rotation.

The pattern can be given as a string or as a Dates.DateFormat. Note that if you wish to have a filename portion that should not be interpreted as a format string, you may need to escape portions of the filename, as shown in the example below.

It is possible to pass a formatter function as the first argument to control the output. The formatting function should be of the form f(io::IOContext, log_args::NamedTuple) where log_args has the following fields: (level, message, _module, group, id, file, line, kwargs). See LoggingExtras.handle_message_args for more information about what each field represents.

It is also possible to pass rotation_callback::Function as a keyword argument. This function will be called every time the log file is rotated. The function should accept one argument, which is the absolute path to the just-rotated file. The logger will block until the callback function returns, so use @async if the callback is expensive.

Examples

# Logger that logs to a new file every day
logger = DatetimeRotatingFileLogger(log_dir, raw"\a\c\c\e\s\s-yyyy-mm-dd.\l\o\g")

# Logger with a formatter function that rotates the log file hourly
logger = DatetimeRotatingFileLogger(log_dir, raw"yyyy-mm-dd-HH.\l\o\g") do io, args
    println(io, args.level, " | ", args.message)
end

# Example callback function to compress the recently-closed file
compressor(file) = run(`gzip $(file)`)
logger = DatetimeRotatingFileLogger(...; rotation_callback=compressor)
LoggingExtras.EarlyFilteredLoggerType
EarlyFilteredLogger(filter, logger)

Wraps logger in a filter that runs before the log message is created.

For contrast, see the ActiveFilteredLogger, which has full control but runs after the log message content is computed. In most circumstances this is fine, but if your log messages are expensive to create (e.g. they include summary statistics), then the EarlyFilteredLogger is going to be better.

The filter should be a function that returns a boolean: true if the message should be logged and false if not. As input it will be given a named tuple with the following fields: (level, _module, group, id). See LoggingExtras.shouldlog_args for more information on what each field represents.
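
As a sketch, an early filter that only lets messages from a particular module through might look like the following (MyModule is a placeholder standing in for your own module):

using Logging, LoggingExtras

module MyModule end  # placeholder module for illustration

# Early filter: only log messages originating from MyModule.
# The decision is made before the message is computed, so expensive
# messages from other modules are never constructed.
module_filter(log_args) = log_args._module === MyModule

logger = EarlyFilteredLogger(module_filter, global_logger())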

LoggingExtras.FileLoggerMethod
FileLogger(path::AbstractString; append=false, always_flush=true)

Create a logger sink that writes messages to a file specified with path. To append to the file (rather than truncating the file first), use append=true. If always_flush=true the stream is flushed after every handled log message.
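
For example, to append to an existing log file (the path here is just a placeholder):

# Append to the file rather than truncating it
logger = FileLogger("path/to/file.log"; append=true)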

LoggingExtras.FileLoggerMethod
FileLogger(io::IOStream; always_flush=true)

Create a logger sink that writes messages to the io::IOStream. The stream is expected to be open and writeable. If always_flush=true the stream is flushed after every handled log message.

Examples

io = open("path/to/file.log", "a") # append to the file
logger = FileLogger(io)
LoggingExtras.FormatLoggerType
FormatLogger(f::Function, io::IO=stderr; always_flush=true)

Logger sink that formats the message and finally writes to io. The formatting function should be of the form f(io::IOContext, log_args::NamedTuple) where log_args has the following fields: (level, message, _module, group, id, file, line, kwargs). See LoggingExtras.handle_message_args for more information on what each field represents.

Examples

julia> using Logging, LoggingExtras

julia> logger = FormatLogger() do io, args
           println(io, args._module, " | ", "[", args.level, "] ", args.message)
       end;

julia> with_logger(logger) do
           @info "This is an informational message."
           @warn "This is a warning, should take a look."
       end
Main | [Info] This is an informational message.
Main | [Warn] This is a warning, should take a look.
LoggingExtras.MinLevelLoggerType
MinLevelLogger(logger, min_enabled_level)

Wraps logger in a filter that runs before the log message is created. In many ways this is just a specialised EarlyFilteredLogger that only checks the level. This filter only allows messages on or above the min_enabled_level to pass.
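
As a minimal sketch (the file name is a placeholder), a logger that writes only messages at Info level or above to a file could be built as:

using Logging, LoggingExtras

# Debug messages are rejected before they are even constructed;
# everything at Info and above reaches the wrapped FileLogger.
logger = MinLevelLogger(FileLogger("info_and_above.log"), Logging.Info)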

LoggingExtras.TeeLoggerMethod
TeeLogger(loggers...)

Send the same log message to all the loggers.

To include the current logger, do TeeLogger(current_logger(), loggers...). To include the global logger, do TeeLogger(global_logger(), loggers...).
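
For example, to log to the console as well as to a file (ConsoleLogger comes from the Logging standard library; the file name is just an illustration):

using Logging, LoggingExtras

# Every log message is sent to both sinks
logger = TeeLogger(
    ConsoleLogger(stderr),
    FileLogger("messages.log"),
)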

LoggingExtras.TransformerLoggerType
TransformerLogger(f, logger)

Preprocesses log messages, using the function f, before passing them to the logger that is wrapped. This can be used, for example, to truncate a log message, or to conditionally change the log level of logs from a given module (which, depending on the wrapped logger, might cause the message to be dropped).

The transforming function f is given a named tuple with the fields: (level, message, _module, group, id, file, line, kwargs) and should return the same. See LoggingExtras.handle_message_args for more information on what each field represents.
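
As a sketch, a transformer that truncates long messages before they reach the wrapped logger might look like this (the 30-character limit is arbitrary):

using Logging, LoggingExtras

# Replace the message field with its first 30 characters, leaving the
# other log arguments unchanged.
truncating_logger = TransformerLogger(global_logger()) do log_args
    merge(log_args, (message = first(log_args.message, 30),))
end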

LoggingExtras.handle_message_argsMethod
handle_message_args

This creates a NamedTuple containing all the arguments the logger gives to handle_message. It is the type passed to the active logger filter. These arguments come from the logging macros (@info, @warn, etc.).

  • level::LogLevel: Warn, Info, etc.
  • message::String the message to be logged
  • _module::Module can be used to specify a different originating module from the source location of the message.
  • group::Symbol can be used to override the message group (this is normally derived from the base name of the source file).
  • id::Symbol can be used to override the automatically generated unique message identifier. This is useful if you need to very closely associate messages generated on different source lines.
  • file::String and line::Int can be used to override the apparent source location of a log message.
  • kwargs...: any keyword or positional arguments passed to the logging macro
LoggingExtras.next_datetime_transitionMethod
next_datetime_transition(fmt::DateFormat)

Given a DateFormat that is being applied to our filename, what is the next time at which our filepath will need to change?
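
A minimal sketch of how this might be used (the daily pattern is just an example; the exact return value depends on the current wall-clock time):

using Dates, LoggingExtras

# For a daily file pattern, the next transition is at the upcoming day boundary.
fmt = dateformat"yyyy-mm-dd"
LoggingExtras.next_datetime_transition(fmt)  # the next time the file path will change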

LoggingExtras.shouldlog_argsMethod
shouldlog_args

This returns a NamedTuple containing all the arguments the logger gives to shouldlog. It is passed to the early logger filter. These arguments come from the logging macros (@info, @warn, etc.).

  • level::LogLevel: Warn, Info, etc.
  • _module::Module can be used to specify a different originating module from the source location of the message.
  • group::Symbol can be used to override the message group (this is normally derived from the base name of the source file).
  • id::Symbol can be used to override the automatically generated unique message identifier. This is useful if you need to very closely associate messages generated on different source lines.