logs package

Submodules

logs.logDecorator module

class logs.logDecorator.log(base)[source]

Bases: object

decorator for logging values

This decorator can be used for injecting a logging function into a particular function. It takes a function and injects a logger as the first argument of the decorated function. The generated logger also records the time at which the particular function was called, and then again when the function finished. These serve as convenient hooks for inserting values into the log.
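The pattern described above can be sketched as follows. This is a minimal, illustrative reimplementation of the idea, not the package's actual code; the `add` function and the `'RLalgos.lib'` base name are hypothetical:

```python
import functools
import logging
import time

class log:
    """Sketch of a decorator that injects a logger as the first
    argument of the wrapped function and records start/end times."""

    def __init__(self, base):
        # base is prepended to the function name to form the logger path
        self.base = base

    def __call__(self, f):
        @functools.wraps(f)
        def wrapper(*args, **kwargs):
            logger = logging.getLogger('{}.{}'.format(self.base, f.__name__))
            t0 = time.time()
            logger.info('Starting %s', f.__name__)
            result = f(logger, *args, **kwargs)  # logger injected first
            logger.info('Finished %s in %.4f seconds', f.__name__, time.time() - t0)
            return result
        return wrapper

# Hypothetical decorated function: note that callers do not pass the
# logger themselves; the decorator supplies it.
@log('RLalgos.lib')
def add(logger, a, b):
    logger.info('adding %s and %s', a, b)
    return a + b
```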

initialize the decorator

Parameters:base ({str}) – The string prepended to the logger name so that it carries the right path for this function.
class logs.logDecorator.logInit(base, level, specs)[source]

Bases: object

initialize the decorator for logging

This generates a decorator using a fresh file whose name contains the current date and time. This way it will be easy to find the last log file generated using a simple script, in case a person wants to generate instantaneous statistics for the last run.

Initialize the logger object for the program

This generates a new logger object. It is able to handle significantly improved logging capabilities in comparison to earlier versions of the cutter. For details of the available functionality, check the logging documentation.

Parameters:
  • base ({str}) – name that starts the logging functionality
  • level ({str}) – One of the different types of logs available: CRITICAL, ERROR, WARNING, INFO and DEBUG. These will be mapped to the corresponding warning levels within the logging facility. Any input that is not one of these will automatically be mapped to INFO.
  • specs ({dict}) – Dictionary specifying the different types of formatters to be generated while working with logs.
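A set of arguments matching these parameters might look like the following. The dictionary mirrors the config/config.json example later in this document; the commented decorator usage is hypothetical and follows the parameter descriptions above, not a verified call:

```python
# Hypothetical arguments for logs.logDecorator.logInit; the keys mirror
# the "specs" section of the configuration file described below.
specs = {
    "file":     {"todo": True,  "logFolder": "logs"},
    "stdout":   {"todo": False},
    "logstash": {"todo": False, "version": 1, "port": 5959, "host": "localhost"},
}

# The decorator would then wrap the program entry point, e.g.:
# @logInit("RLalgos", "INFO", specs)
# def main(logger):
#     logger.info("program started")
```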

Module contents

Module containing helper classes for logging

This module contains two classes: one for generating a new logger object, and another for using that logger object in new tasks. Each class is modeled as a decorator that injects a logging.getLogger instance as the first parameter of the function it wraps. The decorator furthermore logs the starting and ending times of the function, as well as the time taken, using the time.time function.

Configuration Information

Configuring the logger is done with the help of the configuration file config/config.json. Specifically, the logging key identifies all configuration associated with logging within this file. An example of the logging section is shown below. Details of the different sections are described in the documentation that follows.

"logging":{

    "logBase" : "RLalgos",
    "level"   : "INFO",
    "specs"   : {

        "file":{
            "todo"     : true,
            "logFolder": "logs"
        },

        "stdout":{
            "todo"     : false
        },

        "logstash":{
            "todo"     : false,
            "version"  : 1,
            "port"     : 5959,
            "host"     : "localhost"
        }

    }
}

The "level" Segment

The logging module comes preconfigured to log at the "INFO" level. However, this can be set to one of the following levels, each mapped to its respective logging level.

  • 'CRITICAL' mapped to logging.CRITICAL
  • 'ERROR' mapped to logging.ERROR
  • 'WARNING' mapped to logging.WARNING
  • 'INFO' mapped to logging.INFO
  • 'DEBUG' mapped to logging.DEBUG
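The mapping above, including the fallback to INFO for unrecognized names, can be sketched as a simple lookup (the function name is illustrative, not the module's actual implementation):

```python
import logging

# Level names from the configuration mapped to logging constants;
# anything outside this list falls back to logging.INFO, as the
# documentation above specifies.
LEVEL_MAP = {
    'CRITICAL': logging.CRITICAL,
    'ERROR':    logging.ERROR,
    'WARNING':  logging.WARNING,
    'INFO':     logging.INFO,
    'DEBUG':    logging.DEBUG,
}

def resolve_level(name):
    return LEVEL_MAP.get(name, logging.INFO)
```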

The "specs" Segment

This module comes preconfigured with a number of logging sinks. The logs can go either to a log file, to stdout, or to logstash. Each section has a parameter "todo" that determines whether a particular sink will be added to the logging handler. The other parameters for each section are described below.

The "specs.file" Segment

This segment is used for sending the logger output directly to a file. A base folder should be specified within which the logging file will be generated. Each time the program is run, a new file is generated with a name of the form YYYY-MM-DD_hh-mm-ss.log. The default formatting string used is: "%(asctime)s - %(name)s - %(levelname)s - %(message)s".
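A file sink with this behavior might be built as follows. This is a sketch of the described behavior using the standard logging module, with a hypothetical helper name:

```python
import logging
import os
from datetime import datetime

# Sketch of the "specs.file" sink: a fresh log file named
# YYYY-MM-DD_hh-mm-ss.log inside logFolder, with the default format.
def make_file_handler(log_folder='logs'):
    os.makedirs(log_folder, exist_ok=True)
    name = datetime.now().strftime('%Y-%m-%d_%H-%M-%S') + '.log'
    handler = logging.FileHandler(os.path.join(log_folder, name))
    handler.setFormatter(logging.Formatter(
        '%(asctime)s - %(name)s - %(levelname)s - %(message)s'))
    return handler
```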

The "specs.stdout" Segment

The output can also be sent to standard output if this section is turned on using the todo key. By default, this section is turned off. The default formatting string used is: "%(asctime)s - %(name)s - %(levelname)s - %(message)s".
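The corresponding sink is a plain StreamHandler on stdout; a sketch with a hypothetical helper name:

```python
import logging
import sys

# Sketch of the "specs.stdout" sink: a StreamHandler with the same
# default format string, added only when "todo" is true.
def make_stdout_handler():
    handler = logging.StreamHandler(sys.stdout)
    handler.setFormatter(logging.Formatter(
        '%(asctime)s - %(name)s - %(levelname)s - %(message)s'))
    return handler
```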

The "specs.logstash" Segment

It is also possible to use logstash as a sink. Communication is entirely JSON based, and uses TCP rather than the standard UDP. For configuring the logstash server, make sure to add the input:

tcp {
    port  => 5959
    codec => json
}
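On the Python side, a handler that speaks this protocol can be sketched with the standard library alone. This is illustrative only; the class name and payload fields are chosen here to match the example output below, and the actual module may rely on a dedicated logstash handler package instead:

```python
import json
import logging
import socket

class JSONTCPHandler(logging.Handler):
    """Illustrative sketch: send each record as one JSON line over TCP,
    roughly what the logstash tcp/json input above expects."""

    def __init__(self, host='localhost', port=5959, version=1):
        super().__init__()
        self.host, self.port, self.version = host, port, version

    def make_payload(self, record):
        # A subset of the fields shown in the example output below.
        return {
            '@version': str(self.version),
            'level': record.levelname,
            'logger_name': record.name,
            'message': record.getMessage(),
        }

    def emit(self, record):
        data = (json.dumps(self.make_payload(record)) + '\n').encode()
        with socket.create_connection((self.host, self.port)) as sock:
            sock.sendall(data)
```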

The input port should match the port specified in the config/config.json file. If your logstash server is running on a different machine, make sure that you specify the host IP along with the port. An example output is shown:

{
 "@timestamp" => 2018-08-12T03:49:25.212Z,
      "level" => "ERROR",
       "type" => "logstash",
       "port" => 55195,
   "@version" => "1",
       "host" => "Sankha-desktop.local",
       "path" => "/Users/user/Documents/programming/python/test/mytests/mnop/src/lib/testLib/simpleLib.py",
    "message" => "Unable to add the two values [3] and [a]:\nunsupported operand type(s) for +: 'int' and 'str'",
       "tags" => [],
"logger_name" => "mnop.lib.simpleLib.simpleTestFunction",
 "stack_info" => nil
}

This can then be sent to elasticsearch. If you need specific things filtered, you can directly use the filtering capabilities of logstash to generate this information.