Logging
ProCon uses Python’s built-in logging to log information during execution. Through judicious use of logging statements, you can track the progress of your processing Step and diagnose runtime errors.
Configuring a logger
The code below creates a logger LOG with the same name as the module (at import time).
It then emits an INFO-level message once, when the function starts; DEBUG messages on every iteration; and a WARNING if the function fails.
Remember that logs can be filtered by level: use log levels appropriately!
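Level-based filtering can be sketched in isolation before the full example; the `demo` logger name here is purely illustrative:

```python
import logging

# Hypothetical logger used only for this demonstration
demo = logging.getLogger("demo")
demo.addHandler(logging.StreamHandler())

demo.setLevel(logging.INFO)  # INFO and above pass; DEBUG is filtered
demo.debug("invisible")      # dropped: below the logger's level
demo.info("visible")         # emitted
```

Every record below the logger's configured level is discarded before it ever reaches a handler, which is what makes well-chosen levels so useful for later filtering.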
```python
import logging

LOG = logging.getLogger(__name__)  # "my_step"
# logging.basicConfig()  # DO NOT DO THIS!


def my_fun(i: int) -> list[int]:
    LOG.info(f"Argument received: {i}")
    temp = []
    try:
        for j in range(i):
            LOG.debug(f"j = {j}")
            temp.append(i ** 3)
        return temp
    except Exception as e:
        LOG.warning(f"Unexpected error, continuing: {e}")
        return []
```

Viewing your logs
The portal makes viewing the logs from a specific job easy. You can search through logs, filter for ERRORs, or simply watch as your job works.

Logging and testing
To configure log capture during testing, pytest has the caplog fixture.
You can use this fixture to check the contents of logs during tests (e.g. to test whether a warning is raised), or to simply show them on screen.
Note that caplog will use the default level of the configured logger; caplog.set_level() can be used to change the level.
```python
import logging

from my_step import my_fun


def test_balance_query_step(caplog):
    """
    Test step execution, using caplog to capture logs from
    data_loader.data_import_logic.
    """
    caplog.set_level(logging.DEBUG, logger="my_step")  # logger has same name as module

    my_fun(9)

    for record in caplog.records:
        # no warnings or errors
        assert record.levelname not in ["CRITICAL", "ERROR", "WARNING"]
    print(caplog.text)
```
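caplog can also assert that a specific message was logged, not just that no bad levels appeared. A minimal sketch, using a hypothetical failing_fun that logs a warning (no real my_step module is required):

```python
import logging

LOG = logging.getLogger("my_step")


def failing_fun() -> None:
    # Hypothetical stand-in for a Step whose work fails and logs a warning
    LOG.warning("Unexpected error, continuing: boom")


def test_warning_is_logged(caplog):
    # caplog.at_level is a context-manager alternative to caplog.set_level
    with caplog.at_level(logging.WARNING, logger="my_step"):
        failing_fun()
    assert any(
        record.levelname == "WARNING" and "continuing" in record.getMessage()
        for record in caplog.records
    )
```

Unlike caplog.set_level, caplog.at_level restores the logger's previous level as soon as the with block exits, which keeps the override scoped to the code under test.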