Using ProCon from CLI
Running a ProCon script starts a command-line interface, providing options to inspect the container, run it locally, or connect to a JobManagement service.
```sh
foo@bar:~/my_project$ uv run ./my_step_file.py --help
Usage: example.py [OPTIONS] COMMAND [ARGS]...

  CLI interface for the processing container.

Options:
  --help  Show this message and exit.

Commands:
  list       List the available functions
  remote     Connect to job-management via RabbitMQ
  run        Execute a function locally.
  signature  Show a functions signature
```

You can get additional information on each command by passing the --help parameter after it:
```sh
foo@bar:~/my_project$ uv run ./my_step_file.py list --help
Usage: example.py list [OPTIONS]

  List the available functions

Options:
  -j, --json    Output in JSON format
  -y, --pretty  Prettier output with color and comments
  --help        Show this message and exit.
```

Commands
A short listing of the available functions in the container is given by the list command. Without parameters it prints just the plain names; the --pretty switch gives a nicer output, including the short description from each docstring.
```sh
foo@bar:~/my_project$ uv run ./my_step_file.py list --pretty
Available functions:
  dev_no_param          # A function with no parameter. Returns 42
  dev_nothing           # A function with no parameters or return value.
  dev_optional_param    # Functions that accepts optional / multi-typed parameters.
  dev_pass_through      # Just returns the parameter.
  dev_sleep_and_return  # Returns a given value after a given time and fails if needed.
  dev_square            # Calculate the square of x
```

signature
The signature command prints the manifest, the internal representation of a function, to the console.
The name of the function is provided via the --function parameter.
This functionality is mainly used to register a function to the JobManagement system.
If you do this manually via the API, you can output the manifest as JSON via the --json switch and copy & paste the output into the RegisterProcessingStep command.
```sh
foo@bar:~/my_project$ uv run ./my_step_file.py signature --function calculate_square
FunctionModel(
    function_name='calculate_square',
    version='1.2.0-dev',
    short_description='Calculate the square of x',
    long_description='',
    parameters={
        '$schema': 'http://json-schema.org/draft-07/schema#',
        'additionalProperties': False,
        'properties': {
            'x': {'description': 'a float number', 'title': 'x', 'type': 'number'}
        },
        'required': ['x'],
        'title': 'Parameters',
        'type': 'object'
    },
    returns={
        '$schema': 'http://json-schema.org/draft-07/schema#',
        'properties': {
            'value': {'description': 'the square of x', 'title': 'Value', 'type': 'number'}
        },
        'required': ['value'],
        'title': 'Returns',
        'type': 'object'
    },
    errors={},
    input_dataslots=[],
    output_dataslots=[],
    procon_version='2.0.3.dev1'
)
```

run

A container can execute functions stand-alone using the run command.
Just provide its name with the --function parameter.
Function parameters are provided as a string containing a Python dictionary after the --parameters option.
To avoid confusion when escaping quotation marks on the command line, take care to use double quotes as the outer string delimiter and single quotes inside.
```sh
foo@bar:~/my_project$ uv run ./my_step_file.py run --function dev_square --parameters "{'x': 5}"
25
```

The result of the function is printed to the console and can also be formatted using the --json and/or --pretty switches.
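As a hedged illustration of why the quoting convention works: the shell strips the outer double quotes, so the CLI receives a string whose inner single quotes still form a valid Python dict literal. The standard library's `ast.literal_eval` is used here purely for illustration; it is an assumption, not necessarily how ProCon parses the option internally.

```python
import ast

# What the shell hands to the CLI after stripping the outer double
# quotes: the single quotes inside survive, so the string is still
# a valid Python dict literal.
raw = "{'x': 5}"

# Safely evaluate the literal into a real dict (illustrative only;
# ProCon's actual parsing may differ).
params = ast.literal_eval(raw)

print(params)       # {'x': 5}
print(params['x'])  # 5
```

Had the quotes been swapped, the shell would have consumed the single quotes and the inner double quotes would need explicit escaping.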
Adding Dataslots
If your function uses input or output dataslots, you can easily map them to files. Just use the --input-dataslots or --output-dataslots options, following the same syntax you'd use for parameters:
```sh
uv run ./my_step_file.py run --function dataslot_function --input-dataslots "{'slot_name': ['/usr/test/my_data.txt']}"
```

Note:
Dataslots generally expect a list of filenames as their parameter. However, if an input or output dataslot is linked to a function parameter via a reader or writer, it typically accepts a single file.
Dataslots marked with @dataslot.input_collection or @dataslot.output_collection are designed to handle multiple files.
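The mapping passed to --input-dataslots / --output-dataslots has the same shape in both cases; only the length of the file list differs. A minimal sketch of the two shapes (the slot and file names here are made up for illustration):

```python
# Dataslot mappings as they would appear inside the --input-dataslots
# string: each key names a slot, each value is a list of file paths.
single_file_slot = {'slot_name': ['/usr/test/my_data.txt']}

# A slot declared as a collection (e.g. via @dataslot.input_collection)
# can receive several entries in its list.
collection_slot = {'samples': ['/data/run1.csv', '/data/run2.csv']}

print(len(single_file_slot['slot_name']))  # 1
print(len(collection_slot['samples']))     # 2
```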
remote
To connect a container to a JobManagement service, start it with the remote command.
Communication is handled via a RabbitMQ server, so most of the parameters are the credentials for connecting to that server.
If the JobManagement is not listening on the default exchange, it is important to provide the name of the exchange it uses.
```sh
# example command arguments
uv run ./my_step_file.py remote --allfunctions --host 127.0.0.1 --port 5672 --vhost pinexq -d --user <rabbitmq-user> --password <rabbitmq-password> --exchange <name-of-rabbitmq-exchange>
uv run ./my_step_file.py remote --function example_function --function example_function_2 --host 127.0.0.1 --port 5672 --vhost pinexq -d --apikey <rabbitmq-remote-apikey> --exchange <name-of-rabbitmq-exchange>

# force override of ProCon function version
uv run ./my_step_file.py remote --function example_function:2.1.0-alpha --function example_function_2 --host 127.0.0.1 --port 5672 --vhost pinexq -d --apikey <rabbitmq-remote-apikey> --exchange <name-of-rabbitmq-exchange>

# force override of ProCon function list versions
uv run ./my_step_file.py remote --function example_function:2.1.0-alpha example_function2:1.1.0 --function example_function_2 --host 127.0.0.1 --port 5672 --vhost pinexq -d --apikey <rabbitmq-remote-apikey> --exchange <name-of-rabbitmq-exchange>
```

Once ProCon is connected to the server, the JobManagement takes over control of the container.
This command is the main use of the container when it is deployed as a Docker container in a computing cluster.
Nonetheless, it can still run locally in this mode for testing and development.
Note that in order to connect to a remote JobManagement server, you will need a remote-enabled API key.
By default, the version from the @version function decorator is used when listening for job offers.
For more convenient automation or to configure them globally, the parameters can also be set via environment variables, e.g. in a .env file:
```sh
PROCON_REMOTE_HOST=127.0.0.1
PROCON_REMOTE_PORT=5671
PROCON_REMOTE_VHOST=pinexq
PROCON_REMOTE_DEBUG='true' # or just -d
PROCON_REMOTE_EXCHANGE=<name-of-rabbitmq-exchange>
PROCON_REMOTE_APIKEY=<remote-enabled-rabbitmq-apikey-instead-of-user-pass>
PROCON_REMOTE_CONTEXTID=<user-or-org-id-of-function-owner>
PROCON_REMOTE_FUNCTION='example_function example_function_2 fn_name'
# or alternatively
PROCON_REMOTE_ALLFUNCTIONS='true' # or just --allfunctions
```

```sh
uv run --env-file=.env my_step_file remote
```

With proper configuration, this can be used to debug your ProCon worker while it is connected to JobManagement. Note that the function must be registered with JMA; see here for more information.