Bots From Extension: guardrails
Guardrails-based AI System
This extension provides 2 bots.
Bot @guardrail:prompt
Bot Position In Pipeline: Source Sink
Performs text completion using an LLM for the specified input
This bot expects a Restricted CFXQL.
Each parameter may be specified using the '=' operator and the AND logical operation
Following are the parameters expected for this Bot
| Parameter Name | Type | Default Value | Description |
|---|---|---|---|
| prompt* | Text | | Prompt text to be sent to the LLM |
| send_input_data | Text | no | Send input dataframe as CSV or JSON to the LLM. Specify 'yes' or 'no'. Default is 'no' |
| data_format | Text | csv | Format in which the input dataframe is sent to the LLM. Valid values are 'csv' or 'json' |
| prefix | Text | | Optional prefix text to be sent before including data or prompt |
| limit | Text | 100 | If sending input data, limit it to this many rows. 0 means no limit. |
| propagate_cols | Text | | Comma separated list of columns that should be propagated from input to output |
| exclude_cols | Text | | Comma separated list of columns from input data that should be excluded when sending to LLM |
| include_cols | Text | | Comma separated list of columns from input data that should be the only ones included when sending to LLM |
| attach_datasets | Text | | If additional datasets need to be attached, specify a meta dataset name which has a list of datasets. This meta dataset must contain two columns: name (dataset name) and description (dataset description) |
| temperature | Text | 0.3 | Temperature parameter for response generation. Should be between 0.0 and 2.0. Closer to 0.0 means more deterministic; closer to 1.0 or higher means more random. |
| debug | Text | no | Specify 'yes' to print full data being sent to LLM. Default is 'no' |
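As an illustration, a Restricted CFXQL invocation of this bot using the '=' operator and AND might look like the following sketch. The prompt text and all parameter values here are hypothetical; only the parameter names come from the table above.

```
@guardrail:prompt
    prompt = 'Summarize the following incident records in plain English' and
    send_input_data = 'yes' and
    data_format = 'csv' and
    limit = '50' and
    temperature = '0.2'
```

With `send_input_data = 'yes'`, up to 50 rows of the input dataframe would be sent to the LLM as CSV along with the prompt.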
Bot @guardrail:prompt-using-template
Bot Position In Pipeline: Source Sink
Prepares the prompt contents from the specified Query Template and then sends the prompt to the LLM
This bot expects a Restricted CFXQL.
Each parameter may be specified using the '=' operator and the AND logical operation
Following are the parameters expected for this Bot
| Parameter Name | Type | Default Value | Description |
|---|---|---|---|
| prompt_template* | Text | | Query template name used to generate the prompt |
| input_data_as_variable | Text | no | Send the input dataframe to the template as a variable with the specified name |
| temperature | Text | 0.3 | Temperature parameter for response generation. Should be between 0.0 and 2.0. Closer to 0.0 means more deterministic; closer to 1.0 or higher means more random. |
| debug | Text | no | Specify 'yes' to print full data being sent to LLM. Default is 'no' |
This bot also accepts wildcard parameters. Any additional name = 'value' parameters are passed to the bot.
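A hypothetical invocation might look like the sketch below. The template name, variable name, and the wildcard parameter `env` are illustrative assumptions, not names defined by this extension.

```
@guardrail:prompt-using-template
    prompt_template = 'incident-summary-template' and
    input_data_as_variable = 'incidents' and
    temperature = '0.2' and
    env = 'production'
```

Here `env = 'production'` illustrates a wildcard name = 'value' parameter that is passed through to the bot alongside the documented parameters.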