Canvas Debug Mode


Debug Mode provides an isolated testing environment within the workflow canvas where you can simulate workflow execution and validate task behavior. This replaces the time-intensive, error-prone process of manually mocking data using the stub task, significantly streamlining workflow testing. By simulating workflow inputs and task outputs, you can test and debug workflows predictably without external dependencies. You can mock tasks to simulate their behavior without executing against live systems, then create debug scenarios that use this mock data to validate your workflow's behavior.

Prerequisites

Before using Debug Mode, ensure you have:

  • Platform 6.1.x installed
  • WorkflowBuilder engineering or admin role
  • A workflow with configured tasks

Enter Debug Mode

To enter Debug Mode:

  1. Open your workflow in Platform 6.1 Studio
  2. Click Debug in the title bar

From the Debug Configuration Panel, you can configure your starter scenario or create new scenarios. The yellow border around the canvas indicates that you are in Debug Mode.

Figure 1: Debug Mode UI

Debug Scenarios

Debug scenarios define the set of mock instance data and conditions used to validate your workflow. Each scenario represents a different test case or execution path of the workflow that you would like to validate within Debug Mode.
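
Conceptually, a scenario maps each mocked task to the mock data instance it should return during the run. The sketch below is a hypothetical illustration of that relationship, not the platform's actual storage format: the field names are invented, and the // comments are annotations for readability rather than valid JSON.

{
    // Hypothetical structure for illustration only
    "scenarioName": "Happy Path - Target File Found",
    "workflow": "IOS Upgrade Pre-Check",
    "taskMocks": {
        // task reference -> name of the mock data instance it returns
        "verifyTargetFile": "Cisco IOS Target File Found",
        "notifyOperator": "Notification Sent"
    }
}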

Add a New Scenario

  1. Enter Debug Mode from the workflow canvas
  2. Click "+" in the Debug Configuration Panel
  3. Click New Blank Scenario

Figure 2: Add New Blank Scenario

Mock Data

When you configure mock data in your debug scenario and run it in Debug Mode, the system replaces real responses from external systems and built-in functions with the mock data you provided. This mock data is stored separately from the workflow itself, making it reusable across different workflows throughout the system.
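
As a hypothetical illustration, a single mock data instance might be represented as follows. The field names are assumptions derived from the Mock Data Fields documented below, and the // comments are annotations rather than valid JSON.

{
    // Hypothetical shape; see Mock Data Fields for each field's meaning
    "name": "Cisco IOS Target File Found",
    "description": "Mocks Cisco IOS target file found during file verification",
    "transitionType": "Success",
    "delay": 5,
    "responseType": "Object",
    "responseValue": { "status": "complete" }    // truncated; see the full example below
}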

Create Mock Data

  1. Select a scenario from the Debug Configuration Panel
  2. Select the task you want to provide mock data for
  3. In the Debug Task Panel that opens, click "+"
  4. In the Create Mock Data dialog, provide values for the mock data fields:
    • Name
    • Description (optional)
    • Transition type
    • Delay
    • Response type
    • Response value
Note

The mock data output must match the structure of the real task response to ensure accurate workflow testing. This output should also align with your workflow's data transformation and variable mapping requirements.

  5. Click Save to save the mock data instance

Figure 3: Create Mock Data Dialog

By default, the transition type is set to Success, and the Response Type dropdown preselects the expected outgoing data type for the task. However, if the task is not available in the current environment (for example, due to a missing adapter or custom application), no default appears in the Response Type dropdown. When this occurs, refer to previously completed jobs for this workflow to determine the expected mock data response, since no task data is available in the system.

Figure 4: No Task Data Available UI Example

Mock Data Fields

The following fields are available when creating mock data instances. Configure these fields to define how your mock data behaves during debug execution and what response it returns to the workflow.

Name

A unique name for this mock data instance. Use descriptive names that clearly indicate what this mock data represents.

Example: Cisco IOS Target File Found

Description (optional)

A brief explanation of what this mock data simulates or its purpose in testing. This helps other team members understand the test case and makes scenarios easier to maintain.

Example: Mocks Cisco IOS target file found during file verification

Transition Type

Specifies how the job should proceed after this mock response is processed. This determines the execution path your job will follow.

Valid values: Success, Failure, Error

Delay

The amount of time in seconds to wait before returning the mock data response. This simulates real-world processing delays and helps test timing-dependent workflow logic.

Example: 5

Response Type

The data type of the mock data you are providing. This must match the expected workflow task output.

Valid values: Boolean, String, Number, Integer, Object, Array, Null
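
For reference, the hypothetical object below pairs each valid Response Type with one illustrative literal of that type; the values are invented, and the // comment is an annotation rather than valid JSON.

{
    // One illustrative value per Response Type
    "Boolean": true,
    "String": "bootflash:",
    "Number": 3.5,
    "Integer": 5,
    "Object": { "status": "complete" },
    "Array": ["test-ios-device-1", "test-ios-device-2"],
    "Null": null
}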

Response Value

The mock data output that will be returned from your workflow task. This should match the structure and data types that the task expects from the actual external system.

Example:

{
    "status": "complete",
    "templateName": "IOS - File Verification",
    "reattempt": false,
    "reattemptWaitTime": "",
    "reattemptQuantity": "",
    "deviceName": "test-ios-device",
    "suppressFailureMessage": false,
    "_id": "63634d2b354d43efbe25d694",
    "suppressSuccessMessage": false,
    "templateVariables": {
        "version": "csr1000v-universalk9.03.10.03.S.153-3.S3-ext.SPA.bin",
        "flashMemory": "bootflash:"
    },
    "initiator": "bob",
    "templateResults": {
        "all_pass_flag": true,
        "result": true,
        "commands_results": [
            {
                "raw": "show ver | i System image file is",
                "all_pass_flag": true,
                "evaluated": "show ver | i System image file is",
                "parameters": {},
                "rules": [
                    {
                        "rule": "csr1000v-universalk9.03.10.03.S.153-3.S3-ext.SPA.bin",
                        "eval": "!contains",
                        "severity": "error",
                        "flags": {
                            "case": true
                        },
                        "raw": "<!version!>",
                        "result": true
                    }
                ],
                "device": "test-ios-device",
                "response": "System image file is \"bootflash:csr1000v-universalk9.03.10.02.S.153-3.S2-ext.SPA.bin\"",
                "result": true
            }
        ],
        "name": "IOS - File Verification"
    }
}

Run Scenario

After configuring your debug scenario and mock data:

  1. Select the scenario you would like to run from the Debug Configuration Panel
  2. Click Run Scenario

Figure 5: Run Scenario

  3. The debug job opens in Operations Manager, where you can monitor the debug run results.
Note

To distinguish mocked tasks and jobs from those run against their designed configuration, mock data is annotated with a yellow beaker icon.

Figure 6: Debug Run in Operations Manager

Exit Debug Mode

To return to Design Time Mode from the Debug Mode screen, select Exit Debug Mode at the top of the canvas.

Figure 7: Exit Debug Mode

Debug scenarios are automatically saved when you exit Debug Mode and will be available with their configured mock data the next time you enter Debug Mode for this workflow.

Best Practices

Scenarios

  • Create scenarios that cover different execution paths (success, error, failure)
  • Use descriptive names that clearly indicate the test condition
  • Keep scenarios up to date with the latest workflow changes so they correctly capture passing and failing test cases

Mock Data

  • Include realistic data values to better simulate real-world execution; when possible, pull mock data from actual system responses.
  • Remember that mock data is reusable; updating an instance can affect other debug scenarios that use it.
  • Mock data is stored separately from the workflow and must be transported along with the workflow between environments.

Testing Coverage

  • Test all conditional branches in your workflow
  • Validate error handling paths
  • Verify data transformations work as expected
  • Test with different data volumes and complexity levels

Troubleshooting

Scenario Execution Fails

If your scenario fails to execute:

  1. Ensure data types align with job variable definitions
  2. Verify the mock data configured for the scenario is in the system
  3. Verify the workflow does not have validation errors (aside from any resolved by mocking the task)

Unexpected Workflow Behavior

If the debug workflow execution differs from expectations:

  1. Confirm task mock data aligns with any data transformation mapping requirements
  2. Verify the mock data Transition Type setting is defined correctly
  3. Verify conditional logic and decision points
  4. Verify variable assignments and task mappings

Next Steps

After testing your workflow in Debug Mode:

  • Create additional scenarios: Expand test coverage for different logic paths and edge cases
  • Validate mixed execution: Run workflows with a combination of mocked and live tasks to confirm integration behavior
  • Review results: Inspect task outputs and transitions in Operations Manager for accuracy and expected behavior
  • Integration testing: Validate workflow behavior within larger automation processes
  • Transport mocks: Import/export mock data via API together with scenarios across environments (a hypothetical payload sketch follows this list)
  • Deploy to staging: Test your workflow with live systems in a controlled environment
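
The exact endpoints and payload schema for transporting mock data are defined by the platform API; consult its reference documentation. As a purely hypothetical sketch, an exported bundle might pair scenarios with the mock data instances they depend on, where every field name below is an assumption and the // comments are annotations rather than valid JSON.

{
    // Hypothetical export bundle; the real API schema may differ
    "workflow": "IOS Upgrade Pre-Check",
    "scenarios": [
        {
            "name": "Happy Path - Target File Found",
            "taskMocks": { "verifyTargetFile": "Cisco IOS Target File Found" }
        }
    ],
    "mockData": [
        {
            "name": "Cisco IOS Target File Found",
            "transitionType": "Success",
            "responseType": "Object"
        }
    ]
}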