Debug Mode provides an isolated testing environment within the workflow canvas where you can simulate workflow execution and validate task behavior. This replaces the time-intensive, error-prone process of manually mocking data using the stub task, significantly streamlining workflow testing. By simulating workflow inputs and task outputs, you can test and debug workflows predictably without external dependencies. You can mock tasks to simulate their behavior without executing against live systems, then create debug scenarios that use this mock data to validate your workflow's behavior.
Prerequisites
Before using Debug Mode, ensure you have:
- Platform 6.1.x installed
- WorkflowBuilder engineering or admin role
- A workflow with configured tasks
Enter Debug Mode
To enter Debug Mode:
- Open your workflow in Platform 6.1 Studio
- Click Debug in the title bar
From the Debug Configuration Panel, you can configure your starter scenario or create new scenarios. The yellow border around the canvas indicates that you are in Debug Mode.
Figure 1: Debug Mode UI
Debug Scenarios
Debug scenarios define the set of mock instance data and conditions used to validate your workflow. Each scenario represents a different test case or execution path of the workflow that you would like to validate within Debug Mode.
Add a New Scenario
- Enter Debug Mode from the workflow canvas
- Click "+" in the Debug Configuration Panel
- Click New Blank Scenario
Figure 2: Add New Blank Scenario
Mock Data
When you configure mock data in your debug scenario and run it in Debug Mode, the system replaces real responses from external systems and built-in functions with the mock data you provided. This mock data is stored separately from the workflow itself, making it reusable across different workflows throughout the system.
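Conceptually, this substitution works like a test double: when a scenario runs, the debug engine returns the stored mock response instead of calling the live system. The following is a minimal sketch of that idea only; the names (`run_task`, `MOCKS`, `call_live_system`) are illustrative and not part of the product API:

```python
# Illustrative sketch of how Debug Mode substitutes mock data for live
# task calls. All names here are hypothetical, not product APIs.

MOCKS = {
    # task name -> mock data instance configured in the debug scenario
    "file_verification": {
        "transition": "success",
        "delay": 0,
        "response": {"status": "complete", "result": True},
    },
}

def call_live_system(task_name):
    # In Debug Mode this is never reached for mocked tasks.
    raise RuntimeError("live systems are not contacted for mocked tasks")

def run_task(task_name, debug_mode=True):
    """Return the mock response when debugging; otherwise run the real task."""
    if debug_mode and task_name in MOCKS:
        mock = MOCKS[task_name]
        return mock["transition"], mock["response"]
    return "success", call_live_system(task_name)

transition, response = run_task("file_verification")
```

Because the mock data lives outside any single workflow, the same instance (here, the `file_verification` entry) can be reused by any scenario that needs that task's behavior.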
Create Mock Data
- Select a scenario from the Debug Configuration Panel
- Select the task you want to provide mock data for
- In the Debug Task Panel that opens, click "+"
- In the Create Mock Data dialog, provide values for the mock data fields:
- Name
- Description (optional)
- Transition type
- Delay
- Response type
- Response value
The mock data output must match the structure of the real task response so that your workflow is tested accurately. This output should also align with your workflow's data transformation and variable mapping requirements.
- Click Save to save the mock data instance
Figure 3: Create Mock Data Dialog
By default, the transition type is set to Success, and the Response Type dropdown preselects the expected outgoing data type for the task. However, if the task isn't available in the current environment (for example, because of a missing adapter or custom application), no default appears in the Response Type dropdown. When this occurs, refer to previously completed jobs for this workflow to determine the expected mock data response, since no task data is available in the system.
Figure 4: No Task Data Available UI Example
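One way to sanity-check a mock data instance before saving it is to verify that the Response Value actually parses as the declared Response Type. A hedged Python sketch follows; the mapping mirrors the Response Type dropdown values described below, but the helper itself is illustrative and not part of the product:

```python
# Check that a mock Response Value matches its declared Response Type.
# The type names mirror the Response Type dropdown; this helper is an
# illustrative sketch, not a product API.

TYPE_CHECKS = {
    "Boolean": lambda v: isinstance(v, bool),
    # bool is a subclass of int in Python, so exclude it explicitly
    "Number": lambda v: isinstance(v, (int, float)) and not isinstance(v, bool),
    "Integer": lambda v: isinstance(v, int) and not isinstance(v, bool),
    "String": lambda v: isinstance(v, str),
    "Object": lambda v: isinstance(v, dict),
    "Array": lambda v: isinstance(v, list),
    "Null": lambda v: v is None,
}

def matches_response_type(response_type, value):
    check = TYPE_CHECKS.get(response_type)
    if check is None:
        raise ValueError(f"unknown response type: {response_type}")
    return check(value)

print(matches_response_type("Object", {"status": "complete"}))  # True
print(matches_response_type("Integer", 3.5))                    # False
```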
Mock Data Fields
The following fields are available when creating mock data instances. Configure these fields to define how your mock data behaves during debug execution and what response it returns to the workflow.
Name
A unique name for this mock data instance. Use descriptive names that clearly indicate what this mock data represents.
Example: Cisco IOS Target File Found
Description (optional)
A brief explanation of what this mock data simulates or its purpose in testing. This helps other team members understand the test case and makes scenarios easier to maintain.
Example: Mocks Cisco IOS target file found during file verification
Transition Type
Specifies how the job should proceed after this mock response is processed. This determines the execution path your job will follow.
Valid values: Success, Failure, Error
Delay
The amount of time in seconds to wait before returning the mock data response. This simulates real-world processing delays and helps test timing-dependent workflow logic.
Example: 5
Response Type
The data type of the mock data you are providing. This must match the expected workflow task output.
Valid values: Boolean, String, Number, Integer, Object, Array, Null
Response Value
The mock data output that will be returned from your workflow task. This should match the structure and data types that the task expects from the actual external system.
Example:
{
  "status": "complete",
  "templateName": "IOS - File Verification",
  "reattempt": false,
  "reattemptWaitTime": "",
  "reattemptQuantity": "",
  "deviceName": "test-ios-device",
  "suppressFailureMessage": false,
  "_id": "63634d2b354d43efbe25d694",
  "suppressSuccessMessage": false,
  "templateVariables": {
    "version": "csr1000v-universalk9.03.10.03.S.153-3.S3-ext.SPA.bin",
    "flashMemory": "bootflash:"
  },
  "initiator": "bob",
  "templateResults": {
    "all_pass_flag": true,
    "result": true,
    "commands_results": [
      {
        "raw": "show ver | i System image file is",
        "all_pass_flag": true,
        "evaluated": "show ver | i System image file is",
        "parameters": {},
        "rules": [
          {
            "rule": "csr1000v-universalk9.03.10.03.S.153-3.S3-ext.SPA.bin",
            "eval": "!contains",
            "severity": "error",
            "flags": {
              "case": true
            },
            "raw": "<!version!>",
            "result": true
          }
        ],
        "device": "test-ios-device",
        "response": "System image file is \"bootflash:csr1000v-universalk9.03.10.02.S.153-3.S2-ext.SPA.bin\"",
        "result": true
      }
    ],
    "name": "IOS - File Verification"
  }
}
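When you have a response from a previously completed job, you can compare its top-level structure against your mock output to catch missing or misspelled fields before running a scenario. A minimal sketch, assuming both responses have already been captured as Python dictionaries (the sample data is an illustrative stand-in):

```python
# Compare the top-level keys of a mock response against a real task
# response captured from a completed job. Both dictionaries below are
# illustrative stand-ins for data exported from the system.

real_response = {
    "status": "complete",
    "deviceName": "test-ios-device",
    "templateResults": {"result": True},
}
mock_response = {
    "status": "complete",
    "deviceName": "test-ios-device",
}

missing = set(real_response) - set(mock_response)  # fields the mock lacks
extra = set(mock_response) - set(real_response)    # fields the mock invents

if missing:
    print(f"mock is missing fields: {sorted(missing)}")
if extra:
    print(f"mock has unexpected fields: {sorted(extra)}")
```

A mock that drops or renames fields the workflow later reads (for example, `templateResults` above) will pass the scenario run but fail downstream variable mappings, so this kind of check is cheap insurance.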
Run Scenario
After configuring your debug scenario and mock data:
- Select the scenario you would like to run from the Debug Configuration Panel
- Click Run Scenario
Figure 5: Run Scenario
- The debug job opens in Operations Manager where you can monitor the debug run results.
To distinguish mocked tasks and jobs from those run against their designed configuration, mock data is annotated with a yellow beaker icon.
Figure 6: Debug Run in Operations Manager
Exit Debug Mode
To return to Design Time Mode from the Debug Mode screen, select Exit Debug Mode at the top of the canvas.
Figure 7: Exit Debug Mode
Debug scenarios are automatically saved when you exit Debug Mode and will be available with their configured mock data the next time you enter Debug Mode for this workflow.
Best Practices
Scenarios
- Create scenarios that cover different execution paths (success, error, failure)
- Use descriptive names that clearly indicate the test condition
- Keep scenarios up to date with the latest workflow changes so they correctly capture passing vs. failing test cases
Mock Data
- Include realistic data values to better simulate real-world execution. It is recommended to pull mock data from actual system responses when possible.
- Mock data is shared; updating an instance may affect other debug scenarios that reference it.
- Mock data is stored separately from the workflow and must be transported along with it when moving workflows between environments.
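Because mock data is shared across scenarios, it can help to check which scenarios reference an instance before editing it. The sketch below assumes scenarios have been exported as plain dictionaries listing their mock IDs; the field names (`name`, `mock_ids`) are hypothetical, not the product's storage format:

```python
# Find every debug scenario that references a given mock data instance.
# The scenario structure (name + mock_ids) is a hypothetical stand-in
# for however scenarios are exported from the system.

scenarios = [
    {"name": "happy-path", "mock_ids": ["mock-file-found", "mock-push-ok"]},
    {"name": "file-missing", "mock_ids": ["mock-file-missing"]},
    {"name": "retry-path", "mock_ids": ["mock-file-found"]},
]

def scenarios_using(mock_id, scenarios):
    """Return the names of scenarios that reference mock_id."""
    return [s["name"] for s in scenarios if mock_id in s["mock_ids"]]

print(scenarios_using("mock-file-found", scenarios))
# ['happy-path', 'retry-path']
```

If an edit would change behavior for scenarios other than the one you are working on, consider creating a new mock data instance instead.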
Testing Coverage
- Test all conditional branches in your workflow
- Validate error handling paths
- Verify data transformations work as expected
- Test with different data volumes and complexity levels
Troubleshooting
Scenario Execution Fails
If your scenario fails to execute:
- Ensure data types align with job variable definitions
- Verify the mock data configured for the scenario is in the system
- Verify the workflow does not have validation errors (aside from any resolved by mocking the task)
Unexpected Workflow Behavior
If the debug workflow execution differs from expectations:
- Confirm task mock data aligns with any data transformation mapping requirements
- Verify the mock data Transition Type setting is defined correctly
- Verify conditional logic and decision points
- Verify variable assignments and task mappings
Next Steps
After testing your workflow in Debug Mode:
- Create additional scenarios: Expand test coverage for different logic paths and edge cases
- Validate mixed execution: Run workflows with a combination of mocked and live tasks to confirm integration behavior
- Review results: Inspect task outputs and transitions in Operations Manager for accuracy and expected behavior
- Integration testing: Validate workflow behavior within larger automation processes
- Transport mocks: Import/export mock data via API together with scenarios across environments
- Deploy to staging: Test your workflow with live systems in a controlled environment