Product · 2024-04-03 · 5 min read

Visual API Testing — No-Code Flow Editor | LoadGen

28 node types, OpenAPI import, environment management, auth profiles. API testing in EUC — what Login VSI and ControlUp don't have.

LoadGen Engineering

Product Strategy

API testing inside an EUC platform is a category of one. Most VDI tools — Login VSI, ControlUp — don't do it at all. The tools that do — k6, JMeter, Postman, the API surface inside NeoLoad — assume a JavaScript developer or a perf engineer who lives in code. Neither group is the operator running a Citrix farm.

LoadGen's visual flow editor closes that gap. It treats API testing as a first-class workload alongside load and monitoring, in a no-code editor designed for the team that already operates the platform. This article walks through the editor, the node taxonomy, OpenAPI import, environments, auth profiles, and the debug experience that turns "the test failed" into "this assertion on this node, here's the variable state."

Why visual API testing in an EUC platform

Three reasons LoadGen ships API testing as a built-in module rather than asking you to bolt on a separate tool:

  • No JavaScript required. k6 and Postman scripts assume a developer or a perf engineer comfortable with code. The flow editor is drag-and-drop. QA, ops, and platform teams can build API tests without standing up a JS toolchain.
  • EUC and API on one timeline. Your monitoring, your load tests, your uptime checks, and your API flows all share the same agents, the same audit trail, and the same comparison view. When something breaks, you don't stitch four dashboards together to find out where.
  • Bridges IT and dev teams. Most EUC tools ignore the API layer entirely. Most API tools ignore VDI entirely. LoadGen's positioning makes it relevant to both — same engine, same operator, no silo.

The three-panel flow editor

Open any flow at /api-testing/flows/{FlowId}/edit and you'll see the same three-panel shape: NodePalette on the left, FlowCanvas in the middle, NodeDetailPanel on the right. The model is borrowed from visual programming environments — but the nodes are domain-specific, not generic.

NodePalette — 28 node types across 7 families:

| Family | Nodes | What they do |
|--------|-------|--------------|
| Request | operation, raw, replay | Send an HTTP request — by OpenAPI operation id, by raw URL/body, or as a replay of a captured run |
| Assert | status, jsonpath, schema, latency | Validate the response — status code, JSONPath match, JSON Schema compliance, latency budget |
| Auth | bearer, apikey, oauth2 | Apply an auth profile to subsequent requests in the flow |
| Control | if, switch, loop, parallel, join, delay, retry, error-handler | Branch, loop, parallelize, retry, recover — the structural backbone of any non-trivial flow |
| Data | set, variable, csv, db-lookup | Read or write variables, drive iteration from a CSV, look up state from a database |
| Transform | builtin, js | Reshape data — built-in transforms for the common cases, JavaScript escape hatch for the rest |
| Subflow | call | Compose flows out of smaller flows — your auth bootstrap is one subflow, your business logic is another |

FlowCanvas — drag, connect, validate. The canvas is where the flow is assembled. Edges connect nodes; the canvas validates connectivity in real time so you don't ship a flow with an orphan branch.

NodeDetailPanel — configure the selected node. URL, headers, body, assertions, retry policy, timeout — every node's configuration lives in the right panel, with revision switching and an environment dropdown above. Run-in-editor lets you fire the flow once before saving as a revision.
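To make the canvas model concrete, here is a minimal sketch of the orphan-branch check the canvas performs on every edit. The flow shape below (a node list plus directed edges) is illustrative, not LoadGen's actual schema.

```python
# Hypothetical in-memory shape of a flow: node ids plus directed edges.
# Field names are illustrative, not LoadGen's real flow format.
flow = {
    "nodes": ["start", "login", "get_user", "assert_status", "orphan"],
    "edges": [
        ("start", "login"),
        ("login", "get_user"),
        ("get_user", "assert_status"),
    ],
}

def orphan_nodes(flow):
    """Return nodes unreachable from the entry node -- the kind of
    connectivity check a canvas validator runs in real time."""
    adjacency = {}
    for src, dst in flow["edges"]:
        adjacency.setdefault(src, []).append(dst)
    seen, stack = set(), [flow["nodes"][0]]
    while stack:
        node = stack.pop()
        if node in seen:
            continue
        seen.add(node)
        stack.extend(adjacency.get(node, []))
    return [n for n in flow["nodes"] if n not in seen]

print(orphan_nodes(flow))  # ['orphan']
```

A flow with an empty orphan list is safe to save as a revision; anything else gets flagged before the run.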

OpenAPI import — start from spec, stay in sync

Manual URL entry doesn't scale. /api-testing/sources lets you register an OpenAPI / Swagger spec by URL, watch versions change, and migrate flows when the upstream evolves.

The Sources table shows: name, spec URL, versions, status, diff summary, and a list of flows impacted by the latest version. Approve a new version and the flow editor shows a migration banner — every node bound to a renamed or restructured operation gets a fix-up suggestion. The flow stays in sync with the contract, not the other way around.

This is the single biggest difference between LoadGen API testing and a code-first tool. In a JS-first stack, you write the test against the spec once, the spec changes, and someone updates the test six weeks later when CI breaks. In LoadGen, the version-aware source layer makes drift visible the moment it happens.
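The core of that drift detection is a diff over operation ids between spec versions. The sketch below shows the general technique on two hand-written spec fragments; it is not LoadGen's implementation, and real migration logic would also compare parameters and schemas.

```python
def operation_ids(spec):
    """Collect operationIds from a parsed OpenAPI document."""
    ids = set()
    for path_item in spec.get("paths", {}).values():
        for op in path_item.values():
            if isinstance(op, dict) and "operationId" in op:
                ids.add(op["operationId"])
    return ids

# Two illustrative spec versions: 'getUser' is renamed to 'fetchUser'.
v1 = {"paths": {"/users": {"get": {"operationId": "listUsers"}},
                "/users/{id}": {"get": {"operationId": "getUser"}}}}
v2 = {"paths": {"/users": {"get": {"operationId": "listUsers"}},
                "/users/{id}": {"get": {"operationId": "fetchUser"}}}}

removed = operation_ids(v1) - operation_ids(v2)  # nodes may still bind to these
added = operation_ids(v2) - operation_ids(v1)    # candidates for fix-up suggestions
print(sorted(removed), sorted(added))  # ['getUser'] ['fetchUser']
```

Any flow node bound to an id in `removed` is exactly the set a migration banner needs to surface.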

Environments and auth profiles

Two configuration surfaces live alongside flows because they're orthogonal to flow logic:

Environments at /api-testing/environments:

  • Base URL, headers, allowed hosts
  • Variables (overridable per run) and secrets (encrypted at rest)
  • Add / Edit modal with validation

A flow that runs against staging.example.com should run against prod.example.com without a single edit — only the active environment changes.
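The environment swap works because flows never hard-code a base URL; the run resolves one at start. A minimal sketch of that resolution, with per-run overrides winning over environment defaults (the dict shapes here are assumptions, not LoadGen's schema):

```python
def resolve(environment, run_overrides=None):
    """Merge an environment's variables with per-run overrides.
    Per-run values win, mirroring 'variables (overridable per run)'."""
    merged = dict(environment["variables"])
    merged.update(run_overrides or {})
    return {"base_url": environment["base_url"], "variables": merged}

# Illustrative environments: same flow, only the active environment changes.
staging = {"base_url": "https://staging.example.com", "variables": {"tenant": "acme"}}
prod = {"base_url": "https://prod.example.com", "variables": {"tenant": "acme"}}

print(resolve(staging)["base_url"])                                # staging URL
print(resolve(prod, {"tenant": "acme-prod"})["variables"])         # override applied
```

Secrets would resolve through the same layer but stay encrypted at rest and masked in run artifacts.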

Auth profiles at /api-testing/auth-profiles:

  • None, Bearer, API Key
  • OAuth2 — Client Credentials, Authorization Code, Password Grant
  • Test credentials before the run — failed credentials surface inline, not at run-time

Reusable across flows, version-aware, and bound to environments — so your prod-readonly-token lives in one place.
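Conceptually, an auth profile is a mapping from profile type to request headers. The sketch below covers the None, Bearer, and API Key cases with an illustrative profile shape; the OAuth2 grants would add a token-fetch step against the provider before the header is built.

```python
def auth_headers(profile):
    """Map an auth profile to request headers. Profile field names are
    illustrative, not LoadGen's stored format."""
    kind = profile["type"]
    if kind == "none":
        return {}
    if kind == "bearer":
        return {"Authorization": f"Bearer {profile['token']}"}
    if kind == "apikey":
        # Header name is configurable; default to a common convention.
        return {profile.get("header", "X-API-Key"): profile["key"]}
    raise ValueError(f"unsupported auth profile type: {kind}")

print(auth_headers({"type": "bearer", "token": "prod-readonly-token"}))
# {'Authorization': 'Bearer prod-readonly-token'}
```

Because the profile is resolved at run start, "test credentials before the run" amounts to building these headers once and firing a probe request before the flow proper.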

Debug mode — step, resume, abort

Run a flow at /api-testing/runs/{RunId} and the run-detail page renders a four-panel debug surface:

  • Timeline — node execution as horizontal bars, color-coded by HTTP status, sized by duration
  • Events — the event log: assertion fires, retries, branch decisions, errors
  • Artifacts — captured outputs (response bodies, headers, computed variables)
  • Variables — the runtime state at any selected step

Three controls — Resume, Step, Abort — turn API testing into a debuggable artifact rather than a black-box failure. When an assertion fires on node #14, you don't grep a CI log; you click the node and see the response that failed, the variable that triggered it, and the request that produced it.
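The Events panel is essentially an ordered log of assertion outcomes. A minimal sketch of how status and latency assertions might be evaluated against a captured response to produce such entries (assertion and response shapes are assumptions for illustration):

```python
def evaluate(assertions, response):
    """Run status and latency assertions against a captured response,
    returning event-log entries like those in the Events panel."""
    events = []
    for a in assertions:
        if a["type"] == "status":
            ok = response["status"] == a["expect"]
        elif a["type"] == "latency":
            ok = response["latency_ms"] <= a["budget_ms"]
        else:
            ok = False  # jsonpath/schema asserts would be handled here too
        events.append({"assertion": a["type"], "passed": ok})
    return events

# A 500 within the latency budget: status assert fires, latency passes.
response = {"status": 500, "latency_ms": 120}
events = evaluate(
    [{"type": "status", "expect": 200},
     {"type": "latency", "budget_ms": 250}],
    response,
)
print(events)
```

Pairing each event with the captured response body and variable snapshot is what makes the failure clickable instead of greppable.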

How to apply this in LoadGen

A pragmatic first flow looks like this:

  1. Register the OpenAPI spec at /api-testing/sources.
  2. Set up an environment at /api-testing/environments with the staging base URL.
  3. Configure an auth profile at /api-testing/auth-profiles — start with Bearer or API Key.
  4. Author a flow at /api-testing/flows/new — a Request node, an Assert (status, latency, jsonpath), and a save.
  5. Run with debug at /api-testing/runs/{RunId} — verify the timeline matches the contract.
  6. Promote to a scheduled job and let it run alongside your load tests.

Most teams ship their first useful flow inside a couple of hours.
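Step 4's jsonpath assert boils down to extracting a value from the response body and comparing it. A tiny dotted-path lookup gives the flavor; real JSONPath (filters, wildcards, recursive descent) is much richer, and this helper is purely illustrative.

```python
def json_path(doc, path):
    """Tiny dotted-path lookup -- a stand-in for the JSONPath matching a
    jsonpath assert node performs (real JSONPath is far richer)."""
    cur = doc
    for part in path.split("."):
        cur = cur[int(part)] if isinstance(cur, list) else cur[part]
    return cur

body = {"user": {"id": 42, "roles": ["admin", "ops"]}}
print(json_path(body, "user.id"))       # 42
print(json_path(body, "user.roles.0"))  # admin
```

An assert node then just compares the extracted value against the expected one and emits a pass/fail event.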

Reference

Routes reference

| Surface | Route |
|---------|-------|
| Flows index | /api-testing/flows |
| Flow editor | /api-testing/flows/{FlowId}/edit |
| New flow | /api-testing/flows/new |
| Sources (OpenAPI) | /api-testing/sources |
| Environments | /api-testing/environments |
| Auth profiles | /api-testing/auth-profiles |
| Runs | /api-testing/runs |
| Run detail (debug) | /api-testing/runs/{RunId} |

Conclusion

API testing in EUC has been a gap for years. The VDI tools didn't do it; the API tools didn't do EUC. LoadGen closes the gap with a visual editor whose node taxonomy was designed by people who actually operate the systems being tested — not borrowed from a generic web-test product.

The result: QA teams who don't write JavaScript can build real API flows. Platform teams who don't normally touch the application layer can run gate tests in CI. Both groups operate from the same dashboard, the same audit trail, and the same engine — and the API layer stops being someone else's problem.

Ready to baseline your environment?

Run the wizard, hit the cockpit, watch the audit trail build itself.
