Architecture Decision Document
Project Context Analysis
Requirements Overview
Functional Requirements:
The UI Simple APIs Visual Pipeline Builder enables API producers to create integration pipelines through a visual interface rather than writing Camel YAML directly. The tool targets medium-to-complex integrations where visual pipeline building adds value; simple pass-through proxies will continue using existing mechanisms.
Trigger Types:
Integrations can be triggered by:
- HTTP Routes (GET, POST, PUT, DELETE, etc.) - defined by OAS paths
- Events - user selects an event type they want to react to; triggered by platform events
The UI presents these trigger types alongside each other (e.g., GET, POST, EVENT) when defining pipelines.
Pipeline Building Blocks:
The visual pipeline supports these step types:
- Call API - Invoke backend API from platform catalogue (abstracts Kamelets, handles auth)
- Content Transformation - Field renaming, mapping, data restructuring
- Choice - Conditional logic (if/else) with simple operators
- For Loop - Iterate over arrays/lists
- Parallel Execution - Fork to multiple branches (sync/async)
- Script - Groovy code for complex transformations
- Publish Event - Publish an event to platform event system (can be used in both HTTP-triggered and Event-triggered flows)
These blocks work identically regardless of trigger type (HTTP or Event).
1. Visual Pipeline Construction
- Drag-and-drop canvas for building integration flows (top-to-bottom flowchart)
- Support for all pipeline blocks listed above
- Inline step configuration with expand/collapse capability
- Real-time validation as pipelines are built
2. Trigger Definition
- HTTP Routes: Upload OpenAPI specification → discover routes from OAS paths → create pipeline per route
- Events: User selects from available event types → create pipeline triggered by that event
- Both trigger types use same pipeline building blocks and logic
3. Context-Aware Field Mapping
- Visual selection of data sources (original request/event payload, previous step outputs)
- Dropdown UI for selecting fields from current context using dot notation (e.g., `submission.basicInformation.submittedBy.name`)
- Type awareness with safe conversion inference
- Support for nested JSON structures and exchange property management
4. Backend API Integration
- Discover available APIs from platform catalogue
- “Call API” block abstracts backend calls (hiding Kamelet complexity)
- Smart search for API selection with external catalogue linking
- Request/response field mapping with dropdown selection per input parameter
5. Event Integration
- Event Trigger: User selects event type from available events (input to pipeline)
- Publish Event Step: Publish events during pipeline execution (output from pipeline)
- Event discovery from platform event registry/catalogue
- Event payload structure available in context for field mapping
- Works in both HTTP-triggered and Event-triggered pipelines
6. Orchestration Pattern Support
- Sequential backend calls: Extract data from one response to use in next call (Domain API pattern)
- Single backend enrichment: Call backend, transform, return (common pattern)
- Parallel execution: Fork to multiple pipelines (sync/async) with result aggregation
- Conditional logic: Simple operators (equals, not equals, greater than) with static values or data field selection
- Loop iteration: Iterate over arrays/lists with current item available in context
- Error handling: Try/catch blocks or error routing
- Content transformation: Field renaming, mapping, formatting
- Event publishing: Publish events to platform (trigger downstream systems, fire-and-forget or with acknowledgment)
7. Code Generation
- Generate valid Camel YAML matching established patterns (direct routes, Kamelets with unique routeIds, subroutes)
- Follow HIP platform best practices (learned from Domain APIs and existing integrations)
- Ensure unique Kamelet routeIds per invocation (critical for Camel routing correctness)
- Support reusable pattern library (direct subroutes for common operations)
- Generate both REST DSL (HTTP) and Event consumer DSL (Events)
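The trigger adapter named above could branch on trigger type when emitting the route skeleton. A minimal sketch follows; the `platform-event:` endpoint URI and the exact YAML shape are placeholder assumptions, not the real HIP component or generated output.

```typescript
// Sketch of a trigger adapter: choose the Camel YAML skeleton based on
// trigger type. Event endpoint URI is a placeholder assumption.
type Trigger =
  | { kind: "http"; method: "get" | "post" | "put" | "delete"; path: string }
  | { kind: "event"; eventType: string };

function triggerSkeleton(trigger: Trigger, routeId: string): string {
  if (trigger.kind === "http") {
    // REST DSL entry point delegating to a direct route
    return [
      "- rest:",
      `    ${trigger.method}:`,
      `      - path: "${trigger.path}"`,
      `        to: "direct:${routeId}"`,
    ].join("\n");
  }
  // Event consumer entry point
  return [
    "- route:",
    `    id: "${routeId}"`,
    "    from:",
    `      uri: "platform-event:${trigger.eventType}"`,
  ].join("\n");
}

console.log(triggerSkeleton({ kind: "http", method: "post", path: "/submissions" }, "submissions-post"));
console.log(triggerSkeleton({ kind: "event", eventType: "submission.approved" }, "sub-approved"));
```

Keeping the adapter as the only trigger-aware component preserves the requirement that all pipeline blocks work identically for HTTP and Event triggers.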
8. Testing & Deployment
- Test pipeline before deployment (mock data or sandbox environment)
- Integration with MR Worker for deployment via ClickOps
- Support versioning and deployment through environments
Use Case Spectrum:
| Complexity | Example | Tool Used | Pattern |
|---|---|---|---|
| Simple | Pass-through proxy with auth | Existing mechanism (not this tool) | Kong config only |
| Medium | Field mapping, single backend call, publish event | Visual Pipeline Builder | Request → Transform → Backend → Publish Event → Response |
| Complex | Multi-backend orchestration (Domain APIs), event-driven workflows | Visual Pipeline Builder | Event → Backend A → Extract → Backend B → Publish Event → Response |
The Visual Pipeline Builder targets medium-to-complex integrations where:
- Content transformation is needed beyond simple pass-through
- Multiple backend calls require orchestration
- Event-driven processing is required (consume events, publish events, or both)
- Conditional logic and loops are needed
Example Flows:
- HTTP → API → Publish Event: POST endpoint receives submission → stores in backend → publishes “submission.received” event
- Event → API → Publish Event: “submission.approved” event → enriches from customer API → stores result → publishes “notification.required” event
- HTTP → Parallel (API + Publish Event): POST endpoint → parallel branches: (1) store in backend, (2) publish event for async processing → merge responses
Non-Functional Requirements:
Correctness:
- Generated Camel YAML must be syntactically valid and semantically correct
- Must handle Camel-specific constraints (unique route IDs, explicit CamelHttpMethod between calls, header management)
- Exchange property flow must be correct for sequential orchestration
- Support both HTTP and Event trigger mechanisms
- Correctly generate event consumer and producer DSL
Usability:
- Clear value proposition: Medium-complexity integrations should be significantly easier to build visually than writing YAML
- Intuitive UI for non-Camel experts
- Clear visualization of data flow and context availability
- Helpful error messages and validation feedback
- Low learning curve for API producers familiar with REST APIs
- Event selection and publishing feels natural and integrated
Maintainability:
- Generated YAML should follow established patterns for consistency
- Encourage reuse patterns (subroutes) over inline duplication
- Support for updating pipelines without breaking existing integrations
Security & Compliance:
- Part of Critical National Infrastructure platform
- Audit trail for pipeline changes
- Integration with existing HIP security model (Keycloak, network policies)
Performance:
- Real-time validation without blocking UI
- Fast pipeline rendering for complex flows
- Efficient code generation
Scale & Complexity:
- Primary domain: Web frontend prototype (Loveable) + Camel YAML code generation engine
- Complexity level: High - Must handle medium-to-complex use cases elegantly with proper abstractions
- Estimated architectural components:
- Frontend: Canvas renderer, pipeline builder UI, field selector, validation engine, trigger type selector, event selector
- Code generation: Camel YAML template engine, Kamelet abstraction layer, routeId uniqueness manager, trigger adapter (HTTP vs Event), event DSL generator
- Integration: API catalogue connector, OAS parser, event registry/catalogue connector, MR Worker client
- Testing: Mock data provider, sandbox execution environment, event simulator
Open Questions & Future Considerations
The following items require further exploration and decision-making after initial prototype validation:
1. Field Mapping UX Enhancement
- Current: Two-level dropdown (source → field)
- Consider: Autocomplete with dot notation for faster field selection
- Tradeoff: Discoverability vs. speed for experienced users
2. Error Handling Patterns
- Current: CHOICE blocks with IF branches for conditional error routing
- Consider: Explicit “On Error” branch visualization or try/catch patterns
- Question: Does CHOICE pattern sufficiently cover error handling needs, or should error handling be more visually distinct?
3. Parallel Execution Visualization
- Current: Parallel block concept exists
- Detail needed: How to show sync vs. async? How to visualize branch merging?
- UX consideration: Branching visualization for parallel paths
4. Complex Field Transformations
- Current: Transform block with field mapping
- Question: How to handle complex transformations (arrays, nested objects, computed fields)?
- Consider: Groovy script templates and examples, visual array manipulation tools (filter, map, reduce)
5. Additional Step Types
- Review if additional blocks needed beyond current set: Call API, Transform, Choice, Loop, Parallel, Script, Publish Event, Response
- Consider: Delay/Wait, Aggregate, Split, Filter, etc.
- Question: What patterns emerge from user feedback that aren’t covered?
- Question: Do we allow producers to supply raw Camel YAML as a fallback block?
- Question: Do we allow producers to create and share custom blocks themselves, and if so, how?
6. Non-Functional Requirements (NFRs)
- Should pipelines define NFRs (max latency, min throughput, error rate thresholds)?
- Where in UI would this be specified? Pipeline-level configuration?
- How would NFRs be validated/enforced in production?
7. API Catalogue Integration (Production)
- Must APIs exist in Hub catalogue before use?
- Allow inline OAS/WSDL upload for new APIs not yet catalogued?
- Auto-publish uploaded specs to catalogue?
- Discovery workflow for production environment
Technical Constraints & Dependencies
Platform Constraints:
HIP Platform Integration:
- Must integrate with existing Gateway and Camel orchestration layer via Simple API deployment APIs
- Target medium-to-complex integrations (simple pass-through continues using existing approach)
Event-Driven Architecture Support:
- Event Consumption (Trigger): Integrate with event catalogue system to allow definition of pipelines that consume events
- Event Publishing (Step): Support “Publish Event” as pipeline step block (available in all pipelines regardless of trigger type)
- Event payloads become pipeline input context (similar to HTTP request body)
- Same pipeline blocks and logic for both HTTP and Event triggers
- Event schema discovery for both consumption and publishing
- Event acknowledgment and error handling patterns
Camel YAML DSL Requirements:
These constraints were observed while building Domain APIs and from wider platform experience; they should be reviewed before production implementation.
- Unique route IDs across all route files
- Explicit `CamelHttpMethod` header between sequential HTTP calls
- `removeHeaders: pattern: "CamelHttp*"` between backend calls
- Groovy scripts require specific dependencies: `camel-groovy`, `groovy-json:4.0.29`
- Query parameters arrive as HTTP headers in Camel REST DSL
- Kamelet invocations MUST include unique routeId per call (format: `kamelet:<name>/<unique-route-id>?params`)
- Support REST DSL (HTTP), Event consumer DSL (event triggers), and Event producer DSL (Publish Event step)
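A generator could enforce the routeId-uniqueness and header-hygiene constraints above mechanically. The sketch below uses a simple counter; the Kamelet name and parameters are illustrative, and the counter approach is just one possible uniqueness strategy.

```typescript
// Sketch: generate sequential "Call API" steps while enforcing two of the
// constraints above — a unique routeId per Kamelet invocation, and
// removeHeaders between backend calls to avoid CamelHttp* header leakage.
let invocationCounter = 0;

function kameletEndpoint(name: string, params: string): string {
  invocationCounter += 1; // unique per invocation across all generated routes
  return `kamelet:${name}/${name}-call-${invocationCounter}?${params}`;
}

function callApiSteps(name: string, params: string): string[] {
  return [
    `- removeHeaders:\n    pattern: "CamelHttp*"`, // clear stale HTTP headers first
    `- to: "${kameletEndpoint(name, params)}"`,
  ];
}

const yamlSteps = [
  ...callApiSteps("http-sink", "url=https://excise.example"),
  ...callApiSteps("http-sink", "url=https://customer.example"),
];
console.log(yamlSteps.join("\n"));
```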
Integration Pattern Support:
- Single backend enrichment (call, transform, return)
- Multi-backend orchestration (sequential or parallel)
- Exchange property management for passing data between steps
- XML-to-JSON transformation support (for legacy backends)
- Response combination using Groovy
- Error handling and fallback patterns
- Event publishing (fire-and-forget or with acknowledgment)
External Dependencies:
- API Catalogue: Platform service for discovering available backend APIs (OAS Discovery Service)
- Event Registry/Catalogue: Discovery of event types and schemas (for both consumption and publishing)
- MR Worker: Deployment service for Simple API YAML configurations
- Keycloak: Authentication for API producers
- Kong Gateway: Runtime execution of HTTP-triggered integrations
- Event Platform: Runtime execution of event-triggered integrations and event publishing (part of HIP EDA)
- Apache Camel: Underlying orchestration engine (Camel YAML DSL on JBang)
Technology Constraints (TBC):
- Camel YAML DSL: Must generate valid Camel 4.x YAML (not Java DSL)
- Groovy 4.0.29: For transformation scripts within generated YAML
- OpenAPI 3.0+: For OAS parsing and HTTP route discovery
- Event Schemas: Schema format for event payload structure (JSON Schema, Avro, AsyncAPI, or platform-specific)
Cross-Cutting Concerns Identified
Trigger Abstraction:
- Unified pipeline model: Same blocks and logic work for HTTP and Event triggers
- Trigger-specific input: HTTP requests vs Event payloads as initial context
- Trigger selector UI: Clear presentation of GET/POST/EVENT options when defining pipelines
- Schema discovery: OAS for HTTP, event schemas for Events (both consumption and publishing)
Event Integration (Bidirectional):
- Event Consumption: Selecting event type as trigger, schema-driven context
- Event Publishing: “Publish Event” block with event type selection, field mapping from current context to event payload
- Event discovery: Unified interface for browsing available events (similar to API catalogue for backend APIs)
- Schema-driven mapping: Both consumed and published events use schemas for field selection
Context Flow Management:
- Track available data at each pipeline step
- Manage exchange properties for sequential orchestration
- Provide clear visualization of what fields are accessible where
- Handle both structured (HTTP request/event payload) and dynamic (step outputs) context
- Support “Publish Event” step accessing current context for payload construction
- Critical for correct field mapping and preventing runtime errors
Validation & Error Prevention:
- Real-time YAML syntax validation
- Semantic validation (routeId uniqueness, property references, field paths)
- Type checking for field mappings
- Prevent common Camel mistakes (missing CamelHttpMethod, header conflicts)
- Trigger-specific validation (HTTP status codes vs event acknowledgment)
- Event payload validation against schemas (both consumption and publishing)
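One of the semantic checks above — that a field mapping may only reference the incoming request or an earlier step — can be sketched as a single pass over the pipeline. Step and mapping shapes here are assumptions for illustration.

```typescript
// Sketch: validate that every field-mapping source exists before it is
// referenced (the incoming request, or a previously executed step).
interface Step {
  name: string;
  mappings: { source: string }[]; // source: "request" or an earlier step name
}

function validateReferences(steps: Step[]): string[] {
  const errors: string[] = [];
  const seen = new Set<string>(["request"]);
  for (const step of steps) {
    for (const m of step.mappings) {
      if (!seen.has(m.source)) {
        errors.push(`${step.name}: references "${m.source}" before it exists`);
      }
    }
    seen.add(step.name); // this step's output becomes available downstream
  }
  return errors;
}

const errs = validateReferences([
  { name: "Call Excise", mappings: [{ source: "request" }] },
  { name: "Transform", mappings: [{ source: "Call Tax" }] }, // forward reference: invalid
]);
console.log(errs); // one error, for "Transform"
```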
Code Generation Quality:
- Consistency with HIP platform best practices (informed by Domain APIs lessons)
- Encourage reuse patterns (subroutes) over duplication
- Readable generated YAML with comments
- Support REST DSL (HTTP), Event consumer DSL (triggers), and Event producer DSL (Publish Event step)
- Version tagging for tracking changes
API Catalogue Integration:
- Discover available backend APIs for “Call API” blocks
- Fetch API schemas for field mapping
- Handle authentication configuration (abstract away from user)
- Link to external catalogue for API documentation
Event Catalogue Integration:
- Discover available event types (for both consumption and publishing)
- Fetch event schemas for field mapping (payload structure)
- Present event fields similarly to HTTP request fields in context dropdown
- Support searching/filtering events by domain or category
- Link to event documentation (if available)
Testing & Debugging:
- Pre-deployment testing with mock data (HTTP requests, event payloads, or both)
- Sandbox environment for safe execution
- Clear error messages mapping back to visual pipeline
- Test case management for regression testing
- Support testing HTTP-triggered, Event-triggered, and hybrid pipelines
- Simulate event publishing without actually sending events to platform
Versioning & Lifecycle:
- Pipeline version control (Git integration via MR Worker)
- Deployment through environments (dev → qa → prod)
- OAS and event schema versioning handling
- Backward compatibility for existing integrations
- Handle schema evolution for events (forward/backward compatibility)
Security & Audit:
- Audit trail for pipeline creation/modification
- Integration with platform authentication (Keycloak)
- Secret management for backend credentials (handled by egress gateways, not in UI)
- Event publishing authorization (what events can this integration publish?)
- Event consumption authorization (what events can this integration consume?)
- Compliance with CNI requirements
User Experience:
- Progressive disclosure (hide Camel complexity, expose when needed)
- Context-sensitive help and documentation
- Error recovery (save draft pipelines, undo/redo)
- Copy/paste pipeline blocks for reuse
- Clear trigger type selection: Visually distinguish HTTP routes from Event triggers
- Intuitive event selection: Searchable event picker for both consumption and publishing
- Template library: Pre-built patterns for common scenarios:
- HTTP → API → Response
- HTTP → API → Publish Event → Response
- Event → API → Publish Event
- Event → Parallel (API + Publish Event)
- API → Transform → Publish Event
Prototype Scope & Objectives
Purpose
This is a clickable prototype to validate UX concepts and visual pipeline building patterns. The focus is on demonstrating the user experience and gathering stakeholder feedback, not building production-ready infrastructure.
Primary Goal
Validate that visual pipeline building is:
- Intuitive for API producers (non-Camel experts)
- Capable of representing complex orchestration patterns (Domain API use case)
- Clearer than writing Camel YAML directly
- Worth investing in full production implementation
Scope Boundaries
In Scope (Prototype Demonstrates):
- Visual pipeline builder UI with drag-and-drop or add/remove capabilities
- Route selection (GET, POST, EVENT) and configuration
- Pipeline step manipulation (add, remove, reorder steps)
- Step types: Call API, Transform, Choice, Loop, Parallel, Script, Publish Event, Response
- Context-aware field selection (dropdowns showing available data at each step)
- Mock data for API catalogue, events, and field selection
- Complex orchestration example (VPD Domain API with sequential backend calls)
- Event trigger selection and event publishing in pipelines
- Branching and nesting for conditional logic
- Simulated pipeline testing/execution visualization
Out of Scope (Deferred to Production):
- Backend code generation service
- Actual Camel YAML generation and validation
- Integration with HIP platform (API Catalogue, MR Worker, Event Registry)
- Pipeline execution or testing in real environment
- Persistent storage or state management beyond browser session
- User authentication or authorization
- Multi-user collaboration
- Deployment workflows
- Versioning and deployment workflows (handled by MR Worker in production)
- Monitoring and tracing integration (production runtime concern)
- Test case management and regression testing
- Production credential management
Success Criteria
Prototype validates UX if:
- ✅ Stakeholders can understand pipeline flow without Camel knowledge
- ✅ Complex orchestrations (like Domain API) are visibly clearer than YAML
- ✅ Field mapping and context flow are intuitive
- ✅ Event integration feels natural alongside HTTP routes
- ✅ API producers express confidence they could build integrations with this tool
- ✅ Team agrees to invest in production implementation
- ✅ Branching/nesting for conditionals is clear and manageable
Prototype fails validation if:
- ❌ Visual representation is more confusing than YAML
- ❌ Critical pipeline patterns can’t be expressed visually
- ❌ Context flow is unclear or confusing
- ❌ Too much cognitive overhead to use the tool
- ❌ Feedback indicates writing YAML would be easier
- ❌ Nested branches become unmanageable or confusing
Visual Reference
UI Mocks: .ai/projects/hip/ui-simple-apis/v1.0-mocks/
Core UI Screenshots:
- `new-route.png` - Empty pipeline canvas with route selection
- `route-selected.png` - POST route with incoming request body shown
- `steps.png` - Complete pipeline with multiple steps and step type menu
- `link-to-expand.png` - CHOICE step with nested branch (collapsed view)
- `expanded-detail.png` - Inside a branch with breadcrumb navigation
- `select-field.png` - Field mapping dropdown showing context sources
Key UI Elements:
- Left Panel: Route selector showing POST, GET, EVENT with descriptions
- Main Canvas: Top-to-bottom pipeline flow
- Collapsible Sections: Incoming request, individual steps, response
- Step Type Menu: “Add step” button reveals available step types
- Step Cards: Numbered steps with type badge, title, description, expand/collapse
- Breadcrumb Navigation: For navigating into/out of branches (e.g., “Pipeline > Validation Failed”)
- Context Dropdowns: Show available data sources (Incoming Request, Previous Steps, Static Value, Script)
Key UX Patterns from Mocks
Branching and Nesting Pattern
Evidence from Mocks: link-to-expand.png, expanded-detail.png
The prototype demonstrates a nested branching pattern for CHOICE steps:
Collapsed State (link-to-expand.png):
- Step 2: “CHOICE - Check Validation”
- Shows: “Branch based on validation result”
- Nested branch shown as: “IF - Validation Failed”
- Link indicator: “1 step in this branch >”
- Condition displayed inline: `Validate & Calculate (Step 1).valid equals false`
Expanded Branch State (expanded-detail.png):
- Breadcrumb navigation: “Pipeline > Validation Failed”
- Shows badge: “■ Validation Failed”
- Step 1 inside branch: “TRANSFORM - Build Error Response”
- “Add step” button within branch
- “■ End of branch” marker
Key UX Insights:
Inline Condition Configuration:
- Condition shown directly in collapsed CHOICE step
- Three-part condition: `[Source Field] [Operator] [Value]`
- Example: `Validate & Calculate (Step 1).valid | equals | false`
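The three-part condition model can be evaluated against resolved context values with a small function. The operator set matches the simple operators named earlier in this document; everything else here is illustrative.

```typescript
// Sketch: evaluate a [Source Field] [Operator] [Value] condition.
type Operator = "equals" | "not equals" | "greater than";

function evaluateCondition(fieldValue: unknown, op: Operator, value: unknown): boolean {
  switch (op) {
    case "equals":
      return fieldValue === value;
    case "not equals":
      return fieldValue !== value;
    case "greater than":
      // numeric comparison; type-awareness would refine this in practice
      return Number(fieldValue) > Number(value);
  }
}

// "Validate & Calculate (Step 1).valid equals false" → branch taken
console.log(evaluateCondition(false, "equals", false)); // true
```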
Branch Navigation:
- Click “1 step in this branch >” to navigate into branch
- Breadcrumb shows context: “Pipeline > Validation Failed”
- Shows you’re inside a conditional branch
Branch Isolation:
- Branch has own step numbering (Step 1, Step 2… within branch)
- “Add step” works within branch context
- Clear “End of branch” marker
Context Awareness:
- Fields from main pipeline available in branch
- Example: Step 1's `valid` field used in condition
- Branch steps can access parent context
This pattern answers several architectural decisions:
✅ Decision 3 (Step Configuration): Inline expansion confirmed - conditions shown directly in collapsed state
✅ Decision 7 (Step Ordering): Not shown in mocks - defer or implement simply
✅ Decision 9 (Error Handling): CHOICE block handles it - “IF Validation Failed” branch pattern
Architectural Implication:
The nesting pattern requires:
- State management for current navigation context (main pipeline vs inside branch)
- Breadcrumb rendering
- Context tracking across nested levels
- Branch step isolation (separate numbering, separate “Add step” context)
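The navigation-state requirement above amounts to a path stack that yields the breadcrumb trail. A minimal sketch, with illustrative names rather than the prototype's actual state model:

```typescript
// Sketch: navigation state for nested branches — a stack of branch labels
// rendered as the breadcrumb ("Pipeline > Validation Failed").
class PipelineNavigation {
  private path: string[] = [];

  enterBranch(label: string): void {
    this.path.push(label); // e.g. clicking "1 step in this branch >"
  }

  exitBranch(): void {
    this.path.pop(); // breadcrumb click back toward the main pipeline
  }

  breadcrumb(): string {
    return ["Pipeline", ...this.path].join(" > ");
  }
}

const nav = new PipelineNavigation();
nav.enterBranch("Validation Failed");
console.log(nav.breadcrumb()); // "Pipeline > Validation Failed"
```

In a React implementation this stack would live in component state or context, with branch step lists keyed by the current path so each branch keeps its own numbering and “Add step” scope.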
Prototype Design
Visual Pipeline Builder Interface
The prototype presents a three-panel layout for building integration pipelines:
Left Panel: Route Selector
- Displays available routes from uploaded OAS specification
- Shows GET, POST, and EVENT triggers in a unified list
- Each route shows HTTP method badge, path, and description
- Selecting a route loads its pipeline configuration in the main canvas
Main Canvas: Pipeline Flow
- Top-to-bottom flowchart visualization
- Collapsible “Incoming Request” section showing request structure (parameters, body, headers)
- Sequential pipeline steps displayed as numbered cards
- “Add step” button between sections to insert new steps
- Collapsible “Response” section at bottom showing response structure
Step Type Menu
- Appears when “Add step” is clicked
- Available step types:
- CALL API - Invoke backend API from catalogue
- TRANSFORM - Rename or map fields
- CHOICE - Conditional branching (if/else)
- SCRIPT - Custom Groovy script
- PARALLEL - Parallel execution branches
- LOOP - Iterate over a list (array/list field from context; current item available as context source in each iteration)
- PUBLISH EVENT - Publish event to platform
- RESPONSE - Map fields to API response
Call API Step Configuration
When configuring a Call API step:
API Selection:
- APIs are selected from the platform API catalogue
- For prototype: Mock catalogue with predefined APIs (excise, customer, tax-platform)
- Open question (production): Must APIs exist in catalogue before use? Or allow inline OAS/WSDL upload for new APIs?
API Details Display:
- API name and description
- Link to full API documentation (external catalogue)
- SLA Information (if available):
- Expected response time
- P50/P99 latency percentiles
- Max throughput supported
- Current availability status
- Filtering Requirements: Some APIs may have filtering requirements (security scopes, permissions). Prototype shows where this information would appear.
Input Mapping:
- List of required parameters from API schema
- Field mapping dropdown for each parameter (from context)
- Support for both simple fields and nested paths
Step Configuration Pattern
Steps use inline expansion for configuration:
- Click step card → expands to show configuration fields
- All configuration visible in main canvas (no modals or drawers)
- Collapse/expand to manage visual complexity
- Keeps pipeline flow visible at all times
For branching structures (CHOICE steps), the interface uses breadcrumb navigation:
- Collapsed view shows branch condition inline (e.g., `Validate & Calculate (Step 1).valid equals false`)
- Link indicator shows “1 step in this branch >”
- Clicking navigates into branch with breadcrumb: “Pipeline > Validation Failed”
- Branch has isolated step numbering and “Add step” context
- “End of branch” marker indicates return to main flow
Context-Aware Field Selection
Field mapping uses a dynamic dropdown system showing available data sources:
Available Sources:
- Incoming Request (or Event payload for event-triggered flows)
- Previous Steps (e.g., “Validate & Calculate (Step 1)”)
- Static Value - for hardcoded values
- Script (Groovy) - for computed values
As pipeline steps are added, the context grows dynamically. Each field mapping dropdown shows:
- First dropdown: Select source (Incoming Request, Step 1, Step 2, Static Value, Script)
- Second dropdown: Select field from that source (supports dot notation for nested fields)
This pattern enables users to visually trace where data comes from and how it flows through the pipeline.
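The growing context can be modelled as a pure function of the steps added so far. The sketch below shows how the first dropdown's source list could be computed; the labels and pseudo-sources follow the list above, but the function itself is illustrative.

```typescript
// Sketch: data sources offered by the field-mapping dropdown at a given
// step index. Only steps *before* the one being configured are offered;
// Static Value and Script are always available.
function availableSources(stepNames: string[], atStep: number): string[] {
  const previous = stepNames
    .slice(0, atStep)
    .map((name, i) => `${name} (Step ${i + 1})`);
  return ["Incoming Request", ...previous, "Static Value", "Script (Groovy)"];
}

console.log(availableSources(["Validate & Calculate", "Call Excise API"], 2));
// ["Incoming Request", "Validate & Calculate (Step 1)",
//  "Call Excise API (Step 2)", "Static Value", "Script (Groovy)"]
```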
Array and List Handling:
The field selection system supports working with arrays in multiple ways:
- Direct array field selection: Select entire array field for passing to another step
- Array element access: Use Loop step type to iterate over array elements
- Array item fields: Within Loop context, access fields of current array item using dot notation
- Future consideration: Visual array transformation tools (filter, map, reduce operations)
Event Integration
Events are treated alongside HTTP routes in the route selector:
- Mixed list showing GET, POST, EVENT triggers
- Visual distinction via badges (GET, POST, EVENT)
- Same pipeline builder for both HTTP and Event triggers
- “Publish Event” step available in all pipelines (HTTP or Event-triggered)
Publishing events follows the same pattern as calling APIs:
- Select event type from event catalogue
- Map context fields to event payload schema
- Events are fire-and-forget but return a basic status code
- Status can be used in subsequent CHOICE conditions
Pipeline Manipulation
Reordering Steps:
- Drag-and-drop handles on step cards
- Visual feedback during drag operation
- Drop indicators show valid placement positions
Step Management:
- Delete button on each step card (trash icon)
- Expand/collapse toggle on each step
- Inline editing of step configuration
Mock Data Structure
The prototype uses structured JSON files for realistic demonstration:
- `routes.json` - OAS routes (GET, POST, EVENT triggers)
- `api-catalogue.json` - Available backend APIs (excise, customer, tax-platform)
- `events-catalogue.json` - Available events to consume/publish
- `example-pipelines.json` - Pre-built examples (VPD Domain API, event-driven flows)
This structure allows easy updates to examples and maintains separation between UI logic and demonstration data.
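Typing the mock data keeps the UI logic decoupled from the demonstration JSON. The shape below is an assumption for illustration, not the actual `routes.json` schema:

```typescript
// Assumed (hypothetical) shape for entries in routes.json: HTTP routes and
// event triggers live in one unified list, as in the route selector UI.
interface RouteEntry {
  trigger: "GET" | "POST" | "PUT" | "DELETE" | "EVENT";
  pathOrEvent: string; // OAS path for HTTP triggers, event type for EVENT
  description: string;
}

const routes: RouteEntry[] = [
  { trigger: "POST", pathOrEvent: "/submissions", description: "Submit a VPD declaration" },
  { trigger: "EVENT", pathOrEvent: "submission.approved", description: "React to approved submissions" },
];

console.log(routes.filter((r) => r.trigger === "EVENT").length); // 1
```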
Example Pipelines
The prototype demonstrates three complexity levels:
1. Simple GET (Blank Template)
- Empty pipeline showing basic structure
- Demonstrates route selection and adding first step
2. Complex Orchestration (VPD Domain API - POST)
- Multi-backend sequential calls
- Conditional branching (validation check)
- Field extraction and mapping between steps
- Response assembly from multiple sources
- Demonstrates: Call API, Choice, Transform, Response
3. Event-Driven Flow (To be added)
- Event trigger (e.g., “submission.approved”)
- Enrich data from backend API
- Publish new event (e.g., “notification.required”)
- Demonstrates: Event integration and publishing
Testing Simulation
The “Test pipeline” button triggers simulated execution visualization:
Test Execution Flow:
- User provides mock request data (or uses example from OAS)
- Pipeline executes in simulation mode
- Each step highlights as it “executes”
- Step outputs displayed in-context
Visibility During Test:
- Step-by-step execution: Visual highlighting of active step
- Input/Output inspection: Expand any step to see its input data and output result
- Context evolution: See how context grows as pipeline executes
- HTTP request visibility (future): Show actual HTTP requests made to backend APIs (mock or real)
- Final response: Display complete response structure
Authentication (Production Consideration): For production testing with real backends:
- Users would need credentials with appropriate scopes
- Could create/update test applications for isolated testing
- Prototype uses mock data, no real authentication required
Test Result Management (Future):
- Save test scenarios for regression testing
- Compare results across runs
- Out of scope for prototype
This demonstrates the value of testing before deployment without requiring actual backend integration.
Appendix: Loveable Implementation
Current Prototype Implementation: Loveable Platform
Platform: Loveable (https://lovable.dev)
Live Prototype: https://cle-mock-5.lovable.app/
Technology Stack
Loveable provides:
- React 18+ with TypeScript
- Tailwind CSS for styling
- Vite build tooling
- Hot module replacement
- Built-in deployment and hosting
Implementation Notes
State Management:
- React Context API or useState for prototype state
- No persistence needed (browser session only)
Data Mocking:
- Mock data in JSON files or components
- No backend API calls required
- All logic runs in browser
Component Organization:
- Standard Loveable project structure
- Components, pages, hooks, utils folders
- Tailwind for styling