Role: Lead product designer (collaborated with another product designer, PM and Engineering)
Year: 2023
Quix is a platform that enables developers and data engineers to build, deploy, and manage real-time data pipelines using streaming technologies like Kafka. It provides a visual interface for managing pipelines, stream processing, and real-time ML models, reducing the complexity of working with distributed systems.
While stream-based architectures are powerful, they are notoriously hard to work with.
Tools: Figma, Kafka, Kubernetes, Quix backend APIs, React
This project deepened my understanding of data engineering workflows and of how to design visual abstractions for complex, distributed systems. It taught me how to balance power and simplicity in UX, making advanced capabilities accessible to a broader engineering audience.
The Quix Pipeline Editor transforms complex streaming architectures into an interactive visual canvas, helping engineers understand, build, and operate their Kafka-based pipelines with clarity.
Instead of relying on CLI commands or raw YAML, users can now design and inspect their pipelines in a fully visual flow.
This design empowers teams to collaborate across roles, making it easier for data engineers and developers alike to work with streaming data and accelerate time-to-production.
An interactive editor that makes complex Kafka data pipelines visual, explorable, and intuitive to operate.
A control center for monitoring live pipeline deployments. This view surfaces critical information at a glance: run state, resource consumption (CPU and memory), replicas, and version. I designed the table to scale for large deployments, with intuitive visual cues (status icons, usage graphs) and actionable details. Engineers can quickly identify unhealthy components, optimize resource usage, and drill into individual deployments for deeper diagnostics.
The topics explorer gives engineers full visibility and control over their Kafka topics without needing to switch to external tools. In this view, users can inspect topic metadata, track throughput and retention, and manage partitions.
We prioritized clarity and operational safety: actions like cleaning or deleting a topic are surfaced with contextual cues, while real-time stats help teams monitor message flow across the pipeline.
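For a sense of what the explorer replaces: without a view like this, engineers typically pull partition counts and retention settings through the Kafka admin API or CLI. The sketch below is purely illustrative, using the confluent-kafka Python client; the broker address and topic name are placeholders, and this is not Quix product code.

```python
# Illustrative only: roughly the kind of lookups the topics explorer consolidates.
# Broker address and topic name are placeholders.
from confluent_kafka.admin import AdminClient, ConfigResource

admin = AdminClient({"bootstrap.servers": "localhost:9092"})

# Partition count comes from cluster metadata.
metadata = admin.list_topics(timeout=10)
topic_meta = metadata.topics["sensor-readings"]
print(f"partitions: {len(topic_meta.partitions)}")

# Retention and other settings require a separate describe_configs call.
resource = ConfigResource(ConfigResource.Type.TOPIC, "sensor-readings")
futures = admin.describe_configs([resource])
configs = list(futures.values())[0].result()
print(f"retention.ms: {configs['retention.ms'].value}")
```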
To help users get started faster, I designed the Application Templates view to surface pre-built components for common streaming use cases. Whether it’s sourcing data, transforming it, or writing to a destination, users can deploy templates in one click or preview the code to customize them.
I focused on clarity and discoverability: templates are grouped by pipeline stage and complexity, with filtering to match a wide range of technical backgrounds. This made the platform much more approachable for both data engineers and app developers.
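As a rough illustration of what the code behind a simple transformation template might contain, here is a minimal sketch using the open-source Quix Streams Python library. The topic names, field, and threshold are hypothetical, the exact API may differ between library versions, and this is not code taken from the product.

```python
# Hypothetical sketch of a simple "filter" transformation template.
# Topic names and the threshold are placeholders; API details may vary by version.
from quixstreams import Application

app = Application(broker_address="localhost:9092", consumer_group="high-temp-filter")

input_topic = app.topic("sensor-readings")
output_topic = app.topic("alerts")

sdf = app.dataframe(input_topic)                       # streaming dataframe over the input topic
sdf = sdf.filter(lambda row: row["temperature"] > 80)  # keep only hot readings
sdf = sdf.to_topic(output_topic)                       # publish the result downstream

if __name__ == "__main__":
    app.run(sdf)
```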