Sirius — Next-Generation Data Collection Device
Integrated Hardware & Software Design for Enterprise Field Operations
Overview
Most enterprise device design works within fixed hardware specifications: the software adapts to a form factor that has already been decided. Sirius inverted that dynamic entirely. This was a project where the physical device and the digital interface were designed simultaneously, from the ground up, with neither allowed to constrain the other prematurely. The result is a data collection device where the hardware form and the software experience were built to the same standard and read as a single, intentional whole.
The scope was total: ergonomics and physical form factor, high-resolution touchscreen interface, performance architecture, and a voice command system powered by natural language processing that extended interaction beyond touch entirely.
The Challenge
Three constraints defined the design problem and had to be resolved in parallel.
Form factor was the most complex because Sirius was being designed from scratch, which meant the physical dimensions of the device were as much a design variable as the layout of the software. A device too large to hold during extended use fails regardless of how good the software is. A device too small to accommodate the required touchscreen area constrains the UX from the outside. Resolving this required iterating on physical proportions and digital wireframes simultaneously, with each discipline informing and constraining the other in real time.
Vertical space introduced a different problem. A data collection device operates in portrait orientation, often at arm's length, in environments where glanceability matters as much as information depth. Every screen state had to be designed with the assumption that the user had limited time and limited attention.
Data density was the third constraint. Transactions on a data collection device involve product codes, quantities, prices, locations, timestamps, and status fields, all of which a user may need to see, enter, or verify in a single workflow. Presenting that on a small touchscreen without overwhelming the user required a disciplined hierarchy: what was primary, what was secondary, and what was available only on demand.
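One way to make that hierarchy explicit is to encode it alongside the data model, so every field carries a deliberate visibility decision. A minimal sketch in TypeScript, with hypothetical field names and priority assignments (not the shipped schema):

```typescript
// Illustrative field hierarchy for a transaction record. Field names and
// priority assignments are assumptions for the sketch, not product specs.

type Priority = "primary" | "secondary" | "onDemand";

interface TransactionRecord {
  productCode: string;
  quantity: number;
  price: number;
  location: string;
  timestamp: string;
  status: string;
}

// One explicit decision per field: always visible, visible on expand,
// or shown only when the user asks for it.
const fieldPriority: Record<keyof TransactionRecord, Priority> = {
  productCode: "primary",
  quantity: "primary",
  status: "primary",
  price: "secondary",
  location: "secondary",
  timestamp: "onDemand",
};

// List the fields a given screen tier should render.
function fieldsFor(priority: Priority): (keyof TransactionRecord)[] {
  return (Object.keys(fieldPriority) as (keyof TransactionRecord)[])
    .filter((field) => fieldPriority[field] === priority);
}
```

Encoding the hierarchy as data rather than leaving it implicit in individual screen designs is what keeps the primary/secondary split consistent across workflows.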
Approach
The process moved through Assessment, Exploration, Design, Production, and Deployment, with the critical difference that hardware and software decisions ran on the same track throughout.
Assessment mapped how data collection was being done across four industries: what devices were in use, where they were failing, and what users needed that existing products weren't providing. The consistent findings became the design brief for Sirius: hardware built for durability but uncomfortable for extended use, interfaces designed for function without attention to usability, and no integration of voice interaction in a category where hands-free operation had real operational value.
Exploration tested hardware form factors and software interaction architectures in parallel. A front-end developer and software architect were present throughout this phase, making technical feasibility for hardware integration and software rendering a real-time conversation rather than a late-stage discovery. Decisions that would have been expensive to revisit after engineering committed were made collaboratively while the cost of change was still low.
The Work
Wireframes on a project like this had to resolve two overlapping sets of decisions simultaneously: the structural logic of the software experience across transaction and management workflows, and the spatial and proportional constraints the hardware had to satisfy to make that software work as intended. The wireframe phase produced dual specifications: a coherent software architecture and a hardware design envelope derived directly from it.
High-fidelity designs translated those structural decisions into the complete visual and interaction language of the device: the touchscreen UI, the visual treatment of voice command states across all interaction modes, and the hardware design language that gave the physical form its identity.
BuildKit delivered the full implementation specification to engineering: component states, animation behaviors, voice interaction flows, hardware integration specs, and asset exports organized for both software and hardware development teams. Not a handoff: a complete build package designed to protect design intent through production.
The Outcome
A product where the physical object and the digital interface reinforce each other rather than compete. Sirius delivered durability, ergonomics, interface sophistication, and voice interaction in a product category where trade-offs between those qualities had previously been accepted as inevitable.
Staying close through production and deployment was as critical as setting the direction during exploration because the gap between a well-designed spec and a well-built product is where quality is most often lost.
Role — Creative & Design Direction (UX/UI)
Team — UX · UI · Research · Front-End Developer · Software Architect · Product Management
Industries — Retail · Manufacturing · Healthcare · Hospitality

SaaS Product Design Strategy
Enterprise Platform Transformation — Legacy On-Premise to Modern Cloud SaaS
Overview
This project represents one of the most consequential design challenges in enterprise software: taking a mature, on-premise product suite and rebuilding the entire experience for the cloud. Not a visual refresh. Not a feature update. A complete rethinking of how a product serves real people across five industries, three distinct user types, and an organization that had spent years building around tools that technically worked but quietly exhausted everyone who used them.
The Challenge
The legacy platform had been built over many years by multiple teams solving problems in isolation. The result was a product that carried inconsistent patterns, redundant flows, and a visual language that reflected a different era of software. Moving to SaaS introduced an entirely new set of user expectations: the speed and responsiveness of a modern web product combined with the configurability that enterprise buyers had come to depend on.
The harder challenge was human. Aligning cross-functional teams around a shared design vision, when every product area had developed its own conventions and its own sense of ownership, required earning trust through process before earning buy-in through output.
Approach
The process moved through six structured phases: Assessment, Exploration, Research, Prototype, Iterations, and Deployment, designed to reduce risk while moving decisively through ambiguity.
Assessment started with a full audit of the legacy product: cataloging interaction patterns, mapping friction points, and aligning stakeholders on what success actually looked like before any design work began. Research followed not as a checkbox, but as an ongoing practice. User interviews and contextual inquiry sessions with managers, full-time employees, and hourly workers shaped every significant decision that came after.
Prototyping kept the cost of learning low. Low and mid-fidelity concepts were tested early, before commitment hardened. High-fidelity work came only after the core interactions had been validated. Each iteration closed the gap between what the product was and what it needed to become.
The Work
Rather than a set of screens, the output was infrastructure: a design foundation built to outlast the project itself.
A Design System became the shared language for every team building on the platform: not just what components looked like, but how they behaved, when to use them, and why. A system that teaches you how to think, not just what to copy.
Common Components were designed to be flexible across five industry contexts (retail, manufacturing, healthcare, HR, and hospitality) while remaining consistent enough to build genuine familiarity across the product.
Design Tokens defined the core visual values at the code level, allowing changes to propagate systemically rather than manually and making the platform extensible for themes, brand variations, and future products without breaking the underlying structure.
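The token mechanism described above can be sketched as a typed source of truth with theme overrides that fall back to core values. A minimal illustration in TypeScript; the token names, values, and the `darkTheme` example are hypothetical, not the platform's actual tokens:

```typescript
// Hypothetical core tokens: one canonical value per visual decision.
const coreTokens = {
  "color.brand.primary": { value: "#0A5FFF" },
  "color.surface.default": { value: "#FFFFFF" },
  "space.sm": { value: "8px" },
  "space.md": { value: "16px" },
} as const;

type TokenName = keyof typeof coreTokens;

// A theme overrides a subset of core tokens; everything else falls through.
type Theme = Partial<Record<TokenName, { value: string }>>;

const darkTheme: Theme = {
  "color.surface.default": { value: "#121212" },
};

// Resolve a token for a given theme, falling back to the core value.
function resolveToken(name: TokenName, theme: Theme = {}): string {
  return (theme[name] ?? coreTokens[name]).value;
}

// Emit CSS custom properties so a token change propagates everywhere at once.
function toCssVariables(theme: Theme = {}): string {
  return (Object.keys(coreTokens) as TokenName[])
    .map((name) => `--${name.replace(/\./g, "-")}: ${resolveToken(name, theme)};`)
    .join("\n");
}
```

Because themes only override, never redefine, the token set, brand variations stay structurally compatible with the core system, which is what makes the platform extensible without breaking the underlying structure.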
The Outcome
A platform that felt native to five industries without being built five separate times. A design system that gave distributed teams the tools to make good decisions independently. And a product experience that measurably reduced onboarding friction, shortened the learning curve for new users, and helped the business retain and grow within accounts, not just launch.
Strategy without execution drifts. Execution without strategy fragments. Holding both together was what made this work.
Role — Design Leadership & Execution
Team — UX · UI · Research · Product Management
Industries — Retail · Manufacturing · Healthcare · HR · Hospitality
Apple Watch App
Native Wearable Experience for Workforce Management
Overview
What does it mean to design for a screen the size of a postage stamp that lives on someone's wrist? This project explored that question within a workforce management context, and the answer required stepping back from conventional app design entirely. This was not about shrinking an existing mobile experience onto a smaller display. It was about identifying which moments in an hourly worker's shift genuinely benefit from a glance-and-go interaction, and then designing those moments with a level of precision that most software never demands.
The native Apple Watch app was purpose-built to complement the broader scheduling platform, functioning as a real-time assistant that surfaces the right information at the right moment without requiring the employee to stop what they are doing, reach for their phone, or navigate through multiple screens.
The Challenge
Two constraints defined every decision.
The first was the form factor. The Apple Watch screen is small by design, and that constraint is not just visual. It shapes how information can be organized, how interactions must be sequenced, and how much context a user can hold in a single view. Standard UI patterns from mobile and web simply do not translate. Navigation hierarchies that feel effortless on a phone become exhausting on a watch. Every design decision had to pass through a single filter: what is the one thing this person needs to know right now, and how do we show only that?
The second constraint was the dependency on the iPhone. The Apple Watch relies on a paired device for data connectivity, which means the experience is subject to Bluetooth reliability, background refresh limitations, and physical proximity. Designing for a device that is occasionally out of sync with its data source required building graceful states for stale content, clear last-updated indicators, and interaction patterns that did not frustrate users when connectivity was momentarily unavailable.
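The graceful-state logic above reduces to a small classification: given when the data last synced and whether the phone is reachable, decide which state to render and what "last updated" label to show. A sketch in TypeScript, where the thresholds and state names are illustrative assumptions, not the shipped product's values:

```typescript
// Possible UI states for watch content that depends on a paired phone.
type SyncState = "fresh" | "stale" | "offline";

interface SyncStatus {
  state: SyncState;
  // Human-readable label shown alongside stale content.
  lastUpdatedLabel: string;
}

const STALE_AFTER_MS = 5 * 60 * 1000;    // assumed: data older than 5 min reads as stale
const OFFLINE_AFTER_MS = 30 * 60 * 1000; // assumed: fall back to offline messaging after 30 min

function classifySync(lastSyncedAt: Date, now: Date, phoneReachable: boolean): SyncStatus {
  const age = now.getTime() - lastSyncedAt.getTime();
  const minutes = Math.floor(age / 60_000);
  const label = minutes < 1 ? "Updated just now" : `Updated ${minutes} min ago`;

  if (!phoneReachable && age >= OFFLINE_AFTER_MS) {
    return { state: "offline", lastUpdatedLabel: label };
  }
  if (age >= STALE_AFTER_MS) {
    return { state: "stale", lastUpdatedLabel: label };
  }
  return { state: "fresh", lastUpdatedLabel: label };
}
```

Keeping the classification in one pure function means every screen renders the same honest answer about data freshness instead of each view improvising its own.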
There was also the harder problem of information redundancy. The goal was not to duplicate what already lived on the mobile app but to identify the specific subset of information that gains real value by being on the wrist. Getting that distinction right required careful research and a willingness to cut features that seemed useful in theory but added noise in practice.
Approach
Assessment began with an audit of how hourly employees were actually using the existing mobile app. Usage data, support tickets, and direct interviews revealed a consistent pattern: the most accessed features were shift times, break schedules, and manager messages. Everything else was secondary. That finding became the foundation for the entire wearable strategy.
Exploration focused on what watchOS could do that mobile could not, rather than mapping mobile screens to watch screens. The team investigated how haptics, complications, and Siri integration could create genuinely new interaction moments, from ambient shift countdowns displayed as watch face complications to voice-activated status check-ins that required no visual attention at all.
Design translated the strongest concepts into structured flows. At this screen size, wireframes were as much about content priority as visual layout. Deciding what appeared on the primary card, what lived one swipe away, and what required a tap involved constant negotiation between completeness and clarity.
Production ran in close collaboration with the front-end developer and software architect throughout, because watchOS has its own layout engine and rendering behavior. Staying in sync with engineering from an early stage was the only way to ensure design intent survived the technical constraints of the platform.
The Work
Wireframes played a more critical role here than in most projects. Because screen real estate was so constrained, low-fidelity explorations were used to stress-test content hierarchy and interaction logic before any visual decisions were made. A wireframe that could not communicate the right thing in the right order at the right time was not going to be rescued by color or typography.
High-fidelity designs brought the visual language of the broader platform into the watchOS context, adapted to the constraints of a small, always-on display. This included detailed work on dark mode optimization, since the Apple Watch's OLED display makes dark interfaces not just visually appropriate but functionally superior for battery life and legibility. Typography choices, icon sizing, and layout density were all validated against real device previews rather than simulator estimates.
BuildKit and Design System documentation served as the connective layer between design and engineering, covering spacing, component behavior, interaction states, and animation timing in a format the development team could reference independently. Given the precision required by watchOS layout constraints, clear specs were a prerequisite for shipping a product that matched the design intent, not a formality.
The Outcome
A wearable experience that fits naturally into the rhythm of a shift without adding cognitive load. Retail associates, warehouse workers, and hospitality staff get shift times, break windows, task assignments, and real-time manager updates in under two seconds, on a device they never have to reach for. Haptic feedback replaces audio alerts in loud environments. Voice input replaces touch when hands are occupied.
The broader outcome was establishing the wearable as a legitimate productivity surface within the platform ecosystem, one that increased engagement with the scheduling system by lowering the barrier to access.
Role — Design Leadership and Execution
Team — UX · UI · Research · Front-End Developer · Software Architect · Product Management
Industries — Retail · Manufacturing · Hospitality