Designing the Autonomous Future of 911 Operator Interfaces.
Builds trust between 911 operators and AI by visualizing the system's decisions, making its reasoning easier to understand.
Improves operational efficiency by simplifying decisions drawn from complex drone data.
Enables faster, safer emergency response by helping operators assess the situation and dispatch accordingly.
Supports scalability by laying the groundwork for broader adoption of AI across emergency services and smart cities.
As part of a Purdue UX design course, our 7-person team partnered with Marc Ward, CEO of Socian Technologies, to design a graphical user interface (GUI) for an AI-powered police drone system in a safety-critical environment. The drone is a core product of his new startup and required a functional interface to move forward with development and marketing efforts.
The Challenge.
Imagine receiving a 911 call, but instead of sending officers blindly into a situation, an AI-powered drone is already en route, streaming live intelligence to a dispatcher’s screen. Our challenge was to design the interface that bridges human judgment and machine autonomy.
We were tasked with creating the command-center interface for Socian’s E.D.G.E. drone, an AI system capable of analyzing real-time footage, identifying threats, and guiding 911 operators during high-stress incidents.
Socian E.D.G.E. Drone
At the start, no design system or GUI existed. Socian gave our team full ownership of the UX strategy, interface design, and operator experience. The CEO’s only direction?
"Make it look like it came out of a Marvel movie."
The Solution.
Dispatch and Flight Visualization
Once the caller’s location is confirmed, the drone launches autonomously. The operator can view real-time flight telemetry, including ETA, route, and mission data.
The operator’s role is to monitor progress, ensure the AI’s path is accurate, and prepare to activate cameras upon arrival. Every detail is designed for clarity at a glance, giving operators confidence before the drone even reaches the scene.
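To make that telemetry concrete, here is a minimal sketch of the kind of flight data the dispatch view could consume. The field names and the formatEta helper are our own illustrative assumptions, not Socian's actual schema or API.

```typescript
// Illustrative flight-telemetry model for the dispatch view.
// Field names are assumptions for this sketch, not Socian's actual schema.
interface FlightTelemetry {
  droneId: string;
  etaSeconds: number;        // estimated time of arrival at the scene
  route: [number, number][]; // planned path as [lat, lng] waypoints
  speedMps: number;          // ground speed in meters per second
  batteryPercent: number;
  missionId: string;
}

// Format the ETA for at-a-glance reading in the operator header.
function formatEta(t: FlightTelemetry): string {
  const minutes = Math.floor(t.etaSeconds / 60);
  const seconds = Math.round(t.etaSeconds % 60);
  return `ETA ${minutes}:${seconds.toString().padStart(2, "0")}`;
}
```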
Live Scene Monitoring
Upon arrival, the interface transitions into active scene mode. The operator now views live feeds from multiple drone cameras, with the most active camera centered.
The system automatically detects activity on the ground, highlighting areas of interest in real time. The operator can monitor unfolding events, communicate updates to first responders, and ensure that the caller’s safety is continuously tracked. This screen transforms complex aerial data into human-readable insight.
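One way to think about centering the most active camera is to rank feeds by recent AI detections. The sketch below is hypothetical: the CameraFeed shape and scoring rule are assumptions for illustration, not the implemented logic.

```typescript
// Hypothetical camera-feed model; detection activity drives which feed is centered.
interface CameraFeed {
  id: string;
  label: string;                 // e.g. "Front", "Rear", "Thermal"
  detectionsLastMinute: number;  // activity detected by the AI in the last 60s
  hasNewDetection: boolean;      // unacknowledged detection flag
}

// Pick the feed to center: new detections win, then raw activity level.
function mostActiveFeed(feeds: CameraFeed[]): CameraFeed | undefined {
  return [...feeds].sort((a, b) => {
    if (a.hasNewDetection !== b.hasNewDetection) {
      return a.hasNewDetection ? -1 : 1;
    }
    return b.detectionsLastMinute - a.detectionsLastMinute;
  })[0];
}
```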
New Detections and Camera Alerts
As new movement or individuals are detected, the interface visually prioritizes urgency.
Cameras with new detections pulse red and display exclamation markers, drawing the operator’s eye without requiring them to navigate through menus.
Operators can instantly switch views, filter between “units” (first responders) and “suspects”, and tag detections for follow-up. These interaction patterns were designed to minimize cognitive load, ensuring that during critical moments, the interface thinks as fast as the operator.
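As a minimal sketch of how that filtering and urgency flagging could be modeled, the types below mirror the "units"/"suspects" toggle and the pulsing-camera behavior described above; the Detection shape and function names are our assumptions, not Socian's data model.

```typescript
// Detections the AI surfaces on the live feed; "unit" = first responder.
type DetectionKind = "unit" | "suspect";

interface Detection {
  id: string;
  cameraId: string;
  kind: DetectionKind;
  acknowledged: boolean;    // true once the operator has seen or tagged it
  taggedForFollowUp: boolean;
}

// Filter the on-screen markers by the operator's current toggle.
function visibleDetections(all: Detection[], filter: DetectionKind | "all"): Detection[] {
  return filter === "all" ? all : all.filter(d => d.kind === filter);
}

// A camera pulses red while it has any unacknowledged detection.
function shouldPulse(cameraId: string, all: Detection[]): boolean {
  return all.some(d => d.cameraId === cameraId && !d.acknowledged);
}
```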
Map Coordination and Multi-Drone Tracking
Operators can see all active drones, their flight paths, and nest locations in real time.
This visualization reinforces the operator’s spatial awareness and gives a macro-level overview of the unfolding situation.
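For the map view, one simple way to model that macro-level overview is a list of per-drone states rendered as markers and paths. Again, the shapes and status values below are illustrative assumptions only.

```typescript
// Illustrative map-view model: every active drone, its path, and its nest.
interface DroneMapState {
  droneId: string;
  position: [number, number];      // current [lat, lng]
  flightPath: [number, number][];  // waypoints remaining to the scene
  nest: [number, number];          // home/launch location
  status: "en-route" | "on-scene" | "returning";
}

// Summarize fleet status for the overview panel.
function fleetSummary(drones: DroneMapState[]): Record<DroneMapState["status"], number> {
  const summary = { "en-route": 0, "on-scene": 0, "returning": 0 };
  for (const d of drones) summary[d.status]++;
  return summary;
}
```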
Recordings and Post-Incident Review
After the mission, the operator can access the recordings panel to review and analyze footage from previous calls. This allows for both investigation and training purposes, giving dispatchers the ability to replay critical events and evaluate response patterns.
The interface’s playback design mirrors the live system layout, maintaining consistency while shifting from real-time operations to reflection and learning.
Understanding the 911 Operator's World.
As a team, we recognized a critical gap: none of us had firsthand experience as a 911 operator. To bridge this, we immersed ourselves in their world through multiple channels:
On-site visit to a police dispatch center.
Interviews with four active dispatchers.
Observational research via YouTube footage of real 911 calls.
Online forum analysis from dispatcher communities.
This research approach helped us begin to understand the mental models, pressures, and workflows that define the operator experience.
We then synthesized our findings into a design direction by crafting a user persona, which we used to identify the user's main goals, frustrations, and wants.
User persona crafted from user interviews and secondary research.
We developed a user task flow to visualize the lifecycle of a typical 911 call before, during, and after operator intervention. This helped us identify exactly where our interface would plug into the existing emergency response workflow.
A task flow we created displaying the 911 call lifecycle, based on data gathered from user interviews and secondary research.
Next, we needed to meet with Marc to learn more about the full capabilities of this drone.
What's Currently on the Market?
We wanted to evaluate the interaction patterns, information architecture, and design flows currently used within the 911 and drone-operation space. To do this, we conducted a comparative analysis:
Comparative analysis of DJI Drone, Skydio Drone, 911 Operator Game, Axon dispatch software, other police drone concepts, and DroneOptID.
Aligning with Stakeholder Vision.
With user insights in hand, we met with Marc (CEO of Socian Tech) to better understand the drone’s full range of AI-powered capabilities. This allowed us to identify design constraints, safety features, and key edge cases our interface needed to support.
Based on our research and conversations with Marc, we identified three critical user journeys our design had to support:
Camera Prioritization
“As a user, I can easily identify and switch to the most urgent camera feed.”
Suspect Identification
“As a user, I can clearly distinguish and understand who is present at the scene.”
Map Interpretation
“As a user, I can effectively read and interpret map-based information.”
Lifecycle of an E.D.G.E. drone through the eyes of Socian.
Design, iterate, repeat.
We moved into the interface design phase, gradually increasing fidelity while staying in contact with both Marc and the 911 operators we interviewed earlier.
Design iteration 1.1 (left) and 1.2 (right).
Design 1 iteration highlights:
Key drone metrics displayed (battery life, serial number, bearing, speed, etc.)
Centralized camera view
Map featuring "homebase" and "scene"
Profile board showing individuals pinged by the AI system.
Design iteration 2.1 (left) and 2.2 (right).
Design 2 iteration highlights:
Increased fidelity to enhance information clarity
Consolidated UI into a single screen by removing tabs
Improved visual hierarchy for faster decision-making
Introduced camera rewind functionality
Added camera movement controls and high urgency alerts
Design iteration 3.1 (left) and 3.2 (right).
Design 3 iteration highlights:
Refined information architecture and overall layout
Integrated map location data into the camera view
Introduced ability to switch between multiple drones
Added vision mode toggles: day, night, and thermal
Integrated license plate/vehicle recognition
Upgraded to 1920×1080 video resolution
The MVP: Visualizing Drone Data for the 911 Operator
Reflection & Retrospective.
This was my first time designing for an AI product. Although the AI's capabilities and limitations were provided up front, the challenge came in translating that technical potential into a clear, usable interface for high-stakes decision-making.
What went well:
Team chemistry and collaboration were strong. We each brought unique design ideas and pushed each other to iterate and grow.
We used diverse research methods, including on-site visits, dispatcher interviews, and secondary sources to quickly build empathy with 911 operators.
What didn't:
We sometimes became too attached to early design ideas, making it harder to pivot when user or stakeholder feedback required change.
What to improve next time:
Keep fidelity lower for longer to allow quicker feedback cycles and reduce emotional attachment to visuals.
Spend more time testing interactions under pressure to simulate how users perform in fast-paced, safety-critical situations.
Conclusion.
We presented our final work in-person to our UX Experience Studio class, while Marc Ward (CEO of Socian Tech) joined remotely via Zoom. After the presentation, we conducted a formal handoff meeting with Marc, where we delivered the complete project package, including all research documentation, design files, and final interface mockups, to support potential future development.
In memory of Marc Ward, whose vision and leadership greatly influenced this project.