This is a case study for a test vehicle scheduler web-app that I helped design for Uber ATG.
The track testing team at Uber ATG had no efficient way to schedule tests because of the experimental nature of the testing. The challenge was to design a web-app that auto-assigns track tests to vehicles while meeting various testing and configuration requirements on the backend, and displays the daily schedule in an easy-to-read way.
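At its core, the auto-assignment can be thought of as a first-fit greedy pass over the day's tests: each test declares a set of configuration requirements, and each vehicle has a capability set and a daily capacity. The sketch below is a minimal illustration of that idea only; all names, data shapes, and the greedy strategy itself are hypothetical, not the actual backend implementation.

```python
# Hypothetical sketch: greedily assign each test to the first vehicle
# whose configuration satisfies the test's requirements and that still
# has capacity for the day. Illustrative only.
from dataclasses import dataclass, field

@dataclass
class Vehicle:
    name: str
    config: set                      # capabilities installed on this vehicle
    capacity: int                    # max tests this vehicle can run per day
    assigned: list = field(default_factory=list)

@dataclass
class Test:
    name: str
    requires: set                    # configuration the test needs

def auto_assign(tests, vehicles):
    """Return (schedule, unassigned) from a first-fit greedy pass."""
    unassigned = []
    for test in tests:
        for v in vehicles:
            # requirements must be a subset of the vehicle's config
            if test.requires <= v.config and len(v.assigned) < v.capacity:
                v.assigned.append(test.name)
                break
        else:
            unassigned.append(test.name)   # no vehicle could take this test
    return {v.name: v.assigned for v in vehicles}, unassigned

vehicles = [Vehicle("AV-1", {"lidar", "radar"}, capacity=2),
            Vehicle("AV-2", {"lidar"}, capacity=2)]
tests = [Test("brake-test", {"lidar"}),
         Test("merge-test", {"lidar", "radar"}),
         Test("night-test", {"thermal"})]

schedule, leftover = auto_assign(tests, vehicles)
# "night-test" ends up unassigned: no vehicle has a thermal sensor,
# so the tool would surface it for manual conflict resolution.
```

A first-fit pass like this is the simplest possible strategy; a production scheduler would also need to handle time windows, vehicle location, and approval states, which the real tool tracked.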
August 2019 | Duration: 3 weeks, Part-time | ~50 hrs.
Sole designer working with a full-stack engineer
Design | UX
Figma | G-suite
The first step to creating the web-app was to fully understand the problem: highlight the pain points in the current processes and design around what an ideal workflow would be.
The current workflow for Track Operations required many manual processes, including scheduling, conflict resolution, and data entry, in addition to actually running the tests themselves.
We developed requirements for the MVP after interviewing our key users and shadowing them at the track facility during track testing setup and scheduling.
We simplified the workflow, minimizing operator inputs and actions to remove as many touchpoints as possible.
I made user flows based on user stories to make sure we captured the needs of each aspect of the workflow.
Using the requirements provided and information I discovered on my own, I started the design process.
I focused the initial design work on the calendar view, since it was the least thought-out and would have the greatest impact. I used familiar calendar apps (e.g., Google Calendar) as inspiration to help drive the design process.
Uncertain whether the scheduler tool would be a stand-alone web-app or embedded in another tool, we explored several navigation options.
After getting more direction from leadership we were able to shore up the designs and do some user testing.
We learned that we would be using the same design system as another internal tool and that the web-app would be a stand-alone application.
I wired some frames together into a clickable mockup to verify that the flow of the web-app worked as intended, as well as to catch any issues during development.
We had participants run through 5 tasks:
1. Push all but one of the tests to the schedule.
2. Locate test information from the Calendar view.
3. Edit the number of assets required for a test scenario and submit for approval.
4. Add a test scenario and submit for approval.
5. Edit the location of a vehicle.
Based on feedback we received during user testing and from observing participants interact with the tool we made some design revisions.
This project was a lesson in the importance of clearly defining the requirements and goals of any project. We lost several days' worth of work because the vision of what this tool was supposed to be used for, and the problem it was supposed to solve, was not clear. Not enough background information was gathered before we started building, and not all stakeholders were interviewed. This can all be traced back to a lack of leadership: we were operating without a true Product Manager, and the designer (me) was brought into the project late. Next time we should spend more time defining the problem and interviewing all stakeholders.
The next steps in this project will be to make the "AV Status" table and test definition tables more robust and to explore further use cases, expanding the functionality of the tool or planning for integrations with other tools. First, we will focus on building out the "Test editing" and "AV Status" functionality, allowing for a more seamless editing and approval workflow that will include historical tracking. Second, we will begin to make the overall experience more delightful and see where we can expand the functionality of the web-app to accommodate the workflows of various other internal teams.