I identified an opportunity for a review tool that would enable Global Operations to launch workflows with integrity, addressing legal, privacy, policy, and bias concerns with creative solutions. The tool would also empower the team to scale to meet increasing product team demand for Ops services.
The workflow review process was tedious, with many manual activities and inefficient steps. Working across a team of over 20, including engineers and operations, legal, privacy, and bias specialists, I needed to design and build both a business process and an accompanying tool that would enable the team to increase throughput and reduce risk without adding headcount.
A semi-automated review tool that streamlined the review process, reduced time to approval by 20%, decreased time spent by reviewers by 74%, and standardized risk mitigation at scale, ultimately increasing the team's throughput and ability to scale.
As the UX/UI designer, I led the end-to-end design process and a team of engineers through launch. This was the approach I took.
Before building a business case to secure funding from leadership, I approached the problem of scalability from a process-improvement perspective: identifying waste in the existing process, redundant steps, and areas where tasks could be automated. Ultimately, this newly streamlined process would be powered by the review tool.
With the previous process, reviewers, submitting teams, and review owners engaged in a large amount of manual work and churn, leading to long approval times, increased risk, and frustration for all users.
After streamlining the process and implementing the review tool, automation opportunities were captured, and redundant steps and waste were eliminated.
After defining the new, tool-enabled streamlined process, I created a business case to present to leadership to secure the funding needed to design and build the product.
After successfully presenting to org leadership, I received funding approval to design and build the new review tool.
I also needed to conduct an initial analysis to determine which internal platform we would leverage to build the tool. After completing a competitive analysis of the internal platforms available, I chose the one that best fit our key business sensitivities: compliance and security.
I began by conducting user interviews across all three user types, totaling over 50 conversations. These interviews gave me insight into pain points with the existing review process and let me discuss improvement opportunities with each user from a process, tooling, and risk mitigation perspective.
The existing review process was tedious, complex, unintuitive, and difficult to track.
“The existing review tool is filled with so much noise that it is difficult to track and prioritize what needs my approval. I need to sift through various sources and materials in order to identify the risks that are relevant and important.”
“The review process is slow and the intake form is unintuitive – for instance, questions make it difficult to provide accurate answers, and there is a lot of back-and-forth with both the review owners and reviewers.”
“There is a lack of standardization and consistency that makes it difficult to review and triage risks. Tracking across ~15 reviews at a given time is tedious, manual, and time-consuming.”
With the user interviews, process map, and competitive analysis top of mind, I began whiteboard prototyping for quick iterations, soliciting feedback from different users in the office and updating the designs and flows live with them.
workflow owner user flow | submitting a review
review owner user flow | viewing a review
reviewer user flow | assessing review log
After laying out the basic site outline and structure, I moved into Adobe XD to build low-fidelity wireframes that I could put in front of users.
workflow dashboard
workflow review page
workflow review intake form
Using the same group that I initially interviewed, I conducted user testing with the low-fidelity prototypes. I gave each user specific task scenarios designed to achieve different business goals within the workflow review.
Leadership requested a self-serve operations dashboard, which I designed and built separately. It enabled identification of trending risks, quick responses to press inquiries, and policy refinement.
Reviewers needed the details of previous reviews when assessing a revision to a previously approved review, because those earlier reviews provided dependencies and context for the revision.
Workflow owners expressed concern about long approval times from reviewers; the existing automatic notifications weren't sufficient reminders.
Workflow owners appreciated having risks highlighted at the top of the review, so they could see which risks they would need to mitigate in order to be approved.
Review owners needed a clear, upfront space to view approver status, rather than sifting through a stream of comments, messages, and documents to find an approval.
Many of the questions in the intake form needed to be updated and redesigned to accurately surface the risks of the workflow and ensure adherence to risk assessments.
The new review flow leverages automation to enable reviewers to quickly assess the most important aspects of a review and drive standardized risk mitigation, ultimately reducing time to approval.
Reviews leverage consistent risk assessment frameworks that standardize review and risk identification. A condensed revision review supports scalability for low-risk reviews.
The tool enables complex back-end data capture that aids in responding to external press inquiries and identifying risk-related trends. These insights are used to refine existing policies and guide more responsible product development.
The review intake form is tailored to the specific risks of each product area, with the systems, processes, and controls in place to ensure adherence to risk assessments.
Over the next six months, I worked with the engineering team to execute a successful, on-time build of the product.
I conducted user acceptance testing by building specific test scripts and instructions for each user flow to verify the tool worked as planned, surface bugs, and enable a seamless launch to over 500 users.
After launch, I continued to work on the following: