The publishing validations feature simplifies issue identification for admins and instance owners, minimizing errors before and after publishing events.
Tasked with devising a comprehensive, platform-wide validations system visible across the admin console, I explored a wide range of UX solutions. Through extensive iteration on concepts, I arrived at a solution that helps multiple user groups complete their tasks error-free.
Designs completed in October 2022, in collaboration with the Google Gather cross-functional team at Left Field Labs.
Project overview
The need for this feature arose from feedback indicating that our admin users struggled to quickly scan the events they managed for errors. Manually scrutinizing each page of an event proved exceedingly laborious, particularly given the large volume of events many admins oversee. This project addressed the needs of both standard event admins, who wanted a streamlined error-checking process during and after event creation, and instance owners, who needed oversight of all events created by their team members.
By engaging directly with current platform users to pinpoint pain points, I synthesized this feedback into a strategy for mitigating these issues. The outcome was a sleek, intuitive UI component that displayed the number of errors per event globally, provided localized per-page error displays, enabled users to identify and address errors efficiently, and offered seamless publishing or unpublishing of events in tandem with their validation status.
This project showcases the thinking that went into designing the internal-facing admin portal, the backbone of the Gather product, where guest-facing event templates are created. For context on how this integrates with the Gather platform on the front end, see the Gather overview project.
The dashboard view with the publishing and validations component in its collapsed state, fixed to the right edge of the screen.
The component in its expanded state, showing a full list of page-level items to review for the event.
Discovery
During the discovery phase, I consolidated research findings to pinpoint the heart of the problem with the current workflow, ensuring that the solution I proposed would be effective. This started with a series of user flows, followed by sketches of key screens to establish the basic function and layout of the new multi-faceted component.
User flow diagram I made to outline the story of an admin’s basic workflow with the addition of the new feature.
Process sketches exploring the behavior of the “side tag” component, which would expand into a larger drawer when interacted with.
Process sketches exploring how to handle a numeric badge indicating the number of issues on a page. This version was for an alternate design, which used a floating action button in the bottom right corner of the screen.
Exploring an alternate concept of multi-tiered warnings, including lower-severity ‘suggestions’ and higher-severity ‘issues’.
In the process of designing the new publishing/validations feature, I also took the opportunity to rethink the global navigation, as the two were directly intertwined. This sketch shows my process of outlining a clean, consistent breadcrumbs pattern to assist navigation throughout the platform.
Annotated designs
As part of the design-development handoff process, I provided detailed annotations for each state of the feature, to explain both functionality and visual specs.
Key states of user flow
The following states illustrate the key user flow, including published/unpublished states, expanded/collapsed states, event-level warnings, page-level warnings, and the no-warning state.
Unpublished event state, with the collapsed ‘side tag’ on the left, and the expanded drawer on the right. (As featured at the top of the page)
Page-level, collapsed state, with the error fields highlighted in orange to facilitate easy identification.
Page-level, expanded state for a published event, when there are no items to review.
I created designs for specific pages of the platform, to indicate patterns for how the warnings would be displayed in-line on the page, in addition to the global component. This set of screens shows the “Layout” page, where users manage and build page content modules.
This set of screens shows the “Speakers” setup page, where users manage and build speaker content, which can be assigned to talks in the event. Examples of warnings would be “Speaker missing name”, or “Speaker missing description”.
Various alternate designs explored
Like all projects on Gather, this work entailed extensive collaboration with the engineering team and with the producers on our internal team, both long-time users of the internal tool. Their insight was invaluable; I led a series of internal workshops, opening up discussions and presenting several iterations of the designs before landing on a final design that felt right to everyone.
Alternate design A: Explores status and warning indicator in the top nav, and a non-collapsible, ever-present side module.
Alternate design B: Explores status and warning indicator in the top nav, and a collapsible, ever-present side tag, which would expand upon being interacted with.
Alternate design C: Uses a floating action button pinned to the bottom right, which indicates status, and can be clicked to expand the full drawer.
Alternate design D: Explores visuals for assigning severity to issues. Items considered mere ‘suggestions’ would be highlighted in purple, and items deemed true ‘issues’ would be indicated in orange. This idea was eventually abandoned after internal discussion, as we realized it would require significant research to determine which severity level each warning should fall under, and each customer instance might have its own notions about this classification.