Security dashboard

In GitLab's Secure section, the feature's foundation comprises 5 security scanners: static analysis, dependency scanning, dynamic analysis, license compliance, and container scanning. I served on this section's UX team as it scaled from 1 to 5 product designers (70+ across the UX department). For wider perspective: during my tenure, the company as a whole grew from 300 to 1,300 team members!

As the team grew, so did the processes, team structure, and areas of focus. My role focused on the Composition Analysis group in the Secure section (dependency scanning, container scanning, and license compliance), as well as Container Network Security in the newly formed Protect section. While these were my areas of individual focus, there are also shared areas the design team iterates on collaboratively. One such feature is the Security Dashboard, which displays results from 4 of the 5 security scanners. The following are contributions I made along the way, building off previous work and laying the foundation for subsequent iterations.

Collaborative iterations on a team endeavor

My objective: audit the security dashboard baseline experience and map a path forward with improvements. The primary users we design for work in an organization's web security department, in roles such as security analyst, security engineer, or head of security. However, mid-size and smaller customer organizations may not have a dedicated security department; in that case the users would be developers, tech leads, and DevOps engineers.

I started by identifying and focusing on the following user task (or job-to-be-done) to improve: “When reviewing vulnerabilities for multiple projects, I want to see them all in one location, so that I can prioritize my efforts to resolve or triage them while seeing the larger picture.” Based on the heuristics review, here are the key problems I identified:

  • The most important data was not immediately visible in a common desktop viewport: the table of vulnerabilities is pushed down the page, below the summary and a large chart. The user needs to scroll up and down to adjust filters and then view the table data.
  • The data visualization uses a single line chart to show multiple vulnerability severity types. However, it's not important to see how the severities compare with each other, but rather how each has trended over time individually.
  • When the user adjusts the severity filters, for example to “high” only, the summary view still displays “0” for the non-selected severity levels (while the chart displays only the filtered data). This occupies space and displays non-relevant data.
  • The security dashboard doesn't make clear which projects in the group have or have not been tested, or when. The source of the displayed data is absent.

    Video walkthrough of problems identified


Here are my proposed solutions and shipped iterations:

  • Layout changes that prioritize the vulnerability table data: 1) header: section header label (“Vulnerability Management”) and filters; 2) main: table data at the top, making it immediately visible to the user; 3) aside: the summary severity count and a chart showing changes over time, also accounting for upcoming iterations that display affected projects and untested projects.
  • When the user filters by severity, the chart and data summary show only the selected data points.
  • Improve the information design by using sparklines: 1) show trends over time for each vulnerability type individually; 2) use the same severity labels as seen in the table (visual consistency); 3) show the percentage of change over a selected 30/60/90-day window to help users understand how they have progressed (+/-); 4) display the current vulnerability count.
  • Show on the dashboard which tests were run most recently, when, and for which project.
Reviewing proposed changes with the team
Overview: show on the group security dashboard when security tests are not configured

Passing the baton

Since these changes, 5 new designers have joined the Secure section, and we've been continually ideating and shipping iterations together. Next up, I started looking at improving system communication for the user.

These iterations coming together exemplify the spirit of collaborative teamwork in action. In a continually changing environment and rapidly growing team, we work a lot like a relay race: quickly iterating, then passing the baton to the next designer to contribute an iteration, then passing it to the next, continually contributing toward a common endeavor. Whether our contributions innovate or fail, they always yield valuable learning that improves and informs our fellow teammates' next iterations. It inspires me to be a contributor among a wider team and to see iterations come together to deliver customer value.

Thanks for visiting. Questions or thoughts? Drop me a note.

See the home page or view the next case study: innovation by iteration.