For the first time in security monitoring, Calipsa customers could see analytics for their camera network; however, there was no way to see an overview of the platform and the data it processed.
"I want to see how many false alarms we have avoided using Calipsa and have a hub of activity." - Calipsa customer
Demonstrating value to customers is important in AI, as most of the work happens in the background. By measuring progress and presenting data, we can give customers confidence that Calipsa is making a difference and encourage them to transition to a paid service.
How might we...
- demonstrate value to users using Calipsa data?
- share camera activity with the user?
- show the effects of using the platform?
- Track customers, sites and cameras.
- Give an overview of false alarm elimination.
- Visualise the frequency of alarms across the platform.
- Camera activity notifications.
With the "How might we..." questions in mind, I started drawing inspiration from other products and designs. When working on a visual-heavy solution, I like to find relevant designs that could work well for an idea and use their elements to inspire my own work.
I broke down the Dashboard into four main components: counters, circle charts, a line chart and notifications. I used the inspiration I'd found as a foundation and iterated based on feedback and testing.
The counters would be simple but useful: they would count the number of clients, sites and cameras the user was managing on the Calipsa platform. Although it is a simple metric to measure, I wanted to make it easy to digest and glance at regularly.
As part of the Calipsa Design System I designed a set of icons for the product; I used these to reference each metric in the counters.
Initial designs used only 'Cameras active' and 'Alarms processed' metrics (see first circles iteration).
We later added 'Alarms reviewed' as we thought it would be useful (see second circles iteration) but then removed it after testing (see below).
Showing users how many cameras they had active indicated whether they were using Calipsa to its full potential.
The more cameras they have active, the more alarms we can process, filtering out more false alarms and demonstrating value to the customer.
When we tested the design we found 'Alarms reviewed' to be the least useful metric, and often a source of frustration.
"Alarms reviewed will never be 100%, there's no way we can review every single alarm detected." - Calipsa customer
Users didn't want to review all the alarms, and they didn't have to: Calipsa works whether or not every alarm is reviewed, so we scrapped the metric.
By visualising the frequency of alarms, monitoring stations can assess peak activity times and adjust their monitoring schedules accordingly.
I included the 'false alarm' metric to demonstrate value: users can see the false alarm reduction when comparing 'total alarms' and 'false alarms'.
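As a rough sketch of how that comparison could be derived (the function and figures here are my own illustration, not Calipsa's actual implementation or data):

```python
def false_alarm_reduction(total_alarms: int, false_alarms: int) -> float:
    """Share of incoming alarms identified as false, i.e. the
    reduction the user sees when comparing the two metrics."""
    if total_alarms == 0:
        return 0.0
    return false_alarms / total_alarms

# Hypothetical example: 1,200 alarms processed, 1,020 flagged as false
print(f"{false_alarm_reduction(1200, 1020):.0%} reduction")  # 85% reduction
```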
Iterations included curve interpolation to help show changes in the data. Click here for a comparison of curve interpolation methods.
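Curve interpolation here simply means drawing a smoothed line through the data points instead of straight segments. A minimal sketch of the difference, using Catmull-Rom smoothing purely as an illustration (not necessarily the method Calipsa's charting library uses):

```python
def linear(p1: float, p2: float, t: float) -> float:
    # Straight-segment interpolation between two points.
    return p1 + (p2 - p1) * t

def catmull_rom(p0: float, p1: float, p2: float, p3: float, t: float) -> float:
    # Smooth curve through p1..p2, shaped by the neighbouring points p0, p3.
    return 0.5 * (
        2 * p1
        + (p2 - p0) * t
        + (2 * p0 - 5 * p1 + 4 * p2 - p3) * t ** 2
        + (-p0 + 3 * p1 - 3 * p2 + p3) * t ** 3
    )

# Hypothetical hourly alarm counts; both curves pass through the same points,
# but the smoothed line eases between them.
counts = [4, 12, 9, 15]
print(linear(counts[1], counts[2], 0.5))   # 10.5
print(catmull_rom(*counts, 0.5))           # 10.625
```

Both approaches plot the same underlying data; the smoothed curve only changes how the transitions between points read at a glance.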
During research we found that security monitoring station staff weren't interested in cameras until "something goes wrong". So, we determined scenarios where something has likely "gone wrong", including:
- When a camera hasn't received an alarm for more than 24 hours.
- When a camera has received an irregular number of alarms in 24 hours.
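The two rules above can be sketched as simple checks over a camera's alarm history (the function names, threshold and example values are my own assumptions, not Calipsa's implementation):

```python
from datetime import datetime, timedelta

def camera_silent(last_alarm: datetime, now: datetime) -> bool:
    # Rule 1: the camera hasn't received an alarm for more than 24 hours.
    return now - last_alarm > timedelta(hours=24)

def alarm_count_irregular(count_last_24h: int, daily_average: float,
                          tolerance: float = 3.0) -> bool:
    # Rule 2: the last 24 hours' alarm count deviates far from the camera's
    # usual daily volume (here: more than `tolerance` times, or less than
    # 1/`tolerance` of, the average — the threshold is illustrative).
    if daily_average == 0:
        return count_last_24h > 0
    ratio = count_last_24h / daily_average
    return ratio > tolerance or ratio < 1 / tolerance

now = datetime(2020, 6, 1, 12, 0)
print(camera_silent(datetime(2020, 5, 30, 9, 0), now))  # True: 51h quiet
print(alarm_count_irregular(60, daily_average=12))      # True: 5x the norm
```

Either check returning True would surface a notification on the Dashboard, prompting the monitoring station to look at that camera.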
Features for V2 would include user activity, e.g. creating, deleting or changing nodes (clients, sites or cameras).
Here's an example of the components being used in production with real data, including two notifications.