Helping the birds sing louder by contributing to the Cacophony project
The Cacophony Project is an open-source project developing and deploying a set of technologies to control introduced mammal predators in New Zealand, with the aim of helping the native bird population recover. Over the years I've contributed to the project, from redesigning Android apps to reworking a web portal and visualising an index of bird song.
- Client
- The Cacophony Project
- Role
- UX design
- Date
- 2018-ongoing
Background
The Cacophony Project is developing a set of unique technologies and tools, covering both hardware and software. Three main tools are currently under active development: an audio recorder for bird monitoring; a thermal camera for mammal detection and automatic identification (using artificial intelligence); and an automated trap.
Over the years I've contributed to the project both as an open-source collaborator and as a UX contractor (with some occasional front-end work).
Challenge
As in any experimental, cutting-edge project, change happens often. As a result, the interfaces of both the mobile app and the web portal had grown organically. I had to keep in mind that this would happen again in the future and that any proposed solution had to accommodate future change. Additionally, this is an open-source project and funding is limited, which meant that pie-in-the-sky designs were a no-go; we needed feasible solutions that the team could implement over time.
I was enlisted to tidy up the web portal, where users can see the recordings taken by both the audio recorders and the thermal cameras, and to restructure the Android app used to connect to and set up the thermal cameras.
Android app
The previous version of the Android app (used to connect to the thermal cameras and fetch video recordings) grew organically, and its setup was confusing for users: different actions all happened in a single screen. Additionally, there were no instructions explaining how to connect to cameras or how to troubleshoot connection problems, a major issue when the app was being used in areas without network coverage.
Process and solution
I suggested a refactor of the application structure, organising related areas into logical sections. To test assumptions with the team, I created an interactive prototype. I also recommended using standard Material Design components, styled to match the project's branding, as this would fast-track the development process.
The previous app, on the left, shows the single screen where every single action happened. On the right is the main screen of the new version of the app.
Making the mobile app easier to use by teaching users how to connect to their thermal cameras
The app now has an onboarding guide for first-time users, explaining the steps needed to get the connection between the camera and the app up and running.
Guiding users when something goes wrong
Some of the thermal cameras are currently deployed in remote locations without cellphone coverage. Previously, if something went wrong, users needed to carry a manual with them to troubleshoot why the phone wasn't connecting to the camera. I suggested making the troubleshooter part of the app: while it won't solve esoteric connection issues, it will guide most users in the right direction.
Improving the web portal
The web portal was (and is) in a state of flux, but I identified a few major issues that could be addressed regardless of any changes:
- There was no visual hierarchy.
- There was too much information being displayed, and some of it was irrelevant for most users.
- The styles were inconsistent.
Process and solution
I created a prototype of the proposed changes, documented it and broke this information down into smaller chunks that were logged as GitHub issues. This meant that any open source contributor to the project (including myself) could jump straight in and start implementing these changes. For the user interface, I chose (mostly) Bootstrap components, as Bootstrap was already being used in the project.
Side-by-side comparison of the previous and current views of the web portal page where users can search for video and audio recordings. The previous version had little visual hierarchy, and the filters, many of which were not relevant, took up a significant amount of space. The new version addresses this by restructuring the content of the page and hiding some of the filters. Drag to reveal one or the other.
The redesigned pages of the web portal are now mobile friendly. On the left, mobile view of the page where users can search for audio recordings. In the center, view of the search filters expanded. On the right, a video recording page, where users can tag videos with mammal pests or confirm the AI predictions.
Proposed design and implementation of the Cacophony Index, a representation of the amount of bird song. The graph was implemented in D3.js and shows how bird song changes during the day, peaking at dusk and dawn and dropping in the evenings. The trend line is complemented by additional bar charts indicating the variation of song throughout each hour during the period analysed.
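For context, the sketch below shows one way a trend line like this could be drawn with D3. It is a minimal illustration rather than the portal's actual code: the data shape (hour/index pairs), the function name and the styling are all assumptions made for the example.

```typescript
// Minimal sketch of a Cacophony Index trend line in D3.
// Assumes hourly index values arrive as { hour, index } pairs (hypothetical shape).
import * as d3 from "d3";

interface HourlyIndex {
  hour: number;  // 0–23
  index: number; // Cacophony Index value for that hour
}

export function drawIndexChart(data: HourlyIndex[], svgSelector: string): void {
  const width = 640;
  const height = 240;
  const margin = { top: 16, right: 16, bottom: 32, left: 40 };

  // Scales: hour of the day on x, index value on y.
  const x = d3.scaleLinear()
    .domain([0, 23])
    .range([margin.left, width - margin.right]);
  const y = d3.scaleLinear()
    .domain([0, d3.max(data, d => d.index) ?? 1])
    .range([height - margin.bottom, margin.top]);

  const svg = d3.select<SVGSVGElement, unknown>(svgSelector)
    .attr("width", width)
    .attr("height", height);

  // Smoothed trend line showing how bird song rises and falls through the day.
  const line = d3.line<HourlyIndex>()
    .x(d => x(d.hour))
    .y(d => y(d.index))
    .curve(d3.curveMonotoneX);

  svg.append("path")
    .datum(data)
    .attr("fill", "none")
    .attr("stroke", "steelblue")
    .attr("stroke-width", 2)
    .attr("d", line);

  // Axes: one tick per couple of hours on x, index values on y.
  svg.append("g")
    .attr("transform", `translate(0,${height - margin.bottom})`)
    .call(d3.axisBottom(x).ticks(12));
  svg.append("g")
    .attr("transform", `translate(${margin.left},0)`)
    .call(d3.axisLeft(y));
}
```

The per-hour bar charts described in the caption would sit alongside this trend line; they are omitted here to keep the sketch short.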
Results and future improvements
There is still much more that can be done to improve the portal, and I'm currently working with the Cacophony team to restructure the portal's information architecture and to further improve the user interface. However, these changes have made the portal usable on mobiles and tablets, and made the video tagging interface used to train an AI model much easier for collaborators to use.