The interface guides users through the steps needed to create a data stream, without requiring a complex introduction to the product.
Data can be fetched from and delivered to several platforms, and existing data can be discovered as a starting point for a data model.
Users can take data from discovered sources as the starting point for their data stream model. The model is constructed through guided steps that abstract away the complexity of data warehousing methodologies, letting users focus on their model rather than the technical details of the implementation.
Data models created in the Design area are quickly deployed in four steps, allowing users to see the live stream of data immediately.
This was a highly technical project that required learning about databases, how they work and how they are structured, and the rules of specific data warehousing methodologies. It also meant staying keenly aware of implementation details and constraints. Getting to know the field involved learning about the company and the business logic surrounding the problems. I wasn't a domain expert, but that turned out to be useful: it brought a fresh perspective.
I was heavily involved in the early stages of the project, working closely with the product owner and product manager to scope the work and define business requirements.
One of the biggest challenges was how to model the flow of data. Traditional applications in this space present the data in a list view, but during whiteboard sessions the team always defaulted to drawing data flows as diagrams; it soon became obvious that this was the natural way to present and manipulate them.
Large diagrams quickly become unreadable and were technically challenging to render. This required researching successful node-based interfaces, their features, and how they solved the hurdles we encountered.
The solution included incentivising users to break their data structures into smaller, more manageable chunks as soon as they start modelling the data. The diagram itself has features informed by the earlier research, such as search, filtering and a mini-map.
Design and development underway
Whiteboarding and sketching sessions were a staple throughout the process; they informed the quick prototypes I created to test assumptions with team members, define the information architecture and visualise how the product could look.
The prototypes (developed in Axure) quickly grew more complex. They were used to surface technical constraints as early as possible and were demoed to future users and stakeholders. As the project progressed, it became apparent that the proposed functionality and interactions weren't always clear, so I started documenting them. Although time-consuming, this proved invaluable in keeping the whole team on the same page.
With the project moving fast and the development team one step behind, the interactive Axure prototype quickly became the UI design tool and the reference for the developers. The final polish was added directly by tweaking the styles and markup of the web application: I often committed code to ensure a consistent look and feel, and to adjust the application's tone of voice.
Additional graphics, such as the product's iconography, were created in Adobe Illustrator. Illustrator was also used to design the templates for the diagram nodes, which were then exported to SVG. This involved working closely with the developers to understand technical constraints, and documenting the nodes' structure to ensure that future changes wouldn't break the diagram.
Release and feedback
After the initial product release, feedback from early adopters was incorporated into the product, with changes made in close collaboration with the product manager. This often involved understanding the real pain points behind new feature requests and working out whether user-proposed solutions made sense in the overall context of the application.
WhereScape Automation with Streaming has put WhereScape on the real-time data streaming map. Since the product's release, WhereScape has been:
- Recognised as one of the "Top 5 Vendors to Watch" in the third annual Datanami Readers' and Editors' Choice Awards
- Chosen as one of the "Trend-Setting Products in Data and Information Management for 2019" by Database Trends and Applications
- Shortlisted for the "2018-19 Cloud Awards" under "Best Cloud Automation Solution"