We have been facing a fair bit of weather uncertainty over the past few days here in Montreal (where I live). Some days are gorgeous, with blue skies and +25°C, while the next day we can wake up to rain and temperatures a few degrees lower.
On days like these I find my Google Home Mini super useful: a little shout of "Hey Google, what's the weather like today?" before I even get out of bed, while the curtains are still drawn (even though a sunny morning doesn't guarantee a sunny afternoon in our little corner of the world). I have to admit, sometimes I even ask my Home Mini for the previous day's FIFA World Cup scores just so I can try to guess what's going to happen in the tournament over the next few days. Once I realized I actually do my own "reporting" with this device, I had to ask: why wouldn't companies do the same thing?
I can clearly imagine the next generation of business reporting being far more interactive than reports delivered as tables or dashboards. Even though it could be an entire line of study for researchers in Human-Computer Interaction (HCI) to understand how humans best interact with BI, I decided to take a stab at envisioning what could happen to BI and reporting once we step outside the box and start building a world that goes beyond two dimensions.
With the rise of Siri, Amazon Alexa, and Google Home, the first trend I'd like to mention is the voice user interface (VUI). Google recently introduced two highly sophisticated features for its voice user interface that could easily be used in offices for reporting purposes. The first is Continued Conversation, currently available only in the US, which lets people hold an ongoing conversation with the device without pausing after each question. The second is Google Duplex, a feature that allows the interface to act almost completely autonomously, even booking a table at a restaurant entirely on its own. Such interfaces could be placed in meeting rooms (after enhancing the data pipelines and configuring the interface) for all sorts of Q&A during mid- and top-level management meetings, and over time they will enable even more sophisticated kinds of reporting, since engineers are constantly enhancing these systems' capabilities.
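As a rough sketch of what the backend of such an integration might look like (every table, value, and function name here is hypothetical, standing in for a real data pipeline and speech-recognition layer), the assistant could map a recognized question to a canned query against the warehouse and phrase the result as a spoken answer:

```python
import sqlite3

# Toy in-memory "warehouse" standing in for the real, pipeline-fed one.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (market TEXT, month TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("Canada", "2018-06", 120000.0),
     ("France", "2018-06", 95000.0),
     ("Brazil", "2018-06", 87000.0)],
)

def answer(question: str) -> str:
    """Map an already-transcribed question to a query and phrase the result."""
    q = question.lower()
    if "top" in q and "market" in q:
        market, revenue = conn.execute(
            "SELECT market, revenue FROM sales "
            "WHERE month = '2018-06' ORDER BY revenue DESC LIMIT 1"
        ).fetchone()
        return f"Your top market last month was {market}, with ${revenue:,.0f} in revenue."
    return "Sorry, I don't have an answer for that yet."
```

Asking `answer("What was our top performing market last month?")` would come back with a sentence about Canada; a real system would of course swap the hard-coded intent matching for a proper natural-language layer.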
Another option beyond the voice user interface is semantic querying in written form, for instance "What was our top performing market last month?". Such capabilities already exist in tools such as Microsoft Power BI (Q&A), which can interpret semantic queries typed by users when they contain certain keywords, such as table and column names. This method still involves a learning curve, since users need to know how to reference the right sources, but it should be significantly less steep than the curve for learning the programming languages used for database querying, such as SQL and Python.
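To give a feel for the keyword idea, here is a minimal sketch (not Power BI's actual algorithm, and the schema names are made up) of matching tokens in a typed question against known table and column names:

```python
# Hypothetical warehouse schema: table name -> column names.
SCHEMA = {
    "sales": ["market", "month", "revenue"],
    "customers": ["country", "segment", "signup_date"],
}

def match_schema(question: str):
    """Return (table, column) pairs whose column name appears in the question."""
    tokens = {t.strip("?.,!").lower() for t in question.split()}
    return [(table, col)
            for table, cols in SCHEMA.items()
            for col in cols
            if col in tokens]
```

For "What was our top performing market last month?" this picks out the `market` and `month` columns of the `sales` table, which is exactly the kind of hint a semantic engine can use to decide which source to query.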
Last but not least, applications of Augmented Reality (AR) could make the whole experience of looking at data visualizations much more fun, with 3D models of graphs, maps, and whatnot.
Once the data pipelines are enhanced and the data is transformed from a raw format into a clean, structured one, a lot can be done to make reporting less boring and more engaging for business users, and a lot more value can be extracted from a company's existing BI systems. To some, these scenarios may sound very futuristic, but given the technological progress made so far, I believe a reality where reporting goes beyond graphs and tables is just around the corner. So next time you ask the voice user interface in your car to call a friend (or your parents), think about what else you could ask it to do for you, and the information you could get even before you arrive at the office.