
Unbundling Design For Innovation In Big Data

by Ivaturi Vijaya Kumar, Co-founder and CTO, Crayon Data, October 2015

Headquartered in Singapore, Crayon Data specializes in big data, analytics, and business intelligence.

Visual design in big data systems has undergone a significant transformation in recent years, driven mainly by the emergence of tactile user interfaces and by trends in information systems architecture. The rise of mobile-first design has fused these two axes together, and it is now driving a new paradigm for the integration of data and experience.

The decoupling of the surface layer from the underlying wireframe elements and from the logical structure of the data being represented has freed the experience from the data sets and navigational mechanics beneath it. This is very similar to shifts that happened in back-end technologies in earlier decades, when the logical model was separated first from its storage model and later from its computational model.
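
To make this decoupling concrete, here is a minimal, purely illustrative Python sketch (all class and function names are hypothetical, not from any particular product): the rendering code depends only on a logical model, so the storage behind it and the presentation on top of it can evolve independently.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass


# Logical model: what the data means, independent of storage or rendering.
@dataclass
class SalesFigure:
    region: str
    revenue: float


# Storage layer: any backend can satisfy this contract.
class SalesRepository(ABC):
    @abstractmethod
    def top_regions(self, limit: int) -> list[SalesFigure]: ...


class InMemorySalesRepository(SalesRepository):
    def __init__(self, figures: list[SalesFigure]):
        self._figures = figures

    def top_regions(self, limit: int) -> list[SalesFigure]:
        return sorted(self._figures, key=lambda f: f.revenue, reverse=True)[:limit]


# Surface layer: renders the logical model with no knowledge of how it is stored.
def render_bar_chart(figures: list[SalesFigure]) -> str:
    widest = max(f.revenue for f in figures)
    return "\n".join(
        f"{f.region:<10} {'#' * int(20 * f.revenue / widest)}" for f in figures
    )


repo = InMemorySalesRepository(
    [SalesFigure("APAC", 120.0), SalesFigure("EMEA", 90.0), SalesFigure("AMER", 150.0)]
)
print(render_bar_chart(repo.top_regions(3)))
```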

It is often stated that innovation happens when the layers of a technology stack are unbundled, and the trends above are a case in point. Advances in tactile technologies covering touch and gesture took user experience to a new level of interaction and made it sensorial. This brought many right-brained artists into the realm of big data, to collaborate with left-brained computer and data scientists. In fact, many present-day solutions are experience-led designs rather than computation-led designs. This is one of the main reasons for the consumerization of enterprise IT systems, where the device-led experience of business users is driving design changes in back-end systems running either on corporate servers or in the cloud.

While it is easier to see the changes consumers have pushed into the business-user domain, a similar shift is taking place in the developer world when it comes to data warehouses, data models and DevOps. In the conventional paradigm, the design of databases in general, and data warehouses in particular, was driven by the efficiency and cost of the available hardware. Those who come from that era know that most of these systems are memory-optimized. In the present era, that resource-constraint-led design is being replaced by an experience-at-scale-led design model. The unbundling of the resource layer from the logical model and the persistence layer above it is forcing us to rethink many conventional design patterns.
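
As one illustration of that shift, the sketch below (illustrative names and toy data, not a real warehouse) expresses a logical aggregation once and runs it either over an in-memory table or over lazily streamed partitions; the logical model stays the same while the resource model underneath it changes.

```python
from typing import Iterable, Iterator


# Logical query: total revenue per region, expressed independently of how
# (or where) the rows are physically stored.
def revenue_by_region(rows: Iterable[tuple[str, float]]) -> dict[str, float]:
    totals: dict[str, float] = {}
    for region, revenue in rows:
        totals[region] = totals.get(region, 0.0) + revenue
    return totals


# Memory-optimized era: the whole dataset lives in RAM.
in_memory_rows = [("APAC", 10.0), ("EMEA", 5.0), ("APAC", 7.5)]

# Experience-at-scale era: the same logical query runs over partitions that are
# streamed lazily (think files on distributed storage); only one chunk is in
# memory at a time.
def stream_partitions(
    partitions: list[list[tuple[str, float]]],
) -> Iterator[tuple[str, float]]:
    for partition in partitions:
        yield from partition


partitions = [[("APAC", 10.0), ("EMEA", 5.0)], [("APAC", 7.5)]]

# Same logical answer from either resource model.
assert revenue_by_region(in_memory_rows) == revenue_by_region(stream_partitions(partitions))
```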

Using a reflective mindset, one can start at the top layer of the big data stack and uncover many nuances of this unbundling all the way down to the hardware layer. It is of course easier to appreciate the change at the presentation layer, as it is visual in nature, but the changes in the layers below are perhaps more dramatic than what is visible at the surface. The development and test environments are one such example. In traditional systems, the developer had to struggle with the difference between the lab environment and the target production environment. With the advent of virtualization and hardware-neutral design engineering, environments now move seamlessly using containers and Docker. This is pushing process designers to revisit development process models and deployment models for the cloud era.
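
A minimal sketch of that portability, assuming the Docker SDK for Python (`pip install docker`) and a running Docker daemon: the same pinned image behaves identically on a developer laptop, a CI runner, or a production host, so the lab-versus-target gap largely disappears.

```python
import docker  # Docker SDK for Python; assumes a local Docker daemon is available

client = docker.from_env()

# Run a command inside a pinned, hardware-neutral environment. Because the
# image is identical everywhere it is pulled, the result does not depend on
# which machine executes it.
output = client.containers.run(
    "python:3.11-slim",                                   # pinned environment
    ["python", "-c", "import sys; print(sys.version)"],   # command to run inside it
    remove=True,                                          # clean up the container afterwards
)
print(output.decode().strip())
```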

While it is still in its infancy compared to its software counterpart, open hardware design is another example of unbundling the stack for innovation. The real-world impact of these changes will be seen in the AI, robotics and IoT systems of the future. This is perhaps the first time we are witnessing a hardware architecture that is software-neutral and a software design that is hardware-neutral at the same time. We are entering an era where logic is more compositional than procedural or descriptive in nature. This makes creating a system a relatively simple affair in terms of time and effort, but it makes debugging the system quite complex.
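
The following illustrative Python sketch contrasts the two styles: the system is created by composing small stages, which makes assembly quick, while a fault must now be traced through every composed stage, which is where the debugging complexity moves.

```python
from functools import reduce
from typing import Callable, Iterable, Iterator


def compose(*stages: Callable) -> Callable:
    """Assemble a pipeline by chaining small stages, applied left to right."""
    return lambda value: reduce(lambda acc, stage: stage(acc), stages, value)


def clean(rows: Iterable[str]) -> Iterator[str]:
    return (r.strip() for r in rows)


def parse(rows: Iterable[str]) -> Iterator[float]:
    return (float(r) for r in rows if r)


def rescale(xs: Iterable[float]) -> Iterator[float]:
    return (x / 100.0 for x in xs)


# Creating the system is a simple composition of existing pieces...
pipeline = compose(clean, parse, rescale)
print(list(pipeline([" 85 ", "", "42"])))   # [0.85, 0.42]
# ...but a wrong output now has to be traced back through every composed stage.
```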

As systems evolve along these lines, it is going to be the age of vertically integrated solutions that leverage innovations in the horizontal layers. These new-generation solutions are innovations captured at a pause in time and at a phase in completeness.

 
