We have a data problem, and it’s delaying the future
The promise of continued innovation hangs on our ability to make data freely accessible to the people and teams who are driving towards the future. Companies everywhere are disrupting their own industries with mobile, agile, DevOps, and of course the cloud. But this is the Information Age and the Digital Economy and those same people, processes, and technologies are bumping into a new problem: access to data.
Whether it's new advances in machine learning or the ever-increasing pressure for faster software innovation, demand from data consumers (e.g., developers, quality assurance teams, and BI analysts) for fresh production data has never been higher. At the same time, data operators (the people tasked with the supply side of data, such as DBAs and security professionals) face industry trends like mobile and IoT that are pushing exponentially more data into enterprises. On top of that, new industry and geographic regulations demand increased consumer protections, because although having data everywhere can be great, it can also create enormous vulnerabilities.
These competing pressures result in a new, painful problem: data friction.
On one hand, it's no secret that data is growing in size and complexity: IDC predicts that, by 2025, global data will grow to ten times today's volume. Enterprise data is expected to grow even faster, increasing nearly 25-fold in ten years to represent the lion's share of all data created worldwide. Because of the physical limits of bandwidth, this kind of growth presents daily challenges.
But even more sinister is the sneaking complexity of data that has quietly crept into enterprise architecture. The confluence of open source, the cloud, and the disrupt-or-die market reality has led enterprises to empower developers to use the best database for each purpose. As a result, data is stored in a variety of formats across many data centers and clouds, making it exceedingly difficult to get a full picture of your customers and your business.
On the other hand, we are in a post-Snowden era. Consumers are aware of the need to secure their data: data breaches have become a way of life and we’re seeing an alarming uptick in the amount of personal customer data that is compromised every day. Industries and governments are responding by putting in place a range of protections and regulations, such as GDPR in the EU. Enterprises that fail to secure their data will see their customers lose confidence in them.
This challenge is left to Chief Information Security Officers and InfoSec professionals to solve, and too often the solution is to simply deny access to data. It’s no wonder there is great opposition to unchecked, free access to enterprise data.
This dynamic causes data friction, grinding innovation to a halt just when businesses have never needed to move faster. Data friction costs time, and lost time can cost businesses their competitive edge, delaying the innovations of the future from becoming reality.
The future? DataOps.
But fear not. Just as DevOps introduced people, processes, and technologies to rise to the challenge of increasing operational demands, I believe a new movement is emerging: DataOps. The first step is recognizing that the reactive, ticket-based IT model of the past simply won’t scale to meet the demands of data consumers. Instead, data operators must flip the model on its head and build a proactive enterprise data platform that gives data consumers self-service access to exactly the data they need and are approved for.
Doing so will require commitment from everyone involved to fundamentally rethink their existing processes and workflows. It will also require new technologies: How will data be continuously pulled down from production? How will sensitive data be reliably secured before distribution? What happens to the underlying infrastructure when dozens or hundreds of people want their own personal copies of the data to develop against or run experiments on? Those who can tackle the people, process, and technology problems will be the ones who break through today's innovation barriers.
Data is the most valuable asset an organization has today. I joined Delphix because, as a product manager, I've felt the pain of data friction, and I've seen what can happen when data flows freely, letting me make data-driven decisions. Delphix is a founding member of this new DataOps movement and works with some of the most important brands in the world to help them define the future.
From autonomous cars and smart cities to AI breakthroughs in cancer research, new technologies will require access to massive amounts of data, delivered securely and continuously. These visions of the future are incredibly exciting, but unless the enterprises behind them can make smart, data-driven decisions, the future won't come fast enough.
The post We have a data problem, and it’s delaying the future appeared first on ReadWrite.