The State of Big Data 2017

Jan 30, 2017




It’s January 2017, and you’ve turned over the digital calendar page on a new year. Was your resolution to walk an extra 1,000 steps a week this year, to be tracked by your Apple Watch, shared on Facebook for extra accountability, and added to your Lifetick goal-tracking web app? Congratulations—you’ve both added to and exemplified the state of Big Data in 2017. It’s everywhere, and it’s only getting more multidimensional. But similar to your best intentions around exercise for the new year, there are significant challenges between theory and action.

The onward march of both consumer and commercial adoption of the Internet of Things (IoT), relatively cheap storage, and improved methods of data capture are all contributing to the growth in scope of Big Data. The Wikibon community of business technology practitioners sized the global Big Data market at $18.3 billion in 2014, predicting it will grow at an annual rate of 14.4%, to hit $92.2 billion by 2026.
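That projection implies about 12 years of compounding from the 2014 baseline. As a quick back-of-the-envelope check (a minimal sketch in Python, assuming a simple constant compound annual growth rate, which may not match Wikibon's year-by-year model), the figures are roughly consistent:

```python
# Back-of-the-envelope check of the Wikibon projection, assuming a
# simple constant compound annual growth rate (an assumption; the
# study's year-by-year model may differ).
base_2014 = 18.3        # global Big Data market in 2014, in $ billions
growth_rate = 0.144     # projected annual growth rate (14.4%)
years = 2026 - 2014     # 12 years of compounding

projected_2026 = base_2014 * (1 + growth_rate) ** years
print(f"Projected 2026 market: ${projected_2026:.1f}B")  # prints roughly $92.0B
```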

When organizations can make the most of Big Data’s characteristic velocity, variety, and volume—and truly leverage their Big Data assets—the impact on customer satisfaction can be palpable. Take Coca-Cola Freestyle dispensers, which allow users to specify mixtures of flavors from the brand for a custom drink. The Coca-Cola Co. captures information on what drinks are dispensed and at what time of day, among other data elements. This data is used to fine-tune stocking and inventory even for non-Freestyle vending machines. And Duetto Research offers a hotel revenue management SaaS solution that allows hotels to price rooms dynamically, based on factors such as holidays, weather, and local events. 

But most organizations are still grappling with how to unlock Big Data’s potential. A September 2016 Gartner, Inc. survey on planned Big Data investments illustrates the challenge: While 73% of companies surveyed said their organizations had invested—or planned to invest—in Big Data, only 15% of respondents had deployed Big Data projects to production. Gartner analyst Nick Heudecker, co-author of Survey Analysis: Big Data Investments Begin Tapering in 2016, says, “There are challenges in governance, infrastructure, scale, and skills that create a real gap.”

The Year in Review

John Mancini, president of AIIM International, says that the notion of Big Data being mainstream by 2016 is conceptually right. “Yes, organizations are consuming more varieties of data,” he says. “But people are still struggling to get data under control.” He attributes the lag to a couple of factors, starting with the difficulty of getting information into a form that is consumable and ingestible. He adds, “Most organizations have Big Data that is also dark data—how do you extract the meaning of things like .tif files and videos? It’s a bigger hurdle than people think.”

It’s telling that much of the growth cited in the Wikibon study is expected to derive from analytics, applications, and tools—the weapons needed to tame the terabytes and petabytes of data created as we merrily trail data exhaust in our wake, both at work and at home. The Gartner survey found a slowing intent to invest, as the focus shifts from Big Data as a discrete initiative to how it impacts business processes. Heudecker says, “Customers don’t say, ‘I’m doing Big Data analytics’ anymore, but instead say, ‘I’m doing security analytics’ or ‘I’m doing customer-360 analytics.’ Big Data has become more specific and less nebulous.”

Mancini believes that 2016 may have represented the point at which companies began taking a harder look at valuing Big Data, citing the choice by AIIM’s Executive Leadership Council to focus on “infonomics”—the theory of assigning economic value to information—for its annual think tank meeting last year. “Big Data isn’t a monolith,” Mancini says. “People want to use it in the context of particular problems, in these pockets of application.” Futurist Thornton May, whose work focuses on how companies create value with information technology, agrees that there’s been a shift to quantification. “We live in an information age, but haven’t valued information properly,” says May. “But equity analysts are starting to pay attention to things like analytical mastery and prophylactic content management—which I define as the avoidance of self-inflicted data damage—and that means senior managers are too.”

May says Big Data has also permanently changed consumer expectations. “Consumers are fed up with unsatisfying information experiences,” he says. “It used to be [that] a journalist could target a mass audience, but now there’s a demand for bespoke messaging.”

A Look Ahead

Just as the volume of Big Data shows no signs of slowing its exponential growth, the tools and infrastructure around Big Data are evolving quickly. Heudecker says, “When it comes to infrastructure, we’re seeing the next iteration emerge. It will be memory-centric and cloud-dominant and will need to be dynamic.”

That ties in to another trend that continues to gather steam: the ability to enable business users—and not just data scientists—to access and model with Big Data. In part, it’s an imperative due to the ongoing imbalance in supply and demand for qualified data scientists and data analysts. “Companies will continue to move Big Data into BI [business intelligence] platforms, so end users can access it,” Heudecker predicts. Machine learning and machine intelligence will be part of what enables end users to work effectively with Big Data within those platforms.

Mancini sees this data democratization as an encouraging development. “When Big Data is viewed only through the prism of data scientists, it can only go so far,” he says. Mancini points out that access to data creates opportunities for incremental learning. “Patterns can be surfaced that you never knew about before, so what questions do you ask next? Most organizations are still on the cusp of this kind of discovery.”

Pressure around getting data governance right will continue to build. “The companies who are getting it wrong treat Big Data as an engineering problem, not a people problem,” says Heudecker, while the ones doing a more exemplary job “are taking a less technical view and engaging at the organizational level.”

Finally, expect to see Big Data initiatives tied even closer to business outcomes that can be valued and quantified. “It’s not enough to just manage Big Data,” says May. “It’s about, ‘What value can we create?’ ”  

