This is Part 5 of a 5-part series devoted to exploring the concept of Big Data to determine what makes it different from other hyped data “revolutions” of the past.
In 2009, the Securities and Exchange Commission announced that it would adopt new rules for how material information must be structured for public consumption. The SEC described how, in this new format, eXtensible Business Reporting Language (XBRL), “financial statement information could be downloaded directly into spreadsheets, analyzed in a variety of ways using commercial off-the-shelf software, and used within investment models in other software formats.”
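To make the idea concrete, here is a minimal sketch of what “downloaded directly into spreadsheets” implies: once facts are tagged in a common XML vocabulary, a few lines of generic code can flatten them into rows. The fragment and element names below are invented for illustration and only loosely mimic a real XBRL instance document.

```python
import xml.etree.ElementTree as ET

# Illustrative fragment in the spirit of an XBRL instance document;
# the namespace URI, element names, and figures are made up for demonstration.
XBRL_SNIPPET = """
<xbrl xmlns:us-gaap="http://fasb.org/us-gaap/2009-01-31">
  <us-gaap:Revenues contextRef="FY2009" unitRef="USD">1500000</us-gaap:Revenues>
  <us-gaap:NetIncomeLoss contextRef="FY2009" unitRef="USD">250000</us-gaap:NetIncomeLoss>
</xbrl>
"""

NS = "{http://fasb.org/us-gaap/2009-01-31}"

def extract_facts(xml_text):
    """Flatten tagged financial facts into (concept, context, value) rows."""
    root = ET.fromstring(xml_text)
    rows = []
    for el in root:
        concept = el.tag.replace(NS, "")  # strip the namespace prefix
        rows.append((concept, el.get("contextRef"), float(el.text)))
    return rows

facts = extract_facts(XBRL_SNIPPET)
# Each row is now spreadsheet-ready: ("Revenues", "FY2009", 1500000.0), ...
```

The point is not the parser itself but that no issuer-specific code is needed: any filing that follows the standard yields the same row structure.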
The concept of universalizing the underlying structure of data is not new – the lineage of XML traces back to SGML in the 1980s, before the Internet explosion. What is new is the broad acknowledgement of the value associated with data standards and their growing adoption across multiple societal levels.
When the government starts pushing technical standards, it is typically a good indicator that they have hit the mainstream.
Adoption has moved far beyond the financial vertical and is affecting almost every sector of industry. RSS has dramatically changed the publishing industry and facilitated completely new forms of interaction (e.g. blogging). More recently, geotagging has changed the spatial dynamics of the Internet, creating the possibility of a localized user experience.
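RSS illustrates the same dynamic: because the format is standardized, one generic consumer works against any publisher's feed. Below is a minimal sketch using Python's standard library; the feed content is invented for the example.

```python
import xml.etree.ElementTree as ET

# A minimal hand-written RSS 2.0 document; titles and links are invented.
RSS_FEED = """
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <item><title>First post</title><link>http://example.com/1</link></item>
    <item><title>Second post</title><link>http://example.com/2</link></item>
  </channel>
</rss>
"""

def list_entries(rss_text):
    """Return (title, link) pairs; works on any feed that follows RSS 2.0."""
    root = ET.fromstring(rss_text)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

entries = list_entries(RSS_FEED)
```

A feed reader, an aggregator, or a search engine can all run this same logic against millions of independently published feeds – that interchangeability is what transformed publishing.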
The use case in Part 1 of the series described how social platforms facilitate data production on an individual level. The impact of social media is twofold, however. It not only increases the volume of data being generated but also brings greater structure to that data, because the majority of individual production takes place on a handful of platforms that use similar standards. By opening their platforms to the public through APIs, social platforms have fostered an environment in which data structure standards are the norm.
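The practical consequence is that a developer can consume output from these APIs with near-identical code. The payload below is invented, but its field names mirror the JSON conventions popularized by social platform APIs:

```python
import json

# Hypothetical status payload; the field names echo common social API
# conventions, but the data itself is invented for illustration.
API_RESPONSE = """
{"user": {"screen_name": "example_user"},
 "created_at": "2012-03-15",
 "text": "Standard formats make data portable.",
 "coordinates": {"lat": 40.71, "lon": -74.0}}
"""

def summarize(payload):
    """Pull the fields a standards-aware consumer would expect to find."""
    status = json.loads(payload)
    return (status["user"]["screen_name"], status["text"])

author, message = summarize(API_RESPONSE)
```

Because a handful of platforms converge on similar structures, tools built against one feed often port to another with minor changes – structure arrives as a by-product of concentration.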
Moving forward, a development that may accelerate the adoption of standards is Google’s increasing focus on delivering a more semantic experience to its users. In Amir Efrati’s recent Wall Street Journal article, With Semantic Search, Google Eyes Competitors, he describes how Google is looking to semantic search as a means of reinforcing its dominance in search while gaining ground in social (e.g. Facebook) and mobile (e.g. Apple’s Siri).
As Google turns, so does the Internet marketing industry.
Bringing it all together
Throughout this series, we have drawn the following conclusions about what has changed in the past decade to substantiate the case for Big Data:
- Social media facilitates an incremental increase in the amount of data entering the data sphere by giving individuals access to the tools of production, while imposing structure on that data because the majority of production takes place on a handful of platforms.
- The broad societal shift towards greater openness and transparency has created a flow of institutional data into the data sphere that was once hidden.
- As data migrates from the desktop into the cloud, it transforms from dormant storage into a catalyst for user interaction, increasing the overall volume of quantifiable data in the data sphere.
- There is broad cultural acceptance of the value associated with common data formats as well as growing adoption of data standards across multiple societal levels.