Innovative solutions

The term big data seems to be everywhere these days. You even see it in discussions about science, government, and mathematics, just to name a few. Big data is making a difference in the government and financial services sectors, where the pendulum of security, profit, and loss can swing in a matter of seconds.

So, why all the fuss about big data? First, let’s start by defining what the term actually means. In the context of this discussion, big data refers to the ability to process and analyze vast amounts of information in milliseconds. It’s important to understand how technology impacts the way financial markets operate and how the adoption of new forms of technology, such as big data, changes the IT landscape for the financial sector.

Government

  • In 2012, the Obama administration announced the Big Data Research and Development Initiative, which explored how big data could be used to address important problems faced by the government. The initiative was composed of 84 different big data programs spread across six departments.
  • Big data analysis played a large role in Barack Obama’s successful 2012 re-election campaign.
  • The United States Federal Government owns six of the ten most powerful supercomputers in the world.
  • The Utah Data Center is a data center currently being constructed by the United States National Security Agency. When finished, the facility will be able to handle a large amount of information collected by the NSA over the Internet. The exact amount of storage space is unknown, but more recent sources claim it will be on the order of a few exabytes.

How Fast Is Too Fast?
Over the years, financial exchanges and government agencies like the NSA have continually made improvements to their technology infrastructure, all in the name of speed and accuracy. For example, direct feeds from the New York Stock Exchange (NYSE) allow companies to receive and process information in real time and then react to the results. Moving forward, however, financial institutions must be cautious. Analyzing and reacting to big data has become so aggressive that the SEC has created a Quantitative Analytic Unit to police high-frequency traders, market movers, and trades that have sub-second irregularities. In fact, the parent company of the NYSE was even fined $5 million for delivering data to some customers several seconds later than to others.

 So why is it important to analyze large amounts of data? There’s always been data available about customers and trends in the market. The difference with big data is that vast amounts of information can be analyzed in milliseconds, which can help improve decision making and allow financial institutions to customize programs for specific customers.

It’s clear that technology can help improve the performance of financial institutions through the analysis of big data. Improved customer segmentation and identification of industry trends can greatly enhance competitive advantage. While there are always issues with adopting new forms of technology, the benefits of leveraging the information learned from big data generally outweigh the risks.
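To make the idea of customer segmentation a little more concrete, here is a minimal sketch in Python of grouping customers into a few segments with k-means clustering. The feature names, the synthetic data, and the choice of four segments are assumptions made purely for illustration, not any institution’s actual pipeline.

    # Minimal customer-segmentation sketch using k-means clustering.
    # Feature names and data below are synthetic and purely illustrative.
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)

    # Hypothetical features per customer: average transaction size, trades per day
    customers = np.column_stack([
        rng.lognormal(mean=3.0, sigma=1.0, size=1000),   # average transaction size
        rng.poisson(lam=5.0, size=1000).astype(float),   # trades per day
    ])

    # Group customers into a handful of segments
    kmeans = KMeans(n_clusters=4, n_init=10, random_state=0)
    segments = kmeans.fit_predict(customers)

    for label in range(4):
        members = customers[segments == label]
        print(f"Segment {label}: {len(members)} customers, "
              f"mean transaction size {members[:, 0].mean():.2f}")

In practice the features would come from real transaction records and the number of segments would be chosen with a validation measure, but the overall shape of the analysis is the same.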

Critiques of the big data paradigm

“A crucial problem is that we do not know much about the underlying empirical micro-processes that lead to the emergence of the[se] typical network characteristics of Big Data”. In their critique, Snijders, Matzat, and Reips point out that often very strong assumptions are made about mathematical properties that may not at all reflect what is really going on at the level of micro-processes. Mark Graham has leveled broad critiques at Chris Anderson’s assertion that big data will spell the end of theory, focusing in particular on the notion that big data will always need to be contextualized in their social, economic, and political contexts. Even as companies invest eight- and nine-figure sums to derive insight from information streaming in from suppliers and customers, less than 40% of employees have sufficiently mature processes and skills to do so. To overcome this insight deficit, “big data”, no matter how comprehensive or well analyzed, needs to be complemented by “big judgment”, according to an article in the Harvard Business Review.

Along much the same lines, it has been pointed out that decisions based on the analysis of big data are inevitably “informed by the world as it was in the past, or, at best, as it currently is”. Fed by large amounts of data on past experiences, algorithms can predict future developments if the future is similar to the past. If the dynamics of the system change in the future, the past can say little about the future. For this, it would be necessary to have a thorough understanding of the system’s dynamics, which implies theory. As a response to this critique, it has been suggested to combine big data approaches with computer simulations, such as agent-based models. Agent-based models are increasingly better at predicting the outcomes of complex social processes, even in unknown future scenarios, through computer simulations based on a collection of mutually interdependent algorithms. In addition, the use of multivariate methods that probe for the latent structure of the data, such as factor analysis and cluster analysis, has proven useful as an analytic approach that goes well beyond the bi-variate approaches (cross-tabs) typically employed with smaller data sets.
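To make the contrast with bi-variate cross-tabs concrete, the following is a minimal sketch of probing for latent structure with factor analysis. The data-generating process here (two hidden factors driving six observed variables) is an assumption invented for the example.

    # Minimal factor-analysis sketch: recover latent structure from many
    # observed variables instead of cross-tabulating them pairwise.
    # The data-generating process below is synthetic and illustrative only.
    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(1)

    # Two hidden factors drive six observed variables, plus noise
    latent = rng.normal(size=(500, 2))
    loadings = np.array([[0.9, 0.0], [0.8, 0.1], [0.7, 0.2],
                         [0.1, 0.9], [0.2, 0.8], [0.0, 0.7]])
    observed = latent @ loadings.T + 0.3 * rng.normal(size=(500, 6))

    fa = FactorAnalysis(n_components=2, random_state=0)
    fa.fit(observed)

    # Each row shows how strongly a recovered factor loads on each observed variable
    print(np.round(fa.components_, 2))

A bi-variate cross-tab would only ever look at two of the six variables at a time; the factor model summarizes all of them with two underlying dimensions.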

In health and biology, conventional scientific approaches are based on experimentation. For these approaches, the limiting factor is the relevant data that can confirm or refute the initial hypothesis. A new postulate is now accepted in the biosciences: the information provided by data in huge volumes (omics), collected without a prior hypothesis, is complementary and sometimes necessary to conventional approaches based on experimentation. In these massive-data approaches, it is the formulation of a relevant hypothesis to explain the data that becomes the limiting factor. The logic of the search is reversed, and the limits of induction (the “Glory of Science and Philosophy scandal”, C. D. Broad, 1926) have to be considered.

Privacy advocates are concerned about the threat to privacy represented by increasing storage and integration of personally identifiable information; expert panels have released various policy recommendations to conform practice to expectations of privacy.

