Big data architecture: Methods to protect identity and the citizen

March 20, 2014

This breakout session on the architectural and methodological impact of big data was one part of a colloquium run by IBM’s Technical Consultancy Group, the UK & Ireland affiliate of the Academy of Technology, which hosted 40 leading academics in big data and analytics from UK universities last week at the Royal Academy of Engineering in London.

Each of the six-minute rapid-fire presentations and deep-as-you-like question rounds demonstrated the passion and vitality of the leaders in this field. Many common themes emerged from the diversity of the topics covered, including:

  • Seismic shifts in exascale compute architecture
  • Big data knowledge worker value contribution skew
  • Self-optimising surgical queries in Hadoop clouds
  • Super-human-like cognition linking image signatures across vast libraries
  • Identity convergence with data collapse to the individual

Commodity, any-scale computation in the hand or in the cloud is solving problems that approach human cognition, but old problems remain: macro and micro design mishaps still destroy performance on exascale datasets, hardly the hallmark of a mature architectural process.
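
To make the point about micro design mishaps concrete, here is a minimal, purely illustrative Python sketch (not drawn from the colloquium; the datasets and function names are invented) contrasting a nested-loop join with a hash join over the same records. The gap is invisible on a few thousand rows, but at exascale it is the difference between a query that finishes and one that never does.

```python
# Hypothetical illustration: the same join, two designs.
# A nested-loop join is O(n * m); a hash join is O(n + m).
# At scale, that asymptotic difference is the whole game.

import random
import time

def nested_loop_join(orders, customers):
    """Naive design: scan every customer for every order."""
    return [
        (o["id"], c["name"])
        for o in orders
        for c in customers
        if o["customer_id"] == c["id"]
    ]

def hash_join(orders, customers):
    """Build a hash index on the smaller side, then probe it once per order."""
    index = {c["id"]: c["name"] for c in customers}
    return [(o["id"], index[o["customer_id"]])
            for o in orders
            if o["customer_id"] in index]

if __name__ == "__main__":
    customers = [{"id": i, "name": f"cust-{i}"} for i in range(5_000)]
    orders = [{"id": i, "customer_id": random.randrange(5_000)}
              for i in range(20_000)]

    for join in (nested_loop_join, hash_join):
        start = time.perf_counter()
        rows = join(orders, customers)
        print(f"{join.__name__}: {len(rows)} rows in "
              f"{time.perf_counter() - start:.3f}s")
```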

While data at a global, or even national, scale is structured in ways that confound collaboration and reuse, leaps in machine cognition are allowing machines to transcend some of these challenges. However, an architectural focus on data structure, data quality and standards-setting remains a key issue. Consequently, workforce skills in data structure and architecture, manipulation and especially analysis are increasingly in demand yet scarce.

Everything-to-everything connectivity in the Internet of Things yields transactions tracked and tagged to the individual, setting new standards for intrusion into personal space. Tactless marketing and high-profile data leaks are leaving citizens feeling queasy and controlled, at best.
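
As a purely illustrative sketch of that identity convergence (the datasets and fields below are invented, not taken from the colloquium), joining two seemingly harmless datasets on ordinary quasi-identifiers such as postcode and date of birth is enough to collapse "anonymous" records back to a named individual.

```python
# Hypothetical illustration: identity convergence through linkage.
# Neither dataset names the individual directly, but joining on
# quasi-identifiers (postcode + date of birth) collapses the records
# to one person.

# "Anonymised" purchase history released by a retailer.
purchases = [
    {"postcode": "SW1A 1AA", "dob": "1980-04-12", "item": "fitness tracker"},
    {"postcode": "EC2M 7PY", "dob": "1975-09-30", "item": "baby formula"},
]

# Separately published register with names attached.
register = [
    {"name": "A. Citizen", "postcode": "SW1A 1AA", "dob": "1980-04-12"},
    {"name": "B. Resident", "postcode": "EC2M 7PY", "dob": "1975-09-30"},
]

def link(purchases, register):
    """Join the two datasets on the shared quasi-identifiers."""
    index = {(r["postcode"], r["dob"]): r["name"] for r in register}
    linked = []
    for p in purchases:
        key = (p["postcode"], p["dob"])
        if key in index:
            linked.append({"name": index[key], "item": p["item"]})
    return linked

for row in link(purchases, register):
    print(f'{row["name"]} bought a {row["item"]}')
```

The point of the sketch is that neither dataset needs to be leaked with names attached for the individual to be exposed; the convergence happens in the join.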

Industry-wide, methodology needs a leap in maturity to avoid poor, exposed designs, degraded performance and reduced capacity for collaboration. More importantly, ongoing collaborations need to deliver responses that minimize exposure of the individual and ensure trust is not eroded further. The spin-off discussions, each far from trivial, are set to continue. Watch here on the Hub for more from this academic colloquium.

Read the first two posts from this series:

  1. Big data is multi-disciplinary
  2. Are data artists essential for big data success?

Download Ethics for big data and analytics