22 Sept 2016

Who owns the infrastructure of the future?

I’m sure I’ve seen this movie before, and I have also seen tech companies pitching hardware and software with similar videos multiple times. What I find interesting about this vision of the future is the underlying infrastructure required to power the seamless interaction between people, technology and the offline world.

The key enablers here are beacons and sensors, cloud analytics and push notifications to mobile devices. Sensors provide contextual information about the environment, beacons give you the ability to tie people to specific locations with high accuracy, cloud analytics allow you to process these information feeds along with other data in real time, and push notifications allow you to get end user attention at a specific moment and provide a call to action.
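To make the interaction between these enablers concrete, here is a minimal, purely illustrative sketch of one such event flow: a beacon sighting is combined with sensor context in the cloud, and a simple rule decides whether to push a notification. All names, thresholds and the message are hypothetical, not any real vendor's API.

```python
# Hypothetical event flow: beacon sighting + sensor context -> push notification.
def handle_sighting(sighting, sensor_context, send_push):
    # Beacons tie a person to a location: beacon id plus signal strength (RSSI).
    near_entrance = sighting["beacon"] == "entrance" and sighting["rssi"] > -60
    # Sensors provide contextual information about the environment.
    store_is_quiet = sensor_context["footfall_per_min"] < 5
    if near_entrance and store_is_quiet:
        # Push notifications deliver a call to action at a specific moment.
        send_push(sighting["user"], "Welcome in - 10% off today")
        return True
    return False

# Example: a user lingers near the entrance while the store is quiet.
sent = []
handled = handle_sighting(
    {"user": "u1", "beacon": "entrance", "rssi": -55},
    {"footfall_per_min": 3},
    lambda user, msg: sent.append((user, msg)),
)
```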

Today all of these technologies exist, and most of them are highly accessible. Smartphone penetration in the most developed markets is above 70%, the number of beacons installed in the US has grown from zero to 4 million in the last 5 years, and the cost of processing data in the cloud is approaching zero. A few challenges remain (for example, in the US only 40% of people have Bluetooth enabled on their phones), but these will be resolved as more valuable use cases emerge.

In my view, the missing piece of the puzzle is a platform for accessing this network of sensors and beacons. From what I have seen, the public sector is furthest ahead here with data.gov in the US and data.gov.uk in the UK providing access to almost 250,000 datasets combined. But even this is more of an index than a platform, with data in many separate locations and formats.

In the private sector, there is another hurdle: who owns the infrastructure, the sensors, the beacons and the data they generate? I think part of the hesitation companies have in giving access to their infrastructure and data is the loss of competitive advantage with nothing in return. Think of the retailer whose competitor suddenly knows how its customers are shopping, or the independent restaurant whose chain competitor notices it has incredibly high footfall and opens up next door. What would drive them to open up this data for free?

I have met a number of companies trying to solve these challenges in different ways:

  • Owning and installing large networks of sensors or beacons for a specific function themselves (slow and expensive, but gives them full control of data format and distribution)
  • Building partnerships one by one with sensor owners (slow, as the owners are highly fragmented and each will have different requirements, and data formats are still a challenge)
  • Building data brokerages (a proven business model, but harder to build into a scalable platform)

I think a successful solution here must have the following key features:

  • A flexible way to ingest data from any source and store it in a way that makes it easy to query and transform
  • Data discovery tools that allow developers to browse and compare sources
  • APIs to programmatically access and manipulate data, along with relevant metadata
  • Methods for creating and subscribing to specific data feeds or event streams
  • Fine-grained access and approval, with the option to grant access on a case-by-case basis
  • Enterprise-level security controls and audit logs
  • A consistent way of indicating data quality and service levels, so that publishers do not bear responsibility for uses of their data
  • Payment infrastructure that allows publishers to price and charge for data on a consumption-based model
  • The ability to combine data feeds with additional analysis and republish the result, inheriting the privacy and other characteristics of the sources, with a revenue-sharing model
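A few of these primitives can be sketched in code. The following is a toy model, not a real product: it shows flexible ingest, publisher-controlled case-by-case access grants, and consumption-based metering in a few dozen lines. Every class and method name here is hypothetical.

```python
# Toy model of a data platform's core primitives: ingest, access grants,
# and consumption-based metering. All names are illustrative.
from collections import defaultdict
from dataclasses import dataclass, field

@dataclass
class Feed:
    owner: str
    price_per_event: float                       # consumption-based pricing
    events: list = field(default_factory=list)
    approved: set = field(default_factory=set)   # case-by-case access grants

class Platform:
    def __init__(self):
        self.feeds = {}
        self.usage = defaultdict(float)          # consumer -> amount owed

    def register_feed(self, name, owner, price_per_event):
        self.feeds[name] = Feed(owner, price_per_event)

    def ingest(self, name, event):
        # Flexible ingest: any dict-shaped payload is accepted and stored.
        self.feeds[name].events.append(event)

    def grant(self, name, consumer):
        # Fine-grained approval, decided by the publisher per consumer.
        self.feeds[name].approved.add(consumer)

    def read(self, name, consumer):
        feed = self.feeds[name]
        if consumer not in feed.approved:
            raise PermissionError(f"{consumer} not approved for {name}")
        # Meter every event consumed so the publisher can charge for it.
        self.usage[consumer] += feed.price_per_event * len(feed.events)
        return list(feed.events)

# Usage: a retailer publishes footfall data and charges per event read.
p = Platform()
p.register_feed("store-footfall", owner="retailer", price_per_event=0.01)
p.ingest("store-footfall", {"beacon": "door-1", "count": 42})
p.grant("store-footfall", "analytics-co")
events = p.read("store-footfall", "analytics-co")
```

The point of the sketch is the incentive structure rather than the engineering: because access is granted per consumer and reads are metered, the publisher gets something in return for opening up its data.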

This is an ambitious system, I admit, but I think it would change the mindset of those who own this valuable infrastructure and data, making it more accessible and helping to secure the seamless, interactive future we have envisioned.