We Are Reaching A Critical Juncture On The Path To “Smarter” Systems Where Organizations Will Be Crying Out For A Completely New Approach
The IT and telecom sectors have failed to re-evaluate their relationships with advancing technology and their constituents. The business and technology paradigms to which these industries cling today are far too limiting, too cumbersome and too expensive to foster what's required to truly realize an Internet of Things.
From a Telco perspective, today's discussions of M2M systems focus almost exclusively on communications (the "pipe") and very little on the information value; in other words, on the things that look good to the carriers. There are many popular visions of wireless monitoring and wireless control, and wireless is, no question, a fantastic advance. But focusing on the communication element alone as 'first-order' business value amounts to grabbing the wrong end of the technology stick. Wireless communications steal the limelight and potentially eclipse the real revolution: utilizing new networking technologies and processes to liberate information from sensors and intelligent devices, and to leverage collective awareness and intelligence.
From an IT perspective, most of today's corporate IT functions are direct descendants of the company mainframe, and they work on the same "batched computing" model: an archival model that yields a historian's perspective. Information about events is collected, stored, queried, analyzed, and reported on, but all after the fact. That is a very different thing from feeding the real-time inputs of billions of tiny "state machines" into systems that continually compare machine state to sets of rules and then act on that basis. In short, for connected devices to mean anything in business, the prevailing corporate IT model has to change.
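The contrast between the archival model and the state-machine model can be made concrete. Below is a minimal, hypothetical sketch of an event-driven rule engine: each incoming device state is evaluated against rules the moment it arrives, rather than being stored for after-the-fact analysis. All names and thresholds here are illustrative, not a reference to any particular product.

```python
# Minimal sketch: evaluate device state against rules as events arrive,
# instead of batching readings for later reporting. Illustrative only.
from dataclasses import dataclass
from typing import Callable

@dataclass
class DeviceState:
    device_id: str
    metric: str
    value: float

@dataclass
class Rule:
    name: str
    condition: Callable[[DeviceState], bool]  # does this state match?
    action: Callable[[DeviceState], None]     # what to do when it does

class RuleEngine:
    def __init__(self) -> None:
        self.rules: list[Rule] = []
        self.log: list[str] = []

    def add_rule(self, rule: Rule) -> None:
        self.rules.append(rule)

    def on_event(self, state: DeviceState) -> None:
        # Compare machine state to every rule at the moment of arrival.
        for rule in self.rules:
            if rule.condition(state):
                rule.action(state)

engine = RuleEngine()
engine.add_rule(Rule(
    name="over-temperature",
    condition=lambda s: s.metric == "temp_c" and s.value > 80.0,
    action=lambda s: engine.log.append(f"ALERT {s.device_id}: {s.value}"),
))

engine.on_event(DeviceState("sensor-7", "temp_c", 85.2))
# engine.log now holds the alert raised in real time.
```

The point of the sketch is the inversion of control: nothing is queried after the fact; the system acts as state flows through it.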
The next cycle of technology and systems development in the smart connected systems arena is supposed to be setting the stage for a multi-year wave of growth based on the convergence of innovations in software; so-called cloud computing architectures; back-room data center operations; wireless and broadband communications; and smaller, more powerful client devices connected to personal, local and wide-area networks. But is it?
When it comes to preparing for the global information economy of the 21st century, most people assume that “the IT and Telco technologists are taking care of it.” They take it on faith that the best possible designs for the future of connected things, people, systems and information will emerge from large corporations and centralized authorities. But those are big, unfounded assumptions. In fact, most of today’s entrenched players are showing little appetite for radical departures from current practice. Yet current practice will not serve the needs of a genuinely connected world.
What are the major obstacles that need to be overcome?
Leveraging collective intelligence: For all their sophistication, many of today's M2M systems are direct descendants of the traditional cellular telephony model, in which each device acts in a "hub and spoke" mode. The inability of today's popular enterprise systems to interoperate and perform well with distributed heterogeneous device environments is a significant obstacle. The many "nodes" of a network may not be very "smart" in themselves, but if they are networked in a way that allows them to connect effortlessly and interoperate seamlessly, they begin to give rise to complex, system-wide behavior. This allows an entirely new order of intelligence to emerge from the system as a whole, an intelligence that could not have been predicted by looking at any of the nodes individually. What's required is to shift the focus from simple device monitoring to a model where device data is aggregated into new applications to achieve true systems intelligence.
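A toy example makes the aggregation point tangible. In this hypothetical sketch, each node reports only its own reading and can judge nothing from it in isolation; only the aggregated fleet view reveals which node is anomalous. The node names, readings, and threshold are invented for illustration.

```python
# Sketch: per-device monitoring vs. fleet-level aggregation.
# No single node can tell it is an outlier; the aggregate can.
from statistics import mean

readings = {
    "node-1": 21.0,
    "node-2": 21.5,
    "node-3": 34.0,   # anomalous, but only relative to the fleet
    "node-4": 20.5,
}

fleet_mean = mean(readings.values())

# System-level insight: flag nodes deviating sharply from the fleet norm,
# a judgment that requires data from all nodes, not any one of them.
anomalies = {
    node: value
    for node, value in readings.items()
    if abs(value - fleet_mean) > 5.0
}
# anomalies → {"node-3": 34.0}
```

The same pattern scales up: the "intelligence" (here, outlier detection) lives in the aggregation layer, not in any individual node.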
Automated development: When telephones first came into existence, all calls were routed through switchboards and had to be connected by a live operator. It was long ago forecast that if telephone traffic continued to grow in this way, soon everybody in the world would have to be a switchboard operator. Of course that has not happened, because automation was built into the systems to handle common tasks like connecting calls. We are quickly approaching analogous circumstances with the proliferation of smart connected devices. Each new device requires too much customization and maintenance just to perform the same basic tasks. We must develop software and methods to automate development and facilitate re-use, or risk constraining the growth of this market.
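One simple form such automation and re-use can take is templating: the common provisioning tasks are captured once per device class and stamped out per device. The sketch below is purely hypothetical; the class, field names, and device kinds are assumptions made for illustration.

```python
# Sketch: automate the repetitive parts of device onboarding with a
# reusable template, instead of hand-customizing each new device.
from dataclasses import dataclass

@dataclass
class DeviceTemplate:
    kind: str
    report_interval_s: int
    metrics: list[str]

    def provision(self, device_id: str) -> dict:
        # Common tasks (naming, scheduling, metric registration) are
        # handled here once, not re-implemented for every device.
        return {
            "id": device_id,
            "kind": self.kind,
            "report_interval_s": self.report_interval_s,
            "metrics": list(self.metrics),
        }

thermostat = DeviceTemplate("thermostat", 60, ["temp_c", "humidity"])
fleet = [thermostat.provision(f"thermo-{i}") for i in range(3)]
# Three devices provisioned from one template, zero per-device code.
```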
Optimizing all assets – tangible and intangible: New software technologies and applications need to help organizations address the key challenge of optimizing the value of their balance sheets, allowing them to move beyond just financial assets and liabilities to their physical assets and liabilities (like electric grids or hospitals) and then to their intangible assets and liabilities (like a skilled workforce). The task of optimizing the value of financial assets, physical assets and people assets requires new technologies that will integrate diverse asset information in unprecedented ways to solve more complex business problems.
Flexible, scalable systems: IT professionals rarely talk these days about the need for ever-evolving information services that can be made available anywhere, anytime, for any kind of information. Instead, they talk about web services, enterprise apps and now cloud computing. The Web stores information in one of two basic ways: utterly unstructured, or far too rigidly structured. The unstructured way gives us typical static Web pages, blog postings, etc., in which the basic unit of information is large, free-form, and lacking any fundamental identity. The overly structured way involves the use of relational database tables that impose rigid, pre-ordained schemas on stored information. These schemas, designed by database administrators in advance, are not at all agile or easily extensible. Making even trivial changes to these schemas is a cumbersome, expensive process that affects all the data inside them. Just as importantly, they make deep, inflexible assumptions about the meaning and context of the data they store. Both of these approaches to data structure impose severe limitations on the functions you want most in a global, pervasive-era information system: scalability, interoperability and seamless integration of real-time or event-driven data. The client-server model underlying the Web greatly compounds the problem.
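The two extremes above can be contrasted in a few lines. This hypothetical sketch pits a fixed, pre-ordained schema against a self-describing record that can gain new attributes with no migration; the field names and values are invented for illustration.

```python
# Sketch: rigid pre-ordained schema vs. schema-free, extensible records.

rigid_schema = ("device_id", "temp_c")   # columns fixed in advance

def insert_rigid(row: tuple) -> tuple:
    # Any new field means a schema change touching all existing data.
    if len(row) != len(rigid_schema):
        raise ValueError("schema change required before storing new fields")
    return row

# Flexible: each record carries its own attribute names, so a new
# sensor capability is just a new key, with no schema redesign.
flexible_store: list[dict] = []
flexible_store.append({"device_id": "s1", "temp_c": 21.0})
flexible_store.append({"device_id": "s2", "temp_c": 20.5,
                       "vibration_hz": 4.2})  # new attribute, no migration
```

Neither extreme is a full answer on its own; the point is that a pervasive-era system needs the extensibility of the second without giving up the queryability of the first.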
Some things that look easy turn out to be hard. That’s part of the strange saga of the Internet of Things and its perpetual attempts to get itself off the ground. But some things that should be kept simple are allowed to get unnecessarily complex, and that’s the other part of the story. The drive to develop technology can inspire grandiose visions that make simple thinking seem somehow embarrassing or not worthwhile. That’s not a good thing when defining and deploying real-world technology to deliver innovation.