However, for the market to really take off, several key technology hurdles will need to be addressed, including:
- Lack of truly ubiquitous device communications standards;
- Costs and ease-of-use associated with provisioning device communications;
- New software architecture to support massively peer-to-peer, complex event-driven data management;
- Smart device, system and application interoperability.
Barriers such as the lack of powerful, inexpensive devices and the inability to communicate over a wide range of wired/wireless networks are diminishing, but interoperability of distributed, real-time device data remains a huge hurdle.
The growing bottleneck lies in the relationship and interactions between ever more complex devices and the antiquated client/server architecture of the web. With memory and processing power getting cheaper by the day, product designers are embedding feature upon feature into their designs. What may finally bring Moore’s law to its knees is the sheer complexity of software driving a near-infinite number of interactions.
The growing diversity of “edge” devices on networks is diluting technicians’ ability to manage them effectively. It is extremely difficult to keep up with the unique requirements of each new device and all its advanced features. Customers increasingly expect networked devices to be functional, ubiquitous, and easy to use. Within this construct, however, the first two expectations run counter to the third. To achieve all three, the network must be loaded with “edge” intelligence.
On the hardware front, smartphones and tablets are perhaps the most ubiquitous devices, with falling prices and continuous innovation making even the most sophisticated smart end-point devices affordable. For embedded devices, OEMs are racing to develop ever more powerful and smaller hardware, but small size comes at the cost of computing power. One can argue that the true hurdle to edge computing is the absence of fast, powerful, lightweight applications that can run on any small device in a truly distributed manner – at the end of the day, you really can’t fit a traditional “PC” architecture into a sensor.
Intelligent processing and transactional computing cannot occur on dumb clients, where intermittent server connections, proprietary “locked” platforms, and large install footprints are prohibitive. The inability of today’s popular enterprise systems to interoperate and perform well in distributed, heterogeneous, collaborative environments is an obstacle that intelligent middleware can now overcome. Devices hosting intelligent software components can communicate with other devices directly (peer-to-peer) or with logical collections of devices (peer-to-group) in any programming language, and do so autonomously.
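The peer-to-peer and peer-to-group patterns above can be sketched in a few lines. This is a minimal in-process illustration, not a real transport: the `DeviceBus` class, handler signatures, and message contents are all hypothetical names invented for this example, and a production system would use a networked protocol such as MQTT or DDS rather than a local dictionary.

```python
from collections import defaultdict

class DeviceBus:
    """Illustrative in-process stand-in for a device messaging fabric.
    Supports direct peer-to-peer delivery and peer-to-group publishing."""

    def __init__(self):
        self.devices = {}               # device_id -> message handler
        self.groups = defaultdict(set)  # group name -> member device ids

    def register(self, device_id, handler, groups=()):
        """Attach a device and optionally enroll it in logical groups."""
        self.devices[device_id] = handler
        for g in groups:
            self.groups[g].add(device_id)

    def send(self, sender, target, message):
        """Peer-to-peer: deliver a message directly to one device."""
        self.devices[target](sender, message)

    def publish(self, sender, group, message):
        """Peer-to-group: deliver to every device in a logical group."""
        for device_id in self.groups[group]:
            if device_id != sender:  # don't echo back to the sender
                self.devices[device_id](sender, message)

# Usage: a sensor reports to one device, then commands a whole group.
bus = DeviceBus()
log = []
bus.register("thermostat", lambda s, m: log.append(("thermostat", s, m)), groups=["hvac"])
bus.register("vent", lambda s, m: log.append(("vent", s, m)), groups=["hvac"])
bus.register("sensor-1", lambda s, m: log.append(("sensor-1", s, m)))

bus.send("sensor-1", "thermostat", {"temp_c": 27.5})          # peer-to-peer
bus.publish("sensor-1", "hvac", {"cmd": "increase_cooling"})  # peer-to-group
```

The key design point is that the sender never needs to enumerate group members; membership lives in the fabric, so devices can join or leave without other devices changing.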
Many basic technology hurdles that previously held back Pervasive Computing and M2M are being met head-on, but integrating complex event-driven systems and intelligent device communications in an interoperable manner remains a critical requirement. Interoperability is a key goal when evaluating new technologies, as wireless systems meld with legacy wired systems and developers integrate enterprise software systems; newer intelligent platforms for edge devices are now positioned to deliver it in distributed, heterogeneous “smart device” environments.
However, some things that look easy turn out to be hard. That’s part of the strange saga of the Internet of Things and its perpetual attempts to get itself off the ground. But some things that should be kept simple are allowed to get unnecessarily complex, and that’s the other part of the story. The drive to develop technology can inspire grandiose visions that make simple thinking seem somehow embarrassing or not worthwhile. That’s understandable in science fiction. But it’s not a good thing when defining and deploying real-world technology to deliver new innovation. This is where today’s technologies and the behavior of IT departments come into play.
For all its sophistication, today’s corporate IT function is a direct descendant of the company mainframe, and works on the same “batch computing” model—an archival model, yielding a historian’s perspective. Information about events is collected, stored, queried, analyzed, and reported upon. But all after the fact.
That’s a very different thing from feeding the real-time inputs of billions of tiny “state machines” into systems that continually compare machine state to sets of rules and then act on that basis. In short, for connected devices to mean anything in business, the prevailing corporate IT model has to change.
In its most basic and practical form, the story surrounding “edge” innovation is “enterprise systems meets embedded device computing.” But that’s not as simple as it sounds. Capturing the real value of Internet-connected devices goes much further than providing connectivity, databases, and some XML-based transport scheme. For example, real Web services will allow networked, embedded devices to execute remote applications as if those applications were part of the internal operating system. This kind of enablement can bring extraordinary value to the growing population of network-embedded devices, enabling collaboration among devices as well as between devices and humans.
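The idea that a remote application can look like part of the local operating system can be illustrated with a small proxy sketch. Everything here is hypothetical: the `RemoteService` proxy, the `calibrate` method, and the in-process `fake_transport` stand in for what would really be an HTTP- or XML-based service call over the network.

```python
class RemoteService:
    """Proxy that makes a remote application callable like a local function.
    Any attribute access becomes a call forwarded through the transport."""

    def __init__(self, transport):
        self.transport = transport  # callable: (method, params) -> result

    def __getattr__(self, method):
        # Attribute lookup falls through to here for unknown names,
        # turning service.foo(x=1) into transport("foo", {"x": 1}).
        return lambda **params: self.transport(method, params)

# Stand-in for the server-side application a device would reach remotely.
def fake_transport(method, params):
    registry = {"calibrate": lambda sensor: f"{sensor}: calibrated"}
    return registry[method](**params)

# Usage: device firmware invokes a remote application as a local call.
service = RemoteService(fake_transport)
result = service.calibrate(sensor="flow-3")
```

Swapping `fake_transport` for a networked one changes nothing in the calling code, which is precisely the transparency the paragraph describes.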
At the end of the day, the convergence of collaborative systems and machine communications implies a total paradigm shift for IT suppliers and users. The depth of this shift has begun to suggest itself, but it is by no means accomplished. It’s a shift from knowing “what happened” to knowing “what is happening”—all the time—and then automatically controlling systems with that knowledge. IT professionals rarely talk these days about the need for ever-evolving information services that can be made available anywhere, anytime, in true real-time, for any kind of information—human or device. Instead, they talk about “web services” or “cloud computing” interchangeably without giving it a thought. New reference architectures for “edge-driven” integration and collaboration are required.
Put in simple terms, low-cost, easy-to-integrate sensors and edge devices with interoperable data management will drive wider adoption of real-time, complex event-driven systems. Networking technologies and the standards that support them must evolve to the point where data can flow freely among sensors, computers, actuators, and people. Software for aggregating and analyzing schema-less data, along with user/system interaction design, must improve to the point where huge volumes of data can be absorbed by smart systems and by human decision makers alike.
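Aggregating schema-less data so that humans see summaries rather than raw events can be sketched with a sliding time window. The class name, window width, and field names below are illustrative assumptions; the point is that readings are arbitrary dictionaries with no fixed schema, and the aggregator copes with fields that appear in only some of them.

```python
from collections import deque

class SlidingWindow:
    """Keeps only the readings from the last `width_s` seconds and
    summarizes whichever numeric fields happen to be present."""

    def __init__(self, width_s):
        self.width_s = width_s
        self.events = deque()  # (timestamp, reading-dict) pairs, in order

    def add(self, ts, reading):
        self.events.append((ts, reading))
        # Evict readings that have fallen out of the window.
        while self.events and self.events[0][0] < ts - self.width_s:
            self.events.popleft()

    def summarize(self, field):
        """Mean of a field across whichever current readings carry it."""
        values = [r[field] for _, r in self.events if field in r]
        return sum(values) / len(values) if values else None

# Usage: schema-less readings; not every reading has every field.
w = SlidingWindow(width_s=60)
w.add(0,  {"sensor": "a", "temp_c": 20.0})
w.add(30, {"sensor": "b", "temp_c": 22.0, "humidity": 40})
w.add(90, {"sensor": "a", "temp_c": 24.0})   # evicts the t=0 reading

avg_temp = w.summarize("temp_c")  # mean over readings still in the window
```

A decision maker, or a downstream rule, consumes one summarized number instead of every raw event, which is the absorption problem the paragraph describes.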
For IT suppliers and users to really succeed, they will need to fully embrace the real-time benefits of Internet device collaboration. Edge-driven and edge-designed systems demand that we design not only devices and networks but also information itself, in ways not addressed by IT today.