DESIGNING the FUTURE
of CONNECTED SYSTEMS
The advent of smart connected products,
systems and services
For quite a few years now, Harbor Research has focused most of its research and consulting on what we call “Smart Systems and Services”—the convergence of pervasive or embedded computing with the packet-switching “network of networks” called the Internet.
We prefer “Smart Systems” over other terms in common use—notably the Internet of Things or, looking further back, M2M (“machine-to-machine”)—because it captures the enormity of the phenomenon: the world on the Internet, the Internet in the world.
Smart Systems—also commonly called “pervasive” or “ubiquitous” or “invisible” computing— usually refers to digital microprocessors and sensors embedded in everyday objects. But even this makes too many assumptions about what the Smart Systems phenomenon will be. Encoded information in physical objects is also smart computing—even without intrinsic computing ability, or, for that matter, without being electronic at all. Seen in this way, a printed bar code, a CD or DVD disc, a house key, or even the pages of a book can have the status of an “information device” on a network.
But very few people are thinking about smart connected systems on that level. Current IT and telecom technologists are operating with outdated models of data, networking and information management that were conceived in the mainframe and client-server eras and cannot serve the needs of a truly connected world. “Smart Systems” should automatically be understood as “real-time networked information and computation,” but it isn’t. The Internet’s most profound potential lies in the integration of smart machines, information systems and people—its ability to connect billions upon billions of smart sensors, devices, and ordinary products into a “digital nervous system” that will smoothly interact with individuals and systems. The nature and behavior of a truly distributed global information system are concerns that have yet to really take center stage—not only in business communities, but in most technology communities, too.
UNDERSTANDING N.IO'S CREATIVE COMBINATION OF TECHNOLOGY
We have not encountered many compelling visions of the complete integration of things, people, systems and the real-time world. The n.io team of innovators understands that the tools we work with today to make products “smart” on networks were not designed to handle the scope of new capabilities or the scale and diversity of interactions.
The n.io platform is not a simple incremental improvement, patch, Band-Aid, or new flavor of what we already do. It represents a true shift in thinking about how devices, people and physical systems will be integrated and how they will interact. n.io greatly reduces the friction and challenge of converting existing IT systems into interoperable ones. It looks forward to a single, unified platform for interactions to which any PERSON, any THING, or any DATA source can contribute—one that liberates information interactions by interoperating with both relational and unstructured data and by abandoning strictly client-server computing.
n.io’s platform jumps ahead of the current noise and clutter around the Internet of Things by providing a truly extensible software platform that brings interoperable applied intelligence and automation to any asset, signal, event, message, feed or stream from any source, without requiring data storage. Its core innovation is a highly adaptive real-time signal processing platform that enables universal, systemic, distributed semantic intelligence, control, and automation.
The platform takes on the toughest challenges of interoperability, latency, database dependency and user complexity, delivering adaptable real-time intelligence and automation that empower users and developers to exploit the vast potential of technology and the Internet. But what does this really mean? n.io’s key design and architectural features include:
Asynchronous processing: no other platform can process and apply full context (temporal-spatial-state), with automation and actions, across multiple content types and sources, including streams, data repositories, physical inputs, and physical outputs
All processing at the stream: no other platform Harbor Research has evaluated is agnostic to data storage—all processing happens at the stream/feed/message/API level (whether on a chip, a device, or a server, n.io processes at the edge and/or in the stream), delivering true real-time performance with low overhead
Truly distributed semantic architecture: virtually no constraints on how it is deployed, whether on a chip, device, server, cloud, or hybrid of these.
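The design principles above—asynchronous processing, in-stream computation without mandated storage, and attaching context to each signal—can be illustrated with a generic sketch. To be clear, the code below is not n.io’s API; every class and function name here is a hypothetical stand-in. It simply shows, using Python’s asyncio, how a small asynchronous pipeline can stamp temporal context onto signals and act on them in flight, persisting nothing.

```python
import asyncio
import time
from dataclasses import dataclass, field

# Hypothetical sketch, NOT the n.io API: illustrates asynchronous,
# storage-free, in-stream processing of signals from an arbitrary source.

@dataclass
class Signal:
    source: str      # where the signal came from (sensor, feed, API)
    value: float     # the raw reading or message payload
    timestamp: float = field(default_factory=time.time)  # temporal context

async def sensor_feed(queue: asyncio.Queue, readings: list) -> None:
    """Emit readings as signals; stands in for any stream, feed, or message source."""
    for value in readings:
        await queue.put(Signal(source="sensor-1", value=value))
        await asyncio.sleep(0)  # yield control, as a real async source would

async def process_stream(queue: asyncio.Queue, threshold: float, actions: list) -> None:
    """Apply logic to each signal as it arrives; nothing is ever stored."""
    while True:
        signal = await queue.get()
        if signal is None:  # sentinel: the stream has closed
            break
        if signal.value > threshold:
            # In a real system this would drive an actuator or alert instead.
            actions.append(f"alert: {signal.source} reported {signal.value}")

async def main() -> list:
    queue: asyncio.Queue = asyncio.Queue()
    actions: list = []
    producer = asyncio.create_task(sensor_feed(queue, [10.0, 55.0, 20.0, 80.0]))
    consumer = asyncio.create_task(process_stream(queue, 50.0, actions))
    await producer
    await queue.put(None)  # close the stream
    await consumer
    return actions

if __name__ == "__main__":
    print(asyncio.run(main()))
```

The point of the sketch is architectural, not functional: the producer and consumer share only a queue, the decision logic runs as each signal passes through, and no database or file ever enters the picture.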
Viewed seriously, n.io’s platform begins to suggest that junking all current IT practice in one fell swoop would not be a bad idea. The pillars of present-day information technology will not crumble overnight, nor has the great existing investment in them suddenly lost all value. There are reasonable, technically and fiscally sane paths for migrating to the future. But migrate we must. The assumptions and practices of the mainframe and PC eras are now decades old and not suited to the smart systems era.