
EPIC APPLICATION FAILURE

App Failure

For the Iowa Caucus of 2020, Democratic Party operatives decided to build an app. This app would collect voting results from around the state and send them back to headquarters. The operatives were aware of the hard-won best practices for building software. But they chose to ignore those rules, and that choice led to an epic app failure.

They didn’t research the people who would use their tool. They didn’t pay attention to security issues. They didn’t do extensive real-world testing. In fact, they kept the app’s development secret, then sprang it on users at the last minute. When the users didn’t understand how it worked, they were told to “play around with it.”

The app failed spectacularly and the Iowa Caucus became a national embarrassment. Many voices around the Net started saying: You can’t conduct elections online. Our elections are sacred, and the online environment can’t be trusted.

Our devices require millions of lines of code

Yet our banking system and securities markets are already fully digital. The same is true of most factories and weapons systems. There’s no reason digital systems can’t be used in our elections as well.

And yet the misgivings of ordinary people are understandable. They grasp that digital reality is too small and too fast for direct human perception, so there’s no intuitive basis for trusting it. Major corporations lose control of our data all the time. That’s bad enough when a person’s online identity is compromised. Imagine if we couldn’t know whether our sacred elections were valid.

Information Islands

Our key enabling information technologies are seen as inevitable and unquestionable. And our client-server model compounds the problem. The information in our systems is machine-centric. Its life is tied to physical servers that can easily go offline or become extinct altogether.

What we really have is a huge collection of information-islands. Even if each island remains reliably available, it is still fundamentally incapable of truly interoperating with the others.

IoT systems architecture looks strangely familiar because it is. It’s organized exactly like client-server computer systems. Our vaunted IoT platforms are really just “data traps.” We can build bridges between them, but they remain islands because that’s what they were designed to be.
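To make the “data trap” problem concrete, here is a minimal sketch in Python. Everything in it is hypothetical: the payload shapes, field names and bridge function are invented for illustration, not drawn from any real platform. The same physical reading gets trapped in two incompatible shapes, and every pairing of platforms needs its own hand-written bridge, which also quietly loses whatever one side never captured.

    # A toy illustration of why bridged platforms remain islands. The payload
    # shapes, field names, and bridge function are hypothetical, not any
    # vendor's real API.
    from datetime import datetime, timezone

    # Hypothetical payload from "Platform A" for a temperature reading
    platform_a_reading = {
        "deviceId": "sensor-42",
        "ts": 1696502400,               # Unix seconds
        "metrics": {"temp_f": 72.5},    # Fahrenheit, nested under "metrics"
    }

    # Hypothetical payload from "Platform B" for the same kind of reading
    platform_b_reading = {
        "source": {"id": "sensor-42", "site": "building-7"},
        "timestamp": "2023-10-05T10:40:00+00:00",  # ISO 8601 string
        "temperature_c": 22.5,                     # Celsius, at the top level
    }

    def bridge_a_to_b(a: dict) -> dict:
        """One of the many point-to-point bridges each pair of 'islands' requires."""
        return {
            # Platform A never captured the site, so the bridge has to guess
            "source": {"id": a["deviceId"], "site": "unknown"},
            "timestamp": datetime.fromtimestamp(a["ts"], tz=timezone.utc).isoformat(),
            "temperature_c": round((a["metrics"]["temp_f"] - 32) * 5 / 9, 2),
        }

    print(bridge_a_to_b(platform_a_reading))

Multiply that one translator by every pair of platforms and every device type, and the bridging work grows faster than the value it delivers.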

Our platforms and infrastructure are not well-aligned

Peer-to-Peer

However, we can now begin to imagine a different kind of application environment. It will feature widely diverse operational technology (OT) computing devices, with applications dispersed across sensors, actuators and other intelligent devices. These applications will share and leverage the compute power of a whole “herd” of much smaller, smarter machines.

In smart buildings, for example, the processor in an occupancy sensor can turn the lights on, change the heating or cooling profile, and alert security. In this evolving architecture, the network essentially flattens until the end-point devices are merely peers. A variety of applications can reside on one or more OT computing devices.
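As a rough sketch of that flattened interaction, consider the toy Python event bus below. It stands in for a real peer-to-peer OT fabric such as an MQTT or DDS bus; the topic name and the three handler functions are invented for illustration. A single occupancy event is published once, and the lighting, HVAC and security applications react to it independently, with no central server orchestrating them.

    from collections import defaultdict
    from typing import Callable

    # A toy in-process event bus standing in for a real peer-to-peer OT fabric
    # (e.g., an MQTT or DDS bus). Topic and handler names are illustrative only.
    _subscribers = defaultdict(list)

    def subscribe(topic: str, handler: Callable[[dict], None]) -> None:
        _subscribers[topic].append(handler)

    def publish(topic: str, event: dict) -> None:
        for handler in _subscribers[topic]:
            handler(event)

    # Three peer applications reacting to the same occupancy event independently
    def lighting_app(event: dict) -> None:
        print(f"lighting: turning on lights in {event['zone']}")

    def hvac_app(event: dict) -> None:
        print(f"hvac: switching {event['zone']} to the occupied comfort profile")

    def security_app(event: dict) -> None:
        print(f"security: logging presence in {event['zone']} at {event['time']}")

    subscribe("building/occupancy", lighting_app)
    subscribe("building/occupancy", hvac_app)
    subscribe("building/occupancy", security_app)

    # The occupancy sensor publishes once; no central server orchestrates the peers
    publish("building/occupancy", {"zone": "floor-3-west", "time": "08:02", "occupied": True})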

It’s ironic that the movement toward peer-to-peer is regarded as novel. In fact, the Internet itself was originally designed for peer-to-peer interactions. We seem to be heading “back to the future.”

We’ve been saying this for years. Our computing systems are not designed for a world of pervasive information flow. They fall far short of enabling real-time intelligence based on data, relationships and interactions. Our applications are monolithic and force organizations to re-create the same data, relationships and interactions over and over again. This creates huge redundancies and vast replication.

Instead of Liberating Data, We Trap It

In the early days of telephones, all calls were connected by a live operator working at a switchboard. If telephone traffic had continued to grow in this way, everyone in the world would eventually have had to become a switchboard operator. That didn’t happen because we automated the systems that handled common tasks like connecting calls.

We have an analogous problem today with our tools for making products “smart.” These tools were not designed to handle the diversity of device data types, the scope of interactions, and the massive volume of data points that devices generate. Each new device requires too much customization and maintenance just to perform the same basic tasks. These are basic design mistakes, and they will prevent our systems from scaling and interacting when billions of nodes are involved.
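One common remedy, sketched below in Python, is a shared data-point model, so that each new device type needs only a small adapter rather than its own bespoke pipeline. The DataPoint fields and the two adapter functions are assumptions made for illustration, not a prescribed schema.

    from dataclasses import dataclass

    # A toy shared data-point model; the fields below are assumptions for
    # illustration, not a prescribed schema.
    @dataclass
    class DataPoint:
        device_id: str
        quantity: str    # e.g., "temperature", "occupancy"
        value: float
        unit: str
        timestamp: int   # Unix seconds

    # Each new device type needs only a small adapter into the shared model,
    # instead of its own bespoke pipeline for storage, rules and dashboards.
    def from_thermostat(raw: dict) -> DataPoint:
        return DataPoint(raw["id"], "temperature", raw["tempF"], "degF", raw["epoch"])

    def from_occupancy_sensor(raw: dict) -> DataPoint:
        return DataPoint(raw["sensor"], "occupancy", float(raw["present"]), "boolean", raw["t"])

    points = [
        from_thermostat({"id": "th-1", "tempF": 71.0, "epoch": 1696502400}),
        from_occupancy_sensor({"sensor": "occ-9", "present": True, "t": 1696502401}),
    ]
    for p in points:
        print(p)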

For years, we at Harbor have advocated for a transition to a truly distributed architecture. An internet of “things” that streams all of its data to a centralized “cloud” for processing will not work. It makes much more sense to execute these interactions at “the edge,” where the sensors and actuators themselves do the processing.
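The Python sketch below illustrates the idea. It assumes an invented alarm threshold and a placeholder send_upstream function in place of a real protocol: raw readings are processed next to the sensor, anomalies are forwarded immediately, and everything else is reduced to a compact summary before anything leaves the edge.

    import statistics

    # A toy edge-processing loop: raw readings are handled next to the sensor,
    # and only alarms plus a compact summary go upstream. The threshold and the
    # send_upstream stand-in are placeholders, not a real protocol.
    ALARM_THRESHOLD_C = 28.0
    raw_readings = [21.8, 21.9, 22.0, 22.1, 29.7, 22.0]  # degrees C, one burst from a sensor

    def send_upstream(message: dict) -> None:
        print("upstream <-", message)  # stand-in for a call to a remote cloud service

    def process_at_edge(readings: list) -> None:
        for value in readings:
            if value > ALARM_THRESHOLD_C:
                send_upstream({"type": "alarm", "value": value})  # forward anomalies immediately
        # Everything else is reduced to a summary before it leaves the edge
        send_upstream({
            "type": "summary",
            "count": len(readings),
            "mean": round(statistics.mean(readings), 2),
            "max": max(readings),
        })

    process_at_edge(raw_readings)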

We need a different approach to data and applications

The Future Wants to Break Out

Machine learning, artificial intelligence and the Internet of Things are all trying to break away from legacy computing paradigms. As devices become more intelligent, their data will behave like neurons in a brain, ants in an anthill, or human beings in a society.

The many “data nodes” of a network don’t have to be particularly “smart” in themselves. If they are networked so that they can connect effortlessly and interoperate seamlessly, they give rise to complex, system-wide behavior. An entirely new order of intelligence “emerges” from the system as a whole, intelligence that could not have been predicted by looking at any of the nodes individually. There’s a distinct magic to emergence, but it happens only if the network’s nodes and data are free to share information and processing power.
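A toy model helps make “emergence” less mysterious. In the Python sketch below (the node count, number of rounds and readings are arbitrary), each node knows only its own value and repeatedly averages it with one randomly chosen peer. No node ever computes the global average, yet the whole network converges on it.

    import random

    # A toy model of emergence: each node holds one local reading and repeatedly
    # averages it with a randomly chosen peer. No node ever computes the global
    # average, yet the whole network converges on it. All parameters are arbitrary.
    random.seed(1)
    values = [random.uniform(0.0, 100.0) for _ in range(50)]  # one reading per node
    global_mean = sum(values) / len(values)

    for _ in range(2000):  # rounds of purely local "gossip"
        i, j = random.sample(range(len(values)), 2)
        values[i] = values[j] = (values[i] + values[j]) / 2

    print(f"global mean: {global_mean:.2f}")
    print(f"every node now holds roughly {min(values):.2f} to {max(values):.2f}")

The same pattern, local nodes sharing freely and system-wide behavior appearing that no single node was programmed to produce, is what pervasive, peer-connected devices make possible at scale. ◀︎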
