
Network Automation Week, presented by Fierce Wireless, kicked off last week with Opengear as a gold sponsor. At the conference, I was honored to deliver the opening keynote, The Resilient Foundation of AIOps. As VP of Product Management & Strategy, I understand the journey to zero touch networking that many enterprises have undertaken. Zero touch networking is becoming a real possibility thanks to the increasing prevalence of virtual networks powered by AI and machine learning. These networks can make decisions with little to no human input, freeing skilled staff from monotonous tasks to focus on innovation while reducing costs and wasted time. Nevertheless, the move toward complex automated networks will not be without its challenges, chief among them the question of reducing complexity for machines.

What Really is Automation? 

The history of automation begins with Henry Ford and his Model T. Ford's assembly line was revolutionary because the heart of the idea was to reduce complexity: removing unnecessary steps, simplifying actions with tools, and dividing tasks among individuals. Despite these innovations, Japanese automakers would eventually surpass General Motors. By using fewer and fewer parts, the Japanese significantly minimized the margin for error and kept quality high. This automotive arms race illustrates the essence of optimizing automation: decreasing complexity.

Some mistakenly believe that the purpose of automation is to replace humans. While it may reduce human input, the primary objective of automation is to remove complexity and ambiguity. Contrary to popular belief, AI is not that intelligent, at least from a human perspective. AI is purely a means of getting a machine to learn inputs and produce outputs. Consider the famous Chinese Room Argument, which holds that machines have no understanding of meaning: they merely follow rules to arrive at outcomes, without knowing why they arrived there.

Machine learning is best compared to a human tinkering with weights on a scale, adding and subtracting until the desired outcome is achieved. It is also a lot like how the brain stores memories. One memory I have is sticking a screwdriver into a wall outlet and getting shocked, something I suspect we've all thought about doing. Instantly, synapses fired electrical impulses throughout my brain's neural network until they reached a threshold, and I learned: don't do that again or you'll get shocked. Learning is the creation of new linkages between neurons, and a similar process happens inside an artificial neural network.
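The weights-and-scale analogy can be made concrete with a minimal sketch: a single perceptron learning the logical AND function by nudging its weights after every mistake. The function names and learning rate here are illustrative, not taken from any particular framework.

```python
# A minimal sketch of "tinkering with weights": a perceptron learns
# the logical AND function by adding or subtracting a little weight
# after each wrong answer, until outputs match the desired outcome.
# All names and parameters here are illustrative.

def train_perceptron(samples, epochs=20, lr=0.1):
    """Nudge weights up or down until predictions match targets."""
    w = [0.0, 0.0]   # one weight per input, like weights on a scale
    b = 0.0          # bias term
    for _ in range(epochs):
        for x, target in samples:
            # "Fire" only if the weighted sum crosses the threshold
            output = 1 if (w[0] * x[0] + w[1] * x[1] + b) > 0 else 0
            error = target - output
            # Tinker: shift each weight in proportion to the error
            w[0] += lr * error * x[0]
            w[1] += lr * error * x[1]
            b += lr * error
    return w, b

def predict(w, b, x):
    return 1 if (w[0] * x[0] + w[1] * x[1] + b) > 0 else 0

# AND truth table: only the input (1, 1) should produce 1
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
```

After a handful of passes over the data, the weights settle into values that separate the one positive case from the three negative ones, the same trial-and-error adjustment the analogy describes.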

Self-Learning and Assisted-Learning 

There are two ways a machine can learn: self-learning and assisted learning. For self-learning, an engineer gives a machine a large set of data with structured rules and then lets it play out the inputs until it has taught itself. A real-world example is machines beating the best human Chess and Go grandmasters. The machines played against themselves over and over, exploring an enormous number of positions until, eventually, they became unbeatable.
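The self-play idea can be sketched on a toy game, assuming a much simpler setting than Chess or Go: two copies of the same value table play the subtraction game "take 1 or 2 tokens; whoever takes the last token wins" against each other until good moves emerge. The game, algorithm choice (tabular Q-values with Monte Carlo updates), and parameters are all illustrative.

```python
import random

# A toy sketch of learning by self-play: the machine plays the
# subtraction game against itself thousands of times, rewarding the
# winner's moves and punishing the loser's. Everything here (game,
# parameters, function names) is illustrative.

random.seed(0)

def train(n_tokens=10, episodes=20000, lr=0.5, epsilon=0.2):
    Q = {}  # (tokens_remaining, action) -> estimated value for the mover
    for _ in range(episodes):
        tokens, history = n_tokens, []
        while tokens > 0:
            actions = [a for a in (1, 2) if a <= tokens]
            if random.random() < epsilon:       # explore occasionally
                a = random.choice(actions)
            else:                               # otherwise play greedily
                a = max(actions, key=lambda a: Q.get((tokens, a), 0.0))
            history.append((tokens, a))
            tokens -= a
        # Whoever made the last move won; walk back through the game,
        # alternating the reward's sign for the two players.
        reward = 1.0
        for state, action in reversed(history):
            old = Q.get((state, action), 0.0)
            Q[(state, action)] = old + lr * (reward - old)
            reward = -reward
    return Q

def best_move(Q, tokens):
    actions = [a for a in (1, 2) if a <= tokens]
    return max(actions, key=lambda a: Q.get((tokens, a), 0.0))

Q = train()
```

With no human labeling at all, the table converges on the known winning strategy: always leave the opponent a multiple of three tokens.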

Assisted learning, however, is a teaching process. In preliminary testing of early U.S. drones over an army fort in Texas, the operators quickly realized that the drones couldn't differentiate between cows and tanks based on thermal imagery. Initially, the operators manually identified each cow and tank for the machine. As in the weights-and-scale analogy, the machine's algorithm gradually learned to distinguish the two with increasing precision.

Despite advancements in technology, machines still have their limitations. Our complex human world is a stumbling block for them. For instance, a truly autonomous car can be overwhelmed by sensory overload. Machines struggle with the kinds of complexity found in society and the real world, and network automation is no exception. To successfully use automation to enhance network processes, we must also understand the fundamental importance of network resilience.

The Foundation of a Resilient Network 

A console server is the Taylorization of network operations: a tool that enables a network engineer to do two things. First, engineers don't have to log into each individual box; instead, they can go into the console server and access anything connected to it. Second, a console server provides presence and proximity, meaning a highly skilled engineer can connect to and manage critical infrastructure remotely. Zero-touch provisioning is an excellent example.

By leveraging zero-touch provisioning, administrators can automate repetitive tasks and reduce the number of human touchpoints, which lessens the frequency of errors and helps deployments scale. The console server is essentially another specialized tool for the network engineer, much like the tools designed for assembly line workers in Ford's factories. The console has evolved from a person-to-machine interface into a machine-to-machine interface that allows increased automation, including AI tools that boost efficiency while keeping humans in the loop. And although automation has many benefits, such as reducing errors, it is also a double-edged sword: someone could accidentally automate an error that takes down an entire network. Organizations must minimize complexity so that machines can help humans do what we do best – being creative.
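The zero-touch provisioning flow described above can be sketched as follows, under the assumption of a simple serial-number inventory. The device records, templates, and function names are all hypothetical; real ZTP systems typically combine DHCP options, a file server, and vendor-specific boot agents.

```python
# A hedged sketch of the zero-touch provisioning idea: when a new
# device boots, it is matched to a configuration template by serial
# number and configured with no human logging in. Inventory entries
# and templates here are hypothetical.

CONFIG_TEMPLATES = {
    "access-switch": "hostname {hostname}\nvlan 10\ninterface uplink\n",
    "edge-router": "hostname {hostname}\nrouter bgp 65000\n",
}

INVENTORY = {
    # serial number -> (role, hostname), prepared before deployment
    "SN-0001": ("access-switch", "sw-branch-01"),
    "SN-0002": ("edge-router", "rtr-branch-01"),
}

def provision(serial):
    """Render the configuration a just-booted device should receive."""
    if serial not in INVENTORY:
        # Unknown hardware: refuse rather than guess; a human decides.
        raise KeyError(f"serial {serial} not in inventory")
    role, hostname = INVENTORY[serial]
    return CONFIG_TEMPLATES[role].format(hostname=hostname)
```

Because the mapping from serial number to configuration is decided once, up front, every subsequent device comes up identically, which is exactly the touchpoint reduction the paragraph describes.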

Nonetheless, no matter how much automation, virtualization, or AI tooling we add to a system, managing the network through the network itself is futile. If there is congestion, a misconfiguration or an outage, there is no way to reach the network to make repairs. To successfully deploy AIOps, organizations need an out-of-band network separate from the primary network, so that as issues arise, engineers have the access they need to make repairs. That model is true network resilience. And while out-of-band has historically been associated with emergency use only, the crux of its capability is that it extends network management from the network core to the edge.
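The out-of-band principle reduces to a simple decision: prefer the production path, but keep an independent path that still works when the production network does not. The sketch below illustrates only that decision; the hostnames and access methods are hypothetical stand-ins for real SSH or console-server sessions.

```python
# A sketch of the out-of-band principle: manage a device over the
# production network when it is reachable, and fall back to a
# separate out-of-band path when it is not. All names are
# hypothetical placeholders.

def reach(device, primary_up):
    """Return (path, access hint) for managing a device."""
    if primary_up:
        return ("in-band", f"ssh {device}.prod.example.net")
    # The primary is down (outage, misconfig, congestion); the
    # physically separate out-of-band network still offers a way in.
    return ("out-of-band", f"console-server serial port for {device}")
```

The key design point is that the fallback path shares no dependencies with the primary network; if it did, the same outage could take out both.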

Network Automation Tomorrow 

Looking toward the future, the two things dominating headlines are 5G and automation. 5G will economically replace wireline access, increase bandwidth and open new capabilities for out-of-band access. Automation will ensure reliability, enable scaling and form the basis for deploying new technologies like zero touch. As the network becomes more advanced and more central to all business processes, organizations need to ensure resilience. They also have to ask themselves whether they are willing to put a machine in control of the network. A machine can manage simple, low-complexity tasks, but for more complex functions it should make a recommendation and leave the final decision to highly skilled humans.