How Next-Gen Analytics and Verification Help Realize Resilient, Secure Networks

by Brighten Godfrey

At the ONUG Fall 2016 conference in New York, one theme struck me: the community recognized, more than ever, the need for advanced analytics and verification. A poll of IT users at the event, for example, highlighted that the siloed nature of current monitoring solutions prevents IT teams from understanding the entire network, end to end.

Understanding how components interact is especially important as the layers of complexity pile up in network infrastructure. Networks are growing in scale, and modern enterprise environments are quite diverse, with dozens of vendors and models of routers, switches, firewalls, and other devices. Traditional gear is now augmented with hybrid cloud, software-defined data center overlay networks, SD-WAN, and more, each with disparate methods and data formats for configuration, monitoring, and troubleshooting. Because all this complexity exceeds what any human can confidently understand, it presents a barrier to agility. Changes rely heavily on manual procedures for implementation and for validating correctness, so any change might lead to an unintended outage or vulnerability.

The ONUG Monitoring and Analytics Working Group has been studying how to tackle this challenge. The first step is to promote open formats for exchanging information about the network – including not only traffic monitoring, but also device state information (forwarding tables, interface states, device CPU usage, and so on). Providing awareness of the reality of the network, in a way that’s normalized across all kinds of infrastructure and vendors – on-premises and in the cloud – is key to breaking out of the siloed world into network-wide understanding. And it enables an ecosystem where advanced analytics that consume the data can flourish.
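
To make that concrete, here is a minimal sketch of what a normalized device-state record might look like once vendor-specific output has been translated into a common representation. The schema and field names below are illustrative assumptions (expressed as a Python data structure), not an ONUG or vendor format:

    # Hypothetical normalized snapshot of one device's state; the schema is
    # illustrative only, not an actual ONUG, Veriflow, or vendor format.
    normalized_snapshot = {
        "device": {"name": "edge-fw-01", "vendor": "example-vendor", "role": "firewall"},
        "interfaces": [
            {"name": "eth0", "state": "up", "ipv4": "10.0.1.1/24"},
            {"name": "eth1", "state": "down", "ipv4": None},
        ],
        "forwarding_table": [
            {"prefix": "10.0.2.0/24", "next_hop": "10.0.1.254", "interface": "eth0"},
            {"prefix": "0.0.0.0/0", "next_hop": "10.0.1.254", "interface": "eth0"},
        ],
        "health": {"cpu_percent": 37.5, "collected_at": "2016-11-01T14:00:00Z"},
    }

Once every device – physical or virtual, on-premises or in the cloud – can be described in one common form like this, analytics tools no longer need a separate adapter for every vendor’s CLI or API, and network-wide analysis becomes tractable.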

A great example of the opportunity and pressing need for analytics and verification of the network is the work of another ONUG effort, the Software-Defined Security Services Working Group. Based on IT users’ votes, the group identified the top three needs for software-defined security services. One of those key requirements is verification: essentially, the ability to prove that the network’s confidentiality, integrity, and availability policies are actually realized. Such verification is particularly difficult because the new reality is that the threat landscape spans hybrid cloud, virtual overlays and physical underlays, containers, VMs, and multi-vendor products. In fact, new technologies like SDN overlays and hybrid cloud have in some ways made the network more opaque and increased risk. It can be difficult to correlate the overlay with the underlay, and network automation will amplify any mistakes that do occur – as exemplified by Amazon’s recent outage, where a typo took down many websites. The result is high complexity, poor visibility, and a lack of auditability – yet in this environment, enterprises need to assure, more than ever, that segmentation and micro-segmentation policies are correctly realized.
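
To give a flavor of what a machine-checkable statement of such a policy could look like, here is a purely illustrative sketch in Python; the segment names and rule format are assumptions, not any standard or product’s policy language:

    # Purely illustrative segmentation intent; segment names, fields, and
    # semantics are hypothetical, not any standard or vendor policy language.
    segmentation_intent = [
        {"from": "guest-wifi", "to": "cardholder-data", "allowed": False},
        {"from": "web-tier",   "to": "db-tier",         "allowed": True, "ports": [3306]},
        {"from": "dev-vpc",    "to": "prod-vpc",        "allowed": False},
    ]

Writing the intent down is the easy part; the hard part is proving, continuously and across every overlay, underlay, and cloud boundary, that the deployed network actually obeys it.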

What’s exciting is that novel technology is rising to meet this challenge. Continuous network verification, for example, builds a network-wide, vendor-independent model of the real network and mathematically verifies that the reality of the network matches the business intent. This new way of analyzing the network predictively (rather than taking a wait-and-see approach) reduces the risk of outages, accelerates incident response, and rigorously assures security.
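
As a highly simplified illustration of the idea – a toy sketch over made-up forwarding state, not how any particular product works – verification amounts to computing from collected device state where traffic can actually go, and comparing that against the intended policy:

    from collections import deque
    from ipaddress import ip_address, ip_network

    # Toy forwarding state: device -> list of (prefix, next hop); a next hop of
    # None means the prefix is delivered locally. Topology and addresses are made up.
    forwarding = {
        "access-sw": [("10.1.0.0/16", "core-rtr")],
        "core-rtr":  [("10.1.1.0/24", "web-sw"), ("10.1.2.0/24", "db-sw")],
        "web-sw":    [("10.1.1.0/24", None)],
        "db-sw":     [("10.1.2.0/24", None)],
    }

    def can_reach(start_device, dest_ip):
        """Trace longest-prefix-match forwarding hop by hop; True if dest_ip is delivered."""
        dest = ip_address(dest_ip)
        seen, queue = set(), deque([start_device])
        while queue:
            device = queue.popleft()
            if device in seen:              # guard against forwarding loops
                continue
            seen.add(device)
            best = None
            for prefix, next_hop in forwarding.get(device, []):
                net = ip_network(prefix)
                if dest in net and (best is None or net.prefixlen > best[0].prefixlen):
                    best = (net, next_hop)
            if best is None:
                continue                    # no matching route: traffic is dropped here
            if best[1] is None:
                return True                 # delivered locally at this device
            queue.append(best[1])
        return False

    # Intended policy: the access segment must never reach the database segment.
    if can_reach("access-sw", "10.1.2.10"):
        print("Violation: access segment can reach the database segment")
    else:
        print("Segmentation intent holds for this flow")

A real continuous verification system does this exhaustively, for every possible packet and path across the full multi-vendor, overlay-and-underlay environment, but the core idea is the same: predict behavior from a model of collected state rather than waiting for an incident to reveal a problem.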

The industry has made great advances in agile control of the network, through technology like software-defined data centers and cloud. But if we automate control, we need to automate the whole “control loop” – both implementation and verification – particularly in the complex brownfield environments that are the reality. Closing that loop with next-generation analytics lets us take a leap towards secure, resilient networks focused on achieving the network-wide intent.


Author bio 

Brighten Godfrey

Veriflow

Dr. Godfrey has conducted research in networked systems and algorithms for more than a decade, and is a co-inventor of key technology in use at Veriflow. His work has developed novel architectures and systems for Internet routing, data center networking, high-performance data transport, and network data plane verification, and has advanced the theoretical analysis of network algorithms. He has co-authored over 50 scientific publications, and several of these technologies have been deployed by hyperscale cloud computing providers.

In 2015, Dr. Godfrey received the ACM SIGCOMM Rising Star Award in recognition of outstanding research contributions, including contributions to network verification. Only one individual worldwide is selected for this prestigious award each year. Dr. Godfrey was awarded the Alfred P. Sloan Research Fellowship in 2014, and has also received the UIUC Dean’s Award for Excellence in Research, the National Science Foundation CAREER Award, and the Internet2 Innovative Application Award, in addition to several best paper awards. He was a Beckman Fellow at the UIUC Center for Advanced Study in 2014-2015, and has served as program committee chair of several academic conferences. Dr. Godfrey continues to advise young researchers in his role as Associate Professor of Computer Science at the University of Illinois at Urbana-Champaign, and is co-instructor of a popular Coursera course, “Cloud Networking”.

He holds a Ph.D. in Computer Science from the University of California, Berkeley.
