Physical distancing has blunted a virus's impact; the same idea can be applied to computers and networks to minimize breaches, attacks, and infections.

Trevor Pott, Product Marketing Director at Juniper Networks

May 29, 2020


Social distancing has recently entered the popular lexicon in a big way and, by now, we are all intimately familiar with the idea of keeping a safe distance from others to minimize health threats. There are many parallels between biological epidemiology and information security, and the concept of distancing as one layer in a multi-layered approach to protection doesn't apply only to in-person interactions: "digital distancing" applies the same principle to computers and networks.

As with social distancing, the basic concept behind microsegmentation is to eliminate as much unnecessary contact as possible. Most computers only need to talk to a very limited subset of other computers, and that's where microsegmentation comes in as the social distancing of computer systems.

How It Works
Microsegmentation improves data center security by controlling the traffic flowing into and out of each individual network connection. Ultimately, the goal of microsegmentation is to implement Zero Trust.

Done properly, microsegmentation is effectively a whitelist for network traffic. This means that systems on any given network can communicate only with the specific systems they need to communicate with, in the manner they are supposed to communicate, and nothing else. With connections and communications so regimented, microsegmentation is among the best protections we have today against lateral compromise.

Controlling traffic at the level of the individual network connection allows microsegmentation administrators to protect whatever is on the other end of that connection from whatever else is on the network. It also gives everything else on the network a basic level of protection from whatever might be on the other end of that connection.
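
To make the allow-list model concrete, the sketch below shows the kind of default-deny policy check that sits at the heart of microsegmentation. The tier names, ports, and rules are hypothetical illustrations, not any particular vendor's policy format.

```python
# Minimal sketch of an allow-list ("default deny") segmentation policy.
# The workload labels, ports, and rules are hypothetical examples.

from dataclasses import dataclass


@dataclass(frozen=True)
class Rule:
    src: str       # source workload or segment label
    dst: str       # destination workload or segment label
    protocol: str  # "tcp" or "udp"
    port: int      # destination port


# Whitelist: only these exact flows are permitted; everything else is dropped.
ALLOWED_FLOWS = {
    Rule("web-tier", "app-tier", "tcp", 8443),
    Rule("app-tier", "db-tier", "tcp", 5432),
    Rule("app-tier", "patch-repo", "tcp", 443),  # outbound security updates only
}


def is_permitted(src: str, dst: str, protocol: str, port: int) -> bool:
    """Default deny: a flow passes only if it matches an explicit rule."""
    return Rule(src, dst, protocol, port) in ALLOWED_FLOWS


# The web tier reaching the database directly is blocked outright, which is
# exactly the kind of lateral movement microsegmentation is meant to stop.
print(is_permitted("app-tier", "db-tier", "tcp", 5432))  # True
print(is_permitted("web-tier", "db-tier", "tcp", 5432))  # False
```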

This is a huge change from the "eggshell computing" model in which all defenses are concentrated at the perimeter (the eggshell) but everything behind that edge is wide open (the soft insides of the egg). Eggshell computing is ineffective; attackers have used lateral spread from an initial point of compromise for decades, and it is critical that east-west defenses exist in data centers alongside the more traditional north-south ones.

Can You Get Too Isolated?
In more advanced implementations, microsegmentation moves beyond what is basically just another firewall run by a different team and adds network overlays. With a combination of overlays and ACLs, it is possible to restrict all traffic in and out of a specific system so that only the systems that are supposed to receive a given flow can even see that traffic, let alone respond to it.
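
As a rough illustration of how an overlay narrows visibility, consider the sketch below: workloads attached to different overlay segments never even see one another's traffic. The segment names and VXLAN-style identifiers are hypothetical.

```python
# Rough sketch of overlay-based isolation: endpoints only exchange traffic if
# they sit on the same overlay segment. Names and VNIs here are hypothetical.

OVERLAY_SEGMENTS = {
    "payments-app": 10010,  # VXLAN-style virtual network identifier (VNI)
    "payments-db": 10010,   # same segment: these two can exchange traffic
    "hr-app": 10020,        # different segment: invisible to the payments systems
}


def same_segment(workload_a: str, workload_b: str) -> bool:
    """Traffic is deliverable only between endpoints on the same overlay segment."""
    seg_a = OVERLAY_SEGMENTS.get(workload_a)
    seg_b = OVERLAY_SEGMENTS.get(workload_b)
    return seg_a is not None and seg_a == seg_b


print(same_segment("payments-app", "payments-db"))  # True
print(same_segment("payments-app", "hr-app"))       # False: never even sees it
```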

In the real world, however, most systems cannot realistically be isolated such that they only communicate with peer systems in an east-west manner within the data center. At the very least, they must reach out to something, somewhere to get security updates. Well-designed microsegmentation systems offer the ability to place virtual – or, increasingly, containerized – firewalls at the edge of a given microsegment so that any traffic that leaves the segment passes through that firewall.

This approach allows a system to be isolated as much as is practicable – only systems which absolutely need to communicate among themselves are attached to a given network segment – while still offering routing beyond that segment. Passing traffic in and out of that segment through a firewall (or any other network security functions you wish to include) provides an additional – and increasingly necessary – level of protection that isn't easily achieved with only ACLs and network overlays. 
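
A simple way to picture the edge-firewall arrangement is as a next-hop decision: traffic that stays inside a segment is switched directly, while anything leaving the segment is steered through that segment's virtual firewall first. The segment map and firewall names below are hypothetical.

```python
# Sketch of a "firewall at the edge of the microsegment": intra-segment traffic
# flows directly, everything else is steered through the segment's virtual
# firewall for inspection. Hosts, segments, and firewall names are hypothetical.

SEGMENT_OF = {
    "app-01": "app-segment",
    "app-02": "app-segment",
    "db-01": "db-segment",
}

EDGE_FIREWALL_OF = {
    "app-segment": "vfw-app-edge",
    "db-segment": "vfw-db-edge",
}


def next_hop(src: str, dst: str) -> str:
    """Return where a packet from src to dst is forwarded first."""
    src_segment = SEGMENT_OF[src]
    if SEGMENT_OF.get(dst) == src_segment:
        return dst  # east-west traffic inside the segment goes straight there
    # Anything leaving the segment, including traffic to an external update
    # server, passes through the segment's firewall first.
    return EDGE_FIREWALL_OF[src_segment]


print(next_hop("app-01", "app-02"))           # app-02 (stays inside the segment)
print(next_hop("app-01", "db-01"))            # vfw-app-edge (inspected on the way out)
print(next_hop("app-01", "updates.example"))  # vfw-app-edge (north-south traffic)
```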

The ability to securely add a server or virtual machine anywhere on the network dramatically increases the flexibility of workload placement. Common experience with microsegmentation shows that adoption is frequently tied to the popularity of distributed applications. In some cases, demand for distributed applications drives the need to implement microsegmentation. In other cases, the availability of microsegmentation opens the door to distributed applications that weren't realistic before.

Distributed applications, like all applications, have varying levels of resiliency to failure. The widespread adoption of distributed applications can magnify the scope of impact of a switch failure because that switch may potentially be hosting parts of multiple applications or services. Redundancy is always a good plan in IT, but it gains new urgency when microsegmentation is deployed in earnest.

If You Must Make Changes, Make All Changes
Architecture and planning are key to successful microsegmentation deployments. If you haven't implemented microsegmentation before, ensure that your infrastructure can support significantly more microsegments than you think you're going to need, as growth of new functionality within an organization can be unpredictable. This means ensuring that all relevant components (and management software) can handle the scale you will require.
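
A back-of-the-envelope check like the one below can help validate that the chosen encapsulation and management plane leave plenty of headroom. The planned segment counts are hypothetical; the ID-space sizes are the standard protocol limits.

```python
# Rough capacity check: can the segment ID space hold far more microsegments
# than the current plan calls for? Planned counts below are hypothetical.

PLANNED_SEGMENTS = 600   # what today's application inventory appears to need
GROWTH_HEADROOM = 5      # plan for several times that; growth is unpredictable

SEGMENT_ID_SPACE = {
    "vlan": 4094,         # 12-bit VLAN IDs, minus the reserved values
    "vxlan": 16_777_216,  # 24-bit VNIs (roughly 16.7 million)
}

required = PLANNED_SEGMENTS * GROWTH_HEADROOM

for encapsulation, limit in SEGMENT_ID_SPACE.items():
    verdict = "fits" if required <= limit else "too small"
    print(f"{encapsulation}: need ~{required}, space for {limit} -> {verdict}")
```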

Network equipment – switches, routers, and virtual switches – typically has only a limited capability to filter, restrict, or encapsulate traffic. Deep Packet Inspection (DPI), SSL/TLS proxying, and many other information security capabilities still require traffic to pass through (or at least be mirrored to) more capable defenses, such as an enterprise-class firewall.
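
One way to think about that division of labor is shown below: basic allow/deny filtering can live in switch or vSwitch ACLs, while payload-aware functions need to be steered or mirrored to a fuller-featured firewall. The groupings are illustrative simplifications, not any product's feature list.

```python
# Illustrative split between what commodity switching can enforce and what
# must be handed off to a more capable firewall. Function names are examples.

SWITCH_CAPABLE = {"l3_l4_acl", "vlan_tagging", "vxlan_encapsulation"}
NEEDS_FIREWALL = {"deep_packet_inspection", "tls_proxying", "intrusion_prevention"}


def enforcement_point(security_function: str) -> str:
    if security_function in SWITCH_CAPABLE:
        return "switch or vSwitch ACL"
    if security_function in NEEDS_FIREWALL:
        return "steer or mirror to an enterprise-class firewall"
    return "unknown: confirm with the vendor"


for function in ("l3_l4_acl", "deep_packet_inspection", "tls_proxying"):
    print(f"{function}: {enforcement_point(function)}")
```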

Pay attention to what the ongoing management overhead of the proposed microsegmentation scheme looks like. This is also a good time to talk to the vendor about software-defined LANs (SD-LANs) because if you're going to upend your entire network management approach, you might as well get all the automation and orchestration handled at once. Chances are you won't be making a change this big for at least another decade.

Microsegmentation has a justified reputation for being difficult to implement, more than a little bit of a pain to manage and, as a result, rather expensive. It has been this way for years and, if implemented incorrectly, can still be so today.

No Longer a Luxury
Microsegmentation does not have to be a nightmare to implement, however. Well-planned implementations architected by experienced professionals can not only be successful but also significantly increase an organization's ability to respond to unexpected change, ultimately proving to be a financial benefit.

It is understandable you might not have implemented microsegmentation if you have a massive, sprawling network with decades of technical debt. But from an information security perspective, nobody should be deploying any new networks today without microsegmentation. Microsegmentation is no longer some niche, emerging feature. It should be considered a fundamental capability for both networking agility and information security today.

There is no end to the struggle between attacker and defender in the IT space, and attackers get better at rapidly spreading throughout a network every year. Minimizing contact between systems through digital distancing is an obvious way for organizations to reduce the scope of compromise when the inevitable compromise does happen.


About the Author

Trevor Pott is a Product Marketing Director at Juniper Networks. Trevor has more than 20 years of experience as a systems and network administrator. From DOS administration to cloud native information security, Trevor has deep security knowledge and a career that matches the diversity of the IT industry itself.

