Category Archives: Gigamon

The Case for Network Visibility

Category : Gigamon

As a security professional and a consumer, my ears perk up when I hear about security breaches in the daily news. My first thought is, “Has my personal data been compromised?” (most of us initially react with emotion and self-interest), and then I ponder how the solutions from my company, Gigamon, could be applied to prevent such breaches in the future. I also look at what security experts, analysts, reporters and other influencers around the industry are saying.

Companies that have suffered serious breaches have invested heavily in security. Reports I’ve seen state that, in many of these instances, significant investments have been made in firewalls, intrusion prevention systems, malware protection and a host of other security solutions. These companies are doing their best – as most organizations do – to secure business-critical data and the personally identifiable information of their customers. So why is it so hard to stop these attacks? What are cybersecurity operations teams missing? How could they rethink cybersecurity to address the modern-day threat landscape?

From my perspective, a totally new and different security approach is required, one that goes beyond the traditional “buy more tools” approach, which is not only becoming cost-prohibitive but also creates inefficiencies and hinders performance. All signs point to the fact that consistent and concerted attention to visibility, rather than prevention, is the key to robust network security.

The exponential growth of data traveling through enterprise networks means that instead of investing in more tools, organizations must invest in and implement technology that detects and analyzes data-in-motion and sends only the necessary data to the nearest available set of security tools, such as the firewall or intrusion prevention system, for processing. This type of approach levels the playing field and changes the equation from “man fighting against machine” (since the attacks are likely coming from well-appointed systems in use by hackers and nation states) to “machine vs. machine.” This approach is eloquently explained in the Defender Lifecycle Model proposed by my friend and colleague, Shehzad Merchant, and is also proposed, at least in theory, by a recent Gartner research report entitled “Use a CARTA (continuous adaptive risk and trust assessment) Strategic Approach to Embrace Digital Business Opportunities in an Era of Advanced Threats.”

The harsh new reality is that cyberattacks and data breaches are inevitable. And while there is not yet a perfect approach, it’s essential that enterprises shift their approach to add pervasive visibility to their traditional prevention measures – alongside detection, prediction and containment – to improve the security of their applications and the business-critical and personal data traversing their networks.

With detection and response integrated into security operations, today’s businesses gain a strategic advantage in the fight against the massive volume of network cyber threats that exists in this brave new world. And that is a major step forward in shifting control and advantage away from malicious attackers and back to defenders.

Source: https://blog.gigamon.com/2017/09/15/case-network-visibility/

Author: Graham Melville


Visibility is Essential

Category : Gigamon

Vanson Bourne Report: Lack of Visibility is a Leading Obstacle to Securing Enterprise and Cloud Networks

Lack of visibility is leaving organizations struggling to identify network data and investigate suspicious network activity tied to malicious attacks. Sixty-seven percent of respondents cited network blind spots as a major obstacle to security:

  • Monitoring and security tools are stressed by the increasing speed and growth of network traffic.
  • High-value information is being migrated to the cloud, where visibility is limited and application data is not easily accessible.
  • A large amount of network data remains hidden due to data and tools still being segmented.

Executive Summary

Learn the root causes of lack of visibility and their impact on your network security.

Vanson Bourne Report

Get insights from surveyed global IT leaders about network security on premises and in the cloud.

Source: https://www.gigamon.com/campaigns/vanson-bourne.html


All Killer, No Filler: How Metadata Became a Security Superpower at Gigamon

Category : Gigamon

Gigamon on Gigamon: Learn how we leverage the power of metadata to proactively, intelligently and efficiently defend our own networks.

Network data is growing in volume, speed and variety. It seems to be everywhere – in our data centers, remote offices, virtual machines (VMs) and the cloud. Does this sound familiar? No wonder it’s getting harder and harder to stay on top of cybercrime.

Under today’s conditions, distinguishing good traffic from bad is not only costly, it’s become almost impossible to do. According to a recent survey by independent market research firm Vanson Bourne, 72 percent of IT decision-makers haven’t scaled their network infrastructure to address growth in data volume, and more than two-thirds say that network blind spots are a major obstacle to data protection.

For most InfoSec teams, there’s simply too little time and too few resources available to efficiently correlate the information needed to make accurate predictions on potential security threats. As an InfoSec engineer myself, I can certainly relate.

But, what’s the answer?  In my experience, leveraging metadata can solve many of these problems.

Several years ago, here at Gigamon, we decided to augment our packet-based monitoring tools – which were overwhelmed with the volume of traffic – with metadata analysis using our own Gigamon Visibility Platform. Yes, we drink our own champagne! This allows us to triangulate on threats in a more intelligent, efficient way.

Making this transition has resulted in major benefits to our business:

  1. Reduction in false positives by filtering out “noise.”
  2. Faster time to threat detection through proactive, real-time traffic monitoring versus reactive forensics. We generate, monitor and analyze traffic metadata in real time from literally anywhere on our network – north/south and east/west.
  3. Greater leverage of our small – but mighty! – security team.

The end result is a comprehensive security posture and a level of cost-effective protection against cyber threats that other organizations simply do not have.

How did we do it?

Well, we started by using our Gigamon Visibility Platform to generate metadata off the wire and then feed it to our Security Information and Event Management (SIEM) solution. Predefined, high-fidelity correlation searches in our SIEM help us quickly identify patterns. Here are three real-world examples of searches we implemented:

1) Unusual patterns in HTTP response codes. 

Seeing numerous HTTP 404 errors, for example, helps us quickly identify infected machines that may be attempting to contact command-and-control hosts. It could also indicate a misconfigured web server or some other issue – crucial information for the WebOps team – that is preventing online business from being transacted.
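
To make the logic concrete, here is a minimal sketch in Python (rather than in any particular SIEM’s query language) of what such a search boils down to. The record format and the field names (src_ip, status) are illustrative assumptions, not the actual metadata schema.

    from collections import Counter

    def flag_excessive_404s(records, threshold=50):
        """Count HTTP 404 responses per client and flag unusually noisy sources."""
        counts = Counter(r["src_ip"] for r in records if r.get("status") == 404)
        return {ip: n for ip, n in counts.items() if n >= threshold}

    # Hypothetical metadata records exported from a monitoring pipeline.
    sample = [
        {"src_ip": "10.0.0.5", "status": 404},
        {"src_ip": "10.0.0.5", "status": 404},
        {"src_ip": "10.0.0.9", "status": 200},
    ]
    print(flag_excessive_404s(sample, threshold=2))  # {'10.0.0.5': 2}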

2) Specific domain(s). 

During the WannaCry outbreak, we searched for and quickly found the kill-switch domain – an indication that there was a potential breach. Disaster averted! Not a single machine was affected.
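
A domain watchlist search like this one boils down to a simple membership test over hostname metadata. The sketch below is a hypothetical Python illustration: the placeholder domain and the field names (src_ip, hostname) are assumptions, not the real kill-switch domain or an actual schema.

    WATCHLIST = {"killswitch-domain.example"}  # placeholder, not the real domain

    def hosts_contacting_watchlist(records):
        """Return internal hosts whose HTTP/DNS metadata references a watchlisted domain."""
        return sorted({r["src_ip"] for r in records if r.get("hostname") in WATCHLIST})

    events = [{"src_ip": "10.0.0.7", "hostname": "killswitch-domain.example"}]
    print(hosts_contacting_watchlist(events))  # ['10.0.0.7']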

3) SSL certificate issuer.

We recently generated alerts that identified users navigating to sites signed by the distrusted WoSign SSL certificates. We were able to quickly identify users attempting to access these potentially spoofed sites, an otherwise extremely difficult and time-consuming process.
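
Again, the underlying search is a straightforward match on certificate-issuer metadata. Here is a hedged Python sketch; the field names (src_ip, cert_issuer) and the example issuer string are assumptions made for illustration.

    DISTRUSTED_ISSUERS = ("WoSign",)

    def sessions_with_distrusted_issuer(records):
        """Return (client, issuer) pairs for TLS sessions signed by a distrusted CA."""
        return [
            (r["src_ip"], r["cert_issuer"])
            for r in records
            if any(ca in r.get("cert_issuer", "") for ca in DISTRUSTED_ISSUERS)
        ]

    tls_events = [{"src_ip": "10.0.0.3", "cert_issuer": "CN=WoSign Example CA"}]
    print(sessions_with_distrusted_issuer(tls_events))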

By now, hopefully it’s clear to you how powerful metadata can be. It really is “all killer, no filler.” If you’d like to learn more about how metadata became a security superpower for the InfoSec team at Gigamon, I invite you to register for my upcoming webinar: “All Killer, No Filler: How Metadata Became a Security Super Power at Gigamon.”

Source: https://blog.gigamon.com/2017/08/30/all-killer-no-filler-how-metadata-became-a-security-superpower-at-gigamon/

Author: Jack Hamm


Why Pervasive Visibility is So Important

Category : Gigamon

Please join us as Gigamon CMO Kim DeCarlis walks you through the critical importance of pervasive visibility on your network and how to achieve it.

We’ll review the key differentiators you should be looking for when evaluating network monitoring solutions and why Gigamon was recently named the #1 network monitoring tool by IHS Technology.

Attendees will learn:

  • Why visibility is key to any type of security, monitoring or analysis solution.
  • Valuable analyst insights from IHS Technology on market size, market share, forecasts and market trends in the Network Monitoring Equipment Market to help you make better business decisions.
  • How Gigamon customers benefit from ease of deployment, maximized efficiency of security tools, platform reliability and increased resilience of their architecture.

If these security topics and use cases are valuable to you, you won’t want to miss this webinar!

Register


No More Network Blind Spots: See Um, Secure Um

Category : Gigamon

East Coast summer nights of my childhood were thick with humidity, fireflies and, unfortunately, merciless mosquitoes and biting midges. So, when a West Coast friend said she had a summertime no-see-um tale to tell, I was ready to commiserate.

My friend likes to camp – alone. Not in deep, dark, remote backcountry, but, you know, at drive-in campgrounds. Pull in, pitch a tent, camp – that’s her style. While it’s not the most private arrangement, she likes the proximity to restrooms and even people.

Before one adventure, she was gathering provisions at Costco when she saw a “no-see-um” tent for sale. “Well, this is exactly what I need,” she thought. No longer would she have to lower her “shades” or head to the restroom to change. She’d be free to undress in her tent, relax and fall asleep to the hum of an adjacent freeway.

Of course, we can all figure out how this story ended. After having enjoyed her newfound freedom for an evening, she returned the following morning from a visit to the loo only to realize the naked truth.

Like a Good Boy Scout, Are You Prepared?

While my friend’s false sense of security bordered on the ridiculous – okay, it was ridiculous – it speaks to the potential for misjudging cybersecurity readiness. Her problem was that she felt secure when she wasn’t – a blind spot of sorts that could have led to more than just awkward consequences.

In a way, the same holds true for enterprises that have bought innumerable security tools – perimeter firewalls, endpoint antivirus, IPSs – to keep prying eyes out. They, too, often have a false sense of security. Unlike my friend, it’s not that they don’t understand how these tools work; rather, it’s that they don’t understand that these tools cannot provide complete network protection.

There are simply too many bad guys and too little time to detect and prevent all cyberattacks. Not only is malware everywhere – for example, zero-day exploits and command-and-control infrastructures are available for purchase at a moment’s notice by anyone with a computer and the desire to wreak havoc – but with data flying across networks at increasing speeds and volumes, it’s more and more difficult for enterprises to do any intelligent analysis to uncover threats and prevent attacks from propagating across core systems.

Detecting compromises is hard. It requires monitoring a series of activities over time, and security tools only have visibility into a certain set of activities – most cannot see and comprehend the entire kill chain. This incomplete view is more than problematic – it’s dangerous.

In fact, according to 67 percent of respondents to a new Vanson Bourne survey, “Hide and Seek: Cybersecurity and the Cloud,” network blind spots are a major obstacle to data protection. The survey, which polled IT and security decision-makers on network visibility and cloud security preparedness, also revealed that 43 percent of respondents lack complete visibility into all data traversing their networks and half lack adequate information to identify threats. By all counts, such data blindness could have serious security implications – not only within enterprise environments, but also in the cloud, where 56 percent of respondents are moving critical, proprietary corporate information and 47 percent are moving personally identifiable information.

See the Forest and the Trees

Sometimes we apply an available tool because it sounds like it’ll do the job – ahem, my dear friend and her no-see-um tent – but fully understanding the purpose and assessing the efficacy of your security tools isn’t a minor detail to be overlooked. Enterprises that have been buying more tools to address the security problem are beginning to question whether they are getting the right return on their investments, especially when they have no means to measure how secure they are. To further complicate matters, more tools often increase the complexity of security architectures, which can exacerbate the data blindness issue.

So, what can be done? For sure, preventative solutions shouldn’t go away – they play a critical role in basic security hygiene and protecting against known threats – but they must be augmented with solutions for better detection, prediction and response in a way that doesn’t create more blind spots. In other words, enterprises need a new approach founded on greater visibility and control of network traffic, one that helps increase the speed and efficacy of existing security tools and allows them to say, “Okay, this is where my investments are going and these are the gaps I need to address to become more secure, or even to identify whether it’s possible to become more secure at all.”

If you’re unsure how secure your network is, maybe start with a few simple questions:

  • Can you see into all data across your network? Or does some data remain hidden due to silos between network and security operations teams?
  • Are your security tools able to scale for faster speeds and increased data volume? Without diminishing their performance?
  • What about your cloud deployments – are they being used securely? Is there clear ownership of cloud security?

Source: https://blog.gigamon.com/2017/08/16/no-network-blind-spots-see-um-secure-um/

Author: Erin O’Malley


Gigamon IT Survey Highlights Lack of Visibility as a Leading Obstacle to Securing Enterprise and Hybrid Cloud Networks

Category : Gigamon

Over two-thirds of IT decision-makers cite blind spots as a major obstacle to data protection

Gigamon, the industry leader in traffic visibility solutions, today announced the results of a commissioned survey, “Hide and Seek: Cybersecurity and the Cloud,” conducted by Vanson Bourne, an independent market research company. The survey polled information technology (IT) and security decision-makers in the U.S., the U.K., Germany and France about their cloud security preparedness and network visibility issues.

The results of this survey demonstrate that lack of visibility is leaving organizations struggling to identify network data and investigate suspicious network activity tied to malicious attacks. Sixty-seven percent of respondents cited network “blind spots” as a major obstacle to effective data protection, while 50 percent of those who do not have complete visibility of their network reported that they lacked sufficient information to identify threats.

Survey findings pinpoint three root causes of data blindness that are posing network security risks:

  • The increasing speed and growth of network traffic stresses monitoring and security tools, which are not adept at handling large amounts of traffic. Seventy-two percent of respondents report that they have not scaled their monitoring and security infrastructure to meet the needs of increased data volume.
  • High value information is being migrated to the cloud, where visibility is limited and application data is not easily accessible. Eighty-four percent of respondents believe that cloud security is a concern holding their organization back from adopting the latest technologies. When asked what types of information they are moving to the cloud, 69 percent of respondents reported day-to-day work information and 56 percent cited critical and proprietary corporate information.
  • A large amount of network data remains hidden due to data and tools still being segmented by organizational boundaries. IT and security decision-makers are not able to quickly identify and address threats and security events. Seventy-eight percent of respondents report that because different network data is used by the NetOps and SecOps teams, there is no consistent way of accessing or understanding it. Forty-eight percent of respondents who do not have complete visibility over their network report that they do not have information on what is being encrypted in the network.

“Today’s attackers have the advantage as cybercrime is a thriving economy and attacks are focused on infiltrating the network and stealing important company information,” said Ananda Rajagopal, vice president of products at Gigamon. “It is imperative for enterprises to adopt a visibility platform that provides visibility and control of their network traffic, and one that’s integrated with their security tools to accelerate threat detection and improve efficiencies.”

The Gigamon Visibility Platform directly addresses network “blind spots” by offering:

  • The most scalable visibility platform with up to 800Gbps of processing capability per node and up to 25.6Tbps when clustered, to meet the latest demands of the network.
  • Cross-architecture deployments on premises, in remote offices and in the cloud to securely migrate high-value information to public clouds.
  • An end to siloed, segmented data and tools. Monitoring and security tools access the same data, encrypted or not, so that network and security operators can consistently access and understand what matters.

Gigamon solves data blindness by providing security and network operations teams with the pervasive visibility and control to automate and accelerate threat detection for securing enterprises and hybrid clouds. Learn more about our Gigamon Visibility Platform and Gigamon Visibility Platform for AWS.

The independent survey was commissioned by Gigamon and administered by Vanson Bourne in May 2017. Respondents consisted of 500 IT and security decision-makers of organizations with over 1,000 employees. The regional representation of respondents includes 200 respondents in the U.S. and 100 respondents each in the U.K., France and Germany.

Additional Resources

  • Vanson Bourne survey overview page
  • “Hide and Seek: Cybersecurity and the Cloud” report presentation
  • “Hide and Seek: Cybersecurity and the Cloud” executive summary (U.S. results)
  • “Hide and Seek: Cybersecurity and the Cloud” executive summary (U.K. results)
  • “Leading Obstacle to Securing Enterprise and Hybrid Cloud Networks” instagraphic
  • Highlights of the Vanson Bourne survey blog

Source: https://www.gigamon.com/company/news-and-events/newsroom/gigamon-it-survey-highlights-lack-of-visibility-leading-obstacle-security-enterprise-hybrid-cloud-networks.html


Under Armour on Top of Network Security

Category : Gigamon

Real, hands-on experience always supersedes any marketing or communication campaign. It’s by listening to our customers and putting our hands on the product that we learn its real value.

At Cisco Live! US, I was lucky enough to present with Alex Attumalil, director of global cybersecurity at Under Armour and a long-time Gigamon customer. It gave us the unique opportunity to hear him share his experience with the GigaSECURE Security Delivery Platform.

Listen to the video recording of our presentation – and below, find a few golden nuggets from what was a dynamic and thought-provoking discussion for all involved:

In high-growth environments, it is tough for security and network teams to stay in control of their architecture and tools.

IT teams typically compensate for scale-up by adding more tools or devices to the network. This results in a very complex environment and unexpected security blind spots.

The GigaSECURE Security Delivery Platform helps to simplify security architectures, scale the capacity of existing tools and address security flaws. Alex shared the unique challenges that his company’s rapid growth and diverse physical and digital assets pose and how his team leverages visibility to bolster its overall security posture. For example, using the GigaSECURE Security Delivery Platform, they can route raw data to their behavior analytics tools while separately sending subsets of data only to select security tools, preventing duplicate analysis.

The time for siloed IT teams is over. It is inefficient for teams to still use their own tools to access specific data.

At Under Armour, the networking team uses the GigaSECURE Security Delivery Platform for application visibility and awareness. They use visibility to understand what is happening with end-to-end traffic, specifically over VPN connections. The benefits are being transferred to the multiple teams throughout the organization who need access to this data and to the tools used to monitor and secure it.

Visibility is one thing. The GigaSECURE Security Delivery Platform can also improve the resiliency of your monitoring and security architecture.

By placing the GigaSECURE Security Delivery Platform in the middle of their security stack, Under Armour reduced the amount of downtime that was needed for servicing their tools. Resiliency is critical to an organization’s security. As Alex said, “That was the big clincher, the big winner for us.”

To see the entire presentation, please go here. For a complete recap of the overall themes of the show, please see my earlier post “Industry Trends Seen from Cisco Live! – Visibility Is at the Core of These Initiatives.”

Source: https://blog.gigamon.com/2017/08/10/under-armour-on-top-of-network-security/

Author: Tom Clavel


Everyone Needs Better Network Visibility

Category : Gigamon

I just returned from a busy week in Las Vegas attending the Black Hat 2017 conference – an exciting and crowded event with over 15,000 attendees participating in training classes, briefings and the business hall expo that showcased the latest vendor solutions from over 250 security vendors.

The overall training sessions and briefings addressed a myriad of security areas: cryptography, data forensics, mobile hacking, application security, IoT and malware defense. Across all these security efforts is a common need – the need for greater visibility into the data and potential threats traversing networks. This coincides with the Gigamon perspective of accessing and managing network data across physical, virtual and cloud environments and providing the traffic of interest to the appropriate security and monitoring tools.

Our Visibility Platform enables and enhances security and monitoring tools by providing only the appropriate traffic to them and not burdening them with irrelevant traffic that wastes processing power and network bandwidth. Supporting this premise, we highlighted our latest security-enabling visibility solutions, including our GigaSMART® SSL Decryption for both inline and out-of-band network traffic, Metadata Generation, Application Session Filtering and our Visibility for AWS solution.

Black Hat 2017

Key Takeaway

In the hectic two days of the Black Hat business hall expo, the Gigamon team spoke with hundreds of customers and prospects as well as countless existing and potential ecosystem partners. In my own discussions, a key challenge was repeatedly raised: IT security operations teams are overwhelmed.

As dangerous malware and successful data breaches continue at an ever-increasing rate, the myriad of tools SecOps teams use to combat these threats cannot keep up. Furthermore, the new tools and technologies slated to address the latest threats require a level of expertise many SecOps teams don’t have – yet. While existing teams work hard to minimize these threats, the security risk is still too great.

Gigamon can’t solve the InfoSec staffing and training challenges within organizations, but we can assist them with optimizing their security infrastructures to help ensure they can do as much as possible with their existing resources. The Gigamon Visibility Platform helps organizations see, manage, secure and understand the data traversing their network. This platform enhances and complements existing security and monitoring tools with pervasive network visibility. It minimizes blind spots. It supports physical, virtual and cloud environments. It acts as a basis for a new, collaborative security model that can provide rapid risk mitigation: The Defender Lifecycle Model.

It was quite exciting meeting with so many customers, prospects and partners to discuss their security challenges and how Gigamon can possibly help them. On many counts, Black Hat was a great success. Gigamon thanks all those who visited us there.

Source: https://blog.gigamon.com/2017/08/01/what-i-learned-at-black-hat-2017-everyone-needs-better-network-visibility/

Author: Greg Mayfield


On-demand Webcast: Next Generation Cyber Security

Category : Gigamon

In this short webcast, Gigamon CTO Shehzad Merchant presents the Defender Lifecycle, a new model that addresses the increasing speed, volume and polymorphic nature of network cyber threats. Focused on a foundational layer of pervasive visibility and four key pillars – prevention, detection, prediction and containment – the new model integrates machine learning, artificial intelligence (AI) and security workflow automation to shift control and advantage away from the attacker and back to the defender.

Topics discussed:

  • The factors contributing to the inevitability of breaches
  • Challenges and deficiencies of the current security model
  • The Defender Lifecycle Model foundation layer and key pillars
  • How pervasive visibility is essential to optimizing existing incident response systems

Who should watch:

  • Cybersecurity professionals seeking to prevent, detect, predict, contain and act to thwart bad actors and threats quickly and decisively
  • Senior technology executives and agency leads who want to reduce cost and complexity, mitigate security risks and improve business decision making.

Duration: 26 minutes

Available On Demand


Perspectives on the New Defender Lifecycle Model, A Q&A with Gigamon CTO Shehzad Merchant

Category : Gigamon

In his blog “Moving Toward a Security Immune System,” our CTO Shehzad Merchant spoke of the need for a new security model, predicated on a paradigm shift and acceptance of the fact that data breaches will happen. More and more, organizations are seeing that traditional, prevention-focused security strategies are not enough to defend against the increasing speed, volume and polymorphic nature of today’s cyber threats. Instead, they must embrace a new model – the Defender Lifecycle Model – that shifts control and advantage back to defenders by integrating machine learning, artificial intelligence (AI) and security workflow automation with a foundation of pervasive visibility.

To learn more, we sat down with Shehzad to see what customers and prospects are saying about their current challenges and how this new approach can help.

What top challenges does the Defender Lifecycle Model address?

Over the last few years, we’ve seen an exponential increase in the number of different security tools on the market and increasingly, we’ve heard much talk about machine learning, AI and security orchestration. The problem is that it’s not clear how these all play together to improve a company’s security posture. For example, if you deploy them, are you more secure? Less secure? The same? There isn’t a model against which organizations can measure security success or understand where gaps remain.

Another problem is that a model needs to address real, practical industry challenges that include a massive shortage of skilled personnel, massive growth in the volume of attacks and manual, siloed processes.

To address these problems, the Defender Lifecycle Model offers a framework against which organizations can map out different solutions as well as the ability to tackle the practical challenges via more automated processes.

Have enterprises already started down this route to build out their infrastructure?

They have, but in more of an ad hoc manner – meaning they stumble onto it. Hopefully they will get to the right place, but without a structured approach, the process will likely take longer than it should and there’s the chance they won’t end up in the right place. Key to the Defender Lifecycle Model is providing a more structured approach they can use to get to the desired outcome, quickly and efficiently.

What questions are customers asking? What do they want to learn more about?

There are a few big questions: How can we automate? How do we deal with the API explosion? What is the role of machine learning and AI? And how does Gigamon help with each?

Fundamentally, machine learning addresses the big data challenge of security, which is gathering context from across an entire infrastructure and building a baseline. AI is applying algorithmic techniques on top of that to surface anomalies. Automation and orchestration provide the ability to act on those anomalies.
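
As a toy illustration of that baseline-then-detect idea (not a description of any Gigamon product), the Python sketch below learns a simple statistical baseline for one metric, such as bytes sent per hour by a host, and flags values that deviate sharply from it. The metric and thresholds are assumptions made purely for illustration.

    import statistics

    def build_baseline(history):
        """Learn a mean/standard-deviation baseline from past observations."""
        return statistics.mean(history), statistics.stdev(history)

    def is_anomalous(value, baseline, z_threshold=3.0):
        """Flag a new observation that sits far outside the learned baseline."""
        mean, stdev = baseline
        if stdev == 0:
            return value != mean
        return abs(value - mean) / stdev > z_threshold

    baseline = build_baseline([120, 130, 115, 140, 125, 135])  # e.g. MB sent per hour
    print(is_anomalous(900, baseline))  # True: a sharp deviation worth surfacing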

The GigaSECURE Security Delivery Platform is not only an enabler of the machine learning, AI, automation and containment layers – it’s foundational: a foundation upon which enterprise network defenses can be layered and, more importantly, efficiently leveraged.

How does this foundation enable automation?

There are multiple aspects, as Gigamon plays into the machine learning phase, the automation and containment phase, and the initial basic hygiene phase. For example, machine learning is all about big data and providing ways to assimilate large volumes of data and build a baseline. Gigamon provides easy access to content-rich data that allows companies to build that baseline.

In terms of automation, our platform offers an alternative to dealing with the massive API explosion by providing a default API to orchestrate various pieces of solutions. If you want to deploy a basic good hygiene technique like firewalls, we make it easy to do so without having to deal with network maintenance windows or outages.

Have customers started to take this approach?

Customers find themselves in different phases of the cycle. Many are in the first stage of doing the basics of providing firewalls, segmentation and multi-factor authentication. Some have moved beyond and are beginning to build out a baseline, leveraging machine learning techniques, big data and both open source and commercial tools.

Gigamon feeds them very rich content data either directly in the form of network traffic streams or metadata – encrypted or decrypted. I don’t believe many are yet in the automation phase as that is relatively new. However, I expect to see more customers starting to deploy aspects of automation and orchestration in 2018.

Can you talk more about the benefits?

In moving to an automation model, you can begin to address two challenges: First is the challenge of the shortage of skilled personnel. Second is speeding up the defender’s ability to respond in a timely manner to contain and prevent attacks from propagating.

Another big advantage to this type of platform is easy access to data. It’s not that the data doesn’t exist – it lives in routers, firewalls, endpoints, domain controllers – the challenge is actually getting to it. Each one of these entities is controlled by a different part of an IT organization, and coordinating across these different, siloed departments is a challenge. Many of these approaches also add load on the devices, impacting their performance. So, simply leveraging network traffic becomes a quick shortcut to getting access to content-rich information.

What do you see happening next as this new model matures?

I see two key next steps. One is people embracing the model and moving towards automation and orchestration. It’s very early days, but again, I do expect by 2018 that organizations will be moving in that direction – towards, if you will, a DevOps for security. Two is seeing this roll out in the public cloud.

What excites you about what customers are saying?

The concept and the problems it solves resonate, so perhaps the solution set resonates as well.

The most exciting part is that if we are successful in implementing and making this new model operational in customer environments, we have the opportunity to reverse the attacker-defender asymmetry – and make the asymmetry work to the advantage of defenders.

To learn more, please watch Shehzad’s webcast “Next-Generation Cybersecurity: The Defender Lifecycle Model,” check out the GigaSECURE Security Delivery Platform web page, read the new white paper “Disrupt the Machine-to-Human Fight with a New Defender Lifecycle Model in Security Operations” and visit booth #760 at Black Hat USA 2017 to meet with Shehzad and other Gigamon security experts.

Source: https://blog.gigamon.com/2017/07/25/perspectives-new-defender-lifecycle-model-qa-gigamon-cto-shehzad-merchant/