Interview: Gigamon President & CEO on Winning the Cybersecurity Cat-and-Mouse Game

One week after being promoted to the top job of President and CEO of Gigamon, Shane Buckley sat down with tech4tea.com to share his strategy for Gigamon and how Deep Observability can help enterprises turn the tables on sophisticated hackers in the cat-and-mouse game that is cybersecurity.

Shane Buckley, President and CEO of Gigamon.

Cloud adoption has come into its own in recent years, with the COVID-19 pandemic spurring more digital transformation around the world in two years than in the preceding 10.

“Here in Asia, the growth of hybrid cloud is phenomenal. 73% of organisations surveyed in Asia Pacific will deploy hybrid cloud – more than just a multi-public-cloud solution – because they want to leverage the existing infrastructure and data centres they have, and to enjoy the flexibility, scale, automation and orchestration that you get from any of these different cloud-based technologies, such as Nutanix, VMware and others,” Buckley said.

This growth and move to the cloud is the driver behind the increased cybersecurity challenges of this era – because of the complexity of operating hybrid multi-cloud IT infrastructure, and the increased sophistication of hackers.

Buckley shared that “one of the biggest lessons from 2021 is that the nature of the threat is continuing to get more sophisticated, with nation-state actors, as well as nefarious groups, that have become extremely adept at hacking into some of the world’s supposedly most secure networks and infrastructures”.

Enterprises need all the help they can get securing their IT infrastructure and fending off attacks arising from the increased exposure of the cloud environment.

Buckley observed that this battle is like a “cat-and-mouse” game in which – unfortunately – the mouse is winning. His role at Gigamon is to make sure the nefarious actors don’t win, by teaching and equipping companies and organisations to better secure their infrastructure.

“Gigamon is the leader in visibility and analytics for organisations worldwide; we help secure some of the most secure, most complex, most challenging networks on the planet. Our ability to see, control and secure workloads, no matter where they sit in the hybrid multi-cloud, is what we’ve delivered through the Gigamon labs,” says Shane Buckley, President and CEO, Gigamon.

Research has shown that in 2021, 68% of all US organisations were hacked, up from over 50% in 2020. Many of those cases – some 25 to 30 percent – were ransomware attacks, in which organisations collectively paid billions of dollars to these actors to release data that had been illegally encrypted and rendered unusable.

“In that environment, where the nature of the threat is getting more sophisticated, those of us who provide protection for customers have got to move faster, because the level of sophistication of these people is becoming exponentially higher. We have to make sure that we can respond,” said Buckley.

“We’re in a unique position to provide those customers with the foundational visibility and analytics they need to enhance their security posture and ensure they can minimise risk, maximise compliance, and try as much as we can to get the bad actors out,” Buckley added.

More details from the interview below.

Broad-Based Hybrid Cloud Adoption

Because of the pandemic, companies have clamoured to go online and provide remote access to their applications and businesses for customers who are largely working and living from home.

“The quality of information, or actionable intelligence, makes the difference to the response and the action against a hacking event. One of the key things that Gigamon does incredibly well is provide actionable intelligence to the tools: we distill the data down to the most relevant information, which we then present to the tools so that, from a correlation perspective, they can make better-quality decisions faster. Because if a tool, no matter what it is, has to see everything – if it has to look through all the fields and all the haystacks to find the one needle – it’s a humongous task. But we can distill it down – take away fields, take away haystacks – and come back and say: here’s the haystack, and the needle’s in here. That helps the tools’ performance. Improving the quality of actionable intelligence is a big deal for organisations, and it’s one of the key things that Gigamon does incredibly well for companies around the world,” said Shane Buckley on the importance of actionable intelligence in cybersecurity.
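
To make that distillation idea concrete, here is a minimal sketch of the general technique – not Gigamon’s actual implementation – in which verbose flow records are reduced to the handful of fields a downstream security tool needs, and flows the tool will never act on are dropped. The field names, port list and sample records are illustrative assumptions.

```python
# Hypothetical sketch: distilling raw flow records down to the fields a
# security tool actually needs before forwarding them. Field names, the
# port list and the sample records are illustrative, not a real schema.

RELEVANT_FIELDS = ("src_ip", "dst_ip", "dst_port", "protocol", "bytes", "tls_sni")

def distill(flow_records, interesting_ports=frozenset({22, 443, 3389})):
    """Keep only the flows and fields a downstream tool needs to correlate."""
    for record in flow_records:
        # "Take away haystacks": skip flows the tool will never alert on.
        if record.get("dst_port") not in interesting_ports:
            continue
        # "Take away fields": forward a compact record, not the full capture.
        yield {field: record.get(field) for field in RELEVANT_FIELDS}

if __name__ == "__main__":
    raw = [
        {"src_ip": "10.0.0.5", "dst_ip": "172.16.1.9", "dst_port": 443,
         "protocol": "tcp", "bytes": 18204, "tls_sni": "example.com",
         "payload": "...", "vlan": 120, "ttl": 62},
        {"src_ip": "10.0.0.7", "dst_ip": "10.0.0.8", "dst_port": 53,
         "protocol": "udp", "bytes": 96, "payload": "..."},
    ]
    for compact in distill(raw):
        print(compact)  # only the distilled record reaches the tool
```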

Adopting the cloud enables these companies to build a scalable platform that’s cost-effective and easy to deploy, because they don’t have to deploy physical servers and can provision their applications remotely from anywhere.

“But the big challenge is that it creates a lot of uncertainty and risk, particularly from a security perspective. As you move your workloads to the cloud, you’re basically moving them to an area where you don’t have inherent security built in. The cloud service providers put in security for their own infrastructure, but the virtual instances you deploy are basically up to you to secure. So it’s very challenging and complex, because you can’t use the traditional, tried-and-tested mechanisms that you do with your own physical data centre,” observed Buckley.

“What Gigamon does is give customers the flexibility to move their workloads wherever they like – whether you move them from on-premises to some form of colocation or private cloud deployment, using technology like VMware or Nutanix from a container perspective. It doesn’t really matter whether you move them to public clouds, to multiple different public clouds, or whether you have multiple workloads talking to each other across different cloud platforms,” Buckley added.

“Because we give you a single pane of glass for your visibility and analytics fabric, that helps you understand what’s going on inside your business. And we can then reconnect all the network, application performance and security tools that you’ve used in the past with those workloads, to ensure you have no increase in your risk profile. And you can run your infrastructure in the knowledge that you’re still following industry best practice,” shared Buckley.

Deep Observability Pipeline to Secure Hybrid Cloud Networks

With attacks at unprecedented levels, US President Biden signed an executive order in 2021 directing the Federal Government to advance towards a Zero Trust Architecture. Buckley feels that the advances Gigamon has made in creating a deep observability pipeline are a critical foundation for securing hybrid cloud networks.

Since the zero trust framework identifies visibility and analytics as foundational for security, organisations need to eliminate blind spots inside these networks – particularly in east-west traffic (lateral traffic between workloads inside the environment), not just north-south traffic crossing the perimeter.
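
As a concrete illustration of that distinction, the short sketch below (an illustrative example using only Python’s standard ipaddress module, not part of any Gigamon product) labels a flow as east-west when both endpoints are internal addresses, and north-south when one side crosses the perimeter.

```python
# Illustrative sketch: labelling flows as east-west (internal to internal)
# or north-south (crossing the perimeter), using only the standard library.
import ipaddress

def traffic_direction(src_ip: str, dst_ip: str) -> str:
    src_internal = ipaddress.ip_address(src_ip).is_private
    dst_internal = ipaddress.ip_address(dst_ip).is_private
    if src_internal and dst_internal:
        return "east-west"    # lateral traffic between internal workloads
    return "north-south"      # traffic entering or leaving the environment

print(traffic_direction("10.0.1.4", "10.0.2.17"))      # east-west
print(traffic_direction("10.0.1.4", "93.184.216.34"))  # north-south
```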

“Gigamon can enable tremendous simplicity and flexibility. We are Switzerland when it comes to the choice of deployments – on-premises, colo, private or public cloud. We enable you to move your workloads wherever you want without increased risk, for example in compliance. And in doing so we can actually reduce the cost of the whole infrastructure, because we optimise traffic: we reduce the amount of traffic hitting all these tools and infrastructure by up to 90%, so you can save money. It can pay for itself in less than six months for any of our customers,” said Buckley.

“From a visibility perspective, to achieve zero trust you need to have comprehensive visibility. That’s the one thing that Gigamon can provide you – across all workloads. It’s a single pane of glass, whether it’s on-premises, colo, or any form of private and public cloud – often referred to as the hybrid multi-cloud,” said Buckley.

Buckley explained that, with the higher fidelity of information and telemetry that Gigamon provides to customers across their networks, the Deep Observability pipeline delivers valuable, immutable data that cannot be compromised.

Hackers often turn off or alter the network and application server logs that traditional observability tools monitor, in order to obfuscate their activity, confuse the security team and hamper its response to the attack.

“We leverage network data: we take copies of network data as it comes into the network, so it is immutable and can’t be altered. We then take that information, parse it, and do a lot of advanced work to transform and optimise the data. And then we send that corrected data to the tools for analysis. This kind of deep observability pipeline will become the industry standard for organisations to provide security within the cloud. The technique we’re using is well known, but the way we do it as an enhancement to our platforms is unique today,” Buckley explained.
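
Conceptually, the pipeline Buckley describes has four stages: copy the traffic off the wire, parse it, transform and optimise it, and forward the result to the tools. The skeleton below is a simplified, hypothetical sketch of those stages – the capture source, record fields and TOOL_ENDPOINT address are stand-ins, not Gigamon’s product.

```python
# Hypothetical sketch of a capture -> parse -> transform -> forward pipeline.
# In a real deployment the copies would come from a TAP or SPAN/mirror port;
# here the capture stage is stubbed with static sample records.
import json
import socket

TOOL_ENDPOINT = ("192.0.2.10", 5514)  # placeholder address for an analysis tool

def capture_copies():
    """Stand-in for reading mirrored traffic; yields raw flow metadata."""
    yield {"src": "10.0.0.5", "dst": "172.16.1.9", "port": 443, "bytes": 18204}
    yield {"src": "10.0.0.7", "dst": "198.51.100.3", "port": 80, "bytes": 512}

def parse(records):
    """Normalise each copied record into a consistent schema."""
    for r in records:
        yield {"five_tuple": (r["src"], r["dst"], r["port"]), "bytes": r["bytes"]}

def transform(records, min_bytes=1024):
    """Optimise: drop low-value records so less traffic hits the tools."""
    return (r for r in records if r["bytes"] >= min_bytes)

def forward(records):
    """Send the distilled records to the analysis tool as JSON over UDP."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for r in records:
        sock.sendto(json.dumps(r).encode(), TOOL_ENDPOINT)

if __name__ == "__main__":
    forward(transform(parse(capture_copies())))
```

Because the records in such a pipeline are derived from copies taken off the wire rather than from logs on the hosts themselves, an attacker who disables or tampers with host logging does not change what the analysis tools receive.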

“I think in the next three years, hybrid cloud networks will require deep observability pipeline competence in order to ensure that they remain secure. It also provides better quality and fidelity of information for application performance tools as well. But the key use case we’re talking about with customers here is predominantly driven by security,” concluded Buckley.
