IBM's Nataraj Nagaratnam on the cyber challenges facing cloud … – ComputerWeekly.com

Nataraj Nagaratnam, IBM Fellow and cloud security CTO, has been with the supplier for nearly 25 years. Security has been his forte throughout this time, whether in cloud security, hybrid cloud security or technology strategy.

Nataraj's interest in security started when he was studying for his master's and PhD. "One good, fine day, my professor walks in and says there will be this new thing, called Java," he recalls. The professor was already working with the core engineering team that created Java at the time. "Intrigued, I started to work on the security aspects of Java, and then my PhD was in security in distributed systems."

Following his studies, when Nataraj was looking for fresh challenges, IBM approached him with an opportunity to help shape the future of security. Just as the internet was set to change the world and how business was conducted, IBM offered him the chance to develop the systems that would let businesses operate securely over it.

IBM's offer to lead enterprise web security for its products appealed to the young Nataraj, as the new technologies promised to be both disruptive to markets and enabling to the world. "I jumped right onto the opportunity. And, as they say, the rest is history," he says. "I was fortunate enough to be part of the way, with WebSphere shaping the industry, and working with industry on standard security specifications, such as web services security."

Technology, especially enterprise IT, has expanded massively throughout Nataraj's career. While this has created opportunities for enterprise solutions, it also carries certain risks. "In the history of computing, there are three major chapters: mainframes, then web, and now there is cloud," says Nataraj. "This is a defining moment in the entire IT space, and I am fortunate enough to define and lead the work on security from web to cloud."

Relying on data and services in the cloud can be challenging, as organisations need to ensure that data remains shareable across networks while having sufficient protections in place to keep it confidential. This is especially the case for heavily regulated industries, such as the defence, healthcare and financial sectors, for which this has become a defining moment, given their concerns about risk, security and compliance.

Rather than relying on the subjective notion of trust, which implies that one can have faith in or rely on someone or something, Nataraj prefers the term technical assurance. Technical assurance demonstrates that technological and human processes have been put in place to ensure data is protected.

Part of this is ensuring that identity and access management (IAM) is addressed uniformly across all of an organisation's cloud platforms, from cloud storage to on-premise services. Given that no two cloud platforms are ever the same, and that more than one platform is typically in use, this can complicate matters.

The rapid expansion of the tech sector means there is a growing security skills gap that needs to be addressed. It has left organisations struggling to fill vitally important roles and relying on external contractors instead, which adds further cost, as contractors are expensive for long-term projects that require a significant amount of work.

To address such concerns, organisations are turning to IAM tools to act as an overlay across their existing cloud infrastructure. "If we standardise the access management and security overlay, and enable them with automation and continuous monitoring, we can solve complex problems," says Nataraj. "Taking a hybrid multicloud approach with security and compliance automation addresses this with consistency and continuous monitoring."
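
As an illustration of that idea, the sketch below applies one shared set of access rules to account data pulled from several platforms, giving a single, repeatable check. It is a minimal sketch only: the provider names, the fetch_iam_accounts() stub and the rules themselves are illustrative assumptions rather than any particular vendor's API.

```python
# Minimal sketch of a cross-cloud IAM "overlay": the same access rules are
# evaluated against account data pulled from each platform, and the scan can
# be re-run continuously. Provider names, the fetch_iam_accounts() stub and
# the rules themselves are illustrative only.
from dataclasses import dataclass

@dataclass
class Account:
    user: str
    roles: list[str]
    mfa_enabled: bool

def fetch_iam_accounts(provider: str) -> list[Account]:
    # Stand-in for provider-specific SDK calls; real code would query each
    # platform's IAM API and normalise the results into Account records.
    return [Account(user="alice", roles=["admin"], mfa_enabled=False)]

def check_policy(account: Account) -> list[str]:
    # One shared rule set, applied identically to every platform.
    findings = []
    if "admin" in account.roles and not account.mfa_enabled:
        findings.append(f"{account.user}: admin role without MFA")
    return findings

def continuous_scan(providers: list[str]) -> dict[str, list[str]]:
    return {p: [f for a in fetch_iam_accounts(p) for f in check_policy(a)]
            for p in providers}

if __name__ == "__main__":
    print(continuous_scan(["provider-a", "provider-b", "provider-c"]))
```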

Government policy is also evolving as regulators become ever more technologically aware, with additional demands on data protection when data is shared between regions. There has, however, been greater collaboration between countries in this regard. For example, the European Union's (EU's) General Data Protection Regulation (GDPR) has become a de facto global standard for data protection, as countries realise that trade relies on an unimpeded flow of data.

"Laws, regulations and policies are becoming much more technology aware," says Nataraj. "Lawmakers and regulators are starting to understand the impact of technology, and that policies and standards need to evolve in a way that accommodates those technologies, while also providing a level of risk and regulatory compliance. Standardisation needs to happen, as opposed to every country having its own regulatory requirements, because that will have its own complexity."

With information interchange between countries dependent on data-sharing agreements, organisations are looking at approaches that allow them to meet both the regulatory and the technical requirements.

"A few weeks back, when I was in India, we talked about this notion of data embassies. The fundamental concept is, if you run services within these datacentres and service providers, you get immunity from certain laws," says Nataraj. "A country can have a data embassy in one country and, in reciprocity, they can have a data embassy in their country. There are innovative and creative ideas coming up in different parts of the world. That's a reflection of a policy and a practical approach to solve this data-sharing problem, and that is going to evolve."

These data embassies are similar to TikTok's proposed Project Texas, which would see the social media platform store its US user data in the US under the watch of American firm Oracle. Such data embassies could eventually evolve into independent third-party organisations.

One of the most significant future concerns facing organisations that rely on cloud services is the risk posed by quantum computing, which could undermine encryption security. Relying on existing encryption technologies alone is not an option: a sufficiently powerful quantum computer could break encryption swiftly, and certain widely used public key algorithms are already known to be susceptible to quantum attacks.

The most common public key infrastructure (PKI) technology used across the world is transport layer security (TLS), which secures data in transit. This should be considered the greatest risk, because data captured in transit today could have its encryption broken in five years' time, if quantum computing becomes commercially available, in a so-called harvest now, decrypt later attack. As such, the way hybrid cloud, secure connectivity and TLS are approached needs to be rethought.

"When it comes to quantum safe, I believe the first thing to fix is connectivity. Two years ago, we introduced support for quantum-safe algorithms in IBM Cloud," says Nataraj. "When you do application transactions over the wire, that link can be quantum safe. You prepare for the threat. That has to be one of the first things, when it comes to cloud security, that one needs to work through."
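
The sketch below illustrates the hybrid key-derivation idea behind such quantum-safe links: the session key is derived from both a classical shared secret and a post-quantum one, so recorded traffic stays protected unless an attacker can break both schemes. The two secrets here are random placeholders rather than a real handshake, and the HKDF helper is a minimal stand-in for the key schedule a TLS stack would use.

```python
# Sketch of hybrid key derivation for a quantum-safe link: the session key
# depends on BOTH a classical and a post-quantum shared secret, so traffic
# recorded today stays safe even if one of the schemes is broken later.
# The two "secrets" below are random placeholders, not a real handshake.
import hashlib
import hmac
import os

def hkdf_extract(salt: bytes, ikm: bytes) -> bytes:
    # HKDF-Extract (RFC 5869): PRK = HMAC-SHA256(salt, input keying material).
    return hmac.new(salt, ikm, hashlib.sha256).digest()

def hkdf_expand(prk: bytes, info: bytes, length: int = 32) -> bytes:
    # HKDF-Expand (RFC 5869): stretch the PRK into the requested key length.
    okm, block, counter = b"", b"", 1
    while len(okm) < length:
        block = hmac.new(prk, block + info + bytes([counter]), hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

classical_secret = os.urandom(32)  # stand-in for e.g. an X25519 shared secret
pq_secret = os.urandom(32)         # stand-in for e.g. an ML-KEM shared secret

prk = hkdf_extract(salt=b"hybrid-tls-sketch", ikm=classical_secret + pq_secret)
session_key = hkdf_expand(prk, info=b"session key")
print(session_key.hex())
```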

With the increasing functionality offered by artificial intelligence (AI) and machine learning (ML), automation will become a growing part of an organisation's security posture. Automated monitoring of security and compliance posture allows for continuous security.

Furthermore, security deployment will become automated, bridging the gap between CISOs, CIOs and IT teams. This will ensure they remain consistent with one another and aligned with the organisation's global security and compliance requirements.

"There is more to be done in continuous security and compliance infused with automation, and how we change from a reference architecture that may be in a Visio diagram to something prescriptive, deployable and automated," says Nataraj.
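
As a rough illustration of that shift from diagram to deployable artefact, the sketch below expresses controls as executable checks run against a declarative description of a deployment, so compliance is evaluated automatically rather than read off a picture. The control names and the deployment structure are illustrative assumptions, not any specific IBM or regulatory framework.

```python
# Sketch of "compliance as code": each control from a reference architecture
# becomes an executable check run against a declarative deployment
# description, instead of living only in a diagram. All names are illustrative.
deployment = {
    "storage": {"encryption_at_rest": True, "public_access": False},
    "network": {"tls_min_version": "1.2", "open_ports": [443]},
}

controls = {
    "encrypt-at-rest": lambda d: d["storage"]["encryption_at_rest"],
    "no-public-storage": lambda d: not d["storage"]["public_access"],
    "tls-1.2-or-higher": lambda d: d["network"]["tls_min_version"] in ("1.2", "1.3"),
    "https-only": lambda d: d["network"]["open_ports"] == [443],
}

# Evaluate every control and report anything that fails.
report = {name: check(deployment) for name, check in controls.items()}
failed = [name for name, passed in report.items() if not passed]
print("compliant" if not failed else f"failed controls: {failed}")
```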

Concerns surrounding data sovereignty, data privacy and data residency are likely to increase, given the regulatory compliance and geopolitical aspects of dealing with data. As such, there will be a need for more demonstrable controls and technologies that can help to protect data and privacy, increasingly underpinned by confidential computing.

"Applications of confidential computing are still in their infancy and there is more to be done, because it's not just a technology, but its use cases in confidential AI," says Nataraj. IBM has used confidential computing to enable encryption key management use cases under the name Keep Your Own Key, where a customer has technical assurance that only they have access to the keys, which are protected within hardware as well as within secure enclaves. This has now been extended to hybrid multicloud key management through Unified Key.
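
The sketch below illustrates the general envelope-encryption pattern that a keep-your-own-key model builds on: data is encrypted with a data key, and the data key is wrapped with a root key that only the customer holds. It is not IBM's implementation; in a real KYOK deployment the root key would stay inside the customer's HSM or secure enclave rather than sit in memory, and the example assumes the open source cryptography package is installed.

```python
# Simplified illustration of envelope encryption behind a "keep your own key"
# model: the provider stores only ciphertext plus a wrapped data key, and the
# root key that unwraps it is held solely by the customer. In practice the
# root key lives in an HSM/secure enclave; here it is in memory purely to show
# the flow. Requires the 'cryptography' package (pip install cryptography).
from cryptography.fernet import Fernet

# Root key: held by the customer (in practice, never exported from hardware).
customer_root_key = Fernet(Fernet.generate_key())

# Data key: generated per object and used to encrypt the actual data.
data_key = Fernet.generate_key()
ciphertext = Fernet(data_key).encrypt(b"sensitive record")

# The provider stores only the ciphertext and the *wrapped* data key.
wrapped_data_key = customer_root_key.encrypt(data_key)

# Decryption requires the customer's root key to unwrap the data key first.
unwrapped_data_key = customer_root_key.decrypt(wrapped_data_key)
print(Fernet(unwrapped_data_key).decrypt(ciphertext))
```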

The IT sector is undergoing a fundamental shift as it moves from a web-based model to one reliant on cloud services, and this is being compounded by technological and regulatory issues coming to the fore. A multicloud approach can enhance adaptability to shifting market trends, but it brings certain challenges. Automating network management policies enables swift and effective sharing of information within networks, regardless of location, while ensuring compliance with shifting regulatory requirements is maintained.

"We can help industry, governments and others move forward," concludes Nataraj. "We will collaborate with governments and their policies to make that happen."
