Securing the data used in componentized applications is challenging. There is a perception that cloud vendors follow the best patterns to address security and data breaches, but the reality is that data breaches remain a serious risk and compliance remains difficult. Most companies do not really know what their vulnerabilities are or what risks they bear. In fact, 72 percent of organizations use production data in other environments, such as staging areas, where security policies are often more open because people are using the data for testing and development.
Another example is a global company that provisions everything in a single region, such as Seattle. From a General Data Protection Regulation perspective, such a company cannot be sure where its consumer data resides. If it is running experiments on large data sets for, say, algorithm development, it is using customer data. How does it make sure that the data has been removed afterward? Where are backups stored, and how are they secured? Or, if it is using data lakes, where is that data actually being written? Even containers used for temporary storage become a problem. How does the company audit those containers? How does it protect against cross-site scripting? How does it address new vulnerabilities?
Addressing these security challenges requires complete database virtualization. It means moving data protection into the application layer rather than relying solely on platform-as-a-service security functions. And once data protection moves into the application layer, the applications themselves must be strongly protected.
Protecting applications is a serious challenge in modern distributed, serverless, no-ops, high-volume computing environments. With API proliferation, the monolithic security frameworks designed for older architectures are no longer adequate. Authentication mechanisms such as OAuth introduce their own risks when you rely on a third party for authentication. In addition, there is the risk of bringing down the OAuth server with too many token requests.
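One common mitigation for that last risk is to cache tokens on the client side so that a high-volume service does not mint a new token on every API call. Here is a minimal sketch assuming an OAuth 2.0 client-credentials flow; the token URL, client ID, and secret are placeholders, not a real service, and the requests library handles the HTTP exchange.

```python
import time
import requests  # pip install requests

# Placeholder endpoint and credentials (assumptions for illustration).
TOKEN_URL = "https://auth.example.com/oauth/token"
CLIENT_ID = "my-service"
CLIENT_SECRET = "change-me"

class CachedTokenProvider:
    """Caches an access token until shortly before it expires, so
    high-volume callers do not flood the authorization server with
    one token request per outbound API call."""

    def __init__(self, skew_seconds=30):
        self._token = None
        self._expires_at = 0.0
        self._skew = skew_seconds

    def get_token(self):
        # Refresh only when the cached token is missing or near expiry.
        if self._token is None or time.time() >= self._expires_at - self._skew:
            self._refresh()
        return self._token

    def _refresh(self):
        resp = requests.post(
            TOKEN_URL,
            data={
                "grant_type": "client_credentials",
                "client_id": CLIENT_ID,
                "client_secret": CLIENT_SECRET,
            },
            timeout=5,
        )
        resp.raise_for_status()
        payload = resp.json()
        self._token = payload["access_token"]
        self._expires_at = time.time() + payload.get("expires_in", 300)

provider = CachedTokenProvider()
# Every outbound call reuses the cached token instead of minting a new one.
headers = {"Authorization": f"Bearer {provider.get_token()}"}
```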
One approach is to minimize data exposure by using microcontainerization to create an endpoint that looks like a physical database but contains only the tiny fraction of the data that has changed. Still, it is difficult to ensure security when these instances are short lived in a no-ops environment where everything is automated. The key is making sure that you have access to those nodes to see what is actually going on; this is not only a security and governance issue but a visibility issue. We use a security platform that provides real-time adaptive security auditing. It gives much better visibility into vulnerabilities across every part of the system as they emerge, and it provides self-healing capabilities.
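To make the microcontainerization idea concrete, here is a toy copy-on-write sketch: each short-lived endpoint materializes only the rows it has changed, while reads for everything else fall through to a shared, read-only base snapshot. The class and names are illustrative assumptions, not any real product's API.

```python
class VirtualDataset:
    """A copy-on-write view: looks like a full database to its
    consumer but stores only this instance's changes (its delta)."""

    def __init__(self, base):
        self._base = base   # shared, read-only production snapshot
        self._delta = {}    # per-instance changes only

    def read(self, key):
        # Prefer the local change; otherwise fall through to the base.
        return self._delta.get(key, self._base.get(key))

    def write(self, key, value):
        # Writes never touch the base, so the instance stays tiny.
        self._delta[key] = value

    def footprint(self):
        # How much data this endpoint actually materializes.
        return len(self._delta)

base_snapshot = {"user:1": "alice", "user:2": "bob"}
vdb = VirtualDataset(base_snapshot)
vdb.write("user:2", "bob-masked")    # e.g., masking data for a test run
assert vdb.read("user:1") == "alice"
assert vdb.read("user:2") == "bob-masked"
print(vdb.footprint(), "changed row(s) materialized")  # -> 1
```

Because each endpoint carries only its delta, auditing it means auditing a small, well-defined surface rather than a full copy of production data, which is what makes the visibility problem tractable.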