Agentless is a DAM Better Option for Securing Cloud Data | Imperva

Agentless is a DAM Better Option for Securing Cloud Data

When it comes to on-premises database activity monitoring (DAM), security teams have long relied on agents to track every incoming request and outgoing response within their databases. The agent-based approach ensures monitoring that is independent of the specific database system and of the database administrator (DBA). The result is a system that gathers data access information with minimal friction and performance impact, fulfilling reporting, compliance, and security requirements across an organization’s entire architecture.

The good about agent-based DAM

Agents are highly effective with older on-premises systems such as DB2 on z/OS and other data repositories, providing an alternative to the native logging process. Using agents does require software installation and maintenance; however, as long as the number of databases and the volume of data requests remain fairly constant, the performance impact of collecting data access logs is minimal.

Historically, agent-based DAM has been used primarily for compliance reporting. This approach proved successful because most organizations were focused mainly on creating audit trails to demonstrate regulatory compliance.

The not-so-good about agent-based DAM

In today’s landscape, organizations must leverage their DAM systems to conduct thorough security analyses. As security teams require extensive insight into vast volumes of data activity across multiple repositories, the “data footprint” has expanded significantly. To minimize friction and performance impact, organizations must allocate substantial resources to servers, appliances, agents, and the workforce to run them. Meanwhile, compliance regulations have become more stringent, requiring data activity records to be retained for longer periods and driving up compliance costs.

A kernel-based agent deployment model can prove to be problematic

Major DAM technology providers insist on a kernel-based agent deployment model, in which external agents add CPU overhead and latency to the databases they monitor. This model requires data to pass through two or three devices and mandates an agent installation on each database server. The hundreds of appliances needed in large environments can lead to significant expense. While an on-premises deployment can be managed by adding resources, Database as a Service (DBaaS) and containerization present new challenges to the current agent-based DAM approach.

Traditional DAM struggles with cloud-managed data

The speed and complexity of cloud platforms, coupled with the widespread adoption of multiple disparate environments, render traditional agent-based data logging, monitoring, and auditing impractical and expensive. These tools fail to manage data in a way that allows for actionable analytics critical to business sustainability and growth.

Any organization in the midst of digital transformation or cloud migration must devise a new strategy. Those with sensitive data already hosted in cloud-managed infrastructures must either possess the in-house skills to gain visibility into the data and enforce security policies, or adopt a data security solution that bypasses the skill-set requirement and ensures visibility into cloud-managed infrastructures. Without such a solution, they run a significant risk of non-compliance and unnoticed data breaches.

With the adoption of cloud technologies, the landscape of data protection has evolved, and agents are no longer a viable part of the solution. Cloud-hosted databases, delivered as DBaaS, are managed by the cloud provider. Management responsibility consequently shifts from the data owner to the cloud provider, preventing the owner from installing an agent.

Addressing traditional DAM limitations

When transitioning to a DBaaS, organizations should be aware that using a server-side agent to monitor it is not practical. In many cases, relying on the database’s native audit trail is sufficient. Humans typically access databases through jump servers, so a feasible approach is to install client-side agents on those servers to create a hybrid solution: native audit logs cover application access, and client-side agents cover human access.
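As a minimal sketch of this hybrid model, the routing decision can be expressed as a small classifier over access events. The event fields, service-account names, and jump-server hostnames below are hypothetical, chosen only to illustrate the split between the two monitoring channels:

```python
# Hypothetical identities for illustration -- not from any real deployment.
APP_SERVICE_ACCOUNTS = {"orders_svc", "billing_svc"}   # assumed application accounts
JUMP_SERVER_HOSTS = {"jump-01", "jump-02"}             # assumed jump servers

def classify_event(event):
    """Return which monitoring channel should cover this access event.

    Application traffic is covered by the database's native audit trail;
    human access arriving via a jump server is covered by the client-side
    agent installed on that server.
    """
    if event["db_user"] in APP_SERVICE_ACCOUNTS:
        return "native-audit"        # application access: native logs suffice
    if event["client_host"] in JUMP_SERVER_HOSTS:
        return "client-side-agent"   # human access: agent on the jump server
    return "unmonitored"             # neither channel covers it: flag for review
```

For example, `classify_event({"db_user": "alice", "client_host": "jump-01"})` routes a human session to the client-side agent, while traffic from a known service account falls back to the native audit trail.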

It’s important to note that a new solution, such as a database reverse proxy, can be developed for this use case, functioning much like most Web Application Firewalls (WAFs). Bear in mind, however, that a managed database was chosen to reduce complexity, and a database reverse proxy introduces a new service to manage: the focus shifts from managing a database to operating a scalable cluster of reverse proxies in front of each database instance.
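To make the trade-off concrete, here is a minimal sketch of what such a database reverse proxy does at its core: a TCP relay that forwards traffic between client and database in both directions while recording every chunk for auditing. A production proxy would parse the database wire protocol rather than log raw bytes, and would handle many concurrent connections; the single-connection handling and addresses here are illustrative assumptions:

```python
import socket
import threading

def _pipe(src, dst, audit_log, direction):
    # Forward bytes from src to dst, appending each chunk to the audit log.
    while True:
        data = src.recv(4096)
        if not data:
            break
        audit_log.append((direction, data))
        dst.sendall(data)
    try:
        dst.shutdown(socket.SHUT_WR)  # propagate EOF to the other side
    except OSError:
        pass  # peer already closed

def proxy_one_connection(listen_port, db_host, db_port, audit_log, ready=None):
    # Accept a single client and splice it to the real database server,
    # auditing traffic in both directions. Sketch only: a real proxy would
    # loop on accept() and serve many clients concurrently.
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("127.0.0.1", listen_port))
    srv.listen(1)
    if ready is not None:
        ready.set()  # let callers know the proxy is accepting
    client, _ = srv.accept()
    db = socket.create_connection((db_host, db_port))
    up = threading.Thread(target=_pipe, args=(client, db, audit_log, "request"))
    down = threading.Thread(target=_pipe, args=(db, client, audit_log, "response"))
    up.start(); down.start()
    up.join(); down.join()
    client.close(); db.close(); srv.close()
```

Even this toy version shows the operational cost: the proxy is a stateful network service that must be deployed, scaled, and kept highly available in front of every database instance.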

The container challenge

Running databases in containerized environments like Kubernetes and OpenShift comes with its own set of challenges. Installing an agent for each instance isn’t feasible due to the frequent launching and termination of containers. Embedding an agent into the database image can disrupt the CI/CD process, so we have two options:

  1. Disregarding containerization by forcing an agent on the database containers.
  2. Using eBPF with uprobes, which lets the agent monitor database activity in memory while maintaining containerization and keeping the database container image unmodified. Notably, eBPF is becoming increasingly popular in container environments.

How to make it easier

If you are currently using an agent-based DAM tool to monitor user interactions with your data, you can simply overlay an agentless solution to get the best of both worlds. This approach offers several benefits. First, it lets you continue deriving value from your agent-based DAM tool without a complete overhaul. Additionally, as you add new data stores, whether on-premises or cloud-based, you can monitor them and enforce security policies without incurring additional expenses for servers, appliances, and storage space.

By combining your DAM tool with an agentless solution, you can continuously ingest data from all data stores into a single repository and automatically identify behavior that violates policies. This affords comprehensive visibility and significantly reduces the risk of data breaches. Moreover, it eliminates the necessity for numerous redundant data security tools and the requirement to configure each cloud-managed environment individually to secure the hosted data.
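A minimal sketch of that single-repository model: access events ingested from every data store land in one place, and simple policy predicates flag violations automatically. The event shape, table names, and the two example policies are assumptions for illustration:

```python
# Hypothetical policies for illustration. Each pairs a name with a predicate
# that returns True when an event violates the policy.
POLICIES = [
    ("no-sensitive-export",
     lambda e: e["table"] in {"customers_pii", "payment_cards"}
               and e["action"] == "export"),
    ("service-accounts-only",
     lambda e: e["table"] == "payment_cards" and not e["user"].endswith("_svc")),
]

def find_violations(events):
    """Scan events ingested from all data stores and report policy hits."""
    hits = []
    for event in events:
        for name, violates in POLICIES:
            if violates(event):
                hits.append({"policy": name, "store": event["store"],
                             "user": event["user"], "table": event["table"]})
    return hits
```

Because every store feeds the same repository, one policy definition covers an on-premises DB2 instance and a cloud-managed database alike, instead of being configured per environment.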

Ultimately, bear in mind that the objective of DAM is to gather data access information with minimal friction and performance impact, and to address reporting, compliance, and security requisites across your entire architecture. Using an agentless solution in conjunction with your DAM tool allows you to achieve this objective. Attempting to fit existing tools into a new solution often leads to adverse performance impacts, not to mention the associated costs and expertise required.

Learn more about how an agentless DAM can assist you in staying abreast of an expanding and increasingly complex data landscape.