When Henry Ford, the de facto inventor of mass production, was asked at a 1909 production meeting which colors his Model T automobile would be available in, he is rumored to have replied, “The customer can have any color so long as it is black.” Ford, a notorious stickler for keeping costs to the bare minimum, offered almost no optional extras, and that included the range of colors. While this approach may have gone unquestioned at the dawn of the automobile age, it has no place when evaluating a database security solution built for today’s threat landscape. In this post, we’ll explain why traditional data security methods are no longer effective, and which features and options organizations must have in a solution to gain the agile, comprehensive data source coverage needed to quickly onboard new data sources and achieve real database security.
Database security was hard even before the complexities introduced by digital transformation and the rapid, pandemic-driven movement of workloads to public, private, and hybrid cloud environments. Organizations that have made substantial investments in audit compliance solutions are now feeling pressure to create a data-centric security model that can monitor all data sources, not just those being audited. Unfortunately, many of the traditional tools that organizations have relied on for database monitoring are a poor fit for cloud-based data sources. The skills gap that already existed around database security has also become more pronounced, because the cloud introduces new technologies and methods. The cloud affords greater flexibility, but that flexibility can become a double-edged sword: things move much faster and there is more to learn. Finally, there is the reality of database proliferation. In the on-premises world, an organization might standardize on three to five approved databases and funnel every application into them. In the cloud, those three to five grow to 10 to 15, and in some cases 20 to 25. Suddenly the landscape is far more complicated, with many more moving parts, and it is difficult for busy cybersecurity teams to wrap their arms around so many changes.
What data source coverage capabilities must an organization have to ensure database visibility now and in the future?
- Robust security oversight for diverse data estates. The solution must have visibility into all on-premises databases, unstructured data, and all types of cloud environments: public, private, and hybrid. Many developers and security practitioners assume it is the cloud service provider’s responsibility to secure data, so they pay little attention to it. This is a big mistake. Cloud service providers are responsible for delivering secure architectures; database security remains the data owner’s responsibility. Each cloud environment has its own methods and APIs. Even if nobody in your organization understands every cloud environment in which data resides, your solution must overcome that gap and still monitor all your data so you can detect policy-violating behavior (a toy illustration of such a check follows this list).
- Comprehensive support for onboarding new on-premises and cloud databases. As discussed, organizations add new data sources all the time. In many instances, vendors may take 9-12 months to make a new data source visible to a solution. That is nowhere near fast enough; the longer onboarding takes, the longer the lack of visibility weakens your security posture. Your solution should provide an out-of-the-box library of development tools for database onboarding (see the connector sketch after this list) and give you access to an experienced technical team so you can onboard quickly and correctly.
- A straightforward onboarding process. The more moving parts a process requires, the longer it takes, the more resource-intensive it becomes, and the more things can go wrong. Many solution providers say they can easily onboard new data sources, but this is usually a subjective claim at best. Some offer limited support that relies on either a proxy-based architecture or funneling native cloud logs through agents into collection appliances. Others offer a “universal connector” tool that pushes the heavy-lifting development work to you and makes you responsible for onboarding new database types yourself. Henry Ford would probably have approved. Look for a solution that can give you certainty about when your new databases will be onboarded.
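To make the first point concrete, here is a minimal sketch of what “detecting policy-violating behavior” can boil down to once audit events from many sources are normalized into one schema. The event fields, policy rules, and names below are hypothetical and for illustration only; real monitoring solutions define their own schemas and policy languages.

```python
from dataclasses import dataclass

@dataclass
class AuditEvent:
    source: str   # e.g. "postgres-prod", "aurora-analytics" (hypothetical names)
    user: str     # database principal that ran the statement
    action: str   # normalized verb: SELECT, UPDATE, DROP, ...
    obj: str      # table or collection touched

# Hypothetical policy: only approved principals may read sensitive objects.
SENSITIVE_OBJECTS = {"customers.pii", "payments.cards"}
APPROVED_READERS = {"reporting_svc", "dba_team"}

def violates_policy(event: AuditEvent) -> bool:
    """Flag reads of sensitive objects by non-approved principals."""
    return (
        event.action == "SELECT"
        and event.obj in SENSITIVE_OBJECTS
        and event.user not in APPROVED_READERS
    )

events = [
    AuditEvent("postgres-prod", "reporting_svc", "SELECT", "customers.pii"),
    AuditEvent("aurora-analytics", "intern_42", "SELECT", "payments.cards"),
]
for e in events:
    if violates_policy(e):
        print(f"ALERT: {e.user}@{e.source} read {e.obj}")
```

The point is that the policy check itself is simple once every source, whatever its native format, feeds the same normalized event stream.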
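And here is the connector sketch promised in the second point: the kind of abstraction an onboarding toolkit might expose so a new database type can be added without rebuilding the monitoring pipeline. The `DatabaseConnector` interface and `MongoAuditConnector` class are hypothetical, assumed for illustration; any vendor’s actual SDK will differ.

```python
from abc import ABC, abstractmethod
from typing import Iterator

class DatabaseConnector(ABC):
    """Contract a new data source must satisfy to be monitored."""

    @abstractmethod
    def connect(self) -> None:
        """Open a session or log tail against the data source."""

    @abstractmethod
    def fetch_audit_events(self) -> Iterator[dict]:
        """Yield raw audit records in the source's native format."""

    @abstractmethod
    def normalize(self, raw: dict) -> dict:
        """Map a native record onto the common event schema."""

class MongoAuditConnector(DatabaseConnector):
    """Illustrative connector: onboard MongoDB by tailing its audit log."""

    def connect(self) -> None:
        ...  # a real connector would open the audit log or change stream here

    def fetch_audit_events(self) -> Iterator[dict]:
        # Stubbed single record in MongoDB's native audit-log shape.
        yield {"users": [{"user": "app"}], "ns": "shop.orders"}

    def normalize(self, raw: dict) -> dict:
        return {"user": raw["users"][0]["user"], "action": "SELECT", "obj": raw["ns"]}

# Usage: every source, old or new, feeds the same normalized stream.
conn = MongoAuditConnector()
normalized = [conn.normalize(e) for e in conn.fetch_audit_events()]
print(normalized)
```

When onboarding means implementing a small, well-documented interface like this, rather than standing up proxies or log-forwarding appliances, the 9-12 month timelines described above shrink dramatically.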
Bottom line: take these points into consideration when making your decision. Understand that a vendor term like “universal connector” often means “we’ve rough-plumbed it, but you have to make it work.” Remember, too, that vendors whose processes funnel data through agents into appliances are applying 30-year-old technology to a challenge that grows more complex every day. Look for robustness and simplicity in a solution, and make sure it is scalable and designed to deliver top-level performance as your data source footprint grows.
Try Imperva for Free
Protect your business for 30 days with Imperva.