Auditing-related overhead imposed on databases is one of the most misunderstood topics in the field of data security, both when using native capabilities and when using database activity monitoring agents. Native auditing subsystems have evolved considerably over the past 10 years, to the point where their performance impact on the database for data collection is on par with agent-based collection. Because monitoring/auditing overhead depends not only on the collection method but even more on the auditing policy and the amount of data collected, any statement asserting that overhead will be X% is misleading and merely a way to dismiss a frank review.
In this Whitepaper, Imperva Senior VP and Data Security Fellow Ron Bennatan presents empirical results of the overhead of native auditing methods, based on real-world scenarios. The Whitepaper summarizes observed results for the most common cases and then discusses how these results can be extrapolated to additional scenarios.
Note: Imperva recommends reading Understanding the Overhead of Database Monitoring & Auditing as a prerequisite, since an understanding of the internal workings of the various monitoring and auditing methods will help you more fully understand the numerical results published in this Whitepaper.