Rule #1 of security: If someone has access to information, they have access to that information
That tautology is annoying, but it is true. If you give someone access to data, they have access to that data. For users, this usually means access control, but for developers... well... they're the ones who have to write the access control.
If this is a major issue for you (and it sounds like it is), consider building security into your software itself. A common pattern is to design secure software in layers. At the lowest layer, a trusted development team writes the software that manages the barest, most fundamental access control. That software is validated and verified by as many people as possible. Anyone working on that code has access to everything, so trust is essential.
After that, developers can build more flexible access control on top of that core layer. This code still has to be V&V'd, but the process needn't be quite as stringent, because you can always rely on the core layer to cover the essentials.
The pattern extends outwards.
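To make the layering concrete, here is a minimal Python sketch. All of the class, role, and resource names are invented for illustration, not taken from any real system: the core class stands in for the small, heavily audited piece, and the policy layer on top can add friendlier rules but can never grant more than the core allows.

```python
# Minimal sketch of the layered idea; every name here is hypothetical.
# The trusted core only answers "may this principal touch this resource?"
# from a small, heavily reviewed grant list.

from dataclasses import dataclass


@dataclass(frozen=True)
class Principal:
    user_id: str
    roles: frozenset


class CoreAccessControl:
    """Trusted core layer: small, audited, and ignorant of business rules."""

    def __init__(self):
        self._grants = set()  # pairs of (role, resource)

    def grant(self, role, resource):
        self._grants.add((role, resource))

    def is_allowed(self, principal, resource):
        return any((role, resource) in self._grants for role in principal.roles)


class PolicyLayer:
    """Flexible layer on top: adds convenience rules, but can never widen
    access beyond what the core already permits."""

    def __init__(self, core):
        self._core = core

    def can_read(self, principal, resource, business_hours):
        # Extra rules live here; the core still has the final say.
        return business_hours and self._core.is_allowed(principal, resource)


core = CoreAccessControl()
core.grant("analyst", "sales_db")
policy = PolicyLayer(core)

alice = Principal("alice", frozenset({"analyst"}))
print(policy.can_read(alice, "sales_db", business_hours=True))    # True
print(policy.can_read(alice, "payroll_db", business_hours=True))  # False: core denies it
```

The important design choice is that the outer layer composes with the core rather than replacing it, so a bug in the flexible code can deny access by mistake but cannot silently grant it.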
The hard part, and indeed the art, of designing these systems is building each layer so that developers can keep developing and debugging while still providing the security your company expects. In particular, you will need to accept that debugging demands more privileges than you think it should; trying to lock that down completely will only leave you with some very angry developers.
As a complementary measure, consider providing "safe" databases for testing: copies that contain no sensitive data, so developers can rip out all of the safety mechanisms and do serious debugging.
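If it helps, here is roughly how that could look, as a sketch using SQLite so it stays self-contained. The customers table and its columns are made up for the example; the real database contributes only its schema, and all of the rows are generated from scratch.

```python
# Hypothetical sketch of building a "safe" test database with SQLite:
# copy the schema of the real database but load only synthetic rows,
# so nothing sensitive ever reaches the debugging environment.
# The table and column names ("customers", "name", "email") are invented.

import sqlite3


def make_safe_copy(prod_path, test_path):
    prod = sqlite3.connect(prod_path)
    test = sqlite3.connect(test_path)

    # Copy table definitions only, never the data.
    for (ddl,) in prod.execute(
        "SELECT sql FROM sqlite_master WHERE type = 'table' AND sql IS NOT NULL"
    ):
        test.execute(ddl)

    # Populate with obviously fake records that developers can mangle freely.
    test.executemany(
        "INSERT INTO customers (name, email) VALUES (?, ?)",
        [(f"Test User {i}", f"user{i}@example.test") for i in range(100)],
    )
    test.commit()

    prod.close()
    test.close()
```

Because the copy never contains real records, developers can disable constraints, log every query, or dump whole tables without anyone worrying about a leak.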
In the end, both you and your developers need to understand a key tenet of security: security is always a trade-off against usability. You must strike your own balance as a company. The system will not be perfectly secure, and it will not be perfectly usable. That balance will probably shift as your company grows and the demands on developers change. If you accept this reality, you can manage it.