Look at this claim: "Hibernate is more complex than the problem it tries to solve."
I have to agree with that. There seems to be an over-engineering mantra embedded into the Java world. If something is not refactored into "enterprisy" complexity, the technology is not considered good. In other words, this is also known as the mantra of job security: if solutions are simple, there is no job security, but if solutions run into baffling complexity, job security is guaranteed, since nobody else can maintain the logic any longer.
This reminds me a lot of the early versions of Java EE, which were absolutely over-engineered to levels of insanity. Luckily, some light has been shining through in later versions, which have started to put some sanity back into place.
No, I’m not saying that Hibernate is bad. I’m just saying that when a not-so-difficult problem is turned into a highly abstracted black box that tries to hide all the actual details, there can be problems ahead. Instead of reducing complexity, additional complexity arises, because very quickly both the original problem scope and the black box itself must be understood. Black boxes are especially problematic when something needs to be tuned for a specific use case. Since the black box has done its own abstraction, doing such tuning can be tricky, challenging, or completely impossible.
A black-box abstraction can also give developers a false sense of security and make them believe they only need to understand the black box. At least when it comes to Object Relational Mapping, that is highly untrue, since understanding the mismatch between the database model and the object model is important. If that mismatch is ignored, it can lead to horribly bad performance when the data set grows.
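The classic symptom of ignoring that mismatch is the N+1 select problem: the object model makes a lazy collection look like a plain in-memory list, while the database quietly runs one extra query per parent row. Below is a minimal sketch of the effect using hypothetical Author/Book entities (the names, mappings, and the Jakarta Persistence imports are my assumptions for illustration, not anything from the original text):

```java
import jakarta.persistence.*;
import java.util.List;

// Hypothetical entities, only for illustration.
@Entity
class Author {
    @Id @GeneratedValue Long id;
    String name;

    // Collections are lazy by default: books are NOT loaded together with the author.
    @OneToMany(mappedBy = "author")
    List<Book> books;
}

@Entity
class Book {
    @Id @GeneratedValue Long id;
    String title;
    @ManyToOne Author author;
}

class NPlusOneExample {
    static void printBookCounts(EntityManager em) {
        // 1 query for the authors...
        List<Author> authors =
                em.createQuery("select a from Author a", Author.class).getResultList();

        // ...plus 1 additional query per author the moment the lazy collection is touched.
        // With 10 authors nobody notices; with 200 000 the database gets hammered.
        for (Author a : authors) {
            System.out.println(a.name + ": " + a.books.size() + " books");
        }

        // One way out: tell the ORM explicitly to fetch the association in the same query.
        List<Author> fetched = em.createQuery(
                "select distinct a from Author a left join fetch a.books",
                Author.class).getResultList();
    }
}
```

The point is not that the fix is hard; it is that you only know to write the join fetch if you understand what the relational side is doing underneath the object model.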
So, if the data set is 10 elements, anything just works. However, if the data set grows to 200 000 or a million rows, the rules of the game change. It starts to matter how things are done, in terms of both CPU and memory usage.
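Memory is the other side of it: loading a million mapped entities into one persistence context will keep a million managed objects alive at once. A common workaround is to process the data in pages and clear the context between batches. A minimal sketch, reusing the hypothetical Book entity from above (page size and query are assumptions):

```java
import jakarta.persistence.EntityManager;
import java.util.List;

// Processes a large table in fixed-size pages instead of loading everything at once.
class BatchProcessing {

    static void processAllBooks(EntityManager em) {
        final int pageSize = 500;
        int page = 0;
        List<Book> batch;

        do {
            batch = em.createQuery("select b from Book b order by b.id", Book.class)
                      .setFirstResult(page * pageSize)   // offset
                      .setMaxResults(pageSize)           // limit
                      .getResultList();

            for (Book b : batch) {
                // do the actual work per row here
            }

            // Detach the processed entities so the persistence context
            // does not accumulate a million objects in memory.
            em.clear();
            page++;
        } while (batch.size() == pageSize);
    }
}
```

Again, nothing here is exotic, but none of it is visible if you treat the ORM purely as a black box.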