You can argue that the fundamental principle behind any human behaviour, and the root rationale behind nearly all man-made things, is the wish to find order in chaos.
To do so, we must identify patterns. No life could exist without this ability - if a baby were exposed to a different language every day of their life, they would not be able to learn any of these languages. If your girlfriend's face and general appearance changed every day, you would find it rather impossible to keep her as your girlfriend - after all, you wouldn't be able to recognise her.
The problem with changes is that they break patterns, which must trigger some cognitive 'discomfort'. Once you have learnt something, you don't want it to be taken away, forcing you to learn it all over again.
Somewhere around 2007 (if I'm not mistaken) Apple came to give a talk at what was then my school, which specialises in audio and film production. They could not have picked a worse time - 3 months earlier, Apple had announced the discontinuation of PCI-based Macs in favour of the (then unheard of) PCIe interface.
For the school this meant an estimated five-digit cost over 2 years, and ditto for pretty much the whole of the professional audio and film industry (which was relying on a particular, highly popular PCI DSP card).
Mankind would never have achieved what it has without change - no change, no progress.
My point is that while change aversion is entirely understandable from the user's point of view, one should also understand the consequences of allowing it to prevail.
If your users disliked the change, you have one piece of evidence against it. But if triangulation with a multitude of other empirical research methods proves otherwise, you have a strong case for overruling the users.
I wouldn't rely on your own expert opinion alone, but perhaps use some predictive evaluation techniques, or user testing with those unfamiliar with the system.