When getting better makes things worse

When you change things, people will resist.  When you change things for the better, people will still resist.

Often, being better matters less to the acceptance of a new design than conformity with the conventions already in place.  Take, for example, the control panel in Windows Vista.  Arguably, the configurations and applications are better organized, better labeled, and easier to find.  Easier, that is, if you haven't already been trained to find them in different areas and under different labels in earlier versions of Windows.  There are twice as many controls in the Vista control panel, giving the user more granular control over settings and configurations.  That should be a good thing, right?  Not if you are used to finding configurations buried under other objects and have simply accepted the fact and moved on.  Now you are being asked to relearn where things are, and you will spend time fumbling around, especially if you also maintain legacy computers running earlier versions of Windows.  While the reorganization will benefit new users, experienced users don't see enough marginal benefit to accept the change gracefully.

This concept should be considered in all design projects.  While you might have a better way of doing things, decide whether it is sufficiently better to justify forcing all of your users to change their mental maps.  It's amazing what people can get used to (consider some of the legacy systems still in use today), but unless you can provide compelling reasons to change, people won't.  Your choices are to follow existing conventions, thereby further cementing that design in the minds of your users; to break with them radically and force the transition; or to introduce the changes gradually through evolution.

One thing is certain: no matter how you make changes, some users will complain.  The key is to provide enough benefit that they get over it quickly.

For another perspective, see Gery's Law.