There's another highly salient point in all this. We've now had a natural experiment running for more than two decades regarding the relative effectiveness of different distribution models for getting software from vendors to end users.
Users have proven they're willing to adopt new operating systems: for example, Mac OS -> Windows -> Mac OS X, as well as the rise of iOS and Android. So "people just can't handle change" doesn't explain the failure of the Linux Desktop to become widespread.
Users have also proven they're willing to tolerate limited hardware compatibility, as the rise of Apple demonstrates. Sure, Apple's superior design skills make it easier to tolerate the fact that there is no choice in hardware vendor, but it's enough to show that broad hardware compatibility is a negotiable item.
There's one feature that all of the mass market operating systems (including the game consoles) have in common, though: an easy mechanism for application vendors to provide prebuilt applications with bundled dependencies (except for the core OS services) to end users.
So what do I see as the core difference between Ubuntu and other Linux distributions? Canonical have made a conscious decision to tolerate binary hardware drivers, and to allow the provision of binary applications with bundled dependencies through the Ubuntu Software Centre.
Many people see this as a betrayal of the principles that underpin free and open source software. In a certain sense, that's true: Canonical have deliberately placed the goal of providing a compelling user experience ahead of the goal of promoting the cause of free and open source software. They're not preaching to their end users or their vendors, they're leaving that discussion to others.
But that's a topic for a different thoughtstream (We're Only Part Time Tinkerers).