I think the general thrust of your argument is correct, but I also think that you're overlooking a few things. It might make sense to divide these up under headings.
Are elementary really making such radical changes?
Be a little bit wary here of believing the hype. The elementary team talk a good game about how they care about their user experience and visual design more than everyone else does, but they are perhaps not changing as much as they might imply, and everyone else is doing more than might be implied. In fairness, they would say that they're doing it better than others, rather than more, and I have a lot more sympathy with that position. But the changes they're making are actually relatively trivial; it's important to standardise, and to make things more attractive (the aesthetic usability effect is real), but as noted they are resource-constrained.

Actual major user experience changes -- things which involve building large amounts of infrastructure -- are pretty much beyond their current capabilities: think of, for example, all apps automatically saving files so you never lose anything, or all apps keeping version history of all saved files so you can always revert, or all apps being automatically sandboxed so they can't steal data, or all apps being relocatable so you can install more than one version, or, or, or. Lots of big-picture changes require infrastructure work, which elementary can't do. Obviously, other upstreams are working on this, and the elementary team contribute to that to their credit, and use these things as they arrive (and Houston is infrastructure, and they're working on that), but as you note, the resource constraints are likely to keep their changes minimal and relatively trivial unless they get a lot more popular (and, more importantly, better funded through their community's growth or generosity). They do a good job of optimising the ratio of "how hard is this to do" to "how much impact does this have", though.
Why aren't Canonical doing more to change and improve Ubuntu's design?
This is probably better phrased as "why aren't they doing that any more, when they used to?", because Unity was a pretty radical change. The Launcher with its quick menus and overlaid badges and progress bars; the global menu; being able to scrub between indicators and unifying left and right clicks; putting window buttons on the left because that integrated better with Unity's layout when maximised; the window spread: these were all major changes when they happened. (Some of them had already appeared on other platforms, certainly, but that doesn't make a thing invalid, unless only Xerox get credit for GUIs :))

More importantly, though, most of Canonical's design effort these days is going into Unity 8, not Unity 7. That's full of radical design changes: it inherits everything from Unity 7 and makes a bunch more changes besides, and the plan is for the Ubuntu desktop to become Unity 8. (Leave aside whether that's a good idea or not; we're talking here about where design effort is going, not where it should be going.) So, if you're not looking at Unity 8 (which I imagine you're not, because why would you be?) then you're not seeing what the design team are spending their time on.
Why aren't project creators doing more to fit in with Ubuntu's design changes?
As noted, Unity made some fairly serious changes to the desktop. However, Canonical don't write most of the apps that run in that desktop, and at least some of those apps' authors aren't interested in integrating properly into the Ubuntu desktop. Nautilus, and the other upstream Gnome projects, are one example; they prefer app menu buttons to the global menu. Firefox is another; they prefer consistent UI across platforms over better integration with the platform they're currently on. I would also note that elementary has the same problem; it's why they encourage people to use epiphany rather than Chrome: epiphany integrates better with their desktop, and that matters more to them than whether it's a good web browser.

This conflict is partially the fault of upstream software developers for not valuing integration into the desktop, and partially the fault of the developers of those desktops for pushing different views (so there is no single "desktop" to integrate into; Unity and Gnome Shell and Pantheon and KDE and LXDE are all different). And it's also partially the fault of the users of these desktops, who don't value this integration and therefore signal to software developers that they don't need to do it. Windows is the same here, as Jono notes: Windows users in general do not indicate a preference for an app which integrates properly and works like other apps. Mac users do; an app which does things "its own way", or differently to most apps, will often be rejected, or at least called out, precisely for doing that. iOS is similar; Android celebrates divergence (although Google are trying to change that, with Material). The elementary development team do value app integration, but they're not a big enough deal for upstream software developers to pay attention to their desktop. They're also trying to build a community of people who care about it too, but they're not having a great deal of luck with that (as discussed in the interview I did).