Like many a great technological innovation, USB has turned from a blessing into a monster, or at least turned grossly against its original principles.
When USB 1.0/1.1 was released in the 1990s, it was a breath of fresh air for the clunky, dusty IT world. In one fell swoop, it promised to replace AT/PS/2 keyboard and mouse ports, serial and parallel ports, and plenty of use cases for ISA/PCI/PCMCIA/PC Card expansion buses as well.
Frankly, the port situation was a mess with little overall design behind it. For example, a PS/2 keyboard or mouse port could accept only one kind of device, or both, depending on which pins the manufacturer had chosen to omit, and you could never be sure which way it would be -- even the color coding often went wrong.
The parallel port offered little more than a printer connection to most users, and few had any use for the serial port once serial mice had disappeared, apart from the brief need for modems during the early Internet boom. Both ports looked notoriously clunky with their metal shells and industrial screw posts.
Along came USB, and all of these devices had the same sleek, flat connector, ready to be plugged into any of the identical sockets -- at least if you were careful with the orientation; groping in the dark behind a case, it was easy to miss which way the rectangle should go.
Nevertheless, the face of computing quickly turned into something much more fashionable. Apple's iMac was one of the first and most famous of these "legacy-free" computers (though it still used Apple's version of a serial port internally to connect the IrDA port, and in fact many devices today still use serial ports in some form).
For a while, this was good, barring the driver problems of the very early days. There was even enough capacity to provide many devices in this format, such as sound cards, that had previously mandated an expansion bus like PCI. Speaking of capacity, things were already getting a little complex, with the 1.5 Mbit/s Low Speed suitable for most peripherals like input devices, and the 12 Mbit/s Full Speed for heavier uses.
Around the same time, an ostensible competitor appeared on the market. Except Firewire was by no means in the same league, being a more complex and faster design for more demanding applications, such as real-time video and hard drives. Most importantly, Firewire was a network where each device was a potentially equal participant. For example, you could link a computer to a hard drive, and the hard drive to a camera, and each device could access any of the others.
This was the first great NIH moment for USB. It saw Firewire as a competitor, and USB 2.0 was released with the 480 Mbit/s High Speed in order to beat Firewire's most common 400 Mbit/s capacity.
However, this was only a nominal increase, and Firewire was much faster in practice. USB was designed as the simple, light protocol for simple devices, while Firewire was designed for heavy, time-critical applications, and you could not change that by simply upping the frequencies.
Also, with three speed grades, things were getting confusing. Full Speed still meant 12 Mbit/s even in the USB 2.0 world, so devices advertised that way would not deliver the full speed of USB 2.0, which was High Speed.
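The naming tangle is easy to tabulate. The speed names and nominal rates below are from the USB specifications; the snippet itself is just an illustration, not anything the spec defines:

```python
# Nominal signaling rates of the USB speed grades, in Mbit/s.
SPEEDS = {
    "Low Speed": 1.5,     # USB 1.0+, simple input devices
    "Full Speed": 12,     # USB 1.0+, "full" only relative to USB 1.x
    "High Speed": 480,    # USB 2.0
    "SuperSpeed": 5000,   # USB 3.0
}

# The trap: a device advertised as "Full Speed" on a USB 2.0 bus
# still runs at 12 Mbit/s, 40x below the bus's actual maximum.
print(SPEEDS["High Speed"] / SPEEDS["Full Speed"])  # → 40.0
```

Note that "Full Speed" sits at the bottom-but-one of the table, which is exactly the kind of marketing-over-clarity naming the rest of this piece complains about.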
On the other hand, increasing the signaling rate 40-fold is no mean feat, and USB did this while maintaining the old electrical design. It was a genuine backwards-compatible upgrade, and it is fast enough for a lot of mass storage, such as hard drives, not to mention memory sticks. In the meantime, though, technology still marched on, and USB was hungry for more.
The first decade of the 2000s brought even faster alternatives for external devices, such as eSATA. It was obvious that USB could never compete with the native protocol of hard drives, but still it had to try. Another competitor, Thunderbolt, used the same idea of exposing a traditionally internal protocol externally, combining DisplayPort and PCI Express. For now, however, hard drives seem to be the main application for USB 3.0.
The technical problem was that USB 1/2 had reached its limit, and a new electrical interface was needed. This also gave the leeway to redesign everything else about the protocol, to make a fresh start and leave the old USB in its legacy state.
Alas, it seems USB is all about backwards compatibility and being somehow "universal", even if you cannot technically have everything. The ingenious solution was to bolt this new interface on top of USB 1/2, so that you could always plug an old USB cable into the new socket, and sometimes even do the reverse.
The Micro USB 3.0 plug makes this painfully obvious: it is literally the old and new interfaces side by side. You cannot buy just the new interface anywhere, because that would break backwards compatibility. After all, you never know when you might want Low, Full or High Speed rather than the new SuperSpeed.
This, to me, marks the point where USB turns against its "legacy-free" ideals. The only sane option, in my opinion, would have been to keep USB 1/2 as it is and develop the third iteration under a new name, with different aims. Because that is what it is -- not a seamless upgrade, nor something for simple peripherals such as keyboards and mice. You might even say they are different tools for different jobs.
Of course, the cynic in me might say that USB 3.0 has nothing to offer over the existing alternatives. By accepting the dual nature of USB 1/2 vs. 3.0, one would accept defeat — if not due to technical limitations, at least for admitting the futility in striving for universality.
What next? USB 4.0 with three different interfaces bolted together? USB N with N-1 ports? Ludicrous Speed?
The old idea of "legacy ports" was that you get rid of them when something better comes along to replace them. In a more general philosophical sense, it could mean letting go of past baggage. Naturally, it took a long time for USB 1/2 to replace the old ports, but the solution was not this fist-sized lump of all possible ports bolted together.