It is 2010. You are buying a computer display. You are very unlikely to find an analogue one. Yet it is likely that the display and the computer come with analogue display connectors, namely the HD-15 connector for VGA.
The situation is worst with laptops, which, for lack of space, come with only one external display connector. It is almost always VGA, even though the computer is digital and the display you connect to it is very likely digital too.
There are even flat-panel monitors with VGA as their only input. So you'll have a digital computer and a digital display, but the video signal is converted into analogue and back to digital.
Just like there is no completely digital audio, there is no completely digital video. The colour of each pixel is, at some point, converted into analogue levels of primary colours. So, apart from the slight signal degradation, what is the problem?
To see another problem with VGA, we need to look back at cathode ray tubes. Imagine a spray can sweeping across the screen in rows, its colour adjusted continuously as it goes. That is the basic idea of a CRT, and the analogue signals of VGA are designed for exactly that kind of device.
Thus in VGA, pixel addressing is also analogue: it is hard to paint one pixel exactly white while keeping the neighbouring pixels exactly black. On a digital display, by contrast, each pixel is controlled independently. In practice this means a much crisper image, even to the point that it may seem jarring to eyes long accustomed to the comfortable blur of a CRT.
However, there is also a deeper computational aspect. With a digital link, there is a more direct correspondence between your code and its output: at the native resolution, a pixel value written into the framebuffer arrives at exactly one physical pixel, rather than being smeared through a digital-to-analogue-to-digital round trip.
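As a conceptual sketch (not a real driver, and the framebuffer layout here is purely illustrative), consider a plain in-memory framebuffer with one 32-bit value per pixel:

```c
/* Conceptual sketch: a framebuffer with one entry per pixel.  Over a
 * digital link (DVI/HDMI/DisplayPort) at native resolution, the value
 * written here reaches exactly one physical pixel unchanged; over VGA
 * it is converted to analogue levels and re-sampled by the monitor,
 * so sharp single-pixel transitions tend to bleed into neighbours. */
#include <stdint.h>
#include <string.h>

#define WIDTH  640
#define HEIGHT 480

static uint32_t framebuffer[HEIGHT][WIDTH];   /* 0x00RRGGBB per pixel */

int main(void)
{
    /* Black background... */
    memset(framebuffer, 0, sizeof framebuffer);

    /* ...with a single pixel set to exactly white.  A digital display
     * shows exactly this; an analogue VGA path blurs the transition. */
    framebuffer[240][320] = 0x00FFFFFF;

    return 0;
}
```

The point is not the code itself but the one-to-one mapping it assumes: with a digital link that mapping actually holds, while with VGA it is only an approximation.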
See this post by Sycraft-fu for a couple of further technical points.
The obvious solution is to use a digital link: DVI, the Digital Visual Interface. These days you often see HDMI, which is essentially single-link DVI in another form factor, with the same electrical signalling for video. The newer DisplayPort standard is also out there.
What about the occasional need to connect to a VGA display? The solution is DVI-I. This connector has extra pins carrying the analogue VGA signals, so only a passive adapter is needed. It is somewhat unfortunate that not all DVI ports are DVI-I, but AFAIK all laptops with DVI have it.
I wonder if this is the crux of the problem: VGA is kept available just in case you want to hook up something from the 1980s, and an adapter would be too hard a concept for the typical consumer. Not that we ever have to use adapters between old and new things otherwise.
First of all, the term 'legacy ports' mostly comes up in connection with their absence. A 'legacy-free PC' has none of the ports that USB can replace: PS/2 keyboard and mouse, serial, and parallel (printer) ports.
Arguably, some aspects of these old ports are better than USB; for example, serial and parallel ports are handy for electronics hobbyists. On the other hand, USB does have a number of benefits, but they are not always relevant. For example, a PS/2 keyboard works exactly as well as a USB keyboard (assuming no driver problems, and USB is rarely better in that respect).
A PS/2 keyboard doesn't blur your keystrokes. It works just as well as its successor. Yet it has been replaced by USB. On the other hand, VGA does blur your video, but it has not yet been replaced by its 10-year-old successors.
It's VGA that should be called a 'legacy port', much more so than the other ports.