That's because the cost and effect of texture quality, geometry detail and screen resolution are very hardware-dependent.
Texture quality usually does not have much impact on the speed of the rendering pipeline, as long as all textures can be read straight from GPU memory. When they don't all fit into GPU memory, they have to be streamed from normal RAM or, even worse, from the hard drive cache, which hurts performance badly. In that case, reducing geometry* or omitting expensive effects** won't help much. Conversely, when the execution speed of the rendering pipeline itself is the bottleneck, reducing texture resolution won't help much either.
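As a rough back-of-the-envelope illustration (the texture list and sizes are made up, and real engines typically use compressed formats), here is a sketch of why lowering texture quality mainly relieves memory pressure: halving each texture dimension cuts the texture set's VRAM footprint to roughly a quarter.

```cpp
#include <cstdint>
#include <cstdio>
#include <utility>
#include <vector>

// Rough VRAM estimate for an uncompressed RGBA8 texture with a full mip chain.
// The mip chain adds about one third on top of the base level (1 + 1/4 + 1/16 + ... ≈ 4/3).
std::uint64_t textureBytes(std::uint32_t width, std::uint32_t height)
{
    const std::uint64_t base = std::uint64_t(width) * height * 4; // 4 bytes per pixel
    return base * 4 / 3;
}

int main()
{
    // Hypothetical texture set at "high" quality.
    std::vector<std::pair<std::uint32_t, std::uint32_t>> textures = {
        {4096, 4096}, {2048, 2048}, {2048, 2048}, {1024, 1024}
    };

    std::uint64_t high = 0, low = 0;
    for (auto [w, h] : textures) {
        high += textureBytes(w, h);
        low  += textureBytes(w / 2, h / 2); // "low" quality: halve each dimension
    }

    std::printf("high quality: %.1f MiB, low quality: %.1f MiB\n",
                high / (1024.0 * 1024.0), low / (1024.0 * 1024.0));
}
```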
Vertex shaders are usually unaffected by the output resolution. The only way to reduce the load on them is to reduce the quality and quantity of the 3D models in the scene.
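As a sketch of what that looks like in practice (the struct and thresholds are invented for the example), an engine typically picks a lower-detail version of a mesh the farther it is from the camera, so fewer vertices ever reach the vertex shader:

```cpp
// Hypothetical LOD selection: the main lever against vertex shader load is
// sending fewer or simpler meshes, e.g. a lower-detail model at distance.
struct MeshLod {
    int   vertexCount;
    float maxDistance; // use this LOD up to this camera distance
};

const MeshLod* selectLod(const MeshLod* lods, int lodCount, float cameraDistance)
{
    for (int i = 0; i < lodCount; ++i) {
        if (cameraDistance <= lods[i].maxDistance)
            return &lods[i];
    }
    return &lods[lodCount - 1]; // fall back to the coarsest LOD
}
```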
Pixel shader work, on the other hand, still scales linearly with the number of pixels on the screen, so reducing the screen resolution remains an important tool to improve performance. Halving both the horizontal and the vertical resolution cuts the number of pixel shader invocations to a quarter.
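To put numbers on that (the resolutions are only an example):

```cpp
#include <cstdio>

int main()
{
    // Pixel shader invocations scale with the pixel count (ignoring overdraw).
    const long full = 1920L * 1080; // 2,073,600 pixels
    const long half = 960L  * 540;  //   518,400 pixels
    std::printf("half resolution needs %.0f%% of the pixel shader work\n",
                100.0 * half / full); // prints 25%
}
```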
In contrast to older CRT displays, modern LCD or plasma screens have a native pixel resolution. When they are fed a video stream in a different resolution, they need to interpolate. Some interpolate much better than others, which means that running them at a lower resolution barely reduces the picture quality, while other monitors look really bad when not run at their native resolution (the first LCD screen I owned used nearest-neighbor interpolation, which looked horrible; with my current screens, it's hard to tell when they aren't running at the correct resolution).
The game engine cannot know how well the user's monitor interpolates, so it's better to leave the choice between reducing texture and geometry detail and reducing the screen resolution to the user.
*) OK, reducing geometry might help a bit because vertices also consume GPU memory.
**) unless, of course, omitting these effects means that some textures are no longer required