Well, looks like I'm the newbie here. Been lurking around for the longest time, but finally decided to make an account.
Anyway, I've been reading about people using hacks and third-party programs to enable native resolutions, and speculating about how OS X actually goes about scaling things.
I think OS X is doing something slick behind the scenes to keep things looking nice. I don't think it's as simple as rendering everything at double the resolution and scaling it down to 2880x1800. If it were, things like the 1080p video demo in FCPX wouldn't be pixel-accurate and wouldn't actually display at 100%. The same goes for fonts and images. Fonts have special properties like hinting and anti-aliasing, and scaling a rendered font down isn't the same as properly rendering it at a fraction of the size. Scaling an image up just to scale it back down doesn't make sense either, since it loses detail.
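To illustrate what I'm picturing (pure speculation on my part, not anything Apple has documented), here's a rough Core Graphics sketch of how a 2x backing store could work: the bitmap gets allocated at full panel resolution, ordinary UI drawing goes through a 2x transform, and anything that needs to be pixel-accurate can drop back to a 1:1 transform instead of being rendered big and shrunk down. The 2880x1800 size and the save/restore trick are just my assumptions for the example.

// gcc hidpi_sketch.c -framework ApplicationServices -o hidpi_sketch
#include <ApplicationServices/ApplicationServices.h>
#include <stdio.h>

int main(void) {
    // Backing store at the panel's native pixel size (assumption: 2880x1800).
    const size_t pixelW = 2880, pixelH = 1800;
    CGColorSpaceRef rgb = CGColorSpaceCreateDeviceRGB();
    CGContextRef ctx = CGBitmapContextCreate(NULL, pixelW, pixelH, 8, 0, rgb,
                                             kCGImageAlphaPremultipliedLast);

    // Ordinary UI drawing happens in "points": with a 2x transform,
    // a 100x100 point rect really covers 200x200 device pixels.
    CGContextScaleCTM(ctx, 2.0, 2.0);
    CGContextFillRect(ctx, CGRectMake(0, 0, 100, 100));

    // Pixel-accurate content (say, a 1080p video frame) doesn't have to be
    // drawn at 2x and scaled back down -- it can temporarily go back to a
    // 1:1 transform and land directly on 1920x1080 real pixels.
    CGContextSaveGState(ctx);
    CGContextScaleCTM(ctx, 0.5, 0.5);
    CGContextFillRect(ctx, CGRectMake(0, 0, 1920, 1080));
    CGContextRestoreGState(ctx);

    printf("backing store: %zux%zu pixels\n",
           CGBitmapContextGetWidth(ctx), CGBitmapContextGetHeight(ctx));

    CGContextRelease(ctx);
    CGColorSpaceRelease(rgb);
    return 0;
}

Again, that's just how I'd build it, not proof of how OS X actually does it.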
Apple's been hinting at resolution independence for the past several years. Even way back in 10.4, there was an experimental feature buried in Apple's developer tools to enable UI scaling. It worked roughly the same way Windows currently scales things, which meant it ended up breaking some programs and making things look funky.
It seems like most people forgot about that hint, but in newer versions of Quartz Debug, that UI scaling slider was replaced with an "Enable HiDPI display modes" checkbox. It also mentions something about "virtual display modes" in the Displays prefpane.
I'm on a 2010 15" MBP, so this option doesn't seem to do anything for me. Maybe it'll allow you to actually set real "hardware" resolutions instead of "virtual" scaled ones? I have no clue. If anybody here has a new rMBP, it would be great if you could figure out what that checkbox does.
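In case it helps anyone poking at this, here's a quick Core Graphics program that dumps every display mode the WindowServer reports for the main display and compares the size in points against the size in actual pixels. My guess is that with the checkbox on, the "virtual" HiDPI modes would show up as something like 1440x900 points backed by 2880x1800 pixels, but that's exactly the part I can't verify on my machine. Heads up: I believe the pixel-size getters (CGDisplayModeGetPixelWidth/Height) need 10.8, so no promises it builds on whatever the rMBP ships with.

// gcc listmodes.c -framework ApplicationServices -o listmodes
#include <ApplicationServices/ApplicationServices.h>
#include <stdio.h>

int main(void) {
    CGDirectDisplayID display = CGMainDisplayID();

    // All modes the WindowServer currently advertises for the main display.
    CFArrayRef modes = CGDisplayCopyAllDisplayModes(display, NULL);
    if (!modes) return 1;

    for (CFIndex i = 0; i < CFArrayGetCount(modes); i++) {
        CGDisplayModeRef mode =
            (CGDisplayModeRef)CFArrayGetValueAtIndex(modes, i);

        size_t pointW = CGDisplayModeGetWidth(mode);       // size in points
        size_t pointH = CGDisplayModeGetHeight(mode);
        size_t pixelW = CGDisplayModeGetPixelWidth(mode);  // size in real pixels (10.8+?)
        size_t pixelH = CGDisplayModeGetPixelHeight(mode);

        // A HiDPI ("virtual") mode should report more pixels than points.
        printf("%4zux%-4zu points -> %4zux%-4zu pixels%s\n",
               pointW, pointH, pixelW, pixelH,
               (pixelW > pointW) ? "  [HiDPI]" : "");
    }

    CFRelease(modes);
    return 0;
}

If someone runs that before and after toggling the checkbox in Quartz Debug, the diff in the output would pretty much answer the question.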