Consoles use GDDR6, so they're definitely relying on some sort of latency-hiding mechanism.
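For illustration, here's a minimal, hypothetical sketch of one CPU-side latency-hiding technique (software prefetching), assuming an x86-64 target like the Zen cores in current consoles. The function name, prefetch distance and the trivial sum loop are made up for the example; real engines lean far more on GPU wavefront scheduling and job-level parallelism to keep GDDR's latency off the critical path.

    // Hypothetical example: hide memory latency by prefetching data a fixed
    // distance ahead of the loop, so high-latency (but high-bandwidth) fetches
    // overlap with computation. Assumes x86-64 with SSE (_mm_prefetch).
    #include <immintrin.h>
    #include <cstddef>
    #include <cstdint>
    #include <cstdio>
    #include <numeric>
    #include <vector>

    std::uint64_t sum_with_prefetch(const std::uint32_t* data, std::size_t n) {
        constexpr std::size_t kPrefetchDistance = 64; // elements ahead; tune per platform
        std::uint64_t total = 0;
        for (std::size_t i = 0; i < n; ++i) {
            if (i + kPrefetchDistance < n) {
                // Hint the hardware to start pulling a future cache line now.
                _mm_prefetch(reinterpret_cast<const char*>(&data[i + kPrefetchDistance]),
                             _MM_HINT_T0);
            }
            total += data[i];
        }
        return total;
    }

    int main() {
        std::vector<std::uint32_t> data(1 << 20);
        std::iota(data.begin(), data.end(), 0u);
        std::printf("%llu\n",
                    static_cast<unsigned long long>(sum_with_prefetch(data.data(), data.size())));
        return 0;
    }

On the GPU side the equivalent trick is simply keeping enough threads in flight that some wavefront always has data ready, which is exactly the bandwidth-over-latency trade-off GDDR is designed around.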
That's consoles though - different animal altogether from PC gaming.
Having an ultra-limited set of potential hw configs to code for vastly increases the relevant coding expertise available in the job market, and lets software optimisation compound over time.
Games made at the end of a console life cycle (like The Last of Us 1 or GTA V on PS3) can get far more out of the hw at release than any PC or Android game/app will ever manage on equivalent-spec hw.
Game dev tools and internal code libraries, the console SDK and proprietary APIs, and 3rd-party middleware are all optimised continually over the 5+ years of the hw life cycle.
I can only imagine how badly modern Switch games would perform on an Nvidia Shield TV (the same Tegra X1 silicon, but running Android), or on equivalent Android hw from 2015, if they had to account for so many different possible Android hw configurations.
This is why DICE and AMD originally teamed up to create Mantle, which started the low-level gfx API ball rolling in the industry, to (at least partially) address this imbalance.