Nuvia should help that substantially. But regardless, you need to think of this from Microsoft's perspective. Which means asking, "What will fundamentally change how someone uses their computer?"
10% better CPU performance, while nice, isn't going to change how you use your device. But true all-day battery life, pervasive cellular connectivity, and ubiquitous AI - those will. Microsoft is willing to put up with Qualcomm's shortcomings in order to have a partner willing to push those experiences with them.
AI is broad, but for what it's worth: in the near future, on hardware with a 128-bit bus and basic LPDDR5, running generalized LLMs locally won't be the hit people expect, nor plausible as a genuinely good experience.
With better data quality and more training data or training time (see: Chinchilla-optimal models, or conversely pruning existing ones), plus more aggressive quantization (to a point, and quantization-aware training helps), you can massively lower the floor for useful LLMs, especially specialized ones - we already have 2.7B-param models passing medical exams - but I still don't think it will be a particularly fun experience, especially keeping the actual bandwidth requirements in line if it's running constantly. With a 256-bit bus (and more memory, though the bandwidth is arguably the bigger deal for, say, a 3-10B-param model) things change, but that's still niche and certainly not coming to Windows save for what will likely be a gaming-laptop chiplet APU (AMD's Halo) more than a monolithic device with a broad market.
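As a rough sketch of why the bus matters so much: token generation on an LLM is memory-bound, since every generated token has to stream essentially the whole weight set from DRAM. So the theoretical ceiling is bandwidth divided by model size. A back-of-envelope, using figures I'm assuming here (LPDDR5 at 6400 MT/s, perfect efficiency - real numbers will be meaningfully worse):

```python
# Back-of-envelope: memory-bandwidth ceiling on local LLM token rate.
# Assumed figures (mine, not from the discussion): LPDDR5-6400, ideal efficiency.

def bandwidth_gbps(bus_bits: int, mts: int = 6400) -> float:
    """Peak bandwidth in GB/s for a bus of `bus_bits` at `mts` megatransfers/s."""
    return bus_bits / 8 * mts / 1000  # bytes/transfer * MT/s -> GB/s

def max_tokens_per_sec(params_b: float, bits_per_weight: int, bus_bits: int) -> float:
    """Each token streams all weights once, so rate <= bandwidth / model size."""
    model_gb = params_b * bits_per_weight / 8  # weight footprint in GB
    return bandwidth_gbps(bus_bits) / model_gb

for bus in (128, 256):
    for params, bits in ((2.7, 4), (7, 4), (7, 8)):
        rate = max_tokens_per_sec(params, bits, bus)
        print(f"{bus}-bit bus, {params}B params @ {bits}-bit: ~{rate:.0f} tok/s ceiling")
```

Even at the idealized ~29 tok/s ceiling for a 4-bit 7B model on a 128-bit bus, a constantly-running model is saturating the same bandwidth the rest of the system needs, which is the "not a fun experience" problem; doubling the bus to 256-bit doubles the ceiling but costs area, pins, and power.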
Plus, everything I described is still useful for cloud instances; it just means we'll see more capable models at the same compute as these techniques mature and get adopted (the better training data in particular), and network latency is low these days anyway.
Wrapping back around, there are still various particularly useful things on-device AI can do, with a better perf/mm² and perf/W case than guzzling it down on the CPU (the GPU is defensible, but still): audio transcription, some image editing, noise isolation, stuff like that.
But I strongly suspect that if they execute, the main win from QC will be performance at low power - ST especially, which is important - and then overall runtime (going back to idle power), more than AI or 5G. On the former, the others will compete too and on-device AI is only so relevant; on the latter, the market still isn't big enough yet and it only buys them so much of an advantage.
So, a slightly watered-down M-series chip of sorts for Windows. If they don't hit that bar on runtime or low-power performance, then AMD and Intel will wreck them, because no one cares that much about having slightly better aptX, and I think AMD et al. will eventually adopt 5G modems as they become more common.
As for modems: the modem is still discrete (rumored to be an X65), so no power savings on that front from integration. It probably just doesn't make sense to integrate it given the market interest today.