Is this based on actual dev talk, or just your own assertion of what you would expect nVidia to do given their past behaviour with CUDA?
Because Nintendo as a company don't strike me as foolish enough to get into bed with a walled-garden, lock-in type vendor like nVidia without some sort of agreements and reassurances in place for exactly this sort of problem.
The problems that Microsoft and Sony have had with nVidia over their consoles in the past were well documented, with no small amount of bad blood, and Nintendo have already had their own problems with them over security issues in the original TX1 hardware. It stands to reason that Nintendo's lawyers and higher-ups have an exit strategy in place, unless they took leave of their senses when the contract was being drawn up.
Also, as far as toolchains go, modern engines like Unreal and Unity have become toolchains (and pseudo-drivers, from a Vulkan/D3D12 perspective) unto themselves - so as long as they support the platform, it is already good to go for the many developers unwilling or unable to develop a game engine from scratch themselves.
The WiiU may have been underpowered compared to the XB1 and PS4, but it was not that big a difference - and the Switch itself is a relatively meagre step up from the WiiU in spec anyway. The PPC-based CPU was basically a faster tri-core Wii chip, so any well-experienced Nintendo dev could work on it, and the AMD GPU was practically off the shelf, so again not a huge stretch toolchain-wise to develop for.
The sudden difference in the kind of games found on the Switch vs the WiiU has more to do with a change in policy from Nintendo than with toolchains or hardware - they are well aware that their home console business could go the way of Sega if they don't diversify from the mostly first-party, casual-gaming model of previous generations, as the dire sales of the WiiU show.
I don't think you realize how woeful Nintendo's previous stuff was - moving to Nvidia is a HUGE improvement for developers. It's been a constant issue, and was likely the major reason the Wii was basically a doubled-up GameCube (to maintain consistency and make it easy for developers); the Wii U then built off of that somewhat, borrowed somewhat from the CPU approach the 360/PS3 had, and added a newer GPU. Even so, the Wii U dev situation was a complete mess and was a big reason for the dearth of games. Eurogamer had an article detailing what a mess the situation was (supposedly Nintendo didn't even have a finalized API spec when the Wii U launched, so devs had to try to figure things out on their own).
I also don't think it's nearly as much of a walled garden as you think. In some ways it actually opens things up for Nintendo: games made for the Switch should meet modern API standards, so they should be able to run on other hardware that also meets those specs. It's also why there have been so many ports to the Switch - games developed for those API specs could run on it (just much reduced). And it instantly meant their hardware was compatible with the popular game engines. What this means is that Nintendo could move to another ARM SoC that supports similar APIs, which I think a good number of them do these days. And AMD could potentially get back in by doing what Nvidia did: pairing a small version of their GPU with some ARM cores.
The problems Microsoft and Sony had were mostly down to pricing and contracts. Sony's was their own fault (they stupidly thought they'd be able to use Cell to run graphics, and I think they even considered just putting in two Cell chips, but GPUs were much better at graphics, so they had to rush to find a GPU and then pay Nvidia a bunch); Microsoft's was because Nvidia controlled the IP, which limited Microsoft's ability to do things like die shrinks to lower costs. Plus Nvidia couldn't offer a CPU solution, whereas AMD could offer both CPU and GPU. It's true that Nintendo had the security issue, but from what I've read they seem to be quite happy otherwise. And Nvidia did very little to really woo them, so I think Nvidia would be willing to make a custom Tegra chip, or at least a more modern one, for them. And I think a lot of that is due to the massive improvement in the development situation.
I've wondered if cloud gaming might also have been in play. I know Nvidia touted those ray-tracing render boxes, and I feel like Nintendo has to be seeing companies move to subscription models. Plus it would let Nintendo keep console costs low while getting more control over the games. Granted, I don't know how much advantage, if any, the cloud would have been for Nvidia, as I believe AMD is in Microsoft's streaming (and therefore Sony's as well, since they worked out a deal) and Google's too. Maybe Apple's as well, although I don't know what hardware Apple is using. And I'm not sure Nintendo would be that high on it, considering their bizarre online-service behavior in general. But it would provide them an avenue to match or possibly even exceed the other consoles in graphics while making the buy-in cost much lower, since they could offer cheaper consoles (and make more money over time).