Edit:
At Ultra settings and 1080p, you need a good graphics card to achieve playable frame rates in any case:
From those two charts, with a G3420 the max FPS is 18-19, but with a modern quad-core a 780 gets 62 FPS on High textures and almost 50 on Ultra textures.
Watch Dogs is just one game. There are plenty of games where there will be no difference in performance between an R9 280X/GTX 770 and a 780/R9 290X, because the dual-core G3220 will be the primary bottleneck.
After some consideration, it seems that proponents of "good CPU now, good GPU later" are advocating delayed gratification in order to achieve the most performance per dollar over the long haul. That's an admirable goal. But the reality I see most of the time is that gamers tend to want the highest performance they can afford right now, even if it means less-than-optimal perf/dollar a year or two from now, when not just the GPU but the CPU needs upgrading as well.
The problem is that something like an i5 4690K/i7 4790K will provide 36-74% higher gaming performance than a G3220 when paired with a GPU at the R9 290X performance level in many popular games. So why spend $460-500 on a GTX 780 when a $280 R9 280X, or even a $220 GTX 760, will provide similar performance in those titles when paired with an anemic G3220? Your view advocates spending a lot of money on a GPU that depreciates significantly in value, while the performance benefit will be minimal in many modern games, from Watch Dogs to Metro Last Light to Thief to Crysis 3 to Elder Scrolls V. That's a lot of popular games, and it doesn't even touch the new games in the next 12 months that will all tank on a dual-core non-i3 CPU. Do you really think you'll be able to play Witcher 3, The Division, Dragon Age Inquisition, or BF:H multiplayer on a G3220 and get 90% of the performance of a 780?
Elder Scrolls Skyrim - i7 4790K is 65% faster than the G3220
Metro Last Light - i7 4790K is 74% faster than the G3220
Thief - i7 4790K is 36% faster than the G3220
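To make the perf-per-dollar point concrete, here's a rough sketch in Python. The 45 FPS figure is purely illustrative, not from any benchmark; the point is that when the G3220 is the bottleneck, every card delivers roughly the same frame rate, so the cheapest card wins on FPS per dollar:

```python
# When a CPU bottleneck caps the frame rate, every GPU delivers roughly
# the same FPS, so perf-per-dollar favors the cheapest card.
cpu_bound_fps = 45  # illustrative frame rate with a G3220 limiting all GPUs

gpu_prices = {"GTX 780": 480, "R9 280X": 280, "GTX 760": 220}

for card, price in sorted(gpu_prices.items(), key=lambda kv: kv[1]):
    print(f"{card}: ${price}, {cpu_bound_fps / price:.3f} FPS per dollar")
```

On those assumptions the 760 delivers more than double the FPS per dollar of the 780, while the on-screen experience is the same.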
The OP is far better off long term getting something like an R9 280X for $280 and a Core i5 4690K. As he upgrades his monitor and games get more GPU-demanding, he can resell the 280X and get a new videocard.
"Maximum resolution is 1600x10"
Given how quickly GPUs depreciate in value and get replaced, what's the point of "future-proofing" with a 780 for 1680x1050? It's even worse because the 780 is overpriced relative to the performance level it offers, and it will be anchored by the G3220 on top of that.
Honestly, if the OP is interested in Watch Dogs, at his resolution I'd get the Asus GTX 760 for $217 that comes bundled with Watch Dogs, and then upgrade to 20nm GPUs in 18-20 months if he needs more performance.
What you aren't considering is that in the 4 years since the $500 GTX 480 came out, we can now buy a faster card for $130. Four years from now, 780-level performance will be available at $150, while a $500 card by June 2018 should be 2.25-2.5x faster, if not more.
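The GTX 480 example implies a fairly steady depreciation rate. A quick sanity check in Python, using the prices above and assuming a constant annual rate:

```python
# GTX 480-level performance: $500 at launch (2010), ~$130 four years later.
start_price, end_price, years = 500, 130, 4

# Constant-rate model: end_price = start_price * (1 - rate) ** years
rate = 1 - (end_price / start_price) ** (1 / years)
print(f"Implied GPU depreciation: {rate:.1%} per year")  # → 28.6% per year
```

Losing nearly 30% of its value every year is exactly why sinking $500 into a GPU for a CPU-bottlenecked rig is such a poor trade.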
The smarter strategy is to invest into the CPU platform and upgrade the GPU more frequently.
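Here's a minimal sketch of what that strategy looks like in total spend over roughly 4 years. Only the $280 R9 280X and ~$480 GTX 780 figures come from this thread; the G3220/i5 prices, the mid-cycle GPU price, and the 280X resale value are my own rough guesses for illustration:

```python
# Rough 4-year spend comparison between the two build strategies.
# Placeholder prices are flagged in the comments below.
def four_year_spend(cpu, gpus, resale=0):
    """Total hardware outlay over ~4 years, net of any GPU resale."""
    return cpu + sum(gpus) - resale

# "Big GPU now": G3220 (~$70, guess) + GTX 780 ($480),
# plus the i5 upgrade (~$240, guess) you end up needing anyway
gpu_heavy = four_year_spend(cpu=70 + 240, gpus=[480])

# "Good CPU now": i5 4690K (~$240, guess) + R9 280X ($280),
# resold mid-cycle (~$130, guess) toward a newer ~$250 card (guess)
cpu_first = four_year_spend(cpu=240, gpus=[280, 250], resale=130)

print(gpu_heavy, cpu_first)
```

Under those assumptions the CPU-first route spends less overall and finishes the 4 years with both a strong CPU and a current-generation GPU, instead of a stale 780 bolted to a Pentium.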
"A few weeks ago, before Watch Dogs' release, people were praising Haswell Core i3 gaming performance. Suddenly everyone is using Watch Dogs as an example for quad+ core CPUs."
Who exactly was praising i3s for gaming? The i3 4330 costs just $50 less than an i5. Even a Core i7 860 @ 3.9GHz from September 2009 is without question a better overall gaming CPU than any i3 in 2014. You can cherry-pick all you want showing that an i3 2100 is not a bottleneck in some games for 780 SLI, but what about the games where it tanks the GPU's potential?
Compare that to an overclocked 1st-generation i7 in the same game.
An i3 for gaming should be avoided at all costs unless a gamer only plays games that use 2 cores. If someone is tight on money, they should buy a used i5/i7 from any generation and overclock it. A used 2500K can't be that much more expensive than a new i3.
Going the AMD route is not peachy either. In many games, even the FX-9xxx line-up compromises single-GPU performance.
It's pretty remarkable how the 1st-generation i7s overclocked to 3.9-4GHz have stood the test of time. This is even more evidence that a smart gamer should get an i5 4690K/i7 4790K and overclock it, or buy a used i5, preferably an i7, if they are really strapped for cash. The short-term money saved on an i3, or the constant upgrades on the AMD side just to catch up to Intel's CPUs from 4-5 years ago, seem like a long-term waste of money and time. If future games use more than 4 cores (hexa-core becomes beneficial), then an i3 will look even worse. The i5 4690K covers a gamer for most games that use 4 cores or fewer (this eliminates the i3) and for all games that depend on IPC (this eliminates all AMD CPUs). In the context of the total cost of a PC gaming rig, and given how quickly other components such as SSDs and GPUs depreciate, spending a little more for an i5 over an i3 or a lower-end AMD FX-6xxx/8xxx is hardly material, but it's well worth 4 years of peace of mind.