https://youtu.be/JkT_fQTofOU?t=42
What's with the shadows? It looks like they're reacting to the water.
Just because you don't understand the value of CF/SLI doesn't mean it's not there.
Let's ignore how a $500 R9 290 CF setup is wiping the floor with a $550 GTX 980 in BF1.
The SLI/CF haters often have never used SLI/CF in their lives, or at least not in the last 3 years.
If someone spends $1200-2400 on SLI/CF and expects support in almost all AAA games on launch day, they have only created those unrealistic expectations for themselves and will always be disappointed.
It's also hilarious to see people defending another gimped GameWorks title when we know GameWorks means NV worked closely with the developer, which means their cards should perform better, especially since NV has a history of poorly optimized GWs games that, "as if by pure magic", also cripple performance on older NV cards and GCN. It's been their strategy for the last 4 years.
It's always been this way. A driver update would sometimes break some things while fixing others. Always.
I was referring to this:
Getting a GTX 1070 today means he'll need an upgrade next year if he wants to max out everything.
The GTX 1070 can't max out all games at 1080p even today; next year it will be worse.
Yup, thank AAA console ports and GameWorks features. With GWs features enabled, a GTX 1080 drops to 22-30 fps at 1080p. That's right.
https://www.youtube.com/watch?v=AIKm3882Zbk
Are there any tests without the crippling GameWorks effects cranked up? All I see is "omg the 1060 is faster than the 480" while the game is unplayable at 40 fps.
I understand that there's little value in it for me. I mostly play old, non-AAA games. Sometimes I use emulators, which never support SLI. And assuming CF/SLI still doesn't work well there, I also like to use windowed mode without losing 50% of my GPU lol.
What am I missing? That two 1070s beat a 1080 in games that support SLI? Cool. If those are your games, go for it. Just don't pretend it's the more reliable option.
I was pointing out that a 4 GB bottleneck was not found...
A 3 GB bottleneck was also not found. The GTX 780 Ti would likewise fall on its face if it ran out of VRAM.
1200 MHz is the max boost clock, not the average. The average is 1000-1100 MHz. You must set the max power limit in Afterburner, and maybe then it will boost at 1200 MHz the whole time.
Also, PCGamesHardware is not favouring NV cards. Their 7970 vs 680 review was excellent:
http://www.pcgameshardware.de/Grafi...sts/Test-Geforce-GTX-680-Kepler-GK104-873907/
Ah hell, not even the GTX 1080 can max out all games at 1080p. Even without PCSS it couldn't reach 60 fps, not even inside the tunnel.
Yes, for those graphics this performance on a $600-700 GPU at 1080p is bad, bad, bad. It's time to recommend the TITAN XP for 1080p; NV will love this.
I get what you are saying, but what old or non-AAA games require a GTX 1070/1080-level GPU to enjoy? Unless you are playing them at 4K, or at 1080p with SSAA, or on a 144Hz monitor?
I play old emulators at 6x internal res minimum. There are certain Dolphin titles where you benefit from having a 1070. Doesn't hurt to use it. My 290 is good enough for Dolphin except for a few niche spots in Galaxy. I bet performance has since improved, though.
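For a rough sense of the pixel load: Dolphin's internal-resolution scale multiplies the native 640x528 framebuffer, so 6x is already past 4K in pixel count. A quick Python sketch of the arithmetic:

    # Dolphin renders at multiples of the GC/Wii native 640x528 framebuffer.
    native_w, native_h = 640, 528
    scale = 6
    w, h = native_w * scale, native_h * scale   # 3840 x 3168
    print(w * h)          # ~12.2 million pixels per frame
    print(3840 * 2160)    # ~8.3 million pixels at 4K, for comparison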
That's what I would expect. The 980 Ti is a beast GPU. I am still mad at myself for not picking up an MSI 980 Ti... I'd be happy right now with gaming. Oh well.
The 980 Ti boosts to 1450 at peak but settles at 1405 after it hits 80°C and above. Is that good?
Going back 5-6 generations of GPU testing, as far back as I can remember, PCGamesHardware has always favoured NV cards more than AMD, no matter the generation or the games. The most objective European site is Computerbase.de.
OK, computerbase.de seems to be your favourite, so I will base my reply on that site.
What's your point, that 780 Ti SLI with 3 GB does poorly? What does that have to do with Fury X CF outperforming the 8 GB GTX 1080? The entire discussion was about 4 GB of VRAM being a bottleneck. 4 GB VRAM bottleneck not found.
Can you, or the other people claiming a 4 GB VRAM bottleneck on the Fury X, provide evidence backing this up with hard facts?
From computerbase.de:
The Fiji GPU, in the form of the Radeon R9 Fury X, is (again) not doing well. This is partly due to its only four gigabytes of memory: the texture DLC cannot be used on graphics cards with that little VRAM. The Radeon R9 Fury X is only seven percent faster than the Radeon RX 480. At 2,560 × 1,440 the lead increases, but at 18 percent it is still quite small.
----
High memory consumption in Watch Dogs 2
The DLC with the higher-resolution textures should only be downloaded for graphics cards with 6,144 MB of memory or more. On a four-gigabyte graphics card it has little influence on the average framerate, but it already affects the frametimes: the picture stalls now and then, even in Full HD. Even six gigabytes are not always enough; from 3,840 × 2,160 upward, eight gigabytes are needed.
Without the texture DLC the graphics card should have 4 GB; with it, at least 6 GB
If you forgo the texture pack, a four-gigabyte graphics card copes well up to 2,560 × 1,440. For Ultra HD, however, the graphics card should have at least six gigabytes.
Texture quality is slightly lower without the highest quality level, but the difference is not serious. However, since high texture details cost no performance, only graphics memory, they should be used on a card with enough VRAM.
----
The texture DLC requires up to 8 GB of memory
For maximum texture details at Ultra HD, a large graphics memory of 8,192 MB is required. For lower resolutions like Full HD, six gigabytes are enough. If you only have four gigabytes, you have to reduce the texture details.
Yup, thank AAA console ports and GameWorks features. With GWs features enabled, a GTX 1080 drops to 22-30 fps at 1080p. That's right.
Not GameWorks, GameWork, as in only one feature carries a high performance cost, and that is the shadows. As you can see, the other GameWorks features work great and run BETTER on AMD.
If sufficient computing power is available, HBAO+ should be used. With the Ultra preset, the GameWorks effect costs only three percent on the GeForce GTX 1060, and on a Radeon RX 480 it is only one percent.
---
70 percent performance loss with shadows on HFTS
As far as performance is concerned, the two shadowing techniques are once again extremely demanding. PCSS costs 54 percent of the FPS on an AMD GPU; on an Nvidia chip it is 55 percent. HFTS reduces the speed of a GeForce GTX 1060 by a further nine percent. The performance difference between HFTS and the Ultra shadows is thus an extreme 70 percent.
The performance losses of PCSS and HFTS depend strongly on the lighting. The alternative shadows cost significantly less at night and hit hardest at midday. The test series is based on the worst-case scenario.
---
As you can see, it's the tech itself that is very costly, regardless of whether Nvidia or AMD is running it.
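To put those percentages into fps terms, here is a quick sanity check in Python, assuming a made-up 60 fps baseline with the Ultra shadow preset (only the percentage costs come from the quote above):

    baseline = 60.0                 # hypothetical fps with Ultra shadows
    print(baseline * (1 - 0.54))    # PCSS on AMD:  27.6 fps (-54%)
    print(baseline * (1 - 0.55))    # PCSS on NV:   27.0 fps (-55%)
    print(baseline * (1 - 0.70))    # HFTS on 1060: 18.0 fps (-70% vs Ultra)

So whichever vendor runs it, PCSS roughly halves the framerate, and HFTS takes off another chunk on top.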
Computerbase.de got 52 fps on an RX 480
GameGPU shows 45 fps for the RX 480. GameGPU often picks some of the most demanding scenes, and as soon as AMD's latest drivers came out, they updated the test, even splitting the results into GameWorks Off and GameWorks On. It is true that GameGPU often rushes the review out before the latest drivers and patches are released, but they then update the review with the latest drivers/patches, and they also do year-end testing with all of the popular games released in 2016.
GameGPU has also not shown much bias when testing NV or AMD cards, unlike PCGamesHardware, PCLabs, etc., which almost always have NV cards leading. Computerbase and TechSpot also have an excellent track record of reliable and objective GPU testing.
The 52 fps number you came up with is from Very High settings; use Google Translate and check it yourself. The correct number for Ultra is 39.
The mediocre i3 6100 is worthless, and no i5 can hit 60 fps averages. I love how all the AMD FX haters ignore the gaming CPU benchmarks where the i3 and i5 fall flat on their faces. The January 2011 2600K is once again outperforming the i5-6600.
There is no question that at this time, if I were picking between the two, I'd get a 1070. You ALWAYS want to be on Nvidia's latest architecture if you're using Nvidia. But I'm of course talking about the best deals possible and getting a GTX 1070 for $330-340. You're interested in the 1080 Ti. To me, that's the only level of performance I care about this generation. I want to see what a 1080 Ti will do, what AMD's competitor will do, and what AMD's next cut-down chip does (and of course how it's priced). I want to enjoy high-end PC gaming one last time.
It has an 82 ASIC and Samsung memory, FWIW. It might be a bit foolish not to choose the 1070, but there is something about it I don't like. And after looking at recent reviews, the price premium of the 1080 is no longer worth it to me.
Computerbase test system guide 2016
Every review I've read online during the GTX 980 Ti launch reported that the max boost was 1.2 GHz in games.
"For you spec junkies out there we reached a consistent in-game frequency of 1201MHz on the GPU at 84c with 52% fan speed at a maximum voltage of 1.1930v."
http://www.hardocp.com/article/2015..._980_ti_video_card_gpu_review/11#.WD9PG-YrKHs
"The reference card proved to be a willing companion during the OC testing but we decided to limit fan speeds to 50% so were ultimately limited by temperatures. Nonetheless, Boost speeds easily reached 1330MHz with a few peaks around the 1350MHz mark."
A 1350 MHz 980 Ti was 7.9% faster in GTA 5 and 6.9% faster in The Witcher 3 compared to a reference stock 980 Ti. That means the reference 980 Ti was boosting between roughly 1232-1261 MHz out of the box, with zero overclocking (see the quick back-calculation below).
Your assertion that the 980 Ti max boosts to only 1000-1100 MHz was already proven wrong a million times during the 980 Ti's launch, when people claimed the card had 40-50% overclocking headroom. The 980 Ti has 25-35% overclocking headroom, not 40-50%.
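Here is the back-calculation, assuming performance scales roughly linearly with core clock (a simplification; real scaling is usually slightly sub-linear) and using the 1330-1350 MHz OC range from the quoted review:

    oc_low, oc_high = 1330, 1350    # OC boost range from the review above
    gains = {"GTA 5": 0.079, "The Witcher 3": 0.069}
    for game, gain in gains.items():
        lo, hi = oc_low / (1 + gain), oc_high / (1 + gain)
        print(f"{game}: implied reference boost ~{lo:.0f}-{hi:.0f} MHz")
    # GTA 5: ~1233-1251 MHz; The Witcher 3: ~1244-1263 MHz

Either way, the implied out-of-the-box boost lands well above 1000-1100 MHz.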
Computerbase test system guide 2016
GTX 980 Ti: 1066 MHz boost. Aftermarket Gigabyte 980 Ti: 1290-1300 MHz (this card is faster than the GTX 1070 in their tests and also 28% faster than the Fury X).
https://www.computerbase.de/2016-05..._die_benutzten_grafikkarten_und_die_taktraten
A 1400 MHz 980 Ti is very fast. It's pretty close to beating every GTX 1070, even at 2150/9500 (tested in COD IW).
Oh wow, so that means my 980 Ti at 1405 boost is rock solid then.
MSAA in a game like this is similar in cost to SSAA.
25% more pixels + SMAA looks better and is likely faster.
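For context, "25% more pixels" at 1080p is only about a 1.12x scale per axis; simple arithmetic, assuming plain resolution scaling:

    base = 1920 * 1080              # 2,073,600 pixels at 1080p
    scaled = base * 1.25            # 2,592,000 pixels (+25%)
    per_axis = 1.25 ** 0.5          # ~1.118x per axis
    print(round(1920 * per_axis), round(1080 * per_axis))   # ~2147 x 1207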
Now don't get me wrong, I'm sure you know a lot about graphics cards, but you don't know it all. I too know a lot, you see, and I actually took the time to read it, and what you posted is very far off from reality. And no, GameGPU is not a great site; it's lazy at best.
I added the benchmarks I found into a spreadsheet for all to see, and the funny thing is that PCGamesHardware, the site you said favoured Nvidia, has the RX 480 at its highest fps compared to all the other sites.
https://docs.google.com/spreadsheets/d/1FtVphRyUABwTwW1lxErDSI7PDjzR5K2os-VHcC9j8bo/edit?usp=sharing
What matters is each GPU's standing in the same game, not just the raw FPS.
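A minimal sketch of what I mean, with placeholder numbers rather than the spreadsheet's actual values: normalize each site's RX 480 result against the same site's GTX 1060 result, and the standings can agree even when the raw fps differ a lot:

    # Hypothetical per-site results; scenes and settings differ between
    # sites, so raw fps isn't comparable, but within-site ratios are.
    results = {
        "site_a": {"RX 480": 52.8, "GTX 1060": 55.0},
        "site_b": {"RX 480": 39.0, "GTX 1060": 41.0},
    }
    for site, fps in results.items():
        print(site, round(fps["RX 480"] / fps["GTX 1060"], 2))
    # both come out around 0.95-0.96 despite very different absolute fps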
Without GameWorks features, Computerbase has the RX 480 at 52.8 fps; you have it at 39 fps.
https://www.computerbase.de/2016-11...abschnitt_benchmarks_von_full_hd_bis_ultra_hd
I am guessing you took it from this page with HBAO+?
https://www.computerbase.de/2016-11/watch-dogs-2-benchmark/2/
The way you presented the benchmarks is actually a very good way to see where the performance lands on average. It is tricky, since most sites don't use the same settings when testing games.
Sorry for the way I quoted; I might know a lot about GPUs, but I don't know a lot about forums, and I don't act like I do. Being humble goes a long way in life.
OK, that's fair. I am all for using 5-6 review sites to better gauge the performance in a game. The bigger take-away for me is how unoptimized the game is overall. That doesn't look good for PC gaming as a whole. It's a much bigger theme than NV vs. AMD.
A 1400 MHz 980 Ti is very fast. It's pretty close to beating every GTX 1070, even at 2150/9500 (tested in COD IW).
That is why I like PCGamesHardware benchmarks: we all know what performance/clock we are getting. This is why I was surprised that an aftermarket 980 Ti at 1350 MHz is 2% slower than a GTX 1070 at 1924 MHz in Watch Dogs 2.
But it will still beat a GTX 1070 at 1500/8000 for sure (because a GTX 1070 at 1924 MHz has almost zero OC headroom left).
@Russian the card boosts there on its own; I did not touch anything. Plus, considering its rated boost is 1291, that is still pretty good.