Watch Dogs 2 benchmarks


tg2708

Senior member
May 23, 2013
687
20
81
Did not want to create another thread, I just need a few inputs. I got a 980 Ti Classified, but upon checking the backplate area I see a sponge-like pad through one of the cutouts. Not sure if it's there to give the bare PCB some clearance from the backplate, but it's somewhat worrisome. The card temps seem out of whack, but judging from the various 1070s and 1080s I have tried, I'll chalk it up to bad case airflow; it still boosts to 1405. With the small quibble above, do you think it's worth exchanging?
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Just because you don't understand the value of CF/SLI, doesn't mean it's not there.

That's because there is no real value in crossfire/sli, nothing to understand.

Let's ignore how 290 CF that cost $500 is wiping the floor with a $550 GTX980 in BF1

Funny you use the launch price of the GTX 980 but forgot that the 290 cost $400 at launch. So it's a $550 GTX 980 vs. the stuttering mess and high latency of $800 worth of 290s in CrossFire. And let's not forget the game support for CrossFire/SLI sucks. Oh yeah, and the 290s use 3x more power!

The SLI/CF haters often haven't used SLI/CF in their life or in the last 3 years.

Ever think that the SLI/CrossFire haters who have used it DON'T USE IT NOW BECAUSE IT SUCKED?

If someone spends $1200-2400 on SLI/CF and expect support in almost all AAA games on launch date, they only created these unrealistic expectations for themselves and will always be disappointed.

Hmmm, if I spend $1200-$2400 on SLI/CF, I would expect DAY ONE profiles and SLI/CrossFire support in EVERY game.

It's also hilarious people defending another gimped GWs title when we know GameWorks = NV worked closely with the developer which means their cards should perform better especially since NV has a history of poorly optimized GWs games that also "as if by pure magic" cripple performance on older NV cards and GCN. It's been their strategy for the last 4 years.

Yeah, hate on Nvidia for trying to offer us more than just A PURE CONSOLE PORT. Ironically, the very thing you were complaining about. Console vs. 1080, right?

on a side note....

I don't understand where all the fuss about video card prices is coming from...
Poor people can afford to game at high settings at 60fps. If you save $0.75 a day for a year, that's about $275 to buy a video card every year, or $550 every 2 years. Unless you're a kid that depends on mommy to buy you stuff, gaming is NOT an expensive hobby.

Most of us with fairly decent jobs can game at ultra settings and buy whatever we want....

But what is the real difference between high and ultra settings in today's games?
A better-looking rock, or the shadow coming off the rock.

I personally want better visuals than an Xbox/PlayStation and faster framerates. At this time I'm just about due for an upgrade.
 

Ranulf

Platinum Member
Jul 18, 2001
2,755
2,191
136
It's always been this way. Driver updates would sometimes break some things while fixing others. Always.

Yes it has always been a factor, but it happens more because I need to update drivers far more often now than in the past.

I was referring to this:

The HoMM3 problem might have been D3D9, but the game worked fine on a laptop with Intel graphics (a 2010 chip running 2012 drivers or so). Both systems are running Win7, not Win10. PA was a problem with AMD drivers for sure, but there was a workaround at least.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Getting a GTX 1070 today means he will need an upgrade next year if he wants to max out everything.
The GTX 1070 is not able to max all games at 1080p today; next year it will be worse.

Yup, thank AAA console ports and GameWorks features. With GWs features, the GTX 1080 drops to 22-30 fps at 1080p. That's right.

https://www.youtube.com/watch?v=AIKm3882Zbk

Ahh hell, not even the GTX 1080 can max out all games at 1080p. Even without PCSS it couldn't reach 60 fps, not even inside the tunnel.

Yes, for those graphics, this performance on a $600-700 GPU at 1080p is bad, bad, bad. It's time to recommend the TITAN XP for 1080p; NV will love this.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Are there any tests without the crippling GameWorks effects cranked up? All I see is "omg 1060 is faster than 480" while the game is unplayable at its 40fps.

It's even worse once the game is tested in downtown areas. CPU performance gets hammered. If this is the future of AAA games in 2017, a $400 8-core/16-thread Zen with 5960X-level performance is going to sell like hotcakes.

The mediocre i3 6100 is worthless and no i5 can hit 60 fps averages. I love how all the AMD FX haters ignore all the gaming CPU benchmarks where the i3 and i5 fall flat on their face. The January 2011 2600K is once again outperforming the i5-6600.






I understand that there's little value in it for me. I mostly play old games and non-AAA games. Sometimes I use emulators, which never support SLI. And assuming CF/SLI still doesn't work well there, I also like to use windowed mode without losing 50% of my GPU, lol.

What am I missing? That 2 1070s beat a 1080 in games that support SLI? Cool. If those are your games go for it. Don't pretend it's the more reliable option.

I get what you are saying, but what old or non-AAA games require a GTX 1070/1080 level of GPU to enjoy them? Unless you are playing them at 4K, or at 1080p with SSAA, or using a 144Hz monitor?

I never pretended 1070 SLI is the more reliable option. It's simple logic and mathematics at work. When SLI scales well (let's say GTX 1070 SLI gets 60% scaling), you land at roughly 128% of a GTX 1080, extremely close to a $1200 Titan XP level of performance. When SLI scaling doesn't work, you still get about 80% of a GTX 1080's performance from the single card. Even if SLI only works in 50% of games, the average works out to 128% * 0.5 + 80% * 0.5 = 104% of a GTX 1080, so GTX 1070 SLI still edges out a 1080 on average. And in games released in the last 5 years, SLI works in a lot more than 50% of cases, which pushes that average well past the 1080.
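
To make that averaging explicit, here is a quick back-of-envelope sketch (illustrative only; the 80% single-card ratio, the 60% scaling figure and the support rates are assumptions taken from the paragraph above, not measured data):

Code:
# Expected GTX 1070 SLI performance relative to a single GTX 1080,
# under the assumptions stated above (not measured data).
single_1070_vs_1080 = 0.80   # assumed: one 1070 ~= 80% of a 1080
sli_scaling         = 1.60   # assumed: 2-way SLI = 1.6x one card when it works

def expected_vs_1080(support_rate):
    """Average performance vs. a GTX 1080, given the fraction of games
    where an SLI profile actually works."""
    works = single_1070_vs_1080 * sli_scaling   # ~1.28x a 1080
    fails = single_1070_vs_1080                 # 0.80x a 1080
    return support_rate * works + (1 - support_rate) * fails

for rate in (0.5, 0.75, 0.9):
    print(f"SLI support in {rate:.0%} of games -> {expected_vs_1080(rate):.2f}x a GTX 1080")
# 50% -> 1.04x, 75% -> 1.16x, 90% -> 1.23x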

Another point is that GTX 1070 SLI has some serious power for 4K gaming (or 1440p with higher MSAA), while the 1080 barely provides a superior gaming experience over a single 1070/980 Ti. The hate for SLI is strong on this forum and HardOCP, but it doesn't align with the benchmarks.
http://www.guru3d.com/articles_pages/geforce_gtx_1070_2_way_sli_review,16.html

I was pointing out that if the 4GB bottleneck is not found, the 3GB one is also not found. The GTX 780 Ti would also fall on its face if it ran out of VRAM.

I see your point. Should we add 1-2 lines about how Kepler is going to join GeForce 5 and 7 as one of the worst NV GPU architectures ever made? Remember all those people who defended and recommended the 780/780 Ti over the R9 290/290X? Oh, the fun.

The $700 780 Ti is slower than a $400 R9 290 (aka R9 390), and equals the RX 470 4GB in this game:
http://www.guru3d.com/articles_pages/watch_dog_2_pc_graphics_performance_benchmark_review,6.html

In November 2013, the GTX 780 Ti cost $699. In November 2016, the RX 470 4GB is $150. 3GB of VRAM or not, the 780 Ti is a complete turd and will go down in history as one of the worst flagship GPUs ever made by AMD or NV. The 780 Ti was only good for 1 year, until November-December 2014, after which point it got completely annihilated by the R9 290/290X/390/390X in the AAA games of 2015-2016. Out of the 3 years one expects to use a GPU, the 780 Ti lasted just 1 year. The less said about the $650 GTX 780, the better. That card is now performing on par with a $299 R9 280X.
1200MHz is the max boost clock, not the average. The average is 1000-1100MHz. You must set the max power limit in Afterburner and maybe then it will boost at 1200MHz the whole time.

Every review I've read online during the GTX 980 Ti launch reported that the max boost was 1.2GHz in games.

"For you spec junkies out there we reached a consistent in-game frequency of 1201MHz on the GPU at 84c with 52% fan speed at a maximum voltage of 1.1930v."
http://www.hardocp.com/article/2015..._980_ti_video_card_gpu_review/11#.WD9PG-YrKHs

"The reference card proved to be a willing companion during the OC testing but we decided to limit fan speeds to 50% so were ultimately limited by temperatures. Nonetheless, Boost speeds easily reached 1330MHz with a few peaks around the 1350MHz mark."

A 1350MHz 980 Ti was 7.9% faster in GTA V and 6.9% faster in The Witcher 3 compared to a reference stock 980 Ti. That means the reference 980 Ti was boosting between 1232-1261MHz out of the box, with zero overclocking (a quick linear-scaling check below lands in the same ballpark).

Your assertion that the 980 Ti max boosts to only 1000-1100MHz has already been proven wrong a million times, back at the 980 Ti's launch when people claimed the card had 40-50% overclocking headroom. The 980 Ti has 25-35% overclocking headroom, not 40-50%.
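
As a rough sanity check (illustrative only, not from any review), you can back out the implied reference boost clock from those performance deltas by assuming performance scales roughly linearly with core clock; real scaling is slightly sub-linear, so the true clock sits a bit below these estimates, consistent with the ~1230-1260MHz range above.

Code:
# Infer the reference 980 Ti's out-of-the-box boost clock from the measured
# performance advantage of a card overclocked to 1350MHz, assuming performance
# scales ~linearly with core clock (an upper-bound estimate).
oc_clock_mhz = 1350.0
perf_gains = {"GTA V": 0.079, "The Witcher 3": 0.069}   # gains quoted above

for game, gain in perf_gains.items():
    implied_ref_clock = oc_clock_mhz / (1.0 + gain)
    print(f"{game}: implied reference boost ~{implied_ref_clock:.0f}MHz")
# ~1251MHz (GTA V) and ~1263MHz (The Witcher 3); slightly sub-linear scaling
# pulls the real figure a little lower.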

Also, PCGamesHardware is not favouring NV cards. Their 7970 vs 680 review was excellent:
http://www.pcgameshardware.de/Grafi...sts/Test-Geforce-GTX-680-Kepler-GK104-873907/

That's one review from almost 5 years ago. What about all the other reviews of individual games they tested? I won't link to 100 of their game reviews, but they always showed a preference for the 680 over the 7970, the 770 over the R9 280X/7970GHz, the 780 Ti/780 over the 290X/290, the 970 over the R9 390, the 980 over the Fury, etc.

Computerbase in Europe, TPU in North America and TechSpot in New Zealand are THE gold standard of objective GPU reviews. I've followed them for 7-8 years and, overall, they tend to be the least biased (although TPU is often very late updating their games and retesting all the GPUs with the latest drivers, and they also had a tendency to litter their reviews with GameWorks titles, but they've improved a lot).

https://www.computerbase.de/thema/grafikkarte/rangliste/#diagramm-performancerating-2560-1440

Computerbase has AIB 1070 beating Fury X by 23% at 1080p and by 21% at 1440p across 24 games
https://www.computerbase.de/thema/grafikkarte/rangliste/#diagramm-performancerating-1920-1080

TPU has AIB 1070 beating Fury X by 22% at 1080p and by 16% at 1440p across 21 games
https://www.techpowerup.com/reviews/MSI/GTX_1070_Quick_Silver_OC/29.html

Yet, as if by pure magic/coincidence, Ubisoft's NV-partnered + GameWorks open-world games happen to gain an additional 25-30% on NV cards compared to AMD? Really?...



AIB 1070's lead grows to 50% at 1440p.


Interestingly enough, that wasn't the case in 2014 with Unity. Why is that?





I would even accept it if Ubisoft's open-world games looked stunning, but they don't. It looks like they haven't learned anything from Unity.

Ahh hell, not even the GTX 1080 can max out all games at 1080p. Even without PCSS it couldn't reach 60 fps, not even inside the tunnel.

Yes, for those graphics, this performance on a $600-700 GPU at 1080p is bad, bad, bad. It's time to recommend the TITAN XP for 1080p; NV will love this.

Don't forget you are going to need a $1000-1100 i7-5960X or i7-6900K too. A $1000 8-core Intel CPU + a $550-600 GTX 1080 to barely outperform the graphics of a $250 PS4 Slim. Add a second GTX 1080 in SLI to barely have graphics that are better than a $400 PS4 Pro.

No wonder console owners make fun of PC users. I would too if I saw a $2000-3000 PC barely having superior graphics to a $250-400 console.

Cannot wait to upgrade to a $1700 10-core Skylake-X and $1500 Volta GTX2080 SLI to play the next open-world Ubisoft optimized wonder.

LOL! Ubisoft may delay the next AC to 2018 [Insert: since by then a $450 Volta GPU should have the power of a $1200 Titan XP and gamers won't be as upset to play that game at 60 fps 1080p.]
 
Reactions: Bacon1

dogen1

Senior member
Oct 14, 2014
739
40
91
Ahh hell, not even the GTX 1080 can max out all games at 1080p. Even without PCSS it couldn't reach 60 fps, not even inside the tunnel.

MSAA in a game like this is similar in cost to SSAA.

25% more pixels + SMAA looks better and is likely faster.
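
For context, here is a quick sketch (illustrative only, assuming a 1920x1080 / 16:9 baseline, which isn't stated explicitly above) of what a 25% pixel increase actually means as a render resolution:

Code:
# What "25% more pixels" than 1080p works out to, assuming a 16:9 base.
base_w, base_h = 1920, 1080
pixel_scale = 1.25                 # 25% more pixels in total
axis_scale = pixel_scale ** 0.5    # ~1.118 per axis

new_w, new_h = round(base_w * axis_scale), round(base_h * axis_scale)
print(f"{base_w}x{base_h} -> ~{new_w}x{new_h} "
      f"({base_w * base_h / 1e6:.2f} MP -> {new_w * new_h / 1e6:.2f} MP)")
# Roughly 2147x1207, i.e. about a 112% resolution scale, with SMAA on top.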

I get what you are saying, but what old or non-AAA games require a GTX 1070/1080 level of GPU to enjoy them? Unless you are playing them at 4K, or at 1080p with SSAA, or using a 144Hz monitor?

I like to use SSAA or DSR for older games. Also, certain emulator features can be highly demanding on the GPU.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
I play old emulators at 6x internal res minimum. There are certain Dolphin titles where you can benefit from having a 1070. Doesn't hurt to use it. My 290 is good enough for Dolphin except for a few niche spots in Galaxy. I bet performance has since improved though.
 

dogen1

Senior member
Oct 14, 2014
739
40
91
I play old emulators at 6x internal res minimum. There are certain Dolphin titles where you can benefit from having a 1070. Doesn't hurt to use it. My 290 is good enough for Dolphin except for a few niche spots in Galaxy. I bet performance has since improved though.

In PCSX2 some games are bandwidth-bound even at native ~512x448 resolution. Apparently super high bandwidth (GDDR5X/HBM2 even) may be useful for increasing emulation accuracy as well. The PS2 is a pig to emulate though, lol.
 

tg2708

Senior member
May 23, 2013
687
20
81
The 980 Ti boosts to 1450 at peak but settles at 1405 after it hits 80°C and above. Is that good?
 

tential

Diamond Member
May 13, 2008
7,348
642
121
The 980 Ti boosts to 1450 at peak but settles at 1405 after it hits 80°C and above. Is that good?
That's what I would expect. The 980 Ti is a beast GPU. I'm still mad at myself for not picking up an MSI 980 Ti... I'd be happy with my gaming right now. Oh well.
 

Innokentij

Senior member
Jan 14, 2014
237
7
81
Going back 5-6 generations of GPU testing, as far back as I can remember, PCGamesHardware has always favoured NV cards more than AMD, no matter the generation or the games. The most objective European site is Computerbase.de.

OK, computerbase.de seems to be your favourite, so I will base my reply on that site.

What's your point, that 780 Ti SLI 3GB does poorly? What does that have to do with Fury X CF outperforming the GTX 1080 8GB? The entire discussion was on 4GB VRAM being a bottleneck. 4GB VRAM bottleneck not found.

Can you or other people claiming 4GB VRAM bottleneck on the Fury X provide evidence backing this up with hard facts?


From computerbase.de :

The Fiji GPU, in the form of the Radeon R9 Fury X, is (once again) not doing well. This is partly due to its only four gigabytes of memory: the texture DLC cannot be used on graphics cards with that amount of VRAM. The Radeon R9 Fury X is only seven percent faster than the Radeon RX 480. At 2,560 × 1,440 the lead increases, but is still quite low at 18 percent.

----


High memory consumption in Watch Dogs 2
The DLC with the higher-resolution textures should only be downloaded for graphics cards with 6,144 MB of memory or more. On a four-gigabyte graphics card it has little influence on the average framerate, but it does affect frametimes: the picture stutters now and then, even in Full HD. Even six gigabytes are not always enough; from 3,840 × 2,160 upward, eight gigabytes are needed.


Without the texture DLC the graphics card should have 4 GB, with it at least 6 GB
If you do without the texture package, a four-gigabyte graphics card copes well up to 2,560 × 1,440. For Ultra HD, however, the graphics card should have at least six gigabytes.

The texture quality is slightly lower without the highest quality level, but not seriously so. However, since high texture details cost no performance, only graphics card memory, they should be used on a card with enough of it.

----


The texture DLC requires up to 8 GB of memory
For maximum texture details at Ultra HD, a large graphics memory of 8,192 MB is required. For lower resolutions like Full HD, six gigabytes are enough. If you only have four gigabytes, you have to reduce the texture details.

Yup, thank AAA console ports and GameWorks features. With GWs features, the GTX 1080 drops to 22-30 fps at 1080p. That's right.

Not GameWorks plural, one GameWorks feature. Only one feature takes a lot of performance, and that is the shadows. As you can see, the other GameWorks features work great and run BETTER on AMD.

If sufficient computing power is available, HBAO+ should be used. With the Ultra preset, the GameWorks effect costs only three percent on the GeForce GTX 1060, while on a Radeon RX 480 it is only one percent.

---

70 percent performance loss with HFTS shadows

As far as performance is concerned, the two shadowing techniques are once again extremely demanding. PCSS costs 54 percent of the FPS on an AMD GPU; on an Nvidia chip it is 55 percent. HFTS reduces the speed on a GeForce GTX 1060 by a further nine percent. The performance difference between HFTS and the Ultra shadows thus reaches an extreme 70 percent.

The performance losses of PCSS and HFTS depend strongly on the lighting. The alternative shadows cost significantly less performance at night and hit hardest at midday. The test series is based on the worst-case scenario.


---

As you can see, it's the tech itself that is very costly, not whether it's Nvidia or AMD running it.


Computerbase.de got 52 fps on an RX 480

GameGPU shows 45 fps for the RX 480. GameGPU often picks some of the most demanding scenes, and as soon as AMD's latest drivers came out they updated the test, even splitting the results into GameWorks off and GameWorks on. It is true that GameGPU often rushes the review out before the latest drivers and patches are released, but they then update it with the latest drivers/patches, and they also do year-end testing with all of the popular games released in 2016.

GameGPU has also not shown much bias when testing NV or AMD cards, unlike PCGamesHardware, PCLabs, etc., which almost always have NV cards leading. Computerbase and TechSpot also have an excellent track record of reliable and objective GPU testing.


The 52 FPS number you came up with is from the Very High settings; use Google Translate and check it yourself. The correct number for Ultra is 39.

Now don't get me wrong, I'm sure you know a lot about graphics cards, but you don't know it all. I know a lot too, you see, and I actually took the time to read it, and what you posted is very far off from reality. And no, GameGPU is not a great site; it's lazy at best.

I added the benchmarks I found into a spreadsheet for all to see, and the funny thing is that PCGamesHardware, the site you said favoured Nvidia, has the RX 480 at its highest FPS compared to all the other sites.

https://docs.google.com/spreadsheets/d/1FtVphRyUABwTwW1lxErDSI7PDjzR5K2os-VHcC9j8bo/edit?usp=sharing

Sorry for the way I quoted; I might know a lot about GPUs, but I do not know a lot about forums and I don't act like I do. Being humble goes a long way in life.
 

tg2708

Senior member
May 23, 2013
687
20
81
That's what I would expect. The 980ti is a beast gpu. I still am mad at myself for not picking up an MSi 980ti.... I'd be happy right now with gaming. Oh well.

It has an 82% ASIC quality and Samsung memory, FWIW. It might be a bit foolish not to choose the 1070, but there is something about it I don't like. And after looking at recent reviews, the price premium of the 1080 is no longer worth it to me.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
The mediocre i3 6100 is worthless and no i5 can hit 60 fps averages. I love how all the AMD FX haters ignore all the gaming CPU benchmarks where the i3 and i5 fall flat on their face. The January 2011 2600K is once again outperforming the i5-6600.

More and more BS.
The $115 i3 6100 is the best price/performance chip money can buy, when overclocked, for most games. That's not why I bought one, though. I bought one at release because at the time it ran every game at or around 60fps+.
Now I just overclock it to get about 60fps.

Every i5 can hit 4.5GHz+ and mop the floor with anything AMD has to offer 99% of the time.

The i7 2600K should be compared to an i7 4790K or 6700K or 7700K, not a plain i5 6600 non-K.

AMD haters? What's to like about AMD CPUs? They are power-guzzling, underperforming pieces of crap. Even AMD lovers will tell you that.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
It has an 82 asic and Samsung memory fwiw. It might be a bit foolish for not choosing the 1070 but their is something about it I don't like. And after looking at recent reviews the price premium of the 1080 is no longer worth it to me.
There is no question that at this time, if I were picking between the two, I'd get a 1070. You ALWAYS want to be on Nvidia's latest architecture if you're using Nvidia. But I'm of course talking about the best deals possible and getting a GTX 1070 for $330-340. You're interested in the 1080 Ti; to me, that's the only level of performance I care about this generation. I want to see what a 1080 Ti will do, what AMD's competitor will do, and what AMD's next cut-down chip below that does (and of course how it's priced). I want to enjoy high-end PC gaming one last time.
Based on how this game performs, it looks like the PS5/Xbox Two will outclass the PC in graphics quality as the ports stay unoptimized and the hardware/optimization only gets better from Sony and MS. The golden era of PC graphics quality is coming to an end.
 

Head1985

Golden Member
Jul 8, 2014
1,867
699
136
Every review I've read online during the GTX 980 Ti launch reported that the max boost was 1.2GHz in games.

"For you spec junkies out there we reached a consistent in-game frequency of 1201MHz on the GPU at 84c with 52% fan speed at a maximum voltage of 1.1930v."
http://www.hardocp.com/article/2015..._980_ti_video_card_gpu_review/11#.WD9PG-YrKHs

"The reference card proved to be a willing companion during the OC testing but we decided to limit fan speeds to 50% so were ultimately limited by temperatures. Nonetheless, Boost speeds easily reached 1330MHz with a few peaks around the 1350MHz mark."

A 1350MHz 980 Ti was 7.9% faster in GTA V and 6.9% faster in The Witcher 3 compared to a reference stock 980 Ti. That means the reference 980 Ti was boosting between 1232-1261MHz out of the box, with zero overclocking.

Your assertion that the 980 Ti max boosts to only 1000-1100MHz has already been proven wrong a million times, back at the 980 Ti's launch when people claimed the card had 40-50% overclocking headroom. The 980 Ti has 25-35% overclocking headroom, not 40-50%.
Computerbase test system guide 2016:
The reference GTX 980 Ti boosts at 1066MHz; the aftermarket Gigabyte 980 Ti runs at 1290-1300MHz (that card is faster than the GTX 1070 in their tests and also 28% faster than the Fury X).
https://www.computerbase.de/2016-05..._die_benutzten_grafikkarten_und_die_taktraten
 

tg2708

Senior member
May 23, 2013
687
20
81
Computerbase test system guide 2016:
The reference GTX 980 Ti boosts at 1066MHz; the aftermarket Gigabyte 980 Ti runs at 1290-1300MHz (that card is faster than the GTX 1070 in their tests and also 28% faster than the Fury X).

Oh wow so that means my 980 ti at 1405 boost is rock solid then.
 

Head1985

Golden Member
Jul 8, 2014
1,867
699
136
Oh wow so that means my 980 ti at 1405 boost is rock solid then.
A 1400MHz 980 Ti is very fast. It's pretty close to beating every GTX 1070, even one at 2150/9500 (tested in COD IW).
That is why I like PCGamesHardware benchmarks: we all know what performance per clock we have. This is why I was surprised that an aftermarket 980 Ti at 1350MHz is 2% slower than a GTX 1070 at 1924MHz in Watch Dogs 2.
But it will still beat a GTX 1070 at 1500/8000 for sure (because a GTX 1070 at 1924MHz has almost zero OC headroom left).
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
MSAA in a game like this is similar in cost to SSAA.

25% more pixels + SMAA looks better and is likely faster.

It hardly makes a difference. A GTX 1060 cannot achieve a 50 fps average and the 1080 manages just 66 fps. That's without HBAO+/PCSS and without maxed-out car headlights/reflections.

Joker Productions - this is NOT fully maxed out Ultra settings.

1080p - Average / Minimum FPS
RX 480 = 42 / 34
GTX1060 6GB = 47 / 40
GTX1080 = 66 / 54

1440p
RX480 = 30 / 23
GTX1060 6GB = 34 / 27
GTX1080 = 51 / 42


This game is definitely in the running for the worst optimized AAA game of 2016 but the competition is very tough this year.

I play old emulators at 6x internal res minimum. There are certain Dolphin titles where you can benefit from having a 1070. Doesn't hurt to use it. My 290 is good enough for Dolphin except for a few niche spots in Galaxy. I bet performance has since improved though.

What if every AAA game had native 4-16x SSAA in the menu? Should we start testing GPUs at 4K with SSAA on too? Using questionable settings on 10-20 year old games that ran on NES/SNES/N64/GC, etc. does not change the fact that emulated games still look outdated compared to BF1/SW:BF/Metro LL, Rise of the Tomb Raider, etc.

Using your logic and emulators, I could take a GPU 10X faster than GTX1080 and render it internally at 8K or 16K just because I can. Once we get 8K TVs/monitors, you'll complain that a 2025 GPU isn't fast enough at emulated XB1/PS4/Switch games. Meanwhile an RX 480 achieves in excess of 80 fps in Infinite Warfare and 110 fps in Doom. There is a difference between requiring a GTX1070/1080 because you are running a cutting edge 2015-2017 PC title and running an emulated console game that was made by some Polish/Ukrainian/Russian programmers because they don't want to buy legitimate games.

I get your point though: Under particular circumstances, no amount of GPU power is enough

The 980 Ti boosts to 1450 at peak but settles at 1405 after it hits 80°C and above. Is that good?

It's good for a reference card, nothing special for an AIB card. The best AIB 980Tis hit 1525-1550mhz and most hit 1475-1500mhz.

That's what I would expect. The 980 Ti is a beast GPU. I'm still mad at myself for not picking up an MSI 980 Ti... I'd be happy with my gaming right now. Oh well.

Well, that card would have cost you $650+ USD in 2015. The same level of performance can be had in a $350-370 GTX 1070, but you aren't happy with that. That's confusing to me. What was wrong with buying a GTX 1070 for $400 six months ago? Let's say next year you buy a $700 Vega/GP102 1080 Ti; in 12 months a $400 Volta card will be better.

My point is you can buy a $700-800 GPU and keep it for 2-3 years (780Ti/980Ti/1080) or just buy the $400 x70 card every generation. Are you trying to time a $700 flagship card purchase that will last 5 years? That's not going to happen anymore.

Now don't get me wrong, I'm sure you know a lot about graphics cards, but you don't know it all. I know a lot too, you see, and I actually took the time to read it, and what you posted is very far off from reality. And no, GameGPU is not a great site; it's lazy at best.

I added the benchmarks I found into a spreadsheet for all to see, and the funny thing is that PCGamesHardware, the site you said favoured Nvidia, has the RX 480 at its highest FPS compared to all the other sites.

https://docs.google.com/spreadsheets/d/1FtVphRyUABwTwW1lxErDSI7PDjzR5K2os-VHcC9j8bo/edit?usp=sharing

What matters is where the GPUs stand relative to each other in the same game, not just the raw FPS.

Without GameWorks features, Computerbase has the RX 480 at 52.8 fps; you have it at 39 fps.
https://www.computerbase.de/2016-11...abschnitt_benchmarks_von_full_hd_bis_ultra_hd

I am guessing you took it from this page with HBAO+?
https://www.computerbase.de/2016-11/watch-dogs-2-benchmark/2/

The way you presented the benchmarks is actually a very good way to see where performance lands on average. It's tricky, though, since most sites don't use the same settings when testing games.

Sorry for the way I quoted; I might know a lot about GPUs, but I do not know a lot about forums and I don't act like I do. Being humble goes a long way in life.

OK, that's fair. I am all for using 5-6 review sites to better gauge the performance in a game. The bigger take-away for me is how unoptimized the game is overall. That doesn't look good for PC gaming as a whole. It's a much bigger theme than NV vs. AMD.
 

tg2708

Senior member
May 23, 2013
687
20
81
A 1400MHz 980 Ti is very fast. It's pretty close to beating every GTX 1070, even one at 2150/9500 (tested in COD IW).
That is why I like PCGamesHardware benchmarks: we all know what performance per clock we have. This is why I was surprised that an aftermarket 980 Ti at 1350MHz is 2% slower than a GTX 1070 at 1924MHz in Watch Dogs 2.
But it will still beat a GTX 1070 at 1500/8000 for sure (because a GTX 1070 at 1924MHz has almost zero OC headroom left).

Yeah, thanks for the info. I'm just going to leave it for a while and let it boost by itself; I'm terrible at checking for stability. But then again, The Witcher 3 is very good at finding OC instability, quite a sensitive game.
 

casiofx

Senior member
Mar 24, 2015
369
36
61
The performance of an AIB 980 Ti and a 1070, even after both are overclocked, is too close to be noticeable in game. But the 1070 does offer:
1) Simultaneous multi-projection without performance loss and correct ratios for multi-screen 3D views.
2) Uses 100 watts less, and is available in compact form factors.
3) 2GB more VRAM.
4) Much easier to resell when Volta comes out; it can be sold to folks running cheap PSUs.
 
Reactions: RussianSensation

tg2708

Senior member
May 23, 2013
687
20
81
@Russian the card boosts there on its own, I did not touch anything. Plus, considering its rated boost is 1291, that's still pretty good.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
Because Nvidia won't let me have G-Sync on the monitors I want. So 1070 performance would mean RX 490 performance... which is nowhere to be seen for a long time. And I don't want to buy and then sell in 5 months. I want to enjoy 4K, so a flagship is my best bet, unless AMD has something good for $500.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
1400Mhz 980Ti is very fast.Its pretty close to beat every GTX1070 even at 2150/9500(tested in COD IW)
That is why i like pcgameshardware benchmarks.We all know what performance/clock we have.This is why i was surprised that aftermarket 980Ti at 1350Mhz is 2% slower than GTX1070 at 1924mhz in watchdogs2

Computerbase test system guide 2016
GTX980TI 1066Mhz boost.Aftermarket gigabyte 980TI 1290-1300Mhz(this card is faster than GTX1070 in their tests and also 28% faster than furyx.)
https://www.computerbase.de/2016-05..._die_benutzten_grafikkarten_und_die_taktraten

It's hard to tell if the Tested Clock they report is the average or the maximum.

Here is Sweclockers:

980Ti reference
Base = 1000
Boost = 1076
Max Turbo they saw in games = 1176mhz
http://www.sweclockers.com/test/20703-gigabyte-geforce-gtx-980-ti-g1-gaming/8#content

980Ti G1
Base = 1152
Boost = 1241
Max Turbo they saw in games = 1341mhz

1341mhz G1 / 1176 mhz reference card = 14% increase in GPU clock speed

Actual gaming performance @ 1440p = G1 leads 980Ti reference by 14%. (160/140 rating).

Max overclock on the G1 980Ti was 1482mhz or 26% faster over the reference 980Ti's 1176mhz max boost.
http://www.sweclockers.com/test/20703-gigabyte-geforce-gtx-980-ti-g1-gaming/8#content

The numbers you linked aren't adding up. In the Computerbase chart, the G1's boost is 1290-1304MHz, an average of 1297MHz. Against their 1066MHz figure for the reference 980 Ti, that is a 22% increase in GPU clock speed (1297MHz / 1066MHz). However, the actual gaming performance increase is only 128% / 108% = 18.5%. For that to be true, the reference 980 Ti must have been boosting closer to 1094MHz, not 1066MHz.

Either way, it seems you are correct that some 980 Ti reference cards boost to around 1100MHz, but I am also correct that many 980 Ti reference cards boost to 1150-1230MHz, as I already showed you from at least 3 GPU reviews.
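
Spelling that back-calculation out (a sketch under the same linear clock-to-performance assumption used above, illustrative only):

Code:
# Implied reference 980 Ti boost clock from the Computerbase figures quoted
# above, assuming performance tracks core clock roughly linearly.
g1_boost_mhz = 1297.0          # average of the G1's 1290-1304MHz boost range
perf_ratio   = 1.28 / 1.08     # G1 rating vs. reference rating, ~18.5% faster

implied_reference_boost = g1_boost_mhz / perf_ratio
print(f"Implied reference 980 Ti boost: ~{implied_reference_boost:.0f}MHz")
# ~1094MHz rather than the 1066MHz listed in the test-system table.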

But it will still beat a GTX 1070 at 1500/8000 for sure (because a GTX 1070 at 1924MHz has almost zero OC headroom left).

All 3 of the 1070s I have can boost to 2060-2080MHz, and 2080MHz is another 8% over 1924MHz. Either way, we already know that the GTX 1070 is one of the most heavily cut-down x70 cards ever made, but you aren't considering the context. It's possible to buy almost two GTX 1070s today for the price of a single 980 Ti a year ago. The fact that a 980 Ti @ 1500MHz beats a GTX 1070 @ 2GHz is largely irrelevant, because someone buying a card in 2016 can get GTX 1070 SLI for $700-740, or purchase a single 1070 for $350-370 and have $300 left over from not spending it on a 980 Ti in 2015.

@Russian the card boosts there on its own, I did not touch anything. Plus, considering its rated boost is 1291, that's still pretty good.

Oh my bad! I misunderstood you. That's amazing then! Excellent 980Ti. Did you buy it in 2016?
 

StinkyPinky

Diamond Member
Jul 6, 2002
6,946
1,250
126
Not even sure why people compare the likes of the 1070 to the 980 Ti. They're not in the same market. The 980 Ti was a high-end enthusiast card whereas the 1070 is the 970 replacement. The fact that the 1070 meets or even beats the previous generation's high-end enthusiast card is pretty good in my book. People who bought the 1070 (like me) weren't looking to replace a 980 Ti-tier card. I was looking to replace a Radeon 290, and that's the market 1070 buyers are in.
 