Question Is There Much Difference Between 75Hz and 100Hz Monitors for Gaming?

ascendant

Member
Jul 22, 2011
140
6
81
I'm looking at two Sceptre monitors: one is 24" and 100Hz, the other is 27" and 75Hz. Will that extra 25Hz make any significant visual difference? I'm not sure how much our eyes can actually process, or whether the extra smoothness would be noticeable or negligible.

I'd much rather have the 27" monitor, unless it's going to mean a fairly noticeable decrease in visual smoothness while gaming. Has anyone actually compared them, or seen testing on this (not on the refresh rate itself, but on the eye's ability to perceive the difference between refresh rates)? If so, have any tests shown the maximum Hz the human eye can actually process before it simply can't keep up and the difference becomes unnoticeable?
 

Tech Junky

Diamond Member
Jan 27, 2022
3,415
1,148
106
Since video is measured in multiples of 30fps, it seems a bit irrelevant. Typically you find them at 30/60/120/240 and so on. Where you spot things is in high-motion content, including the background.

For instance, I convert my OTA Plex files from TS to MKV using HandBrake. Well, I did something last week that removed a part of the process and dropped the filler frames, which gave me Max Headroom-style jittery motion in the processed files. I did manage to fix it by rebuilding the program and the underlying software pieces.

For normal video, though, I'd say there's not going to be much difference. Something like football or racing might show some slight differences.
 
Reactions: ascendant

ascendant

Member
Jul 22, 2011
140
6
81
Since video is measured in multiples of 30fps, it seems a bit irrelevant. Typically you find them at 30/60/120/240 and so on. Where you spot things is in high-motion content, including the background.

For instance, I convert my OTA Plex files from TS to MKV using HandBrake. Well, I did something last week that removed a part of the process and dropped the filler frames, which gave me Max Headroom-style jittery motion in the processed files. I did manage to fix it by rebuilding the program and the underlying software pieces.

For normal video, though, I'd say there's not going to be much difference. Something like football or racing might show some slight differences.
Thank you for the information. So then the more expensive "gaming" monitors with refresh rates beyond the typical 60Hz are all just a gimmick, and the difference will be negligible even with high-end gaming or HD video?
 

Tech Junky

Diamond Member
Jan 27, 2022
3,415
1,148
106
It depends on how sensitive you are to the motion. The higher the Hz, the more filler you get in the content to smooth things out.

Think of it like a flipbook animation. The slower you flip the pages, the more you see the transitions between the drawings. If the artist draws more pages to smooth the transitions, you don't see the jerkiness from page to page.

Personally, I don't bother with anything over 120. Even 60fps is pretty good. The other factor is the source, though, and how it's processed: the bitrate. For low-res content, say 480p, you can fit an hour in around 500MB, while 720p might be 1GB, 1080p can hit maybe 1.5GB, and 4K is all over the place but minimally 2GB+. The other factor in file size is the audio: whether it's stereo, 5.1, 7.1, or a special codec for spatial audio processing like Atmos. If you go to ATSC 3/MP4 it gets a little harder to compare, because there's no open codec to process the audio. It has to be handled by the hardware capturing the stream or by an app with the paid codec to convert it.
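The file-size figures above translate into rough average bitrates with simple arithmetic. A minimal sketch, using the hypothetical sizes from the post (real bitrates vary with codec, content, and audio track):

```python
# Rough average bitrate from file size and duration.
# Figures below are the ballpark sizes mentioned in the post, not measurements.

def avg_bitrate_mbps(size_gb: float, minutes: float) -> float:
    """Average bitrate in megabits per second (decimal GB)."""
    bits = size_gb * 8 * 1000**3      # gigabytes -> bits
    return bits / (minutes * 60) / 1e6

for res, size_gb in [("480p", 0.5), ("720p", 1.0), ("1080p", 1.5), ("4K", 2.0)]:
    print(f"{res}: ~{avg_bitrate_mbps(size_gb, 60):.1f} Mbps for an hour of video")
```

So an hour of 720p at 1GB works out to roughly 2.2 Mbps on average, which is why audio format and codec overhead can noticeably shift the totals.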
 

Tech Junky

Diamond Member
Jan 27, 2022
3,415
1,148
106
I find myself correcting the above: it's AC-4 for the audio side of ATSC 3.

Video keeps getting more complicated as time passes, like all tech. Just like Thunderbolt changed its name from 3 to 4: it's basically the same animal, but it unlocked use in the USB4 format by dropping the licensing costs. It's still 40Gbps in terms of speed, but dropping the licensing and strict requirements also allowed USB4 to have a base of 20Gbps.

On the TB path to TB5 / USB4v2, i.e. 80Gbps for data or up to 120Gbps for video, it all gets murky again. Data is easy to measure in this respect, as it either does it or doesn't.

But it's still MP2 for video on ATSC 1 and MP4 for ATSC 3. It's just the audio codec that presents more issues if you're not watching live and decoding it without processing through a third-party app.
 
Reactions: ascendant

Fallen Kell

Diamond Member
Oct 9, 1999
6,039
431
126
Just to clear up: the 30/60/120/240Hz rates are all remnants of processing for film and video playback (there is also 24Hz, which wasn't mentioned). This all comes down to fundamental math and the film and video standards that existed. Film was mainly shot at 24fps, partly because it was seen as good enough to capture motion, but also because it was slow enough to save on the cost of film stock (going faster meant needing more film per minute). TV broadcast initially used 30Hz but fairly quickly changed to 60Hz (in the US market; Europe and some other countries used 25Hz and 50Hz respectively).

120Hz was an interesting number because it was the first intersection of the three main video formats in the US market: 24Hz, 30Hz, and 60Hz content could all be displayed on a monitor with a 120Hz refresh rate without any change to the pacing of the original content (at 120Hz, 24Hz film simply shows each frame 5 times, 30Hz video shows each frame 4 times, and 60Hz video shows each frame 2 times). Earlier monitors and TV sets that attempted to show 24Hz film usually suffered because they were typically 60Hz displays, leading to a stutter in the film's pacing from 3:2 pulldown processing, wherein the first frame is shown for 3 refreshes and the next frame for 2 (this mathematically converts 24 frames per second into 60, as (24/2*3)+(24/2*2)=60). But as you can see, that created a jerky feel compared to the original, since half the frames are displayed 50% longer than the other half. 120Hz removed that stuttering. The same condition appears again at 240Hz, as everything is just doubled over the 120Hz values.
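The frame-repeat math above can be sketched in a few lines. This is just the arithmetic from the post, nothing more:

```python
# At 120 Hz, each common source rate divides evenly, so every source frame
# is shown the same whole number of times. At 60 Hz, 24 fps film does not
# divide evenly and needs 3:2 pulldown instead.

def repeats(display_hz: int, source_fps: int):
    """Repeat count per source frame, or None if it isn't a whole number."""
    return display_hz // source_fps if display_hz % source_fps == 0 else None

for fps in (24, 30, 60):
    print(f"{fps} fps on 120 Hz: each frame shown {repeats(120, fps)}x")

print(f"24 fps on 60 Hz: {repeats(60, 24)}")   # None -> pulldown required

# One second of 3:2 pulldown: 24 film frames alternate 3 and 2 refreshes.
pulldown = [3, 2] * 12
assert sum(pulldown) == 60   # (24/2*3) + (24/2*2) = 60 refreshes
```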

For computer displays, none of this is really an issue outside of playing back those video formats on the computer. For generated graphics, a display that can simply keep up with whatever your graphics card produces is the ideal display. There are also diminishing returns after a certain point, as the human eye and brain can only process a certain number of frames per second. Some of the latest studies have shown that for conscious interpretation of images, the limit is about 13 milliseconds: an image needs to be displayed on screen that long for a person to remember seeing it and know what it was. That translates loosely to ~77Hz. But also remember that the brain and eye did not evolve looking for things that mysteriously appear and disappear like images on a screen; they evolved for object recognition and tracking the movement of objects from an existing state to a new state. This is why studies have shown we can subconsciously process and recognize some images even faster (especially images of human faces, and even more so faces of people we know well, such as close family members). But this still leads back to the point that once you hit 120Hz, a person has a very hard time obtaining additional benefit from a faster display (we are already pushing the limits of reaction times at that point as well).
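The ~77Hz figure quoted above is just the reciprocal of the 13ms threshold:

```python
# Converting the ~13 ms conscious-recognition threshold into a refresh rate.
threshold_ms = 13
print(f"~{1000 / threshold_ms:.0f} Hz")
```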
 
Reactions: ascendant

ascendant

Member
Jul 22, 2011
140
6
81
Just to clear up: the 30/60/120/240Hz rates are all remnants of processing for film and video playback (there is also 24Hz, which wasn't mentioned). This all comes down to fundamental math and the film and video standards that existed. Film was mainly shot at 24fps, partly because it was seen as good enough to capture motion, but also because it was slow enough to save on the cost of film stock (going faster meant needing more film per minute). TV broadcast initially used 30Hz but fairly quickly changed to 60Hz (in the US market; Europe and some other countries used 25Hz and 50Hz respectively).

120Hz was an interesting number because it was the first intersection of the three main video formats in the US market: 24Hz, 30Hz, and 60Hz content could all be displayed on a monitor with a 120Hz refresh rate without any change to the pacing of the original content (at 120Hz, 24Hz film simply shows each frame 5 times, 30Hz video shows each frame 4 times, and 60Hz video shows each frame 2 times). Earlier monitors and TV sets that attempted to show 24Hz film usually suffered because they were typically 60Hz displays, leading to a stutter in the film's pacing from 3:2 pulldown processing, wherein the first frame is shown for 3 refreshes and the next frame for 2 (this mathematically converts 24 frames per second into 60, as (24/2*3)+(24/2*2)=60). But as you can see, that created a jerky feel compared to the original, since half the frames are displayed 50% longer than the other half. 120Hz removed that stuttering. The same condition appears again at 240Hz, as everything is just doubled over the 120Hz values.

For computer displays, none of this is really an issue outside of playing back those video formats on the computer. For generated graphics, a display that can simply keep up with whatever your graphics card produces is the ideal display. There are also diminishing returns after a certain point, as the human eye and brain can only process a certain number of frames per second. Some of the latest studies have shown that for conscious interpretation of images, the limit is about 13 milliseconds: an image needs to be displayed on screen that long for a person to remember seeing it and know what it was. That translates loosely to ~77Hz. But also remember that the brain and eye did not evolve looking for things that mysteriously appear and disappear like images on a screen; they evolved for object recognition and tracking the movement of objects from an existing state to a new state. This is why studies have shown we can subconsciously process and recognize some images even faster (especially images of human faces, and even more so faces of people we know well, such as close family members). But this still leads back to the point that once you hit 120Hz, a person has a very hard time obtaining additional benefit from a faster display (we are already pushing the limits of reaction times at that point as well).
Thank you so much, that is exactly what I was looking for - info on how much our brain can actually process. This definitely helps in my decision making. I'm on a budget, but also don't want to short myself if it was a substantial difference. So, I'm going to keep this in mind when I pick one out. I appreciate it!
 

dr1337

Senior member
May 25, 2020
341
589
106
In my opinion, the difference between 60Hz and 90Hz in a VR headset is night and day. 100Hz will be a lot more noticeable than 60, but 75? That's a harder call. For me, frame drops in VR mean motion sickness, so anything less than a stable 90 feels bad.

If anything, just go with the higher refresh rate for the reduction in input lag. Even in flat games the difference between a 60-75fps average and a 120fps average feels massive, but if you only play turn-based games or never swing your mouse around, you might not notice.
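The input-lag point comes down to per-frame interval, which shrinks non-linearly as refresh rate rises. A quick sketch of the frame times involved:

```python
# Frame time per refresh rate: the gap between consecutive frames.
# Note the savings shrink as Hz climbs (60 -> 75 saves more than 120 -> 144).

def frame_time_ms(hz: float) -> float:
    """Duration of one frame, in milliseconds, at the given refresh rate."""
    return 1000.0 / hz

for hz in (60, 75, 100, 120, 144, 240):
    print(f"{hz:>3} Hz -> {frame_time_ms(hz):.2f} ms per frame")
```

Going from 60Hz to 75Hz trims about 3.3ms per frame, while 120Hz to 144Hz trims only about 1.4ms, which is one way to see the diminishing returns discussed in this thread.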

Also, I just want to say that response times and people's reaction times vary a lot. There's a lot of ballpark science out there when people talk about video refresh rates, but nothing like a Nyquist frequency for motion smoothness has ever been established. Blur Busters has done a lot of research in this area, and they find motion quality always improves with higher refresh rate. https://blurbusters.com/4k-120hz-with-bonus-240hz-and-480hz-modes/ They find that even 480Hz makes a substantial difference in image quality when the picture has fast motion in it. So even if human reaction times are slow, or our speed at remembering a single image is limited, it's not the same story when images are moving quickly in flowing video.
 
Last edited:

ibex333

Diamond Member
Mar 26, 2005
4,091
120
106
Disclaimer: I hate all competitive FPS games, such as Overwatch, Apex Legends, or Call of Duty, with a burning passion. I do like FPS games with a good story and RPG elements, such as STALKER, Fallout, Mass Effect, or Doom.

There's literally NO DIFFERENCE unless you play FPS shooters competitively. I have a 75Hz monitor and a 144Hz monitor side by side, and there's NO DIFFERENCE when playing 90% of all games. I understand this is very subjective and many folks will argue about this until they're foaming at the mouth, but there's really no "increased smoothness" or "responsiveness". There's no ghosting or shadowing either. It's a marketing gimmick. All of it. What actually is very noticeable is higher resolution and panel technology. For example, IPS really does look better than TN or VA, and 1440p really does look much better than 1080p. Invest in higher-quality, higher-resolution panels. Don't invest in high refresh rate unless you are a competitive FPS gamer.

And yes, I very much know what I'm doing. I've read thousands of guides. I am using compatible cables, and yes, all my hardware is compatible. Furthermore, both monitors are set to their maximum refresh rates.

STILL, there's NO DIFFERENCE.
 

WelshBloke

Lifer
Jan 12, 2005
30,512
8,180
136
There's literally NO DIFFERENCE unless you play FPS shooters competitively. I have a 75Hz monitor and a 144Hz monitor side by side, and there's NO DIFFERENCE when playing 90% of all games. I understand this is very subjective and many folks will argue about this until they're foaming at the mouth, but there's really no "increased smoothness" or "responsiveness". There's no ghosting or shadowing either.

And yes, I very much know what I'm doing. I've read thousands of guides. I am using compatible cables, and yes, all my hardware is compatible. Furthermore, both monitors are set to their maximum refresh rates.

STILL, there's NO DIFFERENCE.
I can see the difference between 60fps and 144fps on the desktop, never mind games!
All I need to do is open a Windows VM in a window and I can tell. The VM sets its output to 60Hz while the rest of my desktop is 144Hz.
Or I can drag a window over to my secondary monitor that's running at 60Hz.

Just dragging a window around the desktop, you can see the difference, and I'm not someone that's sensitive to these things.
Don't get me wrong, 60Hz is fine for an office monitor where you mostly have pages of text that aren't moving, but you can absolutely see the difference.
 
Reactions: dr1337

ibex333

Diamond Member
Mar 26, 2005
4,091
120
106
I can see the difference between 60fps and 144fps on the desktop, never mind games!
All I need to do is open a Windows VM in a window and I can tell. The VM sets its output to 60Hz while the rest of my desktop is 144Hz.
Or I can drag a window over to my secondary monitor that's running at 60Hz.

Just dragging a window around the desktop, you can see the difference, and I'm not someone that's sensitive to these things.
Don't get me wrong, 60Hz is fine for an office monitor where you mostly have pages of text that aren't moving, but you can absolutely see the difference.

Respectfully, I just don't see it. Someone on YouTube said "look at your mouse pointer"; it leaves more of a trail on slower panels. Yes, I notice that. But these things do not translate to gaming at all for me. I play video games a lot and see no benefit whatsoever. For example, I'm playing the game "Forgive Me Father" on Steam right now, and I see no difference between my monitors other than panel color quality.
 

WelshBloke

Lifer
Jan 12, 2005
30,512
8,180
136
Respectfully, I just don't see it.
I mean, that's fine that you don't see it, but it's pretty hard not to notice the difference on the desktop.
Someone on YouTube said "look at your mouse pointer"; it leaves more of a trail on slower panels. Yes, I notice that.
If you can see it on the mouse pointer, you can certainly see it everywhere else; it just doesn't bother you.
But these things do not translate to gaming at all for me. I play video games a lot and see no benefit whatsoever. For example, I'm playing the game "Forgive Me Father" on Steam right now, and I see no difference between my monitors other than panel color quality.
I don't know that game, so I can't speak specifically about it. In some games I can tell the difference more than in others.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,366
10,061
126
My friend found that a standing tear that occurs in Tekken 8 at 1080p60 on a 60Hz screen without vsync no longer appears when the game is connected to a 144Hz FreeSync Premium display.
 

Fallen Kell

Diamond Member
Oct 9, 1999
6,039
431
126
My friend found that a standing tear that occurs in Tekken 8 at 1080p60 on a 60Hz screen without vsync no longer appears when the game is connected to a 144Hz FreeSync Premium display.
That makes sense, to be honest. The tear exists because the graphics card starts outputting a new frame while the previous frame is still being drawn, due to the monitor's refresh rate. VSYNC would remove the issue by forcing the graphics card to sync its frame output to the monitor's refresh rate, but it would also add some delay to the game (i.e. the game itself may run a frame or two ahead of what is rendered on screen, which matters a lot in fighting games like Tekken, where a single frame may be all the warning you get to defend, counter, or exploit a wrong move by the opponent). A FreeSync display also minimizes tearing, since the monitor adapts its refresh rate to what the graphics card is producing (up to the monitor's refresh limit).
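The "standing" tear also follows from timing math: when an unsynced 60fps output lands on a 60Hz panel, the buffer swap happens at the same phase of every scanout, so the tear sits at the same scanline each refresh. A toy sketch with hypothetical timing numbers (the 5ms offset and 1080-line screen are illustrative, not measurements):

```python
# Why an unsynced 60 fps output on a 60 Hz panel produces a *standing* tear:
# the swap lands at a fixed phase of each scanout, so the tear line stays put.

REFRESH_MS = 1000 / 60        # one 60 Hz scanout takes ~16.67 ms
SWAP_OFFSET_MS = 5.0          # hypothetical: GPU swaps 5 ms into each scanout

def tear_line(screen_height: int, swap_time_ms: float) -> int:
    """Scanline where the buffer swap interrupts the scanout."""
    phase = (swap_time_ms % REFRESH_MS) / REFRESH_MS
    return round(phase * screen_height)

# Same phase every frame -> the same scanline every refresh.
positions = {tear_line(1080, SWAP_OFFSET_MS + i * REFRESH_MS) for i in range(5)}
print(positions)
```

With mismatched rates (or a variable-refresh display), the phase changes every frame, so the tear either crawls across the screen or, with FreeSync, never occurs because the scanout waits for the swap.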
 