
View Full Version : Overclocking screen refresh rates.



L3nnuk
06-02-2013, 12:28 PM
Hey, I made this screen overclocking guide. Tried it on my desktop monitor, but it fails above 63 Hz lol.

If anyone is interested then check out the guide: http://diyhacksandmods.blogspot.com/2013/06/how-i-overclocked-acer-aspire-one-522.html

Hope someone finds this useful and posts results of their refresh rate gainz.

EDIT: Before anyone complains: Yeah, this is my own blog and I'm trying to get some viewers. But the content should be beneficial for everyone who wants to get more out of their screens.

Hazzah
06-02-2013, 05:09 PM
Hey, I found this screen overclocking guide. Tried on my desktop monitor, but it fails after 63 hz lol.



EDIT: Before anyone complains: Yeah, this is my own blog and I'm trying to get some viewers. But the content should be beneficial for everyone who wants to get more out of their screens.

Then you didn't find this guide, you wrote it. That's a little misleading, and if others are like me, we won't be too happy about being misled.

L3nnuk
06-02-2013, 05:26 PM
Alright, fixed it.

Hazzah
06-02-2013, 05:40 PM
Does doing this lower battery life on laptops? Are there any adverse effects (does it shorten screen life)?

I also assume that this would void any warranty as well?

Ian
06-02-2013, 05:43 PM
What are the advantages to this?

L3nnuk
06-02-2013, 06:32 PM
What are the advantages to this?

Smoother movement on screen, and less tearing if you get high FPS. Also, if you use v-sync, it sets the FPS limit higher.


Does doing this lower battery life on laptops? Are there any adverse effects (does it shorten screen life)?

I also assume that this would void any warranty as well?

Yes, it will void the warranty, but since it's a software modification it's reversible and shouldn't be detectable. I haven't heard of anyone killing their screen with this; you can find more info in the thread I linked in the blog (the thread was made by the guy who wrote this software).
Also, the effect on battery life was minimal.

If things go south and you get a black screen in Windows, simply booting into safe mode and removing the bad setting will fix it. Alternatively, running the reset file in safe mode erases all the changes.

denart5
06-02-2013, 08:06 PM
There is NO, I repeat NO, point going higher than 60 Hz UNLESS you have a 3D monitor, in which case you will need at least 120 Hz. Or if you are playing fast-paced shooters at a top level. Other than that you won't need anything higher than 60 Hz, except if you are using a CRT monitor, where you may want 75 Hz (which needs no overclocking in most cases). I did a lot of research on this topic when buying my new monitor. Seriously, you won't see any difference with an LCD.

L3nnuk
06-15-2013, 06:51 PM
Smoother movement, can set vsync over 60 fps.. I see plenty of points to get over 60hz screens.

Brandon
06-15-2013, 07:18 PM
Smoother movement, can set vsync over 60 fps.. I see plenty of points to get over 60hz screens.

No such thing.. It's all in the mind. The guy above you is right because 60hz is the standard rate at which most graphics cards/computers draw to the screen.

Better monitor isn't going to give you more FPS at all and the only thing that will give you a better look and a better feel and overall better for your eyes, is the resolution.. Not the cycles per second because regardless of how many cycles per second your monitor has, it's always up to the developer/game/cpu/graphics card to decide on how fast to draw and how many frames per cycle.

To get more, the developer has to increase the draw rate/frame rate.. Also you'll need a whole new computer or cpu or graphics card which draws at that rate.

You can try it by drawing to the screen in an infinite loop using GDI on windows or OpenGL/DirectX with no sleep in the loop.. OR you can put a sleep for 60 ms and see that the FPS will still be the same. PS. The infinite loop will lag your whole comp because your processor will be giving that thread all the processor power and not sharing with any others.

Also note there's something called a gameloop which is a concept that states that all users playing the game must tick at the same rate regardless of lag because it'd be quite unfair if some ticked before others and were able to do things before others.. http://en.wikipedia.org/wiki/Game_programming

Visit that link above then press Ctrl+F then type game loop.

Hence RS's 600ms tick cycle and 50-FPS max cap.. Same goes for videos too.. It's all about the encoded frame rate, cpu and graphics card.. Not the monitor itself. The monitor is there to make things look prettier, not speed them up.

You're wasting your time and money.. Overclocking your monitor may just take away from its lifetime rather than help you at all..

Note: Lowering your monitor's draw rate is going to display less, but increasing it does nothing unless the developer supports a higher rate.
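The fixed-tick game loop idea referenced in the post above can be sketched roughly like this (a minimal Python sketch; the 600 ms tick mirrors the RS figure mentioned in the thread, and the helper names are made up for illustration, not taken from any real client):

```python
# Fixed-timestep sketch: rendering may happen many times per tick,
# but game logic only advances on tick boundaries, so every client
# ticks at the same rate regardless of how fast frames are drawn.
TICK_MS = 600  # tick length used for illustration (the thread's RS example)

def count_ticks(elapsed_ms, tick_ms=TICK_MS):
    """How many logic ticks have completed after `elapsed_ms` of wall time."""
    return elapsed_ms // tick_ms

def simulate(frame_times_ms, tick_ms=TICK_MS):
    """Replay a list of frame durations; return (frames_rendered, ticks_run)."""
    elapsed, ticks = 0, 0
    for ft in frame_times_ms:
        elapsed += ft                      # one frame rendered
        ticks = count_ticks(elapsed, tick_ms)  # logic catches up on boundaries
    return len(frame_times_ms), ticks

# 90 frames of 20 ms each = 1800 ms of play = 3 logic ticks of 600 ms
frames, ticks = simulate([20] * 90)
```

Two clients with different frame rates would report different `frames` but the same `ticks` for the same elapsed time, which is the fairness point the post is making.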

L3nnuk
06-15-2013, 07:47 PM
No such thing.. It's all in the mind. The guy above you is right because 60hz is the standard rate at which most graphics cards/computers draw to the screen.

Better monitor isn't going to give you more FPS at all and the only thing that will give you a better look and a better feel and overall better for your eyes, is the resolution.. Not the cycles per second because regardless of how many cycles per second your monitor has, it's always up to the developer/game/cpu/graphics card to decide on how fast to draw and how many frames per cycle.

To get more, the developer has to increase the draw rate/frame rate.. Also you'll need a whole new computer or cpu or graphics card which draws at that rate.

You can try it by drawing to the screen in an infinite loop using GDI on windows or OpenGL/DirectX with no sleep in the loop.. OR you can put a sleep for 60 ms and see that the FPS will still be the same. PS. The infinite loop will lag your whole comp because your processor will be giving that thread all the processor power and not sharing with any others.

Also note there's something called a gameloop which is a concept that states that all users playing the game must tick at the same rate regardless of lag because it'd be quite unfair if some ticked before others and were able to do things before others.. http://en.wikipedia.org/wiki/Game_programming

Visit that link above then press Ctrl+F then type game loop.

Hence RS's 600ms tick cycle and 50-FPS max cap.. Same goes for videos too.. It's all about the encoded frame rate, cpu and graphics card.. Not the monitor itself. The monitor is there to make things look prettier, not speed them up.

You're wasting your time and money.. Overclocking your monitor may just take away from its lifetime rather than help you at all..

Note: Lowering your monitor's draw rate is going to display less, but increasing it does nothing unless the developer supports a higher rate.

If it's all in the mind and the refresh rate has nothing to do with the feel of the picture, then according to you 120 Hz screens don't really work and above-60 Hz screens are just a scam.

I never said that getting a monitor with a better refresh rate will give you more FPS. It will just let your screen output more of the rendered frames.
Sure, you can have over 300 FPS in some game, but if your monitor is only 60 Hz, then it will draw only 60 of the 300 rendered frames on your screen (and some come out with a tearing effect). Or if you get less than 60 FPS, then some frames will simply be redrawn, resulting in a choppy picture when moving.

Brandon
06-15-2013, 08:56 PM
If it's all in the mind and the refresh rate has nothing to do with the feel of the picture, then according to you 120 Hz screens don't really work and above-60 Hz screens are just a scam.


Did you read the note at the end where I mentioned "UNLESS" the developer/cpu/graphics card supports that rate? It's why you overclock your CPU and Graphics card.. Not your monitor.

Read this: http://www.pcmag.com/article2/0,2817,2379206,00.asp

"Source footage is never greater than 60hz". Should be enough to grab your attention no?

That nice "effect" you're talking about "seeing" is your monitor compensating.. aka drawing what's in the buffer multiple times over.. For example, some game sends an image to a buffer at 60 cycles per second, aka a 60 Hz refresh rate. A 120 Hz monitor isn't going to speed anything up or fix anything at all. All it's going to do is draw the same image twice, because it's 2x as fast and the game hasn't had time to update the buffer yet.. Your driver mostly controls the draw rate.

You might want to read what tearing is again: http://en.wikipedia.org/wiki/Screen_tearing
Also read the prevention part. 60hz is standard so it's obvious that something else is causing your tearing: video card technology, software in use, and the nature of the video material. In the very rare case, buying a new monitor might help I guess but it's less likely; unlike a new gfx card, new software, etc.

As long as your monitor is at a standard 60hz or higher rate, tearing shouldn't occur. Bad developers don't follow the right concepts and that's probably why you think you need a new monitor or overclocked monitor.



I never said that getting a monitor with a better refresh rate will give you more FPS. It will just let your screen output more of the rendered frames.


Read what you just said.. What does "let your screen output more rendered frames" mean to you? Sounds like you're talking about FPS, no? Because that's exactly what it is.. the amount of frames rendered in one second? Yes? I'm not sure if you're confusing yourself or me :S



Sure, you can have over 300 FPS in some game, but if your monitor is only 60 Hz, then it will draw only 60 of the 300 rendered frames on your screen (and some come out with a tearing effect). Or if you get less than 60 FPS, then some frames will simply be redrawn, resulting in a choppy picture when moving.


lol.. I dunno what in the world you're talking about.. Anyone who programs games, or perhaps plays them, knows that all the frames are going to be drawn. FPS stands for the amount drawn in a single second.. not the total amount drawn.. So no, you're not going to have 60/300 drawn... you're going to have all 300 drawn. The time in which they're drawn is your FPS. You may have 300 drawn in 2 seconds and someone else may have all 300 drawn in 1 second.. thus your FPS would be 150/s and theirs 300/s..

That doesn't mean it's going to draw less at all. Your screen would be the limiting factor then, because at different refresh rates it may "skip" some frames since they don't fall within its cycle. However, game programmers take care of that in the game loop by sleeping at a standard rate and making sure everyone who plays sees the same things at the same time regardless of input. Thus the lag is on the client side (your computer), not the server.

No no. No frames are redrawn. They are simply drawn at a delay. Aka a lag. Try running a Debugger on any game, and you'll see that it never sends the same instruction twice or redraws the same frame because you lagged or your cpu threw a hissy fit.


Anyway, check out Smart's source on Benland's github.. See line 145: https://github.com/BenLand100/SMART/blob/master/src/Client.java
Then see line 326.. Now when you Run Smart, press the ~ (tilde) key to open the RS developer console and type "displayfps" without the quotes..

Now run the official RS client and do the same.. You should notice that the FPS is the exact same! Ok so now remove the two lines I just mentioned and compile/run Smart.. Again, type displayfps. It should be the exact same but guess what? You lag now because your CPU isn't sharing any processing time with other threads. Now go buy a new monitor, come back and do the same tests.. tell me what's changed?

L3nnuk
06-15-2013, 09:35 PM
Refresh rate and FPS are not the same thing.
FPS = Frames per Second = How many distinct images a GPU outputs, per second

Refresh Rate = Number of times a screen is capable of DISPLAYING, per second

Hence, if you have a 60 Hz monitor and are outputting 120 FPS, you only actually display 60 frames. Likewise, if you have a 120 Hz monitor and 75 FPS, you will output 75 unique frames.
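The distinction drawn above boils down to a `min()`. A small Python sketch (the helper names are illustrative, not any real API) matching the post's two examples:

```python
def unique_frames_shown(gpu_fps, refresh_hz):
    """Unique frames the panel can actually display per second:
    whichever of GPU output rate and refresh rate is lower."""
    return min(gpu_fps, refresh_hz)

def redraws_per_second(gpu_fps, refresh_hz):
    """Refresh cycles that re-show an already-displayed frame
    (only happens when the GPU is slower than the panel)."""
    return max(0, refresh_hz - gpu_fps)

# 120 FPS into a 60 Hz panel -> only 60 unique frames shown
# 75 FPS into a 120 Hz panel -> all 75 unique frames shown
```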

Brandon
06-15-2013, 09:38 PM
Refresh rate and FPS are not the same thing.
FPS = Frames per Second = How many distinct images a GPU outputs, per second

Refresh Rate = Number of times a screen is capable of DISPLAYING, per second

Hence, if you have a 60 Hz monitor and are outputting 120 FPS, you only actually display 60 frames. Likewise, if you have a 120 Hz monitor and 75 FPS, you will output 75 unique frames.


Did you just google that or read it from the links in my previous post? Because: "Can set vsync over 60 fps.. I see plenty of points to get over 60hz screens."

Certainly doesn't seem to be what you were talking about.. You also haven't mentioned a distinct difference before and no reason or "plenty of points" as to why you should get over 60hz screens..

However, I already understand where this is going and where it's going to go and what you were trying to say. You just never said it.

I apologize if I confused you or myself when reading what you said previously. I'll go now, but I won't advise anyone to waste money on anything higher until the standard rate changes.

L3nnuk
06-15-2013, 09:49 PM
Did you just google that or read it from the links in my previous post? Because: "Can set vsync over 60 fps.. I see plenty of points to get over 60hz screens."

Certainly doesn't seem to be what you were talking about.. You also haven't mentioned a distinct difference before and no reason or "plenty of points" as to why you should get over 60hz screens..

However, I already understand where this is going and where it's going to go and what you were trying to say. You just never said it.

I apologize if I confused you or myself when reading what you said previously. I'll go now, but I won't advise anyone to waste money on anything higher until the standard rate changes.

Yeah, I searched for a better explanation, since it was hard for me to explain what I wanted to say. I might also have phrased something poorly, which made it confusing (for me and others).

I will read everything over again and reply a bit later; I'll try to be clearer.

EDIT: http://www.rockpapershotgun.com/2013/03/04/week-in-tech-overclock-your-monitor-with-nvidia/
Some of the stuff I wanted to explain is written better there^
EDIT 2: and this http://www.tweakguides.com/Graphics_7.html

Brandon
06-15-2013, 10:24 PM
Yeah, I searched for a better explanation, since it was hard for me to explain what I wanted to say. I might also have phrased something poorly, which made it confusing (for me and others).

I will read everything over again and reply a bit later; I'll try to be clearer.

EDIT: http://www.rockpapershotgun.com/2013/03/04/week-in-tech-overclock-your-monitor-with-nvidia/
Some of the stuff I wanted to explain is written better there^
EDIT 2: and this http://www.tweakguides.com/Graphics_7.html


So in other words, according to that article:



If your FPS is less than your refresh rate at any time, the same frame may simply be redrawn several times by the monitor.

If your FPS is higher than your refresh rate at any time, your monitor will not actually be able to display all of these frames, and some will come out with a graphical glitch known as Tearing. To prevent this, you can enable an option called Vertical Synchronization (VSync). However here's the tricky part: if VSync is enabled, then your refresh rate and FPS will have a direct relationship with each other - they will become synchronized together. This is all covered in more detail in the Vertical Synchronization section of this guide.


Exactly what I said? The same frame redrawn multiple times. Also note that the VSync which you mentioned does nothing then.. You said you'd increase yours to 60 FPS.. Yet RS is capped at 50. Thus your monitor is still capped at 50 or whatever game/video you're running/watching.


Then if you read the comments, you can tell that it's all marketing FUD.. Quote from a comment:



It’s not true 120hz, what it does is double each frame, shift it over and blur it to give the effect of smoother video. Which requires processing, which requires time (lag) which of course results in the input delay. A lot of people don’t understand this. If you plug in your PC to it and go to the monitor settings you’ll in fact see that it’s locked to a true 60hz.

Them sneakers buggers!

Oops, I didn’t notice someone already said this.
Glad the world is getting it now.
So yes, it’s better to just buy a cheaper 60hz one 90% of the time if it’s an HDTV.

About Plasma’s however, the image will still be visiably smoother from my experience, however the input is still limited to 60hz most of the time on any I’ve tried. Like the Quatro Pros.. Even if you turn off vsync.. The cable you’ll likely be using, and/or the input is still limited to 60hz..


Again, exactly what I said.. It's all a scam because no matter what, everything is still limited to 60hz. Doesn't matter if the monitor draws the same exact frame 50 times more often and blurs it to fool you. It will only make sense when the "standard" changes. For now, it's really a waste of time, effort, and money. One thing I learned from developing things.. Never assume the standard will change or be what you want it to be in the future.

It's a sorta "virtual" effect I guess. The same effect I described previously.. It may "seem" better to your eyes but really it is no different because it really isn't smoother and time controls everything. The monitor still needs to wait on the graphics cards/cpu's, etc.. Still synchronized, still 60hz no matter what. And I'll say again, you're better off with a better graphics card/cpu/developer.

The sad part is: I do have a 240hz tv because my parents are technologically stupid and it really makes "no" difference. It interpolates and blurs as well as adds extra frames that really aren't there because the rate is much higher than the standard.. Yeah I did have to turn it down to enjoy TV again (which I rarely watch). However, if its a monitor and it does that, you're really seeing things that aren't really there until the game buffer/video buffer updates.

L3nnuk
06-15-2013, 10:50 PM
Exactly what I said? The same frame redrawn multiple times. Also note that the VSync which you mentioned does nothing then.. You said you'd increase yours to 60 FPS.. Yet RS is capped at 50. Thus your monitor is still capped at 50 or whatever game/video you're running/watching.

Increasing the refresh rate increases the cap (assuming the FPS you get with your system is higher than the refresh rate). So the monitor is no longer capped at 60 (or whatever it was running at), but at the new rate.


Again, exactly what I said.. It's all a scam because no matter what, everything is still limited to 60hz. Who cares if the monitor draws the same exact frame 50 times more often and blurs it to fool you. It will only make sense when the "standard" changes. For now, it's really a waste of time, effort, and money. One thing I learned from developing things.. Never assume the standard will change or be what you want it to be in the future.


The standard only applies when you're talking about TVs (the comment is about a TV, and the PCMag article you linked earlier talks about TVs).
Also, check the comment it was replying to and the other replies: he was pointing out that the other commenter's TV, marketed as 120 Hz, is actually a 70 Hz screen that emulates 120 Hz.
But I was talking about games, where you actually get higher than 60 FPS.
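The vsync cap behaviour being argued over here can be stated in a few lines. A hedged Python sketch (function and parameter names are made up for illustration):

```python
def effective_fps(render_fps, refresh_hz, vsync=True):
    """Frame rate actually presented to the user.

    With vsync on, presentation is locked to the panel's refresh rate,
    so overclocking the panel raises the cap. With vsync off, the GPU
    renders uncapped, but frames beyond refresh_hz can tear."""
    if vsync:
        return min(render_fps, refresh_hz)
    return render_fps

# A 300 FPS game on a stock 60 Hz panel is vsync-capped at 60;
# pushing the panel to 75 Hz raises that cap to 75.
```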

Brandon
06-15-2013, 11:14 PM
Increasing the refresh rate increases the cap (assuming the FPS you get with your system is higher than the refresh rate). So the monitor is no longer capped at 60 (or whatever it was running at), but at the new rate.



The standard only applies when you're talking about TVs (the comment is about a TV, and the PCMag article you linked earlier talks about TVs).
Also, check the comment it was replying to and the other replies: he was pointing out that the other commenter's TV, marketed as 120 Hz, is actually a 70 Hz screen that emulates 120 Hz.
But I was talking about games, where you actually get higher than 60 FPS.

You are correct: the monitor would not be capped at 60, but it would not be capped at your FPS either. See, if your FPS is higher than your refresh rate, then the cap is that of your refresh rate. If your FPS is lower than the refresh rate, then the cap is that of your FPS. But then again, none of what I just said is true, since all your other components have to have that synchronized rate as well. If your CPU processes crap at a rate of 60 Hz, then there's nothing you can do about it. If your game processes its components at 60 Hz, still nothing you can do about it. RS has 50 FPS max, but its rate of processing is actually 60. No matter what you do or what monitor you have, you're going to be stuck at 60, because the rate at which it swaps the back and front buffers is.. you guessed it.. 60 Hz. As with almost all games.

Both ways, you are either capped by the software or the hardware and in any case, the rate of refresh by standard is ALWAYS 60 (especially for CRT displays). So it doesn't matter how high your FPS gets.


But I was talking about games, where you actually get higher than 60 fps.

As was I.. I believe I mentioned "video, games, tv, programs".
Also, no the standard is NOT set just for TV's.. I took electronic & mechanical engineering. The standard is set for all display monitors ranging from TV's to computers to cellphones. It's a "Standard" for a reason.

The comment was specifically for a TV but also applies to any electronic display and utilities.
https://en.wikipedia.org/wiki/Refresh_rate
http://en.wikipedia.org/wiki/Utility_frequency

Ctrl+F and type 60. Let me know how many different categories it comes up under as the "standard" and "locked" refresh rate. Also note it's the default and LOCKED for almost all Operating systems.

Check here for why:
http://in.answers.yahoo.com/question/index?qid=20090415043322AApyBbW

And some guy asking why on engineering boards:
http://engineerboards.com/index.php?showtopic=4485

You can google why I guess..

Rest of the world: 50.. Americas: 60.. Awkward isn't it? But you just dived into a world of science and history..

Good luck writing a kernel-mode driver to override that to support your 120 Hz+ display. Until then, or until the standard changes, everything you see is virtual/faked. So again, it's not just your monitor but all the circuitry in your entire computer and ALMOST all electronics.

Are we finished with this debate? We will never really agree, so is there really a point? I'm not arguing for the sake of winning or proving you or anyone wrong.. I'm just trying to make sure you and anyone else don't go to the store for the wrong reasons and get fooled.

L3nnuk
06-16-2013, 05:26 AM
You are correct: the monitor would not be capped at 60, but it would not be capped at your FPS either. See, if your FPS is higher than your refresh rate, then the cap is that of your refresh rate. If your FPS is lower than the refresh rate, then the cap is that of your FPS. But then again, none of what I just said is true, since all your other components have to have that synchronized rate as well.

I never said it will be capped at your FPS, but at the refresh rate of your monitor. This doesn't change; only the number it is capped at does. So when you push your refresh rate to, let's say, 75, it will now be capped at 75 when vsync is on.


If your CPU processes crap at a rate of 60 Hz, then there's nothing you can do about it. If your game processes its components at 60 Hz, still nothing you can do about it. RS has 50 FPS max, but its rate of processing is actually 60. No matter what you do or what monitor you have, you're going to be stuck at 60, because the rate at which it swaps the back and front buffers is.. you guessed it.. 60 Hz. As with almost all games.

The newest CPUs process stuff at over 1 GHz (over 1,000,000,000 hertz) now. And where do you get the idea that the buffers are stuck at 60 Hz?


Also, no the standard is NOT set just for TV's.. I took electronic & mechanical engineering. The standard is set for all display monitors ranging from TV's to computers to cellphones. It's a "Standard" for a reason.

The comment was specifically for a TV but also applies to any electronic display and utilities.
https://en.wikipedia.org/wiki/Refresh_rate
http://en.wikipedia.org/wiki/Utility_frequency

Ctrl+F and type 60. Let me know how many different categories it comes up under as the "standard" and "locked" refresh rate. Also note it's the default and LOCKED for almost all Operating systems.

Check here for why:
http://in.answers.yahoo.com/question...5043322AApyBbW

And some guy asking why on engineering boards:
http://engineerboards.com/index.php?showtopic=4485

Utility frequency has absolutely nothing to do with your monitor's refresh rate... other than both being measured in Hz.
It seems to me that you think everything is limited by the AC frequency.

R0b0t1
06-16-2013, 07:23 AM
Let's not use so many words.

A refresh rate of 60 Hz is a carryover from CRT displays - the original tube electronics needed the refresh rate to be a multiple of the mains frequency, and as a bonus any noise on the power line was silently ignored. Yes, 50 Hz was used in Europe and other nations. LCDs do not flicker, as they have no screen blanking period. They do, however, have a minimum transition time. If you look at computer gaming right when LCDs came out, pro gamers continued to use CRTs because the refresh interval, which was lengthened to allow the first LCD screens to function properly, was noticeable to humans. Modern LCDs are still limited by this phenomenon, although the times are much shorter. As far as humans go, there is no reason to make the displays update faster. There are screens which are viewable by cameras under any condition due to absurdly high refresh rates, but again, this serves no purpose in the context of people, and they may not actually be achieving the desired framerate because of the minimum transition time.

So, on an LCD, when you experience frame lag... you experience frame lag. The game takes longer than 1/30 of a second to generate a frame and does not update fast enough for you to see it (to those of you gifted with special powers who claim to see 50/60 Hz refresh rates: it's usually due to being under fluorescent lights, and no, you are not special). The monitor really plays no part in it; your GPU is the one not keeping up. You could try to make your screen refresh faster, perhaps. It might work, it might not - it depends on how cost-optimized your screen is. Control circuitry takes space away from pixels.
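The frame-lag behaviour described above - a slow frame simply staying on screen for extra refresh cycles - can be sketched in a few lines of Python (illustrative helper, assumed numbers):

```python
import math

def refreshes_occupied(frame_ms, refresh_hz):
    """Refresh cycles a single rendered frame stays on screen.

    If the GPU takes longer than one refresh interval to produce the
    next frame, the panel just re-shows the old frame on each cycle."""
    interval_ms = 1000.0 / refresh_hz  # e.g. ~16.7 ms at 60 Hz
    return max(1, math.ceil(frame_ms / interval_ms))

# A 25 ms frame on a 60 Hz panel spans two refresh cycles; a 10 ms
# frame fits in one, so the GPU, not the panel, sets the pace.
```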