
johny B5
|
Posted - 2009.04.17 09:43:00 -
[1]
Before the patch I had 150 fps and higher on the highest settings; now I only get about 60 fps. Why? Please remove these stupid restrictions. This is not Doom 3.
I recently bought a new graphics card and can't make use of it. I'd like my old fps back, whether I am able to see them or not. When I am in a battle situation with fewer than 60 frames, this is not acceptable.
CCP, please remove this useless restriction fast!
|

NeoTheo
Dark Materials Heretic Nation
|
Posted - 2009.04.17 09:45:00 -
[2]
Originally by: johny B5 Before the patch I had 150 fps and higher on the highest settings; now I only get about 60 fps. Why? Please remove these stupid restrictions. This is not Doom 3.
I recently bought a new graphics card and can't make use of it. I'd like my old fps back, whether I am able to see them or not. When I am in a battle situation with fewer than 60 frames, this is not acceptable.
CCP, please remove this useless restriction fast!
You have your interval set; that locks the FPS to the refresh rate of your monitor.
Change it back to Immediate.
Dark Materials |

Elizabeth Joanne
Minmatar New Angel Industries United Federation Of Corps
|
Posted - 2009.04.17 09:47:00 -
[3]
Unless you have a CRT display or one of those new-fangled 120 Hz panels, you'll never see more than 60 fps anyway. That's the refresh rate of most panels, and there's no way to go beyond that.
That aside, if you absolutely, definitely need to see more than 60 fps on the meter, look at your graphics settings. Perhaps the patch made changes to the interval; switching back to "Interval immediate" should turn off VSync and any triple buffering.
-- "Boo hoo. Cry some more." -- CCP Whisper
|

Bimjo
Caldari SKULLDOGS
|
Posted - 2009.04.17 09:47:00 -
[4]
Originally by: johny B5 Before the patch I had 150 fps and higher on the highest settings; now I only get about 60 fps. Why? Please remove these stupid restrictions. This is not Doom 3.
I recently bought a new graphics card and can't make use of it. I'd like my old fps back, whether I am able to see them or not. When I am in a battle situation with fewer than 60 frames, this is not acceptable.
CCP, please remove this useless restriction fast!
As the above poster said, it can also be adjusted in your graphics card's control panel (for Nvidia at least); the setting is called, I believe, vertical sync.
Lesson you might have learned here? Before you whine, ask; then, if you're still not happy, do the whine. ====================
|

Malcanis
Vanishing Point. The Initiative.
|
Posted - 2009.04.17 09:48:00 -
[5]
Originally by: Elizabeth Joanne Unless you have a CRT display or one of those new-fangled 120 Hz panels, you'll never see more than 60 fps anyway. That's the refresh rate of most panels, and there's no way to go beyond that.
That's not quite true. LCDs don't work in the same way as CRTs.
|

Red Wid0w
Caldari Science and Trade Institute
|
Posted - 2009.04.17 10:02:00 -
[6]
Yeah, listen to what people are telling you. YOU NEVER HAD 150 FPS. YOUR MONITOR CAN ONLY DISPLAY 60 (more than likely). All you are doing with your 150 fps is wasting processing power, thus reducing the responsiveness of your background apps. Turn VSYNCH on and lock your fps to 60 and there WILL BE NO DIFFERENCE TO EVE. However, everything else on your PC will run faster.
Oh, and those 120 Hz LCDs are rubbish; they basically interpolate between frames. |

Elizabeth Joanne
Minmatar New Angel Industries United Federation Of Corps
|
Posted - 2009.04.17 10:06:00 -
[7]
Originally by: Malcanis That's not quite true. LCDs don't work in the same way as CRTs.
Care to elaborate or give a cite?
To my knowledge, the refresh rate comes from the video card, where it's usually 60 Hz since that's what the panel says it will support. That's the rate at which the DVI link feeds the display. How exactly do you override that if the display doesn't accept signals above 60 Hz?
-- "Boo hoo. Cry some more." -- CCP Whisper
|

johny B5
|
Posted - 2009.04.17 10:06:00 -
[8]
Originally by: NeoTheo
You have your interval set; that locks the FPS to the refresh rate of your monitor.
Change it back to Immediate.
Thanks, it worked! I didn't know about these settings before and didn't change them; that's why I was confused. Here's how to do it:
1. Go to Graphics & Displays.
2. Check the "Advanced settings" box.
3. Set "Present interval" to "Interval immediate".
|

Lord Fornicate
|
Posted - 2009.04.17 10:08:00 -
[9]
Originally by: Malcanis
Originally by: Elizabeth Joanne Unless you have a CRT display or one of those new-fangled 120 Hz panels, you'll never see more than 60 fps anyway. That's the refresh rate of most panels, and there's no way to go beyond that.
That's not quite true. LCDs don't work in the same way as CRTs.
True, but what he said still applies.
|

Durzel
The Xenodus Initiative.
|
Posted - 2009.04.17 10:08:00 -
[10]
Thanks for the info, I learn something new every day. 
Surely it would make more sense to just have a checkbox for "V-sync" that does the same thing? I'd wager most people know what "V-sync" means but don't have a clue what "Interval immediate" means.
Also, for an idiot, should I have "Dither" on or off for best image quality? 
|

Grez
Minmatar Core Contingency
|
Posted - 2009.04.17 10:10:00 -
[11]
Not being able to see more than 60 fps from your monitor is a myth, and anyone who says otherwise is just proving they know absolutely nothing.
Turn on advanced options in your graphics menu in EVE and change "Interval Default" to "Interval Immediate". It effectively turns vsync off, whereas now, it's set to driver default. --- Grez: I shot the sheriff Kalazar: But I could not lock the Deputy BECAUSE OF FALCON |

Elizabeth Joanne
Minmatar New Angel Industries United Federation Of Corps
|
Posted - 2009.04.17 10:12:00 -
[12]
Originally by: Grez Not being able to see more than 60 fps from your monitor is a myth, and anyone who says otherwise is just proving they know absolutely nothing.
[citation needed]
|

Lord Fornicate
|
Posted - 2009.04.17 10:37:00 -
[13]
Originally by: Grez Not being able to see more than 60 fps from your monitor is a myth, and anyone who says otherwise is just proving that I know absolutely nothing.
Fixed it for you.
|

Another Forum'Alt
Gallente Center for Advanced Studies
|
Posted - 2009.04.17 10:43:00 -
[14]
Originally by: Elizabeth Joanne
Originally by: Grez Not being able to see more than 60 fps from your monitor is a myth, and anyone who says otherwise is just proving they know absolutely nothing.
[citation needed]
[citation needed]
Humans can tell the difference between 60 fps and higher frame rates; it's just that most monitors can't display more than 60 fps. BECAUSE OF FALCON. Guide to forum posting |

Maglorre
|
Posted - 2009.04.17 11:22:00 -
[15]
Originally by: Grez Not being able to see more than 60 fps from your monitor is a myth, and anyone who says otherwise is just proving they know absolutely nothing.
Turn on advanced options in your graphics menu in EVE and change "Interval Default" to "Interval Immediate". It effectively turns vsync off, increases CPU and GPU usage, thus increasing the amount of heat generated inside your PC, reducing its lifetime and increasing power consumption for no significant benefit, whereas now, it's set to driver default.
Fixed that for you
|

Pan Crastus
Anti-Metagaming League
|
Posted - 2009.04.17 11:37:00 -
[16]
Originally by: Grez Not being able to see more than 60 fps from your monitor is a myth, and anyone who says otherwise is just proving they know absolutely nothing.
Turn on advanced options in your graphics menu in EVE and change "Interval Default" to "Interval Immediate". It effectively turns vsync off, whereas now, it's set to driver default.
That just means that while your display still runs at 60 Hz, the image in your graphics card's frame buffer is updated more frequently. You will usually see a "tearing" line somewhere when your card switches to a new frame, and in extreme cases (very high fps) some frames will simply never appear on your display at all, because they are skipped.
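A minimal Python sketch of the effect described here (my own illustration, not from any poster; it assumes idealized, evenly spaced timing and instant scanout):

    # Sketch: a game rendering 150 fps on a 60 Hz display with vsync off.
    REFRESH_HZ, RENDER_FPS = 60, 150

    refresh_times = [i / REFRESH_HZ for i in range(REFRESH_HZ)]   # one second of refreshes
    frame_times = [i / RENDER_FPS for i in range(RENDER_FPS)]     # one second of rendered frames

    shown = set()
    for t in refresh_times:
        # The panel latches whatever frame the card finished most recently;
        # a swap mid-scanout would show parts of two frames (the tearing line).
        latest = max(i for i, done in enumerate(frame_times) if done <= t)
        shown.add(latest)

    print(f"rendered {RENDER_FPS}, displayed {len(shown)} distinct frames,"
          f" skipped {RENDER_FPS - len(shown)}")
    # -> rendered 150, displayed 60 distinct frames, skipped 90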
How to PVP: 1. buy ISK with GTCs, 2. fit cloak, learn aggro mechanics, 3. buy second account for metagaming
|

Angelik'a
Gallente
|
Posted - 2009.04.17 11:41:00 -
[17]
It's usually the feeling of smoothness rather than anything actually picked up by your eye. For example, if you're playing a first-person shooter at 25 fps (why any more, your eye can't see it anyway, amirite?) and turn around quickly, you'll perceive it to be "jumpy". Playing at higher frame rates removes this jumpiness.
|

Pottsey
Enheduanni Foundation
|
Posted - 2009.04.17 11:43:00 -
[18]
Grez said "Not being able to see more than 60 fps from your monitor is a myth, and anyone who says otherwise is just proving they know absolutely nothing." It's not a myth it's just not completely accurate. Most LCD screens are limited to displaying 60fps so even though the human Eye can see more then 60fps the screen is limiting the eye to 60fps.
If the monitor has a refresh of 100hz like some CRT's then you can see up to 100fps. Some LCDs go as high as 75fps. Its impossible to see more complete FPS then the monitor can display and the amount of FPS the monitor can display is linked to the refresh rate. Having 150+fps is a waste of time if your monitor is limited to 60fps.
______ How to Passive Shield Tank T2
|

Catherine Frasier
|
Posted - 2009.04.17 11:54:00 -
[19]
Originally by: Angelik'a It's usually the feeling of smoothness rather than anything actually picked up by your eye. For example, if you're playing a first-person shooter at 25 fps (why any more, your eye can't see it anyway, amirite?)
No, you're not "rite". Your eyes process a continuous stream of information and can easily detect the difference between 25 fps and 60 fps when there is motion in the image.
The "feeling of smoothness" comes from what your eyes (and brain) perceive (what else could it possibly be?). |

Grez
Minmatar Core Contingency
|
Posted - 2009.04.17 12:19:00 -
[20]
Indeed, the last two posts have hit it on the head.
Human eyes can perceive up to, and just about past, 200 fps. Some people can, some people cannot - it's a bit like hearing range for eyes... well, you get the idea.
Locking your fps to your monitor's refresh rate can also have detrimental effects. Look up vsync and what it does.
All I did was state that not being able to see past 60 fps is rubbish, and not a reason to lock your computer to a certain fps. It's also not a reason to use vsync. Vsync should only be used if you experience tearing of textures/scenes (the issue of a frame being dropped halfway through being rendered). --- Grez: I shot the sheriff Kalazar: But I could not lock the Deputy BECAUSE OF FALCON |

Catherine Frasier
|
Posted - 2009.04.17 12:31:00 -
[21]
Originally by: Grez All I did was state that not being able to see past 60 fps is rubbish, and not a reason to lock your computer to a certain fps.
No, what you said was "Not being able to see more than 60 fps from your monitor is a myth". It's not. If your monitor only displays 60 fps you can only see those 60 fps. How else? Are you somehow seeing frames that your monitor isn't displaying? (Good trick!)
Originally by: Grez It's also not a reason to use vsync.
No, there are lots of good reasons to use vsync. The fact that you can't see the extra frames simply means there's no downside to it. |

Pan Crastus
Anti-Metagaming League
|
Posted - 2009.04.17 12:33:00 -
[22]
Originally by: Grez Indeed, the last two posts have hit it on the head.
Human eyes can perceive up to, and just about past, 200 fps. Some people can, some people cannot - it's a bit like hearing range for eyes... well, you get the idea.
Locking your fps to your monitor's refresh rate can also have detrimental effects. Look up vsync and what it does.
All I did was state that not being able to see past 60 fps is rubbish, and not a reason to lock your computer to a certain fps. It's also not a reason to use vsync. Vsync should only be used if you experience tearing of textures/scenes (the issue of a frame being dropped halfway through being rendered).
You don't get it.
No matter what you do to EVE (or any other game, for that matter), you will not see more than 60 (or whatever the vsync rate of your display is) different images per second on your display. EVE will just render more images, and you will see only some of them, or mixed-up parts of them, on one of the 60 images your display shows each second.
There's no such thing as "smoother movement" when you remove the vsync locking; that's utter rubbish.
How to PVP: 1. buy ISK with GTCs, 2. fit cloak, learn aggro mechanics, 3. buy second account for metagaming
|

WarlockX
Amarr Free Trade Corp
|
Posted - 2009.04.17 12:42:00 -
[23]
Originally by: Grez Indeed, the last two posts have hit it on the head.
Human eyes can perceive up to, and just about past, 200 fps. Some people can, some people cannot - it's a bit like hearing range for eyes... well, you get the idea.
Locking your fps to your monitor's refresh rate can also have detrimental effects. Look up vsync and what it does.
All I did was state that not being able to see past 60 fps is rubbish, and not a reason to lock your computer to a certain fps. It's also not a reason to use vsync. Vsync should only be used if you experience tearing of textures/scenes (the issue of a frame being dropped halfway through being rendered).
No one said anything about your eyes; it doesn't matter what your eyes can see if the monitor only displays 60 fps. Anything above that is wasted. ----------------------------------------------- Free Trade Corp - Flash page
|

Shintai
Gallente Balad Naran Orbital Shipyards
|
Posted - 2009.04.17 12:45:00 -
[24]
Originally by: Grez Indeed, the last two posts have hit it on the head.
Human eyes can perceive up to, and just about past, 200 fps. Some people can, some people cannot - it's a bit like hearing range for eyes... well, you get the idea.
Locking your fps to your monitor's refresh rate can also have detrimental effects. Look up vsync and what it does.
All I did was state that not being able to see past 60 fps is rubbish, and not a reason to lock your computer to a certain fps. It's also not a reason to use vsync. Vsync should only be used if you experience tearing of textures/scenes (the issue of a frame being dropped halfway through being rendered).
First of all, the human eye doesn't know if it's 30 fps or 2 million fps. Does the screen flicker for you in the cinema?
Secondly: LCDs... 60 Hz... 60 fps. Bingo. Anything above that simply won't get shown; the frames are discarded. --------------------------------------
Abstraction and Transcendence: Nature, Shintai, and Geometry |

Gabriel Loki
|
Posted - 2009.04.17 12:46:00 -
[25]
Edited by: Gabriel Loki on 17/04/2009 12:46:24
Originally by: Grez
There's no such thing as "smoother movement" when you remove the vsync locking; that's utter rubbish.
It's all psychological. If you put two copies of a game next to each other, doing exactly the same thing on exactly the same hardware, and display the real fps on one and 60 on the other, people will say the one with 150 is better - even if both are only showing 60.
|

Miagi Sans
Amarr PURgE-Corp The Chamber of Commerce
|
Posted - 2009.04.17 12:48:00 -
[26]
Originally by: Pan Crastus
Originally by: Grez Indeed, the last two posts have hit it on the head.
Human eyes can perceive up to, and just about past, 200 fps. Some people can, some people cannot - it's a bit like hearing range for eyes... well, you get the idea.
Locking your fps to your monitor's refresh rate can also have detrimental effects. Look up vsync and what it does.
All I did was state that not being able to see past 60 fps is rubbish, and not a reason to lock your computer to a certain fps. It's also not a reason to use vsync. Vsync should only be used if you experience tearing of textures/scenes (the issue of a frame being dropped halfway through being rendered).
You don't get it.
No matter what you do to EVE (or any other game, for that matter), you will not see more than 60 (or whatever the vsync rate of your display is) different images per second on your display. EVE will just render more images, and you will see only some of them, or mixed-up parts of them, on one of the 60 images your display shows each second.
There's no such thing as "smoother movement" when you remove the vsync locking; that's utter rubbish.
You know, for people playing a complex, mathematical game, folks here are pretty dumb when it comes to simple math. The above poster is 100% correct. All you are doing when your video card exceeds your monitor's fps is putting undue stress on your video card. If your input (video card) is processing the game at 120 fps and your output (monitor) can only handle 60 fps, then you are doubling the stress on your video card and getting the same performance as a card rendering at 60 fps and outputting through your monitor at 60 fps.
In fact, your 120 fps will have more artifacts and tearing than a card rendering at 60 fps.
Your monitor is what limits your fps.
|

Polly Prissypantz
Dingleberry Appreciation Society
|
Posted - 2009.04.17 12:57:00 -
[27]
As someone stated earlier in this thread, while the display may only be pumping out 60 frames per second, in fast-paced games such as most FPS's it generally does feel more responsive with v-sync turned off. But EVE is not an FPS, nor is it by any stretch of the imagination fast-paced, so you are unlikely to notice any difference with v-sync on or off. So keep v-sync on - your hardware will thank you for it.
|

Pan Crastus
Anti-Metagaming League
|
Posted - 2009.04.17 13:17:00 -
[28]
Originally by: Polly Prissypantz As someone stated earlier in this thread, while the display may only be pumping out 60 frames per second, in fast-paced games such as most FPS's it generally does feel more responsive with v-sync turned off.
Apart from religious beliefs and psychotic fits, the only reason a "v-sync off" setting looks different is artifacts. You will sometimes see the bottom part of the screen containing a newer frame than the top part, but this will not fool the eye into "seeing more frames" - only into seeing distorted, shaking objects.
How to PVP: 1. buy ISK with GTCs, 2. fit cloak, learn aggro mechanics, 3. buy second account for metagaming
|

Lonzo Kincaid
Black Nova Corp KenZoku
|
Posted - 2009.04.17 13:23:00 -
[29]
what's the frame rate for human eyes? ----------------------
Quote: The rule of thumb is you have to have outnumber them 2:1 before you even think about engaging them
|

Gabriel Loki
|
Posted - 2009.04.17 13:26:00 -
[30]
Originally by: Lonzo Kincaid what's the frame rate for human eyes?
They don't have one.
|

Ralara
Caldari the united Negative Ten.
|
Posted - 2009.04.17 13:42:00 -
[31]
Originally by: Grez Not being able to see more than 60 fps from your monitor is a myth, and anyone who says otherwise is just proving they know absolutely nothing.
It doesn't work that way. Your graphics card can produce much more than 60 fps, yes. However, if your monitor is a normal LCD screen, its refresh rate is (normally) 60 Hz. That means the screen updates its information 60 times a second, whether all that's being displayed is a big red square or a 3D rendering coming off the graphics card at 500 fps. You are getting 60 frames per second as a result.
Your graphics card can feed the monitor whatever it likes, but the monitor will never display more than 60 frames per second, because the hardware inside the screen is set at that amount.
Therefore it's kind of pointless having EVE generate more than 60 fps for normal gameplay - it's wasted resources, wasted heat and wasted electricity. Who cares if it's generating 150 fps? You don't see that.
The only time it really matters is when you're FRAPSing or something and you want ultra slow motion - record at 150 fps and then slow it down five times to 30 fps, etc. Unless you're doing that, it's pointless.
TLDR: You're doing it wrong :) --
|

Leeluvv
The Black Ops Black Core Alliance
|
Posted - 2009.04.17 14:16:00 -
[32]
Edited by: Leeluvv on 17/04/2009 14:26:54 (This is in CRT speak)
It is simpler to think of each horizontal line being refreshed at the refresh rate, so a 75Hz screen will refresh each line 75 times in a second. Having a higher FPS will not increase this, but it will affect what is actually displayed.
Simple example:
20 line screen Vertical Refresh Rate at 20 Hz (i.e. how fast the horizontal lines are refreshed) FPS of 20 (FPS synced/locked to Vertical Refresh rate)
Each line is refreshed 20 times in a second and this makes up a full frame, so each frame is shown for 0.05 seconds.
20 line screen Vertical Refresh Rate at 20 Hz FPS of 40 (FPS not synced/locked to Vertical Refresh rate)
Each line is refreshed 20 times a second, but the PC is producing 40 frames per second, so the top half of the screen will be the first frame and the bottom half will be the second frame. You only ever see half of each frame; each half is displayed for 0.025 seconds, but you still only get one full screen draw every 0.05 seconds.
This can cause 'tearing' or distortion in the image, because an item on screen in the top half may have moved by the time the lower half is drawn, so the image won't align correctly; HOWEVER, in FPS games it is often more important to see things as soon as possible, so the image corruption is acceptable. (The sketch below walks through the same numbers.)
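The same example in a few lines of Python, for anyone who wants to check the numbers (an illustration of my own, under the same idealized assumptions):

    # Leeluvv's example: a 20-line screen refreshing at 20 Hz while the PC
    # renders 40 fps (idealized scanout, evenly spaced frames).
    LINES, REFRESH_HZ, RENDER_FPS = 20, 20, 40
    line_time = 1 / (REFRESH_HZ * LINES)       # time to scan out one line

    for line in range(LINES):
        t = line * line_time                   # moment this line is drawn
        frame = int(t * RENDER_FPS)            # frame sitting in the buffer then
        print(f"line {line:2d} shows frame {frame}")
    # Lines 0-9 show frame 0, lines 10-19 show frame 1: the tear sits mid-screen.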
Disabling VSync is a hang up from FPS and other 'twitch' games and I cannot see any benefit to disabling VSync in Eve.
The human eye is usually happy with a static image being refreshed at approx 75 FPS, but this is lower for a moving image, usually 25 FPS.
TV is 50 or 60 Hz, but interlaced, so you actually only get 25 or 30 FPS. Normal movies are 24 FPS.
To all those people claiming that they can't play below large FPS values: you are lying if you can happily watch TV or movies.
Lee == Sig to follow |

Sirani
|
Posted - 2009.04.17 14:25:00 -
[33]
Originally by: Elizabeth Joanne Unless you have a CRT display or one of those new-fangled 120 Hz panels, you'll never see more than 60 fps anyway. That's the refresh rate of most panels, and there's no way to go beyond that.
That aside, if you absolutely, definitely need to see more than 60 fps on the meter, look at your graphics settings. Perhaps the patch made changes to the interval; switching back to "Interval immediate" should turn off VSync and any triple buffering.
*sigh*
It's not just about what you can see. Try playing the game at 150 FPS and then at 60 FPS; the latter *feels* slower and more sluggish. ------------------- |

WarlockX
Amarr Free Trade Corp
|
Posted - 2009.04.17 14:43:00 -
[34]
Originally by: Sirani
Originally by: Elizabeth Joanne Unless you have a CRT display or one of those new-fangled 120 Hz panels, you'll never see more than 60 fps anyway. That's the refresh rate of most panels, and there's no way to go beyond that.
That aside, if you absolutely, definitely need to see more than 60 fps on the meter, look at your graphics settings. Perhaps the patch made changes to the interval; switching back to "Interval immediate" should turn off VSync and any triple buffering.
*sigh*
It's not just about what you can see. Try playing the game at 150 FPS and then at 60 FPS; the latter *feels* slower and more sluggish.
Are you on crack? How can it feel slower if it looks exactly the same? ----------------------------------------------- Free Trade Corp - Flash page
|

Catherine Frasier
|
Posted - 2009.04.17 14:48:00 -
[35]
Originally by: Leeluvv The human eye is usually happy with a static image being refreshed at approx 75 FPS, but this is lower for a moving image, usually 25 FPS.
That's backwards. The human eye is "happy" with 0 fps for a static image. (You ever hear anyone complain that the refresh rate was too low on the Mona Lisa?) The greater the degree of motion being displayed the more frames are required to give the illusion of continuity. |

Red Wid0w
Caldari Science and Trade Institute
|
Posted - 2009.04.17 15:04:00 -
[36]
Yes, there is a difference between 150 and 60 fps, but it's not what you mouth-breathers think.
150 FPS IS ACTUALLY SLOWER - YOUR PC IS DOING MORE WORK!! For no gain at all, since the monitor can't display it. You might even notice the controls becoming less responsive, since input generally runs in a different thread from graphics. With VSYNCH off you are letting the graphics thread run rampant and waste CPU resources. Not to mention the fans having to work harder, etc. - you are actually wrecking your PC - KEEP DENYING IT! 60 FPS IS SMOOTHER, because it's locked to 60 fps - a CONSTANT, smooth fps!
There's a good thread on SHC about 120 Hz LCDs etc. that explodes lots of common myths. ITT people with more money than sense claim to have 200 Hz displays, etc.
|

Grez
Minmatar Core Contingency
|
Posted - 2009.04.17 15:47:00 -
[37]
A game pumping out 150 fps is running quicker, more responsively, and generally doing everything it can/should do a lot quicker than a game doing it at 60 fps; hence a game running at 150 fps will feel more responsive, even if you can only see 60 fps on the monitor.
The game is running on the computer, not the monitor. The hardware is plugged into the computer, not the monitor. Just because the monitor is displaying it at 60/75 fps (their most common settings) does not mean that the game is running at that speed.
Limiting a game to output at a certain frame rate can also have detrimental effects on how the game is processed. Hence a game can definitely, and will almost always, feel more responsive.
Those of us with computers that can handle the extra fps are free to use it. If you lock your FPS to your refresh rate using vsync, the computer is still going to be working as hard on other things; the only component that's not is the graphics card, and they are designed to be worked at 100% (most gaming ones, anyway).
Hence, there is no point in forcing vsync other than to solve any tearing issues you see, or if your graphics card cannot cope with a 100% workload. --- Grez: I shot the sheriff Kalazar: But I could not lock the Deputy BECAUSE OF FALCON |

Highwind Cid
|
Posted - 2009.04.17 15:54:00 -
[38]
Originally by: johny B5 Before the patch I had 150 fps and higher on the highest settings; now I only get about 60 fps. Why? Please remove these stupid restrictions. This is not Doom 3.
I recently bought a new graphics card and can't make use of it. I'd like my old fps back, whether I am able to see them or not. When I am in a battle situation with fewer than 60 frames, this is not acceptable.
CCP, please remove this useless restriction fast!
My spider sense is anticipating a new thread about John Doe's GPU becoming insanely hot while in stations...
|

Spurty
Caldari Amok. Minor Threat.
|
Posted - 2009.04.17 15:56:00 -
[39]
Some of this is getting mixed up with the fact that after the patch, something was genuinely messed up with the client.
To get my client to draw nicely (no stuttering, no fps hanging around 42), even in the station, I had to:
a) clear my caches, b) install nVidia's 182.50 driver, c) reboot Vista 64-bit SP1.
Now everything is nice and smooth again. Before I did this, it was choppy, updates to the screen were slow, and the fps had dropped.
Not all of these things are tied to each other, I know, but I can confirm that I had this issue and resolved it.
Originally by: Butter Dog
I think you'll find that 10 seconds > 1 month
|

Pottsey
Enheduanni Foundation
|
Posted - 2009.04.17 16:01:00 -
[40]
A game running at 150 fps will have screen tearing and things appearing in more than one place at once. You might have the legs inches away from the body - hardly what you need when trying to aim at a person. All you're doing at 150 fps is forcing the game to draw half frames or parts of frames. How can half frames, or less, be more responsive? Unless you have mouse lag, there is never a reason not to use vsync.
Perhaps you're confusing the FPS responsiveness difference with a mouse lag problem some people get with vsync. There are cases where vsync causes mouse lag, but that's not an FPS problem. It's not the extra FPS that stops the mouse lag.
______ How to Passive Shield Tank T2
|

Pan Crastus
Anti-Metagaming League
|
Posted - 2009.04.17 16:46:00 -
[41]
Originally by: Pottsey
Perhaps you're confusing the FPS responsiveness difference with a mouse lag problem some people get with vsync. There are cases where vsync causes mouse lag, but that's not an FPS problem. It's not the extra FPS that stops the mouse lag.
That's not caused by waiting for v-sync; it's caused by double/triple buffering (some graphics cards also support 5 to 9 buffers). This utilizes the hardware better, but whatever is displayed is delayed by one or more frames, so the mouse is not as responsive as it could be. Double/triple buffering is not very useful without waiting for v-sync to switch between buffers.
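A rough sense of how much delay queued buffers add (my own back-of-envelope model, not from the post, assuming the GPU keeps every buffer full):

    # Display delay added by an n-buffer swap chain on a 60 Hz monitor.
    REFRESH_MS = 1000 / 60
    for buffers in (2, 3, 5, 9):                 # the post mentions up to 5..9
        queued = buffers - 1                     # frames waiting behind the visible one
        print(f"{buffers} buffers: image up to {queued * REFRESH_MS:.0f} ms old")
    # 2 buffers: ~17 ms; 3 buffers: ~33 ms; 9 buffers: ~133 ms of extra latency.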
How to PVP: 1. buy ISK with GTCs, 2. fit cloak, learn aggro mechanics, 3. buy second account for metagaming
|

Akita T
Caldari Navy Volunteer Task Force
|
Posted - 2009.04.17 16:56:00 -
[42]
Also, turning VSynch off just so you can SHOW OFF your "omfg, 279 FPS EVE, wtfbbq" is downright stupid, because it's KNOWN to have contributed a lot to the premature frying of several video cards. Also, as many, MANY people have said here before, if the monitor can only display 60 (or 75, or 100, or 120, whatever), there's no BENEFIT in turning VSynch off ("Interval immediate") other than the FPS e-p33n number on the FPS monitor. At best, you will see half of one frame and half of another with a bit of tearing in between, which is just stupid.
In EVE, at pretty much all times, VSynch should be turned on (Interval one) and left that way forever. It has no serious drawbacks (some MIGHT argue that it doesn't "feel that dynamic anymore" because they were used to the tearing effect) but a lot of benefits compared to the alternative (longer vidcard life, for starters).
_ The problem with EVE || Fit a ship || Get some ISK |

Hariya
|
Posted - 2009.04.17 17:00:00 -
[43]
Originally by: Gabriel Loki
Originally by: Lonzo Kincaid what's the frame rate for human eyes?
They don't have one.
The US Air Force conducted some research on this matter in 2001, and they concluded that their fighter pilots (people with near-perfect eyes) could, in simulations, still benefit from 250-275 fps in how they perceive moving objects. That being said, more fps in computer games is good, and human perception (subconsciously) can use as much as the technology will ever be able to provide.
TFT panels commonly do only 60 updates to the image per second. That number is tied mostly to the vertical refresh rate, but also to the fact that the all-digital processing unit handling the signaling to the grid should be cheap(ish), and monitors are commonly designed for one exact, constant rate regardless of the incoming signal. Some monitors do those 60 fps better than others, though; there are several tricks that can be used to fool humans into thinking the action is smoother (although it really is not).
|

Beth Noir
The Neuroeconomic Foundation
|
Posted - 2009.04.17 17:06:00 -
[44]
After a bit of testing, I'm sure there's something more to this that is causing the disagreement.
I'd noticed EVE had suddenly become much less smooth before reading this thread, and checked my fps: I was only getting about 45, which is very slow.
The default post-patch is the 'Interval default' setting. Switching to 'Interval immediate' fixed it - smooth again, around 130 fps - BUT switching to 'Interval one' also fixes it: smooth, and spot on 60 fps on a 60 Hz screen.
So vsync isn't the problem; there's something else about 'Interval default' that slows things down. The best solution to keep everyone happy is 'Interval one' (Display & Graphics, advanced settings).
|

Feilamya
Minmatar
|
Posted - 2009.04.17 17:13:00 -
[45]
Unless you want to see ugly tearing or fry your graphics card, use "Interval One" and be happy with 60 FPS.
|

Cipher7
|
Posted - 2009.04.17 17:17:00 -
[46]
1) The human eye can only perceive a certain number of frames per second.
2) The monitor can only put out a certain number of frames per second.
3) When you turn off vsync and get those "150 fps" rates: (1) your monitor doesn't display most of them; (2) if it did, your eyes would get tired more quickly for no benefit.
60 FPS should feel really, really smooth to most people, and for some can even result in eyestrain. Anything above 30fps is fine and really almost the limit of our human perception. ---
|

Pan Crastus
Anti-Metagaming League
|
Posted - 2009.04.17 17:19:00 -
[47]
Originally by: Akita T
Turning VSynch off just so you can SHOW OFF your "omfg, 279 FPS EVE, wtfbbq" is downright stupid, because it's KNOWN to have contributed a lot to the premature frying of several video cards.
It is also known to have caused the "omg EVE uses up 100% CPU time on the login screen" problem. ;-)
How to PVP: 1. buy ISK with GTCs, 2. fit cloak, learn aggro mechanics, 3. buy second account for metagaming
|

Catherine Frasier
|
Posted - 2009.04.17 17:24:00 -
[48]
Originally by: Cipher7 your eyes would get tired more quickly for no benefit.
Um... your eyes take in a constant stream of input regardless of the fps of your display. It's not like they slow down to 60 fps just because your game does, and it's not like they "speed up" and get tired trying to keep up with faster displays. |

Feilamya
Minmatar
|
Posted - 2009.04.17 17:31:00 -
[49]
Edited by: Feilamya on 17/04/2009 17:31:19
Originally by: Cipher7 Anything above 30fps is fine and really almost the limit of our human perception.
This is only true for movies, and it has nothing to do with human perception: http://www.daniele.ch/school/30vs60/30vs60_1.html
(Text does not cite any sources. If you really care, do your own research on it...)
|

Xianbei
|
Posted - 2009.04.17 18:15:00 -
[50]
Wow, the amount of armchair science in this thread is amazing.
How people can pound their chests and scream about something and be so wrong is a testament to what the internet has become. The internet is at your very fingertips and you can't even be bothered to research before you post.
I was going to offer some constructive info and links, but really there is no point.
You can lead an idiot to information, but you cannot make him un-stupid.
|

Elain Reverse
Caldari Shokei
|
Posted - 2009.04.17 18:26:00 -
[51]
If you are using v-sync you are basically telling the graphics card to wait for your monitor's next refresh before rendering. It makes your GFX card work less, but it can also lower your overall or minimum fps and can even introduce microlags. It's maybe not a big issue with EVE, but it's really not nice for FPS games.
|

Petra Katell
|
Posted - 2009.04.17 18:29:00 -
[52]
http://www.100fps.com/how_many_frames_can_humans_see.htm
You're welcome.
|

Hariya
|
Posted - 2009.04.17 18:36:00 -
[53]
Originally by: Petra Katell http://www.100fps.com/how_many_frames_can_humans_see.htm
You're welcome.
You have just earned yourself a coupon for 1000 internets.
|

Gabriel Loki
|
Posted - 2009.04.17 19:10:00 -
[54]
Originally by: Grez
Limiting a game to output at a certain frame rate can also cause detrimental effects to how the game is processes. Hence, a game can definitely, and will almost always feel, more responsive.
No programmer worth his salt will put the whole game into a single thread; you can have vsync on and the rest of the game will still run at full throttle.
|

Vaerah Vahrokha
Minmatar Dark-Rising
|
Posted - 2009.04.17 19:10:00 -
[55]
Quote:
Unless you want to see ugly tearing or fry your graphics card, use "Interval One" and be happy with 60 FPS
I am using Interval one, but I tried with no v-sync and I saw neither "ugly" tearing nor fried anything. Buy something that doesn't suck?
That said, I am happy that I am still using an 85 Hz CRT monitor. I can see the difference between a v-synced 60 Hz monitor and a 75 or 85 Hz one (spotting 60 => 75 is quite easy; 75 => 85 is almost invisible).
I have 11/10 visus, though.
Quote:
Anything above 30fps is fine and really almost the limit of our human perception.
No way - I'd vomit if I had to follow a monitor at 30 fps (not joking).
A television is fine, as it's usually viewed from a much greater distance.
Basically, to "feel" the frame rate, all you have to do is turn your head sideways and then look at the screen while turning it (not too fast). You'll get a feeling of "discontinuity", which is very detrimental if you play a game where you tend to "follow" movements around.
I.e., imagine the screen is >= 19" and a guy appears at the top left. You'll probably instinctively turn your head or eyeballs (peripheral vision calls attention to movement). In that turn, you'll feel the difference between a slow and a fast refresh.
Moreover, older LCD monitors exhibit a ghosting effect on high-contrast images in movement, which further adds to the uncomfortable eye feeling.
|

Zeba
Minmatar Honourable East India Trading Company
|
Posted - 2009.04.17 19:16:00 -
[56]
Originally by: Red Wid0w Oh, and those 120 Hz LCDs are rubbish; they basically interpolate between frames.
Well, as the owner of both a slightly 'older' 1080p LCD with a 60 Hz refresh and one of the new-fangled 1080p 120 Hz screens, I can tell you that your assumption is rubbish. Both have a 2 ms response time, but the 120 Hz one is sooooooo much smoother with action on the screen - basically to the point that my PC, Blu-ray content and HD channels look like a real scene through a window. In fact, I was so impressed that I now use two 42" LG-70s to dual-screen with my computer.
Originally by: Achar Losa i might be just a 6 year old stupid boy, but he's a CCP dev writing in the forums!
|

Malphilos
|
Posted - 2009.04.17 19:19:00 -
[57]
Originally by: Vaerah Vahrokha Basically, to "feel" the frame rate, all you have to do is turn your head sideways and then look at the screen while turning it (not too fast). You'll get a feeling of "discontinuity", which is very detrimental if you play a game where you tend to "follow" movements around.
I hear that if you smash yourself in the face with a hammer you'll feel a bit of "discontinuity" too.
But why the hell would you do either of those things?
|

Gabriel Loki
|
Posted - 2009.04.17 19:23:00 -
[58]
Originally by: Malphilos
Originally by: Vaerah Vahrokha Basically, to "feel" the frame rate, all you have to do is turn your head sideways and then look at the screen while turning it (not too fast). You'll get a feeling of "discontinuity", which is very detrimental if you play a game where you tend to "follow" movements around.
I hear that if you smash yourself in the face with a hammer you'll feel a bit of "discontinuity" too.
But why the hell would you do either of those things?
So your OVER 9000! fps mean something?
|

Phantom Slave
JUDGE DREAD Inc.
|
Posted - 2009.04.17 19:30:00 -
[59]
There seem to be some misconceptions and a lot of personal bias in this thread.
I prefer v-sync on. It's a personal preference. Some people prefer v-sync off; it's their choice.
***SCREEN TEARING, and what it means to you!***
Screen tearing can ONLY be seen on a moving object; faster-moving objects mean more tearing. If you're sitting in a station just looking at your ship, you won't see tearing, because there isn't a lot of fast movement. Go outside the station and spin your camera around really fast. If your monitor is set at 60 Hz and your frame rate is above that, you will notice that if you spin the screen fast enough, the station will effectively be split into parts for very, VERY brief moments.
If you do not notice this with v-sync off, then don't bother turning it on. If you DO notice it but it doesn't bother you, then leave v-sync off. I notice it and it bugs me, so I use v-sync.
It's ALL personal preference. Some people may not even notice it because they don't care. If you like v-sync then use it; if you don't, then don't bother.
To those saying that v-sync lowers your system heat: you're not entirely correct. V-sync produces 2 frames for every frame the monitor shows, and if you have triple buffering on then you're producing 3 frames for every 1 that shows. It helps keep your frame rate smooth, because if something happens that slows down the GPU then it has extra frames to fall back on.
|

Vaerah Vahrokha
Minmatar Dark-Rising
|
Posted - 2009.04.17 19:30:00 -
[60]
Quote:
I hear that if you smash yourself in the face with a hammer you'll feel a bit of "discontinuity" too.
But why the hell would you do either of those things?
Because if you want to perform a test, you "stress" the system?
|

Elizabeth Joanne
Minmatar New Angel Industries United Federation Of Corps
|
Posted - 2009.04.17 19:34:00 -
[61]
Quoting myself in post #3 in this thread:
Originally by: Elizabeth Joanne Unless you have a CRT display or one of those new-fangled 120 Hz panels, you'll never see more than 60 fps anyway. That's the refresh rate of most panels, and there's no way to go beyond that.
Oh dear, I've created a monster.
*repents*
-- "Boo hoo. Cry some more." -- CCP Whisper
|

Zeba
Minmatar Honourable East India Trading Company
|
Posted - 2009.04.17 19:35:00 -
[62]
Edited by: Zeba on 17/04/2009 19:35:37
Originally by: Phantom Slave There seem to be some misconceptions and a lot of personal bias in this thread.
I prefer v-sync on. It's a personal preference. Some people prefer v-sync off; it's their choice.
***SCREEN TEARING, and what it means to you!***
Screen tearing can ONLY be seen on a moving object; faster-moving objects mean more tearing. If you're sitting in a station just looking at your ship, you won't see tearing, because there isn't a lot of fast movement. Go outside the station and spin your camera around really fast. If your monitor is set at 60 Hz and your frame rate is above that, you will notice that if you spin the screen fast enough, the station will effectively be split into parts for very, VERY brief moments.
If you do not notice this with v-sync off, then don't bother turning it on. If you DO notice it but it doesn't bother you, then leave v-sync off. I notice it and it bugs me, so I use v-sync.
It's ALL personal preference. Some people may not even notice it because they don't care. If you like v-sync then use it; if you don't, then don't bother.
To those saying that v-sync lowers your system heat: you're not entirely correct. V-sync produces 2 frames for every frame the monitor shows, and if you have triple buffering on then you're producing 3 frames for every 1 that shows. It helps keep your frame rate smooth, because if something happens that slows down the GPU then it has extra frames to fall back on.
I'm sorry, but did you just bring logic and a tolerant, even-handed view of other people's opinions into an EVE-O GD thread?
BURN HIM! 
Originally by: Achar Losa i might be just a 6 year old stupid boy, but he's a CCP dev writing in the forums!
|

Th0rG0d
Caldari Pilots From Honour Aeternus.
|
Posted - 2009.04.17 19:38:00 -
[63]
Originally by: Vaerah Vahrokha
I have 11/10 visus though.
Wow... so you can see something at 11 feet that the "average" person can see at 10 feet?
The last time I had my eyes checked, the doctor didn't have any measuring equipment to determine better than 20/10. But really, anything better than 20/20 vision isn't necessary (unless you are into long-range spotting), so bragging about your eyesight on the 'net just lowers you to my level....
<--- yes, those are my 20/10 eyes rolling
|

Polly Prissypantz
Dingleberry Appreciation Society
|
Posted - 2009.04.18 01:26:00 -
[64]
Originally by: Grez A game pumping out 150 fps is running quicker, more responsively, and generally doing everything it can/should do a lot quicker than a game doing it at 60 fps; hence a game running at 150 fps will feel more responsive, even if you can only see 60 fps on the monitor.
The game is running on the computer, not the monitor. The hardware is plugged into the computer, not the monitor. Just because the monitor is displaying it at 60/75 fps (their most common settings) does not mean that the game is running at that speed.
Limiting a game to output at a certain frame rate can also have detrimental effects on how the game is processed. Hence a game can definitely, and will almost always, feel more responsive.
Those of us with computers that can handle the extra fps are free to use it. If you lock your FPS to your refresh rate using vsync, the computer is still going to be working as hard on other things; the only component that's not is the graphics card, and they are designed to be worked at 100% (most gaming ones, anyway).
Grez is correct. All you geniuses going on about not being able to see more than 60 fps are missing the ****ing point. It's not about what you see, it's about how the game feels and responds. EVE is slow enough that you won't really notice the difference, but in some faster-paced games there is a definite difference in how the game performs when you limit or don't limit the frame rate. We're not talking about how it looks.
This is an older, but still useful thread on the subject.
Liberal use of italics brought to you today by the Learn to ****ing Read Association.
|

Lothros Andastar
Gallente
|
Posted - 2009.04.18 01:31:00 -
[65]
Minmatar scientists believe the 60 FPS limit is...
BECAUSE OF FALCON
|

Elizabeth Joanne
Minmatar New Angel Industries United Federation Of Corps
|
Posted - 2009.04.18 02:13:00 -
[66]
So you all can see things that mere mortals can't. Impressive.
The next time I consider bringing up the 60 Hz rule, I will think of you young pioneers who not only produce more frames per second on their 60 Hz monitors than anyone else but can also see every one of them.
A little knowledge is a dangerous thing, as they say. Let's hope these people... wait, we have nothing to fear from these people.
-- "Boo hoo. Cry some more." -- CCP Whisper
|

Catherine Frasier
|
Posted - 2009.04.18 03:36:00 -
[67]
Originally by: Polly Prissypantz All you geniuses going on about not being able to see more than 60 fps are missing the ****ing point. It's not about what you see, it's about how the game feels and responds.
Pomo nonsense. Correction: italicized pomo nonsense.
This is a mechanical system. It either displays more frames per second or it doesn't. It either processes more update or input cycles per second or it doesn't. There's nothing "feely" about it, period.
|

masternerdguy
Gallente Point of No Return Blade.
|
Posted - 2009.04.18 03:37:00 -
[68]
The human eye sees up to around 22 fps, so anything above that is wasted anyway.
|

HankMurphy
Minmatar Pelennor Enterprises
|
Posted - 2009.04.18 04:26:00 -
[69]
Originally by: Xianbei Wow, the amount of armchair science in this thread is amazing.
How people can pound their chests and scream about something and be so wrong is a testament to what the internet has become. The internet is at your very fingertips and you can't even be bothered to research before you post.
I was going to offer some constructive info and links, but really there is no point.
You can lead an idiot to information, but you cannot make him un-stupid.
THIS x1000
In the spirit of the above, I would like to constructively suggest that the OP swap their Cat5 cable for a Cat5e or at least a Cat6, preferably one with gold ends to reduce latency. This should vastly improve your fps.
Trust me, I seem legit ---------- Hey, sewer rat may taste like pumpkin pie, but I'd never know 'cause I wouldn't eat the filthy motherf***er. |

Rathelm
|
Posted - 2009.04.18 06:01:00 -
[70]
Originally by: Grez Not being able to see more than 60 fps from your monitor is a myth, and anyone who says otherwise is just proving they know absolutely nothing.
Turn on advanced options in your graphics menu in EVE and change "Interval Default" to "Interval Immediate". It effectively turns vsync off, whereas now, it's set to driver default.
As much as you'd like to think that monitors work by magic, it's simply not the case. Your monitor's refresh rate is just that: every refresh, it redraws the screen. If your monitor's refresh rate is 60 Hz, that means it redraws the screen 60 times a second. Therefore the absolute maximum number of times the video buffer will be drawn to the screen is 60 times a second. Without vertical sync enabled you effectively create a scenario where screen tearing can occur. This happens because, as the monitor is refreshing, the video buffer sends a new screen to the output device (your monitor), which starts drawing the new screen mid-refresh from wherever it happens to be. Fast-moving objects can therefore be torn.
No matter how often you force the video buffer to update, it can only be drawn to the screen when the monitor is in a refresh cycle.
|

Rathelm
|
Posted - 2009.04.18 06:06:00 -
[71]
Originally by: Phantom Slave V-sync produces 2 frames for every frame the monitor shows, and if you have triple buffering on then you're producing 3 frames for every 1 that shows. It helps keep your frame rate smooth, because if something happens that slows down the GPU then it has extra frames to fall back on.
To add to this, you have to have a back buffer. That's the way Direct3D works, and I'd imagine OpenGL does too, but I'm not familiar with that API. All drawing is done to the back buffer, which should only be flushed to the output device when the screen is actually being refreshed. Disabling V-sync shouldn't even be an option that developers and video card makers give you; the only reason they do is to sell more expensive video cards.
|

Rathelm
|
Posted - 2009.04.18 06:20:00 -
[72]
Originally by: Catherine Frasier
Originally by: Angelik'a It's usually the feeling of smoothness rather than anything actually picked up by your eye. For example, if you're playing a first-person shooter at 25 fps (why any more, your eye can't see it anyway, amirite?)
No, you're not "rite". Your eyes process a continuous stream of information and can easily detect the difference between 25 fps and 60 fps when there is motion in the image.
The "feeling of smoothness" comes from what your eyes (and brain) perceive (what else could it possibly be?).
That has more to do with the way DirectX works. When you tell DirectX to open up an ID3D9Device and an IDXGISwapChain, part of the initialization of the graphics engine is how many frames per second you're trying to achieve through the swap chain. If you're achieving less than that, the computer is chugging to keep up with what's going on, and it also causes timing issues with your input. Another important thing: there are multiple ways to handle input, and if you tie input to your refresh rate it will drag things down even further.
|

Pottsey
Enheduanni Foundation
|
Posted - 2009.04.18 09:38:00 -
[73]
masternerdguy said: "The human eye sees up to around 22 fps, so anything above that is wasted anyway." I did a number of blind tests and every single person could tell the difference between 30, 60, 100 and 120 fps. I could not go higher than 120, as that was the max my screen could display. It's easy to test: just set half the screen to 30 fps and the other half to 60 fps and see whether people can spot a difference.
Anyone who cannot see the difference between 30 and 60 fps, or cannot see beyond 60 fps, has poor eyesight or a screen that cannot display more than 60 fps.
______ How to Passive Shield Tank T2
|

Emperor D'Hoffryn
EXTERMINATUS. Imperial Republic Of the North
|
Posted - 2009.04.18 12:05:00 -
[74]
Originally by: Catherine Frasier
Originally by: Polly Prissypantz All you geniuses going on about not being able to see more than 60 fps are missing the ****ing point. It's not about what you see, it's about how the game feels and responds.
Pomo nonsense. Correction: italicized pomo nonsense.
This is a mechanical system. It either displays more frames per second or it doesn't. It either processes more update or input cycles per second or it doesn't. There's nothing "feely" about it, period.
Actually, his argument is poorly worded, but what he's on about is feedback delay. When you v-sync lock to 60 fps, you are guaranteeing that when you change your input to the game, you will see the result of that change no sooner than about 1/60 of a second later, on average (i.e., your FPS character starts to turn).
If you are running a significantly higher frame rate, it's possible that you will see a partial frame of this reaction to your input sooner. This is similar to the monitor lag problem of some cheapo LCD screens, only much reduced.
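Roughly quantified, as a back-of-envelope model only (my assumption: input is sampled when a frame starts rendering and appears when that frame is presented - real engines complicate this):

    # Average input-to-screen delay: wait half a frame for the next render
    # to start, then one full frame to render and present it.
    def avg_delay_ms(render_fps):
        frame_ms = 1000 / render_fps
        return frame_ms / 2 + frame_ms

    for fps in (60, 150):
        print(f"{fps:3d} fps -> ~{avg_delay_ms(fps):.0f} ms average delay")
    # ~25 ms at 60 fps vs ~10 ms at 150 fps: a real but small difference, and
    # the newer content still only reaches the panel at the next 60 Hz scanout.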
Does it matter for EVE? No. Can he really tell? I would argue it's most likely in his head at the time spans we are talking about, but monitor lag is quite real, and you will notice it once you get a bad LCD screen. It's quite possible that the update delay of monitor lag (ALL LCDs have it, just some are really bad) amplifies the effect somehow.
Since none of this matters for EVE, and EVE has some super-simple screens that go wonky when unlimited (the login screen, inside stations), use the EVE graphics settings to turn on v-sync, and leave it off in your driver settings for other games. If you ever have to go AFK suddenly while docked, your video card will thank you for it.
Originally by: CCP Whisper No it is not an official statement. Not everything surrounded by blue bars is an official statement which can be quoted as fact until the end of time. Deal with it.
|

Karentaki
Gallente Oberon Incorporated Morsus Mihi
|
Posted - 2009.04.18 12:10:00 -
[75]
Originally by: Gabriel Loki
Originally by: Lonzo Kincaid what's the frame rate for human eyes?
They don't have one.
This is true, but for something to appear as continuous motion it needs to run at least about 15 fps. Higher frame rates simply make the image smoother, but anything beyond 60 isn't really noticeable.
Quote:
EVE is like a sandbox with landmines. Deal with it.
|

Draeca
Tharri and Co.
|
Posted - 2009.04.18 12:14:00 -
[76]
Originally by: Red Wid0w Yeah, listen to what people are telling you. YOU NEVER HAD 150 FPS. YOUR MONITOR CAN ONLY DISPLAY 60 (more than likely). All you are doing with your 150 fps is wasting processing power, thus reducing the responsiveness of your background apps. Turn VSYNCH on and lock your fps to 60 and there WILL BE NO DIFFERENCE TO EVE. However, everything else on your PC will run faster.
Oh, and those 120 Hz LCDs are rubbish; they basically interpolate between frames.
There is a difference: I had strange "jumping" (kind of like lag, but a lot smoother) when spinning the camera with vsync enabled. Turning vsync off fixed it, and now EVE is back to normal.
|

Elizabeth Joanne
Minmatar New Angel Industries United Federation Of Corps
|
Posted - 2009.04.18 15:07:00 -
[77]
Anyone foaming at the mouth about frame rates and their effect on reaction time etc. is conveniently ignoring the real world.
As has been established, there is no way to make a 60 Hz monitor display more than 60 frames per second, but this is just one part of the equation when it comes to reacting to what is being displayed.
On the surface, it looks like a 60 fps display lets you see what is happening with a delay of at most ~17 milliseconds (1/60th of a second), and that this delay could be reduced by increasing the frame rate.
Alas, not so. TFT displays also have a phenomenon called input lag that isn't often discussed, because it can be outright horrendous. The Dell 2707WFP, for example, has 46 milliseconds of input lag. This means you are effectively at least 2 frames in the past at all times.
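The arithmetic, spelled out in a couple of lines of Python (figures as quoted in the post):

    # One 60 Hz refresh vs the quoted input lag of the Dell 2707WFP.
    refresh_ms = 1000 / 60          # ~16.7 ms per refresh
    input_lag_ms = 46               # figure quoted above
    print(f"one refresh: {refresh_ms:.1f} ms")
    print(f"input lag  = {input_lag_ms / refresh_ms:.1f} refreshes")
    # -> ~2.8 refreshes: "at least 2 frames in the past", as stated.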
If you are worried about frame rates and their effect on your reaction times, choose a monitor with very low input lag first. TFT Central has measured the input lag of a bunch of models; manufacturers typically don't advertise this figure.
-- "Boo hoo. Cry some more." -- CCP Whisper
|

Pan Crastus
Anti-Metagaming League
|
Posted - 2009.04.19 00:17:00 -
[78]
Originally by: Phantom Slave
To those that are saying that v-sync lowers your system heat then you're not entirely correct. V-sync produces 2 frames for every frame the monitor shows, and if you have Triple Buffering on then you're producing 3 frames for every 1 that shows. It helps keep your framerate smooth because if something happens that slows down the GPU then it has extra frames to fall back on.
Please stop posting such nonsense, or at least read up on this stuff before you post such gibberish. V-sync does not "produce 2 frames for every 1 that shows", and neither does triple buffering produce 3.
V-sync is commonly used with a double buffer (back buffer): while one image (buffer) is displayed, the next one is drawn into the other buffer. When that one is finished, the GPU pauses until the next v-sync and the buffers are switched (the one just drawn is displayed, the other is used for drawing).
With triple buffering, instead of pausing once the hidden buffer is finished, the GPU starts drawing into the third buffer.
All these methods were more interesting when the frame rate was (sometimes or usually) lower than the refresh rate. When it isn't, and a frame can always be drawn within one refresh, a single hidden buffer is enough, and the graphics card will pause anyway because you only need 60 (or whatever your monitor wants) frames per second.
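To make the double-buffered case concrete, a minimal C++ sketch (the function names are hypothetical stand-ins, not any real driver API):

    #include <array>

    struct Frame { /* pixel data */ };

    void drawScene(Frame&) { /* GPU renders the scene into this buffer */ }
    void waitForVBlank()   { /* blocks until the monitor's next refresh */ }

    std::array<Frame, 2> buffers;  // front + back buffer
    int front = 0;                 // buffer currently being scanned out

    void renderLoop() {
        for (;;) {
            int back = 1 - front;
            drawScene(buffers[back]);  // draw into the hidden buffer...
            waitForVBlank();           // ...pause until the next v-sync...
            front = back;              // ...then flip: hidden becomes visible
        }
    }

Remove the waitForVBlank() call and you have v-sync off: the flip can then happen mid-scanout, which is exactly where tearing comes from.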
Conclusion: only idiots switch off v-sync unless the program is bugged to hell. All it does is stress your gfx card and CPU more and sometimes, as a consequence, various parts of your system (like sound/mouse) will suffer.
How to PVP: 1. buy ISK with GTCs, 2. fit cloak, learn aggro mechanics, 3. buy second account for metagaming
|

Astigmatic
|
Posted - 2009.04.19 00:29:00 -
[79]
Originally by: johny B5 before the patch, i had 150 fps and higher with the highest settings, now i only have about 60 fps. why? please remove these stupid restrictions. this is not doom 3.
i recently bought a new grafic card, and cannot make use of it. i like my old fps back, wether i am able to see them or not. when i am in a battle situation with less frames than 60 frames, this is not acceptable.
ccp, please remove this useless restriction fast!
You now know how to remove it. Switch to Interval Immediate. Now burn out your gfx card and whine elsewhere. All you armchair techs, please follow suit or set up a business. Let's get on with actually discussing EVE mechanics rather than the technical ineptitude affecting some of the playerbase.
|

Clementina
Jericho Fraction The Star Fraction
|
Posted - 2009.04.19 00:48:00 -
[80]
Thanks to this thread, I changed my game from Interval Immediate to Interval One. Eve uses less CPU time according to Windows Task Manager, and I am not noticing a difference in appearance.
|
|

Jana Clant
New Dawn Tribe New Eden Research
|
Posted - 2009.04.19 01:21:00 -
[81]
Originally by: Polly Prissypantz Grez is correct. All you geniuses going on about not being able to see more than 60fps are missing the ****ing point. It's not about what you see, it's about how the game feels and responds.
Unless you are somehow telepathically connected to your computer, what you see through your monitor is how the game feels and responds. The fact that you have a bunch of frames being rendered by the graphics card and not being used at all is irrelevant, so assuming V-Sync is off and your refresh rate is 60 Hz, it won't make a shred of difference if your graphics card is rendering 60 or 150 frames per second. Any "feeling" you might have about it is purely psychological due to having the frame rate displayed.
Originally by: Polly Prissypantz Eve is slow enough that you won't really notice the difference but in some faster-paced games there is a definite difference in how the game performs when limiting or not limiting the frame rate. We're not talking about how it looks.
I completely agree that V-sync has a detrimental effect on your frame rate (even if that only applies when your card's frame rate is lower than the refresh rate). However, that difference you claim to "feel" in how the game "performs" is utter crap. What you're seeing is the screen showing you the same frame twice because your graphics card didn't have a new one ready yet. That's not the game feeling slower, it's the game looking slower. So yes, you are talking about how it looks, and nothing else.
Originally by: Polly Prissypantz This is an older, but still useful thread on the subject.
I suggest you read it again because you don't seem to understand what's said there.
Originally by: Polly Prissypantz Liberal use of italics brought to you today by the Learn to ****ing Read Association.
Right back at ya, with sporadic use of red, bold and larger font for dramatic effect.
New Eden Research, where your research gets done!
|

Sjobba
|
Posted - 2009.04.19 03:45:00 -
[82]
Edited by: Sjobba on 19/04/2009 03:48:33 Edited by: Sjobba on 19/04/2009 03:46:20 Realizing this has been explained a number of times already, I will put it out there once more... (I actually did some research on this the last time I decided to teach myself game programming )
First: The amount of FPS you can see is limited by your monitor. If your screen can only show 60fps, rendering at a higher rate will NOT make the game smoother... the extra frames will either just be dropped or fragmented into other frames (visible as screen tearing).
Second: Whether your game is rendering at 60, 150, 250, or 1000 frames per second, it will NOT change how smooth the game feels. Window positions and such are all calculated on different threads, separately, away from the graphics. (Assuming the programmers used multi-threading, which is a fair assumption, really.)
Meaning, having a higher fps will NOT make moving windows and such *feel* smoother... The positions will simply be displayed wherever they are whenever the GPU gets around to rendering the next frame. The actual position of the window is NOT calculated when the frame is rendered. (As was the case with single-threaded applications.) See the sketch below.
If anything, higher FPS (above the monitor max) will decrease the smoothness of the game, as it puts more strain on the hardware to render the graphics, leaving less resources available for the rest of the threads.
Note however, that, obviously, if your computer can only handle 30fps, odds are that the game will be a lot less smooth than on a computer that can handle 150fps. (But not because of the FPS alone, as explained above.)
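For what it's worth, a minimal C++ sketch of the decoupling described above (hypothetical names and stub bodies for illustration; whether EVE actually works this way is exactly what's debated further down the thread):

    #include <atomic>

    struct Pos { int x, y; };

    Pos readMouseDrag()  { return Pos{0, 0}; }  // stub: current drag position
    void drawWindowAt(Pos) {}                   // stub: render call

    std::atomic<Pos> windowPos{Pos{0, 0}};      // shared between the threads

    void uiThread() {                 // updates positions at its own rate
        for (;;) windowPos.store(readMouseDrag());
    }

    void renderThread() {             // draws at whatever rate the GPU manages
        for (;;) drawWindowAt(windowPos.load());
    }

The render thread only samples the latest position, so a slow GPU doesn't slow the UI logic down, and a fast GPU doesn't speed it up.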
Third: The fact that your computer can render at 150fps, 250fps, etc... is not impressive. A high FPS does not mean your hardware is uber. If anything, it just means it's configured inefficiently, and that you will have to replace it sooner than you would otherwise have to.
Edit: spelling and minor nitpicks 
|

MechaViridis
Amarr The Program Vanguard.
|
Posted - 2009.04.19 06:34:00 -
[83]
If any of you guys have played competitive Counter-Strike, Quake, etc., you will know that fps higher than 60 DOES matter there. It has something to do with netcode/extrapolation of movement, so for EVE this isn't as important. HOWEVER, with v-sync on, when your FPS drops below 60 it will drop further than it would with v-sync off. Go into a big fleet battle with v-sync on: you will have lower fps because some of the frames will still be out of sync, causing more latency. Now turn off v-sync and go into the same situation. It will most likely be higher. Xfire -- Seph31 |

Alexander Nergal
Three Pony
|
Posted - 2009.04.19 09:31:00 -
[84]
Seems like every time an issue, legitimate or not, is raised on these forums, all that follows is angry, angry posts by mad people. Everything is a "whine". It's goddamn BS.
sidenote, I can feel the difference between 200 and 300 fps so I guess I'm just a god 
|

Feilamya
Minmatar
|
Posted - 2009.04.19 09:37:00 -
[85]
Originally by: Sjobba Second: Whether or not your game is rendering at 60, 150, 250, or a 1000 frames per second... it will NOT change how smooth the game feels. Window positions and such are all calculated on different threads, separately, away from the graphics. (Assuming the programmers designing it used multi-threading, which is a fair assumption, really)
Actually not. Multi-threaded programming is a *****, especially when your application performs a lot of inherently sequential stuff. And in games, there is a lot of inherent sequentiality.
For example, the game client cannot look into the future and render the next 10 or so frames in parallel with the rest of the game mechanics just because it has free GPU resources. What is shown in the next 10 frames depends on user input and on what the server sends to the client.
There used to be a good article about this, but I can't find it any more. The following seems to describe the same problem: http://www.gamasutra.com/features/20051117/gabb_01.shtml
EVE is an old game. For most old games, it's actually a very safe bet that they work entirely single-threaded, except maybe for one or two dedicated threads that handle sound rendering or networking (and I doubt EVE uses any amount of multi-threading for networking). Some may actually use a dedicated graphics rendering thread, which would mean your argument holds.
So what impact does FPS have on the performance of single-threaded games? It depends! If v-sync is on, the CPU waits up to 1/60th of a second for every frame displayed. This time cannot be used for doing other stuff, like networking, handling user input, etc. (Well, it can, but this requires careful programming, or otherwise the single-threaded game loop might miss the next v-sync.) If v-sync is off, the CPU does not wait for v-sync, but it is busy rendering additional, useless frames that will only result in garbage on the screen (tearing). Frames are not rendered entirely on the GPU; the CPU still has to do a lot of work on its own. So that time is wasted and cannot be used by the CPU to do useful stuff.
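As a minimal sketch of that single-threaded loop (hypothetical names; a simplification, not EVE's actual code):

    void pumpNetwork() { /* talk to the server */ }
    void pumpInput()   { /* handle mouse/keyboard */ }
    void updateWorld() { /* run the game mechanics */ }
    void renderFrame() { /* the CPU side of drawing */ }
    void present()     { /* v-sync on: blocks up to 1/60 s; off: returns immediately */ }

    void gameLoop() {
        for (;;) {
            pumpNetwork();
            pumpInput();
            updateWorld();
            renderFrame();
            present();     // everything above waits on this one call each lap
        }
    }

Either way the loop pays for rendering: with v-sync on it sits blocked inside present(), with v-sync off it burns the time producing frames nobody will ever see.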
So the bottom line is that the game can run more or less smoothly when V-sync is on. It depends how well it is implemented. Smart game developers will probably optimise for V-sync enabled, because it simply looks better.
Smart Counter-Strike developers, on the other hand, will optimise for V-sync off, because they know their customers think high FPS = more E-peen.
|

Borne Seller
|
Posted - 2009.04.19 11:41:00 -
[86]
Originally by: johny B5
Originally by: NeoTheo
you have your interval set, that locks the FPS to same as your refresh rate of your monitor.
change it back to imediate.
thanks, it worked! I didn't know these settings before and didn't change them. That's why I was confused. Here's how to do it:
1. Go to Graphics & Displays.
2. Check the box "Advanced settings".
3. Set "Present Interval" to "Interval immediate".
Thank you. I was pulling my hair out trying to understand why. Cheers m8
|

Sjobba
|
Posted - 2009.04.19 20:58:00 -
[87]
Originally by: Feilamya EVE is an old game. For most old games, it's actually a very safe bet that they work entirely single-threaded, except maybe for one or two dedicated threads that handle sound rendering or networking (and I doubt EVE uses any amount of multi-threading for networking). Some may actually use a dedicated graphics rendering thread, which would mean your argument holds.
True, but keep in mind that the EVE graphics engine was pretty much remade from the ground up with Trinity, so the premium client we use today is not really that old.
Although, the fact that fetching data from the market freezes the entire game would suggest it might be less threaded than I was picturing 
Originally by: Alexander Nergal sidenote, I can feel the difference between 200 and 300 fps so I guess I'm just a god 
You expect there to be a difference between the two, so you see it whether or not it is there. There is a psychological term for this... which I can't quite remember at the moment.
I mean... I could swear my computer booted up twice as fast after I upgraded my old SATA drive to SATA II... until I realized I had messed up the cables and was still booting off my old SATA drive.  (It did feel a bit odd, getting a new HDD partitioned exactly like my old one )
|

Agent Known
Apotheosis of Virtue
|
Posted - 2009.04.19 21:08:00 -
[88]
It's actually a good thing they set it to Interval One by default...people were complaining that EVE was burning out their fancy-pants graphics cards.
Yes, increased FPS gives you a better buffer, but at the cost of increased heat and load on the graphics card (CPU is NOT affected by this as much).
Also, if you have an old CRT @ 60 Hz, you'll probably see it flickering a lot, and it will strain your eyes (especially with a room light on). That shows what the eye can "see": the lamp and the monitor flicker at slightly different rates, so you notice a beat between them. Increasing the refresh rate reduces the flicker, since the light and your monitor then refresh at clearly different rates. This isn't true of LCD monitors because of the way they work (a backlight with crystals that block, not generate, light to create the image).
/me is done Obviously what I say isn't always what my alliance thinks. I hate to break it to you, but this is in fact my signature. |

Astigmatic
|
Posted - 2009.04.19 21:14:00 -
[89]
Originally by: Sjobba You expect there to be a difference between the two, so you see it whether or not it is there. There is a psychological term for this... which I can't quite remember at the moment.
Let me help you. The person you are referring to is Alexander Nergal, claiming to feel the difference between 200 and 300 fps, which is only possible on really top-end machines with code that specifically supports it, and which is so recent that there is no historical data to support its existence. He's nuts.
|

gamertrav
0utbreak KrautbreaK
|
Posted - 2009.05.03 10:36:00 -
[90]
Edited by: gamertrav on 03/05/2009 10:36:48
Originally by: Elizabeth Joanne Alas, not so. TFT displays also have this phenomenon called input lag that isn't often discussed because it can be outright horrendous. The Dell 2707WFP for example has 46 milliseconds of input lag. This means you are effectively at least 2 frames in the past at all times.
Yep, and it should be noted that 46ms is more than the network lag you will generally have in an online FPS, so you are basically doubling (or even tripling) the amount of lag you perceive.
Originally by: Elizabeth Joanne If you are worried about frame rates and its effect on your reaction times, choose a monitor with a very low input lag first. TFT Central has measured the input lag on a bunch of models. Manufacturers typically don't advertise this figure.
The thing is, I chose an LCD with 2 ms lag, and it's still very noticeable in games (even in EVE) when I enable v-sync. Maybe the manufacturer just lied about the lag, but I have 2 other LCDs in my house that are even worse.
It's less of a problem in eve of course because eve is a lot less dependent on split second reactions, but it still annoys me when moving the camera around. The real problem for me in Eve though is that when running 2 clients with v-sync on, they sometimes drop to very low FPS (like 20fps...) for some reason, but with vsync disabled this never happens.
|
|

Astria Tiphareth
Caldari 24th Imperial Crusade
|
Posted - 2009.05.03 11:08:00 -
[91]
Edited by: Astria Tiphareth on 03/05/2009 11:08:40 This is bizarre. You do realise you're arguing about a figure that has no relevance to gameplay in EVE whatsoever?
Fact 1. Server operational cycle in EVE is one second, that's 1000 milliseconds. No matter how fast you see and respond to visual input, I guarantee you that the network latency & server cycle time combined will be slower than your reaction time.
Fact 2. A monitor cannot display frames faster than its own refresh rate.
V-sync on or off makes no practical difference whatsoever to your visual response time. The only distinction is whether you get screen tearing or not. That 150 fps is mostly going to waste as heat in your graphics card, because you are most definitely not seeing the benefit.
The idea that you're not making use of a new graphics card because it doesn't go above 60 fps is so mindbogglingly uninformed, I don't know where to begin. Have you considered that it can do more in that 60th of a second than an inferior graphics card?
If you're in a battle with less than 60 fps, you're an idiot for thinking that v-sync is even remotely related. If you have less than 60 fps, you have less than 60 fps. The card is working as fast as it can. That's what bloody v-sync means! Sync to refresh rate. If it's not managing that, what the hell do you think it's doing, twiddling its thumbs? ___ My views may not represent those of my corporation, which is why I never get invited to those diplomatic parties... Environmental Effects
|

Astria Tiphareth
Caldari 24th Imperial Crusade
|
Posted - 2009.05.03 11:20:00 -
[92]
Edited by: Astria Tiphareth on 03/05/2009 11:21:10
Originally by: MechaViridis If any of you guys have played competitive counter-strike, quake, etc, you will know that fps higher than 60 DOES matter. It has something to do with netcode/extrapolation of movement, so for eve this isn't as important.
No no no. It's what Feilamya outlined. You're making a logical link where none exists.
The issue with going above 60 fps in Unreal Tournament etc. has nothing to do with FPS & v-sync and everything to do with game design. Those older first person shooters are single-threaded by and large. Your network code, physics calculations, graphics code, everything is in a single massive loop in code.
If you configure for v-sync, you require that the game code delays the next graphics frame being pushed to display until the next refresh comes around. Badly or lazily written loop code will mean that everything else waits for that as well.
Whilst it is feasible to write code that ensures a timely 60 FPS whilst letting the physics go nuts and calculate as often as possible, it's not trivial. That is why multi-threading is used: in truth, despite its inherent complexity, multi-threading is easier and scales better than coercing a single loop to do it.
The monitor physically cannot display anything faster than its refresh rate. At best you'll get a screen tear and a frame a 60th of a second sooner (assuming 60Hz refresh).
If your personal response time is that good, then more power to you. However, I'd point out that that response time is 16ms, and that's why the other impacts of lag are so critical to take into account. On a LAN Counter-strike game where super-low ping of 4ms or less is desirable, your response time and monitor refresh may just possibly have an impact. In EVE? Not a chance. Not ever. It's physically impossible. The network & server lag is an order of magnitude larger. ___ My views may not represent those of my corporation, which is why I never get invited to those diplomatic parties... Environmental Effects
|

TamiyaCowboy
Caldari Deep Core Mining Inc.
|
Posted - 2009.05.03 11:50:00 -
[93]
Edited by: TamiyaCowboy on 03/05/2009 11:51:41 When vertical sync is disabled, a video card is free to render frames as fast as it can, but the display of those rendered frames is still limited to the refresh rate of the monitor. For example, a card may render a game at 100 FPS on a monitor running a 75 Hz refresh, but no more than 75 FPS can actually be displayed on screen.
Certain elements of a game may be more GPU-intensive than others. While a game may achieve a fairly consistent 60 frame/s, the frame rate may drop below that during intensive scenes. By achieving frame rates in excess of what is displayable, it becomes less likely that frame rates will drop below what is displayable during heavy CPU/GPU load.
The above is the explanation you want. Also remember the GPU can blur frames slightly as they change, to make the transition smooth, and the human eye detects these blurs more easily... or so the wiki says.
|

Mr Malaka
|
Posted - 2009.05.03 13:44:00 -
[94]
Confirming i'm on page 4 of an idiotic whine that was answered in post #2.
What a ****ing train wreck
|

Ghoest
|
Posted - 2009.05.03 16:43:00 -
[95]
I'm guessing it has to do with all the high-end cards that were burning up when you used max settings.
Wherever you went - Here you are.
|

Weer Treyt
|
Posted - 2009.05.03 16:53:00 -
[96]
Please do the following test:
1.1 Switch your graphics settings to interval immediate.
1.2 In a station, take an item on your cursor and drag it left and right across the whole screen, until you get a feeling for how the icon lags behind your movements.
1.3 You realize that the icon lags behind only by a tiny amount.

2.1 Switch your graphics settings to interval one (or more).
2.2 Do the same as in 1.2.
2.3 Realize that the icon lags behind noticeably more than with the interval immediate setting.
Weer Treyt
|

Agent Known
Apotheosis of Virtue
|
Posted - 2009.05.03 17:08:00 -
[97]
Set it to Interval Immediate and leave it be. Burn up your graphics card for all I care. This threadnaught needs to die.
|

Roy Batty68
Caldari Immortal Dead
|
Posted - 2009.05.03 18:26:00 -
[98]
Originally by: johny B5 before the patch, i had 150 fps and higher with the highest settings
 Quit hogging all the FPS! Share with the rest of us. Quit being so greedy. 
Or you don't get any cake... 
----
Thus ev'ry kind their pleasure find, The savage and the tender; Some social join, and leagues combine; Some solitary wander. ~ Robert Burns |

Onus Mian
Amarr Kingfisher Industries
|
Posted - 2009.05.03 19:27:00 -
[99]
Edited by: Onus Mian on 03/05/2009 19:31:07 Edited by: Onus Mian on 03/05/2009 19:26:56 Can your eyes even see things changing at faster than 60 fps? It's been a while since I did that at school, but from what I remember the rate at which pigments in your eye can be replaced limits how many frames you'd see in a second, regardless of how high the fps was.
EDIT
Never mind, Wikipedia says it's hard to quantify a human fps ----
Isn't it enough to see that a garden is beautiful without having to believe that there are fairies at the bottom of it too? - Douglas Adams
|

Armoured C
Gallente Federation of Freedom Fighters Aggression.
|
Posted - 2009.05.03 20:11:00 -
[100]
i am fine with 45 FPS across 2 screens and can watch and be in a large fleet battle. 60 FPS is adequate enough i think
twitter blog
|
|

Weight What
|
Posted - 2009.05.03 20:46:00 -
[101]
Originally by: Armoured C i am fine with 45 FPS across 2 screens and can watch and be in a large fleet battle. 60 FPS is adequate enough i think
Maybe it's enough for you and your primitive eyes; however, those of us with cybernetic ocular implants can process scenes at up to 200 FPS.
-----------------------------------------------
Annonymous, trading as "Weight What". |

Armoured C
Gallente Federation of Freedom Fighters Aggression.
|
Posted - 2009.05.03 20:49:00 -
[102]
Originally by: Weight What
Originally by: Armoured C i am fine with 45 FPS across 2 screens and can watch and be in a large fleet battle. 60 FPS is adequate enough i think
Maybe it's enough for you and your primitive eyes; however, those of us with cybernetic ocular implants can process scenes at up to 200 FPS.
ahh i have cybernetic forumfingertypo version snake plants :)
twitter blog
|

Weight What
|
Posted - 2009.05.03 20:49:00 -
[103]
Originally by: Armoured C ahh i have cybernetic forumfingertypo version snake plants :)
We shall see :))
-----------------------------------------------
Annonymous, trading as "Weight What". |

Armoured C
Gallente Federation of Freedom Fighters Aggression.
|
Posted - 2009.05.03 20:52:00 -
[104]
Originally by: Weight What
Originally by: Armoured C ahh i have cybernetic forumfingertypo version snake plants :)
We shall see :))
you cant beat me ... you cant even come close ... plus i have no work tomorrow and have stayed up without sleep to ensure my forum victory =)
twitter blog
|

Henry Loenwind
|
Posted - 2009.05.03 22:11:00 -
[105]
Oh come on people, you are mixing up a LOT of things in this discussion.
(1) What can the eye see
(1a) Flicker
Definition: "Flicker" is when the screen goes dark after a frame has been shown.
The eye can see flickering up to very high rates. At about 100-120 Hz the flickering is no longer disturbing, but may still be noticed. CRT monitors flicker at their frame rate; LCDs do NOT flicker at all. CRT TVs do not flicker either; they pulsate instead (the image does not go black after each frame but gets darker). Movies flicker, but not at the frame rate (24 fps): they project at a doubled (48 fps) or tripled (72 fps) rate, showing each frame multiple times to avoid the flickering.
(1b) Movement
The eye recognizes movement at about 15 fps. Below that it's single images; above, it's movement. At higher frame rates the movement gets smoother. How high the frame rate must be for the movement to be absolutely smooth depends on the kind of images that make up the frames. If the images have motion blur (everything that is recorded with a camera has it), smooth movement starts at about 22 fps and gets perfect somewhere between 40 and 50 fps. If they don't have motion blur (everything that is created by a computer, unless motion blur is added), much higher frame rates are needed. Some people may be able to detect the effect at 200 fps; for others, 120 fps might be enough.
Note: Adding motion blur is an expensive process that requires extra (partial) frames to be rendered. We are talking about at least 4-10 frames extra per frame rendered. Guess why those SFX studios need those huge rendering farms for rendering those "simple" 1920*1050px 24fps CGI scenes...
(2) What the monitor can display
(2a) Linked display
CRTs and CRT TVs are "linked", meaning that the image that is shown is directly tied to the input signal. There is only a small processing delay (1 or 2 scan lines). So if you feed a 59 Hz signal to a CRT, it will display it at 59 Hz. If you give it a 100 Hz signal, it will display it at 100 Hz. If you give it a 200 Hz signal, it will display either nothing or die. Defective CRTs after playing with your display settings were somewhat common 15 years ago...
(2b) non-linked display
LCDs and modern TVs (with image enhancers, e.g. 100 Hz technology) are not linked. They will read the input signal into an internal buffer and then display the image (or more images, or fewer images, or a changed image) when they "want" to. So a 120 Hz TV will buffer 2 images, then compute a third, then display them. This means there will be a time delay between the input signal and the image on the screen. Some LCDs can have a very large delay, and LCD TVs usually have a huge delay, too. So if you have a home cinema installation and you notice that the sound from your surround speakers is early, blame the TV for being late with the picture! (Or disable its image enhancers. Sometimes TVs have a "PC mode" that does that.) If your mouse seems to be lagging behind your movements, even on the desktop, try another display.
(2c) black-white change
Early LCDs had the problem that you could see ghost pictures. The reason for that is that each pixel needs some time to change its colour. THAT is the 2 ms / 5 ms value we nowadays see in the technical data. LCD pixels have got faster lately, but the main effect is achieved with image-enhancing technology. Basically, the image sent to the pixel actually has a different brightness than what should be displayed, but due to the slowness of the pixel, it is displayed as intended. Again, there's an image enhancer involved, meaning we get a delay as in 2b.
(cont'd)
Originally by: Decard Sune on 23/03/2008 12:12:37
Carebear is a derogatory term used by those who feel that every player should be nothing mroe than a target for their pleasure. These individuals usually ha
|

Henry Loenwind
|
Posted - 2009.05.03 22:12:00 -
[106]
(2d) Hz
Non-linked displays (usually) operate at a fixed rate. So an LCD will display 60 frames per second regardless of the input signal. It may announce to the PC that it can process data at 59, 60 and 75 Hz, but it will display at 60. To my knowledge there is no current consumer LCD that can do 75 Hz, though some displays still accept 75 Hz as input. And with that we come to:
(3) What's on the cable
That's the cable between PC and monitor. In the early days it was a tricky thing to configure the PC so it would send a signal exactly the way the monitor could process it. However, that's the past. Today a monitor will talk back to the PC and tell it exactly what it likes. And with an LCD this would be a 60 Hz signal in 99% of all cases.
(4) What's on the PC
So we know that every 1/60th second one frame must be put on the cable, and that will take about 1/65th second to transfer. So, how does the PC cope with that? There are a couple of different scenarios:
(4a) vsync off, no double buffering
This is the classical way. The GPU renders into a buffer, as fast as it can. At the same time a different part of the GPU reads from that same buffer, pixel by pixel, as it sends the data down the line. This is the mode that will (will, not may!) produce "tear lines" by mixing data from different rendered frames into the same frame that is sent to the monitor. But there is also a good side to this: at the moment a pixel is sent to the monitor, we know that its maximum age is 1/render-fps seconds (read: we get the newest data for that pixel that the GPU can produce).
(4b) vsync on, no double buffering
The GPU renders into a buffer, and when the buffer is filled, it waits. After the buffer is sent to the monitor, the GPU starts to render the next frame. The effect is that this kind of setup fails completely: the gap when no data is being sent to the monitor between frames is so short that the GPU cannot render a frame in it.
(4c) vsync on, double buffering
Now we have 2 of those buffers. First the GPU renders a frame into the first buffer. Then it waits until the other part of the GPU starts to send the content of that buffer to the monitor. Then it renders the next frame into the second buffer. The other part will continue sending the data from the first buffer to the monitor over and over, until finally the second buffer contains a rendered frame. Then it will switch buffers.
Oops, that only happens if the GPU cannot render fast enough (less than 60 fps). If it can, then the content of the buffer will be sent to the monitor only once, but the rendering part of the GPU must wait until the buffers can be swapped.
Sounds good, doesn't it? No tearing, no calculating of frames that are never sent to the monitor. But there are 2 problems. First, the GPU might take just a little bit longer than 1/60th of a second to compute a frame; the effect is that you get 30 fps, as only every second frame is a new one. The second problem happens when the GPU actually is very fast. It may take 1/10th of the allotted 1/60th of a second to render the frame, then it has to wait 9/10ths of 1/60th of a second for the frame to be sent to the monitor. So that frame sat in the buffer for 9/600ths of a second, but if v-sync were off, it would only have sat there for about 1/600th of a second. So we get a slight lag here.
(cont'd)
Originally by: Decard Sune on 23/03/2008 12:12:37
Carebear is a derogatory term used by those who feel that every player should be nothing mroe than a target for their pleasure. These individuals usually ha
|

Henry Loenwind
|
Posted - 2009.05.03 22:13:00 -
[107]
(4d) vsync on, triple buffering
A third buffer is added; how does that help? It helps against both problems noted under 4c: it allows the rendering part of the GPU to continue rendering all the time. There may be a frame in the first buffer that is currently being sent to the monitor, and a frame in the second buffer that has been rendered and is waiting, while the GPU renders into the third buffer. Now it has rendered the third buffer, and the first one is still being sent (and the second one is getting stale by the nanosecond)... so it will render into the second buffer again, discarding the old image. That is no problem for the sending part: when it finishes the first buffer, it can continue with the third one.
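As a small illustration of that rotation (my own C++ sketch with hypothetical names, not any real driver's code):

    #include <utility>

    void startRendering(int buf) { /* GPU begins drawing into buffer 'buf' */ }

    int scanout = 0;   // buffer currently being sent to the monitor
    int ready   = -1;  // newest finished frame (-1 = none waiting)
    int draw    = 1;   // buffer the GPU is rendering into

    void onFrameRendered() {             // the GPU just finished 'draw'
        if (ready >= 0)
            std::swap(ready, draw);      // overwrite the stale waiting frame
        else {
            ready = draw;
            draw  = 3 - scanout - ready; // the one remaining buffer (indices 0+1+2 = 3)
        }
        startRendering(draw);            // never pause
    }

    void onVBlank() {                    // the monitor starts a new refresh
        if (ready >= 0) { scanout = ready; ready = -1; }
        // else: the current buffer is scanned out again (repeated frame)
    }

The renderer never waits and the monitor always gets the newest complete frame; frames in between are simply discarded, exactly as described above.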
(5) Details
Yes, I (over-)simplified that all. What did you expect? A scientific paper? <g>
Originally by: Decard Sune on 23/03/2008 12:12:37
Carebear is a derogatory term used by those who feel that every player should be nothing mroe than a target for their pleasure. These individuals usually ha
|

Gariuys
Evil Strangers Inc.
|
Posted - 2009.05.03 22:17:00 -
[108]
simplified but nice anyway, good job on explaining the third buffer. ;-D
|

Henry Loenwind
|
Posted - 2009.05.03 22:23:00 -
[109]
Originally by: Onus Mian Can your eyes even see things changing at faster than 60 fps? It's been a while since I did that at school, but from what I remember the rate at which pigments in your eye can be replaced limits how many frames you'd see in a second, regardless of how high the fps was.
The eye works differently from our technology. Simplified:
A camera lets light fall onto its pixels, then after a certain time asks each pixel how much light it got.
An eye lets light fall onto its pixels, and every time a pixel has gathered a certain amount of light, it "fires" a signal.
The effect is that in a high-light setting the eye's time resolution is better than in a low-light setting. If an eye-pixel fires 5 times while it sees one display frame, we can see the changes that are not part of the frame but of the process of frame-changing. If the eye-pixel fires once per 2 display frames, we see nice smooth movement (those 2 combined frames even give us motion blur!).
Originally by: Decard Sune on 23/03/2008 12:12:37
Carebear is a derogatory term used by those who feel that every player should be nothing mroe than a target for their pleasure. These individuals usually ha
|

gamertrav
0utbreak KrautbreaK
|
Posted - 2009.05.03 23:23:00 -
[110]
Nice overview of all the aspects in play here Henry. :)
Originally by: Agent Known Set it to Interval Immediate and leave it be. Burn up your graphics card for all I care. This threadnaught needs to die.
So you reply to the post, keeping it at the top of the forums? Nicely done.
|
|

Adaris
Gallente E X I U S
|
Posted - 2009.05.04 00:43:00 -
[111]
I have been told to come to this thread to purchase some additional FPS. thank you. *******
- E X I U S -
|

Astria Tiphareth
Caldari 24th Imperial Crusade
|
Posted - 2009.05.04 11:58:00 -
[112]
Originally by: Henry Loenwind Superb analysis
This should be required reading. Great analysis and explanation. I must confess I'd forgotten some of those details, so the refresher was appreciated. I'd never known about the hardware side, like the LCD having its own buffer.
However, there's one area I think needs clearing up.
In Direct3D one creates a swap chain of buffers to do the buffering as Henry outlined. Direct3D manages the swapping for you in DX9. As it happens, you can't ever directly access the front buffer (which gets sent to the monitor), so D3D forces you to double-buffer as a minimum, whether you like it or not. However, the presentation interval you set determines what happens when you tell Direct3D the back buffer is ready.
Your steps are in essence:
1. Render everything to the back buffer.
2. Present that back buffer to the chain.
3. The driver goes off and swaps the chain: your front buffer just became your back buffer, and any intervening buffers moved forward one.
4. The driver presents the new front buffer according to the presentation interval.
Note that what you just rendered didn't necessarily get sent to the screen just yet. It might take another present and swap, depending on how many buffers you have.
If you're set to presentation interval immediate, then the driver ignores whatever the monitor is currently doing and sends the data as soon as possible. This is v-sync off as Henry outlined.
If you're set to presentation interval one, then the driver will wait until the monitor reaches a vertical blank. In CRT terms, this is the period when the beam must reset from the bottom corner of your monitor to the top corner to start drawing again.
Intervals two & three and so on merely double or triple the waiting period. So if your refresh rate is 60Hz and your presentation interval is two, it'll present only every two vertical blanks and give you 30 fps. Interval default is effectively interval one, but with some subtleties around timing & windowed vs full screen mode.
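In code, the whole choice comes down to a couple of fields at device-creation time. A minimal sketch (the API names are standard Direct3D 9; error handling and the rest of the setup are omitted, and the parameter values are just for illustration):

    #include <d3d9.h>

    void createDeviceSketch(IDirect3D9* d3d, HWND hwnd, IDirect3DDevice9** deviceOut) {
        D3DPRESENT_PARAMETERS pp = {};
        pp.Windowed             = TRUE;
        pp.SwapEffect           = D3DSWAPEFFECT_DISCARD;
        pp.BackBufferFormat     = D3DFMT_UNKNOWN;          // match the desktop format
        pp.BackBufferCount      = 2;                       // 2 back buffers in the swap chain
        pp.PresentationInterval = D3DPRESENT_INTERVAL_ONE; // "Interval one" (v-sync on)
        // D3DPRESENT_INTERVAL_IMMEDIATE = "Interval immediate" (v-sync off);
        // D3DPRESENT_INTERVAL_TWO presents only every second vertical blank.
        d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                          D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, deviceOut);
    }

    // Each frame: render to the back buffer, then call
    //     device->Present(NULL, NULL, NULL, NULL);
    // (what Present does, and when it blocks, is exactly what's described next)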
The Direct3D call to Present returns as soon as the swap operation being requested has been queued up (i.e. transferring one buffer to another). What does this mean for performance?
What the above actually means is that as you run out of frame buffers to render to, Direct3D will stop Present from returning until there's a new buffer available. Thus your frame rate ends up locked to the ability of Direct3D to manage the swap chain. With immediate presentation, buffers cycle quickly, the card works overtime, and you get maximum FPS. With interval one, Present will shortly end up (within 5 frames or so for certain) locked to the refresh rate.
Thus triple buffering in DirectX is acting purely as a reservoir. What you render as a frame now won't turn up for at least two presents. The advantage that Henry describes where you can render to the second buffer again because it's gone stale doesn't exist in DirectX 9. In DX10 and 11 this may become possible.
This is why v-sync on a Direct3D app leads to lower card temperatures with high performance cards. The card simply isn't working as hard. Both rendering and present cycles are locked to the refresh rate.
If the rendering cycle could operate independently of the present cycle, as with the alternative form of triple buffering, no major temperature difference would be observed.
Short version for DirectX 9 applications like EVE: V-sync off -> tearing and maximum FPS, maximum card heat, and the opportunity to see some part of the frame a few ms earlier.
V-sync on -> no tearing, clamped FPS, lower card temperatures, at the cost of some ms delay depending on how long the buffering & refresh rate takes. ___ My views may not represent those of my corporation, which is why I never get invited to those diplomatic parties... Environmental Effects
|

Arkeladin
|
Posted - 2009.05.04 12:27:00 -
[113]
Originally by: Gabriel Loki
Originally by: Lonzo Kincaid what's the frame rate for human eyes?
They dont have one.
They do, just not in the way people think.
Without getting VERY technical, the human eye can only perceive changes in a given "scene" at a fixed rate. That rate is EQUIVALENT to about 20 fps. Beyond that, since eyes are an organic system and don't have anything like a shutter, the frames start bleeding together. This is called "persistence of vision" and is what allows us to perceive motion. Abusing this somewhat is how movies and TV work: TV *FRAMES* per second can be as low as 25 yet still appear smooth (PAL TV).
Look it up yourselves
|

Pan Crastus
Anti-Metagaming League
|
Posted - 2009.05.04 12:51:00 -
[114]
Originally by: Weer Treyt Please do the following test:
1.1 Switch your graphics settings to interval immediate. 1.2 In a station, take an item on your cursor and drag it left and right across the whole screen, until you get a feeling for how the icon lags behind your movements. 1.3 You realize that the icon lags behind only by a tiny amount.
2.1 Switch your graphics settings to interval one (or more). 2.2 Do the same as in 1.2. 2.3 Realize that the icon lags behind noticeably more than with the interval immediate setting.
Interesting test, but it doesn't show a higher refresh rate. What it shows is that EVE is rendering 1 or more (seems like more) frames in advance, so the frames you see are 2-3 60ths of a second behind the hardware mouse cursor. I don't know why it is so clearly visible, but it seems to suggest that EVE's double/triple-buffering could be improved...
How to PVP: 1. buy ISK with GTCs, 2. fit cloak, learn aggro mechanics, 3. buy second account for metagaming
|

Astria Tiphareth
Caldari 24th Imperial Crusade
|
Posted - 2009.05.04 13:12:00 -
[115]
Edited by: Astria Tiphareth on 04/05/2009 13:13:09
Originally by: Pan Crastus Interesting test, but it doesn't show a higher refresh rate. What it shows is that EVE is rendering 1 or more (seems like more) frames in advance, so the frames you see are 2-3 60ths of a second behind the hardware mouse cursor. I don't know why it is so clearly visible, but it seems to suggest that EVE's double/triple-buffering could be improved...
It also fails to take into account everything else that happens as you drag the icon around. EVE is entirely single-threaded, so the immediate presentation allows for a faster response time on other code e.g. network-related queries or detecting mouse events.
As I said earlier, the critical issue that people seem to forget is that beyond potentially a very slightly more responsive GUI, and a few milliseconds more warning of a visual change that is entirely client-extrapolated anyway, maximum FPS or clamped FPS makes little difference to EVE's gameplay. It's all focused around a full 1 second update cycle.
For now, immediate mode is useful for those that want every performance aspect pushed as far to the limit as possible. For those of us that want our graphics card to not run at 100% all the time, interval one is far more of an acceptable compromise. The real test is this - play EVE, fight, get in a fleet fight, with v-sync off. Do the same with v-sync on. Has it made any real difference to the game, and your success with it etc.? That is the critical question to ask, and it's a personal one unique to each of us & our environment. ___ My views may not represent those of my corporation, which is why I never get invited to those diplomatic parties... Environmental Effects
|

Taedrin
Gallente Golden Mechanization Protectorate
|
Posted - 2009.05.04 13:23:00 -
[116]
Edited by: Taedrin on 04/05/2009 13:26:35
Originally by: Akita T Edited by: Akita T on 17/04/2009 16:58:02
Turning VSynch off just so you can SHOW OFF your "omfg, 279 FPS EVE, wtfbbq" is downright stupid, because it's KNOWN to have contributed a lot to premature frying of several video cards.
Also, like many, MANY people have said here before, if the monitor can only display 60 (or 75, or 100, or 120, whatever), there's no BENEFIT in turning VSynch off ("Interval immediate") other than the FPS e-p33n number on the FPS monitor. At best, you will see half a frame and half of the other with a bit of tearing in-between, but that's just stupid.
In EVE, at pretty much all times, VSynch should be turned on (Interval one) and left that way forever. It has no serious drawbacks (some MIGHT argue that it doesn't "feel that dynamic anymore" because they were used to the tearing effect) but a lot of benefits compared to the alternative (longer vidcard life, for starters).
NOT ALWAYS TRUE.
V-sync keeps the graphics card synced with the monitor, and this is all well and good if the graphics card is fast enough to keep up with it. But if your graphics card is rendering an intensive scene, v-sync has a couple of consequences. First off, v-sync will NOT start updating a scene on the monitor until the monitor has finished drawing the last frame. This means that with v-sync enabled on a 60 Hz display, your frame rate is locked to an integer fraction of 60: 60 fps, 30 fps, 20 fps, 15 fps, and so on down to around 4 fps. Please note that the FPS displayed by EVE, FRAPS or whatnot is actually an AVERAGE frame rate, not the actual current frame rate, so it will show different numbers.
Here's an example that I've posted on these forums before:
" In this example, your FPS is 5/6 of the refresh rate.
Interval immediate:
f1   f2   f3   f4   f5   f6
1111 1111 2222 3333 4444 6666
1111 2222 2222 3333 4444 6666
1111 2222 3333 3333 4444 6666
1111 2222 3333 4444 4444 6666

Interval one:
f1   f2   f3   f4   f5   f6
1111 1111 3333 3333 5555 6666
1111 1111 3333 3333 5555 6666
1111 1111 3333 3333 5555 6666
1111 1111 3333 3333 5555 6666

(columns are monitor refreshes f1-f6, rows are quarters of the screen, and the digit shows which rendered frame supplies those lines)
You see here that Interval immediate only drops frame #5, but is only able to draw a portion of frames 2-4 on time. Interval one drops frames #2 and #4 so it can get a head start on frames #3 and #5 (so it actually finishes #5 on time). However, more frames were dropped under Interval one than under Interval immediate. "
EDIT: You will see that while Interval immediate does more work, it introduces an artifact called "tearing" into the rendering process. Vsynch will cause your effective frame rate to drop, but will eliminate "tearing".
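The step pattern is easy to reproduce with a toy calculation (my own sketch, assuming a fixed render time per frame and a 60 Hz refresh; real frame times vary, which is why measured averages land between the steps):

    #include <cmath>
    #include <cstdio>

    int main() {
        const double refreshHz = 60.0;
        const double period = 1.0 / refreshHz;     // one refresh, ~16.7 ms
        const double renderTimesMs[] = {10.0, 16.0, 17.0, 34.0, 70.0};
        for (double ms : renderTimesMs) {
            // under v-sync a finished frame waits for the next refresh boundary
            double refreshes = std::ceil((ms / 1000.0) / period);
            std::printf("render %5.1f ms -> %4.1f fps\n", ms, refreshHz / refreshes);
        }
        // prints 60, 60, 30, 20 and 12 fps: going just one millisecond over
        // the 16.7 ms budget halves the displayed rate, which is where steps
        // like 60/30/20/15/... come from.
    }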
|
|

Chribba
Otherworld Enterprises Otherworld Empire
|
Posted - 2009.05.04 13:27:00 -
[117]
So, in order to get the LEAST wear & tear (heat), v-sync + interval one is the way to go? I'm fine with 20 fps; if that means my GPU sticks at 50C rather than 85, I'm all happy. What to do?
|
|

Astria Tiphareth
Caldari 24th Imperial Crusade
|
Posted - 2009.05.04 13:35:00 -
[118]
Edited by: Astria Tiphareth on 04/05/2009 13:37:45
Originally by: Chribba So, in order for LEAST wear&tear (heat) - vsync+interval one is the way to go? I'm fine with 20fps if that means my GPU sticks at 50C rather than 85 I'm all happy. What to do?
Interval one is v-sync; it's just a different naming convention. It will lock to the refresh rate of your monitor, so that your maximum FPS never goes above it. Empirical evidence and a fair amount of logic indicate that your card will run cooler as a result. How much impact that has on the lifetime of your card will depend.
Edit: You could set it lower, to two or three or four if you want, but the locked maximum FPS drops to the refresh rate divided by that number (30, 20, 15 on a 60 Hz monitor), and I'd personally not recommend three or four. ___ My views may not represent those of my corporation, which is why I never get invited to those diplomatic parties... Environmental Effects
|

Taedrin
Gallente Golden Mechanization Protectorate
|
Posted - 2009.05.04 13:37:00 -
[119]
Originally by: Chribba So, in order for LEAST wear&tear (heat) - vsync+interval one is the way to go? I'm fine with 20fps if that means my GPU sticks at 50C rather than 85 I'm all happy. What to do?
Yes. Interval Immediate tells the graphics card to do as much work as it possibly can, while interval one (vsynch) allows it to take a break between frame refreshes. BTW, I think we've actually already gone over this in this thread 
|

Seishi Maru
M. Corp Mostly Harmless
|
Posted - 2009.05.04 15:19:00 -
[120]
Edited by: Seishi Maru on 04/05/2009 15:23:17
Originally by: Shintai
Originally by: Grez Idd, the last two posts have hit it on the head.
Human eyes can perceive up to, and just about past 200fps. Some people cannot, some people can - it's a bit like hearing range for eyes, but, well yeah, you get the idea...
Locking your fps to your monitors refresh rate can also have detrimental effects. Lookup vsync and what it does.
All I did was state that not being able to see past 60fps is rubbish, and not a reason to lock your computer to a certain fps. It's also not a reason to use vsync. Vsync should only be used if you experience tearing of textures/scenes (the issue of a frame being dropped halfway through being rendered).
First of all the human eye dont know if its 30 FPS or 2mio FPS. Does the screen flicker for you in the cinema?
Secondly..LCDs...60hz...60FPS..Bingo. anything above FPS simply wont get shown. the frames are discarded. !
That is because movies do not use an additive composite colouring scheme. They use a filtering colour scheme on film, which causes slight blurring as one frame passes to the next. That greatly reduces the number of frames per second needed to deceive the human brain, but on a monitor you need quite a bit more to achieve the same effect. Usually, for most people, around 50 fps is enough. Remember that although each receptor in your eye can only operate at a much lower frequency, they are NOT synchronized. One receptor may only switch on or off every n milliseconds, but N others with the same frequency can be offset a little and be triggered between those n milliseconds. That is why you CAN notice that something is wrong with a scene in some situations at frame rates like 20-25 fps...
Also, when you turn v-sync off you are just creating problems, since the CPU must calculate all the trash (trash because it won't be seen) that must be sent to the GPU. In that scenario you are just overstressing your CPU and making the rest of the software on your computer have a harder time if it is running in the background.
|
|

Razin
The xDEATHx Squadron Legion of xXDEATHXx
|
Posted - 2009.05.04 15:19:00 -
[121]
Originally by: Taedrin
Originally by: Chribba So, in order for LEAST wear&tear (heat) - vsync+interval one is the way to go? I'm fine with 20fps if that means my GPU sticks at 50C rather than 85 I'm all happy. What to do?
Yes. Interval Immediate tells the graphics card to do as much work as it possibly can, while interval one (vsynch) allows it to take a break between frame refreshes. BTW, I think we've actually already gone over this in this thread 
The linked thread also explains why you need to run with v-sync off if you ever expect your uncapped FPS to drop below the screen refresh (60).
To recap: v-sync will cause dropped frames and an actually lower framerate during GPU-intensive scenes, when your uncapped FPS drops below the screen refresh. ...
|

Benzaiten Reverse
Caldari Shokei
|
Posted - 2009.05.04 17:42:00 -
[122]
Originally by: Red Wid0w Yeah listen to what people are telling you. YOU NEVER HAD 150 FPS. YOUR MONITOR CAN ONLY DISPLAY 60 (more than likely). All you are doing with your 150fps is wasting processing power, thus reducing the responsiveness of your background apps. Turn VSYNCH on and lock your fps to 60 and there WILL BE NO DIFFERENCE TO EVE. However, everything else on your pc will run faster.
Oh, and those 120hz LCDs are rubbish, they interpolate between frames basically.
EVE is just about only graphics-card hungry: it doesn't take much CPU, and background apps depend mostly on the CPU.
As for v-sync on: you are hard-limiting the number of frames the graphics card renders, and at least on my PC it reduces the responsiveness of the game itself when there is a shift in GPU usage. Most importantly, the OS is balancing resources between applications, and when the requirements change there is a delay while the OS does a lot of housekeeping (i.e. waiting for current tasks to finish). So when you limit fps to 60 (with graphics card, CPU and buffer memory largely in use by other applications or the OS itself) and, say, a small fleet jumps on you or a huge explosion gives you lots of particles to render, your FPS goes down (say by half to 30, but it can be more) for a while before the OS assigns you more resources and it climbs back up. If you are not limiting your graphics card's performance you have the same situation, but not as drastic; and even so, going down from 150 to 120 fps is not something you will notice.
In EVE it's not that big an issue, but in some fast action games it makes a difference.
|

Kyra Felann
Gallente Noctis Fleet Technologies
|
Posted - 2009.05.04 18:24:00 -
[123]
Originally by: johny B5 thanks, it worked! I didn't know these settings before and didn't change them. That's why I was confused. Here's how to do it: 1. Go to Graphics & Displays 2. Check the box "Advanced settings" 3. Set "Present Interval" to "Interval immediate"
Yup, that's how to enable ugly screen-tearing and waste resources while gaining no visual improvement.
|

Kyra Felann
Gallente Noctis Fleet Technologies
|
Posted - 2009.05.04 18:41:00 -
[124]
Originally by: masternerdguy the human eye sees up to around 22fps, so anything above that is wasted anyway.
I used to believe this also, but it's completely wrong.
The human eye has no framerate. It is an analog device and can discern very high framerates. Also, lower framerates like in movies look smooth to the eye because of motion-blur, which most games don't have.
Do some actual reading about the subject instead of just repeating wrong things you hear on the internet.
|

Lt Angus
Caldari End Game. Dead End.
|
Posted - 2009.05.04 18:41:00 -
[125]
I love this thread  I bet half these guys paint stripes on their cars to make them go faster please resize your signature to the maximum allowed file size of 24000 bytes. Navigator Shhhh, Im hunting Badgers |

Henry Loenwind
|
Posted - 2009.05.05 15:11:00 -
[126]
Originally by: Kyra Felann lower framerates like in movies look smooth to the eye because of motion-blur, which most games don't have.
I feel I need to qft this fact again.
The human eye LOVES motion blur. It evolved in (or was constructed for, if you like) an environment where there was motion blur and it would have been a major investment to avoid it. So the brain decided to use the motion blur to its advantage instead.
If we see something that moves, we expect motion blur. If there is none, we will notice that.
So, we have a computer that produces frames that have no motion blur. What can we do to keep the eye from noticing? Three possibilities:
(1) Add motion blur. For this the computer needs to compute additional frames and blend them together. To reduce the needed frames, rendering can be restricted to parts of the frame that have changed.
(2) Add blur around moving objects. Cheap and simple, the application knows which objects are moving and can add simple blur around them. Or if the camera is moving, it can just add blur over the whole screen.
(3) More fps to the eye. If more frames are delivered to the eye, there will be some kind of saturation, meaning that the frames blend together. And blended frames are blurry where they differ.
Frames with motion blur look fine at 24 fps (movies, either with real blur or CGI/SFX using solution #1), whereas computer games need more, e.g. 60 fps (the max a normal LCD will display), to utilize solution #3.
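Solution (1) is simple enough to sketch. The following is my own illustration (renderAt() is a hypothetical engine call, stubbed here so the example compiles), blending n sub-frames to approximate a camera shutter:

    #include <cstddef>
    #include <vector>

    struct Image { std::vector<float> px; };

    // Stub for illustration: a real engine would render the scene at time t.
    Image renderAt(double t) { return Image{std::vector<float>(4, float(t))}; }

    Image renderWithMotionBlur(double frameStart, double frameLen, int n) {
        Image acc = renderAt(frameStart);
        for (int i = 1; i < n; ++i) {               // n-1 extra sub-frames
            Image sub = renderAt(frameStart + frameLen * i / n);
            for (std::size_t k = 0; k < acc.px.size(); ++k)
                acc.px[k] += sub.px[k];             // accumulate
        }
        for (float& v : acc.px) v /= n;             // average: blurred wherever things moved
        return acc;
    }

Every displayed frame costs n renders, which is exactly the 4-10x expense the note under (1b) above warns about.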
----------------------
People state that games feel more responsive without vsync. Read my large 3-post post and you'll know why. In short: Normal vsync introduces additional lag in the chain from game to eye. (actually it's not lag but older frames)
----------------------
BTW: Thanks to the one who posted the info about DirectX. The last time I worked with the software side of this, "VESA" was something some graphics cards supported <g>
Originally by: Decard Sune on 23/03/2008 12:12:37
Carebear is a derogatory term used by those who feel that every player should be nothing mroe than a target for their pleasure. These individuals usually ha
|

Henry Loenwind
|
Posted - 2009.05.05 15:12:00 -
[127]
Originally by: Lt Angus I love this thread  I bet half these guys paint stripes on their cars to make them go faster
Hey, that actually works.
Cars with stripes appear to move faster to the observer (the stripes look like motion blur). And usually people who paint stripes on their cars don't actually want to go fast; they want to impress other people by making them think they go fast...
Originally by: Decard Sune on 23/03/2008 12:12:37
Carebear is a derogatory term used by those who feel that every player should be nothing mroe than a target for their pleasure. These individuals usually ha
|