
BelNevyn
|
Posted - 2009.11.15 11:24:00 -
[1]
Hi! For some reason the test client only generates 0.5-1.8 FPS (Ctrl+F). It also pushes my computer to 78% CPU load and higher.
The information I got from the Ctrl+F monitor is: AVG FPS 0.5-1.8, UIContainer 1528, UISprite 1997.
I have turned down the UI settings for performance (following the auto guide).
My computer is an AMD X2 4400+, 4GB RAM, Windows XP Service Pack 2. Graphics card: XFX Nvidia GeForce 9800GT 512MB DDR3.
According to the spec requirements, that should be enough to pass both your minimum and recommended levels. With the latest TQ patch I have no trouble running dual clients while watching a movie at the same time.
Sincerely, BelNevyn
|

Grez
Fairlight Corp Rooks and Kings
|
Posted - 2009.11.15 11:33:00 -
[2]
Something is wrong on your end or in the background. My brother has similar specs and he's able to run it perfectly. Are you running it windowed across multiple screens? Try running it fullscreen and see what happens. ---
|

Ancy Denaries
Caldari The Confederate Navy Forever Unbound
|
Posted - 2009.11.15 12:13:00 -
[3]
That sounds very bad indeed. Check that you are not using a pseudo-fullscreen app like EVEMover and moving the client to the wrong monitor (this can really kill FPS when the graphics card has to cross-render through the DX buffer). Also run fullscreen and see if it makes a difference, and update to the latest drivers.
For reference, I have a GeForce GTX 275 and a Core i7 @ 3.1 GHz, and I get over 120 FPS straight on the test server. ---- The Demigodess with a Conscience - An EVE IC Blog Personal Killboard |

Arous Drephius
|
Posted - 2009.11.15 15:07:00 -
[4]
Originally by: Ancy Denaries I get over 120 FPS
WTS Interval One before you damage your GPU.
|

Zaiyo Modi
Minmatar
|
Posted - 2009.11.15 16:17:00 -
[5]
Edited by: Zaiyo Modi on 15/11/2009 16:18:25
I have seen crippling FPS on both the Tranquility and Sisi clients while using mining drones. I tried to work out why, but I can't explain it, other than that it seems related to having "effects" enabled and somehow linked to drones, since the FPS rises back to 60 the instant the now-idle drones are returned to the drone bay.
I have also seen unexplained FPS drops on Sisi while warping between planets with the new textures.
Also, after having my ship blown up in combat on Sisi, the FPS drops by roughly 50% or more, then rises back to a normal 60 FPS after a couple of minutes.
This is with a Radeon 5850 and Catalyst 9.10. I feel useless, as I can't work out why the FPS drops happen or whether the Sisi server itself, with its reduced power, is to blame.
The drone issue appears regardless of antialiasing settings, though I am not sure yet about the FPS drops from planets or from ending up in a pod.
|

Zaiyo Modi
Minmatar
|
Posted - 2009.11.15 16:52:00 -
[6]
Edited by: Zaiyo Modi on 15/11/2009 16:52:25
I believe I was once told that CCP does not support the use of antialiasing, and I think I've read that if a game's developers create their own shaders, it can interfere with the antialiasing features in the Catalyst drivers.
It would be nice to see some dev feedback on this topic, because I hate spending time and energy on bug reports that CCP perhaps doesn't want to receive.
|

BelNevyn
|
Posted - 2009.11.15 16:54:00 -
[7]
Originally by: Grez Something is wrong on your end or in the background. My brother has similar specs and he's able to run it perfectly. Are you running it windowed across multiple screens? Try running it fullscreen and see what happens.
I'm running each client in windowed mode on two different screens. I will try running it fullscreen later and see what happens.
It happens right from startup: when I try to log in, there is a long delay between typing a character and the * appearing in the password field.
|

Ancy Denaries
Caldari The Confederate Navy Forever Unbound
|
Posted - 2009.11.16 14:08:00 -
[8]
Originally by: Arous Drephius
Originally by: Ancy Denaries I get over 120 FPS
WTS Interval One before you damage your GPU.
Well yes, this is with interval immediate, naturally. When I actually play, interval is at one, and the FPS sits nicely at 60. ---- The Demigodess with a Conscience - An EVE IC Blog Personal Killboard |

Psycho Tripper
|
Posted - 2009.11.16 14:26:00 -
[9]
Originally by: Arous Drephius
Originally by: Ancy Denaries I get over 120 FPS
WTS Interval One before you damage your GPU.
I don't understand where this rumor comes from. GPUs are designed to run under full load for hundreds of thousands of hours. The REASON you turn interval to one is that it acts like v-sync: it stops screen tearing when the graphics update faster than the monitor refreshes.
If you have to turn interval to one to stop your graphics card overheating (considering that is the only way of damaging your graphics card, bar throwing it into a bucket of water), then you have a faulty graphics card and you should get it replaced. FYI, GPUs are designed to run at up to 100°C.
If anything, having it off will improve your pew pew, as you'll get a more responsive game. It's commonplace in first-person shooters to turn v-sync off to improve responsiveness (and to boast about your 200 FPS E-PEEN).
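To illustrate the difference, here's a toy sketch in Python (my own made-up numbers, nothing from the EVE client): interval immediate presents frames as fast as the GPU finishes them, while interval one presents at most one frame per monitor refresh.

REFRESH_HZ = 60          # monitor refresh rate
RENDER_TIME_MS = 2.5     # hypothetical time for the GPU to render one frame

def presented_fps(swap_interval):
    """Frames shown per second for a given swap interval (0 = immediate)."""
    uncapped = 1000.0 / RENDER_TIME_MS
    if swap_interval == 0:
        return uncapped                          # runs flat out, may tear
    return min(uncapped, REFRESH_HZ / swap_interval)

print(presented_fps(0))  # 400.0 - interval immediate, GPU flat out
print(presented_fps(1))  # 60.0  - interval one / v-sync, capped at refresh

Either way the card is doing work it was built for; the cap only changes how often finished frames are shown.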
tl;dr: get a clue before you post rubbish, and if you see anybody posting rubbish about damaging GPUs, tell them to STFU.
|

Lork Niffle
Gallente External Hard Drive
|
Posted - 2009.11.16 15:11:00 -
[10]
Psycho, the reason we say it is because of a quirk in GPU history.
The 8000 core was a huge improvement in graphics performance, and since it came out at a good time, a lot of systems had this line of cards in them. The 9000 and 200 series are built off the SAME core, and since Nvidia won on overall performance with the 8000 series, almost all gaming PCs since then have contained the original 8000 design.
Then bring in the news that the 8000 core has a fault in it. Add the fact that the high clock speeds Nvidia cards run at mean they idle at around 60°C and run at 85°C under normal conditions, and it has been this way across the series. The fault meant that, under certain conditions, the card could reach 110°C before the problem was noticed; it wouldn't shut itself off and would instead burn out.
This became apparent on the EVE Online main menu screen, which uses large amounts of easy-to-render graphics and puts a card under maximum load very rapidly. The card could end up rendering at over 1000 FPS, then the computer blue-screens and the card is genuinely broken.
This is not a fault in an individual card but simply an error in the core design with regard to temperature monitoring.
Oh, and sure, cards can work for thousands of hours at max load, but they can last decades when v-synced. ------------------------------------- Don't click the links or even the forum topics. |

Soma Khan
|
Posted - 2009.11.16 15:32:00 -
[11]
Originally by: Lork Niffle Psycho, the reason we say it is because of a quirk in GPU history. [...]
Could you provide a credible reference source for the above? ___
|

Psycho Tripper
|
Posted - 2009.11.16 15:32:00 -
[12]
^^ I'm very aware of that problem. I'm also aware that it was an issue limited to the 8800GT core, which extended into the rebranded 9800GTX and 9800GT (the issue didn't survive the die shrink). Consequently, turning interval to immediate isn't a problem for most graphics cards, and it should not be the basis for a general statement.
|

Lork Niffle
Gallente External Hard Drive
|
Posted - 2009.11.16 15:51:00 -
[13]
Actually, it was found most commonly in the 8400M chipset. The 8800 was only subject to it because it ran at much higher temperatures and had a much higher clock and specification. That meant that in a high-stress situation it could quite easily top 110°C even without the fault, and since that is the upper limit of workable temperatures, running there for extended periods will cause hardware malfunctions. Running any hardware at high temperatures will cause damage, and stock cooling in most cases is designed merely to keep the hardware below its physical temperature limit rather than at an optimal temperature. ------------------------------------- Don't click the links or even the forum topics. |

Grez
Fairlight Corp Rooks and Kings
|
Posted - 2009.11.16 16:50:00 -
[14]
Edited by: Grez on 16/11/2009 16:51:56
Originally by: Lork Niffle Psycho, the reason we say it is because of a quirk in GPU history. [...]
Tripe (bar the fault in the 8-series core).
Your issue is most probably due to an error rendering window shadows within Windows. I had a similar issue and had to disable a few desktop effects (window shadows, fade-in start-menu effects) for it to work.
It also didn't like EVE running across multiple screens (even if it spilled onto the other screen by just one pixel). ---
|

Ecky X
Shade. Cry Havoc.
|
Posted - 2009.11.16 17:05:00 -
[15]
Edited by: Ecky X on 16/11/2009 17:05:31
Originally by: Psycho Tripper
Originally by: Arous Drephius
Originally by: Ancy Denaries I get over 120 FPS
WTS Interval One before you damage your GPU.
Stuff
A GPU renders the same number of frames with "interval one" as with "interval immediate". Interval one just discards the extra frames, and sometimes holds a frame a bit longer in the buffer before it is displayed onscreen (resulting in marginally more display lag).
Good article: http://www.anandtech.com/video/showdoc.aspx?i=3591&p=2
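To put a number on that buffering lag, here's a back-of-the-envelope sketch in Python (my own toy model, not from the article): a frame that finishes mid-refresh sits in the buffer until the next vblank.

import math

REFRESH_HZ = 60
TICK_MS = 1000.0 / REFRESH_HZ   # ~16.7 ms between refreshes

def added_lag_ms(frame_ready_ms):
    """How long a finished frame waits for the next vblank before display."""
    next_vblank = math.ceil(frame_ready_ms / TICK_MS) * TICK_MS
    return next_vblank - frame_ready_ms

print(round(added_lag_ms(5.0), 1))   # 11.7 - finished early, waits almost 12 ms
print(round(added_lag_ms(16.0), 1))  # 0.7  - finished just before the vblank

So the worst case is one refresh period of extra lag, which on a 60 Hz screen is under 17 ms.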
|

Soma Khan
|
Posted - 2009.11.16 17:52:00 -
[16]
Originally by: Ecky X
A GPU renders the same number of frames with "interval one" as with "interval immediate". Interval one just discards the extra frames, and sometimes holds a frame a bit longer in the buffer before it is displayed onscreen (resulting in marginally more display lag).
Not that I am arguing, but why then does the GPU temperature decrease when "interval one"/v-sync is enabled? ___
|

Aralieus
Amarr Traumark Logistics
|
Posted - 2009.11.16 18:07:00 -
[17]
Hey OP, I have a similar setup and I get about 45-50 FPS in space with settings on high, bloom low, and HDR on. Something isn't letting you take full advantage of your computer. Here are a couple of links that might help:
Eve Tech Page
NVIDIA Driver Update Page
Game Booster ^^ This helps cut out most background programs you will not need when running Eve.
AMD Driver download page
Hope any of this helps 
Fortune Favors the Bold!! |

Ecky X
Shade. Cry Havoc.
|
Posted - 2009.11.16 19:39:00 -
[18]
Originally by: Soma Khan Not that I am arguing, but why then does the GPU temperature decrease when "interval one"/v-sync is enabled?
Ideally it shouldn't. If anything, the overall temperature of the card should go up (marginally), since v-sync uses up more memory. I'll do some testing, though.
|

Siigari Kitawa
Gallente The Aduro Protocol
|
Posted - 2009.11.16 20:06:00 -
[19]
Originally by: Aralieus Hey OP, I have a similar setup and I get about 45-50 FPS in space with settings on high, bloom low, and HDR on. [...]
Hey, just walked into the thread. Thanks for the links; I'm happy to see this kind of help.
Also, I don't notice an FPS impact, but my swap file explodes and the textures on the planets etc. just seem to KILL my computer :(
|

Grez
Fairlight Corp Rooks and Kings
|
Posted - 2009.11.17 01:55:00 -
[20]
Edited by: Grez on 17/11/2009 01:56:21 V-Sync will render only up to the number of frames per second your monitor refreshes at.
However, if it cannot manage that, it will halve the number of frames produced (so one frame for every two refreshes), and if it can't manage that, it will halve it again. This will slow your game down and make it feel sloppy and slow.
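To put rough numbers on that halving (a toy sketch in Python, assuming plain double buffering on a 60 Hz screen; the frame times are made up):

import math

REFRESH_HZ = 60
TICK_MS = 1000.0 / REFRESH_HZ

def vsync_fps(render_time_ms):
    """Effective FPS when every frame must land on a refresh boundary."""
    ticks_per_frame = max(1, math.ceil(render_time_ms / TICK_MS))
    return REFRESH_HZ / ticks_per_frame

print(vsync_fps(10))  # 60.0 - frame fits within one refresh
print(vsync_fps(20))  # 30.0 - misses one vblank, so half rate
print(vsync_fps(40))  # 20.0 - the rate snaps down in steps: 60, 30, 20, 15...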
If your machine is cooled properly (and by properly, I mean intake and exhaust fans on your case with ample airflow), then you do not have to worry about turning v-sync on (setting interval one). Leave it on interval immediate for the best performance possible. Anything else is just tripe and rubbish from people who think they know what they're talking about. I ran my old ATI X1800XT (four years old now) without v-sync, and it's still going strong in a different machine: no faults, no worries, nothing. In fact, the only reason v-sync was ever invented is that when the FPS gets too high (I mean 400+) you start to see INSANE texture tearing, which can be a tad ugly.
The issue here is something causing horrendous problems with the rendering, not heat (which would cause instability and minor performance degradation, not complete performance degradation).
OP, as I've said before: run EVE fullscreen. If it still does it, disable all the background programs you're not using (MSN, Steam, FRAPS, etc). ---
|

Daelorn
|
Posted - 2009.11.17 23:41:00 -
[21]
Originally by: Arous Drephius
Originally by: Ancy Denaries I get over 120 FPS
WTS Interval One before you damage your GPU.
lolwut. Where did you pull this nonsense from?
|