
Mataki Onimareu
Gallente Life Extermination
|
Posted - 2007.09.26 05:27:00 -
[1]
Ok I know it sounds ridiculous, but hear me out.
Any game I play spikes my GPU to around 68°C, all my games actually except one. EVE spikes me to 75°C and sometimes even more if I'm in a station.
So for an old engine I'm over here scratching my head as to why this game needs so much GPU.
|

Kobushi
|
Posted - 2007.09.26 05:53:00 -
[2]
It's even weirder since the client puts all the computation on your CPU and none on your viddie. Something is amiss. Would it be possible that your monitoring software is not configured correctly, or that your GPU heatsink is feeding from your CPU heatsink exhaust?
|

Mataki Onimareu
Gallente Life Extermination
|
Posted - 2007.09.26 06:00:00 -
[3]
No idea. I have a 7900GT and use the nVidia Monitor; my CPU is fine at 59°C, and from the HSF I have a tube that goes from the top of the fan directly to the outside of the case.
My point for the thread is that hopefully the new engine will be coded better.
|

eeeweeezeee
Macabre Votum INVICTUS.
|
Posted - 2007.09.26 06:26:00 -
[4]
I also noticed high graphics card temperatures.
My graphics card's fan melted a while back. I think this was caused by fan failure, because I looked at its monitoring program's log and it registered temperatures of 250+ F (subtract 32 and then divide by 1.8 to get C). The fan actually melted and came apart. The oddest thing is that all of the card survived, with the exception of whatever circuitry controls the fan. I certainly play a lot of eve, but I attribute the failure to it being a cheap ABIT card with inferior components. The card is still running now with a new heatsink and fan I bought for a couple dollars at a nearby computer store and connected straight to a 12V connector. I still can't believe the rest of the card survived, though. Might have been EVE, but probably wasn't. I am never buying an ABIT product again if I can avoid it, considering all the trouble I have had from their products.
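(For reference, that conversion works out to (250 - 32) / 1.8 ≈ 121°C, far above any normal load temperature.)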
|

Mrmuttley
Guns 'N' Hoses
|
Posted - 2007.09.26 06:48:00 -
[5]
I seriously doubt that eve would put a heavier load on the GPU than other games, and a 7900 GT should be able to handle Eve with ease.
Those temps aren't horrible for a graphics card, so you don't need to worry about damaging it, but I would recommend reinstalling the graphics drivers and investigating that card temperature monitoring software, as well as pulling the GPU heatsink and fan apart and cleaning them.
HTH
MrM . |

Icome4u
Caldari Dark and Light inc. D-L
|
Posted - 2007.09.26 08:51:00 -
[6]
Edited by: Icome4u on 26/09/2007 08:51:26
Don't play WoW then :D
EVE does require quite a bit of CPU/GPU power (especially at start up) but not THAT much. The worst of all is WoW by far. ______
Originally by: Vyger If I lose connection while walking around a station will my avatar run off in a random direction and go hide in a corner? 
|

Zytrel
Contraband Inc. Mercenary Coalition
|
Posted - 2007.09.26 12:26:00 -
[7]
Get this: Zalman
Keeps my 7900 at a nice 48°C max under load and about 40°C when idle. I honestly don't know what they were thinking when they designed those crappy stock coolers. =)
regards, zytrel.
|
|

CCP Atropos

|
Posted - 2007.09.26 14:08:00 -
[8]
I don't know why your graphics card would be getting hotter, since at present EVE does not use the GPU.
|
|

Arcadia1701
Gallente The Scope
|
Posted - 2007.09.26 14:11:00 -
[9]
Edited by: Arcadia1701 on 26/09/2007 14:12:06
Eve really hardly even touches any GPU. It's a CPU and RAM hogger. When eve was made, there was no such thing as a GPU. This will change with rev 3 though hehe.
Post with your main, or don't post at all. |

MOS DEF
0utbreak
|
Posted - 2007.09.26 14:20:00 -
[10]
Originally by: CCP Atropos I don't know why your graphics card would be getting hotter, since at present EVE does not use the GPU.
It is true though. I have had 6 different graphics cards since I started playing eve online, and one thing they all had in common: while docked inside a station, the GPU reaches a temperature you otherwise only manage if you loop 3DMark for several hours. No normal game heats the GPU up like eve in stations.
I do know that eve does not use the GPU as it should, but it does something to heat them up. I wouldn't say it damages the GPU though; they are built to withstand high temperatures. The current GPU I use is an 8800 GTS, and with fan control at auto, eve heats it up to 82 degrees in stations (which is why I put the fan to 80% now, and it runs quite cool). FEAR, Bioshock or World in Conflict don't heat it up like this while being much, much more taxing on the GPU.
|

Batolemaeus
Caldari One man Carebearing
|
Posted - 2007.09.26 15:11:00 -
[11]
I second that. My card is getting hotter than normal; that's why I use eve to test overclocked settings for both GPU and CPU. It's like a giant burn-in test!
Something is definitely wrong with the Trinity engine.
Is there a frame limiter, btw? Didn't find anything of that sort anywhere.
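As far as I can tell there isn't one built in. The idea behind a frame limiter is simple, though: sleep off whatever is left of each frame's time budget so the card isn't redrawing a mostly static scene as fast as it possibly can. A rough sketch of the technique, in C++ with a hypothetical renderFrame() standing in for the actual drawing (this is not EVE's code):

#include <chrono>
#include <thread>

// Placeholder for whatever actually draws one frame.
void renderFrame() {}

int main() {
    using clock = std::chrono::steady_clock;
    const auto frameBudget = std::chrono::microseconds(1000000 / 60); // cap at ~60 fps

    for (;;) {
        const auto frameStart = clock::now();
        renderFrame();

        // Sleep off the rest of the frame budget so the GPU idles
        // instead of spinning at an uncapped frame rate.
        const auto elapsed = clock::now() - frameStart;
        if (elapsed < frameBudget)
            std::this_thread::sleep_for(frameBudget - elapsed);
    }
}

VSync achieves much the same thing by tying the present call to the monitor's refresh rate.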
|

Kagura Nikon
Minmatar MASS HOMICIDE Interstellar Alcohol Conglomerate
|
Posted - 2007.09.26 15:40:00 -
[12]
Modern GF cards can sustain temperatures of up to nearly 120 degrees on their cores. Anything under 90 degrees is not even close to scratching the lifespan of your VGA card. Video card chips are made quite differently from CPUs and have their heat spread more widely across the die.
Don't worry about that temperature.
If brute force doesn't solve your problem... you are not using enough |

Kagura Nikon
Minmatar MASS HOMICIDE Interstellar Alcohol Conglomerate
|
Posted - 2007.09.26 15:43:00 -
[13]
Originally by: CCP Atropos I don't know why your graphics card would be getting hotter, since at present EVE does not use the GPU.
That's quite a strange statement to come from a dev, since it's IMPOSSIBLE to output anything without using the GPU. The raster stage MUST be done completely on the GPU; all modern cards have no other option. Even commands that originally date from the DX7 age are implemented on the GPU these days, just wrapped by the drivers.
If brute force doesn't solve your problem... you are not using enough |
|

CCP Atropos

|
Posted - 2007.09.26 15:54:00 -
[14]
True; what I meant, though, was that the advanced features of the graphics card, i.e. the reason most people buy a particular card in the first place, aren't used by the EVE process.
This is one of the main upgrades coming with Trinity II; the ability to use the advanced rendering features of the GPU to offload the work from the CPU.
So, yes, it is being used, but not in the same way that many other games currently employ the resources of the GPU.
|
|

BIind Man
|
Posted - 2007.09.26 16:07:00 -
[15]
Maybe this is why, only when I have eve running and I'm sitting in a station, my comp will just go black screen and lock up.
|

Poison Flower
|
Posted - 2007.09.26 16:21:00 -
[16]
This is a feature, not a bug.
|

F2C MaDMaXX
The Arrow Project Morsus Mihi
|
Posted - 2007.09.26 18:58:00 -
[17]
Yeah, this is something I noticed a long time ago. It is true that eve doesn't employ the features of your gfx card to calculate the rendering; the cpu does that, and the gfx card is merely used for its ability to display to the screen.
I think it's just the way the calls are made: the gfx card has to take it all from the cpu, and it's a lot of input when usually it gets small instructions and does the calculations itself.
I have noticed a good 10-degree increase in eve over some very display-intensive FPS games.
______________________________________
Natural Selection Developer
Sound FX
|

Kagura Nikon
Minmatar MASS HOMICIDE Interstellar Alcohol Conglomerate
|
Posted - 2007.09.28 17:21:00 -
[18]
Originally by: F2C MaDMaXX Yeah, this is something I noticed a long time ago. It is true that eve doesn't employ the features of your gfx card to calculate the rendering; the cpu does that, and the gfx card is merely used for its ability to display to the screen.
I think it's just the way the calls are made: the gfx card has to take it all from the cpu, and it's a lot of input when usually it gets small instructions and does the calculations itself.
I have noticed a good 10-degree increase in eve over some very display-intensive FPS games.
Naa, what they meant is that they don't use any of the fancy programmable pipeline. But all the T&L and texturing is for sure done in hardware, because believe me, if it were 100% in software a Core 2 Duo would barely reach 10 FPS.
If brute force doesn't solve your problem... you are not using enough |

Bishman82
Racketeers
|
Posted - 2007.09.28 17:45:00 -
[19]
I've mentioned this before: my nvidia 8800GTS sounds very noisy while playing eve, mostly in station. Sometimes I can be over the other side of the room and I hear the graphics card fan revving up because it's under a lot of load. When I quit, the noise gradually slows down over about 15 seconds.
|

Valandril
Caldari Resurrection R i s e
|
Posted - 2007.09.28 17:59:00 -
[20]
I can confirm that eve is the best tool to test how hot your graphics card can get before crashing, especially when you're overclocking. All tests passed, many many games on maxed details passed, but in eve it reached critical temperature. ---
Battlecarriers ! |

Ceanthar Cerbera
Minmatar Lone Gunmen
|
Posted - 2007.09.28 17:59:00 -
[21]
I have noticed this as well. I use an X1650 XT graphics card and it does seem to heat up substantially in eve. I have no temp measurements to present, but the rig does freak out using certain drivers and not with others. ATI's drivers produce higher temps and artifacts, especially when I run two clients at the same time. Omega drivers solve this. Artifacts are a good indicator of overheating, in my experience.
I know eve doesn't make much use of the GPU, but does that rule out that the card still runs code? Maybe the absence of GPU code makes the cards run some other "things"? I'm no expert in how these things work and never thought much of it before reading this thread, but seeing now that others have had the same experience seems to indicate that there is something going on. With the new engine coming this might all be academic. ----------------------------------------- For the liberation and safety of the Matari people! |

Plekto
Priory Of The Lemon R0ADKILL
|
Posted - 2007.09.28 23:29:00 -
[22]
Eve really hardly even touches any GPU. It's a CPU and RAM hogger. When eve was made, there was no such thing as a GPU. This will change with rev 3 though hehe. ***
God, I'd love to offload most of this to my GPU.
Oh, the solution for video card cooling? A $10 slot fan. Install one two slots away from the card with the space between open. All that waste heat doesn't sit there in the bottom of the case anymore.
|

Jimbob McKracken
Caldari The Tidemark Interstellar Alcohol Conglomerate
|
Posted - 2007.09.29 00:29:00 -
[23]
Minimum System Requirements:
OS: Windows® 2000 SP2 / XP
CPU: Intel Pentium® III 800 MHz or AMD Athlon 800 MHz
RAM: 512 MB or more
HD space: 6.0 GB
Network: 56k modem or better Internet connection
Video: 32 MB 3D graphics card with Hardware Transform and Lighting, such as an NVIDIA® GeForce 2 class card or above
Drivers: DirectX® 9.0c (included) and latest video drivers
Eve does require a card with Hardware Transform and Lighting - so it is using the GPU for something. I believe hardware transform and lighting came in with DirectX 7, which was released back in 1999 along with the GeForce 256 cards.
My guess is that although modern cards are way more powerful, they don't like running code based on an 8-year-old API. It's probably stressing the GPU in a way that a modern GPU simply wasn't designed for. A comparison would be turning off 4 cylinders on your V8 engine and then thrashing the 4 you left running.
Most of the advancements in modern-day GPUs are not in clock speed but in programmability, and Eve is not yet taking advantage of any of that flexibility. But it is definitely using the GPU, and clearly, from the posts of others, in a way that isn't kind to graphics cards.
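To put that in API terms: "hardware transform and lighting" just means the Direct3D device is created with hardware vertex processing, and the old fixed-function states handle transform and lighting with no shaders involved. A rough sketch of that kind of setup, using ordinary D3D9 calls (illustrative only, not EVE's actual code):

#include <windows.h>
#include <d3d9.h>

// Create a device that does transform & lighting on the GPU via the
// legacy fixed-function pipeline (DX7-style), rather than with shaders.
IDirect3DDevice9* createFixedFunctionDevice(HWND hwnd)
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);

    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed         = TRUE;
    pp.SwapEffect       = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferFormat = D3DFMT_UNKNOWN;

    IDirect3DDevice9* device = NULL;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                      D3DCREATE_HARDWARE_VERTEXPROCESSING, // hardware T&L
                      &pp, &device);

    // With no vertex or pixel shader bound, these fixed-function states
    // make the card's legacy T&L path do the transform and lighting.
    device->SetRenderState(D3DRS_LIGHTING, TRUE);
    device->SetRenderState(D3DRS_ZENABLE, TRUE);
    return device;
}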
My 8800 GTS's fan definitely spins up to cope with Eve's demands. I just can't wait to test the new client.
P.S. to the devs - I'd love to test the new client as soon as you possibly can :D
|

prathe
Minmatar Omega Enterprises Mostly Harmless
|
Posted - 2007.09.29 07:02:00 -
[24]
Originally by: CCP Atropos I don't know why your graphics card would be getting hotter, since at present EVE does not use the GPU.
Damn, beat me to it... yeah, eve loves your CPU, not your GPU.
|

Lt Angus
Caldari the united
|
Posted - 2007.09.29 08:37:00 -
[25]
I've had many GPU crashes while running eve but no other game.
Shhhh, I'm hunting Badgers |

Kaar
Art of War Cult of War
|
Posted - 2007.09.29 13:27:00 -
[26]
I think it's more likely your graphics card has some sort of dynamic fan control enabled. As you say, eve is a very old game, so your card might not be detecting the fact that you are playing a game, and the fan is not turning on.
My theory anyway 
---
---
|

Tonto Auri
|
Posted - 2007.09.29 13:58:00 -
[27]
Originally by: Lt Angus I've had many GPU crashes while running eve but no other game.
Enable VSync for EVE. That will probably solve your problem. -- Thanks CCP for cu |

Lochmar Fiendhiem
Caldari International Multi-Player Consortium Interstellar Alcohol Conglomerate
|
Posted - 2007.09.29 15:02:00 -
[28]
I have had eve blow one of my BFG 6600 OC 128 MB cards in the past (lifetime warranty ftw). Also, now it seems that at times, if I click on certain things in space or on a chat window, or just rotate the screen, the game will freeze for a second and then the graphics go all screwy. (I have screenshots where all of the planets/moons are my portrait instead of the regular textures.)
Very odd indeed. Maybe there is a rogue process causing issues?
Originally by: Halkin bob is dead, goons are great, cheese is cheesy, there we go no need for any more threads
|

Flamewave
Scorn Again.
|
Posted - 2007.09.29 16:14:00 -
[29]
One thing few people realize is that a really high framerate will heat up your video card in a very bad way. Since Eve is an older game, most newer cards get extremely high framerates running Eve, which heats the card more than other work does. Since newer games are more GPU-intensive, they generally run at lower framerates than Eve does, and as many people have seen, they don't heat up the graphics card as much.
Quote: Any game I play spikes my GPU to around 68°C, all my games actually except one. EVE spikes me to 75°C and sometimes even more if I'm in a station.
Your framerate will be higher sitting in a station; this is why your card heats up more. I get about 90 to 120 fps in station with the HUD on, and my GPU heats up to the point where I can hear the fan trying to catch up.
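Rough numbers: at 120 fps the card draws a frame every 1000/120 ≈ 8.3 ms, so it is busy essentially all the time. Capped or vsynced at 60 fps, the same 8.3 ms of work per frame leaves roughly 8 ms of idle time each frame, about a 50% duty cycle, and an idle GPU is a cool GPU.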
This will highlight any issues your card may be having. My first week into the game my X800 XT burnt out and had to be replaced by Dell. Not too long ago I was getting overheating issues with Eve that I wasn't getting with other games, which eventually manifested in other games as well. I had to replace my X800 XT again. Thankfully I have the four-year warranty on this PC.
__________
|

Mataki Onimareu
Gallente Life Extermination
|
Posted - 2007.09.29 19:34:00 -
[30]
Edited by: Mataki Onimareu on 29/09/2007 19:42:02
Originally by: Flamewave One thing few people realize is that a really high framerate will heat up your video card in a very bad way. Since Eve is an older game, most newer cards get extremely high framerates running Eve, which heats the card more than other work does. Since newer games are more GPU-intensive, they generally run at lower framerates than Eve does, and as many people have seen, they don't heat up the graphics card as much.
Quote: Any game I play spikes my GPU to around 68°C, all my games actually except one. EVE spikes me to 75°C and sometimes even more if I'm in a station.
Your framerate will be higher sitting in a station; this is why your card heats up more. I get about 90 to 120 fps in station with the HUD on, and my GPU heats up to the point where I can hear the fan trying to catch up.
This will highlight any issues your card may be having. My first week into the game my X800 XT burnt out and had to be replaced by Dell. Not too long ago I was getting overheating issues with Eve that I wasn't getting with other games, which eventually manifested in other games as well. I had to replace my X800 XT again. Thankfully I have the four-year warranty on this PC.
Maybe the devs should allow a VSync option? I don't want to force it globally, since I don't run some of my games with VSync.
And if VSync isn't the problem, maybe the devs can look at what the station code is doing to cause this.
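For what it's worth, in D3D9 a per-game VSync toggle comes down to which presentation interval the client asks for when it creates or resets its device, so it wouldn't need to touch the global driver setting. A sketch of the relevant field, using standard D3D9 (illustrative only, not EVE's code):

#include <windows.h>
#include <d3d9.h>

// Fill in the presentation parameters with or without VSync.
// vsync = true ties Present() to the monitor refresh, capping the
// frame rate at the refresh rate; false lets it run uncapped.
void describeSwapChain(D3DPRESENT_PARAMETERS* pp, bool vsync)
{
    ZeroMemory(pp, sizeof(*pp));
    pp->Windowed             = TRUE;
    pp->SwapEffect           = D3DSWAPEFFECT_DISCARD;
    pp->BackBufferFormat     = D3DFMT_UNKNOWN;
    pp->PresentationInterval = vsync ? D3DPRESENT_INTERVAL_ONE
                                     : D3DPRESENT_INTERVAL_IMMEDIATE;
}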
|