
Mataki Onimareu
Gallente Life Extermination
|
Posted - 2007.09.26 05:27:00 -
[1]
Ok I know it sounds ridiculous, but hear me out.
Any game I play spikes my GPU to around 68°C; every game, actually, except one. EVE spikes me to 75°C and sometimes even more if I'm in a station.
So for such an old engine, I'm over here scratching my head as to why this game needs so much GPU.
|

Kobushi
|
Posted - 2007.09.26 05:53:00 -
[2]
It's even weirder since the client puts all the computation on your CPU and none on your video card. Something is amiss; could it be that your monitoring software isn't configured correctly, or that your GPU heatsink is fed by your CPU heatsink's exhaust?
|

Mataki Onimareu
Gallente Life Extermination
|
Posted - 2007.09.26 06:00:00 -
[3]
No idea. I have a 7900 GT and use the nVidia Monitor; my CPU is fine at 59°C, and from the HSF I have a tube that runs from the top of the fan directly to the outside of the case.
My point for the thread is that, hopefully, the new engine will be coded better.
|

eeeweeezeee
Macabre Votum INVICTUS.
|
Posted - 2007.09.26 06:26:00 -
[4]
I also noticed high graphics card temperatures.
My graphics card's fan melted a while back. I think it was caused by fan failure, because the monitoring program's log registered temperatures of 250+ °F (subtract 32, then divide by 1.8 to get °C). The fan actually melted and came apart. The oddest thing is that all of the card survived except whatever circuitry controls the fan. I certainly play a lot of EVE, but I attribute the failure to it being a cheap ABIT card with inferior components. The card is still running now with a new heatsink and fan I bought for a couple of dollars at a nearby computer store and connected straight to a 12 V connector. I still can't believe the rest of the card survived, though. Might have been EVE, but probably wasn't. I'm never buying an ABIT product again if I can avoid it, considering all the trouble I've had with their products.
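For reference, that arithmetic puts 250 °F at roughly 121 °C, far beyond any sensible operating temperature for a consumer GPU. A throwaway Python check of the conversion (nothing EVE-specific, just the formula quoted above):

def fahrenheit_to_celsius(f):
    # Celsius = (Fahrenheit - 32) / 1.8
    return (f - 32) / 1.8

print(round(fahrenheit_to_celsius(250), 1))  # prints 121.1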
|

Mrmuttley
Guns 'N' Hoses
|
Posted - 2007.09.26 06:48:00 -
[5]
I seriously doubt that EVE would put a heavier load on the GPU than other games, and a 7900 GT should be able to handle EVE with ease.
Those temps aren't horrible for a graphics card, so you don't need to worry about damaging it, but I would recommend reinstalling the graphics drivers, investigating that temperature monitoring software, and pulling the GPU heatsink and fan apart to clean them.
HTH
MrM . |

Icome4u
Caldari Dark and Light inc. D-L
|
Posted - 2007.09.26 08:51:00 -
[6]
Edited by: Icome4u on 26/09/2007 08:51:26 Don't play WoW then :D
EVE does require quite a bit of CPU/GPU power (especially at startup), but not THAT much. The worst of all is WoW, by far. ______
Originally by: Vyger If I lose connection while walking around a station will my avatar run off in a random direction and go hide in a corner? 
|

Zytrel
Contraband Inc. Mercenary Coalition
|
Posted - 2007.09.26 12:26:00 -
[7]
Get this: Zalman
Keeps my 7900er at a nice 48°C max under load and about 40°C when idle. I honestly don't know what they were thinking when they designed those crappy stock coolers. =)
regards, zytrel.
|

CCP Atropos

|
Posted - 2007.09.26 14:08:00 -
[8]
I don't know why your graphics card would be getting hotter, since at present EVE does not use the GPU.
|

Arcadia1701
Gallente The Scope
|
Posted - 2007.09.26 14:11:00 -
[9]
Edited by: Arcadia1701 on 26/09/2007 14:12:06 EVE really hardly even touches the GPU. It's a CPU and RAM hog. When EVE was made, there was no such thing as a GPU. This will change with rev 3 though hehe. My sig>
Post with your main, or don't post at all. |

MOS DEF
0utbreak
|
Posted - 2007.09.26 14:20:00 -
[10]
Originally by: CCP Atropos I don't know why your graphics card would be getting hotter, since at present EVE does not use the GPU.
It is true, though. I have had 6 different graphics cards since I started playing EVE Online. One thing they all had in common: while docked inside a station, the GPU reaches a temperature you otherwise only see if you loop 3DMark for several hours. No normal game heats the GPU up like EVE in stations.
I do know that EVE does not use the GPU as it should, but it does something to heat them up. I wouldn't say it damages the GPU though; they are built to withstand high temperatures. The current GPU I use is an 8800 GTS, and with fan control on auto, EVE heats it up to 82 degrees in stations (which is why I set the fan to 80%, and it runs quite cool now). FEAR, BioShock or World in Conflict don't heat it up like this while being much, much more taxing on the GPU.
|

Batolemaeus
Caldari One man Carebearing
|
Posted - 2007.09.26 15:11:00 -
[11]
I second that. My card is getting hotter than normal; that's why I use EVE to test overclocked settings for both GPU and CPU. It's like a giant burn-in test!
Something is definitely wrong with the Trinity engine.
Is there a frame limiter, btw? Didn't find anything of that sort anywhere...
|

Kagura Nikon
Minmatar MASS HOMICIDE Interstellar Alcohol Conglomerate
|
Posted - 2007.09.26 15:40:00 -
[12]
Modern GeForce cards can sustain temperatures of up to nearly 120 degrees on the core. Anything under 90 degrees isn't even close to scratching the lifespan of your video card. Video card chips are made quite differently from CPUs, and their heat is spread over a wider area than a CPU's.
Don't worry about that temperature.
If brute force doesn't solve your problem... you are not using enough |

Kagura Nikon
Minmatar MASS HOMICIDE Interstellar Alcohol Conglomerate
|
Posted - 2007.09.26 15:43:00 -
[13]
Originally by: CCP Atropos I don't know why your graphics card would be getting hotter, since at present EVE does not use the GPU.
That's quite a strange statement to come from a dev, since it's IMPOSSIBLE to output anything without using the GPU. The raster stage MUST be done completely on the GPU; all modern cards have no other option. Even commands that date back to the DX7 era are implemented on the GPU nowadays, just wrapped by the drivers.
If brute force doesn't solve your problem... you are not using enough |

CCP Atropos

|
Posted - 2007.09.26 15:54:00 -
[14]
True. What I meant, though, was that the advanced features of the graphics card, i.e. the reason most people buy a particular card in the first place, aren't used by the EVE process.
This is one of the main upgrades coming with Trinity II: the ability to use the advanced rendering features of the GPU to offload work from the CPU.
So, yes, it is being used, but not in the same way that many other games currently employ the resources of the GPU.
|

BIind Man
|
Posted - 2007.09.26 16:07:00 -
[15]
Maybe this is why my comp will just go to a black screen and lock up, but only when I have EVE running, sitting in a station.
|

Poison Flower
|
Posted - 2007.09.26 16:21:00 -
[16]
This is a feature, not a bug.
|

F2C MaDMaXX
The Arrow Project Morsus Mihi
|
Posted - 2007.09.26 18:58:00 -
[17]
Yeah, this is something I noticed a long time ago. It is true that EVE doesn't employ the features of your gfx card to calculate the rendering; the CPU does that, and the gfx card is merely used for its ability to display to the screen.
I think it's just the way the calls are made: the gfx card has to take it all from the CPU, and that's a lot of input, when usually it gets small instructions and does the calculations itself.
I have noticed a good 10-degree increase in EVE over some very display-intensive FPS games. ______________________________________ Natural Selection Developer
Sound FX
|

Kagura Nikon
Minmatar MASS HOMICIDE Interstellar Alcohol Conglomerate
|
Posted - 2007.09.28 17:21:00 -
[18]
Originally by: F2C MaDMaXX Yeah, this is something I noticed a long time ago. It is true that EVE doesn't employ the features of your gfx card to calculate the rendering; the CPU does that, and the gfx card is merely used for its ability to display to the screen.
I think it's just the way the calls are made: the gfx card has to take it all from the CPU, and that's a lot of input, when usually it gets small instructions and does the calculations itself.
I have noticed a good 10-degree increase in EVE over some very display-intensive FPS games.
Naa, what they meant is that they don't use any of the fancy programmable pipeline. But all the T&L and texturing is for sure done in hardware, because believe me, if it were 100% in software a Core 2 Duo would barely reach 10 FPS.
If brute force doesn't solve your problem... you are not using enough |

Bishman82
Racketeers
|
Posted - 2007.09.28 17:45:00 -
[19]
I've mentioned this before: my NVIDIA 8800 GTS sounds very noisy while playing EVE, mostly in station, and sometimes I can be on the other side of the room and hear the graphics card fan revving up because it's under a lot of load. When I quit, the noise gradually slows down over about 15 seconds.
|

Valandril
Caldari Resurrection R i s e
|
Posted - 2007.09.28 17:59:00 -
[20]
I can confirm that EVE is the best tool to test how hot your graphics card can get before crashing, especially when you're overclocking. All tests passed, many, many games on maxed details passed, but in EVE it reached critical temperature. ---
Battlecarriers ! |

Ceanthar Cerbera
Minmatar Lone Gunmen
|
Posted - 2007.09.28 17:59:00 -
[21]
I have noticed this as well. I use an X1650 XT graphics card and it does seem to heat up substantially in EVE. I have no temp measurements to present, but the rig does freak out with certain drivers and not with others. ATI's drivers produce higher temps and artifacts, especially when I run two clients at the same time. Omega drivers solve this. Artifacts are a good indicator of overheating in my experience.
I know EVE doesn't use the GPU, but does that rule out that the card still runs code? Maybe the absence of GPU code makes the cards run some other "things"? I'm no expert in how these things work and never thought much of it before reading this thread. Seeing now that others have had the same experience seems to indicate that there is something going on. But with the new engine coming, this might all be academic... ----------------------------------------- For the liberation and safety of the Matari people! |

Plekto
Priory Of The Lemon R0ADKILL
|
Posted - 2007.09.28 23:29:00 -
[22]
EVE really hardly even touches the GPU. It's a CPU and RAM hog. When EVE was made, there was no such thing as a GPU. This will change with rev 3 though hehe. ***
God, I'd love to offload most of this to my GPU.
Oh, the solution for video card cooling? A $10 slot fan. Install one two slots away from the card with the space between left open. All that waste heat doesn't sit in the bottom of the case anymore.
|

Jimbob McKracken
Caldari The Tidemark Interstellar Alcohol Conglomerate
|
Posted - 2007.09.29 00:29:00 -
[23]
Minimum System Requirements:
OS: Windows® 2000 SP2 / XP
CPU: Intel Pentium® III 800 MHz or AMD Athlon 800 MHz
RAM: 512 MB or more
HD space: 6.0 GB
Network: 56k modem or better Internet connection
Video: 32 MB 3D graphics card with Hardware Transform and Lighting, such as an NVIDIA® GeForce 2 class card or above
Drivers: DirectX® 9.0c (included) and latest video drivers
EVE does require a card with hardware transform and lighting, so it is using the GPU for something. I believe hardware T&L came in with DirectX 7, which was released back in 1999 along with the GeForce 256 cards.
My guess is that although modern cards are way more powerful, they don't like running code based on an 8-year-old API. It's probably stressing the GPU in a way a modern GPU simply isn't designed to work. A comparison would be turning off 4 cylinders on your V8 engine and then thrashing the 4 you left running.
Most of the advancements in modern-day GPUs are not in clock speed but in programmability, and EVE is not yet taking advantage of any of that flexibility. But it is definitely using the GPU, and clearly, from the posts of others, in a way that isn't kind to graphics cards.
My 8800 GTS's fan definitely spins up to cope with EVE's demands. I just can't wait to test the new client.
P.S. to the devs: I'd love to test the new client as soon as you possibly can :D
|

prathe
Minmatar Omega Enterprises Mostly Harmless
|
Posted - 2007.09.29 07:02:00 -
[24]
Originally by: CCP Atropos I don't know why your graphics card would be getting hotter, since at present EVE does not use the GPU.
Damn, beat me to it... yeah, EVE loves your CPU, not your GPU.
|

Lt Angus
Caldari the united
|
Posted - 2007.09.29 08:37:00 -
[25]
I've had many GPU crashes while running EVE, but in no other game.
Shhhh, Im hunting Badgers |

Kaar
Art of War Cult of War
|
Posted - 2007.09.29 13:27:00 -
[26]
I think it's more likely your graphics card has some sort of dynamic fan control enabled. As you say, EVE is a very old game, and your card might not be detecting the fact that you are playing a game, so the fan is not spinning up.
My theory anyway 
---
---
|

Tonto Auri
|
Posted - 2007.09.29 13:58:00 -
[27]
Originally by: Lt Angus I've had many GPU crashes while running EVE, but in no other game.
Enable VSync for EVE. That will probably solve your problem. -- Thanks CCP for cu<end of sig> |

Lochmar Fiendhiem
Caldari International Multi-Player Consortium Interstellar Alcohol Conglomerate
|
Posted - 2007.09.29 15:02:00 -
[28]
I have had EVE blow one of my BFG 6600 OC 128 MB cards in the past (lifetime warranty ftw). Also, it now seems that at times, if I click on certain things in space or on a chat window, or just rotate the screen, the game will freeze for a second and then the graphics will go all screwy. (I have screenshots where all of the planets/moons are my portrait instead of the regular textures.)
Very odd indeed. Maybe there is a rogue process causing issues?
Originally by: Halkin bob is dead, goons are great, cheese is cheesy, there we go no need for any more threads
|

Flamewave
Scorn Again.
|
Posted - 2007.09.29 16:14:00 -
[29]
One thing few people realize is that a really high framerate will heat up your video card in a very bad way. Since EVE is an older game, most newer cards get extremely high framerates running EVE, which heats the card more than other work would. Since newer games are more GPU-intensive, they generally run at lower framerates than EVE does, and as many people have seen, they don't heat up the graphics card as much.
Quote: Any game I play spikes my GPU to around 68°C; every game, actually, except one. EVE spikes me to 75°C and sometimes even more if I'm in a station.
Your framerate will be higher sitting in a station; this is why your card heats up more. I get about 90 to 120 fps in station with the HUD on, and my GPU heats up to the point where I can hear the fan trying to catch up.
This will highlight any issues your card may be having. My first week into the game, my X800 XT burnt out and had to be replaced by Dell. Not too long ago I was getting overheating issues with EVE that I wasn't getting with other games, which eventually manifested in other games as well. I had to replace my X800 XT again. Thankfully I have the four-year warranty on this PC. __________
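The framerate point above is essentially an argument for capping frames: in a light scene such as a station interior, an uncapped render loop redraws as fast as the card allows, so the GPU sits at full load for no visual benefit. A minimal sketch of a frame limiter in Python, with render_frame() as a hypothetical stand-in for whatever a client draws each frame:

import time

TARGET_FPS = 60
FRAME_TIME = 1.0 / TARGET_FPS  # seconds budgeted per frame

def capped_loop(render_frame, frames=600):
    # Draw a fixed number of frames, sleeping away whatever is left of each frame's budget.
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()  # uncapped, these calls would run back to back
        spare = FRAME_TIME - (time.perf_counter() - start)
        if spare > 0:
            time.sleep(spare)  # the card idles here instead of rendering extra frames

VSync achieves much the same effect by tying presentation to the monitor's refresh rate, which is why the "interval one" tip later in the thread brings temperatures down.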
|

Mataki Onimareu
Gallente Life Extermination
|
Posted - 2007.09.29 19:34:00 -
[30]
Edited by: Mataki Onimareu on 29/09/2007 19:42:02
Originally by: Flamewave One thing few people realize is that a really high framerate will heat up your video card in a very bad way. Since EVE is an older game, most newer cards get extremely high framerates running EVE, which heats the card more than other work would. Since newer games are more GPU-intensive, they generally run at lower framerates than EVE does, and as many people have seen, they don't heat up the graphics card as much.
Quote: Any game I play spikes my GPU to around 68°C; every game, actually, except one. EVE spikes me to 75°C and sometimes even more if I'm in a station.
Your framerate will be higher sitting in a station; this is why your card heats up more. I get about 90 to 120 fps in station with the HUD on, and my GPU heats up to the point where I can hear the fan trying to catch up.
This will highlight any issues your card may be having. My first week into the game, my X800 XT burnt out and had to be replaced by Dell. Not too long ago I was getting overheating issues with EVE that I wasn't getting with other games, which eventually manifested in other games as well. I had to replace my X800 XT again. Thankfully I have the four-year warranty on this PC.
Maybe the devs should allow a VSync option? I don't want to turn on a global setting, since some of my games I don't run with VSync.
And maybe the devs can look at what code in stations is doing this, if VSync isn't the problem.
|

Kaar
Art of War Cult of War
|
Posted - 2007.09.29 19:57:00 -
[31]
Originally by: Mataki Onimareu
Maybe the devs should allow a VSync option? I don't want to turn on a global setting, since some of my games I don't run with VSync.
Add this line to your pref.ini
advancedDevice=1
Then set "interval one" on the graphics tab ingame.
---
---
|

Mataki Onimareu
Gallente Life Extermination
|
Posted - 2007.10.03 00:17:00 -
[32]
Originally by: Kaar
Originally by: Mataki Onimareu
Maybe the devs should allow a VSync option? I don't want to turn on a global setting, since some of my games I don't run with VSync.
Add this line to your pref.ini
advancedDevice=1
Then set "interval one" on the graphics tab ingame.
HAH! I think this worked, thanks man; only getting to about 64°C now.
|

Xeliya
Eternity INC. Mercenary Coalition
|
Posted - 2007.10.03 06:40:00 -
[33]
When docked, EVE will use 100% of your GPU. Either play in windowed mode or change your settings to wait for VSync, and it should solve your overheating issue.
|

Kvarium Ki
Legion Du Lys GoonSwarm
|
Posted - 2007.10.03 06:43:00 -
[34]
What about when you do not load the station environment? Does it still use so much GPU? I've had my station environments off for a long time and I haven't noticed my card getting hot, but then again I don't really check.
|

Eleana Tomelac
Gallente
|
Posted - 2007.10.03 09:10:00 -
[35]
What about fixing the overheating issue with better heat dissipation?
Things to check:
- Dust in fans and heatsinks (use a paintbrush or a vacuum cleaner; you may put something, like a finger, in the fan to keep it from spinning backwards too fast while using the vacuum cleaner).
- Dust in the case air intakes and exhausts (clean the filters if you have some).
- Better case cooling (increase airflow; keep IDE flat cables out of the airflow).
- Better GPU cooling: changing the cooling equipment voids your graphics card's warranty, and you need to be sure of what you're doing when you fit a new one. Arctic Cooling has good solutions (silent and efficient; always kept my 7800 GT under 55°C with an Accelero X1).
-- Pocket drone carriers (tm) enthusiast! The Vexor Navy Issue is much more fun than the Myrmidon! |

ZelRox
Reikoku Band of Brothers
|
Posted - 2007.10.03 13:36:00 -
[36]
Intel Core 2 Duo E6850, normal idle temp 20-22°C. XFX 8800 GTX, normal idle temp 64-66°C. EVE ain't using much of the GPU.
EVE: CPU 40°C, GPU 66°C
World in Conflict: CPU 38°C, GPU 78-80°C
Team Fortress 2: CPU 35°C, GPU 75°C ----------------------
BiH 4tw |

Wim'sei
Gallente GoonFleet GoonSwarm
|
Posted - 2007.10.04 10:19:00 -
[37]
For what it's worth, if your graphics card were suffering from heat-related issues, you would probably encounter erratic video behavior, such as graphics corruption or video driver failures. You didn't post anything along those lines, so that was unlikely to be the issue.
It seems your issue has been resolved, but I'm curious: are you using Windows Vista? If so, do you have Aero enabled? The new Desktop Window Manager in Vista directs many window drawing functions to Direct3D back buffers, which are then composited to form the GUI. The end result is that every window, having been drawn separately, can have special effects applied. Additionally, certain functions, such as the taskbar window preview feature and Flip 3D, become possible. On a related note, minimizing windows effectively disables these effects, since Windows has nothing to pull the previews from.
Naturally, applications which generate lots of graphically intensive updates would take a mighty performance hit, and your video card would need to work harder. I would guess that an application using nothing but built-in Windows widgets wouldn't experience as many problems. Gamers aren't exactly compatible with that design paradigm.
On the other hand, if you're not running Vista, then I dunno.
|

big5824
|
Posted - 2007.10.04 11:53:00 -
[38]
A similar thing happens to me. After maybe an hour or more of playing, I start to get artifacting (normally white lines appearing through text, and big white lines in the middle of space), and this continues onto my desktop and other apps. However, if I then quit EVE and wait a few minutes, the artifacts go away.
FYI, I'm using a 7900 GT with a 4200+ X2.
|

Kagura Nikon
Minmatar MASS HOMICIDE Interstellar Alcohol Conglomerate
|
Posted - 2007.10.04 13:46:00 -
[39]
Originally by: big5824 A similar thing happens to me. After maybe an hour or more of playing, I start to get artifacting (normally white lines appearing through text, and big white lines in the middle of space), and this continues onto my desktop and other apps. However, if I then quit EVE and wait a few minutes, the artifacts go away.
FYI, I'm using a 7900 GT with a 4200+ X2.
If you have that, then your card is overheating. Check the air circulation. I have pretty much the same setup and my card temperature floats around 67 degrees (cards are OK while under 90).
If brute force doesn't solve your problem... you are not using enough |

Izo Azlion
Veto. Veto Corp
|
Posted - 2007.10.04 14:23:00 -
[40]
Edited by: Izo Azlion on 04/10/2007 14:23:39
Originally by: Zytrel Get this: Zalman
Keeps my 7900er at a nice 48°C max under load and about 40°C when idle. I honestly don't know what they were thinking when they designed those crappy stock coolers. =)
regards, zytrel.
Edit: Nice fan tbh, but water >> :D
Zalman passive water cooling ftw. Keeps my 8800 GTX at 55 degrees under heavy load, rather than the 69-75 it was hitting before.
Izo Azlion.
---
|

ZelRox
Reikoku Band of Brothers
|
Posted - 2007.10.04 18:38:00 -
[41]
Water is great until u get a leak ;) ----------------------
BiH 4tw |

DeODokktor
Dark Templars The Fonz Presidium
|
Posted - 2007.10.07 22:51:00 -
[42]
Originally by: CCP Atropos I don't know why your graphics card would be getting hotter, since at present EVE does not use the GPU.
Makes me wonder why a Core 2 Duo box with integrated graphics can't run EVE as well as an Athlon 1700+ with a pimp video card ;)...
|

Krysta Gemme
The Really Awesome Players
|
Posted - 2007.10.08 02:19:00 -
[43]
It's pretty silly that it mainly uses the CPU. To say there was no such thing as a GPU back in '03 is a bit, er, wrong? I swear the GeForce FX 5900 was all the rage then.
It makes no sense that my 8800 GTX gets slowdown in a game like EVE when there are many ships on screen. Seriously, there is nothing there! No terrain, no hills, no skymap... where are all the effects that are slowing down the system?
On top of that, whenever my camera faces a part of the system with a lot of belts, the framerate also tends to drop. Why? Items are 14-20 AU away and it still chugs. This makes me think the game is drawing things even at ridiculous distances; even if it's just a dot, it still draws it. Seriously, add a way to turn down the draw distance. Even EQ let you do that.
Most fleet battles I have ever seen have people looking at them as cubes: a bunch of square icons on screen. I don't know if CCP is disappointed that their game has run like this since its inception, but if they aren't, really, why bother with the sick-looking trailers when you know it isn't like that, due to the way the client works? If you could set a draw distance, then ships at a distance would still be icons, and framerates would increase dramatically.
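What is being asked for here amounts to a distance-based level-of-detail cutoff: past some range, draw a cheap marker instead of the full model. A rough sketch of the idea in Python (hypothetical names and numbers, not EVE's actual renderer):

import math
from dataclasses import dataclass

AU_KM = 149_597_870.7  # kilometres in one astronomical unit

@dataclass
class SceneObject:
    name: str
    position_km: tuple  # (x, y, z) in kilometres

def detail_level(obj, camera_km, cutoff_au=1.0):
    # Full geometry up close, a flat icon beyond the cutoff distance.
    if math.dist(obj.position_km, camera_km) <= cutoff_au * AU_KM:
        return "model"  # meshes, textures, effects
    return "icon"       # a single sprite; near-zero draw cost

# A belt 14 AU from the camera would be drawn as an icon only.
belt = SceneObject("Belt VII-3", (14 * AU_KM, 0.0, 0.0))
print(detail_level(belt, (0.0, 0.0, 0.0)))  # prints 'icon'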
The 8800 series always runs hot as hell, so there's no way for me to tell if it's EVE making it run that way, but 1-5 fps with 20 ships on screen on the latest video card of '07? Come on. ----------------------
|

Malachon Draco
eXceed Inc. INVICTUS.
|
Posted - 2007.10.08 09:22:00 -
[44]
EVE makes my graphics card overheat as well. I've blown out 2 cards already over the past 2 years where the fan gave out. ------------------------------------------------
New idea for sovereignty: Sovereignty revisited |

Azuse
The Brotherhood Of The Blade Pure.
|
Posted - 2007.10.08 10:29:00 -
[45]
It really is just out-of-date code, which is why CCP are making the Trinity II engine, but it still sucks. What sucks more is that the only way to stop it is using VSync: fine in a station, but try undocking and shooting something. Best yet is turning on 3xAA: 2.5 fps.
/me waves Trinity II flag. -------------------------------------------
|