Pages: [1] :: one page |
|
Vogue
Short Bus Pole Dancers
|
Posted - 2011.06.28 15:41:00 -
[1]
Edited by: Vogue on 28/06/2011 15:41:53 As EVE is a 3D game with many types of ships, each with its own models and textures to render, I suppose it puts a greater load on a graphics card than, say, an FPS, which has far fewer models and textures to render and hold in memory.
Is there a benefit to running EVE on a graphics card with 2GB of memory over one with 1GB? There are Nvidia 560s with 2GB, but I have read the 'frame buffer' is at 1GB, which is an issue if you want to run a game at a resolution greater than 1920x1200. But I do not; my main monitor runs at 1920x1200. Some ATI 6xxx cards, however, have 2GB of memory and a 2GB frame buffer. And resolution aside, does a frame buffer larger than 1GB have possible benefits for EVE?
For added value you can have a lollipop and this interesting discussion on the differences in developing a game across PC and console platforms: The Problem With Porting Games
.................................................. Fortress Of Solitude |
Akita T
Caldari Navy Volunteer Task Force
|
Posted - 2011.06.28 16:32:00 -
[2]
I suppose you might see some benefit at 1920x1200 with max AA in a crowded fleet battle, but other than that I don't think you'd get much of an advantage over a 1GB card. This is especially true (at least for EVE and NVIDIA cards) if you disable triple buffering, which I would personally recommend (it's enabled by default), along with forcing vsync on (it's off by default even in the drivers, and EVE overrides it with "interval immediate"; switch that to "interval one") and reducing max pre-rendered frames to 0 (from the default of 3). You might experience brief micro-jerkiness once in a blue moon, but the vast majority of the time it should feel more responsive (and quite likely spare both your CPU and GPU some unnecessary computing cycles). If you feel the "microjerks" often, try raising max pre-rendered frames to 1 first. _
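To see why the pre-rendered frames setting matters for responsiveness, here is a rough back-of-the-envelope model (my own illustration, not anything from the drivers): every frame the driver is allowed to queue ahead adds roughly one frame-time between your input and what you see on screen.

```python
# Rough input-latency model for the driver's render-ahead queue.
# This is an illustrative simplification, not a driver-accurate figure.

def queue_latency_ms(fps, max_prerendered_frames):
    """Approximate added latency: the frame being drawn plus every queued frame."""
    frame_time_ms = 1000.0 / fps
    return frame_time_ms * (1 + max_prerendered_frames)

# At 60 fps the default queue of 3 frames can add ~67 ms of lag;
# with the queue at 0 only the ~17 ms of the current frame remains.
print(queue_latency_ms(60, 3))  # ~66.7
print(queue_latency_ms(60, 0))  # ~16.7
```

The "microjerks" come from the same trade-off: with no queued frames there is no buffer to smooth over an occasional slow frame.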
Make ISK||Build||React||1k papercuts
|
Reiisha
Splint Eye Probabilities Inc.
|
Posted - 2011.06.28 16:38:00 -
[3]
Resolution has far more impact on video memory than any game on its own.
1GB is enough if you stick to something like 1920x1200 at most, as Akita mentioned; if you want to go higher (or run multiple screens), then more memory would be a good idea.
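A quick back-of-the-envelope calculation (my own sketch, assuming 32-bit colour and triple buffering) shows the raw framebuffers are only a small slice of video memory; it's the larger textures and AA sample buffers that high resolutions invite which do most of the damage:

```python
# Colour-buffer cost of a given resolution, assuming 4 bytes per pixel
# and three buffers (front + back + one extra). Textures, geometry and
# AA samples come on top of this and usually dominate VRAM use.

def framebuffer_mib(width, height, bytes_per_pixel=4, buffers=3):
    """Size of the colour buffers alone, in MiB."""
    return width * height * bytes_per_pixel * buffers / (1024 ** 2)

print(round(framebuffer_mib(1920, 1200), 1))  # ~26.4 MiB
print(round(framebuffer_mib(2560, 1600), 1))  # ~46.9 MiB
```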
"If you do things right, people won't be sure you've done anything at all"
|
Blacksquirrel
|
Posted - 2011.06.28 18:15:00 -
[4]
If it's more expensive... and all you play is EVE, you're probably throwing money into the wind.
However, for the same price (or not much of a difference), or if you play other games, it will be worth it.
EVE really isn't all that GPU-intensive, though. I run two clients with max settings (one monitor) with nary a hiccup. If you check a system monitor, it eats more RAM than anything.
But like I said, if EVE isn't your only game...
|
Vogue
Short Bus Pole Dancers
|
Posted - 2011.06.28 18:24:00 -
[5]
Edited by: Vogue on 28/06/2011 18:25:23 I and others with older graphics cards have an issue where, with EVE running, the PC's GUI totally freezes, though iTunes still plays. It happens when I run three EVE clients on lowest settings for mining. Maybe this is to do with the new Carbon API.
I have created a custom Nvidia profile for EVE. Not sure yet if this has made a difference, though when I did the same for Battlefield: Bad Company 2, along with editing its ini file, I got a big performance increase.
EVE's graphics demands can be very variable. Small fleet PVP is fine, but when I have been outside a station in 0.0 with 250 ships doing nothing (so I assume the EVE server node is not taxed much), graphics performance is around 10-15 fps.
I was inclined towards an Nvidia 560 as this year's replacement for my 8800 GTS 512. But I have found out the ATI 6950 can be BIOS-patched into a 6970, as 6950s use an identical GPU core with a different BIOS that disables some of the shader pipelines.
.................................................. Fortress Of Solitude |
Blacksquirrel
|
Posted - 2011.06.28 21:13:00 -
[6]
You have to be careful when flashing the BIOS on a graphics card, though. But yes, that is a cheap way to boost performance without spending the extra 100 or so for the "other" card.
I dunno, my 5850 has never had a problem with EVE, minus that weird asteroid display issue they patched a few months ago.
|
Vogue
Short Bus Pole Dancers
|
Posted - 2011.06.28 21:59:00 -
[7]
Edited by: Vogue on 28/06/2011 21:59:51 I had to flash the BIOS on my 8800 GTS 512 as it had a runaway temperature issue. The ATI 6950 cards have a unique switch that lets you flip between a live BIOS and a backup one if the live one is corrupted. Extra handy for doing the 6970 BIOS upgrade that unlocks the GPU shaders.
But I did once flash a motherboard BIOS with the wrong version. I was too stubbornly ignorant. I sent the removable BIOS chip to someone who flashed it with the right BIOS for £5.
.................................................. Fortress Of Solitude |
Vogue
Short Bus Pole Dancers
|
Posted - 2011.07.04 13:59:00 -
[8]
Edited by: Vogue on 04/07/2011 14:01:16 I got an ASUS 6950 2GB DirectCU graphics card today. The main reason I got this model is that it follows my 'if it's big it's clever' rule: it has a very big heatsink/fan assembly, and the fans are very quiet during gaming. It has a small switch at the top which, on cards that copy the ATI 6950 reference design, lets the user switch to a backup BIOS. But there is some confusion on the internet over whether, on the ASUS DirectCU variant (which uses a custom PCB design and components), the switch is for the BIOS backup or for switching between the four DisplayPort connectors on the face plate. So I am unsure if I can flash a 6970 BIOS to it. And the word on the internet is that the latest Cayman GPU cores made for the 6950 have the extra 6970 shader cores disabled at manufacture.
As I suspected, it did not fit into my no-frills regular desktop PC case. So it was happy hour with the tin snips: I cut away half of the 3.5" drive cage and the top of the hard drive cage below it.
Graphics performance is far beyond the four-year-old Nvidia 8800 GTS 512 it replaced. I can run most games at 1920x1200 at maximum detail. EVE in Jita outside the main station is now very fluid, though the Captain's Quarters at max detail is 48 fps. If CCP expand further, with something like a multiplayer in-station bar, it will be a right ball-breaker for graphics cards. The most impressive feature of the DirectX 11 card was seeing tessellation in a benchmark: extruded, bumpy 3D detail.
Happy geek day for me with RL at level 1.
.................................................. Fortress Of Solitude |
Cpt Placeholder
|
Posted - 2011.07.04 14:33:00 -
[9]
I was considering getting that one as well, but I don't really like the idea of losing another slot, despite being certain that I will not need it.
If you do dare to flash it, post the results :P
|
Blacksquirrel
|
Posted - 2011.07.04 14:59:00 -
[10]
Lol, I had much the same setup before: an 8800 GT OC, and I bought the MSI 5850 enhanced edition (it only takes up 1.5 slots). But yeah, same thing: I had to use a mini roto saw to get that bad boy in, because they stuck the power inlet facing outboard instead of down or up...
But yeah, performance was so much better. There are some games, however, that don't play nice with AMD drivers.
Also, pick up MSI Afterburner and Kombustor if it didn't come with similar software.
|
|
Vogue
Short Bus Pole Dancers
|
Posted - 2011.07.05 11:27:00 -
[11]
I scoured the forums for more information specifically about flashing an ASUS DirectCU II 6950 to a 6970 with an ASUS DirectCU II 6970 BIOS. I found a rar archive of files with a batch file to do the process; it uses ATIWinflash. The first time, the graphics card accepted the ASUS DirectCU II 6970 BIOS, but after a reboot the shader count was still the same as a 6950's. I tried a second time, which made Windows 7 hang, though the PC could still reboot. I tried a third time; again Windows 7 froze, but after that the PC would not reboot. The graphics card was bricked. So I set the graphics card's BIOS switch to position '2', well, the position nearest the faceplate, which was not the default when I got the card. The PC reboots fine and the graphics card works. So it would seem I have an ASUS 6950 DirectCU II with the newer Cayman GPU that has the extra 6970 shaders disabled at manufacture.
Though I can still overclock my card to achieve the same performance levels as a 6970.
I played Metro 2033 with high DirectX 11 graphics settings. It looks much better than in DirectX 9.
.................................................. Fortress Of Solitude |
Cpt Placeholder
|
Posted - 2011.07.05 11:54:00 -
[12]
Hm, either that, or the files you used were kinda fishy. At any rate, I'd fix the first BIOS before experimenting further.
|
Vogue
Short Bus Pole Dancers
|
Posted - 2011.07.06 13:15:00 -
[13]
Edited by: Vogue on 06/07/2011 14:26:01 One web review says that the ASUS 6950 DirectCU II has a different voltage regulator control chip than the 6970 variant, which, if true, would make the 6970 DirectCU II BIOS inoperable on the 6950 DirectCU II. Many forum threads have also mentioned that there are inauthentic 6970 DirectCU II BIOSes available on the internet.
I have restored the original BIOS (which I saved when I first got the card) to the primary BIOS.
It may be possible to edit the original BIOS with Radeon Bios Editor to enable the extra shaders.
/EDIT: Looks like I have an ASUS 6950 DirectCU II that will not unlock the shaders at all, like other folks on the internet have found. I tried enabling the extra shaders with Radeon Bios Editor: did not work. I tried again using 'Wizzard's PHP script': did not work.
ATIWinflash will crash Windows if you try to load it alongside SmartDoctor, and maybe other overclocking utilities too.
As other people have mentioned, the ASUS SmartDoctor utility is dangerous. It will randomly set the GPU voltage anywhere from 1.1 to 1.5V. Not using this again.
Anyway, I can still overclock it to attain 6970 performance levels.
.................................................. Fortress Of Solitude |
Cpt Placeholder
|
Posted - 2011.07.06 14:59:00 -
[14]
Guess it's factory-locked then. I'll get a dual-fan version too anyway. The reference cards have a crappy fan; it's not worth the extra FPS.
|
Vogue
Short Bus Pole Dancers
|
Posted - 2011.07.07 17:40:00 -
[15]
Edited by: Vogue on 07/07/2011 17:42:04 I found that the version of ASUS SmartDoctor that came on the CD with the graphics card was faulty. It can randomly set the GPU voltage to 1.5V on Windows boot-up, which could fry the card. So I got the latest 5.75 version, which is stable. Going by web reviews, 925MHz GPU, 5400MHz memory and 1.25V is a stable overclock, which I am using now.
Don't use AMD Overdrive in the Catalyst control panel, as it is designed for ATI reference cards.
I have a standard-sized desktop PC case, and it now seems the overall heat generated by the graphics card exceeds the case's ability to expel it. I have three low-rpm fans sucking air out of the case, but for now the PC is running with the main lid off.
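For what that memory overclock means in raw numbers, here is a quick sketch (my own arithmetic, assuming the 6950's 256-bit memory bus and reading the MHz figure as the effective GDDR5 data rate):

```python
# Peak memory bandwidth from an effective GDDR5 data rate and bus width.
# 256-bit is the 6950's bus; stock memory runs at 5.0 Gbps effective.

def memory_bandwidth_gbs(effective_mhz, bus_width_bits=256):
    """Peak bandwidth in GB/s = effective data rate x bus width in bytes."""
    return effective_mhz * 1e6 * (bus_width_bits / 8) / 1e9

print(memory_bandwidth_gbs(5000))  # 160.0 GB/s at stock
print(memory_bandwidth_gbs(5400))  # 172.8 GB/s at the quoted overclock
```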
.................................................. Fortress Of Solitude |
Vogue
Short Bus Pole Dancers
|
Posted - 2011.07.08 20:42:00 -
[16]
The latest version of SmartDoctor also has the random GPU voltage bug. I'm running a 950MHz overclock now. I did start to use the FurMark stress benchmark, but it made the graphics card freeze, and from what some people have said on the web it is too aggressive on graphics cards.
I get 6970 performance levels in 3DMark 11, though there is a stutter with this benchmark. I didn't reinstall Windows 7 64-bit after I installed the 6950. Instead I removed the 8800 GTS 512, plugged in a PCI graphics card and ran Nvidia File Remover, then removed the PCI card and installed the 6950. |
Barakkus
|
Posted - 2011.07.08 21:35:00 -
[17]
Originally by: Vogue The latest version of SmartDoctor also has the random GPU voltage bug. I'm running a 950MHz overclock now. I did start to use the FurMark stress benchmark, but it made the graphics card freeze, and from what some people have said on the web it is too aggressive on graphics cards.
I get 6970 performance levels in 3DMark 11, though there is a stutter with this benchmark. I didn't reinstall Windows 7 64-bit after I installed the 6950. Instead I removed the 8800 GTS 512, plugged in a PCI graphics card and ran Nvidia File Remover, then removed the PCI card and installed the 6950.
Sounds cheesy, but this: http://www.drivercleaner.net/ really works well for cleaning up after driver installs. |
Vogue
Short Bus Pole Dancers
|
Posted - 2011.07.09 16:48:00 -
[18]
ASUS SmartDoctor was the cause of the stuttering in 3DMark 11 and also some other games. So on Windows boot I let SmartDoctor run, verify the overclock, then close the utility and click 'Yes' to retain the OC settings. ASUS, who otherwise have a good brand reputation, need to fix SmartDoctor, as it is the only tool that can alter the GPU voltage on a DirectCU graphics card. |
Vogue
Short Bus Pole Dancers
|
Posted - 2011.07.13 05:18:00 -
[19]
A new problem: playing Supreme Commander: Forged Alliance from the SSD. It stalls when I queue orders for engineers and queue template build items. It is fine now that I launch the game from a mechanical hard disk.
Last night I had a new problem where Blu-ray players were complaining about HDCP not working, something to do with the HD monitor, graphics card and Blu-ray player co-operating with each other. I fixed it by enabling the HDTV options for my Dell 24" monitor in Catalyst Control Panel.
My i5 750 now runs at 80-100% on all four cores in BF BC2. I suppose it has to shovel more stuff to the far more powerful graphics card. I have tweaked the CPU overclock to run standard voltages at 3.2GHz; before, I ran the same higher voltages whether it was at 3GHz or 4GHz. Standard CPU voltages lessen the heat output. My Corsair H50 keeps the CPU below 65C in game.
.................................................. Fortress Of Solitude |
Shani Khorme
|
Posted - 2011.07.13 20:56:00 -
[20]
Originally by: Vogue A new problem: playing Supreme Commander: Forged Alliance from the SSD. It stalls when I queue orders for engineers and queue template build items. It is fine now that I launch the game from a mechanical hard disk.
Last night I had a new problem where Blu-ray players were complaining about HDCP not working, something to do with the HD monitor, graphics card and Blu-ray player co-operating with each other. I fixed it by enabling the HDTV options for my Dell 24" monitor in Catalyst Control Panel.
My i5 750 now runs at 80-100% on all four cores in BF BC2. I suppose it has to shovel more stuff to the far more powerful graphics card. I have tweaked the CPU overclock to run standard voltages at 3.2GHz; before, I ran the same higher voltages whether it was at 3GHz or 4GHz. Standard CPU voltages lessen the heat output. My Corsair H50 keeps the CPU below 65C in game.
No clue about Supreme Commander.
Blu-ray players always complain about that. 720/1080p content from Blu-ray discs can only be transmitted over an HDCP-compliant connection (an HDCP-compliant DVI or HDMI link) and will only play at low resolution otherwise. That only goes for official discs, though; normal data (ripped movies) plays fine.
The last one is obvious, since BC2 is more of a CPU-heavy game than a GPU-heavy one, especially without HBAO enabled.
|
|
Akita T
Caldari Navy Volunteer Task Force
|
Posted - 2011.07.13 21:25:00 -
[21]
Originally by: Vogue My i5 750 now runs at 80-100% on all four cores in BF BC2. I suppose it has to shovel more stuff to the far more powerful graphics card.
Force vsync on (default disabled), disable triple buffering (default enabled), and set max pre-rendered frames to zero (or at most 1; the default is 3). Or whatever the ATI equivalents of those NVIDIA settings are called. Watch CPU usage drop heavily while, at the same time, games become MORE responsive and visually more appealing due to the lack of tearing. The only downside is occasional FPS "pits" (the opposite of "spikes"), but it's not really that noticeable (when it's noticeable at all). _
Make ISK||Build||React||1k papercuts
|
Vogue
Short Bus Pole Dancers
|
Posted - 2011.07.15 00:44:00 -
[22]
For BF BC2 I set vsync on and render-ahead frames to 0. I could not find a triple buffering option. CPU use now fluctuates between 60 and 95% instead of a constant 95-100%. I almost always play the Heavy Metal map, as I like tanks and I don't have the reaction times I used to have in FPSes. The map is also easier for intermediate players.
The ASUS Splendid image/movie enhancement thingy corrupts video playback in VLC, so I will leave that off.
I will connect the graphics card's two fans to the motherboard via a speed controller, as SmartDoctor has to be loaded to set fan speeds and it makes most games stutter every time it polls the GPU. The default fan speed can go a bit higher before it gets too noisy.
.................................................. Fortress Of Solitude |
Akita T
Caldari Navy Volunteer Task Force
|
Posted - 2011.07.15 00:55:00 -
[23]
Edited by: Akita T on 15/07/2011 00:58:15
Far less of a CPU usage drop than I expected, but, well, at least it helped.
P.S. The "vsync on" in the drivers is not *quite* mandatory: applications can still override it (EVE can). So make sure you set both "vsync on" in the drivers and the equivalent inside the game. _
Make ISK||Build||React||1k papercuts
|
Reiisha
Splint Eye Probabilities Inc.
|
Posted - 2011.07.15 10:30:00 -
[24]
Originally by: Akita T Far less of a CPU usage drop than I expected, but, well, at least it helped.
P.S. The "vsync on" in the drivers is not *quite* mandatory: applications can still override it (EVE can). So make sure you set both "vsync on" in the drivers and the equivalent inside the game.
Vsync would cause less CPU usage, but responsiveness goes down a bit... I always get extremely noticeable input lag when turning it on.
"If you do things right, people won't be sure you've done anything at all"
|
Akita T
Caldari Navy Volunteer Task Force
|
Posted - 2011.07.15 10:40:00 -
[25]
Originally by: Reiisha
Originally by: Akita T Far less of a CPU usage drop than I expected, but, well, at least it helped. P.S. The "vsync on" in the drivers is not *quite* mandatory: applications can still override it (EVE can). So make sure you set both "vsync on" in the drivers and the equivalent inside the game.
Vsync would cause less CPU usage, but responsiveness goes down a bit... I always get extremely noticeable input lag when turning it on.
The increased input lag and other issues ONLY happen if you don't disable triple buffering and don't set max pre-rendered frames to zero. All of these things are linked. If you take all three steps I mentioned, you get no visual tearing, CPU usage goes down, and input lag stays the lowest it can be (i.e. the actual frame compute time plus the monitor's own input lag).
_
Make ISK||Build||React||1k papercuts
|
Mibad
Caldari
|
Posted - 2011.07.16 04:49:00 -
[26]
Edited by: Mibad on 16/07/2011 04:51:27 Stick with the 1GB card; go for 2GB only if you can throw money at the wall like CCP's cashflow (SLI or Crossfire in the future if you find you want more GPU power). Use the extra money on an SSD.
|
Vogue
Short Bus Pole Dancers
|
Posted - 2011.07.16 17:03:00 -
[27]
Annoyingly, GPU-Z does not show the memory usage of my 2GB DirectCU II 6950, and nor does any other utility I can find. I wanted to see memory usage in EVE outside a busy Jita station. I like lots of pedantic information about my games PC.
.................................................. Fortress Of Solitude |
|