
Sleepkevert
Paradox v2.0
|
Posted - 2007.10.15 14:00:00 -
[121]
Originally by: Mara Kell
Originally by: MHayes Isn't SM3 4 years old? If your graphics card is over 4 years old then no shiny ships for you. This is your fault, not CCP's fault. Be glad they are still allowing you people to use the old client.
Strokes 7900GTX, which is like 2 years old but still does the business.
My graphics card is 1.5 years old and it wasn't exactly cheap at the time, and it only has SM2. Just because a graphics card has SM3 doesn't make it better than an SM2 card. When almost every other game released this year supports SM2 but EVE doesn't, I don't see the problem as being with my graphics card but with CCP's support for ATI cards.
Lolwut? Seriously... no Shader Model 3, no NEW graphics engine. Consider the old engine the game on low settings, just like every other game released this year has a "low settings" mode. Also, ATI cards have nothing to do with this... nVidia cards without SM3 have the same problem, so saying that CCP disadvantages ATI buyers is WRONG. And what kind of card did you buy 1.5 years ago that was expensive but doesn't have SM3?
Sign my sig |

Kelron Queldine
Infinitus Odium The Church.
|
Posted - 2007.10.15 14:02:00 -
[122]
Originally by: MHayes I doubt a 3 year old card could run it, so why change it now? I don't want graphics technology to be slowed down by some tightwad with a 3 year old £50 graphics card.
A 3 year old card can run it fine, as long as it's a 3 year old NVidia card. Most new games being released support SM2; the ones that don't are the exception rather than the rule, so it seems lazy that the new graphics features for EVE will require SM3. ---------------------------
Vanilla Crazy Cake! |

SirMolly
|
Posted - 2007.10.15 14:03:00 -
[123]
lol @ all the whineage. 
no one forces you to use the new graphics. 
|

Sleepkevert
Paradox v2.0
|
Posted - 2007.10.15 14:08:00 -
[124]
Originally by: SirMolly lol @ all the whineage. 
no one forces you to use the new graphics. 
Quoted for epic truth! Consider the old graphics engine as EVE on low settings, and the new graphics engine as EVE on high settings.
That way you don't even need shaders at all to play EVE.
Of course the new graphics engine is going to require new hardware. What you are whining about now is something like this: "<insert random game here> runs fine on low! Why can't I run it on ultra high on the same hardware!!11!!" Well guess what, hardware gets improved, new things are added, you can still run the newer games fine on low, but to enjoy all the shininess (like Trinity II) you need some better hardware...
/whinage
Sign my sig |

Kelron Queldine
Infinitus Odium The Church.
|
Posted - 2007.10.15 14:12:00 -
[125]
People without SM3 cards will still be able to play on 'low' settings, yes, but the point is that someone like Mara has a card that at least equals and probably outperforms my 6800GT and will be able to play most new games for a while yet, but won't be able to use EVE's new graphics because SM2 support hasn't been included. ---------------------------
Vanilla Crazy Cake! |

MHayes
|
Posted - 2007.10.15 14:26:00 -
[126]
Edited by: MHayes on 15/10/2007 14:30:05 Agree, screw all these SM2 fanboys, show us some DX9 SM3 screenies, or even better DX10 (and yes, I will need a new graphics card for DX10 as well, but I am not complaining, because to have good graphics you've got to have good hardware. Shock horror!!).
|

Sleepkevert
Paradox v2.0
|
Posted - 2007.10.15 14:28:00 -
[127]
Edited by: Sleepkevert on 15/10/2007 14:28:29
Originally by: Kelron Queldine People without SM3 cards will still be able to play on 'low' settings, yes, but the point is that someone like Mara has a card that at least equals and probably outperforms my 6800GT and will be able to play most new games for a while yet, but won't be able to use EVE's new graphics because SM2 support hasn't been included.
Sooo, you want a new engine that uses technology that was available even before EVE was started? What's the point in making a new engine then? They chose SM3, and let's be honest, you and I know they must have had their reasons for that.
Sign my sig |

MHayes
|
Posted - 2007.10.15 14:31:00 -
[128]
But that is ATI's fault, not EVE's. He should have researched what he was buying, eh? It is like me buying a new card now that doesn't support DX10.
|

Primnproper
|
Posted - 2007.10.15 14:45:00 -
[129]
I can't believe there is still confusion over this. They have said very clearly that everyone will upgrade to the new Trinity 2 engine, and we will then have the choice of using the new engine's DX8 mode, which will use low poly models and have less flashy effects, or the new engine's DX9 with SM3 mode, which will use all of the new high poly models and flashy effects.
It could not be simpler. |

Sleepkevert
Paradox v2.0
|
Posted - 2007.10.15 14:53:00 -
[130]
Originally by: Primnproper I can't believe there is still confusion over this. They have said very clearly that everyone will upgrade to the new Trinity 2 engine, and we will then have the choice of using the new engine's DX8 mode, which will use low poly models and have less flashy effects, or the new engine's DX9 with SM3 mode, which will use all of the new high poly models and flashy effects.
It could not be simpler.
Then point me to a frigging source? No dev blog about it, the news item doesn't report such details... Must have been a forum post about this... then again, it wouldn't be "very clearly"...
Sign my sig |

Primnproper
|
Posted - 2007.10.15 15:08:00 -
[131]
Originally by: Sleepkevert
Originally by: Primnproper I can't believe .......
Then point me to a frigging source? No dev blog about it, the news item doesn't report such details... Must have been a forum post about this... then again, it wouldn't be "very clearly"... Ha, fail... I just found this:
Originally by: CCP Wrangler To try and sort this out:
We currently have our client, the client you use every day, and it will remain.
In the November expansion we will release Trinity II, a DX9 version of the client.
So we will have two clients at this point, the original and the Trinity II with DX9.
At some point in the future we will also release a DX10 version, but this is not the Trinity II expansion. 
The "everyone will upgrade" thing is just a forum myth, I can't find any dev post or otherwise to support it... Prove me wrong...
Indeed fail...
Originally by: CCP Explorer Edited by: CCP Explorer on 08/10/2007 13:57:48
Let me try and clear up the confusion and collect discussion from this thread on the size of the install and this thread on the new graphics engine.
First of all, the November expansion, codenamed "Kali3", is not the DX10 upgrade; that will come later.
"Kali3" will contain a new graphics engine that now supports DX9 SM3 in addition to supporting DX8 as before.
There will be one client but two installs. The same graphics engine and the same game code will be in both installs, but different graphics content. The smaller install, codenamed "Classic", will only contain the DX8 graphics content while the full client will contain both DX8 and DX9 SM3 graphics content.
When we release "Kali3", all Revelations II installs will be upgraded to the "Classic" install (a 15-30 MB patch). If the hardware is DX9 SM3 capable then you will be offered the new DX9 SM3 graphics content as a background download (probably 600-900 MB).
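(Purely for illustration, here is a minimal C++ sketch of the upgrade flow as Explorer describes it: every install gets the small "Classic" patch, and only SM3-capable hardware is additionally offered the DX9 content download. The names and structure below are made up for the example and are not CCP's code.)
// Illustrative sketch only, not CCP's code: the upgrade flow described above.
// Everyone gets the small "Classic" (DX8) patch; SM3-capable hardware is also
// offered the DX9 SM3 content as an optional background download.
#include <cstdio>

struct UpgradePlan {
    const char* basePatch;        // applied to every Revelations II install
    bool        offerDx9Content;  // optional ~600-900 MB background download
};

UpgradePlan PlanUpgrade(bool hardwareIsSm3Capable)
{
    UpgradePlan plan;
    plan.basePatch       = "Classic (DX8) patch, ~15-30 MB";
    plan.offerDx9Content = hardwareIsSm3Capable;
    return plan;
}

int main()
{
    UpgradePlan p = PlanUpgrade(true);   // pretend the hardware check said "SM3 capable"
    std::printf("Base patch: %s\n", p.basePatch);
    std::printf("Offer DX9 SM3 content: %s\n", p.offerDx9Content ? "yes" : "no");
    return 0;
}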
|

Sleepkevert
Paradox v2.0
|
Posted - 2007.10.15 15:12:00 -
[132]
Originally by: Primnproper Random dev quote fest...
Great, devs contradicting each other... Anyway, thanks for pointing me to that forum post...
Sign my sig |

Kelron Queldine
Infinitus Odium The Church.
|
Posted - 2007.10.15 15:18:00 -
[133]
Let's take the recently released Bioshock as an example for the SM2/SM3 issue. Bioshock was released without SM2 support, and this caused a massive uproar on various forums amongst people who'd been waiting for the game, because there are a lot of people with SM2 cards that are still perfectly capable of running the latest games when SM2 support is provided (which, as I've already said, it usually is). It took one day for a fan patch to be released for Bioshock that added SM2 support; it wasn't complete at that stage and had odd or missing textures, but it didn't take much longer to add proper support. So it was either laziness on the part of the developers, or the publisher pushing for a quicker release. If a member of the community could add SM2 support in such a short period of time, why didn't the devs?
Bioshock uses Unreal Engine 3, which supports SM2 and probably accounts for the ease of adding SM2 support to Bioshock. As far as I know the new EVE engine is developed from scratch by CCP, so there may be a good reason why it would be difficult or too time consuming to add SM2 support, but it seems odd to alienate what is likely a significant portion of the userbase who have cards easily powerful enough to run the new engine, yet don't have SM3 support. ---------------------------
Vanilla Crazy Cake! |

Primnproper
|
Posted - 2007.10.15 15:24:00 -
[134]
Originally by: Sleepkevert
Originally by: Primnproper Random dev quote fest...
Great, devs contradicting each other... Anyway, thanks for pointing me to that forum post...
Explorer wins on that one: he's Software Director, whereas Wrangler is only (no offense) Community Manager.  |

Kayhman
|
Posted - 2007.10.15 15:28:00 -
[135]
|

NeoTheo
Caldari Dark Materials Fang Alliance
|
Posted - 2007.10.15 15:33:00 -
[136]
This is a stupid argument in general; it's simple. EVE was starting to look long in the tooth. When they make a new engine they are not just writing an engine to make the game look decent now, they are making it to be extensible in the future. While I feel sorry for the people who have SM2 cards that are far more powerful than the now not-so-mighty 6800, they have to understand that sooner or later you have to stop including everything for the sake of your code base.
This is not Half-Life, it's not a static game that they can include more gfx bloat in, because if they did that the game would be less extensible in the future, as managing all your modules sooner or later becomes a problem.
It's called a clean slate, a new platform for them to work from in terms of the client.
I don't doubt they could have included SM2 support, but in 1.5 years' time 95% of you with SM2 cards will be rid of them, and EVE will be 6, nearly 7, years old. It needs to look good now and in 2 years; spending more time writing another renderer path stalls development and stifles where the game could be in a few years. This way we are getting a cleaner product, imo.
Anyhow, that's obviously a load of drunken *******s, but heyho.
Sorry to all the people with non-SM3 cards (my second box has one) :( but at the end of the day I am pretty sure it's technically the right decision; let's just hope it does not hurt the player base much.
/Theo
|

Kelron Queldine
Infinitus Odium The Church.
|
Posted - 2007.10.15 15:36:00 -
[137]
Originally by: NeoTheo
Anyhow, that's obviously a load of drunken *******s, but heyho.
Actually, it seems to be the first decent answer anyone's given, other than "why should old cards be supported?".  ---------------------------
Vanilla Crazy Cake! |

Paulo Damarr
|
Posted - 2007.10.15 15:58:00 -
[138]
I really don't understand all the panic about "OMG! OH NOES! It won't work!" According to everything made public so far, older machines should have no problems. My backup rig apparently even meets the requirements:
Athlon 64 2800+, GIGABYTE K8NS, 2x 512MB Corsair Value DDR-400, PNY Nvidia 6600 256MB AGP 8x, Windows XP Home.
If you have an AGP board you can pick 6600 series cards up for around £50, maybe even less. /sig --->Enter at your own risk<--- |

Mara Kell
|
Posted - 2007.10.15 16:29:00 -
[139]
Edited by: Mara Kell on 15/10/2007 16:30:18 Edited by: Mara Kell on 15/10/2007 16:29:33
Originally by: Paulo Damarr I really don't understand all the panic about "OMG! OH NOES! It won't work!" According to everything made public so far, older machines should have no problems. My backup rig apparently even meets the requirements:
Athlon 64 2800+, GIGABYTE K8NS, 2x 512MB Corsair Value DDR-400, PNY Nvidia 6600 256MB AGP 8x, Windows XP Home.
If you have an AGP board you can pick 6600 series cards up for around £50, maybe even less.
Let's see, my machine: Athlon 64 X2 4200+, Asus A8V, 4x 512MB Infineon DDR-400, Sapphire X850PE 256MB AGP 8x, XP Pro.
And the new EVE DX9 won't run on it...
So which graphics card would you actually suggest I buy that has SM3, is cheap, is faster than mine and comes in AGP? The only choice would be the X1950, and that's a lot of money for just SM3 and hardly any performance increase.
|

Paulo Damarr
|
Posted - 2007.10.15 16:44:00 -
[140]
Edited by: Paulo Damarr on 15/10/2007 16:47:13
Originally by: Mara Kell Edited by: Mara Kell on 15/10/2007 16:30:18 Edited by: Mara Kell on 15/10/2007 16:29:33
Originally by: Paulo Damarr I really don't understand all the panic about "OMG! OH NOES! It won't work!" According to everything made public so far, older machines should have no problems. My backup rig apparently even meets the requirements:
Athlon 64 2800+, GIGABYTE K8NS, 2x 512MB Corsair Value DDR-400, PNY Nvidia 6600 256MB AGP 8x, Windows XP Home.
If you have an AGP board you can pick 6600 series cards up for around £50, maybe even less.
Let's see, my machine: Athlon 64 X2 4200+, Asus A8V, 4x 512MB Infineon DDR-400, Sapphire X850PE 256MB AGP 8x, XP Pro.
And the new EVE DX9 won't run on it...
So which graphics card would you actually suggest I buy that has SM3, is cheap, is faster than mine and comes in AGP? The only choice would be the X1950, and that's a lot of money for just SM3 and hardly any performance increase.
I'm not trying to be offensive, but you obviously researched your machine poorly when you bought/built it. You have a dual core CPU but didn't think to future-proof it by getting a PCI-E mainboard. Since you have DDR-400 as well, I'm guessing it might be a Socket 939 motherboard, and PCI-E boards were available at a price comparable to the AGP boards.
Plus ATi is junk. /sig --->Enter at your own risk<--- |

Falkrich Swifthand
Caldari eNinjas Incorporated
|
Posted - 2007.10.15 17:09:00 -
[141]
Edited by: Falkrich Swifthand on 15/10/2007 17:12:28 Edited by: Falkrich Swifthand on 15/10/2007 17:11:38 As someone who actually knows about SM 3.0, see this: http://en.wikipedia.org/wiki/Shader_model_3#Pixel_shader_comparison You might notice more than a few differences between SM2 and SM3.
It's also worth noting that the ATI X1000 series cards only support the bare minimum of SM3. Specifically, they only allow pixel shaders up to 512 instructions long. I don't know about the X2000 series, I haven't tried one. NV's 8000 series definitely supports shaders with more than 600 instructions.
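(If you want to see what your own card reports, the pixel shader model and the ps_3_0 instruction-slot count I'm talking about are both exposed through the Direct3D 9 caps. A rough sketch, assuming the standard DirectX 9 SDK headers and the default adapter; an X1000-series card should report something close to the 512 minimum, while cards that go past the SM3 baseline report more.)
// Sketch: query the pixel shader model and the ps_3_0 instruction-slot count
// that the installed card exposes through the Direct3D 9 caps.
// Assumes d3d9.h / d3d9.lib from the DirectX 9 SDK and a HAL device.
#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) { std::printf("No Direct3D 9 runtime found\n"); return 1; }

    D3DCAPS9 caps;
    if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
    {
        // PixelShaderVersion packs the highest pixel shader model the device supports.
        std::printf("Pixel shader model: %u.%u\n",
                    (unsigned)D3DSHADER_VERSION_MAJOR(caps.PixelShaderVersion),
                    (unsigned)D3DSHADER_VERSION_MINOR(caps.PixelShaderVersion));
        // The SM3 spec only guarantees 512 slots; some cards report far more.
        // (On a card without ps_3_0 support this field is not meaningful.)
        std::printf("ps_3_0 instruction slots: %u\n",
                    (unsigned)caps.MaxPixelShader30InstructionSlots);
    }
    d3d->Release();
    return 0;
}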
EDIT: Where's this "nullnull" coming from? |

Mara Kell
|
Posted - 2007.10.15 17:45:00 -
[142]
Originally by: Paulo Damarr
I'm not trying to be offensive, but you obviously researched your machine poorly when you bought/built it. You have a dual core CPU but didn't think to future-proof it by getting a PCI-E mainboard. Since you have DDR-400 as well, I'm guessing it might be a Socket 939 motherboard, and PCI-E boards were available at a price comparable to the AGP boards.
Plus ATi is junk.
What makes you think that I bought this system as it is? I have never in my whole life bought a complete system; it's a work in progress. I know upgrading is basically over for this system, but not being able to run EVE's DX9 client while Crysis will probably run fine is a bad joke.
And by the way, Nvidia's AGP cards are junk in terms of technical design; they draw power like crazy, so they're no use for a silent system. So were the early X1000 series from ATI, hence the decision for the X800 series.
|

MHayes
|
Posted - 2007.10.16 09:20:00 -
[143]
You knew it only had SM2 when you got it, or if you didn't, you should have read the specs. The 6000 series was available but you decided to go ATI. How is this CCP's fault or problem?
It is like me buying an 8800GTX now and then complaining next year that it isn't DX10.1 compatible. If you are upgrading often then it doesn't matter, because by the time a game comes out that needs 10.1 this card will be old hat and I will have upgraded. If I were planning to keep the card for a while I would think, "Hmm, better get one with SM4 support."
I agree that if SM2 support is easy to implement then make it so, but I assume it isn't or they would have done it.
|

Steve Hawkings
|
Posted - 2007.10.16 09:23:00 -
[144]
Originally by: Ealiom Edited by: Ealiom on 17/09/2007 14:04:54 More importantly, and something nobody else seems to be asking, is how the new client will fare when multiple accounts are in use.
I have 5 accounts. I'm most comfortable using 3 accounts at once, even though with the current client I can run all 5 at once. When the new client comes will I be limited to 2 accounts.........1?!?!
I don't fancy paying for accounts I can't use!
You will be able to use them even if that happens, just not at the same time, Mr Useless.
|

AsTPlatinum
|
Posted - 2007.10.16 09:42:00 -
[145]
I'd like to know if they are going to implement dual screen compatibility with Trinity II; it was available in the early versions of EVE but got booted out for some reason.
|

Phoenix Lord
The Arrow Project Morsus Mihi
|
Posted - 2007.10.16 11:27:00 -
[146]
People getting ****ed at the SM3 thing are blaming the wrong people. Don't blame CCP for being smart and using something that's up to date. Don't forget how long we've had the current Trinity engine; I don't think they wanted to use something that wasn't the best at the time if we were going to be stuck with it for another few years.
Anyway, blame ATi for not being able to include SM3 in that generation, or blame your own damn self for not doing your research. _____
|

Claude Leon
Gallente Ixion Defence Systems The Cyrene Initiative
|
Posted - 2007.10.16 22:34:00 -
[147]
Originally by: Mara Kell Edited by: Mara Kell on 15/10/2007 16:30:18 Edited by: Mara Kell on 15/10/2007 16:29:33
Originally by: Paulo Damarr I really don't understand all the panic about "OMG! OH NOES! It won't work!" According to everything made public so far, older machines should have no problems. My backup rig apparently even meets the requirements:
Athlon 64 2800+, GIGABYTE K8NS, 2x 512MB Corsair Value DDR-400, PNY Nvidia 6600 256MB AGP 8x, Windows XP Home.
If you have an AGP board you can pick 6600 series cards up for around £50, maybe even less.
Let's see, my machine: Athlon 64 X2 4200+, Asus A8V, 4x 512MB Infineon DDR-400, Sapphire X850PE 256MB AGP 8x, XP Pro.
And the new EVE DX9 won't run on it...
So which graphics card would you actually suggest I buy that has SM3, is cheap, is faster than mine and comes in AGP? The only choice would be the X1950, and that's a lot of money for just SM3 and hardly any performance increase.
My cell phone is faster than your system. I will break it down for you in very simple terms: you either adapt or die.
|
|

CCP Explorer

|
Posted - 2007.10.16 23:57:00 -
[148]
Originally by: MHayes I agree that if SM2 support is easy to implement then make it so, but I assume it isn't or they would have done it.
SM2 has a limit of 32 texture + 64 arithmetic shader instructions (see the Wikipedia link above) whereas SM3 supports at least 512. We already have shaders in the new graphics engine with more than 200 instructions.
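(To illustrate the point, here is a rough, hypothetical C++ sketch using the D3DX shader compiler from the DirectX 9 SDK. It builds a throwaway pixel shader with roughly 200 arithmetic instructions, about the length mentioned above, and tries to compile it for both ps_2_0 and ps_3_0; the ps_2_0 compile fails against the 64 arithmetic instruction limit while ps_3_0 accepts it. The shader is made up purely for the demonstration and has nothing to do with EVE's actual shaders.)
// Hypothetical demonstration, not an EVE shader: generate a pixel shader with
// ~200 dependent arithmetic instructions and compile it against both targets.
// Assumes d3dx9.h / d3dx9.lib from the DirectX 9 SDK (D3DXCompileShader).
#include <d3dx9.h>
#include <string>
#include <cstdio>

static bool CompilesFor(const std::string& src, const char* profile)
{
    ID3DXBuffer* code = NULL;
    ID3DXBuffer* errors = NULL;
    HRESULT hr = D3DXCompileShader(src.c_str(), (UINT)src.size(),
                                   NULL, NULL, "main", profile, 0,
                                   &code, &errors, NULL);
    if (code)   code->Release();
    if (errors) errors->Release();
    return SUCCEEDED(hr);
}

int main()
{
    // Non-foldable dependent chain: each line costs roughly one mul plus one mad,
    // so 100 iterations give on the order of 200 arithmetic instructions.
    std::string src = "float4 main(float2 uv : TEXCOORD0) : COLOR {\n"
                      "  float4 c = float4(uv, 0.5f, 1.0f);\n";
    for (int i = 0; i < 100; ++i)
        src += "  c = c * c * 0.99f + 0.01f;\n";
    src += "  return c;\n}\n";

    std::printf("ps_2_0 (32 texture + 64 arithmetic limit): %s\n",
                CompilesFor(src, "ps_2_0") ? "compiles" : "too long, fails");
    std::printf("ps_3_0 (at least 512 slots):               %s\n",
                CompilesFor(src, "ps_3_0") ? "compiles" : "fails");
    return 0;
}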
Erlendur S. Thorsteinsson Software Director EVE Online, CCP Games |
|

Avernus
Gallente Imperium Technologies Firmus Ixion
|
Posted - 2007.10.17 02:09:00 -
[149]
Originally by: CCP Explorer
Originally by: MHayes I agree that if SM2 support is easy to implement then make it so, but I assume it isn't or they would have done it.
SM2 has a limit of 32 texture + 64 arithmetic shader instructions (see the Wikipedia link above) whereas SM3 supports at least 512. We already have shaders in the new graphics engine with more than 200 instructions.
This explanation wins.
...checks card.... doh!
/Starts to put away some cash for a new card.
There are old soldiers, and there are bold soldiers. But there are very few old, bold soldiers. |

riprjak
Hermits Rest
|
Posted - 2007.10.17 03:22:00 -
[150]
Originally by: Ealiom Edited by: Ealiom on 17/09/2007 14:04:54 More importantly, and something nobody else seems to be asking, is how the new client will fare when multiple accounts are in use.
I have 5 accounts. I'm most comfortable using 3 accounts at once, even though with the current client I can run all 5 at once. When the new client comes will I be limited to 2 accounts.........1?!?!
I don't fancy paying for accounts I can't use!
Well, when I'm using multiple clients/games, I run them on separate PCs with their monitors sat side by side, using Multiplicity virtual keyboard/mouse switching software to jump between them. Multiple DX9 apps, no probs :)
|
|
|