
Siyavash
|
Posted - 2009.04.02 07:28:00 -
[1]
Edited by: Siyavash on 02/04/2009 07:28:18
Dear CCP,
Please remove the check that, when we run Eve from a Remote Desktop session, says "It cannot run in a remote desktop session"... Why? Because Eve works PERFECTLY FINE in a Remote Desktop session, at least on Windows Server 2008.
How do I know this? I left Eve running and forgot about it until I logged in to my PC via Remote Desktop and saw it running WITHOUT problems!!!
Here is my proof: (this screenshot is taken from within the Remote Desktop session)
|

Siyavash
|
Posted - 2009.04.02 07:29:00 -
[2]
Just a note: I have a 100/100 line, so if you have a slow connection it might not be as pleasant an experience. However, you could run at a lower resolution with a smaller window, I guess.
|

Primnproper
|
Posted - 2009.04.02 07:40:00 -
[3]
I used to do it using 'remote mobile desktop', which let me play Eve over a remote-desktop-style connection on my smartphone.
It also works through RealVNC, if I remember rightly. ...
Originally by: Graveyard Tan I call bull**** and troll. If you are deaf, how are you even able to read this or type replies?
|

Blane Xero
Amarr The Firestorm Cartel
|
Posted - 2009.04.02 08:12:00 -
[4]
Cool. Fix the image though.
______________________________________________ Haruhiist since December 2008
|

Siyavash
|
Posted - 2009.04.02 09:28:00 -
[5]
Thanks for fixing the image! (whoever it was)

|

Shinnen
Caldari Northern Intelligence PuPPet MasTers
|
Posted - 2009.04.02 10:04:00 -
[6]
CCP should talk to OnLive; it would be nice to play Eve on netbooks and such.
INB4ONLIVE****STORM Linkage
|

Furb Killer
Gallente
|
Posted - 2009.04.02 11:15:00 -
[7]
Onlive reality check
---------------------------------------------
Originally by: Neth'Rae Military experts are calling this a troll.
|

Polly Prissypantz
Dingleberry Appreciation Society
|
Posted - 2009.04.02 11:19:00 -
[8]
Agreed. I hardly ever use remote desktop, but the other day I fired up a remote session to my other PC (which I had done in the past with Classic on occasion), tried to log on to EVE, and was like WTF?!? Why add the check, CCP? If it works, it works - and it used to work before you intentionally disabled it.
|

Jasahl Toruken
Amarr Open Concepts
|
Posted - 2009.04.02 11:53:00 -
[9]
I used to log in via Logmein.com and play remotely; it only worked well enough to do autopilot, skill switching, and other non-combat stuff. Now all I get is a black screen, unless I leave the game running first and then remote in. I know it's not intended to run via remote desktop, and they probably did it to keep people from getting into the game from their work computers. With the skill queue I don't really need to log in remotely to change skills anymore, and that was my primary reason for doing it.
|

Andrest Disch
Amarr Letiferi Praedones
|
Posted - 2009.04.02 12:55:00 -
[10]
Originally by: Furb Killer Onlive reality check
I'd be more inclined to believe Eurogamer if they had ever actually used OnLive.
Rebuttal.
|

jetuserII
|
Posted - 2009.04.02 14:15:00 -
[11]
The check has to be there because, on Windows, when you remote in, the driver information (i.e. the hardware registry key) switches over to a "virtual machine"; when Eve tries to launch and gain access to the hardware, it only sees Remote Desktop and fails. The reason it works if you have the game running beforehand is that the information/access was already set up prior to connecting. And of course it works with most types of VNC servers; they basically just capture the system view and send it to you, so you're essentially watching a movie of your computer.
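For reference, here is a minimal sketch of how a Windows application can detect and refuse a Remote Desktop session. SM_REMOTESESSION is a documented GetSystemMetrics() flag; that EVE's check works exactly this way is an assumption on my part.

```cpp
// Minimal sketch: refuse to start inside a Remote Desktop session.
// GetSystemMetrics(SM_REMOTESESSION) is nonzero when the calling
// process runs in an RDP session. Whether EVE uses this exact call
// is an assumption.
#include <windows.h>
#include <cstdio>

int main()
{
    if (GetSystemMetrics(SM_REMOTESESSION) != 0) {
        std::printf("Remote Desktop session detected - refusing to start.\n");
        return 1;  // the kind of early-out the OP wants removed
    }
    std::printf("Local console session - starting normally.\n");
    return 0;
}
```

Note that a check like this fires before any graphics initialization, which is why a client started locally and then viewed over RDP never trips it.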
|

Nyphur
Pillowsoft Total Comfort
|
Posted - 2009.04.02 14:29:00 -
[12]
Originally by: Shinnen CCP should talk to OnLive; it would be nice to play Eve on netbooks and such.
INB4ONLIVE****STORM
A company called Gaikai is already offering EVE and WoW on their remote computing service. I don't know what it's like, but I think EVE is one of the few games that would be tolerable to play with a few hundred milliseconds of delay. (Source)
Although someone did point out to me that the 1-5 ms delay they're claiming is impossible, as it would mean either that there is a server in your town or that the signal travels faster than the speed of light.
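A quick sanity check of that point. Light in fibre covers roughly 200 km per millisecond (about two-thirds of c), so even if the entire claimed delay were spent on propagation alone:

```cpp
// Back-of-the-envelope check of the 1-5 ms claim: if the whole budget
// went to network propagation (no encoding, routing or display time),
// how far away could the server be? Light in fibre covers roughly
// 200 km per millisecond.
#include <cstdio>

int main()
{
    const double kmPerMs = 200.0;            // ~2/3 of c, in fibre
    const double roundTrips[] = {1.0, 5.0};  // claimed delays, in ms
    for (double rttMs : roundTrips) {
        double oneWayKm = (rttMs / 2.0) * kmPerMs;  // halve for round trip
        std::printf("%.0f ms round trip -> server within ~%.0f km\n",
                    rttMs, oneWayKm);
    }
    return 0;
}
```

That works out to roughly 100 km for 1 ms and 500 km for 5 ms, so the "server in your town" reading is the only physically possible one.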
|

Shoukei
Caldari Boobs Ahoy
|
Posted - 2009.04.02 14:43:00 -
[13]
Originally by: Andrest Disch I'd be more inclined to believe Eurogamer if they had ever actually used OnLive.
Rebuttal.
They use regular ISPs? Because I will laugh when ISPs start throttling them to hell and back. There is no way they can provide users with that much actual bandwidth and stay in business.
|

Xen Gin
Solar Excavations Ultd.
|
Posted - 2009.04.02 14:53:00 -
[14]
Originally by: Shoukei
Originally by: Andrest Disch I'd be more inclined to believe Eurogamer if they had ever actually used OnLive.
Rebuttal.
They use regular ISPs? Because I will laugh when ISPs start throttling them to hell and back. There is no way they can provide users with that much actual bandwidth and stay in business.
And using a 5 Mbps connection constantly to 'play' your HD game is going to be great when that bandwidth cap sinks your game.
|

Crumplecorn
Gallente Eve Cluster Explorations
|
Posted - 2009.04.02 15:00:00 -
[15]
Originally by: Andrest Disch
Originally by: Furb Killer Onlive reality check
I'd be more inclined to believe Eurogamer if they had ever actually used OnLive.
Rebuttal.
I'd be more inclined to believe the 'rebuttal' if it wasn't just a bunch of assertions. -
DesuSigs |

Siyavash
|
Posted - 2009.04.02 15:12:00 -
[16]
Originally by: jetuserII The check has to be there because, on Windows, when you remote in, the driver information (i.e. the hardware registry key) switches over to a "virtual machine"; when Eve tries to launch and gain access to the hardware, it only sees Remote Desktop and fails. The reason it works if you have the game running beforehand is that the information/access was already set up prior to connecting. And of course it works with most types of VNC servers; they basically just capture the system view and send it to you, so you're essentially watching a movie of your computer.
Well, you are right and you are wrong. I am a game developer myself, and "everything" "works" just fine even if you start it in a Remote Desktop session. That's the whole idea behind RDP. I'm 99.99% sure that Eve would not fail to start via RDP, at least on Windows Vista / Server 2008.
I even have Aero working via RDP.
I'm not sure why the check is in there; I'm guessing older Windows versions didn't support this, but it still doesn't make sense. Let me (the user) choose whether I'd like to run my application in an RDP session or not. It isn't up to them to decide, so this whole thing is a bit odd.
|

jetuserII
|
Posted - 2009.04.02 19:26:00 -
[17]
Originally by: Siyavash
Well, you are right and you are wrong. I am a game developer myself, and "everything" "works" just fine even if you start it in a Remote Desktop session. That's the whole idea behind RDP. I'm 99.99% sure that Eve would not fail to start via RDP, at least on Windows Vista / Server 2008.
I even have Aero working via RDP.
I'm not sure why the check is in there; I'm guessing older Windows versions didn't support this, but it still doesn't make sense. Let me (the user) choose whether I'd like to run my application in an RDP session or not. It isn't up to them to decide, so this whole thing is a bit odd.
You are right and wrong as well. I'm a developer myself, and the base issue is this: we need hardware acceleration to be able to launch Eve. Launching a resource-intensive application on a remote computer could degrade the performance of Remote Desktop enough to cause problems.
The reason you can get Aero is that it's not a -true- hardware-accelerated visual effect within Vista, and you will only get the effect when connecting to a Vista machine from a Vista-based machine (Vista or the server version). Also, I do believe Microsoft said they were working on a way to transfer DX calls via the RDP protocol, so you could run simple applications through the interface and use the resources of the computer you are connecting from.
At any rate, the check is unnecessary, because the launch will still fail: the DX initializer will try to access a virtual driver (supplied by RDC) that doesn't have the right capabilities, and will crash anyway. That's why the workaround works just fine.
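The failure mode being described would look roughly like this. A sketch of a Direct3D 9-style capability probe, assuming a D3D9-era client; the specific cap bit tested is illustrative, and EVE's actual startup path is not public.

```cpp
// Sketch of the probe described above: ask for the caps of the default
// adapter's HAL (hardware) device. Under a fresh RDP logon only the
// virtual display driver is active, so this fails or reports crippled
// capabilities. That EVE's initializer matches this is an assumption.
#include <d3d9.h>

#pragma comment(lib, "d3d9.lib")

bool HardwareDeviceUsable()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d)
        return false;  // no D3D9 runtime at all

    D3DCAPS9 caps;
    HRESULT hr = d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);
    d3d->Release();

    // Hardware transform & lighting as a proxy for a "real" GPU; the
    // RDP virtual display driver won't report it.
    return SUCCEEDED(hr) &&
           (caps.DevCaps & D3DDEVCAPS_HWTRANSFORMANDLIGHT) != 0;
}
```

On this reading the up-front session check is redundant: the probe above would fail on its own inside a fresh RDP session, which is the post's point.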
|

Polly Prissypantz
Dingleberry Appreciation Society
|
Posted - 2009.04.05 00:54:00 -
[18]
Originally by: jetuserII
Originally by: Siyavash
Well, you are right and you are wrong. I am a game developer myself, and "everything" "works" just fine even if you start it in a Remote Desktop session. That's the whole idea behind RDP. I'm 99.99% sure that Eve would not fail to start via RDP, at least on Windows Vista / Server 2008.
I even have Aero working via RDP.
I'm not sure why the check is in there; I'm guessing older Windows versions didn't support this, but it still doesn't make sense. Let me (the user) choose whether I'd like to run my application in an RDP session or not. It isn't up to them to decide, so this whole thing is a bit odd.
You are right and wrong as well. I'm a developer myself, and the base issue is this: we need hardware acceleration to be able to launch Eve. Launching a resource-intensive application on a remote computer could degrade the performance of Remote Desktop enough to cause problems.
What the huh? It is not the developer's place to decide what should or should not be run on a machine, be it local or remote. Whether a resource-intensive game may cause performance degradation (who woulda thunk it?) is entirely the end user's problem (or, in the case of a remote server owned/operated by another party, the server operator's problem). You, as a developer, are irrelevant in this regard. Your job is to make the application work. Whether the end user's hardware is up to the task is for them to worry about.
Long story short: if the hardware at the remote end is up to running the game, and if your network bandwidth is up to constantly transferring high-res video output, then let it.
|

Cyhawk
|
Posted - 2009.04.05 06:12:00 -
[19]
No, it doesn't work, and I'll tell you why.
(Skip past this paragraph to avoid the boring abstract protocol talk.) Software protocols like RDP/VNC/etc. all work in roughly the same way. They split your screen up into many tiny blocks (approx. 10x10 pixels, depending on the exact protocol). When the screen changes on the machine, it takes the changed block (or blocks, depending on the size of the change) and sends them to the client. 10x10 pixels is rather small and compresses quite well, so you'd think it works great - and it does, for simple stuff like typing, management, even web browsing.
However, here's what goes wrong: when you have entire SCREENS full of changes, as in a game or a video, instead of sending a few 10x10 blocks it has to send the whole screen. That is A LOT of data for one update a second - and your game is running at 30-60 FPS (in our case, network updates per second). That means sending what is basically a 1024x768 BMP image (compression is obviously used) 30 times a second to the client. Even with great compression, this is a near-impossible task for most internet pipes. (Even Gigabit Ethernet has trouble with it; you're looking at bonded network cards for this to be possible.)
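Putting numbers on that paragraph (the 50:1 compression ratio below is an illustrative assumption, not a figure from any real protocol):

```cpp
// The arithmetic from the paragraph above: raw bit rate of a 1024x768,
// 24-bit stream at 30 updates per second, and what an assumed 50:1
// codec would leave over.
#include <cstdio>

int main()
{
    const double pixels = 1024.0 * 768.0;
    const double bytesPerPixel = 3.0;   // 24-bit colour
    const double fps = 30.0;

    double rawMbit = pixels * bytesPerPixel * fps * 8.0 / 1e6;
    double compressedMbit = rawMbit / 50.0;   // assumed 50:1 ratio

    std::printf("Raw stream:  %.0f Mbit/s\n", rawMbit);        // ~566 Mbit/s
    std::printf("50:1 codec:  %.1f Mbit/s\n", compressedMbit); // ~11.3 Mbit/s
    return 0;
}
```

Even the compressed figure is well above what a typical 2009 home connection sustains, which is the point being made.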
Video does the same thing. Try running YouTube through a remote desktop connection: it plays like a slideshow, and that's a tiny box...
So yes, it works... but it's not feasible at this time to actually play the game.
Note: Despite all this, it does work well for skill changes from work, as long as you're at a station and have the station environment turned OFF (and all effects off, the screen resolution changed to something small like 1024x768, no graphical enhancements, and 16-bit colour).
In a few years, once "cloud computing" (god I hate that term) and technologies from VMware and Citrix come into play more often, we may see the ability to do this. Until then... ah well.
Note to the OP: Next time, bring up the Start menu when you take screenshots like this, to prove it's really a remote session :P
|

Polly Prissypantz
Dingleberry Appreciation Society
|
Posted - 2009.04.05 07:49:00 -
[20]
Originally by: Cyhawk No, it doesn't work, and I'll tell you why.
Christ, if being pedantic were a prize-winning contest, you'd get the gold star.
Here's the low-down: it has worked (locally, on a 100 Mbit network) for me in the past, and no, it wasn't at 60 FPS (or even 30 FPS), but for what I was using it for (skill changes, manufacturing, market stuff) it was sufficient. No, you can't PvP with it, and you probably can't even run missions with it, because it will be a slideshow. But no one was ever arguing that. We're merely arguing that it does work, however slowly, so why would CCP add a check to disable it?
Sometimes I think I die a little inside every time I post on the Eve forums.
Quote: Edit: Post summarized: OK, it does "work" but isn't practical with today's technology, mostly due to bandwidth issues.
No **** Sherlock.
|

Nyphur
Pillowsoft Total Comfort
|
Posted - 2009.04.05 08:38:00 -
[21]
Edited by: Nyphur on 05/04/2009 08:39:34
Originally by: Cyhawk No, it doesn't work, and I'll tell you why.
Software protocols like RDP/VNC/etc. all work in roughly the same way. They split your screen up into many tiny blocks (approx. 10x10 pixels, depending on the exact protocol). When the screen changes on the machine, it takes the changed block (or blocks, depending on the size of the change) and sends them to the client. 10x10 pixels is rather small and compresses quite well, so you'd think it works great - and it does, for simple stuff like typing, management, even web browsing.
However, here's what goes wrong: when you have entire SCREENS full of changes, as in a game or a video, instead of sending a few 10x10 blocks it has to send the whole screen. That is A LOT of data for one update a second - and your game is running at 30-60 FPS (in our case, network updates per second). That means sending what is basically a 1024x768 BMP image (compression is obviously used) 30 times a second to the client. Even with great compression, this is a near-impossible task for most internet pipes. (Even Gigabit Ethernet has trouble with it; you're looking at bonded network cards for this to be possible.)
If you're referring to OnLive, your point has already been made moot by their current claims. If you aren't, then please excuse the rest of this post:
You can't just assume they're using the same algorithms as the current remote desktop standards; read up on their technical specs. They claim to use a new lossy compression algorithm designed specifically around games, with compression and decompression times in the 1-5 ms range on a small piece of custom hardware. They then claim that the compressed video stream will use 2-5 Mbit of bandwidth. Assuming their claims are real - and the weight of publishers behind them hints that they are - concerns over feasibility are unfounded.
The one complaint against OnLive that actually holds water is that the average user's internet connection is subject to some harsh restrictions. The average user may have a 2 Mbit or 8 Mbit connection, but unless you're on a special deal with a low contention ratio, you won't get that rate at peak times. Moreover, most ISPs have tiny monthly bandwidth caps that this service could use up in days.
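To put a rough number on the caps point (the cap size and hours per day below are illustrative assumptions; the 5 Mbit/s figure is the upper end of the claim above):

```cpp
// Rough figures for the bandwidth-cap objection: a steady 5 Mbit/s
// stream (upper end of the claimed range) against an assumed 2009-era
// 50 GB monthly cap, at an assumed two hours of play a day.
#include <cstdio>

int main()
{
    const double streamMbit = 5.0;
    const double gbPerHour = streamMbit / 8.0 * 3600.0 / 1000.0; // ~2.25 GB/h
    const double hoursPerDay = 2.0;
    const double monthlyCapGb = 50.0;

    double daysUntilCap = monthlyCapGb / (gbPerHour * hoursPerDay);
    std::printf("~%.2f GB/hour -> a %.0f GB cap lasts ~%.0f days\n",
                gbPerHour, monthlyCapGb, daysUntilCap);
    return 0;
}
```

About 2.25 GB per hour of play, so a 50 GB cap is gone in roughly eleven days of casual use - consistent with "used up in days" for heavier players.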
|

Polly Prissypantz
Dingleberry Appreciation Society
|
Posted - 2009.04.05 11:39:00 -
[22]
Originally by: Nyphur If you're referring to OnLive, your point has already been made moot by their current claims. If you aren't, then please excuse the rest of this post:
You can't just assume they're using the same algorithms as the current remote desktop standards; read up on their technical specs. They claim to use a new lossy compression algorithm designed specifically around games, with compression and decompression times in the 1-5 ms range on a small piece of custom hardware. They then claim that the compressed video stream will use 2-5 Mbit of bandwidth. Assuming their claims are real - and the weight of publishers behind them hints that they are - concerns over feasibility are unfounded.
The one complaint against OnLive that actually holds water is that the average user's internet connection is subject to some harsh restrictions. The average user may have a 2 Mbit or 8 Mbit connection, but unless you're on a special deal with a low contention ratio, you won't get that rate at peak times. Moreover, most ISPs have tiny monthly bandwidth caps that this service could use up in days.
This is way off the OP's original subject, but then, this is the Eve forums.
In response to whatever this crap is about OnLive and cloud computing: you're looking at two distinct bottlenecks.
The first is internet bandwidth. You can't look at one connection taking up 2-5 Mbit and say it's feasible. What you need to do is look at thousands of users using that much bandwidth at the same time. It'd be like P2P times a bazillion. Keep in mind that the bandwidth your ISP sells you isn't exclusive to you. They market their available bandwidth on the idea that you, as a user, will only be using it for short periods each day, and they sell the rest of that time to other customers. That's how they work, and how they can run profitably. It may suck, but them's the breaks. The internet, as it stands, is not capable of handling thousands (hundreds of thousands?) of users all consuming that much bandwidth at the same time, especially international users - which is why a lot of telcos/ISPs have been kicking up a fuss over the last couple of years about P2P and video on demand. The internet is already under immense strain to keep up with its users.
The second bottleneck will be server-side hardware. Think of the hardware it takes to run today's modern games locally: those games generally use a high percentage of the machine's available processing power (CPU and GPU), with the operating system and background tasks only a small overhead. Now think of how much hardware the server side is going to need, not only to run copies of these modern games for every user, but also to perform whatever magical compression algorithm they plan to use, at least 30-odd times a second, for every user.
Not feasible with today's (or even tomorrow's) hardware/infrastructure.
|

Nyphur
Pillowsoft Total Comfort
|
Posted - 2009.04.05 12:11:00 -
[23]
Originally by: Polly Prissypantz
In response to whatever this crap is about OnLive and cloud computing, you're looking at two distinct bottlenecks.
The first is internet bandwidth.
That's what I said.
Originally by: Polly Prissypantz
The second bottleneck will be server-side hardware. Think of the hardware it takes to run today's modern games locally: those games generally use a high percentage of the machine's available processing power (CPU and GPU), with the operating system and background tasks only a small overhead. Now think of how much hardware the server side is going to need, not only to run copies of these modern games for every user
You don't need an entire gaming rig at the server for each person connected; that's the point of cloud computing. You give the workload to a server cluster composed of CPU and GPU racks, and it figures out the fastest way to get the computation done. This benefits from contention ratios in the same way that an ISP benefits from contention on broadband connections: the server doesn't need to be able to handle every single user, it just needs enough hardware to handle the number online at peak times without significant performance degradation. You also need to stop thinking about this as a big room full of gaming computers somewhere; the server architecture for this project is almost certainly far more efficient than that. There are things you can do with a well-designed server rack architecture that you just can't with a network of home computers, due in part to high-speed data links like InfiniBand and fast-access RAM disks.
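The contention argument in rough numbers (every figure below is made up for illustration):

```cpp
// Nyphur's contention point, numerically: a cloud service provisions
// for peak concurrent sessions, not one rig per subscriber. All the
// figures here are illustrative assumptions.
#include <cmath>
#include <cstdio>

int main()
{
    const double subscribers = 1000000.0;
    const double peakFraction = 0.10;    // 10% online at the busiest hour
    const double sessionsPerNode = 4.0;  // game sessions one GPU node hosts

    double peakSessions = subscribers * peakFraction;
    double nodesNeeded = std::ceil(peakSessions / sessionsPerNode);

    std::printf("%.0f peak sessions -> ~%.0f nodes, not %.0f home rigs\n",
                peakSessions, nodesNeeded, subscribers);
    return 0;
}
```

Under those assumptions, a million subscribers need about 25,000 shared nodes rather than a million dedicated machines, which is the efficiency being claimed.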
Originally by: Polly Prissypantz ...but also to perform whatever magical compression algorithm they plan to use, at least 30-odd times a second, for every user.
Their compression algorithm allegedly runs fast enough on a cheap piece of external hardware that it isn't an issue. It's just an additional 1-2 ms of delay, which in terms of a 60-frames-per-second game is more than acceptable.
Originally by: Polly Prissypantz Not feasible with today's (or even tomorrow's) hardware/infrastructure.
People have been saying the same thing about EVE Online's server architecture for the past five years: that it's not feasible with today's technology and they'll eventually have to shard the server. The fact is that unless you're working on OnLive or doing active research into cloud computing, you're not even qualified to assess the feasibility of the OnLive project. Neither am I, of course. But I am willing to admit that if their current claims are true, the service does appear feasible from my limited technical perspective. The only arguments I've seen for its infeasibility have all been assertions that their claims are false, but unless you're working on the project and know this for a fact, you can't reasonably make that assertion.
If you want to prove it can't work, you need to start from the assumption that they're telling the truth about the technical aspects and come up with reasons why it still won't work. One such reason may be that the average user has a bandwidth cap. Another may be that high contention ratios on current home broadband limit bandwidth at peak times, and that major broadband providers are throttling the hell out of certain services. But you can't just assert that it won't work because you don't believe they have the technical expertise to overcome the processing and compression demands, because they've already asserted that they do.
|

Lord WarATron
Amarr Shadow Reapers DAMAGE INC...
|
Posted - 2009.04.05 12:23:00 -
[24]
Edited by: Lord WarATron on 05/04/2009 12:23:56
I remember when people said it was impossible to get more than 2800 baud out of a modem. How could a phone wire carry more?
I remember when Bill Gates said that all we needed was 640K of memory.
I remember when people said downloading high-quality graphic pictures over the internet in a reasonable time was impossible, back when people used Usenet and telnet.
I remember when people said that dynamic database links to web pages were impossible.
I remember when people said it was impossible to have a 1 Mbit net connection over a phone line/broadband, let alone a 50 Mbit connection.
I remember when wireless was considered impossible, since "radio would interfere with it".
I remember when the idea of streaming video through the internet was considered impossible.
Lesson: what might not look realistic today can become very realistic after a time period. A few years ago, the concept of YouTube would have been seen as impossible. It's never a matter of "if", just a matter of "when".
|

Joshua Foiritain
Gallente Coreli Corporation
|
Posted - 2009.04.05 12:34:00 -
[25]
Originally by: Nyphur People have been saying the same thing about EVE Online's server architecture for the past five years, that it's not feasible with today's technology and they'll eventually have to shard the server.
The only people I've ever heard saying that are random forum noobs. ---------------------------
[Coreli Corporation] |

Nyphur
Pillowsoft Total Comfort
|
Posted - 2009.04.05 12:39:00 -
[26]
Originally by: Joshua Foiritain
Originally by: Nyphur People have been saying the same thing about EVE Online's server architecture for the past five years, that it's not feasible with today's technology and they'll eventually have to shard the server.
The only people I've ever heard saying that are random forum noobs.
I used to get it all the time from people at university whom I talked to about EVE - people who should really know better after seeing the leaps technology has made in the past few years. It just seems alien to so many people that the server (on both the hardware and software sides) can be constantly updated ahead of the expanding playerbase. I find people who play other MMOs also have a hard time grasping the concept.
Remember when EVE could only handle about 16k people at once and started to lag with more than 100 in system?
|

therealdhs
|
Posted - 2009.04.05 14:06:00 -
[27]
Edited by: therealdhs on 05/04/2009 14:05:54
Originally by: Jasahl Toruken I used to log in via Logmein.com and play remotely; it only worked well enough to do autopilot, skill switching, and other non-combat stuff. Now all I get is a black screen, unless I leave the game running first and then remote in. I know it's not intended to run via remote desktop, and they probably did it to keep people from getting into the game from their work computers. With the skill queue I don't really need to log in remotely to change skills anymore, and that was my primary reason for doing it.
This is what I do when I'm away from my computer. Eve runs well; the only issue I have is that it goes at 4 FPS. Definitely not enough for running a mission or PvPing, but I've been able to semi-AFK mine and chat with my friends without any issues.
As for your black screen: close the box in the upper-right corner that says who's connected to the machine. That seems to be the only conflict that messes with the game's video that I've found. -------- Bender: Ahhh, what an awful dream. Ones and zeroes everywhere... and I thought I saw a two. Fry: Don't worry, Bender: there's no such thing as two.
|

Xianbei
|
Posted - 2009.04.05 15:57:00 -
[28]
Edited by: Xianbei on 05/04/2009 15:57:49
Originally by: jetuserII The check has to be there because, on Windows, when you remote in, the driver information (i.e. the hardware registry key) switches over to a "virtual machine"; when Eve tries to launch and gain access to the hardware, it only sees Remote Desktop and fails. The reason it works if you have the game running beforehand is that the information/access was already set up prior to connecting. And of course it works with most types of VNC servers; they basically just capture the system view and send it to you, so you're essentially watching a movie of your computer.
This is essentially... well, no, it's totally wrong and not how Terminal Services works.
/CCP, I don't need RDP to play per se, just to change skills remotely... please allow it again.
|

Jim McGregor
|
Posted - 2009.04.05 16:08:00 -
[29]
Remote Desktop allows you to share the Eve installation with your friends. Perhaps that's why. :)
|

Polly Prissypantz
Dingleberry Appreciation Society
|
Posted - 2009.04.06 03:05:00 -
[30]
Originally by: Jim McGregor Remote Desktop allows you to share the Eve installation with your friends. Perhaps that's why. :)
Hmmm, I never thought of that. It's not really better than just sharing your login details, but by the same token you're not sharing your login details, plus you could have multiple different users on one account and CCP wouldn't be able to track the IPs used to log in, since the actual client will always be coming from the same IP. Sneaky sneaky.
|

Another Forum'Alt
Gallente Center for Advanced Studies
|
Posted - 2009.04.06 03:22:00 -
[31]
Originally by: Andrest Disch
Originally by: Furb Killer Onlive reality check
I'd be more inclined to believe Eurogamer if they had ever actually used OnLive.
Rebuttal.
Oh yes, because BBC News knows so much about technology... </sarcasm>
(The same goes for 99% of dead-tree media as well.) Guide to forum posting
|

Taedrin
Gallente Nabaal Engineering of Haarsuk
|
Posted - 2009.04.06 03:22:00 -
[32]
Edited by: Taedrin on 06/04/2009 03:23:30
Originally by: Nyphur
Originally by: Polly Prissypantz Not feasible with today's (or even tomorrow's) hardware/infrastructure.
People have been saying the same thing about EVE Online's server architecture for the past five years: that it's not feasible with today's technology and they'll eventually have to shard the server.
For all intents and purposes, EVE's servers ARE sharded. Each system is its own shard. The only special thing EVE does is perform character transfers between shards on the fly (each time you do a "session change"). This is aided by the fact that all of these "shards" connect to the same database. To put it in layman's terms: Tranquility is NOT a single computer. It is made up of a LOT of computers. From what we've seen, a single "node" can only handle around 2000 concurrent users - something WoW has been doing for years now.
|

Polly Prissypantz
Dingleberry Appreciation Society
|
Posted - 2009.04.06 03:24:00 -
[33]
Originally by: Nyphur
Originally by: Polly Prissypantz Not feasible with today's (or even tomorrow's) hardware/infrastructure.
People have been saying the same thing about EVE Online's server architecture for the past five years: that it's not feasible with today's technology and they'll eventually have to shard the server. The fact is that unless you're working on OnLive or doing active research into cloud computing, you're not even qualified to assess the feasibility of the OnLive project. Neither am I, of course. But I am willing to admit that if their current claims are true, the service does appear feasible from my limited technical perspective. The only arguments I've seen for its infeasibility have all been assertions that their claims are false, but unless you're working on the project and know this for a fact, you can't reasonably make that assertion.
I have no doubt that several years from now we may well have the technology to pull off cloud computing for today's modern games, but at the same time I keep in mind that games are always pushing the hardware boundary. So "modern" games several years from now will still be resource hogs, not to mention we'll probably all be running our home TV sets at SuperBBQDuperHD resolutions.
I don't necessarily think that cloud computing won't work, simply that it won't work for modern gaming. I'm sure I'd quite enjoy playing Master of Magic on my home TV at 320x200 resolution.
Quote: If you want to prove it can't work, you need to start from the assumption that they're telling the truth about the technical aspects and come up with reasons why it still won't work. One such reason may be that the average user has a bandwidth cap. Another may be that high contention ratios on current home broadband limit bandwidth at peak times, and that major broadband providers are throttling the hell out of certain services. But you can't just assert that it won't work because you don't believe they have the technical expertise to overcome the processing and compression demands, because they've already asserted that they do.
If I wanted to prove it can't work, I wouldn't be sitting here talking about it on the Eve forums. (Not to mention logic 101: you can't prove a negative.)
There is a good reason we moved away from thin clients and centralized servers for most tasks 20-odd years ago: it was simply no longer cost-effective compared to using local machines. And that's what I still believe today. Maybe they can pull it off on a small scale, but at any kind of scale that would actually be in demand, it will simply not be cost-effective. Granted, the concept of thin clients has been making a comeback recently thanks to the likes of Google, but that is still only really for run-of-the-mill office-type activities, not high-end gaming.
Ever heard of the Phantom Game Console? I'm quite happy to lump OnLive in with that until I see proof of it working. OnLive will get all this funding from venture capital firms, spend a few years "working" on it, the execs and staff will all take home nice pay packets, and then it will fail (or become some niche, low-market-share business performing a side project only loosely related to their initial claims).
So sayeth teh prissypantz.
|

Another Forum'Alt
Gallente Center for Advanced Studies
|
Posted - 2009.04.06 03:38:00 -
[34]
Onlive will not only fail, but when they go bust, all the users who bought games from them will be left with nothing, because they never actually owned the games. Guide to forum posting
|

Taedrin
Gallente Nabaal Engineering of Haarsuk
|
Posted - 2009.04.06 04:29:00 -
[35]
Edited by: Taedrin on 06/04/2009 04:34:55
Edited by: Taedrin on 06/04/2009 04:31:00
Edited by: Taedrin on 06/04/2009 04:29:52
Originally by: Polly Prissypantz (not to mention logic 101: you can't prove a negative).
Not true. In logic, a negative (i.e. FALSE) can easily be proved, depending on the circumstances and the given boolean variables. EXAMPLE: Prove that X != FALSE, given (X => Y) = FALSE and Y = FALSE.
1: Y = FALSE (given)
2: X => Y = FALSE (given)
3: !X v Y = FALSE (definition of implication)
4: !X v FALSE = FALSE (substitution)
5: !(X ^ TRUE) = FALSE (De Morgan's law)
6: X ^ TRUE = TRUE (negation + algebra)
7: X = TRUE (see truth table)
8: X != FALSE (X is TRUE, so it cannot also be FALSE at the same time; this is part of the definition of boolean logic)
TRUTH TABLE:
X | TRUE | X ^ TRUE
T | T | T
F | T | F
This is logic in its rawest form. There is no arguing - only postulates, givens, definitions, and the resulting theorems. All logic is, is taking a statement and reducing it to another statement which is equivalent in value.
Furthermore, Nyphur is talking about another method of constructing a proof. When trying to DISPROVE something, you assume that the thing you are trying to disprove is true. You then apply various laws (or, because we are actually talking about rhetoric here, pieces of evidence) which conflict with the assumption. Once a contradiction is found, you have disproved the assumption. EXAMPLE:
Disprove X = TRUE, given (X => Y) = TRUE and Y = FALSE.
1: X = TRUE (assumption)
2: X => Y = TRUE (given)
3: Y = FALSE (given)
4: X => FALSE = TRUE (substitution)
5: TRUE => FALSE = TRUE (substitution)
6: !TRUE v FALSE = TRUE (definition of implication)
7: FALSE v FALSE = TRUE (negation)
8: FALSE = TRUE (idempotency) - CONTRADICTION
Therefore X != TRUE, because X = TRUE would imply FALSE = TRUE.
EDIT: Forgot about the application of algebra in the first proof.
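For what it's worth, the two derivations above in compact sequent form (my rendering, not part of the original post):

```latex
% First proof: a false implication forces its antecedent true.
% Second proof: a true implication with a false consequent forces its
% antecedent false (modus tollens).
\[
  \lnot(X \rightarrow Y) \;\vdash\; X
  \qquad\qquad
  (X \rightarrow Y),\ \lnot Y \;\vdash\; \lnot X
\]
```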
|

Polly Prissypantz
Dingleberry Appreciation Society
|
Posted - 2009.04.06 05:06:00 -
[36]
Edited by: Polly Prissypantz on 06/04/2009 05:10:28
Originally by: Taedrin
Originally by: Polly Prissypantz (not to mention logic 101: you can't prove a negative).
Not true. In logic, a negative (i.e. FALSE) can easily be proved, depending on the circumstances and the given boolean variables. EXAMPLE: Prove that X != FALSE, given (X => Y) = FALSE and Y = FALSE.
1: Y = FALSE (given)
2: X => Y = FALSE (given)
3: !X v Y = FALSE (definition of implication)
4: !X v FALSE = FALSE (substitution)
5: !(X ^ TRUE) = FALSE (De Morgan's law)
6: X ^ TRUE = TRUE (negation + algebra)
7: X = TRUE (see truth table)
8: X != FALSE (X is TRUE, so it cannot also be FALSE at the same time; this is part of the definition of boolean logic)
TRUTH TABLE:
X | TRUE | X ^ TRUE
T | T | T
F | T | F
This is logic in its rawest form. There is no arguing - only postulates, givens, definitions, and the resulting theorems. All logic is, is taking a statement and reducing it to another statement which is equivalent in value.
Furthermore, Nyphur is talking about another method of constructing a proof. When trying to DISPROVE something, you assume that the thing you are trying to disprove is true. You then apply various laws (or, because we are actually talking about rhetoric here, pieces of evidence) which conflict with the assumption. Once a contradiction is found, you have disproved the assumption. EXAMPLE:
Disprove X = TRUE, given (X => Y) = TRUE and Y = FALSE.
1: X = TRUE (assumption)
2: X => Y = TRUE (given)
3: Y = FALSE (given)
4: X => FALSE = TRUE (substitution)
5: TRUE => FALSE = TRUE (substitution)
6: !TRUE v FALSE = TRUE (definition of implication)
7: FALSE v FALSE = TRUE (negation)
8: FALSE = TRUE (idempotency) - CONTRADICTION
Therefore X != TRUE, because X = TRUE would imply FALSE = TRUE.
EDIT: Forgot about the application of algebra in the first proof.
WTF?
Ladies and Gentlemen, this is Chewbacca. Chewbacca is a Wookie from the planet Kashyyk...
|

Agent Known
Apotheosis of Virtue
|
Posted - 2009.04.06 05:32:00 -
[37]
A lot of other games refuse to run over RDP because of the lack of hardware acceleration. In a remote desktop session you're limited to a "virtual" graphics card (the RDPDD Chained DD driver, I think it's called) with zero hardware graphics support.
Some games MIGHT run (say, OpenGL games), but they'll play like a slideshow. I got Vista's Solitaire to run over a local network, but that's a different story.
It would be nice to have Eve via remote desktop so I can switch skills if I'm gone longer than expected... guess I'll just leave a client running when I leave, then.
|

Nyphur
Pillowsoft Total Comfort
|
Posted - 2009.04.06 10:21:00 -
[38]
Originally by: Polly Prissypantz
I have no doubt that several years from now we may well have the technology to pull off cloud computing for today's modern games, but at the same time I keep in mind that games are always pushing the hardware boundary. So "modern" games several years from now will still be resource hogs, not to mention we'll probably all be running our home TV sets at SuperBBQDuperHD resolutions.
I don't necessarily think that cloud computing won't work, simply that it won't work for modern gaming. I'm sure I'd quite enjoy playing Master of Magic on my home TV at 320x200 resolution.
My point is that you're not qualified in the field of cloud computing research to make that assessment. Nor have you tested the project to assess viability. The people working on it say it works and claim to have overcome all of the major technical hurdles. In the absence of evidence to the contrary, just saying "I don't believe it will work" is meaningless. My previous point stands: the only argument you've come up with is to assert that their claims are false, but to truly disprove viability with the limited information and technical expertise either of us has, the only viable means is to assume their claims are true and show that the scenario still leads to fault.
Originally by: Polly Prissypantz
If I wanted to prove it can't work, I wouldn't be sitting here talking about it on the Eve forums. (not to mention logic 101: you can't prove a negative).
Sure you can, it's called proof by contradiction (logic 101).
|

Larg Kellein
Caldari Agony Unleashed Agony Empire
|
Posted - 2009.04.06 10:49:00 -
[39]
Originally by: Andrest Disch
Originally by: Furb Killer Onlive reality check
I'd be more inclined to believe Eurogamer if they had ever actually used OnLive.
Rebuttal.
"Mr Perlman, who led the early developments into video streaming service QuickTime" Rebuttal of your rebuttal, from your rebuttal.
|

Ankhesentapemkah
Gallente State Protectorate
|
Posted - 2009.04.06 10:51:00 -
[40]
I used to play Factional Warfare all day on a 500 MHz computer connected to my desktop through VNC. It works without any problems, except that I only had 16 colours and 1 FPS due to bandwidth constraints.
So an artificial check on one specific type of software sounds rather stupid to me. ---
|

Polly Prissypantz
Dingleberry Appreciation Society
|
Posted - 2009.04.06 11:19:00 -
[41]
Originally by: Nyphur
Originally by: Polly Prissypantz
I have no doubt that several years from now we may well have the technology to pull off cloud computing for today's modern games, but at the same time I keep in mind that games are always pushing the hardware boundary. So "modern" games several years from now will still be resource hogs, not to mention we'll probably all be running our home TV sets at SuperBBQDuperHD resolutions.
I don't necessarily think that cloud computing won't work, simply that it won't work for modern gaming. I'm sure I'd quite enjoy playing Master of Magic on my home TV at 320x200 resolution.
My point is that you're not qualified in the field of cloud computing research to make that assessment. Nor have you tested the project to assess viability. The people working on it say it works and claim to have overcome all of the major technical hurdles. In the absence of evidence to the contrary, just saying "I don't believe it will work" is meaningless.
Rubbish. I could go out tomorrow and claim that I've found a cure for cancer and have spent the last 3 years in "stealth mode" researching it. Now all I need is a bit more funding to complete my research, and the world will be saved. Since you have no proof that I haven't cured cancer, are you going to assume that I have indeed done so, on the premise that not believing my claim would be "meaningless"?
Quote: My previous point stands: the only argument you've come up with is to assert that their claims are false, but to truly disprove viability with the limited information and technical expertise either of us has, the only viable means is to assume their claims are true and show that the scenario still leads to fault.
That's if I were even trying to prove that it won't work. I'm not, and have said as much. I am expressing my opinion that I don't think it will work, based on my limited knowledge. The burden is not mine to prove them wrong. They are the ones making the claim, so the burden is theirs to prove their claims true.
Originally by: Nyphur
Originally by: Polly Prissypantz
If I wanted to prove it can't work, I wouldn't be sitting here talking about it on the Eve forums. (not to mention logic 101: you can't prove a negative).
Sure you can, it's called proof by contradiction (logic 101).
Perhaps you meant to link to this. I'm more than willing to concede that cloud computing is both technically and economically viable once it has been proven; until then, all I see is someone out to make a bit of cash from willing venture capitalists. Call me a cynic.
|

Nyphur
Pillowsoft Total Comfort
|
Posted - 2009.04.06 11:56:00 -
[42]
Edited by: Nyphur on 06/04/2009 11:59:21
Originally by: Polly Prissypantz Rubbish. I could go out tomorrow and claim that I've found a cure for cancer and have spent the last 3 years in "stealth mode" researching it. Now all I need is a bit more funding to complete my research and the world will be saved. Since you have no proof that I haven't cured cancer, are you going to assume that I have indeed done so on the premise that not believing my claim would be "meaningless"?
You're extending the analogy to the point of absurdity. The point I am making is that you are not competent to assess the feasibility of the OnLive project on a technical basis. You're not even close to being competent in the fields they're working in, and neither am I. The only arguments you can make that aren't bull**** must assume that the technical aspects are true. If you can show that, given their current claims, there are OTHER problems with the system, THEN you have shown the infeasibility of the system. If you were competent in the field of cloud computing or were on the team developing their compression algorithm, then you'd be credible to discuss the technical feasibility of the project. As it is, you're not, and if you want to show that it's infeasible, you'll have to use other arguments.
To use your cancer example: if a company of cancer researchers claimed to have a new cancer treatment drug and had signed on every major pharmaceutical company, you certainly aren't qualified to say their drug can't possibly work. Let's say they claim their drug kills 97% of cancer cells and 0.1% of normal cells in test subjects. You quite simply do not have the technical expertise to disprove that. You're not a cancer researcher, and your opinion on the technical aspects of that research is worthless. The same is true of your opinions on the technical feasibility of the OnLive project. Your opinions on other problems, though, may be credible.
To draw a parallel: if I started telling you what kind of engine modifications would improve your car's fuel efficiency, you should flat-out ignore me. I barely know how an internal combustion engine works and have no experience as a mechanic, so my opinion on the matter is worth absolutely nothing.
Originally by: Polly Prissypantz
That is if I was even trying to prove that it won't work. I'm not, and have said as much. I am expressing my opinion that I don't think it will work based on my limited knowledge. The burden is not mine to prove them wrong. They are the ones making the claim so the burden is theirs to prove their claims true.
That makes sense, and that's fine. But the problem I have with your opinion here is that you've been trying to back it up with arguments you're not credible to make. Claims that the processing load is infeasibly high, that the compression is too CPU-intensive, or that the bandwidth requirements are higher than they suggested are not arguments any of us is credible to make. If you want to hold an opinion on the project's feasibility, go right ahead, but if you start trying to back up your opinion with assertions you can't prove, then you're as bad as the writer of that Eurogamer article.
Originally by: Nyphur this.
No, that suggests I'm claiming that OnLive does work and citing the lack of evidence to the contrary as proof, which would be wholly illogical. What I'm suggesting is that none of us can attest to the project's technical feasibility or infeasibility without knowing some (as yet) secret details or seeing it in action. In the absence of this, we can formulate hypotheses about the project's infeasibility by assuming the technical claims are accurate and then showing that this still leads to an infeasible end product. I've suggested two reasons why the project might not work even if the technical aspects function: bandwidth caps, and advertised bandwidth being lower than expected due to high contention ratios, etc.
|

Buga Buga
Sajuuk Fleet Crimson Dragons
|
Posted - 2009.04.06 12:00:00 -
[43]
What? I've used a ton of remote desktops to play Eve since... err... forever o_O This is news to me.
|

Taedrin
Gallente Nabaal Engineering of Haarsuk
|
Posted - 2009.04.06 14:59:00 -
[44]
Originally by: Polly Prissypantz
WTF?
Ladies and Gentlemen, this is Chewbacca. Chewbacca is a Wookie from the planet Kashyyk...
You made a claim: that logic dictates you cannot prove a negative. I was refuting your claim by showing two examples where logic is used to do exactly that. The first example proved that a boolean variable was NOT equal to a certain value. The second example was a "disproof" - a proof constructed to show that an assumption is wrong because it causes a contradiction with a known truth.
TL;DR version: don't claim to know anything about logic unless you've actually taken a logic class; you might get called out on it by someone who has actually studied logic a bit (granted, it's been a few years).
|

Callista Omenswarm
Astronautical Engineering
|
Posted - 2009.05.11 13:25:00 -
[45]
/signed.
Another Vista/2008 user who wants to play via RDC at work (running it locally slows my work machine down too much).
So VNC works, huh? Might have to give it a spin, though the last time I used it, it had some giant security hole that bypassed the password... I logged on to find all kinds of crap and spyware on the machine. Best keep those OS discs handy, methinks.
|

Zoukanix
Caldari
|
Posted - 2009.05.21 17:23:00 -
[46]
Originally by: Jim McGregor
Remote Desktop allows you to share the Eve installation with your friends. Perhaps that's why. :)
I'm not sure I see the problem with this one, as you'd all need an account anyway; besides, playing Eve via RDP would not exactly be speedy. However, it would be useful for skill checking etc.
I would like to see this working!
|