Gryzemuis on 26/3/2009 at 14:48
Quote Posted by Al_B
Unless my calculations are wrong light travels at approx 300km (just under 200 miles) per millisecond in the
best of cases. A round trip would then limit the maximum theoretical distance between player and server to 150km.
You are almost correct.
Speed of light is 300k km/sec. That's 300 km/millisecond.
However, that is the speed of light through vacuum.
The speed of light through fibre is around 0.6 times the speed through vacuum.
In other words, your internet packet travels at around 180 km per millisecond.
Of course, there are more delays on top of that. One is the transmission delay (the time between the first bit of a packet going out and the last bit). That mostly matters at low link speeds. (Example: sending a 1500-byte packet over a 64 Kbps line takes ~190 milliseconds. Not negligible.)

But the most noticeable delay is queuing. A router can transmit only one packet at a time on a given link. If multiple packets arrive at the same time and need to go out over the same link, they have to wait. Good routers will queue packets for anywhere between 0 and 200 milliseconds. Note that all ISPs want some queuing at each router: if there were no queuing at all, it would mean they had overprovisioned their network too much, and were paying too much for bandwidth.
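For anyone who wants to check the transmission-delay figure, here's a minimal Python sketch (the 1500-byte packet and the 64 Kbps line are just the example values from above; queuing delay comes on top of this and varies per router):
Code:
# Serialization (transmission) delay: the time between the first and the last
# bit of a packet leaving an interface. Queuing delay is extra (anywhere from
# ~0 to ~200 ms per router, as described above).

def transmission_delay_ms(packet_bytes: int, link_bps: float) -> float:
    return packet_bytes * 8 / link_bps * 1000

print(transmission_delay_ms(1500, 64_000))      # ~187.5 ms on a 64 Kbps line
print(transmission_delay_ms(1500, 10_000_000))  # ~1.2 ms on a 10 Mbps line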
So let's look at an average customer: 1000 km away from the server, 10 hops. The 10 hops would introduce maybe 25 ms of delay. That's a very rough estimate. 1000 km / 180 km per millisecond ≈ 6 milliseconds. Times two, because packets have to travel back. Result: 60 milliseconds RTT. I think this is a very reasonable number. A number many people will recognize as their average lowest ping to the average server they play on.
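The same back-of-the-envelope estimate in code (the 180 km/ms fibre speed and the 2.5 ms average per-hop delay are the assumptions from this post, not measurements):
Code:
# Rough round-trip time estimate: propagation through fibre plus per-hop delay.

FIBRE_KM_PER_MS = 180  # ~0.6 c through fibre

def rtt_ms(distance_km: float, hops: int, per_hop_ms: float = 2.5) -> float:
    one_way_ms = distance_km / FIBRE_KM_PER_MS + hops * per_hop_ms
    return 2 * one_way_ms

print(rtt_ms(1000, 10))  # ~61 ms -- the "60 ms RTT" figure above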
Now the 1 millisecond these guys are talking about is the extra delay from encoding the frame into H.264 or MPEG-2. And we know that number is gonna be higher in practice. But that 1 ms is not the absolute delay. It's the delay added on top of the normal network delay.
Now there is one more very important factor about lag.
The lag you know from playing games online is the lag between firing your gun, and hitting someone on the server. That lag can be irritating. But the lag you will see in OnLive is totally different.
When is it determined what you see on screen?
In an offline game, when you move your mouse, you immediately look in the new direction. Even if you play online and your shots are lagging, your view still responds immediately to your mouse.
Now imagine what happens with OnLive. The rendering of what you see can only be done after the server has received the packet that tells it your mouse has moved. And the newly rendered frame can only be displayed on your screen after it has been sent from the server to your PC. This means the result of your mouse movement will show up with the full delay of your connection!
Suppose you have 50 ms network delay. (Already very good).
Suppose OnLive can encode frames in 10 ms. (Wild guess).
Suppose OnLive renders 50 fps for you.
That means each frame takes 20 milliseconds to render.
Now suppose you move your mouse.
The packet hits the server 25 ms later.
Suppose the server just started rendering a new frame.
It has to wait 20 ms until it can take that mouse action into account.
It then needs 20 ms to render a new frame.
It then needs 10 ms to encode the frame.
It then needs 25 ms to send the frame back to you.
That's a total delay of 100 ms, even though your "ping" is only 50 ms.
So in reality, I think this will mean that your mouse movements will be delayed by 100-200 ms, even for the best real-world situations. I think that will be pretty much unplayable.
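To make that arithmetic explicit, here's a minimal sketch of the motion-to-photon delay under the assumptions above (the numbers are the same guesses as in the example, not measured values):
Code:
# Rough motion-to-photon delay for a streamed game, using the assumed numbers above.

one_way_network_ms = 25   # half of an (already optimistic) 50 ms round trip
frame_time_ms      = 20   # 50 fps -> 20 ms per frame
wait_for_frame_ms  = 20   # worst case: input arrives just after a frame started
encode_ms          = 10   # wild guess for H.264/MPEG-2 encoding time

total_ms = (one_way_network_ms    # input packet travels to the server
            + wait_for_frame_ms   # server finishes the frame it was already rendering
            + frame_time_ms       # server renders the frame that reflects your input
            + encode_ms           # frame is compressed
            + one_way_network_ms) # compressed frame travels back to you

print(total_ms)  # 100 ms, even though the "ping" is only 50 ms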
Gryzemuis on 26/3/2009 at 14:56
One more thing. About the required bandwidth this time.
According to http://www.mythtv.org/wiki/Configuring_HDTV, a full HD stream from a Blu-ray disc can take up to 40 Mbps. HDTV television broadcasts take up to 18 Mbps.
That means that when OnLive sends you a 5 Mbps stream, the quality will already be degraded a lot.
Also, note that in movies and television shows, compression depends a lot on the picture being fairly static. A frame is sent, and then for a while only the changes are sent. And then a whole new frame is sent again, followed by the changes. This works well in scenes where you see a face talking, and the face itself and background hardly change. Only the lips moving need to be sent. Anyway, I think most of you understand what I'm saying.
In videogames that is different. There is a lot more action. You move the camera a lot. Some people even do a lot of wild swings left and right to try to detect enemies. I would assume that the change in picture in videogames is a lot higher than the change in movies or tv. As a result, the bandwidth requirements of OnLive will be a lot higher than movie streaming.
You need 40 Mbps for Blu-ray quality movie viewing. I bet you need 60-100 Mbps for the same picture quality in a game. If you get only 5 Mbps (5-8% of that), I bet the picture quality will be pretty bad. Yes, I know they promise only 720p, not 1080p. But still, if you have a 1080p screen, the quality is gonna suck.
Congratulations. You just paid 500 to 2000 euros/dollars for a nice screen. And your gaming service sends you blurry crap.
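As a rough illustration of how much compression 5 Mbps implies (simple arithmetic; the 24-bit colour and 60 fps figures are assumptions for the sake of the example):
Code:
# Raw vs. streamed bitrate for 720p at 60 fps, assuming 24-bit colour.

width, height, bits_per_pixel, fps = 1280, 720, 24, 60

raw_bps    = width * height * bits_per_pixel * fps
stream_bps = 5_000_000  # OnLive's advertised 5 Mbps

print(raw_bps / 1e6)         # ~1327 Mbps of raw pixels
print(raw_bps / stream_bps)  # ~265:1 compression needed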
Koki on 26/3/2009 at 15:00
Quote Posted by LittleFlower
Result: 60 milliseconds RTT. I think this is a very reasonable number.
About as reasonable as stating that everyone has a fibre connection to their ISP.
Gryzemuis on 26/3/2009 at 15:03
And my last contribution for today.
I wish people would remember The Twelve Networking Truths (http://tools.ietf.org/rfc/rfc1925.txt).
They are so simple. They are so true.
And yet people keep ignoring them.
See truth number 2:
Quote:
No matter how hard you push and no matter what the priority, you can't increase the speed of light.
And yet, people still seem to ignore the impact that the speed of light has on their ideas. They keep thinking: "my idea is so awesome, I will almost get it working today. I'm sure technology will come up with something soon to increase the speed of light".
Gryzemuis on 26/3/2009 at 15:11
Quote Posted by LittleFlower
Result: 60 milliseconds RTT. I think this is a very reasonable number.
Quote Posted by Koki
About as reasonable as stating that everyone has a fibre connection to their ISP.
I meant: this is a reasonable best-case number in an area where people are relatively close to the server and where the infrastructure is reasonably good (everyone has access to cable or ADSL or better).
I have 3/0.5 Mbps ADSL. I live outside a small village in the Netherlands. I have reasonable pings to servers in the Benelux, the UK and Germany. That's a 1000 km radius. What exactly counts as a reasonable ping depends on the game itself, as different games use different ways to compute pings. (As an example, I had a stable 25 ms ping to my WoW server in Paris.)
Unfortunately the US has lagged badly in developing its Internet access infrastructure. The backbones have always been awesome, but the access technology is still in the stone age compared to many countries in Europe and Asia. I think this is more about politics, and the lack of true competition in the US, than anything else. I feel sorry for you. And I'm not even talking about all of the US outside the east and west coasts.
steo on 26/3/2009 at 15:58
I think one of the comments on the Wikipedia article for OnLive probably has it right: this is obviously impossible and a scam for shareholders.
This thing would likely get more traffic than YouTube, the cost of millions of high-end computers would be monumental and, as has been discussed, the problem of streaming the video through networks is insurmountable. It is my understanding that, though ISPs may offer 8 megabit download speeds to all their customers, they don't have anything like the capacity to supply everyone with 8 Mb/s simultaneously; hence all the download caps and bandwidth throttling brought about in the UK by the need to offer competitively high internet speeds at competitively low prices.
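To put a number on that oversubscription, here's a small illustration (the 50:1 contention ratio is an assumption for the sake of the example, not a figure from any particular ISP):
Code:
# Why an ISP can't deliver everyone's headline speed at once:
# a simple contention-ratio illustration.

subscribers      = 10_000
headline_mbps    = 8
contention_ratio = 50    # assumed: 50 subscribers share each unit of backhaul capacity

sold_capacity_mbps   = subscribers * headline_mbps            # 80,000 Mbps "promised"
actual_backhaul_mbps = sold_capacity_mbps / contention_ratio  # 1,600 Mbps provisioned

print(actual_backhaul_mbps / subscribers)  # 0.16 Mbps each if everyone streams at once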
Another problem: it would be downright foolhardy for the company to have one machine for each subscriber, so at peak times you would inevitably sometimes be stuck in a queue to play your favourite game.
This links into perhaps their biggest problem, which is how on earth they're going to make this economically viable - if they're going to offer all the latest games, they're going to need to upgrade every one of their machines at least every two or three years. They're also going to have to pay for a huge amount of storage space, air conditioning, power, maintenance, FTL internet etc. On top of that, they'll have to offer competitive prices and somehow turn over enough of a profit to recoup the cataclysmic costs of setting it all up.
Let's say I spend £500 on hardware every three years - that works out at around £15 per month. Granted I have the know-how to build my own machine and all which makes stuff cheaper, but that's the sort of figure they'd have to charge to make it worth considering their service instead of just buying my own machine. On top of that, it inevitably wouldn't work as well as a high-end local machine, so in order to make this worthwhile for someone like me, they'd have to be practically giving it away.
One of the joys of the scam is that it's such a groundbreaking idea that could be completely revolutionary, and yet it requires such a huge investment to set it up. I honestly can't see how they can possibly hope to see a return on that investment.
Gryzemuis on 26/3/2009 at 17:46
It's not gonna fly.
But there are a few things that are in their favor.
The benefit of a server farm is that it is shared by all customers. And not all customers play at the same time. You might think that if you have 1 million customers, you'll need enough hardware to serve 1 million customers at the same time. This is not true. Statistically only a small number of customers will be playing at the same time. So it could be possible they only need 100k units for 1 million customers. My guess would be that the number would be even lower. That means that the cost of hardware might not be as high as you think.
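A back-of-the-envelope version of that (the peak-concurrency fractions are guesses, like the 100k figure above, not OnLive data):
Code:
# Statistical multiplexing: machines needed for a given peak concurrency.

def machines_needed(customers: int, peak_concurrent_fraction: float) -> int:
    return int(customers * peak_concurrent_fraction)

print(machines_needed(1_000_000, 0.10))  # 100,000 machines for 1 million customers
print(machines_needed(1_000_000, 0.05))  # 50,000 if concurrency is even lower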
Another point is that not all customers will be playing the latest and greatest games. Some will want to play their favorite game, which might be older. Or they will want to try old classics. Over time I could see a scheme where they have different pools of server hardware, each with a different amount of compute (graphics) power. And they only need to upgrade 20% of their hardware each year. Customers will automatically be assigned to servers that can just render the game they want to play at that moment. I see a potential for very effective use of the hardware.
I don't know much about encoding video streams. But I would suspect that turning a movie and turning a video game into a video stream are fairly different problems. With a regular movie, you are supplied with a huge set of frames, each consisting of a huge number of pixels. Then your software tries to find regularities in the pixels, so it can describe objects instead of only single pixels. This process takes a lot of compute power. But when you render a videogame, you do the exact opposite. You start with a bunch of objects, and you try to transform them into pixels on a 2D screen.
I think those two could meet somewhere in the middle. When the game is rendering objects, it saves data about those objects and puts it into a new kind of video stream. Then when the receiver gets that stream, it finishes the rendering from objects to pixels. As I said, I'm not an expert, but I could see some interesting opportunities for optimization here. Of course, you would need to develop a whole new codec. And the receiving PCs would still need some non-trivial compute power to transform the video stream into pixels. For example, you'd still need to do the anti-aliasing on the client side.
But still, the fact that they are introducing extra delay, which cannot be removed unless you increase the speed of light, is for me reason enough to believe they can never deliver something that's better and cheaper than local rendering.
IndieInIndy on 26/3/2009 at 17:50
Consider another factor for your math: encoder/decoder look-ahead delays. To get any sufficient level of compression, you'll need to use Mpeg4 or H.264, with P and B frames. That means you have at least one frame of delay for IP-encoding (and two frames for IPB) through the encoder, and another frame of delay in the decoder. If you get tricky with v-sync, you can try to read-write the same frame buffer on each end and trim half-a-frame delay off encode and decode (but you'd better time it perfectly, or tearing and artifacts will abound). In short, 2.5 frames on encode, 0.5 frames decode. In a perfect world.
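Converting those frame counts into milliseconds, a minimal sketch (the 2.5-frame encode and 0.5-frame decode figures are the ones from the paragraph above):
Code:
# Codec pipeline delay in milliseconds for a given frame rate.

def codec_delay_ms(encode_frames: float, decode_frames: float, fps: int) -> float:
    frame_ms = 1000 / fps
    return (encode_frames + decode_frames) * frame_ms

print(codec_delay_ms(2.5, 0.5, 60))  # 50 ms at 60 fps
print(codec_delay_ms(2.5, 0.5, 30))  # 100 ms at 30 fps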
Personally, I think they can make it work. In a technical sense. In a former career, I used to deal with live encoding of multiple video streams and transmitting it all over a network connection. I can see how all of this would work, and there are no technical reasons they can't deliver a working solution. Nothing new under the sun here.
That said, even running at 60 FPS, you have 30-45 millisecond delay through the encoder, another 15-20 ms delay to decode, and that little thing called "ping time".
In practical terms, I'd expect them to make the same sacrifices we always made due to bandwidth: you can have a few high-res streams, or lots of low-res streams. Translation: most users will only be allowed to run at somewhere between 640x480 and 1024x768, and their vaunted 720p @ 60 FPS will only be offered to select customers.
Honestly, I can't see this as ever being feasible for any FPS/action title. Even under optimal conditions (e.g., test system with client machine and server sitting next to each other and plugged into the same router), it would be extremely difficult to get the pixels from the server's graphics card to the client's monitor in under 50-60 milliseconds.
I'm trying to remember the fastest I was ever able to get such a system to work. With full IPB mpeg4, general performance was in the range of 75-90 ms, although that was 30 FPS video. With 60 FPS video that would presumably scale down to around 45 ms, though I'd expect 50+ ms to be more realistic. The frame duration may be shorter, but some factors won't shrink at twice the FPS (buffering, concurrency, and other technobabble). And that's assuming near-zero time over the network.
Mind you, there are tricks that can be played. Use I-frame-only transport and you can trim the encode/decode delay to the neighborhood of 20 milliseconds. But you lose almost all of the compression benefit of mpeg or h.264 -- I-frame-only is effectively nothing more than a sequence of JPEG still frames. On the other hand, it would avoid most of the motion artifacts that would plague the quality of any fast, actiony game. (So yeah, I expect their demo was doing I-frame-only, and probably no one there knew to ask that question.)
You can also drop the frame rate to 30, 15, or 10 FPS, but that won't reduce the lag through the encoder/decoder. You still have to wait the same number of frames through the codec to generate the bitstream.
So take everyone's guesstimates about round-trip time and add 50 milliseconds. Seventy-five is probably a better figure for a usable level of compression.
Aside: OnLive has given me the best technolaugh I've had in quite a while. I can easily see how to implement all of the tech for this, but even more clearly I can see all of the management/sales whitewash they've put on their pitch. There are so many things that no one has mentioned. And won't, for obvious reasons. This smells like an IP grab, with intent to sell out to a big fish before anyone realizes it won't work. Not as presented. And apparently Dave Perry is jumping into the same arena with some patent applications. It's a money grab. And it would make a great system for Peggle/match-three, but not Crysis/BioShock.
steo on 26/3/2009 at 18:05
If they're encoding stuff on the server side, wouldn't that have a similar overhead to recording with Fraps, while still producing file sizes of ~1GB/minute and thus requiring over 100 megabits per second of download speed?
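Checking that conversion (taking 1 GB as roughly 8,000 megabits for round numbers):
Code:
# Convert a ~1 GB/minute capture rate into the download speed it would need.

gb_per_minute = 1
megabits      = gb_per_minute * 8000  # 1 GB ~= 8,000 megabits
mbps_needed   = megabits / 60         # spread over 60 seconds

print(mbps_needed)  # ~133 Mbps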
And as for not needing top of the range hardware for everyone at once, what happens when GTA 5 comes out and 3.6 million people want to play it on the day it's released?
IndieInIndy on 26/3/2009 at 18:27
Depends on how they're encoding the video. Software encode of 720p in Mpeg4 should take up 10-20% of one core, depending on encode settings.
If they're using a custom hardware codec (which they have to, if they're planning on doing H.264, since the fastest software H.264 encoder I know of, Intel's IPP, would not be able to encode 720p60 in real time), then CPU performance won't matter. But a hardware codec in a frame grabber card (the simplest solution) would introduce another v-sync delay to the system, adding to the latency.