EDIT: Richard Leadbetter at Eurogamer has an article with some great “side by side compare” video to show theoretical quality you can achieve with off the shelf compressors right now. He comes to a similar conclusion on the issue of latency, although having seen the video I disagree on the quality (he feels it’s unacceptable) – it’s clearly inferior, but it still provides a very nice experience, especially if you’re sitting 6-12 feet away from the screen.
Latency. You need >25 FPS rendering and you want 60 FPS rendering, especially on an HDTV. But … that’s just for the rendering of frames. The responsiveness of the game, the cycle “see something, make decision, press button, wait for game to process inputs, game changes, see new screen” runs at a rather different rate.
Most humans can easily detect FPS differences up to around 50 FPS. Around 75 FPS seems to be the limit for most people to see any difference at all – even a very subtle "can't quite consciously see the difference". Without looking at the research (based purely on experience), if you show a video that alternates between white and black each frame, we can detect frame rate differences up to perhaps 200 FPS.
(there will probably be some maths mistakes in this post, of course. There always are :))
So, you need to be delivering at 1080p. That’s 2,073,600 pixels, each of which carries 4 bytes of colour information, i.e. about 8MB per frame.
Remember: internet connection / broadband speeds are measured in bits, not bytes, so an “8Mbps” connection (the fast connection available ubiquitously today) is only capable of 1 MB a second.
The fastest European and US home broadband connections I’ve yet seen peak at 24 Mbps, i.e. less than 1 % of the speed required to deliver full-screen HD video uncompressed.
If you look at the specs for those special HDMI cables you use to connect up your HDTV, you’ll see they are measured not in Mbps (like internet) but in Gbps – i.e. one thousand times as fast.
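The uncompressed-bandwidth numbers above are easy to sanity-check. A minimal sketch (assuming 1080p, 4 bytes per pixel, and a 60 FPS target):

```python
# Back-of-envelope check of the uncompressed-HD figures above.
# Assumptions: 1080p, 4 bytes of colour per pixel, 60 FPS target.

WIDTH, HEIGHT = 1920, 1080
BYTES_PER_PIXEL = 4
FPS = 60

pixels = WIDTH * HEIGHT                      # 2,073,600 pixels
bytes_per_frame = pixels * BYTES_PER_PIXEL   # ~8 MB per frame
bits_per_second = bytes_per_frame * 8 * FPS  # uncompressed stream rate

print(f"{bytes_per_frame / 2**20:.1f} MB per frame")
print(f"{bits_per_second / 1e6:.0f} Mbps uncompressed at {FPS} FPS")
print(f"a 24 Mbps line covers {24e6 / bits_per_second:.2%} of that")
```

That works out to roughly 4 Gbps uncompressed – which is why a 24 Mbps line covers well under 1% of it, and why HDMI cables are rated in Gbps.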
Fast Video Data
But … in practice, some videogames tend to exhibit extremely high frame to frame coherency, with the most extreme examples being slow-moving strategy and platform games, and racing games.
I noted that the screenshots for OnLive seem to have quite a few racing games in there. Hmm.
The question now becomes: how many pixels can change in the gap of time from one frame to the next?
Assuming you can go into a spin in your car where the whole car rotates 360 degrees in 3 seconds, revolving around the camera’s centre, and you had a viewing angle of 120 degrees (most racing games tend to be wide-angle), you’d be shifting your 1920 columns of pixels out every second.
i.e. the equivalent of one full frame of new pixel data per second. In bandwidth terms, that's a mere 64 Mbits each second – a 64 Mbps stream. Your fastest home internet connection outside Asia would "only" be too slow by a factor of between 2 and 3. In the best connected cities in the Asian countries with the very best internet infrastructures, you'd be able to manage this OK. Just.
The above assumes two things:
- a “perfect” scenario to make compression as easy as possible: as few pixels change as is theoretically possible
- the only ways you have of setting pixels are copying from an adjacent pixel or downloading it from the server
In practice, of course, neither is true. OnLive's super sekrit proprietary video compression is clearly set up to provide a lot more alternative ways to set pixels, which reduces the data requirements, but on the other hand, in practice, even in a racing game it tends to be much, much harder to retain pixels for copying.
In a platformer, you’re typically fine, but in any kind of moving-camera game, you’re generally screwed.
Custom compression will get you a long way, but don’t expect to see it help much in FPS’s, for instance.
For input processing to precisely match rendering speed of 60 FPS, you’d need a ping time of 16 ms or less. Unlike bandwidth, there’s no way to make up for occasional drops in connection quality by rendering lower quality video for a few frames without having a fat client (this is what fat clients are great for, and is one reason why many people think OnLive is a stupid idea). You’d have to *guarantee* that 16 ms latency worst-case, and so you’d probably be looking at more like 10 ms latency as your requirement (have to handle all sorts of other random internet traffic on your system – AND this assumes you don’t share your internet connection with any home PC’s that do any web browsing).
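The latency budget above falls straight out of the frame rate. A quick sketch (the 6 ms headroom for other traffic is my assumed figure, not a measured one):

```python
# Latency budget for server-side rendering, assuming the full round
# trip must fit inside one 60 FPS frame, as argued above.

FPS = 60
frame_budget_ms = 1000 / FPS       # ~16.7 ms per frame, total
jitter_headroom_ms = 6             # ASSUMED allowance for other traffic

max_usable_ping_ms = frame_budget_ms - jitter_headroom_ms

print(f"frame budget: {frame_budget_ms:.1f} ms")
print(f"worst-case ping you can actually tolerate: ~{max_usable_ping_ms:.0f} ms")
```

Once you reserve headroom for jitter and competing traffic on the home connection, the ~16.7 ms frame budget shrinks to the roughly 10 ms ping requirement described above.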
Much has been made of the claim that OnLive is impossible because the best ping times anyone gets to the internet are measured in tens of milliseconds, and the averages are measured in hundreds of milliseconds.
The one-way trip from London to New York, limited by the speed of light, is physically incapable of ever taking much less than about 20 ms – which puts the round-trip ping at roughly 40 ms at an absolute minimum, before you account for real fibre being slower than a vacuum.
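That physical floor is easy to compute. A sketch, assuming the great-circle distance (~5,570 km) and a typical ~2/3-of-c propagation speed in optical fibre:

```python
# Physical floor on a London <-> New York ping. Assumptions: great-circle
# distance ~5,570 km; fibre carries signals at roughly 2/3 the vacuum
# speed of light, and real routes are longer than the great circle.

DISTANCE_KM = 5570
C_KM_PER_S = 299_792
FIBRE_FACTOR = 2 / 3   # typical slowdown in glass

one_way_vacuum_ms = DISTANCE_KM / C_KM_PER_S * 1000
round_trip_vacuum_ms = 2 * one_way_vacuum_ms
round_trip_fibre_ms = round_trip_vacuum_ms / FIBRE_FACTOR

print(f"one-way, vacuum:    {one_way_vacuum_ms:.1f} ms")
print(f"round trip, vacuum: {round_trip_vacuum_ms:.1f} ms")
print(f"round trip, fibre:  {round_trip_fibre_ms:.1f} ms")
```

So even with a theoretically perfect straight-line vacuum link, a transatlantic ping can never beat ~37 ms; in real fibre it's well over 50 ms – hopeless against a 10 ms budget.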
Ah. But … your ping time to your ISP is measured in single-digit milliseconds. The sensible thing for OnLive to do would be to partner with ISP’s, and pay them substantial amounts of money to host OnLive servers in each of their POP’s around the country, so that the ISP’s consumers get sub 10 ms ping times to OnLive. This works.
Surprisingly few people in the online games industry today seem to have ever realised this was a viable solution to ping time issues. People who were around in the early 1990's remember doing exactly this kind of thing back when ping times were much, much higher – it used to be that every ISP ran its own game servers – but most people who've not lived through that don't seem to ever think of it.
Over the years, I’ve thought of using it, but the only business plan I saw that *demanded* it was the MMOFPS with hundreds or thousands of players per single deathmatch level. Now OnLive comes along and I’d say that OL needs it too. I’m waiting to see an announcement of equity ownership from some major multinational ISP any day now…