A few days ago, Google showed off their new cloud gaming technology, Stadia, to mixed reactions: some people liked it, some people hated it. Personally, I despise the idea of cloud gaming because it's the ultimate form of DRM: you're not even running the game on your own hardware. But I can understand the appeal to some casual gamers.
What really caught my attention and triggered me was this comparison of input lag between the Xbox One X, a PC running a game at 60fps, and Stadia:
I don't own an Xbox One X, so I can't say anything in its defense, but I've been gaming on PC since I was a kid, so I have a lot to say about that. In this comparison, Google claims that a PC running a game at 60fps has an input lag of 100ms. This is a completely false statement, made to make Stadia's shitty 166ms response time look somewhat playable (and keep in mind, that's 166ms on a 1Gbps fiber connection near one of their datacenters; most people are going to see much higher lag). So let's put their lies to the test.
I built a simple device to measure input lag using an Arduino Leonardo clone, a button, and a brightness sensor. Here's how it works.
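The idea is simple: the Leonardo can act as a USB keyboard, so pressing the physical button makes it type a key and start a timer; the brightness sensor, taped over the screen, stops the timer the moment the game reacts by lighting up the display. Here's a minimal sketch of what the firmware looks like (the pin numbers, the key used, and the brightness threshold are illustrative; tune them to your own wiring and monitor):

```cpp
// Arduino Leonardo lag tester: acts as a USB keyboard and measures the
// time between sending a keypress and the screen lighting up under the
// sensor. Pin numbers and threshold are illustrative values.
#include <Keyboard.h>

const int BUTTON_PIN = 2;    // push button to GND, internal pull-up
const int SENSOR_PIN = A0;   // photoresistor voltage divider
const int THRESHOLD  = 512;  // "screen is now white" level, tune per monitor

void setup() {
  pinMode(BUTTON_PIN, INPUT_PULLUP);
  Serial.begin(115200);
  Keyboard.begin();
}

void loop() {
  if (digitalRead(BUTTON_PIN) == LOW) {          // button pressed
    unsigned long t0 = micros();
    Keyboard.press(' ');                         // the "game input"
    while (analogRead(SENSOR_PIN) < THRESHOLD) {
      // busy-wait for the flash; each analogRead takes ~0.1ms, negligible
    }
    unsigned long lag = micros() - t0;
    Keyboard.release(' ');
    Serial.print(lag / 1000.0);
    Serial.println(" ms");
    while (digitalRead(BUTTON_PIN) == LOW) {}    // wait for button release
    delay(500);                                  // let the screen go dark again
  }
}
```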
A simple Java OpenGL application running at 60fps with double buffering enabled will simulate our game: all it needs to do is turn the screen white the moment it sees the keypress.
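The actual test app is Java, but purely as an illustration of the same loop (and to stay in one language with the Arduino code above), here is the equivalent logic sketched in C++ with GLFW; GLFW is my choice here, not necessarily what the Java app is built on:

```cpp
// Sketch of the test application's loop (the real one is Java/OpenGL):
// a vsync'd, double-buffered window that turns white while space is held.
#include <GLFW/glfw3.h>

int main() {
  if (!glfwInit()) return 1;
  GLFWwindow* win = glfwCreateWindow(1280, 720, "lag test", nullptr, nullptr);
  if (!win) return 1;
  glfwMakeContextCurrent(win);
  glfwSwapInterval(1);                 // vsync: one buffer swap per refresh

  while (!glfwWindowShouldClose(win)) {
    glfwPollEvents();                  // input is sampled once per frame
    bool pressed = glfwGetKey(win, GLFW_KEY_SPACE) == GLFW_PRESS;
    float c = pressed ? 1.0f : 0.0f;   // black normally, white while pressed
    glClearColor(c, c, c, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);
    glfwSwapBuffers(win);              // new frame is shown at the next vsync
  }
  glfwTerminate();
  return 0;
}
```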
At 60fps, we expect the lag to come from these factors (rough numbers are summed up right after the list):
- USB polling: the keypress waits a few milliseconds before the OS even sees it.
- Input sampling: the game polls input once per frame, so the press waits anywhere from 0 to 16.7ms for the next frame.
- Double buffering: the finished frame is only shown at the next vsync, up to one more frame (16.7ms).
- Scanout: the display is drawn top to bottom over the refresh interval, so depending on where the sensor sits, another 0 to 16.7ms.
- The monitor itself: processing lag plus pixel response time.
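Taking half a frame (8.3ms) as the average wait for input sampling and for scanout, a full frame for the vsync'd buffer swap, and rough guesses of my own for USB polling and this monitor's processing plus pixel response, the back-of-the-envelope total is:

$$
\underbrace{4}_{\text{USB}} + \underbrace{8.3}_{\text{sampling}} + \underbrace{16.7}_{\text{vsync}} + \underbrace{8.3}_{\text{scanout}} + \underbrace{10}_{\text{display}} \approx 47\,\text{ms}
$$

If Google's 100ms figure were honest, something in this chain would have to be eating another 50ms.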
The test will be conducted by running the application on my desktop PC.
The display used is an AOC Q2770P, a 2014 IPS display. It is NOT a gaming monitor, so this works in Stadia's favor: on a gaming monitor the lag would be much lower, and on a 144Hz panel we would expect less than 30ms.
And here's the actual input lag of a PC running a game at 60fps on this display:
As you can see, the actual input lag is around 50ms, and that's on a non-gaming display; on a gaming monitor we'd be looking at much lower numbers.
What a surprise, Google lied to make their shitty service look better. I expected nothing less from the most evil tech company on the planet.
If you think my experiment was unfair, or that there were other factors I ignored, then by all means get in touch with me or leave a comment; I will be happy to publish an update, especially if you were involved in the development of this thing.
Update: I got a new monitor, an LG UltraGear 27GL850: 144Hz, G-Sync compatible, 1440p, 1ms pixel response time.
With this new monitor, the latency is down to an average of 19ms.
At 144fps, the latency should break down the same way as before, just with the shorter 6.9ms frame time (the sum is worked out below):
- USB polling: a few milliseconds, unchanged.
- Input sampling: 0 to 6.9ms.
- Buffering: up to one more frame, 6.9ms (less in practice, since G-Sync doesn't wait for a fixed vsync).
- Scanout: 0 to 6.9ms.
- Monitor processing plus the 1ms pixel response.
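Plugging in the same rough averages as before (half a frame for sampling and scanout, one frame for the buffer swap, about 2ms for this panel's processing plus its 1ms pixel response):

$$
\underbrace{4}_{\text{USB}} + \underbrace{3.5}_{\text{sampling}} + \underbrace{6.9}_{\text{buffer}} + \underbrace{3.5}_{\text{scanout}} + \underbrace{2}_{\text{display}} \approx 20\,\text{ms}
$$

which lines up nicely with the 19ms average I measured.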