The Lies of Google Stadia

A few days ago, Google showed off their new cloud gaming technology, Stadia, to mixed reactions: some people liked it, some people hated it. Personally, I despise the idea of cloud gaming because it's the ultimate form of DRM: you're not even running the game on your own hardware. Still, I can understand the appeal to some casual gamers.

What really caught my attention and triggered me was this:

Input lag comparison

I don't own an Xbox One X so I can't say anything in its defense, but I've been gaming on PC since I was a kid, so I have a lot to say about that. In this comparison, Google claims that a PC running a game at 60fps has an input lag of 100ms. That is a completely false claim, made to make Stadia's shitty 166ms response time look somewhat playable (keep in mind, that's 166ms on a 1Gbps fiber connection near their datacenters; most people are going to see much higher lag). So let's put their lies to the test.

The experiment

I built a simple device to measure input lag using an Arduino Leonardo clone, a button, and a brightness sensor. Here's how it works.

The device

On the Arduino

  • Wait for the button to be pressed
  • Send the keystroke K to the PC
  • Wait for the screen to turn white
  • Measure how long it has been since the keystroke was sent
  • Send the keystroke R to the PC
  • Wait for the screen to turn black
  • Repeat 10 times to measure average, minimum and maximum times
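
The steps above can be sketched as an Arduino loop like the following. This is a hedged reconstruction, not the actual firmware: the pin numbers, the light threshold, and the settle delay are all assumptions, and it has not been run on hardware.

```cpp
// Sketch of the measurement loop (untested; pins and threshold are assumptions).
// Needs an ATmega32u4 board like the Leonardo for the Keyboard library.
#include <Keyboard.h>

const int BUTTON_PIN = 2;        // pushbutton to ground, internal pull-up
const int SENSOR_PIN = A0;       // brightness sensor on the screen
const int WHITE_THRESHOLD = 600; // ADC reading that counts as "white"
const int RUNS = 10;

void setup() {
  pinMode(BUTTON_PIN, INPUT_PULLUP);
  Serial.begin(9600);
  Keyboard.begin();
}

void loop() {
  while (digitalRead(BUTTON_PIN) == HIGH) {}  // wait for the button press

  unsigned long total = 0, minLag = -1ul, maxLag = 0;
  for (int i = 0; i < RUNS; i++) {
    unsigned long start = micros();
    Keyboard.write('k');                                 // tell the PC: white
    while (analogRead(SENSOR_PIN) < WHITE_THRESHOLD) {}  // wait for white
    unsigned long lag = micros() - start;

    total += lag;
    if (lag < minLag) minLag = lag;
    if (lag > maxLag) maxLag = lag;

    Keyboard.write('r');                                 // back to black
    while (analogRead(SENSOR_PIN) >= WHITE_THRESHOLD) {}
    delay(100);  // let things settle before the next run
  }

  Serial.print("avg us: "); Serial.println(total / RUNS);
  Serial.print("min us: "); Serial.println(minLag);
  Serial.print("max us: "); Serial.println(maxLag);
}
```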

On the PC

A simple Java OpenGL application running at 60fps with double buffering enabled will simulate our game.

  • Initialize with a black screen
  • In the game loop: if the key K is pressed, turn the screen white; if the key R is pressed, turn the screen black
  • Render
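
The whole app is just this logic in a game loop. The original was written in Java; here is the equivalent logic as an untested C++/GLFW sketch (window size and library choice are my assumptions, not the author's):

```cpp
// Minimal "game": black screen, K turns it white, R turns it black.
// Untested sketch; compile with GLFW and an OpenGL context.
#include <GLFW/glfw3.h>

int main() {
  if (!glfwInit()) return 1;
  GLFWwindow* window = glfwCreateWindow(640, 480, "lag-test", nullptr, nullptr);
  if (!window) return 1;
  glfwMakeContextCurrent(window);
  glfwSwapInterval(1);  // VSync on: one buffer swap per 60Hz refresh

  float c = 0.0f;  // start black
  while (!glfwWindowShouldClose(window)) {
    if (glfwGetKey(window, GLFW_KEY_K) == GLFW_PRESS) c = 1.0f;  // K -> white
    if (glfwGetKey(window, GLFW_KEY_R) == GLFW_PRESS) c = 0.0f;  // R -> black
    glClearColor(c, c, c, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);
    glfwSwapBuffers(window);  // double buffered: the frame shows on the next swap
    glfwPollEvents();
  }
  glfwTerminate();
}
```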

Expected results

At 60fps, we expect lag to be influenced by these factors:

  • Delay from USB controller (<2ms)
  • Missing the VSync (0-16.7ms)
  • Double buffering (16.7ms)
  • Display input lag (~15-20ms on my display)
  • Brightness sensor delay (unknown)

The setup

The test will be conducted by running the application on my desktop PC.
The display used is an AOC Q2770P, a 2014 IPS display. It is NOT a gaming monitor, so this works to Stadia's advantage; on a gaming monitor the lag would be much lower, and at 144Hz we would expect less than 30ms.

Running the experiment

The results

And here's the actual input lag of a PC running a game at 60fps on this display:


As you can see, the actual input lag is around 50ms, and that's on a non-gaming display; on a gaming monitor, we'd be looking at much lower numbers.


What a surprise, Google lied to make their shitty service look better. I expected nothing less from the most evil tech company on the planet.

If you think my experiment was unfair, or that there were other factors I ignored, then by all means get in touch with me or leave a comment; I'll be happy to publish an update, especially if you were involved in the development of Stadia.

Update September 2019

I got a new monitor, an LG UltraGear 27GL850, 144Hz, G-Sync, 1440p, 1ms pixel response time.

With this new monitor, the latency is down to an average of 19ms.


At 144fps, the latency should be as follows:

  • Delay from USB controller (<2ms)
  • Missing the VSync (0-6.9ms, not sure about this because of G-Sync)
  • Double buffering (6.9ms)
  • Display input lag (seems to be <5ms)
  • Brightness sensor delay (unknown)
