All the processing will be done on the server side; in effect, you will just be playing back an interactive video stream.
The major issue here is input lag caused by the round-trip time from your PC to the Stadia server.
The latencies of even first-world internet connections are not up to the task, so count us out for the next 10 years.
I work in this space. While Google's attempt so far has been subpar, it isn't as simple as that.
Yes, you are effectively playing back a video stream, but you can do very sophisticated prediction of the game state to compensate for the latency.
Prediction has also existed in standard multiplayer games since the days of QuakeWorld; without it, keeping the game state of a multiplayer shooter playable over a network is impossible.
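The QuakeWorld-style idea can be sketched in a few lines: the client applies its own inputs immediately, and when an (older) authoritative server state arrives, it rewinds to it and replays the inputs the server hasn't acknowledged yet. This is a minimal illustration, not any real engine's API; all names are made up.

```python
# Minimal sketch of client-side prediction with server reconciliation.
# Illustrative only -- real engines predict full movement physics, not
# a single scalar position.

from dataclasses import dataclass

@dataclass
class Input:
    seq: int    # sequence number so the server can acknowledge it
    dx: float   # movement produced by this tick's input

class PredictedClient:
    def __init__(self):
        self.x = 0.0        # locally predicted position
        self.pending = []   # inputs not yet acknowledged by the server

    def apply_local(self, inp: Input):
        # Apply the input immediately so the player feels no lag.
        self.pending.append(inp)
        self.x += inp.dx

    def on_server_state(self, server_x: float, last_acked_seq: int):
        # The server snapshot is authoritative but stale: rewind to it,
        # drop acknowledged inputs, and replay the rest on top.
        self.x = server_x
        self.pending = [i for i in self.pending if i.seq > last_acked_seq]
        for inp in self.pending:
            self.x += inp.dx
```

As long as the server simulates the replayed inputs the same way, the predicted position matches the eventual authoritative one, so the round-trip delay becomes invisible in the common case.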
In addition, you can always over-render on the server and then apply warping at the client end to match the view to the latest input. Depending on how the technique is applied, this may not update dynamic objects in the scene, but it removes a substantial amount of perceived lag.
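The over-render-plus-warp trick can be illustrated with a 2D crop: the server renders a margin around the requested viewport, and the client shifts the crop window by however much the view moved while the frame was in flight. This is a toy sketch with made-up numbers; a real system reprojects in 3D rather than cropping a flat image.

```python
# Sketch of client-side reprojection against an over-rendered frame.
# `frame` is a 2D list of pixel values rendered with `margin` extra
# pixels on every side of the intended viewport. Illustrative only.

def warp_crop(frame, view_w, view_h, yaw_delta_px, pitch_delta_px, margin):
    """Shift the viewport crop by the input delta accumulated since the
    frame was requested, clamped to the over-rendered region."""
    def clamp(v, lo, hi):
        return max(lo, min(hi, v))
    x0 = clamp(margin + yaw_delta_px, 0, len(frame[0]) - view_w)
    y0 = clamp(margin + pitch_delta_px, 0, len(frame) - view_h)
    return [row[x0:x0 + view_w] for row in frame[y0:y0 + view_h]]
```

Because the crop uses the newest local input, camera motion feels immediate even though the pixels themselves are a round-trip old; only objects that moved inside the scene still show the full latency.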
Stadia is only the tip of the iceberg, and its biggest issue is having to buy games on that platform. NVIDIA's service eliminates this completely by letting you use your Steam library.
On a more personal level: we are working on splitting the rendering load altogether using something called Object Space Rendering. We are tackling VR first, as it is even more latency-sensitive.
If VR works fine, normal games are a piece of cake. This approach can tolerate lag of up to 100-120ms with no obvious effects. Video here:
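A rough way to see why a rendering split like this tolerates latency: view-independent shading (done in object or texture space) can arrive late, because the client re-projects the geometry with the current head pose every frame, so the pose-dependent part never waits on the network. The classes and timings below are toy stand-ins of my own invention, not the actual pipeline described above.

```python
# Toy illustration of splitting rendering between server and client.
# The server does expensive, view-independent shading; the client does
# cheap rasterization with an up-to-the-millisecond pose. Hypothetical
# names and values throughout.

class ShadingServer:
    def shade(self, scene_time):
        # View-independent lighting baked into object-space textures;
        # a result that is ~100ms old still projects correctly.
        return {"texel_color": 0.8, "shaded_at": scene_time}

class ThinClient:
    def render_frame(self, head_pose, latest_texture):
        # Local rasterization samples the latest texture the network
        # delivered, but uses the pose sampled this very frame.
        return {"pose": head_pose, "color": latest_texture["texel_color"]}

# Simulate ~100ms of shading latency: the frame at t=0.116s reuses the
# texture shaded at t=0.016s, yet reflects the current head pose.
tex = ShadingServer().shade(scene_time=0.016)
frame = ThinClient().render_frame(head_pose=(0.1, 0.2, 0.0), latest_texture=tex)
```

The head rotation, which is what VR comfort hinges on, is resolved entirely on the client; only shading changes (moving lights, animated materials) show the network delay.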