Okay, thank you! This explanation made it click for me (now I think I get the original example too). So the real cause of the violation here is the instantaneous communication, isn't it? If the communication were done via radio waves (which, as far as I know, also travel at the speed of light), it would not be violated, because of the time it takes for the information to travel from Earth to the spaceship and back. Is that correct? Is this why (as I have read/heard on several occasions) the upper bound on the speed of information is also the speed of light?
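Just to put a number on that delay, here is a back-of-envelope sketch (the Earth–Moon distance is picked purely as an example; any distance to the ship works the same way):

```
# Round-trip delay for a radio signal sent to a ship at distance d
# and back. The distance used (roughly Earth-Moon) is just an example.
C = 299_792_458          # speed of light in m/s
d = 384_400_000          # example distance in m
round_trip = 2 * d / C   # seconds for the signal to go out and return
print(f"Round-trip delay: {round_trip:.2f} s")  # ~2.56 s
```

So even at light speed, the reply can never arrive before the question has had time to propagate there and back.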
From Earth’s frame of reference, time on board the ship is slowed down by a factor of 0.866, while from the ship’s frame of reference, time on Earth is slowed down by the same factor.
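For reference, 0.866 is the factor you get at half the speed of light. The quote doesn't state the ship's speed, so $v = 0.5c$ below is an inference from that number:

$$\sqrt{1 - \frac{v^2}{c^2}} = \sqrt{1 - 0.5^2} = \sqrt{0.75} \approx 0.866$$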
Why is time on Earth slowed down from the ship's perspective? Shouldn't it be faster? That is, if Earth perceives time on the ship as passing slower, shouldn't the people on the ship perceive time on Earth as passing faster, to compensate?
Also, I have quite a hard time understanding how exactly time slows down. Is it sort of as though we adjusted the time step duration (tickrate, more precisely) of a physics simulation in one area, making everything there happen slower/faster relative to the rest, where the original timestep is kept (and without the loss of precision and all the other problems that normally occur in a simulation)? Or is this analogy flawed, and is that why I'm not getting it?
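To make the question concrete, here is a toy sketch of the "adjusted tickrate" idea being asked about (all numbers are made up; this illustrates the analogy, not how relativity actually works):

```
# Toy version of the "adjusted tickrate" analogy: two clocks are
# stepped by the same global loop, but one accumulates less local
# time per tick. Numbers are made up for illustration only.
GLOBAL_DT = 1.0    # seconds of "outside" time per global tick
FACTOR = 0.866     # slowdown factor for the "dilated" region

earth_time = 0.0
ship_time = 0.0
for _ in range(10):                    # ten global ticks
    earth_time += GLOBAL_DT            # normal region
    ship_time += GLOBAL_DT * FACTOR    # "slowed" region

print(earth_time, ship_time)           # 10.0 vs. ~8.66
```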
Thank you for your answer. Viewed from this perspective, it makes more sense.
Hmmm… It does tick a lot of the boxes, but the Ethernet protocol is way too complex, with all of its layers, and not reasonably implementable on a low-power microcontroller. Also, it requires separate hubs to connect multiple devices together, unlike I2C, which is daisy-chainable.
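For contrast, here is a minimal sketch of how several I2C devices can share one two-wire bus, assuming a Linux host (e.g. a Raspberry Pi) with the smbus2 package; the bus number and the probe-every-address approach are just an illustration, roughly what the i2cdetect tool does:

```
# Probe an I2C bus for attached devices (rough equivalent of i2cdetect).
# Assumes a Linux host exposing /dev/i2c-1 and the smbus2 package.
from smbus2 import SMBus

with SMBus(1) as bus:                 # bus number 1 is an assumption
    for addr in range(0x03, 0x78):    # valid 7-bit address range
        try:
            bus.read_byte(addr)       # a device that ACKs is present
            print(f"Device found at 0x{addr:02x}")
        except OSError:
            pass                      # no device at this address
```

Every device sits in parallel on the same SDA/SCL pair, so no hub or switch is needed; each one simply responds to its own address.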