On Wed, 18 Sep 2024 16:23:01 +0000, MitchAlsup1 wrote:
> On the other hand, and this is where the deprecation of the CPUs comes
> in, the engines consuming the data are bandwidth machines {GPUs and
> inference engines} which are quite insensitive to latency (they are not
> latency-bound machines like CPUs).
>
> When doing GPUs, a memory access taking 400 cycles would hardly degrade
> the overall GPU performance--while it would KILL any typical CPU
> architecture.
But if it’s supposed to be for “interactive” use, it’s still going to
take those 400 memory-cycle times to return a response.
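A back-of-the-envelope sketch of both points, in C: by Little's law,
covering a 400-cycle access at a sustained rate of one request per cycle
needs about 400 requests in flight. A GPU keeps enough threads resident
to get there; a CPU core cannot, and its serial response path simply
eats the latency. The MSHR and thread counts below are illustrative
assumptions, not figures from the thread.

/* Little's law sketch: in-flight requests = latency x issue rate.
 * The per-core MSHR count and per-SM thread count are assumed,
 * round numbers for illustration only. */
#include <stdio.h>

int main(void)
{
    const double latency_cycles = 400.0;  /* memory access latency      */
    const double rate = 1.0;              /* requests/cycle to sustain  */
    const double needed = latency_cycles * rate;   /* Little's law      */

    const double cpu_mshrs   = 24.0;   /* outstanding misses a core tracks */
    const double gpu_threads = 2048.0; /* resident threads per GPU SM      */

    printf("requests in flight needed : %.0f\n", needed);
    printf("CPU core can cover        : %.0f (%.0f%% of needed)\n",
           cpu_mshrs, 100.0 * cpu_mshrs / needed);
    printf("GPU SM can cover          : %.0f (latency fully hidden)\n",
           gpu_threads);
    return 0;
}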
In human terms, those 400 memory cycles are completely negligible. For
most purposes, anything within 100 milliseconds feels like an instant
response. For high-speed games played by experts, 10 milliseconds is a
good target. For the most demanding tasks, such as making music,
1 millisecond might be required.

For anything interactive, an extra 400 memory cycles of latency means
nothing - even if it is relatively slow memory - as long as you can keep
the throughput. Network latency is massively bigger than this extra
memory latency would be.
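Putting assumed numbers on that comparison (a 3 GHz clock and rough
round-trip figures, none of them from the thread):

/* 400 memory cycles vs. network round trips, with assumed figures:
 * a 3 GHz clock, ~0.1 ms LAN RTT, ~50 ms WAN RTT. */
#include <stdio.h>

int main(void)
{
    const double clock_hz  = 3e9;              /* assumed core clock */
    const double mem_s     = 400.0 / clock_hz; /* 400-cycle access   */
    const double lan_rtt_s = 100e-6;           /* assumed LAN RTT    */
    const double wan_rtt_s = 50e-3;            /* assumed WAN RTT    */

    printf("400 cycles : %9.3f us\n", mem_s * 1e6);
    printf("LAN RTT    : %9.3f us (~%.0fx the memory penalty)\n",
           lan_rtt_s * 1e6, lan_rtt_s / mem_s);
    printf("WAN RTT    : %9.0f us (~%.0fx)\n",
           wan_rtt_s * 1e6, wan_rtt_s / mem_s);
    return 0;
}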
On 19/09/2024 00:54, Lawrence D'Oliveiro wrote:
You actually need 20 Hz/50 ms even for joystick/mouse response when you
are not in a hurry. (This was proven by the space station's external arm
joystick controller, which was initially specified to operate at 10 Hz;
that turned out to be far too laggy for the astronauts, so it was
doubled to 20 Hz.)

David Brown wrote:
For that kind of thing, the latency you can tolerate will depend on the
physical lag of the system, what you are trying to control, and the
experience of the person controlling it. It will therefore lie somewhere
between the "100 ms feels instantaneous" that you see for many purposes,
and the speed you need for gaming.

My cousin Nils has hearing loss after a lifetime spent in studios and
playing music; he can't use the offered hearing aids because they add
3-4 ms of latency. (Something which he noticed _immediately_ when first
trying a pair.)

Even a complete amateur can notice time mismatches of 10 ms in a musical
context, so for a professional this does not surprise me. I don't know
of any human endeavour that requires lower latency or more precise
timing than music.
The first modem I used was, I believe, 300 baud, and there was a
definite lag between typing and the characters appearing on-screen.

> Early multiplayer games had to invent all sorts of tricks to try to
> hide away that latency,

and well before that, around 1987 (?), I made a version of my terminal
emulator which could do the same:
i.e. give instant feedback for keystrokes while in reality buffering
them, so that I could send out a single packet (over pay-per-packet X.25
networks) when I got a keystroke that I could not handle locally.
This one hack (designed and implemented overnight) saved Hydro and the
Oseberg project NOK 2 million per year per remote location.
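A minimal sketch of that trick as described, in C; this is a
reconstruction, not the 1987 code, and send_packet() is a hypothetical
stand-in for the pay-per-packet X.25 transmit path. Printable keys are
echoed at once but only buffered; a packet goes out only on a key that
cannot be handled locally (Enter or Tab here).

/* Reconstruction sketch of the local-echo trick, not the original code.
 * Printable keys are echoed immediately but only buffered; a packet is
 * sent only when a key arrives that local handling cannot resolve. */
#include <stdio.h>

static char   pending[256];
static size_t npending;

static void send_packet(const char *buf, size_t len)
{
    /* hypothetical stand-in for the X.25 transmit path */
    printf("\n[packet, %zu bytes: %.*s]\n", len, (int)len, buf);
}

static void keystroke(char c)
{
    if (c == '\n' || c == '\t') {       /* not handled locally: flush */
        pending[npending++] = c;
        send_packet(pending, npending);
        npending = 0;
    } else if (npending < sizeof pending - 1) {
        putchar(c);                     /* instant local feedback...  */
        pending[npending++] = c;        /* ...while the send waits    */
    }
}

int main(void)
{
    const char *typed = "oseberg\tfield\n"; /* made-up sample input */
    for (size_t i = 0; typed[i] != '\0'; i++)
        keystroke(typed[i]);
    return 0;   /* two packets instead of one per keystroke */
}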
The only noticeable (to the user) artifact was when they were entering
data into an uppercase-only field: they would see the lowercase local
echo until they hit Enter or Tab, then the remote response would
overwrite the field with uppercase instead. Normally I simply checked
whether the new remote data was reducing the offset between the local
buffer and the official terminal view.
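One plausible reading of that offset check, sketched with assumed data
structures (nothing below is from the original emulator): a remote
update that prefix-matches the locally echoed view just shrinks the
unconfirmed offset, while any mismatch, such as the lowercase-to-
uppercase rewrite, forces a repaint from the remote view.

/* Assumed reconstruction of the offset check, not the original logic. */
#include <stdio.h>

/* Returns 1 if remote merely confirms the local prediction, advancing
   *confirmed; returns 0 if the field must be repainted from remote. */
static int remote_confirms_local(const char *local, const char *remote,
                                 size_t *confirmed)
{
    size_t i = *confirmed;
    while (remote[i] != '\0' && local[i] == remote[i])
        i++;
    if (remote[i] == '\0') {   /* remote is a prefix of the local view */
        *confirmed = i;        /* offset between the views shrank      */
        return 1;
    }
    return 0;
}

int main(void)
{
    size_t confirmed = 0;
    const char *local  = "abc";   /* lowercase local echo             */
    const char *remote = "ABC";   /* host rewrote the uppercase field */

    if (remote_confirms_local(local, remote, &confirmed))
        printf("local view stands (confirmed %zu bytes)\n", confirmed);
    else
        printf("repaint field with remote view: %s\n", remote);
    return 0;
}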
I presume you called the case-change a feature, rather than an artefact,
giving the user confirmation that the data was entered correctly?