On 4/15/2025 12:22 PM, David Brown wrote:
The datasheets of some of the components we use measure timings in picoseconds - things like inter-channel skew on memory devices, rise/fall times, or delays on FPGA pins and internal parts can be given as a small number of picoseconds.
On 15/04/2025 07:40, BGB wrote:
I am not saying that the smaller times don't exist, but that there is no point in wasting bits encoding times more accurate than can be used by a computer running at a few GHz, with clock speeds that will likely never exceed a few GHz.
On 4/14/2025 11:15 PM, Lawrence D'Oliveiro wrote:
>On Mon, 14 Apr 2025 19:43:04 -0500, BGB wrote:
>On 4/14/2025 5:33 PM, Lawrence D'Oliveiro wrote:
I figured that it would be hard to find an epoch less arbitrary than
the Big Bang ...
But, we don't really need it.
>
If so, could probably extend to 128 bits, maybe go to nanoseconds or
picoseconds.
The reason why I chose the Planck interval as the time unit is that
quantum physics says that’s the smallest possible time interval that makes
any physical sense. So there shouldn’t be any need to measure time more
accurately than that.
Quantum mechanics, the current theory, is not complete. Physicists are aware of many limitations. So while Planck time is the smallest meaningful time interval as far as we currently know, and we know of no reason to suspect that smaller times would be meaningful, it would be presumptuous to assume that we will never know of smaller time intervals.
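Just to put a number on the Planck-interval idea, here is a rough back-of-the-envelope sketch in C (the constants are the usual published approximations, nothing exact):

  #include <math.h>
  #include <stdio.h>

  int main(void)
  {
      /* Rough published values - treat them as approximations. */
      double planck_time  = 5.39e-44;  /* seconds per Planck interval     */
      double universe_age = 4.35e17;   /* ~13.8 billion years, in seconds */

      double intervals = universe_age / planck_time;           /* ~8e60 */
      printf("Planck intervals since the Big Bang: ~%.2e\n", intervals);
      printf("Bits needed to count them: ~%.0f\n", ceil(log2(intervals)));
      /* Comes out to roughly 203 bits, so even a 128-bit counter cannot
         span the age of the universe at Planck resolution. */
      return 0;
  }

So a Planck-unit clock with a Big Bang epoch needs on the order of 200 bits just for the integer count, which is worth keeping in mind when arguing about 64 vs. 128 bits.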
>>>
Practically speaking, a picosecond is likely the smallest unit of time that people could measure or hope to make much use of.
The fastest laser pulses so far are timed to accuracies of about 12 attoseconds - roughly 100,000 times finer than a picosecond. Some subatomic particle lifetimes are measured in rontoseconds - 10^-27 seconds. Picoseconds are certainly fast enough for most people, but not remotely fast enough for high-speed or high-energy physics.
>>Physicists have measured times a thousand millionth of a femtosecond. It is not easy, of course, but not impossible.
While femtoseconds exist, light can only travel a very short distance in that unit of time (roughly 0.3 micrometers per femtosecond), and likely no practical clock could be built at that resolution (for similar reasons), so it is not worth bothering with (*).
>
>
This sets the practical limit mostly in nanosecond territory.
But, for many uses, even nanosecond is overkill. Like, even if a clock cycle is less than 1 ns, random things like L1 cache misses, etc., will throw in enough noise to make the lower end of the nanosecond range effectively unusable.
I have parts of microcontroller designs reacting a lot faster than that. You use hardware, or at least hardware assistance, rather than pure software.
And, things like context switches are more in the area of a microsecond or so. So, the only way one is going to have controlled delays smaller than this is using delay-loops or NOP slides.
But, also not much point in having clock times much smaller than what the CPU could effectively act on. And, program logic decisions are unlikely to be much more accurate than around 100 ns or so (say, several hundred clock cycles).
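For what it's worth, that noise floor is easy to see on an ordinary desktop OS just by hammering the monotonic clock and looking at the gaps. A minimal POSIX sketch (the exact numbers will of course vary with the machine and load):

  #include <stdio.h>
  #include <stdint.h>
  #include <time.h>

  int main(void)
  {
      struct timespec a, b;
      /* Read the monotonic clock twice, back to back, and print the gap
         in nanoseconds. The deltas typically sit in the tens of ns and
         jump around once cache misses, interrupts, etc. get involved. */
      for (int i = 0; i < 10; i++) {
          clock_gettime(CLOCK_MONOTONIC, &a);
          clock_gettime(CLOCK_MONOTONIC, &b);
          int64_t dt = (int64_t)(b.tv_sec - a.tv_sec) * 1000000000 +
                       (b.tv_nsec - a.tv_nsec);
          printf("delta: %lld ns\n", (long long)dt);
      }
      return 0;
  }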
...Basically, all you are saying is that different timing resolutions and ranges are needed for different things, and 64-bit would not cover it all. 128-bit could cover pretty much everything outside specialist physics use, and 64-bit with an appropriate scale is fine for most purposes.
You could express time as a 64-bit value in nanoseconds, and it would roll over in a few centuries.
Meanwhile, a microsecond is big enough for computers to effectively operate on, and small enough to be accurate for most real-world tasks.
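The rollover arithmetic for those unit choices is easy to sanity-check; a quick back-of-the-envelope sketch:

  #include <math.h>
  #include <stdio.h>

  int main(void)
  {
      /* How long until an unsigned 64- or 128-bit counter wraps,
         depending on the tick size. */
      double seconds_per_year = 365.25 * 24 * 3600;
      double two_to_64  = ldexp(1.0, 64);     /* 2^64  */
      double two_to_128 = ldexp(1.0, 128);    /* 2^128 */

      printf("64-bit nanoseconds:  ~%.0f years\n",
             two_to_64 * 1e-9 / seconds_per_year);    /* ~585 years     */
      printf("64-bit microseconds: ~%.0f years\n",
             two_to_64 * 1e-6 / seconds_per_year);    /* ~585,000 years */
      printf("128-bit picoseconds: ~%.1e years\n",
             two_to_128 * 1e-12 / seconds_per_year);  /* ~1e19 years    */
      return 0;
  }

So 64-bit nanoseconds really does roll over in a few centuries, 64-bit microseconds pushes that out to roughly half a million years, and 128 bits at picosecond resolution covers any horizon anyone is likely to care about.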