Sujet : Re: Cost of handling misaligned access
De : cr88192 (at) *nospam* gmail.com (BGB)
Groupes : comp.arch
Date : 25. Feb 2025, 20:11:59
Organisation : A noiseless patient Spider
Message-ID : <vpl4m6$2528c$1@dont-email.me>
User-Agent : Mozilla Thunderbird
On 2/25/2025 12:07 PM, MitchAlsup1 wrote:
On Tue, 25 Feb 2025 14:20:45 +0000, EricP wrote:
Michael S wrote:
--------------------
>
No, I mean floorplanning, as well as most other manual physical-level
optimizations, are not used at all in 99% of FPGA designs that
started after 2005.
>
Is that because the auto place and route got good enough that it is
unnecessary? Or maybe the FPGA resources grew enough that autoroute
didn't have to struggle to find optimal positions and paths
(these being an optimal packing problem and a traveling salesman problem).
Athlon (1998) used hand placement with auto-routing. So, auto-routing has
been good enough since 2000 at the latest.
Seems probably true.
Not entirely sure how it works, but they seem to pull it off mostly acceptably.
Also BGB mentioned in another thread a while back that he was getting
what sounded like random variation of critical paths from run to run.
That suggests to me the automatic tools may not be properly recognizing
the different modules, and may produce some non-optimal positions or paths.
So giving it a hint that "this stuff goes together" might help.
Consider the optimizer/place/route thingamabob; and a signal that
crosses from one module to another. The optimizer changes from
a 2-LUT delay to a 1 LUT delay, but now the fan-out of that LUT
doubles, so instead of speeding up, the signal path slows down.
Yeah, not only are there sometimes random variations in timing, but also random variations in LUT cost.
Anyway, it should be testable. Inspect the auto placement's module wiring,
and if there are any obviously crazy decisions, then try hinting the
placement tool and see if the speed improves or the critical-path variation
goes away.
I am more at the moment thinking it might be nice if I had some sort of Verilog debugger...
Say, something that works like a simulator but also allows source-level debugging of the Verilog, along with having virtual peripherals; say, so one can pause the simulation and then go inspect variables or similar, rather than being limited to whatever one can figure out far enough in advance to print using "$display()" statements.
Trying to debug this stuff as-is kinda sucks.
Could almost do it myself, possibly modifying BGBCC, and using a sort of specialized VM.
Well, and/or add bitfield instructions to BJX2, to hopefully make it sufficient for running Verilog code (probably with similar rules to those in Verilator, e.g., no tri-state logic or signals, ...).
These would likely exist as bitfield extract and bitfield insert instructions, probably using 64-bit encodings.
Would need 3 registers and 12 bits of immediate for insert, say:
BITINS Rs, Rt, Rn, Imm12
With the immediate specifying the base bit and bit-length.
Would combine the base-value in Rs, inserting the relevant bits from Rt.
A 128-bit BITINSX could be useful, but would need 14 bits of immediate, and could probably be faked acceptably via an instruction pair.
A BITEXT instruction mostly needs a 3RI Imm12 encoding, which is much easier.
Mostly because Verilog tends to make much heavier use of bitfields compared with C.
Doesn't need to be viable on real hardware in this case.
Most of the work in this case would likely need to be in the compiler frontend.
Granted, doing it this way would be a bit wacky...
Would maybe overlap with adding source-level debugging to JX2VM, which as-is mostly does ASM-level debugging (registers and disassembly, but it is generally able to display information about functions, source files, and line numbers).
There is a keyboard sequence to get into an interactive GDB-style debugger, but I rarely use it, so don't remember what it is ATM (usually I am more interested in when it crashes or similar, which displays a big long dump as it exits...).
Preferable for VL would be something more like a Visual Studio style interface.
...