Niklas Holsti wrote:
Not just that, many language forms actually preclude the need for checks.
On 2024-09-16 18:58, Michael S wrote:
> On Mon, 16 Sep 2024 11:39:55 -0400
> EricP <ThatWouldBeTelling@thevillage.com> wrote:
David Brown wrote:
> On 16/09/2024 15:04, Michael S wrote:
With one exception: usize overflow panics in debug builds.
I'm quite happy with unsigned types that are not allowed to
overflow, as long as there is some other way to get efficient
wrapping on the rare occasions when you need it.
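Rust provides exactly this split: the default operators panic on overflow in debug builds, while `wrapping_*`, `checked_*`, and `std::num::Wrapping` give explicit, always-defined semantics on the occasions you ask for them. A minimal sketch:

```rust
use std::num::Wrapping;

fn main() {
    let x: u8 = 255;

    // Explicit wrapping when modular arithmetic is what you want.
    assert_eq!(x.wrapping_add(1), 0);

    // Explicit checked arithmetic: overflow is reported, not silent.
    assert_eq!(x.checked_add(1), None);

    // Wrapping<T> gives wrapping semantics to the ordinary operators.
    let w = Wrapping(x) + Wrapping(1u8);
    assert_eq!(w.0, 0);

    println!("ok");
}
```

These methods have the same defined behavior in every build, so relying on wrap-around is a visible, deliberate choice in the source rather than a property of the build configuration.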
>
But I am completely against the idea that you have different
defined semantics for different builds. Run-time errors in a
debug/test build and undefined behaviour in release mode is fine -
defining the behaviour of overflow in release mode (other than
possibly to the same run-time checking) is wrong.
In the compilers with checking that I have worked with, there was
always a distinction between checked builds and debug builds. In my C
code I have Assert() and AssertDbg(): Assert() stays in the production
code, AssertDbg() is only in the debug builds.
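Rust's standard assertions make the same split: `assert!` stays in every build, while `debug_assert!` is compiled out of optimized builds unless debug assertions are explicitly enabled — a rough analogue of the Assert()/AssertDbg() pair described above:

```rust
fn divide_exact(num: i32, den: i32) -> i32 {
    // Kept in every build, like Assert().
    assert!(den != 0, "denominator must be nonzero");
    // Compiled out of release builds, like AssertDbg().
    debug_assert!(num % den == 0, "expected exact division");
    num / den
}

fn main() {
    println!("{}", divide_exact(12, 3)); // prints 4
}
```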
>
Debug builds disable optimizations and spill all variable updates
to memory to make life easier for the debugger.
One usually compiles debug builds with no-optimize and all checks
enabled.
>
But debug, optimize, and checking are separate controls.
>
In the compilers for checking languages I've worked with,
checking and optimization are compatible.
For example, if the compiler uses an AddFaultOverflow instruction to
increment 'x' in x = x + 1, then it knows no overflow is possible
and can make all the other optimizations that C merely assumes are valid.
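Rust already treats checking and optimization as independent controls in this spirit: overflow checks can be kept in a fully optimized build, or dropped even from a debug build, through Cargo profile settings. A sketch of the relevant knobs:

```toml
# Cargo.toml: checking and optimization are separate knobs.
[profile.release]
opt-level = 3
overflow-checks = true   # keep overflow checks in the optimized build

[profile.dev]
overflow-checks = false  # or drop them even in the debug build
```

What Cargo does not yet offer is the finer resolution described below — per-routine, per-type, or per-object control; the profile settings apply per compilation profile.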
>
And on those compilers checks can be controlled at quite fine
resolution: checks can be enabled or disabled by kind of check
(e.g. scalar overflow, array bounds)
for a compilation unit, a routine, a section of code,
a particular data type, or a particular object.
>
This was all standard on DEC Ada85, so if Rust compilers do not
do this now, they may in the near future.
If the ability to control compiler checks was standard on DEC Ada, then
that made DEC Ada non-standard.
No, it means that DEC Ada could be used as a standard-conforming Ada
compiler or as a non-conforming compiler, to a user-chosen extent.
>
The recommended approach today (for applications where it matters) is to
use static analysis of the Ada code (e.g. SPARK or other tools) to prove
that run-time errors cannot happen, which then makes it possible to omit
the corresponding run-time checks while staying compliant.
DEC Ada did that too. It seems to me that this optimization is a relatively
straightforward "propagation of constants" type of problem.
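The opening point, that language forms can preclude the need for checks, has a modern counterpart in Rust: a `NonZeroU32` parameter proves at the type level that a divisor is nonzero, and iterating instead of indexing removes the need for bounds checks. A small sketch:

```rust
use std::num::NonZeroU32;

// The parameter type itself proves the divisor is nonzero,
// so no runtime zero check is needed inside the function.
fn divide(num: u32, den: NonZeroU32) -> u32 {
    num / den
}

fn main() {
    let den = NonZeroU32::new(4).unwrap();
    assert_eq!(divide(12, den), 3);

    // Iterating instead of indexing precludes bounds checks entirely.
    let v = [1, 2, 3];
    let sum: i32 = v.iter().sum();
    assert_eq!(sum, 6);

    println!("ok");
}
```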