Subject : Re: encapsulating directory operations
From : antispam (at) *nospam* fricas.org (Waldek Hebisch)
Groups : comp.lang.c
Date : 10. Jun 2025, 21:22:29
Organization : To protect and to server
Message-ID : <102a463$1l5j1$1@paganini.bofh.team>
User-Agent : tin/2.6.2-20221225 ("Pittyvaich") (Linux/6.1.0-9-amd64 (x86_64))
Scott Lurndal <scott@slp53.sl.home> wrote:
> Bonita Montero <Bonita.Montero@gmail.com> writes:
>> On 09.06.2025 at 17:53, Scott Lurndal wrote:
>>> Bonita Montero <Bonita.Montero@gmail.com> writes:
>>>> On 09.06.2025 at 16:01, Scott Lurndal wrote:
>>>>>
>>>>> Have you ever written real-world production code? Like an operating
>>>>> system, where allocation failures should -never- result in an
>>>>> inability to recover.
>>>>
>>>> If you need an allocation to proceed and it fails you can't recover.
>>>
>>> That's your problem caused by poor design and implementation.
>>
>> That's how 100% of all programs that deal with bad_alloc are designed.
>>
>>> Exacerbated by the propensity for you to use C++ features that require
>>> dynamic allocation where other forms of data structures are more suitable.
>>
>> When dynamic allocation is needed it is needed.
>
> And there are many ways to handle it that don't include throwing
> bad_alloc when the system is unable to provide additional address
> space, memory or backing store.
>
> Allocating major data structures at application start (perhaps using a
> pool allocator) and crafting your algorithms such that they
> don't require infinite memory is a good start.
If you can allocate memory before looking at the data, then you really
do not need dynamic allocation. And there are cases where you can
make do with something simpler than general dynamic allocation.
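
For example, a fixed arena grabbed once at program start, in the spirit
of the "allocate major data structures at application start" suggestion
quoted above. This is only a rough sketch; the names and the 64 MB
budget are invented for illustration:

#include <cstddef>
#include <cstdio>
#include <cstdlib>

// A fixed block of memory obtained once, up front.  Afterwards the
// program "allocates" by bumping a pointer, so it cannot run into an
// out-of-memory condition at an inconvenient moment; the only point
// that can fail is arena_init() at startup.
struct Arena {
    unsigned char *base;
    std::size_t    size;
    std::size_t    used;
};

static bool arena_init(Arena *a, std::size_t bytes)
{
    a->base = static_cast<unsigned char *>(std::malloc(bytes));
    a->size = bytes;
    a->used = 0;
    return a->base != nullptr;
}

static void *arena_alloc(Arena *a, std::size_t bytes)
{
    // keep subsequent objects aligned for any fundamental type
    std::size_t need = (bytes + alignof(std::max_align_t) - 1)
                       & ~(std::size_t(alignof(std::max_align_t)) - 1);
    if (a->size - a->used < need)
        return nullptr;         // budget exceeded, caller decides what to do
    void *p = a->base + a->used;
    a->used += need;
    return p;
}

int main()
{
    Arena arena;
    if (!arena_init(&arena, 64u * 1024 * 1024)) {  // claim the budget up front
        std::fputs("cannot obtain working memory, giving up cleanly\n", stderr);
        return EXIT_FAILURE;
    }
    int *counters = static_cast<int *>(arena_alloc(&arena, 1000 * sizeof(int)));
    if (counters) { /* ... use the memory ... */ }
    std::free(arena.base);
    return 0;
}
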
But AFAICS there are cases which need general dynamic allocation.
And there are cases which strictly speaking do not need general
dynamic allocation, but where dynamic allocation looks better in
practice. Namely, one can estimate an amount of memory that is
sufficient to do the job, but this estimate is typically
significantly bigger than the amount of memory which would be
dynamically allocated. In other words, users prefer the
"stochastic" version, that is, a program which is usually
efficient but with low probability may fail or require more
resources, to a "deterministic" program that always
requires more resources.
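
To make the trade-off concrete, here is a toy example of the
"stochastic" choice (the program, names and numbers are mine, not from
the thread). A program that collects the distinct tokens of its input
could instead reserve for the worst case, roughly one distinct token
per two input bytes, which never runs short mid-run but wastes almost
all of that reservation on ordinary inputs:

#include <iostream>
#include <new>
#include <set>
#include <string>

int main()
{
    // Grow on demand: usually only a small fraction of the worst-case
    // memory is touched, but a pathological input can exhaust memory
    // and make an insertion throw std::bad_alloc.
    std::set<std::string> distinct;
    std::string tok;
    try {
        while (std::cin >> tok)
            distinct.insert(tok);
    } catch (const std::bad_alloc &) {
        std::cerr << "out of memory after " << distinct.size()
                  << " distinct tokens\n";
        return 1;
    }
    std::cout << distinct.size() << " distinct tokens\n";
    return 0;
}

A deterministic variant would size everything from the input length
before reading anything, at exactly the cost described above.
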
There are programs now in use that solve high-complexity
problems. On many practical problems they are quite
efficient, but by the nature of the problem they need
impractically large resources on some instances. For
such problems the best one can hope for is to localize the
trouble, that is, fail the computation if it requires
too much, but keep independent things running. And even
such localized failure may be tricky (that is, avoiding
trouble in unrelated places).
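
A minimal sketch of such localization in C++ (solve() and the instance
loop are invented for illustration): each instance is handled
independently, so an allocation failure is reported for that instance
only and the rest keep running.

#include <iostream>
#include <new>
#include <vector>

// Hypothetical per-instance solver; hard instances may try to allocate
// an impractically large amount and throw std::bad_alloc.
std::vector<int> solve(int instance)
{
    std::vector<int> result;
    result.resize(static_cast<std::size_t>(instance) * 1000);  // may throw
    return result;
}

int main()
{
    for (int instance = 1; instance <= 10; ++instance) {
        try {
            std::vector<int> r = solve(instance);
            std::cout << "instance " << instance << ": solved, "
                      << r.size() << " entries\n";
        } catch (const std::bad_alloc &) {
            // contain the failure: this instance is abandoned, the loop
            // continues with the remaining, independent instances
            std::cerr << "instance " << instance
                      << ": needs too much memory, skipped\n";
        }
    }
    return 0;
}

This only stays safe if the failing computation does not leave shared
state half-modified, which is exactly the "tricky" part mentioned above.
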
--
                              Waldek Hebisch