Subject : Re: Why GIMP Is Better Than Photoshop
From : recscuba_google (at) *nospam* huntzinger.com (-hh)
Newsgroups : comp.os.linux.advocacy
Date : 05. Jan 2025, 21:51:51
Other headers
Organisation : A noiseless patient Spider
Message-ID : <vlerd7$13ci0$4@dont-email.me>
References : 1 2
User-Agent : Mozilla Thunderbird
On 1/5/25 10:53 AM, DFS wrote:
On 1/4/2025 6:56 AM, Lying Lameass Larry Piet (posting as Farley Flud) wrote:
Photoshop, the name that hangs on every lackey asshole's lips,
is actually severely limited in its bit depths.
>
Photoshop, as well as other commercial garbage, only allows
processing in 8, 16, or 32-bit (integer) depths.
>
The mighty GIMP, otoh, offers 8, 16, 32-bit integer as well
as 16 and 32-bit floating point and this is a HUGE difference
in modern, cutting-edge processing.
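For reference, those depth options map roughly onto the following storage types. A rough NumPy sketch (the dtype names are NumPy's, not GIMP's) of the smallest tonal step each one can resolve:

import numpy as np

# Integer depths: a fixed step of 1/(2**bits - 1) across the whole 0..1 range.
for dtype in (np.uint8, np.uint16, np.uint32):
    levels = np.iinfo(dtype).max
    print(f"{np.dtype(dtype).name}: {levels + 1} levels, smallest step {1.0 / levels:.2e}")

# Float depths: the step varies with magnitude; near 1.0 it equals machine
# epsilon, and values above 1.0 (highlight headroom) stay representable.
for dtype in (np.float16, np.float32):
    eps = np.finfo(dtype).eps
    print(f"{np.dtype(dtype).name}: smallest step near 1.0 is {eps:.2e}")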
Show us the visual difference, with your own code of course, "image processing expert" and "computing virtuoso" who can "program anything".
While Feeb is at it, he needs to explain how & why floating point matters at all, since the input sensor is integer-based: where is it actually necessary? Without a clear need, it sounds more like sloppy/lazy programming and/or false features.
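One way to make that question concrete: run the same edit chain once with 8-bit intermediate values and once in 32-bit float, and see what the intermediate rounding costs. A toy NumPy sketch, assuming a simulated 12-bit sensor and a darken-then-brighten edit (an illustration, not anything from either poster):

import numpy as np

rng = np.random.default_rng(0)
# Simulated 12-bit sensor data, as a raw converter would hand it over (integer input).
raw = rng.integers(0, 4096, size=(512, 512), dtype=np.uint16)
scene = raw.astype(np.float32) / 4095.0          # normalize to 0..1

# Float path: pull exposure down 2 stops, push it back up; nothing is lost.
float_result = (scene * 0.25) * 4.0

# 8-bit path: re-quantize to 0..255 after every step, as an 8-bit pipeline must.
x = np.round(scene * 255.0)                      # import as 8-bit
x = np.round(x * 0.25)                           # darken, re-quantize
x = np.clip(np.round(x * 4.0), 0, 255) / 255.0   # brighten, re-quantize

print("distinct levels in float result:", np.unique(np.round(float_result * 4095)).size)
print("distinct levels in 8-bit result:", np.unique(np.round(x * 255)).size)
print("worst-case error introduced:", float(np.abs(x - float_result).max()))

Whether that kind of posterization ever shows up in a real workflow is exactly the question being put to Feeb.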
Similarly, why are 16 or 32 bits/channel necessary when the human eye can't even biologically perceive that resolution (it's considered to be 10-12 bits/channel)? Did the GIMP programmers pick 16 as an easy (lazy) way to get to the 10 or 12 bits/channel that's perceptually appropriate? Overkill just results in sub-optimally large file sizes.
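The file-size cost is easy to put numbers on. A back-of-the-envelope sketch for a hypothetical 24-megapixel (6000x4000) RGB frame, uncompressed sizes only:

# Uncompressed size of a 24-megapixel RGB frame at each per-channel depth.
pixels = 6000 * 4000
channels = 3
for bits in (8, 10, 12, 16, 32):
    size_mb = pixels * channels * bits / 8 / 1e6
    print(f"{bits:>2} bits/channel: {size_mb:5.0f} MB uncompressed")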
Finally, just what good is this amount of overkill on color bit depth when there's not even the hardware to display the result on?
Specifically, who makes 16-bit/channel computer monitors? Who makes 32-bit ones? Name names/makes/models (and prices). Because the last I've seen was 12 bits/channel in an expensive reference display; the mainstream 'State of the Shelf' is still at 8 bits/channel (plus there are still some 6-bit panels faking 8 bits/channel - one manufacturer got hit with a lawsuit over that a few years ago).
So even if humans could perceive 32 or 16 over 12 bits/channel, where's the hardware which can actually display more than 12 bits/channel?
-hh