On 12/06/2024 16:13, D wrote:
> Hmm, I'd say parsing file types first, and perhaps have a little
> database that maps file type to compression algorithm, and if that
> doesn't yield anything, proceed with "brute force".
>
> On Wed, 12 Jun 2024, Richard Kettlewell wrote:
>> Grant Taylor <gtaylor@tnetconsulting.net> writes:
>>> On 6/11/24 01:53, J Newman wrote:
>>>> Any suggestions on how to proceed?
>>> As others have said, it's very difficult to tell within the first
>>> five seconds what the ultimate compression ratio will be.
>> Not just difficult but impossible in general: the input file could
>> change character in its second half, switching the overall result
>> from one that is (for example) a gzip win to an xz win.
>
> This is true! The only things I can imagine are parsing the file type,
> and from that file type, drawing conclusions about the compressibility
> of the data, or doing a flawed statistical analysis, but as said, the
> end could be vastly different from the start.
OK, good point... as mentioned elsewhere, my experience is with compressing video files with lzma.
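
To make D's lookup-table idea concrete, here is roughly what I picture. The extension-to-tool mapping below is purely my own guess (nothing measured), and the fallback just brute-forces every compressor on the whole file and keeps the smallest result:

import bz2
import gzip
import lzma
import pathlib

# Illustrative only: which compressor actually wins for a given
# file type is a guess here, not something I have measured.
TYPE_TO_ALGORITHM = {
    ".txt": "gzip",
    ".log": "gzip",
    ".csv": "xz",
    ".xml": "xz",
    ".tar": "xz",
}

COMPRESSORS = {
    "gzip": gzip.compress,
    "bzip2": bz2.compress,
    "xz": lzma.compress,
}

def choose_algorithm(path: pathlib.Path) -> str:
    # First consult the lookup table keyed on file type (extension
    # here, though `file --mime-type` output would work the same way).
    algorithm = TYPE_TO_ALGORITHM.get(path.suffix.lower())
    if algorithm:
        return algorithm
    # Brute-force fallback: compress the whole file with every
    # candidate and keep whichever produced the smallest output.
    data = path.read_bytes()
    return min(COMPRESSORS, key=lambda name: len(COMPRESSORS[name](data)))
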
But if we accept that the script will sometimes pick the wrong compression algorithm, which option do you think makes the fewest mistakes: parsing the file type, or trial-compressing each file for the first 5 seconds?
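
And to be clear about what I mean by the second option, something like the sketch below: feed each file to each compressor for a fixed time budget and compare the partial ratios. The candidate list and the 5-second budget are just placeholders, and, as Richard points out, the second half of a file can behave differently from the first, so this will sometimes guess wrong:

import bz2
import lzma
import time
import zlib

CANDIDATES = {
    "gzip": lambda: zlib.compressobj(6),   # gzip's deflate, level 6
    "bzip2": bz2.BZ2Compressor,
    "xz": lzma.LZMACompressor,
}

def trial_ratio(path, make_compressor, budget_s=5.0, chunk=1 << 20):
    # Feed the file to an incremental compressor until the time budget
    # runs out, then report compressed bytes / consumed bytes.
    comp = make_compressor()
    consumed = produced = 0
    deadline = time.monotonic() + budget_s
    with open(path, "rb") as f:
        while time.monotonic() < deadline:
            block = f.read(chunk)
            if not block:
                break
            consumed += len(block)
            produced += len(comp.compress(block))
    produced += len(comp.flush())
    return produced / consumed if consumed else 1.0

def pick_by_trial(path):
    # Lowest partial ratio wins; ties and very small files are not
    # handled specially here.
    return min(CANDIDATES, key=lambda name: trial_ratio(path, CANDIDATES[name]))

Whichever name comes back would then be used to compress the whole file for real.
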