
MP3-Beating Compression

Name: kieren_j 2006-04-26 17:08

You probably don't believe me, but if you're at all interested in my new "CAR" compression algorithm, check this out:

The strange thing is, it works better on compressed files!
Zipping an MP3 file gives you a file that's 99% of the original size, but check this out!

**** TESTS ON UNCOMPRESSED FILES ****

TXT File Example
TXT File: 1,318,671
Savings: 1,308,940
CAR File: 9,731
Percent: 0.7%

WAV File Example
WAV File: 8,362,354
Savings: 8,323,477
CAR File: 38,877
Percent: 0.5%

EXE File Example
EXE File: 216,064
Savings: 213,336
CAR File: 2,728
Percent: 1.3%


**** TESTS ON ALREADY-COMPRESSED FILES ****

MP3 File Example
MP3 File: 4,961,773
Savings: 4,945,669
CAR File: 16,104
Percent: 0.3%



MPG File Example
MPG File: 5,976,068
Savings: 5,946,909
CAR File: 29,159
Percent: 0.5%


If you didn't see it first time, I compressed an MP3 file from 5 meg to 16kb.
What CAR actually does is obviously a complete secret, but I'm really really excited about it! I've been thinking of how to do it for years - but now, yay! (I figured it out playing around in QB, of all things!).
What I want to know is basically: are there any sites that are relatively easy to understand that tell you how to do:

* Huffman Compression
* LZW Compression
* "Textbook" RLE Compression (I only know PCX's RLE)

I know that you use binary trees and nodes and so on, but I have no idea how to implement them in software!
Anyway, you probably don't believe me, but I just wanna try to make the compression better.
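For the third item, a minimal "textbook" RLE can be sketched in a few lines. This is my own byte-oriented illustration in Python (an assumption about what "textbook" RLE means here, not PCX's variant and not CAR): each run of a repeated byte is stored as a (count, value) pair, with the count capped at 255 so it fits in one byte.

```python
def rle_encode(data: bytes) -> bytes:
    """Encode data as (run length, byte value) pairs, runs capped at 255."""
    out = bytearray()
    i = 0
    while i < len(data):
        run = 1
        while i + run < len(data) and data[i + run] == data[i] and run < 255:
            run += 1
        out.append(run)       # run length, 1..255
        out.append(data[i])   # the repeated byte
        i += run
    return bytes(out)

def rle_decode(data: bytes) -> bytes:
    """Expand (count, value) pairs back into the original bytes."""
    out = bytearray()
    for count, value in zip(data[0::2], data[1::2]):
        out.extend([value] * count)
    return bytes(out)

print(rle_encode(b"aaaabbc"))  # b'\x04a\x02b\x01c'
```

Note that on data without long runs this scheme *doubles* the size, which is why real formats only switch to run mode when a run is actually present.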

Thanks from a very very excited
Kieren Johnstone

Name: Anonymous 2013-07-27 8:09

Compression works differently at different entropy levels:
1. Low-entropy content like bitmaps compresses excellently.
2. High-entropy content like 7zip archives doesn't compress at all.
If there were a way to convert high-entropy data into a low-entropy form, it would be possible to recursively compress data down to some minimal block, but the information required to convert the low-entropy data back into its high-entropy form will also grow:
meaning that even if such compression existed, it would have a concrete limit depending on the initial entropy.
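The entropy levels above can be made concrete with a quick estimate. This is my own illustration (not from the post): byte-frequency Shannon entropy in bits per byte, where low-entropy data scores near 0 and random or already-compressed data scores near 8.

```python
import math
from collections import Counter

def entropy_bits_per_byte(data: bytes) -> float:
    """Shannon entropy of the byte-frequency distribution, in bits/byte."""
    n = len(data)
    counts = Counter(data)
    # sum of p * log2(1/p) over the observed byte values
    return sum((c / n) * math.log2(n / c) for c in counts.values())

print(entropy_bits_per_byte(b"aaaaaaaa"))        # 0.0  (one symbol, no information)
print(entropy_bits_per_byte(bytes(range(256))))  # 8.0  (all byte values equally likely)
```

This only measures the order-0 (single-byte) distribution; it ignores patterns between bytes, which is exactly what LZW-style schemes exploit.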
I think there is potential in bit-reordering algorithms like delta encoding: http://en.wikipedia.org/wiki/Delta_encoding
example:
011101011010101110101110011010 can be encoded as the difference between adjacent bits (starting from an implicit initial 0 before the string, with 1 meaning 'change' and 0 meaning 'no change') like:
010011110111111001111001010111 and again
011010001100000101000101111100 and again
010111001010000111100111000010 each time, the entropy of the string changes; the problem is finding the pass with the lowest entropy.
Now, if we store a string of bits together with the number of times to delta-encode it (or apply some other method) to reach a slightly lower-entropy form, we can compress the resulting string to a smaller size, and the process could be repeated until the 'number of encode passes' metadata grows enough to nullify any further compression.
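The delta scheme described above is easy to check in code. A sketch, assuming the semantics from the example (output bit i is 1 iff bit i differs from bit i-1, with an implicit leading 0); it reproduces the strings in the post:

```python
def delta_encode(bits: str) -> str:
    """1 where a bit differs from its predecessor (implicit leading 0)."""
    prev, out = "0", []
    for b in bits:
        out.append("1" if b != prev else "0")
        prev = b
    return "".join(out)

def delta_decode(bits: str) -> str:
    """Inverse: flip the running bit on every 1, keep it on every 0."""
    prev, out = "0", []
    for d in bits:
        prev = prev if d == "0" else ("1" if prev == "0" else "0")
        out.append(prev)
    return "".join(out)

s = "011101011010101110101110011010"
print(delta_encode(s))  # 010011110111111001111001010111, as in the post
```

Since each pass is invertible, storing the pass count is indeed all the metadata needed to undo it; the catch is that no pass can reduce entropy on average, which is why the recursion bottoms out.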
