
Shitsux

Name: Anonymous 2009-07-13 9:03

Hi /prog/,

I just finished my "proof of concept" of a new compression algorithm, and frankly it's awesome.

At current guesstimates I could compress a 10MB file to about 4MB in less than 30 seconds (probably less than 5 seconds, but I'm being cautious).

Decompression will take about a day. A whole day. Though I'm 100% certain I could halve that time by using two cores, and CUDA/OpenCL/etc. probably couldn't hurt either, but I can't make a judgement on those yet.

So basically in 10 years time when we have 64+ cores and new super GPUs it'll be awesome right?

Name: Anonymous 2009-07-15 5:14

>>29
Oh, so you think entropy is bullshit? If that's the case then please design a lossless compression algorithm that can indefinitely take its own output as input and always produce a considerable compression ratio.

Yeah, true random data can't be compressed to any useful degree because it doesn't contain enough redundancy. Note that it doesn't even need to be actual random data: the output of a decent compression algorithm has enough entropy to be considered random, and an untrained eye can't distinguish the two.
