
Shitsux

Name: Anonymous 2009-07-13 9:03

Hi /prog/,

I just finished my "proof of concept" of a new compression algorithm, and frankly it's awesome.

At current guesstimates I can compress a 10MB file to about 4MB in under 30 seconds (probably under 5 seconds, but I'm being cautious).

Decompression will take about a day. A whole day. Though I'm 100% certain I could halve that time by using two cores, and CUDA/OpenCL/etc. probably couldn't hurt either, but I can't make a judgement on those yet.

So basically, in 10 years' time when we have 64+ cores and new super GPUs, it'll be awesome, right?

Name: FrozenVoid 2009-07-13 12:54

>>5 The idea itself isn't "impossible", just consider:
1. Every file is a number.
2. If you find a formula that results in this number, and the formula's variables are smaller than the file, it's compressed.
3. You can tolerate inexact results: 9 can come out as 9.8 or 9.01 and still decompress into the same file.
4. By manipulating the file content/number (e.g. multiplying it by a constant like 2^(8*filesize)) you can enlarge the search space enormously, and decompression is just dividing the result by the same constant and discarding the fractional part.
5. You can use any formula to generate numbers (using multiple formulas provides the best convergence, btw), as long as its variables are compact: x*(j^c), x^y mod N, (x^y)*(v^c), (k^m)/(u^g),
and the result must fit into the search space. Example:
file is the integer 9
floating-point space: 9.000... to 9.999...
with multiplication by a constant (e.g. 100): 900.000... to 999.999...
with multiplication by a formula (e.g. 2.45^7.1+1): 5224.809348679253 to 5805.343720754726
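The steps above can be sketched as a brute-force search. This is my own minimal illustration, not code from the thread: all the names (`compress`, `decompress`, the (x, j, c) grid) are assumptions, and the formula family is just x * j^c from point 5, with the fractional part discarded as in point 4.

```python
# Hypothetical sketch of the scheme: treat the file as an integer N,
# then brute-force small (x, j, c) such that x * j**c, with its
# fractional part discarded, reproduces N exactly.

def file_to_int(data: bytes) -> int:
    return int.from_bytes(data, "big")

def int_to_file(n: int, length: int) -> bytes:
    return n.to_bytes(length, "big")

def compress(data: bytes):
    """Search a tiny parameter grid for (x, j, c) with int(x * j**c) == N."""
    n = file_to_int(data)
    for x in range(1, 64):
        for j in range(2, 10):
            for c in range(0, 32):
                if int(x * j ** c) == n:  # inexact hits are fine: we truncate
                    return (x, j, c, len(data))
    return None  # no compact formula found in this grid

def decompress(params) -> bytes:
    x, j, c, length = params
    return int_to_file(int(x * j ** c), length)

data = b"\x09"            # the "file" is the integer 9, as in the post
p = compress(data)        # finds (1, 3, 2, 1): 1 * 3**2 == 9
assert p is not None and decompress(p) == data
```

Note the catch: by a simple counting (pigeonhole) argument, almost no file of any real size has a formula whose variables are smaller than the file itself, so this only "works" on toy inputs like the 9 above.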
 
____________________________________________
http://xs135.xs.to/xs135/09042/av922.jpg
What Dijkstra thinks of machines that could think is as relevant today as Algol Programmer's Guides
