
xz

Name: Anonymous 2010-05-15 6:36

THE FUTURE OF COMPRESSION FORMATS

Name: Anonymous 2010-05-15 7:15

Micro black holes inside non-volatile storage devices.

Name: Anonymous 2010-05-15 7:55

slow as fuck, feels like ruby

Name: Anonymous 2010-05-15 8:31

>>1
In the future there will be no need for compression because everything will be stored in the cloud.

Name: Anonymous 2010-05-15 9:43

>>4
But the VM will still be compressed.

Name: Anonymous 2010-05-15 9:55

>>5
There won't be any need. There will be enterprise-grade infinite quantum storage devices.

Name: Anonymous 2010-05-15 10:24

>>1
I'll stick with my lovely infinite compression software, thank you very much.

Name: Anonymous 2010-05-15 13:14

>>4
Fuck the cloud, just give me very dense storage which can also act as programmable logic:

http://highscalability.com/blog/2010/5/5/how-will-memristors-change-everything.html

Name: Anonymous 2010-05-15 14:34

>>8
PROGRAM MY ANUS

Name: Anonymous 2010-05-15 16:22

>>7
/dev/null does not count as compression.

Name: Anonymous 2010-05-15 16:32

>>11
Hawking would agree with you (now); Penrose hasn't committed either way, but seems to favour it.

Name: Anonymous 2010-05-15 16:37

>>12
Penrose is a feeble-minded moron.

Name: Anonymous 2010-05-15 17:24

>>13
He and Hawking are buds, you know.

Name: Anonymous 2010-05-15 18:27

>>14
They've got one mens sana in corpore sano between them, then. Maybe they should exchange parts.

Name: Anonymous 2010-05-15 18:46

>>13
I know nothing of Penrose's mental state, but he's gotten prizes and stuff, so he can't be all bad.

Name: Anonymous 2010-05-15 19:25

I was hex editing a .xz file just now, and I randomly stumbled upon a "decaf". I mean, what are the odds?

Name: Anonymous 2010-05-15 19:59

>>17
0xH0TC0FFEE

Name: Anonymous 2010-05-15 20:01

>>8
This is really quite interesting.

Name: Anonymous 2010-05-15 20:19

>>16
Did you know Hitler, Mussolini, and Stalin were all nominated for a Nobel Peace Prize?

Name: Anonymous 2010-05-15 21:42

>>17
> I was hex editing a .xz file just now
What would you do that for?

Name: >>21 2010-05-15 21:45

Name: Anonymous 2010-05-15 21:48

>>20
Did you know that the guy who came up with the Peace Prize was also the inventor of dynamite? It's a hypocrite prize.

Name: Anonymous 2010-05-15 22:28

>>23
Alfred Nobel didn't come up with the Peace Prize, and he invented dynamite because nitroglycerin was killing too many miners. So you're wrong on two levels.

Name: Anonymous 2010-05-16 12:10

>>24
Yeah but nitroglycerin can still be used as an explosive

Name: Anonymous 2010-05-16 12:12

>>25
Did you know that the man who invented the lie detector was also the chief engineer of ENIAC?

Name: Anonymous 2010-05-16 13:03

>>26
LIE MY ANUS

Name: Anonymous 2010-05-16 13:05

What's the difference between LZMA and LZMA2?

Name: Anonymous 2010-05-16 13:06

>>28
One LZMA.

Name: Anonymous 2010-05-16 13:08

>>28
An integer's worth of revisions and updates.

Name: Anonymous 2010-05-16 13:47

>>28
Google is hard.

Basically nothing in terms of the meat of the algorithm.

Name: Anonymous 2010-05-16 14:36

>>28
LZMA2 divides the compressed data into packets. This allows:

1. Parallelization of compression and decompression (every packet can be processed in parallel, at the cost of higher memory usage and worse compression ratio)

2. Uncompressed packets, useful for data that can't be compressed (it wouldn't expand much anyway, but decompression of this kind of data used to be slow and this fixes it, as well as improving compression ratio 1% or so)

3. Limited data recovery capabilities

4. Partial flush support (end a packet and the decoder can decode up to that point; no need to restart compression entirely)

5. Some added limits to ease implementations (the number of context bits used for prediction is limited to 4, which caps the size of the range coder's statistical context model)

The actual compression algorithm is exactly the same (and the code is fully shared between both versions). The added overhead from the packet division and the improvements for incompressible data mean that in practice the compressed size is within 1% of LZMA's.
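The parallelization trade-off from point 1 can be sketched with Python's lzma module (which wraps liblzma). This is a toy illustration, not how xz itself splits work: each chunk is compressed as an independent .xz stream, and liblzma decodes concatenated streams transparently. The chunk size and payload are made up for the sketch.

```python
import lzma
from concurrent.futures import ThreadPoolExecutor

CHUNK = 256 * 1024  # arbitrary chunk size for this sketch

def compress_chunk(chunk: bytes) -> bytes:
    # Each chunk becomes an independent .xz stream (LZMA2 inside).
    return lzma.compress(chunk, format=lzma.FORMAT_XZ)

def parallel_compress(data: bytes) -> bytes:
    chunks = [data[i:i + CHUNK] for i in range(0, len(data), CHUNK)]
    # liblzma releases the GIL while compressing, so threads overlap.
    with ThreadPoolExecutor() as pool:
        return b"".join(pool.map(compress_chunk, chunks))

payload = b"THE FUTURE OF COMPRESSION FORMATS\n" * 20_000
packed = parallel_compress(payload)
# liblzma decodes concatenated .xz streams transparently, so one plain
# decompress() call recovers the whole payload.
assert lzma.decompress(packed) == payload
```

Note the cost the post mentions: matches can't cross chunk boundaries, so the ratio is slightly worse than one monolithic stream.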

As I read the xz specification, it would appear that you're expected to encapsulate each LZMA2 packet in an xz block. xz blocks allow preprocessing the data with additional filters, such as entropy-reduction filters for executables which convert relative jump addresses to absolute ones.
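That filter chain maps directly onto liblzma's API. A minimal sketch via Python's lzma bindings, where FILTER_X86 is the branch-converter (BCJ) filter described above; the sample bytes are fake x86 CALL instructions invented for the sketch, so don't read anything into the sizes:

```python
import lzma

# Filter chain: the x86 BCJ filter rewrites relative call/jump targets
# into absolute ones before LZMA2 sees the data, so repeated calls to
# the same function become identical byte sequences and compress better.
filters = [
    {"id": lzma.FILTER_X86},
    {"id": lzma.FILTER_LZMA2, "preset": 6},
]

# Made-up data resembling repeated x86 CALL instructions (opcode 0xE8).
sample = b"\xe8\x10\x00\x00\x00" * 1000

packed = lzma.compress(sample, format=lzma.FORMAT_XZ, filters=filters)
# The filter chain is recorded in the xz block header, so decompression
# needs no extra arguments.
assert lzma.decompress(packed) == sample
```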

Name: Anonymous 2010-05-16 15:47

> such as entropy reduction filters for executables which convert relative jump addresses to absolute ones.
My overlays!

Name: Anonymous 2010-05-17 18:24

>>3
ACTUALLY one of the main reasons for that is that the implementation you're likely using ("XZ Utils") is terrible.

It's an ancient fork of the original LZMA code, mangled in several ways. I just did a few benchmarks using the 9.xx series of 7-Zip. In single-threaded mode, 7-Zip was already over 30% faster, and the resulting file was a hair smaller (a few bytes out of a few megabytes, probably just a different header). I tried to choose comparable settings (defaults for XZ Utils; xz+lzma2+ultra+8mb dictionary for 7-Zip). I checked that the resulting files can be decompressed by both implementations.

On top of that, 7-Zip has multithreading. Two threads are absolutely free (same memory use, same result file bit-for-bit) and provide an additional 70% speedup (at this point we've more than doubled the speed). Then you can use even more threads in exchange for more memory and a lower compression ratio.

Decompression is harder to benchmark since it's so fast, but there didn't seem to be much difference at all.
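A rough way to reproduce that kind of comparison yourself. The file name and option strings here are assumptions, not the poster's exact invocation, and the 7-Zip step only works if `7z` is installed:

```shell
# Build a compressible test file, then time both implementations.
seq 1 500000 > sample.txt
time xz -6 -k -f sample.txt                              # XZ Utils, single-threaded
time 7z a -txz -m0=lzma2 -mx=9 sample-7z.xz sample.txt   # 7-Zip, LZMA2
# Cross-check: each implementation should decode the other's output.
xz -t sample-7z.xz 2>/dev/null && echo "cross-decode OK"
```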

Name: Anonymous 2010-12-20 21:46
