
Fucking X86

Name: Anonymous 2010-05-07 6:11

seriously why the fuck is X86 so fucking terrible! it has like a million instructions of which 1/10 are actually used, and all the other ones are either useless, fucking incomprehensible, or just so goddamn slow that nobody uses them. seriously most of the functionality could be implemented if you just combined the simple ops, but no, intel had to add instructions that combine like 10 of these, so that every single reference on the net looks like invalid perl code. is there anyone on the planet that actually knows what half of the mnemonics mean? i mean goddamnit, just look at some halfway full reference, what the fuck? what kind of people do they hire at intel? and what do these whackos fucking take? i could take LSD for 50 people and still wouldn't be high enough to come up with such crap. and what the hell is with all these weird registers? who the fuck uses 3dnow! anyway?
goddamnit, i've had it with intel and X86!

is PPC any better? what do you use, /prog/?
and where can i find some good PPC asm tuts? i would really like to know more about the POWER arch.

GodamnitFuckingCrapOfChrist
i fucking hate this shit X86

Name: Anonymous 2010-05-07 9:03

Sounds like you're just too dumb to understand x86 assembly.  Stick with C.
Holy shit, 1337 |-|4><><0|2 itt! Runaway!
SSE2 replaces the x87 FPU instructions for floating point math in x86-64.
It shouldn't. The first `S' in SSEx means ``Streaming'', because all these extensions were designed for precisely one specific sort of task: processing large amounts of monotonous data - unlike the FPU.
Moreover, to be honest, SSEx is an enormous fail. People back in those days were naive enough to think that a general purpose CPU would suffice for all that streaming data processing.
They were wrong, as we all can see. When did you last see a software renderer that wasn't SLOW AS FUCK? Or even one that was, it doesn't really matter. And why are we so worried about GPU-accelerated video decoding now?
So in the end we have nothing but useless silicon and wasted time.
-----
Oh, traditional GPU architecture sucks even more. Unified shader architecture is probably the only good innovation in the whole area for the last 10-20 years.
