>>54
degraded completely into empty talk about assembly, CPUs and other irrelevant bullshit.
There it is. This is the heart and soul of the problem with idealists like Hickey. He even says several times that GC is great because the programmer shouldn't have to worry about managing memory.
A computer is a bunch of bits and some logic that can toggle those bits, and
nothing more. If you refuse to see the computer for what it is, then you shouldn't be a programmer -- you should be a mathematician.
So take the fibs program. The idealist wants to pretend that integers are arbitrarily large and the program will just run forever, spitting out the sequence onto the monitor. The
programmer knows that there are many limitations to this program:
- if your integers are fixed-size, they will either saturate or wrap
- if your integers are variable-sized, they will eventually consume all available memory
- if you're relying on GC, you'll hit the memory limit even sooner than if you managed your own memory
- if you're outputting to a file, the file will fill up the hard drive
- if you're outputting to the screen, the output will reach a point where even a single number won't fit on that screen
If you don't consider these kinds of limitations to be important, then you're not a programmer. Every program that has ever actually been
run has been limited by the available hardware. Ignoring reality just makes you lazy.
non-sage-ing because this is the only recent thread worth a shit