
Simple made easy

Name: Anonymous 2011-10-21 10:06

http://www.infoq.com/presentations/Simple-Made-Easy

Rich Hickey's presentation which received a standing ovation from Dr. Sussman.

Name: Anonymous 2011-10-23 1:46

>>80
I'd say intuition shouldn't be conflated with familiarity; programming in general is unintuitive to the population at large. People with the knack often get their first exposure to programming through an imperative language. Functional and constraint languages seem unintuitive to programmers seasoned in sequential reductionism but not in functional reductionism or logical description; however, people unfamiliar with any of this may find these ``non-intuitive'' modes of thinking easier to grasp because they fit well with related lay knowledge in math and logic.

Name: Anonymous 2011-10-23 1:47

>>79

What a crock of shit. A state machine can loop infinitely as well.

Name: Anonymous 2011-10-23 3:59

>>82
Only when given an infinite input.

Name: Anonymous 2011-10-23 5:22

>>79
That's why total programming can be useful and non-Turing-complete programming languages exist.

Name: Anonymous 2011-10-24 17:09

>>54
> degraded completely into empty talks about assembly, CPUs and other irrelevant bullshit.
There it is.  This is the heart and soul of the problem with idealists like Hickey.  He even says several times that GC is great because the programmer shouldn't have to worry about managing memory.

A computer is a bunch of bits and some logic that can toggle those bits, and nothing more.  If you refuse to see the computer for what it is, then you shouldn't be a programmer -- you should be a mathematician.

So you want to write the fibs program.  The idealist wants to pretend that integers are arbitrarily large and the program will just run forever, spitting out the sequence onto the monitor.  The programmer knows that there are many limitations to this program:
 - if your integers are fixed-size, they will either saturate or wrap
 - if your integers are variable sized, they will consume all of memory
 - if you're relying on GC, you'll hit the memory limit even sooner than if you managed your own memory
 - if you're outputting to a file, the file will fill up the hard drive
 - if you're outputting to the screen, the output will reach a point where even a single number won't fit on that screen

If you don't consider these kinds of limitations to be important, then you're not a programmer.  Every program that has ever actually been run has been limited by the available hardware.  Ignoring reality just makes you lazy.

non-sage-ing because this is the only recent thread worth a shit

Name: Anonymous 2011-10-24 18:59

>>85
Maybe the primary mistake is thinking that memory management is anything but just another problem to be solved by a programmer.  If there's some GC algorithm out there that suits the needs of the program you're currently writing, then use that GC algorithm just like you'd use your favorite sort algorithm.  But don't try to pretend that it isn't an algorithm or that it isn't part of programming.  Yes, it's great when you can use an existing library to solve a problem, but it's dumb to think that someone's going to write a library (GC or otherwise) that magically works in every situation.  Again, you're just being lazy.

Name: Anonymous 2011-10-24 20:41

>>85
Hey fagstorm, if we had less fucktards like you, we wouldn't have as many security flaws. I'll be writing fast, useful, efficient and most importantly, featureful programs as you debug your crashes that only occur when turning on -O3 and -DUSE_SMP. Now go scrub another midget you fucking faggot.

Name: Anonymous 2011-10-24 21:27

>>85
Go back to Real Programming in Fortran to perform ENTERPRISE INTEGER OVERFLOW on some rockets and leave us Quiche Eaters to conjure our spells.
>>87
This.

Name: Anonymous 2011-10-24 22:35

>>85
>>86
...sure... but the only reason to write your own memory manager is performance, and the thing is, in the near future, performance is going to come from multi-core processing. And manually doing that is just too hard for humans if you're code is side-effect ridden and low-level. Being good at manual memory management in 2011 is like being good at making buggy whips in 1911.

Name: Anonymous 2011-10-24 22:44

>>89
It really depends on the application. Low-power high-function computers like phones and handhelds still needs specialist knowledge to get the most out of the machine; this also includes low-power, low-cost specialist machines like media service devices or specialised service machines. Many people never work with such constraints and so, they don't require such knowledge.

This is really just an application of using the right tool for the job. I like using other people's tools whenever possible so that I can invest my effort into ensuring the computing experience is logically correct and acceptably quick.

Name: Anonymous 2011-10-24 22:47

>>85

> - if your integers are fixed-size, they will either saturate or wrap

x \in Z/nZ where n = 2^k, and where k is either 8, 16, 32, or 64.

> - if your integers are variable sized, they will consume all of memory

Let M be the amount of memory in bits, dedicated for storing integers, and let I be the set of all active integers, and V(i) be the value of integer i. Then the following must hold:

\sum_{i \in I} ceil(log2(V(i))) <= M

> - if you're relying on GC, you'll hit the memory limit even sooner than if you managed your own memory

Let TGC be the time at which you run out of memory while using GC. Let TMAN be the time at which you run out of memory using malloc/free-style memory management. Then:

TGC > TMAN

This isn't necessarily true though. Malloc/free style can lead to pretty bad fragmentation, while with a copying garbage collector you can compact the memory that is in use. It is true that you'll always have to keep part of the heap free so you'll have some space to copy stuff to, but if you use a copying collector with many generations, then you can make the unoccupied part small.

> - if you're outputting to a file, the file will fill up the hard drive

Let F(t) be the amount of bits you have outputted to a file, after time t, and H be the number of bits on the hard drive. Then we must have that F(t) <= H, for all t.

> - if you're outputting to the screen, the output will reach a point where even a single number won't fit on that screen

Let S be the number of digits that can be stored on a screen, and N be a multiset of numbers which you would like to display on the screen, in base 10. Then, if you don't use delimiters, the following must hold:

sum_{n \in N} ceiling(log10(n)) <= S

and if you are using a single separator character, then:

sum_{n \in N} (ceiling(log10(n)) + 1) <= S

or

sum_{n \in N} ceiling(log10) + |N| <= S

Name: Anonymous 2011-10-24 22:49

sum_{n \in N} ceiling(log10(n)) + |N| <= S

sorry for my equation fail.

Name: Anonymous 2011-10-24 22:51

Name: Anonymous 2011-10-25 0:28

>>92
That's not an equation; it's an inequality.

Name: Anonymous 2011-10-25 1:32

>>87
The fucktard is YOU, idiots who can't think.

If by "security flaws" you mean things like buffer overflows, that's another artifact of teaching idealist bullshit first. Memory is NOT infinite, things can NOT grow arbitrarily long.

When people learn to drive, they learn to realize their car has a certain size and don't try to drive it into places where it won't fit.

What's so fucking hard about making sure the memory you're using is the right size? You don't even have to estimate, just count! It's so simple it should be common sense, and yet it isn't...

...because those who were taught the stupid "infinite resources" bullshit were conditioned to think otherwise.

Ignoring the reality and trying to cover it up by inventing more shit on top of it doesn't make it go away.

Name: Anonymous 2011-10-25 1:48

>>95
> What's so fucking hard about making sure the memory you're using is the right size? You don't even have to estimate, just count! It's so simple it should be common sense, and yet it isn't...
Don't ask me, my code is fucking perfect. Ask the fucktards whose commits I have to watch over because around 20% introduce a major security flaw. You've never worked in a team project? Then fuck your shit. I want something that the retards I have to keep in line can use without doing too much damage -- and I know damn right that it ain't C (or C++, I simply can't imagine what creative abominations they'd come up with if allowed to use C++).

> Ignoring the reality and trying to cover it up by inventing more shit on top of it doesn't make it go away.
If someone writes shit in <proper low level language>, the only way I can fix it is by reverse engineering it and doing a full rewrite (the next person who submits complex code without comments or accompanying documentation, I swear I will key their fucking car). If someone writes shit in <proper high level language>, at least I can somewhat optimize it (assuming the code isn't completely broken), for example, by using more specific (typed instead of generic, in dynamically typed programming languages) data containers, at least in the cases where the JIT's type inference can't figure shit out. In any case, the code will be shorter and easier to read, and hopefully I won't have as much work to do.

All in all, you are half of what is wrong with this world. The other half are the fucking java monkeys who just fell out of the TreeFactory and hit every enterprise branch on the way down.

a.sdkfjas;lkfjw I hate the world ;_;

Name: Anonymous 2011-10-25 1:58

>>95
you know what else is really simple? NAND gates. But I don't program in them. Just sayan.

Name: Anonymous 2011-10-25 2:00

>>95
Well, just to be clear, >>95 is not me: >>85,86.  I'm actually not sure whether >>95 is trolling me or >>87, but it doesn't really matter.

Being good at manually managing memory is extremely valuable today.  I don't care whether anyone here believes it or not, but I've made a lot of money by understanding computers and not treating them like some theoretical toy.

My point is that they'll always have finite speed and storage.  It's nothing more than laziness and short-sightedness to hold out for some imaginary gleaming future where the amount of memory or processing power crosses over some arbitrary threshold, and then, finally, they'll be good enough that we don't have to worry about managing memory.  The computer I learned to program on had 64KB of memory, and 32KB of that was used by the OS and BASIC interpreter.  A cheap new computer today has 4GB, of which over 3GB is available to applications.  That's a factor of 100000 increase -- and guess what -- every "cutting edge" game for sale on the shelf next to that new computer still has to worry about how to efficiently use that 3GB.  We're still managing memory manually.

That shiny future that you're waiting for isn't coming because the systems' resources dictate the applications' design, not the other way around.  That's not going to change, because I can write an application to make good use of any amount of memory that you can give me.  There will never be a computer fast enough or a memory large enough.

I lol'd at scrub another midget.

Name: Anonymous 2011-10-25 2:06

>>96
First off, why would you respond to a troll like that?
Second, this:
> my code is fucking perfect
is either a worse troll than >>95 or proof that you're a freshman in college.  The rest of your post makes it sound like you're on some group project with a bunch of retards and you think it's all C's fault.

Name: Anonymous 2011-10-25 2:08

>>97
But knowing how to would make you better at VB.Net or whatever it is that you do program in.

Name: Anonymous 2011-10-25 2:34

>>98
if you're programming for a modern personal computer or smart phone, memory is effectively unlimited. This is by far what most programs are written for today. In my professional life, the only time I have/had to worry about amount of memory is on video cards when writing shaders. Other than that, it's never a problem. Even (or even especially) when I'm writing in a functional style, because there's so much less "defensive copying" going on when you can safely share immutable data structures. Copying of that kind is epidemic in side-effect heavy languages like C++ and Java.

the claim is not that optimization is not important, it's that we are optimizing *wrong*, just like people in the early 80's who were still writing assembly were optimizing wrong. Efficiency comes from simplicity, not from clever hacks and too much attention to system resources. The same thing that happened to assembly -- computers becoming better at writing it efficiently than humans -- is happening and is certainly going to happen more to: memory management, multithreading. Moreso multithreading than memory management, but the thing is, proper, lockless multithreading requires an abstraction of memory and the only reason we are so very concerned with optimizing memory and caches in the first place is because we are so very terrified of writing functional code that nobody on the hardware side is putting much effort into making it worth it.

I'm not claiming that low level programming will be a forgotten art. All abstractions leak. But it really is attitudes like yours that are holding us back. we've hit this single-thread performance bottleneck and nobody wants to put a lot of effort into the right solutions because nobody wants to learn them or take a step back and think about their programs differently.

Name: >>96 2011-10-25 2:35

>>98
Nobody should be allowed to learn a high-level language before a low-level language along with assembly. That way, you'll learn how to recognize and cherish those things that are done automatically for you, and keep in mind that the VM is not magic and will not magically optimize your code for you. As for the manual memory management, fuck this I'm not repeating all of my pro-GC arguments in every troll shit thread; I'll maybe write it into a kopipe (like the anti-Python kopipe), but I'm too drunk to do it tonight and tomorrow I won't remember.

>>99
Yeah, my code isn't really perfect, but my mistakes are often off-by-ones and things that show up immediately on valgrind. It's very rare that my mistakes make it into an actual commit.

> The rest of your post makes it sound like you're on some group project with a bunch of retards and you think it's all C's fault.
Fuck you, look at GNOME. Oversimplified crashing piece of shit that takes tons of memory. Computers used to run just fine with much less memory. If GC and a safe language will rid me of 90% of the crashes and maybe even lead to simpler and faster code, then so be it. Why does driving require a license but writing code that can bring down the fucking machine doesn't?

fuck why do I respond to shit like this

Name: Anonymous 2011-10-25 3:10

>>102
While I agree with your overall sentiment, I am compelled to add this to the discussion.

> Computers used to run just fine with much less memory.
Computers couldn't do as many things before compared to today. Have you really forgotten what internet video was like on standard machines before 2006? The reason we can do so much now is because computers can now crunch through the bits a lot quicker.

>Why does driving require a license but writing code that can bring down the fucking machine doesn't?
All sorts of idiots need to be accountable towards every other car driver on the road. Programmers that write important code for a machine ought to have an implicitly high standard; not only that, the code ought to pass through a QC team before being put into production.

Name: Anonymous 2011-10-25 3:19

>>101
> But it really is attitudes like yours that are holding us back.
Wait, explain that (I'm not being defensive about it, I really want to know what you mean, and I'm open to criticism)

I'll even sum up my "attitude" for you:
 - I got a masters and worked in Silicon Valley at a really big company that you probably know really well
 - I did a lot of low-level coding there, all the way down to the hardware
 - From there, I started a small game company that was moderately successful
 - From there, I started another company that makes both hardware and software and this company is currently right in the midst of substantial success
 - Over that span, I coded at every level from logic gates to assembly to C to C++ to Perl to Java to C# to a handful of languages that I designed myself to suit a very specific need.

I am comfortable with bare metal languages, device drivers, BIOS code, OS code, game engines, scripting languages, interpreted languages, writing interpreters, writing compilers, code that writes code, genetic programming, you name it...

This current project requires me to do an odd mix of all of it.  I have to design hardware, I have to write firmware to go into that hardware, I have to write drivers to communicate with that firmware, and I have to write really high-level GUI code to allow someone to actually use any of it.

In that massive chain of computing, there are times when I have to get something done in 4KB.  More often, I have 4GB of memory plus 4GB of pagefile swap space.  And in spite of that, there are parts of that high-level GUI code with inline __asm{} and it isn't just to show off my ASM skills.  And believe it or not, that high-level GUI code manages its own memory because it has to.  I've fucked it up several times and the result is a PC that becomes completely non responsive because it has simply run out of memory.  Even in this modern age, computers do not fail gracefully when you consume all of RAM.

Anyway, the point is that I am neither a naive, college graduate, iPhone app hacker nor a Luddite clinging to the glory days of 8086 asm.  And my "attitude" is that I can assign a real dollar amount to the value of not pretending that a computer is an imaginary mathematical construct with infinite resources, and not waiting for the day when GC finally becomes "good enough."

Name: Anonymous 2011-10-25 3:21

>>90
You're right. Good thing we have Java and Python on most mobile phones, since they have no GC and are LOW-LEVEL AS FUCK.

But yes, it depends.

Name: Anonymous 2011-10-25 3:23

>>102

Coming from an entry-level programmer, I would have loved to get deeper into assembly (learned the basics) but effective resources are so difficult to come by.

And I totally agree with your method. I'm a mechanical designer, and back when I was learning drafting several years ago, we did everything by hand on vellum and then did it over in AutoCAD and the like. Very effective.

Name: Anonymous 2011-10-25 3:56

>>104
If you're working on your own hardware then you aren't part of the problem. The problem is cultural. The majority of manual memory management proponents work on programs that assume 4GB of memory or more and work on projects that are orders of magnitude more complex than they have to be and can't be parallelized because of the premature optimization and aforementioned complexity etc etc.

The problem is the mutual worship of the people writing software and the people creating hardware, leading us in a completely arbitrary direction. If you're making your own hardware then it's all driven by what you need, presumably, and you aren't in this hellish loop of progress prevention.


also what the hell kind of device has 4MB of memory. Does it fit in someone's contact lens or what?

Name: Anonymous 2011-10-25 4:02

>>108
also what the hell kind of device has 4MB of memory. Does it fit in someone's contact lens or what?
Specialist devices such as consumer network routers and modems, home automation systems, and car audio systems. A computer contact lens would require a lot more memory than 4MB.

Name: Anonymous 2011-10-25 4:07

>>107
I didn't say "4MB" anywhere.  I said "4KB" in one instance and "4GB" in another.  The 4KB limit is there because sometimes that's all you get in firmware.  In a lot of cases, you even have to tell your compiler or RTOS in advance how much of that 4KB will be stack space/heap/etc...  It sucks.  The whole system has more than 4KB, but it's very common that some particular task only gets a tiny little piece of that to work with.

But like I said, I also work on the PC side of things.  And in fact, the PC software is also necessarily multi-threaded.  And it's also very complex.  You seem to be implying that there's just no way to do concurrency without throwing in the towel on memory management, but I can assure you that's not the case.  Even modern FPS games generally put their AI and possibly other tasks like physics or audio into separate threads.

> people writing software and the people creating hardware leading us in a completely arbitrary direction
Sorry, I just can't buy this...  All the hardware guys ever do is make faster CPUs and bigger memory, and you can't really call that an arbitrary direction.  And who cares what the people writing software are doing?  You can use any platform/language/paradigm you want.  Are you concerned that your particular choice won't have enough library support or something?

Name: Anonymous 2011-10-25 4:27

>>109
> And it's also very complex.  You seem to be implying that there's just no way to do concurrency without throwing in the towel on memory management, but I can assure you that's not the case.

I'm saying it's impossible to do multithreading that ISN'T very complex without throwing in the towel on memory management. Or, as Rich was saying, I don't know of a way to reify time without allocating memory.

> Even modern FPS games generally put their AI and possibly other tasks like physics or audio into separate threads.

It's my position that at this point in history, the hardcore games should be running on 30 threads, not 3. And they basically do, but only by means of the GPU (which is now being expanded to work for physics.)

> Sorry, I just can't buy this...

the story is:
guy who has to write really fast code:
this is how the hardware works, so I'll code very specifically for it to get the best performance.

guy who has to make the next really fast hardware:
this is how the guys who have to optimize hardest are writing (nobody is using more than a few threads) so this is how we should design processors.

customers:
I really need THIS processor to run my latest game...

programmers who aren't even working in a CPU-bound domain:
well this is what the cool guys are doing so...

Name: Anonymous 2011-10-25 4:53

check my triples

Name: Anonymous 2011-10-25 4:55

>>111
I-I-It's not like I want to... baka!

Name: Anonymous 2011-10-25 5:38

>>104
I think in the domains you've been involved in, low level programming practices like manual memory management are either the norm or necessary in the contemporary situation. Nobody would attempt using GC on a mid-range PIC microcontroller when it has 150 bytes of data RAM and 4Kwords of code flash total. However, there are a lot of domains where Turing-like abstraction is either acceptable or the only reasonable idea.

More importantly, economic factors do apply to evaluating the effectiveness of these approaches. When it comes to the wastefulness of GC, the overhead of a hefty run-time for these non-imperative systems, and other abstractions, the factor would be a ratio of roughly the number of bytes to the number of minutes the programmer has in a lifetime. A few decades ago, every byte had significant value so assembler was useful; a few decades before that they were even more valuable, such that they would thread iron doughnuts onto crossed iron wires in an array, by hand. Doing this exercise now would be foolish.

Speaking of which, in hardware there are a ton of abstractions; I would say there has been even more progress of ``turingnization'' in that field than there has been in software as of yet. At around the time between LSI and VLSI, around the first Mac I believe, they prototyped with wirewrap, likely ``drew'' the circuit board design by hand, and soldered circuit boards by hand.

Now, people consider CPLDs primitive, they buy and use ``IP cores'', they may connect up virtual wires in a CAD tool (some of them can auto-connect to GND and Vdd based on part descriptions), the circuit can be type checked, and to some degree debugged; then, pass this schematic to some EDA tool like EAGLE or whatever to automatically route paths and TADA, a circuit board, and then generate Gerber files, drill files, pick and place files, etc. Email files to China. These tools exist because they are needed to remain competitive and improve metrics of profitability: the same trend must be happening in software, since all the big players are picking up things LISP did 40 years ago.

I suspect that ``GC is shit'' thinking will be transitory.
There was skepticism that FORTRAN could overtake assembly.
There was skepticism that Pascal's ``structured programming'' would be as flexible as FORTRAN's GOTO.
The cycle goes on.

Functional programming research is about doing the things that the school of thought behind software engineering has been attempting to band-aid over for decades. Computer science is about concrete mathematical structures; likewise, programming language research focuses on better software construction based on concrete mathematical structures. Since programming proper is divorced from the natural world (other than when hardware limits are reached), traditional engineering concepts can only apply so much; there's no such thing as a weakened byte, or a shear modulus of a function call. Likewise, the idea of calculating metrics of N levels of ``bug-free-ness'' in X KLOC of code that will arrive in Y days and cost Z dollars total is a crock.

Name: Anonymous 2011-10-25 5:45

> Don't ask me, my code is fucking perfect. Ask the fucktards whose commits I have to watch over because around 20% introduce a major security flaw. You've never worked in a team project? Then fuck your shit. I want something that the retards I have to keep in line can use without doing too much damage -- and I know damn right that it ain't C (or C++, I simply can't imagine what creative abominations they'd come up with if allowed to use C++).
What the hell are those fucktards doing writing code in the first place? Fire 'em!

> Have you really forgotten what internet video was like on standard machines before 2006?
Actually, bandwidth was the main bottleneck. Even a Pentium 233 can play 320x240 MPEG-1.
> The reason we can do so much now is because computers can now crunch through the bits a lot quicker.
Give this a read:
http://hubpages.com/hub/_86_Mac_Plus_Vs_07_AMD_DualCore_You_Wont_Believe_Who_Wins

Also http://www.menuetos.net/ - the problem is not that we need faster hardware, but that software needs to be more efficient.

Name: Anonymous 2011-10-25 7:19

>>114
There's a fine line between writing efficient software and shipping software ASAP. Reusable APIs exist for the sake of programmers, so they don't have to invest effort into constantly rewriting general software.

The philosophy I follow is
> We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil - Donald Knuth
> In established engineering disciplines a 12% improvement, easily obtained, is never considered marginal and I believe the same viewpoint should prevail in software engineering - Donald Knuth

Name: Anonymous 2011-10-25 13:21

>>115
I agree - 10% of the code takes up 90% of the time, so it's often pointless to optimize the remaining 90%.

Name: Anonymous 2011-10-25 13:56

>>115,116
> The philosophy I follow is
> We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil - Donald Knuth

This is a great philosophy, but if you really believe it, then the first thing you should do is dump functional programming.  You're doing exactly what Knuth says you shouldn't.  You're bringing in the huge, expensive "optimization" of immutability and garbage collection across the board, whether it's needed or not.  If you really wanted to follow that philosophy to the letter of the law, I don't see how you could be anything but a C programmer.  C gives you absolutely nothing out of the box.  If you want GC or FP or OOP idioms, you have to add them via libraries or roll your own.  And you're not going to find a language with more broad library/API/OS support than C, so get libraries for the 97% and spend your time doing the remaining 3% right.

Name: Anonymous 2011-10-25 15:10

>>117
Use sage.

> This is a great philosophy, but if you really believe it, then the first thing you should do is dump functional programming.
You're talking about purely functional programming, right? Because in a language that merely emphasizes functional programming, there's nothing stopping you from heavily mutating things all while nicely compartmentalizing computation as you would in functional programming.

> immutability and garbage collection across the board, whether it's needed or not
If you can't recognize whether the JIT can optimize away unneeded copies of things (e.g. via escape analysis) and that you should replace them with direct mutation, or you can't tell when the JIT will be unable to choose a type-specific container and you should enforce (or hint) it manually, then you shouldn't be allowed to use a high-level language. As simple as that.

> If you really wanted to follow that philosophy to the letter of the law, I don't see how you could be anything but a C programmer.
C, when compiled directly to machine code, will always yield lower performance at the same levels of complexity as a higher level language (kindly resist the temptation of pointing out shitty languages with even shittier implementations as a counterexample to this). A well-written JIT may yield faster code than a static compiler via profiling.

All in all, kindly stop hating on a certain way of doing things simply because some (if not most) of its proponents are cretins.

Name: FrozenVoid 2011-10-25 15:15

>>117
GC languages have a niche for their use as user-friendly scripting glue or interpreters, not performance. GC implementations today are very resource-intensive, and these languages have a layer of safety cruft which is a bottleneck compared to C, as is often seen in benchmarks.

Name: Anonymous 2011-10-25 15:28

>>119
Fuck off and die back to reddit, fhuckhface.
