
Too many languages

Name: Anonymous 2014-03-09 9:44

There are thousands of programming languages.

The purpose of a programming language is to express programs. The
purpose of learning programming languages is to build up a toolbox for
reasoning about and synthesizing programs in any one given language.

There are diminishing returns on learning programming languages, and
time is scarce.

Therefore one must select between programming languages to study.

A good selection of languages has both
+ breadth
  + satisfies a number of real world economic needs.
+ focus
  + exploits similarity between languages and incremental learning.
  + some unifying basis

A good member of a particular selection meets a number of the
following criteria:
+ Satisfies one particular school of thought on programming languages.
+ Significant difference from predecessors
+ Significant influence on successors
+ Economically significant
+ Advanced, i.e. has no direct, established and proven heir.
+ A good language.
  + Easy to express programs with
  + Easy to read programs expressed with
  + Easy to reason about programs expressed with

None of these criteria is a sufficient, or even a necessary, condition.

A bad member satisfies the opposite criteria.

Name: Anonymous 2014-03-11 4:41

You forgot XML, OP. XML > SQL

Name: Anonymous 2014-03-11 4:42

You forgot XML. XML < SQL

Name: Anonymous 2014-03-11 4:42

You forgot about XML. XML > SQL

Name: Anonymous 2014-03-11 4:50

>>41
How can one be greater or lesser than the other in this case?

XML is not really a programming language, any more than s-expressions are a programming language.

XML (and s-expressions) can be used as (the basis of) a (textual) syntax for a programming language, though.
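To make the distinction concrete, here's a toy Python sketch (the tag names `add`, `mul`, `num` are made up for illustration): the XML document is pure syntax for an expression tree, and all the semantics live in the evaluator that walks it. The s-expression `(add 1 (mul 2 3))` would encode exactly the same tree.

```python
# XML as *syntax* for a tiny arithmetic language.
# The XML itself computes nothing; this evaluator supplies the semantics.
import xml.etree.ElementTree as ET

def evaluate(node):
    if node.tag == "num":
        return int(node.text)            # leaf: a literal number
    if node.tag == "add":
        return sum(evaluate(c) for c in node)
    if node.tag == "mul":
        result = 1
        for c in node:
            result *= evaluate(c)
        return result
    raise ValueError(f"unknown tag: {node.tag}")

tree = ET.fromstring("<add><num>1</num><mul><num>2</num><num>3</num></mul></add>")
print(evaluate(tree))  # 7
```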

Name: Anonymous 2014-03-11 7:50

>>41
XML is not a programming language, for it is not Turing complete.

Name: Anonymous 2014-03-11 9:41

>>45
Agda is not a programming language, for it is not Turing complete.
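The point can be sketched in Python (a toy, not Agda): bound every evaluation by finite ``fuel'' so it provably terminates. Such a language is not Turing complete, yet it still runs most programs anyone cares about; the `fuel` parameter here is my own illustrative device, not anything Agda actually does.

```python
# A "total" function: fuel bounds the recursion depth, so every
# call terminates -- either with an answer or an out-of-fuel error.
def ackermann(m, n, fuel=100000):
    if fuel <= 0:
        raise RuntimeError("out of fuel")
    if m == 0:
        return n + 1
    if n == 0:
        return ackermann(m - 1, 1, fuel - 1)
    return ackermann(m - 1, ackermann(m, n - 1, fuel - 1), fuel - 1)

print(ackermann(2, 3))  # 9
```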

Name: Anonymous 2014-03-11 10:49

>>40

Hi OP, please include D in your considerations. D feels like a Pascal/Ada/Oberon successor in many ways, while maintaining parts of its C heritage. Although historically the language tended to add features for their own sake, recent efforts have converged on the goal of very high code reuse, decoupling algorithms from data structures in the STL manner. A lot of sensible decisions were made in the process, which is not immediately apparent when surveying the language market.

Name: Anonymous 2014-03-11 11:44

Thread is too old to post in! Make a new one.

Name: Anonymous 2014-03-11 19:00

>>38
Your computers would be slow shit because they'd be made by people like you. Then people would come along and make fast, good von Neumann computers and everyone would switch. You know, like what already happened.

Lol, fpgas and dataflow architectures.

Name: Anonymous 2014-03-11 20:42

>>49
That never happened.

Name: Anonymous 2014-03-11 20:53

>>50
People were researching all kinds of wacky-ass architectures — data flow architecture goes back to the '70s — but it's out of vogue now because they all turned out to be shit. You can barely make up toy problems dumb enough to perform well on them.

Name: Anonymous 2014-03-12 8:34

>>51
I think they failed commercially because of the business practices behind the other architectures, not because they're inherently slow. The Intel x86 instruction set is terrible, and yet it's the most successful CPU architecture on the desktop. Other architectures are used in other applications, like the ARM design that's popular in modern smartphones. I don't know what Casio uses in their range of products, but I guess they succeeded because they found a niche where their set of computers worked well.

Name: Anonymous 2014-03-12 10:26

A lot of good things fail commercially e.g. Lisp Machines, Smalltalk as an operating system etc.

The market optimizes locally, not globally.

Name: Anonymous 2014-03-12 12:39

Adequate programmers develop PHP code using "PHPDevelStudio", while "C" and all its spinoffs are messy shit. My acquaintance with "C++" ended at the first pages of the textbook, when I found that to call "_getch()" you have to manually include a WHOLE library ("conio.h").

Name: Anonymous 2014-03-12 14:43

>>53
>The market optimizes locally, not globally.
So do CPUs. That's why stack variables are faster than malloc or GC.

Name: Anonymous 2014-03-12 16:56

>>52
Business practices nothing. x86 is terrible, but it's a fast kind of terrible, the kind that still makes it a good and useful choice. x86 handily outcompeted Itanium without even trying.

There's still room for alternative architectures — look at the rise of GPUs — but they have to solve real-world problems.

Name: Anonymous 2014-03-12 17:43

mining bitcoins is a very real problem

Name: Anonymous 2014-03-12 22:43

NOEXCEPT

Name: Anonymous 2014-03-12 22:48

>>56
GPUs are special purpose processors. They are intended for a special application and they work well within that niche. I'm thinking about the alternative general purpose, high performance CPU designs. I'm confident that if businesses had found a niche for a Lisp machine in the past, we'd have Lisp machines today processing lists of data in the same market as IBM mainframes.

Name: Anonymous 2014-03-13 2:37

>>59
>le pedophile sage

Name: Anonymous 2014-03-13 5:24

>>59
>GPUs are special purpose processors.
Exactly. You only need to solve one problem well to be viable, but Lisp machines and dataflow architectures don't even do that.
You're better off compiling your Lisp down to a real GP architecture, just like hardware support for Java turned out not to be worth it.

Name: Anonymous 2014-03-13 12:55

>>59
lol. but that's wrong you fucking retard

Name: Anonymous 2014-03-13 21:34

>>62
lol. but that's wrong you fucking retard

Name: Anonymous 2014-03-13 23:19

>>59
>le pedophile sage

Name: Anonymous 2014-03-14 2:28

>>59,61

I think what's inspiring about Lisp machines today, and what was lost, is the idea of high level hardware and a high level operating system, where GC, dynamic type checking, memory bounds checking etc. are part of the fundamental services of the hardware.

You're missing the point if you're looking at Lisp machines as an optimization strategy. They are a ``let's not start with shitty abstractions'' strategy.

Name: Anonymous 2014-03-14 7:32

>>65
It wasn't lost as much as deliberately abandoned.

It turns out it's better to give access to a selection of primitive computations than to design towards a specific language model.  Just like it turns out that APIs are more powerful and flexible when they are dumb REST APIs that map to the underlying model, rather than catering to the specific application you are writing.

Look at Java.  Unlike Lisp, Java is in widespread use.  It had its fair shot at hardware execution with support from large actors like Sun and ARM.  Turns out a good JIT beats it handily in all areas that matter.  Flexibility, speed, you name it.  And of course the programmers don't give a whit, they're just writing Java either way.

The thinking that an ISA should have `high level' operations is what got us to x86 in the first place.  A lot of the instructions are just convenience methods for when you're programming assembly code.

Name: Anonymous 2014-03-14 8:14

>>66
``Lost'', in the way I used it meant exactly ``deliberately abandoned'', so I don't understand why you contrasted the two.

Deliberately abandoned in no way implies deliberately abandoned due to the idea being bad, or it being inferior to what exists.

Technologies die due to economic pressures. ``Good enough'' is a thing. People use Java, don't they? Do you think Java is the world's best language? It's not, but it's ``good enough'', proven, and there are network effects in using it.

It's insanely foolish to think that markets optimize on technical merits (or that any similar evolutionary sort of process does).

As Alan Kay said (paraphrasing, because it was in some video I watched ages ago): ``Just imagine the most perfect being, and then *prfft* an elephant stomps on it, and that's it. It's gone.''

Nothing just ``turns out''. WTF are you talking about? ``Flexibility''? I'll grant you ``speed'', but I can't really name anything else, certainly not ``flexibility''.

And why on earth are you talking about ``high level operations''?

What in heck do garbage collection, type tagging and bounds checking have to do with ``high level operations''? Those aren't implemented as operations; that's the whole point.

Finally, ``A stupid idea that works is still a stupid idea'': Yiddish proverb.

Name: Anonymous 2014-03-14 8:23

>>65
>I think what's inspiring about Lisp machines today, and what was lost, is the idea of high level hardware and a high level operating system. Where GC, dynamic type checking, memory bounds checking etc. are part of the fundamental services of the hardware.
What's inspiring about that? I find that disgusting actually, because it disregards the most fundamental and beautiful property of computation, the existence of universal functions.

Name: Anonymous 2014-03-14 8:44

>>68
I'll give you a chance to explain yourself.

Name: Anonymous 2014-03-14 10:24

>>69
SUCK MAH DIIIIIIIIIIICK

Name: Anonymous 2014-03-14 10:41

>>67
>Yiddish
Shalom!

Name: Anonymous 2014-03-14 14:57

>>66
>x86
>high level

Nigga, please. x86 was designed when chips had tens of thousands of transistors. There's nothing ``high level'' about it. Block copy and 8-bit BCD aren't high level. Even a Z80 had those. 68k, PDP-11, and VAX were much nicer for assembly programmers to use. x86 was the worst instruction set ever made. Any praise it gets is from people who don't know anything else.
Intel's iAPX 432 was a high level machine, but they made instructions bit-aligned, which totally ruined performance. GC microcode and checking the bounds and authorization of every memory access barely mattered compared to having to shift every bit of every instruction through a slow-as-fuck non-barrel shifter.

Name: Anonymous 2014-03-14 19:03

>>71
Shalom! A good Shabat to you!

Name: Anonymous 2014-03-15 1:19

>>72
Leave it to Jews to make machine language convoluted and complex.

Name: 68 2014-03-15 8:21

>>69
Which word do you not understand? http://en.wikipedia.org/wiki/Universal_function

In layman's terms, a program written for some abstract machine (which is to say, an index in some particular Gödel numbering) can't possibly tell whether it's executing on said machine "directly", like implemented in hardware or something, or inside an arbitrary number of evaluation (interpretation or compilation) layers in arbitrary abstract machines. Maybe even an infinite number of such layers.

This is one of the most fundamental results in Computer Science. Why would people reject the eternal mathematical purity of an abstract machine and fetishize irrelevant transient hardware implementation details? Pig disgusting!

(maybe they never did low-level programming themselves and mistakenly believe it to be some sort of lost Eden? I did, it's dirty and exhausting, there's nothing magical about it)
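The layering point can be sketched in a few lines of Python (a toy illustration, using `exec` as the ``interpreter''; the program string is made up): the same program run directly, and run under an extra interpretation layer, produces the identical result, and the program has no way to tell the difference.

```python
# The same program, executed at two different "depths" of interpretation.
program = "result = sum(i * i for i in range(10))"

# Layer 0: run the program directly.
env0 = {}
exec(program, env0)

# Layer 1: run an interpreter (exec) that itself runs the program.
env1 = {}
exec(f"exec({program!r}, globals())", env1)

print(env0["result"], env1["result"])  # 285 285
```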

Name: 68,75 2014-03-15 8:24

I mean, go make some chairs or scrub some toilets if you're so infatuated with the material plane.

Name: Anonymous 2014-03-15 8:43

>>75
Please leave this forum and do not come back until you learn the von Neumann architecture.

Name: Anonymous 2014-03-15 8:59

>>75
Thank you for explaining

I have no such misconceptions of Eden.

Currently many software vendors spend a lot of time implementing virtual machines with GC, bounds checking, type tagging etc.

If these were expected services of the hardware, a lot of duplication of effort would be removed (at least from software).

More importantly this would raise the floor of software quality. Gone will be basic mistakes such as memory leaks or corruption.

There's nothing disgusting about wanting to build your house on a concrete foundation.
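For what the ``raised floor'' looks like in practice, here's a minimal Python illustration (any bounds-checked runtime would do): an out-of-bounds write is a well-defined, catchable error rather than silent corruption of neighbouring memory, as in unchecked native code.

```python
# The runtime checks bounds on every access, so an out-of-bounds
# write raises a defined error instead of corrupting memory.
buf = [0] * 4
try:
    buf[10] = 1
    caught = False
except IndexError:
    caught = True
print("caught out-of-bounds write:", caught)  # caught out-of-bounds write: True
```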

Name: Anonymous 2014-03-15 9:51

>If these were expected services of the hardware, a lot of duplication of effort would be removed (at least from software).
Bollocks. If it were easy to make a one-size-fits-all framework for GC, bounds checking etc., then the various virtual machines could just use it.

Now consider the fact that designing, improving, finding and fixing bugs in the same thing implemented in hardware is a hundred times harder.

If you want to build your house on a concrete foundation, why don't you just use JVM? How would the same thing implemented in hardware magically be any better?

And it's not as if anyone would be crazy enough to actually implement any significant part of it in silicon (because then improving and debugging becomes not merely much harder but actually impossible), so what you want is basically hardware with a JVM in the firmware forced on everyone (except it would be a magical JVM without any flaws, lol). Such an alluring prospect!

Name: Anonymous 2014-03-15 10:43

>>79
I do use the JVM in practice.

And it wouldn't be the JVM in hardware.

This jerk has the right idea:

http://www.loper-os.org/?p=55
