
Too many languages

Name: Anonymous 2014-03-09 9:44

There are thousands of programming languages.

The purpose of a programming language is to express programs. The
purpose of learning programming languages is to build up a toolbox for
reasoning about and synthesizing programs in any one given language.

There are diminishing returns on learning programming languages, and
time is scarce.

Therefore one must select between programming languages to study.

A good selection of languages has both
+ breadth
  + satisfies a number of real-world economic needs.
+ focus
  + exploits similarity between languages and incremental learning.
  + some unifying basis

A good member of a particular selection meets a number of the
following criteria:
+ Satisfies one particular school of thought on programming languages.
+ Significant difference from predecessors
+ Significant influence on successors
+ Economically significant
+ Advanced, i.e. no direct, established and proven heir.
+ A good language.
  + Easy to express programs with
  + Easy to read programs expressed with
  + Easy to reason about programs expressed with

None of these criteria is a sufficient or even a necessary condition.

A bad member satisfies the opposite criteria.

Name: Anonymous 2014-03-09 9:46

I have made a good (as defined above) selection of programming languages:

The most meaningful categorization for this selection is by syntactic
(and therefore (mostly) semantic) tradition or school of
thought. Nevertheless the categorization is rough rather than clean.
Specifically:
+ the division between concatenative and point free is more a semantic
  one, and
+ rule syntax in this case strictly means "Prolog like rule syntax".

+ Applicative languages:
  : Common Lisp, Scheme, Clojure, [Symta, AP5, InterLisp, T]
+ Structured, procedural:
  : Java, C++, Go, [Modula-3, Oberon, ALGOL68, Eiffel]
+ Message:
  : Smalltalk, Self, [Newspeak]
+ Concatenative:
  : Forth, Factor, [PostScript, Joy]
+ Point free:
  : J, [APL]
+ Rule:
  : Prolog, Datalog, [Erlang, Logtalk, Bloom]

This categorization highlights six branches of focus and exploited
similarity.

Name: Anonymous 2014-03-09 9:47

Additionally
: Mathematica [Maxima, Axiom]
and
: SQL [CLIPS]
and
: [Refal]
are in the selection. But including them in the above categorization
would be troubling:
+ Refal, Mathematica, Maxima and Axiom would fall into the applicative
  category, but strongly differ in evaluation semantics from the Lisp
  descendants listed. Moreover, Refal differs from the CAS languages.
+ SQL and CLIPS are rule languages semantically, but their surface
  syntax is entirely different from the Prolog descendants listed.

Name: Anonymous 2014-03-09 9:47

Languages in brackets are carefully chosen extensions of the selection
which (aim to) maximize the (diminishing) returns (and minimize the
costs) of pursuing study further in their particular categories.

The additional uncategorized languages and extensions hopefully reveal
(elucidate or support) that the "unifying basis" of the selection is
symbolic computation, knowledge representation and stratified
programming for large systems on von Neumann machines.

Finally, the languages (with associated tools) cover almost the
entirety of today's economic spectrum (one notable gap is .NET).

Name: Anonymous 2014-03-09 9:49

Some languages were excluded but just barely. These
languages follow below:

One group of languages with ample opportunity for profitable study is
the ML family, which would fit nicely in the adopted taxonomy as:
+ Structured, applicative, pattern matching languages:
  : Clean, Fortress [Haskell, OCaml]
Additionally the primary selection above (i.e. not the extensions) has
some overlap with the "unifying basis".

Outside of the set of categories in the established taxonomy for the
selection (as strictly defined above) there exist a number of
interesting languages:
: Maude, [CafeOBJ, OBJ3]
: Coq
: Lustre, [Lucid Synchrone, Esterel, Signal]
: Unicon [Icon, SNOBOL4]

Name: Anonymous 2014-03-09 9:50

Similarly there are a number of languages in the intersection of
categories:
: AmbientTalk, Curl, Dylan, Ioke, Julia, Lasso, Logo, Metalua,
: Nemerle, Oz, PLOT, Rebol, Slate

Finally the Algol (procedural, structured) and Lisp (applicative)
selection could be extended much further with (historically and
otherwise) interesting members.
: BETA, BCPL, Delphi, Goo, HyperTalk, 3-Lisp, Rexx, ZetaLisp

Name: Anonymous 2014-03-09 9:54

Now, other languages were excluded more strongly,
because they are average and completely unremarkable.

So, many "blub" languages were entirely excluded:
: ASP.NET, Bash, C, C#, Objective C, JavaScript, PHP, Perl, Python,
: Ruby, Scala, VB.NET
etc.

Many unremarkable technology and domain specific languages were
excluded.

Similarly there is a whole host of unremarkable languages very
similar to a (perhaps remarkable) predecessor, augmented with a
novelty feature (or, even less justifiably, a *library*) targeting an
emerging market, technology or beginners, e.g. Monkey, Processing etc.
These were strongly excluded too.

Name: Anonymous 2014-03-09 10:04

In conclusion, even though there are many languages, one can make a
selection which outperforms the Pareto principle with regard to costs
and benefits. This selection is not unique, and perhaps someone
with a different "unifying basis" would take a different approach.

Name: Anonymous 2014-03-09 10:09

Learn C. Learn Scheme.

Name: Anonymous 2014-03-09 10:11

>>9
Scheme was included, C was deliberately (and strongly) excluded.

Name: Anonymous 2014-03-09 10:27

clamp my anus

Name: Anonymous 2014-03-09 12:41

>>2
Symta
Nikita "Delicate Flower" Sadkoff detected.

Name: Anonymous 2014-03-09 12:58

Sadkike

Name: Anonymous 2014-03-10 9:21

I am not Nikita.

I am however, in many ways, his apprentice.

Name: Anonymous 2014-03-10 10:27

>>14
did u suck his DIK, faggot?

Name: Anonymous 2014-03-10 10:37

>>15
No. I became aware of him over time and saw remarkable similarities between our tastes in languages.

For example today I wanted to see if anyone on /prog/ knew about Refal. Surprise surprise only Nikita.

Name: Anonymous 2014-03-10 11:17

>>16
Because he's Russian.

Name: Anonymous 2014-03-10 11:29

>>17
It also turns out all those good posts on Common Lisp, or the ones pointing out the flaws in Scheme, Python and Haskell: Nikita.

The ones about J, Forth, Smalltalk: Nikita.

It seems pretty much everything I liked about /prog/ was just the one guy.

The rest of you are UNIX skiddies who think programming in C is ``hardcore'', Haskell is for smart people, minimalism is cool and ``OOP'' sucks.

That being said there are differences. For example I'm not so averse to the ALGOLs (well, it's love-hate really). I also have a thing for CAS-style term-rewriting, which I don't know if Mr. Sadkov has discovered.

I also don't seem to have much in common with Nikita outside of programming interests and languages. I certainly don't share any of his political or personal views.

Name: Anonymous 2014-03-10 11:39

>>18
But Haskell is for smart people.

Name: Chris Done 2014-03-10 11:45

In C# I was always dealing with null errors in maintenance and confused by the limits of the type system and the arcane lambdas (C#: "Lam.. lam.. lamdoh!") it had at the time. I also wasted time trying to decide when to use an object class versus functions that work on a value. Until recently in C# you had to declare all types up front in a really duplicating way. Now only some are required. That gets old quickly and you can feel your hands wearing down every time you have to type that boilerplate. This is undoubtedly the bad experience that leads people to hate anything with the word "static" in it. C# fools you into thinking it's safe with its type system, and then shanghais you at runtime. So you end up doing "defensive" programming at composition time, trying to make sure it won't blow up, which slows you down.

In JavaScript I'm always nervous about every line of code I wrote in case it'll explode, so I end up re-reading what I wrote. It doesn't matter much because I get runtime exceptions later anyway. Standard JS has no functional libs, so most code I write can't involve using them, and most code I read online doesn't either. So it's always verbose. You also pay for abstractions in performance in JS. No partial application is also a bother. JavaScript (vanilla) lacks lexical let which kills me.

In Elisp I always forget argument order of functions. There's zero convention. There are some functions that I use all the time and look up the argument order every single time. Haskell's argument order always favours partial application. Elisp has no pattern matching and a single namespace is icky. It has LET and LET* which is an unfortunate distinction. Lisp has macros that would enable pattern matching, but who cares if only I know and use it? That's like making a boat I can only ride in my bathtub.
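To be concrete, the macro-based destructuring I mean looks roughly like this in Common Lisp, using only the standard destructuring-bind; a real pattern matcher would have to layer clause dispatch and match failure on top of it:

```lisp
;; DESTRUCTURING-BIND gives structural binding: it pulls a form apart
;; by shape, but signals an error on mismatch instead of trying another
;; pattern -- that dispatch is what a pattern-matching macro must add.
(destructuring-bind (op lhs rhs) '(+ 1 2)
  (format nil "~a applied to ~a and ~a" op lhs rhs))
;; => "+ applied to 1 and 2"
```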

I like that Haskell has a "readFile" and "appendFile" function, simple "obvious" things like that. In Common Lisp it's like a three line expression. Also, for a "list processing" language, Lisp sucks at lists compared to Haskell. While I'm bashing Lisp, the LOOP macro sucks because it's heterogeneous (shocker: this is why all macros suck). The syntax is unintuitive and special-cased. Whereas Haskell's looping facilities are all just normal functions that you compose. You only have to learn foldr/foldl/zip/map etc. once. They all have their use standalone or in concert. Elisp also lacks lexical scope which kills me.

The poor equality that Lisp has (and JS, and most everything else) trips me up a lot. (Did you mean EQ, EQV, EQUAL, string= or =?) Oh, you want to compare two arrays for equality? Ho, ho, ho. Lack of pattern matching also kills me. It's laborious and boring to deconstruct objects manually. Haskell's type inspection is fantastic. Most of the time I don't have to read docs. I have to do that a lot in the above languages (though, C# a little less so). You have to sit there and read prose written by someone who resents having to write it in the first place while the other programmers are playing frisbee outside. Again, composition and partial application are wondrous. I groan every time I have to write function(){ return ... } in JS or (lambda () ...) in Lisp or whatever weird stuff you have to do in C# these days.

I didn't use Java much. I spent a few days with it to do some PDF manipulation, which would've been a couple days if I wasn't swearing so much. If you think Haskell's type-laden APIs are confusing, look at the iText Java API. HP Lovecraft would have trouble coming up with an apt description. Although I think Iron Maiden could have a good go at it.

People say that they spend more time thinking in Haskell than writing it. I don't have that experience. I just start writing and then worry about the right design later. Thanks to the static checker, the cost of refactoring is very low. Probably the lowest of any language I've used to make pennies. In other languages you really have to think "is this the right design?" ahead of time otherwise you will hate life afterwards when you realise it wasn't. I've had competent Lisper colleagues just give up on changing a complex piece of code because "it works" even though it's a horrific mess. So in that way Haskell lets me just start going. Sure, it won't let me compare a number with a string but that's not something I want to do anyway.

Also, the type system is faster because you don't need to run a program to find out it's fundamentally broken. You find out at the tap-tap-tap-oops phase rather than the run-and-wait-for-it-to-run-oh-bugger phase.

In Bash I forget the syntax all the time because it's just stupid. Whenever I want to do something more than piping I start to wish I was using Haskell. This is why I'm working on a Haskell shell. I'm near to it being my full-time shell, once I've implemented one last thing.

Also doing threaded stuff in Haskell is trivial. And I'm now at the stage where I pretty much know every library I'm going to reach for to accomplish any task I tend to work on. But that comes with time and isn't peculiar to Haskell.

Name: Anonymous 2014-03-10 12:00

>>16
>For example today I wanted to see if anyone on /prog/ knew about Refal. Surprise surprise only Nikita.

He is not Nikita.

                  -- The real "Nikita"

Name: Anonymous 2014-03-10 12:01

>>19
LOL

Yeah O.K.

I put it in the extension part of my near-misses category. I suppose I did this because Clean was there, and Clean is interesting because it evaluates by graph rewriting. Haskell descended from ML via Miranda and Clean, but Haskell is more like Miranda.

When I first learned Haskell, I learned it to use it, not to study it as a language; so there's also a desire to go back for this reason as well. Finally, I hear good things about GHC.

Name: Anonymous 2014-03-10 12:11

>>22
They actually wanted to use Miranda but the intellectual property kikes refused so Haskell had to be invented as a replacement to Miranda.

Name: Anonymous 2014-03-10 12:28

>>24
>Thanks to the static checker, the cost of refactoring is very low.

I am baffled.

It doesn't take much thought to conclude that static typing (without subtyping and a default implicit universal type for all variables not declared otherwise) would increase the cost of refactoring, not decrease it.

Name: Anonymous 2014-03-10 12:33

>>24
It would decrease that cost because it enables the compiler to point out the spots that are still in need of reworking, with a fast response time because the program doesn't even have to be run.
You should do more practice, less thought.

Name: Anonymous 2014-03-10 12:44

>>20
I am also baffled by the similarities you drew between Lisp's equality functions and JavaScript's equality operators.

For one, Lisp doesn't have any type coercion issues to worry about. The
existence of each equality function is well justified as well:
+ if you want to compare identities, use eq;
+ if you want to compare conceptual memory cells (assuming unboxed
  integers and characters), use eql;
+ if you want to compare conses by the elements they contain (likewise
  bitvectors and strings character by character), use equal;
+ if you additionally want to compare arrays, structures and hashtables
  by their elements (and strings case-insensitively), use equalp.

It's very useful to have these things separate.
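A quick sketch of the distinctions at the REPL (any conforming Common Lisp; the sample data is made up):

```lisp
;; The four predicates drawn apart on sample data.
(let ((a (list 1 2 3))
      (b (list 1 2 3)))
  (list (eq a a)                     ; T   -- same identity
        (eq a b)                     ; NIL -- two distinct conses
        (eql 3 3)                    ; T   -- same number, same type
        (eql 3 3.0)                  ; NIL -- types differ
        (equal a b)                  ; T   -- conses, element by element
        (equal "Foo" "foo")          ; NIL -- strings, case-sensitive
        (equalp "Foo" "foo")         ; T   -- case-insensitive
        (equalp #(1 2 3) #(1 2 3)))) ; T   -- arrays by their elements
;; => (T NIL T NIL T NIL T T)
```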

Name: Anonymous 2014-03-10 12:52

>>25
Fast response time? Your whole program has to be recompiled.

When I'm programming in Lisp, I often make a change that would require
+ Adjusting my mandatory type decorations even though procedure bodies need no other adjustment,
+ An entire recompile,
in Haskell. But ``poof'' neither is an issue with Common Lisp.
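A minimal sketch of what I mean, in any conforming Common Lisp (TAX and TOTAL are made-up names for illustration):

```lisp
;; One function gets redefined (and compiled) in the running image;
;; its caller is neither edited nor recompiled.
(defun tax (amount) (* amount 1/10))
(defun total (amount) (+ amount (tax amount)))

(total 100)      ; => 110

;; The rate changes: redefine TAX alone, at the REPL.
(defun tax (amount) (* amount 1/5))
(compile 'tax)   ; compile just this one definition

(total 100)      ; => 120 -- TOTAL picks up the change untouched
```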

Name: Anonymous 2014-03-10 13:52

>+ Significant difference from predecessors
>+ Significant influence on successors
these are not meaningful at all

>+ Economically significant
The things worth making are priceless

>Now, other languages were excluded more strongly,
>because they are average and completely unremarkable.
I didn't know a language was supposed to impress by itself, instead of being obvious and useful.

>So, many "blub" languages were entirely excluded:
>: ASP.NET, Bash, C, C#, Objective C, JavaScript PHP, Perl, Python,
>: Ruby, Scala, VB.NET
>etc.
No reasoning, yet you haven't shown an actual alternative to C in computer systems programming.

Name: Anonymous 2014-03-10 15:28

>>28
>No reasoning, yet you haven't shown an actual alternative to C in computer systems programming.
Kneel and repent!

SYMTA> (defun sepplefy (x)
  (if (atom x)
      x
      (case (first x)
        (defun (format nil
                       "int ~(~a~)(~{int ~(~a~)~^, ~}) {~%~{~a~%~^;~}}"
                       (second x) (third x) (mapcar #'sepplefy (cdddr x))))
        (if (format nil "if (~a) {~a;} else {~a;}"
                    (sepplefy (second x)) (sepplefy (third x)) (sepplefy (fourth x))))
        (while (format nil "while(~a){~%~{~a~%~^;~}}"
                       (sepplefy (second x)) (mapcar #'sepplefy (cddr x))))
        (return (format nil "return ~a" (sepplefy (second x))))
        ((+ - * / < > <= >= = ==)
         (format nil "(~a ~a ~a)"
                 (sepplefy (second x)) (first x) (sepplefy (third x))))
        ;; recurse into call arguments too, so nested forms get translated
        (otherwise (format nil "~a(~{~a~^, ~})"
                           (first x) (mapcar #'sepplefy (rest x)))))))



SYMTA> (sepplefy '(defun f(n) (if (> n 1) (return (* n (f (- n 1)))) (return 1))))
"int f(int n) {
if ((N > 1)) {return (N * F((N - 1)));} else {return 1;}
}"

Name: Anonymous 2014-03-10 16:55

>No reasoning, yet you haven't shown an actual alternative to C in computer systems programming.
*blubbing intensifies*

Name: Anonymous 2014-03-10 17:13

>>30
>blub
Shalom!

Name: Anonymous 2014-03-10 19:54

>>28
What?

In my selection were Oberon, Forth, Common Lisp and Smalltalk. All of these have run (quite successfully) on bare metal and have been used for commercial operating systems.

C++ is also there.

The only reasons anyone would use primitive C over something like Oberon are economic (amount of existing programs written in C, size of community, potential to find others willing to pay you to write programs in C) etc.

Name: Anonymous 2014-03-10 20:29

>>32
Don't forget Pascal, PL/I, and BLISS.

Name: Anonymous 2014-03-10 22:31

>>29
Now write some hardware device drivers using Symta.

Name: Anonymous 2014-03-11 0:09

>>28
Imagine today's most ubiquitous consumer computers were not von Neumann machines but stack or dataflow machines or something like this.

Now imagine that there are no living operating systems written in C, that all C compilers are buggy and poorly maintained, there are no C compilers at all for consumer hardware, and that the C community is very small and that it is almost impossible to find documentation or help with anything.

No one will employ you for knowing C, and certainly no day to day applications you use or anyone else uses are written in C.

In this reality, C might still be worth studying, if it had other redeeming features, but it doesn't.

So it's very interesting that you'd say something like ``Oh economics don't matter'' and then say ``C is worthwhile''. I wonder what made you say this?

Name: Anonymous 2014-03-11 0:28

>>35
In your hypothetical computer hell, von Neumann machines would be a research topic with both academic and commercial interest, and C would be a great language to do the research in.

Name: Anonymous 2014-03-11 0:49

>>36
I believe it would not be ``computer hell'' but ``Lisp machine bliss''.

Name: Anonymous 2014-03-11 2:30

>>36
>von neumann machines would be a research topic with both academical and commercial interest
*hypothetical world where FPGAs and dataflow architectures are the norm and von Neumann means some embedded 8-bit chip*
Mainstream computers are doing too many computations at once. I mean, who wants to be able to calculate the length of an arbitrary linked list in one cycle per element while doing a bunch of other stuff at the same time? Who wants to be able to add multi-dimensional arrays in one cycle? We should force programmers to load everything through a ``cache hierarchy'' into these things called ``registers'' and come up with gimmicks called ``out-of-order execution'' and ``register renaming'' so we can have our CPU do 3 or 4 things at once but pretend it still does only one thing at a time. What took one cycle now takes hundreds! Everything has to go through a tiny path called a ``bus'' about 128 bits or so. I can see commercial prospects too. Those companies would pay us billions to have software that runs 1000x slower, maybe 10000x. So, who's with me?

Name: Anonymous 2014-03-11 4:11

OK OP, let's say I'm not a researcher or freeloading autist and I actually need to make money as a programmer. How would that affect your recommended classes of languages?

Name: Anonymous 2014-03-11 4:38

>>39
It wouldn't.

You should be able to make money with Java, SQL, C++, Clojure and Go, which are all in my selection.
