>>19
Most Common Lisp implementations compile to native code (so x86 or similar). A few compile to bytecode which is interpreted, and an even smaller number interpret S-expressions directly (while providing options for native compilation).
Name:
Anonymous2011-01-21 16:53
Still canceling every useless toy program in LISP I come across
Sometimes I start downloading them JUST TO CANCEL THEM
Oh wait it was rewritten in C++ by 30 developers without as many features as the Lisp version which took 2 guys a year and a half while they were also running a business by themselves.
Yeah, as someone who has actually made it through books on Lisp, I love to bash useless, uneducated Lisp faggots (mostly college Freshmen) at every possible turn, but Yahoo's decision to rewrite all of their crap in dogshit C++ is one of the worst business moves I've ever heard of. I don't understand how it was not cheaper to just teach everyone Lisp.
Although, the only opinion I ever hear on how awful the rewrite was is Paul Graham's, who isn't exactly unbiased. So, it could have been that performance or something was the issue, and incessantly-blubbering Paul wouldn't admit it.
why not just take Common Lisp and *fix* it! Do you have any idea how long the future is? Do you really
think people in 1000 years want to be constrained by hacks
that got put into the foundations of Common Lisp because
a lot of code at Symbolics depended on it in 1988?
Ahaha, typical Lisp religious bullshit. How lame. It's especially funny that he says this, since a lot of the cruft in Common Lisp is due to the language trying to support a bunch of really lame Lisp dialects that were used by a total of 12 people in a graduate program somewhere.
Aaand, the parser for this thing is retarded, as usual. Should be
Do you have any idea how long the future is? Do you really think people in 1000 years want to be constrained by hacks that got put into the foundations of Common Lisp because a lot of code at Symbolics depended on it in 1988?
>>35
Gimp is scripted in Scheme; it is written in C. Also, the word bloated gets thrown around without any real qualification as to what it could possibly mean.
Name:
Anonymous2011-06-29 23:30
Maxima is written entirely in Lisp, and most of what is interesting in Emacs is written in EmacsLisp or some other dialect. Also, Guile Scheme is the ``official'' (not that it matters much) scripting language of the GNU project, and you see it used in GNOME's games, Lilypond, Gimp and maybe some other odd programs.
Oh, and AutoCad uses some sort of Lisp for its scripting too, and who knows what else. ITA Software was mainly a Lisp shop, but it has now been bought by Google, though it's doubtful that they'll rewrite their flagship products very soon just to fit into their language schemes.
Name:
Anonymous2011-06-29 23:54
Our hypothetical Lisp programmer wouldn't use either [Cobol or assembly]. Of course he wouldn't program in machine language. That's what compilers are for. And as for Cobol, he doesn't know how anyone can get anything done with it. It doesn't even have x (Lisp feature of your choice).
As long as our hypothetical Lisp programmer is looking down the power continuum, he knows he's looking down. Languages less powerful than Lisp are obviously less powerful, because they're missing some feature he's used to. But when our hypothetical Lisp programmer looks in the other direction, up the power continuum, he doesn't realize he's looking up. What he sees are merely weird languages. He probably considers them about equivalent in power to Lisp, but with all this other hairy stuff thrown in as well. Lisp is good enough for him, because he thinks in Lisp.
When we switch to the point of view of a programmer using any of the languages higher up the power continuum, however, we find that he in turn looks down upon Lisp. How can you get anything done in Lisp? It doesn't even have y.
By induction, the only programmers in a position to see all the differences in power between the various languages are those who understand the most powerful one. (This is probably what Bjarne meant about C++ making you a better programmer.) You can't trust the opinions of the others, because of the Lisp paradox: they're satisfied with whatever language they happen to use, because it dictates the way they think about programs.
Maxima has broken every single time I've tried to use it. I mean that literally. It just freaks out and throws me back to the REPL at some point or another, and this is across several different versions over a span of a few years (I wasn't even doing anything complicated... just partial differentiation and plots of functions of several variables). Axiom looks interesting, especially because it's been in development for-fucking-ever, and they've cataloged their source code/documentation in these hilarious 1,000-page PDFs. Wait, no.
and most of what is interesting in Emacs is written in EmacsLisp or some other dialect.
Emacs Lisp is about as engaging as petrified horse shit (no lexical scoping, LOL!) and Emacs is a bloated mess (I use it because everything else sucks more; that doesn't make it good).
Also, Guile Scheme is the ``official'' (not that it matter much) scripting language of the the GNU project, and you see it used in GNOME's games, Lilypond, Gimp and maybe some other odd programs.
I hate Scheme. Scheme is ALGOL with parentheses. If you're going to brag about using Lisp, man the fuck up and use CL (although Scheme being a Lisp-1 makes it marginally less braindamaged). Guile is a slow, crippled mess. For the record, GIMP uses an interpreter called TinyScheme, which is also shit. Have you ever seen the Script-Fu scripts? Horrible.
Oh, and AutoCad uses some sort of Lisp for its scripting too, and who knows what else.
The AutoCAD Lisp dialect thing is hilariously awful.
ITA Software was mainly a Lisp shop, but it has now been bought by Google, though it's doubtful that they'll rewrite their flagship products very soon just to fit into their language schemes.
This was a decent example, but according to you Lisp is going to be phased out here.
I've actually been forcing myself to learn it. It's not bad, but it's not very hackable. Case in point, I was using Tuareg mode in Emacs to edit OCaml. There was a default setting that sucked and didn't work. I type a few commands in to change the mode, reload, and I keep on working, with my OCaml window on the bottom and my code on top. Emacs, after a while, feels like a second skin, and I just never feel that way about vim.
I do use vim when SSHing somewhere, however, because it's awesome 'out-of-the-box' compared to Emacs. Maybe I'll eventually switch over to vim, although SLIME sort of keeps me in Emacs.
>>35 <-- that's cool and all, but check 'em
>>39
The only thing of note in your post was your berating of Emacs Lisp for lacking lexical scoping while decrying Scheme for being ``ALGOL with parentheses'', considering that lexical scoping is about the only notable thing that Scheme shares with ALGOL, Mr. 'I read Lisp books under the influence of Paul Graham' guy.
Oh no, you found the rest of my post unworthy of attention! Whatever will I do?
was your berating of Emacs Lisp for lacking lexical scoping while decrying Scheme for being ``ALGOL with parentheses'', considering that lexical scoping is about the only notable thing that Scheme shares with ALGOL,
Yeah, most of my message was trolling (I obviously like lexical scoping) because I'm a bitter Lisp weenie. I'll be fair and admit that Scheme isn't that terrible (Chicken Scheme is pretty cool, the Stalin compiler is neat) as long as you're doing nothing of real-world value. Scheme still feels awful though, so I'm going to blame ALGOL.
Mr. 'I read Lisp books under the influence of Paul Graham' guy.
Common Lisp is overly weak and complicated and was conceived in a time when computation was not yet mainstream or accessible.
Practical languages like Python are a much better attempt at trying to solve the problems Lispers first set out to solve.
Name:
Anonymous2011-08-18 2:14
Python
Python is not as good as it is made out to be; in other words, it suffers from a degree of hype. I'll try to argue this point. Potential detractors of the language usually lack the experience to criticize it authoritatively. This is especially true for Python, as it is not (yet) common for people to be coerced (by work or school) into learning and working with the language. So the detractors are few and drowned out by the vocal supporters.
The proponents of Python cite 'indentation' as the worst problem; this is a strawman argument: 'this is the worst problem, and it's not really a problem'. This argument gets voted up, presumably by people who like the language, because indentation is certainly not a good reason not to use Python.
I am far from an expert at Python, but I have done a couple of semi-serious projects in the language and will try to recall specifically what I didn't like.
- Everything you write will be open source. No FASLs, DLLs or EXEs. A developer may want to control levels of access to prevent exposure of internal implementation, because it contains proprietary code or because strict interface/implementation decomposition is required. Python third-party library licensing is overly complex. Licenses like MIT allow you to create derived works as long as you maintain attribution; the GNU GPL and other 'viral' licenses don't allow derived works without inheriting the same license. To inherit the benefits of an open source culture you also inherit the complexities of licensing hell.
- Installation mentality: Python has inherited the idea that libraries should be installed, so it is in fact designed to work inside Unix package management, which carries a fair amount of baggage (library version issues) and reduced portability. Of course it is possible to package libraries with your application, but it's not conventional, and a Python desktop app can be hard to deploy due to cross-platform issues, language version, etc. Open source projects generally don't care about Windows; most open source developers use Linux because "Windows sucks".
- Probably the biggest practical problem with Python is that there's no well-defined API that doesn't change. This makes life easier for Guido and tough on everybody else. That's the real cause of Python's "version hell".
- The Global Interpreter Lock (GIL) is a significant barrier to concurrency. Due to signaling with a CPU-bound thread, it can cause a slowdown even on a single processor. The reason for employing the GIL in Python is to ease the integration of C/C++ libraries. Additionally, the CPython interpreter code is not thread-safe, so the only way other threads can do useful work is if they are in some C/C++ routine, which must be thread-safe.
- Python (like most other scripting languages) does not require variables to be declared, as (let ((x 123)) ...) in Lisp or int x = 123 in C/C++. This means that Python can't even detect a trivial typo - it will produce a program which will run for hours until it reaches the typo - THEN go boom, and you lose all unsaved data. Local and global scopes are unintuitive. Having variables leak after a for-loop can definitely be confusing. Worse, binding of loop indices can be very confusing; e.g. "for a in list: result.append(lambda: fcn(a))" probably won't do what you think it would. Why the nonlocal/global/auto-local scope nonsense?
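A minimal sketch of the loop-variable binding gotcha described above (function names here are hypothetical). Each lambda closes over the variable a, not its value at append time, so every callback sees the final value of the loop:

```python
def make_callbacks():
    result = []
    for a in [1, 2, 3]:
        result.append(lambda: a * 10)  # closes over the variable a itself
    return [f() for f in result]

def make_callbacks_fixed():
    result = []
    for a in [1, 2, 3]:
        # default argument is evaluated now, freezing the current value of a
        result.append(lambda a=a: a * 10)
    return [f() for f in result]

print(make_callbacks())        # [30, 30, 30] -- every lambda sees the final a
print(make_callbacks_fixed())  # [10, 20, 30]
```

The default-argument trick is the conventional workaround precisely because Python has no per-iteration binding for loop variables.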
- Python indulges messy horizontal code (> 80 chars per line), where in Lisp one would use "let" to break the computation into manageable pieces. Get used to things like self.convertId([(name, uidutil.getId(obj)) for name, obj in container.items() if IContainer.isInstance(obj)])
- Crippled support for functional programming. Python's lambda is limited to a single expression and doesn't allow statements (only the ternary conditional expression, not an if statement). Python makes a distinction between expressions and statements and does not automatically return the last expression, crippling lambdas even more. Assignments are not expressions. The classic "reduce" was removed from the builtins in Python 3.0 and has to be imported from functools. No continuations or even tail call optimization: "I don't like reading code that was written by someone trying to use tail recursion." --Guido
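A short sketch of the restrictions just listed, assuming Python 3 (where reduce lives in functools): a lambda body is one expression, so branching has to be squeezed into the ternary form.

```python
from functools import reduce

# Lisp-style fold: no longer a builtin, must be imported in Python 3
total = reduce(lambda acc, x: acc + x, [1, 2, 3, 4], 0)

# A lambda body is a single expression; an if *statement* is a syntax
# error, so conditionals must be nested ternary expressions
sign = lambda n: -1 if n < 0 else (1 if n > 0 else 0)

print(total)     # 10
print(sign(-5))  # -1
```

Anything needing a loop, a try, or an assignment forces you out of the lambda and into a named def.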
- Python has a faulty module system: every module attribute is freely mutable. Type time.sleep=4 instead of time.sleep(4) and you have just clobbered the process-wide sleep function with a trivial typo. Now consider accidentally assigning some method to time.sleep, and you won't even get a runtime error - just very hard to trace behavior. And sleep is only one example; it's just as easy to override ANYTHING.
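The typo scenario above is easy to demonstrate; this sketch clobbers and then restores time.sleep in-process:

```python
import time

original = time.sleep
time.sleep = 4                  # a typo for time.sleep(4): silently rebinds the name
print(callable(time.sleep))     # False -- time.sleep is now the integer 4

# Every later caller anywhere in the process now blows up:
try:
    time.sleep(0.01)
except TypeError:
    print("boom")

time.sleep = original           # restore the real function
```

The assignment itself raises nothing; the failure only surfaces at the next call site, which may be in unrelated code.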
- Python's syntax is non-uniform and hard to parse compared to simpler languages like Lisp, Smalltalk, Nial and Factor. Instead of the usual "fold" and "map" functions, Python pushes "comprehension" syntax (borrowed from the SETL language and mathematical set-builder notation), which has an overwhelmingly large collection of underlying linguistic and notational conventions, each with its own variable binding semantics. Automatically generating Python code, or pasting it into the interactive interpreter, is hard due to the so-called "off-side" indentation rule (aka Forced Indentation of Code), also found in the math-intensive Haskell language. This, in effect, makes Python look like an overengineered toy for math geeks. Good luck discerning the valid [f(z) for y in x for z in gen(y) if pred(z)] from the invalid [f(z) if pred(z) for z in gen(y) for y in x]
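For the comprehension above, the binding rule is that clauses nest left to right like nested for-loops. A sketch with hypothetical stand-ins for the post's f, gen, pred and x makes the equivalence explicit:

```python
x = [[1, 2], [3, 4, 5]]
gen = lambda y: y               # hypothetical stand-ins for the names in the post
pred = lambda z: z % 2 == 1
f = lambda z: z * z

comp = [f(z) for y in x for z in gen(y) if pred(z)]

# The same pipeline as explicit nested loops, in clause order:
expl = []
for y in x:
    for z in gen(y):
        if pred(z):
            expl.append(f(z))

print(comp)          # [1, 9, 25]
print(comp == expl)  # True
```

The second bracketed form in the post is invalid because the leading clause must be a "for", and a bare "if" cannot precede the loop that binds its variable.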
- Quite quirky: triple-quoted strings seem like a syntax-decision from a David Lynch movie, and double-underscores, like __init__, seem appropriate in C, but not in a language that provides list comprehensions. There has to be a better way to mark certain features as internal or special than just calling it __feature__. self everywhere can make you feel like OO was bolted on, even though it wasn't.
- Python is unintuitive and has too many confusing non-orthogonal features: mutable objects like lists can't be used as dict keys; expressions in default arguments are evaluated when the function is defined, not when it's called. Why have both dictionaries and objects? Why have both types and duck-typing? Why is there ":" in the syntax if it almost always has a newline after it? The Python language reference devotes a whole sub-chapter each to "Emulating container types", "Emulating callable objects", "Emulating numeric types", "Emulating sequences" etc. -- only because arrays, sequences etc. are "special" in Python.
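The default-argument point is the classic mutable-default trap; a minimal sketch (function names are made up):

```python
def append_item(item, bucket=[]):   # [] is evaluated ONCE, at def time
    bucket.append(item)
    return bucket

print(append_item(1))  # [1]
print(append_item(2))  # [1, 2] -- the same list object persists across calls

# Conventional workaround: a sentinel default, rebuilt on every call
def append_item_fixed(item, bucket=None):
    if bucket is None:
        bucket = []
    bucket.append(item)
    return bucket

print(append_item_fixed(1))  # [1]
print(append_item_fixed(2))  # [2]
```

The None-sentinel idiom exists solely to simulate the call-time evaluation that the language doesn't give you.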
- Python's GC is based on naive reference counting, which is slow and can't reclaim circular references on its own; a supplemental cycle collector exists, but cycles involving __del__ methods still leak, so you have to expect subtle memory leaks and can't freely use arbitrary graphs as your data. In effect Python complicates even simple tasks, like keeping a directory tree with symlinks.
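A sketch of why reference counting alone can't reclaim a cycle in CPython: two objects pointing at each other keep each other's counts nonzero, and only the separate cycle collector frees them. (The automatic collector is disabled here just to make the timing deterministic.)

```python
import gc
import weakref

class Node:
    pass

gc.disable()                    # make the demonstration deterministic

a, b = Node(), Node()
a.other, b.other = b, a         # two-node reference cycle
probe = weakref.ref(a)          # watch a without keeping it alive

del a, b                        # drop all external references
print(probe() is not None)      # True: refcounts are still 1 each, nothing freed

gc.collect()                    # the cyclic collector has to step in
print(probe() is None)          # True: cycle finally reclaimed

gc.enable()
```

With plain reference counting and no collector run, the pair above would simply leak.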
- Problems with arithmetic: no Numerical Tower - rationals (fractions.Fraction) and complex numbers exist, but nothing promotes between them automatically - and in Python 2, 1/2 produces 0 instead of 0.5, leading to subtle and dangerous errors.
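A sketch of the division and rationals situation, assuming Python 3 (Python 2's / silently truncated):

```python
from fractions import Fraction

print(1 / 2)    # 0.5 in Python 3; Python 2 truncated this to 0
print(1 // 2)   # 0  -- floor division is now a separate, explicit operator

# Rationals exist, but only if you opt in; integer division never
# produces a Fraction automatically the way a numerical tower would
print(Fraction(1, 3) + Fraction(1, 6))  # 1/2
```

Contrast with a Lisp numerical tower, where (/ 1 2) yields the exact rational 1/2 without any imports.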
- Poor UTF-8 support, and Unicode string handling is somewhat awkward.
- No outstanding feature that makes the language stand out, like the brevity of APL or the macros of Lisp.
Name:
read it 5 times2011-08-18 2:43
>>49 triple-quoted strings seem like a syntax-decision from a David Lynch movie
lol'd in my mouth a little