It's got a whopping 980 functions for all your needs, with intuitive and easy-to-remember names like least-negative-normalized-double-float, update-instance-for-redefined-class, load-logical-pathname-translations, simple-condition-format-arguments, internal-time-units-per-second, pprint-exit-if-list-exhausted and so on. Of course, it purposefully lacks the negligible nonsense like a graphical toolkit, image processing, nonblocking asynchronous IO, remoting, cryptography, SQL, text processing, archive tools, concurrency, parallelism, thread-safe data structures, monitoring & management, printing support, sound processing, XML toolkits - all of that is useless, after all! The most important thing is that you get to use anaphoric lambdas and pandoric captures whilst munching on momma's tasty soup!
Seriously, you say "a core library" as if Lisp is a practical general-purpose PL. It's a lie. Lisp is no such thing, hence such a short stub in place of a core library and a trashpile of unmaintained cruftworks from fanboy-soup-eaters (a.k.a. CLiki). Look at the core libs of industrial-strength languages like Java or Python. They were forged in the fires of practical problem-solving over many years. While the Lispers have spent the whole 50 years of their totem's existence forging only one thing: their self-aggrandizement.
Although, to do Lisp justice, it should be noted that Franz and Allegro did crank out their bicycle-crutch-ersatz-"core" libraries. This is because during the Lisp hype of the '80s and '90s these gescheftmachers managed to get several rich yet clueless Pinocchios addicted to the Lisp drug. Like Boeing, for example. And when the Pinocchios started trying to solve real problems lispishly, they had to hurriedly plug that gaping hole. So go ahead and spend some $4500 for a real Lisp - it's an easy sum for a Lisper, right? - and welcome to the miraculous world of Professional Lisp Pinocchios.
Common Lisp is a language. Systems or applications stuff wasn't standardized, because that is the stuff you build with a language, not the language itself.
To understand this approach study e.g. the character or file dictionaries, and why they are the way they are (this should illuminate many things).
Even implementation details were deliberately left underspecified (see: environment).
There are standards for some things e.g. CLIM.
Also Lispers are not plumbers. Libraries and frameworks that would take a matter of weeks for average e.g. Python programmers to write are written by Lisp programmers in a matter of hours. Lispers consider it an insult to use someone else's programs, unless they have been vetted over 30 years and implemented as a fundamental part of a Lisp operating system running on a Lisp machine.
Lisp is not a plumbing language. For example Common Lisp pretends UNIX does not exist, because simply put, it should not. It is a travesty, and Lispers will not compromise.
So if you come to Common Lisp expecting facilities for plumbing UNIX or Windows or whatever you will be disappointed by what the language itself provides.
However, implementations provide some facilities in these regards, which are slowly becoming canonicalized over time (e.g. Bordeaux threads).
Name:
Anonymous2013-12-28 23:38
easy to remember names like least-negative-normalized-double-float
Hilarious.
Name:
Anonymous2013-12-29 0:22
What else would you call least-negative-normalized-double-float? std::numeric_limits&lt;double&gt;::fuck_you()?
Common Lisp is the only language I have used with a sane number hierarchy/library/dictionary/whatever.
Every other language gets something subtly wrong (Haskell), or is just stupidly limited (C family).
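For comparison (a sketch of mine, not from the thread): Python spells the same IEEE 754 double limits through sys.float_info, so the long Common Lisp name is just a plain description of the same quantity, negated. The mapping to the CL names here is my own reading, assuming an IEEE 754 double-float.

```python
import sys

# Smallest positive *normalized* double; Common Lisp calls this
# least-positive-normalized-double-float (assuming IEEE 754 doubles).
smallest_normal = sys.float_info.min
print(smallest_normal)  # 2.2250738585072014e-308

# least-negative-normalized-double-float is then just its negation:
# the negative normalized double closest to zero.
least_negative_normalized = -smallest_normal
assert least_negative_normalized < 0.0
assert abs(least_negative_normalized) == smallest_normal
```

The verbose name at least tells you exactly which of the several "limit" values you are getting, which `min` does not.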
Name:
Anonymous2013-12-29 0:25
>>5
Well for one I wouldn't put subtraction/negation signs in function names because it's just silly.
It's not "context sensitive" anymore than the 'f' in float and for is context sensitive in C.
You don't know what context sensitive means.
Name:
Anonymous2013-12-29 0:46
>>7
Then why are they on my number pad? I know it's because ASCII is shit and we should all be using Space Cadet keyboards on Lisp machines.
Name:
Anonymous2013-12-29 0:51
>>10
Or any more than j-i is different from j - i.
Yes, I am aware that Lisp uses Polish notation with parentheses for delimitation because it's hipster like that. Doesn't make it any less silly.
The space cadet was but one (early) Lisp machine keyboard.
Some Lisp machine keyboards were very different e.g. the Fujitsu FACOM Alpha keyboard had Japanese letters (I do not know Japanese)
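The j-i point can be made concrete with Python's own tokenizer (an illustration of mine; the thread's argument is about C-family lexing generally): an infix language must split j-i into three tokens, whereas a Lisp reader takes it as one symbol.

```python
import io
import tokenize

# An infix tokenizer splits "j-i" into NAME, OP, NAME.
# (A Lisp reader would instead read the single symbol J-I.)
toks = [t.string
        for t in tokenize.generate_tokens(io.StringIO("j-i").readline)
        if t.string.strip()]
print(toks)  # ['j', '-', 'i']
```

This is why Lisp naming conventions can use - freely inside names while infix languages cannot.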
Name:
Anonymous2013-12-29 1:02
>>12
And now that I think of it, using * to dereference pointers in C was also a stupid idea.
They should have used a completely separate symbol like $ or #.
Since the beginning of Lisp there have been efforts to change the textual syntax, each with its own tradeoffs.
In the 50s and 60s McCarthy and others put some effort in Lisp 2 (the successor to Lisp 1.5) which would use "M-Expressions" (very similar to what Mathematica uses today).
Apple's Dylan used s-expressions for its textual syntax at first, but then switched to an infix notation.
David Moon has recently described PLOT, which is similar to Dylan.
There is also Wheeler's "sweet expressions" or "readable s-expressions" (I don't like the latter name).
However the main Lisp line (i.e. Common Lisp, ISLisp etc.) uses "old school" S-expressions, because Lisp programmers simply find them useful for writing their programs.
You have to understand Lisp has been around for a very very very long time. Every language feature under the sun has appeared in some Lisp offshoot or another. Most languages in some way owe part of themselves to Lisp 1.5.
Lisp today (i.e. Common Lisp) is a really, really good set of trade-offs re: language design. But it is trade-offs; you can't have everything (e.g. Common Lisp has no continuations because we want (a working) unwind-protect).
This is why other languages which share a common predecessor (e.g. Scheme) are around today; they make different tradeoffs.
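The unwind-protect trade-off above can be sketched in Python terms (an analogy of mine, not anything from the thread or a standard): try/finally plays the role of unwind-protect, guaranteeing that cleanup runs exactly once even on a non-local exit. Re-entrant continuations would let control pass through the protected body again, breaking that "exactly once" guarantee, which is the trade-off Common Lisp refused.

```python
log = []

def with_cleanup():
    # try/finally as a stand-in for unwind-protect: the cleanup form
    # runs exactly once, even when control leaves non-locally.
    try:
        log.append("body")
        raise RuntimeError("non-local exit")
    finally:
        log.append("cleanup")

try:
    with_cleanup()
except RuntimeError:
    pass

print(log)  # ['body', 'cleanup']
```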
Name:
Anonymous2013-12-29 1:16
I almost forgot to say:
The reason Lispers like s-expressions is because they "mirror" (I can't think of a better word atm) Lisp forms (i.e. the objects (in memory) the compiler (or interpreter) sees).
Different textual syntaxes either require a different and harder to work with (programmatically) intermediate representation (again using that loosely) OR a more complicated reader (and a less direct mapping between what are called "forms" and the textual syntax used to represent them) (the advantages and disadvantages of each approach are, I hope, clear).
That's what the "structure" part of the book "Structure and Interpretation of Computer Programs" refers to: the fact that you can see the structure of your program right there in front of your eyes in a very similar way to how a computer program can see it.
I also personally really like how s-expressions are indented and printed.
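A quick way to see the "mirroring" point: in Lisp the text (+ 1 2) reads as a plain list, while infix source must first be parsed into a separate node-class representation. A contrast sketch using Python's ast module:

```python
import ast

# Infix source goes through a parser into a dedicated intermediate
# representation (BinOp/Constant node objects), not into a plain list.
tree = ast.parse("1 + 2", mode="eval")
print(ast.dump(tree.body))

# In Lisp, by contrast, reading "(+ 1 2)" yields the list (+ 1 2)
# itself -- the same kind of object your program manipulates directly.
assert isinstance(tree.body, ast.BinOp)
```

The BinOp object graph is the "harder to work with (programmatically) intermediate representation" the post describes.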
I'm not sure you're aware, but any character can be used as part of a symbol name in Common Lisp. The reader accepts the textual syntax |like this :)| to delimit literal symbol names. Now, \ and | have to be escaped: this should explain it (note the multiple escape levels due to how the reader treats ""):
CL-USER> (symbol-name (read-from-string "|some stupid \\\| name :)|"))
"some stupid | name :)"
How characters are encoded is considered an implementation detail in Common Lisp. Common Lisp characters are objects, they are a distinct type from numbers. Character objects can have implementation defined attributes e.g. a face (i.e. font color etc.).
Most Common Lisp implementations today use Unicode code points as the integers that char-code returns and code-char accepts.
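Assuming an implementation that does use Unicode code points (most today, as noted above), char-code and code-char behave like Python's ord and chr; a tiny sketch of the correspondence (the CL forms in the comments assume such an implementation):

```python
# ord/chr mirror char-code/code-char under a Unicode implementation:
# (char-code #\A) => 65, (code-char 955) => #\GREEK_SMALL_LETTER_LAMDA.
assert ord("A") == 65
assert chr(955) == "\u03bb"
assert chr(ord("Q")) == "Q"  # code-char of char-code is the identity
```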
Last thought on this whole s-expression and syntax thing:
There exist two things called a "structure editor" (unfortunately two things. In the Lisp world it always meant one and one thing only, till a bunch of not very intelligent people encountering some language loosely related to Lisp (e.g. Clojure or Scheme) and hearing about the concept elsewhere decided to pollute the Lisp meaning in the context of Lisp, thus confusing newbies everywhere. Worst of all, the two meanings are regularly conflated into a nonsensical meaning almost everywhere.)
Anyway, the first (foreign) meaning of "structure editing" or "structure editor" is an interface for editing some representation of a program in a structured way, e.g. Paredit in Emacs or Blockly.
The second (native) meaning is editing program objects (i.e Lisp forms) directly sans reader (i.e. sans text, sans files) IN MEMORY in a running Lisp image. This says nothing about the interface for doing so.
Interlisp pioneered this. Keyboard keystrokes would directly change the program objects as they were in memory; the resulting program was continuously being reprinted with a pretty printer. All clients of that program would immediately use the new updated program as it was at every keypress: This is part of the reason why Xerox D machines were known as voodoo boxes.
Many Lispers have claimed that Common Lisp's acknowledgement of the file abstraction has been a mistake. I fall into this camp.
Anyway the point is that for both meanings of "structure editor" s-expressions are very very very useful. So this is another reason why Lispers like s-expressions.
Get this through your thick skulls, Lispers: No amount of proselytization, language features, whatever the fuck you want to call it, can change the simple and obvious fact that parentheses are bloody irritating to work with when nested deeply!
You're right Cudder, it is annoying to edit Lisp programs as text with an ordinary text editor. I agree. I'd never program in Lisp if I had to do this, simply because editing would be too painful.
But that's not what Lisp programmers do. Use some of the tools a Lisp programmer uses. I use Emacs with Paredit to edit Lisp programs. There are other tools, perhaps some are better, I don't know, but Emacs with Paredit is what I use today, and it's nice. Try doing the same. Just for laughs. Just for a day.
Lisp offers a lot of flexibility, at the price of macro-friendly syntax, rather than user-friendly. Besides the overrated problem of getting used to those lots of parentheses, it's all too tempting to mix macros and normal code in Lisp, in a way that doesn't visually stand out; this really doesn't encourage the writing of reusable, mutually compatible libraries. As a result of this extreme flexibility, large scale collaboration doesn't seem to happen, and Lisps lack a de facto comprehensive set of standard libs, besides those included in Common Lisp's specification. Comparisons have been drawn between getting Lispers to work together and herding cats...
Name:
Anonymous2014-01-01 17:26
There's plenty of high quality, ultra-stable Common Lisp libraries:
+ Bordeaux threads
+ Standardizes Threads.
+ CFFI
+ Standardizes C interop.
+ CL-FAD
+ Standardizes files and directories.
+ Closer to MOP
+ Meta object protocol.
+ Trivial backtrace
+ Standardizes backtrace printing.
+ Trivial features.
+ Standardizes =*features*=.
+ Trivial Gray streams
+ Ensures CLOS streams (already de facto in most implementations).
+ Trivial garbage
+ Standardizes weak pointers and weak hash-tables.
+ USOCKET
+ Standardizes the sockets interface.
+ Babel
+ Character encoding.
+ Chipz
+ Decompression.
+ CHUNGA
+ Chunked streams
+ CL+SSL
+ Interface to OpenSSL.
+ Flexi-streams
+ Stream interface over streams for character encoding.
+ Ironclad
+ Cryptography.
+ Nibbles
+ Reading and writing machine types.
+ Salza2
+ Compression.
+ CL-JPEG
+ JPEG input and output.
+ CL-OPENGL
+ Bindings for OpenGL.
+ CL-VECTORS
+ 2D primitive drawing library.
+ CLX
+ Bindings to X.
+ PNG-READ
+ PNG input.
+ Skippy
+ GIF input and output.
+ Vecto
+ High level interface to CL-VECTORS.
+ ZPB-TTF
+ Parses ttf files.
+ ZPNG
+ PNG output.
+ Hunchentoot
+ Web server.
+ Drakma
+ HTTP client.
+ RFC2388
+ Multi-part form data.
+ Parenscript
+ Javascript transpiler.
+ CL-WHO
+ HTML DSL.
+ Postmodern
+ Postgres interface.
+ Alexandria
+ Standard library.
+ Anaphora
+ Anaphoric macro library.
+ Cells
+ Data flow extensions.
+ GUI framework (e.g. celtk cells-gtk).
+ CL-PPCRE
+ Regular expressions library.
+ CLIM
+ GUI framework.
+ Mainly commercial.
+ Iterate
+ Alternative to LOOP.
+ optima
+ Pattern matching.
+ SERIES
+ Lazy evaluation.
+ Iteration.
+ SPLIT-SEQUENCE
+ Partitioning sequences.
+ LOCAL-TIME
+ Time and dates.
+ Based on Erik Naggum's "The Long, Painful History of Time".
+ metabang-bind
+ Combines labels, let, flet, destructuring-bind, with-slots,
with-accessors, and multiple-value-bind into one form.
+ Extensible.
+ Stefil
+ Unit tests.
+ ASDF
+ Build system.
+ Quicklisp
+ Library package manager.
I listed a number of high quality, free, stable, well-maintained Common Lisp libraries. There is also a large amount of very high quality non-free Common Lisp libraries from the commercial vendors.
Metalua (and Lua, for that matter) does not have anywhere near as many high quality libraries. So the Metalua people claiming Common Lisp has a library problem is stupid.
That being said let it be known that Metalua is on my list of languages I would learn if I had a few more lives. As it is though, I cannot be expert in more than 6 languages, and promising languages like REBOL, Metalua and even classics like Fortran and Simula just do not get my time.
Also, I used Python professionally after hearing good things about it, but later discovered that the reasons I used Python at that time did not apply to Python at all (Python, next to PHP, is the worst language I have ever used), but instead applied to Lua. I rewrote 10% of the Python in Lua, but at that point it was too late and I just stuck with the Python. If I could go back to that situation now, though, everything would have been written in Common Lisp and that would have been that.
Once you know Common Lisp your outlook towards 99% of languages is "this is just some small variation with half of everything missing".
Name:
Anonymous2014-01-02 9:52
>>34
I'm curious about the 5 other languages you care to be expert at.
>>36
LOL! those are not real programming languages
but [b]exoteric[/b] languages or turing tarballs
(ala LOLCode, u forgot this one)
you can't get a job with these!
dont put that on your resume!
Name:
Anonymous2014-01-02 11:24
also forgot Symta ;)
Name:
Anonymous2014-01-02 11:43
>>37
It's good that you got the joke, moron. Too bad you've confused exotericity with esotericity, though.
Name:
Anonymous2014-01-02 11:58
>>39
at least I had the turing tarball right, moron²
C++ (and C for that matter) was the second language I learned (first was Java), and the one I have the most experience in. C++ is also a good way to get interesting jobs.
The value of (Pharo or Squeak) Smalltalk depends where you're coming from:
1. If you're already very well versed in Lisp and Lisp history, you are visiting Lisp's "West coast" relative: Alan Kay's take on John McCarthy's "Maxwell's equations of software", and the product of Dan Ingalls implementing the one-page compiler-writing compiler called "Meta". Even more interesting for a Lisp programmer is that you're seeing a language with a lot of cross-pollination with Interlisp; the sister evolution of the Lisp machine. You are seeing the cutting edge of a whole divergent and just as rigorous branch of operating system and user interface programming.
2. If you're not a Lisp programmer then you are seeing what computers could (have) be(en): Ever wondered what it would be like to be able to modify your operating system's scheduler at run time? What about clicking on anything ("object" in Smalltalk parlance) you see and calling functions on it ("sending messages" in Smalltalk parlance), seeing the program text (or compiled program!) object corresponding to that object, modifying that, and having those changes reflected immediately? What about opening up something like top or task manager ("process browser" in Pharo) showing you every process running in the system, halting a process, looking at its stack, modifying the programs in that process and resuming it? What about an operating system where there is no such thing as files? Where program text is just an object like any other? Where the hard drive is just another transparent level of cache? Just as there were Lisp machines, there were Smalltalk machines (in fact they were sometimes the same machine with different microcode loaded). It is imperative at every point to realize that Smalltalk was never meant to be a program running on UNIX or some other anachronistic crapware, it was meant to be the entire system. (Now, a Lisp programmer has already seen a Lisp machine, so many of these things aren't novel.)
3. For all programmers, if you learn Smalltalk along with some Smalltalk literature (or from an old Smalltalker), you learn for the first time how to build large systems. You learn for the first time that the reason you rolled your eyes at all those "patterns", "designs", "methodologies" etc. are just because the people telling you about them were maniacs (the California computer ponzi scheme people) trying to fit square pegs into round holes (ideas from Smalltalk into C++/Java/C#/VB.NET/PHP/Python/Ruby etc.). You learn for the first time just what "reusable code" is (actually if you've used Genera you already know). A Smalltalk system as a whole is a paragon of programming. Individual parts are pretty hard to beat in a language with Smalltalk's design (i.e. single dispatch and inheritance) e.g. the "Collections" package, while other parts are just hard to beat e.g. Morphic (which comes to Smalltalk by way of Self).
4. Pragmatically, I use Smalltalk for the web (you can use the "Seaside" class library as a "whole stack", or you can mix Amber Smalltalk (a Javascript transpiler transruntime thingo) with your backend of choice (e.g. a Common Lisp)) and for prototyping GUIs.
Name:
Anonymous2014-01-02 19:49
Oberon (and its offspring) is the language of the Oberon family of operating systems (ETH-Oberon (i.e. System 3), Linz-Oberon (i.e. System 4), AOS/Bluebottle (and others?)). The Ceres series of workstations were the "Oberon machines" (so to speak), but are no longer produced. However, you can run the Oberon operating systems on x86 PCs, or hosted on your operating system. Also, thanks to Niklaus Wirth's many freely available books, and the fact that all relevant code is GNU free or public domain, you can implement or port the Oberon language or operating system to your platform of choice.
Now, Oberon the language (07 rev 2013) descends from Pascal via Modula. Oberon is very conservative (think Dijkstra and Knuth); It is strictly procedural, structured and statically typed (traditional meaning i.e. type erasure). It is also one person understandable from the compiler up in a reasonable amount of time (one month of study) (as is Oberon the operating system!).
As an example of "conservative": Oberon doesn't have anything like "break" or "return" in a loop; instead it uses Dijkstra's "while" form, e.g.
WHILE m > n DO m := m - n
ELSIF n > m DO n := n - m
END
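For what it's worth, the loop above is Dijkstra's guarded-command iteration computing gcd(m, n) by repeated subtraction (my reading of the example, not something stated in the thread). A direct transliteration:

```python
def gcd_by_subtraction(m, n):
    # WHILE m > n DO m := m - n ELSIF n > m DO n := n - m END
    # The loop repeats while *any* guard holds and exits only when
    # m == n, at which point m is the gcd (for positive m, n).
    while m != n:
        if m > n:
            m = m - n
        else:
            n = n - m
    return m

print(gcd_by_subtraction(21, 14))  # 7
```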
Similar to C, it is easy to analyze the performance characteristics of Oberon programs on a Von Neumann machine (one of the design goals was "No hidden costs").
Pragmatically, when you write an application in Common Lisp, it is already extensible in Common Lisp. However, not everyone feels comfortable using Common Lisp, or having others use Common Lisp. Oberon is a good extension language because it is very easy for novices to understand, and it is very hard for inexperienced programmers to write bad programs in it.
Name:
Anonymous2014-01-02 20:06
Forth is Chuck Moore's (a student of McCarthy's) take on the Maxwell's equations of software (i.e. another sibling of Lisp).
I am not yet an expert in Forth; The extent of my experience with Forth is that I put Forth (amforth) on an AVR (ATmega2560) microcontroller and defined some words related to controlling stepper motors. On an ARM computer (an Odroid XU) running Debian Linux I used Clozure Common Lisp to generate small Forth programs (based on various inputs from elsewhere), and send them to the Atmel via USB.
I also read part of a PostScript tutorial once (which as I understand is similar or related to Forth).
I also emulated a PC booting to colorForth and ran some demo programs.
I plan on learning more about and gaining expertise in Forth over time.
Name:
Anonymous2014-01-02 21:17
Finally, I lied. I have no experience with J at all.
However, I have reasons for believing it to be worth mastering (It is sufficiently different from the other languages I do know (I do not know any array languages (although I have written some MATLAB programs)), and I can think of a few niche uses for it, or at the very least I will gain from being intimate with its implementation).
If Steele's Fortress was ever "completed" I would probably be thinking of concentrating my next efforts there instead of J.
Anyway this brings me to the fact that I have wasted a lot of time half learning languages that I will probably never use again unless someone pays me to, and even then I will be quite upset and think it is a further waste of time. I learned most of these because of University, but also because of some bumbling about on my own part afterwards, and employers. These languages are:
+ VB.NET
+ Java
+ Perl
+ PHP
+ Haskell
+ Python
+ Clojure
+ Lua
+ JavaScript
I am especially bitter about my time wasted with Python and Haskell, which I have concluded are massive troll languages or CIA disinformation campaigns no different from Ada.
Even for an expert programmer, it takes a while to become truly expert in some language, i.e. know its libraries, community, tools, "dialects", implementations, how to write common algorithms most succinctly in them, etc. intimately. And for the vast majority of languages there's no point, especially when you know something like Common Lisp. What the heck is the point of Scala or Ruby or some other nonsense if I know Common Lisp? I'd put C++ on that above list of misspent-youth languages if it weren't for the "real world" and my need of employment in it (my strategy so far has been to infiltrate employers touting my C expertise, spend a month or two proving myself, then just getting permission to use whatever I want. For some reason most people believe C and C++ are the only way to do systems or embedded programming, so if you want a systems programming job with only light application-level work this is the way to go (these jobs tend to give you the most freedom later)).
I already feel like I have a lot on my plate just keeping up with C++, Common Lisp, Forth, Oberon and Smalltalk.
To put it more succinctly:
The marginal benefit of learning a new language (given the ones I already know) is smaller (in almost all cases) than the marginal cost of obtaining and maintaining a life time of true expertise in that language.
This is why languages such as Metalua, REBOL, Factor, Julia, Self, IO, Dylan or any other mix-and-match or evolution of Lisp, Smalltalk, Forth and a Wirth-family language (Modula, Pascal, Oberon) simply don't get my attention even though there's some potential these could be interesting (of course they could be quite unpleasant too). I have the mental capacity for maintaining expertise in maybe 3 more languages, but only the patience and desire to do so for 1 more. idk if it will be J (or some other language in the APL family) (another strong contender is Unicon or Icon, and as mentioned Fortress).
Also these are just high level general purpose languages. I also have to maintain expertise with DSLs (Mathematica, ACL2 and Maxima) and assemblers (AVR, x86, ARM).
One trend I've noticed in the languages I have chosen to maintain expertise in is that they are all languages which were once used or are intended to be used as languages for an understandable computer from the bottom up (all obviously taking different approaches). They are "no-compromise" languages. Obviously UNIX and C won, but just as easily any of the others could have been the foundation of computers today (ideally it would have been Lisp :D)
Name:
Anonymous2014-01-04 10:28
None of those languages are purely functional. You should become an expert in Haskell or Agda to feel what it's like to break away from the imperative miasma.
I mentioned Haskell. I misspent much time with Haskell (it was the 3rd language I learned) before concluding it is a troll language. I recommend learning Haskell only for a similar reason someone might learn INTERCAL (good, fun mental exercise).
I see only a very very very small amount of utility in Agda. Not enough to warrant expertise in it.
Name:
Anonymous2014-01-04 17:25
I especially liked how Haskell programmers name functions, typeclasses, etc. after mathematical concepts they only slightly resemble, to greatly confuse someone who previously only encountered the mathematical concept.
I guess the same can be said of the use of the word "function" instead of the more appropriate "procedure".
However I agree with you; you have been trolled. By Haskell.
Name:
Anonymous2014-01-05 17:21
>>55
Do any programming languages have functions in the sense that you're talking about?
Name:
Anonymous2014-01-05 19:26
Not that I know of. Not even in e.g. Mathematica.
What's called a function in most programming languages does not enforce the required properties of a function. Even if it did, it's very debatable whether an abstract concept like a function can ever be reified (general sense) like that. It's far more acceptable for it to be the other way around. You can analyze part of some program as a function (but it is not a function!)
In fact, out of all the words used to refer to the same thing (procedure, subroutine, callable unit etc. etc.) function is the worst.
Especially annoying is that in e.g. VB.NET you have "subroutines" and "functions" and the difference between them is that a "function" might return a value (i.e. if it exits locally). What a load of baloney!
However, I have long since accepted that "function" in programming languages means something entirely different than it does in mathematics. But this one ubiquitous historical precedent is no excuse for Haskell programmers to keep making the same mistake over and over again: misusing words with clear meanings from mathematics to describe things they slightly but not fully resemble, to the detriment of anyone who knows the slightest bit of mathematics.
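The distinction being argued can be shown with a deliberately pathological example (hypothetical code of mine, just to illustrate the post's claim): a "function" whose result depends on call history is not a mathematical mapping at all.

```python
calls = 0

def not_a_function(x):
    # Hidden state makes the same input map to different outputs,
    # violating the defining property of a mathematical function.
    global calls
    calls += 1
    return x + calls

a = not_a_function(1)
b = not_a_function(1)
print(a, b)  # 2 3
assert a != b  # same input, different output: a procedure, not a function
```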
On Lisp. In Lisp. Above Lisp. Below Lisp. Within Lisp. Without Lisp. Around Lisp. Atop Lisp. Beneath Lisp. Beside Lisp.
Name:
Anonymous2014-01-06 1:18
>>58
They were called "functions" in BASIC/FORTRAN because they could be used in expressions. "Subroutine" calls are on a statement of their own and return no values. I think some BASIC variants restricted functions to being a single formula while subroutines could be multiple statements e.g. you couldn't PRINT from a function, but you could do math. I agree that "function" is a bad name for them, especially in languages like C that call every procedure a function.
Name:
Anonymous2014-01-06 3:59
>>55
A function is a well-defined static mapping of input to output. Haskell functions are functions because they always evaluate to exactly the same output if provided with exactly the same input.
It's sad I have to spell out these truths for you.
Name:
Anonymous2014-01-06 4:18
And of course you can have a language with pure mathematical concepts reified, except that this language will have to be something like Agda or Coq. Which is why I mentioned Agda, but you can't learn Agda because you're too wired into ancient imperative relics to even admit the possibility of having mathematical purity in a programming language. You probably haven't even heard of the Haskell Curry and William Howard's isomorphism.
But guess what, according to the new Homotopy Type Theory all mathematics is fundamentally calculational. So you can go back to your Lisp or Forth or whatnot and keep wanking your imperative actions which are impossible to reason about.
Name:
Anonymous2014-01-06 4:29
>>62 Haskell Curry
Wait... that's a real person? I thought it was some kind of in-joke.
Is it possible to have a language that reads like math?
Most math papers I've touched spend most of their time describing stuff in words rather than notation.
Name:
Anonymous2014-01-06 5:08
>>63
That's the person the language Haskell is named after, genius.
Name:
Anonymous2014-01-06 5:09
Haskell Brooks Curry (September 12, 1900 – September 1, 1982) was an American mathematician and logician. There are three programming languages named after him: Haskell, Brooks and Curry
LEL.
Name:
Anonymous2014-01-06 5:13
>>65
Who names their kid after a programming language?
Terrible!
A "function" in Haskell represents a computer program not a function.
A computer program is something that is executed on a computer. It has nothing to do with a mathematical function. You can model computer programs as functions to analyze them, but they are not functions.
Name:
Anonymous2014-01-06 6:02
>>67
Isn't that the reason why lifted types contain _|_ (bottom)?
>>67
In order to have termination, just choose a total language like Agda.
A function in Haskell is a program that calculates a mathematical function. Since the vast majority of practically important functional values are calculated by computer programs nowadays, the depth of this terminological distinction is dubious. The difference between a computer function and a mathematical function is usually obvious from context. It is especially negligible since lazy pure functional languages support equational reasoning: substituting a computer function into a code term is analogous to substituting a mathematical function into an algebraic expression modulo the limitations of computer memory.
Lisp, on the other hand, does not have functions even in the computer sense.
Imperative languages are not difficult to reason about.
A (very) good resource is "Introduction to algorithms" if you are experiencing difficulty.
People have been reasoning about imperative programs for a very long time, and, empirically speaking, quite successfully. For example, all convex hull algorithms I know are "imperative", but many properties are known about them (e.g. for which set of inputs they terminate, for which set of inputs they give a "correct" answer, all the asymptotic properties (time, space, big and little omega and theta)).
In comparison, not much analysis of "purely functional" (I strongly disagree with this term, but I think I know what it means to you, so I use it) programs has been done. One resource I know of is the book "Purely Functional Data Structures", but this does not analyze too many programs. Also notable is that most of the programs are very awkward, and have terrible characteristics compared to programs written without the restrictions this book places on programs.
Now, I haven't tried to learn Agda. Perhaps you are correct and I cannot learn it. I have some limited experience with Coq.
I mentioned earlier that I use ACL2. ACL2 can analyze Lisp forms as mathematical functions (Note: the Lisp forms aren't mathematical functions, however the subset of Lisp forms which ACL2 recognizes are (in this case) representations of mathematical functions). Here is a small example I took from and executed on the website "Try ACL2":
(defun factorial (n)
(if (zp n)
1
(* n (factorial (- n 1)))))
Succeeded.
The admission of FACTORIAL is trivial, using the relation O< (which is known to be well-founded on the domain recognized by O-P) and the measure (ACL2-COUNT N). We observe that the type of FACTORIAL is described by the theorem (AND (INTEGERP (FACTORIAL N)) (< 0 (FACTORIAL N))). We used the :compound-recognizer rule ZP-COMPOUND-RECOGNIZER and primitive type reasoning.
(thm (> (factorial n) 0))
Succeeded.
But we reduce the conjecture to T, by primitive type reasoning and the :type-prescription rule FACTORIAL.
Note that you can run factorial as a Lisp program. This is because apart from being a representation of a function which ACL2 understands and can reason about, it is also a valid Lisp program.
I mentioned the language Fortress earlier. Fortress comes with an Emacs program called "fortify" which will format (for lack of a better word) a Fortress program using LaTeX.
However, I personally really, really dislike classical (for lack of a better word) mathematical notation. It is confusing and often ambiguous. I especially dislike TeX and LaTeX syntax (can you imagine if e.g. your favorite CAS used such syntax?)
Simply put: mathematical notation lacks a grammar.
Name:
742014-01-06 6:58
I meant to refer to >>63
apologies. My reply to Mr. 70 is coming soon :)
I'm having a lot of trouble following what you're trying to say.
For example I am not sure what you mean when you say "practically important functional values are calculated by computers nowadays".
Also I'm not sure if you're referring to term rewriting or plain denotational semantics in your sentence about "equational reasoning". I assume you are referring to Haskell when you say "lazy purely functional language", and AFAIK Haskell does not use term rewriting.
In any case I don't understand what you are trying to communicate. I know that any programming language which features expressions which evaluate to values can be analyzed or described at least partially using denotational semantics. I just don't understand why you would state this fact.
I also don't understand what you could mean by "function in the computer sense" if Lisp doesn't have them. As discussed earlier, a better word for "function in the computer sense" (at least for all the things "function" means in each of the hundreds of programming languages that use the word "function" to name some feature) is "procedure" or "callable unit". Common Lisp has these.