Mike Levin is a leading developer of LISP; Levin is a JEWISH name.
Enjoy your zionist scripting language fag
Name:
Anonymous 2011-02-02 6:37
autism dubs
Name:
Anonymous 2011-02-02 7:04
>>43
Never heard of Levin
>Lisp was first implemented by Steve Russell on an IBM 704 computer. Russell had read McCarthy's paper, and realized (to McCarthy's surprise) that the Lisp eval function could be implemented in machine code
>Macros were invented in 1963 by Timothy Hart
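That eval claim is easy to demo. Here is a toy sketch (not McCarthy's or Russell's actual code; the subset and the name toy-eval are made up for illustration) of an evaluator for a tiny fragment of Lisp:

```lisp
;; A toy sketch of a McCarthy-style EVAL for a tiny Lisp subset:
;; symbols looked up in an alist environment, QUOTE, and application
;; of built-in functions. Real evaluators handle much more.
(defun toy-eval (form env)
  (cond ((symbolp form) (cdr (assoc form env)))       ; variable lookup
        ((atom form) form)                            ; self-evaluating
        ((eq (car form) 'quote) (cadr form))          ; (quote x) => x
        (t (apply (symbol-function (car form))        ; built-in application
                  (mapcar (lambda (a) (toy-eval a env))
                          (cdr form))))))

;; (toy-eval '(+ x (quote 2)) '((x . 3))) => 5
```

The point of the anecdote is exactly this: eval is short enough that hand-translating it to 704 machine code was feasible.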
Name:
Anonymous 2011-02-02 7:05
>>45
Garbage collection was invented by John McCarthy around 1959 to solve problems in Lisp
>>48
The LOOP macro is a good example of the power of Lisp macros, but I'd never really call it ``useful'' or ``EXPERT LISPER BEST PRACTICES''
Name:
Anonymous 2011-02-03 6:06
>>50
No. It is a good example of how you can abuse Lisp macros, creating unneeded complexity. LISP philosophy is all about small specialized utilities, that can be nicely combined together.
>>51 LISP philosophy is all about small specialized utilities, that can be nicely combined together.
No, it's faggot UNIX ideology. Lisp philosophy is about providing full-featured ENTERPRISE QUALITY solutions that resolve the whole spectrum of REAL problems in REAL life situations. Like LOOP macro.
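For anyone who hasn't seen it, here is roughly what the argument is about: one LOOP form doing filtering, transformation, and accumulation in a single clause soup, versus composing small utilities. An illustrative sketch, not anyone's ``best practice'':

```lisp
;; One LOOP form: filter the evens, collect their squares, sum the evens.
(loop for x in '(1 2 3 4 5 6)
      when (evenp x)
        collect (* x x) into squares
        and sum x into total
      finally (return (values squares total)))
;; => (4 16 36), 12

;; The "small specialized utilities" version of the same computation:
(let ((evens (remove-if-not #'evenp '(1 2 3 4 5 6))))
  (values (mapcar (lambda (x) (* x x)) evens)
          (reduce #'+ evens)))
;; => (4 16 36), 12
```

Whether the first is ENTERPRISE QUALITY or unneeded complexity is exactly what this thread is fighting about.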
Name:
Anonymous 2011-02-03 6:29
>>52
No. Unix philosophy is about KISS, meaning you segfault often, burn in DLL hell, need to close files manually, and can't splice them like lists.
Two famous people, one from MIT and another from Berkeley (but working on Unix) once met to discuss operating system issues. The person from MIT was knowledgeable about ITS (the MIT AI Lab operating system) and had been reading the Unix sources. He was interested in how Unix solved the PC loser-ing problem. The PC loser-ing problem occurs when a user program invokes a system routine to perform a lengthy operation that might have significant state, such as IO buffers. If an interrupt occurs during the operation, the state of the user program must be saved. Because the invocation of the system routine is usually a single instruction, the PC of the user program does not adequately capture the state of the process. The system routine must either back out or press forward. The right thing is to back out and restore the user program PC to the instruction that invoked the system routine so that resumption of the user program after the interrupt, for example, re-enters the system routine. It is called ``PC loser-ing'' because the PC is being coerced into ``loser mode,'' where ``loser'' is the affectionate name for ``user'' at MIT.
The MIT guy did not see any code that handled this case and asked the New Jersey guy how the problem was handled. The New Jersey guy said that the Unix folks were aware of the problem, but the solution was for the system routine to always finish, but sometimes an error code would be returned that signaled that the system routine had failed to complete its action. A correct user program, then, had to check the error code to determine whether to simply try the system routine again. The MIT guy did not like this solution because it was not the right thing.
The New Jersey guy said that the Unix solution was right because the design philosophy of Unix was simplicity and that the right thing was too complex. Besides, programmers could easily insert this extra test and loop. The MIT guy pointed out that the implementation was simple but the interface to the functionality was complex. The New Jersey guy said that the right tradeoff has been selected in Unix: namely, implementation simplicity was more important than interface simplicity.
>>58
Unless the API changed, it will run just fine. Otherwise, it simply won't compile.
Compare this with a certain other operating system, where programs using wrong DLLs will silently continue and crash at undefined points.
Name:
Anonymous 2011-02-03 7:03
I see math as a tool. I see /prog/ as a collection of confused racists that reject a tool because of a perverse association to their object of hatred.
I use inductive proofs in my work to test the validity of various solutions. Completing an inductive proof shows that a statement involving a given variable 'k' holds true at a given original value of 'k' and also at every value greater than the original value of 'k'. I understand that the value of 'k' will never reach infinity due to the restrictions of our physical world; however, using the abstract concept of infinity to determine that a statement will hold true at an arbitrarily high 'k' is quite useful.
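For the lurkers, the schema being described is ordinary mathematical induction, stated in the usual form (base case plus inductive step):

```latex
% Principle of mathematical induction: to show P(k) for all k >= k_0,
% prove the base case P(k_0) and the step P(k) => P(k+1).
\[
  \bigl( P(k_0) \;\wedge\; \forall k \ge k_0 .\, \bigl(P(k) \Rightarrow P(k+1)\bigr) \bigr)
  \;\Longrightarrow\; \forall k \ge k_0 .\, P(k)
\]
```

Nothing in the schema requires 'k' to "reach infinity"; it only requires that the step from k to k+1 never fails.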
>>60
0]=> perl6
perl6: error while loading shared libraries: libparrot.so.2.11.0: cannot open shared object file: No such file or directory
127]=> parrot -V
This is Parrot version 3.0.0 built for i386-linux.
Copyright (C) 2001-2011, Parrot Foundation.
This code is distributed under the terms of the Artistic License 2.0.
For more details, see the full text of the license in the LICENSE file
included in the Parrot source tree.
>>72
Yes, that's the point of induction. You solve for (k+1). As long as there are no physical constraints preventing you from increasing the number (addressable space, etc.), you will be able to.