C# is VB with a better syntax.
The good ideas in Lisp wouldn't be possible without its "retarded syntax." Its power comes from the fact that code is stored as a parse tree so it can easily self-modify.
Name:
Anonymous2005-11-29 15:38
My view:
- C rocks as a low level language but it was abandoned with a shitty standard library, so you can't use it unless you go around collecting the good stuff like in RPGs.
- C++ sucks without a garbage collector, and allows you to suck by doing stuff you shouldn't, although it's done with goodwill.
- Java sucks ass, it's the new COBOL and the Emperor's OO clothes, and nobody in his/her sane mind would want to use its API, let alone be productive with it.
- C#: never tried it, keep hearing so many good things about it.
- Perl: it's like a genius with a terrible personality; I'd like to use it for the features, but man, can you cope with such a piece of shit syntax, everything Perl is fugly.
- PHP: less ambitious than Perl and with a mediocre core library, but a clean syntax that gets things done and lets you work fast.
- Python: clean and ordered, I should give it a try. I've also heard it's very fast for a scripting language; faster than Perl and much faster than PHP and Ruby.
- Ruby: OO IS SUPERIOR. Forget about it.
- LISP: ((((NO U)))). /r/ its features incorporated in a decent language.
- Visual Basic: A toy language with the disadvantages of the worst scripting languages and none of the advantages of them.
Out of the ones I know I'm using mostly plain C99 for low level stuff and PHP for web applications and system scripts. I'm interested in learning C# and Python.
Name:
Anonymous2005-11-29 18:05
Actually, I agree with the OP that we CS types should have studied a natural science. In my last job, doing some fairly advanced dev for a product (it later won a pile of industry awards), 3/4 of the programmers were originally physicists. They were all fucking good at coding too.
Face it, we in the CS industry largely suck. We're the rejects of the sciences (are we even a science?). Real geeks study physics, or at least EE, then learn software dev on their own.
Name:
Anonymous2005-11-29 18:42
CS isn't science. It's more of a branch of mathematics. Also, programming ≠ software engineering ≠ CS.
Name:
Anonymous2005-11-30 0:16
You want math? Physicists are far better at that.
CS isn't science.
Yeah, yeah, we've all heard that before.
Name:
Anonymous2005-11-30 22:08
All that and no one mentions Prolog...
Name:
Anonymous2005-11-30 23:44
Nobody uses Prolog except for some European AI researchers.
Name:
Anonymous2005-11-30 23:59
_French_ European AI researchers at that.
Name:
Anonymous2005-12-01 6:26
The Japanese selected Prolog for their fifth-generation AI systems, but after they'd invested millions of yen (10bux lol) they realised that Prolog sucks.
I expect Objective-C is more popular for a reason..
Popularity is the best measure of a good language!
Java must be a great language!
Name:
Anonymous2005-12-03 9:31
For niche languages, popularity probably isn't that bad a metric. After all, what possible reason would a person have to use a niche language, other than usefulness/beauty/what have you?
Name:
Anonymous2005-12-03 9:47
>>15
Yes, you are somewhat right, but you don't win anything. What you should ask yourself is "Is it fast enough for me?", not "Is it as fast as C?". And FYI, Dylan is faster than Java.
Also, you may be interested in OCaml, if performance is your primary concern, it is as fast as C++.
Name:
Anonymous2005-12-03 11:29
>>Also, you may be interested in OCaml, if performance is your primary concern, it is as fast as C++.
should be
>>Also, you may be interested in OCaml, if performance is your primary concern, it is as fast as C++ in certain tests of things which never occur in real-world situations.
Name:
Anonymous2005-12-04 14:00
| C rocks as a low level language but it was abandoned with a shitty standard library, so you can't use it unless you go around collecting the good stuff like in RPGs.
Now there's an idea for an RPG
Name:
Anonymous2005-12-05 2:01
I spent two years programming 6502 assembler. Try that before you complain about higher level languages.
Name:
Anonymous2005-12-05 17:24
>>23
cc65 sucks because it doesn't optimize variable types, so using long results in 32-bit adds even if you never go over 8-bit, and if you optimize it properly by hand it confuses the hell out of GCC :(
Name:
Anonymous2005-12-05 20:27
using long results in 32-bit adds even if you never go over 8-bit
And how is the compiler supposed to know that?
Name:
Anonymous2005-12-05 21:35
>>25
Modern C compilers are self-conscious and intelligent. I've seen them say something like:
main.c:325: Hmm... That's a good idea.
Name:
Anonymous2005-12-05 23:09
>>25
int i;
...
for (i = 0; i <= 255; i++) { /* i is not changed here */ }
This can be done entirely 8-bit, but cc65 won't. This is sort of excusable since it doesn't optimize anything else either, but neither will gcc...
Name:
Anonymous2005-12-06 0:51
Hello, >>27! Why are you using an int instead of an unsigned char then?
If you tell the compiler to use an int, it'll use an int. Don't be a total fucking retard.
Name:
Anonymous2005-12-06 1:03
Brainfuck is awesome
Name:
Anonymous2005-12-06 5:29
>>28
Modern compilers try to optimize code for speed and/or size (often toggled per option). So although you told the compiler to use an int, if it realizes that only a byte is required it will change it, assuming it can do so safely without damaging your program logic.
A starker example of the kind of transformation they can do is unrolling a loop, so for example:
for (i = 1; i < 100; i++)
    printf("%d\n", i);
The compiler may choose to sacrifice size for increased speed by removing the need to increment i, changing the code to:
printf("%d\n", 1);
printf("%d\n", 2);
…
printf("%d\n", 98);
printf("%d\n", 99);
This is one of the many weird things compilers can do to try and speed up your code. Generally you would set your compiler not to optimize while still debugging, otherwise you may get a nasty surprise when you can't find your loop counter =P
Name:
Anonymous2005-12-06 7:37
-funroll-loops ftw!
Name:
Anonymous2005-12-06 10:29
>>30
That's all true, but if he's all so worried about unoptimized ints on a 6502, why the hell is he using asm instead of just changing the datatype?
Name:
Anonymous2005-12-06 10:33 (sage)
>>30
you know, using a byte instead of an int won't get you any sort of speedup if the machine word is an int anyway
>>34
They're talking about loop counters, an array is a different thing entirely. For one thing using different types affects memory usage, not execution speed.
Name:
Anonymous2005-12-06 15:19
>>33
Normally I'd agree with you, except the 6502 is an 8-bit CPU.
If that's the case, incrementing a 32-bit number will take several cycles.
Name:
Anonymous2005-12-11 2:42 (sage)
Then again, a load from memory takes at least 4 cycles on a 6502 and a carry flag clear is what, 2 cycles? Them old microcoded CPUs sure were teh funnay.
C99 is great, but doing things like OO ends up being pretty verbose. That's honestly my only complaint about it. Yes stdlib is extremely limited, but you can find libraries for *anything* you want. The flaws in stdlib like shitty string handling don't even matter because you can replace them with better libraries, e.g. bstring. The fact that you implement OO yourself means you have *total* control over how it works; open classes, separation of allocation and construction, totally custom inheritance, etc.
C++ is generally shitty. The OO mechanisms are actually quite limited. *Horrible* allocation system (this alone was enough to make me use C for my project). IMHO the language is not even complete without move semantics, finally coming in sepplesox but it should have been there 20 years ago. So many restrictions cause lost optimization opportunities compared to equivalent C code, such as no restrict support, ridiculous requirements for POD, etc. Templates are good but still missing some features even in sepplesox, such as conditional field inclusion (partially hacked together in boost with enable_if). Actually boost itself is a great example of what C++ amounts to: a huge language of hacks.
Java is an *awful* language. Best demonstrated by example: array slicing. Try writing a generic method for slicing an array. Not possible. In C it's easy: take void* and stride, done. In C++ even easier, template it. Actual high-level languages have much richer constructs, like functional slices that can iterate and only memcpy when needed. Java finally added Arrays.copyOf in version 6; it took until version SIX before they realized people might want to do this. In the meantime, every fucking call in the API that takes an array also takes offset and size to overcome this shitty limitation. This is just one example; I've never worked on a Java project that didn't have an enormous Util class to implement basic functionality inherent to all other languages, no matter how high- or low-level. Some features of the language are *completely* broken, like exceptions. Checked exceptions were an awful idea, and they're still in. There's *still* no RAII or context-managers. Where the fuck is the 'with' statement? Every other goddamn language with exceptions has this, or equivalent (RAII etc). Java is awful. A great experiment in VM design, and that is *it*.
C# I've never used, but only ever heard good things. Sort of like Java done right. C# and C++ seem to be slowly merging actually, partly with Microsoft's efforts in CLR (as much as I hate Microsoft, more platforms will start to do this and it will be good), and partly with the C++ standards committee finally adding things like garbage collection and threading. This seems like a good thing to me.
Python is great. *Extremely* rapid development, which is the main reason why it's good, but also generally easy to make readable maintainable code. Incredibly rich libraries available, more so than any other language on this list (except C). A number of bad decisions that were fixed in Python 3 (like Java-style unicode strings, the only good thing about Java), but slow uptake of Python 3 unfortunately.
Perl/PHP is exactly the opposite of this. Code is horrifying and obfuscated. Unmaintainable pile of shit. Regexes for fucking everything. DO NOT WANT
Haskell is a joke. Purely functional is great and all, until you realize oh fuck, I need to actually *interact with a user*. Well there goes your immutability out the window.
Objective-C is *awful*. A glorified preprocessor. Dynamic call dispatch. No compiler support for polymorphism means no strict aliasing. Java-style bullshit polymorphic vectors, requiring individual allocations of NSNumber to store a fucking int array. So many unbelievably poor decisions, like how calling methods on nil is allowed and becomes a no-op (with incredibly complex rules on return values from such calls, regarding register packing of small structs, etc.). Stupid behavior of retain properties not releasing on dealloc (like a smart pointer that doesn't free when destroyed, only when you change its value). So many warnings that should be errors: method doesn't exist? No problem! abort() at runtime. Awful C++ integration, for example massive bugs in returning references, Obj-C exceptions don't unwind the stack, etc.
Honestly the only languages I care to code in anymore are C99 and Python.
>>42
My thoughts exactly when I saw his post, but I decided not to post it, as it would likely have been pointless. Still, it's a bit funny how he complains about things like RAII when you can do that far more cleanly, and much more besides, with Lisp macros.
Name:
Anonymous2009-11-01 18:09
>>37
Anyone who says /prog/ was better before is full of shit. The 6502 was not microcoded, unlike e.g. the Z80.
Name:
Anonymous2009-11-01 20:44
>>42,43
Only on /prog/ is Lisp ever a realistic choice as a serious programming language.
>>52
When it comes to Common Lisp's syntax, it's still trivial (and modifiable via read macros, which you've already shown a few examples of). Everything, including the parens, is done using read macros, which is nice, simple and uniform:
CL-USER> (get-macro-character #\()
#<FUNCTION SB-IMPL::READ-LIST>
NIL
Lisp's semantics, on the other hand, can be quite rich and complex compared to other languages.
>>1
How about you write your own I/O, string, hash and tree libraries on top of stdlib like everyone else does?
Not much to bitch about if you do that, is there, bitch?