Instead of sticking with C/C++ for performance, consider:
1. Free Pascal - small memory use, very fast, easy to debug
2. OCaml - medium memory use, fast, very terse, functional and secure
3. FreeBasic - very fast, easy to use, low memory use, supports QBasic code as a dialect. Other BASICs to consider: PureBasic/Gambas/PowerBASIC
4. Digital Mars D - fast, easy to write and debug, large library, transition from C/C++ much easier. Garbage collection can be turned off.
Pascal is a dead language. It was one of my first languages and I recently tried going back to it... it's clunky as fuck.
Sorry bro, but I'm sticking with C++ (and learning C++0x), and using Python, Lua, and Scheme for higher level stuff.
Name:
Anonymous 2010-07-18 7:08
FreePascal was recently ported to a whole range of mobile devices.
As for speed, naive Pascal code is optimized much better than most "Expert C programmer" code. Plus it catches/prevents errors that would take the sanity of C programmers away (pointer typecasts, function pointers, etc.), and the code is very readable (even more than Java).
>FreeBasic - very fast
>naive pascal code is optimized much better than most of "Expert C programmer" code
>code is very readable (even more than Java)
Oh you guys.
Name:
Anonymous 2010-07-18 8:14
http://wiki.freepascal.org/Modernised_Pascal
This page was originally a summary from an enthusiast user wanting to "create" a more modern (!?!) Pascal. The suggestions are in practice a mix of random Basic syntax (which is already pretty random in itself), the regularly occurring namespace proposal, etc. The page and the rebuttal were left in as an example, since they demonstrate a few traits common in these kinds of requests:
* Random borrowing from other languages on instinct, not reason.
* Failure to grasp the basic concepts of Pascal parsing and philosophy.
* Syntax that saves minor typing, while IDEs can do quite complex handling nowadays, like declaring the vars of for loops (this goes doubly for Pascal, since its parsing model makes it easy for IDEs).
* Missing the consequences (and advantages) of the unit concept.
* Pretty shallow description of the feature: no implementation, minimal code examples only (see e.g. the elsif; examples of what it solves, the dangling else, can be found anywhere). It is a long way from idea to final implementation, and most ideas stumble on that road.
* No implementor offered. Who is going to do the hard work is the hard nut to crack. Everybody has more ideas than he can realise, the core developers even more so.
Name:
Anonymous 2010-07-18 8:30
Hey dudes. You should take a look at FreePascal. It's getting pretty good. Non-standard compliance ftw! Enjoy your stale compilers.
I'm one of the maintainers of the backend (ARM, working on NIOS support) ^^
Name:
Anonymous 2010-07-18 12:05
Program Lesson1_Program1;
Begin
Write('Hello World. Prepare to learn PASCAL!!');
Readln;
End.
Name:
Anonymous 2010-07-18 12:19
>>7
This isn't relevant today; comparing FreePascal to 30-year-old standards is asinine. You have to use better sources than old, tired arguments which bear no similarity to the modern implementation.
Name:
Anonymous 2010-07-18 14:21
I found this gem in http://www.ocaml-tutorial.org
OCaml uses one of the bits in an int internally in order to be able to automatically manage the memory use (garbage collection). This is why the basic int is 31 bits, not 32 bits (63 bits if you're using a 64 bit platform). In practice this isn't an issue except in a few specialised cases. For example if you're counting things in a loop, then OCaml limits you to counting up to 1 billion instead of 2 billion. This isn't going to be a problem because if you're counting things close to this limit in any language, then you ought to be using bignums (the Nat and Big_int modules in OCaml). However if you need to do things such as processing 32 bit types (eg. you're writing crypto code or a network stack), OCaml provides a nativeint type which matches the native integer type for your platform.
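For illustration, here is a rough Python sketch of the tagging trick described above. The low-bit encoding mirrors how OCaml's int representation is commonly described ((n << 1) | 1), but the helper names are mine, not anything from OCaml itself:

```python
# Sketch: a runtime stealing one bit of a machine word to tag integers,
# so the GC can tell ints from pointers by the low bit alone.
# Pointers are word-aligned (low bit 0); tagged ints always set bit 0.

WORD_BITS = 32                            # pretend we're on a 32-bit platform
MAX_TAGGED = 2 ** (WORD_BITS - 2) - 1     # 2**30 - 1: the "31-bit int" ceiling
MIN_TAGGED = -(2 ** (WORD_BITS - 2))

def tag(n: int) -> int:
    """Encode an integer as a tagged word: shift left, set the low bit."""
    assert MIN_TAGGED <= n <= MAX_TAGGED, "would overflow a 31-bit int"
    return (n << 1) | 1

def untag(word: int) -> int:
    """Decode a tagged word back to the integer it represents."""
    assert word & 1, "low bit clear: this word is a pointer, not an int"
    return word >> 1   # arithmetic shift, so negatives round-trip too

# Counting to a billion still fits, as the tutorial says --
# the limit is ~1.07 billion (2**30 - 1), not 2**31 - 1:
print(untag(tag(1_000_000_000)))   # 1000000000
```

This is also why the tutorial points at nativeint for code that needs the full machine word: the tagged form simply has one bit fewer to work with.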
>>16
I'm not sure which ones use it for GC, but I've seen quite a few that use the first few bits for type tags.
Name:
Anonymous 2010-07-18 14:50
>>17
...a 32-bit int (4294967296 values unsigned / ±2147483648 signed) with a few bits of type tags becomes a 29-bit (3 tag bits) to 27-bit (5 tag bits) int: 536870912/134217728 values unsigned, or ±268435456/±67108864 signed.
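Those figures can be checked mechanically; a throwaway Python snippet (numbers taken from the post, nothing else assumed):

```python
# A 32-bit word minus k tag bits leaves a (32 - k)-bit integer.
def ranges(word_bits: int, tag_bits: int):
    usable = word_bits - tag_bits
    # (unsigned value count, signed magnitude)
    return 2 ** usable, 2 ** (usable - 1)

assert ranges(32, 3) == (536_870_912, 268_435_456)   # 29-bit int
assert ranges(32, 5) == (134_217_728, 67_108_864)    # 27-bit int
print(ranges(32, 3), ranges(32, 5))
```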
>>22
Fine, all the non-retarded users can enjoy writing functions in their Glorious Democratic Republic of OCaml. Both of them.
The others will either:
1. wait for an OCaml update with a fix
2. switch back to C++/Java/.NET
>>24
A quick yum install ocaml, then running that both at the REPL and in a file, gave me the expected result. So that's either your fault or your packager's fault.
>>13
Tag bits are not unusual; Lisp implementations, even high-performance CL ones, use them (unless the compiler can safely optimize them away). The result is ints smaller than the CPU register size. This is a problem if the language can't promote the int to a bignum, and it can be a problem if you're writing crypto code (expected to deal in ints the same size as a CPU register) and need it to be efficient. In that second case, some CL compilers can still do it efficiently if you get the compiler to properly assume the types (32 or 64 bit); as long as the values are used in internal operations, it avoids needless fixnum<->bignum conversions, and the final code is just about as efficient as the C equivalent. That's not to say it isn't a bit of a pain to make sure the compiler gets it right (unboxed ints); in some cases it may be less work to just write that kind of code in C, if you want unboxed ints portable across a large variety of CL implementations (assuming the FFI supports loading your external code).
Name:
Anonymous 2010-07-18 18:27
If anything this thread proves C/C++ are irreplaceable.
my 2 cents:
1. Pascal is an ancient ALGOL BDSM system with tons of boilerplate.
2. OCaml: ints which are garbage collected (with tag bits inside the int) and are actually a castrated version of real ints. A program like "Hello world" fails to run (I had not seen a language, until today, in which the canonical Hello World breaks).
3. FreeBasic - it's neither fast nor memory-efficient enough to compete with C/C++. The best BASICs are unfortunately commercial and closed source.
4. Digital Mars D - garbage collection/runtime can't be "just turned off", because the libraries depend on it. Plus it has much more overhead than plain C. There are two standard libraries and two branches which get in the way. Still, it's a nicer language for large programs than C++.
>>31
O'Caml has bignums and 32-bit ints as well. Tag bits are a bit annoying, but they solve a lot more problems than they cause. The major problem they solve is allowing precise garbage collection, but in some other languages, like Lisp, they even allow you, given any data, to completely and accurately decode it in full:
I can open any data and get a printed representation of it, or examine it in the debugger and find out anything about it. No silly overflow, no silent type errors (which allows safety in a dynamically typed language). If you need native ints, you can get them in good implementations.
>>3
CL implementations do have bitvectors of course, they're in the standard. They're also pretty useful. I only mentioned the 32bit int thing because a lot of crypto algos are designed to work with them, so if you want to just port some C crypto code, having efficient 32bit ints is useful.
>>42
Feels like it's '94 and SuperVGA is cutting edge.
Name:
Anonymous 2010-07-19 5:11
People who still program in Pascal and use FreePascal are like conspiracy theorists. No matter how much evidence you show them to the contrary, they'll continue to stick to their beliefs, and in fact will do so more fiercely.
>>44
What exactly is wrong with Pascal? You can do pretty much anything in it, but that doesn't change the fact that it's just a more restricted C with a simple object system and more verbose syntax. There is little reason to use it these days; people usually do because they know it (it's taught in schools in some countries, which makes it a sort of poor man's Java, but more low-level). It's used for the same reason BASIC (and derivatives) are used: people know it because it was the first language they tried. It's not completely terrible, but there are many better options out there.
Name:
Anonymous 2010-07-19 7:31
>>45 There is little reason to use it these days
Consider me offended
You're an academician, my friend. Name me a language that can compile to native code on a decently wide range of popular architectures.
Now tell me if those languages provide equally cross-platform low-level tools that are useful in daily life (threading and synchronization, atomic operations, automatically converted codepaged "unlimited"-length strings, the possibility of almost pure object-oriented programming).
Java? Don't make me laugh.
C? Not a chance. Too inbred with UNIX.
C++? God dammit, what a disaster. Something as simple as atomic operations? I don't want to code them in assembler for every platform I come to.
C#? Well, it's getting there, but it's still a botched syntax, designed to cater to brain-dead American university students.
If only Lisp machines still existed. Then Object Pascal would have had a worthy opponent.
>>46
So what if Lisp machines are outdated? There are plenty of real Lisp implementations ported to a variety of platforms.
So far Common Lisp + de facto standards seems to fit your goals, but the language is quite high-level. If you want low-level, you have to limit yourself to one implementation or use an FFI, which brings you back to (inline or external) assembly or C.
Name:
Anonymous 2010-07-19 9:27
The only real choice for performance is D, though you have to rewrite to avoid garbage collection and use (mostly) C-level abstractions.
>>49
That's probably why people are still writing C. The sad thing is you don't need to be scared of automatic GC, so long as you can keep it from stu tte rin g.
Name:
Anonymous 2010-07-19 15:41
>>50
True.
Just trigger the GC before doing something resource-intensive, shut it off, do your work, then turn it back on and collect when you're finished.
In the context of a game, you'd explicitly run it during transitions between levels, after a cutscene, etc. There are tons of times when running it won't affect the user experience; people who complain about GC'd languages are naive to think they're losing any sort of control by letting the computer keep track of allocations. You're actually gaining control and flexibility, because you suddenly don't have to burden yourself with making sure memory is always free()'d when you don't need it anymore, but never when you still do. That's a tedious task, and exactly the sort that the computer is better at than the programmer.
>>51
I wouldn't go that far. Manually managing memory is still more efficient and usually more accurate than letting the runtime do it (assuming you don't introduce related bugs), but for safety's sake (i.e. so you don't introduce related bugs) it's not too much to ask.
I've read that malloc can take as long as automatic GC. I'm sure in a worst-case vs. best-case (respectively) scenario that can happen, but sadly most of the automatic GC you find is pretty terrible.
http://www.joelonsoftware.com/articles/fog0000000319.html
>Eventually, the free chain gets chopped up into little pieces and you ask for a big piece and there are no big pieces available the size you want. So malloc calls a timeout and starts rummaging around the free chain, sorting things out, and merging adjacent small free blocks into larger blocks. This takes 3 1/2 days. The end result of all this mess is that the performance characteristic of malloc is that it's never very fast (it always walks the free chain), and sometimes, unpredictably, it's shockingly slow while it cleans up. (This is, incidentally, the same performance characteristic of garbage collected systems, surprise surprise, so all the claims people make about how garbage collection imposes a performance penalty are not entirely true, since typical malloc implementations had the same kind of performance penalty, albeit milder.)
The claim is not speed, but collection stutter. The problem with this is that calling malloc during RT events (eg. in signal handlers, say... vblank) is verboten, literally incorrect. So your stutter never has an opportunity to happen at the wrong time (with video games don't think Linux, think gaming consoles. Sometimes developers would even use hblank handlers to get more sprites on the screen, or whatever zany effects--with NTSC you have something like ~0.07ms to get the job done.) Compare with Java's GC which (last I had the pleasure) runs whenever the hell it wants, usually long after the program has exhausted its heap and stalled for over an hour (I wish I was joking. Things must have improved by now, yes?)
Even if you aren't in a handler (or a re-entrant call or whatever), manual management makes it easy to avoid triggering collection processes at the wrong time. So the characteristic is subtly but fundamentally different, because you MUST decide when to risk collection. I can't stress that enough, but the BBCode is garish enough as it is.
Not to mention the malloc he refers to is old enough that it makes dlmalloc1 look like the very picture of sophistication. This lovely little gem is more or less what you get if you use newlib (cygwin, msys), glibc, etc.
Of course you can talk to your GC: tell it when to shut up, prod it to run whenever you want, fine-tune it this way and that, depending on your language, implementation and so on. It's perfectly reasonable to do so, but just because you can doesn't mean it's all been made a non-issue. It's still essentially harder to constrain GC finely in auto-GC languages (at least when using the auto-GC -- though it really doesn't have to be this way, and there may be a few exceptions. Someone is going to say something long and involved about Lisp now. Easy!)
>>57
Not really. Read his articles (hell, read that article), he's pretty good at what he does. It might be enterprise but he knows what he's doing. Also, try not to argue from prejudice because the same could be said about you, and for that very reason.
I was disappointed to find him being either ignorant or disingenuous in that particular passage, but it's not like his tone is completely serious. He was arguing for GC, though, and completely missed the use-case scenario in that criticism.
>>61
I'm going to pretend you just posted "I like butts", because that's a lot more interesting and insightful than whatever the hell you actually wrote.
>>64
Is that what children are like? I'm sorry, you must not be a child then, your tolerance is to being educated.
Why are you here? You don't seem to know anything and you don't seem to want to learn anything. Were you banned from /g/ or something? /v/? /pr/‽
Name:
Anonymous 2010-07-20 3:30
"In 1969, AT&T had just terminated their work with the GE/Honeywell/AT&T Multics project. Brian and I had just started working with an early release of Pascal from Professor Nichlaus Wirth's ETH labs in Switzerland and we were impressed with its elegant simplicity and power. Dennis had just finished reading 'Bored of the Rings', a hilarious National Lampoon parody of the great Tolkien 'Lord of the Rings' trilogy. As a lark, we decided to do parodies of the Multics environment and Pascal. Dennis and I were responsible for the operating environment. We looked at Multics and designed the new system to be as complex and cryptic as possible to maximize casual users' frustration levels, calling it Unix as a parody of Multics, as well as other more risqué allusions. Then Dennis and Brian worked on a truly warped version of Pascal, called 'A'. When we found others were actually trying to create real programs with A, we quickly added additional cryptic features and evolved into B, BCPL and finally C.
We stopped when we got a clean compile on the following syntax:
for(;P("\n"),R--;P("|"))for(e=C;e--;P("_"+(*u++/8)%2))P("|"+(*u/4) %2);
To think that modern programmers would try to use a language that allowed such a statement was beyond our comprehension! We actually thought of selling this to the Soviets to set their computer science progress back 20 or more years. Imagine our surprise when AT&T and other US corporations actually began trying to use Unix and C! It has taken them 20 years to develop enough expertise to generate even marginally useful applications using this 1960's technological parody, but we are impressed with the tenacity (if not common sense) of the general Unix and C programmer. In any event, Brian, Dennis and I have been working exclusively in Pascal on the Apple Macintosh for the past few years and feel really guilty about the chaos, confusion and truly bad programming that have resulted from our silly prank so long ago."
Name:
Anonymous 2010-07-20 4:40
>>68
Lies! Why would they write K&R (and even a 2nd edition) if that were true?
Name:
Anonymous 2010-07-20 7:10
>we were impressed with its elegant simplicity and power.
Like a big Soviet steam excavator, Pascal is powerful. It's simple, too. Somewhat idiot-proof. But elegance? I can't see any.
Pascal strings, Pascal arrays, the verbosity of COBOL, stubborn adherence to a rigid and outdated ruleset.
[@:~] apt-get install frozen-essential
E: Could not open lock file /var/lib/dpkg/lock - open (13: Permission denied)
E: Unable to lock the administration directory (/var/lib/dpkg/), are you root?
>>84
The person you're talking to is either a Ubanto user or a troll. I don't know what you think you're accomplishing by trying to explain anything to him.
>>85
Unless you're explaining a joke involving a sandwich
Name:
Anonymous 2010-07-20 15:49
>>84
You have to write sudo before the text you copy and paste to the DOS box to make sure that you don't get any error messages. This is annoying, but better than that Windows pop up thing which is much less secure.
>>90
No, you moron. If >>88hbt, then he would've also mentioned the other things wrong with >>87. This is what they call 'humour'.
Name:
Anonymous 2010-07-20 17:06
>>89
You can start it by clicking on the start menu and look for a black box.
WARNING: Do not write text here or you might mess up your computer PERMANENTLY and have to throw it away. Always copy and paste (Ctrl+C and Ctrl+V) source codes, and add sudo.
Name:
Anonymous 2010-07-20 17:08
>>61
Yeah this is pretty much the whole issue. He simply made the fatal mistake of not adding the expected "unless you're writing games" disclaimer to any discussion on GC (as any blogwhore should know), thus incurring the wrath of /prog/lodytes everywhere.
The biggest problem with D is that it's impossible to search for any answer because Google won't heavily skew results towards programming like they do with the letter C.