
Programming Language to Replace C++

Name: Anonymous 2010-08-11 21:49

I think we can all agree that C++ is a terrible language. So why is it still around?

When talking to most C++ users (game developers, systems programmers), I've found that most seem to recognize C++'s faults, but they don't really care. They aren't even the slightest bit interested in a new language that might solve its problems, even one that gives them all the power of C++ with none of the downsides. You can't even get them to look at something new.

Why is that? Why does everyone just 'live with it' without wanting to improve the situation?

Name: Anonymous 2010-08-13 9:05

>>38
This thread is about why no one cares to replace it. Essentially no language research is being done in this field.
But this is wrong, and it has been pointed out many times; people are looking for a better systems language.

Name: Anonymous 2010-08-13 9:41

>>38
That's why I said the standard *AFTER* C++0x. It will have GC, and will also have things like proper module support (so we can finally get rid of header files), scoped macros (which will probably end up extremely similar to mixins from D), and more.
They're already scraping the bottom of the barrel for syntax. By the time they've added all that, idiomatic C++ will resemble obfuscated Perl.

Name: Anonymous 2010-08-13 10:18

mostly I just want to believe that it's possible to do it without sacrificing anything.
I prefer to deal with this sort of thing myself, so I think there is a sacrifice. Putting it in the type gets you part way, but that will quickly fall apart in (say) C's type system.

The point is that you do not need to significantly change code in order to remove the garbage collector; it is not as though you are rewriting the program, just annotating it
The way you put it originally made me think you meant the compiler should refactor the program with calls to memory management facilities inserted instead of giving responsibility to the programmer.

My only argument is that some allocations would require refactoring to deal with outside of GC, which is able to handle runtime non-deterministic de/allocations which outlive their creation context. Anything that can be solved simply by annotating could probably be handled by the compiler, couldn't it?

The idea is that 99% of the time, you would need to do the check for nil anyway
I don't buy that. Nullable pointers show up out of their initialization context probably at least 50% of the time. So far so good, but it's not true that every time one is received or passed out of context it is in danger of being null. A good deal of my code checks these once and never has to worry about it again. A lot of my C never checks (because there is no non-nullable type, and in these instances a problem would preclude creation.) Without analysis a compiler couldn't know when this would be okay.

I would go the other way and say that 99% of the automatically inserted checks, barring this kind of optimization, are provably unnecessary. The real figure is probably much better, but I doubt the real-world optimized case approaches parity with the manually checked case (limited to the cases where the manual checking is indeed correct.)

Somewhere there is a paper about all of this, with statistics. The paper will say that the analysis is possible up to the point (or perhaps surpassing) of program complexity that a human can be expected to deal with, according to some metric or estimate. Whether that kind of analyzer is found in implementation is perhaps another story.

Doesn't seem that tall. What are the holes in my reasoning?
I don't fully understand your reasoning, but it sounds like you're pimping a static solution to memory leaks in the face of dynamic allocation. I'm probably mistaken here, but if that is what you mean then I'd say it's impossible.

Name: Anonymous 2010-08-13 10:43

It's probably time to bump this.

Name: Anonymous 2010-08-13 10:55

>>43
My only argument is that some allocations would require refactoring to deal with outside of GC, which is able to handle runtime non-deterministic de/allocations which outlive their creation context.
Correct. This is not most allocations however. I don't think it's even close to most. But of course I have no data to back up this claim, since this hypothetical language does not exist.

Anything that can be solved simply by annotating could probably be handled by the compiler, couldn't it?
No. Manually freeing memory inherently creates unsafe code. You have to figure out what the lifetime of your object needs to be in order to ensure that it is not accessed through old references after it has been freed. It is impossible for a compiler to solve this in the general case; the ol' halting problem and all that. That's why GCs are so ubiquitous in modern languages: they allow you to statically prove that memory is never accessed after it is freed.

I would go the other way and say that 99% of the automatically inserted checks, barring this kind of optimization, are provably unnecessary.
The compiler does not automatically insert any checks whatsoever. That is not what a static type system and static analysis do. The compiler only forces YOU to MANUALLY add checks in places where you need to verify that a pointer isn't null.

These should be few and far between. I think you have misunderstood my 99%: I meant you would be adding it anyway in 99% of places where the compiler would force you to do it; of course not 99% of places where you dereference a pointer!

Your point about a program analyser is not relevant. No whole-program analyser is needed for this. Each function can be compiled individually, in a vacuum, separate from all the others; it only needs to be properly typed based on whether it accepts null. Nice is a variant of Java which does this: if a function argument might be null, you just prepend a question mark on the type of the variable. If you do this, the function needs to check that it is not null before dereferencing it. If you do not, then the responsibility falls on the caller, so you don't need any checks; the compiler will prove for you that it will not be null. This is a syntactically convenient special case of tagged unions from more powerful languages.
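To illustrate, here's a rough sketch of the same idea transplanted into C++ rather than Nice (all the names -- NotNull, check, length_squared -- are invented for the example): the wrapper type carries the proof of non-nullness, so the one mandatory test happens at the boundary and callees never check again.

```cpp
#include <cassert>
#include <cstddef>

// Invariant: p is never null once check() has succeeded.
template <typename T>
struct NotNull {
    T* p;
};

// The single place a null test is required: converting a raw,
// possibly-null pointer into the checked type.
template <typename T>
bool check(T* raw, NotNull<T>& out) {
    if (raw == nullptr) return false;
    out.p = raw;
    return true;
}

// Callees taking NotNull<T> can dereference unconditionally;
// the type has already discharged the null case.
int length_squared(NotNull<int> n) {
    return (*n.p) * (*n.p);
}
```

In Nice the annotation goes the other way (?T marks the nullable case, plain T is non-nullable), but the division of responsibility is the same: either the callee declares it accepts null and checks, or the caller must establish non-nullness before the call.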

Honestly it's pretty mind boggling to me that you are defending null pointers. It's the famed "billion dollar mistake". I can't help but feel that you aren't properly understanding the issue. Maybe I'm the one who isn't understanding the issue.

Name: Anonymous 2010-08-13 11:10

the solution.....PL/I

Name: Anonymous 2010-08-13 11:19

>>34
Pascal vs C by Brian Kernighan
http://www.lysator.liu.se/c/bwk-on-pascal.html

Name: Anonymous 2010-08-13 11:40

>>45
My question would be: how would you resolve the null pointer problem, creating a solution that does not inherently result in the same problems or type-testing requirements a null pointer necessarily incurs?

Name: Anonymous 2010-08-13 12:34

>>45
No. Manually freeing memory inherently creates unsafe code.
Whoa, back up. I thought the object was to tell the GC to piss off? In any event, analysis can ensure that inserted deallocations, which are exactly equivalent to manual deallocation except that they are automatically inserted, are perfectly safe. (And at some point you have to concede that if manual deallocation is inherently unsafe it doesn't improve the GC case either, which is ultimately informed by a programmer as to when it can free memory. Pedantry, I know, but the point keeps things in perspective.)

As far as the general case goes, annotations don't work either. Where analysis fails, annotations can't inform the compiler or GC very well at all--this is what I was trying to bring up. If you have a paper that says otherwise I would really like to see it.

The compiler does not automatically insert any checks whatsoever.
After rereading I see I've misread your comment about this. You wouldn't really have to perform the checks; your code could just crash at roughly the same time and place, for a slightly different reason--unless it does actually force you, in which case it either forces you to do it at every dereference or performs some amount of analysis to alleviate the need for it. In either of these cases it might as well insert the checks. For my purposes, your ?var example simply informs analysis.

Honestly it's pretty mind boggling to me that you are defending null pointers. It's the famed "billion dollar mistake".
I know this is a topic that has been debated back and forth forever, but it's not a big safety concern for me--nullables have simply not posed a significant problem to me, and they usually exist in languages that have far more dangerous facilities. My only strong feelings about it are a) it's not nearly as big of a deal (either way) as people tend to make of it, and b) I don't prefer it, because it usually relies on exception handling to deal with e.g. failed initialization.

As an aside, Go's solution to exceptional cases is a nice try (http://blog.golang.org/2010/08/defer-panic-and-recover.html). I don't think they quite got it, but it does get exception handling out of my face. Combined with the multiple return facility, you can choose to treat errors exceptionally or you can anticipate them. The very best decision Go has made, in my opinion, is not to propagate exceptions out of the standard library. Sadly they used the qualifier "convention" and not "standard" or "requirement" or something even stronger. Personally I would never let an exception escape, be it the standard library or supplied.

Name: Anonymous 2010-08-16 1:25

>>49
In any event, analysis can ensure that inserted deallocations, which are exactly equivalent to manual deallocation except that they are automatically inserted, are perfectly safe.
Not in all cases. Maybe in most cases, yes, and it can certainly insert those automatically. But some of them you are going to need to tell it explicitly what their lifetime is. That was my point; manual memory management inherently creates unsafe code because it's *not possible* for the compiler to statically verify that all of your memory deallocations are safe. This is the same as the halting problem. The compiler can't verify it, therefore the compiler can't automatically do it.

As far as the general case goes, annotations don't work either. Where analysis fails, annotations can't inform the compiler or GC very well at all
I suppose I should clarify what I mean by annotations. They aren't "suggestions" to "help" the compiler; they are explicit instructions. FREE THIS HERE. When this variable goes out of scope, free whatever it is pointing to. That's exactly what the "scope" annotation does in D. This is what I am talking about.
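For readers who don't know D: the effect of scope is deterministic, scope-pinned cleanup, which C++ programmers will recognize as RAII. A minimal sketch (Scoped is an invented name; D's real scope storage class has additional rules about escaping references):

```cpp
// "FREE THIS HERE": ownership is pinned to the enclosing scope, and the
// deallocation runs deterministically at scope exit. This is an explicit
// instruction, not a hint the compiler may ignore.
template <typename T>
struct Scoped {
    explicit Scoped(T* p) : p_(p) {}
    ~Scoped() { delete p_; }            // runs at the closing brace
    Scoped(const Scoped&) = delete;     // the handle must not escape or alias
    Scoped& operator=(const Scoped&) = delete;
    T* get() const { return p_; }
private:
    T* p_;
};
```

Whether the annotated deallocation is actually safe -- whether some reference outlives the scope -- is still the programmer's problem, which is exactly the point under dispute here.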

As an aside, Go's solution to exceptional cases is a nice try. I don't think they quite got it, but it does get exception handling out of my face.
Give me a break. Go has exceptions. They just don't fucking call them that, because they are trying to avoid the C programmer prejudice against them. Other than that, they are *identical*. defer is merely a convenient syntactic wrapper to a finally block.

Name: Anonymous 2010-08-16 1:48

When this variable goes out of scope, free whatever it is pointing to. That's exactly what the "scope" annotation does in D. This is what I am talking about.
I remember someone at my university pointing out a relationship like this and wondering, then, why it isn't common practice to automatically deallocate variables once scope changes. So it may be a naive question, but: why would automatic deallocation based on scope change - am I right in simplifying this as a change to a program's call stack? - be a bad idea?

Name: Anonymous 2010-08-16 2:05

>>51
Because this is too dangerous. It has serious potential to bite you in the ass if you need an object to outlive its scope but forget to make it non-scope.

Name: Anonymous 2010-08-16 2:19

>>50
That was my point; manual memory management inherently creates unsafe code because it's *not possible* for the compiler to statically verify that all of your memory deallocations are safe.
I was pointing out that there exists provably safe memory management which is equivalent to the manual case. This was in response to your argument that manual memory management is inherently unsafe. I was not speaking about memory management in general, and in fact pointed that out at least once.

FREE THIS HERE. When this variable goes out of scope, free whatever it is pointing to.
Unless the compiler can ignore these annotations at its own discretion you've just added an unverified call to free(). How is this purported to be safe while the library call is inherently unsafe? I'm sure there's some subtle aspect of D that intends to make a difference here but I wish you'd be upfront about it.

Give me a break. Go has exceptions.
I never said otherwise. By "out of my face" I mean there is no extra level of indentation. The exceptional case logic is delineated separately, right where it should be. Erlang does an even better job here, but Go can't really adopt its mechanisms.

Other than that, they are *identical*.
Identical to what? C++'s exceptions? No. Erlang? Not a chance. There are some nuances in there, and it's not just syntax. I'm quite confident that there are meaningful if subtle semantic differences that make Go's model identical to nothing else in existence.

Why are you being so ignorant all of a sudden? You weren't like this before.

Name: Anonymous 2010-08-16 5:31

>>52
You could say that about x in y language. If you forget to do a thing, then it's your problem, not the language's.

Name: Anonymous 2010-08-16 8:08

>>53
How is this purported to be safe
It's not purported to be safe!

You are not understanding what I am saying here. I don't care if it's provably safe; I will reason about whether it is correct myself.

The problem is, if it isn't PROVABLY safe, then the compiler can't AUTOMATICALLY resolve it for you. This means you have to either manage the memory manually or use a garbage collector. A garbage collector will check *at runtime* whether an object needs to be freed, which is the only way to automatically and safely handle memory; that is, as >>51 was wondering, why most modern languages use them.

Remember, you originally said that if all I want is annotations for memory management, then a compiler could just annotate it automatically. I said no, because it couldn't do it *safely*; it couldn't prove that its optimizations were correct. Halting problem.

I was pointing out that there exists provably safe memory management which is equivalent to the manual case.
There is certainly provably safe memory management in restricted subsets of the language. "All allocations are scoped on the stack, and you cannot make a reference that doesn't live on the stack. Therefore all references go out of scope before whatever they point to." Provably safe memory management -- except this sort of limited case is not actually useful for writing real-world programs. I'm not sure if this is what you meant or not.

There are some nuances in there, and it's not just syntax.
Yeah, yeah. They are *extremely* semantically similar. I'm sure you could come up with some minor difference, so fine, they aren't identical. (For the record, not having to indent code is not a semantic difference.)

Many years ago I wrote a macro in C++ that did exactly what defer does in Go. Here, I just rewrote it. Give it a try:

#include <stdio.h>

// Each defer() declares a local struct whose destructor runs f, plus an
// instance of it; pasting in __LINE__ keeps the generated names unique.
// Destructors fire in reverse order at the end of the enclosing scope.
#define defer3(f, line) struct DEFER##line {~DEFER##line() {f;}} defer##line
#define defer2(f, line) defer3(f, line)
#define defer(f) defer2(f, __LINE__)

int main() {
  printf("a\n");
  defer(printf("e\n"));
  printf("b\n");
  defer(printf("d\n"));
  printf("c\n");
}
// prints a, b, c, d, e: the deferred calls run LIFO when main's scope ends


Of course Go reinvents this as a built-in, since they refuse to support any form of RAII, or something like Python's with statement (among many other things.)

I'm very sorry if I'm ignorant of Go's exceptions, but they did announce the feature a whole twelve days ago.

Name: 55 2010-08-16 8:12

Let me correct myself there, it does *almost* exactly what Go does. Need to be precise in here! My macro defers at the end of the block (or scope), not the end of the function. This is in my opinion much safer and more useful (in the same way that VLAs are much safer and more useful than alloca()), but whatever, I'm not a Go programmer.
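The difference is easy to observe. A sketch in the same spirit as the macro above (BlockDefer and demo are made-up names), using a log string so the firing order is visible: the destructor-based "defer" runs at the end of each loop iteration, whereas Go's defer would wait until the function returns.

```cpp
#include <string>

// A destructor-scoped "defer" that appends a tag to a log when the
// enclosing block ends, so the firing order is observable.
struct BlockDefer {
    std::string* log;
    char tag;
    ~BlockDefer() { log->push_back(tag); }
};

std::string demo() {
    std::string log;
    for (char c = '0'; c < '2'; c++) {
        BlockDefer d{&log, c};      // cleanup scoped to this iteration
        log.push_back('[');
    }                               // destructor fires here, each time around
    log.push_back('.');
    return log;                     // "[0[1." -- each open closed in-loop
}
```

With function-scoped defer, all the cleanups would pile up until return -- a real difference if the loop holds resources like file handles.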

Name: Anonymous 2010-08-16 11:25

Plain and simple.

ActionScript 2.0

Name: Anonymous 2010-08-16 11:41

Most programmers are a bunch of babies.

Despite SICP being one of the foremost texts on programming methodology and implementation, people can't handle it because, "IT HAS PARENTHESES OH NO"

It's gotten to the point where Abelson has said he's glad to see the SICP class go because something like "people don't need to know the basics anymore". This excuse always gets trotted out when stupid people become entrenched in a field and the focus is no longer on actually understanding the area.

Look at areas like mathematics. The majority of people think mathematics is a strict, rule-based thing that has no practical use or interpretation. The top-tier mathematicians understand math differently (neurological data easily demonstrates this), while even the majority of educated mathematicians still think math is some rule-based thing with no use or practical interpretation. This view has heavily pervaded almost every scientific field; it's the reason why negative numbers didn't practically exist until the Renaissance, and why medicine is now practiced from a disease-driven rather than a malfunction-driven approach. For a "recent" specific example, some Australian colleges have taught less ANATOMY to their doctors and focused more on people skills.

Name: Anonymous 2010-08-16 15:11

>>55
It's not purported to be safe!
Oh, well that makes more sense then. Somehow you had given me the impression that the annotations were something other than manual memory management.

I don't care if it's provably safe; I will reason about whether it is correct myself.
One moment you're going on about the inherent unsafety of manual memory management--which isn't always true, or even meaningful--the next you're saying you don't care. I take it there is more to your point buried between the lines, but I just can't find it. I'm trying really hard to maintain the perception that you are not just trying to posture D above the rest, but by now I have to admit that doubt is casting a very long shadow over the affair.

[...] Halting problem.
As I mentioned above, you'd given me the impression that the annotations were something other than equivalent to manual management. Anyway this 'halting problem' thing was cute the first time, but you don't need to repeat it constantly like you just heard about it last week. I'm not trying to shit on you, I just haven't been impressed by anyone making ineloquent equivalences to the halting problem in a very long time.

There is certainly provably safe memory management in restricted subsets of the language.
Your examples are quite limited and don't cover even what naive static analysis can accomplish. I was also thinking specifically of heap allocations--stack allocations limited to non-escaping references don't need any management; they live and die quite organically with program flow. So you're not really talking about manual memory management at all.

I'm very sorry if I'm ignorant of Go's exceptions, but they did announce the feature a whole twelve days ago.
I don't care that you don't know, but I do care that you don't know and yet try to speak authoritatively on the subject.

It was part of the language from very early on. The announcement isn't an announcement; it's something they figured interesting enough to blog about--almost everything on the Go blog had been around within a month of the initial language announcement. IIRC, defer() has been in since day 1 and panic/recover were discussed and implemented shortly after.

On semantics, since you don't have much interest in Go I don't want to get into the similarities and differences between its defer/panic/recover stuff and whatever language's model you happen to be thinking of at any given time. Semantics aside, the syntactical advantage is huge in my opinion--which is all I was trying to point out when you went after me on semantics. Not cool.

Name: Anonymous 2010-08-16 18:28

>>59
I'm trying really hard to keep up the perception that you are not just trying to posture D above the rest but by now I have to admit that doubt is casting a very long shadow over the affair.
Hardly. I've mentioned D often because it's the only major attempt at replacing C++ in embedded and game development (there are a few toy languages out there, but they are still experimental at this stage). I actually hate D, mostly because it's even more complicated than C++. It does not make development any easier or better; it just gives you slightly less chance of shooting yourself in the foot with some obscure C++ gotcha.

Your examples are quite limited and don't cover even what naive static analysis can accomplish. I was also thinking specifically of heap allocations--stack allocations limited to non-escaping references don't need any management, they live and die quite organically with program flow. So really you're not really talking about manual memory management at all.
True enough. Compilers like Stalin, or JITs like the JVM's, can eliminate many (even most) allocations through the kind of static analysis you're talking about. The point is that it's simply not possible to handle all allocations in general. All of these use a garbage collector as a backup strategy. You seem to be ignoring this rather major issue, which is why I keep mentioning the halting problem. If it isn't possible to do all of them, then it's rather useless for embedded, isn't it? Even a handful of garbage-collected objects means you have to scan the whole heap, so you still have most of the downsides of a garbage collector.

I think we've pretty much come to a point where we agree on most things now (except for interest in Go, heh.) I can't help but feel disappointed in this thread. Not the posts or posters; more in the general outcome. We just ended up quibbling over minor issues in different languages. I don't know what I expected. Bleh.

Name: Anonymous 2010-08-16 19:47

>>60
Hardly. I've mentioned D often because it's the only major attempt at replacing C++
Fair enough. It just seemed like you were so in awe of its GC that it prevented you from seeing how it equates to memory management in general.

All of these use a garbage collector as a backup strategy. You seem to be ignoring this rather major issue
I tried to make it clear that I was speaking about the compile-time solvable cases. I think I was very explicit about this at one point. I was speaking about them for two different reasons, the first being about equivalence to a great amount of manual management, and the second regarding the confusion viz. D's MM annotations.

I would like to mention that while I don't think there is a serious problem with manual memory management, if your management strategy cannot, in principle, be proven (perhaps with certain allowances for exceptional cases), then it is not one you should use. I don't expect an analyzer to reproduce any given management strategy chosen by the programmer, or even to solve complex provable cases. I don't even expect a prover to try to verify everything that could, in principle, be verified--or even everything it could prove with a higher bailout. What I mean is that the extent of what is provable at compile time is far greater than you seem to let on.

At the same time, it is not practical to automatically verify it in detail (even with a great deal of help from the programmer, though for different reasons), but I take it to be the programmer's responsibility to reason about the management strategy and find it sound. If you find that's too much work, or you are unable to do it reliably enough, or just find it undesirable for whatever reason, use a GC. I won't even call you lazy. The point is that putting faith in a GC, an analyzer, or your sensibilities + the guy reviewing your code is largely about confidence. We tend to have more when we can verify it, but as you've been insisting a lot, analysis can't solve the general case. So you might make a mistake, or the analyzer might have a bug, or the GC might be pure reference counting and never perform satisfactorily on your incestuous data structure (if safety by abstention is admissible, don't ever call free() and all manual management is inherently 'safe'.)

I don't know what I expected. Bleh.
This is about the best you can do here. Personally I'm not much interested in "replacing C++", because it sounds like a demand for something C++ programmers would find appealing, but I am more interested in PL in general. For solving the problems C++ and others are traditionally applied to, I have a list of languages I am following. None of them resemble C++ very much.

we agree on most things now (except for interest in Go, heh.)
My interest in Go isn't really in Go per se. There are some interesting things going on and I'd be remiss if I ignored them.

Name: Anonymous 2010-08-16 23:28

How would you guys rate Objective-C? It appears to me that Obj-C tries to do what D is doing, which is to implement more Smalltalk-style object orientation. The only thing I know about Obj-C is that it contains C as a subset and that the OO in Obj-C is supposed to be slower than that of C++. What's your opinion on Obj-C?

Name: Anonymous 2010-08-17 0:11

Objective-C is closer to Java or C#/.NET than it is to D, minus the virtual machine in Java or .NET.

Cocoa has the same breadth and scope as the Java or .NET libraries.

I'm not a big fan of it, I'd rather program in C++.

Name: Anonymous 2010-08-17 1:16

Except Java is designed to be code for a virtual machine, and C# is a full-fledged language that happens to be written primarily for a virtual machine.

Java doesn't even have close to the scope of the .NET libraries. Cocoa and other Apple APIs like the iFag library are shit-eatingly stupid for an OO library because they rely on functional methods.

And C++ is fucking garbage.

Name: Anonymous 2010-08-17 2:17

>>63
Objective-C is a real superset of C (as opposed to an almost-superset, like C++) that adds objects with dynamic dispatch evocative of Smalltalk.  Version 2.0 adds garbage collection and a bunch of syntactic sugar.  Objective-C is not only way older than D, but it's almost as old as C++ (C++ was way worse back then).

The Cocoa libraries are only a little bit like the .NET or Java libraries.  The Cocoa libraries are geared almost exclusively towards making desktop / mobile applications, so you won't find an equivalent to, e.g., ASP or JSP for Cocoa.  The Cocoa libraries also assume that you already have a working standard C library, whereas the Java platform and .NET platform start from the ground up.

Cocoa is the best library for writing desktop/mobile UIs bar none.  Qt has a good reputation, but if you've seen the kind of preprocessing they have to do to get a decent UI API in C++, you'll just chuckle.  Of course, most sane people don't use Objective C for non-UI parts of their code, except those eccentric developers that only target Apple platforms.

For big, cross-platform apps, go ahead and write your code in C++ and then make the GUI on the Mac/iPhone use Objective-C.  Yes, they are completely interoperable (no, you can't subclass classes from one in the other, but nobody wants to anyway).  This way you can keep your cross-platform C++ code without having to write a UI in C++, which, as every C++ UI library ever demonstrates, is a total pain.

Name: Anonymous 2010-08-17 5:00

>>63
You're pretty much correct. As >>66 says, Obj-C is a true superset of C which adds object-orientation using Smalltalk-like syntax. Other than that it's a lot like Java, but it is more dynamic and does not have a JIT, and it's susceptible to memory corruption. Wikipedia has a long writeup if you want to see what it looks like.

It has a lot of bad warts, and while it is used everywhere on Macs, it is used absolutely nowhere outside of Macs.  I wouldn't use it if I didn't have to.

>>65
Except Java is designed to be code for a virtual machine and C# is a full fledged language that happens to be written primarily for a virtual machine.
Nah, Java was designed for enterprise coders. It is almost featureless for this reason: it is so that you can hire a bunch of bad coders, and they can actually get something working without doing too much damage. It is extremely unproductive for anyone who actually knows what they're doing.

>>66
Yes, they are completely interoperable
To elaborate a bit, they are very interoperable (you can mix them freely in the same source file), but their features do not correspond to each other. They have separate class hierarchies: a C++ object is not an Objective-C object, you cannot subclass one from the other, you cannot template Objective-C classes or methods, etc. They have separate exception handling stacks: the try/catch from one cannot catch exceptions from the other, Objective-C exceptions don't unwind the stack, etc. That last one is particularly dangerous; definitely do not throw exceptions across language boundaries.
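The standard defensive pattern is to translate at the boundary instead of throwing across it. A plain-C++ sketch (do_work_boundary is a hypothetical function; the Objective-C side would mirror it with its own @try):

```cpp
#include <stdexcept>

// Hypothetical boundary wrapper: every C++ exception is converted to an
// error code before control returns to the other language's runtime, so
// a C++ throw never tries to unwind foreign stack frames.
extern "C" int do_work_boundary(int input, int* result) {
    try {
        if (input < 0) throw std::invalid_argument("negative input");
        *result = input * 2;   // stand-in for the real C++ work
        return 0;              // success
    } catch (...) {
        return -1;             // failure reported as a code, not a throw
    }
}
```

The catch (...) is the important part: nothing, not even an unexpected exception type, is allowed to escape into the other runtime.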

And lastly, if you thought C++ error messages were confusing, try Objective-C++; since even parsing C++ is undecidable, Objective-C++ is downright hilarious. One of my favorite compiler errors is "Confused by earlier errors; bailing out." That one is especially fun when it's the only compiler error you get.

For OpenGL games (on iPhone or Mac OS X), virtually all developers just wrap the Objective-C they are forced to use and write everything in C/C++.

Name: Anonymous 2010-08-17 5:06

>>66
(C++ was way worse back then)

you got to be kidding?

Name: Anonymous 2010-08-17 9:10

Sure has been a lot of Apple nonsense on /prog/ lately.

Name: Anonymous 2010-08-17 9:41

>>67
Objective-C and C++ exceptions have been compatible since Objective-C 2.0 was introduced.

Name: Anonymous 2010-08-17 14:42

>>68
No, it's true.  C++ was compiled with Cfront back in 1986, when Objective C was introduced.  Cfront was a godawful piece of shit that turned C++ code into C code, and it had a lot of weird corner cases, many of which made it into the C++ spec.

Name: Anonymous 2010-08-17 23:54

>>70
http://developer.apple.com/mac/library/documentation/Cocoa/Conceptual/ObjectiveC/Articles/ocCPlusPlus.html#//apple_ref/doc/uid/TP30001163-CH10-SW2

In addition, multi-language exception handling is not supported. That is, an exception thrown in Objective-C code cannot be caught in C++ code and, conversely, an exception thrown in C++ code cannot be caught in Objective-C code.
What isn't mentioned here is also that Obj-C exceptions will not unwind the stack; no C++ destructors will be run. I'm pretty sure that Obj-C @throw is still a wrapper to longjmp() even in 2.0.

You simply cannot throw exceptions across language boundaries. Don't do it.

Name: Anonymous 2010-08-18 11:14

>>72
http://developer.apple.com/mac/library/releasenotes/Cocoa/RN-ObjectiveC/index.html#//apple_ref/doc/uid/TP40004309-CH1-DontLinkElementID_11

In 64-bit, the implementation of Objective-C exceptions has been rewritten. The new system provides "zero-cost" try blocks and interoperability with C++.
"Zero-cost" try blocks incur no time penalty when entering an @try block, unlike 32-bit which must call setjmp()  and other additional bookkeeping. On the other hand, actually throwing an exception is much more expensive. For best performance in 64-bit, exceptions should be thrown only in exceptional cases.
In 64-bit, C++ exceptions and Objective-C exceptions are interoperable. In particular, C++ destructors and Objective-C @finally blocks are honored when unwinding any exception, and default catch clauses—catch (...) and @catch (...)—are able to catch and re-throw any exception.


So I was wrong in that it's not a feature of 2.0 but the new 64-bit runtime.

Name: Anonymous 2010-08-18 16:05

Aren't there mailing lists for these kinds of discussions? Or reddits? Or overflows? Or anywhere elses?

Name: Anonymous 2010-08-18 16:19

>>74
Hey, you. Yes, you. Fuck you!

Name: Anonymous 2010-08-18 17:45

>>73
Hey, that's interesting. Good news there, except, of course, it doesn't work on iPhone. *sigh*

Name: Anonymous 2010-08-18 21:48

>>74
Die in a fire.  This is probably the most informative thread on the front page.

Name: Anonymous 2010-08-19 4:17

>>77
This is probably the most informative thread on the front page.
Then why did you not bump it?

Name: Anonymous 2010-08-19 4:26

>>78
Didn't you hear? This is /prog/. We sage interesting discussions and only age spam and trolls.

Name: Anonymous 2010-08-19 9:09

>>77
If you want something really informative, you should check out other places instead of turning /prog/ into them.

Just sayin'.
