
Programming Language to Replace C++

Name: Anonymous 2010-08-11 21:49

I think we can all agree that C++ is a terrible language. So why is it still around?

When talking to most C++ users (game developers, systems programmers), I've found that most seem to recognize C++'s faults, but they don't really care. They aren't even the slightest bit interested in a new language that might solve its problems, even one that gives them all the power of C++ with none of the downsides. You can't even get them to look at something new.

Why is that? Why does everyone just 'live with it' without wanting to improve the situation?

Name: Anonymous 2010-08-19 14:53

C99 is all you need.

Name: Anonymous 2010-08-19 16:10

>>79
That's not true. There is rarely any discussion in threads beyond the front page. It's not like /prog/ is popular enough to really need it anyway.

Name: Anonymous 2010-08-19 17:43

I think this thread is spent, sadly.

Name: Anonymous 2010-08-21 13:30

Trying to spark more discussion here...

>>7
> malloc also gives you GC pauses, by the way.
I hate it so much when people say this. malloc() does not have "GC pauses", but yes, it can be unpredictably slow. We get that. That's why we're smart enough to not call malloc() in the middle of our rendering loop.

In most languages with forced GC, you simply do not have the means to perform any non-trivial computation without invoking the garbage collector. In Java, simple container classes like Point2D, which should be value types, are allocated on the heap. JIT can elide *most* of these with escape analysis, but no promises, and no feedback either. Functional languages are absolutely the worst for this; in something like Haskell, seemingly innocuous function calls allocate thunks for lazy evaluation, or reconstruct lists as they process them.

Many embedded apps have very predictable malloc() performance because they never call free(). They allocate all needed memory on startup, and then work strictly within those bounds. When I did game development on mobile platforms before the iPhone (BREW, Symbian), this is how we coded our games. You have to in order to make it safe, especially since different phones can have wildly different amounts of available memory; this way you know that as long as the app starts, it will never run out of memory.

>>7
> BitC
BitC is sort of neat... but I'm not excited about it for a bunch of reasons. They seem confused about what syntax they want to use for it. They started out with Lisp syntax, but with no Lisp features (e.g. macros). Now they are trying to transition to a very ML-like or Haskell-like syntax. It also forces garbage collection, but at least it gives you the means to avoid generating garbage, such as refs to value types.

Most importantly it appears stagnant. Based on the bar on the left, nothing has changed since November 2008. The language is not in a usable state.

Name: Anonymous 2010-08-21 14:35

>>85
> When I did game development on mobile platforms
Pff. I refuse to believe anyone on /prog/ knows how to program, let alone do it for a living. Enterprise bullshit, sure, but not programming.

Name: Anonymous 2010-08-21 15:08

>>85
> I hate it so much when people say this. malloc() does not have "GC pauses", but yes, it can be unpredictably slow. We get that. That's why we're smart enough to not call malloc() in the middle of our rendering loop.

>>7
> The advantage of doing it manually is that you will never call malloc while handling a real time event.

Might want to calm that jerky knee at least long enough to read the following paragraph.

Name: Anonymous 2010-08-21 15:39

Malloc is slow.
People use either alloca or (usually) preallocated static buffers and manage them themselves (SSE memcpy/memmove, DMA hacks, etc.).

Name: Anonymous 2010-08-21 16:37

I'm sick of this bullshit about malloc() being slow. Who the fuck spouts this bullshit?

The malloc() on my system takes less than 0.05 microseconds (that's under 100 cycles) for random allocations of 0-128KB. There are some "long" pauses here and there, and by "long" I mean less than a microsecond. The overhead is also ludicrously low: for example, 10 million 2-byte allocations (pointers discarded) consume less than 21MB of virtual space. I've never seen the overhead exceed 10%.

I'm not saying malloc() is the final solution for memory allocation, but goddamn, it's supposed to be a convenient tool and not something that makes you roll your own shit on top. If yours doesn't work properly, replace it.

Name: Anonymous 2010-08-21 17:22

>>88
Taking advantage of hardware features is a hack?

Name: Anonymous 2010-08-21 17:27

>>89
100 cycles is a long time.

Name: Anonymous 2010-08-21 18:46

>>91
It's shorter than the time it takes for me to ejaculate.

Name: Anonymous 2010-08-22 0:02

>>90
I don't see normal programs resorting to DMA for speed.

Name: Anonymous 2010-08-22 0:44

>>91
So how long should it take to safely allocate memory?

Name: Anonymous 2010-08-22 1:19

>>94
It's not a question of how long it should take; it just shouldn't be done in a real-time event at all.

Name: Anonymous 2010-11-14 2:20

Name: Anonymous 2010-12-26 21:11

Name: Anonymous 2011-01-31 20:39

<-- check em dubz

Name: Anonymous 2011-02-18 17:41

dubz