
Where were you when...

Name: Anonymous 2012-09-29 23:51

...you realized C is shit?

For me it was ten minutes ago, debugging an overflow involving a thrice-indirected pointer. Never again.

Name: Anonymous 2012-09-29 23:52

Where were you when...
...you realized you are shit?

Name: Anonymous 2012-09-29 23:54

>>2
in your anus

Name: Anonymous 2012-09-30 0:00

when i started using javascript for everything

Name: Anonymous 2012-09-30 0:32

You simply lack the brainpower for advanced abstract thinking, OP. It is you who are shit.

Name: Anonymous 2012-09-30 0:37

>>5
That would imply that I can't understand pointers or debug that program, but I did. It was just a waste of time.

Name: Anonymous 2012-09-30 0:43

when I finished reading a book on D

Name: Anonymous 2012-09-30 0:45

>>1
Halfway through the first page of SICP.

Name: Anonymous 2012-09-30 0:56

>>6
If you really understood pointers, you wouldn't have created the bug in the first place, and debugging would not have been necessary.

Name: Anonymous 2012-09-30 1:03

>>9
It's like you've never written a line of C in your life.

Name: Anonymous 2012-09-30 1:04

>>6
well, now we know you're a three-star programmer

Name: Anonymous 2012-09-30 1:05

did you know that unioning an int and a float, then writing to the int and reading back as float is actually undefined behaviour? did you know that (x+A > 0) where A is a nonzero constant and x+A is unsigned will always be true (even on overflow), because overflow is undefined behaviour?

C is undefined shit.

Name: Anonymous 2012-09-30 1:24

>thrice-indirected pointer
What the fuck are you doing?

Learn to abstraction for fuck's sake.

Name: Anonymous 2012-09-30 1:29

>>1
That gave you trouble? Seriously?

Here's a thought: DEFINE SOME GODDAMN TYPES for once in your life. It's okay to typedef a simple indirect pointer even if it seems redundant. Then the compiler will tell you when and where you fucked up.

>>11
Hah. I was always fond of that one.

>>12
Unsigned ints have defined overflow semantics. I wish they were user-defined but we can't have everything.

libsndfile (or was it libsamplerate?) has a compile-time test for the int/float punning behaviour. It swaps in its own if it doesn't like the result.

I'm okay with undefined behaviour. C isn't a big language and every problem has a solution. I don't even write C these days (except when I contrive a task that 'requires' it) but I miss it.

Name: Anonymous 2012-09-30 2:45

C leads to unsafe, buggy, verbose, leaky code. C is the #1 culprit of "never meeting the deadline" in software development. But it's super duper fast!

C considered harmful.

Name: Cudder !MhMRSATORI!fR8duoqGZdD/iE5 2012-09-30 3:46

I've never needed to go beyond 5 indirections in one expression.

Name: Anonymous 2012-09-30 5:08

When I found out Kernighan and dmr were jews.

Name: Anonymous 2012-09-30 5:33

>>16
I've never had to go beyond two, and I've written virtual machines and compilers.

Two is fine if you have an array of pointers. Anything beyond that, you introduce a new type. If you don't, you're just asking for shit unreadable buggy code.

Name: Anonymous 2012-09-30 5:52

I've never needed more than one level of pointing.
>>1,16 must be doing something wrong.
Also I realised it was shit when I took up a language with functions as first-class citizens and a macro system that didn't make me kick myself in the balls, it was so bad.

Name: Anonymous 2012-09-30 6:49

Two levels of indirection is the rule of thumb. Any deeper is a tell-tale sign that you ought to refactor your code.

Name: Anonymous 2012-09-30 6:52

When I realized every C program has integer overflow problems.

>>12
It's well defined as object representation reinterpretation. Read your ABI and you'll be fine.

Name: Anonymous 2012-09-30 7:02

Well don't program complex abstract shit in C then

Name: Cudder !MhMRSATORI!fR8duoqGZdD/iE5 2012-09-30 7:22

>>18
You won't encounter deep indirection in VMs and compilers.

Device driver code, however...

*(ccv->lt_fflk->su.bptr->lst_out.tx_rtn[mbrtxidx++]) = ...

(Taken from a recent project. I won't say more.)

Name: Anonymous 2012-09-30 8:37

>>1
When I began designing my own language, dissatisfied with C/C++.

Name: Anonymous 2012-09-30 8:43

>>24
And then Symta fell out and you felt bad about it.

Name: Anonymous 2012-09-30 8:55

Name: Anonymous 2012-09-30 10:01

>>26
there is one thing that always accompanies these: $FREE$! (as in, free drugs!)
I love the pun on Stallman. It really exposes the nature of this subversive evil kike.

Name: Anonymous 2012-09-30 10:12

>>26
lol, Xah Lee...

Name: Anonymous 2012-09-30 10:23

>>26
Finally, about the C++ comment :) I think C++ is better than C (not in syntax, which is what you're talking about). C++ is better because it gives you more abstractions to solve problems. Also, it gives you the tools to avoid printf (the "cout" class), to avoid bitmasks for representing booleans (the "bool" type), to avoid preprocessor tricks (templates instead of #defines), and to avoid cryptic error codes (exceptions). So yeah, C++ is better than C (even with a more complex syntax).

Yay?

Name: Anonymous 2012-09-30 12:52

>>26
>Java can be considered as a improvement of C
Is this guy serious?

Name: Anonymous 2012-09-30 13:16

>>4
fuck off javashit KIKE

Name: Anonymous 2012-09-30 15:22

Name: Anonymous 2012-09-30 16:46

>>26
What an idiot.
>Perhaps the best simple example to illustrate, is its invention of the format function printf. Completely ad hoc,
So what?
>inflexible,
Wrong.
>cryptic syntax and semantics.
You never read the fucking manpage, did you?
>and can't do arbitrary n-based number system
Because 99% of the time you print out a number it's going to be in decimal. The other 1% in hex or octal.
>It can print decimal in several formats, but in a bunch of ad hoc fixed ways.
Same as above. No one cares about printing decimal numbers with their digits backwards or some other obscure shit.
>many programers don't really understand what's n-based number system,
Nor does it matter.
>And if you show them hex number system using decimal digits in a list, they would be wildly flabbergasted and goes “WHY would you ever want do that??”
Then how about you explain why, genius?
>Instead of working on a better compiler, let's invent a short syntax on the spot!
How does a better compiler save keystrokes?
>to this day, there are programers who don't understand the difference between a set of true/false formal parameters vs their internal representation, and insists that bitmask is the most efficient way to “encode” boolean parameters.
This sounds like a guy who was too stupid to understand bitmasks properly. Something a 9-year-old kid could probably understand with a bit of teaching.

What's with these idiots who think computers are some magic thing.

Name: Anonymous 2012-09-30 16:53

>>33
Xah Lee
You need know no more.

Name: Anonymous 2012-09-30 21:35

>>12
Both of those are obviously bad if you have even an inkling of how the language works.

The biggest problem with C is that its standard library is too small and the body of idiots using it is too large. It is therefore common to see basic functionality in a standing codebase that was implemented by idiots and has since become too entrenched to be removed without Herculean effort.

Name: Anonymous 2012-09-30 21:50

>>23
Mother of fuck I hate it when people do this, too.

Probably the function in question doesn't even need to have a pointer to ccv, in which case the indirection could have been performed somewhere farther up the call stack. In the unlikely case where you have some seriously fucked data structure which does necessitate access to the top-level, at least define a local to hold the value of tx_rtn so you don't have to deal with that awful shit more than once.

Name: Cudder !MhMRSATORI!fR8duoqGZdD/iE5 2012-10-01 5:06

>>36
That project needed a severe redesign, but it wasn't something I was willing to do given that the rest of the code was like that too.

>>26
Not surprising, he's a Lisp/emacs fan. Those tend to be the ones with their "head so high in the clouds they can't see their feet", the same ones who keep propagating the myth that hardware keeps getting faster while coming up with more and more "elegant" abstraction crap to pile on in their code. It's amusing to see them complaining about why the software they use is so slow...

Name: 26 2012-10-01 6:02

>>33
>ad hoc,
>So what?
I agree, there's nothing wrong with that.

>inflexible,
>Wrong.
You can't configure printf to pretty-print a structure you defined. Actually, GNU printf has this but it's slow as ruby.

>cryptic syntax and semantics.
>You never read the fucking manpage, did you?
printf syntax is very different from the typical C syntax. Also, its behaviour depends on implicit argument conversion. It is clearly defined but way more complex than it should be.

>and can't do arbitrary n-based number system
>Because 99% of the time you print out a number it's going to be in decimal. The other 1% in hex or octal.
And why are they the only bases used? Precisely because printf cannot print in other bases. Circular argument.

>many programers don't really understand what's n-based number system,
>Nor does it matter.
I think it's pretty unsettling that most programmers don't understand such basic mathematics.

>And if you show them hex number system using decimal digits in a list, they would be wildly flabbergasted and goes “WHY would you ever want do that??”
>Then how about you explain why, genius?
I honestly don't see the point Xah Lee is trying to make here.

>Instead of working on a better compiler, let's invent a short syntax on the spot!
>How does a better compiler save keystrokes?
A better compiler doesn't save keystrokes, but a macro does. With both, you get optimisation regardless of whether you use the macro or not. That's why it's stupid to put it in the compiler.

>to this day, there are programers who don't understand the difference between a set of true/false formal parameters vs their internal representation, and insists that bitmask is the most efficient way to “encode” boolean parameters.
>This sounds like a guy who was too stupid to understand bitmasks properly. Something a 9-year-old kid could probably understand with a bit of teaching.
What he is really trying to say is that he dislikes the use of bitmasks as a substitute for proper keyword parameters.

>>37
>Not surprising, he's a Lisp/emacs fan. Those tend to be the ones with their "head so high in the clouds they can't see their feet", the same ones who keep propagating the myth that hardware keeps getting faster while coming up with more and more "elegant" abstraction crap to pile on in their code. It's amusing to see them complaining about why the software they use is so slow...
That's wrong and you know it, Cudder. Bloat isn't the main cause of unresponsiveness. It is the general carelessness towards latency and user experience in favour of bandwidth. Video game input lag and network bufferbloat are excellent examples of this.

Name: Anonymous 2012-10-01 9:34

>>23
That's not what I was expecting, but it's still pretty bad. I typically write ``accessor'' macros for my structs, so obj->var becomes var(obj) and clusterfucks like that are as simple as uno(dos(tres(obj))).

Name: Anonymous 2012-10-01 9:54

ITT: people complaining about C when the actual problem is that they suck as programmers
/prog/ is considered harmful

Name: Anonymous 2012-10-01 11:16

>>39
I'll confess that I usually don't see any point in doing that - is that really so much easier to read than obj->tres->dos->uno? Yes, you've abstracted the data structure, which is nice, but how often are you going to need that? (Though I suppose it would have helped >>37...)

Name: Anonymous 2012-10-01 14:02

Right after i realized i implemented a buggy common lisp just to make c manageable

Name: Anonymous 2012-10-01 14:21

iostreams are better than printf() because the format can be evaluated at compile time. If some C compilers do this for printf(), it's obviously a non-standard extension and won't apply to your own version if you were to write one. Basically C++ is better because you can do more at compile time.

Name: Anonymous 2012-10-01 15:25

|= high(>>43) V stupid(>>43).

Name: VIPPER 2012-10-01 16:45

As i see it the core language is not so much a problem, but the shitty standard libraries are, and most of all how heavily it encourages the use of hacks.

Considering the fact that it has very few extendable core principles and a mostly consistent syntax, i would consider it to be a well made language.

Most of the problems lie with the fact that it was designed as a lowlevel language before the 80s for harvard arch computers.
Because of that it lacks hashtables and friends; with limited resources, implementations of such mostly had to be written by hand to be usable.
To add further, back in the day it must have taken a while to compile, so i assume thats why the preprocessor is such shit.

Besides some syntactical troubles, which most languages have, the one thing that really disturbs me about C is that some operators are mapped to specific operations assuming they are basic, while some stuff is not.
Example: + and * are usually cpu-implemented operations, but there is no exponent operator; even if it were to exist as a cpu opcode you would still have to use it and treat it as a function.
Perl 6 has this feature that operators are syntactic sugar for functions.

Not to mention the lack of builtin over/underflow protection, or even the possibility to check for it without doing a needless arithmetic operation or using inline asm.
Even then you would still have to write a function for this sort of thing in most cases.

I dont wanna hang out the lisper, but anonymous functions are one of the best things ever and i dont really see why C doesnt support that sort of stuff, other than compatibility with harvard arch (im not sure if it can be worked around).
One thing that's questionable for sure is why C doesnt support nested functions; i can do that with ease in asm.

However its utterly crappy libraries and lack of interactive features are inexcusable in 2012. They promote hackery and can make work such a pain in the ass sometimes.

I hate C now too, but you must give it credit for what it has got right and not judge it by its shitty users.
And the worst of all is not C in itself, but the brain rotting abomination called unix that it spawned.
Morons will do stupid stuff no matter what you give them.

Name: Anonymous 2012-10-01 17:32

ITT: Python babbies who can't into using gdb properly

Name: Anonymous 2012-10-01 18:17

>>46
>gdb
Terrible!

Name: Anonymous 2012-10-01 18:56

>I dont wanna hang out the lisper, but anonymous functions are one of the best things ever and i dont really see why C doesnt support that sort of stuff, other than compatibility with harvard arch (im not sure if it can be worked around).
It's called C++11.

Name: Anonymous 2012-10-01 19:00

>Perl 6 has this feature that operators are syntactic sugar for functions.
Nope. Nope nope nope. Noap.

You can define operators with 'sub' and 'method', but not all of them are defined that way, just like in virtually every other language with user-defined ops/overloading.

Even those you define that way don't desugar to a sub call; try defining an operator and calling it as a sub, see what happens. If you work at it you might just be able to figure out how to do that and see the problem.

Name: Anonymous 2012-10-01 22:42

>>45
Actually, even + and * can give you trouble if you are working with types that are larger than the machine's word size. This is not a big deal for PCs, but with 8/16-bit micros the need to branch into the runtime (or expand the support routine in-line) is something of a pain.

Name: Anonymous 2012-10-01 22:55

>>48
You can get downward funargs using just function pointers in C. They work fine on Harvard architectures. They're not anonymous, but that's not limiting in practice.
