Misdesigned, buggy standard library in a misdesigned, underspecified language.
Name:
Anonymous2011-08-06 14:58
I don't think C gets enough credit. Sure, C doesn't love you. C isn't about love--C is about thrills. C hangs around in the bad part of town. C knows all the gang signs. C has a motorcycle, and wears the leathers everywhere, and never wears a helmet, because that would mess up C's punked-out hair. C likes to give cops the finger and grin and speed away. Mention that you'd like something, and C will pretend to ignore you; the next day, C will bring you one, no questions asked, and toss it to you with a you-know-you-want-me smirk that makes your heart race. Where did C get it? "It fell off a truck," C says, putting away the boltcutters. You start to feel like C doesn't know the meaning of "private" or "protected": what C wants, C takes. This excites you. C knows how to get you anything but safety. C will give you anything but commitment
In the end, you'll leave C, not because you want something better, but because you can't handle the intensity. C says "I'm gonna live fast, die young, and leave a good-looking corpse," but you know that C can never die, not so long as C is still the fastest thing on the road.
Name:
Anonymous2011-08-06 14:59
If it ain't C, it's crap.
Name:
Anonymous2011-08-06 16:50
It's dangerous if you see it written in the middle of ENTERPRISE CODE by some code monkey. If you know what you want, C gets() it without any boilerplate code.
gets() is fine for development, just not suitable for production code. In fact, none of the generic input functions are. If you want bug-proof code, it's best to roll your own getchar() loop and parser. The generic functions are just for quick hacks.
>>13
Actually you can safely read a bunch of stuff (no unlimited strings, of course) with scanf, as long as you check the return value correctly and you specify string lengths in the format. See libcaca [1].
I would say the biggest sign that C doesn't get enough credit is that I have worked with over 30 developers on C++ projects and only met 2 actual C++ devs. The others were C devs who thought they were doing C++.
C is a rope. You can do something super-useful with it, sometimes better than with any other tool, but you can hang yourself with it, too. Continuing the analogy, C++ is like that Swiss Army knife with the 800 tools in it. You'll never use 80% of them, many are for only one purpose, BUT THEY CAN ALL KILL YOU!
Direct me to a new site where everyone is not fagshitters.
Please, fagshitters from 10:20-10:31, die in a car fire. I really hope you all do, fagshitters. No, really, faggots, FUCK YOU, FAGGOTS, DIE IN A CAR FIRE. IF YOU WERE HERE, I WOULD STAB YOU IN THE FACE, FAGSHITTERS. FUCKING DIE ALREADY. DIE FAGGOT.
For the fairy prudes, no, I don't hate gay people. I voted for gay marriage in the last few elections where I could, and fully think homosexuality is ichiban A-OK. I just hate your spamming, wanking, fapping fucking guts. This is not what reddit is about, faggots. It is the fagshitters who post fucking bullshit like 10:20-10:31 who need to fucking die in a car fire. EAT SHIT, DELETE YOUR ACCOUNT, AND FUCKING DIE, FAGGOTS.
If you are on this thread, and not a regular reddit reader, I humbly apologize. But reddit is the new Digg, and faggots need to be told they are faggots. FUCK YOU, FAGGOTS. READ THE FUCKING reddiquette, FAGGOTS.
If you are a reddit reader and ignore it, FUCK YOU, FAGGOT.
If you are not, read it, or become a FAGGOT.
Name:
Anonymous2011-08-09 11:02
C is for cookie. That's good enough for me.
Name:
Anonymous2011-08-09 11:05
MrVacBob-sama, please redirect referrals from reddit to goatse.
Name:
Anonymous2011-08-09 11:15
>>1
Listen I can't be bothered to explain it to you so here's an image macro depicting the explanation: http://bit.ly/lF39pb
I love C. It's my favorite language, it's the language I have the most experience in, and the language I'm most comfortable with. That was an excellent paragraph.
>>55 YOU ARE NOT QUOTING ANYONE, YOU HAVE NOT QUOTED ANY POST. HOW CAN I KNOW THAT YOU'RE REPLYING TO >>4 (A KOPIPE, NONETHELESS)? GET THE FUCK BACK TO REDDIT, ``PLEASE''.
Name:
DIGG_REFUGEE_XD2011-08-09 21:47
:P HI REDDIT
Name:
qwerber2011-08-09 22:06
>>not so long as C is still the fastest thing on the road.
Try assembler
Name:
Anonymous2011-08-09 23:17
Actually, that C copypasta can be traced back to the comments of a LiveJournal post from 2006. The theme was "programming languages and their relationships".
It's kept on the standard because some old code relies on the existence of gets(). (Note that SUSv4 deprecates gets(), but C99 does not.)
This is what C is about: putting the standard at the service of badly-written junk from the '60s rather than at the service of current needs and current technology. There is nothing really intelligent about C -- not nowadays, not even ten years ago, when the standard was being conceived. Many decisions inside C are pathetic at the very least. C lacks the notion of keyboards or monitors; but the standard library has support for internationalization and formatted output of monetary quantities. Fuck, for what reason, if possibly no screen exists for output, no device exists for input, and there is no portable way of knowing whether 'A' translates to 0x41 or to 0xLOL, because the execution charset is left unspecified? C not only lacks the concept of threads, but the execution model itself does not support them at all: POSIX and C are actually descriptions of two _completely different_ languages, since they differ in a core aspect. C does not know what networks are, let alone network protocols; C is ignorant about filesystems, let alone databases or query mechanisms. C lacks screens, so how could it ever standardize _any kind_ of GUI? C really can calculate a lot of things, but you shouldn't expect too much about actually _seeing_ what those things are -- or, better stated, you shouldn't expect too much about how to _input data_ beyond a myriad of pathetic scanf() format strings, since that is C's best approach to 'data formats'. C has no well-defined binary format -- not in the standard, at least. C has no notion of libraries. C has no notion of fixed-point arithmetic (but knows every possible useless aspect of floating-point exceptions, albeit _no_ current compiler implements all of them). And C still hasn't decided whether executing '-x', where 'x' is a signed integer, is undefined behavior or not, because C hasn't decided much about integer representation, bitwise operations on signed words, or overflow conditions.
C is a victim of 'trap representations' much as a single piece of assembly-language code is a victim of spurious interrupts at the hardware level.
Note that none of these concepts -- networking, filesystems, threads, word sizes -- is extraneous nowadays, nor completely impossible to standardize, given the amount of common practice that yields _de facto_ standards of its own sort. SUS defines a set of functionalities as optional; why doesn't standard C have a couple of its own?
The reason is that, for C, there is no real evolution -- just a repetition of old, outdated, arcane kludges, sometimes marketed as 'serious programming' to preserve the renown of an elitist subset that outperforms its peers only in knowing every possible undefined-behavior condition. Actually, such behavior is the hallmark of C: anyone familiar with C programming communication channels -- IRC, Usenet or other such cesspools -- knows well, and the hard way, that the common behavior of the C community is to act according to some sort of Sabbath code of conduct, praising FUD as a real deity, bashing newcomers heavily with the 'READ THE STANDARD' crowbar, and dismissing legitimate, complex doubts with the 'THIS IS OUTSIDE THE STANDARD'S SCOPE' excuse. In C communities, ignorance is regarded as a strong weapon to keep the status quo untouched. Things get even worse inside C++ communities, albeit C++ is such an abomination that it is understandable why its adepts behave likewise.
C is the hallmark of blatant barbarism and technological stagnation. C is obsolete, and was obsolete a decade ago. The fact that 70% of the world's software is still developed in C, or running as a C compilation, is no argument against the fact that C is not capable, and never has been, of dealing with anything better than the manipulation of simple byte strings; and even then, the result is _guaranteed_ to be unportable from the very beginning. There are _no_ non-trivial C programs completely free of undefined-behavior conditions, because such a thing is almost infeasible in theory, and in practice no one really gives a fuck ('If it crashes, just restart it'). Portability is a joke in technology generally, but in C it is taken to a really extreme dimension: ill-defined word sizes, unspecified endianness, unspecified character sets, unspecified memory models, unspecified file formats, unspecified pathname formats, and so on, _ad nauseam_. C compilations have the incredibly retarded bottleneck of simple file I/O due to the extensive copy-pasting (a reality in the old era, a dementia in the new world) of standard header files into _every single translation unit_; and typical C projects extensively lack organization, even at the directory-hierarchy level. C front-ends are completely incompatible (there's barely a portable way to simply compile a translation unit, without any extensions or compiler flags, and still expect the result to be usable between compilers without a bunch of quirks to select calling conventions, symbol naming and so on), which propagates to the impossibility of creating portable makefiles, which propagates to the need to develop GIGANTIC balls of shit such as Autotools, Libtool and other GNU smelly crap. C is the cause of tremendous amounts of scripting kludges, severe security bugs and man-years of wasted refactoring and rewriting of unintelligible code.
That said, C is indeed a misdesigned, underspecified and braindamaged language; actually, a 40-year-old prank that got incredibly far in computer history much more because of political influence than because of technological quality, if there is any. Not that there are _really better_ alternatives to circumvent the problem with -- maybe this is the reason why C is still widely in use. C is to programming as Unix is to operating systems, and as lobotomy or trepanation is to headache treatment. People use Unix because there's no better alternative, as such things go, not because Unix is a good OS in any regard. People use C for the same reason. People drug themselves not because drugs are good for your health, but because life sucks _even worse_ without them.
In the current technological portfolio, adoption of any one of the available 'choices' is a _consequence_ of the absence of a marginally decent alternative.
>>62
You completely miss the point of C. C is great precisely because most of its behaviour is undefined. It is only a more portable and convenient version of CPU assembly.
If you want something well specified, standardized, and bundled with a complete standard library, go for a higher-level language.
Name:
Anonymous2011-08-10 13:43
Isn't gets removed in C1x anyway?
Name:
Anonymous2011-08-10 13:59
>>65 You completely miss the point of C. C is great precisely because most of its behaviour is undefined.
Not in the slightest. Actually, as far as current C development goes, C has no 'point' to be missed. At all. C has no clear objective. C is used for both kernel development and GUI applications in an almost promiscuous relationship -- one including headers from the other, for example.
C cannot be called a portable assembler -- the claim does not make sense, at the very best. C compiles _very badly_ at the assembly level. Current C compilers have pitiful support for SSE or vector instructions; many C compilers lack LTO (Link-Time Optimization), which outperforms intra-procedural optimization at large scale. Some compilers can't even figure out extremely simple assembly-level constructs, due to a variety of factors; for example, I've never seen a single REP SCAS sequence emitted (outside a builtin call) from a simple scan loop, or even some register being used _properly_ inside the specific context it was created for in the first place. Advocates of the idea that C compilers optimize code better have probably never written a single line of assembly, or haven't done it seriously yet. Even the C function-call model (using the stack to push arguments) has pushed the hardware industry to heavily optimize stack access at the hardware level, simply because most compilers couldn't agree on any standard means of transporting metadata about such calls at the toolchain level (for example, reusing registers and flags as parameters), a thing any competent assembly programmer would know how to do. Some aspects of modern hardware aren't even reflected in C constructs, and kludges emerge. For example, the need for memory fencing to synchronize cores in an SMP processor can't be foreseen by compiler analysis, hence the need to hard-code such instructions into every call to a pthread function to guarantee correct behavior.
I agree, though, that these problems are related less to the language than to the _implementations_. Nevertheless, what matters in practice is _practice_, not theory. It is always possible to create a frontend targeting the least common denominator of a set of architectures, thereby yielding correct code for every one of them from the same source file, but claiming that such a design is a 'portable assembler' can hardly be sustained, in my opinion.
Last, but not least, this is not about 'what I want'. It is not a personal relationship with the matter. It is about 'what the thing really is'. I have had an as-sane-as-possible relationship with C for about ten years now, but that should not prevent me from seeing what it really _is_, instead of seeing what I _wanted it to be_. Ultimately, as I said, choosing C for any reason is merely circumstantial, due to the current state of affairs of modern technology. No one gives a fuck whether PHADDW is being properly used for maximum performance on your shiny new Wolfdale core, or whether the crash will occur on the next input byte from the socket. The big issue is whether these things will generate new sales or not: new compiler sales, new standard-document sales, new processor-chip sales. Correctness is actually the most inconvenient part of the process. C has done its grand part in helping library developers sell their crap, helping hardware developers sell overpowered processors, and helping specialized consultants clean up all the derived mess.
Just to make it clearer: C compiler backends merely generate better machine code than other languages', which doesn't mean they generate _good_ machine code at all.
And newer technology with just-in-time machine-code compilation promises to break this myth once and for all. Unfortunately, I have had too little contact with such languages to know whether that is reality or just another round of marketing hype.
Name:
Anonymous2011-08-10 14:25
Something I'd like is a C dialect with all undefined behavior defined as would be best for x86_64.
Name:
a2011-08-10 14:31
C is fast because hardware is designed for it, not because it's designed for hardware.
Name:
Anonymous2011-08-10 16:04
>>72 Something I'd like is a C dialect with all undefined behavior defined as would be best for x86_64.
You don't understand what undefined behaviour is for.
It is not about multi-platform support, by and large. It is in fact exactly about compiler optimizations, for the most part, at least these days.
For example, when the standard says that integer overflow is undefined, it is not because your code is supposed to run on a non-two's-complement machine, but because the compiler is then free to assume that when you write x++ you know x will not overflow (for reasons outside the compiler's reach); it then knows that the new value is greater than the old, and can use this knowledge for various optimizations -- for example, it now knows that you are not going to access the previous array element and can delay the memory store.
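A minimal sketch of that assumption in action (function names are mine):

```c
#include <limits.h>

/* Because signed overflow is undefined, a compiler is allowed to
 * assume x + 1 never wraps, and may fold this whole function to
 * "return 1" at optimization time. */
int plus_one_is_greater(int x)
{
    return x + 1 > x;
}

/* Unsigned arithmetic wraps by definition, so here the comparison
 * must really happen: with x == UINT_MAX the sum is 0 and the
 * result is 0. */
int plus_one_is_greater_u(unsigned x)
{
    return x + 1 > x;
}
```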
The same goes for the pointer aliasing rules (accessing an int via a pointer to double is undefined, so the compiler need not worry that a store through a double pointer might make it necessary to reload an integer loop variable from memory), stuff like pointer arithmetic being undefined outside the allocation unit, and so on.
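A sketch of the defined way around those aliasing rules (assuming the usual 32-bit float and 32-bit unsigned): type-punning through an incompatible pointer is exactly what the rules forbid, while memcpy is well-defined and compilers lower it to a plain register move anyway.

```c
#include <string.h>

/* Reinterpret the bytes of a float as an unsigned int.
 * memcpy is defined behavior, unlike *(unsigned *)&f, which
 * violates the aliasing rules described above. */
unsigned bits_of_float(float f)
{
    unsigned u;
    memcpy(&u, &f, sizeof u);
    return u;
}
```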
This is an unscientific and ultimately destructive dead end, obviously, but C has painted itself into this corner by overspecifying the low-level details of its abstract machine. That seemed like a good idea at the time, but inertia has since carried the whole thing much farther than is sane.
>>74
HAHAHAHAHAHAHAHHAHAHAHAAHAHAAAA!!!
you think your tough huh?
one word THE FORCED INDENTATION OF CODE.
i have taken out two mission critical applications at the same time in less than 5 seconds i have been training for 3 years.
also enterprise grade best practices.
your compiler might be bigger than me,but i know mine is smarter and quicker.
my compiler is 130 kb pure lean code.
one keystroke and i'll overflow your buffers.
your the one whose a nerd.i can optimize CFLAGS anytime i want you probably haven't ever touched CFLAGS before.
you probably have sex with your computer.
you don't even know me,and you don't want to.
you'll be lucky if your even worth my attention one look at my code and you'll dissappear forever.
though i'd be hapy to humiliate you in front of all your friends.
btw IM the expertest.
i have worked in maine, new hampshire, new york,utah, colorado,florida,bahamas.
never indented my code!
im undefeated in competitive obfuscation of code.
im on my way to IOCCC.
go ahead and come step anytime you want.b*tch
Name:
Anonymous2011-08-10 16:30
joh joh joh /prog/ is so fresh against /prog/ reddit looks like a rat
Type qualifiers are to programming languages as genital herpes is to sex. Itches, scratches and overall low-quality results.
Name:
Anonymous2011-08-10 16:58
>>80 Type qualifiers are to programming languages as genital herpes is to sex. Itches, scratches and overall low-quality results.
When performance matters, they are invaluable. Performance always matters.
When performance matters at that level, you should be writing straight machine code instead of trying to tweak the compiler's guesses -- and obviously only after a decent profiling session. No amount of keywords will ever evaluate to any sensible, objective measurement of performance gain.
That's one reason why no one uses individual '-foptimize-whatever-fuck-trees' flags and everyone instead uses '-O3', without ever profiling anything. People just say they care, but they don't really care; they're satisfied with _believing_ the compiler will somehow make things better in terms of performance. Keywords give them extra self-delusional power.
For what it's worth, the only keyword that matters in C is 'restrict'. 'restrict' is a C99 keyword; the vast majority of programs are still written in C89, thus either not getting 'restrict' at all or getting it as an extension. And often getting it wrong.
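A minimal sketch of what 'restrict' buys (the function is mine, for illustration): by promising the compiler that dst and src never overlap, it can keep src values in registers and vectorize the loop without reloading after every store.

```c
/* Scale n floats from src into dst. The restrict qualifiers assert
 * that the two arrays don't alias, so the compiler need not assume
 * a store to dst[i] could change src[i+1]. */
void scale(float *restrict dst, const float *restrict src,
           float k, int n)
{
    for (int i = 0; i < n; i++)
        dst[i] = k * src[i];
}
```

Calling it with overlapping arrays would be the "getting it wrong" case: the qualifier is a promise, not a check.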
Name:
Anonymous2011-08-10 17:18
typedef const int * restrict fast_t;

int main(void) {
    register fast_t vroooooooooooooooooooom;
    return 0;
}
Name:
Anonymous2011-08-10 17:23
>>83
Hardware optimization is always a sucker's game; even ace asm coders like Michael Abrash say it's impossible to guess how all the factors interrelate in the CPU pipeline.
There's a difference between hand-wiring a very limited hotspot in machine language and attempting to optimize an entire code branch through arcane microfusion knowledge. No doubt, anyway, that a compiler does it worse -- especially if the 'register const int array[const static restrict 0x10]' declaration of a function parameter was your best attempt at 'optimizing access to an array' through language keywords.
One doesn't need to know everything about optimization to produce better-than-average code in assembly, especially because, indeed, knowing every possible correlation is impossible. By design, assembly code is much tighter, which by itself is a great advantage -- higher cache-hit ratios, a reduced number of overall instructions due to non-redundant calculations, etc. Good assembly programmers know how to produce maintainable, optimized code in a healthy way, when they really need to. They only do so if it can be proved to be worth the effort.
>>86
IIRC ANSI and ISO C don't prohibit the use of the `_t' suffix for types; only POSIX does. So as long as you don't include any POSIX headers, you can use such type names.
>>88 only POSIX does.
That's why it sucks. "Hey, guys, look! C89 uses a cool style for typedefing types! Let's steal its awesomeness and tell everyone not to use it." Fuck this bullshit.
You're correct, although there's no need to include POSIX headers to 'be' in POSIX. Just by choosing a POSIX environment (#define _POSIX_C_SOURCE) you're already limited to their namespace.
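Concretely, opting into a POSIX environment looks like this (a fragment, not a full program; the macro must precede the first #include to take effect):

```c
/* Request the POSIX.1-2008 namespace before any standard header: */
#define _POSIX_C_SOURCE 200809L

#include <unistd.h>   /* POSIX declarations (getpid, write, ...) now visible */
```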
>>1
Quoting from Rationale for International Standard—Programming Languages—C, revision 5.10:
[quote]7.19.7.7 The gets function
Because gets does not check for buffer overrun, it is generally unsafe to use when its input is
not under the programmer’s control. This has caused some to question whether it should appear
in the Standard at all. The Committee decided that gets was useful and convenient in those
special circumstances when the programmer does have adequate control over the input, and as
longstanding existing practice, it needed a standard specification. In general, however, the
preferred function is fgets (see §7.19.7.2).[/quote]