Your void main is a system-specific way of writing the startup function which may or may not work. The C language requires (hosted) implementations to support int main(void) and int main(int, char **). Also, the printf function must be declared before being used. This is because it is variadic. Implicit declarations are not required to be compatible with variadic functions, so this may or may not work. If you fix these problems, then the program outputs 2 (in a reliable way that should work everywhere). The expression x=2 yields the new value of x. The printf function operates on the value produced by the argument expression, not on the object x. (I.e. the new value of x is stable before the function is called, due to a sequence point that happens before a function call, but that is irrelevant since printf does not know anything about x.)
Name:
Anonymous2012-01-19 12:40
>>7
Oh shit man. Are you the same Kaz whose C exception handling code got used in Wireshark?! Because based on how you write, I don't think you're posing as him.
Name:
Anonymous2012-01-19 12:43
>>7
Yeah, I think you are the same person! Wow, I've gone back and forth with you on another programming forum. You're a programming god.
>>15
Not a mental midget like kodak.
Guy must be the second coming of Dennis Ritchie & McCarthy >>7
Put an actual email in the email field; must be crazy or trolling.
Name:
Anonymous2012-01-19 13:43
easy guide to detecting faggots:
1. utter "sizeof(char) is always 8 on proper architectures"
2. see if they cringe
3. (optionally) if they do, punch them in the groin
Name:
Anonymous2012-01-19 13:45
>>17
Now tell us again why you can't get a programming job.
Name:
172012-01-19 13:48
self-fix:
1. utter "((char)1<<8 + (char)1<<7) == 1<<7 on all proper architectures"
>>20
8 is ambiguous. Man, I tell ya, your written communication is almost as bad as your programming skills. So again, you're stupid. And yet again, you have no possible future as a computer programmer.
Name:
Anonymous2012-01-19 13:56
I bet kodak is the kind of faggot who respects strict aliasing everywhere.
>>20
Also, once again, you need to learn how to read. Look at how ANSI/ISO 9899 defines char. Do you see 8 anywhere? Does the standard imply 8? Exactly. Now shut your pie hole you mental midget.
Name:
Anonymous2012-01-19 14:00
>>24
I don't, but on all proper (ie non-shit) architectures it will be 8 bits. Fuck your microcontrollers, faggot.
>>24
Me thinks the standard guarantees sizeof(char) is 1, hence you failed to get the hint.
Hugs & kisses.
Name:
Anonymous2012-01-19 14:02
>>23
That kind of reminds me of a firm I interviewed with right after undergrad. The managers not only asked me to solve the programming problems, but to solve them in such a way that they would pass a compiler with all warnings enabled.
>>26
It defines it as 1 (byte). Nowhere does it say anything about it being 8 bits. There is a reason for being so subtle. In some respects, it's kind of like how '%' in java isn't modulo. The difference is subtle, but important.
>>25
In other words, you expect the shit to only run on your mother's PC. Nice job chief. You're below average.
Name:
Anonymous2012-01-19 14:10
>>28
What he's saying is that sizeof(char) is always 1, which defeats your little "[code]sizeof(char)[/code] is always 8 on proper architectures" impression.
>>25,27
C is not meant to be a general-purpose algorithmic language. People trying to use it as such is the reason why we have so much shit software today and so many security flaws. Just look at GNOME or the Linux kernel. I'm not saying C++ is any better, of course, it just amplifies C's problem by piling layers and layers of shit on top of it.
So please, use C as what it was meant to be: a portable assembly language.
>>25
Enjoy your unreliability and lack of GC. I truly hope you get stuck designing GUIs in C for the rest of your shitty life.
>>32
The last post I saw from him on comp.lang.c was the source code for emulating the whole try/catch thing. He went on to say something to the effect that this code eventually made its way into Wireshark. I also remember him talking about working on some kind of embedded Linux thing.
I smell bullshit. GNOME is crap because the homos on that project decided to do an entire code rewrite instead of fixing the existing bugs. As you probably don't know, it's just far easier to rewrite the crap and "get it right this time".
>>36
Here's what's really funny about shit like GNOME. It used to be that part of the code base was stable and secure. Then a new group of wankers came in and just rewrote it. This happened because, and I quote one engineer: "The code looks ugly, and besides, I really don't understand what it does".
>>25
>2012
>using 8-bit bytes for obsolete 8-bit character sets
WE UNICODE NOW
Name:
Anonymous2012-01-19 14:45
>>38
>2012
>using unicode
ANSI master race here, suck my dick mental midget toilet scrubber. You will never amount to anything in your life and I have to keep coming on /prog/ and making fun of you mental midgets to keep my unstable self-esteem from hitting rock bottom.
>>36
>implying "we'll get it right this time" ever works
at least they have sexier sugar-coated code now, so maybe someone moar capable will be able to understand their bullshit without reading 10000+ pages of homo bs