

C/C++ Programmers

Name: Anonymous 2011-11-24 12:59

Why must they feel the need to redefine all the primitives with different names?

Name: Anonymous 2011-11-24 13:00

Sometimes for abstraction, sometimes because some implementations are too slow, sometimes because they don't want the implementation to be standard-conformant.

Name: Anonymous 2011-11-24 13:06

If you mean stdint.h types, then to ensure the correct size.
Otherwise, they are mostly just obfuscating the code.

Name: Anonymous 2011-11-24 13:10

...making their code cross-platform by renaming standard types and wrapping the code in #ifdef horror...

Name: Anonymous 2011-11-24 13:17


#ifdef _MSC_VER
// Windows
typedef unsigned long long u64;
#else
// Posix
#include <stdint.h>
typedef uint64_t u64;
//typedef unsigned long long u64;
#endif

Name: Anonymous 2011-11-24 13:25

>>5
typedef unsigned long long u64;
typedef uint64_t u64;
Why not just typedef unsigned long long uint64_t; for Windows?
Also, you can grab an MSVC version of stdint.h from https://msinttypes.googlecode.com/svn/trunk/stdint.h or simply include the one that comes with VS2010.

Name: Anonymous 2011-11-24 14:53

>>1

Most reasonably dedicated programmers have two crippling disabilities:
    - Lack of formal, serious education and experience in the programming area;
    - Some severe social dysfunction, typically in the autism spectrum (but which, contrary to what is usually implied, does not warrant the associated greater-than-average intelligence).

The first disability is reflected heavily in the (lack of) quality of the code produced by the aforementioned sufferer. Speaking of C: from typedef struct constructs to the most extreme attempts at creating OO-driven abstractions and horrible "language extending" macros (GLib and the like), such a coder develops a particular paranoia about some set of worries and issues, while completely disregarding other aspects of the (rather difficult, as it seems) art of programming. Unnecessarily redeclaring (that is, hiding) identifiers through typedefs and sprinkling around tons of preprocessor directives (#ifdef being the least harmful of them) is just the small part of the problem.

The big part of the problem -- which consumes, and has consumed, man-years of worldwide effort to properly settle things down, to fix the damage these things have already caused, or to prevent future disasters caused by such coding practices -- is the proliferation of coding style cults; in other words, the same bad programming practices turned into some kind of philosophy, or model to follow, ultimately yielding different dialects of the same language, and giving solid ground for the uprising of swarms of different, though equivalent in every practical aspect, programming languages.

Sometimes these cults become role models to idolize; in the most severe cases of this disease, and in the most demented of programming environments, things such as Boost are regarded as the "expert way" of doing things, no matter how senseless that really may be. Just look around. GNU code, which is rather proud of having literally thousands of lines of (extremely badly written) code for a simple ls command, is what runs in the userspace of almost every modern Linux system nowadays. OpenSSL is what covers all of our security needs: http://www.peereboom.us/assl/assl/html/openssl.html. Microsoft code does not even respect the relevant standards. And things go on, as it seems, more crippled with each passing day; and the sole, single reason for the problem is the presence of extremely uneducated people, who don't have the smallest grasp of the consequences of their acts, kludges and "well-thought-out decisions" in their Autotools-build-driven, #ifdef-sprinkled, type-redefined, extension-dependent, completely undertested pieces of what they call "software".

The second disability perpetuates every damage caused by the first one. Given due time, it seems that programmers are transformed into computers themselves; they lose any ability they may have had to relate to more complex, sensible creatures -- human beings. They over-rationalize everything to a really advanced pathological degree, and this is ultimately reflected in the technology directed at industry and at the development of new tools. From aberrations like the C++ and Haskell languages, through the adoption of entire SQL database servers to store ridiculously unimportant information, without any need for the availability or features a database system is actually designed for, to the development of large-scale projects in completely non-scalable technologies (mostly interpreted languages like Python): it is all the result of simply lacking the ability to hear others say: "Hey, man, that seems bad." "Are you sure this language/technology is adequate for the problem?" "Uh... all this template metacoding seems cool, but what was the real-world problem we were supposed to be solving right now?"

Programmers begin to worship The Machine and rejoice with unbelievable ecstasy over absolutely worthless achievements, which are buried deep inside an entire ocean of usually badly written (see above) code and which do not give the end user any useful improvement, let alone anything perceivable by anyone. Many times they do not even offer any advance in any area related to the development itself: it is just a rewrite/refactor/re-whatever, as bad as or worse than what it is intended to substitute. How many completely different and incompatible build systems are out there in the wild (gmake, cmake, nmake, BSD make)? How many different and incompatible GUI systems? How many different and incompatible database systems? How many... the list is exhaustingly long. Autism: this is the cause. Programmers isolate themselves in a reality bubble, completely adrift from the world outside it. Programmers worship the Computer God, and become themselves a tool of a tool. Worst of all: the malady is so widespread it becomes mainstream; it's a pandemic.

Think about it for a single minute: if you wanted to write a very simple application (whichever it may be) in any currently available technology, which language or technology would you choose? It is extremely difficult to choose. There are so overwhelmingly many (completely different and incompatible in the most basic aspects) languages and resources, and each one of them achieves so little in a very strict group of real-world necessities (which are not much beyond simplicity, compatibility and/or performance), that chances are you'll be sticking with what the world has stuck with for so long, even nowadays: C. That's right. If you're barely serious about it, you'll do it in C; not because C is good, but because everything else is simply worse. Go figure: there's a reason why 90%+ of the world's software is written in C, crudely or not, but that reason is not that C, or anything similar, is good. It's that programmers suck, and they suck really, really bad, both as human beings and as tools of a tool.

Oh, by the way, what was the OP's question in the first place? Uh. I don't know. I'm a programmer myself.

Name: Anonymous 2011-11-24 16:29

>>7
I don't program for a living, and probably, by the standards of many, I'm a poor programmer. So I never understood one thing:
Why is typedef bad with structs?
(I don't use it btw, someone once told me not to do it so I don't.)

Name: Anonymous 2011-11-24 16:43

Personally, I think we should all program in BBB-CODE

Name: Anonymous 2011-11-24 16:52

>>8

typedef struct anus anus; //bad way
#define anus struct anus //proper way

Name: Anonymous 2011-11-24 16:58

>>11

// proper way:
#define private public
#define protected public

Name: Anonymous 2011-11-24 17:04

>>11
>doom 3 src
>laughing whales.png

Name: Anonymous 2011-11-24 17:25

>>8

In C, the programmer is "invited" to worry about some aspects of the program which most other programming languages do not directly address. Whether that is good or bad is a matter of opinion, but in C, things like size and alignment are meaningful to the programmer.

A structure in C is, in the vast majority of cases, larger than the relevant word size. This is true when the structure contains more than one field, especially when padding is involved. Since there is no overloading in C as there is in C++, types don't mean much by themselves except for what they hold, contain or represent; it is not very useful to attempt to use structure types as tags to select overloads or to perform other kinds of static code generation. Therefore, it is rather meaningless to declare a struct only to contain a single element (you'd use the single element directly instead), or to otherwise contain word-sized data which would fit better in a single word (and without the unavoidable padding penalties involved with structures).

Thus, a structure is often a rather "big" chunk of data which should be treated differently from smaller (scalar) data types. For example, while it is wise to pass around copies of scalar types to routines, it is unwise to do the same with large structures; it is better to pass a (possibly const) pointer to achieve the same effect. It is also bad to directly return structures; a pointer to an output buffer is usually better. And while it is OK to stack-allocate int arrays, you'd rather avoid it with structures, since they can be much larger than an int and may trash your stack area. With a structure, you expect it to be larger than your typical int variable, and you will refrain from doing a lot of things which you would do with scalar types.

So, typedef struct is bad in a number of ways. First, it disguises a structure as an ordinary scalar type: it is hard to tell whether it is really a redefined scalar type (which is also bad), hard to tell whether it is addressable, whether it can be dereferenced (if a pointer to a structure), and so on. People may even attempt to do arithmetic with it, as if the type really were a scalar type (think about someone implementing a binary tree with your structure type as data, and using plain subtraction as a comparison criterion -- it won't work on aggregate types).

Second, it pollutes the ordinary namespace unnecessarily. For structures, unions and enumerations there is the tag namespace -- an entire separate identifier namespace for that purpose. Some code even does things like:


typedef struct my_struct {
    // ...
} my_struct;


which is doubly bad, since it pollutes both namespaces with the same identifier. This is a minor cause of problems, but it is nonetheless good to avoid messing too much with identifiers, especially because C lacks proper namespacing mechanisms.

Third, typically, people who apply typedef to structure types will do the same with unions. By simply looking at the type you can't differentiate a structure from a union, and the access syntax for them is exactly the same (they can even contain exactly the same members). While strong syntax is no excuse for lack of proper testing, in this particular case the syntax does help one better understand the code and avoid stupid mistakes.

Last, typedef in C is, in common practice, bad not only with structures but with everything else. A typedef does not declare a new type: it's just an alias the compiler will happily see through. Thus it helps to do what should not be done in C: hide information from the programmer. You can't properly printf() or scanf() typedef'd variables, since you can't say whether one is an int or a long. You can't bitwise-operate on it, because it might be a float. You can't assign to it safely: it might be const. typedef is roughly meant to substitute for #define, so it limits itself to almost only what #define can achieve. It is meant for creating opaque types due to standard-wide constraints and other implementation-specific issues, and even then, such types are very well documented.

Roughly the same applies to putting typedef together with unions (without the same sizing problems, but with the same syntactical unawareness problems), enumerations (namespace pollution) and types in general, when it is visibly not needed.

Name: Anonymous 2011-11-24 20:45

>>13
Gayest post evar

Name: Anonymous 2011-11-24 21:21

>>13
I ask you, sir, do you have links to resources that might teach me good overall coding practices? If what you've been saying is true, there is a large amount of misinformation out there, but you seem to be a person who can share some good, reliable information.

Name: Anonymous 2011-11-24 23:06

If it ain't Lisp, it's crap.

Name: Anonymous 2011-11-25 16:11

>>15

I haven't got any good resources specifically about coding practices, neither in C nor in other languages. I've learned a lot from some newsgroups (though you have to dig through the enormous quantity of crap which composes the majority of the content) and through sheer experience; and whenever I read something interesting I almost always discard the original reference and only keep the content within my knowledge.

In an effort to be useful, I'll give you two good references to keep: the well-known C FAQ (http://c-faq.com/) and the CERT Secure Coding Standards page (https://www.securecoding.cert.org/). These should give you hints on how to, at least, get your program running correctly and safely.

This link (http://www.ibm.com/developerworks/aix/library/au-hook_duttaC.html), from IBM DevWorks, is also good, albeit far too full of corporate crap and some extremely retarded recommendations (setbuf(stdout, NULL), what the fuck?!).

Some IRC channels are also a very good source of information; #posix on Freenode is one of them.

Now, things to run away from: avoid every tech or "IT" blog you may come upon (programmers who advertise themselves as "IT" should be treated with extreme distrust). Under no circumstances consult Reddit, Slashdot, DrDobbs, or programming forums in general for any advice (though the latter seem to be a little better than the former). StackOverflow is okay, as long as you read -every- response to a question, in order to form a well-composed opinion. And avoid "gurus" (especially in the C++ field): Sutter, Alexandrescu, et al.; thus, avoid gotw.ca.

There's an enormous lot of misinformation out there, that is a fact. Unfortunately, when you're out in the wild, that's just how you are: alone, and out in the wild. It is difficult to filter good, correct information from crap when you don't know much about the subject. From my experience, 9 out of 10 articles I read online are not only crap, but simply technically wrong; in other words, just don't rely on what you read online without verifying it (if it matters to you). The only things you can ever count on absolutely are the standards that define the language, library or technology.

For example, compare this (the correct):
https://www.securecoding.cert.org/confluence/display/seccode/POS03-C.+Do+not+use+volatile+as+a+synchronization+primitive
http://software.intel.com/en-us/blogs/2007/11/30/volatile-almost-useless-for-multi-threaded-programming/

With this:
http://drdobbs.com/cpp/184403766 (and this is written by Alexandrescu himself)
http://java.dzone.com/articles/threading-stories-why-volatile
http://www.martynov.org/2007/07/volatile-and-threading.html
(and dozens of StackOverflow blatantly wrong responses and recommendations.)

Just a last bit of philosophy: people seem to enjoy fame more than everything else. First, they start to write about things online, then they open a blog or site, then they acquire visibility, then they begin to write more about things they don't know about (at all), because visibility and renown is what matters in the end. That's pretty natural and fine, but a reason why I still spend (mostly waste) time on anonymous boards is that, well, they're anonymous: I can't build a reputation to which I can masturbate later on. Thus I deny myself the temptation to write about things I don't know about, since it won't afford me anything in the end: neither cash, nor pussy, nor reputation.

I can focus on stuff that matters.

Name: Anonymous 2011-11-25 16:15

>>13
>>17
Serious answer in /prog/. Am I dreaming?!

Name: Anonymous 2011-11-25 16:35

>>13
Fuck you faggot, I'm not typing struct every fucking time I use one.

>>17
Your fag links don't instruct the programmer on how to properly use memory barriers, and instead tell you to just use the appropriate library functions. What if I'm writing an operating system? What if I don't have any library functions to use? What if you weren't such a huge cock sucking faggot?

Name: Anonymous 2011-11-25 16:41

On ARM, unaligned reads and writes don't do what you wrote: older cores silently rotate the bytes of an unaligned load instead of faulting. Have fun tracking down those bugs/features without a type guaranteed to be what you think it is.

Name: Anonymous 2011-11-25 16:53

>>19
They do. Library functions which perform synchronization also perform proper fencing themselves. If you're writing an OS, you cannot count on a standard library and will inevitably write nonportable code. If you don't have library functions to use, either don't do it or go unportable; volatile won't issue memory barriers for you anyway.

You're just a troll. Get out.

Name: Anonymous 2011-11-25 17:05

>>21
If you're writing an OS, you would not be able to count with a standard library and would inevitably write nonportable code.
Ah, you must be one of those faggot ARM hipsters that sometimes dwell in our otherwise merry /prog/.  Get it through your thick fat skull already -- there are only two targets you ever need to consider, and they are called i386 and x86-64.  If your code works on both of them, then it's portable.  Your faggot shit will never make it outside locked, crippled, slow, last-decade's-lithographic-process idiots' cellphones.

You're just a faggot. Fuck off and die.

Name: Anonymous 2011-11-25 17:35

>>17
These links are greatly appreciated. If I can extract at least a little intuition about what is good information out of them, you've done me a great service.

Name: Tibi 2011-11-25 17:50

>>18

Serious it may be -- but not helpful, and at most theoretically correct, the mention of experience being the exception.

With regard to the original question of why C programmers like to use abstracted datatypes: it's Good Practice. I have programmed systems where the C type "int" was 8 (Z80), 16 (68k), 32 (ARMv4/x86) and 64 (AMD64) bits wide. For the Z80 all hope was lost, as its compiler had no 32-bit data type, but the rest had one. The problem is that a 32-bit integer is "long" on 68k, which is 64 bits on AMD64, so you typedef or #define a datatype that says "32-bit integer" and #ifdef it in a single location. I also noticed in 2003, when I got myself an Opteron from AMD running 64-bit stuff, how many programmers had used the "long" datatype as a 32-bit integer and done 32-bit arithmetic on it, which caused a lot of grey hair for me. So using stdint.h or your own typedef'd datatypes helps create portable code.
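A sketch of the "#ifdef in a single location" idea (the header and type names are invented; on C99 compilers this is exactly the job stdint.h already does, and the static assertions assume C11):

```c
/* port_types.h -- one header holds all the platform knowledge;
   the rest of the code base only ever says i32/u32. */
#if defined(_MSC_VER) && _MSC_VER < 1600   /* VS2010 and later ship stdint.h */
typedef __int32          i32;
typedef unsigned __int32 u32;
#else
#include <stdint.h>
typedef int32_t  i32;
typedef uint32_t u32;
#endif

/* Fail the build, not the program, if a new target breaks the assumption. */
_Static_assert(sizeof(i32) == 4, "i32 must be 32 bits");
_Static_assert(sizeof(u32) == 4, "u32 must be 32 bits");
```

Porting to a new compiler then means touching this one header, not chasing "long" through the whole tree.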

As for typedef being bad, I strongly disagree for structs but agree for unions (which I think are OK when they are members of a struct). The struct keyword should, in my opinion, only be used when the type definition of the struct is not available. A typedef helps to generate clean code. The mentioned namespace problems are minimal, given separate header files for modules and interfaces to libraries.

For the specific example of web sites giving different opinions about the volatile keyword, consider the following code:


[ volatile ] int i_break = 0;

void possibly_infinite( void )
{
    while( i_break == 0 )
    {
    }
}

where i_break may be declared volatile. If it is not volatile, the following code is generated by a popular GNU C compiler:

.globl    possibly_infinite
possibly_infinite:
move.w i_break,%d0
.L3:
tst.w %d0
jbeq .L3
rts

while when it's volatile:

.globl    possibly_infinite
possibly_infinite:
.L3:
move.w i_break,%d0
jbeq .L3
rts

As you can see, the loop may exit only if i_break is declared volatile. This sort of synchronization is lockless, and the code does not depend on an operating system supplying such functions. While on modern multi-core systems, for CPU-time reasons, such constructs are only useful as a pre-check condition -- if the integer would otherwise be polled repeatedly, you fall back to the operating system for proper signalling -- they are still valid and, if used correctly, quite handy.

If >>17 had understood the articles he would have noticed that they are about different things. Those he deems correct argue about exclusive access to a resource while those he sees as incorrect are describing the use of the volatile keyword correctly.


>>23
Sure is samefag in here.

Name: 17 2011-11-25 17:57

>>24
those he sees as incorrect are describing the use of the volatile keyword correctly.
No, it's incorrect. volatile was meant for memory-mapped IO and nothing else. If you want to do spinlocks, use assembly.

Name: 24 2011-11-25 18:07

Memory-mapped IO, you say. Why does it have a use in Java, then, as java.dzone.com/articles/threading-stories-why-volatile describes?

Name: 25 2011-11-25 18:13

>>26
Why does it have its use in Java then as java.dzone.com/articles/threading-stories-why-volatile describes ?
It doesn't. Timers are low-level abstractions that aren't to be used anywhere outside the kernel. Application code uses threads and specialized processor instructions, generally abstracted by mutexes.

Name: 26 2011-11-25 18:29

>>27

While I agree that access to data by multiple threads should be protected by a mutex to prevent race conditions, I am also of the opinion that one should be able to do the same without the help of CPU-specific instructions or OS-specific functions, even if it is not optimal.

Name: Anonymous 2011-11-25 18:43

>>24
This is unsafe and should definitely be considered harmful.

Name: Anonymous 2011-11-25 18:47

ooc.util.concurrent.atomic

Name: > 2011-11-25 19:31

Name: Anonymous 2011-11-25 19:32

>>24

Wow.

You seem to exhibit some serious reading or learning disability I've never heard of before. I suggest you visit a psychiatrist, or maybe a neurologist. Maybe you're really a new case, unheard of by the whole scientific world.

Nonetheless, don't waste time attempting to understand what you've been told, whether about volatile in particular, or memory fencing, or the lack of availability of good-quality information, or anything in programming. Aside from the attrition, there have been a couple of good observations; however, everything you've exposed in your example is a ridiculous, not to say pretentiously insightful, exposition of two-decade-old processor architecture, while being terribly off-topic at the same time; but that's easily explainable assuming such a severe malady. It's better to seek professional advice.

For such people it is Good Practice to just keep away from society for as long as it is possible, or at least for the duration of the treatment, if one is available.

Name: Princess Pudding !MhMRSATORI 2011-11-25 19:42

>>32
While I agree >>24 is a retard, I don't think you need to be so overly dramatic and mockingly verbose.

Name: Anonymous 2011-11-25 19:56

>>33
You're right. Sorry. Sometimes /prog/ just inspires me.

Name: Anonymous 2011-11-25 23:51

not using typedef struct is a failproof way of detecting autism

Name: Anonymous 2011-11-26 1:49

Typedef struct is the gateway drug to sepples

Name: F r o z e n V o i d !!mJCwdV5J0Xy2A21 2011-11-26 1:54

I hate structs: their inconsistent packing, compiler differences and their alignments.
I'd rather use pointers or raw arrays, but some cases just need a struct.
