what is a compiler

Name: Anonymous 2005-10-09 4:00

what is it? i know you type pretty c++ or w/e code and it converts it into computer language so the computer knows what we mean, but how do compilers differ for different languages aside from converting two different languages? (it's kinda hard to explain what i'm asking) like if all a compiler does is change what we write into computer language, for any programming language, why can't we have 1 universal programming language?

Name: Anonymous 2005-10-09 6:04

age for awesome thread pattern

Name: Anonymous 2005-10-09 6:45

We have different programming languages so that we'll be able to have flamewars over which one is the best. Duh!

Name: CCFreak2K !mgsA1X/tJA 2005-10-09 14:37

We have many different programming languages because each language is better at some tasks than others.  For example, PERL stands for Practical Extraction and Report Language, which, as you might guess, makes it useful for gathering lots of data and organizing it in many different ways; that in turn makes it excellent for web-based applications.  C, of course, is potentially a lot faster in non-web-based applications, so it's a lot more widely used there (that, and PERL is interpreted, but that's a moot point since you can distribute Perl interpreters with your code).

Why can't we have one universal programming language?  You might just as well ask why we don't have one universal spoken language.  Good luck getting everyone to say the same thing.

Name: CCFreak2K !mgsA1X/tJA 2005-10-09 14:39

And by PERL I meant Perl.

Name: Anonymous 2005-10-09 16:21

>>1

With respect to your question, I'll assume you're limiting yourself to, say, a single architecture.  The best way to understand this is actually to look back at the history of software development.

(Note that this *is* a simplified, and not completely accurate, representation.  I'm no historian, nor do I pretend to be.)

Back in the day, computers were programmed by essentially opening and closing switches, generally by plugging cables into a large board.  There was no such thing as a stored program, so once a computer was programmed this way, it stayed that way until somebody came along and physically changed the configuration of the electrical cables.

Now, this really sucked.  A lot.  You needed highly intelligent experts to write even the simplest calculations because they needed to know and understand the entirety of the innards of their computer.  Fortunately for everybody, technology evolved.

Eventually, through the work of Turing and many other very intelligent folks, it became understood that you *could* create a computer which was actually able to run programs that were defined elsewhere.  It was a machine that could emulate other machines.  This led to stored programs.

Of course, the language that was used didn't immediately evolve with that concept.  It was just really cool that we could write out (or punch out into cards, or however they did it) all those ones and zeroes that we used to connect manually via electrical wire.

And, yes, they were essentially using ones and zeroes to do this work.  Straight machine code, very complex, very difficult, very error-prone.  Forgot a '1'?  'splode.

And so the early assembly languages were created as mnemonics for all those ones and zeroes.  It's way easier to remember "MOV R1 R2" than it is to remember "11101001".  This was a breakthrough in its own right, because it required the first assembler: a program designed to parse these mnemonics and generate the corresponding binary code within the computer.
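(To make that concrete: on real x86, the mnemonic "mov al, 0x61" is just a human-friendly name for the two-byte pattern B0 61.  This little C program does nothing but print those bytes; producing them from the mnemonic is, at heart, an assembler's entire job.  An illustrative sketch only, of course.)

#include <stdio.h>

int main(void) {
    /* The machine-code bytes an assembler emits for "mov al, 0x61":
       B0 is the opcode for "MOV AL, imm8", 61 is the immediate byte. */
    unsigned char code[] = { 0xB0, 0x61 };

    for (int i = 0; i < 2; i++)
        printf("%02X ", code[i]);
    printf("\n");
    return 0;
}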

Gradually, though, a desire formed to create languages that would represent the problem domain more accurately than assembler.  Assembler was functional, it's true, but certain patterns occurred over and over and it would be a lot easier to use those patterns with the assembler as building blocks than to use the assembler straight on.
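(Concretely, here's the sort of pattern I mean.  The pseudo-assembly in the comment is illustrative, not any real instruction set; the point is that this counted-loop shape got written over and over, and a high-level language lets you say it once:)

/* The recurring assembler pattern (illustrative pseudo-assembly):
         MOV  R1, 0        ; i = 0
   top:  CMP  R1, 10       ; compare i with 10
         JGE  done         ; leave the loop once i >= 10
         ...               ; loop body
         ADD  R1, 1        ; i = i + 1
         JMP  top
   done:
   ...and the high-level building block that captures it: */
void example(void) {
    for (int i = 0; i < 10; i++) {
        /* loop body */
    }
}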

And so high-level languages were born.  And a plethora of them, at that.  Because high-level languages (COBOL, FORTRAN, B, C, BASIC, Pascal, Ada, Forth, Lisp, et al.) were developed for various goals, they emphasized different aspects of their problem domains.  FORTRAN (from "Formula Translation"), for example, was developed for scientific and mathematical applications, while COBOL was developed for business software.  Different targets, different languages, different semantics.  People continue inventing languages today, in fact, for specific problem domains.  There's even research being done on the creation of a language whose job is to (ah-ha!) create and define domain-specific languages!

So you see, these languages weren't all developed at the same time as computers.  Rather, it's been a layered evolution of the industry which has resulted in the state of things today.  (Oh, btw, assembler is no longer the lowest-level language, either.  Below assembler, in many modern general-purpose processors, lies a processor- or architecture-specific language called "microcode".)

So yes, the compiler translates your C++ code into assembler.  Assembler is not a high-level language, though; it's a nearly direct, one-to-one translation of the binary machine code.  Writing non-trivial software at that level is extremely difficult, and that's why very few specialists write exclusively in assembler these days.  Compilers can typically do a better job than a single person anyway.
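(For instance, here's roughly what that translation produces for a trivial function.  The C is real; the assembly in the comment is typical optimized x86-64 output, though the exact instructions depend on your compiler and flags.)

/* A trivial C function and what a compiler might make of it. */
int add(int a, int b) {
    return a + b;
}

int main(void) {
    return add(2, 3) == 5 ? 0 : 1;  /* exercise it */
}

/* What a compiler like gcc -O1 might emit for add (Intel syntax):
   add:
       lea  eax, [rdi + rsi]   ; eax = a + b (args arrive in edi/esi)
       ret
*/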

As for the universal programming language, some high-flying researchers have considered it, but it's really a faulty idea.  The reason is that a language must, at some level, reflect the architecture of the underlying hardware, and you'll never get universal acceptance of one architecture from the companies involved.

And even if you did, the proliferation of high-level languages is really a natural consequence of the fact that, in design (specifically, problem-domain specification), there's no single right answer.  It's all a question of trade-offs, and the only thing that a universal language would really accomplish is making everybody equally unhappy with it.

C++ is a good general tool for many purposes and quite popular in the industry, so it's a good place to start.  It isn't (nor should it be) the only option for all problems, though.  That'd be like saying that a scalpel is the only manual tool anybody could ever need.  Patently silly.

I hope this helped.  If not, ask your question again.  :)

Name: Anonymous 2005-10-10 8:08

>>6
Good answer.

I would like to add: programs are written for other people to read, and only incidentally for a computer to execute.

Name: Anonymous 2005-10-15 1:45

>>7
Bullshit wankery. GTFO and get an English major instead.

Name: Anonymous 2005-10-15 9:16

>>8
Moronic stupitude.  >>7 is right on.  If programs were only written for computers to read, they'd still be written in 1's and 0's.  That's a heckuva' lot easier for a computer to read.

Name: Anonymous 2005-10-15 9:53

>>9
No, man, you're the moron. If we didn't have computers, do you think anyone would be writing? The entire purpose for a language is to make this hunk of shit you're typing on do stuff.

Or are you some asshat that writes curly braces and passes it to your buddies so they can admire the ink on the paper?

Programs are written for a computer to execute, and only incidentally for other people to read. If you want to make shit that people can read, stop programming and write a book instead.

Name: Anonymous 2005-10-15 9:53 (sage)

do you think anyone would be writing code?

fixed

Name: Anonymous 2005-10-15 10:10

>>10
Well, you're welcome to your opinion, even if it is contrary to pretty much all the great hackers.

Name: Anonymous 2005-10-15 10:14

Just stating the obvious. I don't think your "great hackers" intended people to believe something like >>7.

Refute what I said. Point out how great your code is if there's nothing to run it on. Show me you don't need computers. Show me your code really doesn't need a machine to give its existence a purpose.

Hey, dipshit, if a language is readable, but can't make a computer do things, it has no reason to exist. People would go back to writing with 1's and 0's.

Name: Anonymous 2005-10-15 11:22

>>13
>>7 is an Edsger Dijkstra quote. You know, the "gotos considered harmful" guy?

Name: Anonymous 2005-10-15 11:36

>>14  I bet >>13 hasn't even heard of "Gotos considered harmful"

>>8, >>10, >>13 are either ignorant or trolling.  Either way, they're clearly not qualified to be involved with professional dev't.

In a very real sense, programming languages exist independently of the computers they run on.  They're languages that define algorithms.  Sometimes the algorithms are simple, sometimes ridiculously complicated, but they still (at the core) use a set of agreed-upon conventions to define a (generally) deterministic sort of behavior, and that behavior could be carried out by a person just as well as by a computer.  Thing is, computers happen to be faster.

A programming language, without any computers, may be impractical, but it certainly isn't useless.  It still provides a way to precisely define the semantics of a set of algorithms that will accept some inputs and provide some outputs.
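(Case in point, a sketch: Euclid's algorithm, written in C.  Nothing about the definition below needs a computer.  You can trace it with pencil and paper and get the same answer; the machine just gets there faster.)

#include <stdio.h>

/* Euclid's algorithm: repeatedly replace (a, b) with (b, a mod b)
   until the remainder is zero; the survivor is the GCD. */
int gcd(int a, int b) {
    while (b != 0) {
        int r = a % b;
        a = b;
        b = r;
    }
    return a;
}

int main(void) {
    printf("%d\n", gcd(1071, 462));  /* prints 21 */
    return 0;
}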

Name: Anonymous 2005-10-15 11:37

Yeah, I know him. I also know he liked ALGOL 60. The guy's famous for his algorithms; the rest is tangential. Besides, the proper quote is (and it comes from SICP, unless you prove otherwise):

Programs must be written for people to read, and only incidentally for machines to execute.

Notice the difference between "must" and "are"? Okay, I take back what I said, you'd fail as an English major.

This is pretty fucking obvious, dude. Nobody yet has refuted what I said. If a language is beautiful but doesn't do shit, nobody will use it. If a language is unreadable but does shit (like, oh, 1's and 0's) people will use it. Helloooo?

Name: Anonymous 2005-10-15 11:40

They're languages that define algorithms.

So machines can use them, dipshit. You notice how a lot of papers use math to explain their algorithms? Ever wonder why that is?

A programming language, without any computers, may be impractical, but it certainly isn't useless.

Like pseudocode? Hah!

Name: Anonymous 2005-10-15 11:43

Your argument seems to consist almost entirely of the word "dipshit".

Name: Anonymous 2005-10-15 11:45

And your "argument" seems to be avoiding countering what I said. Answer the fucking question.

Dipshit.

Name: Anonymous 2005-10-15 11:45

>>16

Sweet!  You can use google, too?

Name: Anonymous 2005-10-15 11:47

Yeah, I'm fucking awesome. Amazing what you can turn up if you just check your facts in google first!

Name: Anonymous 2005-10-15 11:48 (sage)

*ahems* at >>21 verifying >>20's sarcasm, >>16 had no clue who dijkstra was until googling the name.

Name: Anonymous 2005-10-15 11:51

No man, I checked the quote. It went through that idiot Graham's site, then to SICP. Everybody in CS knows Dijkstra. Shit, I've been forced to implement his search algorithm in AI.

Besides, even if I didn't know who he was:
a) The guy still fucked the quote up, and
b) Nobody yet has answered the fucking question.

So, suck it up, bitches. Prove this ignorant idiot wrong.

Name: Anonymous 2005-10-15 11:52

Oh, yeah, and
c) Prove to me the quote came from Dijkstra in the first place.

Name: Anonymous 2005-10-15 15:48

Yep, after extensive googling I've come to the conclusion that you are right. I guess I mixed it up with all the other Dijkstra quotes.

You know what the great thing about truth is though? Doesn't matter who said it.

Name: Anonymous 2005-10-15 21:16

Congratulations! You've successfully answered c) (the easiest one).

Only two left to go!

Doesn't matter who said it.

It does if you mangle it so badly it means something else.

Name: Anonymous 2005-10-16 9:04

You're probably one of those autistic people who start screaming every time the professor leaves a semicolon off the end of a C statement. I used to sit next to a guy like that.

How about this: you call Graham an idiot, but he created a startup which he sold to Yahoo for millions. If that's the kind of success that comes from writing for people rather than machines, I know whose example I'd rather follow.

Name: Anonymous 2005-10-16 11:47

You still haven't answered the two questions. You've been beaten in an argument by a troll, kill yourself in humiliation.

And Graham is an idiot. Have you read some of the shit he writes? Oh, he's very bright and all, but his site stinks of so much self-congratulation it's disgusting. I bet you were fapping off as you read "Great Hackers".

oh my god that's me, im a great hacker fap fap fap uuuurrghgh aaaaah!

Name: Anonymous 2005-10-16 12:39

WHICH two questions? The ones posed by the OP? Or your a) b) c) none of which were even questions?

Name: Anonymous 2005-10-16 12:55

The ones I've been harping about for half the thread. You're the idiot that defended >>7, now make good on it.

Name: Anonymous 2005-10-16 13:16

On second thought, if I leave it up to you, you might miss the point. Here, let's make it nice and easy:

>>7 says "programs are written for other people to read, and only incidentally for a computer to execute". From this we can conclude that the most important element is that people can read it, not machines.

Pick a language. C, C++, Java, Perl, Python, PHP, Lisp, Ocaml, Ruby, Haskell, whatever. Just pick one. Now, if there were no computers on this planet, would anyone use it?

Okay. Let's pick another language. A really ugly nasty language that everyone would hate to use. Let's say... COBOL or FORTRAN? No, nobody liked them, but they weren't even close to evil either. Maybe something fun like Unlambda. Let's assume that Unlambda is the only way to make computers do anything. No Lisp or C for you, just Unlambda. If Unlambda was the only way to make computers do anything, would anyone use it?

Name: Anonymous 2005-10-16 14:24

Interesting you should pick COBOL as an example of an unreadable language, because its very unreadability stems from the fact that it was designed to be as close to English as possible.

Name: Anonymous 2005-10-16 15:38

Lol @ troll, trollee, and the fact /prog/ == /vip/ evals to true

Name: Anonymous 2005-10-17 20:43

i started /prog/ == /vip/ by asking the moderators to make everything monospace heh

Name: Anonymous 2005-10-17 23:49

A compiler takes your code and converts it to assembly language (the closest English-like computer language before binary). An assembler then translates the ASM (assembly) code into binary for the machine to run.

Name: Anonymous 2005-10-18 0:37

Many modern compilers skip that intermediate step and generate machine code directly (though you can usually still ask for an assembly listing, e.g. with gcc -S).

Name: Anonymous 2005-10-29 5:06

Of course programs are meant to be read by human beings because, in real life, you work with other people. Unless you write crappy games all alone in your basement...

Name: Anonymous 2005-10-29 5:09

meant to be read by human beings because

Of course. The more readable it is, the better.

But that's not the primary reason.

Name: Anonymous 2005-10-30 3:28

omfg when they said for people to read it they didn't mean publish a fucking book, the lvl of code at that point is ment for a human to comprehend so they can give the machine better instructions. as in the human would read c++ code, and understand and read that better than they would fucking binary making it easier to give better instructions. and if there were no computers, we'd remake them then give instructions.

Name: Anonymous 2005-10-30 3:43 (sage)

>>39 could take his own advice. Learn how to write.

Name: Anonymous 2005-10-30 6:08 (sage)

>>39
what

Name: Anonymous 2005-10-30 7:45

>>39
I hope you don't write your code like int slistgetnextelementstr(). Uppercase motherfucker, do you use it? ;)

Name: Anonymous 2005-10-31 12:07

>>42
underscores > fagHungarianNotationLoLoLoL

Name: Anonymous 2005-10-31 15:36 (sage)

>>43
You don't know what Hungarian is, do you?
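Hint, an illustrative sketch (made-up names): Hungarian prefixes a variable's name with an abbreviation of its type, like

char *pszUserName;      /* psz = pointer to zero-terminated string */
int cItems;             /* c = count */
unsigned long dwFlags;  /* dw = double word, Win32 style */

camelCase by itself isn't Hungarian.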

Name: Anonymous 2005-10-31 15:48

Name: Anonymous 2005-10-31 16:34

>>45
Yeah I liked that article too.

Name: Anonymous 2008-05-07 14:37

A MISERABLE PILE OF CODE.

Name: Anonymous 2009-01-14 4:20

Compilers translate English into assembler
