
Name: Anonymous 2010-09-02 14:29

Best way to learn programming:

1) Learn logic
2) Learn assembly
3) Learn C

At this point, two choices: stick with C or learn whatever high-level languages/paradigms suit what you want to do.

Almost everyone does the exact reverse.  First learn BASIC or Java, then attempt to understand what's going on underneath (and probably never get around to it).

If you go from the bottom up, each step gets easier and there's no mystery hiding what's going on under the hood.

Name: Anonymous 2010-09-03 14:12

>>29
You are the poster boy for the OP argument.
It's funny because I have a lot of experience in assembly and C. I started coding back in the early days of DOS, and more recently I was a game developer for BREW phones. I actually did do exactly what the OP is proposing, but unlike you, I went on to do his optional step of learning some real high-level languages. Now I have the magical ability to use the right tool for the job.

>From your description of high-level languages, you're completely unaware of just how inefficient your code actually is. You expect your numeric values to always be some sort of "BigInt" class, apparently. So your version of a + b is probably, realistically, about five function calls and maybe 50 to 100 lines of code, where it would be a single instruction for anyone who knows what they're doing.
Haha oh wow. You really have no clue how something like Scheme's numerical tower is implemented, do you? You're still taking everything you know from Java's BigInteger. Honestly, do you even know any language other than C and Java?

It's pretty easy to implement a highly-efficient BigInt. One simple way is to use a 64-bit value for your integers, and store small values in the bottom 63 bits. You can then do ordinary arithmetic on small numbers. It's trivial to tell when you overflow it; you just check the top bit after addition, and for multiplication you sum the level of the highest bits. When you do overflow it, you can set the top bit, and use the remaining bits as a pointer (down-shifted by one bit; remember the bottom three bits of a 64-bit heap pointer are always blank due to alignment.) This points to an arbitrarily large number in heap memory on which you can do arbitrary precision arithmetic.

With this sort of system you get on average maybe three additional processor instructions per arithmetic operation when doing math with small values (with highly reliable branch prediction since they are almost always small.) Not exactly five functions and hundreds of lines of code, as you seemed to imply. In almost all applications this trade-off is very much worth it to not have to ever worry about the limitations of your numbers, not to mention having a runtime that can provide absolute security against integer overflow.

And yes, as you can plainly see, I DO care a whole lot about how these things are implemented at the low level. It's interesting and exciting to work with. The point is I don't HAVE to work with it when I'm writing a large application. A well-written platform hides all of this away from you.

You need to get it into your head that not everyone is writing fucking video games for a living. I'm starting to sound like a broken record here, but you need to learn to use the right tool for the job.

Name: Anonymous 2010-09-03 15:29

>You really have no clue how something like Scheme's numerical tower is implemented, do you?
Of course he doesn't, he still labours under the expensive procedure call myth.

Name: Anonymous 2010-09-03 15:42

>>40
What I'm saying is, it's so simple that the standard whatever is overkill for many things. I mean, if you'd rather instantiate an Iterator instead of just using a while loop, then that's fine, if you are a heartless monster.

Oh, and I'm not >>29 or any of the other guys wasting so much time over this. In fact, I wrote >>30 and a few other mutually conflicting posts, but not for any mischievous purposes.

Name: Anonymous 2010-09-03 16:22

ITT: Java coders who've been trolled by "read SICP" for so long that they've actually bought into it.  I wonder if they've actually read it, though...

Name: Anonymous 2010-09-03 16:38

>>44
back to /pr/, please.

Name: Anonymous 2010-09-03 16:40

Nobody is saying that programmers don't benefit from understanding assembly and having a rough idea of how the hardware works, by the way. We're just saying that it's a fucking retarded way to teach programming.
I'm beginning to see where the term ``C-tard'' came from, even though I use C myself.

Name: Anonymous 2010-09-03 17:52

>>45
Fuck off, ``faggot''.

Name: Anonymous 2010-09-03 17:59

Writing assembly for a desktop computer is ridiculous for a beginner. However, writing assembly for a small microcontroller is not completely unreasonable, after they have some programming under their belt in any language.

For the desktop there is a massive gap between fibs and fact and an application. Not so for demo boards.

Name: Anonymous 2010-09-03 20:41

>>48
>Writing assembly for a desktop computer is ridiculous for a beginner.
How come? Maybe I'm blind after having read and written a lot of x86 asm for both real and protected mode (mostly DOS and Windows, although I have no problem using it on other platforms, provided I'm familiar with the ABI and overall system APIs). It never seemed too difficult to me; as for microcontrollers, those have their challenges as well.

Name: Anonymous 2010-09-04 0:15

>>1-50
NO ONE NEEDS TO WRITE IN ASSEMBLY ANYMORE SO LONG AS WE HAVE forth

Name: Anonymous 2010-09-04 1:14

>>41
3 instructions minimum (assuming a 3-operand ISA, that the arithmetic op sets a negative flag, and that the compiler uses said flag), and it'll cause inefficient use of ALUs (modern CPUs can do 2+ adds per cycle but never more than one branch).

Also, that only works nicely for unsigned numbers. Most people care about signs.

Name: Anonymous 2010-09-04 1:20

>>51
There are many ways to do it.
One CL implementation that I know of uses the lower 2 bits as tags (so you have 30-bit signed integers, which isn't that good actually, but it's okay; if you really want fast 32-bit ops, you can use declarations or make yourself some macros for such things - for example, if you're writing a crypto library and want to achieve C-like speed). So fixnums would be xx00, thus addition and overflows all work correctly, and converting the value to a normal int would be just a shr reg, 2.

Name: Anonymous 2010-09-04 2:16

>>52
sar I hope you mean.

Anyway, I tried doing something similar using 31-bit signed integers with the lsb as the flag, and unless you want to use asm to make use of the overflow flags (since you can't reliably in C), I don't see it not being ugly. Improvement suggestions?

#include <stdint.h>

typedef int32_t bignum;

/* out-of-line paths, defined elsewhere */
bignum add_big(bignum, bignum);    /* at least one operand is a heap bignum */
bignum fixup_bignum(bignum);       /* the 31-bit sum overflowed; promote it */

bignum add_signed(bignum a, bignum b)
{
    /* lsb set on either operand means it points to a heap bignum */
    if ((a|b) & 1)
        return add_big(a, b);

    /* untag (arithmetic shift keeps the sign) and add */
    bignum result = (a>>1) + (b>>1);
    bignum sign = result & (1 << (sizeof(bignum)*8 - 1));

    /* overflow iff bit 30 differs from bit 31: re-tagging would flip the sign */
    if ((sign ^ (result << 1)) < 0)
        return fixup_bignum(result);

    /* re-tag, restoring the sign bit clobbered by the shift */
    return (result << 1) | sign;
}

Name: Anonymous 2010-09-04 2:25

actually the last or isn't needed, the top 2 bits are guaranteed to be the same from the overflow check.

but still...

Name: Anonymous 2010-09-04 2:36

which then removes the need for sign, now it's "only" an additional 10 ops for x86

I guess I should actually test this...

Name: Anonymous 2010-09-04 2:51

itt people implying that everyone should have to individually manage cpus rather than just compiler devs.

Name: Anonymous 2010-09-04 5:41

>>51
>Most people care about signs.
Haha, good one

Name: Anonymous 2010-09-04 6:50

>>57
It is hardly a surprise that someone who can't comprehend the simple concept of punctuation is not able to grasp the idea of signed numbers.

Name: Anonymous 2010-09-04 7:01

>>58
If I want to do maths, I'll use a higher-level language. I've rarely required negativity in C.

Name: Anonymous 2010-09-04 12:22

>>59
That's a good start, now all that is left is to prove that you == ``most people''.

Name: Anonymous 2010-09-04 13:26

>>56
Compilers don't manage CPUs outside of the task of getting code compiled (and even then, only in the abstract).

Name: Anonymous 2010-09-05 1:24

this thread is full of idiots who need to read SICP.
until you can apply the lessons it teaches to programming in any language (including c, assembly, and even brainfuck), you're nowhere near satori.

Name: Anonymous 2010-09-05 11:03

>>62
That's an amazing post; it doesn't address anyone's points or queries (especially the OP's), it doesn't take a position on the topic (instead, it targets people on both sides of the argument without actually contributing to the discussion), and it's not even original.  It took some effort to be that useless, didn't it?  Now take a seat while the adults continue their debate.

...

The participants in this debate fall into two (well, three) camps: academics and professionals (and trolls uninterested in the subject).  The academics are concerned with the most technically correct solution to the problem -- they want to write an implementation that results in the most efficient machine code, regardless of the development cost.  Professionals, on the other hand, seek to write an implementation that balances efficiency with expense  -- there's no margin in doing it the academic way.  (And the trolls don't understand the issue at hand, they just like a good fight.) 

Being a professional, I use high level languages that remove the time-consuming details through abstraction; that doesn't mean I don't appreciate the academics' viewpoint, just that I don't subscribe to their approach while I'm working.

Name: Anonymous 2010-09-06 4:11

>>63
OH NO YOU DID NOT JUST CALL ME AN ACADEMIC

Name: Anonymous 2010-09-06 8:29

>>63
Whereas all you've done is straw-man your opposition and make them seem like naive children.

Name: Anonymous 2010-09-06 9:07

Best way to learn programming:

1) Fuck around with BASIC until you can write programs that solve arbitrary simple to mid-complexity problems
2) Read SICP

Name: Anonymous 2010-09-06 10:33

>>63
>Academics are more worried about optimization than professionals.
It's a shame you're retarded, or that would be a really good point. There are plenty of professionals with totally misplaced priorities.

Name: Anonymous 2010-09-06 11:05

>>63
It's interesting how self-described ``professionals'' always seem to feel the need to call everything they disagree with ``academic'', no matter how incredibly stupid that would be. I can promise you that the people advocating reading SICP because it teaches you the value of abstraction will prefer being called academics over professionals, and don't regard that as a dirty word.
Unless this is just an attempt at trolling both sides because you're in your third category, of course.

Name: Anonymous 2010-09-06 12:00

>>7
Hokey religions and ancient weapons are no match for a good blaster at your side, kid

Name: Anonymous 2010-09-06 13:20

Hotkey religions

Name: /prog/ Border Control !fzcXE63Op. 2010-09-06 14:03

>>63, please turn in your /prog] passport and return to /g/.

Name: Fuck off, !Ep8pui8Vw2 2010-09-06 14:43

>>71
/prog]
You fail it.

Name: Anonymous 2010-09-06 17:11

>>66
This is actually the worst way to learn programming, and, it's the most common way.  Evidence?  Look at the ratio of intelligent posts on /prog/ to posts that make you want to stab your own eyes out.  These are the programmers that are created by >>66's method.

>>63
Your argument is surprising to me.  I read the whole thing and got to the end thinking "this guy is on the professionals' side, which means he's on the OP's side."  But then you derailed me completely by saying:
>Being a professional, I use high level languages that remove the time-consuming details through abstraction

It sounds like you're saying the OP is more on the academic side, while the "read SICP" crowd is the professional side?  I suspect the exact opposite.

Name: Anonymous 2010-09-06 17:29

>>73
s|/prog/|/prog/|

Name: Anonymous 2010-09-06 17:55

>>73
>Evidence?
Is that really evidence? I doubt these non-intelligent posts were written by those who can program. At most, they're by people who ``know a bit of Python'' or C (i.e. believe Hello World in either constitutes being able to program). None of them have learnt BASIC, otherwise that'd be the language of choice for "how do i write this program" threads (it isn't).
Aside: Note that a lot of code posts here are in C or Python.

BASIC is a perfect language to learn programming with: it teaches you the basics of imperative programming, then once you can solve simple to mid-complexity problems you realise that actually, it's a bit shit. So you move on to greener pastures - C, perhaps assembly (or, if you're still a bit simple, FIOC). The knowledge gained in using BASIC is equivalent to step 1 in >>1.

Of course you cannot oppose step 2 in >>66. Nobody can.

Name: Anonymous 2010-09-06 19:02

>>63
Professionals don't use the very high level languages you advocate. They use Java, C++, and C#, all of which are terrible on both counts: performance and development time.

Academics aren't concerned with performance on actual machines; depending on their interest they might completely ignore performance considerations outside of complexity analysis or they might assume an idealized ISA that has never existed.

I guess it would be hackers (in the original sense) that care about performance over development time, and you're one of the mythical professionals that use decent languages designed to actually decrease development time.

Name: Anonymous 2010-09-06 21:36

>>73
Well not necessarily. Someone like Paul Graham would certainly consider himself a professional, not an academic. You have to admit that coding for maximum possible performance without regard to development time is an academic pursuit. In the real world it really only applies to numerical simulations, such as scientific simulations or video games.

You're using a different definition of professional, that of a code monkey in a giant enterprise shop. Those are a different breed and they are irrelevant to this discussion because they don't get to choose their programming language. The project managers do, and for them, having a predictable schedule and replaceable programmers is a lot more important than having the fastest programs or the shortest development cycle.

Name: Anonymous 2010-09-06 21:57

>>76
Professionals focus on what they can do today, and sometimes they think about what will be possible in the future.  Academics focus on what will be possible, and sometimes they try to implement it.  Without both professionals and academics, we're fucked.

Look at those "professional" languages: Java, C++, and C#.  Their development is only possible because of the academic languages that came before them.  The people who designed these languages are academics and professionals.

Take Java, the waning king of professional languages.  When it was introduced, things like compiling to a cross-platform VM and using garbage collection were widely seen as quaint academic tricks that had no place in a production environment.  Today those are taken for granted.

Haskell is currently the big thing among academic languages.  Haskell has a few big features like better correctness guarantees, a wonderful FFI, easy parallel programming, and working STM.  Bits of Haskell will be in the next generation of "professional" languages, and you can either learn Haskell now, or wait until someone waters it down for you.

Why don't professionals use better languages?  (1) existing code (2) retraining (3) library availability (4) support

Academics are (1) writing new code (2) always learning (3) don't use as many libraries and (4) are expected to figure the damn things out themselves

Here's a tip for you: Microsoft is paying academics to work on Haskell.  It is all about the bottom line.

Name: Anonymous 2010-09-06 22:01

>>78
>Bits of Haskell will be in the next generation of "professional" languages...
Of course. It's loaded with syntax, which professionals love because they can spend more time typing than thinking.

Name: Anonymous 2010-09-06 22:11

Protip: SICP is 25 years old. Computing has come far in that time.
