OO vs procedural

Name: Anonymous 2007-03-01 4:30 ID:WOeEZFMF

Tell me about it.

Name: Anonymous 2007-03-01 5:12 ID:eX1bJxuV

For most people and languages, the difference is lol(x, y) vs. x.lol(y), which makes OO absurd (and operations tend to suck more in x.lol(y) notation, like string1.equals(string2), ugh). Then you have inheritance, which means you get a free dish of spaghetti code. It's just dynamically updated copypasta, and come to think of it, spaghetti is pasta.
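
To spell it out (throwaway Python, made-up names, just to illustrate the two spellings):

    # the same operation, once as a free function, once as a method on the receiver
    def lol(x, y):
        return x + y

    class Thing:
        def __init__(self, x):
            self.x = x
        def lol(self, y):
            return self.x + y

    lol(1, 2)        # 3
    Thing(1).lol(2)  # 3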

That's ALL there is to OO; note that garbage collection, exceptions (= goto 2.0), function and operator overloading, etc. are not OO features, and many languages offer them procedurally.

If you are serious about OO, then OO is about functional units responsible for keeping their own state, and it can provide a good abstraction. Inheritance and mixins can have their uses as well, but for this to be comfortable, you have to work with a dynamic language such as... I'm not going to give examples because I'll be called a fanboy.

Of course, and like with pure anything, pure OO sucks. Not all structures, models and algorithms adjust to OO well. Pure OO languages which shove OO up your ass all the time, like Java, are bound to suck. (Java is bound to suck for many other reasons as well.) So you use OO when it fits, procedural programming when it fits, and functional programming when it fits, and finish early.

Name: Anonymous 2007-03-01 5:12 ID:eX1bJxuV

>>2
P.S.: Objects are poor man's closures.
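
To put it in code (a throwaway Python sketch, names made up):

    # the same bit of mutable state twice: once closed over by a function,
    # once kept on an instance
    def make_counter():
        count = [0]               # closed-over state (a list so we can mutate it)
        def bump():
            count[0] += 1
            return count[0]
        return bump

    class Counter:
        def __init__(self):
            self.count = 0        # state kept on the object
        def bump(self):
            self.count += 1
            return self.count

    c = make_counter()
    o = Counter()
    c(); c()                      # 1, then 2
    o.bump(); o.bump()            # 1, then 2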

Name: Anonymous 2007-03-01 6:00 ID:ROjSaolg

>>2

I agree with most of this post (especially choosing the paradigm based on what job you have to do), except for Java as an example of "pure OO", since Java method bodies are essentially procedural.  It's the same for Python, Ruby, etc.

I would consider a "pure OO" language to be Self or Smalltalk, which have no constructs except message passing (Self doesn't even have classes, you clone prototype objects instead).

Name: Anonymous 2007-03-01 11:07 ID:Heaven

> Pure OO languages which shove OO up your ass all the time, like Java, are bound to suck.
Java is OO gone horribly wrong and then marketed as pure enterprise OO.

Name: Anonymous 2007-03-01 13:09 ID:iyCrV+EW

>>4
In what way is it the same for Ruby? I'm not a Ruby user, but it seems that your understanding of the Ruby object system is lacking.

Name: Anonymous 2007-03-01 14:47 ID:xe3ovJKc

Fuck OO.  GOTO or GTFO.

Name: Anonymous 2007-03-01 15:03 ID:Ybr+qfIs

Java is the bastard child of OO and procedural programming, but it shames its fathers (no girls in programming lol) by doing neither remotely well.

Name: Anonymous 2007-03-02 2:36 ID:h+R4C0zj

>>8
Java isn't as much a programming language as it is a productivity tool for dime-a-dozen managers and their fresh out of college "Java developers" who can't tell their arse from their elbow. It was invented for the tech boom of the turn of the century for the purpose of squeezing reams of crap, ill-indented code from mediocre-to-shitty programmers.

Why some people insist on using it today is beyond me.

Name: Anonymous 2007-03-02 3:06 ID:OYNtzQBQ

Fuck OO. GOTO or GTFO.

Name: Anonymous 2007-03-02 4:21 ID:mnlveWAT

>>9

Because unlike a "true" OO language, Java can be made to run in reasonable amounts of time.  Compilers can't optimize "true" OO because they have no clue what the code will be doing until runtime.  So you lose all the nice static typechecks that make Java code fast and (reasonably) safe, and allow you to do automated refactorings through shiny IDEs like Eclipse.

Yeah, the fact that it's not "true" OO is annoying, but the gains from having a language that a computer can understand are immense.  Most of the people who complain about Java OO are those who haven't worked on large projects - where having the computer understand your code is incredibly important.

Name: Anonymous 2007-03-02 5:01 ID:qlYUOBZ8

>>9
Absolute truth, fucking win, signed.

>>11
CFLAGS JUST KICKED IN, YO! Development time is usually far more expensive than taking one more second to run. And who the heck said static typechecks are nice? They are anal, that's what they are. Inflexible, powerless, anal about everything. Seriously, if you need to explain which type your objects (or not-objects in Java, lol) are, you need rework. Even for large projects.

Name: Anonymous 2007-03-02 10:07 ID:Heaven

> deproductivity tool
fixed.

Name: Anonymous 2007-03-02 11:52 ID:mSIoGxzr

>>6

I stand corrected.  Ruby seems to be pure OO too, unlike Perl and Python.  I had only ever seen Ruby code, but once I read the docs I saw that what looked like top-level procedure definitions are actually syntactic sugar for private method definitions on class Object.  Mea culpa.

Name: Anonymous 2007-03-02 12:47 ID:PWFP7qxZ

>>11
You're talking out of your arse. There's absolutely no reason why ints, floats et al can't be objects and still have static compile-time guarantees to enable safety and optimization. The compiler knows the difference between a String object and a BufferedInputStreamReader object, why would it have problems with integer objects?
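
(For the "ints as objects" half, any dynamic language already shows it; trivial Python, purely as an illustration - the compile-time guarantees obviously need a static language:)

    # ints are ordinary objects: they have a class, methods, the lot
    n = 255
    isinstance(n, object)   # True
    n.__class__             # the int class
    n.__add__(1)            # 256 - the + operator is just a method call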

Name: Anonymous 2007-03-02 13:45 ID:AJifwx71

>>11
Seen Strongtalk?

Name: Anonymous 2007-03-02 14:48 ID:Xyl1zNL5

>>12
If you believe static type checking is "anal", you know nothing about polymorphism. Static checking is a must when you work on a team... like in a real job you don't have because you're most likely unemployed.

Name: Anonymous 2007-03-02 15:00 ID:uYejW+YA

>>17
IF IT WALKS LIKE A DUCK AND QUACKS LIKE A DUCK MOTHERFUCKER
[turns off cruise control]

If you work in a team, you design and specify properly and stick to it when you code. Then if you call method "Defribolize" on an object that supports the method "Defribolize", it works! OMG! This is a really amusing concept. To work.
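
Like so (toy Python, my silly made-up names from above):

    # nobody asks what "type" the thing is; it just has to answer Defribolize()
    class Pacemaker:
        def Defribolize(self):
            return "bzzt"

    class Toaster:
        def Defribolize(self):
            return "why not"

    def emergency(thing):
        return thing.Defribolize()   # works for anything that quacks right

    emergency(Pacemaker())   # "bzzt"
    emergency(Toaster())     # "why not"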

By the way, I have a real job on this. [TURNS ON CRUISE CONTROL] [WAVES EPENIS]

Name: Anonymous 2007-03-02 15:08 ID:Heaven

>>16
The gay version of smalltalk?

Name: Anonymous 2007-03-02 15:10 ID:Xyl1zNL5

>>18
I work on a team with mathematicians. They know absolutely nothing about compilers or CPUs even when they have 12 PhDs, and here is the part where everything collapses:

> and stick to it when you code
YOU can stick to it, others can't.

Name: Anonymous 2007-03-02 17:21 ID:AJifwx71

> YOU won't stick to it, and others won't either.
fix'd

It's amazing how naive some programmers are. Sorry, man, you aren't transhuman. No matter how hard you try, no matter how many processes you put in place, you will make mistakes, and some will get through.

> like in a real job you don't have because you're most likely unemployed.
Truth.

Besides, the waterfall model went out of vogue ages ago. "design and specify properly"? Haha, right. Please excuse the blood I'm laughing up over here.

Name: Anonymous 2007-03-02 18:50 ID:uYejW+YA

>>21
> No matter how hard you try, no matter how many processes you put in place, you will make mistakes, and some will get through.
No shit, Sherlock! That's why you test your shit. You will make mistakes, regardless of static or dynamic typing. Static typing will catch 2 errors and fuck you in the ass 20 times. I'd rather spend twice as long debugging a dynamic program, because I'll take one sixth as long to write it. And no, you are the inexperienced fag. Because of the higher abstraction, you make fewer mistakes, as you don't have to waste your time micro-managing your types, and you need fewer side effects. Correct programs are more obviously correct and more easily provable.

Whenever I write a sizable C module, I spend a considerable amount of time debugging it because no matter how hard I try, I often make mistakes. On the other hand, I've written Python modules of about the same length in lines (implementing 10 times more functionality, and written in a bit less time) that work right from the start or right after fixing a couple of syntax errors. The logic is almost always correct, and it does what I want, even when the module consisted of implementing funny classes with lots of double-underscore methods.
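
(The kind of double-underscore stuff I mean - a toy example, obviously not the real module:)

    # a tiny bag-like class that plugs into len(), "in" and == via dunder methods
    class Bag:
        def __init__(self, items=()):
            self.items = list(items)
        def __len__(self):
            return len(self.items)
        def __contains__(self, x):
            return x in self.items
        def __eq__(self, other):
            return isinstance(other, Bag) and sorted(self.items) == sorted(other.items)

    b = Bag([3, 1, 2])
    len(b)               # 3
    2 in b               # True
    b == Bag([1, 2, 3])  # True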

Moreover, you often find that higher abstractions and unanal typing yield far more useful results. Many times after doing some work, one of my mates would say "wait, now they say we need X besides Y, back to work", and I go "...come to think of it, X will work just as well, I hadn't thought about it when I wrote it to support Y".

> Besides, the waterfall model went out of vogue ages ago. "design and specify properly"? Haha, right.
Who said we work on a waterfall model? You said it, not me. I work on an incremental life cycle. That's what I was talking about in the previous paragraph. Sometimes you'd need to update some component, then discover you don't have to because you haven't been anal about things and your new object or function still quacks like a duck (or can be taught to quack pretty easily), which is all you really needed.

And again, I'm employed and use dynamic languages almost exclusively at work. I'm lucky to have a manager who understands his shit. He did start a couple of projects with Java before I was hired, but after seeing how I could get shit done and modify it pretty quickly without turning it into a mess, and seeing how he could very easily understand what I was doing, while the Java shit he knew better would always look like a piece of shit, he ditched Java. At first, he would say "now we need this, I hope we can have it by the end of the week", and I would reply "WTF one week, that'll be done by tomorrow, testing and all. Anything else?". It's up to you whether you believe me, and it's your time to waste if you use static languages.

Name: Anonymous 2007-03-02 19:00 ID:Heaven

>>22
You seem to be working under the false assumption that static typing -> low level procedural language, like C.

Name: Anonymous 2007-03-02 19:09 ID:uYejW+YA

>>23
Java is supposed to be higher level, and it still sucks (plus it adds new forms of suckage unrelated to being statically typed). I simply don't know any more statically typed languages; I moved to dynamic languages long ago.

Name: Anonymous 2007-03-02 20:19 ID:AJifwx71

> No shit, Sherlock! That's why you test your shit.
That's why you test your shit and use static typing.

> On the other hand, I've written Python modules of about the same length in lines (implementing 10 times more functionality, and written in a bit less time) that work right from the start or right after fixing a couple of syntax errors.
Uh, yeah. Strange that you should bring Python up, since I earn my living working on a mid-size project written in that language. By mid-size I'm referring to 50MB of code just for the section I'm working in.

The main reason the thing has managed to scale to this size is largely thanks to a number of hacks put in place to fake static typing, and since these checks occur at run-time, it really puts a damper on our development pace.
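
The sort of hack I'm talking about (a simplified sketch, nothing like our actual code):

    # fake "static" typing with a run-time check: it catches the same class of
    # mistake a compiler would, but only when that code path actually runs
    def typechecked(*expected):
        def wrap(fn):
            def checked(*args):
                for a, t in zip(args, expected):
                    if not isinstance(a, t):
                        raise TypeError("%r is not a %s" % (a, t.__name__))
                return fn(*args)
            return checked
        return wrap

    @typechecked(int, int)
    def add(x, y):
        return x + y

    add(1, 2)      # fine
    add(1, "2")    # TypeError, but only at run time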

The sooner you catch the errors, the better. Too restrictive a type system and you're in for some pain, but going too far in the opposite direction is just foolhardy. At the very least Python ought to rip off my() and use strict from Perl. The very bare least.

As for that productivity boost you saw: that's not due to dynamic typing. If that's the conclusion you drew, I suggest thinking a bit more on the differences between Python and C.

Name: Anonymous 2007-03-02 23:37 ID:504KU4Ba

>>24
I used to dislike static typing when my only exposure to it had been the C class of languages (including Java). That changed after I was introduced to languages with well thought out strong type systems, like OCaml and Haskell.

Of those two, Haskell especially is most likely higher level than whatever dynamically typed language you are using right now.

Name: Anonymous 2007-03-03 4:14 ID:eS2UIJf5

>>26
Ditto. Static typing doesn't have to be synonymous with ugly verbosity (or verbose ugliness). In fact I find Haskell's type information actually makes it easier for me to figure out what code does, and to write correct functions.

I used to be a dynamic language fanboy, but now I'm of the opinion that dynamic typing is just an excuse not to implement a proper type system. Most people who rant about how awesome dynamic typing is are mainly just relieved to be away from the world of having to declare every damn thing and, worst of all, use type-casts when extracting stuff out of a collection that stores its shit as Objects or whatever. The first can be addressed by type inference, the second by generic functions and types.

So in conclusion GTFO with your Pythonic 'duck' typing.

Name: Anonymous 2007-03-03 4:17 ID:XaSQYFhj

Fuck OO. GOTO or GTFO.

Name: Anonymous 2007-03-03 4:44 ID:eS2UIJf5

> The main reason the thing has managed to scale to this size is largely thanks to a number of hacks put in place to fake static typing, and since these checks occur at run-time, it really puts a damper on our development pace.

You're not by any chance working on Zope are you?

Name: Anonymous 2007-03-03 9:57 ID:ol2k1O3N

>>25
No need for my/strict in Python, as the use of an undefined symbol raises an exception.
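
Watch (trivial Python, just showing the exception; nothing project-specific):

    x = 1
    print(x)    # fine
    print(y)    # NameError: name 'y' is not defined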

The 5-10 times more productivity comes from several things: dynamic typing, no time spent on managing types, builtin lists and dictionaries, functional programming features, expressiveness of the language, dynamic extensibility, and the generic nature of anything you write. For the same price, something that would work on X will also work on Y, Z, and the whole alphabet, without having to consider polymorphism rules or anything - you call method hi, and the objects have method hi, as simple as that.

BTW, 50 MB of Python? You could program all vital systems of a space station with that.

>>27, yes, dynamic typing can be troublesome when you call method "hi" of object "x" because it might not be evident what "x" is until runtime, but if you write proper code, you should be able to tell what kind (note the use of "kind" instead of "type") of objects x will be from its name and from the context (e.g. the function it's used in). That's why you avoid one-character names for important objects. If you can't tell the kind of an object by its name or immediately accessible documentation near the function definition, you're doing something wrong (or the guy who wrote that did something wrong).

Also, QUACK.

>>29
I was thinking the same thing.

Name: Anonymous 2007-03-03 16:11 ID:OPzhAJgR

>>30
"dynamic typing" means that you can't tell an object's kind or type by its name, that's where you fail at life and everything.

Name: Anonymous 2007-03-03 16:45 ID:Heaven

>>31
No.

Name: Anonymous 2007-03-04 12:25 ID:ob/0W4N4

Coincidentally, I'm now reading a paper by Erik Meijer called "Static Typing Where Possible, Dynamic Typing When Needed: The End of the Cold War Between Programming Languages" ( http://pico.vub.ac.be/~wdmeuter/RDL04/papers/Meijer.pdf ).  I'd recommend it to participants who were actually interested in this thread, rather than just flaming.

Name: Anonymous 2007-03-04 12:46 ID:mkOkq9IY

>>32
Yes it does, because the interpreter doesn't check the name and can't; it's in English, you moron.

Name: Anonymous 2007-03-05 9:04 ID:gRucJgj3

>>34
You're a moron if you name variables in such a way that you can't tell what kind (not necessarily type) of objects they will reference at runtime.

Name: Anonymous 2007-03-05 16:13 ID:Heaven

>>35
You're probably the type of faggot who names looping variables "loopIndex1" instead of "i". Fuck off.

Name: Anonymous 2007-03-06 5:57 ID:LblekCGL

>>36
Short names (usually one character) are best for short loops, temporary variables, dictionary parts, etc. For everything else (function parameters, application logic, globals, etc.) you use long, descriptive names.

Also, I feel offended that you think I'd use shitCase. If loopIndex1 were a variable, it'd be loop_index1. If it were a function, it'd be LoopIndex1.

Name: Anonymous 2007-03-06 12:55 ID:6JZw35Eg

>>37
I can quote books too and pretend to understand what I'm talking about.
P.S.: faggot

Name: Anonymous 2007-03-06 13:32 ID:gL48pHzf

>>37
is a total faggot.

Name: Anonymous 2007-03-06 13:35 ID:4zhFPtMq

All variables should have one-character names... if you'll excuse me, I'm off to achieve Satori while programming some Haskell.
