For most people and languages, the difference is lol(x, y) vs. x.lol(y), which makes OO absurd (and operations tend to suck more in x.lol(y) notation, like string1.equals(string2), ugh). Then you have inheritance, which means you get a free dish of spaghetti code. It's just dynamically updated copypasta, and come to think of it, spaghetti is pasta.
That's ALL there is to OO; note that garbage collection, exceptions (= goto 2.0), function and operator overloading, etc. are not OO features, and many languages offer them procedurally.
If you are serious about OO, then OO is about functional units responsible for keeping their own state, and it can provide a good abstraction. Inheritance and mixins can have their uses as well, but for this to be comfortable you have to work with a dynamic language such as... I'm not going to give examples because I'll be called a fanboy.
Of course, as with pure anything, pure OO sucks. Not all structures, models and algorithms adjust well to OO. Pure OO languages which shove OO up your ass all the time, like Java, are bound to suck. (Java is bound to suck for many other reasons as well.) So you use OO when it fits, procedural programming when it fits, and functional programming when it fits, and you finish early.
I agree with most of this post (especially choosing the paradigm based on what job you have to do), except for Java as an example of "pure OO", since Java method bodies are essentially procedural. It's the same for Python, Ruby, etc.
I would consider a "pure OO" language to be Self or Smalltalk, which have no constructs except message passing (Self doesn't even have classes, you clone prototype objects instead).
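For readers who have never seen prototype-based OO: a rough sketch of the clone-instead-of-instantiate idea, in Python. Python isn't Self, and the `Prototype` helper here is invented purely for illustration, so treat this as an analogy, not how Self actually works:

```python
import copy

class Prototype:
    """A hypothetical helper: objects carry their own slots, no class hierarchy."""
    def __init__(self, **slots):
        self.__dict__.update(slots)

    def clone(self, **overrides):
        # New objects come from copying an existing one, not instantiating a class.
        new = copy.deepcopy(self)
        new.__dict__.update(overrides)
        return new

point = Prototype(x=0, y=0)
p2 = point.clone(x=3)      # "inherit" everything from point, override x
print(p2.x, p2.y)          # 3 0
```

In Self, even the clone operation itself is just a message sent to the prototype object.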
Pure OO languages which shove OO up your ass all the time, like Java, are bound to suck.
Java is OO gone horribly wrong then marketed pure enterprise OO.
Name:
Anonymous2007-03-01 13:09 ID:iyCrV+EW
>>4
In what way is it the same for Ruby? I'm not a Ruby user, but it seems that your understanding of the Ruby object system is lacking.
Name:
Anonymous2007-03-01 14:47 ID:xe3ovJKc
Fuck OO. GOTO or GTFO.
Name:
Anonymous2007-03-01 15:03 ID:Ybr+qfIs
Java is the bastard child of OO and procedural programming, but it shames its fathers (no girls in programming lol) by doing neither remotely well.
Name:
Anonymous2007-03-02 2:36 ID:h+R4C0zj
>>8
Java isn't as much a programming language as it is a productivity tool for dime-a-dozen managers and their fresh out of college "Java developers" who can't tell their arse from their elbow. It was invented for the tech boom of the turn of the century for the purpose of squeezing reams of crap, ill-indented code from mediocre-to-shitty programmers.
Why some people insist on using it today is beyond me.
Because unlike a "true" OO language, Java can be made to run in reasonable amounts of time. Compilers can't optimize "true" OO because they have no clue what the code will be doing until runtime. So you lose all the nice static typechecks that make Java code fast and (reasonably) safe, and allow you to do automated refactorings through shiny IDEs like Eclipse.
Yeah, the fact that it's not "true" OO is annoying, but the gains from having a language that a computer can understand are immense. Most of the people who complain about Java OO are those who haven't worked on large projects - where having the computer understand your code is incredibly important.
>>11
CFLAGS JUST KICKED IN, YO! Development time is usually far more expensive than taking one more second to run. And who the heck said static typechecks are nice? They are anal, that's what they are. Inflexible, powerless, anal about everything. Seriously, if you need to explain which type your objects (or not-objects in Java, lol) are, you need rework. Even for large projects.
I stand corrected. Ruby seems to be pure OO too, unlike Perl and Python. I had only ever seen Ruby code, but once I read the docs I saw that what looked like top-level procedure definitions are actually syntactic sugar for private method definitions on class Object. Mea culpa.
Name:
Anonymous2007-03-02 12:47 ID:PWFP7qxZ
>>11
You're talking out of your arse. There's absolutely no reason why ints, floats et al. can't be objects and still have static compile-time guarantees to enable safety and optimization. The compiler knows the difference between a String object and a BufferedInputStreamReader object, so why would it have problems with integer objects?
>>12
If you believe static type checking is "anal", you know nothing about polymorphism. Static checking is a must when you work on a team... like in a real job you don't have because you're most likely unemployed.
Name:
Anonymous2007-03-02 15:00 ID:uYejW+YA
>>17
IF IT WALKS LIKE A DUCK AND QUACKS LIKE A DUCK MOTHERFUCKER
[turns off cruise control]
If you work in a team, you design and specify properly and stick to it when you code. Then if you call method "Defribolize" on an object that supports the method "Defribolize", it works! OMG! This is a really amusing concept. To work.
By the way, I have a real job on this. [TURNS ON CRUISE CONTROL] [WAVES EPENIS]
>>18
I work on a team with mathematicians. They know absolutely nothing about compilers or CPUs even when they have 12 PhDs, and here is the part where everything collapses:
and stick to it when you code
YOU can stick to it, others can't.
Name:
Anonymous2007-03-02 17:21 ID:AJifwx71
YOU won't stick to it, and others won't either.
fix'd
It's amazing how naive some programmers are. Sorry, man, you aren't transhuman. No matter how hard you try, no matter how many processes you put in place, you will make mistakes, and some will get through.
like in a real job you don't have because you're most likely unemployed.
Truth.
Besides, the waterfall model went out of vogue ages ago. "design and specify properly"? Haha, right. Please excuse the blood I'm laughing up over here.
Name:
Anonymous2007-03-02 18:50 ID:uYejW+YA
>>21 No matter how hard you try, no matter how many processes you put in place, you will make mistakes, and some will get through.
No shit, Sherlock! That's why you test your shit. You will make mistakes regardless of static or dynamic typing. Static typing will catch 2 errors and fuck you in the ass 20 times. I'd rather spend twice as long debugging a dynamic program, because I'll take one sixth as long to write it. And no, you are the inexperienced fag. Because of the higher abstraction, you make fewer mistakes, since you don't have to waste your time micro-managing your types, and you require fewer side effects. Correct programs are more obviously correct and more easily provable.
Whenever I write a sizable C module, I spend a considerable amount of time debugging it, because no matter how hard I try, I often make mistakes. On the other hand, I've written Python modules of the same length in lines (implementing 10 times more functionality, and written in a bit less time) that work right from the start, or right after fixing a couple of syntax errors. The logic is almost always correct and does what I want, even when the module consisted of implementing funny classes with lots of double-underscore methods.
Moreover, you often find that higher abstractions and unanal typing yield far more useful results. Many times, after doing some work, one of my mates would say "wait, now they say we need X besides Y, back to work", and then I'm like "...come to think of it, X will work just as well; I hadn't thought about it when I wrote it to support Y".
Besides, the waterfall model went out of vogue ages ago. "design and specify properly"? Haha, right.
Who said we work on a waterfall model? You said it, not me. I work on an incremental life cycle. That's what I was talking about in the previous paragraph. Sometimes you'd need to update some component, then discover you don't have to because you haven't been anal about things and your new object or function still quacks like a duck (or can be taught to quack pretty easily), which is all you really needed.
And again, I'm employed and use dynamic languages almost exclusively at work. I'm lucky to have a manager who understands his shit. He did start a couple of projects with Java before I was hired, but after seeing how I could get shit done and modified pretty quickly without turning it into a mess, and seeing how he could very easily understand what I was doing, while the Java shit he knew better would always look like a piece of shit, he ditched Java. At first, he would say "now we need this, I hope we can have it by the end of the week", and I would reply "WTF, one week? That'll be done by tomorrow, testing and all. Anything else?". It's up to you whether to believe it, and it's your time to waste if you use static languages.
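The "still quacks like a duck" point above is easy to demonstrate. A minimal Python sketch (the class and function names are invented for illustration):

```python
class Duck:
    def quack(self):
        return "quack"

class Robot:
    # A completely unrelated class; it merely happens to have a quack() method.
    def quack(self):
        return "beep quack"

def make_noise(thing):
    # No type declaration, no interface: anything with quack() works.
    return thing.quack()

print(make_noise(Duck()))   # quack
print(make_noise(Robot()))  # beep quack
```

Swapping in a new object later requires no change to make_noise(), which is exactly the "teach it to quack" scenario described above.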
>>22
You seem to be working under the false assumption that static typing -> low level procedural language, like C.
Name:
Anonymous2007-03-02 19:09 ID:uYejW+YA
>>23
Java is supposed to be higher level, and it still sucks (plus it adds new forms of suckage unrelated to being statically typed). I simply don't know more statically typed languages; I moved to dynamic languages long ago.
Name:
Anonymous2007-03-02 20:19 ID:AJifwx71
No shit, Sherlock! That's why you test your shit.
That's why you test your shit and use static typing.
On the other hand, I've written Python modules the same lines long (implementing 10 times more functionality, and written in a bit less time) that work right from the start or right after fixing a couple of syntax errors.
Uh, yeah. Strange that you should bring Python up, since I earn my living working on a mid-size project written in that language. By mid-size I'm referring to 50MB of code just for the section I'm working in.
The main reason the thing has managed to scale to this size is largely thanks to a number of hacks put in place to fake static typing, and since these checks occur at run-time, it really puts a damper on our development pace.
The sooner you catch the errors, the better. Too restrictive a type system and you're in for some pain, but going too far the opposite way is just foolhardy. At the very least Python ought to rip off my() and use strict from Perl. The very bare least.
As for that productivity boost you saw: that's not due to dynamic typing. If that's the conclusion you drew, I suggest thinking a bit more on the differences between Python and C.
Name:
Anonymous2007-03-02 23:37 ID:504KU4Ba
>>24
I used to dislike static typing when my only exposure to it had been the C class of languages (including Java). That changed after I was introduced to languages with well thought out strong type systems, like OCaml and Haskell.
Haskell, especially, is most likely higher level than whatever dynamically typed language you are using right now.
Name:
Anonymous2007-03-03 4:14 ID:eS2UIJf5
>>26
Ditto. Static typing doesn't have to be synonymous with ugly verbosity (or verbose ugliness). In fact I find Haskell's type information actually makes it easier for me to figure out what code does, and to write correct functions.
I used to be a dynamic language fanboy, but now I'm of the opinion that dynamic typing is just an excuse not to implement a proper type system. Most people who rant about how awesome dynamic typing is are mainly just relieved to be away from the world of having to declare every damn thing and, worst of all, use type-casts when extracting stuff out of a collection that stores its shit as Objects or whatever. The first can be addressed by type inference, the second by generic functions and types.
So in conclusion GTFO with your Pythonic 'duck' typing.
Name:
Anonymous2007-03-03 4:17 ID:XaSQYFhj
Fuck OO. GOTO or GTFO.
Name:
Anonymous2007-03-03 4:44 ID:eS2UIJf5
The main reason the thing has managed to scale to this size is largely thanks to a number of hacks put in place to fake static typing, and since these checks occur at run-time, it really puts a damper on our development pace.
You're not by any chance working on Zope are you?
Name:
Anonymous2007-03-03 9:57 ID:ol2k1O3N
>>25
No need for my/strict in Python, as the use of an undefined symbol raises an exception.
The 5-10 times more productivity comes from several things: dynamic typing, no time spent on managing types, built-in lists and dictionaries, functional programming features, the expressiveness of the language, dynamic extensibility, and the generic nature of anything you write. For the same price, something that would work on X will also work on Y, Z, and the whole alphabet, without having to consider polymorphism rules or anything - you call method hi, and objects have method hi, as simple as that.
BTW, 50 MB of Python? You could program all vital systems of a space station with that.
>>27, yes, dynamic typing can be troublesome when you call method "hi" of object "x", because it might not be evident what "x" is at runtime, but if you write proper code, you should be able to tell what kind (note the use of "kind" instead of "type") of objects x will be from its name and from the context (e.g. the function it's used in). That's why you avoid one-character names for important objects. If you can't tell the kind of an object from its name or from immediately accessible documentation near the function definition, you're doing something wrong (or the guy who wrote it did something wrong).
Coincidentally, I'm now reading a paper by Erik Meijer called "Static Typing Where Possible, Dynamic Typing When Needed: The End of the Cold War Between Programming Languages" ( http://pico.vub.ac.be/~wdmeuter/RDL04/papers/Meijer.pdf ). I'd recommend it to participants who were actually interested in this thread, rather than just flaming.
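The my/strict claim above is easy to check. Note, though, that Python's NameError fires only when the offending code actually runs, not when the file is parsed, which is weaker than Perl's compile-time `use strict`. A minimal sketch:

```python
def greet():
    return helo  # typo: "helo" is never defined anywhere

# Defining greet() raises nothing; the bad name is only caught on the call.
try:
    greet()
except NameError as e:
    print(e)  # name 'helo' is not defined
```

So a typo hiding in a rarely executed branch can survive until that branch finally runs.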
Name:
Anonymous2007-03-04 12:46 ID:mkOkq9IY
>>32
Yes it does, because the interpreter doesn't check the name and can't; it's in English, you moron.
Name:
Anonymous2007-03-05 9:04 ID:gRucJgj3
>>34
You're a moron if you name variables in such a way that you can't tell what kind (not necessarily type) of objects they will reference at runtime.
>>35
You're probably the type of faggot who names looping variables "loopIndex1" instead of "i". Fuck off.
Name:
Anonymous2007-03-06 5:57 ID:LblekCGL
>>36
Short names (usually one character) are best for short loops, temporary variables, dictionary parts, etc. For everything else (function parameters, application logic, globals, etc.) you use long, descriptive names.
Also, I feel offended that you think I'd use shitCase. If loopIndex1 were a variable, it'd be loop_index1. If it were a function, it'd be LoopIndex1.
Name:
Anonymous2007-03-06 12:55 ID:6JZw35Eg
>>37
I can quote books too and pretend to understand what I'm talking about.
PS : faggot
- shitCase is good.
- Short names are good for variables.
- Medium-length names are good for functions, types, modules, etc.
- Java-length names are bad.
Name:
Anonymous2007-03-06 15:29 ID:EjUj3aVQ
>>41
You mean superClassToGetIntsBecauseJavaWontGiveMeAFuckingPointerToOne isn't a good name?
Name:
Anonymous2007-03-07 5:08 ID:5HTqSbzp
>>38
What? What book? If there's a book that already said that in those words, let me know, because that'd be a massive coincidence and it'll make me happy.
>>41
shitCase is shitty shit mostly Java fags use: it looks fugly, and it's ambiguous: fuckSister is shitCase, but fuck alone can't be told apart from lowercase. Butt ugly. CamelCase is the proper way, and it's not ambiguous except for single letters, which you shouldn't be using for something that deserves CamelCase.
Short names are good for loop and index variables, temporary variables, and such, but if you name your variables for major application logic, non-anonymous functions, function parameters (save geometry, maths, and stuff like that), globals, or similar with single letters, then I hope I never have to work on your code.
And of course Java-length names suck. If you need five words to describe something, you're doing it wrong.
Name:
Anonymous2007-03-07 6:22 ID:aQP5QU4I
CamelCase is the proper way
Sure, if you're a VB 'programmer'.
Take your religious war elsewhere. Here we care about programming, not your cute little opinion on largely irrelevant style preferences.
Name:
Anonymous2007-03-07 8:20 ID:/KyOVEZ7
firstWordLow case is very, very useful in Haskell, since underscore is how you write something that's supposed to be read as being subscripted (it doesn't mean anything in the language). And CamelCase is restricted to constructors and type, class, etc. names.
Wouldn't use it in C, hell no. My left pinky is already sore enough from working the left shift key.
>>45
One word. Forced indentation of code. Thread over
Name:
Anonymous2007-03-07 10:15 ID:UF1mV5Xm
>>43 but fuck alone can't be told from lowercase.
That's sort of the whole idea.
And if you need long variable names, then your variable scopes are too big.
Name:
Anonymous2007-03-07 12:07 ID:5HTqSbzp
>>44
I don't know Visual Basic. Do they use CamelCase for functions and classes? Then at least there's ONE thing they got right.
>>46
Hahaha, get a language that doesn't rely on hacks like that to parse and tell the type of identifiers.
>>49
No, that's not the idea. There's lower_case, there's CamelCase, and there's UPPER_CASE for you to use as you see fit. A language shouldn't impose a convention on you, but you should stick to a decent one. And a decent one would be CONSTANTS, FunctionsAndClasses, and other_variables, plus less significant variables which you name a, b, c, .... Of course, you shouldn't need either BufferedCocksInputShitter or buffered_cocks_input_shitter, because if you can't explain what you do in fewer than 3 words, you're failing or missing vocabulary.
>>50 Hahaha, get a language that doesn't rely on hacks like that to parse and tell the type of identifiers.
Haskell uses type inference, which your puny mind has no chance of comprehending!
Name:
Anonymous2007-03-07 16:57 ID:pT+k+A3l
No need for my/strict in Python, as the use of an undefined symbol raises an exception.
No wonder I think the majority of python fags are idiots.
That's incredibly ignorant.
Name:
Anonymous2007-03-07 19:05 ID:lyv5qAm/
Python is fail, it's not OO enough (Ruby is), not functional enough (Lisp and Haskell are) and not useful enough (Erlang is). It's a failed mix of languages, I will never use this crap.
Name:
Anonymous2007-03-08 1:13 ID:fr6vdVp3
OO IS FOR FAGGOTS WHO NEED TRAINING WHEELZ. EVERYTHING CAN BE DONE IN PROCEDURAL MORE EFFICIENTLY.
Name:
Anonymous2007-03-08 5:57 ID:3b42YNTO
>>53
Python's OO is good exactly because it's not OO-fanatic and doesn't get in the way when OO does not adjust properly to a particular problem. Right tool for the right job, fags.
I agree that it should provide more functional programming tools, but right now you have built-in lists (and dictionaries), first-class functions and classes, lexical scoping, lambdas, lazy iterators, coroutines, and you can do a lot of stuff, from the basic list operations (map, filter, fold, scan, zip, etc.) to function manipulation (currying, composition, etc.). You can write (or copypasta) decorators to memoize or tail-call-optimize your functions. That's more than most languages offer.
As for usefulness, bullshit. Python's standard library and cheese shop are awesome.
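As an example of the decorator point above, here is a minimal memoize decorator of the kind described (a sketch, not code from the thread; `fib` is just a demo function):

```python
from functools import wraps

def memoize(fn):
    # Cache results keyed by positional arguments (assumed hashable).
    cache = {}
    @wraps(fn)
    def wrapper(*args):
        if args not in cache:
            cache[args] = fn(*args)
        return cache[args]
    return wrapper

@memoize
def fib(n):
    # Naive recursion: exponential without the cache, linear with it.
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(30))  # 832040
```

The function itself stays oblivious to the caching; that separation is what makes this kind of thing pleasant to copypasta between projects.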
Name:
Anonymous2007-03-09 17:27 ID:e9IH8SxS
>>55
I agree. Python is pretty useful even though it is not pure in any respect.
Python's random nonfunctionality is quite disturbing, though. Some standard methods (like list.reverse()) seem to be destructive just for kicks, and they bite me every fucking time I use them.
Another thing I'll really fucking hate for as long as I use Python is the fact that 0 == False, which doesn't work too well with enumerators.
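Both gripes are easy to reproduce (a quick sketch):

```python
xs = [1, 2, 3]
print(xs.reverse())  # None - reverse() mutates in place and returns nothing
print(xs)            # [3, 2, 1]

ys = list(reversed([1, 2, 3]))  # the non-destructive alternative
print(ys)            # [3, 2, 1]

print(0 == False)              # True - bool is a subclass of int
print(isinstance(False, int))  # True - which is why 0-valued enumerators bite
```

So code like `if some_index:` silently misfires when the legitimate value happens to be 0 or False.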
Name:
Anonymous2008-04-19 18:40
>>55
You're a moron if you name variables in such a way that you can't tell what kind (not necessarily type) of objects will they reference in real time.
Name:
Anonymous2009-03-06 11:04
The hell indeed I would believe nothing they said you need to go teach and I felt kind of vocational schooling It.
Name:
Trollbot90002009-07-01 8:06
It in each directory so the url inference engine built into the current node and moves the pointer to the previous node Used for retracing?
Name:
Anonymous2012-03-23 23:40
All work and no play makes Jack a dull boy
There's no such thing as OO vs. procedural, mostly because the term OO is meaningless. Everybody has their own definition of it, and people always misunderstand each other when discussing it. In a similar way, procedural is used to name either the things that aren't OO or anything imperative, and as such doesn't carry much weight.
The notion that you need classes and static typing for large projects is plain wrong. With large projects, you need to divide the problem into smaller problems and solve them independently. You build small modules with clear requirements and responsibilities, and larger modules in terms of other modules. This has been known since the olden days[1]. You can do this with classes, but that's not the only way.
[1] D. L. Parnas, "On the Criteria To Be Used in Decomposing Systems into Modules"
Name:
Anonymous2012-03-26 10:26
You are all insane.
Real programmers use interface programming.
Interfaces (with multiple inheritance) solve all the silly problems with duck typing and allow me to have compiler checks.
You also get better class diagrams without the fucking inheritance (C++, fuck you).
No, I mean pure abstract base classes in a language that supports multiple inheritance.
And on that note, multi-dimensional separation of concerns and subject-oriented programming work and should be favored when using (read: forced to use) OO.
Name:
Anonymous2012-03-26 12:58
>>2 lol(x, y) vs. x.lol(y)
More like mynamespace_lol(x, y) (or something along those lines) vs x.lol(y) if you're writing something that isn't a toy program.
The idea of OO is good, but it's not a one-size-fits-all solution, unlike procedural programming, which is - though it's not so pleasant sometimes.
Name:
Anonymous2012-03-26 13:42
>>25
>The main reason the thing has managed to scale to this size is largely thanks to a number of hacks put in place to fake static typing
Sounds interesting. Could you give an example of the notation? Say you had,
def add(a, b):
    return a + b
in your work's codebase. How would you annotate static type information?
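The poster being asked never answered, but one plausible shape for such runtime "static typing" hacks is a checking decorator like the one below. This is entirely an assumption on my part, not the actual codebase's notation; it also shows why such checks damp the pace, since every call pays for them:

```python
def typed(*types):
    # Hypothetical runtime "annotation": verify argument types on every call.
    def deco(fn):
        def wrapper(*args):
            for a, t in zip(args, types):
                if not isinstance(a, t):
                    raise TypeError(
                        f"{fn.__name__}: expected {t.__name__}, "
                        f"got {type(a).__name__}")
            return fn(*args)
        return wrapper
    return deco

@typed(int, int)
def add(a, b):
    return a + b

print(add(2, 3))  # 5
# add(2, "3") raises TypeError - but only at call time, never at "compile" time
```

Modern Python would instead use annotations (`def add(a: int, b: int) -> int`) plus an external checker, but that machinery didn't exist when this thread was written.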
Name:
Anonymous2012-03-26 15:32
I stand neutral, but I find this funny:
"Microkernel" was the buzz-word of last year, so Minix is a microkernel. "Object-oriented" is this years, so Minix is object-oriented - right?
- joe, Feb 3 1992, 3:33 am, comp.os.minix, LINUX is obsolete
>>75
I guess I'm the 10%. #occupyworld4ch xDDDDDDD
Also, I'm guessing you didn't just choose 2008 arbitrarily or even based on any real meaningful information. Rather, you chose the exact year you came here, to make it seem like you're an "oldfag". Don't kid yourself.
Name:
Anonymous2012-03-27 9:52
>>76
Moot has openly stated that with 4chan pop falling, in a few years we could get back to pre 2009/2008 levels.
I came here sometime around 2003-2005; /d/ was nice and cozy back then.
I know you're being a dumb troll; I don't know why I can't help but respond to your retardation. If only Fox News had never reported on 4chan pedophilia.
Name:
Anonymous2012-03-27 9:56
>>77 lol look at me I'm such an oldfag xDDDD no one else discovered this site before me xDDDDD
Name:
Anonymous2012-03-27 9:57
I came here in 2012 xd. My older brother (13) told me about it. Im planning on being a hacker for the isreali government xD so I can fight you guys XD
Bringing /prog/ back to its people
Name:
Anonymous2012-05-28 23:20
OO is just a superset of procedural programming with some simplifications for things that you'd do anyway without it: functions take pointers to structs, so you logically put them together with the data in the struct definition, add the "this" pointer, and bring all the struct members into scope automatically.
It's all a bunch of terminology made to make things look new and interesting, when it's really nothing more than a way to simplify existing conventions.
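The struct-plus-functions equivalence described above reads roughly like this in Python (the counter example and its names are invented for illustration):

```python
# Procedural style: a plain record plus functions that take it explicitly.
def make_counter():
    return {"count": 0}

def counter_increment(self):  # "self" passed by hand
    self["count"] += 1

# OO style: the same data and operations grouped, with self passed implicitly.
class Counter:
    def __init__(self):
        self.count = 0

    def increment(self):
        self.count += 1

c1 = make_counter()
counter_increment(c1)   # explicit: counter_increment(c1)
c2 = Counter()
c2.increment()          # sugar for Counter.increment(c2)
print(c1["count"], c2.count)  # 1 1
```

In Python the sugar is visible: `c2.increment()` really is `Counter.increment(c2)`, which is exactly the "functions taking pointers to structs" picture.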