Exercise 9.

Name: Anonymous 2010-07-09 16:42

Give your view as to why a course in assembly language is no
longer a requirement for a computer science degree in many colleges.

Name: Anonymous 2010-07-09 16:46

Because industry requirements are getting dumber each day and they just want more plumbers, instead of people who actually know the system inside-out. It may be that systems are getting too complex for anyone to know them in full, but that's no excuse not to understand the basics.

Name: Anonymous 2010-07-09 16:51

For the same reason SICP is no longer taught at MIT.

Name: Anonymous 2010-07-09 17:01

Because computer science is only incidentally about computers, and pretending universities are supposed to be training camps for corporate programmers is actually kind of offensive.

Name: Anonymous 2010-07-09 17:11

Because Java is more useful.

Name: Anonymous 2010-07-09 17:12

>>4

> incidentally about computers

I only figured this out after reading SICP

Name: Anonymous 2010-07-09 17:31

>>2
The basics have gotten larger. There's no reason to dedicate an entire obligatory course to assembly language when you can integrate a quick introduction into other courses and let students pick more specialized courses later on.

Name: Anonymous 2010-07-09 17:34

>>6

A friend and I had a long discussion about what we would change "Computer Science" to.  I may make a thread about that...

Name: Anonymous 2010-07-09 17:38

>>8
Don't.

Name: Anonymous 2010-07-09 17:40

Too slow on the draw son.

Name: Anonymous 2010-07-09 17:44

I was talking to some enterprise tools one day. We spoke about the languages we use most often. It was something like this on their part:

"Lol, why the fuck would you learn a language like C?"

or

"What's C?"

or

"Sorry, I don't do assembler.".

Name: Anonymous 2010-07-09 17:45

> Give your view as to why a course in assembly language is no
> longer a requirement for a computer science degree in many colleges.
I'm fine with them teaching assembly, but I wouldn't devote a whole course to it.

Name: Anonymous 2010-07-09 17:47

Despite its name, a significant amount of computer science does not involve the study of computers themselves. Because of this, several alternative names have been proposed. Certain departments of major universities prefer the term computing science, to emphasize precisely that difference. Danish scientist Peter Naur suggested the term datalogy, to reflect the fact that the scientific discipline revolves around data and data treatment, while not necessarily involving computers. The first scientific institution to use the term was the Department of Datalogy at the University of Copenhagen, founded in 1969, with Peter Naur being the first professor in datalogy. The term is used mainly in the Scandinavian countries. Also, in the early days of computing, a number of terms for the practitioners of the field of computing were suggested in the Communications of the ACM – turingineer, turologist, flow-charts-man, applied meta-mathematician, and applied epistemologist.[25] Three months later in the same journal, comptologist was suggested, followed next year by hypologist.[26] The term computics has also been suggested.[27] In continental Europe, names such as informatique (French), Informatik (German) or informatica (Dutch), derived from information and possibly mathematics or automatic, are more common than names derived from computer/computation.

The renowned computer scientist Edsger Dijkstra stated, "Computer science is no more about computers than astronomy is about telescopes." The design and deployment of computers and computer systems is generally considered the province of disciplines other than computer science. For example, the study of computer hardware is usually considered part of computer engineering, while the study of commercial computer systems and their deployment is often called information technology or information systems. However, there has been much cross-fertilization of ideas between the various computer-related disciplines. Computer science research has also often crossed into other disciplines, such as philosophy, cognitive science, linguistics, mathematics, physics, statistics, and economics.

Computer science is considered by some to have a much closer relationship with mathematics than many scientific disciplines, with some observers saying that computing is a mathematical science.[11] Early computer science was strongly influenced by the work of mathematicians such as Kurt Gödel and Alan Turing, and there continues to be a useful interchange of ideas between the two fields in areas such as mathematical logic, category theory, domain theory, and algebra.

The relationship between computer science and software engineering is a contentious issue, which is further muddied by disputes over what the term "software engineering" means, and how computer science is defined. David Parnas, taking a cue from the relationship between other engineering and science disciplines, has claimed that the principal focus of computer science is studying the properties of computation in general, while the principal focus of software engineering is the design of specific computations to achieve practical goals, making the two separate but complementary disciplines.[28]

The academic, political, and funding aspects of computer science tend to depend on whether a department is formed with a mathematical emphasis or with an engineering emphasis. Computer science departments with a mathematics emphasis and with a numerical orientation consider alignment with computational science. Both types of departments tend to make efforts to bridge the field educationally if not across all research.

Name: Anonymous 2010-07-09 17:54

>>13
[citation needed]

Name: Anonymous 2010-07-09 18:37

I would call it "GUI Interfaces in Visual Basic".

Name: Anonymous 2010-07-09 18:52

WAT C???

Name: Anonymous 2010-07-09 18:58

>>16
A language for 50 year old UNIX programmers who still live with their mothers and suffer from Buried Penis Syndrome.

Name: Anonymous 2010-07-09 19:05

> Give your view as to why a course in assembly language is no longer a requirement for a computer science degree in many colleges.
"Because you would never find enough students who would want to stay in the course if you forced [that] ... it would be bad for the department." - professor from my college, when I asked why majors weren't required to learn any language lower-level than C++

Name: Anonymous 2010-07-09 19:14

> Give your view as to why a course in assembly language is no longer a requirement for a computer science degree in many colleges.
It's for EEs, not theorem-proving systems and toy languages.

Name: Anonymous 2010-07-09 19:34

>>19
CEs

Name: Anonymous 2010-07-09 23:36

>>20
Don't worry, we EEs only get down and code when we need something that works right the first time. The rest we leave to them.

Name: Anonymous 2010-07-10 2:17

>>21
The only reason errors are much less frequent in EE is that testing is embedded very deeply in hardware design practice. The cost of failure is simply unacceptable for some projects (like ASICs), so everything is usually very rigorously tested using test benches and other mechanisms. Software engineers test considerably less.

Overall, languages like VHDL and Verilog appear high-level, but they tend to be quite low-level, in the way one designs state machines manually and how various other things you take for granted when doing normal programming have to be done in a much more manual (and modular) manner. There are also complex synchronisation issues and other problems you have to deal with when designing for a physical platform.

Things could be made much simpler by using some higher-level language like a subset of C or certain functional languages which eventually compile to VHDL or Verilog, or to portable or non-portable (technology-specific) schematics, but can you really afford the bloat in time and space that such languages will bring with them? Hardware costs money, and the bigger and slower your hardware is, the more you have to pay, especially if it's something mass manufactured.

Software on the other hand is not limited as much as hardware: RAM tends to be cheap, space is plentiful, CPUs are reasonably fast. It only becomes an issue when you have to buy large server farms to run your software on each node. And since it's software, bugs are usually less important as software can be easily upgraded.

Name: Anonymous 2010-07-10 2:49

>>1
Optimizing ASM code today is far more difficult, and C compilers get better results in general. What is the point of asm code that runs maybe 10-20% faster but requires ten times more effort?

Name: Anonymous 2010-07-10 2:59

>>23
1: Because learning ASM learns you how a computer works. Computer science might be to computers what astronomy is to telescopes, but knowing how to operate one is sure going to come in handy.
2: Those 10-20%, of course. You might need them.

Name: Anonymous 2010-07-10 3:01

Compilers have inline ASM and CPU intrinsics, which make pure asm programs unnecessary (except for extreme size optimizations).

Name: Anonymous 2010-07-10 4:35

>>22
> Overall, languages like VHDL and Verilog appear high-level
This, except not at all.

Name: Anonymous 2010-07-10 5:42

> Because learning ASM learns you how a computer works.
Oh god not this tired bullshit again. This hasn't been true for what, 15 years?

Name: Anonymous 2010-07-10 6:46

>>27
How is it not true? If you know asm, you know your CPU's external documented interface. You should also learn about your platform and how other interfaces work.

Of course, on modern OSes, the bulk of the code is usually usermode code which is isolated from having to deal with hardware directly.

The other step in learning how a CPU works is to learn how CPUs are designed and how they work, and maybe make your own as an exercise (buy some FPGA to run it on). You may also model a software emulator which emulates your entire platform. After that, you can say you truly understand how computers work.

Name: Anonymous 2010-07-10 7:22

>>28
knowing the interface != knowing the implementation

Are you aware of just how much shit is going on under the hood on a modern CPU?

Name: Anonymous 2010-07-10 7:54

>>29
Go read the source of some CPU cores. There's enough open source ones, some simple, some quite complex (like OpenSparc ones).

Name: Anonymous 2010-07-10 8:55

>>30
How would reading sources for toy CPUs teach you anything about the workings of a modern computer?

Name: Anonymous 2010-07-10 9:22

>>31
Why do you think things are so different?
If you really want to learn how *insert popular CPU* works (like x86), get yourself a lot of money and reverse engineer it in a FA lab, or buy papers from Chipworks. Modern CPUs aren't that different from older ones; they just use more complex processes, are more optimized, sometimes even full-custom. If you want to learn the workings of other components (RAM, buses, interface cards, and so on), you can read the standards to see what is expected of each component (the interface), and from there you can figure out how to implement such components, or see if there are any open source implementations for you to look at. It's really no big secret: most of the info is out there publicly for you to read and understand, and if you want hands-on experience, you can get it.

Name: Anonymous 2010-07-10 12:37

>>32
The argument you guys are having is stupid, but I'd like to point out that learning assembly does not typically involve reverse-engineering chips from silicon (and certainly does not give you the skills to do so!) You've just endorsed >>27's point, which I don't think is what you meant to do.

Name: Anonymous 2010-07-10 17:32

>>33
You don't need to reverse the chip to understand the arch.
If you know the public interface, that's enough to have all the software work properly on your system. How you create something which implements that interface is up to you, and it doesn't have to be the same way as done by the original chip maker: provided your implementation is faithful to the documented interface, the software will work the same. The public interface (assembly language being one such thing) is usually well documented, so anyone versed in CPU and hardware design should be able to implement it given enough time (there are no real secrets about how to implement various things; the only trick is doing it efficiently, which is where most of the effort on modern CPUs is invested, but those are just optimizations, even if the entire chip may consist of them).

So no, my claim still stands that you don't really need to know the exact undocumented internals of an architecture to understand it or make compatible software for it (on all levels, both kernel and user mode). If I didn't explain myself well enough, I'll give a practical example anyone can understand: x86 is implemented by both Intel and AMD, each doing their own R&D (they do have some agreements when it comes to patent pool sharing and similar things, but that's still just sharing of PUBLIC, WELL-DOCUMENTED information that the hardware and software must conform to in order to be called x86). The same is true for other things like RAM interfaces, buses and whatnot.

Another thing to note is that one doesn't even have to know how the hardware internally works to understand a platform: an emulated platform (based on a software model) is just as valid as a hardware model. In fact, many hardware designs first start off as a simple software model one can test and toy around with, after which the hardware models are developed. I'd imagine people here on /prog/ would understand that the model/interface is a universal and abstract mathematical construct which can be implemented in actual code, be it physical "code" (hardware), pure software code, or even something in between. When people develop for a platform, they develop for the model of the platform, while the platform itself can be implemented in many different ways (emulated, virtualized, FPGA, ASIC, different full-custom CPUs, and so on).

A simple example is the Java virtual machine, which started out as a purely virtual machine implementing the model of a certain stack-based CPU, and was at first implemented only in software. After years passed and Java became popular, CPUs appeared on the market which could execute Java bytecode natively, so the platform was now implemented in hardware, but the same (compliant) software would still work, as it was designed for an abstract CPU model.

Name: Anonymous 2010-07-10 18:14

>>34
>>33 is right, you are undermining your own point. Your claim is that a given assembly language can be realised in hardware in a multitude of ways. No one disputes that, but it brings us back to the original point: knowledge of the interface does not give you knowledge of the internals, or, to put it bluntly, HOW IT WORKS. Which is fine, you can do useful things with that knowledge, but you can't claim you understand the hardware. A computer architecture class is more than an assembly tutorial.

Name: Anonymous 2010-07-10 18:23

>>35
The interface itself explains how it's supposed to work, but how a SPECIFIC platform works depends entirely on that platform. Of course, if you really want to know the gritty details, you should just take an EE course or read the relevant EE books yourself.

Name: Anonymous 2010-07-10 18:37

>>36
Which is what the entire point of this conversation was. >>27 says "Learning ASM won't teach you how a computer works", because an assembly language is the contract, the interface, and it could be realised in an infinite number of ways. As far as an assembly language programmer is concerned, the computer is a black box that obeys an interface, and it doesn't matter how it does that. Or let me put it one other way: assembly language is just another programming language.

Name: Anonymous 2010-07-10 19:17

>>37
It's not exactly just another programming language. It's the language to which everything is eventually compiled, as the CPU can only run native code. Knowing assembly means you know the CPU and you're able to program the system at a lower level. It also gives you an understanding of the costs of the various constructs encountered in mid- and high-level languages.
While it might be a waste of effort to have your students study the entire x86 instruction set or something along those lines, having them understand the base language of a CPU will help them better understand higher-level languages and their implementation. Not understanding the foundation on which the software world sits would only leave someone confused.

On the other hand, I do agree that a separate course on assembly might not be that useful for most students, but a few lessons dedicated to it would be useful. Of course, if the student has read his SICP, he might not need such a course as SICP explains this quite nicely.

Name: Anonymous 2010-07-10 19:24

>>38
> Knowing assembly means you know the CPU
It's like I'm talking to a wall.

Name: Anonymous 2010-07-10 22:26

>>39
You are talking to a wall. A wall of text. I'm surprised you responded. (I wouldn't have responded to him anyway, save my comment in >>33.)

Name: Anonymous 2010-07-11 4:33

You can learn assembly in a few weeks... If you're going to go through the trouble of getting a computer science degree, would it not make sense to learn some form of assembly language, at least at a very basic level? If you want to be a programmer, how could you not even want to learn assembly? It doesn't make sense.

Name: Anonymous 2010-07-11 11:42

>>41
Some painters may take a mild interest in what ingredients are used to make the various colors in their paints. It may even, in some removed and circuitous way, provide them with some understanding of how ingredients interact to produce certain effects.

But most painters are content with the tools they have, and are instead interested in perfecting their craft. And if they want to test the interactions between their paints, they can do so with the paints themselves, without bothering with ingredients at all.

Name: Anonymous 2011-01-31 20:10

<-- check em dubz

Name: Anonymous 2011-02-04 13:28

Name: Anonymous 2011-02-04 14:20
