
__future__

Name: Anonymous 2011-07-15 3:28

What is /prog/'s opinion on the future of programming?
-Will functional languages really take over?
-Are there any other paradigms rising out of obscurity?
-Will the hardware we run on change drastically? Would this affect how we program? Think quantum computers and shit (/tech/-esque, but entirely related to what we do)

I think that in the future the use of multi-paradigm languages will become almost standard for writing anything. Concurrent programming is the way of the future; it is currently the only way the speedups promised by Moore's Law will continue to be delivered. However, there are elements of functional languages that make them difficult for practical use, so switching between and combining paradigms where necessary may be the way to go. The risk is that most languages end up looking like C++, or worse: a mess of different ways of achieving the same goal (I still love me some C++).

As such I don't see a massive shift to functional languages taking place just yet, but I do think we will see the incorporation of functional features into OO and procedural languages. That said, I feel the need for functional languages is inevitable; we are moving closer and closer to a world where we have to use data from many different sources concurrently, and functional languages simplify this situation greatly.

I actually haven't learned a functional language yet, but I'm probably going to try to pick up Haskell and F# later in the year. Most of the information I've got on the functional style comes from http://www.defmacro.org/ramblings/fp.html

Given that all computers are Turing-complete, I don't think we'll make any changes to our actual programming style. High level languages will remain almost exactly the same, save maybe an idea or two. But I do think that the future of hardware could have a gigantic effect on cryptography. I haven't read much about this, but I am very interested in the effect it could have.

Name: Anonymous 2011-07-15 16:40

>>21
Eh? Hardware-wise, our processors wouldn't be where they are today if it weren't for low-level optimizations. Do you know how much low-level hackery goes into designing modern speculative, out-of-order, cache-coherent CPUs, whether they be CISC or RISC?

Software-wise, you can't just settle for simple high-level abstractions for everything because you're relying on Moore's Law eventually giving us enough cores that you can say "Well, I'll just upgrade to 32 cores and that'll solve my problem."

Getting faster (i.e. more transistors) CPUs to make unoptimized software faster worked when software was serial, running in a single thread, but it will not work for making unoptimized multi-threaded software faster.

The bottleneck isn't the number of CPUs or transistors. The bottleneck with concurrency is memory latency and shared memory, and if you haven't noticed, memory latency is actually getting worse relative to throughput as throughput increases.

You will never make multi-threaded software that isn't designed to scale go faster by throwing more hardware at it; it will hit its limit and never surpass it.

You have to solve the problem in software, it's the only way, and there's no single simple method to reduce memory sharing in such a way that you can hide it behind the veneer of a general-purpose programming language--this is stuff that must be dealt with in the design of the software itself, by the programmers using the language.
