
__future__

Name: Anonymous 2011-07-15 3:28

What is /prog/'s opinion on the future of programming?
-Will functional languages really take over?
-Are there any other paradigms rising out of obscurity?
-Will the hardware we run on change drastically? Would this affect how we program? Think quantum computers and shit (/tech/esque, but entirely related to what we do)

I think that in the future multi-paradigm languages will become almost standard for writing anything. Concurrent programming is the way of the future; it is currently the only way performance will keep scaling now that Moore's Law no longer translates into single-core gains. However, there are elements of functional languages that make them difficult for practical use, so switching between and combining paradigms where necessary may be the way to go. The risk is that we end up with most languages looking like C++, or worse: a mess of different ways of achieving the same goal (I still love me some C++).

As such I don't see a massive shift to functional languages taking place just yet, though I do think we will see functional features incorporated into OO and procedural languages. Still, the need for functional languages feels inevitable: we are moving closer and closer to a world where we have to work with data from many different sources concurrently, and functional languages simplify this situation greatly.

I actually haven't learned a functional language yet, but am probably going to try to pick up Haskell and F# later in the year. Most of the information I've got on the functional style comes from http://www.defmacro.org/ramblings/fp.html

Given that all computers are Turing-complete, I don't think we'll make any changes to our actual programming style. High level languages will remain almost exactly the same, save maybe an idea or two. But I do think that the future of hardware could have a gigantic effect on cryptography. I haven't read much about this, but I am very interested in the effect it could have.

Name: Anonymous 2011-07-16 10:57

Only having read >>1, my answer is:
Will functional languages really take over?
No, but the functional paradigm is useful and it helps to use it (when it makes sense) regardless of the language you're actually using (even for imperative ones like C or assembly)
Are there any other paradigms rising out of obscurity?
I enjoy CL's multiparadigm approach very much. Declarative, metaprogramming, functional, object oriented (and meta object oriented) and even imperative. Being able to just use what you want and having access to a variety of tools is quite nice. I'm not a huge fan of restrictions as far as programmer's freedom is concerned, except for cases where it can greatly improve stability and clarity of the code, but this is a fine line to walk.
Will the hardware we run on change drastically? Would this affect how we program? Think quantum computers and shit (/tech/esque, but entirely related to what we do)
Normal CPUs will remain common, simply because most current software is designed for them. More cores per CPU will become commonplace. GPU-oriented software is also on the rise - still a niche, but useful especially for parallelizable algorithms; it's currently in use for everything from gaming to crypto to AI/machine learning to multimedia encoding/decoding acceleration, plus various other niche applications that require speed.
Where true parallelism is required, FPGAs will meet the demand as usual, but you should expect much denser (although slower) alternatives in the next 5-10 years if certain research projects succeed - they'll mostly be used for treating your hardware as if it were software, at a much lower cost than FPGAs today, and possibly for AI applications (maybe even some neural-network-based AGIs, if those ever succeed).

As for Moore's Law - today's CPUs are designed to be too sequential; there is great room for parallelization (for certain applications) even once we hit the physical miniaturization plateau. It won't be many years until the lithography-based techniques used in today's manufacturing can't scale down any further (they're already facing major challenges), and people will have to get very creative there - be it finally taking the challenge of molecular nanotechnology seriously, or some sort of advanced 3D layering before attempting that. Once they take that final step, speeds will increase by a huge amount (likely one last time) and then hit a plateau, and from there on only the actual design will matter as far as speed is concerned.

As for quantum computing, its success depends a lot on what the true laws of this particular universe actually are (it's not enough for there to be "quantum mechanics" - the actual underlying implementation matters; depending on which "interpretation" is true, we may see differing results as far as quantum computing is concerned). It's too early to tell if it'll truly be a success. If it does succeed, certain search problems will become much faster to solve than on classical computers, but no, it's not going to make NP-hard stuff suddenly fast.

But I do think that the future of hardware could have a gigantic effect on cryptography.
PKI and various asymmetric crypto may become unusable. Symmetric crypto will probably be affected much less. There are certain ways to work around these problems, but that would be a discussion for another time.

Future is what you make it.
Certain trends may sweep the world you live in, but it is always the programmer (unless working for employers with specific language/hardware requirements) who decides what solves a problem best.
