
__future__

Name: Anonymous 2011-07-15 3:28

What is /prog/'s opinion on the future of programming?
-Will functional languages really take over?
-Are there any other paradigms rising out of obscurity?
-Will the hardware we run on change drastically? Would this affect how we program? Think quantum computers and shit (/tech/esque, but entirely related to what we do)

I think that in the future the use of multi-paradigm languages will become almost standard for writing anything. Concurrent programming is the way of the future; it is currently the only way Moore's Law will continue to be followed. However, there are elements of functional languages that make them difficult for practical use, so switching between and combining paradigms where necessary may be the way to go. We may end up with most languages looking like C++, or worse: a mess of different ways of achieving the same goal (I still love me some C++).

As such I don't see a massive shift to functional languages taking place just yet, though I do think we will see the incorporation of functional features into OO and procedural languages. Still, I feel the need for functional languages is inevitable: we are moving ever closer to a world where we have to use data from many different sources concurrently, and functional languages simplify this situation greatly.

I actually haven't learned a functional language yet, but am probably going to try to pick up Haskell and F# later in the year. Most of the information I've got on the functional style comes from http://www.defmacro.org/ramblings/fp.html

Given that all computers are Turing-complete, I don't think we'll make any changes to our actual programming style. High level languages will remain almost exactly the same, save maybe an idea or two. But I do think that the future of hardware could have a gigantic effect on cryptography. I haven't read much about this, but I am very interested in the effect it could have.

Name: Anonymous 2011-07-15 18:53

>>38
>>36 what
VALID SEPPLESOCKS LAMBDA EXPRESSION

Name: Anonymous 2011-07-15 21:38

the future of programming is python in every area some assembly language isn't

Name: Anonymous 2011-07-16 2:32

>>42
Then why am I learning java?

Name: Anonymous 2011-07-16 2:33

>>43
Because you're as gay as >>42

Name: Anonymous 2011-07-16 3:48

>>42
python is shitty. shit shit shitty.

Name: Anonymous 2011-07-16 3:50

>I don't see a massive shift to functional languages taking place just yet,
It won't ever happen. I think C++ will be hybridized with features of these langs and regain the hipsters who left for java/python/ruby/etc

Name: Anonymous 2011-07-16 3:57

>>46
THAT won't ever happen. C++ is in its final years. Besides being horribly designed, it suffers from Common Lisp's problem: the language just keeps growing and growing, and simply can't be learned in six months, much less a weekend.

Name: Anonymous 2011-07-16 3:59

>>46
C++ needs a huge syntax overhaul to be useful. Its too complex to parse.

Name: Anonymous 2011-07-16 4:03

>>48
YOU MENA IT'S

Name: Anonymous 2011-07-16 4:32

Haskell and Scheme are the future of programming languages. Maybe BitC will be as well.

Name: Anonymous 2011-07-16 4:38

>>47
What, then, is the future of Common Lisp?

Name: Anonymous 2011-07-16 4:54

>>50
>Maybe BitC will be as well.
Except when it won't. They're adopting a more ML-like syntax, losing all the benefits of a homoiconic representation. It's basically ML with some built-in ways to control structure layout and such. It even has THE FORCED COLLECTION OF GARBAGE

Name: Anonymous 2011-07-16 6:00

>>48
There's nothing wrong with context sensitive grammars.

Name: Anonymous 2011-07-16 6:03

>>53
Except everything.

Name: Anonymous 2011-07-16 6:04

>>53
go suck seven dicks at once

Name: Anonymous 2011-07-16 6:06

>>55
You have time to do it, while your code parses.

Name: Anonymous 2011-07-16 6:26

>>54,55
Only an autist with severe OCD would find context sensitive grammars to their disliking.

Name: Anonymous 2011-07-16 6:27

>>57
Sorry, but I'd like my code to parse fast.

Name: Anonymous 2011-07-16 6:43

>>57
Were alll autistes here.

Name: Anonymous 2011-07-16 10:57

Only having read >>1, my answer is:
>Will functional languages really take over?
No, but the functional paradigm is useful and it helps to use it (when it makes sense) regardless of the language you're actually using (even for imperative ones like C or assembly)
>Are there any other paradigms rising out of obscurity?
I enjoy CL's multiparadigm approach very much. Declarative, metaprogramming, functional, object oriented (and meta object oriented) and even imperative. Being able to just use what you want and having access to a variety of tools is quite nice. I'm not a huge fan of restrictions as far as the programmer's freedom is concerned, except for cases where they can greatly improve stability and clarity of the code, but this is a fine line to walk.
>Will the hardware we run on change drastically? Would this affect how we program? Think quantum computers and shit (/tech/esque, but entirely related to what we do)
Normal CPUs will remain rather common, simply because most current software is designed for them. More CPUs (more cores) will be commonplace. GPU-oriented software is also on the rise - still a niche, but useful especially for parallelizable algorithms - it's currently in use for everything from gaming to crypto to AI/machine learning to multimedia encoding/decoding acceleration and various other more niche applications which require speed.
Where true parallelization is required, FPGAs will meet the demand as usual, but you should expect much denser (although slower) alternatives in the next 5-10 years if certain research projects succeed - they'll mostly be used for treating your hardware as if it were software, but at a much lower cost than FPGAs today, and possibly for AI applications (maybe even some neural-network-based AGIs, if those ever succeed).

As for Moore's Law - today's CPUs are designed to be too sequential, and there is great room for parallelization (for certain applications) even once we hit the physical miniaturization plateau. It won't be that many years until the lithography-based techniques used in today's manufacturing can't scale down any further (they're already facing major challenges), and people will have to get very creative (be it finally taking the challenge of molecular nanotechnology seriously, or some sort of advanced 3D layering before attempting that). Once they take that final step, speeds will increase by a huge amount (likely one last time) and then hit a plateau, and from there on only the actual design will matter as far as speed is concerned.

As for quantum computing, its success depends a lot on whatever the true laws of this particular universe that we live in are (it's not enough for there to be "quantum mechanics" - the actual underlying implementation matters, depending on which "interpretation" is true, we may see differing results as far as quantum computing is concerned). It's too early to tell if it'll truly be a success. If it does succeed, certain search problems will become much faster to solve than on classical computers, but no, it's not going to make NP-hard stuff suddenly fast.

>But I do think that the future of hardware could have a gigantic effect on cryptography.
PKI and various asymmetric crypto may become unusable. Symmetric crypto will probably be affected much less. There are certain ways to work around these problems, but that would be a discussion for another time.

The future is what you make it.
Certain trends may sweep the world you live in, but it is always the programmer (unless working for employers with specific language/hardware requirements) that decides what solves their problems best.

Name: Anonymous 2011-07-16 11:16

>>51
It will slowly die and be replaced by something more streamlined like Arc (except not Arc.)

Name: Anonymous 2011-07-16 12:33

right tool for the right job

Name: Anonymous 2011-07-16 15:06

>>60
HEY FUCKFACE
CPU'S ARE SEQUENTIAL BECAUSE I/O IS SEQUENTIAL
ALSO, ZISC

Name: Anonymous 2011-07-16 15:34

I'm a professional codder

Name: Anonymous 2011-07-16 17:13

>>60
>PKI and various asymmetric crypto may become unusable. Symmetric crypto will probably be affected much less. There are certain ways to work around these problems, but that would be a discussion for another time.
I've always wondered: are there any asymmetric cryptosystems that aren't based on the DH assumption or on factoring large semiprimes?

Name: Anonymous 2011-07-16 17:17

http://en.wikipedia.org/wiki/Amdahl's_law

If you consider its implications, we are heading towards giant FPGAs or something similar.
Maximizing the parallelization of code will be the only performance consideration. Another stupidity is the fact that 64-bit numbers are overkill for almost any application, yet they are stuffed down our throats.

Name: Anonymous 2011-07-16 17:48

>>63
That makes no sense when it comes to programs being either network-bound or CPU-bound.

Name: Anonymous 2011-07-16 19:04

>>66
64 bits is nice for bit flags and moving memory around. I also wrote a 64-bit FNV-1a hash which is 15% faster than the built-in CRC32 instructions on SSE 4.2 CPUs like the Core i7.

Name: Anonymous 2011-07-16 19:34

>>68
Yes it is.
But for things like that you'd gain even more of an advantage if you could couple two or more processors to handle the upper and lower bits individually, or access memory in a similar fashion. This cannot be done with current architectures, of course.
But in the future we might have one processor/logic entity available _just_ to handle one pin.

The only problem I see is that nobody will be around to handle that much low-level stuff.

Name: Anonymous 2011-07-16 19:50

>>69
That sounds retarded. ALU pipelines already process machine words in parallel. Why would having a core per pin with its own instruction decoder and pipeline be better? That just sounds completely fucking backwards. I'll dub your architecture SISB or Single-Instruction Single-Bit. That's just wrong.

Modern CPUs usually have SIMD pipelines. We're heading towards SIMT and MIMD architectures for general purpose CPUs. SIMT and MIMD are already used on GPUs.

With SIMT/MIMD you have a single instruction decoder driving multiple pipelines, where each pipeline is often 128, 256, 512, or even 1024 bits wide (for those 4x4 32-bit floating point matrix multiplies, awww yeah).

Keeping each pipeline as wide as possible maximizes overall throughput and makes the best use of transistors.

Name: Anonymous 2011-07-16 20:35

FIBONACCI BUTT SORT

Name: Anonymous 2011-07-16 22:17

>>70
Well, one pin might not be practical and was just an example of where the trend might lead us. I have nfi what something like that might look like or whether it would work, but there would be some number of bits practical for a particular instruction set / number of processors / layout.

If processors are equipped with a hardware stack and very few single-cycle instructions, and are placed within a rectangular grid, you don't need no fuckin' pipelining, at least not in the traditional sense.

Memory access will be an issue only in the transition time, later it will be on the chip.

How exactly is MIMD used in GPUs? I found no documentation of that.

Name: Anonymous 2011-07-16 23:24

I WROTE AN ENCRYPTION ALGORITHM IN 2 SECONDS
IT'S CALLED "SHORT TERM MEMORY LOSS"
100% PERFECT

Name: Anonymous 2011-07-17 0:29

>>73
How do you know?

Name: Anonymous 2011-07-17 1:53

>>29

There's a big difference between solving hardware problems and solving people problems, and an abstraction can ALWAYS be built between the two. Of course the abstraction will leak and it won't be the most efficient thing possible. At that point you have to gauge how much patience you have for things like setting compiler flags. I have patience for that AFTER I've profiled, and not before, and I refuse to let it affect my design until after I've profiled.

A language with semantics designed for THIS attitude is the one that should win.


That's why ParrotVM is such a good idea.

>Functional concepts
>Parrot has rich support for several features of functional programming including closures and continuations, both of which can be particularly difficult to implement correctly and portably, especially in conjunction with exception handling and threading. Implementing solutions to these problems at the virtual machine level prevents repeated efforts to solve these problems in the individual client languages.


http://en.wikipedia.org/wiki/Parrot_virtual_machine#Functional_concepts

Name: Anonymous 2011-07-17 2:16

>>75
What the fuck is this.
Also, does Parrot really support continuations?

Name: Anonymous 2011-07-17 2:38

>>75
Parrot is pretty cool.

Name: Anonymous 2011-07-17 5:14

>>60
>Will functional languages really take over?
>No, but the functional paradigm is useful and it helps to use it (when it makes sense) regardless of the language you're actually using (even for imperative ones like C or assembly)
I don't know why people think you need a functional language to write functional code. Some things are simply better expressed functionally, even coding in C, and the result can often be faster, smaller, simpler and easier to understand. I hate how much scaffolding people write to wrap simple algorithms into mutable objects in C++/Java. Drives me crazy.

Name: Anonymous 2011-07-17 6:34

>>78
Because they make it insanely hard.

Name: Anonymous 2011-07-17 9:30

>>76
The first half reads like Joel on Software.

As for the Parrot VM, it has surprisingly good feature coverage. It's a much better choice than you might think for your dynamic language.
