
Project Euler problem 18/67 solver in Haskell

Name: Anonymous 2009-10-22 19:51


maximumPath :: [[Int]] -> Int
maximumPath tl = head $ foldl (flip reduce) (head rtl) (tail rtl)
  where rtl = reverse tl
        -- fold the triangle bottom-up: each pass collapses the
        -- accumulator row into the (shorter) row above it, keeping
        -- the larger of the two children under each element
        reduce [] b = []
        reduce a [] = []
        reduce (a:as) (b:c:bs) = (a + max b c) : reduce as (c:bs)
        reduce (a:as) [b] = [a + b]


To solve 67 (which is the more interesting one),


trif <- readFile "triangle.txt"
let answer = maximumPath $ map (map read . words) (lines trif)
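Stitched together, this is a self-contained program. The 4-row example triangle from problem 18's statement is hard-coded as a sanity check (its maximum path is 23); for problem 67 you'd read triangle.txt instead, as above.

```haskell
-- Self-contained sketch: maximumPath reproduced from the post above,
-- plus a main that checks it against the example triangle.
maximumPath :: [[Int]] -> Int
maximumPath tl = head $ foldl (flip reduce) (head rtl) (tail rtl)
  where rtl = reverse tl
        reduce [] b = []
        reduce a [] = []
        reduce (a:as) (b:c:bs) = (a + max b c) : reduce as (c:bs)
        reduce (a:as) [b] = [a + b]

-- the small triangle printed in problem 18's statement
example :: [[Int]]
example = [[3], [7,4], [2,4,6], [8,5,9,3]]

main :: IO ()
main = do
  print (maximumPath example)   -- prints 23 (path 3 -> 7 -> 4 -> 9)
  -- for problem 67, swap in the data file:
  -- trif <- readFile "triangle.txt"
  -- print $ maximumPath $ map (map read . words) (lines trif)
```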

Name: Anonymous 2009-10-25 11:31

>>22

> If precedence confuses you, then you really are a moron.

Precedence confuses many people, actually. Which C programmer hasn't been bitten by '&' binding more loosely than '=='? And if precedence isn't confusing, why does the language have backtick syntax for treating an arbitrary binary function as an infix operator?
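For the record, that backtick escape hatch comes with explicit fixity declarations, so you never have to memorize a precedence table (avg is a made-up example, not from the thread):

```haskell
-- Backticks promote any binary function to an infix operator,
-- and a fixity declaration pins down its precedence explicitly.
avg :: Int -> Int -> Int
avg x y = (x + y) `div` 2

infixl 6 `avg`   -- same precedence level as (+); an arbitrary choice here

main :: IO ()
main = print (10 `avg` 4)   -- prints 7
```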

> That's an optimization issue.

Well in an imperative language i can mutate shit instead of praying to the deforestator. And it's not much of a side effect when mutating a fresh sequence.
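For what it's worth, Haskell does let you "mutate a fresh sequence" without the effect leaking out, via the ST monad. A minimal sketch, assuming the array package that ships with GHC (bumpAll is an invented example):

```haskell
import Control.Monad (forM_)
import Data.Array (elems)
import Data.Array.ST (newListArray, readArray, writeArray, runSTArray)

-- Destructively bump every element of a fresh array inside runSTArray;
-- from the outside this is an ordinary pure function.
bumpAll :: [Int] -> [Int]
bumpAll xs = elems $ runSTArray $ do
  arr <- newListArray (0, length xs - 1) xs
  forM_ [0 .. length xs - 1] $ \i -> do
    v <- readArray arr i
    writeArray arr i (v + 1)
  return arr

main :: IO ()
main = print (bumpAll [1, 2, 3])   -- prints [2,3,4]
```

Since the array is freshly allocated and never escapes, the type system lets the mutation stay invisible: exactly the "fresh sequence" case.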

> A FIXABLE, temporary, compiler-specific optimization issue?

Well i suppose writing a deforestator that actually works instead of hardcoded support for left/right folds, map, reduce and some other random shit has been put on hiatus until the halting problem is given a positive answer.

> Using transformers is trivial, but if you're put off by the nesting, just write yourself a new monad.
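"Write yourself a new monad" is less work than it sounds. A hand-rolled State-style monad over an Int counter fits in a dozen lines (all names here are made up for illustration):

```haskell
-- A hand-rolled State-style monad: a function from a counter
-- to a result paired with the updated counter.
newtype Counter a = Counter { runCounter :: Int -> (a, Int) }

instance Functor Counter where
  fmap f (Counter g) = Counter $ \n -> let (a, n') = g n in (f a, n')

instance Applicative Counter where
  pure a = Counter $ \n -> (a, n)
  Counter f <*> Counter g = Counter $ \n ->
    let (h, n')  = f n
        (a, n'') = g n'
    in (h a, n'')

instance Monad Counter where
  Counter g >>= k = Counter $ \n ->
    let (a, n') = g n in runCounter (k a) n'

tick :: Counter Int            -- yield the current count, then bump it
tick = Counter $ \n -> (n, n + 1)

main :: IO ()
main = print $ runCounter (do { a <- tick; b <- tick; pure (a + b) }) 0
-- prints (1,2): the two ticks yield 0 and 1, final counter is 2
```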

I once read a Haskell ebook that asserted that lazy evaluation is fucking awesome. Then it showed a program consisting of a 'main' function that was one big 'do' expression, with lots of '<-' sprinkled here and there.

The last two-thirds of the PDF showed how to appease Hindley-Milner with type declarations in arrow syntax, which is readable, not at all confusing, and elegant, since it was designed by actual mathematicians and not some John McCarthy, a political science undergrad.

Last time i tried to give a fuck, i read some internets "book" about doing shit for "great good".

So again i was treated to some random shit that didn't interest me at all. Instead of showing how to do cool shit with HM, like proving that binary trees have the correct structure, it treated me to random inanity.

Then i learned that functions can't be defined from the M-x run-haskell (ghci) prompt, only by loading a whole file. And apparently i can't query the "VM" for a function's type, because it's a "black box" and "that's the way it's supposed to be". In the fucking interpreter.

C books start with how one can reify low-level data structures to improve space complexity and locality of reference; Lisp books start with how one can smash the state with revolutionary macros; Smalltalk books (probably, never read any) start with how Squeak internals can be redefined at runtime straight from the IDE. Haskell books say the program is 9000% more correct since the /types/ of expressions are known at compile time; slightly more useful properties can be proven, this time by writing over 9000 arrows.

Oh yeah, some intense shit can be done with static typing, except it's in sequent calculus and not obsolete HM. Go figure.

Haskell's lazy evaluation is a big win and Haskell can be used to do Real Work. Except that a left-pointing arrow has to be placed before all the terrible things. Potential side effects include:

- Creating a new file (modifies file system state)
- Reading from a file (advances the pointer of the file descriptor)
- Raising an exception (doesn't return from a function)
- Printing the result on the screen
- Allocating memory (calls mmap(2)/sbrk(2) which is just not cool)
- Calling a function in a non-tail position (allocates a stack frame)
- Evaluating any expression (advances EIP)
- Having the computer turned on (uses electricity)
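All those left-pointing arrows are just sugar for >>=, which is easy to see side by side (the IORef snippet is invented for illustration):

```haskell
import Data.IORef

-- do-notation version: arrows everywhere
sugared :: IO Int
sugared = do
  r <- newIORef (0 :: Int)
  modifyIORef r (+ 1)
  readIORef r

-- the desugaring: no arrows, just binds
desugared :: IO Int
desugared = newIORef (0 :: Int) >>= \r ->
            modifyIORef r (+ 1) >>
            readIORef r

main :: IO ()
main = do
  sugared   >>= print   -- prints 1
  desugared >>= print   -- prints 1
```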

So, as someone said, to be good at functional programming one has to create a new universe every time anything is done, to avoid mutating the state. But what if creating a new universe is a side effect as well?

Some argue that it's good to encapsulate impure shit in a ghetto, present a functional interface to the ghetto, and write the rest in functional style. But why all the fucking arrows? And that can be done in any imperative language as well. Perhaps Haskell is a language for college professors, who can judge the quality of their students' programs by the number of left-pointing arrows (as opposed to the number of bang signs, as in Scheme). Morans learn to hate their future corporate job when they grind side effects for a living, on teh JVM.
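The ghetto pattern (impure shell around a pure core) is a one-liner in practice. A sketch, with wordFreq as an invented example:

```haskell
import Data.List (group, sort)

-- Pure core: all the actual logic, no IO anywhere.
wordFreq :: String -> [(String, Int)]
wordFreq = map (\ws -> (head ws, length ws)) . group . sort . words

-- Impure shell: one thin layer of IO around the pure function.
main :: IO ()
main = mapM_ print (wordFreq "spam eggs spam")
```

Everything testable lives in wordFreq; only main touches the ghetto.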

Lazy evaluation can be implemented trivially for any language with SICP powers. Srsly. Just use a code walker to embed any function application form in a thunk. When it's time to get an actual result, compute the thunks recursively. Can even be made 'eager' with futures. Probably.
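The code-walker scheme boils down to wrapping every application in a zero-argument closure. Sketched in Haskell syntax (purely illustrative, since the host language is already lazy; in a strict Lisp, delay would have to be a macro):

```haskell
-- Manual thunks: what the code walker would wrap each application in.
newtype Thunk a = Thunk (() -> a)

delay :: a -> Thunk a        -- suspend a computation
delay x = Thunk (\() -> x)

force :: Thunk a -> a        -- compute the suspended result on demand
force (Thunk f) = f ()

main :: IO ()
main = print (force (delay (2 + 2)))   -- prints 4
```

Note this gives call-by-name, not call-by-need: real lazy evaluation also memoizes the result in a ref cell the first time force runs.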

Of course sans deforestator, which would fail anyway unless the fold/map/whatever function could be determined at compile time. When Haskellers write papers on how their deforestator can nao handle right folds, others happily mutate their shit. And when it comes to OOP, they use dataflow-oriented programming, thanks to the Excel 101 skills they possess.

Srsly, i can't troll the shit out of anyone. frozenvoid could do better. Tiem to sleep nao. Good night /b/.
