I have 2 favorite ones (I couldn't decide to go for just one of them):
- Skip Lists
- Dancing Links
Name:
Anonymous2008-03-29 7:04
λx.λy.λf.(f x) y
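That term is the Church encoding of a pair: it closes over x and y and waits for a selector f. A direct transcription into Python closures (a sketch; the names `pair`, `fst`, `snd` are mine, not part of the term):

```python
# λx.λy.λf.(f x) y  -- build a pair by delaying the choice of selector
pair = lambda x: lambda y: lambda f: f(x)(y)

# selectors: pass in a function that keeps the first or second component
fst = lambda p: p(lambda x: lambda y: x)
snd = lambda p: p(lambda x: lambda y: y)
```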
Name:
Anonymous2008-03-29 7:06
.....what type of dance?
Name:
Anonymous2008-03-29 7:09
a tree, a heap tree
Name:
Anonymous2008-03-29 7:18
rb-trees. They are, so to say, the bomb.
Name:
Anonymous2008-03-29 7:29
Lists. Haskell lists.
Name:
Anonymous2008-03-29 7:45
oct trees
Name:
Anonymous2008-03-29 7:48
faggot trees
Name:
Anonymous2008-03-29 8:08
>>3
The name Dancing Links comes from the way the algorithm works, as iterations of the algorithm cause the links to "dance" with partner links so as to resemble an "exquisitely choreographed dance."
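The dance is just pointer surgery on a circular doubly linked list: a removed node keeps its own left/right pointers, so putting it back is O(1). A minimal sketch (class and function names are mine):

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.left = self.right = self  # circular, initially self-linked

def insert_after(a, b):
    """Link node b immediately to the right of node a."""
    b.left, b.right = a, a.right
    a.right.left = b
    a.right = b

def unlink(x):
    # The "dance" step: the neighbours bypass x, but x keeps its own
    # left/right pointers, remembering exactly where it used to be.
    x.left.right = x.right
    x.right.left = x.left

def relink(x):
    # Undoing the step is O(1): x's preserved pointers slot it back in.
    x.left.right = x
    x.right.left = x

def to_list(head):
    """Collect values clockwise from a header node."""
    out, x = [], head.right
    while x is not head:
        out.append(x.value)
        x = x.right
    return out
```

This unlink/relink pair is the primitive that backtracking in Algorithm X leans on: undoing a whole chain of removals is just relinking in reverse order.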
Name:
Anonymous2008-03-29 8:10
>>4
heaps are fine, cause they're vectors, therefore very fast
>>5
trees suck
skip lists are easier to understand, easier to implement, and proved to be better than any tree.
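To back up the "easier to implement" claim, here is a minimal skip list with search and insert (level cap, p = 0.5, and all names are my own choices, not from the thread):

```python
import random

MAX_LEVEL = 16  # enough for ~2^16 elements at p = 0.5
P = 0.5

class SkipNode:
    def __init__(self, key, level):
        self.key = key
        self.forward = [None] * level  # forward[i] = next node at level i

class SkipList:
    def __init__(self):
        self.head = SkipNode(None, MAX_LEVEL)  # sentinel header
        self.level = 1

    def _random_level(self):
        # flip coins: each extra level appears with probability P
        lvl = 1
        while random.random() < P and lvl < MAX_LEVEL:
            lvl += 1
        return lvl

    def search(self, key):
        x = self.head
        # drop down a level whenever the next node overshoots the key
        for i in range(self.level - 1, -1, -1):
            while x.forward[i] and x.forward[i].key < key:
                x = x.forward[i]
        x = x.forward[0]
        return x is not None and x.key == key

    def insert(self, key):
        update = [self.head] * MAX_LEVEL  # last node before key, per level
        x = self.head
        for i in range(self.level - 1, -1, -1):
            while x.forward[i] and x.forward[i].key < key:
                x = x.forward[i]
            update[i] = x
        lvl = self._random_level()
        self.level = max(self.level, lvl)
        node = SkipNode(key, lvl)
        for i in range(lvl):  # splice in at every level the node reaches
            node.forward[i] = update[i].forward[i]
            update[i].forward[i] = node
```

Search and insert are both expected O(log n); the randomness replaces all the rebalancing logic a tree would need.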
Name:
Anonymous2008-03-29 8:25
cons cell
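A cons cell is just an ordered pair with a head and a tail; a minimal Python rendering using tuples as cells (the Lisp names `cons`/`car`/`cdr` are the only thing taken as given here):

```python
def cons(head, tail):
    """Build a cell pairing a value with the rest of the list."""
    return (head, tail)

def car(cell):
    """First element of a cell."""
    return cell[0]

def cdr(cell):
    """Rest of the list (the tail cell, or None for nil)."""
    return cell[1]

# the list (1 2 3) as chained cons cells, with nil spelled None
lst = cons(1, cons(2, cons(3, None)))
```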
Name:
Anonymous2008-03-29 11:03
>>10 Splay Trees are far better than Skip Lists and way easier to implement. Plus, the truly great operating systems implement them.
Coming in second place, the Priority Queue is my next favourite data structure.
>>12
"A splay tree is a self-balancing binary search tree with the additional unusual property that recently accessed elements are quick to access again. It performs basic operations such as insertion, look-up and removal in O(log(n)) amortized time."
skip lists also perform all that in O(log(n))
so what's better about splay trees? I'm still not convinced
Name:
Anonymous2008-03-29 19:30
quad trees, followed by treaps (tree-heaps). can you name any other data structure which relies upon a PRNG? nope
Name:
Anonymous2008-03-29 19:36
Someone should invent the awesomest data structure ever and name it "Knuth"
Name:
Anonymous2008-03-29 19:42
The one that proves P=NP
Name:
Anonymous2008-03-29 22:00
>>20
Splay trees don't require PRNG. PRNG takes up precious CPU cycles. Plus, the splaying operation on a tree is the most beautiful thing in computer science.
>>24
i try to get my friends (non-computer scientists) to understand the beauty of these things and they just don't understand ;_;
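For anyone who hasn't seen the splaying operation: it rotates an accessed node all the way to the root via zig, zig-zig, and zig-zag steps. A minimal recursive bottom-up sketch, assuming a bare BST node type (all names are mine, and a plain BST insert is included just to build test trees):

```python
class Node:
    def __init__(self, key):
        self.key = key
        self.left = self.right = None

def rotate_right(y):
    x = y.left
    y.left, x.right = x.right, y
    return x

def rotate_left(x):
    y = x.right
    x.right, y.left = y.left, x
    return y

def splay(root, key):
    """Bring key (or the last node on its search path) to the root."""
    if root is None or root.key == key:
        return root
    if key < root.key:
        if root.left is None:
            return root
        if key < root.left.key:                 # zig-zig (left-left)
            root.left.left = splay(root.left.left, key)
            root = rotate_right(root)
        elif key > root.left.key:               # zig-zag (left-right)
            root.left.right = splay(root.left.right, key)
            if root.left.right:
                root.left = rotate_left(root.left)
        return rotate_right(root) if root.left else root
    else:
        if root.right is None:
            return root
        if key > root.right.key:                # zig-zig (right-right)
            root.right.right = splay(root.right.right, key)
            root = rotate_left(root)
        elif key < root.right.key:              # zig-zag (right-left)
            root.right.left = splay(root.right.left, key)
            if root.right.left:
                root.right = rotate_right(root.right)
        return rotate_left(root) if root.right else root

def bst_insert(root, key):
    """Plain BST insert, no splaying, just to build a tree."""
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = bst_insert(root.left, key)
    elif key > root.key:
        root.right = bst_insert(root.right, key)
    return root
```

After a splay the accessed key sits at the root, which is what makes repeated access to recent elements cheap.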
Name:
Anonymous2008-03-29 23:06
also, yeah, gotta agree with >>5
being able to do everything in guaranteed O(log(n)) is rockin. its design is really brilliant, and the various proofs associated with it are pretty.
>>27
sorry i don't recognize any data structures invented after 1980
Name:
Anonymous2008-03-30 0:22
>>26
AVL trees are simpler and the proofs are quite similar to rb. But AVL trees have much faster searches since the tree height is bounded by 1.44*log(n). AVL FTW.
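The 1.44*log(n) bound falls out of the minimal-size recurrence: the smallest AVL tree of height h has N(h) = N(h-1) + N(h-2) + 1 nodes (Fibonacci-like growth). A quick numeric check of the bound (a sketch, not a proof; the function name is mine):

```python
import math

def min_nodes(h):
    """Fewest nodes an AVL tree of height h can have.

    N(1) = 1, N(2) = 2, N(h) = N(h-1) + N(h-2) + 1.
    """
    a, b = 1, 2
    for _ in range(h - 1):
        a, b = b, a + b + 1
    return a

# the height of an n-node AVL tree never exceeds ~1.44 * log2(n)
for h in range(2, 30):
    n = min_nodes(h)
    assert h <= 1.4405 * math.log2(n + 2)
```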
Name:
Anonymous2008-03-30 2:07
Hash treaps
Name:
Anonymous2008-03-30 2:15
No love for the simple array? :(
O(1) lookup, fuckin' A this shit is awesome. enjoy your O(log n) fags
Name:
Anonymous2008-03-30 2:17
Bisexual Hash Table
Name:
Anonymous2008-03-30 2:59
What, no love for Bloom Filters?
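For the record: a Bloom filter trades a small false-positive rate for a tiny, fixed-size bit array, and can never give a false negative. A minimal sketch (the bit-array size, hash count, and the trick of slicing one sha256 digest into k indexes are all arbitrary choices of mine):

```python
import hashlib

class BloomFilter:
    def __init__(self, size_bits=1024, num_hashes=4):
        self.size = size_bits
        self.k = num_hashes
        self.bits = bytearray(size_bits // 8)

    def _indexes(self, item):
        # derive k indexes by slicing one sha256 digest into 4-byte chunks
        digest = hashlib.sha256(item.encode()).digest()
        for i in range(self.k):
            chunk = int.from_bytes(digest[4 * i:4 * i + 4], "big")
            yield chunk % self.size

    def add(self, item):
        for idx in self._indexes(item):
            self.bits[idx // 8] |= 1 << (idx % 8)

    def might_contain(self, item):
        # False means definitely absent; True may be a false positive
        return all(self.bits[idx // 8] & (1 << (idx % 8))
                   for idx in self._indexes(item))
```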
Name:
Anonymous2008-03-30 7:53
>>29
With an AVL tree a delete can trigger O(log n) rotations, but with a rb-tree every update needs only O(1) rotations.
Name:
Anonymous2008-03-30 10:31
>>34
If n is 2, then O(log n) is O(1). If n is 1, then it is O(0). You can't beat O(0). The gist of it is, keep your tree smaller than 2 nodes, and you will be alright.
>>37
Big-O is about order of growth of an algorithm. Whether an algorithm is run or not in your program is irrelevant to the order of growth of that algorithm. O(0) is meaningless. Read SICP.