
/prog/ Challenge

Name: Anonymous 2007-09-04 15:06 ID:bG0CVxDp

Hey, I've stolen this challenge from some website. It's pretty easy, but I know most of you will fuck it up.

A sequence is defined by:

n -> n/2 (n is even)
n -> 3n + 1 (n is odd)

Find the starting number under one million that produces the longest sequence.

Bonus points if you find the solution for ten million.
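A brute-force sketch of the challenge in Python (the thread doesn't fix a language, so this is my choice; `collatz_length` and `longest_start` are names invented here):

```python
def collatz_length(n):
    """Number of terms in the sequence starting at n, counting n and the final 1."""
    length = 1
    while n != 1:
        n = n // 2 if n % 2 == 0 else 3 * n + 1
        length += 1
    return length


def longest_start(limit):
    """Starting number under `limit` with the longest sequence, and that length."""
    best = (0, 0)  # (length, start); max() compares length first
    for start in range(1, limit):
        best = max(best, (collatz_length(start), start))
    return best[1], best[0]
```

With no caching at all this is slow for a million starts, but it states the problem precisely, e.g. `longest_start(100)` gives `(97, 119)`.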

Name: Anonymous 2007-09-05 5:19 ID:7QgTXC0R

1. You don't have to test a number if it's a member of a sequence you've already tested.

2. When a sequence reaches a power of two 2^k, exactly k+1 terms remain (2^k, 2^(k-1), ..., 2, 1), which is the number of bits in its binary representation.

3. An odd power of two, 2^k with k odd, can only be reached by halving 2^(k+1), since 3n + 1 = 2^k has no integer solution n when k is odd (2^k ≡ 2 mod 3, so 2^k - 1 isn't divisible by 3). Hence odd powers of two don't need to be tested as starting values; we can just set the initial known maximum sequence length to 24.

I think we'll all agree these are pretty trivial points; what bothers me is that no one even tried to implement them (except point 1).
