Because pseudocode, like much of mathematical notation, is not precise enough. As humans we can infer the missing implied assumptions in pseudocode, but computers have a hard time with this.
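As a hypothetical illustration (the example is mine, not from the thread): the pseudocode instruction "sort the items and return the median" smuggles in several assumptions a human fills in automatically but a computer cannot. A Python version is forced to spell each one out:

```python
def median(items):
    # Assumption 1: the input is non-empty (the pseudocode never says).
    if not items:
        raise ValueError("median of empty sequence is undefined")
    # Assumption 2: "sort" means ascending order on comparable elements.
    ordered = sorted(items)
    n = len(ordered)
    mid = n // 2
    # Assumption 3: for an even count, "the median" means the mean of
    # the two middle values -- a convention, not something the
    # pseudocode states.
    if n % 2 == 1:
        return ordered[mid]
    return (ordered[mid - 1] + ordered[mid]) / 2

print(median([3, 1, 2]))     # -> 2
print(median([4, 1, 3, 2]))  # -> 2.5
```

Every comment marks a decision the pseudocode left implicit; a real interpreter would have to guess at each one.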
Name:
Anonymous2011-11-23 18:55
Pseudocode is a lot easier to read and write.
Yeah, well, that's just, like, your opinion, man. Also, see >>2.
Name:
Anonymous2011-11-24 1:21
Python is a failed attempt. The Pseudocode Programming Language would strictly adhere to the pseudocode standard!