Hardware has gotten to the point where decoding a CISC instruction set into RISC-like micro-ops isn't really much of a bottleneck, as far as desktop computing is concerned.
With embedded systems, RISC is the obvious answer.
/thread.
Name:
Anonymous2008-04-13 1:12
With three instructions, we only need 2 bits to represent every instruction. That's a significant saving compared to 32-bit instructions.
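Quick sanity check (my own Python sketch, not from the thread):

```python
import math

# Number of distinct instructions we want to encode.
instructions = 3

# Minimum whole number of bits needed: ceil(log2(3)) = 2.
bits_needed = math.ceil(math.log2(instructions))
print(bits_needed)  # 2

# A 2-bit field can distinguish 2**2 = 4 patterns,
# so 3 instructions fit with one encoding left over.
print(2 ** bits_needed)  # 4
```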
Name:
Anonymous2008-04-13 2:11
>>7
But wait, with two bits we can represent 4 commands. Should we --
1. Define a fourth command, nop.
2. Not define a fourth command just to piss people off.
3. Switch to a non-binary system such that one bit can represent three possible values (or a system where n bits represents exactly three values).
4. Cut our losses and go back to reading SICP.
Name:
Anonymous2008-04-13 2:19
>3. Switch to a non-binary system such that one bit can represent three possible values
This is very expensive with all our investments in binary digital logic.
Name:
Anonymous2008-04-13 2:47
>>8
Actually, it's 2 bits that should represent 3 values, not one bit.
We need a system where 1 bit represents 1.5 values
Name:
Anonymous2008-04-13 2:48
Shannon knows the answer
Name:
Anonymous2008-04-13 3:23
>>10
1.5 values wouldn't be enough. With 2 bits we'd only get 1.5² = 2.25 possible values.
One bit needs to represent at least √3 ≈ 1.7321 values, so that two bits give 1.7321² ≈ 3.
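Checking the arithmetic (my own Python sketch):

```python
import math

# With b "values per bit", two bits can distinguish b**2 combinations.
# 1.5 values per bit falls short of 3 instructions:
print(1.5 ** 2)  # 2.25

# To fit exactly 3 values in 2 bits we need b = sqrt(3):
b = math.sqrt(3)
print(round(b, 4))  # 1.7321

# Equivalently, in Shannon's terms one of 3 equiprobable
# instructions carries log2(3) bits of information:
print(round(math.log2(3), 4))  # 1.585
```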
CISC is pointless now, seeing as most code is compiled, and CISC was meant for hand-written assembly.
RISC is designed to be fast and easy to implement in hardware, with an instruction set suited to compiled code.