Why do many large integer libraries store their data as a string of digit characters instead of a block of integers native to the system? It seems to me that this would make bitwise operations rather difficult to implement.
I don't know how many "many" is, but most of the implementations I've seen store bignums as packed arrays of native integer "limbs". Most C-based libraries do accept string input, since C has no native bignum support and no way to extend the compiler so it can read one in as a static constant. In languages with native bignum support (for example Common Lisp), bignums are stored and entered in saner ways: packed limb arrays internally, ordinary numeric literals in source.