>>28
mutable string buffers can be handy. That said, in-place modification of strings for formatting isn't the most productive thing to do when you have more powerful tools at your disposal than what the C standard library gives you. But sometimes you just need a large expandable array of untyped binary data for serialization. String buffers can be good for that, provided the language gives you a way to write an object to a binary string and read it back.
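A minimal sketch of that serialization use case, in Python rather than Lisp (io.BytesIO playing the role of the expandable binary buffer, struct doing the object-to-bytes conversion):

```python
import struct
from io import BytesIO

# Write a couple of values into an expandable binary buffer.
buf = BytesIO()
buf.write(struct.pack("<I", 42))          # a 32-bit unsigned integer
name = "symbol".encode("utf-8")
buf.write(struct.pack("<H", len(name)))   # length prefix for the string
buf.write(name)                           # the raw string bytes

# Read the same values back out.
buf.seek(0)
(num,) = struct.unpack("<I", buf.read(4))
(n,) = struct.unpack("<H", buf.read(2))
text = buf.read(n).decode("utf-8")
```

The point is just that the buffer grows as you append and imposes no structure of its own; the pack/unpack calls supply the interpretation.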
Although I should have mentioned that symbols aren't just immutable, they are also unique. That is, converting a string to a symbol involves an implicit lookup in a symbol table for an already existing equivalent symbol. So if you do file I/O and represent the text lines you read as symbols, every line read carries an implicit hash table lookup, where the line from the file is interned as a symbol. With strings this doesn't need to happen, making them better for fast reading, transforming, and writing. The uniqueness of symbols does make them good for cases where they are created once and then compared to other symbols many times: because each symbol is unique, equality can be resolved by pointer equality.
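You can see both sides of that trade-off with Python's sys.intern, which behaves much like a Lisp symbol table (a sketch, not a claim that Python strings are symbols):

```python
import sys

# Two equal strings built at run time are distinct objects,
# so comparing them means a character-by-character check.
parts = ["foo", "bar"]
s1 = "-".join(parts)
s2 = "-".join(parts)
distinct = s1 is not s2          # True: equal contents, different objects

# Interning pays for one hash table lookup up front...
a = sys.intern(s1)
b = sys.intern(s2)
# ...after which equality is just an identity (pointer) comparison.
same_object = a is b             # True: both resolve to the one interned copy
```

If you intern every line of a file you pay that lookup per line; if you intern once and compare many times, the lookup amortizes away.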
It's not that hard to have both symbols and strings. If you're going to generate code at run time, symbols can help make it more efficient. But symbols aren't great for everything. Lisp/Scheme has capabilities that give symbols a niche: strings would be shitty representations of unevaluated code, and symbols are shitty when strings are faster and easier.