Why do programmers still hard-code arbitrary limits like these?
It breaks the moment someone does #define LINE_MAX UINT_MAX
Why not redesign so that you use an O(1)-space algorithm, or, if that can't be avoided, dynamically allocate as much as needed?
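The "dynamically allocate as much as needed" approach is essentially what POSIX getline() already does: it grows the buffer to fit the line, however long it gets. A minimal sketch under that assumption (count_lines and the filename are hypothetical, not from this thread):

```c
#define _POSIX_C_SOURCE 200809L
#include <stdio.h>
#include <stdlib.h>

/* Hypothetical helper: count lines of any length; -1 if the file
   can't be opened. getline() allocates and regrows `line` as needed. */
long count_lines(const char *path)
{
    FILE *file = fopen(path, "r");
    char *line = NULL;
    size_t cap = 0;
    long n = 0;

    if (!file)
        return -1;
    while (getline(&line, &cap, file) != -1)
        n++;            /* `line` held the whole line, however long */
    free(line);
    fclose(file);
    return n;
}
```

No fixed limit anywhere: a 10 MB "line" just costs a 10 MB heap buffer instead of a truncated read.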
Name: Anonymous : 2008-08-26 14:33
Statically allocated buffers are instantly available. On most architectures I know of, the compiler generates no implicit instructions to initialize them, and the OS never even gets involved unless you happen to need more pages for your stack or something. It's a fast, easy way to get a reasonably sized block of memory inside a function:

    #include <stdio.h>
    #include <string.h>

    char buf[4096];
    unsigned int line_no;
    FILE *file = fopen("foo", "r");

    if (file) {
        for (line_no = 0; fgets(buf, sizeof buf, file); line_no++) {
            size_t len = strlen(buf);
            if (len > 0 && buf[len - 1] == '\n') {
                /* got a complete line -- handle it */
            }
            /* ... */
        }
        fclose(file);
    }
I claim that if your text file has 'lines' over 4096 bytes, maybe it's time to think of a better way to parse it. Perhaps you should read and handle it X bytes at a time rather than line by line, or figure out how long your lines can actually get before reading it.
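That "X bytes at a time" idea is just fread() with a fixed buffer; line length stops mattering entirely. A sketch of what I mean (process_chunk and the 4096-byte chunk size are illustrative assumptions, not part of the post above):

```c
#include <stdio.h>

/* Illustrative per-chunk work: here, just count newlines. */
static unsigned long process_chunk(const char *chunk, size_t n)
{
    unsigned long newlines = 0;
    size_t i;
    for (i = 0; i < n; i++)
        if (chunk[i] == '\n')
            newlines++;
    return newlines;
}

/* Read a file in fixed-size chunks; works the same whether a
   "line" is 10 bytes or 10 megabytes. */
unsigned long scan_file(const char *path)
{
    FILE *file = fopen(path, "r");
    char chunk[4096];
    size_t n;
    unsigned long total = 0;

    if (!file)
        return 0;
    while ((n = fread(chunk, 1, sizeof chunk, file)) > 0)
        total += process_chunk(chunk, n);
    fclose(file);
    return total;
}
```

The cost is that your per-chunk logic has to cope with records straddling chunk boundaries, which is exactly the bookkeeping fgets() was hiding from you.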