when I have them all in one directory, GNU tools choke on my RAM
so what is /comp/'s way to deal with a Wikipedia dump of 15 million files?
Name:
Anonymous2008-02-07 13:18
xargs
Name:
Anonymous2008-02-07 13:26
xargs expects filenames on stdin, so you have to pipe them in via find or ls, but ls sorts the whole listing in memory first, so 15 million names is too much for it.
I use perl:
opendir my $dh, "bla" or die "opendir: $!";
while (defined(my $file = readdir $dh)) { ... }  # do some stuff with $file
closedir $dh;
did about 1 million files per minute
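if you do still want the pipe approach, use find as the producer instead of ls: find prints each name as it reads the directory, so the pipe only ever holds a small buffer, never all 15 million names at once. A sketch (the path and the wc -c command are just placeholders, not OP's actual setup):

```shell
# find streams entries one at a time; xargs -0 batches them into
# argv-sized chunks, so neither end buffers the whole listing.
# /path/to/dump and "wc -c" are placeholders for illustration.
find /path/to/dump -maxdepth 1 -type f -print0 \
  | xargs -0 -n 1000 wc -c
```

-print0/-0 keeps filenames with spaces or newlines intact, and -n 1000 caps how many names land in each invocation.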
Name:
Anonymous2008-02-07 13:29
xargs won't help - ls still reserves RAM for the whole listing -_-
Name:
Anonymous2008-02-08 2:50
>>1
I think you made a typo... you said GNU and I think you meant GUN. A GUN is a type of ranged firearm used to hunt and kill. Unless you were talking about GNUs which are a type of bird or something
Wildebeest eh? I knew I should have googled GNU first before I wrote >>5
Name:
Anonymous2008-02-10 1:06
>>11
GNU is an acronym that stands for GNU's Not Unix.
Name:
Anonymous2008-02-10 1:36
>>15
stop trolling you asshole. that doesn't even make sense.
Name:
Anonymous2008-02-10 5:37
>>15
That has to be the most moronic troll I've ever seen. What the fuck.
Name:
Anonymous2008-02-10 5:46
>>16,17
HAHAHA. Troll? Try the truth. Try reading this transcript by the creator of GNU: http://www.gnu.org/philosophy/stallman-kth.html
If you don't want to bother, search for the section that says: ``And that word is of course GNU, which stands for `Gnu's Not Unix'.''
Its name is a recursive acronym for GNU's Not Unix; it was chosen because its design is Unix-like, but differs from Unix by being free software and by not containing any Unix code.
Name:
Anonymous2008-02-10 10:47
>>22
Only the most elite of trolls vandalize Wikipedia articles to support their troll posts. Kudos.