Can someone help me find this article?

Name: Anonymous 2009-06-25 19:08

My school only has access to the online archives for this journal back to '97. Are any of you guys able to download articles from '95? If someone could download this article and put it on megaupload or something for me, I'll totally suck your cock dry. >_>

L. Bertrand. Computing a hyperelliptic integral using arithmetic in the
Jacobian of the curve. Applicable Algebra in Engineering, Communication
and Computing, 6:275–298, 1995.


http://www.springerlink.com/content/m786486452285575/

Name: Anonymous 2009-06-25 19:20

Oh God I hate that shit. You Google some interesting stuff and a ton of fucking paysites appear and of course Google's cache is blocked.

Do they pay Google or something? They should be marked as advertisements. If they don't pay, then they should be banned outright, since they're in violation of the indexing policies (serving different content to Google than to normal visitors).

Why doesn't somebody with access to such articles get them all and make a mega-torrent or something? I tried making requests like the ones the Google indexer makes, but it didn't work, which raises my suspicion that they're working together. Fucking scum, profiting from other people's papers. The scientific publishing community is a piece of shit. IEEE is the worst; I hope their fucking papers catch fire and they die a horrible death.

Name: Anonymous 2009-06-25 19:21

>>2
IHBT.

Name: Anonymous 2009-06-25 19:41

>>2
A torrent of every article from every journal ever??

Name: Anonymous 2009-06-25 19:44

>>4
What size do you estimate? About 500GB or 1TB should be very doable (there are torrents of that order of magnitude in the wild). Keep in mind that if it's packaged sensibly, you'd only have to download the small fraction that interests you.

Name: Anonymous 2009-06-25 20:03

>>5
Definitely much more than 1TB. Should be on the petabyte scale.

Name: Anonymous 2009-06-25 20:06

>>6
PⅇTA

Name: Anonymous 2009-06-25 20:12

>>6
So, 10 billion articles at 100KB each? I don't think so.

10 million at 100KB average sounds rather reasonable.

In any case, I'd settle for the computer- and math-related ones, if that makes it better.

It'd be interesting to know how their arrangement with the search engines works.
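The arithmetic behind these two estimates checks out; a quick sketch (the article counts and the 100KB average are this thread's assumptions, not measured figures):

```python
# Back-of-envelope size estimates for the hypothetical mega-torrent.
# The article counts and the 100 KB average article size are the
# thread's assumptions, not real figures.

KB = 1000
ARTICLE_SIZE = 100 * KB  # assumed average size of one article

def total_size_tb(num_articles, avg_bytes=ARTICLE_SIZE):
    """Total collection size in terabytes."""
    return num_articles * avg_bytes / 1000**4

# >>6's petabyte scale would need ~10 billion articles:
print(total_size_tb(10_000_000_000))  # 1000.0 TB = 1 PB
# 10 million articles, as estimated above:
print(total_size_tb(10_000_000))      # 1.0 TB
```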

Name: Anonymous 2009-06-25 21:00

Ignoring the size issue for the moment, you'd have to first get a list of every journal ever published (not a trivial task by itself), break into the online archives of the ones that even have such things, and then track down copies of all the ones that don't have electronic versions and scan them all in.

I think it might actually be easier to create a torrent with every single music album ever produced or every single movie ever made.

Name: Anonymous 2009-06-26 3:27

>>9
But torrenting what is essentially the collective knowledge of humans is many times more awesome.

Name: Anonymous 2009-06-26 6:08

>>8
With infinite compression this is all moot

Name: Anonymous 2009-06-26 6:31

>>11
Once FrozenVoid releases his decompressor, all we have to do is feed it a zero-byte file and then extract whatever articles we want.

Name: Anonymous 2009-06-26 7:16

>>6,8
For what it's worth, the Library of Congress' total collection is estimated to be in the range of 10 TB, and the collected contents of all US academic research libraries is estimated at 2 PB.

Even very basic compression will vastly reduce that size, though.
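For a rough sense of why, even stock zlib shrinks redundant text substantially (the sample below is artificially repetitive and compresses far better than real prose or PDFs would; it's only an illustration):

```python
import zlib

# An artificially repetitive sample; real article corpora (prose,
# figures, PDF overhead) compress far less than this, but plain
# English text still typically shrinks several-fold.
sample = b"the quick brown fox jumps over the lazy dog. " * 1000
compressed = zlib.compress(sample, level=6)
ratio = len(sample) / len(compressed)
print(f"{len(sample)} -> {len(compressed)} bytes ({ratio:.1f}x)")
```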

Name: FrozenVoid 2009-06-26 7:22

>>12 actually it expects a 1024-bit float with the parameters in the filename.


______________________________________________
http://xs135.xs.to/xs135/09042/av922.jpg
orbis terrarum delenda est

Name: Anonymous 2009-06-26 7:24

>>13
No doubt FV will get it onto a Floppy ;)
