Write a program in your language that
1. opens a file as an arbitrary-precision integer.
2. divides the number by 3.
3. stores the binary result in another file.
Name:
Anonymous2009-04-14 13:02
int x = file.open(someFile)/3;
x.binaryWrite(anotherFile);
Define the format of the integer and its limits.
Without limiting the language, the fastest will be in hand-optimized asm. If you take a high-level language into consideration, then the results vary widely depending on the compiler and the design choices of your language.
Due to these inconsistencies, I conclude this is a troll thread.
In before best Java scalable enterprise solutions, pseudo-functional FIOC, Haskell one-liner that uses extensive amounts of $ that are really parens, and a C program that actually works.
>>6 Haskell one-liner that uses extensive amounts of $ that are really parens In after, you mean.
Additionally, $ is the operator for low-precedence function application -- it has the same effect as using parens to change evaluation order in most situations, but can be used for much, much more.
Name:
Anonymous2009-04-14 14:03
>>8
Is it wrong that I came buckets to that program?
There, that's closer to something to work with. I think I started writing SEPPLES on the last one.
Name:
Anonymous2009-04-14 14:10
in b4 JAVA
Name:
Anonymous2009-04-14 14:20
>>11
I think if people saw that, there would be no more need for Java. You still have two sections with C++, but you just have to remove the "OPENING_DEVICE" and "OPENING_DEVICE."
>>22
BigInteger has obnoxious overhead for such a trivial problem, and the default I/O buffering is NOT sufficient given that I/O will be the bottleneck in this problem.
>>30,32
Do you mean ASCII CHARACTERS? Because there's a huge fucking difference between a bloated mass of ASCII CHARACTERS and the actual, numerical, binary result. Say what you mean (or stop meaning such a retarded thing in the first place), or you have no right to be a dick when you don't get it.
Name:
Anonymous2009-04-15 10:44
This is best implemented using long division and not reading the entire file in at once. Thus it will scale nicely.
MAC OS is like BSD, only Apple charges for it. Why would I pay for something that I get for free, and without pig disgusting extras like web 2.0 GUI or screenlets?
$ gcc -c post23.c -o post23 file1.txt post23.out
post22.c:3:17: error: gmp.h: No such file or directory
post22.c: In function 'main':
post22.c:8: error: 'mpz_t' undeclared (first use in this function)
post22.c:8: error: (Each undeclared identifier is reported only once
post22.c:8: error: for each function it appears in.)
post22.c:8: error: syntax error before 'n'
post22.c:9: error: 'n' undeclared (first use in this function)
$ time ./post23
-bash: ./post23: No such file or directory
real 0m0.002s
user 0m0.000s
sys 0m0.002s
WE HAVE A WINNER
Name:
Anonymous2009-04-16 19:53
SEX
Name:
Anonymous2009-04-16 20:02
>>44
Two milliseconds to run a nonexistent program? cygwin is faster than that!
tl;dr I whine about faggoty reddit programmer types >>65
I'm weary of hearing people say 'don't optimize early...', not because it is bad advice, but because it is often their way of saying "I don't like optimizing... all optimizing is bad... it's the compiler's job... who cares, computers are always getting faster". I find this attitude unscientific and ultimately destructive: it pushes the burden of technological advance onto the Electrical/Electronic Engineers (and they do a fantastic job of it) instead of the programmers. I fear that programming is destined to splinter into different levels, with the majority on the highest levels (web apps etc.) completely ignorant of any low-level details, all the while berating compiler writers and other low-level programmer types for being too performance-oriented.
Name:
Anonymous2009-04-17 10:28
>>66
The point is to first know where the bottleneck is and then optimize the bottleneck. Don't pre-optimize code by guessing where a bottleneck might be before you have even written anything; if you don't have an intimate knowledge of the problem domain, write the program first and then profile it. The optimization strategies I currently possess came about by first writing naive code and then doing some deep analysis of what that naive code was actually doing.
>>67
This is bullshit and I'm sick of hearing this. Most modern big programs suck because of this.
If you don't care about performance, then sure, you can fire up a profiler at the last minute and then optimize some shit code that is taking 99% of the time. Feels good but the result is still crap.
I cringe every time I see an optimization in some project such as Firefox where people complain, then the programmers go in and optimize that bit, and then it runs an order of magnitude or two faster, usually also with cleaner, more readable code as a result. If everything was written like that from the beginning, the thing would be FUCKING INSTANTANEOUS. Like it should be, because two decades ago people were solving the same problem on 10MHz CPUs and it ran at good speeds.
This retardation has even started to affect video games, where most have the nerve to run at 30fps and even slow down further (while a few still strive to prove that you can do all the same shit at 60fps without ever missing a frame - CoD4 comes to mind and there are a few more). Fun fact: the last generation console that had the most games running at 60fps was the least powerful one (the PS2).
tl;dr: programmers are fucking lazy trash. Enjoy your enterprise.
Name:
Anonymous2009-04-17 13:53
>>23
mpz_inp_str is reading text numbers in base 10 not binary.
If everything was written like that from the beginning,
There would never be a shippable product.
because two decades ago people were solving the same problem in 10MHz CPUs and it ran at good speeds.
No- the first browser wasn't created until ~1991, and by and large it solved absolutely none of the problems in the modern browser market. 10MHz is also less than half of what was available at the time.
Fun fact: the last generation console that had the most games running at 60fps was the least powerful one (the PS2).
.....IHBT
Name:
Anonymous2009-04-18 4:28
No- the first browser wasn't created until ~1991, and by and large it solved absolutely none of the problems in the modern browser market. 10MHz is also less than half of what was available at the time.
gopher solved all those problems and runs just fine on my 8086.
Name:
Anonymous2009-04-18 5:09
There would never be a shippable product.
Wrong, you'd just have to wait longer.
I also believe that optimization should be an integral part of the design process. If you're going to write code, then do as much as possible to get it right and optimized the first time, before you write; otherwise you're just spending extra time rewriting shit.
Name:
Anonymous2009-04-18 5:15
>>71
As does every other program in this thread, for the simple reason that input base is not specified.
Name:
Anonymous2009-04-18 5:40
#include <stdio.h>
/* change this to the largest multiple of your filesystem's
   blocksize that's less than available physical memory. */
#define BUF_LEN 1073741824
unsigned char buffa[BUF_LEN];
/* takes input from stdin file, writes output to stdout file */
int main(void) {
    register unsigned char r = 0;  /* running remainder, always 0..2 */
    register int i;
    while ((i = fread(buffa, 1, BUF_LEN, stdin))) {
        register int j;
        for (j = 0; j < i; j++) {
            register unsigned int d = (r << 8) + buffa[j];
            /* approximate d/3 as d*(1/4 + 1/16 + 1/64 + 1/256);
               d < 1024, so higher shifts contribute nothing.
               This always underestimates, by at most a few. */
            register unsigned char q = (d>>2) + (d>>4) + (d>>6) + (d>>8);
            r = d - q - (q << 1);           /* r = d - 3q */
            while (r >= 3) { q++; r -= 3; } /* correct the underestimate */
            buffa[j] = q;
        }
        fwrite(buffa, 1, i, stdout);
    }
    return 0;
}
>>74
Never get a job doing game programming, you would be eaten alive.
Name:
Anonymous2009-04-18 8:50
>>70 8/10, you've done well to make me use up 60 minutes of my time.
The tactic of don't-optimize-early is a tactic in the strategy of managing developer productivity. This plays into the principle of coder's time being expensive and limited, coder's experience and understanding being limited, and computing resources being cheaper than coder time.
By understanding the problem domain foremost, coders can use that understanding to guide their craft. By establishing some working but naive code, the coder establishes a standard that can be improved, which also contributes to understanding the problem domain. By analyzing the established situation, the developers can form targets to apply the optimization techniques. By having concrete and achievable targets, the coder can focus the effort into completing each target as it comes, as opposed to indiscriminately applying optimization techniques to potential (unwritten) code.
If everything was written like that from the beginning, the thing would be FUCKING INSTANTANEOUS.
Coders aren't born with an understanding of their problem domain in coding. This understanding is learned through experience.
Like it should be, because two decades ago people were solving the same problem in 10MHz CPUs and it ran at good speeds.
I think your understanding of this matter is shallow. Two decades ago, we couldn't do many things interactively that we can do today because of limited computing resources.
You have massive and bloated monstrosities today partly because of a lack of understanding of the systems used, as well as a lack of understanding of the problem domain. Another problem is the strong demand to ship working programs on schedule. This often means that programmers use tools that are intended for quick results, and that developers have less time to analyze the problem domain. Part of the problem is also the mentality of profiting through artificial scarcity and subjugating users out of freedom 1 of free software; the proprietary software masters take it upon themselves to be the gatekeepers of the software that was developed in their name; the gatekeepers choose to keep the burden of indiscriminately hacking at code monstrosities because of misguided notions of "Intellectual Property".
What sort of fucking idiot regards "opens a file as arbitrary precision integer" to mean "read in a file, interpret it as ASCII and take that string to be a base 10 number"? Obviously, it means take the actual file data to be a number.
Name:
Anonymous2009-04-20 18:37
>>76
Interesting method of dividing by three, but it makes little difference as the bottleneck is disk access.
Name:
Anonymous2009-04-20 18:44
>>86 Obviously, it means take the actual file data to be a number.
`Obvious' is probably not the word you're looking for here considering every single program in this thread does it the other way.
Given >>1's definition, how would you interpret a binary file as an arbitrary-precision integer?
Name:
Anonymous2009-04-20 18:46
>>88 No, every single program in this thread does not.
The two programs in this thread which actually do it properly and efficiently interpret >>1 to mean the file is read in as a binary number with the most significant bit at the start of the file and the least significant bit at the end.
Name:
Anonymous2009-04-20 18:49
>>89
I think you need to re-examine your assumptions.