I just happened to look at some code I wrote in 2007 and haven't touched since. Now it seems embarrassingly bad. That made me feel good; I've been programming since the early 90s, and apparently I'm still making measurable progress over such a short period.
Any similar experiences?
Name:
Anonymous2011-12-28 17:53
the only thing you are making progress towards is self-destruction. how's that information make you feel?
Name:
Anonymous2011-12-28 18:03
I'm definitely improving. Looking back, my old code looks so terrible. I can't believe I actually thought some of the things I did were a good idea. I know I'll never be perfect, but I don't like the feeling that I'm currently doing something fucking retarded that I won't be able to see until five years later.
Name:
Anonymous2011-12-28 18:18
the other day I read an old implementation of a database manager I made when I was 16 using mother fucking PASCAL
shit is fugly
Name:
Anonymous2011-12-28 21:14
>>4
Haha, I remember when I read my 2000-line game written in Turbo Pascal back when I was 5. Apparently I didn't know about indentation *blushes*
Name:
Anonymous2011-12-28 21:42
The most I've ever done is Red Alert 2 & Yuri's Revenge mods.
Feels good though, that shit could generate a lot of bugs.
>>5
I wish I still had some of my babby QBASIC code; thinking back, it feels like I was never more productive.
Name:
Anonymous2011-12-29 21:03
>>15
My first non-trivial program was a stick-figure rendition of Street Fighter written in QBASIC. Then I got a faster computer and the game became unplayable. It's surprising how oblivious to vsync even many "professional" programmers were back then.
Name:
Anonymous2011-12-29 23:46
>>16
vsync is separate from processor-dependent program speed. Vsync is about preventing the graphics buffer from being drawn to or read from more often than the monitor's refresh rate; capping how often the game logic runs is about preventing unnecessary calculation, so it's more like a vsync for logic rates.
I guess if you get into the technical details you could probably even make the rendering methods themselves halt, consuming less GPU power, though it's probably safer just to lock a transfer buffer. I'd be interested if someone had insight into what OpenGL or DirectX specify should happen when vsync is enabled, and what the various GPU driver makers actually do.
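The fix for the "faster computer breaks the game" problem described above is to decouple logic from rendering with a fixed timestep. A minimal sketch in Python (the game names like `update_game_state` are hypothetical placeholders, not from any real project):

```python
import time

LOGIC_HZ = 60          # fixed update rate for game logic
DT = 1.0 / LOGIC_HZ    # seconds of game time per logic step

def run(steps=10):
    """Fixed-timestep loop: logic advances at LOGIC_HZ regardless of
    how fast the machine is, so the game plays the same everywhere."""
    updates = 0
    accumulator = 0.0
    previous = time.monotonic()
    while updates < steps:
        now = time.monotonic()
        accumulator += now - previous
        previous = now
        # Run as many fixed logic steps as the elapsed real time allows.
        while accumulator >= DT and updates < steps:
            updates += 1   # a real game would call update_game_state(DT) here
            accumulator -= DT
        # rendering would go here; it may run as often as vsync permits
    return updates

print(run(10))  # prints 10: always 10 logic steps, however fast the CPU
```

A CPU-speed-dependent loop (like the old QBASIC games) just updates once per iteration with no timing, which is why doubling the clock speed doubled the game speed.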
>>1
No, it means your code is always bad but you don't realize it until you look at it again with fresh eyes.
When I look at old code, I'm often amazed by how good it is. Sometimes I'm afraid I'll never write code that well again, but then I continue to amaze myself.