Low level graphics and you

Name: Anonymous 2012-04-18 17:21

After a long time of meditating on OpenGL I've come down with a bad case of "I don't know how to do anything and only use tools made by someone better than me". Since the only cure for this common programmer illness is more meditating I've decided I want to learn graphics on an even lower level than OpenGL, though finding a guide for this route has proven to be nigh impossible.

I'd appreciate it if a guru could give me a friendly pointer in the right direction or an insight on the things that take place between an OpenGL call and a pixel being displayed onscreen.

At the end of my journey I desire the ability of displaying but a single pixel without the aid of instruments given to me by the masters.

Name: Anonymous 2012-04-18 17:26

Read video driver source code.

Name: Anonymous 2012-04-18 17:26

Read the following, in no particular order:
VESA BIOS Extensions
X11
Graphics Programming Black Book
Linux kernel for your favorite arch

Name: Anonymous 2012-04-18 17:29

Mathematics and x86

Name: Anonymous 2012-04-18 19:11

Just use OpenGL/DirectX like a fucking normal person and stop bitching.

Or do it the OLDSKOOL way and use qBASIC's SCREEN 7 mode to write those KOOL demos like they used to make.

Name: Anonymous 2012-04-18 21:14

>>4
x86 is Mooly Eden's fecal matter

Name: Anonymous 2012-04-18 21:16

>At the end of my journey I desire the ability of displaying but a single pixel without the aid of instruments given to me by the masters.
Sadly, you really can't just do this any more (sort of)...  In the old days, "VGA" and "VESA" were standard graphics modes that all video cards supported.  So you could select a VGA or VESA mode, get a pointer to the framebuffer, and just write bytes directly to it.  Each mode had a different resolution, color depth, number of color planes, etc...  Eventually, graphics card designers just said "fuck it" and started making everything non-standard.  So now you can't reliably write to your video card's framebuffer and expect it to work on everyone else's video card.  You have to go through the video driver, unless you want to use one of those old VGA/VESA modes.

Anyway, the other reason no one does it this way is that you're throwing away all the hardware acceleration you get by using a "standard" 3D API like OpenGL or DirectX.  Even if you only want to do 2D, graphics cards these days have blit hardware that is much faster than your pixel-twiddling code will be.
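For what it's worth, here's what that old framebuffer model looked like, sketched in Python. On real hardware the mode 13h buffer was a memory-mapped region at 0xA0000; the bytearray below is just a stand-in, since no modern OS will hand you that pointer.

```python
# Simulated VGA mode 13h framebuffer (320x200, 8 bits per pixel, one
# byte per pixel indexing into a 256-color palette).
WIDTH, HEIGHT = 320, 200
framebuffer = bytearray(WIDTH * HEIGHT)

def set_pixel(x, y, color):
    # This single store at y * 320 + x into the mapped region was the
    # whole of "SetPixel" back then.
    framebuffer[y * WIDTH + x] = color

set_pixel(10, 5, 15)  # palette index 15: white in the default palette
```

Every other mode just changed the constants (resolution, bytes per pixel, planar layouts); the addressing idea was the same.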

Name: Anonymous 2012-04-18 21:56

>>6
fuck off, goyfaggot

Name: Anonymous 2012-04-18 22:36

>>1
you should learn how the hardware works. It puts a nice perspective on it all and it can be helpful for explaining visual defects caused by certain types of bugs.

Name: Anonymous 2012-04-19 8:06

>>1
If you're only going to be writing a single pixel, then you're not really resolving your issue of "I don't know how to do anything and only use tools made by someone better than me".

Name: Anonymous 2012-04-19 11:51

>>10
I was merely pointing out that my goals are humble and that I wish to learn for the sake of knowledge, not to create something grandiose. I wanted to avoid off-topic replies that don't answer the question but instead criticize my wish to gain a deeper understanding of the machine.
Example: >>5

After using ZBrush (3D modeling being a passion of mine aside from programming) I noted its superb performance and attempted to recreate it using OpenGL with help from OpenCL (the information is scarce, but I believe ZBrush isn't even hardware accelerated). I failed to reach speeds comparable to ZBrush and realized there is much, much more to know about this field.

Name: Anonymous 2012-04-19 12:48

Name: Anonymous 2012-04-19 14:58

>>12
Would you mind elaborating?
"ARB - OpenGL Assembly Language is a low-level shading language."
How exactly can I use a shading language by itself?

Name: Anonymous 2012-04-19 23:44

>>13
By learning the language and using it with C to make it into a graphics envelope to your personal design specifications.

Name: Anonymous 2012-04-20 8:58

>>13
Could you link a step-by-step guide on how this is done? Let's assume I know the language; what do I do with the code now?

Name: Anonymous 2012-04-20 18:09

>>15
No, because I have no idea what your specifications are.
The simplest way is to make a few classes: one for an individual pixel with all the interactions you will need, one that is an array of the pixel class (this will be a horizontal line), and one that is an array of the last class, making it a display vector.

Name: Anonymous 2012-04-20 23:17

>>15
just learn OpenGL or whatever is available, and get into shaders when you are more familiar with the limitations of the fixed-function pipeline.

But if you do want to write a graphics library from the ground up, you can always generate bitmaps and the like. Once you know the file format and can generate an image you want, you can take it from there. It might be a fun experience.
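A minimal sketch of that idea in Python, using binary PPM since it's about the simplest image format there is (the gradient is just an arbitrary test pattern; `encode_ppm` and `out.ppm` are names made up for illustration):

```python
# Render into a flat pixel buffer, then serialize it as a binary PPM
# ("P6"): a text header followed by raw RGB bytes, row-major.
def encode_ppm(width, height, pixels):
    """pixels: flat, row-major list of (r, g, b) tuples, values 0-255."""
    header = b"P6\n%d %d\n255\n" % (width, height)
    return header + bytes(c for px in pixels for c in px)

w, h = 64, 64
gradient = [(x * 4, y * 4, 128) for y in range(h) for x in range(w)]
with open("out.ppm", "wb") as f:  # viewable in most image viewers
    f.write(encode_ppm(w, h, gradient))
```

From there you can bolt a line drawer or a rasterizer on top of the pixel buffer and inspect every frame as a file.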

Name: Anonymous 2012-04-20 23:47

if you just wanna play with shaders try this: http://www.iquilezles.org/apps/shadertoy/

(needs webgl of course)

Name: Anonymous 2012-04-21 0:22

>an insight on the things that take place between an OpenGL call and a pixel being displayed onscreen.
Red Book explains this.

Name: Anonymous 2012-04-21 9:48

Please read the 1st post before replying.
I'm already proficient with OpenGL and GLSL; what I want to know is how graphics libraries are written in the first place. To put it as simply as possible: what would be the universal low-level equivalent (assuming there is one) of glVertex?

Name: Anonymous 2012-04-21 10:51

>>20
If you still think in terms of glVertex, you obviously are not proficient with OpenGL. Read http://www.arcsynthesis.org/gltut/.

Name: Anonymous 2012-04-21 11:01

>>20
now forget everything you learned in university about OpenGL and start again with a good tutorial.

Name: Anonymous 2012-04-21 11:41

>>21
>>22
I was illustrating what I wanted to know with an example. Would you rather I pasted full, standards-compliant code just to do that?
Please do not approach a question with an insecure mindset and an urge to prove you're smarter; you're not.

Name: Anonymous 2012-04-21 12:38

Wow, this thread has been amazingly useless to you. Sorry OP; now I remember why I don't come around here anymore.

To the people telling you to just use OpenGL or that you have to go through video card drivers: shut up. You're missing the point. Of course a software renderer is going to be 1000x slower than doing it in hardware. The point is just to learn how it works.

OP, what it sounds like you want to implement is a software triangle rasterizer. First, get a good grasp of linear algebra; you need a good understanding of how vectors and matrices are used in 3D graphics:

http://www.wildbunny.co.uk/blog/vector-maths-a-primer-for-games-programmers/

With this you should understand how to take a triangle mesh and transform the triangles to on-screen coordinates. From there you need to rasterize the triangles. Here's a not entirely terrible article on doing this:

http://joshbeam.com/articles/triangle_rasterization/

You basically calculate a bounding box for the triangle, and then calculate the span of each row to determine what pixels are covered by the triangle.

To sample the color of each pixel from a texture, you need to interpolate between the UV coordinates of the vertices. The way to do this is with barycentric coordinates. In other words you transform the xy position of the pixel into a weighted sum of the xy coordinates of the vertices. Then you can use this to compute the texture position of the pixel from the texture coordinates of the vertices. I can't find a good reference for this, but here's the Wikipedia article on it; it gives the formula explicitly for a conversion to barycentric coordinates in 2D:

http://en.wikipedia.org/wiki/Barycentric_coordinate_system_%28mathematics%29

That's really all there is to it. Add a depth buffer and you should be able to render any old textured triangle mesh.
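To make the steps above concrete, here's a toy version of that loop in Python: bounding box, edge functions, and normalized barycentric weights. The `edge` function is the 2D cross product mentioned in the rasterization article; this is a sketch of the mechanics, not how you'd write a fast rasterizer.

```python
# Toy triangle rasterizer: clip a bounding box to the screen, then test
# every pixel in it with three edge functions. The normalized weights
# w0..w2 are the barycentric coordinates used to interpolate UVs,
# colors, or depth across the triangle.
def edge(ax, ay, bx, by, px, py):
    # Signed area of the parallelogram spanned by (a->b, a->p);
    # its sign says which side of the edge the point is on.
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def rasterize(v0, v1, v2, width, height):
    """Yield (x, y, w0, w1, w2) for covered pixels; v0..v2 are integer
    screen-space (x, y) tuples."""
    xs = [v0[0], v1[0], v2[0]]
    ys = [v0[1], v1[1], v2[1]]
    area = edge(*v0, *v1, *v2)
    if area == 0:  # degenerate triangle covers nothing
        return
    for y in range(max(0, min(ys)), min(height, max(ys) + 1)):
        for x in range(max(0, min(xs)), min(width, max(xs) + 1)):
            w0 = edge(*v1, *v2, x, y)
            w1 = edge(*v2, *v0, x, y)
            w2 = edge(*v0, *v1, x, y)
            # Inside if all edge functions agree in sign (handles both
            # winding orders).
            if (w0 >= 0 and w1 >= 0 and w2 >= 0) or \
               (w0 <= 0 and w1 <= 0 and w2 <= 0):
                yield x, y, w0 / area, w1 / area, w2 / area

covered = list(rasterize((0, 0), (8, 0), (0, 8), 16, 16))
```

Plug the weights into `w0*uv0 + w1*uv1 + w2*uv2` and you have texture mapping; do the same with 1/z and you have a depth buffer.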

Name: Anonymous 2012-04-21 12:44

>>11
>After using ZBrush (3D modeling being a passion of mine aside from programming) I noted its superb performance and attempted to recreate it using OpenGL with help from OpenCL (the information is scarce, but I believe ZBrush isn't even hardware accelerated).

ZBrush is absolutely hardware accelerated. As far as I know it uses voxels internally, but it generates a triangle mesh on the fly to render it in the traditional OpenGL/DirectX pipeline.

Name: Anonymous 2012-04-21 12:48

>>20
The equivalent of the immediate-mode glVertex(Normal|Texcoord) calls would likely be nothing at all. The classic approach is storing data in a memory buffer, followed by a call to some kind of draw command.

More to the point, could you please download the OpenGL 2.1 specification:
http://www.opengl.org/registry/doc/glspec21.20061201.pdf
And open it at chapter '2.11 Coordinate Transformations', then at '3.5.1 Basic Polygon Rasterization', skimming through.

If I understand correctly, your goal is to code a simulation based on these kinds of equations, i.e. a software 3D rasterizer?

Name: Anonymous 2012-04-21 13:33

>>24
Thanks, but I'm not interested in the math. I went through the rasterization article and saw them using this "SetPixel()"; that's what I'm interested in. I want to know how pixels are drawn and how graphics libraries implement a "SetPixel()" function; surely they don't use a function provided by Windows.
>>25
I'm afraid I can't find the source right at this moment, but according to the literature I've read on this subject, ZBrush is not hardware accelerated and it uses neither Direct3D nor OpenGL.

Name: Anonymous 2012-04-21 13:47

You can look at AMD's 'open' specification of their architecture. Supposedly the xorg drivers were written using it.
Though most of the code is supplied by AMD themselves, so you'll probably be at a loss.

If you really want low-level graphics, use a microcontroller. On the PC, anything beyond VESA relies on proprietary and/or undocumented code.

Name: Anonymous 2012-04-21 14:11

>>27
Oh. Well that's stupid. SetPixel() is a bad function and should never be used in a real program. Nobody draws pixels using SetPixel().

The way it works in modern computers is that the video card has its own video memory where images are stored, as well as the screen pixels (i.e. the chunk of memory that actually gets sent to the monitor). The video card provides hardware-accelerated operations for blitting and Porter-Duff compositing. In other words, these operations are physically implemented in circuits in the GPU. These are exposed to the OS or your window system (i.e. Windows, X11, etc.) by the video card drivers.
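For reference, the best-known Porter-Duff operator, "over", is simple enough to sketch. This assumes non-premultiplied RGBA with channels as floats in 0..1; hardware does the same arithmetic per pixel, just massively in parallel.

```python
# Porter-Duff "over": composite a translucent source pixel on top of a
# destination pixel. Channels are floats in 0..1, alpha not premultiplied.
def over(src, dst):
    sr, sg, sb, sa = src
    dr, dg, db, da = dst
    out_a = sa + da * (1 - sa)
    if out_a == 0:  # both fully transparent
        return (0.0, 0.0, 0.0, 0.0)

    def blend(s, d):
        # Weight each color by its alpha, then un-premultiply.
        return (s * sa + d * da * (1 - sa)) / out_a

    return (blend(sr, dr), blend(sg, dg), blend(sb, db), out_a)

# Half-transparent red over opaque blue gives purple (0.5, 0.0, 0.5, 1.0):
result = over((1.0, 0.0, 0.0, 0.5), (0.0, 0.0, 1.0, 1.0))
```

Blitting is the even simpler case: a bulk memory copy of a rectangle, with this blend optionally applied along the way.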

Name: Anonymous 2012-04-21 14:23

>>25
>but it generates a triangle mesh on the fly to render it in the traditional OpenGL/DirectX pipeline
which pretty much kills the reasoning behind hardware acceleration.
>>27
>>23
I can't decide if you are an idiot or just a newbie. You don't use something like "setPixel" in graphics programming, and you don't use glVertex either (google "opengl immediate mode" for the reasons).
GPUs work in a parallel way. You generate a mesh with vertices and indices and ask the GPU to render them.
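For the unfamiliar: "vertices and indices" just means a shared vertex buffer plus an index buffer naming triangle corners, which is roughly what you hand to a call like glDrawElements. A toy expansion in Python (the quad data is made up for illustration):

```python
# Indexed mesh: the index buffer lets triangles share vertices instead
# of duplicating them, so a quad needs 4 vertices, not 6.
vertices = [
    (0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0),  # quad corners
]
indices = [0, 1, 2,   0, 2, 3]  # two triangles sharing the 0-2 diagonal

def triangles(vertices, indices):
    """Expand the index buffer into explicit triangles."""
    return [tuple(vertices[i] for i in indices[k:k + 3])
            for k in range(0, len(indices), 3)]

tris = triangles(vertices, indices)  # 2 triangles from 4 vertices
```

The GPU does this expansion itself while fetching vertices, which is why index buffers save both memory and vertex-shader work.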

Name: Anonymous 2012-04-21 14:36

Name: Anonymous 2012-04-21 14:40

>>30
People please read the discussion before providing your "professional" opinions.

Name: Anonymous 2012-04-21 14:52

>>32
you can't afford my professional opinion.

Name: Anonymous 2012-04-21 14:56

>>30
>You generate an mesh with vertices and indices and ask GPU to render them.
No, that's a high-level description of how we see it through some APIs; it's the dumbed-down version of graphics programming for the sake of user friendliness, and not what OP is asking about.

Name: Anonymous 2012-04-21 15:02

>>34
well then explain the pipeline or shaders to OP. I sure am not gonna bother with it

Name: Anonymous 2012-04-21 15:07

>>35
"I'll go on a programming board just to say I won't help"
Neither of those are his questions. Unfortunately I don't know graphics below the popular APIs, so I can't help.

Name: Anonymous 2012-04-21 15:11

It seems the usefulness of /prog/ is already exhausted. I'm closing this discussion.
Thanks to everyone who contributed. I've decided to read "Graphics Programming Black Book" and see if I have any questions left after I'm done with it.

Name: Anonymous 2012-04-21 15:14

hey OP, you could, you know, get the source for a bootloader, inject the code of a program into it using dd or something, and boot that shit up. Your injected program could just draw on VESA, so:

bootloader -> graphics mode and page tables to reach the linear frame buffer -> manipulate it directly in a C program, no libraries
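The "manipulate it directly" step reduces to address arithmetic once the mode is set: VESA reports a linear framebuffer base, a pitch (bytes per scanline), and a pixel format, and every write lands at base + y*pitch + x*bytes_per_pixel. Simulated below with a bytearray standing in for the mapped LFB; the 1024x768/32bpp numbers and the channel order are assumptions, since real modes report theirs in the VBE mode info block.

```python
# Address math for a VESA linear framebuffer. Pitch can exceed
# width * bytes_per_pixel, which is why you must use the reported value.
PITCH = 4096          # bytes per scanline, as reported by the mode info
BYTES_PER_PIXEL = 4   # 32bpp

lfb = bytearray(PITCH * 768)  # stand-in for the memory-mapped LFB

def pixel_offset(x, y):
    return y * PITCH + x * BYTES_PER_PIXEL

def put_pixel(x, y, color):
    # color: 4 raw bytes; the channel order (XRGB, BGRA, ...) depends
    # on the mode, so this sketch just stores what it's given.
    off = pixel_offset(x, y)
    lfb[off:off + 4] = bytes(color)

put_pixel(2, 1, (0, 0, 255, 0))
```

On real hardware the only extra steps are the ones in the arrow diagram above: set the mode, then map the physical LFB address into your page tables.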

Name: VIPPER 2012-04-21 15:15

>>37
usefulness of /prog/
LOL

Name: Anonymous 2012-04-21 15:18

>>39

go to bed, vipperu-san!

Name: VIPPER 2012-04-21 15:29

>>40
Ok

Name: Anonymous 2012-04-21 18:02

>>24
did you read the "low level" part? seriously...
OP is asking for something that is not possible to do, or at least not "universally" as he asks; if he really wants to go low level he should read the sauce of nouveau drivers, or get an old computer

Name: Anonymous 2012-04-21 18:12

>>24
OP doesn't necessarily need to implement the fixed-function pipeline in software. Ey could write a very advanced ray tracer entirely in software, and this isn't all that uncommon. When it is all done in software, there are no restrictions on what model to use. But OP isn't going to have a good working knowledge of the fixed-function pipeline unless ey learns from a well-founded example, like OpenGL or whatever else is currently accessible and running on ey's platform. Now shut up and go scrub another backtotoGpleeas!

Name: Anonymous 2012-04-21 18:41

>>37
all of your questions are answered in the thread but you rejected them as irrelevant. Good luck learning anything with that attitude.

If you want to know how those routines are implemented, you are going to need to learn about display drivers, GPUs, and the like. This is hardware and math, not subroutines in C or something. Learn about the hardware, and learn about the math. If you don't learn the math, the implementation won't make sense, and you won't understand why it works.

Name: Anonymous 2012-04-21 19:27

Fact is you really don't have any business writing anything if you aren't able to take the math of it, wrap one end of a chain around your arm, the other end around it's neck, and fuck it in the ass from behind while yanking the chain as you thrust.

Name: sage 2012-04-21 20:20

I'm not interested in the math - A Redditor

Name: Anonymous 2012-04-21 21:59

>>44
Notwithstanding the fact that you're a fucking idiot, math appears a lot in your trivial code. For example, the while loops in your imperative languages work on the idea that position is a function of time.

Name: Anonymous 2012-04-22 5:36

>>47
He wasn't saying anything contrary to what you're saying

Name: Anonymous 2012-04-22 10:16

>>44
>This is hardware and math, not subroutines in C or something.

This. OP is asking how these things are implemented; the answer is *in hardware*.

The OpenGL specification explains the rendering pipeline in exhaustive detail; that is how it is physically implemented in the circuits of a GPU. Read that to learn how it works.

Name: Anonymous 2012-04-22 11:27

>>49
We've had programmable shaders for, like... 10 years?

Name: Anonymous 2012-04-22 13:28

>>49
Modern GPUs implement OpenGL in software and use a much better specification called DirectX for actual hardware.  OpenGL is nothing more than an adapter layer over DirectX.

Name: bampu pantsu 2012-05-29 4:30

bampu pantsu

Name: Anonymous 2013-11-30 8:11

░░░░░░░▄▀▀▀▀▀▀▀▀▀▀▄▄░░░░░░░░░
░░░░▄▀▀░░░░░░░░░░░░░▀▄░░░░░░░
░░▄▀░░░░░░░░░░░░░░░░░░▀▄░░░░░ YOU HAVE BEEN VISITED BY
░░█░░░░░░░░░░░░░░░░░░░░░▀▄░░░ LE 'FEEL OF NO GF
░▐▌░░░░░░░░▄▄▄▄▄▄▄░░░░░░░▐▌░░
░█░░░░░░░░░░░▄▄▄▄░░▀▀▀▀▀░░█░░ A qt 3.14 gf will come to you,
▐▌░░░░░░░▀▀▀▀░░░░░▀▀▀▀▀░░░▐▌░ but ONLY if you post a
█░░░░░░░░░▄▄▀▀▀▀▀░░░░▀▀▀▀▄░█░ `>tfw no GF on this thread
█░░░░░░░░░░░░░░░░▀░░░▐░░░░░▐▌
▐▌░░░░░░░░░▐██▀█▄░░░░░░█▀█░▐▌
░█░░░░░░░░░░░▀▀▀░░░░░░▀▀▀▀░▀▄
░▐▌░░░░▄░░░░░░░░░░░░░▌░░░░░░█
░░▐▌░░▐░░░░░░░░░░░░░░▀▄░░░░░█
░░░█░░░▌░░░░░░░░▐▀░░░░▄▀░░░▐▌
░░░▐▌░░▀▄░░░░░░░░▀░▀░▀▀░░░▄▀░
░░░▐▌░░▐▀▄░░░░░░░░░░░░░░░░█░░
░░░▐▌░░░▌░▀▄░░░░▀▀▀▀▀▀░░░█░░░
░░░█░░░▀░░░░▀▄░░░░░░░░░░▄▀░░░
░░▐▌░░░░░░░░░░▀▄░░░░░░▄▀░░░░░
░▄▀░░░▄▀░░░░░░░░▀▀▀▀█▀░░░░░░░

Name: Cudder !MhMRSATORI!fR8duoqGZdD/iE5 2013-11-30 9:04

Build your own VGA-compatible hardware.

Name: Anonymous 2013-11-30 20:16

/
███████ ]▄▄▄▄▄▄▄▄ Bob is building an army.
▂▄▅█████████▅▄▃▂ Bob and his tank are against Google+
Il███████████████████]. Copy and Paste this all over
◥⊙▲⊙▲⊙▲⊙▲⊙▲⊙▲⊙◤.. Youtube if you are with us
