📕 subnode [[@jakeisnt/2023 05 15]] in 📚 node [[2023-05-15]]
  • Monday, 05/15/2023

** 18:06 What am I doing right now?
  • Package up returns
  • Put clothes in a pile to get rid of
  • Edit a photo
  • Render a square of pixels on the screen with raylib and change them every second
  • Make a plan for lunchboxes

** 22:15 Today I've been radicalized by GPUs.

I've spent my life up until this point assuming that graphics libraries all start and end with turning pixels on the screen on and off. This is just not true.

The short of it is that the GPU in your computer - either 'integrated' (built into the CPU as an optimised subsection) or 'discrete' (a separate card entirely) - holds a data structure called a framebuffer that represents the pixels to be drawn on the monitor. Pixel data is written into that buffer and then sent to the screen.
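Before getting to why you can't just poke at it, here's roughly what that data structure amounts to in memory: a flat array of pixel values, one per pixel of the display. A minimal sketch in C, assuming a 32-bit 0xAARRGGBB packing (real hardware varies the format, byte order and row stride):

```c
#include <stdint.h>
#include <stdlib.h>

/* A framebuffer is conceptually a flat array of pixel values,
   one entry per pixel of the monitor. The 32-bit 0xAARRGGBB packing
   here is an assumption; real hardware varies the format and may pad
   each row to a wider stride. */
typedef struct {
    uint32_t *pixels; /* width * height packed pixels */
    int width;
    int height;
} Framebuffer;

/* Write one pixel at (x, y). */
static void put_pixel(Framebuffer *fb, int x, int y,
                      uint8_t r, uint8_t g, uint8_t b, uint8_t a) {
    if (x < 0 || y < 0 || x >= fb->width || y >= fb->height) return;
    fb->pixels[y * fb->width + x] =
        (uint32_t)a << 24 | (uint32_t)r << 16 | (uint32_t)g << 8 | b;
}

int main(void) {
    Framebuffer fb = { .width = 640, .height = 480 };
    fb.pixels = calloc((size_t)fb.width * fb.height, sizeof(uint32_t));
    if (!fb.pixels) return 1;
    put_pixel(&fb, 10, 10, 255, 0, 0, 255); /* one red pixel - never shown anywhere */
    free(fb.pixels);
    return 0;
}
```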

Cool, so I can just turn pixels on the framebuffer on and off?

No.

First, the framebuffer isn't just exposed. Whatever windowing system you're using does not let you write to the framebuffer at will. At best that would be a security vulnerability - applications could draw over one another to make you see whatever they want - and at worst, without a standard protocol telling applications how and where to write, it would make your computer unusable. (If you aren't in a graphical session, you can get raw access to the framebuffer: https://seenaburns.com/2018/04/04/writing-to-the-framebuffer/.)
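As a concrete illustration of that raw route, here's a minimal sketch using Linux's fbdev interface, the mechanism the linked post describes. It assumes /dev/fb0 exists, that you have permission to open it (usually root or membership in the video group), and that you're at a plain console rather than under a display server:

```c
#include <fcntl.h>
#include <linux/fb.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <sys/mman.h>
#include <unistd.h>

int main(void) {
    int fd = open("/dev/fb0", O_RDWR);
    if (fd < 0) { perror("open /dev/fb0"); return 1; }

    struct fb_var_screeninfo vinfo;
    struct fb_fix_screeninfo finfo;
    ioctl(fd, FBIOGET_VSCREENINFO, &vinfo); /* resolution and bits per pixel */
    ioctl(fd, FBIOGET_FSCREENINFO, &finfo); /* bytes per row (line length)  */

    size_t size = (size_t)vinfo.yres * finfo.line_length;
    uint8_t *fb = mmap(NULL, size, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
    if (fb == MAP_FAILED) { perror("mmap"); close(fd); return 1; }

    /* Paint a 100x100 white square in the top-left corner. */
    unsigned bytes_per_px = vinfo.bits_per_pixel / 8;
    for (unsigned y = 0; y < 100 && y < vinfo.yres; y++)
        for (unsigned x = 0; x < 100 && x < vinfo.xres; x++)
            memset(fb + y * finfo.line_length + x * bytes_per_px, 0xff, bytes_per_px);

    munmap(fb, size);
    close(fd);
    return 0;
}
```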

You'll want to use a windowing library that abstracts requesting this framebuffer for you over the various windowing systems (Windows, macOS, etc. have all concocted slightly different ways of doing this, and they love making extra work for programmers) and gives you a reference to it. GLFW is historically the most popular, but libraries like SDL2 and winit (Rust) provide similar functionality. You can then write pixels into this buffer following a standard, straightforward protocol and they'll show up on the screen.
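For instance, here's a minimal sketch of that flow using SDL2, one of the libraries mentioned above: ask it for the window's surface, write pixel values into that surface's memory, and tell it to push the result to the screen. It assumes the window surface uses a 32-bit pixel format, which is typical on desktops:

```c
#include <SDL2/SDL.h>

int main(void) {
    if (SDL_Init(SDL_INIT_VIDEO) != 0) return 1;
    SDL_Window *win = SDL_CreateWindow("pixels",
                                       SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
                                       640, 480, 0);
    if (!win) { SDL_Quit(); return 1; }

    SDL_Surface *surf = SDL_GetWindowSurface(win);
    SDL_LockSurface(surf); /* usually a no-op for window surfaces, but correct form */
    Uint32 *px = (Uint32 *)surf->pixels;
    Uint32 orange = SDL_MapRGB(surf->format, 255, 128, 0);
    for (int y = 0; y < 100; y++)          /* paint a 100x100 square */
        for (int x = 0; x < 100; x++)
            px[y * (surf->pitch / (int)sizeof(Uint32)) + x] = orange;
    SDL_UnlockSurface(surf);

    SDL_UpdateWindowSurface(win); /* hand the buffer back to be shown */
    SDL_Delay(2000);

    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}
```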

Unfortunately, though, the framebuffer doesn't live on the CPU or in the screen or wherever else you'd think would be sane. Yes, screens have framebuffers, but it's your operating system's job to mediate between its representation and the data the screen is given. The framebuffer lives on the GPU.

GPUs are not optimized for drawing pixels on screens. They're complex mathematical hardware with complex APIs, optimized for rendering the lines, rays and curves of modern 3D graphics - originally created to render perfect fonts with PostScript rather than to push pixels bit by bit. The good news: they make video games fast by performing complex tasks in parallel. Exactly how they do this is something to be learned, and probably under NDA. The bad news: GPUs expose complex, proprietary APIs that are inelegant and present very large surface areas to program against. That makes learning to program graphics well a mess, mostly because you're up against corporate secrets: CUDA - the fundamental API exposed to empower parallel programming on the GPU - is not open. You'll always be programming against a nasty, artificially constructed abstraction rather than writing to the machine and having it just render what you ask. Leveraging modern computing power is a disgusting mess.

The good news here is that you can still just get a buffer of pixels onto the screen: SDL2 will hand you the window's surface to write into directly, and GLFW will take a pixel buffer through the OpenGL context it sets up for you.
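A minimal sketch of the GLFW route in C, with the caveat that GLFW never exposes a raw pointer into the framebuffer: you keep your own pixel buffer on the CPU and push it through the default (compatibility-profile) OpenGL context GLFW creates, here with the legacy glDrawPixels call. The once-a-second colour flip is just illustration:

```c
#include <GLFW/glfw3.h>   /* pulls in the system OpenGL header by default */
#include <stdint.h>
#include <stdlib.h>

int main(void) {
    if (!glfwInit()) return 1;
    GLFWwindow *win = glfwCreateWindow(640, 480, "pixels", NULL, NULL);
    if (!win) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(win);
    glRasterPos2f(-1.0f, -1.0f); /* start drawing from the window's bottom-left corner */

    int w, h;
    glfwGetFramebufferSize(win, &w, &h);
    uint32_t *pixels = calloc((size_t)w * h, sizeof(uint32_t));
    if (!pixels) return 1;

    while (!glfwWindowShouldClose(win)) {
        /* Alternate between solid red and solid blue once a second
           (byte order assumes a little-endian machine). */
        uint32_t colour = ((int)glfwGetTime() % 2) ? 0xff0000ffu : 0xffff0000u;
        for (int i = 0; i < w * h; i++) pixels[i] = colour;

        glDrawPixels(w, h, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
        glfwSwapBuffers(win);
        glfwPollEvents();
    }

    free(pixels);
    glfwTerminate();
    return 0;
}
```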

My goal with learning computer graphics has been to build small, beautiful applications that people - people who don't know much at all about using computers - can use every day to accomplish things in their lives more seamlessly. Two paths to move forward:

  1. Learn to implement graphics tools by pretending modern graphics don't work that way, building my own abstractions over the framebuffer.
  2. Commit to learning a modern graphics library or abstraction. WebGPU and Vulkan are both compelling ways forward here. Vulkan is supported natively on Windows and Linux, with solid compatibility layers for other platforms, so cross-platform support is all but guaranteed. Metal (classic proprietary MacOS work) is DOA. WebGPU is incredibly compelling, but the API doesn't have sustainability guarantees: it's made for the browser - so it's made to run anywhere and everywhere - but the API could be a moving target.

Whoah - Mach Engine solved this. https://github.com/hexops/mach-gpu.
