📚 Node [[2023-05-15]]
To encourage change, look for what suggests that they don't like how things are.
- Look for anything good they have to say about what change might do.
- Look for anything that suggests that they could change if they wanted to.
- Look for anything that sounds like a commitment to change.
What makes it difficult for them to consider change?
What would it take to go where they want to go?
- What worries them about how things are?
- What makes them think they need to do something about it?
- What happens as a result of what they are doing now?
- Is there anything about what they are doing now that is a reason other people might worry about them?
- How has this stopped them from being where they want to be?
- What will happen if they don't change what they're doing?
- What would it take to go where they want to go?
- How would change help them?
- How would they see change as possible?
- When would they tell you that they want to change?
Look for what they think might happen if they change vs. what they think might happen if they don't change.
- What do they like about how things are now?
- What do they dislike about how things are now?
- How can these be illustrated visually?
- What is a day in their life like?
What do you worry most about the thing that might change?
- What's the worst that might happen?
- How might that happen? What else might happen as a result of that?
- What's the best thing that might happen?
- How would things change, if you changed?
Ask them about a different time in their life. Both what was, and what might be.
- What happens if things don't change?
- How is what they're doing consistent with what they want? How does it work against what they want?
Monday, 05/15/2023

18:06 What am I doing right now?
- Package up returns
- Put clothes in a pile to get rid of
- Edit a photo
- Render a square of pixels on the screen with raylib and change them every second
- Make a plan for lunchboxes

22:15 Today I've been radicalized by GPUs. I've spent my life up until this point assuming that graphics libraries all start and end with turning pixels on the screen on and off. This is just not true.
The short of it is that the GPU in your computer - either 'integrated' (built into the CPU as an optimised subsection) or 'discrete' (a separate card entirely) - holds a data structure called a framebuffer that represents the pixels that will be written to the screen. This data is written to the buffer and then sent to the screen.
Cool, so I can just turn pixels on the framebuffer on and off?
First, the framebuffer isn't just exposed. Whatever windowing system you're using does not allow applications to write to the framebuffer at will. Without a standard protocol that tells applications how and where to draw, that would be a security vulnerability at best - applications could scribble pixels over one another to make you see something - and at worst it would make your computer unusable. (If you aren't in a graphical session, you can get raw access to the framebuffer: https://seenaburns.com/2018/04/04/writing-to-the-framebuffer/.)
You'll want to use a windowing library that abstracts requesting this framebuffer over the various windowing systems (Windows, macOS, etc. have all concocted slightly different ways of doing this, and they love making extra work for programmers) and gives you a reference to it. GLFW is historically the most popular, but libraries like SDL2 and winit (Rust) provide similar functionality. You can then write pixels to this buffer following a standard, straightforward protocol and they'll show up on the screen.
Unfortunately, though, the framebuffer doesn't live on the CPU, or in the screen, or anywhere else you'd think would be sane. Yes, screens have framebuffers, but it's your operating system's job to mediate between its representation and the data the screen is given. The framebuffer lives on the GPU. And GPUs are not optimized for drawing pixels on screens: they're complex mathematical hardware with complex APIs, optimized for rendering the lines, rays, and curves of modern 3D graphics rather than for flipping bits in a pixel grid. The good news: they make playing video games fast by performing complex tasks in parallel. Exactly how they do this is something to be learned, and probably under NDA. The bad news: GPUs expose complex, proprietary APIs that are inelegant and present very large surface areas to program against. This makes learning graphics programming a mess, mostly because the APIs exist partly to protect corporate secrets. CUDA - the fundamental API exposed to empower parallel programming on the GPU - is not open. So you're always programming against a nasty, artificially abstracted API rather than writing to the machine and having the machine just render. This makes leveraging modern computing power a disgusting mess.
The good news here is that you can still just ask a library like GLFW for a window and a pixel buffer to write into, and it will handle getting those pixels onto the screen.
My goal with learning computer graphics has been to build small, beautiful applications that people - people who don't know much at all about using computers - can use every day to accomplish things in their life more seamlessly. Two paths to move forward:
- Learn to implement graphics tools by pretending modern graphics don't work that way and start developing abstractions over the framebuffer.
- Commit to learning a modern graphics library or abstraction. WebGPU and Vulkan are both compelling ways forward here. Vulkan has solid native support on Linux and Windows and a compatibility layer (MoltenVK) for macOS, so broad platform support is all but guaranteed. Metal (classic proprietary macOS work) is DOA. WebGPU is incredibly compelling, but the API doesn't have stability guarantees yet. It's made for the browser - so it's made to run anywhere and everywhere - but the API could be a moving target.
Whoah - Mach Engine solved this. https://github.com/hexops/mach-gpu.