yakcyll's ~/.plan

Whatever I plan to do on a particular day (or am still working on) will have its own line marked with -. When I accomplish something, I'll add a * to a note; whenever I notice a bug / figure out a missing feature / get an idea and don't get enough time to address it properly, I make a note of it and mark it with ?. Some things get noted many times before they get fixed. Occasionally I go back through the old notes and mark with a + those I have since fixed, and with a ~ the things I have since lost interest in.

# 16.04.2026

- reflection of nested anonymous structs/unions
- inclusion of scripts in the json spec for physics sandbox scenarios
- adaptation of physics sandbox code for hosper testing

i've got it. a bit of hand-written glue code was necessary to bind some of the templates in Registry to lua; it could probably be replaced with an autogen making use of template function selectors, but i don't yet know how to select the appropriate templates, nor how to instantiate them using reflection. both are probably non-issues, but at this moment i genuinely don't care - i got a script to be called on every tick and update an entity's position, including collisions, and for the time being, that's good enough. later today i will write down some details about the process of investigating reflection and integration with lua and sol2 while they are fresh in my mind. custom map logic as ugc... i can't wait to make something awesome with this.

# 14.04.2026

+ reflection of methods
+ reflection of non-static fields
+ reflection of anonymous unions
- reflection of nested anonymous structs/unions
~ reflection of templates?

i'm fairly certain that what i'm doing can be considered a crime against c++. through the magic of reflection, consteval and a ton of elbow grease i have managed to implement automatic c++ type mirroring in lua. due to (hopefully temporary) limitations of reflection, constructors have to be specified manually.
i've managed to get automatic operator overload inclusion to work already, which turned out to be non-trivial. there's a lot more work to be done here: static fields need special handling, i have no clue how to deal with templated methods, and fields hidden in unions or nested types are still inaccessible. at this point i'm not too worried about critical roadblocks - it appears that this form of reflection, however verbose, will be sufficient to implement transparent generation of bindings. if it turns out to work (especially if i get fields to be automatically mapped), then i'll probably switch away from reflect-cpp in the implementation of registry as well. all this is... enthralling.

# 11.04.2026

managed to get lua 5.5.1 + sol2 integrated into the engine, to the point where an entity can invoke a lua script on tick update and call the global logging procedures. pretty pleased with myself; will try to switch to luajit tomorrow to try and make it more performant.

then there will be two major tasks related to scripting: mapping the existing engine api into lua (with the added difficulty stemming from templated functions) and enabling the registration of lua callbacks with engine subsystems (that will depend on how sol represents lua functions passed as arguments to c++ functions; i'll probably have to use trampolines, visiting lambdas or some sort of magic, like here: https://lowkpro.com/blog/creating-c-closures-from-lua-closures.html). with those, i'll be able to start creating scenes/scenarios for testing, which will be very useful for identifying regressions, especially in wonky subsystems (networking).

this was much easier than i expected, honestly. even if utilizing codegen to create bindings eventually becomes necessary to keep my sanity, the fact that there is a subsystem in place that i can play with makes me much more interested in spending more time on the engine.
my main concern remains the lua vm exploding from executing unsafe code; i must eventually research how to make this secure.

# 10.04.2026

testing collisions involved adding different setups with moving objects to a single scene. this was clearly not scalable, so i decided to get distracted by starting on the map editor. ported the physics sandbox code over as a base for the editor and realized that the objects in the collision testing setups move in arbitrary fashions. i do eventually want map entities to follow custom mapper-provided logic, so i started looking into integrating a lua interpreter into the engine. it doesn't look like an impossible task. there will be some overhead related to managing templates in the registry, but it all seems manageable, thanks to sol2. i'll try to replicate the functionality of nativescript with luascript, where its lambdas will be loaded from scripts at runtime.

the bigger issue here is providing a complete engine api to the scripts; i reckon this can be done using a dedicated table in lua that contains references to classes in context, but this part unfortunately doesn't seem possible to automate, so i will need to maintain a manually updated index of classes/functions exported to lua. that might become a problem; it might also end up being preferable: scripts, being code sourced externally at runtime by nature, should be tightly controlled with respect to which parts of the system they can access. no way to figure this out other than to try and find out.

# 05.04.2026

+ refactor Hull*Faces* helpers to return Vector<Polygon> instead of Vector<Vector<Math::Vector3>>
+ update Checks.cpp (e.g. CheckHullFaces) to use Polygons instead of Vector<Vector<Math::Vector3>>
+ update collision funcs for Capsule (starting with CheckForCollisionImpl(Hull, Capsule)) to consider the translation of Capsule to sit in the middle of its inner segment
+ finish GenerateCapsule and GenerateBox in Hosper/Debug

i have become a slave to the architecture of the engine. i can't figure out how to cleanly supply the renderer with some wireframed debug geometry without essentially hacking on the api. this is extremely stupid. for the time being i decided to create physics visualizers the simplest possible way: do everything using existing interfaces from within hosper. create some physics shape meshes, and on each tick obtain collision info from physics, then create temporary entities with renderables representing both the collision shapes and each contact point from the last tick; scrap all of them on the next tick (possible thanks to Add/RemoveHierarchy). if it turns out to be a bottleneck (not unlikely), then i'll figure out how to push all of that code back into the engine. not worth getting blocked over something like this right now.

# 31.03.2026

+ visualization of colliders
- debug shape shaders in renderer? transient mesh instances in the world?
- equivalent of move_and_slide where the moved object is glued to the surface it is currently attached to while moving (prevents yeeting forward when moving down a sloped surface)

bogged down in the maths behind collision detection for capsules. i need a way to quickly and easily test and visualize the results of various collisions, both ray-body and body-body; i think it's a necessary debug tool moving forward, even if it introduces a bit of jank to the physics engine. i'm simply not confident enough in my implementation as is, given the complexity of the calculations involved and the multitude of references used to construct them.
been thinking a lot about throwing in the towel and integrating jolt lately; it all makes me appreciate the 'don't make engines, make games' advice that much more.

# 23.03.2026

- capsule collision
- "dynamic" player controller

i kicked off hosper. the scope is large, so splitting the work into stages with reasonable results and goals is of paramount importance; i'm already feeling a bit lost. the primary objective is definitely feature parity with the godot prototype; after all, i haven't even finished that one. at the same time, since i already have to consider low-level details of the implementation, i should try to decide on some and integrate them along the way.

what i think will turn out to be most impactful down the road is how levels will be represented in terms of data structures. the way they are stored in memory will impact the way they are stored on disk, what metadata will have to be precomputed/compiled, how they will be authored, etc. a major selling point of making a custom engine is the power to shape the tools used to create games in it; this opportunity must not be squandered here.

i looked into how source 2 changed its map authoring workflow compared to source 1 and the old hammer, as well as how the creation process looks in diabotical. the latter is interesting - maps are created both through freeform smooth terrain manipulation and through grid-based block manipulation. limiting editing to a grid sounds like it simplifies a lot in practice, but it seems very limiting in terms of what sorts of geometry can be created. i think marrying this with inserting/storing mesh-based props will be good enough for our purpose; i don't think the geometry that the player interacts with directly (i.e. the ground) should be complicated at all, but at the same time mesh props can add a lot of nice detail to the scene.
nota bene, it's a bit surprising and a bit disappointing that diabotical resolved to use props for ramps and angled surfaces; i expected them to implement edge-based block manipulation of some sort. i'd like to look into techniques for that.

let's start simple. i will first port the dynamic controller from hp-godot over, to have a control scheme; then, i will work on a scheme to store, present and manipulate grid-aligned meshes. there are some kinks to iron out along the way (the lack of a capsule among the colliders being one of them), but it seems like a fairly simple set of first steps, even if not that impressive. it will help me immensely to have something to interact with tactilely.

# 05.03.2026

+ cleanup and merging of text related stuff

# 04.03.2026

* fixed text alignment issues
* fixed font size not being passed to the text shader correctly
? scrollcontainer doesn't scroll down to cover the entirety of the text
? or the text is rendered beyond the texture
? relatively low performance in box-stack

# 02.03.2026

the text shaders have been working alright for the last two days. there were some bugs related to texture thrashing and invalid copies, and there's still a ton of bugs related to refhandles hiding in there. currently battling improper sizing and positioning of glyph quads. the performance is actually quite satisfactory so far, although i haven't performed any proper stress testing. however, the most urgent matter, a blocking issue in fact, is that the text looks like _shit_. glyphs are rendered inconsistently, wobbly even, at small font sizes. i'm relatively sure this is a programmer error, but while i think i understand the code that does all the measurements, a simple visual inspection of the outputs makes it clear there's an error somewhere in there. i'll try to implement manual font scaling tomorrow; maybe the hinter is working against me in this particular case.

# 23.02.2026

work is progressing slowly.
haven't had much time or headspace to sit down and churn out the code necessary to rasterize text on the gpu, and the time i did find was spent mostly marinating in ideas on how to design the interface for the whole thing - you know, the problem you aren't supposed to have and only have because you stick to oop. this isn't a jab or anything, i'm genuinely not smart enough for this.

integrating sebastian's code was luckily the smallest of the challenges on the list. i seem to be getting close to a working demo: i have already prepared shaders, data structures and procedures to construct glyph contour data. now i have to combine everything i have prepared so far, prepare a text pipeline, update the textserver rasterizer to output glyph data, update the draw calls to match those responsible for drawing the ui (vertex and index buffers are unnecessary) and finally test and debug it thoroughly. that's probably for tomorrow.

# 18.02.2026

an epiphany struck regarding rendering text on the gpu: the main thread does not care what is on the textures themselves. this means that the task of preparing text for showing on screen can be split in two - the main thread can prepare the texture (size the text, prepare metadata, allocate the buffer, initialize the texture object, etc.), keep it empty/clear and pass it to the rendering thread with some metadata that the renderer can then use, in its own time, to run a new text shader that draws the appropriate glyphs onto the provided clear texture. i don't know why i didn't think of this before.

i'll take the approach sebastian lague presented in one of his videos: draw and fill glyphs based on contour data read directly from the font. i don't yet know if freetype provides this information, but i will resort to extracting it myself if need be.
the renderer will thus have to cache/upload this data to the gpu; this means the renderapis will probably have to become aware of the concept, but maybe it makes more sense to facilitate a) selecting shaders by id from the renderer api, and b) uploading arbitrary persistent buffers to the gpu instead. this would allow using just the drawsingle api for the purpose of drawing text. getting closer to the renderdevice abstraction from godot here, but i want to implement something like that at some point anyway. gpus are just i/o machines for pixels after all.

tomorrow is off from work - have to travel - so i will start working on this in the evening and continue on friday. since sebastian graciously posted his shader code to github, i can hope to get this solution working by the end of the week. i've made a rough outline of the work needed to complete a gameplay prototype for spacewar, so i'm enthusiastic about progress over the next couple of days.

# 16.02.2026

+ integrated kb_text_shape with the engine; the performance is atrocious for my use case (could be caused by code refactor artefacts, i.e. optimizations missing that were present in the original implementation), so despite the ballooning binary size i'll stick with harfbuzz/icu for the time being
+ implemented saving replicated cvar values upon the client connecting to a server
+ implemented replicating cvars on client connect, restoring old values on disconnect
? should the network stream carry variants? especially for cvars? on one hand it feels like type ambiguity ought to be resolved before sending/after receiving, plus type info is unwanted overhead; on the other, in order to decode a byte stream, the type needs to be known up front, so technically the console/variable store would have to parse/decode the actual packet containing replicated cvars in order to select the correct decoder overload

# 13.02.2026

replacing harfbuzz/icu with kb_text_shape