at the end of the hall - postmortem
This post contains spoilers for at the end of the hall, and is written with the expectation that the reader has tried it.
Bevy Jam 6
I knew after my first Bevy Jam entry that I wanted to do something bigger.
I've been developing games as long as I've been programming, but it's a hobby I've never taken seriously. It's fun to make little prototypes. It's not fun to live with your engine, to live with your code, to let the creeping tendrils of requirements thread ominously through bits of logic that were once clean and sensible. That is to say, I don't make a lot of games. I don't have creative or technical workflows. I've been improvising.
And that's twice as much of a problem in Bevy. Without an editor (for anything - environment, materials, animation graphs, particles), without asset preprocessing, how exactly was I supposed to go "bigger"?
There's Skein, a marvelous "Blender-as-Bevy-editor" integration, but I'd prefer to be able to write my own editor tooling in a language I know and like. Blender using Python for scripting makes it a no-go for me. Besides, for me, even the mechanics of simple GLTF editing in Blender are a nightmare. A proficiency issue, yes, but it'd be nice to use a skill I'd already developed.
Bevity
When it comes to complete Bevy integrations, Skein has no alternatives. But UnityGLTF instantly made Unity my GLTF editor of choice. C# is great. Custom Unity tooling is easy to write. I've used Unity on and off over the years, so it came naturally. Except there was no integration. I couldn't attach Bevy components to GameObjects. Identifying objects by name worked, but didn't scale. So I created Bevity.
Bevity is..."we have Skein at home." A nightmarish, bug-ridden, proof-of-concept integration that leverages Bevy Remote Protocol to allow you to assign Bevy components to Unity GameObjects and export them in the GLTF extras. The approach is lifted directly from Skein, just with none of the features or polish. I am loath to invest deeply in maintaining something like this; the second Bevy has an officially supported and featureful scene format, I'll abandon this work and focus on a Bevy-native editing experience. I did the minimum work possible.
In the meantime, the jam was a good reason to experiment.
Chain Reaction
Part of my obsession with a proper editing tool stemmed from my plan to make a first person game, regardless of theme. Whatever I did, there was going to be an environment. And I had plans for four of the themes. Plans that I concocted defensively against themes I didn't love, making sure I wouldn't suffer from design paralysis on a theme that didn't resonate.
Chain reaction, though? That resonated. When the theme was announced, I had a concept within the first couple hours: a puzzle game with devices that trigger each other.
But trigger each other how?
The first iteration of a "signal" - the trigger mechanism between devices - was an inside-out sphere that radiated out from the emitter. This had unacceptable performance. Iteration two involved a toroid that expanded outward from an emitter. It felt like a mechanic that belonged in a top down game. I was vexed. How could I emit signals in three dimensions? And how could I do it in a way that didn't plagiarize The Talos Principle?
Thus was born the signal square.
The signal square was my first lesson in designing a puzzle game by maximizing the number of constraints. The expanding, inverted sphere would've made for boring puzzles; it only has two axes of constraint: time and distance. The toroid, only time, distance, and verticality. But the square? Dynamically constrained on all four axes. It could be slower/faster, thinner/thicker, longer/shorter, deeper/shallower. By being the most constrained implementation, it created the most puzzles. Once the signals were implemented, there were suddenly glimpses of puzzles everywhere. The large, immobile - immobility being another constraint - emitters were suited for different situations than the smaller, mobile ones. Some puzzles involve racing against a signal on the time axis. Others involve adjusting the verticality of an emitter.
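To make those axes concrete, here's a minimal sketch of how they could live on a component - the names and fields are hypothetical, not the game's actual code:

```rust
use bevy::prelude::*;

// Hypothetical component: each field is one of the four constraint axes.
#[derive(Component)]
struct SignalSquare {
    speed: f32,     // slower/faster: puzzles can race the signal
    thickness: f32, // thinner/thicker: how forgiving the timing window is
    length: f32,    // longer/shorter: horizontal reach
    depth: f32,     // deeper/shallower: vertical reach
    traveled: f32,  // how far the square has expanded from its emitter
}

fn advance_signals(time: Res<Time>, mut signals: Query<&mut SignalSquare>) {
    for mut signal in &mut signals {
        signal.traveled += signal.speed * time.delta_secs();
    }
}
```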
Design by Constraint
The cube was implemented early on. Something about it felt important to the design language of the game. "This is the kind of game that has cubes." Naturally, in a game about powering devices with signals, the cube could be powered. But early on, it wasn't obvious what a powered cube should do. An inert cube was easy. It goes on switches, it's a platform. What was a powered cube? The first iteration just signaled nearby objects, allowing for chain-reaction style puzzles seen towards the end of the game. On its own, that made powering cubes boring. You needed several to get anything useful done. It was a worse version of a signal. This wasn't constrained; it was useless.
To make them more interesting, I allowed them to only discharge their signal when near another device. Instantly, every puzzle in the game was trivialized by cubes.
I love power spikes in games. I love when you get a skill that makes everything that came before it feel like a joke, and the game slowly humbles you as it outscales that initial burst of power. Powered cubes had to stay; I've never felt so powerful in a puzzle game. But the player needed to be kept humble.
The true design strength of cubes would come from the constraints placed upon them. "Discharge" gates that removed power from cubes carried or thrown through them, "Dissolve" gates that refuse to allow the passage of devices, doors that cannot be opened with a single cube. The game became about "things you can't do with cubes" and "things you can't do without cubes". At the time, this was the opposite of my intention. I hadn't set out to make something that explored the depths of a single mechanic, like "Portal". Rather, I had expected to have a variety of signal producers, each suited to a different purpose. There could have been different shapes and colors of signals with different effects or compatibility. Luckily, the deep focus on the cube mechanics was perfect for a jam.
I would've loved to make more devices, but the scope had already crept. I was still making the main set of puzzle rooms on the second Saturday. I was behind. For all my planning and practice between this jam and the last one, there was no shortage of technical challenges.
Hierarchies are Hard, but Physics is Harder
My games authored with Bevity use a two-phase asset loading system. GLTF scenes are loaded into Bevy Scenes through the asset system, then those Scenes are mutated directly in a postprocess pass. I think of these mutated Scenes like prefabs: a collection of assets in a pre-prepared and reusable state. Unfortunately, only Reflect data can exist in a Scene. So there's a third, hidden phase of asset loading, where any given prefab is instantiated and the various running systems populate it with the appropriate components.

To complicate things, GLTFs exported from Bevity through UnityGLTF are hierarchy-heavy. Each GameObject is an Entity which parents a second Entity containing the Mesh and Material. Unlike Skein, Bevity cannot write to the GLTFMeshExtras or GLTFMaterialExtras. Tag components, therefore, always end up on the parent Entity, which has few other components or bits of identifying information. As my systems and library systems get ahold of these Entities, they each modify different bits of the hierarchy, and the simple task of keeping track of "what Entity has what Components on it" becomes a painful source of mental overhead.
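As a rough illustration of that second phase - a sketch only, with a hypothetical Prefabs resource standing in for my actual bookkeeping - mutating a loaded Scene is just mutating the World it owns:

```rust
use bevy::prelude::*;

// Hypothetical resource tracking scenes that still need postprocessing.
#[derive(Resource, Default)]
struct Prefabs {
    pending: Vec<Handle<Scene>>,
}

fn postprocess_scenes(mut scenes: ResMut<Assets<Scene>>, prefabs: Res<Prefabs>) {
    for handle in &prefabs.pending {
        let Some(scene) = scenes.get_mut(handle) else { continue };
        // A Scene owns a plain World: query it directly and patch entities.
        let mut query = scene.world.query::<(Entity, &Name)>();
        let emitters: Vec<Entity> = query
            .iter(&scene.world)
            .filter(|(_, name)| name.as_str().contains("Emitter"))
            .map(|(entity, _)| entity)
            .collect();
        for entity in emitters {
            // Remember: only Reflect data survives in a Scene.
            scene.world.entity_mut(entity).insert(Visibility::Hidden);
        }
    }
}
```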
One of the additional disadvantages of Bevity is that it has no notion of a mapping between a GameObject and an Entity. I cannot, in Unity, have a Component field that references another GameObject and then converts to a proper Entity reference on the Bevy side. This isn't a technical constraint; I just didn't implement it. It seemed like more trouble than it was worth. But dismissing that does not obviate the need to assign relationships among objects. Doors needed to be associated with poles. Buttons with devices. So as part of my postprocessing, I prepared these relationships based on siblingship: certain Components with certain sibling Components would automatically register particular relationships. Suitable for my needs, if inflexible. The real problem was that it complicated the nature of hierarchies further.
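The siblingship rule itself is simple to sketch. Assuming Bevy 0.16's ChildOf/Children relationships and hypothetical marker components, the pass walks each tagged Entity's siblings and records the relationship:

```rust
use bevy::prelude::*;

#[derive(Component)]
struct Door;

#[derive(Component)]
struct Pole;

// Hypothetical relationship component written by the postprocess.
#[derive(Component)]
struct OpensPole(Entity);

fn link_door_poles(
    mut commands: Commands,
    doors: Query<(Entity, &ChildOf), (With<Door>, Without<OpensPole>)>,
    children: Query<&Children>,
    poles: Query<(), With<Pole>>,
) {
    for (door, child_of) in &doors {
        let Ok(siblings) = children.get(child_of.parent()) else { continue };
        // The first sibling tagged as a Pole becomes this door's pole.
        if let Some(&pole) = siblings.iter().find(|e| poles.contains(**e)) {
            commands.entity(door).insert(OpensPole(pole));
        }
    }
}
```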
Combine all that with the fact that spawning a SceneRoot can't be reacted to until a SceneInstance is ready - even though I've already preloaded all my assets - and managing the state of things in a consistent way, and preventing errant systems from running on unprepared Entities, became incredibly difficult.
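The usual shape of that wait, for reference - a sketch assuming Bevy 0.16's observer API and a made-up asset path - is to hang the remaining setup off the SceneInstanceReady trigger:

```rust
use bevy::prelude::*;
use bevy::scene::SceneInstanceReady;

fn spawn_room(mut commands: Commands, asset_server: Res<AssetServer>) {
    commands
        .spawn(SceneRoot(asset_server.load("rooms/room_01.glb#Scene0")))
        .observe(
            |trigger: Trigger<SceneInstanceReady>, mut commands: Commands| {
                // Only now is the scene's hierarchy fully spawned and safe
                // to finish preparing.
                commands.entity(trigger.target()).insert(Name::new("room_01"));
            },
        );
}
```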
I probably lost a day, total, to "Component-on-wrong-Entity" and "order systems happen in". Physics was a big part of that. A tag that needed to be on a Collider being on a RigidBody. A tag missing entirely. Two RigidBodies in the same hierarchy. Exhausting. But it got worse.
On Friday I playtested the web build for the first time after implementing half of the puzzles. One and a half frames per second. I knew the dev build had been getting choppy locally, so I switched over to a release web build.
One and a half frames per second.
The geometry in at the end of the hall is composed of many cubes. When importing a scene, my asset preprocessing calls Avian3d's trimesh_from_mesh on each object. Tracy revealed that my large amount of complex (due to trimesh_from_mesh) static geometry was destroying performance on web. Parallelism of the narrow phase - impossible on web currently - is an enormous performance boost. Again, I lost most of a day to recover: writing scripts to merge groups of cubes, marking every single Collider that needed a trimesh with a special component, playing through the game to understand which puzzles were made impossible by the changes.
Desperate, and still short on performance, I created a system to disable RigidBodies and Colliders outside of a certain range. Disabling the Colliders caused an unimaginable amount of bugs, made more frustrating by an Avian3d bug with ColliderDisabled. Another several hours lost.
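The range system itself is small. A minimal sketch, assuming Avian3d's ColliderDisabled marker plus a hypothetical Player marker and cutoff:

```rust
use avian3d::prelude::*;
use bevy::prelude::*;

#[derive(Component)]
struct Player;

const PHYSICS_RANGE: f32 = 50.0; // hypothetical cutoff

fn cull_distant_colliders(
    mut commands: Commands,
    player: Single<&GlobalTransform, With<Player>>,
    colliders: Query<(Entity, &GlobalTransform, Has<ColliderDisabled>), With<Collider>>,
) {
    let origin = player.translation();
    for (entity, transform, disabled) in &colliders {
        let far = transform.translation().distance(origin) > PHYSICS_RANGE;
        match (far, disabled) {
            // Entered the far zone: switch the collider off.
            (true, false) => { commands.entity(entity).insert(ColliderDisabled); },
            // Came back in range: switch it on again.
            (false, true) => { commands.entity(entity).remove::<ColliderDisabled>(); },
            _ => {}
        }
    }
}
```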
In retrospect, there are two solutions here.
- For complex objects, manually author and include multiple simplified collider meshes that can be handled with Avian's convex meshing.
- For the simplest of objects, prefer to produce cuboid colliders from their bounding boxes (sketched below).
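The second fix might look like the following - a sketch that assumes the mesh is roughly centered on its origin (Avian's Collider::cuboid takes full extents, so the bounding box's half extents get doubled):

```rust
use avian3d::prelude::*;
use bevy::prelude::*;

// Derive a cheap cuboid collider from a mesh's bounding box instead of
// calling trimesh_from_mesh on it.
fn cuboid_from_mesh(mesh: &Mesh) -> Option<Collider> {
    let aabb = mesh.compute_aabb()?;
    let half = aabb.half_extents;
    Some(Collider::cuboid(half.x * 2.0, half.y * 2.0, half.z * 2.0))
}
```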
Tweening is not a Robust Animation System
at the end of the hall relies heavily on bevy_tween. Every single device has a material or translation tween.
Tweens add complexity because the state of the tween must be manually synchronized with the state of the device. Consider a door pole that lights up slowly as it receives a signal. Rather than data binding the animation directly to the state of the device, the tween is separate. As the state of the device changes, I must author additional code to both manage the existing tweens and instantiate the new tweens with the appropriate state derived from the device. This is bug-prone. Even the final version is absolutely riddled with broken tweens that miscommunicate the state of devices.
To complicate matters further, the bevy_tween API relies on a command extension that inserts tweens as children of the Entity being modified. As groups of observers run in a mysterious order, some despawn Entities that others attempt to add tweens to. With normal Commands, this is trivially solved by try_insert, but the tweening command extension does not include a "try" variant. For this reason, the final version of at the end of the hall ships with a fork of bevy_tween that converts all inserts to try_inserts.
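For reference, this is what plain Commands' try_insert buys you in an observer - a sketch with hypothetical event and component names:

```rust
use bevy::prelude::*;

#[derive(Event)]
struct SignalHit; // hypothetical: fired at a device when a signal reaches it

#[derive(Component)]
struct Powered;

fn on_signal_hit(trigger: Trigger<SignalHit>, mut commands: Commands) {
    // Another observer for the same event may have already despawned the
    // target; try_insert skips quietly instead of panicking.
    commands.entity(trigger.target()).try_insert(Powered);
}
```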
I would've been happier with a custom data binding solution and better state management. Exploring this space is a postmortem action item.
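Directionally, I mean something like this sketch - hypothetical components, not shipped code - where the visual value is derived from device state every frame, so there's nothing to reconcile:

```rust
use bevy::prelude::*;

#[derive(Component)]
struct Powered; // hypothetical device state

#[derive(Component)]
struct PoleGlow(f32); // 0.0 = dark, 1.0 = fully lit

fn bind_pole_glow(time: Res<Time>, mut poles: Query<(&mut PoleGlow, Has<Powered>)>) {
    for (mut glow, powered) in &mut poles {
        let target = if powered { 1.0 } else { 0.0 };
        // Exponential approach toward the state-derived target: no tween
        // entities to spawn, despawn, or desynchronize.
        let step = (8.0 * time.delta_secs()).min(1.0);
        glow.0 += (target - glow.0) * step;
    }
}
```

A second system would copy PoleGlow into the pole material's emissive color; the binding stays correct no matter how chaotically the device's state flips.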
Iteration Speed is Everything
Towards the end of the jam, my iterative compile times with dynamic linking were nearly 40 seconds. I had left UI until the last day, and I couldn't get system hotpatching working before the jam. So every CSS failure, every design mistake, every menu issue was another near-minute-long wait.
I started to make decisions that resulted in the fewest iterations. You'll notice that the menu cursor is always on the right side; this isn't an aesthetic choice, it's a compile-time optimization. Putting the cursor on the right allows you to update it by mutating only the text, without disrupting the horizontal alignment of the start of each menu item. I saved a little time by not having to think about the cursor layout.
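The whole trick fits in one function (a sketch):

```rust
// With the cursor as a suffix, selecting a row only rewrites its String;
// the left edge of every label stays fixed.
fn menu_label(name: &str, selected: bool) -> String {
    if selected {
        format!("{name} <")
    } else {
        name.to_string()
    }
}
```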
Less critically, my asset preprocessing isn't friendly to hot-reloading. So every asset change involved a restart. Just more friction between me and my flow state.
Once I got to a point where mechanics were stable and I spent most of my time in the editor, productivity felt incredible. Having to go back and revise mechanics, especially how they interact with other mechanics? Another timesink, accentuated by the 30-to-40-second punishment of waiting for every mistake. As someone who likes to move fast, this chipped away at my sanity more than anything else. Next time my template will have three requirements:
- Functional asset reloading
- Functional system hotpatching
- Logic divided into crates from the start
An Aesthetic for a Programmer
I'm not a practiced artist. Combine that with my (established) hatred of Blender, and making a 3D game with the amount of creative control I want felt impossible. Unifying off-the-shelf assets never realizes my vision. I've been improving my 3D and shader skills since the last jam, but every project runs right back into this same problem.
Uncreatively, I turned to outlines. Being able to omit lighting (or reduce it to toon shading) and textures, while getting a sharp, visually clear result? It's hard to reject that.
There are a couple of solutions in the ecosystem, primarily bevy_mod_outline and bevy_edge_detection. To me, these have the same problem: a lack of creative control over where outlines go. Relying on depth, normals, or color makes it impossible to remove outlines from certain elements.
Thankfully, there is a fantastic example of using a custom "section color" attribute to do a postprocess pass that renders outlines based on the section color. In my asset postprocessing, I convert model vertex colors to this section color attribute (see the sketch after the list below). With this capability, I invented two separate workflows for defining different kinds of outlines on different objects.
- I export a textured object, and a script in Unity bakes that texture to the vertex color. Then, a second Unity script randomizes those vertex colors, so objects that are the same color do not visually merge. An example of this is the pole of a door pole. It is ridged, but not outlined.
- A Unity script identifies contiguous faces of the model based on a configurable angle and normal analysis, then assigns them random vertex colors. An example of this is the button. Despite having a single-colored body, it still receives outlines along its facets.
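The Bevy-side conversion referenced above is small. A sketch, with a made-up attribute id, assuming the importer produced Float32x4 vertex colors:

```rust
use bevy::render::mesh::{Mesh, MeshVertexAttribute, VertexAttributeValues};
use bevy::render::render_resource::VertexFormat;

// Hypothetical attribute id; the outline postprocess reads this attribute
// instead of the standard vertex color.
const ATTRIBUTE_SECTION_COLOR: MeshVertexAttribute =
    MeshVertexAttribute::new("SectionColor", 978_412_354, VertexFormat::Float32x4);

fn vertex_colors_to_sections(mesh: &mut Mesh) {
    let Some(VertexAttributeValues::Float32x4(colors)) =
        mesh.attribute(Mesh::ATTRIBUTE_COLOR).cloned()
    else {
        return;
    };
    mesh.remove_attribute(Mesh::ATTRIBUTE_COLOR);
    mesh.insert_attribute(ATTRIBUTE_SECTION_COLOR, colors);
}
```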
This approach beautified my simple geometry. Initially, it also introduced a large amount of aliasing, but FXAA performs admirably in this case.
As for the models, they were created with Asset Forge. Highly recommended!
What Does a Door Sound Like?
If I understated it before, I'll reiterate: I'm obsessed with creative control. Pulling music and sound assets off the shelf and dropping them in my game always fails to fulfill the vision. So, I always put my limited sound design skills to use. With Ableton Live and a synthesizer, making clicks and beeps and boops is straightforward. But immediately, I got stuck on a question.
"What does a vertically sliding door sound like?"
I spent an hour annoying myself with sounds. Perhaps doors didn't need to sound like anything after all. Defeated, and on the verge of abandoning sound design, I browsed some free sound libraries for "door noises". Creaking, knocking, slamming. All wrong. I finally came across the sound of a wooden vertical door, lifted by a pulley and crank. It sounded out of place. As a refinement step, I sampled it in Ableton, and pitched the sample to be harmonious with the music. This helped the "wrong" door sound belong more strongly, even if it's imperfect.
It's no technical achievement - it's built into Bevy - but I did think adding spatial audio made a big difference to the sound quality. Sadly, sound effects are few and far between. I just ran out of time.
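For reference, the Bevy side of spatial audio is just a flag on the emitter and a listener on the camera - a sketch with a made-up asset path:

```rust
use bevy::prelude::*;

// Emitter: a one-shot, positioned sound for a door.
fn spawn_door_sound(commands: &mut Commands, asset_server: &AssetServer, door_pos: Vec3) {
    commands.spawn((
        AudioPlayer::new(asset_server.load("sounds/door_open.ogg")), // hypothetical path
        PlaybackSettings::DESPAWN.with_spatial(true),
        Transform::from_translation(door_pos),
    ));
}

// Listener: attached to the camera; the argument is the gap between ears.
fn setup_listener(mut commands: Commands) {
    commands.spawn((Camera3d::default(), SpatialListener::new(0.3)));
}
```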
What's at the end of the hall?
In the wake of 9 stressful and chaotic days of jamming, it's easy to count misgivings. It's easy to compare your vision of what your game could've been with what you delivered, and to feel like you came up short. For me, that hits twice as hard. I have a whole lifetime of abandoned and half-baked projects, each victimized by my inexperience in a different way.
So, it feels weird to walk away proud.
at the end of the hall is the closest thing to a complete experience I've ever produced, and until now I never believed I could get that close on my own.
See you next jam.