
Games

I'm currently taking the free Harvard CS50 Introduction to Computer Science course and built a chicken tic-tac-toe for the Scratch …
A collaboration with publisher United Soft Media. Puzzle adventure for iOS, Android, and Steam, based on the award-winning EXIT …
Search Training is a medical VR application that trains hemianopsia patients to compensate for their vision loss with eye movement …
A mobile adaptation of the classic 'Das verrückte Labyrinth' by Ravensburger. Genre: Puzzle. Platform: Mobile (Android, iOS). Art style: 3D …
I'm really happy to announce that the Museum der Alltagskultur (museum of everyday culture) in Waldenbuch added Craftspeople to their …
A Jester's Tale is a small rhythm game/RPG prototype that I had the pleasure to contribute to at the first …
Won the Pädagogischer Medienpreis 2022 (educational media award). Nominated for 'Best Mobile Game' at the Deutscher Entwickler Preis 2021. A …
A mobile adaptation of the classic Memory® game by Ravensburger. Genre: Puzzle. Platform: Mobile (Android, iOS). Art style: 3D. Engine: …
The digital adaptation of the board game Ubongo for Nintendo Switch. Genre: Puzzle. Platform: Nintendo Switch exclusive …
This is the prototype of an interactive learning app for children created at the cultural hackathon 'Coding da Vinci Süd 2019' …
Space Shop VR is a multiplayer VR game for which you need just one VR headset. https://www.youtube.com/watch?v=ygqLtUSmv3k Genre: Puzzle. Platform: …
- What Remains of Edith Finch meets 13 Reasons Why. - Vices is the prototype of a story-focused adventure …
Aeronauts is the prototype of a Settlers-like strategy game for mobile. Genre: Real-time strategy. Platform: Android. Art style: Comic …
Demolition Disaster is a fast-paced local multiplayer battle arena. Genre: Local multiplayer battle arena. Platform: Windows. Art style: 3D comic. Engine: …
Mother Earth is a puzzle game and an interactive story. Genre: Puzzle adventure. Platform: Android (Google Play Store). Art style: …

Blog

You know something’s off when your disk space vanishes like socks in a dryer, and nothing you delete brings it …
Keynote and workshop …
Let's delve into the nuanced distinction between complicated and complex. While these terms might initially appear interchangeable, it's crucial to …
Even if you're not explicitly working with multi-threading, Unity's internal processes might be, and it's crucial to be able to …
Have you ever wondered why controller input isn't supported by the scroll rect? This is a question that crosses my …
During a demo project at the MD.I Hub for an automotive client, I combined the Oculus Rift (VR headset) with …
Sometimes you need to load assets in the editor without the comfort of a direct reference. Assets are loaded using …
Have you ever wondered what makes code "dirty," what refactoring entails, and why there are no silver bullets in programming? …
In a nutshell: steer clear of OnTrigger callbacks when it comes to managing caches. By caches, I'm referring to …
Improving the inspector in Unity can have significant benefits for game development efficiency and workflow. An organized and easy-to-read …
Before we talk about antialiasing, we must discuss the problem it tries to solve. It's called aliasing and is generated …
I really like reading in general but especially reading technical books. In contrast to digital content, a book must pass …
Sometimes there is an animated gameObject which mustn't stop animating or its state breaks. If it should continue off-screen you …
"Events are evil!" is a common opinion among Unity developers. What are the reasons behind this statement and are interfaces …
Ever checked an object for null and it didn't work? Computers do precisely what you tell them to do -> …
Problem: The CreateAssetMenu attribute is handy if you want a quick way to instantiate scriptable objects. But it falls short …
One constraint of articulation bodies is that they lose their physics state if their hierarchy changes. This behaviour manifests as …
The system was integrated for industry applications like realistic physics simulations for robotics or the training of neural networks. But …
I'm currently playing around with articulation bodies which were introduced in 2020.1. Some parts of the feature are still buggy …
There is a wide range of shaders for visual effects in games. Since the release of Shader Graph, the creation …
I played around with my Arduino starter kit and read a few pages in Making Embedded Systems (more on it …
So you want to build a rhythm game. A game that has elements moving in sync with the beat of …
Can you have dynamic eyes that react to the environment but at the same time use frame-by-frame animation for the …
Problem: Do you ever think "The GameObject setup is so weird, it should be documented" but then don't do it? …
I always liked to spend time with my dad in his electrical workshop. He is a radio engineer and built …
Unity's default gizmos are essential for visual debugging but can only be drawn in world space. This restriction can be …
One evening when I was playing D&D with my friends, I realized that we had a hard time …
Unity's random seems like a straightforward, easily accessible function. It is helpful when dealing with procedural generation …
GDD is the abbreviation for game design document. The GDD is written and maintained by the game designer. It describes …
If you build a game in Unity you will need a few UI screens to guide your player. The …
I will show two examples of custom editors I implemented in production to enhance our workflow with generated data. The …
Converting between the different coordinate systems of the Unity UI can be the key to a lot of cool custom …
Lately, you see a lot of blog posts and news about mixed reality. This reminded me that I wrote a …
Today I want to show you how you can save reliably on the SD card/external storage. This will only work …
A short teaser about my bachelor thesis with the title: Generierung und Evaluierung abwechslungsreicher Gebäudekomplexe (Generation and evaluation of diverse …

WSL2: The Expanding Ghost (and How to Get Rid of It)

You know something’s off when your disk space vanishes like socks in a dryer, and nothing you delete brings it back.


If you’re using WSL2 on Windows, congrats — you’re probably hosting a quietly expanding ghost.

Yes. Even after clearing caches. Even after purging Docker. The virtual disk just… stays large.

The Prime Suspect: Docker’s Build Cache


If you’re running Docker inside WSL, this one’s for you.

Docker build layers stack up like geological sediment:

  • Unpruned build layers quietly accumulate
  • Each new image leaves a little trail behind
  • Over time, this becomes a landfill of gigabytes

One day I checked. My Docker cache was 80 GB… Eighty!
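
Two commands are enough to check and clean this up (they free space inside the Linux filesystem; the virtual disk on the Windows side still needs compacting, as described below):

docker system df      # shows how much space images, containers, and build cache use
docker builder prune  # removes dangling build cache entries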

“But I Deleted Stuff!” – What’s Actually Happening


Yes, you did. Here’s why it didn’t help:

  • Linux: “I marked the space as free.”
  • Windows: “Cool. I’m keeping the file at full size anyway.”
  • You: “Why is my disk still full?”
  • WSL: [unused bytes haunting your SSD]

WSL2 stores its entire Linux filesystem in a .vhdx file — a virtual disk.

Virtual disks aren’t mind readers. They act like physical hard drives: when you delete something inside, the data is logically gone — but the space isn’t reclaimed unless you explicitly clean it up from the outside.

  • It grows as you add stuff
  • It doesn’t shrink when you delete stuff

So even when Linux thinks it’s tidy, to Windows the file is still big, because no one came in with a vacuum. It just… stays big. A quiet monument to every container, cache, and careless npm install.

The Fix: Manually Compact the Disk


You can get rid of the disk ghost — but you need special tools.
Here’s how to make the virtual disk reflect reality.

It’s always worked for me — but I follow one rule:

⚠️ Back up your stuff first. No backup, no pity.

Step-by-step (Windows 11)

  1. Open Command Prompt or PowerShell as Administrator
  2. Shut down all WSL instances: wsl --shutdown
  3. Run the following (adjust the path to match your setup; with the default settings it works as-is):

diskpart
select vdisk file="C:\Linux\ext4.vhdx"
attach vdisk readonly
compact vdisk
detach vdisk
exit

This can take a while depending on your ghost size — go make tea. Or refactor something.

When WSL starts again, the file will finally match the real size of your Linux world.

Memory ghost returned successfully to the realm of unused bytes.

 

 

Why not hold a workshop?

During the Pentecost weekend, I was invited as a lecturer to the seminar

 

“More Than Fun? The Significance of Play for the Individual and Society”,

organized by the German Academic Scholarship Foundation (Studienstiftung des deutschen Volkes).

The seminar brought together students from a range of disciplines—philosophy, engineering, cultural studies and science—creating a stimulating mix of viewpoints. This diversity allowed us to explore play not just as recreation, but as a cultural practice, a mirror for society, and a methodological tool.

 

 

Keynote and workshop

I opened the weekend with a keynote lecture on the multiple roles play takes in our everyday and social lives—roles that go far beyond mere entertainment.

Later I held a full-day workshop which was structured to engage the students on multiple levels:

 

  1. Historical treasure hunt (Schnitzeljagd): Participants explored the heritage of traditional games—discovering their rules, social functions and parallels to our modern society.
  2. Showcase of notable digital games: We examined a curated selection spanning art, science, and entertainment—showing how digital play reflects and shapes cultural narratives.
  3. Mini game jam: Finally, participants teamed up to design board games, applying insights from history, theory, and modern games in a creative sprint.

 

What struck me most was the students’ curiosity and openness. In a setting that balanced critical reflection with creative experimentation, thoughtful prototypes emerged, sparking dialogue across theoretical, historical, and digital dimensions.

For me, the seminar underlined one key insight: academic discourse and play-based methods can enrich each other profoundly. Play isn’t trivial—it’s central to how we learn, communicate, and design culture.

 

 

 

Group picture by: Sebastian Knackstedt

Complicated vs. Complex

Let’s delve into the nuanced distinction between complicated and complex. While these terms might initially appear interchangeable, it’s crucial to discern their subtle differences.

To establish a foundation, let’s turn to the definitions provided by the Merriam-Webster dictionary:

 

Complex:

  • Composed of two or more parts
  • Hard to separate, analyze, or solve

Complicated:

  • Consisting of parts intricately combined
  • Difficult to analyze, understand, or explain

At first glance, these definitions seem closely related, but it’s important to emphasize a significant difference. In contrast to complexity, complication implies difficulty in understanding. Complex problems, on the other hand, aren’t necessarily hard to understand or explain; rather, they highlight the challenge of finding a solution.

 

For a programmer, the guiding principle should always be to seek the simplest and most straightforward solution for a problem, regardless of its inherent complexity. It’s crucial to recognize that a complex problem does not warrant or demand a complicated solution.

 

The term “complicated” finds its most fitting application when describing code that could benefit from a more human-readable approach. While anyone can write code understandable to a computer, skilled programmers distinguish themselves by crafting code that is easily comprehensible to fellow humans.
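
To make this concrete, here is a contrived C# sketch. Both functions behave identically; only one of them is complicated:

// Complicated: technically correct, but needlessly hard to read.
bool IsEvenComplicated(int n) => ((n ^ 1) & 1) == 1;

// Straightforward: the same behaviour, instantly understandable.
bool IsEven(int n) => n % 2 == 0;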

Unity: Callbacks and Multi-Threading

Even if you’re not explicitly working with multi-threading, Unity’s internal processes might be, and it’s crucial to be able to recognise a threading problem when you encounter it. Unity can execute your code on other threads and this might cause multi-threading-related issues in your code.

 

Unity employs multi-threading in serialization, the build process, and audio playback. That’s why a Unity game can freeze while the audio continues to play: the game and the audio run on separate threads.

 

When you use Unity callbacks to implement custom importers or a build process, your code will likely be executed on a different thread, or on multiple threads. Now you need to consider whether your code is thread-safe.

 

One key aspect to be aware of is that the Unity API is not thread-safe, meaning it can’t be accessed outside the main thread. This limitation can lead to strange bugs where variables abruptly change or functions aren’t called.

 

Real-World Example

Consider a scenario where game objects are disabled as part of a build process to eliminate debug features. The function responsible for this task, ShouldBeAlive(), depends on Application.isPlaying.

 

Surprisingly, the value of isPlayMode, which was false when ShouldBeAlive() was called, changed to true within the function. This unexpected behaviour is an indication of a threading issue.

 

 

Why did this happen?

In this case, Unity executed the script on a worker thread as part of the build process. Outside the main thread Application.isPlaying (Unity API) can’t be accessed. Attempting to access it triggers a warning from Unity, but this notification will not reach you because the logger is also part of the API and inaccessible outside the main thread. It will silently fail or exhibit strange behaviour.

 

How to fix it?

Fortunately, all threading problems we encountered in our custom importers and build scripts could be solved by capturing the critical values in variables. In the case of the example, assigning the value of isPlayMode to a variable before passing it to the function effectively solved the issue.
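
A minimal sketch of that fix. The original code isn't shown here, so ShouldBeAlive() and the worker-thread handoff are stand-ins:

using UnityEngine;

public static class BuildCleanupExample
{
	public static void Prepare()
	{
		// Capture the Unity API value while still on the main thread.
		bool isPlayMode = Application.isPlaying;

		System.Threading.Tasks.Task.Run(() =>
		{
			// The captured bool is plain C# data and safe to read on any thread.
			if (ShouldBeAlive(isPlayMode) == false)
			{
				// ... disable debug features here ...
			}
		});
	}

	private static bool ShouldBeAlive(bool isPlayMode)
	{
		// Reads the captured value instead of Application.isPlaying,
		// which must not be accessed outside the main thread.
		return isPlayMode;
	}
}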

Unity: Scroll Rect Controller Input

Have you ever wondered why controller input isn’t supported by the scroll rect? This is a question that crosses my mind every time I work on UI with controller support. Most of the UI components don’t support controller input apart from navigation. While Unity offers a lot of default functionality, there are instances where the basics seem to be missing.

 

To address this issue, I wrote a script. Whenever the event system selects a game object outside the viewport, the scroll rect scrolls (vertically) until it becomes visible. The script can easily be extended to accommodate horizontal scrolling.

 

Feel free to use or modify this version based on your preferences!

 

namespace AutoScroll
{
	using UnityEngine;
	using UnityEngine.EventSystems;
	using UnityEngine.UI;

	/// <summary>
	/// Setup: 
	/// Add the component next to the <see cref="ScrollRect"/>, 
	/// references will be gathered automatically.
	/// 
	/// Functionality: 
	/// When the <see cref="EventSystem"/> selects a game object outside the 
	/// <see cref="ScrollRect.viewport"/>, the <see cref="ScrollRect"/> 
	/// is scrolled to make it visible.
	/// 
	/// Constraints:
	/// Only supports vertical scrolling.
	/// </summary>
	public class AutoScrollOnSelect : MonoBehaviour
	{
		[SerializeField]
		private ScrollRect scrollRect;

		[SerializeField]
		private RectTransform viewport;

		private void Reset()
		{
			if (scrollRect == null)
				scrollRect = GetComponentInChildren<ScrollRect>();

			if (scrollRect != null && viewport == null)
				viewport = scrollRect.viewport;
		}

		private void Update()
		{
			if (IsScrollRectValid(scrollRect) == false)
				return;

			RectTransform currentSelection = GetCurrentSelectedRectTransform();

			if (IsPartOfScrollRectContent(scrollRect, currentSelection))
				UpdateAutoScroll(currentSelection);
		}

		private void UpdateAutoScroll(RectTransform target)
		{
			RectBoundary boundaryInViewPortSpace =
				target.GetRectBoundaryInTargetLocalSpace(viewport);

			bool outsideViewTop = boundaryInViewPortSpace.Max.y > viewport.rect.yMax;
			bool outsideViewBottom = boundaryInViewPortSpace.Min.y < viewport.rect.yMin;

			if (outsideViewTop)
				ScrollUp(scrollRect, viewport, boundaryInViewPortSpace);
			else if (outsideViewBottom)
				ScrollDown(scrollRect, viewport, boundaryInViewPortSpace);
		}

		/// <summary>
		/// The <paramref name="scrollTarget"/> is required to be in the local
		/// space of the <paramref name="viewport"/>.
		/// </summary>
		private static void ScrollUp(
			ScrollRect scrollRect,
			RectTransform viewport,
			RectBoundary scrollTarget)
		{
			float scrollUpDelta = scrollTarget.Max.y - viewport.rect.yMax;
			Scroll(scrollRect, viewport.rect.height, scrollUpDelta);
		}

		/// <summary>
		/// The <paramref name="scrollTarget"/> is required to be in the local
		/// space of the <paramref name="viewport"/>.
		/// </summary>
		private static void ScrollDown(
			ScrollRect scrollRect,
			RectTransform viewport,
			RectBoundary scrollTarget)
		{
			float scrollDownDelta = scrollTarget.Min.y - viewport.rect.yMin;
			Scroll(scrollRect, viewport.rect.height, scrollDownDelta);
		}

		/// <summary>
		/// The <paramref name="scrollDelta"/> is required to be in local
		/// viewport space.
		/// </summary>
		private static void Scroll(
			ScrollRect scrollRect,
			float viewPortHeight,
			float scrollDelta)
		{
			float contentHeight = scrollRect.content.rect.height;
			float overflow = contentHeight - viewPortHeight;
			float unitsToNormalize = 1f / overflow;

			scrollRect.verticalNormalizedPosition += scrollDelta * unitsToNormalize;
		}

		private static bool IsScrollRectValid(ScrollRect scrollRect)
		{
			return scrollRect != null && scrollRect.IsActive();
		}

		private static bool IsPartOfScrollRectContent(
			ScrollRect scrollRect,
			RectTransform rectTransform)
		{
			return
				rectTransform != null &&
				rectTransform.IsChildOf(scrollRect.content);
		}

		private static RectTransform GetCurrentSelectedRectTransform()
		{
			GameObject currentSelection =
				EventSystem.current.currentSelectedGameObject;

			if (currentSelection != null)
				return currentSelection.GetComponent<RectTransform>();
			else
				return null;
		}
	}

	public static class RectTransformExtension
	{
		public static RectBoundary GetRectBoundaryInWorldSpace(this RectTransform rectTransform)
		{
			Vector3 rectMin = rectTransform.TransformPoint(rectTransform.rect.min);
			Vector3 rectMax = rectTransform.TransformPoint(rectTransform.rect.max);
			return new RectBoundary(rectMin, rectMax);
		}

		public static RectBoundary GetRectBoundaryInTargetLocalSpace(this RectTransform rectTransform, RectTransform target)
		{
			RectBoundary boundaryWorldSpace = rectTransform.GetRectBoundaryInWorldSpace();

			Vector3 minLocalSpace = target.InverseTransformPoint(boundaryWorldSpace.Min);
			Vector3 maxLocalSpace = target.InverseTransformPoint(boundaryWorldSpace.Max);

			return new RectBoundary(minLocalSpace, maxLocalSpace);
		}
	}

	public struct RectBoundary
	{
		public readonly Vector3 Min;
		public readonly Vector3 Max;

		public RectBoundary(Vector3 min, Vector3 max)
		{
			Min = min;
			Max = max;
		}
	}
}

Mixed Reality meets Green Screen

During a demo project at the MD.I Hub for an automotive client, I combined the Oculus Rift (VR headset) with the ZED Mini (stereo camera) to craft an immersive mixed-reality experience.

 

The green screen serves a crucial role in outlining the physical space where the virtual reality unfolds. This setup allows the VR user to seamlessly interact with the virtual environment without feeling disconnected from their surroundings.

 

Year: 2018

Engine: Unity 3D

Platform: Oculus Rift + ZED Mini

 

Unity: Load Assets in Editor

Sometimes you need to load assets in the editor without the comfort of a direct reference. Assets are loaded using the AssetDatabase. This can be done by:

  • Path
  • Label
  • Name
  • GUID
  • Instance ID

Instance ID

The instance ID is visible at the top of the inspector when the inspector debug mode is active.

 

Beware: the instance ID survives entering/exiting play mode but changes every time the asset is unloaded, for example when Unity is closed.

Instance ID in the debug inspector
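
If you ever need to resolve an instance ID from editor code, EditorUtility.InstanceIDToObject does the lookup. A minimal sketch:

#if UNITY_EDITOR
using UnityEditor;
using UnityEngine;

public static class InstanceIdLoader
{
	// Resolves an instance ID back to its object. Remember that
	// instance IDs are not stable across editor sessions.
	public static Object LoadByInstanceID(int instanceID)
	{
		return EditorUtility.InstanceIDToObject(instanceID);
	}
}
#endif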

GUID

The GUID of an asset can be found in its meta file.

 

The GUID will only change when the meta file is changed/deleted.

Asset GUID in the corresponding meta-file

Load Specific Asset

I wouldn’t recommend using a specific asset path for loading. It can easily break when someone decides to move the asset. A more future-proof and reliable approach is to load assets based on their GUID.

#if UNITY_EDITOR
using UnityEditor;

public static class AssetLoader
{
	public static T LoadAsset<T>(string guid) where T : class
	{
		string assetPath = AssetDatabase.GUIDToAssetPath(guid);
		var asset = AssetDatabase.LoadAssetAtPath(assetPath, typeof(T)) as T;
		return asset;
	}
}
#endif

Load Assets From Folder

If the assets you want to load are placed in a specific location, you can use AssetDatabase.FindAssets to get them. The function takes a search string and a list of folders as parameters. It’s a bit more involved than loading by GUID, but the search string makes the function very versatile: you can use any input that you would type into the project window’s search field to locate assets.

#if UNITY_EDITOR
using UnityEditor;
using System.Collections.Generic;

public static class AssetLoader
{
	/// <summary>
	/// Loads all assets of type <typeparamref name="T"/> in the 
	/// <paramref name="folderPath"/> including subfolders.
	/// </summary>
	/// <typeparam name="T"> Type of the assets to load.</typeparam>
	/// <param name="folderPath"> Path starting at the asset folder.
	/// Example: Assets/Folder/Folder </param>
	/// <returns></returns>
	public static List<T> LoadAssetsFromFolder<T>(string folderPath) where T : class
	{
		List<T> loadedAssets = new List<T>();
		string searchString = $"t:{typeof(T).Name}";

		string[] assetGUIDs = AssetDatabase.FindAssets(searchString, new[] { folderPath });

		foreach (var assetGUID in assetGUIDs)
		{
			string assetPath = AssetDatabase.GUIDToAssetPath(assetGUID);
			T loadedAsset = AssetDatabase.LoadAssetAtPath(assetPath, typeof(T)) as T;
			loadedAssets.Add(loadedAsset);
		}

		return loadedAssets;
	}
}
#endif

Load Assets with Label

This uses the same approach as the previous function. It just changes the search string for the AssetDatabase.FindAssets.

#if UNITY_EDITOR
using UnityEditor;
using System.Collections.Generic;

public static class AssetLoader
{
	/// <summary>
	/// Loads all labeled assets of type <typeparamref name="T"/> in the 
	/// <paramref name="folderPath"/> including subfolders.
	/// </summary>
	/// <typeparam name="T"> Type of the assets to load.</typeparam>
	/// <param name="folderPath"> Path starting at the asset folder.
	/// Example: Assets/Folder/Folder </param>
	/// <param name="label">Asset label</param>
	/// <returns></returns>
	public static List<T> LoadAssetsByLabel<T>(string folderPath, string label) where T : class
	{
		List<T> loadedAssets = new List<T>();
		string searchString = $"t:{typeof(T).Name} l:{label}";

		string[] assetGUIDs = AssetDatabase.FindAssets(searchString, new[] { folderPath });

		foreach (var assetGUID in assetGUIDs)
		{
			string assetPath = AssetDatabase.GUIDToAssetPath(assetGUID);
			T loadedAsset = AssetDatabase.LoadAssetAtPath(assetPath, typeof(T)) as T;
			loadedAssets.Add(loadedAsset);
		}

		return loadedAssets;
	}
}
#endif

Programming Glossary

Have you ever wondered what makes code “dirty,” what refactoring entails, and why there are no silver bullets in programming? I’ve compiled a glossary with brief explanations to help you gain an overview of some of the perplexing phrases often used by programmers. If something is missing that you’d like to see explained in the glossary, feel free to leave a comment, and I’ll be sure to add it.

 

I divided the glossary into the following sections:

  • General
  • Code Quality
  • Code Structure
  • Problem-Solving
  • Programs
  • Programming Language Attributes

Unity: Don’t Cache in OnTrigger

In a nutshell: steer clear of OnTrigger callbacks when it comes to managing caches.

 

By caches, I’m referring to any collection that monitors the objects within the collider. These callbacks prove unreliable, and Unity currently has no plans to address this behaviour in the immediate future.

 

There’s no assurance that you’ll consistently receive an OnTriggerExit call for every OnTriggerEnter. Moreover, objects may trigger multiple OnTriggerEnters before initiating an OnTriggerExit. The peculiar behaviour often arises when objects are disabled or enabled within the collider. Disabling an object won’t trigger an OnTriggerExit, but enabling it will result in an additional OnTriggerEnter. Additionally, fast-moving objects sometimes fail to trigger correctly.

 

If you still need to track objects, you can use the following script. While it may not solve all problems, it does provide a solution for handling objects that are enabled or disabled within the collider.

 

	using System.Collections.Generic;
	using UnityEngine;
	using UnityEngine.Pool;

	/// <summary>
	/// This component extends Unity's default trigger callbacks by also calling
	/// TriggerExit for game objects and colliders that were disabled while
	/// inside the trigger. Additionally, it calls TriggerExit for all contained
	/// colliders when the component itself is disabled.
	/// </summary>
	public class ReliableTrigger : MonoBehaviour
	{
		public event System.Action<Collider> TriggerEnter;
		public event System.Action<Collider> TriggerExit;

		private readonly HashSet<Collider> enteredColliders = new HashSet<Collider>();

		private void OnTriggerEnter(Collider other)
		{
			enteredColliders.Add(other);
			TriggerEnter?.Invoke(other);
		}

		private void OnTriggerExit(Collider other)
		{
			enteredColliders.Remove(other);
			TriggerExit?.Invoke(other);
		}

		private void Update()
		{
			// Note: Not sure if it's necessary to strip null objects from the
			// hash set to prevent it from growing. Objects are not destroyed often.
			List<Collider> collidersToExit = ListPool<Collider>.Get();
			foreach (var collider in enteredColliders)
			{
				if (collider == null)
					continue;

				if (collider.enabled == false || collider.gameObject.activeInHierarchy == false)
					collidersToExit.Add(collider);
			}

			foreach (var collider in collidersToExit)
				OnTriggerExit(collider);

			ListPool<Collider>.Release(collidersToExit);
		}

		private void OnDisable()
		{
			List<Collider> collidersToExit = ListPool<Collider>.Get();
			collidersToExit.AddRange(enteredColliders);
			enteredColliders.Clear();

			foreach (var collider in collidersToExit)
			{
				if (collider != null)
					OnTriggerExit(collider);
			}
			ListPool<Collider>.Release(collidersToExit);
		}
	}

Unity: Better Inspector

Improving the inspector in Unity can have significant benefits for game development efficiency and workflow.

 

An organized and easy-to-read inspector layout can provide quick access to game objects, components, and properties, making it simpler to identify and modify different elements of the game without having to navigate a cluttered and confusing interface.

 

In addition, input validation in the inspector can reduce the likelihood of errors and mistakes by preventing developers from accidentally entering invalid or out-of-range values.

 

While custom inspectors and editors can be used for this purpose, even simple improvements such as good naming and attributes can make a big difference in enhancing the inspector experience.

 

Example

To illustrate the concept, I will use the following example script to show how to enhance the inspector gradually.

 

using UnityEngine;

public class ExampleScript : MonoBehaviour
{
  /// <summary>
  /// Must be in the Range 1 - 10
  /// </summary>
  [SerializeField]
  private float movementRange = 5f; 

  [SerializeField]
  private float movementSpeed = 3f;

  [SerializeField]
  private Animator animator;

  [SerializeField]
  private float animationParameterA = 0.5f;

  [SerializeField]
  private float animationParameterB = 1f;

  [SerializeField]
  private Transform myTransform;

  [SerializeField]
  private Vector3 someMoveVectorInDegree = Vector3.zero;

  [SerializeField]
  private AudioClip stepSound;

  [SerializeField]
  private AnimationCurve animationCurve;
}

 

1. Context

Organize the properties by context into groups.

 

using UnityEngine;

public class ExampleScript : MonoBehaviour
{
  [SerializeField]
  private Transform myTransform;

  [SerializeField]
  private Vector3 someMoveVectorInDegree = Vector3.zero;

  /// <summary>
  /// Must be in the Range 1 - 10
  /// </summary>
  [SerializeField]
  private float movementRange = 5f; 

  [SerializeField]
  private float movementSpeed = 3f;

  [SerializeField]
  private Animator animator;

  [SerializeField]
  private float animationParameterA = 0.5f;

  [SerializeField]
  private float animationParameterB = 1f;

  [SerializeField]
  private AnimationCurve animationCurve;

  [SerializeField]
  private AudioClip stepSound;
}

 

 

2. Naming

To continue improving the inspector, create separation between the groups from the previous step by either adding space or adding headers with fitting names. Additionally, enhance the naming of variables by either removing excess detail or providing context for improved clarity. 

 

using UnityEngine;

public class ExampleScript : MonoBehaviour
{
  [Header("Movement")]
  [SerializeField]
  private Transform movementTransform;

  [Space]
  [SerializeField]
  private Vector3 moveVector = Vector3.zero;


  [SerializeField]
  private float movementRange = 5f;

  [SerializeField]
  private float movementSpeed = 3f;

  [Header("Visuals")]
  [SerializeField]
  private Animator animator;

  [SerializeField]
  private float animationParameterA = 0.5f;

  [SerializeField]
  private float animationParameterB = 1f;

  [SerializeField]
  private AnimationCurve animationCurve;

  [Header("Audio")]
  [SerializeField]
  private AudioClip stepSound;
}

 

 

3. Tooltips

To restore the detail that was lost in the previous step, add it as a tooltip that can be viewed by hovering over the item in both the inspector and your IDE. Tooltips are also the best place to include unit information, such as indicating whether an angle is in radians or degrees. This helps to avoid confusion when adjusting variables.

 

using UnityEngine;

public class ExampleScript : MonoBehaviour
{
  [Header("Movement")]
  [SerializeField]
  private Transform movementTransform;

  [Space]
  [Tooltip("In degrees")]
  [SerializeField]
  private Vector3 moveVector = Vector3.zero;

  [Tooltip("Must be in the range [1 - 10]")]
  [SerializeField]
  private float movementRange = 5f;

  [SerializeField]
  private float movementSpeed = 3f;

  [Header("Visuals")]
  [SerializeField]
  private Animator animator;

  [SerializeField]
  private float animationParameterA = 0.5f;

  [SerializeField]
  private float animationParameterB = 1f;

  [SerializeField]
  private AnimationCurve animationCurve;

  [Header("Audio")]
  [SerializeField]
  private AudioClip stepSound;
}

 

4. Attributes

Using attributes is an effective method for validating input values and preventing out-of-bounds errors. Some of the available attributes include those that clamp values within a specified range or define minimum values. If you require an attribute that is not built-in, you can find a wide variety of useful options on GitHub. Certain attributes only modify how a value is displayed in the inspector, making it easier to understand the impact of any adjustments.

 

using UnityEngine;

public class ExampleScript : MonoBehaviour
{
  [Header("Movement")]
  [SerializeField]
  private Transform movementTransform;

  [Space]
  [Tooltip("In degrees")]
  [SerializeField]
  private Vector3 moveVector = Vector3.zero;

  [Tooltip("Must be in the range [1 - 10]")]
  [Range(1f, 10f)]
  [SerializeField]
  private float movementRange = 5f;

  [Min(0.1f)]
  [SerializeField]
  private float movementSpeed = 3f;

  [Header("Visuals")]
  [SerializeField]
  private Animator animator;

  [Range(0f, 1f)]
  [SerializeField]
  private float animationParameterA = 0.5f;

  [Range(0f, 1f)]
  [SerializeField]
  private float animationParameterB = 1f;

  [SerializeField]
  private AnimationCurve animationCurve;

  [Header("Audio")]
  [SerializeField]
  private AudioClip stepSound;
}

 

Conclusion

After applying the steps outlined above, you can expect to see significant improvements in the layout and functionality of your inspector. By comparing the original inspector with the final enhanced version, you can easily visualize the benefits of organizing properties by context, enhancing variable naming, and adding tooltips and attributes for improved clarity and validation. With an optimized inspector, you can work more efficiently and effectively, allowing you to focus on developing your game and bringing your vision to life.

 


What’s Antialiasing and Why is it Blurry?

Before we talk about antialiasing, we must discuss the problem it tries to solve.

It’s called aliasing and is generated by improper sampling.

 

Aliasing

The term originates in the field of signal processing. In computer graphics, it describes a variety of ugly artefacts and image flaws created by improper sampling. Staircase-like artefacts that appear at the edges of lines are a common example of aliasing.

 

Example for aliasing on the edges of a line.

 

What exactly happens when aliasing appears comes down to the interaction between the frequencies contained in the original picture and the sampling rate.

 

If you are interested in diving deeper, look up Fourier analysis and the Nyquist frequency. They are not just applicable to audio processing but to everything that can be described as a sum of sine waves, including images.

 

Or take a look at Texturing & Modeling, A Procedural Approach – Third Edition (Ebert, Musgrave, Peachey, Perlin, Worley 2003). A great book on computer graphics I’m currently reading. Its chapter about the different antialiasing approaches inspired this blog post.

 

 

Sampling

Raster images (consisting of pixels) are a digital representation of the real world. The digital images are created by sampling the original with a defined resolution called the sampling rate. This is how cameras turn a nice view into a digital picture!

 

Example of a raster image with visible pixel borders.

 

 

Sampling and reconstruction are fundamental to computer graphics.

 

You can create digital pictures without quality loss when the amount of detail in the original (called bandwidth) is not larger than the resolution of the digital copy.

This is known as the Sampling Theorem.
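
In signal-processing terms: lossless reconstruction requires the sampling rate $f_s$ to be more than twice the highest frequency $f_{max}$ contained in the original,

$$f_s > 2 f_{max}$$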

 

Unfortunately for us, the requirements for this are rarely satisfied because the real world tends to have an infinite amount of detail. This is when aliasing occurs.

 

Sampling example:

When sampling the left picture with a sampling rate of 17×17, we get the picture on the right side. It’s only an inaccurate copy of the original with large, blocky fragments. This happened because the sampling rate was far lower than the original resolution (bandwidth) of the picture.

 

The original picture (left) and the sampled picture (right)

with a grid representing the sampling rate.

 

 

 

Antialiasing

Antialiasing is an attempt to fix improper sampling by

 

  1. Filtering the image to improve the conditions for the sampling
  2. Increasing the sampling rate

 

High frequencies are a cause of aliasing problems, which antialiasing tackles by applying a low-pass filter to the picture before sampling. The visual effect of low-pass filtering is to blur the image. The challenge for good antialiasing is to blur as little as possible while still getting rid of the unwanted frequencies.

 

Additionally, supersampling (oversampling) is used. This means the algorithm samples at a higher rate (resolution) than required, for example 2x, 4x, or 8x per pixel. If you apply 4x antialiasing to your texture/screen, the computer must perform 4 times the work to render the image. This is why antialiasing is expensive.
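
As a rough sketch of what 4x supersampling does per pixel, assuming a hypothetical Sample(x, y) function that evaluates the image at a continuous position:

using UnityEngine;

public static class SupersamplingSketch
{
	// Placeholder for evaluating the source image at a continuous position.
	private static Color Sample(float x, float y) => Color.white;

	// The final pixel colour is the average of four sub-pixel samples.
	public static Color Supersample4x(float x, float y)
	{
		Color sum =
			Sample(x + 0.25f, y + 0.25f) +
			Sample(x + 0.75f, y + 0.25f) +
			Sample(x + 0.25f, y + 0.75f) +
			Sample(x + 0.75f, y + 0.75f);

		return sum / 4f;
	}
}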

 

On the left is the sampled picture with aliasing artefacts,

on the right, the same picture with antialiasing.

Book Recommendation: Algorithm Design

I really like reading in general but especially reading technical books. In contrast to digital content, a book must pass a technical review before publication. This ensures to a certain degree that the content is good or at least not entirely wrong.

 

When searching for information online you must always ensure that you are not consuming outdated or incorrect information. This can be quite hard if you want to learn a new programming language or paradigm because you don’t recognise what’s incorrect.

 

 

 

The Algorithm Design Manual – Second Edition (Steven S. Skiena)

Google Book Preview

 

 

 

My absolute favourite book!

 

I think algorithm design is super interesting and I like Skiena’s writing style. The book gives a broad overview of the different kinds of algorithms, sprinkled with entertaining stories from his experiences with clients.

 

At the end of each chapter, sources for future reading, a few exercises and challenges to deepen the learning are given.

 

 

After reading the book you will know how to recognise algorithmic problems and abstract your concrete problems into the standard algorithmic ones. You can skim through the book and check whether one of the standard problems looks like the one you are trying to solve.

 

Once you know what your problem is called, it’s a lot easier to find a solution. If you’re unlucky, your problem resides in the NP part of the book, but that also means nobody else can solve it efficiently at the moment.

Unity: Animator State Lost?

Sometimes there is an animated gameObject which mustn’t stop animating or its state breaks. If it should continue off-screen you can set the Culling Mode of the animator to Always Animate.

 

But sometimes that’s not enough. When you disable an animator in the middle of an animation, it’s most likely stuck the next time it’s enabled.

 

You can prevent this by setting Keep Animator Controller State On Disable to true. This option is not visible in the default inspector; you must enter the inspector’s debug mode to modify it (a demonstration is below). The property can also be set from code.
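
For example, a minimal sketch (assuming a Unity version where the property is exposed as Animator.keepAnimatorControllerStateOnDisable):

using UnityEngine;

public class KeepAnimatorState : MonoBehaviour
{
	private void Awake()
	{
		// Same setting as the checkbox in the debug inspector.
		GetComponent<Animator>().keepAnimatorControllerStateOnDisable = true;
	}
}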

 

 

Unity: Events vs Interfaces

“Events are evil!” is a common opinion among Unity developers. What are the reasons behind this statement and are interfaces an alternative to events? Example implementation at the bottom.

 

 

Problems

A common source of event bugs is timing issues. For an event to work correctly you have to subscribe before it’s triggered and unsubscribe on destroy/disable. Sounds easy?

 

 

Inactive Objects

You can’t receive an event before you have subscribed. This implies that a script can’t be enabled based on an event if it wasn’t active (and subscribed) beforehand.

 

 

Subscribe

Subscribing in Awake could already be too late. Example: script A triggers an initialized event in Awake and script B subscribes in Awake. B will never receive the event when the Awake of script A runs before that of script B. The order depends on the script execution order, which is effectively unpredictable. Never depend on the script execution order for your events to arrive.

 

Most beginners will try to fix this issue by firing the event in Start and subscribing in Awake. Or they add both scripts to the script execution order settings. 

 

This masks the symptom of the problem but won’t solve the cause. They just introduced an implicit temporal coupling between the two scripts. This is bad because an unrelated piece of code can break when the event is fired earlier or later.

 

 

Unsubscribe

If you subscribe to an event you have to unsubscribe. This is especially important if you subscribe regularly. For example, if you subscribe in OnEnable you must unsubscribe in OnDisable, as in the sketch below. If not, you will continue to add listeners each time the object is enabled, which will cause your listener to be executed multiple times. If you are unsure, unsubscribe more often than you need; unsubscribing without having subscribed first causes no problems.
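
A minimal sketch of the matching subscribe/unsubscribe pattern; GameEvents is a hypothetical publisher used for illustration:

using UnityEngine;

public static class GameEvents
{
	public static event System.Action SomethingHappened;

	public static void RaiseSomethingHappened() => SomethingHappened?.Invoke();
}

public class EventListenerExample : MonoBehaviour
{
	private void OnEnable()
	{
		GameEvents.SomethingHappened += HandleSomethingHappened;
	}

	private void OnDisable()
	{
		// Must mirror OnEnable; otherwise every enable stacks another listener.
		GameEvents.SomethingHappened -= HandleSomethingHappened;
	}

	private void HandleSomethingHappened()
	{
		Debug.Log("Handled exactly once per event.");
	}
}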

 

 

Listener Execution

The order in which event listeners are executed follows the subscription order, which you usually can’t control, so treat it as arbitrary. Events can’t be used if the order of the listeners’ execution is important. Depending on it would also be temporal coupling -> bad practice.

 

 

Solution

These problems can be avoided by using interfaces instead of events.

In this approach, an interface is defined for each event, and the calling code searches for the implementers and executes them. The most basic approach to finding the implementers would be a FindObjectsOfType, filtered for the interface (Unity can’t search for interface types directly). If you know where your listeners are located you could also use GetComponentsInChildren and the like, which do accept interface types. You can also sort the implementers before calling them if the order is important.

 

Advantages

  • No need to subscribe/unsubscribe
  • Can be used to enable GameObjects
  • The execution order of the implementers can be defined

Example Implementation

using System;
using System.Linq;
using UnityEngine;

public interface IExampleHandler
{
  int ExecutionOrder { get; }

  void OnExampleExecuted();
}

public class InterfaceCaller : MonoBehaviour
{
  void Start()
  {
    // FindObjectsOfType can't search for interface types directly,
    // so find all behaviours and filter for the interface.
    IExampleHandler[] handlers = FindObjectsOfType<MonoBehaviour>(includeInactive: true)
      .OfType<IExampleHandler>()
      .ToArray();

    // You can sort the handlers if you need to.
    Array.Sort(handlers, (h1, h2) => h1.ExecutionOrder.CompareTo(h2.ExecutionOrder));

    foreach (var handler in handlers)
      handler.OnExampleExecuted();
  }
}

public class InterfaceListener : MonoBehaviour, IExampleHandler
{
  public int ExecutionOrder => executionOrder;

  // Makes the execution order configurable per instance in the inspector.
  [SerializeField]
  private int executionOrder = 1;

  public void OnExampleExecuted()
  {
    // Return if you don't want the script
    // to receive the call when it's disabled.
    if (enabled == false)
      return;

    Debug.Log("Example executed!");
  }
}




 

Unity: Null Check Not Working?

Ever checked an object for null and it didn’t work? Computers do precisely what you tell them to do -> it must be a programming error that bypasses the null check.

 

This is one of the moments where I doubt my sanity as a programmer.

 

What Happened?

The C# target object isn’t null, but the C++ object it pointed to is. In my case, the target is a MonoBehaviour referenced as an interface. The != operator therefore executed a pure C# null check instead of Unity’s custom null check, which leads to the null reference exception.

 

 

Why?

Unity is a bit special when it comes to null checks. Most of the Unity engine is built in C++. The properties of GameObjects and UnityEngine.Objects live in the C++ part of the engine. The C# object only contains a pointer to the native C++ object.

 

The problem with this separation is, that the lifetime of the C++ and C# objects can be different because of memory management. The C++ object can already be destroyed while the C# object is still waiting for the garbage collector. The C# object isn’t null and passes the C# null check. But accessing it will lead to a null reference exception in the C++ engine.

 

To handle this problem, Unity overloaded the == operator to treat the C# object as null when the native C++ object has been destroyed.
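
A small demo script to see the overloaded operator in action:

using UnityEngine;

public class NullCheckDemo : MonoBehaviour
{
	private void Start()
	{
		var go = new GameObject("Temp");
		BoxCollider box = go.AddComponent<BoxCollider>();
		DestroyImmediate(go);

		// Unity's overloaded == sees that the native object is gone.
		Debug.Log(box == null);          // true

		// A pure C# null check only sees the managed wrapper, which still exists.
		Debug.Log((object)box == null);  // false
	}
}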

 

This has a few problems:

 

  • It’s slow
  • Not thread-safe
  • Inconsistent behaviour with the ?? operator, which does a pure C# null check (and should therefore never be used with UnityEngine.Objects)
  • It does not work when the object is boxed or referenced as an interface

 

Solution

The problem is that the compiler executed a C# null check instead of a Unity null check. This behaviour can be fixed in two ways:

  1. Don’t reference a UnityEngine.Object as an interface or box it
  2. Cast it to UnityEngine.Object before doing the null check.
if ((target as UnityEngine.Object) != null)
    target.OnPointerExit();

 

Unity: Create Assets with Dynamic Name/Properties

Problem

The CreateAssetMenu attribute is handy if you want a quick way to instantiate scriptable objects. But it falls short when a dynamic name or the initialization of different properties is needed, for example the generation of a custom ID.

 

Solution

The solution is to create the asset yourself. The CreateAssetMenu attribute is only a wrapper for the UnityEditor.MenuItem attribute. Use the magic string Assets/Create as itemName to place it in the create menu. You can extend the itemName in the same way as the menuName in the CreateAssetMenu to add subfolders.

 

A simple script to create a scriptable object with a custom name could look like this:

using UnityEngine;

public class CustomObject : ScriptableObject
{
	public int RandomNumber;
	public Texture2D Graphic;

#if UNITY_EDITOR
	[UnityEditor.MenuItem("Assets/Create/Custom/Object at Root (Menu Item)", priority = 10)]
	private static void CreateWithRandomName()
	{
		CustomObject myObject = UnityEditor.ObjectFactory.CreateInstance<CustomObject>();
		myObject.RandomNumber = Random.Range(0, 42);

		string directory = "Assets";
		string assetName = $"NewAssetName{myObject.RandomNumber}.asset";
		string path = System.IO.Path.Combine(directory, assetName);

		UnityEditor.AssetDatabase.CreateAsset(myObject, path);
	}
#endif
}

 

Create at Selection

You can change the path of the created object to spawn it wherever you want. If you would like to mirror the behaviour of the CreateAssetMenu attribute, place it at the position of the current selection. The selection can be accessed via UnityEditor.Selection.activeObject. Be aware that the active object can be an asset or a folder. In the first case, the asset name needs to be stripped from the path.

 

	[UnityEditor.MenuItem("Assets/Create/Custom/Object at Selection (Menu Item)", priority = 10)]
	private static void CreateAtSelection()
	{
		CustomObject myObject = UnityEditor.ObjectFactory.CreateInstance<CustomObject>();
		myObject.RandomNumber = Random.Range(0, 42);

		string directory = UnityEditor.AssetDatabase.GetAssetPath(UnityEditor.Selection.activeObject);
		if (System.IO.Path.HasExtension(directory))
			directory = System.IO.Path.GetDirectoryName(directory);

		string assetName = "NewAssetName.asset";
		string path = System.IO.Path.Combine(directory, assetName);

		UnityEditor.AssetDatabase.CreateAsset(myObject, path);
		UnityEditor.Selection.activeObject = myObject;
	}

 

Create from Selection

Now it’s only a short step to a neat asset integration workflow.
Search the selected assets for the objects you need and add the references to the object instance.

 

Real-life example

We use custom AudioEffect scriptable objects in our games to wrap audio clips to support randomisation and variants.

 

You just have to select the clips and use the create menu. The object pulls the clip references from the selection and generates the correct name based on the source clips’ names. It’s a real time saver if you have 100+ sounds with variants in your game.

 

	[UnityEditor.MenuItem("Assets/Create/Custom/Object from Selection (Menu Item)", priority = 10)]
	private static void CreateFromSelection()
	{
		Texture2D[] graphics = UnityEditor.Selection.GetFiltered<Texture2D>(UnityEditor.SelectionMode.Assets);
		if (graphics != null && graphics.Length > 0)
		{
			CustomObject myObject = UnityEditor.ObjectFactory.CreateInstance<CustomObject>();
			myObject.RandomNumber = Random.Range(0, 42);
			myObject.Graphic = graphics[0];

			string directory = UnityEditor.AssetDatabase.GetAssetPath(UnityEditor.Selection.activeObject);
			if (System.IO.Path.HasExtension(directory))
				directory = System.IO.Path.GetDirectoryName(directory);

			string assetName = $"MyObject_{myObject.Graphic.name}.asset";
			string path = System.IO.Path.Combine(directory, assetName);

			UnityEditor.AssetDatabase.CreateAsset(myObject, path);
			UnityEditor.Selection.activeObject = myObject;
		}
		else
			Debug.LogError("Select a texture to create a CustomObject.");
	}

 

Following the complete script with all examples:

using UnityEngine;

[CreateAssetMenu(menuName = "Custom/Object (Create Asset Menu)")]
public class CustomObject : ScriptableObject
{
	public int RandomNumber;
	public Texture2D Graphic;

#if UNITY_EDITOR
	[UnityEditor.MenuItem("Assets/Create/Custom/Object at Root (Menu Item)", priority = 10)]
	private static void CreateWithRandomName()
	{
		CustomObject myObject = UnityEditor.ObjectFactory.CreateInstance<CustomObject>();
		myObject.RandomNumber = Random.Range(0, 42);

		string directory = "Assets";
		string assetName = $"NewAssetName{myObject.RandomNumber}.asset";
		string path = System.IO.Path.Combine(directory, assetName);

		UnityEditor.AssetDatabase.CreateAsset(myObject, path);
	}

	[UnityEditor.MenuItem("Assets/Create/Custom/Object at Selection (Menu Item)", priority = 10)]
	private static void CreateAtSelection()
	{
		CustomObject myObject = UnityEditor.ObjectFactory.CreateInstance<CustomObject>();
		myObject.RandomNumber = Random.Range(0, 42);

		string directory = UnityEditor.AssetDatabase.GetAssetPath(UnityEditor.Selection.activeObject);
		if (System.IO.Path.HasExtension(directory))
			directory = System.IO.Path.GetDirectoryName(directory);

		string assetName = "NewAssetName.asset";
		string path = System.IO.Path.Combine(directory, assetName);

		UnityEditor.AssetDatabase.CreateAsset(myObject, path);
		UnityEditor.Selection.activeObject = myObject;
	}

	[UnityEditor.MenuItem("Assets/Create/Custom/Object from Selection (Menu Item)", priority = 10)]
	private static void CreateFromSelection()
	{
		Texture2D[] graphics = UnityEditor.Selection.GetFiltered<Texture2D>(UnityEditor.SelectionMode.Assets);
		if (graphics != null && graphics.Length > 0)
		{
			CustomObject myObject = UnityEditor.ObjectFactory.CreateInstance<CustomObject>();
			myObject.RandomNumber = Random.Range(0, 42);
			myObject.Graphic = graphics[0];

			string directory = UnityEditor.AssetDatabase.GetAssetPath(UnityEditor.Selection.activeObject);
			if (System.IO.Path.HasExtension(directory))
				directory = System.IO.Path.GetDirectoryName(directory);

			string assetName = $"MyObject_{myObject.Graphic.name}.asset";
			string path = System.IO.Path.Combine(directory, assetName);

			UnityEditor.AssetDatabase.CreateAsset(myObject, path);
			UnityEditor.Selection.activeObject = myObject;
		}
		else
			Debug.LogError("Select a texture to create a CustomObject.");
	}
#endif
}

Unity: Preserve Articulation Body State

One constraint of articulation bodies is that they lose their physics state if their hierarchy changes. This behaviour manifests as snapping back to their initial configuration/position. You can find a short overview of the constraints and advantages of articulation bodies here.

 

Problem

Changing the active state of the root articulation body is also a hierarchy change. This implies that disabling the root resets the physics state of the complete articulation body chain. I don’t want the active state of my game objects coupled to a physics reset.

 

Left: Intended range of movement.

Right: Default behaviour after re-enable -> state reset.

 

Solution

The reset can be countered by restoring the previous physics state of the hierarchy. It’s easier than it sounds. Most of the properties already contain the values of all articulation bodies in the tree. For an explanation of the underlying system, take a look at the sources linked in my documentation overview.

 

Add the following script to one of the articulation bodies of the hierarchy you want to enable/disable. It works independently of the position in the articulation tree. But I would recommend putting it on the root articulation body. This way the component is easier to find in the hierarchy.

 

Be aware that the setting Match Anchors of all articulation bodies must be disabled for this script to work correctly. The setting would recalculate the anchor of the joint on enable. This is a problem because the state of the articulation body is relative to its anchor position. Changing the anchor makes the restored state invalid.

 

Strange behaviour after enable because the property ‘Match Anchors’ was enabled.

 

Script

using System.Collections.Generic;
using UnityEngine;

public class ArticulationBodyStatePreserver : MonoBehaviour
{
  [SerializeField]
  private ArticulationBody body;

  private bool backupValid = false;

  private readonly List<float> jointPositionBackup = new List<float>();
  private readonly List<float> jointVelocityBackup = new List<float>();

  private void Reset()
  {
    if (body == null)
      body = GetComponentInChildren<ArticulationBody>();
  }

  private void OnEnable()
  {
    if (backupValid)
    {
      body.SetJointPositions(jointPositionBackup);
      body.SetJointVelocities(jointVelocityBackup);
    }
  }

  private void FixedUpdate()
  {
    jointPositionBackup.Clear();
    jointVelocityBackup.Clear();

    body.GetJointPositions(jointPositionBackup);
    body.GetJointVelocities(jointVelocityBackup);

    backupValid = true;
  }
}

 

Unity: Articulation Body Overview

The system was integrated for industry applications like realistic physics simulations for robotics or the training of neural networks. But it can also be used in games where a high-precision physics simulation is needed, for example physics-based puzzles.

 

Some parts of the feature are still buggy or not completely implemented yet. Especially the spherical joint seems to exhibit a lot of bugs that crash Unity’s physics system and should be avoided at the moment.

 

The feature is still new and documentation is rare. If you need more detailed information look at my documentation post. It’s a list of the best documentation sources I found on the internet so far. Following is a short overview of the advantages and constraints of articulation bodies:

 

Advantages

  • Accurate physics simulation with high mass ratios, drive models and better dynamic stability. It’s built for the modelling of real-world physics for industry applications.

 

  • Guaranteed no jitter and no simulation errors because it uses a reduced coordinate system. The constraints and limits are always respected, which means the joints can’t stretch.

 

  • Scales with DoF (degrees of freedom) instead of the number of rigidbodies. The number of articulation bodies the DoF are distributed over is irrelevant. The system doesn’t scale with the number of bodies.

 

  • Mixed setups with articulation bodies and rigidbodies are possible. They affect each other and collide correctly. There is no need to use articulation bodies exclusively.

 

  • Easy to edit with built-in gizmos and a small set of variables.

 

Constraints

  • No transform manipulation. Really, you can’t move these things by changing the transform component. The change will be ignored. The bodies are stable because they can’t be manipulated outside the physics calculations. Side effect: you can’t move articulation bodies in the editor at runtime using gizmos. At the start, it’s super annoying but you get used to it.

 

  • Must use forces or the drive property of the articulation bodies to manipulate them. Only the root is an exception and can be teleported in code by calling ArticulationBody.TeleportRoot.

 

  • Can’t be kinematic. If you want to make an articulation body kinematic, connect it with a fixed joint to a kinematic rigidbody.

 

  • Constrained object setup. The object hierarchy defines the relation between the articulation bodies.

 

  • No circular dependencies, because the objects must be ordered in a tree structure for the physics solver to work. This constraint can be bypassed by building the last link with a rigidbody joint.

 

  • The force applied to a joint can’t be retrieved.

 

  • Hierarchy is fixed. It’s difficult to change the articulation body hierarchy at runtime because doing so resets the physics simulation of the complete articulation tree. The objects lose their state, and every object in the hierarchy snaps back to its original state.

 

  • GameObject.SetActive is problematic. Disabling an articulation body has the same implications as changing the hierarchy: the physics state is reset. But this can be coped with by storing the current state of the articulation tree and restoring it on enable. My solution for this inconvenience is here.

 

  • Prefabs are problematic. Don’t try to make parts of an articulation body hierarchy into a prefab. It breaks the physics hierarchy and behaves oddly.
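
Since TeleportRoot is the only way to reposition a whole articulation tree directly, here is a minimal sketch of what a teleport call could look like. The component and its fields are just an illustration, not part of Unity's API:

using UnityEngine;

public class ArticulationTeleporter : MonoBehaviour
{
  [SerializeField]
  private ArticulationBody root; // must be the root of the articulation tree

  public void Teleport(Vector3 position, Quaternion rotation)
  {
    // Moves the complete articulation tree in one step.
    // Setting transform.position instead would be ignored by the solver.
    root.TeleportRoot(position, rotation);
  }
}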

The documentation states that articulation bodies are more expensive to simulate, but I haven’t seen any major performance bottlenecks so far.

Unity: Articulation Body Documentation

I’m currently playing around with articulation bodies, which were introduced in Unity 2020.1. Some parts of the feature are still buggy or not completely implemented yet. The spherical joint in particular seems to exhibit a lot of bugs that crash Unity’s physics system and should be avoided at the moment.

 

Because the feature is so new, there is very little documentation. I hope to give you a head start by collecting the best information sources I have found so far. If you want to evaluate whether the feature is suited to your needs before reading the documentation, take a look at my articulation body overview for a short summary of the advantages and constraints.


The most helpful source for me was the Unity announcement thread about the new articulation system. It’s maintained by a Unity dev who helps develop the feature. It’s long, but I encourage you to read the whole thread. It’s the best source on the topic and the best place to seek help and request features. There is also a Unity blog post with a more general explanation of the new feature set.

 

If you want to do a deep dive into the underlying functionality, Nvidia’s documentation is also great. Nvidia PhysX 4.0 is the articulation system Unity is integrating.

 

Unity’s scripting API documentation is okay, but the manual is better. Be aware that a few of the documented properties are not implemented yet, for example the inverse dynamics and the joint forces.

Unity: Easy Shader Property Access

There is a wide range of shaders for visual effects in games. Since the release of Shader Graph, the creation of shaders has become more accessible for non-programmers. This has raised the question: how do you set shader variables if you can’t code?

 

Whenever you want to set or get a shader variable, you should do some caching to minimize the impact on runtime performance. This is especially important if the variable is set every frame (update loop).
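
A minimal sketch of the usual caching pattern, assuming the default sprite shader’s _Color property: resolve the property name to an ID once with Shader.PropertyToID instead of passing the string every frame.

using UnityEngine;

public class CachedShaderPropertyExample : MonoBehaviour
{
  // Cache the property ID once; string lookups every frame are wasteful.
  private static readonly int ColorId = Shader.PropertyToID("_Color");

  [SerializeField]
  private Renderer targetRenderer;

  // Reused property block to avoid per-frame allocations.
  private MaterialPropertyBlock block;

  private void Awake()
  {
    block = new MaterialPropertyBlock();
  }

  private void Update()
  {
    block.SetColor(ColorId, Color.red);
    targetRenderer.SetPropertyBlock(block);
  }
}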

 

Approach

A C# class that can be initialized with the shader variable name and its type is handy for programmers who don’t want to write the caching code each time. But this may be a bit too much for a visual artist. A simpler solution is a component which can be configured in the inspector. This eliminates the need for a script. Shader properties that are simple enough to be handled in the inspector include:

 

  • float
  • int
  • color
  • matrix
  • texture
  • vector

I have implemented class and component access for these shader properties. It can be found on my GitHub and is open source (MIT). If you need access to the buffer or array versions of the properties, feel free to extend my implementation.

 

Component Access

The setup for the component is easy. Just add it to a GameObject, then set the references to the renderer and the property name. Now the property can be set/get on the component or added to a Unity Event.

 

Left: the component used to access the tint property of the default sprite shader. Right: an example of how to use the component in combination with a Unity Event.

 

Script Access

If you want to use it in code, look at the following example. It’s a simple script that sets the tint property (shader property name: _Color) of the default sprite shader.

 

On the left, the setup in the hierarchy; on the right, the changing sprite.

 

using UnityEngine;

public class ShaderAccessExample : MonoBehaviour
{
  [SerializeField]
  private SpriteRenderer spriteRenderer;

  [SerializeField]
  private float duration = 3f;

  private float elapsedTime = 0;

  private ShaderColorAccess colorAccess;

  void Start()
  {
    colorAccess = new ShaderColorAccess(spriteRenderer, "_Color");
  }

  void Update()
  {
    elapsedTime += Time.deltaTime;
    elapsedTime %= duration;

    Color newColor = Color.Lerp(Color.yellow, Color.blue, elapsedTime / duration);

    colorAccess.Set(newColor);
  }
}

 

 

Private Variables and LEDs

I played around with my Arduino starter kit and read a few pages of Making Embedded Systems (more on it here). Now I am ready to blink the built-in LED of my board.


My first approach looked like this:

#define LED LED_BUILTIN

void setup() 
{
  SetupLED();
}

void loop() 
{
  EnableLED();
  delay(1000);
  
  DisableLED();
  delay(1000);
}

void SetupLED()
{
  pinMode(LED, OUTPUT);
  DisableLED();
}

void EnableLED()
{
  digitalWrite(LED, HIGH);
}

void DisableLED()
{
  digitalWrite(LED, LOW);
}

 

As a next step, I tried to remove the explicit LOW and HIGH to make the code more sophisticated. I want to toggle the LED independent of its current state.

 

As far as I know, I can’t read the state of the LED if I’m using the pin as output. So I need a local variable to hold the current LED state and keep it in sync.

 

At first, I was confused because there are no access modifiers in Processing. All variables placed at the top of the file are global. It’s really bad practice to make variables global that handle internal state. Other people could set the variable and mess up your state. Or worse, they could read your variable and base their logic on the internal workings of your system. This can cause strange bugs where changing the content of a private function compromises an unrelated system.

 

After searching the web, I found that a variable in function scope can be made persistent by adding the static keyword. It seems a bit strange to me.

Is this the way to go in Processing?

 

#define LED LED_BUILTIN

void setup() 
{
  SetupLED();
}

void loop() 
{
  ToggleLED();
  delay(1000);
}

void SetupLED()
{
  pinMode(LED, OUTPUT);
  DisableLED();
}

void ToggleLED()
{
  // Kind of a persistent variable, but in function scope.
  // It is only LOW the first time the function is called.
  static int persistentValue = LOW;

  if (persistentValue == LOW)
      EnableLED();
  else
      DisableLED();

  persistentValue = !persistentValue;
}

void EnableLED()
{
  digitalWrite(LED, HIGH);
}

void DisableLED()
{
  digitalWrite(LED, LOW);
}

 

Unity: Sync Objects With Music

So you want to build a rhythm game: a game with elements moving in sync with the beat of your music. I contributed to the rhythm game ‘A Jesters Tale‘ at the Nementic game jam, and it turned out that syncing objects with music is not as easy as it seems at first glance. Everything tends to drift off the beat over time if you don’t code it correctly.

 

I found two great articles and a Reddit thread about building a rhythm game. I think the articles are all you need to build one yourself. I won’t repeat their content here but will provide you with links to the sources.

 

Reddit thread: How to Make a Rhythm Game
Article: Coding to the Beat – Under the Hood of a Rhythm Game in Unity
Article: Music Syncing in Rhythm Games
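
As a minimal sketch of the core idea from these sources: measure song time with the audio DSP clock instead of Time.time, so the visuals can’t drift from the music. The tempo field is an assumption for illustration.

using UnityEngine;

public class BeatTracker : MonoBehaviour
{
  [SerializeField]
  private AudioSource music;

  [SerializeField]
  private float beatsPerMinute = 120f; // assumed tempo of the track

  private double songStartDspTime;

  private void Start()
  {
    // AudioSettings.dspTime is driven by the audio hardware clock
    // and stays in sync with the playing music.
    songStartDspTime = AudioSettings.dspTime;
    music.Play();
  }

  public float CurrentBeat()
  {
    double songTime = AudioSettings.dspTime - songStartDspTime;
    return (float)(songTime * beatsPerMinute / 60.0);
  }
}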

Unity: Frame by Frame Animation with Dynamic Eyes

Can you have dynamic eyes that react to the environment but at the same time use frame-by-frame animation for the body? Yes, you can do that in Unity and it’s not difficult at all.


Most of the work is setting up the hierarchy of GameObjects in Unity correctly and creating the animations. The frame-by-frame animation in my example comes from Unity’s ‘2D Roguelike’ tutorial. I added wiggle eyes for demonstration purposes. The tutorial is divided into incremental steps to illustrate the learning process of setting up the animation.

 

 

Step 0 – Create and Move the Player

 

Visual demo and setup of the gameObjects

 

In this setup, all necessary GameObjects are already created: the body with the frame-by-frame animation and the two eyes as separate objects. If you don’t know how to set up an animation in Unity, you can find a lot of tutorials on the internet. (I won’t go into details here.)

With this setup you will quickly notice that the eyes do not move. Only the character moves across the screen. This is because only the body moves. If you want the eyes to move, you have to parent them under the body.

 

 

Step 1 – Parent the Eyes

 

Visual demo and setup of the gameObjects

 

This setup is much better than the previous one. The eyes now move with the character when he moves. But wait, they don’t move up/down with the animation. If the character had a rig, the eyes would automatically move with the body. But we don’t have a rig because we use a sprite animation. This means that we have to manually move the eyes with the animation.

 

 

Step 2 – Animate Eyes

 

Visual demo and setup of the gameObjects

 

Open your sprite animation in the animation window and add keyframes for the eyes. To do this, add the Transform.Position properties of the eyes to the animation. Adjust the keyframes until you are happy with the up/down movement of the eyes. Now all parts of the character’s body move in sync.

 

Animation with the position of the eyes added.

 

 

Step 3 – Move Pupils

Left: demo with eye movement. Right: comparison of all previous steps.

 

Now for the topic of this blog entry: moving the pupils. A simple way to move the pupils is to parent a pupil object under each eye and move it in code. The script below moves the pupil a set distance from the center of the eye in the direction of an object of interest, in our case the yellow ball. The script must be placed on the eye GameObject, not the pupil, to work correctly.

 

 

// Author: Anja Haumann

using UnityEngine;

public class EyeMover : MonoBehaviour
{
    [Tooltip("Pupil that will be moved in the direction of the target")]
    [SerializeField]
    private Transform pupil;

    [Tooltip("Object the eye is looking at")]
    [SerializeField]
    private Transform lookAtTarget;

    [Tooltip("The distance the pupil is alowed to travel from the center of the eye.")]
    [SerializeField]
    private float movementDistance = 0.1f;

    void Update()
    {
        Vector3 direction = lookAtTarget.position - transform.position;
        Vector3 offset = direction.normalized * movementDistance;
        pupil.localPosition = offset;
    }
}

 

This is one of the most basic ways to implement a Look-At. From here you can dive further into the topic and add more complex code. For example, interest zones that change the target of the eyes as soon as the player walks into them. Or different layers of eye behaviour that switch between random eye movement and object focus based on distance.

 

If you take this to the extreme, you’ll end up with the Witcher 3 eye system. So before you dive deep into the matter, you should consider whether it has to be that complicated. Most of the time a simple look-at blended with an eye idle animation is enough. We implemented something similar in Space Shop VR.

 

 

If you want to play around with the implementation yourself, here is my example project. Just download it and try it out: LookAtAnimationTest

Unity: Comment Component

Problem

Do you ever think “The GameObject setup is so weird, it should be documented” but then don’t do it? Or are you forced to insert components in strange places because Unity just can’t do it any other way? Keyword: Unity’s UI rendering order is influenced by the hierarchy.

 

Most of the time it seems excessive to create documentation for a single component. But then a few weeks later you wonder why it’s there, and everything breaks when you delete or move it. It would be handy if you could leave a comment on the GameObject, something like ‘Do not delete/move! This is needed for the clickblocker’. Programmers comment their code, so why not the GameObject setup in Unity too?

 

Solution

And that’s what the comment component is for. It’s a component with a text box which you can put next to weird GameObject setups. It is opened by double-clicking, and you can format it differently depending on the severity: Information, Warning, or Error.

 

 

It is especially useful when you work with several people (designers, programmers) on the same project. It keeps your colleagues from unknowingly breaking already implemented features.

 

It can be very frustrating when three people in a row delete an important component because they think it is unnecessary. Every week the same bug comes back from QA, and every time it is the same component that is missing; only the person who deleted it during a cleanup has changed. This goes on until everyone on the team knows that this particular component has a purpose and needs to stay right there.

 

My colleague wrote the comment component and put the source code up on GitHub.

Starting with Embedded Systems

I always liked to spend time with my dad in his electrical workshop. He is a radio engineer and built several robots with blinking LEDs for me, hoping that I would get into engineering too. But I kind of never did; the robots he built were awesome, but I wasn’t interested in the soldering part.

 

 

The Spider Robot

After beginning my game design courses at university, I started to love programming. My dad saw his second chance and built me a spider robot powered by a Pololu Mini Maestro 24-channel microcontroller. I was hyped, but too inexperienced to get a robot with six legs (18 servos in total) to stand up or even walk. I packed it away in the hope that I could get back to it when I was more experienced.

 

The spider robot my dad built for me.

I know it’s technically not a spider because it has only 6 legs, but I like to call it spider. 🙂

 

 

Return of the Spider

A few weeks before this post I found the box with the spider robot and thought to myself:

 

Now you know how to code, let’s get this robot walking!

 

The robot is now able to stand up and sit down again. I mainly code in the high-level language C# in Unity and realised that there is a huge difference to low-level languages like Basic. It’s a whole new world when you have to manually construct bytes and send them to the servos to get them to do anything. It’s not like you can just call Servo.Move(degrees).

 

 

Learning Embedded Systems

I have enough coding experience to know that I wouldn’t get anywhere if I started with a complex robot before knowing the basics of embedded systems programming.

 

I started my journey by listening to the EMBEDDED.fm podcast to get an idea of what I was getting myself into. I liked the podcast and bought the host’s book, Making Embedded Systems by Elecia White. It’s a really good book for beginners, and I can recommend it if you know how to code and want to get into embedded systems. I don’t think it’s a good entry point if you have no programming experience.

 

My dad got me a Super StarterKit UNO R3 Project for my birthday. I’m now learning embedded systems by starting with the basics: blinking lights.

 

And it’s a lot of fun 😀 Maybe I will post a few of my Arduino scripts in the future.

Unity: Screen Space Gizmos

Unity’s default gizmos are essential for visual debugging but can only be drawn in world space. This restriction can be rather annoying if you try to debug custom UI or viewport-based camera movement. A simple line at a specific pixel coordinate would solve the problem most of the time. There are two solutions to this problem. 

 

Solution A

Add a canvas with a half-transparent image at the screen position. This is easy to do but comes with the following disadvantages:

 

  • Must be created/deleted each time you want to debug
  • Sets the canvas and scene dirty -> causes version control noise
  • Not suitable if the screen position is not static

 

Solution B

Draw the gizmos at the camera’s near clip plane. You can’t draw them exactly on the near clip plane because they would be invisible, so offset them a bit (0.001f) in the camera’s view direction. The gizmos will move with the camera and behave like screen space gizmos. Be aware that the canvas scale must be taken into account when dealing with pixel coordinates. If something is slightly off, chances are high that the canvas.scaleFactor is missing in a calculation.

 

Below is an example implementation that works with all camera projections and canvas modes except world space. In world space, just use the default gizmos.

 

Code at my GitHub: https://github.com/AuriMoogle/Unity-ScreenSpaceGizmos

 

// Author: Anja Haumann 2022 - MIT License
// Explanation and more content at my blog: https://anja-haumann.de

using UnityEngine;

public static class ScreenGizmos
{
  private const float offset = 0.001f;

  /// <summary>
  /// Draws a line in screen space between the 
  /// <paramref name="startPixelPos"/> and the 
  /// <paramref name="endPixelPos"/>. 
  /// </summary>
  public static void DrawLine(
    Canvas canvas, 
    Camera camera, 
    Vector3 startPixelPos, 
    Vector3 endPixelPos)
  {
    if (camera == null || canvas == null)
      return;

    Vector3 startWorld = PixelToCameraClipPlane(
      camera, 
      canvas, 
      startPixelPos);

    Vector3 endWorld = PixelToCameraClipPlane(
      camera, 
      canvas, 
      endPixelPos);

    Gizmos.DrawLine(startWorld, endWorld);
  }

  /// <summary>
  /// Converts the <paramref name="screenPos"/> to world space 
  /// near the <paramref name="camera"/> near clip plane. The 
  /// z component of the <paramref name="screenPos"/> 
  /// will be overridden.
  /// </summary>
  private static Vector3 PixelToCameraClipPlane(
    Camera camera, 
    Canvas canvas,
    Vector3 screenPos)
  {
    // The canvas scale factor affects the
    // screen position of all UI elements.
    screenPos *= canvas.scaleFactor;

    // The z-position defines the distance to the camera
    // when using Camera.ScreenToWorldPoint.
    screenPos.z = camera.nearClipPlane + offset;
    return camera.ScreenToWorldPoint(screenPos);
  }
}

 

Unity: Google Drive + Asset Bundles for D&D

One evening while playing D&D with my friends, I realized that we had a hard time remembering the names of the characters and cities we encountered. It would enhance our experience if we knew them instead of falling back to asking our game master all the time.

 

The idea to write a small app to access a character database was born. There are already a lot of services and online tools available that do exactly this for D&D. If I had been lazy, I would have set up one of these services. But I thought it would be more fun and educational to program the app myself. It didn’t seem too complicated to host a few names and pictures in my cloud and build a handy client to access and modify the data. So I started to figure out the details.

 

Requirements

The requirements I came up with are:

  • mobile client for easy access
  • data hosted online to share between users
  • data must be modifiable without updating the client
  • entries contain a picture and text

I didn’t want to manage a complete database because it seemed like overkill, so I went with a JSON file for the texts and an AssetBundle for the pictures.
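
To give an impression, the data model could look something like this. The class and field names are hypothetical, not the actual ones from my app:

using System;
using System.Collections.Generic;
using UnityEngine;

[Serializable]
public class CampaignEntry
{
  public string name;        // e.g. a character or city name
  public string description;
  public string pictureName; // asset name inside the picture AssetBundle
}

[Serializable]
public class CampaignDatabase
{
  public List<CampaignEntry> entries;

  // JsonUtility covers the text side; the pictures are
  // looked up in the AssetBundle by pictureName.
  public static CampaignDatabase FromJson(string json)
  {
    return JsonUtility.FromJson<CampaignDatabase>(json);
  }
}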

 

Overview of the UI screens of the App

 

Cloud Access

I dropped both onto my Google Drive. I didn’t want to make the data publicly accessible because it contains private data from our campaign. I wanted to protect our files with a password but still be able to access them from the app. An easy solution was to set up a Google Service Account. You can manage file access for these accounts the same way as for human users, and they are required to log in before accessing your files.

 

I don’t recommend using Google Drive as a data host if you plan to share your app with more than a few people; Google will shut down your account if there is too much traffic. I would recommend a regular hosting service for larger user bases. It is fine for my use case because I will only use the app once a week and share it with four users.

 

Implementation

After the cloud was set up, I started implementing the Unity client. It was straightforward to build a few UI screens to access and edit the data. The only thing that took a while was my attempt to use Addressables for building and loading the picture AssetBundle. If you want to load an AssetBundle via Addressables from a non-public URL, you have to implement your own AssetBundleProvider to handle the credentials. An example implementation is the PlayAssetDeliveryAssetBundleProvider on GitHub or in this Unity forums thread.

 

I was able to load the bundles, but it always crashed when Addressables tried to access the assets. Because it’s a fun project, I dropped Addressables and used normal AssetBundles instead. Now I had to implement the loading and unloading myself. That seemed more manageable than finding out what caused Addressables/Unity to crash without any helpful error or stack trace.

 

The not-very-helpful error Unity crashed with when accessing the asset bundle. (I made sure the bundle was not corrupted.)
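
For reference, the manual loading boils down to a few calls. This is a minimal sketch with a hypothetical URL and sprite name; the credential handling for the service account is omitted:

using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

public class PictureBundleLoader : MonoBehaviour
{
  public IEnumerator LoadSprite(string bundleUrl, string spriteName)
  {
    using (UnityWebRequest request =
           UnityWebRequestAssetBundle.GetAssetBundle(bundleUrl))
    {
      yield return request.SendWebRequest();

      // Extract the downloaded bundle and load the picture from it.
      AssetBundle bundle = DownloadHandlerAssetBundle.GetContent(request);
      Sprite sprite = bundle.LoadAsset<Sprite>(spriteName);

      // ... hand the sprite to the UI ...

      // 'false' keeps the loaded assets alive and only
      // frees the compressed bundle data.
      bundle.Unload(false);
    }
  }
}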

 

 

If you also want to use AssetBundles, I can highly recommend the Unity Asset Bundle Browser tool. It is officially deprecated but still works like a charm and is well documented. The tool helps you manage and build AssetBundles. You can download it as a package from its GitHub URL via the package manager.

 

Unity Asset Bundle Browser Tool

 

Where to download packages from Github URLs

Unity: Why Random.Range is Dangerous

Unity’s random seems like a straightforward, easily accessible function. It is helpful when dealing with procedural generation and AI behavior because the seed can be fixed to repeat the behavior. So it seems. But this is not always the case: UnityEngine.Random can cause intricate timing bugs. These bugs are hard to track down in complex systems like AI behaviors that need to behave deterministically.

 

Causes

But what causes this kind of bug? The use of async functions in combination with calls to the static random class. Asynchronous functions are not guaranteed to take the same amount of time on each invocation, so the Random.Range call executes at a different point in time. This can change the call execution order and consequently the behavior of all following classes that use the random functionality.

 

Symptoms

The main symptom of a timing bug related to Unity’s random is a changing list of random values for a fixed seed. Initializing the random seed should make the random numbers deterministic; if they are not, you are dealing with a timing problem.
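
A quick way to check this, as a minimal sketch: initialize the seed and log a few values. With a fixed seed, the same numbers must appear on every run; if they don’t, something asynchronous is interfering.

using UnityEngine;

public class SeedCheck : MonoBehaviour
{
  private void Start()
  {
    // With a fixed seed this must print the same five numbers
    // on every run of the game.
    Random.InitState(12345);

    for (int i = 0; i < 5; i++)
      Debug.Log(Random.Range(0, 100));
  }
}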

Asynchronous systems which can cause timing problems are:

 

  • Sounds
  • Animations
  • Scene loading
  • Addressables
  • Tween libraries

If one of these systems triggers a function that makes a call to Random.Range the outcome is no longer dependable. I will give an example.

 

Example

This is an actual bug I found in our production code. The AI in our current game uses random numbers to vary its behavior. I initialized the random function with a seed and started the game. Running a game with only AI players, I would expect the same AI to win each time. But this was not the case: the winning AI changed randomly even though I was using a fixed random seed.

 

I investigated by comparing the log files the AI wrote and realized that the random numbers used by the AI changed every session, even though the numbers should be the same when using the same seed. This strange behavior was not present when I simulated the game without visual representation, which suggested that the bug was part of the visuals, not the behavior logic. The hard thing about debugging timing-related bugs is that changing the timing can conceal the bug. This means that adding a debug log, changing an unrelated system, or clicking a button one frame later can camouflage the bug temporarily.

 

After digging through the visual representation of the AI for three days, I found a walk animation with exit time and animation events. The events triggered step sounds, which called Random.Range to modify the pitch. This call to Random.Range was the offender. Its execution frame changed because it relied on the animation timing, causing it to run right before or after the AI’s calls, depending on the frame rate.

 

Solution

I solved the problem by introducing a separate random object for the AI: I created a System.Random object and passed it to all AI-related systems to decouple them from the random numbers used by the visuals. Now everything works again and is more robust than before. If deterministic behavior is required, it is not safe to use UnityEngine.Random; you don’t know if another system uses it with asynchronous function calls, leading to changing random numbers even with a fixed seed.
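
As a minimal sketch of the idea (the class names are illustrative, not our production code): give the AI its own random source and pass it in explicitly.

using UnityEngine;

public class AiRandomExample : MonoBehaviour
{
  [SerializeField]
  private int seed = 12345;

  private void Start()
  {
    // The AI gets its own random source, decoupled from
    // UnityEngine.Random and whatever the visuals do with it.
    var aiRandom = new System.Random(seed);
    var brain = new AiBrain(aiRandom);
    Debug.Log(brain.NextVariation());
  }
}

public class AiBrain
{
  private readonly System.Random random;

  public AiBrain(System.Random random)
  {
    this.random = random;
  }

  public float NextVariation()
  {
    // Deterministic for a fixed seed, no matter what
    // sounds or animations call UnityEngine.Random.
    return (float)random.NextDouble();
  }
}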

GDD Example

GDD is the abbreviation for game design document. The GDD is written and maintained by the game designer. It describes all game features in detail and is the go-to document for the development team. Sometimes it is also used as the basis for a publisher contract because it states all features that will be implemented.

 

GDDs used to be the single source of information while developing a game, but this has changed in recent years. It is difficult to handle multiple GDD versions and to ensure that nobody works from outdated information. Multiple game designers can’t work on the same document, and it is hard for the programmers to find the important implementation details. The trend goes towards online hosted wikis.

 

At the company, we mainly use wikis hosted on Confluence. These wikis are maintained by our two game designers and are sometimes made available to our QA testers. Sometimes a GDD is still needed to apply for funding or something similar; for example, the FFF Bayern requires a GDD as part of their evaluation process. As an example, I will link one GDD I wrote for a small game at university.

 

Hope you can understand German:

 

SaveTheGhosts_Anja_Haumann

Unity: UI Workflow Improvement

 

If you build a game in Unity you will need a few UI screens to guide your player. The number of canvases will increase with the complexity and size of your project.

 

The canvases and the UI graphics will start to clutter your scene view because canvases are not movable. This leads to a bad editing workflow and version control noise if you fall back to disabling/enabling the canvases you want to edit.

 

This problem can be solved by parenting a new panel GameObject right beneath the canvas, setting it to stretch, and moving it out of the way. This is the new parent GameObject for your UI graphics.

 

Default Setup

 

 

Improved Setup

 

The panel positions need to be reset at startup to make them visible again. This can be done in a custom build process or with a simple script that resets the position in Awake. The script must be placed on the offset GameObject.

 

Example

 

using UnityEngine;

/// <summary>
/// Used on RectTransform panels which are moved away from their default
/// position in the scene view to organize UI screens for editing.
/// At runtime, the offsets are reset to zero.
/// </summary>
public class ResetRectTransform : MonoBehaviour
{
  private void Awake()
  {
    var rectTransform = (RectTransform)transform;
    rectTransform.offsetMin = Vector2.zero;
    rectTransform.offsetMax = Vector2.zero;
  }
}

 

 

Unity: Custom Editor Examples

I will show two examples of custom editors I implemented in production to enhance our workflow with generated data. The examples are from the game ‘Space Shop VR‘. It is about a shop in space that sells different items to aliens. The twist is that all aliens speak different languages, and the VR player must translate the alien’s request with the help of the other players.

 

All alien cases were generated based on a ruleset table maintained by the game designer. This made it easy for our designer to tweak the rules and removed the need for the programmers to manually edit every case when the game design changed. However, the generation workflow led to quite hard-to-read data objects that made debugging and checking the generated cases difficult and time-consuming. For example, the data objects generated for the alien named Gordianer:

 

 

This is quite hard to read, isn’t it? And the worst part is that there are around 100 of these case objects with a few hundred entries each. What would you do if a playtester reported that one of the questions in the party-hat case is wrong? You should write a custom editor to visually debug the data! I went with these two editors to get a better grasp of which poses the alien ought to perform to get a specified item:

 

 

Another good example is the custom editors of the Ininin alien. The alien consists of multiple smaller entities called Ini. It communicates by walking in circles while changing colors. The data without a custom editor would look like this:

 

 

It was a lot easier for the designers to input the walking patterns (left) that are used to generate the case data (right) with the following editors.

 

 

 

I hope the custom editors in this post gave you an impression of the difference editor code can make for data readability. It is not hard to add dropdowns for scriptable objects or to draw multiple fields in a row. Start small and enhance your data visualization step by step with custom editors.
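
To make the ‘start small’ point concrete, here is a minimal sketch of a custom editor that draws two fields in a row. The AlienCase data object is hypothetical, not the actual Space Shop VR data:

using UnityEditor;
using UnityEngine;

// Hypothetical data object, just for this example.
public class AlienCase : ScriptableObject
{
  public string pose;
  public string item;
}

// Editor code like this belongs in an Editor folder.
[CustomEditor(typeof(AlienCase))]
public class AlienCaseEditor : Editor
{
  public override void OnInspectorGUI()
  {
    serializedObject.Update();

    // Draw the two related fields side by side instead of stacked.
    EditorGUILayout.BeginHorizontal();
    EditorGUILayout.PropertyField(serializedObject.FindProperty("pose"));
    EditorGUILayout.PropertyField(serializedObject.FindProperty("item"));
    EditorGUILayout.EndHorizontal();

    serializedObject.ApplyModifiedProperties();
  }
}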

Unity: UI Space Conversion

Converting between the different coordinate systems of the Unity UI can be the key to a lot of cool custom UI and input features. You can convert basically every point between all available coordinate spaces with a few simple math functions. There are three different spaces:

 

Screen Space

This is the space in which Unity provides its input; for example, the position returned by Input.mousePosition is in screen space. Points in screen space are Vector2 and contain the x and y pixel coordinates on the screen. Some of Unity’s input functions return Vector3 instead of Vector2 for calculation convenience; the z value of these points is always 0. The origin of this coordinate system is the lower-left corner, and its size is the pixel size of the device/window.

 

Be aware that the mouse position can leave the screen space when your game is played in window mode and the user moves the mouse out of the window.

 

Don’t be afraid to use screen space coordinates that are outside of the visible screen area, for example points with coordinates smaller than 0 or larger than the screen size. Unity handles all invisible positions correctly, and you can use them to spawn objects right off-screen.

 

public static Vector3 ScreenToViewPort(Vector3 screenPosition)
{
  return new Vector3(
    screenPosition.x / Screen.width,
    screenPosition.y / Screen.height,
    0); 
}

 

View Port Space

This is basically the same as screen space but normalized to the screen size. The origin of the coordinate system is also the lower-left corner but the size is 1 in each dimension.

 

You can convert between the two spaces by dividing or multiplying the position by the screen size.

 

public static Vector3 ViewPortToScreen(Vector3 viewPortPosition)
{
  return new Vector3(
    viewPortPosition.x * Screen.width,
    viewPortPosition.y * Screen.height,
    0);
}

 

Rect Space

It’s best practice to use Unity’s built-in RectTransformUtility to convert between screen and rect space. It contains the following functions:

 

PixelAdjustPoint

Convert a given point in screen space into a pixel correct point.

 

PixelAdjustRect

Returns the pixel coordinates of the rect transform corners.

 

RectangleContainsScreenPoint

Returns whether a screen point is contained in the rect transform.

 

ScreenPointToLocalPointInRectangle

Transform a screen point to a position in the local space of a rect transform.

 

ScreenPointToWorldPointInRectangle

Transform a screen point to a position in world space that is on the plane of the given rect transform.
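
As a small usage sketch (the fields are illustrative): ScreenPointToLocalPointInRectangle is the workhorse for mapping input to UI, for example to find the mouse position inside a rect.

using UnityEngine;

public class RectSpaceExample : MonoBehaviour
{
  [SerializeField]
  private RectTransform rect;

  [SerializeField]
  private Camera uiCamera; // pass null for Screen Space - Overlay canvases

  private void Update()
  {
    // Convert the mouse position (screen space) into the
    // local space of the rect transform.
    if (RectTransformUtility.ScreenPointToLocalPointInRectangle(
        rect, Input.mousePosition, uiCamera, out Vector2 localPoint))
    {
      Debug.Log("Mouse in rect space: " + localPoint);
    }
  }
}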

Requirements of ‘Mixed Reality’ with an unknown environment

Lately, you see a lot of blog posts and news about mixed reality. This reminded me that I wrote a scientific paper about mixed reality at university (30.11.2017). It’s about the requirements that mixed reality games with an unknown environment place on game design. The German title is ‘Game Design: Anforderungen von Mixed Reality mit unbekannter Spielumgebung’. The goal of the paper was to compile a questionnaire that can help with designing mixed reality games.

If you are familiar with German, you can read the paper here. If you don’t speak German, it’s no problem: I translated the final questionnaire to English below. I hope it will help you design great mixed reality games. 🙂

 

Questionnaire

Spatial minimum requirements

General

  • Are free spaces, surfaces or a combination of both needed?
  • How much space do you need?
    • 30 cm or 3 meters make a big difference.
  • Is a walking area around your game space needed?
  • Is the effort to provide those spatial conditions reasonable?
    • You can clear a table in no time but would your player move the furniture?

Free space

  • Can this much space be found in the average apartment of your target group?
  • If not, can your game be played outside?
    • Garden
    • Parking ground
    • Public spaces
  • Do you need to account for the different physical properties of your players?
    • Eye-level
    • Arm reach
    • Physical disabilities

Surfaces

  • Does your game fit on an average dining table?
  • If not, can it be played on the floor?
    • Requirements for free space apply
    • Carpet floor
    • Wood or stone floor
  • Do your players have physical impairments that prevent them from playing on the floor?
    • Backache
    • Problems with the knees

 

 

Marker Tracking

General

  • Do marker cards give your game additional value that can’t be achieved without them?
  • Does the player need additional control over the placement of game content to get a better experience?

Dynamic Markers – Movement of the markers is part of the game

  • Do you really need the additional input of the card movement or is it just a gimmick?
  • Do you need an even surface to place and move the cards without difficulty?
    • Spatial minimum requirements for surfaces apply
  • If so, how much space do you need?

Static Markers – Are placed once and never moved

  • Are your markers placed once and not moved through the game?
  • Can you play on an uneven surface?

 

 

Location requirements

  • Does it improve the game if the player has to go to a special location to play your game?
    • Popular examples of location-based games are ‘Pokémon Go’, ‘Ingress’ or ‘Zombies, Run!’
  • Does it improve the player experience to visit this location?
  • Is it reasonable for the player to go/travel to this location?
  • Is it possible for your target group to visit these locations?
    • Kids may not be allowed to go there alone
  • Do you need a lot of places so your game can be played around the globe or is it a local game for your city?
  • Is the GPS accuracy enough to support a flawless game experience?

 

 

Procedural environment generation

  • Does the procedural adaption of the environment improve the player experience?
  • Is this not achievable through physical adjustments of the environment?
  • Does the interaction of the player with the game world support a procedural generation of the levels?
  • Is it possible to generate a variety of interesting levels in different physical environments?
  • How do you deal with large open spaces and small furnished places?
  • Must the physical space satisfy any special requirements?
    • Must the player be able to walk around your level or can he walk right through it?
  • How likely is it that the apartment of your players satisfies these requirements?

Unity: Saving to SD Card

Today I want to show you how to save reliably to the SD card/external storage. This will only work on devices with Android Lollipop (API level 21) or greater.

Also hosted on my GitHub: https://github.com/AuriMoogle/Unity-SaveToSDCard

 

No Java code necessary, only Unity!

 

 

If you must save to the SD card and the emulated external storage on the internal storage is not an option for you, this is the code you need.

 

There are two facts that are essential to this approach.

 

  • First fact: It is possible to get the path to the SD card with native Java code. (But we don’t want to write Java; we are Unity developers.)
  • Second fact: Every Java function can be called by its name string from Unity.

 

  • Conclusion: If we can do it in Java, we can do it in Unity too. We just need to know which native functions we want to call.

 

So I searched the native Android developer forums for a way to get a reliable path to the SD card. And finally I found this in the Xamarin forum:

https://forums.xamarin.com/discussion/103478/how-to-get-path-to-external-sd-card

 

In Java you need three functions to get the SD card path:

  1. GetExternalFilesDirs
  2. IsExternalStorageRemovable
  3. IsExternalStorageEmulated

We can also call these functions with the AndroidJavaObject integration from Unity. We just have to pass the names of the native functions above. At this point it is really important to mind the spelling: you must call getExternalFilesDirs, the s is important!

 

AndroidJavaObject[] externalFilesDirectories = 
                    context.Call<AndroidJavaObject[]> 
                    ("getExternalFilesDirs", (object)null);

 

If you forget the s, you will get the path to a single external storage. That can be the SD card, but it’s more likely you get the emulated storage. We want a list of the paths to all available external storages (SD card and emulated).

 

Once we have them, we can find the path to the SD card via its properties. The SD card, in contrast to the emulated storage, is not emulated (yep, that’s a property) and is removable. So we check these properties for each directory.

 

bool isRemovable = environment.CallStatic<bool> 
                   ("isExternalStorageRemovable", directory); 

bool isEmulated = environment.CallStatic<bool> 
                  ("isExternalStorageEmulated", directory);

// We found the sd card!
if (isRemovable && isEmulated == false) 
    return directory;

 

If we now wrap all this together and add a few lines, we get the code below: a function that will give you the path to the SD card, if one is available. If there is no SD card, you will get the emulated storage as a fallback.

 

And now have fun and save everything you’ve got to the SD card!

 

 

private static string GetAndroidExternalFilesDir()
{
     using (AndroidJavaClass unityPlayer = 
            new AndroidJavaClass("com.unity3d.player.UnityPlayer"))
     {
          using (AndroidJavaObject context = 
                 unityPlayer.GetStatic<AndroidJavaObject>("currentActivity"))
          {
               // Get all available external file directories (emulated and sdCards)
               AndroidJavaObject[] externalFilesDirectories = 
                                   context.Call<AndroidJavaObject[]>
                                   ("getExternalFilesDirs", (object)null);

               AndroidJavaObject emulated = null;
               AndroidJavaObject sdCard = null;

               for (int i = 0; i < externalFilesDirectories.Length; i++)
               {
                    AndroidJavaObject directory = externalFilesDirectories[i];
                    using (AndroidJavaClass environment = 
                           new AndroidJavaClass("android.os.Environment"))
                    {
                        // Check which one is the emulated and which the sdCard.
                        bool isRemovable = environment.CallStatic<bool>
                                          ("isExternalStorageRemovable", directory);
                        bool isEmulated = environment.CallStatic<bool>
                                          ("isExternalStorageEmulated", directory);
                        if (isEmulated)
                            emulated = directory;
                        else if (isRemovable && isEmulated == false)
                            sdCard = directory;
                    }
               }
               // Return the sdCard if available, otherwise fall
               // back to the emulated storage.
               if (sdCard != null)
                    return sdCard.Call<string>("getAbsolutePath");
               else if (emulated != null)
                    return emulated.Call<string>("getAbsolutePath");
               else
                    return null; // No external storage found at all.
            }
      }
}

 

 

 

Procedural Building Generation

A short teaser about my bachelor thesis with the title:

Generierung und Evaluierung abwechslungsreicher Gebäudekomplexe

(Generation and evaluation of diverse buildings)

 

The objective of the thesis was to develop an algorithm that generates diverse/interesting buildings via an evaluation function.

 

The buildings are composed of facades and rooms, which are all accessible from the front door via doors and stairs.

 

 

If you are interested and speak German, you can read my bachelor thesis here: Generierung_und_Evaluierung_abwechslungsreicher_Gebäudekomplexe

 

 

Examples of my generation approach can be seen in the videos below.

I added the delay between the steps to make the individual steps of the algorithm more visible. The algorithm can generate a complete building in a few seconds.

 

The generation:

 

 

 

The finished facade of the building:

 

 

 

The generated interior of the building:

 

 

More buildings: