Supporting Multitouch in Unity apps on Linux thanks to libinput and Rust

This is the first entry in a new series of “Behind the scenes” interviews, where we want to showcase some of the interesting, challenging, and surprising things we do in our projects. Today, our three developers Ville, Jonas, and Pascal talk about a neat solution they found to improve support for multitouch input when running apps created with the Unity game engine on Linux.

Pascal: Thanks for joining me today! Recently, the three of us worked on an interesting side project to improve the way we deal with touch input in Unity projects running on Linux. Why did we do this again?

Jonas: For quite some time now, we’ve been working a lot with the Unity game engine. We use it for a wide variety of applications, like visualizations for industry fairs or concept car infotainment systems. While Unity was originally made for creating 3D games, we’ve been able to use its tooling in slightly… unconventional ways. That also means we’ve run into issues many Unity developers haven’t. For example, our applications typically run on Linux systems with large touch screens. For understandable reasons, those configurations aren’t as reliable as ones targeting game consoles, Windows gaming PCs, or smartphones.


Pascal: My understanding is that this has been an issue in past projects, too. Can you quickly describe the underlying problem?

Ville: So, when targeting Linux systems, Unity uses the SDL library for window management integration as well as mouse, keyboard, touch, and joystick input. That’s not a bad solution in itself, but for touch it only supports a single touch point. Even more problematic: because of mismatched architectures between SDL and Unity, we kept running into timing issues where tapping the touch screen would sometimes register a touch at the last touched position instead of the new one. This, of course, is thoroughly confusing for the end user.


Pascal: I see. How did you previously solve this, then?

Jonas: We spent a huge chunk of time tracking down exactly when and why the issue appears. Then our amazing colleague Eike Siewertsen managed to build a workaround for the exact version of SDL shipped with Unity 2019.2. He packaged it as a binary patch we could run as the last step of our build toolchain. That worked well for that specific version, but things got worse as Unity released new updates: our workaround no longer applied, yet the core issue remained…

Pascal, Jonas, and Ville doing this interview from home
Social distancing edition.


Pascal: I can see how this can be quite fragile. What is the new approach? How’d you come up with it?

Ville: Continuously forking, patching, and maintaining a variant of SDL is definitely not what we want to spend our time on. Instead, we decided to isolate the problem by first asking: what is the easiest, most robust way to read touch input on Linux? The answer was to hook ourselves up directly to libinput. As the name suggests, that’s the library all our Linux systems already use to deal with input devices of every kind.
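For illustration, here’s roughly what reading touch events straight from libinput looks like in Rust, using the `input` crate (Rust bindings for libinput). This is a minimal sketch, not our production code: it just opens the default seat via udev and prints touch events as they arrive.

```rust
// Minimal sketch: read touch events from libinput via the `input` crate.
// Needs read access to /dev/input/* (e.g. membership in the `input` group).
use std::fs::{File, OpenOptions};
use std::os::unix::{fs::OpenOptionsExt, io::OwnedFd};
use std::path::Path;

use input::event::touch::{TouchEvent, TouchEventPosition, TouchEventSlot};
use input::{Event, Libinput, LibinputInterface};
use libc::{O_RDONLY, O_RDWR, O_WRONLY};

struct Interface;

// libinput doesn’t open device files itself; it asks us to do it.
impl LibinputInterface for Interface {
    fn open_restricted(&mut self, path: &Path, flags: i32) -> Result<OwnedFd, i32> {
        OpenOptions::new()
            .custom_flags(flags)
            .read((flags & O_RDONLY != 0) | (flags & O_RDWR != 0))
            .write((flags & O_WRONLY != 0) | (flags & O_RDWR != 0))
            .open(path)
            .map(|file| file.into())
            .map_err(|err| err.raw_os_error().unwrap_or(-1))
    }
    fn close_restricted(&mut self, fd: OwnedFd) {
        drop(File::from(fd));
    }
}

fn main() {
    let mut ctx = Libinput::new_with_udev(Interface);
    ctx.udev_assign_seat("seat0").expect("failed to assign seat0");
    loop {
        ctx.dispatch().expect("libinput dispatch failed");
        for event in &mut ctx {
            if let Event::Touch(touch) = event {
                match touch {
                    TouchEvent::Down(e) => {
                        println!("down slot={} x={:.1} y={:.1}", e.seat_slot(), e.x(), e.y())
                    }
                    TouchEvent::Motion(e) => {
                        println!("move slot={} x={:.1} y={:.1}", e.seat_slot(), e.x(), e.y())
                    }
                    TouchEvent::Up(e) => println!("up   slot={}", e.seat_slot()),
                    _ => {} // Frame/Cancel not handled in this sketch
                }
            }
        }
    }
}
```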

Pascal, our resident Rust hacker, quickly wrote a small library that reads touch events directly through the Rust libinput bindings. Since it’s a native library, we can load it with DllImport in C# in Unity (when running on Linux). All the library exposes is a simple getter function, which we poll from Unity’s run loop to get the current list of active touches each frame. All of a sudden we had the raw touch input, to the best of Linux’s knowledge, on every frame! To our great relief, the performance difference between this and SDL is negligible.
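The exported surface can stay tiny. As a sketch of the idea (the names `Touch` and `copy_active_touches` here are illustrative, not the actual API): a `#[repr(C)]` struct per touch point, a snapshot of the active touches behind a mutex, and a single C-ABI getter that copies that snapshot into a caller-provided buffer.

```rust
// Illustrative sketch of a C ABI the Unity side could poll each frame;
// names and struct layout are hypothetical.
use std::sync::Mutex;

#[repr(C)]
#[derive(Clone, Copy)]
pub struct Touch {
    pub slot: i32, // stable id while the finger stays down
    pub x: f64,    // position, already transformed to screen space
    pub y: f64,
}

// Current touches, kept up to date by the thread reading libinput events.
static ACTIVE: Mutex<Vec<Touch>> = Mutex::new(Vec::new());

/// Copies the active touches into `out` (which must have room for
/// `capacity` entries) and returns how many were written.
#[no_mangle]
pub unsafe extern "C" fn copy_active_touches(out: *mut Touch, capacity: i32) -> i32 {
    let touches = ACTIVE.lock().unwrap();
    let n = touches.len().min(capacity.max(0) as usize);
    std::ptr::copy_nonoverlapping(touches.as_ptr(), out, n);
    n as i32
}
```

On the C# side, a matching DllImport declaration plus a preallocated buffer is then enough to read the whole list once per frame.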

Pascal: That sounds like an awesome hack! So how’d you hook that into Unity? Does this replace the usual input system completely?

Ville: Exactly! The final piece of the puzzle was hooking into Unity’s input module system and replacing the StandaloneInputModule with our alternative. Not only did a polling-based system like this free us from taps not always registering at the correct position; we also finally had multitouch support in Unity on Linux! And that opens up a huge number of new possibilities!

We started out replacing parts of the touch input system piece by piece to mitigate the issues we were seeing. But fairly quickly we realised that the benefits heavily outweighed the (basically non-existent) downsides, so we took a leap of faith and switched to the new system entirely. Of course, it took some tweaking and testing to figure out the best way to integrate with Unity’s update loop and Script Execution Order, but once that was done the latency was down to unnoticeable levels. Actually, I’m not even sure a delay exists anymore...

Pascal: What does all this actually mean for how we develop Unity apps? Do other devs need to know about this?

Jonas: It means we can keep building on Unity, a platform our design and front-end dev team has come to use (and love?) more and more lately. And we can pair it with Linux, well-known ground for our hardware integration and backend developers, without worrying that a lesser touch input system will drag down the end-user experience.

We’ll definitely keep improving what we already have, to get a system that is solid and as feature-complete as we need, and perhaps share it with the world once it’s ready! It would feel great knowing that even one other person or team learned about this solution before diving into SDL-binary-patch mode to fix a silly, hard-to-track issue, and spent their time creating a better product instead!

Pascal: Thanks so much for your time, folks! Let’s do this again soon!
