01 April 2020

Migrating to MRTK2 - using the non-native keyboard in touch scenarios

Prelude

With apologies for the uncharacteristic hiatus in my blog - last month I had a HoloLens 2 available for development and test purposes, and utilizing that opportunity to the max had a bit more priority than actually blogging about it. Then the world got hit head-on by the Corona madness and I had other things on my mind. Now, in self-isolation, hoping to avoid the virus (as no doubt most of you are right now), I have finally started to crank out the backed up blog posts I had chalked up 'for later' while I was converting apps for HoloLens 2.

Intro

In the Mixed Reality Toolkit 2 you can use the beautiful system keyboard for text input, and that works amazingly well - in HoloLens 2. Since MRTK2 development prioritizes HoloLens 2, and for good reason, this is not surprising. But in immersive headsets it does not work so well, and HoloLens 1 has the same problem.
Now since MRTK 2.3 there's a new keyboard available - although actually it's an old keyboard: the Keyboard prefab that used to reside in HoloToolkit\UX\Prefabs has been renamed to the NonNativeKeyboard prefab and now sits in MixedRealityToolkit.SDK\Experimental. It has a few advantages over the native keyboard:
  • It is a Unity object, not a native object, so you can control size, rotation and position just like any other Unity object
  • It has basically the same API and usage as the old keyboard, which makes it attractive to use in existing applications.
  • It has a built-in button for speech recognition
  • It gives a consistent look & feel for your apps.
It also has a few quite distinct disadvantages:
  • It does not support touch events for HoloLens 2
  • It does not take into account the differences in apparent size in WMR headsets and HoloLenses
  • It should act differently in different environments (like appearing close by on HoloLens 2, and further away and bigger in other cases), which it does not
Now this, my friends, can be mitigated with a pretty simple add-on behaviour that I created. It takes care of positioning, platform-dependent scaling and distance - but above all, it adds touch support to the non-native keyboard in a very simple way.

How it works

The start is simple enough: just some settings for each platform:
public class KeyboardAdapter : MonoBehaviour
{
    [SerializeField]
    private float Hl1Distance = 1.0f;
    [SerializeField]
    private float Hl1Scale = 1.0f;
    
    [SerializeField]
    private float Hl2Distance = 0.3f;
    [SerializeField]
    private float Hl2Scale = 0.3f;

    [SerializeField]
    private float WmrHeadSetDistance = 0.6f;

    [SerializeField]
    private float WmrHeadSetScale = 0.6f;

    [SerializeField] 
    private AudioClip _clickSound;

    private AudioSource _clickSoundPlayer;
}
Basically a couple of settings. For every supported platform (HoloLens 1, HoloLens 2 and Windows Mixed Reality headsets) there is a distance from the user at which the keyboard will appear, and an apparent scale it will have. Also, you can assign a click sound to be played when a key is hit.
Almost all the heavy lifting happens in Start:
private void Start()
{
    _clickSoundPlayer = gameObject.AddComponent<AudioSource>();
    _clickSoundPlayer.playOnAwake = false;
    _clickSoundPlayer.spatialize = true;
    _clickSoundPlayer.clip = _clickSound;
    var buttons = GetComponentsInChildren<Button>();
    foreach (var button in buttons)
    {
        var ni = button.gameObject.AddComponent<NearInteractionTouchableUnityUI>();
        ni.EventsToReceive = TouchableEventType.Pointer;
        button.onClick.AddListener(PlayClick);
    }
}
The first four lines simply add and initialize the sound that is played when you tap a button. The next ones do the real work: they find every Button object in the keyboard, add a NearInteractionTouchableUnityUI to it, set the events to receive to "Pointer", and add an event listener to the Button that basically only serves to play the sound.
The keyboard is built of Unity UI components, and adding a NearInteractionTouchableUnityUI and setting EventsToReceive to Pointer is all that is necessary to make a button 'think' it's clicked when it's actually touched. And if you have set the Click Sound to an audio file in the editor, it now plays a sound when you tap or touch a key, too.
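The PlayClick handler that is hooked up in Start is not shown above; as said, it does nothing more than play the assigned clip. A minimal sketch of what it could look like (the null check is my own addition, the method name comes from the AddListener call in Start):
private void PlayClick()
{
    // Only try to play something if a click sound was actually assigned in the editor
    if (_clickSound != null)
    {
        _clickSoundPlayer.Play();
    }
}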
Then we have these two properties that return the right value depending on the platform the app is running on:
private float Scale => GetPlatformValue(Hl1Scale, Hl2Scale, WmrHeadSetScale);
private float Distance => GetPlatformValue(Hl1Distance, Hl2Distance, WmrHeadSetDistance);
Which is done by this little method:
private float GetPlatformValue(float hl1Value, float hl2Value, float wmrHeadsetValue)
{
    if (CoreServices.CameraSystem.IsOpaque)
    {
        return wmrHeadsetValue;
    }

    var capabilityChecker = CoreServices.InputSystem as IMixedRealityCapabilityCheck;

    return capabilityChecker.CheckCapability(MixedRealityCapability.ArticulatedHand) ? 
            hl2Value : hl1Value;
}
If the headset is opaque, then it's a Windows Mixed Reality headset (or at least not a HoloLens), and otherwise we determine based upon the capability of tracking hands whether it's a HoloLens 1 or 2.
This is used to show the keyboard at the desired place and at the desired scale:
public void ShowKeyboard()
{
    NonNativeKeyboard.Instance.PresentKeyboard();
    NonNativeKeyboard.Instance.RepositionKeyboard(CameraCache.Main.transform.position + 
                                                  CameraCache.Main.transform.forward * 
                                                  Distance, 0f);
    NonNativeKeyboard.Instance.gameObject.transform.localScale *= Scale;
}
And that is really all. Some creative use of components already in the MRTK2.
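The only thing left is to actually call ShowKeyboard from somewhere. In the demo project this happens via the "Open Keyboard" speech command (see the conclusion); a minimal sketch of such a caller - the class name and the registration code are my own illustration, not part of the KeyboardAdapter, and the keyword must also be defined in the MRTK speech commands profile - could look like this:
using System;
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

public class KeyboardOpener : MonoBehaviour, IMixedRealitySpeechHandler
{
    [SerializeField]
    private KeyboardAdapter _keyboardAdapter;

    private void OnEnable()
    {
        // Register globally, so the command works without this object having focus
        CoreServices.InputSystem?.RegisterHandler<IMixedRealitySpeechHandler>(this);
    }

    private void OnDisable()
    {
        CoreServices.InputSystem?.UnregisterHandler<IMixedRealitySpeechHandler>(this);
    }

    public void OnSpeechKeywordRecognized(SpeechEventData eventData)
    {
        // "Open Keyboard" also needs to be present in the speech commands profile
        if (eventData.Command.Keyword.Equals("Open Keyboard", StringComparison.OrdinalIgnoreCase))
        {
            _keyboardAdapter.ShowKeyboard();
        }
    }
}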

Usage

Just drop a NonNativeKeyboard prefab from the MRTK2 into your scene, and drop this behavior on it. You then only have to take care of two things. First, set Min Scale and Max Scale both to 1, otherwise these will interfere with the way the keyboard is scaled by this behavior:

And of course, you have to set some parameters for the behavior itself. I have chosen what I like to think are reasonable settings for every platform:

And of course the sound that should be played when you tap or touch the keyboard:

How it looks

On a HoloLens 2 (in my app Walk the World for HoloLens 2) it looks like this:

It is pretty close, as you are supposed to be able to touch it.
On HoloLens 1 it looks like this:

Pretty much the same - bigger, but further away and thus easier to control with the air tap. Since this is a 2D picture, the differences between HoloLens 1 and 2 are actually hard to spot. And on Mixed Reality it looks like this - it looks smaller, but that's because in a headset everything looks smaller. Its apparent size is the same as in HoloLens 1.

And finally, an action movie of the touch-enabled non-native keyboard on a HoloLens 2. You can actually touch the buttons and they respond with a button press sound, as you can see.

Why no solvers?

You might have noticed I did not use any MRTK2 solvers to keep the keyboard floating in view when you move your head, or to keep it at a dynamic distance. I did initially, but I found out that a keyboard that actually moves is very annoying when you are trying to type, especially when using touch. Then the keyboard is close, and a move is easily triggered when you want to do things like first type the A (far left), then the P (far right). So I decided to just let it appear right in front of the user, and keep it there. If you don't use it, it automatically disappears after a configurable timeout. This is built into the keyboard; that is not my doing.
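For reference, the kind of solver setup I mean - and ended up removing - would look roughly like this. This is a sketch only; the distance values are illustrative, not what I actually used:
using Microsoft.MixedReality.Toolkit.Utilities.Solvers;
using UnityEngine;

public class KeyboardFollower : MonoBehaviour
{
    private void Start()
    {
        // A RadialView solver keeps the object floating in front of the user.
        // The SolverHandler tracks the user's head by default.
        gameObject.AddComponent<SolverHandler>();

        var radialView = gameObject.AddComponent<RadialView>();
        radialView.MinDistance = 0.3f;
        radialView.MaxDistance = 0.5f;
    }
}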

Conclusion

It's quite remarkable how you can make 'old' components interplay with the new HoloLens 2 interaction possibilities using the new components and a bit of imagination. Also, the MRTK2 has made it a lot easier to make apps that run on HoloLens 1, HoloLens 2 and Windows Mixed Reality headsets alike. Although it's clear MRTK2 prioritizes HoloLens 2 above all - and that's logical, because that's where the action and the business is. I fear the venerable, trailblazing HoloLens 1 will quickly lose mindshare and disappear from the radar when HoloLens 2 becomes more widely available. I am not quite sure what to do with mine, but time will tell.
As usual, you can find the code of the little project here. If you want to run the app and see the keyboard, just say "Open Keyboard".

Post scriptum

I put the part that makes the NonNativeKeyboard touchable in a pull request to the MRTK2 that was merged on April 14, and it will be part of the MRTK 2.4.0 release.

2 comments:

Travis Rothbloom said...

Thanks for this. Seems to be some uncertainty regarding what's the best path forward in MRTK regarding the keyboard as there have been a few issues pointed out with TouchScreenKeyboard and MixedRealityKeyboard - is NonNativeKeyboard going to be the way MRTK recommends going forward? Seems to be recent development in both.

I'm also interested in whether you think swiping would be feasible with the NNK - would the inherent latency be too much to make this feasible?

FYI seemed like the repo you linked to was still empty but did find this: https://github.com/microsoft/MixedRealityToolkit-Unity/tree/mrtk_development/Assets/MRTK/Examples/Experimental/NonNativeKeyboard

Joost van Schaik said...

Thanks for informing me I had forgotten to actually add the files to the repo and push them. This has been corrected.

Swiping on the NNK would be quite some work. Not sure if you can get the necessary accuracy to make this actually helpful. Swiping with your finger on a glass keyboard is one thing, but moving your whole arm and hand to do some... not sure.