03 May 2020

Migrating to MRTK2: right/left swappable hand menus for HoloLens 2

Intro

As I announced in this tweet that you might remember from February, the HoloLens 2 version of my app Walk the World sports two hand palm menus - a primary, often-used command menu that is attached to your left hand and that you operate with your right, and a secondary, less-used settings menu that is attached to your right hand and that you operate with your left. Lorenzo Barbieri of Microsoft Italy, a.k.a. 'Evil Scientist' ;), made the brilliant suggestion that I should accommodate left-handed usage as well. And so I did - I added a button to the settings menu that actually swaps the 'handedness' of the menus. This means: if you select 'left handed operation', the main menu is operated by your left hand, and the secondary settings menu by your right.

A little video perhaps makes this clearer:

This blog post explains how I made this work. I basically extracted the minimal code from my app and made it into a mini app that does no more than make the menu swappable - both by pressing a toggle button and by a speech command. I will discuss the main points, but not everything in detail - as always, you can download the full sample project and see how it works in the context of a complete running app.

This sample uses a few classes from my MRTKExtensions library of useful scripts.

Configuring the Toolkit

I won't cover this in much detail, but the following items need to be cloned and partially adapted:

  • The Toolkit Configuration Profile itself (I usually start with DefaultMixedRealityToolkitConfigurationProfile). Turn off the diagnostics (as ever)
  • The Input System Profile
  • The SpeechCommandsProfile
  • The RegisteredServiceProviderProfile

Regarding the SpeechCommandsProfile: add two speech commands:

  • Set left hand control
  • Set right hand control

In the RegisteredServiceProviderProfile, register the Messenger Service that is in MRTKExtensions.Messaging. If you have been following this blog, you will be familiar with this beast: I introduced this service as a Singleton behaviour back in 2017 and converted it into a service when the MRTK2 arrived.
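For reference, the surface of this service as it is used in this post looks roughly like this - a sketch based purely on how it is called below, not the full implementation (the real one lives in MRTKExtensions):

using System;
using Microsoft.MixedReality.Toolkit;

namespace MRTKExtensions.Messaging
{
    // Minimal sketch of the messenger surface as used in this post;
    // the actual implementation in MRTKExtensions has more to it
    public interface IMessengerService : IMixedRealityExtensionService
    {
        // Subscribe to messages of type T
        void AddListener<T>(Action<T> listener);

        // Send a message of type T to all listeners
        void Broadcast<T>(T message);
    }
}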

Menu structure

I already explained how to make a hand menu last November, and in my previous blog post I explained how you should arrange objects that should be laid out in a grid (like buttons). The important things to remember are:

  • All objects that are part of a hand menu should be in a child object of the main menu object. In the sample project, this child object is called "Visuals" inside each menu.
  • All objects that should be easily arrangeable in a grid should be in a separate child object of their own. I always call this child object "UI", and that is the object you put the GridObjectCollection behaviour on.

Consistent naming makes a structure all the more recognizable, I feel.
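Put together, each menu in the sample project then has a hierarchy roughly like this (the button names are just placeholders):

Menu object (SolverHandler, HandConstraintPalmUp, DominantHandController)
└── Visuals
    └── UI (GridObjectCollection)
        ├── Button1
        ├── Button2
        └── ...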

Menu configuration

The main palm menu has, of course, a Solver Handler and a Hand Constraint Palm Up behaviour. The tracked hand is set to the left.

The tricky thing is always to remember - the main menu is going to be operated by the dominant hand. For most people the dominant hand is the right - so the hand to be tracked for the main menu is the left hand, because that leaves the right hand free to actually operate the controls on that menu. For left-handed control, the main menu has to be set to track the right hand. This keeps confusing me every time.

It won't surprise you to see the Settings menu look like this:

With the Solver's TrackedHandness set to Right. But here you also see the star of this little show: the DominantHandController, with its Dominant Hand Controlled checkbox set to off - since I always have the settings menu operated by the non-dominant hand, whatever that might be.

DominantHandController

This is actually a very simple script that responds to messages sent either from a button or from speech commands:

using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Utilities;
using Microsoft.MixedReality.Toolkit.Utilities.Solvers;
using MRTKExtensions.Messaging;
using UnityEngine;

namespace MRTKExtensions.HandControl
{
    [RequireComponent(typeof(SolverHandler))]
    public class DominantHandController : MonoBehaviour
    {
        // True for the menu that should follow the dominant hand (i.e. the main menu)
        [SerializeField]
        private bool _dominantHandControlled;

        private IMessengerService _messenger;
        private SolverHandler _solver;

        private void Start()
        {
            _messenger = MixedRealityToolkit.Instance.GetService<IMessengerService>();
            _messenger.AddListener<HandControlMessage>(ProcessHandControlMessage);
            _solver = GetComponent<SolverHandler>();
        }

        private void ProcessHandControlMessage(HandControlMessage msg)
        {
            var isChanged = SetSolverHandedness(msg.IsLeftHanded);

            // Only the dominant hand's controller confirms a speech command,
            // and only if the handedness actually changed - see below
            if (msg.IsFromSpeechCommand && isChanged && _dominantHandControlled)
            {
                _messenger.Broadcast(new ConfirmSoundMessage());
            }
        }

        private bool SetSolverHandedness(bool isLeftHanded)
        {
            // XOR of 'left handed control wanted' and 'dominant hand controlled'
            // yields the hand this menu should be attached to
            var desiredHandedness = isLeftHanded ^ _dominantHandControlled ?
                Handedness.Left : Handedness.Right;
            var isChanged = desiredHandedness != _solver.TrackedHandness;
            _solver.TrackedHandness = desiredHandedness;
            return isChanged;
        }
    }
}

SetSolverHandedness determines what the handedness of the solver should be set to - depending on whether this menu is set to be controlled by the dominant hand or not, and whether or not left-handed control is wanted. That's an XOR, yes - you don't see that very often. But write out a truth table for those two parameters and that's where you will end up. This little bit of code is what actually does the swapping of the menus from right to left and vice versa.
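Writing that truth table out (the tracked hand is the hand the menu is attached to, so the opposite hand operates it):

isLeftHanded  _dominantHandControlled  tracked hand
false         false                    Right (settings menu, right-handed user)
false         true                     Left  (main menu, right-handed user)
true          false                    Left  (settings menu, left-handed user)
true          true                     Right (main menu, left-handed user)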

It also returns a value indicating whether the handedness has actually changed. This is because if the command comes from a speech command we want, like any good Mixed Reality developer, to give some kind of audible cue that the command has been understood and processed. After all, we can say a speech command any time we want, and if the user does not have a palm up, he or she won't see the hand menu flipping from one hand to the other. So only if the command comes from a speech command, and an actual change has occurred, do we need to give some kind of audible confirmation. I also made sure this confirmation is only given by the dominant hand's controller - otherwise we would get a double confirmation sound. After all, there are two of these behaviours active - one for each menu.
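For completeness, the two messages used above are trivial data classes. A minimal sketch, assuming the shape their usage in this post implies (the actual versions are in the sample project):

// Minimal sketches; the actual classes are in the sample project
public class HandControlMessage
{
    public HandControlMessage(bool isLeftHanded)
    {
        IsLeftHanded = isLeftHanded;
    }

    public bool IsLeftHanded { get; }

    // Set when the message originates from a speech command
    public bool IsFromSpeechCommand { get; set; }
}

// Carries no data - its arrival is the signal to play a confirmation sound
public class ConfirmSoundMessage
{
}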

Supporting act: SettingsMenuController

Of course, something still needs to respond to the toggle button being pressed. This is done by this little behaviour:

using System.Threading.Tasks;
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.UI;
using MRTKExtensions.Messaging;
using UnityEngine;

namespace HandmenuHandedness
{
    public class SettingsMenuController : MonoBehaviour
    {
        private IMessengerService _messenger;

        [SerializeField]
        private Interactable _leftHandedButton;

        public void Start()
        {
            _messenger = MixedRealityToolkit.Instance.GetService<IMessengerService>();
            gameObject.SetActive(CapabilitiesCheck.IsArticulatedHandSupported);
            _messenger.AddListener<HandControlMessage>(ProcessHandControlMessage);
        }

        private void ProcessHandControlMessage(HandControlMessage msg)
        {
            // Keep the toggle button in sync when the change was made by voice
            if (msg.IsFromSpeechCommand)
            {
                _leftHandedButton.IsToggled = msg.IsLeftHanded;
            }
        }

        // Called from the toggle button's OnClick event
        public void SetMainDominantHandControl()
        {
            // Deliberately fire-and-forget - see the explanation below
            _ = SetMainDominantHandDelayed();
        }

        private async Task SetMainDominantHandDelayed()
        {
            await Task.Delay(100);
            _messenger.Broadcast(new HandControlMessage(_leftHandedButton.IsToggled));
        }
    }
}

The SetMainDominantHandControl method is called from the OnClick event in the Interactable behaviour on the toggle button, and then simply fires off the message based upon the toggle status of the button. Note that there is a slight delay; this has two reasons:

  1. To make sure the sound the button plays actually has time to play.
  2. To make sure the button's IsToggled property is actually set to the right value before we fire off the message.

Yeah, I know, it's dicey, but that's how it apparently needs to work. Also note that this little script not only fires off HandControlMessage but also listens to it. After all, if someone changes the handedness by speech command, we want the button's toggle status to reflect the actual status change.
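Incidentally, since Interactable exposes OnClick as a plain UnityEvent, you could also do the same hookup in code instead of in the inspector - a hypothetical alternative, not what the sample does:

// Hypothetical alternative: hook up the click handler in code
// instead of via the Interactable's OnClick event in the inspector
_leftHandedButton.OnClick.AddListener(SetMainDominantHandControl);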

Some bits and pieces

The final piece of code - which I only mention for the sake of completeness - is SpeechCommandProcessor:

using Microsoft.MixedReality.Toolkit;
using MRTKExtensions.Messaging;
using UnityEngine;

namespace HandmenuHandedness
{
    public class SpeechCommandProcessor : MonoBehaviour
    {
        private IMessengerService _messenger;

        private void Start()
        {
            _messenger = MixedRealityToolkit.Instance.GetService<IMessengerService>();
        }

        // Wired up to the SpeechInputHandler's keyword responses
        public void SetLeftHanded(bool isLeftHanded)
        {
            _messenger.Broadcast(new HandControlMessage(isLeftHanded)
            {
                IsFromSpeechCommand = true
            });
        }
    }
}

It sits together with a SpeechInputHandler in Managers:

Just don't forget to turn off the "Is Focus Required" checkbox, as these are global speech commands. I tend to forget this, and that makes for an amusing few minutes of shouting at your HoloLens without it having any effect, before the penny drops.
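If you would rather wire global speech commands up in code than via a SpeechInputHandler, MRTK2 also lets you register a global handler yourself. A sketch of that alternative approach (not what the sample does):

using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

public class GlobalSpeechHandler : MonoBehaviour, IMixedRealitySpeechHandler
{
    private void OnEnable()
    {
        // Register as a global listener, so no focus is required
        CoreServices.InputSystem?.RegisterHandler<IMixedRealitySpeechHandler>(this);
    }

    private void OnDisable()
    {
        CoreServices.InputSystem?.UnregisterHandler<IMixedRealitySpeechHandler>(this);
    }

    public void OnSpeechKeywordRecognized(SpeechEventData eventData)
    {
        if (eventData.Command.Keyword == "Set left hand control")
        {
            // ...broadcast the same message SpeechCommandProcessor.SetLeftHanded(true) does
        }
    }
}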

Conclusion

You might have noticed I don't let the menus appear on your hand anymore, but next to your hand. This comes from the design guidelines on hand menus in the official MRTK2 documentation, and although I can have a pretty strong opinion about things, I do tend to take some advice occasionally ;) - especially when it's about usability and ergonomics. Which is exactly why I made this left-to-right swappability in the first place. I hope this little blog post gives people some tools to add a little bit of inclusivity to HoloLens 2 applications.

Full project, as almost always, here on GitHub.
