22 March 2018

Windows 10 on ARM and devices–hang on to your continuum dock!


I have been asked to evaluate a prototype Windows 10 on ARM PC. You might have seen people talk about these earlier: my friend Lance wrote about his one-day developer experience, Daren May has something to say about remote debugging with these devices, and wouldn't you know - on the first day of spring, one sprang up at Paul Thurrott's site. I am not sure if that's exactly the same model as I have – it looks pretty similar, but that's actually not important. As far as Windows goes, the platform and what it can do is more interesting to me than the actual underlying hardware. Windows goes ARM – yet again, one might say.

Wait, haven’t we seen this before?

Windows has run on ARM before, on tablets, phones and IoT devices like the Raspberry Pi. Windows RT was a Windows 8 variant, Windows Mobile actually made it to Windows 10, and IoT devices run a super compact version of Windows 10 and UWP apps. In all cases, apps on Windows versions that ran on ARM devices could only be native (ARM) apps. For a number of use cases, backward compatibility with the vast library of Windows apps created over the years posed a bit of a challenge. And that's where some brand new tech comes in. The new Windows 10 on ARM runs actual x86 code, made for 'conventional' Intel chips - converting it on the fly. It uses a technology called CHPE (pronounced "chip-pee") to work the magic. Lance's article has a nice in-depth explanation of it. I talked to people working on CHPE during the last MVP Summit. Modest and quiet people they are, but by golly, I felt like a Neanderthal getting a quantum physics 101 lecture by the late professor Hawking when they casually talked about a few of the things they had to overcome. Really impressive.

I installed some x86 programs on the PC, downloaded from various sources, and some Desktop Bridge programs from the Windows Store. It's very much a case of Your Mileage May Vary, but let's just put it this way – I put a resource hog like Chrome on it – the x86 version – and it ran just fine, even the first time, when it's supposed to be slower while CHPE works its magic. I still prefer Edge, as I like to keep my memory and battery power for other things than just web pages – but it runs Chrome just fine. I also tried TeamViewer – it also just works fine – case in point, I made the screenshots in this blog post using it. For all intents and purposes, this is just Windows. So much so that you actually have to dig to see there's another heart beating beneath its metal. The most obvious place is the File Explorer – see the image on the right side:

And of course, there’s this.


Also, fun fact: because my good old Map Mania app still has an ARM package, intended for phones, it gets the native ARM package from the Store, and runs very fast on the device. So pay attention, kids, and by all means submit an ARM package when you put your app in the Windows Store.

So if this is just Windows… how about devices?

One of the things I like most about Windows is that whatever device you plug into it, it works, and nearly instantly. If it does not, you actually have a better chance of having a defective device than of Windows not at least eking the basic functionality out of it. I have had… let's say, other and utterly frustrating experiences with other operating systems. However, the device I have has just one port – a USB-C port. It charges fine with the accompanying charger, but what about other devices?

This is where the fun starts. As a former Windows Phone MVP, I went all the way to the Lumia 950 XL, scoring a free Continuum Dock with the phone. Remember this one? Connect a keyboard, a mouse and a monitor to it, plug the other end into your Lumia, and you basically had a kind of PC-from-your-pocket. Turns out Microsoft did not use proprietary tricks, but apparently just a standard protocol.

I plugged the dock into the device, power in the other end:

Score one – it charged. Then I went a bit… overboard…


I connected this entire pile of hardware to it. And all of it worked. What you see here, connected simultaneously:

  • A Dell monitor connected via DisplayPort (tried the HDMI port too – worked as well)
  • Two USB hubs, because I have only 3 USB ports on the dock ;)
  • A generic USB key
  • A Microsoft Basic Mouse V2
  • A Microsoft Natural Ergonomic Keyboard 4000 v1
  • An Xiaomi MI 5 Android Phone
  • A Microsoft LifeChat LX-3000 headset
  • A Microsoft Xbox One controller
  • A Microsoft Sculpt ergonomic keyboard and accompanying mouse set (via a wireless dongle)

Not in this picture, but successfully tried:

  • A HoloLens – it got set up, but I could not connect to the portal via localhost:10080. I have to look into that a little bit more. Other things work too, but that's outside the scope of this article.
  • A fairly new Canon DSLR, but I needed that one to take the picture so it’s obviously not in it ;)

I also found the PC actually wants to charge from a Lizone QC series battery that I originally bought to extend my Surface Pro 4's battery life on long transatlantic flights. The Windows 10 on ARM PC itself is missing from the picture – that's because it's a pre-release device and I don't want pictures of it to roam around the internet.

Did I find stuff that did not work? In fact, I did:

  • I could not get a fingerprint reader that I got for free to work. This is some pre-release device that I got at the summit from a fellow MVP – 1.5 or maybe 2.5 years ago. Although it is set up and recognized, I cannot activate it in the settings screen. Maybe this has something to do with the PC's built-in Windows-Hello-compatible camera getting priority.
  • A wireless dongle for Xbox One controllers. Remember the original Xbox One controllers did not have Bluetooth in them? This gadget allows you to connect them to PCs anyway. It connects, but nothing is set up. It's not a big deal, as a controller plugged in via a USB cable works just fine. I suppose this dongle was not sold in large volumes, and probably not at all anymore, as all newer Xbox One controllers can be connected via Bluetooth. Only people hanging on to old hardware (guilty as charged) would run into this.

General conclusion

I feel like a broken record, because I keep coming back to this simple fact - it's just Windows: it will run your apps pretty nicely, it will connect to nearly all of your hardware, and give you very long battery life. Although, I can imagine battery life might degrade a little if you add this many devices to its USB port. But then again, if you need this many devices connected to your PC you might want to rethink what kind of PC you want to buy anyway ;). The point is, you can, and everything but very obscure devices will work.

Now if you would excuse me, I have to clean up an enormous pile of stuff – my study looks like a minor explosion took place in the miscellaneous hardware box.

17 March 2018

Loading remote video stored in Azure blob storage into a floating gaze activated video player in a Mixed Reality app


The title of this blog post kind of gives away that this is actually two blog posts in one:

  • How to prepare and load videos into Azure
  • How to load these videos back from Azure, and show them in a floating video player that is activated when looked at.

The basic idea

The UI to demo loading and playing the video is a simple Plane that gets a MovieTexture applied to it. When you look at the Plane (i.e. the gaze strikes the Plane), the MovieTexture's "Play" method is called, and the video starts playing. When you don't look at it for a couple of seconds, the MovieTexture's "Pause" method is called. It's not rocket science.

Two posts ago, I introduced BaseMediaLoader as a simple base class for downloading media. We are going to re-use that in this post, as loading video – as you will see – is not that different from loading audio.

Prepare and upload the video

If you have read my post about loading audio you might have guessed – you can't just upload an MP4 file to blob storage, download it and play it. Unity seems to have a preference for off-center open source formats. You will need to convert your movie to the Ogg Theora format, which you can do with the command line tool "ffmpeg". The documentation on it is not very clear, and the default conversion yields a very low quality movie (think early-years YouTube). I have found the following parameters give a quite reasonable conversion result:

ffmpeg.exe -i .\Fireworks.mp4 -q:v 8 fireworks.ogv

-q:v 8 gives a nice video quality. Also, the original 121605 kB movie is compressed to about 40000 kB. The resulting ogv file needs to be uploaded to Azure blob storage. I used the Storage Explorer for that. That also makes it easy to get a shared access signature URL.
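If you want a bit more control over the conversion, you can spell the codecs out explicitly. This is just a sketch of the same conversion, assuming your ffmpeg build includes the libtheora and libvorbis encoders (most standard builds do); -q:a controls the Vorbis audio quality on a 0-10 scale:

```shell
# Same conversion as above, but with the Theora video encoder and
# the Vorbis audio encoder made explicit
ffmpeg -i .\Fireworks.mp4 -codec:v libtheora -q:v 8 -codec:a libvorbis -q:a 5 fireworks.ogv
```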

Video player components

The video player itself is pretty simple – a Plane to display the movie on, a Text to tell the user to start playing it by looking at the Plane, and an AudioSource you can just about see in the image below, depicted by a very vague loudspeaker icon.



Note the video player is about 3 meters from the user, and a bit off-center to the left – preventing it from auto-starting immediately, which it would do if it appeared right ahead. The video Plane is rotated 90/90/270° to make it appear upright and facing the right direction to the user.
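In the demo project this placement is done in the editor, but if you would rather do it from code, a minimal sketch could look like this (the videoPlayer variable and the exact 1 m sideways offset are my own assumptions, not part of the demo project):

```csharp
// Hypothetical placement from code: 3 m ahead of the camera, 1 m to
// the left, rotated 90/90/270° so the Plane stands upright facing the user
var cam = Camera.main.transform;
videoPlayer.transform.position =
    cam.position + cam.forward * 3f - cam.right * 1f;
videoPlayer.transform.rotation = Quaternion.Euler(90f, 90f, 270f);
```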

The VideoPlayer script

The VideoPlayer script actually does all the work – downloading the video, playing it when gaze hits it, and pausing the playback after a timeout of 2 seconds ('Focus Lost Timeout'). It starts pretty simply:

using System.Collections;
using HoloToolkit.Unity.InputModule;
using UnityEngine;
using UnityEngine.Networking;

public class VideoPlayer : BaseMediaLoader, IFocusable
{
    public GameObject VideoPlane;

    public AudioSource Audio;

    public GameObject LookText;

    public float FocusLostTimeout = 2f;

    private MovieTexture _movieTexture;

    private bool _isFocusExit;

    protected void Start()
    {
        // Nothing downloaded yet, so show nothing
        VideoPlane.SetActive(false);
        LookText.SetActive(false);
    }

Notice all components are explicitly defined, that is – although they are within one prefab, you still have to drag the Plane, the Text and the AudioSource into the script's fields. Initially the script turns off everything – if there's nothing downloaded (yet), it shows nothing. If you are on a slow network, you will see the player disappear for a while, then reappear.

The most important part of this script consists of these two methods:

protected override IEnumerator StartLoadMedia()
{
    // Hide the UI again, in case a previous video was already showing
    VideoPlane.SetActive(false);
    LookText.SetActive(false);
    yield return LoadMediaFromUrl(MediaUrl);
}

private IEnumerator LoadMediaFromUrl(string url)
{
    var handler = new DownloadHandlerMovieTexture();

    yield return ExecuteRequest(url, handler);

    _movieTexture = handler.movieTexture;
    _movieTexture.loop = true;
    Audio.loop = true;

    VideoPlane.GetComponent<Renderer>().material.mainTexture = _movieTexture;
    Audio.clip = handler.movieTexture.audioClip;

    // Show the Plane and the text, inviting the user to have a look
    VideoPlane.SetActive(true);
    LookText.SetActive(true);
}
Remember, from BaseMediaLoader, that StartLoadMedia is called as soon as MediaUrl changes. That turns off the UI again (in case it was already turned on because a different file was loaded previously). Then we need a DownloadHandlerMovieTexture. I think the person who came up with that naming scheme deserves an originality award ;)

Then we set the loop property of both the movie texture and the AudioSource to true, and after that we apply the movie texture to the VideoPlane's Renderer material texture so it will indeed show the movie. Since that will only play a silent movie, we need to extract the movie texture's audioClip property value and put that in our AudioSource, and finally make both the Plane and the text visible, inviting the user to have a look.

Then we have these two simple methods to actually start and pause playing. Notice that to start you have to call both the movie texture's Play method and the AudioSource's Play method, but for pausing it's enough to call just the movie texture's Pause. One of those weird Unity idiosyncrasies.

private void StartPlaying()
{
    if (_movieTexture == null) return;
    _isFocusExit = false;
    if (!_movieTexture.isPlaying)
    {
        _movieTexture.Play();
        Audio.Play();
    }
}

private void PausePlaying()
{
    if (_movieTexture == null) return;
    _movieTexture.Pause();
}
Notice the setting of _isFocusExit to false in StartPlaying. We need that later. Finally, the methods that actually are fired when you are looking at or away from the Plane, as defined by IFocusable:

public void OnFocusEnter()
{
    StartPlaying();
}

public void OnFocusExit()
{
    _isFocusExit = true;
    StartCoroutine(PausePlayingAfterTimeout());
}

IEnumerator PausePlayingAfterTimeout()
{
    yield return new WaitForSeconds(FocusLostTimeout);
    if (_isFocusExit) PausePlaying();
}
If the user stops looking at the Plane, _isFocusExit is set to true and a coroutine starts that first waits for the defined time. If that time has passed and the user still is not looking at the Plane, playback is actually paused. This way you prevent small head movements, which make the gaze cursor wander off the Plane for a short period of time, from stopping and starting the movie repeatedly - which is a bad user experience.

No controls?

The floating audio player I described earlier has a fancy slider that shows progress and makes it possible to jump to any part of the audio. Unfortunately, a movie texture does not support a time property that you can get and set to randomly access parts of the movie and jump to a specific point. You can only move forward, and only by setting the loop property to true do you actually end up at the start again, because moving to the start does not work either. I don't know why this is, but that's the way it seems to be.


Showing video is almost as easy as playing audio, and the two are similar in many ways. The default Unity capabilities allow for only limited control, but it's a nice way to - for instance - show instructional videos. Be aware that playing videos on a resource-constrained device (read: HoloLens) might ask for a lot of resources. Consider smaller low-res videos in this case. Testing is always key.

The demo project, containing more stuff by the way, can be found here.