Ignite Australia 2017 Wrap-up

inov226

Last week I had the pleasure of presenting at Ignite Australia. I've missed the last few as they coincided with overseas trips, so moving to February worked out well.

I presented Lessons from Hollywood: Building the interactions of tomorrow today. It was a bit different from the techy talks I normally do, as it was more conceptual and covered a lot of things in a broader way.

Overall I was happy with it, and I got to show Dippy to one lucky audience member. It was really nice that a few people found me on the Thursday and asked if they could meet Dippy, and luckily for them I just so happened to have my HoloLens with me.

I had some fun with the projector halfway through and could no longer see anything on my laptop screen, even after trying duplicate, extend, etc., so I just had to suck it up and demo from the projector.

Had a bit of a demo glitch but people didn’t seem to mind too much.

I showed a bunch of technologies and hopefully gave people some ideas.

Things we covered:

Surface Dial and the new Radial Controller Class

Kinect For Windows

Power BI

HoloLens

Cognitive Services

You can check out the presentation here. Unfortunately you miss out on seeing all the awesome audience participation I had as it’s just the laptop and microphone feed.

Creating a Diprotodon for HoloLens – Dippy is Born

dippy

I recently presented at Ignite Australia on Lessons from Hollywood: Building the interactions of tomorrow today. For that talk, and for fun, I wanted to create something from scratch rather than use assets someone else made. Now, I have zero graphical ability, but fortunately my brother Adrian got those genes instead.

My brief was that I wanted a model I could use in HoloLens; something a bit different from the cat, dog, ballerina, etc. that you see all the time. It would also be nice to have an Australian theme, so that if I want to show this overseas there's an extra element of Aussie about it. :)

He suggested a Diprotodon (essentially a giant prehistoric wombat) which is the largest known marsupial ever to have lived.

dippy model

Stage 1 – The Model.

First he built the model and exported it with a decent poly count as an FBX. I imported that into my Unity project, picked a point in space to place it and ta-da: diprotodon model. At this point I thought he looked pretty good and it was time to name him (the hardest thing in software development). So, what's more Aussie than shortening his title? We went with Dippy.

dippy texture

Stage 2 – Textured Model.

He looked pretty good, and I sent through some pics I took from the HoloLens. Then Adrian went away and created a textured version – you can see fur and nose texture here, but no colour yet.

dippy

Stage 3 – Adding some colour

Next we wanted to make him look like a wombat, so Adrian created the "maps": four files that I had no idea what to do with (because I'm a dev, not a designer).

I went through a few different shaders, and with some chat help we decided the most appropriate was the Standard (Specular setup) shader. I then matched its slots up to the maps he'd given me.

image

I had to cheat and look up what some of these meant to help with the guesswork, but here's my non-designer summary:

Albedo – base colour or diffuse map. Defines the colour of diffused light. This one I matched up to dip_diffuse, or as I like to call it, the coloured fur map.

image

Specular – for shininess and highlight colour. I used his naming to initially match this one to dip_specular.

image

Occlusion – controls how much ambient light each area receives. It's greyscale, or as I call it, the black-and-white Dippy, which matched dip_ao.

image

Normal – allows you to add detail like bumps, scratches, etc. This matched up to dip_normal.

image

My final shader looked like this.

image
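For anyone who prefers code to the Inspector, here's a minimal sketch of the same map assignments as a Unity C# script. The class and field names are my own invention for illustration; the texture property names (_MainTex, _SpecGlossMap, _OcclusionMap, _BumpMap) are the slots Unity's Standard (Specular setup) shader exposes. This assumes the maps are already assigned as Texture2D assets in the Inspector:

```csharp
using UnityEngine;

// Hypothetical helper: wires Adrian's four maps into a
// Standard (Specular setup) material at runtime.
public class DippyMaterialSetup : MonoBehaviour
{
    public Texture2D diffuse;   // dip_diffuse  -> Albedo (the coloured fur map)
    public Texture2D specular;  // dip_specular -> Specular
    public Texture2D occlusion; // dip_ao       -> Occlusion
    public Texture2D normal;    // dip_normal   -> Normal (bumps, scratches)

    void Start()
    {
        var mat = GetComponent<Renderer>().material;
        mat.SetTexture("_MainTex", diffuse);
        mat.SetTexture("_SpecGlossMap", specular);
        mat.SetTexture("_OcclusionMap", occlusion);
        mat.SetTexture("_BumpMap", normal);

        // The Standard shader only samples these maps when the
        // matching keywords are enabled.
        mat.EnableKeyword("_SPECGLOSSMAP");
        mat.EnableKeyword("_NORMALMAP");
    }
}
```

Doing it in the Inspector is simpler for a one-off model like Dippy; a script like this is only really useful if you're swapping map sets at runtime (which comes in handy in Stage 4).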

He looked pretty cool at this point, but there was a bit of a shimmer. We tried a few things like dropping the poly count, but then I noticed it on most holograms – even the simple cat in the Holograms app. So I left it for a bit.

dippy scanline

Stage 4 – Scanlines

For my demo, I wanted to show two different ways of using the same model but with different "definition". In this case I wanted the same poly count but different maps – one with detailed fur and another with something simpler, so you knew it was a diprotodon but didn't need the detailed colour.

image

This time I used the HoloToolkit Standard Fast shader and dropped the dip_scan_line map onto the albedo, emission and detail slots.

dippy before

Stage 5 – Fixing the shimmer

Presentation done, but the shimmer was still bugging me. It took me a bunch of goes to capture it, but you can see it in a few of the pics above; I think this one highlights it best. To me, the Dippy on the left really stands out at the edges.

So I took advantage of the Unity guys being at Ignite and showed them. I really wanted to know whether it was something we should have fixed at the model level, the Unity level or even the code level. John suggested the project's anti-aliasing settings.

Normally all the doco points to picking the "Fastest" / "worst" quality level and nothing else, and anti-aliasing is set to Disabled by default.

image

We made the simple change of setting Anti Aliasing to 4x Multi Sampling. We re-deployed, and wow, what a difference that made.

dippy after

That shimmer is gone!
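If you'd rather make the change from code than dig through the Quality Settings UI, Unity exposes the same MSAA setting via QualitySettings.antiAliasing (valid values are 0, 2, 4 and 8 samples). A one-liner sketch, with my own hypothetical component name:

```csharp
using UnityEngine;

// Hypothetical bootstrap component: applies the same fix we
// made in Edit > Project Settings > Quality.
public class AntiAliasingFix : MonoBehaviour
{
    void Awake()
    {
        // 4x Multi Sampling – the setting that removed Dippy's shimmer.
        QualitySettings.antiAliasing = 4;
    }
}
```

Setting it in the project's Quality Settings is still the better default, since it applies before the first frame renders; the script form is handy if you want to toggle it per scene.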

Using the C for Airtap in HoloLens

airtap

When I first used HoloLens almost two years ago I was taught the air tap. It looked a lot like the picture above. It worked for me and I was content.

A few months ago, I got my own device and started sharing how cool the device was. So then I had to teach people how to air tap.

Airtap open

Finger straight up in the air, then bend at the main joint to the hand.

Airtap close

I soon noticed a lot of people found this hard. No matter how many times I showed or demoed it, they would have trouble. A lot of them would bend at the wrong joint.

somethingdifferent

This type of motion doesn’t get picked up. They’d get frustrated and in a lot of cases would move on to a grabbing motion.

So chatting to my fellow MVPs who also love HoloLens, I quickly discovered it wasn't just my bad teaching: a lot of people had seen the same thing. Someone (sorry guys, I can't remember which of you mentioned this first) said to try using a C and having people close it like a pinch. I tried this myself a few times and it seemed to be detected just as well as the traditional air tap.

 C open

So at my Ignite presentation I taught half the room the official way and half the room the other. The second half of the room I got to make a C, then close it.

C Close

I thought that was the end of the story. The next day, one of the people who went to my talk came to find me. He'd done a HoloLens experience the day before and had spent 20 minutes doing air taps. He said the taught way left him with a really sore hand, and that he'd gone back again today and used the C I taught him the evening before with no issues.

So… teach people a few different ways to do a gesture to see what works for them. I'm finding the C is a great alternative so far.