For those missing the HoloLens template in VS 2017, one of the awesome Emerging Experiences MVPs, Dwight, has updated the VS2015 one for you.
Get it here.
Last night I showed off HoloLens to the Adelaide .Net User Group.
We had a decent number of people here to hear about VR/AR and MR, but I think they were all keen to check out the device.
After going through the basics with everyone to explain gaze and gestures like Airtap and Bloom it was time to hand it over.
My first volunteer got it really quickly and after turning my back for a few seconds was happily navigating around the holograms.
He also took to Dippy and after one voice command had already guessed what the next few were. Like most people he created a huge collection of diprotodons.
The next volunteer had a go at air tap and voice commands after we showed how to use them with the emulator.
But as you can see by the massive grin, even the simplest of apps that we can build makes people happy.
My last demo was of Robo Raid – and boy did this volunteer have fun.
Thanks David for the pics and the invitation. Love seeing people enjoying this tech.
Today was an early-ish start as I’m off to Adelaide. While I’m here I’m meeting a few different places/people and first on my list is Aberfoyle Park High School.
Thanks to David Gardiner, I was welcomed to present to the Grade 8 advanced math classes. I wasn’t sure what would excite them but I thought if anything would, HoloLens would.
It’s the first presentation I’ve done where I had questions before I even opened my mouth!
I started with the “boring stuff” – how I got into IT, what sorts of things I work on etc. – and included a few pics of the technology I grew up with for some context.
Out of all the pictures I showed, Donkey Kong got the biggest reaction, as it seems a lot of dads had one of these… which is not surprising since their dads are probably a similar age to me.
With the boring stuff over, it was time to bring out something more fun. The vibe in the room immediately changed from disinterest/distracted to electric!
David’s daughter had a play with the HoloLens before the session and kindly allowed me to take a pic…
I had no shortage of volunteers to show off how the device works. The students played with Dippy endlessly, putting it on people’s laps and changing it from female to male to baby.
There were endless ooohhs, ahhhs and of course “Can I have the next turn?”
In amongst all of that there were good questions. Including what Dippy was for. Again I was surprised that some of the students knew what a Diprotodon was.
I was also really impressed (but not surprised) at how quickly the students “got” the airtap and gaze. Most of them preferred the “C” method and quickly adapted to turning their waist rather than their head when they had their hand in front.
Thanks to the teachers who allowed me to take over their class for the afternoon – and who then had to get the students to concentrate afterwards.
Last week I had the pleasure to present at Ignite Australia. I’ve missed the last few as it’s coincided with overseas trips so moving to February worked out well.
I presented Lessons from Hollywood: Building the interactions of tomorrow today. It was a bit different to the techy talks I normally do, as it was more conceptual and covered a lot of things in a broader way.
Overall I was happy with it, and I got to show Dippy to one lucky audience member. It was really nice that a few people found me on the Thursday and asked if they could meet Dippy – lucky for them, I just so happened to have my HoloLens with me.
I had some fun with the projector halfway through and could no longer see anything on my laptop screen, even after trying duplicate, extend etc., so I just had to suck it up and demo from the projector.
Had a bit of a demo glitch but people didn’t seem to mind too much.
I showed a bunch of technologies and hopefully gave people some ideas.
Things we covered:
You can check out the presentation here. Unfortunately you miss out on seeing all the awesome audience participation I had as it’s just the laptop and microphone feed.
I recently presented at Ignite Australia on Lessons from Hollywood: Building the interactions of tomorrow today. For that talk, and for fun, I wanted to create something from scratch rather than use assets someone else made. Now, I have zero graphical ability, but fortunately my brother Adrian got those genes instead.
My brief was that I wanted a model I could use in HoloLens – something a bit different to the cat, dog, ballerina etc. that you see all the time. It would also be nice to have an Australian theme to it, so in case I want to show this overseas there’s an extra element of Aussie about it.
He suggested a Diprotodon (essentially a giant prehistoric wombat) which is the largest known marsupial ever to have lived.
First he built the model and exported it with a decent poly count as an fbx. I imported that into my Unity project, picked a point in space to place it and ta-da – diprotodon model. At this point I thought he looked pretty good and it was time to name him (the hardest thing in software development). What’s more Aussie than shortening his title? So we went with Dippy.
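For the curious, placing an imported model needs barely any code. Here’s a minimal sketch – the prefab field and the two-metres-in-front placement are my own illustrative choices, not the actual project’s values:

```csharp
using UnityEngine;

// Minimal sketch: drop an imported model into the scene at a point
// in space. "dippyPrefab" stands in for whatever prefab the fbx
// import produced.
public class PlaceModel : MonoBehaviour
{
    public GameObject dippyPrefab;

    void Start()
    {
        // Two metres ahead of the main camera.
        Vector3 spot = Camera.main.transform.position
                     + Camera.main.transform.forward * 2f;
        Instantiate(dippyPrefab, spot, Quaternion.identity);
    }
}
```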
He looked pretty good, and I sent through some pics I took from the HoloLens. Then Adrian went away and created a textured version – you can see fur and nose texture here, but no colour etc.
Next we wanted to make him look like a wombat, so Adrian created the “maps” – four files that I had no idea what to do with (because I’m a dev, not a designer).
I went through a few different shaders and, with some chat help, we decided the most appropriate was the Standard (Specular). I then matched its slots up to the maps he’d given me.
I had to cheat and look up what some of these meant to help with the guess work but here’s my non-designer summary:
Albedo – base colour or diffuse map. Defines the colour of diffused light. This one I matched up to dip_diffuse, or as I like to call it, the coloured fur map.
Specular – for shininess and highlight colour. I used his naming to match this one to dip_specular.
Occlusion – greyscale, or as I call it, the black and white Dippy, which matched to dip_ao.
Normal – lets you add detail like bumps, scratches etc. This matched up to dip_normal.
My final shader looked like this.
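If you wanted to wire the same maps up from code instead of the Inspector, it would look roughly like this – a sketch only, using the Standard (Specular setup) shader’s texture property names; the Texture2D fields stand in for the imported map assets:

```csharp
using UnityEngine;

// Sketch: assign the four maps to a Standard (Specular setup)
// material from code. In practice this was done in the Inspector.
public class DippyMaterialSetup : MonoBehaviour
{
    public Texture2D dipDiffuse;   // coloured fur map
    public Texture2D dipSpecular;  // shininess / highlight colour
    public Texture2D dipAO;        // greyscale occlusion
    public Texture2D dipNormal;    // bump and scratch detail

    void Start()
    {
        Material m = GetComponent<Renderer>().material;
        m.shader = Shader.Find("Standard (Specular setup)");
        m.SetTexture("_MainTex", dipDiffuse);       // Albedo
        m.SetTexture("_SpecGlossMap", dipSpecular); // Specular
        m.SetTexture("_OcclusionMap", dipAO);       // Occlusion
        m.SetTexture("_BumpMap", dipNormal);        // Normal
    }
}
```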
He looked pretty cool at this point, but there was a bit of a shimmer. We tried a few things like dropping the poly count, but then I noticed it on most holograms – even the simple cat in the Holograms app. So I left it for a bit.
For my demo, I wanted to show 2 different ways of using the same model but with different “definition”. In this case I wanted the same poly count but different maps – one with detailed fur and another with something simpler, so you knew it was a diprotodon but didn’t need the detailed colour.
This time I used the HoloToolkit Standard Fast Shader and dropped the dip_scan_line map onto albedo, emission and detail.
Presentation done but the shimmer was still bugging me. Took me a bunch of goes to capture it but you can see it on a few of the pics above but I think this one highlights it best. The Dippy on the left to me really stands out with the edges.
So I took advantage of the Unity guys being at Ignite and showed them. I really wanted to know if it’s something we should have done at the model level, the Unity level or even a code level. John suggested the Anti-Aliasing settings for the project.
Normally all the doco points to picking “Fastest”/“worst” and nothing else, and anti-aliasing is set to Disabled by default.
We made the simple change to set Anti Aliasing to 4x Multi Sampling. Re-deployed, and wow, what a difference that made.
That shimmer is gone!
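If you’d rather make the change from code than through Edit > Project Settings > Quality, the scripted equivalent is a one-liner (note this only affects the currently active quality level):

```csharp
using UnityEngine;

public class AntiAliasFix : MonoBehaviour
{
    void Awake()
    {
        // 4x Multi Sampling for the active quality level –
        // the same change we made in the Quality settings UI.
        QualitySettings.antiAliasing = 4;
    }
}
```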
When I first used HoloLens almost two years ago I was taught the air tap. I looked a lot like the picture above. It worked for me and I was content.
A few months ago, I got my own device and started sharing how cool the device was. So then I had to teach people how to air tap.
Finger straight up in the air, then bend at the main joint to the hand.
I soon noticed a lot of people found this hard. No matter how many times I showed or demoed, they would have trouble. A lot of them would bend at the wrong joint.
This type of motion doesn’t get picked up. They’d get frustrated and in a lot of cases would move on to a grabbing motion.
So, chatting to my fellow MVPs who also love HoloLens, I quickly discovered it isn’t just my bad teaching – a lot of people had seen the same thing. Someone (sorry guys, I can’t remember which of you mentioned this first) said to try making a C and closing it like a pinch. I tried this myself a few times and it seemed to be detected just as well as the traditional air tap.
So at my Ignite presentation I taught half the room the official way and half the room the other – the 2nd half of the room I got to make a C, then close it.
I thought that was the end of the story. The next day, one of the people who went to my talk came to find me. He’d done a HoloLens experience the day before and had spent 20 minutes doing air taps. He said the taught way left him with a really sore hand, and that he had gone back again today, used the C I’d taught him the evening before, and had no issues.
So….teach people a few different ways to do a gesture to see what works for them. I’m finding the C is a great alternative so far.
Tonight was the Brisbane .Net User Group where Dave was kind enough to introduce me.
There were so many options for what to show and talk about that I had to step back and realise most people will not have even seen a HoloLens in the flesh before, so I went right back to basics. I’d done this quite a few times, but with much smaller groups, so that people can see and feel the device.
But how do you get as many people as possible to have a go without having to explain everything to each person? Show and tell. So while we taught Chris as the first volunteer, we got everyone to practice some of the key gestures and voice commands.
Next up HoloToolkit – and how to use a bunch of the prefabs and very little code to get a focus aware cursor, tap and voice commands going. I’m still trying to work out the best way to teach the tap gesture. It’s a really interesting one to watch people do and then try other things if they get it wrong the first time.
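To give a taste of how little code the voice-command side takes, here’s a sketch using Unity’s KeywordRecognizer – this is not the exact demo code, and the keyword and handler are made up for illustration:

```csharp
using System.Collections.Generic;
using System.Linq;
using UnityEngine;
using UnityEngine.Windows.Speech;

// Minimal voice-command sketch with Unity's KeywordRecognizer.
// "Come here" is an invented example keyword, not from the demo.
public class VoiceCommands : MonoBehaviour
{
    private KeywordRecognizer recognizer;
    private readonly Dictionary<string, System.Action> commands =
        new Dictionary<string, System.Action>();

    void Start()
    {
        commands.Add("Come here", () => Debug.Log("Heard: Come here"));

        // Recognizer listens for the dictionary's keys and
        // dispatches to the matching action when one is heard.
        recognizer = new KeywordRecognizer(commands.Keys.ToArray());
        recognizer.OnPhraseRecognized += args => commands[args.text]();
        recognizer.Start();
    }
}
```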
We also had a bit of a play with Vuforia, but the demo gods were against me. After working flawlessly all day, on the night it wasn’t having any of it – on either their sample or my own app. Oh well.
Here’s a few of the links that people specifically asked about:
Buying the device (in Aus as opposed to US)
I’m presenting at this month’s Brisbane .Net UG on HoloLens. Details:
Reality – Virtual, Augmented and Mixed. Join Bronwen – Microsoft Emerging Experiences MVP and get your reality in check.
This session will cover the different types of reality experiences and focus on the mixed reality of HoloLens. Bronwen will show you the actual device and how you can start building amazing holographic experiences for this self contained device. Don’t have a device – we’ll cover how you can start today without one.
Where: Microsoft Brisbane
When: 6pm 17th January
Vuforia have put together a sample app so you can have a go in HoloLens.
To get it going quickly I downloaded the vuforia samples eyewear 6.2.6 under the Digital Eyewear section and unzipped.
I imported this package and the latest holotoolkit package (because the setup and build is idiot-proof more than anything else).
I only cared about the actual scene above so selected Vuforia-2-HoloLens to include in the build.
I entered my newly acquired license key.
Then I built and deployed to the HoloLens. And all I got was nothing. After repeating this a few times I realised I’d been silly – I needed something for it to recognise.
So then I went hunting for the images that are referenced in the sample and found them here. Then I made a quick dash over to my parents place because I don’t have a colour printer and it didn’t work with black and white or just using my laptop.
Then I laid them out on the table and ran and tada…teapots.
For me there were a few lessons here…
I was working through Rick’s Creative Coding with Unity sessions and one of those nice little tips he gave was how to quickly reset your transform properties.
Instead of painfully setting them all to zeros each time:
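For reference, the scripted equivalent of that reset looks like this – a sketch of what “back to defaults” means for a transform, not necessarily Rick’s exact tip:

```csharp
using UnityEngine;

public class ResetTransform : MonoBehaviour
{
    // Put position/rotation back to the origin and scale back to 1 –
    // the same result as zeroing each Inspector field by hand.
    public void ResetAll()
    {
        transform.localPosition = Vector3.zero;
        transform.localRotation = Quaternion.identity;
        transform.localScale = Vector3.one;
    }
}
```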
One of the apps that can be useful when demoing and generally seeing what’s going on in the HoloLens is the companion app.
The problem, it seems, for those of us in Australia (and presumably other non-US countries) is that once you get to it in the store – the search doesn’t yield results – you can’t download it. Even though we can now get HoloLens in Aus, the app hasn’t been updated.
To get around this, adjust your PC region settings to United States.
Go back to the store and you can download it. You can return to your regularly scheduled viewing by flipping your region back to Australia.
DDD Brisbane is done and dusted for another year. After 5 years at QUT this year we grew yet again and took on a new venue. The lovely Advanced Engineering Building at UQ.
At 6:30 this crew started arriving to help setup. We had over 400 bags, lanyards, agendas, many tables, lecterns etc. to get in place before the crowd descended upon us.
As the crowd started to arrive the team did well to quickly process every attendee. It flowed pretty well considering the foyer here is much narrower than the past.
The mountain of swag bags slowly became a mound and then petered to a pile.
Damian kicked off the day with some housekeeping, including announcing some of the special speakers we had lined up.
Troy Hunt kicked off the sessions as our keynote speaker.
The main auditorium seemed to soak up our 400 attendees, but it was a packed house from the start.
We kicked off session 1 of the day with .Net Core, surviving uninspiring workplaces and HoloLens.
The first break kept both our coffee carts very busy. These guys worked tirelessly from 8 till 4pm keeping our crowd filled with caffeine.
In some of the brief quiet periods it gave the crew a chance to play with some of the
After that bit of caffeine, we were straight back into it with some .Net Core on Linux, TDD and building walls around aggregates.
In the pre-lunch sessions (always hard when people’s stomachs start to rumble) we had Identity Server, why maintenance is so hard, and serverless architecture.
As we’ve grown, we’ve also had to grow our food. A couple of years back we realised it was too much for us to be running around after food, and this year we stepped it up another notch. The lines were long, but these guys did a great job of feeding the hungry crowd.
Post lunch, straight back into it with Search engines with .Net, Delivering more with less features and How to protect yourself from being owned.
The sessions before afternoon tea included: Simplifying user interface programming, Developer to Entrepreneur and Industrial grade continuous delivery.
Afternoon tea – the last chance to get a coffee and speak to our great sponsors. Minds were blown with the amazing experiences of the latest from VR and MR.
Last of the technical sessions today and the crowds continued to watch: TypeScript, Agile Retrospective and a session on Blizzard Legion Launch.
This year we say good-bye to our fearless leader Damian who is leaving us for Canada. It was really nice to look back at the changes to the venue, crowd and food over 6 years! It was also nice to see so many familiar faces have re-joined us year on year.
It wouldn’t be DDD without the wheel of prizes supplied by our wonderful sponsors. Lots of happy faces. Can’t wait to see what people do with the awesome prize of the Vive.
Of course the event would not exist or run smoothly without the organizers and volunteers! Thanks again for all your help to make this what I think was the best DDD Brisbane yet.
After getting a real HoloLens I thought the best thing was to take it out into the world to meet some real people.
Normally I’d spend some time to put together a nice presentation. I hadn’t had time so I was bad and winged it.
My first experiment was an interesting one. I’d configured the device for me and I don’t have a pupil measuring device so I thought I’d see how we go at just using it as is.
After showing people the first gesture – the select – I let them go.
It didn’t take long until the “Wow!”, “No way!” and the general look of amazement on the face was present.
Helping them without seeing what they see was a bit challenging, so I quickly decided I needed to at least have a few holograms lying around for them to see.
I’ve tried to explain the experience to people for over a year, but no matter how many times you explain it, people do not understand its awesomeness until they experience it themselves.
It doesn’t take long till people are selecting, rotating, placing and playing with holograms like a pro. You soon notice that your room is all of a sudden filled with hamster wheels, astronauts and fish swimming around your desk.
I also have a bad habit of losing my hologram menus, so I’ve taken to putting them on a wall.
After playing with the emulator for awhile I finally caved and bought a real device.
While doing the initial setup I got stuck at the network connection screen. At first I thought this was due to the hotel internet needing a special browser page with details to login so I left it.
The next day, when I had more conventional wi-fi, I realised something wasn’t quite right. I was back at the same screen and couldn’t see a network… hmmm… Luckily I was in a room full of other HoloLens geeks, so we got someone to share their phone data as a hotspot… still nothing.
At this point I realised something was amiss and I was going to need to reset it.
From here I followed the instructions. Plugged in my HoloLens, downloaded the software and installed to the HoloLens. Halfway through the install I got a failure. I tried running it again but it couldn’t find my device now as it was in a funny state.
So now I’m going into manual recovery mode.
1. Hold the power button for 15 seconds, until all the lights are off. For me this felt like it took a minute…
2. Press and hold the volume up button
3. While holding the volume up button, press and hold the power button for 10 seconds, until the middle LED on the device arm lights up. Again, this felt like at least a minute.
At this point you won’t be able to do anything with the HoloLens. It took a few goes before my computer would pick it up in this mode. My 2nd attempt failed and I ended up having to do it for the 3rd time.
After the install worked, I was able to get successfully through the start up, calibration, network connection and away I went. I must say I was getting quite worried I’d bricked the device for good.
Hopefully this will help anyone else that gets themselves into a funny state on the HoloLens.
Great news for everyone that wanted a HoloLens and couldn’t get past the application form stage. The team announced today general availability of up to 5 devices from the store.
The only catch is that it’s still only available to those with a US or Canada shipping address….
So today was the most AWESOME day! When I first saw the videos of HoloLens early this year I knew I HAD to come to Build to see it.
I was so fortunate to be able to attend the Holographic Academy – a half day hands on labs with HoloLens.
When you first put it on and see your first hologram it truly is MIND == BLOWN! While there are a few little issues like field of view others have mentioned I was totally blown away and feeling very Tony Stark running around with my Hololens.
I was amazed at the spatial mapping capabilities. I could take my virtual, holographic ball and drop it onto a couch. Then I could lift up the cushion on an angle and watch the ball roll off the couch, to then see it land in someone’s hand (who wasn’t wearing a HoloLens, so couldn’t see it).
I was a kid in a candy store for half a day, and I thought my head would fall off from the giant grin.
I was also impressed by the sound. It’s crisp, clear and you can’t hear the sound from someone else’s hololens.
To get an idea of what I got to see there’s a cool video here: