Ultimate Coder Challenge Week 6 – Road Tests Reveal Issues. Competitor Interaction Reveals Tips.

Updated on 13 August 2018


Seven teams are competing in the Ultimate Coder Challenge, building showcase applications for a Lenovo Yoga 13 Ultrabook to demonstrate the Intel Perceptual Computing hardware.

I’m involved with the judging of the Ultimate Coder event, and every week you’ll find an update from me as I analyze the teams’ progress. This is week 6, one week before the competition closes on April 12th, and the teams have just come back from GDC, where they demonstrated their apps to attendees and judges. Foot and tongue tracking feature in the posts this week, but I’ve checked the dates on the posts and, sure enough, there may be a few April Fools going on here! Ben was at GDC for Ultrabooknews last week, and his comments and a few videos are included below.

You can find all our Ultimate Coder posts here

Our Ultrabook software developer resources are here

All our Perceptual Computing posts here

Only five of the seven teams have reported back after GDC, so either the GDC feedback has caused them to throw in the towel or they’re simply coding like mad to get things finished. Given how well the teams have been supporting each other, I doubt anyone is going to give up. Here’s a summary of this week’s posts, plus additional reporting and videos from Ben Lang, who was at GDC.

Peter O’Hanlon

As a result of competitor interaction [a key part of this competition has been inter-competitor support and interaction and all teams should be congratulated on that] Peter has decided to re-introduce voice recognition into his HUDA image manipulation engine. He’s also put a video up showing some of the gestures he’s introduced for different image filters. You can read his update here.

Infrared 5 / Brass Monkey

Kiwi Catapult Revenge got feedback at GDC that will be used to fine-tune the final product. The team refined the head-tracking algorithm to reduce CPU load and demonstrated foot tracking…


Great one, team Infrared 5, although I note the date of your post – April 1st!

With just over a week to go, Infrared 5 are starting to take risks: they’ve announced that there are special features they still want to add in. Gulp! The week 6 post can be read here.

Code Monkeys

If you thought foot tracking was mad, check this out. Tongue tracking! Inspired by the work of Dr G Simmonds. You have to watch the video to appreciate this and I’m not giving anything away here. Full 1st April post here. Nice one team!

Lee Bamber

Lee managed to squeeze in some coding for his Perceptucam app while at GDC but also learnt a lot from the live testing. Lee says that the voice recognition isn’t working well in noisy environments. This could be due to cheap mics on the perceptual computing hardware (there’s a lot that can be done with beam-forming and noise-cancelling mics that might not have been implemented) or to un-optimised software in the SDK. Read Lee’s post, including a few good perceptual computing developer tips, here.


Sixense

The Puppet in Motion app that Sixense is working on is one of my favorite ideas in the competition because it has the potential to test the latency, processing power and SDK to the limits. The demo this week is fantastic and I can’t wait to get my Perceptual Computing hardware set-up (it arrived today) to get going with the demo. Check out the post, and video, here.

Ultrabooknews at GDC

Ben was at GDC last week and had a chance to talk in detail to a couple of the teams. Here’s his additional report.

Brass Monkey

I met up with Chris from Brass Monkey to see the latest on their game. The graphics have advanced noticeably from previous weeks. The 2D cardboard style reminds me of Little Big Planet. It’s a smart art choice because it means they can keep polygon counts low, for the sake of performance, while retaining a distinct style.

At present the flame-breathing weapon is triggered with a button on the smartphone controller. Chris tells me that they want to use mouth-detection from the Gesture Camera input to trigger it instead. He’s also considering adding volume as an input. So to breathe fire you’ll have to open your mouth and roar like an angry fire-breathing dragon, which would certainly be a funny sight!

The flame effect is currently hogging too much performance — when you try to hold down the button it slows the game way down. The team knows they need to fix this and will be working on a solution that will maintain a good looking effect without sacrificing so much performance.


Sixense

I met with Chip from Sixense at GDC to see Puppets in Motion. The team had previously shown a control scheme wherein you open and close your hand like you would with a sock puppet, and that input is used to animate the characters’ bodies and mouths. The build they had at GDC used an open-palm gesture where moving your thumb back and forth would animate the mouth and the direction of the palm dictated the body direction. The latter seems less intuitive but still works well — Sixense’s Gesture Camera input feels very tight.

Chip told me that the team wants to implement a blowing input wherein you blow toward the camera as the wolf to blow the pig’s house down. The camera’s mic would detect the sound of the air on the camera and translate that into a force to knock the house down, perhaps with a physics-driven animation. It would be fun to have it physics-based, especially if the force used to push the house was based on volume — this way the house could be knocked down gently or with great force, which would open up more narrative paths.
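The volume-to-force idea above could be sketched roughly as follows. This is a hypothetical illustration, not Sixense’s code: take one frame of mic samples, compute its RMS loudness, and map anything above an ambient-noise floor to a clamped physics impulse. All names and constants here are my own assumptions.

```python
import math

NOISE_FLOOR = 0.05   # ignore ambient room noise below this RMS level (assumed value)
MAX_FORCE = 500.0    # clamp so a loud shout can't apply unlimited force (assumed value)

def rms(samples):
    """Root-mean-square loudness of one audio frame (samples in -1..1)."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def blow_force(samples):
    """Map frame loudness to an impulse: a gentle puff gives a small force."""
    loudness = rms(samples)
    if loudness <= NOISE_FLOOR:
        return 0.0
    # linear mapping above the noise floor, clamped at MAX_FORCE
    return min(MAX_FORCE, (loudness - NOISE_FLOOR) / (1.0 - NOISE_FLOOR) * MAX_FORCE)

# A quiet room produces no force; a hard blow on the mic produces a big one.
quiet = [0.01] * 256
blow = [0.8 * math.sin(i * 0.3) for i in range(256)]
print(blow_force(quiet), blow_force(blow))
```

Because the force scales continuously with loudness, the same input lets the player knock the house down gently or violently, which is exactly what would open up the different narrative paths mentioned above.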

Recording is built into the program, which will make it easy for users to share their stories. At present I only saw two models (a wolf and a pig) in the game. The pig model was done well, but the wolf looked like it could use a bit more attention — it might still be placeholder art. The ability to make your own models, or to dress up and customize the existing ones, would give users more room for creativity.


Important links for the Intel Ultimate Coder Challenge: Going Perceptual Competition

Full disclosure – We are being compensated for content related to the Ultimate Coder and to attend an Intel event. We have committed to posting at least once per week on this subject and judging the final applications. All posts are 100% written and edited by Ultrabooknews.


2 Comments For This Post

  1. Robert says:

    As fun as breathing fire is, it would be nice if developers would look at the various ways perceptual computing could help handicapped people in their lives as well.

  2. Pete O'Hanlon says:

    Thanks for the updates Steve. I’m really disappointed that I couldn’t catch up with you last week, but who knows what the future holds – and if we do, there’s a beer or three from me for keeping me entertained throughout the competition.
