7 teams are competing in the Ultimate Coder Challenge, building showcase applications for a Lenovo Yoga 13 to demonstrate the Intel Perceptual Computing hardware. I’m one of the judges, and this is the last week of the competition for the competitors. The competition closes on April 12th, which leaves just over 2 days before developers have to hand over their apps to us. My equipment arrives tomorrow, so I’ll be setting up the Lenovo Yoga 13 and the perceptual computing kit over the weekend. I’m looking forward to testing.
You can find all our Ultimate Coder posts here
All our Perceptual Computing posts here
Important note: Intel will be holding their software keynote at 0900 Beijing time tomorrow (11th April) where we expect to hear more about perceptual computing and HTML5 development. Check back tomorrow for an update from Beijing.
Infrared 5 / Brass Monkey
Kiwi Catapult Revenge uses two main methods of input: the Brass Monkey control system uses smartphone sensors for guiding the flight, while the perceptual computing sensors control additional features. For a 7-week project I think Infrared 5 chose the right approach to getting a product complete, but they may not have chosen the right approach in terms of focusing on the issues, techniques and programming skills required for the perceptual computing competition. Kiwi Catapult Revenge will be fun to play though, and I’m looking forward to testing it out. Read the Infrared 5 final post and watch a fun and informative video here.
Peter O’Hanlon
We all went into this project without knowing exactly what the hardware and SDK were capable of, and Peter rightly highlights that there are a lot of issues to be sorted out before the perceptual computing hardware and SDK become mainstream. In my discussions with Intel and others, 2015 is seen as the year when this technology could be successfully deployed into Ultrabooks and AIO PCs, and if we think of the greater timeline, we’ve really only just finished writing page one of the book. Peter has highlighted that it’s bloody hard work going solo and, of course, I’ll be taking that into account when judging. Read Peter’s final post here.
The Code Monkeys highlight something that I think is extremely important in any perceptual computing implementation: feedback. Gavin added a head and hand feedback mechanism to the software. The user is going to ‘get it’ a lot quicker with this sort of feedback, and I think we need to highlight it as a requirement that should be considered for all software that uses perceptual computing. Maybe Intel should put together some more detailed recommendations in this area. That’s not all they have to say though. There’s a very detailed post covering the issues of redesigning an existing UI for perceptual computing inputs, including a section on responsiveness and computational load – something I was worried about at the start of this competition. The Stargate Gunship roundup is here.
I think we can safely say that Sixense have the cutest result in the competition. Puppet Motion looks like a lot of fun and, in the videos we’ve seen, very smooth and responsive. The artwork is fantastic, and Sixense have done something that could be as critical as the on-screen feedback I mentioned before: training, or guiding the user through their first experience, makes it a better one. Check this video out and then the final post from Sixense, where you’ll find more info and another great video.
Simian Squared have, for me, one of the most exciting and challenging projects in the whole competition. It was always going to require accuracy, low latency and a lot of computational power to get a rotating pottery wheel responding to two hands moving in front of the PC camera. Have they done it? I’m not sure. Their last post talks about how they could implement the deformations – and there are 2 days to go! All the details here.
Eskil starts his final post with a thumbs up for Ultrabooks that I can relate to. It’s all about quick availability, and with Haswell that’s going to get even better when the new Connected Standby Ultrabooks appear in the second half of 2013. He also talks about build tool sets that can help to make better products in the future. Eskil’s user interface library will be complete on Friday, and I look forward to seeing if he’s contributed something special to the future of Perceptual Computing. Eskil’s week 6 post was a little late for me to include in my summary last week, so having read it now, I’m quite excited to test it out.
Lee has taken a concept that I think has legs. The 3D video conferencing application is impressive to watch, and I hope to be equally impressed when I test it this weekend. Lee calls the competition an ‘Elite VIP Hackathon’, which is right. These teams have been invited to compete against each other and have ended up working with each other to produce information, good and bad, that can be used by developers in the future. In Lee’s final post you’ll find the software, a great 9-minute overview video from Lee and information on his body mass tracking algorithm. Lee’s last post is here. Thanks for the fireworks, Lee.
So that’s it for the summaries. You can find the previous Ultimate Coder summaries here. Starting tomorrow I’ll be setting up the Lenovo Yoga 13 and testing. During next week I’ll be assessing the software, and then on the 24th April the winner will be announced.
Important links for the Intel Ultimate Coder Challenge: Going Perceptual Competition
- Intel competition Website
- Join everyone on Facebook
- Competitors list and backgrounder
- Twitter stream #ultimatecoder
- All our perceptual computing coverage can be found here.
- All posts about this and the previous software competition can be found here
Full disclosure – We are being compensated for content related to the Ultimate Coder and to attend an Intel event. We have committed to posting at least once per week on this subject and judging the final applications. All posts are 100% written and edited by Ultrabooknews.