Seven teams have just finished week 2 of the Ultimate Coder Challenge, building showcase perceptual computing applications on a Lenovo Yoga 13 that demonstrate the sort of interaction capability that could be built into an Ultrabook, and perhaps the operating system, in the future.
I’m involved with the judging of the Ultimate Coder event, and every week you’ll find an update from me as I analyse the teams’ progress.
This is week 2. The teams presented their application ideas last week, and I was honestly quite surprised at how high all the teams had set their targets. There’s one red flag to report, but apart from that – wow!
You can find all our Ultimate Coder posts here
All our Perceptual Computing posts here
Peter O’Hanlon – UI frameworking
Peter is writing a kind of sequencer for photo editing called Huda, and the editing process will include gesture control. In this week’s post Peter gets into some UI design work and kicks off his coding with Windows Presentation Foundation and a playlist of rock music. Peter’s app differs from the others here as he’s starting from scratch, so it’s important to get the UI right first time. There’s no way he has time to go back and change it later.
There’s lots of code for you to dive into if you’re coding in WPF so check that out. Have a look at his videos too. It’s all here in the Week 2 Post.
Code Monkeys – Combination inputs
As a reminder, the Code Monkeys are adding gesture support to one of their games. They have been considering UI aspects too, despite already having a screen-based layout in place. Maybe we should call it ‘user interaction’ instead of ‘user interface’ here. The Code Monkeys will be looking at eye tracking as a way to decouple the gun aim from the centre of the screen. This sounds good, but how? If it works, you’ll be able to look at an enemy in the game and then make a gesture to start shooting. I really like that idea of combining inputs, but if it’s not accurate or responsive it could kill the game completely. Read the Week 2 post here
Simian Squared – Clay Reality
Balancing realism and fun is discussed in the Simian Squared update today, and I think they’re right because there’s no way you can 100% represent a lump of clay on a potter’s wheel with the hardware we’ve got here. There’s no haptic feedback, and there are going to be inaccuracies, errors and latency to deal with. Simian Squared have taken on a complex task here, but the processing power of the Ultrabook is going to really help. I wouldn’t be surprised if they hit its limits a few times too.
Simian Squared provide a full rundown of their thoughts in this Week 2 post.
Eskil Steenberg – Betray is alive.
Eskil (AKA Quelsolaar) is building an input framework for user interfaces. It will offer generic positional data regardless of the input device, whether it’s a mouse or a hand gesture. I’m amazed at the depth of what’s going on here, but I’m also struggling to imagine what’s going to come out at the end. Will this be a showcase application, or a platform on which showcase applications need to be built before we can see the strengths of Perceptual Computing?
Eskil has worked through a huge amount of code this week, though, and in his Week 2 post you’ll find a ton of detail and a link to the plugin-enabled library he’s developing. BETRAY_H has started!
Sixense – Rag Doll Blues
Story of the week! How gutted must Sixense have been when they saw a video, presented at a huge new product launch, that echoed exactly the idea they had planned for the Ultimate Coder challenge.
They slept on it, came back to analyse what had been presented, and decided that they had a better idea: more interactive and more realistic control of the puppets.
There’s some great insight into gesture control issues in the Sixense Week 2 post, and despite the shock you get the impression they’re really excited about their project, which I’m sure will rub off on the final product.
Lee – 3D video conferencing
Lee has hit on something I find extremely interesting. By using 3D data from the camera, Lee wants to create a video conference session using representations of the original person. Not only could this be fun to play with, but it has some more serious applications in terms of bandwidth reduction for video conferencing.
In testing, Lee has proved his idea could work by testing the 3D data and checking frame rates. He’s happy with progress and will be moving forward with a better representation of a face next week.
Lee’s week 2 post includes a video demo and is available here.
Brass Monkey – Kiwi Catapult Revenge
Kiwi Catapult Revenge is a game that’s going to use sensor input from a smartphone and head-tracking input via the perceptual computing hardware to control a game in a browser window. That’s a lot of technology and a lot of layers of software. Will the Ultrabook have enough power to keep up, or will it be the bottleneck?
It appears that the 3D depth information from the camera is above expectations, and Brass Monkey are now planning to try to track pupil movement, which is something I don’t think anyone was expecting. It adds more complexity, though, and the clock is ticking, but the team seems well organised.
You can read the Week 2 post here and even try out a simple scene movement demo.
It’s nice to see all the teams making good progress. There haven’t been any major issues, and all the teams seem confident about their targets. The only red flag I have to fly is for Eskil, who needs to think about how he’s going to showcase the perceptual computing features with his framework. Has he budgeted time for demo software? On the positive side, we’re seeing good feedback on the strengths of the hardware and the possibility of eye tracking.
Important links for the Intel Ultimate Coder Challenge: Going Perceptual Competition
- Intel competition Website
- Join everyone on Facebook
- Competitors list and backgrounder
- Twitter stream #ultimatecoder
- All our perceptual computing coverage can be found here.
- All posts about this and the previous software competition can be found here
Full disclosure – We are being compensated for content related to the Ultimate Coder and to attend an Intel event. We have committed to posting at least once per week on this subject and judging the final applications. All posts are 100% written and edited by Ultrabooknews.