Gesture Priming and the Ultimate Coder Challenge

Posted on 13 February 2013, Last updated on 21 September 2019


On Friday we will be announcing our participation in a new Intel-sponsored competition to develop showcase applications that use gesture, tracking and voice input via the new Perceptual Computing developer hardware and SDK that Intel have developed with their partners.

In this article I want to put forward some thoughts about perceptual computing, how it can be used, some of the issues and a few ground rules, but first we need to talk about the past.

When I first started using touchscreen laptops in 2007/2008 it took me a long time to build a core set of use cases that went beyond just showing off. Touch is fun, but in some cases it is less productive than the keyboard or mouse equivalent, so we need to be aware that it will take a while for a core set of usable scenarios to emerge. A side-swipe to move between photos is easy to remember and use. Pointing at a tile with one finger and selecting with five fingers may be less intuitive and less reliable, which means it could be easier just to touch the screen.

With the new developer hardware being offered by Intel (you can buy the Creative perceptual computing hardware here) we’re not just talking about 2D object recognition. 3D object tracking is part of the hardware and SDK (by SoftKinetic), as are facial tracking and close-range tracking of, for example, fingers (by Total Immersion), and along with the dual-array mic and voice recognition software (by Nuance) there’s a huge range of options on offer. Add touch, multi-touch mousepad, mouse and keyboard shortcuts into the mix and it could get messy.
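To give a feel for what juggling those modalities might look like inside an application, here’s a minimal, self-contained C++ sketch. It does not use the real Perceptual Computing SDK; the event names and structures are hypothetical. The idea is simply that every modality feeds one shared set of application commands:

```cpp
#include <iostream>
#include <optional>
#include <string>

// Hypothetical input event; the names here are illustrative and are not
// taken from the Perceptual Computing SDK or any other real API.
enum class Source { Touch, AirGesture, Voice, Mouse };

struct InputEvent {
    Source source;
    std::string name;   // e.g. "swipe_left", "next", "click_next"
};

// Map events from different modalities onto one application command, so a
// side-swipe in the air, the spoken word "next" and a mouse click on the
// "next" button all end up doing the same thing.
std::optional<std::string> toCommand(const InputEvent& e) {
    if (e.name == "swipe_left" || e.name == "next" || e.name == "click_next")
        return "next_photo";
    if (e.name == "swipe_right" || e.name == "previous")
        return "previous_photo";
    return std::nullopt;   // unrecognised input is ignored rather than guessed at
}

int main() {
    const InputEvent events[] = {
        {Source::AirGesture, "swipe_left"},
        {Source::Voice,      "next"},
        {Source::Touch,      "swipe_right"},
    };
    for (const auto& e : events)
        if (auto cmd = toCommand(e)) std::cout << *cmd << '\n';
}
```

However the fusion is done, the point is that the application sees one consistent command per action, which is exactly the kind of consistency argued for later in this article.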

We’ve covered software development with touch many times before on Ultrabooknews.com. One of the best overviews is here. You’ll find a reference to the Windows 8 Touch Language, and it’s important that similar guidelines are created for gesture control. I hope that through the Intel Ultimate Coder Challenge: Going Perceptual competition we’ll end up with some agreement on what’s working, what’s not working and what should be part of a Windows 8 gesture control language.

Intel have already put together a very interesting document on the subject of gesture guidelines. Human Interface Guidelines, part of the software development kit for Perceptual Computing, is available here and is definitely worth reading.

As I mentioned earlier, we’re in the learning stage with gesture control right now, and through the Intel Ultimate Coder Challenge: Going Perceptual competition I think we’ll refine our ideas and start to include facial input over the coming months. Here are some of my early thoughts about gestures.

Two Styles of Air Gestures

We’re likely to have two sets of gesture languages. The first will be the casual style that will form around window and application control via a lazy style of gestures, likely made from the elbow with one hand. Most of these controls need to be embedded in the OS.

The second type of gesture will be done in a much more productive style, with a tight application focus, and will often include two hands. The result for the end user must be a noticeable improvement in productivity via accuracy or ease of function. Gesture failures in this mode will rarely be tolerated. This is the model I expect to see used in the Ultimate Coder competition.
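As a rough illustration of that split, here’s a small, self-contained C++ sketch; the tracking structure and thresholds are made up for the example rather than taken from any SDK. It separates a lazy, one-handed sweep from a tight, deliberate gesture using its travel, speed and hand count:

```cpp
#include <iostream>

// Hypothetical per-gesture tracking summary; a real recogniser would give
// much richer data, but this is enough to illustrate the split.
struct GestureSample {
    double travelCm;    // total hand travel during the gesture
    double durationS;   // how long the gesture took, in seconds
    bool   twoHanded;   // were both hands tracked?
};

enum class GestureStyle { Casual, Productive };

// Rough, illustrative heuristic: big, fast, one-handed sweeps made from the
// elbow read as casual; small, deliberate (often two-handed) movements read
// as productive and demand far better accuracy from the tracker.
GestureStyle classify(const GestureSample& g) {
    const double speedCmPerS = g.travelCm / g.durationS;
    if (!g.twoHanded && g.travelCm > 25.0 && speedCmPerS > 40.0)
        return GestureStyle::Casual;
    return GestureStyle::Productive;
}

int main() {
    const GestureSample lazySwipe{40.0, 0.6, false};   // broad sweep from the elbow
    const GestureSample tightPinch{6.0, 0.8, true};    // small two-handed manipulation
    std::cout << (classify(lazySwipe)  == GestureStyle::Casual ? "casual" : "productive") << '\n';
    std::cout << (classify(tightPinch) == GestureStyle::Casual ? "casual" : "productive") << '\n';
}
```

In practice a recogniser would use far richer signals, but the same basic question, broad-and-lazy versus small-and-precise, determines how forgiving the application can afford to be.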

 

[Images: casual and productive gestures]

I’ve shown images of a desktop style of device. While the tablet style of device should definitely be taken into account, I don’t see it being used much at this stage unless tablets are effectively used as fixed desktop screens. Large-screen presentations should also be considered, but these will follow a similar split of casual and productive gestures to desktop screens.

Note that in the images above we have hand (object) tracking and multi-point tracking, and that in both cases there may be a depth element. Zooming, for example, could be a casual gesture played from the elbow that uses either point tracking, with the fingers opening, or object tracking, using the hand’s distance from the screen.
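Here’s a minimal sketch of how those two zoom variants might be computed; the function names, units and limits are illustrative rather than taken from any SDK:

```cpp
#include <algorithm>
#include <iostream>

// Point-tracking variant: the zoom factor follows how far the fingers have
// opened relative to where they started. All names and limits are illustrative.
double zoomFromFingerSpread(double startSpreadCm, double currentSpreadCm) {
    return std::clamp(currentSpreadCm / startSpreadCm, 0.25, 4.0);
}

// Object-tracking variant: the zoom factor follows how much closer the whole
// hand is to the screen than when the gesture began.
double zoomFromHandDepth(double startDepthCm, double currentDepthCm) {
    return std::clamp(startDepthCm / currentDepthCm, 0.25, 4.0);
}

int main() {
    // Fingers opened from 3 cm to 6 cm apart: zoom in to 2x.
    std::cout << "pinch zoom: " << zoomFromFingerSpread(3.0, 6.0) << '\n';
    // Hand moved from 45 cm to 30 cm from the screen: zoom in to 1.5x.
    std::cout << "depth zoom: " << zoomFromHandDepth(45.0, 30.0) << '\n';
}
```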

Gestures must be consistent, intuitive, simple, reliable and have a low error-rate if users are going to adopt them. Complex gestures with a high error rate aren’t useful for anyone.
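One common way to keep the error rate down is to fire a gesture only after the recogniser has reported it with high confidence for several consecutive frames, and then to ignore repeats until the hand relaxes. A minimal sketch of that idea, with illustrative thresholds:

```cpp
#include <iostream>

// Only fire a gesture once it has been reported with high confidence for
// several consecutive frames, then ignore repeats until the hand relaxes
// again. The thresholds are illustrative, not tuned values.
class GestureDebouncer {
public:
    // Returns true on the single frame where the gesture should fire.
    bool update(bool detected, double confidence) {
        if (detected && confidence >= kMinConfidence) {
            ++framesHeld_;
        } else {
            framesHeld_ = 0;
            fired_ = false;                      // hand relaxed: allow the next trigger
        }
        if (framesHeld_ >= kFramesRequired && !fired_) {
            fired_ = true;
            return true;
        }
        return false;
    }

private:
    static constexpr double kMinConfidence  = 0.8;
    static constexpr int    kFramesRequired = 5;  // roughly 1/6 of a second at 30 fps
    int  framesHeld_ = 0;
    bool fired_      = false;
};

int main() {
    GestureDebouncer debouncer;
    const double confidences[] = {0.5, 0.9, 0.85, 0.9, 0.95, 0.9, 0.9};
    for (double c : confidences)
        if (debouncer.update(true, c))
            std::cout << "gesture fired\n";      // printed exactly once
}
```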

Note: I’ve been trying to find out what the situation is with gesture patents. It’s a complex area and one that developers should be aware of.  I will not be covering this topic in this article.

Windows 8 Modern or Windows 8 Desktop Gestures?

The area of Windows 8 most likely to get traction with gestures is the Modern UI. It’s new and is likely to be more openly explored by users. It’s the perfect place, and the perfect time, to start implementing gestures into this tile system, and in fact the UI lends itself well to air gestures. Developers are also encouraged to implement simpler user interfaces in this area. This is an area where the OS vendor will play a big part by providing and supporting SDKs and building their own gesture support into user interfaces.

Unfortunately I don’t think we’re that far yet, so what we’ll see first is a set of experimental and showcase applications on the desktop, and that’s what the ‘Going Perceptual’ competition is all about.

You’ll see specific applications and specific gestures, and there is likely to be a wide range of implementations. This is our playground. This is where we learn about what’s good and what’s not.

Software Developers – See our Ultrabook Software Development Resources.

Gesture Criteria

The worst thing that could happen to the development of gesture control is a lack of guidance and a lack of agreement on basic gestures.

  • Gestures need to be intuitive, so that users can discover them.
  • Gestures need to be simple for everyday use.
  • Gestures need to be accurate and reliable for productivity.
  • Gestures need on-screen feedback.
  • Gestures need to be consistent across applications.
  • Gestures must be optional (at least while gestures evolve and find their place in computing).

Consistency could be one of the hardest issues to control. We’ve already seen problems with touch where some touch controls are not consistent across applications and there’s no agreement on common touch controls. In fact, one gets the impression that patents are involved. This needs to be avoided with gestures.

Ultimately, OS vendors need to be thinking about gesture control, object tracking, voice control and facial analysis today. If they work together now there’s a much greater chance of these input methods succeeding in the future.

Will Gestures be Successful?

Gestures have every chance of being successful if there’s consistency and accuracy but there also needs to be a commitment from operating system vendors. OSVs need to talk to each other NOW to ensure that we move forward with a well-defined set of rules. The Ultimate Coder Challenge is going to be a good forum in which to learn and we’re pleased to be part of it. We encourage OSVs, ISVs and manufacturers to watch and contribute.

Follow the new Intel competition – Intel Ultimate Coder Challenge: Going Perceptual. We are judging and we’re looking forward to learning lots more about this topic. Watch out for a demo soon too!

This is a sponsored post brought to you by Intel and Ultrabooknews. All content written by Ultrabooknews. Subject and source article by Intel. We thank Intel for their support of Ultrabooknews.

