Thursday, August 11, 2011

Kinect Control for WPF

Yesterday, Esri’s Applications Prototype Lab released a sample for ArcGlobe that allows users to navigate in three dimensions using a Kinect sensor and simple hand gestures.

This post describes KinectControl, a utility library developed in conjunction with the ArcGlobe add-in. KinectControl is a WPF user control that can display raw Kinect feeds but, most importantly, provides developers with the orientation, inclination and extension of both arms relative to the sensor. KinectControl was developed as a generic library that can be used to "kinectize" any application.
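
For developers, the control is intended to be dropped into a window and queried for arm values. The sketch below shows one way an application might consume it; the property and element names (kinectControl, RightArmOrientation, RightArmInclination, RightArmExtension) are illustrative assumptions, so check the sample source for the actual members.

    using System.Windows;
    using System.Windows.Media;

    public partial class MainWindow : Window
    {
        public MainWindow()
        {
            InitializeComponent();

            // Poll the control once per rendered frame. "kinectControl" is
            // assumed to be a KinectControl declared in the window's XAML.
            CompositionTarget.Rendering += (s, e) =>
            {
                double heading = kinectControl.RightArmOrientation; // degrees left/right of the sensor
                double pitch = kinectControl.RightArmInclination;   // degrees above/below horizontal
                double reach = kinectControl.RightArmExtension;     // fraction of full arm extension
                Title = string.Format("Heading {0:F0}  Pitch {1:F0}  Reach {2:P0}", heading, pitch, reach);
            };
        }
    }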

The following few screenshots demonstrate the capabilities of the KinectControl. By default, KinectControl displays the sensor’s video feed and the skeleton of the closest person to the sensor. The orange text at the bottom of the app is debug information from the test application.

[Screenshot: video feed with skeleton overlay and debug text]

Occasionally a limb may appear red. This indicates that one or more of the limb's joints cannot be "tracked" and its position is "inferred", or approximated, by the sensor. This often happens when a joint is obscured from view; for example, if a user points their hand and arm directly at the sensor, the sensor cannot see the user's shoulder.
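
A minimal sketch of how an application can make the same distinction itself, using the JointTrackingState enumeration from the beta SDK's Microsoft.Research.Kinect.Nui namespace:

    using Microsoft.Research.Kinect.Nui;

    static class LimbHelper
    {
        // A limb should be drawn red when either of its end joints
        // is not fully tracked by the sensor.
        public static bool IsInferred(SkeletonData skeleton, JointID from, JointID to)
        {
            return skeleton.Joints[from].TrackingState != JointTrackingState.Tracked ||
                   skeleton.Joints[to].TrackingState != JointTrackingState.Tracked;
        }
    }

    // For example, the upper right arm:
    // bool drawRed = LimbHelper.IsInferred(skeleton, JointID.ShoulderRight, JointID.ElbowRight);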

In the upper right hand corner of the KinectControl are three buttons that allow the user to toggle between three different views. The video and depth views are self-explanatory; the third, blend, is a combination of both.

[Screenshot: the video, depth and blend view toggle buttons]

The blend view color-codes each person identified by the Kinect sensor with a different color, as shown below. The Kinect sensor can identify up to seven people.

[Screenshot: blend view with each person color-coded]
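
In the beta SDK, the per-pixel player index comes from the DepthAndPlayerIndex depth format: each 16-bit pixel carries the player index in its three low-order bits and the depth in millimeters in the remaining thirteen. A sketch of unpacking it (the actual coloring is left out):

    using Microsoft.Research.Kinect.Nui;

    // Handler for Runtime.DepthFrameReady when the depth stream was opened
    // with ImageType.DepthAndPlayerIndex.
    void OnDepthFrameReady(object sender, ImageFrameReadyEventArgs e)
    {
        byte[] bits = e.ImageFrame.Image.Bits;
        for (int i = 0; i < bits.Length; i += 2)
        {
            int playerIndex = bits[i] & 0x07;                   // 0 = no player, 1-7 = a person
            int depthMm = (bits[i + 1] << 5) | (bits[i] >> 3);  // depth in millimeters
            // ... tint the output pixel with a color chosen by playerIndex ...
        }
    }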

The white stick figure graphic in the upper left hand corner alerts the user whenever he or she has moved beyond the Kinect's field of view. For example, in the screenshot below, the user has moved too far to their left.

[Screenshot: stick figure alert after the user moves out of the field of view]
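
One way to drive such an alert, and a reasonable guess at what the control does, is the beta SDK's SkeletonQuality flags, which report when a tracked skeleton is clipped by an edge of the field of view:

    using Microsoft.Research.Kinect.Nui;

    // Handler for Runtime.SkeletonFrameReady. Note the sensor reports
    // clipping in its own (mirrored) frame of reference.
    void OnSkeletonFrameReady(object sender, SkeletonFrameReadyEventArgs e)
    {
        foreach (SkeletonData skeleton in e.SkeletonFrame.Skeletons)
        {
            if (skeleton.TrackingState != SkeletonTrackingState.Tracked) { continue; }
            bool clippedLeft = (skeleton.Quality & SkeletonQuality.ClippedLeft) != 0;
            bool clippedRight = (skeleton.Quality & SkeletonQuality.ClippedRight) != 0;
            // ... show or hide the stick figure warning accordingly ...
        }
    }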

In the bottom left hand corner are two buttons to control the inclination of the sensor. Each button click will move the sensor one degree up or down.

[Screenshot: the sensor tilt buttons]
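
Tilting the sensor maps directly onto the beta SDK's Camera.ElevationAngle property. A sketch of the two click handlers, clamped to the SDK's supported range ("nui" is assumed to be the initialized Runtime):

    using System;
    using System.Windows;
    using Microsoft.Research.Kinect.Nui;

    // "nui" is the Runtime field initialized at startup.
    void TiltUp_Click(object sender, RoutedEventArgs e)
    {
        nui.NuiCamera.ElevationAngle =
            Math.Min(nui.NuiCamera.ElevationAngle + 1, Camera.ElevationMaximum);
    }

    void TiltDown_Click(object sender, RoutedEventArgs e)
    {
        nui.NuiCamera.ElevationAngle =
            Math.Max(nui.NuiCamera.ElevationAngle - 1, Camera.ElevationMinimum);
    }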

And lastly, the test app included with the sample uses data binding to display the left and right arm orientation and inclination on the screen.

[Screenshot: test app displaying bound arm orientation and inclination values]
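
For binding to work, the control has to expose the arm values as dependency properties. A sketch of how one such property might be declared (the name is an assumption, not necessarily the sample's):

    using System.Windows;
    using System.Windows.Controls;

    public partial class KinectControl : UserControl
    {
        public static readonly DependencyProperty RightArmInclinationProperty =
            DependencyProperty.Register("RightArmInclination", typeof(double),
                typeof(KinectControl), new PropertyMetadata(0d));

        // Updated by the control on every skeleton frame; the test app binds
        // with {Binding ElementName=kinectControl, Path=RightArmInclination}.
        public double RightArmInclination
        {
            get { return (double)GetValue(RightArmInclinationProperty); }
            set { SetValue(RightArmInclinationProperty, value); }
        }
    }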

Please click the link below to download the KinectControl sample. To use this sample you must have a Kinect connected to a Windows 7 computer with the Kinect for Windows SDK installed.

Wednesday, August 10, 2011

Kinect for ArcGlobe

On June 16, 2011, Microsoft released the Kinect for Windows SDK. This SDK allows Windows developers to add motion support to their applications using an Xbox 360 Kinect device. The Applications Prototype Lab at Esri has just completed a prototype that uses a Kinect to navigate in ArcGlobe.

To fly forward, the user raises their right hand. The display will navigate in the direction the right hand is pointing; we call this "superman navigation". If the left hand is elevated, the display will pivot around a central location on the globe surface. And lastly, if both hands are raised, the display will zoom in or out as the hands are moved together or apart.
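
A rough sketch of how such gestures can be detected from beta SDK skeleton joints. The thresholds and joint choices are illustrative assumptions, not the add-in's actual values:

    using System;
    using Microsoft.Research.Kinect.Nui;

    static class Gestures
    {
        // A hand counts as "raised" when it is above the shoulder line.
        public static bool RightHandRaised(SkeletonData s)
        {
            return s.Joints[JointID.HandRight].Position.Y >
                   s.Joints[JointID.ShoulderCenter].Position.Y;
        }

        // "Superman" heading: the shoulder-to-hand vector gives the
        // direction of flight (returned here as raw x/y/z components).
        public static float[] PointingDirection(SkeletonData s)
        {
            Vector hand = s.Joints[JointID.HandRight].Position;
            Vector shoulder = s.Joints[JointID.ShoulderRight].Position;
            return new[] { hand.X - shoulder.X, hand.Y - shoulder.Y, hand.Z - shoulder.Z };
        }

        // Zoom: track the change in hand separation between frames.
        public static double HandSeparation(SkeletonData s)
        {
            Vector l = s.Joints[JointID.HandLeft].Position;
            Vector r = s.Joints[JointID.HandRight].Position;
            return Math.Sqrt(Math.Pow(l.X - r.X, 2) + Math.Pow(l.Y - r.Y, 2) + Math.Pow(l.Z - r.Z, 2));
        }
    }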

To use the add-in you must have the following:

  1. Kinect for Xbox 360 sensor,
  2. Windows 7 (32 or 64bit),
  3. .NET Framework 4.0,
  4. Kinect for Windows SDK beta.

The add-in (with source code) is available here.

This add-in was challenging in the sense that translating traditional mouse navigation to motion was not easy. With mouse or touch input devices, users have immediate sensory feedback once they have clicked a mouse button or touched a screen. In a number of Xbox games, this issue has been overcome with a paused hover: the user moves the screen cursor over a button and waits a few seconds while an activation animation completes. This works well for buttons that occupy discrete areas of a screen, but not for interaction throughout the screen.

The approach adopted, rightly or wrongly, by this add-in is that of a virtual screen that exists at arm's length directly in front of the user. This virtual screen extends only ±25° from an arm pointed directly ahead at the real screen. This technique provides an approximate motion-to-screen mapping, with screen contact only at full arm extension and only within a narrow 50°×50° area in front of the user.
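
A sketch of that mapping: clamp the arm's horizontal and vertical angles to ±25° and normalize them into screen coordinates. The ±25° figure is from the post; everything else here is illustrative.

    using System;
    using System.Windows;

    static class VirtualScreen
    {
        const double HalfFieldDeg = 25.0;

        // yawDeg/pitchDeg are the arm's angles relative to pointing straight
        // at the real screen. Returns (0,0) top-left to (1,1) bottom-right,
        // or null when the arm is outside the 50-by-50 degree window.
        public static Point? ToScreen(double yawDeg, double pitchDeg)
        {
            if (Math.Abs(yawDeg) > HalfFieldDeg || Math.Abs(pitchDeg) > HalfFieldDeg)
            {
                return null;
            }
            double x = (yawDeg + HalfFieldDeg) / (2 * HalfFieldDeg);
            double y = (HalfFieldDeg - pitchDeg) / (2 * HalfFieldDeg);
            return new Point(x, y);
        }
    }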

Ultimately a better approach could be to rely purely on touch-like gestures such as left swipe, right swipe and pinching. There has been exciting work in this area by Deltakosh on the Kinect Toolbox project, but I hope that the final release of the Kinect SDK includes gesture recognition.

Monday, May 23, 2011

Landsat Touch for Silverlight

[Screenshot: the Landsat Touch application]

The Lab is proud to release Landsat Touch, a Silverlight-based web application for browsing the newly published Landsat imagery.

Try the application here:
http://maps.esri.com/sldemos/landsat/default.html
(click the “need help” button in the lower left hand corner for usage tips)

Download the source code from here:
http://www.arcgis.com/home/item.html?id=04de702976434f4d8c054005ce4603a2

Earlier this year Esri announced the publication of more than 8 TB of Landsat imagery as image services on ArcGIS Online. These services can be consumed by any ArcGIS client, by the ChangeMatters viewer, or by any custom app using the REST API.

Landsat Touch is a custom web application developed using the ArcGIS API for Silverlight. The objective of this project was to create an easy-to-use application that could browse, compare and contrast Landsat content, with support for touch devices where available. The application is fully functional using a mouse on a Windows or Mac computer, but it is optimized for Windows 7 touch devices.

Multi-touch behavior is provided by the Multi-Touch Manipulation library from CodePlex, which is partially derived from Microsoft's Manipulation and Inertia sample.

The ability of this application to compare two layers of content is conceptually similar to ArcMap's swipe layer tool. An important distinction with Landsat Touch is that users can explicitly define the "swiped" area by manipulating the size, orientation and location of windows. However, the most exciting feature demonstrated by this application is its ability to be multi-touch and multi-user. On large touch devices, say 40"+ diagonally, it is conceivable that two or more users could simultaneously add and manipulate Landsat windows.
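
Under the hood, Silverlight surfaces raw multi-touch through the static Touch.FrameReported event, and the manipulation library layers translate/rotate/scale behavior on top of points like these. A minimal sketch (the LandsatWindow class name is an assumption):

    using System.Windows.Controls;
    using System.Windows.Input;

    public class LandsatWindow : UserControl
    {
        public LandsatWindow()
        {
            // Every finger on the screen arrives in the same frame, which is
            // what lets several users manipulate windows at the same time.
            Touch.FrameReported += OnFrameReported;
        }

        void OnFrameReported(object sender, TouchFrameEventArgs e)
        {
            foreach (TouchPoint p in e.GetTouchPoints(this))
            {
                // p.Position and p.Action (Down/Move/Up) feed the manipulation
                // processing that moves, resizes and rotates this window.
            }
        }
    }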

Thursday, July 15, 2010

Touch for ArcGIS Desktop 10

[Screenshot: Touch for ArcGIS Desktop]

Just published on the ArcGIS Resource Center is a proof of concept developed by Esri's Applications Prototype Lab called Touch for ArcGIS Desktop. This contribution is an AddIn that adds touch navigation to ArcGlobe. The contribution can be downloaded from here and includes the AddIn itself and full source code. The download page includes instructions on how to install and use the AddIn.

The AddIn will install on any computer running ArcGIS 10 but a warning message will appear if you do not have a Windows 7 Touch-enabled device connected.

This AddIn is based on code from the Windows 7 Multitouch .NET Interop Sample Library.
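
At its core, Windows 7 touch support means registering a window to receive WM_TOUCH messages; the Interop Sample Library wraps native calls such as this one and decodes the resulting messages into .NET events. A minimal sketch of the underlying P/Invoke:

    using System;
    using System.Runtime.InteropServices;

    static class TouchNative
    {
        // Opts a window in to WM_TOUCH messages (user32.dll, Windows 7+).
        [DllImport("user32.dll")]
        [return: MarshalAs(UnmanagedType.Bool)]
        public static extern bool RegisterTouchWindow(IntPtr hWnd, uint ulFlags);
    }

    // e.g. TouchNative.RegisterTouchWindow(new WindowInteropHelper(window).Handle, 0);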