Microsoft Kinect Sensor

Evaluating the Microsoft Kinect Sensor on the PC

The Sensor


The Sensor Tracking Field

A home-made connector for the sensor

The Kinect requires additional power and uses a modified USB plug to connect to a PC. Here’s how it would look if you were to create one yourself (do I feel a DIY urge coming on?). Alternatively, just buy the relevant power lead accessory.

(This one is not mine btw)


Color Coded Depth Image

As well as capturing a normal video image, the Kinect also provides a depth image, which records, for each pixel, the distance from the sensor to the corresponding point in the scene.


Another Depth Image

The depth information can be used to produce a variety of colour-coded images.
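Colour-coding a depth image amounts to mapping each distance onto a colour ramp. A minimal sketch, assuming the depth arrives as a NumPy array of millimetre values with 0 meaning "no reading" (roughly what the raw Kinect stream provides); the near/far limits are illustrative, not sensor constants:

```python
import numpy as np

def colour_code_depth(depth_mm, near=400, far=4000):
    """Map a depth image (millimetres) to an RGB image.

    Near points render red, far points render blue; pixels with no
    depth reading (value 0) stay black. The near/far range is an
    assumption -- tune it for the scene.
    """
    depth = depth_mm.astype(np.float32)
    valid = depth > 0
    # Normalise valid depths into [0, 1] across the chosen range.
    t = np.clip((depth - near) / (far - near), 0.0, 1.0)
    rgb = np.zeros(depth.shape + (3,), dtype=np.uint8)
    rgb[..., 0] = np.where(valid, (1.0 - t) * 255, 0)  # red fades with distance
    rgb[..., 2] = np.where(valid, t * 255, 0)          # blue grows with distance
    return rgb

# Tiny synthetic depth row: one near pixel, one far pixel, one invalid.
demo = np.array([[500, 3500, 0]], dtype=np.uint16)
print(colour_code_depth(demo))
```

Swapping the two channel expressions (or indexing into a proper colormap) gives the different colour schemes seen in these images.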


Surface model from Depth Image data

The depth data can also be used to create 3D surface models.


Point Cloud from Depth Image data

Another use of the data is to generate 3D point clouds.
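Turning a depth image into a point cloud is a back-projection through the standard pinhole camera model. A sketch, using assumed (uncalibrated) intrinsics for the Kinect depth camera — the real values vary per unit and should come from calibration:

```python
import numpy as np

# Approximate intrinsics for a 640x480 Kinect depth image (assumptions).
FX, FY = 580.0, 580.0   # focal lengths in pixels
CX, CY = 320.0, 240.0   # principal point

def depth_to_point_cloud(depth_mm):
    """Back-project a depth image (millimetres) into an Nx3 array of
    3D points in metres, via the pinhole model:
        X = (u - cx) * Z / fx,   Y = (v - cy) * Z / fy
    Pixels with no depth reading (0) are dropped.
    """
    h, w = depth_mm.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_mm.astype(np.float32) / 1000.0  # mm -> metres
    valid = z > 0
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    return np.stack([x[valid], y[valid], z[valid]], axis=1)

# A flat wall 2 m away yields one point per pixel, all at Z = 2.0.
wall = np.full((480, 640), 2000, dtype=np.uint16)
print(depth_to_point_cloud(wall).shape)  # (307200, 3)
```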


Point Clouds

The captured depth data can be used to create a 3D point cloud model of objects.


Point Cloud - Room Objects

The captured depth data can be used to create a 3D point cloud model of objects in a room.


Model from Point Cloud data

This sample used the point cloud data with LIDAR modelling technology to produce a 3D model.


Samples - Point Cloud to 3D Mesh

Experiments with the depth data also explored various techniques for generating 3D mesh models from the point cloud data.
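Because the depth image is organised on a pixel grid, one simple meshing technique is grid triangulation: each 2x2 block of neighbouring pixels becomes two triangles. A sketch of that idea — the `max_jump` threshold and vertex indexing are illustrative choices, not necessarily the technique used in these experiments:

```python
import numpy as np

def depth_to_mesh(depth_mm, max_jump=100):
    """Triangulate an organised depth image into a mesh.

    Vertices are the image pixels (index = v * width + u); each 2x2
    quad contributes two triangles, unless a corner has no depth
    reading or the quad spans a depth jump larger than max_jump mm
    (a crude guard against bridging silhouette edges).
    Returns an (M, 3) array of vertex indices.
    """
    d = depth_mm.astype(np.int32)
    h, w = d.shape
    tris = []
    for v in range(h - 1):
        for u in range(w - 1):
            quad = d[v:v + 2, u:u + 2]
            if (quad == 0).any() or quad.max() - quad.min() > max_jump:
                continue
            a = v * w + u   # top-left
            b = a + 1       # top-right
            c = a + w       # bottom-left
            e = c + 1       # bottom-right
            tris.append((a, c, b))
            tris.append((b, c, e))
    return np.array(tris, dtype=np.int64).reshape(-1, 3)

# A smooth 3x3 patch gives a 2x2 grid of quads = 8 triangles.
patch = np.array([[1000, 1005, 1010],
                  [1002, 1007, 1012],
                  [1004, 1009, 1014]], dtype=np.uint16)
print(len(depth_to_mesh(patch)))  # 8
```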


Finding distances from the camera

The depth of a location in an image can be found with a simple mouse click. The captured depth image can be superimposed on the normal image, as shown here, so that when a mouse click occurs the distance from the camera at that point is found immediately.
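Once the depth image is aligned with the normal image, the lookup itself is trivial. A sketch, assuming the depth stream is a millimetre-valued array already registered to the colour image (the function name and calling convention are illustrative):

```python
import numpy as np

def distance_at(depth_mm, x, y):
    """Return the distance in metres at a clicked pixel, or None when
    the sensor has no reading there. Assumes the depth image has been
    registered to the colour image, so the click coordinates index
    both images directly.
    """
    raw = int(depth_mm[y, x])   # row-major: [row, col] = [y, x]
    return raw / 1000.0 if raw > 0 else None

depth = np.zeros((480, 640), dtype=np.uint16)
depth[240, 320] = 1850  # pretend the object under the cursor is 1.85 m away
print(distance_at(depth, 320, 240))  # 1.85
```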


Hand Tracking

Hands can be tracked without the need for marker symbols, gloves or other intrusive items. A video showing this in action is included later.


Body Tracking 2

The Kinect allows full body tracking with the joint representation as shown in this image. This enables full body motion to be translated into, or interpreted as, appropriate commands for an application.
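Tracked joint positions can be translated into commands with simple rules. A hypothetical sketch — the joint names, Y-up coordinate convention, and the "hand above head" gesture are all assumptions for illustration, not the actual tracking API:

```python
# Hypothetical per-joint 3D positions; real SDKs (OpenNI/NITE, the
# Kinect SDK) expose a similar joint map, though names and axes differ.
skeleton = {
    "head":       (0.00, 1.60, 2.0),
    "right_hand": (0.35, 1.75, 1.9),   # raised above the head
    "left_hand":  (-0.30, 1.00, 2.0),
}

def command_from_pose(joints):
    """Interpret a tracked pose as an application command: either hand
    held above the head triggers 'next_slide' (Y is up, assumed)."""
    head_y = joints["head"][1]
    if joints["right_hand"][1] > head_y or joints["left_hand"][1] > head_y:
        return "next_slide"
    return None

print(command_from_pose(skeleton))  # next_slide
```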


Realtime point cloud/mesh update

The point cloud/mesh can be updated in real time, as shown in this video clip. The window at the top right shows the normal video image, and the window at the top left shows the depth image. The window at the bottom shows the point cloud, with each point coloured using colour information from the normal image. Registration is slightly off, but you can still get the idea.

Hand Tracking

This sample demonstrates tracking multiple hands and shows how the tracked positions can be plotted as the user moves their hands, making it possible to sketch in space with little or no perceptible lag.

  • Background has been removed
  • Active area is approximately 40-60 cm from the sensor
  • Tracking can be temperamental when picking up a second or subsequent hand

Multiple hands (i.e. more than two) can also be tracked simultaneously.
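The sketch-in-space behaviour above comes down to accumulating each hand's tracked positions into a per-hand trail and drawing the trails as strokes. A sketch of that bookkeeping — the class, hand IDs, and positions are illustrative; in practice they would arrive from the hand tracker each frame:

```python
from collections import defaultdict

class SketchTracker:
    """Accumulate tracked hand positions into per-hand trails so the
    motion can be rendered as strokes."""
    def __init__(self):
        self.trails = defaultdict(list)

    def update(self, hand_id, position):
        """Append this frame's position to the hand's trail."""
        self.trails[hand_id].append(position)

    def stroke(self, hand_id):
        """Return the full trail for one hand, oldest point first."""
        return self.trails[hand_id]

tracker = SketchTracker()
# Feed in three frames for one hand (positions in metres, made up),
# then a first frame for a second hand.
for pos in [(0.40, 0.50), (0.41, 0.52), (0.43, 0.55)]:
    tracker.update(hand_id=1, position=pos)
tracker.update(hand_id=2, position=(0.10, 0.20))
print(len(tracker.stroke(1)), len(tracker.stroke(2)))  # 3 1
```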


Microsoft Kinect Sensor (website)