In this paper we first describe how we have constructed a 3D deformable Point Distribution Model of the human hand, capturing training data semi-automatically from volume images via a physically-based model. We then show how we have attempted to use this model to track an unmarked hand moving with 6 degrees of freedom (plus deformation) in real time using a single video camera. In the course of this we show how to improve on a weighted least-squares pose parameter approximation at little computational cost. We note the successes and shortcomings of our system and discuss how it might be improved.

1 Motivations

There has long been a need for a vision-based hand tracking system capable of tracking movement with 6 degrees of freedom (DOF), along with articulation information, whilst being as unintrusive as possible. The use of hand markings or coloured gloves and the need for highly constrained environments are undesirable. Such a system should also be as widely available as possible ...
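The Point Distribution Model referred to above is not detailed in this excerpt; as background, a PDM represents a shape class as a mean shape plus a linear combination of deformation modes obtained by principal component analysis over aligned training shapes. The following is a minimal sketch of that idea, not the authors' implementation; all function names and the toy data are illustrative:

```python
import numpy as np

def build_pdm(shapes, num_modes=2):
    """Build a Point Distribution Model from aligned training shapes.

    shapes: (N, 3K) array -- N training examples, each of K 3D landmarks
            flattened into one vector (assumed already aligned).
    Returns the mean shape, the first `num_modes` deformation modes
    (principal components), and their variances.
    """
    mean = shapes.mean(axis=0)
    centred = shapes - mean
    # PCA via SVD of the centred data matrix; rows of vt are orthonormal modes
    _, s, vt = np.linalg.svd(centred, full_matrices=False)
    modes = vt[:num_modes]                       # (num_modes, 3K)
    variances = (s[:num_modes] ** 2) / len(shapes)
    return mean, modes, variances

def generate_shape(mean, modes, b):
    """Instantiate a shape from model parameters b: x = mean + P^T b."""
    return mean + modes.T @ np.asarray(b)

# Toy example: 10 noisy copies of a 5-landmark (15-dimensional) shape
rng = np.random.default_rng(0)
base = rng.normal(size=15)
train = base + 0.01 * rng.normal(size=(10, 15))
mean, modes, variances = build_pdm(train, num_modes=2)
x = generate_shape(mean, modes, [0.05, -0.02])
```

Constraining the parameter vector `b` (e.g. to a few standard deviations of each mode's variance) is what keeps generated shapes plausible, which is the property that makes a PDM useful as a deformable tracking model.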