Contributors: Connan, James; Moemedi, Kgatlhego Aretha
Department: Dept. of Computer Science, Faculty of Science
Record dates: 2014-01-17; 2024-10-30; 2011/06/08; 2011/06/08; 2014-01-17; 2024-10-30
Date issued: 2010
URI: https://hdl.handle.net/10566/16908
Degree: Magister Scientiae - MSc
Abstract: This thesis presents an approach for automatically generating signing animations from a sign language notation. An avatar endowed with expressive gestures, as subtle as changes in facial expression, is used to render the sign language animations. SWML, an XML format of SignWriting, is provided as input; it transcribes sign language gestures in a format compatible with virtual signing. Relevant features of sign language gestures are extracted from the SWML. These features are then converted to body animation parameters, which are used to animate the avatar. Using key-frame animation techniques, intermediate key-frames approximate the expected sign language gestures, and the avatar renders the corresponding sign language gestures. These gestures are realistic and aesthetically acceptable, and can be recognized and understood by Deaf people.
Language: en
Keywords: Avatar; Blender; SASL; Sign writing; Sign language; Sign language animations; SWML
Title: Rendering an avatar from sign writing notation for sign language animation
Type: Thesis
Publisher: University of the Western Cape
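The abstract describes a pipeline that converts extracted gesture features into body animation parameters and fills in intermediate key-frames between key poses. As a minimal illustrative sketch (not the thesis's actual code — the joint names, degree units, and simple linear blending are assumptions), intermediate poses can be generated by interpolating each joint parameter between two key poses:

```python
# Hypothetical sketch of key-frame interpolation: poses are dicts mapping
# joint names to rotation angles (degrees). Linear blending is assumed;
# a real system (e.g. Blender) may use splines or quaternion interpolation.

def interpolate_keyframes(pose_a, pose_b, n_frames):
    """Generate n_frames intermediate poses between pose_a and pose_b."""
    frames = []
    for i in range(1, n_frames + 1):
        t = i / (n_frames + 1)  # interpolation factor in (0, 1)
        frames.append({
            joint: (1 - t) * pose_a[joint] + t * pose_b[joint]
            for joint in pose_a
        })
    return frames

# Example: one intermediate frame halfway between two elbow angles.
mid = interpolate_keyframes({"elbow": 0.0}, {"elbow": 90.0}, 1)
```

In a Blender-based renderer such as the one the keywords suggest, each generated pose would be applied to the avatar's armature and keyed on the timeline, letting the engine play back the approximated gesture.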