A coma is a profound state of unconsciousness. An individual in a coma is alive but unable to move or respond to his or her environment. Patients in a persistent vegetative state lose their higher brain functions, while other key functions such as breathing and circulation remain relatively intact. However, spontaneous facial movements and expressions may still occur in response to external stimuli. Clinicians regard the patient's facial expression as a valid indicator of motion intensity. Hence, correct interpretation of the patient's facial agitation and its correlation with motion is a fundamental step in designing an automated motion assessment system. Computer vision techniques can be used to quantify agitation in sedated Intensive Care Unit (ICU) patients; in particular, they can provide objective agitation measurements derived from patient motion. For paraplegic and coma patients, whole-body movement is not available, so monitoring whole-body motion is not a viable solution. In this case, head motion and facial grimacing are therefore measured to quantify patient agitation, and the extracted features are classified using k-Nearest Neighbor (k-NN), Fuzzy k-NN (f-kNN), Linear Discriminant Analysis (LDA) and Neural Network (NN) classifiers. Using the proposed classifiers, experimental results have been obtained for different angles, distances and illumination levels. The classification accuracy is found to be higher than 90.00% for the proposed features and classification techniques. Finally, a Graphical User Interface (GUI) is developed with a layout editor to minimize the tedious work of the medical staff.
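To illustrate the simplest of the listed classifiers, the following is a minimal k-NN sketch, not the authors' implementation: it labels a query by majority vote among its k nearest training samples. The two-dimensional feature vectors (head-motion magnitude, grimace score) and the class labels ("calm", "agitated") are hypothetical stand-ins for the paper's actual features.

```python
from collections import Counter
from math import dist

def knn_classify(train, query, k=3):
    """Label a query point by majority vote among its k nearest
    training samples, using Euclidean distance."""
    nearest = sorted(train, key=lambda sample: dist(sample[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Hypothetical feature vectors: (head-motion magnitude, grimace score).
train = [
    ((0.1, 0.2), "calm"),
    ((0.2, 0.1), "calm"),
    ((0.8, 0.9), "agitated"),
    ((0.9, 0.7), "agitated"),
    ((0.7, 0.8), "agitated"),
]
print(knn_classify(train, (0.85, 0.8)))  # -> "agitated"
```

The fuzzy k-NN variant differs only in that it weights the neighbors' votes by distance-based membership degrees instead of counting them equally.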