We all know what emotions are and how they can influence a person's decisions. Emotions play a big role in many fields, and most of the time they show on a person's face. Yet it is hard to read someone's facial expressions while you are concentrating on what they are saying; not everyone is a multi-tasking master who can also watch body language (which, by the way, sometimes reveals a person's true feelings). In marketing, for example, a customer's attitude toward a product can often be seen clearly just by looking at their face. Let us all agree: life would be much easier if we were always aware of what a person is truly feeling.
Taking all of that into account, researchers Hakan Boz and Utku Kose conducted a study and wrote an article called "Emotion Extractions from Facial Expressions by Using Artificial Intelligence Techniques". In an era where technology becomes ever more dominant in every field, these researchers developed a system that can identify a person's emotion from their facial expression. Specifically, the system is built on a Cascade Feedforward Artificial Neural Network model trained with a recent optimization algorithm called the Vortex Optimization Algorithm.
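To picture what "cascade feedforward" means, here is a minimal sketch in Python (the article contains no code, so all names and sizes here are illustrative): in such a network, each layer receives the original input in addition to the outputs of every earlier layer.

```python
import math
import random

def cascade_forward(x, layers):
    """Forward pass of a toy cascade-feedforward net: each layer sees
    the original input concatenated with every previous layer's output."""
    feats = list(x)          # running feature vector: input + earlier outputs
    out = feats
    for W in layers:         # W is a list of weight rows, one per neuron
        out = [math.tanh(sum(w * f for w, f in zip(row, feats))) for row in W]
        feats = feats + out  # cascade: later layers also see this output
    return out

# Random weights for a 4-input net with layers of 5 and 7 neurons
# (7 outputs, one per emotion class, purely for illustration).
rng = random.Random(0)
sizes, feats_len, layers = [5, 7], 4, []
for n in sizes:
    layers.append([[rng.uniform(-1, 1) for _ in range(feats_len)]
                   for _ in range(n)])
    feats_len += n

scores = cascade_forward([0.1, 0.5, -0.3, 0.9], layers)
```

The skip connections from the input to every layer are what distinguish this topology from a plain feedforward network; the weights themselves would be found by the training algorithm.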
The main objective is to find the points that determine the different parts of a face. Once those parts are located, it becomes easier to form an idea of the facial expression. At the same time, different genders, different races, and even varying head and face sizes can affect how a facial expression is detected, so it has always been important for researchers building intelligent facial-expression systems to use appropriate data. Boz and Kose applied three sets of data: the first containing 490 photos spanning both genders and labelled with 7 emotions; the second containing 810 photos spanning genders and races, labelled with 7 emotions; and a third (the authors' own selection) of 100 randomly chosen photos, also labelled with 7 emotions.
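The three sets can be summarized in a small structure like the following (photo counts come from the article; the field names are ours, and the article does not list which 7 emotions were used):

```python
NUM_EMOTIONS = 7  # the article says 7 emotions but does not name them

datasets = [
    {"name": "set 1", "photos": 490, "varies_by": ["gender"]},
    {"name": "set 2", "photos": 810, "varies_by": ["gender", "race"]},
    {"name": "set 3 (authors' choice)", "photos": 100,
     "varies_by": ["random selection"]},
]

total_photos = sum(d["photos"] for d in datasets)  # 1400 photos in total
```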
To train the system, the researchers used the Microsoft Kinect 3D infrastructure to gather 3D face points. Microsoft Kinect is a popular combination of hardware and software for recognition tasks on photos and videos. The results were positive: the system was able to extract emotions from different photographs. For training they used the Vortex Optimization Algorithm, a Swarm Intelligence based algorithm developed by Kose and Arslan, inspired by the dynamics of vortices in nature. The algorithm uses role-based and evolutionary mechanisms to carry out the optimization process.
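To give a feel for how a vortex-inspired metaheuristic can stand in for gradient-based training, here is a loose sketch: candidates are sampled around the current best solution within a radius that shrinks each iteration, pulling the search inward like a vortex. This is a generic simplification for illustration only, not the published Vortex Optimization Algorithm, and the sphere function is just a toy stand-in for a network's training error.

```python
import random

def vortex_style_minimize(f, dim, iters=200, pool=20, seed=1):
    """Simplified vortex-style search: sample around the best-so-far
    inside a steadily shrinking radius (NOT the published VOA)."""
    rng = random.Random(seed)
    best = [rng.uniform(-5.0, 5.0) for _ in range(dim)]
    best_val = f(best)
    radius = 5.0
    for _ in range(iters):
        for _ in range(pool):                     # candidate solutions
            cand = [b + rng.gauss(0.0, radius) for b in best]
            val = f(cand)
            if val < best_val:                    # keep improvements only
                best, best_val = cand, val
        radius *= 0.97                            # the vortex tightens
    return best, best_val

# Toy stand-in for "network training error": a simple sphere function.
_, err = vortex_style_minimize(lambda p: sum(x * x for x in p), dim=3)
```

The appeal of such population-based methods for neural network training is that they need no gradients, only the ability to evaluate how well a given set of weights performs.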
In conclusion, the researchers succeeded in developing a system effective enough to extract different emotions from photos of individuals of different genders and even different races. This certainly looks like the future of the technology, and soon enough, reading emotions may no longer be a secret for humankind.