Análisis de bases de datos de expresiones faciales para la identificación automática de emociones centradas en el aprendizaje

dc.contributor.author: González Meneses, Yesenia Nohemí
dc.contributor.author: Guerrero García, Josefina
dc.contributor.orcid: González Meneses, Yesenia Nohemí [0000-0003-1034-0204]
dc.contributor.orcid: Guerrero García, Josefina [0000-0002-3393-610X]
dc.date.accessioned: 2024-09-12T20:28:16Z
dc.date.available: 2024-09-12T20:28:16Z
dc.date.issued: 2021-09-17
dc.description.abstract: Este trabajo presenta el análisis del estado del arte de bases de datos de expresiones faciales para la identificación automática de emociones centradas en el aprendizaje. La obtención de datos para los procesos de reconocimiento automático en un contexto específico es esencial para su éxito. Así, este tipo de proyectos inician haciendo una revisión de la información disponible para llevar a cabo las etapas de entrenamiento y clasificación de las emociones con las técnicas computacionales que se propongan. Se describen las actividades de búsqueda de las bases de datos de expresiones faciales que capturan emociones centradas en el aprendizaje. Estas actividades formaron parte de las etapas de la metodología del trabajo para reconocer las emociones de estudiantes mientras realizaban actividades de aprendizaje en línea. Esto permitió justificar la creación de la base de datos desde la formalización de un protocolo para su captura hasta su digitalización.
dc.description.abstractenglish: This work presents an analysis of the state of the art of facial expression databases for the automatic identification of learning-centered emotions. Obtaining data for automatic recognition processes in a specific context is essential for their success. Thus, projects of this type begin by reviewing the information available to carry out the training and classification stages of emotions with the proposed computational techniques. The search activities for databases of facial expressions that capture learning-centered emotions are described. These activities were part of the stages of the work methodology to recognize students' emotions while they carried out online learning activities. This justified the creation of the database, from the formalization of a capture protocol to its digitization.
dc.format.mimetype: application/pdf
dc.identifier.doi: https://doi.org/10.29375/25392115.4300
dc.identifier.instname: instname:Universidad Autónoma de Bucaramanga UNAB
dc.identifier.issn: ISSN: 1657-2831
dc.identifier.issne: e-ISSN: 2539-2115
dc.identifier.repourl: repourl:https://repository.unab.edu.co
dc.identifier.uri: http://hdl.handle.net/20.500.12749/26488
dc.language.iso: spa
dc.publisher: Universidad Autónoma de Bucaramanga UNAB
dc.relation: https://revistas.unab.edu.co/index.php/rcc/article/view/4300/3508
dc.relation.references: Aifanti, N., Papachristou, C., & Delopoulos, A. (2010). The MUG facial expression database. 11th International Workshop on Image Analysis for Multimedia Interactive Services (WIAMIS 10), 1–4.
dc.relation.references: Almohammadi, K., Hagras, H., Yao, B., Alzahrani, A., Alghazzawi, D., & Aldabbagh, G. (2017). A type-2 fuzzy logic recommendation system for adaptive teaching. Soft Computing, 21(4). https://doi.org/10.1007/s00500-015-1826-y
dc.relation.references: Aneja, D., Colburn, A., Faigin, G., Shapiro, L., & Mones, B. (2017). Modeling Stylized Character Expressions via Deep Learning. In Computer Vision – ACCV 2016. Lecture Notes in Computer Science (Vol. 10112). Springer, Cham. https://doi.org/10.1007/978-3-319-54184-6_9
dc.relation.references: Arana-Llanes, J. Y., González-Serna, G., Pineda-Tapia, R., Olivares-Peregrino, V., Ricarte-Trives, J. J., & Latorre-Postigo, J. M. (2018). EEG lecture on recommended activities for the induction of attention and concentration mental states on e-learning students. Journal of Intelligent & Fuzzy Systems, 34(5). https://doi.org/10.3233/JIFS-169517
dc.relation.references: Arroyo, I., Cooper, D. G., Burleson, W., Woolf, B. P., Muldner, K., & Christopherson, R. (2009). Emotion sensors go to school. Artificial Intelligence in Education, 17–24.
dc.relation.references: Barrón-Estrada, M. L., Zatarain-Cabada, R., Aispuro-Medina, B. G., Valencia-Rodríguez, E. M., & Lara-Barrera, A. C. (2016). Building a Corpus of Facial Expressions for Learning-Centered Emotions. Research in Computing Science, 129, 45–52.
dc.relation.references: Bixler, R., & D’Mello, S. (2013). Towards Automated Detection and Regulation of Affective States During Academic Writing. In Artificial Intelligence in Education. AIED 2013. Lecture Notes in Computer Science (Vol. 7926). Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-39112-5_142
dc.relation.references: Bosch, N., & D’Mello, S. (2017). The Affective Experience of Novice Computer Programmers. International Journal of Artificial Intelligence in Education, 27(1). https://doi.org/10.1007/s40593-015-0069-5
dc.relation.references: Bosch, N., D’Mello, S. K., Baker, R. S., Ocumpaugh, J., Shute, V., Ventura, M., Wang, L., & Zhao, W. (2016a). Detecting student emotions in computer-enabled classrooms. IJCAI International Joint Conference on Artificial Intelligence, 4125–4129.
dc.relation.references: Bosch, N., D’Mello, S. K., Ocumpaugh, J., Baker, R. S., & Shute, V. (2016b). Using Video to Automatically Detect Learner Affect in Computer-Enabled Classrooms. ACM Transactions on Interactive Intelligent Systems, 6(2). https://doi.org/10.1145/2946837
dc.relation.references: Botelho, A. F., Baker, R. S., & Heffernan, N. T. (2017). Improving Sensor-Free Affect Detection Using Deep Learning. In Artificial Intelligence in Education. AIED 2017. Lecture Notes in Computer Science (Vol. 10331). https://doi.org/10.1007/978-3-319-61425-0_4
dc.relation.references: Cabada, R., Barrón, M., & Olivares, J. M. (2014). Reconocimiento automático y aspectos éticos de emociones para aplicaciones educativas. Inteligencia Artificial: Una Reflexión Obligada.
dc.relation.references: Cornelius, R. R. (1996). The science of emotion: Research and tradition in the psychology of emotions. Prentice-Hall, Inc.
dc.relation.references: Cowie, R., Douglas-Cowie, E., Tsapatsoulis, N., Votsis, G., Kollias, S., Fellenz, W., & Taylor, J. G. (2001). Emotion recognition in human-computer interaction. IEEE Signal Processing Magazine, 18(1). https://doi.org/10.1109/79.911197
dc.relation.references: Ekman, P. (2004). Emotions Revealed: Recognizing Faces and Feelings to Improve Communication and Emotional Life. Henry Holt and Company.
dc.relation.references: Ekman, P., Friesen, W., & Hager, J. (2002). Facial action coding system: Research Nexus network research information.
dc.relation.references: el Kaliouby, R., & Picard, R. W. (2019). Affectiva Database. MIT Media Laboratory. https://www.affectiva.com/
dc.relation.references: Freitas-Magalhães, A. (2021). Facial Action Coding System 4.0: Manual of Scientific Codification of the Human Face. Leya.
dc.relation.references: Fuentes, C., Herskovic, V., Rodríguez, I., Gerea, C., Marques, M., & Rossel, P. O. (2017). A systematic literature review about technologies for self-reporting emotional information. Journal of Ambient Intelligence and Humanized Computing, 8(4). https://doi.org/10.1007/s12652-016-0430-z
dc.relation.references: González Meneses, Y. N., Guerrero García, J., Reyes García, C. A., Olmos Pineda, I., & González Calleros, J. M. (2019). Methodology for Automatic Identification of Emotions in Learning Environments. Research in Computing Science, 148(5), 89–96.
dc.relation.references: González-Hernández, F., Zatarain-Cabada, R., Barrón-Estrada, M. L., & Rodríguez-Rangel, H. (2018). Recognition of learning-centered emotions using a convolutional neural network. Journal of Intelligent & Fuzzy Systems, 34(5). https://doi.org/10.3233/JIFS-169514
dc.relation.references: Graesser, A. C., & D’Mello, S. (2012). Emotions During the Learning of Difficult Material. Psychology of Learning and Motivation, 57. https://doi.org/10.1016/B978-0-12-394293-7.00005-4
dc.relation.references: Gupta, A., D’Cunha, A., Awasthi, K., & Balasubramanian, V. (2016). DAiSEE: Towards User Engagement Recognition in the Wild. arXiv preprint.
dc.relation.references: Happy, S. L., Patnaik, P., Routray, A., & Guha, R. (2017). The Indian Spontaneous Expression Database for Emotion Recognition. IEEE Transactions on Affective Computing, 8(1). https://doi.org/10.1109/TAFFC.2015.2498174
dc.relation.references: Harley, J. M., Bouchet, F., & Azevedo, R. (2013). Aligning and Comparing Data on Emotions Experienced during Learning with MetaTutor. In Artificial Intelligence in Education. AIED 2013. Lecture Notes in Computer Science (Vol. 7926). Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-39112-5_7
dc.relation.references: Hill, D. (2010). Emotionomics: Leveraging emotions for business success.
dc.relation.references: Hjortsjö, C. H., Ekman, P., Friesen, W. V., Hager, J. C., Facs, F., & Facs, F. M. (2019). Sistema de Codificación Facial.
dc.relation.references: Koelstra, S., Muhl, C., Soleymani, M., Lee, J.-S., Yazdani, A., Ebrahimi, T., Pun, T., Nijholt, A., & Patras, I. (2012). DEAP: A Database for Emotion Analysis Using Physiological Signals. IEEE Transactions on Affective Computing, 3(1). https://doi.org/10.1109/T-AFFC.2011.15
dc.relation.references: Langner, O., Dotsch, R., Bijlstra, G., Wigboldus, D. H. J., Hawk, S. T., & van Knippenberg, A. (2010). Presentation and validation of the Radboud Faces Database. Cognition & Emotion, 24(8). https://doi.org/10.1080/02699930903485076
dc.relation.references: Livingstone, S. R., & Russo, F. A. (2018). The Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS): A dynamic, multimodal set of facial and vocal expressions in North American English. PLOS ONE, 13(5). https://doi.org/10.1371/journal.pone.0196391
dc.relation.references: Lucey, P., Cohn, J. F., Kanade, T., Saragih, J., & Ambadar, Z. (2010). The extended Cohn-Kanade dataset (CK+): A complete facial expression dataset for action unit and emotion-specified expression. 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), 94–101.
dc.relation.references: Lyons, M. J., Kamachi, M., & Gyoba, J. (2014). Japanese Female Facial Expressions (JAFFE), Database of digital images. http://www.kasrl.org/jaffe_info.html
dc.relation.references: Mavadati, S. M., Mahoor, M. H., Bartlett, K., Trinh, P., & Cohn, J. F. (2013). DISFA: A Spontaneous Facial Action Intensity Database. IEEE Transactions on Affective Computing, 4(2). https://doi.org/10.1109/T-AFFC.2013.4
dc.relation.references: Mehmood, R., & Lee, H. (2017). Towards Building a Computer Aided Education System for Special Students Using Wearable Sensor Technologies. Sensors, 17(317), 1–22. https://doi.org/10.3390/s17020317
dc.relation.references: Mena-Chalco, J. P., Cesar-Jr, R., & Velho, L. (2008). Banco de dados de faces 3D: IMPA-FACE3D. IMPA-RJ, Tech. Rep.
dc.relation.references: Mohamad Nezami, O., Dras, M., Hamey, L., Richards, D., Wan, S., & Paris, C. (2020). Automatic Recognition of Student Engagement Using Deep Learning and Facial Expression. In Machine Learning and Knowledge Discovery in Databases. ECML PKDD 2019. Lecture Notes in Computer Science. https://doi.org/10.1007/978-3-030-46133-1_17
dc.relation.references: Mollahosseini, A., Hasani, B., & Mahoor, M. H. (2019). AffectNet: A Database for Facial Expression, Valence, and Arousal Computing in the Wild. IEEE Transactions on Affective Computing, 10(1). https://doi.org/10.1109/TAFFC.2017.2740923
dc.relation.references: Monkaresi, H., Bosch, N., Calvo, R. A., & D’Mello, S. K. (2017). Automated Detection of Engagement Using Video-Based Estimation of Facial Expressions and Heart Rate. IEEE Transactions on Affective Computing, 8(1). https://doi.org/10.1109/TAFFC.2016.2515084
dc.relation.references: Nye, B., Karumbaiah, S., Tokel, S. T., Core, M. G., Stratou, G., Auerbach, D., & Georgila, K. (2017). Analyzing Learner Affect in a Scenario-Based Intelligent Tutoring System. In Artificial Intelligence in Education. AIED 2017. Lecture Notes in Computer Science (Vol. 10331). Springer, Cham. https://doi.org/10.1007/978-3-319-61425-0_60
dc.relation.references: Picard, R. W. (2000). Affective Computing. MIT Press.
dc.relation.references: Picard, R. W. (2003). Affective computing: challenges. International Journal of Human-Computer Studies, 59(1–2). https://doi.org/10.1016/S1071-5819(03)00052-1
dc.relation.references: Scherer, K. R. (2005). What are emotions? And how can they be measured? Social Science Information, 44(4). https://doi.org/10.1177/0539018405058216
dc.relation.references: Sneddon, I., McRorie, M., McKeown, G., & Hanratty, J. (2012). The Belfast Induced Natural Emotion Database. IEEE Transactions on Affective Computing, 3(1). https://doi.org/10.1109/T-AFFC.2011.26
dc.relation.references: Steidl, S. (2009). Automatic classification of emotion related user states in spontaneous children’s speech. Logos-Verlag.
dc.relation.references: Thomaz, C. E. (2012). FEI Face Database. https://fei.edu.br/~cet/facedatabase.html
dc.relation.references: Valstar, M., & Pantic, M. (2010). Induced disgust, happiness and surprise: an addition to the MMI facial expression database. Proc. 3rd Intern. Workshop on EMOTION (Satellite of LREC): Corpora for Research on Emotion and Affect.
dc.relation.references: Xiao, X., Pham, P., & Wang, J. (2017). Dynamics of Affective States During MOOC Learning. In Artificial Intelligence in Education. AIED 2017. Lecture Notes in Computer Science (Vol. 10331). Springer, Cham. https://doi.org/10.1007/978-3-319-61425-0_70
dc.relation.references: Zafeiriou, S., Kollias, D., Nicolaou, M. A., Papaioannou, A., Zhao, G., & Kotsia, I. (2017). Aff-Wild: Valence and arousal ‘in-the-wild’ challenge. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops, 34–41.
dc.relation.references: Zatarain-Cabada, R., Barrón-Estrada, M. L., González-Hernández, F., Oramas-Bustillos, R., Alor-Hernández, G., & Reyes-García, C. A. (2017b). Building a Corpus and a Local Binary Pattern Recognizer for Learning-Centered Emotions. In Advances in Soft Computing. MICAI 2016. Lecture Notes in Computer Science (Vol. 10062). https://doi.org/10.1007/978-3-319-62428-0_43
dc.relation.references: Zatarain-Cabada, R., Barron-Estrada, M. L., Gonzalez-Hernandez, F., & Rodriguez-Rangel, H. (2017a). Building a Face Expression Recognizer and a Face Expression Database for an Intelligent Tutoring System. 2017 IEEE 17th International Conference on Advanced Learning Technologies (ICALT). https://doi.org/10.1109/ICALT.2017.141
dc.relation.references: Zhao, G., Huang, X., Taini, M., Li, S. Z., & Pietikäinen, M. (2011). Facial expression recognition from near-infrared videos. Image and Vision Computing, 29(9). https://doi.org/10.1016/j.imavis.2011.07.002
dc.relation.uri: https://revistas.unab.edu.co/index.php/rcc/issue/view/276
dc.rights.accessrights: info:eu-repo/semantics/openAccess
dc.source: Vol. 22 No. 2 (2021): Revista Colombiana de Computación (July-December); 58-71
dc.subject: Bases de datos
dc.subject: Identificación automática de emociones
dc.subject: Expresiones faciales
dc.subject.keywords: Databases
dc.subject.keywords: Automatic identification of emotions
dc.subject.keywords: Facial expressions
dc.title: Análisis de bases de datos de expresiones faciales para la identificación automática de emociones centradas en el aprendizaje
dc.title.translated: Analysis of databases of facial expressions for the automatic identification of learning-centered emotions
dc.type.coar: http://purl.org/coar/resource_type/c_2df8fbb1
dc.type.coarversion: http://purl.org/coar/version/c_ab4af688f83e57aa
dc.type.driver: info:eu-repo/semantics/article
dc.type.hasversion: info:eu-repo/semantics/publishedVersion
dc.type.local: Artículo
dc.type.redcol: http://purl.org/redcol/resource_type/ART

Files

- Artículo.pdf (474.91 KB, Adobe Portable Document Format): Article
- license.txt (347 B): item-specific license agreed upon submission