
Biography

Ekta Sood has been a PhD student in the Perceptual User Interfaces group since June 2019. She holds a Bachelor's degree in Cognitive Science from the University of California, Santa Cruz, and a Master's degree in Computational Linguistics from the University of Stuttgart. Her research interests include multimodal modeling, attention and memory in deep learning, affective computing and emotion analysis, and behavioral neuroscience. Ekta was awarded a Google scholarship for the 2019/2020 academic year and did an internship at Meta (USA) in 2022.

Please note: Ekta is currently taking part in an internship at Meta and cannot accept new students until December 2022.


Teaching

2022
Machine Perception and Learning (Tutor), Master
2020
Machine Learning and Computer Vision for HCI (Fachpraktikum), Master
2019
Machine Learning and Computer Vision for HCI (Fachpraktikum), Master
 
Intelligent User Interfaces (Hauptseminar), Master

Supervision

2022
Visual Question Answering through Attention Modelling with Curiosity-driven Reinforcement Learning (M.Sc.) Stoyan Dimitrov*
 
Get Into Their Heads: Disentangling Relevancy and Attention in Multimodal Transformers (M.Sc.) Fabian Koegel
 
Investigating Multi-modal Human-like Attention Integration for Visual Question Answering (M.Sc.) Ann-Sophia Mueller
2021
Calibration of an Eyetracker While Reading (Bachelor Research Project) Hallach et al.
 
Gaze to Text and Gaze to Image Generation (M.Sc.) Keerthana Jaganathan
 
Gaze Dataset Collection Study for Visually Grounded Language Modeling Tasks (InfoTech Project) Fabian Koegel
 
Image Reconstruction from Human Brain Activity by Variational Autoencoder and Adversarial Learning (M.Sc.) Mariia Podguzova**
2020
Fast-Learning System in Multi-Modal Context (M.Sc.) Adnen Abdessaied
 
Incorporating Gaze and Speech Signals in Multi-modal Emotion Recognition (B.Sc.) Ahmed Abdou
2019
Analyzing Human Gaze Information for Mental Image Synthetization (M.Sc.) Florian Strohm
* Co-supervised with Yao Wang. ** Co-supervised with Florian Strohm.

Open Thesis Projects

 
Interpreting Neural Attention in NLP with Human Visual Attention, Master
 
Aligned Representation Learning with Human Attention, Master
 
Neuroscience-Inspired Computer Vision, Master
 
Consciousness in Encoded States, Master
 
Emotions with Gaze, Master
 
Interpreting Attention-based VQA Models with Multimodal Human Visual Attention, Master/Bachelor
 
Predicting Neurological Deficits with Gaze, Master/Bachelor
 
Improving Common-Sense Reasoning Tasks with a Cognitive Model of Human Visual Attention, Master

Publications

  1. Video Language Co-Attention with Multimodal Fast-Learning Feature Fusion for VideoQA
     Adnen Abdessaied*, Ekta Sood*, Andreas Bulling
     Proc. of the 7th Workshop on Representation Learning for NLP (RepL4NLP), pp. 1–12, 2022.

  2. Gaze-enhanced Crossmodal Embeddings for Emotion Recognition
     Ahmed Abdou, Ekta Sood, Philipp Müller, Andreas Bulling
     Proc. International Symposium on Eye Tracking Research and Applications (ETRA), pp. 1–18, 2022.

  3. Facial Composite Generation with Iterative Human Feedback
     Florian Strohm, Ekta Sood, Dominike Thomas, Mihai Bâce, Andreas Bulling
     Proc. NeurIPS Workshop Gaze Meets ML (GMML), pp. 1–19, 2022. Oral presentation.

  4. Neural Photofit: Gaze-based Mental Image Reconstruction
     Florian Strohm, Ekta Sood, Sven Mayer, Philipp Müller, Mihai Bâce, Andreas Bulling
     Proc. IEEE International Conference on Computer Vision (ICCV), pp. 245–254, 2021.

  5. VQA-MHUG: A gaze dataset to study multimodal neural attention in VQA
     Ekta Sood, Fabian Kögel, Florian Strohm, Prajit Dhar, Andreas Bulling
     Proc. ACL SIGNLL Conference on Computational Natural Language Learning (CoNLL), pp. 27–43, 2021. Oral presentation.

  6. Multimodal Integration of Human-Like Attention in Visual Question Answering
     Ekta Sood, Fabian Kögel, Philipp Müller, Dominike Thomas, Mihai Bâce, Andreas Bulling
     arXiv:2109.13139, pp. 1–11, 2021.

  7. Anticipating Averted Gaze in Dyadic Interactions
     Philipp Müller, Ekta Sood, Andreas Bulling
     Proc. ACM International Symposium on Eye Tracking Research and Applications (ETRA), pp. 1–10, 2020.

  8. Interpreting Attention Models with Human Visual Attention in Machine Reading Comprehension
     Ekta Sood, Simon Tannert, Diego Frassinelli, Andreas Bulling, Ngoc Thang Vu
     Proc. ACL SIGNLL Conference on Computational Natural Language Learning (CoNLL), pp. 12–25, 2020.

  9. Improving Natural Language Processing Tasks with Human Gaze-Guided Neural Attention
     Ekta Sood, Simon Tannert, Philipp Müller, Andreas Bulling
     Advances in Neural Information Processing Systems (NeurIPS), pp. 1–15, 2020.

  10. Predicting Gaze Patterns: Text Saliency for Integration into Machine Learning Tasks
      Ekta Sood
      Proc. International Workshop on Computational Cognition (ComCo), pp. 1–2, 2019. Best poster award.

  11. Comparing Attention-Based Convolutional and Recurrent Neural Networks: Success and Limitations in Machine Reading Comprehension
      Matthias Blohm, Glorianna Jagfeld, Ekta Sood, Xiang Yu, Ngoc Thang Vu
      Proc. ACL SIGNLL Conference on Computational Natural Language Learning (CoNLL), pp. 108–118, 2018.