
Modeling User Preferences via Brain-Computer Interfacing

Published 15 May 2024 in cs.HC and cs.AI (arXiv:2405.09691v2)

Abstract: Present Brain-Computer Interfacing (BCI) technology allows the inference and detection of cognitive and affective states, but relatively little work has studied scenarios in which such information enables new applications that rely on modeling human cognition. One state that can be quantified from various physiological signals is attention. Estimates of human attention can be used to reveal preferences and novel dimensions of user experience. Previous approaches have tackled these challenging tasks using a variety of behavioral signals, from dwell time to click-through data, together with computational models that relate visual content to these behavioral signals. However, behavioral signals are only rough estimates of users' real underlying attention and affective preferences. Indeed, users may attend to content simply because it is salient or outrageous, not because it genuinely interests them. With this paper, we put forward a research agenda, along with example work, that uses BCI to infer users' preferences, their attentional correlates with visual content, and their associations with affective experience. We then link these to relevant applications, such as information retrieval, personalized steering of generative models, and crowdsourcing population estimates of affective experiences.
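The abstract's core idea is that attention, estimated from physiological signals rather than behavioral proxies, can serve as a preference signal. As a minimal illustrative sketch (not the authors' method), the snippet below computes a crude attention proxy from a single EEG channel: relative alpha-band (8–12 Hz) power, whose suppression is commonly associated with visual engagement. All function names and parameters here are hypothetical choices for illustration; real BCI pipelines would use multi-channel recordings, artifact rejection, and trained classifiers.

```python
import numpy as np

def bandpower(signal, fs, band):
    """Relative power of a 1-D signal within `band` (lo, hi) Hz, via the FFT."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= band[0]) & (freqs < band[1])
    return psd[mask].sum() / psd.sum()

def attention_index(eeg, fs=256.0):
    """Crude attention proxy: 1 minus relative alpha (8-12 Hz) power.
    Lower alpha power is commonly read as higher visual engagement."""
    return 1.0 - bandpower(eeg, fs, (8.0, 12.0))

# Toy demo with synthetic signals (2 s at 256 Hz).
rng = np.random.default_rng(0)
t = np.arange(0, 2.0, 1.0 / 256.0)
# Strong 10 Hz alpha rhythm, as in a relaxed/disengaged state.
relaxed = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
# Broadband noise with no dominant alpha, standing in for engagement.
engaged = 0.1 * rng.standard_normal(t.size)
print(attention_index(relaxed) < attention_index(engaged))  # alpha suppression -> higher index
```

A per-item attention index like this could then be aggregated across viewers or fed to a preference model, which is the kind of downstream use (retrieval, generative-model steering, crowdsourced affect estimates) the abstract proposes.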

