| Contempt || R12A+R14A
|}
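
As an illustration of how such a mapping might be used downstream, here is a minimal sketch, assuming a detector has already produced the set of active action units for a frame: it looks up the emotions whose standard FACS combinations (the ones tabulated above) are fully contained in that set. Intensity- and side-coded units such as 5B or R12A are simplified away here, so contempt is omitted.

<syntaxhighlight lang="python">
# Emotion -> FACS action-unit combinations, mirroring the table above
# (intensity codes such as "5B" are simplified to plain AU numbers).
EMOTION_AUS = {
    "happiness": {6, 12},
    "sadness": {1, 4, 15},
    "surprise": {1, 2, 26},          # plus AU 5B (slight upper-lid raise)
    "fear": {1, 2, 4, 5, 20, 26},
    "anger": {4, 5, 7, 23},
    "disgust": {9, 15, 16},
}

def match_emotions(active_aus: set[int]) -> list[str]:
    """Return the emotions whose action-unit combination is fully active."""
    return [emotion for emotion, aus in EMOTION_AUS.items()
            if aus <= active_aus]

print(match_emotions({1, 4, 15, 17}))  # ['sadness']
</syntaxhighlight>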

====Challenges in facial detection====

As with every computational practice, affect detection by facial processing must overcome several obstacles in order to fully unlock the potential of the algorithms employed. In the early days of almost every kind of AI-based detection (speech recognition, face recognition, affect recognition), the accuracy of modeling and tracking was an issue. As hardware evolves, as more data are collected, and as new discoveries are made and new practices introduced, this lack of accuracy fades, leaving behind noise issues. However, methods for noise removal exist, including neighborhood averaging, [[Gaussian blur|linear Gaussian smoothing]], median filtering,<ref>{{cite web|url=http://homepages.inf.ed.ac.uk/rbf/CVonline/LOCAL_COPIES/OWENS/LECT5/node3.html|title=Spatial domain methods}}</ref> and newer methods such as the Bacterial Foraging Optimization Algorithm.<ref>Clever Algorithms. [http://www.cleveralgorithms.com/nature-inspired/swarm/bfoa.html "Bacterial Foraging Optimization Algorithm – Swarm Algorithms – Clever Algorithms"] {{Webarchive|url=https://web.archive.org/web/20190612144816/http://www.cleveralgorithms.com/nature-inspired/swarm/bfoa.html |date=2019-06-12 }}. Clever Algorithms. Retrieved 21 March 2011.</ref><ref>[http://www.softcomputing.net/bfoa-chapter.pdf "Soft Computing"]. Soft Computing. Retrieved 18 March 2011.</ref>
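
To make the classical noise-removal step concrete, here is a minimal sketch, assuming a noisy grayscale face image stored as a NumPy array, that applies the three filters named above with SciPy; the image, window sizes, and sigma are arbitrary illustrative choices.

<syntaxhighlight lang="python">
import numpy as np
from scipy import ndimage

# Hypothetical noisy grayscale face image with values in [0, 1].
rng = np.random.default_rng(0)
image = np.clip(rng.normal(0.5, 0.2, size=(64, 64)), 0.0, 1.0)

# Neighborhood averaging: each pixel becomes the mean of a 3x3 window.
averaged = ndimage.uniform_filter(image, size=3)

# Linear Gaussian smoothing: convolution with a Gaussian kernel.
smoothed = ndimage.gaussian_filter(image, sigma=1.0)

# Median filtering: robust to impulse noise and better at preserving edges.
denoised = ndimage.median_filter(image, size=3)
</syntaxhighlight>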

Gestures can be used efficiently as a means of detecting a particular emotional state of the user, especially when used in conjunction with speech and face recognition. Depending on the specific action, gestures can be simple reflexive responses, like lifting your shoulders when you don't know the answer to a question, or they can be complex and meaningful, as when communicating with sign language. Without making use of any object or the surrounding environment, we can wave our hands, clap, or beckon. On the other hand, when using objects, we can point at them, move them, touch them, or handle them. A computer should be able to recognize these gestures, analyze the context, and respond in a meaningful way, in order to be used efficiently for human–computer interaction.

There are many proposed methods<ref name="JK">J. K. Aggarwal, Q. Cai, Human Motion Analysis: A Review, Computer Vision and Image Understanding, Vol. 73, No. 3, 1999</ref> to detect body gestures. Some literature differentiates between two approaches to gesture recognition: a 3D-model-based one and an appearance-based one.<ref name="Vladimir">{{cite journal | first1 = Vladimir I. | last1 = Pavlovic | first2 = Rajeev | last2 = Sharma | first3 = Thomas S. | last3 = Huang | url = http://www.cs.rutgers.edu/~vladimir/pub/pavlovic97pami.pdf | title = Visual Interpretation of Hand Gestures for Human–Computer Interaction: A Review | journal = [[IEEE Transactions on Pattern Analysis and Machine Intelligence]] | volume = 19 | issue = 7 | pages = 677–695 | year = 1997 | doi = 10.1109/34.598226 }}</ref> The former makes use of 3D information about key elements of the body parts in order to obtain several important parameters, like palm position or joint angles. Appearance-based systems, on the other hand, use images or videos for direct interpretation. Hand gestures have been a common focus of body gesture detection methods.<ref name="Vladimir" />
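
As a toy illustration of the appearance-based route (not the specific pipeline of the cited reviews), the sketch below, assuming a binary hand silhouette has already been segmented from the frame, computes the seven Hu moment invariants with OpenCV as a simple appearance feature vector; the synthetic blob is a placeholder for a real hand mask.

<syntaxhighlight lang="python">
import cv2
import numpy as np

# Hypothetical binary hand silhouette (255 = hand pixels), e.g. from
# skin-color segmentation; a filled circle stands in for a real mask.
silhouette = np.zeros((128, 128), dtype=np.uint8)
cv2.circle(silhouette, (64, 64), 30, 255, -1)

# Appearance-based features: Hu moment invariants of the silhouette,
# invariant to translation, scale, and rotation.
moments = cv2.moments(silhouette)
hu = cv2.HuMoments(moments).flatten()

# Log-scale the moments, a common trick since they span many orders of magnitude.
features = -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)
print(features)  # 7-dimensional appearance feature vector
</syntaxhighlight>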

This could be used to detect a user's affective state by monitoring and analyzing their physiological signs. These signs range from changes in heart rate and skin conductance to minute contractions of the facial muscles and changes in facial blood flow. This area is gaining momentum, and we are now seeing real products that implement the techniques. The four main physiological signs that are usually analyzed are blood volume pulse, galvanic skin response, facial electromyography, and facial color patterns.

====Blood volume pulse====

=====Overview=====
 
A subject's blood volume pulse (BVP) can be measured by a process called photoplethysmography, which produces a graph indicating blood flow through the extremities.<ref name="Picard, Rosalind 1998">Picard, Rosalind (1998). Affective Computing. MIT.</ref> The peaks of the waves indicate a cardiac cycle where the heart has pumped blood to the extremities. If the subject experiences fear or is startled, their heart usually 'jumps' and beats quickly for some time, causing the amplitude of the cardiac cycle to increase. This can clearly be seen on a photoplethysmograph as an increase in the distance between the trough and the peak of the wave. As the subject calms down, and as the body's inner core expands, allowing more blood to flow back to the extremities, the cycle will return to normal.
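
As a rough sketch of how such a waveform might be analyzed, the following example, assuming a BVP trace already sampled at 50 Hz, uses SciPy's peak detection to estimate heart rate from the inter-beat intervals; the synthetic signal and all thresholds are illustrative only.

<syntaxhighlight lang="python">
import numpy as np
from scipy.signal import find_peaks

fs = 50.0  # assumed sampling rate in Hz
t = np.arange(0, 30, 1 / fs)
# Synthetic stand-in for a BVP waveform: ~72 beats per minute plus sensor noise.
bvp = np.sin(2 * np.pi * 1.2 * t) + 0.05 * np.random.default_rng(1).normal(size=t.size)

# Each peak marks a cardiac cycle in which blood was pumped to the extremities.
peaks, _ = find_peaks(bvp, distance=int(0.4 * fs))  # ~0.4 s refractory period

ibi = np.diff(peaks) / fs                  # inter-beat intervals in seconds
heart_rate = 60.0 / ibi.mean()             # beats per minute
amplitude = bvp[peaks].mean() - bvp.min()  # crude trough-to-peak amplitude

print(f"{heart_rate:.1f} bpm, mean cycle amplitude {amplitude:.2f}")
</syntaxhighlight>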

=====Methodology=====

Infra-red light is shone on the skin by special sensor hardware, and the amount of light reflected is measured. The amount of reflected and transmitted light correlates with the BVP, as light is absorbed by hemoglobin, which is found richly in the bloodstream.
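
A minimal sketch of turning raw reflectance readings into a usable BVP trace, assuming samples arrive at 50 Hz from such a sensor: a Butterworth bandpass filter isolates the cardiac frequency band. The file name, cutoffs, and filter order are illustrative assumptions.

<syntaxhighlight lang="python">
import numpy as np
from scipy.signal import butter, filtfilt

fs = 50.0  # assumed sensor sampling rate in Hz
raw = np.loadtxt("reflected_light.txt")  # hypothetical raw reflectance samples

# The pulsatile part of the reflectance tracks blood volume, since hemoglobin
# absorbs the infra-red light. Keep only the cardiac band (~42-210 bpm).
low, high = 0.7, 3.5  # Hz
b, a = butter(3, [low / (fs / 2), high / (fs / 2)], btype="bandpass")
bvp = filtfilt(b, a, raw - raw.mean())
</syntaxhighlight>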

=====Disadvantages=====

It can be cumbersome to ensure that the sensor shining an infra-red light and monitoring the reflected light is always pointing at the same extremity, especially since subjects often stretch and readjust their position while using a computer.

{{Main|Galvanic skin response}}

Galvanic skin response (GSR) is an outdated term for a more general phenomenon known as [[electrodermal activity]] or EDA. EDA is a general phenomenon whereby the skin's electrical properties change. The skin is innervated by the [[sympathetic nervous system]], so measuring its resistance or conductance provides a way to quantify small changes in the sympathetic branch of the autonomic nervous system. As the sweat glands are activated, even before the skin feels sweaty, the level of the EDA can be captured (usually using conductance) and used to discern small changes in autonomic arousal. The more aroused a subject is, the greater the skin conductance tends to be.<ref name="Picard, Rosalind 1998" />

Skin conductance is often measured using small silver chloride electrodes placed somewhere on the skin, with a small voltage applied between them. To maximize comfort and reduce irritation, the electrodes can be placed on the wrist, legs, or feet, which leaves the hands fully free for daily activity.
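
To illustrate how a recorded conductance trace might be processed, here is a simple sketch, assuming a trace in microsiemens sampled at 4 Hz, that separates the slow tonic level from the faster phasic responses with a moving average; the file name and window length are assumptions, and dedicated EDA toolkits use more principled decompositions.

<syntaxhighlight lang="python">
import numpy as np
from scipy.ndimage import uniform_filter1d

fs = 4.0  # assumed sampling rate in Hz
eda = np.loadtxt("skin_conductance.txt")  # hypothetical trace in microsiemens

# Tonic component: the slowly drifting baseline level (10 s moving average).
tonic = uniform_filter1d(eda, size=int(10 * fs))

# Phasic component: rapid skin conductance responses riding on the baseline,
# the part most closely tied to momentary autonomic arousal.
phasic = eda - tonic

print(f"mean tonic level {tonic.mean():.2f} uS, phasic std {phasic.std():.3f} uS")
</syntaxhighlight>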

====Facial color====

=====Overview=====

The surface of the human face is innervated with a large network of blood vessels. Blood flow variations in these vessels yield visible color changes on the face. Whether or not facial emotions activate facial muscles, variations in blood flow, blood pressure, glucose levels, and other changes occur. Also, the facial color signal is independent from that provided by facial muscle movements.<ref name="face">Carlos F. Benitez-Quiroz, Ramprakash Srinivasan, Aleix M. Martinez, [https://www.pnas.org/content/115/14/3581 Facial color is an efficient mechanism to visually transmit emotion], PNAS. April 3, 2018 115 (14) 3581–3586; first published March 19, 2018 https://doi.org/10.1073/pnas.1716084115.</ref>

=====Methodology=====

Approaches are based on facial color changes. Delaunay triangulation is used to create triangular local areas. Some of these triangles, which define the interior of the mouth and eyes (sclera and iris), are removed, and the pixels of the remaining triangular areas are used to create feature vectors.<ref name="face" /> It has been shown that converting the pixel colors from the standard RGB color space to a color space such as oRGB<ref name="orgb">M. Bratkova, S. Boulos, and P. Shirley, [https://ieeexplore.ieee.org/document/4736456 oRGB: a practical opponent color space for computer graphics], IEEE Computer Graphics and Applications, 29(1):42–55, 2009.</ref> or to LMS channels performs better when dealing with faces.<ref name="mec">Hadas Shahar, [[Hagit Hel-Or]], [http://openaccess.thecvf.com/content_ICCVW_2019/papers/CVPM/Shahar_Micro_Expression_Classification_using_Facial_Color_and_Deep_Learning_Methods_ICCVW_2019_paper.pdf Micro Expression Classification using Facial Color and Deep Learning Methods], The IEEE International Conference on Computer Vision (ICCV), 2019, pp. 0–0.</ref> The feature vectors are therefore mapped onto the better color space and decomposed into red–green and yellow–blue channels, after which deep learning methods are used to find the corresponding emotions.
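
A minimal sketch of the feature-extraction step, assuming facial landmark coordinates are already available: SciPy's Delaunay triangulation partitions the face into triangles, and the mean color of each triangle region becomes a feature. The landmarks and image are placeholders, and the red–green / yellow–blue channels below are a simplified stand-in for the full oRGB transform.

<syntaxhighlight lang="python">
import numpy as np
from scipy.spatial import Delaunay

image = np.zeros((256, 256, 3), dtype=np.float64)   # placeholder face image
landmarks = np.array([[60, 80], [196, 80], [128, 200],
                      [128, 120], [90, 160], [166, 160]])  # placeholder landmarks

tri = Delaunay(landmarks)  # triangulate the facial landmarks

features = []
for simplex in tri.simplices:
    # Mean color over each triangle's bounding box is used here as a cheap
    # stand-in for exact per-triangle pixel membership.
    ys, xs = landmarks[simplex][:, 1], landmarks[simplex][:, 0]
    patch = image[ys.min():ys.max() + 1, xs.min():xs.max() + 1]
    features.append(patch.reshape(-1, 3).mean(axis=0))

features = np.asarray(features)
r, g, b = features[:, 0], features[:, 1], features[:, 2]
# Simplified opponent channels (red-green and yellow-blue), in the spirit of oRGB.
red_green = r - g
yellow_blue = (r + g) / 2.0 - b
feature_vector = np.concatenate([red_green, yellow_blue])
</syntaxhighlight>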

==Potential applications==

===Education===
 
Affect influences learners' learning state. Using affective computing technology, computers can judge the learners' affect and learning state by recognizing their facial expressions. In education, the teacher can use the analysis results to understand the students' learning and receptive abilities, and then formulate reasonable teaching plans. At the same time, teachers can pay attention to students' inner feelings, which is helpful to students' psychological health. Especially in distance education, due to the separation in time and space, there is no emotional incentive between teachers and students for two-way communication. Without the atmosphere brought by traditional classroom learning, students are easily bored, which affects the learning outcome. Applying affective computing in distance education systems can effectively improve this situation.<ref>http://www.learntechlib.org/p/173785/</ref>

Affective computing is also being applied to the development of communication technologies for use by people with autism.[46] The affective component of text is also increasingly gaining attention, particularly its role in the so-called emotional or emotive Internet.[47]

===Video games===

Affective video games can access their players' emotional states through [[biofeedback]] devices.<ref>{{cite conference |title=Affective Videogames and Modes of Affective Gaming: Assist Me, Challenge Me, Emote Me |first1=Kiel Mark |last1=Gilleade |first2=Alan |last2=Dix |first3=Jen |last3=Allanson |year=2005 |conference=Proc. [[Digital Games Research Association|DiGRA]] Conf. |url=http://comp.eprints.lancs.ac.uk/1057/1/Gilleade_Affective_Gaming_DIGRA_2005.pdf |access-date=2016-12-10 |archive-url=https://web.archive.org/web/20150406200454/http://comp.eprints.lancs.ac.uk/1057/1/Gilleade_Affective_Gaming_DIGRA_2005.pdf |archive-date=2015-04-06 |url-status=dead }}</ref> A particularly simple form of biofeedback is available through [[gamepad]]s that measure the pressure with which a button is pressed: this has been shown to correlate strongly with the players' level of [[arousal]];<ref>{{Cite conference| doi = 10.1145/765891.765957| title = Affective gaming: Measuring emotion through the gamepad| conference = CHI '03 Extended Abstracts on Human Factors in Computing Systems| year = 2003| last1 = Sykes | first1 = Jonathan| last2 = Brown | first2 = Simon| isbn = 1581136374| citeseerx = 10.1.1.92.2123}}</ref> at the other end of the scale are [[brain–computer interface]]s.<ref>{{Cite journal | doi = 10.1016/j.entcom.2009.09.007| title = Turning shortcomings into challenges: Brain–computer interfaces for games| journal = Entertainment Computing| volume = 1| issue = 2| pages = 85–94| year = 2009| last1 = Nijholt | first1 = Anton| last2 = Plass-Oude Bos | first2 = Danny| last3 = Reuderink | first3 = Boris| bibcode = 2009itie.conf..153N| url = http://wwwhome.cs.utwente.nl/~anijholt/artikelen/intetain_bci_2009.pdf}}</ref><ref>{{Cite conference| doi = 10.1007/978-3-642-02315-6_23| title = Affective Pacman: A Frustrating Game for Brain–Computer Interface Experiments| conference = Intelligent Technologies for Interactive Entertainment (INTETAIN)| pages = 221–227| year = 2009| last1 = Reuderink | first1 = Boris| last2 = Nijholt | first2 = Anton| last3 = Poel | first3 = Mannes| isbn = 978-3-642-02314-9}}</ref> Affective games have been used in medical research to support the emotional development of [[autism|autistic]] children.<ref>{{Cite journal

===Other applications===

Other potential applications are centered around social monitoring.  For example, a car can monitor the emotion of all occupants and engage in additional safety measures, such as alerting other vehicles if it detects the driver to be angry.<ref>{{cite web|url=https://gizmodo.com/in-car-facial-recognition-detects-angry-drivers-to-prev-1543709793|title=In-Car Facial Recognition Detects Angry Drivers To Prevent Road Rage|date=30 August 2018|website=Gizmodo}}</ref>  Affective computing has potential applications in [[human computer interaction|human–computer interaction]], such as affective mirrors allowing the user to see how he or she performs; emotion monitoring agents sending a warning before one sends an angry email; or even music players selecting tracks based on mood.<ref>{{cite journal|last1=Janssen|first1=Joris H.|last2=van den Broek|first2=Egon L.|date=July 2012|title=Tune in to Your Emotions: A Robust Personalized Affective Music Player|journal=User Modeling and User-Adapted Interaction|volume=22|issue=3|pages=255–279|doi=10.1007/s11257-011-9107-7|doi-access=free}}</ref>

One can also use affective state recognition to judge the impact of a TV advertisement through real-time video recording of the viewer and subsequent study of his or her facial expression. Averaging the results obtained over a large group of subjects, one can tell whether that commercial (or movie) has the desired effect and which elements interest the watcher most.

==Cognitivist vs. interactional approaches==

Within the field of [[human–computer interaction]], Rosalind Picard's [[cognitivism (psychology)|cognitivist]] or "information model" concept of emotion has been criticized by and contrasted with the "post-cognitivist" or "interactional" [[pragmatism|pragmatist]] approach taken by Kirsten Boehner and others which views emotion as inherently social.<ref>{{cite journal|last1=Battarbee|first1=Katja|last2=Koskinen|first2=Ilpo|title=Co-experience: user experience as interaction|journal=CoDesign|date=2005|volume=1|issue=1|pages=5–18|url=http://www2.uiah.fi/~ikoskine/recentpapers/mobile_multimedia/coexperience_reprint_lr_5-18.pdf|doi=10.1080/15710880412331289917|citeseerx=10.1.1.294.9178|s2cid=15296236}}</ref>

Picard's focus is human–computer interaction, and her goal for affective computing is to "give computers the ability to recognize, express, and in some cases, 'have' emotions".<ref name="Affective Computing" /> In contrast, the interactional approach seeks to help "people to understand and experience their own emotions"<ref name="How emotion is made and measured">{{cite journal|last1=Boehner|first1=Kirsten|last2=DePaula|first2=Rogerio|last3=Dourish|first3=Paul|last4=Sengers|first4=Phoebe|title=How emotion is made and measured|journal=International Journal of Human–Computer Studies|date=2007|volume=65|issue=4|pages=275–291|doi=10.1016/j.ijhcs.2006.11.016}}</ref> and to improve computer-mediated interpersonal communication.  It does not necessarily seek to map emotion into an objective mathematical model for machine interpretation, but rather lets humans make sense of each other's emotional expressions in open-ended ways that might be ambiguous, subjective, and sensitive to context.<ref name="How emotion is made and measured" />{{rp|284}}{{example needed|date=September 2018}}

Picard's critics describe her concept of emotion as "objective, internal, private, and mechanistic". They say it reduces emotion to a discrete psychological signal occurring inside the body that can be measured and which is an input to cognition, undercutting the complexity of emotional experience.<ref name="How emotion is made and measured" />{{rp|280}}<ref name="How emotion is made and measured" />{{rp|278}}

The interactional approach asserts that though emotion has biophysical aspects, it is "culturally grounded, dynamically experienced, and to some degree constructed in action and interaction".<ref name="How emotion is made and measured" />{{rp|276}} Put another way, it considers "emotion as a social and cultural product experienced through our interactions".<ref>{{cite journal|last1=Boehner|first1=Kirsten|last2=DePaula|first2=Rogerio|last3=Dourish|first3=Paul|last4=Sengers|first4=Phoebe|title=Affection: From Information to Interaction|journal=Proceedings of the Aarhus Decennial Conference on Critical Computing|date=2005|pages=59–68}}</ref><ref name="How emotion is made and measured" /><ref>{{cite journal|last1=Hook|first1=Kristina|last2=Staahl|first2=Anna|last3=Sundstrom|first3=Petra|last4=Laaksolahti|first4=Jarmo|title=Interactional empowerment|journal=Proc. CHI|date=2008|pages=647–656|url=http://research.microsoft.com/en-us/um/cambridge/projects/hci2020/pdf/interactional%20empowerment%20final%20Jan%2008.pdf}}</ref>
 
==See also==
第696行: 第675行:  

* [[Wearable computer]]}}

==Citations==
{{Reflist|2}}

==General sources==

* {{cite journal | last = Hudlicka | first =  Eva | title = To feel or not to feel: The role of affect in human–computer interaction | journal = International Journal of Human–Computer Studies |  volume = 59 | issue = 1–2 | year = 2003 | pages = 1–32 | citeseerx = 10.1.1.180.6429 | doi=10.1016/s1071-5819(03)00047-8}}
* {{cite book | last1 = Scherer |first1=Klaus R |last2=Bänziger |first2= Tanja  |last3=Roesch |first3=Etienne B | title = A Blueprint for Affective Computing: A Sourcebook and Manual | location = Oxford | publisher = Oxford University Press | year = 2010 }}

==External links==

* [http://affect.media.mit.edu/ Affective Computing Research Group at the MIT Media Laboratory]
* [http://emotions.usc.edu/ Computational Emotion Group at USC]
* [http://emoshape.com/ Emotion Processing Unit – EPU]
* [http://sites.google.com/site/memphisemotivecomputing/ Emotive Computing Group at the University of Memphis]
* [https://web.archive.org/web/20180411230402/http://www.acii2011.org/ 2011 International Conference on Affective Computing and Intelligent Interaction]
* [https://web.archive.org/web/20091024081211/http://www.eecs.tufts.edu/~agirou01/workshop/ Brain, Body and Bytes: Psychophysiological User Interaction] ''CHI 2010 Workshop'' (10–15 April 2010)
* [https://web.archive.org/web/20110201001124/http://www.computer.org/portal/web/tac IEEE Transactions on Affective Computing] ''(TAC)''
* [http://opensmile.sourceforge.net/ openSMILE: popular state-of-the-art open-source toolkit for large-scale feature extraction for affect recognition and computational paralinguistics]

{{DEFAULTSORT:Affective Computing}}
[[Category:Affective computing| ]]

<noinclude>
<small>This page was moved from [[wikipedia:en:Affective computing]]. Its edit history can be viewed at [[情感计算/edithistory]]</small></noinclude>