Over the past four decades, computational sociology has been introduced and has gained popularity.  It has been used primarily for modeling or building explanations of social processes, and it depends on the emergence of complex behavior from simple activities.  The idea behind emergence is that the properties of a larger system need not be properties of the components from which the system is built.  The idea of emergence was introduced by the classical emergentists Alexander, Morgan, and Broad in the early twentieth century.  The aim of this approach was to find an adequate accommodation between two different and extreme ontologies, reductionist materialism and dualism.
 
In the post-war era, Vannevar Bush's differential analyser, John von Neumann's cellular automata, Norbert Wiener's cybernetics, and Claude Shannon's information theory became influential paradigms for modeling and understanding complexity in technical systems. In response, scientists in disciplines such as physics, biology, electronics, and economics began to articulate a general theory of systems in which all natural and physical phenomena are manifestations of interrelated elements in a system that has common patterns and properties. Following Émile Durkheim's call to analyze complex modern society sui generis, post-war structural functionalist sociologists such as Talcott Parsons seized upon these theories of systematic and hierarchical interaction among constituent components to attempt to generate grand unified sociological theories, such as the AGIL paradigm. Sociologists such as George Homans argued that sociological theories should be formalized into hierarchical structures of propositions and precise terminology from which other propositions and hypotheses could be derived and operationalized into empirical studies. Because computer algorithms and programs had been used as early as 1956 to test and validate mathematical theorems, such as the four color theorem, some scholars anticipated that similar computational approaches could "solve" and "prove" analogously formalized problems and theorems of social structures and dynamics.
 
===Macrosimulation and microsimulation===
 
The 1970s and 1980s were also a time when physicists and mathematicians were attempting to model and analyze how simple component units, such as atoms, give rise to global properties, such as complex material properties at low temperatures, in magnetic materials, and within turbulent flows. Using cellular automata, scientists were able to specify systems consisting of a grid of cells in which each cell only occupied some finite states and changes between states were solely governed by the states of immediate neighbors. Along with advances in artificial intelligence and microcomputer power, these methods contributed to the development of "chaos theory" and "complexity theory" which, in turn, renewed interest in understanding complex physical and social systems across disciplinary boundaries. Research organizations explicitly dedicated to the interdisciplinary study of complexity were also founded in this era: the Santa Fe Institute was established in 1984 by scientists based at Los Alamos National Laboratory and the BACH group at the University of Michigan likewise started in the mid-1980s.
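The grid-of-cells scheme described above can be sketched in a few lines. The following is a minimal one-dimensional automaton; Wolfram's rule 110 is assumed here purely as an illustrative update rule, since the text names no particular rule:

```python
# Minimal one-dimensional cellular automaton: each cell holds a finite
# state (0 or 1), and its next state is governed solely by itself and
# its two immediate neighbors. Rule 110 is an illustrative choice.

def step(cells, rule=110):
    """Apply one synchronous update; the grid wraps around."""
    n = len(cells)
    out = []
    for i in range(n):
        left, center, right = cells[i - 1], cells[i], cells[(i + 1) % n]
        neighborhood = (left << 2) | (center << 1) | right  # value 0..7
        out.append((rule >> neighborhood) & 1)  # look up the rule's bit
    return out

grid = [0] * 10 + [1] + [0] * 10   # start from a single live cell
for _ in range(5):
    grid = step(grid)
```

Each cell's next state is read off from the rule number's bit pattern using only the three-cell neighborhood, so any global structure that appears arises purely from local updates.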
 
This cellular automata paradigm gave rise to a third wave of social simulation emphasizing agent-based modeling. Like micro-simulations, these models emphasized bottom-up designs but adopted four key assumptions that diverged from microsimulation: autonomy, interdependency, simple rules, and adaptive behavior. In 1981, mathematician and political scientist Robert Axelrod and evolutionary biologist W.D. Hamilton published a major paper in Science, "The Evolution of Cooperation", which used an agent-based modeling approach to demonstrate how social cooperation based upon reciprocity can be established and stabilized in a prisoner's dilemma game when agents follow simple rules of self-interest. Axelrod and Hamilton demonstrated that individual agents following a simple rule set of (1) cooperate on the first turn and (2) thereafter replicate the partner's previous action were able to develop "norms" of cooperation and sanctioning in the absence of canonical sociological constructs such as demographics, values, religion, and culture as preconditions or mediators of cooperation. Throughout the 1990s, scholars like William Sims Bainbridge, Kathleen Carley, Michael Macy, and John Skvoretz developed multi-agent-based models of generalized reciprocity, prejudice, social influence, and organizational information processing. In 1999, Nigel Gilbert published the first textbook on social simulation, Simulation for the Social Scientist, and established the field's most relevant journal, the Journal of Artificial Societies and Social Simulation.
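The two-rule strategy just described, cooperate on the first turn and then mirror the partner's last move, is tit-for-tat. A minimal sketch of two such agents in an iterated prisoner's dilemma follows; the payoff values (5, 3, 1, 0) are the conventional ones, assumed here for illustration:

```python
# Iterated prisoner's dilemma between two tit-for-tat agents:
# (1) cooperate on the first turn, (2) thereafter replicate the
# partner's previous action. Payoffs (T=5, R=3, P=1, S=0) are the
# conventional values, assumed for illustration.

PAYOFF = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
          ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

def tit_for_tat(partner_history):
    """Cooperate first; afterwards copy the partner's last move."""
    return 'C' if not partner_history else partner_history[-1]

def play(rounds=10):
    score_a = score_b = 0
    moves_a, moves_b = [], []   # each agent reacts to the other's moves
    for _ in range(rounds):
        a = tit_for_tat(moves_b)
        b = tit_for_tat(moves_a)
        pa, pb = PAYOFF[(a, b)]
        score_a += pa
        score_b += pb
        moves_a.append(a)
        moves_b.append(b)
    return score_a, score_b
```

Two tit-for-tat agents lock into mutual cooperation from the first turn, which is the stabilized reciprocity the paragraph describes.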
 
===Data mining and social network analysis===
 
{{main|Data mining|Social network analysis}}
 
Independent from developments in computational models of social systems, social network analysis emerged in the 1970s and 1980s from advances in graph theory, statistics, and studies of social structure as a distinct analytical method and was articulated and employed by sociologists like [[James Samuel Coleman|James S. Coleman]], [[Harrison White]], [[Linton Freeman]], [[J. Clyde Mitchell]], [[Mark Granovetter]], [[Ronald Burt]], and [[Barry Wellman]].<ref>{{cite book|title=The Development of Social Network Analysis: A Study in the Sociology of Science |first=Linton C. |last=Freeman |publisher=Empirical Press |location=Vancouver, BC |year=2004}}</ref> The increasing pervasiveness of computing and telecommunication technologies throughout the 1980s and 1990s demanded analytical techniques, such as [[network theory|network analysis]] and [[multilevel modeling]], that could scale to increasingly complex and large data sets. The most recent wave of computational sociology, rather than employing simulations, uses network analysis and advanced statistical techniques to analyze large-scale computer databases of electronic proxies for behavioral data. 
Electronic records such as email and instant message records, hyperlinks on the [[World Wide Web]], mobile phone usage, and discussion on [[Usenet]] allow social scientists to directly observe and analyze social behavior at multiple points in time and multiple levels of analysis without the constraints of traditional empirical methods such as interviews, participant observation, or survey instruments.<ref>{{cite journal|title=Life in the network: the coming age of computational social science|first9=J|last10=Gutmann|first10=M.|last11=Jebara|first11=T.|last12=King|first12=G.|last13=Macy|first13=M.|last14=Roy|first14=D.|last15=Van Alstyne|first15=M.|last9=Fowler|first8=N|last8=Contractor|first7=N|last7=Christakis|first6=D|last6=Brewer|first5=AL|last5=Barabasi|first4=S |journal=Science|last4=Aral |date=February 6, 2009|first3=L |volume=323|pmid=19197046 |issue=5915|last3=Adamic |pages=721–723|pmc=2745217 |doi=10.1126/science.1167742 |first1=David |last1=Lazer |first2=Alex |last2=Pentland |display-authors=8}}</ref> Continued improvements in [[machine learning]] algorithms likewise have permitted social scientists and entrepreneurs to use novel techniques to identify latent and meaningful patterns of social interaction and evolution in large electronic datasets.<ref>{{cite journal|first1=Jaideep |last1=Srivastava |first2=Robert |last2=Cooley |first3=Mukund |last3=Deshpande |first4=Pang-Ning |last4=Tan |journal=Proceedings of the ACM Conference on Knowledge Discovery and Data Mining |title=Web usage mining: discovery and applications of usage patterns from Web data|volume=1 |year=2000 |pages=12–23 |doi=10.1145/846183.846188|issue=2}}</ref><ref>{{cite journal|doi=10.1016/S0169-7552(98)00110-X|title=The anatomy of a large-scale hypertextual Web search engine |first1=Sergey |last1=Brin |first2=Lawrence |last2=Page |journal=Computer Networks and ISDN Systems |volume=30 |issue=1–7 |pages=107–117 |date=April 1998|citeseerx=10.1.1.115.5930 }}</ref>
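As a toy illustration of mining such behavioral proxies, the sketch below derives an undirected degree measure (number of distinct contacts per actor) from a list of email sender–recipient records; the records and names are hypothetical, and real studies use far larger datasets and richer network measures:

```python
# Sketch: treating email records as edges of a social network.
# Each record is a (sender, recipient) pair; the data are hypothetical.

emails = [("ana", "bo"), ("ana", "chen"), ("bo", "chen"),
          ("chen", "ana"), ("dee", "ana")]

# Collect each person's distinct partners, ignoring direction.
partners = {}
for sender, recipient in emails:
    partners.setdefault(sender, set()).add(recipient)
    partners.setdefault(recipient, set()).add(sender)

# Undirected degree: how many distinct contacts each actor has.
degree = {person: len(contacts) for person, contacts in partners.items()}
```

Even this crude measure identifies "ana" as the most connected actor in the toy dataset; centrality measures in actual social network analysis generalize exactly this kind of count.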
 
Content analysis has long been a traditional part of the social sciences and media studies. The automation of content analysis has allowed a "big data" revolution to take place in that field, with studies of social media and newspaper content that include millions of news items. Gender bias, readability, content similarity, reader preferences, and even mood have been analyzed with text mining methods over millions of documents. The analysis of readability, gender bias, and topic bias was demonstrated by Flaounas et al., who showed how different topics have different gender biases and levels of readability; the possibility of detecting mood shifts in a vast population by analysing Twitter content was demonstrated as well.
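A crude sketch of what automating content analysis looks like: a simple readability proxy (mean words per sentence) computed over a tiny hypothetical corpus. The studies cited above use far richer text-mining pipelines; this only illustrates the mechanical step of scoring documents at scale:

```python
# Automated content-analysis sketch: score each document in a corpus
# with a readability proxy (mean words per sentence). Texts are
# hypothetical stand-ins for news items.
import re

def mean_sentence_length(text):
    """Average word count per sentence; a crude readability proxy."""
    sentences = [s for s in re.split(r'[.!?]+', text) if s.strip()]
    return sum(len(s.split()) for s in sentences) / len(sentences)

docs = [
    "Short news. Very short.",
    "This considerably longer sentence drags on and on before it finally ends.",
]
scores = [mean_sentence_length(d) for d in docs]
```

Applied to millions of items instead of two, the same loop yields the kind of corpus-level readability comparisons the paragraph describes.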
 
     