[[File:Complexity-map-with-sociolo.png|thumb|right|300px|Historical map of research paradigms and associated scientists in sociology and complexity science.]]
===Background===
Over the past four decades, computational sociology has been introduced and has grown in popularity. It has been used primarily for modeling or building explanations of social processes, and depends on the emergence of complex behavior from simple activities.<ref name="EACICS">Salgado, Mauricio, and Nigel Gilbert. "[http://epubs.surrey.ac.uk/749319/1/Emergence%20and%20Communication%20-%20V3%201.pdf Emergence and communication in computational sociology]." Journal for the Theory of Social Behaviour 43.1 (2013): 87-110.</ref> The idea behind emergence is that the properties of a larger system need not be properties of the components of which the system is made.<ref>Macy, Michael W., and Robert Willer. "[http://sct.uab.cat/lsds/sites/sct.uab.cat.lsds/files/FROM%20FACTORS%20TO%20ACTORS%20(Macy%20and%20Willer%202002).pdf From factors to actors: computational sociology and agent-based modeling]." Annual review of sociology 28.1 (2002): 143-166.</ref> The idea of emergence was introduced by Alexander, Morgan, and Broad, the classical emergentists, in the early twentieth century. The aim of their approach was to find a good enough accommodation between two different and extreme ontologies: reductionist materialism and dualism.<ref name="EACICS"/>
While emergence has played a valuable and important role in the foundation of computational sociology, not everyone agrees with it. Epstein, a major figure in the field, doubted its usefulness because some aspects of it remain unexplainable. Epstein raised a claim against emergentism, arguing that it "is precisely the generative sufficiency of the parts that constitutes the whole's explanation".<ref name="EACICS"/>
Agent-based models have had a historical influence on computational sociology. These models first appeared in the 1960s and were used to simulate control and feedback processes in organizations, cities, and other systems. During the 1970s, agent-based modeling introduced individuals as the main units of analysis and employed bottom-up strategies for modeling behavior. The last wave occurred in the 1980s. At that time, the models were still bottom-up; the only difference was that the agents interacted interdependently.<ref name="EACICS"/>
 
===Systems theory and structural functionalism===
 
{{main|Systems theory|Structural functionalism}}
In the post-war era, [[Vannevar Bush]]'s [[differential analyser]], [[John von Neumann]]'s [[Von Neumann cellular automata|cellular automata]], [[Norbert Wiener]]'s [[cybernetics]], and [[Claude Shannon]]'s [[information theory]] became influential paradigms for modeling and understanding complexity in technical systems. In response, scientists in disciplines such as physics, biology, electronics, and economics began to articulate a [[systems theory|general theory of systems]] in which all natural and physical phenomena are manifestations of interrelated elements in a system that has common patterns and properties. Following [[Émile Durkheim]]'s call to analyze complex modern society ''[[sui generis]]'',<ref>{{cite book|first=Émile |last=Durkheim |title=The Division of Labor in Society |location=New York, NY |publisher=Macmillan}}</ref> post-war structural functionalist sociologists such as [[Talcott Parsons]] seized upon these theories of systematic and hierarchical interaction among constituent components to attempt to generate grand unified sociological theories, such as the [[AGIL paradigm]].<ref name="Bailey">{{cite book|first1=Kenneth D. |last1=Bailey |editor=Jonathan H. Turner |chapter=Systems Theory |title=Handbook of Sociological Theory |publisher=Springer Science |year=2006 |location=New York, NY |isbn=978-0-387-32458-6 |pages=379–404}}</ref> Sociologists such as [[George Homans]] argued that sociological theories should be formalized into hierarchical structures of propositions and precise terminology from which other propositions and hypotheses could be derived and operationalized into empirical studies.<ref name="Blackwell">{{cite encyclopedia|year=2007 |title=Computational Sociology |last=Bainbridge |first=William Sims |encyclopedia=Blackwell Encyclopedia of Sociology |publisher=Blackwell Reference Online |url=http://www.sociologyencyclopedia.com/subscriber/tocnode?id=g9781405124331_chunk_g97814051243319_ss1-85 |doi=10.1111/b.9781405124331.2007.x |hdl=10138/224218 |editor=Ritzer, George|isbn=978-1-4051-2433-1|hdl-access=free }}</ref> Because computer algorithms and programs had been used as early as 1956 to test and validate mathematical theorems, such as the [[four color theorem]],<ref>{{cite book|last=Crevier |first=D. |year=1993 |title=AI: The Tumultuous History of the Search for Artificial Intelligence |url=https://archive.org/details/aitumultuoushist00crev |url-access=registration |publisher=Basic Books |location=New York, NY}}</ref> some scholars anticipated that similar computational approaches could "solve" and "prove" analogously formalized problems and theorems of social structures and dynamics.
      
===Macrosimulation and microsimulation===
 
{{main|System dynamics|Microsimulation}}
By the late 1960s and early 1970s, social scientists used increasingly available computing technology to perform macro-simulations of control and feedback processes in organizations, industries, cities, and global populations. These models used differential equations to predict population distributions as holistic functions of other systematic factors such as inventory control, urban traffic, migration, and disease transmission.<ref>{{cite book|first=Jay |last=Forrester |year=1971 |title=World Dynamics |location=Cambridge, MA |publisher=MIT Press}}</ref><ref>{{cite journal|doi=10.1287/opre.26.2.237|title=Using Simulation to Develop and Validate Analytic Models: Some Case Studies |first1=Edward J. |last1=Ignall |first2=Peter |last2=Kolesar |first3=Warren E. |last3=Walker |journal=Operations Research |volume=26 |issue=2 |year=1978 |pages=237–253}}</ref> Although simulations of social systems received substantial attention in the mid-1970s after the [[Club of Rome]] published reports predicting that policies promoting exponential economic growth would eventually bring global environmental catastrophe,<ref>{{cite book|title=The Dynamics of Growth in a Finite World |last1=Meadows |first1=DL |last2=Behrens |first2=WW |last3=Meadows |first3=DH |last4=Naill |first4=RF |last5= Randers |first5=J |last6=Zahn |first6=EK |year=1974 |location=Cambridge, MA |publisher=MIT Press}}</ref> the inconvenient conclusions led many authors to seek to discredit the models, attempting to make the researchers themselves appear unscientific.<ref name="SfSS1"/><ref>{{cite news|title=Computer View of Disaster Is Rebutted |newspaper=The New York Times |date=October 18, 1974|url=https://www.nytimes.com/1974/10/18/archives/computer-view-of-disaster-is-rebutted.html?_r=0}}</ref> Hoping to avoid the same fate, many social scientists turned their attention toward micro-simulation models to make forecasts and study policy effects by modeling aggregate changes in state of individual-level entities rather than the changes in distribution at the population level.<ref>{{cite journal|doi=10.1016/0167-2681(90)90038-F|title=From engineering to microsimulation : An autobiographical reflection |journal=Journal of Economic Behavior & Organization |year=1990 |volume=14 |issue=1 |pages=5–27 |first=Guy H. |last=Orcutt}}</ref> However, these micro-simulation models did not permit individuals to interact or adapt and were not intended for basic theoretical research.<ref name="MW"/>
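The difference between the two approaches can be illustrated with a minimal sketch. The code below is written for this article rather than drawn from any of the cited models, and the population size, growth rate, and transition probability are invented values. The macro-simulation tracks a single aggregate quantity with a difference equation, while the micro-simulation advances the state of each individual separately:

<syntaxhighlight lang="python">
import random

# Macro-simulation sketch: only an aggregate count is tracked, updated
# with a difference equation (an Euler step of a logistic-style ODE).
# All parameter values here are illustrative assumptions.
def macro_sim(pop=1000, infected=10.0, beta=0.3, steps=50):
    series = [infected]
    for _ in range(steps):
        # dI/dt = beta * I * (1 - I/pop): growth slows near saturation
        infected += beta * infected * (1 - infected / pop)
        series.append(infected)
    return series

# Micro-simulation sketch: each individual carries its own state and
# changes it with a fixed probability. The individuals neither interact
# nor adapt, which is exactly the limitation noted above.
def micro_sim(pop=1000, infected=10, p=0.05, steps=50):
    states = [True] * infected + [False] * (pop - infected)
    series = [sum(states)]
    for _ in range(steps):
        states = [s or (random.random() < p) for s in states]
        series.append(sum(states))
    return series

print(macro_sim()[-1], micro_sim()[-1])  # both approach the population size
</syntaxhighlight>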
  −
 
  −
By the late 1960s and early 1970s, social scientists used increasingly available computing technology to perform macro-simulations of control and feedback processes in organizations, industries, cities, and global populations. These models used differential equations to predict population distributions as holistic functions of other systematic factors such as inventory control, urban traffic, migration, and disease transmission. Although simulations of social systems received substantial attention in the mid-1970s after the Club of Rome published reports predicting that policies promoting exponential economic growth would eventually bring global environmental catastrophe, the inconvenient conclusions led many authors to seek to discredit the models, attempting to make the researchers themselves appear unscientific. Hoping to avoid the same fate, many social scientists turned their attention toward micro-simulation models to make forecasts and study policy effects by modeling aggregate changes in state of individual-level entities rather than the changes in distribution at the population level. However, these micro-simulation models did not permit individuals to interact or adapt and were not intended for basic theoretical research.
  −
 
  −
截至20世纪60年代末70年代初,社会科学家越来越多地使用已有的计算技术,对组织、工业、城市和全球人口进行包含控制和反馈过程的'''宏观模拟 Macrosimulation''' 。这些模型使用微分方程作为其他系统因素的整体函数来预测人口分布,这些系统因素包括财产控制、城市交通、人口迁移和疾病传播等。20世纪70年代中期,尽管对社会系统的仿真获得了巨大的关注,但在'''罗马俱乐部 Club of Rome''' 发布预测报告称促进指数式经济增长的政策最终将导致全球环境灾难,这个悲观的结论导致许多研究者尝试抹黑这些(仿真)模型,并试图让(这些模型的)研究者自身显得不那么科学。为了避免同样的情况,许多社会科学家将注意力转向'''微观模拟 Microsimulation'''模型。这些模型通过模拟个体状态的总体变化而不是总体人口级别的分布变化来进行预测和研究政策的效果。然而,这些微观模拟模型并不允许个体相互作用或适应、变化,研究者也不打算将它们用于基础理论研究。
      
===Cellular automata and agent-based modeling===
 
{{main|Cellular automata|agent-based modeling}}
The 1970s and 1980s were also a time when physicists and mathematicians were attempting to model and analyze how simple component units, such as atoms, give rise to global properties, such as complex material properties at low temperatures, in magnetic materials, and within turbulent flows.<ref>{{cite book|title=Cellular automata machines: a new environment for modeling |url=https://archive.org/details/cellularautomata00toff |url-access=registration |first1=Tommaso |last1=Toffoli |first2=Norman |last2=Margolus | author2-link = Norman Margolus |year=1987 |publisher=MIT Press |location=Cambridge, MA}}</ref> Using cellular automata, scientists were able to specify systems consisting of a grid of cells in which each cell only occupied some finite states and changes between states were solely governed by the states of immediate neighbors. Along with advances in [[artificial intelligence]] and [[microcomputer]] power, these methods contributed to the development of "[[chaos theory]]" and "[[complex systems|complexity theory]]" which, in turn, renewed interest in understanding complex physical and social systems across disciplinary boundaries.<ref name="SfSS1"/> Research organizations explicitly dedicated to the interdisciplinary study of complexity were also founded in this era: the [[Santa Fe Institute]] was established in 1984 by scientists based at [[Los Alamos National Laboratory]] and the BACH group at the [[University of Michigan]] likewise started in the mid-1980s.
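A minimal example of such a system (an illustrative sketch, not a model from the cited literature) is a one-dimensional, two-state automaton in which each cell adopts the majority state of its immediate neighborhood:

<syntaxhighlight lang="python">
import random

def step(cells):
    """One synchronous update of a one-dimensional, two-state automaton.

    Each cell adopts the majority state among itself and its two
    immediate neighbors (wrapping at the edges), so the next state
    depends solely on the local neighborhood, as described above.
    """
    n = len(cells)
    return [
        1 if cells[(i - 1) % n] + cells[i] + cells[(i + 1) % n] >= 2 else 0
        for i in range(n)
    ]

# A random initial row quickly coarsens into locally uniform blocks.
row = [random.randint(0, 1) for _ in range(32)]
for _ in range(5):
    print("".join("#" if c else "." for c in row))
    row = step(row)
</syntaxhighlight>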
This cellular automata paradigm gave rise to a third wave of social simulation emphasizing agent-based modeling. Like micro-simulations, these models emphasized bottom-up designs but adopted four key assumptions that diverged from microsimulation: autonomy, interdependency, simple rules, and adaptive behavior.<ref name="MW"/> Agent-based models are less concerned with predictive accuracy and instead emphasize theoretical development.<ref>{{cite journal |title=A simulation of the structure of academic science |journal=Sociological Research Online |volume=2 |issue=2 |pages=1–15 |year=1997 |first=Nigel |last=Gilbert |url=http://www.socresonline.org.uk/socresonline/2/2/3.html |doi=10.5153/sro.85 |access-date=2009-12-16 |archive-url=https://web.archive.org/web/19980524062306/http://www.socresonline.org.uk/socresonline/2/2/3.html |archive-date=1998-05-24 |url-status=dead }}</ref> In 1981, mathematician and political scientist [[Robert Axelrod]] and evolutionary biologist [[W.D. Hamilton]] published a major paper in ''[[Science (journal)|Science]]'' titled "The Evolution of Cooperation" which used an agent-based modeling approach to demonstrate how social cooperation based upon reciprocity can be established and stabilized in a [[prisoner's dilemma]] game when agents followed simple rules of self-interest.<ref>{{cite journal|title=The Evolution of Cooperation |first1=Robert |last1=Axelrod |first2=William D. |last2=Hamilton |journal=Science |volume=211 |issue=4489 |pages=1390–1396 |doi=10.1126/science.7466396|pmid=7466396 |date=March 27, 1981|bibcode=1981Sci...211.1390A }}</ref> Axelrod and Hamilton demonstrated that individual agents following a simple rule set of (1) cooperate on the first turn and (2) thereafter replicate the partner's previous action were able to develop "norms" of cooperation and sanctioning in the absence of canonical sociological constructs such as demographics, values, religion, and culture as preconditions or mediators of cooperation.<ref name="Cooperation"/> Throughout the 1990s, scholars like [[William Sims Bainbridge]], [[Kathleen Carley]], [[Michael Macy]],  and [[John Skvoretz]] developed multi-agent-based models of [[generalized reciprocity]], [[prejudice]], [[social influence]], and organizational [[information processing]]. In 1999, [[Nigel Gilbert]] published the first textbook on Social Simulation: ''Simulation for the social scientist'' and established its most relevant journal: the [[Journal of Artificial Societies and Social Simulation]].
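The rule set described above is commonly known as [[tit for tat]]. The sketch below is an illustrative reimplementation rather than Axelrod's original tournament code; the payoff values are the conventional prisoner's dilemma numbers, assumed here for concreteness:

<syntaxhighlight lang="python">
# Standard prisoner's dilemma payoffs for (row, column) moves; the
# ordering T > R > P > S is what matters, and the specific numbers are
# the usual textbook convention, not values taken from Axelrod's paper.
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(partner_history):
    """Cooperate on the first turn, then copy the partner's last move."""
    return "C" if not partner_history else partner_history[-1]

def always_defect(partner_history):
    return "D"

def play(strategy_a, strategy_b, rounds=10):
    """Iterate the game, showing each strategy only the partner's moves."""
    score_a = score_b = 0
    moves_a, moves_b = [], []
    for _ in range(rounds):
        a, b = strategy_a(moves_b), strategy_b(moves_a)
        pa, pb = PAYOFF[(a, b)]
        score_a, score_b = score_a + pa, score_b + pb
        moves_a.append(a)
        moves_b.append(b)
    return score_a, score_b

print(play(tit_for_tat, always_defect))  # (9, 14): exploited once, then retaliates
print(play(tit_for_tat, tit_for_tat))    # (30, 30): cooperation is sustained
</syntaxhighlight>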
      
===Data mining and social network analysis===
 
{{main|Data mining|Social network analysis}}
Independent from developments in computational models of social systems, social network analysis emerged in the 1970s and 1980s from advances in graph theory, statistics, and studies of social structure as a distinct analytical method and was articulated and employed by sociologists like [[James Samuel Coleman|James S. Coleman]], [[Harrison White]], [[Linton Freeman]], [[J. Clyde Mitchell]], [[Mark Granovetter]], [[Ronald Burt]], and [[Barry Wellman]].<ref>{{cite book|title=The Development of Social Network Analysis: A Study in the Sociology of Science |first=Linton C. |last=Freeman |publisher=Empirical Press |location=Vancouver, BC |year=2004}}</ref> The increasing pervasiveness of computing and telecommunication technologies throughout the 1980s and 1990s demanded analytical techniques, such as [[network theory|network analysis]] and [[multilevel modeling]], that could scale to increasingly complex and large data sets. The most recent wave of computational sociology, rather than employing simulations, uses network analysis and advanced statistical techniques to analyze large-scale computer databases of electronic proxies for behavioral data. Electronic records such as email and instant message records, hyperlinks on the [[World Wide Web]], mobile phone usage, and discussion on [[Usenet]] allow social scientists to directly observe and analyze social behavior at multiple points in time and multiple levels of analysis without the constraints of traditional empirical methods such as interviews, participant observation, or survey instruments.<ref>{{cite journal|title=Life in the network: the coming age of computational social science|first9=J|last10=Gutmann|first10=M.|last11=Jebara|first11=T.|last12=King|first12=G.|last13=Macy|first13=M.|last14=Roy|first14=D.|last15=Van Alstyne|first15=M.|last9=Fowler|first8=N|last8=Contractor|first7=N|last7=Christakis|first6=D|last6=Brewer|first5=AL|last5=Barabasi|first4=S |journal=Science|last4=Aral |date=February 6, 2009|first3=L |volume=323|pmid=19197046 |issue=5915|last3=Adamic |pages=721–723|pmc=2745217 |doi=10.1126/science.1167742 |first1=David |last1=Lazer |first2=Alex |last2=Pentland |display-authors=8}}</ref> Continued improvements in [[machine learning]] algorithms likewise have permitted social scientists and entrepreneurs to use novel techniques to identify latent and meaningful patterns of social interaction and evolution in large electronic datasets.<ref>{{cite journal|first1=Jaideep |last1=Srivastava |first2=Robert |last2=Cooley |first3=Mukund |last3=Deshpande |first4=Pang-Ning |last4=Tan |journal=Proceedings of the ACM Conference on Knowledge Discovery and Data Mining |title=Web usage mining: discovery and applications of usage patterns from Web data|volume=1 |year=2000 |pages=12–23 |doi=10.1145/846183.846188|issue=2}}</ref><ref>{{cite journal|doi=10.1016/S0169-7552(98)00110-X|title=The anatomy of a large-scale hypertextual Web search engine |first1=Sergey |last1=Brin |first2=Lawrence |last2=Page |journal=Computer Networks and ISDN Systems |volume=30 |issue=1–7 |pages=107–117 |date=April 1998|citeseerx=10.1.1.115.5930 }}</ref>
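As a small illustration of this kind of analysis (a sketch over invented interaction records, not data from the cited studies), the following builds an undirected network from (sender, recipient) pairs and computes degree centrality, one of the simplest social network analysis measures:

<syntaxhighlight lang="python">
from collections import defaultdict

# Invented interaction records, e.g. (sender, recipient) pairs that
# might be extracted from email logs; the names are hypothetical.
edges = [("ana", "ben"), ("ana", "carl"), ("ben", "carl"),
         ("carl", "dee"), ("dee", "eva"), ("carl", "eva")]

# Build an undirected adjacency structure.
neighbors = defaultdict(set)
for sender, recipient in edges:
    neighbors[sender].add(recipient)
    neighbors[recipient].add(sender)

# Degree centrality: each node's tie count normalized by the maximum
# possible number of ties, n - 1.
n = len(neighbors)
centrality = {node: len(ties) / (n - 1) for node, ties in neighbors.items()}

for node, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{node}: {score:.2f}")  # "carl" emerges as the most central node
</syntaxhighlight>

In research practice, such measures are computed with dedicated network-analysis software that scales to the large electronic datasets described above.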