Dictionary of Arguments


Philosophical and Scientific Issues in Dispute
 


 

Find counter arguments by entering NameVs… or …VsName.



The author or concept searched is found in the following 19 entries.
Disputed term/author/ism Author
Entry
Reference
Algorithms Pentland Brockman I 204
Algorithms/Pentland: If we have the data that go into and out of each decision, we can easily ask, Is this a fair algorithm? Is this AI doing things that we as humans believe are ethical? This human-in-the-loop approach is called “open algorithms”; you get to see what the AIs take as input and what they decide using that input. If you see those two things, you’ll know whether they’re doing the right thing or the wrong thing. It turns out that’s not hard to do. If you control the data, then you control the AI. >Artificial intelligence/Pentland, >Ecosystems/Pentland, >Decision-making processes/Pentland, >Cybernetics/Pentland.

Pentland, A. “The Human strategy” in: Brockman, John (ed.) 2019. Twenty-Five Ways of Looking at AI. New York: Penguin Press.


Brockman I
John Brockman
Possible Minds: Twenty-Five Ways of Looking at AI New York 2019
Artificial Intelligence Pentland Brockman I 200
Artificial intelligence/Pentland: On the horizon is a vision of how we can make humanity more intelligent by building a human AI. It’s a vision composed of two threads. One is data that we can all trust - data that have been vetted by a broad community, data where the algorithms are known and monitored, much like the census data we all automatically rely on as at least approximately correct.
The other is a fair, data-driven assessment of public norms, policy, and government, based on trusted data about current conditions. >Cybernetics/Pentland, >Ecosystems/Pentland, >Decision-making Processes/Pentland, >Data/Pentland.
Brockman I 204
One thing people often fail to mention is that all the worries about AI are the same as the worries about today’s government. For most parts of the government - the justice system, etc. - there’s no reliable data about what they’re doing and in what situation. VsArtificial intelligence/Pentland: Current AI is doing descriptive statistics in a way that’s not science and would be almost impossible to make into science. To build robust systems, we need to know the science behind data.
Solution/Pentland: The systems I view as next-generation AIs result from this science-based approach: If you’re going to create an AI to deal with something physical, then you should build the laws of physics into it as your descriptive functions, in place of those stupid little neurons and learning algorithms. >Ecosystem/Pentland.
When you replace the stupid neurons with ones that capture the basics of human behavior, then you can identify trends with very little data, and you can deal with huge levels of noise.
The fact that humans have a “commonsense” understanding that they bring to most
Brockman I 205
problems suggests what I call the human strategy: Human society is a network just like the neural nets trained for deep learning, but the “neurons” in human society are a lot smarter.

Pentland, A. “The Human strategy” in: Brockman, John (ed.) 2019. Twenty-Five Ways of Looking at AI. New York: Penguin Press.


Brockman I
John Brockman
Possible Minds: Twenty-Five Ways of Looking at AI New York 2019
Climate Costs United States Norgaard I 178
Climate Costs/Losses/United States: The original United States Environmental Protection Agency (USEPA) studies of the damages from climate change were exclusively concerned with measuring effects in the United States (Smith and Tirpak 1989)(1). The analyses examined the consequences of the equilibrium climate that would be caused by doubling carbon dioxide (CO2) concentrations in the earth's atmosphere (550 ppm). The USEPA studies did not address the dynamics of impacts over time. For example, the coastal, forestry, and ecosystem studies involve sectors that take decades if not centuries to adjust. The studies did not capture how these costs evolved over time. The USEPA studies revealed that a limited number of economic sectors were vulnerable to climate change: agriculture, coastal, energy, forestry, infrastructure, and water. In addition, several non-market sectors were also vulnerable, including recreation, ecosystems, endangered species, and health. Subsequent economic studies attempted to value the US economic damages associated with these impacts in terms of dollars (Nordhaus 1991(2); Cline 1992(3); Titus 1992(4); Fankhauser 1995(5); Tol 1995(6)). These economic results were summarized in the Second Assessment Report of the Intergovernmental Panel on Climate Change (Pearce et al. 1996)(7). The aggregate damage estimates to the US for doubling greenhouse gases (550 ppm) range from 1.0 to 2.5 percent of GDP.
The damage estimates varied widely across the different authors reviewed even though each author relied on the same original USEPA sectoral studies. Most of the other authors [excluding Cline and Fankhauser] assumed that ecosystem change would not necessarily be this harmful.
Norgaard I 179
Climate Costs/Losses: Two studies went beyond the US and predicted impacts across the world (Fankhauser 1995(5); Tol 1995(6)). Unfortunately, there was little evidence at the time on which to base this extrapolation other than population and income. They predicted that global impacts from doubling CO2 would range from 1.4 to 1.9 percent of Gross World Product (GWP). They predicted that the bulk of these damages would fall on the OECD (60 to 67 percent) because they assumed that damages were proportional to income. Only 20 to 37 percent of the damages were predicted to fall on low-latitude countries, although this would amount to a higher fraction of their GDP (over 6 percent). Africa, southern Asia, and southeast Asia (not including China) were predicted to be the most sensitive to warming, with losses over 8 percent of GDP (Tol 1995)(6). If temperatures were to rise by 10 °C in future centuries, damages could rise to 6 percent of GWP (Cline 1992)(3). Vs: (…) this is based largely on just extrapolating the results of the doubling experiment rather than upon additional research concerning higher temperatures.
(…) the present value of the damages from emitting a ton of carbon today would be on the order of $5 to $12 per ton (Pearce et al. 1996)(7). This is equivalent to $1.4 to $3.3 per ton of carbon dioxide. This social cost of carbon should rise over time at approximately a 2 percent rate to account for the rising marginal damages associated with accumulating greenhouse gases in the atmosphere. Such low prices will not stop greenhouse gases from accumulating over this century; they will simply slow the accumulation (Nordhaus 1991)(2).
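A quick arithmetic check of these figures in a short Python sketch (the only assumption beyond the text is the standard carbon-to-CO2 mass ratio of 44/12):

    # Convert a price per ton of carbon into a price per ton of CO2 using the
    # molar-mass ratio 44/12 (one ton of carbon corresponds to ~3.67 tons of CO2).
    C_TO_CO2 = 44.0 / 12.0

    for per_ton_carbon in (5.0, 12.0):
        print(round(per_ton_carbon / C_TO_CO2, 2))  # -> 1.36 and 3.27, i.e. the $1.4-$3.3 range

    # A 2 percent annual rise roughly doubles the social cost of carbon in about 35 years.
    print(5.0 * 1.02 ** 35)  # -> ~10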
Climate Costs/Catastrophes: The IPCC report also considered catastrophe. If temperatures were 6°C warmer by 2090, ‘experts’ predicted an 18 percent chance that damages would be greater than 25 percent of GWP (Nordhaus 1994)(8). In this case, experts included economists but also natural scientists unfamiliar with damage estimation. The three catastrophes identified in the IPCC report are a runaway greenhouse effect, disintegration of the West Antarctic ice sheet, and major changes in ocean currents (Pearce et al. 1996)(7).

>Emission permits, >Emission reduction credits, >Emission targets, >Emissions, >Emissions trading, >Climate change, >Climate damage, >Energy policy, >Clean Energy Standards, >Climate data, >Climate history, >Climate justice, >Climate periods, >Climate targets, >Climate impact research, >Carbon price, >Carbon price coordination, >Carbon price strategies, >Carbon tax, >Carbon tax strategies.


1. Smith, J., and Tirpak, D. 1989. Potential Effects of Global Climate Change on the United States. Washington, DC: US Environmental Protection Agency.
2. Nordhaus, W. 1991. To slow or not to slow: The economics of the greenhouse effect. Economic Journal 101: 920–37.
3. Cline, W. 1992. The Economics of Global Warming. Washington, DC: Institute of International Economics.
4. Titus, J. G. 1992. The cost of climate change to the United States. In S. Majumdar, L. Kalkstein, B. Yarnal, E. Miller, and L. Rosenfeld (eds.), Global Climate Change: Implications, Challenges, and Mitigation Measures. Easton, PA: Pennsylvania Academy of Sciences.
5. Fankhauser, S. 1995. Valuing Climate Change: The Economics of the Greenhouse. London: Earthscan.
6. Tol, R. 1995. The damage costs of climate change: Towards more comprehensive estimates. Environmental and Resource Economics 5: 353–74.
7. Pearce, D. et al. 1996. The social cost of climate change: Greenhouse damage and the benefits of control. Pp. 179–224 in Intergovernmental Panel on Climate Change, Climate Change 1995: Economic and Social Dimensions of Climate Change. Cambridge: Cambridge University Press.
8. Nordhaus, W. 1994. Managing the Global Commons. The Economics of Climate Change. MIT Press, Cambridge, MA.



Mendelsohn, Robert: “Economic Estimates of the Damages Caused by Climate Change”, In: John S. Dryzek, Richard B. Norgaard, David Schlosberg (eds.) (2011): The Oxford Handbook of Climate Change and Society. Oxford: Oxford University Press.


Norgaard I
Richard Norgaard
John S. Dryzek
The Oxford Handbook of Climate Change and Society Oxford 2011
Cybernetics Pentland Brockman I 194
Cybernetics/Pentland: State-of-the-art research in most engineering disciplines is now framed as feedback systems that are dynamic and driven by energy flows. Even AI is being recast as human/machine “adviser” systems, and the military is beginning large-scale funding in this area—something that should perhaps worry us more than drones and independent humanoid robots. But as science and engineering have adopted a more cybernetics-like stance, it has become clear that even the vision of cybernetics is far too small. It was originally centered on the embeddedness of the individual actor but not on the emergent properties of a network of actors. This is unsurprising, because the mathematics of networks did not exist until recently, so a quantitative science of how networks behave was impossible. We now know that study of the individual does not produce understanding of the system except in certain simple cases. >Ecosystems/Pentland.


Pentland, A. “The Human strategy” in: Brockman, John (ed.) 2019. Twenty-Five Ways of Looking at AI. New York: Penguin Press.


Brockman I
John Brockman
Possible Minds: Twenty-Five Ways of Looking at AI New York 2019
Data Pentland Brockman I 199
Data/cybernetics/ecosystem/decision-making processes/Pentland: When you can get (…) feedback quantitatively - which is difficult, because most things aren’t measured quantitatively - both the productivity and the innovation rate within the organization can be significantly improved. A next step is to try to do the same thing but at scale, something I refer to as building a trust network for data. It can be thought of as a distributed system like the Internet, but with the ability to quantitatively measure and communicate the qualities of human society (…)
Brockman I 203
If we have the data that go into and out of each decision, we can easily ask, Is this a fair algorithm? Is this AI doing things that we as humans believe are ethical? This human-in-the-loop approach is called “open algorithms”; you get to see what the AIs take as input and what they decide using that input. If you see those two things, you’ll know whether they’re doing the right thing or the wrong thing. It turns out that’s not hard to do. If you control the data, then you control the AI. >Algorithms, >Artificial intelligence.

Pentland, A. “The Human strategy” in: Brockman, John (ed.) 2019. Twenty-Five Ways of Looking at AI. New York: Penguin Press.


Brockman I
John Brockman
Possible Minds: Twenty-Five Ways of Looking at AI New York 2019
Decision-making Processes Pentland Brockman I 198
Decision-making Processes/Pentland: My students and I are looking at how people make decisions, on huge databases of financial decisions, business decisions, and many other sorts of decisions. What we’ve found is that humans often make decisions in a way that mimics AI credit-assignment algorithms and works to make the community smarter. A particularly interesting feature of this work is that it addresses a classic problem in evolution known as the group selection problem. The core of this problem is: How can we select for culture in evolution, when it’s the individuals that reproduce? What you need is something that selects for the best cultures and the best groups but also selects for the best individuals, because they’re the units that transmit the genes. >Ecosystem/Pentland, >Cybernetics/Pentland.
“Distributed Thompson sampling”/Pentland: a mathematical algorithm used in choosing, out of a set of possible actions with unknown payoffs, the action that maximizes the expected reward. The key is social sampling, a way of combining evidence, of exploring and exploiting at the same time. It has the unusual property of simultaneously being the best strategy both for the individual and for the group.
Social sampling: (…) is looking around you at the actions of people who are like you, finding what’s popular, and then copying it if it seems like a good idea to you. Idea propagation has this popularity function driving it, but individual adoption also is about figuring out how the idea works for the individual—a reflective attitude.
When you combine social sampling and personal judgment, you get superior decision making.
That’s amazing, because now we have a mathematical recipe for doing with humans what all those AI techniques are doing with dumb computer neurons. We have a way of putting people together to make better decisions, given more and more experience.
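As an illustration, a minimal, non-distributed Thompson-sampling sketch in Python (the payoff values and names are illustrative and my own; Pentland's "distributed" and social-sampling variants are not modelled here):

    import random

    def thompson_sampling(true_payoffs, rounds=10_000, seed=0):
        """Pick among actions with unknown Bernoulli payoffs.

        Each action keeps a Beta(successes + 1, failures + 1) belief about its
        payoff; every round we sample a plausible payoff from each belief, play
        the action whose sample is highest, and update that action's counts.
        This explores and exploits at the same time."""
        rng = random.Random(seed)
        successes = [0] * len(true_payoffs)
        failures = [0] * len(true_payoffs)
        total = 0
        for _ in range(rounds):
            samples = [rng.betavariate(s + 1, f + 1)
                       for s, f in zip(successes, failures)]
            action = samples.index(max(samples))
            reward = 1 if rng.random() < true_payoffs[action] else 0
            total += reward
            (successes if reward else failures)[action] += 1
        return total, successes, failures

    # Three actions with unknown payoffs 0.2, 0.5 and 0.7: play concentrates on the best one.
    print(thompson_sampling([0.2, 0.5, 0.7]))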
(…) the way you can make human AI will work only if you can get feedback to them that’s truthful. It must be grounded on whether each person’s actions worked for them or not.
Brockman I 199
The next step is to build a credit-assignment function (>Ecosystem/Pentland).
Pentland, A. “The Human strategy” in: Brockman, John (ed.) 2019. Twenty-Five Ways of Looking at AI. New York: Penguin Press.


Brockman I
John Brockman
Possible Minds: Twenty-Five Ways of Looking at AI New York 2019
Ecology Naess Singer I 251
Ecology/Naess, Arne/Singer, P.: (A. Naess 1973)(1):
Def Shallow Ecology/Naess: remains within the traditional framework of ethics: it is, for example, about not polluting water in order to have enough drinking water, and about avoiding pollution so that one can continue to enjoy nature. On the other hand,
Def Deep Ecology/Naess: wants to preserve the biosphere for its own sake, regardless of the potential benefit to mankind.
Deep Ecology/Naess/Singer, P.: thus takes as its subject matter larger units than the individual: species, ecosystems and even the biosphere as a whole.
Deep Ecology: (A. Naess and G. Sessions 1984)(2)
Principles:
1. The wellbeing and flourishing of human and non-human life on earth have value in themselves (intrinsic, inherent value), independently of the usefulness of the non-human world for human purposes.
2. Wealth and diversity of life forms contribute to the realization of these values and are values in themselves.
3. People do not have the right to diminish the wealth and diversity of the world, except when it comes to vital interests.
Singer I 252
Biosphere/Naess/Sessions/Singer, P.: Sessions and Naess use the term "Biosphere" in a broad sense, so that rivers, landscapes and ecosystems are also included. P. SingerVsNaess: (see also SingerVsSessions): the ethics of deep ecology does not provide satisfactory answers to the value of the life of individuals. Maybe that is the wrong question. Ecology is more about systems than individual organisms. Therefore, ecological ethics should be related to species and ecosystems.
Singer I 253
So there is a kind of holism behind it. This is shown by Lawrence Johnson (L. Johnson, A Morally Deep World, Cambridge, 1993). Johnson's thesis: The interests of species are distinct from the sum of individual interests and stand alongside individual interests within our moral considerations. >Climate change, >Climate damage, >Energy policy, >Clean Energy Standards, >Climate data, >Climate history, >Climate justice, >Climate periods, >Climate targets, >Climate impact research.

1. A. Naess (1973). „The Shallow and the Deep, Long-Range Ecology Movement“, Inquiry 16 , pp. 95-100
2. A. Naess and George Sessions (1984). „Basic Principles of Deep Ecology“, Ecophilosophy, 6

Naess I
Arne Naess
Can Knowledge Be Reached? Inquiry 1961, S. 219-227
In
Wahrheitstheorien, Gunnar Skirbekk Frankfurt/M. 1977


SingerP I
Peter Singer
Practical Ethics (Third Edition) Cambridge 2011

SingerP II
P. Singer
The Most Good You Can Do: How Effective Altruism is Changing Ideas About Living Ethically. New Haven 2015
Ecology Sessions Singer I 252
Ecology/biosphere/George Sessions/Bill Devall/Singer, P.: W. Devall and G. Sessions, Deep Ecology, Living As If Nature Mattered, Salt Lake City (1985): Thesis: The idea of biocentric equality is that all things in the biosphere have the same right to life and the right to their individual development possibilities. All organisms and entities in the ecosphere, as part of a coherent whole, have the same intrinsic value. P. SingerVsSessions, George/P. SingerVsDevall, Bill/Singer, P.: there are strong intuitive objections, for example:
1. That the welfare of adults is more important than the well-being of yeast and that the rights of gorillas are higher than the rights of grass.
2. If humans, gorillas, yeasts and grasses are all parts of a coherent whole, one can still ask why this gives the same intrinsic value to all elements.
a) Even if there is an intrinsic value in the realm of micro-organisms and the plant kingdom, this does not show that individual micro-organisms and individual plants also have an intrinsic value, because their survival as individuals is irrelevant to the survival of the ecosystem as a whole.
b) The fact that all organisms are part of a coherent whole does not show that they all have an intrinsic value, let alone the same intrinsic value.
It could still be that the whole has value only because it promotes the existence of conscious beings.

Sessions I
George Sessions
Deep Ecology - Living as If Nature Mattered Santa Barbara 1987


SingerP I
Peter Singer
Practical Ethics (Third Edition) Cambridge 2011

SingerP II
P. Singer
The Most Good You Can Do: How Effective Altruism is Changing Ideas About Living Ethically. New Haven 2015
Ecology Singer I 251
Ecology/Aldo Leopold/Singer, P.: (A. Leopold)(1): Thesis: We need a "new ethics" that deals with the relationship of humans to land and animals. Leopold's thesis: Something is right if it tends to preserve the integrity, stability and beauty of the biotic community, and it is wrong if it does not do so.
>Utilitarianism.
I 253
Ecology/Deep Ecology/Singer, P.: (see also Ecology/Naess, Ecology/Sessions). Problem: the question is whether a species or an ecosystem can be considered as an individual with interests.
>Ecosystemic approach.
Deep ecology: will have a problem with the definition of reverence for life. One can not only doubt that trees, species and ecosystems have moral interests; moreover, even if they are to be considered as a "self", it is still difficult to show that the survival of this self (the tree or the system) has a moral value, irrespective of the benefit it has for conscious life.
Existence/Systems/Value/Ethics/Singer, P.: Another problem: What is it like for a system not to be realized?
I 254
P. SingerVsLovelock, James: In this respect, species, trees and ecosystems are more like rocks than like sentient beings. We should confine our arguments to such sentient beings. >Deep ecology.

1. A. Leopold, A Sand County Almanac, with Essay on Conservation from Round River, New York (1970), pp. 238 and 262.

SingerP I
Peter Singer
Practical Ethics (Third Edition) Cambridge 2011

SingerP II
P. Singer
The Most Good You Can Do: How Effective Altruism is Changing Ideas About Living Ethically. New Haven 2015

Ecosystemic Approach Bronfenbrenner Upton I 16
Ecosystemic Approach/Bronfenbrenner/Upton: Bronfenbrenner's bioecological systems theory (Bronfenbrenner 1977)(1) provides us with a framework for looking at the different factors that influence human development. Bronfenbrenner acknowledges the importance of biological factors for development, but also points to the fact that, more than any other species, humans create
Upton I 17
the environments that help shape their own development. Development always occurs in a particular social context and this context can change development. Bronfenbrenner maintained that human beings can therefore develop those environments to optimise their genetic potential. >Environment.
Aspects of the environment:
Microsystem: this includes the immediate environment we live in and any immediate relationships or organisations we interact with, such as the family, school, workplace, peer group and neighbourhood.
Mesosystem: this level describes the connections between immediate environments. According to Bronfenbrenner, the way in which the different groups or organisations in the microsystem work together will have an effect on how we develop as individuals.
Exosystem: this refers to the external environmental settings that only indirectly affect development, such as a parent’s workplace.
Macrosystem: this is the larger cultural context and includes cultural and social norms and attitudes, national economy, political culture and so on. Although this layer is the most remote from the individual, it still influences development, for example by shaping how the micro- and exosystems are organised.
>Norms, >Attitudes, >Socialization, >Social identity.
Upton I 18
Chronosystem: this system refers to the dimension of time as it relates to an individual's environments. Elements within this system can be either external, such as the timing of a loved one's death, or internal, such as the physiological changes that occur with ageing. As individuals get older, they may react differently to environmental changes and may be better able to determine how those changes will influence them. >Aging.

1. Bronfenbrenner, U. (1977) Toward an experimental ecology of human development. American Psychologist, 32: 513–31.


Upton I
Penney Upton
Developmental Psychology 2011
Ecosystems Pentland Brockman I 195
Ecosystem/Pentland: How can we make a good human-artificial ecosystem, something that’s not a machine society but a cyberculture in which we can all live as humans—a culture with a human feel to it?
Brockman I 196
The first thing to ask is: What’s the magic that makes the current AI work? Where is it wrong and where is it right? The good magic is that it has something called the credit-assignment function. What that lets you do is take “stupid neurons”—little linear functions—and figure out, in a big network, which ones are doing the work and strengthen them.
The bad part of it is that because those little neurons are stupid, the things they learn don’t generalize very well. If an AI sees something it hasn’t seen before, or if the world changes a little bit, the AI is likely to make a horrible mistake. It has absolutely no sense of context. In some ways, it’s as far from (…)
Solution/Pentland: imagine neurons in which real-world knowledge was embedded. When you add (…) background knowledge and surround it with a good credit assignment function, then you can take observational data and use the credit-assignment
Brockman I 197
function to reinforce the functions that are producing good answers. The result is an AI that works extremely well and can generalize. Social physics/Pentland: Thesis: Similar to the physical-systems case, if we make neurons that know a lot about how humans learn from one another, then we can detect human fads and predict human behavior trends in surprisingly accurate and efficient ways. This “social physics” works because human behavior is determined as much by the patterns of our culture as by rational, individual thinking. These patterns can be described mathematically and employed to make accurate predictions. >Cybernetics/Pentland, >Decision-making Processes/Pentland, >Data/Pentland.
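A minimal sketch of what a credit-assignment function does, in Python (this is an ordinary delta rule over "stupid" linear neurons, used purely as an illustration; Pentland describes the idea informally and does not give this code):

    def delta_rule(examples, lr=0.1, epochs=100):
        """Assign credit for the prediction error to each weight in proportion
        to its input's contribution, and strengthen the weights that help."""
        n = len(examples[0][0])
        weights = [0.0] * n
        for _ in range(epochs):
            for inputs, target in examples:
                prediction = sum(w * x for w, x in zip(weights, inputs))
                error = target - prediction
                weights = [w + lr * error * x for w, x in zip(weights, inputs)]
        return weights

    # The hidden rule is target = 2*x1 - 1*x2; the weights doing the work get strengthened.
    data = [((1.0, 0.0), 2.0), ((0.0, 1.0), -1.0), ((1.0, 1.0), 1.0)]
    print(delta_rule(data))  # approaches [2.0, -1.0]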


Pentland, A. “The Human strategy” in: Brockman, John (ed.) 2019. Twenty-Five Ways of Looking at AI. New York: Penguin Press.


Brockman I
John Brockman
Possible Minds: Twenty-Five Ways of Looking at AI New York 2019
Environmental Damage Economic Theories Mause I 402f
Environmental Damage/Economic Theory: Environmental damage is often the result of the economic use of natural resources. It is caused by production and consumption as well as by the discharge of pollutants into the existing environmental media (air, water, soil). These forms of use can also be described as functions of the natural environment (production, consumption and landfill functions). In addition, increasing land use for settlement, transport and production purposes contributes to environmental damage because natural ecosystems are being reduced, biodiversity is declining, the landscape is being affected and the soil is increasingly sealed (Cansier 1993, p. 3(1); Hartwig 1992, p. 126ff(2)).
Environmental Policy/Federal Republic of Germany: According to the latest OECD environmental assessment report (2012)(4), the environmental policy pursued in Germany for more than 40 years (see Böcher and Töller 2012, p. 6ff.(3) for an overview) has contributed to a significant improvement in Germany's environmental quality, particularly in the recent past. For example, Germany's total greenhouse gas emissions (CO2, methane, etc.) in 2010 were 24% below 1990 levels, and Germany is one of the few OECD countries to have completely decoupled greenhouse gas emissions from economic growth in the 2000s, not least due to a reduction in the energy intensity of industrial production.
Externality: From an economic point of view, the need for government action in the field of environmental policy can be justified by the concept of external effects, in addition to the public-good character of eliminating environmental damage (Feess and Seeliger 2013, p. 39ff.(5); Endres 2000, p. 18ff.(6)).
Environmental damage and improvements can then be understood as negative or positive side effects of production or consumption. Like public goods, these effects are not captured by the market price mechanism.
Problem: If the state does not ensure the "internalisation of external effects" within the framework of its environmental policy, i.e. does not ensure that external costs are charged to the polluter or that external benefits are remunerated, this leads to a misallocation in the provision of private goods, which is accompanied by an overuse of environmental resources or too little improvement in environmental quality.(7)
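A toy numerical sketch of the internalisation argument (the figures are invented for illustration and are not from the cited literature):

    # Without internalisation the producer compares price only with its private cost,
    # even though each unit also imposes an external damage on third parties.
    price = 10.0           # market price per unit
    private_cost = 6.0     # producer's marginal cost per unit
    external_damage = 5.0  # damage per unit borne by others

    print(price - private_cost)                    # private gain per unit:  4.0 -> produce
    print(price - private_cost - external_damage)  # social gain per unit:  -1.0 -> overuse

    # A charge equal to the external damage "internalises" the effect:
    tax = external_damage
    print(price - private_cost - tax)              # producer now sees -1.0 and stops producing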

>Emission permits, >Emission reduction credits, >Emission targets, >Emissions, >Emissions trading, >Climate change, >Climate damage, >Energy policy, >Clean Energy Standards, >Climate data, >Climate history, >Climate justice, >Climate periods, >Climate targets, >Climate impact research, >Carbon price, >Carbon price coordination, >Carbon price strategies, >Carbon tax, >Carbon tax strategies.

1. Cansier, Dieter. 1993. Umweltökonomie. Stuttgart/ Jena:
2. Hartwig, Karl-Hans, Umweltökonomie. In Vahlens Kompendium der Wirtschaftstheorie und Wirtschaftspolitik, ed. Dieter Bender, Hartmut Berg, Dieter Cassel, Günter Gabisch, Karl-Hans Hartwig, Lothar Hübl, Dietmar Kath, Rolf Peffekoven, Jürgen Siebke, H. Jörg Thieme und Manfred Willms, Vol. 2, 5. ed., 122– 162. München 1992
3. Böcher, Michael, und Annette E. Töller, Umweltpolitik in Deutschland. Eine politikfeldanalytische Einführung. Wiesbaden 2012.
4. OECD. 2012. OECD-Umweltprüfberichte. Deutschland 2012. Paris: OECD Publishing.
5. Feess, Eberhard, und Andreas Seeliger, Umweltökonomie und Umweltpolitik, 4. ed. München 2013
6. Endres, Alfred, Umweltökonomie, 3. ed. Stuttgart: 2000.
7. Ibid. p. 19


Mause I
Karsten Mause
Christian Müller
Klaus Schubert,
Politik und Wirtschaft: Ein integratives Kompendium Wiesbaden 2018
Extinction Gould I 291ff
Extinction/Evolution/Life/Gould: extinction is not one domino falling in a chain with great consequences; extinction is what all species have in common. Species do not take their entire ecosystems with them when they die out. Therefore, species do not depend very much on each other. New York, for example, could survive without its dogs.
II 339 ff
Mistake: it is a mistake to say that "any species that is extinct is extinct because of its overspecialization." This is perhaps the most common misunderstanding about the history of life. It is a wrong understanding of progress and a wrong equation of disappearance and ineptitude. If one imagines life as a continuous and constant struggle, disappearance must be the final sign of inadequacy. >Explanation, >Theories.
II 340
GouldVs: but the present life does not even come close to perfection. The allegedly classic case of extinction based on competitive inferiority cannot be maintained.
For example, when the Andes rose, there was probably a considerable rain shadow over South America and the tropical forests were transformed into dry areas.
II 346
Consolation for believers in progress: for mass extinctions, an attempt is made to define a "background rate". The background rate describes normal extinction (outside mass-extinction events) and serves as a baseline for comparison.
Discovery: for more than half a billion years, the background rate has been declining slowly but steadily. During the early Cambrian period, at the beginning of adequate fossil records, about 600 million years ago, the average rate stood at 4.6 extinct species per million years. Since then, the rate has been steadily decreasing to about 2.0.
If the Cambrian rate had continued, about 710 more genera would have died out! It is interesting to note that the total number of genera has increased since then by almost the same number (680).
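A rough consistency check of these figures (under my own simplifying assumption, not Gould's, that the background rate declined roughly linearly from 4.6 to 2.0 over the 600 million years):

    # Extra extinctions if the Cambrian rate had persisted, compared with a rate
    # falling linearly from 4.6 to 2.0 per million years over 600 million years.
    cambrian_rate, present_rate, span_myr = 4.6, 2.0, 600
    print((cambrian_rate - present_rate) / 2 * span_myr)  # 780.0 - same order of magnitude as ~710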
II 347
No species is immortal. The inevitable should never be depressing.
IV 13
Extinction/Gould: extinction is more than just a negative force.
IV 178
Mass extinction/Gould: mass extinction must be reinterpreted from four points of view:
1. Mass extinctions are not the peak of a continuum, but breaks (discontinuities).
2. Mass extinctions are much more frequent, faster, deeper and very different (in terms of the number of extinct creatures) than we have ever imagined.
IV 179
The end of the Ediacara fauna was the first mass extinction. The fauna has been replaced and not improved or strengthened.
IV 182
A periodicity of mass extinctions has been discovered: since the great dying in the Permian period, a peak has occurred every 26 million years. Common-cause explanations: common causes proposed for mass extinctions are mountain formation, volcanism, temperature fluctuations, ...
New: a falling sea level could also be considered and has actually been observed before the last mass extinction. But: most mass extinctions are preceded by a slow decline in animal groups! Possible explanation: there are only a few fossils, as fewer rocks are suitable for preservation.
IV 185
Evolution/classification: some branches of the evolutionary tree contain many species, others, very few. There are strong differences. During normal times, species-rich branches tend to increase their richness. Question: why do they not conquer the entire biosphere for themselves? Solution: in the event of mass extinction, they have worse chances.
IV 201
Extinction: every extinction is final, forever. An extinct experiment will never be repeated. The chances are mathematically too slim. Biologists speak of the "principle of the irreversibility of evolution".

Gould I
Stephen Jay Gould
The Panda’s Thumb. More Reflections in Natural History, New York 1980
German Edition:
Der Daumen des Panda Frankfurt 2009

Gould II
Stephen Jay Gould
Hen’s Teeth and Horse’s Toes. Further Reflections in Natural History, New York 1983
German Edition:
Wie das Zebra zu seinen Streifen kommt Frankfurt 1991

Gould III
Stephen Jay Gould
Full House. The Spread of Excellence from Plato to Darwin, New York 1996
German Edition:
Illusion Fortschritt Frankfurt 2004

Gould IV
Stephen Jay Gould
The Flamingo’s Smile. Reflections in Natural History, New York 1985
German Edition:
Das Lächeln des Flamingos Basel 1989

Humans Bostrom I 110
Humans/intelligence/biology/capacities/Bostrom: The principal reason for humanity’s dominant position on Earth is that our brains have a slightly expanded set of faculties compared with other animals.
I 346
In what sense is humanity a dominant species on Earth? Ecologically speaking, humans are the most common large (~50 kg) animal, but the total human dry biomass (~100 billion kg) is not so impressive compared with that of ants, the family Formicidae (300 billion–3,000 billion kg). Humans and human utility organisms form a very small part (<0.001) of total global biomass. However, croplands and pastures are now among the largest ecosystems on the planet, covering about 35% of the ice-free land surface (Foley et al. 2007)(1). And we appropriate nearly a quarter of net primary productivity according to a typical assessment (Haberl et al. 2007)(2), though estimates range from 3 to over 50% depending mainly on varying definitions of the relevant terms (Haberl et al. 2013)(3). Humans also have the largest geographic coverage of any animal species and top the largest number of different food chains. >Superintelligence/Bostrom.

1. Foley, J. A., Monfreda, C., Ramankutty, N., and Zaks, D. 2007. “Our Share of the Planetary Pie.” Proceedings of the National Academy of Sciences of the United States of America 104 (31): 12585–6.
2. Haberl, H., Erb, K. H., Krausmann, F., Gaube, V., Bondeau, A., Plutzar, C., Gingrich, S., Lucht, W., and Fischer-Kowalski, M. 2007. “Quantifying and Mapping the Human Appropriation of Net Primary Production in Earth’s Terrestrial Ecosystems.” Proceedings of the National Academy of Sciences of the United States of America 104 (31): 12942–7.
3. Haberl, Helmut, Erb, Karl-Heinz, and Krausmann, Fridolin. 2013. “Global Human Appropriation of Net Primary Production (HANPP).” Encyclopedia of Earth, September 3.

Bostrom I
Nick Bostrom
Superintelligence. Paths, Dangers, Strategies Oxford: Oxford University Press 2017

Information Kauffman I 111
Order/Life/Human/Kauffman: the human is the product of two sources of order, not one. >Order/Kauffman, >Life/Kauffman, >Humans.
I 112
Information/order/life/emergence/Kauffman: most people assume that DNA and RNA are stable stores of genetic information. However, if life began with collective autocatalysis and later learned to incorporate DNA and genetic code, we must explain how these formations could be subject to hereditary variation and natural selection, even though they did not yet contain a genome! >Genes, >Selection.
On the one hand, evolution cannot proceed without template-copying mechanisms, but on the other hand it is evolution that assembled these mechanisms in the first place.
>Evolution.
Could an autocatalytic formation evolve without such mechanisms?
Solution: Spatial compartments (spaces divided by membranes) that split are capable of variation and evolution!
Solution: Assumption: every now and then random, uncatalysed reactions take place and produce new molecules. The metabolism would thereby be extended by a reaction loop.
Evolution without genome, no DNA-like structure as a carrier of information.
>Life/Kauffman.
I 114
Catalysis/Autocatalysis/Kauffman: in the case of autocatalytic formations, there is no difference between genotype and phenotype. >Genotype, >Phenotype.
Life/emergence/Kauffman: this inevitably leads to the formation of a complex ecosystem. Molecules produced in a primordial cell can be transported into other primordial cells and influence reactions there.
Metabolic-based life does not arise as a whole or as a complex structure, but the entire spectrum of mutualism and competition is present from the very beginning. Not only evolution, but also co-evolution.
>Co-evolution.
I 115
Order/life/emergence/Kauffman: the autocatalytic formations must coordinate the behaviour of several thousand molecules. The potential chaos is beyond imagination. Therefore, another source of molecular order has to be discovered, the fundamental internal homeostasis (balance). Surprisingly simple boundary conditions are sufficient for this. >Beginning
I 148
Information/Genes/Kauffman: Question: What mechanism controls the expression and suppression of certain genetic information? And how do the different cell types know which genes to use and when? J. Monod/Francois Jacob: mid-1960s: discovery of an operator that releases a reaction only at a certain point in time.
>J. Monod.
I 149
There is also a repressor. A small molecule can "switch on" a gene.
I 150
In the simplest case, two genes can suppress each other. Two possible patterns. >Genes.
Gene 1 is active and suppresses gene 2 or vice versa.
Both cell types would then have the same "genotype", the same genome, but they could realize different gene sets.
New horizon of knowledge: unexpected and far-reaching freedom at the molecular level.
The binding of the repressor to the operator at different sites results in different responsiveness of the operator on the DNA. This is regulation.
I 151
This control mechanism of binding at two different sites gives the molecules complete freedom to create genetic circuits of arbitrary logic and complexity. We must first learn to understand such systems.
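The simplest case of two mutually repressing genes can be written down as a tiny Boolean circuit; the sketch below (Python, purely illustrative and not Kauffman's own model) shows the two stable expression patterns described above:

    def step(state):
        """Synchronous update of two mutually repressing genes:
        each gene is ON (1) next step only if the other is currently OFF (0)."""
        g1, g2 = state
        return (int(not g2), int(not g1))

    def run(state, steps=6):
        trajectory = [state]
        for _ in range(steps):
            state = step(state)
            trajectory.append(state)
        return trajectory

    # The same "genome" (the same wiring) supports two different stable patterns:
    print(run((1, 0)))  # gene 1 active, gene 2 repressed - stays (1, 0)
    print(run((0, 1)))  # gene 2 active, gene 1 repressed - stays (0, 1)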

Kau II
Stuart Kauffman
At Home in the Universe: The Search for the Laws of Self-Organization and Complexity New York 1995

Kauffman I
St. Kauffman
At Home in the Universe, New York 1995
German Edition:
Der Öltropfen im Wasser. Chaos, Komplexität, Selbstorganisation in Natur und Gesellschaft München 1998

Internet Zittrain I 3
Internet/Zittrain: The future unfolding right now is very different from its past. The future is not one of generative PCs attached to a generative network. It is instead one of sterile appliances tethered to a network of control.
I 7
The first part of the book traces the battle between the centralized proprietary networks and the Internet, and a corresponding fight between specialized information appliances like smart typewriters and the general-purpose PC, highlighting the qualities that allowed the Internet and PC to win.
I 26
Internet/Zittrain: the Internet’s founding is pegged to a message sent on October 29, 1969. It was transmitted from UCLA to Stanford by computers hooked up to prototype “Interface Message Processors” (IMPs). (1) A variety of otherwise-incompatible computer systems existed at the time—just as they do now—and the IMP was conceived as a way to connect them. (2) (The UCLA programmers typed “log” to begin logging in to the Stanford computer. The Stanford computer crashed after the second letter, making “Lo” the first Internet message.) From its start, the Internet was oriented differently from the proprietary networks and their ethos of bundling and control. Its goals were in some ways more modest. The point of building the network was not to offer a particular set of information or services like news or weather to customers, for which the network was necessary but incidental. Rather, it was to connect anyone on the network to anyone else. It was up to the people connected to figure out why they wanted to be in touch in the first place;
I 69
I have termed this quality of the Internet and of traditional PC architecture “generativity.” Generativity is a system’s capacity to produce unanticipated change through unfiltered contributions from broad and varied audiences. Terms like “openness” and “free” and “commons” evoke elements of it, but they do not fully capture its meaning, and they sometimes obscure it.
I 101
In a development reminiscent of the old days of AOL and CompuServe, it is increasingly possible to use a PC as a mere dumb terminal to access Web sites with interactivity but with little room for tinkering. (“Web 2.0” is a new buzzword that celebrates this migration of applications traditionally found on the PC onto the Internet. Confusingly the term also refers to the separate phenomenon of increased user-generated content and indices on the Web—such as relying on user-provided tags to label photographs.) New information appliances that are tethered to their makers, including PCs and Web sites refashioned in this mold, are tempting solutions for frustrated consumers and businesses. None of these solutions, standing alone, is bad, but the aggregate loss will be enormous if their emergence represents a wholesale shift of our information ecosystem away from generativity. ((s) For generativity see Terminology/Zittrain.)

1. See ARPANET—The First Internet, http://livinginternet.com/i/ii_arpanet.htm (last visited June 1, 2007); IMP—Interface Message Processor, http://livinginternet.com/i/ii_imp.htm (last visited June 1, 2007).
2. See IMP—Interface Message Processor, supra note 25.

Zittrain I
Jonathan Zittrain
The Future of the Internet--And How to Stop It New Haven 2009

Order Kauffman Dennett I 306
Self-organization/Kauffman/Dennett: Kauffman's laws are not those of form, but of design, the constraints of metatechnology. >Laws/Kauffman, >Laws, >Laws of nature.
Dennett I 308
Self-organization/Kauffman: the ability to evolve, i.e. the ability to search the space of possibilities, is optimal when populations are "melting" out of local regions. >Self-organisation.
Local/Global/Self-organization/Technology/Kauffman: Local rules create global order.
>Local/global.
Dennett: mankind's technology is not governed by this principle. For example, pyramids are organized from top to bottom, but the building activity is of course from bottom to top.
>Technology.
Until the evolution of rational human technology, the rules run from local to global, then the direction is reversed.
---
Kauffman I 9
Order/Human/Kauffman thesis: natural selection alone has not formed us; the original source of order is self-organization. The complex whole can show "emergent" properties, in a completely unmystical sense, properties that are lawful in their own right.
>Complexity, >Emergence.
Kauffman I 21
The human then no longer appears as a product of random events, but as the result of an inevitable development. >Life, >Humans.
Kauffman I 18
Definition Rational Morphologists/Kauffman: (Darwin's predecessors): Thesis: biological species are not the product of random mutation and selection, but of timeless laws of form. (Kauffman argues in a similar direction.) Order/Physics/Kauffman: physics knows phenomena of profound spontaneous order, but does not need selection!
Cf. >Selection.
Kauffman I 30
Self-organization/Kauffman: thesis: certain structures occur at all levels: from ecosystems to economic systems undergoing technological evolution. >Ecosystems, >Economy.
Thesis: all complex adaptive systems in the biosphere, from single-celled organisms to economies, strive for a natural state between order and chaos. Great compromise between structure and chance.
>Structures, >Random.
Kauffman I 38
Order/physics/chemistry/biology: two basic forms:
1. The first occurs in so-called low-energy equilibrium systems:
For example, a ball rolls into the middle of a bowl.
For example, in a suitable aqueous solution, the virus particle assembles itself from its molecular DNA (or RNA) and protein components, striving for the lowest energy state.
2. The second type of order is present when the preservation of the structure requires a constant supply of matter or energy (dissipative structures).
For example, a whirlpool in the bathtub.
For example, the Great Red Spot on Jupiter. It is at least 300 years old, which is longer than the mean residence time of a single gas molecule in the vortex. It is a stable structure of matter and energy through which a constant stream of matter and energy flows.
One could call it a living being: it supports itself and gives birth to "baby whirls".
>Life/Kauffman.
Cells, for example, are not low-energy, but rather complex systems that constantly convert nutrient molecules to maintain their inner structure and multiply.
Kauffman I 115
Order/life/emergence/Kauffman: the autocatalytic formations must coordinate the behaviour of several thousand molecules. The potential chaos is beyond imagination. Therefore, another source of molecular order has to be discovered, of the fundamental internal homeostasis (balance). Surprisingly simple boundary conditions are sufficient for this. >Laws/Kauffman.

Kau II
Stuart Kauffman
At Home in the Universe: The Search for the Laws of Self-Organization and Complexity New York 1995

Kauffman I
St. Kauffman
At Home in the Universe, New York 1995
German Edition:
Der Öltropfen im Wasser. Chaos, Komplexität, Selbstorganisation in Natur und Gesellschaft München 1998


Dennett I
D. Dennett
Darwin’s Dangerous Idea, New York 1995
German Edition:
Darwins gefährliches Erbe Hamburg 1997

Dennett II
D. Dennett
Kinds of Minds, New York 1996
German Edition:
Spielarten des Geistes Gütersloh 1999

Dennett III
Daniel Dennett
"COG: Steps towards consciousness in robots"
In
Bewusstein, Thomas Metzinger Paderborn/München/Wien/Zürich 1996

Dennett IV
Daniel Dennett
"Animal Consciousness. What Matters and Why?", in: D. C. Dennett, Brainchildren. Essays on Designing Minds, Cambridge/MA 1998, pp. 337-350
In
Der Geist der Tiere, D Perler/M. Wild Frankfurt/M. 2005
Reanalysis Climatology Edwards I 58
Reanalysis/Climatology/Edwards: Analyzed weather data aren’t of much use to climatologists because forecasters frequently revise their analysis models (as often as every six months in some cases). Each change in the analysis model renders the data it produces incommensurable with those produced by the previous model. Reanalysis eliminates this problem by using a single “frozen” model to analyze historical observational data over some long period (40–50 years or even more). Because analysis models are built to combine readings from all available observing systems, reanalysis also overcomes the otherwise thorny problem of comparing instruments such as radiosondes and satellite radiometers. The result is a physically self-consistent global data set for the entire reanalysis period. Potentially, this synthetic data set would be more accurate than any individual observing system.(1) (…) some scientists hope that reanalysis will eventually generate definitive data sets, useable for climate trend analysis, that will be better than raw observational records. For the moment, however, they are stuck with infrastructural inversion - that is, with probing every detail of every record, linking changes in the data record to social and technical changes in the infrastructure that created it, and revising past data to bring them into line with present standards and systems. >Infrastructure/Edwards, >Climate data/Edwards.
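A toy sketch of the forecast-analysis cycle behind (re)analysis, in Python (a single scalar state, a fixed blending weight, and invented numbers, all my own assumptions; real assimilation systems are vastly more complex). The point is only that the analysed record depends on the model used, which is why a "frozen" model yields a self-consistent record while a frequently revised one does not:

    def analyse(forecast, observation, obs_weight=0.6):
        """Blend a model forecast with an observation into an analysed value."""
        return (1 - obs_weight) * forecast + obs_weight * observation

    def reanalyse(observations, model_step, initial=15.0):
        """Run one frozen forecast-analysis cycle over a historical record."""
        state, record = initial, []
        for obs in observations:
            forecast = model_step(state)    # forecast from the previous analysis
            state = analyse(forecast, obs)  # correct the forecast with the observation
            record.append(state)
        return record

    def frozen_model(temperature):
        """One fixed, illustrative 'model': a constant tendency per step."""
        return temperature + 0.05

    observations = [15.2, 15.1, 15.4, 15.3, 15.6]  # made-up "historical" observations
    print(reanalyse(observations, frozen_model))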
Edwards I 358
In reanalysis, investigators reprocess decades of original sensor data using a single “frozen” weather analysis and forecasting system. The result is a single complete, uniformly gridded, physically consistent global data set. Reanalysis offered a comprehensive solution to data friction such as that created by heterogeneous data sources, including satellite radiances not easily converted into traditional gridded forms. With reanalysis, many hoped, it would be possible to produce a dynamic data image of the planetary atmosphere over 50 years or more - essentially a moving picture that might reveal more precisely how, where, and how much Earth’s climate had changed. Global reanalysis might produce the most accurate, most complete data sets ever assembled. Yet the majority of gridpoint values in these data sets would be generated by the analysis model, not taken directly from observations. Whether or not it eventually leads to better understanding of climate change—a matter about which, at this writing, scientists still disagree - reanalysis represents a kind of ultimate moment in making data global. >Models/Climatology, >Climate data/Edwards, >Parameterization/metereology, >Homogenization/climatology.
Edwards I 447
From the earliest national and global networks through the 1980s, every empirical study of global climate derived from the separate stream of “climate data.” Climatological stations calculated their own averages, maxima, minima, and other figures. Central collectors later inverted the climate data infrastructure, scanning for both isolated and systematic errors and working out ways to adjust for them, seeking to “homogenize” the record. All of these efforts presumed (…) that only traditional “climate data” could form the basis of that record. But as numerical weather prediction skill advanced and computer power grew, a new idea emerged: What about a do-over? What if you could rebuild climate statistics “from scratch,” from daily weather data? And what if you could do this not simply by recalculating individual station averages, but by feeding every available scrap of weather data into a state-of-the-art 4-D assimilation system, as if taking a moving data image with a single camera? The roots of reanalysis lay in the Global Weather Experiment’s parallel data streams.
Edwards I 449
Four-dimensional data assimilation: Trenberth argued that the name “four-dimensional data assimilation” misstated the nature of operational analysis, which was actually “three and a half dimensional.” In other words, operational analyses looked backward in time, integrating data from the recent past (up to the observational cutoff), but they did not look forward in time, correcting the analysis with data arriving in the first few hours after the cutoff. But data assimilation systems purpose-built for reanalysis
I 450
could potentially offer this capability, leading (in principle) to more accurate, more smoothly varying analyses.(2)
I 456
Reanalysis provoked enormous excitement. By the early 2000s, other institutions, including the Japan Meteorological Agency, had launched major reanalysis projects, and numerous smaller, experimental projects had been started.(3) Investigators at NOAA’s (National Oceanic and Atmospheric Administration) Earth System Research Laboratory used surface pressure data from the pre-radiosonde era to extend reanalysis back to 1908, complementing existing studies to create a full century of reanalysis data, and they have begun to consider reaching even further back, into the late nineteenth century.(4) By 2007,
Edwards I 457
publications concerned with reanalysis for climate studies were appearing at a rate of 250 per year.(5) Parameterization: All the assimilation models used in reanalysis to date exhibit biases of various kinds, due mainly to imperfect physical parameterizations. >Parameterization/metereology, >Model bias/climatology.
Edwards I 459
Reanalysis: How well has reanalysis worked? Reanalyses and traditional climate data agree well—though not perfectly—for variables constrained directly by observations, such as temperature. But derived variables generated mainly by the model still show considerable differences.(6) For example, reanalysis models do not yet correctly balance precipitation and evaporation over land and oceans, whose total quantity should be conserved.(7) This affects their calculations of rainfall distribution, a climate variable that is extremely important to human populations and to natural ecosystems.
Edwards I 461
Reanalysis offers something that traditional climate data will never achieve: physically consistent data across all climate variables. Traditional climate data are “single variable”: you get a set of averages for temperature, another one for pressure, a third for precipitation, a fourth for sunshine, and so on. Each type of observation is independent of the others, but in the real atmosphere these quantities (and many others) are interdependent. Reanalysis models simulate that interdependence, permitting a large degree of cross-correction, and they generate all variables for every gridpoint. This allows scientists to study structural features of the atmosphere and the circulation not directly measured by instruments. >Human Fingerprint/climatology.
1. T. R. Karl et al., eds., Temperature Trends in the Lower Atmosphere: Steps for Understanding and Reconciling Differences (US Climate Change Science Program, 2006), 35.
2. Trenberth, K.E. Atmospheric circulation climate changes. Climatic Change 31, 427–453 (1995). https://doi.org/10.1007/BF01095156
3. K. Onogi et al., “JRA-25: Japanese 25-Year Re-Analysis Project—Progress and Status,” Quarterly Journal of the Royal Meteorological Society 131, no. 613 (2005); G. P. Compo et al., “Feasibility of a 100-Year Reanalysis Using Only Surface Pressure Data,” Bulletin of the American Meteorological Society 87, no. 2 (2006): 175–; J. S. Whitaker et al., “Reanalysis without Radiosondes using Ensemble Data Assimilation,” Monthly Weather Review 132, no. 5 (2004): 1190–.
4. R. M. Dole et al., Reanalysis of Historical Climate Data for Key Atmospheric Features: Implications for Attribution of Causes of Observed Change (US Climate Change Science Program, 2008), 10.
5. L. R. Lait, “Systematic Differences Between Radiosonde Measurements,” Geophysical Research Letters 29, no. 10 (2002): 1382.
6. R. B. Rood, “Reanalysis,” in Data Assimilation for the Earth System, ed. R. Swinbank et al. (Kluwer, 2003).
7. L. Bengtsson et al., “The Need for a Dynamical Climate Reanalysis,” Bulletin of the American Meteorological Society 88, no. 4 (2007): 495–.


Edwards I
Paul N. Edwards
A Vast Machine: Computer Models, Climate Data, and the Politics of Global Warming Cambridge 2013

The author or concept searched is found in the following controversies.
Disputed term/author/ism Author Vs Author
Entry
Reference
Chomsky, N. Pinker Vs Chomsky, N. Dennett I 545/546
Steven PinkerVsChomsky: the specialization for grammar is the product of a conventional neo-Darwinian process. The majority of the most interesting properties of the "language organ" must have evolved through adaptation.
Pinker: the objections to this position are mostly ridiculous - e.g. that the structure of the cell is "purely physical" and can be explained without evolution, or that language was not designed to communicate, etc.

Pinker I 218
Design/Chomsky: It is wrong to make selection responsible for all design: e.g. the fact that I have a positive mass prevents me from flying off into outer space, but this has nothing to do with selection; it has a simple physical explanation.
Explanation/Selection/PinkerVsChomsky: one usually does not refer to selection to explain utility, but to explain something improbable, e.g. the eye. If we compare the parts of the universe with a positive mass and those equipped with an eye, we need an explanation for this difference.
Vs: one might reply: the criterion seeing/not-seeing was only introduced in retrospect, after we knew what animals are capable of.
I 219
Most clusters of matter cannot see, but most cannot "fle" either, and I define that now as the composition, size and shape of the stone on which I'm sitting now.
Def Design/Pinker: If the function cannot be described more economically than the structure, no design is present. The concept of function adds nothing new.
Design/Pinker: should not serve the harmony of the ecosystem or the beauty of nature. After all, the replicator must be the beneficiary.

Pi I
St. Pinker
How the Mind Works, New York 1997
German Edition:
Wie das Denken im Kopf entsteht München 1998

Dennett I
D. Dennett
Darwin’s Dangerous Idea, New York 1995
German Edition:
Darwins gefährliches Erbe Hamburg 1997

Dennett II
D. Dennett
Kinds of Minds, New York 1996
German Edition:
Spielarten des Geistes Gütersloh 1999

Dennett III
Daniel Dennett
"COG: Steps towards consciousness in robots"
In
Bewusstein, Thomas Metzinger Paderborn/München/Wien/Zürich 1996

Dennett IV
Daniel Dennett
"Animal Consciousness. What Matters and Why?", in: D. C. Dennett, Brainchildren. Essays on Designing Minds, Cambridge/MA 1998, pp. 337-350
In
Der Geist der Tiere, D Perler/M. Wild Frankfurt/M. 2005

The author or concept searched is found in the following 2 theses of the more related field of specialization.
Disputed term/author/ism Author
Entry
Reference
Self-Organization Kauffman, St. Dennett I 303
Self-Organization/Kauffman/Dennett: Thesis: Evolution itself undergoes evolution. It develops because it is a forced move in the design game. Finding the right path is surprisingly easy - laws of design, not of form - inevitabilities of metatechnics - epistasis: interaction between genes - the fitness landscape strongly determines development: successful results are sacrificed.
Kauffman I 30
Kauffman's thesis: If the tape of life were played again, the individual branches of the family tree of life might look different, but the pattern of branching, which at first diverges strongly and then increasingly becomes a refinement of details, probably follows a deeper regularity. Self-Organization/Kauffman: Thesis: these structures occur at all levels: from ecosystems to economic systems undergoing technological evolution.
Thesis: All complex adaptive systems in the biosphere - from protozoa to economies - strive for a natural state between order and chaos. Great compromise between structure and chance.
I 49
Thesis: The best compromises are apparently achieved in the phase transition between order and chaos.
I 51
Edge of Chaos/Kauffman: great similarity with the theory of "self-organized criticality" (Per Bak, Chao Tang, Kurt Wiesenfeld).
I 349
Self-Organization/Kauffman: Bak, Tang, and Wiesenfeld (1988): new theory: self-organized criticality. For example, a heap of sand on a table that is constantly growing.
I 350
Power Law/Kauffman: many small and few large avalanches. For avalanches there is no typical size at all! The size is also independent of the size of the triggering grain of sand. Catastrophe/Chaos/Kauffman: Poised systems do not need massive triggers to start moving massively.
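A minimal sandpile sketch in Python (a standard Bak-Tang-Wiesenfeld-style toy written for illustration; the grid size and grain count are arbitrary choices of mine, not Kauffman's):

    import random
    from collections import Counter

    def sandpile(size=20, grains=20_000, seed=0):
        """Drop grains at random sites; a site holding 4 or more grains topples,
        sending one grain to each neighbour (grains fall off the edge).
        Returns a histogram of avalanche sizes (topplings per dropped grain)."""
        rng = random.Random(seed)
        grid = [[0] * size for _ in range(size)]
        avalanche_sizes = Counter()
        for _ in range(grains):
            x, y = rng.randrange(size), rng.randrange(size)
            grid[x][y] += 1
            topplings = 0
            unstable = [(x, y)]
            while unstable:
                i, j = unstable.pop()
                while grid[i][j] >= 4:
                    grid[i][j] -= 4
                    topplings += 1
                    for ni, nj in ((i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)):
                        if 0 <= ni < size and 0 <= nj < size:
                            grid[ni][nj] += 1
                            unstable.append((ni, nj))
            avalanche_sizes[topplings] += 1
        return avalanche_sizes

    # Many small avalanches, few large ones, no typical size (roughly a power law).
    print(sorted(sandpile().items())[:10])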
I 366
Economy/Organization/Self-Organization/Kauffman: recent research (Emily Dickinson): Thesis: flatter organizations are more successful, split into fields, each striving to improve its own benefit. The trick is how to select the fields (NK model). Fields can detect peaks. "Simulated annealing": finding a good approximation method ("temperature": see below).
I 415
Thesis: we can consider goods and services as strings that interact with other strings.

Dennett I
D. Dennett
Darwin’s Dangerous Idea, New York 1995
German Edition:
Darwins gefährliches Erbe Hamburg 1997

Kau II
Stuart Kauffman
At Home in the Universe: The Search for the Laws of Self-Organization and Complexity New York 1995

Kauffman I
St. Kauffman
At Home in the Universe, New York 1995
German Edition:
Der Öltropfen im Wasser. Chaos, Komplexität, Selbstorganisation in Natur und Gesellschaft München 1998
Ecology Pinker, St. I 491
Tradition: Thesis: Animals act for the benefit of the ecosystem, the population or the species. This seems to follow from Darwin and is a widely held view.
PinkerVs: this radically contradicts Darwinism and is also probably wrong.