[–] Sebrof@hexbear.net 3 points 3 weeks ago (1 children)

I very much like your analogy of the mind creating an abstract version of the world. The paper on emergence even has a section on Hopfield networks. Without wanting to come across as reductionist, I think there is something to the idea that our thoughts, mental formations, the "computations" in our minds, are some "macro" emergent "model" that can be analyzed without a detailed understanding of the micro-configurations. This is closely related to the concept of entropy, as you're most likely aware. The book by Thurner et al. also has an entire chapter dedicated to what entropy means in a complex system (and how to calculate it, of course). The "software" of thought, or computation, is a "higher-level abstraction" that can run on multiple types of "hardware", and it is the hardware that determines the specific micro-configurations.
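To make that entropy connection concrete, here is a minimal sketch (my own toy example, not from the book) of how coarse-graining micro-configurations into macrostates affects the Shannon entropy of a description:

```python
from collections import Counter
from math import log2

def shannon_entropy(samples):
    """Shannon entropy (in bits) of the empirical distribution of the samples."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Hypothetical micro-configurations: (macro variable, micro detail) pairs.
micro = [("up", 1), ("up", 2), ("down", 1), ("up", 3), ("down", 2), ("up", 1)]

# Coarse-graining: keep the macro variable, discard the micro detail.
macro = [m[0] for m in micro]

print(shannon_entropy(micro))  # entropy of the detailed description
print(shannon_entropy(macro))  # entropy of the coarse-grained one (never larger)
```

The coarse-grained description always carries equal or less entropy, which is one way of saying the emergent macro "model" throws away micro detail it doesn't need.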

If you are interested in computation theory (it's a new field to me), then the paper on emergence, Software in the Natural World, may really interest you.

And if you are interested in the nexus of Hegelian dialectics and computation theory, then you may enjoy this paper on Hegel, computation, and self-reference.


Going back to multilayer networks: I've thought about using that framework to combine physical constraints (energy usage, etc.) with political-economic networks (labor, commodity, and money flows) to come up with some way to model modes of production. This would be similar to what the anthropologist Eugene Ruyle has written about.

If using multilayer networks, the "relations of production" that help differentiate the various modes can be expressed as different types of links between the nodes (whether those are individuals, firms(?), "some general unit of production", etc.). If one could find a way to generally model how humans organize themselves and build institutions, then perhaps that could be encoded in a network's links as well.
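For concreteness, here is roughly how I picture encoding that; a minimal sketch assuming the networkx library is available, with node and layer names invented purely for illustration:

```python
import networkx as nx  # assumed dependency

# Each layer is a separate directed graph over the same production units.
nodes = ["household", "farm", "mill", "state"]

layers = {
    "labor":     nx.DiGraph(),  # who performs labor for whom
    "commodity": nx.DiGraph(),  # flows of product
    "money":     nx.DiGraph(),  # payments, rents, taxes
}
for g in layers.values():
    g.add_nodes_from(nodes)

# The "relations of production" live in which typed links exist on which layers.
layers["labor"].add_edge("household", "farm")
layers["commodity"].add_edge("farm", "mill")
layers["money"].add_edge("mill", "state", kind="tax")
```

Different modes of production would then show up as different characteristic link patterns across the layers.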

But... I got really bogged down in the details and making it work in practice was a bit harder. I have more to say about it, though, if anyone is interested.

I'll give your link a read, it sounds interesting!!

[–] Sebrof@hexbear.net 1 points 3 weeks ago

Some other writers on complexity can also be thought-provoking. As I’ve mentioned, complexity science is an emerging field, so I don’t think any single school of thought has come to dominate yet. That also means that I, as a layperson, may just be following quacks, so keep that in mind.

Here are a few other writers on complexity in case anyone is interested. Something to note is that you’ll find a bit of anti-Sovietism in these writers. It’s like they have this Hayekian worldview where they see socialism as unable to handle complexity, take that to be why the USSR failed, and so on. But immanent critique of the field is a good way forward, so learning how these authors think about complexity can still be useful.


My first introduction to thinking about complexity was this article on complexity, scale, and cybernetic-communism. It is leftist, but still anti-Soviet. There are many citations included, though, if you want to go down some rabbit holes. If you are up for the mathematics, the final section is an exploration of mathematical measures of complexity.

Some main ideas that are cited in the above article:

1.) Studies using historical data from the Seshat Global History Databank suggest that the growth of complex societies follows a repeating two-phase cycle. In the first phase, societies grow in scale but not in information-capacity; any increase in complexity is simply due to increasing scale. Their given information-capacity induces a “scale threshold”, a maximum scale beyond which the society cannot grow, and hitting it causes the society to ‘stagnate’. This stagnation is the second phase, in which scale remains the same but information-capacity may grow. If a society can advance its information-capacity, it can continue to grow in scale until it meets the next scale threshold, and the cycle repeats.

I’ve thought about this being another view (in terms of complexity and information) of the Marxist idea that

> At a certain stage of development, the material productive forces of society come into conflict with the existing relations of production or – this merely expresses the same thing in legal terms – with the property relations within the framework of which they have operated hitherto. From forms of development of the productive forces these relations turn into their fetters. Then begins an era of social revolution. The changes in the economic foundation lead sooner or later to the transformation of the whole immense superstructure.

Or another way of thinking about the quantity to quality idea in dialectics.
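For what it's worth, here is a toy caricature of point 1's two-phase cycle (entirely my own invention, not the Seshat authors' model, and with arbitrary constants), just to show how simple the qualitative mechanism is: scale grows until it hits a threshold set by information-capacity, then stagnates while capacity slowly accumulates:

```python
# Toy caricature of the two-phase cycle; all constants are arbitrary.
scale, capacity = 1.0, 1.0
phases = []
for t in range(300):
    threshold = 10 * capacity      # assumed: max sustainable scale grows with capacity
    if scale < threshold:
        scale *= 1.05              # growth phase: scale expands, capacity is static
        phases.append("grow")
    else:
        capacity += 0.01           # stagnation phase: scale is stuck, capacity builds
        phases.append("stagnate")

# The phase sequence alternates between runs of growth and runs of stagnation.
print("".join("G" if p == "grow" else "S" for p in phases))
```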

2.) In order to build cybernetic-communism, we need better “instruments of complexity” beyond money and markets. On the most general definition, complexity is a measure of how much information it takes to “describe” a system. The last section of the article goes into some attempts at quantifying this, and into criticisms of complexity measures such as KL divergence, Kolmogorov complexity, etc.
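To give a flavor of what quantifying "description length" looks like in practice, here is a minimal sketch (my own, not from the article) using compressed size as a crude stand-in for Kolmogorov complexity, which is itself uncomputable:

```python
import os
import zlib

def description_length(data: bytes) -> int:
    """Crude upper bound on Kolmogorov complexity: size after compression."""
    return len(zlib.compress(data, 9))

ordered = b"AB" * 500     # highly regular: a short description suffices
noisy = os.urandom(1000)  # incompressible: the shortest description is the data itself

print(description_length(ordered))  # small
print(description_length(noisy))    # roughly 1000 bytes, or a bit more
```

This also illustrates the standard criticism the article touches on: pure noise maxes out measures like this even though it has none of the organized complexity we actually care about.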


I’ve also listened to some episodes of the General Intellect Unit podcast, where they discuss cybernetics from a leftist perspective (but still manage to be anti-Soviet). I can't give a general recommendation, but it is another resource.


Other writers in complexity science that I have found are Alexander Siegenfeld and Yaneer Bar-Yam. Bar-Yam is the founding president of the New England Complex Systems Institute, and he definitely has his own ‘school of thought’ within complexity science. You may find it fruitful to go through some of his arguments, even if only to build a better critique. He would definitely fit into the ‘anti-authority’ camp: he views hierarchies as a limitation on the information-capacity of a network, and hence a limitation on complexity. So he would be a counter to your views on systems. You can definitely find anti-Sovietism in his work as well.

Siegenfeld and Bar-Yam wrote an introductory paper to complex systems that may be of interest, and doesn’t require mathematical knowledge.

One big aim of this paper is developing a more intuitive understanding of what complexity is. You can think of complexity as being like the division of labor that you mentioned: it is the correlation of various functions among a system’s parts. It isn’t randomness, and it isn’t uniform cohesion.

I think the biggest idea to take away from Bar-Yam’s work is a scale-based extension of Ashby’s Law of Requisite Variety from cybernetics. On this view, the complexity of a system is a function of its components, their interactions, and the scale of the system. A system may be very complex at a large scale but lack the necessary complexity at small scales, and so on. This view of complexity centers on Ashby’s idea of variety: the number of possible actions, or states, that a system can take. Bar-Yam extends variety to include scale, so there are small-scale actions (those of single individuals) and large-scale actions (those of a state).

Ashby’s Law (with or without taking into account Scale) is about autopoiesis, a system’s ability to maintain itself in an environment. According to Ashby’s Law, a system in an environment (which itself is also a system) must be able to react or respond to actions from the environment at the appropriate scale. For example, the climate is an environment that our mode of production (system) is within (and also part of). Climate change creates certain actions (wildfires, global temperature increase, flooding) at a large (regional to global) scale. Capitalism, to maintain itself, has to respond to each environmental action at its scale. If it fails, then changes occur within Capitalism, the system. The system may adapt, evolve, change the environment, or fragment. For socialism (a system) to survive within its environment (global capitalism) it must have the appropriate responses, at the appropriate scale, to respond to capitalism’s "attacks". The variety of a system must match or exceed the variety of its environment in order to survive. And this applies at all scales.

The scale-based version of Ashby's Law suggests that sometimes a system can “flex” its variety at a particular scale in order to outcompete another system (or its environment). One example given to illustrate the scale-based law is guerrilla warfare. Certain environments favor small-scale actions over large-scale ones, and guerrilla fighters may have more variety (actions/states/options) at small scales than large armies do. Other environments, like open fields, favor large armies, which have more variety at large scales than small forces do.
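Here is a minimal sketch of how one might tabulate that idea (my own toy encoding, not Bar-Yam's formalism): each system gets a repertoire of actions per scale, and the requisite-variety check is done scale by scale:

```python
# Toy encoding: each system is a dict mapping scale -> set of available actions.
guerrillas = {
    "small": {"ambush", "raid", "blend_in", "sabotage", "scout"},
    "large": {"hold_position"},
}
army = {
    "small": {"patrol", "checkpoint"},
    "large": {"artillery", "siege", "occupation", "encirclement"},
}

def has_requisite_variety(system, environment):
    """Scale-by-scale check: does the system match the environment's variety?"""
    return all(
        len(system.get(scale, set())) >= len(actions)
        for scale, actions in environment.items()
    )

print(has_requisite_variety(guerrillas, army))  # False: outmatched at large scale
print(has_requisite_variety(army, guerrillas))  # False: outmatched at small scale
```

Neither side "wins" outright here; which mismatch matters depends on which scale the environment favors.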

Other papers by Bar-Yam then talk about ways of calculating the complexity, or variety I suppose, of a system at its various scales.

At this point, I’m not certain how accepted these ideas are in complexity science. They may be approaching quack? But I’ve found them interesting to chew on, and try to incorporate into my understanding of Marxism and systems.

[–] Sebrof@hexbear.net 2 points 3 weeks ago* (last edited 3 weeks ago) (5 children)

I have been reading about complexity science and, as @Sodium_nitride@lemmygrad.ml mentioned, I have thought about whether we can use insights from complexity science to help build some general model of, or rules for, human organization. This is definitely a question that has interested me lately, and if anyone wants to share thoughts or resources I’d be down. Your post here re-sparked my interest!

I thought I’d give an overview of the literature I’m familiar with, and try to make some connections with what OP wrote, and/or the goal that Sodium_nitride wrote. What follows is mostly an information dump, and I’m sorry for hijacking your post. It’s just that this topic really excites me! Hopefully some of this may be useful food for thought.


There is, of course, cybernetics and the work of Ashby and Beer. Most Hexbearites know of Project Cybersyn in Chile, so Beer should be familiar. I think the big ideas out of classical cybernetics are Ashby’s Law of Requisite Variety (which I will mention again below) and Beer’s idea of the Viable System Model (VSM). I’m actually not familiar enough with the VSM to give a decent overview, but an overview written by Raul Espejo, someone who worked on Project Cybersyn with Beer, can be found here. If other comrades wish to give their understanding of the VSM, feel free! I'd love to hear it.

Moving on from cybernetics and toward more contemporary complex systems theory, the model/theory I’m most familiar with is the one presented by Thurner, Hanel, and Klimek in their Introduction to the Theory of Complex Systems. Since complexity science is an emerging field, this is just one of the attempts I know of to give a general definition of what a complex system is and how to model it. It can be a good starting point for those who want to think about actually modeling complexity, but it is by no means the final word. It doesn’t mention agent-based modeling, for example, which is something that could be added to simulate people/institutions and their actions.

The big idea of this book is its attempt at explaining complex systems as a co-evolving multilayer network. Complex systems are:

a.) networks: they consist of entities/components (nodes) with interconnections/relationships (links); they are

b.) co-evolving: both the entities (nodes) and the relationships (links) change dynamically, and their evolution depends on each other, i.e. the dynamics of the entities depend on other entities and their interrelationships, and the dynamics of the interrelationships depend on the entities; and

c.) multilayered: a multilayer network just means there are multiple types of relationships in the network, and each relationship type can be expressed as its own network layer (with the entities/nodes present on every layer). You can imagine one layer holding “trade relations”, another layer holding “communication relations”, etc. The evolution of the relationships in each layer also depends on the other layers.

The book develops a general model for this (it can be rather mathematical) and tries to use it for describing some common features of complex systems: punctuated equilibrium, self-organization, robustness, resilience, statistics on collapse, etc.
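To make (a)-(c) concrete, here is a bare-bones caricature of a single co-evolution step, vastly simpler than the book's general model and entirely my own sketch (the update rules and constants are invented for illustration):

```python
import random

random.seed(42)
nodes = list(range(5))

# (a) + (c): two link-layers over the same nodes, stored as weighted edge dicts.
layers = {
    "trade":         {(i, j): random.random() for i in nodes for j in nodes if i != j},
    "communication": {(i, j): random.random() for i in nodes for j in nodes if i != j},
}
state = {i: random.random() for i in nodes}  # one state variable per node

def step():
    # (b) part 1: node dynamics depend on links from *all* layers...
    for i in nodes:
        influence = sum(
            w * state[j]
            for layer in layers.values()
            for (a, j), w in layer.items()
            if a == i
        )
        state[i] = 0.5 * state[i] + 0.5 * influence / (2 * (len(nodes) - 1))
    # ...part 2: link dynamics depend on node states and on the *other* layer.
    for name, layer in layers.items():
        other = layers["communication" if name == "trade" else "trade"]
        for edge in layer:
            layer[edge] = 0.9 * layer[edge] + 0.1 * other[edge] * state[edge[1]]

for _ in range(10):
    step()
print(state)
```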

I’ve thought a bit about using this type of framework to describe, as Sodium_nitride has mentioned, some common structure for organization. But brainstorming and sharing ideas would definitely help, especially since this isn’t my day job and one person can only think of so much. My first interest was thinking about complexity science as a way to talk about modes of production more generally.


A different direction is the work of Joshua Epstein, who has a book called Generative Social Science: Studies in Agent-Based Computational Modeling. It doesn’t use Thurner et al.’s framework described above; it focuses on various ways of using agent-based modeling to generate observed social dynamics. Think of Conway’s Game of Life, but for social rules. This is also like @Sodium_nitride@lemmygrad.ml's idea of coming up with rules for organization and then studying what emerges from them.

There is similar work by Axtell where he uses agent-based modeling to generate observable statistics on firm sizes.

The takeaway is that if one can find a description of organizational rules, then one can simulate them to study emergent patterns and statistics that aren’t obvious from the rules themselves. These various works go into the details of how.
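As a flavor of "rules in, statistics out", here is a tiny generic toy (my own sketch, not Epstein's or Axtell's actual models): agents copy a peer's firm membership or occasionally found a new firm, and a skewed firm-size distribution emerges from that micro-rule alone:

```python
import random
from collections import Counter

random.seed(0)
N_AGENTS, STEPS = 500, 5000
firm_of = list(range(N_AGENTS))  # start: every agent is their own one-person firm

for _ in range(STEPS):
    agent = random.randrange(N_AGENTS)
    # Micro-rule (assumed for illustration): mostly join a random peer's firm,
    # occasionally strike out alone.
    if random.random() < 0.05:
        firm_of[agent] = agent                                # found a new firm
    else:
        firm_of[agent] = firm_of[random.randrange(N_AGENTS)]  # join a peer's firm

sizes = Counter(firm_of).values()
print(sorted(sizes, reverse=True)[:10])  # a few big firms, many small ones
```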


My focus on agent-based modeling and complexity science has taken a backseat for now; I want to learn more political economy and the theory of imperialism. I also think I was focusing too much on the micro-to-macro approach. As Marx says, “the complete body is easier to study than its cells”. There are divides over the approach of using micro-descriptions to generate macro-results. It isn’t that micro-dynamics are unimportant; rather, if you want to focus on the macro-level, there are many cases where you can start at the macro-level and don’t need a full (or even accurate) micro-model. This is also where the idea of universality comes into play: various different micro-models can all give rise to the same emergent macro-descriptions. There is a bit of redundancy, so to speak, in our micro-models. So some people say to just start with a macro-model if that’s the focus of your study.
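A standard illustration of universality (my own example, not from any of the works cited here) is the central limit theorem: wildly different micro-models produce macro-observables with the same Gaussian shape:

```python
import random
import statistics

random.seed(1)

def macro_sample(micro_draw, n=1000, k=400):
    """Average of k micro-variables: a 'macro' observable."""
    return [sum(micro_draw() for _ in range(k)) / k for _ in range(n)]

# Two very different micro-models...
coin = macro_sample(lambda: random.choice([0, 1]))  # discrete coin flips
unif = macro_sample(lambda: random.random())        # continuous uniform draws

# ...both yield approximately Gaussian macro-distributions; only the mean and
# spread survive the aggregation, not the micro detail.
print(statistics.mean(coin), statistics.stdev(coin))
print(statistics.mean(unif), statistics.stdev(unif))
```

The macro-distribution "forgets" almost everything about the micro-rule except a couple of summary numbers.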

A paper I recently read that has steered me away from focusing too much on micro-level descriptions is Software in the Natural World: A Computational Approach to Hierarchical Emergence, which links together computation theory, information theory, and complexity science. A big question this paper asks is: in what contexts do you need a micro-model to explain an emergent macro-model? It finds that in certain situations you do not need to refer to a micro-model at all in order to generate/predict macro-data; the “macro-world” is “closed off” from the “micro-world”. This is where causal, informational, and computational closure come into play.

If anyone is familiar with physics, this isn’t surprising: you don’t need statistical mechanics to describe thermodynamics and make predictions, and you don’t need quantum mechanics to make predictions about the macro-world using Newtonian physics. Anwar Shaikh would similarly argue that you don’t need microeconomics to understand macroeconomics. But it isn’t that micro-models are useless or have no place; it just depends on what you need the model for.