
I have an intuitive concept of systems that grow steadily in complexity as they compute. I can think of some examples:

  1. Nature. Given the physical laws, some initial conditions (say, planet Earth at the beginning of life) and enough time, the system eventually grows in complexity, all the way from dinosaurs fighting over territory to humans building castles.

  2. Cellular automata. Given a set of automaton rules, some initial conditions (a grid with specific cells set) and enough time, some automata eventually evolve complex structures (a minimal sketch follows this list).

  3. AI applications such as genetic programming (to some extent). The issue is that most of these evolve towards a specific goal, and they often reach a tipping point and stop.
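
To make example 2 concrete, here is a minimal sketch of the kind of automaton I have in mind (the rule number and grid size are arbitrary choices; Rule 110 is just a well-known elementary automaton that produces long-lived structures from simple initial conditions):

```python
# A minimal sketch of example 2: an elementary (one-dimensional) cellular
# automaton. Rule 110 is a standard example known to produce long-lived,
# complex structures from a trivial starting state.
def step(cells, rule=110):
    """One synchronous update with wraparound boundaries."""
    n = len(cells)
    return [
        (rule >> (4 * cells[(i - 1) % n] + 2 * cells[i] + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

cells = [0] * 79
cells[-1] = 1  # a single live cell as the initial condition
for _ in range(40):
    print("".join(".#"[c] for c in cells))
    cells = step(cells)
```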

The kind of system I'm talking about is supposed to grow in complexity indefinitely. That is, one can expect that, given enough time, such a system will eventually develop complex structures such as "lifeforms" that defend their own existence. It is also not goal-oriented; it is supposed to grow in complexity because its internal structures are more stable when they are more complex. We know nature/physical laws satisfy that criterion, but I'm not sure about cellular automata and genetic programming. This is a very vague concept and hard to define (there is no accepted definition of complexity, after all). Still, is there any name/formalization for this kind of system? Do we know any examples of systems/specific automata that, as far as our observation goes, never reach a "tipping point" where they stop becoming more complex?

  • I don't know, but my guess is that this is indeed too vague to have a single name. At least in TCS, this does seem to be a non-issue.
    – Raphael
    Commented Aug 5, 2015 at 5:19
  • I recommend reading up on Conway's Game of Life. At least that's what it sounds like to me.
    – Sagnik
    Commented Aug 5, 2015 at 5:21
  • It has a name: "physics". Entropy is a measure of complexity, and the second law of thermodynamics says (roughly) that in physical systems, the entropy constantly increases [and never reaches a tipping point]. I'm not sure what that has to do with computer science, though. If that's a valid answer to your question, then it seems like your question is too broadly scoped. Can you think of a way to scope your question more narrowly so that it's specifically about computer science? Can you define what you mean by "system"?
    – D.W.
    Commented Aug 5, 2015 at 6:01
  • @Raphael Doesn't semantics count as TCS? This kind of problem does appear when modeling computing systems. For example, model checking works only if the system has bounded complexity with respect to the desired observations.
    – Gilles
    Commented Aug 5, 2015 at 12:01
  • @Gilles I'm not sure that matches the question as I understand it. Model checking becomes arduous because the space of potential states grows with the number of executed steps; any given run of the system does not necessarily "grow in complexity", whatever that means.
    – Raphael
    Commented Aug 5, 2015 at 12:07

2 Answers


A possible set of hypotheses:

  1. Time is discrete; let $s_i$ be the state of your system at $t=i$.
  2. Your system is deterministic and its transition function $f$ is computable.
  3. Its complexity $c : \mathbb{N} \to \mathbb{N}$ is the Kolmogorov complexity: $c(i)$ is the length of the shortest C++ program that prints $s_i$.

Then you have a bound: $$c(i) \leq O^+(1) + c_0 + |f| + |i| = O^+(1) + |i|$$

($O^+(1)$ denotes a positive constant, $|x|$ is the Kolmogorov complexity of $x$, and $c_0 = c(0)$ is the complexity of the initial state; note that $|i| \leq \log_2 i + O(1)$.)

And more generally, for any computable function $g$: $$c(g(i)) \leq O^+(1) + c_0 + |f| + |g| + |i| = O^+(1) + |i|$$

The system's complexity grows more slowly than any computable function! (At time $g(i)$, the state's complexity is at most about $\log_2 i$, however fast $g$ grows.)
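
To see where the bound comes from: a single fixed-size driver that hard-codes $s_0$ and $f$ and takes only $i$ as input already prints $s_i$, so describing $s_i$ never costs more than describing $i$ plus a constant. A minimal sketch (the concrete `s0` and `f` below are toy placeholders, not part of the argument):

```python
# Sketch of the witness behind the bound: a fixed driver whose only
# variable input is i, so a description of s_i costs at most |i| plus
# the constant size of this program, of s0 and of f.
def print_state(i, s0, f):
    """Simulate the deterministic system for i steps, then print s_i."""
    s = s0
    for _ in range(i):
        s = f(s)
    print(s)

# Toy transition function, purely for illustration:
print_state(10, s0=1, f=lambda s: (3 * s + 1) % 1000)
```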

Conclusion:

  • You should look at systems which are not deterministic or whose transition function is not computable, or use another definition of complexity (like this one: Chaos Theory).
  • With Kolmogorov complexity, chaotic states are more complex than states with lifeforms defending themselves (a toy check is sketched below), but maybe one can define a physical, macroscopic Kolmogorov complexity where the indivisible basic components used to describe the world are fairly large and their internal features are negligible. Then it becomes more complex to describe a tangle of life and vacuum than homogeneous chaos.
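
The first point can be checked crudely by using a compressor as a computable stand-in for Kolmogorov complexity (only an approximation, since Kolmogorov complexity itself is uncomputable): a random state compresses far worse than a structured one, so it is rated as more complex.

```python
# Crude illustration: zlib as a computable proxy for Kolmogorov
# complexity. A random byte string is essentially incompressible,
# while a regular one compresses to almost nothing.
import random
import zlib

random.seed(0)
chaotic = bytes(random.getrandbits(8) for _ in range(10_000))
structured = bytes((i // 100) % 2 for i in range(10_000))  # 0/1 stripes

print(len(zlib.compress(chaotic)))     # close to 10_000: incompressible
print(len(zlib.compress(structured)))  # tiny: highly regular
```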
  • I don't see how this answers the question. The question asks for a name for a certain kind of system; you seem to be suggesting a possible research direction.
    – David Richerby
    Commented Aug 5, 2015 at 13:55
  • That is a very good answer, actually. The answerer noticed that I don't have all the pieces of the puzzle and thus am having trouble formalizing my intuitive thoughts. This answer helped improve my reasoning partially, but I don't understand the notation used in the two formulas.
    – MaiaVictor
    Commented Aug 5, 2015 at 19:01
  • Anyway, I'm not sure if this works as I expect. François, wouldn't this definition lead to completely random/chaotic states being considered more complex than states with lifeforms defending themselves? After all, the Kolmogorov complexity would be greater.
    – MaiaVictor
    Commented Aug 5, 2015 at 19:03
  • @DavidRicherby The title asks for a name, but the question itself is more open.
    – François
    Commented Aug 6, 2015 at 8:14
  • @srvm Answer edited.
    – François
    Commented Aug 6, 2015 at 8:40

There is some scientific study of systems that roughly fit your criteria. You might examine the following to determine which fit based on their properties; there is a wide range of examples, with different phenomena in each category.
