Regulating vs. Steering


The difference between regulating and steering

[Figure: Controlconcept.png]
Control theory usually deals with the control of certain parameters of a system. A simple example is thermoregulation, in which the controlled parameter is temperature. First you set a certain temperature value at the thermostat as the Reference point, at which the control system should keep the temperature. The system measures the current room temperature via a Sensor and compares it to the reference temperature. If a deviation from the reference temperature is measured, the error signal results in the control action of the Controller, in this example a heating device, which is turned up or down according to the error signal. If it is too cold the heater is turned up; if it is too hot the heater is turned down. In this manner the controlled System is kept at a constant temperature. Control systems of this kind are called feedback control systems, because the output (temperature) is fed back to the input (sensor), forming a closed loop; for this reason they are also called closed-loop systems.
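To make the closed loop concrete, here is a minimal sketch of such a thermostat in Python. The plant model, gains and constants are illustrative assumptions, not taken from this article; the point is only that the Controller acts on the measured error (Reference point minus Sensor value) at every step.

<pre>
# Minimal sketch of the thermostat feedback loop described above (closed-loop control).
# All names and constants are illustrative assumptions, not part of the article.

def simulate_thermostat(reference=21.0, outside=5.0, steps=200, dt=0.1):
    """Simulate a room whose heater is driven by a proportional feedback controller."""
    k_loss = 0.1      # heat loss rate to the outside (assumed)
    k_heater = 1.0    # heating effect per unit of heater power (assumed)
    gain = 2.0        # proportional controller gain (assumed)

    temperature = 15.0                    # initial room temperature
    for _ in range(steps):
        error = reference - temperature   # Sensor value compared to the Reference point
        power = max(0.0, gain * error)    # Controller output; the heater cannot cool
        # room dynamics: the heater adds heat, the environment drains it
        temperature += dt * (k_heater * power - k_loss * (temperature - outside))
    return temperature

# settles slightly below the 21 °C reference, since a purely proportional
# controller leaves a small steady-state offset
print(f"room settles near {simulate_thermostat():.1f} °C")
</pre>

Because the controller only reacts to the current error, it needs no model of the outside temperature: whatever drains heat from the room simply shows up as an error and is counteracted.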

Temperature forecast for a period of 12 days. Red curve: mean temperature; dark blue: deviations from the mean.

The idea of regulation is a fundamental extension of the approach of early physics. In physics the primary goal was to examine a system and then predict its behavior; knowledge about a system is regarded as complete if all observations can be predicted. The fundamental difference in control systems is that the system can be an Open System, that is, a system that can have inputs or boundary conditions that are not known in advance, and still the system state can be known, since it is regulated. For the familiar example of thermoregulation, this means that the early-physics approach would have been to predict the weather and then implement the weather time course in a controller, which would then be able to keep the room temperature constant. Due to the nonlinear behavior of the weather, however, it is not possible to predict it for arbitrary times, since small deviations in the initial conditions can cause large differences in the calculated behavior of the system (see article Liapunov Exponent).


It is simply not possible to predict the weather for long periods of time. The graph on the right shows that after twelve days the temperature can be anywhere in the range between $-18°C$ and $+24°C$.
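The sensitivity to initial conditions mentioned above can be demonstrated with a toy chaotic system. The logistic map below is of course not a weather model; it merely shows how a difference of $10^{-6}$ in the initial condition grows until the two trajectories are completely unrelated, which is exactly what ruins long-range forecasts.

<pre>
# Small illustration of sensitive dependence on initial conditions.
# The logistic map is a stand-in toy system, not a weather model.

def logistic_map(x0, r=4.0, steps=30):
    """Iterate x -> r*x*(1-x) and return the whole trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_map(0.300000)
b = logistic_map(0.300001)   # initial condition differs by only 1e-6

for n in (0, 10, 20, 30):
    print(f"step {n:2d}: |difference| = {abs(a[n] - b[n]):.6f}")
# the tiny initial difference grows roughly like exp(lambda * n),
# where lambda is the (positive) Liapunov exponent
</pre>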

In fact, the difference between the first method (feedback control, or closed-loop control) and the second method described (open-loop control) is that feedback systems constantly integrate information (they regulate), while open-loop systems steer. Another, more conventional example of steering is programming a robot to dance, like this one, where the motors are programmed to move in a predefined way. It is evident that if the steering is not 'intelligent', let's say you program the robot to lean back too much, it will fall. In a more complex situation, for instance when the wind blows, you have to know how hard the wind blows to keep the robot from falling by taking the right action. Intelligence in this respect means knowledge about all the forces acting on the system. Since in real life most of what happens is not predictable, feedback control provides the 'intelligent' solution without prediction.
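The contrast between steering and regulating can also be made explicit with a small simulation. In the hypothetical sketch below, the open-loop controller applies actions computed from its internal prediction, which knows nothing about the 'wind', while the closed-loop controller acts on the measured state; the plant model, gain and disturbance value are assumptions chosen only for illustration.

<pre>
# Toy comparison of steering (open loop) and regulating (closed loop)
# under an unknown constant disturbance ("wind"). Illustrative only.

def simulate(use_feedback, disturbance=0.5, steps=100, dt=0.1):
    """First-order plant x' = u + disturbance; the goal is to drive x to 0."""
    gain = 2.0          # controller gain (assumed)
    x = 1.0             # actual state, e.g. the robot's lean angle
    x_model = 1.0       # the controller's internal prediction (no wind in it)
    for _ in range(steps):
        if use_feedback:
            u = -gain * x          # regulate: act on the measured state
        else:
            u = -gain * x_model    # steer: act only on the prediction
        x += dt * (u + disturbance)           # the real world includes the wind
        x_model += dt * (-gain * x_model)     # the prediction does not
    return x

print(f"steering   (open loop):   final x = {simulate(False):+.3f}")
print(f"regulating (closed loop): final x = {simulate(True):+.3f}")
</pre>

The open-loop run drifts away because the unmodelled disturbance never enters its computation, whereas the feedback run stays close to the reference without ever 'knowing' how hard the wind blows.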

In simple control operations like the thermostat example shown above, the task of the controller is to keep some parameter constant. If we consider an organism as a biological system that arose through evolution, to what extent are such control operations manifested in organisms? Some examples of these mechanisms are the control of body temperature and the pupil control system of the eye, both of which were worked out in detail by John H. Milsum in his book 'Biological Control Systems Analysis', published as early as 1966. All examples shown so far, whether technical or biological, have in common that the aim of the control is known. In technical control operations the goal is given by the designer. But how did biological systems evolve to control certain parameters? Evolution is thought to optimise the response of the organism in a certain environment. But what is the goal of the response? What gets optimised? In certain situations the growth rate of an organism has proved to be maximised (see this article). However, since evolution is an open process with no predefined goal, no general statement about a universal goal can be made. As Shakespeare said, "To be, or not to be: that is the question"! So the only thing we can assume to be intrinsic to all existing things is that their goal is existence; otherwise they would not exist. In other words, everything that exists does something that makes it exist. But this statement is fairly useless, since it has no predictive power. The question then is what it does that makes it exist.