A destructor does just the converse,
allowing one to "destructure" an object
into more elementary components.
Firstly, it must be noted that
not all components of a structure need be explicitly
given to a constructor or returned by a destructor.
Actually, for the sake of readability,
but also because no system can be founded
without (infinitely many) implicit axioms,
most of the time
many parameters are left implicit,
either taken as obvious from the context,
or with their values deferred until some later time.
Secondly, when some unique objects such as memory
(see linear programming)
are taken into account as parts of constructions,
destructuring the construction
to yield the original state of some of these unique objects
requires the construction to no longer be available,
so that complete destructuration of such an object also
means its destruction.
Thirdly, though the algorithmic meaning of objects may come
from global properties of complex constructions,
the only construction and destructuration operators provided by the
programming language may be local, leaving no way to safely express
these properties.
Of course, all these aspects can be validly combined in many ways,
from constructions
in which every meaningful aspect is explicit and non-linear
and is handled in a single operation (the clean case),
to constructions in which most aspects are implicit, with semi-explicit
linear compounds and no way to have the language automatically group
operations in a semantically safe way (which is the dirty case).
For instance, in the traditional "OO" paradigm,
the constructor/destructor terminology is used only in relation to the
pool of available low-level, side-effecting resources,
with the algorithmic aspects of objects left implicit,
so that from this point of view, destructuration is destruction.
The facts that the traditional C++ model forces all resources,
however different and independent,
to be considered together when describing construction of objects,
from some arbitrary single point of view,
and that constructors and destructors have to be specified
independently from each other,
without any consistency guarantee being
accessible to programmers or checkers,
all contribute to making this model as dirty as it could be,
and show, to say the least, the extreme confusion of those behind it.
On the contrary, in traditional typed lambda calculi,
constructors are used for objects of high-level semantics,
in a usually pure way, where all low-level side-effects are made implicit;
the "destructor" terminology is not used explicitly in the usual
functional programming style,
destructors being implicitly used through pattern-matching
(which provides a very efficient and expressive way to deal with them);
the term is also avoided because it could lead to confusion with
the destruction of objects, whereas semantically, nothing is destroyed
or created, as pure functional objects always exist in some abstract space,
and the fact that their storage may be reclaimed
once they are no longer useful is an independent implementation issue.
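As a minimal sketch of this (the type and function names below are ours,
purely illustrative, written in Haskell): a constructor assembles an object,
and pattern matching "destructures" it back, without destroying anything.

    -- A constructor assembles components into an object...
    data Point = MkPoint Int Int

    -- ...and pattern matching acts as the implicit destructor: it recovers
    -- the components, while the original Point remains perfectly usable.
    taxicabNorm :: Point -> Int
    taxicabNorm (MkPoint x y) = abs x + abs y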
Of course, storage is no longer an independent issue in contexts where
resource availability is critical; clean style can then be preserved
while allowing full expression of resource constraints
by use of Linear programming,
in which constructors and destructors have the combined semantics
of high-level algorithmic construction and low-level resource allocation.
Unfortunately, our generalized point of view is never explicitly considered,
even though it unifies the above particular concepts, as well as many others.
In the Tunes project,
faithful to our liberal philosophy,
we strive to enable everyone
to have full control over what one may leave implicit or make explicit
in any particular program,
so that everyone can build, from the available paradigm constructors,
a paradigm of their own that destructures in a way that fits whatever program they write,
instead of trying to force everyone to use a single, centrally decided
paradigm.
And controlling what is implicit
is precisely what reflectivity
is all about.
Cybernetics
A definition for Cybernetics could be that it is the science of
information,
that is, of what makes dynamical systems and meta-systems.
A good pointer about it is the
Principia Cybernetica Project.
See Dynamism
and Reflection
Cynicism
Cynicism is a philosophy that refuses the abstraction of moral concepts:
the cynic refuses to consider "what if ..." hypotheses about the world,
which alone can found a non-religious morality.
The pertinence space of a cynic is thus made of raw facts,
with no expressiveness,
and for a cynic the world has no (human-reachable) meaning.
Decentralized
The opposite of centralized.
Not obeying a single guide, chief, führer, duce, or
any other kind of grand, beloved, almighty, all-knowing leader.
Not even having such an organization in any part of the system.
Destructor
See Constructor
Distributed
The fact that information in a network is dynamically
organized throughout the system, and that the system is decentralized.
Opposed to "networked"
in its common, restricted sense.
Dynamism
The notion according to which the world
constantly changes and evolves,
and no static analysis can be indefinitely valid.
It appears that the main errors we fight are due to a static view of
a world that is essentially dynamic.
End-Users
According to the silly notions of
traditional systems,
some category of dumb people for whom computers would be made.
Actually, dumbness and intelligence are quite independent of
people's technical knowledge.
By despising people like that, the computer industry only manages to
discourage people from using computers.
Entropy
Entropy is the opposite of information.
It is a concept that was discovered fairly recently (in the 19th century)
by physicists. It is a magnitude that appeared naturally in the equations
of thermodynamics; when physicists interpreted its meaning, they realized
that it measures the amount of missing information about the state of a dynamical system,
which, as they realize more and more every day, our physical world happens to be.
Basic theorems about such dynamical systems assert that in a closed
system, entropy globally increases with time; that is, information is
being lost, and the system evolves toward states that are more and more
indistinguishable, less specific, more probable.
Ethics
Ethics is the interface between the world and philosophy.
It helps define abstract moral rules that should direct our behavior.
It seems that we can define ethics based on
information:
the good would be to maximize available information in the long run.
One's contribution to global information would be the measure of one's
behavior.
Because total information in the system is bounded,
not only the amount of actual information collected,
but also the waste of potential information
is to be considered.
Also, producing redundant information is considered poor yield;
original information is much more valuable. In fact, all of this results
from the fact that elements of information are not to be considered
independently, but accumulated together.
The fact that one's works can propagate one's information
beyond one's short (and condemned) physical life is a good
justification for long-term moral behavior:
the information resulting from one's deeds is the part of oneself
that survives one's death.
Of course, in any case, it will be only a small contribution to the
world; but even then, "good" deeds will have long-term effects,
while "bad" deeds from different people will cancel out in the
chaos of each other's deeds.
Thus, if you want to live longer and broader, you must do good.
No need for any supernatural explanation.
But all this is a very large problem which is to be discussed in another
project (see Cybernetics).
Expediency
We say that something is expedient if it benefits
one's (or a few people's) short-term and narrow interests.
In a more restricted sense (as used in the
TUNES acronym),
"expedient" implies that the expedient thing may be harmful or disastrous
for global long-term welfare.
We thus oppose this restricted expediency
to real Utility.
Beware that this does not in any way
mean that everything expedient in the general sense is expedient
in the restricted sense.
Actually, most of the time, we have only a vague idea of what is really
useful or harmful, so within the limits of this knowledge,
expediency is the best approximation we have; if we do not have clear
ideas about expediency either,
we follow traditions and habits.
For example,
mass murder and concentration camps are expedient to
dictatorial governments;
racketeering and selling drugs or weapons are expedient to the mob;
industrial monopolies are expedient to their owners
and to partners in a trust (a.k.a. the white-collar mob);
wasting the Earth's natural resources may be expedient to those who exploit
them;
wasting human work is expedient to those who sell broken cheapo products in
a rigged market with unfair competition;
but all of these are extremely harmful to the world.
Expressiveness
A language is said to be more expressive than another one
if it can express more concepts than the other.
If concepts are computable functions,
then there is a maximally expressive class of languages,
those that are equivalent to Turing machines.
Now, what are concepts?
While admittedly any language can be expressed inside another one,
with all its sentences explainable in the other,
this does not mean that you can find a faithful translation
of a sentence of one into a sentence of the other.
Poetry, ambiguity, puns, style, are difficult to translate.
So there can be more to a sentence than the informational
meaning usually expected from it.
While you can consider computer programs as just representations
of computable functions, you can also consider criteria like
computational complexity, human readability (under various circumstances),
ease of modification, length, ease of connection with other concepts in other
contexts, etc.
There are many divergent and sometimes opposite ways to refine the
concept of expressiveness.
See Abstraction level
Now, a whole class of such ways is easily obtained with some Formal Logic:
we say that a structure S1 is computably more expressive than a structure S2
iff there is a "representation" morphism of S2 into S1
that preserves all verifiable existential statements,
which precisely means that any computable object in S2 can be
computed in S1.
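One hedged formal reading of this definition (the notation below is ours,
not from this glossary; P ranges over verifiable existential, i.e. Sigma-1,
formulae, and P^phi denotes P translated along the morphism phi):

    % S1 is computably at least as expressive as S2 when some representation
    % morphism phi carries every verifiable existential statement of S2
    % to one that holds in S1.
    S_2 \preceq S_1
      \iff
      \exists\, \varphi : S_2 \to S_1 \;\text{ such that for every } \exists x.\, P(x):
      \qquad
      S_2 \models \exists x.\, P(x)
      \;\Longrightarrow\;
      S_1 \models \exists x.\, P^{\varphi}(x)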
Now, while existential statements represent what can be computed
with the system, they surely do not suffice to express what can
be computed about the system,
or, more simply, what can be computed with extensions or restrictions of the system.
So as not to neglect pertinence,
we must consider expressiveness relations between structures
relative to any family of formulae defined in arbitrarily combined terms of
quantification complexity (e.g. Sigma-n, Pi-n, Delta-n formulae),
positivity constraints on some atomic relations,
fixed behavior on some subset of the language,
"weight" of formulas,
etc.,
on either side of the morphism.
Among these expressiveness relations,
those with homogeneous, symmetric constraints on the morphism
induce a partial order over the considered structures.
In conclusion, we have plenty of well-defined tools
that allow us to build expressiveness hierarchies among programming languages.
These tools seem almost never to be used in common courses about computer languages,
which is a shame,
and leads to most available languages being very poorly expressive.
Unfortunately, we quickly see
how poorly expressive the available languages are relative to what could be done,
and how, even among these available languages,
some are miserably expressive compared to others.
Surely, in the early times
when logicians were the ones who studied programming languages,
before computers were widely available,
these expressiveness criteria appeared obvious to them,
so they did not study them much;
the (often justified) minimalist trend among mathematicians also made them
try to express as much as possible with minimally expressive languages.
Then, the tiny resources of early computers
kept programs small and easy to study,
and made very expressive languages superfluous,
while poorly expressive languages were much easier to implement
within the limited available resources.
All these phenomena contributed to programming language expressiveness
being neglected for a long time.
But if computer science is to accumulate
any kind of tradition in a reliable way
that lasts across generations,
it can only be through expressive languages,
because only these allow people to understand
what long-forgotten people, using long-replaced technology,
may have meant.
See the article "On the Expressive Power of Programming Languages"
(SCP 91) by Matthias Felleisen
at Rice University
(in
PS format)
Fair Competition
Competition based on real, total information,
instead of the explicit or implicit lies
put out by the most powerful cheaters.
Fair Competition is the ideal
that liberalism tells us to strive toward,
as a stable way to enhance any reflective dynamical system's
informational state.
The French most often mistakenly translate "fair competition" as
"libre concurrence" ("libre" meaning free)
instead of "concurrence loyale", which fits better.
This is typical of a misconception of fair competition as
free enterprise (which is also required by liberalism),
that is, the freedom for anyone to enter the competition
and invest one's resources in any civil activity,
should existing competition fail to fulfill some needs,
or do so at excessive prices.
On the contrary, Fair Competition is the
dual requirement to Free Enterprise:
it means that the competing parties must abide
by rules of respect for the consumer,
which law and justice should enforce whenever doable,
and whereby the overhead of obtaining information that
helps choose among competitors should be reduced.
This in particular means
that no information should be concealed from consumers,
that no misinformation should be spread among them,
and that there should be actual competition, with no trusts that racketeer people.
Please be aware that, sadly,
"fair competition" is often used as a slogan,
outside of any theoretical context that gives it any meaning,
and sometimes in fallacious ways that distort such worthwhile contexts.
For instance,
the advertising lobby currently holds an outstandingly high power
over the mass media;
these people try to justify their methods
by an argument that does indeed
justify the existence of similar corporations,
but ones with completely opposite methods:
they invoke the need for consumers to be informed about products,
a need that exists in any liberal market, so that the competition be fair.
Sure, information is dearly needed,
but propaganda is not information,
for it contains much more noise and disinformation
than it contains information.
To be fair, advertising should be based on
actual, objective arguments,
all of which must be easily checkable by the advertisee,
not on slogans, fallacious associations, and appeals
to people's lowest instincts.
Feedback
The mechanism by which information can stabilize,
propagate, and evolve in a dynamical system:
subsystems correct their output
according to the input they later get
back from the "external world".
When the corrected output changes in the direction opposite to the
difference between the input and some point,
we have negative feedback, which allows the
subsystem to stabilize around that point, which we call a point of equilibrium.
In a dynamical world, where nothing can be taken for sure as definitely
granted, only through feedback can one adapt to the world. In particular,
we must continually confront our theories and ideas with actual practice,
and with the ideas others may have, before we can trust them in any way. Ideas that cannot
suffer and survive such confrontations are false.
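As a minimal sketch of negative feedback (the names and numbers below are
ours, purely illustrative, in Haskell): at each step the output is corrected
against the difference between the current value and a set point, and
iterating the correction makes the value stabilize around that point.

    -- One correction step: move the value against its deviation from the set point.
    stepToward :: Double -> Double -> Double -> Double
    stepToward gain setPoint x = x - gain * (x - setPoint)

    -- Iterating the correction converges toward the point of equilibrium.
    trajectory :: [Double]
    trajectory = take 10 (iterate (stepToward 0.5 1.0) 10.0)
    -- [10.0, 5.5, 3.25, 2.125, ...], converging to 1.0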
First-order
In which only objects of a theory may be abstracted.
Freedom
See Liberty.
Functional
Functional Programming is a very powerful
programming paradigm that originates in
the early works on the lambda calculus:
computations consist of evaluating/expanding terms.
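As a minimal illustration (our own, purely illustrative example, in Haskell),
a computation is just the stepwise expansion of a term:

    fact :: Integer -> Integer
    fact 0 = 1
    fact n = n * fact (n - 1)

    -- fact 3
    --   = 3 * fact 2
    --   = 3 * (2 * fact 1)
    --   = 3 * (2 * (1 * fact 0))
    --   = 3 * (2 * (1 * 1))
    --   = 6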
GNU
The GNU project, founded by Mr R. M. Stallman,
has a leading place among free software projects,
of which it is a precursor.
The ideas about free software developed in this project
are mostly agreed to by the Tunes people;
namely, that by promoting a generalized, constructive
distribution of information flow,
instead of the centralized, destructive organization
imposed by the large corporations that have held power up to now
in the computing field,
computer users and programmers are made freer, happier,
and more productive.
However, the technical goals of the Tunes project are completely
different from those of the GNU project.
The GNU project aims at replacing existing non-free tools
with free, compatible counterparts, along with a freedom to make better
counterparts, which programmers inevitably take up.
The Tunes project aims at refounding computing systems on cleaner grounds,
by rejecting the flaws of traditional designs, and without directly caring about
any kind of compatibility.
The GNU project is much more pragmatic, and of immediate need and use.
The Tunes project is looking for longer-term utility.
The two projects do not fight each other, which would be against
their common free software philosophy.
Instead, I feel they are complementary, and should enrich each other.
God
Considering a system or universe, anyone who has the power to defy the laws
of that universe is a God.
Then of course a God does not exist
inside the considered universe, but in some outer, greater universe,
even though he might constantly show his presence through perpetual miracles
(see input/output).
He may also perform occasional miracles, but these are completely beyond
understanding from inside the system (as a corollary, anything that is
not completely beyond understanding is not a miracle).
Particularly, the human user who can shut down a computer is a God to the
computer programs running on that computer.
Thus, in Tunes terminology, we use this appropriate term
where other people talk about the "super-user".
Also see Holy, root.
Grain
The grain of a system is the minimal or typical size of objects this
system can handle. The larger this size, the coarser the grain;
the smaller this size, the finer the grain.
The coarser the grain, the more expensive it
is to create, handle, maintain, and dispatch objects.
Thus, computing liberalism tells us
that to have a system that adapts better, you must reduce the grain as
much as possible.
However, handling objects may involve
administrative
overhead.
Thus, to achieve a better system, you must also reduce the administrivia
together with the grain, wherever possible.
Traditional Operating Systems have a very coarse grain
(the basic object size being a file, and the typical packaging grain being
the huge "application"), thus yielding poor performance and adaptability.
The insecure C language forbids a
finer grain. As long as OSes use such a language, the
computer industry is bound to
poor results.
Higher-order
In which any kind of abstraction is recursively
possible over a system; that is, objects that talk about
objects that talk about, etc., the system.
High-level
See abstraction level
Hole
See security hole.
Holy
That something be holy shall mean that it is absolute
and does not depend on any human-reachable thing.
Ancient traditions, from times when education was
too expensive and superstition was a better way to propagate information,
have imposed the cult of some things as holy as an easy way to have
people propagate this information.
As with any method, this one can be used (and was used) to
propagate all kinds of information, bad as well as good. But this method,
where people learn without questioning, leads to crooks taking advantage
of people's ignorance; and either the method stresses conservatism,
so that the information cannot be enhanced or integrate feedback from real life,
or it doesn't, and crooks will corrupt the information.
This method for propagating information thus has a high noise level,
a low adaptation rate, and nasty side effects; but it requires no
civilizational investment to be possible.
This is why this method is inferior to methods that put reason
in the place of blind trust, that accept the feedback best
adapted to real life, and that are not utterly corrupted by crooks, or petrified
by conservatism.
Note that there is a full range of intermediate methods between these,
where belief slowly gives way to reason; actually, reason is always
founded on beliefs, and beliefs can only be narrowed, never eliminated.
Also note that narrowing beliefs requires an ever larger cultural
background, to confront beliefs with knowledge from real life. Thus,
only the progress of civilization makes the use of reason possible, while
this use makes civilization possible in turn, by having it compete against
the natural tendency toward entropy and chaos, as expressed by crooks.
Some claim that (at least some) words are holy.
That's pure nonsense.
If you're not convinced, take the word "God",
the most holy word to many religious people.
If you pronounce it to a Frenchman,
he will understand that you refer to a tool
which decency prevents me from naming here.
The Latin (long the "holy" language of Christians)
for "God" is "Deus",
which is pronounced the way
the English pronounce "deuce",
which is a scoring term in tennis.
Actually, for a word to be holy,
the language it is spoken in must be holy too;
however, History tells us how quickly languages appear, evolve, and disappear,
as humans make and unmake them, so that languages are no holier than humans;
now, humans are quite human-reachable, and not holy at all.
As humans can only reach human-reachable things,
we can conclude that nothing holy is within a human's grasp.
Equivalently, nothing within human grasp is holy.
Thus, anything we can successfully talk about is not holy,
and there's no point trying to talk about holy things.
If you don't want to apply this to the human world,
at least you can apply it to the computer world,
where nothing is holy as everything was decided and can be changed by humans.
Note that, happily, things need not be holy to be useful.
Also note that things needn't be holy or universally true
to deserve respect.
But that is quite another debate.
Host
Standard name for the basic entity in a computer network:
each computer in the network is typically called a "host".
Information
Measures the improbability of the known state of a system.
The more information we have, the more restricted the space of
possible actual system states is, compared to some reference distribution of
possibilities.
In computers, the basic unit of information is the bit.
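As a minimal sketch of how information can be counted in bits (the function
below is our own illustration of the standard Shannon measure, not something
defined by this glossary): knowing that a system is in a state of probability
p carries -log2 p bits, and a distribution's entropy is the expected missing
information.

    -- Entropy, in bits, of a probability distribution given as a list.
    entropyBits :: [Double] -> Double
    entropyBits ps = negate (sum [ p * logBase 2 p | p <- ps, p > 0 ])

    -- A fair coin toss carries 1 bit; a certain outcome carries none.
    -- entropyBits [0.5, 0.5] == 1.0
    -- entropyBits [1.0]      == 0.0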
Fundamental theorems about entropy tell us that
global information decreases. But locally, it can increase by using
the free part of the energy in some disequilibrium, and life is just such an
engine, one that pumps entropy away.
See the Ethics of information.
Inheritance
Inheritance is an implementation hack for obtaining refined definitions
of objects.
It's the typical example of an element of
tradition that people
mistakenly consider holy,
and refuse to reject even though reason demonstrates how lame it is
compared to what we now know.
If we formalize inheritance, we see that it consists in allowing one
to define higher-order functions, under the constraint that each may
be applied only once, and only just after it is defined.
Single inheritance also requires that there be a unique argument.
Multiple inheritance accepts several arguments to the function,
all of which have to be given at once,
with no constructor allowed to appear twice after expansion
except with the same arguments,
and without any alpha-conversion
(well, a language like Dylan does offer some alpha-conversion).
Any sensible person can see how ridiculous such constraints are.
For more serious object and class definition mechanisms,
look to delegation (or, isomorphically, functors), which are other names
for (sometimes limited) higher-order functions.
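As a minimal sketch of delegation as an ordinary higher-order function
(all names below are ours, purely illustrative, in Haskell), compare this
with inheritance's "apply once, right after definition" restriction:

    -- An "object" is just a record of behavior.
    data Shape = Shape { area :: Double, describe :: String }

    -- A refinement is an ordinary higher-order function: it takes an existing
    -- object and returns a refined one. Unlike inheritance, nothing forces it
    -- to be applied exactly once, or only at definition time.
    withLabel :: String -> Shape -> Shape
    withLabel label s = s { describe = label ++ ": " ++ describe s }

    square :: Double -> Shape
    square side = Shape { area = side * side, describe = "a square" }

    -- The refinement can be applied freely, even several times.
    example :: Shape
    example = withLabel "outer" (withLabel "inner" (square 2))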
Input
What is consumed by an open system.
See output.
Interoperability
The fact that a number of different entities are able
to speak the same language.
Isolation
When a computer object is not secure,
it must be isolated from possible sources of failure.
In traditional OSes,
processes are so
insecure that
the system has to completely, systematically, paranoidly isolate processes
from one another.
This isolation is like putting people in quarantine to prevent possible
diseases from propagating. But it also prevents people from interacting
with one another (i.e. from having any social life), and in the end people have to
cheat it in order to live at all, so that it loses its few advantages after having
wasted a life.
Kernel
Any "central" part in a software program.
Particularly, a statically linked centralized run-time dispatcher
(particularly in an "operating system").
As any centralized bloat, it turns out to be easier to design,
but completely unefficient to use.
Also see
Centralization,
Microkernel.
Lambda-Calculus
The mathematically formalized theory of abstractions.
The usual reference for it is a 1984 book by Barendregt.
See logic and functional programming in the
Languages page...
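As a minimal sketch of the calculus itself (our own, purely illustrative
Haskell encoding): the three term formers, with one step of beta reduction.
Capture-avoiding substitution is omitted here for brevity.

    data Term = Var String
              | Lam String Term          -- abstraction: \x. body
              | App Term Term            -- application

    -- subst x t u replaces free occurrences of x in u by t.
    subst :: String -> Term -> Term -> Term
    subst x t (Var y)   = if x == y then t else Var y
    subst x t (Lam y b) = if x == y then Lam y b else Lam y (subst x t b)
    subst x t (App f a) = App (subst x t f) (subst x t a)

    -- Beta reduction: (\x. body) arg  reduces to  body[x := arg].
    betaStep :: Term -> Maybe Term
    betaStep (App (Lam x body) arg) = Just (subst x arg body)
    betaStep _                      = Nothing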
Lazy Evaluation
The strategy of evaluating objects only when one is sure they are useful or
needed, and not before.
Lazy evaluation makes it possible to avoid lots of (possibly infinite) computations,
while greatly simplifying the way high-level programs are written.
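As a minimal sketch (our own example) in Haskell, which is lazy by default:
an infinite list is never computed in full; only the demanded prefix is.

    naturals :: [Integer]
    naturals = [0 ..]

    firstFiveSquares :: [Integer]
    firstFiveSquares = take 5 (map (^ 2) naturals)
    -- [0, 1, 4, 9, 16]; the rest of the infinite list is never evaluated.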
Liberalism
In its original, nineteenth-century sense,
liberalism is the theory which shows
that in any evolving system, there is a natural
selection by survival of the fittest with respect to that system's constraints,
so that to achieve the best state possible,
you must allow the fairest competition,
and the broadest liberty, so that
people may automatically adapt to all of the system's
natural constraints.
The accumulation of data throughout history that follows from this selection,
as well as the data accumulated as a result, is called
tradition.
Liberalism is commonly applied to the economy,
where it tells us that to achieve prosperity,
you must firstly
allow the fairest (not the wildest) competition between companies,
so as to have the quickest adaptation.
In particular, information should be freely available,
and discussion freely allowed, so that people may compare and choose;
choice should be free, and not based on prejudice.
And secondly, you must
encourage free enterprise (not free crookery)
and small businesses whenever possible, to achieve the finest
adaptation.
In particular, trusts and monopolies should be fought whenever
they appear, and strictly, democratically
controlled when they are inevitable.
Liberalism does not apply only to the economy,
as the works of John Stuart Mill in the moral sciences
and of Charles Darwin in the natural sciences show.
Some even speak of economic or biological Utilitarianism,
or of moral or social Darwinism!
The current continuators of this philosophical trend may well
be the cyberneticians and memeticians.
In the Tunes project, we apply those ideas to the
field of computing systems, that is, we defend
Computing Liberalism.
But beware: many people who call themselves (or are called)
liberal do not refer to this original liberalism, but to some
degenerate tradition that misunderstands the deep ideas behind
it, and only retains the apparent conclusions of a given time, which may
well be partial or obsolete. Often, they don't even care about
the ideas and just join because they hope to benefit from some
prestige related to a "liberal" tradition.
Though the word "liberal" may be historically attached to such
people and their parties, it is not in that sense that
the Tunes project uses this word.
Liberty
Liberty has always been a very difficult thing to define.
Basically, its meaning would be that people (or whatever subsystem is to
be free) are able to choose for themselves the goals they will pursue.
Thinkers soon discovered that, paradoxically,
while physical constraints, like violence, directly oppose liberty,
moral constraints (as expressed through laws)
are an essential condition for any liberty to exist.
Actually, this paradox has an easy solution,
once a proper theory of liberty is settled.
We define liberty as potential information.
Security will be actual information.
Physical constraints yield little original information (most of it being
redundant), while wasting a great deal of resources, which is why they
reduce liberty so much.
Moral constraints, on the other hand, allow the willing man to
focus on a restriction of the system where the world has a meaning,
that is, information, so that any new input from the world will be
additional information; thus moral constraints do increase liberty.
Moreover, with physical constraints, people reduce each other's liberty,
while with moral constraints, people increase each other's liberty.
In the first case, people expect each other to negatively interfere
with each other's goals. In the second case, people can expect each other
not to.
For example, having people endlessly repeat the same sentences of
wisdom greatly reduces their liberty,
and the more wisdom is being repeated, the more their liberty is reduced:
by repeating endlessly, one wastes the limited resources one
can dispose of, thus preventing potential information from being made
available with these resources; worse even, merely repeating things makes
those things meaningless, however great their original meaning was,
because they are no longer confronted with reality.
Information comes from interaction, not from just sitting there.
So not only is liberty reduced, but the security gained is small (because of
redundancy) and illusory.
On the other hand, defining lots of rules about the meaning of
sentences greatly increases liberty, by letting people communicate more
easily, thus allowing them to convey lots of information that couldn't be
conveyed otherwise.
John Stuart Mill wrote the excellent essay
"On Liberty" (1859), which I recommend to everyone.
It will explain much better than we can what liberty is,
or at least what we think about it in the Tunes project.
While it is very well worth the purchase of a paper copy,
it is also available on-line
"here" (283K),
or gzipped
here
(103K).
Because of a natural tendency toward entropy,
liberty naturally decreases, and we must continuously fight
so that it does not eventually reduce to nothing.
Linear Programming
In any generalized algebra,
that is, a structure with a commutative, associative sum operation
with a neutral element, over which the other operations distribute,
a functional term is said to be linear in some variable
if and only if this term can be expressed as a sum of "monomials"
in each of which the variable appears (at most) once.
For instance, x + x*y + 2*y*y
is linear in x, but not in y.
Linear equations have many interesting properties that have been
studied with great success for many algebras,
from number field theory to type grammars.
In particular, expressing computations with linear terms
(with logical disjunction as the sum operator)
is a very convenient and elegant way
to cleanly express the semantics of objects
while taking into account resources (such as memory)
whose physical uniqueness prevents multiple instances of the objects
from existing at the same time:
the state of the system will be represented by terms
that are linear in each of the (states of the) unique objects,
and the possible computational operations will be functions that
are linear in those objects, so as to preserve the fact that at any time
each unique object has just one state,
even though which one it is may still remain to be decided among many.
Linear programming is the art and science of programming with
such linear terms.
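As a minimal sketch of this discipline (all names below are ours, purely
illustrative, in Haskell): a unique resource is threaded through the
computation so that each of its states is consumed exactly once. Here the
single-use rule is only followed by convention; a linearly typed language
(for example, GHC Haskell with the LinearTypes extension) can enforce it.

    -- The state of a unique buffer; an old state must not be reused
    -- once a new one has been produced from it.
    newtype Buffer = Buffer [String]

    -- Each operation consumes one buffer state and yields the next one,
    -- so the buffer is used linearly: exactly once along the computation.
    write :: String -> Buffer -> Buffer
    write line (Buffer contents) = Buffer (contents ++ [line])

    close :: Buffer -> [String]
    close (Buffer contents) = contents

    -- The whole computation is a chain in which every intermediate state
    -- appears exactly once.
    example :: [String]
    example = close (write "second" (write "first" (Buffer [])))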
Low-level
See abstraction level
Man
Some fragile machine that takes several tens of years to manufacture,
with a low success rate, and only seconds to go definitively out of order.
The most powerful and most versatile machine known to date, however.
Meta-
A prefix that means beyond, or transcending.
Webster's says:
used with the name of a discipline to designate a new but related discipline
designed to deal critically with the original one (e.g. metamathematics).
We also apply this prefix to objects in the meta-discipline:
e.g. if the discipline deals with bars, the meta-discipline
will deal with meta-bars.
Metasystem transition
There is a theory, which has been in the air for a long time and is being
settled by the
Principia Cybernetica Project
as "metasystem transition theory",
which says that as the known world evolves toward more complexity,
it organizes itself into "higher" forms that let lower forms do things
while they control things instead of doing them; that is,
it evolves from systems to meta-systems.
The liberal/evolutionist/dynamical-systemic
analysis explains this by the fact that
relatively simple achievements in the meta-system
yield great improvements in the system,
or allow similar or greater results to be achieved
without spending as many resources,
while developing capabilities that prove useful
in quite a few unsuspected domains,
and that allow new forms of
free energy, not yet competed for, to be taken advantage of.
The development of the various forms of organizations known as life,
from particles
up to atoms
(organizations of particles),
up to molecules
(organizations of atoms),
up to simple organic molecules up to molecule complexes, protocells, and cells
(organization of molecules),
up to multicellular complexes and multicellular life forms
(organization of living cells),
up to life forms with specialized organs of increasing complexity
(organization of organs),
up to lifeforms with nervous systems
(systems to coordinate the organs),
up to societies of the above living creatures, of various complexities,
up to ecosystems,
up to biospheres,
up to the study and development of the above by human rationality,
up to god knows what will follow next,
all this bears overwhelming witness to that theory.
As for computing, all this means that
people,
after having developed analog devices,
have developed analog device components,
which led them to digital device components,
and then to computer hardware development,
to binary software programming,
to macro-assembly software programming,
to programming languages (the initial LISP notation),
to interpreters,
to compilers (FORTRAN),
to low-level languages (C),
and to medium-level languages (ML; LISP has been all the way down to here).
Tunes promotes the next meta-system transition in computing:
what we call high-level languages, and
metaprogramming.
Meta-system transition does not mean that there is a clean cut between
the system and the "meta-system", as if the latter existed independently, or
could "detach" itself from its lower dependencies.
Actually, the "lower" system is never so active as when the
meta-system exists, and the meta-system does depend on this increased
activity (which it in turn generates) to even exist; it feeds on it
and promotes it:
cellular life has not disappeared but multiplied since multicellular
life appeared;
the single currently known reasoning species
has not had a less active, but a much more active, material life
since it developed its spiritual skills and activities;
many more people (not counting programs) do assembly programming since
interpreters and compilers were invented.