In the end, this system resembles the one used by the mafia and
some sects, and is actually the same as in any hierarchically organized mob.
But don't take this the wrong way; let us just analyze why this happens.
Firstly, there is no such thing as a world-wide conspiracy against
the customer.
Instead, there is a wild competition,
where everyone looks for his own short-term profit,
without any law or regulatory body,
without any professional ethics.
That is, we face a law-of-the-strongest anarchy, which naturally
structures itself into this mob-like organization.
The Tunes project strives to provide the means for a fair competition
to emerge.
Also see
Computing System,
Computing Freedom,
Computing Liberalism,
Operating System,
Programmers,
Users,
End-Users.
Computing System
A computing system, as the name says, is a system that
allows men to do computations. Do not confuse it with a computer system,
which may be any kind of system based on a computer. A computing
system is for dynamic human interaction, so that men may express their
creativity, and adapt to an ever-changing world.
It encompasses all of a computer's hardware and software
that make it interact with the external world of human beings.
Also see
Tunes,
Operating System.
Computing World
The computing world, as the name says, is the concept
encompassing all computers in the world, considered
as running different versions of a common
Operating System,
and linked to a global network.
In fact, using our OS definition, the global OS
is the greatest common divisor of all existing systems (that is, not much),
and even computers remotely linked to an actual network through weekly,
monthly, or yearly floppies are considered as being on the global network.
Conservatism
The tendency to stick to the current contents of
tradition.
The degenerate tendency to believe that these contents should be conserved
because they are held to be holy.
Constructors and Destructors
In computer terminology,
a constructor is some function
that takes a list of parameter objects
and makes a new object out of these
by assembling them into a structure.
A destructor does just the converse,
allowing one to "destructure" an object
into more elementary components.
Firstly, it must be noted that
not all components of a structure need be explicitly
given to a constructor or returned by a destructor.
Actually, for the sake of readability,
but also because no system can be founded
without (infinitely many) implicit axioms,
most of the time
many parameters are left implicit,
taken as obvious from the context,
or their value deferred to some later time.
For instance, in the traditional "OO" paradigm,
the constructor/destructor terminology is used only relative to the
pool of available low-level side-effective resources,
and never for algorithmically meaningful parts of programs.
On the contrary, in traditional typed lambda calculi,
constructors are used for objects of high-level semantics,
in a usually pure way, where all low-level side-effects are made implicit.
The "destructor" terminology is not used explicitly in usual
functional programming style,
destructors being implicitly used through pattern-matching.
Linear functional (or logic) programming makes all destructors apparent.
Unfortunately, our generalized point of view is never explicitly considered,
even though it unifies the above particular concepts, as well as many others.
In the Tunes project,
faithful to our liberal philosophy,
we strive to enable everyone
to have full control over what one may leave implicit or make explicit
in any particular program,
so that everyone may build their own paradigm from the available
paradigm constructors, and destructure it in a way that fits whatever
program they write, instead of being forced to use a single centrally
decided paradigm.
And controlling what is implicit
is precisely what reflectivity
is all about.
Cybernetics
A definition for Cybernetics could be that it is the science of
information,
that is, of what makes dynamical systems and meta-systems.
A good pointer about it is the
Principia Cybernetica Project.
See Dynamism
and Reflection.
Cynicism
Cynicism is a philosophy that refuses the abstraction of moral concepts:
the cynic refuses to consider "what if ..." hypotheses about the world,
which alone can build a non-religious morality.
The pertinency space of a cynic is thus made of raw facts,
with no expressivity,
and for a cynic the world has no (human-reachable) meaning.
Decentralized
The opposite of centralized.
Not obeying any single guide, chief, führer, duce, or
any other kind of grand beloved almighty all-knowing leader.
Not even having such an organization in any part of the system.
Destructor
See Constructor
Distributed
The fact that information in a network is dynamically
organized by the system, so that the system is decentralized.
Opposed to "networked"
in its common restricted sense.
Dynamism
The notion according to which the world
constantly changes and evolves,
and that no static analysis can be indefinitely valid.
It appears that the main errors we fight are due to a static view of
a world that is essentially dynamic.
End-Users
According to the silly notions of
traditional systems,
some category of dumb people for whom computers would be made.
Actually, dumbness and intelligence are quite independent of
people's technical knowledge.
By despising people like that, the computer industry only manages to
discourage people from using computers.
Entropy
Entropy is the opposite of information.
It is a concept that was discovered fairly recently (in the 19th century)
by physicists. It is a magnitude that appeared naturally in the equations
of thermodynamics; when physicists interpreted its meaning, they realized
that it expresses the amount of missing information in a dynamical system,
which, as they increasingly realize, our physical world happens to be.
Basic theorems about such dynamical systems assert that in a closed
system, entropy globally increases with time; that is, information is
being lost, and the system evolves toward a state more and more
indistinguishable, less specific, more probable.
Ethics
Ethics is the interface between the world and philosophy.
It helps define the abstract, moral rules that should direct any of our behavior.
It seems that we can define ethics based on
information:
the good would be to maximize available information in the long run.
One's contribution to global information would be the measure of one's
behavior.
Because the total information in the system is bounded,
not only the amount of information actually collected,
but also the waste of potential information
must be considered.
Also, producing redundant information yields little;
original information is much more valuable. In fact, all this results
from elements of information being considered not independently but
accumulated together.
The fact that one's works can propagate one's information
beyond one's short (and doomed) physical life is a good
justification for long-term moral behavior:
the information resulting from one's deeds is the part of one
that survives one's death.
Of course, in any case, it will be only a small contribution to the
world; but even then, "good" deeds will have long-term effects,
while "bad" deeds from different people will cancel out in the
chaos of each other's deeds.
Thus, if you want to live longer and broader, you must do good.
No need for any supernatural explanation.
But all this is a very large problem which is to be discussed in another
project (see Cybernetics).
Expediency
We say that something is expedient if it benefits
one's (or a few people's) short-term and restricted interests.
In a more restricted sense (as used in the
TUNES acronym),
expedient implies that the expedient thing may be harmful or disastrous
for the global long term welfare.
We thus oppose this restricted expediency
to real Utility.
Beware that this does not
mean that all things expedient in the general sense are expedient
in the restricted sense.
Actually, most of the time, we have only a vague idea of what is really
useful or harmful, so within the limits of this knowledge,
expediency is the best approximation we have; if we do not have clear
ideas about expediency either,
we follow traditions and habits.
For example,
mass murder and concentration camps are expedient to
dictatorial governments;
racketeering and selling drugs or weapons are expedient to the mob;
industrial monopoly is expedient to the owners of these monopolies
and partners in a trust (a.k.a. the white-collar mob);
wasting the Earth's natural resources may be expedient to those who exploit
them;
wasting human work is expedient to those who sell shoddy products in
a rigged market with unfair competition;
but all these are extremely harmful to the world.
Fair competition
Competition based on real, total information
instead of the explicit or implicit lies
issued by the most powerful cheaters.
Fair Competition is the ideal
that liberalism tells us to strive toward,
as a stable way to enhance any reflective dynamical system's
informational state.
The French most often mistakenly translate "fair competition" as
"libre concurrence" ("libre" meaning free)
instead of "concurrence loyale", which fits better.
This is typical of a misconception of fair competition as
mere free enterprise (which is also required by liberalism),
that is, freedom for anyone to enter the competition
and invest one's resources in any civil activity,
should existing competition fail to fulfill some needs,
or do so at excessive prices.
Fair competition also and mostly means that the competing parties
must abide by rules of respect for the consumer,
which law and justice should enforce whenever doable,
and that the overhead of obtaining the information that
helps choose among the competition should be reduced.
This in particular means
that no information should be concealed from consumers,
that no misinformation be spread among them,
and that there be actual competition and no trusts that racket people.
Please be aware that sadly,
"fair competition" is often used as a slogan,
outside of any theoretical context that gives it any meaning,
and sometimes in fallacious ways that distort such worthwhile contexts.
For instance,
the advertisement lobby currently holds an outstandingly high power
over the mass media;
these people try to justify their methods
by an argument that does indeed
justify the existence of similar corporations,
but with completely opposite methods:
they invoke the need, present in any liberal market, for consumers to
be informed about products, so that the competition be fair.
Sure, information is sorely needed,
but propaganda is not information,
for it contains much more noise and disinformation
than it contains information.
To be fair, advertisement should be based on
actual, objective arguments,
all of which must be easily checkable by the advertisee,
not on slogans, fallacious associations, and calls
to people's lowest instincts.
Feedback
The mechanism by which information can stabilize,
propagate, and evolve in a dynamic system:
subsystems correct their output
according to the input they later get
back from the "external world".
When the corrected output changes in an opposite way to the
difference between the input and some point,
we have negative feedback, which allows the
subsystem to stabilize around that point that we call point of equilibrium.
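A minimal sketch of negative feedback in Python (the function and its parameters are ours, purely for illustration): the output is repeatedly corrected against its difference from a set point, and converges to the point of equilibrium.

```python
def negative_feedback(output, set_point, gain, steps):
    # Each step corrects the output against the difference
    # between the current state and the set point.
    for _ in range(steps):
        error = output - set_point
        output -= gain * error  # the correction opposes the error
    return output

# The subsystem stabilizes around the point of equilibrium:
print(round(negative_feedback(10.0, 3.0, 0.5, 20), 4))  # 3.0
```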
In a dynamical world, where nothing can be definitely taken for
granted, only through feedback can one adapt to the world. Particularly,
we must continually confront our theories and ideas with actual practice,
and with the ideas others may have, before we can trust them in any way.
Ideas that cannot suffer and survive such confrontations are false.
First-order
In which only objects of a theory may be abstracted.
Freedom
See Liberty.
Functional
Functional Programming is a very powerful
paradigm for programming that originates in
the early works about lambda calculus:
computations consist in evaluating/expanding terms.
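A small Python sketch of this style (the names are ours): computation proceeds by evaluating terms built from function application and composition, with no state and no side effects.

```python
from functools import reduce

# Computation by evaluating terms: no state, no side effects.
compose = lambda f, g: lambda x: f(g(x))
square = lambda x: x * x
succ = lambda x: x + 1

# (square . succ)(4) expands to square(succ(4)) = 25
print(compose(square, succ)(4))

# Folding a list is also just term evaluation:
print(reduce(lambda a, b: a + b, [1, 2, 3, 4], 0))  # 10
```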
God
Considering a system or universe, anyone that has power to defy the laws
of the universe is a God.
Then of course a God does not exist
inside the considered universe, but in some outer greater universe,
even though he might constantly show his presence through perpetual miracles
(see input/output).
He may also perform occasional miracles, but these are completely beyond
understanding from inside the system (as a corollary, anything that is
not completely beyond understanding is not a miracle).
In particular, the human user who can shut down a computer is a God to the
programs running on that computer.
Thus, in Tunes terminology, we use this appropriate term
where other people talk about a "super-user".
Also see Holy, root.
Grain
The grain of a system is the minimal or typical size of objects this
system can handle. The larger this size, the coarser the grain;
the smaller this size, the finer the grain.
The coarser the grain, the more expensive it
is to create, handle, maintain, and dispatch objects.
Thus, computing liberalism tells us
that to have a system that adapts better, you must reduce the grain as
much as possible.
However, handling objects may involve
administrative
overhead.
Thus, to achieve a better system, you must also reduce the administrivia
together with the grain, wherever possible.
Traditional Operating Systems have a very coarse grain
(the basic object size being a file, and the typical packaging grain being
the huge "application"), thus yielding poor performance and adaptability.
The insecure C language forbids a
smaller grain. As long as OSes use such a language, the
computer industry is bound to
poor results.
Higher-order
In which any kind of abstraction is recursively
possible over a system. That is, objects that talk about
objects that talk about, etc., about the system.
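A classic illustration in Python (our own example): a higher-order function abstracts over functions, and the abstraction can be applied recursively — here, a numerical derivative of a derivative.

```python
def derivative(f, h=1e-6):
    # Higher-order: takes a function, returns another function
    # (its approximate derivative by central differences).
    return lambda x: (f(x + h) - f(x - h)) / (2 * h)

cube = lambda x: x ** 3
dcube = derivative(cube)    # a function talking about a function
ddcube = derivative(dcube)  # ...recursively, about the former
print(round(dcube(2.0), 3))  # d/dx x^3 at 2 is 3*x^2 = 12.0
```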
High-level
See abstraction level.
Hole
See security hole.
Holy
That something be holy shall mean that it is absolute
and does not depend on any human-reachable thing.
Ancient traditions, from times when education was
too expensive and superstition was a better way to propagate information,
imposed the cult of some things as holy as an easy way to have
people propagate this information.
As with any method, this one can be used (and was used) to
propagate all kinds of information, bad as well as good. But this method,
where people learn without questioning, leads to crooks taking advantage
of people's ignorance; either the method stresses conservatism,
and the information cannot be enhanced or integrate feedback from real life,
or it doesn't, and crooks will corrupt the information.
This method for propagating information thus has a high noise level,
a low adaptation rate, and nasty side effects; but it requires no
civilizational investment to be possible.
This is why it is inferior to methods that do
involve reason in place of blind trust, that accept the feedback most
adapted to real life, and that are not utterly corrupted by crooks or
petrified by conservatism.
Note that there is a full range of intermediate methods between these,
where belief slowly gives place to reason; actually, reason is always
founded on beliefs, and beliefs can only be narrowed, never eliminated.
Also note that narrowing beliefs requires an ever larger cultural
background, to confront beliefs with knowledge from real-life. Thus,
only the progress of civilization makes use of reason possible, while
this use makes civilization possible in turn, by having it compete against
the natural tendency toward entropy and chaos, as is expressed by crooks.
Some pretend that (at least some) words are holy.
That's pure nonsense.
If you're not convinced, take the word "God",
the most holy word to many religious people.
If you pronounce it to a Frenchman,
he will understand that you refer to a tool
which decency prevents me from naming here.
The Latin (long the "holy" language of the Christians)
for "God" is "Deus",
which is pronounced the way
Englishmen pronounce "deuce",
a scoring term in tennis.
Actually, for a word to be holy,
the language it is spoken in must be holy too;
however, history tells us how quickly languages appear, evolve and disappear,
as humans make and unmake them; languages are thus no holier than humans;
now, humans are quite human-reachable, and not holy at all.
As humans can only reach human-reachable things,
we can conclude that nothing that is holy is within a human's grasp.
Conversely, we see that nothing that is within human grasp is holy.
Thus, anything we can successfully talk about is not holy,
and there's no point trying to talk about holy things.
If you don't want to apply this to the human world,
at least you can apply it to the computer world,
where nothing is holy as everything was decided and can be changed by humans.
Note that, happily, things need not be holy to be useful.
Also note that things need not be holy or universally true
to deserve respect.
But these are quite another debate.
Host
Standard name for the basic entity in a computer network:
each computer in the network is typically called a "host".
Information
Measures the improbability of the known state of a system.
The more information we have, the more restricted is the space of
possible actual system states as compared to some reference distribution of
possibilities.
In computers, the basic unit of information is the bit.
Fundamental theorems about entropy tell us that
global information decreases. But locally, it can increase by using
the free part of energy in some disequilibrium, and life is such an
engine that pumps entropy.
See the Ethics of information.
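This measure of improbability can be computed directly; a sketch in Python (our own): the less probable the observed state, the more bits of information the observation carries.

```python
import math

def information_bits(p):
    # Information carried by observing an event of probability p:
    # the less probable (more surprising), the more bits.
    return -math.log2(p)

print(information_bits(0.5))    # a fair coin flip carries 1.0 bit
print(information_bits(1 / 8))  # a 1-in-8 outcome carries 3.0 bits
```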
Inheritance
Inheritance is an implementation hack for having refined definitions
of objects.
It's the typical example of an element of
tradition that people
mistakenly consider as being holy,
and refuse to reject even though reason demonstrates how lame it is
as compared to what we now know.
If we formalize inheritance, we see that it consists in allowing
one to define higher-order functions, under the constraint that each can
be applied only once, and only just after it is defined.
Single inheritance also requires that there be a unique argument.
Multiple inheritance accepts many arguments to the function,
all having to be given at once, and without any alpha-conversion
for constructed arguments.
Any sensible man can realize how ridiculous such constraints are.
For a more serious object definition mechanism, look to
delegation (or, isomorphically, functors), which are other names
for (sometimes limited) higher-order functions.
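The formalization above can be sketched in Python (our toy model, with "classes" as plain dictionaries of methods): a refinement is just a higher-order function of its parent, and once the once-only constraint of inheritance is dropped, the same refinement can be applied freely — which is delegation.

```python
# A toy model: "classes" are just dictionaries of methods.

def base():
    return {"greet": lambda: "hello"}

def loud(parent):
    # A "subclass" is a function of its parent -- but ordinary
    # inheritance lets you apply it only once, at definition time.
    child = dict(parent)
    child["greet"] = lambda: parent["greet"]().upper() + "!"
    return child

# Delegation removes that constraint: the same refinement can be
# applied to any parent, at any time, even repeatedly.
print(loud(base())["greet"]())        # HELLO!
print(loud(loud(base()))["greet"]())  # HELLO!!
```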
Input
What is consumed by an open system.
See output.
Interoperability
The fact that a number of different entities are able
to speak the same language.
Isolation
When a computer object is not secure,
it must be isolated from possible sources of failure.
In traditional OSes,
processes are so
insecure that
the system has to completely, systematically, paranoidly isolate them
from one another.
This isolation is like putting people in quarantine to prevent possible
diseases from propagating. But it also prevents people from interacting
with one another (i.e. from having any social life), and finally people
have to circumvent it to live at all, and it then loses its few advantages
after having wasted a living.
Kernel
Any "central" part in a software program.
Particularly, a statically linked centralized run-time dispatcher
(particularly in an "operating system").
As any centralized bloat, it turns out to be easier to design,
but completely inefficient to use.
Also see
Centralization,
Microkernel.
Lambda-Calculus
The mathematically formalized theory of abstractions.
The usual reference for it is a 1984 book by Barendregt.
See logic and functional programming in the
Languages page...
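The flavor of the theory can be sketched directly in Python lambdas (a standard Church-numeral exercise, not tied to the Barendregt reference): numbers are pure abstractions, and computation is mere expansion of terms.

```python
# Lambda calculus terms expressed directly as Python lambdas:
# Church numerals encode n as "apply f n times".
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    # Decode a Church numeral by counting applications.
    return n(lambda k: k + 1)(0)

two = succ(succ(zero))
print(to_int(add(two)(two)))  # 4
```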
Lazy Evaluation
A strategy of evaluating objects only when one is sure they are useful or
needed, and not before.
Lazy evaluation avoids lots of (possibly infinite) computations,
while greatly simplifying the coding of high-level programs.
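Python generators give a handy sketch of the strategy (our example): the infinite sequence below costs nothing to define, and only the values actually demanded are ever computed.

```python
from itertools import count, islice

# An infinite sequence of squares; nothing is computed yet.
squares = (n * n for n in count(1))

# Only the five values actually needed are ever evaluated:
print(list(islice(squares, 5)))  # [1, 4, 9, 16, 25]
```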
Liberalism
In its original, nineteenth century sense,
liberalism is the theory which shows
that in any evolving system, there is a natural
selection by survival of the fittest to that system's constraints,
so that to achieve the best state possible,
you must allow the fairest competition,
and the broadest liberty, so that
people may automatically adapt to all of the system's
natural constraints.
The data accumulated throughout history as a result of this
selection is called
tradition.
Liberalism is commonly applied to the economy,
where it says that to achieve prosperity,
you must firstly
allow the fairest (not the wildest) competition between companies,
to get the quickest adaptation.
Particularly, information should be freely available,
and discussion freely allowed, so that people may compare and choose;
choice should be free, and not based on prejudices.
And secondly, you must
encourage free enterprise (not free crookery)
and small businesses when possible, to achieve the finest
adaptation.
Particularly, trusts and monopolies should be fought whenever
they appear, and strictly, democratically
controlled when they are inevitable.
Liberalism does not apply only to the economy,
as shown by the works of John Stuart Mill in the moral sciences
and Charles Darwin in the natural sciences.
Some even speak of economical or biological Utilitarianism,
or of moral or social Darwinism!
The current continuators of this philosophical trend may well
be the cyberneticians and memeticians.
In the Tunes project, we apply those ideas to the
field of computing systems, that is, we defend
Computing Liberalism.
But beware: many people who call themselves (or are called)
liberal do not refer to this original liberalism, but to some
degenerate tradition that misunderstands the deep ideas behind
it, and only retains the apparent conclusions of a time, which may
well be partial or obsolete. Often, they don't even care about
the ideas and just join because they hope to benefit from some
prestige related to a "liberal" tradition.
Though the word "liberal" may be historically attached to such
people and their parties, it is not in that meaning that
the Tunes project refers to this word.
Liberty
Liberty has always been a very difficult thing to define.
Basically, its meaning would be that people (or whatever subsystem is to
be free) are able to choose for themselves the goals they will achieve.
Thinkers soon discovered that paradoxically,
while physical constraints, like violence, directly oppose liberty,
moral constraints (as expressed through laws)
are an essential condition for any liberty to exist.
Actually, this paradox has an easy solution,
once a proper theory of liberty is settled.
We define liberty as potential information.
Security will be actual information.
Physical constraints yield little original information (most being
redundant), while wasting a great lot of resources, which is why they
reduce liberty so much.
Moral constraints, on the other hand, allow the willing man to
focus on a restriction of the system where the world has a meaning,
that is, information, so that any new input from the world will be
additional information; thus moral constraints do increase liberty.
Moreover, with physical constraints, people reduce each other's liberty,
while with moral constraints, people increase each other's liberty.
In the first case, people expect each other to negatively interfere
with each other's goals. In the second case, people can expect each other
not to.
For example, having people endlessly repeat the same sentences of
wisdom greatly reduces their liberty,
and the more wisdom is being repeated, the more their liberty is reduced:
by repeating endlessly, one wastes the limited resources one
can dispose of, thus depriving potential information from being made
available with these resources; worse even, just repeating things makes
those things meaningless, however great their original meaning was,
because they are no longer confronted with reality.
Information comes from interaction, not just sitting there.
So not only is liberty reduced, but the security gained is small (because
of redundancy) and illusory.
On the other hand, defining lots of rules about the meaning of
sentences greatly increases liberty, by letting people communicate more
easily, thus allowing them to convey lots of information that couldn't be
conveyed otherwise.
John Stuart Mill wrote the excellent essay
"On Liberty" (1859), which I recommend to everyone.
It will explain much better than we can what liberty is,
or at least what we think about it in the Tunes project.
While it is well worth purchasing a paper copy,
it is also available on-line
"here" (283K),
or gzipped
here
(103K).
Because of a natural tendency toward entropy,
liberty naturally decreases, and we must continuously fight
for it not to eventually reduce to nothing.
Low-level
See abstraction level.
Man
Some fragile machine that takes several tens of years to manufacture,
with a low success rate, and only seconds to go definitively out of order.
The most powerful and most versatile machine known to date, however.
Meta-
A prefix that means beyond, or transcending.
The Webster says:
used with the name of a discipline to designate a new but related discipline
designed to deal critically with the original one (e.g. metamathematics)
We also apply this prefix to objects in the meta-discipline,
e.g. if the discipline deals with bars, the meta-discipline
will deal with meta-bars.
Metasystem transition
There is a theory that has been in the air for long, and is being
settled by the
Principia Cybernetica Project
as "metasystem transition theory",
which says that as the known world evolves toward more complexity,
it organizes into "higher" forms that let lower forms do things
while they control things instead of doing them; that is,
it evolves from systems to meta-systems.
The liberal/evolutionist/dynamical-systemic
analysis explains this by the fact that
relatively simple achievements in the meta-system
yield great improvements in the system,
or allow one to achieve similar or greater results
without spending as many resources,
while developing capabilities that prove useful
in many unsuspected domains,
and that allow one to take advantage of new forms of
free energy not yet competed for.
The development of the various forms of organizations known as life,
from particles
up to atoms
(organizations of particles),
up to molecules
(organizations of atoms),
up to simple organic molecules up to molecule complexes, protocells, and cells
(organization of molecules),
up to multicellular complexes and multicellular life forms
(organization of living cells),
up to life forms with specialized organs of increasing complexity
(organization of organs),
up to life forms with nervous systems
(systems to coordinate organs),
up to societies of above living creatures of various complexities,
up to ecosystems,
up to biospheres,
up to the study and development of the above by human rationality,
up to god knows what will follow next,
all of which bears overwhelming witness to that theory.
As for computing, all this means that
people,
after having developed analog devices,
developed analog device components,
which led them to digital device components,
and then to computer hardware development,
to binary software programming,
to macro-assembly software programming,
to programming languages (the initial LISP notation),
to interpreters,
to compilers (FORTRAN),
to low-level languages (C),
to medium-level languages (ML; LISP has been all the way down to here).
Tunes promotes the next meta-system transition in computing:
what we call high-level languages, and
metaprogramming.
Meta-system transition does not mean that there is a clean cut between
the system and the "meta-system", which would exist independently, or
"detach" from lower dependencies.
Actually, the "lower" system is never so active as when the
meta-system exists, and the meta-system does depend on this increased
activity (which it in turn generates) to even exist; it feeds on it,
and promotes it:
cellular life has not disappeared but multiplied since multicellular
life appeared;
the currently single known reasoning species
has had not a less active, but a much more active material life,
since it developed its spiritual skills and activities;
many more people (not counting programs) do assembly programming since
interpreters and compilers were invented.
Metaprogramming
Meta-programming is the art of developing methods and programs
to read, manipulate, and/or write other programs.
When the programs developed can deal with themselves,
we talk about reflective programming.
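A minimal Python sketch of a program writing a program (the generator and its template are ours, purely for illustration):

```python
# A code generator producing specialized power functions
# from a textual template.

def make_power(n):
    source = f"def power(x):\n    return x ** {n}\n"
    namespace = {}
    exec(source, namespace)  # the generated text becomes a program
    return namespace["power"]

cube = make_power(3)
print(cube(4))  # 64
```

Real metaprogramming systems manipulate richer representations than raw text (e.g. abstract syntax trees), but the principle is the same.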
From what is known of technical evolution,
it can be advanced that most probably,
like any technological activity,
the art of computer programming
will be done more and more by machines themselves
instead of humans;
Much as people no longer
hand-compute astronomical or physical calculations,
or wash their linen by hand,
or have portraits painted to transmit their picture,
but have machines do it in their place.
This way, they can focus on more interesting stuff,
be it manipulating astronomical equations,
building washing machines,
shooting photographs,
etc,
or anything else that gained free time allows.
And even the activities that replaced the former
will one day be semi-automated,
if that brings valuable enough enhancements.
So computer programming,
which was once done at the level of electronic components,
then at the level of binary representation of machine programs and data,
then at that of human readable transcript of the former,
then at that of simple languages that are more abstract while still
mapping the underlying computer architecture,
then at that of more and more abstract languages,
will reach such a state that
most usual work can be done entirely automatically
from functional specifications,
while most of the interesting work will lie
in meta-programming.
Meta-Space
An object is defined in a space, in a context.
When we want to talk pertinently about that space itself,
we must place ourselves in another "outer" space of which the
manipulated or "inner" space is an object.
Such an outer space is named a meta-space.
Microkernel
As people understood that kernels only introduce
overhead without adding any functionality that
couldn't (and shouldn't, for several reasons such as efficiency,
maintainability, and modularity) be provided outside the kernel,
they tried to reduce kernel sizes as much as they could.
The result is called a microkernel, which is pure overhead, with no
functionality at all.
The latest craze in Operating System research is to
boast about using a microkernel. Note that the size of the OS is not
reduced at all, as the functionality has only been moved elsewhere.
Microkernels are the stupidest idea ever: instead of removing the
overhead, they concentrate and multiply it, because now you need to go
from user to kernel, then from kernel to server.
Because of the low abstraction level of microkernels, lots of low-level
bindings must be done for the "servers" that provide functionality, so
nothing is gained at the user/server interface either.
As a result, microkernels are slower, bigger, harder to program,
and harder to customize than traditional kernels.
The only positive point, if there is one, is that they do encourage some
modularity; but this modularity being essentially low-level,
it is a burden more than anything else.
Monopoly
The fact that a person, company, or trust controls a market.
The worst way in which competition can be unfair, leading
to degenerate, unproductive, uninnovative, unadapting
traditions, which benefit only the holders of the
monopoly, bitterly harming the entire system.
Even where there are laws against monopolies, the power
such a situation gives to the benefiting few makes opposing
them very difficult, which is why such laws are of limited
utility, it being too late by the time they can be applied.
Thus, there is a need to fight the conditions that make
monopolies possible, that is, wild competition, that
naturally leads to such local or global monopolies.
Again, we need to ensure the fairest competition.
Network
A networked system in the general meaning.
Networked
For an entity in a system: being linked to other entities,
that is, there being input and
output between the entity and others,
faster than the entity stabilizes.
For a system: having entities linked with each other.
In a common, restricted meaning for systems, it
implies that little trust exists between the entities, and/or
that the system is organized in a centralized way,
with programmers only having access to very low-level communication
primitives, and having to hand-design the way information is organized
in the system (which is also a form of centralization,
as a single brain decides for the whole system).
Nucleus
Another name for
kernel.