Digital Feedback as Another State of Matter

2006

I remember I’m going to forget what I just remembered.

1. The sampling of one reality

There is an essential difference between the analogue universe and its
digital transposition. An analogue model is carried by a continuous
signal, while its digital alter ego encodes the information
symbolically, using a numerical system.

The computer then processes the data as messages made up of a
succession of binary digits. The information contained in this
elementary choice between two states of a probable reality is the
“bit” [1], a sole and unique switch confined to its own world, far
away from any “multiversal” theories.
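
As a hedged illustration of this sampling of one reality into a
symbolic code, the sketch below (in Python, with an arbitrary sampling
rate and a three-bit depth chosen purely for the example) reduces a
continuous signal to a short dance of zeroes and ones:

    import math

    # sample a "continuous" signal at sixteen discrete instants
    samples = [math.sin(2 * math.pi * t / 16) for t in range(16)]

    # quantise each sample to 3 bits, i.e. eight possible states
    levels = 2 ** 3
    quantised = [round((s + 1) / 2 * (levels - 1)) for s in samples]

    # the model, frozen as a pattern of bits
    print(["{:03b}".format(q) for q in quantised])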

Digital technology allows signal processing by controlling the
circulation of these messages. It makes the combination and
recombination of any object and material possible within the virtual
frame of the machine and, as a consequence, permits the massive
generation and development of abstract models.

By extension, access to unlikely forms and data dumps becomes possible
and is made perceptible to us. We transform objects; we change their
nature while maintaining their form, and vice versa. Information
leaves and returns by means of transformations that make it abstract,
yet possible to describe, freeze and store as patterns, so that it
appears to us like another crystallised state of matter.

In addition, the encoding of information adds another layer of
abstraction. The clock, mother nature of virtual existences, imprisons
the nature of the original model in its arbitrary rhythm, a complex
yet minimal linear structure of dancing zeroes and ones.

This binary code is not limited, though. This versatile, bottom-up
sequencing system is able to provide rich expressions, which explains
why it has become today’s preferred carrier of information. Like a
demiurgic intervention [2], this abstraction does not have a fixed
form but always appears as an intermediary entity between the
conceptual model and its multiple realities.

During the 19th century, millennia after the clepsydra and the first
reality shift [3], the virtual invasion of our perception of the
physical environment and the encoding of natural models reached a key
point. It would forever change and define our relationship and
interactions with data-processed matter.

Probably one of the first uses of binary code by a machine is the
control system for weaving patterns into fabric in Joseph-Marie
Jacquard’s mechanical loom [4]. This machine, produced in 1801, was
the first entirely automatic mechanical loom. The unit was controlled
by a perforated card, also known as a punch card. The presence or
absence of holes in the cardboard directly drove the position of the
wires and thus controlled the weaving according to the programmed
pattern.
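
As a loose sketch (an illustration of the principle, not a
reconstruction of Jacquard’s mechanism), one card row can be read as a
binary pattern in which a punched hole raises the corresponding warp
thread:

    # one row of a hypothetical punch card: 1 = hole, 0 = no hole
    card_row = [1, 0, 0, 1, 1, 0, 1, 0]

    # a hole lets the hook pass and raises the matching warp thread
    raised = [i for i, hole in enumerate(card_row) if hole]
    print("raised warp threads:", raised)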

The transposition of a model into another referential space is not
accomplished using arbitrary rules. On the contrary, a strict set of
correspondences must be elaborated to provide a working mapping of
reality.

By changing the reference frame, one could think at first that we
would lose the sensitivity of the original model. But the direct
result of such a transformation is the apparition of a whole
collection of anamorphic ghosts produced by the constraints implied by
the data mapping [5]. As a result, our attention and understanding of
the model shift to a set of new characteristics, invisible so far,
revealed by the extra degree of freedom added in the process. During
such a passage, one loses the nuance, tone and definition one used to
know about a given model, but thanks to the new representation, it
then becomes possible to observe, explore and study new aspects of it.
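
A small hedged sketch of such a change of reference frame: the same
32-bit pattern (an arbitrarily chosen value) read under two different
referentials, each revealing characteristics invisible to the other.

    import struct

    # one fixed bit pattern, the "model" itself
    bits = struct.pack(">I", 0x41C80000)

    as_integer = struct.unpack(">I", bits)[0]   # first referential
    as_float = struct.unpack(">f", bits)[0]     # second referential

    print(as_integer, as_float)                 # 1103626240  25.0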

Such a process does not reject the nature of the model but, through an
effort of abstraction, makes it possible to release the essential
forms and to reach the heart of things. On the other hand, because of
the nature of digital permutations, it is particularly easy to lose
oneself in repeated re-mappings and transformations. Some focus is
needed in order not to fall into the trap of yet another “pipe” [6].

2. Software is not a tool

An algorithm defines a set of rules that lead to the resolution of a
problem. These rules can be represented by a succession of elementary
symbolic operations that follow a logical sequence. A programming
language is a set of syntactic and semantic rules which allow the
description of the considered algorithm/problem in a human-readable
form. This high-level, human-friendly dialect is most of the time
translated by a compiler into a lower-level language, such as machine
code, better suited to a machine that can carry out only a small
number of hard-wired instructions.
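
As a small illustration of this layering (the choice of Python is
incidental; any compiled or interpreted language would do), the sketch
below shows a high-level description of an operation and, one layer
down, the elementary instructions the interpreter actually executes:

    import dis

    def blend(a, b):
        # the high-level, human-friendly form of the operation
        return (a + b) / 2

    # the same operation expressed as the interpreter's elementary
    # stack instructions, one step closer to the metal
    dis.dis(blend)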

From the algorithm itself and the higher-level abstractions down to
the machine code, there is a whole scale of creative opportunities
which all offer their own definition of freedom. There is absolutely
no steady state in these creative worlds: even though you are working
with a limited set of rules, the artwork can be forever expanded by
recombining its elements. In this Matryoshka system, it is possible to
mix and combine the processing of information in both vertical and
horizontal planes, from the software’s high-level metaphors back down,
closer to the metal, to the hardware’s binary symbolic representation.

The consequence is the appearance of yet another transposition of the
author’s conceptual model into a collection of digital patterns and
computations, or simply snapshots of the work and output samples. In a
distributed and connected environment this work then spreads like
digital germs, and represents many different pieces of its author and
of the work. With those data-processed objects, an artist becomes a
significant source of data proliferation. These assembled packets
become languages of their own and agents of the emitting author. They
are extensions of the human body in a virtual shift which takes place
in the dissolution of reality.

A data-processing artwork has a much larger aura than it seems. It
delivers its seeds of contamination to the world through its author’s
agents. It lives inside your private digital environment, infects your
memory, cohabits with your personal data and definitively influences
your audio-visual field or desktop. This attempt to develop
independent, autonomous creative processes and to abandon once and for
all the interactive pseudo-deterministic systems is obviously related
to the act of contamination, to the virus. This new process is the
next step in the evolution of digital arts, but at the same time a
highly speculative quest for yet another Holy Grail.

With the implementation of artificial neural networks, cellular
automata, genetic algorithms and the like, the artist appears as a
digital alchemist seeking the ultimate quine [7] that will live as an
independent data entity in a micro-, macro-, or meta-sea of living
information and memes.
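
A minimal quine [7], sketched here in Python, gives a taste of that
self-reproducing ideal: running these two lines prints exactly these
two lines, a closed loop of self-description.

    s = 's = %r\nprint(s %% s)'
    print(s % s)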

In that regard, Abraham Moles [8] proposes three key points in order to
build the core of this quest for an autonomous artistic process:

– Anonymity: the creator becomes anonymous. The creation then loses
the characteristics of its individualistic form to become a
collective symbol.

– Redundancy: acting as its own feedback, the creation, even if it can
generate different results, must remain confined to a non-innovative
process while constantly reviving its permutations.

– Balance: an equilibrium between the original algorithmic form and
the injection of artistic emotions.

While the anonymity component of such an autonomous artistic process
could be indirectly achieved through the dissolution of the artistic
ego in a speculative, massively collaborative open source software art
project, the redundancy and balance prerequisites raise problems that
seem much harder to overcome.

For example, the redundancy requirement is arguable, or at least needs
to be revised to make explicit that it refers to cybernetic closure,
and as such it limits the definition of autonomous creation to the
original framework in which the autonomous creative process has been
defined. There is no way out, and no magical autonomy beyond what has
been defined by the author of the system. On the other hand, it is
worth noting that because of the nature of this digital matter and the
Matryoshka effect, although there is no way out of the system, there
is always a way “in”. For each procedural issue there is always the
possibility of adding another degree of freedom to escape the
limitations of the running system. Said differently, inside a virtual
machine you can always write and bootstrap a better-fitted virtual
machine.
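
A minimal sketch of that idea, with an entirely hypothetical
instruction set: a tiny stack machine hosted inside the running Python
interpreter, a machine written inside the machine.

    # the "inner" machine: its own instruction set, its own state,
    # bootstrapped entirely inside the "outer" machine that runs it
    def run(program):
        stack = []
        for op, arg in program:
            if op == "PUSH":
                stack.append(arg)
            elif op == "ADD":
                b, a = stack.pop(), stack.pop()
                stack.append(a + b)
            elif op == "MUL":
                b, a = stack.pop(), stack.pop()
                stack.append(a * b)
        return stack

    # (2 + 3) * 4, expressed for the inner machine
    print(run([("PUSH", 2), ("PUSH", 3), ("ADD", None),
               ("PUSH", 4), ("MUL", None)]))   # [20]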

The most delicate dilemma undoubtedly remains the balance. While the
artwork is built on a conceptual design, the need to develop technical
systems arises throughout the process, leading to a key
stage. However, the insertion of such systems has a dramatic influence
on the main creative process and the artistic intention. The survival
of the process then depends on the system’s ability to correctly dose
the influence of the previous diagram at each stage of its
regeneration. This dialogue between the author and the code is complex
and fragile, and today it seems impossible to simulate beyond the
obvious use of anthropomorphic tricks: we are facing the limits of a
machine that can only evaluate the structure of an object, without
being able to attribute any man-made qualitative value to it other
than a simple statistical report [9].

Consequently, not only would such an artwork need to act like an
autopoietic machine [10], but its roots would also have to be fed by a
set of artistic rules providing a complex enough environment [11].
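
As one concrete, hedged example of such an environment, an elementary
cellular automaton (rule 110, a rule often placed near the boundary
between periodic and chaotic behaviour discussed in note [11]) can be
written in a few lines:

    # rule 110: each cell's next state is read from the rule's bits,
    # indexed by the states of its left neighbour, itself, and its
    # right neighbour (wrapping around at the edges)
    def step(cells, rule=110):
        n = len(cells)
        return [(rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2
                          + cells[(i + 1) % n])) & 1 for i in range(n)]

    cells = [0] * 40 + [1] + [0] * 40
    for _ in range(20):
        print("".join("#" if c else "." for c in cells))
        cells = step(cells)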

From this point of view, creating an art relying on processed or
generated data implies the need to work on both the encapsulator and
the encapsulated, on the operating system and its processes. As of
today, this work still heavily relies on the relationship between the
computer and the artist. As a matter of fact, it is much more than
classes and instructions that are transmitted; it is also a part of
the author: his thoughts, his algorithmic vision, his sensitivity for
solving and even creating new problems, or simply for discarding the
notions of problem solving and meaningful code for a human or machine
interpreter, turning software into a medium to express his intention.

In such a situation the artist is not replaced by the machine but
moved to a position where he initiates and develops a digital
chemistry where processes can mutate and crunch data on their own
within a carefully pre-defined artistic environment.

End Notes

[1] Binary digit, binary number: the unit of measure of the quantity
of digital information. The concept results directly from the work of
Claude Shannon, who in 1948 wrote “A Mathematical Theory of
Communication”, in which he laid the first stones of the concepts of
digitalisation.

[2] A demiurge is the god-like entity that shaped the world from a
perfect model into an imperfect reality, either because it was made
using flawed matter (Platonic interpretation) or because the demiurge
was imitating the supreme act of divine creation (Gnostic
interpretation), hence downgrading the original model.

[3] The clepsydra is a water clock. It was the first device used to
measure time, by letting water flow regularly out of a
container. Invented around 3000 BC, it represents the first shift of
humanity into an abstract environment that would rule human existence
based on virtual information rather than natural references.

[4] Thanks to Bouchon’s and Falcon’s early experiments with
semi-automatic loom systems.

[5] Starting to appear in 15th-century paintings, an anamorphosis is
an image that has been deformed by a change of perspective
referential. By the term “anamorphic ghosts”, we refer to the
collection of entities copied from the original model and deformed by
the change of referential occurring in the data mapping.

[6] Very common in UNIX-like operating systems, a pipe, or anonymous
pipe, is a “first in, first out” communication channel that may be
used for one-way communication between different processes. Creating
a pipeline is a trivial yet powerful way to stream and manipulate data
extensively.
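
A minimal sketch, assuming POSIX semantics: two processes
communicating one way through an anonymous pipe created with
os.pipe().

    import os

    read_end, write_end = os.pipe()
    pid = os.fork()
    if pid == 0:                     # child process: the writing side
        os.close(read_end)
        os.write(write_end, b"hello through the pipe\n")
        os.close(write_end)
        os._exit(0)
    else:                            # parent process: the reading side
        os.close(write_end)
        print(os.read(read_end, 1024).decode(), end="")
        os.close(read_end)
        os.waitpid(pid, 0)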

[7] Named after the philosopher Willard Van Orman Quine, a quine is a
program that produces its own source code when executed.

[8] Moles, Abraham. Art et ordinateur. Paris: Casterman, 1971.

[9] For example, a working system is a system that works, regardless
of whether it works well or not.

[10] “An autopoietic machine is a machine organised (defined as a
unity) as a network of processes of production (transformation and
destruction) of components that produces the components which: (i)
through their interactions and transformations continuously regenerate
and realize the network of processes (relations) that produced them;
and (ii) constitute it (the machine) as a concrete unity in the space
in which they (the components) exist by specifying the topological
domain of its realization as such a network.” (Maturana and Varela
1980)

[11] By “complex environment”, we mean the richness defined as a
Langton phase transition occurring between periodic and chaotic
behaviours, as defined in some automata systems.

Bibliography

Shannon, Claude Elwood. “A Mathematical Theory of Communication.” Bell
System Technical Journal 27 (1948): 379-423, 623-656.

Randell, Brian. “The Origins of Computer Programming.” IEEE Annals of
the History of Computing 16, Issue 4 (1994): 6-14.

Bratley, Paul and Jean Millo. “Computer Recreations; Self-Reproducing
Automata.” Software — Practice & Experience 2 (1972): 397-400.

Moles, Abraham. Art et ordinateur. Paris: Casterman, 1971.

Maturana, Humberto and Francisco Varela. Autopoiesis and
Cognition. Dordrecht: Reidel, 1980.

Langton, Christopher. “Computation at the Edge of Chaos: Phase
Transitions and Emergent Computation.” In Emergent Computation, edited
by Stephanie Forrest, 12-37. Cambridge: The MIT Press, 1991.