I enjoyed following Jacob's research and our many existential
conversations surrounding both our projects. In true philosophical
style, though without an armchair in sight, we sat in class in our
ergonomic lab seats and mulled over the many problems of the world
concerning reality, existence, knowledge, values, reason, mind and
language. Jacob researched The Technological Singularity, a concept
which I think is highly relevant in terms of Western thought and
global effect in today's world. The focus of Jacob's exploration
centred around cybernetics and artificial intelligence (AI) in
relation to the Law of Accelerating Change and the notion of
singularity as describing a point at which human beings become
delimited both as a force and as a being.
His
main impetus was to investigate the idea of a fundamental paradigm
shift, vaster than we have ever known, as having occurred as a result
of recent technological advances. But as Jacob and I discussed in
class, viewing change as accelerating or decelerating, and even
ascertaining the direction of change itself, is fractal in nature:
it depends on where we place our boundary and on what we have
delimited to be included within our process of investigation. When we look in
minute detail we can interpret data or phenomena in a particular
way, yet focussing on the very same thing at a greater distance we
can make an entirely different reading because our boundary has
expanded and the amount of data we have included has seemingly
increased in some way. It is precisely the notion of the paradigm
which can also be read and interpreted in this way as we delimit
particular time frames throughout history and perceive them as having
occurred in a particular sequence or order. As such, certain events
can become marked and prioritised, at the expense of others, as
having mapped reality. Jacob's beta presentation raised a similar
concept in relation to false narrative and the variance of measurement
procedures, but with regard to prediction. Jacob and I had
previously discussed how delimitation in this way always distorts
reality, yet without delimitation we are unlikely to be able to
formulate any kind of reality at all.
Jacob's beta presentation referred to this idea in terms of
intelligence, hierarchies and patterns associated with the neocortex
of the human brain. Patterns do constrain us in a particular way,
but I'm not sure that we can think for very long without them. As
such, it would depend upon one's frame of reference and perception of
reality as to how and when one interprets the occurrence of the
technological singularity, or even whether it will ever occur at all.
Jacob
and I further discussed the idea of accelerating change. We
discussed how technology and tools have been an integral part of
everyday human life since the time of the Neanderthals. Additionally,
Mary Shelley wrote about the dangers of technology having gone too far
in her novel Frankenstein in the early 1800s. But Jacob
wanted to investigate in what way technological change has
accelerated much more recently in Western science, and if in fact
some tipping point has been reached in terms of the cardinal reality
of how human beings are placed within the world. Jacob's pitch in
week 5 discussed the ideas surrounding this notion of a tipping point
in terms of AI and debated whether or not it has likely already
occurred. In Jacob's research the debate really seems to be about
the effect of AI on existing human intelligence, and surmising,
within this scenario, whether humans are becoming like machines or
whether machines are becoming like humans. One wonders to what degree
the decision for science to pursue these ideals serves the common
good, or is made with an awareness of the likely consequences.
Yet quite rightly, Jacob questioned the degree to which it is even
possible to control this type of technological change, and this
inevitability is precisely the point. Regimes of scientific
research, power and economics could be said to have become so
embedded within Western thought, and global structures for that
matter, that this seems to be a process over which we now have
little control, whether in halting or reversing it. Jacob's
beta presentation addressed this very interrelation between
technology and economic growth, showing how each feeds into and
perpetuates the other.
This
discussion about AI in this way also led into a conversation with
Jacob about the supposed origin of human intelligence both in terms
of its state of being and its definition. We discussed the idea of
hunter-gatherers as probably having thought more in terms of
day-to-day survival and the cycles of nature, but we wondered at the
evolution of the vast conceptions of time and space within which we
think today, in terms of the past, present and future. In his beta, Jacob stated
that he particularly wanted to focus on the issue of time keeping and
I understood this as the construction of global space, for example
navigation and mapping in terms of latitude and longitude. As such,
human intelligence obviously has the capacity to evolve and adapt
into more complex thought over time, but I wonder at the limits to
this complexity. I also wonder at this process of evolution in
reverse, for example extinction. Jacob discussed the difficulty of
artificially replicating human consciousness which led me to wonder
where such attributes as emotional intelligence, human desire and
even Will would fit within the AI scenario. Do we perceive that
these features add
to human intelligence or thwart it? As such, would there be an
attempt to computerise emotions and Will, or would such features
become redundant, even extinct? In this way, does AI
add to human intelligence or in fact thwart it? Is AI a lurch
forward into the future in terms of progress generally, technological
or otherwise, or does AI and singularity in fact mark the beginning
of a regression in evolution from a human
perspective? Where do we stand in terms of defining progress? I
found Jacob's beta slide on evolutionary bottlenecks particularly
relevant here.
In
widening the debate to include critical analysis of technological
singularity, Jacob's beta really created a much larger space for
discussion. His critique provided a good counterbalance and seemed
to create a discursive space, or distancing of sorts, enabling Jacob
to 'observe' his project. As such, his field of vision seemed to
expand, which will be invaluable for the essay. Jacob worked on his
project really consistently; nearly every week in class he told me
about new ideas he had had during the previous week and about the
theorists he'd been reading. The project topic was of particular
interest to him and he was really invested in the project's outcomes.
In many respects both our projects were tackling the same
complex issue and asking similar questions but through two different
themes.