Slides & Leibniz-Lab: Sovereignty and the Human Factor in Transformation

Last week, I participated in the Leibniz-Lab “Upheavals and Transformations” workshop in Leipzig. As a data scientist with a background in landscape planning, I often feel like a technical outlier in these predominantly humanities-and-social-science-driven circles. However, the diversity of perspectives turned out to be the workshop’s greatest strength.

Who Owns the View?

I contributed a presentation titled “Who Owns the View? Corporate Sovereignty, Filter Bubbles, and the Struggle for Spatial Agency.”

My goal was to shift the discussion of “sovereignty” from a purely state-centric or legal concept toward a spatial and perceptual one. In an age of algorithmic curation, sovereignty effectively means the agency to decide where to look and where to go.

Title Slide: Who Owns the View?

Slides: https://slides.ad.ioer.info/who_owns_the_view/

I reused two specific activation examples from my recent lecture series, which resonated with the audience:

  1. The “Hülße Park” Scenario: Confronting our own tendency to “curate” reality by posting only beautiful sunsets while ignoring the physical degradation (trash, crowds) of a place.
  2. The “1913” LLM: Demonstrating how an AI model trained only on pre-1913 text creates a biased worldview.
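The core of the “1913” demonstration is a data-curation step: restricting the training corpus to texts from before a cutoff year, so everything after the cutoff is simply invisible to the model. A minimal sketch of that filtering step (corpus format and field names are illustrative assumptions, not the actual setup used in the lecture):

```python
# Sketch of the curation step behind a "1913"-style model:
# keep only documents published strictly before a cutoff year.
# The corpus structure here is a made-up example for illustration.

CUTOFF_YEAR = 1913

corpus = [
    {"year": 1859, "text": "On the Origin of Species ..."},
    {"year": 1905, "text": "Annus mirabilis papers ..."},
    {"year": 1969, "text": "Apollo 11 transcripts ..."},
]

def filter_by_year(docs, cutoff):
    """Keep only documents published strictly before `cutoff`."""
    return [d for d in docs if d["year"] < cutoff]

train_set = filter_by_year(corpus, CUTOFF_YEAR)
# The model trained on `train_set` inherits the blind spots and
# biases of the kept era: the 1969 document never reaches it.
print([d["year"] for d in train_set])  # → [1859, 1905]
```

The same mechanism, applied implicitly by algorithmic feeds rather than deliberately by a researcher, is what makes filter bubbles so hard to notice from the inside.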

This sparked a discussion on Generative AI as the next major disruption. The consensus was that we do not yet grasp the global societal effects of this technology. My introduction of Mark Weiser’s concept of Calm Technology (technology that informs without demanding our constant attention) was met with curiosity as a potential counter-model to the current attention economy.

Perspectives from the Workshop

The two days offered many insights that went far beyond my own discipline.

Transformation as Recombination

The keynote by Raj Kollmorgen (HS Zittau/Görlitz) set an important frame. Drawing on historical transitions (such as German reunification), he argued that successful transformations rarely invent entirely new things; instead, they recombine existing elements. Since the people living in a system remain the same before and after a disruption, adaptation is easier if we identify and carry over what works, rather than forcing a completely new system on them.

The Long View: Human vs. Human vs. AI

Olaf Jöris (LEIZA) provided a great deep-time perspective, surveying 10,000 years of human history. He argued that the stability of societies relies on the diversity of available niches: the more people can find a niche to fulfill their needs, the more stable the society (in contrast to autocracies). Most interestingly, he noted that early human stress came from competing with nature. Around 5000 B.C., this shifted to internal competition (Human vs. Human). Today, we face a new triangulation: Human vs. Human vs. AI. As AI begins to compete for our niches (jobs, creative roles), we may see a broadening demographic pushed into low-wage sectors. A sobering projection.

Critique and Synthesis

The interdisciplinary dialogue brought out several other interesting perspectives:

  • Arnold Bartetzky (GWZO) was very convincing with his synthesis of transdisciplinary threads into a coherent cultural narrative.
  • Alexander Mionskowski (GWZO) demonstrated the power of the humanities by summarizing our entire theme group into a concise, ad-hoc text. In our side discussions, he shared the concern that AI currently represents a negative disruption.
  • Thomas Rigotti (LIR) added a crucial psychological and neuroscientific layer, focusing on individual resilience. This connected well with my own work on visual perception (any transformation starts in the brain!).
  • Anna Veronika Wendland (Herder-Institut) offered a critique of modern smart tech. She cited the modern washing machine, which is fully automated, opaque, and uncontrollable, as an anti-pattern to Calm Technology and a symbol of lost agency.
  • Sebastian Ziaja (GESIS) provided a bridge back to data science and democratic processes, specifically through his work on “Bürgerräte” (Citizens’ Assemblies).

The workshop reinforced my belief that Responsible Geosocial Data Analytics is a critical tool to make the “invisible” filters of our perception visible. If we are moving into an era of “Human vs. AI” competition for niches, as Olaf suggested, we need infrastructures that allow us to reclaim our view of the world.