In this introspective story, Jacobo Nájera shares with us a metaphor for his journey through technology and the internet as craftwork… Enjoy it!
Jacobo Nájera
Translated by Anette Eklund
Published in Sursiendo, October 16, 2018 (Spanish)
If I had to choose a beginning for the crafting path, it would surely be the first time I rode a bike. To me, riding a bicycle means an opportunity to explore and ponder trust and freedom, an experience in which our bodies shape the memory of the journeys made.
As a kid, sometimes with a schedule and sometimes without one, I regularly rode over different sorts of terrain, and each adventure was a new opportunity to learn about myself. However, time went by and something happened: I became much more aware of what each movement implied in its tension with physics, in layers of awareness that were not fully verbalizable yet were habitable within each journey, as happens with every kind of craft.
One of those journeys allowed me to meet some radio frequency experimenters. I learned that a power source is responsible for supplying an electrical circuit with energy and keeping it energized, but above all that circuits can be connected across distances by wave transmissions, and that this takes technique and expertise: the knowledge to calculate what you need to transmit on 27 MHz, the fact that there is an Ohm's Law, and that power is measured in watts.
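To give a flavor of those calculations, here is a minimal Python sketch of my own (the supply voltage and load are assumed values, not figures from those radio days): the wavelength of a 27 MHz signal and the power Ohm's Law predicts for a simple circuit.

# Back-of-the-envelope radio calculations (illustrative values only).
C = 299_792_458          # speed of light, m/s
frequency_hz = 27e6      # 27 MHz, the band mentioned above

wavelength_m = C / frequency_hz            # lambda = c / f, about 11.1 m
quarter_wave_antenna_m = wavelength_m / 4  # a common antenna length, about 2.8 m

voltage_v = 12.0                # assumed 12 V supply
resistance_ohm = 50.0           # assumed 50-ohm load
current_a = voltage_v / resistance_ohm     # Ohm's law: I = V / R
power_w = voltage_v * current_a            # P = V * I, in watts

print(f"Wavelength at 27 MHz: {wavelength_m:.2f} m")
print(f"Quarter-wave antenna: {quarter_wave_antenna_m:.2f} m")
print(f"Power into a 50-ohm load at 12 V: {power_w:.2f} W")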
I remember the feeling of excitement when I first received a real paper letter from the other side of the Atlantic Ocean. It was a QSL card, or confirmation card, the message with which contact between two correspondents is closed: it is sent by post after contact is made over the radio and contains a report on the reception of the other station's transmissions.
One of the first challenges was building a bridge rectifier to transform the alternating current of the municipal electrical supply into the direct current needed to feed the radio transmitter's circuit. In short, it was the possibility of learning about yourself through the things you build and through a process of trial and testing, all within a community.
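As a rough illustration only (the original gives no figures), the voltage such a bridge delivers can be estimated in a few lines, assuming a transformer secondary of 12 V RMS and silicon diodes with a forward drop of about 0.7 V each:

import math

secondary_rms_v = 12.0   # assumed transformer secondary, 12 V RMS
diode_drop_v = 0.7       # typical forward drop of a silicon diode

peak_v = secondary_rms_v * math.sqrt(2)    # peak of the alternating waveform
# In a bridge rectifier, two diodes conduct on each half-cycle.
rectified_peak_v = peak_v - 2 * diode_drop_v

print(f"AC peak: {peak_v:.1f} V")
print(f"Approximate rectified peak, before smoothing: {rectified_peak_v:.1f} V")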
Later I made my first connection to the Internet via dial-up, which did not attract my attention as much as the trips with analog telecommunications devices.
It was not until the age of fourteen that I really formed a connection with the Internet, when I found communities that developed free software under principles similar to those of the radio experimenters; that struck me. What hooked me was the possibility of understanding the different concepts that make the Internet work by experimenting: installing software and standing up some service to test it, such as a website or domain name resolution.
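A small sketch of the kind of experiment I mean, in Python (the domain is only an example): resolving a name to see the step that makes a website reachable.

import socket

# Ask the system resolver which addresses stand behind a domain name.
domain = "debian.org"  # any domain works; this one appears later in the text
for family, _, _, _, sockaddr in socket.getaddrinfo(domain, 80, proto=socket.IPPROTO_TCP):
    print(family.name, "->", sockaddr[0])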
The biologist and hacktivist Mayeli Sánchez usually projects, in her presentations, an image of the best-known Internet companies of today and refers to it as the new history of the Internet, and then asks whether that new history will become the history of the Internet. In this way she opens a discussion about the disputes over the framing of these stories, their non-linearity, and who is responsible for them.
For Jamie McClellan, a technologist who has participated in social movements as an administrator and systems developer, both Facebook and Twitter are based on the designs of Indymedia, the Independent Media Center network of the social movements of the nineties, which offered the democratic capability to publish news and then comment on it.
Thinking about the Internet can lead us to the development of an architecture whose starting point is the idea that each network node has an equal capacity to create, transmit, and receive messages, which are divided into packets (Sterling, 1993). This idea materializes in protocols such as email or the one that allows a website to work, among others.
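The idea can be sketched in a toy example (my own illustration, not an actual network protocol): a message is divided into numbered packets that may arrive in any order and are reassembled at the destination.

import random

# Any node splits a message into numbered packets; any other node reassembles them.
MESSAGE = "any node can create, transmit and receive messages".encode()
PACKET_SIZE = 8

packets = [(seq, MESSAGE[i:i + PACKET_SIZE])
           for seq, i in enumerate(range(0, len(MESSAGE), PACKET_SIZE))]

random.shuffle(packets)  # packets may travel different routes and arrive in any order

# The receiver reorders by sequence number and reassembles the original message.
reassembled = b"".join(chunk for _, chunk in sorted(packets))
assert reassembled == MESSAGE
print(reassembled.decode())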
In order to understand this, it helps to recognize three moments in its genesis: a first one, in the seventies, when the earliest research on a network capable of communicating from city to city was carried out at the RAND, DARPA, MIT, and NPL laboratories; a second, highly collaborative and community-driven moment, in which many of the protocols we currently use to communicate took shape; and a third moment that began after the dot-com crisis, when companies adopted people's data as one of their business models (Sparrow, 2014).
Taking up the ideas of the craftsman Jamie, the protocols acquired a clear social meaning in the 1990s through projects such as Riseup Networks in the United States, SinDominio and Nodo50 in Spain, ASCII (the now silent Amsterdam Subversive Center for Information Interchange) in the Netherlands, Aktivix and Plentyfact in the United Kingdom, SO36 and Nadir in Germany, and Autistici/Inventati in Italy, which claimed ownership of the protocols outside the State and corporations (Milan, 2013).
We can currently observe several intense discussions and tensions about the future of the Internet in terms of regulation, use, and development, within a model centered on data and guided by market value.
For their part, telecommunication operators declare that everyone makes money on their infrastructure while they do not receive enough for providing connectivity. Another claim on this subject is that they need to encourage the use of their networks, and that is why they offer "free" services for which they receive some kind of compensation by agreement with those services.
This occurs in a context where the state regulatory bodies have less and less information on the real costs involved in the provision of infrastructure, as well as data on the forms of deployment. On the other hand, major corporations are becoming less reliant on ‘public’ internet infrastructure and are using their own resources to interconnect their data centers.
The towns that have decided to develop autonomous telecommunications technology under the economic and social models of their community life face not only the fact that the optical fiber to interconnect their own networks reaches few cities and at high prices (a result of intentional overselling), but also the implications of connecting to models guided by the exploitation of data. In the medium term this forces these communities to have their own infrastructure at the different provisioning layers, so that they can keep the capacity for self-determination (Gómez, 2017).
Cyber attacks, and crashes due to failures involving security problems, are constant. A symptom of this is that attacks at scale have also adopted monetization models: not only the remote hijacking of computers that are then held for ransom, but also the market for vulnerabilities that become part of products with different purposes and costs.
The general-purpose computer tends to disappear due to mechanisms of planned obsolescence and licensing: we can no longer be owners of the machines we buy; they are only rented to us for as long as the term of the software license lasts.
One of the elements that make up Internet technologies is trust. Visualizing its connections and mechanisms is an exercise in understanding the future of these technologies.
Let's imagine someone is in control of millions of computers around the world (including yours and mine), with full privileges on our devices and the ability to update software remotely. This may seem problematic, but it depends on the perspective from which you look at it, because we always place a degree of trust in whichever program we use.
In the groups of tech craftspeople I know, trust has a deep relationship with the limitations of the technology they develop and with its mistakes, especially the recognition of what the code cannot achieve because of its intrinsic characteristics.
One of the cultural values of technological craftsmanship is its technosocial mechanisms for developing trust. These do not resolve the complex tensions that arise from any technology, but they create the conditions for a thinking-doing process: an active task of recognizing its faults and possibilities.
When I came into contact with free software communities, I learned that there are errors or flaws in software that can imply security problems and that can be exploited as vulnerabilities to achieve intrusions. However, within free software communities the relationship between error and possibility is framed within their own culture, as Richard Sennett often indicates: "… more often than not, when people solve a 'bug', what really happens is that new possibilities open up for the use of the software. The code is a work in progress; it is not a finished and fixed object."
Within technology development practices we find errors of their own: an error in software code is known in English as a bug, and one of its localized words in Spanish is "bicho". There are discussions about when this conceptual association was first made, but among the different accounts there is a story about a moth that caused problems in an electromechanical computer, in other words some kind of interference or malfunction.
I got to know several free software operating systems, and the one I decided would accompany me was Debian GNU/Linux. Throughout the history of the Debian project, approximately 5,400 volunteers have participated in the different tasks of its development. It is estimated that more than 50% of GNU/Linux installations are based on or run this operating system (W3Techs, 2016).
Those who develop Debian are trusted by millions of people. From its very download onward, when we install this operating system we connect ourselves into a network of chains of trust. Seen from that angle, its development community has control of millions of computers in locations all over the world.
Debian is distributed through servers called mirrors, from which a base system is downloaded for installation. These servers are located in various parts of the world and are synchronized so that each one carries the same software; they are usually maintained by volunteers. The mirrors also host a software repository, from which installations of this operating system install programs and receive security updates over the network.
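As an illustration (mirror names vary; these are the project's default hosts), the repositories an installation pulls from are declared in /etc/apt/sources.list with entries such as:

deb http://deb.debian.org/debian stable main
deb http://security.debian.org/debian-security stable-security main

The first line points to a general mirror for installing programs; the second to the archive that delivers security updates.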
The mechanisms for developing trust and networks between collaborators, developers, and those who install this operating system are complex. But from the perspective of craftsmanship, one point of its social contract stands out, number 3, which states:
“We will not hide the problems. We will keep our database of bug reports publicly accessible at all times. Bug reports that users submit will be visible to other users immediately.”
Debian Social Contract
On the other hand, there is a system that makes use of cryptography and of cooperation rituals for the exchange of keys, carried out at annual meetings in which participants exchange digital identities through in-person verification. With these keys, the software you download can be verified, so you know it was indeed prepared by this community and not by other people.
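A rough sketch of the verification step, in Python, with a hypothetical file name and a placeholder checksum (in practice the checksum file itself is signed with the developers' keys and verified with a tool such as GnuPG):

import hashlib

iso_path = "debian-netinst.iso"   # hypothetical file name, for illustration only
published_sha256 = "0" * 64       # in practice, copied from the signed SHA256SUMS file

h = hashlib.sha256()
with open(iso_path, "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):  # hash the image in 1 MiB chunks
        h.update(chunk)

if h.hexdigest() == published_sha256:
    print("The image matches what the community published.")
else:
    print("Checksum mismatch: do not trust this download.")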
This community constantly faces lessons involving patents, laws, jurisdictions, codes of ethics, licensing methods, trademarks, and economic and political forces.
On the legal side, these communities have not only built licenses that allow software code to be used, studied, modified, and redistributed. In the traffic between legal code and software code, they have also learned that dealing with a patent offers three possibilities: avoiding the patent, obtaining a license to use it, or invalidating it in court (Stallman, 2002), as well as the use of patents to defend free and open-source software projects.
The dynamic in this type of community regarding its failures is that when a vulnerability is found, it is corrected in the code and the fix is made available to those who use the operating system. We face, however, the market for vulnerabilities that has recently become visible in the media: the so-called zero-day vulnerabilities, whose management model is tied to the development of surveillance technologies and services. Because they are known only to a small group of people and there is no fix, they are the raw material for intrusion products that seek efficiency, such as malware.
Recently, another lesson learned in spaces close to these development practices is reflected in a declaration by the Tor Project, in which they say: "We cannot build free and open-source tools that protect journalists, human rights activists, and ordinary people around the world if we also control who uses those tools."
In recent years, projects such as Debian, F-Droid, FreeBSD, Fedora, Tails, and Tor Browser, among others, have been working on something called Reproducible Builds, which aims to develop tools to verify that the binary code you run on your machine is indeed the code prepared by these projects. In their own words:
“Reproducible Builds are a set of software development practices that create a verifiable path from human-readable source code to binary code used by computers.”
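The underlying idea can be sketched in a few lines (my own illustration, far simpler than the projects' actual tooling): if two parties build the same source code under the same documented conditions, the resulting binaries should be bit-for-bit identical, which can be checked by comparing their hashes.

import hashlib

def digest(path):
    # SHA-256 of a file's full contents.
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# Hypothetical paths: the same source code built independently by two parties.
my_build = "build-mine/program.bin"
published_build = "build-project/program.bin"

if digest(my_build) == digest(published_build):
    print("Bit-for-bit identical: the binary really comes from that source code.")
else:
    print("The builds differ: the path from source to binary cannot be verified.")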
Technology is always going to be vulnerable, whether because of its intrinsic design, its context, or a mix of factors. The difference lies in the mechanisms we have for trusting it. Technological craftsmanship is one possible way to learn about ourselves as we build new technologies.
The communication and digital culture group Sursiendo (2017) refers to the fact that within technological biodiversity there are "groups that continue to build an internet environment on a daily basis that values people and collective processes, privacy and anonymity, the circulation of knowledge and mutual learning."
One of the main tensions for present and future technological craftsmanship is that bugs are part of the economy of computer attacks, an economy that has been adopted by the mechanisms of financial capitalism.
The management of errors and failures is one of the points of trust between the artisans and the people within their communities. In the medium term, mechanisms for recognizing the work of people involved in vulnerability research will play an important role, since they will be part of the scaffolding that fights computer attacks and their economies.
It is worth pondering these processes that have made it possible to build technologies of high-quality craftsmanship, but above all technologies that allow us to learn about ourselves.
References:
Milan, S. (2013). Social Movements and Their Technologies. Palgrave Macmillan.
Sennett, R. (2009). El artesano. España: Anagrama.
Sterling, B. (February 1993). A Short History of the Internet.
Sursiendo. (2017). Biodiversidad tecnológica para saltar los jardines vallados de internet. Available at: https://sursiendo.org/blog/2017/05/biodiversidad-tecnologica-para-saltar-los-jardines-vallados-de-internet
Sparrow, E. (2014, October 9). How Silicon Valley Learned to Love Surveillance. Available at: https://modelviewculture.com/pieces/how-silicon-valley-learned-to-love-surveillance
Sánchez, M. (2016). Por Amor a La Libertad. Seminario de ética hacker. Universidad del Claustro de Sor Juana, México.
Löding, T., & Rosas, E. (producers). (2017). Telecomunicaciones Independientes en Resistencia. Mexico: Producciones Marca Diablo.
Debian Social Contract. Available at: https://www.debian.org/social_contract.es.html
Murdock, I. (1993). The Debian Manifesto. Available at: https://www.debian.org/doc/manuals/project-history/ap-manifesto.es.html
Hacking Team: a zero-day market case study. Available at: https://tsyrklevich.net/2015/07/22/hacking-team-0day-market/
Tor Project. (2017). The Tor Project Defends the Human Rights Racists Oppose. Available at: https://blog.torproject.org/blog/tor-project-defends-human-rights-racists-oppose
Wolf, G. Fortalecimiento del llavero de confianza en un proyecto geográficamente distribuido. Available at: https://gwolf.org/node/4055
W3Techs. (2016). Historical trends in the usage of Linux versions for websites. Available at: https://w3techs.com/technologies/history_details/os-linux
Reproducible Builds. Available at: https://reproducible-builds.org/
Stallman, R. Software patents — Obstacles to software development. Available at: https://www.gnu.org/philosophy/software-patents.html
Gómez, T. (2017). Los hackers que conectan a pueblos olvidados. Available at: https://newsweekespanol.com/2017/02/los-hackers-que-conectan-a-pueblos-olvidados/