How I Rescued a 1987 USENET Game, Tracked Down Its Creators Across Time, and Brought It into the Modern Era
Author: Juan Méndez (vejeta) | September 2025
PROLOGUE: The Echo of a Lost World
«I heard news of the request to release the code. I grant permission to release the code under GPL.» — Adam Bryant, February 23, 2011
«I enjoyed the article and liked seeing the map images in particular. It definitely brought back very fond memories! If only working a job to pay the bills hadn’t gotten in the way…» — Adam Bryant, September 2025
Fourteen years separate these two messages. The first rescued a stalled project; the second revealed the emotional depth of a digital era that no longer exists. This is the story of how a two-decade obsession gave the pioneers of software their voice back and built a bridge between two irreconcilable Internets.
CHAPTER 1: The Digital Time Capsule (1987)
Image: Full screenshot of the original «conquest» posting showing the shar code.
October 26, 1987. While most of the world could not even imagine the Internet, on the USENET servers a user named ihnp4!mhuxd!smile (Edward Barlow) posted to comp.sources.games:
This message was more than an announcement; it was an artifact of a philosophy that would define an era.
The Logistics of shar: When Software Traveled in Fragments
Downloading Conquer in 1987 was not a click. It was a rite of technical initiation:
```bash
#! /bin/sh
# This is a shell archive. Remove anything before this line, then unpack
# it by saving it into a file and typing "sh file".
```
Each of the 5 parts was a script containing the code encoded as text. The process required:
Downloading each part manually from USENET
Running sh part01 to rebuild the files
Repeating for parts 2–5
Compiling by hand with cc -o conquer *.c -lcurses
If part 3 failed, the process stopped. There was no resume download. You waited days or weeks until it reappeared in the feed.
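The ritual above can be sketched as a small shell helper. The part file names and the final compile line mirror the steps just described, but the helper itself is illustrative, not a historical tool:

```shell
# Illustrative sketch of the 1987 unpack ritual: each shar part is
# itself a shell script that recreates its files when run with sh.
unpack_shar_parts() {
    for part in "$@"; do
        if [ ! -f "$part" ]; then
            # No resume download in 1987: a missing part meant waiting
            # for it to be reposted to the feed.
            echo "missing $part - wait for it to reappear in the feed"
            return 1
        fi
        sh "$part"   # unpack this fragment
    done
}

# Typical session, assuming all five parts were saved from the feed:
#   unpack_shar_parts part01 part02 part03 part04 part05
#   cc -o conquer *.c -lcurses
```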
The Collective Act of Faith
Posting to comp.sources.games meant submitting your work to the scrutiny of thousands of the world's best engineers. There were no curated stores and no prior review. The community was the quality control.
Ed Barlow included this note in the README:
«What you have here is a copyrighted beta test version of CONQUEST.»
There were no «closed betas». The trust was that anonymous users at universities around the world would test, report bugs, and improve the code. It was open source before the term existed.
CHAPTER 2: University Days – The Discovery (1990s)
In the mid-'90s, as a student in Seville, I spent countless hours in the Unix labs exploring an emerging digital world: green terminals, USENET, links, news, msgs – and, of course, Conquer.
The game was revolutionary. As the leader of a nation, you controlled your elven kingdom, orcish empire, or human armies across a map rendered in ASCII characters. The depth was astonishing:
Detailed economic management by sector
Complex diplomatic systems between races
Magic and spells with effects on the world
Exploration of unknown territories
The Social Ritual of the Game
It was not a solitary game. The ritual was deeply social:
Connect via SSH to a shared Unix machine
Make your moves during the day, between classes
Wait for the nightly conqrun that processed the turns
Receive emails with the results every morning
We organized games that lasted weeks, with daily or weekly turns. The slowness was a feature, not a flaw. It allowed for deep strategy and built a community around plain text.
I have not found photos of the lab from that era, but I did find its description on the faculty's old web pages:
CHAPTER 3: The Quest Begins – Digital Detective (2006)
Image: Full screenshot of the debian-legal thread showing the initial query.
By 2006, this piece of computing history was trapped in legal limbo. I started what I thought would be a simple project: getting permission to relicense the code under the GPL and packaging it for modern Linux distributions.
Finding the original authors was digital archaeology. Email addresses from the '80s had been dead for years. I combed through old university directories, followed digital breadcrumbs, and eventually reached Ed Barlow.
The «Do It and See What Happens» Philosophy
Our instant-messaging conversation in 2006 revealed the mindset of the era:
(18:08:58) vejeta: Sorry if I catch you busy. While trying to investigate if I could relicense old conquer as free software I discovered it was complex.
(18:14:00) Ed Barlow: i personally would say that you should do it and nobody will know the difference
(18:18:52) vejeta: Indeed, I feel like a detective 🙂
(18:21:23) Ed Barlow: v4 was both of ours… i wrote v4 he wrote v5…
When I asked about previous commercial arrangements, his answer was revealing:
«i dont know… i dont have adams mail address at all… dunno if they did anythign with the license tho»
That attitude encapsulated the pre-commercial era of the Internet: build out of passion, share by default.
But Adam Bryant had vanished into the digital ether. I documented everything on the Debian Legal mailing lists and created GNU Savannah task #5945. The project stalled.
CHAPTER 4: The Long Wait and the Unexpected Breakthrough (2006–2011)
Image: Screenshot of Adam Bryant's 2011 email arriving out of the blue.
Years passed. Then, on February 23, 2011, magic happened. My phone buzzed with a contact-form submission:
«I heard news of the request to release the code. I grant permission to release the code under GPL.» – Adam Bryant
He had found one of my articles online and reached out on his own. After five years of searching, the main barrier was gone.
The Community as Collective Memory
Meanwhile, informal conversations kept the flame alive. In 2011, while I was explaining the project to a friend, his enthusiasm was contagious:
kike: man, you've got me completely hooked on this conquer thing, you can't do that to a hopeless gambling addict like me XDDD
vejeta: …That was the pre-Internet era. We entertained ourselves with other things: cracking copy protections, games like conquer, dominion, ASCII art :). IRC put an end to all that.
kike: if you need cheap labor for anything conquer-related, count me in…
These exchanges show that the project was never an end in itself, but a vehicle for reviving a whole way of understanding technology.
CHAPTER 5: The Plot Thickens – Version 5 Emerges (2025)
Image: Side-by-side comparison of the Conquer v4 and v5 interfaces.
Just when the story seemed complete, Stephen Smoogen contacted me in 2025 about my 2006 relicensing efforts. He was particularly interested in Conquer Version 5 – Adam Bryant's complete rewrite, with advanced features:
Automatic data conversion between versions
Improved stability and administrative tools
Sophisticated event systems
An expanded administration interface
But V5 had a different legal history, including commercial arrangements from the '90s. Would Adam agree to GPL this version too?
His response: «I have no issues with applying a new GPL license to Version 5 as well.»
The 14-Year Arc
Adam's journey between his two messages tells a universal story:
2011: «I grant permission to release the code under GPL.»
2025: «If only working a job to pay the bills hadn’t gotten in the way…»
This 14-year arc reveals the eternal tension between passion and responsibility that every creator faces.
CHAPTER 6: The Missing Pieces – PostScript Magic and a Tragic Loss
Image: Sample PostScript output from MaF's utilities.
The network of contributors kept expanding. I discovered MaF (Martin Forssen), who created PostScript utilities for generating printable maps of the game – crucial in the pre-GUI era, when players needed physical copies to strategize.
Tracking down MaF in 2025 led me to his new email address. His reply was immediate and generous: «Oh, that was a long time ago. But yes, that was me. And I have no problem with relicensing it to GPL.»
Richard Caley: More Than Just a Legal Footnote
Image: Screenshot of Richard Caley's preserved home page showing the announcement.
But not all searches end with an answer. Some end with silence.
My investigation into Richard Caley followed the same digital breadcrumbs. I traced him to the University of Edinburgh, where he worked on speech synthesis. I found his technical contributions to FreeBSD. But the trail went cold around 2005.
«Richard Caley suffered a fatal heart attack on the 22nd of April, 2005. He was only 41, but had been an undiagnosed diabetic, probably for some considerable time. His web pages remain as he left them.»
Reading those words felt different from finding a historical record. This was not archival research – it was walking into someone's house years after they had gone and finding a note on the table.
The page continued:
«Over and above his tremendous ability with computers and programming, Richard had a keen mind and knowledge of an extraordinary range of topics, both of which he used in frequent contributions to on-line discussions. Despite his unique approach to speling, his prolific contributions to various news group debates informed and amused many over the years.»
The «Caleyisms» – The Man Behind the Code
Image: Screenshot of the «Caleyisms» page showing his witty replies.
And then I discovered his «Caleyisms» – a curated collection of his most brilliant USENET replies, which revealed not just a programmer but a person:
What’s a shell suit? «Oil company executive.»
How do you prepare for a pyroclastic flow hitting Edinburgh? «Hang 1000 battered Mars bars on strings and stand back?»
On his book addiction: «I never got the hang of libraries, they keep wanting the things back and get upset when they need a crowbar to force it out of my hands.»
His humor was dry, intelligent, and uniquely British. In technical discussions he could be brutally precise:
«Lack of proper punctuation, spacing, line breaks, capitalisation etc. is like bad handwriting, it doesn’t make it impossible to read what was written, just harder. But you probably write in green crayon anyway.»
A Digital Office Preserved
Exploring his preserved website felt like walking through his digital office. The directory structure revealed his passions: FreeBSD how-tos, POVRAY experiments, wallpaper images, technical projects. His self-deprecating humor shone through in his «About» section:
«Thankfully I don’t have a photograph to inflict on you. Just use the picture of Iman Bowie to the left and then imagine someone who looks exactly the opposite in every possible way. This probably explains why she is married to David Bowie and I’m not.»
Here was a complete person – technical director at Interactive Information Ltd, speech synthesis researcher, FreeBSD enthusiast, Kate Bush fan, and a wit who brightened countless online discussions.
The legal reality was harsh: Richard's contributions to Conquer could not be relicensed. The university could not help contact his heirs because of privacy laws.
Image: The RIP ASCII art from Richard's page.
His friends had preserved his memory with a simple ASCII tribute at the end of his page:
```text
^_^ (O O) \_/@@\ \\~~/ ~~ - RJC RIP
```
In the Conquer project documentation, Richard Caley is not remembered as a «problem case» or «unlicensable code». He is honored as the vibrant person he was – the brilliant mind behind the «Caleyisms», the researcher who contributed to speech synthesis, the FreeBSD advocate, and the witty participant in early online communities whose words continue to amuse and inform, decades after he wrote them.
CHAPTER 7: Technical Renaissance – From USENET to CI/CD
Image: Side-by-side comparison of the 1987 shar file and the 2025 GitHub Actions configuration.
The technical transformation has been remarkable. Modernizing 1987 code poses unique challenges that demand both respect for the original and adoption of modern technologies.
Makefile Archaeology
```make
# Original 1987 - hardcoded dependencies
conquer: conquer.c utils.c
	cc -o conquer conquer.c utils.c -lcurses -ltermcap
```

```m4
# Modern 2025 - Autotools and automatic detection
AC_INIT([conquer], [4.6])
AC_PROG_CC
AC_CHECK_LIB([ncurses], [initscr])
```
I had to replace old-system assumptions with automatic library detection. The original Makefiles assumed specific curses versions and library locations that no longer exist.
The Magic of ttyd – Terminals in the Browser
```dockerfile
# Dockerfile fragment - a perfect technological bridge
FROM alpine:latest
RUN apk add ttyd ncurses
COPY conquer /usr/local/bin/
CMD ["ttyd", "-p", "7681", "conquer"]
```
This setup lets the original curses interface render in modern browsers without modifying a single line of the 1987 code. It is a technological bridge that respects the original while enabling modern access.
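Building and serving the container can be sketched as follows. The image name is illustrative, and the docker commands are composed as strings and printed rather than executed, so the sketch stays self-contained:

```shell
# Hypothetical build-and-serve recipe for the ttyd container above.
# IMAGE is an illustrative name, not the project's published image.
IMAGE=conquer-web
PORT=7681

build_cmd="docker build -t $IMAGE ."
run_cmd="docker run --rm -p $PORT:$PORT $IMAGE"

# Print the commands instead of running them, so this runs anywhere:
printf '%s\n%s\n' "$build_cmd" "$run_cmd"
# After 'docker run', the game is at http://localhost:7681 in a browser.
```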
The same game that once traveled as 5 USENET parts now ships binary packages with cryptographic hashes and verifiable provenance. The irony: I discovered Melange when a friend started working for the company that created it.
C codebase updated to modern C99
Debian packaging integration
APK packaging with Melange for Alpine Linux
Docker containers with terminal emulation via WebSockets
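The APK side of that list can be sketched as a melange build file. The pipeline names below follow melange's built-in autoconf pipelines, but every field here is illustrative rather than the project's actual configuration:

```yaml
# Hypothetical melange sketch (values illustrative, not the real config).
package:
  name: conquer
  version: "4.6"
  epoch: 0
  description: Classic 1987 USENET strategy game

pipeline:
  # melange ships reusable pipeline steps for autotools projects:
  - uses: autoconf/configure
  - uses: autoconf/make
  - uses: autoconf/make-install
```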
CHAPTER 8: Historical Context – Conquer in the Unix Gaming Ecosystem
Image: Excerpt from the 1989 «versions of empire» list showing Conquer among other games.
Conquer did not exist in a vacuum. The 1989 «versions of empire» list reveals a flourishing ecosystem of multiplayer strategy games:
Empire (PSL, UCB, UCSD): different variants maintained by universities
Galactic Bloodshed: focused on exploration and terraforming
xconq: one of the first to offer an X Window interface
Buck Empire: improvements and debugging of a PSL Empire version
Why Did Conquer Survive When Others Vanished?
Modular architecture: easy to extend with new races and spells
Complete documentation: included administration guides and data-format descriptions
Active community: Adam Bryant maintained patches for years
Portability: written in standard C, with no exotic dependencies
While commercial games of the same era disappeared along with their platforms, Conquer survived because it was plain text and open source before the term existed.
The 1989 list introduced Conquer like this:
«A multiplayer fantasy wargame written from scratch by Ed Barlow… Not really empire, but close enough to be easily understood by those used to empire. Currently supported by co-author Adam Bryant.»
This reveals a culture in which «competitors» listed one another in a spirit of community. The value lay not in intellectual property but in contributing to a shared ecosystem.
CHAPTER 9: The Human Element – Why This 20-Year Odyssey Matters
Image: Collage showing Ed Barlow's original code, Adam Bryant's 2011 email, and the 2025 team conversation.
This is not just about preserving games. It is about preserving the history of computing itself.
The Software Pioneers
Ed Barlow and Adam Bryant built sophisticated multiplayer experiences when most people had never heard of the Internet. They distributed software through USENET because that was what you did – you shared cool things with the community.
Martin Forssen and his PostScript utilities represent the ingenuity of early developers who solved problems with the tools at hand. Want to visualize the game state? You wrote a PostScript generator!
Community and Continuity
The 20-year relicensing effort demonstrates something crucial about open source: it is not just about code, it is about community and continuity. Every time someone maintains a legacy project, documents its history, or tracks down lost contributors, they are weaving the threads that connect computing's past to its future.
Adam's journey between his two messages – from technical permission to emotional reflection – captures why preservation matters: the code survives, but the human stories are lost if we do not record them.
EPILOGUE: Lessons from Software Archaeology
1. Document everything. Those casual USENET posts became crucial legal evidence decades later.
2. License clearly. Ed's comment that «copyleft didnt exist when i wrote it» highlights how licensing landscapes evolve.
3. Community is everything. Adam found my articles because the preservation community was talking about the project.
4. Modern tools can revive old code. Melange and CI/CD gave 1987 software a renaissance in 2025.
5. Technical debt is temporal. What looks like legacy technology today may be tomorrow's archaeological treasure.
6. Preserve the stories, not just the code. Richard's «Caleyisms» are as valuable as his technical contributions.
THE STORY CONTINUES
Both Conquer games are now fully licensed under GPLv3 with modern packaging. They are not just playable software, but a complete case study in:
Software archaeology
Legal frameworks for preservation
The evolution of development practices across four decades
The next chapter: teaching these classic strategy games to a new generation of developers and players, while demonstrating that the right legal frameworks and modern tooling can give any historical software a second life.
Sometimes the best way to learn cutting-edge technology is by applying it to preserve computing history.
What historical software deserves preservation in your field? Have you ever traced the lineage of code back to its original creators?
How I spent two decades tracking down the creators of a 1987 USENET game and learned modern packaging tools in the process.
The Discovery: A Digital Time Capsule from 1987
Picture this: October 26, 1987. The Berlin Wall still stands, the World Wide Web does not yet exist, and software is distributed through USENET newsgroups as text files split across multiple posts. On that day, Edward Barlow posted something special to comp.sources.games:
That’s how Ed Barlow announced it at the time, as «Conquest», before quickly changing the name to Conquer.
This was Conquer – a sophisticated multi-player strategy game that would influence countless others. Players controlled nations in Middle Earth, managing resources, armies, magic systems, and diplomatic relations. What made it remarkable wasn’t just the gameplay, but how it was built and distributed in an era when «open source» wasn’t even a term yet.
Chapter 0: University Days.
It was during these days, in the mid-'90s, that my fellow students and I spent hours experimenting in the Unix computer labs with terminals, USENET, links, news, msgs, and of course: conquer. That game was a gem: you played the leader of a country, and on a map rendered in characters each player could control their elven kingdom, orcish empire, or human armies, fighting one another while managing every detail of the economy.
But by 2006, this piece of computing history was trapped in legal limbo.
Chapter 1: The Quest Begins (2006)
As a university student in Spain in the mid-'90s, I'd encountered Conquer in the Unix labs. Fast forward to 2006, and I realized this pioneering game was at risk of being lost forever. The source code existed, scattered across ancient USENET archives, but its licensing was unclear – typical of the «post it and see what happens» era of early internet software distribution.
I started what I thought would be a simple project: get permission from the original authors to relicense the code under GPL so it could be properly preserved and packaged for modern Linux distributions.
Simple, right?
Chapter 2: Digital Detective Work
Finding Edward Barlow and Adam Bryant in 2006 was like archaeological work. Email addresses from the 1980s were long dead. USENET posts provided few clues. I scoured old university directories, googled fragments of names, and followed digital breadcrumbs across decades-old forums.
The breakthrough came through pure persistence and a bit of luck. After months of searching, I managed to contact Ed Barlow. His response was refreshingly casual: «Yes i delegated it all to adam aeons ago. Im easy on it all…. copyleft didnt exist when i wrote it and it was all for fun so…»
But there was a catch – I needed permission from Adam Bryant too, and he seemed to have vanished into the digital ether.
Chapter 3: The Long Wait (2006-2011)
I documented everything on the Debian Legal mailing lists, created a GNU Savannah task (#5945), and even wrote blog posts hoping Adam would find them. The legal experts were clear: I needed explicit written permission from both copyright holders.
Years passed. The project stalled.
Then, on February 23, 2011, something magical happened. My phone buzzed with a contact form submission:
«I heard news of the request to release the code. I grant permission to release the code under GPL.» – Adam Bryant
He had found one of my articles online and reached out on his own.
Chapter 4: The Plot Twist – Version 5 Emerges (2025)
Fast forward to 2025: Stephen Smoogen contacted me about my 2006 relicensing efforts, particularly interested in reviving Conquer Version 5 – a complete rewrite by Adam with advanced features like automatic data conversion, enhanced stability, and sophisticated administrative tools. This wasn’t just an update; it was a complete reimagining of the game.
But V5 had a different legal history. In the ’90s, there had been commercial arrangements. Would Adam agree to GPL this version too?
His response: «I have no issues with applying a new GPL license to Version 5 as well.»
Chapter 5: The Missing Piece – PostScript Magic
Just when I thought the story was complete, I discovered another contributor: MaF (Martin Forssen), who had created PostScript utilities for generating printable game maps – a crucial feature in the pre-GUI era when players needed physical printouts to strategize.
Tracking down MaF in 2025 led me to his new email. His response: «Oh, that was a long time ago. But yes, that was me. And I have no problem with relicensing it to GPL.»
Richard Caley: More Than Just a Legal Footnote
But not all searches end with an answer. Some end with silence.
My investigation of Richard Caley followed the same digital breadcrumbs. I traced him to the University of Edinburgh, where he worked on speech synthesis. I found his technical contributions to FreeBSD. But the trail went cold around 2005.
Then I found him – not in a USENET archive, but on the front page of his own website, preserved exactly as he left it in web.archive.org.
«Richard Caley suffered a fatal heart attack on the 22nd of April, 2005. He was only 41, but had been an undiagnosed diabetic, probably for some considerable time. His web pages remain as he left them.»
Reading those words felt different from finding a historical record. This wasn’t archival research – this was walking into someone’s house years after they’d gone and finding a note on the table.
The page continued:
«Over and above his tremendous ability with computers and programming, Richard had a keen mind and knowledge of an extraordinary range of topics, both of which he used in frequent contributions to on-line discussions. Despite his unique approach to speling, his prolific contributions to various news group debates informed and amused many over the years.»
The «Caleyisms» – The Man Behind the Code
And then I discovered his «Caleyisms» – a curated collection of his most brilliant USENET responses that revealed not just a programmer, but a person:
What’s a shell suit?
«Oil company executive.»
How do you prepare for a pyroclastic flow hitting Edinburgh?
«Hang 1000 battered Mars bars on strings and stand back?»
On his book addiction:
«I never got the hang of libraries, they keep wanting the things back and get upset when they need a crowbar to force it out of my hands.»
His humor was dry, intelligent, and uniquely British. In technical discussions, he could be brutally precise:
«Lack of proper punctuation, spacing, line breaks, capitalisation etc. is like bad handwriting, it doesn’t make it impossible to read what was written, just harder. But you probably write in green crayon anyway.»
A Digital Office Preserved
Exploring his preserved website felt like walking through his digital office. The directory structure revealed his passions: FreeBSD how-tos, POVRAY experiments, wallpaper images, technical projects. His self-deprecating humor shone through in his «About» section:
«Thankfully I don’t have a photograph to inflict on you. Just use the picture of Iman Bowie to the left and then imagine someone who looks exactly the opposite in every possible way. This probably explains why she is married to David Bowie and I’m not.»
Here was a complete person – technical director at Interactive Information Ltd, speech synthesis researcher, FreeBSD enthusiast, Kate Bush fan, and a wit who brightened countless online discussions.
The legal reality was harsh: Richard’s contributions to Conquer couldn’t be relicensed. The university couldn’t help contact heirs due to privacy laws.
His friends had preserved his memory with a simple ASCII tribute at the end of his page:
^_^ (O O) \_/@@\ \\~~/ ~~ - RJC RIP
In the Conquer project documentation, Richard Caley isn’t remembered as a «problem case» or «unlicensable code.» He’s honored as the vibrant person he was – the brilliant mind behind the «Caleyisms,» the researcher who contributed to speech synthesis, the FreeBSD advocate, and the witty participant in early online communities whose words continue to amuse and inform, decades after he wrote them.
Chapter 6: Modern Renaissance – Enter GitHub, CI/CD and Modern Distributions
Here’s where the story gets really interesting. While working on preserving these Unix classics, I decided to learn modern packaging techniques. I chose to implement both APK (Alpine Linux) and Debian packaging for the games.
For APK packages, I used Melange – a sophisticated build system that creates provenance-tracked, reproducible packages for the Wolfi «undistro». The irony? I discovered this tool when a friend started working for the company that created it.
Chapter 7: The Technical Journey – From USENET to Modern CI/CD
Original Conquer v4 code, by Ed Barlow and Adam Bryant
(Conquer running in docker container alongside Apache, Curses to WebSockets output thanks to ttyd. Now we can play through the web!)
Conquer Version 5 – The evolution of the classical Conquer, by Adam Bryant
Chapter 8: The Human Element: Why This Matters
This isn’t just about preserving old games – it’s about preserving the story of computing itself. Ed Barlow and Adam Bryant were pioneers who built sophisticated multiplayer experiences when most people had never heard of the internet. They distributed software through USENET because that’s what you did – you shared cool things with the community.
Martin Forssen’s PostScript utilities represent the ingenuity of early developers who solved problems with whatever tools were available. Want to visualize your game state? Write a PostScript generator!
The 20-year relicensing effort demonstrates something crucial about open source: it’s not just about code, it’s about community and continuity. Every time someone maintains a legacy project, documents its history, or tracks down long-lost contributors, they’re weaving the threads that connect computing’s past to its future.
Lessons for Modern Developers
Document everything: Those casual USENET posts became crucial legal evidence decades later
License clearly: Ed’s comment that «copyleft didnt exist when i wrote it» highlights how licensing landscapes evolve
Community matters: Adam found my articles because the community was talking about preservation
Technical debt is temporal: What seems like legacy tech today might be tomorrow’s archaeological treasure
Modern tools can revive ancient code: Melange and modern CI/CD gave 1987 software a 2025 renaissance
The Continuing Story
Both Conquer games are now fully GPL v3 licensed and available with modern packaging. They represent not just playable software, but a complete case study in software archaeology, legal frameworks for preservation, and the evolution of development practices across four decades.
The next chapter? Teaching these classic strategy games to a new generation of developers and gamers, while demonstrating that proper legal frameworks and modern tooling can give any historical software a second life.
Sometimes the best way to learn cutting-edge technology is by applying it to preserve computing history.
What historical software deserves preservation in your field? Have you ever traced the lineage of code back to its original creators?
How a simple documentation contribution evolved into a full-scale packaging solution with automated CI/CD, multi-distribution support, and deep technical problem-solving
Author: Juan Manuel Méndez Rey | Date: October 30, 2025 | Reading Time: 25 minutes | Technical Level: Advanced
The Beginning: A Documentation Gap
Several years ago, while working with Stremio on Debian systems, I encountered the familiar frustration of Linux users everywhere: a great application with poor installation documentation. The official Stremio releases worked fine on some distributions, but Debian users were left to figure out dependencies, compilation steps, and integration challenges on their own.
That’s when I contributed the original DEBIAN.md file to the Stremio shell repository. It was a straightforward build guide—install these dependencies, run these commands, copy these files. Simple, but functional.
Years passed. Dependencies changed. Qt versions evolved. The simple build instructions became increasingly unreliable on modern Debian systems, and the GitHub issues piled up with frustrated users unable to compile Stremio.
The Problem Grows
By 2025, the situation had become untenable:
Dependency conflicts: The upstream .deb package required libmpv1, but modern Debian ships libmpv2
Missing QML modules: Critical Qt5 components weren’t documented as dependencies
Compilation complexity: Users needed to install 15+ build dependencies to compile from source
No proper integration: Desktop files, icons, and system integration required manual work
The upstream .deb package is outdated: it provides version 4.4.168.
The list continues…
The GitHub issues were a testament to user frustration—dozens of reports about compilation failures, missing dependencies, and broken installations.
The Debian Way: Proper Packaging
Rather than continue patching documentation, I remembered a discussion with my friend Arturo about properly packaging Stremio for Debian; he had created an RFP (Request for Package) for Stremio back in 2020. Years passed while I went about my usual day-to-day work. This past month I decided to fulfill my old dream of becoming an official Debian contributor, and to solve this properly through the Debian packaging system. In late 2025, I filed an Intent To Package (ITP) with Debian:
stremio-server package (non-free) – Proprietary streaming server v4.20.12
Technical Deep Dive: System Library Migration
The most challenging aspect was replacing ALL bundled git submodules with Debian system libraries. This wasn’t just about dependencies—it required fixing fundamental runtime issues.
Problem: System libsingleapplication-dev v3.3.4 caused segmentation faults when used with QQmlApplicationEngine.
Investigation:
# Test with system library:
sudo apt install libsingleapplication-dev
# Build and run: Segmentation fault
# Test without SingleApplication:
# Remove from CMakeLists.txt: Works perfectly
Root Cause: System library sets up threading context incompatible with Qt5 QML engine initialization. The library uses internal threading mechanisms that conflict with QQmlApplicationEngine’s event loop.
Solution: Custom CompatibleSingleApp implementation. This also replaces one of the bundled submodules, whose MIT license was recently changed to a dubious license that may be incompatible with the Debian DFSG. See https://github.com/itay-grudev/SingleApplication/issues/210
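For reference, the single-instance pattern itself is simple: try to take an exclusive lock at startup and bail out if another instance already holds it. CompatibleSingleApp realizes this with Qt primitives; the sketch below shows only the same concept in plain shell with flock(1) and a hypothetical lock path, not the actual implementation:

```shell
#!/bin/sh
# Conceptual single-instance guard. The real CompatibleSingleApp uses Qt
# IPC instead of a file lock; only the acquire-or-exit pattern is the same.
LOCK="${TMPDIR:-/tmp}/stremio-demo.lock"
exec 9>"$LOCK"              # open (and create) the lock file on fd 9
if flock -n 9; then          # exclusive, non-blocking lock attempt
  echo "primary instance"    # lock acquired: run the application
else
  echo "secondary instance"  # lock held elsewhere: hand off and exit
fi
```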
Challenge 3: QProcess Environment Variables for Node.js Server 🔥 CRITICAL
Problem: Streaming server failed to start with cryptic error:
server-crash 0 null
TypeError [ERR_INVALID_ARG_TYPE]: The "path" argument must be of type string. Received undefined
at Object.join (node:path:1292:7)
Investigation:
# Manual server test works:
$ /usr/bin/node /usr/share/stremio/server.js
EngineFS server started at http://127.0.0.1:11470
# But QProcess launch fails:
timeout 15s stremio
# Error: server-crash 0 null
Root Cause: QProcess does not inherit environment variables by default. Node.js server.js requires:
HOME – for configuration directory (~/.stremio-server)
USER – for process identification
PWD – for relative path resolution
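The root cause is easy to reproduce outside Qt: a child process sees only the environment it is explicitly handed. A quick shell check, with no Stremio involved:

```shell
# Launch a child with an empty environment, exactly the situation server.js
# hit under QProcess: HOME is simply absent rather than pointing anywhere.
env -i sh -c 'echo "${HOME:-unset}"'
# prints: unset
```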
Solution: Explicit environment setup in stremioprocess.cpp:
void Process::start(QStringList args) {
    // Set up environment variables for the Node.js server
    QProcessEnvironment env = QProcessEnvironment::systemEnvironment();

    // Ensure essential environment variables are set for server.js
    if (!env.contains("HOME")) {
        env.insert("HOME", QStandardPaths::writableLocation(QStandardPaths::HomeLocation));
    }
    if (!env.contains("USER")) {
        env.insert("USER", qgetenv("USER"));
    }
    if (!env.contains("PWD")) {
        env.insert("PWD", QDir::currentPath());
    }
    this->setProcessEnvironment(env);

    // Now start the process
    QProcess::start(this->program(), args);
}
Verification:
# After fix:
$ timeout 15s build/stremio 2>&1 | grep -A 5 "hls executables"
hls executables located -> { ffmpeg: '/usr/bin/ffmpeg', ffsplit: null }
Using app path -> /home/user/.stremio-server
Enabling casting...
Discovery of new external device "mpv" - MPV
EngineFS server started at http://127.0.0.1:11470
Impact: Complete resolution of streaming functionality. Users can now stream media via BitTorrent, use casting, and access all server features.
Build verification with qmake (in addition to the CMake build):
QT_SELECT=5 qmake
QT_SELECT=5 make
# Result: 278KB optimized binary
Both build systems produce working binaries linked entirely against system libraries.
Debian Packaging: The Proper Way
Package Structure
stremio (4.4.169+dfsg-1):
debian/
├── changelog # Version history with ITP closure
├── control # Dependencies and package metadata
├── copyright # GPL-3.0+ licensing details
├── rules # Build instructions (dh-based)
├── patches/ # Quilt patches for system integration
│ ├── 0001-Fix-server.js-path-for-FHS-compliance.patch
│ ├── 0002-disable-server-download.patch
│ ├── 0004-minimal-qthelper-integration.patch
│ ├── 0005-cmake-system-libraries-v4.4.169.patch
│ ├── 0007-add-qtwebengine-initialize-fix.patch
│ ├── 0008-add-compatible-singleapp-implementation.patch
│ ├── 0009-remove-system-singleapplication-add-compatible.patch
│ ├── 0010-fix-qmake-install-paths.patch
│ └── 0011-fix-qprocess-environment-for-server-launch.patch
├── stremio.desktop # Desktop integration
├── stremio.install # File installation rules
├── watch # Upstream version monitoring
└── source/
└── format # 3.0 (quilt) format
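The dh-based `rules` file in that tree is typically only a few lines. A sketch is shown below; the override target is an assumption for illustration (it passes the Qt5 selection used elsewhere in this article), and the packaged file may differ:

```makefile
#!/usr/bin/make -f
# Minimal dh-sequencer rules file: dh drives configure, build, install, test.
%:
	dh $@

# Illustrative override (an assumption, not the packaged file): pass the
# Qt5 selection to the CMake configure step.
override_dh_auto_configure:
	dh_auto_configure -- -DQT_DEFAULT_MAJOR_VERSION=5
```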
Key debian/control sections:
Source: stremio
Section: video
Priority: optional
Maintainer: Juan Manuel Méndez Rey <vejeta@gmail.com>
Build-Depends:
debhelper-compat (= 13),
cmake,
qtbase5-dev,
qt5-qmake,
qt5-qmake-bin,
qtdeclarative5-dev,
qtwebengine5-dev,
qttools5-dev,
qml-module-qtwebchannel,
qml-module-qt-labs-platform,
qml-module-qtwebengine,
qml-module-qtquick-dialogs,
qml-module-qtquick-controls,
qml-module-qt-labs-settings,
qml-module-qt-labs-folderlistmodel,
libmpv-dev,
libssl-dev,
nodejs,
npm,
pkg-kde-tools
Standards-Version: 4.6.2
Homepage: https://www.stremio.com/
Vcs-Git: https://salsa.debian.org/mendezr/stremio.git
Vcs-Browser: https://salsa.debian.org/mendezr/stremio
Package: stremio
Architecture: amd64
Depends: ${shlibs:Depends}, ${misc:Depends},
nodejs,
mpv,
librsvg2-2,
qml-module-qtwebengine,
qml-module-qtwebchannel,
qml-module-qt-labs-platform,
qml-module-qtquick-controls,
qml-module-qtquick-dialogs,
qml-module-qt-labs-settings,
qml-module-qt-labs-folderlistmodel,
qtbase5-dev-tools
Description: Modern media center for streaming video content
Stremio is a video streaming application that aggregates content from
various sources. It features a modern Qt5/QML interface with support
for add-ons, local playback via MPV, and integration with streaming
services.
.
This package provides the desktop client with GPL-licensed components.
This GPL-client plus non-free-server split follows industry practice, as seen with VS Code, Docker Desktop, and Firefox ESR.
debian/copyright documents source:
Format: https://www.debian.org/doc/packaging-manuals/copyright-format/1.0/
Upstream-Name: stremio-server
Source: https://dl.strem.io/server/v4.20.12/desktop/server.js
Comment: Pre-downloaded server.js included in source package to comply
with Debian Policy prohibiting network access during builds.
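For the GPL client, the `watch` file listed in the package layout lets uscan track upstream tags on GitHub. A sketch along these lines; the regular expressions are illustrative and the packaged file may differ:

```
# Sketch of a version=4 watch file tracking upstream GitHub tags.
version=4
opts=filenamemangle=s%.*/v?(\d[\d.]+)\.tar\.gz%stremio-shell-$1.tar.gz% \
  https://github.com/Stremio/stremio-shell/tags \
  .*/v?(\d[\d.]+)\.tar\.gz debian uupdate
```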
Beyond Debian: The Wolfi Contribution
While working on Debian packaging, I also contributed Stremio packages to Wolfi, Chainguard’s security-focused Linux distribution. This involved:
Melange build files: Cloud-native package format
Security hardening: ASLR, stack protection, RELRO
OSI license compliance: GPL components only (excluding the proprietary server.js, the same file we separated into a non-free package for Debian)
Reproducible builds: Hermetic build environment
Melange configuration example:
package:
name: stremio
version: 4.4.169
epoch: 0
description: Modern media center for video streaming
license: GPL-3.0-or-later
environment:
contents:
packages:
- qt5-qtbase-dev
- qt5-qtdeclarative-dev
- qt5-qtwebengine-dev
- mpv-dev
- openssl-dev
pipeline:
- uses: cmake/configure
- uses: cmake/build
- uses: cmake/install
subpackages:
- name: stremio-doc
description: Documentation for stremio
I used this parallel effort as an exercise in learning how proper packaging is done across different distribution ecosystems.
Pull Request: https://github.com/wolfi-dev/os/pull/69098
GitHub Gists with usage examples: https://gist.github.com/vejeta/859f100ef74b87eadf7f7541ead2a2b1
The Distribution Challenge: GitHub-Powered APT Repository
Official Debian inclusion takes time—months or years of review, testing, and refinement. Meanwhile, users needed a solution now. Traditional approaches like hosting packages on a personal server would create bandwidth and maintenance problems.
The solution: Modern APT repository hosting using GitHub infrastructure.
Result: APT repository served at https://debian.vejeta.com/ with:
✅ Global CDN (CloudFlare)
✅ HTTPS encryption
✅ Unlimited bandwidth
✅ Zero hosting costs
✅ 99.9%+ uptime
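Under the hood, a GitHub Actions workflow along these lines can build the packages and publish the APT tree through GitHub Pages. This is a sketch: job names, paths, and the Pages action are assumptions, not the exact workflow behind debian.vejeta.com, and Release-file signing is omitted for brevity:

```yaml
# Illustrative workflow: build the .deb in CI and publish a flat APT tree
# via GitHub Pages. Names and paths are assumptions for this sketch.
name: apt-repo
on:
  push:
    tags: ['v*']
jobs:
  build-and-publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Build the package
        run: |
          sudo apt-get update
          sudo apt-get build-dep -y ./
          dpkg-buildpackage -us -uc
      - name: Assemble APT metadata
        run: |
          mkdir -p repo/pool/main
          mv ../*.deb repo/pool/main/
          cd repo
          dpkg-scanpackages --multiversion pool /dev/null > Packages
          gzip -k Packages
      - name: Publish to GitHub Pages
        uses: peaceiris/actions-gh-pages@v4
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          publish_dir: repo
```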
Critical Lessons Learned: Patch Development Best Practices
During this project, I made significant efficiency mistakes in patch development. Here’s what I learned:
The Inefficiency Problem
What I did (5+ iterations of patch rework):
Modified source files directly in working repository
Generated patches from modified state
Patches failed on clean upstream
Repeated entire process multiple times
Impact: ~70% wasted time in patch development
The Correct Approach
Efficient patch development workflow:
# Step 1: Clean upstream baseline
git clone --branch v4.4.169 https://github.com/Stremio/stremio-shell.git /tmp/patch-test
cd /tmp/patch-test
# Step 2: Analyze dependencies BEFORE making changes
echo "=== Mapping file dependencies ==="
grep -r "#include" *.cpp *.h | grep -v "Qt\|std"
grep -r "class.*:" *.h
grep -r "Q_OBJECT" *.h
# Step 3: Make ONE fix at a time
vim main.cpp # Add QtWebEngine::initialize()
git diff > /tmp/0007-qtwebengine-fix.patch
# Step 4: Test patch application
git checkout .
patch -p1 < /tmp/0007-qtwebengine-fix.patch
mkdir build && cd build && cmake .. && make
# Step 5: If successful, continue to next fix
# If failed, refine current patch before moving on
Pre-Patch Analysis Template
Before creating patches, ALWAYS complete this analysis:
## Files to Modify
- [ ] main.cpp - QtWebEngine initialization
- [ ] mainapplication.h - class definitions
- [ ] CMakeLists.txt - build system
- [ ] compatible_singleapp.h/cpp - new custom implementation
## Dependency Chain
1. main.cpp includes → mainapplication.h
2. mainapplication.h includes → singleapplication.h (to be replaced)
3. CMakeLists.txt references → SingleApplication (to be removed)
4. Qt MOC processes → Q_OBJECT classes (check for conflicts)
## Build Test Plan
1. [ ] Clean cmake build
2. [ ] Dependency verification (ldd)
3. [ ] Runtime functionality test
4. [ ] Package build test (dpkg-buildpackage)
Validation Before «Ready» Declaration
NEVER declare patches ready without:
# MANDATORY validation workflow
mkdir /tmp/patch-validation
cd /tmp/patch-validation
git clone --branch v4.4.169 <upstream-url> .
# Apply ALL patches
export QUILT_PATCHES=debian/patches
quilt push -a || { echo "FAIL: Patch application"; exit 1; }
# Complete build test
mkdir build && cd build
cmake .. && make || { echo "FAIL: Build"; exit 1; }
# Package build test
cd .. && dpkg-buildpackage -us -uc || { echo "FAIL: Package"; exit 1; }
# Dependency check
ldd build/stremio | grep -E "(libQt5|libmpv|libcrypto)"
# ONLY NOW declare "patches ready"
echo "✅ Validated and ready for production"
This workflow prevents the «ready → fails → rework» cycle that wastes development time.
Production Validation: Comprehensive Testing
Isolated Environment Validation
Test setup:
# Create pristine environment
mkdir /tmp/stremio-patch-validation
cd /tmp/stremio-patch-validation
git clone --branch v4.4.169 https://github.com/Stremio/stremio-shell.git .
cp -r /path/to/debian .
# Apply all patches
export QUILT_PATCHES=debian/patches
quilt push -a
# Result: All 6 patches applied successfully
# Test CMake build
mkdir build && cd build
cmake .. -DQT_DEFAULT_MAJOR_VERSION=5
make -j$(nproc)
# Result: 293KB binary with 100% system libraries
# Test release.makefile
cd .. && QT_DEFAULT_MAJOR_VERSION=5 make -f release.makefile
# Result: Complete success including icon generation
# Verify dependencies
ldd build/stremio | head -5
# Output:
# libQt5WebEngine.so.5 => /lib/x86_64-linux-gnu/libQt5WebEngine.so.5
# libQt5DBus.so.5 => /lib/x86_64-linux-gnu/libQt5DBus.so.5
# libcrypto.so.3 => /lib/x86_64-linux-gnu/libcrypto.so.3
# libmpv.so.2 => /lib/x86_64-linux-gnu/libmpv.so.2
Verification results:
✅ Binary builds successfully (293KB)
✅ GUI loads and displays
✅ Single-instance behavior works
✅ Streaming server starts (port 11470 responds)
✅ System library integration complete
✅ No crashes or threading issues
Runtime Validation
Complete functionality test:
# Launch application
./build/stremio 2>&1 | tee /tmp/stremio-runtime.log
# Verify server startup (first 15 seconds)
timeout 15s ./build/stremio 2>&1 | grep -E "(server|streaming|port)"
# Output:
# hls executables located -> { ffmpeg: '/usr/bin/ffmpeg', ffsplit: null }
# Using app path -> /home/user/.stremio-server
# Enabling casting...
# EngineFS server started at http://127.0.0.1:11470
# Test server endpoint
curl -s http://127.0.0.1:11470 && echo "✓ Server responding"
# Test single-instance behavior
./build/stremio &
PID1=$!
sleep 2
./build/stremio # Should detect first instance and exit
wait $PID1
User Experience: Installation Simplified
I wanted other Debian users to be able to install these packages, built to the highest standards, as soon as possible, while the package is still under review by Debian Developers. My solution was to create a repository that, via GitHub Actions, pulls the package sources from salsa.debian.org, builds them automatically, creates a release, and serves a Debian repository through GitHub Pages, giving Debian users 99% availability when fetching them.
The end result is a one-command installation for users:
Note: Ubuntu support is experimental with automated builds but limited manual testing. Community feedback welcome.
Closing the Loop: Updating Documentation
With a working solution deployed, I returned to where it all started—the documentation. I submitted a comprehensive pull request to update the original DEBIAN.md file I had contributed years earlier.
The PR adds:
✅ APT repository installation (new recommended method)
✅ Complete dependency lists
✅ Modern security practices (proper GPG key management)
✅ Multi-distribution support (Debian + derivatives)
✅ Maintained build instructions (preserved for developers)
Community Impact
Within hours of submitting the PR, I commented on 10+ existing GitHub issues where users had reported installation problems. The response was immediate and positive—users could finally install Stremio without compilation headaches.
Technical Achievements Summary
Packaging Excellence
✅ Zero bundled dependencies: 100% Debian system libraries
✅ FHS compliance: Proper /usr installation paths
✅ License separation: GPL client (main) + proprietary server (non-free)
✅ Policy compliance: Lintian-clean packaging
✅ Independent versioning: Client v4.4.169 + Server v4.20.12
Technical Solutions
✅ QtWebEngine initialization fix: Single line prevents all QML crashes
[x] Source packages created following Debian Policy
[x] Lintian-clean packaging
[x] 100% system libraries
[x] FHS compliance
[x] Copyright file with complete licensing
[x] Watch files for upstream monitoring
[x] git-buildpackage workflow
[x] Packages hosted on Salsa
[x] ITP bug filed
[x] Preliminary Debian Developer review
[ ] Sponsorship obtained
[ ] Upload to Debian NEW queue
Timeline: Submission planned for Q1 2025
Professional Applications
This work directly supports my goal of becoming an official Debian Package Maintainer. It can also serve as a guide for others who want hands-on experience with:
Packaging expertise: Modern Debian packaging workflows with complex applications
DevOps proficiency: CI/CD pipeline design and GitHub Actions automation
Problem-solving skills: Deep debugging (QtWebEngine, threading, environment issues)
Community engagement: Solving real user problems at scale
Infrastructure design: Scalable, cost-effective distribution systems
Documentation: Comprehensive technical writing for diverse audiences
Future Evolution
The architecture proved so successful that I’m considering replicating it for other packaging projects. The pattern of using GitHub infrastructure for APT repository hosting could benefit many projects struggling with distribution challenges.
Potential applications:
Personal package repository for experimental Debian packages
Other media applications requiring complex Qt5/WebEngine setups
This journey reinforced several fundamental principles:
1. Documentation is Infrastructure
Good documentation isn’t just text—it’s the foundation that enables user adoption and community growth. The original DEBIAN.md file evolved into a complete packaging and distribution solution.
2. Packaging is Product Design
How users install and maintain software is part of the user experience. Poor packaging creates friction; good packaging eliminates it. It is the difference between a fragile multi-step compile from source and a single apt install.
3. Modern Infrastructure Democratizes Distribution
By leveraging GitHub’s infrastructure (Actions, Pages, Releases), a single developer can provide enterprise-grade distribution infrastructure with zero operational overhead. This democratizes software distribution.
4. Standards Enable Ecosystems
Following Debian packaging standards meant the same packages work across multiple distributions (Debian, Ubuntu, Kali) and can integrate with the official Debian archive.
5. Deep Technical Understanding Pays Off
The critical fixes (QtWebEngine initialization, threading compatibility, environment variables) required deep understanding of:
Qt5 initialization order
QML engine threading model
QProcess environment inheritance
Node.js runtime requirements
Surface-level knowledge wouldn’t have solved these problems.
6. Proper Testing Prevents Rework
The patch development lessons learned (70% time wasted on rework) demonstrate that upfront validation investment prevents significant downstream waste. Test against clean upstream early and often.
7. Independent Versioning Respects Reality
Using independent version numbers for stremio (4.4.169) and stremio-server (4.20.12) follows industry practice and accurately represents upstream development. Convenience versioning creates confusion.
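The confusion is mechanical, not just cosmetic: version-comparison tools would rank the server's stream ahead of every client release if the numbers were merged. A quick illustration with coreutils:

```shell
# sort -V orders versions the way packaging tools reason about them:
# 4.20.x sorts *after* 4.4.x, so folding server 4.20.12 into the client's
# 4.4.x stream would make it look newer than every client release.
printf '%s\n' 4.4.169 4.20.12 | sort -V
# prints:
# 4.4.169
# 4.20.12
```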
Acknowledgments
Stremio Team: For creating an excellent media center application
Debian Community: For packaging standards and infrastructure (Salsa)
GitHub: For free hosting, CI/CD, and unlimited bandwidth
Qt Project: For excellent cross-platform framework
Debian Developers (Arturo): For preliminary review and guidance on ITP #943703
What started as a simple documentation contribution evolved into a comprehensive packaging and distribution solution. By combining traditional Debian packaging principles with modern CI/CD infrastructure, it’s possible to deliver professional-grade software distribution that scales globally.
The journey from «how do I install this?» to «apt install stremio» represents more than technical progress—it’s about removing friction between great software and the people who want to use it.
Sometimes the best solutions come full circle. Years after contributing installation documentation, I’ve returned to ensure that documentation describes a process that actually works reliably for everyone.
The technical challenges (QtWebEngine initialization, threading compatibility, environment variables) required deep problem-solving and systematic debugging. The infrastructure challenges (multi-distribution builds, APT repository hosting, CI/CD automation) required modern DevOps practices and cloud-native thinking.
But ultimately, this project is about enabling users. Every technical decision, every patch, every workflow optimization serves the goal of making Stremio accessible to Debian and Ubuntu users through a simple, reliable installation process.
Part of my ongoing work toward becoming a Debian Package Maintainer.
If you found this article helpful, please consider sharing it.
Local Package Testing
# Install package
sudo dpkg -i ../stremio_*.deb
# Fix dependencies if needed
sudo apt install -f
# Test binary
stremio --version
which stremio
# Check dependencies
ldd /usr/bin/stremio | grep -E "(libQt5|libmpv|libcrypto)"
# Run application
stremio
Repository Management
# Add GPG key
wget -qO - https://debian.vejeta.com/key.gpg | \
sudo gpg --dearmor -o /usr/share/keyrings/stremio-debian.gpg
# Add repository (choose your distribution)
echo "deb [signed-by=/usr/share/keyrings/stremio-debian.gpg] https://debian.vejeta.com trixie main non-free" | \
sudo tee /etc/apt/sources.list.d/stremio.list
# Update and install
sudo apt update
sudo apt install stremio stremio-server
# Verify installation
dpkg -L stremio
systemctl --user status stremio-server # If systemd service installed
Debugging Runtime Issues
# Run with debug output
QT_DEBUG_PLUGINS=1 stremio
# Run in headless mode (for testing)
QT_QPA_PLATFORM=offscreen stremio
# Disable WebEngine sandbox (for containers)
QTWEBENGINE_DISABLE_SANDBOX=1 stremio
# Check server process
ps aux | grep server.js
lsof -i :11470
# Manual server test
/usr/bin/node /usr/share/stremio/server.js
# Test with gdb
gdb --args stremio
(gdb) run
(gdb) bt # If crash occurs