Saturday, 31 January 2026

Stellaris and the Long Shadow of DLC

Like many Paradox titles, Stellaris lives and evolves through an extensive DLC model. This is hardly unique within Paradox’s catalogue, but Stellaris handles DLC somewhat differently than, for example, Europa Universalis IV, and those differences matter more than they first appear.

In EU IV, DLCs traditionally focused on specific regions, nations, or political systems. Even without owning a particular expansion, you could still benefit indirectly from most new mechanics; you simply did not get the full experience when playing the nations or areas explicitly enhanced by that DLC. Over time, many once-essential mechanics—such as those introduced in Art of War or Res Publica—have been folded into the base game, making the barrier to entry much lower today.

Stellaris, by contrast, divides its DLCs into several distinct categories, each with a very different impact. Species Packs—such as Plantoids, Lithoids, Necroids, Aquatics, Toxoids, and the more recent Astral Planes-era additions—primarily add new portraits, civics, traits, and cosmetic assets. Some DLCs go further: Synthetic Dawn, nominally a story pack, introduced fully playable Machine Empires, which fundamentally changed how population, economy, and ethics could function.

More significant are the major expansion DLCs. These add or overhaul core systems: diplomacy (Federations), ascension paths and megastructures (Utopia), espionage (Nemesis), leaders and paragon mechanics (Galactic Paragons), or vassalage and subject empires (Overlord). Unlike EU IV, where missing DLC often feels like missing options, missing major Stellaris expansions can feel like missing entire layers of the game. The difference between playing with or without Utopia alone is substantial.

Then there are the Story Packs. These focus primarily on flavor: anomalies, archaeological sites, event chains, leviathans, and relics. While they rarely change the game’s mechanical core, they significantly enhance its sense of mystery and discovery—arguably one of Stellaris’ strongest qualities.

A single Stellaris campaign can be extremely long, even by 4X standards. The game offers extensive customization at galaxy creation: size, number of AI empires, crisis strength, technology cost, population growth scaling, and more. These settings can dramatically alter both pacing and difficulty. That said, Stellaris is a game where you do not need to finish a campaign to feel satisfied. In fact, many games become less engaging in the late stages, even with extensive automation enabled.

Personally, I find the early and mid-game the most rewarding. Exploration, surveying, uncovering ancient civilizations, and piecing together the history of long-dead empires is where Stellaris shines brightest. As the game progresses, it increasingly shifts toward pure empire management—optimizing economies, managing diplomacy, or engaging in large-scale warfare. By the mid-game, resource scarcity often gives way to abundance, with minerals and energy effectively becoming infinite if infrastructure is well developed. Strategic resources and empire-wide modifiers become the real constraints.

One area where Stellaris offers more depth than many of its genre peers is ship design. By default, the game automatically designs and updates ship classes as new technologies are researched, making it perfectly playable without manual intervention. However, players who wish to engage more deeply can design ships themselves, choosing weapon types, defenses, and combat roles. This system adds genuine strategic complexity, particularly when countering specific enemy builds.

Fleet combat, however, remains somewhat awkward in places. Naval capacity limits the total fleet you can reasonably maintain, with larger hull classes consuming more capacity per ship. While exceeding this limit is possible, the economic penalties escalate quickly, as the sketch below illustrates. On larger galaxies, this often forces players to invest heavily in naval capacity infrastructure simply to defend a sprawling empire. Early on, fleets must be split due to long travel times, making it difficult to respond to threats on opposite sides of the galaxy.
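The toy model below is an illustration only: the linear over-cap scaling is my assumption, and the actual upkeep formula is more involved and has changed across patches.

```python
# Toy model of over-capacity fleet upkeep (illustrative only).
# ASSUMPTION: upkeep scales with the used/available capacity ratio
# once over the cap; the real Stellaris formula is more involved.
def fleet_upkeep(base_upkeep: float, used: int, capacity: int) -> float:
    """Return monthly upkeep, inflated once the fleet exceeds capacity."""
    over_ratio = max(1.0, used / capacity)
    return base_upkeep * over_ratio

print(fleet_upkeep(100.0, 180, 200))  # under cap -> 100.0
print(fleet_upkeep(100.0, 300, 200))  # 50% over  -> 150.0
```

The shape matters more than the numbers: once over the cap, every additional ship makes the whole fleet more expensive to run, not just itself.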

Later in the game, this issue is mitigated through advanced mobility options: Gateways, Wormholes, Jump Drives, and L-Gates significantly reduce response times and change the strategic landscape. Still, the early-game tension between geography and fleet size is very real.

Starbases and fortresses, while useful, rarely function as true defensive bulwarks in the way fortifications do in EU IV. Even heavily upgraded starbases are unlikely to defeat a determined enemy main fleet on their own. They delay, attrit, and support—but they do not replace mobile forces. There is nothing quite equivalent to a siege-based war of attrition.

In the end, Stellaris remains a game I return to regularly. Even so, I still find myself preferring Europa Universalis IV overall. Despite Stellaris being the newer title, EU IV feels more polished, perhaps because it operates within tighter historical and mechanical boundaries. Its systems change less dramatically over the course of a campaign, but what it does, it does exceptionally well.

Ironically, one of my long-standing frustrations with EU IV—the reliance on abstract monarch power—was largely addressed first in Stellaris, where leaders, institutions, and infrastructure play a more organic role. With Europa Universalis V now tackling the same issue and introducing what appear to be even more complex systems, it will be interesting to see how these designs continue to evolve.

Either way, Stellaris remains an ambitious, evolving, occasionally unwieldy but endlessly fascinating experiment in grand-scale science-fiction strategy—and one that still feels uniquely Paradox.

Thursday, 29 January 2026

Torment: Tides of Numenera – Choices, Companions, and the Weight of Immortality

When Torment: Tides of Numenera was first announced, I was immediately drawn in. Billed as a spiritual successor to Planescape: Torment, it promised the same deeply philosophical storytelling, complex moral choices, and richly imagined companions that made the original such a landmark. I backed the Kickstarter early on, joining a record-breaking campaign that reflected the sheer excitement surrounding the project.

Even now, years after its release, Torment remains a remarkable experience. While some critics and players note that it didn’t fully deliver on every promise made during the Kickstarter, the game excels in areas that matter most: story, dialogue, worldbuilding, and the sense that every choice carries weight. Companions feel alive, each with their own agendas, quirks, and perspectives, and the game’s visual design—while more grounded than the abstract planes of Planescape—is polished, evocative, and frequently stunning.

One of the game’s standout systems is the Crisis mechanic, which governs tension and story pacing. Crises are high-stakes moments triggered by pivotal story beats, dangerous locations, or morally significant decisions. During these sequences, options are limited, skill checks and dialogue become more consequential, and even combat can be intensified. Success moves the story forward, often unlocking new quests, narrative insights, or powerful items, while failure—or the choices you deliberately make—can have lasting consequences. Interestingly, not all encounters are crises; many puzzles and narrative challenges can be solved without combat, and these solutions often feel more creative and rewarding.

Replacing the traditional alignment system is the game’s Tide system, a brilliant innovation that tracks five abstract values representing facets of the player’s personality and ethical stance. Every decision nudges one or more tides, influencing companions’ reactions, narrative outcomes, and even dialogue options. Unlike a simple “good vs. evil” scale, tides provide a nuanced moral compass that encourages experimentation and replay. It’s easy to make choices that produce unexpected outcomes, and the game even allows for permanent failure very early on—a striking way to underscore the stakes from the moment you begin.
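As a rough mental model of how such a system might be wired, consider this minimal Python sketch. The five Tides (Blue, Red, Indigo, Gold, Silver) are from the game, but the magnitudes and the "dominant tide" rule here are invented purely for illustration.

```python
# Minimal sketch of a Tide-style morality tracker (illustrative only).
# The five Tide names are from the game; the numbers and the dominance
# rule below are made up for this example.
from collections import Counter

tides = Counter({t: 0 for t in ("Blue", "Red", "Indigo", "Gold", "Silver")})

def nudge(effects: dict[str, int]) -> None:
    """Apply a decision's effects; one choice may move several tides."""
    tides.update(effects)

nudge({"Gold": 2})              # a charitable, self-sacrificing choice
nudge({"Red": 1, "Indigo": 1})  # passionate, but tempered by fairness
print(tides.most_common(1))     # [('Gold', 2)] -- the tide others react to
```

The point is that no single axis is "good" or "evil"; companions and events respond to whichever currents your accumulated choices have strengthened.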

Immortality, a hallmark of the Torment legacy, is cleverly interwoven here. You can die and return multiple times, and in some puzzles, death is not a failure but a deliberate solution. This reinforces the game’s themes of identity, consequence, and the cyclical nature of actions—echoing the philosophical weight that made the original Planescape so compelling. The story begins in medias res, with your character falling from the sky, immediately immersing you in a sense of urgency and mystery.

The world of Torment is rich throughout, but Act 2’s The Bloom stands out. This semi-sentient, dreamlike environment is not only visually striking but thematically resonant, tying memory, identity, and consequence together in ways that make exploration genuinely meaningful. The attention to detail and the interplay between companions, tides, and moral choices all contribute to a level of replayability that rewards patient, thoughtful players.

It’s worth noting that the game never received official DLC, which is a shame given the depth and richness of its world. A free post-launch update did expand companion interactions and add minor content, but the base game alone already provides dozens of hours of complex narrative and strategic choice.

While some may argue that Tides of Numenera didn’t entirely live up to the hype generated during its Kickstarter, it remains a compelling, thoughtful RPG. It demonstrates how rich narrative, meaningful choice, and philosophical depth can coexist with tactical gameplay, and it arguably set the stage for later innovations in narrative-driven RPGs, as seen in Baldur’s Gate III. For anyone drawn to story-heavy role-playing with consequences that truly matter, Torment remains an essential experience.

Wednesday, 28 January 2026

Brandon Sanderson, the Cosmere, and the Perils of Screen Adaptation

As reported by both Polygon and Reactor, Brandon Sanderson is once again talking openly about screen adaptations of his work—and, perhaps more importantly, about the degree of control he intends to retain over them. That detail alone makes this worth paying attention to.

Hollywood adaptations of popular fantasy and science fiction have a long and uneven history, and optimism rarely comes without caveats. Still, the idea of a Cosmere adaptation—handled with care, budget, and authorial involvement—is difficult not to find at least somewhat intriguing.

One of the more reassuring points in the reporting is Sanderson’s insistence on creative control. This is not a minor detail. Even the strongest source material can be mangled beyond recognition in the wrong hands. The Rings of Power is a good recent example: lavishly produced, technically impressive, and yet—at least to my mind—fundamentally hollow. Beautiful, highly polished crap is still crap.

Sanderson’s position suggests he is acutely aware of this risk. His hesitation to move forward without meaningful influence over the end result reads less like stubbornness and more like hard-earned realism.

That said, I remain only cautiously optimistic. We have seen no shortage of ambitious adaptation plans fail to materialize or collapse under their own weight. Patrick Rothfuss’ Kingkiller Chronicle is perhaps the most infamous example: once positioned as a multi-format franchise spanning film, television, and games, it ultimately produced nothing at all. Other projects—American Gods, The Dark Tower, and even parts of The Witcher—serve as reminders that success in one medium does not translate automatically to another.

The timing is also notable. With The Wheel of Time ending after its third season, and The Witcher approaching its conclusion with the upcoming fifth, there is a clear vacuum in big-budget, ongoing fantasy television. Streaming platforms are still hungry for “the next thing,” even if they have become more cautious after several expensive misfires.

In that context, the Cosmere makes a certain kind of sense. It is expansive, internally consistent, and already structured as a shared universe—something studios understand instinctively. At the same time, that same complexity is a risk. Worldbuilding depth is a strength on the page, but can easily become a liability on screen if mishandled.

For now, this remains firmly in the realm of “interesting possibility” rather than imminent reality. Still, if there is one modern fantasy author I would trust to push back against the worst excesses of adaptation-by-committee, Sanderson is a reasonable candidate.

You can read the original reporting at Polygon and Reactor.

If nothing else, it is refreshing to see an author enter these discussions with eyes open—and with a clear understanding that fidelity, restraint, and good judgment matter just as much as spectacle.

Darkest Dungeon – Stress, Madness, and the Cost of Survival

When Darkest Dungeon was released, it didn’t just add another entry to the long tradition of dungeon-crawling games — it fundamentally changed how failure, trauma, and psychological pressure were represented. Instead of focusing solely on hit points and loot, the game introduced stress as a central mechanic, forcing you to deal with the mental consequences of repeatedly sending people into horrifying places. It’s a small shift on paper, but one that gives the entire game a very different tone.

A Bleak Setting

The game is set around a ruined ancestral estate, long abandoned and corrupted by unspeakable forces. As the heir, you return to reclaim the land and purge the darkness festering beneath it. From this estate — known simply as the Hamlet — you recruit heroes, manage resources, and plan expeditions into nearby dungeons. The narrative framing is minimal, but effective, reinforced by the narrator’s grim delivery and the oppressive art style.

The Hamlet: Your Only Safe Haven

The Hamlet serves as your hub between dungeon runs. It’s here that you recruit new adventurers, manage stress, upgrade buildings, and decide which expeditions to undertake next. Crucially, the Hamlet is not just a menu system — it’s a strategic layer that directly affects your long-term survival.

Stress accumulated in dungeons doesn’t simply disappear when a party returns. Instead, heroes need time and resources to recover. The Hamlet offers several ways to manage this, such as taverns, abbeys, and other facilities that help reduce stress, though often with side effects. As a result, you’re encouraged — or forced — to maintain multiple parties, rotating them as others recover. This creates a steady rhythm between risk-taking and recovery, and makes long-term planning essential.

Over time, the Hamlet itself can be upgraded using resources brought back from the dungeons. These upgrades improve everything from hero training to stress relief, but progress is slow and hard-won.

Stress as a Core Mechanic

Stress is where Darkest Dungeon truly stands apart. Characters accumulate stress from almost everything: seeing horrific enemies, triggering traps, suffering critical hits, or watching party members get injured or killed. Once a character reaches 100 stress, they suffer a resolve check. More often than not, they break — gaining a negative affliction that affects performance and can spread stress to others.

Occasionally, a hero will instead “pull through” and become virtuous, gaining powerful buffs. These moments feel rare and dramatic, and while they can be influenced by certain trinkets or abilities, they’re never something you can fully rely on. This constant uncertainty reinforces the game’s themes of risk and fragility.
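A toy version of that check, for flavor: the 100-stress trigger is the game’s own rule as described above, while the 25% base virtue chance is a commonly cited community figure and should be treated as an assumption.

```python
import random

# Toy model of the resolve check (illustrative only). The 100-stress
# trigger is the game's rule; the 25% base virtue chance is an assumed
# value based on commonly cited community figures.
STRESS_THRESHOLD = 100

def resolve_check(stress: int, virtue_chance: float = 0.25) -> str | None:
    """Return 'virtuous', 'afflicted', or None if no check triggers."""
    if stress < STRESS_THRESHOLD:
        return None
    return "virtuous" if random.random() < virtue_chance else "afflicted"

print(resolve_check(87))   # None -- still holding together
print(resolve_check(100))  # usually 'afflicted', occasionally 'virtuous'
```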

Dungeon Exploration

Dungeons are built as networks of rooms connected by corridors. Rooms can contain battles, curios, treasures, or events, while corridors function almost like side-scrolling spaces where traps, hidden doors, and ambushes can occur. The pacing alternates between tense exploration and brutal combat, with very little room to relax.

Different dungeon regions are “flavored” with distinct enemies, environmental effects, and resistances. This means party composition and skill selection matter a great deal, and what works well in one dungeon might perform poorly in another. This variety adds both complexity and replayability.

Combat and Party Positioning

Combat in Darkest Dungeon is turn-based and heavily dependent on positioning. Each hero can only equip four skills at a time, similar to Diablo, and many skills can only be used from — or target — specific positions in the party lineup. Some abilities move characters forward or backward, while others reposition enemies.

This system allows for deep tactical play, but it can also be punishing. If enemies shuffle your party, or if you’re ambushed during rest, your carefully planned setup can fall apart. Sometimes you’ll lose an entire round just moving characters back into positions where they can actually use their skills, which can feel frustrating — but also reinforces the game’s unforgiving nature.

Roguelike Roots

Darkest Dungeon leans closer to a roguelike than a roguelite. Heroes die permanently, and if a party wipes in a dungeon, you lose both the characters and any items they were carrying. Progress is retained only through resources brought back to the Hamlet and improvements made there. While you can always recruit new heroes, they start fresh, reinforcing the sense that individuals are expendable — but experience and preparation are not.

Atmosphere Above All

Visually, Darkest Dungeon is entirely 2D, but the art direction is exceptional. The heavy lines, exaggerated animations, and grim color palette create a constant sense of dread. The sound design and narration further elevate the experience, making even routine actions feel ominous.

Over time, some animations can become repetitive, but the option to speed them up helps maintain momentum during longer play sessions.

Final Thoughts

Darkest Dungeon isn’t just difficult — it’s oppressive by design. The game wants you to feel worn down, uncertain, and constantly on the edge of disaster. For some players, that can be too much. For others, it’s precisely what makes the experience memorable.

By tying psychological pressure, resource management, and tactical combat into a single cohesive system, Darkest Dungeon created something that still feels distinctive years after release — a dungeon crawler where survival is as much about managing fear and stress as it is about winning fights.

Sunday, 25 January 2026

How to Invent Everything – A Survival Guide for the Stranded Time Traveler

At first glance, How to Invent Everything looks like it might be a prepper’s dream—or perhaps a post-apocalyptic one. Not a guide to surviving disaster as much as a handbook for restarting civilization after it has already collapsed. That distinction matters. This is less about stockpiling and bunkers, and more about rebuilding knowledge from the ground up. And no, reading it did not turn me into a prepper.


The book presents itself as an official manual meant to accompany a time machine, written by someone from the future (or at least an alternate timeline). Its purpose is simple: if a time traveler gets stranded in the past, this is the book meant to help them rebuild civilization and eventually make their way home. Ryan North, in this framing, is not the author of the manual itself, but the person who recovered it and added the introduction and editorial framing.

The tone is relentlessly humorous. The manual is filled with jokes, footnotes, puns, and playful asides about alternate timelines and paradoxes. I enjoy wordplay and clever footnotes, but this style sometimes feels pushed right to the edge. Each chapter opens with a famous quote now attributed both to its historical originator and to you, the stranded time traveler, since you will eventually travel back and say it yourself. As a joke it works initially, but repeated in every chapter it starts to wear thin. This may simply mean I am not the ideal target audience, even though the subject matter itself aligns very closely with my interests.

In hindsight, the cover probably should have warned me. The presentation clearly signals “geeky humor first, instruction second.” While I try not to judge a book by its cover, this one turns out to be fairly honest about what it is selling. North (or the publisher's marketing department) knows how to market a book, and this is very much aimed at readers who enjoy science, history, and nerd culture as entertainment.

That said, the underlying content is often fascinating. The book is packed with historical anecdotes that highlight just how fragile civilization’s accumulated knowledge really is. One striking example is scurvy: the use of citrus fruits to cure it was discovered, forgotten, and rediscovered no fewer than seven times before it finally stuck. Knowledge does not automatically accumulate—it must be preserved, transmitted, and defended against being lost.

The manual also repeatedly points out how many transformative inventions could have been implemented centuries or even millennia earlier if the right insights had been present. Many of these ideas seem obvious in hindsight, but they were unreachable without a systematic way to test hypotheses. This is where the book makes one of its strongest implicit arguments: the true invention underpinning modern civilization is not any single tool, but the scientific method itself. Philosophy turns into science the moment ideas are tested against reality.

At the same time, the book is light on practical detail. There are no blueprints, few diagrams, and very little that could realistically be followed step by step to rebuild complex technologies. That reinforces the sense that this is more a conceptual guide and a celebration of human ingenuity than a literal instruction manual. It is highly educational, but not especially actionable.

Reading it also invites some uncomfortable modern parallels. Even today, it is easy to see how people dismiss ideas that conflict with their preferred narratives. If education erodes, science can quickly lose its central role as a way of understanding the world. And while I am deeply fascinated by AI, it is hard not to worry about a future where humans gradually lose the ability to do difficult things themselves, having outsourced understanding to machines.

How to Invent Everything works best as a thought experiment and an intellectual curiosity. It is clever, funny, and often insightful, even if its humor occasionally overstays its welcome and its practical ambitions are overstated. More than anything, it serves as a reminder that civilization rests not on technology alone, but on shared knowledge—and that knowledge is far more fragile than we tend to assume.

Friday, 23 January 2026

Across the Obelisk – A Roguelite That Rewards Planning Over Luck

Across the Obelisk is a roguelite deck-building RPG that entered Early Access in 2021 and reached full release in 2022 under Paradox Interactive’s Paradox Arc label, and it has quietly grown into one of my favorite examples of the genre. At first glance it may look like yet another Slay the Spire descendant, but it quickly becomes clear that it is doing several things differently—and in many cases, better.

One of the most unusual features for a roguelite is that Across the Obelisk supports full co-op play. I haven’t explored this aspect extensively, but it is easy to imagine how appealing it is to share the planning, builds, and inevitable disasters with other players. It feels like a natural extension of the party-based design.

As with many Paradox titles, the game follows a long-term DLC strategy. New content is released gradually through paid expansions, often accompanied by free updates. While this model isn’t universally loved, Paradox has shown with games like Europa Universalis IV, Stellaris, and Cities: Skylines that it can sustain a game for many years. In this case, the base game already feels complete, and the DLCs mostly serve to expand replayability rather than gate essential content.

A standard run consists of three main acts followed by a fixed final act, although the latest DLC, Necropolis of the Damned, introduces an alternative ending. Each act is represented by a map filled with branching paths, events, and optional dungeons, often culminating in a miniboss before the act’s final boss. The first act is always the same, but later acts offer choices between different maps, adding strategic variety between runs.

At its core, the game revolves around assembling a party of four characters drawn from four archetypes: Warrior, Scout, Mage, and Healer. Each class has its own deck of cards used both for combat and for resolving events on the map, where card costs effectively function as a dice-like mechanic. The base game includes four starting characters, but many more can be unlocked through play. What makes characters truly distinct is their unique set of cards, which strongly shapes how each one can be built. Some DLCs even introduce dual-class characters, combining decks in ways that can become extremely powerful at higher difficulty levels.

Each character also has a talent tree that persists between runs. This is the main roguelite progression system: while your deck resets every run, talent points carry over, gradually opening up new builds and strategies. This persistence is complemented by a town upgrade system. At the start of each map, towns allow you to buy equipment, upgrade or remove cards, and otherwise fine-tune your decks. Over time, these towns themselves can be upgraded, enabling even more customization and min-maxing. The result is a game that feels far less dependent on luck than many other roguelites.

The game uses three main resources: gold, crystals, and supplies. Supplies are used exclusively to upgrade towns. Gold is the more flexible currency, spent on equipment, pets, and services, while crystals are reserved for crafting and upgrading cards. At the end of each run, you earn gold and crystals that can be carried forward to future runs, allowing you to give yourself a head start. You can store the rewards from up to three completed runs and choose which ones to cash in when starting a new game.

After completing the game, you unlock Madness mode, which allows you to fine-tune the difficulty through a combination of madness levels and corruptors. These modifiers not only make the game harder but also increase the rewards, creating a satisfying risk-reward loop for experienced players.

All of these systems—multiple characters, persistent talents, town upgrades, map choices, and adjustable difficulty—combine to give Across the Obelisk remarkable replayability. It encourages experimentation and planning rather than brute forcing runs and hoping for good card draws.

Overall, Across the Obelisk is one of the strongest roguelite deck builders I’ve played in recent years. I spent a lot of time with Slay the Spire, but I think I ultimately enjoy this more. The deeper customization and reduced reliance on luck make each run feel more deliberate, more strategic, and ultimately more satisfying.

Tuesday, 20 January 2026

Urban Fantasy, Wicca, and Crime Fiction: A First Look at Harm None

This reflection is based solely on reading the first book in the series, Harm None, which often serves as the entry point and tone-setter for M.R. Sellars’ Rowan Gant Investigation novels. The series frequently appears on recommendation lists for urban fantasy, paranormal crime, and supernatural mystery, and after seeing it mentioned often enough, I decided to give it a try.

The premise centers on Rowan Gant, who works as a consultant on occult and supernatural matters for the St. Louis Police Department. Rowan is also, though it is mentioned only in passing, trained as an electrical engineer, a detail that remains largely unexplored in the narrative. In practice, the focus is firmly on his role as an investigator with specialist knowledge of magic and Wiccan belief rather than on any technical or scientific background.

Harm None follows a fairly traditional crime-novel structure: a series of violent and often gruesome murders, a methodical police investigation, and a gradual uncovering of both mundane and supernatural elements behind the crimes. Magic exists alongside standard police procedure rather than replacing it, with interviews, evidence gathering, and procedural routines forming the backbone of the story.

From a stylistic perspective, the prose is serviceable but uneven. The dialogue can feel stiff, and there are occasional moments of awkward phrasing that raise questions about the editing process. That said, this is hardly unique within the genre. Urban fantasy, especially in the late 1990s and early 2000s, was rarely treated as “quality literature,” and many readers likely neither notice nor care about these issues. In that sense, Harm None probably fits comfortably within its genre expectations rather than falling short of them.

What most clearly distinguishes the book is how closely it reflects the author’s personal beliefs. Sellars is a practicing witch, and the novel often feels as much concerned with explaining and defending Wicca as it is with telling a crime story. Large portions of the book are devoted to describing rituals, ethics, and belief systems, sometimes slowing the narrative considerably. At times this functions as worldbuilding; at others, it reads more like instruction or advocacy.

This emphasis is reinforced by how Wicca is portrayed within the story world. Many characters—particularly police officers and members of the public—view it with suspicion, frequently equating it with satanism or cult activity. The novel consistently pushes back against these misconceptions, giving it a clear thematic agenda. Readers with an interest in Wicca may find this aspect engaging or validating, while others may find it heavy-handed.

Despite the graphic nature of some of the crimes, Harm None never fully commits to horror, nor does it lean strongly into psychological thriller territory. The tone remains largely procedural and matter-of-fact throughout, which may be intentional but also prevents the atmosphere from becoming truly unsettling.

As a non-American reader, the setting can feel somewhat stereotypical, leaning into familiar genre shorthand: doughnut-eating cops, religious extremists, and broadly sketched secondary characters. Whether this is simply convention, cultural context, or a subtle form of parody is unclear, but it does contribute to a distinctly “all-American” tone.

The book also shows its age in small but noticeable ways. There are detailed descriptions of dial-up modems, early web searches, and internet research that would have felt helpful or necessary when the book was released in the early 2000s, but now stand out as dated and slightly anachronistic.

Overall, based solely on Harm None, the Rowan Gant Investigation series strikes me as competent and readable rather than exceptional. It is easy to see why it appeals to readers who enjoy paranormal crime fiction or who are interested in Wicca and its representation in fiction. For me, however, it never quite clicked, and I suspect I am simply not the intended audience.

Still, as an example of early-2000s urban fantasy with a strong personal voice behind it, Harm None provides an interesting snapshot of the genre and helps explain why the series continues to be recommended, even if it didn’t fully win me over.

Saturday, 17 January 2026

Neuromancer – Returning to the Origin of Cyberpunk

I don’t remember exactly when I first read Neuromancer. It must have been in Swedish translation, sometime in the early to mid-1990s, before I started reading almost exclusively in English. I later reread it in English perhaps ten or fifteen years ago, and most recently I returned to it in early 2025 after finally reading the rest of the Sprawl trilogy. Despite the book’s enormous influence on me, I was surprised by how little of the actual plot I remembered. In many ways, it felt like reading it again for the first time.

What lingered from those earlier readings were fragments and iconic moments rather than a coherent narrative: the heist against the Sense/Net headquarters to steal the ROM construct of McCoy “Dixie Flatline” Pauley, using a fabricated terrorist threat as a diversion; the tense encounters with and incursions through black ice; the strange, dreamlike quality of cyberspace itself. Even if the details had faded, it was immediately clear on rereading that Neuromancer had left a deep and lasting mark on how I think about technology, fiction, and the future — even if that influence is difficult to fully quantify.

At its core, Neuromancer follows Case, a washed-up hacker unable to access cyberspace after damaging his nervous system, who is pulled into a complex and dangerous job by shadowy employers. What initially appears to be a criminal caper slowly reveals itself to be something far stranger, involving artificial intelligences pushing against the limits imposed on them, fragmented identities, and questions about agency and consciousness. It’s a dense book, but also a remarkably cool one — full of momentum, style, and imagery that still feels potent decades later.

It was also William Gibson who led me onward to Neal Stephenson and Snow Crash, another mind-bending experience that deepened an already growing interest in consciousness and memes — long before “memes” became shorthand for internet jokes. That fascination was at least partially seeded by Neuromancer, with its ideas of information as something viral, contagious, and transformative.

Back then, I didn’t explore Gibson’s bibliography much beyond Neuromancer. I read The Difference Engine (co-written with Bruce Sterling) and Pattern Recognition, and I remember enjoying both, but neither left the same lasting imprint. Neuromancer stood apart — not just as a good novel, but as something foundational.

That foundation is hard to overstate. Neuromancer didn’t just define cyberpunk as a genre; it shaped how popular culture imagines the relationship between humans and computers. From the visual language of cyberspace to the idea of hacking as something immersive and embodied, its influence runs through books, films, and games alike. The Matrix is the most obvious descendant, but traces can be found everywhere: in anime like Ghost in the Shell, in games like Deus Ex, System Shock, and Cyberpunk 2077, and even in how we casually talk about “the Matrix” or “jacking in”. Gibson didn’t predict the future so much as provide it with a vocabulary.

For a long time, it felt impossible to imagine Neuromancer adapted for the screen. When I first read it, the technology and abstractions felt too strange, too internal, too conceptual. Then The Matrix arrived, and suddenly it seemed obvious that it could be done after all. More recently, watching The Peripheral — based on Gibson’s later novel — rekindled my interest in his work and reminded me just how adaptable his ideas can be. Having now reread the entire Sprawl trilogy, I find myself genuinely excited about the upcoming Neuromancer TV series.

So, is Neuromancer still relevant today? I think very much so. Some of the technology is dated, and the future it imagines is not the one we ended up with. But the strength of its vision, its cultural impact, and the fact that it remains a gripping and stylish story make it endure. More than anything, it still feels alive — and I can’t help but look forward to a new generation discovering cyberpunk, perhaps for the first time, through the world Gibson created.

Thursday, 15 January 2026

True Detective (Season One) – Revisiting a Modern Classic

Rewatching True Detective takes me back to a time when streaming still felt special, and HBO was the undisputed standard-bearer for prestige television. This was the era when ambitious, slow-burn storytelling was allowed to breathe, and when the promise of a limited series actually meant something. The Wire still stands, in my mind, as the finest television series ever produced, and it is no small thing to even be mentioned in the same sentence. Yet True Detective has earned that comparison, along with shows like Mindhunter, as one of the defining crime series of its generation.

So what is left to say about True Detective that hasn’t already been said? Perhaps not much in terms of analysis—but revisiting it years later offers a chance to see it through slightly different eyes.

A Brief, Spoiler-Free Recap

The first season of True Detective is set in Louisiana and follows two homicide detectives, Martin Hart and Rust Cohle, over a span of many years. The story begins with the investigation of a disturbing ritualistic murder and unfolds across multiple timelines, shifting between the original case and later interviews in which the detectives are asked to recount their experiences.

Rather than focusing on procedural detail alone, the series is as much about character as it is about crime. Hart and Cohle are deeply different men—personally, philosophically, and morally—and the case gradually becomes a lens through which their worldviews, failures, and obsessions are revealed. The structure allows the show to explore memory, perspective, and truth without ever resorting to cheap twists.

Performances That Define Careers

Watching the series again, I was struck by just how powerful Matthew McConaughey’s performance is. When I first saw True Detective, he was still largely associated with romantic comedies and lighter fare. He had been good in The Lincoln Lawyer, but True Detective was something else entirely. In hindsight, it is difficult not to see this role as the moment that truly catapulted him into a different category of actor.

Rust Cohle is a difficult character to play: cerebral, emotionally withdrawn, prone to long philosophical monologues that could easily have come across as pretentious or absurd in lesser hands. McConaughey makes it work. There is a sense of danger and exhaustion in his performance that feels completely authentic, and it remains compelling even when you know exactly where the story is going.

Woody Harrelson, meanwhile, is equally strong, even if his performance is less surprising. He was already a well-established actor at the time, and he brings a grounded, volatile energy to Martin Hart. The contrast between Hart and Cohle is one of the show’s great strengths. They are almost opposites in temperament and outlook, and their constant friction—arguments, petty resentments, reluctant loyalty—creates a tension that rarely lets up. Their scenes together crackle with energy.

Occult Atmosphere and Unease

I have always been drawn to stories that incorporate occult or esoteric elements, especially when they are used to create atmosphere rather than provide easy explanations. Se7en is an obvious influence here, but the lineage goes further back than that. True Detective uses suggestion and symbolism far more effectively than explicit revelation, allowing unease to accumulate slowly.

Importantly, the show never fully commits to a supernatural explanation. The occult imagery serves to deepen the horror and the sense of moral decay, rather than offering answers. That restraint is part of what gives the series its lasting power.

Rethinking the Ending

When I first watched True Detective, I remember feeling somewhat disappointed by the ending. At the time, it felt almost too ordinary, perhaps even anticlimactic, especially given the layers of myth, symbolism, and expectation that had built up over the season. One could even invoke Chekhov’s gun in relation to how the story ultimately resolves.

On rewatch, however, I found myself appreciating the ending far more. It now feels grounded rather than cheap, and arguably more realistic. Not every case has a grand, all-encompassing revelation. Many investigations end with partial answers, loose ends, and a lingering sense of incompleteness. In that sense, True Detective resists the temptation to over-explain its own mythology.

Closing Thoughts

Season one of True Detective remains a remarkable achievement: tightly written, beautifully shot, and anchored by two exceptional performances. It captures a moment in television history when ambition and restraint coexisted, and when audiences were trusted to sit with discomfort, ambiguity, and silence.

Revisiting it years later only reinforces its status—not just as great television, but as a reminder of what the medium can be when everything comes together just right.

Tuesday, 13 January 2026

Sid Meier’s Alpha Centauri – Still the High Point of 4X

I have been playing 4X games since first or second grade, when a friend introduced me to the original Civilization. Not long after, I had my own copy, and between Civilization and Civilization II I must have sunk hundreds of hours into conquering, optimizing, and reshaping the world. One of my favorite parts was always the science victory: launching a spaceship to Alpha Centauri. But there was also a lingering disappointment — the journey always ended just as it felt like it should begin.

When Sid Meier’s Alpha Centauri (SMAC) was released in 1999, it finally scratched that itch.

Unlike Civilization: Call to Power, which came out the same year and largely stuck to the classic Civ formula, Alpha Centauri committed fully to the idea of what happens after humanity leaves Earth. While Call to Power experimented with concepts like public works, it never really clicked for me. I enjoyed micromanaging settlers and engineers, terraforming every last tile, and turning the entire world into a hyper-optimized machine. SMAC didn’t just allow that — it doubled down on it.

Terraforming in Alpha Centauri remains unmatched. You can raise and lower terrain, drill for aquifers to create rivers, plant forests or alien flora, and manipulate rainfall and elevation to squeeze out every last unit of nutrients or energy. The planet truly feels malleable, something to be shaped over centuries rather than merely occupied.

Another strength of SMAC is its tighter timeline. By focusing almost entirely on the space age and beyond, the game avoids many of the coherence problems that plague 4X titles spanning thousands of years. Mechanics don’t have to be reinvented every era, and the game feels far more consistent as a result.

The Social Engineering system was also remarkably ahead of its time. By setting policies across Politics, Economics, Values, and Future Society, you effectively design the ideological backbone of your civilization. These choices ripple through everything — economy, research, diplomacy, and warfare — and allow for a wide range of viable playstyles. Who doesn’t enjoy the fantasy of building their own “perfect” society, even if it inevitably comes with trade-offs?

Then there is the Workshop. Designing your own units — choosing chassis, weapons, armor, and special abilities — added a level of customization I still haven’t seen properly replicated since. For players who didn’t want that level of micromanagement, the game could auto-design units, but for those who did, it was a dream.

By modern standards, the graphics are undeniably dated. There are also technical issues today: the Alien Crossfire expansion currently fails to launch on Steam due to a recent Windows update, though the base game still works, and the expansion may function on GOG. Despite this, SMAC has lost none of its pull. I still return to it regularly, and that familiar “just one more turn” feeling is very much alive.

A spiritual successor of sorts, Civilization: Beyond Earth, arrived in 2014. It’s a solid 4X game with far better graphics, but it never quite captured the same magic. The depth of terraforming and social engineering simply isn’t there, even though it came out fifteen years later.

All in all, Sid Meier’s Alpha Centauri remains an extraordinary game. In my mind, it is not only the best Sid Meier game, but one that still outshines later entries in the Civilization series — even Civilization VI, which many now consider the definitive Civ experience with its expansions. More than two decades later, SMAC still feels like the genre at its most ambitious and inspired.

Saturday, 10 January 2026

Chasing a Ghost: Reflections on The Mysterious Mr. Nakamoto

Benjamin Wallace’s The Mysterious Mr. Nakamoto: A Fifteen-Year Quest to Unmask the Secret Genius Behind Crypto promises an investigation into one of the great modern enigmas: the true identity of Satoshi Nakamoto, the creator of Bitcoin. Having finished the book, I cannot honestly say that I am any closer to knowing who Nakamoto really was—or is. But perhaps that is the point. The mystery itself may be as essential to Bitcoin’s mythology as the technology that underpins it.

Rather than delivering a definitive answer, Wallace offers something more diffuse but still compelling: a guided tour through the early history of Bitcoin and the many people who orbited its creation. The book introduces a wide cast of characters—developers, cryptographers, entrepreneurs, ideologues—many of whom are fascinating figures in their own right. Along the way, it also provides a vivid look at the cypherpunk movement, whose blend of idealism, paranoia, technical brilliance, and political radicalism forms much of the ideological soil from which Bitcoin emerged.

Reading the book stirred a strong sense of déjà vu. It took me back to the early days when cryptography was far from ubiquitous and often treated as something suspicious or even dangerous. I still remember the U.S. export restrictions on cryptography, which limited key lengths in software like Netscape and effectively weakened security for users outside the United States. I even had a friend who wore one of those infamous T-shirts printed with RSA source code—classified, at the time, as a munition. I was never quite that cool.

Bitcoin itself remains a deeply complex system. Even with a background in cryptography and some academic exposure to blockchain concepts—though no hands-on implementation experience—I still find many aspects of it difficult to fully internalize. Wallace does a reasonable job of explaining the basics without drowning the reader in math, but the underlying reality remains: Bitcoin is not simple, and its consequences are even less so.

It is not hard to see why Bitcoin appeals to such a broad range of fringe or outsider movements. Libertarians, privacy advocates, anti-statists, and those deeply distrustful of institutions all find something to admire in a currency that exists outside government control. Unfortunately, the same properties that attract idealists also attract illicit actors. This is hardly unique to Bitcoin; every major technological advance has been used for both constructive and destructive ends.

In many respects, Bitcoin is not fundamentally different from fiat currency. Its value ultimately rests on collective belief and trust. The crucial distinction lies in what that trust is anchored to. Fiat currencies rely on governments, central banks, and legal systems. Bitcoin relies on cryptography, code, and decentralized consensus. Neither is inherently immune to failure; they simply fail in different ways.

One of Bitcoin’s most significant economic properties is its fixed supply, capped at 21 million coins. This makes it deflationary by design, unlike most fiat currencies, which are inflationary. As a result, adjustment happens almost entirely through price rather than supply. By most estimates, the overwhelming majority of Bitcoin activity—well over 90% by value—remains speculative or financial rather than transactional. From that perspective, Bitcoin still has a long way to go before it functions as a broadly used currency in the everyday sense.
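The 21 million cap is not an arbitrary constant but the sum of the halving schedule, which a few lines of Python can verify. The block reward and halving interval below are Bitcoin protocol constants, not figures taken from the book.

```python
# Bitcoin's supply cap, derived from the halving schedule: the block
# reward starts at 50 BTC and halves every 210,000 blocks, computed
# in integer satoshis exactly as the protocol does.
SATOSHIS_PER_BTC = 100_000_000
BLOCKS_PER_HALVING = 210_000

reward = 50 * SATOSHIS_PER_BTC  # initial block reward, in satoshis
total = 0
while reward > 0:
    total += reward * BLOCKS_PER_HALVING
    reward //= 2  # integer division, so the reward eventually reaches zero

print(f"{total / SATOSHIS_PER_BTC:,.2f} BTC")  # ~20,999,999.98 BTC
```

The geometric series converges just shy of 21 million, which is why the cap is usually quoted as an approximation.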

It is also striking how dominant Bitcoin remains, despite being technically inferior in many respects to later cryptocurrencies. Platforms like Ethereum offer programmability via smart contracts; others provide faster transactions, lower fees, or better energy efficiency. Yet Bitcoin retains an enormous advantage as the first mover. Perhaps more importantly, it lacks a controlling company, foundation, or visible founder. That absence of ongoing authority is a feature, not a bug.

Which brings us back to Nakamoto. It is widely believed that Satoshi mined roughly one million bitcoins in the early days of the network—an amount that would make them unimaginably wealthy today. And yet those coins remain untouched. If Nakamoto is still alive, that restraint suggests either extraordinary principle or an equally extraordinary desire to remain invisible. Wallace explores many theories, but none feel conclusive.

In the end, I closed the book no wiser about Nakamoto’s identity than when I began. What I did gain, however, was a renewed engagement with questions about money, trust, privacy, and the political dimensions of technology. The book also rekindled memories of a time when cryptography felt like a subversive act rather than an invisible layer of everyday life.

And perhaps that is enough. A book does not need to provide answers to be worthwhile. Sometimes it succeeds simply by sharpening the questions and ensuring they linger long after the last page is turned.

Wednesday, 7 January 2026

Solium Infernum – Better to Reign in Hell than Serve in Heaven

In Solium Infernum you take on the role of a scheming archfiend in Hell, vying to become its next ruler. The game originally appeared in an earlier incarnation many years ago, and I had been curious about it for quite some time; the 2024 release finally gave me an excuse to dive in.

At its core, Solium Infernum sits somewhere between a 4X game and grand strategy, mixing warfare, sorcery, and political intrigue as the main tools for advancement. It is not a game about rapid expansion or overwhelming force, but about positioning, timing, and outmaneuvering your rivals. In that sense it immediately sets itself apart from most of the genre.

The game is wonderfully atmospheric, with a very distinct visual style and art direction that fits its infernal setting perfectly. It does a solid job of onboarding new players: the tutorial is competent, the in-game Codex is excellent and explains every system in detail, and there is also a healthy supply of community-made guides and videos for those who want to dig deeper.

The map is a hex grid that wraps around on all sides, forming a seamless, edgeless surface (topologically a torus rather than a globe). Everything is visible from the start, which neatly removes the “explore” part of the traditional 4X formula. What remains is a game far more focused on diplomacy, intrigue, and careful planning than on discovery or raw expansion.

Warfare exists, but it is deliberately constrained. Before declaring a full blood feud—essentially open war—you must first succeed in vendettas, smaller and more limited conflicts. Only blood feuds allow you to assault an opponent’s stronghold and eliminate them entirely. This structure makes open conflict costly and deliberate, rather than something you fall into by default.

The emphasis on diplomacy and intrigue is one of the game’s strongest features. Schemes, threats, favors, and sorcery often matter more than armies, and neglecting these systems can leave even a militarily powerful archfiend dangerously exposed. Focusing too heavily on Wrath, for example, may make you strong on the battlefield, but vulnerable to manipulation, curses, or political isolation.

Victory is determined by prestige at the end of a fixed number of turns. Almost everything you do—warfare, plotting, diplomacy, sorcery—can generate prestige, but it is also a finite resource with competing uses. You can spend prestige to acquire greater titles and ranks, each conferring powerful bonuses, but the same prestige is also what ultimately decides the winner. Go too far in one direction, and you risk weakening your final position.

Another important constraint is the action economy. Each turn gives you only a limited number of actions, and the main way to expand that is by raising one of the core powers—Wrath, Deceit, Prophecy, Destruction, or Charisma—to level four. Since you usually start with modest values across the board, choosing which powers to invest in becomes a key strategic decision.

The game vaguely reminds me of an old Swedish board game, The Hell Game, where you also played a devil competing for dominance in Hell. Solium Infernum, however, feels far more balanced and refined, with its systems tightly interlocking rather than pulling in different directions.

So far I have mostly experimented with a more war-focused archfiend, and even there the game has consistently pushed back, forcing me to engage with intrigue and diplomacy whether I wanted to or not. That balancing act is part of what makes the game compelling, and I am very much looking forward to trying other archfiends and playstyles.

Solium Infernum is not a game about conquest for its own sake. It is about manipulation, restraint, and choosing the right moment to act. In a genre often dominated by expansion and optimization, that alone makes it feel refreshingly infernal.

Tuesday, 6 January 2026

The Singularity Is Nearer — Acceleration, Optimism, and Uneasy Futures

Ray Kurzweil’s The Singularity Is Nearer is an easy book to misread before even opening it. One could be tempted to dismiss it as the wishful thinking of an aging technologist doubling down on ideas he has championed for decades. Yet that would be unfair. Whatever one thinks of Kurzweil’s conclusions, his arguments are not built on vague optimism but on long-running trends in technology, economics, and human development.

At the heart of Kurzweil’s worldview lies what he calls the Law of Accelerating Returns: the idea that technological progress does not advance linearly, but exponentially. Each generation of technology provides the tools to develop the next one faster, leading to a compounding effect. This is not a fringe idea. Variations of it have been articulated by others, such as Lars Tvede in Supertrends, and it has historical support across multiple industrial revolutions.

Moore’s Law is the most familiar expression of this phenomenon. While transistor density on chips is no longer doubling as predictably as it once did, the more relevant metric—computing power per dollar—continues to improve at an exponential pace. Advances in specialized hardware, parallel computing, cloud infrastructure, and software efficiency have kept the broader trend alive. This sustained acceleration is one of the key enablers behind today’s rapid advances in artificial intelligence.
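The arithmetic behind that kind of compounding is easy to state but hard to intuit, so here is a tiny sketch. The two-year doubling period is a hypothetical round number for illustration, not a figure Kurzweil commits to.

```python
# Compounding price-performance under a fixed doubling period.
# ASSUMPTION: the 2-year doubling time is an illustrative round number,
# not a claim from the book; the point is the shape of the curve.
def growth_factor(years: float, doubling_years: float = 2.0) -> float:
    return 2 ** (years / doubling_years)

for years in (10, 20, 40):
    print(f"{years} years -> {growth_factor(years):>12,.0f}x")
# 10 years ->           32x
# 20 years ->        1,024x
# 40 years ->    1,048,576x
```

Linear intuition expects the 40-year gain to be four times the 10-year gain; exponential growth instead raises the 10-year factor to the fourth power, which is the crux of Kurzweil’s argument.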

A Broader Book Than Expected

Although the singularity—Kurzweil’s projected moment when machine intelligence surpasses and merges with human intelligence—is the book’s central thesis, it is not the book’s sole focus. In fact, much of The Singularity Is Nearer reads as a wide-ranging survey of technological and societal progress. Artificial intelligence serves as the connective tissue, touching nearly every domain Kurzweil discusses: medicine, energy, manufacturing, education, and cognition itself.

In that sense, the book revisits familiar territory explored in works like Yuval Noah Harari’s Homo Deus and Kelly and Zach Weinersmith’s Soonish. Kurzweil’s approach, however, is more explicitly cumulative: each chapter stacks evidence to support the claim that progress is not only continuing, but accelerating.

The early sections focus heavily on empirical trends meant to counter widespread pessimism. Kurzweil draws extensively on Steven Pinker’s The Better Angels of Our Nature and research associated with Daniel Kahneman to argue that, by almost any long-term measurable metric—life expectancy, poverty, literacy, violence—the world has improved dramatically over the last two centuries.

Why, then, does it feel to many as though things are getting worse?

Kurzweil points to well-known cognitive biases. Human perception is tuned to detect sudden changes rather than slow, incremental improvements, because abrupt changes historically posed the greatest survival risks. We are also prone to extrapolating broad conclusions from vivid individual events. A single disaster, or a string of emotionally charged news stories, can outweigh years of gradual improvement in our mental accounting.

Modern media ecosystems amplify this effect. News and social platforms are optimized for attention and engagement, not statistical context. With global coverage, there is always a catastrophe somewhere, ready to be framed as evidence of decline. The result is a persistent mismatch between subjective perception and objective trends. Life may feel more chaotic, even as it becomes safer, healthier, and more prosperous in aggregate.

Automation, Work, and Disruption

Kurzweil is clear-eyed about the disruptions ahead. Automation and AI will render many existing professions obsolete, just as previous waves of industrialization did. Historically, machines often replaced skilled labor with lower-skilled labor augmented by tools. What may be different this time is that many newly created roles could demand higher levels of abstraction, adaptability, and technical literacy.

This raises serious questions about reskilling. It is one thing to say that new jobs will appear; it is another to expect displaced workers to transition smoothly into them, especially when the required skills differ radically from what came before. From the perspective of someone losing their livelihood, abstract assurances about future job creation offer limited comfort.

Kurzweil acknowledges that such transitions generate political tension and uncertainty. Technological change does not occur in a social vacuum. Even if the long-term outcome is positive, the short- and medium-term disruptions can be painful and destabilizing.

Measuring Progress in a Digital Economy

One of the more interesting arguments in the book concerns our metrics for economic success. As automation and digitalization increase, traditional measures like GDP and productivity become less informative. Many digital services generate enormous value while contributing almost nothing to GDP—Wikipedia being the canonical example.

Digital goods can be replicated at near-zero marginal cost, breaking the traditional link between production cost and price. As a result, economic growth increasingly manifests as improved quality, accessibility, and abundance rather than monetary exchange. This complicates policy decisions and public debates that still rely on 20th-century economic indicators.

Kurzweil briefly touches on containerization as an earlier example of invisible but transformative infrastructure. That aside immediately reminded me of Marc Levinson’s The Box, a book that has been sitting unread on my shelf for far too long. It is a useful parallel: some of the most impactful innovations reshape the world quietly, without capturing public imagination at the time.

Extending the Mind

Ultimately, Kurzweil’s path to the singularity runs through the extension of human cognition. First via external tools—AI assistants, neural interfaces, and cognitive augmentation—and eventually, perhaps, through fully digital or simulated minds. These ideas are no longer confined to science fiction. Robin Hanson’s The Age of Em explores similar territory with unsettling rigor.

Kurzweil includes several thought experiments around identity, continuity, and consciousness. While fascinating, they deserve deeper treatment than a single post can provide. I suspect I will return to these questions in the future, especially as they intersect with debates around AI alignment and digital personhood.

Cautious Optimism

I do not fully share Kurzweil’s confidence in timelines or inevitability. But I do find his general direction persuasive. The world is improving in measurable ways, even as it faces profound challenges. Kurzweil does not deny those challenges; he emphasizes that outcomes depend as much on social and political choices as on technology itself. Progress is not automatic, and it is certainly not evenly distributed.

There are also broader risks that Kurzweil touches on only lightly—issues explored in greater depth by authors like Nick Bostrom (Superintelligence) and Olle Häggström (Here Be Dragons). These concerns are real, and they complicate any straightforward narrative of technological salvation.

Still, The Singularity Is Nearer paints a compelling picture: not of an inevitable utopia, but of a future shaped by accelerating capability and human decision-making. I remain uncertain about the destination, but increasingly convinced that the trajectory Kurzweil describes is broadly correct. Whether it leads somewhere hopeful—or somewhere catastrophic—will depend less on the machines than on us.