Saturday, 10 January 2026

Chasing a Ghost: Reflections on The Mysterious Mr. Nakamoto

Benjamin Wallace’s The Mysterious Mr. Nakamoto: A Fifteen-Year Quest to Unmask the Secret Genius Behind Crypto promises an investigation into one of the great modern enigmas: the true identity of Satoshi Nakamoto, the creator of Bitcoin. Having finished the book, I cannot honestly say that I am any closer to knowing who Nakamoto really was—or is. But perhaps that is the point. The mystery itself may be as essential to Bitcoin’s mythology as the technology that underpins it.


Rather than delivering a definitive answer, Wallace offers something more diffuse but still compelling: a guided tour through the early history of Bitcoin and the many people who orbited its creation. The book introduces a wide cast of characters—developers, cryptographers, entrepreneurs, ideologues—many of whom are fascinating figures in their own right. Along the way, it also provides a vivid look at the cypherpunk movement, whose blend of idealism, paranoia, technical brilliance, and political radicalism forms much of the ideological soil from which Bitcoin emerged.

Reading the book stirred a strong sense of déjà vu. It took me back to the early days when cryptography was far from ubiquitous and often treated as something suspicious or even dangerous. I still remember the U.S. export restrictions on cryptography, which limited key lengths in software like Netscape and effectively weakened security for users outside the United States. I even had a friend who wore one of those infamous T-shirts printed with RSA source code—classified, at the time, as a munition. I was never quite that cool.

Bitcoin itself remains a deeply complex system. Even with a background in cryptography and some academic exposure to blockchain concepts—though no hands-on implementation experience—I still find many aspects of it difficult to fully internalize. Wallace does a reasonable job of explaining the basics without drowning the reader in math, but the underlying reality remains: Bitcoin is not simple, and its consequences are even less so.

It is not hard to see why Bitcoin appeals to such a broad range of fringe or outsider movements. Libertarians, privacy advocates, anti-statists, and those deeply distrustful of institutions all find something to admire in a currency that exists outside government control. Unfortunately, the same properties that attract idealists also attract illicit actors. This is hardly unique to Bitcoin; every major technological advance has been used for both constructive and destructive ends.

In many respects, Bitcoin is not fundamentally different from fiat currency. Its value ultimately rests on collective belief and trust. The crucial distinction lies in what that trust is anchored to. Fiat currencies rely on governments, central banks, and legal systems. Bitcoin relies on cryptography, code, and decentralized consensus. Neither is inherently immune to failure; they simply fail in different ways.

One of Bitcoin’s most significant economic properties is its fixed supply, hard-capped at 21 million coins. New issuance halves roughly every four years until it stops entirely, which makes Bitcoin disinflationary by design and, as coins are inevitably lost, effectively deflationary, unlike most fiat currencies, which are managed to be mildly inflationary. As a result, adjustment happens almost entirely through price rather than supply. By most estimates, the overwhelming majority of Bitcoin activity—well over 90% by value—remains speculative or financial rather than transactional. From that perspective, Bitcoin still has a long way to go before it functions as a broadly used currency in the everyday sense.
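The cap, incidentally, is not a number stored anywhere in the protocol; it is the limit of a geometric series. The block subsidy started at 50 BTC and halves every 210,000 blocks, with rewards paid in whole satoshis. A minimal back-of-the-envelope sketch of that arithmetic, using those well-known protocol constants:

    # Back-of-the-envelope check of the 21 million cap.
    # Protocol constants: the block subsidy starts at 50 BTC and halves
    # every 210,000 blocks; rewards are paid in whole satoshis
    # (1 BTC = 100,000,000 satoshis), so each halving truncates downward.

    SATOSHIS_PER_BTC = 100_000_000
    BLOCKS_PER_HALVING = 210_000

    subsidy = 50 * SATOSHIS_PER_BTC   # initial block reward, in satoshis
    total = 0
    while subsidy > 0:
        total += BLOCKS_PER_HALVING * subsidy
        subsidy //= 2                 # integer halving, as the protocol does

    print(f"{total / SATOSHIS_PER_BTC:,.8f} BTC")
    # -> 20,999,999.97690000 BTC, just under the famous 21 million

Because rewards are paid in integer satoshis, the series terminates after 33 halvings and the total lands slightly under 21 million rather than exactly on it.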

It is also striking how dominant Bitcoin remains, despite being technically inferior in many respects to later cryptocurrencies. Platforms like Ethereum offer programmability via smart contracts; others provide faster transactions, lower fees, or better energy efficiency. Yet Bitcoin retains an enormous advantage as the first mover. Perhaps more importantly, it lacks a controlling company, foundation, or visible founder. That absence of ongoing authority is a feature, not a bug.

Which brings us back to Nakamoto. It is widely believed that Satoshi mined roughly one million bitcoins in the early days of the network—an amount that would make them unimaginably wealthy today. And yet those coins remain untouched. If Nakamoto is still alive, that restraint suggests either extraordinary principle or an equally extraordinary desire to remain invisible. Wallace explores many theories, but none feel conclusive.

In the end, I closed the book no wiser about Nakamoto’s identity than when I began. What I did gain, however, was a renewed engagement with questions about money, trust, privacy, and the political dimensions of technology. The book also rekindled memories of a time when cryptography felt like a subversive act rather than an invisible layer of everyday life.

And perhaps that is enough. A book does not need to provide answers to be worthwhile. Sometimes it succeeds simply by sharpening the questions and ensuring they linger long after the last page is turned.

Wednesday, 7 January 2026

Solium Infernum - Better to Reign in Hell than Serve in Heaven

In Solium Infernum you take on the role of a scheming archfiend in Hell, vying to become its next ruler. The game originally appeared in an earlier incarnation many years ago, and the 2024 release finally gave me an excuse to dive into a game I have been curious about for quite some time.


At its core, Solium Infernum sits somewhere between a 4X game and grand strategy, mixing warfare, sorcery, and political intrigue as the main tools for advancement. It is not a game about rapid expansion or overwhelming force, but about positioning, timing, and outmaneuvering your rivals. In that sense it immediately sets itself apart from most of the genre.

The game is wonderfully atmospheric, with a very distinct visual style and art direction that fits its infernal setting perfectly. It does a solid job of onboarding new players: the tutorial is competent, the in-game Codex is excellent and explains every system in detail, and there is also a healthy supply of community-made guides and videos for those who want to dig deeper.

The map is a hex grid that wraps around on all sides, so it has no edges (topologically a torus rather than a globe). Everything is visible from the start, which neatly removes the “explore” part of the traditional 4X formula. What remains is a game far more focused on diplomacy, intrigue, and careful planning than on discovery or raw expansion.

Warfare exists, but it is deliberately constrained. Before declaring a full blood feud—essentially open war—you must first succeed in vendettas, smaller and more limited conflicts. Only blood feuds allow you to assault an opponent’s stronghold and eliminate them entirely. This structure makes open conflict costly and deliberate, rather than something you fall into by default.

The emphasis on diplomacy and intrigue is one of the game’s strongest features. Schemes, threats, favors, and sorcery often matter more than armies, and neglecting these systems can leave even a militarily powerful archfiend dangerously exposed. Focusing too heavily on Wrath, for example, may make you strong on the battlefield, but vulnerable to manipulation, curses, or political isolation.

Victory is determined by prestige at the end of a fixed number of turns. Almost everything you do—warfare, plotting, diplomacy, sorcery—can generate prestige, but it is also a finite resource with competing uses. You can spend prestige to acquire greater titles and ranks, each conferring powerful bonuses, but the same prestige is also what ultimately decides the winner. Go too far in one direction, and you risk weakening your final position.

Another important constraint is the action economy. Each turn gives you only a limited number of actions, and the main way to expand that is by raising one of the core powers—Wrath, Deceit, Prophecy, Destruction, or Charisma—to level four. Since you usually start with modest values across the board, choosing which powers to invest in becomes a key strategic decision.

The game vaguely reminds me of an old Swedish board game, The Hell Game, where you also played a devil competing for dominance in Hell. Solium Infernum, however, feels far more balanced and refined, with its systems tightly interlocking rather than pulling in different directions.

So far I have mostly experimented with a more war-focused archfiend, and even there the game has consistently pushed back, forcing me to engage with intrigue and diplomacy whether I wanted to or not. That balancing act is part of what makes the game compelling, and I am very much looking forward to trying other archfiends and playstyles.

Solium Infernum is not a game about conquest for its own sake. It is about manipulation, restraint, and choosing the right moment to act. In a genre often dominated by expansion and optimization, that alone makes it feel refreshingly infernal.

Tuesday, 6 January 2026

The Singularity Is Nearer — Acceleration, Optimism, and Uneasy Futures

Ray Kurzweil’s The Singularity Is Nearer is an easy book to misread before even opening it. One could be tempted to dismiss it as the wishful thinking of an aging technologist doubling down on ideas he has championed for decades. Yet that would be unfair. Whatever one thinks of Kurzweil’s conclusions, his arguments are not built on vague optimism but on long-running trends in technology, economics, and human development.


At the heart of Kurzweil’s worldview lies what he calls the Law of Accelerating Returns: the idea that technological progress does not advance linearly, but exponentially. Each generation of technology provides the tools to develop the next one faster, leading to a compounding effect. This is not a fringe idea. Variations of it have been articulated by others, such as Lars Tvede in Supertrends, and it has historical support across multiple industrial revolutions.

Moore’s Law is the most familiar expression of this phenomenon. While transistor density on chips is no longer doubling as predictably as it once did, the more relevant metric—computing power per dollar—continues to improve at an exponential pace. Advances in specialized hardware, parallel computing, cloud infrastructure, and software efficiency have kept the broader trend alive. This sustained acceleration is one of the key enablers behind today’s rapid advances in artificial intelligence.
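To make “exponential” concrete: a fixed doubling cadence compounds into factors that linear intuition badly underestimates. A purely illustrative sketch, assuming a hypothetical two-year doubling time for compute per dollar (the actual cadence varies by metric and era):

    # Illustrative only: assume price-performance (compute per dollar)
    # doubles every two years, and see what that compounds to.
    DOUBLING_TIME_YEARS = 2   # hypothetical cadence, for illustration

    for years in (10, 20, 40):
        doublings = years // DOUBLING_TIME_YEARS
        print(f"{years:>2} years: {doublings:>2} doublings -> {2**doublings:>9,}x")

    # 10 years:  5 doublings ->        32x
    # 20 years: 10 doublings ->     1,024x
    # 40 years: 20 doublings -> 1,048,576x

The point is not the specific numbers but the shape of the curve: each individual step looks modest, while the cumulative effect over a few decades is a factor of a million.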

A Broader Book Than Expected

Although the singularity—Kurzweil’s projected moment when machine intelligence surpasses and merges with human intelligence—is the book’s central thesis, it is not its sole focus. In fact, much of The Singularity Is Nearer reads as a wide-ranging survey of technological and societal progress. Artificial intelligence serves as the connective tissue, touching nearly every domain Kurzweil discusses: medicine, energy, manufacturing, education, and cognition itself.

In that sense, the book revisits familiar territory explored in works like Yuval Noah Harari’s Homo Deus and Kelly and Zach Weinersmith’s Soonish. Kurzweil’s approach, however, is more explicitly cumulative: each chapter stacks evidence to support the claim that progress is not only continuing, but accelerating.

The early sections focus heavily on empirical trends meant to counter widespread pessimism. Kurzweil draws extensively on Steven Pinker’s The Better Angels of Our Nature and research associated with Daniel Kahneman to argue that, by almost any long-term measurable metric—life expectancy, poverty, literacy, violence—the world has improved dramatically over the last two centuries.

Why, then, does it feel to many as though things are getting worse?

Kurzweil points to well-known cognitive biases. Human perception is tuned to detect sudden changes rather than slow, incremental improvements, because abrupt changes historically posed the greatest survival risks. We are also prone to extrapolating broad conclusions from vivid individual events. A single disaster, or a string of emotionally charged news stories, can outweigh years of gradual improvement in our mental accounting.

Modern media ecosystems amplify this effect. News and social platforms are optimized for attention and engagement, not statistical context. With global coverage, there is always a catastrophe somewhere, ready to be framed as evidence of decline. The result is a persistent mismatch between subjective perception and objective trends. Life may feel more chaotic, even as it becomes safer, healthier, and more prosperous in aggregate.

Automation, Work, and Disruption

Kurzweil is clear-eyed about the disruptions ahead. Automation and AI will render many existing professions obsolete, just as previous waves of industrialization did. Historically, machines often replaced skilled labor with lower-skilled labor augmented by tools. What may be different this time is that many newly created roles could demand higher levels of abstraction, adaptability, and technical literacy.

This raises serious questions about reskilling. It is one thing to say that new jobs will appear; it is another to expect displaced workers to transition smoothly into them, especially when the required skills differ radically from what came before. From the perspective of someone losing their livelihood, abstract assurances about future job creation offer limited comfort.

Kurzweil acknowledges that such transitions generate political tension and uncertainty. Technological change does not occur in a social vacuum. Even if the long-term outcome is positive, the short- and medium-term disruptions can be painful and destabilizing.

Measuring Progress in a Digital Economy

One of the more interesting arguments in the book concerns our metrics for economic success. As automation and digitalization increase, traditional measures like GDP and productivity become less informative. Many digital services generate enormous value while contributing almost nothing to GDP—Wikipedia being the canonical example.

Digital goods can be replicated at near-zero marginal cost, breaking the traditional link between production cost and price. As a result, economic growth increasingly manifests as improved quality, accessibility, and abundance rather than monetary exchange. This complicates policy decisions and public debates that still rely on 20th-century economic indicators.

Kurzweil briefly touches on containerization as an earlier example of invisible but transformative infrastructure. That aside immediately reminded me of Marc Levinson’s The Box, a book that has been sitting unread on my shelf for far too long. It is a useful parallel: some of the most impactful innovations reshape the world quietly, without capturing public imagination at the time.

Extending the Mind

Ultimately, Kurzweil’s path to the singularity runs through the extension of human cognition. First via external tools—AI assistants, neural interfaces, and cognitive augmentation—and eventually, perhaps, through fully digital or simulated minds. These ideas are no longer confined to science fiction. Robin Hanson’s The Age of Em explores similar territory with unsettling rigor.

Kurzweil includes several thought experiments around identity, continuity, and consciousness. While fascinating, they deserve deeper treatment than a single post can provide. I suspect I will return to these questions in the future, especially as they intersect with debates around AI alignment and digital personhood.

Cautious Optimism

I do not fully share Kurzweil’s confidence in timelines or inevitability. But I do find his general direction persuasive. The world is improving in measurable ways, even as it faces profound challenges. Kurzweil does not deny those challenges; he emphasizes that outcomes depend as much on social and political choices as on technology itself. Progress is not automatic, and it is certainly not evenly distributed.

There are also broader risks that Kurzweil touches on only lightly—issues explored in greater depth by authors like Nick Bostrom (Superintelligence) and Olle Häggström (Here Be Dragons). These concerns are real, and they complicate any straightforward narrative of technological salvation.

Still, The Singularity Is Nearer paints a compelling picture: not of an inevitable utopia, but of a future shaped by accelerating capability and human decision-making. I remain uncertain about the destination, but increasingly convinced that the trajectory Kurzweil describes is broadly correct. Whether it leads somewhere hopeful—or somewhere catastrophic—will depend less on the machines than on us.

Friday, 2 January 2026

The Icepick Surgeon – Sam Kean

Sam Kean’s The Icepick Surgeon is a collection of loosely connected stories from the history of science, focusing on figures whose work sits somewhere between ambition, moral blindness, and outright harm. In tone and structure it is reminiscent of books like John Gribbin’s Science: A History or Bill Bryson’s A Short History of Nearly Everything, though Kean’s approach is far more narrative-driven and character-focused. Like those books, it broadly moves forward in time, but with a much narrower scope.

Kean frames the book around twelve themes, each anchored by a “mad scientist” or morally questionable figure. That framing immediately invites certain expectations. Who doesn’t enjoy the mad scientist trope—megalomaniacal brilliance, dangerous ideas, intellect unrestrained by ethics? What the book mostly delivers instead are far more mundane characters: often mediocre, self-serving, and convinced—at least outwardly—that they were doing good. In that sense, the book becomes less about brilliance run amok and more about how ordinary human flaws, combined with authority and weak oversight, can lead to horrifying outcomes. The road to hell being paved with good intentions, and all that.

The narrative style is one of the book’s strengths. Kean is a good storyteller, and he does an admirable job of weaving the different lives and topics together, giving the book a clear through-line despite its episodic structure. That same accessibility, however, is probably why I prefer Gribbin or Bryson overall: Kean trades depth for narrative momentum. Readers without a strong background or interest in science may well find his approach the more engaging one.

One recurring issue is tone. At times the book almost reeks of moral indignation, which can become tiresome. Kean explicitly acknowledges, in at least one chapter, that people should be judged by the standards of their own time—and even notes that practices we consider acceptable today will likely horrify future generations. Yet in many other chapters he seems to do precisely the opposite, judging historical figures against modern ethical standards. While some of the material is genuinely horrifying—especially considering that much of it took place less than a century ago—the inconsistency is noticeable. It leaves you wondering whether humanity has meaningfully improved, or whether we have merely polished a thin veneer of civilization over the same underlying savagery.

The individual chapters vary significantly in how well they align with the book’s stated theme. The opening chapter on piracy, centered on William Dampier—buccaneer, explorer, and early biologist—is among the most interesting, precisely because it shows how scientific curiosity and moral ambiguity can coexist in the same individual.

Other chapters are less successful. The chapter on slavery, for example, adds little that is new, focusing narrowly on the horrors of the triangular trade without placing it in a broader historical context or meaningfully connecting it to modern forms of near-slavery. It also feels more indirect in its connection to science, making it less compelling than most other entries.

The chapter on murder, while engaging, reads more like true crime than an exploration of ethical failure in science. Similarly, the espionage chapter—centered on Harry Gold and the Manhattan Project—is fascinating in its own right but feels only loosely connected to the “mad scientist” framing. A brief but sharp detour into Lysenkoism momentarily brings the theme back into focus before the chapter returns to Gold’s tragic personal fate.

The chapter on lobotomy is perhaps the most chilling of the book, especially in light of recent reading on consciousness and materialism. The brain’s resilience is remarkable, but the casual certainty with which irreversible damage was inflicted in the name of progress is deeply unsettling.

The torture chapter, focusing on Henry Murray and Ted Kaczynski, also feels somewhat misaligned. Kaczynski is clearly the more compelling figure, which shifts the focus away from scientific malpractice and toward biography and consequence.

The final chapters on malpractice and fraud move into much more recent history and will likely feel familiar to most readers, unlike some of the more obscure figures earlier in the book. Their proximity in time makes them easier to relate to, but also less surprising.

Overall, The Icepick Surgeon is an engaging and often disturbing read, held together by strong storytelling and a clear narrative voice. Its thematic focus wavers at times, and the moral framing can feel uneven, but it succeeds in reminding the reader how easily science, ambition, and ethical failure can become entangled. Even when it doesn’t fully work, it remains interesting—and perhaps that is its greatest strength.

Thursday, 1 January 2026

Reading in 2025: A Year of Cyberwar, Consciousness, and Familiar Comforts

As the year draws to a close, it feels like a good moment to look back at what I’ve been reading. As usual, the list is a mix of fiction and non-fiction, comfort rereads and long-overdue titles finally crossed off the list. Nothing this year was truly life-changing, but several books were quietly excellent—and a few were notable for less flattering reasons.

In total, I read 39 books in 2025, split almost evenly between fiction and non-fiction:

  • Fiction: 19

  • Non-fiction: 20

The balance felt right, even if the non-fiction titles ended up leaving the strongest overall impression.


Highlights and Lowlights

Best Non-Fiction

Andy Greenberg – Sandworm: A New Era of Cyberwar and the Hunt for the Kremlin's Most Dangerous Hackers

This was the standout book of the year. Sandworm is both gripping and unsettling, and it genuinely made me rethink how fragile modern society is. Reading it in the aftermath of events like the 2024 CrowdStrike outage—and the Cloudflare outage in November—only reinforced how vulnerable critical infrastructure has become, even without malicious intent.

Closely behind was Barbarians at the Gate by Bryan Burrough and John Helyar, which remains a masterclass in business journalism and corporate absurdity.

Worst Non-Fiction

Sam Parnia – Lucid Dying

Not a terrible book, but deeply frustrating. The topic is fascinating, yet the scientific rigor just isn’t there. It leans too heavily on speculation and anecdote, which makes it hard to take seriously.



Best Fiction

Steven Brust – Tsalmoth

Nothing in fiction this year truly blew me away, but Tsalmoth was consistently strong and rewarding. That said, it shared the podium with Steven Brust’s Lyorn and Lois McMaster Bujold’s Penric novellas, which were equally enjoyable in quieter, more understated ways.

Worst Fiction

Laurell K. Hamilton – Circus of the Damned

This one stood out for the wrong reasons. While some of the earlier Anita Blake novels still have some charm, this entry mostly felt like a slog.

Best Re-read

William Gibson – Neuromancer

Returning to Neuromancer was a reminder of just how sharp and influential it still is. Even decades later, it feels more modern than many books written long after it.



Oddities, Influences, and Reading Chains

The oddest book of the year was probably Philip K. Dick’s The Three Stigmata of Palmer Eldritch. Not bad—just deeply strange. It didn’t linger with me the way Ubik (which I read last year) did, but it’s unmistakably PKD in all the right and wrong ways.

Several books this year also acted as gateways to further reading. John Strausbaugh’s The Wrong Stuff was excellent and directly responsible for me picking up both Ignition! by John Drury Clark and A City on Mars by Kelly Weinersmith. It also pushed Eric Berger’s Liftoff higher up my to-read list.

On the non-fiction side, Sandworm made me add Kim Zetter’s Countdown to Zero Day to my reading list—though I haven’t quite gotten there yet.

I’d also strongly recommend Yuval Noah Harari’s books to almost anyone. Reading Sapiens and Homo Deus reminded me of the sense of wide-eyed curiosity I had when first reading Bill Bryson’s A Short History of Nearly Everything years ago.

Finally, the book that had been on my reading list the longest was Susan Blackmore’s The Meme Machine. It’s been there since I read Snow Crash roughly a decade ago, and it felt oddly satisfying to finally get to it.


Complete Reading List (Alphabetical by Author)

Dan Abnett

  • Xenos

  • Malleus

  • Hereticus

Susan Blackmore

  • The Meme Machine

Steven Brust

  • Tsalmoth

  • Lyorn

  • Agyar

Lois McMaster Bujold

  • The Physicians of Vilnoc

  • Masquerade in Lodi

Bryan Burrough

  • Barbarians at the Gate: The Fall of RJR Nabisco

Jim Butcher

  • Death Masks

Bryan Caplan

  • The Case Against Education

John Drury Clark

  • Ignition!

Aubrey Clayton

  • Bernoulli’s Fallacy

Harry Cliff

  • Space Oddities

Daniel C. Dennett

  • Consciousness Explained

Philip K. Dick

  • The Three Stigmata of Palmer Eldritch

William Gibson

  • Neuromancer

Andy Greenberg

  • Sandworm

Laurell K. Hamilton

  • The Laughing Corpse

  • Circus of the Damned

Dashiell Hammett

  • The Maltese Falcon

Yuval Noah Harari

  • Sapiens

  • Homo Deus

Erik Hoel

  • The World Behind the World

Sabine Hossenfelder

  • Lost in Math

Sam Kean

  • The Icepick Surgeon

Sarah Monette

  • A Theory of Haunting

Thomas Nagel

  • What Is It Like to Be a Bat?

Sam Parnia

  • Lucid Dying

Anthony Price

  • The Alamut Ambush

M.R. Sellars

  • Harm None

Anil Seth

  • Being You

Ola Skogäng

  • Mumiens blod

  • De förlorade sidornas bok

  • I dödsskuggans dal

John Strausbaugh

  • The Wrong Stuff

Kelly Weinersmith

  • Soonish

  • A City on Mars

Martha Wells

  • Queen Demon


Closing Thoughts

2025 wasn’t a year of dramatic literary revelations, but it was a solid, thoughtful reading year. The non-fiction in particular stood out, often leaving me with more questions than answers—which is usually a good sign. If nothing else, this year reinforced how much I enjoy following threads from one book to the next, letting curiosity rather than novelty guide what I read next.

If next year manages to surprise me a bit more, all the better—but this was time well spent.