
A Day Is Long, Decades Are Short, and Life Is a Spark

A reflection on turning 30

In late March of 2023, on a cool night, I was sitting on the subway reading a blog post. The air was crisp and still, yet it could not muffle a topic that had become incandescent across the world: under Sam Altman’s leadership, OpenAI had risen rapidly onto the global stage. ChatGPT had launched the previous November and reached a million users in five days; GPT-4 had been out for only a matter of weeks. Although I had been following generative AI here and there since the GPT-3 era, I had never seen the media frenzy surge the way it did that spring. The world had only just emerged from the shadow of the pandemic. Everything seemed to be healing, and yet everything also felt uncertain. Only one thing was unmistakably clear: everyone could feel that something was stirring below the horizon, a vast and uneasy silhouette.

At the time, we did not yet know—but the future had already been set on a different course. I finished that article and fell into thought. I was twenty-seven that year.

Eight months later, Sam Altman was fired by OpenAI’s board. The upheaval lasted only a few days before ending with his return—but the fracture had become visible. After that, OpenAI seemed to drift further and further from its original ideals, its aura no longer quite the same. Anthropic, Google, and a host of others pushed harder, and the competition in the field only grew more brutal. The rest is familiar enough, and need not be retold.

But if we turn the clock back ten years, I did not care about any of this. I did not even know such things existed—models, papers, or the names that would one day change everything. I cared about exactly one thing: games.


Humanity’s desire to create machines that can follow instructions and do work can be traced back to 1843. That was the year Ada Lovelace wrote the first program—an algorithm designed for Charles Babbage’s Analytical Engine, a machine that was never actually built. A century later came machine code and assembly. Then came Turing, imagining a machine endowed with something like “intelligence,” capable of going beyond the limits of our explicit instructions. From the 1950s through the 1980s, symbolic AI dominated the field. People believed the world could be represented through rules and logic, through sharply defined boundaries. That path did solve many problems. By stacking layer upon layer of complex if-then trees, we achieved an old-fashioned kind of intelligence—or rather, a primitive outline of one.

Almost everyone believed in that path, save for a small minority. They imagined something closer to the way a biological brain might work: a machine that could learn. Take in signals, emit another signal—that was the perceptron. They believed that by connecting enough of these perceptrons together, one might approach the miracle nature had already performed: intelligence. The school of thought came to be called “connectionism,” and the structures those linked perceptrons formed came to be called neural networks.

And then came silence.

After decades of labor, we saw behaviorism, evolutionism, Bayesianism—but the light at the end of the tunnel never truly appeared. The AI winter was not a metaphor. It was real. “Intelligence” itself began to feel like nothing more than a glorious daydream.

For more than half a century, the frontier all but stalled, yet exploration never stopped. In 2010, DeepMind was founded; the ReLU activation function emerged; the ImageNet competition began, attempting to establish a real benchmark for machine intelligence; and for the first time, GPUs revealed their immense potential in deep learning tasks. The age had not yet changed, but something beneath it had already begun to move.


In 2011, I was fifteen and still in middle school. Spring in Shanghai was especially warm and bright that year. The Expo had only just ended; its attractions were still packed, and you could still see foreigners everywhere in the streets. I knew very little of the world then, but looking at those pavilions, I always felt that the world must be vast and alive. It seemed like a symbol of prosperity, of a future gathering itself into form and promise. From where I stood, the world felt boundless.

The springtime of AI, however, had arrived sixty years late. In 2012, AlexNet, powered by deep convolutional neural networks, crushed every competitor in the ImageNet competition. Very quickly, its three authors—Alex Krizhevsky, Ilya Sutskever, and Geoffrey Hinton—became titanic figures in the age of AI.

That same late summer of 2012, I entered high school. Like most teenagers, I skipped class on weekends, disappeared into internet cafés, and played games day and night—that was my life. One evening after school, a friend casually asked me, “Have you played Assassin’s Creed II?” I said, “What’s that?” He replied, “Holy shit, you have to. This game is insane.” On the road home at dusk, he described an open world so vivid it sounded almost impossible. I was immediately restless with curiosity, and that Friday I went to a street stall and bought a pirated disc. It was the first AAA game of my life.

In the opening scene, Ezio and his brother Federico stand atop a tower in Florence, bathed in tender moonlight falling across the rooftops and over their shoulders:

“It is a good life we lead, brother.”
“The best. May it never change.”
“And may it never change us.”

That game very quickly became my favorite. I went back and finished AC1, then immediately moved on to Brotherhood and Revelations. Over just a few short years, Ezio grew from a charming young man into a weary old one with a beard. But to me, it felt as if I had truly lived a whole life alongside him, witnessing the full arc—from Jerusalem, Masyaf, Venice, Rome, and the Tuscan countryside, all the way back to Damascus. In the library, Altaïr’s footsteps overlapped with Ezio’s, and one legacy passed into another. Even now, I can still vividly remember those logos of studios from around the world fading in one by one after the beach prologue. Back then, I only thought one thing: this company was impossibly cool. Ubisoft, in my mind, was practically synonymous with rebellion. They dared to create things others thought impossible. There was something fiercely distinctive about their spirit.

In imagined worlds, there were no horizons. In spring, I free-ran across the rooftops of Florence. At the height of summer, I shouted from the peak of High Hrothgar in Skyrim. When autumn leaves began to fall, we had just defeated Deathwing and were preparing to cross the mists into Pandaria. Back then, the sky outside my window was almost my entire understanding of the real world. The school walls and the road home were the full extent of where my feet could carry me. The world inside that bulky fourteen-inch monitor was where I truly lived.

What, exactly, were studying and exams for? What did those tedious subjects mean, other than locking me inside a classroom? Games were simply too compelling; reality could not compare. There was one fleeting moment when I tried to imagine what my thirtieth birthday might look like, but the thought passed and dissolved before I could grasp it. And now, I am thirty. Looking back, those days at sixteen that felt like the “worst” of my life were, in truth, among the best.


2013 was a turning point—for humanity, and for me. The famous example “king - man + woman = queen” arrived with Word2Vec. That same year, the variational autoencoder (VAE) was born. Yann LeCun joined Facebook and took charge of FAIR. Hinton joined Google. The year after that, GANs appeared. These were all striking papers and events, but just as with my high-school self, almost no one then could foresee where these threads would ultimately lead.
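
For the curious: that famous example is literal vector arithmetic. Below is a minimal sketch of it, assuming the gensim library and its downloadable pretrained Google News Word2Vec vectors; the library and model name are conveniences of this sketch, not details from the 2013 paper.

```python
# A minimal sketch of the Word2Vec analogy "king - man + woman = queen".
# Assumes the gensim library; "word2vec-google-news-300" is one of
# gensim's downloadable pretrained models (a large download on first use).
import gensim.downloader as api

vectors = api.load("word2vec-google-news-300")

# most_similar() adds the "positive" vectors, subtracts the "negative"
# ones, and returns the words nearest the resulting point in vector space.
result = vectors.most_similar(positive=["king", "woman"],
                              negative=["man"], topn=1)
print(result)  # typically [('queen', ~0.71)]
```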

In my final year of high school, the pressure mounted with every passing day. The countdown on the classroom blackboard was updated daily; the gaokao was no longer an abstraction but something rushing straight toward us. But all I wanted was to escape. School was not a place I wanted to stay in for even one more second than necessary. You might not always find me in the classroom, but if there was a group sneaking out before evening study hall, or huddled together playing cards during class, odds were high I was among them. And of course, never forget: games were everything. My battlefield was in Azeroth and Los Santos, not in a classroom. And so the days passed.

Until two things changed me.

One was an ordinary afternoon when the results of the latest biology monthly exam were posted. I ranked dead last in the class—or if not last, then second to last. During the break, I was joking around with friends as usual when one of them said, “Sure, the required subjects are boring, so failing those is whatever—but failing an elective you chose? Damn, bro. Looks like this mage doesn’t actually have much INT.” We all burst out laughing, but that conversation planted a seed in me.

The second was a breakup—fair enough, given that I had accomplished absolutely nothing at that point in my life. It was the first time I seriously began to ask myself: what happens after graduation? What is it that I actually want to do?

Not long after, I genuinely became fascinated by biology, and my grades rose to the top of the class. It felt natural, then, to think that maybe I should become a doctor. At the time, I was especially captivated by neuroscience. But my teacher advised me not to go down that path—she said it was too hard, and she was not sure it was truly the life I wanted. To this day, I remain grateful for her honesty.

In 2014, I turned eighteen. Like anyone standing at the threshold of adulthood, I felt enormous anxiety closing in around me. I returned to hiding out in internet cafés, but now I could no longer summon any excitement for games. I scrolled through game lists in a daze and realized I had already played most of them. They even began to resemble one another. Still, I missed the days when Call of Duty and Assassin’s Creed had once electrified me, and as my eyes passed over Ubisoft’s logo, I paused for just a moment.

Then, without warning, a pop-up ad appeared:

Unity 4.6 is now available.

Almost against my own will, I clicked it.

It was the first time I had ever heard the term “game engine.” It felt like a Pandora’s box that could make magic happen. The forbidden sensation of slipping behind the curtain seized me with enormous force, and in almost a single instant, it took hold of my full attention.

“One day, I’m going to make my name appear in those opening credits.”

That was what I told myself.

It felt as though everything before had been leading toward that moment, and everything after would begin from that afternoon. I changed a great deal that year. Before long, university arrived.


The final summer after the gaokao was the first vacation of my life completely free of academic pressure. No one would ever again tell you the correct answer on an exam paper; no one would even bother explaining the questions to you anymore. It was as if everything you had learned in your first eighteen years had converged on a single formal checkpoint, one given far more weight than it deserved. Yesterday, getting a multiple-choice question wrong still mattered; today, it meant nothing. Eighteen years of study suddenly felt disposable. No one would wake you up. No one cared if you were late. So what, then, would tomorrow look like?

That summer was emptier than I had imagined.

Making games and playing games quickly became my new routine. I tried Unreal 3 (UDK), CryEngine, Unity, RPG Maker, and a dozen other tools, as though the only thing standing between me and the games of my dreams was the right set of software. But I soon realized that was not true. I knew nothing about code. UnrealScript looked like scripture in an unknown language. Kismet’s nodes made no intuitive sense to me. All I could really do was drag objects around. I began reading large numbers of design books, yet still felt lost.

One night, I asked myself a question: how did I imagine myself at thirty, forty, eighty? Could I imagine myself becoming someone excited by Excel and Word? A doctor? A bus driver? Or a game developer? Only the last possibility stirred something in me. So I sketched, clumsily, a rough sixty-year roadmap for my life—one milestone per decade, each meant to arrive somewhere. It was blurry and had no detail, but I vaguely felt that the games that had already shaped my view of the world would also become the beacons guiding me through the storm.


In 2014, a paper titled Neural Machine Translation by Jointly Learning to Align and Translate introduced attention. Three years later came the foundational work that would change the world: Attention Is All You Need.

My university years flew by. I spent little time on coursework—the classes were not terribly difficult, and I had time to spare. So I poured all of myself into games. In my senior year, I was fortunate enough to intern at NVIDIA. The work was not easy, but the experience was immensely valuable. I met a group of friends there, some of whom I still keep in touch with. I wish them all well. By 2017, I gradually realized that I needed answers I could not find in books. I needed a methodology for game development. I believed those answers were in America—then still home to the most advanced game industry in the world.

Before I left, a professor introduced me to a veteran designer who had spent decades in the game industry. We talked. I told him I did not want to make pay-to-win games. I wanted to make games that could move people, touch something in them. Real art.

He said, “I hope you still feel that way ten years from now.”

With that sentence, I began my graduate journey to the United States. Plano, a small town north of Dallas. There I met countless older classmates, professors from the industry, and many friends. It felt like a giant family made up of game players. Schoolwork—the very thing my sixteen-year-old self had once despised—became a paradise. The courses were difficult and the pressure intense, yet they drove me to wake up at eight every morning and stay up until two in the morning. For the first time in my life, sleep felt like a burden, because there was simply too much that I loved and wanted to learn. That excitement gave me almost inexhaustible energy.


That same year, 2018, GPT-1 appeared—a model trained on internet text to predict the next word in a sequence. At first, attention itself attracted little attention. Then came BERT and AlphaFold v1. The year after, OpenAI Five defeated Dota 2 world champions OG, 2–0.
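
That objective is simple enough to show in a few lines. Here is a minimal sketch, assuming Hugging Face’s transformers library, which hosts the original GPT-1 weights under the model id “openai-gpt”; the prompt is mine.

```python
# A minimal sketch of next-word prediction, the objective behind GPT-1.
# Assumes Hugging Face's transformers library; "openai-gpt" is the id
# under which the original GPT-1 weights are hosted on the model hub.
from transformers import pipeline

generator = pipeline("text-generation", model="openai-gpt")

# The model predicts the most likely next token, appends it, and repeats.
print(generator("the days are long but the decades are",
                max_new_tokens=5)[0]["generated_text"])
```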

Those two years in America yielded a great deal. I systematically studied programming, rendering, design—almost everything related to games. I had the opportunity to experiment, to explore, to build the ideas that had lived in my head for years. With no one forcing me to study, I felt for the first time an extraordinary freedom in exploring the world. It was as though the curiosity once extinguished by the educational system had reignited here, burning even more fiercely than before. For the first time in my life, I realized that there was, in truth, very little preventing me from doing anything I wanted—and that nowhere on Earth was more than fifteen hours away by plane. Two years earlier, the world had still seemed immeasurably vast to me; after crossing the Pacific several times a year, it suddenly became much smaller than I had imagined.

2020 was a turbulent year. The pandemic raged. Economic structures shifted violently. In the Texas heat, it felt as though God had pressed fast-forward. The semester was hurriedly wrapped up. My thesis defense happened online. I graduated. I had never imagined I would actually join Ubisoft, and yet it happened—as if the script had already been written. I felt that all the scattered stars had finally connected into a line. Ever since that afternoon in high school, every day had been my “best of times,” and I silently prayed:

“May it never change. And may it never change us.”

Then the pandemic deepened, war broke out, global tensions rose, and alliances began to fracture. The glittering game industry of the 2010s slowly gathered dust. The company that had once given a sixteen-year-old boy goosebumps was no longer what it had been. Not a single Assassin’s Creed since had truly moved me. The reality is simple: nothing ever stops changing. Not the world. Not us.

The rest of the story is widely known. Generative AI swept through every industry. A new technology became old news in a matter of hours. The fantasy of AGI suddenly seemed close enough to touch. The post-pandemic era brought large-scale economic decline, soaring unemployment, and eventually the media lost interest in reporting any of it.


It has now been nearly six years since graduation. Looking back on the road behind me, I find myself standing at a crossroads. I can clearly see the eighteen-year-old version of myself behind me, though he cannot see the one who now watches him. There will be more crossroads ahead. I, too, look into empty fog—yet my forty-year-old self can already see me tonight. And so it repeats, without end.

But I hope that every time we look back, even when we are old and frail, we can still clearly feel that younger self sitting in a room, glued to a computer, filled with both longing and confusion about the world. I hope we do not let him down.

Exactly ten years have passed since that question. Have my thoughts changed? Yes and no. I am no longer the child who naively believed games were pure art, untouched by commerce. But I have never accepted the kind of design in which every system is built around monetization. On the question of what truly matters, I am still holding hands tightly with my eighteen-year-old self. That thread has not broken. If I were to die tomorrow, I would want to leave behind an epitaph that reads Proudly Presents, not a legacy of crude money-printing machines despised by everyone.

Over the past twelve years, I feel I have not betrayed that boy. Every second, I have tried to live as well as I could. I have never forced myself to do what I hated, nor have I wasted my time.


When I first sat down to write this, I thought the pace of the world in recent years had accelerated far beyond anything in the past. That is true—but the past never stood still either. We did not arrive here out of nowhere. Look closely enough, and the threads begin to resolve: the world has always rushed forward, and history’s great moments were always happening. It was not that the old world had been static—it was that the old me could not yet see it. I sat behind a window, staring through a small LED screen, trying to catch hold of something that might fill the emptiness and timidity inside me.

And when we look back, all those apparently contradictory theories, divisions, cyclical arguments, winters, nights of despair and heartbreak, and decades of tireless labor all converge upon a single point. From Ada Lovelace writing the first algorithm for a machine that was never built, to Attention Is All You Need, and then to that small burst called Sparks of Artificial General Intelligence—this thread of one hundred and eighty years began with a simple dream, woven across generations by countless hands. We went from messages carried on foot, to horses and carts, to satellites, and then to networks, turning thought itself into streams of bytes moving through network topologies. Then we digitized everything—novels, songs, images, films, games—compressing every modality into signal. And now, through generative AI, we have abstracted “modality” itself. Machines no longer merely transport and process bytes—they are beginning to grasp the meaning behind them. In that latent space where all modalities can be unified into a single token representation, the machine touches, for the first time, the soft inner fabric of human knowledge and language, and at last breaks the if-then cage that confined it for seventy years. And so, finally—finally—the mad and romantic dream in Turing’s heart has come true.

I do not know where humanity is headed, nor is that what truly concerns me. Perhaps everything we do on this pale blue dot is ultimately futile; perhaps we are merely the bootstrap program for a silicon-based lifeform. And yet here we are—releasing the brake, flooring the accelerator, racing into the unknown. What is it that makes humanity different from every other living thing? At this moment, my answer would be: curiosity. Curiosity is a merciless curse. It overcomes timidity and fear, and lets the grandeur of the universe and nature lure this innocent species into the trap with ease. It grants those frail, two-legged creatures the courage to face death and descend into the darkest abyss. Thus entropy grows exponentially, and countless pioneers hurl themselves forward like moths into flame—and so emperors and ministers across the ages, here on a single arm of the Orion Spur in this nearly boundless time and space, strike one brief spark.


And all of this—the history of humankind, the rise and fall of nations, the fate of families—eventually converged upon me, on one cool night, reading a blog post. On his thirtieth birthday, Sam Altman wrote:

“The days are long but the decades are short.”

Days are slow in passing, but ten years vanish in an instant. I wonder how much of those fervent words of self-exhortation he still remembers now, at forty.

Today is my thirtieth birthday. I write this for myself, for my younger self, and for every future version of me:

“The decades are short but life is a spark.”

Years are few, and life flickers bright.

At this point, I have finished my own ten-year leg of the relay and passed the baton to you. I’ll see you again at forty.

This post is licensed under CC BY 4.0 by the author.