How Perfectly Can Reality Be Simulated?

Digitizing the real world involves the tedium of real-world processes. Three-dimensional models are created using lidar and photogrammetry, a technique in which hundreds or thousands of photographs of a single object are stitched together to produce a digital reproduction. In the redwood grove, as Caron set up his equipment, he told me that he had spent the past weekend inside, under, and atop a large “debris box”—crucially, not a branded Dumpster, which might not pass legal review—scanning it from all angles. The process required some nine thousand photographs. (“I had to do it fast,” he said. “People illegally dump their stuff.”) Plants and leaves, which are fragile, wavery, and have a short shelf life, require a dedicated vegetation scanner. Larger elements, like cliff faces, are scanned with drones. Reflective objects, such as swords, demand lasers. Lind told me that he loved looking at textures up close. “When you scan it, a metal is actually pitch-black,” he said. “It holds no color information whatsoever. It becomes this beautiful canvas.” But most of Quixel’s assets are created on treks that require permits and months of planning, by technical artists rucking wearable hard drives, cameras, cables, and other scanning equipment. Caron had travelled twice to the I’on Swamp, a former rice paddy on the outskirts of Charleston, South Carolina, to scan cypress-tree knees—spiky, woody growths that rise out of the water like stalagmites. “They look creepy,” he said. “If you want to make a spooky swamp environment, you need cypress knees.”
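
At its core, photogrammetry is a correspondence problem: software must recognize the same physical point, a knot in the bark, say, across many overlapping photographs before it can triangulate the camera positions and the object's geometry. A minimal sketch of that first matching step, written with the open-source OpenCV library (the filenames are hypothetical, and Quixel's production pipeline is proprietary and vastly more elaborate):

```python
# A toy illustration of photogrammetry's first step: finding shared
# feature points between two overlapping photographs. Real pipelines
# (structure-from-motion) repeat this across thousands of images, then
# triangulate camera poses and a dense 3-D point cloud.
import cv2

# Hypothetical filenames; any two overlapping photos of one object work.
img_a = cv2.imread("redwood_001.jpg", cv2.IMREAD_GRAYSCALE)
img_b = cv2.imread("redwood_002.jpg", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=5000)       # detect up to 5,000 keypoints
kp_a, des_a = orb.detectAndCompute(img_a, None)
kp_b, des_b = orb.detectAndCompute(img_b, None)

# Match descriptors; cross-checking keeps only mutual best matches.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_a, des_b), key=lambda m: m.distance)

print(f"{len(matches)} candidate correspondences between the two photos")
```

From thousands of such pairwise matches, a structure-from-motion solver recovers a three-dimensional point cloud, which is then meshed and textured into the kind of asset Caron captures.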

The company now maintains an enormous online marketplace, where digital artists can share and download scans of props and other environmental elements: a banana, a knobkerrie, a cluster of sea thrift, Thai coral, a smattering of horse manure. A curated collection of these elements labelled “Abattoir” includes a handful of rusty and sullied cabinets, chains, and crates, as well as twenty-seven different bloodstains (puddle, archipelago, “high velocity splatter”). “Medieval Banquet” offers, among other sundries, an aggressively roasted turnip, a rack of lamb ribs, wooden cups, and several pork pies in various sizes and stages of consumption. The scans are detailed enough that when I examined a roasted piglet—skin leathered with heat and torn at the elbow—it made me feel gut-level nausea.

Assets are incorporated into video games, architectural renderings, TV shows, and movies. Quixel’s scans make up the lush, dappled backgrounds of the live-action version of “The Jungle Book,” from 2016; recently, watching the series “The Mandalorian,” Caron spotted a rock formation that he had scanned in Moab. Distinctive assets run the risk of being too conspicuous: one Quixel scan of a denuded tree has become something of a meme, with gamers tweeting every time it appears in a new game. In Oakland, Caron considered scanning a wooden fence, but ruled out a section with graffiti (“DAN”), deeming it too unique.

Epic creates detailed simulations of people as part of a project called MetaHumans. Source: Epic Games

After a while, he zeroed in on a qualified redwood. Working in visual effects had given him a persnickety lens on the world. “You’re just trained to look at things differently,” he said. “You can’t help but look at clouds when you’ve done twenty cloudscapes. You’re hunting for the perfect cloud.” He crouched down to inspect the ground cover beneath the tree and dusted a branch of needles—distractingly green—out of the way. Caron’s colleagues sometimes trim grass, or snap a branch off a tree, in pursuit of an uncluttered image. But Caron, who is in his late thirties and grew up exploring the woods of South Carolina, prefers a leave-no-trace approach. He hoisted one of the scanning rigs onto his back, clipped in a hip belt to steady it, and picked up a large digital camera. After making a series of tweaks—color calibration, scale, shooting distance—he began to slowly circle the redwood, camera snapping like a metronome. An hour passed, and the light began to change, suboptimally. On the drive home, I considered the astonishing amount of labor involved in creating set pieces meant to go unnoticed. Who had baked the pork pies?

Sweeney, Epic’s C.E.O., has the backstory of tech-founder lore—college dropout, headquarters in his parents’ basem*nt, posture-ruining work ethic—and the stage presence of a spelling-bee contestant who’s dissociating. He is fifty-three years old, and deeply private. He wears seventies-style aviator eyeglasses, and dresses in corporate-branded apparel, like an intern. He is mild and soft-spoken, uses the word “awesome” a lot, and tweets in a way that suggests either the absence of a communications strategist or a profound understanding of his audience. (“Elon Musk is going to Mars and here I am debugging race conditions in single-threaded JavaScript code.”) He likes fast cars and Bojangles chicken. Last year, he successfully sued Google for violating antitrust laws. Epic, which is privately held, is currently valued at more than twenty-two billion dollars; Sweeney reportedly is the controlling shareholder.

When we spoke, earlier this spring, he was at home, in Raleigh, North Carolina, wearing an Unreal Engine T-shirt and drinking a soda from Popeyes. Behind him were two high-end Yamaha keyboards. We were on video chat, and the lighting in the room was terrible. During our conversation, he vibrated gently, as if shaking his leg; I wondered if it was the soda. “It’s probably going to be in our lifetime that computers are going to be able to make images in real time that are completely indistinguishable from reality,” Sweeney told me. The topic had been much discussed in the industry, during the company’s early days. “That was foreseeable at the time,” he said. “And it’s really only starting to happen now.”

Sweeney grew up in Potomac, Maryland, and began writing little computer games when he was nine. After high school, he enrolled at the University of Maryland and studied mechanical engineering. He stayed in the dorms but spent some weekends at his parents’ house, where his computer lived. In 1991, he created ZZT, a text-based adventure game. Players could create their own puzzles and pay for add-ons, which Sweeney shipped to them on floppy disks. It was a sleeper hit. By then, he had started a company called Potomac Computer Systems. (It took its name from a consulting business he had wanted to start, for which he had already purchased stationery.) It operated out of his parents’ house. His father, a cartographer for the Department of Defense, ran its finances. Sweeney renamed the company Epic MegaGames—more imposing, to his ear—and hired a small team, including the game designer Cliff Bleszinski, who was still a teen-ager. “In many ways, Tim Sweeney was a father figure to me,” Bleszinski told me. “He showed me the way.”

Sweeney’s lodestar was a company called id Software. In 1993, id released Doom, a first-person shooter about a husky space marine battling demons on the moons of Mars and in Hell. Doom was gory, detailed, and, crucially, fast: its developers had drawn on military research, among other things. But id also took the unusual step of releasing what it called Doom’s “engine”—the foundational code that made the game work. Previously, games had to be built from scratch, and companies kept their code proprietary: even knowing how to make a character crouch or jump gave them an edge. Online, Doom “mods” proliferated, and game studios built new games atop Doom’s architecture. Structurally, they weren’t a huge departure. Heretic was a fantastical first-person shooter about fighting the undead; Strife was a fantastical first-person shooter about fighting robots. But they were proofs of concept for a new method and philosophy of game-making. As Henry Lowood, a video-game historian at Stanford, told me, “The idea of the game engine was ‘We’re just producing the technology. Have at it.’”

Sweeney thought that he could do better. He soon began building his own first-person shooter, which he named Unreal. He recalled looking through art reference books and photographs to better understand shadows and light. When you spend hours thinking about computer graphics, he told me, the subject “tends to be unavoidable in your life. You’re walking through a dark scene outdoors at night, and it’s rainy, and you’re seeing the street light bounce off of the road, and you’re seeing all these beautiful fringes of color, and you realize, Oh, I should be able to render this.” Unreal looked impressive. Water was transparent, and flames flickered seemingly at random. After screenshots of the game were published, before its release, developers began contacting Sweeney, asking to use his engine for their own games.
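
The effect Sweeney describes, a highlight whose brightness depends on where the light, the road, and the eye happen to sit, is the stuff of textbook shading models. A minimal sketch of the Blinn-Phong specular term, a classroom approximation rather than Unreal Engine's actual shading code:

```python
# A toy version of the calculation Sweeney gestures at: how much of a
# street light bounces off a wet road toward the viewer. This is the
# classic Blinn-Phong specular model, a textbook approximation.
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def blinn_phong_specular(light_dir, view_dir, normal, shininess=64.0):
    # The "half vector" sits between the light and the eye; the closer
    # the surface normal is to it, the brighter the highlight.
    l, v, n = normalize(light_dir), normalize(view_dir), normalize(normal)
    h = normalize(tuple(a + b for a, b in zip(l, v)))
    return max(0.0, sum(a * b for a, b in zip(n, h))) ** shininess

# Grazing light on a flat, glossy road (surface normal points straight up);
# the eye sits in the mirror direction, so the highlight is at full strength.
print(blinn_phong_specular((1, 1, 0), (-1, 1, 0), (0, 1, 0)))  # -> 1.0
```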
