The Eternal Quest for a Gun That Doesn’t Kill

This article was published in partnership with The Marshall Project, a nonprofit news organization that covers the US criminal justice system. Sign up for their newsletter, or follow The Marshall Project on Facebook.

In the new cop drama APB, an Elon Musk-type billionaire engineer buys a beleaguered Chicago police precinct to avenge his friend’s murder. He re-outfits the station with wizardry including drones, a biometric interrogation chair, and guns that instantly (and nonlethally) stop crooks with the range and precision of a conventional pistol. We’ll leave it to the lawyers to argue whether a civilian could buy a precinct. As for the technology, especially the pimped-out stun gun, the question is timely: Given the recent high-profile fatal police shootings of civilians—roughly 1,000 a year—it stands to reason that law enforcement officials and victim advocates alike are on an ongoing search for a device that can neutralize a threat without causing permanent harm.

“It’s shout or shoot. There aren’t a lot of intermediate options,” says Sid Heal, a retired Los Angeles County Sheriff’s Department commander who consults internationally on the use of force. And since most police department protocols allow officers to respond to threats with a higher level of force than they’re confronting, an officer who faces off against a foe holding a lethal weapon—which could be a hammer or baseball bat—is likely to respond with the service revolver. For 800 years, the only effective way to stop an adversary has been a gun, set to kill. But the pursuit of a nonlethal alternative has never been more urgent.

Tom Swift’s Electric Rifle

What, then, about the Taser? Isn’t that the solution—a device that can shock a subject into submission, leaving no lasting damage? That’s the idea in theory, and since first introduced by Taser International in 1993, the device has become a mainstay in nearly every police department. But theory and practice are two different things. Tasers are both less effective than guns at stopping somebody charging at you and no guarantee of leaving them unscathed.

It’s not a magic bullet, but it stops a threat. Steve Tuttle, Taser International

Named for the 1911 youth book Tom Swift and His Electric Rifle (it’s an acronym), Tasers work by firing two electrically charged probes—one negative, one positive—delivering a 5- to 30-second shock of 50,000 volts, although the voltage drops significantly upon impact. Both probes need to make close contact with the skin to work. Yet often they don’t: Heavy clothing can repel them, and the farther they’re fired, the wider the gap, or spread, between the two probes. “The spread is one foot for every seven feet they travel. If I deploy it at you at 14 feet, the spread is going to be two feet,” says Taser International spokesperson Steve Tuttle. “It’s not a magic bullet.” But, he adds confidently, “It stops a threat,” and more effectively than other less-lethal options such as batons, pepper spray, or disorienting flashing devices. (Taser International virtually owns the industry, though a handful of competitors have introduced similar stun devices. The company won a patent infringement lawsuit against one, Karbon Arms, in 2014. The Karbon Arms website has since shut down, and its Facebook page says “closed for business.”)

While Tasers are clearly less deadly than conventional firearms, arguments continue over whether they can shock someone to death. A Washington Post investigation of police killings in 2015 found roughly one death per week associated with police use of Tasers, but no one could definitively attribute those deaths to electric shock. Some subjects might have fallen and hit their heads after being shocked. As for range, in 2009 Taser introduced the XREP extended-range shotgun projectile, which could reach up to 100 feet. But with only limited cases of practical usefulness and rounds costing $125 each, Tuttle says, “It was too expensive. We pulled it.” Tuttle notes that FBI data show most officers fire their guns from seven to 10 feet away, well within a Taser’s reach. Still, some officers won’t trust a Taser except up close.

On TV, the range problem is solved simply by writing it into the script. In APB’s pilot episode, a detective is directed by the precinct’s new owner to shoot at a woman being held hostage by a perp with a gun to her head. “The Taser won’t kill her, but he will,” the rich boss whispers. The detective takes the shot and the woman falls, stunned but unharmed. Then the detective shoots again and immobilizes the bad guy.

Say I just had an encounter with somebody threatening suicide where you have the less-lethal option. Now I go on the next job and I have someone shooting at me. Am I going to remember to switch the mode? John Folino, Chicago Police Department

How that scenario would play out in the real world, who knows? Sergeant Detective John Folino, the show’s technical adviser and a 19-year veteran of the Chicago Police Department, focuses on practical issues over ethics. “Right now, you’ve got officers carrying a gun plus a Taser. You can only have so much on your duty belt,” impeding mobility and causing back pain, he says. Like Star Trek’s phasers, the fictional guns on Fox’s APB have stun and kill settings—which could also cause problems. “Say I just had an encounter with somebody threatening suicide where you have the less-lethal option,” Folino says. “Now I go on another job and I have someone shooting at me. Am I going to remember to switch the mode?” That’s a real-world question tragically answered last year when Tulsa reserve deputy Robert Bates drew his gun instead of his Taser and killed Eric Harris, the unarmed subject of a sting operation. Bates said he’d gotten confused, and was sentenced to four years for manslaughter. “That’s why when we train, we put the Taser on the opposite side,” Folino says. “It’s called your support side. Your actual gun is on your strong side.”

Directed Energy: Feeling the Burn

So what else is out there? The next closest thing to the elusive phaser set on stun may be a directed-energy system developed by Raytheon that fires waves of energy that penetrate a paper-thin layer of the skin, creating an intolerable burning sensation. But it’s hardly handheld and has to be mounted on a flatbed trailer. Originally developed for the military, it was deployed to Afghanistan in 2010 before being recalled by the Air Force, apparently over concerns about Geneva Convention violations. Raytheon didn’t return calls for comment. Jail guards tested a model for law enforcement use—mostly for crowd control—at Los Angeles’ North County Correctional Facility in 2010. Heal, the retired LA sheriff’s commander, was a consultant on that project. “We put it on top of the prison where we just had two murders. And the ACLU objected,” he complains, with a tone of exasperation. “Why don’t we just use the stuff we’ve been using since 1820, like billy clubs and night sticks?”

There is no such thing as a perfect weapon, and weapons designed to be non-lethal can end up having lethal effects or infringe on people’s rights to speak out and assemble. Rohini Haar, Physicians for Human Rights

The ACLU referred me to Physicians for Human Rights. “Our prevailing concerns about weapons—either real or hypothetical—are both the risk they pose and their potential to be used to violate people’s rights,” writes Rohini Haar, an emergency medicine physician with the group, in an email. She says the beam’s effects haven’t been fully studied. “Certainly an alternative to live ammunition is warranted, but the problem here is that [less-lethal weapons] are often deployed without a full understanding of their potential health impacts. … There is no such thing as a perfect weapon, and weapons designed to be non-lethal can end up having lethal effects or infringe on people’s rights to speak out and assemble.”

And so the search continues. Robert Afzal of Aculight Corp., a Lockheed subsidiary in Bothell, Washington, is developing high-powered lasers to shoot down missiles. A bit of a Trekkie, he poses with a movie prop in a Smithsonian documentary that likens his laser to a phaser. Both are beam weapons, after all. But like Raytheon’s beam weapon, Afzal’s must be mounted on a large vehicle. It also uses intense heat to shoot down missiles, not to repel humans. “The phaser as stun gun or Taser is still in the realm of good science fiction,” Afzal says. “We would need significant advances in technology, including batteries, to make a useful handheld laser weapon.” Size, then, still matters. The technology to pack all that energy into a holster-ready device simply isn’t here yet. Also, the various shocking and heating tools currently available or in development follow a basic paradigm: Those that send impulses instantaneously with a beam burn their subjects rather than shock them; those that shock, like the tethered Taser probes, don’t use beams. APB is only doing so-so in the ratings, so it’s not likely to spark the imagination of weapons designers. But the best device was conceived more than 300 years ago. In “The Tempest,” Shakespeare’s Prospero declares:

I can here disarm thee with this stick / And make thy weapon drop.

His power was simply magic.

I Took the AI Class Facebookers Are Literally Sprinting to Get Into

Chia-Chiunn Ho was eating lunch inside Facebook headquarters, at the Full Circle Cafe, when he saw the notice on his phone: Larry Zitnick, one of the leading figures at the Facebook Artificial Intelligence Research lab, was teaching another class on deep learning.

Ho is a 34-year-old Facebook digital graphics engineer known to everyone as “Solti,” after his favorite conductor. He couldn’t see a way of signing up for the class right there in the app. So he stood up from his half-eaten lunch and sprinted across MPK 20, the Facebook building that’s longer than a football field but feels like a single room. “My desk is all the way at the other end,” he says. Sliding into his desk chair, he opened his laptop and surfed back to the page. But the class was already full.

Internet giants have vacuumed up most of the available AI talent—and they need more.

He’d been shut out the first time Zitnick taught the class, too. This time, when the lectures started in the middle of January, he showed up anyway. He also wormed his way into the workshops, joining the rest of the class as they competed to build the best AI models from company data. Over the next few weeks, he climbed to the top of the leaderboard. “I didn’t get in, so I wanted to do well,” he says. The Facebook powers-that-be are more than happy he did. As anxious as Solti was to take the class—a private set of lectures and workshops open only to company employees—Facebook stands to benefit the most.

Deep learning is the technology that identifies faces in the photos you post to Facebook. It also recognizes commands spoken into Google phones, translates foreign languages on Microsoft’s Skype app, and wrangles porn on Twitter, not to mention the way it’s changing everything from internet search and advertising to cybersecurity. Over the last five years, this technology has radically shifted the course of all the internet’s biggest operations.

With help from Geoff Hinton, one of the founding fathers of the deep learning movement, Google built a central AI lab that feeds the rest of the company. Then it paid more than $650 million for DeepMind, a second lab based in London. Another founding father, Yann LeCun, built a similar operation at Facebook. And so many other deep learning startups and academics have flooded into so many other companies, drawn by enormous paydays.

The problem: These companies have now vacuumed up most of the available talent—and they need more. Until recently, deep learning was a fringe pursuit even in the academic world. Relatively few people are formally trained in these techniques, which require a very different kind of thinking than traditional software engineering. So, Facebook is now organizing formal classes and long-term research internships in an effort to build new deep learning talent and spread it across the company. “We have incredibly smart people here,” Zitnick says. “They just need the tools.”

Meanwhile, just down the road from Facebook’s Menlo Park, California, headquarters, Google is doing much the same, apparently on an even larger scale, as so many other companies struggle to deal with the AI talent vacuum. David Elkington, CEO of Insidesales, a company that applies AI techniques to online sales services, says he’s now opening an outpost in Ireland because he can’t find the AI and data science talent he needs here in the States. “It’s more of an art than a science,” he says. And the best practitioners of that art are very expensive.

In the years to come, universities will catch up with the deep learning revolution, producing far more talent than they do today. Online courses from the likes of Udacity and Coursera are also spreading the gospel. But the biggest internet companies need a more immediate fix.

Seeing the Future

Larry Zitnick, 42, is a walking, talking, teaching symbol of how quickly these AI techniques have ascended—and how valuable deep learning talent has become. At Microsoft, he spent a decade working to build systems that could see like humans. Then, in 2012, deep learning techniques eclipsed his ten years of research in a matter of months.

In essence, researchers like Zitnick were building machine vision one tiny piece at a time, applying very particular techniques to very particular parts of the problem. But then academics like Geoff Hinton showed that a single piece—a deep neural network—could achieve far more. Rather than code a system by hand, Hinton and company built neural networks that could learn tasks largely on their own by analyzing vast amounts of data. “We saw this huge step change with deep learning,” Zitnick says. “Things started to work.”

For Zitnick, the personal turning point came one afternoon in the fall of 2013. He was sitting in a lecture hall at the University of California, Berkeley, listening to a PhD student named Ross Girshick describe a deep learning system that could learn to identify objects in photos. Feed it millions of cat photos, for instance, and it could learn to identify a cat—actually pinpoint it in the photo. As Girshick described the math behind his method, Zitnick could see where the grad student was headed. All he wanted to hear was how well the system performed. He kept whispering: “Just tell us the numbers.” Finally, Girshick gave the numbers. “It was super-clear that this was going to be the way of the future,” Zitnick says.

Within weeks, he hired Girshick at Microsoft Research, as he and the rest of the company’s computer vision team reorganized their work around deep learning. This required a sizable shift in thinking. As a top researcher once told me, creating these deep learning systems is more like being a coach than a player. Rather than building a piece of software on your own, one line of code at a time, you’re coaxing a result from a sea of information.

But Girshick wasn’t long for Microsoft. And neither was Zitnick. Soon, Facebook poached them both—and almost everyone else on the team.

This demand for talent is the reason Zitnick is now teaching a deep learning class at Facebook. And like so many other engineers and data scientists across Silicon Valley, the Facebook rank and file are well aware of the trend. When Zitnick announced the first class in the fall, the 60 spots filled up in ten minutes. He announced a bigger class this winter, and it filled up nearly as quickly. There’s demand for these ideas on both sides of the equation.

There’s also demand among tech reporters. I took the latest class myself, though Facebook wouldn’t let me participate in the workshops on my own. That would require access to the Facebook network. The company believes in education, but only up to a point. Ultimately, all this is about business.

Going Deep

The class begins with the fundamental idea: the neural network, a notion that researchers like Frank Rosenblatt explored as far back as the late 1950s. The conceit is that a neural net mimics the web of neurons in the brain. And in a way, it does. It operates by sending information between processing units, or nodes, that stand in for neurons. But these nodes are really just linear algebra and calculus that can identify patterns in data.
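A single node really is just that arithmetic. As a minimal illustration (not anything from the class itself), here is one “neuron” in Python taking a weighted sum of its inputs and squashing the result through a nonlinearity:

```python
import numpy as np

# One "node": a weighted sum of inputs pushed through a nonlinearity.
# The inputs and weights below are made up purely for illustration.
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

inputs = np.array([0.2, 0.7, 0.1])    # stand-ins for pixel or feature values
weights = np.array([0.4, -0.6, 1.5])  # parameters a real network would learn
bias = 0.1

activation = sigmoid(np.dot(weights, inputs) + bias)
print(activation)  # this output is what gets passed to the next layer
```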

Even in the ’50s, it worked. Rosenblatt, a professor of psychology at Cornell, demonstrated his system for the New Yorker and the New York Times, showing that it could learn to identify changes in punchcards fed into an IBM 704 mainframe. But the idea was fundamentally limited—it could only solve very small problems—and in the late ’60s, when MIT’s Marvin Minsky published a book that proved these limitations, the AI community all but dropped the idea. It returned to the fore only after academics like Hinton and LeCun expanded these systems so they could operate across multiple layers of nodes. That’s the “deep” in deep learning.

As Zitnick explains, each layer makes a calculation and passes it to the next. Then, using a technique called “back propagation,” the layers send information back down the chain as a means of error correction. As the years went by and technology advanced, neural networks could train on much larger amounts of data using much larger amounts of computing power. And they proved enormously useful. “For the first time ever, we could take raw input data like audio and images and make sense of them,” Zitnick told his class, standing at a lectern inside MPK 20, the south end of San Francisco Bay framed in the window beside him.
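To make that forward-and-backward flow concrete, here is a from-scratch NumPy sketch of a two-layer network learning XOR, a classic toy problem. It is illustrative only, not Facebook’s tooling or the class’s actual material: the forward pass sends each layer’s calculation up the chain, and backpropagation sends the error back down to adjust the weights.

```python
import numpy as np

# Toy dataset: XOR, a problem a single-layer perceptron can't solve.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(2, 4)), np.zeros((1, 4))  # layer 1: 2 inputs -> 4 hidden nodes
W2, b2 = rng.normal(size=(4, 1)), np.zeros((1, 1))  # layer 2: 4 hidden -> 1 output

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
lr = 1.0  # learning rate

for step in range(5000):
    # Forward pass: each layer makes a calculation and passes it on.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: send the error back down the chain (backpropagation).
    d_out = (out - y) * out * (1 - out)
    d_h = d_out @ W2.T * h * (1 - h)

    # Nudge the weights against the error gradient.
    W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(out.round(3))  # approaches [0, 1, 1, 0] as training converges
```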

‘We have incredibly smart people here. They just need the tools.’ Larry Zitnick

As the class progresses and the pace picks up, Zitnick also explains how these techniques evolved into more complex systems. He explores convolutional neural networks, a method inspired by the brain’s visual cortex that groups neurons into “receptive fields” arranged almost like overlapping tiles. His boss, Yann LeCun, used these to recognize handwriting way back in the early ’90s. Then the class progresses to LSTMs—neural networks that include their own short-term memory, a way of retaining one piece of information while examining what comes next. This is what helps identify the commands you speak into Android phones.
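In code, both architectures are only a few lines. The sketch below uses PyTorch, the open-source framework released by Facebook’s AI lab (the article doesn’t say what the class itself uses, so treat this as an illustration): a small convolutional layer whose kernels act as overlapping receptive fields over an image, and an LSTM that carries short-term memory across a sequence of word vectors.

```python
import torch
import torch.nn as nn

# Convolutional layer: each 3x3 kernel is a small "receptive field" slid
# across the image like overlapping tiles. Sizes here are arbitrary.
conv = nn.Sequential(
    nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.MaxPool2d(2),
)
image_batch = torch.randn(8, 3, 32, 32)        # 8 fake 32x32 RGB images
feature_maps = conv(image_batch)               # -> (8, 16, 16, 16)

# LSTM: a recurrent network with its own short-term memory, retaining one
# piece of information while examining what comes next in the sequence.
lstm = nn.LSTM(input_size=128, hidden_size=64, batch_first=True)
word_vectors = torch.randn(8, 20, 128)         # 8 fake sentences of 20 word vectors
outputs, (hidden, cell) = lstm(word_vectors)   # hidden state summarizes each sequence

print(feature_maps.shape, outputs.shape)
```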

In the end, all these methods are still just math. But to understand how they work, students must visualize how they operate across time (as data passes through the neural network) and space (as those tile-like receptive fields examine each section of a photo). Applying these methods to real problems, as Zitnick’s students do during the workshops, is a process of trial, error, and intuition—kind of like manning the mixing console in a recording studio. You’re not at a physical console. You’re at a laptop, sending commands to machines in Facebook data centers across the internet, where the neural networks do their training. But you spend your time adjusting all sorts of virtual knobs—the size of the dataset, the speed of the training, the relative influence of each node—until you get the right mix. “A lot of it is built by experience,” says Angela Fan, 22, who took Zitnick’s class in the fall.
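Those “virtual knobs” are hyperparameters, and the trial-and-error Fan describes often boils down to sweeping a handful of settings and keeping whichever scores best on held-out data. The knob names, ranges, and placeholder scoring function below are generic stand-ins, not Facebook’s internal tooling:

```python
import itertools

# Generic stand-ins for the knobs a student might twiddle in the workshops:
# how much data to train on, how fast to learn, how strongly to regularize.
knobs = {
    "train_examples": [10_000, 100_000],
    "learning_rate": [0.1, 0.01, 0.001],
    "dropout": [0.0, 0.5],
}

def train_and_score(train_examples, learning_rate, dropout):
    """Placeholder: in a real workshop this would train a model on
    company data and return accuracy on a held-out validation set."""
    return 0.0  # hypothetical score

best = None
for values in itertools.product(*knobs.values()):
    setting = dict(zip(knobs.keys(), values))
    score = train_and_score(**setting)
    if best is None or score > best[0]:
        best = (score, setting)

print("best mix of knobs:", best)
```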

A New Army

Fan studied statistics and computer science as an undergraduate at Harvard, finishing just last spring. She took some AI courses, but many of the latest techniques are still new even to her, particularly when it comes to actually putting them into practice. “I can learn just from interacting with the codebase,” she says, referring to the software tools Facebook has built for this kind of work.

For her, the class was part of a much larger education. At the behest of her college professor, she applied for Facebook’s “AI immersion program.” She won a spot working alongside Zitnick and other researchers as a kind of intern for the next year or two. Earlier this month, her team published new research describing a system that takes the convolutional neural networks that typically analyze photos and uses them to build better AI models for understanding natural language—that is, how humans talk to each other.

This kind of language research is the next frontier for deep learning. After reinventing image recognition, speech recognition, and machine translation, researchers are pushing toward machines that can truly understand what humans say and respond in kind. In the near-term, the techniques described in Fan’s paper could help improve that service on your smartphone that guesses what you’ll type next. She envisions a tiny neural network sitting on your phone, learning how you—and just you in particular—talk to other people.
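The paper itself isn’t detailed here, but the general trick of pointing convolutional networks at language can be sketched simply: slide filters over a sequence of word embeddings instead of over pixels, pad so each position only sees earlier words, and predict the next word. The PyTorch toy below is purely illustrative and is not the architecture from Fan’s paper:

```python
import torch
import torch.nn as nn

VOCAB, EMBED, HIDDEN = 5000, 64, 128   # made-up sizes for illustration

class TinyConvLM(nn.Module):
    """Toy next-word predictor using 1D convolutions over word embeddings."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, EMBED)
        # padding=2 with kernel_size=3 lets us trim the output so position t
        # never sees tokens that come after it (a causal convolution).
        self.conv = nn.Conv1d(EMBED, HIDDEN, kernel_size=3, padding=2)
        self.to_vocab = nn.Linear(HIDDEN, VOCAB)

    def forward(self, tokens):                  # tokens: (batch, seq_len)
        x = self.embed(tokens).transpose(1, 2)  # -> (batch, EMBED, seq_len)
        h = torch.relu(self.conv(x))[:, :, : tokens.size(1)]  # causal trim
        return self.to_vocab(h.transpose(1, 2)) # -> (batch, seq_len, VOCAB) logits

model = TinyConvLM()
fake_sentences = torch.randint(0, VOCAB, (4, 12))   # 4 sequences of 12 token ids
logits = model(fake_sentences)
next_word_guess = logits[:, -1].argmax(dim=-1)      # most likely next word per sequence
print(next_word_guess)
```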

For Facebook, the goal is to create an army of Angela Fans, researchers steeped not just in neural networks but a range of related technologies, including reinforcement learning—the method that drove DeepMind’s AlphaGo system when it cracked the ancient game of Go—and other techniques that Zitnick explores as the course comes to a close. To this end, when Zitnick reprised the course this winter, Fan and other AI lab interns served as class TAs, running the workshops and answering any questions that came up over the six weeks of lectures.

Facebook isn’t just trying to beef up its central AI lab. It’s hoping to spread these skills across the company. Deep learning isn’t a niche pursuit. It’s a general technology that can potentially change any part of Facebook, from Messenger to the company’s central advertising engine. Solti could even apply it to the creation of videos, considering that neural networks also have a talent for art. Any Facebook engineer or data scientist could benefit from understanding this AI. That’s why Larry Zitnick is teaching the class. And it’s why Solti abandoned his lunch.


Don’t Worry, There’s Plenty of Great Iron Fist—It’s Just Not on Netflix

The critical pile-on of Iron Fist has officially reached comedy status. The fourth of Netflix’s Marvel shows (and the final lead-in to next year’s Defenders teamup) premieres today, and the reception to the first few episodes has not been kind. While that’s largely the fault of dull writing and plodding plotting, though, Iron Fist himself hasn’t been helping. From the moment that Netflix announced the casting of Finn Jones as the titular hero, there’s been a steady drumbeat of complaints about a white guy playing the greatest martial artist in the world—a complaint that has only become louder as Jones has waded into the fray, getting defensive on Twitter and suggesting that people are only complaining because Donald Trump is President.

To be fair, many comic book fans have come to the defense of Jones’ casting. Sure, they argue, it might be racially insensitive to have a white guy be Marvel’s best martial artist; and yeah, it’s another example of Marvel’s reliance on the “white savior” trope, one made more troubling after last year’s Doctor Strange turned The Ancient One from an Asian to a Caucasian role. But, they insist, it’s canon, because Iron Fist was actually white.

That’s true: Danny Rand, the Iron Fist on the show, is indeed the primary Iron Fist in comic book continuity. But that doesn’t mean that Danny Rand is the only Iron Fist in Marvel’s comic book mythology. As early as his second comic book appearance (in 1974’s Marvel Premiere #16), there was the implication that Iron Fist wasn’t an individual’s identity as much as a shared mantle that had been worn by different people throughout history. It would take decades for that idea to come into focus, but when it did—courtesy of the 2006 Immortal Iron Fist series by Ed Brubaker, Matt Fraction, and David Aja—it revolutionized Iron Fist as a concept, and as a superhero identity.

Rand, Immortal Iron Fist revealed, was the sixty-seventh Iron Fist to that point. Although the series only introduced readers to seven of his 66 predecessors, all but one of them were of Asian descent. Beyond Quan Yazou, the original Iron Fist, there were Li Park, Bein Ming-Tian, Wu Ao-Shi, Bei Bang-Wen and Kwai Jun-Fan—and none of them were a hipster version of Bruce Wayne. (Though it’s telling that the series spent more time with the seventh predecessor, a white dude named Orson Randall, than any of the others.)

Nor was Iron Fist’s Asian legacy only in the past; in both Immortal Iron Fist and the subsequent series Iron Fist: The Living Weapon, the writers established that the future of the Iron Fist was distinctly un-Caucasian. The former series flashed forward to the year 3099 to introduce Wah Sing-Rand, while The Living Weapon showed a young female monk called Pei possessing the Iron Fist.

In many ways, this is in keeping with Marvel’s general direction with regards to comic book representation over the last few years. Once upon a time, the company’s catalog of heroes who were women or people of color was limited to sidekicks, supporting characters, and the occasional team-member. More recently, though, more familiar superhero identities have been turned into franchises with an aim of more accurately reflecting the world outside your window. The half-Black, half-Latino Miles Morales became a second Spider-Man; Sam Wilson—formerly the high-flying Falcon—signed on as a new Captain America; Thor was replaced as god of thunder by his ex-girlfriend Jane Foster.

While that trend seems to be continuing to this day—Invincible Iron Man was recently relaunched with a teenage girl taking the place of Tony Stark—there remains a horde of traditionalists for whom there can only be one version of any given character. More often than not, that means the original version, when almost everyone was a white dude. It’s worth noting that Marvel is seeing historically low sales of its monthly titles, leading to rumors of a relaunch later this year that will restore the white male versions of its big names in hopes of appealing to long-term fans.

Is that conservative impulse among fandom the reason that Marvel didn’t try to switch things up when selecting a TV version of Iron Fist? It’s unclear. The company’s movies and TV adaptations tend to hew towards the “classic” takes on characters, but not always: Samuel L. Jackson’s Nick Fury and Agents of SHIELD‘s Ghost Rider were based on later incarnations rather than the original (white) ones. But if you’re convinced that Netflix’s Iron Fist should be white because of “canon,” forget it: A full 80% of the comic book Iron Fists to date haven’t been white. There’s more than enough material available to support an alternative take. Perhaps those concerned with fidelity to the source material should ask themselves why Marvel didn’t really go with canon in the first place.



Inside the Deeply Nerdy—and Insanely Expensive—World of Hollywood Prop Collecting

“Previous generations bought Renoirs and Cézannes,” Dan Lanigan says. “We’re buying stormtrooper helmets and Ghostbusters proton packs.” The burly TV producer is referring to the obsessive (and costly) pursuit of prop collecting. “This is the fine art of my generation.”

It used to be an underground hobby. People did it, but nobody talked about it—not only because it was embarrassing to admit that you coveted Charlton Heston’s slave collar from Planet of the Apes but also because, since such things were studio property, it was illegal to own them. Shady studio insiders and a cabal of collectors struck deals in private. That all changed in 1970, when MGM cleared some clutter from its soundstages with a three-day auction. Among the frayed costumes and antique furniture that hit the block were two of the most important sci-fi props ever made: the proto-steampunk contraption from the 1960 film adaptation of H. G. Wells’ The Time Machine, and the miniature model of the United Planets Cruiser C-57D, better known as the Forbidden Planet flying saucer. The time machine sold for almost $10,000, and while there’s no record of what the silver saucer went for then, it changed hands eight years ago for $76,700. Since MGM’s auction, prices for the best sci-fi props have routinely hit six figures. In October 2015, the miniature Rebel blockade runner ship from Star Wars: Episode IV pulled down $450,000.

This very expensive hobby is about more than snatching up the coolest specimens. It’s about lost youth, self-identification, preserving the past, and—though most collectors won’t admit it—hero worship and secret cosplay. There are some things in life more thrilling than watching your favorite movie late at night while clutching a screen-used prop from the same flick in your trembling, sweaty palms, but it’s a very short list.

Prop: Deckard’s PKD blaster | Film: Blade Runner (1982) | Designers: Terry Lewis and Ridley Scott | Materials: .222 caliber Steyr-Mannlicher SL rifle, Charter Arms Bulldog .44 Special, six LEDs (four red, two green) | Most Recent Selling Price: $270,000 | Photograph: Dan Winters

When the Blade Runner gun surfaced, it was a big deal for the sci-fi prop community. After 24 years without a sighting, enthusiasts had resigned themselves to the idea that Deckard’s hand cannon was lost forever, like tears in rain. Then suddenly there it was, at the 2006 Worldcon, displayed under glass in all its off-world glory. Using 170 forensic photographs documenting every screw, scratch, and rust spot, hardcore collectors on the RPF hobbyist website were able to make a positive ID. Not only was this an authentic BR gun, it was the authentic “hero” blaster—hero being prop lingo for the detailed model used in close-ups—the very same weapon Harrison Ford used to blow away replicants. Three years later, Deckard’s PKD (a sly nod to Philip K. Dick, the author of Blade Runner’s source material) sold at auction for $270,000. The winning bidder was Dan Lanigan, a burly TV producer known for bidding up lots that pass the “mom test,” props so indelibly iconic that even your mother would recognize them. The allure of this hero blaster is that, unlike so many sci-fi heaters, it looks and feels like a real gun. That’s because it’s made with real gun parts. The steel slab atop the barrel and the magazine below are from a .222-caliber Steyr-Mannlicher SL bolt-action target rifle (the factory serial number is clearly visible: 5223). The other primary donor organs were pulled from a Charter Arms Bulldog .44 Special. This inspired mix of high- and low-tech components strikes the perfect balance between dystopian sci-fi and gumshoe noir.

Prop: ED-209 VFX miniature | Film: RoboCop (1987) | Designer: Craig Hayes | Materials: Resin, wire, rubber, and foam over a metal armature | Estimated Value: $60,000 to $80,000 | Photograph: Dan Winters

The protagonist of Paul Verhoeven’s sleeper hit is Officer Murphy, the titular cyborg tasked with cleaning up the mean streets of Detroit. But the character that really steals the show is the dysfunctional and heavily armed homicidal bot known as ED-209. Whether blowing away a brown-nosing junior executive with 20-mm cannons or throwing a big-baby tantrum after falling down a flight of stairs, ED’s screen presence is a paragon of stop-motion animatronics. Collector Dan Lanigan purchased his ED-209 model directly from RoboCop’s VFX supervisor, Phil Tippett. It’s one of only two fully articulating ED-209 miniatures made for this underrated cyberpunk satire, and the only one reused for Robocop 2 and 3. A cross between a Bell UH-1 Huey gunship and a DARPA black project, this 8-inch-tall maquette is an exact dupe of the full-size (7-foot-tall, 300-pound) but mostly static fiberglass ED-209 that Verhoeven used for the live-action scenes. An obsessive attention to detail—from the four hydraulic rams controlling each leg to the heat exchangers, intake/exhaust vents, and radiators (homages to ED’s Motor City roots)—was necessary so that the lighting would reflect at exactly the same angle and intensity on both the puppet and its full-size counterpart. If the metrics were slightly off, the stop-motion and live-action footage wouldn’t match up perfectly in post-production. Hinged and ball-and-socket joints enable the many slight and precise body movements necessary for convincing stop-action photography. It’s not just the historical significance, though, that gets collectors excited. “ED is a badass Corvette with legs,” Lanigan says. “He’s a villain, but also likeable because he’s such a comical idiot.”

Prop: Lightsaber | Film: Star Wars: Return of the Jedi (1983) | Designers: Norman Harrison and Norank Engineering | Materials: Resin casting of original | Value: $30,000 | Photograph: Dan Winters

In the world of vintage collectables, there’s always a marquee brand that demands insane prices. In the sci-fi prop world, that brand is Star Wars. The prices for production artifacts with a Lucasfilm provenance make a mockery of presale estimates. A TIE Fighter miniature from Star Wars: A New Hope sold for $402,500, nearly twice the expected price. More impressive, back in 2005, a lightsaber used by Mark Hamill in the same film sold for $200,600, three times its estimate. That first-gen weapon (the one lost along with most of Luke’s forearm in the showdown with Vader at Cloud City) was fashioned by set decorator Roger Christian out of an old flashgun handle for a Graflex camera, along with other doodads. This one, Luke’s green-bladed Excalibur, was a new design crafted for Jedi. But this saber wasn’t built piece by piece—it’s a casting. In this process, a silicone mold is made of the original prop, then that mold is used to produce identical copies in hard rubber, resin, and even metal. Castings are often used in place of hero props in stunt scenes so the detailed original doesn’t get damaged. This resin casting was used in the Sarlacc sequence at the Great Pit of Carkoon.

Prop: T-800 | Film: Terminator 2 (1991) | Designer: Stan Winston | Materials: Plastic, copper paint, nickel and chrome electroplating | Value: $488,750 | Photograph: Dan Winters

Every generation has its childhood demons. The release of The Terminator in 1984 introduced a new bogeyman to the silver screen (and VHS): the T-800. Seven years later, the film’s sequel, Terminator 2: Judgment Day, cemented the reputation of the crimson-eyed grim reapers. Only four of these “puppets” were made for T2: two articulating heroes (capable of gross body movement, plus head and facial movement), and two “stunts” (nonarticulating, but designed to take more punishment). An original, full-scale T-800 endoskeleton sold at auction in 2007. Bidding started at $80,000 and topped out at $488,750, crushing the pre-auction high estimate of $120,000. Why so much for a shiny puppet? Because it was a screen-used hero T-800, one of the models that saw action when the cameras were rolling. Also, the T-800 happens to be Stan Winston’s Mona Lisa. The late designer’s FX wizardry is part of Hollywood lore: Jurassic Park III, Aliens, Predator, Predator 2, A.I., Edward Scissorhands. One of his four Oscars (Best Visual Effects, 1992) is thanks to this 6′ 2″ animatronic skeleton. The second-gen T-800 is made mostly of plastic that’s been electroplated. How do you electroplate a nonconductive material like plastic? By spraying the plastic with a high-particulate, conductive copper paint, then submerging the pieces in an electroplating bath, first nickel, then chrome. Although this added more weight to the puppets, it made the finish more durable. Huge weight savings were realized elsewhere—50 pounds’ worth—because the harder exterior eliminated the need for internal steel supports. This light and nimble design allowed a puppeteer to crash a stunt T-800 through a breakaway wall or wreak havoc on the Future War battlefield without having to worry about bits of chrome flaking off. Sweet dreams, puny humans.

Prop: Proton pack | Film: Ghostbusters (1984) | Designers: Stephen Dane and Ivan Reitman | Materials: Fiberglass, aluminum, lights, rubber tubing, and computer parts | Most Recent Selling Price: $169,900 | Photograph: Dan Winters

There’s no denying the cultural significance of Ghostbusters. Now more than three decades old, the original film still resonates like a giant tuning fork. Which goes a long way toward explaining why the proton pack is so revered by prop collectors. After all, who wouldn’t want their own portable unlicensed nuclear accelerator? Inspired by a military-issue flamethrower, “hardware consultant” Stephen Dane purchased a backpack frame from an army surplus store in Hollywood and made a rough prototype. After director Ivan Reitman added his tweaks, a cinematic legend was born. The molded fiberglass shell is attached to an aluminum backplate, which was then bolted to a US Army–spec backpack frame. Dane added paint, aluminum warning labels (“Danger: High Voltage 1KV”), flashing lights, crank knobs, and enough electronic parts to make the thing pop onscreen. Most of those components have been identified thanks to hi-res photos on prop sites: Sage and Dale resistors, Clippard pneumatic tubing, Arcolectric indicators, and Legris banjo bolts (on the neutrona wand). It’s as heavy as it looks—with the battery, a hero weighs more than 30 pounds. To ease the load on the actor’s shoulders, two lighter versions were available for use during filming: a gutted “semi-hero,” with some cast surface details (for wide shots) and a bantam-weight “stunt” made of foam rubber (for action scenes). Four years ago, a screen-used hero proton pack was added to the Lanigan collection. Price: $169,900. Congrats Dan, but remember: Don’t cross the streams. It would be bad.

Prop: Aries 1B Translunar space shuttle | Film: 2001: A Space Odyssey (1968) | Designers: Harry Lange, Fred Ordway, and others | Materials: Wood, plexiglass, acrylic, steel, brass, aluminum, plastic | Most Recent Selling Price: $344,000 | Photograph: Dan Winters

Stanley Kubrick’s masterful tale of human evolution catapulted the humble sci-fi genre from B-movie fodder to serious art, thanks largely to the groundbreaking visuals pioneered by the auteur director and his FX master, Douglas Trumbull. The miniature models used in the eerily realistic space travel scenes are of particular interest to collectors because of their intricate design—aerospace engineers were consulted on the production of each model. Most of the original props were destroyed, but one of the 2001 miniatures survived: the screen-used Aries shuttle that transports Dr. Heywood R. Floyd from the space station to the Clavius excavation site on the moon. In 1975, the prop found its way to one of Kubrick’s neighbors, a Hertfordshire public school teacher, who used it as a show-and-tell exhibit for art students. When the prop was eventually consigned to auction in 2015, the final paddle price greatly exceeded the expected high mark of $100,000. The winning bidder, at $344,000, was the Academy of Motion Picture Arts and Sciences. It will be restored before being displayed at the new Renzo Piano-designed Academy Museum, which opens in 2018. The hulking Aries model—it weighs about 100 pounds and measures 94 inches in circumference—is made of wood, blown plexiglass, and various metals, finished with plastic bits cherry-picked from off-the-shelf scale-model kits. These hobby-model parts provide the detail, texture, and depth necessary for close-up FX photography with large-format cameras. Look closely and you’ll also see wires, tubing, flexible metal foils, decals (“Battery Location Point Here”), and plenty of heat-formed plastic cladding. Although the internal mechanicals were removed many years ago, the gears that control the four landing legs still function flawlessly. The virtuosic scene in 2001 starring this long-lost orb is the reason Mission Control still has The Blue Danube Waltz in heavy rotation on its wake-up playlist for ISS astronauts.

Prop: Phaser | Show: Star Trek (1966-1969) | Designer: Wah Chang | Materials: Aluminum, brass, popsicle sticks, acrylic tube, fiberglass, cast resin | Value: $200,000 | Photograph: Dan Winters

There are plenty of bogus or knockoff Star Trek props in circulation, but there’s nothing fake about this original series phaser. The provenance is stellar: purchased by a prop artist directly from Paramount in the 1970s. It’s an ultra-rare hero constructed mostly of aluminum, fiberglass, and cast resin. The handle is a hand-painted brass tube embellished with popsicle sticks. (Yes, really. Look closely.) There were other phasers made, including midgrade fiberglass models for longer shots and VacuForm plastic ones for Kirk to use when clubbing Klingons. But this is the most intricate variant used for close-ups. Only two were made, so this specimen is worth a bundle. The owner isn’t selling anyway. It’s part of a massive sci-fi prop collection that includes classics like a prized space suit from 2001. If you must have a phaser of your own, there’s always the forgery market.

Prop: The Samaritan | Film: Hellboy (2004) | Designer: TyRuben Ellingson | Materials: Painted urethane | Estimated Value: $10,000 to $15,000 | Photograph: Dan Winters

Some props are sketched by a conceptual artist and painstakingly assembled by union craftspeople piece by piece. Many more, though, are simply castings. This is particularly true of movie prop firearms. Matt Damon can’t pistol-whip a bad guy with a real Sig Sauer 9-mm hero gun in The Bourne Identity. A “live gun” is used strictly for close-ups and shooting blanks, where filming anything but an actual Sig just won’t do. To pull off a pistol-whip scene, the prop department must cast a Sig Sauer stunt gun out of soft rubber. Guns are also cast in hard rubber, resin, and even metal depending on what function they need to serve in the film. In the prop collecting community, castings and recastings (castings of castings) are highly contentious subjects. “If you look for cheap movie prop kits or ‘raw castings’ on eBay, you’ll find hundreds of people all over the world who bought some shitty rubber prop and made it shittier by recasting it,” says former Lucasfilm VFX designer and MythBusters host Adam Savage. “Because every time you cast something, each successive generation gets crappier.” So when Savage decided to add the comically oversized Samaritan handgun to his prop collection, he went straight to the source: Guillermo del Toro, director of the Hellboy franchise. Unlike a lot of iconic props, there aren’t many genuine Samaritan castings on the market. Del Toro owns the only hero Samaritan, which was cast in aluminum by the famous Weta Workshop in New Zealand. He also had a spare screen-used hard rubber Samaritan casting, which he traded straight up for a casting of Adam Savage’s immaculate scratch-built Blade Runner PKD blaster. A perfect clone of visual designer TyRuben Ellingson’s original concept for the film, the Samaritan is one of the heaviest stunt handguns ever cast. “My Samaritan weighs 5 or 6 pounds,” Savage says proudly. “Guillermo had the stunt guns cast in hard rubber because he wanted them to feel heavy when [Hellboy star] Ron Perlman picked them up.” The Weta detailing is so accurate that this thing could pass for the hero Samaritan in a tight shot. “The gravitas and veracity of this prop is exceptional,” Savage says. “It feels luxurious to hold.”

Rene Chun is a frequent WIRED contributor. He wrote about the SFMOMA redesign in issue 24.05.

This article appears in the March issue. Subscribe now.
