REVIEW: FOCALS SMART GLASSES BY NORTH
Peer through these custom-fitted smart glasses and you can almost see the future.
WIRED: Finally, smart glasses that look almost like real, if slightly thick, eyeglasses. The custom design and fitting process ensures the display is aligned with your eyes. Works with Alexa. Solid battery life.
TIRED: Thick arms and wide frames mean they don’t look exactly like regular glasses. Heavy and slightly uncomfortable to wear for long stretches. The Focals messaging experience on iOS is unwieldy; the navigation app doesn’t offer enough context to be useful.
ONE NIGHT RECENTLY I dreamed I was thumbing the little microcontroller for my smart glasses. It wasn’t the first time technology had seeped from my everyday life into my subconscious mind. There was that time I dreamed up a giant Motorola smartwatch with a kickstand, and the time Justin Bieber pitched me on a new peer-to-peer payments app. But this controller, a tiny joystick on a thick loop around my forefinger, was so satisfying. So addictive. I couldn’t stop playing with it, like a pimple that’s ready to pop.
The controller is an accessory to a new pair of smart glasses called Focals. They’re made by North, a Canadian startup. As the story goes, North’s founders gave Google Glass a go back in the day, tried rigging the head-up display to work with a gesture-based armband they themselves had built, and then determined they could make better glasses. Armed with over $100 million in funding from Intel Capital and Amazon’s Alexa Fund, North toiled for four years to build Focals.
North’s focus was twofold: First, design and make their own miniaturized components, and second, release a pair of smart glasses to the world that actually look like glasses. And the Focals are tantalizingly close to that. But they also cost $999 and require custom fitting at one of the company’s temporary stores in Brooklyn, Toronto, or wherever their newly launched mobile pop-up might appear. And then there’s the whole experience of wearing them, which I did, and then didn’t, and then did again; all the while peering through the lenses for glimpses of the future.
The first time I wore Focals, one of my WIRED editors said he hated the future.
Specifically, he said he hated a future in which he’s chatting with a coworker and she’s getting notifications about the Golden Globes in front of her eyeballs. We were in a cab on our way to a press event at CES, the annual consumer electronics fest in Las Vegas, and he might not have noticed I was wearing smart glasses had I not said something.
Long hair does wonders to hide the thick arms of the Focals. Even if they’re not obscured, the Focals look distinctly less cyborg-like than other smart glasses on the market (which is still mostly limited to techno-optimist societies). While the arms of the Focals are made of aluminum, most of the product is made with nylon thermoplastic, which is very similar to the acetate that’s used in so many pairs of eyeglasses. The ones I’ve been wearing have a classic silhouette and a tortoiseshell frame.
There are other parts of Focals that give them away as smart specs. There’s the projector squeezed into the right arm, the orb that glows on the right lens, the chunky black ring you need to wear to scroll through the information that’s beamed into your eyes. (One of the learnings the Focals creators took away from Google Glass: They didn’t like the experience of having to reach up and touch a swipe pad.) But while I was wearing Focals, not everyone noticed them, which is to say not everyone noticed they were smart glasses. North gets kudos for that.
There’s a lot of technology that goes into making these smart glasses smart. But the more important question for a lot of would-be wearers: Why? Why do I need smart glasses? What do they do, exactly?
The Eyes Have It
Most smart glasses are called as much because they transfer some of the experience you’d have on another computing device—a smartphone, a PC—onto a display in front of your face. Many are labeled “AR glasses,” but each pair augments the world in front of you to varying degrees. Focals are not designed to project an NBA game (like Magic Leap’s One) or to run Autodesk apps (like Microsoft’s HoloLens head-up display) on your face. Instead, Focals are supposed to mirror the notifications you’d get on your phone, with some voice control and navigation applications thrown in for extra utility.
Also, these don’t have a built-in camera (unlike Snap’s Spectacles). Whether they have one is the number one question people have asked when I’ve revealed that I’ve been wearing smart glasses. Focals do have a microphone, though, so you can talk to Alexa.
The Focals experience is largely powered by your smartphone. They pair over Bluetooth with both iOS and Android phones. If a text message comes through on your phone, it will be shown on your Focals display. Same with an incoming phone call. If you get a notification from Apple News on iOS, it will show up on your Focals.
The accompanying ring, called the Loop, is what allows you to clear those notifications, to scroll from page to page, to respond to texts with shortcut responses or voice-dictated messages. When you wake up your Focals or select a menu option using the Loop’s tiny nub of a joystick, you hear a delightful click in your ear. The Loop is what I dreamed about. While I was wearing Focals, I couldn’t stop fiddling with the Loop on my forefinger. It is kind of ugly, and insanely satisfying to use, in the way body-focused repetitive behaviors can be.
While I was at CES, I realized that Focals’ killer app might just be its simplest: Telling the time. One quick push on the joystick would show the date and the time of day. It’s a useful thing to have in front of you while you’re at a massive conference, trying to make it to your meetings on time. I also saw news app notifications and, for better or worse, text message notifications. I say “for better or worse” because a group thread with friends about Kohler’s new smart toilet quickly devolved into an ongoing conversation about poop, and I couldn’t avoid it. (“Too bad you didn’t get that scoop, Lauren.”) Fortunately, the Focals only show text, not multimedia messages.
There are two voice-control protocols built into Focals: a homegrown one that lets you respond to text messages, and Amazon’s Alexa. If you long-press on the Loop joystick, you’ll summon Alexa, though it’s a version of the virtual assistant that’s tailored to smart glasses. For example, you can’t ask it to stream music, but you can ask it for factoids. Like “What time is it in Germany?” which is one of the things I asked my glasses before I made a phone call last week. You can ask Alexa for the weather, but that information is available as visual imagery on the glasses as well.
I also used the Focals’ built-in navigation feature, which relies on data from Mapbox and Foursquare and is designed for walking navigation. If the Focals detect that you’re driving, the glasses are disabled. You can get around this by indicating that you’re a passenger, but the company is careful to say that these are not meant to be used while driving. Anyway, when it comes to the walking navigation, I found it lacking. It was a series of sparse commands (walk 200 feet, turn left here), and I ended up using my phone again after a few minutes. In fairness, there is only so much information you can fit on a small head-up display.
When North was designing the glasses, the team was determined not to build something with fixed, flat lenses and an LED micro-projector. Flat glasses, they say, are obviously not “real” eyeglasses; plus, you need a curvature of the lens in order to offer prescription lenses, which North says are in the works. So North built its own micro-projector, which sits in the right arm of the glasses, and created a “holographic” film that goes inside the right lens. The projector spits out rays of light, which bounce off the thin film in the lens and go directly into your eye, which is how you see the visual imagery on Focals.
The result is a flat, 2D image that appears in front of your eyes, somewhere around the left shoulder of the person you might be talking to. It’s not volumetric in any way. You can’t walk up to the imagery, walk around it, manipulate it. But as basic as the Focals graphics are, they are surprisingly crisp.
The fact that each pair of glasses is custom-fitted helps. Wearable technology is a personal thing, and it’s a very real challenge to make a one-size-fits-all product. North requires Focals customers to sit for 3D scans in its stores and then pick up the glasses in person to have them adjusted. I was able to do this when I was in New York City over the holidays, and then company representatives heated and shaped the glasses for me when I met with them at CES. But this process severely limits the potential customer base for Focals.
As convenient as it was to reply to text messages without having to look at my phone, many of my text message exchanges were broken by Focals. This is largely because I had the glasses paired with an iPhone, and iOS has restrictions around how Messages are used by third-party apps. In order for North to craft and send a shortcut response to your Messages, it has to suck up a copy of the message into its servers (the company claims these are anonymized and encrypted) and send them back as an SMS. So every time I responded to an incoming text message through Focals, it created a new SMS thread in my Messages inbox. To make things even more awkward, each new SMS thread started with “Hey, it’s Lauren Goode. I’m messaging you from my Focals by North.”
The battery life on the glasses is expected to be around 16 hours on a single charge, and the Loop’s battery should last a few days. A lot of this depends on how intensively you use the Focals. The first time I wore them, they didn’t last 16 hours. They lasted through Sunday evening, a few hours on Monday morning, and another few hours on Tuesday morning before I got a low battery alert at 11:15 am. The case for the glasses doubles as a charging pod, which means you could, in theory, not have to worry about battery life if you plop them into the case each night. The case, however, is bulky as hell. It’s about the size (but not the weight) of a brick.
Wearing smart glasses doesn’t just spark questions about how long the battery lasts, or how your glasses handle iOS messaging, or what questions you absolutely can’t ask Alexa. Wearing smart glasses raises critical questions about how technology fits into our everyday lives, and whether we’re opening ourselves to the natural evolution of technology or trying really darn hard to shoehorn it in. Smart glasses sit on our faces; they are literally bumping up against our humanity.
At times when I was wearing Focals, I would look down—at a book, at my notes, at another darn screen—and the glasses would slip precariously down my nose. Custom-fitting didn’t help so much there. Other times I would look up and there would be some notification I was happy to see. Or, you know, just the time of day.
I cannot say in good conscience that a person should fly to New York City or Toronto and spend $1,000 on a pair of Focals. But if you were already considering it—if you’ve already done it!—then know that you might be able to wear them and not have people call you out for wearing smart glasses. That you might find their greatest utility to be a simple one. And that you might get the urge to take them off before the day is over, because they’ll feel heavy after a while.
Bottom line: Focals don’t fulfill the ultimate smart-glasses dream, but they come closer than anything I’ve worn yet.
Analysis: “The Era of Deep Learning Is Coming to an End”
Nothing lasts forever.
Out With The Old
Artificial intelligence developers may soon find themselves on the brink of a paradigm shift. Deep learning has dominated the field for several years — but may be on its way out.
The field of AI has shifted focus roughly every decade since it began in the 1950s. Now, a new analysis suggests that the 2020s will be no different.
A team from MIT Technology Review scanned all 16,625 research papers published between 1993 and November 18 in the artificial intelligence section of arXiv, an open-access repository for sharing research.
MIT Tech found that research on machine learning started to pick up over the last 20 years — rapidly increasing in prevalence since about 2008 — but now that research fervor seems to be dying down.
Many of the new developments in artificial intelligence that we hear about nowadays are actually just applications of machine learning techniques that have been hammered out for years.
And as the research community’s attention shifts from deep learning, it remains unclear what will take its place, according to MIT Tech. In the past, older types of artificial intelligence that didn’t really take off when they were first developed later resurfaced and took off with new life. For instance, scientists first developed machine learning decades ago, but it only became commonplace about a decade ago.
MIT Tech didn’t predict what will come next. It may be that some form of existing technology will finally hit its stride, but it’s also possible that an AI engineer will develop some brand-new type of AI that’ll shape the future.
Earlier today, Apple released iOS 12.2 beta for developers, which includes new features such as Apple News for Canada, finally.
Now, 9to5Mac has unearthed a new setup screen that “clearly states that the user will be able to talk to Siri with AirPods or iPhone by saying ‘Hey, Siri.’”
Between your ears sits perhaps the most complex piece of biological machinery on the planet: an all-in-one computer, simulator, and creation device that operates out of a squishy, folded gray mass. And scientists aren’t quite sure how it works.
Gül Dölen, assistant professor of neuroscience at the Brain Science Institute at Johns Hopkins, thinks that neuroscientists might need to take a step back in order to better understand this organ, which evolved in various forms in nearly every species of animal on Earth. Slicing a few brains apart or taking a few MRIs won’t be enough to get to the bottom of how these organs function. Instead, it might require a comparative approach: the most advanced catalog ever created. Dölen, who recently made headlines for her work giving MDMA to octopuses, would love to see neuroscientists band together to create a periodic table of the brain. And not just the human brain, but all brains.
She explained her ambitious experiment idea to Gizmodo:
“The periodic table of elements is remarkable. Whenever I look at it, I am amazed and awestruck all over again! Think of it: Just by knowing the number of electrons in the outer shell of an atom, you can deduce physical properties of the element, like is it a gas or a metal, and what’s more, you can use this information to make predictions about unknown properties of elements, and even predict the existence of elements that have yet to be found on Earth. Having the periodic table doesn’t solve all of the puzzles of chemistry, but it certainly gives us the outer border of the puzzle. In neuroscience we don’t have anything like that.”
Dölen compared present-day neuroscience to “somewhere between the ancient Greeks’ recognition of four elements and the medieval alchemists trying to change lead into gold.”
Does that sound like hyperbole? Well, consider that neuroscientists can’t even agree on the brain’s most basic information-carrying unit. Perhaps it’s the average electrical field, or maybe it’s action potentials—the electrical output of single brain cells, or neurons. Maybe it’s the combined electrical activity that neurons collect from the other neurons, which they use to determine whether to fire or not. Or maybe it’s the chemicals inside the cells. All of these ideas require different kinds of measurement, like blood-flow monitoring fMRI machines, action potential-detecting electrodes, voltage sensors for measuring the electrical activity before a neuron fires, and protein-detecting systems. Then there’s the blossoming field of genetics, which is also helping determine how the brain might work.
But perhaps each of these different measurements is just one of the many properties of brains that must be catalogued. They’re equivalent to properties like whether an element is a solid or gas at room temperature, how much energy the atom needs to lose an electron, its radius, atomic weight, and configuration of electrons. But there are many kinds of brains out there. “Right now, our focus on just 5 species (humans, mice, fish, flies, and worms) really limits our ability to see the patterns,” she said. “It’s as if you were trying to figure out the organization of the periodic table by just looking at hydrogen, carbon, helium, oxygen, and gold.”
Attempts to create general rules for how certain brain properties can predict intelligence often fall apart, Dölen explained. We once thought that brain size could predict intelligence—but sperm whales have much larger brains than humans. Then, we thought the ratio of brain size to body weight would predict intelligence—but tree shrews have a larger size-to-weight ratio than people. She pointed out that massive datasets have allowed scientists to create a more accurate picture. For example:
“Suzanna Herculano-Houzel’s lab actually developed a systematic method to count the neurons across over 500 species all across the tree of life. What they found is that, broadly speaking, the number of neurons scales with ‘intelligence,’ and that across different evolutionary lineages, the size of the brain is related either to the size of the neurons or to the number of neurons. So, for example, comparing the human brain to other primates, as the brain gets bigger, the number of neurons increases. But for rodents like mice and rats, as the brain gets bigger, the size of the neurons gets bigger. This huge data set also allows them to look at relationships between neuron number and intelligence, longevity, senility, sociality, etc.”
Dölen compared these insights to the comparative approaches behind the periodic table—once you find the proper patterns and line everything up, the table itself can make predictions. That was perhaps the periodic table’s most profound use: By simply arranging the atoms in a specific way based on their properties, chemist Dmitri Mendeleev was able to accurately predict the existence and properties of three undiscovered elements based on the holes in his table. Dölen hopes a massive catalog of the properties of as many brains from as many species as possible, arranged in some pre-determined order, will reveal revolutionary insights about how brains work.
Ultimately, our understanding of brains is limited by our own humanity. “Because we can build cellphones but mice can’t, we define mice as less intelligent,” said Dölen. “However, compared to mice, humans are morons when it comes to smell intelligence (indeed mice have about 2,000 extra genes for detecting smell compared to humans). Similarly, mantis shrimp have 14 photoreceptors compared to our three, and so are likely to have much greater visual ‘intelligence’ than we do.”
Maybe it’s things that humans don’t always associate with smarts, like sociality, that actually lead to intelligence as we understand it. And maybe it will take lining all these brains up and looking for patterns to make universal rules about how they work.
Memory loss is one of the hallmarks of Alzheimer’s disease.
But it may be possible to reverse the symptom, restoring people’s memories, according to researchers.
Research published in the journal Brain suggests that a new approach using epigenetics (the study of chemical reactions and factors that influence gene activity without changing the DNA sequence, effectively switching genes on and off) can reverse memory decline.
Scientists used mouse models carrying the gene mutations for familial Alzheimer’s, where more than one member of a family had the disease, as well as post-mortem brain tissue from Alzheimer’s patients.
Prof Yan said a key reason for cognitive decline in Alzheimer’s patients is the loss of glutamate receptors, which are critical to learning and short-term memory – itself the result of an epigenetic process known as repressive histone modification.
She said: “We found that in Alzheimer’s disease, many subunits of glutamate receptors in the frontal cortex are downregulated, disrupting the excitatory signals, which impairs working memory.”
By understanding that process, scientists could identify potential drugs that would reverse the process.
“Our study not only reveals the correlation between epigenetic changes and Alzheimer’s, we also found we can correct the cognitive dysfunction by targeting the epigenetic enzymes to restore glutamate receptors,” said Prof Yan.
“In this paper, we have not only identified the epigenetic factors that contribute to the memory loss, we also found ways to temporarily reverse them in an animal model of Alzheimer’s Disease.”
Researchers injected the Alzheimer’s animals three times with compounds designed to inhibit the enzyme that controls repressive histone modification.
Prof Yan said: “When we gave the Alzheimer’s animals this enzyme inhibitor, we saw the rescue of cognitive function confirmed through evaluations of recognition memory, spatial memory and working memory.
“We were quite surprised to see such dramatic cognitive improvement. At the same time, we saw the recovery of glutamate receptor expression and function in the frontal cortex.”
Future studies will be needed to achieve long-lasting results, Prof Yan said, but she said an epigenetic approach works well for brain disorders such as Alzheimer’s because it allows the control of many genes rather than just one.
She added: “An epigenetic approach can correct a network of genes, which will collectively restore cells to their normal state and restore the complex brain function.”
How Beyond Meat became a $550 million brand, winning over meat-eaters with a vegan burger that ‘bleeds’
In 2018, U.S. consumers ate roughly 13 billion burgers, according to data from consumer trends market research company NPD Group. And burgers are consistently one of the most popular items on menus across the country.
People clearly love them.
So what is it about burgers? Maybe it’s the juiciness people can’t resist, or that distinctive savory umami flavor. Maybe it’s the American-ness of it all.
Plant-based “meat” producer Beyond Meat is betting on it: The company is taking on the beef burger with Beyond Burger, a vegan patty that is meant to look, cook, taste and even “bleed” like red meat, but that is healthier and more sustainable.
“The burger is something people love,” Ethan Brown, founder of Beyond Meat, tells CNBC Make It. “And so we went after that core part of the American diet.”
It’s working in a big way.
The company has famous investors like Bill Gates and Leonardo DiCaprio, as well as former McDonald’s CEO Don Thompson and America’s largest meat processor, Tyson Foods.
And since their debut at Whole Foods in May 2016, Beyond Burger patties have made their way into tens of thousands of supermarkets (from Kroger and Safeway to Whole Foods), restaurants (from TGI Fridays to Carl’s Jr.), hotels (like The Ritz-Carlton, Hong Kong) and even sports stadiums (like Yankee Stadium).
Beyond Meat says it has sold 25 million Beyond Burgers worldwide. The company recently filed for an IPO and is reportedly worth more than half a billion dollars.
Just don’t call Beyond Burger a veggie burger. It may be 100 percent plant-based (and GMO-, soy- and gluten-free), but this vegan patty is meant for meat-eaters too.
“[W]e’re reaching mainstream consumers that are interested in healthier forms of meat,” Brown tells CNBC Make It.
To accomplish a juicy, meat-tasting product that carnivores will crave, Beyond Meat biophysicists figure out, at a molecular level, what it is that makes meat taste and behave like meat. They then identify plant materials that behave the same way, to replicate it.
So “we like to think of meat, not from its origin — say from a chicken or a cow — but in terms of … the proteins, the carbohydrates, the lipids, the minerals and vitamins, all of which are available — except for cholesterol — in the plant kingdom,” says Beyond Meat biophysicist Rebecca Miller.
The lab technicians at Beyond Meat’s research and development lab in El Segundo, California, are even trained meat sommeliers, and they are constantly innovating on the product.
The main ingredients in the original Beyond Burger are pea protein, beet coloring and beet juice to make it “bloody,” and potato starch and coconut oil to create juiciness. Beyond recently launched its 2.0 burger (available only at Carl’s Jr. and A&W restaurants for now), which also includes brown rice and mung bean proteins, for a meatier taste and texture, according to the company’s website. Each 4-ounce Beyond Burger patty has 20 grams of protein and about 20 grams of fat, which is comparable to a beef patty.
Many are huge fans of Beyond Burger, which, like a beef burger, can take grill marks, cooks slightly pink in the middle and releases juices when you bite in.
“It’s so meaty, it’s almost kind of freaky,” says vegan mom Erin Landry on her @mrs.modernvegan Instagram, after trying Beyond Burger at a Carl’s Jr. drive through.
“I’m not vegan … but I promise, this is actually really good,” says meat-eating music producer That Orko after taste-testing Carl’s Jr. Beyond Burger on pop singer Miss Krystle’s YouTube channel.
Two CNBC Make It staffers who tried Beyond Meat products also liked the burger but were even more impressed with its Beyond Sausage. “The burger is very tasty,” but the sausages, “they could be real,” says producer Mary Stevens.
Still, some critics say Beyond Burger, which is processed (the plant ingredients are put through heating, cooling and pressure to turn them into a meaty substance), is no more than vegan junk food. (“It’s a process we’re proud of, and one we feel consumers are more comfortable with vs. the process of traditional livestock production,” says Allison Aronoff, Beyond Meat’s senior communications manager.)
And although eating plant-based “meat” is in many ways healthier than eating red meat, it can be higher in sodium than beef, says dietitian Jen Bruning. (One Beyond Burger patty has 380 milligrams of sodium according to the company website; for comparison, Wegmans 80/20 ground beef patties have 90 milligrams per patty; the average fast food single-patty burger has 378 milligrams of sodium.)
Whatever your burger pleasure, targeting meat-eaters is a smart move — there are way more of them than there are vegans and vegetarians. Only 5 percent of Americans identify as vegetarian and 3 percent vegan, according to a 2017 Gallup poll. Those numbers haven’t changed much in the last decade or so.
Brown says the company found that 93 percent of the consumers in conventional grocery stores who are buying a Beyond Meat product are also putting animal meat in their basket. “So they’re buying not only plant-based meat, but they’re buying animal meat, and that’s a really important breakthrough for us,” Brown tells CNBC Make It.
One tipping point in bringing plant-based “meat” to the masses has been the increase in product quality thanks to brands like Beyond Meat, James Kenji López-Alt, chef/partner at Wursthall restaurant in San Mateo, California, tells CNBC Make It.
“Tens of millions of dollars have been invested into researching this product and making it better and making it more real meat-like. And I think we are … 99 percent of the way there,” he tells CNBC Make It. “It’s close enough that people eating it enjoy it the same way that they enjoy actual ground beef.”
Plus, he says, prices have “reduced drastically” to about the same amount as meat. (At Bareburger restaurant in the Hell’s Kitchen neighborhood of New York City, a Beyond Burger costs $12.95 and a comparable beef burger is $11.99. At the grocer, Beyond retails for about $5.99 for two patties, while four Wegmans patties retail for about $5.44 online.)
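A quick back-of-the-envelope check of the grocery prices quoted above, on a per-patty basis (a sketch only; the dollar figures are the ones reported, and prices vary by store):

```python
# Per-patty cost comparison using the retail prices quoted above:
# Beyond Burger at $5.99 for two patties, Wegmans beef at $5.44 for four.
beyond_per_patty = 5.99 / 2   # two patties per pack
beef_per_patty = 5.44 / 4     # four patties per pack

print(f"Beyond: ${beyond_per_patty:.2f} per patty")
print(f"Beef:   ${beef_per_patty:.2f} per patty")
print(f"Ratio:  {beyond_per_patty / beef_per_patty:.1f}x")
```

By this rough math, Beyond still runs a bit over twice the per-patty price of the beef in the example at the grocer, even as restaurant prices sit within a dollar of each other.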
All this has made Beyond Meat big business.
Beyond Meat products are in more than 32,000 grocery stores, including Kroger, Safeway, Publix, Target and Wegmans. And Beyond Burger is on menus from TGI Fridays and Del Taco to Hamburger Mary’s Bar and Grill to upscale Brasserie Ruhlmann in New York City; it’s served at universities from Ohio State to Harvard and even theme parks like Legoland.
While TGI Fridays declines to share sales data, its senior director of food and beverage innovation David Spirito tells CNBC Make It that Fridays has guests saying they came to Fridays specifically for the Beyond Burger.
And burgers are not the only plant-based “meat” Beyond Meat sells. It also sells sausage, chicken strips and beef crumbles, and has other products in the works.
“We want to make bacon, we want to make steak, we want to make the most intricate and beautiful pieces of meat,” says Brown.
In November, Beyond Meat filed for a $100 million initial public offering, reporting a 167 percent increase in revenue (to $56.4 million) for the first nine months of 2018 from the same period in 2017.
The company has grown from a $4.8 million valuation in 2011 to $550 million in November 2017, when Beyond Meat closed its latest ($55 million) round of funding, according to private market data company PitchBook. In addition to Gates, DiCaprio and Tyson, notable investors include Twitter co-founders Biz Stone and Evan Williams, Honest Tea founder Seth Goldman, venture capital firm Kleiner Perkins and the Humane Society of the United States.
But plant-based meat is not only lucrative, it’s good for the environment.
Beyond Meat was started in 2009 by Brown, who was once a carnivore; growing up around his family’s farm in Western Maryland had an impact.
“I spent a lot of time there with dairy cows, so I was very close to animals growing up, loved them and was fascinated by them.”
Passionate about the environment, Brown pursued a career in clean energy to help mitigate the effects of climate change. “But I began to realize that livestock had a larger contribution to the climate than many other things that I was working on in terms of the emissions,” he tells CNBC Make It.
Indeed, 3 percent of U.S. greenhouse gas emissions come from methane emitted by cows. And it takes an average of 308 gallons of water to produce just 1 pound of beef, according to the USDA. Raising livestock for meat and dairy also depletes farmland.
In fact, eliminating or reducing consumption of such products “is probably the single biggest way to reduce your impact on planet Earth, not just greenhouse gases, but global acidification, eutrophication, land use and water use,” said University of Oxford’s Joseph Poore, co-author of a recent study analyzing the environmental damage of farming.
Producing Beyond Burgers uses 99 percent less water, 93 percent less land, creates 90 percent fewer greenhouse gas emissions and requires 46 percent less energy than producing beef burgers, according to a September report commissioned by Beyond Meat. The report, which measures the environmental impact of a quarter-pound Beyond Burger as compared to a quarter-pound U.S. beef burger, was conducted by the Center for Sustainable Systems at the University of Michigan.
And of course, plant-based “meat” production does not entail any inhumane treatment of animals, something that plagues factory farming.
For all these reasons, in 2018, Beyond Meat was a co-winner of the United Nations’ Champions of the Earth Award, in the Science and Innovation category. The other winner? Impossible Foods.
Beyond Meat isn’t the only plant-based “meat” game in town. Impossible Foods, which launched in 2011 and is headquartered in Redwood City, California, also makes its products entirely from plants.
Impossible Burger uses heme, a genetically engineered iron-containing molecule, for the taste and aroma of meat. It is available at White Castle (the $1.99 slider) and at other restaurants in the U.S. and Hong Kong, and the company plans to hit grocery stores this year. Actor Kal Penn (who appropriately starred in “Harold and Kumar Go to White Castle,” pre-vegan sliders) and Microsoft co-founder and billionaire Bill Gates have invested in the company. Impossible Foods was valued at $350 million in January 2018, according to PitchBook.
Another emerging company in the space, San Francisco-based Memphis Meats, is growing animal meat in the lab. Launched in August 2015, Memphis Meats has raised money (reportedly over $20 million) from the likes of Gates and Richard Branson. Unlike Beyond Meat and Impossible Foods, Memphis Meats uses harvested animal cells to grow its product, which is known as “clean meat.”
But Brown says Beyond Meat’s biggest competitor “really [is] the meat industry itself.”
U.S. retail sales of plant-based “meats” grew by 24 percent in 2018, while animal meats grew by just 2 percent, according to Nielsen data commissioned by the Plant Based Foods Association. The market for meat substitutes is expected to grow to $6.4 billion worldwide by 2023. And Brown and others believe alternative meats are the future.
“In 30 years or so, … I think that in the future clean and plant-based meat will become the norm, and in 30 years it is unlikely animals will need to be killed for food anymore,” Branson wrote in a blog post in February.
David Lee, Impossible Foods’ chief operating officer and chief financial officer, agrees. “Pat Brown [Impossible Foods founder and CEO] puts it very nicely,” Lee tells CNBC Make It. “He says, ‘You know, one day children everywhere will look up at their parents and say, “What? You used to eat meat from animals? How strange.”’”
Lee says the goal is to give meat-eaters everywhere “something that tastes better but it’s better for them, that is better for the environment.”
It’s the goal of Beyond Meat as well.
“Someday, I think plant-based meat will overtake animal protein as the main source of meat,” Brown tells CNBC Make It. “I do believe it will happen in my lifetime.”