http://www.cnn.com/2017/04/21/health/naked-mole-rats-oxygen-study/

Naked mole-rats: The mammals that can survive without oxygen

A study found that naked mole-rats can switch to a metabolic system that doesn't require oxygen.

Story highlights

  • Naked mole-rats survived without oxygen for 18 minutes and recovered fully, one study shows
  • They use a plant-like method of creating energy without the need for oxygen

(CNN) One of the world’s most bizarre and remarkable animals is surprising scientists once again.

The naked mole-rat is already known to be cold-blooded, resistant to cancer, insensitive to certain kinds of pain, and able to live ten times longer than a mouse. The animal is also one of the strangest-looking mammals on the planet.
But now scientists believe these odd creatures can also survive without oxygen, by using a different method of creating energy in their bodies.
“Naked mole-rats are very social animals and typically live in underground colonies of up to 200 in East Africa,” Gary Lewin of the Max Delbrück Center for Molecular Medicine told CNN. Lewin co-led the study published Friday in Science.
With so many animals crowded into such a small living space — all of them taking in oxygen and producing carbon dioxide — they are used to oxygen-deprived environments, Lewin explained.
“We wanted to test systematically just how much oxygen deprivation these animals can stand,” he said.
Gary Lewin, co-author of the study, with a naked mole-rat

The team first tested the animals in an environment with 5 percent oxygen. “Anything less than 10 percent kills a human,” Lewin pointed out.
But the rodents were hardly affected at all, even after several hours of reduced oxygen.
The next test was to put the rodents in an environment with no oxygen at all.
“The animals quickly went to sleep,” explained Lewin. “They entered a state of suspended animation, a kind of coma, and survived like that for 18 minutes.”
When oxygen was reintroduced, the animals quickly recovered and suffered no long-term damage at all.
Analyzing the data, the scientists realized the animals were switching from a glucose-based metabolic system, which requires oxygen to release energy, to one based on fructose, which doesn’t need oxygen.
To continue creating the energy needed for cells to survive in vital organs such as the brain and heart — and keeping them functioning — the mole-rats were using a different form of fuel, one where lack of incoming oxygen is irrelevant.
“This is a fantastic piece of work,” said Michael Berenbrink, a senior lecturer on animal sciences at the University of Liverpool, who was not involved in the study.
“This type of metabolism is really unheard of among mammals,” he said. “There are some fish that have similar tricks… but they are also an exception. It really broadens our mind in terms of what evolution can do — how metabolic pathways can adapt,” he told CNN.
Naked mole-rats are very sociable animals and live in colonies of up to 200.

Lewin is excited about the potential implications for human beings.
Despite all of their remarkable traits, naked mole-rats are genetically very similar to mice, he explained, and not all that different to humans.
“Humans already have the ability to create energy from fructose … our livers do it all the time,” he explained. “The question is whether we can nudge the human body towards switching to a fructose-based metabolism when our supply of oxygen is low.”
Understanding just how these tiny, hairless creatures use fructose to stay alive could lead to new treatments for patients suffering oxygen deprivation due to strokes or heart attacks.
“Perhaps even feeding the brain fructose during this period of oxygen deprivation could help,” Lewin said.
He also wonders whether extreme divers, who survive for relatively long periods of time without taking in any oxygen, may have unwittingly taught their bodies how to switch from using glucose to fructose in the metabolic process.
“We just don’t know,” said Lewin. He hopes future studies will shed more light on exactly how these strange creatures survive without oxygen and what humans can learn from them.

http://www.zdnet.com/article/windows-10-ultralight-pcs-based-on-arm-smartphone-chips-set-for-q4-says-qualcomm/

Microsoft demoed Windows 10 Enterprise edition running on a Qualcomm Snapdragon processor late last year. Image: Microsoft

Qualcomm has confirmed that new mobile Windows 10 PCs running its ARM-based Snapdragon 835 system on chip (SoC) will arrive by the fourth quarter.

Microsoft and Qualcomm announced in December that a new line of always-on cellular Windows 10 PCs running on Qualcomm’s ARM chips would arrive at some point in 2017. But on Qualcomm’s second-quarter earnings call yesterday, the chipmaker’s CEO Steve Mollenkopf offered a firmer fourth-quarter 2017 timeframe.

“Our Snapdragon 835 is expanding into mobile PC designs running Windows 10, which are scheduled to launch in the fourth calendar quarter this year,” Mollenkopf said, according to a SeekingAlpha transcript.

The ARM partnership between Microsoft and Qualcomm is notable as it expands Windows 10’s existing support of x86 chips from Intel and AMD. It also looks set to overcome the constraints of Microsoft’s previous ARM effort with Windows RT.

The Snapdragon 835 PCs will run full Windows 10 desktop, which has been compiled natively for Qualcomm’s SoCs. They’ll also run Win32 apps via an emulator, as well as universal Windows apps. Microsoft billed the forthcoming devices as a “truly mobile, power-efficient, always-connected cellular PC”.

The Snapdragon 835 is Qualcomm’s high-end chip, which powers Samsung’s Galaxy S8, and Xiaomi’s newly announced Mi 6, but so far no PCs.

Qualcomm sees the chips bringing smartphone technologies, such as Gigabit LTE and battery-saving efficiencies, to newer lightweight mobile PCs that are designed to be always on and always connected to a network.

The SoC’s key features include Qualcomm’s Kryo 280 octa-core CPU, the Adreno 540 GPU, the Snapdragon X16 LTE modem with gigabit-per-second downloads, Bluetooth 5.0, and Qualcomm’s latest Quick Charge 4.0 technology.

http://www.theverge.com/2017/4/21/15385232/mercedes-benz-amazon-echo-alexa-google-home

Mercedes-Benz is connecting the Amazon Echo and Google Home to all its new cars

‘Ok Google, stop my house from burning down’

Mercedes-Benz announced today that all of its 2016 and 2017 vehicles in the US can now connect with both Amazon and Google’s digital voice assistants.

Starting today, Mercedes owners can instruct their Google Home or Amazon Echo to remotely start or lock their vehicles, as well as send addresses to their in-car navigation system. But a promo video by Mercedes shows a much more frightening use-case: using these digital voice assistants to compensate for incredibly stupid behavior, like leaving the house with both the iron and stovetops on at full blast.

“We want to offer our customers a broad range of services 24/7, not just when they are in our cars,” says Nils Schanz, head of Mercedes-Benz North America’s Internet of Things and Wearable Integration (how’s that for a title). “Mercedes-Benz’s goal is creating an intelligent ecosystem around cars and developing cutting-edge technology to make everyday life more convenient for our customers.”

Using your Echo and Google Home in conjunction with your Mercedes will involve more verbal gymnastics than if you were just using the smart home devices on their own. Let’s let Mercedes explain:

For instance, customers with Google devices can simply say, “Ok, Google, tell Mercedes me to start my car,” and it will remotely start the customer’s car. Another available feature includes remote lock. With Alexa devices, customers can say, “Alexa, ask Mercedes me to send an address to the car” for remote navigation input and point-of-interest requests.

There will be some app integration gymnastics as well.

Mercedes-Benz customers will need an active Mercedes me account and an active mbrace subscription. In order to link their accounts, customers will have to download the Google Home or Amazon Alexa app and connect it with Mercedes me.

It doesn’t appear that Mercedes-Benz owners will be able to use Alexa or Google Home from inside their vehicles, but rather will use Mercedes’ in-car system to control those devices.

Mercedes isn’t the first automaker to recognize the potential of third-party digital voice assistants. At CES earlier this year, Ford unveiled its plan to roll out Alexa-equipped vehicles. Around the same time, Hyundai announced a new partnership with Google to add voice control through Google Home.

But Mercedes has certainly been at the forefront of smart home integrations. Three years ago, the Daimler-owned car company said it would be adding Nest support to its vehicles, meaning drivers would be able to tweak the temperature at home right from their dashboards.

Even so, most automakers are still trepidatious about fully embracing in-car app systems like Apple’s CarPlay and Android Auto. Many hope to create their own app systems for drivers, so they can fully control the experience. But as you can see with The Verge’s ScreenDrive series, that often amounts to a frustrating driving experience.

https://9to5mac.com/2017/04/21/koogeek-smart-socket-homekit-lights/

Review: Koogeek Smart Socket adapts your existing bulbs to work with HomeKit & Siri voice control

I also don’t recommend using the Koogeek Home app (which is what the included documentation directs you towards). Apple provides a very good HomeKit controller with the stock iOS 10 Home app and everything about the Smart Socket (aside from firmware updates) can be handled in the first-party app.

The Smart Socket is currently on sale for $40 at Amazon or ~$50 on eBay. I think the Socket makes sense for people that don’t want to invest in a comprehensive solution for every room, and just want to add some smart home features to a couple of places in the house. The Socket is a cool gift idea too for a friend — the novelty of controlling lights with Siri voice actions is really great.

Check out our previous review of the Smart Plug for more HomeKit accessories from this manufacturer.


When the bulb eventually dies out, you can buy another standard bulb from any shop and simply put that into the adaptor instead. The Smart Socket means you don’t have to buy special bulbs and you don’t have to change any of your fixtures. The only constraint is that it only works with Edison screw lights, probably the most common fitting for household lights. (It’s the type with the winding stalk that you can screw in and out of the socket by hand.)

The Smart Socket is powered by the electricity coming to the light fitting, and works like any other HomeKit accessory. Adding it to my Home was as simple as scanning the tag and took seconds using the iOS 10 Home app.

It then appears as its own tile; just tap the icon on the screen to toggle the light on and off. If the light is in a popular location in your house, you can add it as a Favorite in the Home app to access it anywhere in iOS from the Control Center pane.

Adding it to your Home automatically enables Siri voice integration, of course. You can control the bulb using any of the usual Siri features; I asked my Watch to turn on the light bulb whilst taking photos for this review and it dutifully obliged. The Socket uses Wi-Fi to communicate with HomeKit and was always very responsive to my remote commands.

On the negatives, the socket body itself is a large white bit of plastic. If you are using it with a light that doesn’t have a shade, it will stick out in the room and is unlikely to match your decor. In my case, this was not an issue as I used it with a standard ceiling light that includes a decorative shade. Once the socket is in the fitting, it’s very hard to see. I’ve included some pictures of before-and-after shots of the light to compare.

The other downside of this kind of product is that it relies on the electric wiring for power. This means the wall switch that the light is connected to has to stay on for the Smart Socket to be able to communicate with your iPhone wirelessly over HomeKit. It does reconnect quite quickly though if it is toggled on and off at the wall, which is nice.

(This latter point affects Philips Hue bulbs just the same of course; if you want a more elegant solution for the wall switch, you need to replace them with smart wall switches like Lutron.)

https://techcrunch.com/2017/04/20/amazons-new-alexa-developer-policy-now-bans-all-ads-except-in-music-and-flash-briefings/

Amazon’s new Alexa developer policy now bans all ads except in music and flash briefings

Amazon has quietly introduced a change to its Alexa Skills Developer agreement aimed at further restricting advertisements within Alexa’s voice apps, which it calls “skills.”

Previously, the developer agreement only banned advertising delivered through Alexa app home cards – the cards that appear in the Alexa companion app to describe a skill or supplement the voice interaction with text-based content. Now the agreement simply bans skills that contain “any advertising for third-party products or services,” according to the updated documentation. Amazon carves out an exception for streaming music, radio and flash briefing (news briefing) skills, where advertisements are not the core functionality. But other than that, ads are now largely banned on the Alexa platform.

The change has irked some Alexa developers who already feel that it’s too difficult to make money from their Alexa skills, as is. Others, however, are confident that Amazon will eventually introduce its own monetization platform – perhaps through in-app purchases, paid skills, or ways to leverage Amazon Pay (its payments platform) in their skills.

While Amazon is following an ambitious path toward making its voice computing technology powerful and ubiquitous – including by opening access to Echo speaker technology, Alexa’s voice technology, and the underlying technologies that power Alexa’s abilities to understand language – it has yet to fully address the needs of developers who want to build their own app businesses on top of its voice computing platform.

In fact, this problem is so often discussed that there’s an inside joke in an active Slack community for Alexa developers that involves posting a snow cone emoji. The joke is that it’s easier to make more money selling snow cones than building Alexa skills. The emoji is posted in response to complaints, including, most recently, the change to Amazon’s Alexa Developer agreement.

According to posters in this community, the agreement was updated on Tuesday. We’ve asked Amazon to confirm, but the company declined to comment.

However, you can compare the two versions of the agreement by looking at the one live on Amazon.com and a version cached by the Internet Archive’s Wayback Machine:

In the former, there’s only a one-line description of what sort of advertising is banned – those using home cards – while the newer one broadens that to include “any advertising.”
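Comparing a cached copy against the live page is straightforward with a standard diff tool. A minimal sketch using Python’s difflib, with hypothetical one-line excerpts standing in for the two policy versions (the real texts live on Amazon.com and in the Wayback Machine cache):

```python
import difflib

# Hypothetical excerpts, not the actual agreement text.
old_policy = ["Skills may not include advertising via home cards."]
new_policy = ["Skills may not contain any advertising for third-party products or services."]

# Produce a unified diff between the cached and live versions.
diff = list(difflib.unified_diff(old_policy, new_policy,
                                 fromfile="cached (Wayback)",
                                 tofile="live (Amazon.com)",
                                 lineterm=""))
print("\n".join(diff))
```

The removed and added lines appear with “-” and “+” prefixes, making the broadened wording easy to spot.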

There was initially some speculation that the change was made in response to technology being developed by VoiceLabs, which has been testing an advertising platform aimed at Alexa skill developers involving “Sponsored Messages,” as referenced in a blog post. These will allow developers to insert brief ads, similar to those in podcasts, but which are interactive.

VoiceLabs’ system allows partner advertisers to connect with consumers who use Alexa’s voice-powered apps. But because any one Skill wouldn’t have enough users to capture that ad spend, VoiceLabs’ system instead combines users across Skills. This aggregated audience is then sizable enough to gain advertisers’ attention and interest.

But VoiceLabs’ co-founder and CEO Adam Marchick disputes the idea it’s his system that’s at all related to the policy change. He says that Amazon has known about Sponsored Messages since January, and has been collaborating with VoiceLabs on its development.

In addition, of the 1,300 developers on VoiceLabs’ platform, the majority of those planning to use Sponsored Messages are creating flash briefings, which are not affected by the new policy.

“Amazon has a really hard job,” says Marchick. “Consumer adoption is growing really quickly for this device, and developers are excited to innovate.”

However, he did caution that advertising has to be carefully considered as adoption grows. “They have a huge hit on their hands, and they want to be considerate of the consumer. To date, we’ve seen some of the advertising, and it’s not been considerate,” he says.

The change does come at a time when consumers are increasingly sensitive to unwelcome voice ads invading their connected speakers. Google Home came under fire when its speaker began playing ads for Disney’s “Beauty and The Beast” movie. Google denied this was an ad, claiming that it was an experiment in having Google Home’s Assistant surface other unique content. This month, Burger King hijacked Google Home speakers by creating an ad that would trigger the devices to read its Wikipedia entry for the Whopper, which it had conveniently edited beforehand to sound like marketing copy. Google quickly blocked the trigger, but not before the restaurant chain gained a lot of free press, along with consumer backlash.

Those examples aren’t necessarily the same sort of advertisements that Amazon’s policy change is meant to head off. But it does allow the company to summarily reject apps designed to use advertising in unwelcome ways – those that would ultimately annoy Alexa’s users and decrease interest in voice computing in general.

It’s unclear to what extent Amazon will be enforcing this policy, however.

One developer, Joseph Jaquinta, who has been critical of Amazon’s policies, admits he’s openly violating the old ad policy in his skills. Both StarLanes and KnockKnock place ads in the home card – the former lets users play an ad for a bonus in the game, and the latter will simply read an ad to you and put it in your home card at some point.

“With 10,000 skills, how are you going to tell if someone starts advertising?” he asks. “I’m not seriously affected by the change in advertising policy. I had advertising in my skills before they even had a policy. And I’ve been in violation of their policy from the first day they introduced it. But they have zero enforcement and have never asked me about it.”

Enforcing ad policy is just one aspect of how Amazon isn’t tracking skills’ behavior. Developers also said you can update a skill’s content after it’s live and Amazon doesn’t notice the changes. This could end up being a workaround for the ad policy restrictions, for those developers who insist on breaking the advertising ban.

https://www.theguardian.com/technology/2017/apr/20/google-chrome-adblocker-feature

Google ‘may build an adblocker into Chrome’

Company could announce feature to prevent intrusive online adverts within weeks, according to reports

Google may outsource the definition of unacceptable adverts to the Coalition for Better Ads, an independent group set up by a consortium of major advertisers and agencies. Photograph: Stephen Shankland/Flickr

A future version of Google Chrome may include a built-in adblocker, designed to prevent the most intrusive online adverts from being displayed on users’ computers and smartphones by default.

According to the Wall Street Journal, Google could announce the feature within weeks, but the specifics are not yet set in stone, and the company may yet scrap the entire plan.

If it does go ahead, the company would outsource the definition of unacceptable adverts to the Coalition for Better Ads, an independent group set up by a consortium of major advertisers and agencies in March. Its standards were set in place after “comprehensive research” involving more than 25,000 participants.

On desktop, the group bans “pop-up ads, auto-play video ads with sound, prestitial ads with countdown and large sticky ads”. On mobile “pop-up ads, prestitial ads, ads with density greater than 30%, flashing animated ads, auto-play video ads with sound, poststitial ads with countdown, full-screen scrollover ads, and large sticky ads” fail to make the standards.
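The 30% density limit is the only quantitative criterion in that list, and it is simple to express as a check. A minimal sketch, assuming a simplified page model measured in pixels of vertical space (the function names are illustrative, not part of any Coalition tooling):

```python
MAX_AD_DENSITY = 0.30  # mobile ads may not occupy more than 30% of the page

def ad_density(ad_heights, page_height):
    """Fraction of the page's vertical space occupied by ads."""
    return sum(ad_heights) / page_height

def passes_density_rule(ad_heights, page_height):
    """True if the page stays within the 30% ad-density limit."""
    return ad_density(ad_heights, page_height) <= MAX_AD_DENSITY

# A 2000px page with 500px of ads sits at 25% density and passes;
# adding another 300px ad pushes it to 40% and fails.
print(passes_density_rule([250, 250], 2000))       # True
print(passes_density_rule([250, 250, 300], 2000))  # False
```

A real implementation would have to measure rendered ad slots, but the threshold logic is this simple.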

One element of the plan which has yet to be fixed is what form the block would take: the Wall Street Journal suggests that Google may choose to block all advertising on sites with any offending ads, rather than just blocking the offending adverts themselves. That would lead to a much stronger push to publishers to ensure that all adverts on their page comply with the Coalition’s criteria.

Blocking ads may seem like a counterintuitive move for Google, which makes 86% of its revenue from advertising, but the move could help keep users from adopting more aggressive adblockers.

The majority of adverts Google serves comply with the Coalition’s standards, and its largest single revenue source – keyword adverts on search pages – is already seen as the gold standard of advert acceptability by many external critics. Another “acceptable ads” board, created by adblock developer Eyeo (which makes the Adblock Plus plugin for Chrome and Firefox), similarly allows Google’s search adverts through. But Eyeo charges large companies a portion of their revenue for the privilege of being on the Acceptable Ads list, a fee which publishers like Google and Amazon begrudgingly pay.

Unlike Facebook, its largest competitor in the digital advertising sector, the majority of Google’s ad revenue comes from users on the open web, where adblockers are technologically unhindered in how much content they can filter out of users’ browsing experience. Facebook, with an audience increasingly using the company’s own mobile apps, is largely protected from the adblocking boom, though it has still taken a more aggressive stance towards the tools, actively seeking to bypass them on its desktop website.

Chrome does already block some adverts, though not directly. A feature enabled by default in the browser prevents pop-ups from being shown unless a specific site is whitelisted. But the feature will block pop-ups regardless of whether or not they’re adverts, allowing it to skirt round the conventional definition of an adblocker; similarly, Chrome requires any content served using the Flash plugin to be manually activated, which blocks many adverts in practice (though a declining number of adverts are served using the ageing technology, partially as a result of Google’s changes).

Even before Google’s entry into the space was rumoured, adblocking developers had already begun changing how they interacted with publishers. Companies such as Eyeo and Brave, which block adverts, both allow a proportion of adverts through if they’re deemed to fit their own standards of acceptability; and both developers also have systems in place to allow users to pay for content in other ways than through advertising, if the publishers are interested in playing along.

If it launches, the adblocker could have ramifications for Google’s ongoing struggles with EU regulators. The European Commissioner for Competition, Margrethe Vestager, said “We will follow this new feature and its effects closely.”

Google told the Guardian: “We do not comment on rumour or speculation. We’ve been working closely with the Coalition for Better Ads and industry trades to explore a multitude of ways Google and other members of the Coalition could support the Better Ads Standards.”


https://techcrunch.com/2017/04/20/alexa-now-works-with-your-g-suite-calendar/

Alexa now works with your G Suite calendar

Amazon’s connected speakers and other Alexa-powered devices will now work with your G Suite calendar, the company announced this morning. Once enabled, users will be able to ask Alexa to give them an overview of their day or make changes and other additions to their calendar as needed, just by speaking.

The change represents another step towards making Echo and other Alexa speakers more practical devices to have in the office, or for general business use.

G Suite is not the first calendaring platform that Alexa supports. Amazon has offered Google Calendar integration since launch, and earlier this year added support for both Outlook Calendar (including Hotmail, MSN, and Live email accounts) and Office 365 Calendar, for those with Exchange Online mailboxes. However, G Suite was one of the last major calendaring services Amazon needed to round out its voice-accessible calendaring functionality.

With the addition, you can say things like “Alexa, what’s on my calendar?” or “Alexa, add lunch with Sarah at noon to my calendar,” and Alexa will respond accordingly.

Note that to enable the feature, you don’t search for it in the Alexa Skill store, but rather make the change in the Alexa companion app. The option is available under Settings > Calendar in the Accounts section of the Settings page.

http://www.cbsnews.com/news/young-cord-blood-revive-memory-in-the-aging-brain/

Could young blood revive memory in the aging brain?

A new study hints that young blood may harbor clues to a “fountain of youth” for older brains.

Researchers say blood from human umbilical cords appears to have helped reverse memory loss in aging mice.

The findings suggest that something in young blood is important in maintaining mental acuity.

No one, however, is saying that cord blood could be a magic bullet against Alzheimer’s or other forms of dementia.

For one, any effects seen in elderly rodents may fail to translate to humans.

Instead, the findings might set the stage for new drugs that target the dementia process, said study lead author Joseph Castellano. He’s an instructor in neurology at Stanford University School of Medicine.

“Part of what makes this exciting is that it suggests there’s more communication between the blood and brain than we’ve thought,” Castellano said.

The study builds on earlier work by the same Stanford team. There, the researchers found that old lab mice benefited from infusions of plasma (the liquid portion of blood) from young mice.

Specifically, the old mice showed improvements in learning and memory. This was measured by the ability to accomplish tasks like navigating a maze or building a nest.

The aim of the new study, Castellano said, was to see whether injections of human plasma given to mice could have similar effects.

It turned out that they did — at least when the plasma came from umbilical cords. Plasma from young adults had less of an impact. And plasma from older adults, ages 61 to 82, had no benefit at all.

That led to a critical question: What is it about umbilical cord blood that’s special?

The researchers found evidence that it might be a protein called TIMP2. It is present in high levels in cord plasma, they said, but declines with age.

What’s more, injections of TIMP2 benefited older rodents’ brains in the same way that cord plasma did.

Castellano said it was “surprising” that a single protein had such effects.

But, he noted, TIMP2 could be “upstream” of many biological processes. It belongs to a family of proteins that regulate other critical proteins. Those proteins, in turn, have the task of “chopping up” yet more proteins that exist in the matrix surrounding body cells.

But researchers know little about how TIMP2 acts on the brain, Castellano said.

“Now, we really need to get a better understanding of what it’s doing in the brain,” he said. “We are not saying we’ve found the protein that’s responsible for brain aging.”

Dr. Marc Gordon is a professor at the Litwin-Zucker Center for Alzheimer’s Disease and Memory Disorders at the Feinstein Institute for Medical Research in Manhasset, N.Y.

He agreed that the study identifies a protein “target” that should be studied further.

“But this is not saying that cord blood is a cure for aging,” Gordon stressed.

And it’s probably unrealistic to use cord blood as a dementia treatment, said Castellano.

Nor can anyone predict whether TIMP2 will point researchers toward new drugs for dementia. Findings in lab animals often fail to pan out in humans.

Plus, Gordon said, this study involved mice that were old, but did not have an “animal model” of Alzheimer’s. That refers to lab mice that are genetically modified to have Alzheimer’s-like brain pathology.

“What this could mean for human disease is purely speculative,” Gordon said.

Drugs for age-related brain disease have so far been “elusive,” Castellano said. The available medications for dementia symptoms have limited effects, and cannot stop the disease from progressing.

“We’re excited,” Castellano added, “about this knowledge that there are proteins present in the blood that evolve over the life span, and may affect brain function.”

The findings were published April 19 in Nature.

http://www.syracuse.com/us-news/index.ssf/2017/04/facebook_plans_to_let_you_type_with_your_brain_and_hear_with_your_skin.html

Facebook plans to let you type with your brain and hear with your skin

The human mind is a powerful thing, and Facebook plans to harness some of that energy.

Facebook has assembled a team of 60 scientists and engineers to work on the future of communication at Building 8, the company’s mysterious new hardware division, CNN reports. At the F8 conference in San Jose, Building 8 leader Regina Dugan said, “What if you could type directly from your brain? It sounds impossible, but it’s closer than you think.”

The human brain streams the equivalent of 40 HD movies every second. The new technology would essentially read your mind using sensors and optical imaging, then translate that information into words.

According to TechCrunch, the project began six months ago and Building 8 is currently collaborating with medical researchers who specialize in machine learning for decoding speech and language, building optical neuroimaging systems with advanced spatial resolution and neural prosthetics. The plan is to ultimately be able to create non-implanted devices that allow people to type at 100 words per minute, 5X faster than typing on a phone, with just your mind.
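The quoted speed figures imply some simple baselines. If 100 words per minute is 5x faster than phone typing, the assumed phone rate is 20 wpm; measured against the 8 wpm achieved by today’s implant-based systems (mentioned later in the article), the goal is a 12.5x improvement. A quick back-of-the-envelope check:

```python
# Back-of-the-envelope check of the typing-speed figures quoted at F8.
target_wpm = 100      # Facebook's stated goal for "brain typing"
phone_speedup = 5     # "5X faster than typing on a phone"
als_wpm = 8           # rate of the implant-assisted ALS patient shown on video

implied_phone_wpm = target_wpm / phone_speedup  # baseline the 5x claim assumes
gain_over_implant = target_wpm / als_wpm        # improvement over today's implants

print(implied_phone_wpm)  # 20.0
print(gain_over_implant)  # 12.5
```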

The issue of privacy was also addressed at the F8 conference when Dugan provided the disclaimer that this was not about invading your thoughts, but rather taking words you have already decided to share and sending them to the speech center of your brain.


USA Today said the implications of this new technology could be unsettling to consumers, many of whom think Facebook knows too much about their daily habits and actions — let alone their thoughts.

Examples of devices that let you type with your brain already exist for people with extreme paralysis. At the F8 conference, a video showed a woman with ALS who, with the help of implants, is able to type eight words a minute without moving her fingers. With Facebook's goal of 100 words per minute, the technology could serve as what Dugan calls a "speech prosthetic" for people with communication disorders.

In addition to typing with your brain, Building 8 is also working on a way for humans to hear through their skin. It’s been building prototypes of hardware and software that let your skin mimic the cochlea in your ear that translates sound into specific frequencies for your brain. This technology could let deaf people essentially “hear” by bypassing their ears.

A video shown at the conference captured an early test in which a Facebook engineer wore a special sleeve containing a system of actuators. The sleeve let her feel "the acoustic shape of a word on her arm." The engineer has learned the distinct feeling of around nine words, what Facebook calls a tactile vocabulary.
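The cochlea-mimicking idea described above can be sketched in a few lines: split an audio signal into frequency bands and map each band's energy to one actuator's intensity. This is a rough illustration of the general principle only; Facebook has not published how its prototype actually works, and the band count, frequency range, and log spacing below are all assumptions.

```python
import numpy as np

def band_energies(signal, sample_rate, n_actuators=9):
    """Split an audio signal into n_actuators frequency bands and return
    the normalized energy per band, loosely mimicking how the cochlea
    separates sound by frequency. Illustrative sketch only."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2              # power spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    # Log-spaced band edges from 100 Hz to 8 kHz (assumed values),
    # giving finer resolution at low frequencies, as the ear does.
    edges = np.logspace(np.log10(100), np.log10(8000), n_actuators + 1)
    energies = np.array([
        spectrum[(freqs >= lo) & (freqs < hi)].sum()
        for lo, hi in zip(edges[:-1], edges[1:])
    ])
    total = energies.sum()
    return energies / total if total > 0 else energies

# A pure 440 Hz tone concentrates its energy in a single band,
# i.e. one actuator would buzz strongly while the others stay quiet.
rate = 16000
t = np.arange(rate) / rate
tone = np.sin(2 * np.pi * 440 * t)
levels = band_energies(tone, rate)
```

In a real tactile display, each of the nine values would drive one actuator on the sleeve, so different words would produce different spatial vibration patterns on the skin.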


“One day not so far away, it may be possible for me to think in Mandarin and for you to feel it instantly in Spanish,” said Dugan.

Facebook CEO Mark Zuckerberg has shown a predilection for telepathy, which he calls “the future of communication.” Once virtual reality and augmented reality have run their course, he has theorized, a form of technology-enabled telepathy will help people capture and then share their thoughts and feelings with friends.

http://www.ctvnews.ca/sci-tech/google-home-s-assistant-can-now-recognize-different-voices-1.3377195

Google Home’s assistant can now recognize different voices

In this Oct. 4, 2016 file photo, the new Google Pixel phone is displayed next to a Google Home smart speaker, left, following a product event in San Francisco. (Eric Risberg/AP)

SAN FRANCISCO — Google’s voice-activated assistant can now recognize who’s talking to it on Google’s Home speaker.

An update coming out Thursday will enable Home’s built-in assistant to learn the different voices of up to six people, although they can’t all be talking to the internet-connected speaker at the same time.

Distinguishing voices will allow Home to be more personal in some of its responses, depending on who triggers the assistant with the phrase, “OK Google” or “Hey Google.”

For instance, once Home is trained to recognize a user named Joe, the assistant will automatically be able to tell him what traffic is like on his commute, list events on his daily calendar or even play his favourite songs. Then another user named Jane could get similar information from Home, but customized for her.

The ability to distinguish voices may help Home siphon sales from Amazon.com’s Echo, a competing product that features its own voice-activated assistant, Alexa. The Echo doesn’t yet recognize different voices, so Alexa can’t retrieve more personal information for different accounts.

Google’s voice-distinction feature, however, won’t prevent unauthorized users from activating the assistant, as long as Home’s microphone is turned on.

That loophole allowed Burger King to recently air a TV commercial that included the phrase “OK Google” to prompt Home’s assistant to recite the ingredients of the fast-food restaurant’s Whopper burger from a Wikipedia entry.

Google quickly blocked Burger King’s commercial from toying with the Home assistant, but the marketing stunt illustrated how the technology can be manipulated. Voice-personalization eventually could enable Home’s users to block others from accessing the device, but Google isn’t ready to do that yet.

“It’s important to balance making sure the assistant on Google Home is still useful and able to answer a guest’s or friend’s question while also answering a few specific questions just for you,” Google spokeswoman Kara Stockton said.

The voice-distinction feature also isn’t being offered for the same digital assistant that operates on Google’s Pixel phone and other smartphones running on the latest version of its Android software. Google doesn’t think the technology is necessary on phones because most of those devices are password-protected and are usually used by just one person.