https://9to5mac.com/2016/12/27/san-francisco-ipad-as-plate/

San Francisco restaurant now using iPads as plates in effort to reach younger audience

In a rather bizarre move, a San Francisco restaurant is now serving food on iPads. No, you don’t order the food via the iPad; the food is literally served on the iPad. The restaurant, Quince, was awarded a third Michelin star earlier this year, but the new iPad dish is an effort to lure in a younger crowd. The dish served on an iPad is called “A Dog in Search of Gold” and described as “white truffle croquettes on iPads playing videos of water dogs on the truffle-hunt,” according to SFist.

Obviously there are some issues with this idea, namely health concerns. Presumably the iPads are cleaned thoroughly between customers, but there most likely aren’t any health code regulations for ensuring that they are cleaned in the safest way possible.

SFist’s write-up raises some obvious questions about the plating. Namely, does the San Francisco Department of Public Health have an acceptable washing method for iPads? And, this being San Francisco, how long until someone reprograms one of those things to display one-star Quince reviews on Yelp?

Apparently, this isn’t a new thing. Chefs in the United Kingdom have been serving dishes on iPads since at least last year, but the trend hasn’t really come to the United States in full force.

We’ll see how this trend continues, but for now, what do you think? Let us know down in the comments.

https://www.cnet.com/products/sylvania-smart-multicolor-led-homekit-enabled/preview/

Here’s the first HomeKit smart bulb that doesn’t need a hub

CES is just a week away, and we’re sure to see plenty of new gadgets that work with Apple HomeKit, the set of smart home protocols built into the software that runs iPhones and iPads. North American lighting manufacturer Sylvania wanted to get out ahead of the sprawl, so we’re getting an early look at its own HomeKit-compatible offering: a new color-changing smart bulb that doesn’t need any extra hub hardware to connect with your home network.

The Sylvania Smart Multicolor LED.

Sylvania

That last bit will help set Sylvania apart from other HomeKit-compatible smart bulbs, including Philips Hue’s popular color-changing LEDs and the funky-looking LEDs in the Nanoleaf Smarter Kit. Both of those options transmit their signals using the Zigbee protocol, which means you need to plug a hub into your router to act as translator. Sylvania’s bulbs cut out the middleman by using Wi-Fi radios that your router can understand as soon as you screw them in. The company claims that the new bulbs are your very first hub-free option for use with HomeKit.

HomeKit compatibility comes with a couple of key advantages for iOS users. The most notable: Siri voice controls that allow you to tell Apple’s virtual assistant to turn your bulbs on and off, dim them up and down, or change their color.

You’ll also be able to control the Sylvania bulbs directly from Apple’s Home app alongside other HomeKit-compatible gadgets. With the Home app, you can group the bulbs with those other gadgets to create “scenes” that run automatically whenever you please. For instance you could set your lights to come on and your thermostat to crank up a few degrees whenever you return home from work at the end of the day. You can also pin individual bulbs or groups of bulbs to your iPhone’s Control Center — just swipe up and tap to turn things on and off, no app needed.

Pricing for the new bulb isn’t set yet, but if it follows suit with other color-changing smart bulbs in the Osram/Sylvania family of LEDs, it will likely cost somewhere around $40 a pop. Sylvania’s team tells me that the new bulbs will begin selling on Amazon in early 2017, and they’d be smart to hustle; Lifx, which also makes color-changing smart bulbs that communicate using Wi-Fi, recently announced plans to bring its latest generation of LEDs onto HomeKit’s platform by February.

http://www.sci-tech-today.com/story.xhtml?story_id=1230095R1D5F

Google’s Self-Driving Car Project Gets a New Name: Waymo

The self-driving car project that Google started seven years ago has grown into a company called Waymo, signaling its confidence that it will be able to bring robot-controlled vehicles to the masses within the next few years.

“We are getting close and we are getting ready,” Waymo CEO John Krafcik said Tuesday after unveiling the company’s identity.

To underscore his point, Krafcik revealed the project had hit a key milestone in the journey to having fully autonomous cars cruising around public roads. In a trip taken in October 2015, a pod-like car with no steering wheel or brake pedals drove a legally blind passenger around neighborhoods in Austin, Texas, without another human in the vehicle. It marked the first time one of the project’s cars had given a passenger a ride without a human on hand to take control if something went wrong.

Krafcik called that trip, taken by Steve Mahan, former director of the Santa Clara Valley Blind Center, an “inflection point” in the development of self-driving cars. It came a year before a Budweiser beer truck, equipped with self-driving technology owned by ride-hailing service Uber, completed a 120-mile trip through Colorado steered by a robot while a human sat in the back of the trailer.

Krafcik and other supporters of self-driving cars believe the technology will drastically reduce the number of deaths on the roads each year because, they contend, robots don’t get distracted or drunk and don’t ignore the rules of the road the way humans do.

While Google’s self-driving cars were still in the research-and-development stage, its leaders indicated the vehicles would be commonplace by 2020. Krafcik declined to update the timetable Tuesday, saying only that “we are close to bringing this to a lot of people.”

Waymo’s transition from what once was viewed as a longshot experiment to a full-fledged company marks another step in an effort to revolutionize the way people get around.

Instead of driving themselves and having to find a place to park, people will be chauffeured in robot-controlled vehicles if Waymo, automakers and Uber realize their vision within the next few years. Waymo’s name is meant to be shorthand for “a new way forward in mobility.”

The newly minted company will operate within Google’s parent company, Alphabet, which was created last year to oversee far-flung projects that have nothing to do with Google’s main business of online search and advertising. Those projects, which Alphabet CEO Larry Page likens to “moonshots,” have lost $8 billion since 2014, with the research into self-driving cars accounting for a significant chunk of that amount.

Google began working on its self-driving technology in 2009 in a secretive lab called “X” run by company co-founder Sergey Brin. Since then, its fleet of cars has covered more than 2.3 million miles in the San Francisco Bay Area, Austin, Arizona and Washington state. In their travels, the self-driving vehicles have been involved in 35 traffic accidents. Google has said its self-driving vehicles were at fault in only one collision with a bus earlier this year.

The self-driving project had been expected to be spun out of the X lab since Krafcik, a former Hyundai USA executive, was hired as its CEO 15 months ago.

As its own company, Waymo will now face more pressure to generate a profit under Alphabet’s management instead of simply focusing on research. Rather than make its own cars, Waymo intends to license its technology to traditional automakers and trucking companies.

“We are not in the business of making better cars,” Krafcik said. “We are in the business of making better drivers.”

Earlier this year, Waymo’s precursor licensed its self-driving technology to Fiat Chrysler for 100 Pacifica minivans currently in production. Financial terms of that deal haven’t been disclosed.

The pressure to make money risks alienating some of the engineers who worked on the self-driving cars as a project that didn’t have a mandate to turn a profit. As it headed down the road to becoming Waymo, several key players quit the project. The defectors included its former director, Chris Urmson, and co-founder Anthony Levandowski, who is now working on self-driving technology for Uber.

© 2016 Associated Press under contract with NewsEdge. All rights reserved.

http://www.iclarified.com/58464/apple-to-release-new-5inch-iphone-with-dual-vertical-cameras

Apple to Release New 5-inch iPhone With Dual Vertical Cameras?

Apple is planning to release a new 5-inch iPhone with dual cameras in a vertical orientation next year, according to a Macotakara report. Additionally, the company allegedly plans to release new 4.7-inch and 5.5-inch iPhone 7s models.

According to the report, the basic specifications are the same as those of the iPhone 7s and iPhone 7s Plus, but the iSight Duo cameras are installed vertically instead of horizontally; various specifications are still under consideration. The report’s sources say the final specifications will be finalized in the second quarter of Apple’s 2017 fiscal year.

We’ve been hearing reports for some time that the 10th anniversary iPhone will be a major update. It will purportedly have a front glass cover and chassis, joined by a metal bezel and an edge-to-edge display that has no bezels on the top and bottom. Additionally, the front camera, Touch ID, speaker, and other sensors will apparently be embedded into the display. Another report claims that the device would be a clear piece of glass with a next-generation OLED screen. More recently, Apple is said to be equipping the phone with wireless charging and a 3D camera.

Please follow iClarified on Twitter, Facebook, Google+, or RSS for updates.

http://www.theglobeandmail.com/report-on-business/international-business/after-shipping-more-than-1-billion-items-amazon-calls-2016-holiday-season-its-best-ever/article33437281/

After shipping more than 1 billion items, Amazon calls 2016 holiday season its ‘best-ever’

Amazon.com Inc. said it shipped more than 1 billion items worldwide this holiday season, which the top online retailer called its best ever, and its shares rose 1.5 per cent in afternoon trade.

The Amazon Echo home assistant and its smaller version, Echo Dot, topped the best-sellers list, said Jeff Wilke, chief executive of Amazon’s worldwide consumer division, in a press release.

“Despite our best efforts and ramped-up production, we still had trouble keeping them in stock,” he said.

Sales of voice-controlled Echo devices were nine times what they were during last year’s holiday season, the company said. Amazon did not disclose comparable sales figures from a year earlier.

“It’s all relative to other numbers that they’ve never told us,” said analyst Jan Dawson of Jackdaw Research.

Shoppers can command home assistants like the Echo to perform a host of tasks, from playing music to turning on Christmas lights.

“While Amazon’s device sales are still relatively small growth drivers currently, we believe the proliferation of these devices will drive more ubiquitous use of Amazon services over time,” said Baird Equity Research analyst Colin Sebastian in a note, pointing to customers ordering more items by speaking to the Echo.

More than 72 per cent of Amazon’s customers worldwide shopped through mobile devices, the company added, and Dec. 19 was the busiest shopping day this holiday season.

“Prime customers are spending twice as much as other consumers using Amazon and helping to fuel rapid revenue growth that few retailers with only a fraction of Amazon’s revenues are able to generate,” Retail Metrics President Ken Perkins wrote in a note last week.

Alexa, the voice-controlled assistant on the Echo, and one-click ordering devices called Amazon Dash Buttons are making it easier for shoppers to “skip the trip,” and will put more pressure on rival retailers as they try to garner in-store and web traffic, Perkins said.

Other best sellers on Amazon included 72-pack Keurig K-Cups, the movie Finding Dory, Samsung Electronics Co. Ltd.’s Gear VR virtual reality headset and Nintendo Co. Ltd.’s Pokémon Sun and Pokémon Moon role-playing video games, the company said.

http://bgr.com/2016/12/27/this-modified-iphone-7-is-ultra-secure-but-itll-cost-you-big-bucks/

This modified iPhone 7 is ultra-secure, but it’ll cost you big bucks

http://www.kurzweilai.net/nanoarray-sniffs-out-and-distinguishes-multiple-diseases

Nanoarray sniffs out and distinguishes ‘breathprints’ of multiple diseases

December 23, 2016

Schematic representation of the concept and design of the study, which involved collection of breath samples from 1404 patients diagnosed with one of 17 different diseases. One breath sample obtained from each subject was analyzed with the artificially intelligent nanoarray for disease diagnosis and classification (represented by patterns in the illustration), and a second sample was analyzed with gas chromatography–mass spectrometry to explore its chemical composition. (credit: Morad K. Nakhleh et al./ACS Nano)

An international team of 63 scientists across 14 clinical departments has identified a unique “breathprint” for each of 17 diseases with 86% accuracy and has designed a noninvasive, inexpensive, and miniaturized portable device that screens breath samples to classify and diagnose several types of diseases, they report in an open-access paper in the journal ACS Nano.

As far back as around 400 B.C., doctors diagnosed some diseases by smelling a patient’s exhaled breath, which contains nitrogen, carbon dioxide, oxygen, and a small amount of more than 100 other volatile chemical components. Relative amounts of these substances vary depending on the state of a person’s health. For example, diabetes creates a sweet breath smell. More recently, several teams of scientists have developed experimental breath analyzers, but most of these instruments focus on a single disease, such as diabetes or melanoma, or at most a few.

Detecting 17 diseases

The researchers developed an array of nanoscale sensors to detect the individual components in thousands of breath samples collected from 1404 patients who were either healthy or had one of 17 different diseases*, such as kidney cancer or Parkinson’s disease.

The team used mass spectrometry to identify the breath components associated with each disease. By analyzing the results with artificial intelligence techniques (binary classifiers), the team found that each disease produces a unique breathprint, based on differing amounts of 13 volatile organic compounds (VOCs). They also showed that the presence of one disease would not prevent the detection of others — a prerequisite for developing a practical device to screen and diagnose various diseases.
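As a rough illustration of the classification idea described above — each disease as a characteristic concentration pattern over 13 VOCs — the following sketch builds a simple per-disease classifier on entirely synthetic data. This is a hypothetical stand-in (a nearest-centroid rule), not the study’s actual binary classifiers, features, or data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: model each disease as a characteristic mean
# concentration pattern ("breathprint") over 13 VOCs, echoing the
# paper's finding that diseases differ in the amounts of 13 VOCs.
n_diseases, n_vocs, n_samples = 17, 13, 20
breathprints = rng.normal(size=(n_diseases, n_vocs))

# Simulated breath samples: each sample is its disease's breathprint
# plus sensor noise. (Entirely synthetic data, for illustration only.)
X = np.repeat(breathprints, n_samples, axis=0)
X += rng.normal(scale=0.1, size=X.shape)
y = np.repeat(np.arange(n_diseases), n_samples)

# Stand-in classifier: estimate each disease's breathprint from its
# samples, then assign a sample to the nearest estimated pattern.
est_centroids = np.array([X[y == d].mean(axis=0) for d in range(n_diseases)])

def classify(sample):
    """Return the index of the disease with the closest breathprint."""
    dists = np.linalg.norm(est_centroids - sample, axis=1)
    return int(np.argmin(dists))

pred = np.array([classify(x) for x in X])
print(f"accuracy on the synthetic samples: {(pred == y).mean():.2f}")
```

With well-separated synthetic breathprints the toy classifier is nearly perfect; the paper’s reported 86% accuracy reflects the much harder real-world problem of noisy, overlapping breath samples.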

Based on the research, the team designed an organic layer that functions as a sensing layer (recognition element) for adsorbed VOCs and an electrically conductive nanoarray based on resistive layers of molecularly modified gold nanoparticles and a random network of single-wall carbon nanotubes. The nanoparticles and nanotubes have different electrical conductivity patterns associated with different diseases.**

The authors received funding from the ERC and LCAOS of the European Union’s Seventh Framework Programme for Research and Technological Development, the EuroNanoMed Program under VOLGACORE, and the Latvian Council of Science.

* Lung cancer, colorectal cancer, head and neck cancer, ovarian cancer, bladder cancer, prostate cancer, kidney cancer, gastric cancer, Crohn’s disease, ulcerative colitis, irritable bowel syndrome, idiopathic Parkinson’s, atypical Parkinsonism, multiple sclerosis, pulmonary arterial hypertension, pre-eclampsia, and chronic kidney disease.

** During exposure to breath samples, interaction between the VOC components and the organic sensing layer changes the electrical resistance of the sensors. The relative change in each sensor’s resistance at the peak (beginning), middle, and end of the exposure, as well as the area under the curve of the whole measurement, was recorded. All breath samples identified by the AI nanoarray were also examined using an independent lab-based analytical technique: gas chromatography linked with mass spectrometry.
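The four per-sensor features described in the note above could be extracted along these lines. This is an illustrative sketch on a synthetic resistance trace; the function `sensor_features` and the exact feature definitions are hypothetical, not the authors’ code:

```python
import numpy as np

def sensor_features(resistance, baseline, dt=1.0):
    """Extract four features from one sensor's resistance trace during an
    exposure: the relative resistance change at the start, middle, and end
    of the exposure, plus the area under the whole relative-response curve.
    (Illustrative definitions, loosely following the description above.)"""
    rel = (resistance - baseline) / baseline            # delta-R / R0
    start, mid, end = rel[0], rel[len(rel) // 2], rel[-1]
    # trapezoidal-rule area under the relative-response curve
    auc = float(np.sum(rel[1:] + rel[:-1]) * dt / 2)
    return np.array([start, mid, end, auc])

# Example: a synthetic response that rises during exposure and recovers.
t = np.linspace(0.0, 1.0, 101)                          # exposure time (a.u.)
resp = 100.0 * (1.0 + 0.05 * np.sin(np.pi * t))         # resistance in ohms
features = sensor_features(resp, baseline=100.0, dt=t[1] - t[0])
print(features)
```

Stacking these four numbers across all sensors in the array would yield the feature vector that the classifiers operate on.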


Abstract of Diagnosis and Classification of 17 Diseases from 1404 Subjects via Pattern Analysis of Exhaled Molecules

We report on an artificially intelligent nanoarray based on molecularly modified gold nanoparticles and a random network of single-walled carbon nanotubes for noninvasive diagnosis and classification of a number of diseases from exhaled breath. The performance of this artificially intelligent nanoarray was clinically assessed on breath samples collected from 1404 subjects having one of 17 different disease conditions included in the study or having no evidence of any disease (healthy controls). Blind experiments showed that 86% accuracy could be achieved with the artificially intelligent nanoarray, allowing both detection and discrimination between the different disease conditions examined. Analysis of the artificially intelligent nanoarray also showed that each disease has its own unique breathprint, and that the presence of one disease would not screen out others. Cluster analysis showed a reasonable classification power of diseases from the same categories. The effect of confounding clinical and environmental factors on the performance of the nanoarray did not significantly alter the obtained results. The diagnosis and classification power of the nanoarray was also validated by an independent analytical technique, i.e., gas chromatography linked with mass spectrometry. This analysis found that 13 exhaled chemical species, called volatile organic compounds, are associated with certain diseases, and the composition of this assembly of volatile organic compounds differs from one disease to another. Overall, these findings could contribute to one of the most important criteria for successful health intervention in the modern era, viz. easy-to-use, inexpensive (affordable), and miniaturized tools that could also be used for personalized screening, diagnosis, and follow-up of a number of diseases, which can clearly be extended by further development.