http://www.businessinsider.com/visualdx-machine-learning-app-for-skin-diagnosis-ceo-interview-2017-11

Apple CEO Tim Cook gave a shout-out to a $100-per-year app for doctors — here’s what it does

  • Advances in machine learning now mean that doctors can take a photo and identify the disease or condition depicted.
  • Apple is a fan of one specific app, VisualDx, that uses new machine learning software to assist with diagnosis on an iPhone. 
  • VisualDx has built a database of 32,000 high-quality medical images. 

Apple CEO Tim Cook isn’t a doctor, but he talked about a piece of medical software, VisualDx, during Apple’s most recent earnings call.

It was an interesting choice of app to highlight. Apple has deep ambitions to break into the health and medical worlds, but although VisualDx is available to consumers through the Apple App Store, it’s not really an app for the public. It’s targeted at trained and credentialed doctors, who can use it to help diagnose skin conditions and disorders.

This fall, the app has gotten a new trick — it can use an iPhone’s camera and machine learning to automatically identify skin conditions, or at least narrow down what they could be. Snap a picture in the app, and it will return a list of conditions the rash could be, based on decades of medical research.

“VisualDx is even pioneering new health diagnostics with Core ML, automating skin image analysis to assist dermatologists with their diagnoses,” Cook said.

In some ways, VisualDx offers a hint of the future of medicine, where a hot topic of conversation is whether sufficiently advanced computers and artificial intelligence could automate one of the core parts of what doctors do: identifying what the problem is.

In the future, some of this technology will trickle down to consumers, giving them the ability to snap photos of their own bodies, answer some questions, and ultimately figure out whether it’s a problem that requires medical attention, or simply a common rash, VisualDx CEO and medical doctor Art Papier tells Business Insider. VisualDx is currently developing a version of this tool, called Aysa, for common skin disorders.

“Consumers don’t want to play doctor themselves. But on a Saturday, they want to know, do I need to go to the emergency room with my child or can this wait until Monday when I could see my pediatrician,” Papier said.

“It’s really education and triage. It’s not diagnosis; we don’t believe in that,” he continued. “At least in the next few years, we’re not going to tell patients you’re totally OK, you don’t need to see a doctor.”

On-device

Dr. Art Papier.

Apple’s CEO mentioned VisualDx because the app uses Core ML, a new Apple framework that makes it possible to run machine learning models on the phone itself, instead of uploading photos to a server for processing.

“Our clients are hospitals and they really don’t like the idea of a doctor in their hospital taking a picture of a patient and then sending the picture to a third party or a private company,” Papier said.

“We realized when Apple announced CoreML in June, they announced that you can move your models onto the iPhone,” he continued. “Now the image gets analyzed on the phone, and the image never goes up to the cloud to us. We never see the image.”

Even so, the software can return an analysis in a second on a newer iPhone. The identification neural network is “trained” by researchers at VisualDx, but it runs entirely on the phone, Papier said.

The models are trained using VisualDx’s own library of professional medical images, Papier said.

“We’re not like a wiki model where you know anyone can upload images to us and just tell us what they think they are,” Papier said. “We’re very very selective to work with the universities and individual physicians we know are experts.”

Many of VisualDx’s images were scanned from old collections of slides and film, from leading departments and doctors. It’s built a library of 32,000 images to train its models. “We developed this reputation as somebody that was going to preserve the legacy of medical photography,” Papier said.

Still, even with high-quality models and training data, Papier doesn’t think completely automated diagnosis will happen anytime soon. “The hype cycle right now for machine learning is off the charts,” he said.

“Machine learning will get you into a category on this. To get to the final mile, you have to ask the patient: did you take a drug a week ago? Did you travel?” he said.

Here’s what VisualDx does:

VisualDx is intended for use by doctors to confirm and validate diagnoses. It allows doctors to search by symptoms, signs, and other patient factors.

Medical professionals can download the app from the App Store. If your institution doesn’t have a subscription, in-app purchases start at $99 for a year’s access to DermExpert; a complete subscription, with access to the rest of VisualDx’s medical content, is $499 per year.

The newest feature in the app is called DermExpert, which allows doctors to take a photo to help diagnose a skin condition.

The photo is completely analyzed on an iPhone or iPad, and it isn’t sent to the cloud.

It then returns multiple suggestions as to what the problem could be as well as one “best match.”

It allows doctors to add additional symptoms to improve the diagnosis.

And doctors can review other possibilities with photos, as well.

The app also includes other clinical content, including information about various disorders, treatment, and additional images.
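The photo-then-refine flow described above can be sketched in a few lines of Python. This is a hypothetical toy illustration, not VisualDx's actual model, data, or API; every condition name, score, and weight below is invented.

```python
# Hypothetical toy sketch of a ranked-differential step, loosely modeled on
# the behavior the article describes. NOT VisualDx's real model or API:
# all condition names, scores, and weights here are invented.

def rank_differential(image_scores, reported_symptoms=None, symptom_weights=None):
    """Return (best_match, ranked_list) from per-condition confidences.

    image_scores: dict of condition -> classifier confidence (0..1).
    reported_symptoms: optional list of symptom strings the doctor adds.
    symptom_weights: dict of (condition, symptom) -> score multiplier.
    """
    scores = dict(image_scores)
    if reported_symptoms and symptom_weights:
        for condition in scores:
            for symptom in reported_symptoms:
                scores[condition] *= symptom_weights.get((condition, symptom), 1.0)
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    return ranked[0][0], ranked

# Invented example: the image alone favors "contact dermatitis", but adding
# the symptom "fever" boosts "cellulitis" to the top of the differential.
image_scores = {"contact dermatitis": 0.55, "cellulitis": 0.40, "psoriasis": 0.05}
weights = {("cellulitis", "fever"): 2.0}

best, ranked = rank_differential(image_scores)
best_with_symptom, _ = rank_differential(image_scores, ["fever"], weights)
```

With the invented numbers above, `best` is "contact dermatitis" while `best_with_symptom` is "cellulitis", mirroring how the app's "best match" can change as the doctor supplies more context.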

It’s only for doctors now, but the company is working on an app called Aysa that can help consumers identify skin conditions.


https://www.thesun.co.uk/tech/4943624/robot-doctor-medical-exam-china-beijing/

Robot DOCTORS step closer to reality after machine passes his medical exams

Robot Xiao Yi has ‘mastered’ all required skills to solve medical problems, after passing China’s National Medical Licensing Exam

Xiao Yi “mastered” all required medical skills, scoring 465 out of 600 points in China’s National Medical Licensing Examination.

 Robot Xiao-Yi passed medical examinations with flying colours and has the skills to become a doctor


Researchers from Tsinghua University in Beijing spent the last 12 months inputting information from dozens of medical books into his brain.

During the exam, the robo doc was forced to show a “reasoning process” when deciding how to treat symptoms, instead of simply regurgitating information.

Wu Ji, deputy head of Tsinghua’s electronic engineering department, said: “Its score is 96 points above the acceptance line.

“This shows that it has indeed mastered the medical knowledge and clinical knowledge, and it has owned the basic ability to employ the knowledge to solve some problems.”

 Xiao Yi passed the national medical exam better than his handlers at Tsinghua University had expected


Although the robot has flaunted his superior wealth of knowledge, researchers have warned he still has a long way to go before becoming a qualified doctor.

But Wu said the examinations the robot undertook can help doctors to “avoid risks”.

http://www.kurzweilai.net/disturbing-video-depicts-near-future-ubiquitous-lethal-autonomous-weapons

Disturbing video depicts near-future ubiquitous lethal autonomous weapons

The technology described in the film already exists, says UC Berkeley AI researcher Stuart Russell
November 18, 2017


Campaign to Stop Killer Robots | Slaughterbots

In response to growing concerns about autonomous weapons, the Campaign to Stop Killer Robots, a coalition of AI researchers and advocacy organizations, has released a fictional video that depicts a disturbing future in which lethal autonomous weapons have become cheap and ubiquitous worldwide.

UC Berkeley AI researcher Stuart Russell presented the video at the United Nations Convention on Certain Conventional Weapons in Geneva, hosted by the Campaign to Stop Killer Robots earlier this week. Russell, in an appearance at the end of the video, warns that the technology described in the film already exists* and that the window to act is closing fast.

Support for a ban on autonomous weapons has been mounting. On Nov. 2, more than 200 Canadian scientists and more than 100 Australian scientists in academia and industry penned open letters to Canadian Prime Minister Justin Trudeau and Australian Prime Minister Malcolm Turnbull, respectively, urging them to support the ban.

Earlier this summer, more than 130 leaders of AI companies signed a letter in support of this week’s discussions. These letters follow a 2015 open letter released by the Future of Life Institute and signed by more than 20,000 AI/robotics researchers and others, including Elon Musk and Stephen Hawking.

“Many of the world’s leading AI researchers worry that if these autonomous weapons are ever developed, they could dramatically lower the threshold for armed conflict, ease and cheapen the taking of human life, empower terrorists, and create global instability,” according to an article published by the Future of Life Institute, which funded the video. “The U.S. and other nations have used drones and semi-automated systems to carry out attacks for several years now, but fully removing a human from the loop is at odds with international humanitarian and human rights law.”

“The Campaign to Stop Killer Robots is not trying to stifle innovation in artificial intelligence and robotics, and it does not wish to ban autonomous systems in the civilian or military world,” explained Noel Sharkey of the International Committee for Robot Arms Control. “Rather, we see an urgent need to prevent automation of the critical functions for selecting targets and applying violent force without human deliberation, and to ensure meaningful human control for every attack.”


* As suggested in this U.S. Department of Defense video:


Perdix Drone Swarm – Fighters Release Hive-mind-controlled Weapon UAVs in Air | U.S. Naval Air Systems Command

https://www.engadget.com/2017/11/18/apple-imac-pro-includes-a10-chip-from-iphone/

Apple’s iMac Pro may have hands-free Siri voice control

It includes a version of the A10 chip from the iPhone 7.

Those rumors of Apple using custom ARM chips for more features inside Macs? They’re true… and you might not have to wait long to witness it in action. Jonathan Levin has combed through BridgeOS code that should accompany the iMac Pro, and it looks as if Apple will be using a cut-down version of the iPhone 7’s A10 Fusion chip as a co-processor. While its full functionality isn’t clear yet, developer Steve Troughton-Smith notes that the A10 appears to handle macOS’ boot and security processes, such as passing firmware to the main Xeon processor and managing media copy protection. More importantly, Guilherme Rambo has found references to “hey Siri” support — as with Cortana on Windows 10, you might not have to click an icon or invoke a keyboard shortcut just to ask about the weather.

It’s possible that the A10 chip is always running, which would represent a break from the custom T1 chip driving the Touch Bar in some recent MacBook Pro models. That would line up with leaks which had custom ARM chips handling macOS’ Power Nap feature, which fetches data while the system is sleeping. The iMac Pro wouldn’t have to spool up its power-hungry Xeon processor just to grab your email. And of course, that could be very important for “hey Siri” commands, which could theoretically run even if the Mac is dormant.

The findings don’t necessarily mean that the iMac or macOS will be more restrictive than before. You may not know exactly what the A10 does until the iMac Pro ships, which is expected before the end of 2017. However, this does show that Apple is putting its mobile experience to use in the computer realm, even if it’s just to save power and processing overhead. And the iMac Pro may be a good testbed for this functionality. As Troughton-Smith points out, Apple could gauge how the companion A10 works without the bulk of users “freaking out.” You could get a more refined experience if and when the A10 finds its way to mainstream Macs — particularly MacBooks, where the chip could let Siri and Power Nap run without significantly affecting your battery life.

http://www.dailygalaxy.com/my_weblog/2017/11/seeing-through-the-big-bang-into-another-universe-ligo-gravitational-wave-discovery-may-confirm-an-o.html

“Seeing Through the Big Bang Into Another Universe” –LIGO Gravitational Wave Discovery May Confirm an Outrageous ‘New’ Cosmology (WATCH Weekend ‘Galaxy’ Stream)


“Your theory is crazy, but it’s not crazy enough to be true,” said the great Danish physicist Niels Bohr. Enter Sir Roger Penrose. Correlated noise in the two LIGO gravitational-wave detectors may provide evidence that the universe is governed by conformal cyclic cosmology (CCC), which assumes that the universe consists of a succession of aeons, “the boundaries of infinity,” says Penrose of the University of Oxford. “The Big Bang was not the origin of our universe,” he observed.

Penrose proposes that there was an aeon before the Big Bang. On this view, the apparent noise is actually a real signal: gravitational waves generated by the decay of hypothetical dark-matter particles from a previous aeon, which CCC predicts and whose traces may also be visible in the cosmic microwave background, the electromagnetic radiation left over from an early stage of the universe in Big Bang cosmology.


Penrose argues that a significant amount of this noise could be a signal of astrophysical or cosmological origin – specifically, of CCC.


Physicists at the Niels Bohr Institute, writes Hamish Johnston, editor of physicsworld.com, pointed out that some of the noise in the two LIGO detectors appears to be correlated – with a delay that corresponds to the time it takes for a gravitational wave to travel the more than 3000 kilometers between the instruments.

First proposed over a decade ago by Penrose, CCC assumes that each aeon begins with a big bang and proceeds into an unending future in which the universe expands at an accelerating rate. As this expansion becomes infinitely large, Penrose argues that it can be transformed back into the next big bang.

Penrose, Johnston writes, says that a “reasonably robust implication of CCC” is that dark matter consists of particles called erebons – the name deriving from Erebos, the Greek god of darkness. As dark matter goes, erebons are extremely heavy, with masses of about 10⁻⁵ g. This is roughly the Planck mass, on a par with a fine grain of sand and about 19 orders of magnitude heavier than a proton.
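For scale, the Planck mass referred to here is the natural mass unit built from fundamental constants, and the comparison with the proton works out as:

```latex
m_P = \sqrt{\frac{\hbar c}{G}} \approx 2.18 \times 10^{-5}\ \mathrm{g},
\qquad
\frac{m_P}{m_{\mathrm{proton}}} \approx \frac{2.18 \times 10^{-5}\ \mathrm{g}}{1.67 \times 10^{-24}\ \mathrm{g}} \approx 1.3 \times 10^{19}
```

So an erebon at roughly the Planck mass would outweigh a proton by about 19 orders of magnitude while still weighing no more than a fine grain of sand.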

When an erebon decays, Penrose states, it deposits all of its energy into a gravitational wave at frequencies well above LIGO’s detection capabilities; the decay would be detected and recorded as a near-instantaneous impulse that could be mistaken for noise rather than a signal from the birth of the cosmos.

https://globalnews.ca/news/3868329/nasa-rising-sea-levels-new-york-london/

NASA tool shows how rising sea levels will affect New York, London, other coastal cities

Small iceberg in an ice fjord is seen in southern Greenland. Sea levels are projected to rise by between one and four feet around the world by 2100.


A new tool developed by NASA scientists can pinpoint which glaciers and ice sheets are contributing to changing sea levels in almost 300 coastal cities.

The Gradient Fingerprint Mapping Simulation allows users to spin a virtual globe and select one of hundreds of coastal cities, which will then produce a small blurb about how much sea levels are projected to rise in the area and the name of the contributing glacier.

NASA launches a new tool that reveals projected sea level rise by coastal city.


The accompanying study analyzes 293 major port cities and aims to let coastal planners accurately calculate local sea level changes, as well as know which melting parts of the Earth’s ice sheets and glaciers present the biggest risks to each region.

“This study allows one person to understand which icy areas of the world will contribute most significantly to sea level change [rise or decrease] in their specific city,” Eric Larour, the project lead for NASA JPL’s Ice Sheet System Model (ISSM) and lead author of the study, told CNN.
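The idea behind gradient fingerprint mapping can be illustrated with a toy calculation: each city's projected local change is a weighted sum over ice sources, with city-specific sensitivities ("fingerprints"). The numbers below are invented for illustration and are not NASA's data; only the structure of the calculation reflects the tool described.

```python
# Toy illustration (not NASA's data or code) of the calculation behind
# gradient fingerprint mapping: a city's local sea-level change is a
# weighted sum of contributions from individual ice sources, with weights
# ("fingerprints") that differ from city to city. All numbers are invented.

def local_sea_level_change(fingerprints, mass_loss):
    """Sum each ice source's contribution to one city's local sea level.

    fingerprints: dict of ice source -> mm of local rise per Gt of ice lost.
    mass_loss: dict of ice source -> Gt of ice lost.
    """
    return sum(rate * mass_loss.get(source, 0.0)
               for source, rate in fingerprints.items())

# Invented sensitivities. A real feature of fingerprint physics: melting ice
# raises sea level least near the ice itself, so a distant source (here,
# West Antarctica for New York) can matter more than a nearby one.
fingerprints = {
    "New York": {"Greenland": 0.001, "West Antarctica": 0.003},
    "London":   {"Greenland": 0.002, "West Antarctica": 0.0025},
}
mass_loss = {"Greenland": 280.0, "West Antarctica": 160.0}  # Gt, invented

ny_rise = local_sea_level_change(fingerprints["New York"], mass_loss)
london_rise = local_sea_level_change(fingerprints["London"], mass_loss)
```

With these made-up inputs, the two cities get different local changes from the same global melt, which is exactly the point of computing per-city fingerprints.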

https://9to5mac.com/2017/11/16/end-of-watch-os-1-apps-2018/

Apple will require new Apple Watch apps to be native starting next year

Apple today shared a news update encouraging developers to update their Apple Watch apps for watchOS 4, and noted that watchOS 1 apps will no longer be accepted starting next year.

Apple notes the benefits that come along with updating apps for watchOS 4.

Take advantage of increased performance, new background modes for navigation and audio recording, built-in altimeter capabilities, direct connections to accessories with Core Bluetooth, and more. In addition, the size limit of a watchOS app bundle has increased from 50 MB to 75 MB.

The deadline for all Apple Watch apps to be native (built with the watchOS 2 SDK or later) will be April 1st next year.

Please note that starting April 1, 2018, updates to watchOS 1 apps will no longer be accepted. Updates must be native apps built with the watchOS 2 SDK or later. New watchOS apps should be built with the watchOS 4 SDK or later.



http://business.financialpost.com/news/retail-marketing/loblaw-says-it-ordered-25-tesla-electric-trucks-wants-fully-electric-fleet-by-2030

Loblaw says it ordered 25 Tesla electric trucks, wants fully electric fleet by 2030

Tesla unveiled the trucks, expected to begin production in 2019, on Thursday

Canadian grocery chain Loblaw Cos has placed an order for 25 of Tesla Inc’s new electric heavy duty trucks, a Loblaw spokeswoman said in a statement sent by email.

Tesla unveiled the trucks, expected to begin production in 2019, on Thursday.

The order furthers Loblaw’s goal of having a fully electric corporate fleet by 2030, the spokeswoman, Catherine Thomas, said. She did not disclose the cost of the trucks.