http://www.ctvnews.ca/health/pocket-doctor-is-online-health-care-the-way-of-the-future-1.2911803

Pocket doctor: Is online health care the way of the future?

Sometimes, it can take weeks to get an appointment to see your family doctor. If you go to a walk-in clinic, waits can often stretch for hours. In a world where smartphones are changing how we bank, commute and communicate, several companies are now offering Canadians the chance to keep a doctor in their pocket.

“The feedback from patients has been overwhelmingly positive,” Dr. Shazeen Bandukwala of the new Akira app told CTV News. “I hear back from patients saying, ‘you just saved me a two hour walk-in clinic visit.’”

Launched this week in Toronto, Akira is offering virtual medical visits via their smartphone app. Akira doctors can even write prescriptions, order lab tests and refer you to specialists.

Telehealth apps and websites, which already exist in the U.S., have been called the Uber of health care. The convenience factor is exactly why Catalina Lopez signed up for Akira.

“I didn’t have to sit in a waiting room for three hours to see a doctor,” Lopez told CTV News. “It was very convenient and it worked in the way I worked.”

Another service is called Ask the Doctor. Initially an online platform to ask doctors questions, the Canadian company is now offering direct patient-doctor visits in the U.S. and Ontario via text and video streaming.

“We find much of our traffic spikes when family physicians are not available,” Dr. Michael Warner, chief medical officer of Ask the Doctor, told CTV News. “It’s evenings, weekends and Friday afternoons… There are gaps in the current system and we fill those gaps in a way that is convenient for patients.”

Akira is currently available for a monthly fee of $9.99. Ask the Doctor is only available through employer health programs. Both services are hoping to eventually be covered by provincial health care plans. By diverting patients away from walk-in clinics and emergency rooms and by eliminating overhead, both believe they can save our healthcare system money.

“Governments subsidize physicians for their office,” Warner says. “They won’t have to pay for bricks and mortar.”

While both companies say they have secure websites and that all data and test results are secure and private, researchers who are just starting to look at this emerging field say the potential comes with questions.

“It’s proven that hackers can get into anything and take data,” Ryerson University information technology professor Reza Aria says. “We don’t want that data to end up in the hands of the wrong people.”

Although no one believes that a face on a screen can replace a doctor’s steady hands when it comes to serious diseases and emergencies, in a world where consumers like options, a doctor in the pocket may just be another solution.

With a report from CTV’s medical specialist Avis Favaro and producer Elizabeth St. Philip

http://www.independent.co.uk/news/uk/selfie-lovers-are-more-likely-to-overestimate-their-beauty-study-finds-a7041381.html

Selfie-lovers are more likely to overestimate their beauty, study finds

Research showed the public rated selfie-lovers as looking significantly more narcissistic

People who take selfies are likely to overestimate how attractive they are, according to a new study.

Research published in Social Psychological and Personality Science also found that those who love snapping a photo of themselves are more likely to be seen as vain and narcissistic.

Researchers at the University of Toronto recruited 198 college students for an experiment; 100 were self-confessed lovers of selfies, while the remainder rarely took one.

After taking a selfie, and having another photo taken by a third party, they were all asked to rate the snaps.

Members of the public were also asked to judge the photos by how attractive and likable the person looked, but also how narcissistic.

Their findings showed that while both groups thought they looked better in their own photos than in those taken by a third party, regular selfie-takers rated themselves much higher, believing they looked significantly better in photos they took themselves than in ones shot by others.

While researchers judged both groups to be equally narcissistic, members of the public judged the selfie obsessives as looking “significantly more narcissistic” than their camera-shy counterparts.

The researchers concluded that regularly taking selfies may encourage a ‘self-favouring’ bias.

This can be amplified by learning the most flattering angle to take a photo at, or regularly receiving positive feedback from social media.

This gives selfie-takers a distorted, inflated view of their own attractiveness, an effect that can grow over time.

http://www.bbc.co.uk/newsround/36343502

American scientists develop tiny hovering robot insects

Meet the “RoboBee”, a tiny robot which can fly, and perch on the ceiling like a real-life insect!

It was designed by scientists from Harvard’s Microrobotics Laboratory and is about the same size as a ten pence coin.

The robots are special because they use something called ‘electrostatic adhesion’ to perch on the ceiling.

That’s the same thing that happens when you rub a balloon on your jumper and it sticks to walls.

Static electricity

Perching on things allows the robots to save their energy on longer journeys.

Scientists think these robots could be used to help search and rescue teams get to hard-to-reach places after a natural disaster.

This got us thinking about some of the other incredible animal-inspired robots we’ve met on Newsround…

Amazing animal animatronics

Robots inspired by animals

Scientists are turning to the animal world for inspiration for their latest robot designs.

We sent Ricky to check out some of those robots that have been made with a helping hand from the animal kingdom.

He discovered robots which have been inspired by ants, bulls, cheetahs and even fleas!

Super-strong robots with sticky gecko feet!

Inventors explain how their tiny super robots work

These tiny robots can carry up to 2,000 times their own body weight; that’s the same as a human pulling along a blue whale!
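As a rough sanity check on that comparison (assuming a ~70 kg human and a blue whale of around 140 tonnes; both figures are our assumptions, not from the source):

```python
# Rough check of the 2,000x body-weight comparison.
human_kg = 70                       # assumed mass of an adult human
ratio = 2000                        # robots can carry up to 2,000x their own weight
equivalent_kg = human_kg * ratio    # load a human "carrying 2,000x" would pull
print(equivalent_kg / 1000)         # 140.0 tonnes, within the range of a large blue whale
```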

The scientists from Stanford University who invented them were inspired by the way that certain animals move.

They looked at the way a gecko’s feet allow it to stick to surfaces, and how an ant’s feet can help it to carry up to 100 times its own body weight.

http://www.gizmodo.com.au/2016/05/this-amazing-carbon-fiber-pavillion-was-woven-by-a-robot/

This Amazing Carbon Fibre Pavilion Was Woven By A Robot

London’s Victoria & Albert Museum has unveiled an incredibly intricate robotically woven and biologically inspired carbon-fibre pavilion in its courtyard.

Images: V&A Museum

The structure, constructed as part of the museum’s new engineering season, is built up of 40 hexagonal components that cover over 185 square metres. Each panel is made from a combination of transparent glass fibre and black carbon fibre, woven into a fibrous structure inspired by a beetle’s forewing — known as elytra, hence the structure’s name, Elytra Filament Pavilion.
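A quick back-of-envelope figure, using the numbers quoted above (the division is ours, not the museum’s):

```python
# Average coverage per hexagonal panel, from the figures quoted above.
panels = 40
total_coverage_m2 = 185        # "over 185 square metres", so this is a lower bound
per_panel_m2 = total_coverage_m2 / panels
print(per_panel_m2)            # 4.625 square metres of canopy per panel
```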

The panels weigh around 45kg each, and the entire structure weighs just 2.5 tonnes in total. The installation is the brainchild of experimental architect Achim Menges, along with collaborators Moritz Dörstelmann, Jan Knippers and Thomas Auer.

Each panel takes around three hours to be constructed by a computer-programmed Kuka robot. One of those robots will sit within the courtyard during the course of the exhibition. While in situ, it will create new elements that can be added to the structure based on real-time sensed data.

The museum’s Engineering Season is open now and will run until November 6.


http://www.canadianunderwriter.ca/insurance/designated-highway-texting-zones-bill-passes-second-reading-ontario-legislature-1004091839/

Designated highway texting zones bill passes second reading in Ontario legislature

Ontario politicians from all three parties spoke Thursday in favour of an opposition private member’s bill that proposes to allow the transportation minister to “create designated highway texting zones.”

Bill 190, the Safe Texting Zones Act, was referred Thursday to the Standing Committee on Finance and Economic Affairs. Those zones are intended to remind drivers of legal opportunities to use their mobile devices.

If passed into law, Bill 190 would authorize the Minister of Transportation “to create designated highway texting zones where a driver is able to stop safely to use their device,” said Vic Fedeli, Progressive Conservative MPP for Nipissing, in the legislature. “This includes existing commuter parking lots, transit stations or service stations, and does not require any new infrastructure.”

Fedeli, the PC finance critic, tabled Bill 190 April 20.

“The real impetus of this bill would require that signage be displayed approaching these texting zones,” Fedeli said May 19 during second reading of Bill 190. “These would remind drivers that there is a nearby opportunity for them to legally and safely use their cellphone. Of course, people can still use their hand-held devices if a vehicle is pulled off a roadway or lawfully parked. This bill would designate specific areas to do exactly that, to assist drivers in obeying the law.”

One government MPP – Liberal Granville Anderson, who represents the riding of Durham – said Thursday he supports Bill 190.

“With the passage of Bill 31, the Making Ontario’s Roads Safer Act, this past June, drivers now face stiffer fines and penalties upon conviction,” Anderson told the legislature at Queen’s Park in Toronto. “I agree with [Fedeli] that that’s simply not enough and we have to do more to deter drivers from texting while driving.”

Bill 31, the Transportation Statute Law Amendment Act (Making Ontario’s Roads Safer), increases the maximum fine for distracted driving to $1,000. Bill 31 also provides for administrative licence suspensions “where a person is driving a motor vehicle or operating a vessel while impaired by a drug or by a combination of a drug and alcohol.”

Other changes to the Highway Traffic Act, from Bill 31, include a new rule that requires drivers to “remain stopped at a pedestrian crossover or school crossing until the person crossing the street and the school crossing guard are off the roadway.” Before, drivers were allowed to proceed once the pedestrian or crossing guard were no longer on the driver’s half of the roadway, the provincial government states in an explanatory note to Bill 31.

“Texting while driving cannot be done under any circumstances,” Anderson said Thursday. “The reality is, the chance of an accident dramatically increases the second you take your eyes off the road, and it’s just not worth the risk.”

People of all ages, including senior citizens and young people, are texting and driving, suggested Wayne Gates, transportation critic for the New Democratic Party and MPP for Niagara Falls.

“Even though a person might think they’re just going to read a text or maybe they’ll type a little bit and then look at the road and then go back to typing, it doesn’t make a difference,” Gates said. “We know that people come up with all sorts of plans that they use to convince themselves they’re just sending one more text, until it goes wrong. It just happens too fast to be able to react.”

It takes about five seconds to send or read a text message, suggested Monique Taylor, NDP MPP for Hamilton Mountain.

“It doesn’t sound like a lot of time, but if you’re travelling at 90 kilometres per hour, that’s enough time to take you from one end of a football field to the other,” Taylor said.
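Taylor’s comparison is easy to verify with basic kinematics (treating a football field as roughly 100 metres between goal lines, an assumption on our part):

```python
# Distance covered during a 5-second glance at a phone, at highway speed.
speed_kmh = 90
speed_ms = speed_kmh * 1000 / 3600      # 90 km/h works out to 25 m/s
seconds = 5                             # typical time to send or read a text
distance_m = speed_ms * seconds
print(distance_m)                       # 125.0 m, more than a ~100 m football field
```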

“Drivers who send text messages are 23 times more likely to be involved in a crash,” Fedeli said. “Economic losses caused by traffic-collision-related health care costs and lost productivity are at least $10 billion annually. That’s about 1% of our [gross domestic product].”

Fedeli noted that texting zones were established in New York State by Governor Andrew Cuomo.

“Existing park-and-ride facilities, rest stops and parking areas along the roads were to be equipped with texting zone signage, each serving double duty as one of the 91 locations across the state,” Fedeli said. “When introducing the initiative, Governor Cuomo said, ‘With this new effort, we are sending a clear message to drivers that there is no excuse to take your hands off the wheel and eyes off the road, because your texts can wait until the next texting zone.’ I think that’s something we can all agree on.”

https://www.sciencedaily.com/releases/2016/05/160519161227.htm

Large-scale technique to produce quantum dots

Date:
May 19, 2016
Source:
DOE/Oak Ridge National Laboratory
Summary:
Scientists have demonstrated a method to produce significant amounts of semiconducting nanoparticles for light-emitting displays, sensors, solar panels and biomedical applications.
Using this 250-gallon reactor, ORNL researchers produced three-fourths of a pound of zinc sulfide quantum dots, shown in the inset.
Credit: ORNL

A method to produce significant amounts of semiconducting nanoparticles for light-emitting displays, sensors, solar panels and biomedical applications has gained momentum with a demonstration by researchers at the Department of Energy’s Oak Ridge National Laboratory.

While zinc sulfide nanoparticles — a type of quantum dot that is a semiconductor — have many potential applications, high cost and limited availability have been obstacles to their widespread use. That could change, however, because of a scalable ORNL technique outlined in a paper published in Applied Microbiology and Biotechnology.

Unlike conventional inorganic approaches that use expensive precursors, toxic chemicals, high temperatures and high pressures, a team led by ORNL’s Ji-Won Moon used bacteria fed by inexpensive sugar at a temperature of 150 degrees Fahrenheit in 25- and 250-gallon reactors. Ultimately, the team produced about three-fourths of a pound of zinc sulfide nanoparticles — without process optimization, leaving room for even higher yields.
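For metric readers, the reported conditions convert as follows (straightforward unit conversions on our part, not figures from the source):

```python
# Convert the reported reactor temperature and yield to metric units.
temp_f = 150
temp_c = (temp_f - 32) * 5 / 9          # about 65.6 degrees Celsius
yield_lb = 0.75                         # "three-fourths of a pound"
yield_g = yield_lb * 453.592            # grams per pound conversion factor
print(round(temp_c, 1), round(yield_g)) # 65.6 340
```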

The ORNL biomanufacturing technique is based on a platform technology that can also produce nanometer-size semiconducting materials as well as magnetic, photovoltaic, catalytic and phosphor materials. Unlike most biological synthesis technologies that occur inside the cell, ORNL’s biomanufactured quantum dot synthesis occurs outside of the cells. As a result, the nanomaterials are produced as loose particles that are easy to separate through simple washing and centrifuging.

The results are encouraging, according to Moon, who also noted that the ORNL approach reduces production costs by approximately 90 percent compared to other methods.

“Since biomanufacturing can control the quantum dot diameter, it is possible to produce a wide range of specifically tuned semiconducting nanomaterials, making them attractive for a variety of applications that include electronics, displays, solar cells, computer memory, energy storage, printed electronics and bio-imaging,” Moon said.

Successful biomanufacturing of light-emitting or semiconducting nanoparticles requires the ability to control material synthesis at the nanometer scale with sufficiently high reliability, reproducibility and yield to be cost effective. With the ORNL approach, Moon said that goal has been achieved.

Researchers envision their quantum dots being used initially in buffer layers of photovoltaic cells and other thin film-based devices that can benefit from their electro-optical properties as light-emitting materials.


Story Source:

The above post is reprinted from materials provided by DOE/Oak Ridge National Laboratory. Note: Materials may be edited for content and length.


Journal Reference:

  1. Ji-Won Moon, Tommy J. Phelps, Curtis L. Fitzgerald Jr., Randall F. Lind, James G. Elkins, Gyoung Gug Jang, Pooran C. Joshi, Michelle Kidder, Beth L. Armstrong, Thomas R. Watkins, Ilia N. Ivanov, David E. Graham. Manufacturing demonstration of microbially mediated zinc sulfide nanoparticles in pilot-plant scale reactors. Applied Microbiology and Biotechnology, 2016; DOI: 10.1007/s00253-016-7556-y

Cite This Page:

DOE/Oak Ridge National Laboratory. “Large-scale technique to produce quantum dots.” ScienceDaily. ScienceDaily, 19 May 2016. <www.sciencedaily.com/releases/2016/05/160519161227.htm>.

http://mobilesyrup.com/2016/05/20/google-and-levis-are-making-a-touch-sensitive-jacket-that-controls-your-phone/

Google and Levi’s are making a touch-sensitive jacket that controls your phone

Rose Behar

May 20, 2016 4:39pm

Google’s Advanced Technology and Projects (ATAP) group is partnering with Levi’s to produce a touch and gesture sensitive jacket using Project Jacquard’s specially-designed conductive yarn.

The yarn created by Project Jacquard combines a thin, metallic alloy with natural and synthetic yarns like cotton, polyester or silk. It can be woven through an entire piece of clothing, or in one part of the garment as a dedicated touch surface.

Levi’s uses the latter format in its black trucker jacket, which is part of the “Commuter” line and would look at home in a James Bond flick. The technology is all in the sleeve. A video released by Levi’s shows some potential use cases. Its example follows a man popping in some earbuds and getting on his bike. While riding, he can tap and make gestures on his sleeve to get directions, change his music and control calls.

However high-tech the jacket may seem, Google hastened to assure that it can be worn and used just like any other jacket. The only thing to take into account is removing the computer chips from the sleeve’s button loop before washing. Google states the jacket will be available as a limited beta this fall, with a wider release scheduled for spring 2017.

Project Jacquard has made exceptional progress considering its technology was only announced at the 2015 Google I/O developer conference. Who knows what we’ll see next from other textile partners such as Savile Row. Hopefully for all the tech-savvy women out there, some ladieswear will be available sooner rather than later.

http://www.slashgear.com/see-googles-soli-make-apple-watch-look-like-a-relic-20440950/

 

Google has re-introduced us to Project Soli, a gesture-recognition tech that’s now small enough to fit in a smartwatch. What’s that got to do with Apple Watch, you might be wondering? The gestures they’ve demonstrated today include a call-back to what we mentioned back in May of 2015: that the Apple Watch Digital Crown would be out-done by Google’s Project Soli for smartwatches when the time came. That time is now.

The Apple Watch’s Digital Crown lets you move through screens and lists, and you can select items by tapping the device. It’s simple. What Google’s new implementation of radar technology in Project Soli does is move your hand away from the watch entirely – you don’t have to touch it at all.

Google presented some smartwatch concepts for Soli a while ago. Now those concepts are coming to the real world.

Google is working with LG on this device. This is a prototype LG watch that looks similar to the LG Watch Urbane. In this device is a full-powered Soli sensor – one that no longer needs a full-powered PC to function.

This is not yet a consumer product, but chances are you’ll see it turn into one before too long.

Google’s ATAP presentation today also expanded our understanding of Soli as a platform – made for gesture sensing for smart devices.

Google’s Project Soli can be described simply as “a radar for gesture sensing.”

Simple.

Easy enough for anyone to understand once it’s already integrated into a device. While the first Soli dev kits needed a full-sized PC to work, and were about the size of a square business card, today Google revealed what they’d done to make it easier. They’ve reduced the size and power requirements enough that Soli can now run on a smartwatch.

Above you’ll see a brand new video showing developers working with the Soli Alpha developer kit. These projects just begin to scratch the surface of what’s possible with the chip. Stick around our new Soli tag portal for more – same with ATAP!

https://www.raspberrypi.org/blog/raspberry-pi-cloud-vision-google-io/

 

RASPBERRY PI WITH CLOUD VISION AT GOOGLE I/O

Matt visited Google I/O yesterday, and sent back some pretty incredible pictures. This event looks more like a music festival than a tech conference.

Google I/O

He was sending pictures and excited snippets of text back to Pi Towers all through the event, and then, when he got home, shared this video. I’ve been so excited about it that I’ve had it playing on repeat, and we all thought you’d like to see it too.

This is a demo of a Raspberry Pi robot working with Google’s Cloud Vision API – and it’s got such potential for your projects.

The robot is taking pictures and sending them to the cloud, where they’re analysed and sent back in real time. There’s facial detection – along with detection of what emotion is showing on those faces. And Cloud Vision offers you image recognition, so you should be able to get your robot to distinguish limes from green apples. You can then get the robot to act on that data – so you could set it to gather apples and not limes, for example.
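To make the apples-versus-limes idea concrete, here’s a minimal sketch of the decision step, assuming label results (description/score pairs, as Cloud Vision’s label detection returns) have already come back from the API. The example labels, scores and threshold are made-up illustrations, not from the source:

```python
# Pick a fruit from image-recognition labels so the robot can act on it.
# Labels are (description, score) pairs, e.g. from Cloud Vision label detection.

def classify_fruit(labels, threshold=0.6):
    """Return 'apple', 'lime', or None from a list of (description, score)."""
    for description, score in sorted(labels, key=lambda pair: -pair[1]):
        if score < threshold:
            break                      # ignore low-confidence labels
        name = description.lower()
        if "apple" in name:
            return "apple"
        if "lime" in name:
            return "lime"
    return None                        # nothing recognisable; robot does nothing

# Hypothetical label sets for two photos the robot took:
print(classify_fruit([("Granny Smith apple", 0.92), ("Fruit", 0.88)]))  # apple
print(classify_fruit([("Key lime", 0.81), ("Citrus fruit", 0.77)]))     # lime
```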

Cloud vision on a Pi robot

We’re pretty excited about the opportunities this API offers makers of all kinds of Raspberry Pi devices. You can learn more here – please let us know if you start integrating it into your own projects!