New device yields close-up look at cancer metastasis

October 31, 2014

Engineers at the Johns Hopkins Institute for NanoBioTechnology (INBT) have invented a lab device that gives cancer researchers an unprecedented microscopic look at metastasis, the spread of tumor cells that causes more than 90 percent of cancer-related deaths. The device, described in a paper in the journal Cancer Research, was built with the goal of eventually stopping that spread.

“There’s still so much we don’t know about exactly how tumor cells migrate through the body, partly because, even using our best imaging technology, we haven’t been able to see precisely how these individual cells move into blood vessels,” said Andrew D. Wong, a Department of Materials Science and Engineering doctoral student and lead author of the journal article. “Our new tool gives us a clearer, close-up look at this process.”

The device replicates the metastatic process in a small transparent chip that incorporates an artificial blood vessel and surrounding tissue material. A nutrient-rich solution flows through the artificial vessel, mimicking the properties of blood.

With this novel lab platform, Wong said, the team was able to record a video of the movement of individual cancer cells as they crawled through a three-dimensional collagen matrix. This material resembles the human tissue that surrounds tumors when cancer cells break away and try to relocate elsewhere in the body. This process is called “invasion.”
Wong also created a video of single cancer cells prying and pushing their way through the wall of an artificial vessel lined with human endothelial cells, the same kind that line human blood vessels.

By entering the bloodstream through this process, called “intravasation,” cancer cells are able to hitch a ride to other parts of the body and begin to form deadly new tumors.

The breast cancer cells, inserted individually and in clusters in the tissue near the vessel, are labeled with fluorescent tags, enabling their behavior to be seen, tracked and recorded via a microscopic viewing system.

Wong’s doctoral advisor, Peter Searson, the Joseph R. and Lynn C. Reynolds Professor of Materials Science and Engineering and director of the INBT, said Wong took on this challenging project nearly five years ago—and ultimately produced impressive results.

“In the past, it’s been virtually impossible to see the steps involved in this process with this level of clarity,” Searson said. “We’ve taken a significant leap forward.”

This improved view should give cancer researchers a much clearer look at the complex physical and biochemical interplay that takes place when cells leave a tumor, move through the surrounding tissue and approach a blood vessel. For example, the new lab device enabled the inventors to see detailed images of a cancer cell as it found a weak spot in the vessel wall, exerted pressure on it and squeezed through far enough so that the force of the passing current swept it into the circulating fluid.

“This device allows us to look at the major steps of metastasis as well as to test different treatment strategies at a relatively fast pace,” Wong said. “If we can find a way to stop one of these steps in the metastatic cascade, we may be able to find a new strategy to slow down or even stop the spread of cancer.”

Next, the researchers plan to use the device to test various cancer-fighting drugs, to get a better look at how the medications perform and how they might be improved. A provisional patent has been obtained through the Johns Hopkins Technology Transfer office.

Wong’s work has been supported by an INBT training grant. Development of the cancer research device was supported by the National Institutes of Health.

Watson to help find new sources of oil: World’s first cognitive-technologies collaboration for oil industry applications

October 30, 2014

Scientists at IBM and Repsol SA, Spain’s largest energy company, announced today (Oct. 30) the world’s first research collaboration to use cognitive technologies such as IBM’s Watson to jointly develop and apply new tools that make it cheaper and easier to find new oil fields.

An engineer typically has to read manually through an enormous set of journal papers and baseline reports containing models of reservoir, well, facilities, production, export, and seismic imaging data.

IBM says its cognitive technologies could help by analyzing hundreds of thousands of papers, prioritizing data, and linking that data to the specific decision at hand. The technology will introduce “new real-time factors to be considered, such as current news events around economic instability, political unrest, and natural disasters.”

Dealing with big-data complexity
The oil and gas industry boasts some of the most advanced geological, geophysical and chemical science in the world. But the challenge is to integrate critical geopolitical, economic, and other global news into decisions. And that will require a whole new approach to computing that can speed access to business insights, enhance strategic decision-making, and drive productivity, IBM says.

This goes beyond the capabilities of Watson. But scientists at IBM’s Cognitive Environments Laboratory (CEL), collaborating with Repsol, plan to develop and apply new prototype cognitive tools for real-world use cases in the oil and gas industry. They will experiment with a combination of traditional and new interfaces based upon spoken dialog, gesture, robotics and advanced visualization and navigation techniques.

The objective is to build conceptual and geological models, highlight the impact of potential risks and uncertainty, visualize trade-offs, and explore what-if scenarios to ensure the best decision is made, IBM says.

Repsol is making an initial investment of $15 million to $20 million to develop two applications targeted for next year, Repsol’s director for exploration and production technology, Santiago Quesada, explained to Bloomberg Businessweek. “One app will be used for oil exploration and the other to help determine the most attractive oil and gas assets to buy.”

What running birds can teach robots

With an eye toward making better running robots, researchers from Oregon State University, the Royal Veterinary College and other institutions have made surprising new findings about some of nature’s most energy-efficient bipeds — running birds.

These are some of the most sophisticated runners of any two-legged land animals, including humans, the researchers found in a study published Wednesday (Oct. 29) in the Journal of Experimental Biology, with an impressive ability to run while minimizing energy cost, avoiding falls or injury, and maintaining speed and direction.

“Birds appear to be the best of bipedal terrestrial runners, with a speed and agility that may trace back 230 million years to their dinosaur ancestors,” said Jonathan Hurst, an associate professor and robotics expert in the OSU College of Engineering.

In the wild, an injury could lead to predation and death; likewise, when food resources are limited, economy of motion is essential.

Surprisingly, a wide variety of ground-running bird species with very different body sizes use essentially the same strategy to accomplish these sometimes conflicting tasks. To hop over obstacles on uneven ground, they use a motion that’s about 70 percent a “vaulting” movement as they approach the obstacle, and 30 percent a more crouched posture while on top of the obstacle.

In collaboration with Monica Daley at the Royal Veterinary College in London, the researchers studied five species of birds and developed a computer model in OSU’s Dynamic Robotics Laboratory that closely matches that behavior.

The researchers began the study with a hypothesis that body stability would be a priority, since it might help avoid falls and leg injuries. But that’s not what they found. Instead, running birds have a different definition of stability — they allow their upper bodies to bounce around some, just so long as they don’t fall.

Large animals are limited by the strength of their legs, because peak loads increase with body mass, and they run with somewhat straighter legs to compensate. But the basic approach large birds use to run is similar to that of much smaller birds, and remains highly efficient.

Modern robots, by contrast, are usually built with an emphasis on total stability, which often includes maintaining a steady gait. This can be energy-intensive and sometimes limits their mobility.

What robots could learn from running birds, the scientists said, is that it’s okay to deviate from normal steady motions, because it doesn’t necessarily mean you’re going to fall or break something. Robotic control approaches “must embrace a more relaxed notion of stability, optimizing dynamics based on key task-level priorities without encoding an explicit preference for a steady gait,” the researchers said in their conclusion.
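The contrast the researchers draw, strict gait tracking versus task-level priorities, can be caricatured in a toy pair of cost functions. This is an illustrative sketch only, not the OSU Dynamic Robotics Laboratory model:

```python
def gait_tracking_cost(body_height, velocity, ref_height=1.0, ref_velocity=3.0):
    """Conventional controller: any deviation from a steady reference
    gait is penalized, even if the robot is in no danger of falling."""
    return (body_height - ref_height) ** 2 + (velocity - ref_velocity) ** 2

def task_level_cost(body_height, energy, min_height=0.5):
    """Bird-like controller: only falling (height below a threshold)
    and energy use matter; the upper body is free to bounce around."""
    fell = body_height < min_height
    return float("inf") if fell else energy

# A bouncy but upright, low-energy stride is cheap under the task-level
# cost but expensive under strict gait tracking.
bouncy = gait_tracking_cost(0.8, 2.0)   # penalized for deviating
relaxed = task_level_cost(0.8, energy=10.0)  # fine: still upright
```

Under this framing, a controller that minimizes only the task-level cost will tolerate the irregular, "messy" motions the birds exhibit, as long as it stays upright and economical.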

Collaborators on the research were from the Royal Veterinary College in the United Kingdom. The work was supported by the Biotechnology and Biological Sciences Research Council in the United Kingdom and the Human Frontier Science Program.

Google’s Nanoparticle Platform Project

Google is developing medical diagnostic technology using nanoparticles in a new “Nanoparticle Platform” project, Andrew Conrad, head of the Google X Life Sciences team, disclosed Tuesday at The Wall Street Journal’s WSJD Live conference.

The idea is to use nanoparticles with magnetic cores circulating in the bloodstream with recognition molecules to detect cancer, plaques, or too much sodium, for example.

There are a number of similar research projects using magnetic (and other) nanoparticles in progress, as reported on KurzweilAI. What’s new in the Google project is delivering nanoparticles to the bloodstream via a pill and using a wearable wrist detector to detect the nanoparticles’ magnetic field and read out diagnostic results.

But this is an ambitious moonshot project. “Google is at least five to seven years away from a product approved for use by doctors,” said Sam Gambhir, chairman of radiology at Stanford University Medical School, who has been advising Dr. Conrad on the project for more than a year, the WSJ reports.

“Even if Google can make the system work, it wouldn’t immediately be clear how to interpret the results. That is why Dr. Conrad’s team started the Baseline study [see “New Google X Project to look for disease and health patterns in collected data”], which he hopes will create a benchmark for comparisons.”

As part of the Baseline project, the Google X Life Sciences group has been developing related technology for wearable devices, to be worn by Baseline participants. The devices will collect data such as heart rates, heart rhythms, and oxygen levels.

For example, as KurzweilAI reported in January, a Google-designed contact lens uses a tiny wireless chip and miniaturized glucose sensor that are embedded between two layers of soft contact lens material.

The chip was originally intended to help people with diabetes as they try to keep their blood sugar levels under control.

‘Nanomotor lithography’ answers call for affordable, simpler device manufacturing

October 29, 2014

What does it take to fabricate electronic and medical devices tinier than a fraction of the width of a human hair? Nanoengineers at the University of California, San Diego recently invented a new method of lithography in which nanoscale robots swim over the surface of light-sensitive material to create complex surface patterns that form the sensors and electronics components on nanoscale devices. Their research, published recently in the journal Nature Communications, offers a simpler and more affordable alternative to the costly, complex state-of-the-art nanofabrication methods in use today, such as electron beam writing.

Led by distinguished nanoengineering professor and chair Joseph Wang, the team developed nanorobots, or nanomotors, that are chemically-powered, self-propelled and magnetically controlled. Their proof-of-concept study demonstrates the first nanorobot swimmers able to manipulate light for nanoscale surface patterning. The new strategy combines controlled movement with unique light-focusing or light-blocking abilities of nanoscale robots.

“All we need is these self-propelled nanorobots and UV light,” said Jinxing Li, a doctoral student at the Jacobs School of Engineering and first author. “They work together like minions, moving and writing and are easily controlled by a simple magnet.”

State-of-the-art lithography methods such as electron beam writing are used to define extremely precise surface patterns on substrates used in the manufacture of microelectronics and medical devices. These patterns form the functioning sensors and electronic components such as transistors and switches packed on today’s integrated circuits. In the mid-20th century the discovery that electronic circuits could be patterned on a small silicon chip, instead of assembling independent components into a much larger “discrete circuit,” revolutionized the electronics industry and set in motion device miniaturization on a scale previously unthinkable.

Today, as scientists invent devices and machines on the nanoscale, there is new interest in developing unconventional nanoscale manufacturing technologies for mass production.

Li was careful to point out that this nanomotor lithography method cannot completely replace the state-of-the-art resolution offered by an e-beam writer, for example. However, the technology provides a framework for autonomous writing of nanopatterns at a fraction of the cost and difficulty of these more complex systems, which is useful for mass production. Wang’s team also demonstrated that several nanorobots can work together to create parallel surface patterns, a task that e-beam writers cannot perform.

The team developed two types of nanorobots: a spherical nanorobot made of silica that focuses the light like a near-field lens, and a rod-shaped nanorobot made of metal that blocks the light. Each is self-propelled by the catalytic decomposition of hydrogen peroxide fuel solution. Two types of features are generated: trenches and ridges. When the photoresist surface is exposed to UV light, the spherical nanorobot harnesses and magnifies the light, moving along to create a trench pattern, while the rod-shaped nanorobot blocks the light to build a ridge pattern.

“Like microorganisms, our nanorobots can precisely control their speed and spatial motion, and self-organize to achieve collective goals,” said professor Joe Wang. His group’s nanorobots offer great promise for diverse biomedical, environmental and security applications.

UC San Diego is investing heavily in robotics research while leveraging the partnership opportunities afforded by regional industry expertise in supporting fields such as defense and wireless technology, biotech and manufacturing. The Contextual Robotics Technologies International Forum was hosted by the Jacobs School of Engineering, the Qualcomm Institute and the Department of Cognitive Science.

Joe Wang is the director of the Center for Wearable Sensors at UC San Diego Jacobs School of Engineering and holds the SAIC endowed chair in engineering.

‘Data smashing’ could automate discovery, untouched by human hands

October 28, 2014

From recognizing speech to identifying unusual stars, new discoveries often begin with comparison of data streams to find connections and spot outliers. But simply feeding raw data into a data-analysis algorithm is unlikely to produce meaningful results, say the authors of a new Cornell study.

That’s because most data comparison algorithms today have one major weakness: somewhere, they rely on a human expert to specify what aspects of the data are relevant for comparison, and what aspects aren’t.

But these experts can’t keep up with the growing amounts and complexities of big data.

So the Cornell computing researchers have come up with a new principle they call “data smashing” for estimating the similarities between streams of arbitrary data without human intervention, and even without access to the data sources.

How ‘data smashing’ works

Data smashing is based on a new way to compare data streams. The process involves two steps.

First, the data streams are algorithmically “smashed” together to “annihilate” the information in each other. Then the process measures what information remains after the collision: the more information that remains, the less likely the streams originated from the same source.
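The annihilation intuition can be illustrated with a much cruder stand-in: the normalized compression distance, which uses compressed length as a proxy for information content. This is a toy sketch of the idea, not the authors’ published algorithm (which inverts an inferred generative model of each stream):

```python
import zlib

def info(s: bytes) -> int:
    # Crude proxy for information content: length after compression.
    return len(zlib.compress(s, 9))

def smash_distance(a: bytes, b: bytes) -> float:
    # If combining a and b "annihilates" most of their information
    # (the pair compresses almost as well as either stream alone),
    # the score is near 0: likely the same source. Unrelated streams
    # leave information behind after the collision, scoring near 1.
    ia, ib, iab = info(a), info(b), info(a + b)
    return (iab - min(ia, ib)) / max(ia, ib)

# Two streams from the same periodic source vs. one unrelated stream.
same1 = b"abcabcabc" * 50
same2 = b"abcabcabc" * 50
other = bytes(range(256)) * 2
```

Here `smash_distance(same1, same2)` comes out far smaller than `smash_distance(same1, other)`, with no human telling the algorithm which features of the streams matter.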

Data-smashing principles could open the door to understanding increasingly complex observations, especially when experts don’t know what to look for, according to the researchers.

The researchers — Hod Lipson, associate professor of mechanical engineering and of computing and information science, and Ishanu Chattopadhyay, a former postdoctoral associate with Lipson now at the University of Chicago — demonstrated this idea with data from real-world problems, including detection of anomalous cardiac activity from heart recordings and classification of astronomical objects from raw photometry.

In all cases, and without access to the original domain knowledge, the researchers demonstrated that these general algorithms performed on par with specialized algorithms and heuristics that experts had tweaked to work on each problem.

The researchers described the work in the Journal of the Royal Society Interface. It was supported by DARPA and the U.S. Army Research Office.

UBC researchers aim to simplify life-saving drug

Media Release | October 29, 2014

Heparin, the life-saving blood thinner used in major surgeries and the treatment of heart disease, is a complicated drug, but a research team from the University of British Columbia has set out to make its use a lot safer by developing a universal antidote.

Heparin’s blood-thinning action often requires an antidote to reverse its effect before serious bleeding issues arise, especially in the case of major surgical procedures.

Finding an approved drug to reverse the blood-thinning effect of heparin is complicated because there are about a dozen approved heparin products on the market. It’s further complicated by the fact that two of the most commonly prescribed heparins, low-molecular-weight heparins and fondaparinux, have no known medically approved antidote.

None of the available synthetically made blood-thinning reversal drugs works with all varieties of heparin, and they are relatively toxic. The toxicity varies from patient to patient, which is another complicating factor for health care providers.

The UBC researchers took aim at a simple solution: a synthetic antidote that works with all heparins used in clinics today. The UBC team, led by Jayachandran Kizhakkedathu, developed a universal heparin antidote, and its success in small-animal testing is considered a fundamental breakthrough for both cardiothoracic surgery and the treatment of anticoagulant-related bleeding problems.

“Heparin is the second most prescribed drug after insulin,” said Kizhakkedathu. “Our universal heparin antidote addresses an unmet clinical need. It has potential to benefit all patients undergoing high-risk surgical procedures and treating bleeding complications.”

Kizhakkedathu, a Michael Smith Foundation for Health Research Scholar with UBC’s Department of Pathology and Lab Medicine, Centre for Blood Research and the Department of Chemistry, says a synthetic drug offers consistency in health effects and performance. He adds that it also avoids possible immunological reactions sometimes associated with antidotes of biological origin.

The UBC research was published Oct. 29 in the journal Science Translational Medicine.

The next step for UBC researchers is continued lab research with the goal of human testing in three to five years.

This research is funded by the Canadian Institutes of Health Research (CIHR) and Natural Sciences and Engineering Research Council of Canada (NSERC).

Western researchers make osteoarthritis breakthrough

London Community News

Researchers at Western University have identified a specific gene that plays a key role in the degradation of cartilage in osteoarthritis (OA). The study, published online in the journal Arthritis & Rheumatology, showed that when the gene, PPARdelta, was removed from cartilage the progression of post-traumatic osteoarthritis was considerably slowed.

Study co-author Frank Beier, PhD, a professor in the Department of Physiology and Pharmacology at Western’s Schulich School of Medicine and Dentistry, says this promising new research may be the first step to identifying new treatments.

“What this tells us is that this gene is an important player in the pathogenesis of the disease and therefore might be a potential therapeutic target,” Beier said.

According to a report by the Arthritis Alliance of Canada, more than 4.5 million adult Canadians currently live with OA, and it is estimated that it drives $10 billion in direct healthcare costs. Post-traumatic osteoarthritis, which is triggered by a specific injury, makes up 10 to 15 per cent of all cases of the disease and affects a younger, more active portion of the population.

“The current thinking is that what happens in those first hours after injury can have long-term impact,” Beier said. “This research shows that we might have a window where we could give these drugs right after the injury happens and maybe slow the onset of the cartilage degradation associated with osteoarthritis.”

These research findings may also help to explain the link between obesity and OA. It has long been known that obesity is one of the major risk factors for OA, and the conventional thinking was that the link was associated with the increased load on the joints. However, recent evidence suggests that chemical signals circulating in the body contribute to osteoarthritis risk in obese patients. Beier says that because PPARdelta is activated by fatty molecules, lipids from a high-fat diet could directly activate the pathway that allows PPARdelta to break down cartilage in the animal model.

“This also suggests that in the future, modulation of PPARdelta through diet changes, as opposed to drugs, could also be a strategy to prevent osteoarthritis,” Beier said.

The study’s lead author, PhD candidate Anusha Ratneswaran, is currently on a three-month exchange in Australia with funding from an Osteoarthritis Research International Collaborative Scholarship.

UBC researchers to test ‘genetically informed’ drug therapy: Pharmacists hope gene technology will improve safe prescribing of drugs


Pharmacy manager Hoang Nguyen works at a Vancouver pharmacy that is participating in a study about how genetic makeup can affect how people respond to prescription medicine.

University of B.C. researchers will test whether personalized medicine is ready for prime time, now that it’s known that a person’s genetic makeup can affect how they respond to prescribed medications.

The study — said to be the first of its kind in North America — will involve B.C. patients taking warfarin, a blood thinner used to prevent clots. The patients will be asked to provide saliva samples at participating pharmacies and the DNA in those samples will then be analyzed at a UBC lab. The data will be used to compare the drug dose a patient received to what a “genetically informed” therapy would have been.

Although 18 variants — called polymorphisms or SNPs — in five genes may predict how patients metabolize warfarin, such genetic testing is not currently done before patients go on the drug. Instead, doctors take such things as weight, height and gender into consideration when writing a prescription — and there may be a trial-and-error period lasting weeks before patients are finally at safe, ideal therapeutic doses. Even then, they must return to labs an average of 16 times a year to ensure their medication is at the right dose, according to Geraldine Vance, CEO of the B.C. Pharmacy Association.
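A “genetically informed” dose estimate amounts to adjusting a baseline dose according to a patient’s variants. The sketch below shows only the shape of such a lookup: CYP2C9 and VKORC1 are real warfarin-related genes, but the baseline dose and the adjustment factors here are hypothetical placeholders, not clinical values.

```python
# Illustrative sketch of genotype-adjusted dosing. The numbers below
# are HYPOTHETICAL placeholders, not clinical dosing values.

BASE_WEEKLY_DOSE_MG = 35.0  # hypothetical population baseline

# Hypothetical multipliers for two of the genes known to influence
# warfarin response (CYP2C9 metabolizes the drug; VKORC1 is its target).
ADJUSTMENT = {
    ("CYP2C9", "*1/*1"): 1.0,   # normal metabolizer
    ("CYP2C9", "*1/*3"): 0.7,   # reduced metabolism -> lower dose
    ("VKORC1", "G/G"): 1.0,
    ("VKORC1", "A/A"): 0.6,     # increased sensitivity -> lower dose
}

def informed_dose(genotypes: dict) -> float:
    """Scale the baseline dose by each variant's adjustment factor."""
    dose = BASE_WEEKLY_DOSE_MG
    for gene, variant in genotypes.items():
        dose *= ADJUSTMENT.get((gene, variant), 1.0)
    return round(dose, 1)
```

The study’s comparison is essentially between the dose a patient actually reached by trial and error and what a lookup of this general shape, calibrated on real gene-drug data across all 18 variants, would have suggested at the outset.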

This study could eventually change that practice.

“Good prescribing practices are all about the right drug on the right patient at the right time and in the right dose,” said Corey Nislow, a UBC associate pharmacy professor leading the Genomics for Precision Drug Therapy project. “The point of this study is not to change patients’ medication doses now, but to understand the protocols that we need to put into place, to educate consumers and to help pharmacists so they can intelligently inform their customers … so when the science is mature, we have statistically powerful information about gene-drug associations.”

Warfarin (also known by the brand name Coumadin) is in the top 10 percent of most commonly prescribed medications; it is often used to prevent primary or repeat strokes, embolisms or heart attacks. About 60,000 patients are taking warfarin in B.C.; many start taking the drug while in hospital, where they may be kept for days merely for the purpose of monitoring and adjusting the dose, according to Vance.

Gene technology, Nislow said, has sped ahead of practical applications, so it is time to collect real-patient data to determine how community pharmacies can use the technology to improve on safe prescribing. Nislow said the Food and Drug Administration in the U.S. lists nearly three dozen drugs that are known, at this point, to have more than 100 gene-drug interactions, meaning that higher or lower doses should be used, depending on genetic variations.

Gene sequencing is used in the field of cancer diagnosis and treatment to help oncologists decide which chemotherapy agents might help patients and which ones won’t.

Nislow and his co-researchers — funded by $400,000 from the B.C. Pharmacy Association and Genome BC — are seeking 200 warfarin users for the first phase of the study. All of the genetic typing will be coded for privacy reasons; not even the participants will be told about their gene profiles.

Since the study is a pilot, warfarin patients who enlist will be doing so for purely altruistic purposes, to move knowledge along.

Dr. Bill Cavers, president of Doctors of B.C., said he has concerns about whether such a study is premature, given the early stages of gene testing. He also worries about privacy concerns. In the case of warfarin, Cavers thinks expensive gene testing is akin to “using a baseball bat to kill a mosquito.” The current approach used by doctors of starting a patient on a low dose and monitoring them closely through lab tests is perfectly reasonable, he believes.

Vance said it’s possible doctors may be worried that pharmacists will encroach on their terrain, but she says pharmacists aren’t trying to diagnose and manage illness.

“We’re not talking about getting into their business,” she said, adding she hopes the study will be completed in 18 months so that rapid gene sequencing technology can be used outside of the oncology field.

According to a background document, there is no telling who would pay for such genome sequencing when or if it becomes available to the public; private insurers, the government and consumers themselves are all options.

Vance said it’s possible the study will find that genome analysis is useful for more accurately predicting the right dosing, in which case it may prove to be a cost saver for the health care system.

The lab cost associated with genome sequencing in this study is about $500 per saliva sample.

To enrol in the study or learn more about it, contact project manager Mark Kunzli at 604-827-1968 or 778-855-6275.

Rare Brain Disease Researchers To Receive More Than $30 Million In Funding Over Next 5 Years

Agencies Award Four Grants to Advance Study of Frontotemporal Degeneration

PR Newswire, RADNOR, Pa., Oct. 28, 2014

Researchers studying frontotemporal degeneration (FTD), a leading cause of early-onset dementia, will receive more than $30 million over the next five years in grants from the National Institutes of Health (NIH). The funding will be used to further scientific collaboration and investigate new treatments in the quest to find a cure for FTD.

FTD research received a total of four grants, each independently peer-reviewed, that will allow for building reliable clinical networks to diagnose and treat FTD and related variants; recruiting carriers of FTD-causing gene mutations for study; and studying a specific genetic mutation that is the most common cause of both inherited FTD and inherited ALS.

“The FTD community is extremely gratified to be the recipient of this unprecedented level of funding that we believe is the result of the tremendous momentum underway in FTD science,” said Susan Dickinson, Executive Director of The Association for Frontotemporal Degeneration (AFTD). “What started with FTD’s recent inclusion in national research priorities to cure Alzheimer’s disease and other dementias by 2025, has now catapulted into what promises to be significant progress in learning about this debilitating neurodegenerative disease.”

Three of the grants, totaling $5.9 million per year, are being funded by the NIH’s National Institute of Neurological Disorders and Stroke (NINDS), National Institute on Aging (NIA) and the National Center for Advancing Translational Sciences (NCATS). The three projects will enable scientists to collaborate on research approaches for FTD, with the goal of diagnosing and treating patients more effectively.

“The projects aim to advance our understanding of FTD by improving diagnosis, identifying preventive strategies and providing new insights into the genetics underlying this complex disorder,” said Margaret Sutherland, Ph.D., program director at NINDS.

A fourth grant is part of $29 million in research earmarked for the Rare Diseases Clinical Research Network, a network of 22 consortia dedicated to furthering translational research and investigating new treatments for patients with rare diseases. The major focus of this grant is to study ALS, including the disease variant of ALS with FTD.

About FTD
FTD is a rare disease, affecting approximately 50,000 nationwide. It is a debilitating form of dementia that affects the frontal and/or temporal lobes of the brain. FTD strikes people in the prime of life–typically between ages 50 and 60–and erodes their ability to speak, move and/or behave within social norms. There is no known cure for FTD. Current treatments may address symptoms but do not alter or slow disease progression. For those affected, getting a correct diagnosis is challenging, as many physicians are unfamiliar with FTD.

About AFTD
The Association for Frontotemporal Degeneration envisions a world where FTD is understood, effectively diagnosed, treated, cured and ultimately prevented. For more information about AFTD or frontotemporal degeneration, visit the AFTD website.