Finland is the happiest country, according to the World Happiness Report
Mar 26, 2018
CBC reported on the rankings in the 2018 World Happiness Report, which was co-edited by John Helliwell, a UBC professor emeritus of economics.
“Although immigrants come from countries with very different levels of happiness, their reported life evaluations converge towards those of other residents in their new countries,” Helliwell said.
Next-gen optical disc has 10TB capacity and six-century lifespan
A future alternative to hard disks and Blu-ray for storing exponentially exploding zettabytes of “Long Data” in energy-intensive data centers
March 26, 2018
Scientists from RMIT University in Australia and Wuhan Institute of Technology in China have developed a radical new high-capacity optical disc called “nano-optical long-data memory” that they say can record and store 10 TB (terabytes, or trillions of bytes) of data per disc securely for more than 600 years. That’s a fourfold increase in storage density and a 300-fold increase in data lifespan over current storage technology.
Preparing for zettabytes of data in 2025
Forecast of exponential growth of creation of Long data, with three-year doubling time (credit: IDC)
According to IDC’s Data Age 2025 study in 2017, the recent explosion of Big Data and global cloud storage generates roughly 2.5 exabytes (2.5 × 10¹⁸ bytes) of data a day, stored in massive, power-hungry data centers that use 3 percent of the world’s electricity supply. The data centers rely on hard disks, which have limited capacity (2 TB per disk) and last only two years. IDC forecasts that by 2025, the global datasphere will grow exponentially to 163 zettabytes (that’s 163 trillion gigabytes), ten times the 16.1 ZB of data generated in 2016.
Examples of massive Long Data:
The Square Kilometer Array (SKA) radio telescope produces 576 petabytes of raw data per hour.
The Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative to map the human brain is handling data measured in yottabytes (one trillion terabytes).
Studying the mutation of just one human family tree over ten generations (500 years) will require 8 terabytes of data.
IDC estimates that by 2025, nearly 20% of the data in the global datasphere will be critical to our daily lives (such as biomedical data) and nearly 10% of that will be hypercritical. “By 2025, an average connected person anywhere in the world will interact with connected devices nearly 4,800 times per day — basically one interaction every 18 seconds,” the study estimates.
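As a quick sanity check on those IDC figures, here is a back-of-the-envelope sketch in Python that uses only the numbers quoted above (the 16.1 ZB and 163 ZB datasphere sizes and the 4,800 daily interactions):

```python
import math

# Figures quoted in the IDC Data Age 2025 discussion above
zb_2016, zb_2025 = 16.1, 163.0      # global datasphere, in zettabytes
years = 2025 - 2016

# Compound annual growth rate implied by the 2016 -> 2025 forecast
cagr = (zb_2025 / zb_2016) ** (1 / years) - 1
doubling_time = math.log(2) / math.log(1 + cagr)

print(f"Implied annual growth: {cagr:.1%}")                 # ~29% per year
print(f"Implied doubling time: {doubling_time:.1f} years")  # ~2.7 years

# "Nearly 4,800 interactions per day" expressed as an interval
print(f"One interaction every {24 * 3600 / 4800:.0f} seconds")  # 18 seconds
```

The implied doubling time of roughly 2.7 years is consistent with the three-year doubling shown in the IDC forecast chart, and 86,400 seconds divided by 4,800 interactions does indeed work out to one interaction every 18 seconds.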
Replacing hard drives with optical discs
The focus is now shifting from “Big Data” to “Long Data,” which enables new insights to be discovered by mining massive datasets that capture changes in the real world over decades and centuries.* The researchers say their new long-data memory technology could offer a more cost-efficient and sustainable solution to the global data-storage problem.
The new technology could radically improve the energy efficiency of data centers. It would use one-thousandth the power of a hard-disk-based data center, because it requires far less cooling and does away with the energy-intensive task of data migration (backing up to a new disk) every two years. Optical discs are also inherently more secure than hard disks.
“While optical technology can expand capacity, the most advanced optical discs developed so far have only 50-year lifespans,” explained lead investigator Min Gu, a professor at RMIT and senior author of an open-access paper published in Nature Communications. “Our technique can create an optical disc with the largest capacity of any optical technology developed to date and our tests have shown it will last over half a millennium and is suitable for mass production of optical discs.”
There’s an existing Blu-ray disc technology called M-DISC that can store data for 1,000 years, but it is limited to 100 GB per disc, compared with the 10 TB of the new technology, which holds 100 times more data per disc.
“This work can be the building blocks for the future of optical long-data centers over centuries, unlocking the potential of the understanding of the long processes in astronomy, geology, biology, and history,” the researchers note in the paper. “It also opens new opportunities for high-reliability optical data memory that could survive in extreme conditions, such as high temperature and high pressure.”
How the nano-optical long-data memory technology works
The high-capacity optical data memory uses gold nanoplasmonic hybrid glass composites to encode and preserve long data over centuries. (credit: Qiming Zhang et al./Nature Communications, adapted by KurzweilAI)
The new nano-optical long-data memory technology is based on a novel gold nanoplasmonic hybrid glass matrix, unlike the materials used in current optical discs. The technique relies on a sol-gel process, which uses chemical precursors to produce ceramics and glass with higher purity and homogeneity than conventional processes. Glass is a highly durable material that can last up to 1,000 years and can be used to hold data, but it has limited native storage capacity because of its inflexibility. So the team combined glass with an organic material, reducing its lifespan (to more than 600 years) but radically increasing its capacity.
Data is further encoded by heating gold nanorods, causing them to morph, in four discrete steps, into spheres. (credit: Qiming Zhang et al./Nature Communications, adapted by KurzweilAI)
To create the nanoplasmonic hybrid glass matrix, gold nanorods were incorporated into a hybrid glass composite. The researchers chose gold because, like glass, it is robust and highly durable. The system allows data to be recorded in five dimensions: three dimensions in space (data is stored in gold nanorods at multiple depth levels in the disc and in four different shapes), plus plasmonic-controlled multi-color encoding and light-polarization encoding.
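The capacity gain from this kind of multi-level encoding can be illustrated with a rough calculation. The four nanorod shapes come from the article; the counts of distinguishable colors and polarization states below are hypothetical placeholders, not values reported by the researchers.

```python
import math

# Purely illustrative: how many bits one recording site can hold when several
# independent properties are encoded at the same spot.
shapes = 4          # discrete nanorod shapes (from the article)
colors = 3          # ASSUMED number of distinguishable plasmonic colors
polarizations = 2   # ASSUMED number of distinguishable polarization states

states_per_site = shapes * colors * polarizations
bits_per_site = math.log2(states_per_site)

print(f"{states_per_site} distinguishable states per recording site")
print(f"≈ {bits_per_site:.2f} bits per site, versus 1 bit for a conventional pit/land mark")
```

Stacking many such recording layers through the depth of the disc then multiplies the gain again, which is what would push capacity far beyond a conventional single-layer optical disc.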
Scientists at Monash University were also involved in the research.
* “Long Data” refers here to Big Data spanning millennia (both historical and future), not to be confused with the “long” data type in software.
Abstract of High-capacity optical long data memory based on enhanced Young’s modulus in nanoplasmonic hybrid glass composites
Emerging as an inevitable outcome of the big data era, long data are the massive amount of data that captures changes in the real world over a long period of time. In this context, recording and reading the data of a few terabytes in a single storage device repeatedly with a century-long unchanged baseline is in high demand. Here, we demonstrate the concept of optical long data memory with nanoplasmonic hybrid glass composites. Through the sintering-free incorporation of nanorods into the earth abundant hybrid glass composite, Young’s modulus is enhanced by one to two orders of magnitude. This discovery, enabling reshaping control of plasmonic nanoparticles of multiple-length allows for continuous multi-level recording and reading with a capacity over 10 terabytes with no appreciable change of the baseline over 600 years, which opens new opportunities for long data memory that affects the past and future.
Bloomberg Adjusts Its Tesla Model 3 Production Forecast Heading Into Final Week Of Q1
As we recently reported, Bloomberg has a new Tesla Model 3 production tracking and estimation tool. Last week, it projected that Tesla was building 810 Model 3 sedans a week. In the broader electric car world, 810 cars a week isn’t bad, but it’s a long way from the 2,500 Elon Musk expected by the end of the first quarter of 2018. Essentially right after our coverage, the Bloomberg tracker started showing a significant uptick in production heading into the last week of the quarter.
One component of the tracker is VIN registration batches, which Tesla reports to the federal government. All manufacturers report upcoming VINs. It’s how those cars get into the system, so that when you go to the registry of motor vehicles to register your shiny new Belchfire 5000, the DMV computer knows your car is real. The VINs also help federal officials keep track of the millions of cars manufactured each year for purposes such as managing safety recalls.
Bloomberg admits the VIN tracker is not very responsive to the most recent input, because it is a rolling average. As a result, the February shutdown of the Model 3 production facilities in both Fremont and Nevada still weighs on the results. Here is the team’s latest thinking on the matter:
“Our model is, by design, slow to respond to such changes in the data. In order to avoid over-reacting to unusual batches, we’ve averaged our production rates over time. That means our model’s current estimated production rate is still being held back by February’s temporary manufacturing pause. We expect the improving trend will continue next week, based on the data we’ve already received.”
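For readers unfamiliar with the effect being described, here is a generic illustration of why a rolling average lags recent changes. This is not Bloomberg’s actual model, and the weekly numbers are made up purely to show the mechanism.

```python
# Hypothetical weekly Model 3 production rates, with a two-week pause in the middle
weekly_production = [800, 800, 0, 0, 800, 1000, 1200, 1400]

def rolling_average(series, window=4):
    """Trailing average over the last `window` points (shorter at the start)."""
    out = []
    for i in range(len(series)):
        chunk = series[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

for week, (actual, estimate) in enumerate(
        zip(weekly_production, rolling_average(weekly_production)), start=1):
    print(f"week {week}: actual {actual:>5}, trailing 4-week average {estimate:>7.1f}")

# Even as actual production recovers, the trailing average stays well below the
# latest weekly rate until the zero-production weeks drop out of the window --
# the same kind of lag the Bloomberg team describes in its estimates.
```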
And here’s the payoff from all that number crunching: “The trajectory of the reported VINs is steeper and more consistent than we’ve seen before, and our model’s estimates are just beginning to reflect that. Based on some of the numbers we’re seeing, we think it’s possible that Tesla could already be producing well over 1,000 a week and climbing.”
There’s a notable difference between 810 cars a week and 1,500 or so cars a week, but even the latter is still far below Elon’s most recent predictions. But it’s the upward trend that is important. If the trend were flat or on a downward trajectory, that would be cause for serious concern. Even though missing yet another self-imposed target only solidifies Musk’s reputation for being super optimistic at all times about timelines, Tesla is on the way to being the dominant manufacturer of battery electric vehicles outside of China and far ahead of all legacy automakers.
The German car companies are rushing to catch up, but by the time they do, Tesla will have accelerated into the passing lane and be moving ahead fast. The question then will become whether anyone can catch them.
Editor’s note: Some might be inclined to bash Elon Musk and Tesla for still being late with their production targets, but as Elon tweeted last year, if he wasn’t inherently optimistic, Tesla likely wouldn’t be here today and no one could even complain that Model 3 production wasn’t ramping up to 500,000 cars a year fast enough. There would be no Model 3. And, by the way, what would that mean for the electric vehicle plans of major automakers? (Yes, that’s a rhetorical question.)
Multiple synapse heads send out filopodia (green) converging on one microglia (red), as seen by focused ion beam scanning electron microscopy (FIBSEM). [L. Weinhard/EMBL Rome]
Seeing is believing! Yet, in science, it’s not always practical or even possible to visualize events that occur in the natural world. Take, for instance, the human brain: an organ heavily shielded for its protection and often shrouded in mystery. While neuroscience has advanced enormously in the past few decades, visualizing various neuronal events has remained exceedingly difficult. Now, however, investigators at the European Molecular Biology Laboratory (EMBL) have captured, for the first time, microglia nibbling on brain synapses. Findings from the new study, published today in Nature Communications in an article entitled “Microglia Remodel Synapses by Presynaptic Trogocytosis and Spine Head Filopodia Induction,” show that these specialized glial cells help synapses grow and rearrange, demonstrating the essential role of microglia in brain development.
“Our findings suggest that microglia are nibbling synapses as a way to make them stronger, rather than weaker,” explained senior study investigator Cornelius Gross, Ph.D., the deputy head of outstation and senior scientist at EMBL.
Around one in ten cells in your brain is a microglial cell. Related to macrophages, microglia act as the first and main line of active immune defense in the central nervous system, and they also guide healthy brain development. Researchers have proposed that microglia pluck off and eat synapses, the connections between brain cells, as an essential step in the pruning of connections during early circuit refinement. But, until now, no one had seen them do it.
The EMBL researchers saw that roughly half of the time that microglia contact a synapse, the synapse head sends out thin projections or “filopodia” to greet them. In one particularly dramatic case—as seen in the image above—fifteen synapse heads extended filopodia toward a single microglia cell.
“As we were trying to see how microglia eliminate synapses, we realized that microglia actually induce their growth most of the time,” noted lead study investigator Laetitia Weinhard, Ph.D., a postdoctoral research scientist at EMBL.
Amazingly, it turns out that microglia might underlie the formation of double synapses, where the terminal end of a neuron releases neurotransmitters onto two neighboring partners instead of one. This process can support effective connectivity between neurons.
“This shows that microglia are broadly involved in structural plasticity and might induce the rearrangement of synapses, a mechanism underlying learning and memory,” Dr. Weinhard remarked.
Because this was the first attempt to visualize this process in the brain, the study required five years of technological development. The research team tried three different state-of-the-art imaging systems before succeeding.
“Here we used light sheet fluorescence microscopy to follow microglia–synapse interactions in developing organotypic hippocampal cultures, complemented by a 3D ultrastructural characterization using correlative light and electron microscopy (CLEM),” the authors wrote. “Our findings define a set of dynamic microglia–synapse interactions, including the selective partial phagocytosis, or trogocytosis (trogo-: nibble), of presynaptic structures and the induction of postsynaptic spine head filopodia by microglia. These findings allow us to propose a mechanism for the facilitatory role of microglia in synaptic circuit remodeling and maturation.”
Using CLEM and light sheet fluorescence microscopy, a technique developed at EMBL, the researchers were able to make the first movie of microglia eating synapses.
“This is what neuroscientists have fantasized about for years, but nobody had ever seen before,” Dr. Gross concluded. “These findings allow us to propose a mechanism for the role of microglia in the remodeling and evolution of brain circuits during development.” In the future, he plans to investigate the role of microglia in brain development during adolescence and the possible link to the onset of schizophrenia and depression.
Why your boss should let you nap in the office tomorrow
A nap at work tomorrow could help those with sleep problems. (Picture: ThinkStock)
Are you feeling a bit out of sorts since the clocks went forward? A sleep expert is calling for company bosses to allow their staff to take a nap in the office on Monday to make up for the havoc caused by the Daylight Saving change. Sleep therapist Dr Nerina Ramlakhan says that although the clocks going forward signals the start of a long-awaited summer, it can disrupt the nocturnal patterns of people who are already struggling to get a good night’s kip. More than a quarter of the UK population is already suffering from dangerously low levels of sleep, increasing the risk of diabetes, heart problems and depression.
New research from bed-maker Silentnight and the University of Leeds revealed that 25 per cent of Brits are sleeping for only five or fewer hours per night. Dr Nerina warned that those already suffering from a lack of sleep may be at risk of long-term health problems, so bosses should be mindful and give staff the option to sneak a short nap into their working day.
Dr Nerina said: “The loss of an hour in bed is particularly detrimental to individuals that already struggle with their sleep. If you are one of the 25 per cent of the nation getting less than 5 hours sleep a night, this time change could see you drop down to as little as four hours, which is a dangerously low amount. Many employees may be feeling worse for wear on Monday after losing an hour of sleep over the weekend, so bosses should consider allowing their staff to take a short nap in the office to make up for lost time.”
The sleep doctor argues that designated napping time may make for a stronger workforce, as sleep is scientifically proven to improve physical and mental health and wellbeing. She said: “Just a twenty-minute power nap can make a huge difference. Naps have been scientifically proven to boost creativity and problem-solving ability, and they can even rebalance the immune system, meaning staff are less likely to take sick days.”
What Quantum Computing Is Really Good For (Right Now)
Chad Orzel, Contributor. I write about physics, science, academia, and pop culture. Opinions expressed by Forbes Contributors are their own.
LAS VEGAS, NV – JANUARY 08: Intel Corp. CEO Brian Krzanich delivers a keynote address at CES 2018 at Park Theater at Monte Carlo Resort and Casino in Las Vegas on January 8, 2018 in Las Vegas, Nevada. CES, the world’s largest annual consumer technology trade show, runs from January 9-12 and features about 3,900 exhibitors showing off their latest products and services to more than 170,000 attendees. (Photo by Ethan Miller/Getty Images)
Several years ago, I was on a panel discussion to mark the opening of the Mike & Ophelia Lazaridis Quantum-Nano Centre at the University of Waterloo, and the moderator asked us all to imagine being fifty years in the future, talking about the science accomplished at the Centre at an anniversary panel. Being a squishy moderate guy by nature, I said that I didn’t think we’d have quantum computers on every desktop, but that we might see some exciting developments from the field of quantum simulation, using quantum computers to understand the states of complex and strongly-interacting systems.
The next guy up, Raymond Laflamme, the head of Waterloo’s Institute for Quantum Computing, started his answer with “I want to be quantum teleported.” So, you know, so much for moderation…
Anyway, quantum computing is an exceptionally hot topic right now, with giants of the computing industry like Google and Microsoft and IBM and Intel giving splashy presentations at big meetings about their new quantum devices. And in a lot of ways, what we’re seeing is pretty much in line with what I said back in Waterloo: they’re making interesting devices mostly in the area of quantum simulation.
What do I mean by that? Popular treatments of quantum computing mostly focus on the area of quantum algorithms, where you investigate ways that replacing classical bits that are either “0” or “1” with qubits that can be in some arbitrary superposition of “0” and “1” at the same time lets you do certain computations faster. The splashy big-money application of this is Shor’s algorithm for factoring numbers (which Scott Aaronson explained beautifully a while back); it gets spies and bankers interested because a fast method for factoring large numbers could compromise some encryption techniques. There are also quantum algorithms for fast search and other sorts of problems, mostly relating to collective properties of systems of numbers.
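To make the superposition idea concrete, here is a minimal sketch in plain NumPy (not any particular quantum-computing framework): a single qubit starts in the state |0⟩, a Hadamard gate puts it into an equal superposition of |0⟩ and |1⟩, and the Born rule gives the measurement probabilities.

```python
import numpy as np

# Computational basis state |0> as a state vector
ket0 = np.array([1, 0], dtype=complex)

# Hadamard gate: sends |0> to an equal superposition of |0> and |1>
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

psi = H @ ket0                 # the qubit is now "0 and 1 at the same time"
probs = np.abs(psi) ** 2       # Born rule: probabilities of measuring 0 or 1

print(psi)     # ≈ [0.707, 0.707]
print(probs)   # [0.5, 0.5] -- a measurement yields 0 or 1 with equal probability
```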
That branch of quantum computing is very exciting, but also super difficult to make dramatic progress in. The problem is that to do anything really interesting, you need huge numbers of qubits, and many of the architectures currently in use don’t scale all that well or quickly. We’ll probably get to the point of doing interesting computations with these kinds of systems, but I don’t think it will be revolutionizing the world all that soon.
There’s another angle on quantum computing though, what I referred to as “quantum simulation” above, and that’s more promising in the short term. It’s what’s really going on in a lot of the currently-hyped quantum devices, and there was a nice paper in Physical Review X (open-access) a couple of weeks ago giving a simple demonstration of the kind of thing you can do in this area.
GAITHERSBURG,MD – March, 5:Ari Dyckovsky, 17, works on quantum physics experiments in the Physical Measurement Lab that deals with quantum computing and simulation Monday March 5, 2012 in Gaithersburg, MD at NIST, National Institute of Standards and Technology. Ari Dyckovsky is a finalist in the Intel Science Talent search.(Photo by Katherine Frey/The Washington Post via Getty Images)
The idea behind quantum simulation is that a quantum-mechanical world is somewhat costly to simulate on a classical computer. Not only do you have to worry about particles being in multiple states at the same time, you have to worry about correlations between particles: the exact superposition of states for one particle can be tied to the state of each of the other particles in the system, regardless of where they’re located. This adds a lot of overhead: not only do you need to keep track of slightly more complicated individual states, you also need to include elements in your system that describe the “coherences” between each individual particle and all of the other particles. The resources needed to keep track of all that grow extremely rapidly as you increase the size of the simulation.
But, that’s mostly a problem if you’re thinking about doing the system with a classical computer, whose bits don’t naturally behave in a quantum manner. The idea of quantum simulation, on the other hand, is to model these kinds of systems with bits that are already quantum, and so have those kinds of superpositions and correlations already built in, with no need to add extra bits to track them. If you can map your quantum system of interest onto a system of qubits that you control very well, the system will naturally simulate exactly the kinds of quantum behavior that are most difficult to model on a classical computer.
It turns out that the scaling here makes it a lot easier to make progress, compared to the more algorithmic side of things. There’s some argument about the exact limit, but if you can get something like 50 qubits together, you’re in the realm where the best classical computers would have an extremely hard time simulating the results. This is the kind of thing that’s going on with most of the “quantum computing” systems being discussed at big companies right now: they’re doing things like finding the lowest-energy configuration for a system of interacting magnetic particles by mapping those interactions onto a network of qubits and letting it find its own lowest-energy configuration.
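A rough way to see where that ~50-qubit threshold comes from: a brute-force classical simulation has to store one complex amplitude for each of the 2^n basis states of an n-qubit register. The sketch below assumes double-precision complex numbers; the exact crossover point depends on the algorithm and the hardware, so treat the numbers as order-of-magnitude estimates.

```python
BYTES_PER_AMPLITUDE = 16  # one double-precision complex number (complex128)

def state_vector_gib(n_qubits: int) -> float:
    """Memory (GiB) needed to hold the full 2**n-amplitude state vector."""
    return (2 ** n_qubits) * BYTES_PER_AMPLITUDE / 2 ** 30

for n in (30, 40, 50):
    print(f"{n} qubits: {state_vector_gib(n):,.0f} GiB")

# 30 qubits:         16 GiB  (fits in a workstation)
# 40 qubits:     16,384 GiB  (~16 TiB, a large cluster)
# 50 qubits: 16,777,216 GiB  (~16 PiB, impractical for brute-force classical simulation)
```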
Of course, most systems that you’d be interested in for real-world applications also interact with the environment, which is a tricky problem because ensuring the integrity of a quantum computation involves working very hard to remove environmental effects that might complicate matters. That’s what the recent PRX paper is about: Putting the environment back into the simulation.
Mike & Ophelia Lazaridis Quantum-Nano Centre, Waterloo, Canada. Architect: KPMB Architects, 2012. The Institute for Quantum Computing and the Quantum-Nano Centre at sunset from Ring Road. (Photo by: Doublespace/View Pictures/UIG via Getty Images)
The specific problem they’re looking at is an energy transfer reaction, where one part of a quantum system has some extra energy and passes it to another part of the system. This plays an important role in a lot of biological processes. In photosynthesis, for example, energy absorbed by a pigment molecule in one part of a plant cell is transferred to another part of the cell to be used in driving the chemical reactions of life.
This is a simple matter if the energy levels of the destination molecule exactly match those at the start of the process, but that rarely happens. Different molecules tend to have different energy levels, and even if they start out close together, environmental factors like the temperature and chemical environment can shift those energy states around. For these reactions to proceed in a hot, wet and messy environment like a real cell, there needs to be a way to gain a little extra energy from the environment, or dump a little excess energy into the environment. But that’s a tricky thing to simulate in a system like a quantum computer where you’ve worked very hard to limit disruption from the environment.
The recent paper is demonstrating a way of doing just that, in an extremely simple “simulator” consisting of two ions held in an electrostatic trap. These ions are held in the trap by high-voltage electric fields, but they also repel each other, and can thus “talk” to each other via their collective vibrational motion. If they put the two ions in there and excite one, it’s not hard to set up a situation where that energy will move back and forth between the two via that vibrational channel.
What they do in this paper is add a laser that shifts the energy states of one of the two ions to block that easy direct transfer. Then they simulate an environment by adding some extra lasers that modify the vibrational motion so that it can either provide a little extra boost to enable the energy transfer (when the target ion needs more energy than the source has) or soak up a little energy (when the source ion has more energy than the target ion). They can monitor how the reaction proceeds by tracking the state of the target ion and compare the results to what they would expect from classical simulations of an energy transfer reaction in a noisy environment; the agreement is pretty nice.
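To get a feel for why the energy mismatch matters, here is a toy two-site model of my own, an illustrative sketch rather than the trapped-ion model used in the paper. With coupling J between donor and acceptor and a detuning Δ, the standard Rabi result says the acceptor population peaks at J²/(J² + Δ²/4), so a large mismatch chokes off the transfer unless something such as the engineered “environment” supplies or absorbs the difference.

```python
# Toy donor-acceptor model (illustrative only, not the paper's ion-trap system).
# Two sites coupled with strength J; the acceptor is detuned by Delta (hbar = 1).
# Standard Rabi result: P_acceptor(t) = J**2 / (J**2 + Delta**2 / 4) * sin(Omega*t)**2,
# with Omega = sqrt(J**2 + Delta**2 / 4), so the peak transfer is J**2 / Omega**2.
J = 1.0  # donor-acceptor coupling, arbitrary units

for delta in (0.0, 1.0, 2.0, 5.0):
    omega_sq = J**2 + (delta / 2) ** 2
    peak_transfer = J**2 / omega_sq   # maximum population that ever reaches the acceptor
    print(f"Delta = {delta:3.1f}: peak transfer = {peak_transfer:.2f}")

# Delta = 0.0: peak transfer = 1.00  (resonant: complete back-and-forth transfer)
# Delta = 1.0: peak transfer = 0.80
# Delta = 2.0: peak transfer = 0.50
# Delta = 5.0: peak transfer = 0.14  (far off resonance: almost nothing moves unless
#                                     the environment makes up the energy difference)
```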
Now, of course, this is a really simple simulation, one that you can model perfectly well with a good classical computer. In order to do something genuinely of interest for modeling biology, they’d need a lot more trapped ions and more complicated interactions. This is, however, a nice proof-of-principle experiment showing how you can use a relatively simple quantum system to simulate fairly rich physics, and get total control over not only the way the process of interest is mapped onto the simulator, but also total control over the “environment” of the simulated system.
That’s a pretty cool addition to the quantum-simulation toolkit, and another demonstration of how this is a fascinating research area in the near term. It’s going to be a long time before we have the ability to quantum teleport anybody, but quantum simulation is pushing forward rapidly, and it’s fun to watch.
Apple is said to have an entire new lineup of iPhones coming later this year, bringing the iPhone X design to a series of devices. Now, RBC Capital analyst Amit Daryanani has taken a stab at predicting the price points for this year’s new iPhones…
As a refresher, KGI, Bloomberg, and others have reported that Apple has three new iPhones on the docket for this year. First and foremost, an updated version of the 5.8-inch iPhone X that was originally introduced last year. Apple is also expected to announce a new 6.5-inch ‘iPhone X Plus’ with an OLED display, as well as a 6.1-inch variant with an LCD screen.
While those sources have offered a slew of details about the devices themselves, pricing remains a question.
RBC’s Amit Daryanani predicts that Apple’s 2018 iPhone lineup will start at “$700+” for the 6.1-inch LCD model, while the 5.8-inch OLED model could start at $899. Rounding out the lineup is the iPhone X Plus, which Daryanani predicts will start at $999 (via Kif Leswing).
Essentially, RBC is predicting that Apple will shift prices down by $100 across the lineup. The iPhone X is currently priced at $999, so Daryanani is predicting a $100 price cut there to make room for the iPhone X Plus at the high end of the market, starting at $999.
As for the 6.1-inch model with an LCD display, a $799 price point doesn’t seem unreasonable as that’s what the iPhone 8 Plus currently runs, and the two will likely be similar in terms of specs.
Furthermore, some analysts have raised concerns about Apple’s pricing. A report last week said that as Apple raises its iPhone pricing, consumers are becoming less willing to spend as much on smartphones. Thus, moving the iPhone X down to $899 and offering the same notch design in the LCD model at an even lower price could attract some additional customers.
What do you think of these predictions? Let us know down in the comments!