How Stable Manure Protects Against Allergies

Scienceadaily.com - Improved hygiene has largely eliminated infectious diseases from everyday life. There is, however, a downside to this progress: the number of allergies is growing steadily. If the immune system is not kept busy by bacteria, viruses and worms, it sometimes overreacts to harmless things like pollen.


Researchers funded by the SNSF have now investigated the mechanism underlying the "farmhouse effect", which indicates that children who grow up on a farm are less likely to suffer from allergies. "We can only use superficial parameters to study children's immune systems," says lead investigator Philippe Eigenmann from Geneva University Hospitals. "This is why we wanted to study the allergic reaction of mice in detail."

Laboratory mice in the cowshed

For the experiment, the research group working with Eigenmann transferred mice directly to a cowshed in Vollèges near Martigny (Valais), the first time this had been done for an allergy study. Mice born on the farm reacted less intensely to an artificial allergen than those born in the laboratory. This was determined by measuring the extent of ear swelling. Mice that were not transferred to the cowshed until four weeks after birth were slightly less well protected. This finding is consistent with data from earlier studies in humans. Eigenmann adds: "Children of farmers' wives who worked in animal sheds while they were pregnant accordingly have even fewer problems with allergies."

Why probiotic foods don't really work

A comparison of cells and signalling substances in the immune system also shows that the reactions differ considerably. The immune defence of the farm-born mice was constantly activated but at the same time powerfully regulated by germs from the cowshed. "The immune system evidently learns to moderate its response," Eigenmann explains.

The animals' gut flora also differed depending on their living conditions. The diversity of bacteria was greater in the intestinal tract of farm mice, and a certain type of virus was present in larger quantities. These mastadenoviruses could be the factor moderating the immune response.

The changes in gut flora and the immune system were many and varied, which explains why certain preventive measures based on the administration of individual germs have little effect. The bacterial strains in probiotic foods such as yogurts are one example. Another is the administration of deactivated threadworm eggs. "We need to take as global an approach as possible to the factors and rethink our concept of cleanliness," Eigenmann explains.

Source : Swiss National Science Foundation (SNSF)

Clinical Guidelines to Prevent Peanut Allergy

Scienceadaily.com - An expert panel sponsored by the National Institute of Allergy and Infectious Diseases (NIAID), part of the National Institutes of Health, issued clinical guidelines today to aid health care providers in early introduction of peanut-containing foods to infants to prevent the development of peanut allergy.

Peanut allergy is a growing health problem for which no treatment or cure exists. People living with peanut allergy, and their caregivers, must be vigilant about the foods they eat and the environments they enter to avoid allergic reactions, which can be severe and even life-threatening. The allergy tends to develop in childhood and persist through adulthood. However, recent scientific research has demonstrated that introducing peanut-containing foods into the diet during infancy can prevent the development of peanut allergy.


The new Addendum Guidelines for the Prevention of Peanut Allergy in the United States supplement the 2010 Guidelines for the Diagnosis and Management of Food Allergy in the United States. The addendum provides three separate guidelines for infants at various levels of risk for developing peanut allergy and is targeted to a wide variety of health care providers, including pediatricians and family practice physicians.

“Living with peanut allergy requires constant vigilance. Preventing the development of peanut allergy will improve and save lives and lower health care costs,” said NIAID Director Anthony S. Fauci, M.D. “We expect that widespread implementation of these guidelines by health care providers will prevent the development of peanut allergy in many susceptible children and ultimately reduce the prevalence of peanut allergy in the United States.”

Addendum Guideline 1 focuses on infants deemed at high risk of developing peanut allergy because they already have severe eczema, egg allergy or both. The expert panel recommends that these infants have peanut-containing foods introduced into their diets as early as 4 to 6 months of age to reduce the risk of developing peanut allergy. Parents and caregivers should check with their infant’s health care provider before feeding the infant peanut-containing foods. The health care provider may choose to perform an allergy blood test or send the infant to a specialist for other tests, such as a skin prick test or an oral food challenge. The results of these tests will help decide if and how peanut should be safely introduced into the infant’s diet.

Guideline 2 suggests that infants with mild or moderate eczema should have peanut-containing foods introduced into their diets around 6 months of age to reduce the risk of peanut allergy. Guideline 3 suggests that infants without eczema or any food allergy have peanut-containing foods freely introduced into their diets.

In all cases, infants should start other solid foods before they are introduced to peanut-containing foods.
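
For readers who want the three recommendations at a glance, they boil down to a lookup from risk category to suggested action. The sketch below is a simplified, hypothetical summary for illustration only; it is not a clinical decision tool, and the guidelines themselves direct caregivers of high-risk infants to a health care provider first.

```python
# Hypothetical one-glance summary of the three addendum guidelines
# (illustration only, not medical advice; wording is simplified here).
GUIDELINES = {
    "severe eczema and/or egg allergy": (
        "Introduce peanut-containing foods at 4-6 months, but only after "
        "evaluation by a health care provider (allergy blood test, skin prick "
        "test or supervised oral food challenge may be advised first)."
    ),
    "mild or moderate eczema": (
        "Introduce peanut-containing foods around 6 months of age."
    ),
    "no eczema or food allergy": (
        "Introduce peanut-containing foods freely alongside other solid foods."
    ),
}

def recommendation(risk_category: str) -> str:
    """Return the simplified guideline text for a given risk category."""
    return GUIDELINES[risk_category]

print(recommendation("mild or moderate eczema"))
```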

Development of the Addendum Guidelines was prompted by emerging data suggesting that peanut allergy can be prevented by the early introduction of peanut-containing foods. Clinical trial results reported in February 2015 showed that regular peanut consumption begun in infancy and continued until 5 years of age led to an 81 percent reduction in development of peanut allergy in infants deemed at high risk because they already had severe eczema, egg allergy or both. This finding came from the landmark, NIAID-funded Learning Early About Peanut Allergy (LEAP) study, a randomized clinical trial involving more than 600 infants.

“The LEAP study clearly showed that introduction of peanut early in life significantly lowered the risk of developing peanut allergy by age 5. The magnitude of the benefit and the scientific strength of the study raised the need to operationalize these findings by developing clinical recommendations focused on peanut allergy prevention,” said Daniel Rotrosen, M.D., director of NIAID’s Division of Allergy, Immunology and Transplantation.

In 2015, NIAID established a coordinating committee representing 26 professional organizations, advocacy groups and federal agencies to oversee development of the Addendum Guidelines to specifically address the prevention of peanut allergy. The coordinating committee convened a 26-member expert panel comprising specialists from a variety of relevant clinical, scientific and public health areas. The panel, chaired by Joshua Boyce, M.D., professor of medicine and pediatrics at Harvard Medical School, used a literature review of food allergy prevention research and their own expert opinions to prepare draft guidelines. The draft guidelines were available on the NIAID website for public comment from March 4 to April 18, 2016. The expert panel and coordinating committee reviewed the 104 comments received to develop the final Addendum Guidelines.

Source : NIH/National Institute of Allergy and Infectious Diseases

Could Parkinson’s Disease Start in the Gut?

Scienceadaily.com - Parkinson’s disease may start in the gut and spread to the brain via the vagus nerve, according to a study published in the April 26, 2017, online issue of Neurology®, the medical journal of the American Academy of Neurology. The vagus nerve extends from the brainstem to the abdomen and controls unconscious body processes like heart rate and food digestion.

The preliminary study examined people who had resection surgery, removing the main trunk or branches of the vagus nerve. The surgery, called vagotomy, is used for people with ulcers. Researchers used national registers in Sweden to compare 9,430 people who had a vagotomy over a 40-year period to 377,200 people from the general population. During that time, 101 people who had a vagotomy developed Parkinson’s disease, or 1.07 percent, compared to 4,829 people in the control group, or 1.28 percent. This difference was not significant.
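
The crude percentages above follow directly from the reported counts; a minimal Python check of that arithmetic (illustrative only — the published registry analysis adjusts for follow-up time and other conditions):

```python
# Quick check of the crude percentages reported above (illustrative arithmetic
# only; the published registry analysis adjusts for follow-up and comorbidities).
vagotomy_cases, vagotomy_total = 101, 9_430
control_cases, control_total = 4_829, 377_200

vagotomy_rate = 100 * vagotomy_cases / vagotomy_total   # ~1.07 percent
control_rate = 100 * control_cases / control_total      # ~1.28 percent

print(f"vagotomy group:     {vagotomy_rate:.2f} %")
print(f"general population: {control_rate:.2f} %")
print(f"crude risk ratio:   {vagotomy_rate / control_rate:.2f}")  # difference not significant
```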

Parkinson's disease may start in the gut and spread to the brain, according to a study. Credit: wikimedia.org

But when researchers analyzed the results for the two different types of vagotomy surgery, they found that people who had a truncal vagotomy at least five years earlier were less likely to develop Parkinson’s disease than those who had not had the surgery and had been followed for at least five years. In a truncal vagotomy, the nerve trunk is fully resected. In a selective vagotomy, only some branches of the nerve are resected.

A total of 19 people who had truncal vagotomy at least five years earlier developed the disease, or 0.78 percent, compared to 3,932 people who had no surgery and had been followed for at least five years, at 1.15 percent. By contrast, 60 people who had selective vagotomy five years earlier developed Parkinson’s disease, or 1.08 percent. After adjusting for factors such as chronic obstructive pulmonary disease, diabetes, arthritis and other conditions, researchers found that people who had a truncal vagotomy at least five years before were 40 percent less likely to develop Parkinson’s disease than those who had not had the surgery and had been followed for at least five years.

“These results provide preliminary evidence that Parkinson’s disease may start in the gut,” said study author Bojing Liu, MSc, of the Karolinska Institutet in Stockholm, Sweden. “Other evidence for this hypothesis is that people with Parkinson’s disease often have gastrointestinal problems such as constipation, which can start decades before they develop the disease. In addition, other studies have shown that people who will later develop Parkinson’s disease have a protein believed to play a key role in Parkinson’s disease in their gut.”

The theory is that these proteins can fold in the wrong way and spread that mistake from cell to cell.

“Much more research is needed to test this theory and to help us understand the role this may play in the development of Parkinson’s,” Liu said. Additionally, since Parkinson’s is a syndrome, there may be multiple causes and pathways.

Even though the study was large, Liu said one limitation was small numbers in certain subgroups. Also, the researchers could not control for all potential factors that could affect the risk of Parkinson’s disease, such as smoking, coffee drinking or genetics.

Source : News Wise

Ripples in the Cosmic Web

Scienceadaily.com - The far-flung corners of intergalactic space are lonely places, barren of much else but atoms. In these vast expanses between the galaxies, only solitary atoms — a haze of hydrogen gas left over from the Big Bang — occupy each cube of space one meter on a side. On the largest scale, this diffuse material is arranged in a network of filamentary structures known as the “cosmic web,” its tangled strands spanning billions of light years and accounting for the majority of atoms in the universe.

Now, a team of astronomers, including UC Santa Barbara physicist Joseph Hennawi, has made the first measurements of small-scale ripples in this primeval hydrogen gas using rare double quasars. Although the regions of cosmic web they studied lie nearly 11 billion light years away, they were able to measure variations in its structure on scales 100,000 times smaller, comparable to the size of a single galaxy. The results appear in the journal Science.

This schematic representation illustrates the technique used to probe the small-scale structure of the cosmic web using light from a rare quasar pair. Photo Credit: J. ONORBE / MPIA

Intergalactic gas is so tenuous that it emits no light of its own. Instead astronomers study it indirectly by observing how it selectively absorbs the light coming from faraway sources known as quasars. Quasars constitute a brief hyperluminous phase of the galactic life cycle powered by matter falling into a galaxy’s central supermassive black hole. Acting like cosmic lighthouses, they are bright, distant beacons that allow astronomers to study intergalactic atoms residing between the location of the quasar and the Earth. But because these hyperluminous episodes last only a tiny fraction of a galaxy’s lifetime, quasars are correspondingly rare and are typically separated from each other by hundreds of millions of light years.

In order to probe the cosmic web on much smaller length scales, the astronomers exploited a fortuitous cosmic coincidence: They identified exceedingly rare pairs of quasars and measured subtle differences in the absorption of intergalactic atoms along the two sightlines.

“Pairs of quasars are like needles in a haystack,” explained Hennawi, an associate professor in UCSB’s Department of Physics who pioneered the application of algorithms from “machine learning” — a branch of artificial intelligence — to efficiently locate quasar pairs in the massive amounts of data produced by digital imaging surveys of the night sky. “In order to find them, we combed through images of billions of celestial objects millions of times fainter than what the naked eye can see.”
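
The release does not describe the pair-search algorithm itself, but one basic geometric step in building any quasar-pair sample is picking out catalog objects that lie within a small angular separation of one another. Below is a minimal sketch of that step with made-up coordinates; it stands in for, and is not, the machine-learning pipeline Hennawi describes.

```python
# Minimal geometric pair search: flag catalog objects closer on the sky than a
# chosen angular threshold. Coordinates below are made up for illustration.
import numpy as np

ra = np.radians([150.10, 150.11, 210.50, 33.20])    # right ascension (deg -> rad)
dec = np.radians([2.20, 2.21, -5.40, 12.00])        # declination (deg -> rad)
max_sep_arcsec = 60.0                               # pair-selection threshold

def angular_sep(ra1, dec1, ra2, dec2):
    """Great-circle separation in radians (haversine formula)."""
    return 2 * np.arcsin(np.sqrt(
        np.sin((dec2 - dec1) / 2) ** 2
        + np.cos(dec1) * np.cos(dec2) * np.sin((ra2 - ra1) / 2) ** 2
    ))

pairs = []
for i in range(len(ra)):
    for j in range(i + 1, len(ra)):
        sep_arcsec = np.degrees(angular_sep(ra[i], dec[i], ra[j], dec[j])) * 3600
        if sep_arcsec < max_sep_arcsec:
            pairs.append((i, j, round(float(sep_arcsec), 1)))

print(pairs)   # here objects 0 and 1 come out as a candidate close pair
```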

Once identified, the quasar pairs were observed with the largest telescopes in the world, including the 10-meter Keck telescopes at the W.M. Keck Observatory on Mauna Kea, Hawaii, of which the University of California is a founding partner.

“One of the biggest challenges was developing the mathematical and statistical tools to quantify the tiny differences we measured in this new kind of data,” said lead author Alberto Rorai, Hennawi’s former Ph.D. student who is now a postdoctoral researcher at Cambridge University. Rorai developed these tools as part of the research for his doctoral degree and applied them to spectra of quasars with Hennawi and other colleagues.

The astronomers compared their measurements to supercomputer models that simulate the formation of cosmic structures from the Big Bang to the present. On a single laptop, these complex calculations would require almost 1,000 years to complete, but modern supercomputers enabled the researchers to carry them out in just a few weeks.

“The input to our simulations are the laws of physics and the output is an artificial universe, which can be directly compared to astronomical data,” said co-author Jose Oñorbe, a postdoctoral researcher at the Max Planck Institute for Astronomy in Heidelberg, Germany, who led the supercomputer simulation effort. “I was delighted to see that these new measurements agree with the well-established paradigm for how cosmic structures form.”

“One reason why these small-scale fluctuations are so interesting is that they encode information about the temperature of gas in the cosmic web just a few billion years after the Big Bang,” explained Hennawi.

Astronomers believe that the matter in the universe went through phase transitions billions of years ago, which dramatically changed its temperature. Known as cosmic re-ionization, these transitions occurred when the collective ultraviolet glow of all stars and quasars in the universe became intense enough to strip electrons off atoms in intergalactic space. How and when re-ionization occurred is one of the biggest open questions in the field of cosmology, and these new measurements provide important clues that will help narrate this chapter of cosmic history.

Source : University of California - Santa Barbara

Ice Cave in Transylvania Yields Window into Region's Past

Scienceadaily.com - Ice cores drilled from a glacier in a cave in Transylvania offer new evidence of how Europe's winter weather and climate patterns fluctuated during the last 10,000 years, known as the Holocene period.

The cores provide insights into how the region's climate has changed over time. The researchers' results, published this week in the journal Scientific Reports, could help reveal how the climate of the North Atlantic region, which includes the U.S., varies on long time scales.

The project, funded by the National Science Foundation (NSF) and the Romanian Ministry of Education, involved scientists from the University of South Florida (USF), University of Belfast, University of Bremen and Stockholm University, among other institutions.

Panoramic view of an ice cliff inside the Scărișoara Ice Cave, where the research was done. Credit: Gigi Fratila & Claudiu Szabo

Researchers from the Emil Racoviță Institute of Speleology in Cluj-Napoca, Romania, and USF's School of Geosciences gathered their evidence in the world's most-explored ice cave and oldest cave glacier, hidden deep in the heart of Transylvania in central Romania.

With its towering ice formations and large underground ice deposit, Scărișoara Ice Cave is among the most important scientific sites in Europe.

Scientist Bogdan Onac of USF and his colleague Aurel Perșoiu, working with a team of researchers in Scărișoara Ice Cave, sampled the ancient ice there to reconstruct winter climate conditions during the Holocene period.

Over the last 10,000 years, snow and rain dripped into the depths of Scărișoara, where they froze into thin layers of ice containing chemical evidence of past winter temperature changes.

Until now, scientists lacked long-term reconstructions of winter climate conditions. That knowledge gap hampered a full understanding of past climate dynamics, Onac said.

"Most of the paleoclimate records from this region are plant-based, and track only the warm part of the year -- the growing season," says Candace Major, program director in NSF's Directorate for Geosciences, which funded the research. "That misses half the story. The spectacular ice cave at Scărișoara fills a crucial piece of the puzzle of past climate change in recording what happens during winter."

Reconstructions of Earth's climate record have relied largely on summer conditions, charting fluctuations through vegetation-based samples, such as tree ring width, pollen and organisms that thrive in the warmer growing season.

Absent, however, were important data from winters, Onac said.

Located in the Apuseni Mountains, the region surrounding the Scărișoara Ice Cave receives precipitation from the Atlantic Ocean and the Mediterranean Sea and is an ideal location to study shifts in the courses storms follow across East and Central Europe, the scientists say.

Radiocarbon dating of minute leaf and wood fragments preserved in the cave's ice indicates that its glacier is at least 10,500 years old, making it the oldest cave glacier in the world and one of the oldest glaciers on Earth outside the polar regions.

From samples of the ice, the researchers were able to chart the details of winter conditions growing warmer and wetter over time in Eastern and Central Europe. Temperatures reached a maximum during the mid-Holocene some 7,000 to 5,000 years ago and decreased afterward toward the Little Ice Age, 150 years ago.

A major shift in atmospheric dynamics occurred during the mid-Holocene, when winter storm tracks switched, producing wetter and colder conditions in northwestern Europe and allowing a Mediterranean-type climate to expand toward southeastern Europe.

"Our reconstruction provides one of the very few winter climate reconstructions, filling in numerous gaps in our knowledge of past climate variability," Onac said.

Warming winter temperatures led to rapid environmental changes that allowed the northward expansion of Neolithic farmers toward mainland Europe, and the rapid population of the continent.

"Our data allow us to reconstruct the interplay between Atlantic and Mediterranean sources of moisture," Onac said. "We can also draw conclusions about past atmospheric circulation patterns, with implications for future climate changes. Our research offers a long-term context to better understand these changes."

The results from the study tell scientists how the climate of the North Atlantic region, which includes the U.S., varies on long time scales. The scientists are continuing their cave study, working to extend the record back 13,000 years or more.

Source : National Science Foundation

3-D Printing Offers New Approach to Making Buildings

Scienceadaily.com - The list of materials that can be produced by 3-D printing has grown to include not just plastics but also metal, glass, and even food. Now, MIT researchers are expanding the list further, with the design of a system that can 3-D print the basic structure of an entire building.

Structures built with this system could be produced faster and less expensively than traditional construction methods allow, the researchers say. A building could also be completely customized to the needs of a particular site and the desires of its maker. Even the internal structure could be modified in new ways; different materials could be incorporated as the process goes along, and material density could be varied to provide optimum combinations of strength, insulation, or other properties.

Ultimately, the researchers say, this approach could enable the design and construction of new kinds of buildings that would not be feasible with traditional building methods.

MIT researchers have designed a system that can 3-D print the basic structure of an entire building. The system consists of a tracked vehicle that carries a large industrial robotic arm, which has a smaller, precision-motion robotic arm at its end.

The robotic system is described this week in the journal Science Robotics, in a paper by Steven Keating PhD ’16, a mechanical engineering graduate and former research affiliate in the Mediated Matter group at the MIT Media Lab; Julian Leland and Levi Cai, both research assistants in the Mediated Matter group; and Neri Oxman, group director and associate professor of media arts and sciences.

The system consists of a tracked vehicle that carries a large, industrial robotic arm, which has a smaller, precision-motion robotic arm at its end. This highly controllable arm can then be used to direct any conventional (or unconventional) construction nozzle, such as those used for pouring concrete or spraying insulation material, as well as additional digital fabrication end effectors, such as a milling head.

Unlike typical 3-D printing systems, most of which use some kind of an enclosed, fixed structure to support their nozzles and are limited to building objects that can fit within their overall enclosure, this free-moving system can construct an object of any size. As a proof of concept, the researchers used a prototype to build the basic structure of the walls of a 50-foot-diameter, 12-foot-high dome — a project that was completed in less than 14 hours of “printing” time.

For these initial tests, the system fabricated the foam-insulation framework used to form a finished concrete structure. This construction method, in which polyurethane foam molds are filled with concrete, is similar to traditional commercial insulated-concrete formwork techniques. Following this approach for their initial work, the researchers showed that the system can be easily adapted to existing building sites and equipment, and that it will fit existing building codes without requiring whole new evaluations, Keating explains.

Ultimately, the system is intended to be self-sufficient. It is equipped with a scoop that could be used to both prepare the building surface and acquire local materials, such as dirt for a rammed-earth building, for the construction itself. The whole system could be operated electrically, even powered by solar panels. The idea is that such systems could be deployed to remote regions, for example in the developing world, or to areas for disaster relief after a major storm or earthquake, to provide durable shelter rapidly.



The ultimate vision is “in the future, to have something totally autonomous, that you could send to the moon or Mars or Antarctica, and it would just go out and make these buildings for years,” says Keating, who led the development of the system as his doctoral thesis work.

But in the meantime, he says, “we also wanted to show that we could build something tomorrow that could be used right away.” That’s what the team did with its initial mobile platform. “With this process, we can replace one of the key parts of making a building, right now,” he says. “It could be integrated into a building site tomorrow.”

“The construction industry is still mostly doing things the way it has for hundreds of years,” says Keating. “The buildings are rectilinear, mostly built from single materials, put together with saws and nails,” and mostly built from standardized plans.

But, Keating wondered, what if every building could be individualized and designed using on-site environmental data? In the future, the supporting pillars of such a building could be placed in optimal locations based on ground-penetrating radar analysis of the site, and walls could have varying thickness depending on their orientation. For example, a building could have thicker, more insulated walls on its north side in cold climates, or walls that taper from bottom to top as their load-bearing requirements decrease, or curves that help the structure withstand winds.

The creation of this system, which the researchers call a Digital Construction Platform (DCP), was motivated by the Mediated Matter group’s overall vision of designing buildings without parts. Such a vision includes, for example, combining “structure and skin,” and beams and windows, in a single production process, and adapting multiple design and construction processes on the fly, as the structure is being built.

From an architectural perspective, Oxman says, the project “challenges traditional building typologies such as walls, floors, or windows, and proposes that a single system could be fabricated using the DCP that can vary its properties continuously to create wall-like elements that continuously fuse into windows.”

To this end, the nozzles of the new 3-D printing system can be adapted to vary the density of the material being poured, and even to mix different materials as it goes along. In the version used in the initial tests, the device created an insulating foam shell that would be left in place after the concrete is poured; interior and exterior finish materials could be applied directly to that foam surface.

The system can even create complex shapes and overhangs, which the team demonstrated by including a wide, built-in bench in their prototype dome. Any needed wiring and plumbing can be inserted into the mold before the concrete is poured, providing a finished wall structure all at once. It can also incorporate data about the site collected during the process, using built-in sensors for temperature, light, and other parameters to make adjustments to the structure as it is built.

Keating says the team’s analysis shows that such construction methods could produce a structure faster and less expensively than present methods can, and would also be much safer. (Construction is one of the most dangerous occupations, and this system requires less hands-on work.) In addition, because shapes and thicknesses can be optimized for what is needed structurally, rather than having to match what’s available in premade lumber and other materials, the total amount of material needed could be reduced.

The platform represents more than an engineering advance, Oxman notes. “Making it faster, better, and cheaper is one thing. But the ability to design and digitally fabricate multifunctional structures in a single build embodies a shift from the machine age to the biological age — from considering the building as a machine to live in, made of standardized parts, to the building as an organism, which is computationally grown, additively manufactured, and possibly biologically augmented.”

“So to me it’s not merely a printer,” she says, “but an entirely new way of thinking about making, that facilitates a paradigm shift in the area of digital fabrication, but also for architectural design. … Our system points to a future vision of digital construction that enables new possibilities on our planet and beyond.”

Source : Massachusetts Institute of Technology

NASA Spacecraft Dives Between Saturn and Its Rings

Scienceadaily.com - NASA's Cassini spacecraft is back in contact with Earth after its successful first-ever dive through the narrow gap between the planet Saturn and its rings on April 26, 2017. The spacecraft is in the process of beaming back science and engineering data collected during its passage, via NASA's Deep Space Network Goldstone Complex in California's Mojave Desert. The DSN acquired Cassini's signal at 11:56 p.m. PDT on April 26, 2017 (2:56 a.m. EDT on April 27) and data began flowing at 12:01 a.m. PDT (3:01 a.m. EDT) on April 27.

"In the grandest tradition of exploration, NASA's Cassini spacecraft has once again blazed a trail, showing us new wonders and demonstrating where our curiosity can take us if we dare," said Jim Green, director of the Planetary Science Division at NASA Headquarters in Washington.

This unprocessed image shows features in Saturn's atmosphere from closer than ever before. The view was captured by NASA's Cassini spacecraft during its first Grand Finale dive past the planet on April 26, 2017. Credit: NASA/JPL-Caltech/Space Science Institute 

As it dove through the gap, Cassini came within about 1,900 miles (3,000 kilometers) of Saturn's cloud tops (where the air pressure is 1 bar -- comparable to the atmospheric pressure of Earth at sea level) and within about 200 miles (300 kilometers) of the innermost visible edge of the rings.

While mission managers were confident Cassini would pass through the gap successfully, they took extra precautions with this first dive, as the region had never been explored.

"No spacecraft has ever been this close to Saturn before. We could only rely on predictions, based on our experience with Saturn's other rings, of what we thought this gap between the rings and Saturn would be like," said Cassini Project Manager Earl Maize of NASA's Jet Propulsion Laboratory in Pasadena, California. "I am delighted to report that Cassini shot through the gap just as we planned and has come out the other side in excellent shape."

The gap between the rings and the top of Saturn's atmosphere is about 1,500 miles (2,000 kilometers) wide. The best models for the region suggested that if there were ring particles in the area where Cassini crossed the ring plane, they would be tiny, on the scale of smoke particles. The spacecraft zipped through this region at speeds of about 77,000 mph (124,000 kph) relative to the planet, so small particles hitting a sensitive area could potentially have disabled the spacecraft.

As a protective measure, the spacecraft used its large, dish-shaped high-gain antenna (13 feet or 4 meters across) as a shield, orienting it in the direction of oncoming ring particles. This meant that the spacecraft was out of contact with Earth during the ring-plane crossing, which took place at 2 a.m. PDT (5 a.m. EDT) on April 26. Cassini was programmed to collect science data while close to the planet and turn toward Earth to make contact about 20 hours after the crossing.

Cassini's next dive through the gap is scheduled for May 2.

Launched in 1997, Cassini arrived at Saturn in 2004. Following its last close flyby of the large moon Titan on April 21 PDT (April 22 EDT), Cassini began what mission planners are calling its "Grand Finale." During this final chapter, Cassini loops Saturn approximately once per week, making a total of 22 dives between the rings and the planet. Data from this first dive will help engineers understand if and how they will need to protect the spacecraft on its future ring-plane crossings. The spacecraft is on a trajectory that will eventually plunge into Saturn's atmosphere -- and end Cassini's mission -- on Sept. 15, 2017.

More information about Cassini's Grand Finale, including images and video, is available at:

www.saturn.jpl.nasa.gov/grandfinale

The Cassini-Huygens mission is a cooperative project of NASA, ESA (European Space Agency) and the Italian Space Agency. JPL, a division of Caltech in Pasadena, California, manages the mission for NASA's Science Mission Directorate. JPL designed, developed and assembled the Cassini orbiter.

More information about Cassini is at:

www.nasa.gov/cassini
www.saturn.jpl.nasa.gov

Source : NASA/Jet Propulsion Laboratory

Smartphone-Controlled Cells Help Keep Diabetes in Check

Scienceadaily.com - Cells engineered to produce insulin under the command of a smartphone helped keep blood sugar levels within normal limits in diabetic mice, a new study reports.

More than 415 million people worldwide are living with diabetes, and frequently need to inject themselves with insulin to manage their blood sugars. Human cells can be genetically engineered into living factories that efficiently manufacture and deliver hormones and signaling molecules, but most synthetic biological circuits don't offer the same degree of sensitivity and precision as digital sensors.

Combining living tissues and technology, Jiawei Shao et al. created custom cells that produced insulin when illuminated by far-red light (the same wavelengths emitted by therapy bulbs and infrared saunas).

The researchers added the cells to a soft bio-compatible sheath that also contained wirelessly-powered red LED lights to create HydrogeLEDs that could be turned on and off by an external electromagnetic field.

Implanting the HydrogeLEDs into the skin of diabetic mice allowed Shao and colleagues to administer insulin doses remotely through a smartphone application. They not only custom-coded the smartphone control algorithms, but also designed the engineered cells to produce insulin without any "cross-talk" with normal cellular signaling processes.

The scientists went on to pair the system with a Bluetooth-enabled blood glucose meter, creating instant feedback between the therapeutic cells and the diagnostic device that helped diabetic animals rapidly achieve and maintain stable blood glucose levels in a small pilot experiment over a period of several weeks.
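
Conceptually, the feedback loop pairs a glucose reading with an on/off decision for the field that powers the implanted LEDs. The sketch below is a schematic illustration of that idea using hypothetical function names, thresholds and a simulated meter; it is not the authors' app code.

```python
# Schematic closed-loop controller (hypothetical names, thresholds and a
# simulated meter; illustrates the feedback idea only, not the published app).
import random

TARGET_MMOL_L = 7.0        # illustrative upper bound for blood glucose

def read_glucose_from_meter() -> float:
    """Stand-in for a Bluetooth glucose meter reading (simulated here)."""
    return random.uniform(4.0, 12.0)

def set_implant_field(on: bool) -> None:
    """Stand-in for switching the electromagnetic field that powers the far-red LEDs."""
    print("field", "ON " if on else "OFF", "-> insulin production", "enabled" if on else "paused")

def control_step() -> None:
    glucose = read_glucose_from_meter()
    # Far-red illumination triggers insulin release from the engineered cells,
    # so the field is switched on only while glucose is above the target.
    set_implant_field(on=glucose > TARGET_MMOL_L)
    print(f"glucose reading: {glucose:.1f} mmol/L")

for _ in range(3):          # three illustrative control cycles
    control_step()
```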

The authors say that successfully linking digital signals with engineered cells represents an important step toward translating similar cell-based therapies into the clinic. A related Focus by Mark Gomelsky highlights the findings further.

Source : Science Daily

Number Abilities in Humans, Birds and Fish are Based in Brain’s Subcortex

Scienceadaily.com - Cognitive neuroscience researcher Joonkoo Park at the University of Massachusetts Amherst recently received a five-year, $751,000 faculty early career development (CAREER) grant from the National Science Foundation (NSF) to address basic research questions about how our brains process number and magnitude and how such processes give rise to more complex mathematical thinking. He has now co-authored a paper, reported this week, on where in the brain numerical quantity evaluation is processed.

In a series of experiments, Park and colleagues at Carnegie Mellon University used a psychophysical method that allowed them to “explore the extent to which the adult human subcortex contributes to number processing,” in particular to distinguish between cortical and subcortical involvement. Details appear in the current online edition of Proceedings of the National Academy of Sciences.


As Park explains, people can tell at a quick glance the difference between 8 and 10 apples without counting them. “It’s called number sense, and it’s evolutionarily ancient,” he notes. “That is, we share this ability with other primates, mammals, birds and fish. Even babies can discriminate between 10 and 20 dots well before they learn to count.”

In the recent study led by Marlene Behrmann at Carnegie Mellon, Park and colleagues measured college students’ performance in making numerical judgments on two dot array images presented very briefly one after another. Sometimes these two images were presented to a single eye (monocular presentation), and other times these two images were presented to different eyes (dichoptic presentation).

Under the monocular, but not the dichoptic, presentation, the visual information reaches the same subcortical structure. So, if participants do better in the monocular condition compared to the dichoptic condition, one can conclude that the subcortex is involved in numerical judgment. Indeed, the researchers found that participants performed better under monocular presentation when making the numerical judgment, especially when they discriminated two dot arrays with large ratios (4:1 or 3:1).

While it has been well established that humans share such a primitive numerical ability with other animals and even invertebrates, the brain basis of such an ability has been largely unknown, Park explains. This is because many other animals known to possess such a numerical ability have very limited computational power provided by the cortex. This new finding suggests that the coarse, primitive numerical ability shared across many species stems from the subcortex, an evolutionarily older brain structure.

With his CAREER grant, Park plans to study further questions that remain about the nature of this skill. Understanding mathematical ability is not only of interest to basic neuroscience but to educators who want to improve math education, he says. Similar to language development, the creation and use of mathematics is uniquely human, yet little is understood about the cognitive and neural processes that support it.

The neuroscientist points out that some research suggests that the sense of magnitude, which allows us to judge which is more and which is less without counting or using numerical symbols, provides a rudimentary foundation for mathematical thinking. But the picture is far from complete.

He says, “My research interest revolves around the neural basis of numerical cognition, its development, and individual differences, such as who is good at learning numbers and math, who is not, and why.”

One controversy Park is particularly interested in investigating is whether “number sense” involves judgment about numbers or is derived from judgment about mass/size. “It’s a technical distinction,” he says, “but actually quite important from a theoretical point of view. It goes back at least to Kant, who argued that we have an innate sense of number, space and time. We and other creatures may have been born with this ability but how the sense of number emerges is still an open question.”

Using a series of electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) studies, Park plans to go beyond naming the brain region where magnitude processing takes place to identify the anatomy and function of neural pathways involved in magnitude processing and reveal neural mechanisms that support mathematical thinking. He says the EEG and fMRI techniques “counter each other’s weaknesses” in pathway analysis.

In addition, Park plans to pursue practical applications. By relating results to individual differences in more complex mathematical ability, he hopes to provide new insights into the factors that underlie successful math education. “This is meaningful because a lot of recent studies have shown that young children’s number sense is a reasonable predictor of their math skills. We’ll study to what extent the brain’s representation of magnitude is directly related to different aspects of more complex math ability such as geometry or arithmetic.”

Park also plans to create a new course in cognitive neuroscience methods for undergraduates at UMass Amherst and will engage pre-college high school students in his research over the summer. He hopes to engage underrepresented minority students, children and families from diverse backgrounds in the scientific research.

Source : University of Massachusetts at Amherst

How to Color a Lizard: From Biology to Mathematics

Scienceadaily.com - From the clown fish to leopards, skin colour patterns in animals arise from microscopic interactions among coloured cells that obey equations discovered by the mathematician Alan Turing. Today, researchers at the University of Geneva (UNIGE), Switzerland, and SIB Swiss Institute of Bioinformatics report in the journal Nature that a southwestern European lizard slowly acquires its intricate adult skin colour by changing the colour of individual skin scales using an esoteric computational system invented in 1948 by another mathematician: John von Neumann. The Swiss team shows that the 3D geometry of the lizard’s skin scales causes the Turing mechanism to transform into the von Neumann computing system, allowing biology-driven research to link, for the first time, the work of these two giants in mathematics.


A multidisciplinary team of biologists, physicists and computer scientists led by Michel Milinkovitch, professor at the Department of Genetics and Evolution of the UNIGE Faculty of Science, Switzerland, and Group Leader at the SIB Swiss Institute of Bioinformatics, realised that the brown juvenile ocellated lizard (Timon lepidus) gradually transforms its skin colour as it ages to reach an intricate adult labyrinthine pattern where each scale is either green or black. This observation is at odds with the mechanism, discovered in 1952 by the mathematician Alan Turing, that involves microscopic interactions among coloured cells. To understand why the pattern forms at the level of scales, rather than at the level of biological cells, two PhD students, Liana Manukyan and Sophie Montandon, followed individual lizards over four years of their development, from hatchlings crawling out of the egg to fully mature animals. At multiple time points, they reconstructed the geometry and colour of the network of scales using a very high-resolution robotic system developed previously in the Milinkovitch laboratory.

Flipping from green to black

The researchers were then surprised to see the brown juvenile scales change to green or black, then continue flipping colour (between green and black) during the life of the animal. This very strange observation prompted Milinkovitch to suggest that the skin scale network forms a so-called ‘cellular automaton’. This esoteric computing system was invented in 1948 by the mathematician John von Neumann. Cellular automata are lattices of elements in which each element changes its state (here, its colour, green or black) depending on the states of neighbouring elements. The elements are called cells but are not meant to represent biological cells; in the case of the lizards, they correspond to individual skin scales. These abstract automata were extensively used to model natural phenomena, but the UNIGE team discovered what seems to be the first case of a genuine 2D automaton appearing in a living organism. Analyses of the four years of colour change allowed the Swiss researchers to confirm Milinkovitch’s hypothesis: the scales were indeed flipping colour depending on the colours of their neighbouring scales. Computer simulations implementing the discovered mathematical rule generated colour patterns that could not be distinguished from the patterns of real lizards.
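
To make the idea of a scale-level cellular automaton concrete, the toy simulation below runs a lattice of "scales" that flip between green and black according to their neighbours' colours. The update rule here is a made-up illustration of how such an automaton works, not the rule the Geneva team extracted from their lizard data.

```python
# Illustrative toy cellular automaton (NOT the rule fitted in the Nature paper):
# each "scale" is green (1) or black (0) and flips colour with a probability
# that increases with the number of same-coloured neighbours, nudging the
# lattice toward a labyrinth-like pattern in which neighbours tend to differ.
import numpy as np

rng = np.random.default_rng(0)
N, steps = 40, 200
scales = rng.integers(0, 2, size=(N, N))          # random juvenile-like start

def same_colour_neighbours(grid):
    """Count the 4 nearest neighbours sharing each cell's colour (toroidal grid)."""
    count = np.zeros_like(grid)
    for shift, axis in [(1, 0), (-1, 0), (1, 1), (-1, 1)]:
        count += (np.roll(grid, shift, axis=axis) == grid)
    return count

for _ in range(steps):
    same = same_colour_neighbours(scales)
    flip_prob = 0.05 + 0.20 * same                 # more same-coloured neighbours -> more likely to flip
    flips = rng.random(scales.shape) < flip_prob
    scales = np.where(flips, 1 - scales, scales)

# Crude text rendering of part of the lattice: G = green scale, B = black scale.
print("\n".join("".join("G" if s else "B" for s in row) for row in scales[:10]))
```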

How could the interactions among pigment cells, described by Turing equations, generate a von Neumann automaton exactly superposed on the skin scales? The skin of a lizard is not flat: it is very thin between scales and much thicker at the center of them. Given that Turing’s mechanism involves movements of cells, or the diffusion of signals produced by cells, Milinkovitch understood that this variation in skin thickness could affect Turing’s mechanism. The researchers then performed computer simulations including skin thickness and saw a cellular automaton behaviour emerge, demonstrating that a cellular automaton as a computational system is not just an abstract concept developed by John von Neumann, but also corresponds to a natural process generated by biological evolution.

The need for a formal mathematical analysis

However, the automaton behaviour was imperfect, as the mathematics behind Turing’s mechanism and von Neumann’s automaton are very different. Milinkovitch called in the mathematician Stanislav Smirnov, professor at the UNIGE, who was awarded the Fields Medal in 2010. Before long, Smirnov derived a so-called discretisation of Turing’s equations that would constitute a formal link with von Neumann’s automaton. Anamarija Fofonjka, a third PhD student in Milinkovitch’s team, implemented Smirnov’s new equations in computer simulations, obtaining a system that had become indistinguishable from a von Neumann automaton. The highly multidisciplinary team of researchers had closed the loop in this amazing journey, from biology to physics to mathematics and back to biology.

Source : Université de Genève

Stanford Scientists Test Links Between Extreme Weather and Climate Change

Scienceadaily.com - After an unusually intense heat wave, downpour or drought, Noah Diffenbaugh and his research group inevitably receive phone calls and emails asking whether human-caused climate change played a role.

“The question is being asked by the general public and by people trying to make decisions about how to manage the risks of a changing climate,” said Diffenbaugh, a professor of Earth system science at Stanford’s School of Earth, Energy & Environmental Sciences. “Getting an accurate answer is important for everything from farming to insurance premiums, to international supply chains, to infrastructure planning.”

In the past, scientists typically avoided linking individual weather events to climate change, citing the challenges of teasing apart human influence from the natural variability of the weather. But that is changing.

Stanford Professor Noah Diffenbaugh and his research group have developed a framework for testing whether global warming has contributed to particular extreme weather events. (Image credit: Shutterstock)

“Over the past decade, there’s been an explosion of research, to the point that we are seeing results released within a few weeks of a major event,” said Diffenbaugh, who is also the Kimmelman Family Senior Fellow at the Stanford Woods Institute for the Environment.

In a new study, published in this week’s issue of Proceedings of the National Academy of Sciences, Diffenbaugh and a group of current and former Stanford colleagues outline a four-step “framework” for testing whether global warming has contributed to record-setting weather events. The new paper is the latest in a burgeoning field of climate science called “extreme event attribution,” which combines statistical analyses of climate observations with increasingly powerful computer models to study the influence of climate change on individual extreme weather events.

Climate change fingerprints

In order to avoid inappropriately attributing an event to climate change, the authors began with the assumption that global warming had played no role, and then used statistical analyses to test whether that assumption was valid. “Our approach is very conservative,” Diffenbaugh said. “It’s like the presumption of innocence in our legal system: The default is that the weather event was just bad luck, and a really high burden of proof is required to assign blame to global warming.”
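
The "presumption of innocence" logic can be illustrated with a toy calculation: build a no-warming null by detrending and resampling the observations, then ask how often that stationary world produces a year as extreme as the record. The sketch below uses synthetic data and a generic bootstrap; it illustrates the general idea only, not the paper's specific four-step framework.

```python
# Generic illustration of the null-first logic (not the paper's specific
# four-step framework): how often would a stationary climate, built by
# resampling detrended observations, produce a year as hot as the record?
import numpy as np

rng = np.random.default_rng(42)

years = np.arange(1950, 2017)
# Synthetic annual-mean temperature anomalies: a warming trend plus noise.
temps = 0.02 * (years - years[0]) + rng.normal(0.0, 0.3, size=years.size)
record = temps.max()

# Null hypothesis: no warming. Remove the linear trend, then bootstrap.
trend = np.polyval(np.polyfit(years, temps, 1), years)
detrended = temps - trend + temps.mean()

n_boot = 20_000
sims = rng.choice(detrended, size=(n_boot, years.size), replace=True)
p_null = np.mean(sims.max(axis=1) >= record)

print(f"record anomaly: {record:.2f} degC")
print(f"chance of an equal-or-hotter year under the no-warming null: {p_null:.4f}")
# A very small p_null is evidence against "just bad luck".
```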

The authors applied their framework to the hottest, wettest and driest events that have occurred in different areas of the world. They found that global warming from human emissions of greenhouse gases has increased the odds of the hottest events across more than 80 percent of the surface area of the globe for which observations were available. “Our results suggest that the world isn’t quite at the point where every record hot event has a detectable human fingerprint, but we are getting close,” Diffenbaugh said.

For the driest and wettest events, the authors found that human influence on the atmosphere has increased the odds across approximately half of the area that has reliable observations. “Precipitation is inherently noisier than temperature, so we expect the signal to be less clear,” Diffenbaugh said. “One of the clearest signals that we do see is an increase in the odds of extreme dry events in the tropics. This is also where we see the biggest increase in the odds of protracted hot events – a combination that poses real risks for vulnerable communities and ecosystems.”

The Stanford research team, which includes a number of former students and postdocs who have moved on to positions at other universities, has been developing the extreme event framework in recent years, focusing on individual events such as the 2012-2017 California drought and the catastrophic flooding in northern India in June 2013. In the new study, a major goal was to test the ability of the framework to evaluate events in multiple regions of the world, and to extend beyond extreme temperature and precipitation, which have been the emphasis of most event attribution studies.

Test cases

One high-profile test case was Arctic sea ice, which has declined by around 40 percent during the summer season over the past three decades. When the team members applied their framework to the record-low Arctic sea ice cover observed in September 2012, they found overwhelming statistical evidence that global warming contributed to the severity and probability of the 2012 sea ice measurements. “The trend in the Arctic has been really steep, and our results show that it would have been extremely unlikely to achieve the record-low sea ice extent without global warming,” Diffenbaugh said.

Another strength of a multi-pronged approach, the team said, is that it can be used to study not only the weather conditions at the surface, but also the meteorological “ingredients” that contribute to rare events. “For example, we found that the atmospheric pressure pattern that occurred over Russia during the 2010 heat wave has become more likely in recent decades, and that global warming has contributed to those odds,” said co-author Daniel Horton, an assistant professor at Northwestern University in Evanston, Illinois, and a former postdoc in Diffenbaugh’s lab who has led research on the influence of atmospheric pressure patterns on surface temperature extremes. “If the odds of an individual ingredient are changing – like the pressure patterns that lead to heat waves – that puts a thumb on the scales for the extreme event.”

Diffenbaugh sees the demand for rigorous, quantitative event attribution growing in the coming years. “When you look at the historical data, there’s no question that global warming is happening and that extremes are increasing in many areas of the world,” he said. “People make a lot of decisions – short term and long term – that depend on the weather, so it makes sense that they want to know whether global warming is making record-breaking events more likely. As scientists, we want to make sure that they have accurate, objective, transparent information to work with when they make those decisions.”

Other authors on the study, titled “Quantifying the influence of global warming on unprecedented extreme climate events,” include Danielle Touma, Allison Charland, Yunjie Liu and Bala Rajaratnam of Stanford University, and Stanford alumni Deepti Singh and Justin Mankin (now at the Lamont-Doherty Earth Observatory of Columbia University); Daniel Swain and Michael Tsiang (now at the University of California, Los Angeles); and Matz Haugen (now at the University of Chicago). Funding was provided by the U.S. National Science Foundation, the Department of Energy, the National Institutes of Health and Stanford University.

Source : Stanford University

UCF Professor Invents Way to Trigger Artificial Photosynthesis to Clean Air

Scienceadaily.com - A chemistry professor has just found a way to trigger the process of photosynthesis in a synthetic material, turning greenhouse gases into clean air and producing energy all at the same time.

The process has great potential for creating a technology that could significantly reduce greenhouse gases linked to climate change, while also creating a clean way to produce energy.

Fernando Uribe-Romo on Synthetic Photosynthesis
A UCF assistant professor has found a way to trigger photosynthesis in a synthetic material. The process can turn greenhouse gas into clean air and produce energy at the same time.

“This work is a breakthrough,” said UCF Assistant Professor Fernando Uribe-Romo. “Tailoring materials that will absorb a specific color of light is very difficult from the scientific point of view, but from the societal point of view we are contributing to the development of a technology that can help reduce greenhouse gases.”

The findings of his research are published in the Journal of Materials Chemistry A.

Uribe-Romo and his team of students created a way to trigger a chemical reaction in a synthetic material called metal–organic frameworks (MOF) that breaks down carbon dioxide into harmless organic materials. Think of it as an artificial photosynthesis process similar to the way plants convert carbon dioxide (CO2) and sunlight into food. But instead of producing food, Uribe-Romo’s method produces solar fuel.



It’s something scientists around the world have been pursuing for years, but the challenge is finding a way for visible light to trigger the chemical transformation. Ultraviolet rays have enough energy to allow the reaction in common materials such as titanium dioxide, but UV light makes up only about 4 percent of the light Earth receives from the sun. The visible range – the violet to red wavelengths – represents the majority of the sun’s rays, but there are few materials that pick up these light colors to create the chemical reaction that transforms CO2 into fuel.

Researchers have tried it with a variety of materials, but the ones that can absorb visible light tend to be rare and expensive materials such as platinum, rhenium and iridium that make the process cost-prohibitive.

Uribe-Romo used titanium, a common nontoxic metal, and added organic molecules that act as light-harvesting antennae to see if that configuration would work.  The light harvesting antenna molecules, called N-alkyl-2-aminoterephthalates, can be designed to absorb specific colors of light when incorporated in the MOF. In this case he synchronized it for the color blue.

His team assembled a blue LED photoreactor to test out the hypothesis. Measured amounts of carbon dioxide were slowly fed into the photoreactor — a glowing blue cylinder that looks like a tanning bed — to see if the reaction would occur. The glowing blue light came from strips of LED lights inside the chamber of the cylinder and mimic the sun’s blue wavelength.

It worked: the chemical reaction transformed the CO2 into two reduced forms of carbon, formate and formamides (two kinds of solar fuel), cleaning the air in the process.

“The goal is to continue to fine-tune the approach so we can create greater amounts of reduced carbon so it is more efficient,” Uribe-Romo said.

He wants to see whether other wavelengths of visible light may also trigger the reaction with adjustments to the synthetic material. If it works, the process could be a significant way to help reduce greenhouse gases.

“The idea would be to set up stations that capture large amounts of CO2, like next to a power plant. The gas would be sucked into the station, go through the process and recycle the greenhouse gases while producing energy that would be put back into the power plant.”

Perhaps someday homeowners could purchase rooftop shingles made of the material, which would clean the air in their neighborhood while producing energy that could be used to power their homes.

“That would take new technology and infrastructure to happen,” Uribe-Romo said. “But it may be possible.”

Other members of the team who worked on the paper include UCF graduate student Matt Logan, who is pursuing a Ph.D. in chemistry, and undergraduate student Jeremy Adamson, who is majoring in biomedical sciences. Kenneth Hanson and his research group at Florida State University helped interpret the results of the experiments.

Source : University of Central Florida

New Quantum Liquid Crystals May Play Role in Future of Computers

Scienceadaily.com - Physicists at the Institute for Quantum Information and Matter at Caltech have discovered the first three-dimensional quantum liquid crystal—a new state of matter that may have applications in ultrafast quantum computers of the future.

"We have detected the existence of a fundamentally new state of matter that can be regarded as a quantum analog of a liquid crystal," says Caltech assistant professor of physics David Hsieh, principal investigator on a new study describing the findings in the April 21 issue of Science. "There are numerous classes of such quantum liquid crystals that can, in principle, exist; therefore, our finding is likely the tip of an iceberg."

Liquid crystals fall somewhere in between a liquid and a solid: they are made up of molecules that flow around freely as if they were a liquid but are all oriented in the same direction, as in a solid. Liquid crystals can be found in nature, such as in biological cell membranes. Alternatively, they can be made artificially—such as those found in the liquid crystal displays commonly used in watches, smartphones, televisions, and other items that have display screens.

These images show light patterns generated by a rhenium-based crystal using a laser method called optical second-harmonic rotational anisotropy. At left, the pattern comes from the atomic lattice of the crystal. At right, the crystal has become a 3-D quantum liquid crystal, showing a drastic departure from the pattern due to the atomic lattice alone.

In a "quantum" liquid crystal, electrons behave like the molecules in classical liquid crystals. That is, the electrons move around freely yet have a preferred direction of flow. The first-ever quantum liquid crystal was discovered in 1999 by Caltech's Jim Eisenstein, the Frank J. Roshek Professor of Physics and Applied Physics. Eisenstein's quantum liquid crystal was two-dimensional, meaning that it was confined to a single plane inside the host material—an artificially grown gallium-arsenide-based metal. Such 2-D quantum liquid crystals have since been found in several more materials including high-temperature superconductors. These are materials that conduct electricity with zero resistance at around –150 degrees Celsius, which is warmer than operating temperatures for traditional superconductors.

John Harter, a postdoctoral scholar in the Hsieh lab and lead author of the new study, explains how 2-D quantum liquid crystals behave in strange ways. "Electrons living in this flatland collectively decide to flow preferentially along the x-axis rather than the y-axis even though there's nothing to distinguish one direction from the other," he says.

Now Harter, Hsieh, and their colleagues at Oak Ridge National Laboratory and the University of Tennessee have discovered the first 3-D quantum liquid crystal. Compared to a 2-D quantum liquid crystal, the 3-D version is even more bizarre. Here, the electrons not only make a distinction between the x-, y-, and z-axes, but they also have different magnetic properties depending on whether they flow forward or backward on a given axis.

"Running an electrical current through these materials transforms them from nonmagnets into magnets, which is highly unusual," says Hsieh. "What's more, in every direction that you can flow current, the magnetic strength and magnetic orientation changes. Physicists say that the electrons 'break the symmetry' of the lattice."

Harter hit upon the discovery serendipitously. He was originally interested in studying the atomic structure of a metal compound based on the element rhenium. In particular, he was trying to characterize the structure of the crystal's atomic lattice using a technique called optical second-harmonic rotational anisotropy. In these experiments, laser light is fired at a material, and light with twice the frequency is reflected back out. The pattern of emitted light contains information about the symmetry of the crystal. The patterns measured from the rhenium-based metal were very strange—and could not be explained by the known atomic structure of the compound.  
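
For readers curious about how such a pattern encodes symmetry, the measured signal is usually modelled with a standard nonlinear-optics expression (a generic textbook form, not an equation quoted from the study). The intensity of the frequency-doubled light, recorded while the light polarization is rotated by an angle \varphi, is

I^{2\omega}(\varphi) \;\propto\; \Big| \sum_{i,j,k} \hat{e}^{\,2\omega}_{i}(\varphi)\, \chi^{(2)}_{ijk}\, \hat{e}^{\,\omega}_{j}(\varphi)\, \hat{e}^{\,\omega}_{k}(\varphi) \Big|^{2}

where \chi^{(2)} is the second-order susceptibility tensor, whose nonzero elements are dictated by the crystal's symmetry. A pattern that cannot be fitted with the tensor allowed by the known atomic lattice is the tell-tale sign that the electrons have broken additional symmetries, which is what the team observed.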

"At first, we didn't know what was going on," Harter says. The researchers then learned about the concept of 3-D quantum liquid crystals, developed by Liang Fu, a physics professor at MIT. "It explained the patterns perfectly. Everything suddenly made sense," Harter says.

The researchers say that 3-D quantum liquid crystals could play a role in a field called spintronics, in which the direction that electrons spin may be exploited to create more efficient computer chips. The discovery could also help with some of the challenges of building a quantum computer, which seeks to take advantage of the quantum nature of particles to make even faster calculations, such as those needed to decrypt codes. One of the difficulties in building such a computer is that quantum properties are extremely fragile and can easily be destroyed through interactions with their surrounding environment. A technique called topological quantum computing—developed by Caltech's Alexei Kitaev, the Ronald and Maxine Linde Professor of Theoretical Physics and Mathematics—can solve this problem with the help of a special kind of superconductor dubbed a topological superconductor.

"In the same way that 2-D quantum liquid crystals have been proposed to be a precursor to high-temperature superconductors, 3-D quantum liquid crystals could be the precursors to the topological superconductors we've been looking for," says Hsieh.

"Rather than rely on serendipity to find topological superconductors, we may now have a route to rationally creating them using 3-D quantum liquid crystals" says Harter. "That is next on our agenda."

Source : California Institute of Technology

Naked Mole-Rats Turn Into Plants When Oxygen is Low

Scienceadaily.com - Deprived of oxygen, naked mole-rats can survive by metabolizing fructose just as plants do, researchers report this week in the journal Science.

Understanding how the animals do this could lead to treatments for patients suffering crises of oxygen deprivation, as in heart attacks and strokes.

“This is just the latest remarkable discovery about the naked mole-rat — a cold-blooded mammal that lives decades longer than other rodents, rarely gets cancer, and doesn’t feel many types of pain,” says Thomas Park, professor of biological sciences at the University of Illinois at Chicago, who led an international team of researchers from UIC, the Max Delbrück Institute in Berlin and the University of Pretoria in South Africa on the study.


In humans, laboratory mice, and all other known mammals, when brain cells are starved of oxygen they run out of energy and begin to die.

But naked mole-rats have a backup: their brain cells start burning fructose, which produces energy anaerobically through a metabolic pathway that is only used by plants – or so scientists thought.

In the new study, the researchers exposed naked mole-rats to low-oxygen conditions in the laboratory and found that the animals released large amounts of fructose into the bloodstream. The fructose, the scientists found, was transported into brain cells by molecular fructose pumps that in all other mammals are found only on cells of the intestine.

“The naked mole-rat has simply rearranged some basic building-blocks of metabolism to make it super-tolerant to low oxygen conditions,” said Park, who has studied the strange species for 18 years.

At oxygen levels low enough to kill a human within minutes, naked mole-rats can survive for at least five hours, Park said. They go into a state of suspended animation, reducing their movement and dramatically slowing their pulse and breathing rate to conserve energy. And they begin using fructose until oxygen is available again.

The naked mole-rat is the only known mammal to use suspended animation to survive oxygen deprivation.

The scientists also showed that naked mole-rats are protected from another deadly aspect of low oxygen – a buildup of fluid in the lungs called pulmonary edema that afflicts mountain climbers at high altitude.

The scientists think that the naked mole-rats’ unusual metabolism is an adaptation for living in their oxygen-poor burrows. Unlike other subterranean mammals, naked mole-rats live in hyper-crowded conditions, packed in with hundreds of colony mates. With so many animals living together in unventilated tunnels, oxygen supplies are quickly depleted.

Source : University of Illinois at Chicago

How Venus Flytrap Triggers Digestion

Scienceadaily.com - The Venus flytrap digests its prey using enzymes produced by special glands. For the first time, a research team has measured and meticulously analysed the glands' activity.

The Venus flytrap (Dionaea muscipula) is a carnivorous plant. It catches its prey, mainly insects, with a trapping structure formed by its leaves; the plant's glands then secrete enzymes to decompose the prey and take up the released nutrients.

Although postulated ever since Darwin's pioneering studies, these secretory events had not been measured and analysed until now. An international team of researchers headed by Rainer Hedrich, a biophysicist from Julius-Maximilians-Universität (JMU) Würzburg in Bavaria, Germany, now presents the results in the journal PNAS.

The Venus flytrap: The traps' insides are lined with red glands (a) that work like a plant "stomach" after prey is caught. The glands secrete digestive enzymes. This secretory mechanism was shown at the vesicle level in plants for the first time (b). The model illustration (c) shows that activated glands absorb calcium (Ca2+), thereby triggering the jasmonate signalling pathway and the secretion of hydrochloric acid (HCl) and digestive enzymes. (Picture: Sönke Scherzer/Dirk Becker)

When prey tries to escape the closed trap, it will inevitably touch the sensory hairs inside. Any mechanical contact with the hairs triggers an electrical signal that spreads across the trap in waves. Starting with the third signal, the plant produces the hormone jasmonate; after the fifth signal, the digestive glands that line the inside of the traps like turf are activated.
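
The counting behaviour described above is simple enough to caricature in code. The short sketch below is purely illustrative: the thresholds (third and fifth signal) come from the article, while the function and its names are invented for the example.

# Toy model of the Venus flytrap's signal counting, as described in the text:
# jasmonate production starts around the third electrical signal, and the
# digestive glands switch on after the fifth. Purely illustrative.

def flytrap_state(signal_count: int) -> list:
    """Return which responses have been triggered after a number of signals."""
    responses = []
    if signal_count >= 3:
        responses.append("jasmonate production")
    if signal_count >= 5:
        responses.append("digestive glands activated")
    return responses or ["no response yet"]

for signals in range(1, 7):
    print(f"{signals} signal(s): {', '.join(flytrap_state(signals))}")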

Glands secrete acidic vesicles to decompose prey

What happens next in the gland cells? They increasingly produce membranous bubbles filled with liquid (secretory vesicles) and give off their content. This happens after mechanical stimulation of the sensory hairs but also when the glands come into contact with the hormone jasmonate. The entire process depends on calcium and is controlled by a number of specific proteins.

Moreover, genes are activated in the glands: "We assume that they provide for the vesicles being loaded with protons and chloride, that is, hydrochloric acid," Hedrich explains, adding: "We used ion-sensitive electrodes to measure that repeated touching of the sensory hairs triggers the influx of calcium ions into the gland. The rising calcium level in the cytoplasm causes the vesicles to fuse with the plasma membrane, similarly to the neurotransmitter secretion of neurons. The influx of calcium is followed by the efflux of protons and chloride after a time delay."

Conclusive analysis with carbon fibre electrodes

What else do the gland vesicles contain? This was analysed using carbon fibre electrodes in cooperation with Erwin Neher (Göttingen), winner of the Nobel Prize, who has a lot of experience with this technique. Together with Neher, the JMU researcher Sönke Scherzer adjusted the measurement method to the conditions prevailing inside the Venus flytrap.

The team positioned a carbon fibre electrode over the gland surface and waited excitedly to see what would happen. "At first, we were disappointed because we did not immediately detect signals as known from secretory cells in humans and animals," Scherzer recalls.

Could it be that, in the first hours after catching prey, the vesicles contain hydrochloric acid but no digestive enzymes yet? And none of the molecules that keep the enzymes working in the acidic environment? Does the plant have to produce all this first?

That's exactly how it works: Molecular biologist Ines Fuchs found out that the plant only starts to produce the enzymes that decompose the prey after several hours. The first characteristic signals occurred after six hours and the process was in full swing 24 hours later. During this phase, the trap is completely acidic and rich in digestive enzymes.

Stabilising effect of glutathione keeps enzymes fit

Professor Heinz Rennenberg (Freiburg) also found glutathione (GSH) in the secreted fluid. This molecule keeps the enzymes functional in the acidic environment of the Venus flytrap.

The same processes described above take place in the same chronological order both when the sensory hairs are stimulated and when the trap is exposed to the hormone jasmonate alone. "A touch will very quickly trigger the jasmonate signalling pathway, but it takes time until the vesicles are produced and loaded with the proper freight, which is facilitated by the hormone," Hedrich explains.

Calcium is a mandatory ingredient

How the Venus flytrap floods its "green stomach" with the right mixture and breaks down the prey into its nutrients can be visualised using magnetic resonance imaging. Eberhard Munz from the MRT centre of the JMU's Department of Physics was responsible for this task.

His experiments also showed that when the influx of calcium into the glands is blocked, the trap remains dry. "The calcium activation of the gland cells is therefore crucial," Hedrich says. "So we will now take a closer look at the biology of the calcium channels of the Venus flytrap. We also want to investigate the mechanism which counts the signals transmitted by the sensory hairs in the gland and translates them into jasmonate-dependent biology."


Penn Researchers Show Brain Stimulation Restores Memory During Lapses

Scienceadaily.com - A team of neuroscientists at the University of Pennsylvania has shown for the first time that electrical stimulation delivered when memory is predicted to fail can improve memory function in the human brain. That same stimulation generally becomes disruptive when electrical pulses arrive during periods of effective memory function.

The research team included Michael Kahana, professor of psychology and principal investigator of the Defense Advanced Research Projects Agency's Restoring Active Memory program; Youssef Ezzyat, a senior data scientist in Kahana's lab; and Daniel Rizzuto, director of cognitive neuromodulation at Penn. They published their findings in the journal Current Biology.

This work is an important step toward the long-term goal of Restoring Active Memory, a four-year Department of Defense project aimed at developing next-generation technologies that improve memory function in people who suffer from memory loss. It illustrates an important link between appropriately timed deep-brain stimulation and its potential therapeutic benefits.

Blue dots indicate overall electrode placement; the yellow dot (top-right corner) indicates the electrode used to stimulate the subject’s brain to increase memory performance.

To get to this point, the Penn team first had to understand and decode signaling patterns that correspond to highs and lows of memory function.

"By applying machine-learning methods to electrical signals measured at widespread locations throughout the human brain," said Ezzyat, lead paper author, "we are able to identify neural activity that indicates when a given patient will have lapses of memory encoding."

Using this model, Kahana's team examined how the effects of stimulation differ during poor versus effective memory function. The study involved neurosurgical patients receiving treatment for epilepsy at the Hospital of the University of Pennsylvania, the Thomas Jefferson University Hospital, the Dartmouth-Hitchcock Medical Center, the Emory University Hospital, the University of Texas Southwestern, the Mayo Clinic, Columbia University, the National Institutes of Health Clinical Center and the University of Washington. Participants were asked to study and recall lists of common words while receiving safe levels of brain stimulation.

During this process, the Penn team recorded electrical activity from electrodes implanted in the patients' brains as part of routine clinical care. These recordings identified the biomarkers of successful memory function, activity patterns that occur when the brain effectively creates new memories.

"We found that, when electrical stimulation arrives during periods of effective memory, memory worsens," Kahana said. "But when the electrical stimulation arrives at times of poor function, memory is significantly improved."

Kahana likens it to traffic patterns in the brain: stimulating the brain during a backup restores the normal flow of traffic.

Gaining insight into this process could improve the lives of many types of patients, particularly those with traumatic brain injury or neurological diseases such as Alzheimer's. "Technology based on this type of stimulation," Rizzuto said, "could produce meaningful gains in memory performance, but more work is needed to move from proof-of-concept to an actual therapeutic platform."

This past November, the RAM team publicly released an extensive intracranial brain recording and stimulation dataset that included more than 1,000 hours of data from 150 patients performing memory tasks.

Source: University of Pennsylvania

Physicists Create Time Crystals

Scienceadaily.com - Harvard physicists have created a new form of matter -- dubbed a time crystal -- which could offer important insights into the mysterious behavior of quantum systems.

Traditionally speaking, crystals -- like salt, sugar or even diamonds -- are simply periodic arrangements of atoms in a three-dimensional lattice.

Time crystals, on the other hand, take that notion of periodically arranged atoms and add a fourth dimension, suggesting that -- under certain conditions -- the atoms in some materials can exhibit periodic structure across time.


Led by Professors of Physics Mikhail Lukin and Eugene Demler, a team consisting of post-doctoral fellows Renate Landig and Georg Kucsko, Junior Fellow Vedika Khemani, and Physics Department graduate students Soonwon Choi, Joonhee Choi and Hengyun Zhou built a quantum system using a small piece of diamond embedded with millions of atomic-scale impurities known as nitrogen-vacancy (NV) centers. They then used microwave pulses to "kick" the system out of equilibrium, causing the NV center's spins to flip at precisely-timed intervals -- one of the key markers of a time crystal. The work is described in a paper published in Nature in March.

Other co-authors of the study are Junichi Isoya, Shinobu Onoda, and Hitoshi Sumiya from the University of Tsukuba, the Takasaki Advanced Research Institute and Sumitomo; Fedor Jelezko from the University of Ulm; Curt von Keyserlingk from Princeton University; and Norman Y. Yao from UC Berkeley.

But the creation of a time crystal is significant not merely because it proves that such previously theoretical materials can exist, Lukin said; it also offers physicists a tantalizing window into the behavior of out-of-equilibrium systems.

"There is now broad, ongoing work to understand the physics of non-equilibrium quantum systems," Lukin said. "This is an area that is of interest for many quantum technologies, because a quantum computer is basically a quantum system that's far away from equilibrium. It's very much at the frontier of research...and we are really just scratching the surface."

But while understanding such non-equilibrium systems could help lead researchers down the path to quantum computing, the technology behind time crystals may also have more near-term applications.

"One specific area where we think this might be useful, and this was one of our original motivations for this work, is in precision measurement," Lukin said. "It turns out, if you are trying to build...for example, a magnetic field sensor, you can use NV-center spins," he said. "So it's possible these non-equilibrium states of matter which we create may turn out to be useful."

The notion that such systems could be built at all, however, initially seemed unlikely. In fact, several researchers -- Patrick Bruno, Haruki Watanabe, and Masaki Oshikawa -- went so far as to prove that it would be impossible to create a time crystal in a quantum system that was at equilibrium.

"Most things around us are in equilibrium," Lukin explained. "That means if you have a hot body and a cold body, if you bring them together, their temperature will equalize. But not all systems are like that."

One of the most common examples of a material that is out of equilibrium, he said, is something many people wear on a daily basis -- diamond.

A crystallized form of carbon that forms under intense heat and pressure, diamond is unusual because it is meta-stable, meaning that once it adopts that crystal formation, it will stay that way, even after the heat and pressure are removed.

It is only very recently, Lukin said, that researchers began to realize that non-equilibrium systems -- particularly those known as "driven" systems, which researchers can "kick" with periodic energy pulses -- can exhibit the characteristics of a time crystal.

One of those characteristics, he said, is that the crystal's response across time will remain robust with respect to perturbations.

"A solid crystal is rigid...so if you push on it, maybe the distance between atoms changes a little, but the crystal itself survives," he said. "The idea of a time crystal is to have that type of order in a time domain, but it must be robust."

"One other important ingredient is that, typically, if you keep pushing a system away from equilibrium it starts heating up, but it turns out there is a class of systems which are resistant to this heating," Lukin added. "It turns out the time crystal effect is strongly related to this idea that a system is excited but is not absorbing energy."

To build such a system, Lukin and colleagues began with a small piece of diamond which was embedded with so many NV centers it appeared black.

"We subject that diamond to microwave pulses, which change the orientation of the spins of the NV centers," Lukin explained. "That basically takes all the spins that are pointed up and turns them down, and a next pulse turns them back up."

To test the robustness of the system, Lukin and colleagues varied the timing of the pulses to see whether the material would continue to respond like a time crystal.

"If you don't orient all the spins fully up or down each time, then very rapidly, you will end up with a completely random system ," Lukin said. "But the interactions between the NV centers stabilize the response: they force the system to respond in a periodic, time crystalline manner."

Such systems could ultimately be critical in the development of useful quantum computers and quantum sensors, Lukin said, because they demonstrate that two critical components -- long quantum memory times and a very high density of quantum bits -- aren't mutually exclusive.

"For many applications you want both of those," Lukin said. "But these two requirements are usually contradictory....this is a well-known problem. The present work shows that we can achieve the desired combination. There is still a lot of work to be done, but we believe these effects might enable us to create a new generation of quantum sensors, and could possibly in the long run have other applications to things like atomic clocks."

Source : Phys

Is Soda Bad for Your Brain? (And Is Diet Soda Worse?)

Scienceadaily.com - Americans love sugar. Together we consumed nearly 11 million metric tons of it in 2016, according to the US Department of Agriculture, much of it in the form of sugar-sweetened beverages like sports drinks and soda.

Now, new research suggests that excess sugar—especially the fructose in sugary drinks—might damage your brain. Researchers using data from the Framingham Heart Study (FHS) found that people who drink sugary beverages frequently are more likely to have poorer memory, smaller overall brain volume, and a significantly smaller hippocampus—an area of the brain important for learning and memory. The FHS is the nation’s longest running epidemiological study, begun in 1948, supported by the National Heart, Lung, and Blood Institute, and run by BU since 1971.

But before you chuck your sweet tea and reach for a diet soda, there’s more: a follow-up study found that people who drank diet soda daily were almost three times as likely to develop stroke and dementia when compared to those who did not.


Researchers are quick to point out that these findings, which appear separately in the journals Alzheimer’s & Dementia and Stroke, demonstrate correlation but not cause and effect. While researchers caution against overconsuming either diet soda or sugary drinks, more research is needed to determine how—or if—these drinks actually damage the brain, and how much damage may be caused by underlying vascular disease or diabetes.

“These studies are not the be-all and end-all, but it’s strong data and a very strong suggestion,” says Sudha Seshadri, a School of Medicine professor of neurology and a faculty member at BU’s Alzheimer’s Disease Center, senior author on both papers. “It looks like there is not very much of an upside to having sugary drinks, and substituting the sugar with artificial sweeteners doesn’t seem to help.”

“Maybe good old-fashioned water is something we need to get used to,” she adds.

Matthew Pase, a fellow in the MED neurology department and an FHS investigator, who is lead author on both papers, says that excess sugar has long been associated with cardiovascular and metabolic diseases like obesity, heart disease, and type 2 diabetes, but little is known about its long-term effects on the human brain. He chose to study sugary drinks as a way of examining overall sugar consumption. “It’s difficult to measure overall sugar intake in the diet,” he says, “so we used sugary beverages as a proxy.”

For the first study, published in Alzheimer’s & Dementia on March 5, 2017, researchers examined data, including magnetic resonance imaging (MRI) scans and cognitive testing results, from about 4,000 people enrolled in the FHS Offspring and Third-Generation cohorts. (These are the children and grandchildren of the original volunteers enrolled in 1948.) The researchers looked at people who consumed more than two sugary drinks a day of any type—soda, fruit juice, and other soft drinks—or more than three per week of soda alone. Among that high-intake group, they found multiple signs of accelerated brain aging, including smaller overall brain volume, poorer episodic memory, and a shrunken hippocampus, all risk factors for early-stage Alzheimer’s disease. Researchers also found that higher intake of diet soda—at least one per day—was associated with smaller brain volume.

In the second study, published in Stroke on April 20, 2017, the researchers, using data only from the older Offspring cohort, looked specifically at whether participants had suffered a stroke or been diagnosed with dementia because of Alzheimer’s disease. After measuring volunteers’ beverage intake at three points over 7 years, the researchers then monitored the volunteers for 10 years, looking for evidence of stroke in 2,888 people over age 45, and dementia in 1,484 participants over age 60. They found, surprisingly, no correlation between sugary beverage intake and stroke or dementia. However, they found that people who drank at least one diet soda per day were almost three times as likely to develop stroke and dementia.

Although the researchers took age, smoking, diet quality, and other factors into account, they could not completely control for preexisting conditions like diabetes, which may have developed over the course of the study and is a known risk factor for dementia. Diabetics, as a group, drink more diet soda on average, as a way to limit their sugar consumption, and some of the correlation between diet soda intake and dementia may be due to diabetes, as well as other vascular risk factors. However, such preexisting conditions cannot wholly explain the new findings.

“It was somewhat surprising that diet soda consumption led to these outcomes,” says Pase, noting that while prior studies have linked diet soda intake to stroke risk, the link with dementia was not previously known. He adds that the studies did not differentiate between types of artificial sweeteners and did not account for other possible sources of artificial sweeteners. He says that scientists have put forth various hypotheses about how artificial sweeteners may cause harm, from transforming gut bacteria to altering the brain’s perception of sweet, but “we need more work to figure out the underlying mechanisms.”

Source : Boston University