By Alton Parrish.
Credit: © S. Keay
A French-Italian team led by Jean-Philippe Goiran, a CNRS researcher, has sought to definitively verify the hypothesized location of the harbour using a new geological corer. This technology overcomes the groundwater problem that makes the area difficult for archaeologists to excavate below a depth of 2 m.
Two sediment cores were extracted, showing a complete 12 m stratigraphy and the evolution of the harbour zone in three stages:
2. A middle layer, rich in grey silty-clay sediments, shows a typical harbour facies. According to calculations, the basin was 6.5 m deep at the beginning of its operation (dated between the 4th and 2nd centuries BC). Previously regarded as a river harbour that could accommodate only shallow-draught boats, Ostia in fact enjoyed a deep basin capable of receiving deep-draught seagoing ships.
3. Finally, the most recent stratum, composed of massive alluvial accumulations, records the abandonment of the basin during the Roman imperial period. Radiocarbon dates indicate that a succession of major flood episodes of the Tiber finally sealed the harbour of Ostia between the 2nd century BC and the first quarter of the 1st century AD, despite possible phases of dredging. By then the basin was less than 1 m deep, making navigation impossible. It was abandoned in favor of a new harbour complex built 3 km north of the Tiber mouth, called Portus. This alluvial layer fits the account of the geographer Strabo (58 BC – AD 21/25), who described the sealing of the harbour basin by sediments of the Tiber at that time (Geographica, 231-232).
The discovery of the river-mouth harbour of Ostia, north of the city and west of the Imperial Palace, will help clarify the links between Ostia, its harbour, and the ex nihilo founding of Portus, begun in 42 AD and completed in 64 AD under the reign of Nero. This gigantic 200-hectare complex became the harbour of Rome and the largest the Romans ever built in the Mediterranean.
Researchers estimate that nearly 25 years passed between the abandonment of the port of Ostia and the construction of Portus. Rome was the capital of the ancient Roman world and the first city to reach one million inhabitants. So how was it supplied with wheat during that period? That question now confronts researchers.
This work was also carried out in collaboration with the Maison Méditerranéenne des Sciences de l’Homme (CNRS / Aix-Marseille Université), the Universita Roma 3, and the Institut Universitaire de France, and received the support of the ANR (Agence Nationale de la Recherche).
Citation: J.-Ph. Goiran, F. Salomon, E. Pleuger, C. Vittori, I. Mazzini, G. Boetto, P. Arnaud, A. Pellegrino, December 2012, “Résultats préliminaires de la première campagne de carottages dans le port antique d’Ostie”, Chroniques des Mélanges de l’École Française de Rome, vol. 123-2.
By Audrey Neuman.
These partnerships, which include the U.S. Army, the U.S. Department of Energy and the U.S. Navy, further validate the potential impact of these advanced materials. The disruptive materials-science platforms developed by Powdermet and its partners have applications in industries such as energy, defense and transportation, among others.
“Powdermet has been at the forefront of technical innovation for the past decade, and our expertise in nanotechnology and advanced materials development has attracted top-tier partners,” said Robert Miller, CEO of Abakan Inc. “The company is now focused on commercializing several of its technology platforms following successful research and development that has shown very impressive results for these platforms. We expect to serve multi-billion-dollar markets across a variety of industries.”
Powdermet’s EnComp™ energetic nanocomposite materials improve both the energy density and power density of batteries. Powdermet has partnered with the U.S. Army to develop applications requiring instant-on storage of a large amount of electrical energy at high voltages for long periods of time without significant current leakage. Powdermet has utilized its nanoparticle synthesis and controlled dispersion processing capabilities to demonstrate melt-processable nanoparticle-filled engineered composite film dielectrics showing 20 to 30 J/cc of energy density, highlighting the opportunity for the technology to vastly improve energy storage technologies for the transportation and renewable energy sectors.
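For readers more used to battery units, the quoted 20 to 30 J/cc figure can be translated into watt-hours per liter. The short sketch below is illustrative arithmetic only, not from Powdermet; it just applies the standard conversions 1 Wh = 3600 J and 1 L = 1000 cc.

```python
# Convert the quoted dielectric energy densities from J/cc to Wh/L,
# the unit more commonly used when comparing energy storage devices.
wh_per_l = {}
for j_per_cc in (20, 30):
    wh_per_l[j_per_cc] = j_per_cc * 1000 / 3600  # J/cc -> J/L -> Wh/L
    print(f"{j_per_cc} J/cc ≈ {wh_per_l[j_per_cc]:.1f} Wh/L")
```

That works out to roughly 5.6 to 8.3 Wh/L, a high figure for a capacitor-style dielectric, which stores and releases energy far faster than a chemical battery.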
In addition to these initial defense applications, nanodielectric materials for smart-grid applications are expected to represent a $500 million market by 2017, according to NanoMarkets. The nanodielectric materials are expected to have several other applications, replacing and enhancing batteries in the electric vehicle, consumer electronics and other industries, adding up to a more than $1 billion market for nanodielectrics in the next five to seven years.
Powdermet’s long-term collaboration with Michigan State University has also received investment from the Department of Energy to further develop the critical nanocomposite separators needed for ultra-high-power-density multivalent batteries based on ionic liquids. Multivalent batteries enable an order-of-magnitude increase in the power density of batteries, enabling the handling of high power loads in smaller, lighter, lower-cost devices. This technology complements Powdermet’s ongoing efforts to develop encapsulated nanocomposite anode and cathode materials addressing needs within the $100 billion global advanced-battery manufacturing value chain.
Finally, Powdermet’s MComP™ microcomposite cermet (ceramic-metal) materials provide friction and wear performance equivalent to advanced diamond-like carbon coatings, but with the toughness, strength and formability of metals. Powdermet has partnered with the U.S. Navy to solve contamination issues in spherical plain airframe bearings using advanced coatings that have already been commercialized through Abakan subsidiary MesoCoat. These nanocomposite cermet materials have applications across the transportation, energy, military, construction and other sectors for reducing friction and for extending the life of, or eliminating the need for, lubricants in highly stressed systems.
Abakan develops, manufactures, and markets advanced nanocomposite materials, innovative fabricated metal products and highly engineered metal composites for applications in the oil and gas, petrochemical, mining, aerospace and defense, energy, infrastructure and processing industries.
By Michael Bernstein.
Carolyn Slupsky and colleagues explain that past research showed a link between formula-feeding and a higher risk for chronic diseases later in life. Gaps exist, however, in the scientific understanding of the basis for that link.
To fill those gaps, the scientists turned to rhesus monkeys, which serve as stand-ins for human infants in such research, comparing animals that were formula-fed with those that were breast-fed.
Their analysis of the monkeys’ urine, blood and stool samples identified key differences between formula-fed and breast-fed individuals. It also produced hints that reducing the protein content of infant formula might be beneficial in reducing the metabolic stress in formula-fed infants. “Our findings support the contention that infant feeding practice profoundly influences metabolism in developing infants and may be the link between early feeding and the development of metabolic disease later in life,” the study states.
By Alton Parrish.
Pitoti is a multimedia digital rock art exhibition in the South Lecture Room of the Museum of Archaeology and Anthropology (MAA).
It brings some of the earliest human figures in European rock art to life with interactive graphics, 3D printing and video games, exploring the potential links between the world of archaeology and the worlds of film, digital humanities and computer vision.
The images on display are just a fraction of the 300,000 rock engravings that act as an archive of nearly 10,000 years of human history at the UNESCO World Heritage Site in Valcamonica, detailing how a small clan of hunter-gatherers eventually became part of the Roman Empire.
The engravings were made from as early as 7000 BC right up until the 16th century AD, with the richest activity taking place in the Iron Age (1100–200 BC, before the Roman Empire).
Now, a host of artists have injected digital life into these rock-art figures, with video projections, an ambient cinema and an interactive touch screen table where multiple visitors can explore and play with a digitised rock face.
A large projection of the Valcamonica is hooked up to a video-game style joystick which can be used to navigate the valley and discover the carvings, and detailed scans have allowed the creation of 3D printed panels which visitors can touch.
Dr Frederick Baker, Senior Research Associate at the McDonald Institute of Archaeological Research, said: “Human beings have engraved the rocks with their everyday lives and stories. It’s a kind of visual autobiography. Through the help of digitisation we discover this tribe and get insight into the symbols which actually adorn the rock, to get these symbols moving.
“These images created by humans depict animals to tell of hunting and aspects of everyday life. Computer visual science, film, 3D and graphic design explore ancient prehistoric art forms.
“Visitors can interact and discover with their fingers in a tactile manner that was impossible until today. Now in the 21st century digital art allows the rock art to move from the valley onto the screens and into the digital world that surrounds us.”
The exhibition grew from years of research by Dr Christopher Chippindale and Dr Frederick Baker, both members of the Cambridge University Prehistoric Picture Project. Pitoti is a word from the Lombard dialect describing these engraved figures as “little puppets”.
Dr Chippindale said: “What European rock art gives us is the world of prehistoric Europeans, as they themselves experienced it and understood it. Our prehistoric ancestors chose to make engravings of animals, but few of plants. Many of deer, but few of sheep, and vast numbers of armed warriors in opposed pairs. Why? Because those aspects of their lives were vital and central to them.”
The images depicted also tell the story of how innovations spread across continents.
“Horses appear, as do ploughs and carts,” said Professor Graham Baker from the McDonald Institute. “Their wheels are a huge innovation which leads later to chariots. The engravings of musical instruments remind us that the arts were also part of prehistoric life.”
According to Dr Baker, the link between the rock art and the digital animations is stronger than might be imagined.
He added: “Some of the humans and animals in the art are made in rather rigid forms. Others look to be in lively, animated motion – frozen at a certain moment as if they were stills from an animated cartoon. What the figures cannot do is move: there were no film cameras or animation studios in prehistoric times. But with our cameras and studios, today we can take the metaphor literally. So much the ancient artists could not do – working only with hammer and stone against tough resistant rock – our new digital technologies can.”
By Alton Parrish.
At the suburban Chicago laboratory, a group of scientists has seemingly defied the laws of physics and found a way to make a material expand, rather than contract, under pressure.
“It’s like squeezing a stone and forming a giant sponge,” said Karena Chapman, a chemist at the U.S. Department of Energy laboratory. “Materials are supposed to become denser and more compact under pressure. We are seeing the exact opposite. The pressure-treated material has half the density of the original state. This is counterintuitive to the laws of physics.”
Because this behavior seems impossible, Chapman and her colleagues spent several years testing and retesting the material until they believed the unbelievable and understood how the impossible could be possible. For every experiment, they got the same mind-bending results.
“The bonds in the material completely rearrange,” Chapman said. “This just blows my mind.”
This discovery will do more than rewrite the science text books; it could double the variety of porous framework materials available for manufacturing, health care and environmental sustainability.
Scientists use these framework materials, which have sponge-like holes in their structure, to trap, store and filter materials. The shape of the sponge-like holes makes them selective for specific molecules, allowing their use as water filters, chemical sensors and compressible storage for carbon dioxide sequestration or hydrogen fuel cells. By tailoring release rates, scientists can adapt these frameworks to deliver drugs and initiate chemical reactions for the production of everything from plastics to foods.
The pressure-induced transitions are associated with nearly twofold volume expansions. While an increase in volume with pressure is counterintuitive, the resulting new phases contain large fluid-filled pores, such that the combined solid-plus-fluid volume is reduced and the inefficiencies in space filling by the interpenetrated parent phase are eliminated.
“This could not only open up new materials to being porous, but it could also give us access to new structures for selectability and new release rates,” said Peter Chupas, an Argonne chemist who helped discover the new materials.
The team published the details of their work in the May 22 issue of the Journal of the American Chemical Society in an article titled “Exploiting High Pressures to Generate Porosity, Polymorphism, and Lattice Expansion in the Nonporous Molecular Framework Zn(CN)2.”
The scientists put zinc cyanide, a material used in electroplating, in a diamond-anvil cell at the Advanced Photon Source (APS) at Argonne and applied high pressures of 0.9 to 1.8 gigapascals, or about 9,000 to 18,000 times the pressure of the atmosphere at sea level. This high pressure is within the range affordably reproducible by industry for bulk storage systems. By using different fluids around the material as it was squeezed, the scientists were able to create five new phases of material, two of which retained their new porous ability at normal pressure.
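The quoted atmospheric equivalents are easy to verify. A quick back-of-the-envelope conversion, using the standard atmosphere of 101,325 pascals, reproduces the article's "9,000 to 18,000 times" figure.

```python
# Express the experiment's 0.9-1.8 GPa pressure range as multiples of
# one standard atmosphere (101,325 Pa).
ATM_PA = 101_325        # one standard atmosphere, in pascals
GPA_PA = 1_000_000_000  # one gigapascal, in pascals

results = {}
for gpa in (0.9, 1.8):
    results[gpa] = gpa * GPA_PA / ATM_PA
    print(f"{gpa} GPa ≈ {results[gpa]:,.0f} atm")
```

The exact values come out near 8,900 and 17,800 atmospheres, which the article rounds to 9,000 and 18,000.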
“By applying pressure, we were able to transform a normally dense, nonporous material into a range of new porous materials that can hold twice as much stuff,” Chapman said. “This counterintuitive discovery will likely double the amount of available porous framework materials, which will greatly expand their use in pharmaceutical delivery, sequestration, material separation and catalysis.”
The scientists will continue to test the new technique on other materials.
The research is funded by the U.S. Department of Energy’s Office of Science.
By Duke University.
Duke sociologist Kieran Healy of the Kenan Institute for Ethics has written a brief primer on how one might use the sort of metadata available to National Security Agency snoops to uncover a burgeoning “terrorist” cell — in 1772 Boston.
Written in a sort of pseudo-18th-century grammar (with mostly modern spellings, thankfully), Healy walks through the steps one would take to turn the membership lists of various organizations into a matrix of connections showing who the key agitators might be.
Using this “Social Networke Analysis,” but knowing nothing else about these names, he quickly fingers Sam Adams and Paul Revere as “persons of interest.”
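The co-membership analysis Healy describes can be sketched in a few lines of code. The names and organizations below are hypothetical stand-ins purely for illustration (his actual post analyzes 254 real names across seven Boston groups); the idea is simply that counting shared memberships for every pair of people surfaces the best-connected individuals.

```python
# Each person is mapped to the set of organizations they belong to
# (illustrative data, not Healy's actual membership lists).
people = ["Adams", "Revere", "Warren", "Church"]
membership = {
    "Adams":  {"LoyalNine", "NorthCaucus"},
    "Revere": {"StAndrewsLodge", "NorthCaucus"},
    "Warren": {"StAndrewsLodge", "NorthCaucus"},
    "Church": {"NorthCaucus"},
}

# Person-by-person ties: how many groups each pair of people shares.
# This is the person x person matrix you get by multiplying the
# person x organization membership matrix by its transpose.
ties = {
    (a, b): len(membership[a] & membership[b])
    for a in people for b in people if a != b
}

# A crude "person of interest" score: total shared memberships.
scores = {p: sum(ties[(p, q)] for q in people if q != p) for p in people}
for name in sorted(scores, key=scores.get, reverse=True):
    print(name, scores[name])
```

With this toy data, the two people who co-occur in the most groups float to the top of the list, which is exactly the effect that fingers Adams and Revere in Healy's full dataset.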
The clever and instructive post has taken off on social media in the last day — reaching 100,000 views on Tuesday, and spreading even more strongly on Wednesday.
“I wanted to give non-specialists a sense of how the structural analysis of what’s being called ‘metadata’ works, and to show in a fun but hopefully telling way how much you can get out of that approach,” Healy wrote in his blog the next day. “So I tried to emphasize that I was using one of the earliest, and (in retrospect) most basic methods we have, but one that still has the capacity to surprise people unfamiliar with social network analysis.”
By Alton Parrish.
When water freezes into ice, its molecules are bound together in a crystalline lattice held together by hydrogen bonds. Hydrogen bonds are highly versatile and, as a result, crystalline ice reveals a striking diversity of at least 16 different structures.
In all of these forms of ice, the simple H2O molecule is the universal building block. However, in 1964 it was predicted that, under sufficient pressure, the hydrogen bonds could strengthen to the point where they might actually break the water molecule apart. The possibility of directly observing a dissociated water molecule in ice has proven a fascinating lure for scientists and has driven extensive research for the last 50 years. In the mid-1990s several teams, including a Carnegie group, observed the transition using spectroscopic techniques. However, these techniques are indirect and could only reveal part of the picture.
A preferred method is to “see” the hydrogen atoms, or protons, directly. This can be done by bouncing neutrons off the ice and then carefully measuring how they are scattered. However, applying this technique at high enough pressures to see the water molecule dissociate had simply not been possible in the past. Guthrie explained: “You can only reach these extreme pressures if your samples of ice are really small. But, unfortunately, this makes the hydrogen atoms very hard to see.”
The Spallation Neutron Source was opened at Oak Ridge National Laboratory in Tennessee in 2006, providing a new and intensely bright supply of neutrons. By designing a new class of tools that were optimized to exploit this unrivalled flux of neutrons, Guthrie and his team–Carnegie’s Russell Hemley, Reinhard Boehler, and Kuo Li, as well as Chris Tulk, Jamie Molaison, and António dos Santos of Oak Ridge National Laboratory–have obtained the first glimpse of the hydrogen atoms themselves in ice at unprecedented pressures of over 500,000 times atmospheric pressure.
“The neutrons tell us a story that the other techniques could not,” said Hemley, director of Carnegie’s Geophysical Laboratory. “The results indicate that dissociation of water molecules follows two different mechanisms. Some of the molecules begin to dissociate at much lower pressures and via a different path than was predicted in the classic 1964 paper.”
“Our data paint an altogether new picture of ice,” Guthrie commented. “Not only do the results have broad consequences for understanding bonding in H2O, the observations may also support a previously proposed theory that the protons in ice in planetary interiors can be mobile even while the ice remains solid.”
And this startling discovery may prove to be just the beginning. Tulk emphasized: “Being able to ‘see’ hydrogen with neutrons isn’t just important for studies of ice. This is a game-changing technical breakthrough. The applications could extend to systems that are critical to societal challenges, such as energy. For example, the technique can yield greater understanding of methane-containing clathrate hydrates and even hydrogen storage materials that could one day power automobiles.”
The group is part of Energy Frontier Research in Extreme Environments (EFree), an Energy Frontier Research Center headquartered at Carnegie’s Geophysical Laboratory.
By Alton Parrish.
Now, computer simulations by Stanford scientists reveal that sound waves in the ocean produced by the earthquake probably reached land tens of minutes before the tsunami. If correctly interpreted, they could have offered a warning that a large tsunami was on the way.
Although various systems can detect undersea earthquakes, they can’t reliably tell which will form a tsunami, or predict the size of the wave. There are ocean-based devices that can sense an oncoming tsunami, but they typically provide only a few minutes of advance warning.
Because the sound from a seismic event will reach land well before the water itself, the researchers suggest that identifying the specific acoustic signature of tsunami-generating earthquakes could lead to a faster-acting warning system for massive tsunamis.
Discovering the signal
The finding was something of a surprise. The earthquake’s epicenter had been traced to the underwater Japan Trench, a subduction zone about 40 miles east of Tohoku, the northeastern region of Japan’s larger island. Based on existing knowledge of earthquakes in this area, seismologists puzzled over why the earthquake rupture propagated from the underground fault all the way up to the seafloor, creating a massive upward thrust that resulted in the tsunami.
Direct observations of the fault were scarce, so Eric Dunham, an assistant professor of geophysics in the School of Earth Sciences, and Jeremy Kozdon, a postdoctoral researcher working with Dunham, began using the cluster of supercomputers at Stanford’s Center for Computational Earth and Environmental Science (CEES) to simulate how the tremors moved through the crust and ocean.
2011 Tōhoku earthquake and tsunami: an aerial view of damage in the Sendai region, with black smoke rising from the Nippon Oil Sendai refinery.
Run retroactively, the models accurately reproduced the seafloor uplift observed in the earthquake, which is directly related to tsunami wave heights, and also simulated the sound waves that propagated within the ocean.
In addition to providing valuable insight into the seismic events as they likely occurred during the 2011 earthquake, the researchers identified the specific fault conditions necessary for ruptures to reach the seafloor and create large tsunamis.
The model also generated acoustic data. An interesting revelation of the simulation was that tsunamigenic surface-breaking ruptures, like the 2011 earthquake, produce higher-amplitude ocean acoustic waves than ruptures that do not reach the surface.
The model showed how those sound waves would have traveled through the water and indicated that they reached shore 15 to 20 minutes before the tsunami.
“We’ve found that there’s a strong correlation between the amplitude of the sound waves and the tsunami wave heights,” Dunham said. “Sound waves propagate through water 10 times faster than the tsunami waves, so we can have knowledge of what’s happening a hundred miles offshore within minutes of an earthquake occurring. We could know whether a tsunami is coming, how large it will be and when it will arrive.”
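Dunham's speed comparison can be illustrated with a rough calculation. The sketch below assumes a 160 km (about 100 mile) offshore distance, a sound speed in seawater of roughly 1.5 km/s, and a deep-water tsunami moving at the shallow-water wave speed sqrt(g × depth); the specific distance and depth are illustrative assumptions, not values from the Stanford study.

```python
import math

distance_km = 160.0       # assumed source-to-shore distance (~100 miles)
sound_speed_kms = 1.5     # approx. speed of sound in seawater, km/s
depth_m = 4000.0          # assumed average ocean depth along the path

# Tsunamis in the open ocean travel at the shallow-water wave speed
# sqrt(g * h), about 0.2 km/s for a 4 km deep ocean.
tsunami_speed_kms = math.sqrt(9.81 * depth_m) / 1000.0

t_sound_min = distance_km / sound_speed_kms / 60.0
t_tsunami_min = distance_km / tsunami_speed_kms / 60.0
lead_min = t_tsunami_min - t_sound_min
print(f"sound: ~{t_sound_min:.0f} min, tsunami: ~{t_tsunami_min:.0f} min, "
      f"lead time: ~{lead_min:.0f} min")
```

Under these assumptions the acoustic signal arrives in about two minutes while the tsunami takes over ten, a lead time broadly consistent with the 15 to 20 minutes the simulations found for the 2011 event.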
The team’s model could apply to tsunami-forming fault zones around the world, though the characteristics of the telltale acoustic signature might vary depending on the geology of the local environment. The crustal composition and orientation of faults off the coasts of Japan, Alaska, the Pacific Northwest and Chile differ greatly.
“The ideal situation would be to analyze lots of measurements from major events and eventually be able to say, ‘this is the signal’,” said Kozdon, who is now an assistant professor of applied mathematics at the Naval Postgraduate School. “Fortunately, these catastrophic earthquakes don’t happen frequently, but we can input these site-specific characteristics into computer models – such as those made possible with the CEES cluster – in the hopes of identifying acoustic signatures that indicate whether or not an earthquake has generated a large tsunami.”
Dunham and Kozdon pointed out that identifying a tsunami signature doesn’t complete the warning system. Underwater microphones called hydrophones would need to be deployed on the seafloor or on buoys to detect the signal, which would then need to be analyzed to confirm a threat, both of which could be costly. Policymakers would also need to work with scientists to settle on the degree of certainty needed before pulling the alarm.
If these points can be worked out, though, the technique could help provide precious minutes for an evacuation.
The study is detailed in the current issue of the journal The Bulletin of the Seismological Society of America.
Eric Dunham, Geophysics
Bjorn Carey, Stanford News Service
By Alton Parrish.
For this study, research technicians searched the VARS database to find every video clip that showed debris on the seafloor. They then compiled data on all the different types of debris they saw, as well as when and where this debris was observed.
Deep-sea currents wrapped this plastic bag around a deep-sea gorgonian coral 2,115 meters (almost 7,000 feet) below the ocean surface in Astoria Canyon, off the Coast of Oregon.
Image: ©2006 MBARI
In total, the researchers counted over 1,500 observations of deep-sea debris, at dive sites from Vancouver Island to the Gulf of California, and as far west as the Hawaiian Islands. In the recent paper, the researchers focused on seafloor debris in and around Monterey Bay—an area in which MBARI conducts over 200 research dives a year. In this region alone, the researchers noted over 1,150 pieces of debris on the seafloor.
The largest proportion of the debris—about one third of the total—consisted of objects made of plastic. Of these objects, more than half were plastic bags. Plastic bags are potentially dangerous to marine life because they can smother attached organisms or choke animals that consume them.
Metal objects were the second most common type of debris seen in this study. About two thirds of these objects were aluminum, steel, or tin cans. Other common debris included rope, fishing equipment, glass bottles, paper, and cloth items.
The researchers found that trash was not randomly distributed on the seafloor. Instead, it collected on steep, rocky slopes, such as the edges of Monterey Canyon, as well as in a few spots in the canyon axis. The researchers speculate that debris accumulates where ocean currents flow past rocky outcrops or other obstacles.
A tangle of rope and fishing gear lies on the seafloor about 1,000 meters (3,280 feet) deep in Monterey Canyon.
Image: ©2012 MBARI
The researchers also discovered that debris was more common in the deeper parts of the canyon, below 2,000 meters (6,500 feet). Schlining commented, “I was surprised that we saw so much trash in deeper water. We don’t usually think of our daily activities as affecting life two miles deep in the ocean.” Schlining added, “I’m sure that there’s a lot more debris in the canyon that we’re not seeing. A lot of it gets buried by underwater landslides and sediment movement. Some of it may also be carried into deeper water, farther down the canyon.”
In the same areas where they saw trash on the seafloor, the researchers also saw kelp, wood, and natural debris that originated on land. This led them to conclude that much of the trash in Monterey Canyon comes from land-based sources, rather than from boats and ships.
Although the MBARI study showed a smaller proportion of lost fishing gear than some previous studies did, fishing gear accounted for the most obvious negative impacts on marine life. The researchers observed several cases of animals trapped in old fishing gear.
Other effects on marine life were more subtle. For example, debris in muddy-bottom areas was often used as shelter by seafloor animals, or as a hard surface on which animals anchored themselves. Although such associations seem to benefit the individual animals involved, they also reflect the fact that marine debris is creating changes in the existing natural biological communities.
A young rockfish hides out in a discarded shoe, 472 meters (1,548 feet) deep in San Gabriel Canyon, off Southern California.
Image: ©2010 MBARI
To make matters worse, the impacts of deep-sea trash may last for years. Near-freezing water, lack of sunlight, and low oxygen concentrations discourage the growth of bacteria and other organisms that can break down debris. Under these conditions, a plastic bag or soda can might persist for decades.
MBARI researchers hope to do additional research to understand the long-term biological impacts of trash in the deep sea. Working with the Monterey Bay National Marine Sanctuary, they are currently finishing up a detailed study of the effects of a particularly large piece of marine debris—a shipping container that fell off a ship in 2004.
During research expeditions, researchers occasionally retrieve trash from the deep sea. However, removing such debris on a large scale is prohibitively expensive, and can sometimes do more damage than simply leaving it in place.
Schlining noted, “The most frustrating thing for me is that most of the material we saw—glass, metal, paper, plastic—could be recycled.” She and her coauthors hope that their findings will inspire coastal residents and ocean users to recycle their trash instead of allowing it to end up in the ocean. In the conclusion of their article, they wrote, “Ultimately, preventing the introduction of litter into the marine environment through increased public awareness remains the most efficient and cost-effective solution to this dilemma.”
Monterey Bay Aquarium Research Institute (MBARI)
By Alton Parrish.
Credit: Stony Brook
“Our nanotechnology produces entanglements that are millions of times more dense than woven products such as fabrics or carpets,” said lead researcher Miriam Rafailovich, Distinguished Professor of Materials Science and Engineering and Co-Director of the Program in Chemical and Molecular Engineering at Stony Brook University. “The microfibers trap them by attaching to microstructures on their legs, taking away their ability to move, which stops them from feeding and reproducing.”
Successful tests were performed using live bed bugs and termites in Professor Rafailovich’s lab with the assistance of Ying Liu, a scientist with Stony Brook University’s Advanced Energy Research and Technology Center and Stony Brook graduate students Shan He and Linxi Zhang.
Kevin McAllister, Fibertrap’s co-founder added, “We are very excited to move this advancement from the lab to the consumer. Our goal has always been to make a difference for people living in areas where bed bugs are pervasive and difficult to eradicate.”
The microfibers are safe for humans and pets, and unlike chemical treatments, insects cannot develop resistance to them.
About Bed Bugs
Bed bugs (Cimex lectularius) are small, flat, parasitic insects that feed solely on the blood of people and animals while they sleep. Bed bugs are reddish-brown in color, wingless, range from one millimeter (mm) to seven mm (roughly the size of Lincoln’s head on a penny), and can live several months without a blood meal.
Bed bug infestations usually occur around or near the areas where people sleep. These areas include apartments, shelters, rooming houses, hotels, cruise ships, buses, trains and dorm rooms. They hide during the day in places such as seams of mattresses, box springs, bed frames, headboards, dresser tables, inside cracks or crevices, behind wallpaper or any other clutter or objects around a bed. Bed bugs have been shown to be able to travel over 100 feet in a night but tend to live within eight feet of where people sleep. A bed bug bite affects each person differently. Bite responses can range from an absence of any physical signs of the bite, to a small bite mark, to a serious allergic reaction. Bed bugs are not considered to be dangerous; however, an allergic reaction to several bites may need medical attention. (Source: Centers for Disease Control and Prevention) For more information please check out the Centers for Disease Control and Prevention Bed Bug FAQs.
New York City consistently ranks in the top 10 or 15 cities with the worst bed bug problem across the nation. An annual list released by Orkin Pest Control based upon bed bug business in U.S. cities, lists Chicago as having the worst bed bug problem for 2012; New York City comes in at #10.
By Alton Parrish.
The radar imagery revealed that 1998 QE2 is a binary asteroid. In the near-Earth population, about 16 percent of asteroids that are about 655 feet (200 meters) or larger are binary or triple systems. Radar images suggest that the main body, or primary, is approximately 1.7 miles (2.7 kilometers) in diameter and has a rotation period of less than four hours. Also revealed in the radar imagery of 1998 QE2 are several dark surface features that suggest large concavities. The preliminary estimate for the size of the asteroid’s satellite, or moon, is approximately 2,000 feet (600 meters) wide. The radar collage covers a little more than two hours of observations.
First radar images of asteroid 1998 QE2 were obtained when the asteroid was about 3.75 million miles (6 million kilometers) from Earth. The small white dot at lower right is the moon, or satellite, orbiting asteroid 1998 QE2.
The closest approach of the asteroid occurs on May 31 at 1:59 p.m. Pacific (4:59 p.m. Eastern / 20:59 UTC), when the asteroid will get no closer than about 3.6 million miles (5.8 million kilometers), or about 15 times the distance between Earth and the moon. This is the closest approach the asteroid will make to Earth for at least the next two centuries. Asteroid 1998 QE2 was discovered on Aug. 19, 1998, by the Massachusetts Institute of Technology Lincoln Near Earth Asteroid Research (LINEAR) program near Socorro, N.M.
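The “about 15 times the distance between Earth and the moon” figure is straightforward to verify. A minimal sketch, using a standard mean Earth-Moon distance of 238,855 miles (a textbook value, not given in the article):

```python
# Sanity check of the flyby geometry: express the 3.6-million-mile
# closest approach of 1998 QE2 in lunar distances.
MILES_TO_MOON = 238_855         # mean Earth-Moon distance, miles (assumed)
closest_approach_miles = 3.6e6  # closest approach, from the article

lunar_distances = closest_approach_miles / MILES_TO_MOON
print(f"{lunar_distances:.1f} lunar distances")  # about 15, as stated
```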
The resolution of these initial images of 1998 QE2 is approximately 250 feet (75 meters) per pixel. Resolution is expected to increase in the coming days as more data become available. Between May 30 and June 9, radar astronomers using NASA’s 230-foot-wide (70 meter) Deep Space Network antenna at Goldstone, Calif., and the Arecibo Observatory in Puerto Rico, will perform an extensive campaign of observations on asteroid 1998 QE2. The two telescopes have complementary imaging capabilities that will enable astronomers to learn as much as possible about the asteroid during its brief visit near Earth.
Radar is a powerful technique for studying an asteroid’s size, shape, rotation state, surface features and surface roughness, and for improving the calculation of asteroid orbits. Radar measurements of asteroid distances and velocities often enable computation of asteroid orbits much further into the future than if radar observations weren’t available.
NASA places a high priority on tracking asteroids and protecting our home planet from them. In fact, the United States has the most robust and productive survey and detection program for discovering near-Earth objects. To date, U.S. assets have discovered more than 98 percent of the known Near-Earth Objects.
In 2012, the Near-Earth Object budget was increased from $6 million to $20 million. Literally dozens of people are involved with some aspect of near-Earth object research across NASA and its centers. Moreover, there are many more people involved in researching and understanding the nature of asteroids and comets, including those objects that come close to Earth, plus those who are trying to find and track them in the first place.
In addition to the resources NASA puts into understanding asteroids, it also partners with other U.S. government agencies, university-based astronomers, and space science institutes across the country that are working to track and better understand these objects, often with grants, interagency transfers and other contracts from NASA.
NASA’s Near-Earth Object Program at NASA Headquarters, Washington, manages and funds the search, study, and monitoring of asteroids and comets whose orbits periodically bring them close to Earth. JPL manages the Near-Earth Object Program Office for NASA’s Science Mission Directorate in Washington. JPL is a division of the California Institute of Technology in Pasadena.
In 2016, NASA will launch a robotic probe to one of the most potentially hazardous of the known Near-Earth Objects. The OSIRIS-REx mission to asteroid (101955) Bennu will be a pathfinder for future spacecraft designed to perform reconnaissance on any newly-discovered threatening objects. Aside from monitoring potential threats, the study of asteroids and comets offers a valuable opportunity to learn more about the origins of our solar system, the source of water on Earth, and even the origin of the organic molecules that led to the development of life.
NASA recently announced development of a first-ever mission to identify, capture and relocate an asteroid for human exploration. Using game-changing technologies, this mission would mark an unprecedented technological achievement that raises the bar of what humans can do in space. Capturing and redirecting an asteroid will integrate the best of NASA’s science, technology and human exploration capabilities and draw on the innovation of America’s brightest scientists and engineers.
More information about asteroid radar research is at: http://echo.jpl.nasa.gov/
More information about the Deep Space Network is at: http://deepspace.jpl.nasa.gov/dsn .
By Alton Parrish.
However, research by psychologists at the universities of Kent, Magdeburg and Cambridge, and the Medical Research Council, has shown that, contrary to this core assumption, some people can intentionally and voluntarily suppress unwanted memories, abolishing the brain activity related to remembering. This was demonstrated through experiments in which people who carried out a mock crime were later tested on their recognition of the crime while their electrical brain activity was measured. Critically, when asked to suppress their crime memories, a significant proportion of people managed to reduce their brain’s recognition response and appear innocent.
This finding has major implications for brain activity guilt detection tests, among the most important being that those using memory detection tests should not assume that brain activity is outside voluntary control, and any conclusions drawn on the basis of these tests need to acknowledge that it might be possible for suspects to intentionally suppress their memories of a crime and evade detection.
Dr Zara Bergstrom, Lecturer in Cognitive Psychology at the University of Kent and principal investigator on the research, said: ‘Brain activity guilt detection tests are promoted as accurate and reliable measures for establishing criminal culpability. Our research has shown that this assumption is not always justified. Using these types of tests to say that someone is innocent of a crime is not valid because it could just be the case that the suspect has managed to hide their crime memories.’
Dr Michael Anderson, Senior Scientist at the Medical Research Council Cognition and Brain Sciences Unit in Cambridge, commented: ‘Interestingly, not everyone was able to suppress their memories of the crime well enough to beat the system. Clearly, more research is needed to identify why some people were much more effective than others.’
Dr Anderson’s group is presently trying to understand such individual differences with brain imaging.
Dr Jon Simons, of the Department of Psychology at the University of Cambridge, added: ‘Our findings would suggest that the use of most brain activity guilt detection tests in legal settings could be of limited value. Of course, there could be situations where it is impossible to beat a memory detection test, and we are not saying that all tests are flawed, just that the tests are not necessarily as good as some people claim. More research is also needed to understand whether the results of this research work in real life crime detection.’
‘Intentional retrieval suppression can conceal guilty knowledge in ERP memory detection tests’ (Zara M. Bergström, Michael C. Anderson, Marie Buda, Jon S. Simons and Alan Richardson-Klavehn) will be published in Biological Psychology in September 2013 (Volume 94, Issue 1).
It is currently online at http://dx.doi.org/10.1016/j.biopsycho.2013.04.012
By Alton Parrish.
“It might explain why we’re here at all,” said David Radford, who oversees specific ORNL activities in the Majorana Demonstrator research effort. “It could help explain why the matter that we are made of exists.”
The Majorana Demonstrator is being assembled and stored 4,850 feet beneath the Earth’s surface, shielded in ultra-pure copper, to limit the amount of background interference from cosmic rays and radioactive isotopes.
Radford, a researcher in ORNL’s Physics Division and an expert in germanium detectors, has been delivering germanium-76 to Sanford Underground Research Laboratory (SURF) in Lead, S.D., for the project. After navigating a Valentine’s Day blizzard on the first two-day drive from Oak Ridge, Radford made a second delivery in March.
ORNL serves as the lead laboratory for the Majorana Demonstrator research effort, a collaboration of research institutions representing the United States, Russia, Japan and Canada. The project is managed by the University of North Carolina’s Prof. John Wilkerson, who also has a joint faculty appointment with ORNL.
Research at SURF is being conducted 4,850 feet beneath the earth’s surface with the intention of building a 40-kilogram germanium detector, capable of detecting the theorized neutrinoless double beta decay. Detection might help to explain the matter-antimatter imbalance.
Before the detection of the unobserved decay can begin, however, the germanium must first be processed, refined and enriched. Radford coordinated the multistep process, which includes an essential pit stop in Oak Ridge.
The 42.5 kilograms of 86-percent enriched white germanium oxide powder required for the project is valued at $4 million and was transported from a Russian enrichment facility to a secure underground ORNL facility in a specially designed container. The container’s special shielding and underground storage limited exposure of the germanium to cosmic rays.
Without such preventative measures, Radford says, “Cosmic rays transmute germanium atoms into long-lived radioactive atoms, at the rate of about two atoms per day per kilogram of germanium. Even those two atoms a day will add to the background in our experiment. So we use underground storage to reduce the exposure to cosmic rays by a factor of 100.”
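Radford’s numbers make the value of underground storage easy to quantify. A back-of-the-envelope sketch, where the 180-day storage period is purely an assumed illustration (the article does not state one):

```python
# Rough activation estimate from the quoted rates: about 2 cosmogenic
# radioactive atoms per day per kilogram at the surface, reduced by a
# factor of 100 underground. The 42.5 kg mass is from the article; the
# 180-day storage period is hypothetical.
SURFACE_RATE = 2.0    # atoms / day / kg at the surface
SHIELD_FACTOR = 100   # reduction from underground storage
mass_kg = 42.5
storage_days = 180    # assumed, for illustration only

surface_atoms = SURFACE_RATE * mass_kg * storage_days
underground_atoms = surface_atoms / SHIELD_FACTOR
print(f"surface: {surface_atoms:.0f} atoms, underground: {underground_atoms:.0f} atoms")
# prints: surface: 15300 atoms, underground: 153 atoms
```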
The germanium must further undergo a reduction and purification process at two Oak Ridge companies, Electrochemical Systems, Inc. (ESI) and Advanced Measurement Technology (AMETEK), before being moved to its final destination in South Dakota. ESI works to reduce the powdered germanium oxide to metal germanium bars. ORTEC, a division of AMETEK, further purifies the bars, using the material to grow large single crystals of germanium, and turning those into one-kilogram cylindrical germanium detectors that will be used in the Demonstrator. Once they leave AMETEK, Radford and his team transport the detectors to SURF.
The enrichment process is lengthy. The Majorana Demonstrator project began the partnership with ESI four years ago. To date, ORNL has delivered — via Radford’s two trips — nine of the enriched detectors, which are valued at about $2 million including the original cost of the enriched germanium oxide powder.
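As a rough mass-balance check, the 42.5 kg of oxide powder is consistent with the roughly 30 kilograms of detector germanium the project needs. A sketch using standard atomic masses, with the simplifying assumption that the non-enriched remainder is germanium-74 and ignoring processing losses:

```python
# How much germanium metal 42.5 kg of 86%-enriched GeO2 powder can yield.
# Atomic masses are standard values; treating the non-Ge-76 remainder as
# Ge-74 is a simplification, and processing losses are ignored.
M_GE76, M_GE74, M_O = 75.921, 73.921, 15.999
oxide_kg = 42.5
enrichment = 0.86

avg_ge_mass = enrichment * M_GE76 + (1 - enrichment) * M_GE74
ge_fraction = avg_ge_mass / (avg_ge_mass + 2 * M_O)  # Ge share of GeO2 by mass
ge_metal_kg = oxide_kg * ge_fraction
print(f"{ge_metal_kg:.1f} kg of germanium metal")  # roughly 30 kg
```

That upper bound of about 30 kg lines up with the 30 one-kilogram detectors the Demonstrator requires.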
Requiring a total of 30 enriched detectors, the Majorana Demonstrator is not expected to be fully complete and operational until 2015.
Those involved in the Majorana research effort believe its completion and anticipated results will help pave the way for a next-generation detector using germanium-76 with unprecedented sensitivity. The future one-ton detector will help determine the masses of neutrinos and whether lepton number is conserved, questions theorized to bear on the initial imbalance of matter and antimatter from the Big Bang.
“The research effort is the first major step towards building a one-ton detector — a potentially Nobel-Prize-worthy project,” Radford says.
ORNL’s partner institutions in the Majorana Demonstration Project are Black Hills State University, Duke University, Institute for Theoretical and Experimental Physics (Russia), Joint Institute for Nuclear Research (Russia), Los Alamos National Laboratory, Lawrence Berkeley National Laboratory, North Carolina State University, Osaka (Japan) University, Pacific Northwest National Laboratory, South Dakota School of Mines and Technology, Triangle Universities Nuclear Laboratory, Centre for Particle Physics (Canada), University of Chicago, University of North Carolina, University of South Carolina, University of South Dakota, University of Tennessee and the Center for Experimental Nuclear Physics and Astrophysics.
The Majorana Demonstrator research project is funded by the National Science Foundation and the Department of Energy’s Office of Nuclear Physics.
By Alton Parrish.
Dr. Zakri, a national of Malaysia who co-chaired the landmark 2005 Millennium Ecosystem Assessment and serves also as science advisor to his country’s prime minister, cited fast-growing evidence that “we are hurtling towards irreversible environmental tipping points that, once passed, would reduce the ability of ecosystems to provide essential goods and services to humankind.”
The incremental loss of Amazon rainforest, for example, “may seem small with shortsighted perspective” but will eventually “accumulate to cause a larger, more important change,” he said. Experts warn that ongoing climate change, combined with land use change and fires, “could cause much of the Amazon forest to transform abruptly to more open, dry-adapted ecosystems, threatening the region’s enormous biodiversity and priceless services,” he added.
“It has been clear for some time that a credible, permanent IPCC-like science policy platform for biodiversity and ecosystem services is an important but missing element in the international response to the biodiversity crisis,” Dr. Zakri told the 7th Trondheim Conference on Biodiversity.
The Millennium Ecosystem Assessment “demonstrated that such an intergovernmental platform can create a clear, valuable policy-relevant consensus from a wide range of information sources about the state, trends and outlooks of human-environment interactions, with focus on the impacts of ecosystem change on human well-being. It showed that such a platform can support decision-makers in the translation of knowledge into policy.
“The Millennium Ecosystem Assessment provides our baseline,” he said. “The IPBES will tell us how much we have achieved, where we are on track, where we are not, why, and options for moving forward. It will help to build public support and identify priorities.”
The structure of IPBES mimics that of the IPCC but its aims go further to include capacity building to help bridge different knowledge systems.
“IPBES will reduce the gulf between the wealth of scientific knowledge on declining natural world conditions, and knowledge about effective action to reverse these damaging trends,” he said.
Even barnyard diversity is in decline
Some scientists have termed this the “sixth great extinction episode” in Earth’s history, according to Dr. Zakri, noting that the loss of biodiversity is happening faster and everywhere, even among farm animals.
He underlined findings by the UN Food and Agriculture Organization that genetic diversity among livestock is declining.
“The good news is the rate of decline is dropping but the latest data classify 22% of domesticated breeds at risk of extinction,” Dr. Zakri said.
Breeds become rare because their characteristics either don’t suit contemporary demand or because differences in their qualities have not been recognised. When a breed population falls to about 1,000 animals, it is considered rare and endangered.
Causes of genetic erosion in domestic animals are the lack of appreciation of the value of indigenous breeds and their importance in niche adaptation, incentives to introduce exotic and more uniform breeds from industrialised countries, and product-focused selection.
Among crops, meanwhile, about 75 per cent of genetic diversity was lost in the last century as farmers worldwide switched to genetically uniform, high-yielding varieties and abandoned multiple local varieties. There are 30,000 edible plant species but only 30 crops account for 95% of human food energy, the bulk of which (60%) comes down to rice, wheat, maize, millet and sorghum.
“The decline in the diversity of crops and animals is occurring in tandem with the need to sharply increase world food production and as a changing environment makes it more important than ever to have a large genetic pool to enable organisms to withstand and adapt to new conditions,” he said.
Biodiversity and the Sustainable Development Goals
According to Dr. Zakri, the most important outcome of last year’s Rio+20 international environmental summit of nations was agreement to set new multi-year global objectives to succeed the Millennium Development Goals (2000 – 2015).
Biodiversity is expected to feature prominently in the new “Sustainable Development Goals.”
For specifics, Dr. Zakri commended the Aichi Biodiversity Targets, already established through the Convention on Biological Diversity, which contain five strategic priorities and 20 specific targets internationally agreed for achievement by 2020, beginning with public awareness of the value of biodiversity and the steps people can take to conserve and use it sustainably.
“The Aichi Targets are an important contribution to the SDG process and it is up to us to ensure that they are fully considered,” he said.
“I would argue, though, that advancing towards equity and sustainable development requires us to go beyond. We need to meet the fundamental challenge of decoupling economic growth from natural resource consumption, which is forecast to triple by 2050 unless humanity can find effective ways to ‘do more and better with less.’ There are no simple blueprints for addressing a challenge as vast and complex as this but it’s imperative we commit to that idea.
“We also need measures of societal progress that go beyond Gross Domestic Product. We need the kind of vision embodied in the Inclusive Wealth Index being pioneered by Sir Partha Dasgupta of Cambridge University, Anantha Duraiappah at IHDP, and Pushpam Kumar at UNEP. As they have convincingly argued, enlightened measures of wealth that include natural capital, not just output like GDP, offer a real portrait of sustainable development,” he added.
“The idea that natural capital should be measured like this makes many nervous. And I agree that many of the services the environment provides, like clean water and air, are irreplaceable necessities.
“In theory, however, the undoubted value of these natural treasures should be reflected in their price, which should rise steeply as they become scarcer. In practice, natural assets are often hard to price well, if at all. Although this work is still in its infancy, it is worth recalling that GDP has only been measured for the last 70 years. And that originally it was a far cruder metric than today. The reality over many decades and the recent experience with the MDGs demonstrate all too clearly the limited success that even legal biodiversity-related commitments have in the absence of some sort of metric that speaks to other sectors and interests involved in the development process. We need to urge more economists to do the hard but valuable work of pricing the seemingly priceless. Ensuring these ideas are properly reflected in the SDGs could provide the type of support and encouragement needed.”
Contacts and sources:
Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES)
Member nations (110): http://www.ipbes.net/about-ipbes/members-of-the-platform.html
Profile of Prof. Zakri: http://en.wikipedia.org/wiki/Zakri_Abdul_Hamid
Biodiversity from terrestrial, marine, coastal, and inland water ecosystems provides the basis for ecosystems and the services they provide that underpin human well-being. However, biodiversity and ecosystem services are declining at an unprecedented rate, and in order to address this challenge, adequate local, national and international policies need to be adopted and implemented. To achieve this, decision makers need scientifically credible and independent information that takes into account the complex relationships between biodiversity, ecosystem services, and people. They also need effective methods to interpret this scientific information in order to make informed decisions. The scientific community also needs to understand the needs of decision makers better in order to provide them with the relevant information. In essence, the dialogue between the scientific community, governments, and other stakeholders on biodiversity and ecosystem services needs to be strengthened.
To this end, a new platform has been established by the international community – the ‘Intergovernmental Platform on Biodiversity and Ecosystem Services’ (IPBES). IPBES was established in April 2012, as an independent intergovernmental body open to all member countries of the United Nations. The members are committed to building IPBES as the leading intergovernmental body for assessing the state of the planet’s biodiversity, its ecosystems and the essential services they provide to society.
IPBES provides a mechanism recognized by both the scientific and policy communities to synthesize, review, assess and critically evaluate relevant information and knowledge generated worldwide by governments, academia, scientific organizations, non-governmental organizations and indigenous communities. It engages a credible group of experts to conduct assessments of such information and knowledge in a transparent way. IPBES is unique in that it will aim to strengthen capacity for the effective use of science in decision-making at all levels. IPBES will also aim to address the needs of Multilateral Environmental Agreements that are related to biodiversity and ecosystem services, and build on existing processes to ensure synergy and complementarity in each other’s work.
Specific discussions on IPBES started following the final meeting of the multi-stakeholder international steering committee for the consultative process on an International Mechanism of Scientific Expertise on Biodiversity (IMoSEB) in November 2007. The consultation towards IMoSEB decided to invite the Executive Director of UNEP – in collaboration with governments and other partners – to convene an intergovernmental and multi-stakeholder meeting to consider the establishment of an intergovernmental mechanism for biodiversity and ecosystem services. There was also consensus among the stakeholders involved in the Millennium Ecosystem Assessment (MA) follow-up initiative that the follow up to the IMoSEB process and the MA follow-up process should merge. It was the coming together of the MA follow up process with the follow up to the IMoSEB consultations that led to the present process on IPBES.
Three intergovernmental and multistakeholders meetings (Malaysia 2008, Kenya 2009, Republic of Korea 2010) were held to discuss ways to strengthen the science-policy interface on biodiversity and ecosystem services. At the first two meetings, the gaps and needs for strengthening the science policy interface were identified, and at the meeting in June 2010, in Busan, Republic of Korea, governments decided that an IPBES should be established, what the focus of its work programme should be, and agreed on many of the principles of its operation as part of the Busan Outcome.
The Busan Outcome was welcomed by the 10th Conference of the Parties to the Convention on Biological Diversity (CBD) in Nagoya in October 2010, and was subsequently considered at the 65th session of the United Nations General Assembly (UNGA). UNGA passed a resolution requesting UNEP to convene a plenary meeting to fully operationalize IPBES at the earliest opportunity. This resolution was then taken on board by UNEP in a decision at the 26th session of the UNEP Governing Council meeting, held in February 2011.
The plenary meeting was held in two sessions. The first session was held from 3 to 7 October 2011 in Nairobi. The second session of the plenary was hosted by UNEP, in collaboration with UNESCO, FAO and UNDP, in Panama City from 16 to 21 April 2012. There, many of the modalities and institutional arrangements for the Platform were finalised and 94 Governments adopted a resolution establishing the Platform as an independent intergovernmental body.
The first meeting of the Platform’s Plenary (IPBES-1) was held in Bonn, Germany from 21 to 26 January 2013, hosted by the Government of Germany. The final outcome document of this session is available as IPBES/1/12, which includes decisions on the next steps for the development of an initial work programme, the status of contributions and initial budget for the Platform for 2012, the IPBES administrative and institutional arrangements, and the procedure for receiving and prioritizing requests put to the Platform. In addition the report includes the updated rules of procedure for the plenary of the Platform.
The intersessional process towards the second session of the Platform’s plenary (IPBES-2), anticipated in December 2013, is contained in decision IPBES/1/2. More on the intersessional process here: http://www.ipbes.net/intersessional-process.html
By Alton Parrish.
“Conventional lenses only capture two dimensions of a three-dimensional object,” says one of the paper’s co-authors, NIST’s Ting Xu. “Our flat lens is able to project three-dimensional images of three-dimensional objects that correspond one-to-one with the imaged object.” An article published in the journal Nature explains that the new lens is formed from a flat slab of metamaterial with special characteristics that cause light to flow backward—a counterintuitive situation in which waves and energy travel in opposite directions, creating a negative refractive index.
Naturally occurring materials such as air or water have a positive refractive index. You can see this when you put a straw into a glass of water and look at it from the side. The straw appears bent and broken, as a result of the change in index of refraction between air, which has an index of 1, and water, which has an index of about 1.33. Because the refractive indices are both positive, the portion of the straw immersed in the water appears bent forward with respect to the portion in air. The negative refractive index of metamaterials causes light entering or exiting the material to bend in a direction opposite what would occur in almost all other materials.
For instance, if we looked at our straw placed in a glass filled with a negative-index material, the immersed portion would appear to bend backward, completely unlike the way we’re used to light behaving. In 1967, Russian physicist Victor Veselago described how a material with both negative electric permittivity and negative magnetic permeability would have a negative index of refraction.
(Permittivity is a measure of a material’s response to an applied electric field, while permeability is a measure of the material’s response to an applied magnetic field.) Veselago reasoned that a material with a refractive index of -1 could be used to make a lens that is flat, as opposed to traditional refractive lenses, which are curved. A flat lens with a refractive index of -1 could be used to directly image three-dimensional objects, projecting a three-dimensional replica into free space.
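The straw thought experiment can be sketched with Snell’s law, n1·sin(θ1) = n2·sin(θ2). In this illustrative snippet the sign convention (a negative angle means the ray bends to the opposite side of the normal, i.e. “backward”) is our own choice, not from the article:

```python
import math

# Snell's law with a signed refraction angle: a negative result means
# the refracted ray lands on the opposite side of the normal from where
# a positive-index material would send it.
def refraction_angle(theta_incident_deg, n1, n2):
    s = n1 * math.sin(math.radians(theta_incident_deg)) / n2
    return math.degrees(math.asin(s))

print(refraction_angle(30, 1.0, 1.33))  # air into water: about +22 degrees
print(refraction_angle(30, 1.0, -1.0))  # air into a Veselago medium: -30 degrees
```

With n2 = -1 the refraction angle is exactly the negative of the incidence angle, which is what lets a flat slab of such material refocus every ray from a point source and act as a lens.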
A negative-index flat lens like this has also been predicted to enable the transfer of image details substantially smaller than the wavelength of light and create higher-resolution images than are possible with lenses made of positive-index materials such as glass. It took over 30 years from Veselago’s prediction for scientists to create a negative-index material in the form of metamaterials, which are engineered on a subwavelength scale. For the past decade, scientists have made metamaterials that work at microwave, infrared and visible wavelengths by fabricating repeating metallic patterns on flat substrates. However, the smaller the wavelength of light scientists want to manipulate, the smaller these features need to be, which makes fabricating the structures an increasingly difficult task.
Until now, making metamaterials that work in the UV has been impossible because it required making structures with features as small as 10 nanometers, or 10 billionths of a meter. Moreover, because of limitations inherent in their design, metamaterials of this type designed for infrared and visible wavelengths have, so far, been shown to impart a negative index of refraction to light that is traveling only in a certain direction, making them hard to use for imaging and other applications that rely on refracted light.
To overcome these problems, researchers working at NIST took inspiration from a theoretical metamaterial design recently proposed by a group at the FOM Institute for Atomic and Molecular Physics in Holland. They adapted the design to work in the UV—a frequency range of particular technological interest. According to co-authors Xu, Amit Agrawal and Henri Lezec, aside from achieving record-short wavelengths, their metamaterial lens is inherently easy to fabricate. It doesn’t rely on nanoscale patterns, but instead is a simple sandwich of alternating nanometer-thick layers of silver and titanium dioxide, the construction of which is routine.
And because its unique design consists of a stack of strongly coupled waveguides sustaining backward waves, the metamaterial exhibits a negative index of refraction to incoming light regardless of its angle of travel. This realization of a Veselago flat lens operating in the UV is the first such demonstration of a flat lens at any frequency beyond the microwave. By using other combinations of materials, it may be possible to make similarly layered metamaterials for use in other parts of the spectrum, including the visible and the infrared.
The metamaterial flat lens achieves its refractive action over a distance of about two wavelengths of UV light, about half a millionth of a meter—a focal length challenging to achieve with conventional refractive optics such as glass lenses. Furthermore, transmission through the metamaterial can be turned on and off using higher frequency light as a switch, allowing the flat lens to also act as a shutter with no moving parts. “Our lens will offer other researchers greater flexibility for manipulating UV light at small length scales,” says Lezec.
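The quoted working distance also pins down the wavelength involved: two wavelengths spanning about half a millionth of a meter implies roughly 250-nanometer light, squarely in the ultraviolet. A trivial check:

```python
# Two wavelengths cover about half a millionth of a meter,
# so one wavelength is about 250 nm (ultraviolet).
distance_m = 0.5e-6
wavelength_nm = distance_m / 2 * 1e9
print(f"{wavelength_nm:.0f} nm")  # 250 nm
```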
“With its high photon energies, UV light has a myriad of applications, including photochemistry, fluorescence microscopy and semiconductor manufacturing. That, and the fact that our lens is so easy to make, should encourage other researchers to explore its possibilities.” The new work was performed in collaboration with researchers from the Maryland NanoCenter at the University of Maryland, College Park; Syracuse University; and the University of British Columbia, Kelowna, Canada.
Contacts and sources:
By Alton Parrish.
Bonn Declaration issued by 500 scientists at ‘Water in the Anthropocene’ conference
A conference of 500 leading water scientists from around the world today issued a stark warning that, without major reforms, “in the short span of one or two generations, the majority of the 9 billion people on Earth will be living under the handicap of severe pressure on fresh water, an absolutely essential natural resource for which there is no substitute. This handicap will be self-inflicted and is, we believe, entirely avoidable.”
“The list of human activities and their impact on the water systems of Planet Earth is long and important,” said Anik Bhaduri, Executive Officer of the Global Water System Project (GWSP).
“We have altered the Earth’s climatology and chemistry, its snow cover, permafrost, sea and glacial ice extent and ocean volume—all fundamental elements of the hydrological cycle. We have accelerated major processes like erosion, applied massive quantities of nitrogen that leaks from soil to ground and surface waters and, sometimes, literally siphoned all water from rivers, emptying them for human uses before they reach the ocean. We have diverted vast amounts of freshwater to harness fossil energy, dammed major waterways, and destroyed aquatic ecosystems.”
“The idea of the Anthropocene underscores the point that human activities and their impacts have global significance for the future of all living species — ours included. Humans are changing the character of the world water system in significant ways with inadequate knowledge of the system and the consequences of changes being imposed. From a research position, human-water interactions must be viewed as a continuum and a coupled system, requiring interdisciplinary inquiry like that which has characterized the GWSP since its inception.”
Among many examples of humanity’s oversized imprint on the world, cited in a paper by James Syvitski, Chair of the International Geosphere-Biosphere Programme and three fellow experts (in full: http://bit.ly/Yx4COp), and in a new “Water in the Anthropocene” video to debut in Bonn May 21 (available at gwsp.org and http://www.anthropocene.info):
Credit: gwsp.org / www.anthropocene.info
Humanity uses an area the size of South America to grow its crops and an area the size of Africa for raising livestock
Due to groundwater and hydrocarbon pumping in low-lying coastal areas, two-thirds of major river deltas are sinking, some of them at a rate four times faster on average than global sea level is rising
More rock and sediment is now moved by human activities such as shoreline in-filling, damming and mining than by the natural erosive forces of ice, wind and water combined
Many river floods today have links to human activities, including the Indus flood of 2010 (which killed 2,000 people) and the Bangkok flood of 2011 (815 deaths)
On average, humanity has built one large dam every day for the last 130 years. Tens of thousands of large dams now distort natural river flows to which ecosystems and aquatic life adapted over millennia
Drainage of wetlands destroys their capacity to ease floods—a free service of nature expensive to replace
Evaporation from poorly-managed irrigation renders many of the world’s rivers dry — no water, no life. And so, little by little, tens of thousands of species edge closer to extinction every day.
Needed: Better water system monitoring and governance
The water community stresses that concern now extends far beyond ‘classic’ drinking water and sanitation issues and includes water quality and quantity for ecosystems at all scales.
Says GWSP co-chair Claudia Pahl-Wostl: “The fact is, as world water problems worsen, we lack adequate efforts to monitor the availability, condition and use of water — a situation presenting extreme long term cost and danger.”
“Human water security is often achieved in the short term at the expense of the environment with harmful long-term implications. The problems are largely caused by governance failure and a lack of systemic thinking in both developed and developing countries. Economic development without concomitant institutional development will lead to greater water insecurity in the long-term. Global leadership is required to deal with the water challenges of the 21st century.”
Credit: gwsp.org / www.anthropocene.info
It will recommend priorities for decision makers in the areas of earth system science and water resources governance and management.
And it will constitute a scientific prelude to October’s Budapest Water Summit, a major objective of which is to elevate the importance of water issues within the UN General Assembly negotiations on the Sustainable Development Goals — a set of globally-agreed future objectives to succeed the UN Millennium Development Goals in 2015.
Observers expect adoption of “water security” as a Sustainable Development Goal
Water expert Janos Bogardi, Senior Advisor to GWSP, says the absence of defined global water quantity and quality standards for personal use, agriculture and healthy ecosystems is a critical gap as the world community develops its next set of shared medium-term objectives.
“These definitions constitute a cardinal challenge today for scientists and politicians alike. It is important to reach consensus in order to make progress on the increasingly important notion of ‘water security’,” says Dr. Bogardi, stressing that changing terminology will not in itself solve problems. “Replacing the word ‘sustainability’ with ‘security’ is not a panacea.”
With respect to quantity, less than 20 liters daily for sanitary needs and drinking is deemed “water misery” while 40 to 80 liters is considered “comfortable.” (Current US per capita average daily consumption is over 300 liters; daily usage in urban Germany is about 120 liters per capita and in urban Hungary, where water is relatively expensive, the figure is 80 liters.)
Missing also are authoritative scientific determinations of how much water can be drawn without crossing a “tipping point” threshold into ecosystem collapse. While there is no general rule, GWSP scientists say withdrawals of 30% to 40% of a renewable freshwater resource constitutes “extreme” water stress, but underline scope to continue satisfying needs if water is returned and recycled in good quality. Mining fossil groundwater resources is by definition non-sustainable.
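The thresholds quoted in the two paragraphs above can be sketched as a simple classifier. The cutoffs are the figures given in the text; the label for in-between usage ranges and the exact boundary handling are illustrative assumptions.

```python
def classify_daily_use(liters_per_capita: float) -> str:
    """Label per-capita daily water use; cutoffs from the text, other labels assumed."""
    if liters_per_capita < 20:
        return "water misery"
    if 40 <= liters_per_capita <= 80:
        return "comfortable"
    return "unclassified"  # the text gives no label for other ranges

def withdrawal_stress(withdrawn: float, renewable: float) -> str:
    """Flag 'extreme' stress when withdrawals reach 30-40% of a renewable resource."""
    ratio = withdrawn / renewable
    return "extreme" if ratio >= 0.30 else "below extreme"

print(classify_daily_use(15))        # water misery
print(classify_daily_use(60))        # comfortable
print(withdrawal_stress(35, 100))    # extreme
```

The lower bound (30%) of the GWSP scientists’ 30–40% range is used as the trigger here, which is a design choice, not a rule stated in the text.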
The GWSP is developing water quality guidelines for people, agriculture and ecosystems in the context of the Sustainable Development Goals.
“The urgency of formulating the post-2015 Sustainable Development Goals and a tracking system for their success means that quite soon the SDG negotiators must offer-up water targets,” says Dr. Vörösmarty. “Whether they focus predominantly on continuing the Millennium Development Goals (narrowly on drinking water and sanitation for human health) or formulate a more comprehensive agenda that simultaneously optimizes water security for humans as well as for nature remains an open question. The water sciences community stands ready to take on this challenge. Are the decision makers?”
Definitions of water security
In 2007, World Bank expert David Grey and Claudia Sadoff of IUCN defined water security as “The availability of an acceptable quantity and quality of water for health, livelihoods, ecosystems and production, coupled with an acceptable level of water-related risks to people, environments and economies.”
Their use of the term “acceptable” acknowledges that water security has relative, negotiable meanings.
In March, another formulation was set out by UN-Water, the United Nations’ inter-agency coordination mechanism for all water-related issues.
It defined water security as: “The capacity of a population to safeguard sustainable access to adequate quantities of and acceptable quality water for sustaining livelihoods, human well-being, and socio-economic development, for ensuring protection against water-borne pollution and water-related disasters, and for preserving ecosystems in a climate of peace and political stability.” (see http://bit.ly/1864vMG)
The full text of The Bonn Declaration:
In the short span of one or two generations, the majority of the 9 billion people on Earth will be living under the handicap of severe pressure on fresh water, an absolutely essential natural resource for which there is no substitute. This handicap will be self-inflicted and is, we believe, entirely avoidable.
After years of observations and a decade of integrative research convened under the Earth System Science Partnership (ESSP) and other initiatives, water scientists are more than ever convinced that fresh water systems across the planet are in a precarious state.
Mismanagement, overuse and climate change pose long-term threats to human well-being, and evaluating and responding to those threats constitutes a major challenge to water researchers and managers alike. Countless millions of individual local human actions add up and reverberate into larger regional, continental and global changes that have drastically changed water flows and storage, impaired water quality, and damaged aquatic ecosystems.
Human activity thus plays a central role in the behavior of the global water system.
Since 2004, the Global Water System Project (GWSP) has spearheaded a broad research agenda and new ways of thinking about water as a complex global system, emphasizing the links that bind its natural and human components. Research carried out by GWSP and its partners has produced several important results that inform a better global understanding of fresh water today.
Humans are a key feature of the global water system, influencing prodigious quantities of water: stored in reservoirs, taken from rivers and groundwater and lost in various ways. Additional deterioration through pollution, now detectable on a global scale, further limits an already-stressed resource base, and negatively affects the health of aquatic life forms and human beings.
At a time of impending water challenges, it remains a struggle to secure the basic environmental and social observations needed to obtain an accurate picture of the state of the resource. We need to know about the availability, condition and use of water as part of a global system through sustained environmental surveillance. History teaches us that failure to obtain this basic information will be costly and dangerous.
Humans typically achieve water security through short-term and often costly engineering solutions, which can create long-lived impacts on social-ecological systems. Faced with a choice of water for short-term economic gain or for the more general health of aquatic ecosystems, society overwhelmingly chooses development, often with deleterious consequences on the very water systems that provide the resource.
Traditional approaches to development are counterproductive, destroying the services that healthy water systems provide, such as flood protection, habitat for fisheries and pollution control. Loss of these services will adversely affect current and future generations.
Sustainable development requires both technological and institutional innovation. At present, the formulation of effective institutions for the management of water lags behind engineering technologies in many regions.
Research from the GWSP and elsewhere confirms that current increases in the use of water and impairment of the water system are on an unsustainable trajectory. However, current scientific knowledge cannot predict exactly how or precisely when a planetary-scale boundary will be breached. Such a tipping point could trigger irreversible change with potentially catastrophic consequences.
The existing focus on water supply, sanitation and hygiene has delivered undoubted benefits to people around the world, but equally, we need to consider wider Sustainable Development Goals in the context of the global water system. Ecosystem-based sustainable water management, a pressing need that was reaffirmed at the Rio+20 Earth Summit, requires that solving water problems must be a joint obligation of environmental scientists, social scientists, engineers, policy-makers, and a wide range of stakeholders.
These realities motivate the water community assembled in Bonn for the Global Water System Project Conference “Water in the Anthropocene” to make a set of core recommendations to institutions and individuals focused on science, governance, management and decision-making relevant to water resources on earth. Given the development imperatives associated with all natural resources at the dawn of the 21st century, we urge a united front to form a strategic partnership of scientists, public stakeholders, decision-makers and the private sector. This partnership should develop a broad, community-consensus blueprint for a reality-based, multi-perspective, and multi-scale knowledge-to-action water agenda, based on these recommendations:
1) Make a renewed commitment to adopt a multi-scale and interdisciplinary approach to water science in order to understand the complex and interlinked nature of the global water system and how it may change now and in future.
2) Execute state-of-the-art synthesis studies of knowledge about fresh water that can inform risk assessments and be used to develop strategies to better promote the protection of water systems.
3) Train the next generation of water scientists and practitioners in global change research and management, making use of cross-scale analysis and integrated system design.
4) Expand monitoring, through traditional land-based environmental observation networks and state-of-the-art earth-observation satellite systems, to provide detailed observations of water system state.
5) Consider ecosystem-based alternatives to costly structural solutions for climate proofing, such that the design of the built environment in future includes both traditional and green infrastructure.
6) Stimulate innovation in water institutions, with a balance of technical- and governance-based solutions and taking heed of value systems and equity. A failure to adopt a more inclusive approach will make it impossible to design effective green growth strategies or policies.
Global Water System Project
By Alton Parrish.
The announcement, now in its sixth year, coincides with the anniversary of the birth of Carolus Linnaeus — the 18th century Swedish botanist responsible for the modern system of scientific names and classifications.
Credit: Composite: Jacob Sahertian
“We have identified only about two million of an estimated 10 to 12 million living species and that does not count most of the microbial world,” said Quentin Wheeler, founding director of the International Institute for Species Exploration at ASU and author of “What on Earth? 100 of our Planet’s Most Amazing New Species” (NY, Plume, 2013).
“For decades, we have averaged 18,000 species discoveries per year which seemed reasonable before the biodiversity crisis. Now, knowing that millions of species may not survive the 21st century, it is time to pick up the pace,” Wheeler added.
“We are calling for a NASA-like mission to discover 10 million species in the next 50 years. This would lead to discovering countless options for a more sustainable future while securing evidence of the origins of the biosphere,” Wheeler said.
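The arithmetic behind that call is straightforward. A short sketch comparing the proposed mission’s pace with the roughly 18,000 descriptions per year cited above:

```python
target_species = 10_000_000   # proposed mission: 10 million species
years = 50
current_rate = 18_000         # average annual discoveries cited in the article

required_rate = target_species / years
print(required_rate)                  # 200000.0 species per year
print(required_rate / current_rate)   # just over 11x the current pace
```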
Taxon experts pick top 10
Members of the international committee made their top 10 selection from more than 140 nominated species. To be considered, species must have been described in compliance with the appropriate code of nomenclature, whether botanical, zoological or microbiological, and have been officially named during 2012.
“Selecting the final list of new species from a wide representation of life forms such as bacteria, fungi, plants and animals, is difficult. It requires finding an equilibrium between certain criteria and the special insights revealed by selection committee members,” said Antonio Valdecasas, a biologist and research zoologist with Museo Nacional de Ciencias Naturales in Madrid, Spain. Valdecasas is the international selection committee chairman for the top 10 new species.
“We look for organisms with unexpected features or size and those found in rare or difficult to reach habitats. We also look for organisms that are especially significant to humans — those that play a certain role in human habitat or that are considered a close relative,” Valdecasas added.
This year’s top 10 come from Peru; NE Pacific Ocean, USA: California; Democratic Republic of the Congo; Panama; France; New Guinea; Madagascar; Ecuador; Malaysia; and China.
Top 10 New Species, 2013
“I don’t know whether to be more astounded by the species discovered each year, or the depths of our ignorance about biodiversity of which we are a part,” shared Wheeler.
“At the same time we search the heavens for other earthlike planets, we should make it a high priority to explore the biodiversity on the most earthlike planet of them all: Earth,” he added. “With more than eight out of every 10 living species awaiting discovery, I am shocked by our ignorance of our very own planet and in awe at the diversity, beauty and complexity of the biosphere and its inhabitants.”
Describing the discoveries
Tiny violet: Not only is the Lilliputian violet among the smallest violets in the world, it is also one of the most diminutive terrestrial dicots. Known only from a single locality in an Intermontane Plateau of the high Andes of Peru, Viola lilliputana lives in the dry puna grassland eco-region. Specimens were first collected in the 1960s, but the species was not described as new until 2012. The entire above ground portion of the plant is barely 1 centimeter tall. Named, obviously, for the race of little people on the island of Lilliput in Jonathan Swift’s Gulliver’s Travels.
Country: NE Pacific Ocean; USA: California
Carnivorous sponge: A spectacular, large, harp- or lyre-shaped carnivorous sponge discovered in deep water (averaging 3,399 meters) from the northeast Pacific Ocean off the coast of California. The harp-shaped structures or vanes number from two to six and each has more than 20 parallel vertical branches, often capped by an expanded, balloon-like, terminal ball. This unusual form maximizes the surface area of the sponge for contact and capture of planktonic prey.
Country: Democratic Republic of the Congo
Old World monkey: Discovered in the Lomami Basin of the Democratic Republic of the Congo, the lesula is an Old World monkey well known to locals but newly known to science. This is only the second species of monkey discovered in Africa in the past 28 years. Scientists first saw the monkey as a captive juvenile in 2007. Researchers describe the shy lesula as having human-like eyes. More easily heard than seen, the monkeys perform a booming dawn chorus. Adult males have a large, bare patch of skin on the buttocks, testicles and perineum that is colored a brilliant blue. Although the forests where the monkeys live are remote, the species is hunted for bush meat and its status is vulnerable.
No to the Mine! Snake
Snail-eating snake: A beautiful new species of snail-eating snake has been discovered in the highland rainforests of western Panama. The snake is nocturnal and hunts soft-bodied prey including earthworms and amphibian eggs, in addition to snails and slugs. This harmless snake defends itself by mimicking the alternating dark and light rings of venomous coral snakes. The species is found in the Serranía de Tabasará mountain range where ore mining is degrading and diminishing its habitat. The species name is derived from the Spanish phrase “No a la mina” or “No to the mine.”
A Smudge on Paleolithic Art
Fungus: In 2001, black stains began to appear on the walls of Lascaux Cave in France. By 2007, the stains were so prevalent they became a major concern for the conservation of precious rock art at the site that dates back to the Upper Paleolithic. An outbreak of a white fungus, Fusarium solani, had been successfully treated when, just a few months later, black staining fungi appeared. The genus, Ochroconis, primarily includes fungi that occur in the soil and are associated with the decomposition of plant matter. As far as scientists know, this fungus, one of two new species of the genus from Lascaux, is harmless. However, at least one species of the group, O. gallopava, causes disease in humans who have compromised immune systems.
World’s Smallest Vertebrate
Country: New Guinea
Tiny frog: Living vertebrates — animals that have a backbone or spinal column — range in size from this tiny new species of frog, as small as 7 millimeters, to the blue whale, measuring 25.8 meters. The new frog was discovered near Amau village in Papua New Guinea. It captures the title of ‘smallest living vertebrate’ from a tiny Southeast Asian cyprinid fish that claimed the record in 2006. The adult frog size, determined by averaging the lengths of both males and females, is only 7.7 millimeters. With few exceptions, this and other ultra-small frogs are associated with moist leaf litter in tropical wet forests — suggesting a unique ecological guild that could not exist under drier circumstances.
Endangered shrub: Eugenia is a large, worldwide genus of woody evergreen trees and shrubs of the myrtle family that is particularly diverse in South America, New Caledonia and Madagascar. The new species E. petrikensis is a shrub growing to two meters with emerald green, slightly glossy foliage and beautiful, dense clusters of small magenta flowers. It is one of seven new species described from the littoral forest of eastern Madagascar and is considered to be an endangered species. It is the latest evidence of the unique and numerous species found in this specialized, humid forest that grows on sandy substrate within kilometers of the shoreline. Once forming a continuous band 1,600 kilometers long, the littoral forest has been reduced to isolated, vestigial fragments under pressure from human populations.
Glow-in-the-dark cockroach: Luminescence among terrestrial animals is rather rare and best known among several groups of beetles — fireflies and certain click beetles in particular — as well as cave-inhabiting fungus gnats. Since the first discovery of a luminescent cockroach in 1999, more than a dozen species have (pardon the pun) “come to light.” All are rare, and interestingly, so far found only in remote areas far from light pollution. The latest addition to this growing list is L. luckae that may be endangered or possibly already extinct. This cockroach is known from a single specimen collected 70 years ago from an area heavily impacted by the eruption of the Tungurahua volcano. The species may be most remarkable because the size and placement of its lamps suggest that it is using light to mimic toxic luminescent click beetles.
No Social Butterfly
Social media lacewing: In a trend-setting collision of science and social media, Hock Ping Guek photographed a beautiful green lacewing with dark markings at the base of its wings in a park near Kuala Lumpur and shared his photo on Flickr. Shaun Winterton, an entomologist with the California Department of Food and Agriculture, serendipitously saw the image and recognized the insect as unusual. When Guek was able to collect a specimen, it was sent to Stephen Brooks at London’s Natural History Museum who confirmed its new species status. The three joined forces and prepared a description using Google Docs. In this triumph for citizen science, talents from around the globe collaborated by using new media in making the discovery. The lacewing is not named for its color — rather for Winterton’s daughter, Jade.
Hanging Around in the Jurassic
Hangingfly fossil: Living species of hangingflies can be found, as the name suggests, hanging beneath foliage where they capture other insects as food. They are a lineage of scorpionflies characterized by their skinny bodies, two pairs of narrow wings, and long threadlike legs. A new fossil species, Juracimbrophlebia ginkgofolia, has been found along with preserved leaves of a ginkgo-like tree, Yimaia capituliformis, in Middle Jurassic deposits in the Jiulongshan Formation in China’s Inner Mongolia. The two look so similar that they are easily confused in the field and represent a rare example of an insect mimicking a gymnosperm 165 million years ago, before an explosive radiation of flowering plants.
Why create a top 10 new species list?
Arizona State University’s International Institute for Species Exploration announces the top 10 new species list each year as part of its public awareness campaign to bring attention to biodiversity and the field of taxonomy.
“Sustainable biodiversity means assuring the survival of as many and as diverse species as possible so that ecosystems are resilient to whatever stresses they face in the future. Scientists will need access to as much evidence of evolutionary history as possible,” said the institute’s Wheeler, who is also a professor in ASU’s School of Life Sciences in the College of Liberal Arts and Sciences, and in the School of Sustainability, as well as a senior sustainability scientist with the Global Institute of Sustainability.
“All of our hopes and dreams for conservation hinge upon saving millions of species that we cannot recognize and know nothing about,” Wheeler added. “No investment makes more sense than completing a simple inventory to establish the baseline data that tells us what kinds of plants and animals exist and where. Until we know what species already exist, it is folly to expect we will make the right decisions to assure the best possible outcome for the pending biodiversity crisis.”
Additionally, the announcement is made on or near May 23 to honor Linnaeus. Since he initiated the modern system for naming plants and animals, nearly two million species have been named, described and classified. Excluding unknown millions of microbes, scientists estimate there are between 10 and 12 million living species.
IISE International Selection Committee: Antonio G. Valdecasas, Museo Nacional de Ciencias Naturales, CSIC, Spain, Committee Chair; Andrew Polaszek, Natural History Museum, England; Ellinor Michel, Natural History Museum, England; Marcelo Rodrigues de Carvalho, Universidade de São Paulo; Aharon Oren, The Hebrew University of Jerusalem; Mary Liz Jameson, Wichita State University, USA; Alan Paton, Kew Royal Botanical Gardens, England; James A. Macklin, Agriculture and Agri-Food Canada, Canada; John S. Noyes, Natural History Museum, England; Zhi-Qiang Zhang, Landcare Research, New Zealand; and Gideon Smith, South African National Biodiversity Institute, South Africa.
By Alton Parrish.
These are some of the conclusions of a study conducted by researchers from the Department of Physical and Sports Education at the University of Granada. Their research has also shown a widespread belief among elite athletes that the fight against doping is inefficient and biased, and that the sanctions imposed “are not severe enough”.
In an article in the journal “Sports Medicine”, the most important publication in the field of Sport Sciences, researchers Mikel Zabala and Jaime Morente-Sánchez have analysed the attitudes, beliefs and knowledge about doping of elite athletes from all over the world. To this end, they conducted a literature review of 33 studies on the subject published between 2000 and 2011, in order to analyse the current situation and, as a result of this, to act by developing specific, efficient anti-doping strategies.
Fewer controls in team-based sports
The results of the University of Granada study reveal that athletes participating in team-based sports appear to be less susceptible to using doping substances. However, the authors stress that in team sports anti-doping controls are clearly both quantitatively and qualitatively less exhaustive.
The study indicates that coaches seem to be the principal influence and source of information for athletes when it comes to starting or not starting to take banned substances, while doctors and other specialists are less involved. Athletes are becoming increasingly familiar with anti-doping rules, but there is still a lack of knowledge about the problems entailed in using banned substances and methods, which the researchers believe should be remedied through appropriate educational programmes.
Moreover, they also conclude that a substantial lack of information exists among elite athletes about dietary supplements and the secondary effects of performance-enhancing substances.
In the light of their results, the University of Granada researchers consider it necessary to plan and conduct information and prevention campaigns to influence athletes’ attitudes towards doping and the culture surrounding this banned practice. “We should not just dedicate money almost exclusively to performing anti-doping tests, as we currently do. To improve the situation, it would be enough to designate at least a small part of this budget to educational and prevention programmes that encourage athletes to reject the use of banned substances and methods”, Mikel Zabala and Jaime Morente-Sánchez conclude. In this context, one pioneering example in their opinion is the Spanish Cycling Federation’s “Preventing to win” project.
Citation: Doping in Sport: A Review of Elite Athletes’ Attitudes, Beliefs, and Knowledge.
Morente-Sánchez J, Zabala M. Sports Medicine. 2013 Mar 27.
By Alton Parrish.
Archaeologists have made a discovery in southern subtropical China which could revolutionise thinking about how ancient humans lived in the region.
They have uncovered evidence for the first time that people living in Xincun 5,000 years ago may have practised agriculture – before the arrival of domesticated rice in the region.
This is Dr Mingqi Li sampling one of the pebble tools for ancient starch using an ultrasonic bath, Institute of Geographical Sciences and Natural Resources Research, Chinese Academy of Sciences in Beijing.
Credit: Dr. Huw Barton
Current archaeological thinking is that it was the advent of rice cultivation along the Lower Yangtze River that marked the beginning of agriculture in southern China. Poor organic preservation in the study region, as in many others, means that traditional archaeobotany techniques are not possible.
Now, thanks to a new method of analysis on ancient grinding stones, the archaeologists have uncovered evidence that agriculture could predate the advent of rice in the region.
The research was the result of a two-year collaboration between Dr Huw Barton, from the School of Archaeology and Ancient History at the University of Leicester, and Dr Xiaoyan Yang, Institute of Geographical Sciences and Natural Resources Research, Chinese Academy of Sciences, in Beijing.
Funded by a Royal Society UK-China NSFC International Joint Project, and other grants held by Yang in China, the research is published in PLOS ONE.
This shows the Xincun site under excavation: (a) Neolithic living surface being cleaned.
Credit: Dr Jun Wei
Dr Barton said: “We have used a relatively new method known as ancient starch analysis to analyse ancient human diet. This technique can tell us things about human diet in the past that no other method can.
“From a sample of grinding stones we extracted very small quantities of adhering sediment trapped in pits and cracks on the tool surface. From this material, preserved starch granules were extracted with our Chinese colleagues in the starch laboratory in Beijing. These samples were analysed in China and also here at Leicester in the Starch and Residue Laboratory, School of Archaeology and Ancient History.
“Our research shows us that there was something much more interesting going on in the subtropical south of China 5,000 years ago than we had first thought. The survival of organic material is really dependent on the particular chemical properties of the soil, so you never know what you will get until you sample. At Xincun we really hit the jackpot. Starch was well-preserved and there was plenty of it. While some of the starch granules we found were species we might expect to find on grinding and pounding stones, i.e. some seeds and tuberous plants such as freshwater chestnuts, lotus root and the fern root, the addition of starch from palms was totally unexpected and very exciting.”
This is a map of the study region in southern China (A), Xincun site indicated by red triangle (B), and details of the Xincun site including excavation areas marked by red grids, stippling shows location of coastal sand dunes (C).
Credit: Xiaoyan Yang
Dr Barton said: “The presence of at least two, possibly three species of starch-producing palms, bananas, and various roots, raises the intriguing possibility that these plants may have been planted near the settlement.
“Today, groups that rely on palms growing in the wild are highly mobile, moving from one palm stand to another as they exhaust the clump. Sedentary groups that utilise palms for their starch today plant suckers near the village, maintaining a continuous supply. If they were planted at Xincun, this implies that ‘agriculture’ did not arrive here with the arrival of domesticated rice, as archaeologists currently think, but that an indigenous system of plant cultivation may have been in place by the mid Holocene.
“The adoption of domesticated rice was slow and gradual in this region; it was not a rapid transformation as in other places. Our findings may indicate why this was the case. People may have been busy with other types of cultivation, ignoring rice, which may have been present in the landscape as a minor plant for a long time before it too became a food staple.”
By Alton Parrish.
The nanoparticles were designed to sense glucose levels in the body and respond by secreting the appropriate amount of insulin, thereby replacing the function of pancreatic islet cells, which are destroyed in patients with Type 1 diabetes. Ultimately, this type of system could ensure that blood-sugar levels remain balanced and improve patients’ quality of life, according to the researchers.
“Insulin really works, but the problem is people don’t always get the right amount of it. With this system of extended release, the amount of drug secreted is proportional to the needs of the body,” says Daniel Anderson, an associate professor of chemical engineering and member of MIT’s Koch Institute for Integrative Cancer Research and Institute for Medical Engineering and Science.
Anderson is the senior author of a paper describing the new system in a recent issue of the journal ACS Nano. Lead author of the paper is Zhen Gu, a former postdoc in Anderson’s lab. The research team also includes Robert Langer, the David H. Koch Institute Professor at MIT, and researchers from the Department of Anesthesiology at Boston Children’s Hospital.
Mimicking the pancreas
Currently, people with Type 1 diabetes typically prick their fingers several times a day to draw blood for testing their blood-sugar levels. When levels are high, these patients inject themselves with insulin, which helps cells absorb the excess sugar.
In recent years, many researchers have sought to develop insulin-delivery systems that could act as an “artificial pancreas,” automatically detecting glucose levels and secreting insulin. One approach uses hydrogels to measure and react to glucose levels, but those gels are slow to respond or lack mechanical strength, allowing insulin to leak out.
The MIT team set out to create a sturdy, biocompatible system that would respond more quickly to changes in glucose levels and would be easy to administer.
Their system consists of an injectable gel-like structure with a texture similar to toothpaste, says Gu, who is now an assistant professor of biomedical engineering and molecular pharmaceutics at the University of North Carolina at Chapel Hill and North Carolina State University. The gel contains a mixture of oppositely charged nanoparticles that attract each other, keeping the gel intact and preventing the particles from drifting away once inside the body.
Using a modified polysaccharide known as dextran, the researchers designed the gel to be sensitive to acidity. Each nanoparticle contains spheres of dextran loaded with an enzyme that converts glucose into gluconic acid. Glucose can diffuse freely through the gel, so when sugar levels are high, the enzyme produces large quantities of gluconic acid, making the local environment slightly more acidic.
That acidic environment causes the dextran spheres to disintegrate, releasing insulin. Insulin then performs its normal function, promoting the conversion of glucose in the bloodstream into glycogen, which is stored in the liver.
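The feedback loop described above — more glucose produces more gluconic acid, which releases more insulin, which in turn lowers glucose — can be illustrated with a toy numerical sketch. This is not the authors' model; every rate constant below is invented for illustration, and real pharmacokinetics are far more complex.

```python
# Toy simulation of glucose-responsive insulin release (illustrative only;
# all rate constants are invented, not taken from the ACS Nano paper).

def simulate(steps=200, dt=0.1):
    glucose = 300.0    # mg/dL, hyperglycemic starting level
    insulin = 0.0      # arbitrary units
    baseline = 100.0   # target blood-glucose level
    history = []
    for _ in range(steps):
        # Enzyme converts glucose into gluconic acid: more glucose, more acid.
        acid = 0.01 * glucose
        # Acid disintegrates the dextran spheres, so insulin is released
        # roughly in proportion to the acid (hence glucose) level;
        # a clearance term removes insulin from circulation over time.
        insulin += (0.5 * acid - 0.2 * insulin) * dt
        # Insulin drives glucose uptake toward the baseline level.
        glucose -= 0.05 * insulin * max(glucose - baseline, 0.0) * dt
        history.append((glucose, insulin))
    return history

traj = simulate()
print(f"final glucose: {traj[-1][0]:.1f} mg/dL, final insulin: {traj[-1][1]:.2f}")
```

The key property the sketch captures is proportionality: release tracks the current glucose level, so secretion tapers off as glucose approaches baseline rather than delivering a fixed bolus.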
In tests with mice that have Type 1 diabetes, the researchers found that a single injection of the gel maintained normal blood-sugar levels for an average of 10 days. Because the particles are mostly composed of polysaccharides, they are biocompatible and eventually degrade in the body.
The researchers are now trying to modify the particles so they can respond to changes in glucose levels faster, at the speed of pancreatic islet cells. “Islet cells are very smart. They can release insulin very quickly once they sense high sugar levels,” Gu says.
Before testing the particles in humans, the researchers plan to further develop the system’s delivery properties and to work on optimizing the dosage that would be needed for use in humans.
The research was funded by the Leona M. and Harry B. Helmsley Charitable Trust and the Tayebati Family Foundation.