Tianjin Customs seize globes that violate the one-China principle

Tianjin Customs recently seized 92 globes that incorrectly labeled China's Taiwan region alongside the names of countries, violating the one-China principle, according to China's General Administration of Customs (GAC) on Thursday.

The globes were discovered by on-site customs officers at Xingang Customs in North China's Tianjin during an inspection of a declared shipment of imported globes in the import freight channel, according to the GAC.

The globes have been seized according to law and will be further processed.

The GAC said that maps are the main representation of the national territory. A correct national map is a symbol of national sovereignty and territorial integrity.

All printed matter and publications that do not comply with China's regulations on the content of publicly available maps are strictly prohibited from being printed, imported or exported, according to the GAC.

Additionally, relevant enterprises that produce, import or export maps or map products should strictly comply with laws and regulations and carry out map-related business activities in accordance with the law.

China's Ministry of Natural Resources (MNR) issued the open map content representation specification in February, requiring that publicly available maps and the map content of graphic products comply with it. The specification lays out detailed requirements for depicting China's Taiwan region, the South China Sea and the Diaoyu Islands, among other areas.

Humans have pondered aliens since medieval times

For beings that are supposedly alien to human culture, extraterrestrials are pretty darn common. You can find them in all sorts of cultural contexts, from comic books, sci-fi novels and conspiracy theories to Hollywood films and old television reruns. There’s Superman and Doctor Who, E.T. and Mindy’s friend Mork, Mr. Spock, Alf, Kang and Kodos and My Favorite Martian. Of course, there’s just one hitch: They’re all fictional. So far, real aliens from other worlds have refused to show their faces on the real-world Earth — or even telephone, text or tweet. As the Italian physicist Enrico Fermi so quotably inquired during a discussion about aliens more than six decades ago, “Where is everybody?”
Scientific inquiry into the existence of extraterrestrial intelligence still often begins by pondering Fermi’s paradox: The universe is vast and old, so advanced civilizations should have matured enough by now to send emissaries to Earth. Yet none have. Fermi suspected that interstellar travel wasn’t feasible, or that aliens didn’t think visiting Earth was worth the trouble. Others concluded that aliens simply don’t exist. Recent investigations indicate that harsh environments may snuff out nascent life long before it evolves the intelligence necessary for sending messages or traveling through space.
In any event, Fermi’s question did not launch humankind’s concern with visitors from other planets. Imagining other worlds, and the possibility of intelligent life-forms inhabiting them, did not originate with modern science or in speculative fiction. In the ancient world, philosophers argued about the possibility of multiple universes; in the Middle Ages the question of the “plurality of worlds” and possible inhabitants occupied the deepest of thinkers, spawning intricate and controversial philosophical, theological and astronomical debate. Far from being a merely modern preoccupation, life beyond Earth has long been a burning issue animating the human race’s desire to understand itself, and its place in the cosmos.

Other worlds, illogical
From ancient times Earth’s place was widely regarded to be the center of everything. As articulated by the Greek philosopher Aristotle, the Earth was the innermost sphere in a universe, or world, surrounded by various other spheres containing the moon, sun, planets and stars. Those heavenly spheres, crystalline and transparent, rotated about the Earthly core comprising four elements: fire, air, water and earth. Those elements layered themselves on the basis of their essence, or “nature” — earth’s natural place was at the middle of the cosmos, which was why solid matter fell to the ground, seeking the inaccessible center far below.

On the basis of this principle, Aristotle deduced the impossibility of other worlds. If some other world existed, its matter (its “earth”) would seek both the center of its own world and the center of ours. Such opposite imperatives posed a logical contradiction (which Aristotle, having more or less invented logic, regarded as a directly personal insult). He further reasoned that there is no space (no void) outside the known world for any other world to occupy. So, Aristotle concluded, a second world could not exist.

Some Greeks (notably those advocating the existence of atoms) believed otherwise. But Aristotle’s view prevailed. By the 13th century, once Aristotle’s writings had been rediscovered in medieval Europe, most scholars defended his position.
But then religion leveled the philosophical playing field. Fans of other worlds got a chance to make their case.

In 1277, the bishop of Paris, Étienne Tempier, banned scholars from teaching 219 principles, many associated with Aristotle’s philosophy. Among the prohibited teachings on the list was item 34: that God could not create as many worlds as he wanted to. Since the penalty for violating this decree was excommunication, Parisian scholars suddenly discovered rationales allowing multiple worlds, empowering God to defy Aristotle’s logic. And since Paris was the intellectual capital of the European world, scholars elsewhere followed the Parisian lead.
While several philosophers asserted that God could make many worlds, most intimated that he probably wouldn’t have bothered. Hardly anyone addressed the likelihood of alien life, although both Jean Buridan in Paris and William of Ockham in Oxford did consider the possibility. “God could produce an infinite [number of] individuals of the same kind as those that now exist,” wrote Ockham, “but He is not limited to producing them in this world.”
Populated worlds showed up more prominently in writings by the renegade thinkers Nicholas of Cusa (1401–1464) and Giordano Bruno (1548–1600). They argued not only for the existence of other worlds, but also for worlds inhabited by beings just like, or maybe better than, Earth’s humans.

“In every region inhabitants of diverse nobility of nature proceed from God,” wrote Nicholas, who argued that space had no center, and therefore the Earth could not be central or privileged with respect to life. Bruno, an Italian friar, asserted that God’s perfection demanded an infinity of worlds, and beings. “Infinite perfection is far better presented in innumerable individuals than in those which are numbered and finite,” Bruno averred.

Burned at the stake for heretical beliefs (though not, as often stated, for his belief in other worlds), Bruno did not live to see the triumph of Copernicanism during the 17th century. Copernicus had placed the sun at the hub of a planetary system, making the Earth just one planet of several. So the existence of “other worlds” eventually became no longer speculation, but astronomical fact, inviting the notion of otherworldly populations, as the prominent Dutch scientist Christiaan Huygens pointed out in the late 1600s. “A man that is of Copernicus’ opinion, that this Earth of ours is a planet … like the rest of the planets, cannot but sometimes think that it’s not improbable that the rest of the planets have … their inhabitants too,” Huygens wrote in his New Conjectures Concerning the Planetary Worlds, Their Inhabitants and Productions.

A few years earlier, French science popularizer Bernard le Bovier de Fontenelle had surveyed the prospects for life in the solar system in his Conversations on the Plurality of Worlds, an imaginary dialog between a philosopher and an uneducated but intelligent woman known as the Marquise.

“It would be very strange that the Earth was as populated as it is, and the other planets weren’t at all,” the philosopher told the Marquise. Although he didn’t think people could live on the sun (if there were any, they’d be blinded by its brightness), he sided with those who envisioned inhabitants on other planets and even the moon.

“Just as there have been and still are a prodigious number of men foolish enough to worship the Moon, there are people on the Moon who worship the Earth,” he wrote.

From early modern times onward, aliens were not confined to science and philosophy. They also appeared in various works of fiction, providing plot devices that remain popular to the present day. Often authors used aliens as stand-ins for evil (or occasionally benevolent) humans to comment on current events. Modern science fiction about aliens frequently portrays them in the role of tyrants or monsters or victims, with parallels to real life (think Flash Gordon’s nemesis Ming the Merciless, a 1930s dictator, or the extraterrestrials of the 1980s film and TV show Alien Nation — immigrants encountering bigotry and discrimination). When humans look for aliens, it seems, they often imagine themselves.

Serious science
While aliens thrived in fiction, serious scientific belief in extraterrestrials — at least nearby — diminished in the early 20th century, following late 19th century exuberance about possible life on Mars. A network of lines interpreted as canals had supposedly signified the presence of a sophisticated Martian civilization; its debunking (plus further knowledge about planetary environments) led to general agreement that finding intelligent life elsewhere in the solar system was not an intelligent bet.
On the other hand, the universe had grown vastly larger than the early Copernicans had imagined. The sun had become just one of billions of stars in the Milky Way galaxy, which in turn was only one of billions of other similar galaxies, or “island universes.” Within a cosmos so expansive, alien enthusiasts concluded, the existence of other life somewhere seemed inevitable. In 1961, astronomer Frank Drake developed an equation to gauge the likelihood of extraterrestrial life’s existence; by the 1990s he estimated that 10,000 planets in the Milky Way alone possessed advanced civilizations, even before anybody really knew for sure that planets outside the solar system actually existed.
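
Drake’s calculation multiplies a chain of factors, from the galaxy’s star formation rate down to how long a communicating civilization survives. The sketch below shows the equation’s standard form; the sample values are purely illustrative placeholders, not numbers Drake or anyone else actually used.

```latex
% Drake equation: N = expected number of detectable civilizations in the galaxy
N = R_* \cdot f_p \cdot n_e \cdot f_l \cdot f_i \cdot f_c \cdot L
% R_* : rate of star formation (stars per year)
% f_p : fraction of stars with planets
% n_e : habitable planets per planetary system
% f_l, f_i, f_c : fractions on which life, intelligence and
%                 detectable technology arise
% L   : average lifetime of a communicating civilization (years)
% Illustrative values only: R_* = 10, f_p = 0.5, n_e = 2,
% f_l = 0.3, f_i = f_c = 0.1, L = 10,000 years
% => N = 10 * 0.5 * 2 * 0.3 * 0.1 * 0.1 * 10,000 = 300
```

Because several of the fractions are essentially guesses, different inputs yield anything from far less than one civilization to many thousands, which is why figures like Drake’s 10,000 are better read as illustrations of the uncertainty than as predictions.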

But now everybody does. In the space of the last two decades, conclusive evidence of exoplanets, now numbering in the thousands, has reconfigured the debate and sharpened Fermi’s original paradox. No one any longer doubts that planets are plentiful. But still there’s been not a peep from anyone living on them, despite years of aiming radio telescopes at the heavens in hope of detecting a signal in the static of interstellar space.

Maybe such signals are just too rare or too weak for human instruments to detect. Or possibly some cosmic conspiracy is at work to prevent civilizations from communicating — or arising in the first place. Or perhaps civilizations that do arise are eradicated before they have a chance to communicate.

Or maybe the alien invasion has merely been delayed. Fermi’s paradox implicitly assumes that other civilizations have been around long enough to develop galactic transportation systems. After all, the universe, born in the Big Bang 13.8 billion years ago, is three times as old as the Earth. So most analyses assume that alien civilizations had a head start and would be advanced enough by now to go wherever they wanted to. But a new paper suggests that livable galactic neighborhoods may have developed only relatively recently.
In a young, smaller and more crowded universe, cataclysmic explosions known as gamma-ray bursts may have effectively sterilized otherwise habitable planets, Tsvi Piran and collaborators suggest in a paper published in February in Physical Review Letters.

A planet near the core of a galaxy would be especially susceptible to gamma-ray catastrophes. And in a young universe, planets closer to the galactic edge (like Earth) would also be in danger from gamma-ray bursts in neighboring satellite galaxies. Only as the expansion of the universe began to accelerate — not so long before the birth of the Earth — would galaxies grow far enough apart to provide safety zones for life.

“The accelerated expansion induced by a cosmological constant slows the growth of cosmic structures, and increases the mean inter-galaxy separation,” Piran and colleagues write. “This reduces the number of nearby satellites likely to host catastrophic” gamma-ray bursts. So most alien civilizations would have begun to flourish not much before Earth’s did; those aliens may now be wondering why nobody has visited them.

Still, the radio silence from the sky makes some scientists wonder whether today’s optimism about ET’s existence will go the way of the Martian canal society. From one sobering perspective, aliens aren’t sending messages because few planets remain habitable long enough for life to develop an intelligent civilization. One study questions, for instance, how likely it is that life, once initiated on any planet, would shape its environment well enough to keep that planet habitable over the long run.

In fact, that study finds, a wet, rocky planet just the right distance from a star — in the Goldilocks zone — might not remain habitable for long. Atmospheric and geochemical processes would typically drive either rapid warming (producing an uninhabitable planet like Venus) or quick cooling, freezing water and leaving the planet too cold and dry for life to survive, Aditya Chopra and Charles Lineweaver conclude in a recent issue of Astrobiology. Only if life itself alters these processes can it maintain a long-term home suitable for developing intelligence.

“Feedback between life and environment may play the dominant role in maintaining the habitability of the few rocky planets in which life has been able to evolve,” wrote Chopra and Lineweaver, both of the Australian National University in Canberra.

Yet even given such analyses — based on a vastly deeper grasp of astronomy and cosmology than medieval scholars possessed — whether real aliens exist remains one of those questions that science cannot now answer. It’s much like other profound questions also explored in medieval times: What is the universe made of? Is it eternal? Today’s scientists may be closer (or not) to answering those questions than were their medieval counterparts. Nevertheless the answers are not yet in hand.

Maybe we’ll just have to pose those questions to the aliens, if they exist, and are ever willing to communicate. And if those aliens do arrive, and provide the answers, humankind may well discover how medieval its understanding of the cosmos still is. Or perhaps the aliens will be equally clueless about nature’s deepest mysteries. As Fontenelle’s philosopher told the Marquise: “There’s no indication that we’re the only foolish species in the universe. Ignorance is quite naturally a widespread thing.”

A third of the population can’t see the Milky Way at night

At night, a river of stars cuts through the dense darkness of space. These celestial bodies form our galaxy’s core and their soft glow earned our galaxy the moniker “Milky Way.” But for more than a third of Earth’s population, the glare of artificial lights conceals this cosmic wonder from view, researchers report June 10 in Science Advances. Nearly 80 percent of North Americans and 60 percent of Europeans can no longer see the galactic core at night, the researchers estimate.

Using a combination of satellite measurements and on-the-ground observations, the researchers assembled the first global atlas of artificial sky luminance, recording light pollution from everything from streetlamps to spotlights. Nearly four in five people worldwide live under light-polluted skies, the atlas reveals. Singapore boasts the brightest nights, the team found, with skies so luminous that no one living there can fully adapt to night vision. Nights are darkest in Chad, the Central African Republic and Madagascar, where more than three-quarters of inhabitants can gaze up at the stars under pristine viewing conditions.
Bright nights aren’t just an eyesore for stargazers. Artificial lights can disrupt wildlife by, for example, confounding sex-seeking fireflies (SN Online: 8/12/15) and misguiding moths (SN: 6/13/15, p. 9).

On a mission for science, on Jupiter and on Earth

I am on a mission. I want everyone to appreciate and understand science — even those who assume (often based on the way they were taught in school) that they don’t like it. Science is important, and frequently amazing. In this issue alone, you can read about the solar-powered spacecraft that, after a five-year journey, will soon arrive at Jupiter to discover what lurks beneath the planet’s blanket of haze and clouds. As NASA’s Juno spacecraft settles into a cloud-skimming series of orbits around the gas giant, it will probe what makes up the planet, its origins and the nature of its core. Learn about efforts to develop vaccines for mosquito-ferried scourges, from Zika to dengue. And read about the latest volley in the confounding search for the cause of Alzheimer’s and ancient cave circles built by Neandertals.
Luckily, I work for an organization with a mission aligned with my own. And Society for Science & the Public just got a big boost in its efforts to sow understanding and appreciation of science. On May 26, the Society announced a new sponsor of its flagship competition, the Science Talent Search. Like the science fair I wrote about in the last issue (SN: 6/11/16, p. 2), STS offers young scientists a national stage on which they can shine. Regeneron Pharmaceuticals Inc. of Tarrytown, N.Y., has stepped in to replace Intel, STS sponsor since 1998. (Founded by the Society in 1942, STS was originally sponsored by Westinghouse.)

Regeneron has also upped the game, pledging $100 million over 10 years and increasing the value of the scholarships and other awards to $3.1 million annually. The top student winner will now get $250,000, enough for a full ride at many universities. “We are over the moon,” Maya Ajmera, CEO and president of the Society and publisher of Science News, told the Washington Post. “Regeneron is truly helping the Society scale its work in an unprecedented way,” she said.
Regeneron, founded in 1988, developed the cholesterol-fighting drug Praluent, which went on sale last year, and Eylea, used to treat vision diseases such as wet macular degeneration, among other products. Regeneron’s chief scientific officer George Yancopoulos was a top-10 finalist in STS in 1976. Yancopoulos (a former trustee of the Society) and his fellow STS alum Leonard Schleifer, Regeneron CEO and president, now want to give back. “The Westinghouse was a game changer for me as a high school student,” Yancopoulos says. “It truly set me on the path I am on today. I want to be able to grow that ability to motivate the best and the brightest to pursue careers in science.”

Notably, the biotech firm will dedicate $30 million of the total to expand the Society’s efforts in outreach and equity, designed to encourage more young people to engage in original research. In addition to better supporting educators using research-based approaches, the new funds will increase grants to teachers working with underserved students. It will also grow the Science News in High Schools program, sending the magazine to 4,000 more high schools and, I hope, inspiring students to make discoveries of their own.

We are also expanding efforts to get Science News to you. Look for our updated iPad app in July and, coming soon, apps for Android tablets, Kindle Fire and smartphones.

Remnants of Earth’s original crust preserve time before plate tectonics

Not all of the newborn Earth’s surface has been lost to time. Transformed bits of this rocky material remain embedded in the hearts of continents, new research suggests. These lingering remnants hint that full-fledged plate tectonics, the movements of large plates of Earth’s outer shell, began relatively late in the planet’s history, researchers report in the March 17 Science.

These revelations come from ancient continental rock in Canada that preserves geochemical traces of the even older, 4.2-billion-year-old rock from which it formed. “For the first time, we can say something about what kind of rock was a precursor to the first continental crust,” says study coauthor Jonathan O’Neil, a geochemist at the University of Ottawa.
Earth began as a molten ball around 4.54 billion years ago, and over the next tens of millions of years, its surface cooled and solidified. Almost all of Earth’s early rocky surface has been destroyed and recycled by geologic processes such as plate tectonics. The oldest known unaltered bits of the planet aren’t rocks but tiny zircon crystals formed nearly 4.4 billion years ago (SN Online: 2/23/14). The oldest actual rocks date back to only about 4 billion years. “We’re missing a lot of Earth’s history,” O’Neil says.

The new discovery fills in some of that history. In northeastern Canada, along the eastern shore of the Hudson Bay, O’Neil and geochemist Richard Carlson of the Carnegie Institution for Science in Washington, D.C., discovered 2.7-billion-year-old continental rocks that hinted at something much older. The rocks contain an unusually large abundance of an isotope of neodymium that formed only during the first few hundred million years of Earth’s history. To have so much of this neodymium, the rocks must have formed from material that was first created more than 4.2 billion years ago, the researchers calculate. That’s far older than the oldest rocks ever studied.
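
The reasoning rests on the arithmetic of an extinct radionuclide. Assuming the isotope in question is neodymium-142, produced by the decay of short-lived samarium-146 with a half-life on the order of 100 million years (an assumption consistent with, though not stated in, the text above), the parent isotope was essentially gone within Earth’s first few hundred million years, so the unusual 142Nd signature must record material that formed while the parent was still alive. A minimal sketch of the half-life arithmetic:

```latex
% Fraction of a radioactive parent surviving after time t, for half-life t_{1/2}:
\frac{N(t)}{N_0} = \left(\frac{1}{2}\right)^{t / t_{1/2}}
% With t_{1/2} \approx 100 million years (roughly the value for ^{146}Sm):
%   after 500 million years, (1/2)^{5}  \approx 3\% of the parent remains;
%   after 1 billion years,   (1/2)^{10} \approx 0.1\%.
% The clock effectively stops ticking early in Earth's history, which is why
% the signature points to precursor material older than 4.2 billion years.
```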

Based on the composition of the Canadian rocks, the researchers think that the precursor material was similar to the crust that underlies modern oceans. The finding affirms previous studies that suggested that the first continental crust arose from the partial melting of oceanic crust (SN Online: 9/20/16).

But unlike modern oceanic crust, which typically lingers for less than 200 million years before getting recycled into Earth’s interior by plate tectonics, the precursor crust survived for more than a billion years before being reworked into continental crust 2.7 billion years ago. Plate tectonics during the precursor crust’s life span must have therefore been nonexistent, sluggish or limited to certain regions, O’Neil concludes.

“If you ask five geologists the simple question of when did plate tectonics start, you’ll have answers from 4.3 billion years ago to 1 billion years ago,” he says. The new finding seems to rule out the idea that full-blown, global plate tectonics began early in Earth’s history.

The new work is exciting and sheds light on the processes that set the scene for Earth’s subsequent evolution and habitability, says geologist Tony Kemp of the University of Western Australia in Crawley. Other vestiges of early crust may lurk undiscovered elsewhere on Earth as well, he says. “It will be intriguing to see how this [research] unfolds with future studies of this type.”

Anatomy analysis suggests new dinosaur family tree

The standard dinosaur family tree may soon be just a relic.

After examining more than 400 anatomical traits, scientists have proposed a radical reshuffling of the major dinosaur groups. The rewrite, reported in the March 23 Nature, upsets century-old ideas about dinosaur evolution. It lends support to the accepted idea that the earliest dinosaurs were smallish, two-legged creatures. But contrary to current thinking, the new tree suggests that these early dinosaurs had grasping hands and were omnivores, snapping up meat and plant matter alike.
“This is a novel proposal and a really interesting hypothesis,” says Randall Irmis, a paleontologist at the Natural History Museum of Utah and the University of Utah in Salt Lake City. Irmis, who was not involved with the work, says it’s “a possibility” that the new family tree reflects actual dinosaur relationships. But, he says, “It goes against our ideas of the general relationships of dinosaurs. It’s certainly going to generate a lot of discussion.”

The accepted tree of dinosaur relationships has three dominant branches, each containing critters familiar even to the non–dinosaur obsessed. One branch leads to the “bird-hipped” ornithischians, which include the plant-eating duckbills, stegosaurs and Triceratops and its bony-frilled kin. Another branch contains the “reptile-hipped” saurischians, which are further divided into two groups: the plant-eating sauropods (typically four-legged, like Brontosaurus) and the meat-eating theropods (typically two-legged, like Tyrannosaurus rex and modern birds).
This split between the bird-hipped and reptile-hipped dinos was first proposed in 1887 by British paleontologist Harry Seeley, who had noticed the two strikingly different kinds of pelvic anatomy. That hypothesis of dinosaur relationships was formalized and strengthened in the 1980s and has been accepted since then.

The new tree yields four groups atop two main branches. The bird-hipped ornithischians, which used to live on their own lone branch, now share a main branch with the reptile-hipped theropods like T. rex. This placement suggests these once-distant cousins are actually closely related. It also underscores existing questions about the bird-hipped dinos, an oddball group with murky origins; they appear late in the dinosaur fossil record and then are everywhere. Some scientists have suggested that they evolved from an existing group of dinosaurs, perhaps similarly herbivorous sauropods. But by placing the bird-hipped dinos next to the theropods, the tree hints that the late-to-the-party vegetarian weirdos could have evolved from their now close relatives, the meat-eating theropods.

Sauropods (like Brontosaurus) are no longer next to the theropods but now reside on a branch with the meat-eating herrerasaurids. Herrerasaurids are a confusing group of creatures that some scientists think belong near the other meat eaters, the theropods, while others say the herrerasaurids are not quite dinosaurs at all.

The new hypothesis of relationships came about when researchers led by Matthew Baron, a paleontologist at the University of Cambridge and Natural History Museum in London, decided to do a wholesale examination of dinosaur anatomy with fresh eyes. Using a mix of fossils, photographs and descriptions from the scientific literature, Baron and colleagues surveyed the anatomy of more than 70 different dinosaurs and non-dino close relatives, examining 457 anatomical features. The presence, absence and types of features, which include the shape of a hole on the snout, a cheekbone ridge and braincase anatomy, were fed into a computer program, generating a family tree that groups animals that share specialized features.
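
The tree-building step described here is a standard cladistic analysis: score every animal for every trait, group taxa that share derived features, and prefer trees that require the fewest evolutionary changes across the whole matrix. The toy Python sketch below uses hypothetical taxa and a handful of invented 0/1 scores, not Baron’s 457-character dataset or software, simply to illustrate the grouping logic.

```python
# Toy illustration of grouping by shared derived traits (synapomorphies).
# Taxa, traits and scores are hypothetical, not data from the Nature study.

CHARACTER_MATRIX = {
    # taxon:            e.g. [grasping hand, open hip socket, trait3, trait4, trait5]
    "outgroup_reptile": [0, 0, 0, 0, 0],
    "ornithischian":    [1, 1, 0, 1, 1],
    "theropod":         [1, 1, 1, 1, 0],
    "sauropod":         [0, 1, 1, 0, 0],
    "herrerasaurid":    [0, 1, 1, 0, 0],
}

def shared_derived(taxon_a, taxon_b, outgroup="outgroup_reptile"):
    """Count traits present in both taxa but absent in the outgroup.

    A high count suggests the traits were inherited from a common ancestor,
    which is the logic a parsimony program follows when it groups animals
    on a family tree.
    """
    a, b, out = (CHARACTER_MATRIX[t] for t in (taxon_a, taxon_b, outgroup))
    return sum(1 for x, y, o in zip(a, b, out) if x == 1 and y == 1 and o == 0)

if __name__ == "__main__":
    for pair in [("ornithischian", "theropod"),
                 ("sauropod", "theropod"),
                 ("sauropod", "herrerasaurid")]:
        print(pair, shared_derived(*pair))
    # In this made-up matrix, ornithischian + theropod share the most derived
    # traits, echoing (in cartoon form) the 21 details uniting them in the new tree.
```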

In this new interpretation of dinosaur anatomy and the resulting tree, many of the earliest dinosaurs have grasping hands and a mix of meat-eating and plant-eating teeth. If the earliest dinos were really omnivores, given the relationships in the new four-pronged tree, the evolution of specialized diets (vegetarians and meat eaters) each happened twice in the dinosaur lineage.

When the researchers saw the resulting tree, “We were very surprised — and cautious,” Baron says. “It’s a big change that flies in the face of 130 years of thinking.”

The arrangement of the new tree stuck even when the researchers fiddled around with their descriptions of various features, Baron says. The close relationship between the bird-hipped, plant-eating ornithischians and the reptile-hipped, meat-eating theropods, for example, isn’t based on one or two distinctive traits but on 21 small details.

“The lesson is that dinosaur groups aren’t characterized by radical new inventions,” says paleontologist Kevin Padian of the University of California, Berkeley. “The relationships are read in the minutiae, not big horns and frills.” That said, Padian, whose assessment of the research also appears in Nature, isn’t certain that the new tree reflects reality. Such trees are constructed based on how scientists interpret particular anatomical features, decisions that will surely be quibbled with. “The devil is in the details,” Padian says. “These guys have done their homework and now everyone’s going to have to roll up their sleeves and start checking their work.”

When coal replaces a cleaner energy source, health is on the line

Where I grew up in Tennessee, a coal-fired power plant perches by the river, just down from the bridge that my wild brothers and their friends would jump off in the summer. Despite the proximity, I never thought too much about the power plant and the energy it was churning out.

But then I read an April 3 Nature Energy paper on coal-fired energy production that used my town — and others in the Tennessee Valley Authority area — as a natural experiment. The story the data tell is simultaneously fascinating and frustrating, and arrives at a politically charged time. In recent weeks, the Trump administration has signaled a shift in energy policy back toward the fossil fuel.

The roots of this story were planted in the 1930s, when the TVA was created as a New Deal project to help haul America out of the Great Depression. The organization soon got into the power business, relying on a mix of energy sources: hydropower, coal and nuclear. After the 1979 accident at Three Mile Island — a nuclear power plant in Pennsylvania — stricter regulations driven by public fear prompted TVA to shut down its two nuclear reactors. Those temporary closures in 1985 left a gaping hole in the region’s energy production, a need immediately filled by coal.

Economist Edson Severnini realized that this dramatic shift from nuclear to coal offered a chance to study the effects of coal-fired power on health. He analyzed power production, particle pollution and medical records of babies born near coal plants. One in particular picked up the most slack: Paradise Fossil Plant in Paradise, Ky. (Incidentally, that town is the same coal-ravaged one John Prine sings about in arguably the best song ever written.) The plant’s power production increased by about 27 percent, replacing about a quarter of the missing nuclear energy.

Not surprisingly, air pollution near the Paradise plant rose, Severnini found. Levels of an air pollution indicator called total suspended particulate stayed below the Environmental Protection Agency’s limit at the time (but wouldn’t have met today’s tougher standards, Severnini says). Still, babies born near the plant in the 18 months after the nuclear shutdowns in 1985 were about 5 percent smaller than babies born in the 18 months before. No difference in birth weight showed up in babies born near other power plants that didn’t change their output (including my town’s).
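
The before-and-after, affected-versus-unaffected comparison described above amounts to a difference-in-differences calculation. The sketch below uses made-up average birth weights, not figures from the Nature Energy paper, simply to show the arithmetic.

```python
# Hypothetical average birth weights in grams; illustrative only,
# not data from Severnini's study.
paradise_before, paradise_after = 3300.0, 3135.0  # near the plant that ramped up coal
control_before, control_after = 3300.0, 3300.0    # near plants whose output didn't change

# Subtracting the change near unaffected plants from the change near the
# affected plant nets out region-wide trends, isolating the plant's apparent effect.
did_estimate = (paradise_after - paradise_before) - (control_after - control_before)
percent_change = 100 * did_estimate / paradise_before

print(f"difference-in-differences: {did_estimate:.0f} g ({percent_change:.1f}%)")
# With these invented numbers the estimate is -165 g, roughly the 5 percent
# drop reported for babies born near the Paradise plant.
```
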
That 5 percent difference was “really, really surprising,” says Severnini, of Carnegie Mellon University in Pittsburgh. Studies have linked low birth weight to trouble later in life, including a lower IQ, lower earnings and health problems, particularly heart disease.
UCLA environmental epidemiologist Beate Ritz puts that 5 percent drop in context. “These coal-fired power plants coming online can be compared with a pregnant woman smoking one pack of cigarettes a day,” she says. “That’s pretty bad.”

Ritz, who studies the hazards of air pollution in Los Angeles, points out that it’s not just the lowest-birth-weight babies that are affected. The whole curve of birth weights shifted, so that in all likelihood, most babies born there were impacted in some way. “There’s only a small percent in the upper end of the curve that is unaffected,” she says. “Everybody else has probably some kind of subtle effect that you can’t measure on their brain development, on their lung development, on their immune system.”

The study compares nuclear energy to coal. But the issue is far more complex than that, Severnini says. He hopes the example he found will serve as a reminder of how all energy decisions come with complex trade-offs. “Any energy production choice we make has costs and benefits, and we need to weigh them fully.”

The TVA case study fits with many other examples of how coal pollution can harm health, says Bernard Goldstein, a physician and environmental public health expert at University of Pittsburgh. “We should get rid of particulates, and coal contributes to that,” he says.

U.S. dependence on coal is ebbing, in part because natural gas is cheap right now. But coal isn’t dead yet. “My administration is putting an end to the war on coal,” President Donald Trump said March 28, before he signed an executive order that lifted the ban on coal leases on federal land. He also aims to lift other restrictions that affect the coal industry. It’s not yet clear how — or whether —those policies will be enacted, or whether they’ll be enough to revive the coal industry. (Tellingly, the Paradise plant plans to shut down two of its three coal-burning units as it shifts to natural gas.)

“If the president gets his way, this would slow [coal’s descent] down,” says Goldstein, who coauthored a March 23 New England Journal of Medicine opinion piece on why the Trump administration should pay attention to environmental science. Goldstein likens the situation to government efforts to discourage teenage smoking, a trend that’s also decreasing. Just because the numbers are already falling doesn’t mean we shouldn’t hasten that drop, he says.

And unlike exposure to other pollutants like cigarette smoke, air isn’t optional. “You don’t have a choice,” Ritz notes. We are all breathing the air that’s around us, whether we are in Paradise or not.

When tumors fuse with blood vessels, clumps of breast cancer cells can spread

PHILADELPHIA — If you want to beat them, join them. Some breast cancer tumors may follow that strategy to spread through the body.

Breast cancer tumors can fuse with blood vessel cells, allowing clumps of cancer cells to break away from the main tumor and ride the bloodstream to other locations in the body, suggests preliminary research. Cell biologist Vanesa Silvestri of Johns Hopkins University School of Medicine presented the early work December 4 at the American Society for Cell Biology/European Molecular Biology Organization meeting.

Previous research has shown that cancer cells traveling in clusters have a better chance of spreading than loners do (SN: 1/10/15, p. 9). But how clusters of cells get into the bloodstream in the first place has been a mystery, in part because scientists couldn’t easily see inside tumors to find out.

So Silvestri and colleagues devised a see-through synthetic version of a blood vessel. The vessel ran through a transparent gel studded with tiny breast cancer tumors. A camera attached to a microscope allowed the researchers to record the tumors invading the artificial blood vessel. Sometimes the tumors pinched the blood vessel, eventually sealing it off. But in at least one case, a small tumor merged with the cells lining the faux blood vessel. Then tiny clumps of cancer cells broke away from the tumor and floated away in the fluid flowing through the capillary. More work is needed to confirm that the same process happens in the body, Silvestri said.

It’s official: Termites are just cockroaches with a fancy social life

Termites are the new cockroach.

Literally. The Entomological Society of America is updating its master list of insect names to reflect decades of genetic and other evidence that termites belong in the cockroach order, called Blattodea.

As of February 15, “it’s official … that termites no longer have their own order,” says Mike Merchant of Texas A&M University in College Station, chair of the organization’s common names committee. Now all termites on the list are being recategorized.
The demotion brings to mind Pluto getting kicked off the roster of planets, says termite biologist Paul Eggleton of the Natural History Museum in London. He does not, however, expect a galactic outpouring of heartbreak and protest over the termite downgrade. Among specialists, discussions of termites as a form of roaches go back at least to 1934, when researchers reported that several groups of microbes that digest wood in termite guts live in some wood-eating cockroaches too.

Once biologists figured out how to use DNA to work out genealogical relationships, evidence began to grow that termites had evolved as a branch on the many-limbed family tree of cockroaches. In 2007, Eggleton and two museum colleagues used genetic evidence from an unusually broad sampling of species to publish a new tree of these insects (SN: 5/19/07, p. 318). Titled “Death of an order,” the study placed termites on the tree near a Cryptocercus cockroach.

Cryptocercus roaches live in almost termitelike style in the Appalachian Mountains, not too far from chemical ecologist and cockroach fan Coby Schal at North Carolina State University in Raleigh. Monogamous pairs of Cryptocercus roaches eat tunnels in wood and raise young there. The offspring feed on anal secretions from their parents, which provide both nutrition and starter doses of the wood-digesting gut microbes that will eventually let the youngsters eat their way into homes of their own.
Termites are “nothing but social cockroaches,” Schal says. Various roaches have some form of social life, but termites go to extremes. They’re eusocial, with just a few individuals in colonies doing all of the reproducing. In extreme examples, Macrotermes colonies can grow to 3 million individuals with only one queen and one king.

After several years of debate, the common names committee of the American entomologists’ organization voted it was time to switch to the new view of termites. At a February meeting of the society board, there was no objection. The common names of individual termite species, of course, will remain as something-something “termite.”

Considering whether to demote a whole order of insects is an uncommon problem, says Whitney Cranshaw of Colorado State University in Fort Collins, a longtime member of the society’s naming committee. “Probably some of us, including myself, didn’t want to make the change because we liked it the way it was,” he says. Termites and cockroaches as separate orders were easy to memorize for the undergraduates he teaches. Yet, he voted yes. “It’s what’s right.”

Museum mummies sport world’s oldest tattoo drawings

Two human mummies housed at the British Museum in London for more than a century boast the world’s oldest known — and longest hidden — tattoos of figures and designs, a new investigation finds. These people lived in Egypt at or shortly before the rise of the first pharaoh around 5,100 years ago.

Radiocarbon analyses of hairs from the mummies date the bodies to between 3351 B.C. and 3017 B.C., says a team led by Egyptologist Renée Friedman of the University of Oxford and bioarchaeologist Daniel Antoine of the British Museum in London. Infrared photography revealed that smudges on a male mummy’s upper right arm depict a wild bull and a Barbary sheep, while a female mummy bears four S-shaped patterns on her right shoulder and a line with bent ends on her right arm. These animals and figures appear in Egyptian art from the same period, the researchers report online March 1 in the Journal of Archaeological Science. Both sets of tattoos — which consist of a carbon-based pigment, possibly soot — may have symbolized power, social status or knowledge of cult activities, but their precise meanings are unclear.
The two were the only mummies found with tattoos, out of seven mummies originally buried at a southern Egyptian site and now held at the British Museum. All of the bodies had been preserved by the desert’s dry heat.

The tattooed Egyptian mummies are approximately as old as Ötzi the Iceman. The mummified man found in the Italian Alps has 61 dark lines tattooed on his limbs and torso, but no pictures or designs (SN: 1/23/16, p. 5). Some of the Iceman’s tattoos covered areas of joint disease and may have been intended as treatments. CT scans of the two Egyptian mummies found no signs of bone disease near or below tattoos.