What's new

China Science & Technology Forum

Robot wars: China shows off automated doctors, teachers and combat stars
19 Aug 2018
a3ae7a615eb60d52fd6532f285dd74a14afc406e.jpg
AFP/File / WANG ZHAO
By 2020, China is aiming for half of the industrial robots sold in the country to be made by Chinese companies, up from 27 percent currently -- with a target of 70 percent by 2025


Robots that can diagnose diseases, play badminton and wow audiences with their musical skills are among the machines China hopes could revolutionise its economy, with visitors to a Beijing exhibition offered a glimpse of an automated future.

The popular stars of this year's World Robot Conference, which ends Sunday, were undoubtedly the small, amateur-made "battle bots" which smashed, hammered and sawed their way through their opponents to a cacophony of cheers and shouts from a rapt audience.

"With this robot, I can fully express myself. I love the sparks," said Huang Hongsong, one of around a dozen Chinese youths whose creations went head-to-head.

But while the battle bots are designed largely to entertain onlookers, China is deadly serious about riding the robotic wave with an eye on its economy.

Cheap manufacturing propelled the populous giant to become the world's second largest economy in just a few decades.

But the country's population is ageing, leaving it facing a double whammy of a worker shortage and increased labour costs as it gets wealthier.

f977c79229a724469822534988224a098afdc63d.jpg
AFP/File / WANG ZHAO
President Xi Jinping has called for a 'robot revolution' as China faces an ageing population and higher labour costs

Automated machines offer a possible way out with President Xi Jinping in 2014 calling for a "robot revolution".

Under the ruling Communist Party's road map for its industrial future -- dubbed "Made in China 2025" -- state subsidies are pouring into the sector.

And at the robot show, a vast array of machines demonstrated how technology may eventually replace human workers.

In one corner, a mechanical arm -- designed to teach children -- painted an elegant Chinese character while a robotic fish explored its tank and a bat flapped its mechanical wings overhead.

a8389778659565d29471dba850ac35efe2c5af0e.jpg
AFP/File / WANG ZHAO
Under the ruling Communist Party's road map for its industrial future, state subsidies are pouring into the robotics sector


- Delicate balance -

By 2020, China is aiming for half of the industrial robots sold in the country to be made by Chinese companies, up from 27 percent currently -- with a target of 70 percent by 2025.

"Robots are the jewel in the crown for the manufacturing industry... a new frontier for our industrial revolution," said Xin Guobin, China's vice minister of industry, as he opened the conference.

But it is a delicate balancing act for Chinese policy-makers due to the potential for human job losses -- a 2016 World Bank report said automation could threaten up to 77 percent of jobs in China's current labour market.

Nonetheless a great robotic leap forward has already been made.

China is now the world's number one market for industrial robots with some 141,000 units sold last year, accounting for a third of global demand, according to the International Federation of Robotics, which says demand could rise an additional 20 percent per year until 2020.

"China has huge opportunities to increase the level of its industrial automation (and) industrial robotisation," said Karel Eloot, an expert at consultancy firm McKinsey.

He notes that China still has huge room for growth given that competitors like Japan and Germany have four times the level of robotisation in their factories compared to the Asian giant.

Qu Daokui, president of local firm Siasun, which was showing off a snake-like robot that can operate in narrow passages, said China needs to increase the quality and sophistication of its robots, particularly in the field of AI.

"We used to focus on the accuracy, reliability and speed of robots -- now it's their flexibility, intelligence and adaptability that makes the difference," he said, adding robots needed to interact and adapt to their environments and "make independent decisions".

710e649f5328e06f2fa8abee2ee0026af239616b.jpg
AFP/File / WANG ZHAO
Outside China's factories, robots are becoming a more visible presence, deployed in restaurants and banks and even delivering parcels


- Doctor Bot -

Outside China's factories, robots are becoming a more visible presence, deployed in restaurants and banks and even delivering parcels.

China's iFlytek, a specialist in speech recognition systems, presented a new "medical assistant" robot at the Beijing show which it said was able to help identify up to 150 diseases and ailments -- even passing a national medical qualification exam with a high score.

The robot, which operates in conjunction with a doctor, asks patients a series of diagnostic questions and can also analyse X-rays.

"It's already being used in hospitals since March and has made some 4,000 diagnoses," company president Liu Qingfeng said, adding such a device could be particularly useful for clinics in more remote parts of China.

Chindex, a subsidiary of the conglomerate Fosun, also distributes in China the "Da Vinci System", an American-built robot with arms and high-tech cameras to aid surgeons in the operating theatre.

"It transcends the limits of the (human) eye," chief operating officer Liu Yu enthused.

But like the diagnostic robot, it still needs a helping human hand.

"It only helps the doctor, it cannot replace them. It would not be ethical, the human body is still too complicated," he said.


Robot wars: China shows off automated doctors, teachers and combat stars | AFP.com
 
.
Scientists map out bread wheat genome
By Guo Kai | chinadaily.com.cn | Updated: 2018-08-20 11:04

5b7a5062a310add1c697b5b0.jpeg
Monks from the Shaolin Temple harvest wheat at a farm in Dengfeng, Henan province, June 21, 2018. [Photo/China News Service]

An international research team has recently completed the first sequence of the hexaploid bread wheat genome and published the results in the journal Science, with Chinese scientists contributing the research on chromosome 7DL.

Compared with other crops, the wheat genome is far more complicated to decipher: the highly repetitive hexaploid wheat genome is five times the size of the human genome and 40 times that of the rice genome.

The International Wheat Genome Sequencing Consortium separated the 21 chromosomes of the bread wheat variety Chinese Spring in 2005 and established corresponding bacterial artificial chromosome libraries so that individual teams could take on the relevant work.

The Chinese team, led by Song Weining, a professor at the State Key Laboratory of Crop Stress Biology for Arid Areas at Northwest A&F University (NWAFU), took on the work of deciphering, assembling and ordering chromosome 7DL.

Song said the project broke new ground in constructing and deciphering the physical map of the wheat genome, and that chromosome 7DL alone is as large as the entire rice genome.

The workload was much heavier than for rice genome sequencing, but with support from the government and related programs the team finished the mapping and sequence-deciphering work in about 10 years, he said.

The map gives researchers the genetic code needed to better understand how wheat grows and develops, and is significant for selective breeding.
 
.
First in China! Waterproof electrical socket unveiled
New China TV
Published on Aug 20, 2018

What happens if you drop an electrical socket on a live circuit into water? This one, invented by Chinese engineers, will turn common sense on its head. Find out in this video. Don't blink!
 
.
Reflecting on success
Source:Xinhua Published: 2016-4-28 1:03:01

7bc523b2-3eac-4054-9bca-44d88ddf3f24.jpeg
Engineers at the Changchun Institute of Optics, Fine Mechanics and Physics under the Chinese Academy of Sciences surround their newly developed silicon carbide mirror, which has a diameter of 4.03 meters. The larger the mirror, the higher the resolution and the greater the light-gathering power of the telescope or satellite in which it is used. Photo: Xinhua
China develops 4-meter large-aperture silicon carbide aspheric optical mirror | Xinhuanet
The 4-meter-class high-precision silicon carbide aspheric mirror, photographed on August 21.

On that day, after nine years of exploration and 18 months of grinding and polishing, a "big mirror" 4 meters in diameter and weighing 1.6 tonnes passed project acceptance at the Changchun Institute of Optics, Fine Mechanics and Physics of the Chinese Academy of Sciences. It is the latest achievement of the national major scientific research equipment development project "4-meter-scale high-precision silicon carbide aspheric mirror integrated manufacturing system", and it indicates that China's manufacturing technology for large-aperture silicon carbide aspheric optical mirrors has joined the ranks of the world's most advanced.

China develops large aperture optical mirror with high accuracy
Source: Xinhua| 2018-08-22 16:29:32|Editor: ZX


CHANGCHUN, Aug. 22 (Xinhua) -- China has developed a high accuracy four-meter-aperture optical mirror, an important tool for deep space and astronomical observation.

Developed by the Changchun Institute of Optics, Fine Mechanics and Physics of the Chinese Academy of Sciences, the silicon carbide aspheric optical mirror weighs 1.6 tonnes.

The silicon carbide used in production provides greater stability to the surface of the mirror, allowing a surface accuracy of 20 nanometers.

In addition to the mirror, the research group also developed the manufacturing equipment used for the mirror's production and owns the IP rights for the equipment.


1123305216_15348596690431n.jpg

1123305216_15348596689681n.jpg

a010254ac4e0f0d03155526f0767c8dd

The 4-meter-scale reaction sintering furnace
1f770060a24439fba67f8d9bb1f73292

The 4-meter-scale large magnetron sputtering coating machine
0da103e9f5dd5990f18af08e6e58ac45

Magnetorheological finishing being used to polish the 4-meter mirror
9f135bb2a961618edd1a6316def104d2

 
.
PHYSICS
'Spooky' Quantum Entanglement Confirmed Using Distant Quasars
Ryan F. Mandelbaum
Yesterday 5:10pm

qgqzbyutqhccwepnboyu.jpg
A quasar
Image: NASA


If you read enough science news, you’ll know that there’s a long list of experiments attempting to “prove Einstein wrong.” None have yet contradicted his hallmark theory of relativity. But the latest effort to falsify his statements surrounding “spooky action at a distance” has gone truly cosmic.

Scientists have long performed tests demonstrating that the quantum concept of “entanglement” forces us to accept something that doesn’t make much logical sense. But in order to get around loopholes in previous iterations of the test, which are conducted fully here on Earth, scientists lately have hooked their experiments up to telescopes observing the cosmos.

“We’ve outsourced randomness to the furthest quarters of the universe, tens of billions of light years away,” David Kaiser, one of the study’s authors from MIT, told Gizmodo.

Let’s start at the beginning: Quantum mechanics describes the universe’s smallest particles as having a restricted set of innate properties, which are mostly a mystery to us humans until we measure them. The math of quantum mechanics introduces the idea that two particles can become “entangled,” so their joint properties must be described with the same mathematical machinery. But here’s the problem: If you separate these particles to opposite ends of the universe and measure them, they’ll maintain this eerie connection; you can still infer the properties of one particle by measuring the other.

Einstein, along with Boris Podolsky and Nathan Rosen, thought that one of two things could cause this “spooky action at a distance,” as Einstein described it. Either the particles somehow communicate faster than the speed of light, which Einstein’s theories demonstrated is impossible, or there was hidden information humans weren’t accessing that ensured particles took on these correlated values in the first place.

But John Stewart Bell theorized that hidden information could never accurately recreate what quantum mechanics forces the particles to do. Scientists have devised increasingly complex ways to test this theory since the 1960s.

These tests usually look rather similar. Scientists generate pairs of entangled photons, each with one of two polarization states—imagine that, viewed from a certain angle, both photons are either small vertical lines or horizontal lines. The photons, if entangled, will have the same polarization state—though which one, horizontal or vertical, is a mystery until the measurement. The scientists send the photons to two distant detectors that measure the photons from two angles: the angle from which the polarization and entanglement are visible, or a different angle (if the photons are viewed from this different angle, they become unentangled). Each detector lies in wait for the particles—which, if everything lines up, will produce a simultaneous blip. These simultaneous blips should occur more frequently for sets of entangled particles than sets of unentangled ones.

Some percentage of simultaneous blips above a certain threshold would prove Einstein, Podolsky, and Rosen wrong—it would demonstrate that there are no hidden variables in the laws of physics predetermining the particles’ identities.
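
To make that threshold concrete, here is a minimal sketch of the arithmetic behind the most common form of such a test, the CHSH version of Bell's inequality. The correlation formula and angle choices below are textbook assumptions rather than details from the article: for ideal polarization-entangled photons, quantum mechanics predicts a CHSH quantity S of about 2.83, while any local hidden-variable account must keep |S| at or below 2.

```python
import math

# CHSH sketch (illustrative, not the experiment's actual analysis):
# E(a, b) is the quantum-predicted correlation between coincidence counts
# when the two detectors use polarizer angles a and b (in degrees).
def E(a_deg, b_deg):
    return math.cos(math.radians(2 * (a_deg - b_deg)))

a, a_alt = 0.0, 45.0      # the two settings for detector 1
b, b_alt = 22.5, 67.5     # the two settings for detector 2

S = E(a, b) - E(a, b_alt) + E(a_alt, b) + E(a_alt, b_alt)
print(f"S = {S:.3f}")     # ~2.828, above the classical limit of 2
```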

But there’s a loophole—perhaps the apparatus influences the measurement, somehow, and forces the photons to carry the same polarization? In order to prevent this, scientists randomly switch the detector between the two measurement angles. Then comes the next loophole: What if the random-number generator determining the measurement angle isn’t really random; what if what we see as randomness has actually been predetermined by the laws of physics that brought humans to this point?

Two teams of scientists got around this problem by hooking their random-number generator up to a pair of telescopes. In the more dramatic case, the team including Kaiser worked from two telescopes on La Palma in the Canary Islands: the Telescopio Nazionale Galileo, pointing at bright light sources called quasars on one side of the sky that emitted their light 7.78 billion and 3.22 billion years ago, and the William Herschel Telescope, pointing to a light source that emitted light 12.21 billion years ago. If each telescope observed light that was slightly bluer than a reference color, its corresponding detector would measure the light’s polarization in one setting. If the light was slightly redder, then the detector would use the other setting.

In a test of 30,000 pairs of particles, their polarizations correlated too closely to be explained by one of these local hidden variable theories, according to the paper published in Physical Review Letters. That means that any hidden force that could have influenced both particles would have needed to happen billions of years ago to somehow influence the way scientists measured these particles here on Earth. Or, the more likely explanation is that quantum mechanics remains spooky at a distance and can’t be explained by hidden variables. It appears that Einstein was wrong about this one.

The researchers took care to account for astronomical things that might have biased their measurements. For example, they chose a color of light to measure that wouldn’t be absorbed by interstellar gas, and they ensured that they took gravity and the universe’s expansion into account, explained Kaiser. The second, similar experiment, also published in Physical Review Letters, also observed the higher-than-classical correlations, bolstering both papers’ evidence.

Quantum mechanics’ weirdness continues to boggle minds. This weirdness is at the heart of the emerging field of quantum computers, which rely on entanglement in order to perform their calculations. Said Kaiser: “These devices are built on the assumption that entanglement is real.”

Scientists can perhaps further refine these tests by using light from even deeper in the universe.

It’s the job of physicists to test the laws of physics and ensure that they continue not to break. I hope by now it’s become utterly clear: In quantum physics, spookiness is a given.



'Spooky' Quantum Entanglement Confirmed Using Distant Quasars | GIZMODO

  1. Dominik Rauch, Johannes Handsteiner, Armin Hochrainer, Jason Gallicchio, Andrew S. Friedman, Calvin Leung, Bo Liu, Lukas Bulla, Sebastian Ecker, Fabian Steinlechner, Rupert Ursin, Beili Hu, David Leon, Chris Benn, Adriano Ghedina, Massimo Cecconi, Alan H. Guth, David I. Kaiser, Thomas Scheidl, and Anton Zeilinger. Cosmic Bell Test Using Random Measurement Settings from High-Redshift Quasars. Phys. Rev. Lett. (2018). DOI: 10.1103/PhysRevLett.121.080403
  2. Ming-Han Li, Cheng Wu, Yanbao Zhang, Wen-Zhao Liu, Bing Bai, Yang Liu, Weijun Zhang, Qi Zhao, Hao Li, Zhen Wang, Lixing You, W. J. Munro, Juan Yin, Jun Zhang, Cheng-Zhi Peng, Xiongfeng Ma, Qiang Zhang, Jingyun Fan, and Jian-Wei Pan. Test of Local Realism into the Past without Detection and Locality Loopholes. Phys. Rev. Lett. (2018). DOI: 10.1103/PhysRevLett.121.080404
 
.
iHuman Team at ShanghaiTech University Deciphers the First Human Frizzled Receptor Structure

Aug 23, 2018 - The Xu Laboratory at the iHuman Institute of ShanghaiTech University has deciphered the high-resolution crystal structure of the first human Frizzled receptor. As gatekeeping proteins regulating fundamental Wnt signaling in embryonic development and tumorigenesis, the Frizzled receptors (FZDs) have been cancer drug targets for decades without success. However, these results, which illustrate the unique architecture of Frizzled-4 in the ligand-free state and explain the long-standing hurdle of identifying potent ligands for this family of receptors, will benefit basic and therapeutic research that could lead to important new drug discoveries. These new findings are published today in the advanced online edition of the journal Nature, titled “Crystal structure of Frizzled 4 receptor in ligand-free state” by Yang et al.

“In order to understand why no one has been able to develop good tool ligands or drug molecules for FZDs and to tackle the mystery of FZD signaling, we solved the intact transmembrane domain structure of Frizzled4 receptor (FZD4) at 2.4 angstrom resolution,” said Fei Xu, Assistant Professor at iHuman Institute and the School of Life Science and Technology, ShanghaiTech University, and corresponding author of the study.

Surprising to the authors was the observation that the traditional orthosteric ligand binding site is very narrow making it hard for small molecules to enter or bind. “These findings improve our understanding of the FZD ligand and signaling, and show that ligand design for this pocket may require special considerations that could be inspired by this crystal structure,” said Fei Xu.

“To generate a more stable human FZD4 protein amenable for structure determination in the absence of a ligand (apo state), the team spent four years and screened hundreds of constructs, optimized purification procedures, and tried thousands of possible crystallization methods,” said Shifan Yang, postdoctoral fellow at iHuman Institute, and first author of the paper. “It turns out to be extremely challenging to obtain diffraction-quality crystals, likely due to the lack of a tool ligand to lock the flexible region in the protein.”

With further optimization of protein engineering, purification, and crystallization conditions, the team finally solved the structure of FZD4 in the apo state, which is the first structure of the Frizzled family GPCRs and the first apo structure of a ligand-regulated GPCR.

“For close personal reasons, I chose FZD4 four years ago when I started my postdoctoral career. It is an amazing protein, as it is not only a target for treating eye blindness given its critical role in retinal angiogenesis, but also a unique representative for us to understand other FZDs, which are emerging cancer drug targets,” said Shifan Yang.

“The orthosteric ligand binding pocket is very unique in FZD4 and other FZDs,” said Yiran Wu, Research Assistant Professor at iHuman Institute, and second author of this paper. “It is not only narrow, but also highly hydrophilic, making it difficult for traditional GPCR ligands to bind tightly.”

In addition to the discovery of a vacant pocket, this work also reveals an unusual packing of transmembrane helix VII and the short helix 8, providing new insight into a potential activation mechanism of this family of receptors. “Such a remarkable structure provides a more accurate template to redirect the efforts on Frizzled drug discovery,” said Raymond Stevens, Director of iHuman Institute, ShanghaiTech University.

Other key co-authors of this paper are: Shaowei Dong, Yuxiang Chen, Yu Guo, Suwen Zhao, Bingjie Zhang, Wenqing Shui, and Mengchen Pu from ShanghaiTech University; Ting-Hai Xu, Parker W. de Waal, Yuanzheng He, Zachary J DeBruine, Eric Xu, and Karsten Melcher from the Van Andel Research Institute; and Gye Won Han, Petr Popov, and Vsevolod Katritch from the University of Southern California. Financial support for this work comes, in part, from the Shanghai Municipal Government, ShanghaiTech University, and an NSF of China grant.

Full text link: http://dx.doi.org/10.1038/s41586-018-0447-x

5b7dff7b450b1.jpg
Overall structure of Frizzled4 and highlight of the orthosteric pocket. a, Side view of ΔCRD-FZD4 with transmembrane domain shown as cartoon and coloured in white. The extracellular side (hinge domain (green), ECL1 (red), ECL2 (blue) and ECL3 (orange)) pack together to form a compact structure. b, Residues engaged in the orthosteric pocket are shown as spheres coloured in magenta. The cyan region highlights the narrow part of this pocket.

5b7dff8f3f734.jpg
Dr. Fei Xu, corresponding author of the study.

5b7dff9d6d1b4.jpg
Artistic illustration of the Frizzled4 protein structure surfacing on the retina of an eyeball. It demonstrates the functional role of Frizzled4 in retinal angiogenesis and in maintaining the integrity of the blood-retinal barrier. Image created by Julie Liu.



iHuman Team at ShanghaiTech University Deciphers the First Human Frizzled Receptor Structure | iHuman Institute
 
.
NEWS AND VIEWS | 22 AUGUST 2018
Role for the longevity protein SIRT6 in primate development
Monkeys genetically engineered to lack the gene SIRT6 die a few hours after birth, displaying severe growth defects. This finding reveals a previously unknown role for the SIRT6 protein in primate development.

Shoshana Naiman & Haim Y. Cohen

For decades, biologists using model organisms such as mice and fruit flies have faced concerns about the relevance of their findings to humans. Using a model that is more evolutionarily similar to humans, such as another primate, could potentially close this frustrating gap. In a paper online in Nature, Zhang et al.1 use CRISPR–Cas9 gene-editing techniques to generate macaque monkeys lacking the gene SIRT6. Strikingly, they show that the SIRT6 protein has a role in embryonic development in macaques that was not previously uncovered in mice.



---> Role for the longevity protein SIRT6 in primate development | Nature.com

Weiqi Zhang, Haifeng Wan, Guihai Feng, Jing Qu, Jiaqiang Wang, Yaobin Jing, Ruotong Ren, Zunpeng Liu, Linlin Zhang, Zhiguo Chen, Shuyan Wang, Yong Zhao, Zhaoxia Wang, Yun Yuan, Qi Zhou, Wei Li, Guang-Hui Liu & Baoyang Hu. SIRT6 deficiency results in developmental retardation in cynomolgus monkeys. Nature (2018). DOI: 10.1038/s41586-018-0437-z
 
.
China Focus: Scientists find key gene related to primates' growth, lifespan
Source: Xinhua| 2018-08-23 11:03:42|Editor: mmm


BEIJING, Aug. 23 (Xinhua) -- Chinese scientists have identified a gene playing an important role in regulating the development and lifespan of primates through genome-editing technology and experiments on monkeys and human stem cells.

The study may open the door to new research on human development and aging, as well as new treatments for age-related disorders, said Liu Guanghui, a professor at the Institute of Biophysics of the Chinese Academy of Sciences (CAS).

Understanding the genetic basis of aging is the prerequisite to delaying aging and treating age-related illnesses, Liu said.

In 1999, scientists found that the Sir2 gene plays a role in prolonging the lifespan of Saccharomyces cerevisiae, a kind of yeast. Then scientists found that the SIRT6 gene, a homologue of Sir2, is involved in the regulation of aging and longevity in rodents.

SIRT6 deficiency in mice leads to features of accelerated aging such as loss of subcutaneous fat, spinal curvature, colitis and shortening of telomeres.

However, the biological function of SIRT6 in primates remains largely unknown.

A joint research team of scientists from the CAS biophysics and zoology institutes spent three years working out how to knock out the SIRT6 gene in different tissues of monkeys using CRISPR-Cas9-based gene editing technology.

The scientists injected the gene-editing tools into 98 monkey zygotes, of which 48 developed into normally formed embryos and were implanted into 12 surrogate mother monkeys. Four became pregnant; three gave birth to baby monkeys 165 days later, while one pregnancy was aborted.

They were the first primates deficient in the SIRT6 gene. However, unlike SIRT6-deficient mice, which show signs of premature aging about two to three weeks after birth, the SIRT6-depleted monkeys died within hours of birth. Notably, they exhibited severe prenatal developmental retardation.

"Our results for the first time suggest that SIRT6 is involved in regulating development in non-human primates, and might provide an insight into the research of human developmental disorders," Liu said.

In addition, Chinese scientists conducted in vitro experiments to generate SIRT6-null human embryonic stem cells. This showed that SIRT6 deficiency hindered the differentiation of stem cells to mature neurons.

Their study was published in the latest issue of Nature.

"In future research, we plan to test whether other longevity genes also play a similar role in monkey, and continue our research to unravel the mechanism of human aging," Liu said.
 
.
NEWS | 22 AUGUST 2018
230-million-year-old turtle fossil deepens mystery of reptile's origins
Two-metre-long specimen could help researchers pin down how and when turtles developed traits such as their shell.

Jeremy Rehm

d41586-018-06012-0_16060614.jpg
An artist’s impression of Eorhynchochelys sinensis, a turtle ancestor that lived about 230 million years ago. Credit: Yu Chen

A fossilized turtle discovered in southwestern China fills an evolutionary hole in how the reptiles developed features such as a beak and shell, researchers report1 on 22 August in Nature. Although the specimen can help scientists to pin down when modern turtles developed such characteristics, it’s also muddied the waters when it comes to illuminating the group’s origins.

The roughly 2-metre-long animal, dubbed Eorhynchochelys sinensis, lived about 230 million years ago. Its skull is similar to those of modern turtles, whereas the rest of the animal's skeleton is more like that of a predecessor that lived 10 million years earlier.

This new species fits almost perfectly into the evolutionary picture that researchers conceived years ago of how turtles acquired their signature features, says Rainer Schoch, an amphibian and reptile palaeontologist at the Stuttgart State Museum of Natural History in Germany. “We’re really happy to see this.”

Filling in the gaps
Turtles haven’t changed much over the past 210 million years. They all have a top shell formed from the fusion of their spine and ribs, a bottom shell that protects their belly, a sharp beak and a mouth without any teeth. But the group lacks a feature common to most modern reptiles — two pairs of holes in their skull, behind their eyes, where jaw muscles attached.

The absence of those holes has contributed to a decades-long debate on the exact position of turtles on the reptile family tree. And this has compounded researchers’ struggle to work out when and how turtle characteristics first evolved.

A specimen discovered in 2008, called Odontochelys semitestacea, offered the first clues2. The roughly 220-million-year-old animal possessed teeth and a bottom shell, and its wide ribs hinted at the beginnings of a top shell. But it lacked a beak and the pairs of holes in its skull.

Then, in 2015, scientists found Pappochelys rosinae, a 240-million-year-old specimen that was missing a top shell, but showed the first signs of a bottom shell3. Unlike modern turtles, P. rosinae had two pairs of openings in its skull, indicating for the first time that turtles were closely related to other modern reptiles.

Now, the discovery of Eorhynchochelys fills in the gap between these two species. The fossil turtle possesses a single pair of holes behind its eyes, suggesting a gradual transition from Pappochelys to modern turtles.

d41586-018-06012-0_16060616.jpg
Eorhynchochelys, the latest fossilized-turtle discovery, was more than 2 metres long. Credit: Xiao-Chun Wu

But the presence of a beak on the Eorhynchochelys skeleton is what really intrigued study co-author Xiao-Chun Wu, a palaeontologist at the Canadian Museum of Nature in Ottawa. It’s a trait that researchers hadn’t seen in early turtle fossils until now, and that seems to have disappeared in some species and reappeared in others millions of years later. This suggests that the evolution of a beak in modern turtles was not a straight path, Wu says.

Family ties
But even though Eorhynchochelys helps to demonstrate the acquisition of turtle traits, Schoch says, it’s not so informative about their place on the evolutionary tree.

Most genetic studies over the past 20 years have positioned crocodilians, dinosaurs and modern birds as the turtles’ closest evolutionary relatives. But some studies looking at DNA or RNA, as well as analyses of turtle anatomy, have pointed to lizards and snakes as the group’s closest relatives.

After including Eorhynchochelys’s physical characteristics in an analysis with those of other fossilized reptiles, however, Wu and his colleagues say that turtles aren’t as closely related to any of those groups as other research suggests. They’re more of an offshoot from earlier ancestors, Wu says.

Schoch is sceptical of this claim, however, saying that researchers don’t know enough about the anatomy of early reptilian ancestors to know for sure where turtles fall.

We just need to find out more about those early ancestors, Schoch says. “That is now the big problem and the next step that will have to be taken.”



230-million-year-old turtle fossil deepens mystery of reptile's origins | Nature.com

Chun Li, Nicholas C. Fraser, Olivier Rieppel, Xiao-Chun Wu. A Triassic stem turtle with an edentulous beak. Nature (2018). DOI: 10.1038/s41586-018-0419-1
 
.
Spallation neutron source passes assessment, checks
chinadaily.com.cn | Updated: 2018-03-26 10:12
5ab856f5a3105cdce09f4e3d.jpeg
A spectrometer detector at the spallation neutron source in Dongguan, South China's Guangdong province. [Photo/chinanews.com]

The Institute of High Energy Physics under the Chinese Academy of Sciences said Sunday that China's spallation neutron source in Dongguan, South China's Guangdong province, one of the country's most important scientific facilities, has passed assessment and checks by the CAS.

The spallation neutron source is the first of its kind in the country and the fourth worldwide, and will provide intense pulsed neutron beams for scientific research. It is a significant step toward solving problems at the frontiers of science.

Construction of the SNS started in September 2011, and total investment has reached 2.3 billion yuan ($364.4 million).

5ab856f5a3105cdce09f4e47.jpeg

5ab856f5a3105cdce09f4e41.jpeg

5ab856f5a3105cdce09f4e3f.jpeg

5ab856f5a3105cdce09f4e43.jpeg

China’s first spallation neutron source goes into operation
By Guo Meiping
2018-08-24 11:56 GMT+8

d4b4f970b5dd451f8f1ac725cf4eabc0.jpg
China's first spallation neutron source (SNS) officially began operating on Thursday, making the country the fourth in the world to possess such a facility.

The China Spallation Neutron Source, or CSNS, is considered a “super microscope” that can provide the most intense pulsed neutron beams for scientific research.

The equipment can accelerate protons before smashing them into a target to produce neutrons. The neutrons are then sent to numerous instruments that are used by researchers to study materials.

The “super microscope” is ideal for studying the microstructure of materials, said Chen Hesheng, manager of the CSNS project.

Chen added that the CSNS can be used for researching residual stress in large metal parts, which is vital for improving the performance of key components of high-speed trains, aircraft engines, and nuclear power plants.

a241491aa1ef4d5cb7652609da989ea5.jpg
An aerial view of the CSNS. /VCG Photo

The CSNS consists of a powerful linear proton accelerator, a rapid cycling synchrotron, a target station and three neutron instruments.

More than 90 percent of the equipment was based on independent research and development and can be domestically produced.

Construction of the CSNS project in China started in 2011 in Dongguan City, south China's Guangdong Province, with a total investment of around 2.3 billion yuan (364 million US dollars).

As one of the largest science and technology infrastructure projects in China, the equipment is expected to have positive effects in promoting the sciences, high-tech development, and national security.

[Top image via VCG]

(With input from Xinhua.)
 
.
Aging Connection
Study identifies molecular link between aging and neurodegeneration

By KEVIN JIANG August 23, 2018 Research

braingear_850_iStock-912219896.jpg

For decades researchers have worked to shed light on the causes of neurodegenerative disorders, a group of devastating conditions, including Alzheimer’s and Parkinson’s, that involve the progressive loss of neurons and nervous system function. In recent years, numerous factors, from genetic mutations to viral infections, have been found to contribute to the development of these diseases.

Yet age remains the primary risk factor for almost all neurodegenerative disorders. A precise understanding of the links between aging and neurodegeneration has remained elusive, but research from Harvard Medical School now provides new clues.

In a study published in Cell on Aug. 23, the research team describes the discovery of a molecular link between aging and a major genetic cause of both amyotrophic lateral sclerosis (ALS) and frontotemporal dementia (FTD), two related neurodegenerative diseases with shared genetic risk factors.

The findings, the researchers said, reveal possible new targets for treatment of these and other neurodegenerative diseases.

“Our study provides the first description of a molecular event that connects aging with neurodegeneration,” said senior study author Junying Yuan, the HMS Elizabeth D. Hay Professor of Cell Biology. “These insights are a critical step towards understanding the mechanisms by which aging predisposes individuals to neurodegeneration.”

The results also highlight the need for a better understanding of the biology of neurodegenerative diseases in the context of aging.

“Laboratory models of neurodegenerative diseases often have a missing element, and that is the contribution of age,” Yuan said. “We have to understand better the process in its totality, not just its isolated components, to better guide clinical trials and improve our chances of finding effective treatments for these devastating diseases.”

RIP rescue

Also known as Lou Gehrig’s disease, ALS is a progressive, incurable condition marked by the gradual death of motor neurons. It shares some clinical and genetic features with FTD, which is marked by an early and rapid onset of dementia.

Around one in 10 patients with both diseases carry genetic mutations that cause the partial dysfunction of a protein known as TBK1. Previous studies, including by Yuan and colleagues, have shown that TBK1 is involved in a form of programmed cell death and in neuroinflammation, a hallmark of neurodegenerative disorders. How TBK1 contributes to the development of ALS and FTD, however, was unclear.

In the current study, Yuan and colleagues modeled the reduced levels of TBK1 found in ALS and FTD patients by creating mice that had only one functional copy of the gene that produces TBK1. These mice were healthy and similar in appearance to normal mice. In contrast, those that lacked the gene entirely died before birth.

However, the team found that mice without TBK1 could be fully rescued—surviving birth and becoming healthy adults—by blocking the activity of RIPK1, another protein known to play a central role in programmed cell death, neuroinflammation and neurodegenerative disease. Further analyses revealed that TBK1 normally functions to inhibit the activity of RIPK1 during embryonic development.

This discovery prompted the researchers to investigate another protein, called TAK1, previously known to also inhibit RIPK1 function. When they looked at data on TAK1 expression in human brains, the scientists found that TAK1 expression declines significantly with age. In the brains of patients with ALS, TAK1 expression was further reduced compared with the brains of similarly aged people without ALS.

Brake it down

To model the interaction between partial loss of TBK1 and TAK1 with aging, the team created mice that expressed half the usual amount of TBK1. The mice also expressed half the usual amount of TAK1 in their microglia, the immune cells of the brain where TAK1 is normally most active.

With reductions in both TBK1 and TAK1, these mice displayed traits associated with ALS and FTD, including motor deficits, hind limb weakness, anxiety-like behavior in new environments and changes in brain chemistry. The mice had a reduction in the number of neurons in the brain, and increased motor neuron dysfunction and cell death.

When the team inhibited RIPK1 activity independently of TBK1 and TAK1, they observed a reversal in symptoms.

Like a pair of brakes on a bicycle, TAK1 and TBK1 appear to work together to suppress the activity of RIPK1, and even if one brake fails, the other can compensate, the researchers said. But if both begin to fail, RIPK1 activity increases, leading to cell death and neuroinflammation.

This may be why individuals with TBK1 mutations do not develop ALS and FTD until they become older, when TAK1 levels decline with age, Yuan said.

Several clinical trials are underway to test the safety and efficacy of drugs that block RIPK1 activity in neurodegenerative and chronic inflammatory diseases, and these findings support the rationale for those trials, Yuan added.

“I think the next couple of years will reveal whether RIPK1 inhibitors can help ALS and FTD patients,” she said. “I think our study makes us more confident those efforts might work.”

Despite numerous large-scale clinical trials, however, no effective therapeutics have yet been developed for neurodegenerative diseases. This research now establishes a model of study that incorporates both aging and genetic risk for ALS and FTD, which may have broad implications.

“Many trials have been launched based on data from studies in mice, but how can a two-year-old mouse, for example, fully reflect what happens in an 80-year-old patient with Alzheimer’s?” Yuan said. “We need to develop new thinking on how to model these diseases to incorporate the element of aging, and we think this study is an important step toward that goal.”

The scientists are currently investigating why TAK1 levels decline with age and its potential role in other neurodegenerative diseases.

Additional authors on the study include Daichao Xu, Taijie Jin, Hong Zhu, Hongbo Chen, Dimitry Ofengeim, Chengyu Zou, Lauren Mifflin, Lifeng Pan, Palak Amin, Wanjin Li, Bing Shan, Masanori Gomi Naito, Huyan Meng, Ying Li, Heling Pan, Liviu Aron, Xian Adiconis, Joshua Z. Levin and Bruce A. Yankner.

The study was supported by the National Institute of Neurological Disorders and Stroke, the National Institute on Aging and the National Institute of Mental Health of the National Institutes of Health (1R01NS082257, 1R01AG047231, RF1AG055521, RO1AG046174, RO1MH113279), the National Key R&D Program of China, the National Natural Science Foundation of China and the Chinese Academy of Sciences.

Image: wildpixel/iStockPhoto



Aging Connection | Harvard Medical School

Daichao Xu, Taijie Jin, Hong Zhu, Hongbo Chen, Dimitry Ofengeim, Chengyu Zou, Lauren Mifflin, Lifeng Pan, Palak Amin, Wanjin Li, Bing Shan, Masanori Gomi Naito, Huyan Meng, Ying Li, Heling Pan, Liviu Aron, Xian Adiconis, Joshua Z. Levin, Bruce A. Yankner, Junying Yuan. TBK1 Suppresses RIPK1-Driven Apoptosis and Inflammation during Development and in Aging. Cell, 2018; DOI: 10.1016/j.cell.2018.07.041
 
.
Can China build a US$145 million superconducting computer that will change the world?
Chinese scientists are embarking on a one-billion yuan, high-risk, high-reward plan to build low-energy top-performance computing systems

China is building a 1 billion yuan (US$145.4 million) “superconducting computer” – an unprecedented machine capable of developing new weapons, breaking codes, analysing intelligence and – according to official information and researchers involved in the project – helping stave off surging energy demand.

Computers are power-hungry, and increasingly so. According to an estimate by the Semiconductor Industry Association, they will need more electricity than the world can generate by 2040, unless the way they are designed is dramatically improved.

The superconducting computer is one of the most radical advances proposed by scientists to reduce the environmental footprint of machine calculation.

The concept rests on sending electric currents through supercooled circuits made of superconducting materials. The system results in almost zero resistance – in theory at least – and would require just a fraction of the energy of traditional computers, from one-fortieth to one-thousandth, according to some estimates.

INTO THE SUPER LEAGUE
Chinese scientists have already made a number of breakthroughs in applying superconducting technology to computers. They have developed new integrated circuits with superconducting material in labs and tested an industrial process that would enable the production of relatively low cost, sophisticated superconducting chips at mass scale. They have also nearly finished designing the architecture for the computer’s systems.

Now the aim is to have a prototype of the machine up and running as early as 2022, according to a programme quietly launched by the Chinese Academy of Sciences (CAS) in November last year with a budget estimated to be as much as one billion yuan.

If these efforts are successful, the Chinese military would be able to accelerate research and development of new thermonuclear weapons, stealth jets and next-generation submarines, with central processing units running at frequencies of 770 gigahertz or higher. By contrast, the fastest existing commercial processors run at just 5GHz.

ced72e2c-a844-11e8-851a-8c4276191601_1320x770_124439.JPG


The advance would give Chinese companies an upper hand in the global competition to make energy-saving data centres essential to processing the big data needed for artificial intelligence applications, according to Chinese researchers in supercomputer technology.

CAS president Bai Chunli said the technology could help China challenge the US’ dominance of computers and chips.

“The integrated circuit industry is the core of the information technology industry … that supports economic and social development and safeguards national security,” Bai said in May during a visit to the Shanghai Institute of Microsystem and Information Technology, a major facility for developing superconducting computers.

“Superconducting digital circuits and superconducting computers … will help China cut corners and overtake [other countries] in integrated circuit technology,” he was quoted as saying on the institute’s website.

But the project is high-risk. Critics have questioned whether it is wise to put so much money and resources into a theoretical computer design that is yet to be realised, given that similar attempts by other countries have ended in failure.

IN THE BEGINNING
The phenomenon of superconductivity was discovered by physicists more than a century ago. After the second world war, the United States, the former Soviet Union, Japan and some European countries tried to build large-scale, cryogenically cooled circuits with low electric resistance. In the US, the effort attracted the support of the government's spy agency, the National Security Agency (NSA), and the defence department because of the technology's potential military and intelligence applications.

But the physical properties of superconducting materials, such as niobium, were less well understood than those of silicon, which is used in traditional computers.

As a result, chip fabrication proved challenging, and precise control of the information system at low temperatures, sometimes close to absolute zero, or minus 273 degrees Celsius, was a headache. Though some prototypes were made, none could be scaled up.

Meanwhile, silicon-based computers advanced rapidly with increasing speed and efficiency, raising the bar for research and development for a superconducting computer.

e9a02748-a846-11e8-851a-8c4276191601_1320x770_124439.JPG


But those big gains from silicon seem to have ended, with high-end Intel Core i7 chips, for instance, having been on computer store shelves for nearly a decade.

And as supercomputers grow bigger, so too does their energy consumption. Today's fastest computers, the Summit in the US and China's Sunway TaihuLight, require 30 megawatts of power to run at full capacity, more than a Los Angeles-class nuclear submarine can generate. And their successors, the exascale supercomputers, capable of 1,000 petaflops, or 1 million trillion floating-point operations per second, are likely to need a stand-alone power station.
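
As a rough back-of-envelope illustration (our own arithmetic, not a projection from the project), combining the 30-megawatt figure above with the one-fortieth to one-thousandth energy estimates quoted earlier gives a sense of the scale of the hoped-for savings, ignoring the very real overhead of cryogenic cooling:

```python
# Illustrative arithmetic only, using the two figures quoted in this article.
full_load_watts = 30e6                 # ~30 MW for today's fastest machines
for fraction in (1 / 40, 1 / 1000):    # claimed superconducting energy range
    print(f"{fraction:g} of 30 MW is about {full_load_watts * fraction / 1e3:.0f} kW")
# -> 0.025 of 30 MW is about 750 kW
# -> 0.001 of 30 MW is about 30 kW
```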

Li Xiaowei, executive deputy director of the State Key Laboratory of Computer Architecture, who is well acquainted with the Chinese programme, said the main motivation to build a superconducting computer was to cut the energy demands of future high-performance computers.

“It will be a general-purpose computer capable of running different kinds of algorithms … from text processing to finding big prime numbers”, the latter an important method for decoding encrypted messages, according to Li.

Li would not give technical details of the machine under construction but he confirmed it would not be a quantum computer.
“It is built and run on a classical structure,” he told the South China Morning Post.

Instead of encoding information in bits with a value of one or zero, quantum computers use qubits, which act more like switches and can be a one and a zero at the same time. Most types of quantum computers also require extremely cold environments to operate.

Quantum computers are believed to be faster than classical superconducting computers but are likely to be limited to specific jobs and take a lot longer to realise. Many technologies, though, can be shared and moved from one platform to another.

THE RACE IS ON
China is not the only country in the race. The NSA launched a similar project in 2014. The Cryogenic Computing Complexity programme under the Office of the Director of National Intelligence has awarded contracts to research teams at IBM, Raytheon-BBN and Northrop Grumman to build a superconducting computer.

“The power, space, and cooling requirements for current supercomputers based on complementary metal oxide semiconductor technology are becoming unmanageable,” programme manager Marc Manheimer said in a statement.

“Computers based on superconducting logic integrated with new kinds of cryogenic memory will allow expansion of current computing facilities while staying within space and energy budgets, and may enable supercomputer development beyond the exascale.”

During the initial phase of the programme, the researchers would develop the critical components for the memory and logic subsystems and plan the prototype computer. The goal was to later scale and integrate the components into a working computer and test its performance using a set of standard benchmarking programs, according to NSA.

9564bf94-a847-11e8-851a-8c4276191601_1320x770_124439.jpg


The deadline and budget of the US programme have not been disclosed.

Back in China, Xlichip, an electronics company based in Shenzhen, a growing technology hub in the country’s south, confirmed on Tuesday that it had been awarded a contract to supply test hardware for a superconducting computer programme at CAS’s Institute of Computing Technology in Beijing.

“The client has some special requirements but we have confidence to come up with the product,” a company spokeswoman said, without elaborating.

Fan Zhongchao, researcher with CAS’s Institute of Semiconductors who reviewed the contract as part of an expert panel, said the hardware was a field-programmable gate array (FPGA), a reconfigurable chip that could be used to simulate and test the design of a large-scale, sophisticated integrated circuit.

“The overall design [of the FPGA testing phase] is close to complete,” he said.

There are signs that China is getting closer to its superconducting goal.

Last year, Chinese researchers realised mass production of computer chips with 10,000 superconducting junctions, according to the academy’s website. That compares to the more than 800,000 junctions a joint research team at Stony Brook University and MIT squeezed into a chip. But most fabrication works reported so far were in small quantities in laboratories, not scaled up for factory production.

Zheng Dongning, leader of the superconductor thin films and devices group in the National Laboratory for Superconductivity at the Institute of Physics in Beijing, said that if 10,000-junction chips could be mass produced with low defect rates, they could be used as building blocks for the construction of a superconducting computer.

CHIPPING AWAY
Zheng said China’s determination to develop the new technology would only be strengthened by the trade war with the United States. Many Chinese companies are reliant on US computing chips and the White House’s threats in May to ban chip exports to Chinese telecommunications giant ZTE almost sent the company to the wall.

“It is increasingly difficult to get certain chips from the US this year. The change is felt by many people,” he said.

But Zheng said China should not count on the superconducting computer technology to challenge US dominance. The US and other countries such as Japan had invested many more years in this area of research than China and although their investments were smaller they were consistent, giving them a big edge in knowledge and experience.

“One billion yuan is a lot of money, but it cannot solve all the remaining problems. Some technical issues may need years to find a solution, however intensive the investment,” Zheng said.

“Year 2022 may be a bit of a rush.”

22ddb40e-a846-11e8-851a-8c4276191601_1320x770_124439.jpg


Wei Dongyuan, a researcher at the Chinese Academy of Science and Technology for Development, a government think tank on science policies, said China should be more transparent about the programme and give the public more information about its applications.
“It can be used in weather forecasts or to simulate the explosion of new nuclear weapons. One challenge is to develop a new operating system. Software development has always been China’s weak spot,” he said.

Chen Quan, a supercomputer scientist at Shanghai Jiao Tong University, said superconducting was often mentioned in academic discussions on the development of the next generation of high-performance computers.
China was building more than one exascale computer, and “it is possible that one will be superconductive”, he said.

This article appeared in the South China Morning Post print edition as: China's search for billion-dollar brain heats up
 
.
American, Chinese scientists find two new vegetable fatty acids
Source: Xinhua| 2018-08-28 05:42:16|Editor: Mu Xuequan


WASHINGTON, Aug. 27 (Xinhua) -- American and Chinese scientists have discovered two new fatty acids in vegetable oils that could potentially be developed into high-quality lubricants.

The study, published on Monday in the journal Nature Plants, revealed that the two acids, Nebraskanic acid and Wuhanic acid, made up nearly half of the seed oil found in the Chinese violet cress. The acids are named for the homes of their discoverers, the University of Nebraska-Lincoln and Huazhong Agricultural University in Wuhan, China.

"People thought maybe they'd found everything there was to find," said Nebraska's Ed Cahoon, a George Holmes University Professor of biochemistry who co-authored the paper. "It's been at least several decades since somebody has discovered a new component of vegetable oil like this."

Fatty acids represent the primary components of vegetable oils, which are best known for their role in the kitchen but have also found use in biodiesel fuels, lubricants and other industrial applications.

Most off-the-shelf vegetable oils, such as canola or soybean oil, contain the same five fatty acids. Those conventional fatty acids all contain either 16 or 18 carbon atoms and feature similar molecular structures, according to the researchers.

By contrast, Nebraskanic and Wuhanic rank among a class of "unusual" fatty acids that contain fewer or more carbon atoms and uncommon molecular branches that stem from those carbons. Both have 24 carbon atoms.

All known fatty acids generally add two carbon atoms at the end of a four-step biochemical cycle, then continue doing so until assembly is complete.

But the Nebraskanic and Wuhanic acids seem to be assembled in a way rarely, if ever, seen outside of certain bacteria. Both acids appear to follow the traditional script until adding their 10th pair of carbon atoms, Cahoon said.

After reaching that milestone, though, the acids appear to skip the last two steps of the four-step cycle, twice cutting short the routine to accelerate the addition of the 11th and 12th carbon pairs.
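
A quick bit of bookkeeping, based only on the description above, shows how that shortcut squares with the 24 carbon atoms reported for both acids (the variable names are ours, purely for illustration):

```python
# Elongation adds one 2-carbon pair per cycle; per the description above,
# the first 10 pairs come from the full four-step cycle and pairs 11 and 12
# from the truncated version that skips the last two steps.
pairs_from_full_cycles = 10
pairs_from_truncated_cycles = 2
total_carbons = 2 * (pairs_from_full_cycles + pairs_from_truncated_cycles)
print(total_carbons)  # 24, matching the chain length of both new acids
```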

The process also leaves behind an oxygen-hydrogen branch, or hydroxyl group, in the fatty acid chain, according to the study.

"We believe that the fatty acids are linked to one another through the hydroxyl groups to form a complex matrix of fatty acids, which is quite different from how fatty acids are arranged in a typical vegetable oil," said Cahoon.

That unique assembly and structure could account for the corresponding oil's superior performance as a lubricant, which was tested at the University of North Texas (UNT).

Compared with castor oil, the violet cress oil reduced friction between steel surfaces by 20 percent at 25 degrees Celsius and roughly threefold at 100 degrees Celsius.

"When we saw the long-chain molecules and their arrangement, we knew the oil found in Chinese violet cress seeds would make an excellent lubricant," said Diana Berman, assistant professor of materials science and engineering at UNT.

"This oil doesn't just have the potential to supplement or replace petroleum-based oil; it can also replace synthetics. It is a renewable solution to a limited-resource problem," said Berman.

The team also managed to pinpoint two genes that, when activated, help kick-start production of the fatty acids, a step toward ramping up production of the oil to an industrial scale.

"With breeding and bringing in other germplasm, maybe we can make this plant into an industrial oilseed crop," Cahoon said. "Right now, the yield is less than half that of canola, but canola's been intensively bred for more than 50 years."
 
.
The strength of gravity has been measured to new precision
Pinpointing Big G could help refine mass measurements for Earth and other celestial objects
BY MARIA TEMMING
1:00PM, AUGUST 29, 2018

082718_MT_gravity-constant_feat.jpg
HOMING IN Gravity (illustrated here bending spacetime) has been notoriously hard to measure. Now two new lab experiments estimate the strength of gravity, or Big G, with record precision.
VCHAL/SHUTTERSTOCK


We now have the most precise estimates for the strength of gravity yet.

Two experiments measuring the tiny gravitational attraction between objects in a lab have measured Newton’s gravitational constant, or Big G, with an uncertainty of only about 0.00116 percent. Until now, the smallest margin of uncertainty for any G measurement has been 0.00137 percent.

The new set of G values, reported in the Aug. 30 Nature, is not the final word on G. The two values disagree slightly, and they don’t explain why previous G-measuring experiments have produced such a wide spread of estimates (SN Online: 4/30/15). Still, researchers may be able to use the new values, along with other estimates of G, to discover why measurements for this key fundamental constant are so finicky — and perhaps pin down the strength of gravity once and for all.

The exact value of G, which relates mass and distance to the force of gravity in Newton’s law of universal gravitation, has eluded scientists for centuries. That’s because the gravitational attraction between a pair of objects in a lab experiment is extremely small and susceptible to the gravitational influence of other nearby objects, often leaving researchers with high uncertainty about their measurements.
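
For reference, the relation in question is the familiar textbook form of Newton's law of universal gravitation (not a formula given in the article itself), in which the attractive force F between masses m1 and m2 separated by a distance r is set by G:

```latex
F = G \, \frac{m_1 m_2}{r^2}
```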

Weighing in
Two new measurements for the strength of gravity (red squares, with short error bars indicating uncertainty) fall close to or within the currently accepted range for Big G (shaded gray). The new estimates are much more precise than those from other experiments in the last 40 years (teal dots and longer error bars).

082918_MT_gravity-constant_inline_730_rev.png
S. SCHLAMMINGER/NATURE 2018

The current accepted value for G, based on measurements from the last 40 years, is 6.67408 × 10−11 meters cubed per kilogram per square second. That figure is saddled with an uncertainty of 0.0047 percent, making it thousands of times more imprecise than other fundamental constants — unchanging, universal values such as the charge of an electron or the speed of light (SN: 11/12/16, p. 24). The cloud of uncertainty surrounding G limits how well researchers can determine the masses of celestial objects and the values of other constants that are based on G (SN: 4/23/11, p. 28).

Physicist Shan-Qing Yang of Huazhong University of Science and Technology in Wuhan, China, and colleagues measured G using two instruments called torsion pendulums. Each device contains a metal-coated silica plate suspended by a thin wire and surrounded by steel spheres. The gravitational attraction between the plate and the spheres causes the plate to rotate on the wire toward the spheres.

But the two torsion pendulums had slightly different setups to accommodate two ways of measuring G. With one torsion pendulum, the researchers measured G by monitoring the twist of the wire as the plate angled itself toward the spheres. The other torsion pendulum was rigged so that the metal plate dangled from a turntable, which spun to prevent the wire from twisting. With that torsion pendulum, the researchers measured G by tracking the turntable’s rotation.

To make their measurements as precise as possible, the researchers corrected for a long list of tiny disturbances, from slight variations in the density of materials used to make the torsion pendulums to seismic vibrations from earthquakes across the globe. “It’s amazing how much work went into this,” says Stephan Schlamminger, a physicist at the National Institute of Standards and Technology in Gaithersburg, Md., whose commentary on the study appears in the same issue of Nature. Conducting such a painstaking set of experiments “is like a piece of art.”

These torsion pendulum experiments yielded G values of 6.674184 × 10−11 and 6.674484 × 10−11 meters cubed per kilogram per square second, both with an uncertainty of about 0.00116 percent.
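
A quick calculation with the two numbers just quoted (our own arithmetic, not the authors') shows why the results are still described as disagreeing slightly despite their record precision: the gap between them is roughly 0.0045 percent, several times each value's quoted 0.00116 percent uncertainty.

```python
# Relative spread between the two new G values quoted above
# (in m^3 kg^-1 s^-2); the order follows the article, which does not say
# which value came from which pendulum.
g1 = 6.674184e-11
g2 = 6.674484e-11
spread = (g2 - g1) / g1
print(f"relative spread ~ {spread:.2e} ({spread * 100:.4f} percent)")
# ~4.49e-05, about 0.0045 percent, versus ~0.00116 percent per-value uncertainty
```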

This record precision is “a fantastic accomplishment,” says Clive Speake, a physicist at the University of Birmingham in England not involved in the work, but the true value of G “is still a mystery.” Repeating these and other past experiments to identify previously unknown sources of uncertainty, or designing new G–measuring techniques, may help reveal why estimates for this key fundamental constant continue to disagree, he says.



The strength of gravity has been measured to new precision | Science News

Qing Li, Chao Xue, Jian-Ping Liu, Jun-Fei Wu, Shan-Qing Yang, Cheng-Gang Shao, Li-Di Quan, Wen-Hai Tan, Liang-Cheng Tu, Qi Liu, Hao Xu, Lin-Xia Liu, Qing-Lan Wang, Zhong-Kun Hu, Ze-Bing Zhou, Peng-Shun Luo, Shu-Chao Wu, Vadim Milyukov & Jun Luo. Measurements of the gravitational constant using two independent methods. Nature (2018). DOI: 10.1038/s41586-018-0431-5
 
.