Author Archive

The first telescope of a Cherenkov Telescope Array

  • A multinational effort is developing the Cherenkov Telescope Array to study high-energy galactic radiation sources
  • The array takes advantage of the Cherenkov effect, which occurs when gamma rays hit the atmosphere
  • The inaugurated telescope was to be the first operated by the Cherenkov Telescope Array Observatory

On Wednesday, October 10th, 2018, on the Canary Island of La Palma, more than 200 people from around the world met at the northern site of the Cherenkov Telescope Array (CTA). They met there for a unique occasion: the inauguration of the first Large-Sized Telescope (LST) of the Cherenkov Telescope Array.

Cherenkov radiation is emitted when a charged particle travels through a dielectric (i.e. electrically insulating) medium above a characteristic speed threshold. The effect can be seen, for instance, around underwater nuclear reactors, which are surrounded by a blue glow as a result. In an astronomical context, high-energy gamma rays impact the atmosphere, generating showers of charged particles able to surpass the required speed threshold. Cherenkov telescopes detect these particle showers indirectly, by capturing the faint Cherenkov light radiated by the charged particles in the showers.
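The speed threshold mentioned above follows directly from the refractive index of the medium: a charged particle radiates only when v > c/n. A minimal sketch of the resulting threshold energies for electrons (the refractive indices below are approximate illustrative values, not figures from the article):

```python
# Sketch: the Cherenkov condition is v > c/n, i.e. beta > 1/n.
# For a particle of rest energy E0, the threshold total energy is
#   E_thr = gamma * E0, with gamma = 1 / sqrt(1 - 1/n^2).
import math

def cherenkov_threshold_energy_mev(rest_energy_mev, n):
    """Total energy (MeV) above which a particle emits Cherenkov light
    in a medium of refractive index n."""
    beta = 1.0 / n                       # threshold speed as a fraction of c
    gamma = 1.0 / math.sqrt(1.0 - beta**2)
    return gamma * rest_energy_mev

E0_ELECTRON = 0.511                      # electron rest energy, MeV

# Water (n ~ 1.33): why reactor pools glow blue
e_water = cherenkov_threshold_energy_mev(E0_ELECTRON, 1.33)

# Air at sea level (n ~ 1.0003): a much higher threshold, which is why
# only the very energetic particles in an air shower radiate
e_air = cherenkov_threshold_energy_mev(E0_ELECTRON, 1.0003)

print(f"Electron threshold in water: {e_water:.2f} MeV")
print(f"Electron threshold in air:   {e_air:.1f} MeV")
```

In water the threshold for electrons comes out at roughly 0.8 MeV of total energy, while in thin air it rises to around 21 MeV, illustrating why atmospheric Cherenkov light comes only from the fast shower particles spawned by very energetic gamma rays.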

The Cherenkov Telescope Array

The Cherenkov Telescope Array project was conceived to build a next-generation ground-based gamma-ray observatory. The CTA was planned to consist of two arrays of Imaging Atmospheric Cherenkov Telescopes (IACTs), split between the two hemispheres of the Earth. The Northern Hemisphere array, located in La Palma, would emphasise the study of extragalactic objects at the lowest possible energies, while the Southern Hemisphere array, located at Cerro Paranal (Chile), would cover the full range of energies, concentrating on radiation sources inside our galaxy.

High-speed charged particles crossing the water around an underwater nuclear reactor generate Cherenkov radiation

The CTA will study in detail the spatial structure, light curves and energy spectra of close to a thousand astronomical sources, through very high-energy gamma-ray astronomy based on the observation of Cherenkov radiation. The CTA builds on the successes and experience of previous instruments such as the High Energy Stereoscopic System (H.E.S.S.) and the Major Atmospheric Gamma Imaging Cherenkov Telescopes (MAGIC telescopes).

On the one hand, the CTA uses telescope arrays and stereoscopic analysis, dramatically improving sensitivity. On the other, it exploits large telescopes to attain the lowest possible energy threshold. Taken together, this facilitates the detailed study of the universe through its most energetic radiation: gamma rays. By observing the results of their impact on the atmosphere, it will be possible to study the most extreme fundamental physics and astrophysical phenomena.

The Telescope

The telescope inaugurated in October 2018, called LST-1, was the first of four Large-Sized Telescopes (LSTs) at the northern site of the CTA Observatory, located at the Roque de los Muchachos Observatory (island of La Palma, Canary Islands) and managed by the Instituto de Astrofísica de Canarias. Once the northern array is fully finished, it will also have fifteen Medium-Sized Telescopes (MSTs) installed.

On October 9th, 2015, a ceremony marked the laying of the first stone of the LST-1. Once the foundations of the telescope were complete, in January 2017, the team continued through construction milestones such as the installation of the rail system (September 2017) and of the mirrors (December 2017). The structure of the LST-1 was installed in February 2018, and the camera support followed in June of that year. The camera itself was finally installed on September 25th, 2018.

The LST team comprises over 200 scientists from ten countries: Brazil, Croatia, France, Germany, India, Italy, Japan, Poland, Spain and Sweden. In this international context, design and management were undertaken jointly by the Annecy Laboratory of Particle Physics (LAPP); the Max Planck Institute for Physics in Munich; the National Institute for Nuclear Physics of Italy (INFN); the Institute for Cosmic Ray Research (ICRR) of the University of Tokyo; the Institute for High Energy Physics (IFAE) in Barcelona; and the Centro de Investigaciones Energéticas, Medioambientales y Tecnológicas (CIEMAT) in Madrid. The institutions participating in the construction also include the Institute of Cosmos Sciences of the University of Barcelona (ICCUB) and the Institute of Space Sciences (IEEC).

Major Atmospheric Gamma Imaging Cherenkov Telescope II (MAGIC II), another Cherenkov-effect based telescope

IFAE and local contributions to the LST-1 project

Of the contributing institutions, three Catalan research institutions had an important role in the technological development of the LST-1. The Institute for High Energy Physics (IFAE) was responsible for the coordination, control and assembly of the mechanical system for the ground anchoring and rotation of the telescope. The Institute of Cosmos Sciences of the University of Barcelona (ICCUB) contributed to the design of one of the devices for the amplification of the signal. Finally, the Institute of Space Sciences participated through the development of the control software and scheduler. All three institutions contributed to the definition of the scientific objectives of the project.

Gamma ray detection

Besides the LST, two more types of telescopes are needed to study the energy range between 20 gigaelectronvolts (GeV) and 300 teraelectronvolts (TeV): telescopes of small and medium size. As low-energy gamma rays produce little Cherenkov light, the telescopes need large mirrors to capture the images. Accordingly, four LST telescopes will be located at each of the northern and southern sites of the CTA Observatory, covering the 20–150 GeV low-energy sensitivity range of the CTA.

The LST, with a parabolic reflecting surface 23 metres in diameter, is held in place by a tubular structure made of carbon fibre and steel pipes. The 400-square-metre reflecting surface captures and focuses the Cherenkov light onto the camera, where photomultiplier tubes convert it into electric signals that are processed by the camera electronics. Even though the LST-1 is 45 metres tall and weighs about 100 tonnes, it is extremely agile and can reposition itself in only 20 seconds in order to acquire the brief signals of low-energy gamma rays.
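The quoted figures can be sanity-checked with a few lines of arithmetic. This is only a sketch: the 180° worst-case turn is an assumption of ours, since the article quotes just a 20-second repositioning time.

```python
# Sketch: sanity-check the LST figures quoted above.
import math

diameter_m = 23.0
# Flat-aperture area of a 23 m dish; the true curved mirror surface is
# slightly larger, so this should land near the quoted ~400 m^2.
aperture_area = math.pi * (diameter_m / 2) ** 2

reposition_s = 20.0
# Assumption (not from the article): a worst-case repositioning is a 180° turn.
slew_rate_deg_s = 180.0 / reposition_s

print(f"Aperture area: {aperture_area:.0f} m^2")
print(f"Average slew rate: {slew_rate_deg_s:.0f} deg/s")
```

The aperture area comes out at about 415 m², consistent with the ~400 m² of mirror surface, and a half-turn in 20 seconds corresponds to an average slew rate of about 9 degrees per second for a 100-tonne structure.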

Artist representation of all three classes of CTA telescopes planned for the southern hemisphere at Paranal Observatory

The LST will extend the reach of science to cosmological distances and to fainter sources with soft energy spectra. Both the repositioning speed of the telescopes and the low energy threshold they provide are key elements for the study of transient gamma-ray sources in our galaxy, and for the study of active galactic nuclei and gamma-ray bursts at high redshift.

This inaugurated telescope was expected to become the first LST of the CTA and the first telescope in the array to be operated by the CTA Observatory (CTAO). However, like every technical delivery in the CTA multinational project, a stringent review process awaited the LST-1 to verify that the design meets the scientific objectives of the CTA, its operational needs, safety standards and so forth, before being accepted by the CTAO.

The website of the inauguration of the LST-1 offers information in other languages as well as supplementary materials. In the meantime, the project has continued to advance. The current status of the project, which is set to provide astrophysics with a wealth of new cosmological data, can be found online.

Image credits:

Cherenkov radiation from underwater reactor was downloaded from Wikipedia and licensed via a Creative Commons Attribution-ShareAlike 2.0 Generic (CC BY-SA 2.0) license.

Major Atmospheric Gamma Imaging Cherenkov Telescope II (MAGIC II) was downloaded from Wikimedia Commons and licensed via an Attribution-ShareAlike 3.0 Unported (CC BY-SA 3.0) license.

Artist representation of South CTA telescopes was downloaded from Wikimedia Commons and licensed via an Attribution 4.0 International (CC BY 4.0) license.

Unprecedented rains decimate surface microbial communities in the hyperarid core of the Atacama Desert

  • Rains in recent years decimated the microbial species of the driest desert on Earth
  • Extinctions of up to 85% of the species took place
  • Parallels are drawn between the Atacama Desert and the surface of Mars

Atacama: the setting for a microscopic extinction

The Atacama Desert covers a 1,000-km-long strip of land in northern Chile. Depending on the estimate, it has an extension of between 105,000 km² and 128,000 km². A territory of sand, stony terrain, salt lakes and magmatic terrain stretching from the coast to the Andes mountain range, the Atacama Desert has been termed the driest and oldest desert on Earth. At its core, it contains some of the most hyperarid soils on the planet.

The conditions at Atacama result in some of the most hyperarid environments on Earth, approaching aspects of planet Mars

There, for at least the previous 500 years, there had been no rain. This changed, however, in recent years, when rain episodes were recorded for the first time in the hyperarid soil of Atacama. Surprisingly, the sudden abundance of water was devastating for the existing microbial life. That was the conclusion of a study published in Nature Scientific Reports by an international group of researchers from the Centro de Astrobiología (CAB, CSIC-INTA).

Scientists attribute these unexpected rains to global climate change. Contrary to what might have been expected, the presence of water did not mean a flourishing of life in the Atacama Desert. Quite the contrary: the rains caused huge devastation among the microbial species present before the rainfall.

According to the results, the heavy rainfall caused a mass extinction event among the indigenous microbial species, reaching levels of up to 85% of those species. Such high mortality came as a result of the osmotic stress caused by the sudden abundance of water. The native microorganisms, perfectly adapted to life under conditions of extreme dryness and optimised to extract the scarce environmental humidity, were unable to cope with the sudden flooding and died under the water-excess conditions.

Images of bacterial species identified in the Atacama lagoons (left) and phylogenetic diagram (right), from original article

The study represented a breakthrough in understanding the microbiology of extremely arid environments and presented a new paradigm for understanding the evolutionary route of the hypothetical early Martian microbiota.

A step forward in the microbiology of extreme arid environments

Mars, a hyperarid planet, also experienced catastrophic floods in ancient times. Mars had a first period, the Noachian (between 4.5 and 3.5 billion years ago), in which there was abundant water on its surface. We know this from preserved hydrogeological evidence, in the form of ubiquitous hydrated minerals on the surface and traces of rivers, lakes, deltas and a possible hemispheric ocean in the northern plains. Later on, Mars lost its atmosphere and hydrosphere and became the dry and arid world we know today.

However, during a later period, the Hesperian (from 3.5 to 3 billion years ago), large volumes of water carved the surface of Mars, forming outflow channels, the largest found in the Solar System. If microbial communities were still living under conditions of extreme desiccation, they would have undergone osmotic stress processes similar to those observed in Atacama.

A possibility arising from the conclusions of the Atacama study is that the recurrence of liquid water on Mars could have contributed to the extinction of Martian life, if it ever existed, rather than becoming an opportunity for the re-flowering of resilient microbiota.

Nitrogen in the soil – parallels with Atacama

The Atacama study notes that the large nitrate deposits in the Atacama Desert offer evidence of long periods of extreme dryness in the past. The nitrates were concentrated in valley bottoms and former lakes by sporadic rains about 13 million years ago. Nitrates happen to be suitable nutrients for certain microbes.

The Atacama nitrates may represent a convincing analogue to the nitrate deposits that were discovered on Mars by the rover Curiosity (as reported in a 2015 study about evidence of indigenous nitrogen in Martian solid samples, in the Proceedings of the National Academy of Sciences).

The rover Curiosity provided data about the presence of nitrogen in the form of nitrates at the Martian surface

Earlier on, CAB researcher Fairén and colleagues had published a related work in Nature Astronomy. The researchers investigated the formation of clay during brief periods of warmer, wetter conditions on ancient Mars. They discovered the sporadic appearance of short-lived wet environments on early Mars, even then a generally hyperarid planet. This finding could, in part, help explain the observed Martian mineralogy.

Altogether, long periods of dryness followed by short wet periods could be at the origin of the nitrate deposits on Mars, similar to those found in the Atacama Desert. The combined understanding of the mineralogy, climate and chemistry of extreme sites on Earth, such as the Atacama Desert, has been shown to allow a better understanding of the past and present of Mars. With several nations planning missions to Mars, the planet, and the research around it, will certainly remain in the spotlight of planetary exploration in the years to come.

Image credits:

Atacama Desert landscape is in the public domain and was downloaded from Pixabay.

Atacama microorganisms figure was downloaded and modified (cropped and cleaned from text elements) from the original research article at Nature, and was licensed via an Attribution 4.0 International (CC BY 4.0) license.

Rover Curiosity picture is in the public domain and was downloaded from Pixabay.

New harvest of ERC Starting Grants

  • 480 ERC Starting Grants have been awarded across Europe and associated countries
  • 20 of the grants went to Spain, with five SOMMa institutions holding a total of seven of them
  • The awardees receive up to 1.5 million Euro each, to be spent over a five-year period

The ERC Starting Grants

480 ERC Starting Grants were awarded at the start of September to researchers from the EU and a number of associated countries. The grants, worth a total of 621 million Euro, give outstanding early-career researchers up to 1.5 million Euro each, to spend over a period of five years. These funds will allow them to build their own teams while developing ambitious, impactful projects. Extremely competitive, these grants are in the top tier of the funding opportunities available to European researchers.

Of this year's awarded grants, 20 went to Spain-based researchers. Seven of those went to five SOMMa institutions: the Center for Cooperative Research in Biomaterials (CIC biomaGUNE), the Centre for Genomic Regulation (CRG), the Institute for Bioengineering of Catalonia (IBEC), the Institute of Photonic Sciences (ICFO) and the Institute of Mathematical Sciences (ICMAT).

The SOMMa awardees:

Center for Cooperative Research in Biomaterials

At CIC biomaGUNE, researcher Alberto Elosegui-Artola will study the extracellular matrix with his project VISCOMATRIX. This component of pluricellular organisms is a three-dimensional network of biomolecules that surrounds cells. The extracellular matrix constitutes a local environment in which cells find themselves immersed in a scaffolding that influences their overall mechanical properties, which in turn influences biological processes such as wound healing and cell growth. Elosegui-Artola's project addresses a lesser-studied aspect of the mechanical properties of the extracellular matrix: its viscoelasticity. There is evidence that viscoelasticity can play a relevant role in processes such as cancer and fibrosis, as well as in regenerative medicine.

Centre for Genomic Regulation

The CRG was awarded three new ERC grants. With her project EPICAMENTE, researcher Sara Sdelci will study enzymes critical for cancer cells to divide in an uncontrolled way. The newly obtained knowledge would support the design of drugs targeting processes key to cancer cell growth, while selectively sparing healthy cells from negative effects.

Several of the awarded projects are in fields connected to biomedicine and affine disciplines

The project EvoCellMap of Arnau Sebé will study the molecular foundations behind the differences between cell types in different organisms. The study, spanning several species, will examine cell types as different as neurons and muscle cells, exploring their common foundations. The research could help answer a question with profound implications: why cellular life exists.

Finally, the project SYSAGING went to researcher Nicholas Stroustrup, who will devote his efforts to developing new experimental tools and mathematical frameworks for the molecular context of aging. These tools would help understand how certain interventions at the molecular level can result in systemic outcomes, effectively modifying the aging process.

Institute for Bioengineering of Catalonia

The project PANDORA by Loris Rizzello, at the Institute for Bioengineering of Catalonia (IBEC), won another of the awards. The proposal fuels a possible paradigm shift in the way infectious diseases caused by intracellular pathogens are treated. The aim of the project is to find a universal cure for such infectious diseases, in the process also tackling the rise of antibiotic-resistant bacteria. Focusing on tuberculosis, the project will study the molecular signatures exhibited by infected cells. From such signatures, it aims to recognise and selectively target the infected cells with nanoparticles, while sparing healthy ones.

Institute of Photonic Sciences

Dmitri Efetov of ICFO was awarded an ERC Starting Grant to undertake SuperTwist, a project aimed at better understanding superconductivity in a particular sort of graphene: “magic”-angle graphene. This is a form of graphene in which “misaligned” stacks of the material exhibit superconductivity, as well as other peculiar physical properties. One of its defining characteristics is its superconducting order parameter, which the project will determine for “magic”-angle graphene. As no current method can measure it directly, and key techniques are moreover missing for such nano-scale materials, novel approaches will have to be implemented that combine disruptive measurement techniques with materials science.

New methods will be developed, overcoming challenges that cannot be addressed by conventional techniques and procedures

Institute of Mathematical Sciences

Closing the list of SOMMa member awardees is researcher Javier Gómez Serrano of ICMAT. The project CAPA will address the generation of methods and ideas to find mathematical singularities and ways to fully describe the evolution of fluids regardless of their initial state. This is of particular relevance to the modelling of waves and of hot and cold weather fronts, which relate to the occurrence of storms. Other questions to be addressed fall into a field known as spectral mathematics, which includes questions such as: can the shape of a drum be inferred from its sound? Novel mathematical methods will be proposed, allowing the treatment of problems that cannot be studied by means of the long-standing “pen and paper” or “chalk and blackboard” approach.

After the five-year period of the grant, this septemvirate of researchers will doubtless have had the chance to make valuable contributions to science. As they consolidate their careers and contribute with their research, we wish them success in their endeavours.

Image credits:

Wet lab picture was cropped from a larger Wet lab picture in the public domain, and downloaded from Pixnio.

Innovation and research picture was downloaded from Flickr and licensed via an Attribution-ShareAlike 2.0 Generic (CC BY-SA 2.0) license.

Severo Ochoa – Maria de Maeztu institutions meet with autonomous communities and the Ministry

  • A meeting took place between the Ministry of Science, Innovation and Universities, senior officials of 15 Spanish Autonomous Communities and 42 members of the SOMM alliance
  • Six centres and units of excellence “Severo Ochoa–Maria de Maeztu” showed the potential impact of the programme for Autonomous Communities
  • “The support of autonomous communities is essential for Spain”, declares Luis Serrano, president of the SOMM alliance

The Ministry, autonomous communities and Severo Ochoa / Maria de Maeztu institutions together

On Monday, September 9th, a meeting took place between the Ministry of Science, Innovation and Universities, the Severo Ochoa and Maria de Maeztu centres and units of excellence, and senior officials and counselors of 15 autonomous communities. The “Severo Ochoa – Maria de Maeztu” programme of excellence identifies and promotes excellence in scientific research. Its objective is to recognise the research institutions at the forefront of their fields, and to boost their impact, international scientific leadership and competitiveness.

The event took place at the Ministry and was co-presided over by the Minister, Pedro Duque, and by Luis Serrano, president of the SOMM alliance. Also present were the director of the Spanish State Agency of Research, Enrique Playán, and the first vice president of SOMMa, Maria Blasco.

The minister of Science, Innovation and Universities, Pedro Duque, stressed “the firm support of the Government for the Severo Ochoa and Maria de Maeztu programme of excellence”, highlighting “the importance of the Autonomous Communities being able to host such centres, which would be to the benefit of their science and innovation systems.” The minister moreover declared that “the level of the selection requirements will remain as stringent as it currently is, which is why the objective is for more excellent research institutions to emerge, something that requires the support of the various governments.”

As Luis Serrano declared, “this meeting is symptomatic of a constructive and positive stance with regard to research in Spain. The support and interest of the autonomous communities in research is essential if Spain is to become a country with a truly knowledge-based economy. The Severo Ochoa and Maria de Maeztu programme is a powerful lever which autonomous communities can use to strengthen their research institutions.”

Table of the meeting with autonomous communities: Luis Serrano, Rafael Rodrigo, Pedro Duque, Enrique Playán and Maria Blasco

Success stories on display

The meeting at the Ministry aimed to emphasise the impact of the Severo Ochoa – Maria de Maeztu programme, its benefits for the consolidation of research excellence, and its potential to sustain, and act as a springboard for, the research poles of the autonomous communities.

Six centres and units of research presented their paths towards obtaining the award and the impact the prize has had. A round of questions and answers at the end of the event allowed the autonomous communities to learn how they can benefit from the programme.

During the event, the Institute for Cross-Disciplinary Physics and Complex Systems (IFISC, Balearic Islands), the Galician Institute of High Energy Physics (IGFAE, Galicia), the Gene Expression and Morphogenesis department (GEM-DCM2, CABD, Andalusia), the Instituto de Neurociencias de Alicante (IN – CSIC – UMH, Valencian Community), the Basque Center for Applied Mathematics (BCAM, Basque Country) and the Instituto de Astrofísica de Canarias presented their cases, altogether making clear the presence of excellent research institutions across the whole of the Spanish geography.

Group picture of SOMMa directors after the meeting

Image credits:

Photographs were kindly provided by staff of the Spanish Ministry of Science, Innovation and Universities.

Young researchers engineer a prospective life-saving bacterium – a possible metastasis prevention probiotic

  • A group of students take their baptism in research: they develop a proof of concept for a probiotic to prevent cancer metastasis
  • The young researchers engineer a bacterium to take up and consume long-chain fatty acids, whose dietary abundance is a major risk factor for metastasis
  • With further development, this idea could eventually make the leap into a real application, saving many lives

iGEM and the Giant Jamboree

This is a story that starts with an announcement on a notice board at the Pompeu Fabra University (UPF). A group of young students would find and read it, forming as a result a team that would make it to Boston to participate in iGEM, an international competition in Synthetic Biology.

The culmination of the efforts of these students in the iGEM 2018 competition came with iGEM’s Giant Jamboree, which took place in Boston, Massachusetts, from October 31st to November 4th: four full days of experience sharing, cooperation, competition against themselves in the striving for improvement, and celebration of the achievements of the participating teams.

iGEM, an independent non-profit whose name stands for “International Genetically Engineered Machine”, traces its origins back to January 2003, when a group of teachers at the Massachusetts Institute of Technology (MIT) devised a new study course. The initially small course, perhaps unexpectedly, grew into a team competition the next year, and kept growing in the years that followed.

The MIT campus in Boston, Massachusetts, birthplace of iGEM

By 2016, the competition already gathered more than 5,000 participants. The growth continued, with the 2018 edition hosting over 340 teams from 40 countries, totalling 5,790 participants. Today, iGEM is an initiative that works not only to promote the education of young scientists, but also to advance the field of synthetic biology, a discipline combining principles of biology and engineering to create living systems that solve specific problems.

The team and how it came about

With the support of three SOMMa members, the Department of Experimental and Health Sciences at Universitat Pompeu Fabra (DCEXS – UPF), the Information and Communication Technologies Engineering Department at Universitat Pompeu Fabra (DTIC – UPF) and the Centre for Genomic Regulation (CRG), these students launched their quest as far back as 2017.

Hundreds of teams of pioneering young scientists developed their projects, in a process that would take them up an intensive path of learning and challenges. The Barcelona team was composed of UPF students of human biology, biomedical engineering, mathematics and telecommunications engineering. Leveraging their collective skills and drive, they developed a proof-of-concept probiotic for preventing metastasis. Its eventual translation into a successful, real application could prove its potential to save many lives: as many as 90% of cancer deaths are caused by metastasis.

The iGEM 2018 Barcelona Team in Boston

The project was launched well in advance, with a large fraction of the time dedicated to fundraising, coordinated by the students themselves. Eventually, enough funds could be secured via crowdfunding, together with contributions from a number of companies such as Promega. This was non-trivial, as the team noted: the stronger the financial base, the more time and effort could be allocated to the scientific development, in terms of documentation and laboratory work, as well as to the computational modelling tasks involved. Throughout the development of the project, the team retained the last word on each decision, giving them a well-deserved feeling of ownership of their success.

The science and the idea behind it, at a glance

The original idea came from a publication in which metastatic cells were found to express high amounts of a certain cell receptor called CD36, resulting in an enhanced uptake of long-chain fatty acids (LCFAs) by those cells. In addition, it was known that a high presence of LCFAs in the human diet is linked to an increased risk of developing metastasis in cases of cancer. The proposal that arose from this was to develop a bacterium that would not only take up long-chain fatty acids from the diet, but also metabolise them more efficiently.

The team would thus engineer bacteria that, used as a probiotic, could decrease the absorption of LCFAs by humans, thereby decreasing the risk of metastasis. This bacterium would take the name Gargantua, in honour of the famous (and voracious) character of the 16th-century French novels written by François Rabelais.

It was initially proposed that the bacterium mimic the same CD36 receptor used by metastatic cells. This quickly proved not to be easily feasible, so a bacterial alternative was found: the protein FadL, which would have a similar effect and perform an equivalent function. Together with the closely related protein FadD and a system of complementary proteins, not only absorption but also greatly enhanced degradation of fatty acids would become possible.

The project engineered E. coli to decrease LCFA absorption in humans, potentially decreasing the risk of metastasis.

More pieces had to be added to this biological machine. Team members Marta Vilademunt and Dimitrije Ivancic pointed out that, as good practice in synthetic biology, when one intends to synthesise or degrade a compound, the engineered system must be able to detect that compound in order to adapt adequately. Following that logic, a molecular sensor capable of detecting the influx of LCFAs was built into the bacterium. Connected to a biological switch, it would turn the system on and off, allowing the bacterium to react coherently and more efficiently to the presence or absence of LCFAs inside the cell.

Finally, just as humans are often misled by sweetness into ignoring other nutritious food sources, many bacteria tend to favour the consumption of sugars. To make the bacteria unable to use sugar as fuel, their genes for sugar processing were “knocked out”. The bacterium would thus be forced to grow and thrive on other nutrients, focusing on the desired LCFAs.

iGEM, during and afterwards

Students Vilademunt and Ivancic explained that, since the inception of the project, the maturation of the idea took almost a year. For some time, this was just a side project for the team while each of them was busy with other aspects of their career. Some members were outside Spain at the time, so coordination had to be done remotely, keeping the contact alive via periodic teleconferences.

The project underwent a constant process of maturation, with much learning through trial and error, gradually changing as the encountered issues were solved one after the other. The original concept was repeatedly readjusted, resulting in a far more complex and capable system than initially planned.

Effort, commitment, the ability to improve and the proposal of a genuinely original idea were all part of, and requirements for, the whole process. Having to show great patience and resilience in the face of frustration, together with the chance to make the jump from theory to practice, made for exponential personal growth. On top of all this learning came the need to devote a sizeable fraction of their energies to fundraising, a completely different skill. With many decisions still lying ahead and manifold possible life directions still wide open, participating in iGEM may have given them a good head start in their future careers.

Aftermath of the competition

Each team’s competition was, first and foremost, against itself. In this spirit, the Barcelona team was awarded a gold medal owing to their excellent idea, the development and presentation of the project, and the feasibility of turning the research into results with true impact. As a result, Gargantua was also nominated as one of the best therapy-oriented projects of the competition. Room for further development exists.

iGEM opens many doors to its former participants, offering the possibility to stay engaged as ambassadors, instructors or advisors to new teams. This large community keeps growing, providing contributions that have over time become foundational works in the field of synthetic biology. There are also remarkable success stories, such as Ginkgo Bioworks, a company that started as an iGEM team and by early 2019 had over 200 employees and was valued at around 1 billion dollars. Will Gargantua be the start of another such success? Let’s bet in their favour.

Picture credits

MIT Campus picture was downloaded from Wikipedia and licensed via an Attribution 3.0 Unported (CC BY 3.0) license.

UPF-CRG Barcelona team picture was downloaded from the team wiki at the iGEM 2018 website, and licensed via a Creative Commons Attribution (CC BY) license.

Escherichia coli picture was downloaded from Flickr and licensed via an Attribution 2.0 Generic (CC BY 2.0) license.

A discovery aimed at alleviating famine

  • Research team led by CRAG researcher Ana Caño-Delgado obtains plants resistant to water scarcity
  • A modification of steroid hormone signalling makes it possible to obtain a drought-resistant plant without affecting plant growth
  • Researchers work to translate this advance to cereals and horticultural species

Drought and its impact

Extreme drought episodes and heatwaves are among the effects of climate change that are becoming clearly visible. These effects have a definite impact on vegetation, including an impact on agriculture. In recent years, decreased rainfall and abnormally hot temperatures in northern and eastern Europe have caused large losses in cereal and potato crops and in other horticultural species. It is estimated that at least 40% of crop losses worldwide are due to drought, a proportion that might rise enormously as a result of climate change.

Experts have long warned that, to ensure food security, it is becoming necessary to use plant varieties that remain productive in drought conditions. Drylands, areas characterised by a scarcity of water, will expand in the future as global temperatures increase, and previously wet areas may become increasingly dry. The plants present, including crops, will be subject to conditions of stronger hydric stress, challenging their growth and, in extreme cases, their survival.

The Sahel is an example of a region with high dependence on agriculture, yet historically subjected to recurrent droughts

For agriculture, severe droughts are a serious threat, potentially causing a strong impact on the food supply and even famine episodes in affected countries. Such events have already been recurrent in a number of regions across the world, for instance in Ethiopia, Somalia, and the African Sahel region, which spans from Mauritania through Mali, Burkina Faso, Niger and Chad to parts of some of the countries further south.

A basic research biotechnological approach… for a tangible solution

Bearing in mind the problems posed to agriculture by the increasing number and intensity of droughts, a team led by researcher Ana Caño-Delgado, of the Center for Research in Agricultural Genomics (CRAG), has researched the engineering of drought-resistant plants. The team of Caño-Delgado has obtained plants with an enhanced resistance to hydric stress by acting on the signalling pathway of the plant steroid hormones known as brassinosteroids. The study, published in the journal Nature Communications, is the first to find a strategy to increase hydric stress resistance without affecting overall plant growth.

Brassinosteroids are involved in numerous plant processes, including cell expansion and elongation, vascular differentiation, pollen tube formation, senescence, cell division, cell wall regeneration, and resistance to chilling and drought. These hormones bind to, and are perceived by, different receptor proteins located at the plasma membrane, causing in turn the relay of a signal in the cell that ends up producing effects such as cell elongation or division. One of these receptor proteins is BRL3, the object of this study.

Ana Caño-Delgado has been studying how the plant steroids (brassinosteroids) regulate plant development and growth in the model plant Arabidopsis thaliana for more than 15 years. Since 2016, and thanks to a project funded by the European Research Council, the laboratory of Dr. Caño-Delgado uses their accumulated knowledge to find strategies to confer drought resistance to plants. By modifying brassinosteroid signalling, researchers had so far achieved Arabidopsis plants with increased drought resistance, but due to the complex action of these hormones on plant growth, these plants were also much smaller than their unmodified counterparts.

Arabidopsis thaliana plants after drought stress. Wild-type plant to the left; plant overexpressing BRL3 receptor to the right.

In this work, the researchers studied drought resistance and growth in Arabidopsis thaliana plants with mutations in different brassinosteroid receptors. They discovered that plants over-expressing the brassinosteroid receptor BRL3 specifically in vascular tissue are more resistant to the lack of water than control plants. Additionally, unlike other mutants, they do not present defects in their development and growth. As Caño-Delgado explains, “we have discovered that by modifying brassinosteroid signaling only locally in the vascular system, we are able to obtain drought resistant plants without affecting their growth”.

At a later stage, CRAG researchers, in collaboration with researchers from Europe, the United States and Japan, analysed the metabolites in the genetically modified plants. The data showed that Arabidopsis plants overexpressing the BRL3 receptor produce more osmoprotective metabolites (such as sugars and proline) in the aerial parts and in the roots under normal irrigation conditions. These compounds help the plant conserve water better.

When the modified plants were actually exposed to drought conditions, specific protective metabolites would additionally quickly accumulate in the roots, protecting them from drying out. So, the higher amounts of BRL3 resulting from the overexpression would, in a way, be preparing the plant to respond to the situation of water scarcity. This mechanism, known as priming, can be somehow compared to the effect of vaccines in the human body: when actual drought conditions take place, the additional mechanism would be triggered, but the plant would already be better prepared to resist the adversity of drought conditions.

Making the jump from fundamental to applied research: a coming solution for species of agronomic interest?

As Caño-Delgado said, “drought is one of the most important problems in today’s agriculture. So far, the biotechnological efforts that have been made to produce plants more resistant to drought have not been very successful because, as a counterpart to enhanced drought resistance, there was always a decrease in plant growth and productivity. It seems that we have finally found a strategy that could be applied, and we want to continue exploring it”.

The original discovery was made with Arabidopsis thaliana, a small herb used as a model plant that has no direct application outside research. The results, however, make way for new developments with a more practical application. The team of Caño-Delgado is already working on applying this strategy to plants of agronomic interest, especially cereals. If successful, the next steps could have a large impact on agriculture.

Image credits:

Frontpage image of dry maize plants is in the public domain and was downloaded from Pixabay.

Map of the Sahel was downloaded from Wikimedia Commons and licensed via an Attribution-ShareAlike 3.0 Unported (CC BY-SA 3.0) license.

Image of Arabidopsis plants exposed to drought stress was made by the researchers and kindly provided by CRAG.

Paper farming for thermoelectrics: breakthrough to recycle waste heat into electricity

  • The device is composed of bacterial-produced cellulose with small amounts of carbon nanotubes
  • The production method and recyclability of the components make this a sustainable and environmentally friendly product
  • The device could be used to generate electricity in wearables, medical and sports applications, and as an intelligent thermal insulator

Thermoelectric materials: a new member in the family

Thermoelectric materials, capable of transforming heat into electricity, are very promising for converting residual heat into electrical energy. They allow us to efficiently utilise thermal energy that is otherwise hardly usable and would generally just be lost.

Researchers at the Institute of Materials Science of Barcelona (ICMAB-CSIC) have created a cost-effective, easy-to-produce and potentially recyclable thermoelectric material: a paper capable of converting waste heat into electricity. Such devices could be used to generate electricity from residual heat, for instance to power sensors in the field of the Internet of Things, or in Agriculture and Industry 4.0.

The advance may have application, among others, on the Internet of things, and in Industry and Agriculture 4.0

“This device is composed of cellulose produced in situ at the laboratory, by means of bacteria, using small amounts of a conductor nanomaterial, carbon nanotubes, using a sustainable and environmentally friendly strategy” explains Mariano Campoy-Quiles, researcher at ICMAB.

“In the near future, they could be used as wearable devices, in medical or sports applications, for example. And if the efficiency of the device were even more optimized, this material could lead to intelligent thermal insulators or to hybrid photovoltaic-thermoelectric power generation systems” predicts Campoy-Quiles.

In addition, “due to the high flexibility of the cellulose and to the scalability of the process, these devices could be used in applications where the residual heat source has unusual shapes or extensive areas, as they could be completely covered with this material” highlights Anna Roig, researcher at the ICMAB, about the versatility of the device.

Since bacterial cellulose can even be home-made, this could be a first step towards a new energy paradigm in which users make their own electric generators. While that reality is still far away, the study is a good starting point.

Farming thermoelectric paper in the lab

“Instead of making a material for energy, we cultivate it” explains Mariano Campoy-Quiles, a researcher of this study. “Bacteria, dispersed in an aqueous culture medium containing sugars and carbon nanotubes, produce the nanocellulose fibers that will end up forming the device, in which the carbon nanotubes are embedded” continues Campoy-Quiles.

“We obtain a mechanically resistant, flexible and deformable material, thanks to the cellulose fibers, and with a high electrical conductivity, thanks to the carbon nanotubes,” explains Anna Laromaine, researcher at the ICMAB. “The intention is to approach the concept of circular economy, using sustainable materials that are not toxic for the environment, which are used in small amounts, and which can be recycled and reused,” explains Roig.

Farming... a new thermoelectric paper at ICMAB-CSIC. A cheap, cost-effective, recyclable and versatile material

Roig claims that this material “has a higher thermal stability than other thermoelectric materials based on synthetic polymers, which allows it to reach temperatures of 250 °C. In addition, the device does not use toxic elements, and the cellulose can easily be recycled, since it can be degraded by an enzymatic process converting it into glucose, while recovering the carbon nanotubes, which are the most expensive element of the device”. Moreover, the thickness, color and transparency of the material can be controlled.

Campoy-Quiles explains that carbon nanotubes have been chosen for their small dimensions: “Thanks to their nanoscale diameter and their few microns in length, carbon nanotubes allow, with very little quantity (in some cases, down to 1 %), to obtain electrical percolation, i.e. a continuous path where the electrical charges can travel through the material, allowing cellulose to be conductive and thermal insulator at the same time”.

“Additionally, the use of such a small amount of nanotubes (up to a maximum of 10 %), while maintaining the overall efficiency of a material containing 100 % nanotubes, makes the process very economical and energy efficient”, adds Campoy-Quiles. “On the other hand, the dimensions of carbon nanotubes are similar to those of cellulose nanofibres, which results in a homogeneous dispersion. In addition, the inclusion of these nanomaterials has a positive impact on the mechanical properties of cellulose, making it even more deformable, extensible and resistant”, adds Roig.

This study is the result of an interdisciplinary project (FIP-THERMOPAPER) between different groups of the Institute of Materials Science of Barcelona (ICMAB-CSIC) in the framework of the “Frontier Interdisciplinary Projects” call, a strategic action of the Severo Ochoa project of excellence.

Image credits:

Hydroponic culture picture downloaded from Wikimedia Commons, and licensed via an Attribution-ShareAlike 4.0 International license (CC BY-SA 4.0).

“Farming paper” picture was kindly provided by ICMAB-CSIC, and was somewhat cropped from the original size.

A research – clinic collaboration to treat language disturbances

Neure Clinic is born as a knowledge-transfer initiative of the Basque Centre on Cognition, Brain and Language (BCBL), with the idea of providing society with some of the institution’s developments through an evaluation and diagnostic service. The aim of Neure Clinic is to provide exhaustive neuropsychological and speech-language evaluations in the field of language, drawing on the latest advances provided by the research teams at BCBL.

Basic research translating into application and social impact:

Neure Clinic translates knowledge of basic cognitive processes into socially relevant applications. The clinic focuses on disturbances such as language breakdown due to brain pathology (in tumour and stroke patients) and developmental language disorders (developmental dyslexia and specific language impairment). While acquiring language skills is natural and easy for most children, there are cases in which this process is extremely challenging and costly.

Developmental disabilities such as these can have a dramatic and far-reaching impact on the lives of those affected, from their academic achievement to their mental health and economic status. For instance, developmental dyslexia and specific language impairment (SLI) are two neurogenetic disorders that affect the development of language skills and are found in 5 to 10% of the population (i.e. between 1 and 3 children per classroom). For such disorders, the earlier and the more comprehensive the diagnosis, the better the prognosis. Early diagnosis is thus of utmost importance, lest those early disturbances develop into more severe impairments.

The clinic is an opportunity for establishing collaboration with professionals in the field to set up a way of providing a detailed and well-informed diagnosis. The clinic will be a valued advanced diagnosis venue (including, for instance, neuroimaging tests) for developmental and learning disorders related to language, math and reading.

Neure Clinic addresses the diagnosis, treatment and monitoring of learning disorders, focusing primarily on school children

Neure Clinic will bring in new interesting data that, with the informed consent of the patients, will add to the value already present at the database of the Basque Center on Cognition Brain and Language (BCBL). This hence generates a win-win situation of cooperation between both bodies: BCBL’s expertise will benefit the clinic, while the collection of clinical histories in the database will benefit BCBL’s research.

The activities of the clinic:

Neure Clinic has developed a unique evaluation program of Basque and Spanish language skills in monolingual and bilingual children who struggle with language and reading acquisition. In collaboration with the BCBL, the clinic assesses reading and language skills to advise families on the most appropriate clinical and educational intervention strategies.

Neure Clinic carries out the neuropsychological evaluation of patients with brain tumours and patients that suffer from aphasia or related language impairments. The clinic has created assessment protocols that combine the administration of behavioural neuropsychological tools and neuroimaging measures to provide a rich set of data for a more comprehensive diagnosis, and therefore, for tailored responses to language deficits.

The diagnoses provided by Neure Clinic are unique because the assessment batteries have been built based on a close collaboration between practitioners (neuropsychologists and speech and language therapists) and BCBL researchers with extensive expertise in the field of language disorders.

The multidisciplinary collaboration developed at Neure Clinic has made it possible to create tasks that assess all the different levels of the causal explanatory chain, from brain, through cognition, to behaviour. The description and quantification of a disorder at each possible level of analysis within the same patient yields novel insights into the possible cause of each individual case. This provides teachers and practitioners with critical information to offer the best support to their patients.

The diagnosis program is the first to offer and integrate systematic neuroimaging evaluation. As well as creating a unique systematic clinical evaluation of skills at all levels of analysis, this initiative feeds back into the BCBL’s basic research significantly improving its scientific and theoretical knowledge in the field of cognitive neuroscience and language disorders.

Image credits:

Brain hemispheres image is in the public domain and was downloaded from Pixabay.

Neure Clinic picture was kindly provided by BCBL.

Carriers for catalyst particles: bystanders or active players after all?

  • Researchers find the weak spot of catalysts used to remove toxic car exhaust gases
  • Supposedly inert supports of catalytic converters have a paradoxically non-neutral effect
  • The finding will help improve catalysts used to convert car exhaust gases and extend their lifespan

A new catalytic effect

A catalytic effect has been discovered that may enhance the effectiveness of catalytic converters in cars. The expected impact: cuts in carbon monoxide (CO) emissions, and increased durability of these car components. A team of researchers from the Institute of Theoretical and Computational Chemistry (IQTC) of the University of Barcelona and the Vienna University of Technology contributed to this breakthrough.

Catalytic converters are used to transform toxic exhaust gases into less harmful substances

The work, published in the journal Nature Materials, had its experimental part conducted by the team of Prof. Günther Rupprechter at Vienna UT. The data were analysed using computational modelling in Barcelona by the group of ICREA Professor Konstantin Neyman of the IQTC. The results showed that chemical processes in particles used as catalysts for automobile exhaust gases change significantly if the particles are placed on oxide supports. These supports, expected to be inert and not active in the chemical reaction, would hence be playing a role nonetheless, affecting the catalyst, which consists of microcrystalline palladium particles.

Just as food should not vary depending on the material of the dish it is served on, in chemical reactions catalysed by metal particles the substrate (i.e. the inert support) should play a role neither in the outcome of the reaction nor in the conservation of the particle. Catalytic particles, with a diameter often spanning thousands of atoms, are thus not expected (nor desired) to be affected by chemical reactions happening outside the reaction interface, that is, away from the location where the catalysed reaction is actually taking place.

The poisoning of catalysts by carbon monoxide

Vehicles with a combustion engine use a catalytic converter to convert toxic exhaust substances into more innocuous compounds. Different types of catalysts are used in cars, but in all of them one of the main reactions involved is the conversion of the toxic gas carbon monoxide (CO) into carbon dioxide (CO2), which, while still a pollutant, is far less toxic.
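This conversion can be summarised in a single balanced equation; the palladium catalyst does not appear in it because it only accelerates the reaction without being consumed:

```latex
2\,\mathrm{CO} + \mathrm{O_2} \longrightarrow 2\,\mathrm{CO_2}
```

In words: two molecules of carbon monoxide combine with one oxygen molecule to yield two molecules of carbon dioxide.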

For this chemical transformation to take place, the catalyst surface first becomes covered with oxygen molecules. This is due to a phenomenon called adsorption, by which the atoms of these molecules reversibly interact with the surface of the catalyst, forming a thin layer around the catalyst surface.

While adsorbed on the surface of the catalyst particle, the atoms of the oxygen molecules become more reactive and, as a result, more readily available for converting CO into CO2. It is at this oxygen-covered surface that the interaction of the oxygen atoms with CO takes place, effectively turning CO into CO2. Alas, under certain circumstances, a disruption of this process may occur.

By the phenomenon of adsorption, a thin layer of molecules or atoms is formed around the adsorbent surface

During the normal operation of these catalysts, the surface is covered by a layer of oxygen molecules. However, after a given oxygen atom reacts with a CO molecule, empty spaces appear on the oxygen layer that covered the catalyst particles. In order to sustain catalysis, these oxygen “gaps” on the catalyst need to be rapidly filled by other oxygen atoms. In the presence of significant amounts of CO molecules, this takes yet another turn: when CO is abundant enough, the empty sites can be occupied by CO instead of by oxygen.

When the latter process happens at a large enough scale, the catalyst surface ends up covered by a layer of CO rather than of oxygen. As a result, CO2 cannot be formed and the catalyst no longer fulfils its function. In such cases, “one speaks of a deactivated or ‘poisoned by carbon monoxide’ state of the catalyst”, says Professor Neyman.

An inert support… not just a bystander, nonetheless

The poisoning of a catalyst, and the rate of poisoning, depend on how high the CO concentration of the exhaust gases is. In fact, the results showed that the material of the substrate, the physical support on which the palladium catalyst grains are placed, is crucial. For instance, “if palladium particles are placed on a surface of zirconium oxide or magnesium oxide, then poisoning of the catalyst happens at a higher concentration of carbon monoxide”, says Professor Yuri Suchorski, first author of the article.

If large enough concentrations of CO are present in exhaust gases, the catalyst can more readily become "poisoned"

Questions arise from this finding. For instance, why should the nature of the expectedly inert support affect the chemical reactions that take place on the surface of the catalyst particle? And why does the interface between the palladium particle and the support influence the behaviour of the particles deposited on it? Such questions were addressed by the researchers of this work.

Using a photoemission electron microscope, the researchers could follow the propagation of the chemical reaction in real time. With this instrument, the authors learned that carbon monoxide poisoning always starts at the edge of a grain, at its contact with the inert support. From there, the poisoning quickly spreads over the whole particle, much as decay spreads through a deteriorating apple.

Carbon monoxide attacks at the edge

The poisoning of catalyst particles starts where it does essentially for geometrical reasons. To start with, oxygen atoms at the edge of the particle have fewer neighbouring oxygen atoms to replace them. In addition, when “oxygen gaps” appear at the edges, a CO molecule can fill the gap more easily than in the middle of the surface.

It was thus proven that the support modifies the properties of the catalyst metal particle: “according to our calculations, the bonds between the metal atoms of the particle and the adsorbed oxygen layer are strengthened precisely at the borderline to the support”, notes Professor Neyman. Stressing the effect of the support, he concludes: “palladium atoms in intimate contact with an oxide support can bind oxygen atoms stronger”.

As carbon monoxide starts its “poisoning” at the edge, this effect is crucial: the edge of the particle, at its contact with the oxide support, is its weak spot. If this weak spot can be protected, the catalyst particle is shielded from CO poisoning, its long-term effectiveness in removing toxic exhaust gases is enhanced, and its lifespan is largely extended.

Image credits

Car in the nature in Autumn picture is in the public domain and was downloaded from Pexels.

Adsorption phenomenon picture is in the public domain and was downloaded from Simple Wikipedia.

Automobile exhaust gas was downloaded from Wikipedia and licensed via an Attribution-ShareAlike 3.0 Unported (CC BY-SA 3.0) license.

Chemical changes… directed by mechanical forces

  • Researchers find a new factor influencing the outcome of chemical reactions
  • Mechanical stress is found to be capable of directing the reactions towards different products
  • This discovery may have implications in fields such as nanotechnology and engineering, as well as chemistry and biochemistry

A collaboration from the Ruhr region to Barcelona

A study with the participation of the University of Barcelona (UB) and the Ruhr University casts light on a series of chemical changes that can be caused by applying mechanical tension forces to molecules. The work, with the participation of Dr. Jordi Ribas Ariño of the Theoretical and Computational Chemistry Institute of the UB (IQTC) and the Physical Chemistry and Materials Science Department of the UB, showcases the reduction of disulphide molecules linked to polymeric networks under such mechanically modified conditions.

The study, with the participation of Professor Dominik Marx of the Ruhr University Bochum, used computer simulation to describe how mechanical stress can influence chemical reactivity, a milestone published in the journal Nature Chemistry. From the industrious region of the Ruhr river in Germany comes a collaboration resulting in an advance with manifold ramifications.

A shift of a paradigm?

So far, heat, light and electricity have been regarded as the key levers for driving and fine-tuning chemical reactions. Mixing the right components in the right proportions and conditions converts reagent molecules into one or more products with new, different properties and, eventually, added value.

Purified sulfur. Disulphides arise when one or more molecules containing sulfur atoms form a sulfur-sulfur bond

Years ago, however, it was found that certain molecules behave in a particular way upon exposure to mechanical stress or tension forces: under those conditions they can undergo chemical modifications that would not take place under other circumstances. A field of chemistry arose to study such chemical processes and how they are influenced by the mechanical force applied to the involved molecules. That field, called covalent mechanochemistry, has since become a thriving area of research.

Dr. Ribas Ariño and Prof. Marx found an unexpected complexity in a particular chemical reaction: in the reduction of disulphide links in alkaline medium, the breaking of a covalent bond between carbon and sulphur atoms did not happen when the reaction was activated by an external mechanical force.

Disulphides, molecules formed by the covalent link between two sulphur atoms, show quite particular mechanochemical properties in alkaline solution. For instance, when they are exposed to mechanical tension they undergo structural changes that completely modify their reactivity. These changes were described in detail in their article published in 2013 in Nature Chemistry.

Proteins contain typical examples of disulphides and of their importance: these bonds can radically change a protein's structure and function

The discovery that force-induced conformational changes can steer chemical reactivity was suggested to play a role in understanding aspects of protein regulation, and in the design of mechanoresponsive materials.

A finding with profound consequences

Using computer simulations, the researchers found that, while applying force on the system accelerates the reaction, it also enforces a distortion of the chemical bond angles. This effect is responsible for “steering” the reaction in a particular direction, preventing certain products from appearing. As a consequence, new potential ways of controlling chemical reactivity open up, confirming that mechanical tension can be used to control the products resulting from certain chemical reactions.
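The basic idea that a pulling force reshapes a reaction's energy landscape can be illustrated with a toy one-dimensional model (this is an illustrative sketch, not the authors' actual ab initio simulations): adding a term −F·x to a double-well potential lowers the activation barrier as seen from the reactant well, which accelerates the reaction, while also shifting the position of the transition state.

```python
import numpy as np

def barrier(force, k=1.0):
    """Activation barrier of a toy double-well potential E0(x) = k*(x^2 - 1)^2
    tilted by an external pulling force: E(x) = E0(x) - force*x.
    Purely illustrative; positions and energies are in arbitrary units."""
    x = np.linspace(-2.0, 2.0, 4001)
    energy = k * (x**2 - 1.0) ** 2 - force * x
    reactant = energy[x <= 0.0].min()            # bottom of the left ("reactant") well
    transition = energy[np.abs(x) < 0.8].max()   # top of the barrier between the wells
    return transition - reactant

# The barrier shrinks monotonically as the pulling force grows:
for f in (0.0, 0.25, 0.5):
    print(f"force = {f:4.2f}  ->  barrier = {barrier(f):.3f}")
```

In the reaction studied in the article the effect is subtler, since force also distorts bond angles and closes off certain product channels entirely, but the same notion of a force-modified potential energy surface underlies both pictures.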

“The results open the door to designing specific applications for these small molecules, such as the synthesis of materials that become more rigid when stressed (as happens with muscles and bones), elastic bands that become shorter when pulled, or the application of ultrasound to activate selective chemical reactions,” says Prof. Ribas Ariño of the Faculty of Chemistry of the UB, member of the Reference Network in Theoretical and Computational Chemistry (XRQTC).

The finding has implications in fields extending outside the domains of chemistry or the chemical industry

Furthermore, the results make it possible to predict the properties and behaviour of mechanophores, molecules that can undergo a chemical reaction when exposed to external mechanical stress. The prospects opened by these advances are potentially enormous: numerous sub-fields of engineering, nanotechnology and chemistry may find applications for this notable breakthrough in basic research.

Image credits:

Purified sulfur picture was downloaded from Wikipedia and is in the public domain.

Bacillus anthracis protein, by Argonne Laboratory, was downloaded from Wikipedia and is licensed under a Creative Commons Attribution-ShareAlike 2.0 Generic (CC BY-SA 2.0) license.

Picture of Chemical factory in Antwerp (Belgium) is in the public domain and was downloaded from Pixabay.

SOMMa Open Science Day – a day of exchange about science openness

  • The SOMMa Open Day took place at DTIC-UPF, co-organised with BCAM Bilbao
  • Dozens of professionals met in a day of sharing about Open Science
  • The SOMMa working group in Open Science grows and starts planning future actions

SOMMa and Open Science

Opening science is one way to make it more accessible as well as more collaborative. It contributes to increasing its impact and to generating new opportunities to engage citizens. In that regard, SOMMa believes that Open Science will help accelerate scientific discovery and expand the impact and benefits of science for society. Convinced of this, SOMMa supports it resolutely.

SOMMa members include institutions which already have clear policies in favour of Open Science, and which develop specific actions for openness: the open sharing of their scientific processes and results, the deployment of data repositories and infrastructure, the sharing of expertise and good practice, training and more.

Venue of the SOMMa Open Science Day, DTIC-UPF, seen from above, inside.

On November 25th the 100xCiencia.3 meeting took place, where SOMMa members and external partners debated how to engage citizens more and better in science and research, hand-in-hand with political debates on how to improve the situation of science and science policy in Spain.

More recently, on April the 8th, the “SOMMa Open Science Day” took place. It was co-organised by two of the 48 members of the alliance: the Engineering and Information Technologies Department of the Universitat Pompeu Fabra (DTIC-UPF) and the Basque Center for Applied Mathematics (BCAM Bilbao), both of them part of the SOMMa working group on Open Science.

An appetizer before the start

Open Science activities started early that morning, just past the gates of DTIC-UPF, the venue for the day. Before the actual SOMMa Open Science Day began, workshops about Open Science were already taking place in parallel. These would carry on for the duration of the SOMMa event, but there was a chance to catch a glimpse of them. The companion workshops on Open Science began under the lead of ORION Open Science, a European project coordinated by Barcelona’s Centre for Genomic Regulation (CRG).

A moment of the workshop, with Emma Harris explaining the basics of Open Access publishing

Law expert Malcolm Bain (Across Legal) contributed to the session, offering insights into the General Data Protection Regulation (GDPR) as relevant to the treatment of data. The workshop was later followed by a tutorial about Wikimedia public resources (by Diego Sáez Trumper, Wikimedia Foundation). Of course, the usual pauses for coffee and sandwiches also had their place, giving attendees room to get to know each other.

SOMMa Open Science Day starts

With the ORION workshop still ongoing, the SOMMa Open Science Day started. It kicked off at the auditorium of DTIC-UPF with lawyer Malcolm Bain back on the scene, delivering the opening keynote speech. He returned to the subject of the GDPR, addressing matters such as the obligations one needs to fulfil to comply with it, and how these need to translate into a solid protocol where traceability remains guaranteed at every stage of the workflow.

A key point stressed was the need for the data management plan to clearly define what can, and what cannot, be done with a given set of data; failing to do so could lead to potentially dire legal consequences. Throughout the handling, use and re-use of data, the principle of accountability presides. These considerations take on particular relevance for potentially shareable datasets, without letting the efficient use and re-use of data be neglected as a result.

Malcolm Bain starting his talk about how to effectively (and safely) comply with the GDPR

With the words of Mr. Bain still lingering, the end of the lunch break was followed by a thorough explanation of the situation of Open Science in Spain. Pilar Rico, head of the Unit of Open Access, Repositories and Journals at the Spanish Foundation for Science and Technology (FECYT), richly described the many actions taking place. Her discourse then flowed into the implications of opening up research, and the impact that openness of research should have on evaluation processes.

The importance of defining the right indicators for measuring openness was discussed. It was mentioned that the hurdles found in implementing open access policies (and, more broadly, Open Science) are common to both Spain and Europe, which invites coordinated action. Before finishing, Rico issued an invitation for SOMMa to participate in the working group on Open Science at the Spanish State level, an invitation that SOMMa readily accepted, intent on contributing positively to Spanish science and its openness.

Actors, approaches and stakeholders, as explained by Pilar Rico: a panoramic view at a glance
Actors, approaches and stakeholders, as explained by Pilar Rico: a panoramic view at a glance

Michela Bertero, of the International and Scientific Affairs department at CRG, spoke next. She talked about what steps can be taken in support of Open Science. The importance of doing so at the policy level was highlighted, with a focus on the role of European policy. Nonetheless, much can still be done by research institutions themselves, something that was not overlooked.

An illustration of the challenges ahead and of the actors and stakeholders involved was put on display, after which attention was drawn to the European Commission’s Open Science Policy Platform. This group, which interacts with many of the involved stakeholders, advises the European Commission on the successful development and deployment of Open Science policy. As Bertero mentioned, the platform has directly and indirectly yielded a pool of close to 100 recommendations for Open Science policy in Europe.

As a relevant example of actions taken from the side of research institutions, the positioning of the EU-Life alliance in the face of the Plan S initiative for Open Access publishing was put forth. Their document reacting to Plan S was discussed, drawing attention to two key ideas: quality and peer review should be and remain at the heart of Open Access policy, and a coordinated effort is necessary for its successful deployment.

From left to right: Xavier Serra, Pilar Rico and Michela Bertero, in the final discussion

The closing speaker was the director of DTIC-UPF, Xavier Serra. His presentation gave insights into the use of Open Science principles for the distribution of internal resources at his institution. More details of what this has entailed, and how it has contributed to generating novel structures, can be found on the DTIC-UPF website, where it is explained in some detail how they went about strategically funding excellence under Open Science principles.

During the talk, the case of the Freesound platform was put forth. This open platform for sharing and depositing re-usable sound clips was a success that, perhaps unexpectedly, translated into a stream of donations that secured its sustainability, collaterally unlocking resources for advancing other projects. After Dr. Serra closed his talk, the two previous speakers took seats with him at the auditorium table, facing the public and their questions.

Closing the Open Science day

Subjects addressed or suggested earlier were brought up again in the final discussion, in one reformulation or another. Whether or not to open up research, it was said, can indeed be subject to opposing forces. An obvious example is whether researchers are willing to sacrifice publishing in the highest-impact journals in order to publish Open Access; in an extremely competitive landscape for researchers, this may pose an impossible dilemma.

Should Open Science criteria be included in the evaluation of the excellence of a given project or institution? If so, and quite critically, what are the best indicators for rewarding openness and scientific excellence at the same time, neglecting neither? While it was agreed that advocating openness should not entail renouncing excellence, it could be foreseen that this subject would remain a matter of discussion. Alas, the clock finally brought the lively discussion to an end. If you did not make it, you can watch some of the recordings on the DTIC-UPF website.

Warming up after the SOMMa Open Science day: interlude before the meeting of the Open Science working group

Finally, just moments after the end of the SOMMa Open Science Day, a publicly open meeting of the Open Science working group of the SOMMa alliance took place. It was open, among other things, to the recruitment of new members, the collection of ideas, and a first look ahead to the future. What actions were to be taken next? An enlarged critical mass of motivated contributors was formed, leaving as a guarantee… that more news on Open Science actions by SOMMa is to come.

Picture Credits:

All pictures taken by the staff of the SOMMa communications office except the picture of OS stakeholders, kindly provided by Michela Bertero.

The legacy of Planck

  • For four years, the Planck mission collected measurements of the ancestral cosmic microwave radiation
  • In 2013, 2015 and 2018, three datasets from these measurements were released, with increasing data accuracy
  • The last and final dataset contributed the most accurate quantification of the age of the Universe up to that moment

What was the Planck mission?

Named after the German Nobel laureate Max Planck (1858-1947), the Planck mission of the European Space Agency (ESA) was the first European space observatory whose main goal was to study the Cosmic Microwave Background (CMB), the relic radiation from the Big Bang. The CMB preserves a picture of the Universe as it was about 380 000 years after the Big Bang. As a result, it can reveal the initial conditions for the evolution of the Universe.

Artist interpretation of the Planck satellite observatory in space.

In 1996, the Planck mission was selected as a medium-size mission for obtaining an all-sky image of the temperature and polarisation fluctuations of the CMB, with an accuracy set by fundamental astrophysical limits. This would allow charting the most accurate maps of the CMB yet. The European Space Agency (ESA) operated the Planck space observatory starting in 2009. Planck became an important source of astrophysical data and helped test theories about the early Universe and the origin of cosmic structure. By the end of the mission, Planck had contributed the most precise measurements of key cosmological parameters, including the age of the Universe and the densities of matter and dark matter.

The mission of the Planck observatory was brought to an end in October 2013. During the spring of 2013, the first set of data resulting from the Planck mission had been released. That was followed in 2015 by a second release, together with a cautionary note about the precision of some data. By 2018, a new set, the “Legacy” data of Planck, re-processed by the Planck consortium, was released with enhanced temperature and polarisation determinations. These new data were fully suitable for cosmology work, significantly improving on the two earlier datasets.

Diving into the details of the Planck project

Planck was launched on May 14th 2009 and collected data for four years. These data would prove useful not only for understanding the Universe better, but also for studying the properties of the components of our own and other galaxies, as well as their clustering. After just over four years of remarkable operation, the mission was turned off on October 23rd 2013. It provided high-quality data from which an enormous number of results in the areas of cosmology and astrophysics were derived.

The results obtained from the first two datasets, as well as the products on which they were based, were made public in 2013 and later in 2015. Among the results most worth pointing out is the best determination of the age, composition and shape of the Universe to date. These results rendered the Planck Collaboration worthy of the 2018 Royal Astronomical Society Award.

Artist interpretation of Planck. In the background, measurements of cosmic microwave background

The legacy of Planck, this third and last release, has much to do with our knowledge of the physical properties of the Universe. The results published so far established a homogeneous and isotropic cosmological model that, with only 6 parameters, is able to faithfully reproduce the Planck observations of the primordial and most remote radiation in the Universe. In relation to its composition, it has been possible for the first time to map the all-sky distribution of dark matter and to determine its abundance with sub-percent precision. Alternative models to the cosmological constant as the origin of dark energy have also been strongly constrained.
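As an illustrative aside (not part of the Planck analysis itself, which fits the full 6-parameter model to the CMB power spectra): in a flat ΛCDM model, two of those parameters, the Hubble constant H0 and the matter density Ωm, already determine the expansion age of the Universe by integrating the Friedmann equation, t0 = (1/H0) ∫₀¹ da / (a·E(a)) with E(a) = √(Ωm/a³ + ΩΛ). A minimal sketch in plain Python, using rounded Planck-2018-like values and neglecting radiation (a sub-percent effect on the result):

```python
import math

def lcdm_age_gyr(h0_km_s_mpc=67.4, omega_m=0.315, steps=10000):
    """Expansion age of a flat LambdaCDM universe in Gyr, from
    t0 = (1/H0) * integral_0^1 da / (a * sqrt(omega_m/a^3 + omega_l)).
    Substituting a = u^2 makes the integrand smooth near the origin:
    the integral becomes int_0^1 2u^2 / sqrt(omega_m + omega_l*u^6) du."""
    omega_l = 1.0 - omega_m                 # flatness assumption: densities sum to 1
    hubble_time_gyr = 977.8 / h0_km_s_mpc   # 1/H0 in Gyr (977.8 Gyr = 1 / (1 km/s/Mpc))
    du = 1.0 / steps
    total = 0.0
    for i in range(steps):                  # midpoint-rule quadrature
        u = (i + 0.5) * du
        total += 2.0 * u * u / math.sqrt(omega_m + omega_l * u**6) * du
    return hubble_time_gyr * total

print(f"{lcdm_age_gyr():.2f} Gyr")  # close to the Planck value of about 13.8 Gyr
```

The function names and the simple quadrature are of course our own; the point is only that a handful of fitted parameters suffices to pin down a quantity as fundamental as the age of the Universe.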

The results reinforce the existence of an inflationary period in the very early Universe, during which it expanded exponentially and in which the quantum seeds appeared that gave rise to galaxies and the other structures forming what is known as the “cosmic web”. Another consequence derived from the Planck data is the stringent upper limit imposed on the neutrino mass which, together with the lower limit obtained from neutrino experiments, implies a narrow window of around a tenth of an electron-volt (a value that, on the other hand, demands an explanation within the standard model of particle physics).

The impact that the Planck publications have had on the scientific community has been remarkable. The Planck Collaboration, made up of some 200 scientists, has published 136 articles in the journal Astronomy and Astrophysics, with an average of about 200 citations per article. The most cited papers are the cosmological publications included in the Planck Core Science Program (focused on the study of the CMB), two of them being the most cited in physics in the years 2014 and 2016, respectively. In particular, this has been key to the Instituto de Física de Cantabria (IFCA) having the highest relative impact of all CSIC centres in the years 2012-2014.

The final set of publications leveraging those data will complete the Planck legacy with new and more precise results. The improved results are a consequence of a revision of the data calibration and of the reduction of systematic effects that had prevented the extraction of all the available information from certain data (that of polarisation).

The renowned Max Planck (middle) was awarded the Nobel Prize in Physics in 1918 for his discovery of energy quanta

This final “legacy” round of results will imply an even more precise determination of the cosmological parameters and of the properties of the components of our galaxy. The newest Planck data will be essential to complement those obtained by the next cosmological experiments starting in the following decade, such as ESA's Euclid satellite (to be launched in 2021), aimed at studying the nature and properties of dark energy and dark matter through a deep and precise mapping of galaxies; or KATRIN, which will study the mass and properties of neutrinos.

IFCA Participation

Enrique Martínez-González, head of the Observational Cosmology and Instrumentation group, participated in the proposal of the mission to the ESA scientific programme in 1993, as well as in the subsequent activities of instrument development, data analysis and derivation of the cosmological results, serving as Co-Investigator of the Planck Low Frequency Instrument (LFI). He and the other members of the IFCA cosmology group who participated in Planck hold the status of Planck Scientists, belong to the LFI Core Team, and have played a relevant role in the achievement of the important legacy left by Planck. On the instrumental side, the IFCA group coordinated the project for the development of the back-end modules of the LFI radiometers at 30 and 44 GHz, in close collaboration with the group at the Department of Communication Engineering (DICOM) of the University of Cantabria (UC), and contributed to their subsequent simulation and characterisation.

In relation to the scientific exploitation of the data, the group has significantly contributed to the two sets of publications of the Planck Core Science Program, published in 2014 and 2016 respectively, leading three papers in each set: on the isotropy and statistics of the CMB, on the catalogue of point sources, and on the detection of the integrated Sachs-Wolfe effect. For both sets of publications, temperature and polarisation maps of the CMB were produced by means of the component separation code SEVEM, developed by the group and one of the four official codes of the mission. In addition, the group has led two papers on the Sunyaev-Zeldovich effect due to the hot gas in the Virgo cluster and in filaments of the cosmic web, respectively, and another on the recently produced multi-frequency catalogue of non-thermal sources.

Image credits

The images “Front view of the Planck satellite” and “Planck and the cosmic microwave background” are copyright of the European Space Agency and were used according to their usage policy.

The frontpage picture of Max Planck is in the public domain and was downloaded from Wikimedia Commons.

The picture of Walther Nernst, Albert Einstein, Max Planck, Robert Millikan and Max Von Laue in 1931 is in the public domain and was downloaded from Wikipedia.
