Nine and sixty ways of particle tracking
Lomonossov Science Project presents results of a contest for the best technique of intracellular particle tracking.

A contest for the best technique of intracellular particle tracking (the simultaneous tracking of hundreds to thousands of intracellular organelles, virions and even individual molecules), an important problem in cell biology with applications in the search for drugs against diseases such as Parkinson's and Alzheimer's, has ended with no undeniable winner. The techniques proposed by the participants, including MSU professor Yannis Kalaidzidis, each solve the problem in their own way.

The article published in Nature Methods describes not a scientific invention or discovery but the results of a contest of scientific techniques. The task was to find an original solution to a specific scientific problem, and 14 groups from all over the world took part. The participants are co-authors of the article, while its leading authors are the organizers. Preparing the paper took more than a year: it was accepted in December 2013. The problem itself was formulated in May 2012 in Barcelona during the IEEE International Symposium on Biomedical Imaging (ISBI'12).

"In live cell imaging there is a not yet completely solved problem of simultaneously tracking hundreds to thousands of intracellular organelles, vesicles, virions and individual fluorescent molecules," explains one of the participants of the contest, Yannis Kalaidzidis, professor at the Faculty of Bioengineering and Bioinformatics, MSU, and research scientist at the Max Planck Institute of Molecular Cell Biology and Genetics, Dresden.

Inside cells there are numerous vesicles (small intracellular formations, a sort of membrane-bound container in which nutrients and signaling molecules are stored and transported; viruses and mycobacteria also hijack them to penetrate the cell) responsible for intracellular transport, and disorders of this vesicular transport may cause a wide variety of diseases. Apart from metabolic failures, these include neurodegenerative diseases such as Alzheimer's and Parkinson's as well as a number of genetic disorders.

To trace this transport, vesicles are labeled and imaged by time-lapse microscopy. Previously, tracking was performed by manually connecting vesicles on sequential frames, but today, since hundreds to thousands rather than tens of objects are tracked, manual tracking has become impractical. Such extensive studies require computer-based analysis. A number of numerical tracking algorithms have been developed, and practically every research group uses its own. To compare the efficiency of the different methods, the organizers used artificially generated time series of images in which the participants had to track all the particles. The reconstructed particle trajectories were then compared to the true trajectories used in the simulations.
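The core of any such algorithm is frame-to-frame linking: deciding which detected spot in frame t+1 continues the trajectory of which spot in frame t. As a minimal illustration (not any contestant's actual method; the function name and distance cutoff are invented for this sketch), here is a greedy nearest-neighbour linker in Python:

```python
import math

def link_frames(prev_pts, next_pts, max_dist=5.0):
    """Greedily link particles in consecutive frames by proximity.

    prev_pts, next_pts: lists of (x, y) detections in frames t and t+1.
    Returns (i, j) pairs meaning prev_pts[i] continues as next_pts[j].
    """
    # consider all candidate pairs, closest first
    pairs = sorted(
        (math.dist(p, q), i, j)
        for i, p in enumerate(prev_pts)
        for j, q in enumerate(next_pts)
    )
    links, used_prev, used_next = [], set(), set()
    for d, i, j in pairs:
        if d > max_dist:           # too far: treat as appearance/disappearance
            break
        if i in used_prev or j in used_next:
            continue               # each particle links at most once
        links.append((i, j))
        used_prev.add(i)
        used_next.add(j)
    return links
```

Real trackers must additionally handle detection noise, particles appearing and vanishing, and crossing trajectories, which is precisely what made the contest hard.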

The contest actually ended with no single winner. The problem was equally well resolved by different techniques, meaning that, as in Kipling's verse, "There are nine and sixty ways of constructing tribal lays, And every single one of them is right!" ("In the Neolithic Age", Stanza 5, 1895). To be more precise, in this particular case every single one is equally right.

According to Kalaidzidis, who entered the contest with his program MotionTracking, developed in 2001-2004 for tracking intracellular organelles and described in a 2005 paper, the inconclusive result of the competition was predetermined by the ill-posedness and complexity of the problem. A drawback of the problem formulation that should be corrected in future contests is the over-simplification of the test images, which makes the problem solvable for algorithms that may fail when confronted with real data. At the same time, the organizers note that the inability to track particles without errors even in such simplified image series means the problem remains unsolved and requires more profound analysis.

"This simplified approach," says Kalaidzidis, "underestimates the influence of some problems present in applied quantitative microscopy and thus may be biased toward algorithms designed specifically for the contest."

Researchers aim to predict the future flu virus
Lomonossov Science Project presents an article about ways to predict the future flu virus.

Every year, influenza outbreaks claim hundreds of thousands of human lives. Though vaccination against flu is fairly efficient, the disease is difficult to exterminate because of the high evolutionary rate of the flu virus. Every year, new flu strains spread over the planet that differ slightly from those common a year before, which helps the virus escape the immune response and possibly compromises the efficiency of antiviral drugs. Furthermore, from time to time a drastically new strain appears, posing the threat of a human pandemic. Both processes are due to changes in the viral genome, but of different degrees. The differences in seasonal flu usually result from point mutations in the influenza virus genes, while major pandemics are often connected to profound genetic shifts known as reassortments. The link between these two phenomena was studied for the first time by a Russian research team from the Faculty of Bioengineering and Bioinformatics of the Moscow State University (MSU) in collaboration with the Central Research Institute of Epidemiology and the Institute for Information Transmission Problems of the Russian Academy of Sciences (IITP). One of the authors, professor Alexey Kondrashov, is also affiliated with the University of Michigan. The results were published in PLoS Genetics.

Georgii Bazykin, the corresponding author of the paper, a leading researcher at the Faculty of Bioengineering and Bioinformatics at MSU and the head of the Molecular Evolution division at IITP, explains: "The influenza virus genome consists not of a single DNA or RNA molecule, as in many other viruses, but of eight individual segments, resembling in some sense the chromosomes of the human genome. Every segment is a separate RNA molecule." If different strains co-infect a single cell, their genomes may exchange these segments in a process called reassortment. This may lead to the emergence of a novel genome consisting, for instance, of three segments obtained from one viral genome and five segments from another.

"Most major flu epidemics that we know of were caused by such reassortments," continues Bazykin. "When you analyze the strains that caused these outbreaks, you find that they had combinations of viral genome segments never seen together before. This was the case for the 1957 and 1968 pandemics, as well as for the swine flu in 2009. The deadliest pandemic, the Spanish flu of 1918, was probably of the same nature, although it is hard to be certain for events so distant in time."

A reassortment may produce a highly virulent strain, because a strong genetic shift makes it "unfamiliar" to the immune system of most humans, which allows the virus to spread efficiently throughout the population.

This is the evolutionary scenario known as antigenic shift. Another path, known as antigenic drift, is a process of gradual accumulation of smaller mutations. These mutations cause changes in the viral antigenic proteins, primarily, the surface antigens hemagglutinin (HA) and neuraminidase (NA). The genes coding for these proteins evolve rapidly in the course of the arms race between the virus and immune system.

"The seasonal flu outbreaks are primarily caused by this antigenic drift," explains Georgii Bazykin. "Hence every year many of us catch the flu caused by a new strain of the constantly evolving virus."

The relationship between these two processes, antigenic shift and antigenic drift, had never been studied before. To fill this gap, the Russian researchers first aimed to localize the reassortments that the virus had experienced during its evolution. They considered the influenza A H3N2 virus that entered the human population in 1968. Using a database containing 1376 sequenced viral genomes, they applied bioinformatic techniques to reconstruct the history of reassortments and to pinpoint them on the evolutionary tree of the virus. They then tested the hypothesis that genetic shift causes subsequent genetic drift, i.e., that reassortments increase the frequency of smaller point mutations, which appear when individual nucleotides are replaced. The hypothesis was supported by the data: indeed, a reassortment increases the subsequent rate of single-nucleotide substitutions.
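In spirit, the test reduces to contrasting per-year substitution counts on tree branches that follow a reassortment against the background rate. A toy sketch with invented numbers (the real analysis used 1376 genomes and proper phylogenetic statistics, not this two-line comparison):

```python
def substitution_rate(branches):
    """branches: (substitution_count, branch_length_in_years) pairs.
    Returns substitutions per year pooled over the branches."""
    total_subs = sum(subs for subs, years in branches)
    total_time = sum(years for subs, years in branches)
    return total_subs / total_time

# illustrative counts only, not data from the paper
post_reassortment = [(8, 1.0), (6, 1.0)]     # branches right after a reassortment
background = [(4, 1.0), (3, 1.0), (5, 2.0)]  # ordinary branches

# the hypothesis is supported when the post-reassortment rate is higher
accelerated = substitution_rate(post_reassortment) > substitution_rate(background)
```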

"We believe that this effect is connected to the fact that reassorted genes have to operate in a new genetic environment," says Bazykin. "Since genes are connected to each other, if gene A has changed, a new version of gene B is also likely to be preferable. As a result, every reassortment event is followed by a trail of additional point mutations."

Since reassortments produce the most virulent pandemic-causing strains, the results of this work may be relevant to prediction of the future emergence of such potentially dangerous outbreaks.

The analogue of a tsunami for telecommunication
Lomonossov Science Project presents an article about the analogue of a tsunami for telecommunication.

The development of electronics and communications requires hardware capable of ever greater precision, ergonomics and throughput. For communication and GPS-navigation satellites, it is of great importance to reduce the payload mass as well as to ensure signal stability. Last year, researchers from the Moscow State University (MSU) together with their Swiss colleagues from EPFL performed a study that may induce certain improvements in this direction. The scientists demonstrated (the paper was published in Nature Photonics) that the primary source of noise in microresonator-based optical frequency combs (broad spectra composed of a large number of equidistant narrow emission lines) is related to nonlinear harmonic generation mechanisms rather than to fundamental physical limitations, and is thus in principle reducible.

On December 22, a new publication in Nature Photonics appears in which they extend these results. Michael Gorodetsky, one of the co-authors of the paper, professor at the Physical Faculty of MSU and also affiliated with the Russian Quantum Center in Skolkovo, says that the study contains at least three important results: the scientists found a technique to generate stable femtosecond (of the order of 10^-15 seconds in duration) pulses, optical combs and microwave signals.

The physicists used a microresonator (in this particular case a millimeter-scale magnesium fluoride disk, in which whispering-gallery electromagnetic oscillations can be excited, propagating along the circumference of the resonator) to convert continuous laser emission into periodic pulses of extremely short duration. The best-known analogous devices are mode-locked lasers that generate femtosecond, high-intensity pulses. Applications of these lasers range from the analysis of chemical reactions at ultra-short timescales to eye surgery.

"In mode-locked femtosecond lasers, complex optical devices, media and special mirrors are normally used. However, we succeeded in obtaining stable pulses in a passive optical resonator alone, using its own nonlinearity," Gorodetsky says. In the future, this will allow a drastic reduction in the size of the device.

The short pulses generated in the microresonator are in fact what is known as optical solitons (a soliton is a stable, shape-conserving localized wave packet propagating in a nonlinear medium like a quasiparticle; an example of a soliton in nature is a tsunami wave). "One can generate a single stable soliton circulating inside a microresonator. In the output optical fiber, one then obtains a periodic series of pulses with a period corresponding to the round-trip time of the soliton," Gorodetsky explains.
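The pulse period is simply the soliton's round-trip time, so the repetition rate follows from the resonator geometry. A back-of-the-envelope sketch (the disk radius and refractive index below are assumed ballpark values for a millimetre-scale MgF2 resonator, not figures from the paper):

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def repetition_rate_hz(radius_m, refractive_index):
    """Repetition rate = 1 / round-trip time of a soliton running along
    the resonator rim (group index crudely approximated by n)."""
    optical_path = refractive_index * 2.0 * math.pi * radius_m
    return C / optical_path

f_rep = repetition_rate_hz(1.1e-3, 1.37)  # assumed: R ~ 1.1 mm, n(MgF2) ~ 1.37
```

The result lands in the tens-of-gigahertz range, which is why such combs double as microwave sources.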

Such pulses last 100-200 femtoseconds, but the authors are confident that much shorter solitons are achievable. They suggest that their discovery will allow the construction of a new generation of compact, stable and cheap optical pulse generators working in regimes unachievable with other techniques. In the spectral domain, these pulses correspond to the so-called optical frequency "combs" that revolutionized metrology and spectroscopy and brought their developers the Nobel Prize in Physics in 2005 (the American John Hall and the German Theodor Haensch received the prize "for their contributions to the development of laser-based precision spectroscopy, including the optical frequency comb technique"). Currently existing comb generators are much larger and more massive.

At the same time, as the scientists show, such a comb generates on a photodetector a high-frequency microwave signal with a very low phase noise level. Ultra-low-noise microwave generators are extremely important in modern technology; they are used in metrology, radar and telecommunication hardware, including satellite communications. The authors note that their results are critical for applications such as broadband spectroscopy, telecommunications and astronomy.

Salt under pressure is not NaCl
Lomonossov Science Project presents an article about structures radically altering our understanding of chemistry.

At the very beginning of the school chemistry course, we are told of NaCl as an archetypal ionic compound. Being less electronegative, sodium loses its electron to chlorine, which, following the "octet rule", thus acquires the 8-electron configuration of a noble gas. All the rules predict NaCl to be the only possible compound formed by chlorine and sodium. The research team led by Artem R. Oganov, Professor of Crystallography at Stony Brook University, has discovered new sodium chlorides that call for a revision of textbook chemistry. In their article "Unexpected stable stoichiometries of sodium chlorides", published in the latest issue of Science, they predicted the stability of compounds that are impossible from the point of view of classical chemistry: NaCl3, NaCl7, Na3Cl, Na2Cl3 and Na2Cl. The lead author of the paper, sharing equal credit with Oganov, is Weiwei Zhang, a visitor in Oganov's laboratory.

"We know from school chemistry that for many classes of compounds a simple charge balance rule applies to all stable compounds. In ionic compounds, Cl atoms have a formal charge of -1 and sodium +1, hence the only possible compound is NaCl. This rules out, for example, NaCl2. And NaCl5 and Na2Cl3 should be impossible, too. Yet one can perform a calculation and estimate how unfavourable they are. It turns out that a large number of 'weird' compounds can be successfully stabilized simply by increasing pressure. At that point traditional rules break down," says Oganov.
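The classical rule Oganov refers to can be stated in a few lines of code: assign each ion its formal charge and demand that the total vanish. A trivial illustrative sketch:

```python
# formal ionic charges from the school charge-balance rule
FORMAL_CHARGE = {"Na": +1, "Cl": -1}

def is_charge_balanced(formula):
    """formula: mapping element -> atom count, e.g. {"Na": 1, "Cl": 1}.
    True when the formal charges sum to zero."""
    return sum(FORMAL_CHARGE[el] * n for el, n in formula.items()) == 0
```

By this rule only the 1:1 compound survives; every stoichiometry predicted in the paper fails the check, which is exactly why their stability under pressure breaks textbook chemistry.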

These structures, which radically alter our understanding of chemistry, were calculated using the crystal structure prediction technique invented by Oganov and currently used by more than 1500 scientists around the world. Oganov named his code USPEX (Universal Structure Predictor: Evolutionary Xtallography), making this Russian word, which stands for "success", popular among crystallographers and materials scientists.

Exotic compounds not only expand our understanding of chemistry but may find new practical applications in the future. For example, NaCl7, NaCl3, Na3Cl2 and Na2Cl are metals (which explains the apparent violation of electroneutrality, since charge balance rules are inapplicable to metals), while only one semiconducting phase of NaCl3 is stable, in the pressure range between 250 and 480 thousand atmospheres. Metallic Na3Cl has a unique structure consisting of alternating layers of NaCl and of pure sodium. The sodium chloride layers are dielectric and the sodium layers conducting, hence the crystal as a whole is a unique quasi-two-dimensional conductor. Such substances were recently found to show some interesting physical effects.

"I think that such materials should find practical applications. The only trouble is that these substances are stable only at high pressures. However, other extreme conditions may be used to produce such materials, for example on the surfaces of other crystals. A surface is an extreme state, too, where nearly half of the bonds are broken, and its chemical composition is known to be completely different from that inside the crystal," says Oganov.

The new sodium chlorides are not mere fantasy or an exotic numerical result. Alexander Goncharov, an experimental physicist from the Carnegie Institution in Washington, has reproduced in his laboratory the high pressures required to stabilize the non-standard sodium chlorides and directly confirmed the existence of the compounds predicted by USPEX.

Oganov considers the new substances not a mere amusing exception but the first signs of a new chapter of chemistry. Such unusual compounds may exist in planetary interiors, where pressures are high.

"I think that sodium chloride cannot be an exception. Most likely, we have encountered a new class of compounds that will show up throughout a wide range of chemical systems. We have confirmed this for KCl, where the phase diagram is even richer. For the planet-forming system magnesium-oxygen, we predicted two new compounds, MgO2 and Mg3O2. We assume that magnesium silicate is always MgSiO3 or Mg2SiO4 in the interior of the Earth. We have always thought this way, but now we ask ourselves: couldn't it be otherwise? Even if these are the only compositions in the Earth's interior, the rocky core of Jupiter may contain its own unexpected silicates," supposes Oganov.

Though the existence of these new compounds has been confirmed experimentally, the nature of their stability is not in all cases easy to understand.

Artem R. Oganov, a graduate of the Crystallography Department (Geological Faculty) of Moscow State University, is now a Full Professor at Stony Brook University and an Adjunct Professor of Moscow State University. He is a winner of a "mega-grant" state project at the Moscow Institute of Physics and Technology (MIPT, the famous Phystech, in Dolgoprudny). The international team of Oganov's lab at MIPT studies, in particular, the nature of the chemical bonding in the new "impossible" compounds. Their studies will provide new insights into the nature of these compounds and help construct artificial functional materials.

Development of inflorescence architecture in monocotyledonous plants
Lomonossov Science Project presents an article about the development of inflorescence architecture in monocotyledonous plants.

A joint multidisciplinary project by botanists and plant physiologists of the Moscow State University and the Royal Botanic Gardens, Kew (UK) revealed that the presence or absence and the position of leaves within an inflorescence are important taxonomic markers and key features in the regulation of individual flower patterning and development. The results of several years' work on this subject have been summarized in a review published in a special inflorescence issue of Annals of Botany, one of the leading peer-reviewed botanical journals. This is one of the first attempts to link together morphological, anatomical and physiological data on inflorescence structure and development.

Scientific approaches to the examination of flowers and inflorescences are usually separate: research is focused either on the flowers themselves (their structure, the number, position and initiation of the different floral organs, and the molecular mechanisms driving floral development) or on the arrangement of leaves and flowers within entire inflorescences. This review summarizes current views on the interaction between the flower-subtending bracts (i.e. leaves that bear flowers in their axils), the floral prophylls (leaves of the pedicel) and the flowers, and on how non-floral organs affect floral development and the overall structure of the inflorescence.

Scientists focused their attention on the ‘early divergent’ monocot families such as Alismataceae, Acoraceae, Scheuchzeriaceae, and Potamogetonaceae. On molecular-phylogenetic trees of monocotyledonous plants, these families belong to basal branches. All genera examined are semi-aquatic and aquatic saline or fresh-water plants from different parts of the world. They include the unique secondarily submerged sea-grasses Zostera and Posidonia, which play an important role in marine ecosystems.

To set seed, each individual flower and the entire inflorescence require an efficient supply. Nutrients and water are supplied through conductive tissues, which in the plant body form more or less numerous vascular bundles. The pattern of distribution and interaction of the vascular bundles is termed vasculature. The authors showed that transformations in inflorescence architecture directly affect the vasculature and thus nutrient transport towards the flowers and fruits. In general, larger flowers and organs require more extensive vasculature. "Sometimes plants demonstrate interesting engineering solutions in their vasculature," says Assistant Professor Margarita Remizowa, PhD. "For example, the flower-subtending bracts are absent in inflorescences of Triglochin, resulting in a deep rearrangement of its conducting system. Arabidopsis thaliana, the most popular model organism in botanical research, has a similar structure of inflorescence and conducting system. This means that plants demonstrate a universal mechanism of vasculature response to the evolutionary elimination of flower-subtending bracts. Reduction of flower-subtending bracts affects the patterns of interaction between the conductive systems of the flower and the inflorescence axis."

Floral and vascular patterning is invisible to the naked eye and results from processes inside meristems that determine the position and number of parts. One of the central ideas of the paper is that this patterning develops according to a particular set of universal rules. Professor Vladimir Choob, PhD, DSci, co-author of a model of flower development, thinks that floral development is regulated by hormonal flows. Hormonal transport within floral meristems pre-determines both the positional information of the floral organs and the architecture of the vascular system.

The authors also speculate on the evolutionary disappearance of flower-subtending bracts from the inflorescence structure. Two patterns of flower-subtending bract reduction can be hypothesized in basal monocots. In the first scenario, the flower-subtending bract is morphologically invisible but retains its positional information. This suppression allows the possibility of re-emergence of the flower-subtending bract in the same place during evolution. Flower development proceeds unaltered, just as if the flower-subtending bract was still in place. In the second scenario, the flower-subtending bract partly passes its functions to other organs (such as perianth segments) via the formation of a complex organ. Instead of two separate organs just one is developed. The positional information of both is retained at the level of floral patterning, but the sequence of inception of floral organs and their developmental behaviour are affected.

"To explore the basic principles of morphogenetic regulation in a flower, we examined floral development under a scanning electron microscope," says Professor Dmitry Sokoloff, PhD, DSci, "and traced the vasculature using 3D reconstructions of serial anatomical sections. It has become widely accepted that vascular patterns inside the flower follow the transport flow of auxin (a hormone affecting many aspects of morphogenesis) at the initial stages of growth. Thus, we can try to reconstruct the development of floral patterning using data on fully developed flowers. An equally important achievement is the recently developed mathematical models of flower development. Such a study was possible only in collaboration between scientists from different fields."

Doctor Paula Rudall, PhD, DSci, added “These long-term interactions between scientists from different countries and institutes have proved mutually beneficial, not only in sharing of material and facilities, but also in fruitful exchange of ideas. Our collaborations have resulted in an increase of publications from both our institutes.”

In the floral primordium, the perianth organs are laid down first (shown in green and blue); then, in the center, the future carpels appear (shown in pink); and next the stamens (shown in yellow).

Ways of the photoelectric effect: how physicists have learned to select them
Lomonossov Science Project presents an article about new results on the atomic photoelectric effect.

An international team including theorists from the Department of Electromagnetic Processes and Atomic Nuclei Interactions of the MSU Institute of Nuclear Physics has managed, for the first time in the history of photoelectric studies, to eliminate a serious obstacle that has hampered these investigations for many years: the influence of the nuclear magnetic moment. The work was recently published in Physical Review Letters.

Despite its apparent simplicity (which brought Einstein his Nobel Prize), the photoelectric effect, in which an electron is knocked out of its parent atom by a photon, is quite complicated to analyze in general, especially when the atom contains a large number of electrons. Like the many-body problem in classical mechanics, the quantum many-body problem is very difficult to treat and remains a serious challenge for theory. Hence the principal role in this field is played by experiment. The latter, however, faces its own difficulties when it comes to separating data associated with the atomic photoeffect itself from a variety of other, essentially irrelevant phenomena.

Not the least among these phenomena is the one related to the spin (and thus the magnetic moment) of the atomic nucleus. Spin may be thought of as the quantum generalization of the angular momentum of classical mechanics, which is calculated as the product of the linear momentum (mass times velocity) of a particle and its position vector relative to the axis of rotation. Each proton and each neutron in the nucleus possesses its own magnetic moment. While these moments tend to largely compensate each other, the resultant moment does not always vanish. Any residual moment, even though relatively small (its interaction with the electronic shell is accordingly labeled "hyperfine"), may dramatically influence the process of photoelectron emission. A non-zero nuclear spin spoils the picture in particular when the atom is excited, which is the reason this case is of such strong interest to physicists.

A collaboration of seven physicists from Italy, France, Germany and Russia chose to perform their study on xenon, the element previously used to resolve mysterious features of the atomic photoeffect. Being a noble gas, xenon is very convenient for such studies: it does not form chemical bonds and does not contaminate the apparatus with its compounds. Even more important for the choice was that, among all the noble gases, only xenon has stable isotopes with both zero and non-zero nuclear magnetic moments. Furthermore, xenon is an interesting atom in its own right, due to its large number of electrons and the associated complicated dynamics of its electron shells.

The experimental design required isotope separation with the help of a mass spectrometer. Each isotope was then excited with synchrotron radiation and simultaneously irradiated with a wavelength-tunable laser beam. All ejected electrons were counted and sorted by energy and scattering angle.

All this is easy to say, but the reality is much more complicated. The first targets excited by synchrotron radiation were obtained in the late 1990s, but the principal difficulty was to combine the two radiation beams, laser and synchrotron. Moscow theorist A. N. Grum-Grzhimailo, one of the collaborators, says that only a few people in the world are currently capable of solving this problem. One of them, Michael Meyer from the European XFEL GmbH based in Hamburg (Germany), contributed to the experiment described above. The actual experiment was carried out using a unique beamline with variable polarization, maintained by the group of Laurent Nahon at the French synchrotron SOLEIL.

The role of the two theorists from the Institute of Nuclear Physics, A. N. Grum-Grzhimailo and E. V. Gryzlova (winner of the 2012 UNESCO-L'Oreal award "For Women in Science"), was like the song goes: "paint on... later on, I will explain it all" (a reference to "The Painting Artists" by B. Okudzhava).

The task was to provide a theoretical interpretation of the photoeffect on the excited xenon atom, isolated from the influence of the nuclear magnetic moment. Nobody expected a quiet and peaceful life within a collective of 54 electrons, but gradual improvement of the existing theoretical models finally led to success in describing the pure atomic photoelectric effect. This work, says A. N. Grum-Grzhimailo, paves the way for a large class of studies with artificially disabled nuclear magnetic moments and for complicated atomic processes with isotope selection that we could previously not even think about.

Iron age of high-temperature superconductivity
Lomonossov Science Project presents an article about new findings on the mechanisms of superconductivity.

The results were published in the Physical Review B journal.

An international collaboration including Russian physicists from Moscow, Chernogolovka and Yekaterinburg has studied one member of the recently discovered family of superconductors based on iron compounds and found that this exotic form of superconductivity has a complex, multi-gap character. A fact of principal importance for understanding the mechanisms of superconductivity is that the superconducting gap width never becomes zero anywhere on the constant-energy Fermi surface. The results of this work were published in one of the leading physics journals of the world.

After the discovery of high-temperature superconductivity in complex copper oxides (cuprates), exotic superconductivity, i.e. superconductivity exhibiting certain unexpected properties, was also found in certain layered iron-based compounds. And though the critical temperatures in these new substances are still lower than in cuprates, one can predict with certainty that in the field of condensed matter physics an "iron age" will follow the current "copper age".

All the handbooks treat superconductivity as the ability of some metals to have strictly zero resistance below a certain temperature, known as critical. This phenomenon is commonly applied in the construction of electromagnets for charged-particle accelerators (including the Large Hadron Collider) and, for example, in magnetic resonance imaging, one of the most promising methods of medical diagnostics.

When, in 1957, John Bardeen (known as a co-inventor of the semiconductor transistor), together with Leon Cooper and John Schrieffer, explained the mechanism of superconductivity (currently known as the "BCS mechanism" after the initials of the three authors), the explanation seemed so mathematically elegant that it was immediately accepted by the scientific community. Today it remains immune to criticism, but physicists have gradually come to the conclusion that the BCS theory explains only the foundation of the much more complicated phenomenology of superconductivity.

BCS explains metallic superconductivity by the binding of electrons into so-called Cooper pairs. Pair formation is initiated by the interaction of electrons with phonons, i.e. disturbances of the metallic crystal lattice. In describing the conditions that lead to superconductivity, the concept of the Fermi surface plays the principal role. This surface separates the quantum states occupied by electrons from the vacant ones.
Note that the Fermi surface is a surface in momentum (or reciprocal) space rather than in ordinary coordinate space. In the framework of BCS, superconductivity is a consequence of the existence of an energy gap near this surface, i.e. an energy range forbidden for electron states.
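For orientation, the standard weak-coupling BCS relations link the zero-temperature gap to the phonon spectrum and to the critical temperature (these are textbook formulas, not results from the paper under discussion):

```latex
\Delta(0) \;\approx\; 2\hbar\omega_D \exp\!\left(-\frac{1}{N(0)V}\right),
\qquad
\frac{2\Delta(0)}{k_B T_c} \;\approx\; 3.52,
```

where $\omega_D$ is the Debye frequency, $N(0)$ the electronic density of states at the Fermi level, and $V$ the pairing interaction strength. Multi-gap superconductors, such as the iron-based compounds discussed below, generally deviate from the universal ratio $2\Delta(0)/k_B T_c \approx 3.52$.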

The Fermi surface was initially considered spherical, with a constant gap width. According to Alexander Vasiliev, one of the contributors to this study and head of the Department of Low-Temperature Physics and Superconductivity at the Faculty of Physics, MSU, the reality is much more complicated. The shape of the Fermi surface may be far from spherical and depends on numerous factors, such as external pressure. There may also be multiple gaps. And electron binding may be mediated by mechanisms other than lattice vibrations, in particular by oscillations of the magnetic subsystem.

The iron-based superconductors studied by the authors were discovered quite recently, in 2008. Immediately after their discovery, these compounds posed a large number of questions to investigators, and so far there are very few satisfactory answers. The authors selected iron selenide, known for having the simplest lattice structure among all iron-based superconductors and thus an ideal target for deeper study. Professor Vasiliev says that such objects are studied in all the leading laboratories of the world.

In the experiment described, the dependences of the critical parameters on external field and temperature were studied, primarily of the so-called "first critical field" (the weakest magnetic field capable of penetrating the superconductor). The information obtained proved to be of great importance for further analysis of the superconductivity mechanisms in iron-based superconductors. The breakthrough was enabled by the excellent quality of the crystals grown by Dmitry Chareev in the crystal growth laboratories of MSU and by the ingenious technique of experimental data reduction proposed by the Belgian physicists Mahmoud Abdel-Hafiez and Victor Moshchalkov.

The experimental results indicate that iron selenide superconductivity originates either from a strong modulation of the energy gap width or from the existence of two energy gaps whose widths differ by an order of magnitude. An important result, according to the authors, is that the gap width never becomes zero at the Fermi surface. The experimental results presented in this article may be of great help to theorists currently developing the theory of one of the most enigmatic and intriguing quantum phenomena, one that finds more and more applications in modern technology.

The paradoxes of immunity
Today Lomonossov Science Project presents an article about intestinal immunity.

The human intestine has long been known to belong not only to the digestive system but also to the immune system, protecting the organism against bacteria, viruses and other pathogens, as well as against carcinomas. A novel and important aspect of the mechanisms of this intestinal immunity was uncovered by researchers from the Belozersky Institute of Physico-Chemical Biology and the Biological Faculty of Lomonosov Moscow State University (MSU) and the Engelhardt Institute of the Russian Academy of Sciences (subsequently abbreviated as RAS), in collaboration with their colleagues from the German Rheumatism Research Center (DRFZ) and other research centers in Germany, Switzerland, France and the USA.

The results were published on December 6 in the journal Science. "" also reviewed the research.

Intestinal immunity provides protection against various pathogens, in particular via the production of antibodies, in this case immunoglobulin A (IgA). Recently, a new type of innate lymphoid cell was discovered that is responsible for the formation of lymphoid tissues and the regulation of innate immunity. These cells also produce a large diversity of cytokines.

Cytokines are small proteins involved in signaling between cells and thus regulating various processes. More than 100 cytokines are currently known, but for many of them the mechanism of action is still poorly understood. Immunologists from MSU and RAS revealed a novel function of one of these cytokines, lymphotoxin alpha.

"We found a new function of this protein that in many senses appears paradoxical," says Sergei Nedospasov, head of the Immunology Department of the Biological Faculty of MSU and the leading investigator.

"It is connected to immunity regulation in the gut. This discovery, made by Andrei Kruglov from the Belozersky Institute, is at odds with the commonly accepted scheme," he adds.

The scientists worked with a unique strain of mice produced in Professor Nedospasov's laboratory nearly 10 years ago. These are genetically modified mice in which only selected cell types do not produce lymphotoxin. The mice were generated using embryonic stem cell technology.

"We used the approach known as 'reverse genetics' to modify the expression of the gene responsible for lymphotoxin production," explains Sergei Nedospasov.

Lymphotoxin is produced in two forms: a free, soluble form and a transmembrane form attached to cell membranes. Their roles in mucosal immunity are different, and that of the soluble form remained completely unknown until recently. The scientists have now found that the soluble form regulates the production of IgA in the lamina propria by controlling the recruitment of T lymphocytes to the intestine. T cells, or T lymphocytes, play a very important role in adaptive immunity.

In particular, they can recognize and eliminate cells presenting foreign antigens. Apart from this, they help B cells to produce antibodies.

Recently, IgA levels in the gut were shown to regulate the composition of the intestinal microflora. This regulatory function, as it turns out, depends on lymphotoxin.

The biologists sequenced the DNA of the intestinal microflora of mice in which the lymphotoxin genes were knocked out in innate lymphoid cells and found its composition to differ significantly from that of wild-type laboratory mice.

This discovery may have important medical applications. One of the most commonly used drugs against autoimmune diseases is etanercept (marketed as Enbrel), which, in addition to Tumor Necrosis Factor, also blocks lymphotoxin. This particular drug was shown to be ineffective for the treatment of intestinal autoimmune diseases.

"Without knowing the role of soluble lymphotoxin in the intestine, the importance of this fact was underestimated," stresses Nedospasov. "Though our data do not yet allow us to draw a definitive conclusion, the mere fact that a previously unknown role of lymphotoxin has been uncovered suggests that new results should be expected in the near future. If etanercept (Enbrel) is found to affect the gut microbiota of patients, it would be an important clinical result."

Diamond formation in the depths of the Earth
Today Lomonossov Science Project presents an abstract of the article written by Russian scientist Yuri Palyanov and his colleagues.

It was published in Proceedings of the National Academy of Sciences and also reviewed by "Gazeta.Ru".

High-temperature rock interactions at subduction zone tectonic plate boundaries may create conditions suitable for diamond formation, according to a study. At subduction zones, the Earth's crust sinks back into the hot mantle, bringing oxidized, surface-formed minerals into contact with metal-saturated, reduced mantle material.

To investigate the effects of this interaction on carbonate minerals, Yuri Palyanov and colleagues constructed high-pressure, high-temperature chambers to simulate the conditions in a subducting slab. At temperatures above 1000 °C, the authors report, carbonate minerals reacted with iron in the simulated mantle material to form new minerals, including iron carbide and graphite. Further temperature increases resulted in a gradient of oxidation states, forming a reaction zone at the carbonate-iron boundary.

Above 1200 °C, the authors discovered diamonds, up to a millimeter in diameter, forming in the carbonate or iron melts. The authors suggest that the boundary conditions between high-temperature iron and carbon minerals created the right conditions for diamond growth, and report that diamonds formed in oxidized conditions contained more nitrogen impurities than those formed in reduced conditions. Subduction zone diamonds may also display a characteristic carbon isotope ratio, and these two phenomena together may explain some of the compositional variation seen among diamonds, according to the authors.


Nugget galaxies are not yet extinct
Astronomers have found a new type of extragalactic object that could significantly alter the currently accepted theories of galaxy evolution. The Russian co-author of this investigation told “Gazeta.Ru” how the “nuggets” were discovered.

Astronomers have discovered an unusual new type of galaxy whose mere existence may significantly alter scientists' views on galaxy formation in the remote past. In 2005, scientists analyzing data obtained with the Hubble Space Telescope (HST) were struck by the existence in the early Universe of unexpectedly compact galaxies extremely rich in red stars. These galaxies were found at high redshifts (z > 1.4), meaning that their spectra are shifted redwards due to their rapid recession from us. Redshift is a measure of the remoteness of galaxies that formed during different stages of the evolution of the Universe:

objects in the nearby Universe have z = 0, while the most distant known galaxy has z slightly above 7, and the light we receive from such a distant object was emitted only 700 million years after the Big Bang.
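The redshift quoted above follows the textbook definition z = (λ_obs − λ_emit)/λ_emit, which can be sketched in a few lines of Python. The observed wavelength in the example is a hypothetical value chosen to reproduce z ≈ 1.4, the lower bound quoted for the red nuggets; it is not a measurement from the survey.

```python
# Cosmological redshift from observed vs. rest-frame (emitted) wavelength:
#   z = (lambda_obs - lambda_emit) / lambda_emit

def redshift(lambda_obs_nm, lambda_emit_nm):
    """Redshift z from observed and rest-frame wavelengths (same units)."""
    return (lambda_obs_nm - lambda_emit_nm) / lambda_emit_nm

H_ALPHA = 656.3  # nm, rest-frame wavelength of the H-alpha emission line

# A line emitted at 656.3 nm and observed at about 1575 nm (hypothetical
# value) corresponds to z ~ 1.4, i.e. the line has moved into the infrared.
z = redshift(1575.1, H_ALPHA)
print(round(z, 2))  # 1.4
```

The same formula shows why such distant galaxies look red: every spectral feature is stretched to (1 + z) times its rest wavelength.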

The galaxies discovered in 2005 were called “red nuggets” not only because of their size and color but also because their existence challenged the existing theories of galaxy evolution.

The absence of such galaxies in the nearby Universe raised a new question for astronomers: why did these galaxies become extinct?

Investigators from the Harvard-Smithsonian Center for Astrophysics (USA), in collaboration with the Russian astronomer Igor Chilingarian, who is also affiliated with the Sternberg Astronomical Institute of Moscow State University, have found that younger red nuggets have not gone extinct but rather provide the missing link between the distant nuggets of the early Universe and the massive elliptical galaxies of our epoch.

They simply had not been searched for carefully enough.

To find low-redshift counterparts of these galaxies, the astronomers turned to the Sloan Digital Sky Survey (SDSS) and selected 600 candidates compact enough to look like ordinary stars in images taken by ground-based telescopes. Analyzing earlier HST images of these regions, the astronomers found that these objects (nine very compact galaxies in total) are indeed very close analogs of the red nuggets found in the younger Universe.

The largest “nuggets” are ten times as heavy as the Milky Way; the smallest are ten times lighter.

Igor Chilingarian, one of the co-authors of this discovery, filled in the details for “Gazeta.Ru”.

-- "Red nuggets", like nuggets of native metal: is this an established name?

-- This is the first time I have heard of “nugget galaxies” in Russian. Nobody has ever written about them in Russia, so there is no such term yet. But the translation is reasonably good.

Why is this galaxy type important, and why does it cast doubt upon the existing galaxy formation scenarios?

-- Since the discovery of red nuggets in 2005 at high redshifts, z ≈ 2 (i.e. when the Universe was as young as 3 billion years), the Dutch group led by Marijn Franx and Pieter van Dokkum insisted that these objects are absent in the local Universe: they existed in the remote past and have since gradually inflated in size by growing their stellar populations.

But no numerical simulation has succeeded in reproducing such a scenario, which inspired a search for alternative mechanisms of galaxy evolution. At the same time, we show that such objects still exist; they were hiding in plain sight (though the redshift of z = 0.5 corresponds to a distance of 5 billion light years, when the Universe was 8.6 billion years old rather than 13.6 as today), but they are still rather rare.

At a redshift of 2 they were easily detected thanks to selection effects: their compactness makes their surface brightness higher. Their origin is a difficult question, because contemporary galaxy formation models predict that elliptical galaxies form through mergers, and it is unclear how to produce such a compact object with so high a stellar density through merging events alone.

If the objects look like mere stars in the SDSS images, how did you manage to select them?

– In the SDSS there is a photometric catalog and a spectroscopic one. The photometric catalog lists these objects as stars, which means their sizes are small enough for the survey to interpret them as point-like. At the same time, their spectra look like those of “almost normal” galaxies in the redshift range 0.2-0.6.

How can one estimate the mass and the size of an object from its spectrum?

– Spectra provide us with the parameters of the stellar population, such as the age and chemical composition of the stars constituting the galaxy. This gives us the stellar mass, while the velocity dispersion may be linked to the dynamical mass, provided that we know the size. And to obtain the sizes, we had to harvest the images from the HST archive; hence the number of galaxies listed in the paper is only 9, while the number of SDSS candidates was larger than 600.
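The link between velocity dispersion, size and dynamical mass described above can be sketched with the standard virial-style estimate M_dyn ≈ K σ² R_e / G. The structure coefficient K = 5 and the sample values of σ and R_e below are illustrative assumptions, not figures from the paper.

```python
# Virial-style dynamical mass estimate: M_dyn ~ K * sigma^2 * R_e / G.
# K, sigma and R_e below are assumed for illustration only.

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # solar mass, kg
PC = 3.086e16      # parsec, m

def dynamical_mass(sigma_km_s, r_e_kpc, k=5.0):
    """Dynamical mass (solar masses) from velocity dispersion and effective radius."""
    sigma = sigma_km_s * 1e3        # km/s -> m/s
    r_e = r_e_kpc * 1e3 * PC        # kpc -> m
    return k * sigma**2 * r_e / G / M_SUN

# E.g. sigma = 200 km/s and R_e = 1 kpc, plausible for a compact red galaxy,
# give a dynamical mass of a few times 10^10 solar masses.
print(f"{dynamical_mass(200.0, 1.0):.2e} solar masses")
```

The spectrum supplies σ (from the Doppler broadening of absorption lines), while R_e must come from imaging, which is why the HST archive images were essential.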

What would it mean if this galaxy type proves to be numerous?

– If they prove to be really abundant (i.e. if most of our candidates are confirmed), we would need a new theory capable of explaining the production of a considerable number of red nugget galaxies. For one or two objects one can always propose some exotic formation scenario reproducing all the observed properties, but for numerous objects exotic scenarios would not fit: they are simply too improbable.

My recent paper in “Science” is dedicated to compact elliptical galaxies. Red nuggets may be considered up-scaled versions of these compact ellipticals; even their stellar populations are quite similar. But forming a giant elliptical galaxy through tidal stripping is questionable since, firstly, it requires a really massive perturber and, secondly, dynamical friction effects are much stronger (the dynamical friction drag is proportional to mass), which would finally lead to a merger with the host galaxy.

