The Simons Center for Geometry and Physics is pleased to announce the latest issue of SCGP News, a biannual publication that reflects the Center’s mission and its scientific and cultural events.


Director of the C.N. Yang Institute for Theoretical Physics

Most of us would agree that addition and subtraction are simpler than multiplication and division, at least for big numbers. And this is not just for people who lack an affinity for math. When quantitative astronomy began in earnest, astronomers like Tycho Brahe and Johannes Kepler were faced with multiplications and divisions involving very large numbers, when they used trigonometry to deduce the motions of the planets. As seen in many other cases, this “practical” problem in the pursuit of pure science stimulated a solution with uses and implications far beyond its original inspiration. The unwieldy numbers of observational astronomy led to an innovation that reduced multiplication to addition, and division to subtraction. This invention was the logarithm, developed most influentially by John Napier around the beginning of the seventeenth century.

What are logarithms? Well, for every positive number *a* we define another number, log(*a*), “the logarithm of *a*,” which has the property that if *c* and *d* are two numbers, then the log of their product, log(*c*×*d*), is the sum of their two logarithms,

log(*c*×*d*) = log(*c*) + log(*d*).

How would this be used in practice? You have a table with a list of the (approximate) value of the logarithm for every positive number. Then, when presented with two large numbers with many digits, it’s not necessary to multiply them. Instead you go to that table of logs, look up the logs of the two numbers, add those two logs, and then return to the table to find out what number has that log. That new number will be the same as the product of the original numbers.
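The table-lookup procedure above can be sketched in a few lines of Python, with the standard `math` module standing in for the printed table of (natural) logarithms; the two factors are arbitrary illustrative values:

```python
import math

# Two large factors we would rather not multiply by hand.
a, b = 57_234, 89_611

# "Look up" the log of each number and add the logs.
log_sum = math.log(a) + math.log(b)

# "Look up" which number has that log (the inverse table).
product = math.exp(log_sum)

print(round(product))   # agrees with a * b up to floating-point rounding
print(a * b)
```

A hand computer of Kepler's era did exactly this, with the table lookups replacing the calls to `log` and `exp`.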

In days gone by, this property of logarithms was built into the scientific calculation tool of choice, the slide rule (see Figure 1).

*Figure 1. Slide rule. [From the office of C.N. Yang, Stony Brook University.]*

A slide rule consists of two bars, one moveable (thus the name “slide”), with numbers marked out at points whose distances from the point marked 1 are proportional to their logarithms. Matching up the 1 on the moveable scale with any number *X* on the fixed scale does the multiplication of *X* times every number on the moveable scale automatically. From the figure, we check, for example, that 2 times 2 is 4 and 2 times 3 is 6, and so forth. Notice that in the picture the bigger numbers are getting closer together, which suggests that as the number *N* increases, log(*N*) doesn’t increase nearly as fast. To be specific, for a large number *N* (so that *1/N* is small),

log(*N*+1) = log(*N*) + log(1 + 1/*N*) ≈ log(*N*) + 1/*N*,

where the “wavy” equals sign means “approximately equal to.” We see that as we increase the number whose log we’re taking by 1, then to a very good approximation, the value of that logarithm increases only by *1/N*. In fact, any time we identify a function *f(x)* that changes by *1/x* when *x* changes by a unit, that function is proportional to log(*x*). [Note: here we are using what is called the “natural log.” There are other, equally good logarithms, but they are all found from the natural log by multiplying every natural log by the same number.]
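As a quick numerical check of this approximation, here is a minimal sketch in Python using the natural log, with the value of *N* chosen arbitrarily:

```python
import math

# For a large N, log(N+1) - log(N) should be close to 1/N (natural log).
N = 1_000_000
exact_step = math.log(N + 1) - math.log(N)
approx_step = 1 / N

# The relative error of the approximation shrinks like 1/(2N),
# so for N of a million the two values agree to about six digits.
print(exact_step, approx_step)
```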

Logarithms aren’t limited to their use in multiplication and division. They often occur in the description of natural phenomena, in quantities that change with distance, time, momentum or energy. But no matter how we measure these quantities, they can’t appear all by themselves in a logarithm, because you can find a log for the number 2, but not for 2 meters. A log can depend on a distance, but only if it is the distance measured in some unit, so that it is really the logarithm of a ratio, like

*log (2 meters/1 meter).*

But log*(2 meters/1 meter)* is the same number as log*(2 miles/1 mile)*. Thus, if we change both the scale of the quantity (2 meters) and the scale of the unit of measure (1 meter) in the same way (meters to miles), we don’t change their ratio. We say that such a logarithm of a ratio is “scale invariant.” For most purposes a mile is very different from a meter, but in applications of the laws of nature, some measurable quantities can be scale invariant, at least approximately. In fact, scale invariance is a signal for the quantum field theories of the Standard Model. To see this, let’s talk about particles.

In the world of classical physics, we think of an isolated particle as stable in time. If the particle experiences a force, its motion will be changed, and Newton’s laws tell us just how this happens. Our classical particle knows about forces, but experiences them only if they are applied from the outside, or if the particle happens to move into a region where these forces act.

In the quantum view, things are much more lively. There is no such thing as a truly isolated particle. All quantum systems are restless — any particle will at all times experience all the forces to which it is sensitive, emitting and reabsorbing other particles that are associated with those forces. For electrons, the force is electromagnetism, and the associated particle is the photon. For quarks it’s the gluon of the strong interactions. This really means that a picture of nature in terms of isolated particles is inadequate, and that the “true” particles are more complex combinations of all these possibilities. Still, it’s convenient to think of an electron, for example, as emitting photons of specific energies, one at a time.

Such an emission changes the status quo, or “state,” consisting of a single electron to a new state, in which there is an electron and a photon. This state, however, has more energy than the original one. Classically, this forbids the whole process, because energy must be conserved, but in quantum mechanics such a state with the wrong energy can last for a finite amount of time, given on average by the ratio

T = (Planck’s constant) ⁄ (Photon’s energy).
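To get a feel for the time scales this formula implies, here is a back-of-the-envelope evaluation in Python; the photon energy used is an illustrative value for visible light, not a number taken from the article:

```python
# T = (Planck's constant) / (photon's energy)
h = 6.626e-34        # Planck's constant, in joule-seconds
E_photon = 4.0e-19   # roughly the energy of a green-light photon, in joules (assumed)

T = h / E_photon
print(T)             # on the order of 1e-15 seconds, i.e. femtoseconds
```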

So, an electron is never really alone; it always populates its neighborhood with such “virtual” photons, and we describe this situation as a “virtual state” of the electron.

Do virtual states really exist? Yes, we have good evidence that they do. Imagine that an electron in a virtual state is suddenly deflected by a force. For example, it might absorb a photon from another source. This can happen when our electron passes close by the nucleus of an atom. The process is illustrated in Figure 2, where the electron, represented by a straight line, comes in from the left, and emits a “virtual photon,” represented by a horizontal wavy line.

*Figure 2. Feynman diagram for the Bethe-Heitler production of a photon.*


Before it encounters the nucleus, the combination of the electron and photon doesn’t have the right energy to be “real.” (If it did, isolated electrons could emit photons, providing a kind of perpetual motion machine!) The energy is adjusted when the electron passes by the atomic nucleus and exchanges a second virtual photon (the vertical wavy line) with the nucleus (the black circle), so that both the first virtual photon and the electron now have just the right energy for their momenta, and off they both go on the right of the figure, in a state that can last indefinitely. When this happens, the process is called the “Bethe-Heitler” production of a photon. This is basically how X-rays are produced for medical imaging and other purposes.

In the Bethe-Heitler process, the nucleus in effect “shakes loose” the photon from the electron, and the two separate, as would any other virtual partners that happened to be “in the air” of the particular virtual state at the particular time the electron passes by the nucleus. A photon is, however, by far the most likely virtual partner, because it can carry just a little energy, thus occurring in virtual states that can last a long time.

A picture like Figure 2 is sometimes referred to as a “Feynman diagram,” which illustrates quantum-mechanical processes like this one. These diagrams help us picture such quantum processes intuitively, and they also come with rules that tell us how to turn the intuitive picture into quantitative predictions. How likely is the Bethe-Heitler process, and how does it depend on the energy of the virtual state? This depends on the theory and on the energy transfer. The rules for Feynman diagrams in each theory are used to compute the answers.

Famously, quantum mechanics makes predictions in terms of probabilities, numbers that have to be somewhere between 0 and 1, where 1 indicates “a sure thing” and 0, “not a chance.”

The probabilities that tell us how elementary particles behave in isolation or in collisions actually depend on the number of dimensions in our world, because whatever its energy, the virtual photon could move in any direction, and the range of possible directions depends on how many dimensions there are. This is where the logarithms come in. When we calculate the probability of finding an electron in a state with a photon of energy *E*, summing over all directions in three-dimensional space, the result behaves like *1/E*, the telltale sign that the total probability is a log. The same goes for finding a quark accompanied by a gluon, in the theory of the strong interactions that is part of the Standard Model.

This *1/E* dependence means that the total probability for the Bethe-Heitler production of photons between any two energies is given by the logarithm of the ratio of those energies. Suppose, then, we ask for the probability for encountering photons in a range of energies, say from *E*_{min} to *E*_{max}. For a single photon the answer is of the form

*Probability to find one photon* = *F* × α^{2} log(*E*_{max}/*E*_{min}),

where *F* depends on the velocity and direction of the electron before and after the collision, and α is a number called the “electromagnetic coupling.” It is a measure of how easily virtual states change one into another. It is one of the fundamental constants in nature. Its value is small, about 1/137. It is a “pure number” with no units, and if it were not, the energy dependence would have been different.

The Bethe-Heitler logarithm exhibits the property of scale invariance — if we multiply the smaller and larger scales by the same number, we get the same probability. The probability of finding a photon emitted between (say) 2 electron volts and 4 electron volts is the same as between 4 electron volts and 8 electron volts. This feature also appears, at least approximately, at much higher energies for all the components of contemporary particle physics, that is, the Standard Model. Approximate scale invariance is a direct result of having dimensionless coupling constants and of being in four dimensions (three space, one time).
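This scale invariance is easy to verify numerically. A minimal sketch in Python, with the energies in electron volts and the overall factor *F* × α^{2} omitted since it cancels in the comparison:

```python
import math

# The Bethe-Heitler probability between two energies is proportional to
# log(E_max / E_min), so it depends only on the ratio of the energies.
def emission_log(e_min, e_max):
    return math.log(e_max / e_min)

print(emission_log(2, 4))   # 2 eV to 4 eV
print(emission_log(4, 8))   # 4 eV to 8 eV: same value, log(2)
```

Doubling both limits, or converting both to any other unit, leaves the logarithm unchanged.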

So far so good, but there is a surprising consequence of this picture. After all, the electron isn’t always so unfortunate as to pass by a nucleus while its photon companion, of energy *E*, is in a virtual state. Left undisturbed, after a time proportional to *1/E*, the electron reunites with its virtual friend. But using just the same method, we can calculate the probability for all such “unseen” photons between two energies, *E*_{min} and *E*_{max}, to be emitted and then absorbed. When we do, we get the same log(*E*_{max}/*E*_{min}) as in the Bethe-Heitler process. But if we don’t see these virtual photons, there really is no limit to their maximum energy, *E*_{max}, and we end up with the logarithm of infinity, which is itself infinite. This is a property shared by all the quantum field theories in the Standard Model; they all have infinities, which emerge through logarithms, from the hidden lives of their elementary particles. In effect, the electrons of the Standard Model are spendthrift with energy, loaning it freely to their virtual photons with no limit. And yet, these loans are always repaid. Technically, this problem is avoided by a method known as “renormalization,” which can be applied successfully to theories with dimensionless couplings. Renormalization “hides” whatever it is that keeps virtual photons from carrying infinite energies to and from their parent electrons.

After renormalization, approximate scale invariance has a tendency to isolate low energy phenomena from the influence of whatever lies at higher energies or shorter distances. In fact, the Standard Model may turn out to be difficult to improve upon, simply because its approximate scale invariance makes its predictions self-consistent up to energies far beyond those directly accessible to accelerators. The emphasis here is on the word “may,” because this scale invariance may fail past a certain energy, indicating the presence of new fixed energy scales beyond those we know today.

Einstein felt that dimensionless constants ought not to be a basic ingredient in a truly fundamental theory[1], and over the years the logarithmic infinities associated with them have displeased many illustrious physicists. The very first natural force law, Newton’s law of gravity, is defined with a dimensionful number, known as “Newton’s constant.” The most influential modern “completion” of the Standard Model including gravity is string theory, in which the basic constant is the string tension, a quantity with dimensions, related to Newton’s constant. The logarithms at the heart of the Standard Model may yet yield to compelling experimental or theoretical evidence for new, dimensional scales that can tell a meter from a mile, or a gram from a ton.

[1] Ilse Rosenthal-Schneider, “Reality and Scientific Truth: Discussions with Einstein, Von Laue, and Planck.” Wayne State University Press (1980), pp. 36, 41 and 74.

*Recently there has been a discussion among mathematicians, as well as in the press and several blogs, covering developments in symplectic geometry. Professor Fukaya expressed interest in giving his opinion and we are happy to present it here:*

The set of the solutions of the equation *x*^{2} + *y*^{2} – *z*^{2} = 0 has a ‘singularity’ at the point (*x,y,z*) = (0,0,0). On the other hand, *x*^{2} + *y*^{2} – *z*^{2} = -1 has no such singularity *(see the figure)*. Singular objects, such as *x*^{2} + *y*^{2} – *z*^{2} = 0, are very popular in algebraic geometry, the branch of mathematics that studies the spaces defined by algebraic equations.

By contrast, it has always been difficult to include ‘singular spaces’ in differential geometry or topology. A ‘manifold’ is the main object of research in those fields. The notion of a manifold goes back to Riemann, who introduced the concept as a space which locally looks like Euclidean space everywhere. In other words, a ‘manifold’ is a space that has no singularity. It seems to me that there is still no reason to change this situation and shift the main focus of the field to the study of singular spaces. However, in order to study manifolds in differential geometry or topology, it also becomes important to study certain ‘singular spaces’ as a tool.

In the 1970s, studying nonlinear differential equations on a manifold became an important part of differential geometry. In the 1980s it became important to study not only an individual solution of a nonlinear differential equation, but also the set of all solutions as a whole. Such a set is called a ‘moduli space’. Using moduli spaces in differential geometry or topology was especially successful in two areas. One is the mathematical study of gauge theory and its application to the topology of low dimensional manifolds (such a study was initiated by S. Donaldson), and the other is the theory of pseudo-holomorphic curves in symplectic geometry (which was initiated by M. Gromov). Symplectic geometry is an area that started as a geometric study of the equations of classical mechanics.

Moduli spaces appeared in algebraic geometry much earlier than in differential geometry. When the foundation of modern algebraic geometry was built by the works of A. Weil, O. Zariski, A. Grothendieck, etc., the study of moduli spaces was one of the important reasons why extremely singular objects were included among the spaces to be studied.

In the differential geometric study of moduli spaces, people need to extract certain algebraic information from moduli spaces. The simplest information to be extracted is the number of points of such a space. Later on people started to extract more sophisticated information from these spaces. A notable example is A. Floer, who found several versions of ‘Floer homology’, in which he obtained groups rather than numbers from moduli spaces. With time, adding more structure to Floer homology became an important area of research and produced many applications.

To ‘count’ the number of points of a moduli space is actually a tricky problem. The number of solutions of the equation *x*^{2} = 0 is naively one, since 0 is the only solution. However, it is more natural to regard it as two, since for any ϵ ≠ 0, the equation *x*^{2} = ϵ has exactly two solutions. In the world of algebraic geometry such a way to ‘count’ the number of points is known as ‘intersection theory’, and is closely related to the foundation of algebraic geometry. In the realm of differential geometry, singular objects are harder to study.
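The *x*^{2} = ϵ example can be made concrete in a short Python sketch: for any nonzero ϵ the quadratic has exactly two roots, which merge toward each other as ϵ goes to zero while their count stays two.

```python
import cmath

def roots_of_x_squared_equals(eps):
    """Return the two solutions of x**2 = eps (counted with multiplicity)."""
    r = cmath.sqrt(eps)   # cmath handles negative or complex eps as well
    return [r, -r]

# Always two roots, approaching each other as eps -> 0.
for eps in (1.0, 1e-4, 1e-8):
    print(eps, roots_of_x_squared_equals(eps))
```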

In the 1980s people used a rather ad hoc method to find the correct ‘count’ of the number of points of a moduli space.

When the study of moduli spaces in symplectic geometry made much progress, it became necessary to find a more systematic way of performing such a ‘count’. In 1996 several groups of mathematicians found one. We (K.F. and K. Ono) were one of them. Other groups included G. Tian, J. Li, G. Liu, Y. Ruan, and B. Siebert. This method is now called the ‘virtual technique’.

By this method, two of the important problems of the field were solved. One is Arnold’s conjecture about the number of periodic solutions of Hamilton’s equation, an ordinary differential equation appearing in classical mechanics. The other is the construction of the Gromov-Witten invariant, which is a basic invariant in the ‘topological version’ of string theory. When a certain Floer homology is nonzero, it implies the existence of a periodic solution of Hamilton’s equation. The Gromov-Witten invariant is a ‘count’ of the number of solutions of a differential equation, the non-linear Cauchy-Riemann equation.

These two problems had previously been solved under certain additional assumptions. With the new way to ‘count’ the number of points, it became possible to solve them in complete generality, in 1996.

When we found it, I believed this method would become a basic tool of the field. During the years 2000-2010, several important works used our version of the virtual technique, which we called Kuranishi structure. J. Solomon’s and Melissa Liu’s important PhD theses both studied ‘open string analogues’ of the Gromov-Witten invariant in two different situations and used Kuranishi structures. Ono used it to solve the Flux conjecture, a famous open problem in symplectic geometry. Fan-Jarvis-Ruan built an important new theory, which they called ‘quantum singularity theory’. In the technical part of their theory, they used Kuranishi structures.

However, using Kuranishi structures did not become the standard of the field. Y. Eliashberg, A. Givental, and H. Hofer proposed a theory which they called Symplectic Field Theory. It uses the same kind of moduli spaces for its foundation. Hofer, together with K. Wysocki and E. Zehnder, was building a version of the virtual technique. They called it Polyfold. In those days, various people working in symplectic geometry mentioned on various occasions that Polyfold theory would soon be complete, becoming the standard by which all the previous approaches would be replaced.

I thought there could be many different approaches to establishing ‘the foundations of symplectic geometry’, each with its own advantages. On the other hand, for us (myself and my collaborators Y.-G. Oh, H. Ohta and Ono) the only way to persuade people of the importance of our approach was to continue working and to produce more applications.

As I mentioned, adding more structure to Floer homology was an important direction of our research. For this purpose, we needed to improve our method so that it could be safely used in more difficult situations. We were working on ‘Lagrangian Floer theory’, which is a version of Floer homology related to ‘open strings’ and ‘D-branes’. Our study was completed in 2009 and we wrote a two-volume research monograph. Soon after that, this theory was generalized by M. Akaho and D. Joyce. Joyce was not satisfied with our version of the virtual technique and started a project to rewrite it. His way had various advantages compared to ours (and ours also had various advantages compared to his approach).

We continued working on Lagrangian Floer theory. Our book in 2009 provided a foundation of the theory, but it did not contain many concrete calculations of the Floer homology we produced. For example, the notion called ‘bounding cochain’ is a major player in our theory. However, in 2009 the only example of a useful bounding cochain we knew was 0. Later on, we found the first example of a bounding cochain which is far from 0 and is useful in the study of symplectic geometry. It is a solution of the equation *x*^{3} – *x* – *T*^{α} = 0 and is a complicated power series in *T*. A bounding cochain is a parameter that deforms Floer homology. We found that, only at that particular value, the Lagrangian Floer homology of a space (called *CP*^{2} # (–*CP*^{2})) becomes nonzero. I was very happy when we found that the abstract notion of a ‘bounding cochain’, which is difficult to define and hard to calculate, actually has highly nontrivial examples and is useful in symplectic geometry.

We thought that those generalizations and applications clarified the importance of our version of virtual technique in symplectic geometry.

Around that time, some people who had been ignoring our results, started asking us mathematical questions directly and suggested that there was a gap in our work. I was very happy to hear that, since serious mathematical communications with those people became possible at last, 16 years after we found it. A Google group called ‘Kuranishi’ was started in 2012, whose moderator was Hofer. There D. McDuff and K. Wehrheim posed several questions concerning the detail of our approach to virtual technique. We stopped our research on applications and concentrated on answering their questions in as much detail as possible. We replied to all of their questions. After 6 months no more questions were asked and the Google group was terminated.

I moved from Kyoto to the Simons Center for Geometry and Physics (SCGP) in that year. In 2013-2014, together with McDuff and J. Morgan, I organized a full-year program at the SCGP on ‘foundations of symplectic geometry’. My motivation in organizing this program was to give people an occasion to present objections or questions to the various approaches to the virtual technique. We had two conferences, two lecture series, and many seminar talks. Many people visited the SCGP during the conferences or during other periods of the program. During the program, Solomon presented an example related to a certain issue in our approach. We wrote a paper to clarify this point,[1] which was recently published. Other than that we did not hear objections to our approach.

Hofer and Joyce gave a series of interesting talks presenting their approaches. There were various other approaches appearing around that time. Tian, together with B. Chen, wrote a paper around 2005 continuing his way of studying the virtual technique. Chen, together with B. Wang, also gave a talk on their approach at the SCGP during our program. One difference between their approach and ours is that we reduce problems to finite dimensional geometry, while they work directly in an infinite dimensional setting. D. Yang studied the relation between our approach and Polyfold theory. J. Pardon wrote a paper that puts more emphasis on the algebraic side of the story. I think all of these different research methods contain various new and significant ideas.

The whole construction of the ‘virtual technique’ consists of three steps. We start with a nonlinear differential equation: Analysis. We then obtain some ‘singular space’ and study it: Geometry. Finally, we produce some algebraic structure: Algebra. If one works harder in one of these three parts, then less work is required in the other two. In the Polyfold approach, people work harder in analysis, and so less in geometry and algebra. In Pardon’s approach, he works harder in algebra, and less in analysis and geometry. In our approach and in Joyce’s, we work harder in geometry, and so less in analysis and algebra. The difference between our approach and Joyce’s is that we study a ‘singular space’ in a way closer to a ‘manifold’, while Joyce studies it in a way closer to a ‘scheme or stack’, notions appearing in the foundation of algebraic geometry.

I think that, depending on mathematical taste and background, various researchers have different opinions on which version of the virtual technique is easier to understand and use. This is one reason why I think it is useful for the various approaches to be worked out in detail, so that each researcher can choose their favorite one.

I think that at this stage, in 2017, it is becoming a consensus of the majority of the researchers in the field that, for the purpose of proving Arnold’s conjecture and constructing the Gromov-Witten invariant, all of those approaches will work. (The disagreement is mainly on when, where, and by whom this was completed. This is not related to mathematics, and further discussion would be coarse and vulgar.)

I am afraid to say, however, that for the more advanced parts of the virtual technique, such as those we have been developing from 2000 to the present, a consensus on their rigor, soundness, or cleanness is still missing. For example, McDuff and Wehrheim, in a paper (arXiv:1508.01560v2, page 10), said that their version of the ‘Kuranishi method’ is applicable only to the Gromov-Witten invariant. In particular, they denied its applicability to Floer homology. The purpose of much of my research since 2000 has been to improve our version of the virtual technique and widen the scope of its applications. Various people are now working on research in symplectic geometry and related areas, such as Mirror symmetry, using various versions of the virtual technique. I firmly believe that most of that research is based on sound, rigorous and clean foundations.[2] Unfortunately, this is not a consensus of the majority of the researchers of the field.

Together with my collaborators, I am trying to do my best to change this situation. I believe this effort contributes to the sound development of the field.

In this article I have compared ‘the foundations of symplectic geometry’ to ‘the foundations of algebraic geometry’ several times. While I do think that symplectic geometry is as important as algebraic geometry, as regards the foundations of the subjects at this time, those of algebraic geometry have existed longer and are broader. The foundations of symplectic geometry are parallel to the part of the foundations of algebraic geometry dealing with moduli spaces. One very important aspect of ‘the foundations of algebraic geometry’ is its application to number theory. There is nothing comparable to it in ‘the foundations of symplectic geometry.’

My dream is that in the future, virtual techniques will make some serious contribution to establishing the mathematical foundations of quantum field theory. If this dream comes true it could be comparable to the application of ‘the foundations of algebraic geometry’ to number theory. One could then say that ‘the foundations of symplectic geometry’ are comparable to ‘the foundations of algebraic geometry’. I believe there is a significant possibility that in the future this will actually happen.

[1] Shrinking good coordinate systems associated to Kuranishi structures, J. Sympl. Geom. 14 (2016).

[2] When we wrote the book on Lagrangian Floer theory in 2009, we made a few corrections to the definition of Kuranishi structure in our 1996 paper. However, none of its applications is affected by these corrections. The definition we use now (2017) is equivalent to the one in our book of 2009.

*Director of the C.N. Yang Institute for Theoretical Physics*

On October 9 and 10, the C.N. Yang Institute for Theoretical Physics (YITP) at Stony Brook University celebrated its fiftieth anniversary with a symposium of talks by faculty and returning alumni, held at the Simons Center for Geometry and Physics. The Institute for Theoretical Physics was founded in 1966, when Chen Ning Yang came to the then little-known Stony Brook University, providing it instantly with an international profile. The concept of an ITP (renamed YITP after Yang’s retirement in 1999) was developed by the Physics Chair Alec Pond, by faculty member Max Dresden, and by Stony Brook President John Toll. Early faculty included Ben Lee, Gerald Brown, Ernest Courant and Max Dresden. Its first postdocs were William Bardeen, Hwa-Tung Nieh, Michael Nieto and Wu-Ki Tung, each of whom went on to make important contributions in particle physics, as did many of their successors into our own era. Bill Bardeen was among those who returned for the event.

The Institute is associated with milestone advances in gravity, including the discovery of supergravity; in particle physics and quantum field theory, including early work on the renormalization of gauge theories, neutrinos, and QCD collider theory; and in statistical mechanics, including the Yang-Baxter equation and solvable models. Over five decades, YITP faculty have supervised approximately 250 doctoral graduates, and hosted nearly one hundred postdoctoral fellows, in addition to numerous visitors. Among doctoral alumni, Luis Alvarez-Gaume, long of the Theory Department at CERN, has taken on the role of Director of the Simons Center for Geometry and Physics, whose physics faculty are also members of the YITP. Other returning participants with current leadership roles included alumni Eric Laenen, head of the Theory Group of the Dutch National Institute for Subatomic Physics (Nikhef), and Kostas Skenderis, Director of the Southampton Theory, Astrophysics and Gravity Centre, and former postdocs Stephen Libby, Leader of the Theory and Modeling Group at Lawrence Livermore Laboratory, and Jianwei Qiu, Associate Laboratory Director for Theoretical and Computational Physics at Jefferson Laboratory.

The symposium involved short reviews by current faculty, covering much classic material, combined with a variety of talks from returning participants on their experiences and current work. Reports on new results included those by Bernard de Wit (Utrecht/NIKEF) on the construction of N=4 superconformal theories, and by Shoucheng Zhang (Stanford) who spoke about topological insulators. At the end of the second day, the symposium concluded with a public lecture by alumnus Ashoke Sen (Harish-Chandra Institute) titled “What is String Theory?”.

At the symposium dinner, participants saw a short video by the YITP’s founder, C.N. Yang, who now lives in China. Yang recounted that in 1966, “the opportunity and challenge proved irresistible” to found the ITP and take part in “building up a new research university” at Stony Brook. Referring to his years at Stony Brook as a “second career,” after the Institute for Advanced Study, he recalled the origins of supergravity and informal lectures from Jim Simons, which eventually “contributed to the increasingly close contact between the world communities of physicists and mathematicians.” At Stony Brook, this contact is facilitated by the Simons Center, and Jim Simons, founder of the Center and former chair of Mathematics at Stony Brook, was present at the dinner to recall his many conversations with Yang that led to those lunchtime lectures, and their friendship over the years. Yang concluded his video message by stating that, “at this celebration of the first 50 years of the ITP we can look forward to the next 50 years of close and fruitful collaboration between the ITP and the Simons Center, and we can also look forward to great advances in the unravelling of the fundamental structure of the physical universe.”

The symposium provided the chance for participants to meet with old friends and with new faculty, whose arrival has added to the breadth and depth of theoretical research at the YITP. In the past decade, seven faculty members have joined the Institute, beginning with Leonardo Rastelli, Director of the new Simons Collaboration on the Non-Perturbative Bootstrap, and followed by Patrick Meade, Rouven Essig and Christopher Herzog, all working in aspects of high energy theory; Tzu-Chieh Wei, in quantum information and condensed matter; Marilena Loverde, in cosmology; and just this year, Alexander B. Zamolodchikov, who has joined as the first C.N. Yang – Deng Wei Professor of Physics. The YITP is fortunate to be able to look with pride at the sweeping contributions of its past in research and in training, with a strong sense of momentum and optimism for its role in the future of theoretical science.

*Interview by Maria Shtilmark*

**Let us begin with Stony Brook where you obtained your PhD in 1981. How did your thesis and your advisor influence your future in theoretical physics?**

There was quite an influence! I came to Stony Brook at the end of August 1978, passed the qualifying exams in January, and started working at the C.N. Yang Institute for Theoretical Physics (then called the ITP). My idea was to work in supergravity. As an undergraduate, I had read a nice article on supergravity by Dan Freedman and Peter van Nieuwenhuizen. At the time there was a student exchange agreement between my university and Stony Brook, which had just started as Spain was coming out of the Franco period. So first I went to the army, and once I finished military service I was fortunate to be accepted here, and to start working after the qualifiers with Dan Freedman. That was a very nice experience; he was a wonderful advisor, and when he moved to MIT in 1980 I went with him. I finished my thesis at MIT, as I was working on projects that he suggested. The only issue was whether I should defend my thesis at MIT or at Stony Brook. At MIT they said that if I wanted to defend the thesis I would have to pass the qualifiers again. We thought this was nonsense, and my defense took place here, on March 12, 1981.

In fact, in December 1980 I was already granted a fellowship at the Harvard Society of Fellows, so fortunately, I had a postdoc position to go to, even if I didn’t pass the thesis. It is a society of scholars, and to become a Junior Fellow you were supposed to be good at what you do, but also be able to interact with people in other areas of academic activity, and a PhD thesis wasn’t even required. But I passed the thesis, and obviously Dan Freedman’s letters to the Harvard Society of Fellows were important for my access to this privileged position. Dan Freedman has always been a great support throughout my career. I am in great debt to him.

**It is fascinating to learn about your service in the army! What was your rank?**

Second lieutenant. First, I was in a boot camp for three months, and then at a military academy close to Madrid. I went to the Artillery Academy, ending up as a second lieutenant, followed by six months of practice. And I was doing it while I was studying for my BA in Physics. At the time there were so many strikes at the university that I could only take a few real courses. Either students or professors were always on strike. So, I worked at home and decided I would complete my military duties (as military service was mandatory at the time) while also working on my university degree. I had two six-month breaks, in the fourth and the fifth year of studies. Fortunately, I finished the army at the same time as I finished my BA in physics. At the time physicists and mathematicians were sent either to naval or artillery academies. At least, they trusted we could do some ballistics.

**I noticed that a lot of your interviews are in Spanish, and you are greatly involved with science in Madrid. How important is it for you to come back to Spain and support science?**

Ever since my PhD I’ve been visiting Spain regularly. I even took part in the construction of the Institute of Theoretical Physics at the Autónoma University in Madrid. I did my best for this to happen, but of course the locals had the biggest responsibility and credit. I went there frequently to lecture and teach graduate courses. When I was at CERN I directed some theses of students coming from Spain. I felt it was my duty towards the young people of my country to help them get an opportunity. I’ve had quite a number of Spanish students, and many are professors in Spain, or elsewhere. At CERN we can’t grant PhDs, but we can share students with member states, and here at the Center we can probably share students with the faculty in the Mathematics or Physics Departments. This way we are not officially PhD directors, but morally we are co-directors.

**Since CERN has been brought up − During your time at CERN what was the most exciting discovery there, in your opinion, the one that meant the most to you and was special for your work?**

In a sense, there are positive and negative discoveries. I think the best discovery of all has been the fact that now we all talk about the standard model of particle physics. Of course, many laboratories, e.g., SLAC, Fermilab, Brookhaven, CERN, etc., have contributed to that. But LEP (the Large Electron-Positron Collider) set the standard model on its feet. It allowed for precision measurements of essentially all aspects, except for the Higgs. The interesting thing is that because it is a machine based on leptons, electrons and positrons, you could actually predict the mass of the top quark, and interestingly enough, that information was passed to Fermilab, which eventually made the discovery of the top quark around 1994-95. This information was important because it is a very difficult measurement with a large background, so for Fermilab that was a very useful piece of information. And finally, although LEP was built, among other things, to try and discover the Higgs particle, the scalar particle of the standard model, we had to wait until 2012 for another machine, the LHC, to discover it. So, in a sense I think that the work of the last nearly 20 years at CERN has been to vindicate and to close the standard model. Then, of course, there is pushing the boundaries for dark matter, pushing the boundaries for supersymmetry, but I think the important legacy of these 20 years has really been to leave the standard model standing on good ground. And to see that happening was very exciting.

**Could we talk about your book, “An Invitation to Quantum Field Theory”1? The goal is very well explained — “We have selected representative topics containing some of the more innovative, challenging concepts, presenting them without too many proofs or technical details;” “the book tries to motivate the reader to study QFT, not to provide a thorough presentation;” and you also “focus on some conceptual subtleties,” and you mention realization of symmetries in particle physics. Are you happy with the results in general, and of this particular focus on symmetry?**

There are topics in QFT that somehow pass from one book to the next without any critical revision. Then some archetypes, or mantras, are created, and the reader is not asked to think thoroughly about some of those concepts. Unfortunately, this has led to some misconceptions propagating in time. Some of the deeper aspects of QFT, such as discrete symmetries, e.g. the breaking of parity, matter-antimatter asymmetry, CP violation, and CPT, one of the holy grails of the subject, are not explained in enough depth; the approach to them is often more of an engineering approach. We believe it is possible to explain to the reader that CPT symmetry follows from three simple things, which are properties of our current knowledge: special relativity, quantum mechanics, and locality. These three things are enough to prove the CPT theorem and to study its consequences in particle physics. And this is one of the reasons we came up with the book: we don’t have to use the engineering approach with the symmetries. You can understand them at a deeper level from fundamental principles in physics. The interplay between symmetries and conservation laws, and the properties of broken symmetries, are often presented in ways that we did not find satisfying. We were trying to explain not the standard recipes, but the underlying concepts behind them.

**Please, tell us a few words about how your article on Gravitational Anomalies, written with Ed Witten, came about, as it made a big impact on string theory.**

That was when I was a Junior Fellow at Harvard. You could spend one year wherever you wanted, and Edward was kind enough to invite me to spend a year at Princeton. Edward had just come from a conference, I think it was in Texas, where he had been speaking with Sir Michael Atiyah precisely on this type of issue, so he drew me into this work. Clearly, at the time I was fairly ignorant about anomalies, so it was a great privilege to be working with Edward and to complete this work with him. We showed that type IIB supergravity in ten dimensions was anomaly free through a rather remarkable cancellation of different contributions. Somehow we did not study other possible cancellations in more detail, and we missed what is called the Green-Schwarz anomaly cancellation mechanism. A real pity. We also worked on different things, like the descent equations and more formal mathematical structures of anomalies, and we missed, or at least I did, the possibility of exploring other anomaly cancellations, the ones that eventually led to the formulation of the heterotic string. Of course, working with Edward Witten is always an awe-inspiring experience.

**Let us return to Stony Brook — what are your memories from your student years here?**

At the time graduate students were living in a place called Stage 12, not what you would call a ghetto, but something different from a normal American campus. It was a very international community. Many people made successful careers, like Ashoke Sen, who came in the same year as I (1978), Sunil Mukhi, Rohini Godbole, Rabindranath Akhoury, Ergin Sezgin… it was serendipitous to have all these people in the same place at the same time, and we really enjoyed being together. It felt like an extended family away from home. We shared life together, not only scientific interests.

At the time I was married, and my first son was born, so as soon as my family joined me I had to leave campus, because children were not allowed on campus housing. With the immense salary we were getting as teaching assistants, we had to share a house with an Italian mathematician and an American physicist. We used to call the place where we lived “Selden Horror.” The house was in really bad shape, and we could pay a good fraction of the rent by remaking the house, so we spent one year doing that, from the plumbing to the roof, new floors, etc. We really re-built the house, and eventually the owner sold it for a very good price. There was also a lot of conflict between different clans, or families in the area, but fortunately, we got along with them well.

For Christmas we would go back to Spain to see the family (of course the family would send us tickets as we couldn’t afford them). Before my trip, I went to one of the clan leaders, who was our neighbor, and I asked him if he could keep our keys and keep an eye on the house while we were away. He was so surprised that someone would put so much trust in him that he was our protector for the rest of our stay. People were very friendly and nobody disturbed us…

**How does it feel to be back?**

It feels like coming home. Many professors that I worked with as a teaching assistant are still active, and I have plenty of fun memories. I was here at the beginning of my scientific life, then I was away for 30 years, and now, at the dusk of my life, I have come back. It feels like opening a new cycle, and it feels nice. It is a homecoming, in a sense. The place has changed, but not that much. And I have known the people who are at the Center for many years, from outside the Center. Now I also need to get acquainted with the University administration, something that one usually does not do as a graduate student.

**Could you share your vision of the Simons Center’s place on the international map of the similar institutions?**

I think that the Simons Center is unique in many respects. Clearly, one aspect is service to the community (programs, workshops), and that works extremely well. It is very well-financed and very well-organized. The staff of the Center are great. They are a small team of people who are very efficient and very motivated. This service component has been very successful, and will continue to thrive in the future. I believe we need to strengthen the research that is produced at the Center itself. We want it to be not only a great place for meetings, workshops, and discussions, but also one of the top centers for high-quality research. So far great people have been hired, hence the initial conditions are excellent. But, given that we are a small group, it is important to share common goals with the Math Department and the Physics Department, especially the YITP. I think it is important to see how we can all collaborate, like a symbiosis, and how we can use the Center’s resources and facilities to help in the common plan. There are various ways to go about it, and I have a number of ideas on how to proceed, but we will see. Last May I presented some of these ideas to the Board of Trustees, and their reaction was quite positive. So, I think it’s possible to implement them and take the Simons Center to the next level. I think we’ve reached cruising altitude, but now we have to determine the height.

**I love the metaphors you use — is literature important to you, the written word?**

Very important. Literature has always been my passion. You can’t work, or at least I can’t, only on physics, because you become less efficient. It is always good to engage in creative or artistic activities that allow you to relax and return to your research problems with renewed energy. And for me these have always been literature and music. People who do research are very obsessive, almost neurotically so. And you need some strong attraction to remove your attention from what you’ve been working on for weeks, months, maybe years. For me literature has such a power of attraction, along with music. Some of my favorites are Russians, like Dostoyevsky, Tolstoy, Lermontov, Pushkin, and Bulgakov and Andreyev among the more modern. From the Spanish I would name the classics, and more modern ones, such as Borges, Cortázar, García Márquez, Vargas Llosa. My all-time French favorite is Balzac. I also enjoy reading essays, biographies, and history books.

**Do you prefer an e-copy or a paper copy?**

The good thing about e-copy is that I can travel with it. But I like to touch the book. It’s a good feeling to have it in your hands and to be able to browse it. So I buy both, hard copy and e-copy. The physical feeling of satisfaction of touching a book can’t be obtained by electronic media yet, but of course it allows you to carry thousands of books around in a very small device. Or when you can’t sleep, you can read without disturbing your partner.

**I also know that you play piano?**

I do my best. I started playing music, or shall I say learning music when I was 40. I didn’t know how to read music before, so I do my best, but age does not help…

**This is very unusual and shows very broad talent.**

Perhaps more obsession than talent. When I am at home we always have classical music playing. There are pieces that you know essentially by heart, note by note, and to play them in the background while you work is very soothing; it helps me concentrate. When I am a little distracted I can latch on to the music, and because I know how it unfolds, it allows me to focus. But after listening to music so much I developed this yearning to play it, and I said “before I am 40 I have to start,” as after 40 your brain begins to lose plasticity at a higher rate. I started to learn music theory, then took playing lessons, and eventually a wonderful professional pianist at the Geneva Conservatory decided to take me as a student. She found it amusing that this old guy had so much interest. I was one of the few non-professional students she had. I took a 90-minute class weekly, and practiced at home for an hour every day. My aim was to play Beethoven sonatas, and I am getting there. Some are already accessible, and this gives me immense pleasure. I have tried pieces of varying difficulty from mostly classical composers. I haven’t gone into jazz; I am absorbed by music up to the beginning of the 20th century.

I have to look for a teacher now, and of course I would love to have access to a piano. Being able to play music is very important for me, as I will be alone a lot. Cinzia [DaVia] is going to be at the Physics Department 50% of the time, starting January 2017. Until then she will be going back and forth from the UK. She will be based in Manchester where she is a professor of Physics. So, playing music will keep me company.

**What upcoming programs at the Center are you especially looking forward to?**

I am interested in the program on Entanglement. I’ll have to see what happens with Mathematics of Gauge Fields, since most organizers are mathematicians… I fear the worst (laughs). They may not be discussing the aspects of gauge theory that I am most interested in, their quantum properties. I am certainly looking forward to a number of the others, like the one on Turbulent and Laminar Flows. The programs are great; as for the workshops, I will attend most of them, time permitting.

**You have already become an organizer of one — on gravitational waves.**

I am organizing The Universe through Gravitational Waves with an expert on numerical relativity, Vitor Cardoso, and one of our local experts on gravity and cosmology, Marilena Loverde. I think it will be good, not just for the mathematics, but for all aspects: how to detect gravitational waves, what we can learn from them, and the things about their sources that are going to surprise us for decades. In fact, in different ranges gravitational waves will open windows to the universe that are not accessible otherwise, e.g. the Big Bang. We could even nearly see how the Universe began. String Theory and Scattering Amplitudes is organized by two of my former graduate students at CERN, Katrin and Melanie Becker, so it will be fun to collaborate with them again. I will be involved, at least initially, in both of them, and then, depending on the time available, I will see how involved I can be. I think it is important to go to the opening talk. And I want to make sure that every time there is a workshop, there is also a colloquium, accessible to the Physics and Math Departments, which should highlight the reasons why the workshop is taking place, so that not only 30 specialists, but the Stony Brook community at large can appreciate it.

**What is your main acknowledgement of your predecessor, John Morgan, and the work that he did?**

John started the Center; under his mandate very good people were hired, and some of them are still here. All are accomplished and world famous. So, to get it started and to bring it to this level is already a great achievement.

*September 6, 2016*

*Interview by Maria Shtilmark*

**You are a theoretical physicist, but the topic of the Simons Center workshop you spoke at was “Automorphic Forms, Mock Modular Forms and String Theory” (Aug 29-Sept 2, 2016) which sounds, at least to me, more like abstract mathematics. How did you become interested in aspects of number theory?**

Well, my interest in certain aspects of number theory has come as a surprise to me. My background is in physics. I studied for my PhD in the Physics Department in Cambridge, and my initial interests in string theory were very much related to the experimental physics of the strong interactions, which is where string theory originated. At that time I viewed my mathematical training as a tool for helping to advance the physics that I was interested in. In those days, it would have seemed unlikely that I might talk to mathematicians, let alone take part in mathematical conferences. But over the years, as string theory developed, not only have I come to a greater appreciation of mathematics as a tool for understanding theoretical physics, string theory in particular; I’ve also come to appreciate why mathematicians do what they do. It’s always been a puzzle why the laws of physics can be reduced to elegant mathematical statements — as famously emphasized by Eugene Wigner, who called this the “unreasonable effectiveness of mathematics.” In recent years, as I have edged closer to the math community, I’ve been able to get a deeper sense of the elegance and the beauty of mathematics for its own sake. However, the language used by mathematicians is quite tough for a physicist to master, and it has been important for me that certain mathematicians are able to adapt their style in order to communicate with theoretical physicists. Of course, there are also certain theoretical physicists who have a natural ability to communicate in either of these two cultures.

String theory has evolved in spectacular ways, and now touches many areas that were not part of its initial purpose. And several of these diverse areas have deep connections with developing areas of mathematics, which is one of the reasons why the theory is so fascinating. So, my interest in string theory has followed a path that makes it impossible not to be interested in the kind of mathematics that is going on in this workshop, which is intrinsic to the structure of string theory in several ways.

**Having just spoken of the spectacular evolution of string theory, could you comment on how modern theoretical physics compares with the pre-war period, when it was more converged, unified? Is it harder for a present day researcher to grasp all areas of theoretical physics? **

It is of course extremely difficult to imagine what theoretical physics was like in the period before or just after the Second World War. Just in terms of the sheer number of people in the subject, it is a hugely bigger and more diverse field. Many subfields of theoretical physics have evolved into major fields in their own right. Recently, in the process of reviewing Dirac’s achievements, I studied some literature about what kind of physics was going on at the turn of the 20th century. Most of the research of that period has long been forgotten. The major advances in that era—the beginnings of particle physics, quantum mechanics, special and general relativity—were not only astonishing, but they involved a relatively small number of researchers. And now, of course, that is no longer true. It is impossible for any one person to follow the advances in all areas of theoretical physics. In many senses, research is a more communal activity.

One of the impressive things about string theory over the last 20 years is the way it has come to be viewed not so much as a theory of particle physics, which is where it started, but as a theoretical umbrella which contains the mathematical ideas needed to attack a very wide variety of problems in theoretical physics. So, there are people who call themselves string theorists, but who are attacking problems ranging from nitty-gritty condensed matter physics to early universe cosmology. Although many of these ideas are very speculative, it is imaginative speculation motivated by the rich structure of the theory. A key aspect of this structure involves symmetries, known as dualities, which relate apparently very different descriptions of the same system. The most powerful example of such dualities is the so-called “gauge-gravity correspondence.” This is a deep property of quantum gravity that identifies a theory of gravitational forces, which is described by string theory in a certain volume of D-dimensional space, with a non-gravitational theory on the (D-1)-dimensional boundary of that volume. This has suggested the possibility that some aspects of condensed matter systems, which are described by non-gravitational forces in three space dimensions, might be described in terms of properties of gravity in one higher dimension. Conversely, puzzling aspects of string theory, or quantum gravity, may be explained in terms of properties of the boundary non-gravitational theory. This idea has opened up a whole range of possible interconnections between what we would normally think of as very different disciplines.

It is intriguing that the same mathematical structure describes such different-looking physical systems. For example, a system of condensed matter at finite temperature is described within this holographic framework in terms of a black hole in an extra dimension, with the temperature identified with the temperature of Hawking radiation.

However, I don’t think that anyone really believes that when you are measuring the viscosity of a hot liquid you are looking at a black hole in higher dimensions. One should distinguish a real physical system from the mathematical framework that is used to describe it.

This evolution of string theory into a theoretical structure within which a wide variety of physical systems may be described has significantly broadened its scope. I sometimes think of an analogy with what was happening shortly after quantum mechanics was finally formulated in the 1920s. That was a dramatic breakthrough in fundamental physics, which had immediate applications to atomic physics. However, the subsequent development of quantum field theory had an even bigger impact. This involved the idea of quantizing the electro-magnetic and the electron fields. Although quantum field theory was invented as an explanation of quantum electrodynamics, it was not understood well enough to calculate any of its consequences for a long time. However, it quickly became a hugely successful tool for calculations in other areas of physics, such as condensed matter physics. Feynman diagrams, which came along later, became standard tools in quantum field theory that are useful for studying particle physics, but also a wide range of other physical phenomena. I like to think of this as an analogy to what string theory might do. String theory has a close relationship to quantum field theory — indeed in some sense it encompasses all interesting quantum field theories. From this viewpoint it is an overarching structure within which a range of interesting physical systems may be described.

Another series of recent developments in understanding the structure of string theory involve a reexamination of the role of quantum entanglement, which is a fundamental property of quantum mechanics that distinguishes it from classical physics. There are strong indications that this is at the heart of the black hole information paradox and may be an essential ingredient in understanding how space-time emerges from a more fundamental formulation of the theory. These ideas are still quite primitive, but the aims are very ambitious.

**You spoke of philosophy in physics, do you have an explanation as to why string theory causes certain controversy or denial?**

Some of my colleagues are disturbed by the fact that other theoretical physicists are antagonistic towards string theory. But I think it is fine for people to argue about its merits — scientific progress always involves argument. Actually, since string “theory” is not really a theory, there is no sense in which it can simply be “wrong.” However, it could turn out that it is not a useful approach to understanding theoretical physics. It has not yet been of direct use, but indirectly it has inspired some very novel ways of thinking about problems ranging from cosmology to condensed matter physics.

If people who understand enough about string theory don’t like it – that’s fine, although from my perspective they have bad taste! My problem is with people who are antagonistic to it without understanding its structure. Sometimes you get emails from people claiming to prove that Einstein was wrong. And then you discover they don’t actually know what Einstein said even though they know he is wrong. That is also often the case with people who attack string theory. But I don’t think there is a substantial antagonism to it among those who have studied it, other than a few individuals who enjoy publicizing their views. My US colleagues seem to view this as a problem but I don’t think this affects the way academic appointments are made in the UK. I don’t feel people fail to get jobs because they work on string theory.

**Is it true that your collaboration with John Schwarz started in the CERN canteen in 1979?**

Yes, it started in the CERN cafeteria in 1979, which illustrates a great virtue of CERN as a meeting point for people from all over the world. We had known each other earlier, when I was a postdoc at the Institute for Advanced Study in Princeton and John was an Assistant Professor at Princeton University, but in that period we didn’t interact much. In 1979 we were both at CERN for the summer, and we simply started talking about what we thought was interesting. It turned out we were both interested in the same thing, so we started to work on it – which led nowhere that first summer! But our interactions were sufficiently interesting that we decided to get together at the Aspen Center for Physics the next summer. Over the following couple of years he invited me to spend some months at Caltech, and one year he spent a term in London, where I was a lecturer at Queen Mary College. That was a wonderful place with a Head of Department (John Charap) who really appreciated the significance of research and made it possible for me to take leaves of absence, so I had a very flexible research schedule. In addition, the fact that both John and I were unmarried meant that we had quite flexible lifestyles, which was important in scheduling long absences from home.

**Speaking of collaborations—as I understand, one reason you came to the Center was to collaborate with Erik D’Hoker?**

Both Erik and I received invitations to this workshop, organized by Boris Pioline and Steve Miller (among others—M.S.), both of whom have been my recent collaborators. Steve is a number theorist from Rutgers who has been very important to me as a source of mathematical wisdom over the past few years. He is the first real mathematician that I’ve collaborated with. So, I wanted to come here because of the workshop. The possibility of continuing the project that Erik and I have been working on, together with Pierre Vanhove, who is also here, made it particularly timely to be here.

**What do you think of this new format for the Center, namely independent visitors?**

That’s a very good development. Last week, before the workshop started, was very different from this week, as we didn’t have a full program of talks. This allowed time for more intense one-on-one interaction, which has been very fruitful. But this week is interesting in another way, as there is such a diverse collection of experts at the workshop. This includes mathematicians whose talks I can’t always understand, but several of their observations have been very useful.

Putting mathematicians and physicists in touch when they have something to say to each other can be very fruitful. However, it can fail miserably unless an effort is made to communicate across the culture barrier. It is not very useful if they are incomprehensible to each other, even if they are great mathematicians or physicists.

**Yesterday Ken Ono, who had opened the workshop, presented the recently released film “The Man Who Knew Infinity,” on which he had worked as an Associate Producer and Consultant. It was a story of Srinivasa Ramanujan, one of the greatest mathematicians of the last century. Among many achievements, he introduced mock modular forms, one of the topics of this workshop, and he is famous for his work while at Cambridge. In your opinion, how authentic was this representation, did it render the atmosphere of the place, and how has Cambridge changed?**

I think the image of Cambridge in the early 20th century portrayed in the film is probably quite accurate — the formality of the place, and the all-male nature and mild racism. I don’t know that much about Hardy and Littlewood, but their lives in Trinity College evidently had some weird aspects. They were both bachelors who lived in the College, but communicated largely by sending letters to each other. Hardy would write a letter and a servant would come and pick it up and take it to Littlewood, living practically next door! That’s what you might call old-fashioned.

Cambridge has changed hugely since the days of Ramanujan. For a start, in Ramanujan’s time Trinity College was a male enclave, whereas now it has as many women as men. Until the early 1970s only three out of about thirty colleges were women’s colleges, so when I was a student men outnumbered women by about nine to one.

The film was a very worthy attempt to portray something which is almost impossible to explain to non-mathematicians – to give the public some insight into why mathematics is such an amazing subject. It was much better than several other recent films with scientific themes.

**Thank you very much for your time, and we hope to see you again soon at the Simons Center!**

Yes, and in conclusion I would like to thank the Simons Center for providing such an ideal environment for meeting other researchers in fields related to my own, whom I would not normally get a chance to meet.

*August 31, 2016*

*On November 17, 2016, Stony Brook University and the Simons Center for Geometry and Physics recognized long-time supporters, the Della Pietra family, by dedicating the Simons Center auditorium as the “Della Pietra Family Auditorium.” An event was held to honor the family — brothers Vincent and Stephen Della Pietra and their wives Barbara Amonson and Pamela Hurst-Della Pietra — and to acknowledge their many years of commitment to enhancing the University’s most innovative research and academic programs.*

The relationship between the Della Pietra family and the Simons Center for Geometry and Physics is a deep one, beginning in 2011 with a generous donation to fund the Della Pietra Lecture Series. Physicists themselves, the Della Pietra brothers, along with their families, started the series as a means to bring greater awareness of recent and impactful discoveries in physics and mathematics to the Long Island community, in particular to high school, undergraduate, and graduate students with an interest in the sciences. By attracting world-renowned scientists to the Simons Center for Geometry and Physics, the series enhances the intellectual activity of the Center and provides an extraordinary opportunity for both the campus and the local community to connect with brilliant minds. Each edition of the series consists of three separate lectures geared toward a specific academic audience: a technical talk for advanced faculty, a lecture for high school students, and a general talk open to the public. Over the years, the Center has been pleased to welcome many notable speakers, such as Brian Greene (Columbia University), John H. Schwarz (California Institute of Technology), Bonnie Bassler (Princeton University), and Nobel Laureates David Gross (Kavli Institute for Theoretical Physics) and Frank Wilczek (MIT), to name just a few.

“We are at the vanguard of new thinking in relation to the nature of the universe, resulting from the collaboration and collisions of ideas between some of the world’s top researchers,” said the Simons Center’s Director Luis Álvarez-Gaumé. “The Della Pietra family gift is a gratifying validation of our work, and we intend to continue generating a significant return on their investment in our mission.”

In addition to their steadfast support of the Simons Center for Geometry and Physics, the Della Pietra family has made many impactful contributions to the greater Stony Brook University and has invested more than $10 million across campus, making the family one of the University’s top 10 donors of all time. They’ve helped provision the Global Health Institute, the Staller Center for the Arts, the Laurie Center for Pediatric MS, the College of Arts and Sciences Teaching and Education Fund, the Children’s Hospital Task Force, a visiting Professorship in the Department of Music, a family endowed Chair in Biomedical Imaging, and countless student scholarships.

At the dedication event, faculty, staff, and Stony Brook University officials enthusiastically came together to pay tribute to this most deserving family. Among the many in attendance were Simons Center Founder James H. Simons, University Provost Michael A. Bernstein, Simons Center Board of Trustees members, and the Center’s newly appointed Director, Luis Álvarez-Gaumé, whose relationship with the Della Pietra brothers has come full circle, having begun many years ago when he was their Academic Advisor at Harvard University. At a time when support of higher education is of utmost importance, the contributions from the Della Pietra family have been invaluable, and this sentiment of appreciation was echoed throughout the evening. “We are humbled by the Della Pietra family’s confidence in Stony Brook. Their philanthropy is invigorating,” said Senior Vice President of University Advancement Dexter A. Bailey, Jr. “The Della Pietra Family Auditorium in the Simons Center is a permanent reminder of our gratitude and the impact the family has had on our campus community.”

For more information about the Della Pietra lecture series, or to view videos from past lectures, please visit: scgp.stonybrook.edu/della-pietra-lecture-series
