Monday, July 26, 2021

Farewell, Big Steve

Steven Weinberg, a giant of theoretical physics, passed on July 23, 2021 -- he was 88 years old. His best known work, for which he received the Nobel prize, proposed the unification of electromagnetic and weak forces, and formed a key component of the Standard Model of particle physics. But his lifetime of work ranged from cosmology to gravitation to quantum field theory to foundations of quantum mechanics. A brief autobiography.
Wikipedia: It is a story widely told that Steven Weinberg, who inherited Schwinger's paneled office in Lyman Laboratory (Harvard Physics department), there found a pair of old shoes, with the implied message, "think you can fill these?"
Indeed, it is true that almost no one on the planet could have filled Schwinger's shoes. But Big Steve did, and more.

 
Below I've reproduced a post from 2017, Steven Weinberg: What's the matter with quantum mechanics? 

The video of Weinberg's talk is from 2016, when he would have been 83 or so.



In this public lecture Weinberg explains the problems with the two predominant interpretations of quantum mechanics, which he refers to as Instrumentalist (e.g., Copenhagen) and Realist (e.g., Many Worlds). The term "interpretation" may be misleading because what is ultimately at stake is the nature of physical reality. Both interpretations have serious problems, but the problem with Realism (in Weinberg's view, and my own) is not the quantum multiverse, but rather the origin of probability within deterministic Schrodinger evolution. Instrumentalism is, of course, ill-defined nutty mysticism 8-)

Physicists will probably want to watch this at 1.5x or 2x speed. The essential discussion runs from roughly 22 to 40 minutes, so at 2x it's only about a 10 minute investment of your time. These slides explain in pictures.

See also Weinberg on Quantum Foundations, where I wrote:
It is a shame that very few working physicists, even theoreticians, have thought carefully and deeply about quantum foundations. Perhaps Weinberg's fine summary will stimulate greater awareness of this greatest of all unresolved problems in science.
and quoted Weinberg:
... today there is no interpretation of quantum mechanics that does not have serious flaws. 
Posts on this blog related to the Born Rule, etc., and two of my papers:
The measure problem in many worlds quantum mechanics

On the origin of probability in quantum mechanics

Dynamical theories of wavefunction collapse are necessarily non-linear generalizations of Schrodinger evolution, which lead to problems with locality.

Among those who take the Realist position seriously: Feynman and Gell-Mann, Schwinger, Hawking, and many more.

Thursday, July 22, 2021

Embryo Screening for Polygenic Disease Risk: Recent Advances and Ethical Considerations (Genes 2021 Special Issue)



It is a great honor to co-author a paper with Simon Fishel, the last surviving member of the team that produced the first IVF baby (Louise Brown) in 1978. His mentors and collaborators were Robert Edwards (Nobel Prize 2010) and Patrick Steptoe (who died before the 2010 prize was awarded). In the photo above, of the very first scientific conference on In Vitro Fertilization (1981), Fishel (far right), Steptoe, and Edwards are in the first row. More on Simon and his experiences as a medical pioneer below. 

This article appears in a Special Issue: Application of Genomic Technology in Disease Outcome Prediction.
Embryo Screening for Polygenic Disease Risk: Recent Advances and Ethical Considerations 
L. Tellier, J. Eccles, L. Lello, N. Treff, S. Fishel, S. Hsu 
Genes 2021, 12(8), 1105 
https://doi.org/10.3390/genes12081105 
Machine learning methods applied to large genomic datasets (such as those used in GWAS) have led to the creation of polygenic risk scores (PRSs) that can be used to identify individuals who are at highly elevated risk for important disease conditions, such as coronary artery disease (CAD), diabetes, hypertension, breast cancer, and many more. PRSs have been validated in large population groups across multiple continents and are under evaluation for widespread clinical use in adult health. It has been shown that PRSs can be used to identify which of two individuals is at a lower disease risk, even when these two individuals are siblings from a shared family environment. The relative risk reduction (RRR) from choosing an embryo with a lower PRS (with respect to one chosen at random) can be quantified by using these sibling results. New technology for precise embryo genotyping allows more sophisticated preimplantation ranking with better results than the current method of selection that is based on morphology. We review the advances described above and discuss related ethical considerations.
I excerpt from the paper below. 

Some related links: 





Introduction:
Over a million babies are born each year via IVF [1,2]. It is not uncommon for IVF parents to have more than one viable embryo from which to choose, as typical IVF cycles can produce four or five. The embryo that is transferred may become their child, while the others might not be used at all. We refer to this selection problem as the “embryo choice problem”. In the past, selections were made based on criteria such as morphology (i.e., rate of development, symmetry, general appearance) and chromosomal normality as determined by aneuploidy testing. 
Recently, large datasets of human genomes together with health and disease histories have become available to researchers in computational genomics [3]. Statistical methods from machine learning have allowed researchers to build risk predictors (e.g., for specific disease conditions or related quantitative traits, such as height or longevity) that use the genotype alone as input information. Combined with the precision genotyping of embryos, these advances provide IVF parents with significantly more information that can be used for embryo selection. 
In this brief article, we provide an overview of the advances in genotyping and computational genomics that have been applied to embryo selection. We also discuss related ethical issues, although a full discussion of these would require a much longer paper. ...

 Ethical considerations:

For further clarification, we explore a specific scenario involving breast cancer. It is well known that monogenic BRCA1 and BRCA2 variants predispose women to breast cancer, but this population is small—perhaps a few per thousand in the general population. The subset of women who do not carry a BRCA1 or BRCA2 risk variant but are at high polygenic risk is about ten times as large as the BRCA1/2 group. Thus, the majority of breast cancer can be traced to polygenic causes in comparison with commonly tested monogenic variants. 
For BRCA carrier families, preimplantation screening against BRCA is a standard (and largely uncontroversial) recommendation [39]. The new technologies discussed here allow a similar course of action for the much larger set of families with breast cancer history who are not carriers of BRCA1 or BRCA2. They can screen their embryos in favor of a daughter whose breast cancer PRS is in the normal range, avoiding a potentially much higher absolute risk of the condition. 
The main difference between monogenic BRCA screening and the new PRS screening against breast cancer is that the latter technology can help an order of magnitude more families. From an ethical perspective, it would be unconscionable to deny PRS screening to BRCA1/2-negative families with a history of breast cancer. ...

 

On Simon Fishel's experiences as an IVF pioneer (see here):

Today millions of babies are produced through IVF. In most developed countries roughly 3-5 percent of all births are through IVF, and in Denmark the fraction is about 10 percent! But when the technology was first introduced with the birth of Louise Brown in 1978, the pioneering scientists had to overcome significant resistance. There may be an alternate universe in which IVF was not allowed to develop, and those millions of children were never born. 

Wikipedia: ...During these controversial early years of IVF, Fishel and his colleagues received extensive opposition from critics both outside of and within the medical and scientific communities, including a civil writ for murder.[16] Fishel has since stated that "the whole establishment was outraged" by their early work and that people thought that he was "potentially a mad scientist".[17] 

I predict that within 5 years the use of polygenic risk scores will become common in some health systems (i.e., for adults) and in IVF. Reasonable people will wonder why the technology was ever controversial at all, just as in the case of IVF.

Figure below from our paper. EHS = Embryo Health Score. 

Monday, July 19, 2021

The History of the Planck Length and the Madness of Crowds

I had forgotten about the 2005-06 email correspondence reproduced below, but my collaborator Xavier Calmet reminded me of it today and I was able to find these messages.

The idea of a minimal length of order the Planck length, arising due to quantum gravity (i.e., quantum fluctuations in the structure of spacetime), is now widely accepted by theoretical physicists. But as Professor Mead (University of Minnesota, now retired) elaborates, based on his own experience, it was considered preposterous for a long time. 

Large groups of people can be wrong for long periods of time -- in financial markets, academia, even theoretical physics. 

Our paper, referred to by Mead, is 

Minimum Length from Quantum Mechanics and Classical General Relativity 

X. Calmet, M. Graesser, and S. Hsu  

https://arxiv.org/abs/hep-th/0405033  

Phys Rev Letters Vol. 93, 211101 (2004)

The related idea, first formulated by R. Buniy, A. Zee, and myself, that the structure of Hilbert Space itself is likely discrete (or "granular") at some fundamental level, is currently considered preposterous, but time will tell. 

More here

At bottom I include a relevant excerpt from correspondence with Freeman Dyson in 2005.


Dear Drs. Calmet, Graesser, Hsu,

I read with interest your article in Phys Rev Letters Vol. 93, 21101 (2004), and was pleasantly surprised to see my 1964 paper cited (second citation of your ref. 1).  Not many people have cited this paper, and I think it was pretty much forgotten the day it was published, & has remained so ever since.  To me, your paper shows again that, no matter how one looks at it, one runs into problems trying to measure a distance (or synchronize clocks) with greater accuracy than the Planck length (or time).

I feel rather gratified that the physics community, which back then considered the idea of the Planck length as a fundamental limitation to be quite preposterous, has since come around to (more or less) my opinion.  Obviously, I deserve ZERO credit for this, since I'm sure that the people who finally reached this conclusion, whoever they were, were unaware of my work.  To me, this is better than if they had been influenced by me, since it's good to know that the principles of physics lead to this conclusion, rather than the influence of an individual.  I hope that makes sense. ...

You might be amused by one story about how I finally got the (first) paper published after 5 years of referee problems.  A whole series of referees had claimed that my eq. (1), which is related to your eq. (1), could not be true.  I suspect that they just didn't want to read any further.  Nothing I could say would convince them, though I'm sure you would agree that the result is transparently obvious.  So I submitted another paper which consisted of nothing but a lengthy detailed proof of eq. (1), without mentioning the connection with the gravitation paper.  The referees of THAT paper rejected it on the grounds that the result was trivially obvious!!  When I pointed out this discrepancy to the editors, I got the gravitation paper reconsidered and eventually published.

But back then no one considered the Planck length to be a candidate as a fundamental limitation.  Well, almost no one.  I did receive support from Henry Primakoff, David Bohm, and Roger Penrose.  As far as I can recall, these were the only theoretical physicists of note who were willing to take this idea seriously (and I talked to many, in addition to reading the reports of all the referees).

Well anyway, I greet you, thank you for your paper and for the citation, and hope you haven't found this e-mail too boring.

Yours Sincerely,

C.  Alden  Mead


Dear Dr. Mead,

Thank you very much for your email message. It is fascinating to learn the history behind your work. We found your paper to be clearly written and useful.

Amusingly, we state at the beginning of our paper something like "it is widely believed..." that there is a fundamental Planck-length limit. I am sure your paper made a contribution to this change in attitude. The paper is not obscure as we were able to find it without much digging.

Your story about the vicissitudes of publishing rings true to me. I find such stories reassuring given the annoying obstacles we all face in trying to make our little contributions to science.

Finally, we intend to have a look at your second paper. Perhaps we will find another interesting application of your ideas.

Warm regards,

Stephen Hsu

Xavier Calmet

Michael Graesser

 

Dear Steve,

Many thanks for your kind reply.  I find the information quite interesting, though as you say it leaves some historical questions unanswered.  I think that Planck himself arrived at his length by purely dimensional considerations, and he supposedly considered this very important.

As you point out, it's physically very reasonable, perhaps more so in view of more recent developments.  It seemed physically reasonable to me back in 1959, but not to most of the mainstream theorists of the time.

I think that physical considerations (such as yours and mine) and mathematical ones should support and complement each other.  The Heisenberg-Bohr thought experiments tell us what a correct mathematical formalism should provide, and the formal quantum mechanics does this and, of course, much more.  Same with the principle of equivalence and general relativity.  Now, the physical ideas regarding the Planck length & time may serve as a guide in constructing a satisfactory formalism.  Perhaps string theory will prove to be the answer, but I must admit that I'm ignorant of all details of that theory.

Anyway, I'm delighted to correspond with all of you as much as you wish, but I emphasize that I don't want to be intrusive or become a nuisance.

As my wife has written you (her idea, not mine), your e-mail was a nice birthday present.

Kindest Regards, Alden


See also this letter from Mead which appeared in Physics Today.  


The following is from Freeman Dyson:
 ... to me the most interesting is the discrete Hilbert Space paper, especially your reference [2] proving that lengths cannot be measured with error smaller than the Planck length. I was unaware of this reference but I had reached the same conclusion independently.

 

Tuesday, July 13, 2021

Peter Shor on Quantum Factorization and Error Correction

 

This talk by Peter Shor describes the discovery of his quantum algorithm for prime factorization, and the discovery of quantum error correcting codes. The talk commemorates the first conference (Endicott House meeting) on the physics of computation in 1981. See 40 Years of Quantum Computation and Quantum Information.

Shor did not attend the 1981 meeting, where Feynman gave the keynote address Simulating Physics With Computers -- he was in his senior year at Caltech. But he recalls a talk that Feynman gave around the same time, on the possibility that negative probabilities might illuminate the EPR experiment and the Bell inequalities. 

Coincidentally, in my senior year (1986) I got Feynman to give a talk to the Society of Physics Students on this very topic! (I think I was president of SPS at the time.)

Sunday, July 11, 2021

Winner Take All in Global E-Commerce? Alibaba vs Amazon

 

Alibaba and Amazon are set to fight it out for global e-commerce dominance. Both are building out distribution networks all over the world. It might not be winner take all, but there are huge returns to scale so there may be only a few winners that dominate in the future.

Alibaba has some big advantages: lower cost structure (PRC salaries) and better direct contacts with the factories in China that produce the goods. 

In fact, a major trend underway is disintermediation between manufacturers and consumers. See, e.g., Shein, with 100M+ downloads (fast fashion).

All of these companies form the C2M (consumer to manufacturer) layer that provides the following functions.

1. Fulfillment and inventory management
2. Product and price discovery 
3. Product evaluation, trust and consumer confidence

Item #1 has both physical and information infrastructure components, and is the most expensive to build. Items #2 and #3 can be built entirely virtually, with much lower barriers to entry.

Just for fun I checked on AliExpress and I could find many of the same products as on Amazon, but at much lower prices. Not surprising, as it's all made in China these days.

At the moment Amazon delivery to US customers is much faster, but the situation varies by country and is changing rapidly.

Tuesday, July 06, 2021

Decline of the American Empire: Afghan edition (stay tuned for more)

There are photos and video to remind us of the ignominious US withdrawal from S. Vietnam after twenty years of conflict and millions of deaths.

Over the July 4th weekend, the US military abandoned Bagram Airfield in Afghanistan, without even informing the Afghan commander and his troops. 

Conveniently for our warmongering neocon "nation-building" interventionist elites, there are (as yet) no photos of this pullout.
BAGRAM, Afghanistan (AP) — The U.S. left Afghanistan’s Bagram Airfield after nearly 20 years by shutting off the electricity and slipping away in the night without notifying the base’s new Afghan commander, who discovered the Americans’ departure more than two hours after they left, Afghan military officials said. 
Afghanistan’s army showed off the sprawling air base Monday, providing a rare first glimpse of what had been the epicenter of America’s war to unseat the Taliban and hunt down the al-Qaida perpetrators of the 9/11 attacks on America. The U.S. announced Friday it had completely vacated its biggest airfield in the country in advance of a final withdrawal the Pentagon says will be completed by the end of August. 
“We (heard) some rumor that the Americans had left Bagram ... and finally by seven o’clock in the morning, we understood that it was confirmed that they had already left Bagram,” Gen. Mir Asadullah Kohistani, Bagram’s new commander said. ...
I wrote the following (2017) in Remarks on the Decline of American Empire:
1. US foreign policy over the last decades has been disastrous -- trillions of dollars and thousands of lives expended on Middle Eastern wars, culminating in utter defeat. This defeat is still not acknowledged among most of the media or what passes for intelligentsia in academia and policy circles, but defeat it is. Iran now exerts significant control over Iraq and a swath of land running from the Persian Gulf to the Mediterranean. None of the goals of our costly intervention have been achieved. We are exhausted morally, financially, and militarily, and still have not fully extricated ourselves from a useless morass. George W. Bush should go down in history as the worst US President of the modern era. 
2. We are fortunate that the fracking revolution may lead to US independence from Middle Eastern energy. But policy elites have to fully recognize this possibility and pivot our strategy to reflect the decreased importance of the region. The fracking revolution is a consequence of basic research from decades ago (including investment from the Department of Energy) and the work of private sector innovators and risk-takers. 
3. US budget deficits are a ticking time bomb, which cripple investment in basic infrastructure and also in research that creates strategically important new technologies like AI. US research spending has been roughly flat in inflation adjusted dollars over the last 20 years, declining as a fraction of GDP. 
4. Divisive identity politics and demographic trends in the US will continue to undermine political cohesion and overall effectiveness of our institutions. ("Civilizational decline," as one leading theoretical physicist observed to me recently, remarking on our current inability to take on big science projects.) 
5. The Chinese have almost entirely closed the technology gap with the West, and dominate important areas of manufacturing. It seems very likely that their economy will eventually become significantly larger than the US economy. This is the world that strategists have to prepare for. Wars involving religious fanatics in unimportant regions of the world should not distract us from a possible future conflict with a peer competitor that threatens to match or exceed our economic, technological, and even military capability.
If you are young and naive and still believe that we can mostly trust our media and government, watch these videos for a dose of reality.




[ The video embedded above was a documentary about Julian Assange and Wikileaks on the DW channel, which I had queued to show the Collateral Murder video. It included an interview with the US soldier who saved one of the children in the rescue van that was hit with 30mm Apache fire. Inexplicably, DW has now removed the video from their channel. Click through to YouTube below for the content. ]




Some things never change. Recall the personal sacrifices made by people like Daniel Ellsberg to reveal the truth about the Vietnam war. Today it is Julian Assange...




Note Added: While it was easy to predict this outcome in 2017, it wasn't much harder to call it in 2011. See this piece from The Onion:
KABUL, AFGHANISTAN—In what officials said was the "only way" to move on from what has become a "sad and unpleasant" situation, all 100,000 U.S. military and intelligence personnel crept out of their barracks in the dead of night Sunday and quietly slipped out of Afghanistan. 
U.S. commanders explained their sudden pullout in a short, handwritten note left behind at Bagram Airfield, their largest base of operations in the country. 
"By the time you read this, we will be gone," the note to the nation of Afghanistan read in part. "We regret any pain this may cause you, but this was something we needed to do. We couldn't go on like this forever." 
"We still care about you very much, but, in the end, we feel this is for the best," the note continued. "Please, just know that we are truly sorry and that we wish you all the greatest of happiness in the future." 
... After reportedly taking a "long look in the mirror" last week, senior defense officials came to the conclusion that they had "wasted a decade of [their] lives" with Afghanistan ...

Friday, July 02, 2021

Polygenic Embryo Screening: comments on Carmi et al. and Visscher et al.

In this post I discuss some recent papers on disease risk reduction from polygenic screening of embryos in IVF (PGT-P). I will focus on the science but at the end will include some remarks about practical and ethical issues. 

The two papers are 

Carmi et al. 

Visscher et al. 

Both papers study risk reduction in the following scenario: you have N embryos to choose from, and polygenic risk scores (PRS) for each which have been computed from SNP genotype. Both papers use simulated data -- they build synthetic child (embryo) genotypes in order to calculate expected risk reduction. 
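To make the simulation setup concrete, here is a minimal sketch of this type of calculation under a liability threshold model. The prevalence, the fraction of liability variance captured by the PRS, and the number of embryos below are illustrative assumptions of mine, not the parameters used in either paper.

```python
# Minimal sketch: expected relative risk reduction (RRR) from selecting the
# lowest-PRS embryo among N, under a liability threshold model.
# All parameters are illustrative assumptions, not values from either paper.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

K = 0.01            # assumed population prevalence
r2 = 0.10           # assumed fraction of liability variance captured by the PRS
n_embryos = 5       # embryos available for selection
n_families = 200_000

thresh = norm.ppf(1.0 - K)   # liability threshold corresponding to prevalence K

# Sibling (embryo) PRS values share a family component; correlation 0.5 between sibs.
family = rng.normal(0.0, np.sqrt(0.5 * r2), size=(n_families, 1))
segregation = rng.normal(0.0, np.sqrt(0.5 * r2), size=(n_families, n_embryos))
prs = family + segregation

# Residual (non-PRS) liability, treated as independent across embryos for simplicity.
resid = rng.normal(0.0, np.sqrt(1.0 - r2), size=(n_families, n_embryos))
affected = (prs + resid) > thresh

# Compare the lowest-PRS embryo against a random pick.
low_idx = prs.argmin(axis=1)
risk_selected = affected[np.arange(n_families), low_idx].mean()
risk_random = affected[:, 0].mean()   # embryos are exchangeable, so column 0 is a random pick

print(f"risk(random pick) = {risk_random:.4f}, risk(lowest PRS of {n_embryos}) = {risk_selected:.4f}")
print(f"relative risk reduction ≈ {1.0 - risk_selected / risk_random:.0%}")
```

Re-running with n_embryos = 2 versus 5 shows the RRR growing with the number of embryos, which is the qualitative pattern in the numbers quoted below.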

I am very happy to see serious researchers like Carmi et al. and Visscher et al. working on this important topic. 

Here are some example results from the papers: 

Carmi et al. find a ~50% risk reduction for schizophrenia from selecting the lowest risk embryo from a set of 5. For a selection among 2 embryos the risk reduction is ~30%. (We obtain a very similar result using empirical data: real adult siblings with known phenotype.)

Visscher et al. find the following results, see Table 1 and Figure 2 in their paper. To their credit they compute results for a range of ancestries (European, E. Asian, African). We have performed similar calculations using siblings but have not yet published the results for all ancestries.

Relative Risk Reduction (RRR): 
Hypertension: 9-18% (ranges depend on specific ancestry) 
Type 2 Diabetes: 7-16% 
Coronary Artery Disease: 8-17% 

Absolute Risk Reduction (ARR): 
Hypertension: 4-8.5% (ranges depend on specific ancestry) 
Type 2 Diabetes: 2.6-5.5% 
Coronary Artery Disease: 0.55-1.1%
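As an aside on how the two sets of numbers relate: for each condition, ARR = RRR × baseline absolute risk in the modeled population, so one can back out the implied baseline risks from the quoted ranges. A rough check (pairing the low ends together and the high ends together is my assumption, since the ranges span different ancestry groups):

```python
# Back out the implied baseline (unselected) absolute risks from the quoted
# RRR and ARR ranges, using ARR = RRR * baseline_risk. Illustrative arithmetic only.
ranges = {
    "Hypertension":            ((0.09, 0.18), (0.040, 0.085)),
    "Type 2 Diabetes":         ((0.07, 0.16), (0.026, 0.055)),
    "Coronary Artery Disease": ((0.08, 0.17), (0.0055, 0.011)),
}
for disease, ((rrr_lo, rrr_hi), (arr_lo, arr_hi)) in ranges.items():
    print(f"{disease}: implied baseline risk ~ {arr_lo / rrr_lo:.0%} to {arr_hi / rrr_hi:.0%}")
```

The much smaller ARR for coronary artery disease mostly reflects its lower modeled lifetime risk (roughly 7% implied by these numbers) rather than a weaker predictor.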

Note that families with a history of the disease would benefit much more than this. For example, parents with a family history of breast cancer or heart disease or schizophrenia will often produce some embryos with very high PRS and others in the normal range. Their absolute risk reduction from selection is many times larger than the population average results shown above. 

My research group has already published work in this area using data from actual siblings: tens of thousands of individuals who are late in life (e.g., 50-70 years old), for whom we have health records and genotypes. 


We have shown that polygenic risk predictors can identify, using genotype alone, which of a pair of siblings has a specific disease condition: the sib with high PRS is much more likely to have the condition than the sib with normal range PRS. In those papers we also computed Relative Risk Reduction (RRR), which is directly relevant to embryo selection. Needless to say I think real sib data provides better validation of PRS than simulated genotypes. The adult sibs have typically experienced a shared family environment and also exhibit negligible population stratification relative to each other. Using real sib data reduces significantly some important confounds in PRS validation. 
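For readers who want the gist of the sib-pair test in code, here is a minimal sketch on synthetic data (liability threshold model with assumed parameters); it is not our actual analysis pipeline, which uses real genotypes and health records.

```python
# Minimal sketch of the affected-vs-unaffected sib-pair test described above:
# for pairs in which exactly one sibling has the condition, how often does the
# affected sib carry the higher PRS? (Synthetic data; assumed parameters.)
import numpy as np

rng = np.random.default_rng(1)
n_pairs, r2, K = 50_000, 0.10, 0.05

fam = rng.normal(0, np.sqrt(0.5 * r2), size=(n_pairs, 1))
prs = fam + rng.normal(0, np.sqrt(0.5 * r2), size=(n_pairs, 2))
liability = prs + rng.normal(0, np.sqrt(1 - r2), size=(n_pairs, 2))
affected = liability > np.quantile(liability, 1 - K)

# Keep only discordant pairs (exactly one affected sib).
discordant = affected.sum(axis=1) == 1
aff_prs = np.where(affected[discordant, 0], prs[discordant, 0], prs[discordant, 1])
unaff_prs = np.where(affected[discordant, 0], prs[discordant, 1], prs[discordant, 0])

print(f"affected sib has higher PRS in {(aff_prs > unaff_prs).mean():.0%} of discordant pairs")
```

The fraction of discordant pairs in which the affected sib has the higher PRS is the kind of within-family validation statistic described above.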

See also these papers: Treff et al. [1] [2] [3]

Here are example results from our work on absolute and relative risk reduction. (Selection from 2 embryos.)


Regarding pleiotropy (discussed in the NEJM article), the Treff et al. results linked above show that selection using a Genomic Index, which is an aggregate of several polygenic risk scores, simultaneously reduces risks across all of the ~12 disease conditions in the polygenic disease panel. That is, risk reduction is not zero-sum, as far as we can tell: you are not forced to trade off one disease risk against another, at least for the 12 diseases on the panel. Further work on this is in progress. 

In related work we showed that DNA regions used to predict different risks are largely disjoint, which also supports this conclusion. See 



To summarize, several groups have now validated the risk reduction from polygenic screening (PGT-P). The methodologies are different (i.e., simulated genotypes vs studies using large numbers of adult siblings) but come to similar conclusions. 

Whether one should regard, for example, relative and absolute risk reduction in type 2 diabetes (T2D) of ~40% and ~3% (from figure above) as important or valuable is a matter of judgement. 

Studies suggest that type 2 diabetes results in an average loss of over 10 quality-adjusted life years -- i.e., more than a decade. So reducing an individual's risk of T2D by even a few percent seems significant to me. 
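Putting those two numbers together as a rough expected-value calculation (the ~3% ARR is from the figure above; ~10 QALYs is the population-average loss per case just cited; this is illustrative arithmetic, not a formal health-economics estimate):

```python
# Rough expected QALY gain from the T2D risk reduction alone.
arr_t2d = 0.03              # absolute risk reduction from selection (figure above)
qaly_loss_per_case = 10.0   # average QALYs lost per T2D case (studies cited above)
print(f"expected gain ≈ {arr_t2d * qaly_loss_per_case:.1f} QALYs per selection, from T2D alone")
```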

Now multiply that by a large factor, because selection using a genomic index (see figure) produces simultaneous risk reductions across a dozen important diseases.

Finally, polygenic predictors are improving rapidly as more genomic and health record data become available for machine learning. All of the power of modern AI technology will be applied to this data, and risk reductions from selection (PGT-P) will increase significantly over time. See this 2021 review article for more.


Practical Issues 

Aurea, the first polygenically screened baby (PGT-P), was born in May 2020.
See this panel discussion, which includes 

Dr. Simon Fishel (member of the team that produced the first IVF baby) 
Elizabeth Carr (first US IVF baby) 
Prof. Julian Savulescu (Uehiro Chair in Practical Ethics at the University of Oxford) 
Dr. Nathan Treff (Chief Scientist, Genomic Prediction)
Dr. Rafal Smigrodzki (MD PhD, father of Aurea) 

Astral Codex Ten recently posted on this topic: Welcome Polygenically Screened Babies :-) Many of the comments there are of high quality and worth reading. 


Ethical Issues 

Once the basic scientific results are established, one can meaningfully examine the many ethical issues surrounding embryo selection. 

My view has always been that new genomic technologies are so powerful that they should be widely understood and discussed -- by all of society, not just by scientists. 

However, to me it is clear that the potential benefits of embryo PRS screening (PGT-P) are very positive and that this technology will eventually be universally adopted. 

Today millions of babies are produced through IVF. In most developed countries roughly 3-5 percent of all births are through IVF, and in Denmark the fraction is about 10 percent! But when the technology was first introduced with the birth of Louise Brown in 1978, the pioneering scientists had to overcome significant resistance. There may be an alternate universe in which IVF was not allowed to develop, and those millions of children were never born.
Wikipedia: ...During these controversial early years of IVF, Fishel and his colleagues received extensive opposition from critics both outside of and within the medical and scientific communities, including a civil writ for murder.[16] Fishel has since stated that "the whole establishment was outraged" by their early work and that people thought that he was "potentially a mad scientist".[17]
I predict that within 5 years the use of polygenic risk scores will become common in some health systems (i.e., for adults) and in IVF. Reasonable people will wonder why the technology was ever controversial at all, just as in the case of IVF.

This is a very complex topic. For an in-depth discussion I refer you to this recent paper by Munday and Savulescu. Savulescu, Uehiro Chair in Practical Ethics at the University of Oxford, is perhaps the leading philosopher / bioethicist working in this area. 
Three models for the regulation of polygenic scores in reproduction 
Journal of Medical Ethics
The past few years have brought significant breakthroughs in understanding human genetics. This knowledge has been used to develop ‘polygenic scores’ (or ‘polygenic risk scores’) which provide probabilistic information about the development of polygenic conditions such as diabetes or schizophrenia. They are already being used in reproduction to select for embryos at lower risk of developing disease. Currently, the use of polygenic scores for embryo selection is subject to existing regulations concerning embryo testing and selection. Existing regulatory approaches include ‘disease-based' models which limit embryo selection to avoiding disease characteristics (employed in various formats in Australia, the UK, Italy, Switzerland and France, among others), and 'laissez-faire' or 'libertarian' models, under which embryo testing and selection remain unregulated (as in the USA). We introduce a novel 'Welfarist Model' which limits embryo selection according to the impact of the predicted trait on well-being. We compare the strengths and weaknesses of each model as a way of regulating polygenic scores. Polygenic scores create the potential for existing embryo selection technologies to be used to select for a wider range of predicted genetically influenced characteristics including continuous traits. Indeed, polygenic scores exist to predict future intelligence, and there have been suggestions that they will be used to make predictions within the normal range in the USA in embryo selection. We examine how these three models would apply to the prediction of non-disease traits such as intelligence. The genetics of intelligence remains controversial both scientifically and ethically. This paper does not attempt to resolve these issues. However, as with many biomedical advances, an effective regulatory regime must be in place as soon as the technology is available. If there is no regulation in place, then the market effectively decides ethical issues.
Dalton Conley (Princeton) and collaborators find that 68% of surveyed Americans had positive attitudes concerning polygenic screening of embryos.

Wednesday, June 30, 2021

Six Ways From Sunday: Tucker vs NSA

 


Chuck Schumer: You take on the intelligence community, they have six ways from Sunday to get back at you.


 

Tucker Carlson has potential as a politician -- there is at least a small chance that someday he'll be POTUS. The intelligence services are, I am sure, very interested in any kompromat they can acquire on him for future use. You mean foreign intel services? No, I mean our intel services :-(

Clarification, from comments
The post is not primarily about Tucker. It's about intel services spying on American citizens. 
Most importantly, Tucker's story is credible: some whistleblower saw intercepted Tucker emails and contacted him to let him know he is under surveillance. But as anyone paying attention knows, we are ALL under surveillance due to "bulk collection" revealed many years ago, e.g., by Snowden. The Rogers saga and FISC report show that this bulk-collected data is not very well protected from intel agency types who want to have a peek at it...  
Re: bulk collection, non-denial denials ("not an intelligence target of the Agency" ha ha), see
Wikipedia: According to a report in The Washington Post in July 2014, relying on information furnished by Snowden, 90% of those placed under surveillance in the U.S. are ordinary Americans, and are not the intended targets. The newspaper said it had examined documents including emails, message texts, and online accounts, that support the claim.
Below is a Rogers timeline covering illegal spying using NSA data. This illegal use of data is a matter of record -- undisputed, but also largely unreported. The FISC (FISA court) report on this illegal use of data appeared in April 2017; the author is Rosemary Collyer, the head FISA judge. The report was originally classified Top Secret but was later declassified and released with redactions. Collyer uses the phrase "institutional lack of candor" when referring to behavior of federal agencies in their dealings with FISC over this issue. ... 
The court learned in October 2016 that analysts ... were conducting prohibited database searches “with much greater frequency than had previously been disclosed to the court.” The forbidden queries were searches of Upstream Data using US-person identifiers. The report makes clear that as of early 2017 NSA Inspector General did not even have a good handle on all the ways that improper queries could be made to the system. ... 
March 2016 – NSA Director Rogers becomes aware of improper access to raw FISA data. 
April 2016 – Rogers orders the NSA compliance officer to run a full audit on 702 NSA compliance. 
April 18 2016 – Rogers shuts down FBI/NSD contractor access to the FISA Search System. 
Mid-October 2016 – DNI Clapper submits a recommendation to the White House that Director Rogers be removed from the NSA. 
October 20 2016 – Rogers is briefed by the NSA compliance officer on the Section 702 NSA compliance audit and “About” query violations. 
October 21 2016 – Rogers shuts down all “About" query activity. Rogers reports the activity to DOJ and prepares to go before the FISA Court. 
October 21 2016 – DOJ & FBI seek and receive a Title I FISA probable cause order authorizing electronic surveillance on Carter Page from the FISC. At this point, the FISA Court is unaware of the Section 702 violations. 
October 24 2016 – Rogers verbally informs the FISA Court of Section 702(17) violations. 
October 26 2016 – Rogers formally informs the FISA Court of 702(17) violations in writing. 
November 17 2016 (morning) – Rogers travels to meet President-Elect Trump and his Transition Team in Trump Tower. Rogers does not inform DNI James Clapper. 
November 17 2016 (evening) – Trump Transition Team announces they are moving all transition activity to Trump National Golf Club in New Jersey.
I was recently in a Zoom meeting on geopolitics that included Admiral Rogers. I wanted to ask him privately about the above. Perhaps someday I'll get the chance.
 

Caption: NSA Director Rogers describes to Congress how little privacy Americans have from government surveillance. 

Alternate Caption: NSA Director Rogers tells Congress how much legal oversight remains over the activities of intel services.

Tuesday, June 29, 2021

Machine Learning Prediction of Biomarkers from SNPs and of Disease Risk from Biomarkers in the UK Biobank (published version)


This is the published version of our MedRxiv preprint discussed back in April 2021. It is in the special issue Application of Genomic Technology in Disease Outcome Prediction of the journal Genes. 

There is a lot in this paper: genomic prediction of important biomarkers (e.g., lipoprotein A, mean platelet (thrombocyte) volume, bilirubin, platelet count), and prediction of important disease risks from biomarkers (novel ML in a ~65-dimensional space) with potential clinical applications. As is typical, genomic predictors trained in a European ancestry population perform less well in distant populations (e.g., S. Asians, E. Asians, Africans). This is probably due to different SNP LD (correlation) structure across populations. However, predictors of disease risk using directly measured biomarkers do not show this behavior -- they can be applied even to distant ancestry groups.

The referees did not like our conditional probability notation:
( biomarkers | SNPs )   and   ( disease risk | biomarkers )
So we ended up with lots of acronyms to refer to the various predictors.
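To show the structure in code: the two stages are ( biomarkers | SNPs ), a genomic predictor per biomarker, and ( disease risk | biomarkers ), a risk model that never sees DNA; composing them gives a risk estimate from genotype alone. The sketch below uses synthetic data and generic sklearn models as stand-ins; it is not the paper's actual training pipeline, feature set, or regularization.

```python
# Two-stage structure: PGS maps SNPs -> biomarkers, BMRS maps biomarkers -> disease
# risk, and the composition BMRS(PGS(snps)) gives risk from genotype alone.
# Synthetic data and generic models; placeholders for the real pipeline.
import numpy as np
from sklearn.linear_model import Lasso, LogisticRegression

rng = np.random.default_rng(0)
n, n_snps, n_biomarkers = 5000, 1000, 65

snps = rng.binomial(2, 0.3, size=(n, n_snps)).astype(float)          # 0/1/2 genotypes
biomarkers = 0.05 * snps[:, :50] @ rng.normal(size=(50, n_biomarkers)) \
             + rng.normal(size=(n, n_biomarkers))                     # synthetic markers
disease = (biomarkers[:, 0] + rng.normal(size=n) > 1.5).astype(int)   # synthetic outcome

# Stage 1 (PGS): one sparse linear predictor per biomarker, trained on SNPs alone.
pgs = [Lasso(alpha=0.01).fit(snps, biomarkers[:, j]) for j in range(n_biomarkers)]
predicted_biomarkers = np.column_stack([m.predict(snps) for m in pgs])

# Stage 2 (BMRS): disease risk from measured biomarkers alone (no DNA input).
bmrs = LogisticRegression(max_iter=1000).fit(biomarkers, disease)

# Composition: apply the biomarker-trained risk model to genomically predicted biomarkers.
risk_from_genotype = bmrs.predict_proba(predicted_biomarkers)[:, 1]
```

The last line is the "concatenation of learned functions" mentioned in the abstract below: the BMRS predictor applied to the PGS output.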

Some of the biomarkers identified by ML as important for predicting specific disease risk are not familiar to practitioners and have not been previously discussed (as far as we could tell from the literature) as relevant to that specific disease. One medical school professor and practitioner, upon seeing our results, said he would in future add several new biomarkers to routine blood tests ordered for his patients.
 
Machine Learning Prediction of Biomarkers from SNPs and of Disease Risk from Biomarkers in the UK Biobank 
Erik Widen 1,*,Timothy G. Raben 1, Louis Lello 1,2,* and Stephen D. H. Hsu 1,2 
1 Department of Physics and Astronomy, Michigan State University, 567 Wilson Rd, East Lansing, MI 48824, USA 
2 Genomic Prediction, Inc., 675 US Highway One, North Brunswick, NJ 08902, USA 
*Authors to whom correspondence should be addressed. 
Academic Editor: Sulev Koks 
Genes 2021, 12(7), 991; https://doi.org/10.3390/genes12070991 
Received: 30 March 2021 / Revised: 22 June 2021 / Accepted: 23 June 2021 / Published: 29 June 2021 
(This article belongs to the Special Issue Application of Genomic Technology in Disease Outcome Prediction) 
Abstract 
We use UK Biobank data to train predictors for 65 blood and urine markers such as HDL, LDL, lipoprotein A, glycated haemoglobin, etc. from SNP genotype. For example, our Polygenic Score (PGS) predictor correlates ∼0.76 with lipoprotein A level, which is highly heritable and an independent risk factor for heart disease. This may be the most accurate genomic prediction of a quantitative trait that has yet been produced (specifically, for European ancestry groups). We also train predictors of common disease risk using blood and urine biomarkers alone (no DNA information); we call these predictors biomarker risk scores, BMRS. Individuals who are at high risk (e.g., odds ratio of >5× population average) can be identified for conditions such as coronary artery disease (AUC∼0.75), diabetes (AUC∼0.95), hypertension, liver and kidney problems, and cancer using biomarkers alone. Our atherosclerotic cardiovascular disease (ASCVD) predictor uses ∼10 biomarkers and performs in UKB evaluation as well as or better than the American College of Cardiology ASCVD Risk Estimator, which uses quite different inputs (age, diagnostic history, BMI, smoking status, statin usage, etc.). We compare polygenic risk scores (risk conditional on genotype: PRS) for common diseases to the risk predictors which result from the concatenation of learned functions BMRS and PGS, i.e., applying the BMRS predictors to the PGS output.


Figure 11. The ASCVD BMRS and the ASCVD Risk Estimator both make accurate risk predictions but with partially complementary information. (Upper left): Predicted risk by BMRS, the ASCVD Risk Estimator and a PRS predictor were binned and compared to the actual disease prevalence within each bin. The gray 1:1 line indicates perfect prediction. ... The ASCVD Risk Estimator was applied to 340k UKB samples while the others were applied to an evaluation set of 28k samples, all of European ancestry. (Upper right) shows a scatter plot and distributions of the risk predicted by BMRS versus the risk predicted by the ASCVD Risk Estimator for the 28k Europeans in the evaluation set. The BMRS distribution has a longer tail of high predicted risk, providing the tighter confidence interval in this region. The left plot y-axis is the actual prevalence within the horizontal and vertical cross-sections, as illustrated with the shaded bands corresponding to the hollow squares to the left. Notably, both predictors perform well despite the differences in assigned stratification. The hexagons are an overlay of the (lower center) heat map of actual risk within each bin (numbers are bin sizes). Both high risk edges have varying actual prevalence but with a very strong enrichment when the two predictors agree.

Sunday, June 27, 2021

Othram CEO David Mittelman interview with Razib Khan

 
Razib Khan interviews Othram CEO David Mittelman. 

See Othram: the future of DNA forensics for some lab photos and analysis of relative identification via DNA database, the legal status of large databases (23andMe, Ancestry), etc. 

Fun fact: David is a serious powerlifter :-)
 


I've done a lot of things in science, but I get a unique kind of personal satisfaction every time Othram solves a case! Of course, this is happening so often now that I can barely keep track.

Saturday, June 19, 2021

LEO SAR, hypersonics, and the death of the naval surface ship

 

Duh... Let's spend ~$10B each for new aircraft carriers that can be easily monitored from space and attacked using hypersonic missiles. 

Sure, in a real war with a peer competitor we'll have to hide them far from the conflict zone. But they're great for intimidating small countries...

More on aircraft carriers.

The technology described in the videos is LEO SAR = Low Earth Orbit Synthetic Aperture Radar. For some people it takes vivid imagery to convey rather basic ideas.

In an earlier post we described how sea blockade (e.g., against Japan or Taiwan) can be implemented using satellite imaging and missiles, drones, AI/ML. Blue water naval dominance is not required. PLAN/PLARF can track every container ship and oil tanker as they approach Kaohsiung or Nagoya. All are in missile range -- sitting ducks. Naval convoys will be just as vulnerable. 

Sink one tanker or cargo ship, or just issue a strong warning, and no shipping company in the world will be stupid enough to try to run the blockade. With imaging accuracy of ~1m, missile accuracy will be similar to that of precision guided munitions using GPS.
 


Excerpt below from China’s Constellation of Yaogan Satellites and the Anti-Ship Ballistic Missile – An Update, International Strategic and Security Studies Programme (ISSSP), National Institute of Advanced Studies (NIAS -- India), December 2013. With present technology it is easy to launch LEO (Low Earth Orbit) micro-satellites on short notice to track ships, but PRC has had a sophisticated system in place for almost a decade.
Authors: Professor S. Chandrashekar and Professor Soma Perumal 
We can state with confidence that the Yaogan satellite constellation and its associated ASBM system provide visible proof of Chinese intentions and capabilities to keep ACG strike groups well away from the Chinese mainland. 
Though the immediate purpose of the system is to deter the entry of a hostile aircraft carrier fleet into waters that directly threatens its security interests especially during a possible conflict over Taiwan, the same approach can be adopted to deter entry into other areas of strategic interest
Viewed from this perspective the Chinese do seem to have in place an operational capability for denying or deterring access into areas which it sees as crucial for preserving its sovereignty and security.
ICEYE, a Finnish micro-satellite company, wants to use its constellation to monitor the entire planet -- Every Square Meter, Every Hour. This entire network would cost well under a billion USD, and it uses off-the-shelf technology. 

It seems plausible to me that PLARF would be able to put up additional microsats of this type even during a high intensity conflict, e.g., using mobile launchers like those used for the DF21/26/41. A few ~10 minute contacts per day from a small LEO SAR constellation (i.e., just a few satellites) provide enough targeting data to annihilate a surface fleet in the western Pacific.
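For orientation, the orbital arithmetic is simple. A minimal sketch, assuming a ~500 km circular orbit (the altitude is my assumption, not a claim about any particular constellation):

```python
# Back-of-envelope LEO arithmetic: orbital period and orbits per day for an
# assumed 500 km circular orbit (Kepler's third law).
import math

MU = 3.986004418e14      # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_371e3        # mean Earth radius, m
altitude = 500e3         # assumed orbit altitude, m

a = R_EARTH + altitude
period = 2 * math.pi * math.sqrt(a**3 / MU)
print(f"orbital period ≈ {period / 60:.0f} min, ≈ {86_400 / period:.1f} orbits per day")
# How many of those ~15 daily orbits pass within radar view of a given patch of
# ocean depends on inclination and swath width, so "a few contacts per day" from
# a handful of satellites is a modest assumption.
```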




Added from comments:
... you can make some good guesses based on physics and the technologies involved. 
1. Very hard to hit a hypersonic missile that is maneuvering on its way in. It's faster than the interceptor missiles and they can't anticipate its trajectory if it, e.g., selects a random maneuver pattern. 
2. I don't think there are good countermeasures for hiding the carrier from LEO SAR. I don't even think there are good countermeasures against final targeting seekers (IR/radar) on the ASBM (or a hypersonic cruise missile) but this depends on details. 
3. If the satellite has the target acquired during the final approach it can transmit the coordinates to the missile in flight and the missile does not have to depend on the seeker. On the Chinese side it is claimed that the ASBM can receive both satellite and OTH radar targeting info while in flight. This seems plausible technologically, and similar capability is already present in PLAAF AAM (i.e., mid-flight targeting data link from jet which launched the AAM). 
4. The radar cross section of a large ship is orders of magnitude larger than, e.g., a jet fighter. The payload of a DF21/26/17 is much larger than an AAM so I would guess the seeker could be much more powerful than the IR/AESA seeker in, e.g., PL-15 or similar. (Note PL-15 and PL-XX/21 have very long (BVR) engagement ranges, like 150km or even 400km and this is against aircraft targets, not massive ships.) The IR/radar seeker in an ASBM could be comparable to those in a jet fighter. 
I seriously doubt you can hide a big ship from a hypersonic missile seeker that is much larger and more powerful than anything on an AAM, possibly as powerful as the sensors on a jet fighter. 
On launch the missile will have a good fix on the target location from the satellite data. In the ~10 minute time of flight the uncertainty in the location of, e.g., a carrier is ~10 km. So the seeker needs to find the target in a region of roughly that size, assuming no in-flight update of target location. 
https://www.iiss.org/public... 
https://sameerjoshi73.mediu... 
Finally, keep in mind that sensor (both the missile seeker and on the satellite) and AI/ML capability are improving rapidly, so the trend is definitely against the carrier.
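The "~10 km" figure in the comments above is just kinematics. A minimal sketch, assuming a ~30 knot ship speed and a ~10 minute time of flight with no in-flight target update:

```python
# How far can a carrier move during the missile's flight if there is no
# in-flight target update? Assumed: ~30 knot ship speed, ~10 minute flight.
KNOT = 0.514444                     # m/s per knot
ship_speed = 30 * KNOT              # ~15.4 m/s
time_of_flight = 10 * 60            # seconds

radius_km = ship_speed * time_of_flight / 1000
print(f"worst-case displacement ≈ {radius_km:.1f} km")   # seeker must search a region of roughly this size
```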

USN guy: We'll just hide the carrier from the satellite and missile seekers using, you know, countermeasures!  [Aside: don't cut my carrier budget!]

USAF guy: Uh, the much smaller AESA/IR seeker on their AAM can easily detect an aircraft from much longer ranges. How will you hide a huge ship?

USN guy: We'll just shoot down the maneuvering hypersonic missile using, you know, methods. [Aside: don't cut my carrier budget!]

Missile defense guy: Can you explain to us how to do that? If the incoming missile maneuvers we have to adapt the interceptor trajectory (in real time) to where we project the missile to be after some delay. But we can't know its trajectory ahead of time, unlike for a ballistic (non-maneuvering) warhead.

Monday, June 14, 2021

Japan and The Quad (Red Line geostrategy podcast)

 

I recommend this episode of The Red Line geostrategy podcast: Japan and The Quad

Serious analysts from the Asia-Pacific region (e.g., Australia, India, Japan, etc.) are often much better than their US counterparts. US analysts tend to misperceive local political and economic realities, and can be captives of Washington DC groupthink (e.g., about weapons systems like aircraft carriers or the F35 or missile defense). 

For example, Australian analysts acknowledged the vulnerability of US aircraft carriers to PRC ASBM and cruise missiles well before it became common for US analysts to openly admit the problem. The earliest technical analysis I could find of PRC satellite capability to track US surface ships in the Pacific came from an Indian military think tank (see maps below), at a time when many US "experts" denied that it was possible.

In this podcast Japan's reliance on sea lanes for energy, food, and raw materials is given proper emphasis. Japan imports ~60% of its food calories and essentially all of its oil. The situation is similar for S. Korea and Taiwan. It is important to note that blocking sea transport to Taiwan and Japan does not require PLAN blue water dominance. ASBM and cruise missiles which threaten aircraft carriers can also hold oil tankers and global shipping at risk from launch sites which are on or near the Asian mainland. Missile + drone + AI/ML technology completely alters the nature of sea blockade, but most strategic planners do not yet realize this. Serious conflict in this region would likely wreak havoc on the economies of Taiwan, S. Korea, and Japan.
        
Red Line: ... In seeking to counter an ever-expanding China, Tokyo is turning abroad in search of allies. Key to this is the recent revival of "The Quad", a strategic dialogue between the US, Australia, Japan and India. Will it be enough to counter their rising neighbour across the East China Sea? Is this the first step to creating an "Asian NATO", and how will China respond? 
Guests: 
Owen Swift 
Geopolitics and defence analyst specialising in Australian & East Asian Defence. Written with organisations including The Australian Strategic Policy Institute and Monash University. Senior Producer and resident Asia-Pacific expert at The Red Line 
John Nilsson-Wright 
Senior Lecturer on Japanese Geopolitics and International Relations for Cambridge University. Senior Research Fellow for Northeast Asia for Chatham House. Author of the book Unequal Allies about the post-war relationship between Japan and the United States. 
John Coyne 
Head of the Northern Australia Strategic Policy Centre at the Australian Strategic Policy Institute (ASPI). Head of Strategic Policing and Law Enforcement at ASPI. One of the most trusted experts when it comes to the dynamics of East Asia for Australia and the United States.

 

Part 1: The Return from Armageddon (02:52) Owen Swift overviews Japan's place in East Asia and the fundamental geographic challenges that inform its geopolitics. We tackle Japan's inability to domestically provide the resources and food that its population needs, and how it has historically dealt with this insecurity. The consequences of World War 2 wreaked havoc on Japan's economy, political system and territorial holdings. We analyse the short and long term consequences of this, and seek to understand why it was that Japan took a positive view of the US occupation, comparing it to the option of a possible partial USSR occupation. In the 1980s some thought Japan was on a path to overtake the US economically. While that hasn't come to pass, we look at what it was that made Japan's economic miracle, and the effect that US involvement has today. We look at domestic issues in Japan, including the drastic demographic decline, their ongoing 'defensive only' posture, and the policy options on the table for balancing against the rise of China. Finally we overview Japan's involvement in the Asia-Pacific region as a whole, analysing who it has the best relations with. We look at the extensive investments and infrastructure development Japan is undertaking in ASEAN states, and its cooperation with India and Australia in recent years. 
Part 2: The Grand Dilemma (17:56) John Nilsson-Wright helps us understand the fundamental shifts in Japanese politics and foreign policy, including Article 9, the tension between the Yoshida doctrine, public opinion and US pressure within Japan, and the country's re-entry into the sphere of great power competition. We examine the extent of Japan's military presence in the Indo-Pacific; looking at its exercises with other powers and the concerns its neighbours have, some of whom still bear significant scars from World War 2. South Korea's relationship with Japan is one that, on the surface, seems like it should be closer than it is. We analyse why it is that despite their mutual interest in countering North Korea and China, their close geography and both being under the US umbrella, the two states have been unable to overcome enormous domestic resentment and historic scars. Japan's constitution has very tight constraints on what it can do militarily. Nilsson-Wright helps us understand the details of these restrictions and their history over the past few decades. We look at how the legal interpretation of the article has changed as Japan's needs have changed. We also look at Japan's expanding concept of national interest, which began as a purely defensive, geographically limited concept, but that has continued to expand in recent years. We contrast that with the difficulty the government has had with domestic views of Japan's role on the global stage. We tackle territorial issues including the Kuril and Sakhalin Islands, and look at Japan's role in a potential Taiwanese conflict. 
Part 3: A United Front? (43:48) John Coyne takes us through the details of the Quad, and the roles that its constituent members play. We look at Japan's re-examination of their supply chains, their development of strategic depth and the recent news that they are considering abolishing the 1% of GDP cap on military spending. Coyne helps us understand what the Quad actually is - It is not and does not seek to be an Asian NATO - just as ASEAN is not an Asian EU. We look at what the limitations are for each of the states involved, particularly India. We look at the actual relationships and cooperation that has been seen between Quad members. With Japan's newfound willingness to be involved in military operations, we examine how closely they will work with Australia, India and the United States, and the extent to which the Quad is more than just symbolic. We then turn to China's response. Is China likely to seek a grouping like the Quad in opposition to it? Can the Quad actually contain China or its navy in any practical sense? What will China do if cooperation tightens? We look at how China has already sought to hit back, targeting Australia in particular with the "14 Grievances", which were delivered as a consequence of the deterioration of their relationship. Australia's membership and participation in the Quad is a key part of this deterioration. Finally we look at how the Quad members have worked to strategically separate from China, such as Japan's work to defeat China's monopoly on the rare earths industry.

The map below appeared in the 2017 blog post On the military balance of power in the Western Pacific.


Here is another map:


Excerpt below from China’s Constellation of Yaogan Satellites and the Anti-Ship Ballistic Missile – An Update, International Strategic and Security Studies Programme (ISSSP), National Institute of Advanced Studies (NIAS -- India), December 2013. With present technology it is easy to launch LEO (Low Earth Orbit) micro-satellites on short notice to track ships, but PRC has had a much more sophisticated system in place for almost a decade.
Authors: Professor S. Chandrashekar and Professor Soma Perumal 
We can state with confidence that the Yaogan satellite constellation and its associated ASBM system provide visible proof of Chinese intentions and capabilities to keep ACG strike groups well away from the Chinese mainland. 
Though the immediate purpose of the system is to deter the entry of a hostile aircraft carrier fleet into waters that directly threaten its security interests, especially during a possible conflict over Taiwan, the same approach can be adopted to deter entry into other areas of strategic interest.
Viewed from this perspective the Chinese do seem to have in place an operational capability for denying or deterring access into areas which it sees as crucial for preserving its sovereignty and security.

Bonus: This political cartoon about the G7 meeting has been widely shared in the Sinosphere. Some of the esoteric meaning may be lost on a US audience, but see here for an explanation. Note the device on the table that turns toilet paper into US dollars. The Japanese dog is serving radioactive water to the co-conspirators. Italy, the BRI participant, refuses the drink. France and Germany seem to be thinking about it carefully. Who is the little frog? (Hint: NTD)

Sunday, June 13, 2021

An Inconvenient Minority: The Attack on Asian American Excellence and the Fight for Meritocracy (Kenny Xu)


Kenny Xu is a brave young man. His new book An Inconvenient Minority: The Attack on Asian American Excellence and the Fight for Meritocracy expertly documents a number of unpleasant facts about American society that most major media outlets, education leaders, and social justice advocates have been obfuscating or outright suppressing for decades.

1. Asian Americans (not foreign students from Asia, but individuals of Asian heritage who are US citizens or permanent residents) have been discriminated against in admission to elite institutions of higher education for over 30 years. 

To put it bluntly, Asian Americans must, on average, outperform all other groups in order to have an equal chance of admission to universities like Harvard or Yale. If one were to replace Asian Americans with Jews in the previous sentence, it would describe the situation in the early 20th century. Looking back, we are rightfully ashamed and outraged at the conduct of elite universities during this period. Future Americans, and observers all over the world, will eventually have the same reaction to how Asian Americans are treated today by these same institutions.

2. Asian American success, as measured by metrics such as income, wealth, or educational attainment, is problematic for simplistic narratives that emphasize race and "white supremacy" over a more realistic and multifaceted analysis of American society.

3. Efforts to guarantee equal outcomes, as opposed to equal opportunities, are anti-meritocratic and corrosive to social cohesion, undermine basic notions of fairness, and handicap the United States in scientific and technological competition with other nations.

The Table of Contents, reproduced below, gives an idea of the important topics covered. Xu had an insider's view of the Students for Fair Admissions v. Harvard trial, now awaiting appeal to the Supreme Court. He also describes the successful effort by a grassroots coalition of Asian Americans to defeat CA Proposition 16, which would have reinstated the racial preferences in the public sector (including college admissions) that were prohibited by Proposition 209 in 1996.

Over the years I have had many conversations on this topic with well-meaning (but often poorly informed) parents of all ethnic and cultural backgrounds. I cannot help but ask these people
Are you OK with discrimination against your child? What did they do to deserve it? 
Are you going to let virtue-signaling administrators at the university devalue the hard work and hard-won accomplishments of your son or daughter? Are you going to do anything about it?
and I cannot help but think
If you won't do anything about it, then f*ck you. Your kids deserve better parents.

Kenny calls it a Fight for Meritocracy. That's what it is -- a fight. Don't forget that Meritocracy is just a fancy word for fairness. It's a fight for your kid, and all kids, to be treated fairly.

I highly recommend the book. These issues are of special concern to Asian Americans, but should be of interest to anyone who wants to know what is really happening in American education today.





Related posts: discrimination against Asian Americans at elite US universities, on meritocracy, and UC faculty report on the use of SAT in admissions.

Thursday, June 03, 2021

Macroscopic Superpositions in Isolated Systems (talk video + slides)

 

This is video of a talk based on the paper
Macroscopic Superpositions in Isolated Systems 
R. Buniy and S. Hsu 
arXiv:2011.11661, to appear in Foundations of Physics 
For any choice of initial state, and under weak assumptions about the Hamiltonian, large isolated quantum systems undergoing Schrodinger evolution spend most of their time in macroscopic superposition states. The result follows from von Neumann's 1929 Quantum Ergodic Theorem. As a specific example, we consider a box containing a solid ball and some gas molecules. Regardless of the initial state, the system will evolve into a quantum superposition of states with the ball in macroscopically different positions. Thus, despite their seeming fragility, macroscopic superposition states are ubiquitous consequences of quantum evolution. We discuss the connection to many worlds quantum mechanics.
Slides for the talk.
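For readers unfamiliar with the Quantum Ergodic Theorem, here is a rough schematic of its modern formulation (a sketch only: conditions such as non-degenerate energy gaps and the precise sense of "most times" are suppressed, and the notation is mine, not the paper's). Decompose an energy shell of the Hilbert space into orthogonal macro-spaces, one for each macroscopically distinguishable configuration (e.g., each coarse-grained ball position):
\[
\mathcal{H}_E = \bigoplus_\nu \mathcal{H}_\nu , \qquad d_\nu = \dim \mathcal{H}_\nu , \qquad D = \dim \mathcal{H}_E .
\]
The theorem then says that for any initial state $|\psi(0)\rangle$ in the energy shell, and for most times $t$,
\[
\langle \psi(t) | P_\nu | \psi(t) \rangle \;\approx\; \frac{d_\nu}{D} \quad \text{for every } \nu ,
\]
where $P_\nu$ projects onto $\mathcal{H}_\nu$. Since several macroscopically distinct macro-spaces have comparable dimension $d_\nu$, the state carries non-negligible weight in more than one of them at most times -- i.e., it is a macroscopic superposition.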

See this earlier post about the paper:
It may come as a surprise to many physicists that Schrodinger evolution in large isolated quantum systems leads generically to macroscopic superposition states. For example, in the familiar Brownian motion setup of a ball interacting with a gas of particles, after sufficient time the system evolves into a superposition state with the ball in macroscopically different locations. We use von Neumann's 1929 Quantum Ergodic Theorem as a tool to deduce this dynamical result. 

The natural state of a complex quantum system is a superposition (a "Schrodinger cat state"!), absent mysterious wavefunction collapse, a process that has yet to be fully defined either in logical terms or in explicit dynamics. Indeed, wavefunction collapse may not be necessary to explain the phenomenology of quantum mechanics. This is the underappreciated meaning of work on decoherence dating back to Zeh and Everett. See the talk slides linked here, or the introduction of this paper.
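Schematically (an illustrative cartoon of the ball-in-gas example above, not an equation taken from the paper): an initial product state with the ball localized evolves unitarily into an entangled superposition over macroscopically distinct ball positions,
\[
|x_0\rangle_{\rm ball} \otimes |\Phi_0\rangle_{\rm gas}
\;\xrightarrow{\;e^{-iHt/\hbar}\;}\;
\sum_i c_i(t)\, |x_i\rangle_{\rm ball} \otimes |\Phi_i(t)\rangle_{\rm gas} ,
\]
with the $x_i$ macroscopically separated and the gas states nearly orthogonal, $\langle \Phi_i | \Phi_j \rangle \approx \delta_{ij}$ (decoherence). Nothing in the unitary evolution itself singles out one ball position; that is exactly the point at issue for interpretations of quantum mechanics.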

We also derive some new (sharper) concentration of measure bounds that can be applied to small systems (e.g., fewer than 10 qubits). 
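For context, the textbook concentration result in this setting is Levy's lemma (quoted here only for orientation; it is not one of the sharper bounds in the paper, and the constant is stated differently in different sources): for a state $|\psi\rangle$ drawn uniformly from the unit sphere of an $N$-dimensional Hilbert space and a function $f$ with Lipschitz constant $\eta$,
\[
\Pr\big[\, |f(\psi) - \mathbb{E} f| \ge \epsilon \,\big] \;\le\; 2 \exp\!\left( - \frac{C N \epsilon^2}{\eta^2} \right),
\]
with $C$ an absolute constant. The suppression is exponential in $N$, so the bound is powerful for large systems but nearly useless for a handful of qubits, which is why sharper bounds are needed in that regime.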

Related posts:




Wednesday, May 26, 2021

How Dominic Cummings And The Warner Brothers Saved The UK




The photo above shows the whiteboard in the Prime Minister's office that Dominic Cummings and his team (including the brothers Marc and Ben Warner) used to convince Boris Johnson to abandon the UK government's COVID herd immunity plan and enter lockdown. Date: March 13, 2020.

Only now can the full story be told. In early 2020 the UK government had a COVID herd immunity plan in place that would have resulted in disaster. The scientific experts (SAGE) advising the government strongly supported this plan -- there are public, on-the-record briefings to this effect. These are people who are not particularly good at order-of-magnitude estimates and first-principles reasoning.
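To see why "disaster" is not hyperbole, here is the kind of back-of-the-envelope estimate involved. The numbers below are round illustrative figures of my own choosing, not values used by anyone in the meetings described here:

# Back-of-the-envelope herd immunity arithmetic (illustrative round numbers only).
uk_population = 67_000_000   # approximate UK population
r0 = 3.0                     # assumed basic reproduction number
ifr = 0.01                   # assumed infection fatality rate (~1 percent)

herd_immunity_fraction = 1.0 - 1.0 / r0          # classic threshold: 1 - 1/R0
infections = uk_population * herd_immunity_fraction
deaths = infections * ifr

print(f"Infections to reach herd immunity: {infections:,.0f}")   # roughly 45 million
print(f"Deaths at this IFR: {deaths:,.0f}")                      # roughly 450,000

Even with generous assumptions, the arithmetic implies hundreds of thousands of deaths, before accounting for an overwhelmed NHS pushing the fatality rate higher.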

Fortunately, Dom was advised by the brothers Marc and Ben Warner (both physics PhDs, now working in AI and data science), DeepMind founder Demis Hassabis, Fields Medalist Tim Gowers, and others. In his testimony (see ~23m, ~35m, ~1h02m, ~1h06m in the video below) he describes the rather dramatic events that led to the switch from the original herd immunity plan to a lockdown Plan B. More details in this tweet thread.


I checked my emails with Dom during February and March, and they confirm his narrative. I wrote the March 9 blog post Covid-19 Notes in part for Dom and his team, and I think it holds up over time. Tim Gowers' document reaches similar conclusions.


 

Seven hours of riveting Dominic Cummings testimony from earlier today. 


Shorter summary video (Channel 4). Summary live-blog from the Guardian.



This is a second whiteboard, used in the March 14 meeting with Boris Johnson:


