Wednesday, August 30, 2023

The World of Yesterday: Steve Hsu on polygenic scores, gene editing, human flourishing

 

I really enjoyed this long conversation with Dan Schulz, an MSU engineering grad who works in tech. Dan did his homework and we covered a lot of important topics.

Transcript: https://www.danschulz.co/p/3-steve-hsu 
Apple: https://apple.co/44eTSrJ 
Spotify: https://spoti.fi/3P03SzN 

Timestamps
 
(0:00:00) - Intro 
(0:00:33) - Genomic Prediction 
(0:05:54) - IVF 
(0:12:34) - Phenotypic data 
(0:15:42) - Predicting height 
(0:28:27) - Pleiotropy 
(0:39:14) - Optimism 
(0:45:03) - Gene editing 
(0:48:27) - Super intelligent humans 
(1:01:27) - Regulation 
(1:06:36) - Human values 
(1:17:38) - Should you do IVF? 
(1:26:06) - 23andMe 
(1:29:03) - Jeff Bezos 
(1:34:29) - Richard Feynman 
(1:43:43) - Where are the superstar physicists? 
(1:45:37) - Is physics a good field to get into?

Thursday, August 24, 2023

Aella: Sex Work, Sex Research, and Data Science — Manifold #42

 

Aella is a sex worker, sex researcher, and data scientist. 


Interviews with ex-prostitutes on the pimp life (Las Vegas) 

An earlier Aella interview with Reason: 


Audio-only and Transcript:

Steve and Aella discuss: 

(00:00) - Introduction 
(01:22) - Aella's background and upbringing 
(12:45) - Aella's experiences as a sex worker and escorting 
(29:52) - Pimp culture 
(38:01) - Seeking Arrangement 
(43:50) - Cheating 
(46:50) - OnlyFans, farming simps 
(51:49) - Incels and sex work 
(56:24) - Porn and Gen-Z 
(01:12:43) - Embryo screening 
(01:21:43) - How far off is IVG?

Thursday, August 10, 2023

AI on your phone? Tim Dettmers on quantization of neural networks — Manifold #41

 

Tim Dettmers develops computationally efficient methods for deep learning. He is a leader in quantization: coarse graining of large neural networks, i.e., representing their weights at lower numerical precision, to increase speed and reduce hardware requirements. 

Tim developed 4- and 8-bit quantization methods that enable training and inference with large language models on affordable GPUs and CPUs, i.e., the kind of hardware commonly found in home gaming rigs. 
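
The basic coarse-graining idea is simple enough to sketch in a few lines. Below is a minimal, illustrative Python/NumPy example of absmax 8-bit quantization of a single weight matrix. This is not Tim's code (his methods, such as LLM.int8() and QLoRA's 4-bit NF4 format, are considerably more sophisticated); it only shows the core trick: rescale each row so its largest weight maps to 127, store the weights as int8, and keep the float scale for approximate reconstruction.

```python
import numpy as np

def quantize_absmax_int8(W):
    """Quantize a float weight matrix to int8 with one scale per row."""
    scale = np.abs(W).max(axis=1, keepdims=True) / 127.0   # per-row float scale
    scale = np.where(scale == 0, 1.0, scale)                # guard all-zero rows
    W_int8 = np.round(W / scale).astype(np.int8)            # coarse-grained weights
    return W_int8, scale

def dequantize(W_int8, scale):
    """Recover an approximation of the original float weights."""
    return W_int8.astype(np.float32) * scale

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 8)).astype(np.float32)
W_q, s = quantize_absmax_int8(W)
W_hat = dequantize(W_q, s)
print("max reconstruction error:", np.abs(W - W_hat).max())
```

Storing int8 instead of float32 cuts weight memory by roughly 4x (4-bit formats by roughly 8x), which is what brings large-model inference within reach of consumer GPUs; production methods add refinements such as block-wise scales and special handling of outlier dimensions to preserve accuracy.
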

Tim and Steve discuss: Tim's background and current research program, large language models, quantization and performance, the democratization of AI technology, the open-source Cambrian explosion in AI, and the future of AI. 

0:00 Introduction and Tim’s background 
18:02 Tim's interest in the efficiency and accessibility of large language models 
38:05 Inference, speed, and the potential for using consumer GPUs for running large language models 
45:55 Model training and the benefits of quantization with QLoRA 
57:14 The future of AI and large language models in the next 3-5 years and beyond
