Taxonomy of procrastination
8 May 2023 | 12:00 am

Nobody gets everything they want in life. That's OK. If everyone were a sportscaster-rockstar-scientist-model-author-influencer-billionaire, we still wouldn't be happy because everyone else would be too busy to be impressed. But still, it's a little sad when you don't at least *try* to get what you want. My mental model is: Inside my head there's a guy named Jim. When I decide I want to do something, Jim does a calculation: How much time and energy will this take, and how much reward will it bring?

Numbers without which it’s impossible to talk about weight loss
4 May 2023 | 12:00 am

We lose weight when we burn more calories than we eat. But how much weight do you lose for a given caloric deficit? This isn’t complicated. But it’s not trivial either, because the body has two forms of energy reserves:

  • Body fat is familiar. This is used for long-term energy storage.

  • Glycogen is the other form of storage. It’s stored in your liver and muscle cells and is the primary form of short-term energy storage.

The issue is: Glycogen is nearly nine times heavier than body fat per stored calorie. This means that changes in diet produce sharp swings in body weight that are easy to misinterpret if you don’t think about glycogen. So do think about glycogen. Use these four numbers:

Bodyfat to store 1000 calories             0.13 kg (0.28 lb)
Glycogen to store 1000 calories            1.13 kg (2.5 lb)
Energy stored as glycogen, typical person  ~2000 calories
Weight of all glycogen, typical person     ~2.25 kg (5 lb)

Story time

Day 1. You’re an average adult and you stop eating for a day, creating a 2000-calorie per day deficit. What happens?

  • To stay alive, your body burns your entire 2.25kg (5lb) of stored glycogen.
  • Incidentally, this corresponds to around 0.5kg of sugar and 1.75kg of water. You’ll drink less or pee more to get rid of this.
  • At the end of the day, you’re 2.25kg (5lb) lighter.

Day 2. You continue not eating. Since all your glycogen is gone, your body needs to burn fat. To make 2000 calories, it burns 0.26kg (0.57lb) of bodyfat.

Days 3-6. You continue not eating (don’t do this) and after 6 days you’ve lost 2.25kg of glycogen and 1.3kg of fat for a total of 3.55kg.

Day 7. You have a cheat day and eat 4000 calories. What happens?

  • Your body uses 2000 calories to stay alive.
  • It uses the other 2000 calories to make 0.5kg of sugar.
  • It binds that sugar together with 1.75kg of water (obtained by peeing less or drinking more) to make 2.25kg of glycogen, which it packs away in your muscles and liver.

Now you’re only down 1.3kg, rather than 3.55kg like you were before the cheat day. But do you panic? No, you do not panic, because you remember your good friend dynomight telling you that (a) fluctuations always happen when you begin or end a diet because glycogen is heavy, and (b) huge deficits are needed to lose any significant amount of fat.

Note: In reality, things aren’t quite this simple: Your body burns a mixture of fat and glycogen where the glycogen decreases over time. But it does appear to deplete pretty quickly.
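The story above can be sketched in a few lines of Python. This is a toy model under the same simplifying assumption as the story (glycogen burns first and refills first, at the densities from the table), not a physiological simulation:

```python
# Toy model from the story: glycogen (with its water) is burned first on a
# deficit and rebuilt first on a surplus; fat only burns once glycogen is gone.
CAL_PER_KG_ADIPOSE = 1000 / 0.128        # ~7800 calories per kg of adipose tissue
CAL_PER_KG_GLYCOGEN = 1000 / 1.125       # ~890 calories per kg of hydrated glycogen
GLYCOGEN_CAPACITY_CAL = 2000             # typical person's total glycogen store

def simulate(daily_balance_cal, days=1, glycogen_cal=GLYCOGEN_CAPACITY_CAL):
    """Return (weight_change_kg, remaining_glycogen_cal) after `days` of a
    fixed daily calorie balance (intake minus expenditure)."""
    weight_change = 0.0
    for _ in range(days):
        balance = daily_balance_cal
        if balance < 0:
            # Burn glycogen first, then fat.
            from_glycogen = min(-balance, glycogen_cal)
            glycogen_cal -= from_glycogen
            weight_change -= from_glycogen / CAL_PER_KG_GLYCOGEN
            weight_change -= (-balance - from_glycogen) / CAL_PER_KG_ADIPOSE
        else:
            # Refill glycogen before anything else.
            to_glycogen = min(balance, GLYCOGEN_CAPACITY_CAL - glycogen_cal)
            glycogen_cal += to_glycogen
            weight_change += to_glycogen / CAL_PER_KG_GLYCOGEN
            # (Fat storage on a surplus is ignored in this sketch.)
    return weight_change, glycogen_cal

# Days 1-6: fasting at a 2000-calorie daily deficit.
change, glycogen = simulate(-2000, days=6)
print(round(change, 2))              # -3.53 (2.25 kg glycogen + ~1.28 kg fat)

# Day 7: a 2000-calorie surplus refills glycogen.
change7, _ = simulate(+2000, days=1, glycogen_cal=glycogen)
print(round(change + change7, 2))    # -1.28
```

The tiny differences from the story's numbers (3.53 vs 3.55) come from rounding 0.256 kg/day of fat up to 0.26 in the prose.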

Where do the numbers come from?

First, remember this table from grade school?

Macronutrient  Calories per gram
Carbohydrates  4
Protein        4
Fat            9

Second, how much body fat do you need to store 1000 calories? Well, that corresponds to 1000/9=111g of pure fat. However, the body fat or “adipose tissue” in your body is only 87% fat. (You can’t just store blobs of pure fat; you still need vascular cells and whatnot.) So you need 111/0.87 = 128g of adipose tissue:

    1000 calories
    × (1g fat / 9 calories)
    × (1g adipose tissue / 0.87g fat)
    = 128g adipose tissue.

Third, how much glycogen do you need to store 1000 calories? Well, glucose is a carbohydrate so you’d need 1000/4=250g of pure glucose. But to store it, your body bundles together lots of glucose molecules into the polymer called glycogen and then stores it in a hydrated form with around 3.5g of water for each gram of glucose (plus a tiny bit of potassium). So you store 250×4.5=1125g of hydrated glycogen.

    1000 calories
    × (1g glucose / 4 calories)
    × (4.5g hydrated glycogen / 1g glucose)
    = 1125g hydrated glycogen
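Both conversions can be written as one-liners. The 87% fat fraction of adipose tissue and the 3.5 g of water per gram of glucose are the figures used above:

```python
CAL_PER_G_FAT = 9
CAL_PER_G_CARB = 4
ADIPOSE_FAT_FRACTION = 0.87   # adipose tissue is ~87% actual fat
WATER_PER_G_GLUCOSE = 3.5     # glycogen is stored hydrated

def adipose_grams(calories):
    """Grams of adipose tissue needed to store `calories`."""
    return calories / CAL_PER_G_FAT / ADIPOSE_FAT_FRACTION

def hydrated_glycogen_grams(calories):
    """Grams of hydrated glycogen needed to store `calories`."""
    return calories / CAL_PER_G_CARB * (1 + WATER_PER_G_GLUCOSE)

print(round(adipose_grams(1000)))            # 128
print(round(hydrated_glycogen_grams(1000)))  # 1125
```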

Fourth, how much total energy do you store as glycogen? An average person might store around 500g of non-hydrated glycogen, mostly in the skeletal muscles, with a smaller amount in the liver. At 4 calories per gram of carbohydrate, that corresponds to 2000 calories.

Fifth, how much does all your stored glycogen weigh? Remember, you store it hydrated with 3.5g of water for each 1g of glucose. So the 500g of pure glycogen corresponds to around 500×4.5=2.25kg of hydrated glycogen. This is what people mean when they talk about “water weight”. It is mostly water, but how much you store is determined by calories, not how much water you drink.

Finally, for fun, how many calories do you store as fat? A typical person might have around 20% body fat (highly varying) and might weigh 80kg (also highly varying, and correlated). That would mean they have 16kg of body fat. That’s 16×0.87=13.9kg of pure fat or 13900×9=125000 calories.

    80,000g humanmeat
    × (0.2 g adipose tissue / 1 g humanmeat)
    × (0.87 g fat / 1 g adipose tissue)
    × (9 calories / 1 g fat)
    = 125,000 calories
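The last three estimates fit in a short script, using the same "typical person" figures as the text (500 g of non-hydrated glycogen; 80 kg at 20% body fat):

```python
# Typical-person totals, with figures from the text.
glycogen_g = 500                               # non-hydrated glycogen stored
glycogen_cal = glycogen_g * 4                  # carbs are 4 calories per gram
glycogen_weight_kg = glycogen_g * 4.5 / 1000   # hydrated: +3.5 g water per g glucose

body_kg = 80
body_fat_fraction = 0.20
adipose_kg = body_kg * body_fat_fraction       # 16 kg of adipose tissue
fat_cal = adipose_kg * 0.87 * 9 * 1000         # 87% actual fat, 9 cal/g

print(glycogen_cal)        # 2000
print(glycogen_weight_kg)  # 2.25
print(round(fat_cal))      # 125280 (~125,000)
```

So a typical person carries roughly 60 times more energy as fat than as glycogen, which is why the glycogen buffer empties so fast under a deficit.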

Reasons after nonpersons
24 April 2023 | 12:00 am

The year is 2029 and you’ve finally done it. You’ve written the first AI program that is so good—so damn good—that no one can deny that it thinks and has subjective experiences.

Surprisingly, the world doesn’t end. Everything seems fine. Except—all those bongcloud philosophy 101 thought experiments people have done for generations suddenly seem much less hypothetical. But also, kind of tedious?


[Image: Han Solo frozen in carbonite]

Bongcloud 101: When Vader froze Han in carbonite, did he, like, die? And did he get reincarnated later? If he was never unfrozen, then when did he die?

You 2029: If you stop your program, do you “kill” it?


Bongcloud 101: A thousand years from now, you discover a mineshaft with 500 cryogenically frozen bodies. If you don’t revive them, is that murder?

You 2029: You’re walking down the street and find a thumb drive with a bunch of AI programs on it. Are you ethically obligated to start them?



Bongcloud 101: I have severe epilepsy, which you treat by cutting my corpus callosum so my two cortical hemispheres can’t directly communicate. After, I say I’m fine and I seem fine, but sometimes my two arms try to do contradictory things. And if you block the light going into the left half of my eyes, I can’t tell you what’s in front of me—even though my left arm still immediately captures any available burritos. Are there two people in my head?

You 2029: You take your AI and split it into two modules with limited communication. Is there now a second consciousness?


Bongcloud 101: You and me, we love each other. So much that mere marriage won’t do. To honor our bond, what we need is high-bandwidth interconnect between all four of our cortical hemispheres. So we design a, like, interpersonal love-callosum. After installation, we’re confused at first, but a few weeks later, we act as one and report that we are a single sentience. Did one of us die?

You 2029: You take your AI and my AI and link them up. Did you “kill” one?


Bongcloud 101: Bored one night, you sneak into my bedroom and recode my neurons so I have the memories and personality of Napoleon. When I wake up, I’m less interested in philosophical rambling on the internet, and much more interested in taking over the world. Am I still me?

You 2029: You take your computer and switch out the AI program for a new one. Is it still the “same” AI?

(If these AI questions seem tedious, please remember that’s the point.)


Bongcloud 101: Once you were 8. Now you are 48. In all those years, you got mostly new memories, you gained a new appreciation for quiet restaurants and empty lawns, and most of your atoms turned over several times. Are you still the same person?

You 2029: You take your AI and slowly change it. Eventually, you’ve changed it a lot. Is it still the “same” AI?


Bongcloud 101: Your identical twin is in an accident and their cortex is destroyed. Always generous, you volunteer to have your left hemisphere removed and installed in their head. Both bodies survive and—while not the brightest—both say they are you. Which is right?

You 2029: You take half of your AI and move it to a new computer. Which computer is the old AI? Neither? Both?


Bongcloud 101: You’re having one of your edgelord what does it all mean life is finite why is there no permanence-type crises. But then you have a thought: Maybe you aren’t your atoms, but the pattern, so as long as the universe exists there’s a chance your pattern could be recreated. You bring this up with your math support group and get into a debate about how the probability of being recreated might evolve as the universe approaches thermodynamic equilibrium. Does immortality hinge on the solution to some infinite series?

You 2029: Someone blows up your computer. (Something about “mesa-optimizers” and the “orthogonality thesis”.) You rewrite your AI program and run it on a new computer. Is it still the “same”?


Bongcloud 101 (the archetypal bongcloud): You scan your body, destroy it, and then have a machine recreate it on Mars. Did you kill yourself, or just transport yourself? If you accidentally failed to destroy the old you, should you happily jump into a vat of acid, content that there’s another copy out there?

You 2029: When you move your AI to new hardware, you accidentally fail to turn off the old one at the same time. If you turn it off later, does that mean you killed it?


Here’s what I notice: When you do these experiments with people, they’re thought-provoking, if perhaps “embarrassingly bongcloud”. But when you translate them to AI, they seem pedantic and silly—it’s “obvious” that they don’t have meaningful answers.

I think this shows that these questions don’t have answers for people either. They are based on abstractions that seem fundamental, but that’s only because of contingencies in how evolution made us.

Here’s how life and consciousness seem to work:

  1. Consciousness comes bundled in discrete units called “animals”. Each consciousness is tied to a specific meat-computer.

  2. You can count how many conscious things are in a given room.

  3. Life has a beginning and end.

  4. You are the same person you were yesterday.

None of these seem to be true for AIs. Once we start saving and restoring and copying and splitting and merging them, I bet we’ll notice:

  • AIs aren’t really “born”, nor do they “die”. They are just systems that evolve over time.

  • The AI is independent of the hardware it’s running on.

  • It’s hard to say “how many” AIs exist. There will just be modules everywhere with different degrees of interconnect. Module 42,872,232 will have a huge fast pipe to module 42,872,231 on the same chip but a tiny slow pipe to module 52,030 on a satellite circling Neptune.

  • It’s often meaningless to ask if an AI is “the same” as some other AI.

But if 1-4 aren’t true for AIs, then they aren’t really true for people either, right? If people could connect and merge and remix their brains at will, we’d look at personhood very differently. Points 1-4 are illusions that seem true only because evolution happened to design us in a way where they are almost true, aside from the occasional glitch in the matrix like split-brain patients or twilight anesthesia.

Of course, philosophers have been questioning these things for a long time. But those experiments seem very hypothetical. The illusions are very convincing, and I don’t think you can (or should?) act like they aren’t true.

But when we can actually talk to AIs, where these things seem obviously false, the vibe will be very different. The illusion will be harder to maintain, and I think we’ll shift towards a real-world bongcloud vision where “you” are “one corner of the consciousness field” (or whatever) experiencing an illusion of selfhood and continuity because evolution programmed that into you.
