Endless growth
3 May 2024 | 9:12 am

Stumbling on a post by an American author whom I highly respect reignited my doubts about the dominant growth-at-all-costs culture coming from the US.

While expressing my thoughts about minimalism sold as a product, I indulged in the uncomfortable question of whether I was biased against what I assumed to be an American tendency to commodify everything. Back then, the only feedback I got from people living in the US was a passive-aggressive post on Mastodon that ended with a meme, smugly invalidating any opinion about American values that might come from abroad.

A few days ago, on one of my favourite blogs, Life Is Such A Sweet Insanity, a post about travelling on a budget airline contained the following illuminating thought:

For some reason, the American mindset is endless growth. Everything must get bigger, everything must get better, and more, more, more, how do you like it, how do you like it. But the truth of the matter is, nothing natural undergoes infinite growth, other than some cancers.

– J.P. Wing

J.P. is an amazing writer, and I share an awful lot of his attitude, fully respecting his opinions on the rare occasions when they don’t align with mine. He’s American. So here I go again: why does most of this destructive growth-at-all-costs culture seem to come from there?

I’ve recently decided to stop reading The Conversation, after two consecutive posts openly accused Europe’s investors of not doing enough to be more like Silicon Valley. I’m seriously confused: how can anyone really believe, in 2024, that their business model is anything close to sustainable? The mental subservience that parts of Europe still seem to have towards the rot economy, fuelled by a type of capitalism not native to the continent, is truly bewildering.

Reply via email

More Indieweb Automation
2 May 2024 | 7:17 pm

Follow-up to a previous case study on how I automated my static website publishing workflow. This time, a lean Shortcuts script lets me write webmentions in seconds.

For a while now I’ve been using notes to send Indieweb webmentions in the form of replies and likes. As these tend to be very short pieces, I thought again about how to bypass the annoying bits in the process of creating a note. When I experimented with Apple’s Shortcuts to find ways to expedite my writing workflow, I discovered it can be a pretty powerful tool.

Unlike regular posts, likes and replies only require a URL, a name, and, in the case of replies, a bit of written content. Everything else can be inferred automatically (date, file name, creation of the file, tags). That’s when I decided to make the creation of a webmention with Jekyll as fast as possible. What I’ve ended up with is an icon in the Dock that I can click whenever I’m ready to reply to or like someone’s post. The applet asks me for:

  • the URL of the post I’m liking or replying to;
  • the name of the person that I’m mentioning;
  • the title of the post (for likes only).

Everything else is created in the Shortcut automation:

Screenshot of the Apple software Shortcuts showing an automation to publish content on my website

Similar to what I achieved with the previous new-post automation, once I input the required info in pop-up prompts, a file is created, the editor Typora is launched, and a Terminal session is opened and minimised in the Dock, with the proper alias commands to serve the website locally. At that point, I can write my reply and be done with it in a very short time.
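For context, the generated note needs little more than front matter; here is a sketch of what such a reply file might contain (the field names and values are my assumptions for illustration, not this site’s exact schema):

```yaml
---
layout: note
date: 2024-05-02
reply-to: https://example.com/some-post
name: Jane Doe
---
```

With everything above filled in by the Shortcut, the only part left to type by hand is the reply text below the front matter.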

After duplicating and adjusting the script for likes, I ended up with three automation icons: new post, new reply, new like.

Screenshot of a section of a macOS Dock showing three blue icons for new posts, Indieweb replies, and likes

Reply via email

Re: Blocking Bots
2 May 2024 | 3:41 pm

Inspired by Neil Clarke and Ethan Marcotte, I moved my list of crawlers to a Jekyll YAML data file, and now use it to compile both the .htaccess and robots.txt files.

The premise is simple: to opt out of AI bots scraping my website and feeding the ongoing training of LLMs, I used to block a bunch of them via the old and trusted robots.txt. Since a rewrite condition within Apache’s .htaccess adds a further level of protection, I went on and created a single data file, writing the logic that feeds both files from it.
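The data file itself is just a flat YAML list of user-agent strings. The entries below are a sketch: CCBot, GPTBot, and Google-Extended are real AI crawler user agents, but the actual list is longer.

```yaml
# _data/bots.yml — AI crawler user agents to block (illustrative excerpt)
- CCBot
- GPTBot
- Google-Extended
```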

Once I had created bots.yml in my Jekyll data directory, I used a loop to iterate through its items in my robots.liquid source file:

---
layout: none
permalink: /robots.txt
---
{%- for item in site.data.bots %}
User-agent: {{ item }}
Disallow: /
{% endfor %}
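To make the output concrete: assuming the data file listed, say, CCBot and GPTBot (used here only as examples), the generated file would contain one Disallow block per bot, roughly:

```text
User-agent: CCBot
Disallow: /

User-agent: GPTBot
Disallow: /
```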

The end result after a site build is a /robots.txt file containing the entire list of disallowed bots. The rewrite instructions to block AI crawlers in the .htaccess file are the same as suggested by Ethan. Instead of performing a for loop, I just print them inline within a rewrite condition, separated by a pipe character:

# Block bots
<IfModule mod_rewrite.c>
  RewriteEngine on
  RewriteBase /
  RewriteCond %{HTTP_USER_AGENT} ({{ site.data.bots | sort_natural | join: "|" }}) [NC]
  RewriteRule ^ - [F]
</IfModule>

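With the same two example bots in the data file, the Liquid join filter would render the condition to a single line, roughly:

```apache
RewriteCond %{HTTP_USER_AGENT} (CCBot|GPTBot) [NC]
```

Any request whose user agent matches one of the alternatives, case-insensitively thanks to the [NC] flag, is then answered with a 403 Forbidden by the [F] rule.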
While I was at it, I also optimised the .htaccess.liquid source file by creating a further YAML data file with all my redirects, looping through them in a now neat source file:

# Redirects
{%- for item in site.data.redirects %}
Redirect 301 {{ item }}
{%- endfor %}
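Since the template prints each item verbatim after Redirect 301, the data file can be a flat list of "old-path new-destination" strings. A minimal sketch, with hypothetical paths:

```yaml
# _data/redirects.yml — each entry is an "old-path new-destination" pair (paths are illustrative)
- /old-page /new-page
- /2019/some-post /articles/some-post
```

Each entry then renders to a line like Redirect 301 /old-page /new-page in the built .htaccess.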

Reply via email
