
Allan Lasser

Compostmodernist

Currently Reading

Tomorrow, and Tomorrow, and Tomorrow

Gabrielle Zevin

Recently Read

Saving Time

Discovering a Life Beyond the Clock

Jenny Odell


The Ends of the World

Volcanic apocalypses, lethal oceans, and our quest to understand Earth's past mass extinctions

Peter Brannen

  • An Interview with Former TechCrunch Editor-in-Chief Matthew Panzarino About Covering Technology

    stratechery.com

    I couldn’t bring myself to just say, “Hey, we’re going to eat dirt with a kid and a mortgage and all this stuff,” so I just woke up at 3:00 AM instead and wrote for three, four hours, then went to work. Worked some retail, did my photography jobs, came home, had dinner with my wife, put her to bed, and then wrote for another two, three hours, then got two or three hours sleep. I just did that for two years, I just ground it out. I wrote, I don’t know, however many articles.

    [...]

    There was no magic to it, it wasn’t any big break stuff. It was just really grinding. Just publish, publish, publish, then I did that for two years.

    And then two years into it, I just saw a tweet and somebody was like, “Hey, we are looking for a West Coast writer” and I looked up the website and I’m like, this is run by some crazy Dutch guys and they’re trying to compete with TechCrunch and all these other websites, and that was The Next Web. They very graciously took a chance on me and I said, “Here’s my portfolio, here’s my website, go look at it. This is my portfolio, there’s nothing more to this, I don’t have a degree to show you, I don’t have a journalism credential. What I have is this body of work.”

    What I have is two years of waking up at three in the morning and hustling.

    That’s it. And I said, “Read the stuff at the end.” Read the newest things.

    11/29/2023 7:10 PM

  • Friday, October 13, 2023

    Progressively Enhanced Search with Next.js 13 and React Server Components

    I just recently upgraded my digital garden with search and, like adding feeds, the Next.js App Router made this feature extremely easy to accomplish. But more than a pleasant developer experience, I'm most excited by how simple it was to create a progressively enhanced search interface with React Server Components.

    The foundation of the search component is statically-rendered, browser-native HTML that provides a fast, robust, and secure baseline search experience. If you're on a very slow connection, searching the site should be faster than waiting for the interactive JS to finish loading. But, once that JS does load, I can layer interactive affordances on top, like providing instant results as you type and improving navigation.

    While this was ultimately a fairly simple exercise, I have a few reasons for sharing the process of developing this component. First, search is a very common pattern and the more documentation on how to build it, the better. Second, this example touches on many new features introduced to React and Next.js while still being simple enough to wrap your head around. Finally, I wanted to share how this framework enables developers to follow a path of progressive enhancement from the very beginning, instead of just shrugging off accessibility until "some time later on."

    Creating a progressively enhanced search component with Next.js App Router and React Server Components

    The first thing I did was create the statically-rendered HTML foundation for this feature. I started by creating some data types, a search function, a <Search /> component, and a new /search page:

    src/types/search.ts
    export interface SearchResult {
      _id: string;
      title: string;
    }
    
    export interface SearchResponse {
      results?: SearchResult[];
      error?: string;
    }
    src/data/search.ts
    import type { SearchResponse } from "src/types/search";
    
    export default async function searchFn(
      query?: string
    ): Promise<SearchResponse> {
      if (!query || query.length < 2) return {};
      try {
        // Provide your own search logic here!
        // (Returning an empty list keeps this placeholder compilable.)
        return { results: [] };
      } catch (e) {
        return { error: String(e) };
      }
    }
    

    This search function could do anything—maybe it searches by calling a service like Algolia, or maybe it directly queries a Postgres database. Since the function runs on the server, I'm not risking leaking credentials to the client.
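
    For illustration, here's a minimal sketch of what a Postgres-backed version of searchFn might look like. This is purely hypothetical (the pg connection pool and a notes table with _id and title columns are assumptions, not part of my actual setup), but it shows how the credentials stay in a server-side environment variable:

    src/data/search.ts (hypothetical Postgres-backed variant)
    import { Pool } from "pg";
    import type { SearchResult, SearchResponse } from "src/types/search";
    
    // The connection string is read only on the server, so it never reaches the client.
    const pool = new Pool({ connectionString: process.env.DATABASE_URL });
    
    export default async function searchFn(
      query?: string
    ): Promise<SearchResponse> {
      if (!query || query.length < 2) return {};
      try {
        // Parameterized query avoids SQL injection; ILIKE matches case-insensitively.
        const { rows } = await pool.query<SearchResult>(
          "SELECT _id, title FROM notes WHERE title ILIKE $1 LIMIT 20",
          [`%${query}%`]
        );
        return { results: rows };
      } catch (e) {
        return { error: String(e) };
      }
    }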

    src/components/search.tsx
    import type { SearchResponse, SearchResult } from "src/types/search";
    
    function ResultsList({ results }: { results?: SearchResult[] }) {
      if (!results || !results?.length) return null;
      return (
        <ul>
          {results.map((result) => (
            <li key={result._id}>{result.title}</li>
          ))}
        </ul>
      );
    }
    
    interface SearchProps {
      initialData: {
        query: string;
        response: SearchResponse;
      };
    }
    
    export default function Search(props: SearchProps) {
      const { query, response } = props.initialData;
      return (
        <div>
          <form method='GET' action='/search'>
            <input name='query' value={query} />
          </form>
          <ResultsList results={response?.results} />
        </div>
      );
    }
    

    After defining the Search component, I rendered it onto a /search page:

    src/app/search/page.tsx
    import Search from "src/components/search"
    import searchFn from "src/data/search"
    
    interface PageParams {
      searchParams: Record<string, string>;
    }
    
    export default async function SearchPage({ searchParams }: PageParams) {
      const { query } = searchParams;
      const response = await searchFn(query);
    
      return <Search initialData={{query, response}} />
    }

    With just this, I already have a fully working search! This relies on the fundamental form functionality provided in every browser. On submission, the form redirects to the search page and provides the query as a URL search parameter. Since the SearchPage is server-rendered, it runs the search and passes the results down to the component without a lick of JavaScript. As a bonus, the search field will be pre-populated with the query string if we refresh the page, or navigate forward or back later.

    Now, with this rock-solid foundation in place, I started to progressively enhance the search experience with JavaScript-driven interactions to improve its responsiveness and utility.

    Enhancement #1: Instant Search

    It's nice getting search results after hitting enter, but it's even nicer to see the results update live as you type.

    Since this enhancement has client-side interactivity, I needed to add some asynchronous data fetching behavior and state management inside the component. The simplest way to add this dynamic functionality with React was to encapsulate it as a hook:

    src/hooks/useSearch.ts
    import { useCallback, useEffect, useRef, useState } from "react";
    import { SearchResponse } from "src/types/search";
    
    const SEARCH_DELAY = 250; // milliseconds
    
    export interface InitialState {
      query?: string;
      response?: SearchResponse;
    }
    
    export function useSearch(initialState?: InitialState) {
      const [query, setQuery] = useState(initialState?.query ?? "");
      const [response, setResponse] = useState(initialState?.response ?? {});
      const prevQueryRef = useRef<string>(query);
    
      const runSearch = useCallback(async () => {
        if (query !== prevQueryRef.current) {
          if (!query) {
            setResponse({});
          } else {
            try {
              const fetchRes = await fetch(`/api/search?query=${query}`);
              const searchRes: SearchResponse = await fetchRes.json();
              setResponse(searchRes);
            } catch (e) {
              setResponse({ error: String(e) });
            }
          }
        }
        prevQueryRef.current = query;
      }, [query]);
    
      // Debounce the search so we're not running on every keystroke
      useEffect(() => {
        const timeout = setTimeout(runSearch, SEARCH_DELAY);
        return () => {
          clearTimeout(timeout);
        };
      }, [runSearch]);
    
      return {
        query,
        response,
        setQuery,
      };
    }

    This hook declares the query and response as stateful values, returning them to the component along with a function to update the query state. It also tracks the value of the previous query with a ref, which persists the value between component rendering cycles. It then defines a search function and runs it whenever the value of query changes. Finally, the search runs are debounced to ensure we're not firing a request on every keystroke.

    Since useEffect can be confusing, it's worth looking closer at how the dependency arrays ensure the search function runs as we expect. The runSearch function is declared with useCallback and a dependency array of [query], so runSearch is only reevaluated after the query changes. Likewise, the effect runs again whenever the value of runSearch is reevaluated, and only then. This dependency chaining means, indirectly, that the effect runs the search function every time the query changes.
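
    Distilled to just the dependency chain, the pattern looks like this (an illustrative sketch, not new code; the names match the hook above):

    // runSearch is re-created only when `query` changes
    const runSearch = useCallback(async () => {
      /* ...fetch results for `query`... */
    }, [query]);
    
    // The effect re-runs only when `runSearch` is re-created,
    // which, via the chain, means: only when `query` changes.
    useEffect(() => {
      const timeout = setTimeout(runSearch, SEARCH_DELAY);
      return () => clearTimeout(timeout); // cancel the pending run if a newer one is scheduled
    }, [runSearch]);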

    It's also worth noting that I'm using a different search function in the hook than the one defined in src/data/search.ts and used by src/app/search/page.tsx. That's because the code in the hook will only ever execute on the client. I always try to avoid making external calls from client code, which risks exposing any API keys or other potentially sensitive information to clients (or, at the very least, risks shipping broken code that cannot access necessary environment variables).

    Instead, I took advantage of Route Handlers in Next.js 13 to create a new endpoint to mediate between the client component and the data logic.

    src/app/api/search/route.ts
    import { type NextRequest, NextResponse } from "next/server";
    import type { SearchResponse } from "src/types/search";
    import searchFn from "src/data/search";
    
    export async function GET(request: NextRequest) {
      const query = request.nextUrl.searchParams.get("query") ?? "";
      const response = await searchFn(query);
      return NextResponse.json<SearchResponse>(response);
    }

    Finally, I updated the Search component to use my new useSearch hook.

    src/components/search.tsx
    "use client";
    
    import { ChangeEvent } from "react";
    import { useSearch, InitialState } from "src/hooks/useSearch";
    import type { SearchResult } from "src/types/search";
    
    function ResultsList({ results }: { results?: SearchResult[] }) {
      if (!results || !results?.length) return null;
      return (
        <ul>
          {results.map((result) => (
            <li key={result._id}>{result.title}</li>
          ))}
        </ul>
      );
    }
    
    interface SearchProps {
      initialData?: InitialState;
    }
    
    export default function Search(props: SearchProps) {
      const { query, response, setQuery } = useSearch(props.initialData);
    
      function handleChange(event: ChangeEvent<HTMLInputElement>) {
        setQuery(event.currentTarget.value);
      }
    
      return (
        <div>
          <form method='GET' action='/search'>
            <input name='query' value={query} onChange={handleChange} />
          </form>
          <ResultsList results={response?.results} />
        </div>
      );
    }

    Compared to the component I first created, this one:

    • Declares "use client" at the top of the file, identifying it to Next as a client component.
    • Assumes the initial data it receives is the same as the initial state of useSearch.
    • Uses the stateful values of query and response provided by the hook instead of the ones directly passed to the component.
    • Makes the search field a controlled component by locking its value to the stateful query value; to update the value, I added a function that updates the state whenever the field emits an onChange event.

    The Search component will still work without JavaScript, but now when JavaScript is available it will show results instantly as you type. Of course, this behavior could be further customized. It could instead show search suggestions for a type-ahead experience, or prefetch result data without rendering it until the form is submitted.

    Enhancement #2: Navigation

    Speaking of form submission: by relying on basic browser behavior, the form will automatically redirect to the /search page upon submission. Without changing anything, this would run the query on the server and render a new page with the results.

    But, since Next.js provides client-side routing, I wanted to make this interaction even smoother for users who've loaded JS. By handling the form's onSubmit event, I upgraded the form behavior to prefer client-side routing when it's available.

    It's helpful to have the query preserved in the navigation history, so that if I choose a result and then use the browser's "back" button, I'm returned to the same search I just preformed. But when using the enhanced instant search, the query isn't preserved in the history.

    This was an easy fix. I added a line to the hook's search function to update the route after the search completes. While I was at it, I consolidated my onChange and onSubmit handlers into the hook. Now my hook provides everything the enhanced component needs, without concerning the component with any underlying state management.

    src/hooks/useSearch.ts
    import {
      ChangeEvent,
      FormEvent,
      useCallback,
      useEffect,
      useRef,
      useState,
    } from "react";
    import { useRouter } from "next/navigation";
    import { SearchResponse } from "src/types/search";
    
    const SEARCH_DELAY = 250; // milliseconds
    
    export interface InitialState {
      query?: string;
      response?: SearchResponse;
    }
    
    export function useSearch(initialState?: InitialState) {
      const [query, setQuery] = useState(initialState?.query ?? "");
      const [response, setResponse] = useState(initialState?.response ?? {});
      const prevQueryRef = useRef<string>(query);
      const router = useRouter();
    
      const handleChange = (event: ChangeEvent<HTMLInputElement>) => {
        setQuery(event.currentTarget.value);
      };
    
      const handleSubmit = useCallback(
        (event?: FormEvent<HTMLFormElement>) => {
          event?.preventDefault();
          router.push(`/search?query=${query}`);
        },
        [router, query]
      );
    
      const runSearch = useCallback(async () => {
        if (query !== prevQueryRef.current) {
          if (!query) {
            setResponse({});
          } else {
            try {
              router.push(`?query=${query}`);
              const fetchRes = await fetch(`/api/search?query=${query}`);
              const searchRes: SearchResponse = await fetchRes.json();
              setResponse(searchRes);
            } catch (e) {
              setResponse({ error: String(e) });
            }
          }
        }
        prevQueryRef.current = query;
      }, [query, router]);
    
      // Debounce the search so we're not running on every keystroke
      useEffect(() => {
        const timeout = setTimeout(runSearch, SEARCH_DELAY);
        return () => {
          clearTimeout(timeout);
        };
      }, [runSearch]);
    
      return {
        query,
        response,
        handleChange,
        handleSubmit,
      };
    }
    

    (I opted to update the search param during search, rather than on every change to the query value, to avoid adding unhelpful noise into a visitor's browser history.)

    src/components/search.tsx
    "use client";
    
    import { useSearch, InitialState } from "src/hooks/useSearch";
    import type { SearchResult } from "src/types/search";
    
    function ResultsList({ results }: { results?: SearchResult[] }) {
      if (!results || !results?.length) return null;
      return (
        <ul>
          {results.map((result) => (
            <li key={result._id}>{result.title}</li>
          ))}
        </ul>
      );
    }
    
    interface SearchProps {
      initialData?: InitialState;
    }
    
    export default function Search(props: SearchProps) {
      const { query, response, handleChange, handleSubmit } = useSearch(
        props.initialData
      );
    
      return (
        <div>
          <form method='GET' action='/search' onSubmit={handleSubmit}>
            <input name='query' value={query} onChange={handleChange} />
          </form>
          <ResultsList results={response?.results} />
        </div>
      );
    }
    

    Wrapping Up

    By the end of this process, I had only added around 250 lines of code across six files:

    1. A type declaration file (which could have been inlined elsewhere).
    2. A data handling file, to keep sensitive logic isolated from components.
    3. A server component that works without any JavaScript enhancement, which was then upgraded to a client component when enhancements were layered on.
    4. A hook that provides all the enhanced client-side search functionality (which, again, could have been inlined alongside the client component).
    5. A user-facing /search page to render the component and load results, which also server-renders results pages for users without JavaScript.
    6. A /api/search route handler to allow my client-side functionality to safely call my data handling function.

    Over half of this was in service of enhanced client-side functionality, which, while totally optional, was very easy to include.

    Progressive enhancement is a process, not only an outcome

    In my experience, when a software company promises to improve the accessibility or compatibility of a feature at a later time, that promise almost never comes true. It's understandable why that's the case. There are always new demands and higher priorities that are unforeseeable from the start.

    That's why it's important for web developers to use the technologies and strategies that make progressive enhancement part of the development process from the outset. That's also why I'm so excited by the continued development of Next.js and React to make it easier than ever to develop and deploy progressively enhanced frontends.

    With Next App Router and React Server Components, I had the framework to easily develop and render static HTML that quickly achieved fundamental functionality. That provided me the foundation to layer on richer functionality for an improved experience. Progressive enhancement was part of the development process from the get-go, leading to a component that works for everybody, all the time.

  • CEO pay slightly declined in 2022

    epi.org

    Adam Tooze’s most recent Chartbook newsletter opens with this study on CEO pay.

    Since CEO pay is mostly stock based, calculating it is not entirely straightforward because the value of stocks is continually changing. We use two measures to give a fuller picture: a backward-looking measure—realized compensation—and a forward-looking measure—granted compensation. Using the realized compensation measure, compensation of the top CEOs shot up 1,209.2% from 1978 to 2022 (adjusting for inflation). Top CEO compensation grew roughly 28.1% faster than stock market growth during this period and far eclipsed the slow 15.3% growth in a typical worker’s annual compensation. CEO granted compensation rose 1,046.9% from 1978 to 2022.

    In this study, “the CEOs examined [...] head large firms.” But even within startups, it’s funny how so many founders give themselves a C-level title when there’s really nobody to manage and no real business underneath them.

    CEO compensation has even been breaking away from that of other very highly paid workers. Over the last three decades, compensation grew far faster for CEOs than it did for the top 0.1% of wage earners (those earning more than 99.9% of wage earners). CEO compensation in 2021 (the latest year for which data on top 0.1% wage earners are available) was 7.68 times as high as wages of the top 0.1% of wage earners, a ratio 4.1 points greater than the 3.61-to-1 average CEO-to-top-0.1% ratio over the 1951–1979 period.

    The fact that CEO compensation has grown much faster than the pay of the top 0.1% of wage earners indicates that CEO compensation growth does not simply reflect a competitive race for skills (the “market for talent”) that would also increase the value of highly paid professionals more generally. Rather, the growing pay differential between CEOs and top 0.1% earners suggests the growth of substantial economic rents (income not related to a corresponding growth of productivity) in CEO compensation. CEO compensation does not appear to reflect the greater productivity of executives but their ability to extract concessions from corporate boards—a power that stems from dysfunctional systems of corporate governance in the United States. But because so much of CEOs’ income constitutes economic rent, there would be no adverse impact on the economy’s output or on employment if CEOs earned less or were taxed more.

    The report also provides some policy recommendations to reverse the trend:

    Ideally, tax reforms would be paired with changes in corporate governance:

    • Implementing higher marginal income tax rates at the very top would limit rent-seeking behavior and reduce the incentives for executives to push for such high pay.
    • Another option is to set corporate tax rates higher for firms that have higher ratios of CEO-to-worker compensation. Clifford (2017) recommends setting a cap on executive compensation and taxing companies on any amount over the cap, similar to the way baseball team payrolls are taxed when salaries exceed a cap.

    Dysfunctional governance can explain income inequality within businesses as well as within American society at large. But the solution for income inequality in both situations is the same! If taxing the 1% is an argument for more efficient, competitive businesses, can that make it more politically popular? Not while government—national or corporate—is trying to appease CEOs at the expense of everyone else.

    10/2/2023 2:26 PM
  • Treading Thin Air

    lrb.co.uk

    We have reached a stage of global warming at which every decision is critical: we don’t know when our last chance will have been. So when, for example, we base the vast part of our climate policy on offset markets and carbon taxes, as we are doing, and proceed to calculate the social cost of carbon to determine an ‘optimal’ carbon tax that ‘efficiently’ manages the ‘trade-offs’ between the costs and benefits of emitting GHGs, we are doing something much more dangerous than is usually acknowledged. A precise calculation of the ‘optimal’ carbon tax is nothing more than a claim that the best way forward is to perch the gargantuan machine of contemporary capitalism as close as possible to the precipice without tipping us all over the edge. That is neither efficient nor optimal. It is a myopic and recklessly arrogant approach to the unknown fate of life on earth.

    What we need is a much more honest assessment of what we do not or cannot know, which is, among other important things, where the edge is. We might, in fact, be past it already, treading thin air like Wile E. Coyote before the fall. Today’s politicians don’t like uncertainty: it introduces doubt. Yet we are in desperate need of a politics that looks catastrophic uncertainty square in the face. That would mean taking much bigger and more transformative steps: all but eliminating fossil fuels, for a start, and prioritising democratic institutions over markets. The burden of this effort must fall almost entirely on the richest people and richest parts of the world, because it is they who continue to gamble with everyone else’s fate.

    9/18/2023 12:11 PM

  • Monday, August 28, 2023

    Feast of Lanterns

    I helped organize and throw the twentieth Feast of Lanterns on August 26th in Indianapolis's Spades Park. The event drew over 10,000 Indianapolis residents and earned tens of thousands of dollars in revenue for local businesses.

    Families gathered on the lawn in front of the stage.
    Friends and families eating at communal tables underneath Edison bulbs.

    While I had volunteered during setup and teardown in previous years, this was my first year as a member of the organizing committee. In this role, I provided organizational and logistical support for the rest of the team: administering digital tools and services, tracking project progress, and coordinating communication and shared files.

    I also designed and printed a number of graphic assets for the festival, including the poster and marketing graphics, a visitor's guide, various site maps detailing layout concerns, and twenty-foot-tall stage banners.

  • The Brain Behind ‘Barbie’

    rollingstone.com

    This Rolling Stone interview with Greta Gerwig has some amazing moments. I highly recommend reading it in full after seeing Barbie. Here are my highlights (interviewer in bold, emphasis mine):

    There’s a lovely scene where Barbie sees an older woman — a sight she’d never encountered in Barbieland — and tells her she’s beautiful.

    I love that scene so much. And the older woman on the bench is the costume designer Ann Roth. She’s a legend. It’s a cul-de-sac of a moment, in a way — it doesn’t lead anywhere. And in early cuts, looking at the movie, it was suggested, “Well, you could cut it. And actually, the story would move on just the same.” And I said, “If I cut the scene, I don’t know what this movie is about.”

    From the moment that Margot came to me and I knew we were making this for Margot, I equally knew we were making this for Ryan. And I did not know Ryan at all. I’d never met him. I just was sure, and as soon as I thought of it, it made me so happy. Who else could do this? It’s some combination of Marlon Brando meets Gene Wilder meets John Barrymore meets John Travolta.

    I felt with both of them [Robbie and Gosling] that I might direct movies for a long time and never see anything that uniquely and gloriously unhinged.

    I think of the film as humanist above anything else. How Barbie operates in Barbieland is she’s entirely continuous with her environment. Even the houses have no walls, because you never need to hide because there’s nothing to be ashamed of or embarrassed of. And suddenly finding yourself in the real world and wishing you could hide, that’s the essence of being human. But when we were actually shooting on Venice Beach, with Margot and Ryan in neon rollerblading outfits, it was fascinating because it was actually happening in front of us. People would go by Ryan, high-five him, and say, “Awesome, Ryan, you look great!” And they wouldn’t actually say anything to Margot. They’d just look at her. It was just surreal. In that moment, she did feel self-conscious. And as the director, I wanted to protect her. But I also knew that the scene we were shooting had to be the scene where she felt exposed. And she was exposed, both as a celebrity and as a lady. To be fair, Ryan was like, “I wish I wasn’t wearing this vest.” [Laughs.] But it was a different kind of discomfort.

    There are clips online of you and Kate onstage together in a production at Columbia University.

    We lived together, we were in an improv group together. I always thought Kate was the funniest, most talented person I knew. But then you have this moment where you think, “Well, maybe that was just college.” But I was right!

    When I was casting and I called her, we laughed the whole time because I think we both had the same experience at that moment. For whatever reason, with the direction that our lives led us, I’m actually directing this movie, and she actually is a comedic genius who was recognized as such. And now we’re adults, and I’m saying, “Do you want to come do this?” It was like, we’d gotten into a time machine when we were 18 and came out at 39. The reality is, we’re still the 18-year-old kids who are making musicals. We actually didn’t get more sophisticated than we were at 18.

    Does anyone? If not exactly 18, then that 18–24 range? Wiser hopefully, but I agree that sophistication has a ceiling.

    You’re a member of the Directors Guild, the Writers Guild, and the Actors Guild. The Writers Guild is already on strike, and the other guilds don’t seem too happy, either. There are whispers of a tri-Guild walkout.

    I’m really proud of being a union member. I’m in support 100 percent of however we come at this.

    I’m living through this moment like everybody else is, especially in terms of the AI thing, which is terrifying and exciting. I don’t know what to say about it. I guess it’s clearly a tool that hopefully can be used to help. I think it’s incredibly important to protect creative people — writers and directors and actors — because I don’t think what they can do can be replicated. We have to set some very firm ground rules moving forward. Because otherwise, we’re looking at a world that becomes a photocopy of a photocopy of a photocopy.

    7/24/2023 1:36 AM
  • The Ancient ‘Wonder Material’ Sucking CO2 Out of the Atmosphere

    reasonstobecheerful.world

    The logistics of biochar production also mean that rather than massive centralized facilities, the most workable large-scale deployment will require many thousands of mid-sized plants spread across the world, according to Reinaud.

    “Biochar production is necessarily a distributed enterprise. So if you don’t have 5,000 cows, and a huge amount of manure, it’s probably not very appealing to you. But when it’s in the same spot, it’s a really appealing sustainability proposition.”

    6/1/2023 8:07 PM
  • The dystopian lake filled by the world’s tech lust

    bbc.com

    The intriguing thing about both neodymium and cerium is that while they’re called rare earth minerals, they’re actually fairly common. Neodymium is no rarer than copper or nickel and quite evenly distributed throughout the world’s crust. While China produces 90% of the global market’s neodymium, only 30% of the world’s deposits are located there. Arguably, what makes it, and cerium, scarce enough to be profitable are the hugely hazardous and toxic process needed to extract them from ore and to refine them into usable products.

    More reading on neodymium, with this story focused on its global sourcing. This mineral is so prevalent in consumer electronics, it feels like a very important and underreported resource.

    It could be argued that China’s dominance of the rare earth market is less about geology and far more about the country’s willingness to take an environmental hit that other nations shy away from.

    Seems like there’s a correlation between the offshoring of industrial production and the establishment of stricter environmental policy in the 70s and 80s. It may be fair to say it’s less the country’s willingness to create toxic sites than it is the lack of safeguards against it.

    And there’s no better place to understand China’s true sacrifice than the shores of Baotou toxic lake. Apparently created by damming a river and flooding what was once farm land, the lake is a “tailings pond”: a dumping ground for waste byproducts.

    Whenever I spend time out in Colorado, I see the landscape as a kind of postindustrial wasteland: tailings piles, abandoned mines, and even the evolution of mining trams into chair lifts. Those landscapes are leftover from the last century’s mineral extraction of precious metals—gold, silver, copper—that fueled and funded westward expansion.

    For how different they are, one thing that America and China have in common is their landmass. Huge nations spanning continents. With all that space, it becomes easy to find far-flung areas where industrial processes and wastes can flow outside the awareness of the people they serve—which, based on the piece, is the global electronics market.

    4/20/2023 2:31 PM
  • What if climate change meant not doom — but abundance?

    washingtonpost.com

    Much of the reluctance to do what climate change requires comes from the assumption that it means trading abundance for austerity, and trading all our stuff and conveniences for less stuff, less convenience. But what if it meant giving up things we’re well rid of, from deadly emissions to nagging feelings of doom and complicity in destruction? What if the austerity is how we live now — and the abundance could be what is to come?

    This doesn’t seem like much of a what-if—so many people are living under austerity now.

    As I’ve learned more about ecovillages and cooperative communities, one of my biggest realizations is the power of communal sufficiency. When living in a community that can generate 80% of its own power and grow 50% of its own food, abundance becomes the norm.

    “Getting and spending, we lay waste our powers,” William Wordsworth wrote a couple of centuries ago. What would it mean to recover those powers, to be rich in time instead of stuff?

    For so many of us, being busy with work has leached away our capacity to pursue true riches. What if we were to prioritize reclaiming our time — to fret less about getting and spending — and instead “spend” this precious resource on creative pursuits, on adventure and learning, on building stronger societies and being better citizens, on caring for the people (and other species and places) we love, on taking care of ourselves?

    This argument echoes How to Do Nothing and its rethinking of the value of our time and our circumstances.

    I appreciate Solnit putting forward a vision of a post-consumer society that isn’t doom-and-gloom. If degrowth is going to succeed as a politics, it needs to be oriented towards building public wealth.

    4/17/2023 1:34 PM
  • Animal, Vegetable, Capital

    lux-magazine.com

    The premise of making animals active participants in the market actually hits a much deeper meridian line of modernity than mere capitalism. It brings to the fore the entire project of categorizing life, human and otherwise, into binaries of “people” and “property,” a project going back to the Scientific Revolution and the Enlightenment. These two categories (and who/what falls into each one) have shaped capitalism, chattel slavery, settler colonialism, scientific racism, and the premise of nation-states as we live with them today.

    There is certainly precedent for giving nonhuman life the kind of agency typically afforded to people in law. Scholars and historians often cite Christopher Stone’s 1972 paper “Should Trees Have Standing?” as the foundational text for the concept known as “the rights of nature,” which affords nonhuman life legal standing to defend its right to exist (or really, to have a human lawyer defend its right to exist). In 2008, Ecuador ratified a new constitution that included a chapter recognizing the Rights of Nature; courts in Colombia, New Zealand, and Bangladesh have granted rights to national parks, mountains, and rivers.

    Of course, the origins of industrial capitalism lie in humans declaring the parameters of personhood for other humans: Enslaved people weren’t people, they were property. That capacity to claim property (and declare who is or is not property) is one measure of personhood within capitalism.

    I think this piece makes a good introduction to the broad arguments that inform the “rights of nature” movement and the expanding inclusivity of legal personhood. I hope to see more legal action undertaken to protect ecologies and natural systems in the coming years.

    But there’s tension when advocating for expanding legal personhood to nonhuman life at the same moment when the established protections for women and trans people are being eroded. “Environmental justice” is a powerful shared vision of a more equal and equitable future that protects human and nonhuman life.

    Expanding parameters of legal personhood and access to market participation are more harm reduction for living under capitalism than they are building an alternative to it. To be clear, capitalism has a lot of harms and reducing them is good, but these approaches can easily be subsumed into maintaining existing structures.

    Even taken as good-faith harm reduction, something is lost when the pursuit of otherness-in-connection gets flattened into transactional, financialized charity.

    4/17/2023 1:01 PM
  • Neodymium

    popula.com

    My most common day-to-day interactions with magnets are incredibly intimate. The pixels on the anodyne surface of the smartphone in my palm don’t burrow into my consciousness as intensely as do the sounds pumped by small, powerful magnets through speakers and earbuds, or the haptic buzz of “vibrate mode” produced by the tiny, magnetic motor deep inside. Magnets allow smartphones to whisper in your ear, to leap into life with manic energy at a call or a push notification.

    Magnets inside most of today’s consumer electronics are a blend of neodymium, iron, boron, and a tiny bit of dysprosium (sometimes called “NdFeB magnets”).

    It would be easier to tell a story where a phone’s delicate whispers and magnetic whirrs are part of a big complicated history of war and colonialism and resource anxiety and death because that story appears to have answers. The media and advocacy narratives of conflict minerals and environmental destruction of rare earth element mining have primed readers to expect something sinister and uncouth beneath the surface of consumer electronics. “We have powerful magnets in phones today because a lot of people died in a town you’ve never heard of in the Congo in 1978”–sounds about right, you cynically reply. The debut of GPS on the public stage was as an instrument of precision bombing during the first Gulf War; the origins of the Internet are intertwined with Cold War paranoia and planning for nuclear collapse; cell phones are full of other people’s blood.

    I worry that leaning so hard on these familiar narratives also gives them power. It makes that cascade of cruelty seem inevitable, simply How Things Are Done rather than choices that were made. While the invisible hand of the market, the steamrolling inertia of colonial powers, or the march of technological progress can feel about as difficult to thwart or circumvent as the pull of neodymium magnets, I am wary of treating them like unbreakable laws of the universe.

    4/6/2023 12:47 PM

  • Monday, April 3, 2023

    Generating Feeds with Next.js Route Handlers

    Since I’ve started collecting notes and highlights here, I’ve been meaning to return them as formatted feeds, RSS being the main one. Well, I got around to it. It was way easier than I remembered, and I even got bonus Atom and JSON feeds out of it.

    I’m using Next 13.2 and its new App Directory to generate the site, so this made feeds delightfully simple to implement. In fact, it may be the best experience I’ve ever had for developing content feeds like these. I want to share my walkthrough and results since this is a pretty common task when setting up a new project with Next, and all the existing examples were based on Next’s older Pages Router.

    How to Generate RSS, Atom, and JSON Feeds with Markdown content using Next.js App Directory Route Handlers

    I started from the point of already having data-fetching functions for getting all my notes from my CMS (the aptly named getAllNotes and getNoteTitle).

    The new feed-generating function simply has to set the feed’s top-level properties, then iterate over the notes to add them as entries. I author and store all my notes as Markdown, so for each note I render its body into HTML. Each feed format then gets its own Route Handler, which calls the generator function for the formatted feed. Finally, I update the top-level metadata to include links to the newly added feeds.

    Create a Site URL

    I quickly realized I needed a little utility function to get the canonical site URL. Since I build and host using Vercel, I want to make sure my site URL corresponds with its preview deploy URL. I used a combination of environment variables to figure that out: a dedicated SITE_URL variable alongside Vercel’s system environment variables, which identify the build’s context and its dedicated URL.

    src/utils/getSiteUrl.ts
    export default function getSiteUrl() {
      let protocol = "https";
      let domain = process.env.SITE_URL;
      switch (process.env.VERCEL_ENV) {
        case "preview":
          domain = process.env.VERCEL_URL;
          break;
        case "development":
        case undefined:
          protocol = "http";
          break;
      }
      return `${protocol}://${domain}`;
    }

    Render Markdown to HTML

    To render Markdown into HTML, I used the unified library with the plugins:

    1. remark-parse to parse the Markdown string into an AST
    2. remark-rehype to convert the Markdown AST into an HTML AST
    3. rehype-sanitize to ensure the HTML is safe to render
    4. rehype-stringify to turn the HTML AST back into a string

    This string was then passed as the content value for each feed item.

    src/utils/markdownToHtml.ts
    import { unified } from "unified";
    import remarkParse from "remark-parse";
    import remarkRehype from "remark-rehype";
    import rehypeSanitize from "rehype-sanitize";
    import rehypeStringify from "rehype-stringify";
    
    export default async function markdownToHtml(input: string) {
      const file = await unified()
        .use(remarkParse)
        .use(remarkRehype)
        .use(rehypeSanitize)
        .use(rehypeStringify)
        .process(input);
    
      return file;
    }

    Create the Feed

    With other site generation frameworks I’ve used, generating feeds has meant writing a template XML file and filling in dynamic values with curly-braced variables, usually with that format’s spec open alongside. This time, I was able to use the feed package for all the XML authoring. As a result, generating multiple feed formats became a matter of making a function call.

    The generateFeed function is based on an example provided by Ashlee M Boyer. It creates a feed with proper metadata, then generates each post. Since the Markdown generation runs asynchronously, adding entries needs to happen inside a Promise.all call. This way, generateFeed waits to return the feed object until all content has finished generating.

    src/utils/generateFeed.ts
    import { Feed } from "feed";
    import smartquotes from "smartquotes";
    import getAllNotes from "src/data/getAllNotes";
    import getNoteTitle from "src/data/getNoteTitle";
    import markdownToHtml from "./markdownToHtml";
    import getSiteUrl from "./getSiteUrl";
    
    export default async function generateFeed() {
      const notes = await getAllNotes();
      const siteURL = getSiteUrl();
      const date = new Date();
      const author = {
        name: "Allan Lasser",
        email: "allan@lasser.design",
        link: "https://allanlasser.com/",
      };
      const feed = new Feed({
        title: "Allan Lasser",
        description: "Thoughts, reading notes, and highlights",
        id: siteURL,
        link: siteURL,
        image: `${siteURL}/logo.svg`,
        favicon: `${siteURL}/favicon.png`,
        copyright: `All rights reserved ${date.getFullYear()}, Allan Lasser`,
        updated: date,
        generator: "Feed for Node.js",
        feedLinks: {
          rss2: `${siteURL}/feeds/rss.xml`,
          json: `${siteURL}/feeds/feed.json`,
          atom: `${siteURL}/feeds/atom.xml`,
        },
        author,
      });
      await Promise.all(
        notes.map(async (note) => {
          const id = `${siteURL}/notes/${note._id}`;
          const url = note.source?.url ? note.source.url : id;
          const content = String(await markdownToHtml(smartquotes(note.body)));
          feed.addItem({
            title: smartquotes(getNoteTitle(note)),
            id,
            link: url,
            content,
            date: new Date(note._createdAt),
          });
        })
      );
      return feed;
    }

    Create the Feed Endpoints

    Now here comes the fun part. Creating feed endpoints becomes so simple it’s silly. Using Route Handlers introduced in Next.js 13.2, adding a new endpoint is as simple as creating a folder in the App Directory with the name of the feed file, then creating a route.ts file inside it.

    So, to add the RSS feed, I create the folder src/app/feeds/rss.xml and then create route.ts inside it.

    src/app/feeds/rss.xml/route.ts
    import generateFeed from "src/utils/generateFeed";
    
    export async function GET() {
      const feed = await generateFeed();
      return new Response(feed.rss2(), {
        headers: { "Content-Type": "application/rss+xml" },
      });
    }

    To create the Atom and JSON feeds, I follow the same process, ensuring that the appropriate method and content type are used in each format’s route handler.

    src/app/feeds/atom.xml/route.ts
    import generateFeed from "src/utils/generateFeed";
    
    export async function GET() {
      const feed = await generateFeed();
      return new Response(feed.atom1(), {
        headers: { "Content-Type": "application/atom+xml" },
      });
    }
    src/app/feeds/feed.json/route.ts
    import generateFeed from "src/utils/generateFeed";
    
    export async function GET() {
      const feed = await generateFeed();
      return new Response(feed.json1(), {
        headers: { "Content-Type": "application/json" },
      });
    }
    

    Adding alternates to site metadata

    The last step is updating the site’s <head> to reference these feeds to make them more discoverable to readers. This is made even easier using the App Directory’s Metadata API—also new to Next.js 13.2. In the top-most page or layout file in my app directory, I add an alternates property to the exported metadata object:

    src/app/layout.tsx
    import { Metadata } from "next";
    import getSiteUrl from "src/utils/getSiteUrl";
    
    export const metadata: Metadata = {
      title: "Allan Lasser",
      viewport: { width: "device-width", initialScale: 1 },
      icons: [{ type: "image/x-icon", url: "/static/favicon.ico" }],
      alternates: {
        canonical: getSiteUrl(),
        types: {
          "application/rss+xml": `${getSiteUrl()}/feeds/rss.xml`,
          "application/atom+xml": `${getSiteUrl()}/feeds/atom.xml`,
          "application/json": `${getSiteUrl()}/feeds/feed.json`,
        },
      }
    }

    That’s it!

    Now after running next dev, I can see I have feed files generated at /feeds/rss.xml, /feeds/atom.xml, and /feeds/feed.json. I’ve gotten feeds in three different formats with only a few libraries and simple, easily testable functions.
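
    (To double-check the endpoints programmatically, a tiny script along these lines would do it. This is just a hypothetical smoke test: the file name is made up, and it assumes the dev server is running on localhost:3000 and a Node version with a global fetch.)

    scripts/check-feeds.mts (hypothetical)
    const paths = ["/feeds/rss.xml", "/feeds/atom.xml", "/feeds/feed.json"];
    
    for (const path of paths) {
      const res = await fetch(`http://localhost:3000${path}`);
      // Expect a 200 status and the Content-Type set in each route handler
      console.log(path, res.status, res.headers.get("content-type"));
    }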

    After deploying to production, you can now follow my new notes via:

    • RSS,
    • Atom, and
    • JSON Feed

    The flourishing, decentralized Web

    The level of productivity I feel when using Next.js, Vercel, and GitHub together is really hard to beat. It feels like the tools are getting out of my way and letting me develop smaller PRs faster.

    I’m still a daily RSS user. It’s my preferred way to read on the web. I’m glad to see that there’s still robust library support for RSS and feed generation, at least within the Node ecosystem. I don’t think RSS is going anywhere, especially since it powers the entire podcasting ecosystem. It’s great to see the longevity of these open standards.

    Speaking of open standards, integrating an ActivityPub server into a Next.js application is something I’m interested in exploring next. It’d be very cool to have a site generated out of an aggregation of one’s own ActivityPub feeds, for example combining posts from personal micro.blog, Mastodon, and Pixelfed accounts into a single syndicated feed.

    Seeing all of the recent progress in decentralizing important services has felt so cool. We can still keep the Web wild and weird, empower individuals with more tools for expressing themselves online, and have it all be user-friendly. Content feeds are an important force for good here, so I’m very glad that it’s so easy these days for even a novice developer to publish them.

  • Theirs and No One Else’s

    lrb.co.uk

    The origin of conductors’ music is usually attributed to Beethoven. In her interview, Tár rightly cites the opening of Beethoven’s Fifth Symphony (1808) as a locus classicus in the history of modern conducting. The rhythm and rhetorical emphasis of its famous motif is not impossible for an orchestra to play without a conductor, but it’s far more effective with one. And then, as Wagner points out, there’s the question of the fermatas (pauses) – someone has to decide what Beethoven wants.

    Like many of his contemporaries, Wagner thought of music history as teleological. Haydn and Mozart were innocent geniuses; it was the music of Beethoven, and, above all, Beethoven’s Ninth Symphony, that blasted open a path to the future where Wagner himself stood. Composed between 1822 and 1824 and first performed in Vienna in 1824, Beethoven’s Ninth shattered existing paradigms of symphonic form, challenging notions of what the nature of music might be. From the outset it was seen as a limit case, and it took decades for European musical culture to digest it. The response of composers was either to regroup and retrench (Mendelssohn, Schumann) or to attempt to strike out into the uncharted territory the symphony gestured towards (Berlioz, Liszt, Wagner). For orchestral players, it forced a fundamental revision of technique and performance practice.

    3/30/2023 2:22 PM
  • Putting the Silicon in Silicon Valley

    lrb.co.uk

    Before we get to the geopolitics, can we have a moment to inhabit the technological sublime? Microchips are some of the most extraordinary objects humanity has ever made. Miller has a good illustration of this: the coronavirus is tiny, about a hundred billionths of a metre across, but it is a galumphing heifer of a beast compared to the smallest transistors being made in Fab 18, which are half that size. TSMC is now talking about transistor nodes in terms of three billionths of a metre. This is so small that quantum effects, which happen mostly at the subatomic level, become relevant.

    While a critical analysis of their materiality and politics is more interesting, I enjoy how Lanchester steps back to admire the achievement. It’s fun to think that advanced chips are a technology that operates across mindboggling scales, requiring global supply chains, decades of investment, inventive ingenuity, and nearly atomic manufacturing. And all this to produce something that billions of people carry with them every day, powered by angels dancing on the head of a pin.

    3/30/2023 1:54 PM
  • Life in a Mud Box

    buildingshed.substack.com

    This newsletter is called Buildingshed (like watershed, foodshed, or fibershed) because I’m curious about where materials come from, how interrelated systems of energy, labor, and transport turn agricultural products and mined resources into built things, and how places depend on each other. A city skyline or a house like mine looks particular and sits in one spot, but represents a history of extraction and exchange with other landscapes, whether quarries, landfills, or timberlands.

    Earlier today I learned the terms “walkshed”—the walkable distance surrounding a transit stop or other point—and “bikeshed” (same thing, but for biking).

    I’ve already been learning about watersheds as part of my permaculture design course, and I love this way of looking at the landscape. Everything has a shed!

    2/17/2023 3:01 AM
  • Building Steam in Lithium Valley

    prospect.org

    A giant cloud had formed a wall, enveloping the horizon in darkness. They call it a haboob, a collapsed thunderstorm that stirs up silt and clay after plummeting to Earth. The dust tornado sat between me and my hotel, and once I started down the country road, which ran along a cow feedlot that stretched for (I counted) three miles, visibility dropped to maybe a couple of feet.

    Embedded within that dust is more than a century of policy mismanagement, environmental disaster, and regional despair. The recent fortunes of Imperial County, along the U.S.-Mexico border, have risen and fallen with water levels at the Salton Sea, California’s largest inland lake.

    ​Dust from that dry lake bed, polluted with agricultural chemicals, blows into nearby towns. Pediatric asthma hospitalizations in the region are as much as twice the state average, a crisis for the disproportionately poor residents.

    “It will get worse before it becomes better,” Frank Ruiz, Salton Sea Program Director for the National Audubon Society, told me.

    Perhaps best of all, DLE creates a virtuous circle. Geothermal plants have enormous up-front costs compared to solar and wind. But adding lithium extraction makes the payoff much more profitable, and subsequently enables more plants to get built, increasing clean baseload power.

    “The most favorable reclamation Scenario for adopting these specific technologies will be [one] in which the Sea is allowed to shrink.” In other words, maximizing industry operations is synonymous with perpetuating the continued public-health hazard of exposed lake beds.

    2/14/2023 7:09 PM

  • Saturday, February 11, 2023

    Attention Gardening

    In summing up the unlikely, 30-year story of how Yellowstone’s algae inspired the invention of PCR (the biochemical technique used for COVID testing), Clive Thompson writes:

    I think I’m so smitten by this story — with its mix of deep curiosity into seemingly pointless subjects, followed by the discovery that this “pointless” material is wildly useful in a new domain — because it dovetails with my interest in “rewilding” one’s attention.

    I’ve written a bunch about “rewilding” (essays here), which is basically the art of reclaiming one’s attention from all the forces that are trying to get you to obsess over the same stuff that millions of other people are obsessing over. Mass media tries to corral your attention this way; so do the sorting-for-popularity algorithms of social media.

    Now, sometimes that’s good! It’s obviously valuable, and socially and politically responsible, to know what’s going on in the world. But our media and technological environment encourages endless perseveration on The Hot Topic of Today, in a way that can be kind of deadening intellectually and spiritually. It is, as I’ve written, a bit like “monocropping” your attention. And so I’ve been arguing that it’s good to gently fight this monocropping — by actively hunting around and foraging for stuff to look at, read, and see that’s far afield, quirkier, and more niche.

    This “rewilding” is the same sort of shift in attention away from commercial platforms that Jenny Odell argues for in her book. She uses the exact same analogy to “monocropping”:

    It’s important for me to link my critique of the attention economy to the promise of bioregional awareness because I believe that capitalism, colonialist thinking, loneliness, and an abusive stance toward the environment all coproduce one another. It’s also important because of the parallels between what the economy does to an ecological system and what the attention economy does to our attention. In both cases, there’s a tendency toward an aggressive monoculture, where those components that are seen as “not useful” and which cannot be appropriated (by loggers or by Facebook) are the first to go.

    pg. xviii

    A monoculture is an illuminating frame for considering attention. Created in an attempt to achieve economies of scale, monocultures reduce biodiversity and exhaust their soil. To make up for this, they’re covered in heavy amounts of fertilizer and pesticide to maintain their productivity. The analogs to commercial social media are clear. Whether they’re lying about their metrics, unfairly compensating their creators, or simply moderating your timelines without explanation or accountability, commercial social media companies create toxic social conditions in order to establish themselves as places for huge numbers of people to sink their attention. Once they have it, they turn the screws to maximize value for their owners despite the damage it does to their ecosystems.

    In resisting this monoculture, I think Thompson misses a helpful middle ground between a monocrop and a wilderness. In between lies a garden: small-scale, intentional, low-impact cultivation of attention. A great garden takes time to establish, but once it does, it can sustain itself, supported by its rich diversity and interdependency.

    When I think about the ways I focus my attention, I’ve already established a few gardens. My library of books. My collection of RSS feeds. My relationships. My actual garden! All of these contribute to a diverse, interconnected space of shared ideas that help me understand and appreciate the world in new ways.

  • Metafoundry 75: Resilience, Abundance, Decentralization

    tinyletter.com

    What’s changed in the last few decades is the development of technologies that can effectively harness the diffuse, but decentralized and inexhaustible, energy in our environment.

    Renewable energy sources are a step up, not a step down; instead of scarce, expensive, and polluting, they have the potential to be abundant, cheap, and globally distributed. Transitioning all of our infrastructural systems to be powered by renewable sources is about growing out the number of people who have access to more energy, who benefit from using it to meet human needs, whether as basic as cooking food or as modern as global telecommunications.

    We live on a sun-drenched blue marble hanging in space, and for all that we persist in believing it’s the other way around, that means we have access to finite resources of matter but unlimited energy. We can learn to act accordingly.

    We are living at the cusp of remaking ourselves from a primitive species that gets most of our energy from literally setting stuff on fire, and that just junks stuff when we’re done with it, into a species that fits harmoniously into a planetwide ecosystem, that uses energy from the sun, harnesses it for use and to fabricate what we need to thrive, and then returns those materials to the common pool to be used and shared again.

    2/11/2023 3:00 PM
  • Tiktok’s enshittification

    pluralistic.net

    Here is how platforms die: first, they are good to their users; then they abuse their users to make things better for their business customers; finally, they abuse those business customers to claw back all the value for themselves. Then, they die.

    I call this enshittification, and it is a seemingly inevitable consequence arising from the combination of the ease of changing how a platform allocates value, combined with the nature of a “two sided market,” where a platform sits between buyers and sellers, holding each hostage to the other, raking off an ever-larger share of the value that passes between them.

    Once you understand the enshittification pattern, a lot of the platform mysteries solve themselves. Think of the SEO market, or the whole energetic world of online creators who spend endless hours engaged in useless platform Kremlinology, hoping to locate the algorithmic tripwires, which, if crossed, doom the creative works they pour their money, time and energy into:

    https://pluralistic.net/2022/04/11/coercion-v-cooperation/#the-machine-is-listening

    2/11/2023 2:56 PM
  • The Shit Show

    furbo.org

    In Craig Hockenberry’s post marking the abrupt interruption of third-party apps’ ability to access the Twitter API, he speaks to what he sees coming next:

    One thing I’ve noticed is that everyone is going to great lengths to make something that replaces the clients we’ve known for years. That’s an excellent goal that eases a transition in the short-term, but ignores how a new open standard (ActivityPub) can be leveraged in new and different ways.

    Federation exposes a lot of different data sources that you’d want to follow. Not all of these sources will be Mastodon instances: you may want to stay up-to-date with someone’s Micro.blog, or maybe another person’s Tumblr, or someone else’s photo feed. There are many apps and servers for you to choose from.

    Twitter is clamping down on their API. Federation is just starting up. Combine the idea of an ActivityPub client with Robin Sloan’s desire for a new format beyond the timeline, and a whole new design space opens.

    1/18/2023 10:44 PM
  • Critical Ecology with Suzanne Pierre

    alieward.com

    I loved this interview with Dr. Suzanne Pierre, founder of the Critical Ecology Lab, on Alie Ward’s Ologies podcast. The way Dr. Pierre frames her study of ecology within critical theory opens space for questioning how power operates within a larger, living landscape. For example, when she asks how the ecologies of the South were impacted by the forced human labor used to establish cotton monocultures, the answers require reckoning with the scale of America’s slavery system and add a new dimension to our understanding of its lasting impacts. This way of thinking is inspiring as I dig deeper into ecological systems and begin forming my own questions about them.

    12/16/2022 1:43 PM
  • The Smartest Fashion Podcast Explores Ivy Style—And Asks if Prep Is Back

    putthison.com

    The new season of Articles of Interest tells the history of Ivy and prep fashion, a style that’s core to American life. The series carries the same qualities that made 99% Invisible, where it began, such a great show: Avery Trufelman puts names and stories to the designers and design decisions behind the utterly ordinary phenomenon of how we dress ourselves.

    I keep coming back to this metaphor: in the way that you have to look at whiteness to look at race, or look at masculinity to look at gender, you have to look at preppy style to understand all of the countercultural fashion movements of the 20th century.

    Clothing is about semiotics, and when you put together an outfit, you’re crafting a sentence.

    Then in the 1980s, people who thought they could never have access to brokers suddenly had access to passive income. With that came the feeling that you also have the right to the Old Money look. They were like, “I, too, am a monied person; I just need the costume to back it up.” I’m so fascinated with how these social changes affect our fashion choices. If preppy is back—and that’s a big if—I wonder if it has to do with this new money free-for-all we’re seeing. People are becoming millionaires overnight by investing in Gamestop or being on OnlyFans, and I wonder if the accruement of wealth is forever married to this style.

    12/12/2022 11:58 PM
  • What’s Going on Inside the Fearsome Thunderstorms of Córdoba Province?

    nytimes.com

    This was an excellent story about the research happening on storms in Argentina. As the climate gets warmer, there’s more moisture and energy in the atmosphere, which fuels more powerful storms. This is mostly reported in the context of hurricanes, but it’s equally true for the tornado-producing storms that roll across the Midwest in the spring and summer.

    In the United States, which is home to the most extensive weather forecasting infrastructure in the world, around a third of severe weather predictions still prove wrong — not only about timing and location but also size, duration and intensity. The false-alarm rate for tornadoes continues to hover at about 70 percent, while the average warning time has only increased from about 10 minutes in the mid-1990s to 15 minutes today.

    If one of these budding cells manages to punch through the tropopause, as the boundary between the troposphere and stratosphere is called, the storm mushrooms, feeding on the energy-rich air of the upper atmosphere. As it continues to grow, inhaling up more moisture and breathing it back down as rain and hail, this vast vertical lung can sprout into a self-sustaining system that takes on many different forms.

    Composed of millions of micro air currents, electrical pulses and unfathomably complex networks of ice crystals, every storm is a singular creature, growing and behaving differently based on its geography and climate.

    Until the launch of global weather satellites in the 1990s, this level of sampling and detection wasn’t widely available outside North America. When NASA deployed its Tropical Rainfall Measuring Mission in 1997, the satellite offered the first comprehensive look at the entire world’s weather. And part of what it revealed was an enormous regional variability in the size and intensity of storms.

    with ever more heat, moisture and unstable air available to feed on, storms in many parts of the world have begun to exhibit increasingly erratic behavior.

    researchers at M.I.T. and Princeton now consider a Category Six hurricane a realistic possibility

    !!!

    In 2019, a study conducted by Stockholm University found that one of the only uniform impacts of climate change was on forecasting, which has become more difficult.

    Founded in the 1990s, by the meteorologist Joshua Wurman, C.S.W.R. is a seminomadic 11-person research institution that over the years has earned a reputation for pushing boundaries in chasing technology. In the mid-90s, Wurman built the first truck-mounted doppler radar system, nicknamed the “doppler on wheels,” or DOW. By 1999, a DOW had recorded the fastest wind speed in history within a tornado, in Moore, Okla., at 301 m.p.h. Since then, perhaps no other organization has ventured as far into the world’s deadliest tempests as C.S.W.R., whose fleet of four trucks has now transmitted data from inside 15 hurricanes and about 250 tornadoes.

    CSWR is the Center for Severe Weather Research.

    They tell us that the sky, like our drying forests, is rapidly becoming an ocean of fuel, but they don’t tell us where and when it might ignite — much less what, exactly, might spark it.

    11/28/2022 4:33 PM

  • Sunday, November 20, 2022

    Thinking through a system for reading

    I want my website/homepage to give me ways to keep track of bookmarks, notes, and highlights in the things that I’m reading. I’m typically creating these on my phone or tablet, and the kind of data varies with the kind of reading:

    • when browsing the web, I’m saving tagged bookmarks into Pinboard. This acts like my private search engine, making it easy to recall things I’ve seen in the past and wanted to remember.
    • when reading on my phone or tablet, it’s usually RSS or Instapaper; in that case, I’m typically highlighting passages and marking posts as favorites. I want to capture those highlights and favs as content in Sanity.
    • I don’t typically read books on a computer, but I still want to capture highlights and track favs as content in Sanity.

    Seems like we have a system starting to come together:

    1. bookmarks can continue living in Pinboard, and I can provide a way to browse and search them on my personal site (sketched just below).
    2. the workflow triggers are the same across all my reading: I want to save favorite articles or books into Sanity, and capture highlights or reading notes on those entities.
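
    Since the Pinboard half only needs read access, here’s a minimal sketch of pulling bookmarks for browsing and search on the site. It uses Pinboard’s documented v1 API; the PINBOARD_TOKEN env var and the response type are my own placeholders:

    ```ts
    // Rough sketch, not a final implementation: fetch every bookmark from
    // Pinboard so the site can render and search them. PINBOARD_TOKEN is
    // assumed to be an env var in the form "username:HEXTOKEN".
    type PinboardPost = {
      href: string;
      description: string; // the bookmark's title
      extended: string;    // any notes saved with it
      tags: string;        // space-separated tag list
      time: string;        // ISO 8601 timestamp
    };

    export async function fetchBookmarks(): Promise<PinboardPost[]> {
      const token = process.env.PINBOARD_TOKEN;
      const res = await fetch(
        `https://api.pinboard.in/v1/posts/all?auth_token=${token}&format=json`
      );
      if (!res.ok) throw new Error(`Pinboard request failed: ${res.status}`);
      return res.json();
    }
    ```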

    Are a highlight and a note the same? I think so, because they’re both text. That text could be anchored to a page or other location, but it’s all just text content at the end of the day. I think I would want to create multiple entries, with one for each note/highlight. This also allows for a note to be combined with a highlight if a passage triggers a thought.
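
    To make that concrete, here’s a sketch of what a note could look like as a single Sanity document type, with the highlight text, an optional comment, and an optional location anchor living side by side. All names here are my own placeholders, not anything Sanity prescribes:

    ```ts
    import {defineField, defineType} from 'sanity'

    // Hypothetical "note" document: one entry per highlight or thought.
    // A passage that triggers a thought just fills in both text fields.
    export const note = defineType({
      name: 'note',
      title: 'Note',
      type: 'document',
      fields: [
        defineField({name: 'text', title: 'Highlighted or written text', type: 'text'}),
        defineField({name: 'comment', title: 'My comment on the highlight', type: 'text'}),
        defineField({name: 'page', title: 'Page or other location anchor', type: 'string'}),
      ],
    })
    ```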

    This is exactly the same kind of Markdown I’d be capturing in Drafts. I could easily turn this into a JSON payload and send it off.
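
    As a sketch of that payload, the capture script could wrap the Markdown in a create mutation and POST it to Sanity’s HTTP data API. PROJECT_ID, DATASET, and the token are placeholders, and the note shape assumes the hypothetical schema above:

    ```ts
    // Rough sketch: send captured Markdown to Sanity's mutate endpoint.
    // SANITY_TOKEN is assumed to be a write-enabled API token.
    async function captureNote(markdown: string, page?: string) {
      const url = 'https://PROJECT_ID.api.sanity.io/v2021-06-07/data/mutate/DATASET';
      const res = await fetch(url, {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
          Authorization: `Bearer ${process.env.SANITY_TOKEN}`,
        },
        body: JSON.stringify({
          mutations: [
            // A "drafts." id keeps the note unpublished until I attach a source.
            {create: {_id: `drafts.note-${Date.now()}`, _type: 'note', text: markdown, page}},
          ],
        }),
      });
      if (!res.ok) throw new Error(`Sanity mutation failed: ${res.status}`);
      return res.json();
    }
    ```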

    Since I’m starting from a place where I’m capturing notes, it makes sense that notes would have an association with a source. The source could be a web article, a book, a movie, anything really. I could put a type field on sources to distinguish between mediums, if necessary. I could also pretty easily generate a citation for each note if I know the page and the source!
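
    Sketching that association with more placeholder names: a source becomes its own document type carrying the medium field, each note holds a reference to one, and a small helper can format the citation:

    ```ts
    import {defineField, defineType} from 'sanity'

    // Hypothetical "source" document shared by many notes.
    export const source = defineType({
      name: 'source',
      title: 'Source',
      type: 'document',
      fields: [
        defineField({name: 'title', title: 'Title', type: 'string'}),
        defineField({name: 'author', title: 'Author', type: 'string'}),
        defineField({
          name: 'medium', // the "type field" distinguishing mediums
          title: 'Medium',
          type: 'string',
          options: {list: ['article', 'book', 'movie']},
        }),
        defineField({name: 'url', title: 'URL', type: 'url'}),
      ],
    })

    // The association lives on the note as a reference field:
    export const noteSource = defineField({
      name: 'source',
      title: 'Source',
      type: 'reference',
      to: [{type: 'source'}],
    })

    // Citation helper: cite('Author Name', 'Book Title', 'xviii')
    // returns "Author Name, Book Title, p. xviii".
    export function cite(author: string, title: string, page?: string): string {
      return [author, title, page && `p. ${page}`].filter(Boolean).join(', ')
    }
    ```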

    If I’m capturing a note on a web article, I’m probably going to need to create the source at the same time that I create the note. If I’m reading a book, there’s a good chance that I’ll need to find an existing source for the note.

    Either way, at the time I’m creating a note there are two paths to associating it with a source: create the source alongside the note, or find one that already exists.

    This workflow suggests that I’ll want to:

    1. Capture the highlight and create a draft entry in Sanity
    2. Get the URL for the newly created entry, then open its page in the browser (sketched below)
    3. Associate the newly created note with a source from that page, then publish it.
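
    Steps 1 and 2 might look like this with Sanity’s official @sanity/client package. The edit link at the end uses the Studio’s intent-URL pattern, though the exact host and path depend on the Studio deployment; every name here is a placeholder:

    ```ts
    import {createClient} from '@sanity/client'

    const client = createClient({
      projectId: 'PROJECT_ID', // placeholder
      dataset: 'production',
      apiVersion: '2023-01-01',
      token: process.env.SANITY_TOKEN, // write-enabled token assumed
      useCdn: false,
    })

    // Step 1: create the note as a draft ("drafts." ids stay unpublished).
    // Step 2: return a Studio URL where I can attach a source and publish.
    export async function captureAndOpen(text: string): Promise<string> {
      const doc = await client.create({
        _id: `drafts.note-${crypto.randomUUID()}`,
        _type: 'note',
        text,
      })
      const id = doc._id.replace(/^drafts\./, '')
      return `https://studio.example.com/intent/edit/id=${id};type=note`
    }
    ```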

    I also want to be able to create a source from a URL or from an ISBN. This is a convenience feature and can come later!
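
    When that lands, the ISBN half could lean on Open Library’s public lookup endpoint. The endpoint is real; mapping the response onto my hypothetical source type is a guess:

    ```ts
    // Sketch: prefill a book source from an ISBN using Open Library.
    export async function sourceFromISBN(isbn: string) {
      const res = await fetch(`https://openlibrary.org/isbn/${isbn}.json`);
      if (!res.ok) throw new Error(`ISBN lookup failed: ${res.status}`);
      const book = await res.json();
      return {
        _type: 'source',
        medium: 'book',
        title: book.title as string,
        // Open Library returns authors as {key: "/authors/..."} references;
        // resolving names would take one more request per author key.
      };
    }
    ```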