
til

how to trigger a gh-action only if the issue is created by the repo owner

what i learned

you can add an if key to a job to run it conditionally. you also have a lot of metadata available in github actions about the event that triggered the workflow and the repo it runs on.

put together you can add a condition like:

.github/workflows/issue-to-md.yml
...
jobs:
  job_name:
    runs-on: ubuntu-latest
    if: ${{ github.event.issue.user.login == github.repository_owner }}
...
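the same github context supports richer conditions; for instance, you could additionally require the issue to carry a specific label (til here is just an example label name):

```yaml
if: ${{ github.event.issue.user.login == github.repository_owner && contains(github.event.issue.labels.*.name, 'til') }}
```

the `labels.*.name` object filter collects every label's name into an array that contains() can search.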

using typer and uv to run a script with inline dependencies

what i learned

because uv supports running scripts with dependencies declared in inline metadata, and typer can turn any function into a cli, you can put the two together and build some really powerful small utilities. all you need is a script that declares typer in its inline metadata, defines a function, and wraps it in typer.run().

after some iterations, this is the final script (so far):

issue-to-md.py
# /// script
# dependencies = [
#   "typer",
#   "rich",
#   "pyyaml",
# ]
# ///

import json
import re
from datetime import datetime
from pathlib import Path
from zoneinfo import ZoneInfo

import typer
import yaml
from rich import print
from typing import Annotated


def generate_post_from_issue(
    issue_title: Annotated[str, typer.Option("--title", "-t")],
    issue_body: Annotated[str, typer.Option("--body", "-b")],
    issue_labels: Annotated[str, typer.Option("--labels", "-l")],
    issue_created_at: Annotated[str, typer.Option("--created-at", "-c")],
    base_dir: Annotated[str, typer.Option("--base-dir", "-d")] = "blog/posts",
):
    # Convert labels to a list of tags
    tags = [label["name"] for label in json.loads(issue_labels)]

    # Convert ISSUE_CREATED_AT (UTC) to PST and format as YYYY-MM-DD;
    # strptime returns a naive datetime, so tag it as UTC before converting
    utc_time = datetime.strptime(issue_created_at, "%Y-%m-%dT%H:%M:%SZ").replace(
        tzinfo=ZoneInfo("UTC")
    )
    pst_time = utc_time.astimezone(ZoneInfo("America/Los_Angeles"))
    created_at_pst = pst_time.date()

    # Extract the category from the part of the title before the first colon, default to "project" if none
    category = (
        issue_title.split(":")[0].strip().lower() if ":" in issue_title else "project"
    )

    # Extract the title content after the first colon
    title = (
        issue_title.split(":", 1)[1].strip()
        if ":" in issue_title
        else issue_title.strip()
    )

    # Determine directory based on category
    dir_path = Path(base_dir) / ("til" if category == "til" else "")
    dir_path.mkdir(parents=True, exist_ok=True)

    # Generate a slugified version of the title for the filename
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")

    # Create the front matter dictionary
    front_matter = {
        "title": title,
        "date": created_at_pst,
        "categories": [category],
        "tags": tags,
    }

    # Prepare YAML front matter and issue body
    yaml_front_matter = yaml.dump(front_matter, default_flow_style=False)
    content = f"---\n{yaml_front_matter}---\n\n{issue_body}"

    # Define filename
    filename = dir_path / f"{slug}.md"

    # Write content to file
    filename.write_text(content, encoding="utf-8")

    print(f"Markdown file created: {filename}")


if __name__ == "__main__":
    typer.run(generate_post_from_issue)

feels like a micro-package.
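one subtle part of the script is the timezone handling: strptime returns a naive datetime, so it has to be tagged as UTC before converting to pacific time. a standalone sketch of that step (the function name is mine):

```python
from datetime import datetime
from zoneinfo import ZoneInfo


def issue_date_in_pst(created_at: str) -> str:
    # GitHub timestamps are UTC with a trailing "Z"; tag the parsed
    # (naive) datetime as UTC explicitly before converting
    utc_time = datetime.strptime(created_at, "%Y-%m-%dT%H:%M:%SZ").replace(
        tzinfo=ZoneInfo("UTC")
    )
    return str(utc_time.astimezone(ZoneInfo("America/Los_Angeles")).date())
```

an issue created at 02:00 UTC on may 1st lands on april 30th pacific time, which is exactly the off-by-one-day case the explicit UTC tag guards against.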

creating til posts from github issues using github actions

what i learned

you can automate creating a new markdown file in a directory of your repo, with front matter metadata, from github issues. you can then create a pull request to merge those changes into your main branch. my plan is to use this to capture more ideas on the go (on my phone).

.github/workflows/issue-to-md.yml
name: Create Post from Issue

permissions:
  contents: write
  pull-requests: write

on:
  issues:
    types: [opened]

jobs:
  create-post:
    runs-on: ubuntu-latest

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Generate Post from Issue
        env:
          ISSUE_NUMBER: ${{ github.event.issue.number }}
          ISSUE_TITLE: ${{ github.event.issue.title }}
          ISSUE_BODY: ${{ github.event.issue.body }}
          ISSUE_LABELS: ${{ toJson(github.event.issue.labels) }}
          ISSUE_CREATED_AT: ${{ github.event.issue.created_at }}
        run: |
          # Convert labels to a list of tags
          TAGS=$(echo "$ISSUE_LABELS" | jq -r '.[] | .name' | paste -sd, -)

          # Convert ISSUE_CREATED_AT to PST and format as YYYY-MM-DD
          CREATED_AT_PST=$(TZ="America/Los_Angeles" date -d "${ISSUE_CREATED_AT}" +"%Y-%m-%d")

          # Extract the category from the part of the title before the first colon, default to "project" if none
          if [[ "$ISSUE_TITLE" == *:* ]]; then
            CATEGORY=$(echo "$ISSUE_TITLE" | awk -F: '{print $1}' | tr -d '[:space:]' | tr '[:upper:]' '[:lower:]')
          else
            CATEGORY="project"
          fi

          # Extract the title content after the first colon
          TITLE=$(echo "$ISSUE_TITLE" | sed 's/^[^:]*: *//')

          # Determine directory based on category
          if [ "$CATEGORY" = "til" ]; then
            DIR="blog/posts/til"
          else
            DIR="blog/posts"
          fi
          echo $DIR >> $GITHUB_STEP_SUMMARY
          echo $CATEGORY >> $GITHUB_STEP_SUMMARY

          # Generate a slugified version of the title for the filename
          SLUG=$(echo "$TITLE" | tr '[:upper:]' '[:lower:]' | tr -cs '[:alnum:]' '-' | sed 's/^-//;s/-$//')

          echo $SLUG >> $GITHUB_STEP_SUMMARY

          # Create the front matter with category, tags, and formatted date
          FRONT_MATTER="---\ntitle: \"$TITLE\"\ndate: ${CREATED_AT_PST}\ncategories: [${CATEGORY}]\ntags: [${TAGS}]\n---"

          # Prepare content for markdown file
          CONTENT="$FRONT_MATTER\n\n$ISSUE_BODY"

          # Save the content to a markdown file
          FILENAME="${DIR}/${SLUG}.md"
          echo $FILENAME >> $GITHUB_STEP_SUMMARY
          echo -e "$CONTENT" > "$FILENAME"

      - name: Commit and push changes
        env: 
          ISSUE_TITLE: ${{ github.event.issue.title }}
          ISSUE_NUMBER: ${{ github.event.issue.number }}
          GH_TOKEN: ${{ github.token }}
        run: |
          git config --local user.name "github-actions[bot]"
          git config --local user.email "github-actions[bot]@users.noreply.github.com"
          git checkout -b add-post-$ISSUE_NUMBER
          git add .
          git commit -m "Add post for Issue: $ISSUE_TITLE"
          git push -u origin add-post-$ISSUE_NUMBER
          gh pr create --title "#$ISSUE_NUMBER - $ISSUE_TITLE" --body "Adding new post. Closes #$ISSUE_NUMBER"

running sudo commands without password on VPS

what i learned

you can configure your VPS / server to be able to run sudo commands without being asked for your password. you just need to create a sudoers file.

  • first you have to create a sudoers file

    sudo visudo -f /etc/sudoers.d/$USER
    

    when i asked chatgpt for this i found you can just run sudo visudo and it’ll open the sudoers file.

  • now, let’s say you have a user app that you want to be able to run apt update and apt upgrade without being asked for a sudo password. you need to add this line to your sudoers file:

    app ALL=(ALL) NOPASSWD:/usr/bin/apt update, /usr/bin/apt upgrade
    

how it works

  1. app - the username on the system
  2. ALL=(ALL) - the rule applies on all hosts and allows running the commands as any user
  3. NOPASSWD - no password
  4. /usr/bin/apt update - you must pass the full path for the commands you want to run without a password.
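the rule generalizes to any comma-separated command list. for example, to also let app restart a service (myapp is a made-up unit name), the drop-in file could read:

```
app ALL=(ALL) NOPASSWD: /usr/bin/apt update, /usr/bin/apt upgrade, /usr/bin/systemctl restart myapp
```

you can check the file's syntax before relying on it with sudo visudo -cf /etc/sudoers.d/app.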

how to set up ffmpeg as a lambda layer

what i learned

how to add ffmpeg and ffprobe as a lambda layer to be used by lambda functions.

Getting ffmpeg

# ffmpeg
wget https://johnvansickle.com/ffmpeg/releases/ffmpeg-release-amd64-static.tar.xz

# checksum
wget https://johnvansickle.com/ffmpeg/releases/ffmpeg-release-amd64-static.tar.xz.md5

md5sum -c ffmpeg-release-amd64-static.tar.xz.md5

# extract
tar xvf ffmpeg-release-amd64-static.tar.xz

Side note: i had to brew install md5sha1sum and brew install wget on my local laptop

Creating Lambda Layer

  1. create ffmpeg/bin/
  2. copy ffmpeg into it
  3. zip ffmpeg/

# Create bin/
mkdir -p ffmpeg/bin

# Copy ffmpeg
cp ffmpeg-6.0-amd64-static/ffmpeg ffmpeg/bin

# Zip directory
cd ffmpeg
zip -r ../ffmpeg.zip .

Finally

Upload zip file as a lambda layer.

Bonus

In my case I also included ffprobe as it's also required for whisper.
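Once the layer is attached to a function, its contents are mounted under /opt. A hedged sketch of a handler that shells out to the layered ffprobe (the handler shape and event key are assumptions, not AWS requirements):

```python
import subprocess

# Lambda layers are extracted under /opt, so a layer zip containing
# bin/ffprobe exposes the binary at /opt/bin/ffprobe
FFPROBE = "/opt/bin/ffprobe"


def ffprobe_duration_cmd(path: str, ffprobe: str = FFPROBE) -> list[str]:
    # Ask ffprobe for just the container duration, printed as a bare number
    return [
        ffprobe,
        "-v", "error",
        "-show_entries", "format=duration",
        "-of", "default=noprint_wrappers=1:nokey=1",
        path,
    ]


def handler(event, context):
    # Run the layered binary against a file the function has downloaded
    out = subprocess.run(
        ffprobe_duration_cmd(event["path"]), capture_output=True, text=True, check=True
    )
    return {"duration_seconds": float(out.stdout.strip())}
```

Building the argument list in a small helper keeps the ffprobe invocation testable without actually running the binary.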

how to create an alias in the gh CLI

what i learned

you can create aliases in the GitHub CLI. i'm not super familiar with aliases. i've used them in the past to automate long commands. currently i'm using a couple at work to shorten dbt commands ever so slightly (from dbt run --target prod --select <models> to prod-run <selection query>).

however, i had only seen these as aliases one sets up at the profile level/scope. as in, we'd go to ~/.bash_profile or ~/.zsh_profile and add a new alias that's set every time we open a new terminal.

this is the first time i've seen a cli offer that within the tool itself. i wonder if this is a common practice i've missed until now.

in the GitHub cli you can use the command alias set to set an alias (docs).

i usually have to google the full list of flags i would like to run when creating a repo via the gh-cli so i figured i'd save it as an alias now. this is what i ~~wish i remembered~~ would like to run most times:

gh repo create <name> \
--public \
--add-readme \
--clone \
--gitignore Python \
--license bsd-3-clause-clear

it simply creates a public repo with the given name, includes a README, a license, and a gitignore file, and finally clones it to the local directory.

i might add the --disable-wiki simply because i don't use the wikis.

from the docs:

The expansion may specify additional arguments and flags. If the expansion includes positional placeholders such as "$1", extra arguments that follow the alias will be inserted appropriately. Otherwise, extra arguments will be appended to the expanded command.

so what i did was run

gh alias set pyrepo 'repo create "$1" --public --add-readme --clone --gitignore=Python --license=bsd-3-clause-clear'

and if i choose to, i can add a description by appending -d "my repo's description" after gh pyrepo <name>

how to use gh-actions to produce example images of code

what i learned

I learned to chain a lot of small tools using GitHub Actions to produce ready-to-share images of code examples for social media (namely, instagram and twitter) from my phone. The steps, generally speaking, go as follows:

  1. Create a new page on a Notion Database. I'll probably create a specific template for this, like I do with TILs, but it's not necessary.
  2. GitHub Action: Use my markdownify-notion python package to write the markdown version of this page and save it in a “quarto project” folder. This lets me use one general front-matter yaml file for all files rather than automate adding front matter to each file. I can still add specific front matter to files if I want to. (this TIL is an example of how this works - I’m writing it on Notion on my phone.)
  3. GitHub Action: Use Quarto to render this markdown file --to html and save it on an “output” directory. This will execute the code in the code cells and save the output inline.
  4. GitHub Action: Use shot-scraper to produce two files: a png screenshot and a pdf file. I’m using shot-scraper for the PDF as well rather than quarto because it’s easier and I don’t need to customize the pdf file just yet. I’m creating it essentially just because I can, it’s easy, and I might find a use for it later.
  5. GitHub Action: Once there are new png or pdf files in the “output” directory, I use s3-credentials to put those objects on a S3 bucket I also created using s3-credentials. This tool is fantastic: s3-credentials.readthedocs.io
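Steps 3–5 boil down to a few CLI calls. A hedged sketch of what the workflow step could look like (file names, output paths, and the bucket name are all made up for illustration):

```yaml
- name: Render, screenshot, and upload
  run: |
    quarto render post.md --to html
    shot-scraper post.html -o output/post.png
    shot-scraper pdf post.html -o output/post.pdf
    s3-credentials put-object my-output-bucket post.png output/post.png
```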

This is what the final image looks like:

(final image: screenshot of the rendered code example)