Code Snippets
  • Tailwind form inputs

    • %input.form-input.w-full{type:"text"}
  • Pull heroku db - simple

    • heroku pg:backups:capture --app APP_NAME
    • heroku pg:backups:download --app APP_NAME
    • Then to import that locally

    • pg_restore --verbose --clean --no-acl --no-owner -d database-name NAME_OF_FILE.dump
  • Check largest Postgres tables - data-wise

    • Go to Metabase, click "Ask A Question" -> "Native Query"

    • select schemaname as table_schema,
          relname as table_name,
          pg_size_pretty(pg_total_relation_size(relid)) as total_size,
          pg_size_pretty(pg_relation_size(relid)) as data_size,
          pg_size_pretty(pg_total_relation_size(relid) - pg_relation_size(relid))
            as external_size
      from pg_catalog.pg_statio_user_tables
      order by pg_total_relation_size(relid) desc,
               pg_relation_size(relid) desc
      limit 20;
  • Pull Heroku DB - exclude certain tables

    • heroku pg:credentials:url --app app_name
      export uri=INSERT_CONNECTION_URL_FROM_PREVIOUS_STEP
      pg_dump $uri -O -x -Fc --exclude-table-data=TABLE_TO_EXCLUDE -f database.dump
      pg_restore --verbose --clean --no-acl --no-owner -h localhost -d my_database database.dump
  • PG Restore a dump into a local DB

    • pg_restore --verbose --clean --no-acl --no-owner -d my_database database.dump
  • Export Remote postgres & import into Heroku DB

    • The issue with Heroku's recommended approach is that it breaks when database versions don't match, which happens all the time. So we'll export a plain SQL dump ourselves.

    • pg_dump --no-acl --no-owner --no-privileges --username=user --host=db.myserver.com db_name > db.sql
    • Now use psql to import it from the command line, replacing the database and file names with your own.

    • psql -d central_development -W -f farmbank.sql
    • To import into a Heroku instance, get the credentials from the "Credentials" section in Heroku Postgres

    • Option A - using pg_restore (needs a custom-format dump, i.e. one made with pg_dump --format=c)

    • pg_restore --verbose --clean --no-acl --no-owner -h ec2-54-155-46-64.eu-west-1.compute.amazonaws.com -d d3a482rnu2fnno -U aabwuyoxbpkxot atlas.sql
    • Option B - using psql (works with a plain SQL dump like the one above)

    • psql -h ec2-54-155-46-64.eu-west-1.compute.amazonaws.com -p 5432 -d d3a482rnu2fnno -U aabwuyoxbpkxot -W -f atlas.sql
  • Clear Delayed Jobs

    • From the command line

    • rake jobs:clear
    • Or, from a Rails console

    • Delayed::Job.delete_all
  • Clear only delayed jobs that are failing

    • Delayed::Job.where("attempts > 1").delete_all
  • Access a heroku app from a different account via CLI

    • heroku logout
      heroku login
  • Kill Rails Server

    • Replace PID with the id of the process returned in the first command

    • lsof -wni tcp:3000
      kill -9 PID
  • Remove stale postmaster.pid file

    • cd ~/Library/Application\ Support/Postgres/var-12/
      rm postmaster.pid
  • Exit a binding.pry loop unconditionally

    • !!!
  • Use ImprovMX to send emails via Rails

    • Ensure ImprovMX is configured to send email

    • Use the ImprovMX Logs tab to debug and see if ImprovMX is receiving requests.

    • When testing locally, remember to set config.action_mailer.perform_deliveries = true in development.rb
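
    • For example, a minimal development setup (raise_delivery_errors is optional but handy when debugging)

    • # config/environments/development.rb
      
      config.action_mailer.perform_deliveries = true
      config.action_mailer.raise_delivery_errors = true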

    • # config/environments/production.rb
      
      config.action_mailer.delivery_method = :smtp
      config.action_mailer.smtp_settings = {
        :address              => 'smtp.improvmx.com',
        :port                 => 587,
        :domain               => 'yourdomain.com',
        :user_name            => 'no-reply@yourdomain.com', # Whatever username you've configured in Improvmx goes here
        :password             => ENV['IMPROVMX_SMTP_PASSWORD'],
        :authentication       => 'login',
        :enable_starttls_auto => true
      }
  • Postgres JSONB

    • Find records where a jsonb attribute is present (and isn't the JSON null)

    • where info -> 'err' is not null and info -> 'err' != 'null'
  • Google Sheets - Find cells that contain a particular string

    • =FIND("STRING",B907,1)
    • Then filter where value > 0 to only show cells with that string
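    • Note: FIND returns a #VALUE! error when the string isn't found, so it can help to wrap it, e.g. =IFERROR(FIND("STRING",B907,1), 0), which makes non-matching rows return 0 and keeps the > 0 filter clean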

  • Port forwarding rules on Ubuntu

    • See the existing rules

      • sudo iptables -t nat -v -L PREROUTING -n --line-number
    • Add a rule (forward port 80 to port 3000)

      • sudo iptables -t nat -I PREROUTING -p tcp --dport 80 -j REDIRECT --to-ports 3000
    • Remove a rule

      • sudo iptables -t nat -D PREROUTING -p tcp --dport 80 -j REDIRECT --to-ports 3000
  • Tmux Basics

    • List all sessions

    • tmux ls
    • Kill a session

    • tmux kill-session -t <name>
  • Rails - Temporarily Connect to an external database to perform direct data import/export

    • (With Heroku) - Get your database credentials

      • heroku pg:credentials:url --app walletid
    • Add a new database in database.yml using the credentials obtained at step 1

      • newdb:
          adapter: postgresql
          encoding: utf8
          pool: 5
          port: 5432
          host: xxx-xxx.eu-west-1.compute.amazonaws.com
          username: xxxxx
          password: xxxxx
          database: xxxxx
    • Add a new temporary model and tell it to connect to your newly added data source

      • # models/sync_stub.rb
        class SyncStub < ApplicationRecord
          establish_connection :newdb
          self.table_name = "tasks"
        end
    • Great, now you can do things like SyncStub.all.count from a console or script

    • Bonus. Here's a helper script that will let you copy a subset of records from one table to another. I usually place it in the SyncStub model itself, but you can use it wherever.

      • # models/sync_stub.rb
        class SyncStub < ApplicationRecord
          
          establish_connection :newdb
          self.table_name = "tasks"
          
          def self.import_records(records,primary_attribute="id",override_table_name=nil)
            # records should contain the ActiveRecord collection of records you'd like to import
            # An example of using this is as follows:
            # SyncStub.import_records(SyncStub.where("user_id = ?",1))
            # Assuming the table_name is set to tasks, this would import
            # all tasks which have a user id of '1'
            constantized_class = (override_table_name || self.table_name).classify.constantize
        
            records.each do |record|
        
              new_record = constantized_class.find_or_initialize_by(
                "#{primary_attribute}": record.send("#{primary_attribute}")
              )
              record.attributes.each do | attribute, value |
                begin
                  new_record.send("#{attribute}=",value)
                rescue => exception
                  puts "Error: #{exception}"          
                end
              end
              new_record.save
            end
        
            ActiveRecord::Base.connection.reset_pk_sequence!(self.table_name)
        
          end
        end
      • Which will let you use the following

      • SyncStub.import_records(SyncStub.all)
  • Create an s3 bucket where all objects are public

    • Go to the S3 console and create a new bucket.

    • Ensure the "Block Public Access" section is unchecked

    • Go to the "Permissions" section for the bucket

    • Under "Bucket Policy", click "Edit"

      • Copy-paste the following snippet. Replace BUCKETNAMEHERE with your own

      • {
          "Version": "2012-10-17",
          "Statement": [
            {
              "Effect": "Allow",
              "Principal": "*",
              "Action": "s3:GetObject",
              "Resource": "arn:aws:s3:::BUCKETNAMEHERE/*"
            }
          ]
        }
  • HTMX with rails - remote request

    • This example targets the main page body in the main application template, but you can specify any DOM element you like

    • First, give the page content a div identifier

      • # application.html.erb
        <div id="main-page-content">
          <%= yield %>
        </div>
    • Use htmx to load in some remote content (in this case from the /gallery route). This example will fetch the content as soon as the page is loaded (because we specify load as the hx-trigger)

      • <div hx-get="/gallery" hx-target="#main-page-content" hx-trigger="load" ></div>
    • Tell your controller not to render the surrounding template if the incoming request is from htmx

      • # controllers/gallery_controller
        def index
          # Do Stuff
          render :layout => false if request.headers['HX-Request']
        end
  • HTMX - Submit a form from a dropdown

    • Tell htmx to fetch new content when someone changes the value of a dropdown.

    • <select name="sort_by" hx-get="/gallery" hx-target="#main-page-content" hx-trigger="change" >
        <option value="">Sort By</option>
        <option value="asc">Newest -&gt; Oldest</option>
        <option value="desc">Oldest -&gt; Newest</option>
      </select>
    • Optionally - also tell your dropdown to send all the form data (as opposed to just the currently changed dropdown element) when changed

    • The following snippet will send a GET request to /gallery?sort_by=asc&artist=banksy with an HX-Request header, and load the response into the #main-page-content div when the dropdown is changed.

    • <select class="gallery-dropdown" name="artist">
        <option value="">Select Artist</option>
        <option value="banksy">Banksy</option>
        <option value="warhol">Andy Warhol</option>
      </select>
      
      <select name="sort_by" hx-get="/gallery" hx-target="#main-page-content" hx-trigger="change" hx-include=".gallery-dropdown">
        <option value="">Sort By</option>
        <option value="asc">Newest -&gt; Oldest</option>
        <option value="desc">Oldest -&gt; Newest</option>
      </select>
  • Active Record - manually add a schema migration record to the database

    • Not recommended - used to get out of frustrating migration dead ends

    • rails c
      ActiveRecord::SchemaMigration.create!(version:"20220617082000")
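    • Or, equivalently, insert the row with raw SQL

    • insert into schema_migrations (version) values ('20220617082000');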
  • Linux - working with groups & permissions

    • List Groups

    • cat /etc/group
    • Create a group

    • sudo groupadd GROUPNAME
    • Create a group with a custom id

    • sudo groupadd -g GROUP_ID GROUPNAME
    • Add user to group
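
    • sudo usermod -aG GROUPNAME USERNAME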

  • Default Rails New

    • rails new misc --database=postgresql --skip-action-mailer --skip-action-mailbox --skip-action-text --skip-active-job --skip-active-storage --skip-action-cable --skip-javascript
  • Set up a basic Github Action to test migrations

    • Create a new file at /.github/workflows/ci.yml with the following content. Replace ruby-version with the relevant version for your app.

    • name: Continuous integration
      on:
        push:
          branches: [master]
        pull_request:
          branches: [master]
      jobs:
        test_migrations:
          runs-on: ubuntu-latest
          
          services:
            postgres:
              image: postgres:11
              env:
                POSTGRES_USER: postgres
                POSTGRES_PASSWORD: postgres
              ports: ['5432:5432']
              options:
                --health-cmd pg_isready
                --health-interval 10s
                --health-timeout 5s
                --health-retries 5
          steps:
            - uses: actions/checkout@v3
            - name: Setup Ruby
              uses: ruby/setup-ruby@v1
              with:
                ruby-version: '3.0.0'
            - name: Build and run test
              env:
                DATABASE_URL: postgres://postgres:postgres@localhost:5432/test
                RAILS_ENV: test
                POSTGRES_USER: postgres
                POSTGRES_PASSWORD: postgres
              run: |
                sudo apt-get -yqq install libpq-dev
                gem install bundler:1.17.3
                bundle install --jobs 4 --retry 3
                bundle exec rails db:create
                bundle exec rails db:migrate
      
    • Ensure the database.yml file explicitly lists port, host, username and password

    • # config/database.yml
      
      test:
        <<: *default
        database: yourdb
        host: localhost
        port: <%= ENV["DATABASE_PORT"] || 5432 %>
        username: <%= ENV["POSTGRES_USER"] %>
        password: <%= ENV["POSTGRES_PASSWORD"] %>
  • Use Crono Gem for scheduled jobs

    • Add it to the app

    • # Gemfile
      gem 'crono'
      gem 'daemons'
      
      # Then, from the shell
      rails generate crono:install
      rake db:migrate
    • Enable the admin panel

      • # config/routes.rb
        Rails.application.routes.draw do
          mount Crono::Engine, at: '/crono'
        end
      • # controllers/crono/jobs_controller.rb
        
        # This is a monkey-patch of the default crono controller 
        # that just adds a before_action for auth. 
        # If this becomes out of date, the original file is at: 
        # https://github.com/plashchynski/crono/blob/main/app/controllers/crono/jobs_controller.rb
        
        module Crono
          class JobsController < ApplicationController
            before_action :authenticate_admin_user!
            def index
              @jobs = Crono::CronoJob.all
            end
        
            def show
              @job = Crono::CronoJob.find(params[:id])
            end
          end
        end
    • Start the crono worker

      • bundle exec crono start RAILS_ENV=production
  • Provision a Digital Ocean Droplet for a rails environment

    • Either

      • source <(curl -s https://gist.githubusercontent.com/tonyennis145/c65ef5e8f83947199dadaedc341a7dc8/raw/4490ad95eca88bb54dd44130d118ab3161a66a9a/fresh_vm.sh)
    • Or

      • #!/bin/bash
        
        echo "Installing rbenv and ruby-build"
        git clone https://github.com/rbenv/rbenv.git /usr/bin/.rbenv 
        git clone https://github.com/rbenv/ruby-build.git /usr/bin/.rbenv/plugins/ruby-build 
        export RBENV_ROOT=/usr/bin/.rbenv
        echo 'export RBENV_ROOT=/usr/bin/.rbenv' >> ~/.bashrc 
        echo 'export PATH="/usr/bin/.rbenv/bin:$PATH"' >> ~/.bashrc 
        echo 'eval "$(rbenv init -)"' >> ~/.bashrc 
        echo 'export PATH="/usr/bin/.rbenv/plugins/ruby-build/bin:$PATH"' >> ~/.bashrc 
        
        echo "Installing apt-get packages including imagemagick, and nginx"
        sudo apt-get update 
        sudo apt-get install -y git-core curl zlib1g-dev build-essential libv8-dev libssl-dev libreadline-dev libyaml-dev libxml2-dev libxslt1-dev libcurl4-openssl-dev libffi-dev postgresql postgresql-contrib nginx dirmngr gnupg libpq-dev apt-transport-https wget ca-certificates imagemagick 
        
        echo "Installing postgres"
        wget --quiet -O - https://www.postgresql.org/media/keys/ACCC4CF8.asc | sudo apt-key add -
        sudo sh -c 'echo "deb http://apt.postgresql.org/pub/repos/apt/ $(lsb_release -cs)-pgdg main" >> /etc/apt/sources.list.d/pgdg.list'
        sudo apt update
        sudo apt install -y postgresql postgresql-contrib
        
        echo "Installing Passenger"
        sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv-keys 561F9B9CAC40B2F7
        sudo sh -c 'echo deb https://oss-binaries.phusionpassenger.com/apt/passenger bionic main > /etc/apt/sources.list.d/passenger.list'
        sudo apt-get update
        sudo apt-get install -y libnginx-mod-http-passenger 
        if [ ! -f /etc/nginx/modules-enabled/50-mod-http-passenger.conf ]; then sudo ln -s /usr/share/nginx/modules-available/mod-http-passenger.load /etc/nginx/modules-enabled/50-mod-http-passenger.conf ; fi
        sudo service nginx restart
        
        echo "Installing Certbot"
        sudo snap install core; sudo snap refresh core
        sudo snap install --classic certbot
        
        echo "Creating root postgres user"
        sudo -u postgres createuser -s -r root
        sudo -u postgres createdb root
        
        echo "Installing Ruby 3.1.2"
        
        export PATH="/usr/bin/.rbenv/bin:$PATH"
        export PATH="/usr/bin/.rbenv/plugins/ruby-build/bin:$PATH"
        rbenv init
        rbenv install --verbose 3.1.2
        rbenv global 3.1.2
        
    • Create postgres user and db

      • sudo -u postgres createuser --superuser root; sudo -u postgres createdb root
    • Set up app directory

      • sudo mkdir /apps
        sudo chmod -R 777 /apps
    • Add line to nginx.conf

      • passenger_default_user root;
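    • Then add a site config for the app - a minimal sketch, assuming the app is deployed at /apps/myapp and uses the rbenv Ruby installed above (adjust paths and server_name for your setup)

      • # /etc/nginx/sites-enabled/myapp.conf
        server {
          listen 80;
          server_name example.com;
          root /apps/myapp/public;
          passenger_enabled on;
          passenger_app_env production;
          passenger_ruby /usr/bin/.rbenv/shims/ruby;
        }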
  • Provision a Digital Ocean Droplet for a node environment

    • echo "Installing NVM"
      
      curl https://raw.githubusercontent.com/creationix/nvm/master/install.sh | bash 
      
      echo "Installing Node & NPM"
      nvm install --lts
  • Investigate which files/folders are using up space on Linux

    • apt update
      apt install ncdu
      cd /
      ncdu
  • Fix the “No Space Left on Device” Error on Linux

    • sudo lsof / | grep deleted
      # Look at what processes are still running for deleted files, get their PID and kill them
      kill -9 PROCESS_ID
  • Readable strftime format

    • object.created_at.strftime("%b %e, %H:%M")
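    • Produces something like "Sep  4, 15:42" (%e pads single-digit days with a space)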
  • Hyperscript scroll a horizontally scrollable div to the end when it loads

    • <div _="init set my.scrollLeft to my.scrollWidth">
  • Heroku sign in to CLI on remote server (solve ip address mismatch issue)
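
    • One common fix is to log in interactively so the CLI doesn't rely on the browser-based flow

    • heroku login -i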

  • Allow connecting to postgres on a DigitalOcean Ubuntu VM from metabase or elsewhere

    • Warning: Don't do this on a production server unless you know exactly what you're doing.

    • Update postgresql.conf with this

    • listen_addresses='*'
    • Update pg_hba.conf with this

    • # Added the following line manually to allow external connections to the database, as per https://dba.stackexchange.com/questions/83984/connect-to-postgresql-server-fatal-no-pg-hba-conf-entry-for-host
      host  all  all 0.0.0.0/0 md5
    • Create a new postgres user, give it super user privileges, and set a strong password.

    • Create the user from the command line

    • sudo -u postgres createuser metabase
    • Generate a super secure password, then use psql to list the users, give the metabase user superuser privileges, and set the password

    • psql
      
      \du
      
      ALTER USER metabase WITH SUPERUSER;
      
      \password metabase
    • Restart postgres

    • sudo service postgresql restart
  • Rewrite git history to update name and email address of committer

    • git filter-branch --env-filter '
      export GIT_AUTHOR_EMAIL="me@example.com";
      export GIT_COMMITTER_EMAIL="me@example.com";
      export GIT_AUTHOR_NAME="My Name";
      export GIT_COMMITTER_NAME="My Name";
      ' --tag-name-filter cat -- --branches --tags
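    • Since this rewrites every commit, force-push the result afterwards

    • git push --force --tags origin 'refs/heads/*'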
  • Basic active menu highlighting with inline vanilla js and onclick

    • <div onclick="Array.from(this.children).forEach(function(div) { div.classList.remove('text-black')}); event.target.classList.add('text-black')">
        <div class="text-black">Active Item</div>
        <div>Active Item</div>
        <div>Active Item</div>
      </div>
  • Deferring loading of assets to improve pagespeed

    • <link rel="preload" href="/stylesheets/tailwind-f475bbe.css<%= cache_buster %>" as="style" onload="this.onload=null;this.rel='stylesheet'">
      <link rel="preload" href="/stylesheets/forms-f475bbe.css<%= cache_buster %>" as="style" onload="this.onload=null;this.rel='stylesheet'">
      <link rel="preload" href="/stylesheets/custom.css<%= cache_buster %>" as="style" onload="this.onload=null;this.rel='stylesheet'">
      
      <noscript>
      <link rel="stylesheet" href="/stylesheets/tailwind-f475bbe.css<%= cache_buster %>">
      <link rel="stylesheet" href="/stylesheets/forms-f475bbe.css<%= cache_buster %>">
      <link rel="stylesheet" href="/stylesheets/custom.css<%= cache_buster %>">
      </noscript>
  • Add heroku config vars from an application.yml file

    • CONFIG_FILE="config/application.yml"
      
      # Read each line in the application.yml file
      while IFS= read -r line
      do
          # Extract the key and value from the line
          KEY=$(echo $line | cut -d ':' -f 1 | xargs)
          VALUE=$(echo $line | cut -d ':' -f 2- | xargs)
      
          # Check if the line is not empty and contains a key-value pair
          if [[ ! -z "$KEY" && ! -z "$VALUE" ]]
          then
              # Set Heroku config variable
              heroku config:set "$KEY=$VALUE" --app farmbank
          fi
      done < "$CONFIG_FILE"
