It has been 20 years since I started this blog. Time flies.
Author: Paweł Gościcki
Enums in Ruby on Rails backed by PostgreSQL’s ENUM
Let me present the way I usually create enums in Rails’ ActiveRecord models, utilizing the capabilities of the underlying PostgreSQL database and its ENUM type.
Let’s start with an example Subscription model:
class Subscription < ApplicationRecord
  ACTIVE = 'active'.freeze
  INACTIVE = 'inactive'.freeze

  STATES = [ACTIVE, INACTIVE].freeze

  enum state: {
    active: ACTIVE,
    inactive: INACTIVE
  }

  validates :state, presence: true
end
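With the enum in place, Rails generates the usual predicate methods, bang methods, and scopes for each state. A quick illustrative sketch (this assumes a Rails environment with the Subscription model above; it is not standalone Ruby):

```ruby
subscription = Subscription.new(state: 'active')

subscription.active?    # => true
subscription.inactive!  # persists state = 'inactive'
subscription.inactive?  # => true

# Scopes generated by the enum:
Subscription.active     # WHERE state = 'active'
Subscription.inactive   # WHERE state = 'inactive'
```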
The above expects a string-typed state database column (instead of the default integer one). Let’s create a migration for it:
class CreateSubscriptions < ActiveRecord::Migration[7.1]
  def change
    reversible do |direction|
      direction.up do
        execute <<-SQL
          CREATE TYPE subscription_state AS ENUM ('active', 'inactive');
        SQL
      end

      direction.down do
        execute <<-SQL
          DROP TYPE subscription_state;
        SQL
      end
    end

    create_table :subscriptions do |t|
      t.column :state, :subscription_state, default: 'active', null: false

      t.timestamps null: false
    end
  end
end
Notice that in the model I have intentionally left out the inclusion validation for the state field. This is because Rails automatically raises ArgumentError if we try to assign a value outside the enum. As such, it is convenient to rescue from such situations in the ApplicationController:
class ApplicationController < ActionController::Base
  rescue_from ArgumentError, with: :bad_request

  ...

  private

  def bad_request(exception)
    message = exception.message

    respond_to do |format|
      format.html do
        render 'bad_request', status: :unprocessable_entity, locals: { message: }
      end
      format.json do
        render json: { status: 'ERROR', message: }, status: :unprocessable_entity
      end
    end
  end
end
Let’s add some tests. I’ll be using shoulda-matchers for some handy one-liners:
describe Subscription do
  specify ':state enum' do
    expect(described_class.new).to define_enum_for(:state)
      .with_values(active: 'active', inactive: 'inactive')
      .backed_by_column_of_type(:enum)
  end

  it { is_expected.to allow_values(:active, :inactive).for :state }
  it { is_expected.to validate_presence_of :state }

  describe ':state enum validation' do
    it 'raises ArgumentError when assigning an invalid value' do
      expect { described_class.new.state = 'canceled' }.to raise_exception ArgumentError
    end
  end
end
And an accompanying request spec for a hypothetical SubscriptionsController:
describe 'API subscriptions requests' do
  describe 'PATCH :update' do
    let(:subscription) { create :subscription }
    let(:params) { { subscription: { state: 'invalid' } } }

    context 'when unsuccessful' do
      it 'responds with :unprocessable_entity with error details in the JSON response' do
        patch(subscriptions_path, params:)

        expect(response).to be_unprocessable
        expect(json_response).to be_a Hash
        expect(json_response['status']).to eql 'ERROR'
        expect(json_response['message']).to include 'ArgumentError'
      end
    end
  end
end
Voilà!
Running Puppeteer on AWS Lambda in a Docker container
The aim of this guide is to provide a working solution to generating a PDF version of a webpage using Puppeteer running in a Docker container as a Lambda function. The Docker container approach is used to bypass the 50MB Lambda code size limit. The other option is to use something like chrome-aws-lambda.
We’ll start with the Dockerfile, which assumes a Lambda function running on the Node.js 16 runtime, defined in index.js with a named handler export:
FROM public.ecr.aws/lambda/nodejs:16

# Required for puppeteer to run
RUN yum install -y amazon-linux-extras
RUN amazon-linux-extras install epel -y

# Chromium dependencies
RUN yum install -y \
    GConf2.x86_64 \
    alsa-lib.x86_64 \
    atk.x86_64 \
    cups-libs.x86_64 \
    gtk3.x86_64 \
    ipa-gothic-fonts \
    libXScrnSaver.x86_64 \
    libXcomposite.x86_64 \
    libXcursor.x86_64 \
    libXdamage.x86_64 \
    libXext.x86_64 \
    libXi.x86_64 \
    libXrandr.x86_64 \
    libXtst.x86_64 \
    pango.x86_64 \
    xorg-x11-fonts-100dpi \
    xorg-x11-fonts-75dpi \
    xorg-x11-fonts-Type1 \
    xorg-x11-fonts-cyrillic \
    xorg-x11-fonts-misc \
    xorg-x11-utils
RUN yum update -y nss

# Chromium needs to be installed as a system dependency, not via npm; otherwise there will be an error about missing libatk-1.0
RUN yum install -y chromium

# The destination must end with a slash when copying multiple files
COPY index.js package.json package-lock.json ${LAMBDA_TASK_ROOT}/

RUN npm ci --omit=dev

CMD [ "index.handler" ]
The above Dockerfile ensures all required dependencies are in place. The next step is to set up Puppeteer’s launch options. Here is the relevant snippet from the Lambda function code:
import puppeteer from 'puppeteer'

const launchOptions = {
  args: [
    // Flags for running in Docker on AWS Lambda
    // https://www.howtogeek.com/devops/how-to-run-puppeteer-and-headless-chrome-in-a-docker-container
    // https://github.com/alixaxel/chrome-aws-lambda/blob/f9d5a9ff0282ef8e172a29d6d077efc468ca3c76/source/index.ts#L95-L118
    // https://github.com/Sparticuz/chrome-aws-lambda/blob/master/source/index.ts#L95-L123
    '--allow-running-insecure-content',
    '--autoplay-policy=user-gesture-required',
    '--disable-background-timer-throttling',
    '--disable-component-update',
    '--disable-dev-shm-usage',
    '--disable-domain-reliability',
    '--disable-features=AudioServiceOutOfProcess,IsolateOrigins,site-per-process',
    '--disable-ipc-flooding-protection',
    '--disable-print-preview',
    '--disable-setuid-sandbox',
    '--disable-site-isolation-trials',
    '--disable-speech-api',
    '--disable-web-security',
    '--disk-cache-size=33554432',
    '--enable-features=SharedArrayBuffer',
    '--hide-scrollbars',
    '--ignore-gpu-blocklist',
    '--in-process-gpu',
    '--mute-audio',
    '--no-default-browser-check',
    '--no-first-run',
    '--no-pings',
    '--no-sandbox',
    '--no-zygote',
    '--single-process',
    '--use-angle=swiftshader',
    '--use-gl=swiftshader',
    '--window-size=1920,1080',
  ],
  defaultViewport: null,
  headless: true,
}

const browser = await puppeteer.launch(launchOptions)

try {
  const page = await browser.newPage()
  const url = 'https://...'

  await page.goto(url, { waitUntil: ['domcontentloaded', 'networkidle0'] })
  await page.emulateMediaType('print')

  const pdf = await page.pdf({})
} catch (error) {
  ...
} finally {
  // Always close the browser, or the Lambda container will leak the Chromium process
  await browser.close()
}
Testing useNavigate() / navigate() from react-router v6
Testing navigate() is slightly more problematic with the latest v6 (as of writing this post) react-router than simply asserting on history.push(), as was the case in the previous versions. Let’s say we have this ButtonHome component:
import { useNavigate } from 'react-router-dom'

const ButtonHome = () => {
  const navigate = useNavigate()

  const onClick = () => navigate('/home')

  return (
    <button onClick={onClick}>
      Home
    </button>
  )
}
I would write a test for this component using the react-testing-library in the following way:
import * as router from 'react-router'
import { render, screen } from '@testing-library/react'
import userEvent from '@testing-library/user-event'

import ButtonHome from './ButtonHome'

describe('ButtonHome', () => {
  const ui = userEvent.setup()
  const navigate = jest.fn()

  beforeEach(() => {
    jest.spyOn(router, 'useNavigate').mockImplementation(() => navigate)
  })

  it('renders the button and navigates to /home upon click', async () => {
    render(withRouter(<ButtonHome />))

    await ui.click(screen.getByText('Home'))

    expect(navigate).toHaveBeenCalledWith('/home')
  })
})
The relevant bits just for testing the router are as follows:
import * as router from 'react-router'

const navigate = jest.fn()

beforeEach(() => {
  jest.spyOn(router, 'useNavigate').mockImplementation(() => navigate)
})

it('...', () => {
  expect(navigate).toHaveBeenCalledWith('/path')
})
The test also requires the following withRouter() helper, which I have in jest.setup.js:
import { Route, Router, Routes } from 'react-router-dom'
import { createBrowserHistory } from 'history'

const history = createBrowserHistory()

const withRouter = (children, opts = {}) => {
  const { path, route } = opts

  if (path) {
    history.push(path)
  }

  return (
    <Router location={history.location} navigator={history}>
      <Routes>
        <Route
          path={route || path || '/'}
          element={children}
        />
      </Routes>
    </Router>
  )
}

global.withRouter = withRouter
Debugging Safari web application on iOS devices
The idea here is to be able to use Safari’s developer tools connected to the web page/application running on a real iOS device. You can achieve a similar effect by using Xcode’s Simulator app, but certain things don’t work there (looking at you, Apple Pay).
Enable USB internet sharing
- Open System Preferences > Sharing
- Select Internet Sharing
- If the checkbox next to Internet Sharing is enabled, uncheck it
- Check iPhone USB on the right side
- Check Internet Sharing on the left side. You will have to confirm it
- Find the message saying: “Computers on your local network can access your computer at: xxxx.local”. The xxxx part is the host name. You will need it later.
Make your application available on the new host
Now it’s necessary to make your application available on the new xxxx.local host. In nginx it is a matter of adding xxxx.local to the server_name directive:
server_name existing.server xxxx.local;
For rails server it’s necessary to bind to all local interfaces:
rails server -b 0.0.0.0
Connect your iOS device
- on iOS go to Settings -> Safari -> Advanced and toggle “Web Inspector”
- on Mac open Safari and go to Preferences -> Advanced and check “Show Develop menu in menu bar”
- connect iPhone via USB cable
- on Mac restart Safari
- on iPhone open http://xxxx.local
- on Mac open Safari and go to the Develop menu. You will now see the iOS device you connected to your Mac. Click on it to start debugging.
Troubleshooting
Sometimes Safari won’t show the connected iOS device. If that is the case, start by restarting Safari on both the iOS device and the Mac. If that doesn’t work, restart the iOS device and then the Mac.
Creating a self-signed, wildcard SSL certificate for Chrome 58+
Chrome 58+ requires a Subject Alternative Name to be present in the SSL certificate for the domain name you want to secure. It is the replacement for Common Name, which has some security holes (like being able to define a certificate for *.co.uk, which is not possible with SAN).
I’ll be using macOS and OpenSSL v1.1.1d installed via brew.
Recent OpenSSL versions add the basicConstraints=critical,CA:TRUE x509v3 extension by default, which prevents such a generated certificate from working in Chrome 58+. We need to disable that first.
Edit /usr/local/etc/openssl@1.1/openssl.cnf (your path may vary) and comment out the following line:
[ req ]
# x509_extensions = v3_ca # The extensions to add to the self signed cert
And then you are ready to generate the certificate. I’ll be using the *.example.net domain name here.
/usr/local/Cellar/openssl@1.1/1.1.1d/bin/openssl req \
-x509 \
-newkey rsa:4096 \
-sha256 \
-days 7000 \
-nodes \
-out cert.pem \
-keyout key.pem \
-subj "/C=US/O=Org/CN=*.example.net" \
-addext "basicConstraints=critical,CA:FALSE" \
-addext "authorityKeyIdentifier=keyid,issuer" \
-addext "keyUsage = digitalSignature, nonRepudiation, keyEncipherment, dataEncipherment" \
-addext "subjectAltName=DNS:example.net,DNS:*.example.net"
This will generate two files: key.pem, which is the private key (without a passphrase), and cert.pem, which is the actual certificate.
Verify that the generated certificate has the required x509v3 SAN extensions:
$ openssl x509 -in cert.pem -noout -text
Certificate:
    Data:
        Version: 3 (0x2)
        Serial Number:
            70:4c:28:...
        Signature Algorithm: sha256WithRSAEncryption
        Issuer: C=US, O=Org, CN=*.example.net
        Validity
            Not Before: Oct  2 15:48:10 2019 GMT
            Not After : Dec  1 15:48:10 2038 GMT
        Subject: C=US, O=Org, CN=*.example.net
        Subject Public Key Info:
            Public Key Algorithm: rsaEncryption
                Public-Key: (4096 bit)
                Modulus:
                    00:a7:b5:01...
                Exponent: 65537 (0x10001)
        X509v3 extensions:
            X509v3 Basic Constraints: critical
                CA:FALSE
            X509v3 Authority Key Identifier:
                DirName:/C=US/O=Org/CN=*.example.net
                serial:70:4C:...
            X509v3 Key Usage:
                Digital Signature, Non Repudiation, Key Encipherment, Data Encipherment
            X509v3 Subject Alternative Name:
                DNS:example.net, DNS:*.example.net
    Signature Algorithm: sha256WithRSAEncryption
         59:1d:96:...
The last step is to import the certificate (cert.pem) into the keychain (I’m using the login keychain) and trust it.
So easy. So hard.
Ruby date is (ir)rational
Consider the following:
[DEV] main:0> Date.today
Fri, 05 Apr 2019
[DEV] main:0> (Date.today + 0.5)
Fri, 05 Apr 2019
[DEV] main:0> (Date.today + 0.5) == Date.today
false
The above was executed in a Rails application and was the reason for a quite long wtf moment. Things became clearer when I executed this:
[DEV] main:0> Date.today + 0.5 - Date.today
1/2
It turns out that Date holds a Rational offset inside of it. This becomes more evident when you execute the above in irb:
irb(main):003:0> Date.today
=> #<Date: 2019-04-05 ((2458579j,0s,0n),+0s,2299161j)>
irb(main):004:0> Date.today + 0.5
=> #<Date: 2019-04-05 ((2458579j,43200s,0n),+0s,2299161j)>
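This is easy to reproduce in plain Ruby with just the stdlib date library; the fractional part survives the addition even though the printed date looks unchanged:

```ruby
require 'date'

day = Date.new(2019, 4, 5)
half_day_later = day + 0.5 # Date#+ accepts numerics; the fraction is kept as a Rational

day.to_s            # => "2019-04-05"
half_day_later.to_s # => "2019-04-05" -- prints the same...

half_day_later == day # => false -- ...but the objects are not equal
half_day_later - day  # => (1/2) -- the Rational half-day offset
```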
LEFT OUTER JOIN in ActiveRecord
I always forget how to construct those queries in ActiveRecord, so here it goes.
Assuming we have the following structure:
class User < ActiveRecord::Base
  has_many :authentications
end

class Authentication < ActiveRecord::Base
  belongs_to :user
end
We can generate the LEFT OUTER JOIN SQL query in the following way (to see, for example, if we have dangling user references in authentications):
Authentication
  .joins('LEFT OUTER JOIN users ON authentications.user_id = users.id')
  .where('users.id IS NULL')
  .where('authentications.user_id IS NOT NULL')
This will generate the following SQL:
SELECT `authentications`.* FROM `authentications` LEFT OUTER JOIN users ON authentications.user_id = users.id WHERE (users.id IS NULL) AND (authentications.user_id IS NOT NULL)
i.e. it will select all authentications with incorrect (dangling) user_id references.
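On Rails 5 and later the same query can be built without hand-written join SQL, using left_outer_joins. A sketch, assuming the same User/Authentication models (this needs a Rails environment, so it is illustrative only):

```ruby
# Equivalent query using the built-in left_outer_joins (Rails 5+):
Authentication
  .left_outer_joins(:user)
  .where(users: { id: nil })
  .where.not(user_id: nil)
```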
Setting up mocha with sinon and chai
I was unable to quickly find a solution for this, so here’s a little guide on how to set it up together in a proper way.
First, install the libraries:
npm install mocha --save-dev
npm install sinon --save-dev
npm install chai --save-dev
I come from the Ruby world, so I expect to have a spec command, a spec_helper.js file, and specs living inside a spec/ directory (with a nested structure). Inside the package.json file define the spec command:
"scripts" : {
"spec": "mocha --opts spec/mocha.opts"
}
We will be using the BDD style (expect().to()) of chai. Inside spec/mocha.opts add:
--recursive **/*_spec.js
--require spec/spec_helper.js
--ui bdd
Create spec/spec_helper.js, which will require chai and sinon and expose them globally; it is loaded once per run via the --require flag in mocha.opts, so it is available inside all specs (similarly to how RSpec works in the Ruby world):
const sinon = require('sinon')
const expect = require('chai').expect
global.sinon = sinon
global.expect = expect
And now create your spec file (spec/module_spec.js). You should not be required to include any libraries there. Now you can run your specs:
npm run spec
Writing NullObject in Ruby to please Rubocop gods
Let’s get right to it. Here’s how one can write a NullObject in a modern, Rubocop-friendly manner:
class NullObject
  def method_missing(method, *args, &block)
    if respond_to?(method)
      nil
    else
      super
    end
  end

  def respond_to_missing?(_name, _include_private = false)
    true
  end
end
One of the things you should support is the fallback to super. I achieve that with the if/else block. In practice super is never reached, since respond_to_missing? always returns true, so this feels a little bit like a waste.
The other thing is to declare whether your object (NullObject) should respond to methods it doesn’t have. A true null object should respond to all methods, so I make respond_to_missing? return true.
Just for posterity, here’s how you could write specs for it:
describe NullObject do
  it 'returns nil for any method call' do
    null = NullObject.new

    expect(null.missing_method).to be_nil
    expect(null.some_other_missing_method(1, 2, 3)).to be_nil
  end

  it 'responds to missing methods' do
    null = NullObject.new

    expect(null.respond_to?(:missing_method)).to be true
  end
end
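To show the pattern in action, here is a self-contained sketch: a null object standing in for a missing user, so callers can send messages without nil checks (the current_user name is made up for illustration):

```ruby
class NullObject
  def method_missing(method, *args, &block)
    respond_to?(method) ? nil : super
  end

  def respond_to_missing?(_name, _include_private = false)
    true
  end
end

# Hypothetical usage: fall back to a NullObject instead of returning nil
current_user = nil
user = current_user || NullObject.new

user.name         # => nil -- no NoMethodError raised
user.subscription # => nil
```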
Additional reading about the NullObject pattern:
- http://wiki.c2.com/?NullObject
- http://www.virtuouscode.com/2011/05/30/null-objects-and-falsiness/