Here are some technical notes on the site I made for Pixel Plants.

PP 035

Use image compression to store the replay data. In addition to the images themselves, I store the order in which pixels were placed onto the canvas so that I can visually recreate the generation process. I previously stored this data as JSON with delta-compressed x and y channels, but the results were still many megabytes in size. Pixels are most often placed near other recently placed pixels, so for this iteration I encoded the data as an image rather than JSON to take advantage of spatial locality during compression. I store the base-256-encoded placement order for each pixel as that pixel's color value, which amounts to storing a permutation over all pixels in the image. The resulting files were smaller, from 1.1 to 1.5 MB, with webp achieving a compression rate of 12-14 bits per pixel.
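The encoding can be sketched in a few lines of JavaScript. The function names here are mine, not from the site's code, but the idea is the same: write each pixel's placement index into that pixel's color as a three-digit base-256 number, one digit per channel:

```javascript
// Encode a placement index (0 to 256^3 - 1) as an [r, g, b] triple:
// most significant base-256 digit in red, least significant in blue.
function encodeIndex(n) {
  return [(n >> 16) & 0xff, (n >> 8) & 0xff, n & 0xff];
}

// Recover the placement index from a pixel's color.
function decodeIndex([r, g, b]) {
  return (r << 16) | (g << 8) | b;
}

// Build a flat RGB buffer for the replay image from a placement order,
// where order[step] is the index of the pixel placed at that step.
function replayPixels(order) {
  const rgb = new Uint8Array(order.length * 3);
  order.forEach((pixel, step) => {
    rgb.set(encodeIndex(step), pixel * 3);
  });
  return rgb;
}
```

Because nearby pixels tend to hold nearby placement indices, neighboring colors in the replay image are similar, which is exactly the kind of redundancy lossless image codecs exploit.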

PP 082

Compress assets with png, webp, and jpg. A large fraction of the data for the site consisted of the art image and replay image. I compressed both with pngcrush, and also compressed them using lossless webp. The webp files were smaller but not all browsers support the format yet, so I programmed the site to load the webp files when available and to fall back to png otherwise. Other image assets could stand to be compressed lossily, so I did the same thing but using lossy webp and jpg.
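One common way to express this fallback (not necessarily how the site implements it) is the `<picture>` element, where supporting browsers pick the webp `<source>` and others fall back to the `<img>`'s png. A sketch with a hypothetical helper:

```javascript
// Hypothetical helper that emits <picture> markup for an asset with both
// a .webp and a .png version; the browser chooses the first format it supports.
function pictureMarkup(name, alt) {
  return [
    "<picture>",
    `  <source srcset="${name}.webp" type="image/webp">`,
    `  <img src="${name}.png" alt="${alt}">`,
    "</picture>",
  ].join("\n");
}
```

This keeps the fallback logic in markup rather than in feature-detection JavaScript.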

Here's the pngcrush command I used to compress files in place. -brute tries all permutations of compression settings, -c specifies the color type (2 is true color without an alpha channel), and -ow means overwrite:

pngcrush -brute -c 2 -ow 35.png

For webp, I used either the -lossless flag or a quality -q of 95 (out of 100), a compression method -m of 6 (slowest, but best), and enabled multithreading with -mt for a nice speedup:

cwebp 35.png -q 95 -m 6 -mt -o 35.webp

I also learned about [seq](<https://en.wikipedia.org/wiki/Seq_(Unix)>) and used it together with xargs to compress all of the images with one command, e.g. this one for square thumbnails:

seq 100 | xargs -I {} sh -c 'cwebp {}.png -q 95 -m 6 -mt -o {}.webp'

Netlify DNS can be very slow. I noticed when reloading the site that pages occasionally took a very long time to load, and tracked the culprit down to DNS resolution, which was occasionally taking 500-1000ms to complete using Netlify and their DNS service. After searching around I discovered that Netlify can have very long DNS resolution times (e.g. 700ms) and there's nothing users can do about it. I switched to Cloudflare Pages, which seemed to solve the problem, though now deploys take several minutes longer (they're working on it).

Use WebPageTest and Chrome's 3G simulator to test page performance on slower connections. In addition to monitoring file sizes, I also tested the site to get an idea of how long the various assets on the page took to load. This resulted in a number of optimizations, including adding a loading indicator for the replay data, adding image width and height attributes to prevent unnecessary page reflows, and bundling all of the JavaScript into a single file to minimize the number of serial network requests needed before the page begins showing content.

Do some Observable-specific optimizations. Since the site is built with Observable, bundling the JS takes a bit more work than usual and involves creating a [Library](<https://github.com/observablehq/stdlib#Library>) with a custom resolve function that returns the bundled instances of e.g. [htl](<https://github.com/observablehq/htl>). Other Observable-specific changes include removing the Markdown dependency by replacing the title cells with HTML template literals, and using the notebook visualizer to prune unwanted dependencies, e.g. loading only the submodules of d3 that were actually used rather than the default d3 bundle. It would be nice if there were a way to remove unused parts of Stdlib, e.g. the SQLite client — the minified JS for the site still contains code for unused Observable builtins like SQLite and FileAttachment.

Smooth transitions on image load. The "Shuffle" button swaps new thumbnails into the image grid, and I wanted those images to fade in smoothly once they're loaded. This turned out to be a bit tricky, since most of the naïve approaches don't wait for the image to load before starting the animation. I learned that CSS transitions on the background-image property aren't supported across all browsers, because the property isn't animatable according to the CSS spec, so I ended up adding new `<img>` elements to the DOM at opacity 0 and fading them in to create the transition.
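A minimal sketch of this approach, with my own function name and timing values. The key is deferring the opacity change until the `load` event fires, so the CSS transition runs on a fully loaded image:

```javascript
// Insert the new <img> fully transparent, then raise the opacity only once
// the "load" event fires so the fade runs on a fully loaded image.
function fadeInImage(img) {
  img.style.opacity = "0";
  img.style.transition = "opacity 300ms ease-in";
  img.addEventListener("load", () => {
    // In a real page you may also need to wait a frame (requestAnimationFrame)
    // after inserting the element so the browser registers the starting opacity.
    img.style.opacity = "1";
  });
}
```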

Use ffmpeg for animation videos. I wanted to create little square animations of the placement process, so I forked the Observable notebook containing the animation component and created a version that saves every frame as a PNG file. I used the following command to turn the frame sequence into a video:

ffmpeg -framerate 60 -i anim_frames/035/%d.png -c:v h264_videotoolbox -b:v 5M anims/035.mp4

Have a coherent theme. I think one reason this project didn't get shared much was the lack of a coherent theme: the art was impressionist, the name was Pixel Plants, and the tone wasn't very consistent. The lead-in, while possibly amusing, also didn't set the tone for what came after. All of this made the project harder to understand; it's much better when everything pushes in the same direction.