Well, I had some downtime today and decided to smash out a small script to mirror the static maps from Google that I'm using, because frankly Google don't render them fast enough for my liking (he says, tongue-in-cheek, chortling). The script itself is pretty ugly, but at this point it does the job:

grep -r 'Location:' fwaggle.org/content/ | cut -d ':' -f 3- | sed -e 's/^[[:space:]]*//' | sort | uniq | while read -r loc; do
    lochash=$(echo -n "$loc" | sha256)
    filepath="files.fwaggle.org/maps/${lochash}.png"
    if [ ! -f "$filepath" ]; then
        echo "Mirroring map for $loc"
        wget -o /dev/null "https://maps.googleapis.com/maps/api/staticmap?center=$loc&size=240x110&zoom=3&maptype=roadmap&markers=color:blue%7Clabel:S%7C$loc" -O "$filepath"
        sleep 5
    fi
done

The sleep 5 is probably not strictly necessary, but I added it anyway to make sure I don't get banned from Google Maps. If you can't read the above: I take the SHA-256 hash of each location, and if I don't already have an image with that name, I grab a copy of the map. This script runs before my images are rsynced up to my files.fwaggle.org domain.
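For the curious, the filename scheme is just the SHA-256 hex digest of the raw location string; the same derivation sketched in Python (the location here is only an example):

```python
from hashlib import sha256

# Mirrors the shell pipeline: hash the location string with no
# trailing newline (hence echo -n in the script) and append .png.
loc = "Horsham, VIC, Australia"  # example location
filename = sha256(loc.encode("utf-8")).hexdigest() + ".png"
print("files.fwaggle.org/maps/" + filename)
```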

How to get those into Pelican, though? That turned out to be a bit more involved, as Pelican's Jinja templater doesn't ship any sort of hashing filter by default. Happily it's not terribly difficult to write a filter plugin for Pelican, so I did that. I'll probably open-source it properly outside of my site's source, but for now here's the entirety of the plugin's plugin/hash/__init__.py:

from pelican import signals
from hashlib import sha256

def hash_256(input):
    # Jinja hands us a str; hashlib wants bytes, so encode first.
    return sha256(input.encode('utf-8')).hexdigest()

def add_filter(pelican):
    pelican.env.filters.update({'hash_256': hash_256})

def register():
    signals.generator_init.connect(add_filter)
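To actually load it, Pelican needs pointing at the plugin directory. Assuming the layout above (plugin/hash/__init__.py), a couple of lines in pelicanconf.py do the trick:

```python
# pelicanconf.py (excerpt): register the plugin with Pelican.
# Assumes the plugin lives at plugin/hash/__init__.py as above.
PLUGIN_PATHS = ['plugin']
PLUGINS = ['hash']
```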

Then the template code is super simple:

<img src="//files.fwaggle.org/maps/{{ article.location | hash_256 }}.png" alt="{{ article.location }}" />
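The filter itself is plain Jinja, so you can sanity-check it outside Pelican entirely; a minimal sketch, assuming jinja2 is installed:

```python
from hashlib import sha256
from jinja2 import Environment

def hash_256(input):
    return sha256(input.encode('utf-8')).hexdigest()

# Register the filter on a bare Jinja environment, the same way
# the plugin registers it on Pelican's, then render a template.
env = Environment()
env.filters['hash_256'] = hash_256
tmpl = env.from_string("{{ 'Horsham, VIC, Australia' | hash_256 }}.png")
print(tmpl.render())
```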

After making some other tweaks (like locally mirroring one of the fonts I use, because fonts.gstatic.com was also slow to respond), Pingdom thinks I have the fastest website they've ever seen (rounding errors on percentages notwithstanding):

[Pingdom screenshot: "Faster than 100% of websites!"]

That's the entire page load time, from a hot cache at Cloudflare. Results from their other monitoring locations are similar, as long as Cloudflare's cache is still active; I abuse the shit out of their page rules to make sure that's the case as often as their free plan will let me. Of course, on a cold cache the times get progressively worse the further from Sydney you get. I like the fact that I can't tell it apart from my slow-ass local server most of the time!

Update - 2018-04-20: I slightly modified the plugin code, pushed it to GitHub and imported it as a submodule just on the off chance it's useful to someone else.

Horsham, VIC, Australia fwaggle
