Just to help out anyone else whose brain is turning to mush trying to figure this out:
I went to write a new post tonight and discovered that Hugo flat out refused to render it. Nothing I did would make Hugo display or serve up the new page. Digging further, Hugo wouldn’t convert the post using the conversion functions, and running --verbose or --debug showed that as far as Hugo was concerned, the new pages simply didn’t exist.
I was racking my brain for hours over this - checking and double-checking paths, ensuring file permissions were correct, removing old posts, attempting to disable caches - until I started stripping the front matter back field by field and discovered that if I left the date field off, the posts showed up.
It turned out that something in my config had changed since I created my last posts: Hugo now relies on the post’s timezone to determine when it should be published. I recently updated my version of Hugo to v0.54.0, and I also specifically altered the way my dates are generated in archetypes\default.md to match how I’d like them to be stored. One or both of those changes meant that Hugo went from generating new posts in a consistent timezone to generating them in my local timezone while treating them as UTC. This meant that posts were being silently ignored, but would have published without a problem 10:30 hours after the time I wrote them.
Now I’ve simply altered my archetype to include -0700 (the Go time-layout token for the numeric timezone offset) to tell Hugo to append my local TZ to new dates, and a hugo new posts/whatever.md now generates a file that shows up immediately when I serve the site.
In my archetypes\default.md file I’ve set date to:
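As a sketch, assuming YAML front matter and Hugo’s stock title line (the exact format string is an assumption, but the -0700 token is what appends the local offset):

```markdown
---
title: "{{ replace .Name "-" " " | title }}"
date: {{ now.Format "2006-01-02T15:04:05-0700" }}
draft: true
---
```

With this, `hugo new` stamps each post with a date like `2019-02-20T21:15:00+1030`, so Hugo no longer mistakes a local timestamp for UTC.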
Using Wordpress my grade was an F, but the change wasn’t enough in and of itself to change the grade at all. It turns out Mozilla is super persnickety about HTTPS security and focuses on your site’s Content Security Policy as one of its primary measures.
The CSP is not something I’d ever heard of before. Other sites gave my site a clean bill of health when I’d checked to see if my SSL certificate was doing its job, so I figured my site was safe. It turns out that browsers now support a Content-Security-Policy header that can tell the browser to ignore any potentially dangerous content that isn’t explicitly allowed by the site’s creator.
My ruleset (via Headers in .htaccess) looks something like the following:
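A sketch reconstructed along those lines, assuming Apache’s mod_headers (the specific hostnames here are illustrative guesses, not my actual ruleset):

```apache
<IfModule mod_headers.c>
  Header always set Content-Security-Policy "default-src 'none'; \
    frame-src https://www.youtube.com; \
    form-action https://api.staticman.net https://duckduckgo.com; \
    font-src https://fonts.gstatic.com; \
    img-src 'self' https://s3.amazonaws.com https://live.staticflickr.com https://www.gravatar.com; \
    script-src 'self' https://cdnjs.cloudflare.com; \
    style-src 'self' https://fonts.googleapis.com https://cdnjs.cloudflare.com"
</IfModule>
```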
default-src is the base level rule, and by setting it to ‘none’, we tell the browser to ignore anything that isn’t explicitly spelled out below.
frame-src is set to allow only youtube.com iframes (e.g. the embed in this post)
form-action only allows submitting forms to staticman.net for comments and duckduckgo.com for the search form on the front page
font-src is set to allow google fonts
img-src allows images from my amazon s3 bucket, Flickr, Gravatars, and an image for visitor statistics (using Matomo so your data isn’t going anywhere).
script-src allows Cloudflare-hosted JS because the theme I’m using pulls some libraries from there.
style-src allows CSS from googleapis.com and cloudflare, again for the theme.
By specifying ‘self’ for JS and CSS, and explicitly not using ‘unsafe-inline’, I’ve forced myself to move everything to self-hosted CSS and JS files instead of using inline styles on HTML elements or onclick handlers. From the Mozilla docs on the matter:
And with comments enabled, I want as much protection from XSS as possible.
Now The Geekorium scores a delightful A+ on the Mozilla Observatory, and a score of 125/100, which is the sort of ‘extra-credit’ number I’m looking for in my security.
One thing that moving away from WordPress means is that I can no longer publish on the go.
I mean, I never really did, but at least I had the option. Now to post I must be in front of my PC with the Hugo software installed and a copy of my repo. I could get the repo on any computer and even install Hugo if I needed to be elsewhere, but my home computer has the key to log into my server, so I’m not making it easy on myself.
I can, however, use a portable git client (I’m trying out FastHub for GitHub) and write my posts on the go, then tidy and publish them later.
I’m banking on the idea that reducing the barriers to writing will increase the number of posts that get published. We’ll see.
I’ve re-enabled comments here at The Geekorium, and imported all my old comments, so go nuts!
To import all my old comments, I used a script written by someone else, then passed the results through a dodgy PHP script I made myself to rename everything into the format my site relies on, so there might be shenanigans with the imported comments. Please let me know if anything seems off.
That leaves me with the next question: how do I ensure I don’t get flooded with spam? I’ve had comments back on for all of 2 days, and I already get a steady trickle of Pull Requests from the Staticman bot triggered by spam comments. On the Wordpress site I had Akismet turned on, which all but eliminated bad-faith comments for me, the way modern email clients almost never let the chaff through.
The simplest answer is Google’s reCAPTCHA - the latest version doesn’t even ask you to tick the “I’m not a robot” box, let alone click on thirteen boxes of street crossings. It’s a tempting solution, but it’s owned and operated by Google, and everything your users do on your website is captured for analysis. As spelled out in their documentation:
reCAPTCHA works best when it has the most context about interactions with your site, which comes from seeing both legitimate and abusive behavior.
reCAPTCHA learns by seeing real traffic on your site.
In a perfect world, Google would only use this data to improve the service. Maybe that’s all they’re doing, but I take my readers’ privacy seriously - more than my own - and I’m genuinely concerned about what Google is doing with this enormous corpus of user data captcha’d by these little blue boxes all over the web. They’re more pervasive than Facebook logins and social buttons, and unlike the earlier version, it’s no longer training robots to recognise trains or traffic lights; it’s training computers to recognise human behaviour.
It’s looking likely I’m going to have to palm user data off to someone to determine if they’re a robot or not. I’m not happy about it, but it appears to be the price unless I’m willing to sift through dozens of spam comments a day. It wouldn’t be so bad, except Git’s policy of keeping history means that the spam I receive is attached to my site’s repo forever, even if the comment never makes it here.
My final recourse is to try something that I’m guessing won’t work for long. Staticman has a feature that checks for valid form data. The check is basic enough that a field can be present in the data as long as it’s blank; if it has a value set, the submission immediately fails validation. So I’ve set a dummy field in the form that needs to be left blank. If a bot fills it in, the comment should get picked up and fail to submit. I’m not sure how long this will slow them down, but I’m going to give it a shot.
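A sketch of what that honeypot might look like in the comment form; the field names, the endpoint placeholders, and the way the field is hidden are all assumptions:

```html
<form method="post" action="https://api.staticman.net/v2/entry/USERNAME/REPO/BRANCH/comments">
  <input type="text" name="fields[name]" placeholder="Name">
  <textarea name="fields[message]" placeholder="Comment"></textarea>
  <!-- Honeypot: humans never see this field, so it stays blank;
       a bot that auto-fills every input gives it a value and fails validation -->
  <input type="text" name="fields[url]" style="display: none" tabindex="-1" autocomplete="off">
  <button type="submit">Post comment</button>
</form>
```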
I’ve also disabled the form on posts older than a month, so if you want to comment, do it now!
They say “imitation is the sincerest form of flattery”, and I do hope they’re right. I’ve been reading Rubenerd for the longest time, and his lovely minimal(ist) website built on Hugo has had me dying to try out the technology.
While there’s nothing wrong with Wordpress, I’ve always found it just a little too clunky for my tastes - and slow. That might be because I’ve always used it on shared hosting with less than optimised databases. The idea of a super fast and efficient text-only site is appealing.
So if you can’t tell the difference, today’s post (and all past posts) are now brought to you by Hugo, powered by Go.
I also used this as the excuse I needed to finally put the effort into dual booting Linux on my machine. I’m trying out Linux Mint, and I’m proud I actually got it working with Secure Boot. Starting out, my “flow” is to create a post in Markdown, then build the site and rsync it to the same location my old site was.
Please let me know if you notice anything funky. As usual I can be reached on Telegram, Discord, and just recently, Twitter. However, I’m aware that there are lots of posts that will not have survived the switch over without some… problems. I will get to them eventually.
The process of moving was interesting. All my posts in Wordpress were written in Textile which for years was my preferred markup language, but Textile turned out to be Betamax to Markdown’s VHS, or what Mercurial is to Git, or what Bitbucket is to Github, or what this sentence is to any other sentence.
The first step was to learn just enough Go to build the Go Wordpress Importer. This pulls all the posts out of a Wordpress Export XML file, then uses Pandoc to convert the HTML to whatever format you like. I built in the ability to toss in some extra Pandoc magic to convert from Textile to HTML then from HTML to Markdown.
From there, Hugo does most of the heavy lifting as long as you can find a theme you like that includes all the nice stuff you want included. I quite like Er but I’ve forked it as ooh-er for my own purposes.
The next step is to build comments back in. It’s something that Ruben has forgone - not for technical reasons, I believe - but I really enjoy the one or two I get occasionally. It’s not an easy problem to solve with a static site, but I think I’ll be leaning on Staticman to add comments into the GitHub repo. I found a slightly different script that also uses GitHub, but adds comments as “issues”. While appealing, I also want to ensure I’m not tied completely to GitHub for all time.
Let me know what you think of the changes. I’ll post more when I have comments up and running.
So I came in here to do a post about something completely different, but discovered that Wordpress has enabled their Gutenberg editor by default in the latest version of the software. It’s both enticing and scary to try something new, so I thought I’d give it a shot.
On the surface, I think it’s got some advantages for people who want to write pretty posts.
What’s immediately appealing is that everything is a block of “something” and you have to be very deliberate in what something you want that something to be. For example, if you want to insert a quote, you start a new paragraph and you select the “quote” block type and blammo, there’s your quote:
Which is something that for years I’ve thought was missing from all the nice GUI editors bundled with weblog software. I’ve had to deal with the source-code HTML fallout of websites written with WYSIWYG editors, and for the most part what you see on the front end might be what you get, but how you get it is usually some form of Lovecraftian horror on the back end, with tags embedded in tags like they’ve been involved in a transporter accident.
The ideal goal of a “block” powered editor in my mind would be to teach your users how to think in blocks, so that their HTML is structured and formatted from the get-go with the particular idiosyncrasies of that format in mind. I’m not sure if that’s what the authors of Gutenberg set out to accomplish, but it’s the best outcome I can think of for such a project.
Personally I gave up on WYSIWYG years ago because I wanted precise control over what I wrote, and didn’t want the editor inserting its ideas of how to output my thoughts. I began using the Textile markup language and have since dabbled a little in Markdown, and if I’m truly not getting the output I want, I switch to plain HTML. So my initial reaction to having Gutenberg thrust upon me was to immediately reach for the off switch.
As an aside - I wanted to write a quick footnote here, but by default Gutenberg does not appear to support them. I’m guessing there are plugins for this, or maybe a setting I’ve missed, but it doesn’t appear to be possible out of the box - something I cannot abide.
What I wanted to write as a footnote was that I did enable Gutenberg early as a plugin just to see what it was all about, but freaked out and turned it off immediately because change is awful and should never be tolerated. It’s possible that I left it turned on, and only thought I disabled it, but I’m pretty sure it’s turned on by default, and research is for chumps.
While I’m writing, I’m noticing what I’m going to presume is a bug that’s causing the cursor to reset to the top of the paragraph I’m writing every time the page auto-saves. This is annoying. It could be a setting or another plugin I have causing the issue though, so it may not happen to everyone.
In summary, what I’m hoping to find when I press publish is a concise and minimal HTML output on my final page. The block paradigm, and the beautifully crafted interface for building those blocks appeals to me on a technological level, and I truly hope that the Gutenberg idea sticks and is embraced by the Wordpress user base. While there appear to be some minor issues (that might be unique to my setup), the idea is sound and may go some way to improving the guts of the sites that use it, which is a win.
Addendum: Gutenberg is wigging out with my Textile plugin and adding an extra <br/> tag after every paragraph. Other than that, the output HTML is every bit as simple and elegant as I could have hoped for. I will need to find a resolution to the Textile/Gutenberg conflict some time, and it might simply be switching off Textile once and for all, but if you come here and the page still has giant empty space between paragraphs, you’ll know it’s not because of Gutenberg.
I took the kids and Mil to Cleland Wildlife Park today. It’s one of my favourite places in Adelaide, and I’ve made some fun memories with the kids and various grandparents over the last ten years.
Today I thought we’d do something different. We’ve got a year-long membership we’ve barely used in this last 12 month period, so entry is free, and it was such a lovely sunny autumn day, I thought it might be nice to just chill out and try a new role-playing game I’ve been wanting to play with the kids. So we spent the morning and early afternoon printing, coloring (I will always use the American spelling because computers don’t understand colours), cutting, and sticking, and had a go at the first campaign in Hero Kids, a really simple role-playing game (like Dungeons and Dragons).
This is the whole family’s first RPG - I’ve watched a couple of games in my time, but never participated - and my first attempt at being GM (Game Master).
We didn’t get very far - turns out stopping to explain rules and pat potoroos can eat into game time - but I think the kids had fun. Ammy played a healer, Evie played a rogue, and Merry played a warrior. Mil was a Warlock with water powers. The basic gist of the game is that the characters themselves are kids, so that the players can relate to them and get involved in the adventures.
I’m looking forward to putting more time into it. The kids all have great imaginations, so I think they’ll really take to it. And I had to promise to take them all back to Cleland soon because we barely got to see any animals this time.
I got in touch on Twitter and asked if he wanted my old C=128 because, although I’d held onto both my family’s old 64 and 128 for nostalgia reasons, having them sitting in a closet doing nothing seemed stupid. Rubenerd was glad to take it off my hands, and I’m glad it’s going to someone who clearly loves retro computers in a way I will never emulate.
My memories of this thing are playing Wizball to the wee hours with my mum, her elation when she finally clocked it, and playing Subsunk and Cosmonaut and River Raid and Curse of Sherwood and so many other games my young hands couldn’t master.
I was never much of a gamer, and I never did put in the hours to finish all those games I loved, but they did make me love computers and the potential behind them. I also wish I could say I’d programmed much on either machine, but copying out code from a book didn’t teach me much at the time. It did help me see what someone could do with the right mindset and training, though, and planted a seed for programming that took another 25 years to finally grow.
The 64 and 128 have reached a level of nostalgia in my mind that they possibly don’t deserve, but they’re the only couple of things from my childhood that I really remember clearly. That and the Chubbles, which were clearly more gimmick than substance. Oh, and the Rubik’s Magic I got one Christmas and couldn’t put down.
Today I finally got off my ass and posted it to Rubenerd. Shipping was surprisingly cheap because he didn’t need the vintage printer or the after-market disk drive I have for it.
I won’t mention the contents, and save that for a surprise for him, beyond the following:
A slightly rattly Commodore 128
A power supply that no longer works, but that Rubenerd assures me he can fix or replace
I would love it if Rubenerd could post an unboxing when he gets it. I left a couple of things in there that I hope he doesn’t mind receiving and having to store somewhere. And I’d love it if he could add it to The Fleet when he gets it working, with a suitably cute name. Enjoy it, learn new things on it, and share!