The making of this blog with Pelican
My personal blog has had two previous incarnations, built on somewhat different technologies, but both of them required a database and server-side scripting. This time I decided to ditch all that and use static website generation.
Static sites
A static website consists of plain HTML files that don't require any database or server-side scripting like PHP to work. The HTML files may link to images, JavaScript, downloads and whatever else is necessary, but no processing should be required of the web server beyond serving the plain files.
The website files are created in a process similar to compiling and deploying software. A compiler (in this case Pelican) outputs the HTML and associated files from source material such as Markdown texts and theme templates, and a deployment script pushes it all to a web server when it's ready to publish. Software development methodology meets publishing! No wonder all the geeks love compiling their websites (while everyone else keeps on trucking with WordPress..)
The model has a big security advantage, since there is no server-side functionality with potential security flaws to be hacked. On the other hand, the static nature of the website means fewer options for dynamically created content. All interactivity has to be done through JavaScript; the Talkyard comments on this site are an example of that.
A critique of the WordPress work flow
The second version of Lars Electric' Endeavors was powered by WordPress. I didn't like the typical WordPress work flow, where you create the content in a browser-based user interface behind the admin login. It's annoying to be greeted by update and warning notifications after logging in, when you actually wanted to get work done.
That work flow meant double work for me. I want a local copy of my material, so I would start by writing content in local files and then go through the annoying process of copy/pasting it into the browser editor, fixing the formatting, and uploading and linking the images. The local and public copies of the material would then easily get out of sync whenever something had to be updated.
With a static website generator, all of those issues go away. The source material directly generates the website in an automated process.
How this site was made and why Pelican
There are a couple of other popular static website generators. I got started with Pelican first because of its easy pip install pelican installation, which results in a fresh stable version, and because it uses Python, which I am somewhat familiar with. It's not that Pelican is without issues, and it is likely that some of the other generators could have provided a smoother experience, but after investing significant time and being rather satisfied with the result, I decided to stick with Pelican.
I was surprised, however, by the amount of work it took to get the theme adjusted to my liking. After trying many of the themes available here, I settled on the tuxlite_tbs theme, started digging into how it worked and began modifying things. The theme, as it is now, is kind of hacky and likely doesn't follow best practices at all. But with a clean separation between theme and content, that doesn't feel so bad, as the theme can always be improved in the future, as long as the important URLs that others might link to stay stable.
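For reference, trying out a theme comes down to pointing the site config at a theme directory. A sketch, assuming the community pelican-themes collection is cloned locally; both paths here are assumptions, not this site's actual layout.

```shell
# Fetch the community theme collection (many themes live in that repo as
# submodules, so a full checkout may need --recursive), then point the
# Pelican config at one theme. Paths are illustrative assumptions.
git clone --depth 1 https://github.com/getpelican/pelican-themes /tmp/pelican-themes
printf "THEME = '/tmp/pelican-themes/tuxlite_tbs'\n" >> pelicanconf.py
```

Since the theme is just a directory of templates and static assets referenced from pelicanconf.py, swapping or forking one doesn't touch the content at all.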
Traffic analytics and respect for the user
To maximize loading speed and reliability, and out of respect for the readers' privacy and data consumption, I've made an effort to avoid off-site dependencies. There is no gossiping going on to Google, no Facebook plugin, no Twitter bootstrap.js fetch, no jQuery download and no CPU-intensive JavaScript mouse pointer tracking madness going on.
I only need visitor counts and referrer analysis, and for that it's really not necessary to use third-party spying.. or analytics. The web server already gets the required information while handling the HTTP request, so it would be overkill and totally unnecessary to let Google or some other analytics provider in through the back door to gather statistics.
The open source tool AWStats, which works from the web server's logs, was available out of the box with my web host, so that's what I'm using.
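To illustrate how much a log-based tool like AWStats already has to work with, here is a toy version: two fabricated lines in the common "combined" log format, and an awk one-liner tallying referrers. The log lines, addresses and filename are all made up for the example.

```shell
# Two fabricated access-log lines in the "combined" format
# (IP, identd, user, timestamp, request, status, bytes, referrer, user agent).
printf '%s\n' \
  '198.51.100.7 - - [10/Oct/2023:13:55:36 +0000] "GET /pelican.html HTTP/1.1" 200 5120 "https://example.org/links" "Mozilla/5.0"' \
  '203.0.113.9 - - [10/Oct/2023:14:02:11 +0000] "GET /pelican.html HTTP/1.1" 200 5120 "https://example.org/links" "Mozilla/5.0"' \
  > demo-access.log
# Splitting on double quotes puts the referrer in field 4; tally per referrer.
awk -F'"' '{ hits[$4]++ } END { for (r in hits) print hits[r], r }' demo-access.log
# → 2 https://example.org/links
```

Everything a third-party analytics script would phone home about (page, time, referrer, user agent) is already sitting in that log.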
Comments
My first thought was to add a Disqus comment widget to articles, but I soon realised it might result in a ton of network requests, many of them for tracking purposes. That would totally negate the efforts described above. The worst downloads might be avoidable by paying for an ad-free plan at Disqus, I don't know, but I decided to look for alternatives.
After some research, I decided to go with Talkyard. They provide a hosted comment service at a very reasonable price, and there doesn't seem to be any suspicious network activity going on in the comment embed.
Declouding - thoughts
I think the use of a static website generator is in line with the idea of declouding: it allows the creator to fully own and control the content and use the web host only for the simple task of hosting plain files. That makes the web host easily replaceable, in contrast to the situation where the creator commits all the content to the database of some blogging service.
I hope you enjoyed this content!