
by Mountaineerbr

#6 - More Web Master Resources


I reckon collecting links is a good reason in itself for having a website. I started a new column on the home page, called The Fool Environmentalist, which is meant to be a list of links with very short comments on some important point. Google News always sends me good and bad articles that I should be keeping for personal reference. Internet news is easy to read but very hard to keep track of and organise. Google does not help, and sometimes even Firefox doesn't!

The Guardian has been very disappointing for the last decade. Too many stories about how humanity is headed for total environmental disaster. Generally speaking, and not only at The Guardian, what they completely mess up is the sense of cause and effect. Human beings live on Earth because the environment has provided all the conditions for our development and survival at this precise moment, so we are actually an effect of natural forces. To think that we are the cause and source of the natural order is very naive. We may well be, but it seems far more likely that what little humans can do is owed to the workings of a much greater order.

And here, order simply means physical laws. Indeed, order can emerge from chaos, and the two are not really different things. We may talk more about complex systems in due time.


I rewrote a shell script that deals with updating the blog section of the website. I found a very useful sed command (from Stack Overflow) which allowed me to write simple HTML template pages and merge them with the actual post content.

$ sed '2 r file2.txt' file1.txt

The first sed command adds the text from file2.txt into file1.txt after a specific line (in this case, line 2). The following commands are more interesting, as they add the text from file2.txt after a matched pattern. file2.txt can even be /dev/stdin! Also, if you want to modify the target file in place, use the -i flag:

$ sed '/^PATTERN/ r file2.txt' file1.txt

$ sed -i '/^PATTERN/ r /dev/stdin' file1.txt
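
For context, here is a rough sketch of how such a template merge can work; the file names and the marker comment are just illustrative, not the actual script:

# template.html contains a marker line "<!--POST-->" where the content goes
$ sed '/^<!--POST-->/ r post-content.html' template.html > blog/post.html

# or feed the post content in through standard input
$ cat post-content.html | sed '/^<!--POST-->/ r /dev/stdin' template.html > blog/post.html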

I have been working on the HTML and CSS code of the website for the past few days. Yesterday I took some code snippets from W3 and made the home page responsive. It uses Flexbox, which, by the way, is not compatible with IE 10 and earlier. I also installed various web browsers available in the Arch Linux official repos.
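
As a quick sketch, the lighter browsers I mention below can be pulled in with pacman (the package names are my assumption of the current Arch packaging):

$ sudo pacman -S w3m links elinks netsurf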

It is a good idea to check your website with all of these browsers. I personally like using W3M, though I am not that familiar with all of its shortcuts yet. On the other hand, Links and ELinks are easy to use and even have GUI versions! They are useful for navigating many websites, though not all.

Especially useful is the Netsurf browser, which uses a simple engine and renders roughly like a version of IE 10 or thereabouts.

You can test your website with a different browser (cross-browser testing) at Browserling, too.

[Valid CSS logo]

It is also a good idea to check the code with an HTML validator and a CSS validator. They give you a code snippet to place a validator logo on your website; however, if you validated the HTTP version of the site, the generated snippet will make web browsers complain about insecure (mixed-content) images when the page is accessed over HTTPS. On Linux, I found a very good HTML linter called tidy, available in the extra repo of Arch Linux. A good CSS linter, albeit harder to use and requiring some manual configuration, is Stylelint, available in the community repo; check the stylelint-config-standard package, too. Otherwise, install it via npm and check the package instructions.
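
Both linters can be driven from the command line. A minimal sketch, assuming tidy and Stylelint with stylelint-config-standard are already installed, and with illustrative file names:

# HTML: print errors and warnings only, without rewriting the file
$ tidy -quiet -errors index.html

# CSS: a minimal configuration extending the standard ruleset
$ echo '{ "extends": "stylelint-config-standard" }' > .stylelintrc.json
$ stylelint 'css/*.css'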


Lastly, you can check website performance with GTmetrix, a tip from Chris Titus Tech. It seems that one of the most important optimisations you can make, if your webpage is too large, is reducing image sizes. My home page weighs approximately 200 KB with images as of today.
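
One simple approach is to resize and recompress images before uploading them; a sketch with ImageMagick, where the dimensions and quality are just example values:

# shrink JPEGs in place to at most 800px wide, moderately compressed
$ mogrify -resize '800x>' -quality 80 img/*.jpg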


PS: these web browser add-ons are almost essential webmaster tools: check Web Developer, which adds a toolbar button with various web developer tools, and HTTP Header Live, which displays the HTTP headers.
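
If you prefer the command line, curl can show the same response headers; a quick sketch against a placeholder address:

$ curl -sI https://example.com/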