In this post, I will discuss the humans.txt initiative and point New Leaf Journal readers to our own humans.txt file (see the official website of the humans.txt initiative). But before going into our own file in depth, we must understand what humans.txt files are.

The humans.txt initiative arose in response to the ubiquitous robots.txt file, which is present on most websites with ambitions to appear in search engines. Thus, let us first examine what a robots.txt file is.

What is a robots.txt File?

Many websites – including The New Leaf Journal – have what is called a robots.txt file. As the name suggests, the robots.txt file is a text file in a website’s root that is designed to be read and interpreted by robots. By “robots,” we mean bots that visit, scan, and process a website. Major search engines with their own indexes – such as Google and Bing – use robots to scan websites and build their search indexes.

The robots.txt file exists to provide rules for robots visiting a website. It can also assist search engines in indexing a website (e.g., the robots.txt file can point robots to the website’s sitemap). Another important purpose of robots.txt is to tell robots what they are not allowed to do. For example, the file may tell certain robots (or all robots) not to crawl the site at all. More commonly, it may tell robots to crawl one part of the site but not another. Webmasters can also use robots.txt to set different rules for different robots. For example, a webmaster could set one set of rules for Google’s bots and a different set of rules for Bing’s bots.
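To illustrate, here is a minimal robots.txt sketch (the bot names, paths, and URL below are placeholders rather than our actual rules):

# Point crawlers to the site's sitemap
Sitemap: https://example.com/sitemap.xml

# Let Google's crawler visit everything except one directory
User-agent: Googlebot
Disallow: /private/

# Tell a hypothetical bot to stay away from the site entirely
User-agent: ExampleBot
Disallow: /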

With respect to the humans.txt initiative, the key point to understand about the robots.txt file is that it is written for robots. To be sure, you can often find a website’s robots.txt file by appending /robots.txt to the site’s base URL. But the file is unlikely to provide useful information to ordinary visitors. Its purpose is practical in nature – to provide guidance to visiting web crawlers.

What is a humans.txt File?

The humans.txt file, like the robots.txt file, is a text file placed in a website’s root directory. If a website has a humans.txt file, it is generally available at the site’s root domain followed by humans.txt. For example, our humans.txt file is available at the following URL:

https://thenewleafjournal.com/humans.txt

Unlike the robots.txt file, the humans.txt file does not serve a practical purpose. The idea behind it is quaint and charming instead of utilitarian. Fortunately, the founders of the humans.txt initiative provided a careful point-by-point explanation of what they were trying to accomplish on their website. Let us explore.

What is the Purpose of the humans.txt File?

The humans.txt initiative describes the humans.txt movement as follows:

It’s an initiative for knowing the people behind a website. It’s a TXT file that contains information about the different people who have contributed to building the website.

Thus, the humans.txt file is written for humans. As envisioned by the founders of the humans.txt initiative, it tells human visitors about the humans who created and/or contributed to the website to which the humans.txt file is attached. The humans.txt website also ties the concept to the common robots.txt file:

We are always saying that [the internet is for humans], but the only file we generate is one full of additional information for the searchbots: robots.txt. Then why not doing one for ourselves?

However, the initiative offers only a suggested format; the exact composition of a humans.txt file is up to the imagination of each site. You can see the suggested standard in action in the humans.txt file for the humans.txt initiative’s own website. That website states that “you are free to add any information you want.”
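For a sense of what that suggested standard looks like, here is a short sketch loosely modeled on it (the names and details are placeholders, not the contents of our actual file):

/* TEAM */
Editor: A. Writer
Site: https://example.com
Location: Brooklyn, NY

/* THANKS */
Name: A. Friend

/* SITE */
Last update: 2022/03/01
Standards: HTML5, CSS3
Software: WordPress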

Finally, it is worth noting that the humans.txt initiative did not come up with the idea in order to replace anything else. Instead, “humans.txt is just a way to have more information about the authors of the site.”

Learning About humans.txt

I learned about humans.txt while conducting WordPress research at Perishable Press, the website of WordPress guru Jeff Starr. We use several plugins from Mr. Starr, including Blackhole for Bad Bots, which is designed to give our robots.txt file some teeth. I covered a non-WordPress project of Mr. Starr’s last year – the Wutsearch Search Engine Launchpad.

Mr. Starr described his evolving views on the humans.txt file:

Years ago, I thought the whole humans.txt thing was just silly, and even explained how to block humans.txt requests. But the concept has actually grown on me to the point where I now include a customized humans.txt file for most of my projects. It just seems like some useful information to make available for those who are looking for it.

Jeff Starr

Mr. Starr offered his own humans.txt file, which you can find here, as an example. His template is far more robust than the suggested template on the humans.txt project’s website – and the format I am using for The New Leaf Journal is closer to Mr. Starr’s. He discussed designing humans.txt files:

One thing I enjoy about writing humans.txt files is the flexible structure. I mean, you can add anything you want: names, URLs, even ASCII art, if you’re bold enough.

This point also clearly distinguishes the humans.txt file from the robots.txt file. While one could well create an unusual robots.txt file, its directives must conform to a set syntax so that robots can understand them. The humans.txt file is not bound by such limitations, since its intended audience is human readers.

In my view, the suggested humans.txt templates on the humans.txt website are far too bare. For some projects, however, those templates may work. For others, something closer to Mr. Starr’s or The New Leaf Journal’s may be appropriate. The computer-art inclined can even include ASCII art masterpieces in their humans.txt files.

The New Leaf Journal’s humans.txt File

I added a humans.txt file to The New Leaf Journal in February 2022, and in early March I added a button to our site that links to the file. Our humans.txt file is still a work in progress – I expect to modify it a bit over time – but it is a permanent part of our site going forward.

My Thoughts On humans.txt

Mr. Starr suggested one practical reason to publish a humans.txt file – namely, that some people will look for it. Another potential point in its favor is that publishing one requires access to the website’s file system. On the whole, however, the humans.txt file is not particularly useful. A website can publish far more robust information about itself and its human staff on the site proper (I dare say my author page and feeds page are more detailed than our humans.txt file). Moreover, Mr. Starr noted in an earlier article that webmasters can redirect requests for a site’s humans.txt file to a different location.
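As a sketch of what such a redirect might look like on an Apache server (the destination URL is a placeholder, and this is not necessarily how Mr. Starr handles it), a single line in .htaccess would suffice:

# Send humans.txt requests to an about page instead
Redirect 301 /humans.txt https://example.com/about/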

However, separate and apart from utility, I like the concept behind the humans.txt file. One complaint about many of the websites that clutter the top spots for queries sent to Google and Bing is that the content on those sites is written for search engines – i.e., robots. There is a fine line between writing content for humans that also performs well in search engines and writing content for consumption by search engines first. It is possible to try to do well with search engines while producing humane content. It is not possible to write primarily for bots and produce humane content.

The values behind the humans.txt file conform well to those I discussed in my article about the small web and humane design. Publishing a thoughtful humans.txt file not only provides information about a site and the people behind it, but also shows a commitment – albeit in a small way – to producing content that is amenable to human readers first.

Our new humans.txt file fits in with the values of The New Leaf Journal in the same way as our new sitemap. Sitemaps list all of the content on a website. Their purpose is generally to make it easy for bots to find and crawl content on sites – and they are often referenced in robots.txt files. While our sitemap’s primary function is to make the lives of bots easier, I used a new tool to create a sitemap that is also readable by humans. You can now enjoy our sitemap just as much as Googlebot does.

While a humans.txt file is not necessary for any website, I encourage webmasters who write human-first content to consider adding one and making their readers aware of it, much as we have now done at The New Leaf Journal. Perhaps the humans.txt file can serve as a sort of badge for humane writing websites around the web.