By default, the robots.txt file contains the following entry:
User-agent: *
# ...
Disallow: /u
This configuration blocks the image used for Twitter cards. Checking the cards of several forums in Twitter’s validator, the logo is not displayed and the following warning is shown:
The image URL forum.[…].org/uploads/[…].png specified by the ‘twitter:image’ metatag may be restricted by the site’s robots.txt file, which will prevent Twitter from fetching it.
The problem is that the path /uploads also starts with /u, and robots.txt rules match by prefix, so the Disallow: /u rule blocks /uploads as well.
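To illustrate the prefix matching, here is a small sketch using Python’s standard urllib.robotparser (the hostname forum.example.org and the file name logo.png are only placeholders): Disallow: /u blocks /uploads, while Disallow: /u/ does not.

```python
from urllib.robotparser import RobotFileParser

def allowed(rules: str, path: str) -> bool:
    """Check whether a crawler obeying the given robots.txt rules may fetch path."""
    parser = RobotFileParser()
    parser.parse(rules.splitlines())
    # The hostname is only illustrative; matching is done on the path.
    return parser.can_fetch("Twitterbot", "https://forum.example.org" + path)

current = "User-agent: *\nDisallow: /u"   # default rule, no trailing slash
fixed = "User-agent: *\nDisallow: /u/"    # rule with trailing slash

print(allowed(current, "/uploads/logo.png"))  # False: /uploads matches the /u prefix
print(allowed(fixed, "/uploads/logo.png"))    # True: /uploads does not start with /u/
print(allowed(fixed, "/u/some-user"))         # False: user pages are still blocked
```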
I know that the robots.txt file can be overridden, but I think it would be good to change the default. Unfortunately, I don’t understand the code well enough to create a pull request.
When overriding it, keep in mind that Twitter checks the robots.txt file less often than the actual HTML of the page, so it takes a few hours for the change to get picked up.
Yes, at least it makes the Twitter cards work. I hope that it does not have unintended side effects.
You can enable the setting “allow index in robots txt” and click “override robots.txt”. Then look for the line Disallow: /u and add a slash at the end, so it becomes Disallow: /u/. It will take Twitter a few hours to pick up the change.
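For reference, the overridden entry would then look like this (the rest of the file, elided above with # ..., stays unchanged):

```text
User-agent: *
# ...
Disallow: /u/
```

With the trailing slash, only paths under /u/ (the user pages) are excluded, so /uploads is no longer caught by the prefix match.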
Looks like this was a problem in the past (see this thread) and then got re-introduced.
In this commit, the robots.txt rules with a trailing slash were removed because they were thought to be no longer needed. Later, in this commit, they were re-introduced, but without the trailing slash.