Hmm, not sure why. Here’s what our robots detection regex looks like:
def self.crawler?(user_agent)
  !/Googlebot|Mediapartners|AdsBot|curl|Twitterbot|facebookexternalhit|bingbot|Baiduspider|ia_archiver/.match(user_agent).nil?
end
According to this page, the correct Internet Archive user agent is
However, when I save something like whatsmyuseragent.com with the “Save Page Now” button on the Internet Archive’s Wayback Machine
… I get back
User-agent: Mozilla/5.0 (Windows NT 6.3; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.95 Safari/537.36 (via Wayback Save Page)
It looks like the Save Page Now button somehow drives your own browser to issue the save request, which results in oddities like this.
So … uh, I guess? I’ll just add “Wayback Save Page” to the user agent detector so that counts as a crawler, too.
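In other words, the change is just one more alternative appended to the existing regex. A minimal sketch of what that looks like (the `CrawlerDetection` module name here is hypothetical, just to make the snippet self-contained):

```ruby
module CrawlerDetection
  # Same regex as before, with "Wayback Save Page" appended so that
  # requests proxied through the Save Page Now button are treated as crawlers.
  def self.crawler?(user_agent)
    !/Googlebot|Mediapartners|AdsBot|curl|Twitterbot|facebookexternalhit|bingbot|Baiduspider|ia_archiver|Wayback Save Page/.match(user_agent).nil?
  end
end
```

Because `match` does a substring search, the trailing “(via Wayback Save Page)” suffix in the otherwise ordinary Chrome user agent above is enough to trip the detector.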