Here are some useful queries I've created to help me figure out what is happening.
--Traffic Counts by UserAgent for the past 24 hours
SELECT Count(PageViewID) AS PageViews, UserAgent
FROM ac_PageViews
WHERE ActivityDate > DateAdd(hh, -24, GetDate())
GROUP BY UserAgent
ORDER BY Count(PageViewID) DESC
--Traffic Counts by UserAgent and RemoteIP for the past 24 hours -- useful for blocking bad-bot IPs at the firewall
SELECT Count(PageViewID) AS PageViews, UserAgent, RemoteIP
FROM ac_PageViews
WHERE ActivityDate > DateAdd(hh, -24, GetDate())
GROUP BY UserAgent, RemoteIP
ORDER BY Count(PageViewID) DESC
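Once a bad bot stands out, a narrower variation of the query above (my own sketch, reusing the same table and columns, with '%Amazonbot%' just as an example filter) pulls only that bot's IPs so they are easy to paste into a firewall rule:
--Distinct IPs used by one suspect bot in the past 24 hours -- handy for building a firewall block list
SELECT RemoteIP, Count(PageViewID) AS PageViews
FROM ac_PageViews
WHERE ActivityDate > DateAdd(hh, -24, GetDate())
and UserAgent like '%Amazonbot%'
GROUP BY RemoteIP
ORDER BY Count(PageViewID) DESC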
--Show PageViews from the last 2 hours, row by row -- useful for seeing exactly which URLs the bots are hitting
SELECT ActivityDate, UserId, UriStem, UriQuery, UserAgent, RemoteIP
FROM ac_PageViews
WHERE ActivityDate > DateAdd(hh, -2, GetDate())
--and UserAgent like '%Amazonbot%'
ORDER BY ActivityDate DESC
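If one bot dominates that detail view, a grouped variation (again just a sketch against the same table; swap the '%Amazonbot%' filter for whichever bot you are chasing) summarizes which URLs it hits hardest:
--Top URLs hit by a specific bot in the past 24 hours
SELECT Count(PageViewID) AS PageViews, UriStem
FROM ac_PageViews
WHERE ActivityDate > DateAdd(hh, -24, GetDate())
and UserAgent like '%Amazonbot%'
GROUP BY UriStem
ORDER BY Count(PageViewID) DESC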
For me, the top 10 results from the first query above are:
PageViews  UserAgent
67613      Mozilla/5.0 ... (Amazonbot/0.1; +https://developer.amazon.com/support/amazonbot)
18405      Mozilla/5.0 ... bingbot/2.0; +http://www.bing.com/bingbot.htm) Chrome/116.0.1938.76 Safari/537.36
11971      Mozilla/5.0 ... AppleWebKit/537.36 (KHTML, like Gecko) Chrome/97.0.4692.71 Safari/537.36
11838      Mozilla/5.0 ... AppleWebKit/537.36 (KHTML, like Gecko) Chrome/74.0.3729.169 Safari/537.36
11827      Mozilla/5.0 ... AppleWebKit/537.36 (KHTML, like Gecko) Chrome/74.0.3729.157 Safari/537.36
11810      Mozilla/5.0 ... AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36
6234       Mozilla/5.0 (compatible; DotBot/1.2; +https://opensiteexplorer.org/dotbot; help@moz.com)
5102       serpstatbot/2.1 (advanced backlink tracking bot; https://serpstatbot.com/; abuse@serpstatbot.com)
5086       Mozilla/5.0 (compatible; GeedoBot; +http://www.geedo.com/bot.html)
3802       Mozilla/5.0 (Linux; Android 7.0;) ... (compatible; PetalBot; +https://webmaster.petalsearch.com/site/petalbot)
I snipped the UserAgent strings above to improve readability in the forums.
So, 67,613 hits from Amazonbot is the main issue. After 11 hours, it still hasn't picked up the changes to robots.txt.
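One way to sanity-check that, assuming requests for /robots.txt even get logged to ac_PageViews (they may not, depending on how the page-view logging is wired up), is to look for the bot's most recent robots.txt fetches:
--Most recent robots.txt requests from Amazonbot -- only meaningful if static-file requests are logged
SELECT TOP 20 ActivityDate, UserAgent, RemoteIP
FROM ac_PageViews
WHERE UriStem like '%robots.txt%'
and UserAgent like '%Amazonbot%'
ORDER BY ActivityDate DESC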