Robotto is packed with features designed to make your everyday working life easier and more efficient, regardless of how many domains you manage.
Rest assured – Robotto constantly monitors the following elements and alerts you to potential issues as soon as they happen.
Robotto performs regular HTTP response checks throughout the day to detect changes for every domain in your account. As soon as a change is detected, you’ll receive an email alert detailing the current and previous HTTP status – enabling you to swiftly review the issue and resolve it appropriately.
- Is the domain correctly serving a 200 status?
- Should it be redirecting elsewhere using a 301 or 302?
- Has there been any server downtime causing 500 responses?
Robotto captures and records crucial information relating to the server responses including:
- HTTP response codes
- Date of the response change
- Time of the response change
- Duration between changes
You can also review the full HTTP response code and message, along with a complete archive of HTTP responses recorded over time, enabling you to assess web server performance history and quantify historic performance dips.
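The check-and-alert cycle described above can be sketched in a few lines of Python. This is an illustrative outline, not Robotto's actual implementation – the function names are ours, and a real monitor would persist statuses between runs:

```python
import urllib.error
import urllib.request

def fetch_status(url, timeout=10):
    """Return the HTTP status code for a URL (illustrative helper)."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code  # 4xx/5xx responses still carry a status code

def detect_change(previous_status, current_status):
    """Report a status change in the style of Robotto's email alerts."""
    if previous_status != current_status:
        return f"Status changed: {previous_status} -> {current_status}"
    return None
```

Running `detect_change` against the stored status from the last check is what makes a transition such as 200 → 500 visible immediately rather than at the next manual review.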
The robots.txt file is a powerful tool that helps webmasters control which sections of a website search engine crawlers are allowed to crawl, index and feature in their search results pages.
Incorrect markup or accidental changes to the live website’s robots.txt file can have a negative impact on search engine performance, and Google warns that it may pause crawling if a robots.txt file cannot be found. Another common problem occurs when a test environment’s robots.txt file is accidentally transferred to the live website, instructing search engines not to crawl the entire site.
Robotto monitors the robots.txt HTTP status of every domain in your account and will alert you immediately if the contents change.
A record of every change made to each of your domains’ robots.txt files is stored in Robotto, so you can swiftly and easily review a full history of changes, including the date/time of each change and a comparison view of the current and previous robots.txt files.
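Detecting a content change and producing a comparison view of this kind can be done with nothing more than Python's standard library. A minimal sketch, assuming the previous file is stored as a hash and the full text is kept for diffing:

```python
import difflib
import hashlib

def robots_fingerprint(contents: str) -> str:
    """Hash the robots.txt body so changes can be detected cheaply."""
    return hashlib.sha256(contents.encode("utf-8")).hexdigest()

def has_changed(stored_fingerprint: str, new_contents: str) -> bool:
    """True when the live robots.txt no longer matches the stored copy."""
    return robots_fingerprint(new_contents) != stored_fingerprint

def comparison_view(previous: str, current: str) -> str:
    """Unified diff of the previous and current files, like Robotto's
    side-by-side history view (presentation here is simplified)."""
    return "\n".join(difflib.unified_diff(
        previous.splitlines(), current.splitlines(),
        fromfile="previous", tofile="current", lineterm=""))
```

Hashing keeps the per-check cost constant, while retaining full copies makes the historical diff possible.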
www vs Non-www
To avoid potential duplicate content issues, your website should be set up and consistently served from only one canonical version of a domain name, i.e. www.example.com or example.com.
Either version is valid; it’s simply a matter of preference. Whichever format you select as the canonical version of your domain, the alternative form should be permanently (301) redirected to it.
Testing both forms in a browser can look correct on the face of it, with one redirecting to the other; however, a temporary (302) redirect may have been set up instead. Robotto automatically checks both versions and logs the HTTP status for your reference, allowing you to review the setup and take action if required. You will also be alerted to any future changes as they happen.
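The reason a browser test is not enough is that browsers follow redirects silently, hiding whether a 301 or 302 was used. A sketch of checking both forms without following redirects – the helper names are illustrative, not Robotto's API:

```python
import urllib.error
import urllib.request

class _NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # stop urllib from following the redirect

def check_canonical_pair(domain):
    """Fetch both the www and non-www forms, recording the raw status
    and the Location header of any redirect."""
    opener = urllib.request.build_opener(_NoRedirect)
    results = {}
    for host in (f"www.{domain}", domain):
        try:
            resp = opener.open(f"http://{host}/", timeout=10)
            results[host] = (resp.status, None)
        except urllib.error.HTTPError as err:
            # With redirects suppressed, a 301/302 surfaces as an HTTPError
            results[host] = (err.code, err.headers.get("Location"))
    return results

def is_permanent(status):
    """A 301 is the permanent redirect you want; a 302 is only temporary."""
    return status == 301
```

One form should return 200 and the other a 301 pointing at it; a 302 in that second slot is exactly the subtle misconfiguration the browser test misses.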
If your website has been identified as a potential malware threat by Google, the following warning message appears beneath the title of search results:
Clicking the search result listing will then display a page with another warning message rather than taking you directly to the webpage:
“Warning – visiting this web site may harm your computer!”
These warnings can have a significant impact on your organic website traffic and should be addressed as a matter of urgency. Robotto enables you to monitor all your domains for malware alerts quickly and efficiently.
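One publicly documented way to check URLs for malware flags is Google's Safe Browsing Lookup API (v4). The source does not say Robotto uses this API – the sketch below simply builds a valid request body for its `threatMatches:find` endpoint; sending it still requires an API key and an HTTP POST:

```python
def safe_browsing_payload(urls):
    """Build a request body for the Safe Browsing v4 threatMatches:find
    endpoint. Field names follow Google's public API documentation;
    the clientId value is a placeholder."""
    return {
        "client": {"clientId": "robotto-example", "clientVersion": "1.0"},
        "threatInfo": {
            "threatTypes": ["MALWARE", "SOCIAL_ENGINEERING"],
            "platformTypes": ["ANY_PLATFORM"],
            "threatEntryTypes": ["URL"],
            "threatEntries": [{"url": u} for u in urls],
        },
    }
```

An empty `matches` field in the API response means no threat was found for the submitted URLs.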
When managing a large portfolio of domain names, registrations and renewals rarely coincide to keep things simple. Some domains might have been registered for a period of years whilst others are annual renewals, so keeping track can become a laborious task.
Robotto streamlines the activity for you by automatically capturing all domain renewal dates and notifying you 30 days in advance of a renewal deadline. Subsequent reminders are then sent at 15 days, 7 days and 2 days before the renewal deadline, allowing you plenty of time to renew and retain your prized domain assets.
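The 30/15/7/2-day reminder schedule described above is simple date arithmetic. A minimal sketch:

```python
from datetime import date, timedelta

REMINDER_DAYS = (30, 15, 7, 2)  # the schedule described above

def reminder_dates(renewal_deadline: date) -> list[date]:
    """Return the dates on which renewal reminders should fire,
    earliest first."""
    return [renewal_deadline - timedelta(days=d) for d in REMINDER_DAYS]
```

For a domain renewing on 30 June 2024, reminders would fire on 31 May, 15 June, 23 June and 28 June.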
Top Traffic Sources
The top 10 traffic sources are monitored for every domain with a Google Analytics profile authorised in Robotto. You can quickly visualise changes in your traffic sources across all your domains in a single view organised by volume.
The challenge of monitoring traffic across many domains is eliminated.
Robotto empowers you to view all of your domains’ organic and non-organic visit metrics, plus the corresponding trends over time, from within one screen.
You can quickly scan the trend graphs for your full portfolio of domains to efficiently identify where there have been negative impacts that require further inspection. Review week on week metrics including weekly traffic volumes from Google Analytics, percentage change and relative percentage changes.
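The week-on-week percentage change figure mentioned above is the standard relative-change calculation. A small sketch, with a guard for weeks that recorded no visits:

```python
def pct_change(previous, current):
    """Week-on-week percentage change; None when the previous week
    had no visits (the change would be undefined)."""
    if previous == 0:
        return None
    return (current - previous) / previous * 100.0
```

So a domain going from 1,000 visits last week to 1,200 this week shows a +20% change; a drop to 800 would show −20%, flagging it for closer inspection.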
Similar to the traffic metrics, you can review weekly bounce rate, pages per visit and time on site data, alongside week-on-week percentage changes and trend graphs for every domain in a single screen. A quick visual scan of the trend graphs across your full portfolio of domains enables you to efficiently identify negative impacts that require your attention.
Page Load Time
Page load time is becoming increasingly important for both users and search engines alike. As broadband speeds continue to increase, web users’ patience seems to decrease in line, and if your website doesn’t deliver a fast and efficient user experience, chances are that users will look elsewhere to satisfy their objectives.
Page load time is certainly a ranking factor for Google and Robotto helps you monitor load times and quickly assess performance across your full domain portfolio. Review weekly load time performance metrics, week on week percentage change data and trends across:
- Server Response Times
- Server Connection Times
- Domain Lookup Time
- Redirection Time
- Download Time
- Full Load Time
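A simplified sketch of how the component metrics above could roll up into a single figure. This assumes a model where full load time is treated as the sum of the listed components – in real page loads some phases overlap, so treat this as illustrative arithmetic only:

```python
def full_load_time(timings: dict) -> float:
    """Sum component timings (milliseconds) into a full load time.
    Keys mirror the metric list above; any missing phase counts as 0."""
    components = ("server_response", "server_connection",
                  "domain_lookup", "redirection", "download")
    return sum(timings.get(k, 0) for k in components)
```

Keeping the components separate, as Robotto's metric list does, is what lets you see whether a slow week came from DNS lookups, the server, or the payload itself.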
Webmaster Tools Crawl Issues
If the search engines are unable to crawl your website efficiently, your webpages will not be included in their indexes and therefore cannot deliver organic traffic.
Robotto maintains a constant check on your crawl issues data available in Webmaster Tools. The data is processed to give a more meaningful picture of issues.
The crawl issues are grouped into three types, allowing you to identify lower-volume but more important issues amongst the large volume of noise.
Webmaster Tools Messages
The messages from Google Webmaster Tools are a mixture of critical issues and low-value notifications, which means some important issues can go unnoticed.
The messages for all your domains with an authenticated Webmaster Tools account are downloaded every day. They are prioritised based on their importance and then grouped: critical issues are sent immediately, high priority issues are grouped and sent hourly, and lower priority alerts are grouped into a daily email.
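The routing rule described above – immediate, hourly or daily delivery based on priority – can be sketched as a simple lookup. The priority labels here are illustrative, not Robotto's internal names:

```python
from collections import defaultdict

# Delivery buckets matching the schedule described above
ROUTES = {"critical": "immediate", "high": "hourly", "low": "daily"}

def route_messages(messages):
    """Group (priority, text) pairs into delivery buckets; unknown
    priorities fall through to the daily digest."""
    buckets = defaultdict(list)
    for priority, text in messages:
        buckets[ROUTES.get(priority, "daily")].append(text)
    return dict(buckets)
```

Sending only the `immediate` bucket straight away is what stops a critical alert from drowning in the daily digest alongside routine notifications.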
Lost Entry Pages
Robotto helps you identify entry pages that have been removed and may have resulted in lost traffic. It combines the 404 page reports from Webmaster Tools with entry visit data from Google Analytics to find removed pages that were generating visits.
Robotto’s ‘Removed Visits’ report details a list of all broken webpage URLs identified and the associated traffic value for the previous week (visit data taken from Google Analytics). The report also highlights the response code returned by the problem URL, the date and time the issue was detected, and where the problem URL is being linked from internally.
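The cross-referencing step – keeping only the 404 URLs that were actually earning entry visits – can be sketched as a filtered join. Input shapes here are assumptions: a list of 404 URLs and a mapping of URL to last week's entry visits:

```python
def lost_entry_pages(not_found_urls, entry_visits):
    """Cross-reference 404 URLs with last week's entry visit counts,
    reporting only broken pages that were generating traffic,
    highest-value first."""
    report = [(url, entry_visits[url])
              for url in not_found_urls
              if entry_visits.get(url, 0) > 0]
    return sorted(report, key=lambda item: item[1], reverse=True)
```

Filtering this way is the point of the report: a removed page with zero entry visits is noise, while one that drew 120 visits last week is lost traffic worth recovering.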