Free 7 days
1 site, absolutely free (price: 0)
  • 1 crawling
  • 50 URLs per crawling job
  • 3 tracked pages
  • 7-day tracked pages changes log
  • 3 tracked sitemaps
  • Tracked robots.txt
  • Telegram and Slack alerts
  • 7-day response time log
  • Google PageSpeed tracking
  • W3C validation
  • Sub accounts
  • SSL validation
  • Protocol tracking
  • Domain expiry tracking
  • Excel changes history export
  • Custom request headers

Try for free. No credit card needed. No commitment.

One resource
3 sites, 69 per month
  • ∞ crawlings, 1 active job per site
  • 10 000 URLs per crawling job
  • 60 tracked pages per site
  • 1-year history of tracked pages changes log
  • 30 sitemaps per site
  • Tracked robots.txt
  • Telegram and Slack alerts
  • 1-year history of response time log
  • SSL validation
  • Protocol tracking
  • Domain expiry tracking
  • Excel changes history export
  • Custom request headers
  • Google PageSpeed tracking
  • W3C validation
  • ∞ sub accounts

Get started (payment provided by PayPal).
Enterprise
6 sites, 129 per month
  • ∞ crawlings, 1 active job per site
  • 100 000 URLs per crawling job
  • 120 tracked pages per site
  • 2-year history of tracked pages changes log
  • 30 sitemaps per site
  • Tracked robots.txt
  • Telegram and Slack alerts
  • 2-year history of response time log
  • SSL validation
  • Protocol tracking
  • Domain expiry tracking
  • Excel changes history export
  • Custom request headers
  • Google PageSpeed tracking
  • W3C validation
  • ∞ sub accounts

Get started (payment provided by PayPal).
The same plans are also available with yearly billing:

One resource: 3 sites, 690 per year (same features as the monthly plan)
Enterprise: 6 sites, 1290 per year (same features as the monthly plan)
You can look at a live example right now: a live preview of the site functionality, with no registration or email input required.

Site functionality

In the tables below, "+" means yes and "-" means no for the given column (for example, whether the check frequency is customizable, or whether it runs with a mobile or desktop user agent).

Domain

| Parameter | How often | Customizable | Turn on/off | Mobile UA | Desktop UA | Alert in case of |
|---|---|---|---|---|---|---|
| Expiry date | weekly | - | + | - | - | expires in less than 30 days |
| Protocol (HTTP/1, HTTP/2, HTTP/3) | daily | - | + | - | - | changed |
| Nameservers | weekly | - | + | - | - | changed |
| SSL expiry | daily | - | + | - | - | expires in less than 30 days |
| DNS blacklist (DNSBL) | weekly | - | + | - | - | record found |

Robots.txt

| Parameter | How often | Customizable | Turn on/off | Mobile UA | Desktop UA | Alert in case of |
|---|---|---|---|---|---|---|
| Content | weekly | - | + | - | + | changed or Not Found |

Sitemap(s)

| Parameter | How often | Customizable | Turn on/off | Mobile UA | Desktop UA | Alert in case of |
|---|---|---|---|---|---|---|
| Items count (links or sub-sitemaps) | daily | + | + | - | + | empty content or not found |
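
As a rough illustration of what counting sitemap items involves, here is a minimal Python sketch (our own illustration, not the service's code) that counts <url> entries in a regular sitemap and <sitemap> entries in a sitemap index:

```python
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_items_count(xml_text):
    """Count links in a urlset or sub-sitemaps in a sitemap index."""
    root = ET.fromstring(xml_text)
    urls = root.findall("sm:url", NS)        # regular sitemap entries
    subs = root.findall("sm:sitemap", NS)    # sub-sitemaps in a sitemap index
    return len(urls) + len(subs)

example = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
</urlset>"""
print(sitemap_items_count(example))  # 2
```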

URL (page)

| Parameter | How often | Customizable | Turn on/off | Mobile UA | Desktop UA | Alert in case of |
|---|---|---|---|---|---|---|
| Google "PageSpeed" (the results are not permanent) | twice a day | - | + | + | + | no alerts |
| W3C | twice a day | - | + | - | - | no alerts |
| HTTP status | twice a day | + | - | + | + | HTTP status differs from expected |
| Redirect URL (if HTTP status = 301, 302 or 307) | twice a day | + | + | + | + | redirect URL differs from expected |
| Response time | ~ every 10 minutes | + | + | + | + | response time differs from expected |
| Title | twice a day | - | + | + | + | title differs from expected |
| Title count | twice a day | - | + | + | + | title count differs from 1 |
| h1 (max 6 items on page) | twice a day | - | + | + | + | changed |
| h1 count | twice a day | - | + | + | + | changed |
| h2 (max 6 items on page) | twice a day | - | + | + | + | changed |
| h2 count | twice a day | - | + | + | + | changed |
| h3 (max 6 items on page) | twice a day | - | + | + | + | changed |
| h3 count | twice a day | - | + | + | + | changed |
| Link canonical | twice a day | - | + | + | + | changed |
| Link canonical count | twice a day | - | + | + | + | link canonical count > 1 |
| Canonical in HTTP headers | twice a day | - | + | + | + | changed |
| hreflang (hreflang + href) | twice a day | - | + | + | + | changed |
| hreflang count | twice a day | - | + | + | + | changed |
| hreflang in HTTP headers | twice a day | - | + | + | + | changed |
| Meta robots | twice a day | - | + | + | + | changed |
| Meta robots count | twice a day | - | + | + | + | changed |
| Robots in HTTP headers | twice a day | - | + | + | + | changed |
| Meta description | twice a day | - | + | + | + | changed |
| Meta description count | twice a day | - | + | + | + | meta description count > 1 |
| Meta keywords | twice a day | - | + | + | + | changed |
| Meta keywords count | twice a day | - | + | + | + | meta keywords count > 1 |
| og title | twice a day | - | + | + | + | changed |
| og title count | twice a day | - | + | + | + | og title count > 1 |
| og description | twice a day | - | + | + | + | changed |
| og description count | twice a day | - | + | + | + | og description count > 1 |
| Custom regex (max 6 items on page; each received value cannot exceed 255 characters) | twice a day | + | + | + | + | changed |
| <noindex> tag or <!--noindex--> count | twice a day | - | + | + | + | changed |
| data-nosnippet count | twice a day | - | + | + | + | changed |
| Metrika ID | twice a day | - | + | + | + | changed |
| Google Analytics ID | twice a day | - | + | + | + | changed |
| Microdata (JSON + HTML) | twice a day | - | + | + | + | changed |
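
To make the page-level parameters above concrete, here is a minimal sketch (our own illustration, not the service's parser) that extracts a few of them, the title, h1 values, the canonical link and meta robots, from an HTML document with Python's standard library:

```python
from html.parser import HTMLParser

class PageSnapshot(HTMLParser):
    """Collects title, h1 texts, canonical links and meta robots values."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.h1 = []
        self.canonical = []
        self.meta_robots = []
        self._capture = None   # "title" or "h1" while inside that element
        self._buffer = ""

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("title", "h1"):
            self._capture, self._buffer = tag, ""
        elif tag == "link" and attrs.get("rel", "").lower() == "canonical":
            self.canonical.append(attrs.get("href", ""))
        elif tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.meta_robots.append(attrs.get("content", ""))

    def handle_data(self, data):
        if self._capture:
            self._buffer += data

    def handle_endtag(self, tag):
        if tag == self._capture:
            text = self._buffer.strip()
            if tag == "title":
                self.title = text
            elif len(self.h1) < 6:   # the table above tracks at most 6 h1 items
                self.h1.append(text)
            self._capture = None

snapshot = PageSnapshot()
snapshot.feed('<title>Home</title><h1>Hello</h1>'
              '<link rel="canonical" href="https://example.com/">'
              '<meta name="robots" content="index,follow">')
print(snapshot.title, snapshot.h1, snapshot.canonical, snapshot.meta_robots)
```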

You will have access to the artifacts when the requested URL returns text or text/html content and the data has changed. We store the request headers, response headers, and response body; these artifacts are kept for every check except the response-time check.

You can set (a configuration sketch follows this list):
  • User agent (desktop, mobile)
  • Concurrency
  • Delay between requests
  • Whether to ignore robots.txt rules
  • Whether to accept nofollow links
  • Whether to execute JavaScript
  • Response headers
  • Parseable MIME types / content types
  • Ignored URL endings
  • Ignored URLs that contain specific characters
  • Ignored paths
  • Crawl by URL list
  • Crawl by sitemap
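
For illustration, a crawl-job configuration covering these options might look like the following sketch. All field names and example values are our own illustration, not the service's API:

```python
# Hypothetical crawl-job configuration mirroring the settings listed above.
crawl_job = {
    "user_agent": "desktop",                  # or "mobile"
    "concurrency": 4,                         # parallel requests
    "delay_between_requests_ms": 500,
    "ignore_robots_txt": False,
    "accept_nofollow_links": False,
    "execute_javascript": False,
    "response_headers": ["content-type", "x-robots-tag"],   # response headers to record
    "parseable_content_types": ["text/html", "text/plain"],
    "ignored_url_endings": [".pdf", ".zip"],
    "ignored_url_substrings": ["?sessionid="],
    "ignored_paths": ["/admin/"],
    "crawl_source": "sitemap",                # or "url_list"
    "url_list": [],                           # used when crawl_source == "url_list"
}
```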
What data analysis capabilities will you have:
  • Duplicate Value Search
  • Uppercase Value Search
  • Lowercase Value Search
  • Punctuation Character Search
  • Numeric Value Search
You will be able to use the following operators (a filtering sketch follows the list):
  • !=
  • =
  • <
  • >
  • >=
  • <=
  • empty
  • not empty
  • in
  • not in
  • start with
  • not start with
  • end with
  • not end with
  • contain
  • not contain
  • str length less
  • str length greater
  • str length =
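
As an illustration (our own sketch, not the service's implementation), the operators above map naturally onto simple predicates that can be applied when filtering crawl data:

```python
# Each operator name maps to a predicate taking the field value and an argument.
OPERATORS = {
    "=":  lambda value, arg: value == arg,
    "!=": lambda value, arg: value != arg,
    "<":  lambda value, arg: value < arg,
    ">":  lambda value, arg: value > arg,
    ">=": lambda value, arg: value >= arg,
    "<=": lambda value, arg: value <= arg,
    "empty":     lambda value, arg: value in ("", None),
    "not empty": lambda value, arg: value not in ("", None),
    "in":        lambda value, arg: value in arg,
    "not in":    lambda value, arg: value not in arg,
    "start with":     lambda value, arg: str(value).startswith(arg),
    "not start with": lambda value, arg: not str(value).startswith(arg),
    "end with":       lambda value, arg: str(value).endswith(arg),
    "not end with":   lambda value, arg: not str(value).endswith(arg),
    "contain":        lambda value, arg: arg in str(value),
    "not contain":    lambda value, arg: arg not in str(value),
    "str length less":    lambda value, arg: len(str(value)) < arg,
    "str length greater": lambda value, arg: len(str(value)) > arg,
    "str length =":       lambda value, arg: len(str(value)) == arg,
}

def matches(row, field, operator, arg=None):
    """True if the row's field satisfies the operator."""
    return OPERATORS[operator](row.get(field), arg)

# Example: find crawled pages whose title is missing or shorter than 3 characters.
pages = [{"url": "/", "title": "Home"}, {"url": "/a", "title": ""}]
print([p["url"] for p in pages
       if matches(p, "title", "empty") or matches(p, "title", "str length less", 3)])
```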

What data do we collect and analyze during crawling:

url, found on, redirect to, content-type, last modified, http version, title, date, h1 1, h1 2, h1 3, h1 4, h1 5, h1 6, h2 1, h2 2, h2 3, h2 4, h2 5, h2 6, h3 1, h3 2, h3 3, h3 4, h3 5, h3 6, meta descriptions, meta keywords, canonical, robots, http canonical, http robots, og title, og descriptions, metrika, google analytics, hreflang, http hreflang, http status, title count, h1 count, h2 count, h3 count, meta descriptions count, meta keywords count, canonical count, og title count, og descriptions count, http hreflang count, nosnippet count, noindex count, hreflang count, robots count, internal links count, internal links depth sum, depth

Who are we?

The service was created by SEO specialists for SEO specialists. Our mission is to let you track changes on your site promptly. We are convinced that you should not have to learn about changes from Google Search Console.

Introducing our cutting-edge cloud-based crawler – 2-UA. Our crawler offers an extensive range of features to optimize your website's performance and ensure seamless functionality, providing functionality that rivals industry standards.

Key Advantages:

  1. Cloud-Based Technology: 2-UA operates entirely in the cloud, eliminating the need for complex installations and allowing for effortless accessibility from any device or location.
  2. Comprehensive Crawling: Our crawler thoroughly examines every aspect of your website, from page content to metadata, ensuring no stone is left unturned.
  3. Real-Time Detection: With lightning-fast detection capabilities, 2-UA promptly identifies changes on your website, enabling swift response to any updates or issues.
  4. User-Agent Flexibility: Seamlessly switch between mobile and desktop user-agents to accurately assess your website's performance across various devices and platforms.
  5. Customizable Reporting: Tailor reports to suit your specific needs, with detailed insights into crawl data, errors, and performance metrics.
  6. Continuous Monitoring: Keep a vigilant eye on your website's health with automated monitoring features, ensuring optimal performance at all times.

Experience the power and efficiency of 2-UA – the ultimate solution for comprehensive website crawling and analysis.

  • Track the total response time of site pages: average and percentiles {25, 50, 75, 90, 99}, with mobile and desktop headers tracked separately
  • Track per-page response time: individual values within a single day, or average and percentiles {25, 50, 75, 90, 99} when the period is longer than one day, again with mobile and desktop headers tracked separately (see the sketch after this list)
  • View the history of changes for each tracked data point
  • Export change history data
  • Search across the collected data
  • View the history of changes for the entire page
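
For reference, here is a minimal sketch (our own illustration, not the service's implementation) of how the average and the {25, 50, 75, 90, 99} percentiles could be computed from collected response-time samples:

```python
from statistics import mean

def percentile(samples, p):
    """Value at percentile p (0..100) of a non-empty list, by rounded rank."""
    ordered = sorted(samples)
    rank = round(p / 100 * (len(ordered) - 1))
    return ordered[rank]

def summarize(samples):
    """Average plus the percentiles reported for response times."""
    return {
        "avg": mean(samples),
        **{f"p{p}": percentile(samples, p) for p in (25, 50, 75, 90, 99)},
    }

# Example: response times in milliseconds for one page, desktop user agent.
desktop_ms = [120, 135, 128, 540, 131, 127, 122, 129]
print(summarize(desktop_ms))
```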

Google PageSpeed

We check pages whose expected HTTP status is 200 OK, and collect:

  • Performance
  • Accessibility
  • Best practices
  • SEO
  • PWA
  • LCP
  • CLS
  • FCP
  • INP
  • TTFB

We collect audit history
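
As an illustration of where such data can come from, here is a hedged sketch that queries Google's public PageSpeed Insights v5 API for the Lighthouse category scores. The endpoint, parameters and response fields reflect our understanding of the public API, not necessarily how this service collects its data; check the current API reference before relying on it.

```python
import json
import urllib.parse
import urllib.request

def pagespeed_scores(url, strategy="mobile", api_key=None):
    """Fetch Lighthouse category scores (0..1) for a URL from PageSpeed Insights v5."""
    params = [("url", url), ("strategy", strategy)]
    for category in ("PERFORMANCE", "ACCESSIBILITY", "BEST_PRACTICES", "SEO"):
        params.append(("category", category))
    if api_key:  # optional key for higher quota (parameter of this sketch, not required)
        params.append(("key", api_key))
    endpoint = ("https://www.googleapis.com/pagespeedonline/v5/runPagespeed?"
                + urllib.parse.urlencode(params))
    with urllib.request.urlopen(endpoint, timeout=60) as response:
        report = json.load(response)
    categories = report["lighthouseResult"]["categories"]
    return {name: data["score"] for name, data in categories.items()}

# Example call (requires network access):
# print(pagespeed_scores("https://example.com/", strategy="desktop"))
```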