Website testers: what to do without Ahrefs and similar tools.

Testing and auditing newly created websites is an essential part of any SEO specialist's work. Naturally, one can use popular tools like Ahrefs, SemRush, and Netpeak Spider/Checker. However, all of them are paid and may become unavailable in certain countries.
Are there any free services? Yes, of course. However, as usual, nothing free offers the full depth of data collection, so you may need to combine several services to cover different functions. Moreover, practically anyone can write an automated tester that checks pages for the presence of headings, image descriptions, working links, and so on.

Screaming Frog and similar tools

Let's start by looking at the services that can currently be used without paying.

  • Screaming Frog SEO Spider: A popular tool among SEO specialists. It's desktop software for site crawling that displays broken pages, redirects, long and duplicate titles, errors in H1 headings, duplicate pages by the hash sum of the source code, and more. It's very convenient, but the free version is suitable only for small projects and has limited functionality.
  • BeamUsUp: A product developed by a single individual. It's a website crawler (standalone PC software). The latest update from the creator was on September 8, 2020. Despite this, the service can collect broken links, redirects, and errors on the website. However, it only parses HTML.
  • Webbee: A free website and URL scanner. It offers integration with Google Analytics and the ability to obtain statistics in infographic format. The software works on both Windows and Mac OS (version 10.7.3 and above). The blog of the website is periodically updated, but the latest version of the tool is only 3.0 (released in 2015).

  • LinkChecker: A small tool for checking broken links. It works slowly but provides accurate results. Installation and running are done through the GitHub repository.

  • IIS SEO Toolkit: Surprisingly, this is a crawler from Microsoft. It has a good range of features, and the scanning process is relatively fast. However, it does not work on Windows 10 (only 7 and 8). Apparently, Microsoft has discontinued its support.

  • SiteAnalyzer: A good website crawler with a decent set of features. It allows you to check websites for errors, analyze SEO parameters, and more. It includes features such as PageRank calculation, page indexation check, and much more. SiteAnalyzer is regularly updated, with the latest version released on March 28, 2022. Access to the tool is free (voluntary donations to the developers are possible). Note that it does not support Mac OS.

The Way of the Samurai - Python

There is another way to get a conditionally free tester. Conditionally, because it requires investing either time and effort (if you have programming skills) or money (if you hire an external developer). This way involves Python, a multi-paradigm programming language. In simple terms, using ready-made modules you can create a tester (or even a full web scraper) and keep customizing it to your specific needs without limit.
What are the benefits? The advantage is a "develop once, use it as much as you want" approach. Plus, you can always add new modules to expand its capabilities.

Let's see what is needed to develop a small SEO tester for websites (essentially creating a web scraper that gathers the necessary data and exports it to a unified document).

What is required for development? Naturally, Python itself and a code editor to simplify the work, highlight syntax, and so on. For example, you can use PyCharm, Spyder, Atom, and others (there are many; it's easiest to try a few and pick the one whose appearance suits you and that doesn't require installing extra plugins).
Next, you will need modules, which are essentially ready-made libraries with a set of functions for specific tasks. You can install them with the command pip install followed by the module name (e.g., pip install requests). After installation, they are ready to use.

  • Requests: This module lets you send HTTP requests to the desired website. Using it, you receive a response from the server (similar to what the browser gets after you click a specific link). As a result, you get the full HTML source of the required page. However, reading it directly is inconvenient, so to process the data further you pass the obtained markup to the next module, such as BeautifulSoup.

  • BeautifulSoup: This library allows you to process the code obtained with Requests and define what information to collect. With its help, you can specify data retrieval based on headings, the presence of Title and Description, alt texts for images, outgoing links, and more. By default, the module relies on Python's built-in html.parser; starting from version 4, BeautifulSoup also supports third-party parsers such as lxml, html5lib, and lxml's XML parser.

  • Pandas: A data-handling library that, among other things, lets you export the collected data to a .csv file. You set the column headers as strings (enclosed in quotes) and then export using df.to_csv. A combined sketch of all three modules follows this list.
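To show how the three modules fit together, here is a minimal sketch for a single page. The URL, the chosen fields (title, meta description, H1), and the output file name are placeholders invented for the example, not prescribed by any of the libraries.

```python
import requests
import pandas as pd
from bs4 import BeautifulSoup

# Placeholder URL for the example
url = "https://example.com/"

# Requests: fetch the page and get its full HTML source
response = requests.get(url, timeout=10)
html = response.text

# BeautifulSoup: parse the HTML with the built-in html.parser
soup = BeautifulSoup(html, "html.parser")

# Collect a few SEO-relevant fields
title = soup.find("title")
description = soup.find("meta", attrs={"name": "description"})
h1 = soup.find("h1")

row = {
    "url": url,
    "title": title.get_text(strip=True) if title else "",
    "description": description.get("content", "") if description else "",
    "h1": h1.get_text(strip=True) if h1 else "",
}

# Pandas: export the collected data to a .csv file
df = pd.DataFrame([row])
df.to_csv("single_page_report.csv", index=False)
```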

The general algorithm is simple: open your page's source code and decide which data you want to gather (headings, title, etc.). Then specify the corresponding tags and classes in the find calls. Don't forget to define a message that the tester will output when a piece of data is missing; this can be handled with try and except blocks, as in the sketch below.
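One possible way to implement the "missing data" message is shown below: soup.find() returns None when a tag is absent, and calling .get_text() on None raises AttributeError, which the try/except catches. The helper name extract_field and the messages are illustrative assumptions, not part of the libraries.

```python
from bs4 import BeautifulSoup

def extract_field(soup: BeautifulSoup, tag: str, missing_message: str) -> str:
    """Return the text of the first matching tag, or a message if it is absent."""
    try:
        # .get_text() raises AttributeError when find() returns None,
        # which is exactly the case we want to report
        return soup.find(tag).get_text(strip=True)
    except AttributeError:
        return missing_message

# Tiny demo page with a title but no H1
html = "<html><head><title>Demo page</title></head><body><p>No H1 here</p></body></html>"
soup = BeautifulSoup(html, "html.parser")

print(extract_field(soup, "title", "TITLE IS MISSING"))  # -> Demo page
print(extract_field(soup, "h1", "H1 IS MISSING"))        # -> H1 IS MISSING
```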

After that, wrap the requests from the Requests module in a loop so the tester can walk through a list of pages (it is recommended to use the time library to set a pause between requests). Next, export the data to a .csv file using Pandas and fix any errors the report reveals. A sketch of this final step follows.
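Putting the pieces together, here is a rough sketch of that final loop, assuming an invented list of URLs, a one-second pause via time.sleep, and example column names and checks; a real tester would add whatever checks matter for your project.

```python
import time

import requests
import pandas as pd
from bs4 import BeautifulSoup

# Placeholder list of pages to test
urls = [
    "https://example.com/",
    "https://example.com/about",
]

rows = []
for url in urls:
    response = requests.get(url, timeout=10)
    soup = BeautifulSoup(response.text, "html.parser")

    try:
        title = soup.find("title").get_text(strip=True)
    except AttributeError:
        title = "TITLE IS MISSING"

    # Count images that have no alt text
    images_without_alt = sum(1 for img in soup.find_all("img") if not img.get("alt"))

    rows.append({
        "url": url,
        "status": response.status_code,
        "title": title,
        "images_without_alt": images_without_alt,
    })

    # Pause between requests so the crawl stays polite to the server
    time.sleep(1)

# Export the collected data, with column headers passed as strings
df = pd.DataFrame(rows, columns=["url", "status", "title", "images_without_alt"])
df.to_csv("seo_report.csv", index=False)
```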

Conclusion

Website testing is an essential task for every SEO specialist. It helps not only to identify errors on created pages but also to observe how your competitors optimize their resources. On one hand, there are many ready-made paid services that effectively address this task. On the other hand, you can always use free tools or create your own, which can be developed further and even monetized by offering your services to others.