The CheckCheck roadmap

Approval of the first rough concept

A search among existing services for the site-checking functionality I needed gave mixed results. The market offers a number of solutions for checking sites: standalone programs, web services, and browser extensions. Each of these solutions performs its own set of checks and offers its own additional services. However, none of them combined the ideal set of checks with regular automatic scans. In addition, I wanted a comfortable, modern interface for working with errors, and the existing solutions fell short in this respect.

First working version

In February, the first version of the kernel was written; it would be heavily reworked several times later on. It was based on an in-house framework that can be extended with new validators of different types.
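
To illustrate the idea, here is a minimal sketch of what such an extensible validator framework might look like in PHP 8. The interface and class names are hypothetical, not the project's actual code.

```php
<?php

// Hypothetical contract: every check the kernel runs implements this interface.
interface Validator
{
    // A short machine-readable identifier, e.g. "broken-links".
    public function name(): string;

    /**
     * Run the check against a fetched page and return a list of findings.
     * @return Finding[]
     */
    public function validate(Page $page): array;
}

// Minimal value objects used by the sketch.
final class Page
{
    public function __construct(
        public string $url,
        public int $statusCode,
        public string $html,
    ) {}
}

final class Finding
{
    public function __construct(
        public string $validator,
        public string $message,
    ) {}
}

// Example validator: flag pages that respond with a 4xx/5xx status.
final class HttpStatusValidator implements Validator
{
    public function name(): string
    {
        return 'http-status';
    }

    public function validate(Page $page): array
    {
        if ($page->statusCode >= 400) {
            return [new Finding($this->name(), "HTTP {$page->statusCode} at {$page->url}")];
        }
        return [];
    }
}
```

New check types then plug into the kernel by implementing the same interface, which is what makes the framework easy to extend.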

This version had no web interface or API, only a console. Test results could be exported as Excel tables. Technology stack: PHP 8 + RabbitMQ + PostgreSQL.
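
As an illustration of how the queue fits into that stack, a crawl job might be published to RabbitMQ roughly like this (a sketch using php-amqplib; the queue name and message fields are assumptions):

```php
<?php

require __DIR__ . '/vendor/autoload.php';

use PhpAmqpLib\Connection\AMQPStreamConnection;
use PhpAmqpLib\Message\AMQPMessage;

// Connect to a local RabbitMQ broker with default credentials.
$connection = new AMQPStreamConnection('localhost', 5672, 'guest', 'guest');
$channel = $connection->channel();

// Durable queue so queued checks survive a broker restart.
$channel->queue_declare('site_checks', false, true, false, false);

// Hypothetical job payload: which site to crawl and which validators to run.
$job = json_encode([
    'url'        => 'https://example.com',
    'validators' => ['http-status', 'broken-links'],
]);

$message = new AMQPMessage($job, [
    'delivery_mode' => AMQPMessage::DELIVERY_MODE_PERSISTENT,
]);
$channel->basic_publish($message, '', 'site_checks');

$channel->close();
$connection->close();
```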

Version with web interface

The first public version of the project. A website with a check form was created, along with the first version of the HTML report.

Improved reports with code snippets

After a number of technical improvements to the kernel, the reports gained code snippets with error highlighting, as well as export to PDF and Excel formats. The site has been updated and opened for indexing by search engines.
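
For the Excel export, one common approach in the PHP ecosystem is PhpSpreadsheet; the sketch below shows the general idea under that assumption, not the project's actual implementation:

```php
<?php

require __DIR__ . '/vendor/autoload.php';

use PhpOffice\PhpSpreadsheet\Spreadsheet;
use PhpOffice\PhpSpreadsheet\Writer\Xlsx;

// Build a simple report sheet: one row per finding.
$spreadsheet = new Spreadsheet();
$sheet = $spreadsheet->getActiveSheet();
$sheet->fromArray(['URL', 'Check', 'Message'], null, 'A1');

// Hypothetical findings produced by the validators.
$findings = [
    ['https://example.com/about', 'http-status',  'HTTP 404'],
    ['https://example.com/',      'broken-links', 'Dead link to /old-page'],
];
$sheet->fromArray($findings, null, 'A2');

// Write the report to disk as an .xlsx file.
(new Xlsx($spreadsheet))->save('report.xlsx');
```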

Adding new validators and checks

About 40 checks were implemented in the previous stages. Based on many years of web development experience and the modern requirements of the web, the list is planned to be expanded significantly.

Regular automatic checks

One of the main features of the service should be automatic site checks and regular reports that analyze how the results change over time. At the site owner's request, reports can be delivered daily or weekly. It is also planned to send urgent notifications about sudden critical problems, for example a sharp spike in the number of 4xx or 5xx errors.
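
A simple way such a spike alert could work is to compare the latest crawl's error count against a baseline from recent crawls. The thresholds and function name below are illustrative assumptions:

```php
<?php

// Decide whether a jump in 4xx/5xx errors warrants an urgent notification.
// $baseline: average error count over recent crawls; $current: latest crawl.
function isErrorSpike(float $baseline, int $current, float $ratio = 3.0, int $minErrors = 10): bool
{
    // Ignore noise on small sites: require an absolute minimum of errors.
    if ($current < $minErrors) {
        return false;
    }
    // Treat any significant errors on a previously clean site as a spike.
    if ($baseline == 0.0) {
        return true;
    }
    return $current >= $baseline * $ratio;
}

// Example: 5 errors on average in recent crawls, 40 in the latest one.
if (isErrorSpike(5.0, 40)) {
    echo "Alert: sharp spike in 4xx/5xx errors\n"; // hand off to the notifier
}
```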

Implementing paid subscriptions

To crawl sites with more than 1000 pages, as well as to use additional functions, a subscription will be required. Single crawls of sites up to 1000 pages are planned to remain free for everyone.

Browser extension, Telegram bot, and public API

In addition to the site with a personal account, access to the service is planned via a Telegram bot and a browser extension. The browser extension will let you run a check of the current site, as well as see how the scan results for the current page change over time. Since implementing these features requires an API, that API can also be made publicly available. This will make the service more attractive to developers and broaden its usability.
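
As a rough illustration of what calling such a public API might look like, here is a sketch of starting a check over HTTP from PHP; the endpoint, payload shape, and token header are entirely hypothetical, since no API has been published yet:

```php
<?php

// Hypothetical request: ask the service to start checking a site.
// Endpoint and payload shape are assumptions, not a documented API.
$payload = json_encode(['url' => 'https://example.com']);

$ch = curl_init('https://api.checkcheck.example/v1/checks');
curl_setopt_array($ch, [
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => $payload,
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_HTTPHEADER     => [
        'Content-Type: application/json',
        'Authorization: Bearer YOUR_API_TOKEN', // hypothetical auth scheme
    ],
]);

$response = curl_exec($ch);
curl_close($ch);

// Assumed response: a JSON object identifying the newly started check.
echo $response, "\n";
```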