Domain capture – how not to miss a single good domain name

Capturing valuable domains is still the foundation of building a solid SEO backend – although, of course, no one officially builds such backends. If, however, you happen to stray from the path of the SEO White Knight, I have a simple way for you to check all the .pl domains expiring each day.

For those who capture domains only occasionally, such a solution is overkill. Professional domainers, on the other hand, will find it trivial and obvious. I hope this short text will interest someone “in the middle” 😉

List of expiring domains

Let’s start with the fact that the list of domains entering the expiration period is no secret. NASK publishes it every morning at https://www.dns.pl/deleted_domains.txt.

It includes all .pl domains not renewed on time, including regional and functional second-level domains: com.pl, net.pl, katowice.pl, etc.

Save the list of domains

Of course, going through such a list manually every day would be impractical, so it’s worth writing a few lines of code to do the job for us.

<?php
// Download today's list of expiring .pl domains published by NASK
// and save it to a text file with the current date in its name.
// Note: fetching a URL with file_get_contents() requires
// allow_url_fopen to be enabled in php.ini.
$today = date("m-d-y");
$nazwa = 'domeny-' . $today . '.txt';
file_put_contents($nazwa, file_get_contents("https://www.dns.pl/deleted_domains.txt"));
?>

We paste it into a file called, say, script.php, and tuck it away in a folder on a rarely used, expendable server – just in case.

What does this code do? It downloads the list of domains entering the expiration period and saves it in a text file with the current date in the name.

What remains is to have it run automatically every day, for which we can use, for example, cron, which should be available on most low-cost hosting. We set it to call our script every day, for example, at noon.
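A sample crontab entry could look like this – the PHP binary path /usr/bin/php and the script location /home/user/script.php are placeholders, adjust both to your server:

0 12 * * * /usr/bin/php /home/user/script.php

If the hosting only offers a URL-based cron in its panel, pointing it at the script’s address once a day works just as well.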

One could, of course, polish the code a bit and make it, for example, append new domains to a single file instead of creating a new one each day – but the solution works in this form as well.
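For instance, a minimal sketch of that append variant (the cumulative file name domeny-all.txt is just an example):

<?php
// Append each day's list to one cumulative file
// instead of creating a separate file per day.
$list = file_get_contents("https://www.dns.pl/deleted_domains.txt");
file_put_contents('domeny-all.txt', $list, FILE_APPEND);
?>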

Filtering lists

Everyone has their own way of filtering large lists of domains, so there’s nothing revelatory to write here. The best – but probably also the most expensive – option is the Ahrefs API paired with your own application; prices there start at $500 per month.

An intermediate solution for Ahrefs account holders is to use the API with some kind of integration with external software. Here, for the price of the standard Ahrefs plan, we get 300,000 API rows to use.
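Just to show the shape of a raw call, here is a sketch of querying the Ahrefs API v2 directly from PHP. The apiv2.ahrefs.com endpoint and the token/from/target/mode/output parameters are assumptions based on the v2 documentation of the time – verify them against the current docs before relying on this:

<?php
// Hypothetical raw Ahrefs API v2 request – endpoint and parameter
// names are assumptions; check the current API documentation.
$token  = 'YOUR_API_TOKEN';
$domain = 'example.pl';
$url = 'https://apiv2.ahrefs.com/'
     . '?token='  . urlencode($token)
     . '&from=domain_rating'   // which data table to query
     . '&target=' . urlencode($domain)
     . '&mode=domain'
     . '&output=json';
$response = json_decode(file_get_contents($url), true);
print_r($response);
?>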

When it comes to Ahrefs API integrations, there are plenty to choose from: Screaming Frog, Clusteric, Netpeak and others.

Depending on what information we retrieve from the API, checking one domain costs about 4 rows from the quota. Several thousand .pl domains expire every day – at 4 rows per domain, 300,000 rows cover roughly 75,000 domains a month, i.e. around 2,500 a day, so with sparing use the quota should be enough to check them all.

Which criteria to filter domains by I leave, of course, to each person’s own judgment. I’ll just hint that it’s sometimes worth looking at (see the sketch after this list):

  • the number of linking IP classes (rather than the number of referring domains),
  • the number of anchor types,
  • the number of links from .gov or .edu domains,
  • the number of keywords, of course,
  • and finally, Domain Rating (though treat it with caution).

It’s worth looking at the less obvious data, because as a rule the number of links says nothing about their quality.
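As promised, a minimal sketch of such a filter, assuming the API results have been exported to a CSV with columns domain, ip_classes, anchors, gov_edu_links and dr – the column names and thresholds are purely illustrative:

<?php
// Minimal filtering sketch. Assumed CSV layout:
// domain,ip_classes,anchors,gov_edu_links,dr
// All thresholds below are examples – tune them to your own criteria.
$handle = fopen('ahrefs_export.csv', 'r');
fgetcsv($handle); // skip the header row
while (($row = fgetcsv($handle)) !== false) {
    list($domain, $ipClasses, $anchors, $govEdu, $dr) = $row;
    // keep domains with a diverse link profile and some authority
    if ((int)$ipClasses >= 20 && (int)$anchors >= 5 && (int)$dr >= 10) {
        echo $domain . PHP_EOL;
    }
}
fclose($handle);
?>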

None of this, of course, guarantees a high-quality domain, nor rules one out. The point is to develop a filtering method that covers as many aspects as possible without burning too many API rows.

Capture

Keep in mind that we have just sifted through a list of domains that have only just entered the expiration period. Many of them will still be renewed and will disappear from our capture lists.

As for the capture itself, we are unfortunately left to rely on the large players that specialize in it – Aftermarket, for example. Trying to capture anything of value on your own is a waste of time.

What will it give me?

As I noted at the beginning: it’s not worth monitoring all domains if you capture only occasionally, or only for a single industry.

I capture quite a few domains, and with this solution, instead of hours of groping in the dark, I generate a list for myself once every two weeks – and it takes me 20 minutes.

And while it’s not perfect and could probably be refined, it shortens the work and makes it more efficient. I wonder – do you have any better tricks for this?
