How does one programmatically find newly available domains? - dns

There are online tools (e.g. http://www.swola.com/) for discovering newly available domains.
Given a corpus of domain names, it is easy to periodically check their records and raise a flag when one becomes available. However, periodically checking the records for every registered domain name in the world sounds excessive and impractical.
What is an efficient way to programmatically discover such domains? How do such tools work?

I don't have enough experience with the data to determine how up to date it is (for example if someone renews a domain a week before it expires, how quickly is it removed from the list?) but I discovered that pool.com has a database of upcoming domains sorted by expiration date. It's a relatively simple matter to create a cron job that downloads this file, parses it for the specific domains you want (for example, domains that expire tomorrow, domains with only characters a-z, domains up to 9 characters, domains with KEYWORD in them, .coms, etc) and emails you a daily report.
One thing that I would like to hear about is how others are ordering these domains as soon as they become available. I know some registrars offer that service but usually at a great expense.
Here's my script; I threw it together in half an hour, but hopefully it will be useful to you.
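For anyone who wants a starting point, a minimal Python sketch of the same idea could look something like this (not the original script; the pool.com download URL, the one-domain-per-line file format, the filters, and the email addresses are all placeholders to adapt):

    # Sketch only: download a list of expiring domains, filter it, email a report.
    # The URL, file format (one domain per line) and addresses are placeholders.
    import re
    import smtplib
    import urllib.request
    from email.message import EmailMessage

    LIST_URL = "https://www.pool.com/path/to/expiring-domains.txt"  # placeholder
    KEYWORD = "widget"                                              # placeholder

    raw = urllib.request.urlopen(LIST_URL).read().decode("utf-8", "replace")

    wanted = []
    for line in raw.splitlines():
        domain = line.strip().lower()
        if not domain.endswith(".com"):
            continue
        name = domain[:-len(".com")]
        # Example filters: letters a-z only, at most 9 characters, contains the keyword.
        if re.fullmatch(r"[a-z]{1,9}", name) and KEYWORD in name:
            wanted.append(domain)

    msg = EmailMessage()
    msg["Subject"] = "Expiring domains report"
    msg["From"] = "cron@example.com"
    msg["To"] = "you@example.com"
    msg.set_content("\n".join(wanted) or "No matches today.")

    with smtplib.SMTP("localhost") as smtp:   # assumes a local MTA is running
        smtp.send_message(msg)

Run it from cron once a day and the report lands in your inbox each morning.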

Related

Migrate G Suite to two separate G Suites

I have one G Suite account with three domains (and, mainly, a lot of email).
Now I would like to split one of the domains off into a separate, fresh G Suite account, of course with its historical email.
Does anybody know how to do this? Is it possible without the G Suite support team?
There are different ways of doing this and yes, you can do it yourself. Please consider that it can be tricky depending on the amount of data you want to move.
The main steps are:
Create a new account with a temporary domain (the same domain cannot be in two consoles at the same time).
Migrate all the contents from your current account to the new one. You have different options for doing this. The cheapest is the Data Migration Service (DMS), which will only let you migrate email, using the IMAP protocol (so you need to know the users' passwords). Google support for the DMS is best effort, so if you have time and budget I recommend using a commercial tool (my tool of choice here is Cloud Migrator) that can also migrate calendar items and Google Drive files using the Google APIs (so it is transparent for the users).
On a cut-off date agreed with your users, remove the domain from the original console, add it to the new one, and perform a mass rename (my tool of choice here is GAM); see the sketch below.
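As a rough illustration of that last step (not an official GAM recipe), a small Python driver could shell out to GAM once per user; the renames.csv file is hypothetical, and the exact "gam update user ... username ..." argument order should be verified against the documentation of the GAM version you run.

    # Sketch only: drive a mass rename by calling the GAM CLI once per user.
    # Verify the exact "gam update user <old> username <new>" syntax against
    # your GAM version's documentation before running anything like this.
    import csv
    import subprocess

    with open("renames.csv", newline="") as f:   # hypothetical old_email,new_email pairs
        for old_email, new_email in csv.reader(f):
            subprocess.run(
                ["gam", "update", "user", old_email, "username", new_email],
                check=True,
            )
            print("renamed", old_email, "->", new_email)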
There are many variables that can make the process much more complex and that are difficult to cover in a single answer; this kind of activity usually requires a dedicated project and a (small) team, so I really suggest you get some help.
A word of warning:
I did this on Bluehost (transferring G Suite ownership of a domain from one Bluehost account to another) through Bluehost customer support.
It took 3 separate calls to Bluehost to get everything fully moved over. Make sure that your hosting provider sends you an email confirmation that everything happened successfully, because as a reseller of a Google product, they may not have complete authority over transferring ownership of Google's product.
Also, in reference to the domain, go to Google Domains and you can see all the options you have for your domain, website, etc. It is very self-explanatory.

Find all domains under a TLD

I'm trying to find a way to list all registered domains under a top-level domain (TLD), i.e. everything under .com, .net, etc. All the tools I can find only apply to finding subdomains of a given domain.
The information you seek isn't openly available. However, there are a few options you can try:
You might want to try inquiring at the respective registries directly about getting access to the zone files. However, the process can take weeks and some registries choose not to offer access at all. For newer gTLDs you can apply at ICANN's Centralized Zone Data Service. You might need to provide a good reason to access the full lists. The zone file can only be pulled once a day, though, so for more up-to-date information the only option is a paid service.
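If you do get hold of a zone file (for example via CZDS), pulling out the delegated names is simple enough. The sketch below assumes the usual master-file layout where delegations appear as NS records with the delegated name in the first column; the file name is hypothetical and the format of the file you actually receive may differ.

    # Sketch only: pull the set of delegated (registered) names out of a
    # downloaded TLD zone file. Adjust for the exact format you receive.
    def domains_in_zone(path, tld):
        names = set()
        with open(path) as f:
            for line in f:
                if line.startswith(";"):           # comment line
                    continue
                parts = line.lower().split()
                if len(parts) < 3 or "ns" not in parts[1:5]:
                    continue                       # not a delegation record
                owner = parts[0].rstrip(".")
                if not owner or owner == tld:
                    continue                       # the zone apex itself
                if not owner.endswith("." + tld):  # relative owner name
                    owner = owner + "." + tld
                names.add(owner)
        return names

    print(len(domains_in_zone("com.zone", "com")), "delegated domains")  # hypothetical file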
Whois API offers downloads of the entire whois database for the major gTLDs (.com, .net, .org, .us, .biz, .mobi, etc.). It also provides an archived historical whois database, in both parsed and raw format, as CSV downloads, as well as a daily download of newly registered domains.
A similar, popular question exists already but the answers and links are a bit outdated.

What's the best way to programmatically scan domains under a given TLD?

Not sure if I want to just do a kind of brute-force thing, or a dictionary attack (though this would require a Romanised dictionary of every word in the target language), but I want to scan area websites (area meaning in my country) and I don't want to just whitelist everything. Is there a better way to do this?
If you have a legitimate need for a list of all zones in a TLD, contact the registry for that TLD and ask. How willing they will be to help you varies enormously, so without knowing which TLD you're thinking of it's impossible to guess how viable this way is.
If the TLD uses DNSSEC with NSEC, you can walk the zone by following the NSEC chain. This is the best way if you can't get a file to download, and the fact that a zone is using NSEC is implicit permission to do so.
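To make that concrete, a toy NSEC walker using the dnspython library could look like the sketch below. It assumes the zone is signed with plain NSEC (not NSEC3), uses a placeholder zone name, and does none of the error handling or direct authoritative-server querying a real walker would need.

    # Sketch only: follow an NSEC chain name-by-name (pip install dnspython).
    import dns.name
    import dns.resolver

    def walk_nsec(zone, max_names=50):
        origin = dns.name.from_text(zone)
        current = origin
        names = []
        for _ in range(max_names):
            answer = dns.resolver.resolve(current, "NSEC")
            nxt = answer[0].next        # the "next name" field of the NSEC record
            if nxt == origin:           # the chain wrapped around: done
                break
            names.append(nxt.to_text())
            current = nxt
        return names

    for name in walk_nsec("some-nsec-signed-zone.example."):   # placeholder zone
        print(name)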
If you can't do either of the above, you're down to guessing. You're also working against the wishes of the registry, so be prepared to have your server get blocked from even talking to the TLD's name servers. There are about 6.26*10^98 possible names directly under each TLD (roughly 37^63: labels of up to 63 characters drawn from a-z, 0-9, and the hyphen), so you'll need to send quite a few queries.
Also note that "web servers in my country" is not a very well defined concept. Does that mean all sites with domain names in your country's ccTLD? All sites hosted on servers that are physically in your country? Sites intended for people in your country, no matter their domain name or hosting location?

Monitoring the Full Disclosure mailinglist

I develop web applications, which use a number of third party applications/code/services.
As part of the job, we regularly check the Full Disclosure mailing list (http://seclists.org/fulldisclosure/) for reports about any of the products we use.
This is a slow process to do manually and subscribing to the list would cost even more time, as most reports do not concern us.
Since I can't be the only one trying to keep up with any possible problems in the code I use, others have surely encountered (and hopefully solved) this problem before.
What is the best way to monitor the Full Disclosure mailing list for specific products only?
Two generic ways to do the same thing... I'm not aware of any specific open solution for this, but it would be rather trivial to build.
You could write a daily or weekly cron/Jenkins job that scrapes the previous period's email from the archive, looking for your keywords or keyword combinations, and sends a batch digest of whatever it finds, if anything.
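A bare-bones version of that scrape could be as small as the sketch below; the seclists.org month-index URL pattern is an assumption to double-check, and a real job would pull out the matching subject lines rather than just flagging which keywords appeared.

    # Sketch only: fetch this month's Full Disclosure index and flag keywords.
    # The URL pattern is an assumption; verify it against seclists.org.
    import urllib.request
    from datetime import date

    KEYWORDS = ["wordpress", "openssl", "jquery"]   # the products you care about

    today = date.today()
    month_url = "https://seclists.org/fulldisclosure/{}/{}/".format(
        today.year, today.strftime("%b"))
    html = urllib.request.urlopen(month_url).read().decode("utf-8", "replace").lower()

    hits = [k for k in KEYWORDS if k in html]
    print("keywords seen this month:", ", ".join(hits) or "none")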
But personally, I'd set up a specific email account to subscribe to the various security lists you're interested in. Add a simple automated script that parses the new emails for various keywords or combinations of keywords and, when it finds a match, forwards that email on to you or your team. Just be sure to keep the keyword list updated with new products you're using.
You could even do this with a Gmail account and custom rules, which is what I currently do, but I have set up an internal inbox in the past with a simple Python script to forward emails that were of interest.
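For reference, a stripped-down version of that kind of forwarding script might look like this; the IMAP/SMTP hosts, credentials, addresses, and keyword list are placeholders, and it only looks at subject lines.

    # Sketch only: scan unread mail in a dedicated inbox and forward matches.
    import email
    import imaplib
    import smtplib

    KEYWORDS = ("wordpress", "openssl", "jquery")    # keep this list up to date

    imap = imaplib.IMAP4_SSL("imap.example.com")     # placeholder host
    imap.login("secalerts@example.com", "app-password")
    imap.select("INBOX")

    _, data = imap.search(None, "UNSEEN")
    for num in data[0].split():
        _, msg_data = imap.fetch(num, "(RFC822)")
        msg = email.message_from_bytes(msg_data[0][1])
        subject = msg.get("Subject", "").lower()
        if any(k in subject for k in KEYWORDS):
            with smtplib.SMTP("smtp.example.com") as smtp:   # placeholder host
                smtp.sendmail("secalerts@example.com",
                              ["team@example.com"], msg.as_bytes())

    imap.logout()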

Why ww2 subdomains?

I have seen some domain names on the web with a prefix of ww2 or ww3 or so (ww2.somedomain.example, ww3.yourdomain.example), and these mostly appear when navigating from page to page. What would be the reason for having such subdomains? Is there anything special about them, or are they just another subdomain? I mean, are they useful in any particular context?
People running large(-ish) sites used to do this when they needed to break up the load between more than one server. One machine would be called www then the next one would be called www2, etc.
Today, much better load balancing solutions are available that don't require you to expose your internal machine naming conventions to the browser clients.
Technically, the prefix before the primary domain name (e.g. the "mail" in mail.yahoo.com) is best thought of as a machine name, identifying the web server, mail server, or whatever. It can also identify a group of machines (a web farm).
So the person setting up that machine can call it anything they want; the initials www are just a (somewhat arbitrary) convention.
Oftentimes, ww{x} is used to indicate a particular server in a set of mirrored servers. If properly configured, I could have www.mydomain.example point to my web site on a load balancer, while ww1, ww2, ww3, etc. let me reach the site on a specific load-balanced server, guaranteed.
I can see 3 possibilities:
Make the browser load resources faster: browsers open only a fixed number of connections to the same domain (so as not to overload the server), and extra hostnames work around that limit.
They are using more than one server, so they can share the load between servers.
Separate some content onto a different virtual host or server, as a kind of organization.
As various answers have pointed out, modern-day load balancers can balance load without having to resort to using different subdomains for each machine. However, there is still one benefit of dividing your site into various subdomains: maximizing browser connections.
All browsers limit the number of concurrent connections to a particular host (6 for most modern browsers). If a page contains lots of assets, page load will be slow as the browser queues those requests because of the connection limit. By loading different assets from different subdomains, you get around the connection limit, speeding up page load.
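To make that concrete, here is a tiny sketch of the idea: assign each asset to one of a few static hostnames with a stable hash, so it always comes from the same (cache-friendly) host. The hostnames are made up.

    # Sketch only: spread asset URLs across several hostnames ("domain sharding").
    import hashlib

    SHARDS = ["static1.example.com", "static2.example.com", "static3.example.com"]

    def shard_url(path):
        digest = hashlib.md5(path.encode("utf-8")).hexdigest()
        host = SHARDS[int(digest, 16) % len(SHARDS)]     # stable choice per asset
        return "https://{}/{}".format(host, path.lstrip("/"))

    print(shard_url("/img/logo.png"))
    print(shard_url("/css/site.css"))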
Typically it's a partitioning strategy. When sites get sufficiently large that they can't run (or run well) on a single server you then have to look at solutions for scaling the application out horizontally (i.e. more servers) rather than vertically (i.e. bigger servers).
Some example partitioning strategies are:
Certain users always use certain servers. This can be arbitrary or based on some criteria (user type, geographic location, etc);
When a user gets a session, that session is assigned to a particular server (sometimes called "sticky sessions", although this can also be done where the different machines are transparent to the user); and
Certain activities are always on certain machines.
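As a toy illustration of the first two strategies (with made-up regions and hostnames): route each user to a pool by some criterion, then pin them to one machine in that pool with a stable hash.

    # Sketch only: partition users across backend pools, with sticky assignment.
    import zlib

    POOLS = {
        "eu": ["eu-app1.internal", "eu-app2.internal"],
        "us": ["us-app1.internal", "us-app2.internal", "us-app3.internal"],
    }

    def backend_for(user_id, region):
        pool = POOLS.get(region, POOLS["us"])
        # zlib.crc32 is stable across processes, unlike Python's built-in hash().
        return pool[zlib.crc32(user_id.encode("utf-8")) % len(pool)]

    print(backend_for("alice", "eu"))
    print(backend_for("alice", "eu"))    # same user always lands on the same machine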
Another common case is organizational reasons. In an extremely large company, www might be for their main marketing website. And, ww2 might be, say, for product documentation pages.
In an ideal world, all departments would share perfectly. In practice, a big company might have their (www) marketing pages managed by an external agency and their internal (ww2) pages done by their internal team. Often the marketing agency just doesn't update pages quickly, refuses to run certain stacks, or is too limiting in terms of bureaucratic needs.
The marketing agency may insist on controlling the www and not sharing due to past situations where a company website went down due to internal reasons and yet the agency got blamed, or vice versa.
So, theoretically, there's no need to do this with modern load balancing and such. But, in practice, it can be a lot cheaper and more straightforward, and it can allow better business productivity.

Resources