I am trying to log DNS "leaks", in other words, the DNS servers used by visitors to my web site.
How can my server figure out which DNS server a web request came through (i.e., capture the DNS leak)? The website dnsleaktest.com does it; it knows which DNS server I am coming from. How? It should only be able to see some stats about my browser, and maybe the HTTP referer. How does it know my DNS server?
What is being exploited or used? Put another way, what is the traffic flow from my browser to their server, and where in that flow is dnsleaktest able to get this information?
That's not that easy.
What dnsleaktest probably does: they run their own authoritative DNS server, JavaScript on their website queries various randomly generated subdomains of their domain, and on their DNS server they monitor where the requests for those randomly generated subdomains come from.
To do this yourself, you need a domain hosted on your own DNS servers (not servers provided by your registrar or a hosting provider). You need to monitor queries reaching this server, which can be done by parsing your DNS server's logs, by running your own DNS server software, or by using whatever API hooks your DNS server provides for inspecting incoming requests. Then you write a script for your sites that queries various subdomains and tells a server-side script on your website which subdomain requests it should watch for. The server-side script in turn talks to the DNS server.
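As a rough sketch of how the client side could work, and keeping in mind this is a guess with made-up names (the domain leaktest.example.com and the /result endpoint are not real):

    // generate a random label so the visitor's resolver cannot answer
    // from cache and must query the authoritative server
    var token = Math.random().toString(36).slice(2);
    // loading this URL forces a DNS lookup of <token>.leaktest.example.com;
    // the authoritative server logs which resolver IP asked for the token
    new Image().src = "https://" + token + ".leaktest.example.com/probe.gif";
    // after a short delay, ask the site which resolver it saw for the token
    setTimeout(function () {
        fetch("https://leaktest.example.com/result?token=" + token)
            .then(function (r) { return r.json(); })
            .then(function (d) { console.log("your resolver:", d.resolver); });
    }, 2000);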
All the above is an unverified guess. I see no other way to do it.
Related
Hello, I'm a dev recently diving into Cloudflare's security layers, and I have a few questions about the security of a website deployed to Cloudflare. I'm using Pages, and my domain is hosted directly through Cloudflare Registrar. I'm also using the security layers provided with the Cloudflare infrastructure, including [ Bots, DDoS, Settings, Page Shield ], which can be found in the Security tab of my domain in the Cloudflare dashboard. My questions are listed below:
security layers in use: [ Bots, DDoS, Settings, Page Shield ]
I'm using Firebase Hosting to link my Firebase Functions with the domain hosted by Cloudflare. In this case, do the security layers listed above automatically protect the Firebase Hosting resources and traffic?
I'm using Cloudflare Workers to manage Durable Objects. The Workers' functions are also linked to the same root domain, under a different subdomain. In this case, do the security layers listed above automatically protect the Worker traffic?
The proxy status of the Firebase Hosting connection is "DNS only" mode (not "Proxied" mode), since in Proxied mode the DNS connection does not work (I haven't figured out the reason yet). This makes me feel like the Firebase Hosting resources are not being protected, since the orange switch in the DNS dashboard is turned off.
Please note that the Cloudflare plan is Pro.
Thank you in advance [:
For the products you are listing, Cloudflare is implemented as a reverse proxy.
This means that, from an end user's perspective, when they try to connect to your services, their traffic reaches Cloudflare first (since a proxied record resolves to a Cloudflare anycast IP). Cloudflare applies the configured features and security services, then forwards the HTTP requests to your origin infrastructure as specified in your Cloudflare DNS tab. This is true whenever traffic is directed at proxied records.
For records in DNS-only mode, Cloudflare only performs DNS resolution (answering the DNS query for that record). Once this is done, the client connects directly to the specified resource, and the traffic does not flow through the Cloudflare network, meaning Cloudflare cannot provide proxy services in this scenario.
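A quick way to see the difference is to resolve both kinds of records yourself; in this hypothetical example (hostnames and addresses are placeholders), app.example.com is proxied and fb.example.com is DNS-only:

    $ dig +short app.example.com     # proxied: answers with a Cloudflare
    104.16.1.1                       # anycast edge address, not your origin

    $ dig +short fb.example.com      # DNS-only: answers with the origin's
    203.0.113.7                      # real address; clients connect directly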
For a full explanation, I recommend the following documentation page.
I'll get straight to the point.
I have bought a domain. I want to host it on my own computer, maybe on a Raspberry Pi, since only 50 people per month will visit it.
Everywhere I've looked I see two hosting methods: using IIS/WAMP/XAMPP to only create a localhost website, or adding the domain to C:\Windows\System32\drivers\etc\hosts, which is also... localhost.
What is the magic answer here? How can I host a LIVE website with the domain I have bought? Am I stuck with using a hosting service? Am I missing something really important?
using IIS/WAMP/XAMPP to only create a localhost website
You need an HTTP server if you want to host a website. It needs to run on the computer you want to host the website from.
adding the domain to C:\Windows\System32\drivers\etc\hosts
That's what you do as a poor man's solution instead of buying a domain name.
The Domain Name needs to be associated with a DNS server (and secondary DNS server) by your registrar.
Usually, a registrar will provide DNS hosting services as part of the deal.
The DNS server needs an A record pointing at the IP address of the computer running the web server. This IP address needs to be reachable by whoever is going to visit the site (which almost certainly means it must be public-facing) and should be static (unless you want to play games with very short TTL values and frequent reconfiguration of the DNS records).
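As an illustration, the DNS side is a single record per name; a hypothetical zone-file entry (the domain and address are made up) could look like:

    ; point the bare domain and www at the machine running the web server
    yourdomain.example.      3600  IN  A  203.0.113.42
    www.yourdomain.example.  3600  IN  A  203.0.113.42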
If you plan to host multiple different websites on the same server, you'll probably want to configure the HTTP server software to handle virtual name hosting (whereby it pays attention to the Host header in the request and dynamically serves different content based on it).
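For the virtual-name-hosting part, a minimal Apache sketch could look like this (all names and paths are placeholders, not a definitive setup):

    # two sites on one IP address, distinguished purely by the Host header
    # (Apache 2.2 and earlier also need a "NameVirtualHost *:80" line)
    <VirtualHost *:80>
        ServerName www.firstsite.example
        DocumentRoot /var/www/firstsite
    </VirtualHost>

    <VirtualHost *:80>
        ServerName www.secondsite.example
        DocumentRoot /var/www/secondsite
    </VirtualHost>

A request whose Host header says www.secondsite.example is served from /var/www/secondsite; anything else falls through to the first (default) virtual host.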
I've been working at a company where there is a DNS server that can resolve our company network's addresses. But a problem comes up when I want to use Google, because it's blocked in China.
I have a VPN to solve this problem, but it's too slow. So I chose to use another DNS server instead, and it works.
But if I use that DNS server, I cannot visit my company's website, and if I use my company's DNS server, I cannot visit Google.
Is there any way to have my computer use the company's DNS server when visiting the company's website, and the other DNS server when visiting Google, Twitter, etc.?
Note that the IP addresses are always changing, so a hosts file doesn't work.
I would recommend dnsmasq, where you can set this up using the --server parameter:
--server=/google.com/1.2.3.4: this will send all *.google.com queries to server 1.2.3.4
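A minimal dnsmasq configuration for this case could look like the sketch below (the company domain and both addresses are placeholders):

    # /etc/dnsmasq.conf
    # send queries for the company domain to the internal DNS server
    server=/corp.example.com/10.0.0.53
    # send everything else to the other (public) resolver
    server=8.8.8.8

With this in place, you point your computer's DNS at the local dnsmasq instance, and it routes each query to the right upstream server by domain.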
Can anybody explain why I see another web site at my HTTPS address? I don't have a certificate. I can manage files in the httpsdocs folder, but I cannot access them from the web. Also, httpsdocs is empty. https://innovacube.com/
And the root of my problem is that Google indexes my HTTPS domain, but I cannot deny Googlebot.
You're sharing an IP address with another site - you'll see that both www.innovacube.com and www.cokyader.com resolve to 92.199.202.62. HTTP/1.1 allows this because the browser also sends a Host header:
GET / HTTP/1.1
Host: www.innovacube.com
so the web server knows which site to serve for a given connection.
It isn't, however, possible to do this for HTTPS. The problem is that the certificate negotiation happens before the server sees any HTTP, so it can't choose a certificate based on the site the browser really means. Therefore you can only host one HTTPS site per IP, and this IP is hosting cokyader.com. (The TLS Server Name Indication (SNI) extension was designed to fix this by naming the desired host during the handshake, but support was far from universal when this was written.)
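You can check for yourself what the shared IP serves during the TLS handshake; a sketch using openssl (the IP is the one from above, and the output will depend on the server):

    # which certificate does the shared IP present by default?
    openssl s_client -connect 92.199.202.62:443 </dev/null 2>/dev/null \
      | openssl x509 -noout -subject

    # with SNI, the client names the desired site during the handshake
    openssl s_client -connect 92.199.202.62:443 \
      -servername www.innovacube.com </dev/null 2>/dev/null \
      | openssl x509 -noout -subject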
If you want your own separate HTTPS site then your host will have to allocate you your own separate IP address.
Because you are using shared hosting, and it has been configured with a default SSL site that isn't yours (but presumably belongs to someone who has paid for SSL support and has their site hosted on the same server).
I am creating software that allows users to either have their own custom subdomain (e.g: theirsubdomain.mydomain.com) or point a CNAME from their own domain to my website address (e.g: theirsubdomain.theirdomain.com).
I've contacted my host about this, and the first subdomain option is fine: they will set up a wildcard subdomain configuration for me...
The CNAME option, they said, I can't do automatically: I will have to manually go into my account and add the domain pointing to my website address; otherwise Apache won't know where to look for the files.
Is this common practice or is there a way around this that is automated?
The issue is the HTTP Host header. When you request a web page, the browser sends a request that starts out with:
GET /mypage.html HTTP/1.1
Host: www.mysite.com
The Host line allows a single web server to serve pages for multiple domains. By looking at the Host header, the server knows that mypage.html should come from its stored files for mysite.com, and not from the files of myothersite.com, which sits on the same server.
I am guessing your site is on a shared web server at your host company, and they use this functionality to differentiate between requests for your site and requests for other sites that sit on that same shared box. Some of these hosting providers, like HostGator, will allow you to specify other domains that should be accepted on this Host line and where the returned documents should come from. This is often a more premium service: for example, HostGator says "The Baby and Business hosting plans allow for unlimited domains to be hosted on just one single account", while the basic Hatchling plan does not allow this.
If you have your own rented machine, with your own installation of Apache, you can manage the processing of this HTTP header information yourself. Apache supports virtual hosts, see the following documentation: http://httpd.apache.org/docs/2.2/vhosts/
So basically, you have to have some way to tell Apache (or whatever server you are using) which files correspond to a particular Host value, since a single Apache server may be serving files for hundreds of different domains. If you are not administering your own Apache server, where you could set up virtual hosts as shown in the documentation, the hosting service has to provide some custom way to get this information to Apache.
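If you did administer your own Apache, the configuration your host is doing by hand boils down to something like this sketch (all names and paths are made up, based on the examples in the question):

    # one virtual host answering for the wildcard subdomains and for any
    # customer domain that is CNAME'd at your address
    <VirtualHost *:80>
        ServerName mydomain.com
        # matches theirsubdomain.mydomain.com and every other subdomain
        ServerAlias *.mydomain.com
        # a customer's own domain has to be added explicitly; this is
        # the manual step your host described
        ServerAlias theirsubdomain.theirdomain.com
        DocumentRoot /var/www/app
    </VirtualHost>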