Using a DNS-based system to unblock a blocked URL? [closed]

Closed as off-topic 10 years ago; it is not accepting answers.
Okay, I understand that this might be a silly question.
I'm looking to unblock YouTube in my country. I'm quite sure it's a simple address/URL block. I currently have to use proxies, which reduce the connection speed. I tried opening YouTube by its IP address, but that IP actually serves Google.com, so it's of no use.
I was also thinking of creating a DNS entry on one of my subdomains that points to YouTube's URL in some way, but that may not be possible, as I don't really know how DNS works at all, so some educated guesses would help. I'm also not aware of any other hidden URLs that point to YouTube, or whether such URLs even exist; those would help as well.

Maybe using a VPN connection to a provider that does not block the traffic would help? This one, for example: http://privateinternetaccess.com
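If you want to check whether the block is actually DNS-based before paying for a VPN, comparing your ISP's resolver against a public one can tell you. Below is a minimal sketch, assuming the third-party dnspython package (version 2.0 or later); the domain and the 8.8.8.8 resolver are just examples.

```python
# Minimal sketch: compare A records from the system resolver (usually the ISP's)
# with those from a public resolver. Assumes dnspython >= 2.0 is installed.
from typing import Optional

import dns.resolver

def resolve_a(domain: str, nameserver: Optional[str] = None) -> set:
    resolver = dns.resolver.Resolver()
    if nameserver:
        resolver.nameservers = [nameserver]  # override the system resolver
    return {rr.to_text() for rr in resolver.resolve(domain, "A")}

isp_answer = resolve_a("youtube.com")                 # whatever your ISP resolver returns
public_answer = resolve_a("youtube.com", "8.8.8.8")   # Google Public DNS for comparison
print("ISP resolver:   ", isp_answer)
print("Public resolver:", public_answer)
```

If the two answers differ, or the ISP answer points somewhere unexpected, the block is likely at the DNS level and switching resolvers may be enough. If they match and the site is still unreachable, the block is happening at the IP or HTTP layer, and a DNS record on your own subdomain will not get around it.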

Related

Is it possible to make a website that is accessible only from specific locations? [closed]

Closed as needing more focus 7 years ago; it is not accepting answers.
I am planning to make a small blog website which can be accessed from our state only. Is there any way to reach this goal? Please help me out.
You cannot do that, simple as that.
You can try a few things to raise the bar, but a determined attacker will be able to overcome the restriction.
Depending on your definition of "state", you can try a simple firewall. That can be easy if the state maps to a range of IP addresses, but it may be just as easy to bypass with a VPN.
You can also add authentication and only allow users who can pass it. You then need a process for granting login details only to the specific users you want.
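As a concrete illustration of the firewall idea, here is a minimal sketch of an application-level IP allow-list using only the Python standard library. The CIDR ranges are placeholders (TEST-NET blocks), not real allocations for any state, and as noted above a VPN still defeats it.

```python
import ipaddress

# Placeholder ranges; replace with your state's actual allocations.
ALLOWED_NETWORKS = [
    ipaddress.ip_network("203.0.113.0/24"),
    ipaddress.ip_network("198.51.100.0/24"),
]

def is_allowed(client_ip: str) -> bool:
    """Return True if the client IP falls inside one of the allowed ranges."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in ALLOWED_NETWORKS)

# In a web framework you would call is_allowed(request.remote_addr)
# and return a 403 when it is False.
```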

Is there any reason not to redirect all HTTP traffic to HTTPS? [closed]

Closed as opinion-based 7 years ago; it is not accepting answers.
I just wrote a rule in my .htaccess to redirect all HTTP traffic to HTTPS. It seems pretty common, but I was just wondering if that could have any negative effect. As far as I know, it actually helps as far as SEO is concerned. Is there any scenario where a user wants non-secure access, can't access a secured site, or anything like that? Or am I missing something else besides SEO and accessibility?
There's a post about this on Server Fault. The consensus appears to be that this is a good idea.
This blog post covers some of the drawbacks. There's also this post from the Information Security Stack Exchange.
If you use AdSense, you might see a decrease in earnings due to the forced SSL compliance.
Your site may also perform somewhat differently than it would over plain HTTP, since TLS adds handshake and encryption overhead.
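For completeness, the behaviour being discussed (answer every plain-HTTP request with a permanent redirect to the HTTPS version of the same URL) can be sketched outside Apache as well. This is only an illustration of the mechanism, not the asker's actual .htaccess rule; a standard-library Python listener on port 8080 is assumed purely for the demo.

```python
# Minimal sketch: answer every HTTP request with a 301 redirect to HTTPS.
from http.server import BaseHTTPRequestHandler, HTTPServer

class RedirectToHTTPS(BaseHTTPRequestHandler):
    def do_GET(self):
        host = self.headers.get("Host", "example.com")  # placeholder fallback host
        self.send_response(301)                         # permanent redirect, which search engines honour
        self.send_header("Location", f"https://{host}{self.path}")
        self.end_headers()

    do_HEAD = do_GET  # HEAD requests get the same redirect

if __name__ == "__main__":
    HTTPServer(("", 8080), RedirectToHTTPS).serve_forever()
```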

Modify HTTP packets in a Linux gateway [closed]

Closed as off-topic 10 years ago; it is not accepting answers.
Greets,
I have CentOS installed as a gateway, and some clients connect to the internet via this gateway (NAT).
Now I want to insert a string into each web page the clients request. How can I achieve this? With netfilter, winpcap, or something else?
Any comments will be appreciated :-)
Ideally, don't. Doing this blindly will break a lot of web pages, especially ones which make heavy use of AJAX. (Because your inserted strings will end up in places where they will cause errors, like JSON responses.)
If you must, the term for what you're trying to do is "transparent proxying". Squid supports this: http://wiki.squid-cache.org/SquidFaq/InterceptionProxy
Modifying the response content requires something that knows how to parse and correctly change that content. That means you can't do it at the packet layer (layer 3, where NAT is also implemented) but you need something at the application layer (layer 7). Application level gateways are usually called proxies :-)
And since this question is actually in the wrong forum, I will stop answering for now :-) Please move it to, e.g., Server Fault.
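To make the layer-7 point concrete, here is a heavily simplified sketch of the kind of application-level proxy involved. It assumes clients are explicitly configured to use it as an HTTP proxy on port 3128; real transparent interception would additionally need an iptables REDIRECT rule and something production-grade like the Squid setup linked above. It also handles only plain-HTTP GETs, not HTTPS.

```python
# Simplified proxy sketch: fetch the requested page and inject a marker string,
# but only into text/html responses, so JSON/AJAX responses are left untouched
# (the breakage warned about above).
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

INJECTED = b"<!-- injected by gateway -->"

class InjectingProxy(BaseHTTPRequestHandler):
    def do_GET(self):
        # When a client uses this as an HTTP proxy, self.path is the absolute URL.
        upstream = urlopen(self.path)
        body = upstream.read()
        content_type = upstream.headers.get("Content-Type", "")
        if "text/html" in content_type:
            body = body.replace(b"</body>", INJECTED + b"</body>")
        self.send_response(upstream.status)
        self.send_header("Content-Type", content_type or "application/octet-stream")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("", 3128), InjectingProxy).serve_forever()
```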

How to get the domain country? [closed]

Closed as off-topic 10 years ago; it is not accepting answers.
I'm having trouble understanding how certain websites use different domains for each country. In a nutshell, how does, say, Myspace have uk.myspace.com, fr.myspace.com, etc.?
Do they put the main files above the root and then have individual subdomains for each country, or do they have something weird going on in terms of country detection?
I can't find anything about this anywhere online.
Thanks.
There is unlikely to be a single server involved, so talking about "files above the root" is meaningless. You'll be talking about some kind of fairly advanced routing infrastructure hiding dozens of different servers across many different locations. The routing logic is the part that decides which group of machines will be responsible for handling a given request.
The forwarding part is indeed "weird country detection", in the sense that some machine is responsible for performing an IP lookup and redirecting the user to an appropriate (possibly-geographically-closer) host. This might be done for performance reasons, or it might be done for content localisation and SEO reasons (e.g., the default language).
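As a sketch of what that IP lookup and redirect step might look like, here is a small example assuming the third-party geoip2 package and a local GeoLite2-Country.mmdb database; the subdomain map and the myspace.com hostnames are purely illustrative.

```python
import geoip2.database

COUNTRY_SUBDOMAINS = {"GB": "uk", "FR": "fr", "DE": "de"}  # illustrative mapping

def localized_redirect(client_ip: str, path: str = "/") -> str:
    """Return the URL to redirect a client to, based on a GeoIP country lookup."""
    reader = geoip2.database.Reader("GeoLite2-Country.mmdb")
    try:
        country = reader.country(client_ip).country.iso_code
    finally:
        reader.close()
    sub = COUNTRY_SUBDOMAINS.get(country)
    host = f"{sub}.myspace.com" if sub else "www.myspace.com"
    return f"https://{host}{path}"  # value for a 302 Location header

# e.g. localized_redirect("203.0.113.7") returns a uk./fr./www. URL,
# depending on what the database says about that address.
```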

Configure IIS for serving static files / CDN? [closed]

Closed as off-topic 11 years ago; it is not accepting answers.
Is there any way to configure IIS to serve static files like a CDN does, with things like gzip compression, caching, ETags, and Last-Modified headers?
And how should we configure it to stay robust under massive request volumes?
I know it's a short question, but that's all I want to ask.
As for the IIS version, I prefer IIS 6 and 7. You can base your answer on either 6 or 7 :)
Thanks! I hope people find this question useful!
Regardless of the server (IIS, Apache, etc.):
You achieve robustness through scale. Put 1000-2000 servers behind a few hardware load-balancers (F5's). Monitor constantly to ensure they all have the same files. Secure your file system so writes are allowed by only 1-2 users. Spread them geographically across network providers, power suppliers and backhoe opportunities.
Or just outsource it to someone who's already done all of this.
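To tie the server-agnostic point to the specific headers the question asks about, here is a small standard-library Python sketch of what "CDN-style" static serving amounts to (gzip, Cache-Control, ETag, Last-Modified). In IIS itself these would normally be switched on through static/dynamic compression and client-cache settings rather than hand-written code; the function here is only illustrative.

```python
import gzip
import os
from email.utils import formatdate

def static_file_headers(path: str, accept_encoding: str = ""):
    """Return (body, headers) for a static file, CDN-style."""
    stat = os.stat(path)
    with open(path, "rb") as f:
        body = f.read()

    # Weak validator derived from size + mtime, similar to what common servers emit.
    etag = 'W/"%x-%x"' % (stat.st_size, int(stat.st_mtime))
    headers = {
        "ETag": etag,
        "Last-Modified": formatdate(stat.st_mtime, usegmt=True),
        "Cache-Control": "public, max-age=86400",  # let clients and CDNs cache for a day
    }
    if "gzip" in accept_encoding:
        body = gzip.compress(body)                 # compress only when the client accepts it
        headers["Content-Encoding"] = "gzip"
    headers["Content-Length"] = str(len(body))
    return body, headers
```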
