Domain network: users receiving "Invalid URL" or "Your connection is not secure" in the browser

I am having an issue that is affecting multiple users on my domain.
Whenever I try to navigate to www.google.com I am unable to connect; I get either "Invalid URL" or "Your connection is not secure".
I have checked my clock; it appears to be fine and is handled by the DC.
I have also cleared the browser cache in Chrome, and I have run:
ipconfig /release
ipconfig /flushdns
ipconfig /renew
This did not fix the issue either.
When I went into my IPv4 settings and changed my DNS to Google's (8.8.8.8), I was able to connect and everything worked as normal, but in doing so I lose access to my domain resources, so this is not really a fix...
This issue has come and gone a lot over the past few weeks, but since it never seemed to affect more than one user I didn't pay it much attention. However, it now seems to be affecting more than one user at a time, so I need to figure out what is going on...
I have also tried different browsers.
Thank you.
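One way to make the DNS comparison above concrete is to dump what the machine's current resolver returns for a name, once while pointed at the DC and once while pointed at 8.8.8.8, and diff the output. A minimal sketch using only the system resolver:

```python
import socket

def resolve(name):
    """Return the sorted IPv4 addresses the system resolver gives for name."""
    try:
        return sorted({info[4][0] for info in socket.getaddrinfo(name, None, socket.AF_INET)})
    except socket.gaierror:
        return []

# 'localhost' should always come back as loopback; swap in "www.google.com"
# to compare answers between the DC's DNS and 8.8.8.8.
print(resolve("localhost"))
```

If the DC hands back stale or hijacked addresses for www.google.com while 8.8.8.8 returns valid ones, the problem is on the DC's DNS (forwarders, cache, or a conditional forwarder gone bad) rather than on the clients.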

Related

How to bypass a 403 error after being blocked?

I am working on some PHP files on shared hosting. I am constantly making changes to the files and then running them in a browser to see the changes.
After a few hours of work, I repeatedly start getting 403 errors and have to wait a few hours to "get back in".
I can open the files in a browser on my mobile phone on the same wifi connection, so it's the same IP, but I cannot open them in a different browser on my computer.
I assume the hosting company is blocking me because of the frequent loads. Some of the files I am working on also send out emails via PHPMailer, and soon after the 403 errors start, if I keep loading the pages in my mobile browser, PHPMailer stops working too; probably the hosting company blocking me again. Switching to my mobile browser doesn't help with the PHPMailer error.
What I am wondering is if there is some way to bypass the 403 error without using a different device. The PHPmailer error doesn't bother me as much, but a solution to that would also be helpful.
Additional information that may be useful... or not:
I use .htaccess to prevent all IPs except mine from accessing the files.
I tried deleting cookies and changing the DNS.
I also tried ipconfig /flushdns and ipconfig /all.
I have no idea whether those experiments were good ideas or not, though... :/
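For reference, the .htaccess IP allowlist described above usually looks something like this (Apache 2.4 syntax; 203.0.113.5 is a placeholder for your own address):

```apache
# Deny everyone except the listed IP (Apache 2.4+)
<RequireAny>
    Require ip 203.0.113.5
</RequireAny>
```

Worth double-checking in this situation: if your public IP ever changes (common on residential connections), this rule itself starts returning 403s, independently of anything the hosting company does.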

403 Forbidden - ONLY from my computer

Problem: when I visit a particular URL, the page goes white, the tab title says "403 Forbidden", and nothing else happens.
I wanted to ask the forum about this weird problem where I, and only I, cannot access one particular website; then I ran right into the very same problem with yet another site just now.
Apparently the problem lies with my computer, and my computer only.
The site IS accessible by everyone except me.
I cleared the cache, removed all the junk files, and restarted the modem and the computer a couple of times already, to no avail.
I can access those sites via my phone; I can access them from anywhere but my computer.
Has anyone stumbled upon such a problem?
There could be a variety of reasons behind this: your browser, a browser extension, your IP address blacklisted for some reason (usually spamming), some sort of virus that gets detected by the remote side's antivirus and blocks your connections, some adware or spyware that rewrites your requests...
It's also possible, if you only have problems with those two websites, and especially if you did something you were not supposed to, that they simply blocked you.
So try using a proxy server or a VPN such as StrongVPN. If that works, then at least you know it has something to do with your IP address.
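If you want to run the proxy test from the same machine without installing anything, Python's urllib can route a request through an explicit proxy. The proxy address below is a placeholder, not a real endpoint:

```python
import urllib.request

# Placeholder proxy address: substitute a proxy or VPN gateway you actually have.
proxy = urllib.request.ProxyHandler({
    "http": "http://203.0.113.10:8080",
    "https": "http://203.0.113.10:8080",
})
opener = urllib.request.build_opener(proxy)

# With a real proxy, uncomment the next line: if it succeeds while a direct
# request gets 403, the block is tied to your IP address.
# print(opener.open("https://example.com/").status)
```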

Access internet via Apache2 ProxyPass

Recently, I made a setup where I pointed some websites to a redirect server. The redirect server in turn served the website requests using the ProxyPass directive of Apache2. It worked like a charm, without a single problem, for my websites.
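For context, the redirect-server setup described above boils down to a standard Apache reverse-proxy vhost; a minimal sketch (hostnames and the backend address are placeholders) looks like this:

```apache
# Requires mod_proxy and mod_proxy_http to be enabled.
<VirtualHost *:80>
    ServerName www.example.com
    ProxyPreserveHost On
    ProxyPass        "/" "http://backend.example.com/"
    ProxyPassReverse "/" "http://backend.example.com/"
</VirtualHost>
```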
So, based on that, I have an idea: access the internet via Apache2. Please note this is because I do not have access to fast internet, and every provider here is too lousy to offer better connection speeds even for the lot of money I pay them.
For me, HTTPS has better speeds than VPN.
So the idea is to drop the VPN and SSH tunnel redirects and instead resolve every domain on my Mac to a single server IP address: a redirect server that in turn fetches every web request made from my Mac. Is that possible? It would mean I always use HTTPS to my own redirect server. HTTPS is faster than VPN for me whenever I try it; when I am on VPN things are too slow, maybe because of the level of encryption. Please note that I do not want a solution using PPTP, L2TP, or anything else lighter than OpenVPN (which I use via Pritunl).
Please let me know if anything like that is possible and if yes then how.
Even if it cannot work, my mind keeps coming back to this idea. I just want someone to shed light on it and shoot it down if it is hopeless. Thanks in advance.
I have also seen some proxy sites where I put any website link into their page and it works like a browser, as if I were surfing on their remote server. Maybe something like that could be useful and speedy for me, but I do not want to use them because I do not trust those sites' security. No way.
Got a solution myself without any kind of VPN.
Actually, I needed to make my DNS and the connections to my server apps secure. For that I tried DNSCrypt-Proxy, and it is working great, resolving my DNS queries over HTTPS (port 443).
I am also using a Chrome add-on for "always HTTPS" connections, which blocks every plain-HTTP request in Chrome. Perfect!!!
So now all surfing traffic on my Mac goes over HTTPS and is far harder to eavesdrop on. I do not care about other connections made by my other Mac apps; I just care about security while I am browsing or making payments for shopping.
DNSCrypt-Proxy:
Go to https://dnscrypt.org/#dnscrypt-osx, where you will find all the help on how to install and run it on your Mac.
brew install dnscrypt-proxy --with-plugins
sudo dnscrypt-proxy --ephemeral-keys --resolver-name=cisco
^ You can find the resolver name in the spreadsheet that comes with this package.
Then add a DNS entry in your network interface settings pointing to 127.0.0.1, and remove all other entries.
"Always HTTPS for Chrome":
https://chrome.google.com/webstore/detail/https-everywhere/gcbommkclmclpchllfjekcdonpmejbdp?hl=en
Enjoy solid security on your Mac, as long as you do not need IP address anonymity. Always use legal stuff!!!

Azure Traffic Manager with my own SSL cert?

I've been using Azure to host my Web Apps for a while now and they've had my own wildcard cert attached to various ones with no problem. Recently, however, one of my clients has wanted a certain degree of uptime/performance (not that there have been any problems so far but they are willing to pay for it and who am I to turn down money) so I've set up mirrored sites and am using traffic manager to route between them.
It works like a charm except for one problem: I have a CNAME pointing a friendly URL to the traffic manager address, and if I try to connect via HTTPS, it craps out and wants to use its own *.azurewebsites.net cert no matter what I try.
So my question is: am I missing something here? How do I use my own custom *.mycompany.com cert in this case?
Or, for that matter, is there a better way of doing what I'm ultimately trying to accomplish here?
Here is my set up:
Endpoint 1: MyWebApp-East (type: Azure Endpoint, SSL installed and proper host info added)
Endpoint 2: MyWebApp-West (type: Azure Endpoint, SSL installed and proper host info added)
Traffic Manager: Routing Type - Performance
UPDATE
Oddly enough, I got it to work. I must have had something wrong somewhere. I did a scorched earth approach to it by deleting EVERYTHING (sites, traffic manager, dns entries, etc) and starting over. It works perfectly now!
To close this out, I'll repost the solution from my update: I took a scorched-earth approach, deleted EVERYTHING (sites, traffic manager, DNS entries, etc.) and started over, and it now works perfectly.
Sometimes to go forwards, you have to destroy everything.

Why do all *.dev domains point to my localhost?

When you type a *.dev domain, for example juas.dev, it points to localhost. Does anyone know why?
(My hosts file is not modified, and the request doesn't go outside.)
I have just suffered this exact issue, and it was driving me crazy. I couldn't access any .dev domains on the web. Trying to load a .dev website caused a security warning in Chrome, because something on my machine was using a default self-signed certificate and resolving all .dev domains to 127.0.0.1.
If your operating system is Mac OS X, this might solve it, as it worked for me.
In Terminal, type this command:
cd /etc/resolver/ && ls
If you see an entry named dev when you hit Enter, then chances are this is a wildcard resolver pointing all your attempts to access .dev domains to 127.0.0.1, i.e. localhost.
Simply renaming this file gets rid of it. You need admin permissions, so (assuming you are still in /etc/resolver/) run this command:
sudo mv dev dev.ignore
You should instantly be able to access .dev domains on the web.
If moving it doesn't work, you can try deleting it with sudo rm dev as a last resort.
Do a lookup for a TXT record at that name and things will become clearer :-)
.DEV is a recently delegated top-level domain, currently in an initial period where it's seeded with data meant to alert people using it privately that they're about to have a problem. Part of that is returning only the address 127.0.53.53, which is distinctive enough to stand out in log files and the like, but also sits inside the 127.0.0.0/8 block that is defined as IPv4 loopback. That is why you end up reaching your own machine. In a few months or so, you'll almost certainly start getting NXDOMAIN instead.
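The point that 127.0.53.53 is loopback, despite not being the familiar 127.0.0.1, can be checked directly with Python's ipaddress module:

```python
import ipaddress

# 127.0.53.53 is the "name collision" sentinel address; it sits inside the
# IPv4 loopback block 127.0.0.0/8, so connections to it hit your own machine.
addr = ipaddress.ip_address("127.0.53.53")
print(addr in ipaddress.ip_network("127.0.0.0/8"))  # True
print(addr.is_loopback)  # True
```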