For local development I'm running a local web server with virtual hosts to manage multiple web projects that each need their own URL. Normally I use URLs like myproject.com.local, while the real project lives at myproject.com. Everything works fine in Safari, IE and Firefox, but Google Chrome throws a 404. As far as I know, Chrome has some kind of intelligent address bar. Is there any way to get it working with all domains?
Best Regards,
Bernd
I think it should work with all domains, as long as your workstation's DNS can resolve the name to an IP address. Also, check whether you have any proxy settings in Chrome; sometimes it helps to tick the 'Bypass proxy for local domains' checkbox (somewhere in the settings).
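If the name doesn't resolve, a hosts-file entry is the quickest fix. For example, using the hostname from the question, in /etc/hosts on macOS/Linux (or C:\Windows\System32\drivers\etc\hosts on Windows):

    127.0.0.1    myproject.com.local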
Also, when you request non-standard domains or port numbers, make sure to put http:// in front of your URL.
Good luck.
So, I am trying to implement a SharePoint intranet site for an organization. However, there is one application in particular that they would like a link to on the homepage. Unfortunately this application can only be used via the IE Tab Google Chrome extension (I know, dumb), but the app devs have yet to add Chromium compatibility.
Anyway, the link looks like this:
chrome-extension://hehijbfgiekmjfkfjpbkbammjbdenadd/nhc.htm#url=https://website.com/sub/sub.Hub.aspx
But SharePoint requires https:// at the beginning of a link.
If you throw that destination into Chrome directly it navigates fine, but if you add, say, https://google.com/ or https://*/ to the front, it doesn't work.
Is there a syntax that will allow me to put https:// on the front of this without getting a 404 error?
Never mind, I ended up redirecting this through IIS internally.
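For anyone in the same spot, a minimal sketch of that kind of internal redirect, using the stock IIS httpRedirect element (the site path is hypothetical, and this assumes the browser permits redirecting to the extension's page):

    <!-- web.config for an IIS site/virtual directory such as https://intranet/hubredirect -->
    <configuration>
      <system.webServer>
        <httpRedirect enabled="true"
                      destination="chrome-extension://hehijbfgiekmjfkfjpbkbammjbdenadd/nhc.htm#url=https://website.com/sub/sub.Hub.aspx"
                      httpResponseStatus="Found" />
      </system.webServer>
    </configuration>

SharePoint then links to the plain https:// URL of that IIS site.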
Recently, I set up a redirect server and pointed some websites at it; the redirect server in turn served the website requests using Apache2's ProxyPass directive. It worked like a charm, without a single problem for my websites.
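For context, a minimal sketch of that kind of vhost (hostnames are hypothetical; requires mod_proxy and mod_proxy_http):

    <VirtualHost *:80>
        ServerName www.mysite.example
        # forward every request to the real origin and rewrite response headers back
        ProxyPass        "/" "http://origin.example/"
        ProxyPassReverse "/" "http://origin.example/"
    </VirtualHost>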
So, based on that, I've got the idea to access the internet via Apache2. Please note this is because I do not have access to fast internet, and every internet provider here is too lousy to provide decent connection speeds even for the lot of money I pay them.
For me, HTTPS has better speeds than a VPN.
So, the idea is to get rid of the VPN and SSH-tunnel redirects and instead resolve every domain on my Mac to a single server IP address: a redirect server, which in turn fetches every web request made from my Mac. Possible? That way I would always be using HTTPS to my own redirect server. Whenever I try a VPN, things are too slow for me, maybe because of the level of encryption. Please note that I do not want a solution using PPTP, L2TP, or anything else lighter than OpenVPN (using Pritunl).
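For what it's worth, the DNS half of the idea is doable; dnsmasq, for example, can answer every query with one address (the server IP below is a placeholder):

    # dnsmasq.conf: resolve every domain to the redirect server
    address=/#/203.0.113.10

The redirect server would then have to recover the intended host from each request's Host header (or SNI for HTTPS) to know where to proxy it.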
Please let me know if anything like that is possible, and if so, how.
Even if it is not workable, my mind keeps coming back to this idea. I just want someone to shed light on it, or shut it down if it's the worst idea by far. Thanks in advance.
Also, I have seen some proxy sites where I put any website link into their page and their site acts like a browser, as if I were surfing from their remote server. Maybe something like that could be useful and speedy for me, but I do not want to use them because I do not trust those sites with security. No way.
Got a solution myself without any kind of VPN.
Actually, I needed to make my DNS secure and my connections to my server apps secure. So I tried DNSCrypt-Proxy, and it's working great, resolving my DNS queries over HTTPS (port 443).
And I am using a Chrome add-on for "always HTTPS" connections, blocking every HTTP request in Chrome with it. Perfect!
So now all surfing traffic on my Mac goes over HTTPS and is reasonably safe from eavesdroppers. I do not care about other connections made by my other Mac apps; I just care about the security of the apps I'm surfing in, and of any payments I make while shopping.
DNSCrypt-Proxy:
Please go to https://dnscrypt.org/#dnscrypt-osx, where you will find all the help on how to install and run it on your Mac.
    # install DNSCrypt-Proxy via Homebrew (with optional plugins)
    brew install dnscrypt-proxy --with-plugins
    # run it with ephemeral keys, using the resolver named "cisco"
    sudo dnscrypt-proxy --ephemeral-keys --resolver-name=cisco
(You can find the resolver names in the CSV resolver list that ships with the package.)
Then add a DNS entry in your network interface settings pointing to 127.0.0.1. Please note: remove all other DNS entries.
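On macOS this can also be done from the terminal (assuming your network service is called "Wi-Fi"):

    # point the Wi-Fi service's DNS at the local dnscrypt-proxy
    networksetup -setdnsservers Wi-Fi 127.0.0.1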
"Always HTTPS for Chrome":
https://chrome.google.com/webstore/detail/https-everywhere/gcbommkclmclpchllfjekcdonpmejbdp?hl=en
Enjoy solid security on your Mac, as long as you do not care about IP-address anonymity. Always use it for legal stuff!!!
I've got a requirement to detect whether a web page is being served on the internet or an intranet; i.e., given a URL of https://accessibleanyway.com, is the phone connected to the work Wi-Fi, or to something else like home Wi-Fi or the mobile network?
What different ways are there to do this?
(1) Use WebRTC to get the local IP address. Not widely supported.
(2) Try to access a local web page using JSONP/CORS/iframe.
The problem with (2) is that the web page is HTTPS and the local resource is likely to be HTTP, which you can't mix in IE as far as I know. If I make the local resource HTTPS then it's served via a self-signed cert, which means installing CAs on the phones (can you even buy certificates for an intranet host any more?).
Any suggestions?
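For reference, option (1) above would look roughly like this (browser support varies, and modern browsers often return an obfuscated mDNS hostname rather than the raw local address):

    // sketch: read local ICE candidates via WebRTC
    const pc = new RTCPeerConnection();
    pc.createDataChannel('probe');                      // at least one channel is needed to gather candidates
    pc.onicecandidate = (e) => {
      if (e.candidate) {
        console.log(e.candidate.candidate);             // the candidate line contains the local address
      }
    };
    pc.createOffer().then((offer) => pc.setLocalDescription(offer));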
The problem with (2) was that the same page was trying to use both HTTP and HTTPS, and even with an iframe you get issues.
What you could do instead is start on an HTTP loading page and use an iframe to access a local resource which you can only reach if you are on the intranet; JSONP will work fine for this (see the sketch below). Once that has worked or failed, redirect to your start page with a token in the query string indicating whether or not you are on the intranet.
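A minimal sketch of that probe, with a hypothetical intranet URL and callback name (the token would really come from a server call, as noted below):

    // sketch: probe an intranet-only JSONP endpoint, then redirect with the result
    function probeIntranet() {
      var timer = setTimeout(function () { redirect(false); }, 3000);  // no reply: assume internet
      window.intranetProbe = function () {                             // runs only if the intranet host answered
        clearTimeout(timer);
        redirect(true);
      };
      var script = document.createElement('script');
      script.onerror = function () { clearTimeout(timer); redirect(false); };
      script.src = 'http://intranet.example.local/probe?callback=intranetProbe';
      document.head.appendChild(script);
    }

    function redirect(onIntranet) {
      location.href = 'https://accessibleanyway.com/start?intranet=' + (onIntranet ? '1' : '0');
    }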
NB: jumping from HTTP to HTTPS would probably have some security issues if you stay on the same website (authentication cookies being initially visible), but I would have thought it's fine if you are going to a different one.
Obviously there will need to be some security around the token, as otherwise the user could just generate their own; but that's a separate matter which depends on the individual setup. It would have to be generated by a server call, otherwise someone could just read the client code.
NB: I think the IP-address approach is never going to work, as you have no way of knowing what a company's intranet setup looks like until you get there, so it's not a generic answer.
I have a Node.js app with Express as the backend, running on localhost, with subdomains associated with it like user1.localhost. These subdomains open in Chrome, but Firefox throws a "Server Not Found" error.
Does Firefox need some configuration to allow subdomains?
I think the reason is that Chrome resolves *.localhost to localhost internally, while other browsers ask the DNS server for subdomain.localhost (which obviously fails). You can use the hosts file to make it work for them.
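For example, an entry like this in /etc/hosts (using the subdomain from the question) lets Firefox resolve it:

    127.0.0.1    user1.localhost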
Chrome does this for security reasons; you can read more about it here.
We're looking to do some scraping of a specific URL that is behind Cloudflare. Has anyone experienced issues using Zombie.js/custom user agents while trying to crawl Cloudflare-hosted sites?
Would love some help!
I am trying to interface with an API on a client's site and I am indeed getting a 403 error; the request doesn't even reach my server.
Turning security down to "essentially off" did not help. The final solution was to whitelist the developer machine's IP.
The error is triggered on a single URL (a JSON-serving API) accessed by a Java client using standards-compliant libraries.
Solution:
1. Try to set a rule to allow direct access for that URL.
2. Try setting security weaker and weaker ("essentially off").
3. If both fail, try whitelisting.
4. Set up an alternate non-Cloudflare URL (direct.domain.com).
These will, of course, only work if you can negotiate with the site owners.
Backup solution: use an embedded browser that you can "frame" and "remote control", or a testing framework that does the same through a plugin, and extract the content from there (if you can).
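If you go the headless-browser route with Zombie.js as in the question, the skeleton looks roughly like this (URL and user-agent string are placeholders, and a Cloudflare challenge may still block it):

    // sketch: load a page with Zombie.js under a custom user agent
    const Browser = require('zombie');

    const browser = new Browser();
    browser.userAgent = 'Mozilla/5.0 (compatible; MyCrawler/1.0)';

    browser.visit('https://example.com/page', function (err) {
      if (err) {
        console.error('blocked or failed:', err.message);
        return;
      }
      console.log(browser.text('title'));   // extract content from the loaded document
    });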
Hope this helps.
You're probably triggering one of our security features by trying to scrape a site on us. The only option, really, would be to ask the site owner to whitelist your IP(s) to override the behavior.