In trying to speed up my Node site, I've set up Amazon's CloudFront CDN to cache all of the static files on my server. I've configured a subdomain alias, so instead of writing src="/mysite.css" I can write src="https://cache.mysite.com/mysite.css", and the CSS file will be fetched from the CDN instead of from my server.
The problem I'm trying to solve is that I don't want to hard-code all of my static file URLs to point at the CDN, because that will make it very hard to keep developing and testing the site. If I change a CSS file locally and try to test it locally, for example, the link to the CSS file will point to the copy in the cloud, not to the local copy.
So I was trying to think of a way to rewrite the URLs conditionally on the environment. Is there a way to use the .env file to set a variable that is then used to insert the cache URL in front of all relative URLs, or is otherwise left blank so the links stay relative?
There's process.env (see the Node.js documentation) for reading environment variables, which you can then use in your code/templates.
You can also use modules like dotenv or config.
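A minimal sketch of that idea, assuming an Express app with server-rendered templates; the CDN_URL variable and the asset() helper are illustrative names, not from the question:

    // Load .env (e.g. CDN_URL=https://cache.mysite.com in production,
    // unset or empty when developing locally).
    require('dotenv').config();
    const express = require('express');

    const app = express();

    // Helper available to every template: prefixes a relative path with the
    // CDN host when CDN_URL is set, otherwise leaves the path relative.
    app.locals.asset = (path) => (process.env.CDN_URL || '') + path;

    // In a template: <link rel="stylesheet" href="<%= asset('/mysite.css') %>">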
Is it possible to change the URL that the browser buttons in PhpStorm open, so that they go to a different URL?
Currently these all take me to localhost:63342/[project name]/
I would like to direct these to a different address, [projectname].serveraddress:1111/ for example.
Currently my workaround is to use a PHP Web Page configuration, but this requires me to pick a browser at the configuration level rather than being able to pick and choose.
I have a server config setup to automatically upload the files on change already.
Sure, just set up the Deployment entry and mark it as Default for this project.
When you are using "Open in Browser" or similar functionality, the IDE looks for your default deployment entry and builds the URL based on those rules (web server URL + mappings).
If no default deployment entry is found, the IDE builds a URL for the built-in simple web server, which uses localhost:63342/[project name]/path/file.ext URLs.
P.S. If it's a local server (files are served directly from their original location) and no actual deployment is needed, then use the "In-place" type of config.
What is the best way to manage API URLs in an application (created with create-react-app) and run in a Docker container?
Actually, I want to build a Docker image and be able to run it in different environments (production and staging, for example) without building a new one.
My current solution is to start the container with an environment variable, like "docker run -e ENV=dev".
Add logic to read the env from query params. If no query param is passed, use the default. That way you can easily switch between envs on the fly. If you want to remember your user's choice, store it in storage and read it from storage when the query param is not passed.
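A rough sketch of that approach in plain JavaScript; the env parameter name, the storage key, and the URLs are all hypothetical:

    // Resolve the API base URL from (1) the ?env= query param, (2) a value
    // remembered in localStorage, (3) a default.
    function getApiBaseUrl() {
      const fromQuery = new URLSearchParams(window.location.search).get('env');
      if (fromQuery) {
        localStorage.setItem('env', fromQuery); // remember the user's choice
      }
      const env = fromQuery || localStorage.getItem('env') || 'production';
      const urls = {
        production: 'https://api.example.com',
        staging: 'https://staging-api.example.com',
      };
      return urls[env] || urls.production;
    }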
I don't consider myself a React dev, but I have come across and solved the same problem with Vue and Docker.
After a little research specifically for React, it looks like you can use the public folder to share a mounted file/directory with the running container. Vue has a similar folder. The files in that directory are accessible from the app root URL (/some-file.blah).
Your app's directory structure might look like:
    ./app
    ./app/src
    ./app/src/public/
    ./app/src/public/config.json
    ./app/src/... (the rest of your app)
I assume that config.json would then be available at /config.json after a build. You could then either include that file via a script tag in your HTML template or load it on demand using AJAX, depending on at what stage of the page lifecycle it's needed.
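A small sketch of the load-on-demand variant, assuming the file is served at /config.json and contains an apiUrl field (both of which are assumptions, not from the question):

    // Fetch the runtime config before rendering the app, so the API URL can
    // differ per environment without rebuilding the image.
    fetch('/config.json')
      .then((res) => res.json())
      .then((config) => {
        window.appConfig = config; // e.g. config.apiUrl
        // ...then render the React app using window.appConfig
      });

The file itself can then be mounted or overwritten per environment when the container is started, as described above.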
Having very little experience with React myself, I assume someone more familiar can provide clarification (or better edits) to help out.
We have a Website project that's hosted in Azure, and we use Web.config transforms for setting environment variables. However, our current approach to building the system for different environments is to build the project multiple times (currently three), which is inefficient.
We'd like to move to using Web Deploy, as this would then set us up nicely for using Release Manager.
Our issue is around using Web Deploy parameters instead of Web.config transforms; we need to substitute multiple XML elements, rather than single values.
After much research, I found these two articles, which detail almost exactly what I'm trying to do:
http://blogs.iis.net/elliotth/web-deploy-xml-file-parameterization
http://www.iis.net/learn/publish/using-web-deploy/parameterization-improvements-in-web-deploy-v3
Essentially I'm trying to replicate Scenario 5, but using a separate SetParameters file for the value.
Unfortunately, in the examples, referencing an external XML file only works if it is on the target machine. Some testing with a colleague confirmed this: it works on a local machine, but not on Azure.
Is there a way I can force Web Deploy to look in a particular location for the external configuration files?
As you've already noticed, Web Deploy is only able to read replacement values on the local machine or on a UNC share. It can't read that specific file over HTTP.
If you're deploying to an Azure Web App, then one thing you could try would be to use Kudu/FTP to manually upload that file one level above your wwwroot folder. Then you could specify the file location like so:
    D:\home\site\prices.xml:://book[@name='book1']/price
Of course this implies that you'd have to pre-upload this file before publishing to your site, so it's not a perfect solution, but it should work for what you're trying to accomplish.
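If it helps, the corresponding SetParameters.xml entry would look something like the following; "BookPrice" is just an illustrative parameter name (the matching declaration in parameters.xml stays as in Scenario 5 of the linked articles):

    <parameters>
      <setParameter name="BookPrice"
                    value="D:\home\site\prices.xml:://book[@name='book1']/price" />
    </parameters>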
Okay, so I've searched everywhere, and while I can find plenty of stuff about moving a Drupal install out of a subdirectory, I can't find anything on moving one into a subdirectory. I've recently taken over this project, and it was developed without me, so I've been landed in it here.
The problem is that the site was developed in the root of a dev server, and I now have someone who wants it in a subdirectory. I've changed the base URL in the .htaccess and I've tried manually changing references in the CSS and the DB, but I can't be sure I've caught everything (modules etc.).
What I want to know is: is there a way to force every link that is relative to the root to be relative to root/example instead? Basically, everything that was once at www.example.com is now at www.example.com/subdirectory.
Thanks.
There are two pieces to this. The first you've already done: configuring .htaccess to set a base URL that includes the subdirectory.
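For reference, assuming Drupal 7 (which the field_revision_body table below suggests), those settings usually live in two places; the subdirectory name here is only an example:

    # .htaccess - adjust/uncomment RewriteBase so it matches the subdirectory
    RewriteBase /subdirectory

    # sites/default/settings.php
    $base_url = 'http://www.example.com/subdirectory';  // no trailing slash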
Unfortunately, you may have quite a few references in the node content (especially embedded images) that will stop working.
A relatively simple solution to this would be to include a <base href="foo.com/dir" /> tag in your site theme, but this isn't a great fix in the long term.
You can try modifying your database directly, with queries such as the following (use with care, back up your database ahead of time, etc.):

    UPDATE field_revision_body
    SET body_value = REPLACE(body_value, 'http://devdomain.com', 'http://proddomain.com/subdir');
You may also need to update the paths in your files table to reflect the new locations on disk, especially if you're using multisite.
Alternatively, have you considered using the Backup & Migrate module to move content from the dev server to a new install at the new location?
So, I decided to try to break my website... I googled my site by typing in site:mysite.com/whatever and, behold, all of the users' uploaded files were available for viewing under a specific directory.
What kind of script/countermeasure should I use to block these files from being viewed? I already have a script that checks the path and the logged-in status, but this doesn't seem to be working. I've looked all over for solutions, but I can't quite find one. I'm using ColdFusion 8.
This isn't a ColdFusion issue so much as a web server configuration issue.
You should either:
configure your web server not to show a directory of files when using a URL without a filename (e.g., http://www.example.com/files/)
drop a blank default web document (index.html, index.htm, default.htm, index.cfm, whatever) into that directory so that it displays that document rather than the list of files. If you use index.cfm, it'll fire your Application.cfm/cfc in your file path and use whatever other security you've built.
(or, better, do both)
The best way to secure your file listings and the files themselves is to store them in another folder outside of the Web site root folder. You can then serve them up using CFDIRECTORY and CFCONTENT. The pages that display the files can check your access controls and only serve the files to those allowed to see them.
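A minimal sketch of that approach in CFML, assuming the uploads live in C:\private_uploads and that a session.loggedIn flag exists (both are assumptions; substitute your real paths and access checks):

    <!--- download.cfm: only serve stored files to logged-in users --->
    <cfif NOT structKeyExists(session, "loggedIn") OR NOT session.loggedIn>
        <cfabort showerror="Access denied">
    </cfif>

    <!--- Strip any directory parts from the requested name to avoid path traversal --->
    <cfset safeName = getFileFromPath(url.file)>
    <cfset filePath = "C:\private_uploads\" & safeName>

    <cfif NOT fileExists(filePath)>
        <cfabort showerror="File not found">
    </cfif>

    <cfcontent type="application/octet-stream" file="#filePath#" deletefile="no">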