Caddy file server with a route

I run Caddy with Docker. I have my website loaded at /etc/license inside the Docker container.
When I serve from the root, with the following Caddyfile:
$MYDOMAIN {
    root * /etc/license
    file_server
}
It works as expected: my website loads when I go to $MYDOMAIN.
Now I want to put this website under the route /license, so that when I go to $MYDOMAIN/license I see my website. It seems like it should be straightforward, but I've tried everything I could think of and I can't get it to work.
This is my latest attempt Caddyfile:
$MYDOMAIN {
    handle /license {
        root * /etc/license
        file_server
    }
    # handle other routes
}
Does anybody know how to make it work the way I want, and why the current setup doesn't? Thank you.

Your config has a small mistake: your folder structure needs to match your route, because file_server appends the request path to the configured root. If your subroute is $MYDOMAIN/license/ and your website resides in /etc/license, you need to point your root to the directory one level higher (/etc). But I would recommend creating a new directory inside /etc/license with the same name and putting your website there: /etc/license/license.
You could also solve it like this:
$MYDOMAIN {
    # root * /var/wwwroot
    root /license/* /etc/license
    file_server
    # reverse_proxy /api/ localhost:5000
}
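Another option worth mentioning (a sketch, not part of the answer above) is handle_path, which strips the matched prefix before file_server resolves the path, so the site can stay directly in /etc/license:

$MYDOMAIN {
    handle_path /license/* {
        root * /etc/license
        file_server
    }
    # handle other routes
}

With this, a request for $MYDOMAIN/license/index.html is looked up as /etc/license/index.html rather than /etc/license/license/index.html.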

Related

Install drupal project with drupal in a subdirectory

My customer has a production environment with the following folder structure:
- www
|- maresmuseum
where I only have access to these two empty folders, with "maresmuseum" being the public folder.
I've deployed my Drupal 9 site with Composer, the project-recommended way: placing index.php along with many other core-related folders & files in the "maresmuseum" folder, and /vendor, /tmp and other private files in the "www" folder.
As a result, I can access my website through a URL like this: https://example.com/maresmuseum (I'm quite sure my customer has a .htaccess rule somewhere to accomplish that, don't you think?).
Given this production scenario, and bearing in mind that I work with DDEV locally, I want to install this site on my local machine so I can access it with a URL like this: https://example.local/maresmuseum.
Of course, all inner pages of this site must follow this url pattern, something like this:
https://example.local/maresmuseum/about-us
https://example.local/maresmuseum/contact, etc.
How should I configure DDEV to accomplish that?
Thanks in advance.
I'm sure there are many ways to do this. Here's one way to do it using nginx configuration changes.
I used https://blog.rebootr.nl/drupal-8-in-a-subdirectory-with-nginx/
ddev config --composer-root=maresmuseum --project-type=drupal9 --webserver-type=nginx-fpm --docroot=maresmuseum/web --create-docroot --web-working-dir=/var/www/html/maresmuseum (this sets the project up to put composer.json in maresmuseum)
Install Drupal 9: ddev composer create drupal/recommended-project --no-install
ddev composer require drush/drush
Install project, or load db, or whatever, maybe ddev exec vendor/bin/drush si -y demo_umami --account-pass=admin
Edit the .ddev/nginx_full/nginx-site.conf to remove the #ddev-generated line and replace the location stanza; an example is in https://gist.github.com/rfay/5248e5f75bf3e27d84965bfdfc69c240#file-nginx-site-conf (a rough sketch of the general shape follows after these steps)
Edit the maresmuseum/sites/default/settings.php to add the stanza suggested in the article to the bottom, example in https://gist.github.com/rfay/5248e5f75bf3e27d84965bfdfc69c240#file-settings-php
ddev restart && ddev launch /maresmuseum
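The actual stanza is in the gist linked above; the general idea (a rough, hypothetical sketch only, with the /maresmuseum prefix hard-coded) is to route requests under the subdirectory through Drupal's front controller:

location ^~ /maresmuseum {
    # fall back to Drupal's index.php for clean URLs under the subdirectory
    try_files $uri $uri/ /maresmuseum/index.php?$query_string;
}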
https://<project>.ddev.site/maresmuseum will work fine (as it will without the directory).
I had some trouble with browser cache, so you'll want to pay attention to that.

How to hide nginx.conf authorization credentials?

To explain quickly, I have an nginx server running within Docker, acting as a reverse proxy for my Flask web app. A part of my configuration is using proxy_set_header Authorization to pass some credentials into the proxy_pass website.
This all works fine - but I want to push all this stuff to GitHub, and of course don't want my creds, encoded or not, to show up.
What is my best option here? All I can think of is having something similar to dotenv, but for nginx.conf files rather than .py files.
Does anyone have any ideas on what I could do in order to pass my creds in without hardcoding them explicitly in the config?
You can create variables in a separate configuration file using NGINX's set directive and add that file to .gitignore.
conf.d/creds.include
set $apiuser "user";
set $apipass "pass";
http://nginx.org/en/docs/http/ngx_http_rewrite_module.html#set
app.conf
server {
    include conf.d/creds.include;
    ...
    location / {
        proxy_pass ...
        proxy_set_header Authorization "$apiuser:$apipass";
    }
}
You should mention this in the README of your repo so that anybody cloning it knows how to set it up.
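Since nginx runs in Docker here, one way to keep that credentials file out of both the repository and the image is to list it in .gitignore and mount it at runtime. A minimal sketch, assuming a docker-compose setup (the image name, paths, and service name are assumptions, not from the original answer):

# .gitignore
conf.d/creds.include

# docker-compose.yml
services:
  nginx:
    image: nginx:alpine
    volumes:
      - ./app.conf:/etc/nginx/conf.d/app.conf:ro
      - ./conf.d/creds.include:/etc/nginx/conf.d/creds.include:ro

Because app.conf uses "include conf.d/creds.include;", which nginx resolves relative to /etc/nginx, mounting the file at /etc/nginx/conf.d/creds.include keeps that relative include working.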

How to host a Gatsby+Node.js project on a shared hosting?

I have a project in gatsby which uses Node.js/express for backend with MySQL.
Now, I know that all I have to do is gatsby build, which will create the static html/css/js files for me in the project/public folder, and I can paste all of them into the public_html folder and that will work (it is working), but I'm confused about the database part:
My issue is that in gatsby-config.js, when I change the MySQL connection from localhost to the hosted db settings such as:
(The commented one is the hosted db configuration.)
If I run gatsby develop with the hosted configuration uncommented, it gives a "No such DB" error (obviously). So how can I configure the db settings here, and also in the gatsby-node.js file, to connect the db to the project?
I know this might sound like a dumb question but please help as I'm confused about what to do next.
Thanks.
Okay! Spent a lot of time on this. Hope it will help others.
Static Gatsby site
If you're trying to host a static Gatsby site on any shared hosting (by static, I mean just plain Gatsby pages), you can do as the Gatsby docs say:
Run: gatsby build or npm run build.
According to gatsby:
Gatsby will perform an optimized production build for your site, generating static HTML and per-route JavaScript code bundles.
After this, try npm run serve.
According to Gatsby:
Gatsby starts a local HTML server for testing your built site. Remember to build your site using gatsby build before using this command.
serve will test your build files (the newly created files in the yourprojectroot/public dir).
This will run your project (using the build files) on a local test server (localhost:9000) so you can check the build output.
Test localhost:9000, and if everything is working well, go to your remote cPanel and paste all your build files into the public_html folder.
Head over to your domain and you're good to go.
Gatsby with MySQL and Node/express
If you are trying to host a Gatsby site that also uses Node and MySQL, and you are a newbie at hosting like me, here's what you'll want to do:
Try both the points mentioned above (build your static files and test them with serve).
Set up your db on the remote host as well, with the same database name, username and password as your local one.
Two extra things:
Now, rather than running both the Node and Gatsby (webpack) servers, you are going to use only the Node server (say on port 8001) and serve all your Gatsby build files from it as static content.
In your node file, add:
// assumes your server file already has: const express = require('express'); const path = require('path'); const app = express();
app.use(express.static(path.join(__dirname, 'public')));
app.get('/*', function (req, res) {
  res.sendFile(path.join(__dirname, 'public/index.html'));
});
Since all your Gatsby pages are served through index.html, the app.get('/*', ...) handler above will take care of every page request. Change the 'public' path according to your remote folder structure.
Add the build files, along with the Node (server connection) file, to the public_html folder on the remote.
Next, add or change your .htaccess file (on the remote) to:
RewriteEngine On
RewriteRule ^$ http://127.0.0.1:8001/ [P,L]
RewriteRule ^(.*)$ http://127.0.0.1:8001/$1 [P,L]
So when you run your Node file from the server's terminal, the .htaccess above proxies requests for yourdomainname.com to port 8001, so visitors can use yourdomainname.com directly instead of yourdomainname.com:8001.
All done.
Your public_html should now contain the build files, a Node/Express connection file, and the .htaccess file.
Now, just go to your terminal. cd into public_html and run node yournodefilename.
You can head over to your domain now.
Note: you can use the pm2 package to keep your Node server running at all times.
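For example (the filename is just a placeholder for your server file):

npm install -g pm2
pm2 start yournodefilename.js
pm2 save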
Hope it helps somebody.
You should use environment variables to switch between configurations (local and production). Environment files store sensitive data such as API keys, tokens, etc., so they must be ignored and untracked to avoid pushing critical data to a public repository.
By default, Gatsby uses .env.development and .env.production respectively for the gatsby develop and gatsby build commands. Of course, you can override this behaviour, but assuming the default configuration, you should add the following snippet to your gatsby-config.js:
require("dotenv").config({
path: `.env.${process.env.NODE_ENV}`,
})
Then, you need to create a .env.development and .env.production in the root of your project with the following content:
DB_HOST=yourHost
DB_USER=yourUserName
DB_PASSWORD=yourPassword
DB_NAME=yourDatabaseName
Of course, each file should have different variables if you want to switch between databases or configurations.
Add them to your gatsby-config.js:
connectionDetails: {
  host: process.env.DB_HOST,
  user: process.env.DB_USER,
  password: process.env.DB_PASSWORD,
  database: process.env.DB_NAME,
}
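For context, connectionDetails normally lives inside a source plugin's options in gatsby-config.js. The question doesn't name the plugin, but assuming something like gatsby-source-mysql, the shape would be roughly:

module.exports = {
  plugins: [
    {
      resolve: "gatsby-source-mysql", // assumed plugin; adjust to whatever you actually use
      options: {
        connectionDetails: {
          host: process.env.DB_HOST,
          user: process.env.DB_USER,
          password: process.env.DB_PASSWORD,
          database: process.env.DB_NAME,
        },
        // queries: [...] and other plugin-specific options go here
      },
    },
  ],
}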
The final step is to add the environment variables on your host so they are accessible to Gatsby at build time. S3 by Amazon allows you to configure them, and I guess it's a common option for most hostings.

How to serve static files with nginx after using npm run build with webpack

After generating a development build with npm run build, I get a message saying:
"Tip: built files are meant to be served over an HTTP server.
Opening index.html over file:// won't work."
What is the best way to do this with nginx? Currently, to test it, I am using an npm module called serve.
Also, if I go to my homepage at mydomain.com and search for a user, everything works like it is supposed to, redirecting me to mydomain.com/users/brad, but if I then do a URL search for mydomain.com/users/brad directly, I get a not found error. Any help is appreciated!
In my case, when I have to serve static content with nginx, it often looks like :
location /static {
    alias $myroot/staticfiles;
}
Also, if you haven't already, read the NginX guide to Serving Static Content.
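As for the "not found" error on a direct visit to mydomain.com/users/brad: if the app uses client-side routing (an assumption, since the question doesn't say which framework), nginx also needs a fallback so unknown paths are handled by index.html. A minimal sketch, with a hypothetical root path:

location / {
    root /var/www/mysite;             # wherever your built files live
    try_files $uri $uri/ /index.html; # let the client-side router handle deep links
}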
If you are familiar with Docker, I would recommend using a Docker nginx container and adding your static content from the webpack build to the container (this can be automated with a build server). Have a look at the official nginx Docker image: https://hub.docker.com/_/nginx/.
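A minimal sketch of that approach, assuming webpack writes its output to dist/:

# Dockerfile
FROM nginx:alpine
# copy the webpack build output into nginx's default web root
COPY dist/ /usr/share/nginx/html/

Then build and run it with something like: docker build -t my-site . && docker run -p 80:80 my-site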
Otherwise you have to install nginx on the server where you host your homepage. For this, check your server's OS and look up a detailed nginx setup guide for it. Without any configuration, nginx serves static content on Linux-like servers from /usr/share/nginx/html.
If you only have FTP access to your server, you can transfer your built files via FTP/SFTP to a specific folder, e.g. /myHomepage, and then your static content is served from yourdomain.de/myHomepage.

Deploying ServiceStack App to IIS Subfolder under Root

I have a simple ServiceStack application that I was able to host as a console app and I'm now wanting to package/deploy it for IIS.
I've created an ASP.NET application project and can successfully run the service on my local machine. When I try to deploy it to an IIS server (v7.5) in a subfolder under the root, I get a 404.
Per the examples and documentation on the ServiceStack site, I set the location path setting in web.config like this:
<location path="api">
...
</location>
I tried these paths:
/api
/subfolder_name/api
but neither works.
Is it not possible to have it in a subfolder and still have the path be api?
I have it in a subfolder under the root and have the path configured like this: . Yet it doesn't work. So does that mean that all of my files have to be under the root, and that I then have to alter the Global.asax to include my AppHost init code? Seems a little messy to me.
Not sure of your exact setup but specifying the path in AppHost.Configure() may also help:
SetConfig(new EndpointHostConfig
{
    ServiceStackHandlerFactoryPath = "api",
});
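For the older (v3-era) ServiceStack shown here, the web.config <location> block usually also needs the handler factory registered inside it. This is only a rough sketch based on the standard custom-path setup, not the poster's actual config:

<location path="api">
  <system.web>
    <httpHandlers>
      <add path="*" type="ServiceStack.WebHost.Endpoints.ServiceStackHttpHandlerFactory, ServiceStack" verb="*" />
    </httpHandlers>
  </system.web>
  <system.webServer>
    <handlers>
      <add path="*" name="ServiceStack.Factory" type="ServiceStack.WebHost.Endpoints.ServiceStackHttpHandlerFactory, ServiceStack" verb="*" preCondition="integratedMode" resourceType="Unspecified" allowPathInfo="true" />
    </handlers>
  </system.webServer>
</location>

The ServiceStackHandlerFactoryPath value ("api") should match the location path here, relative to the application, not the full /subfolder_name/api URL.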
