Do I need to use a .htaccess file with GitLab Pages?

Do I need to set up a .htaccess file for a hand-coded HTML website on GitLab Pages? Also, how do I get search engines to index the site?

As far as I know, GitLab Pages does not support custom server configuration files such as .htaccess or .conf.
You can submit your website to Google's index here: https://www.google.com/webmasters/tools/submit-url?continue=/addurl&pli=1
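Since server configuration files are not supported, typical .htaccess jobs such as redirects have to be handled client-side on GitLab Pages. Below is a minimal sketch (the URLs are made up for illustration) of a small Python script that generates a static HTML page performing a meta-refresh redirect; you would commit the generated page at the old URL's path in your Pages repository:

```python
# GitLab Pages serves static files only, so server-side redirects via
# .htaccess are not available. A common workaround is a static HTML page
# that redirects in the browser. The target URL below is a placeholder.

REDIRECT_TEMPLATE = """<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <meta http-equiv="refresh" content="0; url={target}">
  <link rel="canonical" href="{target}">
</head>
<body>
  <p>This page has moved to <a href="{target}">{target}</a>.</p>
</body>
</html>
"""

def make_redirect_page(target_url):
    """Return HTML for a page that immediately redirects to target_url."""
    return REDIRECT_TEMPLATE.format(target=target_url)

print(make_redirect_page("https://username.gitlab.io/new-page/"))
```

The `rel="canonical"` link also hints to search engines that the new URL is the one to index.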

Related

How can I implement my custom landing page in WordPress?

I am currently working on a project that runs on Node.js in Docker. For it, I have built a landing page using Bootstrap. In the middle of the project, our team decided to implement WordPress as the CMS.
Can I use that landing page inside the WordPress CMS?
Yes, you can. Just create your page in WordPress, grab the ID of the page, and use it to name your template file, for instance page-12.php. Copy this file into the root of your theme directory, then copy your Bootstrap/HTML code into it. Then simply go to www.yourdomain.com/landingpage. Hope this makes sense.

Accessible directory in Prestashop?

In a PrestaShop website, I would like to have a directory that I can access directly, like this: website.com/directory
I've tried adding custom PrestaShop pages, but I really need a directory so that I can use my usual framework (CodeIgniter) to build a custom-made blog for the website.
But how can I make that custom directory accessible at the link website.com/directory?
Thanks in advance
You just need to physically create the folder and it will be accessible, for example at prestashop-domain.com/mycustomfolder. Then upload your framework files into it.

How to find my site in Google search

I have already hosted my website, and I want to find it on Google.
How can I search for it?
Do I need to submit my website to Google?
Yes, at a minimum submit it through Google's webmaster tools; also read up on what the tool offers.
To search within your own site, add the site: operator to your search terms:
site:www.your-website.com searchword
You can use Google Webmaster Tools for this purpose. Add your website there and you will get valuable information about how your website appears on Google.
For faster inclusion of all of your website's pages in Google's database, you can create a sitemap.xml file (if you do not know how, there are online generators) and add it to Google Webmaster Tools.
You can also check which pages are already indexed by placing the "site:" command before your website's URL in the Google search field.
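As a sketch of the sitemap step above, here is a minimal Python script that builds a sitemap.xml by hand. The URLs are placeholders; for a small static site, hand-written XML like this is all a sitemap needs:

```python
# Build a minimal sitemap.xml listing a site's pages.
# The URLs below are placeholders; substitute your real pages.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return a minimal sitemap.xml document listing the given URLs."""
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n"
    )

sitemap = build_sitemap([
    "https://www.your-website.com/",
    "https://www.your-website.com/about.html",
])
print(sitemap)
```

Save the output as sitemap.xml at the root of your site, then submit that URL in Google Webmaster Tools.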

Lucene.NET and external sites

We have built a website which employs Lucene.NET for search. We recently integrated another website so that, from a user's perspective, both websites appear to be a single site (we share the master pages, etc.).
The problem is that the two websites are hosted in different locations. So when Lucene.NET indexes the first website, it does not pick up the content of the second. We want to extract the content from the second website and put it into the same index file that is built for the first site.
How can I get Lucene.NET to crawl an external site too?
Thanks
If you have file-system access to the second system, then you can just index it by providing the path. If not, you will need to write a crawler; you can start with something basic using HttpWebRequest, or get fancier with a tool that recursively crawls a site by following its links.
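The answer above is in .NET terms (HttpWebRequest); as a language-agnostic illustration, here is a rough Python sketch of such a basic crawler. `index_document` is a hypothetical stand-in for whatever feeds your Lucene.NET index:

```python
# A minimal breadth-first crawler: fetch a page, hand its content to an
# indexing callback, extract its links, and recurse within the same host.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags while parsing HTML."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html, base_url):
    """Return absolute URLs for all <a href> links found in html."""
    parser = LinkExtractor()
    parser.feed(html)
    return [urljoin(base_url, href) for href in parser.links]

def crawl(start_url, index_document, max_pages=100):
    """Crawl start_url's host, calling index_document(url, html) per page."""
    host = urlparse(start_url).netloc
    seen, queue = set(), [start_url]
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        # Skip already-visited pages and links that leave the target site.
        if url in seen or urlparse(url).netloc != host:
            continue
        seen.add(url)
        with urlopen(url) as resp:
            html = resp.read().decode("utf-8", errors="replace")
        index_document(url, html)  # hand the content to your indexer
        queue.extend(extract_links(html, url))
```

Restricting the crawl to one host and capping `max_pages` keeps the crawler from wandering off-site or looping forever; a production crawler would also respect robots.txt and strip URL fragments.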

Restricting direct access to files within a folder on IIS 6/7

I am looking for a way to restrict direct access to certain folders on our website, which is hosted on IIS 7 in our second dev environment, IIS 6 in our first dev environment, and IIS 6 in production.
Basically we should be able to link to these files from our website i.e.:
http://www.domain.com/stuff/survey.pdf
But if someone links to these files from a blog post, etc., the server should not serve the content. Is there any way to do this in a web.config, or is that beyond the abilities of IIS?
What I ended up doing was writing a PHP script which served content from outside of the web root, but only if the user was logged in and had a valid site cookie.
Then I created folders to replace all the content we were currently serving (.pdf, .png, etc.), since there was not much that we wanted secured. I named each folder the same as the original document, i.e. /webroot/survey.pdf/, and then placed the index.php inside of the survey.pdf folder.
This worked, and now we can use the script to link to content that we want secured.
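The original gatekeeper was a PHP script; here is the same pattern sketched in Python for illustration. The cookie check and directory layout are assumptions, not the original code:

```python
# Gatekeeper pattern: keep protected files outside the web root, and only
# stream them out after checking the visitor's session cookie.
from pathlib import Path

def serve_protected(protected_dir, filename, has_valid_cookie):
    """Return an (http_status, body) pair for a protected-file request."""
    if not has_valid_cookie:
        return 403, b"Forbidden"
    protected_dir = Path(protected_dir).resolve()
    path = (protected_dir / filename).resolve()
    # Refuse path-traversal attempts like "../etc/passwd".
    if protected_dir not in path.parents:
        return 404, b"Not Found"
    if not path.is_file():
        return 404, b"Not Found"
    return 200, path.read_bytes()
```

Because the files live outside the web root, there is no URL that reaches them directly; every request has to go through the script, which is where the cookie check happens.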
