What can you do with server-side includes and .htaccess?

This issue just did not make sense to me; perhaps someone smarter can help.
Why do many hosts not allow you to use .htaccess or server-side includes?
Specifically, a lot of the free hosting plans do not allow this. What situations are they thinking of that make them disable these features?

Income - You're not paying anything; the only money they make is from advertisements.
Resources - These features take up additional resources. Offering only the essentials reduces the amount of risk and maintenance.
Agreement violations / unethical practices - Disabling them may reduce the number of people signing up for free accounts to redirect websites, rename things, hog resources, etc.
That's just what comes to mind.

I am new to this as well, but one thing you can do with SSI is make your web page design a lot easier to maintain.
You can separate headers, footers, and nav divisions (or any HTML elements that appear on every page of your site) into their own HTML files. When you add a page or change one of those elements, you only have to change it once and make sure it is pulled in with an SSI include command at the correct spot on every page, nearly eliminating the chance of mistakes when repeating the same element everywhere.
SSI basically writes the referenced html file directly into the page wherever the include statement sits.
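For example, a page might pull in a shared header and footer like this (the /includes/ paths are an assumption; adjust them to your own layout):

```html
<!-- page.shtml : the server replaces each directive below with that file's contents -->
<!--#include virtual="/includes/header.html" -->
<main>
  <p>Page-specific content goes here.</p>
</main>
<!--#include virtual="/includes/footer.html" -->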
I know a lot less about .htaccess because I haven't played with it yet, but with its configuration directives you can change which pages the server scans for SSI code, and it appears you can enable SSI there as well.
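On hosts that allow it, enabling SSI from .htaccess typically looks something like the following sketch (whether these directives work depends on the Apache version and what the host permits in AllowOverride):

```apache
# Allow SSI processing in this directory
Options +Includes
# Serve .shtml files as HTML and scan them for SSI directives
AddType text/html .shtml
AddOutputFilter INCLUDES .shtml
```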
I know there is a lot more to all this than I wrote here, but hopefully this gives you an idea.
Here is a website that probably better explains this: http://www.perlscriptsjavascripts.com/tutorials/howto/ssi.html


404 errors that look like strange SQL queries - how to block?

I have an E-commerce site (built on OpenCart 2.0.3.1).
I'm using an SEO pack plugin that keeps a list of 404 errors, so we can set up redirects.
As of a couple of weeks ago, I keep seeing a LOT of 404s that don't even look like links:
999999.9 //uNiOn//aLl /**/sElEcT 0x393133353134353632312e39
999999.9 //uNiOn//aLl /**/sElEcT 0x393133353134353632312e39,0x393133353134353632322e39
999999.9 //uNiOn//aLl /**/sElEcT 0x393133353134353632312e39,0x393133353134353632322e39,0x393133353134353632332e39
...and so on, until it reaches:
999999.9" //uNiOn//aLl /**/sElEcT 0x393133353134353632312e39,0x393133353134353632322e39,0x393133353134353632332e39,0x393133353134353632342e39,0x393133353134353632352e39,0x393133353134353632362e39,0x393133353134353632372e39,0x393133353134353632382e39,0x393133353134353632392e39,0x39313335313435363231302e39,0x3931
This isn't happening once, but 30-50 times per example. Over 1600 lines of this mess in the latest 404s report.
Now, I know how to make redirects for "normal" broken links, but:
a.) I have no clue how to even format this.
b.) I'm concerned that this could be a brute-hacking attempt.
What would StackOverflow do?
TomJones999 -
As mentioned in the comments (sort of), this is a security issue for you. The reason for so many URL requests is that a script is likely rifling through many URLs containing SQL, attempting reconnaissance to find out whether your site or pages are susceptible to an SQL injection attack. Alternatively, since they likely already know which e-commerce platform (AND version) you are using, they may intend to exploit a known vulnerability with this SQL injection attempt and achieve some nefarious result (DB access, data dump, etc.).
A few things I would do:
Make sure your OpenCart is up to date and has all the latest patches applied
If it is up to date, it might be worth bringing this up in the forums or with an OpenCart moderator, in case the attacker is going after a weakness they found but that OpenCart has not yet patched.
You can immediately try to ban the attacker's IP address, but they are likely to rotate through several different addresses. I would suggest looking into either ModSecurity or fail2ban ( https://www.fail2ban.org/ ). Fail2ban can be a great add-on for security in these situations because there are several ways for it to dynamically thwart an attack attempt like this.
For example, fail2ban can watch for excessive 404 errors in a short time span and then ban the client causing them.
There is also a fail2ban filter approach for detecting attempted SQL injections and banning the offending clients. A quick search turned up such a filter, with a few adjustments/improvements/fixes to the regular expression that detects the SQL injection.
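A filter along these lines could be sketched as follows (the jail name, log path, and regular expression are assumptions to adapt and test against your own access log before relying on them):

```ini
; /etc/fail2ban/filter.d/apache-sqli.conf  (hypothetical filter name)
[Definition]
; Match request lines that contain UNION ... SELECT in any case mixture
failregex = (?i)^<HOST> .* "(?:GET|POST) [^"]*union[^"]*select[^"]*"
ignoreregex =

; Corresponding jail entry, e.g. in /etc/fail2ban/jail.local
[apache-sqli]
enabled  = true
filter   = apache-sqli
port     = http,https
logpath  = /var/log/apache2/access.log
maxretry = 5
findtime = 600
bantime  = 3600
```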
I would not worry at all about how to "format" redirects for those error-log entries, heh...
With regard to your code (or the code in OpenCart), what you want to be sure of is that all user-submitted data is sanitized (such as data sent to your server as a GET parameter, as in your case).
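The standard defence here is parameterized queries rather than string concatenation. OpenCart is PHP, but the principle is the same in any language; here is a minimal sketch in Python with SQLite (the table and column names are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (id INTEGER, name TEXT)")
conn.execute("INSERT INTO products VALUES (1, 'Widget')")

# Attacker-controlled input, e.g. taken from a GET parameter
user_input = "1 UNION ALL SELECT 0x39313335"

# With a placeholder, the whole string is bound as a single value, so the
# UNION text is compared literally instead of being executed as SQL.
rows = conn.execute("SELECT name FROM products WHERE id = ?", (user_input,)).fetchall()

# A legitimate value still works normally through the same placeholder.
rows_ok = conn.execute("SELECT name FROM products WHERE id = ?", ("1",)).fetchall()

print(rows)     # the injection attempt matches nothing
print(rows_ok)  # the normal lookup still succeeds
```

String-interpolating `user_input` into the SQL instead would have spliced the payload into the statement, which is exactly what those probing requests hope for.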
Also, if you feel uneasy about the attempted hack, it may be worth watching the feed on the haveibeenpwned website. Data from exploits targeted at databases very commonly ends up on sites like Pastebin, and haveibeenpwned tries to parse some of that data and identify the breaches, so that you or your users can at least become aware and take appropriate measures.
Best of luck.

.htaccess and URLs for a multilingual site [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 5 years ago.
I'm ready to use the subdirectory format for my multilingual website.
My first question is:
For SEO, do I have to translate the page name in the URL, or is it useless?
Example:
- Same filename
site.com/fr/login
site.com/en/login
OR
- Different filename
site.com/fr/connexion
site.com/en/login
Then, when a user is on site.com: should I redirect them to site.com/en or site.com/fr depending on their IP? Or should I set a default locale and have my URLs like site.com/page and site.com/fr/page?
Finally, what is the best way to get the locale from the user's current URL?
Parsing the URL to get /fr or /en, or adding a GET parameter like lang=fr (hidden with .htaccess)?
Thanks :)
As a precondition, I assume that you are not using frameworks or libraries. Furthermore, I have never solved similar problems using only .htaccess (as the title of your question suggests) and thus don't know whether that is possible. Nevertheless, the following guidelines may help you.
Your first question
In general, a web page's file name and path have influence on its ranking. Furthermore, having page names and paths in native languages might help your users memorize the most important of your URLs even without bookmarking them.
Nevertheless, I would never translate the page names or directories of pages which are part of a web application (as opposed to informational or promotional pages).
The login page you mentioned is a good example. I am nearly sure that you do not want your site to be found because of its contents on its login page. Actually, there are many websites which exclude login pages and other application pages from being indexed at all.
Instead, in SEO terms, put your effort into your promotional and informational pages. Provide valuable content, explain what is special about you or your site, and do everything you could that those pages get properly indexed. IMHO, static HTML pages are the best choice for doing so.
Furthermore, if you translate the names of pages which belong to your actual application, you will run into massive trouble. For example, after a successful login, your application will probably transfer the user to their personal dashboard, which is probably based on another HTML template / page. If you have translated that page name into different languages, your application has to take care to send each user to the right version. Basically, that means you need as many versions of your application as languages you want to support. Of course, there are tricks to make life easier, but this will be a constant pain and definitely not worth the effort.
To summarize: create static pages which show your USP (unique selling proposition) and provide valuable content to users (for example, sophisticated tutorials and so on). Translate those pages, including names and paths, and SEO them in every way you can. But for the actual application, optimizing its page names is pointless and even counterproductive.
Your second question
I would never use IP based redirecting for several reasons.
First, there are many customers in countries which are not their home country. For example, do you really want to redirect all native English speakers to your Hungarian pages because they are currently in Hungary for a business trip?
Second, more and more users today are using VPNs for different reasons, thereby often hiding the country where they currently are.
Third, which IP address belongs to which provider or country is highly volatile; you would have to constantly update your databases to keep up.
There are more reasons, but I think you already have got the idea.
Fortunately, there is a solution to your problem (but see "Final remark" below): Every browser, when fetching a page from a server, tells the server the preferred and accepted languages. For example, Apache can directly use that information in RewriteRule statements and redirect the user to the correct page.
If you can't alter your server's configuration, you can evaluate the respective header in your CGI program.
When doing your research, look for the Accept-Language HTTP 1.1 header. A good starting point probably is here.
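As a rough sketch of the Apache approach (the language list and target paths are assumptions, and this needs mod_rewrite enabled; test before deploying):

```apache
RewriteEngine On
# Only touch requests for the bare site root
# If French leads the browser's preferred languages, send the user there...
RewriteCond %{HTTP:Accept-Language} ^fr [NC]
RewriteRule ^$ /fr/ [R=302,L]
# ...everyone else falls through to the default language
RewriteRule ^$ /en/ [R=302,L]
```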
Your third question
You may be mixing up two different things in your third question: a locale is not the same as a language. On one hand you ask how to get the locale, but on the other hand your example parameter is lang=fr, which suggests you actually want the language.
If you want to get the language: See my answer to your second question, or parse the language from the current path (as you already have suggested).
If you want to get the locale, things are more complicated. The only reasonable automatic method is to derive the locale from the language, but this will often fail. For example, I generally prefer the English language when doing research, but on the other hand, I am located in Germany and thus would like dates and times in the format I am used to, so deriving my desired locale from my preferred language will fail.
Unfortunately, there is no HTTP header which could tell the server which locale the user prefers. As a starting point, this article may help you.
See the final remark (next section) on how to solve this problem.
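Parsing the language prefix out of the current path, as the question itself suggests, is a one-liner in most languages. A minimal sketch in Python (the supported-language set and default are assumptions):

```python
from urllib.parse import urlparse

SUPPORTED = {"en", "fr"}
DEFAULT_LANG = "en"

def lang_from_url(url: str) -> str:
    """Return the language prefix of a URL's path, or the default language."""
    first_segment = urlparse(url).path.strip("/").split("/")[0]
    return first_segment if first_segment in SUPPORTED else DEFAULT_LANG

print(lang_from_url("https://site.com/fr/connexion"))  # fr
print(lang_from_url("https://site.com/en/login"))      # en
print(lang_from_url("https://site.com/pricing"))       # en (falls back to default)
```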
Final remark
As the article linked above already states: The only reliable way to satisfy the user is to let him choose his language, his locale and his time zone within your application. You could store the user's choices either in cookies or in your back-end database; each has its own advantages and disadvantages.
I usually use a combination of all methods (HTTP headers, cookies, database) in my projects.
Think about humans first. Is URL translation important for users in France? Some people may think it's fine to get translated words in the URL; users from other locales may think otherwise. Search engines take user behavioural factors into account, so your SEO will rank higher if your solution is more convenient for users.
It would be nice if users got the language version they expect. A site can help them by suggesting a language version based on IP, HTTP headers, cookies, and so on. But some people may prefer another language, and some may be on a trip, so it is still important to let them choose a language version manually.
In case of doubt, read the manuals and analyze competitors' sites.
Most websites I see use URLs like site.com/en and site.com/fr, as you mention, but it is up to you how you want to present the site to users. I prefer to make site.com/en the default and give the user the option to select their language.
Should you translate paths?
If possible, by all means - as this will help users of that language to feel like "first class citizens". Login routes probably won't have much impact on SEO, but translating URLs on content pages may well help them to be more discoverable.
You can read Google's recommendations on multi-regional and multilingual sites, which state that it's "fine to translate words in the URL".
Should you redirect based on IP?
This can help first time users, but there are a few things to bear in mind:
How accurate will this be? E.g. if I speak English but I visit France and then view your site - I will get French. If your target market is mobile-toting globe-trotters, then it may not be the best choice. Is looking at Accept-Language, for example, any better?
Will geolocating the IP address on every request introduce any performance problems for your servers, or make too many calls to an external geocoding service? Make sure to carry out capacity planning, and skip the lookup where you already know the locale (e.g. from a cookie, or the user explicitly stating their preference).
Even if you guess a preferred locale, always allow an explicit user preference to override that. There's nothing more frustrating than moving between pages, and having the site decide that it knows better what language you understand :-)
As long as you make it easy to switch between sites, you shouldn't need a specific landing page. It doesn't hurt to pop up a banner if you're unsure whether you should redirect a user (for example, amazon.com will show a banner to a UK user, giving them the option of switching sites - but not deciding for them)
How should you structure your URLs?
Having the language somewhere in the URL (either as a subdomain or a folder) is probably best for SEO. Don't forget to update your sitemap as well, to indicate to crawlers that there are alternate content pages in different languages.
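The sitemap annotation for alternate languages uses hreflang links; per Google's documentation it looks roughly like this, using the question's own URLs (note the extra xhtml namespace on the urlset element):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://site.com/en/login</loc>
    <xhtml:link rel="alternate" hreflang="en" href="https://site.com/en/login"/>
    <xhtml:link rel="alternate" hreflang="fr" href="https://site.com/fr/connexion"/>
  </url>
</urlset>
```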

How much does a single request to the server cost

I was wondering how much you gain by putting all of your CSS, scripts, and other downloadable assets into one file.
I know that you can gain a lot by using sprites, but at some point it might actually hurt to do that.
For example, my website uses a lot of small icons and most pages have different icons. After combining all those icons I might get over 500 KB in total, but with one sprite per page it is reduced to almost 50 KB per page, so that's cool.
But what about JS/CSS scripts? How much would I gain by making a separate script for each page, each just over ~100 lines? Or maybe I wouldn't gain at all?
Basically, I want to know how much a single request to download a file costs, and whether it is really bad to have many script/image files with today's modern browsers and high-speed connections.
EDIT
Thank you all for your answers. It was hard to choose just one because every answer addressed my question. I chose to reward the one that, in my opinion, answered my question about request cost most directly; I will not mark any single answer as correct because all of them were.
Multiple requests means more latency, so that will often make a difference. Exactly how costly that is will depend on the size of the response, the performance of the server, where in the world it's hosted, whether it's been cached, etc... To get real measurements you should experiment with your real world examples.
I often use PageSpeed, and generally follow the documented best practices: https://developers.google.com/speed/docs/insights/about.
To try answering your final question directly: additional requests will cost more. It's not necessarily "really bad" to have many files, but it's generally a good idea to combine content into a single file when you can.
Your question isn't answerable in a truly generic way.
There are a few reasons to combine scripts and stylesheets.
Browsers using HTTP/1.1 will open multiple connections, typically 2-4 for every host. Because almost every site has the actual HTML file and at least one other resource like a stylesheet, script or image, these connections are created right when you load the initial URL like index.html.
TCP connections are costly, which is why browsers open multiple connections ahead of time.
Connections are usually limited to a small number and each connection can only transfer one file at a time.
That said, you could split your files across multiple hosts (e.g. an additional static.example.com), which increases the number of hosts / connections and can speed up the download. On the other hand, this brings additional overhead, because of more connections and additional DNS lookups.
On the other hand, there are valid reasons to leave your files split.
The most important one is HTTP/2. HTTP/2 uses only a single connection and multiplexes all file downloads over that connection. There are multiple demos online that demonstrate this, e.g. http://www.http2demo.io/
If you leave your files split, they can also be cached separately. If you have just small parts changing, the browser could just reload the changed file and all others would be answered using 304 Not Modified. You should have appropriate caching headers in place of course.
That said, if you have the resources, you could serve all your files separately using HTTP/2 for clients that support it. If you have a lot of older clients, you could fallback to combined files for them when they make requests using HTTP/1.1.
Tricky question :)
Of course, the trivial answer is that more requests takes more time, but that is not necessarily this simple.
Browsers open multiple HTTP connections to the same host; see http://sgdev-blog.blogspot.hu/2014/01/maximum-concurrent-connection-to-same.html. Because of that, downloading one huge file instead of several in parallel is considered a performance bottleneck by http://www.sitepoint.com/seven-mistakes-that-make-websites-slow/.
Web servers should use gzip content-encoding whenever possible, so text resources such as HTML, JS, and CSS are quite well compressed.
Most of those assets are static content, so a standard web server should use ETag caching on them. That means the next download will be something like 26 bytes, since the server answers "not changed" instead of sending the 32 KB of JavaScript again.
Because of the ETag cache, the whole website should be cacheable (I assume you're building a game or something like that, not an old-school J2EE servlet page).
I would suggest making 2-4 big files and download that, if you really want to go for the big files
So to put it together:
if you have only static content, it is all the same, because ETag caching will short-circuit any real download from the server; the server returns a 304 Not Modified answer
if you have some dynamically generated content (such as servlet pages), keep the JS and CSS separate, as they can be ETag-cached separately and only the servlet page needs to be downloaded
check that your server supports gzip content encoding for compression; this helps a lot :)
if you have multiple pieces of dynamic content (such as multiple dynamically changing images), it makes sense to split them into 2-4 separate images to utilize the parallel HTTP connections for download (although I can hardly imagine this use case in real life)
Please, ensure that you're not serving static content dynamically. I.e. try to load the image to a web browser, open the network traffic view, reload with F5 and see that you get 304 Not modified from the server, instead of 200 OK and real traffic.
The biggest performance optimization is that you don't pull anything from the server, and it comes out of the box if used properly :)
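The revalidation handshake described above is just a header comparison on the server side. A framework-agnostic sketch in Python (the MD5-based ETag and the tiny asset are illustrative assumptions):

```python
import hashlib

def handle_request(body, if_none_match):
    """Return (status, payload) for a GET carrying an optional If-None-Match header."""
    etag = '"' + hashlib.md5(body).hexdigest() + '"'
    if if_none_match == etag:
        # Client's cached copy is current: headers only, no body travels
        return 304, b""
    return 200, body

asset = b"console.log('app');"  # stand-in for a real static script

status1, payload1 = handle_request(asset, None)  # first visit: full download
etag = '"' + hashlib.md5(asset).hexdigest() + '"'
status2, payload2 = handle_request(asset, etag)  # F5 reload: tiny 304, empty body

print(status1, len(payload1))  # full body on the first request
print(status2, len(payload2))  # 304 with zero body bytes on revalidation
```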
I think #DigitalDan has the best answer.
But the question hints at the real one: how do I make my page load faster? Or at least APPEAR to load faster...
I would add something about "above the fold": basically you want to inline as much as will allow your page to render the main visible content on the first round trip, as that is what is perceived as the fastest by the user, and make sure nothing else on the page blocks that...
Archibald explains it well:
https://www.youtube.com/watch?v=EVEiIlJSx_Y
How much you gain from any of these techniques varies with your specific needs, but I will describe my case: in my web application we don't combine all files. Instead, we have two types of files, common files needed globally across the application, and per-page files used only where needed. Here is why.
A request-analysis chart for my web application (not reproduced here) shows what you need to consider:
The DNS lookup happens only once, since it is cached afterwards (and the name may already be cached even before the first request).
On each request we have:
request start + initial connection + SSL negotiation + time to first byte + content download
In most cases the main factor, taking the majority of request time, is the content download size. So if I have multiple files that are needed on all pages, I combine them into one file to save the TCP setup time; on the other hand, files needed only on specific pages I keep separate, to save content download time on the other pages.
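A back-of-envelope model makes the trade-off concrete (the 200 ms round trip and the bandwidth figure below are illustrative assumptions, not measurements, and the model deliberately ignores parallel connections and caching):

```python
RTT = 0.2              # seconds of latency paid per request (assumed)
BANDWIDTH = 1_000_000  # effective bytes/second (assumed)

def naive_total_time(file_sizes):
    """Serial worst case: every request pays one round trip before its bytes flow."""
    return sum(RTT + size / BANDWIDTH for size in file_sizes)

split = naive_total_time([50_000] * 10)   # ten 50 KB scripts: ten round trips
combined = naive_total_time([500_000])    # the same bytes in one file: one round trip

print(f"ten 50 KB files: {split:.2f} s")
print(f"one 500 KB file: {combined:.2f} s")
```

Real browsers parallelize requests and cache aggressively, so the gap is smaller in practice, but the per-request round trip is the recurring cost the model isolates.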
This is actually a very relevant question (topic) that many web developers face.
I will add my answer to the other contributions.
Introduction
High-performance websites depend on different factors; here are some considerations:
Website size
Content type of website (primary content Text, image, video or mixture)
Traffic on your website (How many people visiting your website average)
Web host location vs. your primary visitors' location (within your country, your region, or worldwide); it matters a lot if your website targets Europe but your host is in the US.
Web host server (hardware) technology; I prefer SSD disks.
How the web server (software) is set up and optimized
Whether it is a dynamic or static website
If dynamic, how your code and database are structured and designed
By defining your need you might be able to find the proper strategy.
Regarding your question in general
Regarding your website, I recommend looking at Steve Souders' 14 recommendations in his book High Performance Web Sites.
Steve Souders' 14 rules:
Make fewer HTTP requests
Use a Content Delivery Network (CDN)
Add an Expires Header
Gzip Components
Put Style-sheets at the Top
Put Scripts at the Bottom
Avoid CSS Expressions
Make JavaScript and CSS External if possible
Reduce DNS Lookups
Minify JavaScript
Avoid Redirects
Remove Duplicates Scripts
Configure ETags
Make Ajax Cacheable
Regarding your question
So if we take js/css in consideration following will help a lot:
It is better to split code across files according to where it is used.
Example: you might have page1, page2, page3 and page4.
Page1 and page2 uses js1 and js2
Page3 uses only js3
Page4 uses all js1, js2 and js3
So it is a good idea to put the JavaScript in three files; you don't want every page to include code it does not use.
CSS Sprites
CSS at top and JS at the end
Minifying JavaScript
Put your JavaScript and CSS in external files
CDN: if you use jQuery, for example, do not host it on your own site; just use the recommended CDN address.
Conclusion
I am pretty sure there are more details to cover, and not every piece of advice is necessary to implement, but it is important to be aware of them. As I mentioned before, I suggest reading this small book; it gives you more detail. Finally, there is no perfect final solution: you need to start somewhere, do your best, and keep improving. Nothing is permanent.
Good luck.
The answer to your question is: it really depends.
The ultimate goal of page-load optimization is to make your users feel that your page loads fast.
Some suggestions:
Do not merge common library JS/CSS files like jQuery, because the browser may already have cached them when the user visited other sites, so they may not need to be downloaded at all.
Merge resources, but at least separate the resources required for the first screen from the rest; the earlier the user can see something meaningful, the faster the page feels.
If several of your pages share some resources, keep the merged shared resources separate from the page-specific ones, so that when the user visits a second page the shared files may already be cached by the browser and the page loads faster.
Users might be on a phone with a slow or inconsistent 3G/4G connection, where even 50 KB of data or two extra requests make a noticeable difference.
It is really bad to have a lot of 100-line files, and also really bad to have just one or two big files, for each type (CSS/JS/markup).
Desktops mostly have high-speed connections, while mobile also has high latency.
Taking in all the theory on this topic, I think the best approach is more practical and less exact: base it on actual connection speeds and device types from a statistical point of view.
For example, I think this is the best way to go today:
1) Put everything needed to show the first page/functionality to the user in one file, which should be under 100 KB; this is an absolute requirement.
2) After that, split or group the files into sizes such that the latency is no longer noticeable alongside the download time.
To make it simple and concrete: if we assume a time to first byte of around ~200 ms, each file should be between ~120 KB and ~200 KB, which suits most of today's connections on average.

Website administration - Integrated into main website or separate section?

From a usability perspective, is it better to integrate admin section on the main website or have a separate section to manage content?
Any thoughts are greatly appreciated.
EDIT: the application is a CMS for very non-techno friendly staff.
It depends on the project and the part you want to administer, IMHO.
For example, comments on news posts should be administered on the website itself by showing a "delete" button next to each comment; otherwise the moderators would have to look up the comment in the admin section, which is not very user-friendly.
But in general I think a separate admin section will usually be clearer to your client. You'd want them to see the site as a normal user would.
At the very least I would recommend moving all your administration files to a separate folder. That way if you're using a platform like .NET you can very easily control folder access though role and user-based web.config permissions.
Having your administration files segregated allows you to do other things easily too, like deleting them if you decide to move them to another server later. You can also exclude them in your robots.txt file (although by putting the section in robots.txt you are telling other people it exists, and robots are not obliged to honor the file).
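The robots.txt exclusion is a two-liner (the /admin/ folder name is a placeholder), with exactly the caveat above: the file itself advertises the folder to anyone who reads it, so it is not a security measure on its own.

```text
User-agent: *
Disallow: /admin/
```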
Edit:
I feel like my answer missed the mark a little considering your question. Of course in-line editing is easier than going to a separate page from a usability perspective, but whenever I hear of mixing admin users with regular users giant alarm bells go off in my head.
I think that it depends on the function of the site and how intrusive it will be to your staff. Does it make sense for them to make changes while browsing the site and will they eventually become discouraged with your system because it forces them to inject unnecessary steps into their process? How long will the edits take? Does it make sense to show a completely different interface to administrators? I think an answer to this question requires a lot more understanding of what specific function you're trying to accomplish and will vary on a case by case basis.
I have never liked adding administration pages to the main site. It seems like too much of a risk of someone accidentally getting access to the administration portion of the site. Even if you review the security over and over, there is always the possibility that something unexpected will happen, and it is a huge mess to clean up (what was modified, who accessed what, etc.). In my opinion, keeping it as a separate site is the easiest way to go.
While there is a level of convenience in being able to login to the main part of a site as a privileged user, and then clicking on parts of a page to edit/alter it, there is also a measure of security in having it in a separate area.
I have found that having a separate section of the website that is devoted specifically to administrative tasks makes the tasks easier to organize and use. Just look at Wordpress (especially the new 2.7 release), Drupal (a very popular cms), and Joomla (another very popular cms). If you would like to test these feature out to see why I think the separate section is better, you can go to www.opensourcecms.com and test out both Drupal and Joomla.
From a usability perspective, the more integrated the better, but integration tends to add severely to the complexity. It is always best if you can perform an action within the context in which it occurs - lock out a bad user, trim a log that's too big, etc. However, since these actions tend to have significant side effects, the security aspect trumps convenience a lot of the time, out of fear.
I think you need to look at doing a risk assessment with regards to integrating the administration capabilities right into the application.
What would be the implication for the system if someone were able to escalate privileges and gain access to the admin functions? If every user were locked out maliciously, what would the damage be to the site, its reputation, SLAs, etc.?
What destructive functions can an admin perform from this section? Delete lots of data? Crash the app? Alter costs that have a material impact on users/customers?
Are the admin functions integrated in the app or isolated into specific admin functions?
Does the application have a public face or is it an intranet that is assumed secure?
I personally have never felt at ease integrating the admin section with the site, out of fear of a security breach caused by my own ineptness or by something beyond my control, like bad defaults or an unpatched exploit. However, when the client is writing the cheque, I tend to do what they feel is best.
I come from the school of usability that says "minimise guesswork". So, what kind of information did you get from your user group sessions?
Remember, you're not (and we're not) typical users of your system. Anything we say is going to be a guess. What's worse is that any opinion shown here is likely to be ill informed for your situation (but probably awesome at what they do). The only way you'll have an informed opinion on what to do is by going out and asking your users in intelligent ways.
Because this is a security issue some of your user mix should be people who specialise in security. Just because it's easier to use doesn't mean it's the most appropriate.

Is it possible for a 3rd party to reliably discern your CMS?

I don't know much about poking at servers, etc, but in light of the (relatively) recent Wordpress security issues, I'm wondering if it's possible to obscure which CMS you might be using to the outside world.
Obviously you can rename the default login page, error messages, the favicon (I see the joomla one everywhere) and use a non-default template, but the sorts of things I'm wondering about are watching redirects somehow and things like that. Do most CMS leave traces?
This is not to replace other forms of security, but more of a curious question.
Thanks for any insight!
Yes, many CMSs leave traces, such as the naming of identifiers and the hierarchy of elements, that are a plain giveaway.
This is, however, not the point. The point is that there are only a few very popular CMSs. It is not necessary to determine which one you use: it suffices to methodically try attack techniques for the 5 to 10 biggest CMSs against your site to get a pretty good probability of success.
In the general case, security by obscurity doesn't work. If you rely on the fact that someone doesn't know something, this means you're vulnerable to certain attacks since you blind yourself to them.
Therefore, it is dangerous to follow this path. Chose a successful CMS and then install all the available security patches right away. By using a famous CMS, you make sure that you'll get security fixes quickly. Your biggest enemy is time; attackers can find thousands of vulnerable sites with Google and attack them simultaneously using bot nets. This is a completely automated process today. Trying to hide what software you're using won't stop the bots from hacking your site since they don't check which vulnerability they might expect; they just try the top 10 of the currently most successful exploits.
[EDIT] Botnets with 10,000 bots are not uncommon today. As long as installing security patches remains so hard, people won't protect their computers, which means criminals will have lots of resources for attacks. On top of that, there are sites which sell exploits as ready-to-use plugins for bots (or rent out whole botnets).
So as long as the vulnerability is still there, camouflaging your site won't help.
A lot of CMSs have IDs, class names, and structural patterns that can identify them (WordPress, for example). URLs have specific patterns too. Someone experienced with the platform, or even just some casual browsing, can identify which CMS a site is using.
IMHO, you can try to change all this structure in your CMS, but if you are willing to put in all that effort, I think you should just create your own CMS.
It's more important to keep everything on your platform up to date and follow some security measures than to try to change everything that could reveal the CMS you're using.
Since this question is tagged "wordpress": you can hide your WordPress version by putting this in your theme's functions.php file:
add_action('init', 'removeWPVersionInfo');

function removeWPVersionInfo() {
    remove_action('wp_head', 'wp_generator');
}
But you're still going to have the usual paths, i.e., wp-content/themes/ etc... and wp-content/plugins/ etc... in the page source, unless you figure out a way to rewrite those with .htaccess.
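A related snippet for the same functions.php, sketched here using WordPress's stock remove_query_arg helper and the script/style loader filters, strips the ?ver= query string that WordPress appends to enqueued assets, which otherwise leaks the version as well (the function name is my own choice):

```php
function removeAssetVersionQueryArg($src) {
    // Strip the ?ver=X.Y.Z query string WordPress adds to enqueued assets
    return remove_query_arg('ver', $src);
}
add_filter('style_loader_src', 'removeAssetVersionQueryArg');
add_filter('script_loader_src', 'removeAssetVersionQueryArg');
```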
