Website is slow to open - web

First of all, thanks to everyone who has supported me on this forum in solving critical issues, and thanks in advance for this question.
I am building a website, http://kidneyprostate.com, which also uses WordPress. The index file is a PHP file containing HTML5, and all the other pages are served by WordPress.
I have kept all the other files, including the blog, in a subfolder inside the root folder, so they live at http://kidneyprostate.com/kidney-prostate.com/.
Everything worked fine at first, but after a few days my server exceeded its bandwidth limit, and my hosting company rectified it.
Now the site is slow to open. I have checked it with Pingdom Tools, optimized the images (the slide images), and minified the CSS and JavaScript files, but the delay still persists.
My guess is that the cause is either the subfolder having the same name as the domain, or the bandwidth issue not yet being fully rectified.
Can you please tell me whether my guesses are correct, or whether there is some other issue?
If naming the subfolder kidney-prostate.com is not the right approach, will renaming it fix the problem?
Thanks again in advance.

The "same-name" url has nothing to do.
1) Use a cache plugin (I would recommend WP Super Cache) and deactivate all the unnecessary plugins.
2) Delete all the unused themes and images.
3) I would also consider deactivating some bandwidth-heavy plugins (like the Google Analytics ones).
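Beyond that, if the host runs Apache and you can edit .htaccess, browser caching of static assets also helps repeat visits. This is only a sketch, and assumes mod_expires is enabled on your server:

# .htaccess: let browsers cache static assets (requires mod_expires)
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>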

Related

Why is the downloaded apk file cached and served to users as an old version?

We host our landing page on Azure; users go there to download an Android apk file. The landing page is an HTML file. Here is the markup for the download link:
<a href="http://www.[mysite].com/android/[MyAndroidApp].apk">download here</a>
It all worked fine until now. Users have started to complain that the app they downloaded does not work properly, but when we test it, it works fine.
We finally found out that although the link is
http://www.[mysite].com/android/[MyAndroidApp].apk
sometimes when a user clicks it, it goes to
http://101.44.1.131/cloud/223.210.55.28/files/9216...636//www.[mysite].com/android/[MyAndroidApp].apk
This is a cache, and it holds an old version of our app!
Can anyone tell me why this happens and how I can prevent it from serving our old version?
How often do you update this apk file?
It may be a caching issue, but I'm not sure exactly.
Have you tried using Azure Storage? Upload the file there, and then link directly to it.
It should cost you less in the long run and not cause any buffering/cache issues.
I would suggest putting a version number in the file name. This is also good practice for .js files. Very often the problem is that a file is cached and the cache is not updated correctly; it's a general problem on the web.
So try putting a version number in the file name, and let us know whether that works.
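For example (the 1.2.3 below is just an illustrative version number, not your real one), a versioned file name makes every release a new URL that intermediate caches have never seen:

<a href="http://www.[mysite].com/android/[MyAndroidApp]-1.2.3.apk">download here</a>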
Thank you all for your suggestions.
We have found the reason. Looking at the redirect URL, it turns out that some ISPs cache our apk files. They do this to save themselves money and bandwidth; it is a common practice in some countries and is well documented.
How evil it is.
Our solution is therefore to change the file name every time we deploy a new version.

mod_pagespeed does not load static files into the cache folder

Hi, I have started exploring the PageSpeed module in Apache httpd. I have used ModPagespeedLoadFromFile to apply hotfixes to static files. Is that correct, or is there another way to hotfix static files with PageSpeed?
The problem is that while using ModPagespeedLoadFromFile, only the files under the ModPagespeedLoadFromFile mapping are cached in ModPagespeedFileCachePath. The rest of the static files, which are fetched from the server, are not cached in ModPagespeedFileCachePath.
Could anyone tell me what I am doing wrong? Thanks in advance.
I'm not sure what you mean by "hotfixes".
But assuming you mean that rewritten resources are updated immediately when you change the original resource: that is a feature of LoadFromFile, and you can also use cache flushing for that.
I don't understand the FileCachePath half of your question at all. What is the problem that you observe? Files not being rewritten? Broken files?
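For reference, this is roughly how LoadFromFile and a cache flush fit together; the domain and paths below are placeholders, not taken from your setup:

# Rewrite resources under this URL prefix straight from disk instead of fetching over HTTP
ModPagespeedLoadFromFile "http://example.com/static/" "/var/www/static/"
# Where PageSpeed keeps its file cache
ModPagespeedFileCachePath "/var/cache/mod_pagespeed/"

To force already-rewritten resources to be picked up again after a hotfix, touch the flush file inside the file cache directory:

touch /var/cache/mod_pagespeed/cache.flush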

Drupal does not detect an installed module

I have a number of sites running off a single Drupal core installation.
This includes a number of 'standard' modules such as Views and CCK in the /sites/all/modules directory.
This works fine apart from one issue.
One of the sites refuses to accept that the Google Analytics module is installed. It can see all the other modules in the directory, and all the other sites see and use the Analytics module without any issue.
I've tried clearing the cache and checked the permissions, but the fact that the module works for the other sites, and that the problem site can see the other modules, has me stumped.
Any ideas?
Edit: OK, case closed. It was me being a muppet. I forgot the first rule, which is to check all your assumptions. In this case I assumed I was looking at the right site. Wrong. For reasons best kept to myself, I have two instances of this site hosted, one of which the domain name resolves to, and one which it doesn't. I was looking at the 'orphan' site's Drupal installation, not the correct installation, which works perfectly... now that I've actually installed the module.
Sorry to have wasted your time, but rest assured, I wasted far more of my own time, and hopefully this question will serve as a reminder to others to check their assumptions too :-)
I'm not sure if you mean that this module is not showing up in the modules list, or that it is not showing any data from this particular instance after it is enabled.
If this is a case where the module is installed but not working properly, you should make sure that you have not removed the
<?php print $closure ?>
tag from the end of the theme for that particular instance of Drupal. If it is removed, the GA JavaScript code will not be added to the page.
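In a stock Drupal 6 page.tpl.php the end of the file looks roughly like this (a sketch of the standard theme layout, not necessarily your site's actual theme); $closure is what injects tracking scripts such as Google Analytics:

  <?php print $closure; ?>
  </body>
</html>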
The only thing I can think of is that some files are missing; in particular, if the .info file is missing, the module is not listed on the modules page. The same is true if the .module file is missing.
It's not a permissions problem, as users who can access the page listing all the modules will see all the modules (with the exception of modules with missing files).
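For reference, the .info file is a small text file that sits next to the .module file; a Drupal 6 one looks roughly like this (the path and values are illustrative):

; sites/all/modules/google_analytics/googleanalytics.info
name = Google Analytics
description = Adds the Google Analytics tracking code to your website.
core = 6.x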

Linking to files in Expression Engine templates (path name problems)

Hey, I'm new to ExpressionEngine - loving it, but having difficulty linking to external files. I know this differs from server to server, but at present I see no end to these woes and would appreciate a nudge in the right direction. Server issues of any shade tend to give me panic attacks; they are my kryptonite.
1) Expression Engine 2.1 is installed in a subfolder of our site (www.website.com/client)
2) I have made a template group and set it up to save as files, and uploaded a logo within a subfolder (system/expressionengine/templates/default_site/site.group/images/logo.jpg)
Linking to www.website.com/client/images/logo.jpg returns nothing, and neither do longer variations like www.website.com/client/system/expressionengine/templates/default_site/site.group/images/logo.jpg.
Halp.
Template groups are not meant to hold that kind of static content; /templates is not public (on our install it is even outside of www).
You need to change your upload folder to /client/themes/uploads or something similar; then you can access your uploads as {path=themes/uploads}/image_name.jpg.
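Following that suggestion, a template would then reference the image along these lines (the folder and file names are just examples):

<img src="{path=themes/uploads}/logo.jpg" alt="Logo" />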
Try using the path /images/logo.jpg or /client/images/logo.jpg.
Also, direct those questions to http://expressionengine.com/forums/viewforum/113/ where you'll receive help from a larger pool of knowledgeable users.

How to prevent robots.txt from passing from the staging environment to production?

In the past, one of our IT specialists accidentally moved the robots.txt from staging to production, blocking Google and others from indexing our customers' site in production. Is there a good way of managing this situation?
Thanks in advance.
Ask your IT guys to change the file permissions on robots.txt to "read-only" for all users, so that it takes the extra steps of:
becoming Administrator/root
changing the permissions to allow writes
overwriting robots.txt with the new file
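On a Linux server that boils down to something like this (the docroot path is an assumption):

# lock robots.txt down so only root can replace it
chown root:root /var/www/html/robots.txt
chmod 444 /var/www/html/robots.txt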
As an SEO, I feel your pain.
Forgive me if I'm wrong, but I'm assuming the problem arises because you keep a robots.txt on your staging server in order to block search engines from finding and crawling the whole staging environment.
If that is the case, I would suggest placing your staging environment somewhere internal (an intranet-type or private network configuration for staging) where this isn't an issue. That avoids a lot of search engine problems with that content getting crawled if, for instance, someone deleted the robots.txt from staging by accident and a duplicate site got crawled and indexed.
If that isn't an option, I recommend placing staging in a folder on the server, like domain.com/staging/, and using just one robots.txt file in the root folder to block out that /staging/ folder entirely. This way you don't need two files, and you can sleep at night knowing another robots.txt won't be replacing yours.
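With that layout, the single robots.txt in the web root only needs an entry like:

User-agent: *
Disallow: /staging/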
If THAT isn't an option, maybe ask them to add a checklist item NOT to move that file? You will just have to keep checking it - a little less sleep, but a little more precaution.
Create a deployment script to move the various artifacts (web pages, images, supporting files, etc.) and have the IT guy do the move by running your script. Be sure not to include robots.txt in that script.
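A minimal sketch of such a script, assuming rsync is available and using made-up paths and hostnames:

#!/bin/sh
# Deploy everything except robots.txt from the build directory to production.
rsync -av --exclude 'robots.txt' ./build/ deploy@prod.example.com:/var/www/html/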
I'd set up code on the production server that keeps the known-good production robots.txt in another location and monitors the one that's in use.
If they differ, it immediately overwrites the in-use copy with the production version. Then it wouldn't matter if the file gets overwritten, since the bad version won't exist for long. In a UNIX environment, I'd run this periodically with cron.
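A minimal sketch of that watchdog, assuming the known-good copy is kept at /etc/robots.prod and the script is saved as /usr/local/bin/check_robots.sh:

#!/bin/sh
# Restore robots.txt whenever it differs from the known-good production copy.
if ! cmp -s /etc/robots.prod /var/www/html/robots.txt; then
    cp /etc/robots.prod /var/www/html/robots.txt
fi

with a crontab entry such as:

*/5 * * * * /usr/local/bin/check_robots.sh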
Why is your staging environment not behind a firewall instead of being publicly exposed?
The problem is not robots.txt... the problem is your network infrastructure.
