Azure Websites Orchard Memory Consumption

I have had my blog running on Orchard in Azure for, I dunno, a few months probably. All has been well. I have about 10 content items; it is a small site, and I'm running it in Shared mode. Browsing the site is fine: it is fast and all is great. But today I have been trying to edit some posts and add some new ones, and my per-minute CPU usage is going crazy and keeps crashing the site, so I can't save anything. Pressing Publish just destroys the site.
I'm not upgrading to Reserved mode like it keeps recommending for a tiny little blog with about 3 viewers.
Any ideas why the CPU usage could be going so crazy?
Error logs are pretty much empty; there's an occasional error from Disqus, but that only shows up when I am loading blog posts...
UPDATE 1:
Removed Disqus just in case. Publishing content items still fails miserably under massive load.
UPDATE 2: Kinda strange... the error logs say "A tenant could not be started: Default. Sequence contains more than one element." I think it is talking about routes.
at Orchard.Mvc.Routes.StandardExtensionRouteProvider.&lt;GetRoutes&gt;d__a.MoveNext() in c:\Users\sebros\My Projects\Orchard\src\Orchard\Mvc\Routes\StandardExtensionRouteProvider.cs:line 24

You should check that extension monitoring is disabled. It creates lots of FileSystemWatcher instances in order to make dynamic compilation responsive to live file modifications, but that's unnecessary in production environments.
Look at the "Disabling the Dynamic Module Loader" section on this page: http://docs.orchardproject.net/Documentation/Orchard-module-loader-and-dynamic-compilation

Related

TYPO3 CMS keeps kicking me out

We have moved our website, which uses TYPO3, from on-prem to the Azure cloud. We set up a Front Door with firewall protection, which is different from the previous setup.
Since day one, when I log in I can work for a short while (like 4-5 minutes) and then it kicks me out to the login screen.
Another example: when I'm logged in, I open a new tab and check some other sites, then go back to TYPO3, and again I'm logged out and need to log in again.
I lost some of my posts while adding additional info from other websites.
Any ideas?
I had a similar issue. I resolved it by changing lockIP in the Install Tool from 4 to 0. lockIP=4 ties the backend session to the full client IP, and a proxy layer like Front Door can present a changing client IP to the server, which invalidates the session.
Note, this is a temporary solution so you can keep working, but you really need to find out why this is happening.
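For reference, a sketch of what the change looks like on disk; on recent TYPO3 versions the Install Tool writes it to typo3conf/LocalConfiguration.php (older versions use localconf.php):

    <?php
    // typo3conf/LocalConfiguration.php (excerpt)
    return [
        'BE' => [
            // 4 = backend session is locked to the full client IPv4 address;
            // 0 = skip the client IP check entirely (temporary workaround).
            'lockIP' => 0,
        ],
    ];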
Best regards

How to solve website speed test delay

So I have this WordPress blog set up on a VPS with LiteSpeed and Cloudflare. The website loads some banners from a Revive Adserver installation on the same VPS, only that domain isn't behind Cloudflare.
Although the PageSpeed and YSlow scores are good, I still get a 3 to 5 second page load. You can see the results here:
https://gtmetrix.com/reports/www.survivalsullivan.com/WIZjVt68
Although individual resources seem to load fast (including the Revive banners), there seem to be inexplicable "delays" in the waterfall... I'm no whiz at website optimization, but I do have some experience.
Am I missing something? I couldn't find a decent resource on how to read the waterfall, although I figured out most of it. Thanks!
Overall you got pretty good results!
First of all, deal with all those images GTmetrix flags: optimize them using Photoshop, JPEGmini, or sprites.
If you haven't already, install the BJ Lazy Load and Above The Fold plugins.
Install and configure W3 Total Cache, which will fix the YSlow items that are still not green in GTmetrix.
I assume you use some kind of theme / page builder? See if you can reduce the number of DOM elements in the page. Use DOM Monster! to see how nested your page is.
For example, if you need to display an image, don't nest it in a div inside a column inside a row inside a container div.
If your website is going to be used by users in multiple countries, I would suggest paying for MaxCDN. It also integrates with the W3 Total Cache plugin.
If you use Google Fonts, try serving them locally from your own stylesheet instead of GETing them from Google; a sketch follows below.
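On the fonts point, a minimal sketch for a WordPress theme, assuming a hypothetical my-fonts.css shipped with the theme that contains @font-face rules pointing at local font files:

    <?php
    // functions.php - enqueue a locally hosted fonts stylesheet instead of
    // requesting fonts.googleapis.com on every page view.
    function mytheme_local_fonts() {
        wp_enqueue_style(
            'mytheme-local-fonts',
            get_stylesheet_directory_uri() . '/fonts/my-fonts.css',
            array(),
            '1.0'
        );
    }
    add_action('wp_enqueue_scripts', 'mytheme_local_fonts');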

Google Chrome could not load the webpage because myPortalapps-12812b1f934c6c.myPortal.apps.com took too long to respond

Trying a hello world hosted app, but getting this error on deployment:
Google Chrome could not load the webpage because
myPortalapps-12812b1f934c6c.myPortal.apps.com took too long to respond
I can ping myPortalapps.myPortal.apps.com but not myPortalapps-12812b1f934c6c.myPortal.apps.com
I also had some similar problems with SharePoint web apps; this forum post helped me out a lot:
When troubleshooting performance issues where more than one person/computer is impacted, the first place I like to start is by running a sniffer like Fiddler: http://www.fiddler2.com/fiddler2/version.asp
Fiddler will let you know exactly how long it takes to load the page, and break down each and every resource that is loaded in order to render the page. It is also a great tool for determining what is and what is not being cached.
I take the output of this and see if there is anything being loaded that I'm not expecting. Every once in a while I'll see where a user might reference an image housed on an external site or server. This can have serious consequences for load times.
I also look at the actual SharePoint page to see if there are any hidden web parts loading list data. Most users accidentally click "Close" and not "Delete", so those web parts or list views are still there. In some cases there could be significant data being loaded and just not displayed.
Likewise, I'll also take a look to see if any audiences are being used, since audiences can be used to show/hide content.

Full ajax/pjax site

I'm planning to make a fully dynamic site using pjax, with a static menu (only the content will be updated with pjax). How bad is this?
The site I have planned to implement this on has a lot of data on it, mostly images.
I have tested my solution on my local machine and it seems to work, but in production it will probably be slow, or what do you guys think? Is this bad practice?
Right now, on pjax start I slide my container out to the left and slide the new content in from the right. I have noticed a small performance loss when I do this in Safari and Firefox. Should I skip my solution and just do regular page loads? I want to do something like Twitter's iPhone app, but on the web.
The reason I want to do this is that I have a full-size Google Maps view with a lot of pins that takes some time to load.
I have found Turbolinks (http://www.github.com/rails/turbolinks), which will be included in Rails 4.0; it's great, and I think a good answer to my question.
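For anyone trying the slide-transition approach, a minimal sketch with jquery-pjax; the #main selector and the timings are made up, and the container is assumed to be positioned so that left can animate:

    // Route all in-site links through pjax, replacing only #main.
    $(document).pjax('a', '#main');

    // Slide the old content out when the request starts...
    $(document).on('pjax:send', function () {
      $('#main').animate({ left: '-100%' }, 200);
    });

    // ...and slide the new content in once it arrives.
    $(document).on('pjax:complete', function () {
      $('#main').css('left', '100%').animate({ left: 0 }, 200);
    });

If the animation itself is what stutters in Safari and Firefox, animating transform via CSS transitions is generally cheaper than animating left with jQuery.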

Best way to control access to individual nodes

Am I just stupid, or does Drupal have a big flaw? (Probably the former of the two...)
I have built a site with some public content and some private content. The problem is that even though menus can be hidden from public, unauthorized users, there is no stopping a visitor from just typing in node/5 (if node/5 were one of the private, hidden pages).
And I am baffled by how troublesome this is to fix. There is no basic functionality for it, and having tried two modules, simple_access and access_control, neither of them works! I'm currently trying to fix a Drupal 6 site. Any suggestions on modules that might provide this VERY BASIC functionality? Is Drupal not meant to handle corporate sites where you have external pages and internal, sensitive content?
By the way, Drupal 7 is at the .9 stage and module availability is still VERY limited; mostly everything is in an alpha stage and has been like forever. Is there no development being done for D7?
The module that'll fix the problem for you is Nodeaccess; this is the opening text from the module page:
Nodeaccess is a Drupal access control module which provides view, edit and delete access to nodes. Users with the 'grant node permissions' permission will have a grant tab on node pages which allows them to grant access to that node by user or role.
So that will do exactly what you want. Also, the way Drupal's access system works means that any menu link that points to a node to which the user does not have access will not be shown for that user. So you won't even have to hide your menu items any more, Drupal will do it for you :)
Regarding Drupal 7 contributed modules, the 'major' modules (Views, CTools, Devel, etc.) are all coming along nicely and are stable, in RC, or at least in beta. Because Drupal is open source, the sole maintainers of smaller modules may not have the time to devote to bringing up the Drupal 7 version alongside maintaining the Drupal 6 one (a lot of people still use D6 and there are still issues to attend to there).
Personally I've developed quite a number of D7 sites now and have found the contributed modules to be available and of a good quality (for the most part). I guess it just depends what specific functionality you need at the end of the day.
I think there's just a gap between your expectation and how Drupal actually works.
Drupal doesn't limit access to content based on whether or not that content is in the menu. On a site with thousands of nodes it would be overwhelming to have a menu of thousands of items.
Drupal has a rich node access system and there are dozens of modules which can help solve this problem. See the list of content access control modules for ideas on which ones you might use.
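To make the mechanism concrete, here is a rough sketch of the two hooks behind Drupal 6's node access system, in a hypothetical custom module named private_content (modules like Nodeaccess build on this same API); the $node->private flag and the 'internal' role are invented for the example:

    <?php
    /**
     * Implementation of hook_node_access_records().
     * Attach an access record to any node marked private.
     */
    function private_content_node_access_records($node) {
      if (!empty($node->private)) {
        return array(
          array(
            'realm' => 'private_content',
            'gid' => 1,
            'grant_view' => 1,
            'grant_update' => 0,
            'grant_delete' => 0,
            'priority' => 0,
          ),
        );
      }
    }

    /**
     * Implementation of hook_node_grants().
     * Users with the 'internal' role get the grant matching the record above.
     */
    function private_content_node_grants($account, $op) {
      $grants = array();
      if ($op == 'view' && in_array('internal', $account->roles)) {
        $grants['private_content'] = array(1);
      }
      return $grants;
    }

After enabling a module like this (or Nodeaccess itself), Drupal will prompt you to rebuild node access permissions so the records get written.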
When I run into specific problems with modules I tend to follow a few steps:
re-read the README.txt file and INSTALL.txt file
re-read the project page to see if it links to any further documentation
read the issues for the project to see if any of them have similar descriptions of problems (click on the number links in the right sidebar of the project page)
create a new test site where the only thing I install is the module in question, and then walk through the steps I think I should, documenting them in a new issue in the project issue queue as a "support request", and ending the post with "expected results" and "actual results" - maintainers will usually get back in a few days' time
The Nodeaccess module (http://drupal.org/project/nodeaccess) should work perfectly for you.
