Windows Azure (DotNetNuke module installation error)

I'm new to Windows Azure and have just signed up for a 3-month free trial. I've installed DotNetNuke 7.0.1, and the problem I have is that every time I try to install a module on my DotNetNuke website I get a SQL error message. Please help, as I don't know what the problem is.
My Windows Azure subscription is also disabled because I created more than one database. I've now deleted all the other databases and am left with one, so how do I reactivate my trial so I can complete my tests?

The problem with the modules you are trying to install is probably that they are not SQL Azure compatible. Ask the module developer/vendor to confirm that they are. If the problem is with the open source/non-core modules, some time ago I modified all of them to be SQL Azure compatible (check this link: http://dotnetnuke6.intelequia.com/Module-Test). Before installing any of them, make sure there is no newer version on CodePlex with the SQL Azure compatibility already fixed.
As for the disabled subscription, I think the problem is that the SQL Azure "billing" counter is calculated per day, so you should wait at least one day before creating a new database, or simply remove the trial limits by converting the subscription to a paid one.

---I WORK FOR POWERDNN---
Hi Anonymous,
While Azure does have some advantages, when it comes to running an app like DotNetNuke on Azure it is really not a good business or technology decision (at least today). Right now Azure does not have parity with the standard SQL Server technologies that DotNetNuke has been coded against for the past ten years.
I've already talked to more than a couple of associates who have tried to run their DNN website on Azure, and it has caused serious problems for them. Usually what happens is that a SQL script won't completely run and will leave their database in an indeterminate state. The problems usually aren't apparent until a few weeks after trying out Azure, and then they have to decide whether to roll back (and lose weeks of data) or spend hours figuring out which script didn't fully run and trying to piece things back together in an Azure-compatible way.
If you have never had to rewrite a vendor's SQL scripts, I'd highly encourage the experience. It is a lot of "fun" :-)
Always glad to help,
Tony V.

Related

Is it possible to deploy ASP.NET Core website to Azure without taking it offline?

When we try to deploy an ASP.NET Core website to Azure, we get this error:
Error Code: ERROR_INSUFFICIENT_ACCESS_TO_SITE_FOLDER
More Information: Unable to perform the operation ("Delete File") for the specified directory ("D:\home\site\wwwroot\TestAspNetCore.exe"). This can occur if the server administrator has not authorized this operation for the user credentials you are using. Learn more at: http://go.microsoft.com/fwlink/?LinkId=221672#ERROR_INSUFFICIENT_ACCESS_TO_SITE_FOLDER.
The problem is IIS locks the .exe file. We can take the website offline but with continuous delivery it would be nice to have no downtime.
Note that ASP.NET 4.5 does not have this problem.
See also https://github.com/aspnet/IISIntegration/issues/226 and https://github.com/aspnet/Hosting/issues/141
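A commonly suggested mitigation (it does not remove the offline window, but it automates it so the locked .exe can be replaced) is to let Web Deploy drop app_offline.htm during publish. A minimal publish-profile sketch, assuming you publish with Web Deploy/MSDeploy from Visual Studio; the property name is an assumption to verify against your tooling version:

<!-- .pubxml fragment (sketch; verify the properties against your SDK/tooling).
     EnableMSDeployAppOffline asks Web Deploy to place app_offline.htm first,
     which stops the app and releases the lock on the .exe, then removes it
     again once the new files have been copied. -->
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <WebPublishMethod>MSDeploy</WebPublishMethod>
    <EnableMSDeployAppOffline>true</EnableMSDeployAppOffline>
  </PropertyGroup>
</Project>

For true zero downtime, the usual approach is to deploy to a separate environment and swap it into production, as described in the answer below.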
I've had a similar headache and it seems not to be possible. The most reliable solution I have come up with is to have one build for a private local version that can be taken offline and then restarted right after deployment. A second build then takes the private version into production every night in the early hours.
This way I can make updates regularly throughout the day and ensure my site is offline for no more than about 20 seconds, during the early hours when it is least likely to be used.

MachineKey Azure SDK 1.5/1.6

I am using a custom API token implementation built on WCF Web API on Azure. This uses FormsAuthentication.Decrypt in order to obtain a FormsAuthenticationTicket. To make sure that the decrypt process works across multiple instances, I have provided a MachineKey in my web.config.
However, I've noticed that the MachineKey doesn't seem to be honored on Azure: it looks like Azure is using a random machine key and overwriting the one I specified in web.config. I'm using the latest Azure SDK 1.5 (or 1.6?).
I am well aware of this issue with Azure SDK 1.3 and I believe it was rectified in 1.4. Is there a chance that this issue has since re-appeared in Azure SDK 1.5/1.6?
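For context, the kind of web.config entry being described looks like the sketch below; the key values are placeholders, and the same pair must be deployed to every instance/server that needs to decrypt the ticket:

<!-- web.config (placeholder keys - generate your own; do not reuse these) -->
<system.web>
  <machineKey
    validationKey="0123456789ABCDEF0123456789ABCDEF0123456789ABCDEF0123456789ABCDEF0123456789ABCDEF0123456789ABCDEF0123456789ABCDEF0123456789ABCDEF"
    decryptionKey="0123456789ABCDEF0123456789ABCDEF0123456789ABCDEF"
    validation="SHA1"
    decryption="AES" />
</system.web>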
I was having the same problem where my FormsAuthentication tickets were not validating across sub domains after the recent Microsoft .Net 4.0 Security upgrade KB2656351.
My FormsAuth tickets are generated from my dedicated servers and read on sub domains on Windows Azure.
In order to get all sub domains to decrypt the tickets I made sure all my dedicated servers were patched with the latest .Net updates via Windows Update. Then I upgraded my Azure project to version 1.6 and selected the latest Azure OS after deploying. This seemed to do the trick.
Here are some articles about the issue:
http://weblogs.asp.net/scottgu/archive/2011/12/28/asp-net-security-update-shipping-thursday-dec-29th.aspx
http://technet.microsoft.com/en-us/security/bulletin/ms11-100.mspx
cheers
Francesco
Windows Azure already synchronizes machine keys across the same role in a deployment. As such, you should be fine to completely ignore the MachineKey setting in web.config and just let Windows Azure handle it for you (the web farm scenario is well supported). Your scenario is supported on Windows Azure out of box with no modifications (just call Decrypt).
The issue that you might be talking about was a 1.3 issue where the web.config files were being modified directly to sync the machine keys. This failed when the file was read-only (e.g. under TFS source control) and caused deployment failures. That was fixed some time ago.
I think I finally found the solution. This had nothing to do with Azure or MachineKeys but had more to do with the way the app was being tested. The encrypted key that was stored on my Phone App was encrypted on a different web server (however, the machine key used was the same). I just un-installed and re-installed my app thereby forcing the server to generate a new key.
It seems that decrypting this key on a different server was causing problems. I'm a little worried if this will cause problems in the future. Shouldn't using the same Machine Keys ensure that encrypt/decrypt works across boxes?
Anyways, I apologize for the inconvenience caused.
We seem to have the same problem as well. We set the machineKey in the web.config file. Things were fine until a couple of days ago, when Decrypt started returning null. The decryptionKey and validationKey are identical on all machines. Not sure what the problem is.
EDIT - Azure v1.6 does seem to respect the machineKey we set in the config file. We also figured out how to solve our problem, and maybe this will help you: we were seeing that decrypting the cookie did not work on our Windows 7 64-bit dev machines. Then we checked pending updates and there were a couple of .NET updates related to security. We ran the updates and, voila, things started to work again.
OK so I had the problem as described above in a 3-server NLB group.
It looks like the Windows Automatic Updates had installed KB2656352, KB2656358 and KB2657424 on two of the three servers.
I'd put money on the fact that it's because some of the servers are running with the patch and some aren't. I guess machines that have been patched don't like decoding things encoded by a non-patched machine (and/or vice-versa).
Anyway, I've installed all three patches on the remaining machine and put it back into the NLB group. It seems to all work fine.

Windows Azure deployment

I just built a simple Hello World Windows Azure service containing just one web role, using Visual Studio 2008 and the Windows Azure Tools for VS 1.2. I am pretty new to this and have been trying to deploy the application all afternoon. I'm in Australia and deploying to the "Asia anywhere" region.
I have pretty much followed the info provided on MSDN; the upload gets to 95%, and then after about ten minutes the deployment disappears. I have also tried using the old Windows Azure developer portal, and 30 minutes later I still cannot access the service and its status is either Busy or Stopped.
I have the introductory offer for an extra small compute instance on the subscription I am deploying to. Can anyone with Windows Azure experience elaborate on deploying apps and on what my application's status means? I am very keen to get into the platform, and this issue has just about spoiled my weekend.
Most likely it is related to having UseDevelopmentStorage=true in a connection string. I have accidentally done this a couple of times myself, and things just magically don't work with no explanation. Missing DLLs are usually a little harder to track down, as the application may or may not start depending on where the failure happens. Trace logging and/or infrastructure logging is the best way to find out whether a DLL is missing, if you can get your application to run that far.
As pointed out already, the best place to start is making the simplest "Hello World!" you possibly can and start extending from there. Yes it will take you a while to make progress but the experiences you gain from this will be invaluable moving forward.
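To make that trace logging visible in Azure, the web.config of that era typically wired System.Diagnostics.Trace output to the Azure diagnostics listener. A minimal sketch; the assembly version in the strong name below is illustrative and should be checked against the SDK you actually have installed:

<!-- web.config: send Trace output to Windows Azure diagnostics.
     The assembly version/strong name is an assumption; match it to your installed SDK. -->
<system.diagnostics>
  <trace>
    <listeners>
      <add name="AzureDiagnostics"
           type="Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener, Microsoft.WindowsAzure.Diagnostics, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" />
    </listeners>
  </trace>
</system.diagnostics>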
Two things to check before deployment:
1. Change the roles' connection strings to point to Azure Storage instead of UseDevelopmentStorage.
2. All references that do not belong to the .NET Framework should be set to "Copy Local = True".
I would guess that the deployment itself is succeeding but that the role instances are not able to start. The most common causes of this are either references to development storage while deployed (UseDevelopmentStorage=true) or a reference to an assembly with Copy Local != true.
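To illustrate the connection-string point, the cloud-side ServiceConfiguration.cscfg should carry a real storage account rather than the emulator value. A minimal sketch; the service, role and setting names are placeholders taken from the default templates, and the credentials are obviously not real:

<!-- ServiceConfiguration.cscfg (names and credentials are placeholders) -->
<ServiceConfiguration serviceName="HelloWorldService"
                      xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
  <Role name="WebRole1">
    <Instances count="1" />
    <ConfigurationSettings>
      <!-- Local/emulator value would be: UseDevelopmentStorage=true -->
      <Setting name="DataConnectionString"
               value="DefaultEndpointsProtocol=https;AccountName=youraccount;AccountKey=yourkey" />
    </ConfigurationSettings>
  </Role>
</ServiceConfiguration>

As for the second point, "Copy Local" on a reference corresponds to the <Private>True</Private> metadata in the .csproj, which is what actually gets the assembly included in the service package.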

Best practices for applying changes to a SharePoint application

I feel like I need a better defined framework for updating my SharePoint (MOSS 2007) application with custom code changes. I am creating wsp solution files with features and new types and such, but once those get tested and deployed, I feel like it's a bit of a leap of faith, and that makes me nervous and occasionally reluctant to deploy changes. After deployment, it's difficult to correlate the current state of the SharePoint application with the specific code that is deployed on that SharePoint server. What features are actually installed and on which sites? Which features are activated or deactivated? Which version of this custom field or content type is really there? Things like this. If an error crops up, I have to rely on my assumptions about what code is there and actually running, or I have to spend time digging through deployed assemblies and the 12 hive -- not impossible, but pretty unpleasant.
What steps should I take to improve my ability to unambiguously determine the state of the application and find the code that truly represents that state? Are there third-party tools that can help with this?
I feel your pain... the Application Development Lifecycle with SharePoint 2007 leaves me with a bitter taste in my mouth.
To answer your question. We built our own deployment utility that does a few things for us.
Checks the state of key timer jobs (too many times we would do a deployment only to find one WFE that did not get the deployment).
Checks the state of key services on all our web front ends (again, we want to know the health of the farm before we start kicking off timer jobs).
Shows the file version and date of selected assemblies in the GAC (and does this across all web front ends). We have seen problems before where assemblies did not get installed correctly across the farm.
Updates web.config settings based on a custom XML schema we provide. We ran into some problems with web.config updates, so we have thought about creating a utility to validate the web.config (specifically, to make sure there are no duplicate entries for specific keys).
Pushes content type updates (the first time content types are deployed via a feature it works great, but as soon as you need to update that content type it gets tough).
Checks status of WSP package after deployment or upgrade.
This utility uses the SharePoint API to do most of this work. Some of it is done by checking WMI Events.
Unfortunately the SharePoint development experience is lacking in this regard. As long as you are "namespacing" all features deployed using solution packages, you can use solution management from central admin to keep track of versions, and what gets deployed to which site collection.
Features can be scoped at all levels, from the farm down to an individual web, so maintenance from that level is a little tough. I just try to organize all deployed code from the (top-down) solution level.
It gets even more complicated when deploying custom timer jobs, event handlers, etc; I really hope that version next will address a lot of these common developer concerns.
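One small thing that helps with the "which version is really there" question is stamping an explicit Version on every feature you ship, so you can read it back through the object model (SPFeatureDefinition.Version) after deployment. A minimal feature.xml sketch; the GUID, names and version number are placeholders:

<!-- feature.xml (WSS 3.0 schema) - Id, Title and Version below are placeholders -->
<Feature Id="11111111-2222-3333-4444-555555555555"
         Title="Contoso.Intranet - Custom Content Types"
         Description="Custom content types for the intranet solution"
         Version="1.2.0.0"
         Scope="Site"
         xmlns="http://schemas.microsoft.com/sharepoint/">
  <ElementManifests>
    <ElementManifest Location="elements.xml" />
  </ElementManifests>
</Feature>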
Isn't the only way to have a planned/controlled deployment process and a version management system like TFS?
In the current project I am involved in we have:
Continuous builds
Daily builds on a development server
When we release something to test, we merge the code to the main branch in the version management system (TFS)
When tested and ready for production, we merge the main branch to the release branch
Using this structured approach we always know what is deployed in which environment, and we can also track all changes based on environment or on changes in requirements (which are also tracked in TFS).

Intranet Uptime Monitoring Component

I have a MOSS 2007 test site. It's not public facing; it's on our intranet. I am looking for an uptime monitoring component that's free and easy to install. Any suggestions?
Update: I don't need graphs or anything fancy, I just need to make sure that I get a notification via email if the site goes down.
Nagios might be overkill, but it is not too hard to put in place ...
I also came across OpManager - http://manageengine.adventnet.com/products/opmanager/
There is a free version which allows monitoring of 10 services.
I tried to get it to install against my SQL Server 2008 instance, but ended up just using the product's default MySQL installation.
I also decided to uninstall it after 10 minutes' worth of use. It seems like a great tool for a fully qualified network administrator, but it adds too much bloat to have installed on my development server.
