Microsoft CRM 2011 On Demand Development and Test Environments

Does anyone have recommendations on the best way to set up development and testing environments for Microsoft CRM 2011 On Demand?
The recommendations I have seen so far include:
Paying for another account with only one user
Creating a VM
Going with a partner hosted environment

You will need to be a little more specific. What's wrong with the three you have listed? Is it the cost, or is it the time to configure?
That being said, what I do is sign up for the free 30-day trial.
First, sign up for a new Windows Live account.
Second, use that account to sign up for the 30-day trial.
Third, I always write down the login & URL because I always forget them.
I'll have anywhere from 1 - 5 of these running at once.
The main benefit is the control this gives me. Since you can't access the SQL Server directly with On Demand, it forces you to make your configurations & customizations the correct way, through the supported web services rather than direct database queries.
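To show what "the correct way" looks like in practice, here is a minimal C# sketch of reading data through the Organization Service instead of SQL. It assumes the CRM 2011 SDK assemblies (Microsoft.Xrm.Sdk); the URL and credentials are placeholders for your trial org:

```csharp
// Minimal sketch: querying CRM 2011 Online through the supported
// Organization Service instead of direct SQL access.
using System;
using System.ServiceModel.Description;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Client;
using Microsoft.Xrm.Sdk.Query;

class CrmQueryExample
{
    static void Main()
    {
        var credentials = new ClientCredentials();
        credentials.UserName.UserName = "user@yourorg.onmicrosoft.com"; // placeholder
        credentials.UserName.Password = "password";                     // placeholder

        var serviceUri = new Uri("https://yourorg.api.crm.dynamics.com/XRMServices/2011/Organization.svc");
        using (var proxy = new OrganizationServiceProxy(serviceUri, null, credentials, null))
        {
            // The supported equivalent of "SELECT name FROM account":
            var query = new QueryExpression("account") { ColumnSet = new ColumnSet("name") };
            EntityCollection accounts = proxy.RetrieveMultiple(query);
            foreach (Entity account in accounts.Entities)
                Console.WriteLine(account.GetAttributeValue<string>("name"));
        }
    }
}
```

Anything you build this way keeps working no matter which trial org it points at, which is exactly why the throwaway-trial approach is viable.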
Your other option is to set up a VM environment and create a new instance every time you need a clean setup. This is not my preferred option, since you need good hardware to run the environment (otherwise the performance penalty is huge).

Related

Can Loadrunner with Amazon Load Generators test a site that is not publicly accessible?

I'm a web developer and completely new to the LoadRunner suite.
Our client has already provided us with some LoadRunner actions that I need to run to test a site hosted on the intranet of the company I'm currently working for.
The computer I'm using cannot handle more than 7 vusers, so I was asked to use Amazon EC2 for the load generators.
Before I ask my company to pay for Amazon services, I need to know: would I be able to test our internal page from my computer exactly as I do with the load generator on my localhost, or does the page under test need to be publicly accessible from the internet?
Any feedback will be appreciated. Thanks.
Please read carefully what James wrote. You said you are a web developer, so the task that was given to you is roughly equivalent to "write a new DB access layer".
You didn't mention which protocol you are using but I will assume TruClient (based on the 7 vUsers per machine). I will also assume you are using the latest version of LoadRunner or at least something from the 12.6X family.
1) You already have a solution for AWS out of the box in the form of StormRunner (https://www.microfocus.com/en-us/products/stormrunner-load-agile-cloud-testing/overview). If you want to test whether the solution works for you, request a couple of execution hours from the sales team and try it. If your company has a valid license for LoadRunner, I don't think this will be an issue.
2) You have a simple integration into the Controller application for EC2 and the like. In the Controller, go to Tools->Manage cloud accounts. If you run a small test, the cost should not be too great, I assume.
3) If you are a developer, we have a new offering called TruWeb, which is a transport-level protocol that should be more developer friendly. It is able to run many more users per machine, so you will be able to use it to test on an EC2 micro machine (free tier). The caveat is that you will have to write some JavaScript code and will not be able to reuse the actions given to you. You can download TruWeb from here - https://marketplace.microfocus.com/appdelivery/content/truweb - and it has come with the LoadRunner installation out of the box since 12.58. If you need further assistance with TruWeb, feel free to email us at truweb_forum#microfocus.com
I hope this will give you some directions.
a) You need training. This is not a discipline that someone is casually promoted into and finds success in.
b) Expect that it will take at least six months to begin delivering value in this field, longer if you are not working with a mentor.
c) This is a question of application communication architecture. Architecture is one of the foundation skills for a performance tester/engineer/architect.
d) It is not recommended that you use the controller as a load generator, nor that you use just one load generator; either will cause your test to fail an audit from a more mature testing firm. Go with a minimum of three: two for primary load, and one for a control set of a single virtual user of each type. Design your tests to allow for comparing the control timing records against the global set, so you can tell whether you have an application issue or a load generator issue.
e) You will need to coordinate with your network team for two reasons. One, you may need to open outbound ports (covered in the documentation) to allow your controller to communicate with your load generators. Two, you absolutely will have to coordinate a tunnel from the outside internet to your internal applications under test. Expect that security will be paramount: only your test requests, and no other requests, should be allowed through the tunnel. There are many mechanisms to address this, from a custom HTTP header to certificates (see the sketch after this list). Speak with your network security professionals about the setup and configuration you will be able to implement.
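To make the custom-HTTP-header option from point (e) concrete, here is a minimal sketch written as ASP.NET Core middleware; the header name and token value are made up, and your security team may well prefer certificates or another mechanism entirely:

```csharp
// Hypothetical gate at the tunnel endpoint: admit only requests that
// carry a shared secret in a custom header. "X-LoadTest-Token" and the
// token value are invented for this sketch.
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Http;

var app = WebApplication.CreateBuilder(args).Build();

app.Use(async (context, next) =>
{
    // Reject anything that does not present the agreed header value.
    if (context.Request.Headers["X-LoadTest-Token"] != "secret-agreed-with-security-team")
    {
        context.Response.StatusCode = StatusCodes.Status403Forbidden;
        return;
    }
    await next();
});

app.MapGet("/", () => "application under test");
app.Run();
```

The load generator scripts would then add that same header to every request, so ordinary internet traffic hitting the tunnel is turned away.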
The self-paced training for LoadRunner is available for download. It takes about three days to go through, and it is the absolute minimum before you pick up this tool in anger. Ideally, you would go through training with a certified instructor and be paired with a mentor for a period. The length of time with the mentor is directly related to the number of foundation skills you bring to the table.

How to offer a C++ Windows software as a service

We (ISV) are currently planning to offer our software on a rental/subscription basis as a service.
It's a native Windows (C++ / .NET) B2B application.
Our software needs access to the file system (drives) on the customer's computer, and it also needs access to the network (e.g. to be able to find other computers in the network).
We want to offer our customers a service where they do not have to bother themselves with setup/updates and always work with the newest version of our software. So we need a single point of maintenance.
In the first phase we do not expect a lot of our customers (let's say 20) to change to this model, so it would not be a problem to set them up and manage them manually, but in the long run a solution that allows an automated setup/sign-up process would be required.
What I found most promising was Citrix XenDesktop/XenApp with VM-hosted apps and personal vDisks, but it seems that the Citrix solution is not able to access the network on the client PC (I tried it with the trial in the Azure Marketplace). It also seems to be high-priced.
What would be other possible ways to meet these requirements?
Unless you can make some significant architectural changes to eliminate the need to access the local filesystem and the need to do local network browsing, I would recommend focusing on optimizing your local installation and update process, and skipping the virtualization/service idea "for now".
You can still move to a subscription model with a locally installed application. Just require your application to "phone home" to check its licensing/subscription status on startup.
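A minimal sketch of such a startup check, assuming the .NET side of your application; the licensing endpoint URL and the offline grace policy are placeholders for whatever you actually operate:

```csharp
// Sketch of a startup "phone home" subscription check for a locally
// installed application. The endpoint and response convention below are
// hypothetical; the point is to gate startup on subscription status.
using System;
using System.Net.Http;
using System.Threading.Tasks;

static class LicenseCheck
{
    private static readonly HttpClient Http = new HttpClient();

    public static async Task<bool> IsSubscriptionActiveAsync(string licenseKey)
    {
        try
        {
            // Hypothetical licensing endpoint operated by the ISV.
            var response = await Http.GetAsync(
                "https://licensing.example.com/api/v1/status?key=" + Uri.EscapeDataString(licenseKey));
            return response.IsSuccessStatusCode; // 200 = active, anything else = blocked
        }
        catch (HttpRequestException)
        {
            // Placeholder grace policy: do not lock out machines that are
            // temporarily offline; e.g. allow N days since the last
            // successful check instead of returning true unconditionally.
            return true;
        }
    }
}
```

Blocking hard on every failed request would lock out customers with flaky connectivity, so caching the last successful check and allowing a grace window is usually worth the extra bookkeeping.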

AggCat Account Data - Time required to refresh

I am using the Intuit AggCat Beta in a development environment and noticed that when I discover new accounts, they often don't include important information like balances, current interest rates, and other key pieces of data. This is true even after calling functions like getAccountTransactions or getCustomerAccounts (not simply discoverAndAddNewAccounts).
However, after a few hours, I refresh the accounts and this information shows up. It's very important that new accounts include this data during the discovery process, and I wanted to check whether this is an issue with using the development environment (e.g. something that will go away in production?) or whether other users are having this issue too.
This is how it works in both the dev and prod environments. During account discovery, it may not be possible to get all the information for some accounts at some financial institutions, because of their website layout/data limitations.
The refresh call will retrieve this information, so the best practice is to refresh the account before you try to read it.
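In code, that ordering looks roughly like the sketch below. The interface only mirrors the operation names from the question; the real Intuit AggCat SDK types and signatures may differ:

```csharp
// Hypothetical sketch of the recommended call order:
// discover -> refresh -> read, never discover -> read.
using System.Collections.Generic;

public interface IAggCatClient
{
    IReadOnlyList<long> DiscoverAndAddNewAccounts(long customerId); // returns new account ids
    void RefreshAccount(long accountId); // asks the aggregator to re-pull from the institution
    object GetCustomerAccounts(long customerId);
    object GetAccountTransactions(long accountId);
}

public static class AccountLoader
{
    public static void LoadFreshAccountData(IAggCatClient client, long customerId)
    {
        // 1. Discovery alone may leave balances/rates empty for some
        //    financial institutions.
        foreach (long accountId in client.DiscoverAndAddNewAccounts(customerId))
        {
            // 2. Refresh first, so the aggregator re-reads the account data...
            client.RefreshAccount(accountId);

            // 3. ...and only then read the account details and transactions.
            var accounts = client.GetCustomerAccounts(customerId);
            var transactions = client.GetAccountTransactions(accountId);
        }
    }
}
```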

Azure releasing complications

We are considering building a web application on Azure. The main idea behind this application is that users are able to work together on specific tasks in the cloud. I'd love to go for the concept of instant releasing, where users are not bothered with downtime, but I have no idea how I can achieve this (if it is possible at all). Let's say 10,000 users are currently working in this web application, and I release software with database updates.
What happens when I publish a new release of my software into Azure?
What will happen to the brilliant work in progress of my poor users?
Should I bring the site down first before I publish a new release?
Can I "just release" and let users enjoy the "new" world as soon as they request a new page?
I am surprised that I can't find any information about releasing strategies in Azure, am I looking in the wrong places?
Windows Azure is a great platform with many different features that can simplify lots of software management tasks. However, bear in mind that no matter how great a platform you use, your application depends on proper system architecture and code quality - a well-written application will work perfectly fine; a poorly written application will fail. So do not expect Azure to solve all your issues (but it may help with many).
What happens when I publish a new release of my software into Azure?
Windows Azure Cloud Services has the concept of Production and Staging deployments. A new code deployment goes to staging first. There you can do a quick QA pass (and sometimes "warm up" the application to make sure all of its caches are populated - but that depends on application design) and then perform a "Swap": your staging deployment becomes production, and your production deployment becomes staging. That gives you the ability to roll back in case of any issues with the new code. The swap operation is relatively fast, as it is mostly an internal DNS switch.
What will happen to the brilliant work in progress of my poor users?
It is always a good idea to perform code deployments during the lowest site load (night time). Sometimes that is not possible, e.g. if your application is used by a global organization; then you should use the lowest-activity time you can find.
To protect users, you could implement solutions such as an "automatic draft save" that happens every X minutes. But if your application is designed to work with cloud systems, users should not see any functionality failure during a new code release.
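As an illustration of the automatic-draft-save idea, here is a minimal ASP.NET MVC sketch; the controller, the DraftStore helper and the save interval are all made up, and the client would call this endpoint from a JavaScript timer every few minutes:

```csharp
// Illustrative "automatic draft save" endpoint: the client posts
// work-in-progress periodically, so an unlucky deployment swap loses at
// most the last interval. "DraftStore" is a made-up persistence helper
// standing in for SQL Azure or blob storage.
using System.Web.Mvc;

public static class DraftStore
{
    public static void Upsert(int documentId, string user, string content)
    {
        // Persist to durable storage (SQL Azure / blob), never to the role
        // instance's local disk, which does not survive a redeployment.
    }
}

public class DraftsController : Controller
{
    [HttpPost]
    public ActionResult Save(int documentId, string content)
    {
        DraftStore.Upsert(documentId, User.Identity.Name, content);
        return new HttpStatusCodeResult(204); // saved, nothing to render
    }
}
```

The key design point is that drafts go to durable storage shared by both deployments, so a swap does not touch them.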
Should I bring the site down first before I publish a new release?
That depends on the architecture of your application. If the application is very well designed, then you should not need to do that. The Windows Azure application I work with has a new code release once a month, and we have never had to bring the site down since the beginning (for the last two years).
I hope that gives you a better understanding of Azure Cloud Services.
Yes you can.
I suggest you create one of the Visual Studio template applications and take a look at the "staging" and "production" environments, shown directly when you click your Azure site in the management portal.
Say, for example, the users work on the "production" environment, which is connected to Sqlserver1. You publish your new release to "staging", which is also connected to Sqlserver1. Then you just switch the two using the swap, and staging becomes the "production" environment.
I'm not sure what happens to their work if they have something stored in sessions or server caches. I guess that will be lost. But client-side stuff will work seamlessly.
"Should I bring the site down first before I publish a new release?"
I would bring up a warning (if the users' work consists of session stuff and so forth) announcing a brief downtime in 5 minutes, and then after the switch tell everyone it is over.

Confused about Azure hosting and billing?

I've developed a simple system using ASP.NET MVC and WCF for customers to register software and get a license key. I was thinking about using Windows Azure instead of a traditional web hosting because it seems easy to use. I'd only need one SQL database and one small VM, but I'm confused about the billing.
Does the billing only charge as people actually use it, or would I pay the fee for each CPU for every hour of every day for the whole month, because that capacity was available to users? So for a single-CPU VM at $0.12 an hour in a 30-day month I'd pay $86.40? Or would I pay less if no one used it? Then another $9.99 for an up-to-1GB database, so for my needs I'd basically pay $96.39 a month?
That seems expensive for basic web hosting, but if it's easier for someone with little hosting experience to set up and maintain as well as making it easy to expand if I suddenly got a lot of traffic then it would certainly be worth it to me.
EDIT: I think I found the answer here: Getting started with Windows Azure
You're correct regarding the $0.12/hour: you're billed based on resources consumed (meaning deployed virtual machine instances), whether you're running at 0% CPU or 100% CPU.
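In other words, the math from the question is right - the charge is flat per deployed instance-hour, not per request:

```csharp
// Worked example of the flat-rate billing from the question: one small
// single-core instance at $0.12/hour, deployed for a 30-day month, plus
// a $9.99 SQL Azure database (up to 1 GB).
double computePerMonth = 0.12 * 24 * 30;           // = $86.40, billed whether idle or busy
double databasePerMonth = 9.99;                    // flat monthly fee
System.Console.WriteLine(computePerMonth + databasePerMonth); // 96.39
```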
While it might seem expensive compared to your average shared-hosting provider, consider that you're getting health monitoring, failover, an SLA (if you have 2 or more instances), upgrade domains, etc.
I have two blog posts that go deeper into Compute Instance billing that you might find beneficial:
Part 1: The True Cost of Web and Worker Roles
Part 2: Staging and Compute-Hour Metering
I hope this helps...
The rule for billing is quite simple: if you look at the portal, there are "gray" or "blue" boxes showing for a deployment.
If the box is gray, you are OK. If the box is blue, a bill is due.
This means that charges accrue for every hour the box is blue - that is, once a deployment has been done, whether it's stopped or running.
There is now also a new feature in Windows Azure called Web Sites. Deploying a website which gets only a small number of visits is simply "free" - it is a lightweight website running in a shared environment.
http://www.windowsazure.com/en-us/pricing/calculator/ -> Check for websites.
