I have an almost static site that I was happy hosting in Azure Blob Storage. However, I need a PHP script to run to support email communication through the contact HTML form.
So I decided to buy the smallest VM, which is B1ls, with 1 CPU and 0.5 GB of memory. I RDPed to the server and, to my astonishment, I cannot even open one file, folder, or Task Manager without waiting endlessly before getting "Out of memory... please try to close programs or restart"!
The Azure team should not sell such a VM if it will be nonfunctional from the get-go. Note that I installed ZERO programs on it.
All I want is PHP, to set up the site on IIS, and to add a certificate to it. NO database or any other programs will run.
What should I do?
Apparently it is because "B1ls is supported only on Linux," based on the notes on their page:
https://learn.microsoft.com/en-us/azure/virtual-machines/windows/sizes-general
Related
I have an architecture where my bot server is on a cloud VM. To access that VM, I use RDP. I scheduled all the Blue Prism (BP) jobs on the bot on that VM. When I am connected to the VM using RDP and keep the Blue Prism window in the foreground, my BP jobs run fine; that is, in attended mode they are fine. But if I minimize the BP window, or if I don't RDP to the VM at all, I get the error 'failed to navigate'. In other words, in unattended mode on my cloud-based VM bot server, the jobs are not running.
NOTE: My BP VM is always up and running, and it is not getting locked either, as I disabled the Windows screen lock (Ctrl+Alt+Del).
In this scenario, will Login Agent help, or do you have any other suggestions?
Utilizing an environment relying on RDP is not supported or recommended by Blue Prism, as it causes issues with automated processes (as you describe). Please refer to page 4 of the Blue Prism Data Sheet - Remote Access Tools (available in the Documents tab of the Blue Prism client portal):
The following tools have been deemed to be specifically unsuitable for providing remote access to Blue Prism environments:
Remote Desktop Connection (RDP)
The way that this Windows tool (and other tools that use the RDP protocol) handles session management is not compatible with Blue Prism:
The underlying operating system is aware as a connection is established, which can, subject to the automation techniques being applied, result in the executing automation being interrupted.
It requires the remote access credentials to be aligned with the credentials used to authenticate the target system against the network, which presents a potential security risk.
As a user authenticates any previously connected users are locked out.
Each connection creates a separate desktop session.
The connection is not maintained throughout a system reboot.
It does not matter whether the VM is in the cloud or within your own infrastructure; both have the same issue. Blue Prism needs a "screen" to be able to interact with applications. A VM of course does not have a physical screen, but there is still a virtual one (I don't mean the RDP one), as the virtualization layer provides a virtual GPU and monitor.
Imagine a non-virtual PC left unlocked. This is the same. Even if you don't see it (you have to look for the "console"; some clouds provide access to it), it exists.
There are several possible solutions; two of them are:
1) Use Blue Prism Login Agent
This will unlock the physical/virtual screen of the machine with the given AD/Windows credentials, just as a human user would before starting to work with the PC.
Please search the internet for more info about it, or look up videos on YouTube, like this one: https://www.youtube.com/watch?v=Eeeeu_iHjzk&list=PL4SEtvjUqihFh-iFvb_s0VAhPCX1tzg2A&index=43
(I am not the author of this video nor affiliated with the author)
2) Modify a Windows registry setting to log in automatically
More info: https://support.microsoft.com/en-us/help/324737/how-to-turn-on-automatic-logon-in-windows
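The registry approach in the linked article boils down to setting a few values under the Winlogon key. A minimal sketch from an elevated command prompt (the user name, password, and domain are placeholders you must replace; note that `DefaultPassword` is stored in plain text, which is the security trade-off of this approach):

```cmd
:: Enable automatic logon (values per the Microsoft article above).
:: WARNING: DefaultPassword is stored in the registry in PLAIN TEXT.
reg add "HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon" /v AutoAdminLogon /t REG_SZ /d 1 /f
reg add "HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon" /v DefaultUserName /t REG_SZ /d "botuser" /f
reg add "HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon" /v DefaultPassword /t REG_SZ /d "botpassword" /f
reg add "HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon" /v DefaultDomainName /t REG_SZ /d "MYDOMAIN" /f
```

After a reboot, Windows logs the given account straight into a console session, so the bot gets a real desktop without anyone RDPing in.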
I've encountered this problem before.
Try using Blue Prism Login Agent's "Login" process with the bot's credentials.
If you continue to get this error, try using the "Dynamic" spy mode for the particular attribute.
Good luck.
I use an MS Azure virtual machine with IIS running, hosting different web sites. What happens when I upgrade the VM from extra small to small? Should I take a backup, or is nothing required? There are some IIS bindings; should I move those settings?
I think it is similar to giving more memory to a VMware machine, but since this server is vital for us, I am asking the question.
The VM will be gracefully shut down and brought back up (most likely on a completely different VM host), so you will see some downtime as that transition occurs. The actual data for your machine is stored in blob storage, not on the VM itself, so you don't specifically need to worry about a backup because of this change. That said, if this is a production machine, you need to be thinking about backups anyway.
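For reference, on current Azure the same resize can be scripted with the Azure CLI; a sketch, where the resource group, VM name, and target size are placeholders you would substitute:

```shell
# Check which sizes the VM's current hardware cluster supports, then resize.
az vm list-vm-resize-options --resource-group MyRG --name MyVM --output table
az vm resize --resource-group MyRG --name MyVM --size Standard_B2s
```

If the target size is not available on the current cluster, the resize requires a deallocate/restart cycle, which is the downtime described above.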
I've got 6 web sites, 2 databases, and 1 cloud environment set up on my account.
I used the cloud to run some tasks via the Windows Task Scheduler. Everything was installed on my D drive, but between last week and today, the 8th of March, my folder containing the "exe" to run has been removed.
Also, I had installed TortoiseSVN to get the files deployed, and it is not installed anymore.
I wonder if somebody has a clue about my problem.
Best Regards
Franck merlin
If you're using Cloud Services (web/worker roles), these are stateless virtual machines. That is: Windows Azure provides the operating system, then brings your deployment package into the environment after bootup. Every single virtual machine instance booted this way starts from a clean OS image, along with the exact same set of code bits from you.
Should you RDP into the box and manually install anything, whatever you install is going to be temporary at best. Your changes will likely survive reboots; however, if the OS needs updating (especially the underlying host OS), they will be lost as a fresh OS is brought up.
This is why, with Cloud Services, all customizations should be done via startup tasks or the OnStart() event. You should never manually install anything via RDP since:
Your changes will be temporary
Your changes won't propagate to additional instances; you'll be required to RDP into every single box to perform the same changes.
You may want to download the Azure Training Kit and look through some of the Cloud Service labs to get a better feel for startup tasks.
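As an illustration of the startup-task approach, a task is registered in ServiceDefinition.csdef and points at a script included in the deployment package (the role and script names here are hypothetical):

```xml
<!-- ServiceDefinition.csdef: run install.cmd elevated before the role starts -->
<WebRole name="MyWebRole">
  <Startup>
    <Task commandLine="install.cmd" executionContext="elevated" taskType="simple" />
  </Startup>
</WebRole>
```

install.cmd would then perform the silent installs (msiexec with /qn, etc.). Because it runs on every instance at every (re)boot, it should be written to be idempotent, which is exactly what makes the changes survive reimaging, unlike manual RDP installs.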
In addition to what David said, check out http://blogs.msdn.com/b/kwill/archive/2012/10/05/windows-azure-disk-partition-preservation.aspx for the scenarios where the different drives will be destroyed.
Also take a look at http://blogs.msdn.com/b/kwill/archive/2012/09/19/role-instance-restarts-due-to-os-upgrades.aspx which points you to the RSS feed and MSDN article where you can see that a new OS is currently being deployed.
I have an existing program that I would like to upload to the cloud without rewriting it, and I'm wondering if that is possible.
For example, can I upload and run a Photoshop instance in the cloud and use it?
Of course not the GUI, but Photoshop has a communication SDK, so a web program should be able to control it!
As far as I can see, worker roles look good, but they have to be written in a specific way, and I can't rewrite Photoshop!
Thanks for your attention!
As long as your existing program is 64-bit compatible and either has an installer that supports unattended/silent installation or is xcopy-deployable, you can use it in Azure.
For a program that requires installation and supports unattended/silent install, you can use a startup task.
For a program that is just xcopy-deployable, put it in a folder of your worker role and make sure the "Copy to Output Directory" property of all required files is set to "Copy always". Then you can use it.
However, the bigger question is what you are going to do with that existing program in Azure if you have no APIs to work with.
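The "Copy always" setting mentioned above corresponds to this MSBuild fragment in the worker-role project file (the file path is illustrative):

```xml
<!-- .csproj: ship the tool inside the role's package output -->
<ItemGroup>
  <Content Include="tools\mytool.exe">
    <CopyToOutputDirectory>Always</CopyToOutputDirectory>
  </Content>
</ItemGroup>
```

With that in place, the binary ends up in the role's deployment directory on every instance, and the role code can launch it by relative path.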
Here's the thing: the worker role should be what you need. It's essentially a virtual machine running a slightly different version of Windows that you can RDP to and use normally. You can safely run more or less anything up there, but you need to automate the deployment (e.g., using startup tasks). As this can prove a bit problematic, Microsoft has created the VM Role: you create your own deployment image, and that's what gets booted when you instantiate the machine.
However! This machine is stateless, meaning that files it creates aren't preserved if it gets restarted. So you need to ensure the files are saved somewhere else, e.g., in blob storage (intended for just such a purpose).
What I would do in your case is create a VM Role with Photoshop installed and a custom piece of software next to it that accepts requests via Azure Queues, does the processing, saves the file to blob storage, and then sends the file onwards to whoever requested it.
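The queue-driven design described above can be sketched roughly as follows. This is a hedged illustration using Python's in-memory queue.Queue to stand in for an Azure Queue, and a plain function to stand in for the Photoshop SDK call; a real worker would poll the cloud queue in a loop and upload results to blob storage instead of a list.

```python
import queue

def process_image(job: dict) -> str:
    """Stand-in for the Photoshop-SDK processing step (hypothetical)."""
    return f"processed:{job['blob_name']}"

def run_worker(jobs: "queue.Queue", results: list) -> None:
    """Drain the queue: each message names an input blob; the 'output
    blob name' is appended to results (stand-in for a blob upload)."""
    while True:
        try:
            job = jobs.get_nowait()
        except queue.Empty:
            break  # a real worker would sleep and poll again
        results.append(process_image(job))
        jobs.task_done()

jobs = queue.Queue()
jobs.put({"blob_name": "photo1.psd"})
jobs.put({"blob_name": "photo2.psd"})
results = []
run_worker(jobs, results)
print(results)  # ['processed:photo1.psd', 'processed:photo2.psd']
```

The key design point is that the worker itself holds no state: if the instance is reimaged, unprocessed messages are still in the queue and finished output is already in blob storage.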
I've inherited a website from an external company (which has gone bust) and I need to get it deployed to our servers (it's three web sites running together).
However, in testing, although the app runs correctly, performance is poor, and I am pretty certain this is because the app writes files to the local disk. We currently have only a single disk in the server, but as it's virtual, we can increase this to two fairly quickly.
The server is Windows 2008 running IIS 7 and already has two processors.
Some of the files are 100 MB+, but there are also lots of small writes and log-file writes as well.
My question is where to put which parts of the application?
Is it best to have the OS on one disk and the web sites and files/logs on another?
Or the sites and OS on one and the files on another?
Is there a "standard" point to start from?
If anyone could reply with something like the example below, but with an explanation so I understand WHY!
e.g.
C: OS
C: WebSites
D: Files
D: Logs
Your background sounds like it's from Linux, because some people configure new servers taking the items you listed into account. We have a handful of IIS sites, but we mostly run Apache on Linux, so I'm taking a stab at this.
Where we have IIS, we also tend to have MS SQL Server. I would keep the Windows OS on a nice large partition and put everything IIS, including the root directory, on a second drive. IIS installs default to C:\, but I believe you can move it to another drive. The names of the utilities, and how to do this, are best left to those who do it regularly.
In other words, I'd make a gross OS/IIS disk split and then tune from there. However, make sure you have lots of disk space and can defragment.
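For the IIS side of that split, the site content and log locations can be repointed with appcmd; a sketch run from an elevated prompt (the site name and D: paths are examples, not settings from your environment):

```cmd
:: Move a site's content and the default IIS log directory off the OS disk.
%windir%\system32\inetsrv\appcmd set vdir "Default Web Site/" -physicalPath:"D:\WebSites\Default"
%windir%\system32\inetsrv\appcmd set config -section:system.applicationHost/sites -siteDefaults.logFile.directory:"D:\Logs"
```

Separating content and logs from the OS disk this way mainly helps when the disks are on different physical spindles or virtual disks with separate I/O paths; two partitions on one spindle won't reduce contention.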