How to Run Blue Prism Scheduled Jobs on Cloud VM based BOT server - blueprism

My architecture has the BOT server on a cloud VM, which I access over RDP. All the BP jobs are scheduled on that BOT VM. When I am connected to the VM over RDP and keep the Blue Prism window in the foreground, the jobs run fine, i.e. they work in attended mode. But if I minimize the Blue Prism window, or if I am not connected to the VM over RDP at all, I get the error 'failed to navigate'. In other words, the jobs do not run unattended on my cloud-based VM BOT server.
NOTE: The BP VM is always up and running, and it is not getting locked either, since I disabled the Windows lock screen (Ctrl+Alt+Del).
In this scenario, will Login Agent help, or do you have any other suggestions?

An environment relying on RDP is neither supported nor recommended by Blue Prism, as it causes exactly the kinds of issues with automated processes that you describe. Please refer to page 4 of the Blue Prism Data Sheet - Remote Access Tools (available in the Documents tab of the Blue Prism client portal):
The following tools have been deemed to be specifically unsuitable for providing remote access to Blue Prism environments:
Remote Desktop Connection (RDP)
The way that this Windows tool (and other tools that use the RDP protocol) handles session management is not compatible with Blue Prism:
The underlying operating system is aware as a connection is established, which can, subject to the automation techniques being applied, result in the executing automation being interrupted.
It requires the remote access credentials to be aligned with the credentials used to authenticate the target system against the network, which presents a potential security risk.
As a user authenticates, any previously connected users are locked out.
Each connection creates a separate desktop session.
The connection is not maintained throughout a system reboot.

It does not matter whether the VM is in the cloud or within your own infrastructure; both have the same issue. Blue Prism needs a "screen" to be able to interact with applications. A VM of course does not have a physical screen, but there is still a virtual one (I don't mean the RDP session by this), because the virtualization layer provides a virtual GPU and monitor.
Imagine a non-virtual PC left unlocked. This is the same. Even if you don't see that screen, it exists (look for the "console"; some clouds provide access to it).
There is more than one way to solve this; two of them are:
1) Use Blue Prism Login Agent
This unlocks the physical/virtual screen of the machine with the given AD/Windows credentials, just like a human user would before starting to work with the PC.
Search the internet for more info about it or look up videos on YouTube, like this one: https://www.youtube.com/watch?v=Eeeeu_iHjzk&list=PL4SEtvjUqihFh-iFvb_s0VAhPCX1tzg2A&index=43
(I am not the author of this video nor affiliated with the author)
2) Modify the Windows registry to log on automatically (a code sketch follows below the link)
More info: https://support.microsoft.com/en-us/help/324737/how-to-turn-on-automatic-logon-in-windows
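For illustration, here is a minimal C sketch of option 2, writing the Winlogon values documented in the Microsoft article above (AutoAdminLogon, DefaultUserName, DefaultDomainName, DefaultPassword). The account name, domain, and password are placeholders, and DefaultPassword is stored in plain text, so weigh the security trade-off. Run it elevated on the BOT machine.

```
// Minimal sketch: enable automatic logon by writing the Winlogon registry
// values from the Microsoft article above. Credentials below are placeholders;
// note that DefaultPassword ends up in the registry as plain text.
#include <windows.h>
#include <wchar.h>
#include <stdio.h>

static LONG setString(HKEY key, const wchar_t *name, const wchar_t *value)
{
    return RegSetValueExW(key, name, 0, REG_SZ, (const BYTE *)value,
                          (DWORD)((wcslen(value) + 1) * sizeof(wchar_t)));
}

int wmain(void)
{
    HKEY winlogon;
    LONG rc = RegOpenKeyExW(HKEY_LOCAL_MACHINE,
        L"SOFTWARE\\Microsoft\\Windows NT\\CurrentVersion\\Winlogon",
        0, KEY_SET_VALUE, &winlogon);
    if (rc != ERROR_SUCCESS) {
        fwprintf(stderr, L"RegOpenKeyExW failed: %ld\n", rc);
        return 1;
    }

    setString(winlogon, L"AutoAdminLogon",    L"1");           // turn auto-logon on
    setString(winlogon, L"DefaultUserName",   L"botuser");     // placeholder account
    setString(winlogon, L"DefaultDomainName", L"MYDOMAIN");    // placeholder domain
    setString(winlogon, L"DefaultPassword",   L"botpassword"); // placeholder password

    RegCloseKey(winlogon);
    wprintf(L"Auto-logon configured; it takes effect at the next boot.\n");
    return 0;
}
```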

I've encountered this problem before.
Try using the "Login" process from Blue Prism's Login Agent with the BOT's credentials.
If you continue to get this error, try using a "Dynamic" Spy mode for a particular attribute.
Good luck.

Related

Azure B1ls nonfunctional upon creation

I have an almost static site that I was happy hosting on a storage blob. However, I need to run a PHP script to support email communication through the contact HTML form.
So I decided to buy the smallest VM, B1ls, which has 1 CPU and 0.5 GB of memory. I RDP to the server and, to my astonishment, I cannot even open a single file, folder, or Task Manager without waiting endlessly before getting "Out of memory ... please try to close programs or restart all"!
The Azure team should not sell such a VM if it is nonfunctional from the get-go. Note that I installed ZERO programs on it.
All I want is PHP, to set up the site on IIS, and to add a certificate to it. NO database or any other programs will run.
What should I do?
Apparently it is because "B1ls is supported only on Linux", based on the notes on their page.
https://learn.microsoft.com/en-us/azure/virtual-machines/windows/sizes-general

Is it possible to capture a running machine on Azure, or is it recommended to shut it down first?

I have an Ubuntu 14 VM on Azure that hosts the web sites I develop. (I do not think the OS matters from the point of view of the question, but you never know.)
I've discovered the relatively new Capture button, so for the storage price of a disk I regularly save a "snapshot" via the Capture function (I am not preparing the image for provisioning, i.e. not checking the "I have run 'waagent -deprovision' on the virtual machine" checkbox). Be aware, this quickly becomes pretty addictive.
The result is an image that I can use when creating new machines; it is there in My Images in the wizard. This can function as a backup/rollback workflow: if I delete the VM, I can create a new one from the image resulting from one of the previously captured "snapshots" (again, no provisioning will take place).
It is possible to initiate the Capture operation on a running VM. What is not clear to me: the result will be an image that serves as a template for a new VM, and when that VM starts up and boots, in what state will the filesystem etc. be?
Isn't it a state similar to a sudden power loss? If yes, then it is strongly recommended to always shut down the VM before capturing; however, this is such a pain and productivity killer that no one (including me) wants to do it unless it is mandatory.
I happened to switch to the new Azure portal, and there the Capture UI says:
Capturing a virtual machine while it's running isn't recommended.

Windows Azure VM RDP issue

I followed this
http://blogs.technet.com/b/keithmayer/archive/2013/04/17/step-by-step-build-a-free-sharepoint-2013-lab-in-the-cloud-with-windows-azure-31-days-of-servers-in-the-cloud-part-7-of-31.aspx#.UX_iF7XvvQI
I created a VM using the datacentre image. It was created successfully and the status shows it's running. When I try to RDP, it says:
Remote Desktop can't connect to the remote computer for one of these reasons:
1) Remote access to the server is not enabled
2) The remote computer is turned off
3) The remote computer is not available on the network
Make sure the remote computer is turned on and connected to the network, and that remote access is enabled.
I did check the endpoints: the public port is open, and the 3389 private port is open too. I tried different releases, one with the latest OS patch and the other with the second latest, but I am still not able to RDP.
Thanks
Yeah, I already figured out that a firewall in my organization is blocking it. I did update the answer, but it did not show up; I am trying again :)
Make sure your VM has reached the "Running" status. If it's still in one of its pre-running statuses (such as Provisioning), you won't be able to RDP.
Also: Be sure you don't try logging in with 'Administrator' (the default in the rdp login box). Choose localhost\yourusername.
I had a similar problem the other day. It was solved by going to the Azure Portal, selecting the VM Dashboard, then clicking "Connect" in the grey toolbar at the bottom. This will download an RDP file that contains the correct connection settings. You can then send that rdp file to others who you would like to give access to.
I just opened one of the files used to connect, and it looks like the only real difference is the port used.
full address:s:[vm name].cloudapp.net:62808
username:s:Administrator
prompt for credentials:i:1
I am not sure if all Azure VMs use 62808, but the default RDP port is 3389, so just copying the DNS name from the Dashboard into the RDP address will NOT work without adding the correct port.
One more thing folks should check when having trouble connecting is password length.
I thought I would be all secure by using a GUID for a password. RDP worked fine from home (on an older XP RDP client), but not from the office. At first I thought it was a firewall issue. After verifying with the IT guys that I had full outbound access, I looked a little closer at the RDP error message.
It was saying my credentials were rejected. Finally, I created a second account on the VM and gave it RDP access. I was able to log in fine. The only difference between the two users was this time I didn't bother with a long password.
So I shortened the password on my main account and got in with no problem. I'm not sure what the limit is, but it seems to be less than 32.

Cloud environment on Windows Azure platform

I've got 6 web sites, 2 databases, and 1 cloud environment set up on my account.
I used the cloud environment to run some tasks via the Windows Task Manager; everything was installed on my D drive, but between last week and today, the 8th of March, the folder containing the "exe" to run has been removed.
Also, I had installed TortoiseSVN to get the files deployed, and it is not installed anymore.
I wonder if somebody has a clue about my problem.
Best Regards
Franck merlin
If you're using Cloud Services (web/worker roles), these are stateless virtual machines. That is: Windows Azure provides the operating system, then brings your deployment package into the environment after bootup. Every single virtual machine instance booted this way starts from a clean OS image, along with the exact same set of code bits from you.
Should you RDP into the box and manually install anything, whatever you install is going to be temporary at best. Your stuff will likely survive reboots. However, if the OS needs updating (especially the underlying host OS), your changes will be lost as a fresh OS is brought up.
This is why, with Cloud Services, all customizations should be done via startup tasks or the OnStart() event. You should never manually install anything via RDP since:
Your changes will be temporary
Your changes won't propagate to additional instances; you'll be required to RDP into every single box to perform the same changes.
You may want to download the Azure Training Kit and look through some of the Cloud Service labs to get a better feel for startup tasks.
In addition to what David said, check out http://blogs.msdn.com/b/kwill/archive/2012/10/05/windows-azure-disk-partition-preservation.aspx for the scenarios where the different drives will be destroyed.
Also take a look at http://blogs.msdn.com/b/kwill/archive/2012/09/19/role-instance-restarts-due-to-os-upgrades.aspx which points you to the RSS feed and MSDN article where you can see that a new OS is currently being deployed.

RPC command to initiate a software install

I was recently working with a product from Symantec called Norton Endpoint Protection. It consists of a server console application and a deployment application, and I would like to incorporate their deployment method into a future version of one of my products.
The deployment application allows you to select computer workstations running Win2K, WinXP, or Win7. The selection of workstations is provided from either AD (Active Directory) or an NT Domain (WINS/DNS NetBIOS lookup). From the list, one can click and choose which workstations to deploy the endpoint software to, which is Symantec's virus & spyware protection suite.
Then, after selecting which workstations should receive the package, the software copies the setup.exe program to each workstation (presumably over the administrative share \\pcname\c$) and then commands the workstation to execute setup.exe, resulting in the workstation installing the software.
I really like how their product works but am not sure what they are doing to accomplish all the steps. I've not done any deep investigation into this, such as sniffing the network, and wanted to check here to see if anyone is familiar with what I'm talking about and knows how it's accomplished, or has ideas how it could be accomplished.
My thinking is that they are using the admin share to copy the software to the selected workstations and then issuing an RPC call to command the workstation to do the install.
What's interesting is that the workstations do this without any of the logged-in users knowing what's going on until the very end, where a reboot is necessary. At that point, the user gets a pop-up asking to reboot now or later. My hunch is that the setup.exe program is popping up this message.
To the point: I'm looking to find out the mechanism by which one Windows based machine can tell another to do some action or run some program.
My programming language is C/C++
Any thoughts/suggestions appreciated.
I was also looking into this, since I too want to remotely deploy software. I chose to packet-sniff pstools since it has proven itself quite reliable in such remote admin tasks.
I must admit I was definitely over-thinking this challenge. You have probably done your packet sniff by now and discovered the same things I have. I hope by leaving this post behind we can assist other developers.
This is how pstools accomplishes execution of arbitrary code:
It copies a system service executable to \\server\admin$ (you either have to already have local admin on the remote machine, or supply credentials). Once the file is copied, it uses the Service Control Manager API to make the copied file a system service and start it.
Obviously, this system service can now do whatever it wants, including binding to an RPC named pipe. In our case, the system service would install an MSI. To get confirmation of successful installation, you could either remotely poll a registry key or call an RPC function. Either way, you should remove the system service when you are done and delete the file (psexec does not do this; I guess they don't want it to be used surreptitiously, and in that case leaving the service behind would at least give an admin a fighting chance of realizing someone had compromised their box). This method does not require any preconfiguration of the remote machine, simply that you have admin creds and that file sharing and RPC are open in the firewall.
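To make the mechanism concrete, here is a rough C sketch of the copy-and-run-as-a-service technique described above, using the Win32 Service Control Manager API. It assumes you already have admin rights on the target (for example via "net use \\TARGET\admin$ ..."), and "TARGET", the file names, and the service name are all placeholders; the copied executable must itself be written as a real Windows service.

```
// Rough sketch of the pstools-style technique: copy an executable to the
// target's admin share, then create and start it as a service remotely.
// "TARGET", the paths, and the service name are placeholders.
#include <windows.h>
#include <stdio.h>

int wmain(void)
{
    // admin$ maps to %SystemRoot% (usually C:\Windows) on the target machine.
    if (!CopyFileW(L"C:\\deploy\\mysvc.exe", L"\\\\TARGET\\admin$\\mysvc.exe", FALSE)) {
        fwprintf(stderr, L"CopyFileW failed: %lu\n", GetLastError());
        return 1;
    }

    // Connect to the remote machine's Service Control Manager.
    SC_HANDLE scm = OpenSCManagerW(L"\\\\TARGET", NULL, SC_MANAGER_CREATE_SERVICE);
    if (!scm) {
        fwprintf(stderr, L"OpenSCManagerW failed: %lu\n", GetLastError());
        return 1;
    }

    // The binary path is as the *target* sees it, not the UNC path we copied to.
    // mysvc.exe must implement a service entry point (StartServiceCtrlDispatcher).
    SC_HANDLE svc = CreateServiceW(scm, L"MyDeploySvc", L"My Deploy Service",
                                   SERVICE_ALL_ACCESS, SERVICE_WIN32_OWN_PROCESS,
                                   SERVICE_DEMAND_START, SERVICE_ERROR_NORMAL,
                                   L"C:\\Windows\\mysvc.exe",
                                   NULL, NULL, NULL, NULL, NULL);
    if (!svc) {
        fwprintf(stderr, L"CreateServiceW failed: %lu\n", GetLastError());
        CloseServiceHandle(scm);
        return 1;
    }

    if (!StartServiceW(svc, 0, NULL))
        fwprintf(stderr, L"StartServiceW failed: %lu\n", GetLastError());

    // After the service reports completion (poll a registry key, RPC, etc.),
    // clean up: stop it, DeleteService(svc), and delete the copied file.
    CloseServiceHandle(svc);
    CloseServiceHandle(scm);
    return 0;
}
```

(Link with Advapi32.lib; the service executable itself would perform the actual install and report its status.)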
I've seen demos in C# using WMI, but I don't like those solutions. File sharing and RPC are most likely to be open in firewalls. If they aren't, file sharing and remote MMC management of the remote server wouldn't work. WMI can be blocked and still leave these functional.
I've worked with a lot of software that does remote installations, and a lot of them are not as reliable as pstools. My guess is that this is because those developers are using other methods that are not as likely to be open at the firewall level.
The simple solution is often the most elusive. As always, my hat is off to the SysInternals folks. They are true hackers in the positive, old school meaning of the word!
This sort of functionality is also available with products like LANDesk and Altiris. You need a daemonized listener on the client side that will listen for instructions/connections from the server. Once a connection is made, any number of things can happen: you can transfer files, kick off installation scripts, etc., usually transparently to any users on that box.
I've used the Twisted Framework (http://twistedmatrix.com) to do this with a small handful of Linux machines. It's Python and Linux, not Windows, but the premise is the same: a listening client accepts instructions from a server and executes them. Very simple.
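As a toy illustration of that premise (not the Twisted code itself, and with no authentication or encryption, so nothing you would deploy as-is): a small C listener, assuming port 9000 and a one-command-per-connection protocol, that executes whatever instruction the server sends.

```
/* Toy sketch of the listener premise: accept a connection, read one line,
 * execute it, and report the exit status. No auth/TLS -- illustration only.
 * The port number and the one-command protocol are assumptions. */
#include <arpa/inet.h>
#include <netinet/in.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <sys/socket.h>
#include <unistd.h>

int main(void)
{
    int srv = socket(AF_INET, SOCK_STREAM, 0);
    struct sockaddr_in addr = {0};
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    addr.sin_port = htons(9000);

    if (srv < 0 || bind(srv, (struct sockaddr *)&addr, sizeof addr) < 0 || listen(srv, 5) < 0) {
        perror("socket/bind/listen");
        return 1;
    }

    for (;;) {
        int cli = accept(srv, NULL, NULL);
        if (cli < 0)
            continue;

        char cmd[1024] = {0};
        ssize_t n = recv(cli, cmd, sizeof cmd - 1, 0);
        if (n > 0) {
            cmd[strcspn(cmd, "\r\n")] = '\0';   /* first line is the instruction   */
            int rc = system(cmd);               /* e.g. run a pushed install script */
            char reply[64];
            int len = snprintf(reply, sizeof reply, "exit status: %d\n", rc);
            send(cli, reply, (size_t)len, 0);
        }
        close(cli);
    }
}
```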
This functionality can also be accomplished with VB/Powershell scripts in a Windows-based domain.
