I have a standard Azure web role that requires some 3rd party software to be installed on the cloud service. The web role itself is very basic. The 3rd party software has a few prerequisites, so everything is being included in the role's content (a total of 5 MSI files) and is installed via an elevated startup task.
The software is installing successfully and everything's working, but including these MSIs as content for the webrole results in an 80MB *.cspkg file.
Excluding the installers yields a package size of 10MB. The total size of all 5 MSI files is 20.5MB. I don't understand why including roughly 20MB of installers results in an 80MB cloud service package.
I think the packaging process might be attempting to compress the files (and unintentionally increasing the payload). Is what I'm seeing normal, or is there any way of reducing the resulting package size when installers are included?
The package for web roles is usually bigger than the one for worker roles because your role content is included twice: once in the approot folder and once in the sites\0\ folder.
As for your second thought, compression: this is true. An Azure package is just a regular zip file. You can safely rename yourpackage.cspkg to yourpackage.zip and browse its contents.
However, your strategy of keeping installation files in the package is not effective at its root. What I suggest instead is that you put your 3rd party installers in blob storage, download them first, and then install them. For downloading the content you can use various techniques, from PowerShell to the Azure Bootstrapper and probably many more.
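For illustration, here is a minimal sketch of such a download step in C#, using the WindowsAzure.Storage client library; the container name, blob name, and local path are all placeholders, not anything from your setup:

using System.IO;
using Microsoft.WindowsAzure.Storage;

// Sketch only: download one installer from blob storage so a startup
// task can run it locally. Names below are hypothetical.
string connectionString = "<your storage connection string>";
var account = CloudStorageAccount.Parse(connectionString);
var blob = account.CreateCloudBlobClient()
                  .GetContainerReference("installers")
                  .GetBlockBlobReference("prerequisite.msi");
blob.DownloadToFile(@"C:\install\prerequisite.msi", FileMode.Create);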
Keeping any and all of your 3rd party installers in blob storage is a recommended best practice when designing applications for Windows Azure.
As to the question of why your web application is copied twice in the Azure deployment package: it has its roots in the days when the Windows Azure web role used the IIS Hostable Web Core rather than full IIS, and it probably remains for backward compatibility.
I found that if you add the MSI to the \bin folder of the Azure project, it isn't copied multiple times, in this style:
https://learn.microsoft.com/en-us/azure/cloud-services/cloud-services-dotnet-install-dotnet
Using Visual Studio 2017, Azure SDK 2.9, Build Engine version 14.0.27530.0
Related
Is there a way to copy only modified files to Service Fabric?
I have a Service Fabric application containing an ASP.NET 5 application as a service. Whenever I make a change to a JavaScript file inside my ASP.NET 5 service, I need to copy the entire Service Fabric application package again. Is there a command that allows copying only the modified file?
The best way to accomplish this is to use diff packaging and application upgrade. See this link for more info: https://azure.microsoft.com/en-us/documentation/articles/service-fabric-application-upgrade-advanced/. Diff packaging allows you to define an application package that contains only the parts you wish to upgrade. However, it works at the granularity of a component of an application package, such as a service or code package; you can't create a diff package at the file level. So if you've only changed a single file in your code package, you must include that file along with every other file that belongs to that code package. The benefit of diff packaging is that you'd only need to include that single code package; you wouldn't need to provide other services' code packages, assuming they haven't changed.
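As a hypothetical illustration (all names invented), a diff package that upgrades only one service's code package might be laid out like this:

MyAppDiffPackage\
    ApplicationManifest.xml      (complete, with a new application version)
    MyServicePkg\
        ServiceManifest.xml      (complete, with a new service version)
        Code\                    (the changed code package, in full)

Unchanged packages (Config, Data) and unchanged services' folders are simply omitted.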
Service Fabric SDK 2.5 brings in a preview feature called "Refresh Application".
Using this feature you can get quicker feedback on your code changes.
To enable it, set the following in the project properties:
Application Debug Mode = Refresh Application.
More details and limitations can be found here:
https://sharepointforum.org/threads/speed-up-service-fabric-development-with-the-new-refresh-application-debug-mode.111162/
In Service Fabric Explorer you need to find the node where your web application is running. In my case that is _Node_0.
By SF SDK design, locally published SF files live under C:\SfDevCluster\Data\_App\. In my environment, the website file path is C:\SfDevCluster\Data\_App\_Node_0\Application1Type_App1\Web1Pkg.Code.1.0.0\wwwroot\
So you can also find your HTML, CSS, JS and other static resources under a path of the form: C:\SfDevCluster\Data\_App\[node_id]\[application_type_and_instance_name]\[service_type_and_version]\
You can just modify the files in this folder and the change will apply immediately in your local test web browser. Note that if your service runs as micro-services across several nodes, you may need to modify the files on every node, because the load balancer may route a request to any of them.
Here is my situation: I have a web app that contains:
An .exe (which is a .net project along with assembly files and so on)
Zipped XML files
Folders containing JS and CSS files
Now, when executed, the .exe parses the XML inside the ZIPs to create HTML files (the end result is a complete HTML page that imports some of the JS libraries and CSS files).
Considering that I have basic experience with MS Azure, I am looking for a way to have my application run on Azure. My guess is that the zipped XMLs could be stored in blob storage along with the JS and CSS files. What I am not sure of is how to get the executable running there (possibly deploying the .exe with its corresponding resources, assemblies, DLLs, etc.) and have it execute from there.
If you really want to use a home-grown build process (your exe) then you need to use Cloud Services (your own VM), where you can run it and expose your website over whatever ports you want. However, since it sounds like you are new to .NET, I'd suggest reading up on ASP.NET MVC web projects. That way you can leverage Visual Studio for building the website and deploy to an Azure Website, which is designed to host websites.
For a project I am currently working on, I need to create a setup application for an existing desktop application. The setup application will be downloaded from a website, and will download required files to the correct locations. When the application is started, it will look for newer versions of these files, download them if any exist, then start the application.
I am using Visual Studio Online with TFVC, linked to Azure. I have a test application set up so that when I trigger a build, Release Management finds the build directory, and moves the files to Azure Blob Storage, but prepends a GUID to the file names being transferred. So what I have in my storage container is:
{Some GUID}/2390/Test.exe
{Some GUID}/2389/Test.exe
{Some GUID}/2387/Test.exe
...
What I want in my container is the latest version of Test.exe, so I can connect to the container, and determine whether I want to download or not.
I have put together a NullSoft installer that checks a website, and downloads files. I have also written a NullSoft "launcher" that will compare local file versions with versions on the website (using a version xml file on the website), and download if newer, then launch the application. What I need to figure out is how to get the newer files to the website after a build, with automation being one of the goals.
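For what it's worth, a version XML file like the one you describe might take a shape along these lines (the structure, file name, and URL are all hypothetical):

<?xml version="1.0" encoding="utf-8"?>
<files>
  <!-- The launcher compares these against the local file versions. -->
  <file name="Test.exe" version="1.0.2390.0"
        url="https://myexample.blob.core.windows.net/releases/Test.exe" />
</files>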
I am an intern, and new to deployment in general, and I don't even know if I'm going about this the right way.
Questions:
Does what I am doing make sense for what I am trying to accomplish?
We are trying to emulate ClickOnce functionality, but can't use ClickOnce due to the fact that the application dynamically loads a number of DLLs. Is there a way to configure ClickOnce to include non-referenced DLLs?
Is there a best practice for doing what I'm describing?
I appreciate any advice, links to references, or real-world examples.
You are mentioning ClickOnce, which you investigated but can't use. Have you already tried an alternative, Squirrel? With Squirrel you can specify which files should be part of the installation, so you can explicitly include DLLs even if you load them dynamically.
Link: https://github.com/Squirrel/Squirrel.Windows
Squirrel is a full framework for creating an auto-updating application, and it can work with Azure Blob Storage hosting (and also a CDN if you need to scale up).
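As a rough sketch of the client side (the hosting URL is a placeholder, and this is not a drop-in solution), the update check with Squirrel looks roughly like this:

using Squirrel;

// Point UpdateManager at wherever the releases are hosted, e.g. an
// Azure blob container. The URL below is hypothetical.
using (var mgr = new UpdateManager("https://myexample.blob.core.windows.net/releases"))
{
    await mgr.UpdateApp(); // downloads and applies any newer release
}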
In my code (which runs in a worker role) I need to specify a path to a directory (a third-party library requires it). Locally I've included the folder in the project and just give the full path to it. However, after deployment I of course need the new path. How do I confirm that the whole folder has been deployed, and how do I determine its new path?
Edit:
I added the folder to the role node in Visual Studio and accessed it like this: Path.Combine(Environment.GetEnvironmentVariable("RoleRoot"), "my_folder")
Will this directory be used for reading and writing? If yes, you should use a LocalStorage resource. https://azure.microsoft.com/en-us/documentation/articles/cloud-services-configure-local-storage-resources/ shows how to use this.
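A minimal sketch of reading such a resource, assuming a LocalStorage resource named "MyStorage" has been declared in the service definition:

using Microsoft.WindowsAzure.ServiceRuntime;

// "MyStorage" is a placeholder; it must match the LocalStorage name
// declared in ServiceDefinition.csdef.
LocalResource storage = RoleEnvironment.GetLocalResource("MyStorage");
string workingDir = storage.RootPath; // directory this instance can read and write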
If the directory is only for reading (i.e. you have binaries or config files there), then you can use the %RoleRoot% environment variable to identify the path where your package was deployed, then just append whatever folder you referenced in your project (i.e. %RoleRoot%\Myfiles).
I'd take a slightly different approach. Place the 3rd party package into Windows Azure blob storage, then during role startup, you can download/extract it and place the files into the available Local storage (giving it whatever permissions the app needs). Then leverage that location from your application via the same local storage configuration entry.
This should help you reduce the size of your deployment package as well as give you the ability to update the 3rd party components without completely redeploying your solution. And by leveraging it on startup, you can guarantee that the files will be there in case the role instance gets torn down and rebuilt.
I have a Ruby on Rails application that needs to find a home in an Azure Worker Role.
I currently automate the deployment of the application with a batch file: a file that takes the Apache and Ruby installers, runs them, and then drops the RoR app in the appropriate directory. After the batch script finishes, Apache is serving to and from the application via port 80.
I'm new to Azure and trying to figure out how to do this.
From my understanding, I have two options here: OnStart with the installation files in blob storage, or a startup script. I'm not sure how to do the latter, but I have located the OnStart method within the WorkerRole.vb file in the new Azure project I just created.
My question: Is it recommended to use OnStart to deploy the application (using the batch script)? If so, how would I go about integrating the script into the project? And - how do I get started with storing and referencing the files in blob storage?
I know these are super high-level questions. Any input or suggested reading would be super helpful. I have tried to google / search for relevant resources but haven't been able to find much. Thank you for your time!
Inside the OnStart() method it is best to do role configuration things, i.e. IP binding, etc. However, if you want to install a runtime, download an application ZIP, or configure role-specific settings, it is best to use a startup task. Please visit my blog post Windows Azure: Startup task or OnStart(), which to choose? to learn more about it.
Now, in your case it is best to use a startup task. What you can do is as follows:
Package your RoR app as a ZIP and place it in Windows Azure Blob Storage
Create a command batch file which will:
2.1 Download the ZIP
2.2 Unzip the contents to a specific location
2.3 Report status back to Azure Blob Storage (optional)
In your OnStart() method you just need to configure the RoR port binding
The code will look as below if you have a TCP endpoint named "RORWeb80" set to use port 80:
// Bind a listener to the IP and port Azure assigned to the "RORWeb80" endpoint.
TcpListener RoRPortListener = new TcpListener(RoleEnvironment.CurrentRoleInstance.InstanceEndpoints["RORWeb80"].IPEndpoint);
RoRPortListener.Start();
I have written a sample app for a Tomcat/Java based worker role which does exactly the same thing. So you can just replace the Tomcat ZIP file with your RoR ZIP and reuse the code exactly.
As long as you don't need admin-level access (e.g. modifying the registry, installing MSIs, etc.) you can do your setup from OnStart(), including launching your script. Just include the startup script with your project (don't forget to set Copy Local to true).
The same goes for a startup script: you call your cmd file, which then executes the sequence for you. And if you give it elevated permissions, you can run installers, modify registry settings, install custom perf counters, whatever.
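For reference, an elevated startup task is declared in ServiceDefinition.csdef along these lines (the script name setup.cmd is just an example):

<Startup>
  <!-- executionContext="elevated" gives the script admin rights;
       taskType="simple" blocks role startup until the task exits. -->
  <Task commandLine="setup.cmd" executionContext="elevated" taskType="simple" />
</Startup>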
In either case, you can keep your Apache ZIP, Ruby installers, etc. in blob storage and, at startup, download them to local storage. This saves you from bundling everything within the deployment package, which gives you a few advantages (being able to update Ruby/Apache without redeploying, reduced package size, etc.).
There's a sample app on CodePlex that demonstrates the basics of setting up Tomcat via a startup script. For one more example, you can look at the scripts installed via the Eclipse Windows Azure plugin for Java; these scripts are quite similar. The key is to have some way of downloading files from blob storage and then unzipping them. The CodePlex project I referred to points to a sample app that does simple blob downloading. The Eclipse packaging provides similar functionality in a .vbs app. Here's a snippet of one of my scripts from an Eclipse-based project:
SET SERVER_DIR_NAME=apache-tomcat-7.0.25
SET WAR_NAME=myapp.war
rem Replace any stale link, then map a short path to the role's approot.
rd "\%ROLENAME%"
mklink /D "\%ROLENAME%" "%ROLEROOT%\approot"
cd /d "\%ROLENAME%"
rem Unpack the JRE and Tomcat archives next to the role content.
cscript /NoLogo util\unzip.vbs jre7.zip "%CD%"
cscript /NoLogo util\unzip.vbs tomcat7.zip "%CD%"
rem Deploy the WAR, point JAVA_HOME at the unpacked JRE, and start Tomcat.
copy %WAR_NAME% "%SERVER_DIR_NAME%\webapps\%WAR_NAME%"
cd "%SERVER_DIR_NAME%\bin"
set JAVA_HOME=\%ROLENAME%\jre7
set PATH=%PATH%;%JAVA_HOME%\bin
cmd /c startup.bat
The codeplex project has a similar-looking script.
Don't forget: you'll need to set up an Input Endpoint for your role (part of the role properties).
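Tying this back to the earlier snippet, that endpoint could be declared in ServiceDefinition.csdef like so (name and port chosen to match the example above):

<Endpoints>
  <!-- The name must match the key used with InstanceEndpoints in OnStart(). -->
  <InputEndpoint name="RORWeb80" protocol="tcp" port="80" />
</Endpoints>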
To get blobs into blob storage, there are both free tools (like ClumsyLeaf CloudXplorer) and paid tools (such as Cerebrata's Cloud Storage Studio).
To download blobs to local storage, you can either write a few lines of .net code (from OnStart) or just use the utility pointed to in the codeplex project.