How to establish continuous deployment of a non-.NET project/solution to Azure? - azure

I have connected Visual Studio Online to my Azure website. This is not an ASP.NET MVC project, just several static HTML files.
Now I want my files to be uploaded to Azure and available 'online' after my commits/pushes to TFS.
When a build definition (based on GitContinuousDeploymentTemplate.12.xaml) is executed it fails with an obvious message:
Exception Message: The process parameter ProjectsToBuild is required but no value was set.
My question: how do I set up a build definition so that it automatically copies my static files to Azure on commits?
Or do I need to use different tooling for this task (like WebMatrix)?
Update
I ended up creating an empty website and deploying it manually from Visual Studio using Web Deploy. Another option to consider is creating a local Git repository at Azure.

Alright, let me try to give you an answer:
I was having quite a similar issue. I had a static HTML, JS and CSS site which I needed to keep in TFS because of the project, and I wanted to make my life easier using continuous deployment. So what I did was the following:
When you have a Git in TFS, you get an URL for the repository - something like:
https://yoursite.visualstudio.com/COLLECTION/PROJECT/_git/REPOSITORY
However, in order to access the repository itself you need to authenticate, and that is not currently possible directly. If you try to give Azure the URL with the credentials embedded:
https://username:password@TFS_URL
it will not accept it. So, in order to bind the deployment, you just put the plain repository URL there (the deployment will fail, but it will prepare the environment for us to proceed).
Once the repository is linked, you can get the DEPLOYMENT TRIGGER URL on the Configure tab of the website. Its purpose is that when you push a change to your repository (say, on GitHub), GitHub makes an HTTP POST request to that link, which tells Azure to deploy the new code onto the site.
Now, I went to Kudu, which is the underlying system of Azure Websites that handles the deployments. I figured out that if you send the correct content in the HTTP POST (JSON format) to the DEPLOYMENT TRIGGER URL, you can have it deploy code from any repository, and it even authenticates!
So the only thing left to do is to generate the alternate authentication credentials on the TFS site and put the whole request together. I wrapped this entire process into the following PowerShell script:
# Windows Azure Website Configuration
#
# WAWS_username: The user account which has access to the website, can be obtained from https://manage.windowsazure.com portal on the Configure tab under DEPLOYMENT TRIGGER URL
# WAWS_password: The password for the account specified above
# WAWS: The Azure site name
$WAWS_username = ''
$WAWS_password = ''
$WAWS = ''
# Visual Studio Online Repository Configuration
#
# VSO_username: The user account used for basic authentication in VSO (has to be manually enabled)
# VSO_password: The password for the account specified above
# VSO_URL: The URL to the Git repository (the branch is specified on the https://manage.windowsazure.com Configuration tab under BRANCH TO DEPLOY)
$VSO_username = ''
$VSO_password = ''
$VSO_URL = ''
# DO NOT EDIT ANY OF THE CODE BELOW
$WAWS_URL = 'https://' + $WAWS + '.scm.azurewebsites.net/deploy'
$BODY = '
{
"format": "basic",
"url": "https://' + $VSO_username + ':' + $VSO_password + '#' + $VSO_URL + '"
}'
$authorization = "Basic "+[System.Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes($WAWS_username+":"+$WAWS_password ))
$bytes = [System.Text.Encoding]::ASCII.GetBytes($BODY)
$webRequest = [System.Net.WebRequest]::Create($WAWS_URL)
$webRequest.Method = "POST"
$webRequest.Headers.Add("Authorization", $authorization)
$webRequest.ContentLength = $bytes.Length
$webRequestStream = $webRequest.GetRequestStream();
$webRequestStream.Write($bytes, 0, $bytes.Length);
$webRequest.GetResponse()
I hope that what I wrote here makes sense. The last thing you would need is to bind this script to a Git hook, so that when you perform a push the script gets triggered automatically and the site is deployed. I haven't figured this piece out yet, though; one possible approach is sketched below.
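As a hedged sketch of that missing piece (Git has no client-side "post-push" hook, so this uses a Git alias that chains the push and the deployment script; the script name deploy.ps1 and its location in the repository root are assumptions), you could run the following from a PowerShell prompt inside the repository:
# One-time setup: define a 'pushdeploy' alias that pushes and, if the push succeeds,
# runs the deployment trigger script saved as deploy.ps1 (assumed name and location)
git config alias.pushdeploy '!git push && powershell.exe -NoProfile -ExecutionPolicy Bypass -File deploy.ps1'
# From then on, push and deploy in one step:
git pushdeploy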
This should also work for deploying PHP, Node.js and similar code.

The easiest way would be to add them to an empty ASP.NET project, set them to be copied to the output folder, and then "build" the project.
Failing that, you could modify the build process template, but that's a "last resort" option.

Related

Azure Functions NodeJs: Remove Http Response Header

I have an HTTP-triggered Node.js Azure Function, and I'm looking to remove the "X-Powered-By" header from my response, but have found no way to do so.
I've tried adding both this and this Azure site extension, but neither has worked for me.
Setting the response header manually, i.e. res.headers = { ['x-powered-by']: null }, is ineffective.
Based on the comments made on this GitHub issue: https://github.com/Azure/Azure-Functions/issues/290, it would seem that using either extension should have removed the headers you wanted.
Modifying the response headers likely won't work, as they are probably added further down the pipeline by the function host and are not overridable; see:
Access Azure Function runtime settings
Azure Functions recently removed the x-aspnet-version header; removal of other headers is tracked as part of the azure-webjobs-script-sdk here.
You should leave a comment on the GitHub issue to discuss it further with the team working on this.
There is an extension called Remove Custom Headers that works for Web Apps but not for functions that have their own resource group. So, what you can do is:
1. Create a regular Web App
2. Create a function and make sure you use the same Hosting Plan as the Web App (do not use Consumption).
3. Once the function is created, install the extension named: "Remove Custom Headers"
4. Restart the function and the headers (Server and X-Powered-By) should disappear; a quick way to verify is sketched below.
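A quick, hedged way to verify the result from a PowerShell prompt (the function URL below is a placeholder for your own endpoint):
# Call the function and list only the headers in question; if the extension worked,
# Server and X-Powered-By should no longer appear in the output
$response = Invoke-WebRequest -Uri 'https://myfunctionapp.azurewebsites.net/api/MyFunction'
$response.Headers.GetEnumerator() | Where-Object { $_.Key -in 'Server', 'X-Powered-By' }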

Errors deploying Node.js app

So I am new to IBM Bluemix and all of its products, and I am trying to do this project: http://www.ibm.com/developerworks/library/ba-muse-toycar-app/index.html. I have done all of the modification of the car; I am just having issues with the code.
I have a few specific questions. In Part 2, step 2.b, when entering the information for the Cloudant database, what do I put in for the cradle connection, and how do I acquire that information?
Second, when I go to deploy the app (Part 2, Step 2.4), how do I navigate to the application directory? I have looked at the help and googled to no avail. If we fix these things I am hoping I will be able to deploy the application. However, currently when I try to deploy it I get this error:
cf push braincar
Updating app braincar in org ccornwe1@students.kennesaw.edu / space dev as myemailaddress@gmail.com...
OK
Uploading braincar...
FAILED
Error uploading application.
open /Users/codycornwell/.rnd: permission denied
I am green to all this, so any help and explanation is greatly appreciated! Thanks!
In the tutorial's part 2, step 2.b, you need to specify your Cloudant credentials. There are several ways to get Cloudant credentials, but I'll focus on doing it within the context of Bluemix and the cf command line tool.
You will first need to create a Cloudant service instance, then create a set of service keys (credentials) and then view them.
Create a Cloudant service instance named myCloudantSvc using the Shared plan:
$> cf create-service cloudantNoSQLDB Shared myCloudantSvc
Create a set of service keys (credentials) named cred1:
$> cf create-service-key myCloudantSvc cred1
View the credentials for the service key you just created
$> cf service-key myCloudantSvc cred1
With the last step above, you should see output which provides you with the username, password and host values that you'll need to place into your app.js code. It should look something like the following:
{
"host": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx-bluemix.cloudant.com",
"password": "longSecretPassword",
"port": 443,
"url": "https://xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx-bluemix:longSecretPassword#xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx-bluemix.cloudant.com",
"username": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx-bluemix"
}
For your second question, it looks like you're performing the cf push from your $HOME directory (as mentioned in the comment by @vmovva). By default, the cf push command will send all files in the current directory to Bluemix/CloudFoundry.
Try running the command from the directory where your source code is located to reduce the files pushed to Bluemix. If your source code is intermingled in your $HOME directory, move your source into a different directory and then push from that directory.
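For example (directory names are placeholders):
# Change to the folder that contains the app's source and manifest, then push
cd ~/projects/braincar
cf push braincar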

Add GitLab Web hook for all projects in group

I would like all my projects in a GitLab group to have shared configuration for a webhook:
<MY_JENKINS_INSTANCE>/git/notifyCommit?url=$CHANGED_REPOSITORY
GitLab webhook documentation suggests it should be possible:
If you have a big set of projects in the one group then it will be convenient for you to configure web hooks globally for the whole group. You can add the group level web hooks on the group settings page.
That sounds exactly like what I am after, though I see no such thing on the group settings page in my GitLab 7.0.0. I was not able to tell from the changelog whether this feature is newer than my version.
Does the feature exist? How do I use it?
That's possible in the enterprise version only:
In GitLab Enterprise Edition you can configure web hooks globally for the whole group. You can add the group level web hooks on the group settings page Settings > Web Hooks.
Following up on @VertigoRay's comments, here's a procedure to do it using the GitLab CE API:
Have or create a user in GitLab, and create a personal access token with api scope:
User (top right avatar) > Settings (menu) > Access tokens (sidebar)
Check api scope (checkbox)
Click on create personal access token (button)
<my_personal_token> is the value in Your New Personal Access Token (text field)
Perform an HTTP request to get all projects:
GET https://gitlab.example.com/api/v4/projects
Private-Token: <my_personal_token>
Accept: application/json
For each project in the response:
Take the id, which is the <project_ID> to be used in the next request URL
URL-encode the value of ssh_url_to_repo to obtain <encoded_ssh_url>
Example: ssh://git#example.com:1234/group/alpha.git becomes ssh%3A%2F%2Fgit%40example.com%3A1234%2Fgroup%2Falpha.git
For each project, perform an HTTP request to create a hook:
POST https://gitlab.example.com/api/v4/projects/<project_ID>/hooks
Private-Token: <my_personal_token>
Content-Type: application/json
{
"url": "https://jenkins.example.com/git/notifyCommit?url=<encoded_ssh_url>",
"enable_ssl_verification": true
}
This should be scripted in the language of your choice; a rough PowerShell sketch follows.
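Here is a rough PowerShell sketch of the procedure above (the token, GitLab host and Jenkins URL are placeholders, and the /projects endpoint is paginated, so a real script would loop over pages):
# Placeholder values - replace with your own
$token   = '<my_personal_token>'
$gitlab  = 'https://gitlab.example.com/api/v4'
$headers = @{ 'Private-Token' = $token }
# Get the projects (first page only here; add paging for large instances)
$projects = Invoke-RestMethod -Uri "$gitlab/projects?per_page=100" -Headers $headers
foreach ($project in $projects) {
    # URL-encode the SSH clone URL before embedding it in the Jenkins notify URL
    $encoded = [uri]::EscapeDataString($project.ssh_url_to_repo)
    $body = @{
        url                     = "https://jenkins.example.com/git/notifyCommit?url=$encoded"
        enable_ssl_verification = $true
    } | ConvertTo-Json
    # Create the hook for this project
    Invoke-RestMethod -Method Post -Uri "$gitlab/projects/$($project.id)/hooks" -Headers $headers -ContentType 'application/json' -Body $body
}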
Not suitable as a persistent solution, but this might be useful for someone looking for a one-time change (from the raketasks documentation):
Add a webhook for projects in a given NAMESPACE
# omnibus-gitlab
sudo gitlab-rake gitlab:web_hook:add URL="http://example.com/hook" NAMESPACE=acme
# source installations
bundle exec rake gitlab:web_hook:add URL="http://example.com/hook" NAMESPACE=acme RAILS_ENV=production

How to Upload images from local folder to Sitecore

webClient.UploadFile("http://www.myurl.com/~/media/DCF92BB74CDA4D558EEF2D3C30216E30.ashx", @"E:\filesImage\Item.png");
I'm trying to upload images to Sitecore using the WebClient.UploadFile() method by sending my Sitecore address and the path of my local images, but I'm not able to upload them. I have to do this without any APIs or Sitecore instances.
The upload process would be the same as with any ASP.NET application. However, once the file has been uploaded you need to create a media item programmatically. You can do this from an actual file in the file system, or from a memory stream.
The process involves using a MediaCreator object and calling its CreateFromFile method.
This blog post outlines the whole process:
Adding a file to the Sitecore Media Library programatically
If you're simply thinking about optimizing your developer workflow, you could use the Sitecore PowerShell Extensions with the Remoting API as described in this blog post; a rough sketch is below.
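A rough sketch of that remoting approach (assumptions: the Sitecore PowerShell Extensions and its Remoting module are installed, and the host name, credentials and file path are placeholders; the server-side block mirrors the MediaCreator approach shown later in this thread):
# Requires the SPE Remoting module on the client and SPE on the Sitecore server
Import-Module -Name SPE
$session = New-ScriptSession -Username 'admin' -Password 'b' -ConnectionUri 'http://mysitecore.local'
Invoke-RemoteScript -Session $session -ScriptBlock {
    # Runs server side: create a media item from a file that already exists on the server
    $options = New-Object Sitecore.Resources.Media.MediaCreatorOptions
    $options.Database    = [Sitecore.Configuration.Factory]::GetDatabase('master')
    $options.Destination = '/sitecore/media library/Images/Item'
    $creator = New-Object Sitecore.Resources.Media.MediaCreator
    $creator.CreateFromFile('C:\inetpub\wwwroot\Data\Item.png', $options) | Out-Null
}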
If you want to go the web service route, there are a number of options:
a) Sitecore Rocks WebService (if you are allowed to install it, or it is already available).
b) Sitecore Razl service (a third-party tool which needs a license).
c) Sitecore PowerShell Remoting (this needs the Sitecore PowerShell Extensions to be installed on the Sitecore server).
d) You can also use the Sitecore web service found under sitecore\shell\WebService\Service.asmx (but this is the legacy predecessor of the newer Sitecore Item Web API).
e) Last is my enhanced SitecoreItemWebAPI (this also needs SitecoreItemWebApi 1.2 as a prerequisite).
In the end, except for option d, you need to install something in order to upload the image over HTTP, and you will also need valid credentials to use any of the above methods.
If your customers upload the image on the website, you need to create the item in your master database (this requires access and write rights on the master database); depending on your security requirements, you might consider not building it with custom code.
Instead, you could use the Sitecore Web Forms for Marketers module with its out-of-the-box file upload: create a form with an upload field and use the WFFM web services.
If you don't want to use the Sitecore API, then you can do the following (a minimal sketch follows this list):
Write code that uploads images into this folder: [root]/upload/
You might need to create a folder structure that represents how the images are stored in Sitecore, e.g. images uploaded into [root]/upload/Import/ will be stored in /sitecore/media library/Import
Sitecore will automatically upload these images into the Media Library
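A minimal hedged sketch of that approach (the server share path is a placeholder and assumes the web root is reachable over the network):
# Copy local images into the Sitecore upload watcher folder; files dropped in
# upload\Import end up in /sitecore/media library/Import once Sitecore picks them up
Copy-Item -Path 'E:\filesImage\*.png' -Destination '\\sitecoreserver\wwwroot\upload\Import\'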
Hope this helps
Another option: you can use the Item Web API for this. No reference to any Sitecore DLL is needed; you will only need access to the host and to be able to enable the Item Web API.
References:
Upload the files using it: http://www.sitecoreinsight.com/how-create-media-items-using-sitecore-item-web-api/
Enable Item Web Api: http://sdn.sitecore.net/upload/sdn5/modules/sitecore%20item%20web%20api/sitecore_item_web_api_developer_guide_sc66-71-a4.pdf#search=%22item%22
I guess that is pretty much what you need, but as Jay S mentioned, putting more details in your question helps in finding the best option for your particular case.
private void CreateImageIteminSitecore()
{
    // Local image file to import into the media library
    string filePath = @"C:\Sitecore\Website\ImageTemp\Pic.jpg";
    using (new SecurityDisabler())
    {
        Database masterDb = Sitecore.Configuration.Factory.GetDatabase("master");
        Sitecore.Resources.Media.MediaCreatorOptions options = new Sitecore.Resources.Media.MediaCreatorOptions();
        options.FileBased = true;
        options.AlternateText = Path.GetFileNameWithoutExtension(filePath);
        options.Destination = "/sitecore/media library/Downloads/";
        options.Database = masterDb;
        options.Versioned = false; // Do not make a versioned template
        options.KeepExisting = false;
        Sitecore.Data.Items.MediaItem mediaitemImage = new Sitecore.Resources.Media.MediaCreator().CreateFromFile(filePath, options);
        Item ImageItem = masterDb.GetItem(mediaitemImage.ID.ToString());
        ImageItem.Editing.BeginEdit();
        ImageItem.Name = Path.GetFileNameWithoutExtension(filePath);
        ImageItem.Editing.EndEdit();
    }
}

How do I modify the Site Collection in SharePoint 2013?

When I try to open a form published from InfoPath I now get this error:
"The following location is not accessible, because it is in a different site collection:
https://portal/sites/forms/Daily%20Activity/Forms/template.xsn?SaveLocation=https://portal.alamedacountyfire.org/sites/forms/Daily%20Activity/&Source=https://portal.alamedacountyfire.org/sites/forms/Daily%2520Activity/Forms/AllItems.aspx&ClientInstalled=false&OpenIn=Browser&NoRedirect=true&XsnLocation=https://PORTAL/sites/forms/Daily%20Activity/Forms/template.xsn."
Correlation ID:12c0ab9c-caff-80a8-f1b4-64d81dcfa6ea
Following are some options that you can try:
1) Save the form template (.xsn) as source files in the publish options. Look at the manifest file in Notepad and see if you can find a reference to the incorrect location. If so, correct it and republish the form.
2) Clear the InfoPath cache on that machine. Start->Run "infopath /cache clearall"
3) See if the site collection has a managed path; if so, give the proper URL while publishing. The XSN might be getting deployed to the root site and throwing an error since the intended list doesn't exist.
I found this worked for me. Got the answer from another post.
"I had a similar problem and found it was due to the request management service routing from my web application host header to the server name.
There was a routing rule in my request management settings. I just disabled routing and the problem went away. I used the following PowerShell to disable it."
$w = Get-SPWebApplication "http://webapphostname"
$r = $w | Get-SPRequestManagementSettings
$r.RoutingEnabled = $false
$r.Update()
You may want to configure it rather than disable it. Here’s a good resource to get you started:
http://www.harbar.net/articles/sp2013rm1.aspx
