Programmatically export Azure SQL as DACPAC to Blob Storage

I would like to perform a scheduled task that exports an Azure SQL database as a DACPAC to Blob Storage. How can I do this? A WebJob? A PowerShell script?

You can also do this with a WebJob. I created a demo with the Microsoft.Azure.Management.Sql -Pre .NET SDK, and it works successfully for me.
For more information about how to deploy a WebJob and create a scheduled job, please refer to the following documents:
creating-and-deploying-microsoft-azure-webjobs
create-a-scheduled-webjob-using-a-cron-expression
The following are my detailed steps and sample code.
Prerequisites:
Register an app in Azure AD and create a service principal for it. For detailed steps on how to register the app and get an access token, please refer to the documentation.
Steps:
1. Create a C# console application.
2. Get an access token using the app registered in Azure AD:
public static string GetAccessToken(string tenantId, string clientId, string secretKey)
{
    var clientCredential = new ClientCredential(clientId, secretKey);
    var context = new AuthenticationContext("https://login.windows.net/" + tenantId);
    var accessToken = context.AcquireTokenAsync("https://management.azure.com/", clientCredential).Result;
    return accessToken.AccessToken;
}
3. Create an Azure SqlManagementClient object:
SqlManagementClient sqlManagementClient = new SqlManagementClient(new TokenCloudCredentials(subscriptionId, GetAccessToken(tenantId, clientId, secretKey)));
4. Use sqlManagementClient.ImportExport.Export to export the database as a .bacpac file to Azure Storage:
var export = sqlManagementClient.ImportExport.Export(resourceGroup, azureSqlServer, azureSqlDatabase,
    exportRequestParameters);
5. Go to the bin/Debug path of the application and add all the contents to a .zip file.
6. Add the WebJob from the Azure portal.
7. Check the WebJob log from the Kudu tool.
8. Check the backup file in Azure Storage.
For SDK information, refer to the packages.config file:
<?xml version="1.0" encoding="utf-8"?>
<packages>
  <package id="Hyak.Common" version="1.0.2" targetFramework="net452" />
  <package id="Microsoft.Azure.Common" version="2.1.0" targetFramework="net452" />
  <package id="Microsoft.Azure.Common.Dependencies" version="1.0.0" targetFramework="net452" />
  <package id="Microsoft.Azure.Management.Sql" version="0.51.0-prerelease" targetFramework="net452" />
  <package id="Microsoft.Bcl" version="1.1.9" targetFramework="net452" />
  <package id="Microsoft.Bcl.Async" version="1.0.168" targetFramework="net452" />
  <package id="Microsoft.Bcl.Build" version="1.0.14" targetFramework="net452" />
  <package id="Microsoft.IdentityModel.Clients.ActiveDirectory" version="2.28.3" targetFramework="net452" />
  <package id="Microsoft.Net.Http" version="2.2.22" targetFramework="net452" />
  <package id="Microsoft.Web.WebJobs.Publish" version="1.0.12" targetFramework="net452" />
  <package id="Newtonsoft.Json" version="6.0.4" targetFramework="net452" />
</packages>
Demo code:
static void Main(string[] args)
{
    var subscriptionId = "Your Subscription Id";
    var clientId = "Your Application Id";
    var tenantId = "tenant Id";
    var secretKey = "secretkey";
    var azureSqlDatabase = "Azure SQL Database Name";
    var resourceGroup = "Resource Group of Azure SQL";
    var azureSqlServer = "Azure Sql Server";
    var adminLogin = "Azure SQL admin login";
    var adminPassword = "Azure SQL admin password";
    var storageKey = "Azure storage Account Key";
    var baseStorageUri = "Azure storage URI"; // including the container name, ending with "/"
    var backName = azureSqlDatabase + "-" + $"{DateTime.UtcNow:yyyyMMddHHmm}" + ".bacpac"; // backup file name
    var backupUrl = baseStorageUri + backName;
    ImportExportOperationStatusResponse exportStatus = new ImportExportOperationStatusResponse();
    try
    {
        ExportRequestParameters exportRequestParameters = new ExportRequestParameters
        {
            AdministratorLogin = adminLogin,
            AdministratorLoginPassword = adminPassword,
            StorageKey = storageKey,
            StorageKeyType = "StorageAccessKey",
            StorageUri = new Uri(backupUrl)
        };
        SqlManagementClient sqlManagementClient = new SqlManagementClient(new TokenCloudCredentials(subscriptionId, GetAccessToken(tenantId, clientId, secretKey)));
        var export = sqlManagementClient.ImportExport.Export(resourceGroup, azureSqlServer, azureSqlDatabase,
            exportRequestParameters); // start the export operation
        while (exportStatus.Status != OperationStatus.Succeeded) // poll until the operation succeeds
        {
            Thread.Sleep(1000 * 60);
            exportStatus = sqlManagementClient.ImportExport.GetImportExportOperationStatus(export.OperationStatusLink);
        }
        Console.WriteLine($"Export database {azureSqlDatabase} to storage wxtom2 successfully");
    }
    catch (Exception)
    {
        //todo
    }
}

Hi, have you had a look at the following documentation? It includes a PowerShell script and an Azure Automation reference with a sample script.
https://learn.microsoft.com/en-us/azure/sql-database/sql-database-export-powershell

Related

Azure SDK - Add/Remove Virtual Network from App Service

We have a need to add or remove Web Apps from Virtual Networks.
We have the virtual network already set up, so we created an app to create subnets and then allocate the Web App to the VNet using that subnet.
I have the following SDKs consumed in our C# project:
<PackageReference Include="Microsoft.Azure.Management.AppService.Fluent" Version="1.38.0" />
<PackageReference Include="Microsoft.Azure.Management.Fluent" Version="1.38.0" />
<PackageReference Include="Microsoft.Azure.Management.Network.Fluent" Version="1.38.0" />
<PackageReference Include="Microsoft.Azure.Services.AppAuthentication" Version="1.6.2" />
I have tried to find a way to update the network settings for my Web App, but cannot seem to find an Update that works with it.
In the code below I have added a comment on the bit I don't know how to do.
var networks = azure.Networks.List().ToList();
var mainVN = networks.Find(n => n.Key == "MainVn");
webApp = azure.WebApps.Find(w => w.Name == "PrimaryApp");
var cidr = $"10.0.1.0/24";
string subNetName = webApp.Name + "-subnet";
await mainVN.Update().WithSubnet(subNetName, cidr).ApplyAsync();
webApp.Update() // <----- how do I do this bit? To add the subnet/VN to the app service?
// DELETE
await azure.WebApps.Manager.Inner.WebApps.DeleteSwiftVirtualNetworkAsync("rg", "webapp");
// ADD
await azure.WebApps.Manager.Inner.WebApps.CreateOrUpdateSwiftVirtualNetworkConnectionAsync("rg", "webapp",
new Microsoft.Azure.Management.AppService.Fluent.Models.SwiftVirtualNetworkInner {
});
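To complete the ADD call, the empty SwiftVirtualNetworkInner above still has to reference the subnet created earlier. A minimal sketch, assuming the inner model exposes a SubnetResourceId property (the resource-ID segments below are placeholders for your environment):
// Sketch only: the Swift connection envelope needs the full ARM resource ID of the subnet.
var subnetResourceId =
    "/subscriptions/<subscription-id>/resourceGroups/<vnet-resource-group>" +
    "/providers/Microsoft.Network/virtualNetworks/MainVn/subnets/" + subNetName;

await azure.WebApps.Manager.Inner.WebApps.CreateOrUpdateSwiftVirtualNetworkConnectionAsync("rg", "webapp",
    new Microsoft.Azure.Management.AppService.Fluent.Models.SwiftVirtualNetworkInner
    {
        // Assumption: SubnetResourceId is the property that ties the connection to the subnet.
        SubnetResourceId = subnetResourceId
    });
Note that regional VNet integration generally also requires the subnet to be delegated to Microsoft.Web/serverFarms, otherwise the call tends to fail.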

Is there an equivalent of Set-AzureRmKeyVaultAccessPolicy in REST API or .Net SDK?

I need to programmatically set permissions on Azure Key Vault, and the closest I have got to it is the Set-AzureRmKeyVaultAccessPolicy PowerShell command.
Is there an equivalent in the .NET SDK for that, or perhaps in the REST API?
Here you go; you could probably find something similar for the .NET SDK.
Also, if you run Set-AzureRmKeyVaultAccessPolicy -Debug you will find the information needed:
DEBUG: ============================ HTTP REQUEST ============================
HTTP Method:
PUT
Absolute Uri:
https://management.azure.com/subscriptions/xxx/resourceGroups/xxx/providers/Microsoft.KeyVault/vaults/xxx?api-version=2015-06-01
Body {Omitted}
edit: For future reference, PowerShell uses the REST APIs. If there is a PS command for it, there is definitely a REST endpoint. By Junnas
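To illustrate, a rough sketch of calling that same REST endpoint from C# with HttpClient. This is not what the cmdlet does internally verbatim; the bearer token (accessToken, acquired the same way as in the SDK answer below), subscription, resource group, vault name and the JSON body are all placeholders, with the body shape following the Microsoft.KeyVault/vaults PUT contract shown in the -Debug output:
// Sketch: PUT the vault definition (including access policies) to the ARM endpoint.
var httpClient = new System.Net.Http.HttpClient();
httpClient.DefaultRequestHeaders.Authorization =
    new System.Net.Http.Headers.AuthenticationHeaderValue("Bearer", accessToken);

var requestUri = "https://management.azure.com/subscriptions/xxx/resourceGroups/xxx" +
                 "/providers/Microsoft.KeyVault/vaults/xxx?api-version=2015-06-01";

// Minimal placeholder body: location, tenant, sku and the access policies the vault should have.
var body = @"{
  ""location"": ""eastasia"",
  ""properties"": {
    ""tenantId"": ""<tenant-guid>"",
    ""sku"": { ""family"": ""A"", ""name"": ""standard"" },
    ""accessPolicies"": [ {
      ""tenantId"": ""<tenant-guid>"",
      ""objectId"": ""<principal-object-id>"",
      ""permissions"": { ""keys"": [ ""get"", ""list"" ], ""secrets"": [ ""all"" ] }
    } ]
  }
}";

var response = httpClient.PutAsync(requestUri,
    new System.Net.Http.StringContent(body, System.Text.Encoding.UTF8, "application/json")).Result;
Console.WriteLine(response.StatusCode);
Unlike Set-AzureRmKeyVaultAccessPolicy, a raw PUT replaces the vault's whole access policy list, so you would normally GET the vault first and merge your new policy into it.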
We can use the Microsoft Azure Key Vault Management SDK to do that. It is a preview version. We can create or update a Key Vault using the keyVaultManagementClient.Vaults.CreateOrUpdateAsync() function.
I did a demo for it. My detailed steps are as follows:
Prerequisites:
Register an app in Azure AD and create a service principal for it. For more detailed steps, please refer to the documentation.
Steps:
1. Create a C# console application.
2. Add the demo code to the project:
using System;
using System.Collections.Generic;
using Microsoft.Azure.Management.KeyVault;
using Microsoft.Azure.Management.KeyVault.Models;
using Microsoft.IdentityModel.Clients.ActiveDirectory;
using Microsoft.Rest;

var subscriptionId = "Your Subscription Id";
var clientId = "Your registered Application Id";
var tenantId = "Your tenant Id";
var secretKey = "Application secret Key";
var objectId = "Registered Application object Id";

var clientCredential = new ClientCredential(clientId, secretKey);
var context = new AuthenticationContext("https://login.windows.net/" + tenantId);
// Acquire an ARM access token for the service principal (this line was missing in the original snippet).
var token = context.AcquireTokenAsync("https://management.azure.com/", clientCredential).Result.AccessToken;

const string resourceGroupName = "tom";
// The name of the vault to create.
const string vaultName = "TomNewKeyVaultForTest";

var accessPolicy = new AccessPolicyEntry
{
    ApplicationId = Guid.Parse(clientId),
    TenantId = Guid.Parse(tenantId),
    Permissions = new Permissions
    {
        Keys = new List<string> { "List", "Get" },
        Secrets = new List<string> { "All" }
    },
    ObjectId = Guid.Parse(objectId)
};

VaultProperties vaultProps = new VaultProperties
{
    EnabledForTemplateDeployment = true,
    TenantId = Guid.Parse(tenantId),
    AccessPolicies = new List<AccessPolicyEntry>
    {
        accessPolicy
    }
};

Microsoft.Rest.ServiceClientCredentials credentials = new TokenCredentials(token);
VaultCreateOrUpdateParameters vaultParams = new VaultCreateOrUpdateParameters("eastasia", vaultProps);
KeyVaultManagementClient keyVaultManagementClient = new KeyVaultManagementClient(credentials)
{
    SubscriptionId = subscriptionId
};
var result = keyVaultManagementClient.Vaults.CreateOrUpdateAsync(resourceGroupName, vaultName, vaultParams).Result;
3. Debug the demo.
4. Check the created or updated Key Vault in the Azure portal (or verify it programmatically, as sketched below).
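If you prefer to verify programmatically rather than in the portal, the same management client can read the vault back; a small sketch using the client and names created above:
// Sketch: read the vault back and report how many access policies it now has.
var createdVault = keyVaultManagementClient.Vaults.GetAsync(resourceGroupName, vaultName).Result;
Console.WriteLine($"Vault {createdVault.Name} has {createdVault.Properties.AccessPolicies.Count} access policies.");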
For more SDK information, refer to the packages.config file:
<?xml version="1.0" encoding="utf-8"?>
<packages>
  <package id="Hyak.Common" version="1.0.2" targetFramework="net452" />
  <package id="Microsoft.Azure.Common" version="2.1.0" targetFramework="net452" />
  <package id="Microsoft.Azure.Common.Dependencies" version="1.0.0" targetFramework="net452" />
  <package id="Microsoft.Azure.Management.KeyVault" version="2.0.0-preview" targetFramework="net452" />
  <package id="Microsoft.Bcl" version="1.1.9" targetFramework="net452" />
  <package id="Microsoft.Bcl.Async" version="1.0.168" targetFramework="net452" />
  <package id="Microsoft.Bcl.Build" version="1.0.14" targetFramework="net452" />
  <package id="Microsoft.IdentityModel.Clients.ActiveDirectory" version="2.28.3" targetFramework="net452" />
  <package id="Microsoft.Net.Http" version="2.2.22" targetFramework="net452" />
  <package id="Microsoft.Rest.ClientRuntime" version="2.3.1" targetFramework="net452" />
  <package id="Microsoft.Rest.ClientRuntime.Azure" version="3.3.1" targetFramework="net452" />
  <package id="Newtonsoft.Json" version="6.0.8" targetFramework="net452" />
</packages>

How can I get a certificate by name given in ServiceDefinition on Azure

<Certificates>
  <Certificate name="MyRandomName" storeLocation="LocalMachine" storeName="My" />
</Certificates>
When I have the above in my ServiceDefinition.csdef, is 'MyRandomName' the name that the certificate gets on the server?
How do I get an X509Certificate2 instance of it in the OnStart call? Do I also need a setting with the thumbprint to look it up by?
I found another solution to my problem:
I can decrypt a setting like this:
var encryptedBytes = Convert.FromBase64String(setting);
var envelope = new EnvelopedCms();
envelope.Decode(encryptedBytes);
var store = new X509Store(StoreName.My, StoreLocation.LocalMachine);
store.Open(OpenFlags.ReadOnly);
envelope.Decrypt(store.Certificates);
string passwordChars = Encoding.UTF8.GetString(envelope.ContentInfo.Content);
when it has been encrypted like this:
X509Certificate2 cert = LoadCertificate(
    System.Security.Cryptography.X509Certificates.StoreName.My,
    System.Security.Cryptography.X509Certificates.StoreLocation.CurrentUser, args[0]);
byte[] encoded = System.Text.UTF8Encoding.UTF8.GetBytes(args[1]);
var content = new ContentInfo(encoded);
var env = new EnvelopedCms(content);
env.Encrypt(new CmsRecipient(cert));
string encrypted64 = Convert.ToBase64String(env.Encode());
This means the user does not have to add any thumbprints other than the
<Certificates>
  <Certificate name="Composite.WindowsAzure.Management" thumbprint="3D3275357F9DADDDF31F7597656B42137BBBCD56" thumbprintAlgorithm="sha1" />
</Certificates>
in the .cscfg for their Cloud Service, and of course also upload the certificate in the portal.
The args[0] and args[1] are just the thumbprint of the certificate to use and the setting value to encrypt.
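The LoadCertificate helper used above is not shown in the snippet; a minimal sketch of what it might look like, assuming a lookup by thumbprint in the given store:
private static X509Certificate2 LoadCertificate(StoreName storeName, StoreLocation storeLocation, string thumbprint)
{
    // Open the requested certificate store read-only and search by thumbprint.
    var store = new X509Store(storeName, storeLocation);
    store.Open(OpenFlags.ReadOnly);
    try
    {
        var matches = store.Certificates.Find(X509FindType.FindByThumbprint, thumbprint, validOnly: false);
        if (matches.Count == 0)
        {
            throw new InvalidOperationException("Certificate with thumbprint " + thumbprint + " was not found.");
        }
        return matches[0];
    }
    finally
    {
        store.Close();
    }
}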

how to get azure webrole to log all 404s

I have been trying to get logging working with Azure for my MVC project but so far haven't had much success.
I have a Diagnostics connection string in my ServiceConfiguration.Cloud.cscfg file which points to my blob storage:
...
<Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.DiagnosticsConnectionString" value="**ConectionString**" />
</ConfigurationSettings>
My web.config has tracing set up:
...
<tracing>
  <traceFailedRequests>
    <remove path="*"/>
    <add path="*">
      <traceAreas>
        <add provider="ASP" verbosity="Verbose" />
        <add provider="ASPNET" areas="Infrastructure,Module,Page,AppServices" verbosity="Verbose" />
        <add provider="ISAPI Extension" verbosity="Verbose" />
        <add provider="WWW Server" areas="Authentication,Security,Filter,StaticFile,CGI,Compression,Cache,RequestNotifications,Module" verbosity="Verbose" />
      </traceAreas>
      <failureDefinitions timeTaken="00:00:15" statusCodes="400-599" />
    </add>
  </traceFailedRequests>
</tracing>
</system.webServer>
My WebRole.cs has the following in it:
using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.Diagnostics;
using Microsoft.WindowsAzure.ServiceRuntime;

namespace MvcWebRole1
{
    public class WebRole : RoleEntryPoint
    {
        public override bool OnStart()
        {
            // Get the factory configuration so that it can be edited
            DiagnosticMonitorConfiguration config = DiagnosticMonitor.GetDefaultInitialConfiguration();
            // Set scheduled transfer interval for infrastructure logs to 1 minute
            config.DiagnosticInfrastructureLogs.ScheduledTransferPeriod = System.TimeSpan.FromMinutes(1);
            // Specify a logging level to filter records to transfer
            config.DiagnosticInfrastructureLogs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;
            // Set scheduled transfer interval for user's Windows Azure Logs to 1 minute
            config.Logs.ScheduledTransferPeriod = System.TimeSpan.FromMinutes(1);
            DiagnosticMonitor.Start("Microsoft.WindowsAzure.Plugins.Diagnostics.DiagnosticsConnectionString", config);
            //RoleEnvironment.Changing += this.RoleEnvironmentChanging;
            return base.OnStart();
        }
    }
}
But I am not seeing any diagnostics logs.
The mam folder just contains an MACommanda.xml and a MASecret, the vsdeploy folder is empty, and the wad-control-container has a file for each deployment.
Am I missing something / doing something wrong?
I have been trying to follow the guides from http://msdn.microsoft.com/en-us/library/windowsazure/gg433048.aspx in particular http://channel9.msdn.com/learn/courses/Azure/Deployment/DeployingApplicationsinWindowsAzure/Exercise-3-Monitoring-Applications-in-Windows-Azure
Update:
I found the following, which could be part of the problem:
IIS7 Logs Are Not Collected Properly -
http://msdn.microsoft.com/en-us/library/hh134842
although that should only account for the 404s not working; with a failure definition of 15 seconds, the 17-second sleep in my controller action should still have been logged.
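For reference, that 17-second sleep was just a test action meant to trip the timeTaken failure definition; a minimal sketch of such a test action (the controller and action names here are made up):
public class DiagnosticsTestController : Controller
{
    // Deliberately exceeds the 15-second timeTaken threshold, so the request should be traced as failed.
    public ActionResult Slow()
    {
        System.Threading.Thread.Sleep(TimeSpan.FromSeconds(17));
        return Content("done");
    }
}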
Here is how I was finally able to get all the logging working on the Azure web role:
In the WebRole.cs include the following:
// Get the default initial configuration for DiagnosticMonitor.
var config = DiagnosticMonitor.GetDefaultInitialConfiguration();

// Transfer logs at Verbose level (i.e. everything) to persistent storage.
config.DiagnosticInfrastructureLogs.ScheduledTransferLogLevelFilter = config.Logs.ScheduledTransferLogLevelFilter =
    config.WindowsEventLog.ScheduledTransferLogLevelFilter = LogLevel.Verbose;

// Schedule a transfer period of 1 minute.
config.DiagnosticInfrastructureLogs.ScheduledTransferPeriod = config.Logs.ScheduledTransferPeriod = config.WindowsEventLog.ScheduledTransferPeriod =
    config.Directories.ScheduledTransferPeriod = config.PerformanceCounters.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);

// Specify a buffer quota.
config.DiagnosticInfrastructureLogs.BufferQuotaInMB = config.Logs.BufferQuotaInMB = config.WindowsEventLog.BufferQuotaInMB =
    config.Directories.BufferQuotaInMB = config.PerformanceCounters.BufferQuotaInMB = 512;

// Set an overall quota of 8 GB maximum size.
config.OverallQuotaInMB = 8192;

// Add the WindowsEventLog data buffer to the configuration, collecting event data from the System and Application channels.
config.WindowsEventLog.DataSources.Add("System!*");
config.WindowsEventLog.DataSources.Add("Application!*");

// Use 30 seconds for the perf counter sample rate.
TimeSpan perfSampleRate = TimeSpan.FromSeconds(30D);
config.PerformanceCounters.DataSources.Add(new PerformanceCounterConfiguration()
{
    CounterSpecifier = @"\Memory\Available Bytes",
    SampleRate = perfSampleRate
});
config.PerformanceCounters.DataSources.Add(new PerformanceCounterConfiguration()
{
    CounterSpecifier = @"\Processor(_Total)\% Processor Time",
    SampleRate = perfSampleRate
});
config.PerformanceCounters.DataSources.Add(new PerformanceCounterConfiguration()
{
    CounterSpecifier = @"\ASP.NET\Applications Running",
    SampleRate = perfSampleRate
});

// Start the DiagnosticMonitor using the diagnostic configuration and our connection string.
DiagnosticMonitor.Start("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", config);
return base.OnStart();
In the web.config, under system.webServer, add the following:
<tracing>
  <traceFailedRequests>
    <remove path="*"/>
    <add path="*">
      <traceAreas>
        <add provider="ASPNET" areas="Infrastructure,Module,Page,AppServices" verbosity="Verbose" />
        <add provider="WWW Server" areas="Authentication,Security,Filter,StaticFile,CGI,Compression,Cache,RequestNotifications,Module" verbosity="Verbose" />
      </traceAreas>
      <failureDefinitions statusCodes="400-599" />
    </add>
  </traceFailedRequests>
</tracing>
In the service definition file add the following under the web role:
<LocalResources>
  <LocalStorage name="DiagnosticStore" sizeInMB="8192" cleanOnRoleRecycle="false"/>
</LocalResources>
That should enable all your logging in the MVC application.
Can you try removing the "timeTaken" attribute from the "failureDefinitions" node? Ref: http://msdn.microsoft.com/en-us/library/aa965046(v=VS.90).aspx.
Check the deployment's diagnostics settings by viewing the appropriate file in wad-control-container.
I note that you are not setting what in my experience is all the required values for DiagnosticInfrastructureLogs or for Logs, including bufferQuotaInMB and scheduledTransferLogLevelFilter.
Try this:
// Get the factory configuration so that it can be edited
DiagnosticMonitorConfiguration config = DiagnosticMonitor.GetDefaultInitialConfiguration();
config.DiagnosticInfrastructureLogs.BufferQuotaInMB = 512;
config.DiagnosticInfrastructureLogs.ScheduledTransferPeriod = System.TimeSpan.FromMinutes(1D);
config.DiagnosticInfrastructureLogs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;
config.Logs.BufferQuotaInMB = 512;
config.Logs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;
config.Logs.ScheduledTransferPeriod = System.TimeSpan.FromMinutes(1D);
Try that to start. Ensure as well that you have added trace listeners.
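For example, the Azure diagnostics trace listener, which the SDK project templates normally register in web.config, can also be hooked up in code; a minimal sketch, assuming the Microsoft.WindowsAzure.Diagnostics assembly is referenced:
// Route System.Diagnostics.Trace output to the Windows Azure diagnostic monitor.
System.Diagnostics.Trace.Listeners.Add(
    new Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener());
System.Diagnostics.Trace.AutoFlush = true;
// After the scheduled transfer runs, traced messages should show up in the WADLogsTable.
System.Diagnostics.Trace.TraceError("test error trace");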

Connecting to Azure storage account thru proxy server

My 'LocalClient' app is in a corporate LAN behind an HTTP proxy server (ISA). The first Azure API call I make - CloudQueue.CreateIfNotExist() - causes an exception: (407) Proxy Authentication Required. I tried the following things:
Added the <system.net> defaultProxy element to app.config, but it doesn't seem to be working (Reference: http://geekswithblogs.net/mnf/archive/2006/03/08/71663.aspx).
Configured 'Microsoft Firewall Client for ISA Server', but that did not help either.
Used a custom proxy handler as suggested here: http://dunnry.com/blog/2010/01/25/SupportingBasicAuthProxies.aspx. I am not able to get this working - I get a configuration initialization exception.
As per MSDN, an HTTP proxy server can be specified in the connection string only in the case of Development Storage (see http://msdn.microsoft.com/en-us/library/ee758697.aspx):
UseDevelopmentStorage=true;DevelopmentStorageProxyUri=http://myProxyUri
Is there any way to connect to Azure Storage through a proxy server?
I actually found that the custom proxy solution was not required.
Adding the following to app.config (just before the </configuration>) did the trick for me:
<system.net>
  <defaultProxy enabled="true" useDefaultCredentials="true">
    <proxy usesystemdefault="true" />
  </defaultProxy>
</system.net>
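If editing app.config is not an option, roughly the same default-proxy behaviour can be set in code before the first storage call; a small sketch (the proxy address is a placeholder):
// Use the corporate proxy for all outgoing HTTP requests in this process,
// forwarding the current Windows credentials to it.
System.Net.WebRequest.DefaultWebProxy = new System.Net.WebProxy("http://myproxy:8080")
{
    UseDefaultCredentials = true
};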
The custom proxy solution (the third thing I tried, as mentioned in my original question) worked perfectly. The mistake I was making earlier was not putting the <configSections> element at the beginning of <configuration> in app.config, as required. Once I did that, the custom proxy solution given here solved my problem.
To get through the proxy, you can use the approach below (Java); it works as expected and has been tested.
// Imports for the azure-storage Java SDK (version 1.2.0, as referenced in the file path below).
import java.io.File;
import java.io.FileInputStream;

import com.microsoft.azure.storage.CloudStorageAccount;
import com.microsoft.azure.storage.blob.BlobContainerPermissions;
import com.microsoft.azure.storage.blob.BlobContainerPublicAccessType;
import com.microsoft.azure.storage.blob.CloudBlobClient;
import com.microsoft.azure.storage.blob.CloudBlobContainer;
import com.microsoft.azure.storage.blob.CloudBlockBlob;

public class AzureUpload {

    // Define the connection-string with your values
    /*public static final String storageConnectionString =
        "DefaultEndpointsProtocol=http;" +
        "AccountName=your_storage_account;" +
        "AccountKey=your_storage_account_key";*/
    public static final String storageConnectionString =
        "DefaultEndpointsProtocol=http;" +
        "AccountName=test2rdrhgf62;" +
        "AccountKey=1gy3lpE7Du1j5ljKiupjhgjghjcbfgTGhbntjnRfr9Yi6GUQqVMQqGxd7/YThisv/OVVLfIOv9kQ==";

    // Define the path to a local file.
    static final String filePath = "D:\\Project\\Supporting Files\\Jar's\\Azure\\azure-storage-1.2.0.jar";
    static final String file_Path = "D:\\Project\\Healthcare\\Azcopy_To_Azure\\data";

    public static void main(String[] args) {
        try {
            // Retrieve storage account from connection-string.
            //String storageConnectionString = RoleEnvironment.getConfigurationSettings().get("StorageConnectionString");
            //Proxy httpProxy = new Proxy(Proxy.Type.HTTP, new InetSocketAddress("132.186.192.234", 8080));

            // Point the JVM at the corporate proxy.
            System.setProperty("http.proxyHost", "102.122.15.234");
            System.setProperty("http.proxyPort", "80");
            System.setProperty("https.proxyUser", "ad001\\empid001");
            System.setProperty("https.proxyPassword", "pass!1");

            // Retrieve storage account from connection-string.
            CloudStorageAccount storageAccount = CloudStorageAccount.parse(storageConnectionString);

            // Create the blob client.
            CloudBlobClient blobClient = storageAccount.createCloudBlobClient();

            // Get a reference to a container.
            // The container name must be lower case.
            CloudBlobContainer container = blobClient.getContainerReference("rpmsdatafromhospital");

            // Create the container if it does not exist.
            container.createIfNotExists();

            // Create a permissions object.
            BlobContainerPermissions containerPermissions = new BlobContainerPermissions();

            // Include public access in the permissions object.
            containerPermissions.setPublicAccess(BlobContainerPublicAccessType.CONTAINER);

            // Set the permissions on the container.
            container.uploadPermissions(containerPermissions);

            // Create or overwrite the new file to blob with contents from a local file.
            /*CloudBlockBlob blob = container.getBlockBlobReference("azure-storage-1.2.0.jar");
            File source = new File(filePath);
            blob.upload(new FileInputStream(source), source.length());*/

            String envFilePath = System.getenv("AZURE_FILE_PATH");

            // Upload a list of files/directories to blob storage.
            File folder = new File(envFilePath);
            File[] listOfFiles = folder.listFiles();
            for (int i = 0; i < listOfFiles.length; i++) {
                if (listOfFiles[i].isFile()) {
                    System.out.println("File " + listOfFiles[i].getName());
                    CloudBlockBlob blob = container.getBlockBlobReference(listOfFiles[i].getName());
                    File source = new File(envFilePath + "\\" + listOfFiles[i].getName());
                    blob.upload(new FileInputStream(source), source.length());
                    System.out.println("File " + listOfFiles[i].getName() + " upload successful");
                }
                // Directory upload
                /*else if (listOfFiles[i].isDirectory()) {
                    System.out.println("Directory " + listOfFiles[i].getName());
                    CloudBlockBlob blob = container.getBlockBlobReference(listOfFiles[i].getName());
                    File source = new File(file_Path + "\\" + listOfFiles[i].getName());
                    blob.upload(new FileInputStream(source), source.length());
                }*/
            }
        } catch (Exception e) {
            // Output the stack trace.
            e.printStackTrace();
        }
    }
}
For .NET or C#, add the code below to App.config:
<?xml version="1.0" encoding="utf-8" ?>
<configuration>
  <startup>
    <supportedRuntime version="v4.0" sku=".NETFramework,Version=v4.5.2" />
  </startup>
  <system.net>
    <defaultProxy enabled="true" useDefaultCredentials="true">
      <proxy usesystemdefault="true" />
    </defaultProxy>
  </system.net>
</configuration>
