I have implemented REST API calls using a standalone C# console application. The API returns JSON, which I'm deserializing and then storing in a database.
Now I want to move the entire logic to the Azure platform so that it can be invoked by passing a start date, an end date, and a store location (it should run for three locations). Below is the code:
static async Task Main()
{
    await MakeInventoryRequest();
}
// async void cannot be awaited and swallows exceptions, so the method returns Task.
static async Task MakeInventoryRequest()
{
    using (var client = new HttpClient())
    {
        var queryString = HttpUtility.ParseQueryString(string.Empty);
        // Request headers
        client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", "5051fx6yyy124hhfyuscf34f57ce9");
        // Request parameters
        queryString["query.locationNumbers"] = "4638";
        queryString["availableFromDate"] = "2019-01-01";
        queryString["availableToDate"] = "2019-03-07";
        var uri = "https://api-test.location.cloud/api/v1/inventory?" + queryString;
        using (var request = new HttpRequestMessage(HttpMethod.Get, uri))
        using (var response = await client.SendAsync(request))
        {
            if (response.IsSuccessStatusCode)
            {
                var stream = await response.Content.ReadAsStreamAsync();
                List<Inventory> l1 = DeserializeJsonFromStream<List<Inventory>>(stream);
                InsertInventoryRecords(l1);
            }
            else
            {
                // Await the content instead of blocking on .Result inside an async method.
                var content = await response.Content.ReadAsStringAsync();
                throw new Exception("Error Response Code: " + response.StatusCode + ". Content is: " + content);
            }
        }
    }
}
Please suggest the best possible design using Azure components.
With the information at hand I think you have multiple options; you need to find out which works best for you. You can use a Cloud Service to host the console app (you will have to change it to a worker role; Visual Studio will help you convert it). I am not sure about the load you are expecting, but you can always increase and decrease the instance count, and these can be deployed to different geographies.
I see that you are persisting the data; for that you can use any of the SQL offerings. For invoking the REST API you can also use Azure Functions or Azure Data Factory (ADF).
Please feel free to comment if you want any more details on the same.
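To make the Azure Functions option concrete, here is a minimal sketch of how the console logic could move into an HTTP-triggered function (in-process C# model). The function name, query parameters, and persistence step are illustrative, not a definitive implementation:

using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class InventoryFunction
{
    // One shared HttpClient per process; add the Ocp-Apim-Subscription-Key header here.
    private static readonly HttpClient Client = new HttpClient();

    [FunctionName("FetchInventory")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "get")] HttpRequest req,
        ILogger log)
    {
        // The three inputs you mentioned: start date, end date and store location.
        string fromDate = req.Query["availableFromDate"];
        string toDate = req.Query["availableToDate"];
        string location = req.Query["locationNumber"];
        if (fromDate == null || toDate == null || location == null)
            return new BadRequestObjectResult("availableFromDate, availableToDate and locationNumber are required.");

        var uri = "https://api-test.location.cloud/api/v1/inventory" +
                  $"?query.locationNumbers={location}&availableFromDate={fromDate}&availableToDate={toDate}";
        var json = await Client.GetStringAsync(uri);
        // Deserialize and persist here (your existing DeserializeJsonFromStream /
        // InsertInventoryRecords logic, or a SQL / Cosmos DB output binding).
        log.LogInformation("Fetched inventory for location {Location}", location);
        return new OkObjectResult(json);
    }
}

You can then call it once per location (three calls), or loop over the three location numbers inside the function; a Timer trigger or a Data Factory pipeline can drive the schedule.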
I want to schedule a Splunk report to an Azure webhook and persist it into Cosmos DB (after some processing). This tutorial gave me some insight on how to process and persist data into Cosmos DB via Azure Functions (in Java). To solve the next part of the puzzle I'm reaching out for advice on how to go about it:
How do I set up and host a webhook on Azure?
Should I set up an @HttpTrigger inside the @EventHubOutput function and deploy it into the function app? Or should I use the webhook from Azure Event Grid? (Not clear on how to do this.) I'm NOT looking to stream any heavy volumes of data and want to keep the consumption cost low. So, which route should I take here? Any pointers to tutorials would help.
How do I handle webhook data processing with @EventHubOutput (referring to the Java example in the tutorial)? What setup and configuration do I need here? Any working examples would help.
I ended up using just @HttpTrigger and binding the output with @CosmosDBOutput to persist the data. Something like this; I'd like to know if there are any better approaches.
public class Function {
    @FunctionName("PostData")
    public HttpResponseMessage run(
            @HttpTrigger(
                name = "req",
                methods = {HttpMethod.GET, HttpMethod.POST},
                authLevel = AuthorizationLevel.ANONYMOUS)
            HttpRequestMessage<Optional<String>> request,
            @CosmosDBOutput(
                name = "databaseOutput",
                databaseName = "SplunkDataSource",
                collectionName = "loginData",
                connectionStringSetting = "CosmosDBConnectionString")
            OutputBinding<String> document,
            final ExecutionContext context) {
        context.getLogger().info("Java HTTP trigger processed a request.");
        // Parse the payload; orElse(null) avoids the exception Optional.get()
        // throws on an empty body, so the null check below can actually fire.
        String data = request.getBody().orElse(null);
        if (data == null) {
            return request.createResponseBuilder(HttpStatus.BAD_REQUEST)
                .body("Please pass a payload in the request body").build();
        } else {
            // Write the data to the Cosmos document.
            document.setValue(data);
            context.getLogger().info("Persisting payload to db: " + data);
            return request.createResponseBuilder(HttpStatus.OK).body(data).build();
        }
    }
}
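For reference, the caller side is just an HTTP POST of the report payload to the function URL, which is what Splunk's webhook alert action does. A minimal C# sketch of such a caller, with a made-up function URL and payload:

using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class WebhookClient
{
    // One shared client for the process.
    static readonly HttpClient Client = new HttpClient();

    static async Task Main()
    {
        // Hypothetical function URL; with ANONYMOUS auth no key is needed,
        // otherwise append ?code=<function key>.
        var url = "https://myfunctionapp.azurewebsites.net/api/PostData";
        var payload = "{\"user\":\"jdoe\",\"event\":\"login\"}"; // sample Splunk-style JSON
        using (var content = new StringContent(payload, Encoding.UTF8, "application/json"))
        {
            var response = await Client.PostAsync(url, content);
            response.EnsureSuccessStatusCode();
        }
    }
}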
Info:
I have the below two methods, which are part of a Web API (not a Core API) deployed in Azure.
Method 1:
public async Task<bool> ProcessEmployee(List<Employee> employeeList)
{
    var tasks = new List<Task<EmployeeResponseModel>>();
    HttpClient localHttpClient = new HttpClient();
    localHttpClient.Timeout = TimeSpan.FromSeconds(100);
    foreach (var employee in employeeList) // having 1000 calls
    {
        tasks.Add(GetAddressResponse(employee.URL, localHttpClient));
    }
    var responses = await Task.WhenAll(tasks);
    return true; // Task<bool> needs a return value
}
Method 2:
private async Task<EmployeeResponseModel> GetAddressResponse(string url, HttpClient client)
{
    var response = new EmployeeResponseModel();
    try
    {
        using (HttpResponseMessage apiResponse = await client.GetAsync(url))
        {
            if (apiResponse.IsSuccessStatusCode)
            {
                var res = await apiResponse.Content.ReadAsStringAsync();
                response = JsonConvert.DeserializeObject<EmployeeResponseModel>(res);
            }
        }
        return response;
    }
    catch (Exception ex)
    {
        // Swallowed; failures come back as an empty EmployeeResponseModel.
    }
    return response;
}
If I monitor from Azure -> Diagnose and solve problems -> Web App Slow, all external API calls show a latency issue.
But if I call the same external API from Postman it is quite fast, with much less latency.
Method 1 and method 2 are part of one Web API, and it is deployed on an Azure App Service.
getAddress is an external API deployed in another environment; I don't have much information about it.
If we call the external API, i.e. 'getAddress', from 1) we see high latency, more than 5 sec.
If we call the external API, i.e. 'getAddress', from Postman we receive a response in 303 ms.
I guess it results from the location of the service plan.
If the location of the service plan is far away from your position, it may cause the latency. But that doesn't rule out other possibilities, so my suggestion is to debug on localhost first to rule out the code.
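Independent of region, note that kicking off 1000 requests at once from a single App Service instance can itself inflate the measured per-call latency (connection and thread-pool pressure), whereas Postman measures one call in isolation. A sketch, not your exact code, of capping concurrency with SemaphoreSlim and a single shared HttpClient:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

static class ThrottledCalls
{
    // Reusing one HttpClient avoids socket exhaustion from per-call instances.
    static readonly HttpClient Client = new HttpClient { Timeout = TimeSpan.FromSeconds(100) };

    public static async Task<string[]> GetAllAsync(IEnumerable<string> urls, int maxParallel = 50)
    {
        using (var gate = new SemaphoreSlim(maxParallel))
        {
            var tasks = urls.Select(async url =>
            {
                await gate.WaitAsync();            // wait for a free slot
                try { return await Client.GetStringAsync(url); }
                finally { gate.Release(); }        // free the slot for the next call
            }).ToList();                           // materialize so all tasks start
            return await Task.WhenAll(tasks);
        }
    }
}

If latency drops as maxParallel shrinks, the bottleneck is on your side rather than in the external API.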
I am looking for a SharePoint-hosted app solution which will provision branding files (JS/CSS/images) into a SharePoint Online/Office 365 environment.
I found a very good article to achieve this and tried to implement it as shown in the link below: http://www.sharepointnutsandbolts.com/2013/05/sp2013-host-web-apps-provisioning-files.html
This solution is not working for me; while executing the app, I get the error below:
Failed to provision file into host web. Error: Unexpected response data from server. Here is the code which is giving me the error:
// utility method for uploading files to host web..
uploadFileToHostWebViaCSOM = function (serverRelativeUrl, filename, contents) {
    var createInfo = new SP.FileCreationInformation();
    createInfo.set_content(new SP.Base64EncodedByteArray());
    for (var i = 0; i < contents.length; i++) {
        createInfo.get_content().append(contents.charCodeAt(i));
    }
    createInfo.set_overwrite(true);
    createInfo.set_url(filename);
    var files = hostWebContext.get_web().getFolderByServerRelativeUrl(serverRelativeUrl).get_files();
    hostWebContext.load(files);
    files.add(createInfo);
    hostWebContext.executeQueryAsync(onProvisionFileSuccess, onProvisionFileFail);
}
Please suggest what the issue in this code might be, or point me to another way/reference by which I can create a SharePoint-hosted app to provision branding files.
Thanks in advance!
I would use a different method to access host web context as follows:
//first get app context, you will need it.
var currentcontext = new SP.ClientContext.get_current();
//then get host web context
var hostUrl = decodeURIComponent(getQueryStringParameter("SPHostUrl"));
var hostcontext = new SP.AppContextSite(currentcontext, hostUrl);
function getQueryStringParameter(param) {
    var params = document.URL.split("?")[1].split("&");
    for (var i = 0; i < params.length; i++) {
        var singleParam = params[i].split("=");
        if (singleParam[0] == param) {
            return singleParam[1];
        }
    }
}
Here are some references:
https://sharepoint.stackexchange.com/questions/122083/sharepoint-2013-app-create-list-in-host-web
https://blog.appliedis.com/2012/12/19/sharepoint-2013-apps-accessing-data-in-the-host-web-in-a-sharepoint-hosted-app/
http://www.mavention.com/blog/sharePoint-app-reading-data-from-host-web
http://www.sharepointnadeem.com/2013/12/sharepoint-2013-apps-access-data-in.html
Additionally, here is an example of how to deploy a master page. However, as you might notice during your testing, the method used to get the host web context does not work as displayed in the video, and you should use the one I described above.
https://www.youtube.com/watch?v=wtQKjsjs55I
Finally, here is an example of how to deploy branding files through a console application using CSOM; if you are smart enough you will be able to convert this into JSOM.
https://channel9.msdn.com/Blogs/Office-365-Dev/Applying-Branding-to-SharePoint-Sites-with-an-App-for-SharePoint-Office-365-Developer-Patterns-and-P
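As a rough C# starting point for that last suggestion (a sketch only: site URL, credentials, and file names are placeholders, and SharePointOnlineCredentials comes from the SharePoint Client SDK), uploading a branding file via CSOM looks like:

using System.Security;
using Microsoft.SharePoint.Client;

class BrandingUploader
{
    static void Main()
    {
        var password = new SecureString();
        foreach (var c in "placeholder-password") password.AppendChar(c);

        using (var ctx = new ClientContext("https://tenant.sharepoint.com/sites/dev"))
        {
            ctx.Credentials = new SharePointOnlineCredentials("admin@tenant.onmicrosoft.com", password);
            // Upload custom.css into the Style Library of the target (host) web.
            var folder = ctx.Web.GetFolderByServerRelativeUrl("/sites/dev/Style Library");
            var createInfo = new FileCreationInformation
            {
                Content = System.IO.File.ReadAllBytes("custom.css"),
                Url = "custom.css",
                Overwrite = true
            };
            folder.Files.Add(createInfo);
            ctx.ExecuteQuery();
        }
    }
}

The JSOM version in your question mirrors this shape: a FileCreationInformation, the target folder's file collection, then an executeQueryAsync round trip.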
We decided to implement search functionality in our API, which is developed with ServiceStack. We decided to use Lucene.Net, since we heard it was a great indexer for searches.
We created a worker role whose job is to create the indexes in an Azure Storage folder, guided by Leon Cullen's tutorial. We use the AzureDirectory library specified in that post, so we could use the latest Azure SDK.
Then in our API project we added the references for Lucene.Net and AzureDirectory too; our endpoint ended up looking like this:
public object Post(SearchIndex request)
{
    List<Product> products = new List<Product>();
    var pageSize = -1;
    var totalpages = -1;
    int.TryParse(ConfigurationManager.AppSettings["PageSize"], out pageSize);
    if (request.Page.Equals(0))
    {
        request.Page = 1;
    }
    // Get Azure settings
    AzureDirectory azureDirectory;
    try
    {
        // This is the line where we get the Access denied exception thrown at us
        azureDirectory = new AzureDirectory(Microsoft.WindowsAzure.Storage.CloudStorageAccount.Parse(ConfigurationManager.AppSettings["ConnectionStringAzureSearch"]), "indexsearch");
        IndexSearcher searcher;
        using (new AutoStopWatch("Creating searcher"))
        {
            searcher = new IndexSearcher(azureDirectory);
        }
        using (new AutoStopWatch(string.Format("Search for {0}", request.SearchString)))
        {
            string[] searchfields = new string[] { "Id", "Name", "Description" };
            var hits = searcher.Search(QueryMaker(request.SearchString, searchfields), request.Page * pageSize);
            int count = hits.ScoreDocs.Count();
            float temp_totalpages = 0;
            temp_totalpages = (float)hits.ScoreDocs.Count() / (float)pageSize;
            if (temp_totalpages > (int)temp_totalpages)
            {
                totalpages = (int)temp_totalpages + 1;
            }
            else
            {
                totalpages = (int)temp_totalpages;
            }
            foreach (ScoreDoc match in hits.ScoreDocs)
            {
                Document doc = searcher.Doc(match.Doc);
                int productId = int.Parse(doc.Get("Id"));
                Product product = Db.Select<Product>("Id={0}", productId).FirstOrDefault();
                products.Add(product);
            }
        }
        return new SearchIndexResult { result = products.Skip((int)((request.Page - 1) * 10)).Take(pageSize).ToList(), PageSize = pageSize, TotalPages = totalpages };
    }
    catch (Exception e)
    {
        return new HttpResult(HttpStatusCode.NoContent, "azureDirectory. Parameter: " + request.SearchString + ". e: " + e.Message);
    }
}
If we run this locally it works as expected, returning the results we were expecting. But when we published our API to Azure and tried to access the search endpoint, we received a 403 error with the message 'Access to the path "D:/AzureDirectory" is denied'.
We're confused as to why it is trying to access such a folder at all; the folder name is wrong, and I think it's trying to access a local route. We really don't know why it works fine locally but stops working once it's deployed to Azure.
The worker role runs without problems; it's the API side that cannot access the folder in Azure Storage. Are we missing some important step in the configuration? The tutorial we followed wasn't very clear for beginners with Lucene.Net or Azure Storage, so we fear we might have missed an important step. We've checked our connection strings and everything seems OK, though.
For reference:
https://github.com/azure-contrib/AzureDirectory/blob/master/AzureDirectory/AzureDirectory.cs
When you do this:
azureDirectory = new AzureDirectory(Microsoft.WindowsAzure.Storage.CloudStorageAccount.Parse(ConfigurationManager.AppSettings["ConnectionStringAzureSearch"]), "indexsearch");
this executes:
var cachePath = Path.Combine(Path.GetPathRoot(Environment.SystemDirectory), "AzureDirectory");
var azureDir = new DirectoryInfo(cachePath);
if (!azureDir.Exists)
    azureDir.Create();
var catalogPath = Path.Combine(cachePath, _containerName);
var catalogDir = new DirectoryInfo(catalogPath);
if (!catalogDir.Exists)
    catalogDir.Create();
_cacheDirectory = FSDirectory.Open(catalogPath);
So a simple solution for you might be to have that directory on the site root:
DirectoryInfo info = new DirectoryInfo(HostingEnvironment.MapPath("~/"));
azureDirectory = new AzureDirectory(storageAccount, containerName, new SimpleFSDirectory(info), true);
I got it to work.
I just got the latest version of AzureDirectory from GitHub.
Got the latest NuGet packages for Azure Storage etc.
Recreated the index.
In addition to @brykneval's answer: I tried his solution, but the last parameter, bool compressBlob = false, which he set to true, made my local debug fail with a 404 exception from the AzureDirectory library, and when I published to the Azure web app it threw an exception with the message System.IO.InvalidDataException: Block length does not match with its complement.
I removed the last parameter from the constructor and everything works like a charm. Hope this helps anyone.
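In other words, with the constructor shape from the answer above (names assumed from that snippet), just drop the trailing flag so compression falls back to its default:

DirectoryInfo info = new DirectoryInfo(HostingEnvironment.MapPath("~/"));
// No trailing compressBlob argument, so it defaults to false.
azureDirectory = new AzureDirectory(storageAccount, containerName, new SimpleFSDirectory(info));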
I'm building a Request/Acknowledge/Poll style REST service with NServiceBus underneath to manage queue processing. I want to give the client a URI to poll for updates.
Therefore I want to return a Location header in my web service as part of the acknowledgement. I can see that it is possible to do this:
return new HttpResult(response, HttpStatusCode.Accepted)
{
    Location = base.Request.AbsoluteUri.CombineWith(response.Reference)
};
But for a URL such as http://localhost:54567/approvals/?message=test, which creates a new message (I know I should probably just use a POST), the location will be returned as: http://localhost:54567/approvals/?message=test/8f0ab1c1a2ca46f8a98b75330fd3ac5c.
The ServiceStack request doesn't expose the URI fragments, only the AbsoluteUri. This means that I need to access the original request. I want this to work regardless of whether it is running in IIS or in a self-hosted process. The closest I can come up with is the following, but it seems very clunky:
var reference = Guid.NewGuid().ToString("N");
var response = new ApprovalResponse { Reference = reference };
var httpRequest = ((System.Web.HttpRequest)base.Request.OriginalRequest).Url;
var baseUri = new Uri(String.Concat(httpRequest.Scheme, Uri.SchemeDelimiter, httpRequest.Host, ":", httpRequest.Port));
var uri = new Uri(baseUri, string.Format("/approvals/{0}", reference));
return new HttpResult(response, HttpStatusCode.Accepted)
{
Location = uri.ToString()
};
This now returns: http://localhost:55847/approvals/8f0ab1c1a2ca46f8a98b75330fd3ac5c
Any suggestions? Does this work regardless of how ServiceStack is hosted? I'm a little wary of the System.Web.HttpRequest cast in a self-hosted process. Is this code safe?
Reverse Routing
If you're trying to build URLs for ServiceStack services you can use RequestDto.ToGetUrl() and RequestDto.ToAbsoluteUri() to build relative and absolute URLs, as seen in this earlier question on Reverse Routing, e.g.:
[Route("/reqstars/search", "GET")]
[Route("/reqstars/aged/{Age}")]
public class SearchReqstars : IReturn<ReqstarsResponse>
{
public int? Age { get; set; }
}
var relativeUrl = new SearchReqstars { Age = 20 }.ToUrl("GET");
var absoluteUrl = HostContext.Config.WebHostUrl.CombineWith(relativeUrl);
relativeUrl.Print(); //= /reqstars/aged/20
absoluteUrl.Print(); //= http://www.myhost.com/reqstars/aged/20
For creating URLs for other 3rd party APIs, look at the Http Utils wiki for example extension methods that can help, e.g.:
var url = "http://api.twitter.com/user_timeline.json?screen_name={0}".Fmt(name);
if (sinceId != null)
url = url.AddQueryParam("since_id", sinceId);
if (maxId != null)
url = url.AddQueryParam("max_id", maxId);
var tweets = url.GetJsonFromUrl()
.FromJson<List<Tweet>>();
You can also use the QueryStringSerializer to serialize a number of different collection types, e.g:
//Typed POCO
var url = "http://example.org/login?" + QueryStringSerializer.SerializeToString(
new Login { Username="mythz", Password="password" });
//Anonymous type
var url = "http://example.org/login?" + QueryStringSerializer.SerializeToString(
new { Username="mythz", Password="password" });
//string Dictionary
var url = "http://example.org/login?" + QueryStringSerializer.SerializeToString(
new Dictionary<string,string> {{"Username","mythz"}, {"Password","password"}});
You can also use the built-in NameValueCollection.ToFormUrlEncoded() extension, e.g.:
var url = "http://example.org/login?" + new NameValueCollection {
{"Username","mythz"}, {"Password","password"} }.ToFormUrlEncoded();