ComputerVisionClient / Xamarin.Essentials error: "Invalid URI: The format of the URI could not be determined" when calling ReadInStreamAsync - azure

I am capturing a photo and opening a stream using the MediaPicker built into Xamarin.Essentials 1.7.
When I call the ReadInStreamAsync(stream) method on the ComputerVisionClient, I get an error and my Xamarin.Forms app breaks inside the method: 'Invalid URI: The format of the URI could not be determined.'
This is the stream.Name value - '/data/user/0/com.companyname.xamphotoappdemo2/cache/2203693cc04e0be7f4f024d5f9499e13/198fd32db9cc4be38a493325974fa138/d964251252fc4963aca94339d73a8007.jpg'
This is my code:
var file = await MediaPicker.CapturePhotoAsync(new MediaPickerOptions
{
    Title = "Please take a photo"
});

if (file != null)
{
    var stream = await file.OpenReadAsync();
    chosenImage.Source = ImageSource.FromStream(() => stream);

    // 2. Add OCR logic.
    var client = Authenticate(ApiSettings.subscriptionKey, ApiSettings.endpoint);
    var text = await client.ReadInStreamAsync(stream);

    // After the request, get the operation location.
    string operationLocation = text.OperationLocation;

    // We only need the operation ID, not the whole URL.
    const int numberOfCharsInOperationId = 36;
    string operationId = operationLocation.Substring(operationLocation.Length - numberOfCharsInOperationId);

    // Get the OCR read results.
    ReadOperationResult results;
    do
    {
        results = await client.GetReadResultAsync(Guid.Parse(operationId));
    }
    while (results.Status == OperationStatusCodes.Running || results.Status == OperationStatusCodes.NotStarted);

    var readResults = results.AnalyzeResult.ReadResults;
    var expirationDates = from page in readResults
                          from line in page.Lines
                          where line.Text.Contains("EXPIRES") && line.Words.Count == 4
                          select line.Words[3].Text;

    expirationDate.Text = expirationDates.ToString();
    photoPath.Text = file.FullPath;
}
The image displays as expected in the XAML Image control, which reads its source from the same stream, so is this a bug in the ReadInStreamAsync method?
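One detail that may matter in the snippet above: the same stream is handed to both ImageSource.FromStream and ReadInStreamAsync, so by the time the OCR call runs the stream position may already be at the end. A minimal sketch, assuming the stream reuse is the problem, that buffers the photo once and gives each consumer its own copy:

var file = await MediaPicker.CapturePhotoAsync(new MediaPickerOptions { Title = "Please take a photo" });
if (file != null)
{
    // Copy the photo once into memory so it can be read more than once.
    var buffer = new MemoryStream();
    using (var source = await file.OpenReadAsync())
        await source.CopyToAsync(buffer);

    // Give the Image control its own copy of the bytes.
    chosenImage.Source = ImageSource.FromStream(() => new MemoryStream(buffer.ToArray()));

    // Rewind before handing the stream to the Computer Vision client.
    buffer.Position = 0;
    var client = Authenticate(ApiSettings.subscriptionKey, ApiSettings.endpoint);
    var text = await client.ReadInStreamAsync(buffer);
}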

Related

How to configure the user_token of Damn Vulnerable Web Application in the CSRF field during script-based authentication using ZAP?

I have been following the Script Based Authentication documentation for Damn Vulnerable Web Application using ZAP. I navigated to http://localhost/dvwa/login.php through Manual Explore, which opens the DVWA application on my localhost and adds the URL to the Default Context.
I've also created the dvwa script with the configuration described there and modified the script accordingly.
Now when I try Configure Context Authentication, the dvwa script does get loaded, but the CSRF field doesn't show up.
Additionally, POST Data doesn't show up at all, although Extra POST Data is shown.
Am I missing something in the steps? Can someone help me out?
The modified script in the Script Based Authentication section of the ZAP documentation for Damn Vulnerable Web Application seems incomplete.
The complete script is available at Setting up ZAP to Test Damn Vulnerable Web App (DVWA) and is as follows:
function authenticate(helper, paramsValues, credentials) {
    var loginUrl = paramsValues.get("Login URL");
    var csrfTokenName = paramsValues.get("CSRF Field");
    var csrfTokenValue = extractInputFieldValue(getPageContent(helper, loginUrl), csrfTokenName);
    var postData = paramsValues.get("POST Data");

    postData = postData.replace('{%username%}', encodeURIComponent(credentials.getParam("Username")));
    postData = postData.replace('{%password%}', encodeURIComponent(credentials.getParam("Password")));
    postData = postData.replace('{%' + csrfTokenName + '%}', encodeURIComponent(csrfTokenValue));

    var msg = sendAndReceive(helper, loginUrl, postData);
    return msg;
}

function getRequiredParamsNames() {
    return [ "Login URL", "CSRF Field", "POST Data" ];
}

function getOptionalParamsNames() {
    return [];
}

function getCredentialsParamsNames() {
    return [ "Username", "Password" ];
}

function getPageContent(helper, url) {
    var msg = sendAndReceive(helper, url);
    return msg.getResponseBody().toString();
}

function sendAndReceive(helper, url, postData) {
    var msg = helper.prepareMessage();

    var method = "GET";
    if (postData) {
        method = "POST";
        msg.setRequestBody(postData);
    }

    var requestUri = new org.apache.commons.httpclient.URI(url, true);
    var requestHeader = new org.parosproxy.paros.network.HttpRequestHeader(method, requestUri, "HTTP/1.0");
    msg.setRequestHeader(requestHeader);

    helper.sendAndReceive(msg);
    return msg;
}

function extractInputFieldValue(page, fieldName) {
    // Rhino:
    var src = new net.htmlparser.jericho.Source(page);
    // Nashorn:
    // var Source = Java.type("net.htmlparser.jericho.Source");
    // var src = new Source(page);

    var it = src.getAllElements('input').iterator();
    while (it.hasNext()) {
        var element = it.next();
        if (element.getAttributeValue('name') == fieldName) {
            return element.getAttributeValue('value');
        }
    }
    return '';
}
Using this script, the CSRF Field and POST Data fields show up just fine.

Export to Excel from a byte[] in Angular

I have this response JSON from the backend (Spring Boot):
private byte[] archivoExcel;
private String extension;
private String mime;
What I need in Angular is to get this byte[] and export it to Excel. For this, my response class in Angular is:
export class RespuestaExportar {
  archivoExcel: ArrayBuffer;
  extension: string;
  mime: string;
}
and in my component.ts file I have:
this.reversionesService.getReversionesExportarSeguimiento(this.solicitud).subscribe(res => {
  this.respuestaExportar = res;
  let file = new Blob([this.respuestaExportar.archivoExcel], { type: "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet" });
  var fileURL = URL.createObjectURL(file);
  window.open(fileURL);
});
When I click the 'export' button, the request completes correctly and the Excel file downloads, but the file is corrupted. Do I need one more step to fix this, or is there another alternative, since I need to get this byte[] from the backend?
I think you can use as a guide this service I wrote for my backend (Spring Boot); I send a response object as JSON:
ObjectResponse response = new ObjectResponse();
ByteArrayInputStream in = getExcel();
byte[] array = new byte[in.available()];
in.read(array);
httpStatus = HttpStatus.OK;
response.setResultado(array);
return new ResponseEntity<>(response, httpStatus);
I get a JSON response with that structure; when it is serialized, the byte[] field arrives as base64, that is, already encoded for the client that receives it.
So in my frontend (Angular) I convert it into a Blob so I can work with the file:
this.subcriber = this.reporteService.generarExcel().subscribe((b64Data: any) => {
  const byteCharacters = atob(b64Data.result);
  const byteNumbers = new Array(byteCharacters.length);
  for (let i = 0; i < byteCharacters.length; i++) {
    byteNumbers[i] = byteCharacters.charCodeAt(i);
  }
  const byteArray = new Uint8Array(byteNumbers);
  let blob = new Blob([byteArray], { type: 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet' });
  const url = window.URL.createObjectURL(blob);
  const anchor = document.createElement('a');
  anchor.download = `file.xlsx`;
  anchor.href = url;
  anchor.click();
  messageService = {
    state: false,
    messages: null
  };
}, (err) => {
  this.subcriber.unsubscribe();
});
I hope it helps you

Getting CustomVisionErrorException: Operation returned an invalid status code "BadRequest"

I'm trying to use the Microsoft.Azure.CognitiveServices.Vision.CustomVision.Prediction package by submitting an image picked with the CrossMedia pick-photo plugin and getting a prediction result back. I have tried passing the image as a URL and as a stream and keep getting a BadRequest. I know I have the correct prediction key and endpoint because they work for training, which tells me the problem is the way I pass the image into the method. What is the correct way to pass the image from the CrossMedia pick-photo plugin into the prediction call?
private async void UplodatePictureButton_Clicked(object sender, EventArgs e)
{
    await CrossMedia.Current.Initialize();

    MediaFile file;
    if (!CrossMedia.Current.IsPickPhotoSupported)
    {
        await DisplayAlert("No upload", "Picking a photo is not supported", "OK");
        return;
    }

    file = await CrossMedia.Current.PickPhotoAsync();
    if (file == null)
    {
        return;
    }

    MainImage.Source = ImageSource.FromStream(() =>
    {
        var stream = file.GetStream();
        return stream;
    });

    // Create the API client, passing in the training key
    CustomVisionTrainingClient trainingApi = new CustomVisionTrainingClient()
    {
        ApiKey = trainingKey,
        Endpoint = SouthCentralUsEndpointTraining
    };

    var projects = trainingApi.GetProjects();
    var project = projects.FirstOrDefault(p => p.Name == "Car");

    CustomVisionPredictionClient endpoint = new CustomVisionPredictionClient()
    {
        ApiKey = predictionKey,
        Endpoint = SouthCentralUsEndpointPrediction
    };

    var result = endpoint.ClassifyImageUrl(project.Id, project.Name, new Microsoft.Azure.CognitiveServices.Vision.CustomVision.Prediction.Models.ImageUrl(file.Path));

    foreach (var c in result.Predictions)
    {
        Console.WriteLine($"\t{c.TagName}: {c.Probability:P1}");
    }
}
Unhandled Exception:
Microsoft.Azure.CognitiveServices.Vision.CustomVision.Prediction.Models.CustomVisionErrorException: Operation returned an invalid status code "BadRequest"
I expect a prediction result.
I got the same "Bad Request" message while trying to call endpoint.DetectImage(projectId, iteractionName, stream). The thing is that last week it was working perfectly. I have noticed that it only happens with large images, around 2 MB.
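Worth noting from the code in the question: ClassifyImageUrl is handed file.Path, which is a local device path rather than a URL the prediction service can reach, and the second argument is normally the published iteration name rather than the project name. A minimal sketch of submitting the picked photo as a stream instead, assuming the stream overload of the same SDK and a hypothetical published iteration called "Iteration1":

// Sketch only: submit the picked photo as a stream.
// "Iteration1" is a placeholder for the name the model was published under.
using (var imageStream = file.GetStream())
{
    var result = await endpoint.ClassifyImageAsync(project.Id, "Iteration1", imageStream);

    foreach (var c in result.Predictions)
    {
        Console.WriteLine($"\t{c.TagName}: {c.Probability:P1}");
    }
}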

OAuth2 web request works in browser but not in app

I have the following code sample with which I'm trying to authenticate an Azure Active Directory user within a Xamarin.Forms app.
The URL (I've removed the actual client ID) works fine in a browser but fails when I try to send the HTTP request.
The error message says 'the response type must include client_id'.
string URL = "https://login.microsoftonline.com/common/oauth2/v2.0/authorize?"
+ "client_id=xxxx-xxxxx-xxxxx-xxxxx-xxx"
+ "&response_type=code"
+ "&redirect_uri=https://login.microsoftonline.com/common/oauth2/nativeclient"
+ "&response_mode=query"
+ "&scope=openid%20offline_access%20https%3A%2F%2Fgraph.microsoft.com%2Fmail.read"
+ "&state=12345";
var webRequest = System.Net.WebRequest.Create(URL) as HttpWebRequest;
System.Console.WriteLine(URL);

if (webRequest != null)
{
    webRequest.Method = "POST";
    webRequest.ServicePoint.Expect100Continue = false;
    webRequest.Timeout = 20000;
    webRequest.ContentType = "text/html";

    // POST the data.
    using (requestWriter = new StreamWriter(webRequest.GetRequestStream()))
    {
        requestWriter.Write(postData);
    }
}

HttpWebResponse resp = (HttpWebResponse)webRequest.GetResponse();
Stream resStream = resp.GetResponseStream();
StreamReader reader = new StreamReader(resStream);
ret = reader.ReadToEnd();
You put the parameters in the URL, so you need to use the GET method instead of POST (which is what your browser does when you paste the URL into its address bar).
So, replace:
webRequest.Method = "POST";
with:
webRequest.Method = "GET";
and remove:
//POST the data.
using (requestWriter = new StreamWriter(webRequest.GetRequestStream()))
{
requestWriter.Write(postData);
}
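Putting that together, a minimal sketch of the corrected request (reusing the URL variable from the question, and assuming all you need back is the response body of the authorize page):

// Sketch: issue the authorize request as a GET, since all parameters travel in the query string.
var webRequest = (HttpWebRequest)System.Net.WebRequest.Create(URL);
webRequest.Method = "GET";
webRequest.Timeout = 20000;

using (var resp = (HttpWebResponse)webRequest.GetResponse())
using (var reader = new StreamReader(resp.GetResponseStream()))
{
    var ret = reader.ReadToEnd(); // HTML of the sign-in page (or a redirect)
}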

Values seem to change straight after the initial save of a Dynamics CRM Opportunity

I'm making a service call to retrieve account details (the currency and the discount associated with the account) on selection of the Account lookup on the Opportunity form (form type == 1 // Create), using the Web API in CRM 2016 On-Premise. Everything is working fine, but when the opportunity is saved initially, it immediately shows unsaved changes next to the Save button, forcing me to do another save (abnormal behaviour). I'm not sure what value is changing straight after the initial save.
The service call is synchronous and is triggered on change of the Account lookup, well before the initial save. Any help appreciated!
function SetOpportunityCurrencyAndDiscount() {
    var accountId = (GetValue("vm_accountid"))[0].id;
    var result = RetrieveRecord("account", null, accountId.slice(1, -1));

    var accountDiscount = result["vm_accountdiscount"];
    var transactionCurrencyId = result["_transactioncurrencyid_value"];
    var currencyName = result["_transactioncurrencyid_value@OData.Community.Display.V1.FormattedValue"];

    SetValue("vm_discount", accountDiscount);
    Xrm.Page.getAttribute("transactioncurrencyid").setValue([{ id: transactionCurrencyId, name: currencyName, entityType: "transactioncurrency" }]);
}

function RetrieveRecord(recordType, alternateKey, accountId) {
    var result = null;
    var entityType = recordType;
    var query = null;

    if (alternateKey != null && accountId == null)
        query = "/api/data/v8.0/accounts(emailaddress1='" + alternateKey + "')?$select=name,accountid,_transactioncurrencyid_value,vm_accountdiscount";
    else
        query = "/api/data/v8.0/accounts(" + accountId + ")?$select=name,accountid,_transactioncurrencyid_value,vm_accountdiscount";

    var req = new XMLHttpRequest();
    req.open("GET", Xrm.Page.context.getClientUrl() + query, false);
    req.setRequestHeader("OData-MaxVersion", "4.0");
    req.setRequestHeader("OData-Version", "4.0");
    req.setRequestHeader("Accept", "application/json");
    req.setRequestHeader("Content-Type", "application/json; charset=utf-8");
    req.setRequestHeader("Prefer", "odata.include-annotations=\"OData.Community.Display.V1.FormattedValue\"");

    req.onreadystatechange = function () {
        if (this.readyState === 4) {
            req.onreadystatechange = null;
            if (this.status === 200) {
                result = JSON.parse(this.response);
            }
            else {
                alert(this.statusText);
            }
        }
    };

    req.send();
    return result;
}
After you save your record and the form is dirty again, open dev tools and paste this into the console. It will show you which fields are dirty.
function showDirtyFields() {
    var Xrm = Array.prototype.slice.call(document.querySelectorAll('iframe')).filter(function(d) {
        return d.style.visibility !== 'hidden';
    })[0].contentWindow.Xrm;

    var message = 'The following fields are dirty: \n';
    Xrm.Page.data.entity.attributes.forEach(function(attribute, index) {
        if (attribute.getIsDirty() == true) {
            message += "\u2219 " + attribute.getName() + "\n";
        }
    });
    Xrm.Utility.alertDialog(message);
}
showDirtyFields();
Another way of accomplishing the same thing is to turn on auditing for the entity. The audit log will show you which fields were submitted.
