The object is write protected or the server cannot read it - string

I have this var called turnosFinal2:
{"title":"Mariel Guerrieri","start":"2020-03-18T09:30:00","end":"2020-03-18T10:30:00"}
,{"title":"Agustin Guerra","start":"2020-03-17T12:30:00","end":"2020-03-17T13:30:00"}
,{"title":"Maria Marco","start":"2020-03-21T13:00:00","end":"2020-03-21T14:00:00"}
And a FullCalendar with the events property:
events: [{
  title: 'Agustin Guerra',
  start: '2020-03-12T10:30:00',
  end: '2020-03-12T11:30:00'
}]
In this case the event is static and it renders successfully when I load the webpage. I want to replace this static event with dynamic data.
Then I transform the var turnosFinal2 like this:
var turnosFinal = "[" + (turnosFinal2.replace(/"title"/g, "title").replace(/"start"/g, "start").replace(/"end"/g, "end").replace(/"/g, "'")) + "]";
Now I get:
[{title:'Mariel Guerrieri',start:'2020-03-18T09:30:00',end:'2020-03-18T10:30:00'}
,{title:'Agustin Guerra',start:'2020-03-17T12:30:00',end:'2020-03-17T13:30:00'}
,{title:'Maria Marco',start:'2020-03-21T13:00:00',end:'2020-03-21T14:00:00'}
]
From my point of view this variable should now work as the value of events, so I set it:
events: turnosFinal,
But I get this console error:
main.js:4329 GET http://localhost/pos/[%7Btitle:'Mariel%20Guerrieri',start:'2020-03-18T09:30:00',end:'2020-03-18T10:30:00'%7D,%7Btitle:'Agustin%20Guerra',start:'2020-03-17T12:30:00',end:'2020-03-17T13:30:00'%7D,%7Btitle:'Maria%20Marco',start:'2020-03-21T13:00:00',end:'2020-03-21T14:00:00'%7D]?start=2020-03-16T00%3A00%3A00-03%3A00&end=2020-03-23T00%3A00%3A00-03%3A00 403 (Forbidden)
main.js:5273 Request failed {message: "Request failed", xhr: XMLHttpRequest}
If I click on the first line (localhost/pos/[%7...) I get this:
¡Forbidden Access!
You do not have permission to access the requested object. The object is read protected or the server cannot read it.
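For reference, when FullCalendar's events option is given a string it treats that string as a JSON feed URL, which is why the browser issues a GET to localhost/pos/[%7Btitle... and hits the 403 page above. A minimal sketch of an alternative, assuming turnosFinal2 holds the comma-separated JSON objects shown at the top and a v4-style initializer (calendarEl and other names here are illustrative, not from the question):
// Build a real array of event objects instead of a quoted string.
// turnosFinal2 is assumed to be the comma-separated JSON fragment shown above.
var turnosFinal = JSON.parse("[" + turnosFinal2 + "]");

var calendar = new FullCalendar.Calendar(calendarEl, {
  // ...other options...
  events: turnosFinal   // an array of event objects, not a string
});
calendar.render();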

Getting message in ibmmq Node.js

I'm using the ibmmq module (https://github.com/ibm-messaging/mq-mqi-nodejs).
I need to get an XML message from a queue and then apply an XSL transformation.
I put messages on the queue with JMeter, and if I browse them in rfhutil I can see them as-is on the Data tab.
But when I get a message in the code
function getCB(err, hObj, gmo, md, buf, hConn) {
  // If there is an error, prepare to exit by setting the ok flag to false.
  if (err) {
    // ...error handling elided...
  } else {
    if (md.Format == "MQSTR") {
      console.log("message <%s>", decoder.write(buf));
    } else {
      console.log("binary message: " + buf);
    }
  }
}
I get my message with some service information:
buf=RFH �"�MQSTR � <mcd><Msd>jms_text</Msd></mcd> X<jms><Dst>queue://MY_QM/MY_QUEUE</Dst><Tms>1657791724648</Tms><Dlv>2</Dlv></jms> ...My_message...
How can I get only my message, as I do in rfhutil?
I can extract it with string methods, but that feels like a hack.
That message has the headers created by a JMS application. There are various ways of dealing with it. You can
Have the sending app disable sending that structure (setting the targClient property)
Use GMO options to ignore the properties (MQGMO_NO_PROPERTIES; see the sketch after the fragment below)
Have your application deal with the RFH2 structure. See for example the amqsget.js sample in the Node.js repo which includes this fragment:
switch (format) {
case MQC.MQFMT_RF_HEADER_2:
hdr = mq.MQRFH2.getHeader(buf);
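If all you need is the message body, the MQGMO_NO_PROPERTIES route is usually the least code. A minimal sketch of the get-message-options, loosely following the module's samples (the surrounding MQCONN/MQOPEN code and the hObj handle are assumed, not shown):
// Ask the queue manager not to return message properties (the JMS/RFH2 data).
var mq = require('ibmmq');
var MQC = mq.MQC;

var gmo = new mq.MQGMO();
gmo.Options = MQC.MQGMO_NO_SYNCPOINT |
              MQC.MQGMO_WAIT |
              MQC.MQGMO_CONVERT |
              MQC.MQGMO_FAIL_IF_QUIESCING |
              MQC.MQGMO_NO_PROPERTIES;   // strip properties before the body is delivered

var md = new mq.MQMD();
mq.GetAsync(hObj, md, gmo, getCB);       // hObj from MQOPEN, getCB as in the question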

The specified Checkout Session could not be found

Goal: Create a successful (test) Checkout Session using Stripe's API for checkout.
(Their tutorial on Checkout is here: https://github.com/stripe-samples/checkout-one-time-payments)
I'm creating a Checkout Session from my UI and building it in the backend web service with the supplied data, using the following code:
var options = new Stripe.Checkout.SessionCreateOptions
{
    PaymentMethodTypes = new List<string>
    {
        "card",
    },
    LineItems = stripeCartLineItems,
    Mode = "payment",
    SuccessUrl = "https://" + HostName + "/Stripe/OrderPlaced",
    CancelUrl = "https://example.com/cancel",
};
var requestOptions = new RequestOptions
{
    StripeAccount = stripeConnectedAccountId,
    ApiKey = StripeConfiguration.ApiKey
};
var service = new Stripe.Checkout.SessionService();
Stripe.Checkout.Session session = service.Create(options, requestOptions);
return Json(new { sessionId = session.Id });
I receive an acknowledgment back from Stripe's API with a valid Checkout Session id, and logs on Stripe's Dashboard confirm a successful Checkout Session.
However, I keep getting the error from the title: the specified Checkout Session could not be found.
The API keys have already been refreshed and placed appropriately; that's not the issue. Loading the test Checkout page is failing, and my logs in Stripe's dashboard show the resource_missing error.
The JavaScript call which initiates the redirect to Stripe's Checkout experience is copied straight from their tutorial (linked above). That code looks like this:
checkoutButton.addEventListener('click', function () {
  $.ajax({
    url: "/Stripe/CreateCheckoutSession",
    method: "POST",
    data: { stripeConnectedAccountId: stripeConnectedAccountId, cartLineItems: scope.cartLineItems },
  }).done(function (resp) {
    stripe.redirectToCheckout({
      sessionId: resp.sessionId
    }).then(function (result) {
      // If `redirectToCheckout` fails due to a browser or network
      // error, display the localized error message to your customer
      // using `result.error.message`.
      alert(result.error.message);
    });
  });
});
Going to https://stripe.com/docs/error-codes/resource-missing, the docs say this for that specific error code: "The ID provided is not valid. Either the resource does not exist, or an ID for a different resource has been provided."
Ok Stripe. Sure sure. You made this API, so I'll listen. However, according to your docs, IntelliSense, and your sample code, my code is correct, and I used the session.Id provided by the response object YOU sent me after initiating a Checkout Session.
I have no clue how to proceed.
Any ideas are appreciated.
If you have already verified the session and keys on the server and in Stripe, check the Stripe key used on your client side. The publishable key used to initialize Stripe on the client should belong to the same account as the key used on the server.
Check the client-side logs to make sure the key is the same.
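One concrete thing to check in this setup: the session is created with a Stripe-Account header for the connected account, so the client-side Stripe.js instance has to be initialized for that same connected account or the session id will look like it does not exist. A minimal sketch, assuming stripeConnectedAccountId is the same acct_... id used server-side and 'pk_test_...' stands in for your publishable key:
// Initialize Stripe.js against the same connected account the session was created on.
var stripe = Stripe('pk_test_...', {
  stripeAccount: stripeConnectedAccountId   // must match the Stripe-Account used on the server
});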

Cloudrail - Twitter API - GetIdentifier fails

I'm using cloudrail-node-js v.2.17.3.
I simply want to get a unique identifier for Twitter authentication.
However, this always returns an error:
{ [Error: The URL the RedirectReceiver returns does not contain all necessary keys in the query, it's missing at least oauth_token,oauth_verifier]
message: 'The URL the RedirectReceiver returns does not contain all necessary keys in the query, it\'s missing at least oauth_token,oauth_verifier' }
Here is my code:
const cloudrail = require('cloudrail-si');
cloudrail.Settings.setKey(cloudrailKey);
let Service = new cloudrail.services.Twitter(cloudrail.RedirectReceivers.getLocalAuthenticator(3000), twitterKey, twitterSecret, `http://localhost:3000`, "state");
Service.getIdentifier(function (err, id) {
console.log(err, id);
});
Thanks in advance.

Asana addProject trouble in node

I'm trying to create a task and then set a project on the task using Node.js and the asana thin wrapper available from npm.
var asana = require('asana');
var newTask = { name: "Your Mission", notes: "Stuff" };
var project = [{ id: 321, name: "Missions Impossible" }];
var client = asana.Client.basicAuth('APIKEY');
client.tasks.createInWorkspace(123, newTask).then(function(task) {
  client.tasks.addProject(task.id, project).then(function(o) {
    // Check for empty object returned (sign of success)
    if (Object.keys(o).length === 0)
      console.log('yay!');
    else
      console.log('booo');
  });
});
The task is created, but I get an error in the addProject method - "Possibly unhandled error. Invalid Request".
I've tried different variations on the project object, but I'm out of ideas.
Is the project wrongly formed? Something else?
You are right, your project is malformed. The data passed to the addProject method should be a dictionary with the member:
project: 321
or
project: { id: 321 }
See the documentation for the endpoint being called and the data that it gets passed.
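Applied to the code in the question, a minimal sketch of the corrected call (the surrounding createInWorkspace chain is unchanged, and 321 is the project id from the question):
// Pass a plain object with a `project` member instead of an array of project objects.
client.tasks.addProject(task.id, { project: 321 }).then(function(o) {
  // An empty object in the response indicates success.
  if (Object.keys(o).length === 0)
    console.log('yay!');
  else
    console.log('booo');
});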

Using JavaScript to put file into blob storage results in 403 (Server failed to authenticate request)

I followed this great article by Gaurav Mantri in order to upload files using HTML5/Javascript directly into blob storage.
http://gauravmantri.com/2013/02/16/uploading-large-files-in-windows-azure-blob-storage-using-shared-access-signature-html-and-javascript/
However I am finding that, during the upload, this portion of his code fails with a 403 error.
And the funny thing is, this happens randomly: sometimes the upload works and everything completes successfully, but the majority of the time it fails with the 403 error.
One thing to note: I am hoping that CORS support will be added to Azure soon, but for the time being I am using Chrome (with the chrome.exe --disable-web-security option) to get around the issue.
PUT
https://mystorage.blob.core.windows.net/asset-38569007-3316-4350…Giv17ye4bocVWDbA/EQ+riNiG3wEGrFucbd1BKI9E=&comp=block&blockid=YmxvY2stMA==
403 (Server failed to authenticate the request. Make sure the value of
Authorization header is formed correctly including the signature.)
$.ajax({
  url: uri,
  type: "PUT",
  data: requestData,
  processData: false,
  beforeSend: function(xhr) {
    xhr.setRequestHeader('x-ms-blob-type', 'BlockBlob');
    xhr.setRequestHeader('Content-Length', requestData.length);
  },
  success: function (data, status) {
    console.log(data);
    console.log(status);
    bytesUploaded += requestData.length;
    var percentComplete = ((parseFloat(bytesUploaded) / parseFloat(selectedFile.size)) * 100).toFixed(2);
    $("#fileUploadProgress").text(percentComplete + " %");
    uploadFileInBlocks();
  },
  error: function(xhr, desc, err) {
    console.log(desc);
    console.log(err);
  }
});
I have put a 30-sec delay after creating the asset/locator/file in Azure before actually starting the upload portion in order to give time for the Locator to be propagated.
Any suggestion as to what I could be missing?
Many thanks to Gaurav for pointing me in the direction of the issue.
It turns out that I was making JSON calls to the server which would create the assets/locators/policies and then return the upload uri back.
However, my upload uri was of type Uri, and when it was JSON-serialized it was not properly encoded.
After changing my uri object (on the server) to a string (by calling uploaduri = (new UriBuilder(theuri)).ToString(); ), the uri returned to the web client was properly encoded and I no longer got the 403 errors.
So as a heads up to others, if you get this same issue, you may want to look at the encoding of your upload uri.
Gaurav, here's the code I use to create the empty asset (with locator and file):
/// <summary>
/// Creates an empty asset on Azure and prepares it to upload
/// </summary>
public FileModel Create(FileModel file)
{
    // Update the file model with file and asset id
    file.FileId = Guid.NewGuid().ToString();

    // Create the new asset
    var createdAsset = this.Context.Assets.Create(file.AssetName.ToString(), AssetCreationOptions.None);

    // Create the file inside the asset and set its size
    var createdFile = createdAsset.AssetFiles.Create(file.Filename);
    createdFile.ContentFileSize = file.Size;

    // Create a policy to allow uploading to this asset
    var writePolicy = this.Context.AccessPolicies.Create("Policy For Copying", TimeSpan.FromDays(365 * 10), AccessPermissions.Read | AccessPermissions.Write | AccessPermissions.List);

    // Get the upload locator
    var destinationLocator = this.Context.Locators.CreateSasLocator(createdAsset, writePolicy);

    // Get the SAS Uri and save it to file
    var uri = new UriBuilder(new Uri(destinationLocator.Path));
    uri.Path += "/" + file.Filename;
    file.UploadUri = uri.Uri;

    // Return the updated file
    return file;
}
