Authenticating to Dynamics NAV for OData - node.js

I'm trying to write a node.js script that uses a Dynamics NAV OData feed.
I have both a user account/password and a Web Services Access Key from my Dynamics NAV setup.
I can't for the life of me figure out how to properly authenticate, either by adding something to a header or by adding something to the URL query. I've tried using the 'username:password@server' format. I've tried encoding that as base64 and adding it in the header as the 'Authentication' value.
The documentation itself is incredibly non-specific. I know how to generate the key, but I don't know how to properly send that key to NAV to authenticate.
I'm using the 'request-promise' npm package, which takes an 'options' argument that I can add arbitrary header key:value pairs to. Could someone please give me some direction on how to authenticate to NAV? I've been at this for hours.
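For illustration, this is roughly the shape of what I've been trying (the server, port, table, and credentials below are placeholders, and this assumes the NAV endpoint accepts Basic authentication):
var rp = require('request-promise');

var user = 'DOMAIN\\myuser';            // placeholder
var accessKey = 'my-web-services-key';  // placeholder

rp({
    uri: 'http://navserver:7048/DynamicsNAV/OData/Customer',
    json: true,
    headers: {
        // base64 of "username:webServicesAccessKey"
        Authorization: 'Basic ' + Buffer.from(user + ':' + accessKey).toString('base64')
    }
}).then(function (data) {
    console.log(data);
}).catch(function (err) {
    console.error(err.statusCode, err.message);
});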

I found a satisfactory answer.
Using node-libcurl I was able to cURL to a URL using the format
http://username:password@<server>/ODATA_table
Specifically, my cURL module looks like this:
var Curl = require('node-libcurl').Curl;

function getOData(url) {
    return new Promise((resolve, reject) => {
        // Create a fresh handle per call so repeated requests still work after close().
        var curl = new Curl();
        var close = curl.close.bind(curl);
        curl.setOpt(Curl.option.URL, url);
        curl.setOpt(Curl.option.HTTPAUTH, Curl.auth.NTLM);  // NAV uses Windows (NTLM) authentication
        curl.setOpt(Curl.option.SSL_VERIFYPEER, false);     // skip certificate checks (e.g. self-signed dev certs)
        curl.setOpt(Curl.option.SSL_VERIFYHOST, false);
        curl.setOpt(Curl.option.POST, 0);                   // plain GET request
        curl.on('end', function (statusCode, body, headers) {
            resolve(JSON.parse(body));
            close();
        });
        curl.on('error', function (e) {
            reject(e);
            close();
        });
        curl.perform();
    });
}

module.exports = { getOData: getOData };
But I have to explicitly ask for JSON in the URL, e.g. ?$format=json.
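For example (a sketch with a hypothetical server, port, and table; the module above is assumed to be saved as getOData.js):
var odata = require('./getOData');

var url = "http://myuser:mypassword@navserver:7048/DynamicsNAV/OData/Customer?$format=json";
odata.getOData(url).then(function (data) {
    console.log(data.value); // the OData rows
}).catch(function (err) {
    console.error(err);
});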

Tkol, you're right. You can also use Guzzle; it's very simple. Here's a sample function that queries the Customer table:
public function ReadCustomer($identifier = 0)
{
    try {
        $client = new GuzzleHttpClient();
        $apiRequest = $client->request('GET', 'http://server:port/ServiceName/WS/CompanyName/Page/Customer?$filter=No eq \''.$identifier.'\'', [
            'auth' => ['username', 'password', 'NTLM'], // NTLM authentication required
            'debug' => true                             // if needed to debug
        ]);
        $content = json_decode($apiRequest->getBody()->getContents());
        return $content;
    } catch (RequestException $re) {
        // For handling the exception
    }
}
You can check my sample:
update/delete/get from Dynamics NAV OData webservice
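Back in node.js, the same NTLM-authenticated query can also be made without libcurl, for example with the httpntlm package (a sketch only; the server, company, and credentials are placeholders):
var httpntlm = require('httpntlm');

httpntlm.get({
    url: "http://navserver:7048/DynamicsNAV/OData/Company('CRONUS')/Customer?$format=json",
    username: 'myuser',     // placeholders
    password: 'mypassword',
    domain: 'MYDOMAIN',
    workstation: ''
}, function (err, res) {
    if (err) return console.error(err);
    console.log(JSON.parse(res.body));
});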

Related

Node.JS PowerBI App Owns Data for Customers w/ Service Principal (set "config.json" from a table in my database)

I'm attempting to refactor the "Node.JS PowerBI App Owns Data for Customers w/ Service Principal" code example (found HERE).
My objective is to import the data for the "config.json" from a table in my database and insert the "workspaceId" and "reportId" values from my database into the "getEmbedInfo()" function (inside the "embedConfigServices.js" file). Reason being, I want to use different configurations based on user attributes. I am using Auth0 to login users on the frontend, and I am sending the user metadata to the backend so that I can filter the database query by the user's company name.
I am able to console.log the config data, but I am having difficulty figuring out how to insert those results into the "getEmbedInfo()" function.
It feels like I'm making a simple syntax error somewhere, but I am stuck. Here's a sample of my code:
//----Code Snippet from "embedConfigServices.js" file ----//
async function getEmbedInfo() {
    try {
        const url = ;
        const set_config = async function () {
            let response = await axios.get(url);
            const config = response.data;
            console.log(config);
        };
        set_config();
        const embedParams = await getEmbedParamsForSingleReport(
            config.workspaceId,
            config.reportId
        );
        return {
            accessToken: embedParams.embedToken.token,
            embedUrl: embedParams.reportsDetail,
            expiry: embedParams.embedToken.expiration,
            status: 200,
        };
    } catch (err) {
        return {
            status: err.status,
            error: err.statusText,
        };
    }
}
This is the error I am receiving on the frontend:
"Cannot read property 'get' of undefined"
Any help would be much appreciated. Thanks in advance.
Carlos
The error is caused by fetching the wrong URL; the problem is with the config for the service principal. You will need to provide the reportId and workspaceId for the service principal app, make sure the service principal has been added to the workspace, and follow all the steps in the documentation below for service principal authentication.
References:
https://learn.microsoft.com/power-bi/developer/embedded/embed-service-principal
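Separately, note that in the snippet above the config fetched inside set_config() is never awaited and is scoped to that inner function, so it is not visible where getEmbedParamsForSingleReport() is called. A minimal sketch of one way to restructure it (assuming the config endpoint returns workspaceId and reportId, and that getEmbedParamsForSingleReport comes from the original sample):
const axios = require('axios');

async function getEmbedInfo(configUrl) {
    try {
        // Await the config before using it, and keep it in the enclosing scope.
        const response = await axios.get(configUrl);
        const config = response.data;

        const embedParams = await getEmbedParamsForSingleReport(
            config.workspaceId,
            config.reportId
        );

        return {
            accessToken: embedParams.embedToken.token,
            embedUrl: embedParams.reportsDetail,
            expiry: embedParams.embedToken.expiration,
            status: 200,
        };
    } catch (err) {
        return { status: err.status, error: err.statusText };
    }
}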

Create custom extension through Graph API with Client Credentials auth

I have a .NET Web API that I am using to do some interaction with Microsoft Graph and Azure AD. However, when I attempt to create an extension on the user, it comes back with Access Denied.
I know it is possible from the documentation here; however, it doesn't seem to work for me.
For the API, I am using client credentials: my web app authenticates to the API with user credentials, and the API then calls the Graph using the client credentials flow.
My app in Azure AD has the application permission 'Read and write directory data' granted, as the documentation states is required for a user extension.
I know my token is valid as I can retrieve data with it.
Here is my code for retrieving it:
private const string _createApprovalUrl = "https://graph.microsoft.com/beta/users/{0}/extensions";

public static async Task<bool> CreateApprovalSystemSchema(string userId)
{
    using (var client = new HttpClient())
    {
        using (var req = new HttpRequestMessage(HttpMethod.Post, _createApprovalUrl))
        {
            var token = await GetToken();
            req.Headers.Add("Authorization", string.Format("Bearer {0}", token));
            req.Headers.TryAddWithoutValidation("Content-Type", "application/json");
            var requestContent = JsonConvert.SerializeObject(new { extensionName = "<name>", id = "<id>", approvalLimit = "0" });
            req.Content = new StringContent(requestContent, Encoding.UTF8, "application/json");
            using (var response = await client.SendAsync(req))
            {
                var content = await response.Content.ReadAsStringAsync();
                ApprovalSystemSchema schema = JsonConvert.DeserializeObject<ApprovalSystemSchema>(content);
                if (schema.Id == null)
                {
                    return false;
                }
                return true;
            }
        }
    }
}
Is there anyone who may have a workaround on this, or information as to when this will be doable?
Thanks,
We took a look and it looks like you have a bug/line of code missing. You appear to be making this exact request:
POST https://graph.microsoft.com/beta/users/{0}/extensions
Looks like you are missing the code to replace the {0} with an actual user id. Please make the fix and let us know if you are now able to create an extension on the user.

Uploading a file from Autodesk A360 to bucket in NodeJS

I am using the Forge data management API to access my A360 files and aim to translate them into the SVF format so that I can view them in my viewer. So far I have been able to reach the desired item using the ForgeDataManagement.ItemsApi, but I don't know what to do with the item to upload it to the bucket in my application.
From the documentation it seems like uploadObject is the way to go (https://github.com/Autodesk-Forge/forge.oss-js/blob/master/docs/ObjectsApi.md#uploadObject), but I don't know exactly how to make this function work.
var dmClient = ForgeDataManagement.ApiClient.instance;
var dmOAuth = dmClient.authentications['oauth2_access_code'];
dmOAuth.accessToken = tokenSession.getTokenInternal();
var itemsApi = new ForgeDataManagement.ItemsApi();

fileLocation = decodeURIComponent(fileLocation);
var params = fileLocation.split('/');
var projectId = params[params.length - 3];
var resourceId = params[params.length - 1];

itemsApi.getItemVersions(projectId, resourceId)
    .then(function (itemVersions) {
        if (itemVersions == null || itemVersions.data.length == 0) return;
        // Use the latest version of the item (file).
        var item = itemVersions.data[0];
        var contentLength = item.attributes.storageSize;
        var body = new ForgeOSS.InputStream();
        // var body = item; // Using the item directly does not seem to work.
        // var stream = fs.createReadStream(...) // Should I create a stream object like suggested in the documentation?
        objectsAPI.uploadObject(ossBucketKey, ossObjectName, contentLength, body, {}, function (err, data, response) {
            if (err) {
                console.error(err);
            } else {
                console.log('API called successfully. Returned data: ' + data);
                // To be continued...
            }
        });
    });
I hope someone can help me out!
My current data:
ossObjectName = "https://developer.api.autodesk.com/data/v1/projects/"myProject"/items/urn:"myFile".dwfx";
ossBucketKey = "some random string based on my username and id";
Regards,
torjuss
When using the Data Management API, you can either work with:
2-legged OAuth (client_credentials), to access OSS buckets and objects, or
3-legged OAuth (authorization_code), to access a user's Hubs, Projects, Folders, Items, and Versions.
When using 3-legged OAuth you are accessing someone's content on A360 or BIM 360, and those files are automatically translated by the system, so you do not need to translate them again, nor transfer them to a 2-legged application bucket. The only thing you need to do is get the manifest of the Item (or of one of its versions) and use its URN to view it in the viewer.
Check out an example here: https://developer.autodesk.com/en/docs/data/v2/reference/http/projects-project_id-versions-version_id-GET/
You'll see something like:
Examples: Successful Retrieval of a Specific Version (200)
curl -X GET -H "Authorization: Bearer kEnG562yz5bhE9igXf2YTcZ2bu0z" "https://developer.api.autodesk.com/data/v1/projects/a.45637/items/urn%3Aadsk.wipprod%3Adm.lineage%3AhC6k4hndRWaeIVhIjvHu8w"
{
    "data": {
        "relationships": {
            "derivatives": {
                "meta": {
                    "link": {
                        "href": "/modelderivative/v2/designdata/dXJuOmFkc2sud2lwcWE6ZnMuZmlsZTp2Zi50X3hodWwwYVFkbWhhN2FBaVBuXzlnP3ZlcnNpb249MQ/manifest"
                    }
                }
            },
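For reference, a minimal node.js sketch of the versions call referenced above (the project id, version URN, and access token are placeholders; the derivatives link is read from the response shape shown above):
var axios = require('axios');

function getDerivativesLink(projectId, versionUrn, accessToken) {
    var url = 'https://developer.api.autodesk.com/data/v1/projects/' + projectId +
              '/versions/' + encodeURIComponent(versionUrn);
    return axios.get(url, {
        headers: { Authorization: 'Bearer ' + accessToken } // 3-legged token
    }).then(function (res) {
        // e.g. "/modelderivative/v2/designdata/<urn>/manifest"
        return res.data.data.relationships.derivatives.meta.link.href;
    });
}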
Now, to answer the other question about upload, I have an example available here:
https://github.com/Autodesk-Forge/forge.commandline-nodejs/blob/master/forge-cb.js#L295. I copied the relevant code here for everyone to see how to use it:
fs.stat(file, function (err, stats) {
    var size = stats.size;
    var readStream = fs.createReadStream(file);
    ossObjects.uploadObject(bucketKey, fileKey, size, readStream, {}, function (error, data, response) {
        // ...
    });
});
Just remember that ossObjects is 2-legged, whereas Items and Versions are 3-legged.
We figured out how to get things working after some support from Adam Nagy. To put it simply, we had to do everything with 3-legged OAuth, since every operation involves a document from an A360 account. This includes accessing and showing the file structure, translating a document to SVF, starting the viewer, and loading the document into the viewer.
Also, we were targeting the wrong id when trying to translate the document. This post shows how easily it can be done now; thanks to Adam for the info!

Windows Azure node.js Push notification for Windows store 8.1 - How to use 'createRawTemplateRegistration' template?

Please explain with an example, as I am getting "Error: 400 - The specified resource description is invalid."
Basically, I want to update the badge value, but there is no template for badge registration in the WnsService API documentation (http://azure.github.io/azure-sdk-for-node/azure-sb/latest/WnsService.html). So I am trying the "createRawTemplateRegistration" template to update the badge value.
Please help me with this.
You can directly use the function sendBadge() to push a badge value to client devices.
Please try the following code:
var azure = require('azure');
var notificationHubService = azure.createNotificationHubService('<hubname>', '<connectionstring>');
notificationHubService.wns.sendBadge(null, 99, function (error, response) {
    if (error) console.log(error);
    console.log(response);
});
Any further concern, please feel free to let me know.
update
Do you mean that you want only one template to handle all the types of notifications, including Raw, Toast, and Badge? If so, I think the answer is no. According to the description at http://azure.github.io/azure-sdk-for-node/azure-sb/latest/WnsService.html#createRawTemplateRegistration:
Remember that you have to specify the X-WNS-Type header
So the header option is required. The REST API invoked by this Node.js API is Create Registration, and in its description we can find:
The BodyTemplate element is mandatory, as is the X-WNS-Type header.
So we should specify the notification type for the template.
update1
This code sample works fine on my side:
var channel = '<devicetoken>';
var templateMessage = { text1: '$(message)' };
notificationHubService.wns.createRawTemplateRegistration(channel, 'tag', JSON.stringify(templateMessage),
    { headers: { 'X-WNS-Type': 'wns/raw' } },
    function (e, r) {
        if (e) {
            console.log(e);
        } else {
            console.log({
                id: r.RegistrationId,
                deviceToken: r.DeviceToken,
                expires: r.ExpirationTime
            });
        }
    }
);

Overriding delete operation on Azure Mobile Services table

I'd like to override the delete operation on my Azure Mobile Services table to make it more like an update than a real delete. I have an additional column named IsDeleted and I'd like to set its value to true when the delete operation is executed.
I figured out that what I need is to:
fire my own 'update' inside the del function,
remove the current request.execute(),
prepare and send the response myself.
That means my del function should look like this:
function del(id, user, request) {
    // execute update query to set 'isDeleted' - true
    // return standard response
    request.respond();
}
As you can see I'm missing the first part of the function, the update. Could you help me write it? I read the Mobile Services server script reference but there is no info about making additional queries inside a server script function.
There are basically two ways to do that - using the tables object, and using the mssql object. The links point to the appropriate reference.
Using mssql (I didn't try it, you may need to update your SQL statement):
function del(id, user, request) {
    var sql = 'UPDATE <yourTableName> SET isDeleted = 1 WHERE id = ?';
    mssql.query(sql, [id], {
        success: function () {
            request.respond(statusCodes.OK);
        }
    });
}
Using tables (again, only tested in notepad):
function del(id, user, request) {
    var table = tables.getTable('YourTableName');
    table.where({ id: id }).read({
        success: function (items) {
            if (items.length === 0) {
                request.respond(statusCodes.NOT_FOUND);
            } else {
                var item = items[0];
                item.isDeleted = true;
                table.update(item, {
                    success: function () {
                        request.respond(statusCodes.OK, item);
                    }
                });
            }
        }
    });
}
There is a Node.js driver for SQL Server that you might want to check out.
The script component of Mobile Services uses node.js. You might want to check out the session from AzureConf called "JavaScript, meet cloud".
