Skobbler Map: FreeDrive function integration in another app

In the Skobbler map SDK, which frameworks or files should be imported into our app in order to use the 'freedrive' function of NavigationUI? Thanks.

Related

Source of the JSONArrayFor Liquid method when using Logic App

I've built my own Liquid template verification application that takes a Liquid template plus a sample file and outputs the result. The app uses the same version of DotLiquid that Logic Apps does, but there seems to be a custom filter, JSONArrayFor, that I'm not able to replicate.
Running the app with the filter throws the error "DotLiquid.Exceptions.SyntaxException: Unknown tag 'JSONArrayFor'", so it has to be a custom tag created by the Azure team.
Does anyone have any clue where I can find the source, the logic, or anything regarding what it does and how the filter works?

How to expose a module from an Electron app to an external module

I am creating an Electron app and am using electron-builder to package and build it. Users are able to make plugins for the app (plugins can be Node modules with their own dependencies and so on). Everything is working fine except the part of the app that exposes the app's API to the plugins.
I created a module in the app, "plugin-handler", that imports the plugin and also exposes the API to it (the app's API is just a set of functions and is bundled with the app).
The dilemma is that the user should be able to place a plugin anywhere on their machine, and the app does not know the path before the build. Consequently, I excluded the "plugin-handler" module in the electron-builder config so it is not bundled by Webpack. Now I need to find the right way to expose the API to the plugin.
Here is how I currently load a plugin and pass the API to it:
// In the Plugin-handler module
const API = require('api')
const plugin = require('path-to-plugin')( API )
path-to-plugin is added by the user in the app when they import their plugin.
As seen above, I currently pass the API to the plugin as an argument, which is not ideal. Instead, I need a way to expose the API module (or any other module bundled with the app) to the plugin, so users can access it in their plugin like below:
// In the plugin
const { arg1, arg2,... } = require('api')
I've seen apps do this, allowing users to access their API from plugins, but since I am new to all this I may be doing things wrong, so please be kind, and thank you for the help!
I drew a simple chart to portray the question better.

Firebase dynamic links builder API for NodeJS

I am trying to use Firebase Cloud Functions to build my dynamic link rather than using the Android client API:
@VisibleForTesting
static Uri buildDeepLink(@NonNull final Uri deepLink, int minVersion) {
    String uriPrefix = "https://url.page.link";
    DynamicLink.Builder builder = FirebaseDynamicLinks.getInstance()
            .createDynamicLink()
            .setDomainUriPrefix(uriPrefix)
            .setAndroidParameters(new DynamicLink.AndroidParameters.Builder()
                    .setMinimumVersion(minVersion)
                    .build())
            .setLink(deepLink);
    final DynamicLink link = builder.buildDynamicLink();
    // Return the dynamic link as a URI
    return link.getUri();
}
The above code is for the Android client; is there similar code for the Cloud Functions environment?
Firebase Dynamic Links currently has SDKs for creating links on iOS, Android, Unity, and C++. It also has a REST API for other platforms. Since you're not using Android or iOS, you'll have to use the REST API.
If you're feeling lazy, you can also use this small package instead of the REST API:
https://www.npmjs.com/package/firebase-dynamic-links

Assign Application Insights cloud_RoleName to Windows Service running w/ OWIN

I have an application built from a series of web servers and microservices, perhaps 12 in all. I would like to monitor and, importantly, map this suite of services in Application Insights. Some of the services are built with .NET Framework 4.6 and deployed as Windows services using OWIN to receive and respond to requests.
In order to get the instrumentation working with OWIN I'm using the ApplicationInsights.OwinExtensions package. I'm using a single instrumentation key across all my services.
When I look at my Application Insights Application Map, it appears that all the services I've instrumented are grouped into a single "application", with a few "links" to outside dependencies. I do not seem to be able to produce the "Composite Application Map" whose existence is suggested here: https://learn.microsoft.com/en-us/azure/application-insights/app-insights-app-map.
I'm assuming this is because I have not set a different "RoleName" for each of my services. Unfortunately, I cannot find any documentation that describes how to do so. My map looks as follows, but the big circle in the middle is actually several different microservices:
I do see that the OwinExtensions package offers the ability to customize some aspects of the telemetry reported but, without a deep knowledge of the internal structure of App Insights telemetry, I can't figure out whether it allows the RoleName to be set and, if so, how to accomplish this. Here's what I've tried so far:
appBuilder.UseApplicationInsights(
    new RequestTrackingConfiguration
    {
        GetAdditionalContextProperties =
            ctx => Task.FromResult(
                new[] { new KeyValuePair<string, string>("cloud_RoleName", ServiceConfiguration.SERVICE_NAME) }
                    .AsEnumerable())
    });
Can anyone tell me how, in this context, I can instruct App Insights to collect telemetry which will cause a Composite Application Map to be built?
The doc below covers telemetry initializers, which are exactly what you want for setting additional properties on collected telemetry; in this case, set Cloud.RoleName to enable the application map.
https://learn.microsoft.com/en-us/azure/application-insights/app-insights-api-filtering-sampling#add-properties-itelemetryinitializer
Your telemetry initializer code would be something along the following lines...
public class CloudRoleNameInitializer : ITelemetryInitializer
{
    public void Initialize(ITelemetry telemetry)
    {
        if (string.IsNullOrEmpty(telemetry.Context.Cloud.RoleName))
        {
            // Set the role name for this service here.
            telemetry.Context.Cloud.RoleName = "RoleName";
        }
    }
}

// Register it at startup:
TelemetryConfiguration.Active.TelemetryInitializers.Add(new CloudRoleNameInitializer());
Please try this and see if it helps.

Delete folder in Google Cloud Storage using nodejs gcloud api

I am using the gcloud Node.js API to access Google Cloud Storage. I can save/delete files and check their existence individually, but I haven't found a way to delete a folder, or even to list the files in a folder, using the gcloud Node.js API.
I have seen people say that the folder hierarchy in GCS is not a real tree structure, just names. So I tried to use a wildcard to match the file-name string, which did not succeed.
I wonder if there is any way to do it. If not, what tool should I use?
The code to list files in a directory should look something like this:
bucket.getFiles({ prefix: 'directoryName/' }, function(err, files) {})
And to delete:
bucket.deleteFiles({ prefix: 'directoryName/' }, function(err) {})
getFiles API documentation
deleteFiles API documentation
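Since GCS object names are flat and "folders" are only shared name prefixes, both calls above boil down to a prefix match. A small illustration of that matching logic (plain JavaScript, no network calls):

```javascript
// "Folders" in GCS are just shared prefixes on flat object names, so
// listing a directory is filtering names by prefix.
function filesUnder(objectNames, folder) {
  const prefix = folder.endsWith('/') ? folder : folder + '/';
  return objectNames.filter((name) => name.startsWith(prefix));
}

const objects = ['logs/2020/a.txt', 'logs/b.txt', 'images/c.png'];
console.log(filesUnder(objects, 'logs')); // ['logs/2020/a.txt', 'logs/b.txt']
```

bucket.deleteFiles({ prefix: 'directoryName/' }) applies the same filter server-side, so deleting a "folder" really means deleting every object whose name shares that prefix.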
Instead of using the gcloud Node.js API, there are two other ways to do this.
Use the googleapis package to access the standard JSON API and XML API of GCS. googleapis is a lower-level tool for interacting with Google Cloud services, and it can create/list/delete files on GCS. Documentation and examples:
https://cloud.google.com/storage/docs/json_api/v1/objects/delete
https://cloud.google.com/storage/docs/json_api/v1/objects/list
Use child_process to execute the gsutil command-line tool. This is not a standard way of programmatically accessing the Google API, but it is still a viable solution. Wildcards are allowed when issuing the command. Note that it may not work on Google App Engine. Here is an example:
Node.js
var exec = require('child_process').exec;
exec("gsutil rm gs://[bucketname]/[directory]/*", function (error, stdout, stderr) {});
As Stephen suggested, using the standard gcloud methods bucket.getFiles and bucket.deleteFiles is the most desirable approach. Since GCS doesn't have the concept of directories, manipulating multiple files should naturally be treated as a bucket-level operation.