Azure Maps API - Search for location based on ID

I am working with the Azure Maps API, which returns information for a "fuzzy" match in this format:
{
  type: 'Geography',
  id: 'US/GEO/p0/52503',
  score: 2.7559998035,
  entityType: 'PostalCodeArea',
  address: {
    municipalitySubdivision: 'Brentwood',
    municipality: 'Washington',
    countrySecondarySubdivision: 'District of Columbia',
    countrySubdivision: 'DC',
    countrySubdivisionName: 'District of Columbia',
    postalCode: '56967',
    postalName: 'Parcel Return Service',
    countryCode: 'US',
    country: 'United States',
    countryCodeISO3: 'USA',
    freeformAddress: 'Parcel Return Service, DC 56967'
  },
  position: { lat: 38.91752, lon: -76.99356 },
  viewport: { topLeftPoint: [Object], btmRightPoint: [Object] }
}
I want to be able to store the id property in a database (US/GEO/p0/52503 in this case) and then be able to get all the information from Azure without needing to basically duplicate the Azure data. I couldn't find any information in the Azure Maps API documentation about how to do a lookup for a location based on this ID - is this possible?

There is no way to retrieve results by ID in Azure Maps. Additionally, there is no guarantee that the ID will not change in the future (the V2 geocoding service doesn't have the ID at all). I believe it is mainly there for debugging purposes.
Storing the address and, optionally, the lat/lon values in your database is likely the best option. Note that as long as you have an Azure subscription and your database is hosted in Azure, you are allowed to store the lat/lon result data in your database (even if you stop using Azure Maps later).
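As a minimal sketch of that advice, the function below flattens the parts of a fuzzy-search result worth persisting (the address and lat/lon, not the volatile id); `result` is assumed to be shaped like the sample response shown in the question, and the function name is illustrative:

```javascript
// Sketch: extract the durable fields from an Azure Maps fuzzy-search result.
function toStoredLocation(result) {
  return {
    freeformAddress: result.address.freeformAddress,
    countryCode: result.address.countryCode,
    lat: result.position.lat,
    lon: result.position.lon,
  };
}
```

Re-resolving later is then a matter of sending the stored freeformAddress back through the fuzzy search (or the address geocoding endpoint) rather than looking anything up by id.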

Related

Iteration within Azure AD B2C custom policy

Is there a way within a B2C custom policy to perform an iteration on some JSON returned from a REST call?
Example:
If the REST service returned:
[
  {
    name: "item1",
    value: "value1"
  },
  {
    name: "item2",
    value: "value2"
  },
  {
    name: "item3",
    value: "value3"
  }
]
I want to be able to iterate through this array, perform a calculation, and set a boolean claim if a matching record is found:
calculation(name, value) == calculation(another_input_claim, value)
I cannot use another REST service, as the specifics of the calculation need to be performed securely inside the B2C policy.
I can change the JSON format of the input array if that helps.
Do you have any suggestions on implementing this within a B2C policy?
One option available in B2C that I feel can be used here is the "GetClaimsFromJsonArray" claims transformation, which helps with extracting the values from the JSON array.

mongoose update millions of records while extracting information

We have a production database with over 5 million customer records; each customer document has an embedded array of licenses they have applied for. An example customer document is as follows:
{
  _id: ObjectId('...'),
  phoneNumber: 'xxxx',
  // Other customer fields
  licenses: [
    {
      _id: ObjectId('...'),
      state: 'PENDING',
      expired: false,
      createdAt: ISODate(''),
      // Other license fields
    },
    // More licenses for this customer
  ]
}
I have been tasked with changing the state of every PENDING license applied for during the month of September to REJECTED and sending an SMS to every customer whose pending permit just got rejected.
Using model.where(condition).countDocuments() I have found that there are over 3 million customers (not licenses) matching the aforementioned criteria. Each customer has an average of 9 licenses.
I need assistance coming up with a strategy that won't slow down the system when performing this action. Furthermore, this is around 17GB of data.
Sending SMS is fine, I can queue details for SMS service. My challenge is processing the licenses while extracting relevant information for SMS.
First of all you have to create an index on the collection:
db.collection.createIndex( { "licenses.state": 1 } )
Then you should do something like this:
model.updateMany({ 'licenses.state': 'PENDING' }, {
  $set: {
    'licenses.$[elem].state': 'REJECTED'
  }
}, {
  arrayFilters: [{
    'elem.state': 'PENDING',
    // Date objects bounding the target month (1 September to 1 October)
    'elem.createdAt': { $gte: new Date(/* ... */), $lt: new Date(/* ... */) }
  }]
}).then(function (res) { /* ... */ })
If you have a replica set and your updates run on the primary instance, reads on the secondary instances should not be affected.
If you want to split the update into many batches, you can use the _id (which is already indexed). Of course, that depends on your _id format.
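To make the update arguments concrete, here is a small sketch that builds the filter, update, and arrayFilters for the September job; the year parameter and the model name in the usage comment are assumptions:

```javascript
// Sketch: build the updateMany arguments for rejecting September's
// PENDING licenses in the given (assumed) year.
function buildRejectionJob(year) {
  const start = new Date(Date.UTC(year, 8, 1)); // 1 September, 00:00 UTC
  const end = new Date(Date.UTC(year, 9, 1));   // 1 October, 00:00 UTC (exclusive)
  const dateRange = { $gte: start, $lt: end };
  return {
    // Only match customers that have at least one qualifying license
    filter: { licenses: { $elemMatch: { state: 'PENDING', createdAt: dateRange } } },
    update: { $set: { 'licenses.$[elem].state': 'REJECTED' } },
    // Only touch the qualifying array elements, not every license
    options: { arrayFilters: [{ 'elem.state': 'PENDING', 'elem.createdAt': dateRange }] },
  };
}

// Usage (not executed here; 'Customer' is a hypothetical model name):
// const job = buildRejectionJob(2021);
// Customer.updateMany(job.filter, job.update, job.options);
```

Querying with the same filter before the update (projecting phoneNumber and the matching licenses) gives you the details to queue for the SMS service.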

i am trying to create order but no such sku on stripe api

I am trying to create an order but get "no such sku" from the Stripe API. Is it possible to create an order on Stripe without creating a product? I just want to store the order on Stripe.
const orderRes = await stripe.orders.create({
  currency: 'usd',
  email: 'iamaemail@gmail.com',
  items: [
    { type: 'sku', parent: 'sku_7hAchfCjchvSHL' },
  ],
  shipping: {
    name: 'Jenny Rosen',
    address: {
      line1: '1234 Main Street',
      city: 'San Francisco',
      state: 'CA',
      country: 'US',
      postal_code: '94111',
    },
  },
});
As you can see in the Stripe docs, the Orders API has been deprecated and is not SCA compliant (which affects whether charges made via this API will succeed for European merchants or end users).
If you look at the Stripe API reference for creating an Order object, you can see that only the currency parameter is required. If you are going to specify a value for the parent parameter of an element in the items hash, you must ensure that you are using a SKU object created in the same mode (Test or Live) as the environment in which you're creating the Order, and that the SKU id is correct. The SKU id in your code does not appear to match any SKU in your Stripe account, which is why you're getting the error you mention.
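A small sketch of that point about required parameters: since only currency is required, you can build the request body conditionally and only attach an item when you actually have a SKU id that exists in the matching environment. The helper name is illustrative:

```javascript
// Sketch: build an Orders API request body. Only `currency` is required;
// attach an item only when a known-good SKU id is available.
function buildOrderRequest(currency, skuId) {
  const body = { currency };
  if (skuId) {
    body.items = [{ type: 'sku', parent: skuId }];
  }
  return body;
}

// e.g. stripe.orders.create(buildOrderRequest('usd')) would create an
// order with no items at all (not executed here).
```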

REST API for search by products, brands and more using a single search input

I'm new to Node and MongoDB.
I want to implement a search REST API: a single param is passed to the API, which searches the Mongo collections, checking the category and subCategory values, and returns the objects matching the keyword. Just like the Flipkart search bar, with suggested keywords. What should I follow to achieve this? I only have knowledge of basic CRUD operations. Any suggestions or reference practices are helpful to me. Thank you.
You can follow two approaches for the above implementation.
1) Basic approach. We can create a search collection which would have the following fields:
Search
_id, name, description, type (brand, product, etc.), type_id (brand_id, product_id), logo (it can be a brand logo, a product logo, etc.).
Whenever a product, brand, etc. is added, we would create an entry in the search collection.
Similarly, on deletion, we would remove that product or brand from the search collection.
We would have an endpoint such as http:///search/:string
which would respond with a result like:
{
  data: [
    {
      _id: '507f1f77bcf86cd799439011',
      name: 'Clothing',
      description: 'Sample description about clothing',
      type: 'brand',
      type_id: '675f1f77bcf86cd799439124', // brand id reference
      logo: 'http://<domain_name>/logo/675f1f77bcf86cd799439124'
    },
    {
      _id: '5d3f1f77bcf86cd799439234',
      name: 'Monitor',
      description: 'Sample description about Monitor',
      type: 'product',
      type_id: '5j5f1f77bcf86cd799439987', // product id reference
      logo: 'http://<domain_name>/logo/5j5f1f77bcf86cd799439987'
    },
    {
      _id: '507f1f77bcf86cd799439333',
      name: 'Mobile',
      description: 'Sample description about Mobile',
      type: 'brand',
      type_id: '876f1f77bcf86cd799439444', // brand id reference
      logo: 'http://<domain_name>/logo/876f1f77bcf86cd799439444'
    }
  ]
}
2) Sophisticated approach: instead of a search collection, you can go with Elasticsearch for a faster and more robust approach.
You can use MongoDB's text search feature,
or you can go with Elasticsearch, as per your requirement.
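For the MongoDB text-search route, here is a minimal sketch of the query pieces; it presumes a text index on name and description, and the collection and function names are assumptions:

```javascript
// Sketch: build the parts of a MongoDB $text query, ranked by relevance.
function buildTextSearch(term) {
  return {
    query: { $text: { $search: term } },
    projection: { score: { $meta: 'textScore' } },
    sort: { score: { $meta: 'textScore' } },
  };
}

// Usage with a mongoose model named Search (not executed here):
// const q = buildTextSearch('mobile');
// Search.find(q.query, q.projection).sort(q.sort).limit(10);
// The matching index would be created once with:
// db.search.createIndex({ name: 'text', description: 'text' })
```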

How to update an index with new variables in Elasticsearch?

I have an index 'user' which has a mapping with the fields "first", "last", and "email". The name fields get indexed at one point, and then the email field gets indexed at a separate point. I want these writes to target the same document id though, corresponding to one user_id parameter. So something like this:
function indexName(client, id, name) {
  return client.update({
    index: 'user',
    type: 'text',
    id: id,
    body: {
      first: name.first,
      last: name.last
    }
  })
}
function indexEmail(client, id, email) {
  return client.update({
    index: 'user',
    type: 'text',
    id: id,
    body: {
      email: email
    }
  })
}
When running:
indexName(client, 'some_user_id', { first: 'Jon', last: 'Snow' }).then(() => indexEmail(client, 'some_user_id', 'jonsnow@gmail.com'))
I get an error message saying that the document has not been created yet. How do I account for a document with a variable number of fields? And how do I create the index if it has not been created and then subsequently update it as I go?
The function you are using, client.update, updates part of an existing document. What you actually need is to first create the document using the client.create function.
To create an index, you need the indices.create function.
About the variable number of fields in a document type: it is not a problem, because Elasticsearch supports dynamic mapping. However, it would be advisable to provide a mapping when creating the index, and to try to stick to it. Elasticsearch's default mapping can cause you problems later on, e.g. analyzing UUIDs or email addresses, which then become difficult (or impossible) to search and match.
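To illustrate the create-then-update flow, here is a sketch; the client method shapes mirror the question's own calls (note that update takes the partial document under body.doc), and the in-memory stub client is purely illustrative so the flow can be traced without a cluster:

```javascript
// Sketch: create the document first, then apply a partial update.
async function indexUser(client, id, name, email) {
  await client.create({
    index: 'user', type: 'text', id,
    body: { first: name.first, last: name.last },
  });
  // The document now exists, so the partial update succeeds.
  await client.update({
    index: 'user', type: 'text', id,
    body: { doc: { email: email } },
  });
}

// Illustrative in-memory stand-in for the Elasticsearch client.
function stubClient() {
  const docs = {};
  return {
    docs,
    async create({ id, body }) { docs[id] = { ...body }; },
    async update({ id, body }) {
      if (!docs[id]) throw new Error('document missing'); // what the asker hit
      Object.assign(docs[id], body.doc);
    },
  };
}
```

Running update before create against the stub throws, which mirrors the "document has not been created yet" error in the question.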
