Sequence Number for Id - node.js

I want to configure MongoDB to create a sequence number for an Id column. For example, it has to start from 1001 and increase by 1 automatically when I insert the next row. I have my schema definitions as part of Node.js; how do I add this configuration in the Node schema?

MongoDB doesn't support this out of the box. The way I've implemented this (albeit in C#) is to create a "Sequence" collection with a key and a next number. You can atomically increment and return the next number, then use this as the id in your collection.
This is a C# function that uses the findAndModify MongoDB command to fetch and update the sequence number for a given "key".
public long GetNextSequenceNumber(string name, string key)
{
    var update = new BsonDocument(new BsonElement("$inc", new BsonDocument(new BsonElement("SequenceNumber", 1))));
    var query = new BsonDocument("_id", key);

    // findandmodify atomically increments and returns the counter document
    var command = new CommandDocument {
        { "findandmodify", name },
        { "query", query },
        { "update", update },
        { "new", true },
    };

    var res = Db.RunCommand(command);
    if (res.Response["value"] != BsonNull.Value)
    {
        var o = BsonSerializer.Deserialize<Sequence>(res.Response["value"].ToBsonDocument());
        return o.SequenceNumber;
    }
    else
    {
        // no counter exists yet for this key: create one starting at 0
        var o = new Sequence() { Id = key, SequenceNumber = 0 };
        Db.GetCollection(name).Insert<Sequence>(o);
        return o.SequenceNumber;
    }
}
and the Sequence model:
public class Sequence
{
    public string Id { get; set; }
    public long SequenceNumber { get; set; }
}
The sequence documents look like:
{
    _id : 'mykey',
    SequenceNumber : NumberLong(1234)
}
If you need it converted to JavaScript, please ask.
Hope that helps.
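Since the question is about Node.js, a rough equivalent of the pattern above using the official mongodb driver might look like the sketch below. The "counters" collection name and the seeding step are assumptions for illustration, not part of the original answer:

// Atomically fetch-and-increment the counter document for a given key.
// Assumes a "counters" collection holding one counter document per key.
async function getNextSequenceNumber(db, key) {
    const result = await db.collection('counters').findOneAndUpdate(
        { _id: key },                               // counter document for this key
        { $inc: { SequenceNumber: 1 } },            // atomic increment
        { upsert: true, returnDocument: 'after' }   // create it if missing, return the updated doc
    );
    // Driver v4/v5 wraps the document in result.value; v6+ returns it directly.
    const doc = result.value ?? result;
    return doc.SequenceNumber;
}

// Usage sketch: seed the counter at 1000 once if ids should start at 1001,
// then use the returned number as the id of the inserted row.
// await db.collection('counters').updateOne(
//     { _id: 'mykey' },
//     { $setOnInsert: { SequenceNumber: 1000 } },
//     { upsert: true });
// const id = await getNextSequenceNumber(db, 'mykey');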

Related

Get polygon from Azure Maps Search

I'm trying to use Azure.Maps.Search to give me a polygon for a result. For example, if I search for "Birmingham" I would like a result for that municipality with a collection of geopoints defining the boundary.
Is this possible?
var credential = new AzureKeyCredential("............");
var client = new MapsSearchClient(credential);

Response<SearchAddressResult> searchResult = await client.SearchAddressAsync(
    query: "Birmingham",
    options: new SearchAddressOptions
    {
        ExtendedPostalCodesFor = new SearchIndex[] { SearchIndex.PointAddresses },
        CountryFilter = new string[] { "GB" },
        Top = 1,
        EntityType = GeographicEntity.Municipality
    });
Yes, this is possible. The search address results will contain a DataSources.Geometry.Id value, which is the ID of the unique boundary geometry for that result. You can take this ID and pass it into the GetPolygonsAsync API in the Azure.Maps.Search NuGet package.
using Azure;
using Azure.Maps.Search;
using Azure.Maps.Search.Models;

namespace AzureMapsTest
{
    internal class Program
    {
        private const string MapsApiKey = "...........";

        static async Task Main(string[] args)
        {
            var credential = new AzureKeyCredential(MapsApiKey);
            var client = new MapsSearchClient(credential);

            SearchAddressOptions singleSearchResultOptions = new SearchAddressOptions { Top = 1 };

            Response<SearchAddressResult> searchResult =
                await client.SearchAddressAsync(
                    "Ealing, London, England",
                    singleSearchResultOptions);

            // Pass the boundary geometry ID from the search result into GetPolygonsAsync
            Response<PolygonResult> polygons =
                await client.GetPolygonsAsync(new string[] { searchResult.Value.Results[0].DataSources.Geometry.Id });

            Console.WriteLine(System.Text.Json.JsonSerializer.Serialize(polygons.Value.Polygons));
        }
    }
}

Extend Umbraco back office search to search custom properties not just title

What I would like to do is be able to search on a custom property within the back office search, e.g. put the ISBN into the search field and have the matching results shown. Currently it always returns "no items found", as the search only matches on the node title.
How do I enable the content search to also search the data in the custom fields?
The data is in the internal index; I have checked that the index is working and can see the result with "Examine Management" when I search on the custom data.
This is the solution I used to extend the search:
https://dev.to/skttl/how-to-customize-searching-in-umbraco-list-views-1knk
Add a new file in App_Code (SearchExtender):
using System.Linq;
using Examine;
using Umbraco.Core;
using Umbraco.Core.Cache;
using Umbraco.Core.Configuration;
using Umbraco.Core.Logging;
using Umbraco.Core.Models;
using Umbraco.Core.Persistence;
using Umbraco.Core.Persistence.DatabaseModelDefinitions;
using Umbraco.Core.PropertyEditors;
using Umbraco.Core.Services;
using Umbraco.Web;
using Umbraco.Web.Editors;
using Umbraco.Web.Models.ContentEditing;

namespace SearchExtender
{
    public class CustomListViewSearchController : ContentController
    {
        public CustomListViewSearchController(PropertyEditorCollection propertyEditors, IGlobalSettings globalSettings, IUmbracoContextAccessor umbracoContextAccessor, ISqlContext sqlContext, ServiceContext services, AppCaches appCaches, IProfilingLogger logger, IRuntimeState runtimeState, UmbracoHelper umbracoHelper)
            : base(propertyEditors, globalSettings, umbracoContextAccessor, sqlContext, services, appCaches, logger, runtimeState, umbracoHelper)
        {
        }

        public PagedResult<ContentItemBasic<ContentPropertyBasic>> GetChildrenCustom(int id, string includeProperties, int pageNumber = 0, int pageSize = 0, string orderBy = "SortOrder", Direction orderDirection = Direction.Ascending, bool orderBySystemField = true, string filter = "", string cultureName = "")
        {
            // get the parent node, and its doctype alias, from the content service
            var parentNode = Services.ContentService.GetById(id);
            var parentNodeDocTypeAlias = parentNode != null ? parentNode.ContentType.Alias : null;

            // if the parent node is not "books", redirect to the core GetChildren() method
            if (parentNode?.ContentType.Alias != "books")
            {
                return GetChildren(id, includeProperties, pageNumber, pageSize, orderBy, orderDirection, orderBySystemField, filter);
            }

            // if we can't get the InternalIndex, redirect to the core GetChildren() method, but log an error
            if (!ExamineManager.Instance.TryGetIndex("InternalIndex", out IIndex index))
            {
                Logger.Error<CustomListViewSearchController>("Couldn't get InternalIndex for searching products in list view");
                return GetChildren(id, includeProperties, pageNumber, pageSize, orderBy, orderDirection, orderBySystemField, filter);
            }

            // find children using Examine
            // create search criteria
            var searcher = index.GetSearcher();
            var searchCriteria = searcher.CreateQuery();
            var searchQuery = searchCriteria.Field("parentID", id);
            if (!filter.IsNullOrWhiteSpace())
            {
                searchQuery = searchQuery.And().GroupedOr(new[] { "nodeName", "isbn" }, filter);
            }

            // do the search, but limit the results to the current page 👉 https://shazwazza.com/post/paging-with-examine/
            // pageNumber is not zero indexed in this, so just multiply pageSize by pageNumber
            var searchResults = searchQuery.Execute(pageSize * pageNumber);

            // get the results on the current page
            // pageNumber is not zero indexed in this, so subtract 1 from the pageNumber
            var totalChildren = searchResults.TotalItemCount;
            var pagedResultIds = searchResults.Skip((pageNumber > 0 ? pageNumber - 1 : 0) * pageSize).Select(x => x.Id).Select(x => int.Parse(x)).ToList();
            var children = Services.ContentService.GetByIds(pagedResultIds).ToList();

            if (totalChildren == 0)
            {
                return new PagedResult<ContentItemBasic<ContentPropertyBasic>>(0, 0, 0);
            }

            var pagedResult = new PagedResult<ContentItemBasic<ContentPropertyBasic>>(totalChildren, pageNumber, pageSize);
            pagedResult.Items = children.Select(content =>
                    Mapper.Map<IContent, ContentItemBasic<ContentPropertyBasic>>(content))
                .ToList(); // evaluate now

            return pagedResult;
        }
    }
}
Then change requests for /umbraco/backoffice/UmbracoApi/Content/GetChildren (the default endpoint for child nodes) to point at the newly created one, which is located at /umbraco/backoffice/api/CustomListViewSearch/GetChildrenCustom.
This is done easily by adding a JS file containing an interceptor like this.
Add the file at /App_Plugins/CustomListViewSearch/CustomListViewSearch.js:
angular.module('umbraco.services').config([
    '$httpProvider',
    function ($httpProvider) {
        $httpProvider.interceptors.push(function ($q) {
            return {
                'request': function (request) {
                    // Redirect any requests for the listview to our custom list view UI
                    if (request.url.indexOf("backoffice/UmbracoApi/Content/GetChildren?id=") > -1)
                        request.url = request.url.replace("backoffice/UmbracoApi/Content/GetChildren", "backoffice/api/CustomListViewSearch/GetChildrenCustom");
                    return request || $q.when(request);
                }
            };
        });
    }]);
Finally, add a package.manifest file in the App_Plugins folder to load the JS file:
{
    "javascript": [
        "/App_Plugins/CustomListViewSearch/CustomListViewSearch.js"
    ]
}
If the node alias is not working, make sure it is set in the document type (far right of the document type name).

Bug in OrmLite - updating record with Primary Key = 0

Given a simple POCO:
public class Model
{
    [PrimaryKey]
    public int ID { get; set; }
    public string Description { get; set; }
}
this works fine ...
var connectionString = @"Data Source=WIN8PC\SQLEXPRESS;Initial Catalog=Test;Integrated Security=True;";
connectionFactory = new OrmLiteConnectionFactory(connectionString, SqlServerDialect.Provider);
using (var db = connectionFactory.OpenDbConnection())
{ db.DropAndCreateTable<Model>(); }
var model0 = new Model { ID = 0, Description = "Item Zero" };
var model1 = new Model { ID = 1, Description = "Item One" };
using (var db = connectionFactory.OpenDbConnection())
{ db.Save(model0, model1); }
as does this ...
model0.Description += " updated";
model1.Description += " updated";
using (var db = connectionFactory.OpenDbConnection())
{
    db.Save(model0);
    db.Save(model1);
}
however, this crashes with a primary key violation exception ...
model0.Description += " updated again";
model1.Description += " updated again";
using (var db = connectionFactory.OpenDbConnection())
{ db.Save(model0, model1); }
The record with ID zero is required, as this is a lookup table to replace an existing C# enum type. This is a local copy of distributed data (that I don't control), so there's no reason to have an auto-increment key.
The issue appears to be in OrmLiteWriteCommandExtensions.SaveAll() - any row with id == defaultValue is assumed to be a new item, rather than an update of an existing record. The same issue occurs in the parallel async methods too.
Is there any other way to get around this issue, other than by saving each record individually (inside a transaction)? It would be preferable to save all updated records for a table in one command.
Save is a high-level API that will INSERT or UPDATE based on whether or not the Primary Key has a value. If you want to insert a default Primary Key value, you can use Insert instead, as seen in this Live Example on Gistlyn:
public class Model
{
    [PrimaryKey]
    public int ID { get; set; }
    public string Description { get; set; }
}
db.DropAndCreateTable<Model>();
var model0 = new Model { ID = 0, Description = "Item Zero" };
var model1 = new Model { ID = 1, Description = "Item One" };
db.Insert(model0, model1);
var rows = db.Select<Model>();
"Inserted Rows: {0}".Print(rows.Dump());
Which outputs:
Inserted Rows: [
    {
        ID: 0,
        Description: Item Zero
    },
    {
        ID: 1,
        Description: Item One
    }
]

How to batch get items using servicestack.aws PocoDynamo?

With the native Amazon .NET lib, a batch get looks like this:
var batch = context.CreateBatchGet<MyClass>();
batch.AddKey("hashkey1");
batch.AddKey("hashkey2");
batch.AddKey("hashkey3");
batch.Execute();
var result = batch.Results;
Now I'm testing servicestack.aws, however I couldn't find how to do it. I've tried the following; both failed.
// 1st try
var q1 = db.FromQueryIndex<MyClass>(x => x.room_id == "hashkey1" || x.room_id == "hashkey2" || x.room_id == "hashkey3");
var result = db.Query(q1);

// 2nd try
var result = db.GetItems<MyClass>(new string[] { "hashkey1", "hashkey2", "hashkey3" });
In both cases, it threw an exception that says
Additional information: Invalid operator used in KeyConditionExpression: OR
Please help me. Thanks!
Using GetItems should work as seen with this Live Example on Gistlyn:
public class MyClass
{
    public string Id { get; set; }
    public string Content { get; set; }
}
db.RegisterTable<MyClass>();
db.DeleteTable<MyClass>(); // Delete existing MyClass Table (if any)
db.InitSchema(); // Creates MyClass DynamoDB Table
var items = 5.Times(i => new MyClass { Id = $"hashkey{i}", Content = $"Content {i}" });
db.PutItems(items);
var dbItems = db.GetItems<MyClass>(new[]{ "hashkey1","hashkey2","hashkey3" });
"Saved Items: {0}".Print(dbItems.Dump());
If your Item has both a Hash and a Range Key you'll need to use the GetItems<T>(IEnumerable<DynamoId> ids) API, e.g.:
var dbItems = db.GetItems<MyClass>(new[] {
    new DynamoId("hashkey1", "rangekey1"),
    new DynamoId("hashkey2", "rangekey3"),
    new DynamoId("hashkey3", "rangekey4"),
});
Query all Items with same HashKey
If you want to fetch all items with the same HashKey you need to create a DynamoDB Query as seen with this Live Gistlyn Example:
var items = 5.Times(i => new MyClass {
Id = $"hashkey{i%2}", RangeKey = $"rangekey{i}", Content = $"Content {i}" });
db.PutItems(items);
var rows = db.FromQuery<MyClass>(x => x.Id == "hashkey1").Exec().ToArray();
rows.PrintDump();

converting strings within Entity Framework query

The following query works perfectly fine and populates its dropdown list. The data in the database is stored in all uppercase, i.e. PALM BEACH. I want to convert it to proper case, which obviously I can do after the fact by iterating through the returned list and reformatting, but I should be able to do it within the query itself.
Dim citylist As List(Of String) = (From c In ctx.ziptaxes
                                   Where c.StateID = ddlStates.SelectedIndex
                                   Order By c.City Ascending
                                   Select c.City).ToList()
But if I try to convert it to something like this, it fails:
Dim citylist As List(Of String) = (From c In ctx.ziptaxes
                                   Where c.StateID = ddlStates.SelectedIndex
                                   Let cityname = StrConv(c.City, VbStrConv.ProperCase)
                                   Order By cityname Ascending
                                   Select cityname).ToList()
I've tried using CultureInfo and String.Format(c.City, vbProperCase) too, and nothing other than the original query works. Any help appreciated.
ADDENDUM:
Some further research is telling me that .NET calls like string conversion and CultureInfo cannot be used inside the query before it has run against the database, because LINQ to Entities cannot translate them to SQL. If that's the case, it explains why it isn't working. The following solves my problem, but I would still like to know if there is a way to do it within LINQ to EF.
Dim citylist As List(Of String) = (From c In ctx.ziptaxes
                                   Where c.StateID = ddlStates.SelectedIndex
                                   Order By c.City Ascending
                                   Select c.City).ToList()

If citylist.Count > 0 Then
    For i As Integer = 0 To citylist.Count - 1
        citylist(i) = StrConv(citylist(i).ToLower(), vbProperCase)
    Next
    With ddlCity
        .Items.Clear()
        .DataSource = citylist.Distinct()
        .DataBind()
        .Items.Insert(0, "Select a city")
        .SelectedIndex = 0
    End With
End If
You can do the conversion in your SELECT. Here's an example (with an over-simplified City name converter):
using System.Collections.Generic;
using System.Linq;
using NUnit.Framework;

namespace LinqQuestion
{
    [TestFixture]
    public class StackOverflowTests
    {
        private IEnumerable<City> _cities;

        [TestFixtureSetUp]
        public void Arrange()
        {
            _cities = new List<City>
            {
                new City { Id = 1, Name = "FLINT", StateId = 1 },
                new City { Id = 2, Name = "SAGINAW", StateId = 1 },
                new City { Id = 3, Name = "DETROIT", StateId = 1 },
                new City { Id = 4, Name = "FLint", StateId = 1 }
            };
        }

        [Test]
        public void TestCountryQuery()
        {
            var data = _cities
                .Where(c => c.StateId == 1)
                .OrderBy(c => c.Name)
                .Select(c => StrConv(c.Name))
                .Distinct().ToList();

            Assert.That(data.Count == 3);
        }

        private static string StrConv(string original)
        {
            var firstLetter = original.Substring(0, 1).ToUpper();
            var theRest = original.Substring(1, original.Length - 1).ToLower();
            return firstLetter + theRest;
        }
    }

    public class City
    {
        public int Id { get; set; }
        public int StateId { get; set; }
        public string Name { get; set; }
    }
}
