ServiceStack OrmLite: Is it an error to use UserAuthRepository.CreateUserAuth inside a transaction?

I have a complex workflow where I want to create rows in several tables in one transaction.
One of the operations is to create a new UserAuth (from ServiceStack Authentication feature).
I assume that all the database operations in a transaction should operate on the same connection, and if that is true, then I think it may be a problem to call UserAuthRepository.CreateUserAuth inside a transaction because it looks as if it uses its own connection.
So my question is whether the creation of a UserAuth will be part of the transaction when I have code like the example below, and if not, how to go about creating new users as part of a transaction.
using (var db = Db.OpenDbConnection()) {
    using (var trans = db.OpenTransaction()) {
        // ... do some database operations via db ...
        var userAuth = UserAuthRepository.CreateUserAuth(
            new UserAuth { UserName = "blabla" },
            "password"
        );
        // ... do some more database operations via db ...
        trans.Commit();
    }
}

Internally, whenever ServiceStack requires access to the database, e.g. in OrmLiteUserAuthRepository.CreateUserAuth, it asks for a new connection, uses it, and immediately disposes of it once it's done.
There is currently no way to make it part of a custom transaction.
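If the user row really must participate in the same transaction, one workaround is to bypass the repository for that insert and write the UserAuth row yourself on the same connection, hashing the password manually. A minimal sketch, assuming the default SaltedHash hash provider and the stock UserAuth POCO (verify the class and property names against your ServiceStack version):
using (var db = Db.OpenDbConnection())
using (var trans = db.OpenTransaction()) {
    // ... other operations via db ...

    // Hash the password the same way the auth repository would.
    // SaltedHash is ServiceStack's default IHashProvider; swap in your own if you configured one.
    string hash, salt;
    new SaltedHash().GetHashAndSaltString("password", out hash, out salt);

    // Insert the UserAuth row on the same connection so it joins the transaction.
    db.Insert(new UserAuth {
        UserName = "blabla",
        PasswordHash = hash,
        Salt = salt,
        CreatedDate = DateTime.UtcNow,
        ModifiedDate = DateTime.UtcNow
    });

    // ... more operations via db ...
    trans.Commit();
}
The obvious trade-off is that you are duplicating a small piece of the repository's logic, so it needs to be kept in sync if you upgrade ServiceStack or change the hash provider.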

Related

How to save MS bot state data and conversations?

I am using Node.js and the Microsoft Bot Builder SDK to write a bot. Currently, I am saving everything in data bags (conversationData, userData, etc.) and using Cosmos DB to store the state. I followed this article to configure Cosmos DB and made the corresponding changes for Node.js:
https://azure.microsoft.com/en-in/blog/bot-conversation-history-with-azure-cosmos-db/
There are a few concerns with this approach.
The conversationData bag is cleared when we call endConversation() in a dialog. That's expected by SDK design, but we would like to persist this data across multiple conversation flows with the same user (same conversation id). As it stands, the JSON document in Cosmos DB gets replaced with new conversationData keys whenever the user starts a new intent.
For example: "schedule a meeting with {name} for {day} at {place}".
We save conversationData.name, conversationData.day, and conversationData.place.
The same user then starts over: "schedule a meeting with {name2} for {day2} at {place2}".
The DocumentDB entry gets overwritten, so conversationData.name, conversationData.day, and conversationData.place now hold only the second set of values.
Ideally, we would like to keep everything.
Is there a better way to save chat history and the conversationData/userData data bags in an MS bot?
All storage implementations just implement getData and saveData. Internally they have a key:value store where the key is typically userId + conversationId, but you can make it whatever you want as long as you can reliably derive it from the arguments passed to getData and saveData.
Look at the Redis sample at https://github.com/suttna/botbuilder-redis-storage (specifically https://github.com/suttna/botbuilder-redis-storage/blob/master/src/storage.ts) for an example storage implementation that's pretty easy to follow.
You would use a custom implementation like this:
// Create an instance of your custom storage
var storage = new YourStorage();
// this is just here for the sake of initializing a `bot`
var connector = new builder.ChatConnector();
var bot = new builder.UniversalBot(connector);
// Configure bot to use YourStorage
bot.set('storage', storage);
bot.set('persistConversationData', true);
storage is just an object that implements
public saveData(context: IBotStorageContext, data: IBotStorageData, callback?: (err: Error) => void)
public getData(context: IBotStorageContext, callback: (err: Error, data: IBotStorageData) => void)
I totally just copied those signatures from the linked redis module, but they are the same in the BotBuilder source for the default storage - https://github.com/Microsoft/BotBuilder/blob/5cf71c742f27d89e9dbc4076f850122bd6edac11/Node/calling/src/storage/BotStorage.ts
The samples are in TypeScript. If you are unfamiliar with it, just ignore the bit right after each ":", which indicates the type of a thing.
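To make the shape of that interface concrete, here is a minimal in-memory sketch of YourStorage. The context fields (userId, conversationId, persistUserData, persistConversationData) follow the IBotStorageContext definition linked above; the key scheme and the plain-object store are purely illustrative, so swap in Cosmos DB or Redis calls as needed.
// Minimal in-memory storage sketch implementing getData/saveData.
// Replace the `store` object with calls to your real persistence layer.
var store = {};

function YourStorage() {}

YourStorage.prototype.getData = function (context, callback) {
  var data = {};
  if (context.userId && context.persistUserData) {
    data.userData = store['user:' + context.userId] || null;
  }
  if (context.userId && context.conversationId) {
    data.privateConversationData =
      store['private:' + context.conversationId + ':' + context.userId] || null;
  }
  if (context.persistConversationData && context.conversationId) {
    data.conversationData = store['conversation:' + context.conversationId] || null;
  }
  callback(null, data);
};

YourStorage.prototype.saveData = function (context, data, callback) {
  if (context.userId && context.persistUserData) {
    store['user:' + context.userId] = data.userData;
  }
  if (context.userId && context.conversationId) {
    store['private:' + context.conversationId + ':' + context.userId] =
      data.privateConversationData;
  }
  if (context.persistConversationData && context.conversationId) {
    store['conversation:' + context.conversationId] = data.conversationData;
  }
  if (callback) { callback(null); }
};
Because the key derivation is entirely up to you, this is also where you could append a timestamp or message id to the key instead of overwriting, which addresses the "keep everything" requirement from the question.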

User specific database in MongoDB

I am currently working on inventory management software in Node.js and MongoDB. I am pretty new to MongoDB, having worked in Oracle and MySQL for most of my projects.
Is it possible to create a separate database schema for every client who uses my software, with each client having access only to his copy of the database schema and collections?
The equivalent of selecting data in an Oracle database would be:
Select * from User1.table
Select * from User2.table
etc.
Also, if it is possible, how would it be implemented using a Node.js MongoDB client like Mongoose?
I looked at MongoDB documentation, but it talks mainly about adding users to a database for authorization.
I apologize if this seems like a silly question, but I'd appreciate it if someone could point me in the right direction.
Before starting to invest a lot of time in the development of your project, check out other possible approaches to the scenario that you are trying to build.
I did a quick search on SO and found some additional threads with similar scenarios:
MongoDB Database vs. Collection
MongoDB Web App - Database per User
Additional info about mongoose database creation
Whenever you call the connect method on the mongoose object, you are either connecting to an existing database or you are creating it in case it doesn't already exist.
You could have a function that allows you to pass in a name argument and create databases programmatically:
function createDatabase(name) {
    var conn_string = 'mongodb://localhost/';
    if (typeof name == 'string') {
        conn_string += name;
    } else {
        return false;
    }
    mongoose.connect(conn_string);
}
Also, be aware that a database will be created when you first insert a record in a collection of that particular database.
It is not sufficient to only connect to the database; you also have to insert a record.
Building on the previous example, you could also pass a schema parameter to the function, tailored to each user's profile, and fire an insert statement after you connect to that database.
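For illustration, here is a minimal sketch that connects to a per-client database and immediately writes a first document so the database actually gets created. The Item schema and collection name are made up for this example.
// Sketch: create a per-client database by connecting to it and inserting
// a first document. The Item model is illustrative only.
var mongoose = require('mongoose');

function createDatabaseFor(clientName, callback) {
    var conn = mongoose.createConnection('mongodb://localhost/' + clientName);
    var Item = conn.model('Item', new mongoose.Schema({ name: String, qty: Number }));

    // The database appears in MongoDB once this first document is written.
    Item.create({ name: 'first item', qty: 1 }, function (err, doc) {
        callback(err, conn);
    });
}
Using mongoose.createConnection (rather than mongoose.connect) also lets you hold one connection per client database side by side, which fits the one-database-per-client scenario better than a single global connection.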

Architecting CouchDB with nodejs

Does anyone have any experience with CouchDB where a real DAL was utilized? CouchDB is not like any other datastore out there, especially due to its notion of views, which add an interesting dynamic to the separation of data and business logic... not to mention revision-controlling the application source code.
Side Note: Libraries like Nano are not a DAL. They are akin to a database driver. Using Nano directly from business logic would tie the application to CouchDB. Not what I want. Instead my custom made DAL uses Nano as a driver, but separates the business logic from Nano completely.
Question: any best practices or documents I should read? Any existing DALs that can switch between MongoDB & CouchDB for common things (to act as a starting point for what I am trying to do)?
You may want to check out resourceful (https://github.com/flatiron/resourceful); it has support for several data adapters, including MongoDB and CouchDB.
Here is a simple use case:
var resourceful = require('resourceful');

var Creature = resourceful.define('creature', function () {
  //
  // Specify a storage engine
  //
  this.use('couchdb');

  //
  // Specify some properties with validation
  //
  this.string('diet');
  this.bool('vertebrate');
  this.array('belly');

  //
  // Specify timestamp properties
  //
  this.timestamps();
});

//
// Now that the `Creature` prototype is defined
// we can add custom logic to be available on all instances
//
Creature.prototype.feed = function (food) {
  this.belly.push(food);
};
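For completeness, a short usage sketch of the model defined above; the create/save calls follow the resourceful README, so verify them against the version you install. Switching the engine passed to this.use(...) is all that is needed to move the same model between CouchDB and MongoDB.
// Usage sketch: create an instance, use the custom method, and persist it.
Creature.create({ diet: 'carnivore', vertebrate: true }, function (err, wolf) {
  if (err) { return console.error(err); }
  wolf.feed('rabbit');
  wolf.save(function (err) {
    if (err) { return console.error(err); }
    console.log('saved creature', wolf.id);
  });
});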

Plugin bypassed when Entity queried from console application

My plugin encrypts/decrypts a field. It works on the field within a CRM form.
From my console application, a retrieve bypasses my plugin, i.e. it retrieves the encrypted value directly from the database without running the plugin. When debugging, breakpoints in the plugin are hit when the field is accessed from a form, but they are not hit when it is accessed from my console program.
I'm surprised that my plugin isn't invoked from a program. It bypasses my business rules.
Here is how I'm accessing the entity and the field from a program:
private static OrganizationServiceProxy service = null;
private static OrganizationServiceContext orgSvcContext = null;

public static void RetrieveSSNs()
{
    var query = orgSvcContext.CreateQuery("bpa_consumer");
    foreach (Entity consumer in query)
    {
        if (consumer.Attributes.Contains("bpa_ssn"))
        {
            string ssn = consumer["bpa_ssn"].ToString();
            Console.WriteLine(string.Format("Consumer \"{0}\" has SSN {1}", consumer.Attributes["bpa_name"], ssn));
        }
        else
        {
            Console.WriteLine(string.Format("Consumer \"{0}\" doesn't have a SSN", consumer.Attributes["bpa_name"]));
        }
    }
}
I'm guessing you have the plugin registered on the Retrieve message? If so, add another identical registration on RetrieveMultiple. This should get your plugin to execute on your foreach. I should warn you, though, that this is an extremely dangerous thing to do from a performance standpoint...
If you are concerned about performance my recommendation is to put the encrypted data into a separate entity with a lookup back. Using this method CRM only has to execute the Retrieve/RetrieveMultiple plug-in when a user needs to access the encrypted data, not every time a user accesses the primary record. This will also make it easier to secure the encrypted data.
It turns out that you must register your plugin for the RetrieveMultiple message when you query for a collection of entities.
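For reference, a minimal sketch of a post-operation plugin registered on both the Retrieve and RetrieveMultiple messages of bpa_consumer. The Decrypt helper is a placeholder for the existing encryption routine, not part of the CRM SDK.
// Sketch: decrypt bpa_ssn on the way out for both Retrieve and RetrieveMultiple.
public class SsnDecryptPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)
            serviceProvider.GetService(typeof(IPluginExecutionContext));

        if (context.MessageName == "Retrieve" &&
            context.OutputParameters.Contains("BusinessEntity"))
        {
            // Single-record retrieve: one Entity in the output parameters.
            DecryptSsn((Entity)context.OutputParameters["BusinessEntity"]);
        }
        else if (context.MessageName == "RetrieveMultiple" &&
                 context.OutputParameters.Contains("BusinessEntityCollection"))
        {
            // Query results: an EntityCollection in the output parameters.
            var results = (EntityCollection)context.OutputParameters["BusinessEntityCollection"];
            foreach (var entity in results.Entities)
            {
                DecryptSsn(entity);
            }
        }
    }

    private static void DecryptSsn(Entity entity)
    {
        if (entity.Attributes.Contains("bpa_ssn"))
        {
            entity["bpa_ssn"] = Decrypt((string)entity["bpa_ssn"]);
        }
    }

    private static string Decrypt(string value)
    {
        // Placeholder: call your real decryption logic here.
        return value;
    }
}
As noted above, the RetrieveMultiple step runs for every query against the entity, so the performance warning applies; keeping the encrypted value on a separate related entity limits how often this code has to run.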

Connections with many databases

We have a webapp where each client has their own db (approx. 700 at the moment).
In SubSonic 2, you had to wrap each call with the SharedDBConnectionScope passing in the right connection string to use, otherwise you ran the risk of one thread or client getting data from another thread or client.
In SubSonic3 is this still needed? Do I need to wrap the calls like I did in 2.x?
There are easy ways of switching the database now, but do I still have thread issues or can I do away with the call to SharedDBConnectionScope?
SubSonic 3 greatly improved the way to create a provider, either from scratch or from just a name and a connection string:
Some Examples:
// Linq Templates:
var db = new YourDB("connectionstring goes here", "System.Data.SqlClient");

// SimpleRepository without app.config
IDataProvider provider = SubSonic.DataProviders.ProviderFactory.GetProvider(
    connectionString: "Server=localhost;Database=clientdb;Uid=root;",
    providerName: "MySql.Data.MySqlClient"
);
IRepository repository = new SimpleRepository(provider,
    SimpleRepositoryOptions.RunMigrations);
So basically you can create a provider or repository each time a client connects and use this in your class.
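For example, a minimal per-client sketch along the lines of the snippet above; GetConnectionStringFor is a hypothetical lookup against wherever you keep the ~700 client connection strings.
// Sketch: build a repository against the requesting client's own database,
// instead of relying on a shared/ambient connection scope as in SubSonic 2.
public static IRepository GetRepositoryFor(string clientId)
{
    // Placeholder: resolve the client's connection string from your own registry.
    string connectionString = GetConnectionStringFor(clientId);

    IDataProvider provider = SubSonic.DataProviders.ProviderFactory.GetProvider(
        connectionString,
        "System.Data.SqlClient");

    return new SimpleRepository(provider, SimpleRepositoryOptions.RunMigrations);
}
Since each request resolves its own provider, there is no shared connection state for another thread or client to pick up, which is the thread-safety concern the question raises.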
