I have an Azure Table Storage table with the following entity:
public class SampleEntity : TableEntity
{
    public int EmployeeId { get; set; }
}
And I have inserted 100 records to the table.
Now the requirement has changed so that EmployeeId should be a string. Also, I should not delete the existing 100 records that were already inserted.
Hence I changed the existing SampleEntity as follows:
public class SampleEntity : TableEntity
{
    public string EmployeeId { get; set; }
}
Then I inserted 50 rows into the table with EmployeeId as a string.
Now when I do a Get operation on the table with the new SampleEntity (with string EmployeeId), I get 150 rows, but the EmployeeId values for the first 100 rows inserted using the old SampleEntity come back as 0.
On the other hand, if I switch to the old SampleEntity and do a Get operation, I get null EmployeeId values for the 50 rows inserted using the new SampleEntity.
How can I use the new SampleEntity to get all 150 rows with EmployeeId values as strings?
What you could possibly do is change the data type of the integer EmployeeId to string. To do that, fetch all entities as DynamicTableEntity and check the type of the EmployeeId property. If the type is Int32, create a new entity with a string EmployeeId property, set its value to the old entity's EmployeeId value, and then update the existing entity (keeping the same PartitionKey and RowKey).
See the sample code below:
//Get all entities and make sure we get them as DynamicTableEntity.
var query = new TableQuery();
var allEntities = table.ExecuteQuery(query);
foreach (var entity in allEntities)
{
    var propertyType = entity.Properties["EmployeeId"].PropertyType;
    if (propertyType == EdmType.Int32 && entity.Properties["EmployeeId"].Int32Value.HasValue)
    {
        //This entity has an integer EmployeeId. Update it with a new entity whose EmployeeId is a string.
        var employeeId = entity.Properties["EmployeeId"].Int32Value.Value;
        var newEntityWithStringType = new DynamicTableEntity()
        {
            PartitionKey = entity.PartitionKey,
            RowKey = entity.RowKey,
            ETag = "*"
        };
        newEntityWithStringType.Properties.Add("EmployeeId", new EntityProperty(employeeId.ToString()));
        TableOperation updateOperation = TableOperation.Replace(newEntityWithStringType);
        table.Execute(updateOperation);
    }
}
The code above assumes that EmployeeId is your only property. If there are more properties, please make sure to include them in newEntityWithStringType's Properties as well.
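For example, a hedged variation of the loop body that copies every existing property across before converting EmployeeId (untested sketch):
//Copy all existing properties as-is, then overwrite EmployeeId with its string representation.
var newEntityWithStringType = new DynamicTableEntity()
{
    PartitionKey = entity.PartitionKey,
    RowKey = entity.RowKey,
    ETag = "*"
};
foreach (var property in entity.Properties)
{
    newEntityWithStringType.Properties.Add(property.Key, property.Value);
}
newEntityWithStringType.Properties["EmployeeId"] =
    new EntityProperty(entity.Properties["EmployeeId"].Int32Value.Value.ToString());
table.Execute(TableOperation.Replace(newEntityWithStringType));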
How do you update a JSON(B) array value by index, and retrieve the index of each value in a JSONB array?
There is a ServiceStack OrmLite model:
public class Page
{
[AutoIncrement]
public long Id { get; set; }
[PgSqlJsonB]
public List<Widget> Widgets { get; set; }
}
For example, how to update the second item in Widgets list?
Here is an example of how to select array indexes and update an array value by index in raw Postgres SQL:
How to update objects inside JSONB
The idea is to select the array indexes with AutoQuery and update a particular JSONB array value knowing its index in the database array.
In OrmLite, complex types like List<Widget> are blobbed, so if you change the value in C# and save it, it will serialize the entire Widgets property to JSON and update the entire field.
If you want to use PostgreSQL's native functions to manipulate the column contents in a server-side query, you'd need to use the Custom SQL APIs, e.g.:
db.Execute("UPDATE page SET widgets = jsonb_set(widgets, ...) WHERE id = #id",
new { id });
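As an illustration only (untested), updating the second Widget (JSONB array index 1, zero-based) in place could look something along these lines, where widgetJson is assumed to hold the serialized Widget, e.g. via ServiceStack.Text's ToJson():
// Sketch: '{1}' targets the second array element (jsonb_set uses 0-based indexes)
var widgetJson = updatedWidget.ToJson();
db.Execute(
    "UPDATE page SET widgets = jsonb_set(widgets, '{1}', @widgetJson::jsonb) WHERE id = @id",
    new { id, widgetJson });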
I have a collection where I am storing each asset's latest location and the timestamp when it was recorded, using the following class:
public class TrackingInfo
{
[JsonProperty("id")]
public string Id { get; set; }
[JsonProperty("_partition_key")]
public string _PartitionKey { get; set; }
[JsonProperty("asset_id")]
public string AssetId { get; set; }
[JsonProperty("unix_timestamp")]
public double UnixTimestamp { get; set; }
[JsonProperty("timestamp")]
public string Timestamp { get; set; }
[JsonProperty("location")]
public Point Location { get; set; }
}
The collection is partitioned by _PartitionKey, which is constructed like this:
tracking._PartitionKey = $"tracking_{tracking.AssetId.ToLower()}_{DateTime.Today.ToString("D")}";
It looks like there is no way to do a GROUP BY on the collection.
Can someone please help me create a document SQL query to find the latest entry for each AssetId, along with its Location and the Timestamp when the data was recorded?
Update 1:
What if I change the _PartitionKey to represent one day, something like below:
tracking._PartitionKey = $"tracking_{DateTime.Today.ToString("D")}";
Would that make it easier to get all assets and their latest tracking records?
As per my comment, my suggestion would be to solve your problem differently.
Assumption: You have a large number of assetIds and don't know the values beforehand:
Have one document that represents the latest state of your asset
Have another document that represents the location events of your asset
Update the first document whenever there is a new location event
You can put both types of documents in the same collection or separate them - both approaches have benefits. I would probably separate them.
Then do a query "what assets are within 1km of xxx" (Querying spatial types)
Sidenote: It might be a good idea to use the assetId as the partition key instead of your combined key. Because the combined key also contains the date, most queries can't target a single partition and become expensive cross-partition queries.
If you only have very few assetIds, you can query each of them for its latest update by filtering on the assetId and ordering by the timestamp field descending; taking the top result returns only the most recent item.
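For example, with the DocumentDB SDK used elsewhere in this thread, a per-asset query might look roughly like this (the database/collection names and the asset id are placeholders; untested sketch):
// Sketch: latest tracking record for one known asset (TOP 1 + ORDER BY ... DESC)
// Requires System.Linq for AsEnumerable()/FirstOrDefault()
var latest = client.CreateDocumentQuery<TrackingInfo>(
        UriFactory.CreateDocumentCollectionUri("myDb", "tracking"),
        "SELECT TOP 1 * FROM c WHERE c.asset_id = 'asset-001' ORDER BY c.unix_timestamp DESC",
        new FeedOptions { EnableCrossPartitionQuery = true })
    .AsEnumerable()
    .FirstOrDefault();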
Cosmos DB doesn't support a GROUP BY feature; you could vote up the feature request.
As a workaround, there is a third-party package, documentdb-lumenize, which supports group by; it has a .NET example:
string configString = @"{
    cubeConfig: {
        groupBy: 'state',
        field: 'points',
        f: 'sum'
    },
    filterQuery: 'SELECT * FROM c'
}";
Object config = JsonConvert.DeserializeObject<Object>(configString);
dynamic result = await client.ExecuteStoredProcedureAsync<dynamic>("dbs/db1/colls/coll1/sprocs/cube", config);
Console.WriteLine(result.Response);
You could group by the asset_id column and take the max timestamp.
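For this question's data, the cubeConfig could be adapted along these lines (the property names are taken from the TrackingInfo class above; untested sketch):
string configString = @"{
    cubeConfig: {
        groupBy: 'asset_id',
        field: 'unix_timestamp',
        f: 'max'
    },
    filterQuery: 'SELECT * FROM c'
}";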
A POCO update in OrmLite executes SQL like this example:
(@P1 varchar(1043),@P2 varchar(6))
UPDATE table
SET FILEDATA=@P1
WHERE FILEID=@P2
But it leads to multiple query plans based on different @P1 and @P2 values with varying parameter lengths.
So, what's the best way to specify data types/lengths for parameterized queries in OrmLite, so that query plans are cached properly and multiple query plans due to variable parameter lengths are avoided?
Here's a similar situation with having variable length strings: https://dba.stackexchange.com/questions/216330/parameterized-query-creating-many-plans
Update
Here's an example:
Database Table
dbo.Users
Id (PK, int, not null)
Email (nvarchar(150), not null)
POCO
[Alias("Users")]
public class User
{
[PrimaryKey]
[AutoIncrement]
public int Id { get; set; }
public string Email { get; set; }
}
Code
int userId = 1;
User user;
// get User
using (var db = DbConn.OpenDbConnection())
{
user = db.SingleById<User>(userId);
}
// print User email (hi@example.com)
Console.WriteLine(user.Email);
// update User email
using (var db = DbConn.OpenDbConnection())
{
user.Email = "tester#example.org";
db.Update(User);
}
The update operation will result in an SQL query similar to the one I've posted at the top, with variable parameter lengths. This causes multiple query plans to be created by SQL Server. Ideally, the query should have fixed parameter lengths, so that a query plan can be created, cached and reused for the same operation (e.g. a User update) with varying parameter values (i.e. different emails).
The Size of string parameters is now being specified as of this commit, where it takes the default string size of the configured StringConverter. This change is available from v5.5.1, which is now available on MyGet.
If needed, this behavior can be overridden by replacing the String Converter and overriding InitDbParam().
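For instance, a minimal sketch of such an override (the fixed size of 8000 below is purely illustrative, not a recommendation):
using System;
using System.Data;
using ServiceStack.OrmLite;
using ServiceStack.OrmLite.Converters;

public class FixedSizeStringConverter : StringConverter
{
    public override void InitDbParam(IDbDataParameter p, Type fieldType)
    {
        base.InitDbParam(p, fieldType);
        p.Size = 8000; // always declare the same length so SQL Server can reuse the plan
    }
}

// Register it against the SQL Server dialect provider before opening connections:
SqlServerDialect.Provider.RegisterConverter<string>(new FixedSizeStringConverter());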
How can I set an auto-increment column's seed, e.g. to start from 1000 instead of 1?
In SQL Server I can do this using:
ADD Id INT NOT NULL IDENTITY(1000,1)
but in OrmLite the [AutoIncrement] attribute always seems to start at 1.
I also tried:
db.AlterTable<MSchool>("command"); // DROP ID AND ADD AUTO INCREMENT COLUMN
It works, but only if Id isn't related to any other table.
Can I set a column's auto-increment with a custom seed and increment, like this?
[AutoIncrement(1000,1)]
public int Id {get;set;}
UPDATE
Resolved, but not in a nice way:
public class School
{
[AutoIncrement]
public int Id {get;set;}
}
//then create table
db.CreateTable<School>();
//then update seed
db.ExecuteSql("DBCC CHECKIDENT ('School',reseed,1000)");
OR
[PostCreateTable("DBCC CHECKIDENT ('School',reseed,1000)")]
public class School : BaseModel
{
[AutoIncrement]
public int Id {get;set;}
}
Is there no easier way to do this?
Personally I don't believe this behavior belongs inside source code and would just modify the database out-of-band.
But if I were to add it in source code I'd do something like:
if (db.CreateTableIfNotExists<School>())
{
db.ExecuteSql("DBCC CHECKIDENT ('School',reseed,1000)");
}
So it only reseeds when the table has just been created.
Another option is to attach it to the model at runtime, so it's decoupled from the class definition:
typeof(School)
.AddAttributes(new PostCreateTableAttribute(
"DBCC CHECKIDENT ('School',reseed,1000)"));
I'm trying to persist an object into a MongoDB, using the following bit of code:
public class myClass
{
public string Heading { get; set; }
public string Body { get; set; }
}
static void Main(string[] args)
{
var mongo = MongoServer.Create();
var db = mongo.GetDatabase("myDb");
var col = db.GetCollection<BsonDocument>("myCollection");
var myinstance = new myClass();
col.Insert(myinstance);
var query = Query.And(Query.EQ("_id", new ObjectId("4df06c23f0e7e51f087611f7")));
var res = col.Find(query);
foreach (var doc in res)
{
var obj = BsonSerializer.Deserialize<myClass>(doc);
}
}
However, I get the exception 'Unexpected element: _id' when trying to deserialize the document.
So do I need to deserialize in another way? What is the preferred way of doing this?
TIA
Søren
You are searching for a given document using an ObjectId, but when you save an instance of MyClass you aren't providing an Id property, so the driver will create one for you (you can make any property the id by adding the [BsonId] attribute to it). When you retrieve that document you don't have an Id, so you get the deserialization error.
You can add the BsonIgnoreExtraElements attribute to the class as Chris said, but you should really add an Id property of type ObjectId to your class; you obviously need the Id (as you are using it in your query). As the _id property is reserved for the primary key and you are only ever going to retrieve a single document, you would be better off writing your query like this:
col.FindOneById(new ObjectId("4df06c23f0e7e51f087611f7"));
The fact that you are deserializing to an instance of MyClass once you retrieve the document lends itself to strongly typing the collection, so where you create an instance of the collection you can do this:
var col = db.GetCollection<MyClass>("myCollection");
so that when you retrieve the document using the FindOneById method, the driver will take care of the deserialization for you. Putting it all together (provided you add the Id property to the class), you could write:
var col = db.GetCollection<MyClass>("myCollection");
MyClass myClass = col.FindOneById(new ObjectId("4df06c23f0e7e51f087611f7"));
One final thing to note: since the _id property is created for you on save by the driver, if you were to leave it off your MyClass instance, every time you saved that document you would get a new Id and hence a new document. So if you saved it n times you would have n documents, which probably isn't what you want.
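For reference, the class with an explicit Id could look something like this (the C# driver maps a property named Id to _id by convention):
public class MyClass
{
    public ObjectId Id { get; set; }  // maps to the document's _id
    public string Heading { get; set; }
    public string Body { get; set; }
}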
A slight variation of Projapati's answer. First, Mongo will happily deserialize the id value to a property named Id, which is more C#-ish. But you don't necessarily need to do this if you are just retrieving data.
You can add [BsonIgnoreExtraElements] to your class and it should work. This allows you to return a subset of the data, which is great for queries and view-models.
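A minimal sketch of the class with the attribute applied (only the members you care about need to exist):
[BsonIgnoreExtraElements]
public class myClass
{
    public string Heading { get; set; }
    public string Body { get; set; }
}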
Try adding _id to your class.
This usually happens when your class doesn't have members for all fields in your document.
public class myClass
{
public ObjectId _id { get; set; }
public string Heading { get; set; }
public string Body { get; set; }
}