Neo4jClient: decimal point dropped

I'm having trouble using properties of type decimal in my program, which uses Neo4jClient to interact with my Neo4j server.
using System;
using Neo4jClient;

namespace TestProject
{
    class Program
    {
        static void Main()
        {
            var graphClient = new GraphClient(new Uri("http://localhost:7474/db/data"));
            graphClient.Connect();

            const decimal socialFactor = 0.5m;
            var userIn = new User { Name = "John Doe", SocialFactor = socialFactor };
            var nodeRef = graphClient.Create(userIn);
            var userOut = graphClient.Get(nodeRef).Data;

            Console.WriteLine(userOut.Name + ", " + userOut.SocialFactor);
            Console.WriteLine(userOut.SocialFactor == socialFactor ? "win" : "fail!");
        }
    }

    class User
    {
        public string Name { get; set; }
        public decimal SocialFactor { get; set; }
    }
}
The program inserts a user node with a social factor of 0.5 into Neo4j (userIn) and immediately retrieves it again (userOut). The problem is that userOut.SocialFactor is 5 instead of 0.5!
Strangely enough, when I change the type to double, there is no problem, but that's obviously something I don't want to do.
Is this a bug? Is there some kind of workaround?
Thanks,
Jan

Turns out that yes, it was a bug, but it was recently fixed in version 1.0.0.595 of Neo4jClient.
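If you are stuck on an earlier build, one possible workaround is to persist a double-backed shadow property and keep the decimal out of serialization. This is only a sketch: it assumes Neo4jClient serializes with Newtonsoft.Json (so [JsonIgnore] is honored) and accepts the precision loss of a double.

using Newtonsoft.Json;

class User
{
    public string Name { get; set; }

    // Persisted as a double to sidestep the decimal round-trip bug
    public double SocialFactorStored { get; set; }

    // Convenience accessor for callers; excluded from serialization
    [JsonIgnore]
    public decimal SocialFactor
    {
        get { return (decimal)SocialFactorStored; }
        set { SocialFactorStored = (double)value; }
    }
}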

Related

Importing Thousands of Records Into Acumatica via SOAP Contract-Based API

I’m using contract-based SOAP APIs to try to import about 25,000 Journal Entry lines from a banking system into a single Acumatica GL batch.
If I try to add all the records at once to the same GL batch, my request times out after a few hours. Since it uses the same GL batch, this approach cannot leverage multi-threading.
I've also tried adding the 25,000 lines one at a time to a single GL batch. The requests do not time out, but performance starts degrading significantly after roughly 3,000 records have been added to the batch, and the process takes several hours to run. Again, since it uses the same GL batch, this solution does not leverage multi-threading.
I also looked into multi-threading to import the data into several smaller GL batches of 5,000 lines each. That works without any timeout issues, but it still takes about an hour and a half to run. Moreover, the customer does not accept this multi-batch approach; they want all their daily data in a single GL batch.
25,000 records does not seem like a lot to me, so I wonder whether Acumatica's APIs were simply not built for this volume of lines in a single transaction. All I'm doing in my code is building the entity info by reading a text file and then calling the Put method to create the GL batch using that entity with 25,000 line records.
I've read a couple of articles about optimizing the APIs, but they primarily deal with different instances of an entity, such as several different GL batches or several different Stock Items. In those cases multi-threading is a great asset, because you can have multiple threads creating multiple "different" GL batches, but multi-threading is not helpful when updating the same GL batch.
Here's what I've read so far:
https://asiablog.acumatica.com/2016/12/optimizing-large-import.html
https://adn.acumatica.com/blog/contractapioptimization/
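For what it's worth, the multi-batch pattern those articles describe looks roughly like the sketch below. The helper names are hypothetical, and each thread gets its own SOAP client instance, since sharing one client across threads is unsafe:

// Hypothetical sketch: split the lines into chunks and create one GL batch per chunk in parallel.
var chunks = journalEntries
    .Select((je, i) => new { je, i })
    .GroupBy(x => x.i / 5000, x => x.je)
    .Select(g => g.ToList())
    .ToList();

Parallel.ForEach(chunks, new ParallelOptions { MaxDegreeOfParallelism = 4 }, chunk =>
{
    var client = CreateSoapClient();   // hypothetical per-thread client factory
    CreateMultipleLinesPerJournalEntryBatchContractTEST(client, chunk);
});

But as noted above, this buys nothing when everything must land in the same GL batch.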
I'm at a loss here, so any pointers would be greatly appreciated.
I look forward to your response.
Here's my code:
public static void CreateMultipleLinesPerJournalEntryBatchContractTEST(MyStoreContract.DefaultSoapClient soapClient, List<JournalEntry> journalEntries)
{
    string myModuleForBatchLookup = "GL";

    // List holding the values of all the records belonging to the batch in process
    List<JournalEntry> allBatchItems = journalEntries;

    // List used to store objects in the format required by Acumatica
    List<MyStoreContract.JournalTransactionDetail> myJournalTransactionsFormatted = new List<MyStoreContract.JournalTransactionDetail>();

    try
    {
        // Create a header and return a batch number to be used for all line iterations
        JournalEntry myHeaderJournalEntryContract = allBatchItems.First();
        string myBatchNumberToProcess = AddGLBatchHeaderContractTEST(soapClient, myHeaderJournalEntryContract);

        // Process the n items defined in the sub-batch size, or the remaining items if fewer
        foreach (JournalEntry je in allBatchItems)
        {
            // Move the items from the original unformatted list to the formatted list one at a time
            myJournalTransactionsFormatted.Add(new MyStoreContract.JournalTransactionDetail
            {
                BranchID = new MyStoreContract.StringValue { Value = je.Branch },
                Account = new MyStoreContract.StringValue { Value = je.Account },
                Subaccount = new MyStoreContract.StringValue { Value = je.Subaccount },
                ReferenceNbr = new MyStoreContract.StringValue { Value = je.RefNumber },
                DebitAmount = new MyStoreContract.DecimalValue { Value = je.DebitAmount },
                CreditAmount = new MyStoreContract.DecimalValue { Value = je.CreditAmount },
                TransactionDescription = new MyStoreContract.StringValue { Value = je.TransactionDescription },
                UsrTransactionTime = new MyStoreContract.StringValue { Value = je.UsrTransactionTime },
                UsrTransactionType = new MyStoreContract.StringValue { Value = je.UsrTransactionType },
                UsrTranSequence = new MyStoreContract.StringValue { Value = je.UsrTranSequence },
                UsrTellerID = new MyStoreContract.StringValue { Value = je.UsrTellerID }
            });
        }

        // Specify the values of a new Journal Entry using all the collected elements from the batch (list) created
        MyStoreContract.JournalTransaction journalToBeCreated = new MyStoreContract.JournalTransaction
        {
            // BatchNbr and Module are the two fields used to look up the batch that needs updating
            BatchNbr = new MyStoreContract.StringSearch { Value = myBatchNumberToProcess },
            Module = new MyStoreContract.StringSearch { Value = myModuleForBatchLookup },
            // This line adds the array containing all the line details
            Details = myJournalTransactionsFormatted.ToArray()
        };

        soapClient.Put(journalToBeCreated);

        Console.WriteLine("Added " + allBatchItems.Count.ToString() + " line transactions");
        Console.WriteLine();
        Console.WriteLine("Press any key to continue");
        Console.ReadLine();
    }
    catch (Exception e)
    {
        Console.WriteLine("The following error was encountered and all entries for this batch need to be logged in the error table");
        Console.WriteLine();
        Console.WriteLine(e.Message);
        Console.WriteLine();
        Console.WriteLine("Press any key to continue");
        Console.ReadLine();
    }
}
public static string AddGLBatchHeaderContractTEST(MyStoreContract.DefaultSoapClient soapClient, JournalEntry je)
{
    try
    {
        // Specify the values of a new Journal Entry batch header
        MyStoreContract.JournalTransaction journalToBeCreated = new MyStoreContract.JournalTransaction
        {
            BranchID = new MyStoreContract.StringValue { Value = "PRODWHOLE" },                                 // The default branch
            TransactionDate = new MyStoreContract.DateTimeValue { Value = je.TransactionDate.AddDays(-1) },     // Reduced 1 day from the batch
            CurrencyID = new MyStoreContract.StringValue { Value = je.CurrencyCode },                           // Currency to be used for the batch
            Description = new MyStoreContract.StringValue { Value = je.TransactionDescription },
            Hold = new MyStoreContract.BooleanValue { Value = true }
        };

        // Create a Journal Entry with the specified values
        MyStoreContract.JournalTransaction newJournalTransaction = (MyStoreContract.JournalTransaction)soapClient.Put(journalToBeCreated);
        string myBatchToProcess = newJournalTransaction.BatchNbr.Value;
        return myBatchToProcess;
    }
    catch (Exception e)
    {
        Console.WriteLine("Error was caught while trying to create the header for the batch...");
        Console.WriteLine();
        Console.WriteLine(e);
        Console.WriteLine();
        return null;
    }
}
My custom class for the legacy system's line items, which I then need to format into Acumatica's format:
class JournalEntry
{
    public DateTime TransactionDate { get; set; }
    public string CurrencyCode { get; set; }
    public string Description { get; set; }
    public string Branch { get; set; }
    public string Account { get; set; }
    public string Subaccount { get; set; }
    public string RefNumber { get; set; }
    public decimal DebitAmount { get; set; }
    public decimal CreditAmount { get; set; }
    public string TransactionDescription { get; set; }

    // Added custom fields for customer
    public string UsrTellerID { get; set; }
    public string UsrTransactionType { get; set; }
    public string UsrTransactionTime { get; set; }
    public string UsrTranSequence { get; set; }

    // Original file data for the line
    public string FileLineData { get; set; }
}
I tried Yuriy's approach described below, but my custom fields are not updating; only standard fields are. Which command should I use to update the extension (custom) fields? See code below:
// Here I create an instance of GLTran
GLTran row = graph.GLTranModuleBatNbr.Cache.CreateInstance() as GLTran;

// Here I get a handle to the graph extension GLTranExt to be able to use the added fields
var rowExt = row.GetExtension<GLTranExt>();

row = graph.GLTranModuleBatNbr.Insert(row);
graph.GLTranModuleBatNbr.Cache.SetValueExt(row, "AccountID", JE.Account);
graph.GLTranModuleBatNbr.Cache.SetValueExt(row, "SubID", JE.Subaccount);
row.TranDesc = "my line description";
row.Qty = 1.0m;
row.CuryDebitAmt = (JE.DebitAmount);
row.CuryCreditAmt = (JE.CreditAmount);
rowExt.UsrTellerID = "Test teller";
rowExt.UsrTransactionTime = "Test Transaction Time";
rowExt.UsrTransactionType = "Test Transaction Type";
rowExt.UsrTranSequence = "Test Transaction Sequence";
row = graph.GLTranModuleBatNbr.Update(row);
graph.Actions.PressSave();
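A likely cause (my reading, not confirmed in the thread): the extension handle is fetched from the row created by CreateInstance(), but Insert() returns a different instance held by the cache, so the values set on rowExt never reach the inserted row. A minimal sketch of the reordering that usually fixes this:

// Sketch: fetch the extension from the row instance that Insert() returns
GLTran row = graph.GLTranModuleBatNbr.Cache.CreateInstance() as GLTran;
row = graph.GLTranModuleBatNbr.Insert(row);

// Now the extension belongs to the cached row, so its values are persisted
var rowExt = row.GetExtension<GLTranExt>();
rowExt.UsrTellerID = "Test teller";

row = graph.GLTranModuleBatNbr.Update(row);
graph.Actions.PressSave();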
In a multi-threaded import of Sales Orders I got 18,000 lines per hour (4 cores, 32 GB RAM), so your 25,000 is very similar to what I got (one Sales Order had 1-6 lines). For the second link that you provided, what were the parameters of your API call, and what was the configuration of your Acumatica instance (CPU, RAM, SQL Server parameters)?
I propose that you consider scaling Acumatica horizontally, and also scaling your database via SQL sharding.
Edit
If you need to have one GL batch with 25,000 lines on it, then I propose the following workaround:
1. Create one more Acumatica page that has a text box and an Import button.
2. In the code of the Import button:
   2.1 Read the text box contents as XML (or JSON).
   2.2 Create an instance of the GL graph.
   2.3 Insert the needed number of lines (in your case 25,000) via the graph.
   2.4 Call graph.Actions.PressSave().
3. Send your Web API request not to the GL Batches screen but to the page you created.
I know this is an old question, but I am answering here for the benefit of anyone who stumbles across this page. There's a performance issue in the Journal Transaction screen where the time to create a transaction increases non-linearly with the number of rows to insert.
A customization-based workaround was provided to us by Acumatica support, which significantly improved performance. I don't have the exact version where the fix was incorporated into the product, but builds newer than September 2021 should already include it.
Customization fix:
using System;
using System.Collections;
using System.Collections.Generic;
using System.Linq;
using PX.Api;
using PX.Data;
using PX.Common;
using PX.Objects.Common;
using PX.Objects.Common.Extensions;
using PX.Objects.CS;
using PX.Objects.CM;
using PX.Objects.CA;
using PX.Objects.Common.Bql;
using PX.Objects.Common.GraphExtensions.Abstract;
using PX.Objects.Common.GraphExtensions.Abstract.DAC;
using PX.Objects.Common.GraphExtensions.Abstract.Mapping;
using PX.Objects.GL.DAC;
using PX.Objects.GL.FinPeriods;
using PX.Objects.GL.JournalEntryState;
using PX.Objects.GL.JournalEntryState.PartiallyEditable;
using PX.Objects.GL.Overrides.PostGraph;
using PX.Objects.GL.Reclassification.UI;
using PX.Objects.PM;
using PX.Objects.TX;
using PX.Objects.Common.Tools;
using PX.Objects.GL.DAC.Abstract;
using PX.Objects.Common.EntityInUse;
using PX.Objects.GL.FinPeriods.TableDefinition;
using PX.Data.SQLTree;
using PX.Objects.CR;
using PX.Data.BQL.Fluent;
using PX.Data.BQL;
using PX.Objects;
using PX.Objects.GL;

namespace PX.Objects.GL
{
    public class JournalEntry_Extension : PXGraphExtension<JournalEntry>
    {
        public delegate void PopulateSubDescrDelegate(PXCache sender, GLTran Row, Boolean ExternalCall);

        [PXOverride]
        public void PopulateSubDescr(PXCache sender, GLTran Row, Boolean ExternalCall, PopulateSubDescrDelegate baseMethod)
        {
            // Skip the per-row subaccount description lookup during import/export/API calls
            if (Base.IsImport || Base.IsExport || Base.IsContractBasedAPI)
            {
                return;
            }

            baseMethod(sender, Row, ExternalCall);
        }
    }
}

ArangoDB .NET driver performance

I am trying out the new reimplemented ArangoDB-NET driver: https://github.com/yojimbo87/ArangoDB-NET/tree/reimplement. Today was the first time I tested its performance. When I used arangosh to perform the inserts, it could insert about 5,000 records per second. However, when I used the .NET driver to perform the same insertion, it took about 2 minutes. May I know what I have done wrong? Thanks.
[EDIT] Completing the question with the GitHub discussion.
I have tested the code below with my arangosh:
count = 1;
startTime = +new Date();
console.log(startTime);

while (count <= 10000)
{
    db.someCollection.save({
        "Id": "1234567890123456789012345678901234",
        "Key": 1234567,
        "Revision": 1234567,
        "Name": "Mohamad Abu Bakar",
        "IC Number": "1234567-12-3444",
        "Department": "IT Department",
        "Height": 1234,
        "DateOfBirth": "2015-01-27 03:33",
        "Salary": 3333
    });
    count++;
}

endTime = +new Date();
console.log(endTime);
console.log("Total time taken:" + (endTime - startTime)/1000);
It took 3.375 seconds to complete the operation.
I do a similar thing with the .NET driver and it took almost 9.5797819 seconds, almost triple the arangosh time. Here's the code in .NET:
public static void TestArangoDb()
{
    //ASettings.AddConnection("_system", "127.0.0.1", 8529, false, "_system");
    //var db = new ADatabase("_system");
    //db.Create("trial_sample");

    ASettings.AddConnection("trial_sample", "127.0.0.1", 8529, false, "trial_sample");
    var db2 = new ADatabase("trial_sample");
    db2.Collection.Create("someCollection");

    DateTime startTime = DateTime.Now;
    Console.WriteLine("Start Time: " + startTime.ToLongTimeString());

    for (int count = 1; count <= 10000; count++)
    {
        var employee = new Employee();
        employee.Id = "1234567890123456789012345678901234";
        employee.Key = "1234567";
        employee.Revision = "1234567";
        employee.Name = "Mohamad Abu Bakar";
        employee.IcNumber = "1234567-12-3444";
        employee.Department = "IT Department";
        employee.Height = 1234;
        employee.DateOfBirth = new DateTime(2015, 1, 27, 3, 33, 3);
        employee.Salary = 3333;

        var result = db2.Document.Create<Employee>("someCollection", employee);

        //var updateDocument = new Dictionary<string, object>()
        //    .String("DocumentId", "SomeId");
        //db2.Document.Update(result.Value.String("_id"), updateDocument);
    }

    DateTime endTime = DateTime.Now;
    TimeSpan duration = endTime - startTime;
    Console.WriteLine("End Time: " + endTime.ToLongTimeString());
    Console.WriteLine("Total time taken: " + duration.TotalSeconds);
}
public class Employee
{
    public string Id { get; set; }
    public string Key { get; set; }
    public string Revision { get; set; }
    public string Name { get; set; }
    public string IcNumber { get; set; }
    public string Email { get; set; }
    public string Department { get; set; }
    public double Height { get; set; }
    public DateTime DateOfBirth { get; set; }
    public decimal Salary { get; set; }
}
If I uncomment:

var updateDocument = new Dictionary<string, object>()
    .String("DocumentId", "SomeId");
db2.Document.Update(result.Value.String("_id"), updateDocument);

the run slows down dramatically: it took 99.8789133 seconds to complete, almost 30 times the arangosh time. All the extra code does is perform an additional update to add one more field.
Could you point out the problem in the code above? Thanks.
yojimbo87 researched the issue more deeply; testing the different layers uncovered the problem.
Merge request #32 improves performance when creating, updating and replacing documents/edges from generic objects by ~57%.
On a local machine, single document creation with the given Employee example object now takes on average ~0.4 ms per document in a 10k iteration loop using the driver. A raw .NET HTTP request (without any ArangoDB driver abstraction) takes ~0.35 ms in the same loop. The difference comes from the conversion of the generic object into a Dictionary, which is needed for attribute processing (such as IgnoreField, IgnoreNullValue and AliasField).
The NuGet package was updated to reflect this improvement.
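Accordingly, if those attributes aren't needed, it may be possible to skip the conversion cost by building the document as a Dictionary directly. This is only a sketch: it uses the fluent .String() helper shown in the question's own code, falls back to plain indexer assignments for non-string fields, and assumes a Dictionary overload of Document.Create exists in this driver version:

// Hypothetical sketch: hand the driver a Dictionary to avoid
// the object-to-Dictionary reflection step identified above.
var doc = new Dictionary<string, object>()
    .String("Name", "Mohamad Abu Bakar")
    .String("IC Number", "1234567-12-3444")
    .String("Department", "IT Department");
doc["Height"] = 1234;
doc["Salary"] = 3333m;

// Assumed overload taking a Dictionary<string, object> document
var result = db2.Document.Create("someCollection", doc);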

ServiceStack ORMLite

We are migrating our SProc-based solution over to ORMLite, and so far it has been pretty painless. Today I wrote the following method:
public AppUser GetAppUserByUserID(int app_user_id)
{
    var dbFactory = new OrmLiteConnectionFactory(this.ConnectionString, SqlServerOrmLiteDialectProvider.Instance);
    AppUser item = null;

    var rh = new RedisHelper();
    var id = CacheIDHelper.GetAppUserID(app_user_id);
    item = rh.Get<AppUser>(id);

    if (item == null)
    {
        try
        {
            using (var db = dbFactory.OpenDbConnection())
            {
                item = db.Single<AppUser>("application_user_id={0}", app_user_id);
                rh.Set<AppUser>(item, id);
            }
        }
        catch (Exception ex)
        {
            APLog.error(ex, "Error retrieving user!");
        }
    }

    return item;
}
I have removed some of the extraneous fields, but it is basically:
[Alias("application_user")]
public class AppUser : APBaseObject
{
[Alias("application_user_id")]
[AutoIncrement]
public int? UserID
{
get;
set;
}
[Alias("application_user_guid")]
public string UserGUID
{
get;
set;
}
//MORE FIELDS here.
}
The challenge is that the only field that gets populated is the ID field, and I already know that ID because I am passing it into the method.
I grabbed the last SQL statement issued and ran it against the DB directly, and all of the fields were being referenced correctly.
I stepped through the code in the debugger, and everything came back correctly, except that the only field returned was the ID.
Thoughts?
I had a similar issue, which was caused by my class properties not mapping to the db properly. My exact issue was caused by a nullable int column in the db while the class property was defined as 'int' instead of 'int?'.
Perhaps you have a similar issue?
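In other words, check that every property's nullability matches its column. A minimal sketch of the idea (LoginCount and its column name are hypothetical, purely to illustrate the mismatch):

[Alias("application_user")]
public class AppUser : APBaseObject
{
    [Alias("application_user_id")]
    [AutoIncrement]
    public int? UserID { get; set; }

    // If login_count allows NULL in SQL Server, this must be int?, not int;
    // a plain int here can break row mapping for the whole POCO.
    [Alias("login_count")]
    public int? LoginCount { get; set; }
}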

Unable to Serialize into XML

I am trying to serialize the content of some text into an XML file (performed when a user saves their selections), and will later deserialize it (when the user chooses to display their saved selection).
I have been following a tutorial on serialization.
I have also tried to do this via LINQ to XML, but I was either getting namespace errors, or the tool returned no errors but did not work (with the same problem as described below).
The problem I am having is that my code is not returning any errors, but the function is not working (I have a label control that lets me see that the 'catch' is being hit). I am building the tool in Expression Blend, using C#.
Here is my SaveSelections class:
using System;
using System.Collections.Generic;
using System.Text;
using System.Windows;
using System.Windows.Controls;
using System.Windows.Data;
using System.Windows.Documents;
using System.Windows.Input;
using System.Windows.Media;
using System.Windows.Media.Imaging;
using System.Windows.Shapes;
using System.Xml.Serialization;
using System.Xml;

namespace DYH
{
    public class SaveSelections
    {
        [XmlAttribute("Title")]
        public string Title { get; set; }

        [XmlElement("Roof")]
        public string RoofSelection { get; set; }

        [XmlElement("Cladding")]
        public string CladdingSelection { get; set; }

        [XmlElement("MixedCladding")]
        public string MixedCladdingSelection { get; set; }

        [XmlElement("FAJ")]
        public string FAJSelection { get; set; }

        [XmlElement("GarageDoor")]
        public string GarageDoorSelection { get; set; }

        [XmlElement("FrontDoor")]
        public string FrontDoorSelection { get; set; }
    }
}
Here is my C# code
// Save Selection button
private void Button_SaveSelection_MouseLeftButtonDown(object sender, System.Windows.Input.MouseButtonEventArgs e)
{
    try
    {
        // Save selections into the SavedSelections.xml doc
        SaveSelections userselection = new SaveSelections();
        userselection.Title = TextBox_SaveSelection.Text;
        userselection.RoofSelection = Button_Roof_Select.Text;
        userselection.CladdingSelection = Button_Cladding_Select.Text;
        userselection.MixedCladdingSelection = Button_MixedCladding_Select.Text;
        userselection.FAJSelection = Button_FAJ_Select.Text;
        userselection.GarageDoorSelection = Button_GarageDoor_Select.Text;
        userselection.FrontDoorSelection = Button_FrontDoor_Select.Text;

        SerializeToXML(userselection);

        // XDocument xmlSaveSelections = XDocument.Load("../SavedSelections.xml");
        //
        // XElement newSelection = new XElement("Selection", //xmlSaveSelections.Element("Selections").Add(
        //     //new XElement("Selection",
        //     new XElement("Title", TextBox_SaveSelection.Text),
        //     new XElement("RoofSelection", Button_Roof_Select.Text),
        //     new XElement("CladdingSelection", Button_Cladding_Select.Text),
        //     new XElement("MixedCladdingSelection", Button_MixedCladding_Select.Text),
        //     new XElement("FAJSelection", Button_FAJ_Select.Text),
        //     new XElement("GarageDoorSelection", Button_GarageDoor_Select.Text),
        //     new XElement("FrontDoorSelection", Button_FrontDoor_Select.Text));
        //
        //// xmlSaveSelections.Add(newSelection);
        //// xmlSaveSelections.Save("../SavedSelections.xml");

        SelectionLabel.Text = "Your selection has been saved as " + "'" + TextBox_SaveSelection.Text + "'. We suggest you write down the name of your selection.";
    }
    catch (Exception ex)
    {
        throw ex;
        SelectionLabel.Text = "There was a problem saving your selection. Please try again shortly.";
    }
}

// Saves SaveSelections to the XML file SavedSelections.xml
static public void SerializeToXML(SaveSelections selection)
{
    XmlSerializer serializer = new XmlSerializer(typeof(SaveSelections));
    TextWriter textWriter = new StreamWriter(@"/SavedSelections.xml");
    serializer.Serialize(textWriter, selection);
    textWriter.Close();
}
I have left evidence of one of my previous attempts commented out so you can see a previous format I tried.
My issue is that when I try to use the tool, SelectionLabel.Text shows "There was a problem saving your selection. Please try again shortly.", so I know that the code is hitting the catch and not completing the 'try'.
Any help?
Edit 18/6/2012: The code below is what worked, as per the accepted answer to the question.
public void Button_SaveSelection_MouseLeftButtonDown(object sender, System.Windows.Input.MouseButtonEventArgs e)
{
    string roofSelection = TextBox_SaveSelection.Text + "_RoofSelection";
    string claddingSelection = TextBox_SaveSelection.Text + "_CladdingSelection";
    string mixedCladdingSelection = TextBox_SaveSelection.Text + "_MixedCladdingSelection";
    string fajSelection = TextBox_SaveSelection.Text + "_FAJSelection";
    string garageDoorSelection = TextBox_SaveSelection.Text + "_GarageDoorSelection";
    string frontDoorSelection = TextBox_SaveSelection.Text + "_FrontDoorSelection";

    try
    {
        using (var store = IsolatedStorageFile.GetUserStoreForApplication())
        {
            // Gives us 6 MB of storage space in IsoStore
            Int64 isoSpaceNeeded = 1048576 * 6;
            Int64 currentIsoSpace = store.AvailableFreeSpace;

            // If space needed is greater than space available, increase space
            if (isoSpaceNeeded > currentIsoSpace)
            {
                // If the user accepts the space increase
                if (store.IncreaseQuotaTo(currentIsoSpace + isoSpaceNeeded))
                {
                    IsolatedStorageFileStream file = store.CreateFile("SavedSelections.txt");
                    file.Close();

                    // Stream writer to populate information in
                    using (StreamWriter sw = new StreamWriter(store.OpenFile("SavedSelections.txt", FileMode.Open, FileAccess.Write)))
                    {
                        appSettings.Add(roofSelection, Button_Roof_Select.Text);
                        sw.WriteLine(roofSelection);
                        appSettings.Add(claddingSelection, Button_Cladding_Select.Text);
                        sw.WriteLine(claddingSelection);
                        appSettings.Add(mixedCladdingSelection, Button_MixedCladding_Select.Text);
                        sw.WriteLine(mixedCladdingSelection);
                        appSettings.Add(fajSelection, Button_FAJ_Select.Text);
                        sw.WriteLine(fajSelection);
                        appSettings.Add(garageDoorSelection, Button_GarageDoor_Select.Text);
                        sw.WriteLine(garageDoorSelection);
                        appSettings.Add(frontDoorSelection, Button_FrontDoor_Select.Text);
                        sw.WriteLine(frontDoorSelection);
                    }

                    SelectionLabel.Text = "Your selection has been saved as " + "'" + TextBox_SaveSelection.Text + "'. We suggest you write down the name of your selection.";
                }
            }
        }

        SelectionLabel.Text = "Your selection has been saved as " + "'" + TextBox_SaveSelection.Text + "'. We suggest you write down the name of your selection.";
    }
    catch //(Exception ex)
    {
        //throw ex;
        SelectionLabel.Text = "There was a problem saving your selection. Please try again shortly.";
    }
}
It looks like your issue is that you're trying to write to a file, but that file did not come from a SaveFileDialog initiated by a user action. You're running into a security feature of Silverlight: you're not allowed direct access to the local file system. Instead, try writing to IsolatedStorage. However, be aware that end users can completely (as well as selectively) disable application storage, so you'll need to handle those exceptions as well.
Here's a quick article on how to use IsolatedStorage.
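For reference, a minimal sketch of SerializeToXML rewritten against IsolatedStorage (assuming a Silverlight app, hence GetUserStoreForApplication):

using System.IO.IsolatedStorage;
using System.Xml.Serialization;

static public void SerializeToXML(SaveSelections selection)
{
    // CreateFile returns an IsolatedStorageFileStream we can serialize straight into
    using (var store = IsolatedStorageFile.GetUserStoreForApplication())
    using (var stream = store.CreateFile("SavedSelections.xml"))
    {
        XmlSerializer serializer = new XmlSerializer(typeof(SaveSelections));
        serializer.Serialize(stream, selection);
    }
}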

Loading an object from a db4o database

I am developing an e-commerce website that utilises db4o as the backend. All was well until last week, when I came across a problem that I have been unable to solve. The code below is quite straightforward: I open a database file, save an object and then try to retrieve it. However, I get nothing back; the "users" variable has a count of zero.
public class Program
{
    private static string _connectionString = @"c:\aaarrrr.db4o";

    static void Main(string[] args)
    {
        TestUser container = new TestUser() { id = 1, Name = "Mohammad", Surname = "Rafiq" };

        Db4oFactory.Configure().Diagnostic().AddListener(new DiagnosticToConsole());

        using (var dbc = Db4oFactory.OpenFile(_connectionString))
        {
            dbc.Store(container);
        }

        IList<TestUser> users = null;
        using (var dbc = Db4oFactory.OpenFile(_connectionString))
        {
            users = dbc.Query<TestUser>(x => x.id == 1).ToList();
        }

        if (users.Count > 0)
        {
            Console.WriteLine("{0} {1} with id of {2}", users.First().Name, users.First().Surname, users.First().id);
        }
        else
        {
            Console.WriteLine("\nNo data returned.");
        }

        Console.ReadLine();
    }
}
public class TestUser
{
    [Indexed]
    private int _id = 0;
    private string _name = string.Empty;
    private string _surname = string.Empty;

    public int id { get { return _id; } set { _id = value; } }
    public string Name { get { return _name; } set { _name = value; } }
    public string Surname { get { return _surname; } set { _surname = value; } }
}
I have attached the db4o diagnostic listener and I see nothing in the console output. Everything seems fine. I know I am writing to the file, because I can see the file size increase and the timestamp being updated. I have checked all the project settings and they are all set to defaults. I am using .NET 4, Visual Studio 2010 beta and Windows 7. I have done some reading regarding reflection permissions, but I can't see how that applies here. Any help or ideas would be kindly appreciated.
After calling Store(), you need to Commit() before leaving the using block. You closed your database before committing your changes.
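A minimal sketch of that fix (db4o's IObjectContainer exposes Commit()):

using (var dbc = Db4oFactory.OpenFile(_connectionString))
{
    dbc.Store(container);
    dbc.Commit();   // flush the transaction before the container is disposed
}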
