Can we use direct SQL Connection in graph for bulk update - acumatica

We have a requirement to insert data from a view into a custom table. Right now we are doing this with graph objects, which works fine, but even for 1K records it takes a huge amount of time, so as an alternative we wrote a bulk insert. We use a view because it contains some complex logic.
We are now in a SaaS environment. The question is: can we use this kind of code? I know SaaS doesn't support stored procedures.
Please suggest. Please have a look at the code below.
public static void BulkInsertSalesData()
{
    string DBConnectionString = System.Web.Configuration.WebConfigurationManager.ConnectionStrings["ProjectX"].ToString();
    using (SqlConnection con = new SqlConnection(DBConnectionString))
    {
        if (con.State == ConnectionState.Closed)
            con.Open();
        SqlCommand cmd = new SqlCommand("SELECT field1,field2,field3 FROM viewname where companyID = 2", con);
        cmd.CommandTimeout = 0;
        using (SqlDataReader rdr = cmd.ExecuteReader())
        {
            using (SqlConnection destinationCon = new SqlConnection(DBConnectionString))
            {
                using (SqlBulkCopy bc = new SqlBulkCopy(destinationCon))
                {
                    bc.DestinationTableName = "CustomTable";
                    bc.ColumnMappings.Add("field1", "field1");
                    bc.ColumnMappings.Add("field2", "field2");
                    bc.ColumnMappings.Add("field3", "field3");
                    destinationCon.Open();
                    bc.WriteToServer(rdr);
                }
            }
        }
        if (con.State == ConnectionState.Open)
            con.Close();
    }
}

Have you tried the PXDatabase class?
Bypassing the data access layer in Acumatica would not pass certification; there are security concerns related to direct database access. The PXDatabase class allows more direct access at the expense of skipping some validations.
Example:
protected virtual void UpdateKey(APVendorRefNbr vrn)
{
    PXDatabase.Insert<APVendorRefNbr>(
        new PXDataFieldAssign<APVendorRefNbr.masterID>(vrn.MasterID),
        new PXDataFieldAssign<APVendorRefNbr.detailID>(vrn.DetailID),
        new PXDataFieldAssign<APVendorRefNbr.vendorID>(vrn.VendorID),
        new PXDataFieldAssign<APVendorRefNbr.vendorDocumentID>(vrn.VendorDocumentID),
        new PXDataFieldAssign<APVendorRefNbr.siblingID>(vrn.SiblingID),
        new PXDataFieldAssign<APVendorRefNbr.createdByID>(vrn.CreatedByID),
        new PXDataFieldAssign<APVendorRefNbr.createdByScreenID>(vrn.CreatedByScreenID));
}
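For the scenario in the question, the same API can stand in for SqlBulkCopy: read the source rows through BQL and insert each one directly. A minimal sketch, assuming hypothetical DACs SalesView (mapped to the SQL view) and CustomTable with fields Field1..Field3:
public static void BulkInsertSalesData(PXGraph graph)
{
    // SalesView and CustomTable are placeholder DACs standing in for the real view and table.
    foreach (SalesView row in PXSelect<SalesView>.Select(graph))
    {
        // PXDatabase.Insert writes directly, bypassing graph events and most validations,
        // which is what makes it fast for this kind of bulk load.
        PXDatabase.Insert<CustomTable>(
            new PXDataFieldAssign<CustomTable.field1>(row.Field1),
            new PXDataFieldAssign<CustomTable.field2>(row.Field2),
            new PXDataFieldAssign<CustomTable.field3>(row.Field3));
    }
}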

Related

Adding restrictions to ACLs yields empty results for queries in Jackrabbit Oak

Using Jackrabbit Oak, I've been attempting to configure security through SecurityProvider and SecurityConfigurations. In particular, I've been using restrictions, which generally work as expected. However, when dealing with JCR-SQL2 queries, more gets filtered out than expected.
Details
It can be reproduced with the repository below.
/
  node [nt:unstructured]
    subnode [nt:unstructured]
On node, I add an access control entry with privilege JCR_ALL for user, together with a restriction rep:glob -> "", such that user does not have access to any children of node.
It works as expected when using session.getNode:
session.getNode("/node") returns the node
session.getNode("/node/subnode") throws PathNotFoundException as expected due to the restriction.
However, when I execute the following JCR-SQL2 query:
SELECT * FROM [nt:unstructured]
I get no results back. Here I would have expected to get /node, as it is otherwise available when using session.getNode.
Code
public static void main(String[] args) throws Exception {
    Repository repository = new Jcr().with(new MySecurityProvider()).createRepository();
    Session session = repository.login(new UserIdCredentials("")); // principal is "SystemPrincipal.INSTANCE"

    // Create nodes
    Node node = session.getRootNode().addNode("node", "nt:unstructured");
    node.addNode("subnode", "nt:unstructured");

    // Add access control entry + restriction
    AccessControlManager acm = session.getAccessControlManager();
    JackrabbitAccessControlList acl = (JackrabbitAccessControlList) acm
        .getApplicablePolicies("/node").nextAccessControlPolicy();
    Privilege[] privileges = new Privilege[]{acm.privilegeFromName(Privilege.JCR_ALL)};
    Map<String, Value> restrictions = new HashMap<String, Value>() {{ put("rep:glob", new StringValue("")); }};
    acl.addEntry(new PrincipalImpl("user"), privileges, true, restrictions);
    acm.setPolicy("/node", acl);
    session.save();

    // Execute query
    RowIterator rows = repository.login(new UserIdCredentials("user")).getWorkspace().getQueryManager()
        .createQuery("SELECT * FROM [nt:unstructured]", Query.JCR_SQL2).execute().getRows();
    System.out.println("Number of rows: " + rows.getSize()); // Prints 0
}
If the restriction is removed from the code above, both node and subnode appear in the query results as expected.
MySecurityProvider uses ConfigurationParameters.EMPTY and the default implementations of all SecurityConfigurations, except for AuthenticationConfiguration which I've implemented myself:
class MyAuthenticationConfiguration extends AuthenticationConfigurationImpl {
    public MyAuthenticationConfiguration(SecurityProvider securityProvider) {
        super(securityProvider);
    }

    @NotNull
    @Override
    public LoginContextProvider getLoginContextProvider(ContentRepository contentRepository) {
        return new LoginContextProvider() {
            @NotNull
            public LoginContext getLoginContext(Credentials credentials, String workspaceName) {
                String userId = ((UserIdCredentials) credentials).getUserId();
                Set<Principal> principalSets = new HashSet<>();
                if (userId.isEmpty()) {
                    principalSets.add(SystemPrincipal.INSTANCE);
                } else {
                    principalSets.add(new PrincipalImpl(userId));
                }
                Map<String, ? extends Principal> publicPrivileges = new HashMap<>();
                AuthInfoImpl authInfoImpl = new AuthInfoImpl(userId, publicPrivileges, principalSets);
                Subject subject = new Subject(true, principalSets, Collections.singleton(authInfoImpl), new HashSet<Principal>());
                return new PreAuthContext(subject);
            }
        };
    }
}
I am using Jackrabbit Oak version 1.10.0.
This turned out to be a bug in Jackrabbit Oak (link to issue).
It has been resolved as of version 1.12.0.

Revit API: Material Asset Parameters get and set

I am trying to access, via the Revit API, the data contained in a particular asset. For instance, I want to manipulate the Identity Data: get, and eventually set, values for Manufacturer, Model, Cost and URL.
How can I achieve the same for the other assets?
I am reading the Materials:
public IEnumerable<Material> GetMaterials(Document doc)
{
    var collector = new FilteredElementCollector(doc);
    return collector.OfClass(typeof(Material)).OfType<Material>();
}
And then the Parameters:
public IEnumerable<Parameter> GetMaterialParameters(Material material)
{
    List<Parameter> parameters = new List<Parameter>();
    var localParameters = material.ParametersMap;
    foreach (Parameter localParameter in localParameters)
    {
        parameters.Add(localParameter);
    }
    return parameters;
}
but still can't find where those properties are exposed.
What you really need is the Visual Materials API that was introduced in Revit 2018.1, the newest update:
Revit 2018.1 and the Visual Materials API
It is much harder and maybe impossible to achieve what you want in earlier versions.
Here are pointers to some more or less futile attempts:
Material Assets and FBX
Read Material Asset Parameter
Rendering Assets
Material Asset Textures
Finally this is how I managed to edit the parameters.
private void AssignProductData_OnClick(object sender, RoutedEventArgs e)
{
    var material = (MaterialItem)MaterialsCombo.SelectedItem;
    using (var transaction = new Transaction(doc))
    {
        transaction.Start("ChangeName");
        var parameterManufacturer = material.Material.get_Parameter(BuiltInParameter.ALL_MODEL_MANUFACTURER);
        parameterManufacturer.Set("Brand New Product");
        var parameterCost = material.Material.get_Parameter(BuiltInParameter.ALL_MODEL_COST);
        parameterCost.Set(1099.99);
        var parameterModel = material.Material.get_Parameter(BuiltInParameter.ALL_MODEL_MODEL);
        parameterModel.Set("R1223123KJNSDAS9089");
        var parameterUrl = material.Material.get_Parameter(BuiltInParameter.ALL_MODEL_URL);
        parameterUrl.Set("http://www.site.no/products/R1223123KJNSDAS9089");
        transaction.Commit();
    }
}
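Reading the same Identity Data back follows the same pattern and needs no transaction; a minimal sketch along the lines of the snippet above (error handling for materials that lack these parameters is omitted):
private static string DescribeIdentityData(Material material)
{
    // Identity Data values are exposed as built-in parameters on the Material element
    // (the same parameters the snippet above writes to).
    Parameter manufacturer = material.get_Parameter(BuiltInParameter.ALL_MODEL_MANUFACTURER);
    Parameter model = material.get_Parameter(BuiltInParameter.ALL_MODEL_MODEL);
    Parameter cost = material.get_Parameter(BuiltInParameter.ALL_MODEL_COST);
    Parameter url = material.get_Parameter(BuiltInParameter.ALL_MODEL_URL);

    // AsString/AsDouble read the current values without opening a transaction.
    return manufacturer.AsString() + " / "
        + model.AsString() + " / "
        + cost.AsDouble() + " / "
        + url.AsString();
}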

Connecting to CRM Online through CRM 365 Plugin

I need to connect to and retrieve records from CRM Online through a CRM 365 plugin. I tried the simplified connection using xrm.tooling.dll, but unfortunately it says "Could not load file or assembly 'microsoft.xrm.tooling.connector'", and when I used ClientCredentials the error says "Metadata contains references that cannot be resolved."
Strangely, I tried both methods from a console application and they work perfectly. I just want to know what I am missing in this case. Do I need anything special when I want to connect to CRM from a plugin? Please share your knowledge.
EDIT
This is just sample code to get an account name from CRM Online and display it using an InvalidPluginExecutionException:
IOrganizationService _service;
public void Execute(IServiceProvider serviceprovider)
{
    IPluginExecutionContext context = (IPluginExecutionContext)serviceprovider.GetService(typeof(IPluginExecutionContext));
    IOrganizationServiceFactory servicefactory = (IOrganizationServiceFactory)serviceprovider.GetService(typeof(IOrganizationServiceFactory));
    IOrganizationService service = servicefactory.CreateOrganizationService(context.UserId);
    if (context.InputParameters.Contains("Target") && context.InputParameters["Target"] is Entity)
    {
        Entity ent = (Entity)context.InputParameters["Target"];
        if (ent.LogicalName != "opportunity")
            return;
        string connstring = @"Url=https://office.crm5.dynamics.com; Username=username@office.onmicrosoft.com; Password=crmoffice; authtype=Office365";
        CrmServiceClient conn = new Microsoft.Xrm.Tooling.Connector.CrmServiceClient(connstring);
        service = (IOrganizationService)conn.OrganizationWebProxyClient != null
            ? (IOrganizationService)conn.OrganizationWebProxyClient
            : (IOrganizationService)conn.OrganizationServiceProxy;
        try
        {
            Guid fabercastel = new Guid("efd566dc-10ff-e511-80df-c4346bdcddc1");
            Entity _account = new Entity("account");
            _account = service.Retrieve(_account.LogicalName, fabercastel, new ColumnSet("name"));
            string x = _account["name"].ToString();
            throw new InvalidPluginExecutionException("Result of Query : " + x);
        }
        catch (Exception ex)
        {
            throw new InvalidPluginExecutionException(ex.Message);
        }
    }
}
You should be able to connect to another CRM instance without using any assemblies that are outside the Online Sandbox (so other than Microsoft.Xrm.Sdk and related). Simply use the sample from the SDK at "SDK\SampleCode\CS\GeneralProgramming\Authentication\AuthenticateWithNoHelp\AuthenticateWithNoHelp.cs". A simplified version for connecting to Office365 looks like this:
class AuthenticateWithNoHelp
{
    private String _discoveryServiceAddress = "https://disco.crm.dynamics.com/XRMServices/2011/Discovery.svc";
    private String _organizationUniqueName = "orgname";
    private String _userName = "admin@orgname.onmicrosoft.com";
    private String _password = "password";
    private String _domain = "domain";

    public void Run()
    {
        IServiceManagement<IDiscoveryService> serviceManagement =
            ServiceConfigurationFactory.CreateManagement<IDiscoveryService>(
                new Uri(_discoveryServiceAddress));
        AuthenticationProviderType endpointType = serviceManagement.AuthenticationType;
        AuthenticationCredentials authCredentials = GetCredentials(serviceManagement, endpointType);
        String organizationUri = String.Empty;
        using (DiscoveryServiceProxy discoveryProxy =
            GetProxy<IDiscoveryService, DiscoveryServiceProxy>(serviceManagement, authCredentials))
        {
            if (discoveryProxy != null)
            {
                OrganizationDetailCollection orgs = DiscoverOrganizations(discoveryProxy);
                organizationUri = FindOrganization(_organizationUniqueName,
                    orgs.ToArray()).Endpoints[EndpointType.OrganizationService];
            }
        }
        if (!String.IsNullOrWhiteSpace(organizationUri))
        {
            IServiceManagement<IOrganizationService> orgServiceManagement =
                ServiceConfigurationFactory.CreateManagement<IOrganizationService>(
                    new Uri(organizationUri));
            AuthenticationCredentials credentials = GetCredentials(orgServiceManagement, endpointType);
            using (OrganizationServiceProxy organizationProxy =
                GetProxy<IOrganizationService, OrganizationServiceProxy>(orgServiceManagement, credentials))
            {
                organizationProxy.EnableProxyTypes();
                Guid userid = ((WhoAmIResponse)organizationProxy.Execute(
                    new WhoAmIRequest())).UserId;
            }
        }
    }

    private AuthenticationCredentials GetCredentials<TService>(IServiceManagement<TService> service, AuthenticationProviderType endpointType)
    {
        AuthenticationCredentials authCredentials = new AuthenticationCredentials();
        authCredentials.ClientCredentials.UserName.UserName = _userName;
        authCredentials.ClientCredentials.UserName.Password = _password;
        return authCredentials;
    }

    public OrganizationDetailCollection DiscoverOrganizations(
        IDiscoveryService service)
    {
        if (service == null) throw new ArgumentNullException("service");
        RetrieveOrganizationsRequest orgRequest = new RetrieveOrganizationsRequest();
        RetrieveOrganizationsResponse orgResponse =
            (RetrieveOrganizationsResponse)service.Execute(orgRequest);
        return orgResponse.Details;
    }

    public OrganizationDetail FindOrganization(string orgUniqueName,
        OrganizationDetail[] orgDetails)
    {
        if (String.IsNullOrWhiteSpace(orgUniqueName))
            throw new ArgumentNullException("orgUniqueName");
        if (orgDetails == null)
            throw new ArgumentNullException("orgDetails");
        OrganizationDetail orgDetail = null;
        foreach (OrganizationDetail detail in orgDetails)
        {
            if (String.Compare(detail.UrlName, orgUniqueName,
                StringComparison.InvariantCultureIgnoreCase) == 0)
            {
                orgDetail = detail;
                break;
            }
        }
        return orgDetail;
    }

    private TProxy GetProxy<TService, TProxy>(
        IServiceManagement<TService> serviceManagement,
        AuthenticationCredentials authCredentials)
        where TService : class
        where TProxy : ServiceProxy<TService>
    {
        Type classType = typeof(TProxy);
        if (serviceManagement.AuthenticationType !=
            AuthenticationProviderType.ActiveDirectory)
        {
            AuthenticationCredentials tokenCredentials =
                serviceManagement.Authenticate(authCredentials);
            return (TProxy)classType
                .GetConstructor(new Type[] { typeof(IServiceManagement<TService>), typeof(SecurityTokenResponse) })
                .Invoke(new object[] { serviceManagement, tokenCredentials.SecurityTokenResponse });
        }
        return (TProxy)classType
            .GetConstructor(new Type[] { typeof(IServiceManagement<TService>), typeof(ClientCredentials) })
            .Invoke(new object[] { serviceManagement, authCredentials.ClientCredentials });
    }

    static public void Main(string[] args)
    {
        AuthenticateWithNoHelp app = new AuthenticateWithNoHelp();
        app.Run();
    }
}
You can simplify it further by removing the DiscoveryService part and calling the organization service URL directly:
https://orgname.api.crm.dynamics.com/XRMServices/2011/Organization.svc
This should work in sandboxed plugins as it uses only SDK assemblies.
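A minimal sketch of that shortened path, reusing the GetCredentials and GetProxy helpers from the sample above (the organization URL is a placeholder):
public void RunWithoutDiscovery()
{
    // Skip discovery and point straight at the organization service endpoint.
    Uri organizationUri = new Uri("https://orgname.api.crm.dynamics.com/XRMServices/2011/Organization.svc");

    IServiceManagement<IOrganizationService> orgServiceManagement =
        ServiceConfigurationFactory.CreateManagement<IOrganizationService>(organizationUri);

    // Same username/password credentials as in the full sample.
    AuthenticationCredentials credentials =
        GetCredentials(orgServiceManagement, orgServiceManagement.AuthenticationType);

    using (OrganizationServiceProxy organizationProxy =
        GetProxy<IOrganizationService, OrganizationServiceProxy>(orgServiceManagement, credentials))
    {
        organizationProxy.EnableProxyTypes();
        Guid userId = ((WhoAmIResponse)organizationProxy.Execute(new WhoAmIRequest())).UserId;
    }
}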
You already have your connection to CRM using the IOrganizationService that you've defined on the third line of your plugin. Unless you need to connect to another CRM instance in a different org, there is no login needed or required.
Basically just delete the 4 lines above your try, and you should be good.
Edit:
public void Execute(IServiceProvider serviceprovider)
{
    IPluginExecutionContext context = (IPluginExecutionContext)serviceprovider.GetService(typeof(IPluginExecutionContext));
    IOrganizationServiceFactory servicefactory = (IOrganizationServiceFactory)serviceprovider.GetService(typeof(IOrganizationServiceFactory));
    IOrganizationService service = servicefactory.CreateOrganizationService(context.UserId);
    if (context.InputParameters.Contains("Target") && context.InputParameters["Target"] is Entity)
    {
        Entity ent = (Entity)context.InputParameters["Target"];
        if (ent.LogicalName != "opportunity")
            return;
        Guid fabercastel = new Guid("efd566dc-10ff-e511-80df-c4346bdcddc1");
        Entity _account = new Entity("account");
        _account = service.Retrieve(_account.LogicalName, fabercastel, new ColumnSet("name"));
        string x = _account["name"].ToString();
        throw new InvalidPluginExecutionException("Result of Query : " + x);
    }
}
You do not need any additional libraries like Microsoft.Xrm.Tooling.Connector or others from the SDK to consume CRM web services. Standard .NET mechanisms for SOAP/REST protocols will be enough (though this method may be a little more difficult).
EDIT: I've done some additional investigation, and it turns out that configuring the auto-generated OrganizationServiceClient for Office365 authentication without the SDK libraries can be a real pain. I'm not saying it is impossible, but it is not documented by Microsoft. To add more detail, OAuth authentication is not supported by Visual Studio's generated proxy classes.
Because of that, my second recommendation is to use a facade web service that communicates with CRM Online. You may host this web service on Windows Azure or any other cloud/hosting provider on the internet. From your CRM 365 plugin you can consume your custom web service methods and communicate with your CRM Online instance through this service. I suppose this is a much better approach than trying to run undocumented methods of connecting to CRM Online.
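As an illustration only, calling such a facade from a sandboxed plugin can be a plain HTTPS request; the URL and route below are made-up placeholders for your own service:
// Illustration only: the facade URL and route are placeholders, not a real API.
private static string GetAccountNameViaFacade(Guid accountId)
{
    using (var client = new System.Net.WebClient())
    {
        // Sandboxed plugins may call external HTTP/HTTPS endpoints
        // (local network and IP-based addresses are blocked by the sandbox).
        return client.DownloadString(
            "https://myfacade.example.com/api/accounts/" + accountId + "/name");
    }
}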

How to write JavaRDD to marklogic database

I am evaluating Spark with the MarkLogic database. I have read a CSV file, and now I have a JavaRDD object which I have to dump into the MarkLogic database.
SparkConf conf = new SparkConf().setAppName("org.sparkexample.Dataload").setMaster("local");
JavaSparkContext sc = new JavaSparkContext(conf);
JavaRDD<String> data = sc.textFile("/root/ml/workArea/data.csv");
SQLContext sqlContext = new SQLContext(sc);
JavaRDD<Record> rdd_records = data.map(
    new Function<String, Record>() {
        public Record call(String line) throws Exception {
            String[] fields = line.split(",");
            Record sd = new Record(fields[0], fields[1], fields[2], fields[3], fields[4]);
            return sd;
        }
    });
I want to write this JavaRDD object to the MarkLogic database.
Is there any Spark API available for faster writing to the MarkLogic database?
If we cannot write a JavaRDD directly to MarkLogic, what is the correct approach to achieve this?
Here is the code I am using to write the JavaRDD data to the MarkLogic database; let me know if it is the wrong way to do it.
final DatabaseClient client = DatabaseClientFactory.newClient("localhost", 8070, "MLTest");
final XMLDocumentManager docMgr = client.newXMLDocumentManager();
rdd_records.foreachPartition(new VoidFunction<Iterator<Record>>() {
    public void call(Iterator<Record> partitionOfRecords) {
        while (partitionOfRecords.hasNext()) {
            Record record = partitionOfRecords.next();
            System.out.println("partitionOfRecords - " + record.toString());
            String docId = "/example/" + record.getID() + ".xml";
            JAXBContext context = JAXBContext.newInstance(Record.class);
            JAXBHandle<Record> handle = new JAXBHandle<Record>(context);
            handle.set(record);
            docMgr.writeAs(docId, handle);
        }
    }
});
client.release();
I have used the Java Client API to write the data, but I am getting the exception below even though the POJO class Record implements the Serializable interface. Please let me know what could be the reason and how to solve it.
org.apache.spark.SparkException: Task not serializable
The easiest way to get data into MarkLogic is via HTTP and the client REST API - specifically the /v1/documents endpoints - http://docs.marklogic.com/REST/client/management .
There are a variety of ways to optimize this, such as via a write set, but based on your question, I think the first thing to decide is - what kind of document do you want to write for each Record? Your example shows 5 columns in the CSV - typically, you'll write either a JSON or XML document with 5 fields/elements, each named based on the column index. So you'd need to write a little code to generate that JSON/XML, and then use whatever HTTP client you prefer (and one option is the MarkLogic Java Client API) to write that document to MarkLogic.
That addresses your question of how to write a JavaRDD to MarkLogic - but if your goal is to get data from a CSV into MarkLogic as fast as possible, then skip Spark and use mlcp - https://docs.marklogic.com/guide/mlcp/import#id_70366 - which involves zero coding.
Here is a modified example from the Spark Streaming guide; you will have to implement the connection and writing logic specific to your database.
public void send(JavaRDD<String> rdd) {
    rdd.foreachPartition(new VoidFunction<Iterator<String>>() {
        @Override
        public void call(Iterator<String> partitionOfRecords) {
            // ConnectionPool is a static, lazily initialized pool of connections
            Connection connection = ConnectionPool.getConnection();
            while (partitionOfRecords.hasNext()) {
                connection.send(partitionOfRecords.next());
            }
            ConnectionPool.returnConnection(connection); // return to the pool for future reuse
        }
    });
}
I'm wondering if you just need to make sure everything you access inside your VoidFunction that was instantiated outside it is serializable (see this page). DatabaseClient and XMLDocumentManager are of course not serializable, as they're connected resources. You're right, however, to not instantiate DatabaseClient inside your VoidFunction as that would be less efficient (though it would work). I don't know if the following idea would work with spark. But I'm guessing you could create a class that keeps hold of a singleton DatabaseClient instance:
public static class MLClient {
    private static DatabaseClient singleton;
    private MLClient() {}
    public static DatabaseClient get(DatabaseClientFactory.Bean connectionInfo) {
        if ( connectionInfo == null ) {
            throw new IllegalArgumentException("connectionInfo cannot be null");
        }
        if ( singleton == null ) {
            singleton = connectionInfo.newClient();
        }
        return singleton;
    }
}
Then you just create a serializable DatabaseClientFactory.Bean outside your VoidFunction so your auth info is still centralized:
DatabaseClientFactory.Bean connectionInfo = new DatabaseClientFactory.Bean();
connectionInfo.setHost("localhost");
connectionInfo.setPort(8000);
connectionInfo.setUser("admin");
connectionInfo.setPassword("admin");
connectionInfo.setAuthenticationValue("digest");
Then inside your VoidFunction you could get that singleton DatabaseClient and new XMLDocumentManager like so:
DatabaseClient client = MLClient.get(connectionInfo);
XMLDocumentManager docMgr = client.newXMLDocumentManager();

Reusable generic LightSwitch screen with WCF RIA Services

I'm new to WCF RIA Services, and have been working with LightSwitch for 4 or so months now.
I created a generic screen to be used for editing lookup tables all over my LightSwitch application, mostly to learn how to create a generic screen that can be used with different entity sets on a dynamic basis.
The screen is pretty simple:
It is opened with arguments similar to this:
Application.ShowLookupTypesList("StatusTypes", "StatusTypeId"); these correspond to the entity set and key property for the lookup table in the database.
Here's my WCF RIA service code:
using System.Data.Objects.DataClasses;
using System.Diagnostics;
using System.Reflection;
using System;
using System.Collections.Generic;
using System.ComponentModel.DataAnnotations;
using System.Data;
using System.Linq;
using System.ServiceModel.DomainServices.EntityFramework;
using System.ServiceModel.DomainServices.Server;

namespace WCF_RIA_Project
{
    public class LookupType
    {
        [Key]
        public int TypeId { get; set; }
        public string Name { get; set; }
    }

    public static class EntityInfo
    {
        public static Type Type;
        public static PropertyInfo Key;
        public static PropertyInfo Set;
    }

    public class WCF_RIA_Service : LinqToEntitiesDomainService<WCSEntities>
    {
        public IQueryable<LookupType> GetLookupTypesByEntitySet(string EntitySetName, string KeyName)
        {
            EntityInfo.Set = ObjectContext.GetType().GetProperty(EntitySetName);
            EntityInfo.Type = EntityInfo.Set.PropertyType.GetGenericArguments().First();
            EntityInfo.Key = EntityInfo.Type.GetProperty(KeyName);
            return GetTypes();
        }

        [Query(IsDefault = true)]
        public IQueryable<LookupType> GetTypes()
        {
            var set = (IEnumerable<EntityObject>)EntityInfo.Set.GetValue(ObjectContext, null);
            var types = from e in set
                        select new LookupType
                        {
                            TypeId = (int)EntityInfo.Key.GetValue(e, null),
                            Name = (string)EntityInfo.Type.GetProperty("Name").GetValue(e, null)
                        };
            return types.AsQueryable();
        }

        public void InsertLookupType(LookupType lookupType)
        {
            dynamic e = Activator.CreateInstance(EntityInfo.Type);
            EntityInfo.Key.SetValue(e, lookupType.TypeId, null);
            e.Name = lookupType.Name;
            dynamic set = EntityInfo.Set.GetValue(ObjectContext, null);
            set.AddObject(e);
        }

        public void UpdateLookupType(LookupType currentLookupType)
        {
            var set = (IEnumerable<EntityObject>)EntityInfo.Set.GetValue(ObjectContext, null);
            dynamic modified = set.FirstOrDefault(t => (int)EntityInfo.Key.GetValue(t, null) == currentLookupType.TypeId);
            modified.Name = currentLookupType.Name;
        }

        public void DeleteLookupType(LookupType lookupType)
        {
            var set = (IEnumerable<EntityObject>)EntityInfo.Set.GetValue(ObjectContext, null);
            var e = set.FirstOrDefault(t => (int)EntityInfo.Key.GetValue(t, null) == lookupType.TypeId);
            Debug.Assert(e.EntityState != EntityState.Detached, "Entity was in a detached state.");
            ObjectContext.ObjectStateManager.ChangeObjectState(e, EntityState.Deleted);
        }
    }
}
When I add an item to the list from the running screen, save it, then edit it and resave, I receive the data conflict "Another user has deleted this record."
I can work around this by reloading the query after save, but it's awkward.
If I remove an item, save, then re-add an item with the same name and save, I get "Unable to save data. The context is already tracking a different entity with the same resource Uri."
Both of these problems affect only my generic screen using WCF RIA Services. When I build a ListDetail screen for a specific database entity, there are no problems. It seems I'm missing some logic; any ideas?
I've learned that this is the wrong approach to using LightSwitch.
There are several behind-the-scenes things this generic screen won't fully emulate, and it may not be doable without quite a bit of work. The errors I've received are just one example; LightSwitch's built-in conflict resolution will also fail.
LightSwitch's RAD design means just creating a bunch of similar screens, with some shared methods, is the way to go. If the actual layout needs to change across many identical screens at once, you can always find and replace in the .lsml files, provided you're careful and make backups first. Note that modifying these files directly isn't supported.
I got that error recently. In my case I create a unique ID in my WCF RIA service, but in my screen code-behind I must explicitly set a unique ID when I create the object that will later be passed to the WCF RIA service's insert method (this value is then overwritten by the identity counter in the underlying database table).
See the sample code for this project:
http://lightswitchhelpwebsite.com/Blog/tabid/61/EntryId/157/A-Visual-Studio-LightSwitch-Picture-File-Manager.aspx
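To illustrate the workaround described above, a rough sketch of screen code-behind that assigns a temporary unique TypeId when creating the entity (the collection name LookupTypes and the command method name are assumptions; the RIA service insert or the database identity overwrites the value on save):
partial void AddLookupType_Execute()
{
    // Create the new entity through the screen's collection (collection name is an assumption).
    var newType = this.LookupTypes.AddNew();

    // Give the client-side change tracker a unique key up front so two unsaved rows
    // never share the same resource URI; the real ID is generated on the server.
    newType.TypeId = -(int)(DateTime.Now.Ticks % int.MaxValue);
    newType.Name = "New Type";
}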
