Can someone please give me a small sample of how to use the Storage class in LWUIT? I have tried implementing it by emulating the approach used in the Recipe Hands-on-Lab, but my application does not need multiple objects the way that sample does.
The Recipe sample lets the user keep adding entries, but all I want to do is add ONE entry of information.
Also, how do I retrieve the stored info?
// Initialize storage with a name, then write and read back a single object.
com.sun.lwuit.io.Storage.init("MobileApplication1");
if (com.sun.lwuit.io.Storage.isInitialized()) {
    // Store one value under the key "MobileApplication1"
    com.sun.lwuit.io.Storage.getInstance().writeObject("MobileApplication1", "My first string");
    // Read it back under the same key
    String myStr = (String) com.sun.lwuit.io.Storage.getInstance().readObject("MobileApplication1");
    System.out.println(myStr);
} else {
    System.out.println("Storage not initialized");
}
The above code creates a storage named 'MobileApplication1', writes the string 'My first string' under that key, and then reads it back.
You can use a J2ME record store (RMS) as well; in your case the record store would always hold only one record.
I have an index in the Azure Cognitive Search service. I'm writing a program to automate the upload of new data to this index, and I don't want to delete and re-create the index from scratch unnecessarily each time. Is there a way of comparing what is currently in the index with the data I am about to upload, without having to download that data first and compare it manually? I have been looking at the MS documentation and other articles but cannot see a way to do this comparison.
You can use the MergeOrUpload operation: if the document is not in the index it will be inserted, otherwise it will be updated.
Please make sure the IDs are the same, otherwise you'll end up always adding new items.
IndexAction.MergeOrUpload(
    new Customer()
    {
        Id = "....",
        UpdatedBy = new
        {
            Id = "..."
        }
    }
)
https://learn.microsoft.com/en-us/dotnet/api/microsoft.azure.search.models.indexactiontype?view=azure-dotnet
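Roughly, the whole upload could look like the sketch below using the older Microsoft.Azure.Search SDK (the one the link above documents). The service name, admin key, index name and the Customer model are placeholders for your own setup:

using System.Collections.Generic;
using Microsoft.Azure.Search;
using Microsoft.Azure.Search.Models;

public class Customer
{
    public string Id { get; set; }
    public string Name { get; set; }
}

public static class IndexUploader
{
    public static void Upload(IEnumerable<Customer> customers)
    {
        // Placeholders: replace with your search service name, admin key and index name.
        var serviceClient = new SearchServiceClient("your-service-name",
            new SearchCredentials("your-admin-api-key"));
        ISearchIndexClient indexClient = serviceClient.Indexes.GetClient("customers-index");

        // MergeOrUpload inserts documents whose key is new and updates documents
        // whose key (Id) already exists, so no manual comparison is needed.
        var batch = IndexBatch.MergeOrUpload(customers);
        indexClient.Documents.Index(batch);
    }
}

Because the operation is keyed on the document Id, re-running the same upload is effectively idempotent.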
I'm using Azure Application Insights as a logging tool and store log data with the following code:
private void SendTrace(LoggingEvent loggingEvent)
{
    loggingEvent.GetProperties();
    string message = "TestMessage";
    var trace = new TraceTelemetry(message)
    {
        SeverityLevel = SeverityLevel.Information
    };
    trace.Properties.Add("TestKey", "TestValue");
    var telemetryClient = new TelemetryClient();
    telemetryClient.Context.InstrumentationKey = this.InstrumentationKey;
    telemetryClient.Track(trace);
}
Everything works well. I can see the logged record in App Insights as well as in App Insights Analytics (in the traces table). My custom attributes are written to a dedicated App Insights section, customDimensions. For example, the above code adds a new attribute with key "TestKey" and value "TestValue" to the customDimensions section.
But when I try to write some big text (for example a JSON document with more than 15k characters), it still succeeds without any exceptions, but the text is cut off after a certain length. As a result, the custom attribute value in the customDimensions section is cropped too and contains only the first part of the document.
As I understand it, there is a restriction on the maximum text length that can be written to an App Insights custom attribute.
Does anyone know how I can work around this?
The message field has the highest allowed limit, 32,768 characters. For items in the properties collection, each value has a maximum length of 8,192 characters.
So you can try one of the following options:
Use the message field to the fullest by putting the big text there.
Split the data into multiple parts and add them to the properties collection separately,
e.g.:
trace.Properties.Add("key_part1", "Bigtext1_upto8192");
trace.Properties.Add("key_part2", "Bigtext2_upto8192");
Reference: https://github.com/MicrosoftDocs/azure-docs/blob/master/includes/application-insights-limits.md
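As a rough sketch of the second option (the SendBigTrace name and the 8192 chunk size below are my own, based on the documented per-value limit), you could split the document into property-sized chunks before tracking the trace:

using System;
using Microsoft.ApplicationInsights;
using Microsoft.ApplicationInsights.DataContracts;

public static class BigTraceSender
{
    private const int MaxPropertyLength = 8192;   // documented per-value limit

    public static void SendBigTrace(TelemetryClient client, string bigText)
    {
        var trace = new TraceTelemetry("Large document follows in customDimensions")
        {
            SeverityLevel = SeverityLevel.Information
        };

        // Slice the text into chunks that each fit within the per-property limit.
        int part = 1;
        for (int offset = 0; offset < bigText.Length; offset += MaxPropertyLength)
        {
            int length = Math.Min(MaxPropertyLength, bigText.Length - offset);
            trace.Properties.Add("document_part" + part++, bigText.Substring(offset, length));
        }

        client.TrackTrace(trace);
    }
}

Whoever reads the data back out of customDimensions then needs to concatenate document_part1, document_part2, and so on, in order.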
I have a continuous Azure WebJob that is running off of a QueueInput, generating a report, and outputting a file to a BlobOutput. This job will run for differing sets of data, each requiring a unique output file. (The number of inputs is guaranteed to scale significantly over time, so I cannot write a single job per input.) I would like to be able to run this off of a QueueInput, but I cannot find a way to set the output based on the QueueInput value, or any value except for a blob input name.
As an example, this is basically what I want to do, though it is invalid code and will fail.
public static void Job([QueueInput("inputqueue")] InputItem input,
                       [BlobOutput("fileoutput/{input.Name}")] Stream output)
{
    // job work here
}
I know I could do something similar if I used BlobInput instead of QueueInput, but I would prefer to use a queue for this job. Am I missing something or is generating a unique output from a QueueInput just not possible?
There are two alternatives:
Use IBinder to generate the blob name, like shown in these samples.
Have an autogenerated ID on the queue message object and bind the blob name to that property. See here (the BlobNameFromQueueMessage method) for how to bind a queue message property to a blob name.
Found the solution at Advanced bindings with the Windows Azure Web Jobs SDK via Curah's Complete List of Web Jobs Tutorials and Videos.
Quote for posterity:
One approach is to use the IBinder interface to bind the output blob and specify the name that equals the order id. The better and simpler approach (SimpleBatch) is to bind the blob name placeholder to the queue message properties:
public static void ProcessOrder(
    [QueueInput("orders")] Order newOrder,
    [BlobOutput("invoices/{OrderId}")] TextWriter invoice)
{
    // Code that creates the invoice
}
The {OrderId} placeholder in the blob name gets its value from the OrderId property of the newOrder object. For example, if newOrder is (as JSON) {"CustomerName":"Victor","OrderId":"abc42"}, then the output blob name is "invoices/abc42". The placeholder is case-sensitive.
So, you can reference individual properties from the QueueInput object in the BlobOutput string and they will be populated correctly.
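Applied to the original example, a sketch might look like the following. InputItem and its Name property are assumptions about what your queue message contains, and the using line reflects the pre-release WebJobs SDK that QueueInput/BlobOutput come from; note that the placeholder refers to the property name, not to the parameter:

using System.IO;
using Microsoft.WindowsAzure.Jobs; // pre-release WebJobs SDK namespace for QueueInput/BlobOutput

public class InputItem
{
    // The property whose value fills the {Name} placeholder in the blob path.
    public string Name { get; set; }
}

public class Functions
{
    public static void Job(
        [QueueInput("inputqueue")] InputItem input,
        [BlobOutput("fileoutput/{Name}")] Stream output)
    {
        // job work here: write this input's report to the output blob
    }
}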
As the question suggests, I need to find out how to access entity data that has been passed into a JavaScript function via a lookup.
JavaScript Code Follows:
// function to generate the correct Weighting Value when these parameters change
function TypeAffectedOrRegionAffected_OnChanged(ExecutionContext, Type, Region, Weighting, Potential) {
    var type = Xrm.Page.data.entity.attributes.get(Type).getValue();
    var region = Xrm.Page.data.entity.attributes.get(Region).getValue();

    // if we have values for both fields
    if (type != null && region != null) {
        // create the weighting variable
        var weighting = type[0].name.substring(4) + "-" + region;
        // recreate the Weighting Value
        Xrm.Page.data.entity.attributes.get(Weighting).setValue(weighting);
    }
}
As you can see in the following line, using the name property I can access my Type entity's Type field.
// create the weighting variable
var weighting = type[0].name.substring(4) + "-" + region;
I am now looking for a way to access the other values stored inside my type object. It has the following fields: new_type, new_description, new_value and new_kind.
I guess I'm looking for something like this:
// use value of entity to assign to our form field
Xrm.Page.data.entity.attributes.get(Potential).setValue(type[0].getAttribute("new_value"));
Thanks in advance for any help.
Regards,
Comic
REST OData calls are definitely the way to go in this case. You already have the id, and you just need to retrieve some additional values. Here is a sample to get you started. The hardest part of working with OData, IMHO, is creating the request URLs. There are a couple of tools you can find on CodePlex, but my favorite is actually LINQPad. Just connect to your org's OData URL and it will retrieve all of your entities and let you write a LINQ statement that generates the URL for you, which you can test right in the browser.
For your instance, it will look something like this (it is case-sensitive, so double-check that if it doesn't work):
"OdataRestURL/TypeSet(guid'" + type[0].id.replace(/{/gi, "").replace(/}/gi, "") + "')?$select=new_type,new_description,new_value,new_kind"
Replace OdataRestURL with whatever your odata rest endpoint is, and you should be all set.
Yes, Guido Preite is right. You need to retrieve the entity by the id that comes from the lookup, using a REST call (synchronous or asynchronous), and then work with the returned JSON object. To keep the returned object light, you can specify which fields should come back as part of the JSON. You can then access the fields you need.
I am trying to write a plugin that triggers when an account is created. If there is an originating lead, I want to fetch the company name from the lead and put it in the account name field. What I'm not sure how to do is get that information out of the lead entity.
I have the following code (I'll keep updating this)...
Entity member = service.Retrieve("lead",
    ((EntityReference)account["originatingleadid"]).Id, new ColumnSet(true));

if (member.Attributes.Contains("companyname"))
{
    companyName = member.Attributes["companyname"].ToString();
}

if (context.PostEntityImages.Contains("AccountPostImage") &&
    context.PostEntityImages["AccountPostImage"] is Entity)
{
    accountPostImage = (Entity)context.PostEntityImages["AccountPostImage"];
    companyName = "This is a test";
    if (companyName != String.Empty)
    {
        accountPostImage.Attributes["name"] = companyName;
        service.Update(account);
    }
}
I'm not going to spoil the fun for you just yet, but the general idea is to:
Catch the Create message.
Extract the guid from your Entity (that's your created account).
Obtain the guid from its EntityReference (that's your lead).
Read the appropriate field from it.
Update the name field in your account.
Store the information.
Which of the steps is giving you issues? :)
As always, I recommend using query expressions over FetchXML. YMMV.
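For reference, a QueryExpression that pulls the lead's company name might look roughly like this (just a sketch; leadId is assumed to be the Guid taken from the account's originatingleadid lookup):

using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

public static class LeadQueries
{
    public static string GetCompanyName(IOrganizationService service, Guid leadId)
    {
        // Ask only for the column we need, filtered to the one lead we care about.
        var query = new QueryExpression("lead")
        {
            ColumnSet = new ColumnSet("companyname")
        };
        query.Criteria.AddCondition("leadid", ConditionOperator.Equal, leadId);

        EntityCollection results = service.RetrieveMultiple(query);
        return results.Entities.Count > 0
            ? results.Entities[0].GetAttributeValue<string>("companyname")
            : null;
    }
}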
Is the lead connected to the account? Just use the IOrganizationService.Retrieve method
to retrieve the correct lead (assuming you have the lead id from the account entity).
Create the organization service in the Execute method of your plugin.
http://msdn.microsoft.com/en-us/library/gg334504.aspx
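Putting the pieces together, a minimal sketch of such a plugin might look like this (entity and field names follow the question; it assumes the plugin is registered on the post-operation stage of Create for account):

using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

public class SetAccountNameFromLead : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        var factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
        IOrganizationService service = factory.CreateOrganizationService(context.UserId);

        // The created account is the Target of the Create message.
        if (!context.InputParameters.Contains("Target") ||
            !(context.InputParameters["Target"] is Entity account))
        {
            return;
        }

        if (!account.Contains("originatingleadid"))
        {
            return;
        }

        // Retrieve only the company name from the originating lead.
        var leadRef = (EntityReference)account["originatingleadid"];
        Entity lead = service.Retrieve("lead", leadRef.Id, new ColumnSet("companyname"));

        if (lead.Contains("companyname"))
        {
            // Write the lead's company name into the account's name field.
            var update = new Entity("account") { Id = account.Id };
            update["name"] = lead.GetAttributeValue<string>("companyname");
            service.Update(update);
        }
    }
}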
Also, here is a nice example of how to write the plugin:
http://mscrmkb.blogspot.co.il/2010/11/develop-your-first-plugin-in-crm-2011.html?m=1