How to call Page.ClientScript.RegisterStartupScript From foreach for each iteration - c#-4.0

How do I call Page.ClientScript.RegisterStartupScript from a foreach loop so the script runs for each iteration? It looks like all the registered scripts run together once at the end:
int saveImageCount = 0;
foreach (DataRow Stdrow in key.ColumnValues)
{
    saveImageCount++;
    Page.ClientScript.RegisterStartupScript(GetType(), "SaveImage" + saveImageCount, "javascript:SaveImage();", true);
}
I am unable to get SaveImage to run for each iteration.

They will all have the same type, so they'll override each other and only the last one will "win".
Instead, build up a single string inside the loop and then register it once, outside of the foreach.
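For example, a minimal sketch of that approach (assuming SaveImage() is the existing client-side function) might look like this:
var script = new System.Text.StringBuilder();
foreach (DataRow Stdrow in key.ColumnValues)
{
    // Queue one client-side call per row, then register the whole script once.
    script.Append("SaveImage();");
}
Page.ClientScript.RegisterStartupScript(GetType(), "SaveImageAll", script.ToString(), true);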

@Jonathan, I think you are slightly mistaken; it's the Type and the key together that make the combination unique, not the type alone.
So @brijesh is 95% right, but he should use RegisterClientScriptBlock, as it will be registered based on the event rather than on page load.
int saveImageCount = 0;
foreach (DataRow Stdrow in key.ColumnValues)
{
    saveImageCount++;
    ClientScript.RegisterClientScriptBlock(this.GetType(), "script" + saveImageCount, "alert('data has been added successfully');", true);
}
GetType() and "script" + saveImageCount make a unique type/key combination, so your script will be registered and run for each iteration.

Related

nodejs loop in async code

I'm sending user information to the user when they sign in.
var userstuff00 = findSub(user['id']);
userstuff00.then(function(sub){
    for(var i in sub){
        var userstuff01 = findSub(sub[i.toString()]['id']);
        userstuff01.then(function(sub2){
            for(var i2 in sub2){
                //here i is the last object in sub but i2 is for the first i
                console.log(sub[i.toString()]);
                console.log(sub2[i2.toString()]);
            }
        });
    }
});
and this is the findSub function, where usercollection is a MongoDB collection:
function findSub(stuffCode){
    var tempMembers = usercollection.find({'stuff ': stuffCode});
    return tempMembers;
}
this is user object :
user{
    id,
    name,
    stuff,
    subMember[] // list of users whose stuff equals this user's id
}
I want to add every sub2 to that user's subMember, but i's id is not equal to i2's stuff (I can't add i2 into another i's subMember).
How can I get the submembers for the first i and then find the submembers for the second i?
My goal is a list of users that has users as submembers, where these users (submembers, level 1) have their own submembers (level 2), and so on, up to level 10:
Family Tree
This is a typical issue when working with asynchronous code: the synchronous code will finish before any of the asynchronous code. So this loop:
for(var i in sub)
... will finish before any of the then callbacks inside it are executed. So by the time one of those callbacks runs, i already holds the last key of sub.
To be able to use the value of i as it was when you called then (not when its callback runs), there are different solutions. One is to use let i instead of var i, as let defines a new variable on each iteration of the for loop. Another is to bind the value of i as an argument to the then callback:
userstuff01.then(function(i, sub2){
//                        ^^^ added parameter
    for(var i2 in sub2){
        // i is now the value that was bound when then() was called
        console.log(sub[i.toString()]);
        console.log(sub2[i2.toString()]);
    }
}.bind(null, i));
// ^^^^^^^^^^^^ bind the parameter value to the current value of i.

C# error Collection was modified; enumeration operation might not execute

Below is the code I am calling, and I am getting the exception below:
Collection was modified; enumeration operation might not execute.
In this code I am checking whether the dataset contains the tb_error table name and then checking the row count.
If the row count > 1, insert into the DB.
After that I want to clear that table, and then I need to clear the other view as well.
Please help me figure out where to modify my code.
if (MainClass.OutputDataset.Tables.Contains(tb_error.TableName))
{
    foreach (DataRow drErr in MainClass.OutputDataset.Tables[tb_error.TableName].Rows)
    {
        //insert into DB
    }
}
if (MainClass.OutputDataset != null && MainClass.OutputDataset.Tables["tb_error"].Rows.Count > 0)
{
    MainClass.OutputDataset.Tables["tb_error"].Clear();
}
MainClass.dsinput.Tables.Remove("BSData_VW");
}
This happens because the underlying collection has since had items added or removed, which invalidates the loop.
You can get around this by taking a snapshot, e.g.:
foreach (DataRow drErr in MainClass.OutputDataset.Tables[tb_error.TableName].Rows.Cast<DataRow>().ToList())
{
    //insert into DB
}
The key is the .Cast<DataRow>().ToList() call at the end (DataRowCollection only implements the non-generic IEnumerable, so the cast is needed for LINQ): the foreach then operates on a snapshot of the rows taken at that point in time.
When you get an error like that, you pretty much have to abandon using foreach and come up with some other looping mechanism. You can try using a for statement or rolling your own with a variable and a while statement.
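For instance, a rough index-based sketch (using the same table lookup as the question) could look like this:
var rows = MainClass.OutputDataset.Tables[tb_error.TableName].Rows;
// Iterate by index, walking backwards so removing or adding rows
// mid-loop cannot invalidate an enumerator.
for (int i = rows.Count - 1; i >= 0; i--)
{
    DataRow drErr = rows[i];
    //insert into DB
}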

remove previously added and() clause from Where com.datastax.driver.core.querybuilder.Select.Where

for (int i = 0; i < mycolumns.length; i++)
{
    where.and(QueryBuilder.eq(COLNAME, mycolumns[i]));
    //how to remove the above and() call
}
In every iteration of the loop, I want to execute the query and then substitute the next value in the next iteration.
I'm not completely clear on what you are trying to accomplish. I am guessing that you are trying to update multiple rows sharing a primary key, updating 1 row at a time?
Unfortunately this isn't possible: when you call where.and you are adding data to the Where object, and it returns a reference to that same Where object.
In short, Where is not immutable and neither is the Statement it belongs to, so you won't get a new copy every time you call it, rather you get an updated version of the Where object.
What you could do is generate your Statement again (whether via QueryBuilder.update, delete, or insert) inside the loop, like:
for (int i = 0; i < mycolumns.length; i++) {
    Statement stmt = QueryBuilder.update("tableName").where(eq("key", 1)).and(QueryBuilder.eq(COLNAME, mycolumns[i]));
    session.execute(stmt);
}

CRM PlugIn Pass Variable Flag to New Execution Pipeline

I have records that have an index attribute to maintain their position in relation to each other.
I have a plugin that performs a renumbering operation on these records when the index is changed or new one created. There are specific rules that apply to items that are at the first and last position in the list.
If a new (or changed existing) item is inserted into the middle (not technically the middle, just somewhere between the start and the end) of the list, a renumbering kicks off to make room for the record.
This renumbering process fires in a new execution pipeline. Say we are updating record D: when I tell record E to change (to make room for D), that of course fires the plugin on the Update message.
This renumbering is fine until we reach the end of the list where the plugin then gets into a loop with the first business rule that maintains the first and last record differently.
So I am trying to think of ways to pass a flag to the execution context spawned by the renumbering process so the recursion skips the boundary edge business rules if IsRenumbering == true.
My thoughts / ideas:
I have thought of using a Depth > 1 check, but that isn't a reliable value as I can't explicitly turn it on or off. It may happen to work, but that is not engineering a solid solution; it is hoping nothing goes bump. Furthermore, a colleague far more knowledgeable than I am said that when a workflow calls a plugin the depth value is off and can't be trusted.
All my variables are scoped at the Execute level so as to avoid variable pollution at the class level. However, if I had a dictionary, tuple, or something similar at the class level, with one value being the thread id and the other the flag value, then perhaps my subsequent execution context could check whether the same owning thread id had any values entered.
Any thoughts or other ideas on how to pass context information to a new pipeline would be greatly appreciated.
Per Nicknow's suggestion I tried SharedVariables, but they seem to be going out of scope:
First time firing, post-operation:
if (base.Stage == EXrmPluginStepStage.PostOperation)
{
    ...snip...
    foreach (var item in RenumberSet)
    {
        Context.ParentContext.SharedVariables[recordrenumbering] = "googly";
        Entity renumrec = new Entity("abcd") { Id = item.Id };
        #region We either add or subtract indexes based upon sortdir
        ...snip...
        renumrec["abc_indexfield"] = TmpIdx + 1;
        break;
        .....snip.....
        #endregion
        OrganizationService.Update(renumrec);
    }
}
Now we come into the pre-operation stage of the recursion kicked off by the post-operation OrganizationService.Update(renumrec) above, and based upon this check it seems the shared variable didn't carry over:
if (!Context.SharedVariables.Contains(recordrenumbering))
{
    //Trace.Trace("Null Set");
    //Context.SharedVariables[recordrenumbering] = IsRenumbering;
    Context.SharedVariables[recordrenumbering] = "Null Set";
}
Throwing an InvalidPluginExecutionException reveals:
Sanity Checks:
Depth : 2
Entity: ...
Message: Update
Stage: PreOperation [20]
User: 065507fe-86df-e311-95fe-00155d050605
Initiating User: 065507fe-86df-e311-95fe-00155d050605
ContextEntityName: ....
ContextParentEntityName: ....
....
IsRenumbering: Null Set
What you are looking for is IExecutionContext.SharedVariables. Whatever you add here is available throughout the entire transaction. Since you'll have child pipelines, you'll want to look at the ParentContext for the value. This can all get a little tricky, so be sure to do a lot of testing - I've run into many issues with SharedVariables and looping operations in Dynamics CRM.
Here is some sample (very untested) code to get you started.
public static bool GetIsRenumbering(IPluginExecutionContext pluginContext)
{
    var keyName = "IsRenumbering";
    var ctx = pluginContext;
    // Walk up the chain of parent contexts until the flag is found.
    while (ctx != null)
    {
        if (ctx.SharedVariables.Contains(keyName))
        {
            return (bool)ctx.SharedVariables[keyName];
        }
        ctx = ctx.ParentContext;
    }
    return false;
}

public static void SetIsRenumbering(IPluginExecutionContext pluginContext)
{
    var keyName = "IsRenumbering";
    var ctx = pluginContext;
    ctx.SharedVariables.Add(keyName, true);
}
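For illustration, a hypothetical usage sketch combining these helpers with the renumbering loop from the question (Context and OrganizationService are assumed to be the question's own members):
// In the parent pipeline, flag the context before triggering the child update.
SetIsRenumbering(Context);
OrganizationService.Update(renumrec);

// In the plugin execution spawned by that Update, check the flag
// (GetIsRenumbering walks up ParentContext) and skip the boundary rules.
if (GetIsRenumbering(Context))
{
    return;
}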
A very simple solution: add a bit field to the entity called "DisableIndexRecalculation." When your first plugin runs, make sure to set that field to true for all of your updates. In the same plugin, check to see if "DisableIndexRecalculation" is set to true: if so, set it to null (by removing it from the TargetEntity entirely) and stop executing the plugin. If it is null, do your index recalculation.
Because you are immediately removing the field from the TargetEntity if it is true, the value will never be persisted to the database, so there will be no performance penalty.
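A rough sketch of that check near the top of the plugin (the attribute name new_disableindexrecalculation is hypothetical, as is the surrounding plumbing):
// Hypothetical sketch of the "DisableIndexRecalculation" bit-field approach.
var target = (Entity)context.InputParameters["Target"];
if (target.GetAttributeValue<bool?>("new_disableindexrecalculation") == true)
{
    // Remove the flag so it is never persisted, then skip the recalculation.
    target.Attributes.Remove("new_disableindexrecalculation");
    return;
}
// Otherwise perform the index recalculation, setting the flag to true on every
// update issued by the renumbering loop so the nested pipeline short-circuits.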

How to maintain counters with LinqToObjects?

I have the following C# code:
private XElement BuildXmlBlob(string id, Part part, out int counter)
{
    // return some unique xml particular to the parameters passed
    // remember to increment the counter also before returning.
}
Which is called by:
var counter = 0;
result.AddRange(from rec in listOfRecordings
                from par in rec.Parts
                let id = GetId("mods", rec.CKey + par.UniqueId)
                select BuildXmlBlob(id, par, counter));
Above code samples are symbolic of what I am trying to achieve.
According to Eric Lippert, the out keyword and LINQ do not mix. Fair enough, but can someone help me refactor the above so it does work? A colleague at work mentioned accumulators and aggregate functions, but I am a novice to LINQ and my Google searches weren't bearing any real fruit, so I thought I would ask here :).
To Clarify:
I am counting the number of parts, and there could be any number of them each time the code is called. Every time the BuildXmlBlob() method is called, the resulting XML will have a unique element in it denoting the 'partNumber'.
So if the counter is currently on 7, that means we are processing the 7th part so far. The XML returned from BuildXmlBlob() will have the counter value embedded in it somewhere; that's why I need it passed in and incremented every time BuildXmlBlob() is called per run-through.
If you want to keep this purely in LINQ and you need to maintain a running count for use within your queries, the cleanest way to do so is to make use of the Select() overload that includes the element's index in the projection.
In this case, it would be cleaner to do a query which collects the inputs first, then use the overload to do the projection.
var inputs =
    from recording in listOfRecordings
    from part in recording.Parts
    select new
    {
        Id = GetId("mods", recording.CKey + part.UniqueId),
        Part = part,
    };
result.AddRange(inputs.Select((x, i) => BuildXmlBlob(x.Id, x.Part, i)));
Then you wouldn't need to use the out/ref parameter.
XElement BuildXmlBlob(string id, Part part, int counter)
{
    // implementation
}
Below is what I managed to figure out on my own:
result.AddRange(listOfRecordings.SelectMany(rec => rec.Parts, (rec, par) => new { rec, par })
    .Select(@t => new
    {
        @t,
        Id = GetStructMapItemId("mods", @t.rec.CKey + @t.par.UniqueId)
    })
    .Select((@t, i) => BuildPartsDmdSec(@t.Id, @t.@t.par, i)));
I used ReSharper to convert it into a method chain, which constructed the basics of what I needed, and then I simply tacked the Select statement on at the end.
