I am getting the following error while copying more than 17000 documents to a folder:
Exception occurred calling method NotesDocumentCollection.putAllInFolder(string)
This is my code:
docColl = database.search(formula);
getComponent("TempName").setValue(docColl.getCount());
docColl.putAllInFolder("f_Statistics");
If I move fewer than 17000 documents it works, so the problem has nothing to do with the number of documents in the view.
How can I solve this problem?
Perhaps you can use a loop and a try...catch to handle the error. I'm not sure about the exact syntax you would need for XPages, but it could be something like this:
docColl = database.search(formula);
exceptionCaught = true; // little white lie so the loop runs at least once
while (exceptionCaught)
{
    getComponent("TempName").setValue(docColl.getCount());
    exceptionCaught = false;
    try
    {
        docColl.putAllInFolder("f_Statistics");
    }
    catch (e)
    {
        // It blew up; assume that this means there were too many docs.
        // Drop the docs that already made it into the folder and retry with the rest.
        var folder = database.getView("f_Statistics");
        docColl.subtract(folder.getAllEntries());
        exceptionCaught = true;
    }
}
Yes, it's a hack.
And no... the above is not tested, or even syntax checked. I'm just throwing out the idea.
If you try this, I strongly recommend that you do some additional checking to make sure that the cause of the exception really is the number of docs, because if any other exception occurs, the above code will most likely be an infinite loop!
How can I solve this problem?
Copy fewer documents? Why not split it up into multiple moves if you are having problems when the number of documents exceeds a certain number? A rough sketch follows.
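For example, something along these lines in server-side JavaScript (an untested sketch: the batch size of 5000 is an arbitrary guess, and database.createDocumentCollection() assumes a reasonably recent Notes/Domino release):

var batchSize = 5000; // arbitrary; tune to whatever stays under the limit
var batch = database.createDocumentCollection();
var doc = docColl.getFirstDocument();
while (doc != null)
{
    batch.addDocument(doc);
    if (batch.getCount() >= batchSize)
    {
        batch.putAllInFolder("f_Statistics"); // move this chunk
        batch = database.createDocumentCollection(); // start the next batch
    }
    doc = docColl.getNextDocument(doc);
}
if (batch.getCount() > 0)
{
    batch.putAllInFolder("f_Statistics"); // move the remainder
}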
@FetchRequest(
entity: Client.entity(),
sortDescriptors: [])
private var clients: FetchedResults<Client>
(...)
var searchResults: FetchedResults<Client> {
if searchText.isEmpty {
return clients
} else {
return clients.filter({$0.name!.contains(searchText)}) // Error here!
}
}
(...)
ForEach(searchResults, id: \.self) { client in
(...)
Error
Cannot convert return expression of type '[FetchedResults<Client>.Element]' (aka 'Array<Client>') to return type 'FetchedResults<Client>'
Hi,
I'm not sure how my logic is wrong. Could someone please tell me how to fix searchResults?
Also, is this the most efficient way to filter results, or should I filter in the ForEach()? It seems pretty slow when I put the filter in ForEach().
While I know you already have a code fix, and it is the correct one, I wanted to answer the question for posterity, as I have run into this myself. The logic is wrong simply because your searchResults var is of type FetchedResults<Client>, but filter returns an Array. So the error message is telling you exactly the problem: you are trying to return a [Client] as a FetchedResults<Client>, so you have a type mismatch.
You have two solutions:
You can filter in the fetch request, which is how you solved it per #Larme's suggestion. This is especially helpful if you only need the filtered results in the UI and/or you have a lot of results.
You can filter when you use your fetched results, as sketched below. This is useful when you want your user to determine what is filtered out by their own selections, so you don't know ahead of time what filtering they will want, or when, regardless of the filtering you are doing, you may need the whole FetchRequest later.
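A minimal sketch of the second approach, assuming the Client entity with an optional name attribute from the question: declare searchResults as an Array instead of FetchedResults, so the filter's return type matches.

var searchResults: [Client] {
    if searchText.isEmpty {
        return Array(clients) // FetchedResults is a Collection, so this conversion works
    } else {
        return clients.filter { $0.name?.contains(searchText) ?? false }
    }
}

ForEach(searchResults, id: \.self) works unchanged. For the first approach you would instead put an NSPredicate on the fetch request itself, so the filtering happens in the store.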
I have an HTML table which I am using to display data from a database table. The database table contains more than 50000 rows, and binding it throws an "Out of Memory" exception. If I use a where clause and select a few rows it works, but that is not how it should work. Please help. Thanks!
Here is my code.
protected void PopulatePage()
{
    try
    {
        RepeaterBankStatement.DataSource = _dc.SelectBankStatement().ToList();
        RepeaterBankStatement.DataBind(); // Out of Memory exception is thrown here
    }
    catch (Exception ex)
    {
        // Note: this empty catch silently swallows the exception.
    }
}
This code works fine if I use a where clause, but I don't want to use one; I want it to populate all the records.
protected void PopulatePage()
{
    try
    {
        // Works, but only because the Where() clause shrinks the result set;
        // ToList() runs before Take(), so everything matching the filter is
        // still pulled into memory first.
        RepeaterBankStatement.DataSource = _dc.SelectBankStatement()
            .Where(m => m.Description3 == "12345678524")
            .ToList()
            .Take(50000);
        RepeaterBankStatement.DataBind();
    }
    catch (Exception ex)
    {
    }
}
You can try to minimize memory consumption by doing a select new {} with only the data you actually need. It will also eliminate the change-tracking overhead. This might reduce it just enough.
From your question it is not really clear whether the repeater really shows everything at once; otherwise you can implement paging. A sketch of both ideas is below.
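For example (a rough sketch, not a drop-in fix: it assumes SelectBankStatement() returns an IQueryable that Skip/Take can be translated for, and the Id column, pageIndex, and pageSize are made up for illustration):

protected void PopulatePage(int pageIndex, int pageSize)
{
    // Project only the columns the repeater actually binds to (anonymous type,
    // no change tracking) and fetch one page at a time instead of all 50000 rows.
    RepeaterBankStatement.DataSource = _dc.SelectBankStatement()
        .OrderBy(m => m.Id) // paging needs a stable order; Id is assumed here
        .Select(m => new { m.Id, m.Description3 })
        .Skip(pageIndex * pageSize)
        .Take(pageSize)
        .ToList();
    RepeaterBankStatement.DataBind();
}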
In my current error handling, I specify the form, function, and error message in a message box similar to the following.
try
{
//Some code here
}
catch(Exception ex)
{
MessageBox.Show("Form Title : " + this.Title + "\nFunction : CurrentFunction \nError : " + ex.Message);
return;
}
This works for me, but I was curious whether I can make the process even simpler and generate the function name, instead of typing it out every time I want to display the error message.
Additionally:
I know you can include the stack trace and view the top few lines, but I was curious whether there is a cleaner way to show the function.
Yes, if you just need the current function (not the calling function), you can use MethodBase.GetCurrentMethod:
string currentMethod = System.Reflection.MethodBase.GetCurrentMethod().Name;
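Wired into the handler from your question, it would look something like this (a sketch; this.Title is taken from your original code):

try
{
    // Some code here
}
catch (Exception ex)
{
    // Resolves to the name of the method this handler lives in.
    string currentMethod = System.Reflection.MethodBase.GetCurrentMethod().Name;
    MessageBox.Show("Form Title : " + this.Title +
                    "\nFunction : " + currentMethod +
                    "\nError : " + ex.Message);
    return;
}

Note that inside lambdas, iterators, or async methods you get the compiler-generated method name back instead, so the stack-trace approach you mentioned can still be useful there.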
I've implemented the IAlertUpdateHandler interface in a class and used it for handling the creation and updating of alerts. The code fires, but it goes into an endless loop by calling itself again and again.
Actually, I want to suppress the email notification, so I'm calling a.Update(false); but this again calls the PreUpdate or PostUpdate method, and there is a
StackOverflowException :(
I've tried returning true/false from both methods, but nothing is helping.
According to the documentation, you're supposed to just return true or false. Try commenting out the Update call and just return either true or false. If you call Update, you'll just go into the recursive loop and your return statement will never get evaluated.
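A minimal sketch of that suggestion (the signature is the one from the answer further down; whether returning false alone suppresses the notification in your scenario is something to verify):

public bool PreUpdate(SPAlert a, SPWeb web, bool newAlert, string properties)
{
    // Decide here and return; calling a.Update() from inside the handler
    // is what re-enters PreUpdate and blows the stack.
    if (newAlert)
    {
        return false; // cancel this update outright
    }
    return true; // let the update proceed as normal
}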
I know that it might be a bit late for this, but I thought I'd put it here anyway to help the next one.
I found a solution/hack for this problem.
I believe that when the alert is created from the UI, the system fires an SPAlert.Update(), so what I came up with is to do something similar but ignore the update called from the UI. To do that, I added a custom property to the SPAlert property bag.
public bool PreUpdate(SPAlert a, SPWeb web, bool newAlert, string properties)
{
    if (CHECK_IF_SUPPRESSING_EMAIL && !a.Properties.ContainsKey("CustomUpdate"))
    {
        // Add a property to identify this update (the key can be called anything :)
        a.Properties.Add("CustomUpdate", "");
        a.Update(false);
        // Return false to ignore the update sent by the UI
        return false;
    }
    else
    {
        // No changes here; proceed with the custom behaviour
        return true;
    }
}
I have tested it and it seems to do the trick. Hope this helps someone :)
I've created a snippet that pulls data from a database table and displays it in tabular format. The snippet takes an id as a parameter, and this is added to the SQL query.
My problem is that if I have more than one snippet call on the same page (sometimes I need the tabular data for different ids displayed on one page), all the table data is the same as the last database call made by the last snippet.
What do I need to do to stop the snippet database calls from being cached, so that each call displays its own content?
I've tried setting the page to non-cacheable. I also used the [! !] brackets for the snippet calls, and even used the function_exists() method, but none of them helped.
Please can someone help me?
Thanks
Try this at the end of the snippet:
mysql_connect('host', 'user', 'pass');
mysql_select_db('db_name');
You need to specify the connection parameters, of course.
It would help to answer if you could post your snippet. I do this with multiple calls on the page without issue, so either there is something wrong inside the snippet, or you need to output to unique placeholder names.
You have encountered a glitch in ModX, and it took me a long time to solve. ModX does a lot of caching by using hashing, and apparently, when multiple connections are made from within one page divided over multiple snippets, this erratic behaviour can be seen. This is most likely very unwanted behaviour; it can be solved easily, but it gives you a terrible headache otherwise.
One symptom is that $modx->getObject($classname, $id) returns null (often).
The solution is very simple:
either use a static class with a single db instance, or
use $modx->setPlaceholder($instance, $tag);, or a combination.
My solution has been:
class dt__xpdo {
    private function __construct() {}

    public function __destruct() {
        $this->close();
    }

    static public function db($modx = null) {
        if ($modx->getPlaceholder('dt_xpdo') == '') {
            $dt_user = 'xxxxxxxxx';
            $dt_pw = 'xxxxxxxxx';
            $dt_host = 'localhost';
            $dt_dbname = 'xxxxxxxxx';
            $dt_port = '3306';
            $dt_dsn = "mysql:host=$dt_host;dbname=$dt_dbname;port=$dt_port;charset=utf8";
            $dt_xpdo = new xPDO($dt_dsn, $dt_user, $dt_pw);
            $dt_xpdo->setPackage('mymodel', MODX_CORE_PATH.'components/mymodel/'.'model/', '');
            //$modx->log(modX::LOG_LEVEL_DEBUG, 'mymodel.config.php');
            //$modx->log(modX::LOG_LEVEL_DEBUG, 'Could not addPackage for mymodel!');
            $modx->setPlaceholder('dt_xpdo', $dt_xpdo);
        }
        return $modx->getPlaceholder('dt_xpdo');
    }
}
Now you can use this in your code:
require_once 'above.php';
and use something like
$xpdo = dt__xpdo::db($modx);
and continue flawlessly!