I have an object in a controller's method:
$post = ORM::factory('post', array('slug' => $slug));
that is sent to the view:
$this->template->content = View::factory('view')->bind('post', $post);
I've created a 1-n relation between posts and comments. So far so good.
The main issue is: how should I pass the post's comments to the view? Currently I'm getting them in the view ($post->comments->find_all()), but I don't feel that's the best method (and it doesn't match MVC standards, in my humble opinion). I was also thinking about assigning them to a property in the controller ($post->comments), but then I get an error about an undefined property (which makes sense to me).
How would you recommend to solve that issue?
Why not grab the comments in the controller and pass them to the view for presentation? At least that's how I'd go about this.
$post = ORM::factory('post', array('slug' => $slug));
$comments = $post->comments->find_all();
$this->template->content = View::factory('view')->bind('comments', $comments);
Regarding your comment about multiple pages, by which I assume you mean posts...
This is what I would usually do.
$posts = ORM::factory('post', array('slug' => $slug))->find_all();
$view = new View('view');
foreach ($posts as $post) {
    // Fetch this post's comments and render one view per post
    $view->comments = $post->comments->find_all();
    $this->template->content .= $view->render();
}
This may not be the most resource-friendly way to accomplish it, though, especially if you're working with many posts. So passing the posts to the view and then doing the foreach loop inside the view may be a better way to go about this.
$posts = ORM::factory('post', array('slug' => $slug))->find_all();
$this->template->content = View::factory('view')->bind('posts', $posts);
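The view side of that would then just loop over the posts and their comments. Something like this (only a sketch; the title and body columns and the HTML markup are assumptions, not taken from your code):
<?php foreach ($posts as $post): ?>
    <h2><?php echo HTML::chars($post->title); ?></h2>
    <?php // comments is the 1-n relation, so the related rows are fetched here ?>
    <?php foreach ($post->comments->find_all() as $comment): ?>
        <p><?php echo HTML::chars($comment->body); ?></p>
    <?php endforeach; ?>
<?php endforeach; ?>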
That said, I don't necessarily think running a select query from the view is the worst thing in the world. Though I'm no expert... ;)
I asked a similar question a while ago with regard to CodeIgniter... passing an array to the view and looping through it seemed to be the favored response:
Using CodeIgniter is it bad practice to load a view in a loop
I'm running into N+1 performance issues when iterating over a collection of ContentItems of a custom content type that I created solely through migrations.
ContentDefinitionManager.AlterPartDefinition("MyType", part => part
.WithField("MyField", field => field
...
)
);
ContentDefinitionManager.AlterTypeDefinition("MyType", type => type
.WithPart("MyType")
);
Every time I access a field of this part, a new query is performed. I can use QueryHints to avoid this for the predefined parts:
var myItems = _orchardServices.ContentManager.Query().ForType("MyType")
.WithQueryHints(new QueryHints().ExpandParts<LocalizationPart>()
...
);
but can I do this for the ContentPart of my custom type too? This does not seem to work:
var myItems = _orchardServices.ContentManager.Query().ForType("MyType")
.WithQueryHints(new QueryHints().ExpandParts<ContentPart>()
...
);
How can I tell Orchard to just get everything in one go? I'd prefer to be able to do this without writing my own HQL or directly addressing the repositories.
Example:
var myItems = _orchardServices.ContentManager.Query().ForType("MyType");
@foreach (var item in myItems.Take(100)) {
    foreach (var term in item.Content.MyItem.MyTaxonomyField.Terms) {
        // Executes 100 queries
        <div>@term.Name</div>
    }
}
TaxonomyField doesn't store the ids, and using the TaxonomyService inside the loop wouldn't improve performance. Right now, to work around this, I fetch all TermContentItems.Where(x => myItems.Select(i => i.Id).Contains(x.TermsPartRecord.Id)) from the repository outside of the loop, as well as a list of all the terms of the taxonomy that the field is using. Then, inside the loop:
// c is the content item currently being processed in the loop
var allTermsInThisField = termContentItems.Where(tci => tci.TermsPartRecord.Id == c.Id)
    .Select(tci => terms.Where(t => t.Id == tci.TermRecord.Id).Single()).ToList();
I'm not a very experienced programmer, but this was the only way I could see to do this without digging into HQL, and it seems overly complicated for my purposes. Can Orchard do this in fewer steps?
I have been searching for hours, but I cannot find anything about this.
Situation:
Backend, consisting of Node.js + Express + Mongoose (+ MongoDB, of course).
Frontend retrieves object from the Backend.
Frontend makes some changes (adds/updates/removes some attributes).
Now I use mongoose: PersonModel.findByIdAndUpdate(id, updatedPersonObject);
Result: added properties are added. Updated properties are updated. Removed properties... are still there!
Now I've been searching for an elegant way to solve this, but the best I could come up with is something like:
var properties = Object.keys(PersonModel.schema.paths);
for (var i = 0, len = properties.length; i < len; i++) {
    // explicitly remove values that are not in the update
    var property = properties[i];
    if (typeof(updatedPersonObject[property]) === 'undefined') {
        // Mongoose does not like it if I remove the _id property
        if (property !== '_id') {
            oldPersonDocument[property] = undefined;
        }
    }
}
oldPersonDocument.save(function() {
    PersonModel.findByIdAndUpdate(id, updatedPersonObject);
});
(I did not even include trivial code to fetch the old document).
I have to write this for every object I want to update. I find it hard to believe that this is the best way to handle it. Does anyone have any suggestions?
Edit:
Another workaround I found: to unset a value in MongoDB, you have to set it to undefined.
If I set the value to undefined in the frontend, it is lost in the REST call (undefined properties are dropped from the JSON). So I set it to null in the frontend, and then in the backend I convert all null values to undefined.
Still ugly though. There must be a better way.
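One alternative, if you stay with findByIdAndUpdate, is to translate the incoming nulls into an explicit $unset instead of undefined, which MongoDB understands directly. A rough sketch (assuming updatedPersonObject is the plain object from the REST call):
// Split the incoming object into fields to $set and fields to $unset
var update = { $set: {}, $unset: {} };
Object.keys(updatedPersonObject).forEach(function (key) {
    if (key === '_id') return; // never touch _id
    if (updatedPersonObject[key] === null) {
        update.$unset[key] = 1; // null from the frontend means "remove this field"
    } else {
        update.$set[key] = updatedPersonObject[key];
    }
});
// MongoDB rejects empty update operators, so drop them if unused
if (Object.keys(update.$unset).length === 0) delete update.$unset;
if (Object.keys(update.$set).length === 0) delete update.$set;
PersonModel.findByIdAndUpdate(id, update, function (err, doc) {
    // handle err / updated doc here
});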
You could use replaceOne() if you want to know how many documents matched your filter condition and how many were changed (I believe it only changes one document, so this may not be useful to know). Docs: https://mongoosejs.com/docs/api/model.html#model_Model.replaceOne
Or you could use findOneAndReplace() if you want to see the document. I don't know whether it is the old doc or the new doc that is passed to the callback; the docs say "Finds a matching document, replaces it with the provided doc, and passes the returned doc to the callback.", but you could test that on your own. Docs: https://mongoosejs.com/docs/api.html#model_Model.findOneAndReplace
So, instead of:
PersonModel.findByIdAndUpdate(id, updatedPersonObject);, you could do:
PersonModel.replaceOne({ _id: id }, updatedPersonObject);
As long as you have all the properties you want on the object you will use to replace the old doc, you should be good to go.
I'm also really struggling with this, but I don't think your solution is too bad. Our setup is frontend -> update function in the backend -> sanitize the user's input -> save in the db. For the sanitization part, we use a helper function that integrates your approach.
private static patchModel(dbDocToUpdate: IModel, dataFromUser: Record<string, any>): IModel {
    const sanitized: Record<string, any> = {};
    const properties = Object.keys(PersonModel.schema.paths);
    for (const key of properties) {
        if (key in dbDocToUpdate) {
            sanitized[key] = dataFromUser[key];
        }
    }
    Object.assign(dbDocToUpdate, sanitized);
    return dbDocToUpdate;
}
That works smoothly and sets the values to undefined. Hence, they get removed from the document in the db.
The only problem that remains for us is that we wanted to allow partial updates. With that solution that's not possible and you always have to send everything to the backend.
EDIT
Another workaround we found is setting the property to an empty string in the frontend. Mongo then also removes the property in the database.
I am building a CMS that supports a website which also contains an SMF forum (2.0.11). One of the modules in the CMS involves a "report" that tracks attendance. That data is stored in tables outside of SMF, but in the same database. In addition to what it does now, I would also like a post to be made in a specific board on the SMF forum containing the formatted content. Since all of the posts are stored in the database, surely this is possible, but it seems there is more to it than a single row in a table.
To put it in the simplest code possible, below is what I want to happen when I click Submit on my page.
$title = "2015-03-04 - Attendance";
$author = "KGrimes";
$body = "Attendance was good.";
$SQL = "INSERT INTO smf_some_table (title, author, body) VALUES ($title, $author, $body)";
$result = mysqli_query($db_handle, $SQL);
Having dug through the SMF DB tables and the post() and post2() functions, it seems there is more than one table involved when a post is made. Has anyone outlined this before?
I've looked into solutions such as the Custom Form Mod, but these forms and templates are not what I am looking for. I already have the data inputted and POST'ed to variables, I just need the right table(s) to INSERT it into so that it appears on the forum.
Thank you in advance for any help!
Source: http://www.simplemachines.org/community/index.php?topic=542521.0
Code where you want to create post:
//Define variables
//msgOptions
$smf_subject = "Test Title";
//Handle & escape
$smf_subject = htmlspecialchars($smf_subject);
$smf_subject = quote_smart($smf_subject, $db_handle);
$smf_body = "This is a test.";
//Handle & escape
$smf_body = htmlspecialchars($smf_body);
$smf_body = quote_smart($smf_body, $db_handle);
//topicOptions
$smf_board = 54; //Any board id, found in URL
//posterOptions
$smf_id = 6; //any id, this is found as ID in memberlist
//SMF Post function
require_once('../../forums/SSI.php');
require_once('../../forums/Sources/Subs-Post.php');
//createPost($msgOptions, $topicOptions, $posterOptions);
// Create a post, either as new topic (id_topic = 0) or in an existing one.
// The input parameters of this function assume:
// - Strings have been escaped.
// - Integers have been cast to integer.
// - Mandatory parameters are set.
// Collect all parameters for the creation or modification of a post.
$msgOptions = array(
    'subject' => $smf_subject,
    'body' => $smf_body,
    //'smileys_enabled' => !isset($_POST['ns']),
);
$topicOptions = array(
    //'id' => empty($topic) ? 0 : $topic,
    'board' => $smf_board,
    'mark_as_read' => true,
);
$posterOptions = array(
    'id' => $smf_id,
);
//Execute SMF post
createPost($msgOptions, $topicOptions, $posterOptions);
This will create the simplest of posts, with the title and body defined by you, along with the location and the author. More parameters for createPost can be found in the SMF function database. The SSI.php and Subs-Post.php includes must point to the original files directly; copying them over doesn't do the trick.
I have been working with soft deletes, and now I want to load only the navigation properties of my entity that are not "deleted". I have found a way, but it is not very clear to me; is there another way to do this?
Context.CreateSet<User>().Include("Salary").Select(u => new {User= u, Salary = u.Salarys.Where(s => !s.Deleted)}).AsQueryable().Select(a => a.User).AsQueryable();
Eager loading doesn't support filtering. Your code can be simplified to:
var users = Context.CreateSet<User>()
.Select(u => new {
User = u,
Salary = u.Salaries.Where(s => !s.Deleted)
})
.AsEnumerable()
.Select(a => a.User);
Include is not needed because you are replacing it with your own projection, and AsQueryable is not needed because the query is IQueryable the whole time until AsEnumerable is called, which switches to LINQ to Objects when selecting the users and their filtered salaries. EF will take care of correctly fixing up the navigation properties for you.
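After that, the filtered collections can be consumed like any other navigation property. A quick usage sketch (Name and Amount are placeholder property names, not from your model):
foreach (var user in users)
{
    // Thanks to relationship fix-up, user.Salaries now only contains non-deleted rows
    foreach (var salary in user.Salaries)
    {
        Console.WriteLine("{0}: {1}", user.Name, salary.Amount);
    }
}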
I've created a snippet that pulls data from a database table and displays it in tabular format. The snippet takes an id as a parameter, which is added to the SQL query.
My problem is that if I have more than one snippet call on the same page (sometimes I need the tabular data for different ids displayed on one page), all the table data is the same as the last database call made by the last snippet.
What do I need to do to stop the snippet database calls from being cached, so that each call displays its own content?
I've tried setting the page to non-cacheable. I also used the [! !] brackets for the snippet calls, and even used the function_exists() method, but none of them helped.
Please can someone help me?
thanks
Try this at the end of the snippet:
mysql_connect('host', 'user', 'pass');
mysql_select_db('db_name');
You need to specify the connection parameters, of course.
It would help to answer if you posted your snippet. I do this with multiple calls on a page without issue, so either something is wrong inside the snippet, or you need to output to unique placeholder names.
You have encountered a glitch in ModX, and it took me a long time to solve. ModX does a lot of caching using hashing, and apparently, when multiple connections are made from within one page spread over multiple snippets, this erratic behaviour appears. It is most likely very unwanted behaviour; it can be solved easily, but gives you a terrible headache otherwise.
One symptom is that $modx->getObject($classname, $id) returns null (often).
The solution is very simple:
either use a static class with a single db instance, or
use $modx->setPlaceholder($instance, $tag), or a combination of both.
My solution has been:
class dt__xpdo {
    private function __construct() {}

    public function __destruct() {
        $this->close();
    }

    static public function db($modx = null) {
        if ($modx->getPlaceholder('dt_xpdo') == '') {
            $dt_user = 'xxxxxxxxx';
            $dt_pw = 'xxxxxxxxx';
            $dt_host = 'localhost';
            $dt_dbname = 'xxxxxxxxx';
            $dt_port = '3306';
            $dt_dsn = "mysql:host=$dt_host;dbname=$dt_dbname;port=$dt_port;charset=utf8";
            $dt_xpdo = new xPDO($dt_dsn, $dt_user, $dt_pw);
            $dt_xpdo->setPackage('mymodel', MODX_CORE_PATH.'components/mymodel/'.'model/', '');
            //$modx->log(modX::LOG_LEVEL_DEBUG, 'mymodel.config.php');
            //$modx->log(modX::LOG_LEVEL_DEBUG, 'Could not addPackage for mymodel!');
            $modx->setPlaceholder('dt_xpdo', $dt_xpdo);
        }
        return $modx->getPlaceholder('dt_xpdo');
    }
}
Now you can include the file containing the class above in your code:
require_once 'above.php';
and then get the shared connection with
$xpdo = dt__xpdo::db($modx);
and continue flawlessly!