Multi insert in Kohana ORM 3

In my application I have a loop that executes about 1000 times; inside it I create an object and save it. This is the part of the application where I populate my database with data. In general it looks like this:
foreach (...) {
    ...
    try {
        $object = new Model_Whatever;
        $object->whatever = $whatever;
        $object->save();
    } catch (Exception $e) {
        ...
    }
}
This produces 1000 INSERT queries. Is there some way to make Kohana produce multi-row inserts, e.g. splitting this into 10 inserts with 100 rows each? Is it possible, and if so, how?

Whilst the Kohana ORM doesn't support multi inserts, you can still use the query builder as follows:
$query = DB::insert('tablename', array('column1', 'column2', 'column3'));

foreach ($data as $d) {
    $query->values($d);
}

try {
    $result = $query->execute();
} catch (Database_Exception $e) {
    echo $e->getMessage();
}
You'll still need to split the data up so the above doesn't try to execute a single query with 1000 rows.
$data is assumed to be an array of arrays, with the values in each inner array corresponding to the order of the columns.
thanks Isaiah in #kohana
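Since the query builder above takes one row per values() call, the splitting itself can be done with array_chunk(). A minimal self-contained sketch (the DB::insert() call is commented out so the snippet stands alone; the table and column names are illustrative):

```php
<?php
// Simulated data set: 1000 rows whose values follow the column order
// array('column1', 'column2', 'column3').
$data = array();
for ($i = 0; $i < 1000; $i++) {
    $data[] = array("a$i", "b$i", "c$i");
}

// Split into batches of 100 rows: 10 INSERT statements instead of 1000.
$batches = array_chunk($data, 100);

foreach ($batches as $batch) {
    // $query = DB::insert('tablename', array('column1', 'column2', 'column3'));
    // foreach ($batch as $row) {
    //     $query->values($row);
    // }
    // $query->execute();
}

echo count($batches), "\n";    // number of queries issued
echo count($batches[0]), "\n"; // rows per query
```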

PHP becomes very slow when inserting a very big multi-row array this way (because ::values() does an array_merge on every call), so this is faster:
class Database_Query_Builder_Batch_Insert extends Database_Query_Builder_Insert {

    public static function doExecute($table, $data)
    {
        $insertQuery = DB::insert($table, array_keys(current($data)));
        $insertQuery->_values = $data;
        $insertQuery->execute();
    }
}

Alternatively, you can pass all rows in one call instead of looping:
call_user_func_array([$query, 'values'], $data);
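To see why call_user_func_array([$query, 'values'], $data) works: it spreads each row of $data as a separate argument, so the variadic values() method receives every row in a single call. A standalone illustration with a stand-in object (FakeQuery is hypothetical, not Kohana's class):

```php
<?php
// Stand-in for the query builder: collects rows passed to values().
class FakeQuery
{
    public $rows = array();

    // Mirrors a variadic values(...) method that accepts many rows.
    public function values()
    {
        foreach (func_get_args() as $group) {
            $this->rows[] = $group;
        }
        return $this;
    }
}

$data = array(
    array('a1', 'b1'),
    array('a2', 'b2'),
    array('a3', 'b3'),
);

$query = new FakeQuery();

// One call passes every row, instead of calling values() in a loop.
call_user_func_array(array($query, 'values'), $data);

echo count($query->rows), "\n"; // 3
```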

Related

Laravel Excel Import Write into Batches of Excel file

I have a large number of records in an Excel sheet, and importing them into the 3 database tables takes too much time. To work around this, I am trying to import in batches by creating small Excel files with less data, so that I can run Laravel queue jobs on them. I am trying the code below, but it doesn't work: it throws an "Array to string conversion" error and the sub Excel files are not created. I am using Maatwebsite Excel, but I am handling it through a controller.
Can someone help me, please?
function importBatchFiles(Request $request)
{
    $this->validate($request, [
        'file' => 'required|mimes:xls,xlsx,csv'
    ]);

    $file = $request->file('file');
    //$fileName = 'orders_'.$request->get('company_id').'_'.date('Y-m-d').uniqid().'.xlsx';

    if ($file->isValid()) {
        //$file->move('order_list', $fileName);
        $data = Excel::toArray(new OrdersImport, request()->file('file'));

        foreach ($data as $key => $value) {
            foreach ($value as $row) {
                $inputs[] = $row;
            }
        }

        $data1 = array_slice($inputs, 1);
        $parts = array_chunk($data1, 500);

        foreach ($parts as $index => $part) {
            $filename = resource_path('pending-files/'.date('Y-m-d').$index.'.'.$request->file->extension());
            file_put_contents($filename, $part);
        }

        return Response::json(['success' => 'Orders Queued for importing.']);
    } else {
        return Response::json(['error' => 'Some error']);
    }
}
When using the WithChunkReading concern, you can process each chunk in its own queue job, simply by adding the ShouldQueue contract to the import class.
For more details please refer to this link: https://docs.laravel-excel.com/3.1/imports/queued.html
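A minimal sketch of what the queued-chunk import from the linked docs looks like (the class name, model, and column indexes are illustrative, not from the question):

```php
<?php

namespace App\Imports;

use App\Models\Order;
use Illuminate\Contracts\Queue\ShouldQueue;
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithChunkReading;

// Each chunk of 500 rows is read and processed in its own queue job,
// so the large file never needs to be split into separate Excel files.
class OrdersImport implements ToModel, WithChunkReading, ShouldQueue
{
    public function model(array $row)
    {
        return new Order([
            'name'   => $row[0],
            'amount' => $row[1],
        ]);
    }

    public function chunkSize(): int
    {
        return 500;
    }
}
```

It is then triggered with a single call, e.g. Excel::import(new OrdersImport, $request->file('file'));, and the queue workers pick up the chunk jobs.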

Fastest string comparison using Eloquent

I'm importing data using Maatwebsite from an Excel file, and before creating a new row in my model I check whether the record already exists, to avoid duplicates. But this takes too long.
In my ProductImport.php:
public function model(array $row)
{
    $exists = Product::where('product_description', $row["product_description"])
        ->where('product_code', $row["product_code"])
        ->first();

    if ($exists) {
        return null;
    }

    ++$this->rows;

    // Autoincrement id
    return new Product([
        "product_description" => $row["art_descripcion"],
        "product_code" => $row["cui"],
        "id_user" => $this->id_user,
        ...
    ]);
}

public function chunkSize(): int
{
    return 1000;
}
As you can see, I'm also using chunkSize, because there are 5000 rows per Excel file.
The problem:
The product_description is 800 to 900 characters long (varchar(1000)), which makes the where() query very slow on each iteration of the loop.
Is there a better way to handle this? Maybe using updateOrCreate instead of searching first and then creating? Though I think it would have the same problem.
So the main question is: how do I compare those 800-900 character strings more quickly? Because this search is taking a lot of time to execute:
$exists = Product::where('product_description', $row["product_description"])
    ->where('product_code', $row["product_code"])
    ->first();
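One common way to make this check cheap (a suggestion, not from the question) is to compare a fixed-length hash instead of the 800-900 character description: store e.g. an indexed md5 column, preload the existing (hash, code) pairs once per chunk, and test membership in memory. The sketch below shows only the in-memory part; the key layout and helper name are illustrative:

```php
<?php
// Sketch: compare fixed-length hashes instead of ~900-char strings.
// In the real import you would store md5(product_description) in an
// indexed column and preload existing (hash, code) pairs per chunk.

$existing = array(
    md5(str_repeat('long description A', 50)) . '|' . 'CODE-1' => true,
);

function isDuplicate(array $existing, $description, $code)
{
    // 32-char md5 plus the short code: constant-size comparison key.
    $key = md5($description) . '|' . $code;
    return isset($existing[$key]);
}

var_dump(isDuplicate($existing, str_repeat('long description A', 50), 'CODE-1')); // true
var_dump(isDuplicate($existing, 'something new', 'CODE-2'));                      // false
```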

Laravel 4 Export Excel Multidimensional Query Result

I'm trying to get PHPExcel (rather, the Laravel wrapper for it: https://github.com/Maatwebsite/Laravel-Excel) to export my view. The query I'm running returns multiple rows as a multidimensional array (I'm using DB::select and binding that way because the query is a bit too complex for Fluent).
The results look like this:
array(3) {
[0]=> object(stdClass)#224 (3) {
["name"]=> string(13) "Administrator"
["TotalRequest"]=> string(6) "100.00"
["TotalGiven"]=> string(6) "150.00" }
[1]=> object(stdClass)#226 (3) {
["name"]=> string(14) "Beta Alpha Psi"
["TotalRequest"]=> string(6) "363.00"
["TotalGiven"]=> string(6) "200.00" }
[2]=> object(stdClass)#227 (3) {
["name"]=> string(30) "Student Government Association"
["TotalRequest"]=> string(7) "1225.00"
["TotalGiven"]=> string(6) "620.00" }
}
The Laravel-Excel package only takes a $data array(), so I'm confused about how to convert my multidimensional array for my view. I can get it to work if I use the alternative
View::make(xxxx)->with('example', $example)
Am I overlooking how to pass $data as an array when I have objects involved?
It's not a multi-dimensional array but an array of objects. If you pass the array to your view using something like this:
View::make('xxxx')->with('example', $example)
Then in your view you may loop over it using @foreach, like this:
@foreach($example as $item)
    {{ $item->name }}
    {{ $item->TotalRequest }}
    {{ $item->TotalGiven }}
@endforeach
Because the array contains multiple stdClass objects, and the first object ([0]) is:
{
["name"]=> string(13) "Administrator"
["TotalRequest"]=> string(6) "100.00"
["TotalGiven"]=> string(6) "150.00"
}
So, you may also retrieve the first object from the $example array using something like $example[0] and to retrieve the second object you may use $example[1] and so on.
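If you specifically need plain arrays rather than stdClass objects (e.g. for fromArray()), one quick conversion (a suggestion, not from the answer) is a JSON round-trip:

```php
<?php
// Convert an array of stdClass rows into an array of associative arrays.
$example = array(
    (object) array('name' => 'Administrator', 'TotalRequest' => '100.00', 'TotalGiven' => '150.00'),
    (object) array('name' => 'Beta Alpha Psi', 'TotalRequest' => '363.00', 'TotalGiven' => '200.00'),
);

// Encoding then decoding with $assoc = true turns every object,
// at any depth, into an associative array.
$data = json_decode(json_encode($example), true);

var_dump($data[0]['name']);         // "Administrator"
var_dump($data[1]['TotalRequest']); // "363.00"
```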
Well, I would suggest you simply change the fetch mode to return an array directly, using a method like the one below:
public function getQueryResult($returnQueryAs = null)
{
    $tablename = 'YourAwesomeTable';

    if ($returnQueryAs == 'array') {
        DB::connection()->setFetchMode(PDO::FETCH_ASSOC);
    }

    return DB::table($tablename)->get();
}
Then do something like this:
public function exportAsExcel()
{
    $filename = 'nameoffile';
    $data = $this->getQueryResult('array');

    Excel::create($filename, function($excel) use ($filename, $data) {
        // Set the title
        $excel->setTitle($filename);

        // Chain the setters
        $excel->setCreator('Damilola Ogunmoye');

        $excel->sheet('SHEETNAME', function($sheet) use ($data) {
            $sheet->fromArray($data);
        });
    })->download('xls');
}
This gets an associative array, as opposed to stdClass objects, which you can pass directly.

Kohana 3 ORM find_all() returns all rows regardless of where clause

I have one simple users table, and I want to find all users where email_notifications = 1.
Logic dictates that the following should work:
class Controller_Test extends Controller {

    public function action_index()
    {
        $user = ORM::factory('user');
        $user = $user->where('email_notifications', '=', 1);

        $total = $user->count_all();
        $users = $user->find_all();

        echo $total." records found.<br/>";

        foreach ($users as $v)
        {
            echo $v->id;
            echo $v->first_name;
            echo $v->last_name;
            echo $v->email;
        }
    }
}
However, what's happening is that I am getting ALL of my users back from the DB, not just the ones with email_notifications turned on. The funny thing is, the $total value returned is the accurate number result of this query.
I am so stumped, I have no idea what the problem is here. If anyone could shed some light, I'd really appreciate it.
Thanks,
Brian
Calling count_all() will reset your model conditions. Try to use reset(FALSE) to avoid this:
$user = ORM::factory('user');
$user = $user->where('email_notifications', '=', 1);
$user->reset(FALSE);
$total = $user->count_all();
$users = $user->find_all();

Select top/latest 10 in couchdb?

How would I execute a query equivalent to "select top 10" in CouchDB?
For example I have a "schema" like so:
title body modified
and I want to select the last 10 modified documents.
As an added bonus if anyone can come up with a way to do the same only per category. So for:
title category body modified
return a list of latest 10 documents in each category.
I am just wondering if such a query is possible in couchdb.
To get the first 10 documents from your db you can use the limit query option.
E.g. calling
http://localhost:5984/yourdb/_design/design_doc/_view/view_name?limit=10
you get the first 10 documents.
View rows are sorted by key; adding descending=true to the query string reverses their order. You can also restrict the rows returned by selecting the keys you are interested in via the query string.
So in your view you write your map function like:
function(doc) {
emit([doc.category, doc.modified], doc);
}
And you query it like this:
http://localhost:5984/yourdb/_design/design_doc/_view/view_name?startkey=["youcategory"]&endkey=["youcategory", date_in_the_future]&limit=10&descending=true
here is what you need to do.
Map function
function(doc)
{
    if (doc.category)
    {
        emit([doc.category], doc.modified);
    }
}
Then you need a list function that groups them. You might be tempted to abuse a reduce for this, but with large data sets it will probably throw errors for not reducing fast enough.
function(head, req)
{
    // this sort function assumes that modified is a number
    // and it sorts in descending order
    function sortCategory(a, b) { b.value - a.value; }

    var categories = {};
    var row;

    while (row = getRow())
    {
        if (!categories[row.key[0]])
        {
            categories[row.key[0]] = [];
        }
        categories[row.key[0]].push(row);
    }

    for (var cat in categories)
    {
        categories[cat].sort(sortCategory);
        categories[cat] = categories[cat].slice(0, 10);
    }

    send(toJSON(categories));
}
You can now get the top 10 for every category with
http://localhost:5984/database/_design/doc/_list/top_ten/by_categories
and get the docs with
http://localhost:5984/database/_design/doc/_list/top_ten/by_categories?include_docs=true
You can also query this with a multiple-key POST to limit which categories are returned:
curl -X POST http://localhost:5984/database/_design/doc/_list/top_ten/by_categories -d '{"keys":[["category1"],["category2"],["category3"]]}'
You could also avoid hard-coding the 10 and pass the number in through the req variable.
Here is some more View/List trickery.
Slight correction: it was not sorting until I added the "return" keyword in your sortCategory function. It should be like this:
function sortCategory(a, b) { return b.value - a.value; }
