Import large Excel file using Laravel Excel and Laravel queue

Laravel Excel version 3.1, Laravel version 5.6.
I have over 100,000 rows of data in an Excel file that I would like to import into my database.
In my controller:
if (request()->file('fertilizer_import')) {
    $import = new FertilizerImport();
    $file = request()->file('fertilizer_import');
    dispatch(new FertilizerImportJob($import, $file));
}
In my FertilizerImportJob.php:
public function __construct($import, $file)
{
    $this->import = $import;
    $this->file = $file;
}

public function handle()
{
    Excel::import($this->import, $this->file);
}
Then I uploaded my Excel file. One row was entered in the jobs table. I ran php artisan make:queue, but the data is not entered in my fertilizer table.
How can I do that? Please advise.

You should try using chunked reading together with queues in Maatwebsite/Laravel-Excel.
In the controller (the unreachable second return has been removed):
public function importExcel()
{
    $path = resource_path() . "/Houten.xlsx";
    \Excel::import(new TestImport, $path);
    return redirect('/')->with('success', 'All good!');
}
And in your import file, app/Imports/TestImport.php:
use Illuminate\Support\Collection;
use Maatwebsite\Excel\Concerns\ToCollection;
use Maatwebsite\Excel\Concerns\WithStartRow;
use Illuminate\Contracts\Queue\ShouldQueue;
use Maatwebsite\Excel\Concerns\WithChunkReading;

class TestImport implements ToCollection, WithChunkReading, ShouldQueue, WithStartRow
{
    public function collection(Collection $rows)
    {
        // Do your insertion here; no separate job is required.
    }

    // Data starts on row 1, i.e. there is no heading row to skip.
    public function startRow(): int
    {
        return 1;
    }

    // Note: batchSize() is only honored together with the
    // WithBatchInserts concern (ToModel imports); it has no
    // effect on a ToCollection import.
    public function batchSize(): int
    {
        return 500;
    }

    public function chunkSize(): int
    {
        return 500;
    }
}
Then set QUEUE_DRIVER=database and run the code. You will see one entry in the jobs table; then you need to run that job. There are multiple ways to run a queued job; below is one.
In your terminal
php artisan queue:restart
php artisan queue:listen
This will execute the code in the collection() method in chunks and insert your data.

Related

AS3: Calling a method in another swf file using worker threads

I currently have a worker thread loading an external swf file successfully, but how do I call a function in the loaded swf file?
This is the class located in the external swf file:
package
{
    import flash.display.Bitmap;
    import flash.display.Sprite;
    import flash.utils.ByteArray;

    public class BackgroundProcesses extends Sprite {
        public function BackgroundProcesses()
        {
            super();
        }

        public function EncryptTheImage(_imageToEncrypte:Bitmap):ByteArray
        {
            // Encrypt the image here
            var _imageInEncryptedBytes:ByteArray = new ByteArray();
            return _imageInEncryptedBytes;
        }
    }
}
Here is my Flex Mobile mxml file:
import mx.events.FlexEvent;

protected function LoadWorkerSwfFile(event:FlexEvent):void
{
    var workerLoader:URLLoader = new URLLoader();
    workerLoader.dataFormat = URLLoaderDataFormat.BINARY;
    workerLoader.addEventListener(Event.COMPLETE, loadComplete);
    workerLoader.load(new URLRequest("BackgroundProcesses.swf"));
}
private function loadComplete(event:Event):void
{
    var workerBytes:ByteArray = event.target.data as ByteArray;
    var bgWorker:Worker = WorkerDomain.current.createWorker(workerBytes);
    bgWorker.addEventListener(Event.WORKER_STATE, WorkerIsRunning);
    bgWorker.start();
}

private function WorkerIsRunning(event:flash.events.Event):void
{
}
Any help is appreciated!
Someone else can correct me, but I don't think you can access another worker's properties and methods directly, nor do I think you are supposed to be able to, by design. Instead, you are supposed to pass messages and data back and forth between the workers to communicate and to send and receive work results.
This may help: Communicating between workers
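For intuition, the restriction is the same one you see in any shared-nothing threading model: the two workers only exchange messages; they never invoke each other's methods. A rough stdlib-only analogy in Java (the WorkerChannels name and the "encrypted:" prefix are made up for illustration), with two queues standing in for the pair of AS3 MessageChannels:

```java
import java.util.concurrent.*;

// Analogue of an AS3 MessageChannel pair: the main worker posts a request
// on one queue and the background worker replies on another; neither side
// calls the other's methods directly.
public class WorkerChannels {
    static String sendToWorker(String request) throws InterruptedException {
        BlockingQueue<String> toWorker = new LinkedBlockingQueue<>();
        BlockingQueue<String> fromWorker = new LinkedBlockingQueue<>();

        Thread worker = new Thread(() -> {
            try {
                String msg = toWorker.take();           // receive the message
                fromWorker.put("encrypted:" + msg);     // send the result back
            } catch (InterruptedException ignored) {
            }
        });
        worker.start();

        toWorker.put(request);                          // post the request
        String result = fromWorker.take();              // wait for the reply
        worker.join();
        return result;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(sendToWorker("image-bytes")); // prints "encrypted:image-bytes"
    }
}
```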

How to implement distributed task executed on GG Node against segment of the IMDB located on it?

I have a partitioned in-memory database (IMDB). I would like to start a compute task on each node that does some calculation against ALL records of the IMDB segment located on the node it executes on; thus each task does a part of the job.
It seems that colocation is not quite possible, since I cannot restrict access to only the data on the node.
Please confirm or suggest a solution.
It sounds like you are asking how to colocate computations with the nodes where the data is cached. You can take a look at the CacheAffinityExample shipped with GridGain; specifically, the following code snippet:
for (int i = 0; i < KEY_CNT; i++) {
    final int key = i;

    // This callable will execute on the remote node where
    // the data with the given key is located.
    grid.compute().affinityCall(CACHE_NAME, key, new GridCallable<String>() {
        @Override public String call() throws Exception {
            String val = cache.get(key);

            // Work on the cached value.
            ...

            return val;
        }
    }).get();
}
Alternatively, this code will send a closure to every node and run the calculation against all the data on that node:
grid.forCache("mycache").compute().broadcast(new GridRunnable() {
    @Override public void run() {
        for (GridCacheEntry<K, V> e : cache.entrySet()) {
            // Do something with the local entry.
            ...
        }
    }
}).get();

ZipStream and Kohana 3.3

I'm trying to implement on-the-fly creation and streaming of zip files in Kohana 3.3 using ZipStream (https://github.com/Grandt/PHPZip). I assumed the zip file would be streamed as soon as the first image was added to it, but the download stalls until the whole zip file has been created and sent to the user.
<?php defined('SYSPATH') or die('No direct script access.');

class Controller_Download extends Controller {

    public function action_images()
    {
        require Kohana::find_file('vendor', 'ZipStream');

        $zip = new ZipStream("images.zip");

        // $images is assumed to hold the list of image file paths.
        foreach ($images as $image)
        {
            $zip->addLargeFile($image);
        }

        $zip->finalize();
        exit;
    }
}
Apparently Kohana buffers output, and that can be negated by adding this at the start of the download action:
while (ob_get_level() > 0) {
    ob_end_clean();
}
The whole controller:
<?php defined('SYSPATH') or die('No direct script access.');

class Controller_Download extends Controller {

    public function action_images()
    {
        // Discard Kohana's output buffers so the zip streams immediately.
        while (ob_get_level() > 0) {
            ob_end_clean();
        }

        require Kohana::find_file('vendor', 'ZipStream');

        $zip = new ZipStream("images.zip");

        // $images is assumed to hold the list of image file paths.
        foreach ($images as $image)
        {
            $zip->addLargeFile($image);
        }

        $zip->finalize();
        exit;
    }
}

Rescanning the configuration file in groovy

I need to implement a configuration file that is rescanned periodically or after an edit. What should I do?
I tried:
config = new ConfigSlurper().parse(Config);
but it does not pick up changes when Config.groovy changes dynamically.
Example (from comment below)
MyConfig.groovy
class MyConfig {
    public static ConfigObject config

    public static void run() {
        config = new ConfigSlurper().parse(Config)
    }

    public static void printconfig() {
        println config.options.video.enable
    }
}

MyConfig.run()
for (int i = 0; i < 10; i++) {
    Thread.sleep(3000)
    MyConfig.printconfig()
}
Config.groovy
options { video { enable = false } }
You seem to parse the config file once, then never re-parse it.
What you could do is store the last-modified date of the file and call run() again from printconfig() if it detects that the file has been modified.
Also, I assume you have a copy/paste error. Shouldn't:
config = new ConfigSlurper().parse(Config)
be:
config = new ConfigSlurper().parse( MyConfig.class.getResource( 'Config.groovy' ) )
Or something?
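The last-modified check described above can be sketched like this (in Java, which Groovy calls directly; ConfigWatcher is a hypothetical name, and a real version would feed the text to new ConfigSlurper().parse(...) instead of returning it raw):

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;

// Caches the file contents and re-reads them only when the on-disk
// modification time changes -- the check printconfig() should make.
public class ConfigWatcher {
    private final Path path;
    private long lastModified = -1;
    private String cached;

    public ConfigWatcher(Path path) {
        this.path = path;
    }

    public String get() {
        try {
            long mtime = Files.getLastModifiedTime(path).toMillis();
            if (mtime != lastModified) {            // changed, or first read
                cached = new String(Files.readAllBytes(path));
                lastModified = mtime;
            }
            return cached;
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```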

Playframework Excel file generation

I've installed the Excel module in order to generate reports from data recorded by my application in the database.
It works fine: I can create a report simply by clicking a link on my main page and rendering into an Excel template.
But I'd rather generate the Excel file periodically (using a job) and save it into a shared folder, without any human action (so not by clicking a link).
It's as if I want to trigger the associated controller to render into my template automatically.
Does anyone have any tips for me?
So the problem is that you can't pass parameters into the job, or...?
Does using something like this just not work?
@On("0 45 4-23 ? * MON-FRI")
public class ExcelJob extends Job {
    public void doJob() {
        // Generate the Excel file here.
    }
}
I wrote my own Excel generator using JExcel, and I use it for scheduled generation without a problem. It also doesn't require a template, because the report structure is derived from annotations. This is roughly 20 lines of code; you may want to try it yourself.
What follows is really rough and lacks good user feedback, but it gives you the idea.
The Excel generator, not Play-specific in any way:
public class ExcelGenerator
{
    public void generateReport(Function successCallback,
                               Function failureCallback)
    {
        try
        {
            byte[] report = // generate your report somehow
            successCallback.execute(report);
        }
        catch (Exception e)
        {
            failureCallback.execute(e.getMessage());
        }
    }
}
A function interface for callbacks (very basic):
public interface Function
{
    public void execute(Object... args);
}
Your Play controller:
public class MyController extends Controller
{
    public static void index()
    {
        render();
    }

    public static void createReport()
    {
        Function failureCallback = new Function()
        {
            public void execute(Object... args)
            {
                flash.error((String) args[0]);
                index();
            }
        };

        Function successCallback = new Function()
        {
            public void execute(Object... args)
            {
                renderBinary((byte[]) args[0]);
            }
        };

        ExcelGenerator excelGenerator = new ExcelGenerator();
        excelGenerator.generateReport(successCallback, failureCallback);
    }
}
Finally, re-use the ExcelGenerator from your job:
public class MyJob extends Job
{
    public void doJob()
    {
        Function failureCallback = new Function()
        {
            public void execute(Object... args)
            {
                Logger.error((String) args[0]);
            }
        };

        Function successCallback = new Function()
        {
            public void execute(Object... args)
            {
                byte[] report = (byte[]) args[0];
                // Write the report to disk here.
            }
        };

        ExcelGenerator excelGenerator = new ExcelGenerator();
        excelGenerator.generateReport(successCallback, failureCallback);
    }
}
You'll still need to write your own report generator, or refactor the existing excel module to provide what you need.
If you want to run and manage several jobs, you can do something like this:
List<F.Promise> promises = new ArrayList<F.Promise>();
for (int i = 0; i < 10; i++) {
    SendingMessageJob sendingMessageJob = new SendingMessageJob();
    promises.add(sendingMessageJob.now());
}

// Busy-wait until every promise is done.
// (F.Promise.waitAll(...) is a tidier alternative.)
boolean allDone = false;
while (!allDone) {
    allDone = true;
    for (F.Promise promise : promises) {
        if (!promise.isDone()) {
            allDone = false;
            break;
        }
    }
}
// When we arrive here, all jobs have finished their work.
You can check the Play documentation, specifically the section on jobs, where you'll see examples of how to create automatically triggered methods. This should solve your issue.
EDIT (update on comment):
You can manually trigger a job; do this:
new MyExcelGeneratorJob().doJob();
The thing is, Play is stateless, so the job should use data from the database. Instead of trying to pass parameters from your request into the job (that won't work), store that data in a staging area in the database, which the job then loads and processes to generate the Excel file.
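That staging-area idea can be sketched independently of Play: the request handler records what needs generating, and the scheduled job later drains the staging store and produces the reports. A minimal stdlib-only sketch (StagingArea and ReportJob are hypothetical names, and an in-memory queue stands in for the database table):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;

// Stand-in for a database staging table: requests are queued by the
// web layer and drained later by the scheduled job.
class StagingArea {
    private static final Queue<String> pending = new ConcurrentLinkedQueue<>();

    static void add(String reportRequest) {
        pending.add(reportRequest);
    }

    static List<String> drain() {
        List<String> batch = new ArrayList<>();
        String item;
        while ((item = pending.poll()) != null) {
            batch.add(item);
        }
        return batch;
    }
}

class ReportJob {
    // Called on a schedule; processes everything staged since the last run.
    static List<String> doJob() {
        List<String> generated = new ArrayList<>();
        for (String request : StagingArea.drain()) {
            generated.add("report-for-" + request); // generate and save the file here
        }
        return generated;
    }
}
```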
