NetSuite SuiteScript 2.0 Export (CSV)

Is there a way to export search results using SuiteScript 2.0 in the same way as exporting from the Search page using Export (CSV)? NetSuite Answers says this can be done by building a CSV file, but I would like to know if I can run the Export (CSV) as-is. I need this because I have many searches that must be run weekly and downloaded to Excel, and I would like a script to do this instead of manually selecting each one.

Use the N/task.SearchTask API.

The built-in solution provided by NetSuite is to schedule the saved search to send an email with the results attached in CSV format.
Alternatively, you can use a third-party library to convert JSON to CSV: run the saved search, map the results into the JSON shape you want, and convert that to CSV.
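As a rough sketch of that JSON-to-CSV route in plain JavaScript (no particular library assumed; the `toCsv` helper name is illustrative, and the rows would come from mapping your saved search results to plain objects):

```javascript
// Convert an array of flat objects (e.g. saved search results mapped to
// plain objects) into a CSV string. Fields containing commas, quotes,
// or newlines are quoted per the usual CSV convention.
function toCsv(rows) {
    if (rows.length === 0) return '';
    var headers = Object.keys(rows[0]);
    var escape = function (value) {
        var s = String(value == null ? '' : value);
        return /[",\n]/.test(s) ? '"' + s.replace(/"/g, '""') + '"' : s;
    };
    var lines = [headers.join(',')];
    rows.forEach(function (row) {
        lines.push(headers.map(function (h) { return escape(row[h]); }).join(','));
    });
    return lines.join('\n');
}
```

The resulting string could then be written to the File Cabinet with N/file and emailed or downloaded.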

A quick example of a scheduled script that puts the results of a saved search into an existing file.
Ref: SuiteScript 2.0 API, page 792.
/**
 * @NApiVersion 2.x
 * @NScriptType ScheduledScript
 */
define(['N/task', 'N/log'], function (task, log) {
    function execute(context) {
        // Create a search task that writes the saved search results to a CSV file
        var myTask = task.create({
            taskType: task.TaskType.SEARCH
        });
        myTask.savedSearchId = 4222;
        myTask.fileId = 14581313;
        var myTaskId = myTask.submit();
        log.audit({
            title: 'Task submitted.',
            details: 'Put results of savedSearchId 4222 in CSV file with internal ID 14581313'
        });
    }
    return { execute: execute };
});
I then check that the file is new enough (i.e., the script didn't fail), then download and process it.
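That freshness check boils down to a single comparison. A minimal sketch in plain JavaScript (the helper name is hypothetical; in a real integration the timestamp would come from the file record's last-modified field):

```javascript
// Returns true if the file was modified within the last maxAgeMs milliseconds.
// lastModified is a Date; now defaults to the current time and is a
// parameter mainly so the logic is easy to test.
function isFreshEnough(lastModified, maxAgeMs, now) {
    now = now || new Date();
    return (now.getTime() - lastModified.getTime()) <= maxAgeMs;
}
```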

Related

What is the use of the 'Parameters' tab in the script record?

I have a scheduled script that is triggered from a Suitelet and receives parameters from it. What do the parameters on the Parameters tab do?
my suitelet code chunk:
var params = {};
params['custscript_emp_accrual'] = empreq_id;
params['custscript_emp_months'] = rowCount;
nlapiLogExecution('DEBUG', 'empreq_id:rowCount', empreq_id + ':' + rowCount);
nlapiScheduleScript('customscript_emp_accrual_sched', 'customdeploy_emp_accrual_sched', params);
my scheduled code chunk:
var empreq_Id = nlapiGetContext().getSetting('SCRIPT', 'custscript_emp_accrual');
var month = nlapiGetContext().getSetting('SCRIPT', 'custscript_emp_months');
var dateNow = nlapiLookupField('customrecord_payroll_period', month, 'custrecord_payperiod_enddate');
They're a way to configure script deployments. Check out the help doc "Creating Script Parameters Overview" for more information.
The Parameters allow you to pass data from the Suitelet to the Scheduled Script. If you don't set up the script parameters then the data won't be passed to the Scheduled Script.
You might want to focus your learning on SuiteScript 2.0, as it is the most widely used version in industry and has more support online.
Script parameters are very useful for variables that could either change over time or be different between script deployments.
This is just a simple example of how you could use parameters:
You could have a script that is deployed to more than one record type, and on each record type you want certain code to be executed for a different user.
Now you can add a script parameter called USER, which is a select/multi-select field. This means that on each script deployment, you can specify a different user for which your code should run.
Now you could have the following code (assuming the N/runtime module is imported) to read your parameter and restrict editing:
var allowedUser = runtime.getCurrentScript().getParameter({ name: 'custscript_user' });
var currentUser = runtime.getCurrentUser().id;
if (currentUser == allowedUser) {
    // do something
}
You can now select a different user on each of your script deployments.
Hope this helps!
For more info, have a look at page 110 of this document from Oracle: SuiteScript Developer Guide

Why does exporting from Excel to Csv cause Laravel Seeder to fail silently when using Flynsarmy csv seeder package?

I'm using this package: https://github.com/Flynsarmy/laravel-csv-seeder
Flynsarmy Csv seeder works fine if I manually create a file and save it as a csv, then seed my database. Artisan responds with
"Seeding: ProductsTableSeeder
Database seeding completed successfully."
However, if I save an Excel file as csv, I get the response,
"Seeding: ProductsTableSeeder"
It does not fail. No exceptions are thrown. But there is also no success message.
When I check the DB, the tables are empty.
Here's my migration 'up' method
{
Schema::create('products', function (Blueprint $table) {
$table->increments('id');
$table->integer('sku')->nullable();
$table->string('name');
$table->string('species')->nullable();
$table->string('fabric')->nullable();
$table->string('panel')->nullable();
$table->string('gauge_gasket')->nullable();
$table->string('type')->nullable();
$table->string('brand')->nullable();
$table->string('brand_label')->nullable();
$table->boolean('is_cjb');
$table->boolean('is_osp');
$table->boolean('is_starmark');
$table->boolean('is_vision');
$table->boolean('gasketed');
$table->string('color');
$table->float('inside_width')->nullable();
$table->float('inside_length')->nullable();
$table->float('outside_width')->nullable();
$table->float('outside_length')->nullable();
$table->timestamps();
});
}
Here's my implementation of Flynsarmy Csv Seeder:
use Flynsarmy\CsvSeeder\CsvSeeder;
class ProductsTableSeeder extends CsvSeeder
{
/**
* Run the database seeds.
*
* @return void
*/
public function __construct()
{
$this->table = 'products';
$this->filename = base_path().'/database/seeds/csvs/product_info.csv';
}
public function run()
{
// Recommended when importing larger CSVs
DB::disableQueryLog();
// Uncomment the below to wipe the table clean before populating
DB::table($this->table)->truncate();
parent::run();
}
}
Now, I suspected my issue is the csv file. I used csvlint.io and it says
"Structural problem: Assumed header. As there is no machine-readable way to tell if your CSV has a header row, we have assumed that your CSV has one."
However, it gives this same message for the csv I manually created and that seems to work just fine.
I thought it might be a permission issue and so I did an ls -lr on my /database/seeds/csvs directory and I get
-rw-rw-r-- 1 secretname secretname 18315 Dec 19 14:07 product_info.csv
I even did what friends don't let friends do (chmod 777), but that fixed nothing, so it isn't a permissions issue.
I've been repeatedly doing:
php artisan cache:clear
composer dumpautoload
php artisan migrate:fresh --seed
All I want to do is export from Excel to Csv and seed the database. Why is this such a problem?
Here's a sample of my csv file. I have tried with both an id column and without (since laravel adds an id and autoincrements)
sku,name,species,fabric,panel,finish,color,type,inside_width,inside_length,outside_width,outside_length,
800124,air tray casket,,,,white,,,30.5,,,,
555555,alex,metal,blue crepe,,paint,light blue,metal caskets,24.5,76.75,28.5,83,
578645,alex,metal,rosetan crepe,,paint,copper,metal caskets,24.5,76.75,28.5,83,
524785,alex,metal,oyster crepe,,paint,silver,metal caskets,24.5,76.75,28.5,83,
483959,alex,metal,oyster crepe,,paint,white,metal caskets,24.5,76.75,28.5,83,
The sample CSV that Flynsarmy provides worked successfully. It's simply:
id,name
1,Foo
2,Bar
I feel so dumb. Or smart, depending on how I look at it. I was searching for complex answers while poring over a gigantic CSV, so I pared it down to a few items at a time. It's the commas. Excel placed commas where they did not need to be (at the end of each line). Removing the comma from the end of each line solves the problem. Interestingly, I did not get a "Database seeding completed successfully" message. It only said "Seeding: ProductsTableSeeder", but it did populate the tables this time.
Well, Hello World! How's that for a first post? ;)
But you know what? That linter validated my csv and said all was well. Well, it was wrong!
I ran into the same problem.
Excel only places the extra comma if there is formatting (like a table border) on a column that has no data.
Opening the CSV in Excel and saving it again should remove the extra commas.
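The trailing commas can also be stripped programmatically. A minimal sketch in plain JavaScript (in a Laravel project you would more likely do the equivalent in PHP or clean the file before seeding; the function name is illustrative):

```javascript
// Remove a single trailing comma from each line of a CSV string,
// as Excel produces when empty formatted columns trail the data.
function stripTrailingCommas(csv) {
    return csv
        .split('\n')
        .map(function (line) { return line.replace(/,$/, ''); })
        .join('\n');
}
```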

Converting a date to a string - SuiteScript 2.0

Goal: Convert JS Date Object to a String representation in the format of "11/2/2017" in a NetSuite SuiteScript 2.0 scheduled script.
I have a date object that I need to use for 2 purposes. In one, I am going to use it for comparisons (so I want the actual date object). The other is I want it to be the name of a Custom Record, ie a string value.
I am doing this in NetSuite SuiteScript 2.0 (Javascript) in a Scheduled Script. The toString() of the date right now is: "2017-11-02T07:00:00.000Z". The format I want to end up with for the name is 11/2/2017.
When I test toLocaleDateString() in a browser test app, I get 11/2/2017, the exact format I want. However, when I use the same thing in SuiteScript 2.0 I get "November 2, 2017". I know there is a difference between client and server, but this was frustrating.
I tried the format.parse() function as NetSuite's documentation claims that this is the equivalent to the 1.0 nlapiDateToString() function. This did not work.
Besides writing my own function (which I am tempted to do), does anyone know how to accomplish this goal?
To get that format you would not use format.parse; you would use format.format. Here is a simple example of converting a date object to that string format:
require(['N/format'], function (format) {
    function formatDate(testDate) {
        log.debug('testDate: ' + testDate);
        var responseDate = format.format({ value: testDate, type: format.Type.DATE });
        log.debug('responseDate: ' + responseDate);
    }
    var testDate = new Date();
    formatDate(testDate);
});
I'm going to suggest using the momentJS library for all of your SuiteScript date manipulation needs. It works well as a SuiteScript 2.0 module and you can format dates easily:
var now = new Date();
var formattedDate = moment(now).format('M/D/YYYY');
Note that format.parse from SuiteScript 2.0's N/format module goes the other way, converting a string into a Date object:
var myDateString = '04/26/2020';
var parsedDate = format.parse({
    value: myDateString,
    type: format.Type.DATE
});
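If you would rather avoid both N/format and a third-party library, a hand-rolled formatter for the M/D/YYYY shape is short. A sketch in plain JavaScript (note this ignores time zones and NetSuite's per-account date format preference, which is why N/format is usually the safer choice in server scripts):

```javascript
// Format a Date as "M/D/YYYY", e.g. 11/2/2017, with no zero-padding,
// matching the toLocaleDateString() output the question describes.
function toShortDateString(d) {
    return (d.getMonth() + 1) + '/' + d.getDate() + '/' + d.getFullYear();
}
```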

SuiteFlow||SuiteScript: Send email based on file size

We have a SuiteFlow that sends an email with an attachment. However, the email doesn't send if the attachment is over 5 MB. I want to add a condition to the send action for when the document size is < 5 MB, and then add a separate action that sends the email without the attachment if the file size is >= 5 MB. Is this possible, and if not, what workaround is there?
SuiteScript (Javascript) is certainly an option but I would prefer just modifying the existing SuiteFlow
In response to the comments below:
The email attachment is added to a Document field on the transaction, not on the Files subtab, so I cannot find how to get at its properties (like size) in a search (idea #2 below).
Also, the code sample (idea #1) does not work because nlapiLoadFile will not load a file over 5 MB, so I can't test the size that way. I am trying to avoid writing the whole thing as a script.
So far the only solution (and I don't feel it is a good one) is to take the sending of the email, make it into a script, and do a try catch on it. Any other ideas??
Assuming that the file that you are sending as an attachment is a file that already exists in Netsuite file cabinet you can add a script to validate the size of the attachment:
var load = nlapiLoadFile('100'); // where 100 is the internal ID of the file
var filesize = load.getSize();   // returns the size of the file in bytes
if (filesize > 5 * 1024 * 1024) { // 5 MB threshold
    // ...
}
For reference of using this Suitescript API:
Helpguide > SuiteCloud (Customization, Scripting, and Web Services) : SuiteScript : SuiteScript API : SuiteScript Objects : nlobjFile
I know it's not efficient, but how about doing a search to get the size of the file? Something like this:
Filters:
Internal Id = internal id of the attachment
Results:
Size
The Size column would return the file size in KB.
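The branching the question asks for (attach if under 5 MB, otherwise send without the attachment) then reduces to one comparison. A sketch in plain JavaScript (helper name is illustrative; it assumes the size comes from the search's Size column in kilobytes, per the answer above):

```javascript
// Decide whether a file can be attached, given its size in kilobytes
// and the email attachment limit in megabytes.
function canAttach(sizeKb, limitMb) {
    return sizeKb * 1024 < limitMb * 1024 * 1024;
}
```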

File upload with metadata using SharePoint Web Services

I am trying to upload a file with metadata using SharePoint Web Services. The first approach I took was to use the WebRequest/WebResponse objects and then update the metadata using the Lists.asmx UpdateListItems method. This works just fine, but it creates two versions of the file. The second approach was to use the Copy.asmx web service's CopyIntoItems method, which copies the file data along with the metadata. This works fine and creates v1.0, but when I try to upload the same file with some changes to the metadata (again using Copy.asmx), it does not update anything. Has anybody come across the same issue, or does anyone have other ideas for implementing the required functionality?
Thanks,
Kiran
This might be a bit off topic (sorry), but I'd like to point you to a real time-saving shortcut when working with SharePoint remotely: http://www.bendsoft.com/net-sharepoint-connector/
It enables you to work with SharePoint lists and document libraries with SQL and stored procedures.
Uploading a file as a byte array
...
string sql = "CALL UPLOAD('Shared Documents', 'Images/Logos/mylogo.png', @doc)";
byte[] data = System.IO.File.ReadAllBytes("C:\\mylogo.png");
SharePointCommand cmd = new SharePointCommand(sql, myOpenConnection);
cmd.Parameters.Add("@doc", data);
cmd.ExecuteNonQuery();
...
Upload stream input
using (var fs = System.IO.File.OpenRead("c:\\150Mb.bin")) {
    string sql = "CALL UPLOAD('Shared Documents', '150Mb.bin', @doc)";
    SharePointCommand cmd = new SharePointCommand(sql, myOpenConnection);
    cmd.Parameters.Add("@doc", fs);
    cmd.ExecuteNonQuery();
}
There are quite a few methods to simplify remote document management:
UPLOAD(listname, filename, data)
DOWNLOAD(listname, filename)
MOVE(listname1, filename1, listname2, filename2)
COPY(listname1, filename1, listname2, filename2)
RENAME(listname, filename1, filename2)
DELETE(listname, filename)
CREATEFOLDER(listname, foldername)
CHECKOUT(list, file, offline, lastmodified)
CHECKIN(list, file, comment, type)
UNDOCHECKOUT(list, file)
Cheers