Data migration from NetSuite to another storage

So we need to migrate our old receipts from the NetSuite File Cabinet to another storage (say AWS). How can we do this without using any products like Celigo's SuiteStorage?

Why not use an off-the-shelf product? It will get you results much quicker than trying to homebrew something, and you'll be able to move on to more important tasks, which should easily recoup any financial cost for a COTS product.
If your receipts are all underneath a single folder in the File Cabinet, you can just manually download the whole folder as a zip, then manually upload it wherever you want. Could take a while depending on how large the archive is, but it's a pretty simple process.
If you're hell-bent on building something yourself, there's no SuiteScript API for creating ZIP files, but you can SFTP individual files using SuiteScript 2.0 and the N/sftp module.
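If you take the manual-export route and end up with the File Cabinet folder extracted locally, the upload side is easy to script. Here's a minimal Python sketch that walks the extracted tree and pushes each file to an S3-style bucket; the bucket name is illustrative, and the client is injected (anything with a boto3-style `upload_file(filename, bucket, key)` method) so the walk logic works independently of AWS:

```python
import os

def upload_tree(client, bucket, root):
    """Upload every file under `root` to `bucket`, keyed by its path
    relative to `root`. `client` is any object exposing an
    upload_file(filename, bucket, key) method, e.g. a boto3 S3 client."""
    uploaded = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            # Preserve the File Cabinet folder structure as S3 key prefixes.
            key = os.path.relpath(path, root).replace(os.sep, "/")
            client.upload_file(path, bucket, key)
            uploaded.append(key)
    return uploaded
```

With boto3 installed you would pass `boto3.client("s3")` as the client; injecting it also makes the function straightforward to test without AWS credentials.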

Related

Sharepoint Online: Delete version history in bulk?

I'm administering our organization's SharePoint Online right now and storage is running low. I noticed that some files take up a lot of space because of version history (an example is a PowerPoint slide with videos in it); a single file can sometimes exceed a GB. I manually deleted the version history of some files and it freed up almost 50 GB of storage.
Is there a built-in way to do this in bulk? Or is there a built-in tool in SharePoint (something like 'Storage Metrics') that traverses all files and shows each file's size along with the size of its version history?
To my knowledge, there is no built-in tool to show the size of version histories, and no built-in way to delete version histories in bulk.
As a workaround, you could delete version histories in bulk using PowerShell.
References:
https://www.sharepointdiary.com/2016/02/sharepoint-online-delete-version-history-using-powershell.html
https://social.msdn.microsoft.com/Forums/en-US/870e2f03-abf3-44b8-a2b6-71cb2aade2ef/powershell-script-to-delete-older-versions-of-documents-in-a-sharepoint-online-library?forum=sharepointdevelopment
As Emily mentioned, there is no native function for bulk-deleting version history in SharePoint. SharePoint only lets you delete versions for a single selected document.
There are generally two approaches to this: a PowerShell script or a third-party tool. The scripts from "sharepointdiary" look good and can be helpful.
There is a tool, DMS Shuttle for SharePoint. It provides a UI and can delete versions in bulk for a particular document, for all documents in a library or sub-folder, or even for a whole site.
The tool allows you to specify the number of latest versions to keep.
It is commercial, but there is a trial version, and it is free for students. Disclaimer: I work for the vendor.
Take a look at this article: https://dms-shuttle.com/documentation/delete-version-history-in-sharepoint-online-office-365/

Real-time dashboard with Excel

I'm working on building a dashboard for some of the higher-ups in my organization. The dashboard uses the information stored in Excel files made by several separate departments. The goal of the project is to automatically sync all of the separate Excel files and put them together to provide context. Then, take the information from the files to create a real-time performance dashboard.
What would be the best approach to do this? Is there a way to do this within Excel?
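Within Excel itself, Power Query (Data → Get Data → From Folder) can combine workbooks from a shared folder and refresh the combined table on open, which covers the sync-and-merge part. If the combine step ends up scripted outside Excel instead, here's a minimal Python sketch; it assumes each department's workbook is first exported to CSV with a shared set of column headers, and tags each row with the department it came from:

```python
import csv
import os

def combine_reports(paths):
    """Merge rows from several CSV exports into one list of dicts,
    tagging each row with a 'department' taken from the file name.
    Assumes all files share the same column headers."""
    rows = []
    for path in paths:
        dept = os.path.splitext(os.path.basename(path))[0]
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                row["department"] = dept
                rows.append(row)
    return rows
```

The combined rows can then feed whatever renders the dashboard; scheduling the script (or a Power Query refresh) is what makes it "real-time" in practice.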

Best design approach for storing documents

We have a SharePoint website, and as part of a functional process across the site, a lot of documents are uploaded. Currently they are stored in a database, which results in a very bulky table. My initial approach was to use SharePoint to store the documents in a file library. Does anybody think the database is the wiser option, and why? Or is there another approach that is more performant and better for storing confidential files?
Using a database for storing documents is not a recommended approach: not only will the table grow large, it will also be hard to maintain and will perform poorly.
If you have a SharePoint server, why not use one or more libraries to store the documents? You get the following advantages with SharePoint:
1. Permission management: you can set up access to documents and choose who accesses what.
2. Search: if there is a search service running, you can search through your libraries.
3. OWA: Office Web Apps can be used to open documents in the browser.
4. Audits: you can enable audit logs to see who does what.
Remember, SharePoint is a CMS, and there are other options like MMS, etc. It stores the documents in a database too, but it's designed well, so you don't have to worry much about that. If you go with a custom solution, you will have to do a lot of custom development and testing.
I never recommend saving files in the database. The easiest approach is to store them on the server in a directory and only save the file names in the database. This also makes it easy to show them via a URL in a browser. Create a table with a column for the OriginalFileName and one for the ActualFileName. When I save a file to the server after it's uploaded, I usually change the name so you never have complications with duplicate file names. I use a GUID as the actual file name when it's saved, and save the original file name in the database along with the actual one so you can get both back.
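A minimal sketch of that pattern, in Python with a SQLite table standing in for the database (the table and column names mirror the answer but are otherwise illustrative):

```python
import os
import uuid

def save_upload(conn, storage_dir, original_name, data):
    """Store `data` on disk under a GUID-based file name and record the
    (original, actual) name pair in the database, so two uploads with
    the same original name can never collide on disk."""
    ext = os.path.splitext(original_name)[1]
    actual_name = uuid.uuid4().hex + ext  # keep the extension for serving
    with open(os.path.join(storage_dir, actual_name), "wb") as f:
        f.write(data)
    conn.execute(
        "INSERT INTO files (original_name, actual_name) VALUES (?, ?)",
        (original_name, actual_name),
    )
    return actual_name
```

Serving the file back is then just a lookup of ActualFileName by OriginalFileName (or by row id) and a URL pointing at the storage directory.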

Can you use a Sharepoint Trigger to export an uploaded Excel File to a Network Drive as CSV?

I have a business requirement to process files uploaded by regional businesses for import into another system. It's envisaged that users will upload to SharePoint 2007 (soon to be SharePoint 2013), that the upload event will trigger an export to CSV, and that the process will then run against those files.
Is this possible in either SharePoint versions?
Would that be an app, or standalone service I would want to create and schedule?
Does anyone have a more elegant solution? Essentially the CSV export is feeding in to a program that allows a user to visually validate and press a button to push to the other system after tweaking.
With custom code, you could create an event receiver on the list where the CSV file lives that will run some code whenever the CSV file is updated. Here's a starter:
http://elczara.wordpress.com/2011/02/16/sharepoint-2010-event-receiver/
Make it a farm solution (sandbox solutions can't write to the filesystem directly), and you'll probably want to look up RunWithElevatedPrivileges, since the user doing the uploading may not have permission to write to the file system.
Steve's suggestion of rethinking the end-to-end solution is a good one, although I'm not sure how you can trigger the other system to "do its business".
Yes, it is possible, with both the 2007 and the 2013 versions.
Depending on your deployment scenario, you can:
create a custom timer job that will execute your job, or
create a custom site workflow, with a loop and a delay, that will do the job.
The first is easier to build and maintain, but offers less flexibility if you need to apply a custom process.
But if you can control the application that consumes the feed, why not consume SharePoint directly? From the 2010 version onward, you can very easily get data using the listdata.svc web service. With older versions, you can still get data using a simple web service.

Ecommerce Datafeed

I'm trying to build an e-commerce site for a client, and it isn't quite as straightforward as I had hoped. I am using Magento Community, and my client has a wholesaler who provides a data feed (over 5,000 products). To make this data feed compliant with Magento (and its attributes), I have edited some column headings in Excel and successfully uploaded it as a CSV file.
My issue is that the wholesaler regularly renews the data feed automatically. When this happens, I am assuming my tweaks to the spreadsheet will be overwritten, making my now Magento-compatible CSV file useless again.
My question, then, is how can I make the wholesaler's data feed compliant with my revised version so I don't have to continually rename elements? Is this possible?
I apologise if this sounds very stupid, but I am more accustomed to static website builds.
Thank you in advance.
Does your webserver have access to the newly produced, automatically updated data feed?
If so, just fetch the feed, modify it to be compliant with the Magento data feed, then find the Magento 'processfeed' (or whatever) function and run it against the file.
If your webserver cannot automatically fetch the file, can the PC producing the feed automatically post the data to the webserver?
If neither is possible, then there will always be a person who has to manually re-upload the feed. In that case, just make them a simple 'Upload feed' page (or modify the Magento page), let them upload their standard feed, reformat it, then process the reformatted feed.
You would need to write a script to transform the wholesaler's data feed into Magento format, so you don't need to rename columns manually every time.
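That transform script can be very small: keep a mapping from the wholesaler's column headings to Magento attribute codes and rewrite the header row on every refresh, passing the data rows through untouched. A Python sketch (the heading names and attribute codes here are placeholders; substitute the ones from your actual feed):

```python
import csv

# Wholesaler heading -> Magento attribute code. Illustrative names only;
# fill this in from the edits you already made by hand in Excel.
HEADER_MAP = {
    "Product Name": "name",
    "Item Code": "sku",
    "Retail Price": "price",
}

def transform_feed(src_path, dest_path, header_map=HEADER_MAP):
    """Rewrite the wholesaler CSV with Magento-compatible column headings.
    Columns not present in the map are passed through unchanged."""
    with open(src_path, newline="") as src, \
         open(dest_path, "w", newline="") as dest:
        reader = csv.reader(src)
        writer = csv.writer(dest)
        headers = next(reader)
        writer.writerow([header_map.get(h, h) for h in headers])
        writer.writerows(reader)
```

Run it on each renewed feed before handing the file to Magento's import, and the manual renaming step disappears.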
