Loading and saving to a txt/csv file? - Tabulator

I am trying to set up Tabulator, with all its data validation goodness and simple-to-use UI, to help a colleague with CRUD operations he has to do daily on a .txt file.
I found that Tabulator can load data using AJAX, but my question is: is it possible to load the data from a .csv/.txt file and then save it back to the same file?
I know you can export to .csv, but unless the export replaces the loaded file, all his work would be lost the next time the data is loaded.

If you are referring to a file on a user's local computer, then I'm afraid there is no import-from-file functionality built into Tabulator, but there is nothing to stop you implementing that bit yourself.
The link below is an article that explains how to load a CSV file from an input element in JavaScript. In the example it loads it into an HTML table, but you could easily alter that to dump it into an array of objects to pass into Tabulator's setData function:
http://codeanalyze.com/Articles/Details/20174/Read-CSV-file-at-client-side-and-display-on-html-table-using-jquery-and-html5
In terms of saving the data back to the user's computer, you would need to use the built-in download function; there is no way to save it back without the file popup, due to browser security constraints.
But I will add that the above approach is a bit unorthodox. The usual way to handle data persistence would be to save the data back to your server into a database, then load it back to the client with an AJAX request, giving the user the option to download the data when they want the final copy.
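
If the purely client-side route is what you need, here is a minimal sketch of the load/save round trip. The element ids are assumptions, and the naive comma-split parser will not cope with quoted fields containing commas; a proper CSV parsing library would be safer for real data.

// A minimal sketch; element ids are assumptions, and the split-based
// parser below does not handle quoted fields containing commas.
var table = new Tabulator("#example-table", {
  columns: [
    { title: "Name", field: "name", editor: "input" },
    { title: "Age", field: "age", editor: "number" },
  ],
});

document.getElementById("csv-file").addEventListener("change", function (e) {
  var file = e.target.files[0];
  if (!file) return;
  var reader = new FileReader();
  reader.onload = function () {
    var lines = reader.result.trim().split("\n");
    var headers = lines[0].split(",");
    var data = lines.slice(1).map(function (line) {
      var cells = line.split(",");
      var row = {};
      headers.forEach(function (h, i) { row[h.trim()] = cells[i]; });
      return row;
    });
    table.setData(data); // load the parsed rows into Tabulator
  };
  reader.readAsText(file);
});

// Saving: trigger the built-in CSV download (opens the browser's save dialog)
document.getElementById("save-btn").addEventListener("click", function () {
  table.download("csv", "data.csv");
});

Your colleague would then save the downloaded file over the original .txt/.csv himself, since the browser will not overwrite it silently.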

Related

DataTables export to Excel

I'm currently working on a project (developed using Laravel, PHP, JavaScript, jQuery, HTML) containing a large amount of data, so I'm using DataTables with serverSide set to true to display it. What I'm trying to figure out is how to export the complete table to an Excel file; right now, using the Buttons extension, it only saves one page (the one being shown at that very moment).
I've been reading about it for a while now but still can't figure out a way to do this. I understand that, since serverSide is set to true, the only rows that exist on the client side are those shown in the table at any one time. But how can I get the complete table? Any help would be appreciated!
UPDATE:
So I create the Excel file with the data I want in the backend, using PHPExcel, but now the problem is that it is saved server-side, while I want to make it downloadable (client-side). From what I've been reading, I must add the appropriate headers to do so, but nothing I've tried works. Using
this, I managed to output the data of the Excel file to the screen, but it just shows gibberish... I should probably also mention that I'm new at this!
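
For what it's worth, once the export is generated in the backend, the usual way to sidestep header problems on the client is to let the browser request the file directly, for example from a custom button instead of the built-in excel button. A sketch, assuming a hypothetical /export-excel route on the Laravel side that returns the file with a Content-Disposition: attachment header:

// The /export-excel endpoint is an assumption; it should stream the
// PHPExcel file back with a Content-Disposition: attachment header.
$('#example').DataTable({
  serverSide: true,
  ajax: '/data',
  dom: 'Bfrtip',
  buttons: [
    {
      text: 'Export all to Excel',
      action: function (e, dt, node, config) {
        // Navigating to the URL lets the browser handle the download,
        // so no header juggling is needed on the client
        window.location.href = '/export-excel';
      },
    },
  ],
});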

live search on webpage without refreshing it

I know you can achieve it with PHP, MySQL and AJAX, but it reads the data from a database, right?
I'm thinking about how I should build a small page with that. Should I place all my text and titles in the database, or is there a way to do that inside the HTML structure?
Don't put all the data inside the document; that would eventually make the website slow.
Store the data in a database and fetch it while typing in the search field. You can either send a request as the user types using AJAX, or download all the data once using AJAX, store it in e.g. localStorage, and search that locally.
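
As a rough illustration of the request-as-you-type variant, here is a minimal sketch; the /search endpoint and the element ids are assumptions, and the endpoint is assumed to return matching items as a JSON array:

// Debounced live search; /search is a hypothetical endpoint returning
// a JSON array of objects with a "title" property (also an assumption)
var input = document.getElementById('search');
var results = document.getElementById('results');
var timer;

input.addEventListener('input', function () {
  clearTimeout(timer);
  // Only fire the request once the user pauses typing for 300 ms
  timer = setTimeout(function () {
    fetch('/search?q=' + encodeURIComponent(input.value))
      .then(function (res) { return res.json(); })
      .then(function (items) {
        results.innerHTML = items
          .map(function (item) { return '<li>' + item.title + '</li>'; })
          .join('');
      });
  }, 300);
});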

Excel 2007 Refresh Imported CSV File From Web

Log data from a test is uploaded to a web service, and the processed CSV is downloaded back into Excel for viewing in charts. At the moment, this is done via copy and paste for short CSV files and the Data > From Text feature for larger CSV files. Unfortunately, this takes a bunch of time for every test, and I need to make the process very simple for someone else to update the Excel spreadsheet.
The Excel spreadsheet contains 5 raw-data pages which are used to store the CSV from the server. I have no issues selecting Data > From Text, entering the website URL, and completing the format to import. This process can be repeated (same as the Copy and Paste) for all 5 pages to import the data.
This process only allows me to put in one filename, so I am using the same URL for the data, and having PHP return the CSV of the latest (or a specifically configured) test whenever the website is accessed. I've verified that this process is working correctly.
Unfortunately, when I do 'Refresh All', it prompts for a filename unless I go to Data > Connections > Properties, and uncheck 'Prompt for file name on refresh'.
However, even when I do that, I'm getting mixed results. Sometimes only one of the pages will update. (Seems to be the last one I set up.) Sometimes none of them do. I need a solution which updates all 5 pages based on the current CSV from the server without having to set up the connections again every time. Ideally I'd like to just hide these raw data sheets so we can have an Excel file that's just the final charts.
Surely this is a common function and I am doing something wrong, yet all the guides I try on the Internet don't seem to work. For example, this one:
http://www.kimgentes.com/worshiptech-web-tools-page/2010/8/18/web-connecting-csv-files-as-external-data-to-excel-spreadshe.html
Seems like they only set up one connection. I can get one working to refresh, but not more than one.
I have seen this happen and finally figured it out. There are actually 3 things that can happen to give this result, and a separate solution for each:
First, Excel uses the IE 11 web object when it retrieves data from the web. This means it will be "sticky" to sessions created with IE 11. Most websites these days run on cloud servers, which distribute sessions across servers depending on load. This normally has no impact on users in web browsers, since they log in and can visually enter their credentials, etc. But when a program accesses a website through a specific web browser, it must use the properties of that browser and how it works. I ran into this a lot: I would generate and be able to download my CSV files on the website in Chrome, but then trying to use Excel to import the same files wouldn't work (it would say they weren't there). The solution, at least for now, is to use IE 11: log in to the website, generate the CSV files, and test that they can be downloaded. Then use Excel to run the web import, and it should pick up the same sticky session to get the CSV files.
Second, password entry is a different thing, but it also has to do with the stickiness of the data. For some reason Excel will not cache your credential responses for logging into a website until you have entered them 3 times. Your experience may differ, but I found that I must enter a new credential set (for a new web import of a CSV) 3 times before it becomes permanently cached by Excel. After that, I don't have the problem.
Third, most complex Excel workbooks that require web import may also require that you import local data you downloaded from a website, import data from a website into a sheet, or run more complex objects like macros. All of these need proper permissions. You may need to adjust your Trust Center settings to allow your Excel program to work this way; that is part of MS Office. You can add and update trusted locations as per the Microsoft info here:
https://support.microsoft.com/en-us/office/add-remove-or-change-a-trusted-location-7ee1cdc2-483e-4cbb-bcb3-4e7c67147fb4

Creating tables within CKEditor from uploaded files

Would it be possible to create and populate tables within CKEditor from uploaded files?
A user would choose a file from their machine in any format, such as Word or Excel, and it would be displayed as a formatted table in the editor...
It is pretty clear that one can create a table in CKEditor. Also, it is possible to add a table automatically. Now, if you want to automatically add a table based upon the content of an uploaded file, then:
create a form where the file can be uploaded
implement the feature with which one can upload the file
make sure you know where your files are, either using a database or a deterministic algorithm
implement a server-side functionality which prepares the content (table) data
use the prepared data as input on the client side (a minimal sketch of this last step follows)
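
Here is a minimal sketch of that client-side step, assuming CKEditor 4, a file input with the id shown, and a hypothetical /upload endpoint that accepts the file and responds with the prepared <table> HTML:

// /upload is a hypothetical server endpoint that returns table HTML
var editor = CKEDITOR.replace('editor1');

document.getElementById('file-input').addEventListener('change', function (e) {
  var formData = new FormData();
  formData.append('file', e.target.files[0]);

  fetch('/upload', { method: 'POST', body: formData })
    .then(function (res) { return res.text(); })
    .then(function (tableHtml) {
      // Insert the server-prepared table at the current cursor position
      editor.insertHtml(tableHtml);
    });
});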

How do I load all the data from a webpage with QlikSense?

So what I want to do is get all the data from this webpage http://abreview.ru/stat/aeb/ into QlikSense Desktop.
My Attempt
I tried to do this in the data load editor through a web file as a connection, but it only loaded a part of the data (the part that can be seen without changing the data filters when the webpage is first loaded).
So, how can the full amount of data be loaded? What are some good ways of doing this?
I could copy the data into an Excel file and then load it through Excel, but that is a lot of manual work, and I want to find an efficient solution, if there is one.
Great question. I've written this up as a step-by-step process, as I've never done this before either.
To connect to the website, you need to open the data load editor.
On the right-hand side you need to click "Create new connection".
You then enter the URL and name the connection.
Once this is created, the connection will appear on the right; click the select data icon.
You then need to pick the table you want to load; I'm assuming you want the fifth one.
Then click insert script and load the data.
You can then get on with creating your apps.
