I'm using Laravel 8 and I have uploaded Excel file into the server.
The name of the file is "data.xlsx" and it's stored in /public/uploads.
Inside the data.xlsx there are three columns with data.
In my database I have already created a table called "promos". Inside that table there are three columns: id, name, code.
How can I store the first and the second column data from the data.xlsx file into the "promos" table, specifically name and code column?
Is it possible to achieve this without using additional libraries in Laravel?
If you don't want to use a Laravel package to read the Excel data and save it to the database, you can write it in plain PHP with PhpSpreadsheet's IOFactory::createReader('Xlsx'). (Note that PhpSpreadsheet is itself a library, just not a Laravel-specific one.)
The easy solution is to use the Maatwebsite package (https://github.com/Maatwebsite/) in Laravel. Its functionality is easy to understand, and you can customize the import to store only the first and second column data from the data.xlsx file.
You can use https://laravel-excel.com/ to easily import Excel data into the database.
I need to copy the file names of Excel files that sit in my Azure Storage as blobs and then put these names into a SQL Server table using ADF. The file path can serve as the file name, but the hardest part is that in the dataset that takes all the files from one specific folder, I have to select a sheet name, and the sheet names differ for each file, so it returns an error. Is there a way to create a collective dataset without indicating the sheet name?
So, if I understand your question correctly you are looking for a way to write all Excel filenames to a SQL Database using ADF.
You can use the generic Get Metadata activity with a Binary dataset as the source. Select Child items as the field to retrieve; because the dataset is binary, no sheet name is needed, and this returns all files in the folder. Then add a Filter activity to keep only the Excel file types.
Hope that this gets you on the right track.
On one hand there is an excel file with a table. In the table there are village names and a bunch of attributes in the columns.
On the other hand there is one docx file for each village. Within this file there are tables that need to be dynamically updated based on what is written in the xlsx file. If I was within excel I could use simple INDEX MATCH formulas that use name of village and name of paragraph to retrieve the right information from the xlsx file. But I am in a docx file...
Naming each cell in the xlsx and linking it one by one would be too tedious (there are hundreds of fields). Is there any way I could avoid VBA? Thank you for any ideas (including VBA if really necessary).
You could use a DATABASE field in the Word document to retrieve the data for each village from the Excel file - one DATABASE field per village. See:
https://support.microsoft.com/en-us/office/field-codes-database-field-04398159-a2c9-463f-bb59-558a87badcbc
You will, of course, need to create the necessary query statement.
An example of such a DATABASE field's usage can be found at: https://www.msofficeforums.com/mail-merge/21847-mail-merge-into-different-coloumns.html#post67097
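The query statement inside the DATABASE field is essentially a SQL SELECT run against the workbook, where each worksheet is addressed as [SheetName$]. A minimal sketch, assuming the villages sheet is named Sheet1 and has a Village column (the sheet name, column name, and village value here are placeholders for your actual file):

```sql
SELECT * FROM [Sheet1$] WHERE [Village] = 'ExampleVillage'
```

This statement goes into the field's \s switch, with the workbook path in the \d switch, along the lines of { DATABASE \d "C:\\Data\\Villages.xlsx" \s "SELECT ..." }.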
I built a data entry UserForm to populate a worksheet that serves as the raw database. The raw data requires further manipulation and analysis before it can be reported, so I set up a database connection using Get External Data > From Microsoft Query > Excel Files, pointed it at the file I was already working in, selected the fields I wanted, and applied basic functions to those I wanted aggregated. This creates an Excel table where I then use formulas to complete the analysis. It works great for me: I can add entries to the database, refresh the summary table, and the new entries are added and the formulas populate automatically.
The problem is that no one else can refresh the table because it's looking locally for the file. The connection string is:
DSN=Excel Files;DBQ=C:\Users\MyName\Desktop\Folder 1\Results.xlsm;DefaultDir=C:\Users\MyName\Desktop\Folder 1;DriverId=1046;MaxBufferSize=2048;PageTimeout=5;
I have a very basic understanding of the database connections, but I need this file to be as automated as possible by request of my colleague. Can I fix the connection string so that the file is "flexible" and can be refreshed on any computer? Is this the best solution? If not, what else can I do that does not involve downloading additional plugins or 3rd party add-ins?
If what you need is a file containing the raw data (a database) AND one or more Excel files connected to it that pick up the data from the database and work with it, you need to split the two things. You can build the database as an Access file located in a shared directory, with an appropriate table, and reproduce the user form in that file so data entry happens there. Then you connect one or more Excel files (using connection Mode = Share Deny None, so you can update the data and at the same time work with it from the Excel files); the data is imported into those files as tables, and there you do all the processing you need.
If one file is enough for you (you don't need the raw data kept in a separate database, and you don't need to use the file from different locations simultaneously), and the whole problem is that the file fails to refresh when opened from a location different from the one specified in the connection string, well, in that case (which seems to be your case) I don't see why you should use a connection to the same file at all.
If what you need is a table to work with, just create it by selecting the range with the data you have already inserted (see "Create a table - quick start guide"), and then when you add data through the form, instead of adding it to a "normal" row, add it to a new row of the table with something like WorkSheets("name").ListObjects("table_name").ListRows.Add and fill the data into that new table row.
I have created types using Oracle objects and created a table
CREATE OR REPLACE TYPE OttawaAddress_Ty AS OBJECT
(StrtNum NUMBER(9),
Street VARCHAR2(20),
City VARCHAR2(15),
Province CHAR(2),
PostalCode CHAR(7));
/
CREATE OR REPLACE TYPE OttawaOfficesInfo_Ty AS OBJECT
(Name VARCHAR(35),
OfficeID VARCHAR2(2),
Phone VARCHAR2(15),
Fax CHAR(15),
Email CHAR(30));
/
CREATE TABLE OttawaOffices
(OfficeAddress OttawaAddress_Ty,
OfficeInfo OttawaOfficesInfo_Ty,
Longitude_DMS NUMBER (10,7),
Latitude_DMS NUMBER (10,7),
SDO_GEOMETRY MDSYS.SDO_GEOMETRY);
I have an Excel file which holds the data and I need to import to this Oracle table using INSERT INTO SQL statements. How can I do this? As you can notice, I have a column called SDO_GEOMETRY which will hold the Decimal Degrees of the records. These decimal degrees are saved in two separate columns in my Excel file.
I am not sure whether I can programmatically insert the values from Excel, or whether I need to go through every record and create individual
INSERT INTO ... VALUES ... statements. And if so, how do I supply values for the object types I have created?
Oracle has a really neat feature called external tables. These look like regular tables from inside the database, so we can execute SELECT statements against them. The trick is that the table's data comes from OS files (hence "external"). We just define the table to have the same structure as the spreadsheet's columns. It doesn't work with the Excel binary format, but it does work with CSV files (so Save As CSV first).
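A minimal sketch of such an external table, assuming the spreadsheet has been saved as offices.csv with a header row and a database directory object (here called ext_dir) points at the server folder holding it - the file name, directory name, and column names are all placeholders mirroring the spreadsheet:

```sql
-- The directory object must already exist, e.g.:
-- CREATE DIRECTORY ext_dir AS '/data/loads';
CREATE TABLE offices_ext (
  strtnum    NUMBER(9),
  street     VARCHAR2(20),
  city       VARCHAR2(15),
  province   CHAR(2),
  postalcode CHAR(7),
  name       VARCHAR2(35),
  officeid   VARCHAR2(2),
  phone      VARCHAR2(15),
  fax        CHAR(15),
  email      CHAR(30),
  longitude  NUMBER(10,7),
  latitude   NUMBER(10,7)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY ext_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    SKIP 1                          -- skip the CSV header row
    FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '"'
  )
  LOCATION ('offices.csv')
);
```

Once created, the file's rows can be queried like any table (SELECT * FROM offices_ext).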
The advantage of external tables is that manipulating data is easy in SQL - it's what SQL does best - and we don't need to load anything into a staging table. Something like
insert into OttawaOffices
select ottawaaddress_ty(ext.strtnum,ext.street,ext.city,ext.province,ext.postalcode)
, ottawaofficesinfo_ty (ext.name,ext.officeid,ext.phone,ext.fax,ext.email)
, ext.longitude
, ext.latitude
, mdsys.sdo_geometry(2001, 4326, mdsys.sdo_point_type(ext.longitude, ext.latitude, null), null, null) -- 2001 = 2-D point; 4326 = WGS 84 SRID (adjust as needed)
from your_external_table ext
/
The limitation of external tables is the need to get the source file onto the database server and create a database directory object. Some places are funny about this.
Anyway, you can find out more in the Oracle documentation on external tables.
I'm not going to pass judgement on declaring table columns as user-defined types. It's usually considered bad practice, but maybe it works in your use case.
EDIT: I think this question belongs over at Super User, not here on Stack Exchange.
What I would like to do is have a single Excel file that calls up data from every Excel file in a given directory. Specifically, if I have a time sheet Excel file from multiple people working multiple different job numbers, I would like that data populated in a single file covering everyone's times. The directory where the files are stored would be updated weekly, so I would want the "master" Excel file to reflect the weekly changes automatically... hopefully. Is there an easy way to do this that I would be able to teach someone else?
Import every file into a database table using a stored procedure, then export a single Excel file. You can schedule this as a job. Use OPENROWSET to read the workbooks and xp_cmdshell to enumerate the files in the directory. What technology are you using?
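As a sketch of that approach on SQL Server, assuming the workbooks sit in C:\TimeSheets, each has a Sheet1 with the same column layout, and the Microsoft ACE OLE DB provider plus 'Ad Hoc Distributed Queries' and xp_cmdshell are enabled on the instance (all table names, column names, and paths here are placeholders):

```sql
-- Read one workbook's sheet into a staging table.
INSERT INTO dbo.TimeSheetData (EmployeeName, JobNumber, HoursWorked)
SELECT EmployeeName, JobNumber, HoursWorked
FROM OPENROWSET(
       'Microsoft.ACE.OLEDB.12.0',
       'Excel 12.0;Database=C:\TimeSheets\person1.xlsx;HDR=YES',
       'SELECT * FROM [Sheet1$]');

-- Enumerate the weekly files; loop over this result and run
-- the INSERT above for each path via dynamic SQL.
EXEC xp_cmdshell 'dir /b C:\TimeSheets\*.xlsx';
```

A stored procedure wrapping these two steps can then be scheduled as a SQL Server Agent job so the master data stays current with the weekly drops.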