I used MS SQL Server 2008 R2 (MS SQL), where I could right-click the query result and copy/paste it with headers into Excel for easy exploration. Now with pgAdmin (PostgreSQL) I have to do an export (File > Export > CSV) and then a bunch of Excel steps (Text to Columns).
Is there an easy way to copy/paste the query result with headers into Excel?
For pgAdmin 4, there is an option to "Copy with headers". It is a drop-down beside the copy button in the Query Tool menu.
pgAdmin seems to make the semicolon the default field separator, while Excel expects tabs by default.
You could change Excel's defaults, or just use the "Text to Columns" feature each time.
I personally would go to Preferences -> Query tool -> Results grid and change the following:
Result copy quote character: "
Result copy field separator: Tab
Copy column names: True
This will make it behave more like SQL Server Management Studio.
There are a lot of different ways to accomplish what you want here. The question is a bit confusing, because you are talking about Excel, but then you talk about '/var/lib/postgres/myfile1.csv', which makes me think you are now using some flavor of Linux.
I'm using Ubuntu 12.04 with pgAdmin III 1.16.0, and I have LibreOffice 3.5.4.2 installed as the Excel replacement.
I'm not sure why you want to take the information out of the grid in pgAdmin III, but assuming you just want to move the data over to a spreadsheet to play with it, the easiest way is to run your query, click the upper-left corner of the results (which, just like in a spreadsheet, selects everything), and copy. Then you should be able to open LibreOffice and paste in the information. It will bring up the same dialog you would see when importing a CSV file.
Also, you should be able to start psql and use the COPY command. If you get a permissions error, try \copy instead. Please see the PostgreSQL docs on COPY, as well as this wiki page.
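A minimal sketch of both forms, reusing the file path from the question (the table name my_table is a placeholder):

COPY (SELECT * FROM my_table) TO '/var/lib/postgres/myfile1.csv' WITH CSV HEADER;
-- \copy runs client-side in psql and writes the file with your own
-- user's permissions, which avoids the server-side permissions error:
\copy (SELECT * FROM my_table) TO 'myfile1.csv' WITH CSV HEADER

The CSV HEADER option is what gets you the column names in the first row.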
If I'm missing what you are trying to do, please ask questions in the comments section, and I'll try to improve my answer accordingly.
You have to set your query tool output to text, not the grid data. That way the column names and the query results are all in the same cut-and-paste text. When you do this you are no longer doing CSV; the whole result set and the field names come over as text in the cut-and-paste process.
Answering quite an old post:
The answer by @Phillip Fleischer seems to be the best way, at least in pgAdmin III. But for pgAdmin III version 1.22.2 (the one I am using), instead of Preferences..., the settings mentioned are under File > Options > Query tool > Results grid.
Related
I'm trying to create an Excel file with a specific layout: a few header rows (Info1/2/3 and so on) at the top and a data table below.
I tried to use https://asktom.oracle.com/pls/apex/f?p=100:11:0::::P11_QUESTION_ID:769425837805, but it is good for putting ONLY a simple table in the report - I need the Info1/2/3 data etc. as well.
I tried CSV separated by ';', but that leaves Excel unformatted and unreadable.
Any ideas?
PS: If there is a way to paint cells - great, but if not, it isn't necessary.
PS2: On second thought, the column count in the bottom part of the report varies between different input data, so that makes the tool from the link useless.
You might want to check out the following two options:
Free, as part of the Alexandria tools: https://github.com/mortenbra/alexandria-plsql-utils/blob/master/ora/xlsx_builder_pkg.pkb
Commercial option: https://www.oraexcel.com
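If you go with the Alexandria package, a rough sketch of a free-form layout like the one you describe (header info rows plus a table whose column count varies) might look like this. The procedure names follow the package's cell/save API as I recall it, but the sheet name, cell positions, and directory object are placeholders, so check the package spec before relying on it:

BEGIN
  xlsx_builder_pkg.clear_workbook;
  xlsx_builder_pkg.new_sheet('Report');
  -- free-form header rows (the Info1/Info2/Info3 part of the layout)
  xlsx_builder_pkg.cell(1, 1, 'Info1 value');
  xlsx_builder_pkg.cell(1, 2, 'Info2 value');
  xlsx_builder_pkg.cell(1, 3, 'Info3 value');
  -- the tabular part starts lower down and is written cell by cell,
  -- so a column count that varies with the input data is not a problem
  xlsx_builder_pkg.cell(1, 5, 'Col A');
  xlsx_builder_pkg.cell(2, 5, 'Col B');
  xlsx_builder_pkg.save('MY_DIR', 'report.xlsx');  -- MY_DIR: an Oracle directory object
END;
/

Because you address cells individually rather than dumping a single query result, this sidesteps the simple-table-only limitation of the Ask Tom approach.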
I have a stored procedure that returns formatted, delimited text using dbms_output.put_line statements. Currently, we run the script in Toad and manually paste the output into Excel, but I was hoping I could cut out a step and get the output directly into Excel. I created a connection and set the properties to run the SP: that works fine (more or less -- the next step would have been to figure out how to supply a parameter). However, since no query is being returned, Excel doesn't recognize that there's anything to be done. Is there any way to do this automagically? Thanks.
ETA: I was just trying to figure out whether I could build a cursor by inserting the GET_LINE output into it and returning that, but it didn't look like it was going to work out.
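For reference, the GET_LINE idea can be made to work by draining dbms_output inside a pipelined table function, so the output comes back as ordinary rows that Excel can query. A rough sketch, where t_line_tab, sp_output_rows, and my_report_proc are placeholder names (the last standing in for the stored procedure that does the put_line calls):

CREATE OR REPLACE TYPE t_line_tab AS TABLE OF VARCHAR2(4000);
/
CREATE OR REPLACE FUNCTION sp_output_rows RETURN t_line_tab PIPELINED IS
  l_line   VARCHAR2(4000);
  l_status INTEGER;
BEGIN
  dbms_output.enable(NULL);   -- unlimited buffer for this session
  my_report_proc;             -- placeholder: the SP doing the put_line calls
  LOOP
    dbms_output.get_line(l_line, l_status);
    EXIT WHEN l_status <> 0;  -- non-zero status means the buffer is empty
    PIPE ROW (l_line);
  END LOOP;
  RETURN;
END;
/

Excel's connection can then run SELECT column_value FROM TABLE(sp_output_rows); like any other query.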
If you are using Toad, most recent versions (10+) allow you to save your output as an Excel file. Earlier versions also allow this, but with different commands.
In the output section at the bottom, right-click on any part of the results:
select "Export Dataset".
select your choice of export file (Excel file)
select a file path and file name
choose whatever options such as saving the sql on a separate worksheet you need
press the button in the lower right corner
Even if the output is comma-delimited CSV, you can then have Excel convert it into a real .xls or .xlsx format.
Our database needs to be filled with the zip codes for every state in our country. We are provided with a catalog of zip codes in an xls file, and we have to import this file into a table in a database hosted on Windows Azure.
I don't know if Stack Overflow allows me to post a link to our xls, but I'll describe the structure of the file:
Every sheet holds the zip code information for a whole state. Inside every sheet there are fifteen columns with information such as zip code, type of terrain, type of area, locality, state, city, etc. Every sheet has the same columns, and the information inside the cells may contain special characters normal in Spanish (i.e. á, é, ó, ú, etc.) that need to be preserved. Also, some cells may be empty, and blank spaces are likely to appear in the contents of the cells (e.g. Villa de Montenegro).
We are looking for a way to import every sheet into our table without losing special characters or skipping empty cells. We have no prior experience doing this kind of task and wanted to know what is the best way to import it.
We tried a suggestion of converting the xls to CSV files and then importing those CSVs into our database. We tried some of the variations of the macro recommended here, but the CSVs were generated with many errors (macros aren't our forte).
In short, what is the best way to import our xls into an Azure database table without losing empty cells or special characters, and without failing when blank spaces appear inside a cell?
I recently had to migrate some data in a similar way. I used the SQL Server 2014 Import and Export Data Wizard. I initially tried with a .csv, but it was finicky about quoted commas and such. When I saved it as a .xlsx file, I was able to upload it without a problem. It's pretty straightforward to use: just select your xls file as the source, configure the connection to your Azure database, next-next-next, and hopefully you get the happy path. I wrote about it on my blog, step by step with screenshots.
We found an easy, although slow, way to copy the contents from an xls using Visual Studio; the version we used was 2012, but it works with 2008 and 2013 too.
Open the Server Explorer.
Add a new connection. The URL for the database is required, and the credentials are the same as the ones you use to access the database on Azure. Test the connection if you like; if the credentials are correct, then you're good to go.
After the connection has been made, expand the Tables section and select the table into which you wish to dump your data.
Right-click and select view table data.
Whether the table is empty or already has some data, the workflow is the same. The last record will be empty; select it.
Go to your xls file. For this to work, the number and order of the columns must be the same as in the table you will be dumping the data into. Select the desired rows and copy them.
Return to Visual Studio and, while the last empty row is selected, paste the data. The data will start copying directly into your Azure database.
Depending on your internet connection and the amount of data you're copying, this might take a long time.
This is an easy solution, although not optimal. It works if you don't own SQL Server with all of its tools. Still gotta check if this works on the Express edition; will update when I test.
Good day!
I need to export the version history data of a list item to Excel.
Unfortunately, I cannot use this solution because I'm from Russia, and the solution only supports the Latin alphabet.
So mostly I need to learn how to extract the data from the version history for a single list item.
Please help: how is this done?
While I haven't found a clean way of doing this, I have a partial workaround.
Use the Export to Excel option in the List section of the ribbon (make sure the list view you export includes a modified column - thanks T6J2E5).
Save the owssvr.iqy file and open it with Notepad.
Copy just the URL from the file and paste it back into your browser, adding "&IncludeVersions=TRUE"
Save the XML file and open in Excel (or your favorite XML viewer), selecting the "As an XML table" open option.
You'll have to delete the first few columns and rows as they contain the schema data but other than that you should have all the version history (I suggest you add the Version column to the view). You can sort by the Modified column to get a chronological change log of the entire list.
FYI "IncludeVersions=TRUE" should be before the List ID for anyone else that needs this.
spurl/_vti_bin/owssvr.dll?XMLDATA=1&IncludeVersions=TRUE&List={ListID}&View={VIEWID}&RowLimit=0&RootFolder=name
I am facing an error after doing the same, saying that a semicolon is missing. How do I resolve it?
I'm looking to pull in the XML feed from Feedburner's API. This is just a matter of writing the URL and using the "From Web" data connection in Excel.
https://feedburner.google.com/api/awareness/1.0/GetItemData?uri=RSSFEEDNAME&dates=2011-08-01,2011-08-05
This works fine (and is pretty fast).
Now, I'd like to be able to update two cells in the "dates" sheet to have it pull that range of data. This is done using parameters in the URL:
https://feedburner.google.com/api/awareness/1.0/GetItemData?uri=RSSFEEDNAME[]
Using the Excel UI, I can then assign the [] to any cell. However, no matter what I try, this doesn't work. I initially thought there might be some issue with the date format, so I've worked myself to the point where I am entering the exact text (&dates=2011-08-01,2011-08-05) into the cell.
Each time, the feed pulls up just the current day's data (which is the default behavior when no dates are specified). It isn't giving an error (which it will do for relatively small infractions, like not having two-digit months), which makes me think it simply isn't replacing the [] with the specified text. I'm also using this same method for a WebTrends Web Service query and getting similarly frustrating results. I've read every how-to on web queries, and I'm following them exactly.
I can't find any place to see the final URL Excel is requesting, so it's a bit of a shot in the dark. Any thoughts on next steps would be greatly appreciated!
Best,
Nathan
The answer was to not use the Web Query "wizard" and just do it by hand.
Open Notepad (or some text editor)
In the editor type the following four lines:
WEB
1
http://example.com/index.html?something=[]&somethingelse=[]
[BLANK]
Save it as anything with an .iqy extension.
Open Excel, go to the Data ribbon, and click "Existing Connections"
Click "Browse for More..."
Find the IQY file you made and click "Open"
Excel will then ask you where you want to put the resulting data, followed by prompts for each placeholder you entered in the URL. Those prompts let you either type in a value, or select a cell to act as the data.
I would have thought that dates should be a named parameter, and that you should link it to whichever cell has the date value(s).
The cell should just have 2011-08-01,2011-08-05 as its value, as long as you create the named parameter dates and link it to that cell.
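For example, a sketch of the .iqy contents using a named parameter (the prompt text here is made up; the ["name","prompt"] placeholder form is the standard web-query parameter syntax):

WEB
1
https://feedburner.google.com/api/awareness/1.0/GetItemData?uri=RSSFEEDNAME&dates=["dates","Enter dates as YYYY-MM-DD,YYYY-MM-DD"]

When Excel runs the query, it prompts for "dates" and offers "Get the value from the following cell", which you can point at the cell holding 2011-08-01,2011-08-05 and set to refresh automatically when the cell value changes.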