SharePoint list version history export to Excel

Good day!
I need to export the version history log data for a list item to Excel.
Unfortunately I cannot use that solution, because I'm from Russia and it only supports the Latin alphabet.
So I mostly need to learn how to extract the data from the version history for a single list item.
Please help: how is this done?

While I haven't found a clean way of doing this, I have a partial workaround.
1. Use the Export to Excel option in the List section of the ribbon (make sure the list view you export includes a Modified column - thanks T6J2E5).
2. Save the owssvr.iqy file and open it with Notepad.
3. Copy just the URL from the file and paste it back into your browser, adding "&IncludeVersions=TRUE".
4. Save the XML file and open it in Excel (or your favorite XML viewer), selecting the "As an XML table" open option.
5. You'll have to delete the first few columns and rows, as they contain the schema data, but other than that you should have all the version history (I suggest you add the Version column to the view). You can sort by the Modified column to get a chronological change log of the entire list.

FYI "IncludeVersions=TRUE" should be before the List ID for anyone else that needs this.
spurl/_vti_bin/owssvr.dll?XMLDATA=1&IncludeVersions=TRUE&List={ListID}&View={VIEWID}&RowLimit=0&RootFolder=name
I am facing an error after doing the same, saying that a semicolon is missing. How do I resolve it?
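
As an aside, if you want to automate the import rather than pasting the URL into the browser by hand, a VBA sketch along these lines should work (the server path and both GUIDs are placeholders, not real values):

    Sub ImportVersionHistory()
        ' Build the owssvr.dll request, with IncludeVersions=TRUE before the List ID.
        ' The site URL and the List/View GUIDs below are placeholders - use your own.
        Dim url As String
        url = "http://yourserver/sites/yoursite/_vti_bin/owssvr.dll?XMLDATA=1" & _
              "&IncludeVersions=TRUE" & _
              "&List={11111111-2222-3333-4444-555555555555}" & _
              "&View={AAAAAAAA-BBBB-CCCC-DDDD-EEEEEEEEEEEE}" & _
              "&RowLimit=0"
        ' Import the response "as an XML table" onto a new sheet.
        ThisWorkbook.XmlImport Url:=url, ImportMap:=Nothing, Overwrite:=True, _
            Destination:=Worksheets.Add.Range("A1")
    End Sub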

Related

VBA: Mail Merge - Field not found dialogue box

I've been writing a tool to extract data out of an SQL database via a set of ODBC tables inside Excel VBA and insert the information into some pre-formatted reports using a mail merge and a Word object. Some of the reports have optional fields that may not always contain data. On those occasions the optional fields are absent from the database entirely.
My code has been designed to be dynamic and to produce the merge information (by way of a CSV file merged using VBA) using the question details in the database as the field headers. My problem stems from when the optional questions are absent from the output merge file and Word prompts the user to "Remove Field". I've struggled to find a programmatic way to answer this dialogue box on behalf of the user (the answer will always be to remove the field), or to have Word just know what to do and not need to ask.
I've had no luck finding a module, command or function that will either tell the Word object to clean this up automatically, or any handler that answers the question programmatically. Any help would be appreciated.
Just in case someone else runs into a similar problem, here is how I solved this issue:
Instead of opening a data source, open a header source. In my case the header source and the data source are the same file.
Compare the merge fields (< word doc object >.MailMerge.Fields) to the fields inside your file (< word doc object >.MailMerge.DataSource.FieldNames). Delete each merge field that is not found (< MailMergeField >.Delete).
Open the data source as normal, and set the first record to 2 if it's the same file as your header file.
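
In code, those steps look roughly like this. It's only a sketch against a late-bound Word document object: the CSV path is a placeholder, and the field-code parsing assumes plain MERGEFIELD codes with no formatting switches.

    Sub RemoveMissingMergeFields(wordDoc As Object)
        ' Attach the merge file as a header source first (placeholder path).
        wordDoc.MailMerge.OpenHeaderSource Name:="C:\Reports\merge.csv"

        Dim i As Long, j As Long
        Dim fieldName As String
        Dim found As Boolean

        ' Walk the merge fields backwards so deleting doesn't skip any.
        For i = wordDoc.MailMerge.Fields.Count To 1 Step -1
            ' A field code looks like " MERGEFIELD SomeName ".
            fieldName = Trim$(Replace(wordDoc.MailMerge.Fields(i).Code.Text, "MERGEFIELD", ""))
            found = False
            For j = 1 To wordDoc.MailMerge.DataSource.FieldNames.Count
                If StrComp(wordDoc.MailMerge.DataSource.FieldNames(j).Name, fieldName, vbTextCompare) = 0 Then
                    found = True
                    Exit For
                End If
            Next j
            ' Remove the field Word would otherwise raise the "Remove Field" prompt for.
            If Not found Then wordDoc.MailMerge.Fields(i).Delete
        Next i

        ' Re-attach the same file as the data source and skip the header row.
        wordDoc.MailMerge.OpenDataSource Name:="C:\Reports\merge.csv"
        wordDoc.MailMerge.DataSource.FirstRecord = 2
    End Sub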

Specify Excel 'Display Format' while importing data

I'm currently using an Excel document as a template for generating a report. This is done by first specifying an XML map in Excel and then importing data against it. The report generation works fine.
The problem is that I want the display format of the cells to be 'General' and not 'Text' after the import. I came across this link (yes, Excel 2007):
http://office.microsoft.com/en-gb/excel-help/xml-schema-definition-xsd-data-type-support-HP010206414.aspx#BMxsdexport
The link specifies that Excel will set string data from the XML import to display as 'Text' by default. I need it to be displayed as 'General' instead. Is there a way to do this?
The only solution I've come up with so far is to use a macro to change the display format after opening the document, but if I could do it using only Excel settings, that would be better.
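
For reference, that macro could be as small as the sketch below; it assumes the imported XML lands in the first list object on a sheet named "Sheet1", which is my assumption rather than part of the original setup.

    Sub FormatImportedDataAsGeneral()
        ' After the XML import, switch the mapped cells from Text back to General
        ' and re-enter the values so Excel re-evaluates them under the new format.
        Dim lo As ListObject
        Set lo = ThisWorkbook.Worksheets("Sheet1").ListObjects(1)
        lo.DataBodyRange.NumberFormat = "General"
        lo.DataBodyRange.Value = lo.DataBodyRange.Value
    End Sub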
Try to use the text import feature: http://office.microsoft.com/en-us/excel-help/text-import-wizard-HP010102244.aspx
NOTE: the important step that should address your need is the "Column data format" section, which often gets overlooked as it is the last step of the import. I hope that helps.
The mapping cannot be changed.
http://social.technet.microsoft.com/Forums/en-US/fdf99171-0a53-4716-9e72-25afc36ddf90/specify-excel-display-format-while-importing-data

PostgreSQL: Copy/paste query results with headers into Excel without code

I used MS SQL Server 2008 R2 (MS SQL), where I could right-click the query result and copy/paste it with headers into Excel for easy exploration. Now with pgAdmin (PostgreSQL) I have to export (File > Export > CSV) and then go through a bunch of Excel steps (Text to Columns).
Is there an easy way to copy/paste the query result with headers into Excel?
For pgAdmin 4, there is a "Copy with headers" option. It is a drop-down beside the Copy button in the Query Tool menu.
pgAdmin seems to make semicolon the default field separator, while Excel expects tabs by default.
You could try to change Excel, or just use the "Text to Columns" feature each time.
I personally would go to Preferences > Query tool > Results grid and change the following:
Result copy quote character: "
Result copy field separator: Tab
Copy column names: True
This will make it behave more like SQL Server Management Studio.
There are a lot of different ways to accomplish what you want here. The question is a bit confusing because you are talking about Excel, but then you talk about '/var/lib/postgres/myfile1.csv', which makes me think you are now using some flavor of Linux.
I'm using Ubuntu 12.04 with pgAdmin III 1.16.0, and I have OpenOffice installed, with LibreOffice 3.5.4.2 as the Excel replacement.
I'm not sure why you want to take the information out of the grid in pgAdmin III, but assuming you just want to move the data over to a spreadsheet to play with it for some reason, about the easiest way is to run your query, click the upper-left corner of the results (which, just like in a spreadsheet, selects everything), and copy. Then you should be able to open LibreOffice and paste the information in. It will bring up the same dialog you would see when importing a CSV file.
Also, you should be able to start psql and run a COPY command. If you get a permissions error, try the suggested \COPY instead. Please see the PostgreSQL docs; there is also a wiki page on this.
If I'm missing what you are trying to do, please ask questions in the comments section, and I'll try to improve my answer accordingly.
You have to set your query tool output to text, not the grid. That way the column names and the query results all end up in the same copied text. When you do this you are no longer dealing with CSV; the whole result set, field names included, comes over as text in the copy-and-paste.
Answering quite an old post:
The answer by @Phillip Fleischer seems to be the best way, at least in pgAdmin III. But in pgAdmin III version 1.22.2 (the one I am using), instead of Preferences..., the settings mentioned are under File > Options > Query tool > Results grid.
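
Another option, since the end goal here is Excel anyway: skip the clipboard entirely and pull the query straight into a worksheet over ODBC from VBA, writing the headers yourself. This is only a sketch and assumes the psqlODBC ("PostgreSQL Unicode") driver is installed; the connection details and the query are placeholders.

    Sub PullQueryWithHeaders()
        Dim cn As Object, rs As Object
        Dim ws As Worksheet
        Dim i As Long

        ' Placeholder connection details; requires the PostgreSQL ODBC driver.
        Set cn = CreateObject("ADODB.Connection")
        cn.Open "Driver={PostgreSQL Unicode};Server=localhost;Port=5432;" & _
                "Database=mydb;Uid=myuser;Pwd=mypassword;"

        Set rs = cn.Execute("SELECT * FROM my_table;")   ' placeholder query

        Set ws = ThisWorkbook.Worksheets.Add
        For i = 0 To rs.Fields.Count - 1
            ws.Cells(1, i + 1).Value = rs.Fields(i).Name ' column headers
        Next i
        ws.Range("A2").CopyFromRecordset rs              ' data rows

        rs.Close
        cn.Close
    End Sub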

CSV Exporting: Preserving leading zeros

I'm working on a .NET application which exports CSV files to be opened in Excel, and I'm having a problem with preserving leading zeros when the file is opened in Excel. I've used the method mentioned at http://creativyst.com/Doc/Articles/CSV/CSV01.htm#CSVAndExcel
This works great until the user decides to save the CSV file from within Excel. If the file is then opened again in Excel, the leading zeros are lost.
Is there anything I can do when generating the CSV file to prevent this from happening?
This is not a CSV issue.
This is Excel loving to play with CSV files.
Change the extension to something else.
As @GSerg mentions, this is not a CSV issue.
If your users must edit/save in Excel, then after opening the CSV file they need to select the entire worksheet, right-click, choose "Format Cells", and from the Category list select "Text". This will preserve the leading zeros, since the numbers will be treated as plain text.
Alternatively, you could use the Open XML SDK 2.0, or some other Excel library, to create an .xlsx file from your CSV data and programmatically set the cell type to Text in order to take the end users out of the equation...
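
Another angle on the same idea: if you can give the users a macro to open the file with, instead of having them double-click the CSV, you can import it through a QueryTable and declare the zero-padded column as text before Excel ever parses it. A sketch; the file path and the column layout are placeholders.

    Sub OpenCsvPreservingZeros()
        ' Import the CSV via a QueryTable so the zero-padded column is
        ' read as text instead of being converted to a number on open.
        Dim ws As Worksheet
        Set ws = ThisWorkbook.Worksheets.Add

        With ws.QueryTables.Add(Connection:="TEXT;C:\data\export.csv", _
                                Destination:=ws.Range("A1"))
            .TextFileParseType = xlDelimited
            .TextFileCommaDelimiter = True
            ' First column (e.g. the phone number) as text, second as general.
            .TextFileColumnDataTypes = Array(xlTextFormat, xlGeneralFormat)
            .Refresh BackgroundQuery:=False
        End With
    End Sub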
I found a nice way around this: if you add a space anywhere in the phone number, the cell is not treated as a number and is treated as a text cell in both Excel and Apple's iWork Numbers.
It's the only solution I've found so far that plays nice with Numbers.
Yes, I realise the number then has a space, but this is easy to process out of large chunks of data; you just have to select a column and remove all the spaces.
Also, if this is web related, most web forms are fine with users entering a space in a number field, e.g. you can still tap-to-call on mobiles.
The challenge is to get the space in there in the first place.
In use:
01202123456 = 1202123456
but
01202 123456 = 01202 123456
Ok, new discovery.
Using Quick Preview on a Mac to view a CSV file, the telephone column displays perfectly, but opening the file fully with Numbers or Excel ruins that column.
On some level Mac OS X is capable of handling that column correctly with no user meddling.
I am now working on the best/easiest way to make a website output a universally accepted CSV with telephone numbers preserved.
But maybe with that info someone else has an idea on how to make Numbers handle the file in the same way that Quick Preview does?

How do I save an Excel 2007 file in "OOXML" (xml text) so that I can modify it in code?

I made an Excel file with data on tab 2 and a chart on tab 1. This is for a web portal where investors can download the Excel document with uber graphics and the like, but with their own data.
So the 'simple' fix in my mind is to save the Excel document as "OOXML" and just replace the data items. However, it seems that the document is encrypted (at least, it's not readable in Notepad).
How do I get to where I need to go here?
Thanks,
Found my solution: using the Office Open XML SDK and googling / playing with it for a while. (It turns out the .xlsx file isn't encrypted; it's a ZIP package of XML parts, which is why it looks unreadable in Notepad.)
