A couple of reports on our server have had failing subscriptions recently; the queries run fine, the reports render fine in the browser, and some parameter combinations that filter the data down still deliver fine.
A subscription has failed to send, and after looking into it, it's actually the export to Excel that fails (other export formats, including CSV, work):
Reporting Services Error: For more information about this error navigate to the report server on the local server machine, or enable remote errors
I've made a copy of the report and kept filtering it down until I isolated the specific record and field causing the issue; it appears to be an incorrectly entered date:
01/10/0022
Obviously it should be 2022, and it would be better if the data were correct at the source, but we don't have control over that. If I take the column containing that date out of the report, the export works; adding it back in makes it fail again. We've seen examples like this in the data before: some other reports were written using DATETIME instead of DATETIME2 fields, which caused report generation itself to fail. Fixing those reports and re-running their subscriptions worked, so I'm assuming something must have changed on our report server instance that's now causing the export to fail here.
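(For context: Excel's 1900 date system can't represent dates before 1 January 1900, which is presumably why only the Excel renderer chokes on a year-0022 value while a plain-text format like CSV is fine.) If anyone wants to hunt for similar rogue dates in their source data, a rough sketch along these lines works; the connection string, table, and column names here are made up:

    import pyodbc

    # Hypothetical connection string, table, and column names --
    # substitute the report's actual data source details.
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=myserver;DATABASE=ReportData;Trusted_Connection=yes;"
    )
    cur = conn.cursor()

    # Excel's 1900 date system starts at 1900-01-01, so flag anything older.
    cur.execute(
        "SELECT RecordID, EntryDate"
        " FROM dbo.SourceRecords"
        " WHERE EntryDate < '1900-01-01'"
    )
    for record_id, entry_date in cur.fetchall():
        print(record_id, entry_date)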
Has anyone come across this before and know what would be causing the export to fail just based on this date value?
I'm experiencing a weird problem when trying to import XML data to Excel 2016.
I have a Webservice created in C# which conforms to XML standards.
In Excel 2016 I use "Data" -> "From internet" and paste the URL to the web service.
When I finish the Excel import dialog I get a nice table representation of my data from the Webservice.
But when the output of the web service changes and I hit the "Update" option after right-clicking the data table, the data doesn't update!
If I repeat exactly the same operation and click "Update" again, the data does update...
Does anyone know how to overcome this strange behavior?
For reference, I also tried "Update" and "Update All" in the Data ribbon, and every possible combination of the connection properties, to no avail.
I also tried reloading the XML in the browser; there the data updates instantly, so it doesn't seem to be related to caching. I can see Excel briefly show "Connecting" on the first run, but no update occurs.
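In case it helps, a workaround sketch rather than an explanation: since the second manual update always works, the refresh can be scripted to simply run twice. This assumes the pywin32 package and a saved workbook; the path and the double-refresh trick are my own assumptions, not a documented fix:

    import win32com.client

    # Hypothetical workbook path; requires Excel plus the pywin32 package.
    app = win32com.client.DispatchEx("Excel.Application")
    wb = app.Workbooks.Open(r"C:\data\webservice_import.xlsx")

    # Mirror the manual workaround: refresh twice, waiting for any
    # asynchronous queries to finish in between.
    for _ in range(2):
        wb.RefreshAll()
        app.CalculateUntilAsyncQueriesDone()

    wb.Save()
    app.Quit()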
I have some queries in my Access database which pull data from Excel files that appear in the database as linked tables. They had worked just fine until, suddenly and inexplicably, I started getting the error "External table is not in the expected format" when trying to access them.
I thought it might have to do with the fact that I was using a macro-enabled workbook, but that had been fine before. I do have a mail merge set up in Word which is linked to the database and uses one of the aforementioned queries.
It turns out that the issue was due to the mail merge document. Once I saved and closed the mail merge file in Word and tried accessing the queries and tables again in Access, the error was no longer appearing.
It seems that if a Word mail merge is connected to the database, this error may appear. I am not sure why a more appropriate error doesn't appear; after testing, it seems to happen regardless of whether the linked file is a macro-enabled workbook or not.
In short, as Olivier put it, the file was locked by Word. A simple issue, but not exactly clear given the error message (unless you follow Andre's logic that the expected format is a non-locked file, hahah).
I hope this helps someone else!
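For anyone who hits this in the future, a quick way to confirm the lock without digging through Access is to try opening the workbook for writing; the path below is just an example:

    # Quick lock check: on Windows, opening a file for write access fails
    # while another program (e.g. Word) holds it open. Path is made up.
    path = r"C:\data\mail_merge_source.xlsx"

    try:
        with open(path, "r+b"):
            print("File is not locked.")
    except PermissionError:
        print("File appears to be locked by another program, e.g. Word.")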
My form is hooked to a query. The "not in expected format" message happens when the query is set up as a "snapshot". When I changed to "dynaset", the form started to work as expected.
Log data from a test is uploaded to a web service, and the processed CSV is downloaded back into Excel for viewing in charts. At the moment, this is done via copy and paste for short CSV files and the Data > From Text feature for larger CSV files. Unfortunately, this takes a bunch of time for every test, and I need to make the process very simple for someone else to update the Excel spreadsheet.
The Excel spreadsheet contains 5 raw-data pages which are used to store the CSV from the server. I have no issues selecting Data > From Text, entering the website URL, and completing the import wizard to bring the data in. This process can be repeated (same as the copy and paste) for all 5 pages.
This process only allows me to put in one filename, so I am using the same URL for the data, and having PHP return the CSV of the latest (or a specifically configured) test whenever the website is accessed. I've verified that this process is working correctly.
Unfortunately, when I do 'Refresh All', it prompts for a filename unless I go to Data > Connections > Properties, and uncheck 'Prompt for file name on refresh'.
However, even when I do that, I'm getting mixed results. Sometimes only one of the pages will update. (Seems to be the last one I set up.) Sometimes none of them do. I need a solution which updates all 5 pages based on the current CSV from the server without having to set up the connections again every time. Ideally I'd like to just hide these raw data sheets so we can have an Excel file that's just the final charts.
Surely this is a common task and I am just doing something wrong, yet none of the guides I've tried from the Internet seem to work. For example, this one:
http://www.kimgentes.com/worshiptech-web-tools-page/2010/8/18/web-connecting-csv-files-as-external-data-to-excel-spreadshe.html
Seems like they only set up one connection. I can get one working to refresh, but not more than one.
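For what it's worth, one way to sidestep an unreliable 'Refresh All' is to script the refresh so each query table updates synchronously, one at a time. A sketch using pywin32, assuming the workbook layout described above (the path is made up):

    import win32com.client

    # Hypothetical workbook path; assumes each of the 5 raw-data sheets
    # holds one Data > From Text query table pointed at the server URL.
    app = win32com.client.DispatchEx("Excel.Application")
    wb = app.Workbooks.Open(r"C:\reports\test_charts.xlsx")

    for sheet in wb.Worksheets:
        for qt in sheet.QueryTables:
            qt.BackgroundQuery = False   # refresh synchronously
            qt.Refresh()                 # re-pulls the current CSV

    wb.Save()
    app.Quit()

Refreshing synchronously like this may avoid the behavior where only the most recently created connection actually updates.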
I have seen this happen and finally figured it out. There are actually three things that can cause this result, each with its own solution:
First, Excel uses the IE 11 web object when it retrieves data from the web, which means it will be "sticky" to sessions established in IE 11. Most websites these days run on cloud servers, which create sessions on whichever server the load balancing picks. This normally has no impact on users in a web browser, since they log in and can enter their credentials visually. But when a program accesses a website through a specific web browser component, it is bound to the properties of that browser and how it works. I ran into this a lot: I would generate and successfully download my CSV files from the website in Chrome, then try to use Excel to import the same files and it wouldn't work (it would say they weren't there). The solution, at least for now, is to use IE 11: log in to the website, generate the CSV files, and test that they can be downloaded. Then run the web import from Excel, and it should pick up the same sticky session and find the CSV files.
Second, password entry is a different issue, but it also has to do with the stickiness of the data. For some reason Excel will not cache your credentials for a website until you have entered them three times. Your experience may differ, but I found that I had to enter a new credential set (for a new web import of a CSV) three times before Excel cached it permanently. After that, I didn't have the problem.
Third, most complex Excel workbooks that rely on web import may also need to import local data you downloaded from a website, import data from a website into a sheet, or run more complex objects like macros. All of these need proper permissions, so you may have to adjust your Trust Center settings (part of MS Office) to allow the workbook to run this way. You can add and update trusted locations per the Microsoft documentation here:
https://support.microsoft.com/en-us/office/add-remove-or-change-a-trusted-location-7ee1cdc2-483e-4cbb-bcb3-4e7c67147fb4
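As a sketch of that last point, a trusted location can also be added through the registry. The key and value names below follow the LocationN convention I've seen documented for Office 2016 (16.0), but treat them as assumptions and verify against the Microsoft page above:

    import winreg

    # Assumptions to verify: Office 2016 = "16.0", and trusted locations
    # live under ...\Security\Trusted Locations\LocationN with these
    # value names. The folder path is made up.
    folder = r"C:\Reports\CsvImports" + "\\"
    subkey = (r"Software\Microsoft\Office\16.0\Excel"
              r"\Security\Trusted Locations\Location99")

    with winreg.CreateKey(winreg.HKEY_CURRENT_USER, subkey) as key:
        winreg.SetValueEx(key, "Path", 0, winreg.REG_SZ, folder)
        winreg.SetValueEx(key, "Description", 0, winreg.REG_SZ,
                          "CSV import folder")
        winreg.SetValueEx(key, "AllowSubFolders", 0, winreg.REG_DWORD, 1)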
I have a Cognos Express 9.5 report with a drill-through defined on a list column; it opens a second report, passing it the corresponding data item value as a parameter.
The second report opens correctly, showing data filtered as it should be by the parameter received through the drill-through.
The problem is that the second report takes forever to open and freezes my browser for several seconds... This is strange, because it only happens when the second report is opened via drill-through, not when it is opened directly and the optional parameter is filled in with a prompt...
Using the browser debugger (F12), I noticed that http://my_server/p2pd/servlet/dispatch is called hundreds of times before the second report opens, and this is what is causing the browser to freeze...
Any idea what is happening?
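To put a number on those dispatch calls, the F12 network capture can be exported as a HAR file and tallied with a short script; the filename below is just an example:

    import json
    from collections import Counter

    # Load a HAR capture exported from the browser's F12 Network tab
    # (the filename here is just an example).
    with open("drill_through.har", encoding="utf-8") as f:
        har = json.load(f)

    # Tally request URLs (query strings stripped) to see how many times
    # the dispatch servlet is hit during the drill-through.
    counts = Counter(
        entry["request"]["url"].split("?")[0]
        for entry in har["log"]["entries"]
    )

    for url, n in counts.most_common(5):
        print(n, url)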
Try using the 'Drill Through Assistant' to help with debugging your Drill-Through definition.
Here is a link to the IBM site describing how to use the Drill Through Assistant.
Migrating my report to COGNOS 10.1.1 solved the problem...
I recently (today, actually) got new permissions to some SAP tables, but I'm getting permission errors when importing them. Here's how my process looks right now:
I have an Access db that links to SAP tables via an ODBC connection.
In that same Access db, a set of spaghetti-like queries pulls & refines a modest data set (a dozen columns, few hundred rows, nothing special). I can run these queries without a problem.
An Excel file imports that data using Data->Import External Data->Import Data. I do this all the time. Except this time, I'm getting the dialog pictured below. Clicking OK doesn't seem to do anything. Clicking Cancel produces an "ODBC Connection Failure" message (or something like that).
Again, these queries can be run from Access just fine. But when I import those query results into Excel, I get this problem. I can get around it with a make-table query, but since someone else is maintaining the Access db, I'd rather not make any changes to it.
I was going through my old unanswered questions and found this one. I think I solved it by using a staging table in Access to store the data, and then importing into Excel from that staging table.
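A sketch of that staging approach, in case it's useful to anyone; the database path, query name, and table name are all made up:

    import pyodbc

    # Hypothetical paths and names: adjust the .accdb path, source query,
    # and staging table. Requires the Access ODBC driver to be installed.
    conn = pyodbc.connect(
        r"DRIVER={Microsoft Access Driver (*.mdb, *.accdb)};"
        r"DBQ=C:\data\reporting.accdb;"
    )
    cur = conn.cursor()

    # Rebuild the staging table from the refining query's results so Excel
    # imports a plain local table instead of hitting SAP over ODBC.
    try:
        cur.execute("DROP TABLE tblStaging")
    except pyodbc.Error:
        pass  # staging table didn't exist yet
    cur.execute("SELECT * INTO tblStaging FROM qryRefinedData")
    conn.commit()
    conn.close()

Excel then imports from the staging table as a plain local table, so the import itself never touches the SAP ODBC link.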