Hi, I wonder if anyone can help me; I am absolutely stuck on this one. I will start from the beginning so you understand what I am trying to accomplish.
I have a web page which displays total figures in a grid; below the grid I have a ReportViewer which displays these figures as a line graph via an SSRS report on the server. A recent requirement is for the grid to be filterable, so users can see totals for a specific customer or product. While I have this working, I would also like the SSRS report within the ReportViewer to show the same filtered information.
Now I have this semi-working. I have made the necessary changes to the report on the server and this is working correctly; next was to hook that all up with the ReportViewer. I also have this working to a degree. Basically what is happening is that if Async is false, the report does not refresh even after telling it to. If I turn Async on, it works as expected on my computer with IIS 7; however, when I upload this to our server with IIS 6, after the initial load, when a postback happens or when I try to filter the grid, I just get a blank screen. It's like the report is not being displayed.
Any help would be appreciated. I have previously used SSRS with the ReportViewer to get a different report based on option buttons, and that works, but this does not seem to work on the server. Locally it is fine, but not where it needs to be.
Give this a try: set "Enable 32-Bit Applications" to True.
Below is how you do it from the IIS 7 Manager:
1. Right-click on the Application Pool and select “Advanced Settings…”, or select the same from the Actions pane after selecting the Application Pool.
2. Change “Enable 32-bit Applications” to True (if you want the application pool to spawn in 32-bit mode).
3. Click OK.
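If you prefer the command line, the same switch can be flipped with appcmd (the application pool name below is just a placeholder):
%windir%\system32\inetsrv\appcmd.exe set apppool "MyAppPool" /enable32BitAppOnWin64:true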
OK, so I figured it out, and while it has taken me a while to find the time to post it on here, I thought I should, just in case someone else comes across this issue.
With Async="true", the report is displayed in the report viewer the first time, but after any postback occurs I am left with a blank screen. With Async="false", I get the report the first time, but the report never refreshes. After fiddling for a while it got me thinking: if it works the first time, then that is all I need, and I can just pass different parameters to the report. So I set Async="false" and looked at how to completely reset the ReportViewer control. It turns out all I needed was ReportViewerControl.Reset(). This resets the control to its defaults, almost as if we were using it for the first time. Then I apply the report details and parameters and, hey presto, the report works as expected and changes every time a filter is set.
I uploaded this to the server and it is working as expected. Quite why it wouldn't before, I am unsure; however, I do have to stress that the server is old and runs old technology (hence IIS 6), while our test server and my local copy run the latest.
Thank you all very much for your help.
Is there a way to set a maximum run time or a timeout in Power Query so that after a specified period the query terminates itself, regardless of whether it has executed successfully or not?
This is important for me because I have built various queries at my workplace that usually run fine (as in, I have been running them daily without issues for months and years), but occasionally they hang (likely because of clashing with another process on the server when it is under heavy load) and keep making read requests on the server indefinitely. One time IT told me they had logged more than 7 million reads from my machine into the database within a few hours. In some cases these hangs have caused the servers to crash, which leads to extensive downtime.
So I would like to know if there is any setting, or anything I can build into the query itself, to ensure it terminates after a certain period of time.
I'm proficient with the M Power Query language.
Thanks.
[Update 1]
Thanks Alexis for the suggestion below regarding setting a CommandTimeout value when coding the connection. It's a great pointer.
I'll try it out, do some tests, and report back on whether it conclusively fixes the issue or not.
But in the meantime, I've done a bit of digging into Microsoft's Power Query documentation and found that the CommandTimeout argument already has a default value of 10 minutes built into it, so theoretically, even if we didn't specify that argument, the query should have terminated itself. But that wasn't the case; it ran for hours.
I wonder if it is a bug in the Excel version of Power Query. I use Power Query within Power BI as well, and there I haven't seen it crash or hang yet (admittedly, I've been using the Excel version more frequently than Power BI).
However, if anyone has any other suggestions on potential fixes for this problem, that would be much appreciated. Thanks.
I found a pretty decent answer to this here. Here are the steps posted on that forum:
Have the Power BI Desktop file open and in Report View
Click on the arrow for "Edit Queries" (in the External Data section)
A dropdown will appear - then click "Data Source Settings"
Data source settings pop-up window will appear
Click on "Change Source..."
Another pop-up will appear
Click on Advanced Options (a drop-down of sorts will appear within the pop-up)
"Command timeout in minutes (optional)" will be the first option
Enter a value - I chose to enter 60 minutes but feel free to enter any value
Then apply the query changes and wait till the refresh is complete
This was written for Power BI, but it works in the Excel query editor as well. In summary,
Open Query Editor
Choose File > Options and settings > Data source settings
Select your source and click on Change Source...
Expand Advanced options and enter a Command timeout value
When I tried this with a connection to a SQL Server database, it added a CommandTimeout argument to my Source step. You can just use this code instead and skip all the clicking:
= Sql.Database("server", "DB", [CommandTimeout=#duration(0, 0, 15, 0)])
Doing it via the Data source settings may be preferable if you aren't connecting to SQL Server, since the timeout parameter differs between connectors. (In M, #duration takes days, hours, minutes, and seconds, so #duration(0, 0, 15, 0) is 15 minutes.) E.g.
= Web.Page(Web.Contents("URL", [Timeout=#duration(0,0,15,0)]))
or
= OData.Feed("http://some.url/service.svc/", null, [Timeout=#duration(0, 0, 15, 0)])
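If you work in the Advanced Editor, the timeout simply lives on the Source step of the usual let expression. A minimal sketch, with placeholder server and database names:
let
    // Placeholder connection; the CommandTimeout caps the query at 15 minutes
    Source = Sql.Database("server", "DB", [CommandTimeout = #duration(0, 0, 15, 0)]),
    // ...your downstream transformation steps go here...
    Result = Source
in
    Result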
I am using the RandomProducts web part, and for some reason product options have no effect on the price, even though I have added the price adjustments for each variant. Is this normal? I assumed this would work out of the box since it's a default option, but maybe I have to write my own code... it just seems odd. Does something need to be enabled for product options to update the price of a product?
Are you sure you are not having a JS issue on your page? When using Product Options, you define the option and any adjustment to the price, then choose the products to apply the option to. When a user selects a specific option, a postback occurs that updates the displayed price. If you are using the CartItemSelector web part, it contains all of the postback JS needed to update the price.
This works on a basic demo site so I suspect there is a JS issue that is causing the postback to not complete properly.
Another possible cause could be latency when posting the value back. If it is the first time you are running the site, it's possible the postback is taking longer than expected (due to recompilation) and it is appearing as if it's failing.
Other things to check:
On the Product Options General Tab
- Is "Enabled" checked?
- Is "Display price adjustments" checked?
I'd recommend checking the following things:
JavaScript errors via the developer tools of your browser
Caching settings/expiration for the web part or page
Enable SQL debug (Settings > System > Debug > Display on live site) and check which queries are executed to see where the price comes from
Hi, I'm having trouble with a SharePoint list. I've got the list connected to a Visio diagram, and when I'm on-premises it updates fast: I change the list, click Refresh in the Visio app, and it's updated. But in Visio Web Access from SharePoint it takes too long. I can keep pressing Refresh, but it seems to run on a timer, picking up my changes only every 2 or 3 minutes. The problem is that I need it to be instant; sometimes it is instant, but only because I happen to change the list right when it's about to refresh.
Is there any configuration on the server to make it update that fast?
English is not my native language, so I'm sorry for any mistakes.
If you want your changes to be shown instantly, you can set Minimum Cache Age to zero in SharePoint:
http://technet.microsoft.com/en-us/library/ee524061.aspx
Note that the default value (a few minutes) provides better rendering performance in multi-user scenarios.
Also note that this setting does not seem to be available in SharePoint Online / Office 365 (please correct me if I am wrong).
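For on-premises farms, the same limits can also be scripted from the SharePoint Management Shell via Set-SPVisioPerformance. A sketch, assuming a Visio Graphics Service application named "Visio Graphics Service"; the limit values here are illustrative (cache ages are in minutes), and if I recall correctly the cmdlet expects all of the limits to be supplied together:
Set-SPVisioPerformance -VisioServiceApplication "Visio Graphics Service" -MinDiagramCacheAge 0 -MaxDiagramCacheAge 60 -MaxDiagramSize 25 -MaxRecalcDuration 120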
An XPage (listPostits.xsp) has a "View" container control, where one of the columns is set to "show values in this column as links".
Now, here comes the "strange behaviour".
When I work with this application on my own (developer) PC (Win XP, Chrome or IE), Domino generates a link which can't really be processed:
/servername/db/postit/postit.nsf/listPostits.xsp/onePostit.xsp?documentId=many_numbers&action=editDocument
Namely, the listPostits.xsp portion shouldn't be there! That portion is the name of the XPage containing the View control.
When I work with the application from another PC (Mac, Firefox), I get the correct link (the same as above but without the XPage name in between):
/servername/db/postit/postit.nsf/onePostit.xsp?documentId=many_numbers&action=editDocument
Update: let us leave aside for the moment the differences in the generated links between the two machines. The first question is: why is the extra portion inserted into the automatically generated link at all?
After playing around, I think I might have found the reason for this strange behaviour: namely, the "Substitution" rules on the server side. One of them substitutes "*/postit/all" with "/db/postit/postit.nsf/listPostits.xsp".
If I switch it off, the links are generated properly. Still, it's pretty strange to me that these settings influence the way Domino generates links. I thought Domino applied them on the fly and that those settings had nothing to do with how links are generated inside the application.
So help is now needed regarding the Web Site Rule topic, but for that, I guess, I have to create another topic. In case somebody has some good info on this, please share it with me. I'm a bit confused at the moment :)
Final update: I spent some more hours testing, and the results confirmed the initial idea.
If I open the page with the standard URL, i.e. http://servername/db/postit/postit.nsf/listPostits.xsp, then everything is fine and the links are generated properly. However, when I open the same page with the short URL http://servername/postit/all, the server adds the substituted URL (db/postit/postit.nsf/listPostits.xsp) to every single link it generates automatically as the link to open/edit the underlying document.
Is it a bug or a feature? I don't know.
As a workaround (because I want to keep simple URLs for the application), I have to generate the links manually.
Upgrading from 8.2 to 8.3 and testing out the new No Data Content functionality. The report looks in order if results are returned; the No Data message does not appear, as expected. However, if we test the report by passing in parameters expecting no results, we are returned a blank page (PDF, HTML, and Excel output). Not even the header or footer appears on the page, and the No Data Content message does not appear either.
We have very complex reports using Oracle SQL, and in most cases the header content is linked to a SQL statement to render output from the database as well as list the parameters passed in. The issue seems to be related to embedded data objects; i.e., we have a list object embedded within a table object. I've tried stripping out the extra layers with no success thus far.
In 8.2 we used style variables, e.g. RowNumber()=0 or RowNumber() is null, to conditionally hide data objects in the body of the report. We've never used any conditions to hide or display the header or the footer, and in 8.3 this now seems to be an issue.
This seemed like such a useful enhancement in 8.3 but we haven't gotten it working yet. Any thoughts or suggestions to try?
Thanks for reading this. I appreciate any advice.
Joe
We ran into this same problem when upgrading reports from 8.2 to 8.4. We reported it to Cognos as a bug -- not sure if they've assigned a bug tracker ID to it, but we got the impression it wasn't going to be fixed soon. (Obviously, if it exists in 8.3 and has been carried forward to the next version, it's not a high priority.)
I'm sorry I don't have an answer at the moment on how to fix it, but I was planning to look into workarounds next week. I'll edit this post with any ideas I come up with.
UPDATE:
Not sure if this is an available feature in 8.3, but in 8.4 there is a new "No Data Contents" property for data containers (lists, blocks, etc.). Setting this value to Yes creates two tabs at the top of the page: one for the page to be displayed if data is returned, and another for instances when no records are found. You can customize a message to be displayed on that second page. Pretty cool, actually, but buried in the documentation.
Hope that helps. If you still have problems, check out the Index topic "no data > specify what appears for a data container."
Yeah, it appears that a blank PDF is returned... but in fact the Cognos Viewer bugs out at the second prompt page if there is no data. Headers, footers, and items which didn't need data to render are not showing up either.
This existed in 8.2, and we were always able to do some sort of workaround to get it to at least show. It seems much more prevalent in 8.3 now.
I'd like a solution to this as well! halp! >_<
Edit: it seems a slight workaround is to create a new report in 8.3 and copy each component over, starting with queries... then variables... then objects on the page... followed by page sets and master-detail relationships, in that order for simplicity. Essentially, recreating the report from scratch in 8.3 seems to fix the problem.
This works for about 90% of our reports.