We have successfully implemented BIRT Viewer 3.7 in our Spring-based web application. We are running into an issue where our DB query fetches a large dataset (around 8,000 rows) with images (BLOB data), which takes around 12 minutes to appear on the UI. Can somebody help us implement custom pagination and export in the BIRT viewer so the viewer retrieves around 50 rows at a time? However, if the user wants to export the data to PDF or Excel, all of the data should be exported.
Thanks
Anant
Add an Integer parameter (required, with a default value of 50) to your report.
Append "LIMIT ?" to your SQL query.
Bind the new parameter to your SQL query, so the ? gets replaced by the value of the new parameter.
You can set the parameter in the URL like this: http://localhost:8080/birtviewer/frameset?__report=reportname.rptdesign&parametername=50
Or simply set it in the displayed parameters in the Web Viewer.
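For example, assuming the new parameter is called MaxRows (any name works, it just has to match the name used in the URL above) and your database supports LIMIT (e.g. MySQL or PostgreSQL), the data set query might look roughly like this sketch; the table and column names are just placeholders:
-- Hedged sketch: the single ? marker is bound to the MaxRows report parameter (default 50),
-- so the viewer only fetches the first 50 rows. Databases without LIMIT use TOP or FETCH FIRST instead.
SELECT id, title, image_blob
FROM my_table
ORDER BY id
LIMIT ?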
You can do it by putting this into the web.xml file:
<!-- Preview report rows limit. An empty value means no limit. -->
<context-param>
    <param-name>BIRT_VIEWER_MAX_ROWS</param-name>
    <param-value>50</param-value>
</context-param>
OR add:
__maxrows=50
to the URL, but it is probably more manageable to just edit the web.xml file.
I want to show yesterday's page view statistics in Kentico 10.
I have used the Page Views web part, but it doesn't include a 'yesterday' field.
One approach would be to clone the built-in web part (Page Views), which creates a copy of the web part in the database and copies the code files to the filesystem. Then you would edit the StatisticsType property and add a new value for 'yesterday'.
'Yesterday' would be added as a value in the data source.
Then, in the C# code-behind of the .cs class, you would edit ReloadData to handle the yesterday query the same way the other queries are handled.
You would simply set the correct date range and interval type to represent 'yesterday'.
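As a rough illustration only (the helper below is not actual Kentico code; how ReloadData consumes the dates has to match whatever the cloned web part already does), the date arithmetic for the new case could look like this:
using System;

// Hedged sketch: works out the date range for a "yesterday" statistics query.
// The cloned web part's ReloadData would call something like this and feed the
// values into the same query the other statistics types already use.
public static class YesterdayRange
{
    public static void Get(out DateTime fromDate, out DateTime toDate)
    {
        fromDate = DateTime.Today.AddDays(-1); // 00:00 yesterday
        toDate = DateTime.Today;               // 00:00 today (exclusive upper bound)
    }
}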
I don't recommend editing the built-in web part itself, because that is not a 100% upgrade-safe change.
I have a document library in SharePoint Online and I keep dumping records into it. As SharePoint has a 5,000-item list view limitation, once the library reaches that limit I can still upload documents, but they don't show up anywhere.
Eventually I end up creating a new view with a filter applied, and then the documents start showing up under the new view.
My question is: is there a way to automatically create a view when the library reaches the 5,000 limit and put the newly uploaded documents into that new view?
Yes, you can do this via MS Flow/workflows and server-side apps/scripts of course, but it's not a good approach to the issue IMO.
Have you indexed the columns? I just tested this on a document library with 20k documents and I'm able to filter. There are limitations you should look into (complex filtering); that's where compound indexes come in.
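If you'd rather set the index from code than through the library's Indexed Columns settings page, a rough CSOM sketch would be the following (the site URL, list title and column name are placeholders, and authentication is omitted for brevity):
using Microsoft.SharePoint.Client;

// Hedged sketch: flag an existing column as indexed so filtered views keep
// working past the 5,000-item list view threshold.
class IndexColumn
{
    static void Main()
    {
        using (var ctx = new ClientContext("https://contoso.sharepoint.com/sites/docs")) // placeholder URL
        {
            // authentication omitted for brevity
            var list = ctx.Web.Lists.GetByTitle("Documents");                 // placeholder list title
            var field = list.Fields.GetByInternalNameOrTitle("DocumentType"); // placeholder column
            field.Indexed = true;
            field.Update();
            ctx.ExecuteQuery();
        }
    }
}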
If you still have issues then I recommend you give the Highlighted Content web part a try. You can create custom search queries, and it looks similar to a document library if you set the settings correctly. The only drawback of this approach is that there is a delay before search picks up changes, from 15 minutes to 6 hours depending on how much data you have.
I want to be able to get the data and sort it in memory. I am able to get JSON data using dojox.data.JsonRestStore. My question is: how do I store it in memory and do in-memory sorting when I click on the Dojo DataGrid headers? From what I have found so far, it is not possible to sort the DataGrid in memory/client-side, as it will request the sorted data from my REST service. As it is a custom REST service, I am not able to sort the data on the server side (or is it possible?).
Thanks in advance.
Arun
Outside of XPages, you should be able to set an attribute of the grid to do this -- clientSort: true
However, this doesn't seem to take effect within XPages. I tried the following, with no success:
Adding a clientSort attribute with a value of true to the grid control (via the Dojo tab). The attribute showed up in the right place in the page source, but had no effect. (Programmatically checking the property returned a value of undefined.)
Setting [grid].clientSort = true on the onClientLoad event of the page. When checked programmatically, the property would show that it is now set to true, but it had no effect.
I even tried adding it to a grid created programmatically (without the Dojo Data Grid control) and it had no effect.
It appears that either XPages is wiping out the attribute or that it just doesn't work within XPages with a remote data source. (My first two attempts used a REST service. My third attempt used a remote XML data source.)
I still think it's worth attempting to see whether it works with a local data source (like a read-write item store), but I have not had a chance to try that yet.
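For reference, outside XPages a client-sorted grid over an in-memory store looks roughly like this sketch (Dojo 1.x syntax; the URL, field names and grid structure are placeholders):
dojo.require("dojo.data.ItemFileWriteStore");
dojo.require("dojox.grid.DataGrid");

dojo.addOnLoad(function () {
    // Hedged sketch: fetch the JSON once, hold it in a writable in-memory store,
    // and let the grid sort client-side instead of re-querying the REST service.
    dojo.xhrGet({
        url: "/myapp/rest/rows",   // placeholder REST endpoint
        handleAs: "json",
        load: function (rows) {
            var store = new dojo.data.ItemFileWriteStore({
                data: { identifier: "id", items: rows }   // assumes each row has an "id"
            });
            var grid = new dojox.grid.DataGrid({
                store: store,
                clientSort: true,   // sort in the browser
                structure: [
                    { name: "Name", field: "name", width: "200px" },
                    { name: "Amount", field: "amount", width: "100px" }
                ]
            }, "gridNode");         // id of a placeholder div on the page
            grid.startup();
        }
    });
});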
I receive this error quite often whenever I try to modify the default filter for my SQL-based report. The reason for this error is that the table alias in the joins gets set to a non-unique value when the filter is modified.
I spent a lot of time looking for a solution but failed, and then found a workaround myself.
Open the OrganizationName_MSCRM database and execute the following query:
select DefaultFilter from dbo.ReportBase where Name = 'My SQL based report name'
Copy the value of the default filter and open it as an XML file. You will find alias="a_3513cef8db754312b0db555339f05c9a" in the XML. Change the GUID in the alias to some other GUID and update the DefaultFilter value in the ReportBase table with the edited XML.
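The write-back step would be something along these lines (a hedged sketch; as the next answer notes, updating the MSCRM database directly is unsupported, so take a backup first):
-- Hedged sketch: write the edited filter XML (with the new alias GUID) back.
UPDATE dbo.ReportBase
SET DefaultFilter = '<edited default filter XML with the new alias GUID>'
WHERE Name = 'My SQL based report name'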
Run the report and it will work.
If you want to avoid running updates directly against your CRM database (which, as @Alex said, is unsupported), you may be able to modify the default filter in the RDL itself.
First, you have to download the RDL and open it as an XML or text file. You can do this in any text editor (and in VS.NET if the RDL isn't in a BIDS Report Server Project).
Near the bottom, you will find a section similar to the following:
<CustomProperties><CustomProperty><Name>Custom</Name><Value><MSCRM xmlns="mscrm"><ReportFilter><ReportEntity paramname="P1"><fetch version="1.0" output-format="xml-platform" mapping="logical" distinct="false">
The Value is the default filter for the RDL. The Value is XML encoded twice, but if you run it through a decoder, like the one at http://coderstoolbox.net/string/#!encoding=xml, you'll get something more readable:
1st Decode
<MSCRM xmlns="mscrm"><ReportFilter><ReportEntity paramname="P1"><fetch version="1.0" output-format="xml-platform" mapping="logical" distinct="false">
2nd Decode
<ReportFilter><ReportEntity paramname="P1"><fetch version="1.0" output-format="xml-platform" mapping="logical" distinct="false">
Include everything through the closing </Value>, and you'll get the FetchXML that defines the default filter.
Change the guid, or remove the <LinkEntity> section that includes the alias. As long as you have valid FetchXML, you should be able to upload the file as a new report.
Re-encode the XML twice and put it back in the RDL. First encode the fetch XML, then wrap it in the <MSCRM xmlns="mscrm"></MSCRM> element, and encode that string. Be sure that your encoder does not replace the double-quote character (") with &quot;. The one at coderstoolbox.net will, but CRM doesn't do this when encoding the XML.
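If you'd rather script that step than use an online encoder, here is a minimal sketch of the encode-wrap-encode sequence (only &, < and > are escaped, so " is left untouched as described above; the filter string is a placeholder):
using System;

// Hedged sketch: double-encode the edited filter for the RDL's <Value> element.
class EncodeFilter
{
    static string Encode(string s) =>
        s.Replace("&", "&amp;").Replace("<", "&lt;").Replace(">", "&gt;");

    static void Main()
    {
        string filterXml = "<ReportFilter>...</ReportFilter>";                       // your edited filter/FetchXML
        string wrapped = "<MSCRM xmlns=\"mscrm\">" + Encode(filterXml) + "</MSCRM>"; // 1st encode, then wrap
        Console.WriteLine(Encode(wrapped));                                          // 2nd encode; paste into <Value>
    }
}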
You must upload your RDL as a new report. In my testing, CRM will not update the defaultfilter column in the database with the filter in the RDL when updating an existing report.
I know this is a lot of hoop-jumping, and I can't say whether this type of customization is officially supported. Personally, I feel safer uploading an RDL through the CRM web UI than I do with an open SSMS terminal running update statements against the database.
I followed a similar solution to Scott Stone's, but only exported the report, searched for the 'offending' join, modified the GUID and, as Scott also mentioned, deleted and then re-uploaded the report (as opposed to simply re-uploading the modified RDL into the existing report).
I have a Content Query Web Part showing 3 events for today's date on our intranet homepage. However, when a user deletes an event from a recurring series, SharePoint changes the title of that event to "Deleted: [original title]". My Content Query Web Part still shows these and I don't see a way to filter them out.
For my list views I get around it with a calculated field that uses the following formula:
=IF(ISNUMBER(FIND("Deleted:",Title)),"Yes","No")
However, that field isn't available to me in the CQWP filter drop-down.
How does one prevent these deleted occurrences from showing in a CQWP?
UPDATE: This is a publishing page BTW.
While you can perform the filter via XSLT, it may be preferable to do so in the CAML query itself. This article from Microsoft, How to: Customize the Content Query Web Part by using Custom Properties, shows how the CAML Where clause is constructed (and may be modified). Good luck!
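For instance, if you add the calculated 'Deleted' flag column from your question to the events list (and expose it to the CQWP, e.g. via CommonViewFields), a hedged sketch of the extra Where clause might look like the following; the field name IsDeletedOccurrence is just whatever you called the calculated column, and whether filtering on a calculated text field works in your environment should be verified:
<Where>
  <Neq>
    <FieldRef Name="IsDeletedOccurrence" />
    <Value Type="Text">Yes</Value>
  </Neq>
</Where>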
Have you tried the Data Form Web Part (DFWP) before? It is the most basic of the databound web parts, but strangely also the most customizable.
To try it out, open SharePoint Designer and create a dummy .aspx page to work on / play around with. Then open the Data Sources pane, find the list you want to use, click on it and select "View Data". Drag the appropriate fields onto the .aspx page. You now have a (very) basic grid displaying your data.
The cool thing, though, is that all layout is XSL-driven, so it is completely customizable. You can even make the XSL reusable by switching to source view and cutting and pasting the XSL into a separate XSL file. To use this file instead of the inline XSL, after moving the inline XSL out, change the DFWP's xsl property (that contained the inline XSL) from
<xsl><!-- .... this is where the moved inline xsl used to be... --></xsl>
to
<xsllink>/server relative url to xsl file</xsllink>
The only remaining problem is that the DFWP's data source is now bound to a single list. To do cross-site roll-ups (if you have, say, multiple event lists you want to include), you need to change the DFWP's SPDataSource select query and DataSourceMode. How to do that is explained here.
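In short, a hedged sketch of the cross-list setup looks something like the following (list template 106 is the standard Events list; the query and view fields are placeholders to be filled in):
<SharePoint:SPDataSource runat="server"
    DataSourceMode="CrossList"
    UseInternalName="true"
    SelectCommand="&lt;Webs Scope='Recursive'/&gt;&lt;Lists ServerTemplate='106'/&gt;&lt;Query&gt;&lt;Where&gt;...&lt;/Where&gt;&lt;/Query&gt;&lt;ViewFields&gt;&lt;FieldRef Name='Title'/&gt;&lt;/ViewFields&gt;" />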
Now to address your actual question (after my blatant plug of the under-appreciated DFWP :-D): the DFWP's SPDataSource basically uses either an SPQuery or an SPSiteDataQuery (depending on the DataSourceMode) to retrieve the data, and you can do a lot more filtering-wise, either in the CAML query itself or by filtering out the unneeded rows using something like the following:
<xsl:if test="not(contains(@Title, 'Deleted:'))">
<xsl:apply-templates />
</xsl:if>