There is an error on one website in my IIS7, so I activated Failed Request Tracing and got some XML log files. How do I analyze this XML? Are there any tools out there?
I found the FRT Logs article, but is there a convenient log suite available to analyze all of the FRT logs in a directory?
I know this is old, but in case anyone comes across this like I did: IE loads the XML up with all the pretty formatting, tabs, and collapsible sections, as shown in https://www.pluralsight.com/blog/it-ops/iis-7-troubleshooting
This doesn't exactly answer the question, since I wouldn't call IE a tool, but it appears to be the intended way to view these files, even if that isn't readily apparent (at least in my opinion, given that many of the logs Microsoft outputs nowadays come with viewers to parse the data).
I need a better live log viewer that supports NLog, log4net, and Enterprise Library. The viewer must run constantly in live mode for our operations guys. So far, the ones we've tried always run out of memory, and we always need to restart them. I need a viewer that can either remove unwanted stale messages or roll them over to a log file automatically. I know this is a tall order; so far, no luck. We will even pay for one.
I use the ReflectInsight Viewer. http://www.reflectsoftware.com/
We use Enterprise Library and log4net logging at work, which produce text log files. Previously I used a number of tail-style programs to show me new messages as they came into the files, but they didn't provide much in the way of filtering or searching capabilities.
By adding a reference to the ReflectInsight Logging Extensions https://insightextensions.codeplex.com and updating my existing logging configuration, I was able to send my logged messages to the ReflectInsight viewer and view them in real-time. I was then able to search, filter, bookmark, and view my messages.
I could then save the results to another file...containing only what I needed and filtering out the noise from other applications.
I hope this helps you as it did me.
Have you tried LogGrok for viewing NLog logs?
It supports filtering the log by any field for MSI, VB, and NLog logs; configurable highlighters (some are already pre-configured); and multiple search results (with regex support).
The utility supports large log files and has a customizable docking-window UI.
LogGrok is open source, so you can add the features you need yourself (or ask the project team to add them for you).
Project documentation can be found in this repository.
I use LogExpert (http://logexpert.codeplex.com/). It is free, and it has some very powerful features. I have only run into one or two minor bugs. Features include filtering, search, color-coded phrases, regular expressions, and filter-to-tab (creating new log files from distilled results).
All in all really well done, and the developer needs help to keep it going!
I am writing a Java web app in which I would like to allow users to execute basic PDF reports. Normally I would use JasperReports for this, but this time I would like users to be able to edit their own reports in iReport and upload them, which should be straightforward enough.
That got me thinking: Jasper effectively lets you write code in the reports, which gets executed when the report is generated. Is it possible to write a report that has full access to the Java API, and therefore to my web app? I don't want users being able to kill Tomcat or, worse still, use the DAO API I have built to read other users' data.
Does anyone know if this is actually possible, and if so, can you sandbox it somehow? Maybe I could filter the report's XML before it's compiled?
Also, does anyone know if the same applies to other open source reporting tools such as BIRT?
I'm releasing a web service [1] that allows developers to generate PDFs using templates drawn in iReport, so I had to solve the same problem. My first try was to use the Java security API, but it was too complex, with a lot of required permissions.
Then, while I was researching how Heroku isolates each web app, I discovered Linux Containers (LXC) [2], so I decided to isolate each developer sandbox in its own LXC container.
This doesn't prevent users from shutting down the sandboxed server, but if they do, they will power off only their own sandbox; other users' sandboxes won't be affected.
[1] http://reports.simpleservic.es/landing
[2] http://en.wikipedia.org/wiki/LXC
Have a look at the java-sandbox [1], which we use in our BI solution ReportServer [2]. I am currently preparing a blog post that will explain how to run JasperReports in a sandboxed environment; in the meantime, a rough sketch of the idea follows the links below.
As for BIRT, the very same applies there too. There, users cannot directly write Java code, but they can use Rhino, which in the end has the same effect.
[1] http://blog.datenwerke.net/p/the-java-sandbox.html
[2] http://reportserver.datenwerke.net
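Here is that rough sketch, using only the plain JDK security API (this is not the java-sandbox API itself, and the class and method names are mine): fill the report inside an AccessControlContext that grants no permissions, so code embedded in an uploaded report cannot open files, open sockets, or call System.exit.

```java
import java.security.*;
import java.sql.Connection;
import java.util.Map;
import net.sf.jasperreports.engine.*;

public class SandboxedFill {
    public static JasperPrint fill(JasperReport report,
                                   Map<String, Object> params,
                                   Connection dataSource) throws Exception {
        // Permission checks only fire if a SecurityManager is installed.
        if (System.getSecurityManager() == null) {
            System.setSecurityManager(new SecurityManager());
        }
        // A context whose single protection domain holds an empty permission set.
        Permissions none = new Permissions();
        AccessControlContext restricted = new AccessControlContext(new ProtectionDomain[] {
            new ProtectionDomain(
                new CodeSource(null, (java.security.cert.Certificate[]) null), none)
        });
        try {
            // Everything below this frame is limited to the intersection of its
            // own permissions and the (empty) context passed in here.
            return AccessController.doPrivileged(
                (PrivilegedExceptionAction<JasperPrint>) () ->
                    JasperFillManager.fillReport(report, params, dataSource),
                restricted);
        } catch (PrivilegedActionException e) {
            throw (Exception) e.getCause();
        }
    }
}
```

In practice, JasperReports itself needs a number of permissions (file access for templates, reflection, and so on), so you end up whitelisting quite a lot; that is exactly the complexity the LXC answer above ran into, and part of why a dedicated library like java-sandbox is attractive.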
I've got all the hCard data on my site, but it's not being parsed. The formatting is right (an exact copy/paste from the example given), but it's not reading.
http://southwestrestaurants.com/restaurant/flying-star-cafe-3/
Any ideas?
Your hCard is correct, and the JS is, in fact, parsing it. The foursquare debug tool doesn't show hCard data because your server is returning a mobile-optimized version of the page, which doesn't contain the same microformat markup that your desktop version returns.
As far as we know, the foursquare debug tool is making a "naked" request (no special headers / user-agent). Do you know why your server would be returning the mobile version of the site in this case?
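If you want to see what the debug tool sees, you can reproduce a naked request yourself and compare the response against what a normal browser gets. A minimal sketch (raw socket, so no default User-Agent sneaks in):

```java
import java.io.*;
import java.net.Socket;

// Send a bare HTTP request with no User-Agent header and print the raw
// response, to check whether the server serves the mobile page to "naked" clients.
public class NakedRequest {
    public static void main(String[] args) throws IOException {
        try (Socket s = new Socket("southwestrestaurants.com", 80);
             PrintWriter out = new PrintWriter(s.getOutputStream());
             BufferedReader in = new BufferedReader(new InputStreamReader(s.getInputStream()))) {
            out.print("GET /restaurant/flying-star-cafe-3/ HTTP/1.1\r\n");
            out.print("Host: southwestrestaurants.com\r\n");
            out.print("Connection: close\r\n\r\n");
            out.flush();
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}
```

If the markup this prints differs from what the browser gets, the server's device detection is the culprit.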
I am a little bit confused about whether to cache the XML response returned from a search engine. Initially I thought of caching the response, since I have to use the same response in several web parts and apply a different XSLT to it in each web part.
But I got stuck on this question: if I only add a few web parts to a page, and the web part with the caching logic isn't among them, it might cause serious problems.
Would it be a good idea to submit the query from every web part independently of the others and just cache the XSLT file?
Could anyone suggest a good way to handle this?
If it is the exact same XML response that is used by all of the webparts, then I think it would make sense to cache the data. You could create a class that is shared by all of your webparts that handles the caching (and getting the XML from the search engine if it does not exist in the cache).
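As a rough illustration of the pattern (sketched in Java here; the class name, the five-minute TTL, and the fetch hook are all assumptions, and the same shape translates directly to a .NET class):

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// Shared, thread-safe cache so that every web part reuses one XML response
// per query instead of hitting the search engine independently.
public final class SearchXmlCache {
    private static final long TTL_MILLIS = 5 * 60 * 1000; // assumed freshness window

    private static final class Entry {
        final String xml;
        final long fetchedAt;
        Entry(String xml, long fetchedAt) { this.xml = xml; this.fetchedAt = fetchedAt; }
    }

    private final ConcurrentHashMap<String, Entry> cache = new ConcurrentHashMap<>();
    private final Function<String, String> fetch; // runs the real search query

    public SearchXmlCache(Function<String, String> fetch) {
        this.fetch = fetch;
    }

    // Each web part calls this; compute() also serializes concurrent fetches
    // for the same query, so only the first caller pays for the search.
    public String get(String query) {
        long now = System.currentTimeMillis();
        return cache.compute(query, (q, old) ->
            (old != null && now - old.fetchedAt <= TTL_MILLIS)
                ? old
                : new Entry(fetch.apply(q), now)).xml;
    }
}
```

Every web part asks the cache for the query's XML and applies its own XSLT to the result, so the search engine is hit at most once per query per TTL window, no matter which web parts happen to be on the page.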
I'm going to need to push and pull files from a SharePoint site that is not hosted by my company (it is external). I'm only going to get a few days (if that) to get this working so I don't have much time to experiment.
To add to my requirements/headaches, I'm going to have to implement this with VBScript. .NET would be my preference, but for reasons beyond my control, I have to use VBScript. I don't have direct access to the web server that runs my VBScript, so I won't be able to implement this in .NET and call that object from VBScript.
I'm looking for anything that would help me accomplish this goal quickly and effectively. I found this post and am wondering if the PUT/GET method used there would work for me:
http://weblogs.asp.net/bsimser/archive/2004/06/06/149673.aspx (I got this link from: Sharepoint API - How to Upload files to Sharepoint Doc Library from ASP.NET Web Application)
To top all of this off, I've never done any programming or administration of a SharePoint site; my knowledge of SharePoint is that of a user. I'm aware from the few Google searches I did that there is an API. However, my reading makes me believe that my code would need to run on, or in proximity to, the SharePoint server, and I don't believe I have the proximity I need to use the API.
Sincere thank yous!
Regards,
Frank
Progress Update: I'm still researching this. Tom pointed out that the example I posted is probably from an old SharePoint version. His recommendation to use .NET to develop a prototype against the web services is good, but I'm hoping for more detailed answers.
I'm now wondering if I can accomplish what I need using HTTP PUTs and GETs. At my company, for one specific project, we do use HTTP PUTs and GETs to do something like this: files are stored on an HTTP server, and this is how we post and retrieve them.
Would this work with SharePoint, or does SharePoint require special handling? Basically, do I have to use web services?
Progress Update 2: This link is helpful... Upload a file to SharePoint through the built-in web services
But I am still looking for more information on this topic... Thanks all...
You'll need to use the SharePoint Lists web service for metadata and GET/PUT for uploads. That link looks to be for SharePoint 2001, so hopefully you can use the newer/simpler version.
I recommend building something in .NET first to get the web service calls worked out - some of the parameters can be quite tricky to debug, and I wouldn't want to be doing that in a remote VBScript page.
Assuming there is no metadata required and the SharePoint library is being used like a file server, you can do most of what you want with PUT/GET, but you will probably need a call to GetListItems to find the URLs to download.
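For the upload half, the PUT really can be that plain when the library behaves like a file share. A minimal sketch (in Java, but any HTTP client, including VBScript's MSXML2.XMLHTTP, does the same thing; the site URL is made up and authentication is not shown):

```java
import java.io.IOException;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Paths;

// Upload a file by PUTting its bytes to the target URL inside the document library.
public class SharePointPut {
    public static void main(String[] args) throws IOException {
        byte[] body = Files.readAllBytes(Paths.get("report.pdf"));
        URL url = new URL("http://server/site/Shared%20Documents/report.pdf");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("PUT");
        conn.setDoOutput(true);
        conn.setFixedLengthStreamingMode(body.length);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(body);
        }
        System.out.println("HTTP " + conn.getResponseCode()); // expect 200/201 on success
    }
}
```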
There's an example on my blog of a lower-level call to that web service - it's JavaScript, but probably close enough.
http://tqcblog.com/2007/09/24/sharepoint-blog-content-rating-with-javascript-and-web-services
What setting up the .NET version gets you is a very quick way to connect to the server (just add a web service reference in Visual Studio) so you can get the query and queryOptions strings working to retrieve the items you want. Once that works, you just have to put it all together as a string, including the SOAP envelope, for use without all the nice tools.
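To make that concrete, here is roughly what the hand-rolled call looks like once the nice tools are taken away (sketched in Java; the site URL and list name are placeholders, and authentication is omitted):

```java
import java.io.*;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

// POST a minimal GetListItems SOAP envelope to Lists.asmx and dump the response.
public class GetListItemsDemo {
    public static void main(String[] args) throws IOException {
        String envelope =
            "<?xml version=\"1.0\" encoding=\"utf-8\"?>"
            + "<soap:Envelope xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\">"
            + "<soap:Body>"
            + "<GetListItems xmlns=\"http://schemas.microsoft.com/sharepoint/soap/\">"
            + "<listName>Shared Documents</listName>"
            + "</GetListItems>"
            + "</soap:Body></soap:Envelope>";

        URL url = new URL("http://server/site/_vti_bin/Lists.asmx");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "text/xml; charset=utf-8");
        conn.setRequestProperty("SOAPAction",
            "\"http://schemas.microsoft.com/sharepoint/soap/GetListItems\"");
        try (OutputStream out = conn.getOutputStream()) {
            out.write(envelope.getBytes(StandardCharsets.UTF_8));
        }
        // The response contains one z:row element per item; a field such as
        // ows_EncodedAbsUrl (ask for it in ViewFields if it isn't returned by
        // default) holds the absolute URL you can then GET to download the file.
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}
```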
I'm a little unclear on the context of the implementation and the prerequisite of having to use VBScript. Are the files being moved from one server to another, or from a user's desktop to this SP server? Or are they being accessed via software like Excel?
The first thing that sprang to my mind (this may sound crazy) was using an Office application to make the connection. Your script would call up Excel (just as an example) and pass it the VBA needed to initiate an Open File, providing the full path to the file that needs to be retrieved, and then have it do a Save As to the location that needs the file. Do the same thing in reverse for putting files on the SharePoint server.
The tricky part, obviously, is getting the script to interface with the Office app. I know this can be done with the Windows version of PHP, but I don't want to get into anything too specific without knowing your situation; the general shape of it is sketched below.
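A generic sketch of that automation (in Java via the JACOB COM bridge, since plain Java can't drive Office; in VBScript the same calls hang off CreateObject("Excel.Application"). The URL and paths are made up):

```java
import com.jacob.activeX.ActiveXComponent;
import com.jacob.com.Dispatch;
import com.jacob.com.Variant;

// Drive Excel over COM: open the workbook straight from its SharePoint URL,
// then SaveAs to a local path (swap the two paths to upload instead).
public class ExcelFetch {
    public static void main(String[] args) {
        ActiveXComponent xl = new ActiveXComponent("Excel.Application");
        try {
            xl.setProperty("Visible", new Variant(false));
            Dispatch workbooks = xl.getProperty("Workbooks").toDispatch();
            Dispatch wb = Dispatch.call(workbooks, "Open",
                "http://server/site/Shared%20Documents/report.xlsx").toDispatch();
            Dispatch.call(wb, "SaveAs", "C:\\temp\\report.xlsx");
            Dispatch.call(wb, "Close", new Variant(false));
        } finally {
            xl.invoke("Quit");
        }
    }
}
```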
I seriously wonder whether you are going to be able to use VBScript to call the SharePoint web services. I haven't looked at the SharePoint web services for a while, so I don't remember exactly how they are defined, but I thought they were SOAP calls, which makes things trickier.
I tried to use Excel to call some web services with the MSSOAP.SoapClient component, and it seemed unable to handle any WSDL types beyond very simple strings; anything with nested data would not work. Instead, you would need to create a COM object to process the conversion, which is a major hassle. If you are able to use the XMLHTTP component, then it might be possible with VBScript, but I'm not sure if it will work with the SharePoint web services.
I'm not sure what you mean by "I don't have direct access to my VBScript web server." Is your web server in VBScript (ASP)? Or did you mean the SharePoint server?
You might consider C# Script (cs-script) as a scripted solution that uses .NET. I have had good success with it, although it does need to be installed on the computer that runs the script.
I'm integrating between two companies. According to this book, we should use AD FS to accomplish what I'm looking for.
I still don't actually have this working, though, so if someone has more information, I will change the answer to this question.
http://books.google.com/books?id=-6Dw74If4N0C&pg=PA27&lpg=PA27&dq=sharing+sharepoint+sites+external+adfs&source=bl&ots=ojOlMP13tE&sig=FjsMmOHymCOMGo7il7vjWF_lagQ&hl=en&ei=ytqfStClO5mMtgejsfH0Dw&sa=X&oi=book_result&ct=result&resnum=5#v=onepage&q=&f=false
I never really received an answer to this that worked out, but it is no longer an issue for me.
What we ended up doing is scraping the HTML. In effect, we put together our own ad hoc web service processor, where HTML is used to communicate instead of SOAP. Then we execute GETs, POSTs, etc. to work with the "web service."
We had done something similar in VBScript for WebDAV: we had a class for that and created a new one to work with SharePoint.