I am trying to do a performance test on an Excel-based application with LoadRunner. I started off by running the Protocol Advisor, which is throwing an error.
My main target is to record the Excel-based application. For the simulation I created a database and am calling the database from Excel.
Any suggestions on which protocol to use, or any other tools for conducting a performance test on an Excel-based application?
This is where foundational knowledge of development and architecture comes into play for a performance test professional.
Tell us about the next upstream component from Excel. Is it connecting directly to the database, or is it going through a web services or other application server layer? Do you have SQL queries you are trying to reproduce, or some other mechanism for accessing your source?
What have you tried? (Other than protocol confuser?)
Can anyone help me fetch data from a Power BI endpoint without using PowerShell? I want a way to fetch it directly on Linux.
I know PowerShell can be installed on Linux, but is there any way I can skip it and fetch the data directly?
reference - https://learn.microsoft.com/en-us/power-bi/admin/service-premium-connect-tools
Your Power BI XMLA endpoint is accessible through the Azure Analysis Services (AAS) instance tied to the given data source/workspace, which means you should be able to connect to that AAS instance and work with the data there over the web. I am not aware of any currently available Linux-compatible tools that allow this. I did a bit of research and was surprised to find that there was no VS Code extension for it (might have to get to work on that ;)).
That being said, Microsoft has several client libraries (for both AMO and ADOMD.NET) built on .NET Core that could theoretically be used by a client application built for a supported Linux OS (Microsoft doc here). In other words, (again, theoretically) it should be relatively painless to build a simple tool for a supported Linux OS that takes in XMLA commands and executes them on a provided connection.
EDIT: Another good option to consider might be Microsoft's Power BI REST API (documentation here). If the functionality you are looking for is available within the REST API, you should be able to write a client tool targeting Linux (using one of many different stacks; .NET Core could still be the choice there) that uses the API for your Power BI instance in place of the XMLA endpoint directly. I would consider this the better alternative: it is a less 'Microsoft-y' way of doing this, and it is going to be much easier to maintain and develop over time. I would start by checking whether the functionality you want is available in this API.
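As a rough illustration of the REST route, here is a minimal Python sketch (standard library only) that calls the documented `GET /v1.0/myorg/datasets` endpoint; how you obtain the Azure AD access token is deliberately left out and assumed to happen elsewhere:

```python
import json
import urllib.request

API_BASE = "https://api.powerbi.com/v1.0/myorg"

def build_datasets_request(access_token, group_id=None):
    """Build a GET request listing datasets in a workspace (group), or in 'My workspace'."""
    url = f"{API_BASE}/groups/{group_id}/datasets" if group_id else f"{API_BASE}/datasets"
    return urllib.request.Request(url, headers={"Authorization": f"Bearer {access_token}"})

def list_datasets(access_token, group_id=None):
    """Execute the request and return the 'value' array from the JSON response."""
    req = build_datasets_request(access_token, group_id)
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["value"]
```

Passing a workspace ID switches to the `groups/{groupId}/datasets` form, which scopes the call to one workspace.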
EDIT: After reading further in the above-linked document regarding the AMO and ADOMD.NET client libraries:
TCP based connectivity is supported for Windows computers only.
Interactive login with Azure Active Directory is supported for Windows computers only. The .NET Core Desktop runtime is required.
So it looks like there are currently some limitations to these libraries regarding a Linux runtime. I am not positive whether you could use something other than TCP-based connectivity to accomplish this, but if I find a way (or someone is able to suggest something), I will update.
I need to measure the performance of my application. This application works as follows:
It's a WPF windows application, which requires some data to be filled by user
On clicking Submit button, it calls WCF web services
These services save these values in DB
Which tool would be the best for this scenario?
The best approach would be to have several thousand geo-distributed real users, with different Internet connection speeds, using your WPF application normally. If you cannot arrange this, I would suggest mimicking those several thousand users at the protocol level. You mentioned WCF web services, so look for a web service load testing tool. Good ones include:
SoapUI - designed for web services testing. Has some load testing capabilities. See Creating and Running LoadTests to get started.
Apache JMeter - multiprotocol load testing tool, supports web services as well. See Testing SOAP/REST Web Services Using JMeter guide.
In fact any tool which is capable of sending an HTTP request will fit.
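To illustrate that point, here is a minimal Python sketch (standard library only) that wraps a payload in a SOAP 1.1 envelope and builds the POST; the service URL, SOAPAction, and body below are hypothetical placeholders for whatever your WCF service actually expects:

```python
import urllib.request

def build_soap_request(url, soap_action, body_xml):
    """Wrap a payload in a SOAP 1.1 envelope and build a POST request for it."""
    envelope = (
        '<?xml version="1.0" encoding="utf-8"?>'
        '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">'
        f"<soap:Body>{body_xml}</soap:Body>"
        "</soap:Envelope>"
    )
    return urllib.request.Request(
        url,
        data=envelope.encode("utf-8"),
        headers={"Content-Type": "text/xml; charset=utf-8", "SOAPAction": soap_action},
        method="POST",
    )

# urllib.request.urlopen(build_soap_request(...)) would then send it.
```

Firing this from a pool of threads gives a crude concurrent-user simulation, though JMeter's thread groups handle ramp-up, timers, and reporting for you.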
You can use one of the usual load testing tools, for example Apache JMeter (free) or HP LoadRunner (community license for 50 free users).
Just record the communication between your WPF application and WCF (I believe it communicates via HTTP) and add the needed parameters.
I am currently working on a mobile concept.
We are running a SharePoint 2010 intranet solution, which is accessible ONLY within the company.
We want to make a mobile solution (for people outside), with data from the Sharepoint server.
I would like to have the data moved, e.g. every 10-15 minutes through a cron job, to an external database, which the mobile solution can access.
What is the easiest way to move the data? Using the web services, or are there any other ways to do this?
Thanks in advance,
Jens
A possible solution is to code a timer job: a cron-like job scheduled by SharePoint, which you can set to run every night, with some SharePoint object model code that extracts all the data and sends it to the other server. You can do this using ADO.NET or any equivalent technology such as an ORM. This method pushes the data to the server.
If you have connectivity limitations, such as firewalls that only allow HTTP traffic, then you will definitely need to use either the web services or the Client Object Model. This method pulls the data from the server.
The Client Object Model is preferred over the web services because, among other features, it batches requests to make them more efficient, its API is better for manipulating data, etc.
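For the pull approach, SharePoint 2010 also exposes its lists over the ListData.svc OData endpoint, so a scheduled script could fetch list items as JSON over plain HTTP and write them to the external database. A minimal Python sketch (the site URL and list name are hypothetical, and authentication, e.g. NTLM, is omitted):

```python
import json
import urllib.request

def build_list_url(site_url, list_name, top=500):
    """Build the SharePoint 2010 ListData.svc OData URL for a given list."""
    return f"{site_url}/_vti_bin/ListData.svc/{list_name}?$top={top}"

def fetch_list_items(site_url, list_name):
    """Request the list as verbose JSON and return the item array."""
    req = urllib.request.Request(
        build_list_url(site_url, list_name),
        headers={"Accept": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["d"]["results"]
```

A cron entry running such a script every 10-15 minutes would cover the schedule you described.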
Another option is to use SSIS to do the job as described in this article:
http://msdn.microsoft.com/en-us/library/hh368261.aspx
I am working on an inventory application (C# .NET 4.0) that will simultaneously inventory dozens of workstations and write the results to a central database. To save myself having to write a DAL, I am thinking of using Fluent NHibernate, which I have never used before.
Is it safe and good practice to allow the inventory application, which runs as a standalone application, to talk directly to the database using NHibernate? Or should I be using a client-server model where all access to the database goes via a server, which then reads/writes to the database? In other words, if 50 workstations were being inventoried concurrently, there would be 50 active DB sessions. I am thinking of using GUID comb for the PK IDs.
Depending on the environment in which your application will be deployed, you should also consider that direct database connections to a central server might not always be allowed for security reasons.
Creating a simple REST Service with WCF (using WebServiceHost) and simply POST'ing or PUT'ing your inventory data (using HttpClient) might provide a good alternative.
As a result, clients can get very simple and can be written for other systems easily (linux? android?) and the server has full control over how and where data is stored.
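To make that REST alternative concrete, here is a minimal Python sketch of what such a thin client could send; the endpoint URL and payload shape are made-up placeholders, since the real WCF service would define its own contract:

```python
import json
import urllib.request

# Hypothetical endpoint exposed by the central REST service.
INVENTORY_URL = "http://inventory-server/api/inventory"

def build_report(hostname, items):
    """Serialize one workstation's inventory as a JSON POST request."""
    payload = json.dumps({"hostname": hostname, "items": items}).encode("utf-8")
    return urllib.request.Request(
        INVENTORY_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# urllib.request.urlopen(build_report(...)) would submit the report.
```

The same one-request-per-workstation shape is easy to reproduce from C# with HttpClient, or from any other platform that can POST JSON, which is the portability point made above.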
it depends ;)
NHibernate has optimistic concurrency control out of the box, which is good enough for many situations. So if you just create data on 50 different stations, there should be no problem. If creating data on one station depends on data from all stations, it gets tricky, and a central server would help.
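The version-column mechanism behind optimistic concurrency can be sketched in a few lines. This is a generic illustration against SQLite, not NHibernate itself; NHibernate generates an equivalent versioned UPDATE for mapped entities:

```python
import sqlite3

def update_item(conn, item_id, new_qty, expected_version):
    """Update only if nobody else changed the row since we read it (optimistic lock)."""
    cur = conn.execute(
        "UPDATE inventory SET qty = ?, version = version + 1 "
        "WHERE id = ? AND version = ?",
        (new_qty, item_id, expected_version),
    )
    if cur.rowcount == 0:
        # Another session bumped the version first; the caller must re-read and retry.
        raise RuntimeError("stale data: row was modified by another session")
    return expected_version + 1

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE inventory (id INTEGER PRIMARY KEY, qty INTEGER, version INTEGER)")
conn.execute("INSERT INTO inventory VALUES (1, 10, 0)")

v = update_item(conn, 1, 12, expected_version=0)  # succeeds; version becomes 1
```

A second writer still holding version 0 would now get the stale-data error instead of silently overwriting, which is why 50 independent stations creating their own rows are unproblematic.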
I have installed Cognos 10 Framework Manager in my client computer and when I tried creating a new project, this pop up message showed up.
I heard that you need to have a web service in this context. The question is: how should I solve this problem?
Framework Manager should typically be configured to connect to a running Cognos BI instance. If you don't have one of those, you will get errors trying to perform a variety of activities within an FM project (such as applying row-level security). Even getting started will be difficult, since Framework Manager will reach out to the BI Server to retrieve things like data source information.
If you have a running BI Server somewhere, you need to configure the FM instance (Cognos Configuration on your client system) to point at the Gateway and Dispatcher URLs for the Cognos BI Server. Here are the relevant docs on IBM's site:
http://publib.boulder.ibm.com/infocenter/cbi/v10r1m1/topic/com.ibm.swg.ba.cognos.inst_cr_winux.10.1.1.doc/t_steps_configure_fm_environment_properties.html#steps_configure_FM_environment_properties