PowerPivot - How to grab data from Yahoo web services - Excel

I would like to import weather forecasts into PowerPivot for a specified country by using the Yahoo API.
So first of all I got the data stream URL, including all required parameters to get forecasts (it works in my browser):
https://query.yahooapis.com/v1/public/yql?q=select%20*%20from%20weather.forecast%20where%20woeid%3D612977&format=json&diagnostics=true&callback=
Then I tried to use it as an "Other data stream" in PowerPivot to grab the content.
However, I got the following strange error about DTD:
Cannot connect to the specified feed. Verify the connection and try again. Reason: For security reasons DTD is prohibited in this XML document. To enable DTD processing set the DtdProcessing property on XmlReaderSettings to Parse and pass the settings into XmlReader.Create method.
Any idea how to solve that?

The easiest way was to install Power Query for Microsoft Excel and then create a new request to the Yahoo web service.
After parsing the data in Power Query (splitting columns, etc.), the formatted data is available in a PowerPivot tab and you can use it just like data imported directly into PowerPivot.
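If you just want to check what the YQL feed returns before shaping it in Power Query, a minimal C# sketch could look like the one below (assuming the endpoint above is still reachable; note that format=json sidesteps the DTD/XML issue entirely):

using System;
using System.Net.Http;
using System.Threading.Tasks;

class FetchForecast
{
    static async Task Main()
    {
        // Same YQL request as above, asking for JSON instead of XML
        string url = "https://query.yahooapis.com/v1/public/yql?q=select%20*%20from%20weather.forecast%20where%20woeid%3D612977&format=json";

        using var client = new HttpClient();
        string json = await client.GetStringAsync(url);

        // Inspect the raw payload, then do the actual splitting/shaping in Power Query
        Console.WriteLine(json);
    }
}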

Related

Import data from a web page with Power Query

I wanted to get a table from the webpage "https://www.nseindia.com/companies-listing/corporate-filings-actions" using Power Query, so I used this link as the API: "https://www.nseindia.com/api/corporates-corporateActions?index=equities".
But when I tried Data tab > Get Data > From Web with both of the above links, I was unable to fetch the data; the pop-up box hangs Excel.
Am I going about it the wrong way?
Can you suggest the correct way?
I am not able to complete the process.

Does Azure Data Factory support CDATA XML format?

After some trial and error, and a support call to Microsoft's Data Factory team of engineers, it appears the product does not support CDATA values returned in XML.
I am using a web sink with XML format to send SOAP headers and return XML data that contains CDATA sections. Everything comes back properly in my pipeline except the XML values that contain any sort of CDATA formatting.
Is there a known workaround for this within Azure or Data Factory itself?
Thank you

Export Azure Application Insights with REST API for search term

I have a requirement where I need to search for a particular term in Application Insights, export this "report", and send it in an email. Consider this a "report a bug" use case.
So if I search for the key "xxxxxx123", it should retrieve all matching traces/logs and then export them to either Excel or CSV.
So my question is: is it possible with a NuGet package or even with the REST API?
I tried looking at this, but didn't find it helpful:
dev.applicationinsights.io/apiexplorer
There is no NuGet package; you can do it by using the Get query REST API.
Write any query you need in the Query textbox; on the right side you can see the generated request. Then you can use C# or another programming language to query the results, as per this article. (Note: remember to use your real Application ID and API Key.)
After fetching the data, you should write your own logic to export it to CSV or Excel.
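For example, here is a minimal C# sketch against the Application Insights query REST API; the Application ID, API key, search term, and output file name are placeholders taken from the question or made up for illustration:

using System;
using System.Collections.Generic;
using System.IO;
using System.Net.Http;
using System.Text.Json;
using System.Threading.Tasks;

class ExportTraces
{
    static async Task Main()
    {
        // Placeholders: use your real Application ID and API key from the portal's API Access page
        string appId = "<your-application-id>";
        string apiKey = "<your-api-key>";

        // Kusto query for the search term from the question
        string query = "traces | where message contains 'xxxxxx123'";
        string url = $"https://api.applicationinsights.io/v1/apps/{appId}/query?query={Uri.EscapeDataString(query)}";

        using var client = new HttpClient();
        client.DefaultRequestHeaders.Add("x-api-key", apiKey);
        string json = await client.GetStringAsync(url);

        // The response is shaped as { "tables": [ { "columns": [...], "rows": [[...]] } ] }
        using var doc = JsonDocument.Parse(json);
        var table = doc.RootElement.GetProperty("tables")[0];

        // Write the columns and rows out as a simple CSV "report"
        using var csv = new StreamWriter("report.csv");

        var header = new List<string>();
        foreach (var col in table.GetProperty("columns").EnumerateArray())
            header.Add(col.GetProperty("name").GetString());
        csv.WriteLine(string.Join(",", header));

        foreach (var row in table.GetProperty("rows").EnumerateArray())
        {
            var cells = new List<string>();
            foreach (var cell in row.EnumerateArray())
                cells.Add(cell.ToString().Replace(",", " "));
            csv.WriteLine(string.Join(",", cells));
        }
    }
}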

Consume external api into kentico

What is the best way to consume an external API's data?
Do I need to create a new web api project and set up routing?
In the past I used a web service data source and attached a repeater. This won't work because I have an API instead of a web service.
Thanks much
You can try this; it's how I've converted my JSON/XML APIs (or anything, really) into a transformable object. Just clone this tool and adjust it to your needs:
https://devnet.kentico.com/marketplace/utilities/universal-api-viewer-(with-hierarchy-support)
A custom data source is still what you would want to do, as all a data source really does is return a DataTable; my tool takes it a step further by assigning a hierarchy structure and pseudo page types so the Repeater can treat them like items on the content tree.
If you have access to the external database, you can use Kentico's ConnectionHelper class to pass in the external database's connection string and run queries against it:
// Get a connection to the external database (the connection string below is a placeholder)
GeneralConnection ConnectionObj = ConnectionHelper.GetConnection("GetConnectionStringFromWeb.ConfigHere");
ConnectionObj.Open();
// Run a plain SQL query against the external table and read the results into a DataSet
DataSet Results = ConnectionObj.ExecuteQuery(new QueryParameters("select * from SomeTable", null, QueryTypeEnum.SQLQuery));
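From there, a hypothetical follow-up (the repeater control name below is made up) would simply bind the results:

// Bind the first result table to an ASP.NET Repeater control
MyRepeater.DataSource = Results.Tables[0];
MyRepeater.DataBind();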

Spotfire: Some part of the data table "sp_XXXX" could not be loaded. A data source may be missing or has been changed

I have created a Spotfire visualization and saved it. However, when it is accessed through a web link I get the error below:
Some part of the data table "XXXXxxxxx" could not be loaded. A data source may be missing or has been changed.
I have done the following settings:
1) The data is loaded as "Linked to source".
2) Data Connections Properties --> Data Connection Settings --> Data Source --> Credentials --> credentials are given (profile credentials have been shared).
3) I have used a stored procedure, and it has been created in a database to which Spotfire has access (including the schema).
Please help me to solve the issue.
You mentioned that you accessed the DXP via a web link, which suggests to me that you are using WebPlayer. WebPlayer will not have access to files that are stored locally. WebPlayer will load data remotely using database connections and Information Links, however.
I assume that the data loads properly when you open the analysis in the desktop client?
A workaround would be to save a copy of the file for access in WebPlayer and select "Stored Data" (i.e. embedded) in the "Change all applicable data sources to:" dropdown.
This issue has been resolved. It was due to the temp tables used in the query; Spotfire acts weirdly when a temp table is used in the stored procedure (as per my observation).
