Cognos two ways of importing tables: What is the difference?

What is the difference between the
'Run MetaData Wizard' --> Select Data Source
and
'Create Query Subject' --> 'Data Source' and then picking the datasource from the wizard
Are they the same thing?

They are identical if you pick the options as suggested above.
In the Run Metadata Wizard you have options to import metadata from other Cognos tools or from 3rd party applications.
In Create Query Subject, you can create data source query subjects, which run directly against the data source; model query subjects, which run against other query subjects in the model; and stored procedure query subjects, which map to database stored procedures that return a result set.

Using the Metadata Wizard you can create multiple query sources (e.g. import multiple tables) from a data source in a single operation.


Azure Data Factory Flatten Multi-Array JSON- Issue previewing data in dataflow source

I haven't worked much with ADF, but I am trying to connect to a REST API and write the data to an Azure SQL DB. I have already created a pipeline that copies the JSON retrieved from the REST API and writes it to Blob storage.
When I create a data flow and use the blob as the source, I get a nested table in the data preview tab. 'Allow schema drift' is selected and the JSON setting is set to 'Document of arrays'.
All the data is in subarrays under the tickets array. Is there a way to select only the tickets array? If this is possible then I should be able to easily flatten the rest.
(Screenshots: top-level JSON, sub-array, data preview.)
You can use the Flatten transformation to unroll the tickets array. It is currently showing as drifted in your data preview, so you'll want to first make it part of your metadata. You can do that either through Import Projection on the source projection tab, or use the "Map Drifted" button on your data preview panel.
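For illustration, here is a hypothetical document of the shape described (all data nested under a tickets array) and the rows that unrolling tickets with a Flatten transformation would produce:
// Hypothetical source document
{
  "tickets": [
    { "id": 1, "subject": "Printer offline" },
    { "id": 2, "subject": "Password reset" }
  ]
}
// After flattening on tickets: one row per ticket
// id | subject
// 1  | Printer offline
// 2  | Password reset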

Are there any methods to modify the properties of all tables at once in Google Docs?

I have written an article in Google Docs.
I have included small tables, big tables, and huge tables in different places in the file.
Now I need to modify some properties of all the tables at once.
But that doesn't seem possible?
Are there any methods to modify the properties of all tables at once in Google Docs?
PS. more details to illustrate my issue:
1. Here is a doc file with one table.
2. Right click on the table and choose Table properties
3. Now suppose there are more tables in the doc file
How can I deal with all the tables together? (All modifications are the same)
Method 1
When creating the tables, you can simply set all the properties on the first one and then copy and paste it for the next ones, since the formatting will be kept.
Method 2
If you want to modify more tables at the same time, you can make use of Apps Script.
Apps Script is a powerful development platform that can be used to build web apps and automate tasks. What makes it special is that it is easy to use and makes it easy to create applications that integrate with G Suite.
Therefore, your task can be achieved by using this script.
Snippet
function setTableProperties() {
  var doc = DocumentApp.openById("DOCUMENT_ID");
  var tables = doc.getBody().getTables();
  tables.forEach((table) => {
    // Any instruction run with the variable table will be executed for all tables.
  });
}
Explanation
The above script gathers all the tables from the target document and then iterates over each table in it.
To set the table properties as desired, you just have to use the appropriate method(s).
The getAttributes method can also be used to see exactly which properties a table possesses.
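For instance, here is a minimal sketch assuming the goal is to change each table's border width and color; swap in whichever Table methods match the properties you want to modify:
function setTableBorders() {
  var doc = DocumentApp.openById("DOCUMENT_ID"); // replace with your document ID
  var tables = doc.getBody().getTables();
  tables.forEach((table) => {
    table.setBorderWidth(2);         // border width, in points
    table.setBorderColor("#ff0000"); // border color, as a hex string
  });
}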
Reference
Apps Script Document Service;
Apps Script Enum Attribute;
Apps Script Table Class;
Apps Script DocumentApp Class.

Stream Analytics Query (Select * into output)(Exclude specific columns)

I have a query like;
SELECT
    *
INTO
    [documentdb]
FROM
    [iothub]
TIMESTAMP BY eventenqueuedutctime
I need to use * because the data is dynamic and doesn't have a specific schema. The problem is that IoT Hub system-information data is written to documentdb by this query. Is there any way to exclude the IoT Hub system-information data?
Thanks.
This is not possible currently, but it will be possible with job compatibility level 1.2 in the near future. For now, one workaround is to create a post-create trigger in Cosmos DB to remove this property from the document.
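Cosmos DB triggers are written in JavaScript. Here is a minimal sketch, assuming the property to strip is named IoTHub; a pre-trigger is used here, since pre-triggers can modify the request body before the document is written:
function removeIoTHubInfo() {
    var context = getContext();
    var request = context.getRequest();
    var doc = request.getBody();
    delete doc["IoTHub"]; // strip the IoT Hub system-information property
    request.setBody(doc);
}
Keep in mind that, as noted in a later answer, Cosmos DB triggers are not fired automatically; the trigger must be specified explicitly in the SDK call or REST request that creates the document.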
To answer your question: Azure Stream Analytics doesn't have built-in support for excluding columns from dynamic data (the IoT Hub information), but we can achieve this with a UDF. Here is more info on UDFs.
A UDF can delete the column from the input data and return the updated JSON.
There are basically two steps to achieve this:
Create a JavaScript UDF.
Go to Functions in the left-hand navigation (below Inputs).
Click Add --> JavaScript UDF.
Give it the function alias removeiothubinfo.
Keep the output type as 'any'.
Copy and paste the following code into the function definition.
function main(input) {
    delete input['IoTHub']; // remove the IoT Hub system-information property
    return input;
}
Click on Save
Update query
Go to the query mode and copy and paste the following query:
WITH NewInput AS
(
    SELECT
        udf.removeiothubinfo(iothub) AS UpdatedJson
    FROM
        [iothub]
)
SELECT
    UpdatedJson.*
INTO
    [documentdb]
FROM
    NewInput
Click on Save
I suggest testing your query before running the job by uploading a sample file containing JSON with a similar structure.
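For example, a hypothetical input event and the cleaned result the UDF would return:
// Hypothetical input event
{
    "deviceId": "sensor-01",
    "temperature": 21.5,
    "IoTHub": {
        "ConnectionDeviceId": "sensor-01",
        "EnqueuedTime": "2018-01-01T00:00:00Z"
    }
}
// After udf.removeiothubinfo(...)
{
    "deviceId": "sensor-01",
    "temperature": 21.5
}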
Edited
Also, even in job compatibility level 1.2 there has been no additional functionality to achieve this. Check this out for more info.
As @chetangm said in his answer, no such filtering mechanism is supported in ASA so far. Yes, you could create a trigger in Cosmos DB; however, it needs to be invoked from SDK code or the REST API. It won't be triggered automatically.
I can offer another workaround: an Azure Function using the Cosmos DB trigger. It is executed when data is added to or changed in Azure Cosmos DB. You just need to remove the unwanted fields in the function code.
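Here is a minimal sketch in JavaScript, with hypothetical database, collection, and connection-setting names you would replace with your own; note that the change feed also sees the cleaned-up document, so guard against reprocessing:
// function.json (hypothetical binding configuration)
{
    "bindings": [
        {
            "type": "cosmosDBTrigger",
            "name": "documents",
            "direction": "in",
            "connectionStringSetting": "CosmosDBConnection",
            "databaseName": "mydb",
            "collectionName": "documents",
            "createLeaseCollectionIfNotExists": true
        }
    ]
}
// index.js
module.exports = async function (context, documents) {
    documents.forEach((doc) => {
        if (doc.IoTHub) {
            delete doc.IoTHub; // strip the IoT Hub system information
            // Persist the cleaned document, e.g. via a Cosmos DB output
            // binding or the SDK (omitted here); the doc.IoTHub check
            // keeps already-cleaned documents from being reprocessed.
        }
    });
};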

Read a table from Kentico database which was not declared as 'custom table'

My question is pretty simple. I am working on Kentico 9 with its SQL Server database, which contains several tables that were added directly from SQL Server Management Studio by an external contractor. Those tables are used to store custom content that will be displayed on a site, but there is no code for querying them; that is, they don't have Info and Provider classes.
https://docs.kentico.com/display/K82/Retrieving+database+data+using+ObjectQuery+API
According to this, all tables in the Kentico database can be accessed by invoking methods on these classes, but I don't have them this time.
Something like this will not work if I use my table name:
var user = UserInfoProvider.GetUserInfo("administrator");
var items = CustomTableItemProvider.GetItems("MyTable")
    .TopN(10)
    .WhereEquals("ItemCreatedBy", user.UserID)
    .OrderBy("ItemCreatedWhen");
My question is:
can I query any table by its name?
One last thing:
I cannot declare those tables as 'custom tables' because of what seems to be a bug in the CMS.
Or you can pull data using your own SQL query:
var ds = ConnectionHelper.ExecuteQuery("select ....", null, QueryTypeEnum.SQLQuery);
Nevertheless, I would recommend creating a custom class inside a custom module (much more robust than custom tables) instead, and using the generated Info and InfoProvider classes to get and manipulate the data.
I think an object has to be registered within the system (created through the Kentico UI or API) in order to be pulled from the DB with an object query.
So I'd choose one of the following options:
Use Entity Framework or something similar to work with that data
Create appropriate custom tables or even a custom module and push the data there. Not sure why you can't create a custom table... What is the error you're getting?
If you need to present data on the UI only (without processing on the back end) - use just custom queries
Hope this helps.
If you are accessing the data in code, you could do it the good old-fashioned way. If you want to pull data from the database to display on the website, you could also do so by creating a custom query and using a transformation to display the fields, then use a repeater on the page to display the transformed data. Alternatively, you can use a SQL data source with a basic repeater, but you still have to create a transformation to display the data. Both methods allow you to access the data in the tables from within the CMS UI, with no need to touch any code-behind.
If your objective is to read data from these database tables for display on a web page, e.g. using the CMS Repeater web part, you can simply create custom queries in Kentico itself and load data with them. You can find the details here on how to create custom queries and load data using them.
On the other hand, you can also write your own custom classes and define custom methods where you pull data using your own SQL query, like this:
var ds = ConnectionHelper.ExecuteQuery("select ....", null, QueryTypeEnum.SQLQuery);
Lastly, I don't think there should be any issue creating a custom table to replace those direct DB tables. The only thing to ensure is that the code name of the custom table is unique; don't reuse the exact same name, because that will cause an exception since a table with that name already exists in the DB. Please share the exception you're getting while creating the custom table so I can help you further.

Azure Search: Is there a way to add a query when importing from SQL?

When importing data to an index in Azure Search from SQL (programmatically, not through the interface), is there a way to add a query to filter the data coming from the SQL table?
Looking at the REST API documentation for Create Data Source, as of today it is not possible to define a query to filter the data that populates an index.
However, I read somewhere that you can create a view and use it as the data source for populating the index. When using a view, you will not be able to use SQL integrated change tracking for change/deletion detection, but you will still be able to use High Water Mark change detection and Soft Delete Column deletion detection.
Also, please vote for this UserVoice suggestion to request adding support for a query parameter.
