I'm trying to document some legacy Excel documents for source control ahead of their decommissioning. I need to extract all SQL queries contained within them.
I've been searching for ways to do this in either VBA or PowerShell (other options welcome), and just when I thought I had it working, I realised I'd only returned the source query, i.e.:
Dim mt As ModelTable
Dim wb As Workbook
Dim sql As String

Set wb = Workbooks.Open(myFilePath)

'Get the model tables and create a file per SQL query
For Each mt In wb.Model.ModelTables
    sql = mt.SourceWorkbookConnection.OLEDBConnection.CommandText
    Debug.Print sql
Next mt
When I run the above against files containing anywhere between 2 and 10 model tables, I only get back the SQL of the source query that created the connection.
I've tried several variations on the Model object but get either a blank string, a reference to the model, or the same result, i.e. just a single query.
For context, the 'EC' connection is our SQL database which is feeding the model tables.
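For example, enumerating the workbook's connections directly, with something like the sketch below (wb as above), still only surfaces one command text, since every model table shares that single 'EC' connection:

Dim conn As WorkbookConnection
For Each conn In wb.Connections
    Debug.Print "Connection: " & conn.Name
    If conn.Type = xlConnectionTypeOLEDB Then
        Debug.Print conn.OLEDBConnection.CommandText
    ElseIf conn.Type = xlConnectionTypeODBC Then
        Debug.Print conn.ODBCConnection.CommandText
    End If
Next conn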
In the UI I can see the Queries & Connections pane and the 'EC' connection details (screenshots omitted).
Most articles or posts on here refer to adding or updating queries, whereas I just need to extract them!
I'm not sure whether I'm wording the question appropriately, or whether what I need to achieve simply isn't possible.
Any help massively appreciated, as I'm constantly going in circles here!
UPDATE
I found a few guides online that approach this slightly differently by querying the model within the workbook via MDX; I've since found that the model's compatibility level is set to 1103, so I am unable to use the TMSCHEMA_* DMVs.
Having spent several hours yesterday searching for ways to get the source query/partition query/table query and repeatedly hitting dead ends, am I trying to do the impossible here?
In our project we created several useful Log Analytics queries that we deploy as a savedSearch (Microsoft.OperationalInsights/workspaces/savedSearches#2020-08-01).
When we load a query in the editor we can export it to Excel, which can then be refreshed to view current data.
However, this link is created to the query as it stands in the editor, not to the stored/deployed query. The alternative is to export to Power BI (an M query), which generates a script that you can then use in Excel.
In both cases the query itself seems to be baked into the connection, so it does not get updated when we deploy a new version. Does anyone know of a way to point this connection at a stored/deployed query?
I feel like this should be as straightforward as a connection to the resource, so that not only the data but also the query itself gets updated... I must be missing something.
One way I can think of is to leverage functions in log queries.
You can first save your query as a function, then export it to Excel; that creates a connection which executes the function instead of the raw query.
You can tweak the query later if needed and save/overwrite the same function, and the refresh should still be able to pull in the latest results, since the changes are now neatly abstracted away behind the function. :)
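Since you are already deploying the savedSearch, the function alias can be set on that same resource; here is a rough sketch against the 2020-08-01 schema (the names and the query body are purely illustrative):

{
  "type": "Microsoft.OperationalInsights/workspaces/savedSearches",
  "apiVersion": "2020-08-01",
  "name": "[concat(parameters('workspaceName'), '/MySavedQuery')]",
  "properties": {
    "category": "MyProject",
    "displayName": "My saved query",
    "functionAlias": "MySavedQuery",
    "query": "AppRequests | summarize count() by bin(TimeGenerated, 1h)"
  }
}

Excel then refreshes against MySavedQuery, and redeploying the savedSearch with a new query body changes what the function returns without touching the workbook connection.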
I have written an article in Google Docs.
I have included small tables, big tables and huge tables in different places in the file.
Now I need to modify some properties of all the tables at once.
But that doesn't seem to be possible?
Are there any methods for modifying the properties of all tables at once in Google Docs?
PS. Some more details to illustrate my issue:
1. Here is a doc file with one table.
2. Right-click on the table and choose Table properties.
3. Now here is a doc file with several tables.
How can I deal with all the tables together? (All the modifications are the same.)
Method 1
When creating the tables, you can simply set all the properties on the first one and then copy and paste it for the subsequent ones, since the formatting will be kept.
Method 2
If you want to modify several tables at the same time, you can make use of Apps Script.
Apps Script is a powerful development platform which can be used to build web apps and automate tasks. What makes it special is that it is easy to use and easy to integrate with G Suite.
Your task can therefore be achieved with a script like the one below.
Snippet
function setTableProperties() {
  var doc = DocumentApp.openById("DOCUMENT_ID");
  var tables = doc.getBody().getTables();
  tables.forEach(function (table) {
    // Any instruction run with the variable table will be executed for all tables.
    // For example, give every table a 3pt border:
    table.setBorderWidth(3);
  });
}
Explanation
The above script gathers all the tables from the wanted document and then, using a loop, accesses each table in turn.
In order to set the table properties as wanted, you just have to call the appropriate method/s.
The getAttributes method can be used as well, in order to see exactly which properties a table possesses.
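For instance, a small sketch combining the two (the attribute values here are purely illustrative):

function inspectAndStyleTables() {
  var doc = DocumentApp.openById("DOCUMENT_ID");
  var tables = doc.getBody().getTables();
  tables.forEach(function (table) {
    Logger.log(table.getAttributes()); // inspect the current properties
    var style = {};
    style[DocumentApp.Attribute.BORDER_COLOR] = "#d0d0d0";
    style[DocumentApp.Attribute.BORDER_WIDTH] = 1;
    table.setAttributes(style); // apply the same properties to every table
  });
}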
Reference
Apps Script Document Service;
Apps Script Enum Attribute;
Apps Script Table Class;
Apps Script DocumentApp Class.
I am learning PL/SQL. I want to ask a question about importing Excel files.
I created a table, then imported nearly 100 rows of data from Excel.
I wonder how I can see the underlying queries, basically like:
insert into table_name (column1, column2, ..., columnN)
values (value1, value2, ..., valueN); and the other 100 rows...
Sincerely
I'm not sure whether there is such a feature within the Oracle engine itself, but I can think of two ways to get those queries:
1. Use Oracle SQL Developer (Or another GUI with the same features) :
Oracle SQL Developer (download link here) is a free tool developed by Oracle for interacting with the database. Add a connection for your database and connect to it, then follow these guidelines carefully to generate your insert script.
2. Use v$sql (Experimental) :
Right now I have no access to an Oracle database to check this, but theoretically, if the database is a development/training one, there should not be much activity inside, so you can query the v$sql view for the most recent 100 (or however many) queries. Note that ROWNUM is assigned before ORDER BY is applied, so the ordering has to happen in a subquery:

SELECT sql_fulltext
FROM (SELECT sql_fulltext FROM v$sql ORDER BY first_load_time DESC)
WHERE ROWNUM <= 100;
Check for the ones starting with INSERT INTO {THE_TABLE_WHICH_HAS_IMPORTED_DATA} to find your insert lines.
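Alternatively (equally untested, with the same caveats), you can filter on the statement text directly; the table name is a placeholder:

SELECT sql_fulltext
FROM v$sql
WHERE UPPER(sql_text) LIKE 'INSERT INTO THE_TABLE_WHICH_HAS_IMPORTED_DATA%'
ORDER BY first_load_time DESC;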
As I mentioned, this method is quite experimental and might confuse you, so I strongly suggest using Oracle SQL Developer.
TL;DR: How do you add a full-text index using Entity Framework 5 code-based migrations?
I'm having issues adding a full-text index to a database using Entity Framework migrations. It needs to be there from the start, so I'm attempting to modify the automatically generated InitialCreate migration to add it.
As there isn't a way to do it via the migrations API, I've resorted to running inline SQL at the end of the Up method.
Sql("create fulltext catalog AppNameCatalog;");
Sql("create fulltext index on Document (Data type column Extension) key index [PK_dbo.Document] on AppNameCatalog;");
When this runs, everything gets created fine until it reaches this SQL, at which point it throws the SQL error 'CREATE FULLTEXT CATALOG statement cannot be used inside a user transaction.' Which is expected and working as designed.
Thankfully, Sql() has an overload that allows you to run the SQL outside the migration transaction. Awesome! I thought.
Sql("create fulltext catalog AppNameCatalog;", true);
Sql("create fulltext index on Document (Data type column Extension) key index [PK_dbo.Document] on AppNameCatalog;", true);
But lo and behold, modifying the code to do this (see above) results in a new error: 'Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.'
I've tried spitting out the SQL and running it manually, and it works fine. I've also diffed the generated SQL with and without running it outside a transaction, and the two are identical, so it must be something in the way the SQL is executed.
Thanks in advance for any help!
I had a similar problem. My InitialCreate migration was creating a table and then attempting to add a full-text index to that table, using the overloaded Sql() to indicate that it needs to execute outside the transaction. I was also getting a timeout error, and I suspect it was due to a thread deadlock.
I could get it to work in some scenarios by using Sql() calls instead of CreateTable() and by merging the CREATE FULLTEXT CATALOG and CREATE FULLTEXT INDEX statements into a single Sql() call. However, this wasn't very reliable: sometimes it would work and sometimes it would fail with the same timeout error.
The only reliable solution I found was to move the creation of the catalog and full-text index into a separate migration.
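A minimal sketch of what that separate migration can look like (the class name is illustrative; suppressTransaction: true is the same overload the question uses):

using System.Data.Entity.Migrations;

public partial class AddFullTextIndex : DbMigration
{
    public override void Up()
    {
        // Both statements must run outside the migration's user transaction.
        Sql("CREATE FULLTEXT CATALOG AppNameCatalog;", suppressTransaction: true);
        Sql("CREATE FULLTEXT INDEX ON Document (Data TYPE COLUMN Extension) " +
            "KEY INDEX [PK_dbo.Document] ON AppNameCatalog;", suppressTransaction: true);
    }

    public override void Down()
    {
        Sql("DROP FULLTEXT INDEX ON Document;", suppressTransaction: true);
        Sql("DROP FULLTEXT CATALOG AppNameCatalog;", suppressTransaction: true);
    }
}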
Hi,
I have a stored procedure which returns more than 100 fields and 1000+ rows. I need to save it all in a temp table and run my customised query against it to get the appropriate data.
I have searched a lot but am unable to find the right solution for my project. I would appreciate it if anyone could share their ideas.
create table #SP_Result
(
    -- I need to create the fields dynamically, according to the SP's result set
)

insert into #SP_Result
exec Ministry..civil_record
    '2010-08-07', 'Autogen', 20, NULL, NULL, NULL, NULL, NULL, NULL, NULL

I need to dump the result from the SP into #SP_Result.
Why don't you run the query itself in your "customised query", rather than trying to capture the result set of the stored proc? That's the normal method.
All those NULLs look like a bastard of a de-normalised "table", where many rows will not apply to the task. It is much, much faster to deal with the database in a normalised, set-oriented manner.
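That said, if you do need to capture the proc's output without hand-declaring 100+ columns, one pattern is SELECT ... INTO from OPENROWSET, which creates the columns of #SP_Result automatically. This is an untested sketch: it requires the 'Ad Hoc Distributed Queries' server option and a provider/connection string valid for your server, and the proc call is taken from the question:

SELECT *
INTO #SP_Result
FROM OPENROWSET('SQLNCLI', 'Server=localhost;Trusted_Connection=yes;',
     'EXEC Ministry..civil_record ''2010-08-07'', ''Autogen'', 20,
      NULL, NULL, NULL, NULL, NULL, NULL, NULL');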