How to get Excel to reliably execute sp_executesql from a query table on a worksheet?

In MS Excel, if you create a QueryTable with Microsoft Query, and your SQL query cannot be visually presented by Microsoft Query, then you are not allowed to provide parameters for that query, which is a shame. There is, however, this awesome technique that allows parameters anyway:
{CALL sp_executesql (N'select top (@a) * from mytable', N'@a int', ?)}
You provide the query in the ODBC CALL form and it works with parameters.
Unless it does not.
While on some computers it works flawlessly, on other computers Excel throws an error when trying to refresh the query table:
For SQL Native Client 10: Invalid parameter number
For SQL Native Client 11: Procedure or function sp_executesql has too many arguments specified.
With a profiler I can see Excel (actually, the native client when poked by Excel) is doing this before actually executing sp_executesql:
exec sp_describe_undeclared_parameters N' EXEC sp_executesql N''<actual query>;'',N''<declared parameters>'',@P1 '
Here @P1 is the parameter placeholder that is supposed to go to sp_executesql later, and that is where sp_describe_undeclared_parameters fails: it does not expect any custom parameters for sp_executesql, only the two intrinsic ones, @stmt and @params. If I manually remove the ,@P1 bit from the query, it executes fine in all cases.
So that is the problem: on some computers the above auto-generated sp_describe_undeclared_parameters call works despite the unnecessary/wrong ,@P1 bit, and on others it fails.
We need to make it work on all computers.
Weird things to consider:
I fail to see anything in common between the computers that don't have the problem; neither bitness nor the Windows version seems to matter.
I fail to manually execute the said query with the ,@P1 bit attached: whatever tool I use, I get the "too many arguments" error, and yet Excel executes it without a problem when it feels like it. I can see with the profiler that this is the exact query that hits the server. Maybe it has something to do with a very peculiar combination of connection settings, but they appear to be the same on all computers (the data source is an ODBC system data source using SQL Server Native Client 11, and all parameters are the same on all tabs across the computers).
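One way to take Excel out of the equation is to submit the same ODBC CALL escape through the same driver from a scratch script. Below is a minimal pyodbc sketch of that check; the DSN name, table, and parameter value are illustrative assumptions, not details from the original setup.

import pyodbc

# Connect through the same ODBC system data source the workbook uses
# (the DSN name here is hypothetical).
conn = pyodbc.connect("DSN=MySqlServerDsn", autocommit=True)
cursor = conn.cursor()

# The same ODBC CALL form the QueryTable uses; the driver binds the ?
# marker and passes it as an extra argument to sp_executesql.
cursor.execute(
    "{CALL sp_executesql (N'select top (@a) * from mytable', N'@a int', ?)}",
    10,
)
for row in cursor.fetchall():
    print(row)

If this succeeds on a machine where the QueryTable refresh fails, it points at how Excel prepares the statement (the sp_describe_undeclared_parameters round trip) rather than at the driver or the server.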

Related

SSIS - Power Query Source: setting connection at runtime

I'm trying to use the Power Query source component in a generic way from SSIS (VS2019).
The idea would be to use a Foreach Loop to load and transform Excel files. At run time, I need to set the connection manager properties for each file, as well as the PQY script to be executed on the file.
What I have done so far is create a JSON connection string inside a script component and assign it to the connection manager. It keeps saying that the file requires credentials.
Has anyone already experienced this kind of development? All the files have the same structure so far; does the metadata need to be refreshed too?
[Edit]
1. In the control flow, I'm retrieving the PQY script I want to apply from a DB.
Before transformations, the script starts like this:
let Source = Excel.Workbook(File.Contents("path_to_a_file.xlsx"), null, true), RawData_Sheet = Source{[Item="Table1",Kind="Table"]}[Data], ...
In the C# script task, I'm replacing the path to the Excel file with the current file variable. The M script is stored in a variable used in the PQY component.
The C# script then updates the PQY connection manager to target the appropriate file:
ConnectionManager _conn = Dts.Connections["Power Query Connection Manager"];
String _ConnectString = "[{kind:File,path:path_to_a_file.xlsx,AuthenticationKind:Windows,Username:myusername,Password:mypassword}]";
_conn.ConnectionString = _ConnectString;
The PQY component is left as it is, connected to "Power Query Connection Manager" and getting its script from the variable I set.
(Screenshot: PQY configuration screen)
Thanks for any tip on this,
Olivier
I can't address the specifics of the Power Query source, but a generic anything in a Data Flow will not work.
The Data Flow task works because it makes a strict contract between the source(s) and the destination(s): these columns with these data types will be in play during the run. It's a design-time contract, because that allows the run-time engine to allocate resources based on how many buffers of data the system can support. Each row is X bytes, we have Y bytes of memory available, so Z buffers' worth of data, plus parallelism considerations.
Wish I had a better story to tell you.
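As a back-of-the-envelope illustration of that sizing, here is a sketch in Python; the row width is made up, while 10 MB and 10,000 rows are the Data Flow's default DefaultBufferSize and DefaultBufferMaxRows.

# Rough sketch of the Data Flow's design-time buffer math.
row_bytes = 200                    # made-up row width from the column contract
default_buffer_size = 10485760     # DefaultBufferSize: 10 MB
default_buffer_max_rows = 10000    # DefaultBufferMaxRows
rows_per_buffer = min(default_buffer_size // row_bytes, default_buffer_max_rows)
print(rows_per_buffer)             # -> 10000

Change the row width (the contract) and the buffer layout changes with it, which is why the engine insists on knowing it at design time.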

How to execute multiple procedures from Node.js and node-firebird?

I'm using Firebird 2.5 and node-firebird 0.8.6. I have to run SQL files with multiple stored procedures, but I always get errors like the one below:
Error: Dynamic SQL Error, SQL error code = -104, Token unknown - line 1, column 5, term
at doCallback (/home/somasys/Downloads/testefb/node_modules/node-firebird/lib/index.js:1234:18)
at /home/somasys/Downloads/testefb/node_modules/node-firebird/lib/index.js:2929:21
at /home/somasys/Downloads/testefb/node_modules/node-firebird/lib/messages.js:151:25
at search (/home/somasys/Downloads/testefb/node_modules/node-firebird/lib/messages.js:117:13)
at /home/somasys/Downloads/testefb/node_modules/node-firebird/lib/messages.js:54:21
at FSReqCallback.wrapper [as oncomplete] (fs.js:477:5)
Here are some parts of my SQL file:
set term ^;
CREATE OR ALTER PROCEDURE PRC_CALCULATRIBUTA()
BEGIN
...
END^
set term ;^
commit work;
set term ^;
CREATE OR ALTER PROCEDURE PRC_CORRIGEENCERR()
BEGIN
...
END^
set term ;^
commit work;
I've already tried removing these set term and commit work statements and running the SQL script inside an
EXECUTE BLOCK AS
BEGIN
...
END
but even so I got the same errors as the one described above. Is there any instruction or statement to put inside my SQL script?
Firebird's statement API can only execute individual statements. In addition, the SET TERM statement is not part of the Firebird SQL syntax. It is only a client-side feature in ISQL and other Firebird tools to determine when a statement is done. See also firebird procedural query throwing "token unknown" error at "SET TERM #;".
You will need to:
split up your SQL script into individual statements,
remove the SET TERM statements,
remove any statement terminators outside the procedure bodies, and
execute the statements individually (see the sketch after this answer).
I would also suggest not executing commit work, but instead using the transaction control options of node-firebird. I'm not sure whether executing commit work will work in node-firebird, but some drivers will break because you just closed a transaction on them without using their transaction API.
In other words, you will need to execute:
CREATE OR ALTER PROCEDURE PRC_CALCULATRIBUTA()
BEGIN
...
END
optionally execute the commit, or commit explicitly using the node-firebird API, and then
CREATE OR ALTER PROCEDURE PRC_CORRIGEENCERR()
BEGIN
...
END
etc.
You cannot use execute block for this, because execute block doesn't support execution of DDL. There are workarounds to that limitation (using execute statement), but it is generally not a good use of execute block.
As an aside, committing between creating stored procedures is unnecessary.
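To illustrate the splitting, here is a small sketch (in Python purely for illustration; the execution side would feed each resulting statement to node-firebird's query API one at a time). The parser is deliberately naive: it assumes terminators appear only at line ends and ignores string literals and comments, and the file name is a placeholder.

import re

def split_firebird_script(script):
    # Split an ISQL-style script into individual statements, honouring
    # SET TERM lines and dropping them from the output.
    terminator = ";"
    statements, buffer = [], []
    for line in script.splitlines():
        m = re.match(r"\s*set\s+term\s+(\S+)", line, re.IGNORECASE)
        if m:
            # The token is "<new><old>", e.g. "^;" or ";^": strip the
            # old terminator off the end to get the new one.
            token = m.group(1)
            if token.endswith(terminator):
                token = token[: -len(terminator)]
            terminator = token
            continue
        buffer.append(line)
        text = "\n".join(buffer).strip()
        if text.endswith(terminator):
            statements.append(text[: -len(terminator)].strip())
            buffer = []
    return statements

for stmt in split_firebird_script(open("procedures.sql").read()):
    if stmt.lower().startswith("commit"):
        continue  # commit through the driver's transaction API instead
    print(stmt.splitlines()[0], "...")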

SqlState 24000, Invalid cursor state

I am trying to help one of our developers resolve an Azure SQL DB error. He has attempted to run a script, connecting using sqlcmd and (I presume) ODBC. It seems no matter what he does he receives the error message "SqlState 24000, Invalid cursor state".
His script consists of roughly 80 "insert into table where not exists sub-select" statements. Some of the sub-selects return zero records.
I read this post which is admittedly almost a year old now. The short version seems to be "this is a known Azure SQL DB bug".
sqlcmd on Azure SQL Data Warehouse - SqlState 24000, Invalid cursor state after INSERT statement
I know for certain my developer has been able to run these statements previously. Is that just the nature of a bug - sometimes it occurs and sometimes it doesn't? Does he need to use a different ODBC driver? Any other suggestions?
Please make sure you are using ODBC Driver 13.1 or later; you can download it from Microsoft's download pages.
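If you want to verify programmatically which driver and version a connection ends up using, here is a small pyodbc sketch (pyodbc stands in for whatever the script uses; server, database, and credentials are placeholders).

import pyodbc

# List the SQL Server ODBC drivers installed on this machine.
print([d for d in pyodbc.drivers() if "SQL Server" in d])

# Connect explicitly through the newer driver rather than whatever a DSN picks.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 13 for SQL Server};"
    "SERVER=myserver.database.windows.net;"
    "DATABASE=mydb;UID=myuser;PWD=mypassword"
)
print(conn.getinfo(pyodbc.SQL_DRIVER_NAME), conn.getinfo(pyodbc.SQL_DRIVER_VER))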

pyodbc fetchall() returns no results when a column returned by the query contains too much data

Setup: I am using Python 3.3 on a Windows 2012 client.
I have a select query running via pyodbc which is not returning any results from fetchall(). I know the query works fine, because I can take it out and run it from Microsoft SQL Server Management Studio without any issues.
I can also remove one column from the select list and the query will return results. For the database row in question, this column contains a large amount of XML data (> 10,000 characters), so it seems as though there is some buffer overflow issue going on causing fetchall() to fail, though it doesn't throw any exceptions. I have tried Googling around and have seen rumors of a config option to raise the buffer size, but I haven't been able to nail down exactly how to do it, or what a workaround would be.
Is there a configuration option that I can use, or any alternative to pyodbc?
Disclaimer: I have only been using Python for about two weeks now, so I am still quite the noob; though I have made every attempt to research my problems thoroughly, this one has proven elusive.
On a side note, I tried using odbc instead of pyodbc, but the same query throws this oddball error, which Google isn't helping me solve either:
[ERROR] An exception while executing the Select query: [][Negative size passed to PyBytes_FromStringAndSize]
It seems this issue was resolved by changing my SQL connection string
FROM:
DRIVER={SQL Server Native Client 11.0}
TO:
DRIVER={SQL Server}
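For concreteness, a minimal pyodbc sketch of that connection-string change (server, database, table, and column names are placeholders).

import pyodbc

# Failing setup: with the Native Client 11.0 driver, fetchall() returned
# nothing for the row holding the very large XML column:
# conn = pyodbc.connect(
#     "DRIVER={SQL Server Native Client 11.0};"
#     "SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes")

# Working setup: the older generic "SQL Server" driver returns the row.
conn = pyodbc.connect(
    "DRIVER={SQL Server};SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes"
)
cursor = conn.cursor()
cursor.execute("SELECT id, big_xml_column FROM mytable")  # placeholder query
print(len(cursor.fetchall()))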

Odd Oracle + .net behaviour when comparing types

My workplace has a .NET application supplied to us by a postal service; it connects to an Oracle database running on the same machine and is responsible for registering, storing and printing shipping labels.
Seeing as the database host etc. is configurable, we asked the company if the application could be used over the network (simply copying it over to another machine resulted in "literal does not match format string" errors), and all we were told is "it isn't possible". Not wanting to take no for an answer, I poked around the exe with Reflector.
Together with Oracle's v$sqlarea view, I pinpointed the errors to a few date comparison functions, but I have no idea why the application was working in the first place on the original machine.
The original application uses queries similar to
SELECT * FROM shipping WHERE date = '2011/03/28' --error
easily fixed with something like
SELECT * FROM shipping WHERE to_char(date, 'yyyy/mm/dd') = '2011/03/28'
Why does the original application work without throwing any errors? The incorrect query pops up in the v$sqlarea view when the application is used on the original host; if I copy the query and run it manually using anything else, it throws the error, and if I run the application on any other machine it throws the error too. Is there some setting in Oracle that modifies queries on the fly, but only for queries originating from the local machine, while storing the original query in v$sqlarea?
This sounds like a regional settings difference between the two client machines. Formatting of dates depends on the culture used to convert the date to a string in .NET, and unless the application specifies a culture, it uses the settings of the user running the application. That is a problem when the database expects date literals in one specific format; in Oracle's case, implicit string-to-date conversion is governed by the session's NLS_DATE_FORMAT, which is itself derived from the client's NLS settings. The problem is far less likely to arise with parametrized queries, where date parameters are passed separately from the query text and as a date datatype instead of a string.
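A quick way to compare what the two machines actually expect is to query the session NLS settings from each; a small python-oracledb sketch follows (connection details are placeholders, and note that every client stack derives its own session NLS settings, so ideally run the equivalent query from the application's own connection).

import oracledb

# Connection details are placeholders.
conn = oracledb.connect(user="app", password="secret", dsn="dbhost/orclpdb1")
cur = conn.cursor()

# NLS_DATE_FORMAT is the format Oracle uses when implicitly converting a
# string to a DATE, so a mismatch here explains machine-dependent behaviour.
cur.execute(
    "SELECT parameter, value FROM nls_session_parameters "
    "WHERE parameter IN ('NLS_DATE_FORMAT', 'NLS_TERRITORY', 'NLS_LANGUAGE')"
)
for parameter, value in cur:
    print(parameter, value)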
If you work with dates, you must avoid String.Format based query generation. Use parametrized selects and parameters to set those values.
// Bind the date as a typed parameter instead of formatting it into the SQL text.
OracleCommand cmd = new OracleCommand("SELECT * FROM shipping WHERE date = :dataParam", connection);
var param = cmd.Parameters.Add("dataParam", OracleDbType.Date); // name matches the :dataParam placeholder
param.Value = DateTime.Now;
It worked because the format happened to match the datetime settings on the developer machine and on the target database.
In other words: the issue is the incorrect datetime format being provided.
This could be because of regional settings on the server. Please check that the new server is configured for the same Locale (EN-GB, EN-US, or whatever the original server is configured to use).
