How to execute multiple procedures from Node.js and node-firebird?

I'm using Firebird 2.5 and node-firebird 0.8.6. I have to run SQL files with multiple stored procedures, but I always get errors like the one below
Error: Dynamic SQL Error, SQL error code = -104, Token unknown - line 1, column 5, term
at doCallback (/home/somasys/Downloads/testefb/node_modules/node-firebird/lib/index.js:1234:18)
at /home/somasys/Downloads/testefb/node_modules/node-firebird/lib/index.js:2929:21
at /home/somasys/Downloads/testefb/node_modules/node-firebird/lib/messages.js:151:25
at search (/home/somasys/Downloads/testefb/node_modules/node-firebird/lib/messages.js:117:13)
at /home/somasys/Downloads/testefb/node_modules/node-firebird/lib/messages.js:54:21
at FSReqCallback.wrapper [as oncomplete] (fs.js:477:5)
Here are some parts of my SQL file:
set term ^;
CREATE OR ALTER PROCEDURE PRC_CALCULATRIBUTA()
BEGIN
...
END^
set term ;^
commit work;
set term ^;
CREATE OR ALTER PROCEDURE PRC_CORRIGEENCERR()
BEGIN
...
END^
set term ;^
commit work;
I've already tried removing the set term and commit work statements and running the SQL script inside an
EXECUTE BLOCK AS
BEGIN
...
END
but even so I got the same errors as described above. Is there any instruction or statement I should put inside my SQL script?

Firebird's statement API can only execute individual statements. In addition, the SET TERM statement is not part of the Firebird SQL syntax. It is only a client-side feature in ISQL and other Firebird tools to determine when a statement is done. See also firebird procedural query throwing "token unknown" error at "SET TERM #;".
You will need to:
split up your SQL script into individual statements,
remove the SET TERM statements,
remove any statement terminators outside the procedure bodies, and
execute the statements individually (see the sketch at the end of this answer).
I would also suggest not executing commit work, but instead using the transaction control options of node-firebird. I'm not sure whether executing commit work will work in node-firebird, but some drivers break because you just closed a transaction on them without using their transaction API.
In other words, you will need to execute:
CREATE OR ALTER PROCEDURE PRC_CALCULATRIBUTA()
BEGIN
...
END
optionally execute the commit, or commit explicitly using the node-firebird API, and then
CREATE OR ALTER PROCEDURE PRC_CORRIGEENCERR()
BEGIN
...
END
etc.
You cannot use execute block for this, because execute block doesn't support execution of DDL. There are workarounds to that limitation (using execute statement), but it is generally not a good use of execute block.
As an aside, committing between creating stored procedures is unnecessary.
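For illustration, a minimal sketch of that approach with node-firebird might look like the following. Connection options and procedure bodies are placeholders, errors are simply re-thrown, and each db.query call here runs as its own auto-committed statement; use db.transaction(...) from node-firebird if you want to group and commit them yourself.
const Firebird = require('node-firebird');

// Placeholder connection options -- adjust to your environment.
const options = {
    host: '127.0.0.1',
    port: 3050,
    database: '/path/to/database.fdb',
    user: 'SYSDBA',
    password: 'masterkey'
};

// Each entry is one complete statement: no SET TERM, no trailing
// terminator, no COMMIT WORK. The procedure bodies are elided.
const statements = [
    'CREATE OR ALTER PROCEDURE PRC_CALCULATRIBUTA AS BEGIN /* ... */ END',
    'CREATE OR ALTER PROCEDURE PRC_CORRIGEENCERR AS BEGIN /* ... */ END'
];

Firebird.attach(options, (err, db) => {
    if (err) throw err;

    // Execute the statements one by one, in order.
    const runNext = (i) => {
        if (i >= statements.length) {
            db.detach();
            return;
        }
        db.query(statements[i], (err) => {
            if (err) {
                db.detach();
                throw err;
            }
            runNext(i + 1);
        });
    };

    runNext(0);
});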

Related

SSIS - Power Query Source: setting connection at runtime

I'm trying to use the Power Query source component in a generic way from SSIS (VS2019).
The idea would be to use a for each loop to load and transform Excel files. At run time, I need to set the connection manager properties for each file as well as the PQY script to be executed on the file.
What I did so far is trying to create a JSON connection string inside a script component and assign the connection string to the connection manager. It keeps on saying that the file requires credentials.
Has anyone already experienced that kind of development? All the files have the same structure so far; does the metadata need to be refreshed too?
[Edit]
1. In the control flow, I'm retrieving the PQY script I want to apply from a DB.
Before transformations, the script starts like this:
let Source = Excel.Workbook(File.Contents("path_to_a_file.xlsx"),null,true),RawData_Sheet = Source{[Item="Table1",Kind="Table"]}[Data]..."
In the C# script task, I'm replacing the path to the Excel file with the current file variable. The M script is stored in a variable used in the PQY component.
The C# script then updates the PQY connection manager to target the appropriate file:
ConnectionManager _conn = Dts.Connections["Power Query Connection Manager"];
String _ConnectString = "[{kind:File,path:path_to_a_file.xlsx,AuthenticationKind:Windows,Username:myusername,Password:mypassword}]";
_conn.ConnectionString = _ConnectString;
The PQY component is left as it is, connected to ["Power Query Connection Manager"] and getting its script from the variable I set.
PQY configuration screen
Thanks for any tip on this,
Olivier
I can't address the specifics of Power Query, but anything generic in a Data Flow will not work.
The Data Flow task works because it makes a strict contract between the source(s) and the destination(s). These columns with these data types will be in play during the run. It's a design-time contract because that allows the run-time engine to allocate resources based on how many buffers of data the system can support. Each row is X bytes, we have Y bytes of memory available, so Z buffers worth of data plus parallelism stuff.
Wish I had a better story to tell you.

Disable wrapping migration in a transaction with Node db-migrate

I need to use db-migrate to add an index to a Postgres database with CREATE INDEX CONCURRENTLY. However, db-migrate wraps all migrations in a transaction by default, and trying to create a concurrent index inside a transaction results in this error code:
CREATE INDEX CONCURRENTLY cannot run inside a transaction block
I can't find any way to disable transactions as part of the db-migrate options, either CLI options or (preferably) as a configuration directive on the migration itself. Any idea if this can be accomplished?
It turns out that this can be solved on the command line by using --non-transactional. Reading the source, I can see that this sets an internal flag called notransactions, but it's not clear to me whether this can be set as part of the migration configuration or must be passed on the command line.
I kept getting the errors even when running with the --non-transactional flag.
The solution for me was to run with --non-transactional AND have each CREATE INDEX CONCURRENTLY statement in its own separate migration file. It turns out you can't have more than one of them in the same file (same transaction block).
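For illustration, each such migration might then look like the following sketch (the file name, table and index names are made up; runSql is db-migrate's raw-SQL helper), and the migrations are run with db-migrate up --non-transactional:
// migrations/20200101000000-add-orders-customer-id-index.js
'use strict';

exports.up = function (db, callback) {
    // Only this single statement lives in this file; with --non-transactional
    // it is not wrapped in a transaction block, so CONCURRENTLY is allowed.
    db.runSql(
        'CREATE INDEX CONCURRENTLY IF NOT EXISTS idx_orders_customer_id ON orders (customer_id);',
        callback
    );
};

exports.down = function (db, callback) {
    db.runSql('DROP INDEX CONCURRENTLY IF EXISTS idx_orders_customer_id;', callback);
};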

SqlState 24000, Invalid cursor state

I am trying to help one of our developers resolve an Azure SQL DB error. He has attempted to run a script, connecting using sqlcmd and (I presume) ODBC. It seems no matter what he does he receives the error message "SqlState 24000, Invalid cursor state".
His script consists of roughly 80 "insert into table where not exists sub-select" statements. Some of the sub-selects return zero records.
I read this post which is admittedly almost a year old now. The short version seems to be "this is a known Azure SQL DB bug".
sqlcmd on Azure SQL Data Warehouse - SqlState 24000, Invalid cursor state after INSERT statement
I know for certain my developer has been able to run these statements previously. Is that just the nature of a bug - sometimes it occurs and sometimes it doesn't? Does he need to use a different ODBC driver? Any other suggestions?
Please make sure you are using ODBC driver 13.1 or later. You can download it from here.

SSIS package works from SSMS but not from agent job

I have an SSIS package to load an Excel file from a network drive. It's designed to load the content and then move the file to an archive folder.
Everything works fine when the following SQL statement runs in an SSMS window.
However, when it's copied into a SQL Agent job and executed from there, the file is neither loaded nor moved, yet the agent log shows "successful".
The same thing also happens with an "SSIS job" step instead of a T-SQL step, even with a proxy using a Windows account (the same account as the SSMS login).
Declare @execution_id bigint
EXEC [SSISDB].[catalog].[create_execution] @package_name=N'SG_Excel.dtsx', @execution_id=@execution_id OUTPUT, @folder_name=N'ETL', @project_name=N'Report', @use32bitruntime=True, @reference_id=Null
Select @execution_id
DECLARE @var0 smallint = 1
EXEC [SSISDB].[catalog].[set_execution_parameter_value] @execution_id, @object_type=50, @parameter_name=N'LOGGING_LEVEL', @parameter_value=@var0
EXEC [SSISDB].[catalog].[start_execution] @execution_id
GO
P.S. At first a relative path to the network drive was used, then I switched to an absolute path (\\server\folder). That did not solve the issue.
SSIS Package Jobs run under the context of the SQL Server Agent. What account is set up to run the SQL Server Agent on the SQL Server? It may need to run as a domain account that has access to the network share.
Or you can copy the Excel file to a local folder on the SQL Server, so the package can access the file there.
Personally I avoid the File System Task - I have found it unreliable. I would replace that with a Script Task, and use .NET methods from the System.IO namespace e.g. File.Move. These are way more reliable and have mature error handling.
Here's a starting point for the System.IO namespace:
https://msdn.microsoft.com/en-us/library/ms404278.aspx
Be sure to select the relevant .NET version using the Other Versions link.
When I have seen things like this in the past, it's been that my package isn't accessing the path I thought it was at run time; it's looking somewhere else, finding an empty folder & exiting with success.
SSIS can have a nasty habit of going back to variable defaults. It may be looking at a different path you used in dev? Maybe hard-code all path values as a test, or put in breakpoints & double-check the run-time values of all variables & parameters.
Other long shots may be:
Name resolution, are you sure the network name is resolving correctly at runtime?
32/64 bit issues. Dev tends to run 32 bit, live may be 64 bit. May interfere with file paths? Maybe force to 32 bit at run time?
The issue is that the SQL statements are missing the statement terminator (;).
Declare @execution_id bigint ;
EXEC [SSISDB].[catalog].[create_execution] @package_name=N'SG_Excel.dtsx', @execution_id=@execution_id OUTPUT, @folder_name=N'ETL', @project_name=N'Report', @use32bitruntime=True, @reference_id=Null ;
Select @execution_id ;
DECLARE @var0 smallint = 1 ;
EXEC [SSISDB].[catalog].[set_execution_parameter_value] @execution_id, @object_type=50, @parameter_name=N'LOGGING_LEVEL', @parameter_value=@var0 ;
EXEC [SSISDB].[catalog].[start_execution] @execution_id ;
GO
I have faced a similar issue in Service Broker.

How to get Excel to reliably execute sp_executesql from a query table on a worksheet?

In MS Excel, if you create a QueryTable with Microsoft Query, and your SQL query cannot be visually presented by Microsoft Query, then you are not allowed to provide parameters for that query. Which is a shame, so there is this awesome technique that allows parameters anyway:
{CALL sp_executesql (N'select top (@a) * from mytable', N'@a int', ?)}
You provide the query in the ODBC CALL form and it works with parameters.
Unless it does not.
While on some computers it works flawlessly, on other computers Excel throws an error when trying to refresh the query table:
For SQL Native Client 10: Invalid parameter number
For SQL Native Client 11: Procedure or function sp_executesql has too many arguments specified.
With a profiler I can see Excel (actually, the native client when poked by Excel) is doing this before actually executing sp_executesql:
exec sp_describe_undeclared_parameters N' EXEC sp_executesql N''<actual query>;'',N''<declared parameters>'',@P1 '
Here @p1 is the parameter placeholder that is supposed to go to sp_executesql later, and that is where sp_describe_undeclared_parameters fails. It does not expect any custom parameters for sp_executesql -- only the two intrinsic ones, @stmt and @params. If I manually remove the ,@p1 bit from the query, it executes fine in all cases.
So that is the problem: on some computers the above auto-generated sp_describe_undeclared_parameters works with the unnecessary/wrong ,@P1 bit, on some it fails.
We need to make it work on all computers.
Weird things to consider:
I fail to see anything common in computers that don't have the problem. Bitness or the Windows version do not seem to matter.
I fail to manually execute the said query with the ,@P1 bit attached - whatever tool I use, I get the "too many arguments" error, and yet Excel is able to execute it no problem when it feels like it. I can see with the profiler that this is the exact query that hits the server. Maybe it has something to do with a very peculiar combination of connection settings, but they appear to be the same on all computers (the data source is an ODBC system data source using SQL Server Native Client 11, and all parameters are the same on all tabs across the computers).
