How to parse embedded SQL in ANTLR4

Consider the following example input:
namespace Client
{
    sql FindByQuery
    {
        SELECT * FROM client_Profile
        WHERE 1 = 1
        #ifNotEmpty(City, sql{ AND `City` = #City })
        #ifNotEmpty(Zipcode, sql{ AND `Zipcode` = #Zipcode })
        #ifNotEqual(State, "NY", sql{ AND `State` = #State })
        #ifEqual(IsReactive, 1, sql{ AND `WONum` LIKE CONCAT('B%', #WONum) })
        #include(CommonOrderBy)
    }
}
There are two separate languages in the code:
Any raw text, including whitespace, that does not start with '#', running until the closing '}' (the SQL statement)
My control blocks (namespace/sql) and macros ('#...')
How do I parse the inner SQL statement as raw text while still handling the macros starting with '#' correctly?
Or is there an example somewhere?

ANTLR4 lexer modes can do that: create a separate lexer .g4 file and define two modes, the DEFAULT mode and an SQL mode.
The "sql FindByQuery" statement pushes the lexer into SQL mode, and the matching "}" pops it back to DEFAULT mode.
In these two separate modes, the token rules can be completely different.
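A minimal sketch of such a lexer (the grammar and token names are made up here) illustrating the pushMode/popMode idea; a nested sql{ ... } block inside a macro argument pushes the mode again:

```antlr
lexer grammar EmbedSqlLexer;

// ---- DEFAULT mode: control-block syntax ----
NAMESPACE : 'namespace' ;
LBRACE    : '{' ;
RBRACE    : '}' ;
ID        : [a-zA-Z_] [a-zA-Z0-9_]* ;
// 'sql Name {' switches the lexer into SQL mode
SQL_START : 'sql' [ \t]+ [a-zA-Z_] [a-zA-Z0-9_]* [ \t\r\n]* '{' -> pushMode(SQL_MODE) ;
WS        : [ \t\r\n]+ -> skip ;

mode SQL_MODE;
// a nested sql{ ... } inside a macro pushes the mode again
SQL_OPEN  : 'sql{' -> pushMode(SQL_MODE) ;
SQL_CLOSE : '}'    -> popMode ;
MACRO     : '#' [a-zA-Z]+ ;
// everything else, including whitespace, is raw SQL text
RAW       : ~[#}]+ ;
```

Treat this purely as a starting point: because RAW is greedy it can swallow a literal sql{ that follows other raw text, so a real grammar would need a more careful RAW rule, or handle the nesting in the parser.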

Related

Is there a line continuation character for strings in Terraform

I am writing some long SQL queries in a Terraform file, the query would be like:
"CREATE TABLE example_${var.name} (id string, name string..........................................)"
To make the query readable, I hope the query would be the format like the following, and cross multiple lines
CREATE TABLE example_${var.name} (
id string,
name string,
................................
)
Is there a line continuation character that lets a long single-line string be written across multiple lines, just like the backslash \ we can use in Python for long strings?
I have tried using a heredoc, but it does not work when running the query. Thanks
It sounds like your goal is to have a long SQL query defined in Terraform, but split across multiple lines so you don't need to scroll horizontally to infinity and beyond.
In my team we use a heredoc to achieve this, although you said it's not possible in your case.
Another idea my team uses when a heredoc isn't possible is to join an array of strings, e.g.:
locals {
  sql = join(",", [
    "id string",
    "name string",
    "address string",
    "renter string",
    "profession string"
  ])
}
Results in
> local.sql
id string,name string,address string,renter string,profession string
I hope I've understood your question correctly but if not please let me know.
PS: There's an open issue for multiline strings in Terraform
To make a multi-line string in Terraform, use the heredoc string syntax:
locals {
  sql = <<-EOT
    CREATE TABLE example_${var.name} (
      id string,
      name string,
      ................................
    )
  EOT
}

Handling special characters in an ARRAY_CONTAINS search in a Cosmos SQL query

I have a DB structure like this:
{
  "selfId": "cd29433e-f36b-1410-851b-009d805073d7",
  "selfName": "A",
  "bookIds": [
    "2f51bfd4",
    "2f3a1010",
    "090436c0",
    "1078c3b2",
    "b63b06e0"
  ]
}
I am working in C# and get bookId as a string.
I am writing the query as:
string SQLquery = "select c.selfName from c where ARRAY_CONTAINS(c.bookIds, \"" + bookId + "\")";
But when bookId contains a special character, the query gives an error. For example, if bookId = AK"s book (please note the id itself contains "), executing SQLquery gives an error.
When text contains quotes, slashes, or other special characters, you can use an escape sequence.
The quoted property operator [""] can also be used to access properties. For example, SELECT food.id and SELECT food["id"] are equivalent. This syntax can be used to escape a property with spaces, special characters, or a name that matches a SQL keyword or reserved word.
Example:
SELECT food["id"]
FROM food
WHERE food["foodGroup"] = "Snacks" and food["id"] = "19015"
Reference: https://learn.microsoft.com/en-us/azure/cosmos-db/sql/sql-query-constants#bk_arguments
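Applying the same idea to a value (rather than a property name): inside a double-quoted Cosmos SQL string literal, " is escaped as \" and \ as \\. A minimal sketch in Python (the helper name build_query is made up here):

```python
def build_query(book_id: str) -> str:
    # Escape backslashes first, then double quotes, so the value forms
    # a valid Cosmos SQL double-quoted string literal.
    escaped = book_id.replace("\\", "\\\\").replace('"', '\\"')
    return f'select c.selfName from c where ARRAY_CONTAINS(c.bookIds, "{escaped}")'

print(build_query('AK"s book'))
# select c.selfName from c where ARRAY_CONTAINS(c.bookIds, "AK\"s book")
```

The .NET SDK also supports parameterized queries, which sidestep hand-escaping entirely and are the safer choice when the value comes from user input.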

Azure Stream Analytics Job - Transformation Query - correct formatting in ARM template

When editing a Stream Analytics transformation query in the Portal, you can format it for readability across multiple lines...e.g.
SELECT
    *
INTO [Output1]
FROM [Input1]
PARTITION BY PartitionId
WHERE etc etc etc
When putting this into an ARM template for CI/CD, this is entered as one massive long string and would end up displaying in the portal as...
SELECT * INTO [Output1] FROM [Input1] PARTITION BY PartitionId WHERE etc etc etc to infinity....
The official documentation is pretty useless and doesn't give any clues for the query part of the template, just that it is a "string"...
https://learn.microsoft.com/en-us/azure/templates/microsoft.streamanalytics/2016-03-01/streamingjobs/transformations
There is a Microsoft sample template that is the only example I could find with a transform query specified...
https://github.com/Azure/azure-quickstart-templates/blob/master/101-streamanalytics-create/azuredeploy.json
...and it looks like it is trying to do spacing...
"query": "SELECT\r\n *\r\nINTO\r\n [YourOutputAlias]\r\nFROM\r\n [YourInputAlias]"
...but failing badly - see screenshot
Has anyone managed to do this?
Also, does anyone know why you can't see the transformation query in the Azure Resource Explorer (https://resources.azure.com/), or why it cannot be exported from the portal with the rest of the Stream Job (done at Resource Group level)?
Thanks in advance
I know it is a full year later and perhaps you've figured this out already, however, this is what I did:
In my Parameters file, I used an array of strings, for example:
"StreamAnalyticsJobQueryMultiline": {
"value": [
"WITH allData AS ( ",
" SELECT ",
" *, ",
" GetMetadataPropertyValue([%%INPUTSTREAMNAME%%], '[User].[EventNamespace]') AS EventNamespace ",
" FROM [%%INPUTSTREAMNAME%%] Partition By PartitionId ",
"SELECT ",
" *, ",
" 'EventHubWriterv1' AS EventType ",
"INTO ",
" [%%OUTPUTSTREAMNAME%%] ",
"FROM ",
" allData Partition By PartitionId "
]
When the array is concatenated and output as a string, it produces something like the following, where each item in the array is still enclosed in quotation marks and the entire thing is wrapped in square brackets (see: https://learn.microsoft.com/en-us/azure/azure-resource-manager/templates/template-functions-string#concat):
["Foo","Bar","Blah"]
So an additional transformation is required to turn it into something readable in the Stream Analytics output.
Also note here the %%INPUTSTREAMNAME%% and %%OUTPUTSTREAMNAME%%, as both my input and output streams are also parameters, and using the typical inline [parameter('ParamName')] did not work nicely with the rest of the transformation needed.
In my Template file, I take the StreamAnalyticsJobQueryMultiline parameter and use the variables field to do this transformation:
"QueryStringRaw": "[string(concat(parameters('StreamAnalyticsJobQueryMultiline')))]",
// Update the end-of-lines by removing the doublequote and comma, and replacing it with a newline character
"QueryStringIter1": "[replace(variables('QueryStringRaw'), '\",', '\n')]",
// Update the beginning-of-lines by removing the doublequote
"QueryStringIter2": "[replace(variables('QueryStringIter1'), '\"', '')]",
// Update the InputStreamName and OutputStreamName values
"QueryStringIter3": "[replace(variables('QueryStringIter2'), '%%INPUTSTREAMNAME%%', parameters('InputStreamName'))]",
"QueryStringIter4": "[replace(variables('QueryStringIter3'), '%%OUTPUTSTREAMNAME%%', parameters('OutputStreamName'))]",
// Produce the final output for the query string by trimming the leading and trailing square brackets
"QueryStringFinal": "[substring(variables('QueryStringIter4'), 1, sub(length(variables('QueryStringIter4')), 2))]"
Then I reference that in the transformation portion of the Microsoft.StreamAnalytics/streamingjobs properties:
"transformation": {
"name": "Transformation",
"properties": {
"streamingUnits": "[parameters('StreamAnalyticsStreamingUnitsScale')]",
"query": "[variables('QueryStringFinal')]"
}
}
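The effect of that replace chain can be checked outside ARM; a sketch in Python mimicking string(concat(...)), the two replace() steps, and the final substring() trim (the three-item array is just an illustration):

```python
import json

# string(concat(parameters('StreamAnalyticsJobQueryMultiline'))) yields
# the JSON serialization of the array, e.g. ["Foo","Bar","Blah"]
lines = ["Foo", "Bar", "Blah"]
raw = json.dumps(lines, separators=(",", ":"))

iter1 = raw.replace('",', "\n")   # each end-of-item '",' becomes a newline
iter2 = iter1.replace('"', "")    # drop the remaining double quotes
final = iter2[1:-1]               # trim the leading '[' and trailing ']'

print(final)
# Foo
# Bar
# Blah
```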

DocumentDB Stored Procedure Lumenize

I'm using the aggregate stored procedure Lumenize (https://github.com/lmaccherone/documentdb-lumenize) with the .NET client and I'm having trouble with the filterQuery content.
How do I simply pass an alphanumeric value into the filterQuery?
string configString = @"{
    cubeConfig: {
        groupBy: 'Modele',
        field: 'Distance',
        f: 'sum'
    },
    filterQuery: 'SELECT * FROM Modele WHERE ModeleGUID = ''0b93def1-ccd7-fc35-0475-b47c89137c3f'' '
}";
Each test gives me a parse error in the filterquery :(
Error: One or more errors occurred.
Message: After parsing a value an unexpected character was encountered:
'. Path 'filterQuery', line 7, position 63.
End of demo, press any key to exit.
Thanks
Just to properly close this out: the issue is the unescaped single quotes inside the single-quoted filter string. As long as they're escaped properly (e.g. \' instead of ''), things work as expected.

Couch DB escape characters

I have a list function for CouchDB, but I can't get it into the DB because I keep getting syntax errors. Here is the function:
function(head, req) {
  var headers;
  start({'headers': {'Content-Type': 'text/csv; charset=utf-8; header=present'}});
  while (r = getRow()) {
    if (!headers) {
      headers = Object.keys(r.value);
      send('"' + headers.join('","') + '"\n');
    }
    headers.forEach(function(v, i) {
      send(String(r.value[v]).replace(/\"/g, '""').replace(/^|$/g, '"'));
      (i + 1 < headers.length) ? send(',') : send('\n');
    });
  }
}
Can anyone show me an example of this function formatted that can be inserted into CouchDB?
List functions are stored in design documents. Design documents are JSON documents, so you need to make sure they conform to the JSON standard. Since list functions are string values in JSON, you need to make sure that you:
Escape any backslashes, so \ becomes \\ (do this first, before the other replacements).
Escape any double quotes in the function, so " becomes \". Where possible, use single quotes instead of double quotes.
Replace any line breaks with \n, or just remove them, as JavaScript largely ignores them anyway.
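Rather than applying those rules by hand, any JSON serializer will do them for you. A minimal sketch in Python (the design-document id and field names are made up):

```python
import json

# The raw list function, containing real newlines, double quotes,
# and backslashes.
list_fn = 'function(head, req) {\n  send("\\"" + "id" + "\\"\\n");\n}'

# Serializing the design document escapes everything automatically:
# " -> \",  newline -> \n,  \ -> \\
ddoc = {"_id": "_design/export", "lists": {"csv": list_fn}}
body = json.dumps(ddoc)

# Round-tripping restores the original function verbatim.
assert json.loads(body)["lists"]["csv"] == list_fn
```

The resulting body string can be PUT to the database as-is.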
