In SQLite we have table1 with column column1.
There are 4 rows with the following values for column1:
(p1, p10, p11, p20)
DROP TABLE IF EXISTS table1;
CREATE TABLE table1(column1 NVARCHAR);
INSERT INTO table1 (column1) values ('p1'),('p10'),('p11'),('p20');
Select instr(',p112,p108,p124,p204,p11,p1124,p1,p10,p20,',','+column1+',') from table1;
We have to get the position of each value of column1 in the given string:
,p112,p108,p124,p204,p11,p1124,p1,p10,p20,
the query
Select instr(',p112,p108,p124,p204,p11,p1124,p1,p10,p20,',column1) from table1;
returns values
(2,7,2,17)
which is not what we want
the query
Select instr(',p112,p108,p124,p204,p11,p1124,p1,p10,p20,',','+column1+',') from table1;
returns 9 for all rows -
it turned out that this is the position of the first "0" character ???
How can we get the exact positions of the column1 values in the given string in SQLite?
In SQLite the concatenation operator is || and not + (like SQL Server), so do this:
Select instr(',p112,p108,p124,p204,p11,p1124,p1,p10,p20,',',' || column1 || ',') from table1;
What your code did was numeric addition, which resulted in 0 because neither string operand could successfully be converted to a number,
so instr() was searching for '0' and always found it at position 9 of the string ',p112,p108,p124,p204,p11,p1124,p1,p10,p20,'.
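You can see the coercion directly; a minimal sketch (the literals are taken from the question):

-- '+' forces numeric addition in SQLite; non-numeric strings are coerced to 0
SELECT ',' + 'p1';                                               -- returns 0
-- instr() then converts that 0 to the text '0' and searches for it
SELECT instr(',p112,p108,p124,p204,p11,p1124,p1,p10,p20,', 0);   -- returns 9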
I am writing an SQL query that joins two tables. The problem I am facing is that the column I am joining on is blank ("" or " ") in one table and null in the other.
Table A
id | col
1  |
2  |
3  | SG
Table B
id | col
a  | null
b  | null
c  | SG
source_alleg = spark.sql("""
SELECT A.*,B.COL as COLB FROM TABLEA A LEFT JOIN TABLEB B
ON A.COL = B.COL
""")
For my use case blank values and null are the same. I want to do something like Trim(a.col) which will convert blank values to null and hence find all the matches in the join.
Output:
id | col                  | colb
1  | either null or blank | either null or blank
2  | either null or blank | either null or blank
3  | SG                   | SG
In SQL, NULLs are ignored during a join (NULL never equals NULL) unless you use an outer join or full join to keep the unmatched rows.
More information: https://www.geeksforgeeks.org/difference-between-left-right-and-full-outer-join/
If you want to convert the nulls to a string, you can just use an if:
select
if(isnull(trim(col1)),"yourstring", col1),
if(isnull(trim(col2)),"yourstring", col2)
from T;
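If the goal is just to treat blank values as NULL before comparing, nullif(trim(col), '') does that conversion. A sketch of the join condition, using the table and column names from the question (it can be dropped into the spark.sql call as-is):

SELECT A.*, B.COL AS COLB
FROM TABLEA A
LEFT JOIN TABLEB B
  -- nullif(trim(...), '') maps '' and ' ' to NULL on both sides,
  -- so blank keys behave exactly like NULL keys in the join condition
  ON nullif(trim(A.COL), '') = nullif(trim(B.COL), '')

With this condition the blank/NULL rows simply don't match and come back with COLB as NULL from the left join, which matches the expected output above.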
Other similar questions deal with these two problems separately, but I want to merge them into a single statement. I'm using Python 3 with the psycopg2 library on a PostgreSQL database.
I have a table with 3 fields: ID, name, bool (BIT)
I want to insert a row in this table only if there is no other row with the same 'name' and 'bool' = 0, and the total count of rows with the same ID is less than a given threshold.
More specifically, my table should contain at most the threshold number of rows with the same ID. Those rows can have the same 'name', but only one of the rows with the same ID and the same 'name' can have 'bool' = 0.
I tried with this:
INSERT INTO table
SELECT 12345, 'abcdf', 0 FROM table
WHERE NOT EXISTS(
    SELECT * FROM table
    WHERE ID = 12345 AND name = 'abcdf' AND bool = 0)
HAVING (SELECT count(*) FROM table
    WHERE ID = 12345) < threshold
RETURNING ID;
but the row is inserted anyway.
Then I tried the same statement replacing 'HAVING' with 'AND', but it inserts all the threshold rows at once.
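One way to express both conditions in a single statement is to select the literal row without a FROM clause, so that exactly one candidate row is produced and both conditions gate it. A sketch (my_table stands in for the real table name and 5 for the threshold; column names follow the question):

INSERT INTO my_table (id, name, bool)
SELECT 12345, 'abcdf', 0
WHERE NOT EXISTS (
        -- no row with the same id/name may already have bool = 0
        SELECT 1 FROM my_table
        WHERE id = 12345 AND name = 'abcdf' AND bool = 0)
  AND (SELECT count(*) FROM my_table WHERE id = 12345) < 5
RETURNING id;

The FROM table in the original SELECT is what produced one candidate row per existing row, which is why the AND variant inserted several rows at once. Note that the count check is not safe against concurrent inserts without additional locking.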
I have a Hive table with data stored as ORC.
I write empty values (blank, '') into some fields, but sometimes when I run a select query on this table the empty string columns are shown as NULL in the query result.
I would like to see the empty values I entered; how is this possible?
If you want to see empty values instead of NULL in a Hive table, you can use the NVL function, which produces a default value for NULL column values.
Below is the syntax:
NVL(arg1, arg2) - here arg1 is an expression or column and arg2 is the default value for
NULL values.
e.g. Query - SELECT NVL(blank, '') AS blank_1 FROM db.table;
I want to return rows from a table where another table contains a colon-separated value.
Suppose
I have a table named "unit_name" that contains unit_id and unit_name,
and table 2 is user_reg, which contains user_id. The user id contains colon-separated values, like 82:81:80.
How can I get the unit name list from the unit_name table?
SELECT *
FROM unit_name un
WHERE (SELECT school FROM user_reg WHERE user_mode = 4) IS NOT NULL
  AND un.unit_id IN
      (SELECT regexp_substr(school, '[^:]+', 1, LEVEL) FROM user_reg
       CONNECT BY regexp_substr(school, '[^:]+', 1, LEVEL) IS NOT NULL);
If you run the following query, you'll have a delimited string converted to rows.
select * from table(apex_string.split('82:81:80',':'))
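Plugged into the tables from the question, a possible sketch (it assumes APEX is installed and reuses the school and user_mode columns from the query above):

SELECT un.*
FROM unit_name un
WHERE un.unit_id IN (
    -- column_value is the default column name for the elements
    -- returned by TABLE() over the split collection
    SELECT t.column_value
    FROM user_reg ur,
         TABLE(apex_string.split(ur.school, ':')) t
    WHERE ur.user_mode = 4);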
I have a column of type map(varchar, varchar). I would like to filter on the keys of the map to get rows of the table where the map contains a given string.
How do I check whether a varchar is contained in the keys of a map type column?
Get rows whose map keys are exactly a certain string (varchar) / set of strings
select * from planet
where map_keys(tags) = ARRAY['barrier'];
Get rows where an array column contains a particular string (varchar)
select * from planet
where contains(map_keys(tags), 'barrier');
In this case,
table name: planet
schema of column "tags": map(varchar, varchar)
String I was searching for in column tags: 'barrier'
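As a quick self-contained check (a sketch; the inline keys and values here are made up), the same predicate can be tried without the planet table:

-- contains() over map_keys() is true when the given key is present in the map
SELECT contains(
         map_keys(MAP(ARRAY['barrier', 'highway'],
                      ARRAY['gate', 'service'])),
         'barrier');    -- true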