Deleting several rows from a DB table - jOOQ

I am using the code generator. I have two tables. In one table I have a user_id and several docs for that user_id. In the second table I have docs for that user_id, but without the user_id column, and I have to delete those docs.
Please help!

I'm assuming you have some criteria about the user whose documents you want to delete, just not the foreign key? Just use a semi-join, which you can use in DELETE statements as well.
Assuming your search criteria is something like the username:
In SQL
DELETE FROM docs
WHERE docs.user_id IN (
  SELECT user_id
  FROM user
  WHERE username = ?
)
In Java
// Assuming the usual static import:
import static org.jooq.impl.DSL.*;
// Then write:
ctx.deleteFrom(DOCS)
   .where(DOCS.USER_ID.in(
       select(USER.USER_ID)
       .from(USER)
       .where(USER.USERNAME.eq(username))
   ))
   .execute();

Related

Script returning a limited number of records compared to the query

I tried to convert an SQL query into a Gosu script (Guidewire). My script works only for a limited number of records.
This is the SQL query
select PolicyNumber,* from pc_policyperiod
where ID in ( Select ownerID from pc_PRActiveWorkflow
where ForeignEntityID in (Select id from pc_workflow where State=3))
This is my script
var workFlowIDQuery = Query.make(Workflow).compare(Workflow#State,Relop.Equals,WorkflowState.TC_COMPLETED).select({QuerySelectColumns.path(Paths.make(entity.Workflow#ID))}).transformQueryRow(\row ->row.getColumn(0)).toTypedArray()
var prActiveWorkFlowQuery = Query.make(PRActiveWorkflow).compareIn(PRActiveWorkflow#ForeignEntity, workFlowIDQuery).select({QuerySelectColumns.path(Paths.make(entity.PRActiveWorkflow#Owner))}).transformQueryRow(\row -> row.getColumn(0)).toTypedArray()
var periodQuery = Query.make(PolicyPeriod).compareIn(PolicyPeriod#ID,prActiveWorkFlowQuery).select()
for(period in periodQuery){
print(period.PolicyNumber)
}
Can anyone find a cause why the script returns a limited number of records, or suggest improvements?
I would suggest writing a single Gosu query that selects PolicyPeriod and joins the three entities via their foreign keys (see the SQL sketch below).
I am not sure if the PolicyPeriod ID is the same as the PRActiveWorkflow ID. Can you elaborate on the relation between the PolicyPeriod and PRActiveWorkflow entities?
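For reference, the join-based form of the original SQL, which the suggested single Gosu query would mirror, would look something like the sketch below (table and column names are taken from the query in the question; this has not been run against a Guidewire schema):
-- join instead of nested INs; add DISTINCT if one policy period can be
-- owned by more than one active workflow row
select pp.PolicyNumber, pp.*
from pc_policyperiod pp
inner join pc_PRActiveWorkflow aw on aw.ownerID = pp.ID
inner join pc_workflow w on w.id = aw.ForeignEntityID
where w.State = 3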

Pass column name as argument - Postgres and Node JS

I have a query (an UPDATE statement) wrapped in a function and will need to perform the same statement on multiple columns during the course of my script.
async function update_percentage_value(value, id) {
  (async () => {
    const client = await pool.connect();
    try {
      const res = await client.query(
        'UPDATE fixtures SET column_1_percentage = ($1) WHERE id = ($2) RETURNING *',
        [value, id]
      );
    } finally {
      client.release();
    }
  })().catch(e => console.log(e.stack));
}
I then call this function
update_percentage_value(50, 2);
I have many columns to update at various points in my script, and each one needs to be done at the right time. I would like to be able to just call the one function, passing the column name, value and id.
My table looks like below
CREATE TABLE fixtures (
  ID SERIAL PRIMARY KEY,
  home_team VARCHAR,
  away_team VARCHAR,
  column_1_percentage INTEGER,
  column_2_percentage INTEGER,
  column_3_percentage INTEGER,
  column_4_percentage INTEGER
);
Is it at all possible to do this?
I'm going to post the solution that was advised by Sehrope Sarkuni via the node-postgres GitHub repo. This helped me a lot and works for what I require:
No, column names are identifiers and they can't be specified as parameters. They have to be included in the text of the SQL command.
It is possible but you have to build the SQL text with the column names. If you're going to dynamically build SQL you should make sure to escape the components using something like pg-format or use an ORM that handles this type of thing.
So something like:
const format = require('pg-format');

async function updateFixtures(id, column, value) {
  const sql = format('UPDATE fixtures SET %I = $1 WHERE id = $2', column);
  await pool.query(sql, [value, id]);
}
Also, if you're doing multiple updates to the same row back-to-back, you're likely better off with a single UPDATE statement that modifies all the columns rather than separate statements, as separate statements are both slower and generate more WAL on the server.
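For illustration, a single-statement version covering all four percentage columns might look like this (a sketch only; it assumes the column names from the CREATE TABLE above and node-postgres-style $n placeholders):
-- one round trip instead of four separate UPDATEs
UPDATE fixtures
SET column_1_percentage = $1,
    column_2_percentage = $2,
    column_3_percentage = $3,
    column_4_percentage = $4
WHERE id = $5
RETURNING *;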
To get the column names of the table, you can query the information_schema.columns view, which stores the details of your table's column structure. This can help you frame a dynamic query for updating a specific column based on a specific result.
You can get the column names of the table with the help of following query:
select column_name from information_schema.columns where table_name='fixtures' and table_schema='public';
The above query would give you the list of columns in the table.
Now, to update each one for a specific purpose, you can store the result set of column names in a variable and pass each name to the function to perform the required action, as in the sketch below.
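As a sketch of that idea, the information_schema query can be narrowed to just the percentage columns (this assumes the column-naming pattern from the CREATE TABLE above), and the resulting names can then be looped over on the client, passing each one to the updateFixtures function shown earlier:
-- list only the columns whose values we want to update; loop over the
-- result client-side and call updateFixtures(id, column_name, value) for each
select column_name
from information_schema.columns
where table_name = 'fixtures'
  and table_schema = 'public'
  and column_name like '%percentage';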

not in query and select one field from second collection

My requirement is to count all the documents whose particular id is not in a reference collection. The equivalent SQL query would be as below:
select count(*) from tbl1 where tbl.arr.id not in (select id from tbl2)
I've tried as below, but got stuck on fetching a single field, i.e. id, from the second query.
db.coll1.find(
  {$not:
    {"arr.id":
      {$in:
        {db.coll2.find()} // how would I fetch a single column from the 2nd collection?
      }
    }
  }
).count()
Also, please note that arr.id is an ObjectId stored in collection coll1, and the same goes for collection coll2. Should special care be taken while fetching the id, say wrapping it as ObjectId(id)?
Update: I am using MongoDB version 3.0.9
I had to use $nin to check the not-in condition, and because the MongoDB version was 3.0.9 (where find() cannot reference a second collection), I first pulled the ids from coll2 into an array on the client and then passed that array to $nin. Below is how I did it:
var ids = db.coll2.find({}, {id: 1, _id: 0}).map(function(doc) { return doc.id; });
db.coll1.find({"arr.id": {$nin: ids}}).count()
On MongoDB 3.2 and later the cross-collection check can also be done server-side with an aggregation $lookup stage, but the client-side approach above works on both versions.

SELECT with multiple values in DocumentDB

I have an Employees collection and I want to retrieve the full documents of 10 employees whose IDs I'd like to send to my SQL SELECT. How do I do that?
To further clarify, I have 10 EmployeeIds and I want to pull these employees' information from my Employees collection. I'd appreciate your help with this.
Update:
As of 5/6/2015, DocumentDB supports the IN keyword, which accepts up to 100 parameters.
Example:
SELECT *
FROM Employees
WHERE Employees.id IN (
"01236", "01237", "01263", "06152", "21224",
"21225", "21226", "21227", "21505", "22903",
"14003", "14004", "14005", "14006", "14007"
)
Original Answer:
Adding on to Ryan's answer... Here's an example:
Create the following UDF:
var containsUdf = {
  id: "contains",
  body: function(arr, obj) {
    if (arr.indexOf(obj) > -1) {
      return true;
    }
    return false;
  }
};
Use your contains UDF in a SQL query:
SELECT * FROM Employees e WHERE contains(["1","2","3","4","5"], e.id)
For documentation on creating UDFs, check out the DocumentDB SQL reference
You can also vote for implementing the "IN" keyword for "WHERE" clauses at the DocumentDB Feedback Forums.
You could also achieve this by using OR support. Below is a sample:
SELECT *
FROM Employees e
WHERE e.EmployeeId = 1 OR e.EmployeeId = 2 OR e.EmployeeId = 3
If you need more ORs than DocumentDB's cap allows, you would have to break your query up into multiple smaller queries by EmployeeId values. You can also issue the queries in parallel from the client and gather all the results.
The best way to do this today would be to create a Contains() UDF that takes in the array of ids to search on, and use that in the WHERE clause.
Does
Select * from Employees where EmployeeId in (1,3,5,6,...)
not work?
Thanks to ryancrawcour we know it doesn't.
Another option is to use the ARRAY_CONTAINS function in the SQL API.
Here is the sample code:
SELECT *
FROM Employees
WHERE ARRAY_CONTAINS(["01236", "01237", "01263", "06152", "21224"], Employees.id)
I ran both queries (this one and the IN version) against a sample dataset; both consume the same amount of RUs.

Subsonic 3 Simple Query inner join sql syntax

I want to perform a simple join on two tables (BusinessUnit and UserBusinessUnit), so I can get a list of all BusinessUnits allocated to a given user.
The first attempt works, but there's no override of Select which allows me to restrict the columns returned (I get all columns from both tables):
var db = new KensDB();
SqlQuery query = db.Select
    .From<BusinessUnit>()
    .InnerJoin<UserBusinessUnit>( BusinessUnitTable.IdColumn, UserBusinessUnitTable.BusinessUnitIdColumn )
    .Where( BusinessUnitTable.RecordStatusColumn ).IsEqualTo( 1 )
    .And( UserBusinessUnitTable.UserIdColumn ).IsEqualTo( userId );
The second attempt allows the column-name restriction, but the generated SQL contains pluralised table names (?)
SqlQuery query = new Select( new string[] { BusinessUnitTable.IdColumn, BusinessUnitTable.NameColumn } )
    .From<BusinessUnit>()
    .InnerJoin<UserBusinessUnit>( BusinessUnitTable.IdColumn, UserBusinessUnitTable.BusinessUnitIdColumn )
    .Where( BusinessUnitTable.RecordStatusColumn ).IsEqualTo( 1 )
    .And( UserBusinessUnitTable.UserIdColumn ).IsEqualTo( userId );
Produces...
SELECT [BusinessUnits].[Id], [BusinessUnits].[Name]
FROM [BusinessUnits]
INNER JOIN [UserBusinessUnits]
ON [BusinessUnits].[Id] = [UserBusinessUnits].[BusinessUnitId]
WHERE [BusinessUnits].[RecordStatus] = #0
AND [UserBusinessUnits].[UserId] = #1
So, two questions:
- How do I restrict the columns returned in method 1?
- Why does method 2 pluralise the table names in the generated SQL (and can I get round this)?
I'm using 3.0.0.3...
So far my experience with 3.0.0.3 suggests that this is not possible yet with the query tool, although it is with version 2.
I think the preferred method (so far) with version 3 is to use a LINQ query with something like:
var busUnits = from b in BusinessUnit.All()
               join u in UserBusinessUnit.All() on b.Id equals u.BusinessUnitId
               select b;
I ran into the pluralized table names myself, but it was because I'd only re-run one template after making schema changes.
Once I re-ran all the templates, the plural table names went away.
Try re-running all 4 templates and see if that solves it for you.
