Nestjs & TypeOrm: No results from Query Builder using getOne() / getMany() - node.js

I don't get this. I have a service that injects entity repositories and has dedicated methods for some business logic.
Besides that, I expose a method that just returns a QueryBuilder - to avoid injecting repositories all over the place - for the few occasions when another service needs just a quick query:
type EntityFields = keyof MyEntity;

entityQueryBuilder(alias?: string, id?: number, ...select: EntityFields[]) {
  const q = this.entityRepository.createQueryBuilder(alias);
  if (id) {
    q.where({ id });
  }
  if (select.length) {
    q.select(select);
  }
  return q;
}
Now when I am trying to use this and call:
const r = await service.entityQueryBuilder('a', 1, 'settings').getOne();
the result is always empty, although the generated SQL in the log is correct.
However when I do:
const r = await service.entityQueryBuilder('a', 1, 'settings').execute();
I get (almost) what I need. I get an array instead of an entity object directly, but the data are there.
I am unhappy though, as I need to map the result to the object I wanted, which is something getOne() should do on my behalf. getMany() does not return results either.
What did I do wrong?
Edit:
FWIW here is the final solution I came up with based on the hint in accepted reply:
entityQueryBuilder(id?: number, ...select: EntityFields[]) {
  const q = this.entityRepository.createQueryBuilder('alias');
  if (id) {
    q.where({ id });
  }
  if (select.length) {
    q.select(select.map(f => `alias.${f}`));
  }
  return q;
}
Admittedly it has a hardcoded alias, but I can live with that and it is fine for my purpose.
Hope this helps someone in the future.
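For what it's worth, a variant that keeps the alias as a parameter by prefixing it onto each selected column could look like the sketch below (same idea, just not hardcoded; untested against a specific TypeORM version):
entityQueryBuilder(alias = 'e', id?: number, ...select: EntityFields[]) {
  const q = this.entityRepository.createQueryBuilder(alias);
  if (id) {
    q.where({ id });
  }
  if (select.length) {
    // getOne()/getMany() only hydrate columns selected under the query alias
    q.select(select.map(f => `${alias}.${f}`));
  }
  return q;
}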

It happens because the select is not properly qualified with the alias. In your case, you need a.settings instead of settings:
const r = await service.entityQueryBuilder('a', 1, 'a.settings').getOne(); // it should work

Related

Redux unexpected behaviour | create empty object if not found

I am debugging an app; there is an existing redux reducer which sets some data on the store object. When I dispatch the action for this reducer before the relevant object is initialised, it still works and creates an empty object. This works on our deployment server but crashes on my local machine with the correct error that "map is undefined on null". Why is it creating an empty object and not crashing on the deployment server, and if it is creating an object, why is it not assigning the data we pass to it? My reducer is:
case ACTIONS.SET_LOCAL_WEIGHTS: {
  const { weight } = action;
  const drafts = fromJS(state.getIn(['draftData', 'rows']));
  const setWeight = drafts.map((row: any) => {
    row.data.weight = weight[row.id].weight;
    return row;
  });
  return state
    .setIn(['draftData', 'rows'], setWeight)
    .setIn(['draftData', 'total'], setWeight.length);
}
It creates draftData: {} even though rows and total are also provided. I have tried it on Node 15 and 12 to check for any anomaly in the map function.
I get the error Cannot read property 'map' of undefined from your code if the initial state doesn't have a property state.draftData.rows. I don't see anywhere that you would be creating an empty object.
The immutable.js fromJS method will create a List if called with an array from state.draftData.rows. But if it is called with undefined then it returns undefined instead of a collection with a .map() method.
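A quick illustration of that difference (a minimal sketch, assuming a recent immutable.js; fromJS passes non-object values straight through):
import { fromJS } from "immutable";

console.log(fromJS([1, 2]).map(x => x * 2).toJS()); // [2, 4] – arrays become a List with .map()
console.log(fromJS(undefined));                      // undefined – no .map(), so calling it throws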
I also don't think that you need to be calling fromJS if the rows object is never converted toJS, but it might depend on your initial state.
This code should work. It uses the existing List from state if it exists, or creates an empty List otherwise.
const drafts = state.getIn(["draftData", "rows"]) ?? fromJS([]);
The assignment in row.data.weight = weight[row.id].weight seems like a mutation of state.
I tried to rewrite this, but it seems strange to me that your code doesn't do anything with the weights in the payload unless their index/key matches one that's already in the state.
import { fromJS, List, Map } from "immutable";

interface Row {
  data: {
    weight: number;
  };
  id: number;
}

const reducer = (state = Map(), action) => {
  switch (action.type) {
    case ACTIONS.SET_LOCAL_WEIGHTS: {
      const { weight } = action;
      const drafts: List<Row> =
        state.getIn(["draftData", "rows"]) ?? fromJS([]);
      const setWeight = drafts.reduce(
        (next, row, index) =>
          next.setIn([index, "data", "weight"], weight[row.id]?.weight),
        drafts
      );
      return state
        .setIn(["draftData", "rows"], setWeight)
        .setIn(["draftData", "total"], setWeight.size);
    }
    default:
      return state;
  }
};

How do I filter keys from JSON in Node.js?

I'm trying to select certain keys from a JSON object and filter out the rest.
var json = JSON.stringify(body);
which is:
{
  "FirstName": "foo",
  "typeform_form_submits": {
    "foo": true,
    "bar": true,
    "baz": true
  },
  "more keys": "foo",
  "unwanted key": "foo"
}
What I want:
{
  "FirstName": "foo",
  "typeform_form_submits": {
    "foo": true,
    "bar": true,
    "baz": true
  }
}
I've checked out How to filter JSON data in node.js?, but I'm looking to do this without any packages.
Now you can use Object.fromEntries like so:
Object.fromEntries(Object.entries(raw).filter(([key]) => wantedKeys.includes(key)))
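For example, with the sample object from the question (raw and wantedKeys are just placeholder names for your data and your whitelist):
const raw = {
  "FirstName": "foo",
  "typeform_form_submits": { "foo": true, "bar": true, "baz": true },
  "more keys": "foo",
  "unwanted key": "foo"
};
const wantedKeys = ["FirstName", "typeform_form_submits"];

const filtered = Object.fromEntries(
  Object.entries(raw).filter(([key]) => wantedKeys.includes(key))
);
console.log(JSON.stringify(filtered));
// {"FirstName":"foo","typeform_form_submits":{"foo":true,"bar":true,"baz":true}}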
You need to filter your object before passing it to JSON.stringify:
const rawJson = {
  "FirstName": "foo",
  "typeform_form_submits": {
    "foo": true,
    "bar": true,
    "baz": true
  },
  "more keys": "foo",
  "unwanted key": "foo"
};

// This array will serve as a whitelist to select the keys you want to keep in rawJson
const filterArray = [
  "FirstName",
  "typeform_form_submits",
];

// this function filters source keys (one level deep) according to the whitelist
function filterObj(source, whiteList) {
  const res = {};
  // iterate over each key of source
  Object.keys(source).forEach((key) => {
    // if whiteList contains the current key, add this key to res
    if (whiteList.indexOf(key) !== -1) {
      res[key] = source[key];
    }
  });
  return res;
}

// outputs the desired result
console.log(JSON.stringify(filterObj(rawJson, filterArray)));
var raw = {
  "FirstName": "foo",
  "typeform_form_submits": {
    "foo": true,
    "bar": true,
    "baz": true
  },
  "more keys": "foo",
  "unwanted key": "foo"
};

var wantedKeys = ["FirstName", "typeform_form_submits"];
var opObj = {};

Object.keys(raw).forEach(key => {
  if (wantedKeys.includes(key)) {
    opObj[key] = raw[key];
  }
});

console.log(JSON.stringify(opObj));
I know this question was asked a while back, but I wanted to toss this out there, since nobody else did:
If you're bound and determined to do this with stringify, one of its less-well-known capabilities involves replacer, its second parameter. For example:
// Creating a demo data set
let dataToReduce = {a:1, b:2, c:3, d:4, e:5};
console.log('Demo data:', dataToReduce);
// Providing an array to reduce the results down to only those specified.
let reducedData = JSON.stringify(dataToReduce, ['a','c','e']);
console.log('Using [reducer] as an array of IDs:', reducedData);
// Running a function against the key/value pairs to reduce the results down to those desired.
let processedData = JSON.stringify(dataToReduce, (key, value) => (value%2 === 0) ? undefined: value);
console.log('Using [reducer] as an operation on the values:', processedData);
// And, of course, restoring them back to their original object format:
console.log('Restoration of the results:', '\nreducedData:', JSON.parse(reducedData), '\nprocessedData:', JSON.parse(processedData));
In the above code snippet, the key value pairs are filtered using stringify exclusively:
In the first case, by providing an array of strings, representing the keys you wish to preserve (as you were requesting)
In the second, by running a function against the values, and dynamically determining those to keep (which you didn't request, but is part of the same property, and may help someone else)
In the third, the results are converted from JSON back into objects (using JSON.parse()).
Now, I want to stress that I'm not advocating this as the appropriate method to reduce an object (though it will make a clean SHALLOW copy of said object, and is actually surprisingly performant), if only from an obscurity/readability standpoint. But it IS a totally effective (and mainstream; that is: built into the language, not a hack) option/tool to add to the arsenal.

Passing path parameters in axios

I am using Axios with Node.js and trying to pass path parameters to the axios.get() method. For example, if the URL is url = '/fetch/{date}', I want to replace {date} with the actual date while calling axios.get(url).
I went through the source code on GitHub and StackOverflow, but couldn't find any such method.
Is it possible to keep URLs with parameters as placeholders and replace them when actually calling the get method of Axios?
Axios doesn't have this feature, and it looks like the team doesn't want to add it.
With credit to previous responders for inspiration, this seems to me like the solution closest to what you (and I) are looking for:
1 - Where you want to store all your URLs and their parameters, define them as functions which use a template string to return the composed URL:
export var fetchDateUrl = (date) => `/fetch/${date}`;
If you need any type-specific formatting of the value being concatenated into the URL, this function is a good place to do it.
2 - Where you want to make the request, call the function with the correct parameters:
import { fetchDateUrl } from 'my-urls';
axios.get(fetchDateUrl(someDateVariable))...;
Another variation, if you really like the idea of naming the parameters at the call site, you can define the URL function to destructure an object like this:
var fetchDateUrl = ({date}) => `/fetch/${date}`;
which you'd then use like this:
axios.get(fetchDateUrl({date: someDateVariable}));
Use template strings
url = `/fetch/${date}`
Or just tag it on
url = '/fetch/'+ date
I think using axios interceptors is a better way to do this:
// create your instance
const instanceAxios = axios.create({
  baseURL: 'http://localhost:3001'
});

instanceAxios.interceptors.request.use(config => {
  if (!config.url) {
    return config;
  }
  const currentUrl = new URL(config.url, config.baseURL);
  // parse the pathname and substitute the :param placeholders
  Object.entries(config.urlParams || {}).forEach(([k, v]) => {
    currentUrl.pathname = currentUrl.pathname.replace(`:${k}`, encodeURIComponent(v));
  });
  const authPart = currentUrl.username && currentUrl.password
    ? `${currentUrl.username}:${currentUrl.password}`
    : '';
  return {
    ...config,
    baseURL: `${currentUrl.protocol}//${authPart}${currentUrl.host}`,
    url: currentUrl.pathname,
  };
});
// use like:
instanceAxios.get('/issues/:uuid', {
  urlParams: {
    uuid: '123456789'
  }
});
For TypeScript users, you will need to add this in one of your .d.ts files:
declare module 'axios' {
  interface AxiosRequestConfig {
    urlParams?: Record<string, string>;
  }
}
(This is a POC, not really tested; don't hesitate to point out anything wrong.)
You can use template strings, i.e.:
let sellerId = 317737;

function getSellerAnalyticsTotals() {
  return axios.get(`http://localhost:8000/api/v1/seller/${sellerId}/analytics`);
}
Given some API /fetch/${date} you likely want to wrap your axios call in a function.
const fetchData = (date) => axios.get(`/fetch/${date}`);
fetchData(dateObject.toFormat('yyyy-mm-dd'))
.then(result => { ... });
This requires the calling code to format the date correctly, however. You can avoid that by using a DateTime library that handles date parsing and enforcing the format inside the function.
const fetchData = (date) => axios.get(`/fetch/${date.toFormat('yyyy-mm-dd')}`);
fetchData(dateObject)
.then(result => { ... });
You can do it like this:
getProduct = (id) => axios.get(`product/${id}`);
I always do it like this:
const res = await axios.get('https://localhost:3000/get', { params: { myParam: 123 } });
I find this to be much clearer than template strings.

Running node sqlite3 code synchronously

I'm having trouble adjusting to the async-first nature of node / js / typescript. The intent of this little function should be pretty clear: it takes a database and returns an array of courses that are listed in that database.
The problem is that the return statement runs before any of the database operations have run, and I get an empty list. When I set a breakpoint inside the database.each loop, I can see that the rows are being found and that courses are being pushed into ret one by one, but these courses never become visible in the scope where courseList() was called.
const courseList = (database: sqlite3.Database): Course[] => {
  let ret = new Array<Course>();
  database.serialize();
  database.each("select ID, Title from Course", (err: Error, row: Object) => {
    ret.push(new Course(
      row.ID,
      row.Title
    ));
  });
  return ret;
};
Suggestions?
The calling code just wants to print information about courses. For example:
let courses = courseList(db);
console.log(courses.length); // logs 0, even though the db contains courses
database.each takes a complete callback. Use that to resume, e.g.:
const courseList = (database: sqlite3.Database, complete): Course[] => {
  let ret = new Array<Course>();
  database.serialize();
  database.each("select ID, Title from Course", (err: Error, row: Object) => {
    ret.push(new Course(
      row.ID,
      row.Title
    ));
  }, complete);
  return ret;
};

let courses = courseList(db, () => {
  console.log(courses.length);
});
More
There are better ways to write this. Use promises https://basarat.gitbooks.io/typescript/content/docs/promise.html
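For instance, a promise-based wrapper could look like the sketch below (assuming node-sqlite3's db.all callback API and the Course class from the question; not tested):
const courseListAsync = (database: sqlite3.Database): Promise<Course[]> =>
  new Promise((resolve, reject) => {
    // db.all collects every row first, then invokes the callback exactly once
    database.all("select ID, Title from Course", (err: Error, rows: any[]) => {
      if (err) {
        reject(err);
        return;
      }
      resolve(rows.map(row => new Course(row.ID, row.Title)));
    });
  });

// usage
courseListAsync(db).then(courses => console.log(courses.length));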
The documentation is horrible: https://github.com/mapbox/node-sqlite3/wiki I would be tempted to look elsewhere (TS first) for a database solution. It's not worth the pain for me personally. YMMV.

Knex Query build - build chain dynamically

I've traded node-DBI for knex because it has more of the functionality that I require.
So far I'd make the same choice again, but one thing is holding me back: writing abstract methods that take an options variable in which params like where, innerJoin and such are contained.
Using node-dbi I could easily forge a string from these variables, but I can't seem to create the knex chain dynamically, because after using a switch you'd get knex.method is not a function.
Any idea how to resolve this?
I'm looking for something like:
getData(table, options) {
  var query = knex;
  if (options.select)
    /* append the select data using knex.select() */
  if (options.where)
    /* append the where data using knex.where(data) */
  if (options.innerJoin)
    /* append the innerJoin data */
}
This way I can avoid having to write a lot of DB functions and let my Business Logic Layer handle the requests.
/* This function serves as the core of our DB layer.
It will generate a SQL query and execute it, returning a promise for the response.
#param options{Object}: the options object that contains all of the query options
#return Promise{Object}: returns a promise that will be rejected or resolved based on the outcome of the query
The reasoning behind this kind of logic is that we want to abstract our layer as much as possible: if even the slightest
syntax change occurs in the near future, we can easily update all our code by updating this one function.
We are using knex as a query builder and are thus relying on knex to communicate with our DB. */
/* Can also be used to build custom query functions from a data service. This way our database service will remain
unpolluted by many different functions, and logic will be contained in a BLL. */
/* All available options
var options = {
  table: 'table',
  where: {operand: '=', value: 'value', valueToEqual: 'val2'},
  andWhere: [{operand: '=', value: 'value', valueToEqual: 'val2'}],
  orWhere: [{operand: '=', value: 'value', valueToEqual: 'val2'}],
  select: {value: ['*']},
  insert: {data: {}},
  innerJoin: [{table: 'tableName', value: 'value', valueToEqual: 'val2'}],
  update: {data: {}}
}*/
/*Test object*/
/*var testobj = {
  table: 'advantage',
  where: {operand: '>', value: 'id', valueToEqual: '3'},
  select: {value: ['*']},
  innerJoin: {table: 'User_Advantage', value: 'User_Advantage.Advantageid', valueToEqual: 'id'}
}
queryBuilder(testobj)*/
function queryBuilder(options) {
  var promise = new Promise(function (resolve, reject) {
    var query;
    for (var prop in options) {
      /*logger.info(prop)*/
      if (options.hasOwnProperty(prop)) {
        switch (prop) {
          case 'table':
            query = knex(options[prop]);
            break;
          case 'where':
            query[prop](options[prop].value, options[prop].operand, options[prop].valueToEqual);
            break;
          /* andWhere and orWhere share the same syntax */
          case 'andWhere':
          case 'orWhere':
            for (let i = 0, len = options[prop].length; i < len; i++) {
              query[prop](options[prop][i].value, options[prop][i].operand, options[prop][i].valueToEqual);
            }
            break;
          case 'select':
            query[prop](options[prop].value);
            break;
          /* Same syntax for update and insert -- switch fallthrough */
          case 'insert':
          case 'update':
            query[prop](options[prop].data);
            break;
          case 'innerJoin':
            for (let i = 0, len = options[prop].length; i < len; i++) {
              query[prop](options[prop][i].table, options[prop][i].value, options[prop][i].valueToEqual);
            }
            break;
        }
      }
    }
    // reject early if no table was given, so the chain below is never undefined
    if (!query) {
      return reject('Options wrongly formatted');
    }
    return query
      .then(function (res) {
        return resolve(res);
      }, function (error) {
        logger.error(error);
        return reject(error);
      });
  });
  return promise;
}
Thanks to Molda I was able to produce the code above. It takes an Object called options as a parameter and builds the knex chain based on its values. See the comments for the syntax of the object.
Not every knex query option has been included, but this will serve as a good base for anyone trying to achieve a similar effect.
Some examples to use this:
/* Will return all values from a certain table
#param: table{String}: string of the table to query
#param: select{Array[String]}: Array of strings of columns to be selected -- defaults to ['*'] */
function getAll(table, select) {
  /* Select * from table as default */
  var selectVal = select || ['*'];
  var options = {
    table: table,
    select: {value: selectVal}
  };
  return queryBuilder(options);
}
or a more specific use case:
function getUserAdvantages(userid) {
  var options = {
    table: 'advantage',
    innerJoin: [{table: TABLE, value: 'advantage.id', valueToEqual: 'user_advantage.Advantageid'}],
    where: {operand: '=', value: 'user_advantage.Userid', valueToEqual: userid}
  };
  return sqlService.queryBuilder(options);
}
Note: sqlService is a node module that I export, containing the queryBuilder method.
Edit: I wanted to add that the only roadblock I had was using .from / .insert from knex. I no longer use these methods as they resulted in errors; I've used knex(table) instead, as noted in the comments.
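For reference, both chaining styles below are meant to produce equivalent queries in knex (a quick sketch using the advantage table from the examples above; not verified against every knex version):
// starting the chain from the table, as the queryBuilder above does
knex('advantage').where('id', '>', 3).select('*');

// starting from the verb and naming the table with .from()
knex.select('*').from('advantage').where('id', '>', 3);

// inserts follow the same pattern
knex('advantage').insert({ name: 'foo' });
knex.insert({ name: 'foo' }).into('advantage');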
