Getting an error while creating an index for Elasticsearch - Node.js

I am trying to create an Elasticsearch index if it doesn't exist, using OnModuleInit (NestJS). It works fine on my local machine, but when I deploy to an AWS ES cluster, this part throws the error below.
ElasticsearchModule.registerAsync({
  useFactory: async () => ({
    node: process.env.ES_ENDPOINT,
    maxRetries: 3,
    requestTimeout: 60000,
    pingTimeout: 60000,
  }),
})
export class ClientElasticSearchModule implements OnModuleInit {
  constructor(private clientSearchService: ClientElasticSearchService) {}

  async onModuleInit() {
    await this.clientSearchService.createIndex();
  }
}
This is where I create the index:
await this.elasticsearchService.indices.create({
  index: clientsIndex,
  body: {
    settings: {
      analysis: {...},
    },
    mappings: {
      properties: {
        name: {
          type: 'text',
          fields: {
            complete: {
              type: 'text',
              analyzer: 'autocomplete_analyzer',
              search_analyzer: 'autocomplete_search_analyzer',
            },
          },
        },
      },
    },
  },
});
Error message:
ResponseError: Response Error
at onBody (/code/node_modules/@elastic/elasticsearch/lib/Transport.js:337:23)
at IncomingMessage.onEnd (/code/node_modules/@elastic/elasticsearch/lib/Transport.js:264:11)
at IncomingMessage.emit (node:events:402:35)
at IncomingMessage.emit (node:domain:475:12)
at endReadableNT (node:internal/streams/readable:1343:12)
at processTicksAndRejections (node:internal/process/task_queues:83:21) {
meta: {
body: '',
statusCode: 403,
headers: {
date: 'Fri, 14 Jan 2022 16:18:46 GMT',
'content-type': 'application/json',
'content-length': '73',
connection: 'keep-alive',
'x-amzn-requestid': '8b3cded2-f210-4c79-ab48-ff517725a1e2',
'access-control-allow-origin': '*'
},
meta: {
context: null,
request: [Object],
name: 'elasticsearch-js',
connection: [Object],
attempts: 0,
aborted: false
}
}
}

This looks like your AWS OpenSearch instance (the same technology under the hood as Elasticsearch) is returning a 403. This usually means you need to set up IAM roles and grant the correct access to your instance. Please see the AWS docs on Identity and Access Management in Amazon OpenSearch Service.
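For reference, a minimal sketch of a resource-based domain access policy that grants an application role HTTP access; the account ID, role name, region, and domain name below are placeholders:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::123456789012:role/my-app-role" },
      "Action": "es:ESHttp*",
      "Resource": "arn:aws:es:us-east-1:123456789012:domain/my-domain/*"
    }
  ]
}

Note that if the domain enforces IAM (SigV4) request signing, the plain @elastic/elasticsearch client will still be rejected, since it sends unsigned requests by default; in that case the requests must be signed, or the policy must allow the caller some other way (for example by IP).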

Related

Google picker on frontend (user login in reactJS) and google drive download on backend (node.js), API v3

I am building a file uploader which gives the user the option to upload files from their Google Drive. Google Picker is set up and working on the frontend (ReactJS), and I have the fileID and OAuth token given by Google Picker. I send these to the backend (Node.js), which has the Google Drive API set up. I followed the documentation at https://developers.google.com/drive/api/v3/manage-downloads and put the OAuth token in the auth param of drive.files.get; now I get the following error:
Error GaxiosError: [object Object]
at Gaxios._request (/home/.../node_modules/gaxios/src/gaxios.ts:112:15)
at processTicksAndRejections (internal/process/task_queues.js:97:5) {
response: {
config: {
url: 'https://www.googleapis.com/drive/v3/files/1GqxjzMjrDdJquOPeMrFGIMngE20vTrjU?alt=media&key=ya29.6q33c6T418VuSILwq...cLKnBMKEG4vhui8K',
method: 'GET',
responseType: 'stream',
userAgentDirectives: [Array],
paramsSerializer: [Function],
headers: [Object],
params: [Object],
validateStatus: [Function],
retry: true,
retryConfig: [Object]
},
data: PassThrough {
_readableState: [ReadableState],
readable: true,
_events: [Object: null prototype],
_eventsCount: 2,
_maxListeners: undefined,
_writableState: [WritableState],
writable: false,
allowHalfOpen: true,
_transformState: [Object],
[Symbol(kCapture)]: false
},
headers: {
'alt-svc': 'h3-29=":443"; ma=2592000,h3-T051=":443"; ma=2592000,h3-Q050=":443"; ma=2592000,h3-Q046=":443"; ma=2592000,h3-Q043=":443"; ma=2592000,quic=":443"; ma=2592000; v="46,43"',
'cache-control': 'private, max-age=0',
connection: 'close',
'content-length': '176',
'content-type': 'application/json; charset=UTF-8',
date: 'Wed, 16 Dec 2020 11:42:15 GMT',
expires: 'Wed, 16 Dec 2020 11:42:15 GMT',
server: 'UploadServer',
vary: 'Origin, X-Origin',
'x-guploader-uploadid': 'ABg5-Uw8z7O1Hpe1od4_dQF9So652TfYS0Mc1vpIu3t4DDXPzB7YvNwQAeHKCvoNBF-7m_pW9e8EHPOgrEHS84HWR7M'
},
status: 400,
statusText: 'Bad Request',
request: {
responseURL: 'https://www.googleapis.com/drive/v3/files/1GqxjzMjrDdJquOPeMrFGIMngE20vTrjU?alt=media&key=ya29.6q33c6T418VuSILwq...cLKnBMKEG4vhui8K'
}
},
config: {
url: 'https://www.googleapis.com/drive/v3/files/1GqxjzMjrDdJquOPeMrFGIMngE20vTrjU?alt=media&key=ya29.6q33c6T418VuSILwq...cLKnBMKEG4vhui8K',
method: 'GET',
responseType: 'stream',
userAgentDirectives: [ [Object] ],
paramsSerializer: [Function],
headers: {
'x-goog-api-client': 'gdcl/4.4.3 gl-node/12.19.0 auth/6.1.3',
'Accept-Encoding': 'gzip',
'User-Agent': 'google-api-nodejs-client/4.4.3 (gzip)'
},
params: {
alt: 'media',
key: 'ya29.6q33c6T418VuSILwq...cLKnBMKEG4vhui8K'
},
validateStatus: [Function],
retry: true,
retryConfig: {
currentRetryAttempt: 0,
retry: 3,
httpMethodsToRetry: [Array],
noResponseRetries: 2,
statusCodesToRetry: [Array]
}
},
code: '400'
}
Here I observed that there's a "key" parameter, but from what I understand there should be an Authorization token instead. I tried the file URL from Postman using a GET call with an Authorization Bearer header, but it responded with a sign-in page.
Questions:
Since I provided a token, shouldn't it download the file directly?
Why isn't the file downloading on the backend?
Note: I did follow the Google Drive API documentation for Node.js, but even that prompts user sign-in, which shouldn't be the case.
Edit: Added the client code as well
Client side (reactjs)
<GooglePicker
  clientId={"7...b.apps.googleusercontent.com"}
  developerKey={"AIz...PY"}
  scope={["https://www.googleapis.com/auth/drive.readonly"]}
  onChange={handleFileChange}
  onAuthenticate={(token) => setDownloadToken(token)}
  onAuthFailed={(data) => console.log("on auth failed:", data)}
  multiselect={false}
  navHidden={false}
  authImmediate={false}
  query={"a query string like .txt or fileName"}
  viewId={"PDFS"}
>
  <Button variant="contained" size="small">
    G-Drive
  </Button>
</GooglePicker>{" "}
Node.js Code
import fs from 'fs';
import { google } from 'googleapis';

export const googleDriveUpload = async (_, { fileID, authToken }) => {
  console.log(fileID, authToken);
  const drive = google.drive({ version: 'v3', auth: authToken });
  const dest = fs.createWriteStream('/tmp/resume.pdf');
  drive.files.get(
    { fileId: fileID, alt: 'media' },
    { responseType: 'stream' },
    (err, res) => {
      if (err) {
        console.log('Error ', err);
      } else {
        res.data.pipe(dest); // pipe the download into the write stream
        console.log('Download complete');
      }
    }
  );
};
When you define const drive = google.drive({version: 'v3', auth: XXX}), you need to assign to auth the response of the authorize() function, as shown in the quickstart for the Drive API in Node.js.
Please follow the complete quickstart to obtain a valid authenticated client.
If authenticating with an OAuth2 client is not what you want, there are also other options for creating valid credentials; see the google-api-nodejs-client library.
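As a rough sketch of that idea, and assuming the access token obtained from the Picker is still valid, one can wrap it in an OAuth2 client instead of passing the raw token string (CLIENT_ID, CLIENT_SECRET, and REDIRECT_URI are placeholders for your own OAuth credentials):

import { google } from 'googleapis';

// Build an OAuth2 client from your own (placeholder) app credentials.
const oAuth2Client = new google.auth.OAuth2(CLIENT_ID, CLIENT_SECRET, REDIRECT_URI);

// Attach the access token that the Picker handed to the frontend.
oAuth2Client.setCredentials({ access_token: authToken });

const drive = google.drive({ version: 'v3', auth: oAuth2Client });

With an OAuth2 client in place, the library sends the token in an Authorization header rather than as a key query parameter.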

Sending proactive messages to MS Teams in a channel

We are trying to use proactive messages to send notifications to a Teams channel whenever some activity happens in the backend system. I am using the below code:
await msadapter.continueConversation(convRef, async turnContext => {
  await turnContext.sendActivity('proactive hello');
});
In the above code, convRef is a reference to the conversationReference that is stored in our database. It is obtained the first time the channel is created, using a listener on:
this.onConversationUpdate(async (context, next: () => Promise<void>) => {
});
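For reference, a minimal sketch of what such a listener typically captures, assuming botbuilder's TurnContext helper (saveReference is a hypothetical persistence function):

const { TurnContext } = require('botbuilder');

this.onConversationUpdate(async (context, next) => {
  // Serialize the conversation so it can be resumed proactively later.
  const reference = TurnContext.getConversationReference(context.activity);
  await saveReference(reference); // hypothetical helper: persist to the database
  await next();
});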
The above technique works just fine when I try it in a group chat, but I get an authorization error in the case of a channel, as below:
[onTurnError] { Error: Authorization has been denied for this request.
at new RestError (/workspace/node_modules/@azure/ms-rest-js/dist/msRest.node.js:1397:28)
at /workspace/node_modules/@azure/ms-rest-js/dist/msRest.node.js:1849:37
at process._tickCallback (internal/process/next_tick.js:68:7)
code: undefined,
statusCode: 401,
request:
WebResource {
streamResponseBody: false,
url:
'https://smba.trafficmanager.net/amer/v3/conversations/19%3A82de560bf7c843b7a6e050fed39db2c5%40thread.tacv2/activities/f%3A48b6e038-a38d-94e3-7907-dcf88ad4484d',
method: 'POST',
headers: HttpHeaders { _headersMap: [Object] },
body:
'{"type":"message","serviceUrl":"https://smba.trafficmanager.net/amer/","channelId":"msteams","from":{"id":"28:a835cf1d-83a8-4ae9-845a-23a68a1df442","name":"FlashCX.ai"},"conversation":{"isGroup":true,"conversationType":"channel","id":"19:82de560bf7c843b7a6e050fed39db2c5#thread.tacv2","tenantId":"04e930f3-0866-4a6d-b07c-a4737e8f9865"},"recipient":{"id":"29:1a-Xb7uPrMwC2XqjMEHCC7ytV2xb2VUCqTA-n_s-k5ZyMCTKIL-ku2XkgbE167D_5ZbmVaqQxJGIQ13vypSqu-A","aadObjectId":"718ab805-860c-43ec-8d4e-4af0c543df75"},"text":"proactive hello hi","inputHint":"acceptingInput","replyToId":"f:48b6e038-a38d-94e3-7907-dcf88ad4484d"}',
query: undefined,
formData: undefined,
withCredentials: false,
abortSignal: undefined,
timeout: 0,
onUploadProgress: undefined,
onDownloadProgress: undefined,
operationSpec:
{ httpMethod: 'POST',
path: 'v3/conversations/{conversationId}/activities/{activityId}',
urlParameters: [Array],
requestBody: [Object],
responses: [Object],
serializer: [Serializer] } },
response:
{ body:
'{"message":"Authorization has been denied for this request."}',
headers: HttpHeaders { _headersMap: [Object] },
status: 401 },
body:
{ message: 'Authorization has been denied for this request.' } }
Doc reference: https://learn.microsoft.com/en-us/azure/bot-service/bot-builder-howto-proactive-message?view=azure-bot-service-4.0&tabs=javascript#send-proactive-message
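One commonly suggested check for a 401 on proactive sends with the Node SDK is to make sure the service URL stored with the conversation reference is trusted before continuing the conversation; a sketch, assuming the botframework-connector package:

const { MicrosoftAppCredentials } = require('botframework-connector');

// Mark the Teams service URL as trusted so the adapter attaches credentials
// to the outgoing request instead of sending it unauthenticated.
MicrosoftAppCredentials.trustServiceUrl(convRef.serviceUrl);

await msadapter.continueConversation(convRef, async turnContext => {
  await turnContext.sendActivity('proactive hello');
});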

Error when saving to Storage at Google Cloud Function: { Error: Could not refresh access token: Unsuccessful response status code...}

I wrote my code some months ago, and now, without my changing anything, there is an error message and I don't know why.
My code in a Google Cloud Function loads a picture from an AWS S3 bucket and saves it in a Google Cloud Storage bucket, which was actually working some months ago!
My code is running on Node.js 8:
const os = require('os');
const path = require('path');
const fs = require('fs');
const util = require('util');
const { Storage } = require('@google-cloud/storage');

// Assumed setup: writeFile is the promisified fs.writeFile.
const writeFile = util.promisify(fs.writeFile);

const tmpdir = os.tmpdir();
const filePath = path.join(tmpdir, fileName);
console.log(filePath);
console.log(data);
await writeFile(filePath, data.Body);

const bucketName = "test";
console.log('test1');

let storage;
if (!storage) {
  storage = new Storage();
}

await storage.bucket(bucketName).upload(filePath, {
  gzip: false,
  metadata: {
    cacheControl: "public, max-age=31536000"
  }
});
console.log('test2');
console.log(`${filePath} uploaded to ${bucketName}.`);
"test1" is printed in the logs, but "test2" not.
The error I get:
function-138m28l2agh39{ Error: Could not refresh access token: Unsuccessful response status code. Request failed with status code 500
at Gaxios._request (/srv/node_modules/gaxios/build/src/gaxios.js:85:23)
at <anonymous>
at process._tickDomainCallback (internal/process/next_tick.js:229:7)
response:
{ config:
{ url: 'http://169.254.169.254/computeMetadata/v1/instance/service-accounts/default/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fiam%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fdevstorage.full_control',
headers: [Object],
retryConfig: [Object],
params: [Object],
responseType: 'text',
timeout: 0,
paramsSerializer: [Function: paramsSerializer],
validateStatus: [Function: validateStatus],
method: 'GET' },
data: 'Could not fetch URI /computeMetadata/v1/instance/service-accounts/default/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fiam%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fdevstorage.full_control\n',
headers:
{ connection: 'close',
'content-length': '260',
'content-type': 'text/plain; charset=utf-8',
date: 'Thu, 16 Jul 2020 14:25:37 GMT',
'x-content-type-options': 'nosniff' },
status: 500,
statusText: 'Internal Server Error',
request:
{ responseURL: 'http://169.254.169.254/computeMetadata/v1/instance/service-accounts/default/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fiam%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fdevstorage.full_control' } },
config:
{ url: 'http://169.254.169.254/computeMetadata/v1/instance/service-accounts/default/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fiam%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fdevstorage.full_control',
headers: { 'Metadata-Flavor': 'Google' },
retryConfig:
{ noResponseRetries: 3,
currentRetryAttempt: 3,
retry: 3,
httpMethodsToRetry: [Array],
statusCodesToRetry: [Array] },
params:
{ scopes: 'https://www.googleapis.com/auth/iam,https://www.googleapis.com/auth/cloud-platform,https://www.googleapis.com/auth/devstorage.full_control' },
responseType: 'text',
timeout: 0,
paramsSerializer: [Function: paramsSerializer],
validateStatus: [Function: validateStatus],
method: 'GET' },
code: '500' }
Did somebody face a similar issue or can somebody help? :)
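One workaround sometimes suggested for metadata-server token failures like this, sketched here under the assumption that a service-account key file can be deployed with the function (key.json is a placeholder), is to give the Storage client explicit credentials instead of relying on metadata-server tokens:

const { Storage } = require('@google-cloud/storage');

// Placeholder path: a service-account key bundled with the function.
const storage = new Storage({ keyFilename: 'key.json' });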

putMapping illegal_argument_exception without message

I am trying to call putMapping from the Elasticsearch JavaScript client, but I always get an error with status code 400, illegal_argument_exception.
When I call getMapping for the "documents" index I get:
{
  "documents": {
    "mappings": {
      "properties": {
        "category": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
        "createdAt": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
        "id": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
        "info": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
        "text": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
        "title": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
        "visibility": { "type": "boolean" }
      }
    }
  }
}
I tried to call putMapping with the same properties, adding a new parameter 'index': 'not_analyzed' to the category field:
esClient.indices.putMapping({
  index: 'documents',
  type: 'document',
  body: {
    document: {
      properties: {
        title: { type: 'text' },
        info: { type: 'text' },
        text: { type: 'text' },
        category: { type: 'text', index: 'not_analyzed' },
        visibility: { type: 'boolean' },
        createdAt: { type: 'text' },
      },
    },
  },
}, (err, resp) => {
  if (err) {
    console.error(err);
  } else {
    console.log('Successfully Created Index', resp);
  }
});
I got this error:
ResponseError: illegal_argument_exception
at IncomingMessage.<anonymous> (/Users/user/reference_sys_cfu-back/node_modules/@elastic/elasticsearch/lib/Transport.js:287:25)
at IncomingMessage.emit (events.js:208:15)
at IncomingMessage.EventEmitter.emit (domain.js:476:20)
at endReadableNT (_stream_readable.js:1168:12)
at processTicksAndRejections (internal/process/task_queues.js:77:11) {
name: 'ResponseError',
meta: {
body: { error: [Object], status: 400 },
statusCode: 400,
headers: {
'content-type': 'application/json; charset=UTF-8',
'content-length': '345'
},
warnings: null,
meta: {
context: null,
request: [Object],
name: 'elasticsearch-js',
connection: [Object],
attempts: 0,
aborted: false
}
}
}
index: 'not_analyzed' - I did this so I could use the "term" query to find exact strings, but index: 'not_analyzed' was deprecated in the 2.x versions of Elasticsearch. Now I just use field.keyword: 'query string' to find exact strings.
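For reference, a minimal sketch of that keyword-based exact match with the same client (the index name and value are illustrative):

esClient.search({
  index: 'documents',
  body: {
    query: {
      // term matches the keyword sub-field exactly, with no analysis applied
      term: { 'category.keyword': 'some exact category' },
    },
  },
}, (err, resp) => {
  if (err) console.error(err);
  else console.log('Search results:', resp);
});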

Hyperledger Composer with Custom IBM Cloudant Wallet Fails - Entity too Large

I am creating a Node.js server where I am adding participants and creating the cards in the cloud wallet. From what I learned playing with the Composer SDK, I ended up creating a function that takes the participant details, issues the identity, and then creates a card in Cloudant. This method was working fine until today.
Suddenly I am getting 413 - Entity Too Large when I try to import the card.
Any suggestions?
const BusinessNetworkConnection = require("composer-client").BusinessNetworkConnection;
const NetworkCardStoreManager = require("composer-common").NetworkCardStoreManager;
const IdCard = require("composer-common").IdCard;
const connectionProfile = JSON.parse(
  JSON.stringify(require("./connection-profile.json"))
);
const walletType = {
  type: "@ampretia/composer-wallet-cloudant",
  options: {
    database: "composer-wallets",
    apikey: "",
    host: "",
    iam_apikey_description: "",
    iam_apikey_name: "",
    iam_serviceid_crn: "",
    password: "",
    port: ,
    url: "",
    username: ""
  }
};
const AdminConnection = require("composer-admin").AdminConnection;

const cardStore = NetworkCardStoreManager.getCardStore(walletType);
const bnConnection = new BusinessNetworkConnection({ cardStore });
const adminConnection = new AdminConnection({ cardStore });

const addParticipantAndIssueIdentity = userDetails => {
  return new Promise(async (resolve, reject) => {
    try {
      await adminConnection.connect("admin@test.com");
      const definition = await bnConnection.connect("admin@test.com");
      const participantRegistry = await bnConnection.getParticipantRegistry(
        `org.test.bna.${userDetails.role}`
      );
      const factory = definition.getFactory();
      let participant = factory.newResource(
        "org.test.bna",
        `${userDetails.role}`,
        `${userDetails.uid}`
      );
      participant.email = userDetails.email;
      participant.firstName = userDetails.firstName;
      participant.lastName = userDetails.lastName;
      await participantRegistry.add(participant);
      const returnedCard = await bnConnection.issueIdentity(
        `org.test.bna.${userDetails.role}#${userDetails.uid}`,
        `${userDetails.uid}`
      );
      const metadata = {
        userName: returnedCard.userID,
        version: 1,
        enrollmentSecret: returnedCard.userSecret,
        businessNetwork: "test"
      };
      const idCard = new IdCard(metadata, connectionProfile);
      console.log("Importing Card 1");
      await adminConnection.importCard(
        `${userDetails.uid}@test`,
        idCard
      ); // <----- This call is failing
      console.log("Request Identity");
      const result = await adminConnection.requestIdentity(
        `${userDetails.uid}@test`,
        returnedCard.userID,
        returnedCard.enrollmentSecret
      );
      idCard.setCredentials({
        ...result,
        privateKey: result.key
      });
      console.log("Importing Card 2");
      await adminConnection.importCard(
        `${userDetails.uid}@test`,
        idCard
      );
      console.log("Exporting Card");
      await adminConnection.exportCard(`${userDetails.uid}@test`);
      resolve(true);
    } catch (error) {
      console.log(error);
      reject(error);
    }
  });
};
UPDATE:
I have added the error log below:
{ Error: Failed to save card: testCard@test
at card.toArchive.then.catch (/node_modules/composer-common/lib/cardstore/walletbackedcardstore.js:100:31)
at <anonymous>
at process._tickCallback (internal/process/next_tick.js:188:7)
cause:
{ Error: 413 Request Entity Too Large
at Object.clientCallback (/node_modules/@cloudant/cloudant/lib/client.js:213:20)
at Request._callback (/node_modules/@cloudant/cloudant/lib/clientutils.js:154:11)
at Request.self.callback (/node_modules/request/request.js:185:22)
at emitTwo (events.js:126:13)
at Request.emit (events.js:214:7)
at Request.self._source.emit (/node_modules/@cloudant/cloudant/lib/eventrelay.js:78:21)
at Request.<anonymous> (/node_modules/request/request.js:1161:10)
at emitOne (events.js:116:13)
at Request.emit (events.js:211:7)
at Request.self._source.emit (/node_modules/@cloudant/cloudant/lib/eventrelay.js:78:21)
_response:
IncomingMessage {
_readableState: [Object],
readable: false,
domain: null,
_events: [Object],
_eventsCount: 4,
_maxListeners: undefined,
socket: [Object],
connection: [Object],
httpVersionMajor: 1,
httpVersionMinor: 1,
httpVersion: '1.1',
complete: true,
headers: [Object],
rawHeaders: [Array],
trailers: {},
rawTrailers: [],
upgrade: false,
url: '',
method: null,
statusCode: 413,
statusMessage: 'Request Entity Too Large',
client: [Object],
_consuming: true,
_dumped: false,
req: [Object],
request: [Object],
toJSON: [Function: responseToJSON],
caseless: [Object],
read: [Function],
body: '{"error":"document_too_large","reason":"cards"}\n' },
_data: { error: 'document_too_large', reason: 'cards', statusCode: 413 } } }
{ Error: Failed to save card: pleasework52@airspace-blockchain-company
at card.toArchive.then.catch (/node_modules/composer-common/lib/cardstore/walletbackedcardstore.js:100:31)
at <anonymous>
at process._tickCallback (internal/process/next_tick.js:188:7)
cause:
{ Error: 413 Request Entity Too Large
at Object.clientCallback (/node_modules/@cloudant/cloudant/lib/client.js:213:20)
at Request._callback (/node_modules/@cloudant/cloudant/lib/clientutils.js:154:11)
at Request.self.callback (/node_modules/request/request.js:185:22)
at emitTwo (events.js:126:13)
at Request.emit (events.js:214:7)
at Request.self._source.emit (/node_modules/@cloudant/cloudant/lib/eventrelay.js:78:21)
at Request.<anonymous> (/node_modules/request/request.js:1161:10)
at emitOne (events.js:116:13)
at Request.emit (events.js:211:7)
at Request.self._source.emit (/node_modules/@cloudant/cloudant/lib/eventrelay.js:78:21)
_response:
IncomingMessage {
_readableState: [Object],
readable: false,
domain: null,
_events: [Object],
_eventsCount: 4,
_maxListeners: undefined,
socket: [Object],
connection: [Object],
httpVersionMajor: 1,
httpVersionMinor: 1,
httpVersion: '1.1',
complete: true,
headers: [Object],
rawHeaders: [Array],
trailers: {},
rawTrailers: [],
upgrade: false,
url: '',
method: null,
statusCode: 413,
statusMessage: 'Request Entity Too Large',
client: [Object],
_consuming: true,
_dumped: false,
req: [Object],
request: [Object],
toJSON: [Function: responseToJSON],
caseless: [Object],
read: [Function],
body: '{"error":"document_too_large","reason":"cards"}\n' },
_data: { error: 'document_too_large', reason: 'cards', statusCode: 413 } } }
All flavours of Cloudant on IBM Cloud impose a 1 MB maximum document size. There is no way around this, and it applies to both free and paid versions. See https://console.bluemix.net/docs/services/Cloudant/api/document.html#documents
An efficient data architecture for Cloudant would typically use documents of a couple of kilobytes.
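To check whether a card is hitting that limit before importing it, one can serialize it and look at the byte length; a minimal sketch, assuming composer-common's IdCard.toArchive accepts the nodebuffer output type (the same serialization path the wallet-backed card store uses, per the stack trace above):

// Serialize the card and report its size against the Cloudant limit.
const buffer = await idCard.toArchive({ type: 'nodebuffer' });
console.log(`Card archive size: ${buffer.length} bytes (Cloudant limit: 1 MB)`);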
