Node.js app using Mailjet throwing a confusing error - node.js

I'm building an app using Mailjet, following their connection example:
app.get('/send', function (req, res) {
  ...
  var request = mailjet
    .post("send")
    .request({
      <request stuff, email details>
    });
  request
    .on('success', function (response, body) {
      <handle response>
    })
    .on('error', function (err, response) {
      <handle error>
    });
I'm getting this error:
Unhandled rejection Error: Unsuccessful
at /home/ubuntu/workspace/node_modules/node-mailjet/mailjet-client.js:203:23
When I go to the Mailjet client and ask it to log the error, it tells me:
{ [Error: Unauthorized]
  original: null,
  ...
Anyone have an idea of where I should start troubleshooting?
Update: I saw this in the error output:
header:
 { server: 'nginx',
   date: 'Thu, 02 Mar 2017 14:04:11 GMT',
   'content-type': 'text/html',
   'content-length': '20',
   connection: 'close',
   'www-authenticate': 'Basic realm="Provide an apiKey and secretKey"',
   vary: 'Accept-Encoding',
   'content-encoding': 'gzip' },
So it's not eating my API key and secret. Can anyone tell me how to set those as environment variables in Cloud9?

You can set environment variables in ~/.profile. Files outside of the workspace directory /home/ubuntu/workspace aren't accessible to read-only users, so people won't be able to see them.
In the terminal, you can do, for example:
$> echo "export MAILJET_PUBLIC=foo" >> ~/.profile
$> echo "export MAILJET_SECRET=bar" >> ~/.profile
Then, you'll be able to access those variables in Node when using the connect method:
const mailjet = require('node-mailjet')
  .connect(process.env.MAILJET_PUBLIC, process.env.MAILJET_SECRET)
The runners (from the "run" button) and the terminal will evaluate ~/.profile and make the environment variables available to your app.
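As a quick sanity check, here is a minimal sketch (assuming the MAILJET_PUBLIC and MAILJET_SECRET names exported above) that fails fast if the credentials are not actually visible to the app before connecting:
// Minimal sketch: verify the Cloud9 environment variables are set,
// then connect node-mailjet with them (names match the exports above).
if (!process.env.MAILJET_PUBLIC || !process.env.MAILJET_SECRET) {
  throw new Error('MAILJET_PUBLIC and/or MAILJET_SECRET are not set in the environment');
}
const mailjet = require('node-mailjet')
  .connect(process.env.MAILJET_PUBLIC, process.env.MAILJET_SECRET);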

Related

Influxdb2 Python API: Path not found

I have a working InfluxDb2 server and, on a Raspberry Pi, the Python client library.
I've generated the tokens in the server UI and copied an all-areas one into the Python program. The test bucket is set up in the UI too. In the Python program I have this:
bucket = "test"
org = "test-org"
#
token = "blabla=="
# Store the URL of your InfluxDB instance
url="http://10.0.1.1:8086/api/v2"
client = influxdb_client.InfluxDBClient(
url=url,
token=token,
org=org
)
Followed later by:
p = influxdb_client.Point("my_measurement").tag("location", "Prague").field("temperature", 25.3)
write_api = client.write_api(write_options=SYNCHRONOUS)
write_api.write(bucket='test', org='test-org', record=p)
I've overcome the not-authorized error, but now, whatever I do, I end up with this:
influxdb_client.rest.ApiException: (404)
Reason: Not Found
HTTP response headers: HTTPHeaderDict({'Content-Type': 'application/json; charset=utf-8', 'X-Influxdb-Build': 'OSS', 'X-Influxdb-Version': 'v2.2.0', 'X-Platform-Error-Code': 'not found', 'Date': 'Tue, 26 Apr 2022 14:35:50 GMT', 'Content-Length': '54'})
HTTP response body: {
"code": "not found",
"message": "path not found"
}
I've also gone back to curl, which gives me a not-authorized problem with the same parameters. Any help appreciated; I'm beginning to regret trying to upgrade now.
You don't need the /api/v2 in your url parameter; just use url="http://10.0.1.1:8086".
See https://github.com/influxdata/influxdb-client-python#getting-started

Unable to create SparkApplications on Kubernetes cluster using SparkKubernetesOperator from Airflow DAG (Airflow version 2.0.2 MWAA)

I'm trying to use SparkKubernetesOperator to run a Spark job on Kubernetes with the same DAG and YAML files as in the following question:
Unable to create SparkApplications on Kubernetes cluster using SparkKubernetesOperator from Airflow DAG
But Airflow shows the following error:
HTTP response headers: HTTPHeaderDict({'Audit-Id': 'e2e1833d-a1a6-40d4-9d05-104a32897deb', 'Cache-Control': 'no-cache, private', 'Content-Type': 'application/json', 'Date': 'Fri, 10 Sep 2021 08:38:33 GMT', 'Content-Length': '462'})
HTTP response body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"the object provided is unrecognized (must be of type SparkApplication): couldn't get version/kind; json parse error: json: cannot unmarshal string into Go value of type struct { APIVersion string \"json:\\\"apiVersion,omitempty\\\"\"; Kind string \"json:\\\"kind,omitempty\\\"\" } (222f7573722f6c6f63616c2f616972666c6f772f646167732f636f6e6669 ...)","reason":"BadRequest","code":400}
Any suggestions on how to resolve this problem?
I think you had the same problem as me.
SparkKubernetesOperator(
    task_id='spark_pi_submit',
    namespace="default",
    application_file=open("/opt/airflow/dags/repo/script/spark-test.yaml").read(),  # officially known bug
    kubernetes_conn_id="kubeConnTest",  # ns default in Airflow connection UI
    do_xcom_push=True,
    dag=dag
)
I wrapped it like this, and it works like a charm.
https://github.com/apache/airflow/issues/17371

How to run the REST API to create a build trigger in Google Cloud Build

I have written a Python script on my local machine; when I try to run it, I get the error below:
Error
{'error': {'code': 400,
           'details': [{'@type': 'type.googleapis.com/google.rpc.Help',
                        'links': [{'description': 'Google developer console API key',
                                   'url': 'https://console.developers.google.com/project/[project_id]/apiui/credential'}]}],
           'message': 'The API Key and the authentication credential are from different projects.',
           'status': 'INVALID_ARGUMENT'}}
Python script to build the trigger:
import json
import subprocess
from pprint import pprint

import requests

bashCommand = "gcloud auth print-access-token"
process = subprocess.Popen(bashCommand.split(), stdout=subprocess.PIPE)
output, error = process.communicate()
if error:
    print(error)

headers = {
    'Authorization': 'Bearer ' + str(output)[2:-3],
    'Accept': 'application/json',
    'Content-Type': 'application/json'
}

cloudbuild = {
    "build": {
        "source": {
            "repoSource": {
                "projectId": "[PROJECT_ID]",
                "repoName": "[repoName]",
                "branchName": ".*"
            }
        }
    },
    "description": "API TRigger for all branch",
    "name": "[TRIGGER NAME]"
}

data = json.dumps(cloudbuild)
response = requests.post(
    'https://cloudbuild.googleapis.com/v1/projects/[PROJECT_ID]/triggers?key=[API KEY]',
    headers=headers,
    data=data
)
results_output = response.json()
pprint(results_output)
I also set the project on my local machine:
gcloud config set project [project-name]
Please give me a solution for this. Thanks in advance.
I removed the API key from the request. The access token is enough to run the above Python script.

How to fix: "EPROTO" Error after upgrading Node's version

The following code works on Node.js v10.15.3:
const { post } = require('request');

post({
  url: 'https://cidadao.sinesp.gov.br/sinesp-cidadao/mobile/consultar-placa/v4',
  body: '<?xml version=\"1.0\" encoding=\"UTF-8\" standalone=\"yes\"?>\n<v:Envelope xmlns:v=\"http://schemas.xmlsoap.org/soap/envelope/\">\n <v:Header>\n <b>LGE Nexus 5</b>\n <c>ANDROID</c>\n <d>v4</d>\n <e>4.3.2</e>\n <f>98.193.54.223</f>\n <g>514650d8dba4784ed08b5a029583576361a50bc5</g>\n <h>-3272.3179572637086</h>\n <i>940.839492700698</i>\n <j/>\n <k/>\n <l>2019-05-24 10:24:35</l>\n <m>8797e74f0d6eb7b1ff3dc114d4aa12d3</m>\n </v:Header>\n <v:Body xmlns:n0=\"http://soap.ws.placa.service.sinesp.serpro.gov.br/\">\n <n0:getStatus>\n <a>LSU3J43</a>\n </n0:getStatus>\n </v:Body>\n</v:Envelope>',
  headers: {
    'Content-Type': 'application/x-www-form-urlencoded; charset=UTF-8',
    'User-Agent': 'SinespCidadao / 3.0.2.1 CFNetwork / 758.2.8 Darwin / 15.0.0',
    Host: 'cidadao.sinesp.gov.br'
  },
}, (err, httpResponse, body) => {
  if (err) return console.error(err);
  console.log(JSON.stringify(httpResponse));
});
But after upgrading to v12.2.0 or above, I got the following error:
Error: write EPROTO 17432:error:1425F102:SSL routines:ssl_choose_client_version:unsupported protocol:c:\ws\deps\openssl\openssl\ssl\statem\statem_lib.c:1922:
at WriteWrap.onWriteComplete [as oncomplete] (internal/stream_base_commons.js:83:16) {
errno: 'EPROTO',
code: 'EPROTO',
syscall: 'write'
}
How can I fix it?
As the same code works on Node.js v10.15.3 but not on v12.2.0, and the error message indicates "unsupported protocol", the most likely root cause is this: the default minimum TLS version in Node.js 10 is TLSv1.0, but since v11.4.0 it has been raised to TLSv1.2 (tls.DEFAULT_MIN_VERSION). I suspect the server at cidadao.sinesp.gov.br only supports TLSv1.0, which works on Node.js v10.15.3 but not on v12.2.0.
To make Node.js accept TLSv1.0, you can launch the Node.js process with the --tls-min-v1.0 option.
I made an experiment and it works well.
By the way, the certificate of cidadao.sinesp.gov.br is invalid now; it expired in May 2018. As the OP mentioned, the request should be sent from Brazil (or through a proxy node in Brazil).
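If changing the launch command is inconvenient, a rough programmatic equivalent (a sketch, not part of the original answer) is to lower the default minimum TLS version before the first request is made:
// Sketch: roughly equivalent to passing --tls-min-v1.0 (Node.js >= 11.4.0).
// This must run before any TLS connection is opened.
const tls = require('tls');
tls.DEFAULT_MIN_VERSION = 'TLSv1';

const { post } = require('request');
// ... same post() call as above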

request does not work, Error: Invalid protocol: 127.0.0.1:?

I am new to Node.js. I use request to send a POST request, but I get an error!
request({
  method: 'POST',
  url: config.api + '/index',
  body: {
    name: "name"
  },
  json: true
})
throw er; // Unhandled 'error' event
^
Error: Invalid protocol: 127.0.0.1:
I wrote it like this and it works fine; you can modify yours the same way:
request({
  method: 'POST',
  url: 'http://127.0.0.1:3000' + '/index',
  body: {
    name: "name"
  },
  json: true
})
Your code is incorrect: follow the instructions on the NPM module page.
If you're using a PHP development server, please refer to this thread for the solution.
I met a similar issue on Win10 after a system update. It was caused by the system proxy settings.
http_proxy=127.0.0.1:8888
https_proxy=127.0.0.1:8888
Change the above environment settings to
http_proxy=http://127.0.0.1:8888
https_proxy=http://127.0.0.1:8888
That did the job for me.
By the way, if you use Git Bash, you can also check the Git config:
$ git config --list
...
http.sslverify=false
http.proxy=http://127.0.0.1:8888
https.proxy=http://127.0.0.1:8888
...
