Is there a Redis package for "Azure Cache for Redis" in Dart 2.9.2 (for a Flutter app) which takes the hostname, port, and access key in the connection string?

I want to connect to Azure Cache for Redis from a Flutter app. So far I've tried two Dart packages for Redis: redis 1.3.0 and dartis 0.5.0.
Example:
import 'package:redis/redis.dart';
...
RedisConnection conn = new RedisConnection();
conn
    .connect('localhost', 6379)
    .then((Command command) {
  print("yo2");
  command.send_object(["SET", "key1", "value1"]).then((var response) {
    print(response);
  });
});
Instead of "localhost" I put "SampleName.redis.cache.windows.net". This is the error I get:
E/flutter ( 4861): [ERROR:flutter/shell/common/shell.cc(209)] Dart Error: Unhandled exception:
E/flutter ( 4861): RedisError(NOAUTH Authentication required.)
(The old package is the one starred on the Redis website, but it's incompatible with Dart versions >2.)

Okay, so I found the solution: authenticate with your access key using the AUTH command.
For redis 1.3.0:
...
RedisConnection conn = new RedisConnection();
conn
    .connect('SampleName.redis.cache.windows.net', 6379)
    .then((Command command) {
  print("yo2");
  command.send_object([
    "AUTH",
    "<YourKey>"
  ]).then((var response) {
    print(response);
  });
  command.send_object(["SET", "key1", "value1"]);
});
...
And for dartis 0.5.0:
...
final client = await redis.Client.connect(
    'redis://SampleName.redis.cache.windows.net:6379');
// Run some commands.
final commands = client.asCommands<String, String>();
await commands.auth("<YourKey>");
// SET key value
await commands.set('yo', 'yovalue');
...

Related

Django Channels Consumer Not Connecting to websocket

I created a WebSocket on the client side with JavaScript, then set up my project to handle WebSocket connections as follows (following the official Django Channels documentation). But each time I refresh the page and watch the WebSocket from the browser console, it fails. I inserted a print statement in the __init__ of the consumer class, and it was printed each time a page containing a WebSocket was visited or refreshed, which means the routing is working fine. But for some reason the consumer is not accepting the connection as expected, and there is no log in the development server about any WebSocket connection attempt. Can anyone help and suggest a fix?
python version: 3.9.9
django version: 3.2.9
channels version: 3.0.4
my settings.py file (relevant lines):
INSTALLED_APPS = [
    ...
    'channels',
]
ASGI_APPLICATION = 'ma_mall.asgi.application'
CHANNEL_LAYERS = {
    "default": {
        "BACKEND": "channels_redis.core.RedisChannelLayer",
        "CONFIG": {
            "hosts": [('127.0.0.1', 6379)],
        },
    },
}
my asgi.py file:
application = ProtocolTypeRouter({
    "http": get_asgi_application(),
    "websocket": URLRouter(
        routing.websocket_urlpatterns
    ),
})
my routing.py file:
websocket_urlpatterns = [
    path("ws/notifications/", consumers.NotificationConsumer.as_asgi()),
]
my consumer file:
class NotificationConsumer(AsyncJsonWebsocketConsumer):
    groups = ['general_group']

    async def connect(self):
        await self.accept()
        await self.channel_layer.group_add('notification_group', self.channel_name)
        await self.channel_layer.group_send('notification_group',
            {
                'type': 'tester.message',
                'tester': 'tester'
            }
        )

    async def disconnect(self, close_code):
        await self.channel_layer.group_discard('notification_group', self.channel_name)
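A side note on the consumer above: Channels dispatches a group_send event to a consumer method whose name is derived from the event's 'type', with dots replaced by underscores, so the {'type': 'tester.message'} event looks for a tester_message method (which this consumer doesn't define). A minimal sketch of that name mapping:

```python
# Channels derives the handler method name from the event "type" by
# replacing dots with underscores, so {'type': 'tester.message'} is
# dispatched to a consumer method named tester_message.
def handler_name(event_type: str) -> str:
    return event_type.replace(".", "_")

print(handler_name("tester.message"))  # tester_message
```

So to actually receive that group message, the consumer would need an async tester_message(self, event) method that forwards the payload to the client.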
I created the WebSocket on the client side using JavaScript as follows:
const notification_websocket = new WebSocket(
    'ws://' +
    window.location.host +
    '/ws/notifications/'
);
notification_websocket.onmessage = function (e) {
    let data = JSON.parse(e.data);
    console.log("Just received this from the back end.. 0", data);
};
notification_websocket.onclose = function (e) {
    console.log("websocket closed");
};
I realized that I had used an older version of Redis (to get around running Redis on Windows), and the issue happened to be coming from the Redis part of the configuration. Once I switched from Redis to the in-memory channel layer:
CHANNEL_LAYERS = {
    "default": {
        "BACKEND": "channels.layers.InMemoryChannelLayer"
    }
}
the WebSocket consumer connected and the connection was kept open as expected. I still hope to find a solution for running a newer Redis version on Windows for development.
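On running a current Redis on Windows for development: one common approach (assuming Docker Desktop or WSL 2 is available) is to run the official Redis image and keep CHANNEL_LAYERS pointed at 127.0.0.1:6379:

```shell
# Run the official Redis image, mapping the default port to the host.
# "redis-dev" is just an illustrative container name.
docker run -d --name redis-dev -p 6379:6379 redis:6
```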

How to connect to Google Cloud SQL (PostgreSQL) from Cloud Functions?

I feel like I've tried everything. I have a cloud function that I am trying to connect to Cloud SQL (PostgreSQL engine). Before I do so, I pull connection string info from Secret Manager, set that up in a credentials object, and use a pg (package) pool to run a database query.
Below is my code:
Credentials:
import { Pool } from 'pg';

const credentials: sqlCredentials = {
    "host": "127.0.0.1",
    "database": "myFirstDatabase",
    "port": "5432",
    "user": "postgres",
    "password": "postgres1!"
};

const pool: Pool = new Pool(credentials);
await pool.query(`select CURRENT_DATE;`).catch(error => console.error(`error in pool.query: ${error}`));
Upon running the cloud function with this code, I get the following error:
error in pool.query: Error: connect ECONNREFUSED 127.0.0.1:5432
I have attempted to update the host to the private IP of the Cloud SQL instance, and also update the host to the Cloud SQL instance name on this environment, but that is to no avail. Any other ideas?
Through much tribulation, I figured out the answer. Given that there is NO documentation on how to solve this, I'm going to put the answer here in hopes that I can come back here in 2025 and see that it has helped hundreds. In fact, I'm setting a reminder in my phone right now to check this URL on November 24, 2025.
Solution: The host must be set as:
/cloudsql/<googleProjectName(notId)>:<region>:<sql instanceName>
Ending code:
import { Pool } from 'pg';

const credentials: sqlCredentials = {
    "host": "/cloudsql/my-first-project-191923:us-east1:my-first-cloudsql-inst",
    "database": "myFirstDatabase",
    "port": "5432",
    "user": "postgres",
    "password": "postgres1!"
};

const pool: Pool = new Pool(credentials);
await pool.query(`select CURRENT_DATE;`).catch(error => console.error(`error in pool.query: ${error}`));
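As a quick sanity check on that host format, the socket path can be assembled from its three parts. cloudSqlHost here is a hypothetical helper for illustration, not part of the pg API:

```javascript
// Build the unix-socket path that Cloud Functions exposes for Cloud SQL:
// /cloudsql/<project>:<region>:<instance>. The values below are the ones
// from the answer above.
function cloudSqlHost(project, region, instance) {
  return `/cloudsql/${project}:${region}:${instance}`;
}

console.log(cloudSqlHost('my-first-project-191923', 'us-east1', 'my-first-cloudsql-inst'));
// -> /cloudsql/my-first-project-191923:us-east1:my-first-cloudsql-inst
```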

UnhandledPromiseRejectionWarning: TypeError: Channel credentials must be a ChannelCredentials object in GCP Batch publishing

I am trying to do batch publishing of messages using node module #google-cloud/pubsub. My batch publishing code looks like below.
const { PubSub } = require("@google-cloud/pubsub");
const grpc = require("grpc");

const createPublishEventsInBatch = (fastify, topic) => {
    const pubSub = new PubSub({ grpc });
    const batchPublisher = pubSub.topic(topic, {
        batching: {
            maxMessages: 100,
            maxMilliseconds: 1000
        }
    });
    return (logTrace, data, eventInfo, version) => {
        const { entityType, eventType } = eventInfo;
        fastify.log.debug({
            logTrace,
            eventType: eventType,
            data,
            message: `Publishing batch events for ${entityType}`
        });
        const event = createEvent(data, entityType, eventType, logTrace, version);
        batchPublisher.publish(Buffer.from(JSON.stringify(event)));
        fastify.log.debug({
            traceHeaders: logTrace,
            tenant: data.tenant,
            message: "Event publish completed",
            data
        });
    };
};
The Pub/Sub and gRPC versions are as follows:
"@google-cloud/pubsub": "^2.18.1",
"grpc": "^1.24.11"
When I publish a message with the above code, I get the following error:
(node:6) UnhandledPromiseRejectionWarning: TypeError: Channel credentials must be a ChannelCredentials object
    at new ChannelImplementation (/app/node_modules/@grpc/grpc-js/build/src/channel.js:75:19)
    at new Client (/app/node_modules/@grpc/grpc-js/build/src/client.js:61:36)
    at new ServiceClientImpl (/app/node_modules/@grpc/grpc-js/build/src/make-client.js:58:5)
    at GrpcClient.createStub (/app/node_modules/google-gax/build/src/grpc.js:334:22)
    at processTicksAndRejections (internal/process/task_queues.js:95:5)
I am seeing this issue only in my production environment; in staging and all my lower environments this works fine. Can somebody please guide me on how to fix this issue?
Not in regard to the exception, but I wanted to mention that you'd generally want to do this once and then cache the result:
const pubSub = new PubSub({ grpc });
const batchPublisher = pubSub.topic(topic, {
This lets you avoid a lot of init overhead, possibly some memory leaks (from proto parsing), and lets you keep a single publishing queue (and batching) for all requests.
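A sketch of that caching suggestion, since the surrounding wiring isn't shown: keep the batching publisher per topic at module scope instead of rebuilding it on every call. getPublisher and the publishers Map are hypothetical names, not part of the @google-cloud/pubsub API:

```javascript
// Module-level cache of batching publishers, keyed by topic name, so the
// publisher (and its single batching queue) is created once and reused
// across all requests.
const publishers = new Map();

function getPublisher(pubSub, topic) {
  if (!publishers.has(topic)) {
    publishers.set(topic, pubSub.topic(topic, {
      batching: { maxMessages: 100, maxMilliseconds: 1000 }
    }));
  }
  return publishers.get(topic);
}
```

The PubSub client itself would likewise be created once at module load and passed in, rather than constructed inside the request handler.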

Write an Azure Function with JavaScript that connects to Oracle DB and runs on Docker

I have an Azure Function written with Visual Studio Code; it is a Node.js application with JavaScript code.
The application connects to an Oracle DB to run an Oracle script, and it runs in a Docker image.
I added the npm packages for the Oracle connection:
npm i express
npm i oracledb
Below are some key parts of the code:
Dockerfile
Dockerfile
FROM mcr.microsoft.com/azure-functions/node:3.0

ENV AzureWebJobsScriptRoot=/home/site/wwwroot \
    AzureFunctionsJobHost__Logging__Console__IsEnabled=true

COPY . /home/site/wwwroot
RUN cd /home/site/wwwroot && \
    npm install
index.js
module.exports = async function (context, req) {
    let responseMessage = "";
    let connection;
    try {
        const oracledb = require('oracledb');
        connection = await oracledb.getConnection({
            user: "xx",
            password: "xx",
            connectString: req.body
        });
        let query = 'select * from xx where rownum=1';
        let result = await connection.execute(query);
        responseMessage = result;
    } catch (err) {
        responseMessage = err.message;
    } finally {
        if (connection) {
            try {
                // Always close connections
                await connection.close();
            } catch (err) {
                responseMessage = err.message;
            }
        }
    }
    context.res = {
        body: responseMessage
    };
}
Here is my project's folder structure:
CASE 1: When I run the project with "func start", the application works properly and returns the result.
CASE 2: When I run it in my local Docker with the steps below, I get an error in the HTTP response.
Run "docker build ."
docker run -d -p 99:80 myimage
It is listed in the "docker ps" list.
When I call the endpoint "http://localhost:99/api/HttpExample", I get this error:
DPI-1047: Cannot locate a 64-bit Oracle Client library: "libclntsh.so: cannot open shared object file: No such file or directory". See https://oracle.github.io/node-oracledb/INSTALL.html for help
Node-oracledb installation instructions: https://oracle.github.io/node-oracledb/INSTALL.html
You must have 64-bit Oracle client libraries in LD_LIBRARY_PATH, or configured with ldconfig.
If you do not have Oracle Database on this computer, then install the Instant Client Basic or Basic Light package from
http://www.oracle.com/technetwork/topics/linuxx86-64soft-092277.html
I searched some documentation but couldn't find a solution specific to an Azure Functions project, because my Dockerfile has to be based on "FROM mcr.microsoft.com/azure-functions/node:3.0".
What should the Dockerfile for this project look like?
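The DPI-1047 error above means node-oracledb cannot find the Oracle client libraries inside the container, so the usual fix is to install the Instant Client in the image. A sketch only, not a verified build: the download URL, Instant Client package choice, and libaio1 package name are assumptions to check against Oracle's install instructions for the base image's Linux distribution:

```dockerfile
FROM mcr.microsoft.com/azure-functions/node:3.0

# Install Oracle Instant Client Basic Light so node-oracledb can find
# libclntsh.so (the library named in the DPI-1047 error), then register
# its directory with the dynamic linker via ldconfig.
RUN apt-get update && apt-get install -y libaio1 unzip curl && \
    mkdir -p /opt/oracle && \
    curl -o /tmp/ic.zip https://download.oracle.com/otn_software/linux/instantclient/instantclient-basiclite-linuxx64.zip && \
    unzip /tmp/ic.zip -d /opt/oracle && rm /tmp/ic.zip && \
    echo /opt/oracle/instantclient_* > /etc/ld.so.conf.d/oracle-instantclient.conf && \
    ldconfig

ENV AzureWebJobsScriptRoot=/home/site/wwwroot \
    AzureFunctionsJobHost__Logging__Console__IsEnabled=true

COPY . /home/site/wwwroot
RUN cd /home/site/wwwroot && \
    npm install
```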

Dart await keyword

I want to try the rpc package for Dart using the io sample (https://github.com/dart-lang/rpc).
I'm on the 64-bit version of Dart Editor with the 1.9.1 SDK (I cannot update beyond the stable version).
This is my pubspec.yaml:
name: myDartRestServer
version: 0.0.1
description: A minimal command-line application.
#author: <your name> <email@example.com>
#homepage: https://www.example.com
environment:
  sdk: '1.9.1'
dependencies:
  rpc: any
dev_dependencies:
  unittest: any
But when I copy the sample to launch my server:
final ApiServer _apiServer = new ApiServer('/api/', prettyPrint: true);

void start() {
  _apiServer.addApi(new Synchro());
  HttpServer server = await HttpServer.bind(InternetAddress.ANY_IP_V4, 9090);
  server.listen(_apiServer.httpRequestHandler);
  _apiServer.enableDiscoveryApi("http://localhost:9090");
}
my IDE (SDK) doesn't recognize the await keyword. (I import dart:io, dart:async, and the rpc package in my library file.)
Have I missed something? Thanks in advance for your responses. Have a nice day.
You need to mark the function as async in order to be able to use await:
void start() async {
  ....
}
Try it at DartPad. Without async, await is just a valid identifier, which is why the analyzer doesn't treat it as a keyword.