Kafkajs - get statistics (lag) - node.js

In our NestJS application we use the kafkajs client for Kafka.
We need a way to monitor statistics.
One of the metrics is lag.
I'm trying to figure out whether kafkajs provides any, but haven't found anything interesting. (The most interesting fields in the payload are: timestamp, offset, batchContext.firstOffset, batchContext.firstTimestamp, batchContext.maxTimestamp.)
Questions
Are there any ideas on how to log the lag value and other statistics provided by kafkajs?
Should I think about implementing my own statistics monitor to collect the required information in a Node application that uses the kafkajs client?
New Details 1
Following the documentation, I can get batch.highWatermark, where
batch.highWatermark is the last committed offset within the topic partition. It can be useful for calculating lag.
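Given that definition, a rough per-partition lag can be computed inside an eachBatch handler. A minimal sketch, assuming (based on the sample output further down, where messages go up to offset 146 and highWatermark is 147) that highWatermark is the offset of the next message to be written, so the last existing offset is highWatermark - 1:

```javascript
// Rough lag estimate for a batch: distance between the last message we
// consumed and the last offset that exists in the partition.
// Note: kafkajs reports offsets as strings, so convert before subtracting.
function batchLag(batch) {
  const messages = batch.messages;
  const lastConsumed = Number(messages[messages.length - 1].offset);
  return Number(batch.highWatermark) - 1 - lastConsumed;
}

// e.g. inside eachBatch: console.log('lag:', batchLag(data.batch))
```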
Trying
await consumer.run({
  eachBatchAutoResolve: true,
  eachBatch: async (data) => {
    console.log('Received data.batch.messages: ', data.batch.messages)
    console.log('Received data.batch.highWatermark: ', data.batch.highWatermark)
  },
})
I get output like the following:
Received data.batch.messages:  [
  {
    magicByte: 2,
    attributes: 0,
    timestamp: '1628877419958',
    offset: '144',
    key: null,
    value: <Buffer 68 65 6c 6c 6f 21>,
    headers: {},
    isControlRecord: false,
    batchContext: {
      firstOffset: '144',
      firstTimestamp: '1628877419958',
      partitionLeaderEpoch: 0,
      inTransaction: false,
      isControlBatch: false,
      lastOffsetDelta: 2,
      producerId: '-1',
      producerEpoch: 0,
      firstSequence: 0,
      maxTimestamp: '1628877419958',
      timestampType: 0,
      magicByte: 2
    }
  },
  {
    magicByte: 2,
    attributes: 0,
    timestamp: '1628877419958',
    offset: '145',
    key: null,
    value: <Buffer 6f 74 68 65 72 20 6d 65 73 73 61 67 65>,
    headers: {},
    isControlRecord: false,
    batchContext: {
      firstOffset: '144',
      firstTimestamp: '1628877419958',
      partitionLeaderEpoch: 0,
      inTransaction: false,
      isControlBatch: false,
      lastOffsetDelta: 2,
      producerId: '-1',
      producerEpoch: 0,
      firstSequence: 0,
      maxTimestamp: '1628877419958',
      timestampType: 0,
      magicByte: 2
    }
  },
  {
    magicByte: 2,
    attributes: 0,
    timestamp: '1628877419958',
    offset: '146',
    key: null,
    value: <Buffer 6d 6f 72 65 20 6d 65 73 73 61 67 65 73>,
    headers: {},
    isControlRecord: false,
    batchContext: {
      firstOffset: '144',
      firstTimestamp: '1628877419958',
      partitionLeaderEpoch: 0,
      inTransaction: false,
      isControlBatch: false,
      lastOffsetDelta: 2,
      producerId: '-1',
      producerEpoch: 0,
      firstSequence: 0,
      maxTimestamp: '1628877419958',
      timestampType: 0,
      magicByte: 2
    }
  }
]
Received data.batch.highWatermark: 147
Any ideas on how to use batch.highWatermark in lag calculation then?

Looks like the only way to get the offset lag metric is by using instrumentation events:
consumer.on(consumer.events.END_BATCH_PROCESS, (payload) =>
  console.log(payload.offsetLagLow),
);
offsetLagLow measures the offset delta between the first message in the batch and the last offset in the partition (highWatermark). You can also use offsetLag, but it is based on the last offset of the batch.
As @Sergii mentioned, some props are available directly when you are using eachBatch (here are all the available methods on the batch prop). But you won't get those props if you are using eachMessage, so instrumentation events are the most universal approach.
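If you want more than a one-off log line, the event payloads can feed a small in-memory tracker. A sketch under the assumption that the END_BATCH_PROCESS payload carries topic, partition, and offsetLag fields (the wiring line below is illustrative, not shown running):

```javascript
// Per-partition lag tracker fed by kafkajs instrumentation events.
// Assumed wiring:
//   consumer.on(consumer.events.END_BATCH_PROCESS, (e) => tracker.record(e.payload));
class LagTracker {
  constructor() {
    this.lags = new Map(); // "topic:partition" -> latest reported offsetLag
  }
  record({ topic, partition, offsetLag }) {
    // keep only the most recent lag per topic-partition
    this.lags.set(`${topic}:${partition}`, Number(offsetLag));
  }
  totalLag() {
    let total = 0;
    for (const lag of this.lags.values()) total += lag;
    return total;
  }
}
```

The tracker can then be scraped periodically (e.g. by a metrics endpoint) instead of logging on every batch.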


Unable to get screenshot of window through winapi

Using the Node.js ffi-napi package, I'm attempting to get a buffer of bitmap data from a screenshot of a given window, or of the desktop if no window handle is supplied. To that end, I'm trying to port the C++ example from the Microsoft documentation, making the API calls through ffi-napi.
Even though all the API calls come back without errors, I only end up with a buffer filled with 0s. I've traced it down to a few potential pieces that might be causing it, but I don't know which piece is incorrect or how to fix it.
My call to BitBlt returns true, and GetPixel on my source window DC returns correct values, but when I call GetPixel on my memory DC, I get 0s.
My call to GetObjectA seems to populate the BITMAP struct I created (my structs are just extensions of buffers), except the last 8 bytes, which are supposed to hold the pointer to the bitmap data, are all 0s.
My call to GetDIBits returns 1080, which is the correct number of rows it should have read from the bitmap data, but the buffer I get back is all 0s.
My guess is that BitBlt didn't actually copy the desktop DC to the memory DC even though it returns true, and I don't know why that is. I've also tried running the Calculator app and passing its DC to the screenshot function; even though the calls return the appropriate values, the buffer I get back is still all 0s.
Any help would be greatly appreciated!
This is my node.js code
function screenshot(hWnd = null) {
  let hdcWindow = null;
  let hdcMemDC = null;
  let hbmScreen = null;
  let hDIB = null;
  try {
    if (!hWnd) hWnd = user32.GetDesktopWindow();
    console.log('hWnd', hWnd);
    // Retrieve the handle to a display device context for the client area of the window.
    hdcWindow = user32.GetDC(hWnd);
    console.log('hdcWindow', hdcWindow);
    const rcClient = new win32_structs.RECT();
    user32.GetClientRect(hWnd, rcClient);
    console.log('rcClient', rcClient);
    // Create a compatible DC and bitmap
    hdcMemDC = gdi32.CreateCompatibleDC(hdcWindow);
    console.log('hdcMemDC', hdcMemDC);
    hbmScreen = gdi32.CreateCompatibleBitmap(hdcWindow, rcClient.right - rcClient.left, rcClient.bottom - rcClient.top);
    console.log('hbmScreen', hbmScreen);
    const hPrevDC = gdi32.SelectObject(hdcMemDC, hbmScreen);
    console.log('hPrevDC', hPrevDC);
    // Bit block transfer into our compatible memory DC.
    const bitBltRes = gdi32.BitBlt(hdcMemDC, 0, 0, rcClient.right - rcClient.left, rcClient.bottom - rcClient.top, hdcWindow, 0, 0, apiConstants.SRCCOPY);
    const pixelWnd = gdi32.GetPixel(hdcWindow, 0, 0);
    const pixelMem = gdi32.GetPixel(hdcMemDC, 0, 0);
    console.log('pixelWnd', pixelWnd);
    console.log('pixelMem', pixelMem);
    console.log('bitBltRes', bitBltRes);
    // Get the BITMAP from the HBITMAP
    const bmpScreen = new win32_structs.BITMAP();
    const getObjectRes = gdi32.GetObjectA(hbmScreen, bmpScreen.length, bmpScreen);
    console.log('getObjectRes', getObjectRes);
    console.log('bmpScreen.length', bmpScreen.length);
    console.log('bmpScreen', bmpScreen);
    const bi = new win32_structs.BITMAPINFOHEADER();
    bi.biSize = bi.length;
    bi.biWidth = bmpScreen.bmWidth;
    bi.biHeight = bmpScreen.bmHeight;
    bi.biPlanes = 1;
    bi.biBitCount = 32;
    bi.biCompression = apiConstants.BI_RGB;
    bi.biSizeImage = 0;
    bi.biXPelsPerMeter = 0;
    bi.biYPelsPerMeter = 0;
    bi.biClrUsed = 0;
    bi.biClrImportant = 0;
    console.log('bi', bi);
    const dwBmpSize = ((bmpScreen.bmWidth * bi.biBitCount + 31) / 32) * 4 * bmpScreen.bmHeight;
    console.log('dwBmpSize', dwBmpSize);
    // Starting with 32-bit Windows, GlobalAlloc and LocalAlloc are implemented as wrapper functions that
    // call HeapAlloc using a handle to the process's default heap. Therefore, GlobalAlloc and LocalAlloc
    // have greater overhead than HeapAlloc.
    // hDIB = kernel32.GlobalAlloc(apiConstants.GHND, dwBmpSize);
    // const lpBitmap = kernel32.GlobalLock(hDIB);
    const lpBitmap = Buffer.alloc(dwBmpSize);
    // Gets the "bits" from the bitmap and copies them into buffer lpBitmap
    const getDIBitsRes = gdi32.GetDIBits(hdcWindow, hbmScreen, 0, bmpScreen.bmHeight, lpBitmap, bi, apiConstants.DIB_RGB_COLORS);
    console.log('getDIBitsRes', getDIBitsRes);
    console.log('lpBitmap', lpBitmap);
    for (const c of lpBitmap) {
      if (c > 0) {
        console.log(c);
        break;
      }
    }
    // clean up
    if (hDIB != null) {
      kernel32.GlobalUnlock(hDIB);
      kernel32.GlobalFree(hDIB);
    }
    if (hbmScreen != null) gdi32.DeleteObject(hbmScreen);
    if (hdcMemDC != null) gdi32.DeleteObject(hdcMemDC);
    if (hdcWindow != null) user32.ReleaseDC(hWnd, hdcWindow);
    return lpBitmap;
  } catch (err) {
    // clean up memory on errors
    if (hDIB != null) {
      kernel32.GlobalUnlock(hDIB);
      kernel32.GlobalFree(hDIB);
    }
    if (hbmScreen != null) gdi32.DeleteObject(hbmScreen);
    if (hdcMemDC != null) gdi32.DeleteObject(hdcMemDC);
    if (hdcWindow != null) user32.ReleaseDC(hWnd, hdcWindow);
    throw err;
  }
}
Here's my console log:
hWnd 65552
hdcWindow 83954845
rcClient <Buffer#0x000001BE188E8670 00 00 00 00 00 00 00 00 80 07 00 00 38 04 00 00, _structProps: { left: { offset: 0, dataType: 'long' }, top: { offset: 4, dataType: 'long' }, right: { offset: 8, dataType: 'long'
}, bottom: { offset: 12, dataType: 'long' } }, left: 0, top: 0, right: 1920, bottom: 1080>
hdcMemDC 1375804638
hbmScreen 990189696
hPrevDC 8716303
pixelWnd { r: 231, g: 234, b: 237 }
pixelMem { r: 0, g: 0, b: 0 }
bitBltRes true
getObjectRes 32
bmpScreen.length 32
bmpScreen <Buffer#0x000001BE188C5CC0 00 00 00 00 80 07 00 00 38 04 00 00 00 1e 00 00 01 00 20 00 00 00 00 00 00 00 00 00 00 00 00 00,
_structProps: { bmType: { offset: 0, dataType: 'long' }, bmWidth: { offset: 4, dataType: 'long' }, bmHeight: { offset: 8, dataType: 'long' }, bmWidthBytes: { offset: 12, dataType: 'long' }, bmPlanes: { offset: 16, dataType: 'uint' }, bmBitsPixel: { offset: 18, dataType: 'uint' }, pointerPadding: { offset: 20, dataType: 'long' }, bmBits: { offset: 24, dataType: 'ulonglong' } },
bmType: 0, bmWidth: 1920, bmHeight: 1080, bmWidthBytes: 7680, bmPlanes: 1, bmBitsPixel: 32, pointerPadding: 0, bmBits: 0n>
bi <Buffer#0x000001BE188C5540 28 00 00 00 80 07 00 00 38 04 00 00 01 00 20 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00,
_structProps: { biSize: { offset: 0, dataType: 'ulong' }, biWidth: { offset: 4, dataType: 'long' }, biHeight: { offset: 8, dataType: 'long' }, biPlanes: { offset: 12, dataType: 'uint' }, biBitCount: { offset: 14, dataType: 'uint' }, biCompression: { offset: 16, dataType: 'ulong' }, biSizeImage: { offset: 20, dataType: 'ulong' }, biXPelsPerMeter: { offset: 24, dataType: 'long' }, biYPelsPerMeter: { offset: 28, dataType: 'long' }, biClrUsed: { offset: 32, dataType: 'ulong' }, biClrImportant: { offset: 36, dataType: 'ulong' } },
biSize: 40, biWidth: 1920, biHeight: 1080, biPlanes: 1, biBitCount: 32, biCompression: 0, biSizeImage: 0,
biXPelsPerMeter: 0, biYPelsPerMeter: 0, biClrUsed: 0, biClrImportant: 0>
dwBmpSize 8298585
getDIBitsRes 1080
lpBitmap <Buffer#0x000001BE18AED040 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 ... 8298535 more bytes>
The OP mentioned they were able to get their code working in the end, with a simple fix to one of the constants. However, that doesn't help other readers much, because the full working code isn't shown; so I thought I would try to reproduce the working code that they had. (Thanks to @vincitego for the starting point!)
After a whole lot of trial and error, I was able to get my reproduction working, and have published the solution as a component in my Windows FFI library here: https://github.com/Venryx/windows-ffi
Usage:
import {VRect, CaptureScreenshot, GetForegroundWindowHandle} from "windows-ffi";

// First capture a screenshot of a section of the screen.
const screenshot = CaptureScreenshot({
  windowHandle: GetForegroundWindowHandle(), // comment this line out to capture the whole screen
  rectToCapture: new VRect(0, 0, 800, 600),
});

// The image data is now stored in the `screenshot.buffer` Buffer object.
// Access it directly (and cheaply) using the helper functions on `screenshot`.
for (let x = 0; x < 800; x++) {
  console.log(`Pixel color at [${x}, 0] is:`, screenshot.GetPixel(x, 0).ToHex_RGB());
}
You can use the library as packaged or, if you prefer, just reference its source code and extract the parts you need. The main module for screenshot capturing can be seen here.
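For readers who want to work with the raw buffer that GetDIBits fills (rather than a library's helpers), the pixel layout follows from the header the OP built: BI_RGB, 32 bits per pixel, and a positive biHeight, which means rows are stored bottom-up and each pixel is B, G, R, X. A minimal sketch of reading one pixel out of such a buffer (helper name is mine):

```javascript
// Read pixel (x, y) from a raw 32-bpp bottom-up DIB buffer.
// Assumes BI_RGB compression and positive biHeight (bottom-up row order),
// with each pixel stored as 4 bytes: Blue, Green, Red, unused.
function getPixelFromDIB(buf, width, height, x, y) {
  const row = height - 1 - y;        // bottom-up: image row 0 is stored last
  const i = (row * width + x) * 4;   // 4 bytes per pixel
  return { r: buf[i + 2], g: buf[i + 1], b: buf[i] };
}
```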

Error in Watson Classifier API reference nodejs example: source.on is not a function

I'm trying to use Watson Classifier from Node.js. I started by implementing the example in the API reference, found at https://www.ibm.com/watson/developercloud/natural-language-classifier/api/v1/node.html?node#create-classifier
My code (sensitive information replaced with stars):
58   create: function(args, cb) {
59     var params = {
60       metadata: {
61         language: 'en',
62         name: '*********************'
63       },
64       training_data: fs.createReadStream(config.data.prepared.training)
65     };
66
67     params.training_data.on("readable", function () {
68       nlc.createClassifier(params, function(err, response) {
69         if (err)
70           return cb(err);
71         console.log(JSON.stringify(response, null, 2));
72         cb();
73       });
74     });
75   },
The file I am trying to make a stream from exists. The stream works (I've managed to read from it on "readable"). I've placed the on("readable") part because it made sense for me to do all of this once the stream becomes available, and also because I wanted to be able to check that I can read from it. It does not change the outcome, however.
nlc is the natural_language_classifier instance.
I'm getting this:
octav#****************:~/watsonnlu$ node nlc.js create
/home/octav/watsonnlu/node_modules/delayed-stream/lib/delayed_stream.js:33
source.on('error', function() {});
^
TypeError: source.on is not a function
at Function.DelayedStream.create (/home/octav/watsonnlu/node_modules/delayed-stream/lib/delayed_stream.js:33:10)
at FormData.CombinedStream.append (/home/octav/watsonnlu/node_modules/combined-stream/lib/combined_stream.js:44:37)
at FormData.append (/home/octav/watsonnlu/node_modules/form-data/lib/form_data.js:74:3)
at appendFormValue (/home/octav/watsonnlu/node_modules/request/request.js:321:21)
at Request.init (/home/octav/watsonnlu/node_modules/request/request.js:334:11)
at new Request (/home/octav/watsonnlu/node_modules/request/request.js:128:8)
at request (/home/octav/watsonnlu/node_modules/request/index.js:53:10)
at Object.createRequest (/home/octav/watsonnlu/node_modules/watson-developer-cloud/lib/requestwrapper.js:208:12)
at NaturalLanguageClassifierV1.createClassifier (/home/octav/watsonnlu/node_modules/watson-developer-cloud/natural-language-classifier/v1-generated.js:143:33)
at ReadStream.<anonymous> (/home/octav/watsonnlu/nlc.js:68:8)
I tried debugging it myself for a while, but I'm not sure what this source is actually supposed to be. It's just an object composed of the metadata I put in and the "emit" function if I print it before the offending line in delayed-stream.js.
{ language: 'en',
name: '*******************',
emit: [Function] }
This is my package.json file:
1 {
2   "name": "watsonnlu",
3   "version": "0.0.1",
4   "dependencies": {
5     "csv-parse": "2.0.0",
6     "watson-developer-cloud": "3.2.1"
7   }
8 }
Any ideas how to make the example work?
Cheers!
Octav
I got the answer in the meantime, thanks to the good people at IBM. It seems you have to send the metadata as stringified JSON:
59     var params = {
60       metadata: JSON.stringify({
61         language: 'en',
62         name: '*********************'
63       }),
64       training_data: fs.createReadStream(config.data.prepared.training)
65     };
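The error makes sense in hindsight: the SDK hands metadata to form-data, which can only append strings, Buffers, or streams; a plain object contributes only the emit stub printed above, hence "source.on is not a function". A hypothetical guard (the helper name is mine, not part of the SDK) that makes the mistake impossible:

```javascript
// Hypothetical helper (not part of the Watson SDK): make sure metadata
// reaches the classifier as a JSON string, since form-data cannot append
// plain objects, while training_data may stay a stream.
function prepareParams(metadata, trainingDataStream) {
  return {
    metadata: typeof metadata === 'string' ? metadata : JSON.stringify(metadata),
    training_data: trainingDataStream,
  };
}
```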

Showing image from MongoDB that is stored as Buffer

So I am storing an image like this:
router.post('/', upload.single('pic'), (req, res) => {
  var newImg = fs.readFileSync(req.file.path);
  var encImg = newImg.toString('base64');
  var s = new Buffer(encImg, 'base64');
  var newCar = {
    picture: s,
    contentType: req.file.mimetype,
    link: req.body.link
  };
});
Now the data looks like this:
{
  _id: 5a502869eb1eb10cc4449335,
  picture: Binary {
    _bsontype: 'Binary',
    sub_type: 0,
    position: 1230326,
    buffer: <Buffer 89 50 4e 47 0d 0a 1a 0a 00 00 00 0d 49 48 44 52 00 00 05 00 00 00 03 1e 08 06 00 ... >
  },
  contentType: 'image/png',
  link: 'fds',
  __v: 0
}
I want to show this picture on frontend, like this:
<img src="data:image/png;base64, iVBORw0KGgoAAAANSUhEUgAAAAUA
AAAFCAYAAACNbyblAAAAHElEQVQI12P4//8/w38GIAXDIBKE0DHxgljNBAAO
9TXL0Y4OHwAAAABJRU5ErkJggg==" alt="Red dot" />
In my case, this code will be:
<img src="data:<%= c.contentType %>;base64, <%= c.picture %>" />
And all I am getting is some weird symbols:
I think I tried almost everything and still can't figure out what this is. Even when I convert that Buffer with toString('ascii'), I still get symbols (boxes) that can't be recognized.
What am I supposed to do?
P.S. Also, is this a good way to store images (less than 16MB)? I've noticed it's kind of slow, because of all that string conversion and file reading, compared to just storing the image as a file.
HTML
<img [src]="'data:image/jpg;base64,'+Logo.data" height="50" width="60" alt="Red dot" />
Data from database:
"Logo" : {
  "data" : BinData(0,"/9j/4AAQSkZJRgABAQEAYABgAAD/"),
  "name" : "dp.jpg",
  "encoding" : "7bit",
  "mimetype" : "image/jpeg",
  "truncated" : false,
  "size" : 895082
}
Hope it helps.
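Whatever template syntax you use, the safe approach is to build the data URI server-side from the stored bytes instead of interpolating the raw Buffer into the page (which is what produces the weird symbols). A minimal sketch; with Mongoose the stored value may come back either as a Buffer or as a BSON Binary whose bytes sit under `.buffer`, so the helper accepts both:

```javascript
// Build a data URI from a stored image, accepting either a plain Buffer
// or a BSON Binary-like object ({ buffer: <Buffer ...> }).
function toDataUri(contentType, stored) {
  const buf = Buffer.isBuffer(stored) ? stored : stored.buffer;
  return `data:${contentType};base64,${buf.toString('base64')}`;
}

// e.g. in EJS: <img src="<%= toDataUri(c.contentType, c.picture) %>" />
```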

Get username from keycloak session in NodeJS

Is there something similar to:
request.getUserPrincipal().getName() // Java
in Node, to get the username when we are using connect-keycloak with the Express middleware?
I also ran into this issue.
I dove into the middleware code and tried to find something similar. It turns out that the request object is modified: kauth.grant is appended to it.
console.log('req.kauth.grant') prints out:
{
  access_token: {
    token: 'kasdgfksj333',
    clientId: 'mobile',
    header: {
      alg: 'RS256'
    },
    content: {
      jti: '33389eb6-3611-4de2-b913-add9283c3de0',
      exp: 1464883174,
      nbf: 0,
      iat: 1464882874,
      iss: 'http://docker:9090/auth/realms/test',
      aud: 'test-client',
      sub: '333604a0-b527-4afb-a04e-5e4ebf06ce9c',
      typ: 'Bearer',
      azp: 'test-client',
      session_state: '1cd35952-8e42-44f1-ad15-aaf9964bfefa',
      client_session: '943f1213-f556-4021-bbc6-2355146ab955',
      'allowed-origins': [],
      resource_access: [Object],
      name: 'Test User',
      preferred_username: 'user',
      given_name: 'Test',
      family_name: 'User',
      email: 'foo@bar.com'
    },
    signature: <Buffer 45 1b 3d d7 4f f9 d1 63 44 ad a9 ca b8 c4 67 88 ba e9 5d 64 8d a0 a9 75 a1 79 cf 18 52 d5 f7 f0 08 71 1d 79 bd 59 e9 5a f8 25 72 dd e5 06 71 4f b7 f1 47 ... >,
    signed: 'eyJhbGcfOiJSUzf1NiJ9.eyJqdGkiOsJmYmY4OWViwi0zNjExLTrkZTItYjkxMy1hZGQ5MjgzYzNkZTAiLCJleHAiOjE0NjQ4ODMxNzQsIm5iZiI6MCwiaWF0IjoxNDY0ODgyODc0LCJpc3MiOiJodHRwOi8vZG9ja2VyaG9zdDo5MDgwL2F1dGgvcmVhbG1zL3JoY2FycyIsImF1ZCI6InJoY2Fycy12ZWhpY2xlLW93bmVyLWlvcyIsInN1YiI6IjkxMjYwNGEwLWI1MjctNGFmYi1hMDRlLTVlNGViZjA2Y2U5YyIsInR5cCI6IkJlYXJlciIsImF6cCI6InJoY2Fycy12ZWhpY2xlLW93bmVyLWlvcyIsInNlc3Npb25fc3RhdGUiOiIxY2QzNTk1Mi04ZTQyLTQ0ZjEtYWQxNS1hYWY5OTY0YmZlZmEiLCJjbGllbnRfc2Vzc2lvbiI6Ijk0M2YxMjEzLWY1NTYtNDAyMS1iYmM2LTIzNTUxNDZhYjk1NSIsImFsbG93ZWQtb3JpZ2lucyI6W10sInJlc291cmNlX2FjY2VzcyI6eyJhY2NvdW50Ijp7InJvbGVzIjpbIm1hbmFnZS1hY2NvdW50Iiwidmlldy1wcm9maWxlIl19fSwibmFtZSI6IlRlc3QgVXNlciIsInByZWZlcnJlZF91c2VybmFtZSI6IjEyMzEyMyIsImdpdmVuX25hbWUiOiJUZXN0IiwiZmFtaWx5X25hbWUiOiJVc2VyIiwiZW1haWwiOiJmb29iYXJ1c2VyQGFyY29uc2lzLmNvbSJ9'
  },
  refresh_token: undefined,
  id_token: undefined,
  token_type: undefined,
  expires_in: undefined,
  __raw: '{"access_token":"eyJhbGciOiJSUzI3NiJ2.eyJqdGki4iJmYmY4OWriNi0zNjExLTRkZTItYjkxMy1hZGQ5MjgzYzNkZTAiLCJleHAiOjE0NjQ4ODMxNzQsIm5iZiI6MCwiaWF0IjoxNDY0ODgyODc0LCJpc3MiOiJodHRwOi8vZG9ja2VyaG9zdDo5MDgwL2F1dGgvcmVhbG1zL3JoY2FycyIsImF1ZCI6InJoY2Fycy12ZWhpY2xlLW93bmVyLWlvcyIsInN1YiI6IjkxMjYwNGEwLWI1MjctNGFmYi1hMDRlLTVlNGViZjA2Y2U5YyIsInR5cCI6IkJlYXJlciIsImF6cCI6InJoY2Fycy12ZWhpY2xlLW93bmVyLWlvcyIsInNlc3Npb25fc3RhdGUiOiIxY2QzNTk1Mi04ZTQyLTQ0ZjEtYWQxNS1hYWY5OTY0YmZlZmEiLCJjbGllbnRfc2Vzc2lvbiI6Ijk0M2YxMjEzLWY1NTYtNDAyMS1iYmM2LTIzNTUxNDZhYjk1NSIsImFsbG93ZWQtb3JpZ2lucyI6W10sInJlc291cmNlX2FjY2VzcyI6eyJhY2NvdW50Ijp7InJvbGVzIjpbIm1hbmFnZS1hY2NvdW50Iiwidmlldy1wcm9maWxlIl19fSwibmFtZSI6IlRlc3QgVXNlciIsInByZWZlcnJlZF91c2VybmFtZSI6IjEyMzEyMyIsImdpdmVuX25hbWUiOiJUZXN0IiwiZmFtaWx5X25hbWUiOiJVc2VyIiwiZW1haWwiOiJmb29iYXJ1c2VyQGFyY29uc2lzLmNvbSJ9.RRs910_50WNEranKuMRniLrpXWSNoKl1oXnPGFLV9_AIcR15vVnpWvglct3lBnFPt_FH6QPJTmp7i-8mRTIDoIL8jtmEtJ8VfE2ZYX5WN3RlxPFQc5kCOZUQiV55eZALOCSTpm2HIw1eLhBVs4Is8RMJoWy8xj3k4pkOqqll8NY__TJdTG7Iihj0lReblyaW34OpSxkAYoqYaayox0H_7UbnpSAIL0BqBL41lDPH4mXouUX3i0fFbLOt_MnAtPrdFYTez7OVmKhZx7gavdQEkHEGK8thgagnCrycejUqTO0YUeOsasQ2NK9KLPBIEA0eX_p2l2yDYhlJR15stQ3AHA"}',
  store: [Function],
  unstore: [Function]
}
For sure, this is not developer-friendly, but you can access the username via
req.kauth.grant.access_token.content.preferred_username. That results in user.
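To avoid sprinkling that long property chain everywhere (and crashing on requests that carry no grant), it can be wrapped in a small helper. A sketch; the function name is my own, only the req.kauth.grant shape comes from the dump above:

```javascript
// Defensive read of the preferred_username claim from the parsed Keycloak
// grant. Returns null when the request has no grant (e.g. unauthenticated).
function getUsername(req) {
  const grant = req.kauth && req.kauth.grant;
  return grant ? grant.access_token.content.preferred_username : null;
}
```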
I will report this as an issue to the main contributor.
(Github Repo of keycloak middleware https://github.com/keycloak/keycloak-nodejs-connect)
UPDATE
The main contributors of the Keycloak project just answered me. If you find any additional issues, address them here:
https://issues.jboss.org/projects/KEYCLOAK
For the node.js adapter:
https://issues.jboss.org/browse/KEYCLOAK-2833?jql=project%20%3D%20KEYCLOAK%20AND%20component%20%3D%20%22Adapter%20-%20Node.js%22
UPDATE 2: March 15 2021
Reporting issues for the Keycloak middleware requires a Red Hat user account now. Since this thread still seems to be active and I am not into that topic any longer (so much time has passed), I can only suggest setting up an account and reporting bugs there.
https://issues.jboss.org/projects/KEYCLOAK
Hope I could help.
Cheers
Orlando
🍻

How to replicate a curl command with the nodejs request module?

How can I replicate this curl request:
$ curl "https://s3-external-1.amazonaws.com/herokusources/..." \
  -X PUT -H 'Content-Type:' --data-binary @temp/archive.tar.gz
With the node request module?
I need to do this to PUT a file up on AWS S3 and to match the signature provided by Heroku in the put_url from Heroku's sources endpoint API output.
I have tried this (where source is the Heroku sources endpoint API output):
// PUT tarball
function(source, cb){
  putUrl = source.source_blob.put_url;
  urlObj = url.parse(putUrl);
  var options = {
    headers: {},
    method : 'PUT',
    url : urlObj
  }
  fs.createReadStream('temp/archive.tar.gz')
    .pipe(request(
      options,
      function(err, incoming, response){
        if (err){
          cb(err);
        } else {
          cb(null, source);
        }
      }
    ));
}
But I get the following SignatureDoesNotMatch error.
<?xml version="1.0"?>
<Error>
  <Code>SignatureDoesNotMatch</Code>
  <Message>The request signature we calculated does not match the signature you provided. Check your key and signing method.</Message>
  <AWSAccessKeyId>AKIAJURUZ6XB34ESX54A</AWSAccessKeyId>
  <StringToSign>PUT\n\nfalse\n1424204099\n/heroku-sources-production/heroku.com/d1ed2f1f-4c81-43c8-9997-01706805fab8</StringToSign>
  <SignatureProvided>DKh8Y+c7nM/6vJr2pabvis3Gtsc=</SignatureProvided>
  <StringToSignBytes>50 55 54 0a 0a 66 61 6c 73 65 0a 31 34 32 34 32 30 34 30 39 39 0a 2f 68 65 72 6f 6b 75 2d 73 6f 75 72 63 65 73 2d 70 72 6f 64 75 63 74 69 6f 6e 2f 68 65 72 6f 6b 75 2e 63 6f 6d 2f 64 31 65 64 32 66 31 66 2d 34 63 38 31 2d 34 33 63 38 2d 39 39 39 37 2d 30 31 37 30 36 38 30 35 66 61 62 38</StringToSignBytes>
  <RequestId>A7F1C5F7A68613A9</RequestId>
  <HostId>JGW6l8G9kFNfPgSuecFb6y9mh7IgJh28c5HKJbiP6qLLwvrHmESF1H5Y1PbFPAdv</HostId>
</Error>
Here is an example of what the Heroku sources endpoint API output looks like:
{
  source_blob: {
    get_url: 'https://s3-external-1.amazonaws.com/heroku-sources-production/heroku.com/2c6641c3-af40-4d44-8cdb-c44ee5f670c2?AWSAccessKeyId=AKIAJURUZ6XB34ESX54A&Signature=hYYNQ1WjwHqyyO0QMtjVXYBvsJg%3D&Expires=1424156543',
    put_url: 'https://s3-external-1.amazonaws.com/heroku-sources-production/heroku.com/2c6641c3-af40-4d44-8cdb-c44ee5f670c2?AWSAccessKeyId=AKIAJURUZ6XB34ESX54A&Signature=ecj4bxLnQL%2FZr%2FSKx6URJMr6hPk%3D&Expires=1424156543'
  }
}
Update
The key issue here is that the PUT request I send with the request module should be the same as the one sent with curl, because I know that the curl request matches the expectations of the AWS S3 Uploading Objects Using Pre-Signed URLs API. Heroku generates the PUT URL, so I have no control over its creation. I do know that the curl command works, as I have tested it -- which is good, since it is the example provided by Heroku.
I am using curl 7.35.0 and request 2.53.0.
The Amazon API doesn't like chunked uploads. The file needs to be sent unchunked. So here is the code that works:
// PUT tarball
function(source, cb){
  console.log('Uploading tarball...');
  putUrl = source.source_blob.put_url;
  urlObj = url.parse(putUrl);
  fs.readFile(config.build.temp + 'archive.tar.gz', function(err, data){
    if (err){ cb(err); }
    else {
      var options = {
        body : data,
        method : 'PUT',
        url : urlObj
      };
      request(options, function(err, incoming, response){
        if (err){ cb(err); } else { cb(null, source); }
      });
    }
  });
},
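The difference from the failing version: curl's --data-binary reads the whole file and sends it with a Content-Length header, while piping a stream into request without a known length produces a chunked transfer, which S3 rejects for pre-signed PUTs. If you want to make the unchunked intent explicit, the options object can carry the length itself; a sketch (helper name is mine, not the exact code Heroku expects):

```javascript
// Build unchunked PUT options: a concrete in-memory body plus an explicit
// Content-Length, mirroring what curl --data-binary sends on the wire.
function putOptions(putUrl, data) {
  return {
    url: putUrl,
    method: 'PUT',
    body: data,
    headers: { 'Content-Length': data.length },
  };
}
```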
