Monitoring AWS Kinesis stream on the shell console - aws-cli

I am sending a data stream to the AWS Kinesis service and want to check whether Kinesis received the data. What I expect is real-time output on the shell console, the way Redis does.
I wonder whether the AWS CLI supports a feature like that.
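As far as I know the AWS CLI has no built-in live tail for Kinesis, but you can approximate one by polling a shard iterator in a loop. A rough sketch, assuming a single-shard stream named my-stream (a placeholder) and jq installed:

    STREAM=my-stream                     # placeholder stream name
    SHARD=shardId-000000000000           # first (and only) shard of the stream

    # Start reading at the tip of the shard, like a live tail
    ITERATOR=$(aws kinesis get-shard-iterator \
      --stream-name "$STREAM" --shard-id "$SHARD" \
      --shard-iterator-type LATEST \
      --query ShardIterator --output text)

    while true; do
      OUTPUT=$(aws kinesis get-records --shard-iterator "$ITERATOR")
      # Record payloads come back base64-encoded
      echo "$OUTPUT" | jq -r '.Records[].Data' | while read -r DATA; do
        echo "$DATA" | base64 --decode; echo
      done
      ITERATOR=$(echo "$OUTPUT" | jq -r '.NextShardIterator')
      sleep 1
    done

If the stream has more than one shard, you would run one loop per shard ID returned by aws kinesis list-shards.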

Related

Transfer large files between two Node.js clients through AWS using serverless services

I want to know how to transfer large files between two Node.js clients through AWS using serverless services, without saving the files in an S3 bucket.
On the client side I used socket.io-stream to send files from one client to another through the Node.js server.
How can I use a stream to send the files to AWS and have that stream sent back to the other client? Currently I am trying to use AWS WebSockets, but how do I use a stream with WebSockets?
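For reference, a rough sketch of the relay described above (plain Socket.IO, no AWS yet); the 'register'/'file' event names, the port, and the client registry are made up for illustration:

    const io = require('socket.io')(3000);        // plain Socket.IO server, port is arbitrary
    const ss = require('socket.io-stream');

    const clients = new Map();                    // hypothetical clientId -> socket registry

    io.on('connection', (socket) => {
      socket.on('register', (id) => clients.set(id, socket));

      // Relay: pipe the sender's stream straight into a stream to the receiver,
      // so the file is never buffered on disk or in S3.
      ss(socket).on('file', (inStream, meta) => {
        const receiver = clients.get(meta.to);
        if (!receiver) return inStream.resume();  // drain if the target is offline
        const outStream = ss.createStream();
        ss(receiver).emit('file', outStream, meta);
        inStream.pipe(outStream);
      });
    });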

Write all logs to the console or use a log library appender?

I'm running a couple of Node services on AWS across Elastic Beanstalk and Lambdas. We use the Bunyan library and produce JSON logs. We are considering moving our logging entirely to CloudWatch. I've found two ways of pushing logs to CloudWatch:
Write everything to the console using bunyan and use the built-in log streaming in both Beanstalk and Lambda to push logs to CloudWatch for me.
Use a Bunyan Stream like https://github.com/mirkokiefer/bunyan-cloudwatch and push all log events directly to CloudWatch via their APIs.
Are both valid options? Is one preferred over the other? Any pluses and minuses that I'm missing?
I favor the first option: Write everything to the console using bunyan.
I think this separates concerns better than baking the CloudWatch stream into your app. Besides, bunyan-cloudwatch is not maintained.
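For what it's worth, option 1 needs nothing beyond a stock Bunyan logger pointed at stdout; the service name and fields below are just examples:

    const bunyan = require('bunyan');

    // One JSON line per log record goes to stdout; Beanstalk's log streaming
    // (or Lambda's automatic capture of console output) forwards it to CloudWatch.
    const log = bunyan.createLogger({
      name: 'my-service',          // example service name
      level: 'info',
      stream: process.stdout
    });

    log.info({ orderId: 123 }, 'order processed');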

From IoT Hub to SQL Database

I sent data to IoT Hub using Mosquitto:
mosquitto_pub <..> -m "mensaje" <..>
I have verified that this message arrives at IoT Hub with "Device Explorer Twin" and also with the CLI.
But after that, when I create a Stream Analytics job, the input fails.
I have tried all kinds of formats: CSV, JSON, AVRO, and also all kinds of string formats with -m "mensaje", like {"mensaje":"adios"}.
Is there no way to do this from the command line with mosquitto_pub?
You cannot send the message in JSON/CSV/AVRO format from Mosquitto to Azure, so I had to switch to a different broker, and that worked without problems.

Get Amazon S3 to send an HTTP request upon file upload

I need my Node.js application to receive an HTTP request with the file name whenever a file is uploaded to my S3 bucket.
I would like some recommendations on the simplest/most straightforward way to achieve this.
So far I see 3 ways to do this, but I feel I'm overthinking it, and there surely exist better options:
1/ file uploaded to S3 -> S3 sends a notification to SNS -> SNS sends an HTTP request to my application
2/ file uploaded to S3 -> a Lambda function is triggered and sends an HTTP request to my application
3/ make my application watch the bucket on a regular basis and do something when a file is uploaded
Thanks.
PS: yes, I'm really new to Amazon services :)
SNS: Will work OK, but you'll have to manage the SNS topic subscription. You also won't have any control over the HTTP post's format.
Lambda: This is what I would go with. It gives you the most control.
Polling (your option 3): How would you efficiently check for new objects, exactly? This isn't a good solution.
You could also have S3 post the new object events to SQS, and configure your application to poll the SQS queue instead of listening for an HTTP request.
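For the Lambda route, the handler itself can stay small. A rough sketch; the endpoint (example.com/files) and the payload shape are placeholders:

    const https = require('https');

    // Triggered by the bucket's ObjectCreated event; forwards bucket and key
    // to the application over HTTPS.
    exports.handler = async (event) => {
      for (const record of event.Records) {
        const body = JSON.stringify({
          bucket: record.s3.bucket.name,
          key: decodeURIComponent(record.s3.object.key.replace(/\+/g, ' '))
        });

        await new Promise((resolve, reject) => {
          const req = https.request({
            hostname: 'example.com',           // placeholder host
            path: '/files',                    // placeholder route
            method: 'POST',
            headers: { 'Content-Type': 'application/json' }
          }, (res) => { res.resume(); res.on('end', resolve); });
          req.on('error', reject);
          req.end(body);
        });
      }
    };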
SNS - If you want to call multiple services when S3 is updated, then I would suggest SNS. You can create an SNS topic and have multiple subscribers to that topic. Later, if you want to add more HTTP endpoints, it would be as simple as subscribing them to the topic.
Lambda - If you need to send the notification to only one HTTP endpoint, then I would strongly recommend this.
SQS - You don't need SQS in this scenario. SQS is mainly for decoupling components and would be the best fit for microservices, but you can use it with other messaging systems as well.
You don't need to build something of your own to regularly monitor the bucket for changes, as there are already services for that, such as Lambda, SNS, etc.
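A rough sketch of the SQS variant mentioned above, where S3 delivers the event notification to a queue and the application long-polls it instead of exposing an HTTP endpoint; the queue URL and region are placeholders:

    const AWS = require('aws-sdk');

    const sqs = new AWS.SQS({ region: 'us-east-1' });                            // placeholder region
    const QueueUrl = 'https://sqs.us-east-1.amazonaws.com/123456789012/uploads'; // placeholder queue

    async function poll() {
      const { Messages = [] } = await sqs.receiveMessage({
        QueueUrl,
        WaitTimeSeconds: 20,        // long polling
        MaxNumberOfMessages: 10
      }).promise();

      for (const msg of Messages) {
        const event = JSON.parse(msg.Body);       // S3 event notification
        for (const rec of event.Records || []) {
          console.log('new object:', rec.s3.bucket.name, rec.s3.object.key);
        }
        await sqs.deleteMessage({ QueueUrl, ReceiptHandle: msg.ReceiptHandle }).promise();
      }
      setImmediate(poll);
    }

    poll();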

Get bitrate from remote video (pure nodejs)

I have videos in S3 and am curious whether there is a way the bitrate can be efficiently obtained using Node.js. I'm looking to make the new AWS Lambda service run against newly added S3 objects to get the bitrate.
Since I need to run in just Node, I can't use ffmpeg here.
https://github.com/ListenerApproved/node-ffprobe
I just did this on my Node server; use probeData.format.bit_rate.
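A minimal sketch of that suggestion; the S3 URL is a placeholder, and note that node-ffprobe shells out to an ffprobe binary, which would need to be available wherever this runs:

    const probe = require('node-ffprobe');

    // ffprobe can read from a URL, so a public or pre-signed S3 link works
    probe('https://my-bucket.s3.amazonaws.com/video.mp4', (err, probeData) => {
      if (err) return console.error(err);
      console.log('bitrate:', probeData.format.bit_rate);  // bits per second
    });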
