I want to use spring-integration-aws to send and receive messages with SQS. I am looking at Localstack and would like to know the Spring team's recommendation.
Which tool/API should I use for a local setup of Spring Integration flows with SQS inbound and outbound adapters?
Also, will there be AWS examples in spring-integration-samples in the future? I am looking for an example with XML config that reads the AWS config from the credentials file and sends and receives messages via the outbound adapters.
Not sure what recommendation you expect from us, but I see an answer in your own question: Localstack (https://github.com/localstack/localstack).
In the project's tests we indeed use this tool via a Docker container:
https://github.com/spring-projects/spring-integration-aws/blob/master/src/test/java/org/springframework/integration/aws/lock/DynamoDbLockRegistryTests.java#L62
We don't have such a test against SQS, but the configuration technique is similar.
I recall hearing that the Testcontainers project can be used for testing AWS services locally as well: https://www.testcontainers.org/modules/localstack/
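For illustration, a LocalStack-based test setup with Testcontainers might look roughly like this (a sketch under assumptions: the Testcontainers LocalStack module and the AWS SDK v1 SQS client on the classpath, plus Docker available locally; the queue name is arbitrary):

```java
import org.testcontainers.containers.localstack.LocalStackContainer;
import org.testcontainers.utility.DockerImageName;
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.client.builder.AwsClientBuilder;
import com.amazonaws.services.sqs.AmazonSQSAsync;
import com.amazonaws.services.sqs.AmazonSQSAsyncClientBuilder;

public class LocalStackSqsSketch {

    public static void main(String[] args) {
        // Start a LocalStack container exposing only the SQS service
        try (LocalStackContainer localstack =
                 new LocalStackContainer(DockerImageName.parse("localstack/localstack"))
                     .withServices(LocalStackContainer.Service.SQS)) {
            localstack.start();

            // Build an SQS client that points at the container instead of real AWS
            AmazonSQSAsync sqs = AmazonSQSAsyncClientBuilder.standard()
                .withEndpointConfiguration(new AwsClientBuilder.EndpointConfiguration(
                    localstack.getEndpointOverride(LocalStackContainer.Service.SQS).toString(),
                    localstack.getRegion()))
                .withCredentials(new AWSStaticCredentialsProvider(
                    new BasicAWSCredentials(localstack.getAccessKey(), localstack.getSecretKey())))
                .build();

            // This client can then be injected into the SQS channel adapters under test
            sqs.createQueue("test-queue");
            sqs.sendMessage(sqs.getQueueUrl("test-queue").getQueueUrl(), "hello");
        }
    }
}
```

The same client-building technique is what the linked DynamoDbLockRegistryTests uses, just with a different AWS service.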
We don't have resources to write samples for this project.
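That said, a rough sketch of the XML configuration the question asks about might look like the following. This is only an illustration, not an official sample: the queue and channel names are made up, and it assumes the int-aws XML namespace from spring-integration-aws and credentials resolved from the default AWS provider chain (e.g. the credentials file):

```xml
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:int="http://www.springframework.org/schema/integration"
       xmlns:int-aws="http://www.springframework.org/schema/integration/aws"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
           http://www.springframework.org/schema/beans/spring-beans.xsd
           http://www.springframework.org/schema/integration
           http://www.springframework.org/schema/integration/spring-integration.xsd
           http://www.springframework.org/schema/integration/aws
           http://www.springframework.org/schema/integration/aws/spring-integration-aws.xsd">

    <!-- Credentials come from the default AWS provider chain (e.g. ~/.aws/credentials) -->
    <bean id="amazonSqs" class="com.amazonaws.services.sqs.AmazonSQSAsyncClientBuilder"
          factory-method="defaultClient"/>

    <!-- Inbound: messages from the queue land in sqsInboundChannel -->
    <int-aws:sqs-message-driven-channel-adapter sqs="amazonSqs"
                                                queues="my-inbound-queue"
                                                channel="sqsInboundChannel"/>
    <int:channel id="sqsInboundChannel"/>

    <!-- Outbound: messages sent to sqsOutboundChannel go to the queue -->
    <int:channel id="sqsOutboundChannel"/>
    <int-aws:sqs-outbound-channel-adapter sqs="amazonSqs"
                                          queue="my-outbound-queue"
                                          channel="sqsOutboundChannel"/>
</beans>
```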
Related
I have many SQS queues to which I have to attach workers that consume and process messages (using a Spring Integration flow). My initial thought is to have a single Spring Boot application with one channel per SQS queue (Spring Integration AWS).
This would be like 1 SQS queue -> 1 channel -> 1 flow.
However, this might run into maintainability, deployment, and memory issues, and we would not be able to scale each worker independently. It sounds like a monolith of workers. Another option is to create a Spring Cloud Function per worker type and deploy each as a Lambda.
Is there any other solution for this scenario in the Spring stack (I want to use Spring Integration and Reactor), such that we can scale every worker independently and also manage the deployment of every worker independently? (It may sound like a microservice per worker, but there is no domain, just some processing logic like validation, invoking some API, and storing to a database.)
Even if it doesn't have a domain or REST endpoints, you can still treat it as a microservice, since it is activated when a message is consumed from the SQS queue. Therefore your direction with Spring Boot + Spring Integration AWS is correct: you just expose the SQS queue name as a configuration property and deploy your application into the desired environment with an appropriate scaling policy.
Yes, you probably could still write a Spring Cloud Function for AWS Lambda, but nothing stops you from running Spring Integration logic in the function body.
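As a sketch of that "queue as a configuration property" idea (the property name sqs.queue and the channel name are made up for illustration; SqsMessageDrivenChannelAdapter comes from spring-integration-aws):

```java
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.aws.inbound.SqsMessageDrivenChannelAdapter;
import com.amazonaws.services.sqs.AmazonSQSAsync;

@Configuration
public class WorkerFlowConfig {

    // The queue name comes from configuration, e.g. sqs.queue=orders-queue,
    // so the same worker application can be deployed once per queue and
    // each deployment scaled independently.
    @Bean
    public SqsMessageDrivenChannelAdapter sqsInbound(AmazonSQSAsync amazonSqs,
                                                     @Value("${sqs.queue}") String queue) {
        SqsMessageDrivenChannelAdapter adapter =
                new SqsMessageDrivenChannelAdapter(amazonSqs, queue);
        adapter.setOutputChannelName("workerChannel");
        return adapter;
    }
}
```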
I am trying to integrate the Spring Cloud Stream Kinesis binder in my app, but I can't find all the configuration options in the manual. I have seen this link:
https://github.com/spring-cloud/spring-cloud-stream-binder-aws-kinesis/blob/master/spring-cloud-stream-binder-kinesis-docs/src/main/asciidoc/overview.adoc
There are a few properties mentioned, like:
spring.cloud.stream.instanceCount=
I would like to know how I can set some of the properties which I can't see in the documentation:
hostname
port number
access key
secret key
username
I am looking for something like:
spring.cloud.stream.binder.host=
spring.cloud.stream.binder.port=
spring.cloud.stream.binder.access_key=
There is no host or port for AWS services; you only connect to AWS via auto-configuration. The Spring Cloud Stream Kinesis Binder is fully based on the auto-configuration provided by the Spring Cloud AWS project, so you need to follow its documentation on how to configure the accessKey and secretKey: https://cloud.spring.io/spring-cloud-static/spring-cloud-aws/2.1.2.RELEASE/single/spring-cloud-aws.html#_spring_boot_auto_configuration:
cloud.aws.credentials.accessKey
cloud.aws.credentials.secretKey
You may also consider using cloud.aws.region.static if you don't run your application in an EC2 environment.
There is no more magic than the standard AWS connection settings and the auto-configuration provided by Spring Cloud AWS.
Or you can rely on the standard AWS credentials file instead.
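Putting that together, a minimal application.properties for a run outside EC2 might look like this (all values are placeholders):

```properties
cloud.aws.credentials.accessKey=AKIA...
cloud.aws.credentials.secretKey=...
cloud.aws.region.static=eu-west-1
# skip CloudFormation stack auto-detection when not running inside AWS
cloud.aws.stack.auto=false
```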
I am trying Azure Functions to connect to a WebSocket, get the data, and send it to a Kafka instance installed locally on my laptop, but I am unable to send to Kafka. Is there any way to send data from Azure Functions to a locally installed Kafka?
There is nothing in Azure Functions that will do that for you, so you need to send your messages to Kafka the same way you would from any other .NET code.
The steps:
Make sure Kafka is accessible from the internet as #Dhinesh suggested.
Pick a Kafka client library to use (the official one is here, but there are others too).
Write a simple console app which would send some sample messages to your Kafka and make sure it works.
Use exactly the same code inside your Azure Function body to send real messages to Kafka.
If you need something more elaborate than this basic guide, feel free to adjust your question.
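The steps above can be sketched as follows, shown here in Java with the official Apache Kafka client (the broker address and topic name are placeholders; the same flow applies to a .NET client library):

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class KafkaSendSketch {

    // Minimal producer configuration; bootstrapServers is whatever public
    // address your Kafka broker is reachable on from Azure.
    static Properties producerConfig(String bootstrapServers) {
        Properties props = new Properties();
        props.put("bootstrap.servers", bootstrapServers);
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        return props;
    }

    public static void main(String[] args) {
        // Send one sample message; once this works from a console app,
        // the same code can run inside the Azure Function body.
        try (Producer<String, String> producer =
                 new KafkaProducer<>(producerConfig("my-broker.example.com:9092"))) {
            producer.send(new ProducerRecord<>("test-topic", "key", "hello from the console app"));
        }
    }
}
```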
Azure App Service supports Virtual Networks (VNets). Functions run on top of an App Service, so your Functions can securely access your on-prem Kafka instance over VPN. For this to work you would need to:
Create a site-to-site or point-to-site VPN using the VNet integration - instructions here: https://learn.microsoft.com/en-us/azure/vpn-gateway/vpn-gateway-site-to-site-create
Ensure your Function App and App Service are inside this VNet - instructions here: https://learn.microsoft.com/en-us/azure/app-service-web/web-sites-integrate-with-vnet
I hope this helps
I am in the process of moving an application from C# to Node.js. I am a Node.js newbie coming from a .NET background. I am looking to incorporate domain-driven design patterns into the app development, which led me to the concepts of bounded contexts and microservices. I would like to use AWS as my cloud provider, but I am having a problem determining which tool I should use for handling command and event processing. Azure has the Service Bus, which seems to work pretty well for this.
Is there an equivalent to the Service Bus on AWS, or should I just look to use SQS?
There is no direct equivalent to Azure Service Bus, but it can be replaced by combining SQS and SNS. Let's take a look. Azure Service Bus consists of two parts:
Queues. In most cases, SQS (Simple Queue Service) will provide an adequate replacement, but keep in mind that Azure Service Bus queues are First In-First Out (FIFO), while SQS queues do not guarantee the order of messages.
Update 2018-01-09: SQS now supports FIFO queues (see https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/FIFO-queues.html).
Topics and subscriptions. This is used in PubSub (publish/subscribe) event-driven design, when you need to deliver the same message to several consumers. SQS can't do that, but SNS (Simple Notification Service) is exactly that type of service.
Update 2018-08-01: on November 28th, 2017 Amazon introduced Amazon MQ, which is Apache ActiveMQ in the Amazon cloud. Amazon MQ has both queues and topics (for the publish/subscribe usage model), so it can be seen as a full-featured replacement for Azure Service Bus.
July 2019 Update: Amazon introduced the EventBridge service bus, which enables event-driven architectures for custom applications, as well as integration with AWS services and other SaaS hosted on its platform; see https://aws.amazon.com/about-aws/whats-new/2019/07/introducing-amazon-eventbridge/
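For illustration, the SQS + SNS combination described above can be sketched with the AWS SDK for Java v1 (the topic and queue names are made up; Topics.subscribeQueue is a helper from the SDK's SNS module that wires up the subscription and the required queue policy):

```java
import com.amazonaws.services.sns.AmazonSNS;
import com.amazonaws.services.sns.AmazonSNSClientBuilder;
import com.amazonaws.services.sns.util.Topics;
import com.amazonaws.services.sqs.AmazonSQS;
import com.amazonaws.services.sqs.AmazonSQSClientBuilder;

public class SnsFanOutSketch {

    public static void main(String[] args) {
        AmazonSNS sns = AmazonSNSClientBuilder.defaultClient();
        AmazonSQS sqs = AmazonSQSClientBuilder.defaultClient();

        // One topic, many queues: each subscribed queue gets a copy of every message,
        // which is the PubSub half of what Azure Service Bus provides
        String topicArn = sns.createTopic("orders-events").getTopicArn();
        String queueUrl = sqs.createQueue("orders-worker").getQueueUrl();

        // Subscribe the queue to the topic (also sets the queue policy)
        Topics.subscribeQueue(sns, sqs, topicArn, queueUrl);

        // Publishing once delivers to every subscribed queue
        sns.publish(topicArn, "{\"orderId\": 42}");
    }
}
```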
If SQS suits your needs then it is well integrated with the platform:
https://aws.amazon.com/sqs/
Amazon Simple Queue Service (SQS) is a fast, reliable, scalable, fully managed message queuing service. Amazon SQS makes it simple and cost-effective to decouple the components of a cloud application. You can use Amazon SQS to transmit any volume of data, without losing messages or requiring other services to be always available. Amazon SQS includes standard queues with high throughput and at-least-once processing, and FIFO queues that provide FIFO (first-in, first-out) delivery and exactly-once processing.
Also, there's an Enterprise Service Bus (HVM) in the marketplace, but it seems like a Windows-only thing.
But you don't have to use only solutions that are directly integrated into your hosting provider's platform. You can run anything on AWS; for example, you can use tools like Redis, RabbitMQ, ZeroMQ, ActiveMQ, NSQ, etc.
See:
https://redis.io/
https://www.rabbitmq.com/
http://zeromq.org/
http://activemq.apache.org/
http://nsq.io/
I recently started looking at Spring Cloud Stream. Is there a stream module that would allow us to talk directly to Amazon SNS?
There's no such module currently, but Spring Integration has an extension project that supports SNS, so it shouldn't be too hard to create a custom module.
Consider contributing.
The inbound adapter is available and there is a pull request for an outbound adapter that's not merged yet.
EDIT
The outbound adapter is now on master and available in 1.0.0.BUILD-SNAPSHOT; we will create a milestone release shortly.