Nested Azure Logic Apps

Is this even possible? I have a logic app that is a unit of functionality (a mixture of card connectors and API apps I created) that I would like to share among other logic apps. From what I can see, this doesn't appear to be allowed. If it is not possible, I will have to recreate the same cards in every logic app that needs this unit of functionality.

You can definitely factor your Logic Apps into smaller flows that can be orchestrated, and with the support for conditions and iterations you can do lots of powerful stuff.
Unfortunately, today it is not as straightforward to leverage the "Action of Type Workflow" that we have in the system, since the user interface does not expose the capabilities you would want, such as adding actions of type Workflow, editing the access keys, and working with the outputs.
I wrote a quick blog post here that shows how you can do that today, using ARMClient to edit the access keys and make it work:
http://blogs.msdn.com/b/carlosag/archive/2015/05/31/using-nested-azure-logic-apps-or-invoking-flows-from-another-logic-app.aspx

In my notes from the Logic Apps talk at Ignite I wrote that it is possible to nest logic apps. I don't have an example handy, but it might be worth skimming through the slides/video from that talk.

Related

Is there a way to use inner features of an application through Bixby?

I want Bixby to access the inner features of an application, for example to compose a message and send it in a chat app. Is there a way to do so?
Sure, you can! As long as the application has REST (or SOAP) endpoints that can be invoked, it can be called from Bixby.
Having said that, Bixby has many built-in features that allow developers to create rich, natural conversational experiences. As a general guide, the data-intensive and computationally complex parts of your capsule should run on an external REST endpoint, while the conversational experience (and the associated logic) should be driven from within Bixby. Hope this helps.
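To illustrate that split, here is a minimal sketch, not from the original answer, of the kind of external REST endpoint that could do the heavy lifting while Bixby drives the conversation and calls it over HTTP. TypeScript with Express is assumed here, and the route, port, and payload shape are invented:

```typescript
// Hypothetical "send a chat message" endpoint a Bixby capsule could call.
// Express and the /messages route are assumptions for illustration only.
import express from "express";

const app = express();
app.use(express.json());

// The data-intensive / integration work happens outside of Bixby.
app.post("/messages", (req, res) => {
  const { recipient, text } = req.body as { recipient: string; text: string };

  // ... talk to the chat app's own API or data store here ...

  // Return a small JSON result for Bixby to present conversationally.
  res.json({ status: "sent", recipient, text });
});

app.listen(3000, () => console.log("Chat endpoint listening on :3000"));
```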

Should I be moving to a microservices based architecture?

I am working on a monolith system. All of its code is in one repository (web API and background workers). The system is written in Node.js, and MongoDB (via Mongoose) is used as the data store. My goal is to set a new path for how the project should evolve. At first I was wondering if I could move towards a microservices-based architecture.
The monolithic architecture creates some problems:
If my background workers need to scale, I have to deploy the whole project to the server even though only a small fraction of it is used.
The whole system must be redeployed when code changes. What if the payment processor calls a webhook while the system is being redeployed?
The advantages of using microservices are quite obvious:
A smaller code base for each individual microservice, which is easier to reason about.
The ability to select the programming tools best suited for a particular use case.
Easier to scale.
Looking at the current code, I noticed that Mongoose ODM (Object Document Mapper) models are used across the whole project to create, query, and update models in the database. As a matter of good programming practice, all such interactions with the database should be abstracted, and business logic should not leak into other system layers. I could do that by introducing the repository pattern (from Domain-Driven Design). While the code is still shared across the web API and its background workers, this is not a hard task to do.
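For example, a minimal sketch of such a repository in TypeScript (the model, fields, and method names are placeholders of mine, not from the project) that keeps Mongoose behind an interface and exposes cursor-style iteration rather than loading whole result sets:

```typescript
import mongoose, { Schema } from "mongoose";

// Placeholder domain type, purely for illustration.
export interface Order {
  customerId: string;
  total: number;
}

// The rest of the system depends on this interface, not on Mongoose.
export interface OrderRepository {
  create(order: Order): Promise<Order>;
  findByCustomer(customerId: string): AsyncIterable<Order>;
}

const orderSchema = new Schema<Order>({
  customerId: { type: String, required: true },
  total: { type: Number, required: true },
});
const OrderModel = mongoose.model<Order>("Order", orderSchema);

export class MongooseOrderRepository implements OrderRepository {
  async create(order: Order): Promise<Order> {
    const doc = await OrderModel.create(order);
    return { customerId: doc.customerId, total: doc.total };
  }

  // A Mongoose query cursor is async-iterable, so callers can `for await`
  // over results without pulling the whole collection into memory.
  async *findByCustomer(customerId: string): AsyncIterable<Order> {
    for await (const doc of OrderModel.find({ customerId }).cursor()) {
      yield { customerId: doc.customerId, total: doc.total };
    }
  }
}
```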
If I decide to extract the repositories into standalone microservices, then a whole bunch of problems arises:
Some sort of query language must be introduced to accommodate complex search queries.
The interface must provide a way to iterate over search results (cursor-based navigation) without returning all database documents over the network.
Since the project is in its early stage and I am the only developer, going to a microservices-based architecture seems like overkill. Maybe there are other approaches I should consider?
For example, extracting the business logic and database interaction into a separate repository and sharing it among services, to avoid complex communication protocols between services?
Based on my experience working with microservices for the last few years, it seems like overkill in your current scenario but pays off in the long term.
Based on the information stated above, my thoughts are:
Code Structure - Applying Microservices Architecture (MSA) in the above context does not mean separating the DAO, business logic, etc.; it is more about designing the system around business functions. For example, if it is an eCommerce application, then you can have shipping, cart, and search as separate services, which can be further divided into smaller services. Read more about domain-driven design for the details.
Deployment Unit - Keeping each microservice as an independent deployment unit is a key principle. Hence, take a vertical slice of the application and package it as a Docker image with the application code, app server (if any), database, and OS (Linux, etc.).
Communication - With MSA, communication between services becomes key, and hence the general practice is to stick with a message-oriented approach for communication (read about reactive systems and reactive programming for more insight); a sketch follows after this list.
PaaS Solutions - There are multiple PaaS solutions available which you can apply so that you don't need to worry about all the other aspects like container management, container orchestration, auto-scaling, configuration management, log management, monitoring, etc. See the following PaaS solutions:
https://www.nanoscale.io/ - by TIBCO
https://fabric8.io/ - by Red Hat
https://openshift.io - by Red Hat
Cloud Vendor Platforms - AWS, Azure, and Google Cloud all have specific support for microservices apps from a deployment perspective, which you can use as an alternative if you don't want to run a PaaS solution in your organization.
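To make the communication point above a bit more concrete, here is a minimal sketch, not part of the original answer, of message-oriented communication between two Node services using RabbitMQ via the amqplib package; the queue name, payload, and broker URL are invented for illustration:

```typescript
import amqp from "amqplib";

const QUEUE = "orders.created"; // hypothetical event queue

// Publisher side, e.g. the web API emitting a domain event.
async function publishOrderCreated(orderId: string): Promise<void> {
  const conn = await amqp.connect("amqp://localhost");
  const channel = await conn.createChannel();
  await channel.assertQueue(QUEUE, { durable: true });
  channel.sendToQueue(QUEUE, Buffer.from(JSON.stringify({ orderId })), {
    persistent: true,
  });
  await channel.close();
  await conn.close();
}

// Consumer side, e.g. a background worker reacting to the event.
async function consumeOrderCreated(): Promise<void> {
  const conn = await amqp.connect("amqp://localhost");
  const channel = await conn.createChannel();
  await channel.assertQueue(QUEUE, { durable: true });
  await channel.consume(QUEUE, (msg) => {
    if (msg === null) return;
    const event = JSON.parse(msg.content.toString());
    console.log("Processing order", event.orderId);
    channel.ack(msg);
  });
}
```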
Hope these pointers help in understanding the overall landscape so that you can structure your architecture for future needs.
I am working on a monolith system... My goal is to set a new path for how the project should evolve. At first I was wondering if I could move towards a microservices-based architecture.
In what ways do you need to evolve the project? Will it be mostly bugfixes, adding features, improving performance and/or scalability? Do you anticipate other developers collaborating in the future? Are you currently having maintenance issues? The answers to these questions (and many more) should be considered in guiding your choices.
You seem to be doing your homework around the pros and cons of a microservice architecture, so if you haven't asked yourself why you're even doing this in the first place, now would be a good time to do so.
Maybe there are other approaches I should consider?
There's always the good old don't-break-what's-working ;)

What gives lower latency, to code business logic in T-SQL or js?

I'm about to start developing a back-end service for a mobile app using Azure Mobile Services. But I honestly can't figure out which approach is better for performance: coding the business logic using stored procedures in T-SQL, or doing it in JavaScript. Other than performance, which one also gives more opportunity for reuse?
JavaScript or C# would offer more opportunity for reuse, if for example you later expand your app and need to provide fuller web services than WAMS can provide. In terms of performance there's probably not enough difference to tip the scale one way or the other, since I/O is the main factor.
As a general rule, embedding business/application logic in your database is to be avoided, partly because SQL-derived languages are rarely ideal for that type of code, but more practically because it makes it much harder to support alternative databases in the future.
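As a small, invented illustration of the reuse argument: a business rule kept as a plain script-side function can be unit-tested and reused by the mobile back end and any future web service, whereas the same rule written as a stored procedure stays bound to the database. The function name and discount rule below are made up:

```typescript
// Hypothetical business rule, kept in application code rather than T-SQL.
// The threshold and rate are invented for the example.
export function applyLoyaltyDiscount(orderTotal: number, loyaltyYears: number): number {
  const rate = loyaltyYears >= 3 ? 0.1 : 0.0;
  return Math.round(orderTotal * (1 - rate) * 100) / 100;
}

// The same function can be reused by the mobile back end, a future web API,
// or a batch job, and tested without touching the database layer.
console.log(applyLoyaltyDiscount(200, 5)); // 180
```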

mvc-mini-profiler - working with a load balanced web role (azure et al)

I believe that the mvc mini profiler is a bit of a 'God-send'.
I have incorporated it into a new MVC project which is targeting the Azure platform.
My question is: how do I handle profiling across server (role instance) boundaries?
Is this even possible?
I don't understand why you would need to profile these apps any differently. You want to profile how your app behaves on the production server - go ahead and do it.
A single request will still be executed on a single instance, and you'll get the data from that same instance. If you want to profile services located on a different physical tier as well, that would require a different approach involving communication through internal endpoints, which I'm sure the mini profiler doesn't support out of the box. However, the modification shouldn't be that complicated.
However, if you did want to profile physically separated tiers, I would go about it in a different way: profile each tier independently, because that's how I would go about optimizing it. If you wrap the call to the other tier in a profiler step, you can see where the problem lies and still be able to solve it.
By default the mvc-mini-profiler stores and delivers its results using HttpRuntime.Cache. This is going to cause some problems in a multi-instance environment.
If you are using multiple instances, then some ways you might be able to make this work are:
to change the HttpRuntime.Cache storage to an AppFabric Cache implementation (or a Memcached implementation)
to use an alternative storage strategy for your profiling results (the code includes SqlServerStorage as an example)
Obviously, whichever strategy you choose will require more time/resources than just the single instance implementation.

Saas model data isolation

I currently have an application written in PHP using the Symfony framework. Rather than have separate installs for each customer on a hosted server, I would like to move to a SaaS model with one install for all customers, possibly running on Google Code or another cloud-based service. I am not tied to PHP, though I would like to have the benefits of a good framework.
So the challenge: if all customers are using the same application, we then have to find a way of isolating each customer's data. Customers do, for example, have admin access and can manage their own users and privileges. At a simplistic level you could just have an organisation identifier in each table and add that to all database operations. However, most application frameworks use an ORM of some kind, and I have not been able to find one that will easily / seamlessly facilitate this at a level that has minimal impact on the application code.
Has anyone looked at this? Are there any good approaches to this problem?
As Itay says, a multi-tenant system is a common requirement. A while back I was doing some research on this problem and came across a pretty good presentation on the different ways to handle this issue, and the pros and cons of each: http://aac2009.confreaks.com/06-feb-2009-14-30-writing-multi-tenant-applications-in-rails-guy-naor.html
This particular presentation is targeted to a Rails audience, but the principles are the same as with any language.
The approach you described is common, and PHP (one of its strengths) will allow you to go into the ORM code comparatively easily and modify it to your needs.
A second approach is to create a separate DB for each organization and a joint DB for shared resources.
A bit of a design challenge (but just a bit).
If you are really big, then you may even need to consider a separate DB server for each organization (I would say this is serious overkill in 99.99999% of cases).
This MSDN article gives you a very good overview of Data Architecture in Multi-tenancy: http://msdn.microsoft.com/en-us/library/aa479086.aspx
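As a rough sketch of the shared-schema approach described above (shown in TypeScript rather than PHP, with invented names), the organisation identifier can be injected once into a small query helper so that application code never has to remember to add it:

```typescript
// Minimal sketch of the "organisation identifier on every row" approach.
// Column names and the query shape are invented for illustration.
interface Filter {
  [column: string]: unknown;
}

class TenantScopedRepository {
  // The organisation (tenant) id is fixed once, e.g. from the logged-in user.
  constructor(private readonly organisationId: string) {}

  // Every filter is automatically combined with the tenant column,
  // so application code cannot accidentally query across tenants.
  private scope(filter: Filter): Filter {
    return { ...filter, organisation_id: this.organisationId };
  }

  findUsers(filter: Filter): Filter {
    // In a real app this scoped filter would be handed to the ORM / query builder.
    return this.scope(filter);
  }
}

// Usage: an admin of organisation "acme" only ever sees "acme" rows.
const repo = new TenantScopedRepository("acme");
console.log(repo.findUsers({ role: "admin" }));
// -> { role: 'admin', organisation_id: 'acme' }
```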

Resources