How to set database credentials for different environments in Talend Open Studio - data-migration

I have a requirement to set the database connection credentials for my Talend jobs as runtime values per environment. That is, if I run my jobs in the development environment, they should pick up the DB credentials from a development CSV/Excel/text file, and if I run them in production, they should pick up the credentials from a production text file. Is this possible, and if so, can someone guide me through it? I read the link below, but it does not show how to configure the values in a text/CSV file.
http://blog.iadvise.eu/2014/05/27/use-of-contexts-within-talend/

Use context variables. The tutorial linked above shows how to set up a context variable that takes different values in different environments, so the same job can run with a different configuration per environment.
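As a sketch of what those per-environment files could look like: Talend's tContextLoad component reads key/value pairs (one per line) into the job's context at runtime, and exported jobs accept a --context_param flag on the command line. The file names, variable names, and job script name below are illustrative assumptions, not taken from the question.

```shell
# Illustrative per-environment context files in the "key;value" layout that
# tContextLoad accepts (the delimiter is configurable in the component).
mkdir -p contexts

cat > contexts/dev_context.csv <<'EOF'
db_host;dev-db.internal
db_user;dev_user
db_password;dev_secret
EOF

cat > contexts/prod_context.csv <<'EOF'
db_host;prod-db.internal
db_user;prod_user
db_password;prod_secret
EOF

# Inside the job, a tFileInputDelimited feeding tContextLoad loads one of
# these files; which file is loaded can itself be driven by a context
# variable passed on the command line of the exported job, for example:
#   ./MyJob_run.sh --context_param env_file=contexts/prod_context.csv
ENV_FILE="${TALEND_ENV_FILE:-contexts/dev_context.csv}"
cat "$ENV_FILE"
```

This keeps the credentials out of the job itself; promoting the job from development to production then only means pointing it at a different context file.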

Related

WildFly 20 doesn't log with log4j. Which factors can be involved?

I have two environments: development (local) and QA; the latter is a cluster with two nodes.
The problem appeared when deploying to the QA environment: I can't see the log on the server, although locally it prints console logging without any problem.
I'm sure the module structure is the same in both environments, and my configuration XML file is on the classpath.
What could account for this difference? The local server prints console logging, but the QA environment doesn't.
I was able to solve this. I checked the server in the development environment: it's a domain configuration, and for that reason it did not show the application log in the administration console. Because of the application's configuration, it only writes to a .log file. At first I was not allowed to browse the domain log directory, but after checking it from the command line, my app's log file was right there.
I hope this helps someone else.
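For reference, a minimal log4j2.xml along these lines (console plus file appender) is a common starting point when console output appears locally but file output is expected on the server. The file path, pattern, and log level below are assumptions; also note that in a WildFly domain configuration the per-server output lands under domain/servers/&lt;server-name&gt;/log/ rather than standalone/log/.

```
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal log4j2.xml sketch: console plus file appender.
     The fileName path is a placeholder. -->
<Configuration status="WARN">
  <Appenders>
    <Console name="Console" target="SYSTEM_OUT">
      <PatternLayout pattern="%d{HH:mm:ss.SSS} [%t] %-5level %logger{36} - %msg%n"/>
    </Console>
    <File name="AppFile" fileName="logs/app.log">
      <PatternLayout pattern="%d [%t] %-5level %logger - %msg%n"/>
    </File>
  </Appenders>
  <Loggers>
    <Root level="info">
      <AppenderRef ref="Console"/>
      <AppenderRef ref="AppFile"/>
    </Root>
  </Loggers>
</Configuration>
```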

How do I add environment variables to an AWS Lightsail Node.js instance?

The Problem
I'm dipping my toes into AWS by deploying a simple API built with NestJS. This will be the first app I've deployed to a cloud service. I've already cloned my repository on an AWS Lightsail Linux instance with Node.js and installed all of my dependencies. However, I'm confused about how best to provide environment variables to my app.
Obviously, I have a local .env file that I used during development, with credentials for my database, port info, etc. Do I just create a new .env file on the instance through the command line? I've read that for other AWS services you can provide env variables through the service's UI, but I can't find the same thing for Lightsail.
I would greatly appreciate it if someone could give me an explanation of env variables, and how we should generally provide them to cloud services.
Thank you!
There may be a better method, but you can add the file directly on the Lightsail server itself. If you don't know how to do this from the console, here's how:
Create the file: touch .env
Open the file in the console: vim .env
You will now see the file opened in the console, with an unfamiliar UI if you haven't used vim before.
Prepare to write to the file: press i.
You should now see the -- INSERT -- indicator appear at the bottom.
Add the variables: copy and paste your .env from your local laptop into the Lightsail console.
Close out: press Esc, then type :wq and press Enter.
This should work like any normal .env file. I'm unsure whether there are security issues with this, but I don't believe so.
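As an alternative to editing in vim, the same file can be created non-interactively with a heredoc over SSH. The variable names below are placeholders for whatever keys your NestJS app reads (via @nestjs/config or dotenv):

```shell
# Create the .env file in one shot; quoting 'EOF' prevents the shell from
# expanding anything inside the heredoc.
cat > .env <<'EOF'
DB_HOST=localhost
DB_USER=app_user
DB_PASSWORD=change_me
PORT=3000
EOF

# Restrict permissions so only your user can read the credentials.
chmod 600 .env
```

Either way, keep the .env file out of version control (add it to .gitignore) since it holds secrets.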

Ready-to-use image for a SharePoint 2013 environment

I have a requirement for a ready-to-use image of a SharePoint 2013 development environment, with all the necessary installations (SQL Server, language packs, etc.) and configurations already done (app configuration, service application configuration, search configuration, etc.), which developers can use directly, by either mounting or running it, and start developing right away.
The environment can be considered a single-server environment with all the tiers on the same server; however, any suggestions for a multi-server environment would also be helpful.
Thanks

SQL Server Agent Job Not Reading SSIS Project Parameters

I have an SQL Server Agent Job running a number of SSIS Packages via a series of steps. The packages themselves are set up to read source and destination server locations via a query to an SQL config table. However, the initial connection pointing to the config table has a hardcoded server location.
When rolling out the SSIS Packages and Agent Job from test to stage, I will need to alter the hardcoded location of the config table from the old test location, to the new stage location. I thought a way around this would be to create a Project Parameter for the location of the config table, and feed each package connection for the config table with this variable, meaning I'd only have to change this Project Param, rather than manually alter the connection in each package.
This works fine if I execute the package in visual studio, but an agent job is unable to read the project parameter, and the step fails.
Does anybody know of a way around this? I've done some reading on SSIS Environment Variables, but as far as I can see this wouldn't solve the problem. I've also been told that I can specify connection properties in the command line of the agent job itself, but I haven't been able to get this to override a package variable.
If anyone has any ideas, it would be much appreciated.
In the end I added this to the command line of the job, immediately after /CHECKPOINTING OFF:
/SET "\"\Package.Variables[VARIABLENAME].Properties[Value]\"";VARIABLEVALUE
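For context, a full Agent job step command line using that option might look like the sketch below. The package path, variable name, and value are placeholders, and the escaped inner quotes follow the same form as the /SET line above (the property path targets a package-scoped variable in the package deployment model):

```
dtexec /FILE "D:\SSIS\Packages\LoadStage.dtsx" /CHECKPOINTING OFF /SET "\"\Package.Variables[ConfigServerName].Properties[Value]\"";"STAGE-SQL01"
```

When the job is promoted from test to stage, only the value after the semicolon needs to change in the job step, rather than the connection inside each package.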

Deploying a report (.rdl), dataset, and data source to SharePoint 2013 integrated mode

I am searching for a blog or MSFT article that explains publishing/deploying SSRS reports to DEV, QA, Acceptance, and PROD SharePoint 2013 environments in integrated mode.
Can someone advise whether creating a feature/solution and packaging it for deployment is the easy way, or whether there is another way to deploy these reports through the development lifecycle? With every environment (DEV, QA, UAT, PROD), the link within the data source changes, so it also needs to be dynamic. Is that possible?
Within the project properties in SQL Server Data Tools/BIDS, you can define your various environments and enter the appropriate URLs for the data sources, reports, and server of each environment in the deployment properties. Once these are set up, you can choose the correct configuration for the appropriate environment; then, when you click deploy, it will go to the correct location for that environment. Do not be fooled by the Configuration drop-down box on the properties page: you actually have to click the button to switch between configurations.
For the data sources, make sure you are using shared data sources and that you set Overwrite data source in your project properties to false. Then update the data source to be appropriate to the environment and deploy it once in each environment. You won't have to do this in the future since it will now be out there. This means your SSRS data source must have the same name in each environment. If you change it in the future, you need to update it and redeploy in each SharePoint environment. Here's a blog post describing this approach.
