How to run independent instances of gtest in two separate Linux processes (one for client and one for server)?

I have two separate Linux processes: one for the client and one for the server. They share some common code and libraries, but their application code is independent.
Goal: run an independent set of test cases on the client and on the server simultaneously. The intent is to trigger the server in a specific, minimal way while the client-related tests are in progress (mocking/stubbing would be too cumbersome in this case).
Here's the excerpt from CMakeLists.txt:
file(GLOB CLIENT_SOURCES src/client/* src/common/*)
add_library(Client SHARED ${CLIENT_SOURCES})
target_link_libraries(Client PRIVATE common_libaries gtest)
file(GLOB SERVER_SOURCES src/server/* src/common/*)
add_library(Server SHARED ${SERVER_SOURCES})
target_link_libraries(Server PRIVATE common_libaries gtest)
I have a separate set of test cases for client and for server in their respective folders:
TEST_F(serverTest#1) ... TEST_F(serverTest#n)
TEST_F(clientTest#1) ... TEST_F(clientTest#n)
As a starting point (to keep things simple), InitGoogleTest() is disabled at run time on the server based on a config parameter. The code is something like this:
server/app.cpp:
if (gtestMode) { // gtestMode is false on the server
    ::testing::InitGoogleTest();
    ::testing::GTEST_FLAG(filter) = "*abcd*";
    RUN_ALL_TESTS();
}
On the client side, InitGoogleTest() etc. are enabled at run time and do get invoked.
However, when I launch the two applications/processes back to back from the command line, the test cases defined on the server side are being invoked. Here's the sample output:
[==========] Running 122 tests from 5 test suites. *(I have 5 test suites on the server side. Not on the client. I've added one test suite on the client just now)*
[----------] Global test environment set-up.
[----------] 4 tests from TestFixture *(This is a server based test fixture name)*
[ RUN      ] TestFixture.ServerTest1
Feature: link Setup
...
Questions:
How is the gtest library on the client side (linked with PRIVATE) even able to see the test fixtures on the server side? They are in different processes.
What is the correct way to make the gtest on the client independent of the gtest on the server?
Please let me know if you need further information about this. As always, any insights you can offer would be of great value.
Regards,
Venk

Related

How to ensure mongoose.dropDatabase() can ONLY be called when connected to mongo-memory-server

We're using mongodb-memory-server for unit tests. After every unit test suite we execute:
await connection.dropDatabase();
await collection.deleteMany({});
To set up mongoose we have two different methods:
setupMongoose(); <--- connects to our dev database in the cloud (Atlas)
setupMongooseWithMemoryServer(); <--- connects mongoose to the memory server.
We're a team of developers, and my worst fear is that someone uses setupMongoose() to set up unit tests by mistake someday. If that happens, dropDatabase() will be called on our "real" dev database. That would be a catastrophe.
So how can I ensure that dropDatabase() and maybe collection.deleteMany({}) can NEVER be called on our cloud database?
Some thoughts:
I have thought about setting env variables and checking them before calling the dangerous methods. I've also already added a run-time check:
checkForUnitTestEnv() {
    if (!this.init || process.env.JEST_WORKER_ID === undefined) {
        console.error('FATAL: tried to drop database without Jest!');
        throw new Error('FATAL: tried to drop database without Jest!');
    }
}
(this.init is only true if memory-server has been initialized).
But these methods are not foolproof. Errors can still happen if our developers are not careful. So I was hoping either to make these "illegal operations" with our database provider (Atlas), if possible, or to check the mongoose connection URI at run time before calling the dangerous methods (but I haven't found a good way to do this yet).
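One way to do that run-time URI check (a minimal sketch; the function names and the localhost assumption are mine, not from the post): mongoose exposes the active connection's host and database name, so a destructive helper can refuse to run unless it is pointed at a local address, which is where mongodb-memory-server binds by default.
const mongoose = require('mongoose');

// Guard destructive test helpers by inspecting the live mongoose
// connection. Names and the localhost check are illustrative.
function assertMemoryServerConnection() {
    const { host, name } = mongoose.connection;
    const isLocal = host === '127.0.0.1' || host === 'localhost';
    if (!isLocal) {
        throw new Error('Refusing destructive operation: connected to ' + host + '/' + name);
    }
}

async function dropTestDatabase() {
    assertMemoryServerConnection(); // throws before anything irreversible happens
    await mongoose.connection.dropDatabase();
}
Combined with the JEST_WORKER_ID check above, a developer would then have to get past two independent guards before the cloud database could be touched.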

[AWS][Amplify] Invoke function locally crashes with no error

I have just joined a development team, and the project should run in the cloud using Amplify. I have a function called usershandler that I want to run locally. For that, I used:
amplify invoke function usershandler
This is the output I get:
Starting execution...
EVENT: {"httpMethod":"GET","body":"{\"name\": \"Amplify\"}","path":"/users","resource":"/{proxy+}","queryStringParameters":{}}
App started
get All VSM called
Connection to database was a success
null
Result:
{"statusCode":200,"body":"{\"success\":true,\"results\":[]}","headers":{"x-powered-by":"Express","access-control-allow-origin":"*","access-control-allow-headers":"Origin, X-Requested-With, Content-Type, Accept","content-type":"application/json; charset=utf-8","content-length":"29","etag":"W/\"1d-4wD7ChrrlHssGyekznKfKxR7ImE\"","date":"Tue, 21 Jul 2020 12:32:36 GMT","connection":"close"},"isBase64Encoded":false}
Finished execution.
EDIT: Also, when running the invoke command, amplify asks me for a src/event.json, while I've seen it looking for the index.js for some reason?
EDIT 2 [SOLVED]: downgrading @aws-amplify/cli to 4.14.1 seems to solve this :)
Expected behavior: the server should continue running so I can use it.
Actual behavior: it always stops after the "Finished execution" message.
The connection to the db works fine, and config.json contains correct values. I don't know why it is acting like this. Has anybody had the same problem?
Have a nice day.
Short answer: you are running the invoke command, which is doing just what it is supposed to do: invoking the lambda function.
If you are looking to get a local API up, then run the following command:
sam local start-api
This will read your template and, based on the endpoints you have set up, run them locally, essentially mocking API Gateway. Read more about it in the official docs here.
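For context, an Amplify Express-backed function is typically wired to Lambda roughly like this (a sketch using aws-serverless-express; your usershandler may differ), which is why invoke runs once and exits: the handler services the single event from src/event.json and returns.
// index.js (sketch): nothing here keeps a server listening.
const awsServerlessExpress = require('aws-serverless-express');
const app = require('./app'); // the Express app (the "App started" log)

const server = awsServerlessExpress.createServer(app);

exports.handler = (event, context) => {
    // One event in, one response out; the process ends after this.
    awsServerlessExpress.proxy(server, event, context);
};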
Explanation:
This command is one of the offerings of the AWS Serverless Application Model (AWS SAM), a tool for developing serverless applications. It is essentially an abstraction over AWS CloudFormation. Similarly, Amplify is an abstraction that makes it simple not only to develop and manage the backend but also to bring that power to the frontend.
As both of them essentially use CloudFormation templates underneath, you can leverage the capabilities of one tool with the other.
SAM provides a robust set of tools for local development, including a local lambda mocking server, in case you are not using API Gateway.
I use this combination to develop and test my frontend along with the backend, which is in Golang, a language that is not yet as mature as JavaScript for backend work with Amplify.

Fail/terminate node.js process if Mithril Ospec tests fail

When I run unit tests via Ospec for Mithril, I can see if tests fail locally in the console.
What I'm looking for is a solution that will not allow a following Node.js build script to execute if one or more of the tests fail.
I don't want code to be pushed up to another environment/lane if the unit tests aren't passing.
I don't see how to accomplish this in the docs.
In Node, I'm running ospec && someBuildProcess.
The answer might be a Node.js thing, but I'm at a loss for what to look for now.
ospec calls process.exit(1) if any tests fail, and the command string you posted should work. I just verified it locally with the following setup:
https://gist.github.com/tivac/d90c07592e70395639c63dd5100b50a6
ospec runs, fails, and the echo command never gets called.
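For reference, a minimal reproduction of the gating behavior (file name and assertion are illustrative) looks like this:
// tests/gate.spec.js: one deliberately failing assertion.
var o = require('ospec');

o.spec('build gate', function () {
    o('blocks the build on failure', function () {
        o(1 + 1).equals(3); // fails, so ospec exits with a non-zero status
    });
});
Because shells short-circuit && on a non-zero exit status, running ospec && someBuildProcess skips the build step whenever any test fails, which is exactly the guard being asked for.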
Can you post some more details about your setup?

Visibility of variables from different consoles in Ethereum

I'm new to Ethereum, and I'm trying to develop contracts using Azure's cluster (I have a trial account).
I connected to my network with geth from a machine in Azure:
gethadmin@XXXXXX-tx0:~$ geth attach http://ether2ore.eastus.cloudapp.azure.com:8545
Then I initialized a variable:
>var test_var = 555
undefined
>test_var
555
So far, so good.
But when I tried to connect to the same endpoint from my laptop:
C:\Users\boris>geth attach http://ether2ore.eastus.cloudapp.azure.com:8545
I tried to check this variable:
> test_var
ReferenceError: 'test_var' is not defined
at <anonymous>:1:1
I see that it's not defined.
On both consoles I see the same accounts:
From C:\Users\boris>
eth.accounts ["0xab14c61930343149c2f54044054cd46b90c0dee6", "0x7cc276b28bfdbb57151ed3b5552aafb2f2592964", "0xc9b8b9d57219b2c0935d8c28d4d2247fe70232f3", "0x2aed463fd54aa41fed898a9629bee6f0935b74fb"]
From gethadmin@XXXXXX-tx0
eth.accounts
["0xab14c61930343149c2f54044054cd46b90c0dee6", "0x7cc276b28bfdbb57151ed3b5552aafb2f2592964", "0xc9b8b9d57219b2c0935d8c28d4d2247fe70232f3", "0x2aed463fd54aa41fed898a9629bee6f0935b74fb"]
The command admin.peers gives the same results on both consoles,
so it's the same network.
Maybe I don't understand how this should work, but I assumed that if I define a variable on the network, it should be visible from all consoles. Is that not so?
It's the same situation with contracts: I compiled a contract in the first console and can operate on it, but it's not reachable from the other console.
Can you please explain why this happens, or give me the proper links to find the answer?
Many thanks
The variables you are instantiating are local variables within the console.
Each time you connect, a new JavaScript console with the web3 API exposed is created locally; it wraps the Ethereum function calls so that you can use them from JS rather than having to write the raw calls.
To persist data on the network, you would need to deploy a contract with storage. You could then fetch the data from the contract's storage without executing a transaction, using getStorageAt or a call with the signature of your public variable. However, you will need to execute a transaction that calls a contract function (similar to the example in the Solidity documentation) in order to update the data stored in the contract.
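To make the distinction concrete, here is what you could try in the two consoles (the contract address below is a placeholder, not a real deployment):
// Console A (geth attach ...): plain JS variables live only in this
// console's interpreter, never on the chain.
var test_var = 555; // visible in console A only

// Console B (another geth attach to the same node):
test_var; // ReferenceError: 'test_var' is not defined

// On-chain state, by contrast, is visible from any attached console.
// Read storage slot 0 of a deployed contract (placeholder address):
eth.getStorageAt('0x0000000000000000000000000000000000000000', 0);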

How to unit test a method which connects to mongo, without actually connecting to mongo?

I'm trying to write a test for a method that connects to Mongo, but I don't actually want Mongo running or a real connection in order for my tests to pass.
Here's my current test which is successful when my mongo daemon is running.
describe('with a valid mongo string parameter', function() {
    it('should return a fulfilled promise', function(done) {
        var con = mongoFactory.getConnection('mongodb://localhost:27017');
        // chai-as-promised: notify mocha once the assertion settles
        expect(con).to.be.fulfilled.and.notify(done);
    });
});
mongoFactory.getConnection code:
getConnection: function getConnection(connectionString) {
    var def = Q.defer(); // assuming the Q promise library; the original elided this as "do stuff here"
    // Initialize connection once
    MongoClient.connect(connectionString, function(err, database) {
        if (err) {
            def.reject(err);
            return; // don't fall through and resolve after a rejection
        }
        def.resolve(database);
    });
    return def.promise;
}
There are a couple of SO answers related to unit testing code that uses MongoDB as a data store:
Mocking database in node.js?
Mock/Test Mongodb Database Node.js
Embedded MongoDB when running integration tests
Similar: Unit testing classes that have online functionality
I'll make an attempt at consolidating these solutions.
Preamble
First and foremost, you should want MongoDB to be running while performing your tests. MongoDB's query language is complex, so running legitimate queries against a stable MongoDB instance is required to ensure your queries are running as planned and that your application is responding properly to the results. With this in mind, however, you should never run your tests against a production system, but instead a peripheral system to your integration environment. This can be on the same machine as your CI software, or simply relatively close to it (in terms of process, not necessarily network or geographically speaking).
This ENV could be low-footprint and completely run in memory (resource 1) (resource 2), but would not necessarily require the same performance characteristics as your production ENV. (If you want to performance test, this should be handled in a separate environment from your CI anyway.)
Setup
Install a mongod service specifically for CI. If repl sets and/or sharding are of concern (e.g. write concern, no use of $isolated, etc.), it is possible to mimic a clustered environment by running multiple mongod instances (1 config, 2x2 data for shard+repl) and a mongos instance on the same machine with either some init.d scripts/tweaks or something like Docker.
Use environment-specific configurations within your application (either embedded via .json files, or in some place like /etc, /home/user/.your-app or similar). Your application can load these based on a node environment variable like NODE_ENV=int. Within these configurations your db connection strings will differ. If you're not using env-specific configs, start doing this as a means to abstract the application runtime settings (i.e. "local", "dev", "int", "pre", "prod", etc.). I can provide a sample upon request; a minimal loader sketch follows this list.
Include test-oriented fixtures with your application/testing suite. As mentioned in one of the linked questions, MongoDB's Node.js driver supports some helper libraries: mongodb-fixtures and node-database-cleaner. Fixtures provide a working and consistent data set for testing: think of them as a bootstrap.
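As promised above, here is what an env-keyed configuration loader might look like (the file layout and key names are assumptions, not a prescribed format):
// config.js (sketch): pick a JSON config file by NODE_ENV.
// e.g. config/int.json -> { "mongo": { "uri": "mongodb://ci-host:27017/app" } }
const path = require('path');

function loadConfig() {
    const env = process.env.NODE_ENV || 'local';
    return require(path.join(__dirname, 'config', env + '.json'));
}

const config = loadConfig();
console.log('Connecting to', config.mongo.uri);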
Builds/Tests
Clean the associated database using something like node-database-cleaner.
Populate your fixtures into the now-empty database with the help of mongodb-fixtures (the clean-and-populate cycle is sketched after this list).
Perform your build and test.
Repeat.
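Sketched with the plain MongoDB driver rather than the helper libraries named above (the URI, collection names, and fixture shape are illustrative), steps 1 and 2 amount to:
const { MongoClient } = require('mongodb');

// Wipe each fixture collection, then load its documents.
async function resetDatabase(uri, fixtures) {
    const client = await MongoClient.connect(uri);
    const db = client.db();
    for (const [collection, docs] of Object.entries(fixtures)) {
        await db.collection(collection).deleteMany({}); // step 1: clean
        if (docs.length) {
            await db.collection(collection).insertMany(docs); // step 2: populate
        }
    }
    await client.close();
}

// e.g. before each suite:
// await resetDatabase('mongodb://localhost:27017/ci', { users: [{ name: 'fixture-user' }] });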
On the other hand...
If you still decide that not running MongoDB is the correct approach (and you wouldn't be the only one), then abstracting your data-store calls from the driver with an ORM is your best bet (for the entire application, not just testing). For example, something like model claims to be database-agnostic, although I've never used it. With this approach you would still require fixtures and env configurations, but you would not be required to install MongoDB. The caveat is that you're at the mercy of the ORM you choose.
You could try tingodb.
TingoDB is an embedded JavaScript in-process filesystem or in-memory database upwards compatible with MongoDB at the API level.
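A quick sketch of what that looks like (the data directory is illustrative and must already exist):
// tingodb mirrors the legacy MongoDB driver API but stores data in
// local files, so no mongod process is needed.
var Db = require('tingodb')().Db;
var db = new Db('./test-data', {});

var users = db.collection('users');
users.insert({ name: 'fixture-user' }, function (err, result) {
    users.findOne({ name: 'fixture-user' }, function (err, doc) {
        console.log(doc); // { name: 'fixture-user', _id: ... }
    });
});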