Is it possible to run Selenium-like tests in a Windows Docker container? - node.js

We have a Windows Electron application that runs e2e tests via Spectron. The application is platform-dependent and won't run in Linux containers. We want to run our Spectron e2e tests inside a preconfigured Docker container to have them isolated.
To get a grasp of the problem, I built a minimal Node.js application that does basically nothing and has an e2e test (Jest) that opens a browser tab and checks the title. No real functionality, just a simple spike.
I created a Dockerfile to build a container to run the tests:
FROM mcr.microsoft.com/windows:20H2-amd64
RUN mkdir "C:/app"
WORKDIR "C:/app"
COPY app "C:/app"
RUN powershell -Command \
    Set-ExecutionPolicy unrestricted;
ENV chocolateyUseWindowsCompression false
RUN powershell -Command \
    iex ((new-object net.webclient).DownloadString('https://chocolatey.org/install.ps1'));
RUN choco install googlechrome -y --version=91.0.4472.101 --ignore-checksums
RUN choco install chromedriver -y --version=91.0.4472.1010 --ignore-checksums
RUN choco install nodejs-lts -y --version=14.17.1
RUN npm config set strict-ssl false
RUN npm install
ENTRYPOINT npm test
Note that this is a Windows container, as our main app will also need a Windows container to run. The container builds, and the test runs, but it crashes with the error SessionNotCreatedError: session not created, from tab crashed. On my Windows host, the test runs fine.
Is there anything wrong with my Dockerfile, or is this simply not possible in a Windows container?
I don't think it's relevant to the problem, but here is also the test file that gets executed when the container runs npm test:
const {
  Builder,
  By,
  Key,
  until
} = require('selenium-webdriver');

describe('test google.com', () => {
  let driver;

  beforeEach(() => {
    driver = new Builder()
      .forBrowser('chrome')
      .build();
  });

  // return the promise so Jest waits for the browser to close
  afterEach(() => driver.quit());

  it('should open google search', async () => {
    await driver.get('http://www.google.com');
    // await the title so the assertion actually runs before the test ends
    const title = await driver.getTitle();
    expect(title).toEqual('Google');
  });
});
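One thing I have not tried yet is passing container-friendly flags to Chrome; tab crashes inside containers are often mitigated that way. A sketch of how the flags would plug into the builder (these are the usual suggestions for restricted environments, not something I have verified in this Windows container):
const chrome = require('selenium-webdriver/chrome');

// flags commonly suggested when Chrome runs inside a container
const options = new chrome.Options()
  .addArguments('--headless', '--no-sandbox', '--disable-gpu', '--disable-dev-shm-usage');

driver = new Builder()
  .forBrowser('chrome')
  .setChromeOptions(options)
  .build();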

We had a similar problem, but we are using .NET Core with Selenium. For some reason, installing the ChromeDriver did not work inside the container, so we had to do two things:
manually download the driver matching the Chrome version and extract the zip into the working directory, as in the sketch below (it's been a while though, and we haven't updated the image since; installing via choco may work now);
even stranger, we also had to install some fonts for some reason.
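A sketch of that manual-download step as Dockerfile RUN lines, assuming the Chrome 91.0.4472.101 pin from the question (the URL pattern is the official ChromeDriver download bucket; adjust the version to match the installed Chrome):
# fetch the ChromeDriver build matching the pinned Chrome version
# and unpack it into the working directory
RUN powershell -Command \
    Invoke-WebRequest -Uri 'https://chromedriver.storage.googleapis.com/91.0.4472.101/chromedriver_win32.zip' -OutFile 'chromedriver.zip'; \
    Expand-Archive 'chromedriver.zip' -DestinationPath 'C:/app'; \
    Remove-Item 'chromedriver.zip'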
Take a look at my repo: https://github.com/yamac-kurtulus/Windows-Docker-Images/tree/master/DotnetCore%20Selenium%20With%20Chrome
The relevant part is after line 23 in the Dockerfile.
Note: if you are not too deep into the project yet, I strongly suggest you migrate to Linux. Working with Docker on Windows is like a nightmare that you cannot wake up from.

Related

How can I execute a command inside a Docker container from a Node app?

I have a Node app running, and I need to access a command that lives in an Alpine Docker image. Do I have to use exec from inside JavaScript?
How can I install LaTeX on an Alpine container and use it from a Node app?
I pulled an Alpine Docker image, started it, and installed LaTeX.
Now I have a Docker container running on my host. I want to access this LaTeX compiler from inside my Node app (dockerized or not) and be able to compile *.tex files into *.pdf.
If I sh into the Alpine image I can compile *.tex into *.pdf just fine, but how can I access this software from outside the container, e.g. from a Node app?
If you just want to run the LaTeX engine over files that you have in your local container filesystem, you should install it directly in your image and run it as an ordinary subprocess.
For example, this JavaScript code will run in any environment that has LaTeX installed locally, Docker or otherwise:
const { execFileSync } = require('node:child_process');
const { mkdtemp, writeFile } = require('node:fs/promises');

async function buildPdf() {
  // write the .tex source into a scratch directory
  const tmpdir = await mkdtemp('/tmp/latex-');
  await writeFile(
    tmpdir + '/input.tex',
    '\\documentclass{article}\n\\begin{document}\n...\n\\end{document}\n'
  );
  // run pdflatex as an ordinary subprocess in that directory
  execFileSync('pdflatex', ['input'], { cwd: tmpdir, stdio: 'inherit' });
  return tmpdir + '/input.pdf'; // the generated PDF
}
In a Docker context, you'd have to make sure LaTeX is installed in the same image as your Node application. You mention an Alpine-based LaTeX setup, so you could do something like:
FROM node:lts-alpine
RUN apk add texlive-full # or maybe a smaller subset
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY ./ ./
CMD ["node", "main.js"]
You should not try to directly run commands in other Docker containers. There are several aspects of this that are tricky, including security concerns and managing the input and output files. If it's possible to directly invoke a command in a new or existing container, it's also very straightforward to use that permission to compromise the entire host.

Selenium on a Docker container

I have a simple JavaScript file that uses Selenium to launch Chrome, open a website, and collect some data from it. Now I want to put that file into a Docker container and run it there.
To do this I wrote the following Dockerfile:
FROM alpine
RUN apk add --update nodejs npm chromium
COPY . /src
WORKDIR /src
RUN npm install -g chromedriver
RUN npm install
ENTRYPOINT ["node", "index.js"]
The image builds with no errors, but when I attempt to run the container from it I get:
/src/node_modules/selenium-webdriver/remote/index.js:248
reject(Error(e.message))
^
Error: Server terminated early with status 1
at /src/node_modules/selenium-webdriver/remote/index.js:248:24
at processTicksAndRejections (node:internal/process/task_queues:96:5)
There's only one dependency for my index.js file, "selenium-webdriver": "^4.1.1", and the file itself looks like this:
index.js
const { Builder, By } = require('selenium-webdriver');

(async function example() {
  let driver = await new Builder().forBrowser('chrome').build();
  try {
    // Navigate to the URL
    await driver.get('https://www.example.com');
    // Get all the elements available with tag 'p'
    let elements = await driver.findElements(By.css('p'));
    for (let e of elements) {
      console.log(await e.getText());
    }
  } finally {
    await driver.quit();
  }
})();
What am I doing wrong? How can I make the container run successfully?
I think the problem is with selenium-webdriver/chrome, since the builder is getting stuck on mine as well.
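A likely culprit, though the thread doesn't confirm it: the chromedriver npm package downloads a glibc-linked binary, which won't start on Alpine's musl libc, and that would explain the driver server dying immediately ('Server terminated early with status 1'). A sketch of using Alpine's own packages instead, assuming the apk line is changed to RUN apk add --update nodejs npm chromium chromium-chromedriver (which installs /usr/bin/chromium-browser and /usr/bin/chromedriver):
const { Builder } = require('selenium-webdriver');
const chrome = require('selenium-webdriver/chrome');

(async function example() {
  // point Selenium at Alpine's chromium build and its matching chromedriver,
  // with the flags usually needed for Chrome inside a container
  const options = new chrome.Options()
    .setChromeBinaryPath('/usr/bin/chromium-browser')
    .addArguments('--headless', '--no-sandbox', '--disable-dev-shm-usage');

  const driver = await new Builder()
    .forBrowser('chrome')
    .setChromeOptions(options)
    .setChromeService(new chrome.ServiceBuilder('/usr/bin/chromedriver'))
    .build();

  await driver.quit();
})();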

MeteorUp volumes and how Meteor can access their contents

First, thank you for reading my question. This is my first time on Stack Overflow, and I did a lot of research before asking.
CONTEXT
I'm developing a Meteor app that is used as a CMS; I create content and store data in MongoDB collections. The goal is to use this data with a React project to build a static website, which is then sent to an AWS S3 bucket for hosting.
I'm using MeteorUp to deploy my Meteor app (on an AWS EC2 instance), and following the MeteorUp documentation (http://meteor-up.com/docs.html#volumes), I added a Docker volume in my mup.js:
module.exports = {
  ...
  meteor: {
    ...
    volumes: {
      '/opt/front': '/front'
    },
    ...
  },
  ...
};
Once deployed, the volume is set correctly in '/opt/myproject/config/start.sh':
sudo docker run \
  -d \
  --restart=always \
  $VOLUME \
  \
  --expose=3000 \
  \
  --hostname="$HOSTNAME-$APPNAME" \
  --env-file=$ENV_FILE \
  \
  --log-opt max-size=100m --log-opt max-file=10 \
  -v /opt/front:/front \
  --memory-reservation 600M \
  \
  --name=$APPNAME \
  $IMAGE
echo "Ran abernix/meteord:node-8.4.0-base"

# When using a private docker registry, the cleanup run in
# Prepare Bundle is only done on one server, so we also
# cleanup here so the other servers don't run out of disk space
if [[ $VOLUME == "" ]]; then
  # The app starts much faster when prepare bundle is enabled,
  # so we do not need to wait as long
  sleep 3s
else
  sleep 15s
fi
On my EC2 instance, '/opt/front' contains the React project used to generate the static website.
This folder includes a package.json file, and every module is available in the 'node_modules' directory. 'react-scripts' is one of them, and package.json contains the following script line:
"build": "react-scripts build",
React Project
The React app is fed with a JSON file available at 'opt/front/src/assets/datas/publish.json'.
This JSON file can be hand-written (so the project can be developed independently) or generated by my Meteor app.
Meteor App
Client-side, on the User Interface, we have a 'Publish' button that the administrator can click whenever she/he wants to generate the static website (using the CMS data) and deploy it to the S3 bucket.
It calls a Meteor method (server-side) whose action is separated into three steps:
1. Collect every useful piece of data and save it into a Publish collection.
2. JSON creation
a. Get the Publish collection's first entry into a JavaScript object.
b. Write a JSON file from that object into the React project directory ('opt/front/src/assets/datas/publish.json').
Here's the code:
import fs from 'fs';

let publishDatas = Publish.find({}, { sort: { createdAt: -1 } }).fetch();
let jsonDatasString = JSON.stringify(publishDatas[0]);

fs.writeFile('/front/src/assets/datas/publish.json', jsonDatasString, 'utf8', function (err) {
  if (err) {
    return console.log(err);
  }
});
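(A side note, not from the original post: fs.writeFile returns before the file is on disk, so step 3 could in principle start before publish.json exists. A synchronous write rules that race out; a sketch:)
import fs from 'fs';

// sketch: write synchronously so the build step cannot start
// before publish.json actually exists on disk
const publishDatas = Publish.find({}, { sort: { createdAt: -1 } }).fetch();
fs.writeFileSync(
  '/front/src/assets/datas/publish.json',
  JSON.stringify(publishDatas[0]),
  'utf8'
);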
3. Static website build
a. Run a cd command to reach the React project's directory, then run the 'build' script, using this code:
process_exec_sync = function (command) {
  // Load Future from fibers
  var Future = Npm.require("fibers/future");
  // Load child_process
  var child = Npm.require("child_process");
  // Create a new future
  var future = new Future();
  // Run the command asynchronously; the future below makes the call synchronous
  child.exec(command, { maxBuffer: 1024 * 10000 }, function (error, stdout, stderr) {
    // return an object that identifies error and success
    var result = {};
    // test for error
    if (error) {
      result.error = error;
    }
    // return stdout
    result.stdout = stdout;
    future.return(result);
  });
  // wait for the future
  return future.wait();
};
var build = process_exec_sync('(cd front && npm run build)');
b. If the build succeeds, I send the 'front/build' content to my S3 bucket.
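As an aside (not from the original post): the same thing can be done without the shell cd by giving child_process an explicit working directory; a sketch assuming the same '/front' mount point:
var child = Npm.require("child_process");

// sketch: run the build with an explicit cwd instead of `cd`,
// inheriting the container's environment so npm resolves normally
var stdout = child.execSync("npm run build", {
  cwd: "/front",
  maxBuffer: 1024 * 10000,
  env: process.env,
});
console.log(stdout.toString());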
Behaviors:
On the local environment (Meteor running in development mode):
FYI: the React project directory's name and location are slightly different. It is located inside my Meteor project directory, and instead of 'front' it is named '.#front', because I don't want Meteor to restart every time a file is modified, added, or deleted.
Everything works well, but I'm fully aware that I'm in development mode and benefit from my local environment.
On the production environment (Meteor running in production mode in a Docker container):
Step 2.b works well: I can see the newly generated file in 'opt/front/src/assets/datas/'.
Step 3.a: I get the following error:
"Error running ls: Command failed: (cd /front && npm run build)
(node:39) ExperimentalWarning: The WHATWG Encoding Standard implementation is an experimental API. It should not yet be used in production applications.
npm ERR! code ELIFECYCLE
npm ERR! errno 1
npm ERR! front@0.1.0 build: react-scripts build
npm ERR! Exit status 1
npm ERR!
npm ERR! Failed at the front@0.1.0 build script.
npm ERR! This is probably not a problem with npm. There is likely additional logging output above.
npm ERR! A complete log of this run can be found in:
npm ERR! /root/.npm/_logs/2021-09-16T13_55_24_043Z-debug.log
[exec-fail]"
So here's my question:
In production mode, is it possible to use Meteor to reach another directory and run a script from its package.json?
I've been searching for an answer for months and can't find a similar or nearby case.
Am I doing something wrong? Am I using the wrong approach? Am I crazy? :D
Thank you so much for having read to the end, and thank you for your answers!
!!!!! UPDATE !!!!!
I found the solution!
In fact, I had to check a few things on my EC2 instance over ssh:
Once connected, I went to '/opt/front/' and tried to build the React app with 'npm run build'.
I got a first error because CHMOD was not set to 777 on that directory (noob!).
Then I got an error because of node-sass. The reason is that my Docker image uses Node v8, while my EC2 instance uses Node v16. I had to install NVM, switch to Node v8, then delete my React app's node_modules (and package-lock.json) and reinstall.
Once that was done, everything worked perfectly!
I now have a Meteor app acting as a CMS / preview website, hosted on an EC2 instance, that can publish a static website to an S3 bucket.
Thank you for reading!

Use a Docker SDK to send commands to the Docker machine from a web app

I'm new to Docker and I have some difficulty understanding how I should use it.
For now, I'm wondering whether it makes sense to send commands to a Docker machine on my computer from the client-side script of a JavaScript web app, using an SDK like Dockerode.
I installed Docker CE for Windows (17.06.0-ce) and Docker Toolbox, and I ran a container on the default machine using the Docker terminal. Now I'm wondering if the commands I typed could be sent from a web app using Node.js. I tried this code:
import Docker from 'dockerode';

const docker = new Docker({ host: 'myDefaultMachineHost' });

export function createLocalDb() {
  docker.pull('someImageFromDockerHub', function (err, stream) {
    if (err) console.log("Catch : " + err.toString());
    stream.pipe(process.stdout, { end: true });
    stream.on('end', function () {
      // run the container
    }).catch(function (err) {
      console.log("Catch : " + err.toString());
    });
  });
}
But that doesn't work (stream.pipe throws an error). Am I misunderstanding the context in which I'm supposed to use Dockerode?
Thanks for your explanations!
In short: you need to change your code to const docker = new Docker({socketPath: '/var/run/docker.sock'}); and make the Docker socket available inside your container.
Theory:
You have a Docker socket on your local machine, and you need to mount that socket inside your Docker container. A volume is the solution for this.
Implementation with arguments
This is a simple task for Linux/Mac users, who can run
docker run -v /var/run/docker.sock:/var/run/docker.sock ...
On Windows you need to run
docker run -v //var/run/docker.sock:/var/run/docker.sock ...
More details in this question.
Implementation with Dockerfile
You can also add a VOLUME instruction to your Dockerfile. Note that in a Dockerfile, VOLUME only declares a mount point; the actual binding of the host socket still happens at docker run time. On Linux/Mac the line would look like this:
VOLUME /var/run/docker.sock
I don't know how it would look on Windows; I use a Mac.
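Once the socket is mounted, the original snippet needs only small changes. A minimal sketch (mongo:latest is a stand-in image, and followProgress is Dockerode's helper for waiting until a pull finishes):
const Docker = require('dockerode');

// talk to the daemon through the socket mounted into this container
const docker = new Docker({ socketPath: '/var/run/docker.sock' });

function createLocalDb() {
  docker.pull('mongo:latest', function (err, stream) {
    if (err) return console.error(err);
    // wait until the pull has completely finished before starting a container
    docker.modem.followProgress(stream, async function (err) {
      if (err) return console.error(err);
      const container = await docker.createContainer({ Image: 'mongo:latest' });
      await container.start();
      console.log('container started');
    });
  });
}

createLocalDb();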

Why isn't my server restarting / code updating using Docker + Nodejs?

My Dockerfile is super simple:
FROM node:4-onbuild
RUN npm install gulp -g;
EXPOSE 8888
This image automatically runs the start script in package.json, which I have set simply to gulp.
If I run gulp on my host machine and make a change to a Node file, it automatically restarts the server:
var gulp = require('gulp');
var nodemon = require('gulp-nodemon');

gulp.task('default', function() {
  nodemon({
    script: 'server.js', // starts up the server on port 4000
    env: { 'NODE_ENV': 'development' }
  });
});
Figuring everything is okay, I run this: docker run -d -p 1234:4000 -v $(pwd):/usr/src/app my-image
Going to http://192.168.99.100:1234/ shows 'Hello World!' from my server.js file. Updating the file does NOT update what I see when hitting that URL again. If I exec into the container, I can see the file is updated. Since the container started Node via the same gulp command, I don't understand why the Node server wouldn't have restarted and shown the update.
The TL;DR of this is that you need to set nodemon to poll the filesystem for changes, as described here: https://github.com/remy/nodemon#application-isnt-restarting
In some networked environments (such as a container running nodemon reading across a mounted drive), you will need to use legacyWatch: true, which enables Chokidar's polling.
Via the CLI, use either --legacy-watch or -L. Applied to your gulpfile, that looks like the sketch below.
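A sketch of what that would look like in the gulpfile from the question (gulp-nodemon passes its options through to nodemon, so legacyWatch should take effect there too):
var gulp = require('gulp');
var nodemon = require('gulp-nodemon');

gulp.task('default', function() {
  nodemon({
    script: 'server.js',
    env: { 'NODE_ENV': 'development' },
    // poll the filesystem instead of relying on inotify events,
    // which do not propagate through the VM's mounted drive
    legacyWatch: true
  });
});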
The longer version is this (with one key assumption: you're using Docker on Mac or similar):
On Mac and similar setups, Docker doesn't run natively; it runs inside a virtual machine (generally VirtualBox via docker-machine). Virtual machines generally don't propagate filesystem inotify events, which is what most watchers rely on to restart or perform an action when a file changes. Because the virtual machine doesn't propagate the events from the host, Docker never receives them. Your original Dockerfile would probably work on a native Linux machine.
There's an open issue and much more detailed discussion of this here.
