Testing element for display: none - jestjs

I'm hiding an element
button {
  display: none;
}
and trying to test if it's hidden with Jest:
it('should not show empty button', async () => {
  const { container } = renderComponent();
  await waitFor(() => {
    expect(container.querySelector('button')).not.toBeVisible();
  });
});
and I'm getting TypeError: expect(...).not.toBeVisible is not a function. I also tried
expect(container.querySelector('.ag-floating-filter-button-button')).toHaveStyle('display: none')
with the same error.

You can use window.getComputedStyle(element) to check the element's computed display property:
console.log(getComputedStyle(document.getElementById('test')).display === 'none'); // true
#test {
  display: none;
}
<div id="test"></div>

Related

Stencil unable to test mouseenter/mouseleave events using Jest

I built a simple button component using Stencil and assigned 4 events (onMouseDown, onMouseUp, onMouseEnter, onMouseLeave) to the button. The component looks like this:
.
.
.
@State() buttonState: string = 'disabled';
.
.
.
someInternalLogic(eventName: Events) {
  ... // just sets a state variable of this.buttonState
}
render() {
  return (
    <button
      onMouseDown={() => this.someInternalLogic(Events.MOUSEDOWN)}
      onMouseUp={() => this.someInternalLogic(Events.MOUSEUP)}
      onMouseEnter={() => this.someInternalLogic(Events.MOUSEENTER)}
      onMouseLeave={() => this.someInternalLogic(Events.MOUSELEAVE)}
    >
    </button>
  );
}
I'm new to testing in general and Jest in particular. I'm having trouble understanding how to test these events synthetically. I've come up with a workaround that works, but it is obviously not the way to go.
The workaround:
it('should mouseleave', async () => {
  const button = await page.root.shadowRoot.querySelector('button');
  const mouseleave = new window.Event("mouseleave", {
    bubbles: false,
    cancelable: false
  });
  let mouseleaveBool = false;
  button.addEventListener("mouseleave", e => {
    mouseleaveBool = true;
  });
  await button.dispatchEvent(mouseleave);
  await page.waitForChanges();
  expect(mouseleaveBool).toBeTruthy();
});
Instead of dispatching events, you can call the event handlers directly on your component instance.
So for this component:
export class TestBtn {
  onMouseLeave() {
    // do something
  }
  render() {
    return (
      <Host>
        <button onMouseLeave={() => this.onMouseLeave()}>Test</button>
      </Host>
    );
  }
}
the test can look like this:
import { newSpecPage } from '@stencil/core/testing';
import { TestBtn } from './test-btn'; // adjust to the component's actual path

describe('test-btn', () => {
  it('does something on mouse leave', async () => {
    // arrange
    const page = await newSpecPage({
      components: [TestBtn],
      html: `<test-btn></test-btn>`,
    });
    const component = page.rootInstance as TestBtn;
    // act
    component.onMouseLeave();
    // assert
    // check if it did something
  });
});
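If you also want to verify that the button is actually wired up to the handler, rather than calling it directly, one option is to spy on the method and dispatch a DOM event, much like the question's workaround does. A sketch under those assumptions:
it('calls onMouseLeave when the button emits mouseleave', async () => {
  const page = await newSpecPage({
    components: [TestBtn],
    html: `<test-btn></test-btn>`,
  });
  const component = page.rootInstance as TestBtn;
  const spy = jest.spyOn(component, 'onMouseLeave');
  // or page.root.querySelector('button') if the component does not use shadow DOM
  const button = page.root.shadowRoot.querySelector('button');
  button.dispatchEvent(new window.Event('mouseleave'));
  await page.waitForChanges();
  expect(spy).toHaveBeenCalled();
});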

mounted method giving error message - Nuxt.js, Jest

I am new to unit testing and have a question about the mounted method inside a component.
I am testing whether the button text displays correctly depending on one of the data values, and that test passes. However, I have one method in mounted() inside the component that makes an API call through the Nuxt context.
That method fails and logs an error from the try/catch, because it apparently cannot find the Nuxt context inside the test. This is not affecting my test, but I wonder whether that is fine or whether I need to fix something.
This is my component.
<template>
  <button>
    {{ variations.length > 0 ? 'Select options' : 'add to cart' }}
  </button>
</template>
<script>
export default {
  data() {
    return {
      variations: [],
    }
  },
  mounted() {
    this.getVaridations()
  },
  methods: {
    async getVaridations() {
      try {
        const variations = await this.$getVatiation.get() // this part fails and logs the error message from catch
        this.variations = variations
      } catch (err) {
        console.log(err) // logged as: TypeError: Cannot read properties of undefined (reading 'get')
      }
    },
  },
}
</script>
This is the test:
describe('Product.vue', () => {
  it('btn display as "Select options" when there is validation', () => {
    const wrapper = mount(Product, {})
    expect(wrapper.find('.product-btn').text()).toBe('Select options') // This passes
  })
})
You can mock any of the component's dependencies, such as the $getVatiation plugin, like this:
import { shallowMount } from "@vue/test-utils";
import Product from "@/components/Product.vue"; // adjust to the component's actual path

describe('Product.vue', () => {
  it('btn display as "Select options" when there is validation', () => {
    const mocks = {
      $getVatiation: {
        get: () => [] // returns an empty array, change to what you want to return
      }
    }
    const wrapper = shallowMount(Product, { mocks }) // pass your mocks as a mounting option
    expect(wrapper.find('.product-btn').text()).toBe('Select options')
  })
})
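Since the API call happens in mounted(), the promise needs to settle before asserting on the rendered text. A hedged sketch covering both branches of the template's ternary; the component import path is an assumption, the .product-btn selector is taken from the question's test, and flush-promises is a small helper package that waits for pending promises:
import { shallowMount } from '@vue/test-utils'
import flushPromises from 'flush-promises'
import Product from '@/components/Product.vue' // path is an assumption

it('shows "add to cart" when no variations are returned', async () => {
  const mocks = { $getVatiation: { get: () => Promise.resolve([]) } }
  const wrapper = shallowMount(Product, { mocks })
  await flushPromises() // let the mounted() API call resolve
  expect(wrapper.find('.product-btn').text()).toBe('add to cart')
})

it('shows "Select options" when variations exist', async () => {
  const mocks = { $getVatiation: { get: () => Promise.resolve([{ id: 1 }]) } }
  const wrapper = shallowMount(Product, { mocks })
  await flushPromises()
  expect(wrapper.find('.product-btn').text()).toBe('Select options')
})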

How to run mediapipe facemesh in an ES6 Node.js environment like React

I am trying to run this HTML example https://codepen.io/mediapipe/details/KKgVaPJ from https://google.github.io/mediapipe/solutions/face_mesh#javascript-solution-api in a Create React App application. I have already done the following:
npm install of all the facemesh mediapipe packages.
Replaced the jsdelivr tags with node imports, and I get the definitions and functions.
Replaced the video element with react-webcam.
I don't know how to replace this jsdelivr URL, and maybe that is what's affecting it:
const faceMesh = new FaceMesh({
  locateFile: (file) => {
    return `https://cdn.jsdelivr.net/npm/@mediapipe/face_mesh/${file}`;
  }
});
So the question is:
Why is the facemesh not showing? Is there any example of what I am trying to do?
This is my App.js code (sorry for the debugging scaffolding):
import './App.css';
import React, { useState, useEffect } from "react";
import Webcam from "react-webcam";
import { Camera, CameraOptions } from '@mediapipe/camera_utils'
import {
  FaceMesh,
  FACEMESH_TESSELATION,
  FACEMESH_RIGHT_EYE,
  FACEMESH_LEFT_EYE,
  FACEMESH_RIGHT_EYEBROW,
  FACEMESH_LEFT_EYEBROW,
  FACEMESH_FACE_OVAL,
  FACEMESH_LIPS
} from '@mediapipe/face_mesh'
import { drawConnectors } from '@mediapipe/drawing_utils'
const videoConstraints = {
width: 1280,
height: 720,
facingMode: "user"
};
function App() {
const webcamRef = React.useRef(null);
const canvasReference = React.useRef(null);
const [cameraReady, setCameraReady] = useState(false);
let canvasCtx
let camera
const videoElement = document.getElementsByClassName('input_video')[0];
// const canvasElement = document.getElementsByClassName('output_canvas')[0];
const canvasElement = document.createElement('canvas');
console.log('canvasElement', canvasElement)
console.log('canvasCtx', canvasCtx)
useEffect(() => {
camera = new Camera(webcamRef.current, {
onFrame: async () => {
console.log('{send}',await faceMesh.send({ image: webcamRef.current.video }));
},
width: 1280,
height: 720
});
canvasCtx = canvasReference.current.getContext('2d');
camera.start();
console.log('canvasReference', canvasReference)
}, [cameraReady]);
function onResults(results) {
console.log('results')
canvasCtx.save();
canvasCtx.clearRect(0, 0, canvasElement.width, canvasElement.height);
canvasCtx.drawImage(
results.image, 0, 0, canvasElement.width, canvasElement.height);
if (results.multiFaceLandmarks) {
for (const landmarks of results.multiFaceLandmarks) {
drawConnectors(canvasCtx, landmarks, FACEMESH_TESSELATION, { color: '#C0C0C070', lineWidth: 1 });
drawConnectors(canvasCtx, landmarks, FACEMESH_RIGHT_EYE, { color: '#FF3030' });
drawConnectors(canvasCtx, landmarks, FACEMESH_RIGHT_EYEBROW, { color: '#FF3030' });
drawConnectors(canvasCtx, landmarks, FACEMESH_LEFT_EYE, { color: '#30FF30' });
drawConnectors(canvasCtx, landmarks, FACEMESH_LEFT_EYEBROW, { color: '#30FF30' });
drawConnectors(canvasCtx, landmarks, FACEMESH_FACE_OVAL, { color: '#E0E0E0' });
drawConnectors(canvasCtx, landmarks, FACEMESH_LIPS, { color: '#E0E0E0' });
}
}
canvasCtx.restore();
}
const faceMesh = new FaceMesh({
  locateFile: (file) => {
    return `https://cdn.jsdelivr.net/npm/@mediapipe/face_mesh/${file}`;
  }
});
faceMesh.setOptions({
selfieMode: true,
maxNumFaces: 1,
minDetectionConfidence: 0.5,
minTrackingConfidence: 0.5
});
faceMesh.onResults(onResults);
// const camera = new Camera(webcamRef.current, {
// onFrame: async () => {
// await faceMesh.send({ image: videoElement });
// },
// width: 1280,
// height: 720
// });
// camera.start();
return (
<div className="App">
<Webcam
audio={false}
height={720}
ref={webcamRef}
screenshotFormat="image/jpeg"
width={1280}
videoConstraints={videoConstraints}
onUserMedia={() => {
console.log('webcamRef.current', webcamRef.current);
// navigator.mediaDevices
// .getUserMedia({ video: true })
// .then(stream => webcamRef.current.srcObject = stream)
// .catch(console.log);
setCameraReady(true)
}}
/>
<canvas
ref={canvasReference}
style={{
position: "absolute",
marginLeft: "auto",
marginRight: "auto",
left: 0,
right: 0,
textAlign: "center",
zindex: 9,
width: 1280,
height: 720,
}}
/>
</div >
);
}
export default App;
You don't have to replace the jsdelivr URL; that piece of code is fine. However, I think you need to reorder your code a little bit:
You should put the faceMesh initialization inside the useEffect, with [] as the dependency array, so that the setup runs once when the page is first rendered.
Also, you don't need to get videoElement and canvasElement with document.*, because you already have refs defined.
An example of the code:
useEffect(() => {
  // note: this example uses FaceDetection; the same pattern applies to FaceMesh from @mediapipe/face_mesh
  const faceMesh = new FaceDetection({
    locateFile: (file) => {
      return `https://cdn.jsdelivr.net/npm/@mediapipe/face_detection/${file}`;
    },
  });
  faceMesh.setOptions({
    maxNumFaces: 1,
    minDetectionConfidence: 0.5,
    minTrackingConfidence: 0.5,
  });
  faceMesh.onResults(onResults);
  if (
    typeof webcamRef.current !== "undefined" &&
    webcamRef.current !== null
  ) {
    camera = new Camera(webcamRef.current.video, {
      onFrame: async () => {
        await faceMesh.send({ image: webcamRef.current.video });
      },
      width: 1280,
      height: 720,
    });
    camera.start();
  }
}, []);
Finally, in the onResults callback I would suggest printing the results first, just to check that the MediaPipe implementation is working. And don't forget to set the canvas size before drawing anything.
function onResults(results) {
  console.log(results)
  canvasCtx = canvasReference.current.getContext('2d')
  canvasReference.current.width = webcamRef.current.video.videoWidth;
  canvasReference.current.height = webcamRef.current.video.videoHeight;
  ...
}
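For the question's actual use case, the same structure can be written against the FaceMesh class and jsdelivr URL that already appear in the question. A rough sketch, not a drop-in implementation; onResults is assumed to be the callback defined above:
useEffect(() => {
  const faceMesh = new FaceMesh({
    locateFile: (file) => `https://cdn.jsdelivr.net/npm/@mediapipe/face_mesh/${file}`,
  });
  faceMesh.setOptions({
    maxNumFaces: 1,
    minDetectionConfidence: 0.5,
    minTrackingConfidence: 0.5,
  });
  faceMesh.onResults(onResults);
  if (webcamRef.current !== null) {
    const camera = new Camera(webcamRef.current.video, {
      onFrame: async () => {
        await faceMesh.send({ image: webcamRef.current.video });
      },
      width: 1280,
      height: 720,
    });
    camera.start();
  }
}, []);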
Good luck! :)

Jest / Enzyme - How to test at different viewports?

I am trying to run a test on a component at a certain viewport width. I am doing the following, but this doesn't seem to change it:
test('Component should do something at a certain viewport width.', () => {
  global.innerWidth = 2000;
  const component = mount(<SomeComponent />);
  ...
});
I also found an article that explains how to do it using JSDom, but as Jest now ships with JSDom, I wondered if there was a native solution.
https://www.codementor.io/pkodmad/dom-testing-react-application-jest-k4ll4f8sd
Background Information:
jsdom does not implement window.resizeBy() or window.resizeTo()
jsdom defines the window innerWidth and innerHeight to be 1024 x 768
It is possible to simulate a window resize using jsdom by manually setting window.innerWidth and window.innerHeight and firing the resize event
Here is an example:
comp.js
import * as React from 'react';

export default class Comp extends React.Component {
  constructor(...args) {
    super(...args);
    this.state = { width: 0, height: 0 }
  }
  updateDimensions = () => {
    this.setState({ width: window.innerWidth, height: window.innerHeight });
  }
  componentDidMount() {
    this.updateDimensions();
    window.addEventListener("resize", this.updateDimensions);
  }
  componentWillUnmount() {
    window.removeEventListener("resize", this.updateDimensions);
  }
  render() {
    return <div>{this.state.width} x {this.state.height}</div>;
  }
}
comp.test.js
import * as React from 'react';
import { shallow } from 'enzyme';
import Comp from './comp';

const resizeWindow = (x, y) => {
  window.innerWidth = x;
  window.innerHeight = y;
  window.dispatchEvent(new Event('resize'));
}

describe('Comp', () => {
  it('should display the window size', () => {
    const component = shallow(<Comp />);
    expect(component.html()).toEqual('<div>1024 x 768</div>');
    resizeWindow(500, 300);
    expect(component.html()).toEqual('<div>500 x 300</div>');
    resizeWindow(2880, 1800);
    expect(component.html()).toEqual('<div>2880 x 1800</div>');
  });
});
Notes:
As of Enzyme v3, shallow calls React lifecycle methods like componentDidMount(), so it can be used in place of mount.
This answer borrows heavily from the information here, here, here, and @JoeTidee's own answer here.
If you're using TypeScript, it will complain that window.innerWidth/innerHeight are readonly.
You can get around this either by redefining the property:
Object.defineProperty(window, 'innerWidth', { writable: true, configurable: true, value: 105 });
or by using Object.assign:
window = Object.assign(window, { innerWidth: 105 });
Neither is an especially nice solution, but they work.
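The two workarounds combine naturally into a single test helper. A small sketch, assuming a TypeScript test file running under jsdom:
// force jsdom's readonly dimensions, then notify any resize listeners
const resizeWindow = (width: number, height: number) => {
  Object.defineProperty(window, 'innerWidth', { writable: true, configurable: true, value: width });
  Object.defineProperty(window, 'innerHeight', { writable: true, configurable: true, value: height });
  window.dispatchEvent(new Event('resize'));
};

resizeWindow(500, 300); // components listening for 'resize' now see 500 x 300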
This works for me; the code is no longer marked as uncovered.
it('resize event listener changes the state', () => {
  const wrapper = shallow(<Component />);
  const instance = wrapper.instance();
  instance.setState({
    mobileMode: true
  });
  global.innerWidth = 800;
  window.dispatchEvent(new Event('resize'));
  expect(instance.state.mobileMode).toBeFalsy();
  global.innerWidth = 600;
  window.dispatchEvent(new Event('resize'));
  expect(instance.state.mobileMode).toBeTruthy();
});
The resize listener inside my component:
...
resizeListener = () => {
  if (window.innerWidth < 768) {
    this.setState({
      mobileMode: true
    });
  } else {
    this.setState({
      mobileMode: false
    });
  }
};
window.addEventListener('resize', resizeListener);
...
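For context, here is a sketch of how that listener could sit inside a complete component, mirroring the comp.js example above; the 768px breakpoint comes from the excerpt, while the names and render output are assumptions:
import * as React from 'react';

export default class ResponsiveComp extends React.Component {
  state = { mobileMode: window.innerWidth < 768 };

  resizeListener = () => {
    this.setState({ mobileMode: window.innerWidth < 768 });
  };

  componentDidMount() {
    window.addEventListener('resize', this.resizeListener);
  }

  componentWillUnmount() {
    window.removeEventListener('resize', this.resizeListener);
  }

  render() {
    return <div>{this.state.mobileMode ? 'mobile' : 'desktop'}</div>;
  }
}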

How can I use Esri Arcgis Map in ReactJs Project?

I'm trying to use an Esri map. To include the map in my project, here is what I found:
require([
  "esri/map",
  "esri/dijit/Search",
  "esri/dijit/LocateButton",
  "esri/geometry/Point",
  "esri/symbols/SimpleFillSymbol",
  "esri/symbols/SimpleMarkerSymbol",
  "esri/symbols/SimpleLineSymbol",
But there isn't any esri folder or npm package, so I'm confused. How is esri imported into the project?
Use esri-loader to load the required esri modules. This is a component that renders a basemap.
import React, { Component } from 'react';
import { loadModules } from 'esri-loader';

const options = {
  url: 'https://js.arcgis.com/4.6/'
};

const styles = {
  container: {
    height: '100vh',
    width: '100vw'
  },
  mapDiv: {
    padding: 0,
    margin: 0,
    height: '100%',
    width: '100%'
  },
}

class BaseMap extends Component {
  constructor(props) {
    super(props);
    this.state = {
      status: 'loading'
    }
  }
  componentDidMount() {
    loadModules(['esri/Map', 'esri/views/MapView'], options)
      .then(([Map, MapView]) => {
        const map = new Map({ basemap: "streets" });
        const view = new MapView({
          container: "viewDiv",
          map,
          zoom: 15,
          center: [78.4867, 17.3850]
        });
        view.then(() => {
          this.setState({
            map,
            view,
            status: 'loaded'
          });
        });
      })
  }
  renderMap() {
    if (this.state.status === 'loading') {
      return <div>loading</div>;
    }
  }
  render() {
    return (
      <div style={styles.container}>
        <div id='viewDiv' style={styles.mapDiv}>
          {this.renderMap()}
        </div>
      </div>
    )
  }
}

export default BaseMap;
This renders a base map, but it is not responsive. If I remove the div around the viewDiv, or if I give the outer div (surrounding viewDiv) a relative height and width ({ height: '100%', width: '100%' }), the map does not render. I have no idea why. Any suggestions to make it responsive would be appreciated.
An alternative method to the above is the one demonstrated in esri-react-router-example. That application uses a library called esri-loader to lazy load the ArcGIS API only in components/routes where it is needed. Example:
First, install the esri-loader library:
npm install esri-loader --save
Then import the esri-loader functions in any react module:
import * as esriLoader from 'esri-loader'
Then lazy load the ArcGIS API:
componentDidMount () {
  if (!esriLoader.isLoaded()) {
    // lazy load the arcgis api
    const options = {
      // use a specific version instead of latest 4.x
      url: '//js.arcgis.com/3.18compact/'
    }
    esriLoader.bootstrap((err) => {
      if (err) {
        console.error(err)
      }
      // now that the arcgis api has loaded, we can create the map
      this._createMap()
    }, options)
  } else {
    // arcgis api is already loaded, just create the map
    this._createMap()
  }
},
Then load the ArcGIS API's (Dojo) modules that are needed to create a map:
_createMap () {
  // get item id from route params or use default
  const itemId = this.props.params.itemId || '8e42e164d4174da09f61fe0d3f206641'
  // require the map class
  esriLoader.dojoRequire(['esri/arcgis/utils'], (arcgisUtils) => {
    // create a map at a DOM node in this component
    arcgisUtils.createMap(itemId, this.refs.map)
      .then((response) => {
        // hide the loading indicator
        // and show the map title
        // NOTE: this will trigger a rerender
        this.setState({
          mapLoaded: true,
          item: response.itemInfo.item
        })
      })
  })
}
The benefit of using esri-loader over the approach shown above is that you don't have to use the Dojo loader and toolchain to load and build your entire application. You can use the React toolchain of your choice (webpack, etc).
This blog post explains how this approach works and compares it to other (similar) approaches used in applications like esri-redux.
You don't need to import the Esri API the way you would a normal ReactJS module. Since the React file ultimately compiles into a JS file, you write the Esri parts as they are and mix in the ReactJS part for handling the DOM node, which is the main purpose of ReactJS.
A sample from the links below:
define([
'react',
'esri/toolbars/draw',
'esri/geometry/geometryEngine',
'dojo/topic',
'dojo/on',
'helpers/NumFormatter'
], function(
React,
Draw, geomEngine,
topic, on,
format
) {
var fixed = format(3);
var DrawToolWidget = React.createClass({
getInitialState: function() {
return {
startPoint: null,
btnText: 'Draw Line',
distance: 0,
x: 0,
y: 0
};
},
componentDidMount: function() {
this.draw = new Draw(this.props.map);
this.handler = this.draw.on('draw-end', this.onDrawEnd);
this.subscriber = topic.subscribe(
'map-mouse-move', this.mapCoordsUpdate
);
},
componentWillUnmount: function() {
this.handler.remove();
this.subscriber.remove();
},
onDrawEnd: function(e) {
this.draw.deactivate();
this.setState({
startPoint: null,
btnText: 'Draw Line'
});
},
mapCoordsUpdate: function(data) {
this.setState(data);
// not sure I like this conditional check
if (this.state.startPoint) {
this.updateDistance(data);
}
},
updateDistance: function(endPoint) {
var distance = geomEngine.distance(this.state.startPoint, endPoint);
this.setState({ distance: distance });
},
drawLine: function() {
this.setState({ btnText: 'Drawing...' });
this.draw.activate(Draw.POLYLINE);
on.once(this.props.map, 'click', function(e) {
this.setState({ startPoint: e.mapPoint });
// soo hacky, but Draw.LINE interaction is odd to use
on.once(this.props.map, 'click', function() {
this.onDrawEnd();
}.bind(this));
}.bind(this))
},
render: function() {
return (
<div className='well'>
<button className='btn btn-primary' onClick={this.drawLine}>
{this.state.btnText}
</button>
<hr />
<p>
<label>Distance: {fixed(this.state.distance)}</label>
</p>
</div>
);
}
});
return DrawToolWidget;
});
Below are the links where you can find information in detail.
http://odoe.net/blog/esrijs-reactjs/
https://geonet.esri.com/people/odoe/blog/2015/04/01/esrijs-with-reactjs-updated
