Cypress is returning an empty array when trying to log the sheet names of an Excel file

I am currently trying to get the sheet names of an Excel file, but Cypress is returning an empty array. Is there something I missed? I'll be using the sheet names to verify data in later steps.
I'm using Cypress 9.6.0 with Cucumber. Below are my scripts:
index.js (plugins file) for the task:
const fs = require('fs');
const xlsx = require('xlsx');
// assuming the cypress-cucumber-preprocessor package provides cucumber()
const cucumber = require('cypress-cucumber-preprocessor').default;

module.exports = (on, config) => {
  on('file:preprocessor', cucumber());
  on('task', {
    checkExcelSheetContents(args) {
      if (fs.existsSync(args.filePath)) {
        const workbook = xlsx.readFile(args.filePath);
        return xlsx.utils.sheet_to_json(workbook.SheetNames)
      } else {
        throw new Error("File not found")
      }
    }
  })
  return Object.assign({}, config, {
    fixturesFolder: 'cypress/fixtures',
    integrationFolder: 'cypress/integration',
    screenshotsFolder: 'cypress/screenshots',
    videosFolder: 'cypress/videos',
    supportFile: 'cypress/support/index.js'
  });
}
Step definition (.js) file:
And('try', () => {
  var excelFilePath = "../CreateAutomatedTests/cypress/downloads/courses20220714_09_51_27.xlsx"
  cy.wrap(excelFilePath).as('filePath')
  cy.get('#filePath').then((filePath) => {
    cy.task('checkExcelSheetContents', { filePath }).then((contents) => {
      cy.log(contents)
    })
  })
})

The empty array comes from xlsx.utils.sheet_to_json(workbook.SheetNames): sheet_to_json expects a worksheet object, not the array of sheet names, so it has nothing to convert. If you only need the names, return workbook.SheetNames directly. Beyond that, I've always used the buffer version of xlsx.read().
From the xlsx package documentation:
For Node ESM, the readFile helper is not enabled. Instead, fs.readFileSync should be used to read the file data as a Buffer for use with XLSX.read:
import { readFileSync } from "fs";
import { read } from "xlsx/xlsx.mjs";
const buf = readFileSync("test.xlsx");
/* buf is a Buffer */
const workbook = read(buf);
Your task:
on('task', {
  checkExcelSheetContents(args) {
    if (fs.existsSync(args.filePath)) {
      const buf = fs.readFileSync(args.filePath);
      const workbook = xlsx.read(buf, { type: 'buffer' });
      return workbook.SheetNames
    } else {
      throw new Error("File not found")
    }
  }
})
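For completeness, here is a sketch of how the step definition could consume the task and assert on the result; the step name and the expected sheet name 'Sheet1' are placeholders, and the alias is read back with cy.get('@filePath') rather than a CSS selector:
And('I verify the exported sheet names', () => {
  const excelFilePath = "../CreateAutomatedTests/cypress/downloads/courses20220714_09_51_27.xlsx"
  cy.wrap(excelFilePath).as('filePath')
  cy.get('@filePath').then((filePath) => {
    cy.task('checkExcelSheetContents', { filePath }).then((sheetNames) => {
      cy.log(sheetNames.join(', '))
      // 'Sheet1' is a placeholder for whatever sheet name you expect
      expect(sheetNames).to.include('Sheet1')
    })
  })
})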

Related

Exclude files that are not fully transferred

I am writing a TypeScript function to list all the files in the current directory and all the subfolders. It works like a charm, but I want to exclude files that are still under transfer (growing files). The transfer is done via a 3rd-party tool, and I can't access the source to compare.
Here is the code:
import * as fs from 'fs';
import { promises as fsPromises } from 'fs';
import { join } from 'path';

interface DirectoryFiles {
  fileName: string;
  directory: string;
}

private async ListFilesDir(
  currDirectory: string,
  fileArray: DirectoryFiles[],
): Promise<DirectoryFiles[]> {
  const allFiles = await this.getAllFilesInDir(currDirectory);
  fileArray = fileArray || [];
  for (const file of allFiles) {
    if (fs.statSync(join(currDirectory, file)).isDirectory()) {
      fileArray = await this.ListFilesDir(
        join(currDirectory, file),
        fileArray,
      );
    } else {
      fileArray.push({
        directory: currDirectory,
        fileName: file,
      });
    }
  }
  return fileArray;
}

async getAllFilesInDir(dir: string): Promise<string[]> {
  try {
    // await here so a rejected promise is actually caught by this try/catch
    return await fsPromises.readdir(dir);
  } catch (error) {
    throw new Error(`Error occurred while reading directory ${dir}!`);
  }
}
I've been trying to figure this out for a while now. Thank you for your time.
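One heuristic worth sketching (an assumption, not something from the code above): stat each candidate file twice a short interval apart and treat it as still transferring if its size or mtime changed; the isFileStable helper and the 2000 ms wait below are made up for illustration.
import { promises as fsPromises } from 'fs';

// Sketch: a file whose size and mtime are unchanged after waitMs is assumed
// to be fully transferred; growing files can then be skipped before pushing
// them into fileArray.
async function isFileStable(filePath: string, waitMs = 2000): Promise<boolean> {
  const before = await fsPromises.stat(filePath);
  await new Promise((resolve) => setTimeout(resolve, waitMs));
  const after = await fsPromises.stat(filePath);
  return before.size === after.size && before.mtimeMs === after.mtimeMs;
}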

How to write a unit test for 'fs unlink' using vitest for the following function?

deleteDuplicatedImage.ts
import { unlink, PathLike } from "fs";
import { logger } from "../libraries";

export const deleteDuplicatedImage = (imagePath: PathLike) => {
  unlink(imagePath, function (error) {
    if (error) {
      throw error;
    }
    // if no error is thrown, file has been deleted successfully
    logger.info("File was deleted as it already exists in the db!");
  });
};
This is the function for which I'm writing a test case using the vitest framework.
I tried to write the test for it in the following way:
deleteDuplicatedImage.spec.ts
require("dotenv").config();
import { nanoid } from "nanoid";
import { afterEach, describe, expect, it, vi } from "vitest";
import * as deleteDuplicatedImage from "../../src/lib/utilities/deleteDuplicatedImage";
const testImagePath: string = `${nanoid()}-testImagePath`;
describe("utilities -> deleteDuplicatedImage", () => {
afterEach(() => {
vi.restoreAllMocks();
});
it("it should throw an error", async () => {
const mockedDeleteDuplicatedImage = vi
.spyOn(deleteDuplicatedImage, "deleteDuplicatedImage")
.mockImplementation((_imagePath: any) => {});
deleteDuplicatedImage.deleteDuplicatedImage(testImagePath);
expect(mockedDeleteDuplicatedImage).toBeCalledWith(testImagePath);
expect(
deleteDuplicatedImage.deleteDuplicatedImage(testImagePath)
).toBeUndefined();
});
});
The test passes, but it does not contribute to the code coverage of the function.
It should have 100% test coverage.
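The spy replaces the real deleteDuplicatedImage, so its body never executes and coverage stays at zero. Below is a minimal sketch of the opposite approach, mocking fs.unlink (and the logger) so the real function runs; the logger module path and the "unlink failed" message are assumptions:
import { describe, expect, it, vi } from "vitest";
import * as fs from "fs";
import { deleteDuplicatedImage } from "../../src/lib/utilities/deleteDuplicatedImage";

// Replace fs.unlink with a mock so no real file is touched; the rest of fs is kept.
vi.mock("fs", async (importOriginal) => {
  const actual = await importOriginal<typeof import("fs")>();
  return {
    ...actual,
    unlink: vi.fn((_path: fs.PathLike, callback: fs.NoParamCallback) => callback(null)),
  };
});

// Path assumed from the "../libraries" import inside the module under test.
vi.mock("../../src/lib/libraries", () => ({
  logger: { info: vi.fn() },
}));

describe("utilities -> deleteDuplicatedImage", () => {
  it("calls fs.unlink with the given path", () => {
    deleteDuplicatedImage("some-image-path");
    expect(fs.unlink).toHaveBeenCalledWith("some-image-path", expect.any(Function));
  });

  it("throws when unlink reports an error", () => {
    vi.mocked(fs.unlink).mockImplementationOnce(((_path: any, callback: any) =>
      callback(new Error("unlink failed"))) as any);
    expect(() => deleteDuplicatedImage("some-image-path")).toThrow("unlink failed");
  });
});
Because the real function now executes against the mocked fs, the coverage report includes its lines.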

copying files using `ncp` throws: no such file or directory, mkdir

I'm using ncp to copy files as follows:
import ncp from "ncp";
import { promisify } from "util";
const ncpPromise = promisify(ncp);
const copyAssets = async (exportFolderName, includeSourceMaps) => {
const assets = glob.sync("**/", { cwd: distPath });
const options = { clobber: true, stopOnErr: true };
if (!includeSourceMaps) {
options.filter = (f) => {
return !f.endsWith(".map");
};
}
return Promise.all(
assets.map((asset) => {
return ncpPromise(
path.join(distPath, asset),
path.join(exportPath, exportFolderName, asset),
options
);
})
);
};
But this sometimes fails with the following error:
"ENOENT: no such file or directory, mkdir '/path/to/folder'"
How can I solve this?
I guess you are trying to copy all files matching the given glob, so you need to do:
const assets = glob.sync("**/*.*", { cwd: distPath }); // note the *.*
For example, your current glob in the question will result in:
[
  'folder1/',
  'folder2/',
]
whereas the glob in this answer will result in (this is what you want):
[
  'folder1/file1.txt',
  'folder1/file2.txt',
  'folder2/anotherfile.txt',
]
An alternative:
It seems ncp isn't being maintained, so you can use fs-extra instead; it can copy files and directories as well:
const glob = require("glob");
const path = require("path");
const fs = require("fs-extra");
const copyAssets = async (exportFolderName, includeSourceMaps) => {
const assets = glob.sync("**/*.*", { cwd: distPath });
const options = { overwrite: true };
if (!includeSourceMaps) {
options.filter = (f) => {
return !f.endsWith(".map");
};
}
return Promise.all(
assets.map((asset) => {
return fs
.copy(
path.join(distPath, asset),
path.join(exportPath, exportFolderName, asset),
options
)
.catch((e) => console.log(e));
})
);
};
The npm package qir (yes, it is published by myself) is another choice:
const qir = require('qir');

qir.asyncing.copy('/A/path/to/src', '/B/path/to/dest')
  .then(() => { /* OK */ })
  .catch(ex => { /* Something wrong */ });
Here, /A/path/to/src may be a file or a folder, and /B/path/to is not required to exist already.
There is a synchronous way:
const qir = require('qir');
qir.syncing.copy('/A/path/to/src', '/B/path/to/dest');
And, if both src and dest are located under the same base directory:
const qir = require('qir');

let q = new qir.AsyncDir('/current/working/dir');
q.copy('A/path/to/src', 'B/path/to/dest')
  .then(() => { /* OK */ })
  .catch(ex => { /* Something wrong */ });
It will copy /current/working/dir/A/path/to/src to /current/working/dir/B/path/to/dest.

Jest - mocking and testing pino multi streams based on log levels

I am struggling to find the correct way of mocking and using pino when testing a logging service.
Here is my implementation of the pino logger. It writes to different file streams based on log levels.
getChildLoggerService(fileNameString): pino.Logger {
  const streams: Streams = [
    { level: 'fatal', stream: fs.createWriteStream(path.join(process.cwd(), './logs/database-connect-fatal.log')) },
    { level: 'error', stream: fs.createWriteStream(path.join(process.cwd(), './logs/database-connect-error.log')) },
    { level: 'debug', stream: fs.createWriteStream(path.join(process.cwd(), './logs/database-connect-debug.log')) },
    { level: 'info', stream: fs.createWriteStream(path.join(process.cwd(), './logs/database-connect-info.log')) },
  ];
  return pino({
    useLevelLabels: true,
    base: {
      hostName: os.hostname(),
      platform: os.platform(),
      processId: process.pid,
      timestamp: this.appUtilService.getCurrentLocaleTimeZone(),
      // tslint:disable-next-line: object-literal-sort-keys
      fileName: this.appUtilService.getFileName(fileNameString),
    },
    level: this.appUtilService.getLogLevel(),
    messageKey: LOGGER_MSG_KEY,
    prettyPrint: this.appUtilService.checkForDevEnv(process.env.NODE_ENV),
    timestamp: () => {
      return this.appUtilService.getCurrentLocaleTimeZone()
    },
  }, multistream(streams)).child({
    connectorReqId: (process.env.REQ_APP_NAME === null ? 'local' : process.env.REQ_APP_NAME)
      + uuid.v4().toString()
  });
}
The most important part I want to test is the multistream setup, where I need to write to different log files based on the log levels, and so far I couldn't figure out a way to do that:
import pino, { DestinationStream } from 'pino';
const sinon = require('sinon');
import pinoms from 'pino-multi-stream';
const fs = require('fs');
const path = require('path');
const stream = require('stream');
const { PassThrough } = require('stream');
class EchoStream extends stream.Writable {
_write(chunk, enc, next) {
console.log('ssdsdsd',chunk.toString());
next();
}
}
import * as _ from 'lodash';
import { Writable } from 'stream';
import { mocked } from 'ts-jest/utils';
import { LogServiceInstance } from './log.service';
// jest.mock('pino', () => jest.fn().mockImplementation(() => { ====> Tried this inline mock, doesnt work
// return {
// child: jest.fn().mockReturnValue(jest.requireActual('pino').Logger)
// }
// }));
// jest.mock('pino', () => {
// return jest.fn().mockImplementation(() => {
// return {
// child: jest.fn().mockReturnValue(jest.requireActual('pino').Logger),
// stream: jest.fn().mockImplementation(() => {
// return [
// {
// level: 'info',
// stream: fs.createWriteStream(
// path.resolve(process.cwd(), '/test/database-connector-logs/info.log')
// ),
// },
// {
// level: 'warn',
// stream: fs.createWriteStream(
// path.resolve(process.cwd(), '/test/database-connector-logs/warn.log')
// ),
// },
// ];
// }),
// };
// });
// });
describe('Test suite for Log service', () => {
//const mockedPino = mocked(pino, true);
test('Test case for getLoggerInstance', () => {
const mockedPinoMsStream = [
const mockedPinoStream = (pino.prototype.stream = jest.fn(() => mockedPinoMsStream));
console.dir(pino);
const prop = Reflect.ownKeys(pino).find((s) => {
return s === 'symbols';
});
// Tried this but it did not work as the actual files are written with the values
pino[prop]['streamSym'] = jest.fn().mockImplementation(() => {
return fs.createWriteStream(path.resolve(process.cwd(), './test/database-connector-logs/info.log'))
});
console.dir(pino);
const log = LogServiceInstance.getChildLoggerService(__filename);
console.dir(Object.getPrototypeOf(log));
log.info('test logging');
expect(2).toEqual(2);
});
});
Could someone let me know where the mocking is wrong and how to mock it properly?
UPDATE:
I came to understand that mocking pino-multi-stream might do the trick, so I tried it this way. This was added at the very top, and all the other mocks were removed (including the ones inside the test suite):
const mockedPinoMultiStream = {
  stream: jest.fn().mockImplementation(() => {
    return { write: jest.fn().mockReturnValue(new PassThrough()) }
  })
}

jest.mock('pino-multi-stream', () => {
  return {
    multistream: jest.fn().mockReturnValue(mockedPinoMultiStream)
  }
});
I wanted the mock to test whether, based on the level, the respective named files are being used, but this also results in an exception:
TypeError: stream.write is not a function
at Pino.write (/XXX/node_modules/pino/lib/proto.js:161:15)
at Pino.LOG (/XXXX/node_modules/pino/lib/tools.js:39:26)
LATEST UPDATE:
So I resolved the exception by modifying the way pino multistream is mocked
const { PassThrough } = require('stream');
...
...
const mockedPinoMultiStream = {
  write: jest.fn().mockImplementation((data) => {
    return new PassThrough();
  })
};
Now there is no more exception and the write method is properly mocked when I print "pino". But I do not understand how to test the different files based on different log levels. Could someone let me know how that is to be done?
Note: I tried returning fs.createWriteStream instead of a PassThrough, but that didn't work.
At last, I found the answer to making use of pino streams based on different log levels.
I went ahead and created a test directory to house the test log files; in reality, we do not want pino polluting the actual log files. So I decided to mock the pino streams at the start of the jest run via a setup file, which gets executed before any test suite is triggered. I modified the jest configuration in package.json like this:
"setupFiles": [
"<rootDir>/jest-setup/stream.logger.js"
],
In the stream.logger.js file, I added:
const pinoms = require('pino-multi-stream');
const fs = require('fs');
const path = require('path');
const stream = require('stream');
const Writable = require('stream').Writable;
const { PassThrough } = require('stream');
const pino = require('pino');

class MyWritable extends Writable {
  constructor(options) {
    super(options);
  }
  _write(chunk, encoding, callback) {
    const writeStream = fs.createWriteStream(path.resolve(process.cwd(), './test/logs/info.log'));
    writeStream.write(chunk, 'utf-8');
    writeStream.emit('close');
    writeStream.end();
  }
}

const mockedPinoMultiStream = {
  write: jest.fn().mockImplementation((data) => {
    const writeStream = new MyWritable();
    return writeStream._write(data);
  })
};

jest.mock('pino-multi-stream', () => {
  return {
    multistream: jest.fn().mockReturnValue(mockedPinoMultiStream)
  }
});
Now I went ahead and created the test file - log.service.spec.ts
import * as pino from 'pino';
const sinon = require('sinon');
import pinoms from 'pino-multi-stream';
const fs = require('fs');
const path = require('path');
const stream = require('stream');
import * as _ from 'lodash';
import { Writable } from 'stream';
import { mocked } from 'ts-jest/utils';
import { LogServiceInstance } from './log.service';

describe('Test suite for Log service', () => {
  //const mockedPino = mocked(pino, true);
  afterEach(() => {
    // delete the contents of the log files after each test suite
    fs.truncate((path.resolve(process.cwd(), './test/logs/info.log')), 0, () => {
      console.dir('Info log file deleted');
    });
    fs.truncate((path.resolve(process.cwd(), './test/logs/warn.log')), 0, () => {
      console.dir('Warn log file deleted');
    });
    fs.truncate((path.resolve(process.cwd(), './test/logs/debug.log')), 0, () => {
      console.dir('Debug log file deleted');
    });
  });

  test('Test case for getLoggerInstance', () => {
    const pinoLoggerInstance = LogServiceInstance.getChildLoggerService(__filename);
    pinoLoggerInstance.info('test logging');
    _.map(Object.getOwnPropertySymbols(pinoLoggerInstance), (mapItems: any) => {
      if (mapItems.toString().includes('Symbol')) {
        if (mapItems.toString().includes('pino.level')) {
          expect(pinoLoggerInstance[mapItems]).toEqual(20);
        }
      }
      if (mapItems.toString().includes('pino.chindings')) {
        const childInstance = pinoLoggerInstance[mapItems].toString().substr(1);
        const jsonString = '{' + childInstance + '}';
        const expectedObj = Object.create(JSON.parse(jsonString));
        expect(expectedObj.fileName).toEqual('log.service.spec');
        expect(expectedObj.appName).toEqual('AppJestTesting');
        expect(expectedObj.connectorReqId).toEqual(expect.objectContaining(new String('AppJestTesting')));
      }
    });
    // make sure the info.log file is written in this case
    const infoBuffRead = fs.createReadStream(path.resolve(process.cwd(), './test/logs/info.log')).read(1024);
    expect(infoBuffRead).toBeDefined();
    // now write a warn log
    pinoLoggerInstance.warn('test warning log');
    const warnBuffRead = fs.createReadStream(path.resolve(process.cwd(), './test/logs/warn.log')).read(1024);
    expect(warnBuffRead).toBeDefined();
    // now write a debug log
    pinoLoggerInstance.debug('test debug log');
    const debugBuffRead = fs.createReadStream(path.resolve(process.cwd(), './test/logs/debug.log')).read(1024);
    expect(debugBuffRead).toBeDefined();
  });
});
I also made sure that the test log files do not get overwhelmed with data over time by deleting their contents after each execution.
Hope this helps people trying to test pino multistream.

Cache-busting page-data.json files in Gatsby

I have a Gatsby-generated website on which I have replaced the contents of the homepage.
Unfortunately the previous version was serving up /page-data/index/page-data.json with the incorrect cache-control headers, resulting in /page-data/index/page-data.json being cached on client browsers (and stale data being shown unless force-refreshed). I have also discovered that page-data.json files are not hashed (see https://github.com/gatsbyjs/gatsby/issues/15080).
I've updated the cache-control headers so that versions from now on will not be cached but this does not help with clients that have the cached version now.
What can I do to force clients to request the latest version of this file?
I got there in the end... This is in my gatsby-node.js
const fs = require('fs')
const path = require('path')
const util = require('util')
const glob = require('glob')
const md5 = require('md5') // the 'md5' npm package

const hash = md5(`${new Date().getTime()}`)

const addPageDataVersion = async file => {
  const stats = await util.promisify(fs.stat)(file)
  if (stats.isFile()) {
    console.log(`Adding version to page-data.json in ${file}..`)
    let content = await util.promisify(fs.readFile)(file, 'utf8')
    const result = content.replace(
      /page-data.json(\?v=[a-f0-9]{32})?/g,
      `page-data.json?v=${hash}`
    )
    await util.promisify(fs.writeFile)(file, result, 'utf8')
  }
}

exports.onPostBootstrap = async () => {
  const loader = path.join(__dirname, 'node_modules/gatsby/cache-dir/loader.js')
  await addPageDataVersion(loader)
}

exports.onPostBuild = async () => {
  const publicPath = path.join(__dirname, 'public')
  const htmlAndJSFiles = glob.sync(`${publicPath}/**/*.{html,js}`)
  for (let file of htmlAndJSFiles) {
    await addPageDataVersion(file)
  }
}
Check out this tutorial; it is the solution I've been using.
https://examsworld.co.in/programming/javascript/how-to-cache-bust-a-react-app/
It's basically a wrapper component that checks to see if the browser's cached version matches the build's version number in package.json. If it doesn't, it clears the cache and reloads the page.
This is how I'm using it.
gatsby-browser.js
// React, CacheBuster and AppProvider need to be imported here (paths depend on your project)
export const wrapRootElement = ({ element }) => (
  <CacheBuster>
    {({ loading, isLatestVersion, refreshCacheAndReload }) => {
      if (loading) return null
      if (!loading && !isLatestVersion) {
        // You can decide how and when you want to force reload
        refreshCacheAndReload()
      }
      return <AppProvider>{element}</AppProvider>
    }}
  </CacheBuster>
)
CacheBuster.js
import React from 'react'
import packageJson from '../../package.json'

global.appVersion = packageJson.version

// version from response - first param, local version second param
const semverGreaterThan = (versionA, versionB) => {
  const versionsA = versionA.split(/\./g)
  const versionsB = versionB.split(/\./g)
  while (versionsA.length || versionsB.length) {
    const a = Number(versionsA.shift())
    const b = Number(versionsB.shift())
    // eslint-disable-next-line no-continue
    if (a === b) continue
    // eslint-disable-next-line no-restricted-globals
    return a > b || isNaN(b)
  }
  return false
}

class CacheBuster extends React.Component {
  constructor(props) {
    super(props)
    this.state = {
      loading: true,
      isLatestVersion: false,
      refreshCacheAndReload: () => {
        console.info('Clearing cache and hard reloading...')
        if (caches) {
          // Service worker cache should be cleared with caches.delete()
          caches.keys().then(function(names) {
            for (const name of names) caches.delete(name)
          })
        }
        // delete browser cache and hard reload
        window.location.reload(true)
      },
    }
  }

  componentDidMount() {
    fetch('/meta.json')
      .then(response => response.json())
      .then(meta => {
        const latestVersion = meta.version
        const currentVersion = global.appVersion
        const shouldForceRefresh = semverGreaterThan(
          latestVersion,
          currentVersion
        )
        if (shouldForceRefresh) {
          console.info(
            `We have a new version - ${latestVersion}. Should force refresh`
          )
          this.setState({ loading: false, isLatestVersion: false })
        } else {
          console.info(
            `You already have the latest version - ${latestVersion}. No cache refresh needed.`
          )
          this.setState({ loading: false, isLatestVersion: true })
        }
      })
  }

  render() {
    const { loading, isLatestVersion, refreshCacheAndReload } = this.state
    const { children } = this.props
    return children({ loading, isLatestVersion, refreshCacheAndReload })
  }
}

export default CacheBuster
generate-build-version.js
const fs = require('fs')
const packageJson = require('./package.json')

const appVersion = packageJson.version

const jsonData = {
  version: appVersion,
}

const jsonContent = JSON.stringify(jsonData)

fs.writeFile('./static/meta.json', jsonContent, 'utf8', function(err) {
  if (err) {
    console.log('An error occurred while writing JSON Object to meta.json')
    return console.log(err)
  }
  console.log('meta.json file has been saved with latest version number')
})
And in your package.json, add these scripts:
"generate-build-version": "node generate-build-version",
"prebuild": "npm run generate-build-version"
Outside of going to each client browser individually and clearing its cache, there isn't any other means of invalidating all of your clients' caches. If your webpage is behind a CDN you control, you may be able to force invalidation at the CDN level so clients are always routed to the up-to-date webpage, even if the CDN had a pre-existing, outdated copy cached.
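For instance, if the site sat behind CloudFront, the invalidation could be scripted with the AWS SDK; a rough sketch where the distribution ID and the /page-data/* path are placeholders to adjust:
const AWS = require('aws-sdk')

const cloudfront = new AWS.CloudFront()

// DistributionId and Paths are placeholders; point them at your distribution
// and at whichever files were cached with the wrong headers.
cloudfront
  .createInvalidation({
    DistributionId: 'YOUR_DISTRIBUTION_ID',
    InvalidationBatch: {
      CallerReference: `cache-bust-${Date.now()}`,
      Paths: { Quantity: 1, Items: ['/page-data/*'] },
    },
  })
  .promise()
  .then(() => console.log('Invalidation requested'))
  .catch(err => console.error(err))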
