I have an X value representing how much I open my mouth in Spark AR. How can I show that X value in a 2D text object?
I expect the text to show the number value while the effect is running.
You should use this script:
const Scene = require('Scene');
// Use export keyword to make a symbol available in scripting debug console
export const Diagnostics = require('Diagnostics');
// To use variables and functions across files, use export/import keyword
// export const animationDuration = 10;
// Use import keyword to import a symbol from another file
// import { animationDuration } from './script.js'
// To access scene objects
// const directionalLight = Scene.root.find('directionalLight0');
// To access class properties
// const directionalLightIntensity = directionalLight.intensity;
// To log messages to the console
Diagnostics.log('Console message logged from the script.');
const Patches = require('Patches');

// read the 'number' scalar sent to the script from the patch editor
const numberFormat = '{0}';
const number = Patches.getScalarValue('number');

// format it as a string and send it back to the patch editor as 'value'
Patches.setStringValue('value', number.format(numberFormat));
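Alternatively, if you would rather drive the text object directly from the script instead of routing the string back through the patch editor, something like this sketch may work (the '2dText0' object name is an assumption, so use whatever your 2D Text object is called in the Scene panel; the format call is the same one used above):
const Scene = require('Scene');
const Patches = require('Patches');

// '2dText0' is an assumed name; match it to your 2D Text object in the Scene panel
const text2d = Scene.root.find('2dText0');
const number = Patches.getScalarValue('number');

// bind the formatted string signal straight to the text object's text property
text2d.text = number.format('{0}');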
I have a question about the syntax used in the require statement. Please refer to the sample code below.
const nodemailer = require("nodemailer");
const {google} = require('googleapis');
const {OAuth2} = google.auth;
Sometimes I see sample code which uses:
const {<variable>} = require('moduleName')
Other times, I see code like this:
const <variable> = require('moduleName')
What is the difference between them?
Thanks in advance. Grateful to the developer community.
So, you use { } in this context when you want to do object destructuring to get a property from the exported object and create a module-level variable with that same name.
This:
const { google } = require('googleapis');
is a shortcut for this:
const __g = require('googleapis');
const google = __g.google;
So, within this context, you use the { google } only when you want the .google property from the imported module.
If you want the entire module handle such as this:
const nodemailer = require("nodemailer");
then, you don't use the { }. The only way to know which one you want for any given module is to consult the documentation for the module, the code for the module or examples of how to use the module. It depends entirely upon what the module exports and whether you want the top level export object or you want a property of that object.
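For example, given a hypothetical module that exports an object with a google property (the file and property names below are made up for illustration), both forms work but give you different things:
// myModule.js (hypothetical)
module.exports = {
    google: { auth: {} },
    version: '1.0.0'
};

// consumer.js
const everything = require('./myModule');  // the whole exported object
const { google } = require('./myModule');  // just the .google property
console.log(everything.google === google); // true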
It's important to realize that the { } used with require() is not special syntax associated with require(). This is normal object destructuring assignment, the same as if you did this:
// define some object
const x = { greeting: "hello" };
// use object destructuring assignment to create a new variable
// that contains the property of an existing object
const { greeting } = x;
console.log(greeting); // "hello
When you import a function with {}, it means you import just that one function that is available in the package. Maybe you have seen:
const {googleApi, googleAir, googleWater} = require("googleapis")
But when you are not using {}, it means you import the whole package; you just write:
const google = require("googleapis")
So, let's say you need googleApi in your code. You can call it like this:
google.googleApi
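In other words, sticking with the same hypothetical names, these two snippets end up with the same value:
const google = require("googleapis");
const apiA = google.googleApi;

const { googleApi } = require("googleapis");
const apiB = googleApi; // apiA and apiB refer to the same thing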
I have a test like this, but I cannot get the 'sharedMap' value from 'sharedSeq1', and I don't know how to get the 'remoteFluidObjectHandle' value.
import {MockContainerRuntimeFactory, MockFluidDataStoreRuntime, MockStorage} from "@fluidframework/test-runtime-utils";
import {SharedObjectSequence, SharedObjectSequenceFactory} from "@fluidframework/sequence";
import * as mocks from "@fluidframework/test-runtime-utils";
import {SharedMap} from "@fluidframework/map";
import {IFluidHandle} from "@fluidframework/core-interfaces";
const mockRuntime: mocks.MockFluidDataStoreRuntime = new mocks.MockFluidDataStoreRuntime();
describe('SharedObjectSequence', function () {
it('should get synchronization data from another shared object', async function () {
const dataStoreRuntime1 = new MockFluidDataStoreRuntime();
const sharedSeq1: SharedObjectSequence<IFluidHandle<SharedMap>> = new SharedObjectSequence(mockRuntime, 'shareObjectSeq1', SharedObjectSequenceFactory.Attributes);
const containerRuntimeFactory = new MockContainerRuntimeFactory();
dataStoreRuntime1.local = false;
const containerRuntime1 = containerRuntimeFactory.createContainerRuntime(
dataStoreRuntime1,
);
const services1 = {
deltaConnection: containerRuntime1.createDeltaConnection(),
objectStorage: new MockStorage(),
};
sharedSeq1.initializeLocal();
sharedSeq1.connect(services1);
const dataStoreRuntime2 = new MockFluidDataStoreRuntime();
const containerRuntime2 = containerRuntimeFactory.createContainerRuntime(
dataStoreRuntime2,
);
const services2 = {
deltaConnection: containerRuntime2.createDeltaConnection(),
objectStorage: new MockStorage(),
};
const sharedSeq2: SharedObjectSequence<IFluidHandle<SharedMap>> = new SharedObjectSequence(mockRuntime, 'shareObjectSeq2', SharedObjectSequenceFactory.Attributes);
sharedSeq2.initializeLocal();
sharedSeq2.connect(services2);
// insert a node into sharedSeq2, it will sync to sharedSeq1
sharedSeq2.insert(0, [<IFluidHandle<SharedMap>>new SharedMap('sharedMapId', mockRuntime, SharedMap.getFactory().attributes).handle])
containerRuntimeFactory.processAllMessages();
// the next assertion passes, which shows that the change from sharedSeq2 was synced
expect(sharedSeq1.getLength()).toBe(1)
const remoteFluidObjectHandle = await sharedSeq1.getRange(0, 1)[0];
// here I get the error: Cannot read property 'mimeType' of null, caused by remoteFluidObjectHandle.ts:51:30
const sharedMap = await remoteFluidObjectHandle.get()
expect(sharedMap).not.toBeUndefined()
});
});
Running this test produces the error 'Cannot read property 'mimeType' of null', caused by 'remoteFluidObjectHandle.ts:51:30'.
The Fluid mocks have very limited and specific behaviors, and it looks like you are hitting their limits. You'll have better luck with an end-to-end test; see packages\test\end-to-end-tests. These use the same in-memory server as the playground on fluidframework.com. The in-memory server uses the same code as Tinylicious, our single-process server, and Routerlicious, our Docker-based reference implementation.
I have a problem with TypeScript import/export mechanics in WebStorm.
Let's say I have the following file, named apiUtils.ts:
export function getUsername(req) {
return req.headers.username;
}
export default module.exports;
And I have another file that references apiUtils.ts:
import apiUtils from "../bin/apiUtils";
const req = '';
const username = apiUtils.getUsername(req);
The getUsername method is not recognized on the fly and I cannot navigate to it with Ctrl + click. It is marked as unused, and I can even write the following and compile the code successfully without receiving an error:
import apiUtils from "../bin/apiUtils";
const req = '';
const username = apiUtils.getUsername(req, thereIsNoSecondArgInThisMethod);
On the other hand, if I import the method as follows:
import { getUsername } from "../bin/apiUtils";
const req = '';
const username = getUsername(req);
Then everything works as expected: I get navigation, input validation, etc.
I prefer to use import apiUtils from "../bin/apiUtils"; instead of importing each method separately, since importing them separately creates a jumble of methods where it is unclear where they belong or whether they are local.
Is there a way to fix the import so the IDE understands which method is referenced? At runtime both ways work the same.
This happens because of the export default module.exports construction; none of the IDEs I'm aware of can handle it.
If you'd like to import the entire module, use the import * as <name> construction (https://www.typescriptlang.org/docs/handbook/modules.html#import-the-entire-module-into-a-single-variable-and-use-it-to-access-the-module-exports):
// apiUtils.ts (with the "export default module.exports;" line removed)
export function getUsername(req) {
    return req.headers.username;
}

// the importing file
import * as apiUtils from "../bin/apiUtils";

const req = '';
const username = apiUtils.getUsername(req);
I am struggling to get const properties defined outside of module.exports into the object where I am using that module. Here is a simple example:
ServiceX.js
const _ = require('lodash');
module.exports = {
testFirstName: function () {
console.log('Nodics');
}
}
ServiceY.js
const utils = require('utils');
module.exports = {
testLastName: function () {
console.log('framework');
}
}
Now if I import both files via require and merge them via _.merge(), the output contains both methods, but it doesn't contain any of the const variables defined outside module.exports.
let combined = _.merge(require('ServiceX'), require('ServiceY'));
I am writing this combined object to a third file, something like MyService.
Even when I print the combined object via console.log(combined), I get only the two functions, not the const properties.
My use case: I have n number of files in different locations; I need to read the files, merge them all, and create a new file with the merged content.
Please help me, thanks.
A problem here is that a service currently looks like:
const _ = require('lodash');
module.exports = {
testFirstName: function () {
console.log('Nodics');
}
}
And the reason that's a problem is that if you read all the files and concatenate the result, module.exports will be overwritten by the last file read.
But if it instead looked like:
const _ = require('lodash');
module.exports.testFirstname = function () {
console.log('Nodics');
}
You could pull it off by doing fs.readFileSync() on all your services and concatenating the result, as sketched below.
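For example, here is a minimal sketch of that approach (the ./services folder and the MyService.js output name are assumptions for illustration):
const fs = require('fs');
const path = require('path');

// read every service file and concatenate the sources into one combined file
const dir = path.join(__dirname, 'services');
const combined = fs.readdirSync(dir)
    .filter(file => file.endsWith('.js'))
    .map(file => fs.readFileSync(path.join(dir, file), 'utf8'))
    .join('\n');

fs.writeFileSync(path.join(__dirname, 'MyService.js'), combined);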
You could also do it as a build command, rather than in code:
cat service*.js > combined.js
You will most likely run into other problems, though: if one service uses const _ = require('lodash') and another does the same, the concatenated file will try to redefine a const variable.
To get around this you would have to move those requires into another scope, so they don't conflict at file-level scope.
Is there a way to extract text from PDFs in Node.js without any OS dependencies (like pdf2text, or xpdf on Windows)? I wasn't able to find any 'native' PDF packages for Node.js; they are always a wrapper/utility on top of an existing OS command.
Thanks
Have you checked PDF2Json? It is built on top of PDF.js. Though it does not provide the text output as a single string, I believe you can reconstruct the final text from the generated JSON output (see the sketch after the field list below):
'Texts': an array of text blocks with position, actual text and styling information:
'x' and 'y': relative coordinates for positioning
'clr': a color index in the color dictionary, the same 'clr' field as in the 'Fill' object. If a color can be found in the color dictionary, an 'oc' field will be added to the field as the 'original color' value.
'A': text alignment, including:
left
center
right
'R': an array of text runs; each text run object has two main fields:
'T': the actual text
'S': style index from the style dictionary. More info about the 'Style Dictionary' can be found in the 'Dictionary Reference' section
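For example, here is a minimal sketch of that reconstruction using pdf2json's event-based API (the './sample.pdf' path is a placeholder, and the exact property layout varies by pdf2json version, so treat this as a starting point rather than a drop-in solution; note that the 'T' values are URI-encoded):
const PDFParser = require('pdf2json');

const pdfParser = new PDFParser();
pdfParser.on('pdfParser_dataError', errData => console.error(errData.parserError));
pdfParser.on('pdfParser_dataReady', pdfData => {
    // recent versions expose pdfData.Pages; older ones used pdfData.formImage.Pages
    const pages = pdfData.Pages || (pdfData.formImage && pdfData.formImage.Pages) || [];
    const text = pages
        .map(page => page.Texts
            .map(t => decodeURIComponent(t.R.map(r => r.T).join('')))
            .join(' '))
        .join('\n');
    console.log(text);
});
pdfParser.loadPDF('./sample.pdf'); // placeholder path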
After some work, I finally got a reliable function for reading text from PDF using https://github.com/mozilla/pdfjs-dist
To get this to work, first npm install on the command line:
npm i pdfjs-dist
Then create a file with this code (I named the file "pdfExport.js" in this example):
const pdfjsLib = require("pdfjs-dist");
async function GetTextFromPDF(path) {
let doc = await pdfjsLib.getDocument(path).promise;
let page1 = await doc.getPage(1);
let content = await page1.getTextContent();
let strings = content.items.map(function(item) {
return item.str;
});
return strings;
}
module.exports = { GetTextFromPDF }
Then it can simply be used in any other js file you have like so:
const pdfExport = require('./pdfExport');
pdfExport.GetTextFromPDF('./sample.pdf').then(data => console.log(data));
Thought I'd chime in here for anyone who came across this question in the future.
I had this problem and spent hours going through literally all the PDF libraries on NPM. My requirement was that it needed to run on AWS Lambda, so it could not rely on OS dependencies.
The code below is adapted from another Stack Overflow answer (which I cannot currently find). The only difference is that it imports the ES5 build, which works with Node >= 12. If you just import pdfjs-dist, you will get the error "Readable Stream is not defined". Hope it helps!
import * as pdfjslib from 'pdfjs-dist/es5/build/pdf.js';
export default class Pdf {
public static async getPageText(pdf: any, pageNo: number) {
const page = await pdf.getPage(pageNo);
const tokenizedText = await page.getTextContent();
const pageText = tokenizedText.items.map((token: any) => token.str).join('');
return pageText;
}
public static async getPDFText(source: any): Promise<string> {
const pdf = await pdfjslib.getDocument(source).promise;
const maxPages = pdf.numPages;
const pageTextPromises = [];
for (let pageNo = 1; pageNo <= maxPages; pageNo += 1) {
pageTextPromises.push(Pdf.getPageText(pdf, pageNo));
}
const pageTexts = await Promise.all(pageTextPromises);
return pageTexts.join(' ');
}
}
Usage
const fileBuffer = fs.readFileSync('sample.pdf'); // readFileSync (not fs.readFile) so we actually get a Buffer back
const pdfText = await Pdf.getPDFText(fileBuffer);
This solution worked for me on Node 14.20.1 using "pdf-parse": "^1.1.1".
You can install it with:
yarn add pdf-parse
This is the main function which converts the PDF file to text.
const path = require('path');
const fs = require('fs');
const pdf = require('pdf-parse');
const assert = require('assert');
const extractText = async (pathStr) => {
    assert(fs.existsSync(pathStr), `Path does not exist ${pathStr}`);
    const pdfFile = path.resolve(pathStr);
    const dataBuffer = fs.readFileSync(pdfFile);
    const data = await pdf(dataBuffer);
    return data.text;
}
module.exports = {
extractText
}
Then you can use the function like this:
const { extractText } = require('../api/lighthouse/lib/pdfExtraction')
extractText('./data/CoreDeveloper-v5.1.4.pdf').then(t => console.log(t))
Instead of using the proposed PDF2Json you can also use PDF.js directly (https://github.com/mozilla/pdfjs-dist). This has the advantage that you are not depending on modesty, the owner of PDF2Json, to keep its PDF.js base up to date.