Copy JSON only copies half of the JSON - node.js

I'm using WebStorm's debugger to inspect local variables for my Node application. However, when I right-click on the variable I'm interested in and click Copy JSON, the pasted output only contains half of the JSON.
Has anyone experienced this issue, and what did you do to resolve it? Yes, I could console.log the data or write it to a file, but I figured that using a debugger would be more efficient.
Thanks in advance,
Q

Yes, I have seen this too. Not sure what causes it. Here is a workaround:
Try switching to the console tab and saving the variable out as a JSON string as follows:
JSON.stringify(myvar);
Then copy the result and, if necessary, parse it elsewhere with:
JSON.parse('..data goes here..')
Don't forget to use single quotes around the pasted string, because the JSON itself contains double quotes everywhere.
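As a rough illustration of that round trip (myvar and the sample payload below are just placeholders):
// In the WebStorm debug console, serialize the variable you want to copy:
JSON.stringify(myvar);
// Copy the printed string, then rebuild the object elsewhere.
// Single quotes around the pasted text avoid clashing with the double quotes inside the JSON:
var restored = JSON.parse('{"id": 1, "name": "example"}');
console.log(restored.name); // "example"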

Related

JODIT WYSIWYG editor 3.24.1 - how to call a function and get source filename after a base64 encoded image has been inserted

I've got images inserting into the editor as base64 encoded images (uploader option insertImageAsBase64URI is set to true). However, I'd like to call a function after the image has been inserted and also read the source filename for the inserted image.
I'm new to the JODIT editor, it seems great so far, but I need to tweak it a bit and am not sure how to register an event callback for this, or if there is another/better way. Any help is appreciated!
I think the best solution is to fork JODIT on GitHub and edit the code. However, I have been unable to build the code on my Mac laptop, for at least a couple of reasons: a missing file in a node module (which I fixed), and a build error "TypeError: require(...) is not a function" that may indicate circular dependencies in the node modules. Anyway, I found a working but limited "HACK" for my needs, which is to capture the filename when the file is added by attaching an "onchange" handler function to the JODIT instance's file input element. This works roughly as follows (I'm using jQuery):
var selectedFile = null;

function setSelectedFile() {
    // Restrict the file input to a single file, then record the chosen file's name on change
    $('.jodit').find('input[type="file"]').removeProp('multiple');
    $('.jodit').find('input[type="file"]').on('change', function () {
        var files = $(this).prop('files');
        selectedFile = files[0].name;
    });
}

// Re-attach the change handler each time the "Insert file" button is clicked
$('.jodit').find('button[aria-label="Insert file"]').on('click', function () {
    setSelectedFile();
});
I run something like this after the page has loaded. It works only for the "change" event (where you select a file directly); I could not figure out how to read the filename after a file is "dropped", since dropping a file does not seem to trigger the "change" event on the file input element. If anyone knows how to get the filename of a dropped file in the JODIT editor, I'd appreciate you sharing it. I will update this if I get around to fixing that.
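As an untested sketch only (the selector and the assumption that JODIT lets the drop event reach this handler are mine, not verified): one avenue might be to read the dropped files from the DOM drop event's DataTransfer object:
// Untested sketch: try to capture the name of a dropped file via the drop event.
$('.jodit').on('drop', function (e) {
    var dt = e.originalEvent.dataTransfer;
    if (dt && dt.files && dt.files.length) {
        selectedFile = dt.files[0].name;
    }
});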

How to write Postman tests to compare a response body with an external JSON/CSV file using Newman?

It is well documented how to use an external data file with the Collection Runner in Postman to run multiple iterations. However, what I'd like to do is write tests to compare the response body from a GET request to a JSON/CSV file.
From the research I've done it seems this isn't possible using just Postman. I wonder though if it'd be possible with Newman?
I have no experience with Newman but am curious whether this might work. Could you create a Postman collection with an environment variable called TestJsonFile and write tests comparing values from it to the response body? Then export that collection and, using Newman, set the TestJsonFile value to that of a JSON/CSV file (--env-var "TestJsonFile=<path_to_file>") and run the collection with that variable?
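For the comparison itself, a sketch of what the collection's test script could look like, assuming TestJsonFile holds the expected JSON as a string rather than a file path:
// Postman test script (sketch): compare the response body with the expected JSON
// stored in the TestJsonFile environment variable.
var expected = JSON.parse(pm.environment.get('TestJsonFile'));
pm.test('Response body matches expected JSON', function () {
    pm.expect(pm.response.json()).to.eql(expected);
});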
This seems plausible to me; maybe I could create a Node project to do it all in one place. I'd just like to ask if anyone has done something similar, as it seems like it should be doable.
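A rough sketch of that Node-project idea, assuming a recent Newman version that supports the envVar run option; the file and collection names below are placeholders:
// run-tests.js (sketch): load the expected JSON and pass it to the collection run.
const fs = require('fs');
const newman = require('newman');

const expected = fs.readFileSync('./expected-response.json', 'utf8'); // placeholder file

newman.run({
    collection: require('./my-collection.json'),  // exported Postman collection (placeholder)
    envVar: [{ key: 'TestJsonFile', value: expected }],
    reporters: ['cli']
}, function (err, summary) {
    if (err) { throw err; }
    console.log('Failed assertions:', summary.run.failures.length);
});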
Any critiques or suggestions would be much appreciated, thank you for reading!

WebdriverIO: how to store an Excel file in a cache and call it from the cache in Node.js

I have an Excel file that I want to read in only once and make available to all of my tests; at the moment it is being read in each test. I have tried storing it in a cache using https://www.npmjs.com/package/node-cache, but when I tried to get it from the cache it came back undefined, so I then tried the onPrepare hook with no luck. Can someone point me in the right direction? Thanks in advance.
Assuming the Excel file holds test-automation data and does not need any write operations, I would suggest reading the content once and storing it in a constant. If you declare that variable globally, it will be available to all your tests. The complexity of this read function depends on how diverse your data is. You can use libraries like https://www.npmjs.com/package/xlsx, https://www.npmjs.com/package/exceljs, etc.
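A minimal sketch of that idea, assuming the xlsx package and a hypothetical testdata.xlsx (the names are placeholders, not from the question):
// testData.js (sketch): read the workbook once at require time and export the parsed rows.
const XLSX = require('xlsx');

const workbook = XLSX.readFile('./testdata.xlsx');           // placeholder path
const firstSheet = workbook.Sheets[workbook.SheetNames[0]];
const rows = XLSX.utils.sheet_to_json(firstSheet);           // array of row objects

module.exports = rows;
Because Node caches modules, every spec that requires this file shares the same parsed rows and the workbook is only read from disk once. Note that onPrepare runs in the WebdriverIO launcher process, so values cached there are not automatically visible to the worker processes that run your specs, which may explain the undefined result.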

MATLAB GUIDE GUI handles change values after using the load() function?

Inside a GUI that I have made using GUIDE in MATLAB, I run into a problem where, upon using the load() function to load a .MAT file, all my handles change values. This means that if I had a button that I wanted to use on my GUI, my program would believe its handle is, for example,
handles.button1 =190.082
when in reality the only way I can access that button any more is through a different, unknown handle. Since it is unknown, let's see what its new handle must be:
findobj('Tag','button1') = 227.0093
As you can see, these numbers are completely different. Why the handle values get changed is beyond me. Since the handles change, I can no longer use the set() function as I have written it in previous sections of code. For example, I have to change
set(handles.button1, 'Enable', 'off');
to
set(findobj('Tag','button1'),'Enable','off');
Does anyone have an explanation as to why this problem occurs when using Load()?
Is there a feasible solution instead of having to find the handle for an object every time you want to use it?
The .MAT file conveniently also had a handles variable in it, which overwrote my current handles. Assigning the output of load to a struct instead, e.g. data = load('myfile.mat'), keeps the file's variables out of the function workspace and avoids the clobbering.

Problems came up in the following areas during load: Table

I have generated an Excel file from XML, but I cannot open it with Excel. Excel gives the following error when opening it:
Problems came up in the following areas during load:
Table
Then it shows a message that the log file corresponding to the error can be found at: C:/Documents and Setting/myUserName/Local Settings/Temporary Internet Files/Content.MSO/xxxxx.log
But I cannot find the Content.MSO folder on my Windows machine. I checked the folder settings and made all folders visible, but I still cannot access this folder, so I cannot analyse the log file.
How could I find the generated log file?
I found the problem without analysing the log file. I still cannot access the log file in Temporary Internet Files, but I realised that I had put string (non-numeric) characters in a number-styled cell in the Excel XML. So if you are having similar issues with an Excel file generated from XML, check whether your cell values are appropriate for your cell data types.
If you type or paste the path of the log file into Explorer or your text editor of choice, you may find that the folder does exist, despite being invisible.
In my case it was a <Row> with an incorrect ss:Index
I was using a template and the last row had a fixed Index=100. If the number of rows I added exceeded 100, this last row had a wrong index and Excel threw the error without any other message or log (macOS, Excel 15.25.1). I wish they printed more informative error messages; what a waste of our time.
Excel 2016: my error message was "Worksheet Settings", and the path was pointing to a non-existent file.
The cause of my problem was ExpandedRowCount (the ss:ExpandedRowCount attribute on the <Table> element) not being big enough for the number of rows in the worksheet. If you add rows to the XML directly (i.e. on a machine where Excel is not installed), make sure to increase ExpandedRowCount accordingly.
Yes, I too faced the same problem, and it was with the data type of the cells of the Excel file generated using XSLT.
In addition to checking the data being used against the assigned "Type", make sure that the characters that need to be encoded for XML are indeed encoded.
I had a system that appeared to be working, but then some user data including & and < was throwing this error.
If you're not sure what's going on with your file, try http://www.xmlvalidation.com/ - that helped me spot the issue in a large file immediately.
I used this function to fix it, modified from this post:
function xmlsafe($s) {
    // Escape the characters that must be encoded in XML text content
    return str_replace(array('&', '>', '<', '"'), array('&amp;', '&gt;', '&lt;', '&quot;'), $s);
}
and then run echo xmlsafe($myvalue) where you were just echoing $myvalue in your script.
This seems to be more appropriate for XML than htmlentities() or other options built into PHP.
I had the same issue, and the answer was that the cell type was Number and some of the values could not be converted to that type on my backend.
I had the same problem, and it was because the file was too big.
I tried a smaller extract from SAP (smaller than the one that caused the error) and saved it as an XML file, and it worked with no more errors.
So maybe if you can save the data as two Excel XML files instead of one, that will do the trick ;)
Alicia
