I've written a simple Node.js app that streams content (primarily for videos, but other stuff works too). It seems to work fine except with some jpegs, where the bottom of the image is not always displayed.
Here's an example, with the bottom portion of the picture missing on the left.
A site showing the problem is here and an identical-looking site without the problem is here. (There should be pink bits at the bottom of most images with the first link.)
The code for the app is on Github, but I guess the important lines are
stream = fs.createReadStream(info.file, { flags: "r", start: info.start, end: info.end });
stream.pipe(res);
(So all of the heavy lifting is done by stream.js, not my code.)
I've tried removing the start and end params, but it makes no difference.
If I change it to stream the file to process.stdout instead and save the output to a file, the whole image is shown.
A file comparison program says the file from stdout and the original are identical.
If I transfer something other than a baseline jpeg (so even progressive jpegs work), it shows the whole file.*
If I access the file via the linked Node.js app (which uses Express), but not through the streamer, it shows the whole image.
If I save the incomplete picture from my browser to the desktop, it saves the whole image.
For some jpegs it works, but not for most.
The problem happens both locally and on my server (Windows 2008 R2 and Ubuntu 11, respectively).
Browser tests (over 3 computers & 1 VM)
Safari 4, 5 - works
Firefox 4, 11 (2 computers) - doesn't work
Chrome 18 (2 computers) - doesn't work
IE 8 - doesn't work
IE 9 - works
Opera 11 - works
I really have no idea what's going on here. I thought it could be a weird graphics driver bug, but the behaviour is consistent across all computers I've tried. I find it hard to blame Node.js / my code for this as it does seem to be transferring the whole of the file, but then it works fine without it.
Happy to supply more details if needed.
* Obviously I haven't tried every possible file type, but pngs, progressive jpegs, text and videos seem to work.
It turns out that the problem was the file size sent in the header (header["Content-Length"]). I'd set this to info.end - info.start, whereas the actual size is info.end - info.start + 1, because both offsets are inclusive. Some browsers didn't like the short length and cut off the end of the picture when displaying it on screen. Oops.
PROBLEM: I've hit a troubleshooting wall and am hoping for suggestions on what to check to get past an issue with a site I'm working on. When reading data from a spreadsheet using NPOI (C#), sometimes (not always) the row reading stops after just ten rows.
Sorry for the very long post, but I'm not sure what is/isn't useful. The primary reason for posting here is that I don't know the right question to ask the Great Google Machine.
I have an intranet site where I'm reading an XLSX file and pushing its contents into an Oracle table. As you can tell by the subject line, I'm using NPOI. For the most part, it's just working, but only sometimes...
In Oracle, I have a staging table, which is truncated and is supposed to be filled with data from the spreadsheet.
In my app (ASPX), users upload their spreadsheet to the server (this just works), then the app calls a WebMethod that truncates data from the Oracle staging table (this just works), then another WebMethod is called that is supposed to read data from the spreadsheet and load the staging table (this, kinda works).
It's this "kinda works" piece that I need help with.
The spreadsheet has 170 data rows. When I run the app in VS, it reads/writes all 170 records most of the time, but sometimes it reads just 10. When I run the app from the web server, the first time it fails (I haven't been able to catch a specific error); the second and subsequent times, it reads just ten records from the spreadsheet and successfully loads all ten. I've checked the file uploaded to the server, and it does have 170 data records.
Whether the process reads 10 records or 170 records, there are no error messages and no indication why it stopped reading after just ten. (I'll mention here that the file today has 170 but tomorrow could have 180 or 162, so it's not fixed).
So, I've described what it's supposed to do and what it's actually doing. I think it's time for a code snippet.
/* snowSource below is the path/filename assembled separately */
/* SnowExcelFormat below is a class that basically maps row data with a specific data class */
IWorkbook workbook;
try
{
    using (FileStream file = new FileStream(snowSource, FileMode.Open, FileAccess.Read, FileShare.Read))
    {
        workbook = WorkbookFactory.Create(file);
    }
    var importer = new Mapper(workbook);
    var items = importer.Take<SnowExcelFormat>(0);
    /* at this point, items should have 170 rows but sometimes it contains only 10 with no indication why */
    /* I don't see anything in the workbook or importer objects that sheds any light on what's happening. */
}
catch (Exception ex)
{
    /* error handling elided */
}
Again, this works perfectly fine most of the time when running from VS, which tells me this is workable code. When running it on the web server, it fails the first time I try the process, but subsequently it runs, picking up only those first 10 records and ignoring the rest. Also, all the data that's read (whether 10 or 170 rows) is successfully inserted into the staging table, which tells me that Oracle is perfectly okay with the data, its format, and this process. All I need is to figure out why my code doesn't read all the data from Excel.
I have verified numerous times that the local DLL and webserver DLL are the same. And I'm reading the same Excel file.
I'm hitting a serious wall here and have run out of ideas on how to troubleshoot where the code is failing, when it fails. I don't know if there's something limiting memory available to the FileStream object causing it to stop reading the file prematurely - and didn't run across anything that looked like a resource limiter. I don't know if there's something limiting the number of rows pulled by the importer.Take method. Any suggestions would be appreciated.
I faced the same issue on some files, and after analyzing the problem, this is what worked for me.
importer.Take<SnowExcelFormat>(0) has three parameters, one of which is maxErrorRows. Its default value is 10.
Parsing stops once more than 10 rows fail to parse; after that, the function stops reading.
What you have to do is set maxErrorRows explicitly instead of taking the default of 10, for example:
var items = importer.Take<SnowExcelFormat>(0, maxErrorRows: int.MaxValue);
I'm currently making an Electron app that needs to play a ~40 MB audio file from the file system. Maybe it's the wrong approach, but the only way I found to play from anywhere in the file system is to convert the file to a data URL in the background script and then transfer it using IPC. After that I simply do
this.sound = new Audio(dataurl);
this.sound.preload = "metadata";
this.sound.play();
(part of a VueJS component hence the this)
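The background-to-renderer transfer described above could be sketched roughly as follows. The channel name "load-audio", the MIME type, and the `ipcMain.handle`/`invoke` pairing are my assumptions, not details from the post; the Electron-specific parts are shown as comments so the helper stands alone:

```javascript
// Wrap raw file bytes in a data URL (the format the renderer receives).
function toDataUrl(buf, mime) {
  return `data:${mime};base64,` + buf.toString("base64");
}

// Main (background) process side, assuming Electron's ipcMain:
//
// const { ipcMain } = require("electron");
// const fs = require("fs");
// ipcMain.handle("load-audio", async (_event, filePath) => {
//   return toDataUrl(await fs.promises.readFile(filePath), "audio/mpeg");
// });
//
// Renderer side (inside the Vue component):
// const dataurl = await ipcRenderer.invoke("load-audio", filePath);
// this.sound = new Audio(dataurl);
```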
I did some profiling inside Electron, and this is what came out:
Note that actually transferring the 40 MB audio file doesn't take that long (around 80 ms). What is extremely annoying is the "Second Task", which is probably buffering (I have no idea) and lasts around 950 ms. This is way too long; ideally I'd need it under 220 ms.
I've already tried every available preload option, and while I'm using the native HTML5 Audio right now, I've also tried howler.js with similar results (it seemed a bit faster, though).
I would guess that loading the file directly might be faster, but even after disabling the security measures Electron puts in place to block file:/// URLs, it isn't recognized as a valid URI by XHR.
Is there a faster way to load the data URL? All the data is there; it just needs to be converted to a buffer or something like that.
Note: I cannot "pre-buffer" every file in advance, since there are about 200 of them; it just wouldn't make sense in my opinion.
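One thing that might be worth trying for the "converted to a buffer" idea above (a sketch, not tested against this app): decode the data URL into a Blob once and hand Audio a short object URL, so the element doesn't have to parse a 40 MB base64 string itself. This assumes `Buffer` is available in the renderer (nodeIntegration enabled):

```javascript
// Split a data URL into its MIME type and decoded bytes, then wrap them in
// a Blob. The Audio element then gets a tiny blob: URL instead of the full
// base64 payload. Assumes a base64-encoded data URL like
// "data:audio/mpeg;base64,....".
function dataUrlToBlob(dataUrl) {
  const comma = dataUrl.indexOf(",");
  const header = dataUrl.slice(0, comma); // e.g. "data:audio/mpeg;base64"
  const mime = header.slice(header.indexOf(":") + 1, header.indexOf(";"));
  const bytes = Buffer.from(dataUrl.slice(comma + 1), "base64");
  return new Blob([bytes], { type: mime });
}

// Usage in the component (sketch):
// this.sound = new Audio(URL.createObjectURL(dataUrlToBlob(dataurl)));
// this.sound.play();
```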
Update:
I found this post Electron - throws Not allowed to load local resource when using showOpenDialog
I don't know how I missed it. I followed step 1, and I can now load files inside Electron with the custom protocol. However, neither Audio nor howler.js is faster; it's actually slower, at around 6 seconds from click to first sound. Does it need to buffer the whole file before playing?
Update 2:
It appears that the 6-second loading time only affects the first Audio instance that is created; I don't know why. After that, using two instances (one playing and one pre-buffering) works just fine, and even loading a file that hasn't been pre-buffered is instantaneous. It seems weird that it's only the first one.
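For reference, the custom-protocol route from the first update could look something like this in the main process. The scheme name "media" is my placeholder, and the Electron calls are shown commented out so the path helper stands alone:

```javascript
// Convert a media:// URL back into a filesystem path.
function mediaUrlToPath(url) {
  return decodeURIComponent(url.replace("media://", ""));
}

// Main process registration (Electron API, sketch):
//
// const { app, protocol } = require("electron");
// app.whenReady().then(() => {
//   protocol.registerFileProtocol("media", (request, callback) => {
//     callback({ path: mediaUrlToPath(request.url) });
//   });
// });
//
// Renderer side: new Audio("media:///absolute/path/to/track.mp3").play();
```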
Does anyone know anything about the UITestActionLog.html file ending up empty for long-running tests?
When I run just a few steps I get the file just fine with the expected content.
When I run my full test (a bit over two hours), the file ends up at just 28 KB with virtually no content: only "Coded UI Test Log" at the top-left and "TOTAL TIME:" at the top-right.
I could not find anything about it on the web.
I am wondering if someone else has had this issue and what could be tweaked to make it work.
Thanks
This is happening because the images captured are too large when you have:
Playback.PlaybackSettings.LoggerOverrideState = HtmlLoggerState.AllActionSnapshot;
We changed that parameter to:
Playback.PlaybackSettings.LoggerOverrideState = HtmlLoggerState.ErrorAndWarningOnlySnapshot;
We don't have screenshots anymore, but we still have the details of the steps (i.e. the HTML file is still generated)!
Next I will look at System.Drawing to see if we can override the default screenshot size and encoding; that way we could restore AllActionSnapshot with lower-resolution screenshots... to be continued!
I use the Audio class to read MP3 files thanks to a little trick: replacing the ffmpegsumo.so of Node-Webkit with the Chromium one. This enables MP3 playback on Windows but doesn't work on Mac OS. Does anyone know why?
Here's the code :
player = new Audio();
player.src = '/path/to/the/audio.mp3';
player.play();
This seems to be dependent on the dll/so being a 32-bit version. I am guessing that is why copying the file from Chrome doesn't work correctly for most people (my 3-year-old phone is the only 32-bit device I have left).
I keep seeing this link --
https://github.com/rogerwang/node-webkit/wiki/Support-mp3-and-h264-in-video-and-audio-tag
...but it is a blank page. I am guessing it was deleted, since the info was likely not current or correct.
This issue thread has links to some rebuilt ffmpegsumo libraries for both Mac and Windows --
https://github.com/rogerwang/node-webkit/issues/1423
The alternative appears to be rebuilding ffmpegsumo, this thread has some config for doing that -- https://github.com/rogerwang/node-webkit/issues/1208
I am still confused about the licensing on it after you build the library, so that is probably worth some research. Everything about mpeg4-part10 is copyrighted and heavily patent encumbered. I think we all need to get smart enough to stop using mp4/h.264. Before I got this working correctly on node-webkit, it was easier to use ffmpeg to transcode the video to an ogv container using Theora and Vorbis codecs. At this point it seems like iOS is keeping h.264 alive, when it should probably die the horrible death it has earned.
I have a desktop application developed with wxPython. The application runs fine under Windows and OS X (same codebase, no platform-specific code). Everything works on Linux except drag and drop: I can drag just fine, but DoDragDrop always returns wx.DragCancel. I can, however, drag from my application to another app/desktop that accepts text, and then DoDragDrop returns wx.DragCopy.
It seems to me like the DropTargets aren't getting called. I've added debug statements to OnData, etc and they are never activated.
Has anyone seen this and know of a workaround?
Found a known issue in wxWidgets that was considered fixed (http://trac.wxwidgets.org/ticket/2763). I am able to recreate this issue on Linux, so I reopened the ticket.
In the meantime you can swap your StaticBoxSizers for plain BoxSizers, or...
This works....
parent = DropTargetCtrn.GetParent()
boxes = [x for x in parent.GetChildren() if type(x) == wx.StaticBox]
tmpParent = wx.Panel(parent)
for box in boxes:
    box.Reparent(tmpParent)
    box.Reparent(parent)
tmpParent.Destroy()  # destroy the (now empty) temporary panel, not the real parent
This solution seems to lower the StaticBox in the window hierarchy so it doesn't interfere with drop events. Note that box.Lower() does not work.