Using a JPEG as a memory-mapped file

I have a JPEG image at 360x288 resolution. It is stored on my server. The image can be updated at random times, but always in a sequential manner.
I upload the image via a web service as a byte array, and I then save it as an image/jpeg on my hard drive.
And it continues...
Now, I am curious as to whether using a memory-mapped file would improve efficiency, as I have read somewhere (I cannot recall where) that an image is by default already a memory-mapped file.
Any info would be great...

Related

Determining where the extra information from squashfs comes from

I extracted the root file system from an IoT device and was able to peruse it using unsquashfs. I then changed a single byte in a single file and recompressed it using mksquashfs. When I inspect the two files, the original and the one I created, the output from binwalk is identical except for the size. The original had a size of 1038570 bytes, while the one I created had a size of 1086112. I have no idea where the extra data came from. Are there any tools or methods for determining what the difference is?
So it turns out I was missing a flag while creating the squashed file system.
When using the xz compressor, you can specify an extra flag, -Xbcj, which applies an architecture-specific filter for better compression (e.g. mksquashfs rootfs/ new.squashfs -comp xz -Xbcj arm). Once I added this flag and chose arm as my architecture, the file size was the same.

Are Node.js Buffers as JSON a portable storage format?

If I create a Node.js Buffer containing the bytes of a binary file, like a JPEG image, and convert it to JSON, can I transport binary content in this way to other machines and have the images viewable on those other machines too?
In other words, can I fill a Buffer on one machine with the bytes of an image file, transport the Buffer as JSON to another machine, and then restore the image there by simply writing the same buffer to a file with the same name?
Would it work between platforms, say Linux, Windows, and Mac? Does endianness become an issue?
Would TypedArrays be a better solution?
JSON isn't useful for transferring binary data... at least, not efficiently. You would have to base64-encode the data before putting it in JSON, which increases its size by 33% and adds an extra layer of processing on each end.
There is another standard serialization format you can use called CBOR. It's binary in nature and supports a byte string. There are libraries for many languages.
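If you do go the JSON route, the usual pattern is exactly that base64 round trip. A minimal Node.js sketch (file names are hypothetical):

```typescript
// Sender side: read the image and wrap its bytes in a JSON-safe base64 string.
import { readFileSync, writeFileSync } from "fs";

const original = readFileSync("photo.jpg");
const payload = JSON.stringify({
  name: "photo.jpg",
  data: original.toString("base64"), // ~33% larger than the raw bytes
});

// Receiver side: decode the string back into a Buffer and write it out unchanged.
const parsed = JSON.parse(payload) as { name: string; data: string };
writeFileSync(parsed.name, Buffer.from(parsed.data, "base64"));
```

Because the file's bytes are copied verbatim, endianness never comes into play; the restored file is identical on Linux, Windows, and Mac. The cost is the base64 inflation and the extra encode/decode step, which is what a binary format like CBOR avoids.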

Image sanitization library

I have a website that displays images submitted by users. I am concerned about
some wiseguy uploading an image which may exploit some 0-day vulnerability in a
browser rendering engine. Moreover, I would like to purge images of metadata
(like EXIF data), and attempt to compress them further in a lossless manner
(there are several such command line utilities for PNG and JPEG).
With the above in mind, my question is as follows: is there some C/C++
library out there that caters to the above scenario? And even if the
full pipeline of parsing -> purging -> sanitizing -> compressing -> writing
is not available in any single library, can I at least implement the
parsing -> purging -> sanitizing -> writing pipeline (without compressing) in a
library that supports JPEG/PNG/GIF?
Your requirement is impossible to fulfill: if there is a 0-day vulnerability in one of the image reading libraries you use, then your code may be exploitable when it tries to parse and sanitize the incoming file. By "presanitizing" as soon as the image is received, you'd just be moving the point of exploitation earlier rather than later.
The only thing that would help is to parse and sanitize incoming images in a sandbox, so that, at least, if there was a vulnerability, it would be contained to the sandbox. The sandbox could be a separate process running as an unprivileged user in a chroot environment (or VM, for the very paranoid), with an interface consisting only of bytestream in, sanitized image out.
The sanitization itself could be as simple as opening the image with ImageMagick, decoding it to a raster, and re-encoding and emitting it in a standard format (say, PNG or JPEG). Note that if the input and output are both lossy formats (like JPEG) then this transformation will be lossy.
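As a rough illustration of that decode-and-re-encode step, here is a minimal sketch that shells out to ImageMagick's convert tool (paths are hypothetical, and this would still need to run inside the sandbox described above):

```typescript
import { execFile } from "child_process";

// Force a full decode to pixels and a re-encode to PNG; -strip drops EXIF/ICC/comments.
function sanitize(input: string, output: string): Promise<void> {
  return new Promise((resolve, reject) => {
    execFile("convert", [input, "-strip", `png:${output}`], (err) =>
      err ? reject(err) : resolve()
    );
  });
}

sanitize("untrusted-upload", "sanitized.png").catch(console.error);
```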
I know, I'm 9 years late, but...
You could use an idea similar to the PDF sanitizer in Qubes OS, which copies a PDF to a disposable virtual machine and runs a PDF parser that converts the PDF to what are basically TIFF images; these are sent back to the originating VM and reassembled into a PDF there. This way you reduce your attack surface to TIFF files, which is tiny.
(A diagram of this flow appears in this article: https://blog.invisiblethings.org/2013/02/21/converting-untrusted-pdfs-into-trusted.html)
If there really is a 0-day exploit for your specific parser in that PDF, it compromises the disposable VM, but since only valid TIFF is accepted by the originating VM, and since the disposable VM is discarded once the process is done, this is pointless. Unless, of course, the attacker also has either a Xen exploit at hand to break out of the disposable VM, or a Spectre-type full-memory-read primitive coupled with a side channel to leak data to their machines. Since the disposable VM is not connected to the internet and has no audio hardware assigned, this boils down to creating EM interference by modulating the CPU power consumption, so the attacker probably needs a big antenna and a location close to your server.
It would be an expensive attack.

Google Earth crashed or still loading?

How do you know whether the (free) Google Earth app is still loading data or has crashed for some reason?
I'm loading a huge KML file; CPU usage is at 100%, but it never stops processing.
Are there any limits on the size of the displayed KML file?
How many MB of KML can the Google Earth PC application show without crashing?
I don't think there are any file size limitations for Google Earth when using .kml files. There are, however, limits with regard to the size of images, so if your KML is trying to load images (such as screen overlays) then perhaps your problem lies there.
This is related to the specifications of your computer; you can find the maximum resolution of images you can import by looking under 'About'. I am not sure where to find info about the maximum file size for images.
Next, you can try creating a .kmz out of the .kml. A KMZ is simply a compressed file, the same as a .zip; learn more about it here:
http://code.google.com/apis/kml/documentation/kmzarchives.html
Also, you can try breaking the file up into smaller ones, either by using network links (a minimal example is sketched at the end of this answer)
http://code.google.com/apis/kml/documentation/kmlreference.html#networklink
or Regions
http://code.google.com/apis/kml/documentation/regions.html
By breaking the file up into smaller ones, you might also discover which part/element of the KML is causing trouble, if you have some kind of format error.
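For illustration, a parent KML document that uses a NetworkLink to pull in one of the smaller files could look like this (the URL and names are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
    <!-- One NetworkLink per smaller file split out of the big KML -->
    <NetworkLink>
      <name>Slice 1</name>
      <Link>
        <href>http://example.com/kml/slice1.kml</href>
      </Link>
    </NetworkLink>
  </Document>
</kml>
```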

Loading website images faster

Is it possible to make the website background image load faster than it does now? My website's background image is 1258x441 pixels and about 656 KB in size. It is taking too long to load the complete background image when accessing my website. Is there any way, other than compressing it (the image is already compressed), to improve its loading speed?
Choose one (2 is my main solution, together with 3):
1. Compress the image even more. Because the image is a background, quality matters less; users do not focus on the background as much as they do on the content.
2. Since the background is partially covered by content, you can paint the part of the background that is not visible (the part behind the content) black or any other solid color. This will make the image compress more nicely, saving some space.
3. Save the image with progressive JPEG compression. That will make the background display at gradually higher quality as the image loads (one way to do the conversion is sketched after this list).
4. Get rid of the background image. (Obvious.) :)
5. Today's web speeds are high; don't change anything.
ALSO: If your PNG image has any repeating parts, you can slice your image into three parts and save a lot of space.
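As one way to do the progressive re-save from option 3, here is a minimal Node.js sketch using the sharp package (an assumption; any image tool with progressive JPEG support works, and the file names and quality setting are hypothetical):

```typescript
import sharp from "sharp";

// Re-encode the background as a progressive JPEG at a lower quality.
sharp("background.jpg")
  .jpeg({ quality: 70, progressive: true })
  .toFile("background-progressive.jpg")
  .then((info) => console.log(`wrote ${info.size} bytes`))
  .catch(console.error);
```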
The speed of loading the background image is determined (ignoring latency) by the bandwidth of your connection and the size of the image. So if you have, let's say, 128 KB/s and an image of 4096 KB, you need at least
4096 / 128 = 32 s
for it to load. For comparison, the 656 KB image from the question would need roughly 656 / 128 ≈ 5 s on such a connection.
Since you can't change the bandwidth, the only thing you can do is change the size of the picture. That is, if you can't compress it more, lower the resolution.
If you don't want to lose precision, you could put different layers of background in your website with different qualities, the better ones over the bad ones. Then when your page is loading, the low quality images load fast and you get some background. Then over time the better quality images load and the background is improved.
Load your image into a database and fetch it from there whenever you need it. The benefit of this? The database is loaded once you initialize it and can retrieve the information whenever required; it is fast compared to other techniques.
