I have lots of JPEGs from my DSLR, roughly 5-6 MB each. When I open any of them in MS Paint and just click Save, the file size immediately drops to 2-3 MB.
Why? Is MS Paint doing lossy or lossless compression?
Things Paint may be doing:
Using different quantization tables
Subsampling the Cb and Cr color components
Using optimized Huffman tables
Stripping out metadata
The first two changes are lossy; the last two are lossless. Since Paint fully decodes the image and re-encodes it on save, the round trip is effectively lossy. You can run a JPEG dumping program on the two versions and compare the output to see the changes.
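For example, with ImageMagick and ExifTool installed (the tool choice is an assumption, any JPEG dumper will do, and the file names are placeholders), you could compare the two versions like this:
$ identify -verbose original.jpg | grep -i quality   # estimated JPEG quality setting
$ identify -verbose resaved.jpg | grep -i quality
$ exiftool original.jpg | wc -l   # rough count of metadata tags
$ exiftool resaved.jpg | wc -l
A large drop in the tag count after re-saving points to stripped metadata, while a different quality estimate points to different quantization tables.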
Hi everyone, I hope you are doing well.
I have a GIF file that will be used for the background of my homepage, but its resolution is not good, so I think it would be better to use the GIF as an SVG. Is there any way to convert the GIF image to SVG? Any tools or website links would be helpful.
Thanks.
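Note that SVG is a vector format, so conversion cannot recover resolution from a raster GIF. That said, for simple, high-contrast graphics one workable approach (a suggestion of mine, not from the original thread; file names are placeholders) is to trace the bitmap with ImageMagick and potrace:
$ convert input.gif[0] input.pgm   # extract the first frame as a grayscale bitmap
$ potrace input.pgm -s -o output.svg   # -s selects potrace's SVG backend
Animation is lost in the process, and photographic images will trace poorly.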
I have 6 very big directories, and once a day I would like to check the size of each of them for monitoring. Right now I'm using du -s, but it takes a long time and significantly slows down my server. Is there a better way to do this?
Depending on circumstances, you could put those directories on separate partitions, the "used" size of which you can check very quickly with df.
This, of course, means that the directories are limited to the size of their respective partitions, which could be a pain. Hence the "depending on circumstances".
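As a sketch (the mount points are hypothetical), GNU df can report just the used space of each filesystem:
$ df -h --output=target,used,pcent /srv/data1 /srv/data2
This is exactly why it is fast: du must stat every file in the tree, while df simply asks the filesystem for its totals.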
I have a Lumix camera which, like most new cameras, records video in AVCHD format. The files get split into 2 or 4 GiB segments because of the limitations of the filesystem used on the memory card.
When I transfer the files to my Linux computer to edit them, I naturally want each video in a single file, which is no problem at all for Linux's filesystems. So, how can I losslessly join these segments while maintaining A/V sync?
(With Avidemux 2.6.8 I can append these segments, but it leads to nasty distortions at the cut point.)
The solution, which seems to work with my files at least, turned out to be very simple:
ffmpeg -i "concat:00000.MTS|00001.MTS|00002.MTS" -c copy output.mts
One still has to figure out which of the files belong together, though.
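To work out which segments belong together (a hint of mine, not part of the original answer), consecutive file numbers plus modification times that follow on from one another are usually enough:
$ stat -c '%y %n' *.MTS | sort
Segments of a single recording are numbered consecutively and are written one right after the other.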
I'm trying to compress an MP4 file to reduce its size. I am using a Debian box.
The file is around 701 MB. First I tried bzip2, which only compressed it down to about 700 MB; then I tried lrzip with zpaq, which took about 5 minutes to run and only brought it down to about 695 MB. Am I doing something wrong?
MP4 (MPEG-4) is already a highly compressed video format that uses advanced video-specific coding.
You won't be able to compress it further with general-purpose lossless compressors like bzip2. What you can do is transcode the file to a lower video quality using a transcoder such as HandBrake.
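The answer names HandBrake; an equivalent command-line sketch with ffmpeg (my substitution, with placeholder file names) re-encodes the video at a lower quality while copying the audio stream untouched:
$ ffmpeg -i input.mp4 -c:v libx264 -crf 26 -preset slow -c:a copy output.mp4
Higher CRF values give smaller files at lower quality; roughly 18-28 is the usual range for x264.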
How do I get a list of amplitudes from an audio file using a Linux command-line tool?
Do you mean getting all the individual samples as text? SoX can do that.
$ sox file.wav file.dat
will take the audio file file.wav and generate a text file file.dat with one column for the time in seconds and one column per audio channel, with samples scaled by the maximum possible value.
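To just peek at the first few samples without writing a file (a variation on the above, using sox's standard-output syntax):
$ sox file.wav -t dat - | head
After a short comment header, each data row is one sample: the time in seconds followed by one amplitude value per channel.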