When I deploy my site to Heroku, the JPGs are heavily compressed. For example, a JPG that should display at about 72k is showing up as 14k.
I am using the imagemin Grunt plugin to compress images on deploy, but it compresses each image only slightly (the original is about 84k and it compresses to the aforementioned 72k), and the quality difference is not noticeable. The correctly compressed image is making it onto the deployed instance; I've checked this by shelling into the instance (via heroku run bash) and examining the file size (via ls -l), and it does show up as 72k.
I'm thinking this may be cache related, because if I open the image in a new tab (I'm using Chrome) and hit Ctrl+F5 to reload it uncached, it does come back at the expected 72k. Unfortunately, doing a similar refresh on the main site doesn't bring back the correctly compressed images.
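For anyone wanting to check the same thing, the headers the browser sees are easy to compare with curl (the app name and image path here are placeholders, not my real ones):

    curl -sI https://myapp.herokuapp.com/images/photo.jpg | grep -iE 'content-length|cache-control|etag'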
It's also a bit perplexing that I've never, to my knowledge, compressed the images this heavily. The Grunt imagemin plugin only has one option (progressive), and while I have played around with toggling it in the past, it only affects the image size by a small degree. So if this is caching related, I have no idea where the super-compressed versions came from in the first place.
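For reference, my imagemin target looks more or less like this (assuming grunt-contrib-imagemin; the cwd/dest paths are illustrative), with progressive being the only JPG-related knob I'm toggling:

    // Gruntfile.js (sketch; paths are illustrative)
    module.exports = function (grunt) {
      grunt.initConfig({
        imagemin: {
          dist: {
            options: { progressive: true }, // the lone JPG option mentioned above
            files: [{
              expand: true,
              cwd: 'public/img/',
              src: ['**/*.jpg'],
              dest: 'build/img/'
            }]
          }
        }
      });
      grunt.loadNpmTasks('grunt-contrib-imagemin');
    };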
Can anybody shed some light on what is happening?
Hey guys,
At the moment I have a NodeJS webapp in the making which scrapes a website for images so they can be downloaded. For example, all the image permalinks are scraped from the reddit front page and then sent to the client to download individually. My issue is that the site I am scraping can have thousands of images.
This makes for a horrible user experience when 1000+ individual images land in the user's download folder.
As a result I have two options.
A) Download to a temporary folder on the server. Zip. Send to the client for download. Delete from the server.
B) Download the files to the browser cache. Zip. Download to the specified download directory.
My question to you is this: is option B even possible?
I am relatively new to this entire process and I can't find anything that actually zips files in the browser cache. I can implement option A relatively easily, although it requires a large amount of bandwidth, something I can find for around $5/mo on DigitalOcean. However, this entire project is a learning experience, so I would love to be able to manage the files in the browser instead. (A rough sketch of option A follows the module list below.)
I am using the following NPM Modules:
NodeJS
Express
Cheerio
Request
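For what it's worth, here is a rough sketch of option A using those modules plus JSZip 3.x. The route name, file names, and the single placeholder URL are all illustrative, not production code:

    // Option A sketch: zip in memory on the server, stream one file to the client.
    // In the real app, imageUrls would come from the Cheerio scraper.
    var express = require('express');
    var request = require('request');
    var JSZip = require('jszip');

    var app = express();

    app.get('/download', function (req, res) {
      var imageUrls = ['https://i.redd.it/example.jpg']; // placeholder
      var zip = new JSZip();
      var pending = imageUrls.length;

      imageUrls.forEach(function (url, i) {
        // encoding: null makes request hand back a raw Buffer, not a string
        request({ url: url, encoding: null }, function (err, resp, body) {
          if (!err && resp.statusCode === 200) {
            zip.file('image_' + i + '.jpg', body);
          }
          if (--pending === 0) {
            zip.generateAsync({ type: 'nodebuffer' }).then(function (buf) {
              res.set('Content-Type', 'application/zip');
              res.set('Content-Disposition', 'attachment; filename=images.zip');
              res.send(buf); // nothing touches disk, so no cleanup step
            });
          }
        });
      });
    });

    app.listen(3000);

Since the zip is built in memory, the "delete from server" step disappears entirely, though for thousands of images a streaming zip library such as archiver would be a better fit than buffering everything.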
Further Update
I had come across an NPM package called JSZip: https://stuk.github.io/jszip/
However, I was unaware it could be used on the client side as well; that was purely an error on my part. This brings up an interesting issue with WebStorage: https://www.w3schools.com/html/html5_webstorage.asp
the maximum storage size for the session is 5MB
From here I will attempt to apply the answer from "How do you cache an image in Javascript" to my current code, and I will update this answer with the result for anyone else facing this issue.
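In the meantime, this is the shape of the client-side version I am going to try: JSZip in the browser, placeholder URLs, one download instead of 1000+. Note that fetch() of cross-origin images only works if the host sends CORS headers:

    // Client-side sketch: fetch each image as a Blob, zip with JSZip,
    // then trigger a single download via an <a download> element.
    var urls = ['https://i.redd.it/example.jpg']; // placeholders
    var zip = new JSZip();

    Promise.all(urls.map(function (url, i) {
      return fetch(url)
        .then(function (resp) { return resp.blob(); })
        .then(function (blob) { zip.file('image_' + i + '.jpg', blob); });
    })).then(function () {
      return zip.generateAsync({ type: 'blob' });
    }).then(function (blob) {
      var a = document.createElement('a');
      a.href = URL.createObjectURL(blob);
      a.download = 'images.zip';
      a.click(); // one download instead of thousands
    });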
Before getting too far into developing my theme, which is based on Cornerstone, I tried running the stencil bundle command and then uploading the customized Cornerstone theme to my BigCommerce website. It finishes uploading and gets a small portion of the way through the loading bar before telling me "There was a problem processing the theme". I'm not sure what I missed in the Stencil documentation. Maybe I made adjustments to a file that shouldn't have been changed, such as the Foundation settings file? Any help would be appreciated.
I put my videos into the cdn folder and they uploaded fine.
I am doing my first deployment on AWS (using Elastic Beanstalk), and I am completely new to this.
I built a personal website using NodeJS/Express, and on my local machine it loads just fine. Once I was ready to deploy a v1, I created an AWS account and set up a new Elastic Beanstalk application environment for Node. I set the static files to load from /public, set my Node version, and set the launch command to node app.js, but those were the only options I changed.
I zipped up my site (using Control+Click -> Compress on a selection of all the site files) and uploaded that zip, and after some time it came up all green. Clicking the link to load my site, though, I get a half-finished version. Looking at my console, I see that four files are returning 404s, and because of that, four failures from RequireJS.
These four files are Backbone views, and they live in a folder with four other JS files that all load just fine (I can open them in the Chrome dev tools Sources tab on the deployed version). I am confused how just these four files would go missing.
Is there some way to FTP into wherever my files are contained, to confirm the files are in fact not present? And barring that, what steps are available to figure out what is occurring here? Like I said, it looks and loads just fine locally, and I am at a loss as to where to even start debugging something like this. The AWS docs I have read so far only tell me to do exactly what I have already been doing.
Repo for the project is here: https://github.com/RyanMG/trustycode
And the deployment is here: http://trustycode.elasticbeanstalk.com/
The files it is having trouble with are under public/javascript/views/ (CodeView, AboutView, PhotoView, DesignView)
Any ideas / advice?
Is there some way to FTP into wherever my files are contained, to confirm the files are in fact not present?
You can ssh into the EC2 instance of the Elastic Beanstalk app using your pem file.
Check the files in /var/app/current.
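Something like this, where the key file and hostname are placeholders for your own (ec2-user is the default user on the Amazon Linux instances Elastic Beanstalk spins up):

    ssh -i your-key.pem ec2-user@ec2-xx-xx-xx-xx.compute-1.amazonaws.com
    ls -l /var/app/current/public/javascript/views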
I don't have the reputation to comment, but this is one of those common gotchas I hit myself after switching to OSX from GNU/Linux at work: OSX's default filesystem is case-insensitive; the Linux world is case-sensitive.
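For example (file names are illustrative): if the file on disk is codeview.js but the module is requested with different casing, the Mac happily serves it locally while the Linux instance 404s:

    // File on disk: public/javascript/views/codeview.js (lowercase 'v')
    // Resolves on OSX's case-insensitive filesystem, 404s on Linux:
    require(['javascript/views/CodeView'], function (CodeView) {
      console.log('loaded', CodeView);
    });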
I am tearing my hair out over a weird file upload issue that I have never run into before. For some reason I’m unable to upload images via the file manager (both in the file manager itself and when uploading with a custom field using the “file” fieldtype). Strangely, if I add files directly to any of the file upload directories and sync the files, everything works fine.
After selecting the file and hitting “upload file” (see 01_choose_file.jpg) the modal window displays the CP homepage in an iframe (see 02_upload_progress.jpg).
Has anyone else seen this? Does anyone know how I can start troubleshooting this?
Background Info:
I’m running EECMS v2.5.2 - Build Date: 20120606 - in MAMP (only 2 of the 15 sites I have set up locally are affected)
I have tried uploading images/files using the latest versions of Chrome, Chrome Canary, Safari, and Firefox (OS X 10.7.5)
This issue is showing up only on the two latest sites I’ve started dev’ing locally; no other site (local or otherwise) is affected
Things I’ve done:
Checked Apache/PHP error logs; they don’t show anything
Confirmed file upload paths and file upload directory settings are correct – I can sync files that I manually move into the various file upload directories
Permissions are fine; image manipulations and thumbnail creation work fine if I manually add files to the upload directories
Tested various other 2.5.2 installs I dev on locally and they work fine (settings on these two new sites are identical to sites that work)
Only a handful of native add-ons are enabled
Setting “Apply XSS Filtering to uploaded files?” to Yes or No does not make a difference
Huge thanks for any help!
I can't post images so here are links to the images:
01_choose_file.jpg: http://expressionengine.com/?ACT=51&fid=105&aid=16264_Jiof3p0V1gfEEFrpC55G&board_id=5
02_upload_progress.jpg: http://expressionengine.com/?ACT=51&fid=105&aid=16265_mjGH02xK2fIFZJI6kruP&board_id=5
I have sorted this out. I went back through to make sure I had disabled all third-party add-ons and realized I had forgotten to uninstall the "Quickee" extension (http://devot-ee.com/add-ons/quickee). For now that seems to be the culprit.
I've submitted the bug to Matt (the developer) and it should be patched up soon.
The ExpressionEngine file manager sends an AJAX POST request to the following URL:
http://YOUR_ADMIN_CP_URL?S=0&D=cp&C=content_files_modal&M=upload_file
Have you tried loading that URL yourself? You should get a page like this
But maybe EE is trying to POST to a different URL. You can find out by uploading a large file and, while it is uploading, opening Firebug; in the Network tab, at the bottom of the list, you will find the URL EE is posting to.
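You can also hit the endpoint from the command line to see what comes back (substitute your own admin CP URL):

    curl -i "http://YOUR_ADMIN_CP_URL?S=0&D=cp&C=content_files_modal&M=upload_file"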
Do you know of any script to capture a screenshot of rendered web browser contents to an image file?
For now I've tried:
wkhtmltoimage - doesn't capture Flash
cutycapt - problems compiling on my hosting
khtml2png - problems with compilation
At home I'm using Ubuntu; the hosting is on Debian.
Never got around to trying it myself, but check out http://en.wikipedia.org/wiki/Xvfb. You should be able to run Firefox in Xvfb and just save an image of the whole virtual window.
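Something along these lines, untested and with a placeholder URL: start a virtual display, point Firefox at the page, then grab the frame with ImageMagick:

    Xvfb :99 -screen 0 1280x1024x24 &
    DISPLAY=:99 firefox http://example.com/ &
    sleep 10   # give the page (and any Flash) time to render
    DISPLAY=:99 import -window root screenshot.png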
The xwd(1) program can capture a running browser window's contents and save it to a file or stdout:
xwd -out /tmp/image.out
You can view it again with xwud(1).
The ImageMagick import(1) command can also capture X11 windows or any rectangular portion of the screen. It also supports many output file formats, which might be nicer than the standard xwd format.
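For example, you can capture straight to PNG, or convert an xwd dump made earlier:

    import -window root screenshot.png
    convert /tmp/image.out screenshot.png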
PDFMyURL - really useful, apart from a bug with header sending. They have a simple "API". Unless you just need a simple screen grab from WebKit, it is the best solution IMHO.
If you have your own VDS, I recommend looking into PhantomJS; see rasterize.js.
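The core of rasterize.js boils down to something like this (the URL and output name are placeholders):

    // PhantomJS: load the page in headless WebKit and render it to an image.
    var page = require('webpage').create();
    page.viewportSize = { width: 1024, height: 768 };
    page.open('http://example.com/', function (status) {
      if (status === 'success') {
        page.render('screenshot.png'); // PNG/JPEG/PDF chosen by extension
      }
      phantom.exit();
    });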
UPD: I have just seen that this is a necropost :)