I'm a beginner web developer, front-end only. Sometimes I need to make existing working websites responsive. I use a browser extension (Styler for Chrome) that shows a window where I can insert my styles, which are then applied to the page. But this feels a little cumbersome (I need to write code in my text editor, copy it into the extension form, then again, again and again...). Is there a way to integrate my local stylesheet into an existing website, make changes only in my editor, and reload the page automatically, like with a local page? I've found something on the LiveReload website -
http://feedback.livereload.com/knowledgebase/articles/86220-preview-css-changes-against-a-live-site-then-uplo - but I can't use their app, because I'm on Windows (LiveReload is still in beta there). If anybody uses something similar, can you please explain how to get it to work? Thanks.
While experimenting I found a simple solution for this:

1. Install the CSS Inject Chrome extension.
2. Run a web server on your machine to host the CSS file you want to inject (CSS Inject only works over HTTP) and enter its URL in CSS Inject; in my case it looks like http://adapt/css/style.css.
3. Set up a livereload server. I'm using node.js and this package for it: https://www.npmjs.com/package/livereload.
4. Create a file in your website root (for example server.js) and paste this code into it:
var livereload = require('livereload'),
server = livereload.createServer();
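// watch the css directory and notify the LiveReload client when a file changes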
server.watch(__dirname + "/css");
console.log('waiting for changes');
5. Go to your live website and activate CSS Inject.
6. Run node ./server.js.
That's it. You can now modify your styles locally and see the changes on the real website.
If anybody knows a better solution (using the API from this package, https://www.npmjs.com/package/livereload#api-options, specifically the overrideURL option) or has more experience with node.js and an understanding of how to implement it, please post your solution here; I will be grateful.
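For reference, here is what I imagine that would look like, assuming the originalPath and overrideURL options behave as the package docs suggest (untested; the URLs are placeholders):

var livereload = require('livereload');

// assumption: originalPath is the live site you browse, overrideURL points at
// the local server hosting the replacement css - option names come from the
// package docs, behaviour untested
var server = livereload.createServer({
    originalPath: 'http://your-live-site.com',
    overrideURL: 'http://adapt'
});

server.watch(__dirname + '/css');
console.log('waiting for changes');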
I'm working on a project to add an external pre-registration page with Stripe to a WordPress website.
The WordPress site is already in place, but the pre-registration page needs the Stripe module, which runs on the back end...
So I've decided to build a SPA outside this WordPress site to handle pre-registration for the next customers.
But now I've been struggling for days to figure out how to put this project online; I could definitely use your help with that.
I have this file structure:
[screenshot of the file structure]
It's nothing more than a single page plus a single CSS file and a single server.js file, which I need to run to interact with the Stripe API.
But I also need some modules like Stripe, which is the key piece of this project.
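To give an idea of what I mean, server.js is roughly along these lines (a simplified sketch; the route, amount, and key are placeholders, not my actual code):

var express = require('express');
var stripe = require('stripe')('sk_test_placeholder'); // secret key placeholder
var app = express();

app.use(express.static('public')); // serves the html + css
app.use(express.json());

// create a PaymentIntent for the pre-registration payment
app.post('/create-payment-intent', function (req, res) {
    stripe.paymentIntents.create({ amount: 1000, currency: 'eur' })
        .then(function (intent) {
            res.json({ clientSecret: intent.client_secret });
        })
        .catch(function (err) {
            res.status(500).send(err.message);
        });
});

app.listen(process.env.PORT || 3000);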
I thought about bundling it with webpack (only the back end, as the front end is only composed of one HTML and one CSS file), then uploading it to my web host.
I haven't found any working way to do so.
I'd like to try a cloud service like Netlify, but I don't think it could work without bundling the server side.
Everything works perfectly on localhost, but I'm completely unable to make it work in production with a real domain name. And I'm already late delivering it.
Do you have any idea how I could do that?
Thanks
Can anyone explain to me how I can run server.js on the client side? I have an index.html page with a button in it, and I want to generate a PDF on the button click event and pass dynamic data to the PDF. I have literally no idea about node.js. Any kind of help is appreciated. Sorry if the question is a duplicate.
The simple answer is: you don't. Node.js is a back end, and the server.js file you have should not run in the browser. If you need the functionality in server.js after the first load, you need to use something like Ajax to communicate with the server. Or you could try to get your server code to run in the browser, but that may not work in all browsers and is probably bad practice.
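As a rough sketch of the Ajax route (assuming an Express server and pdfkit for the PDF generation - both are example choices, not necessarily what your server.js uses):

// server.js - example only; pdfkit is one of several PDF libraries
var express = require('express');
var PDFDocument = require('pdfkit');
var app = express();

app.use(express.json());
app.use(express.static('public')); // serves index.html

app.post('/pdf', function (req, res) {
    var doc = new PDFDocument();
    res.setHeader('Content-Type', 'application/pdf');
    doc.pipe(res);                      // stream the PDF back to the browser
    doc.text('Hello ' + req.body.name); // dynamic data sent from the client
    doc.end();
});

app.listen(3000);

Then on the client side, the button click sends the dynamic data and opens the result:

// in index.html
document.getElementById('btn').addEventListener('click', function () {
    fetch('/pdf', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ name: 'World' })
    })
    .then(function (res) { return res.blob(); })
    .then(function (blob) { window.open(URL.createObjectURL(blob)); });
});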
I'm using artoo.js for web scraping; however, for some reason the scraped image URLs change when working with cheerio in node. I.e., the original image URL is:
"https://images-na.ssl-images-amazon.com/images/M/MV5BNWU4NmY3MTMtMTBmMi00NjFjLTkwMmItYWZhZWUwNDg5M2ExXkEyXkFqcGdeQXVyNDUyOTg3Njg#._V1_SX300.jpg"
However, after scraping, the URL turns into this one:
"http://ia.media-imdb.com/images/G/01/imdb/images/nopicture/156x231/tv-3797070466._CB522736147_.png#._V1_SX300.jpg"
If I scrape it in the Chrome browser console using the Artoo.js bookmarklet, the URL stays the same as the original.
Why is it changing when I use it in node? Any suggestions?
UPDATE: I think I found the issue but not the solution. It seems the scraper method runs before the correct images have loaded on the page; the changed URL is just a placeholder image. How can I wait until the entire page has loaded?
It may be caused by some JS code. If you are using request + cheerio to scrape the page, then when you make the request in node, the page's JS is never run (it isn't interpreted), so you are getting the URL exactly as it appears in the raw HTML, before any library or piece of code changes it. Try looking at the source code of the page in the browser (Ctrl+U). If it's "http://ia.media-imdb.com/images/G/01/imdb/images/nopicture/156x231/tv-3797070466._CB522736147_.png#._V1_SX300.jpg", then you will know some piece of code is changing it.
Edit
If you absolutely need to run the JS to obtain the URL, you should use PhantomJS. It's a headless browser, so the images will load. You can use it directly from node.js, or if you want a simpler way, go with CasperJS. I assume you're not used to scraping complicated web apps; if that's the case, I would go with CasperJS. It's easy and it does the job. It's not as fast as using request + cheerio, but it works, and you can put your code to run on a server.
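For example, a rough CasperJS sketch (the URL and selector are placeholders for whatever page and image you are scraping):

var casper = require('casper').create();

casper.start('http://www.example.com/some-page', function () {
    // wait until the real image url (not the placeholder) appears in the DOM
    this.waitForSelector('img[src*="images-na.ssl-images-amazon.com"]');
});

casper.then(function () {
    // print the src attribute of the loaded image
    this.echo(this.getElementAttribute('img[src*="images-na.ssl-images-amazon.com"]', 'src'));
});

casper.run();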
I have my fledgling Polymer application running. I would like to load and save a file, and I would like to do that using the ajax element. But what happens on the server side?
I have made node.js Express applications before. I could make a separate server for the client to talk to, but there are at least two other options:
Take the front-end material supplied by the Polymer starter-kit and put it in the /public directory of an Express application.
Put Express routes in my Polymer starter-kit application.
I am inclined to bring Express into the existing starter-kit application. But maybe someone else has already tried this and can tell me what I will run into?
I looked at the starter-kit code a little bit. Apparently, it uses the spdy package and not the http package, but I can work with that. That's as far as I have gotten. Any advice?
Regards, Rick
I talked to the folks on the Polymer Slack channel (thanks, milesje). polymer serve is just supposed to be a light server for serving static files. If you want to do something more substantial, you should create your own server (node.js or something else) and put your Polymer files in the public/ directory.
Hope this saves someone else some time.
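For anyone landing here, that server can be as small as this (a minimal sketch; adjust 'public' to wherever your built Polymer files end up):

var express = require('express');
var app = express();

app.use(express.static('public')); // serve the built Polymer files

app.listen(3000, function () {
    console.log('serving on http://localhost:3000');
});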
I'm trying to automate some data scraping from a website. However, because the user has to go through a login screen, a wget cronjob won't work, and because I need to make an HTTPS request, a simple Perl script won't work either. I've tried looking at the "DejaClick" addon for Firefox to simply replay a series of browser events (logging into the website, navigating to where the interesting data is, downloading the page, etc.), but the addon's developers for some reason didn't include saving pages as a feature.
Is there any quick way of accomplishing what I'm trying to do here?
A while back I used mechanize (wwwsearch.sourceforge.net/mechanize) and found it very helpful. It supports urllib2, so as I read now, it should also work with HTTPS requests; my comment above could hopefully prove wrong.
You can record your actions with the IRobotSoft web scraper. See the demo here: http://irobotsoft.com/help/
Then use the saveFile(filename, TargetPage) function to save the target page.