Remotely upload a file to Wordpress from nodejs instance - node.js

I’m running a Node app on Heroku. It gathers a chunk of data and writes it into a set of JSON files. I want to POST this data to my WordPress server (right now it’s on SiteGround, but it’s moving to WP Engine).
I thought the WordPress REST API would provide what I wanted, but after reading the docs I’m not so sure.
Does anyone have any advice on this? It’s not the kind of thing I’ve done before.
Naturally, I could download the generated files and manually upload them in the right place… but I want it to be automated!
Can anyone point me in the right direction?
Looked at WordPress REST API but don’t think that’s the answer.

One option would be to set up a cron job on your WordPress site to download the already-generated JSON files. This would let you fetch the files over HTTP if they're publicly available (or protected by an API key), or even over SFTP.
You could also go the opposite way: set up a cron job on the server running your Node app and use SFTP to deliver the updated files to your WordPress site.
If you are really committed to using the WordPress REST API, the closest match that comes to mind (I'm not a WordPress expert) would be uploading the JSON files as media objects: https://developer.wordpress.org/rest-api/reference/media/.

Related

Zip Files In Browser Cache

Hey Guys
At the moment I have a Node.js web app in the making which scrapes a website for data. Specifically, this web app scrapes images for the purpose of downloading them. For example, all the image permalinks are scraped from the reddit front page. They are then sent to the client to download individually. My issue is that on the website I am scraping, there can be thousands of images.
This provides a horrible user experience if 1000+ images are downloaded to the download folder.
As a result I have two options.
A) Download to a temporary folder on the server. Zip. Send to client for download. Delete from server
B) Download files to browser cache. zip. download to specified download directory.
My question to you is this; Is option B even possible?
I am relatively new to this entire process and I can't find anything that actually zips files in the browser cache. I can implement option A relatively easily; however, it requires a large amount of bandwidth, which I can find for around $5/mo on DigitalOcean. However, this entire project is a learning experience, so I would love to be able to manage the files in the browser instead.
I am using the following NPM Modules:
NodeJS
Express
Cheerio
Request
Further Update
I had come across an NPM package called JSZip: https://stuk.github.io/jszip/
However, I was unaware it could be used on the client side as well; that was purely an oversight on my part. This brings up an interesting limitation of Web Storage: https://www.w3schools.com/html/html5_webstorage.asp
the maximum storage size for the session is 5MB
From here I will attempt to apply the answer from "How do you cache an image in Javascript" to my current code, and will update this answer with the result for anyone else facing this issue.
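Client-side zipping with JSZip could be sketched as below. Note that JSZip builds the archive in memory (ArrayBuffers/Blobs), not in sessionStorage, so the 5 MB Web Storage quota doesn't directly apply; available RAM is the practical limit. The URL list and archive name are placeholders, and JSZip is assumed to be loaded via a `<script>` tag:

```javascript
// Pure helper: derive a usable filename from an image URL.
function filenameFromUrl(url, index) {
  const last = url.split('/').pop().split('?')[0];
  return last || `image-${index}.jpg`;
}

// Fetch each image into memory, zip them, and hand the user one download.
async function downloadAsZip(imageUrls, zipName) {
  const zip = new JSZip(); // global provided by the JSZip script tag
  const blobs = await Promise.all(
    imageUrls.map((u) => fetch(u).then((r) => r.blob()))
  );
  blobs.forEach((blob, i) => zip.file(filenameFromUrl(imageUrls[i], i), blob));
  const archive = await zip.generateAsync({ type: 'blob' });
  // Trigger a single download of the finished archive.
  const link = document.createElement('a');
  link.href = URL.createObjectURL(archive);
  link.download = zipName;
  link.click();
}
```

This keeps all the bandwidth on the client, which matches option B's goal; the trade-off is that thousands of full-size images may exhaust browser memory before the zip is generated.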

Prevent Azure App Service from viewing backend configuration

I am working on a project that has us deploying to an Azure Web Site.
The code is overall working and now we are focusing more on security.
Right now we are having an issue where back-end configuration files are visible via a direct URL.
Examples (links won't work):
https://myapplication.azurewebsite.net/foldername/FileName.xml (this file is in a folder contained within the root application)
https://myapplication.azurewebsite.net/vApp/FileName.css (this file is part of a virtual application subfolder)
I have found this to be true with multiple extensions and locations.
Extensions like:
.css
.htm
.xml
.html
the list likely goes on
I understand that certain files are served to the client and that those can't be blocked. However, back-end XML files are something we don't pass to the client (especially if they contain connection strings).
I did read a similar article, Azure App Service Instrumentation Profiling?
However this didn't directly relate to my issue.
Any insight would be extremely helpful.
Do not store sensitive information in flat files, especially under your site root. Even if you get your web.config just right, you're still one botched commit away from disaster.
Use Application Settings instead, that's what they're for.
https://learn.microsoft.com/en-us/azure/app-service-web/web-sites-configure
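In a Node app, Application Settings surface as plain environment variables, so reading them could look like this. The setting names here are placeholders; note that App Service prefixes connection strings by type (e.g. `CUSTOMCONNSTR_`, `SQLAZURECONNSTR_`):

```javascript
// Read secrets from App Service Application Settings (environment variables)
// instead of shipping an XML config file under the site root.
function getConnectionString() {
  // A connection string named "MyDb" of type "Custom" appears as
  // CUSTOMCONNSTR_MyDb; a plain app setting keeps its own name.
  const conn = process.env.CUSTOMCONNSTR_MyDb || process.env.MY_DB_CONNECTION;
  if (!conn) {
    throw new Error('Connection string not configured in Application Settings');
  }
  return conn;
}
```

Nothing sensitive ends up in a web-servable file, and the same code works locally by exporting the variable in your shell or a gitignored .env file.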

AWS: host assets and client-side static code in a central repository, accessible from Node for file uploads

I am using AWS to run my application, which is based on the MEAN stack. I am using a load balancer with three instances of Node application servers and three instances of MongoDB database servers in a cluster. My application has a feature to upload file content to the server, mainly images, audio, videos, etc. I want the following:
I want to create one central content repository which should be accessible from all of my three node application servers so that my node code should be able to upload files to central content repository.
I want one URL to access this central content repository, which can be used in the user interface to load and display assets
I also want to re-purpose this same central repository to host all of my client-side JavaScript, CSS, and images, and would like my index.html to reference client-side assets from the central repository URL
I was looking into options in AWS; however, I got confused and am not able to work out the best and easiest approach. The following server architecture is for reference.
Please advise.
I couldn't figure out how to use EC2 to solve this problem, so I changed my implementation approach: rather than hosting files on the file system, I am using MongoDB GridFS to store files. That made them available to each Node.js host.

Is there a Grunt.js plugin for downloading via FTP?

There are so many plugins available which will help push to a webserver, but are there any that will download?
A Bit of Background
I'd like to automate the process of publishing my CMS-based website. The only issue is that our marketing people regularly blog and make content changes, so I'd like to first download the content files which have changed to my local development environment (.md files, which are not accessible from the web) before I push everything up to the staging server.
Does anything like this exist? I've searched NPM quite thoroughly, as well as this question which unfortunately didn't yield any results.
I did see a pretty robust cURL-based plugin; however, it doesn't support FTP authentication, and since these files are not directly web-accessible, I'll need to use FTP.
A quick Google search gave me this: https://www.npmjs.com/package/grunt-ftp

How to upload, download, or delete a directory or multiple files on an FTP server using CFNetwork?

I have started learning about FTP programming, partly from the simpleFTPSample that uses CFNetwork. From this sample I understand how to upload and download a file from an FTP server, and also how to get a list of files and directories. The problem is that I want to upload, download, and delete a directory or multiple files on the FTP server, but I don't know how. Can somebody give an example? Can I do it using CFNetwork, or do I have to add another library?
Thank you
I think these will help you:
http://code.google.com/p/s7ftprequest/
or
http://www.ftponthego.com/
You could use cURL if you have little success with the Apple sample code; see here:
http://www.intelliproject.net/articles/showArticle/index/use_curl_iphone_sdk
