I'm trying to create a node programmatically and use the FileField module to upload a file from a remote URL.
The node is created properly, but I can't get the remote URL upload working.
Anyone have any experience with this?
After speaking with the maintainer of the module, it turns out the relevant code can simply be copied from the module itself: roughly 30 lines that deal with the remote URL. Once you have that, you can pass the resulting file into an array and let FileField handle the rest.
How are you doing?
I'm trying to download an Excel file from a website (specifically DataCamp) in order to use its data in an automated process, but before getting the file it is necessary to sign in to the site. I was thinking this might be possible with a JSON query on the HTTP action, but to be honest I don't know where to start (I'm new to Azure).
The process I need to emulate to extract the file would be as follows (I know this could be done with an API or RPA, but I don't have either available for now):
Could you give me some advice (how to get the desired result, or at least where to do research)? Is this even possible?
Best regards.
If you don't have other options (e.g. your source being available on SFTP), then using an HTTP action should work: pass the body to your next action (e.g. you might want to persist it to a blob if the content is binary).
If your content is "readable" (e.g. JSON or CSV) and you want to load it for processing, you need to ensure, for large files, that you read it in chunks so it is loaded completely before processing.
Detailed explanation at https://learn.microsoft.com/en-us/azure/logic-apps/logic-apps-handle-large-messages#download-content-in-chunks
The scenario is:
I am working on an Express.js app with MongoDB and EJS.
There is a URL/link on a page which I want to change, let's say using a form.
I am already using a user model to retrieve and update user data.
What I could do is save that link in a collection in MongoDB, i.e. create a model.
But I think it's not good to create a model for just one link and fetch it to render in EJS.
What should I do? Any suggestions or tricks?
*No need to read further if you already know what I should do.
I tried something which is not good practice and sometimes causes issues.
I have added a JSON file in the public directory to serve. I am reading this file to get the link on the client side. When I want to change the link, I submit the new link using a form and, on the server side, overwrite the content of that file (the JSON file in the public directory), so the next time the file is served with the changed content.
I tried overwriting the contents of that file (using "fs") both synchronously and asynchronously, but sometimes the web page gets stuck for a second; it is not crashing the app, though.
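Roughly, what I tried looks like this sketch (the route and file names are placeholders, not my real ones):

// Sketch of what I tried: overwrite a JSON file under /public and serve it back.
const express = require('express');
const fs = require('fs/promises');
const path = require('path');

const app = express();
app.use(express.urlencoded({ extended: false }));
app.use(express.static('public')); // the client fetches public/link.json directly

const LINK_FILE = path.join(__dirname, 'public', 'link.json');

// The form posts the new link here; the file in /public is overwritten asynchronously.
app.post('/admin/link', async (req, res) => {
  await fs.writeFile(LINK_FILE, JSON.stringify({ url: req.body.url }));
  res.redirect('/');
});

app.listen(3000);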
Maybe this was silly; please suggest anything you know that I should do.
*NOTE: Sorry if this question is inappropriate for Stack Overflow, but I am struggling to find any solution.
On my website I have some categories in the database which don't change often. I am using the Next.js framework. In order to decrease the load time and make the category control load faster, I am thinking of putting the categories in a JSON file under /static/data/categories.json and letting React fetch that JSON file instead of making a database call. I have read about multiple approaches to importing static JSON files, including json-loader. However, I am thinking of making an API call like the following from React once on the home page and storing the contents in Redux state so that I can use them wherever I need them. My intention is that whenever the categories are modified, I update categories.json and the client fetches the updated JSON file. Deployment downtime is not an issue for me.
const categories = yield call(request, `${BASE_URL}/static/data/categories.json`, options);
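On the server side, a minimal sketch of how that file could be exposed with cache headers, assuming a custom Express server for Next.js (the paths and max-age below are only examples):

// Custom Next.js/Express server serving the static JSON straight from disk.
// Repeated hits are answered from the file (and any nginx/browser cache), never the database.
const express = require('express');
const next = require('next');

const app = next({ dev: process.env.NODE_ENV !== 'production' });

app.prepare().then(() => {
  const server = express();
  const handle = app.getRequestHandler();

  // express.static handles ETags and Cache-Control for /static/data/categories.json
  server.use('/static/data', express.static('static/data', { maxAge: '1h' }));

  server.all('*', (req, res) => handle(req, res));
  server.listen(3000);
});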
My questions are:
Is this the right approach to loading a JSON file from the server, keeping in mind that I want to update the file every few months without needing to redeploy the whole website?
Can a JSON file become a bottleneck if thousands of concurrent users try to access it? I am using Express with nginx. Would using express.static help at all?
Thanks in advance.
To render in three.js, we need some images (jpg/png) and JSON files (UV data). All these files are stored in their respective folders, and the files are visible for clients to look at.
I use Django/Python to start a local server; the Python code is compiled to .pyc and the JS code is obfuscated, but the folder structure is accessible to casual users. In three.js, we use the tex_loader and json_loader functions, to which the file paths are given as inputs. I was looking for ways of securing this behind-the-scenes work.
I happened to read about custom binary formats, but that felt like a lot of work.
Or could we give access to the files only to certain processes started through Django / the web browser?
Are there any readily available, easy-to-deploy solutions to protect our IP?
An option would be to only serve the files to authenticated users. This could be achieved by having an endpoint on your backend like:
api/assets/data.json
The controller in the backend would receive the file name (data.json); the code could check whether the user requesting the endpoint is authenticated and, if so, read the file from the file system (my-private-folder/assets/data.json) and return it to the browser as a file with the correct MIME type.
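A minimal sketch of that endpoint in Express (your backend is Django, but the same pattern translates directly to a Django view; the auth check and folder name below are placeholders):

// Serve three.js assets (textures, UV JSON) only to authenticated users.
const express = require('express');
const path = require('path');

const app = express();
const PRIVATE_DIR = path.join(__dirname, 'my-private-folder', 'assets');

app.get('/api/assets/:name', (req, res) => {
  if (!req.session || !req.session.userId) { // placeholder auth check
    return res.sendStatus(401);
  }
  // basename() strips any '../' so clients can't walk out of the assets folder
  const file = path.join(PRIVATE_DIR, path.basename(req.params.name));
  res.sendFile(file); // Content-Type is set from the file extension
});

app.listen(8000);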
I want to upload images from the client to the server. The client must be able to see a list of all the images he or she has, and see the image itself (a thumbnail or something like that).
I have seen people using two methods (generally speaking):
1- Upload image and save the binaries to MongoDB
2- Upload an image and move it to a folder, save the path somewhere (the classic method, and the one I implemented so far)
What are the pros and cons of each method, and how can I retrieve the data and show it in a template in each case (getting the path and writing it to the src attribute of an img tag vs. sending the binaries)?
Problems found so far: when I request foo.jpg (localhost:3000/uploads/foo.jpg), which I uploaded and the server moved to a known folder, my router (Iron Router) does not know how to handle the request.
1- Upload image and save the binaries to MongoDB
Either you limit the file size to 16 MB and use only basic MongoDB documents, or you use GridFS and can store anything (no size limit). There are several pros and cons to this method, but IMHO it is much better than storing on the file system (a short sketch follows the list below):
Files don't touch your file system; they are piped into your database
You get all the benefits of Mongo, and you can scale up without worries
Files are chunked, and you can send only a specific byte range (useful for streaming or resuming downloads)
Files are accessed like any other Mongo document, so you can use allow/deny rules, pub/sub, etc.
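Here is a minimal sketch of the GridFS side using the plain Node MongoDB driver (the Meteor package mentioned below wraps this for you; the database and file names here are placeholders):

// Store a file in GridFS and stream back only a byte range.
const { MongoClient, GridFSBucket } = require('mongodb');
const fs = require('fs');

async function main() {
  const client = await MongoClient.connect('mongodb://localhost:27017');
  const bucket = new GridFSBucket(client.db('myapp'), { bucketName: 'images' });

  // Upload: the file is split into chunks inside the database,
  // so it never touches your public folder.
  await new Promise((resolve, reject) => {
    fs.createReadStream('foo.jpg')
      .pipe(bucket.openUploadStream('foo.jpg', { contentType: 'image/jpeg' }))
      .on('finish', resolve)
      .on('error', reject);
  });

  // Download only a byte range (here the first 64 KB), useful for
  // resuming an interrupted download or for streaming.
  bucket
    .openDownloadStreamByName('foo.jpg', { start: 0, end: 64 * 1024 })
    .pipe(fs.createWriteStream('foo-partial.jpg'))
    .on('finish', () => client.close());
}

main().catch(console.error);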
2- Upload an image and move it to a folder, save the path somewhere (the classic method, and the one I implemented so far)
In this case, either you store everything in your public folder and make it publicly accessible using the file names + paths, or you use a dedicated asset delivery system, such as an nginx server. Either way, you will be using something less secure and maintainable than the first option.
That being said, have a look at the file-collection package. It is much simpler than CollectionFS and will offer you everything you are looking for out of the box (including a file API, GridFS storage, resumable uploads, and many other things).
Problems found so far: when I request foo.jpg (localhost:3000/uploads/foo.jpg), which I uploaded and the server moved to a known folder, my router (Iron Router) does not know how to handle the request.
Do you know whether this path leads to the public/uploads/foo.jpg directory under your root folder? If you put the file there, you should be able to request it.