I would like to test a very simple case with the CloudConvert API using a cURL request.
I want to import the file essaiFichier.txt with a cURL request. I get a JSON response with a status of "waiting", and I have no idea whether the request actually worked. If someone has faced the same problem, it would be great to have some advice.
Below is my code.
$authorization = "Authorization: Bearer eyJ0eXAiOiJKV1QiLCJhbGciOi";
$url = "https://api.cloudconvert.com/v2/jobs";
$post = '{
    "tasks": {
        "import-1": {
            "operation": "import/url",
            "url": "http://localhost/biere/essaiFichier.txt",
            "filename": "essaiFichier.txt"
        }
    }
}';
$ch=curl_init($url);
curl_setopt($ch, CURLOPT_POSTFIELDS, $post);
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Content-Type: application/json' , $authorization));
curl_setopt($ch, CURLOPT_CUSTOMREQUEST, "POST");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
$info = curl_getinfo($ch);
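To actually see what comes back, the response body and HTTP status can be printed after the call; a small sketch continuing the code above:
curl_close($ch);
// $info['http_code'] holds the HTTP status; $response holds the JSON job description
echo 'HTTP status: ' . $info['http_code'] . "\n";
echo $response;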
I'm also new to cloudconvert, though it looks to me like you aren't following the 'rules' for using the service - at least, not to get anything useful out of it....
You need to do THREE things (at least):
Import (you have that)
Task (like 'convert'...)
Export (get your modified file back)
I find their 'Job Builder' to be a simple way to get the code - at least for starting out. See https://cloudconvert.com/api/v2/jobs/builder
I entered your 'Import' into the Job Builder (note: I think you don't need the 'filename' in there, or else you should break apart the 'url' and put only the file name in the 'filename' field - again, I'm new at it, but that is how I read it), and it still shows me this in the blue box at the top (the 'hints'):
How to build a job
Add a processing task, for example a convert task.
Add an export task. You can use the export/url task to generate a URL for the output file.
Which tells me you just need to do the other parts so you have a complete request.
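As a concrete illustration, the $post body from the question could be extended along these lines (a sketch based on what the Job Builder produces; the task names and the txt-to-pdf conversion are placeholders, not requirements):
$post = '{
    "tasks": {
        "import-1": {
            "operation": "import/url",
            "url": "http://localhost/biere/essaiFichier.txt"
        },
        "convert-1": {
            "operation": "convert",
            "input": "import-1",
            "output_format": "pdf"
        },
        "export-1": {
            "operation": "export/url",
            "input": "convert-1"
        }
    }
}';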
As for the 'waiting' response, yes, that is what you will get on the initial request. Again, see the docs on the Job Builder page - you can either do another request for the 'wait' response (which should get you the link for the 'export' part) or you can do a webhook that will be your trigger to download the file (which would make things more automatic).
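To check on the job afterwards without a webhook, a follow-up GET along these lines should do it (a sketch; I'm assuming the standard get-a-job endpoint from the v2 docs and that the response is wrapped in a "data" object - the job id comes from the JSON returned by the initial POST):
// Poll the job created by the initial POST; $authorization is the same Bearer header as before.
$jobId = 'REPLACE_WITH_JOB_ID';
$ch = curl_init('https://api.cloudconvert.com/v2/jobs/' . $jobId);
curl_setopt($ch, CURLOPT_HTTPHEADER, array($authorization));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$job = json_decode(curl_exec($ch), true);
curl_close($ch);
echo $job['data']['status'];   // "waiting", "processing", "finished" or "error"
// Once finished, the export/url task's result should contain the download URL(s).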
Following your code and the Job Builder, I just finished my first conversion - worked great and now I can move on with my project (yeah!)
Thank you very much for your complete and precise answer. In the end I found an alternative with https://github.com/dompdf/dompdf, which in my opinion is much easier to use and doesn't require any registration. I recommend it.
Thanks
As the title says, I'm trying to intercept script requests from the user's page, make a GET request to the script url from the background, add a bit of functionality and send it back to the user.
A few caveats:
I don't want to do this with every script request
I still have to guarantee that the script tags are executed in the original order
So far I've come up with two solutions, neither of which works properly. The basic code:
chrome.webRequest.onBeforeRequest.addListener(
  function handleRequest(request) {
    // First I make the GET request for the script myself SYNCHRONOUSLY,
    // because the webRequest API cannot handle async.
    const syncRequest = new XMLHttpRequest();
    syncRequest.open('GET', request.url, false);
    syncRequest.send(null);
    const code = syncRequest.responseText;
  },
  { urls: ['<all_urls>'] },
  ['blocking'],
);
Now once we have the code, there are two approaches that I've tried to insert it back into the page.
I send the code through a port to a content script, which adds it to the page inside a <script></script> tag. Along with the code, I also send an index to make sure the scripts are inserted back into the page in the correct order. This works fine for my dummy website, but it breaks on bigger apps, like YouTube, where it fails to load the images of most videos. Any tips on why this happens?
I return a redirect to a data url:
if (condition) return { cancel: false }
else return { redirectUrl: 'data:application/javascript; charset=utf-8,'.concat(alteredCode) };
This second option breaks the code formatting, sometimes removing whitespace, sometimes cutting the code short. I'm not sure of the reason behind this behavior; it might have something to do with the data URL spec.
I'm stuck. I've researched pretty much every related answer on this website and couldn't find anything. Any help or information is greatly appreciated!
Thanks for your time!!!
Trying to post a file to a subfolder of the Shared Documents folder. I thought I had the correct syntax down, but I keep getting StatusCode 400 Bad Request.
https://graph.microsoft.com/v1.0/sites/xxxxxx.sharepoint.com,495435b4-60c3-49b7-8f6e-1d262a120ae5,0fad9f67-35a8-4c0b-892e-113084058c0a/drives/b!tDVUScNgt0mPbh0mKhIK5WefrQ-oNQtMiS4RMIQFjAqJk9Tt237bQYC9yEkyNOr6/items/01JDP7KXJ7ZSCYHUJC7BFJW2X6BTR4Z4JH:/filename.xlsx:/content
where "filename" is the actual filename.
I know a GET to the following lists the subfolder:
https://graph.microsoft.com/v1.0/sites/xxxxxx.sharepoint.com,495435b4-60c3-49b7-8f6e-1d262a120ae5,0fad9f67-35a8-4c0b-892e-113084058c0a/drives/b!tDVUScNgt0mPbh0mKhIK5WefrQ-oNQtMiS4RMIQFjAqJk9Tt237bQYC9yEkyNOr6/items/01JDP7KXJ7ZSCYHUJC7BFJW2X6BTR4Z4JH
Request is going out as:
{Method: PUT, RequestUri: 'https://graph.microsoft.com/v1.0/sites/xxxxxx.sharepoint.com,495435b4-60c3-49b7-8f6e-1d262a120ae5,0fad9f67-35a8-4c0b-892e-113084058c0a/drives/b!tDVUScNgt0mPbh0mKhIK5WefrQ-oNQtMiS4RMIQFjAqJk9Tt237bQYC9yEkyNOr6/items/01JDP7KXJ7ZSCYHUJC7BFJW2X6BTR4Z4JH:/', Version: 2.0, Content: <null>, Headers:
{
Authorization: Bearer eyJ0eXAiOiJKV1QiLCJub...BXS_cSg1CcZHj5Q
}}
Seems like it is dropping part of the request to me.
First, https://graph.microsoft.com/v1.0/sites/xxx.sharepoint.com,495435b4-60c3-49b7-8f6e-1d262a120ae5,0fad9f67-35a8-4c0b-892e-113084058c0a/drives and https://graph.microsoft.com/v1.0/sites/xxx.sharepoint.com/drives will return the same results; the second form is preferred.
I have never successfully run the following API:
/drives/{drive-id}/items/{parent-id}:/{filename}:/content
But based on my test, the following API works well:
/v1.0/me/drive/root:/Test/Test1.txt:/content
or
/v1.0/me/drives/driveid/root:/Test/Test1.txt:/content
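For illustration, here is a minimal PHP cURL sketch of that simple-upload call (PUT the raw file bytes to the :/content endpoint); the $accessToken value and the local file name are placeholders, not taken from the question:
// Upload a small file with the Graph simple-upload endpoint.
$accessToken = 'REPLACE_WITH_TOKEN';
$url = 'https://graph.microsoft.com/v1.0/me/drive/root:/Test/Test1.txt:/content';
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_CUSTOMREQUEST, 'PUT');
curl_setopt($ch, CURLOPT_POSTFIELDS, file_get_contents('Test1.txt'));
curl_setopt($ch, CURLOPT_HTTPHEADER, array(
    'Authorization: Bearer ' . $accessToken,
    'Content-Type: application/octet-stream',
));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);                          // JSON driveItem on success
$status   = curl_getinfo($ch, CURLINFO_HTTP_CODE);   // expect 200 or 201
curl_close($ch);
echo $status . "\n" . $response;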
I'm trying to make a request with Content-Type x-www-form-urlencoded that works perfectly in Postman but does not work in an Azure Logic App: I receive a Bad Request response for missing parameters, as if I hadn't sent anything.
I'm using the Http action.
The body value is param1=value1&param2=value2, but I tried other formats.
HTTP Method: POST
URI : https://xxx/oauth2/token
In the Headers section, add the below Content-Type:
Content-Type: application/x-www-form-urlencoded
And in the Body, add:
grant_type=xxx&client_id=xxx&resource=xxx&client_secret=xxx
Try out the below solution. It's working for me.
concat(
'grant_type=',encodeUriComponent('authorization_code'),
'&client_id=',encodeUriComponent('xxx'),
'&client_secret=',encodeUriComponent('xxx'),
'&redirect_uri=',encodeUriComponent('xxx'),
'&scope=',encodeUriComponent('xxx'),
'&code=',encodeUriComponent(triggerOutputs()['relativePathParameters']['code']))
Here, code is a dynamic parameter coming from the previous flow's query parameters.
NOTE: **Do not forget to set the Content-Type header to application/x-www-form-urlencoded**
Answering this one, as I needed to make a call like this myself, today.
As Assaf mentions above, the request indeed has to be urlEncoded and a lot of times you want to compose the actual message payload.
Also, make sure to add the Content-Type header in the HTTP action with value application/x-www-form-urlencoded
Therefore, you can use the following code to combine variables that get URL-encoded:
concat('token=', encodeUriComponent(body('ApplicationToken')?['value']), '&user=', encodeUriComponent(body('UserToken')?['value']), '&title=Stock+Order+Status+Changed&message=to+do')
When using the concat function (in composing), the curly braces are not needed.
First of all, the body needs to be:
{ param1=value1&param2=value2 }
(i.e. surrounded with {})
That said, value1 and value2 should be URL-encoded. If they are a simple string (e.g. a_b) then this would be fine as is, but if it is, for example, https://a.b, it should be converted to https%3A%2F%2Fa.b.
The easiest way I found to do this is to use https://www.urlencoder.org/ to convert it. Convert each param separately and put the converted value in place of the original one.
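Just to make the transformation concrete, here is the same encoding done in PHP (outside of Logic Apps, purely as an illustration); rawurlencode produces the same result as the online tool or Logic Apps' encodeUriComponent for values like these:
// Percent-encode each parameter value before concatenating the body.
echo rawurlencode('https://a.b') . "\n";                   // https%3A%2F%2Fa.b
echo 'param1=' . rawurlencode('https://a.b')
   . '&param2=' . rawurlencode('a_b');                     // param1=https%3A%2F%2Fa.b&param2=a_b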
Here is a screenshot of the solution that works for me; I hope it will be helpful. This is an example with the Microsoft Graph API, but it will work in any other scenario.
I'm very new to Linux and working with node.js. It's just my 2nd day. I use node-curl for cURL requests. In the link below I found an example with a GET request. Can anybody provide me with a POST request example using node-curl?
https://github.com/jiangmiao/node-curl/blob/master/examples/low-level.js
You need to use setopt in order to specify POST options for a cURL request. The options you should start looking at first are CURLOPT_POST and CURLOPT_POSTFIELDS. From the libcurl documentation linked from node-curl:
CURLOPT_POST
A parameter set to 1 tells the library to do a regular HTTP post. This will also make the library use a "Content-Type: application/x-www-form-urlencoded" header. (This is by far the most commonly used POST method).
Use one of CURLOPT_POSTFIELDS or CURLOPT_COPYPOSTFIELDS options to specify what data to post and CURLOPT_POSTFIELDSIZE or CURLOPT_POSTFIELDSIZE_LARGE to set the data size.
Optionally, you can provide data to POST using the CURLOPT_READFUNCTION and CURLOPT_READDATA options but then you must make sure to not set CURLOPT_POSTFIELDS to anything but NULL. When providing data with a callback, you must transmit it using chunked transfer-encoding or you must set the size of the data with the CURLOPT_POSTFIELDSIZE or CURLOPT_POSTFIELDSIZE_LARGE option. To enable chunked encoding, you simply pass in the appropriate Transfer-Encoding header, see the post-callback.c example.
CURLOPT_POSTFIELDS
... [this] should be the full data to post in a HTTP POST operation. You must make sure that the data is formatted the way you want the server to receive it. libcurl will not convert or encode it for you. Most web servers will assume this data to be url-encoded.
This POST is a normal application/x-www-form-urlencoded kind (and libcurl will set that Content-Type by default when this option is used), which is the most commonly used one by HTML forms. See also the CURLOPT_POST. Using CURLOPT_POSTFIELDS implies CURLOPT_POST.
If you want to do a zero-byte POST, you need to set CURLOPT_POSTFIELDSIZE explicitly to zero, as simply setting CURLOPT_POSTFIELDS to NULL or "" just effectively disables the sending of the specified string. libcurl will instead assume that you'll send the POST data using the read callback!
With that information, you should be able add the following options to the low-level example to have it make a POST request:
var fieldsStr = '{}';
curl.setopt('CURLOPT_POST', 1); // true?
curl.setopt('CURLOPT_POSTFIELDS', fieldsStr);
You will need to tweak the contents of fieldsStr to match the format the server is expecting. Per the documentation you may also need to url-encode the data - which should be as simple as using encodeURIComponent according to this post.
I am a high school student reasonably concerned about my grades, and when I check them through a system known as Zangle! Student Connection, it is mildly painful in how long it takes.
I was wondering if it would be possible to construct a script, in whichever language is deemed appropriate, to login for me, based on a pre-entered login username and password, and then present my grade percentages in a nice layout, instead of the kind of awkward and messy layout it is now presented in.
I'm guessing this is either way too hard, way out of my reach at the moment, or even impossible, but I just thought it was a decently cool idea and was looking for any suggestions.
Also, I have no idea as to which language is best for this, so I would definitely need help on this!
You could try Greasemonkey, it lets you create "user scripts" in Firefox. That way you could use Javascript to reorganise the interface.
Well, here is my take on it in PHP, utilizing the cURL library:
PHP is by no means the language "deemed appropriate" for the task. It is, however, fairly easy to set up what you're looking for in it.
<?php
error_reporting(-1);
$ch = curl_init();
/*Some sites block your access if you do not have cookies enabled. In order to get the cookies you will need to submit the form manually and using a packet sniffer (or Firebug) get the cookies that are being sent.*/
//$cookies ="CFID=25318504; CFTOKEN=38400766; PERSON_ID=3461047";
/*Again, if you have Firebug then getting the following POST data, once you submit the form manually, is fairly straightforward. This is what cURL will utilize in the POST fields.*/
//The action=submit may also vary; this is also easily accessible via Firebug (right next to the Parameters tab).
$post_data = "username=test&password=test&action=submit";
curl_setopt($ch, CURLOPT_URL, "http://www.sitename.com");
//Automatically set the Referer header when following a Location: redirect
curl_setopt($ch, CURLOPT_AUTOREFERER, 1);
//Include the response headers in the output
curl_setopt($ch, CURLOPT_HEADER, 1);
//Follow Location: redirects
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
//Send the above cookies, which were gathered manually =(
//Utilize this only if cookies are a necessity.
//curl_setopt($ch, CURLOPT_COOKIE, $cookies);
//Doing a POST request
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, $post_data);
//Return the response body as a string so it can be parsed later
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$output = curl_exec($ch);
if ($output === false) {
    echo "cURL Error: " . curl_error($ch);
}
curl_close($ch);
//You can sort this data using an HTML parser
echo $output;
Once you have successfully connected to the site, you can utilize one of many PHP HTML parsers to traverse the data, such as DOMDocument with XPath, or SimpleXML.
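For instance, here is a minimal sketch of pulling values out of the returned HTML with DOMDocument and XPath; the 'grade' class name is purely made up for illustration - inspect the real markup (e.g. with Firebug) to find the right selector:
// Parse the HTML held in $output from the cURL call above.
$doc = new DOMDocument();
libxml_use_internal_errors(true);   // real-world HTML is rarely valid XML
$doc->loadHTML($output);
libxml_clear_errors();
$xpath = new DOMXPath($doc);
foreach ($xpath->query("//td[contains(@class, 'grade')]") as $cell) {
    echo trim($cell->textContent) . "\n";
}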