MarkLogic installation on Red Hat 7 - Linux

I have a Red Hat instance on AWS and I am trying to install MarkLogic. I got the URL after signing up for a MarkLogic developer account, but using the following command, I get a 403 Forbidden response:
curl https://developer.marklogic.com/download/binaries/7.0/MarkLogic-7.0-6.x86_64.rpm?

The curl statement that you get after signing up and clicking the download link for a MarkLogic installer should include an access token. Something like:
https://developer.marklogic.com/download/binaries/8.0/MarkLogic-8.0-4.2-x86_64.dmg?t=xxxxxxxxxxx&email=mypersonal%40email.com
You may have overlooked the last bit; it looks like the UI is wrapping the long URL after the ?. I suggest using the 'Copy to clipboard' button to make sure you get the full URL.
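With the full URL in hand, quote it so the shell doesn't mangle the query string; something along these lines (the token and email here are placeholders for whatever your download page shows):

curl -o MarkLogic-7.0-6.x86_64.rpm "https://developer.marklogic.com/download/binaries/7.0/MarkLogic-7.0-6.x86_64.rpm?t=XXXXXXXX&email=you%40example.com"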
HTH!

Looks like you need to visit the site in a web browser first to register (Name, Email), and then you will get a link you can use with curl.

http://developer.marklogic.com/products/download-via-curl
You need to register an account first. Have you done that step yet?

The URL will have special characters in it (the email id will at least have '@', and the query string contains '&', which the shell treats specially). So wrap the URL in quotes.
For example:
wget "http://developer.marklogic.com/download/binaries/9.0/MarkLogic-9.0-1.1.x86_64.rpm?t=Xtxyxyxyxx.&email=xyz.abc%40abc.com"

A third party application may be attempting to make unauthorized access to your account - Ameritrade

I was trying to do some simple authorization for Ameritrade's developer platform.
According to the platform, the endpoint I need to access is:
https://auth.tdameritrade.com/auth?response_type=code&redirect_uri={uri}&client_id={client_id}%40AMER.OAUTHAP
https://developer.tdameritrade.com/content/simple-auth-local-apps
When looking at the client_id for the dev application, I noticed that they may actually be referencing the application's Consumer Key instead; the reason I think it is the Consumer Key is listed at https://developer.tdameritrade.com/content/getting-started. So I did just that, but when attempting to query the information, it returns: "A third-party application may be attempting to make unauthorized access to your account."
So I ended up doing something like:
from urllib.parse import quote_plus
url = "https://auth.tdameritrade.com/auth?response_type=code&redirect_uri={uri}&client_id={client_id}%40AMER.OAUTHAP".format(
    uri=quote_plus("http://localhost"),  # urlencode() expects a mapping; quote_plus() encodes a single string
    client_id="JHBDFGJH45OOUDFHGJKSDBNG"  # Sample
)
I don't think this is because I am currently in a different country; I think something else is wrong here.
It doesn't follow through, but instead returns a 400 error with that message. I'm not sure what's wrong though.
This happens when you copied the callback URI incorrectly. Imagine if this were a malicious client application, and TD detected that it was trying to send the user to a different URL than the one the app is configured with. If TD sent the callback request to that URL anyway, the application behind it would receive the token and gain full control over your account.
Have you double- and triple-checked that you're copying the callback URL correctly: protocol, ports, trailing slashes and everything? Also, consider using an API library instead of writing your own. You can find documentation about this specific error here.
I had this issue and I solved it by simply using http://127.0.0.1 as the callback URI of the app.
I then used the URL below and it worked as expected.
https://auth.tdameritrade.com/auth?response_type=code&redirect_uri=http%3A%2F%2F127.0.0.1&client_id={MyConsumerKey}%40AMER.OAUTHAP
Just in case anyone is still having this problem, make sure the callback URI is spelled EXACTLY the same as you specified when creating the app. I was having this problem because I set the callback on the TD developer website to "https://localhost/" and used "https://localhost" in the URL instead (missing the slash at the end). As soon as I added the slash at the end, it worked.
I found out that the issue is caused by the way the callback URL is set. It has to be exactly the same as the callback URL you typed into the app's details on the TD developer API page. I tried several permutations, and indeed, for the authorization to work both have to be the same: https vs. http, ending with '/' or not, it all matters. There is also no need to URL-encode it.
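Putting these answers together, a minimal shell sketch for building the auth URL (all values are placeholders; the key point is that redirect_uri must match the registered callback byte for byte):

# placeholders: substitute your own registered callback and Consumer Key
REDIRECT_URI="http%3A%2F%2F127.0.0.1"    # exactly as registered, trailing slash included only if the app has one
CLIENT_ID="JHBDFGJH45OOUDFHGJKSDBNG"     # the app's Consumer Key
echo "https://auth.tdameritrade.com/auth?response_type=code&redirect_uri=${REDIRECT_URI}&client_id=${CLIENT_ID}%40AMER.OAUTHAP"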

Is it possible to have a link to raw content of file in Azure DevOps

It's possible to generate a link to the raw content of a file in GitHub; is it possible to do the same with VSTS/DevOps?
Even after reading the existing answers, I still struggled with this a bit, so I wanted to leave a bit more of a thorough response.
As others have said, the pattern is (query split onto separate lines for ease of reading):
https://dev.azure.com/{{organization}}/{{project}}/_apis/sourceProviders/{{providerName}}/filecontents
?repository={{repository}}
&path={{path}}
&commitOrBranch={{commitOrBranch}}
&api-version=5.0-preview.1
But how do you find the values for these variables? If you go into your Azure DevOps, choose Repos > Files from the left navigation, and select a particular file, your current url should look something like this:
https://dev.azure.com/{{organization}}/{{project}}/_git/{{repository}}?path=%2Fpackage.json
You should use those values for organization, project, and repository. For path, you'll see a URL-encoded version of the unix file path: %2F is the encoding for /, so that path is actually just /package.json (a tool like Postman will do that encoding for you).
Commit or branch is pretty self-explanatory; you either know what you want for this value or you should use master. I have hard-coded the API version in the above URL because that's what the documentation currently points to.
For the last variable, you need providerName. In short, you should probably use TfsGit. I got this value from looking through the list of source providers and looking for one with a value of true for supportedCapabilities.queryFileContents.
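If you want to verify this yourself, the provider list can be queried directly; a sketch, assuming a personal access token in $PAT and that the preview listing endpoint is still shaped this way:

curl -u "anything:$PAT" "https://dev.azure.com/{{organization}}/{{project}}/_apis/sourceproviders?api-version=5.0-preview.1"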
However, if you just request the filecontents URL you'll get a "203 Non-Authoritative Information" response back because you still need to authenticate yourself. Referring again to the same documentation, it says to use Basic auth with any value for the username and a personal access token for the password. You can create a personal access token at https://dev.azure.com/{{organization}}/_usersSettings/tokens; ensure that it has the Token Administration - Read & Manage permission.
If you're unfamiliar with this sort of thing, again Postman is super helpful with getting these requests working before you get into the code.
So if you have a repository with a src directory at the root, and you're trying to get the file contents of src/package.json, your URL should look something like:
https://dev.azure.com/{{organization}}/{{project}}/_apis/sourceProviders/TfsGit/filecontents?repository={{repository}}&commitOrBranch=master&api-version={{api-version}}&path=src%2Fpackage.json
And don't forget the basic auth!
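For instance, a curl sketch of the finished request (the username can be any string when using a PAT; here the token is assumed to be in $PAT):

curl -u "anything:$PAT" "https://dev.azure.com/{{organization}}/{{project}}/_apis/sourceProviders/TfsGit/filecontents?repository={{repository}}&commitOrBranch=master&api-version=5.0-preview.1&path=src%2Fpackage.json"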
Sure, here's the REST call needed:
GET https://feeds.dev.azure.com/{organization}/_apis/packaging/Feeds/{feedId}/packages/{packageId}?includeAllVersions={includeAllVersions}&includeUrls={includeUrls}&isListed={isListed}&isRelease={isRelease}&includeDeleted={includeDeleted}&includeDescription={includeDescription}&api-version=5.0-preview.1
https://learn.microsoft.com/en-us/rest/api/azure/devops/artifacts/artifact%20%20details/get%20package?view=azure-devops-rest-5.0#package
I was able to get the raw contents of a file using this URL.
GET https://dev.azure.com/{organization}/{project}/_apis/sourceProviders/{providerName}/filecontents?serviceEndpointId={serviceEndpointId}&repository={repository}&commitOrBranch={commitOrBranch}&path={path}&api-version=5.0-preview.1
I got this from here.
https://learn.microsoft.com/en-us/rest/api/azure/devops/build/source%20providers/get%20file%20contents?view=azure-devops-rest-5.0
You can obtain the raw URL using Chrome.
Turn on Developer tools and view the Network tab.
Navigate to the required file in the DevOps portal (Content panel). Once the content view is visible, check the Network tab again and find the URL which starts with "Items?Path"; this is the JSON response which contains the required "url:" element.
Drag the filename from the attachments window and drop it into any other MS application to get the raw URL or linked filename.
Most answers address this well, but in the context of a public repo with anonymous access the API is different. Here is the one that works in such a scenario:
https://dev.azure.com/{{your_user_name}}/{{project_name}}/_apis/git/repositories/{{repo_name_encoded}}/items?scopePath={{path_to_your_file}}&api-version=6.0
This is the exact equivalent of the "raw" url provided by Github.
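A quick curl sketch of fetching it anonymously (placeholders as in the URL above):

curl -o myfile.txt "https://dev.azure.com/{{your_user_name}}/{{project_name}}/_apis/git/repositories/{{repo_name_encoded}}/items?scopePath={{path_to_your_file}}&api-version=6.0"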
Another way that may be helpful if you want to quickly get the raw URL for a specific file that you are browsing:
install the browser extension named "Undisposition"
from the dot menu (top right) choose "Download": the file will open in a new browser tab from which you can copy the URL
(edit: unfortunately this will only work for file types that the browser knows how to open, otherwise it will still offer to download it...)
I am fairly new to this and had an issue accessing a raw file in an Azure DevOps repo; it's straightforward in GitHub.
I wanted to download a file in CMD and Bash using curl.
First I browsed to the file contents in the browser and made a note of the bold sections:
https://dev.azure.com/**myOrg**/_git/**myProjectName**?path=%2F**MyFileName.ps1**
I then constructed the URL similar to what @Zach posted above.
https://dev.azure.com/**myOrg**/**myProjectName**/_apis/sourceProviders/TfsGit/filecontents?repository=**myProjectName**&commitOrBranch=**master**&api-version=5.0-preview.1&path=%2F**MyFileName.ps1**
Now when I paste the above URL in the browser it displays the content in RAW form similar to GitHub.
The difference was I had to set up a PAT (Personal Access Token) in my Azure DevOps account and then authenticate the URL in DOS/Bash; example below:
curl -u "<username>:<password>" "https://dev.azure.com/myOrg/myProjectName/_apis/sourceProviders/TfsGit/filecontents?repository=myProjectName&commitOrBranch=master&api-version=5.0-preview.1&path=%2FMyFileName.ps1" -# -L -o MyFileName.ps1

DocuSign Integration Into FileMaker

The basic work-flow I am trying to implement is to generate a PDF from FileMaker data, upload it to DocuSign for signing, and download the signed document back to FileMaker.
The DocuSign API requires custom headers, so I cannot use the built-in FileMaker 13 Insert From URL script step. Instead, I am using the BaseElements plug-in BE_HTTP_Set_Custom_Header and BE_GetURL functions. I currently have the DocuSign Login API call working.
Now I am trying to use the DocuSign API to upload a document and request a signature. This requires a multipart/form-data POST request. Unfortunately, neither the BaseElements nor the Troi URL plug-in supports multipart/form-data. In fact, I cannot find any plug-in that does. Is anyone aware of a FileMaker plug-in that supports multipart/form-data POSTs?
https://www.docusign.com/developer-center/quick-start/request-signatures
According to a comment on the Goya support forum last week, the next version of the BaseElements plug-in should support pass-through to the curl command line utility. If true, then as an alternative it seems possible to write a curl command to build the proper request, but my HTTP and curl knowledge is limited. So far, I have been unable to get the DocuSign signature request example working in Terminal. Has anyone been able to upload a document and request a signature with a single curl command?
http://support.goya.com.au/discussions/free-baseelements-plugin/1088-be-plugin-and-http-file-upload
Finally, I would be grateful for any other ideas or suggestions for attacking this problem.
Thank you!
You could use ScriptMaster and Groovy to write your own function to support the multipart/form-data type.
It took me a whole week, but I managed to get it working completely on FileMaker 14. I studied the various providers out there with my CEO and we ended up choosing DocuSign, because our clients would be able to sign/approve our estimates from literally any device (also giving us some more information if we need it, such as credit card details), and because the annual costs were very competitive.
On the Development side it was hard, but not impossible.
The steps to master it are as follows:
1 - Study REST and HTTP request fundamentals (two YouTube videos will do the trick).
2 - Get familiar with the curl command in order to make POST and GET requests from Terminal (on Mac). Once you get to this point, you can try to follow DocuSign's steps to POST and GET from the Mac Terminal (not through FileMaker yet). The first command that worked for me is as follows:
curl -i -H "Accept: application/json" -H 'X-DocuSign-Authentication:{"Username": "myemail@hotmail.com","Password": "mypassword", "IntegratorKey": "fae5e715-dec2-477f-906e-b6300bc9d09a"}' -X GET https://demo.docusign.net/restapi/v2/accounts/8aabdb38-41ab-4fab-bae9-63a071394a7a
If you replace the credentials above with your own DocuSign developer credentials/information, you should get an HTTP 200 OK header, which means that you are on the right track.
This is important because you need to understand what information should be in the headers so that DocuSign accepts your requests.
3 - Translate the same concept into FileMaker. This is the trickiest part. First you need to create two custom variables. One will be your DocuSign authentication (email, password and Integrator Key). Something like this:
[screenshot: a custom variable holding the X-DocuSign-Authentication credentials]
and the other one will be your endpoint, which is simply your DocuSign URL followed by your account number.
[screenshot: a custom variable holding the endpoint URL]
4 - Now you need to install the BaseElements plugin into your FileMaker database (you wouldn't need this if you were using FileMaker 16). This will allow you to send POST, GET, DELETE and UPDATE requests to DocuSign. Unfortunately BaseElements's documentation is not great, so I struggled a little before getting an actual result, but I will try to break it down for you here.
In order to make a GET request similar to the example above, we need to make a one-line script which looks like:
[screenshot: the one-line script that performs the GET request]
and then, inside the Refresh Object function:
[screenshot: the calculation inside the Refresh Object function]
This script was adapted from a BaseElements example used to do Vimeo HTTP GET requests; please see the link for more details on how to use and test it: FileMaker REST using BaseElements Plugin - ISO FileMaker Magazine.
I think this gives you a pretty good idea of how it works.
Unfortunately DocuSign still doesn't provide any documentation or support for FileMaker developers, and this is as far as the internet goes. So if you need to make POST requests, it gets a bit more complex: you will have to put your FileMaker fields in JSON format, change the BaseElements script from GET to POST accordingly, and insert one more command with your data:
// insert this in your FileMaker POST script after the end of the headers list
~data = BE_HTTP_POST ( ~endpoint ; "{
    \"documents\": [
        { all of your JSON code adapted to FileMaker }]}" );
You will also have to encode the PDF files that you want to send in Base64 format.
If you do this trick you will be good to go.
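For reference, here is a minimal command-line sketch of the envelope-creation request itself, using the demo endpoint from step 2 and the JSON documentBase64 approach rather than multipart (all credentials, IDs, names and file paths below are placeholders):

# encode the PDF and strip newlines so it can be embedded in the JSON
B64=$(base64 < estimate.pdf | tr -d '\n')
curl -X POST "https://demo.docusign.net/restapi/v2/accounts/{accountId}/envelopes" \
  -H "Content-Type: application/json" \
  -H 'X-DocuSign-Authentication: {"Username":"you@example.com","Password":"mypassword","IntegratorKey":"your-integrator-key"}' \
  -d '{"emailSubject":"Please sign this estimate","status":"sent",
       "documents":[{"documentId":"1","name":"estimate.pdf","documentBase64":"'"$B64"'"}],
       "recipients":{"signers":[{"email":"client@example.com","name":"Client Name","recipientId":"1"}]}}'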

Download webpage source from a page that requires authentication

I would like to download the source code of a webpage that requires authentication, using a shell script or something similar (Perl, Python, etc.) on a Linux machine.
I tried to use wget and curl, but when I pass the URL, the source code that gets downloaded is for a page that asks me for credentials. The same page is already open in Firefox or Chrome, but I don't know how I can re-use that session.
Basically what I need to do is refresh this page on a regular basis and grep for some information inside the source code. If I find what I'm looking for, I will trigger another script.
-- Edit --
Thanks @Alexufo. I managed to make it work this way:
1 - Downloaded a Firefox add-on that lets me save cookies to a TXT file. I used this add-on: https://addons.mozilla.org/en-US/firefox/addon/export-cookies/
2 - Logged in to the site I want, and saved the cookie.
3 - Using wget:
wget --load-cookies=cookie.txt 'http://my.url.com' -O output_file.txt
4 - Now the page source code is inside output_file.txt and I can parse it the way I want.
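To cover the refresh-and-trigger part of the question, a small sketch that could run from cron (the pattern and the script path are placeholders):

wget --load-cookies=cookie.txt 'http://my.url.com' -O output_file.txt
if grep -q 'PATTERN' output_file.txt; then
    /path/to/other_script.sh
fi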
curl should work anywhere.
1) Make a first request to authorize, and save the cookies.
2) Use the cookies when you make the second request, to get the page source code.
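A sketch of those two steps with curl (-c writes the cookie jar, -b reads it back; the URL and form fields are placeholders):

curl -c cookies.txt -d 'user=myuser&password=mypass' 'http://my.url.com/login'
curl -b cookies.txt 'http://my.url.com/protected' -o output_file.txt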
update:
wget should also work with POST authorization, like curl:
wget with authentication
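A comparable wget sketch (URLs and form fields are placeholders):

wget --save-cookies cookies.txt --keep-session-cookies --post-data 'user=myuser&password=mypass' 'http://my.url.com/login' -O /dev/null
wget --load-cookies cookies.txt 'http://my.url.com/protected' -O output_file.txt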
update2: http://www.httrack.com/
Mechanize (http://mechanize.rubyforge.org/) can do that. I am using it together with Ruby 2.0.0 for exactly that.

How to auto-upload and check in files to SharePoint using curl?

I am trying to upload a file from Linux to SharePoint with my SharePoint login credentials.
I use the cURL utility to achieve this, and the upload is successful.
The command used is: curl --ntlm --user username:password --upload-file myfile.txt -k https://sharepointserver.com/sites/mysite/myfile.txt
The -k option is used to overcome certificate errors for the non-secure SharePoint site.
However, the uploaded file shows up in the "checked out" view (green arrow) in SharePoint from my login.
As a result, the file is invisible to users from other logins.
My login has write-access privileges to SharePoint.
Any ideas on how to "check in" this file to SharePoint with cURL so that it can be viewed from anyone's login?
I don't have curl available to test right now but you might be able to fashion something out of the following information.
Check in and check out are handled by /_layouts/CheckIn.aspx
The page has the following querystring variables:
List - A GUID that identifies the current list.
FileName - The name of the file with extension.
Source - The full url to the allitems.aspx page in the library.
I was able to get the CheckIn.aspx page to load correctly just using the FileName and Source parameters and omitting the List parameter. This is good because you don't have to figure out a way to look up the List GUID.
The CheckIn.aspx page posts back to itself with the following form parameters that control check-in:
PostBack - boolean set to true.
CheckInAction - string set to ActionCheckin
KeepCheckout - set to 1 to keep checkout and 0 to keep checked in
CheckinDescription - string of text
Call this in curl like so
curl --data "PostBack=true&CheckinAction=ActionCheckin&KeepCheckout=0&CheckinDescription=SomeTextForCheckIn" "http://{Your Server And Site}/_layouts/checkin.aspx?Source={Full Url To Library}/Forms/AllItems.aspx&FileName={Doc And Ext}"
As I said, I don't have curl to test, but I got this to work using the Composer tab in Fiddler 2.
I'm trying this with curl now and there is an issue getting it to work. Fiddler was executing the request as a POST. If you try to do this as a GET request you will get a 500 error saying that the AllowUnsafeUpdates property of the SPWeb will not allow this request over GET. Sending the request as a POST should correct this.
Edit: I am currently going through the CheckIn.aspx source in the dotPeek decompiler and am seeing some additional options for the CheckinAction parameter that may be relevant, such as ActionCheckinPublish and ActionCheckinFromClientPublish. I will update this with any additional findings. The page class is Microsoft.SharePoint.ApplicationPages.Checkin for anyone else interested.
The above answer by Junx is correct. However, the Filename variable is not only the document filename and extension; it should also include the library name. I was able to get this to work using the following.
Example: http://domain/_layouts/Checkin.aspx?Filename=Shared Documents/filename.txt
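Putting the upload and the check-in together against the site from the question, a sketch (paths and credentials are placeholders; the CheckIn.aspx parameters are the ones described above):

curl --ntlm --user username:password -k --upload-file myfile.txt "https://sharepointserver.com/sites/mysite/Shared%20Documents/myfile.txt"
curl --ntlm --user username:password -k --data "PostBack=true&CheckinAction=ActionCheckin&KeepCheckout=0&CheckinDescription=Uploaded+via+curl" "https://sharepointserver.com/sites/mysite/_layouts/CheckIn.aspx?Source=https://sharepointserver.com/sites/mysite/Shared%20Documents/Forms/AllItems.aspx&FileName=Shared%20Documents/myfile.txt"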
My question about Performing multiple requests using cURL has a pretty comprehensive example using Bash and cURL, although it suffers from having to re-enter the password for each request.
