Things work if I access the BigCommerce WebDAV share via Cyberduck. However, I want to do this programmatically, so I wrote this code:
const { createClient } = require("webdav");

async function run() {
  const client = createClient("https://mystore.mybigcommerce.com/dav", {
    username: "myemail@email.com",
    password: "mypassword"
  });
  const contents = await client.getDirectoryContents("/");
  console.log(contents);
}

run().catch((err) => console.error(err));
This is my code to get directory contents. I copied it from https://github.com/perry-mitchell/webdav-client#usage. I copied the email and password from the BigCommerce site.
When I run the script, Node prints: (node:32672) UnhandledPromiseRejectionWarning: Error: Request failed with status code 401
If I enter the URL in the web browser and enter the correct username and password, it returns this:
<d:error xmlns:d="DAV:" xmlns:s="http://sabredav.org/ns">
  <s:exception>Sabre\DAV\Exception\NotImplemented</s:exception>
  <s:message>
    There was no plugin in the system that was willing to handle this GET method. Enable the Browser plugin to get a better result here.
  </s:message>
</d:error>
Hope you guys can find out what is going on, thanks.
You'll need a client that supports Digest auth, not just Basic. It looks like there has been some discussion about adding digest support to the WebDAV client you're using; this PR might be a good starting place:
https://github.com/perry-mitchell/webdav-client/pull/96
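For reference, newer releases of the webdav package expose an auth type option (digest support eventually landed); if your installed version has it, the setup might look roughly like this. This is only a sketch based on the library's current README, not something verified against BigCommerce:

const { createClient, AuthType } = require("webdav");

// Explicitly request Digest authentication (available in newer versions
// of the webdav package; check the version you have installed).
const client = createClient("https://mystore.mybigcommerce.com/dav", {
  authType: AuthType.Digest,
  username: "myemail@email.com",
  password: "mypassword"
});

client
  .getDirectoryContents("/")
  .then((contents) => console.log(contents))
  .catch((err) => console.error(err));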
I'm new to DocuSign and have been reading the docs, but I can't seem to get past these problems.
Per the docs, getAuthorizationUri() produces a URL that returns a code, which generateAccessToken() can then exchange for a token. After that, the token can be used to explore the API with the other methods.
I'm stuck in two places. First, I configured my authorization URL call like this (TypeScript):
import { ApiClient } from 'docusign-esign'

const apiClient = new ApiClient()
const url: string = apiClient.getAuthorizationUri(
  config.docusign.INTERGRATION_KEY,
  ['signature'],
  'http://localhost/',
  'code',
  'ready'
)
console.log(url)
I get the URL fine, but it always starts with account.docusign.com instead of account-d.docusign.com, which is required because my app is still in dev.
Secondly, I can't figure out how to turn that URL into a code programmatically. Yes, I can copy and paste the code from the browser URL (as in the instruction videos), but obviously my next step is to run something like:
async getToken(authCode) {
  const token = await apiClient.generateAccessToken(
    INTERGRATION_KEY,
    SECRET_KEY,
    authCode
  )
  console.log(token)
  return token
}
To change authentication from prod to demo, you need to make this call:
apiClient.setOAuthBasePath("account-d.docusign.com");
You may also want to change the base URL used for API calls later on, like this:
apiClient.setBasePath("https://demo.docusign.net/restapi");
I suggest using an OAuth npm package and not doing this yourself.
See how it's done in the quickstart (Pick Node.js and Auth Code Grant)
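To tie those calls together, a rough sketch of the demo-environment flow might look like the following. The config.docusign.* names are carried over from the question (the location of SECRET_KEY there is assumed), and the authorization code still has to arrive at your redirect URI before you can exchange it:

import { ApiClient } from 'docusign-esign'

const apiClient = new ApiClient()

// Point OAuth at the demo (developer sandbox) environment...
apiClient.setOAuthBasePath('account-d.docusign.com')
// ...and point later API calls at the demo REST endpoint.
apiClient.setBasePath('https://demo.docusign.net/restapi')

// Step 1: build the consent URL (it now starts with account-d.docusign.com).
const url: string = apiClient.getAuthorizationUri(
  config.docusign.INTERGRATION_KEY,
  ['signature'],
  'http://localhost/',
  'code',
  'ready'
)

// Step 2: the browser redirect to http://localhost/?code=... supplies the
// authorization code, which is then exchanged for an access token.
async function getToken(authCode: string) {
  const token = await apiClient.generateAccessToken(
    config.docusign.INTERGRATION_KEY,
    config.docusign.SECRET_KEY,
    authCode
  )
  return token
}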
When I go to the following URL in the browser with my own client_id:
https://account-d.docusign.com/oauth/auth?response_type=token&scope=signature&client_id=6d1a8594-xxxx-xxxx-xxxx-878e593de049&state=a39fh23hnf23&redirect_uri=http://localhost:8888/auth
I need to log in, and afterwards I get redirected to:
http://localhost:8888/auth#access_token=myAccessToken&expires_in=28800&token_type=bearer&state=a39fh23hnf23
Now I have a Node.js application from which I want to make API calls using my access token. Until now I have done the above manually every time to get an access token, which of course isn't optimal. My question, therefore, is: how do I get my access token without using the browser?
Right now I have the following:
sendRequestAccessToken(): Promise<any> {
  const scope = 'signature';
  const clientId = '*my client id*';
  const state = 'a39fh23hnf23';
  const url = `https://account-d.docusign.com/oauth/auth`
    + `?response_type=token`
    + `&scope=${scope}`
    + `&client_id=${clientId}`
    + `&state=${state}`
    + `&redirect_uri=http://localhost:8888/auth`;

  return this.httpService.get(url).toPromise();
}
The response from this function contains a lot of data, but no access token.
Please review the SDK examples we have here: https://github.com/docusign/code-examples-node
Those samples show how to obtain tokens and how to use them with the APIs.
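If the goal is to skip the browser round-trip entirely, that repo also covers the JWT grant, which only needs a one-time consent for the integration key. A rough sketch with docusign-esign, assuming you have generated an RSA keypair for the integration key and know the GUID of the user to impersonate:

const docusign = require('docusign-esign')
const fs = require('fs')

async function getTokenWithJwt() {
  const apiClient = new docusign.ApiClient()
  apiClient.setOAuthBasePath('account-d.docusign.com') // demo environment

  // Assumed inputs: integration key, impersonated user GUID, and the RSA
  // private key downloaded from the integration key's settings page.
  const privateKey = fs.readFileSync('private.key')
  const results = await apiClient.requestJWTUserToken(
    process.env.INTEGRATION_KEY,
    process.env.IMPERSONATED_USER_ID,
    ['signature', 'impersonation'],
    privateKey,
    3600 // requested token lifetime in seconds
  )
  return results.body.access_token
}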
I have a React app in Electron, and I'm trying to access the Spotify API using the spotify-web-api-node library. However, I'm not sure exactly how the OAuth flow is meant to work inside an Electron app. First, for the redirect URL, I followed this question and added a registerFileProtocol call to my file. Then I added an ipcMain.on handler for receiving the Spotify login call from a page, which I've confirmed works with console logs. However, when I get to actually calling the authorize URL, nothing happens.
This is part of my main.js:
app.whenReady().then(() => {
  ...
  protocol.registerFileProtocol(
    "oauthdesktop",
    (request, callback) => {
      console.log("oauthdesktop stuff: ", request, callback);
      // parse authorization code from request
    },
    (error) => {
      if (error) console.error("Failed to register protocol");
    }
  );
});
ipcMain.on("spotify-login", (e, arg) => {
const credentials = {
clientId: arg.spotifyClientId,
clientSecret: arg.spotifySecret,
redirectUri: "oauthdesktop://test",
};
const spotifyApi = new SpotifyWebApi(credentials);
console.log("spapi: ", spotifyApi);
const authorizeURL = spotifyApi.createAuthorizeURL(
["user-read-recently-played", "playlist-modify-private"],
"waffles"
);
console.log("spurl: ", authorizeURL);
axios.get(authorizeURL);
}
I'd expect the typical spotify login page popup to show up, but that doesn't happen. I'd also expect (possibly) the registerFileProtocol callback to log something, but it doesn't. What am I meant to be doing here? The authorization guide specifically mentions doing a GET request on the auth url, which is what I'm doing here...
In a desktop app, the recommended approach is to open the system browser, where the Spotify login page will render, and to wrap that step in a promise. The opener library can be used to invoke the browser.
When the user has finished logging in, the technique is to receive the response via a private URI scheme / file protocol, resolve the promise, get the authorization code, and then swap it for tokens. It is tricky to get right, though.
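A very rough sketch of that flow in the main process, keeping the question's oauthdesktop scheme but swapping the axios call for Electron's shell.openExternal (the exact way the redirect re-enters the app differs per platform, so treat the open-url handler below as an assumption rather than a drop-in):

const { app, shell, ipcMain } = require("electron");
const SpotifyWebApi = require("spotify-web-api-node");

const spotifyApi = new SpotifyWebApi({
  clientId: process.env.SPOTIFY_CLIENT_ID,
  clientSecret: process.env.SPOTIFY_CLIENT_SECRET,
  redirectUri: "oauthdesktop://test", // custom scheme from the question
});

// Register the app as the OS handler for oauthdesktop:// links.
app.setAsDefaultProtocolClient("oauthdesktop");

ipcMain.on("spotify-login", () => {
  const authorizeURL = spotifyApi.createAuthorizeURL(
    ["user-read-recently-played", "playlist-modify-private"],
    "waffles"
  );
  // Open the consent page in the system browser instead of fetching it
  // with axios; the user has to see and interact with it.
  shell.openExternal(authorizeURL);
});

// macOS delivers the redirect back via 'open-url'; on Windows/Linux it
// arrives in the argv of a 'second-instance' event instead.
app.on("open-url", async (event, url) => {
  event.preventDefault();
  const code = new URL(url).searchParams.get("code");
  // Exchange the authorization code for access and refresh tokens.
  const data = await spotifyApi.authorizationCodeGrant(code);
  spotifyApi.setAccessToken(data.body.access_token);
  spotifyApi.setRefreshToken(data.body.refresh_token);
});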
RESOURCES OF MINE
I have some blog posts on this, which you may be able to borrow some ideas from, and a couple of code samples you can run on your PC:
Initial Desktop Sample
Final Desktop Sample
The second of these is a React app and uses a Private URI scheme, so is fairly similar to yours. I use the AppAuth-JS library and not Spotify though.
I'm attempting to use the Google Maps Embed API to embed a map matching an address that a user enters. Following the developer guide, I acquired an API Key to add to my API requests. I'm able to successfully pull up a map when I request it via the "src" attribute of an iframe in my React client, like so:
<iframe
  ...
  src={`https://www.google.com/maps/embed/v1/${mode}?key=${apiKey}&q=${encodedAddressQuery}`}
></iframe>
But this leaves my API key exposed to the client.
In Google's API Key Best Practices, it's recommended that the API key be stored in an environment variable and that a proxy server be used to safeguard keys. However, when I try proxying the request (code below), the appropriate map appears for a split second, but then the iframe replaces it with an "Oops! Something went wrong" message, and the browser console displays this error:
Google Maps JavaScript API error: UnauthorizedURLForClientIdMapError
https://developers.google.com/maps/documentation/javascript/error-messages#unauthorized-url-for-client-id-map-error
Your site URL to be authorized: http://127.0.0.1:3001/api/maps?parameters=[my_encoded_parameters]
I'm just developing locally at the moment, so I've tried registering http://127.0.0.1:3001/* as an authorized URL as the error documentation suggests, and I've also tried removing all website restrictions on the API key just to see if I was authorizing the wrong URL, but both of those attempts still produced the same error.
I haven't found many resources on setting up a proxy other than this Google Codelabs project, but I haven't been able to pick anything out of it to help with this issue.
Here's the code I'm using in my React front end:
<iframe
  ...
  src={`http://127.0.0.1:3001/api/maps?parameters=${encodedAddressQuery}`}
></iframe>
And in my Express Node.js back end:
router.get('/api/maps', async (req: Request, res: Response) => {
  try {
    const parameters = req.query.parameters;
    const map = await MapDataAccessObject.getOne(parameters);
    return res.status(OK).send(map.data);
  } catch (err) {
    ...
  }
});
export class MapDataAccessObject {
  public static async getOne(parameters: string) {
    const apiUrl = this.getMapsEmbedUrl(parameters);
    const map = await axios.get(apiUrl);
    return map;
  }

  private static getMapsEmbedUrl(parameters: string) {
    const encodedParams = encodeURI(parameters);
    return `https://www.google.com/maps/embed/v1/place?key=${process.env.MAPS_API_KEY}&q=${encodedParams}`;
  }
}
I'm running my React front end and my Node.js back end in Docker containers.
My best guess is that I'm missing a request header that the Maps API is expecting, but I haven't found any documentation on that, or maybe I'm misunderstanding something more fundamental. Any guidance would be very much appreciated.
There is no way to protect a Google Maps key used on a website from being "spied on", because it is always visible to the client. The only way to protect your key from foreign usage is to restrict its access to your allowed domains/IP addresses, so that if someone else uses it, it will not work or consume your credits (Google will show an error message instead).
https://developers.google.com/maps/documentation/javascript/get-api-key
I'm using the Hapi framework (nodejs) with the Bell module, working with the Twitter provider.
It was pretty simple to get working code with the example given on the GitHub page: I access the /login route and get redirected to Twitter, where I authorize the app, and then I'm redirected back to /login?oauth_token=xxxxxxx&oauth_verifier=xxxxxxx, where I can access the user profile in request.auth.credentials.
The problem came when I tried to reject the app. Instead of clicking the "Sign In" button on Twitter, I clicked the "Cancel" button and then the "Return to site name" button. This last button redirects me to /login?denied=xxxxxx, and then I'm redirected (again) to Twitter to approve the app.
I tried to handle this scenario using another example on the same page, https://github.com/hapijs/bell#handling-errors, but I can't get it to work.
server.route({
  method: ['GET', 'POST'],
  path: '/login',
  config: {
    auth: {
      strategy: 'twitter',
      mode: 'try'
    },
    handler: function (request, reply) {
      if (!request.auth.isAuthenticated) {
        return reply('Authentication failed due to: ' + request.auth.error.message);
      }
      return reply.redirect('/home');
    }
  }
});
It seems that before checking request.auth, Bell intercepts the /login route and redirects to Twitter. I still don't understand the Bell module very well, but it might be that the Twitter strategy expects oauth_token and oauth_verifier in the query parameters, while the denied param is not interpreted by the strategy, and that's why the redirect happens.
Has somebody managed to handle this scenario?
I found a workaround. It's not an optimal solution, but it at least allows me to handle the rejection from Twitter.
I had to modify a file inside the bell module. In bell/lib/oauth.js, before the verification of oauth_token:
exports.v1 = function (settings) {

    var client = new internals.Client(settings);

    return function (request, reply) {

        var cookie = settings.cookie;
        var name = settings.name;

        // Sign-in Initialization

        // Verify whether the app (Twitter) was rejected
        if (name === 'twitter' && request.query.denied) {
            return reply(Boom.internal('App was rejected'));
        }

        if (!request.query.oauth_token) {

            // Obtain temporary OAuth credentials
            var oauth_callback = request.server.location(request.path, request);
With that change I can catch and show the auth error in the handler, without the automatic redirect.
At least this is the way I managed to make it work. The downside of this modification is that if the bell module is updated, the change is lost and the bug arises again, unless the updated module already ships with a fix. So you have to keep an eye on that.
Here's the link to the GitHub issue I created on the Bell repository regarding this bug.
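As a less invasive alternative (my own sketch, not something from the Bell docs), the denied parameter could be caught in a server extension that runs before authentication, so node_modules stays untouched:

server.ext('onPreAuth', function (request, reply) {

    // If Twitter sent the user back with ?denied=..., short-circuit before
    // the Bell strategy gets a chance to bounce the user back to Twitter.
    if (request.path === '/login' && request.query.denied) {
        return reply('Authentication was cancelled on Twitter').takeover();
    }

    return reply.continue();
});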