Solution
The server logs showed that I was getting a 301 redirect. This was an .htaccess issue, not Google Wallet. The redirect was stripping my POST data, leading to a blank request. Thanks, everybody, for your help.
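For anyone with the same symptom: the POST-301 / GET-200 pairs in the access log are the redirect swallowing the body. One hedged fix, assuming the 301 came from blanket mod_rewrite redirect rules (the rule details here are illustrative, not taken from the original setup), is to exempt the postback path before those rules run:

```apacheconf
# Illustrative .htaccess sketch: let the postback through before any redirects.
RewriteEngine On

# Stop rewriting for the postback endpoint so the POST body is never redirected away.
RewriteRule ^google-postback\.php$ - [L]

# ...existing redirect rules (e.g. forcing www or HTTPS) would follow here...
```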
I'm trying to process the postback from Google Wallet, but I'm not getting the proper data from Google Wallet to start.
Apache's error_log doesn't have any entry for my postback. The access_log shows the following for every attempt at a postback request:
[source-ip-1] - - [16/Sep/2013:15:18:05 +0000] "POST /google-postback.php HTTP/1.1" 301 293 "-" "Google-In-App-Payments; (+http://www.google.com/payments)"
[source-ip-2] - - [16/Sep/2013:15:18:06 +0000] "GET /google-postback.php HTTP/1.1" 200 20 "-" "Google-In-App-Payments; (+http://www.google.com/payments)"
[source-ip-1] - - [16/Sep/2013:15:18:36 +0000] "POST /google-postback.php HTTP/1.1" 301 293 "-" "Google-In-App-Payments; (+http://www.google.com/payments)"
[source-ip-2] - - [16/Sep/2013:15:18:36 +0000] "GET /google-postback.php HTTP/1.1" 200 20 "-" "Google-In-App-Payments; (+http://www.google.com/payments)"
A reverse DNS lookup shows both IPs are Google proxies.
The JavaScript performing the request is as follows:
<script src="https://sandbox.google.com/checkout/inapp/lib/buy.js"></script>
<script type="text/javascript">
  // Success handler
  var successHandler = function (purchaseAction) {
    if (window.console != undefined) {
      console.log("Purchase completed successfully.");
    }
  };

  // Failure handler
  var failureHandler = function (purchaseActionError) {
    if (window.console != undefined) {
      console.log("Purchase did not complete.");
    }
  };

  function purchase() {
    console.log("Attempting to buy stuff");
    google.payments.inapp.buy({
      'jwt': '<?=$jwt?>',
      'success': successHandler,
      'failure': failureHandler
    });
  }
</script>
I POSTed to my postback page myself with the JWT string I was giving to Google Wallet's buy() function, and it decodes to what I expected:
stdClass Object
(
    [iss] => [my seller id]
    [aud] => Google
    [typ] => google/payments/inapp/item/v1
    [exp] => 1379106326
    [iat] => 1379102726
    [request] => stdClass Object
        (
            [name] => Piece of Cake
            [decription] => Virtual whazzit
            [price] => 10.50
            [currencyCode] => USD
            [sellerData] => id:1224245,offer_code:3098576987,affiliate:aksdfbovu9j
        )
)
Originally, my postback looked like this:
<?php
require_once "google/googlefunctions.php";
$encoded_jwt = $_POST["jwt"];
$decodedJWT = JWT::decode($encoded_jwt, [seller_secret]);
$orderId = $decodedJWT->response->orderId;
header("HTTP/1.0 200 OK");
echo $orderId;
?>
But Google's popup kept returning an error about not being able to verify the purchase (I forget the exact message). After troubleshooting, I decided to email myself the POST payload to verify that I was actually hitting the postback and not being blocked by a seller id/secret credentials problem:
// google-postback.php
<?php
mail("my#email.com", "Purchase post", print_r($_POST, true));
?>
My response email is as follows:
Array
(
)
As far as I know, Google Wallet should be sending me a POST with a 'jwt' variable that decodes to what I sent, plus a response object, but I'm getting nothing.
Any ideas?
I'm having a lot of problems with Google OAuth. I have two apps deployed on Azure (the frontend in React and the backend in Play (Scala)).
Let's suppose we can find:
backend at: "backend.azure.net",
frontend at: "frontend.azure.net".
I use Silhouette in my backend app.
My SocialAuthController :
class SocialAuthController @Inject()(scc: DefaultSilhouetteControllerComponents, addToken: CSRFAddToken)(implicit ex: ExecutionContext) extends SilhouetteController(scc) {

  def authenticate(provider: String): Action[AnyContent] = addToken(Action.async { implicit request: Request[AnyContent] =>
    (socialProviderRegistry.get[SocialProvider](provider) match {
      case Some(p: SocialProvider with CommonSocialProfileBuilder) =>
        p.authenticate().flatMap {
          case Left(result) => Future.successful(result)
          case Right(authInfo) => for {
            profile <- p.retrieveProfile(authInfo)
            res <- userRepository.getByEmail(profile.email.getOrElse(""))
            user <- if (res == null) userRepository.create(profile.loginInfo.providerID, profile.loginInfo.providerKey, profile.email.getOrElse(""))
                    else userRepository.getByEmail(profile.email.getOrElse(""))
            _ <- authInfoRepository.save(profile.loginInfo, authInfo)
            authenticator <- authenticatorService.create(profile.loginInfo)
            value <- authenticatorService.init(authenticator)
            result <- authenticatorService.embed(value, Redirect(s"https://backend.azure.net?user-id=${user}"))
          } yield {
            val Token(name, value) = CSRF.getToken.get
            result.withCookies(Cookie(name, value, httpOnly = false))
          }
        }
      case _ => Future.failed(new ProviderException(s"Cannot authenticate with unexpected social provider $provider"))
    }).recover {
      case _: ProviderException =>
        Forbidden("Forbidden")
    }
  })
}
My routes:
# Authentication
POST /signUp controllers.SignUpController.signUp
POST /signIn controllers.SignInController.signIn
POST /signOut controllers.SignInController.signOut
GET /authenticate/:provider controllers.SocialAuthController.authenticate(provider: String)
silhouette.conf (google part):
# Google provider
google.authorizationURL="https://accounts.google.com/o/oauth2/auth"
google.accessTokenURL="https://accounts.google.com/o/oauth2/token"
google.redirectURL="https://backend.azure.net/authenticate/google"
google.clientID="my_id"
google.clientSecret="my_secret"
google.scope="profile email"
So, as you can see, I put the user into the redirect link in Redirect(s"https://backend.azure.net?user-id=${user}").
My JS function in the frontend app:
export function signInGoogle() {
const host = "https://backend.azure.net/"
const route = "authenticate/google";
const requestOptions = {
method: 'GET',
headers: { 'Content-Type': 'application/json' },
credentials: 'include',
mode: 'cors',
};
return fetch(host + route, requestOptions)
}
And the function with the redirect:
export default function GoogleSignIn() {
  const responseGoogle = () => {
    window.location.href = "https://backend.azure.net/authenticate/google";
  }

  return (
    <div>
      <button onClick={responseGoogle}
        className="add-to-cart-button register-bttn ingsoc ggl-bttn"> Login With Google </button>
    </div>
  )
}
If you click "Login With Google", the second function is called.
On my Google dev account I have some links configured. My URI identifiers:
https://backend.azure.net and http://front.azure.net
and the authorized redirect URI:
http://backend.azure.net/authenticate/google
After I click "Login With Google" I get a 403 Forbidden on the site and three exceptions in the console:
1 GET https://uj-ebiznes-back.azurewebsites.net/authenticate/google?state=very_long_link 403 (Forbidden)
Unchecked "long link on back " runtime.lastError: Could not establish connection. Receiving end does not exist.
Error handling response: "very long link to backend/authenticate/google"
TypeError: Cannot destructure property 'yt' of 'undefined' as it is undefined.
at chrome-extension://cmedhionkhpnakcndndgjdbohmhepckk/scripts/contentscript.js:535:6
Could someone explain what is wrong here? I have no idea and I tried everything.
I'm trying to call a Lambda function written in Node.js, hosted in the SAM Local environment. The function connects to a locally hosted MySQL database.
The code is as follows:
var mysql = require('mysql');

exports.handler = (event, context, callback) => {
  let id = (event.pathParameters || {}).division || false;

  var con = mysql.createConnection({
    host: "host.docker.internal",
    user: "root",
    password: "root",
    database: "squashprod"
  });

  switch (event.httpMethod) {
    case "GET":
      con.connect(function (err) {
        if (err) throw err;
        con.query("SELECT * FROM players where division_id = 1",
          function (err, result, fields) {
            if (err) throw err;
            //console.log(result);
            return callback(null, {body: "This message does not work"});
          }
        );
      });
      // return callback(null, {body: "This message works"});
      break;
    default:
      // Send HTTP 501: Not Implemented
      console.log("Error: unsupported HTTP method (" + event.httpMethod + ")");
      callback(null, { statusCode: 501 })
  }
}
However, the callback (with the message "This message does not work") never comes out. I know it's reaching the DB, because the console.log call prints the result. When this code runs I get an internal server error in the browser and the following messages from SAM Local:
2018-09-13 20:46:18 Function 'TableGetTest' timed out after 3 seconds
2018-09-13 20:46:20 Function returned an invalid response (must include one of: body, headers or statusCode in the response object). Response received: b''
2018-09-13 20:46:20 127.0.0.1 - - [13/Sep/2018 20:46:20] "GET /TableGetTest/2 HTTP/1.1" 502 -
2018-09-13 20:46:20 127.0.0.1 - - [13/Sep/2018 20:46:20] "GET /favicon.ico HTTP/1.1" 403 -
If I comment out the call to the DB and just go with the callback that says "This message works", then there is no timeout and that message appears in the browser.
I know the DB code is sound, as it works standalone. I feel it's got something to do with the callback, but I don't know Node well enough to understand it.
I'm pulling out what little hair I've got left. Any help would be greatly appreciated!
I had the same problem, and here is how I solved it.
The first problem is that the default timeout is not enough for a cold start.
Increase the execution time of your Lambda; the initial connection setup takes longer.
Further, you need to close the connection once you are done with the query. Otherwise the open connection keeps Node's event loop busy, which makes Lambda assume the function is still working.
I resolved it in two ways:
Close all connections as soon as everything is complete.
Use Sequelize rather than the plain mysql library. Sequelize helps maintain connection pools and share connections.
https://www.npmjs.com/package/sequelize
Hope it helps.
I'm trying to invoke the Facebook v3.0 translations service, but I get an exception like this:
Unhandled rejection StatusCodeError: 400 - "Unsupported post request. Object with ID 'XXXXXXXXXXXX' does not exist, cannot be loaded due to missing permissions, or does not support this operation. Please read the Graph API documentation."
with "type":"GraphMethodException","code":100,"error_subcode":33.
Here is my Node.js code:
const request = require('request-promise');

var uri = 'https://graph.facebook.com/v3.0/my_app_id/translations?'
  + 'access_token=my_app_token';

const options = {
  method: 'POST',
  uri: uri,
  qs: {
    native_strings: [{"text": "Test String", "description": "This is a test string."}]
  }
};

request(options).then(function (res1) {
  console.log(res1);
  res.write('request done : ' + res1); // `res` presumably comes from an enclosing HTTP handler (not shown)
  res.end();
});
The app token can be obtained via Facebook's Graph API Explorer tool.
What am I doing wrong?
I am trying to stream audio files from S3 to my React client by making a request to my Node/Express server. I managed to implement something that works, but I am not sure if I am actually streaming the file here, or simply downloading it. I suspect I might be downloading the file, because my requests to the server take a long time to come back:
Established database connection.
Server listening on port 9000!
::1 - - [18/Apr/2017:21:13:43 +0000] "GET / HTTP/1.1" 200 424 6.933 ms
::1 - - [18/Apr/2017:21:13:43 +0000] "GET /static/js/main.700ba5c4.js HTTP/1.1" 200 217574 1.730 ms
::1 - - [18/Apr/2017:21:13:43 +0000] "GET /index.css HTTP/1.1" 200 - 8.722 ms
Server received a request: GET /tracks
::1 - - [18/Apr/2017:21:13:43 +0000] "GET /tracks HTTP/1.1" 304 - 41.468 ms
Server received a request: GET /tracks/1/stream
::1 - - [18/Apr/2017:21:14:13 +0000] "GET /tracks/1/stream HTTP/1.1" 200 - 636.249 ms
Server received a request: GET /tracks/2/stream
Database query threw an error: ETIMEDOUT
Note the 636.249 ms!
Can you guys tell if I am doing anything wrong here? I pasted the code from my current approach below; it does the following:
Client makes a fetch call to /tracks/id/stream
Server queries database to get the track
Server uses downloadStream (from the s3 package) to get the file
Server pipes the data to the client
Client receives the data as an ArrayBuffer
Client decodes the buffer and passes it to an AudioBufferSourceNode
AudioBufferSourceNode plays the audio
The server-side:
app.get('/tracks/:id/stream', (req, res) => {
  const id = req.params.id

  // Query the database for all tracks
  database.query(`SELECT * FROM Tracks WHERE id = ${id}`, (error, results, fields) => {
    // Upon failure...
    if (error) {
      res.sendStatus(500)
    }

    // Upon success...
    const params = {
      Bucket: bucketName, // use bucketName defined elsewhere
      Key: results[0].key // use the key from the track object
    };

    // Download stream and pipe to client
    const stream = client.downloadStream(params)
    stream.pipe(res)
  })
});
The client-side fetch call:
const URL = `/tracks/${id}/stream`
const options = { method: 'GET' }
fetch(URL, options)
.then(response => response.arrayBuffer())
.then(data => {
// do some stuff
AudioPlayer.play(data)
})
The client-side AudioPlayer, responsible for handling the actual AudioBufferSourceNode:
const AudioPlayer = {
  play: function (data) {
    // Decode the audio data
    context.decodeAudioData(data, buffer => {
      // Create a new buffer source
      source = context.createBufferSource()
      source.buffer = buffer
      source.connect(context.destination)

      // Start playing immediately
      source.start(context.currentTime)
    })
  },
  ...
There's a lot wrong here, so let's just go through it piece by piece, whether it was related to your original question or not.
database.query(`SELECT * FROM Tracks WHERE id = ${id}`, (error, results, fields) => {
In this line, you open yourself up to SQL injection attacks. Never concatenate arbitrary data into the context of a query (or any other context for that matter) without proper escaping. Whatever database library you're using will have a parameterized method that you should be utilizing.
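For instance, the `mysql` package (which the template-literal query above suggests) accepts a `(sql, values, callback)` call with `?` placeholders. A tiny stub stands in for the driver here so the pattern is runnable on its own; the call shape is the point:

```javascript
// Stub mimicking the driver's (sql, values, callback) signature.
const database = {
  query(sql, values, cb) { cb(null, { sql, values }); }
};

const id = "1 OR 1=1"; // hostile input stays inert as a bound value
let captured;

// `?` placeholders are filled in by the driver with proper escaping,
// so user data never mixes into the SQL text itself.
database.query('SELECT * FROM Tracks WHERE id = ?', [id], (error, results) => {
  if (error) throw error;
  captured = results;
});
```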
I suspect I might be downloading the file, because my requests to the server take a long time to come back
Who knows... you didn't show us a single location where you're doing the logging, so it's hard to say whether the logged line comes before or after the request is complete. One thing that is clear, however, is that the response has to at least begin, otherwise the response status code wouldn't be known. A 600 ms response time for the first resource byte from S3 isn't unheard of anyway.
Server uses downloadStream (from the s3 package) to get the file
Server pipes the data to the client
You're wasting a lot of bandwidth with this. Rather than fetching the file and relaying it to the client, what you should do is sign a temporary URL with a 15-minute expiration or so, and redirect the client to it. The client will follow the redirect and now S3 is responsible for handling your clients. It will cost you half as much bandwidth, less CPU resource, and will be delivered from a location likely closer to your users. You can create this signed URL with the AWS JS SDK.
Client receives the data as an ArrayBuffer
There's no streaming happening here. Your client is downloading the entire resource before it's playing anything.
What you should do is be creating a normal Audio instance. It will automagically follow your redirect from your Node.js app to your signed S3 URL and handle all the buffering and streaming for you.
let a = new Audio('/tracks/' + encodeURIComponent(id) + '/stream');
a.play();
How can I adjust the Hapi reply function so that it replies with JSON objects only?
Should I send it as plain text? I can't seem to find a good example.
Edit: I added some sample code to show what's happening.
The route:
server.route({
  method: 'GET',
  path: '/user/',
  handler: function (request, reply) {
    var ids = null;
    mysqlConnection.query('SELECT ID FROM Users;', function (err, rows, fields) {
      if (err) throw err;
      ids = rows;
      // console.log(ids);
      reply(ids);
    });
  }
});
The reply:
<html><head></head><body>
<pre style="word-wrap: break-word; white-space: pre-wrap;">[{"ID":1},{"ID":2},{"ID":3},{"ID":4},{"ID":5},{"ID":6},{"ID":7},{"ID":8},{"ID":9},{"ID":10},{"ID":11},{"ID":12},{"ID":13},{"ID":14},{"ID":15},{"ID":16},{"ID":17},{"ID":18},{"ID":19},{"ID":20},{"ID":21}]
</pre></body></html>
I hope I understand the question correctly. Are we talking about version 8.x? For me, JSON seems to be the default. With this code as a route handler,
folders: {
  handler: function (request, reply) {
    'use strict';
    reply({
      folders: folders
    }).code(200);
  }
},
and doing
curl http://localhost:3001/folders
I get the following output
* Hostname was NOT found in DNS cache
* Trying 127.0.0.1...
* Connected to localhost (127.0.0.1) port 3001 (#0)
> GET /folders HTTP/1.1
> User-Agent: curl/7.37.1
> Host: localhost:3001
> Accept: */*
>
< HTTP/1.1 200 OK
< content-type: application/json; charset=utf-8
< cache-control: no-cache
< content-length: 266
< accept-ranges: bytes
< Date: Tue, 03 Feb 2015 23:19:31 GMT
< Connection: keep-alive
<
{folders ..... }
Also, note that I only call reply(), not return reply().
HTH
As of v17 and above, the reply() interface was removed. The handlers now use async functions, and you can just return the value.
From the hapi docs example:
// Before
const handler = function (request, reply) {
return reply('ok');
};
// After
const handler = function (request, h) {
return 'ok';
};
Using hapi's reply(data) and passing a data object will do the work for you. Internally, hapi will create the appropriate JSON from your data object and respond with it.
There's a tutorial on how to reply with JSON for a given request using hapi that might give some more insights.
With v17 and above, simply returning a naked string doesn't result in a JSON-encoded reply.
Use return JSON.stringify() to ensure the string is JSON-encoded.
e.g.
function (request, h) {
  return JSON.stringify('ok');
};