We have an endpoint that returns a gzip-encoded response. We want to cache the value, and we are using API Gateway to do that for us. The resource method is defined as follows:
GetManifestApiGatewayMethod: # very good
Type: "AWS::ApiGateway::Method"
Properties:
AuthorizationType: "NONE"
HttpMethod: "GET"
ResourceId:
Ref: ManifestConfigurationResource
RestApiId:
Ref: ApiGatewayRestApi
RequestParameters:
method.request.path.seasonCode: true
method.request.path.facilityCode: true
method.request.path.configurationCode: true
method.request.querystring.policyCode: true
method.request.header.PAC-Authorization: true
method.request.header.PAC-Application-ID: true
method.request.header.PAC-API-Key: true
method.request.header.PAC-Channel-Code: true
method.request.header.PAC-Organization-ID: true
method.request.header.PAC-Developer-ID: true
method.request.header.PAC-Request-ID: false
method.request.header.Accept-Encoding: true
MethodResponses:
- StatusCode: 200
# ResponseParameters:
# method.response.header.Content-Encoding: true
Integration:
IntegrationHttpMethod: GET
Type: HTTP
Uri: https://${self:provider.environment.PDI_HOST}/pdi/v1/manifest/{seasonCode}/{facilityCode}/{configurationCode}
PassthroughBehavior: WHEN_NO_MATCH
CacheKeyParameters:
- method.request.path.seasonCode
- method.request.path.facilityCode
- method.request.path.configurationCode
- method.request.querystring.policyCode
IntegrationResponses:
- StatusCode: 200
SelectionPattern: '\d\d\d'
# ResponseParameters:
# method.response.header.content-encoding: integration.response.body.headers.content-encoding
RequestParameters:
integration.request.path.seasonCode: method.request.path.seasonCode
integration.request.path.facilityCode: method.request.path.facilityCode
integration.request.path.configurationCode: method.request.path.configurationCode
integration.request.querystring.policyCode: method.request.querystring.policyCode
integration.request.header.Authorization: method.request.header.PAC-Authorization
integration.request.header.PAC-Application-ID: method.request.header.PAC-Application-ID
integration.request.header.PAC-API-Key: method.request.header.PAC-API-Key
integration.request.header.PAC-Channel-Code: method.request.header.PAC-Channel-Code
integration.request.header.PAC-Organization-ID: method.request.header.PAC-Organization-ID
integration.request.header.PAC-Developer-ID: method.request.header.PAC-Developer-ID
integration.request.header.PAC-Request-ID: method.request.header.PAC-Request-ID
integration.request.header.Accept-Encoding: method.request.header.Accept-Encoding
Our http.get call has the following logic in it:
const encoding = response.headers["content-encoding"];
if (encoding && encoding.indexOf("gzip") >= 0) {...} // handle the gzip
But when we use the integration method above, I am not getting the header that I would normally get from hitting the API it proxies to directly. There is some commented-out code where I tried to pass that header along, but I'm getting an internal server error when those response mappings are used.
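For context, here is a hedged sketch of that client-side branch as a standalone script (the host, path, and zlib usage are illustrative assumptions, not the actual handler). It shows why the missing header matters: without Content-Encoding the gzip branch is skipped and the still-compressed bytes reach the JSON parser.
const https = require("https");
const zlib = require("zlib");

// Hypothetical endpoint; stands in for the API Gateway URL.
https.get(
  { host: "api.example.com", path: "/manifest", headers: { "Accept-Encoding": "gzip" } },
  (response) => {
    const encoding = response.headers["content-encoding"];
    const stream =
      encoding && encoding.indexOf("gzip") >= 0
        ? response.pipe(zlib.createGunzip()) // handle the gzip
        : response;                          // assume an uncompressed body
    let body = "";
    stream.on("data", (chunk) => (body += chunk));
    stream.on("end", () => console.log(JSON.parse(body)));
  }
);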
From the looks of your code template, the Method response header looks correct.
method.response.header.Content-Encoding: true
However, your integration response's ResponseParameters seems wrong.
method.response.header.content-encoding: integration.response.body.headers.content-encoding
Firstly, the property key should match exactly, so the capitalisation may be adding to the issue. However, your value looks to be the real issue. As per the AWS documentation:
Use the destination as the key and the source as the value:
The destination must be an existing response parameter in the
MethodResponse property.
The source must be an existing method request parameter or a static
value. You must enclose static values in single quotation marks and
pre-encode these values based on the destination specified in the
request.
It looks as if you're trying to map an integration response body path as the value instead of a method request parameter or a static value.
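For what it's worth, when an upstream header needs to be forwarded from an HTTP integration, the mapping that usually works sources the method response header from the corresponding integration response header. A sketch against the question's template (untested here, only the relevant parts shown):
MethodResponses:
  - StatusCode: 200
    ResponseParameters:
      method.response.header.Content-Encoding: true
Integration:
  IntegrationResponses:
    - StatusCode: 200
      SelectionPattern: '\d\d\d'
      ResponseParameters:
        # Key matches the MethodResponse parameter exactly (same capitalisation);
        # the source is the upstream header rather than a body path.
        method.response.header.Content-Encoding: integration.response.header.Content-Encoding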
rundeck ver: 4.0.0
I want to send the job execution result to a specific webhook URL, so I set up a job notification using the GUI:
Trigger: On success
Notification Type: Send Webhook
URL(s): specific incoming webhook url
Payload Format: JSON
Method: POST
A Rundeck job notification error occurs (server response: 415 Unsupported Media Type).
Is there a configuration file that needs to be added beyond the job notification settings?
To send the job execution result you can use the HTTP notification plugin and put an export variable (containing the result) in the body section.
For this, you need to "capture" the output and store it in a global variable to put in the notification. I made a working example against the webhook.site test service:
- defaultTab: nodes
description: ''
executionEnabled: true
id: bcee72e0-b6fa-4e8e-a065-9937a67f3de7
loglevel: INFO
name: HelloWorld
nodeFilterEditable: false
notification:
onsuccess:
plugin:
configuration:
authentication: None
body: ${export.myexport}
contentType: application/json
method: POST
remoteUrl: https://webhook.site/xxxx-xxxx-xxxx-xxxx-xxxx
timeout: '30000'
type: HttpNotification
notifyAvgDurationThreshold: null
options:
- name: opt1
value: localhost
plugins:
ExecutionLifecycle: null
scheduleEnabled: true
sequence:
commands:
- exec: nmap ${option.opt1}
plugins:
LogFilter:
- config:
hideOutput: 'false'
logData: 'true'
name: myoutput
regex: (.*)
type: key-value-data-multilines
- description: only for debug
exec: echo ${data.myoutput}
- configuration:
export: myexport
group: export
value: ${data.myoutput*}
nodeStep: false
type: export-var
keepgoing: false
strategy: node-first
uuid: bcee72e0-b6fa-4e8e-a065-9937a67f3de7
Run the command and capture the output with a data-capture log filter; I use the Multiline Regex Data Capture plugin, so the output is saved in a data variable called ${data.myoutput}.
Then, using the Global Variable workflow step, take ${data.myoutput} (referenced in the ${data.myoutput*} format) and create the ${export.myexport} variable to use in the notification (just put ${export.myexport} in the body).
Check the result.
While installing a Serverless plugin with the following command
sls plugin install -n serverless-alexa-skills --stage dev
I am getting an error: Your serverless.yml has an invalid value with key: "Ref".
The following is my sample serverless.yml file:
plugins:
- serverless-webpack
- serverless-s3-sync
- serverless-plugin-git-variables
- serverless-alexa-skills
functions: ${file(./deploy/${opt:stage}.yml):functions}
resources: ${file(./deploy/${opt:stage}.yml):resources}
custom: ${file(./deploy/${opt:stage}.yml):custom}
outputs:
DialogflowFunctionArn:
Value:
Ref:
I am stuck here. Can someone help me out?
Ref is a CloudFormation intrinsic function. It needs to reference a resource. The whole outputs section is also optional; use it only if you need to reference resources from one stack in another.
It basically says that Ref: is expecting a value. You have defined it but not assigned any value to it. If you have no use for it, you should remove this part from your code:
outputs:
DialogflowFunctionArn:
Value:
Ref:
Ref expects to reference something; right now you are not passing it anything to reference.
So, assuming you want the ARN of DialogflowFunction and that the function config looks something like this in your functions file:
DialogflowFunction:
description: get the flow
handler: src/functions/dialog-controller.flow
events:
- http:
path: '/dialog/flow'
method: get
cors: true
Then your ref would look something like this:
outputs:
DialogflowFunctionArn:
Value:
Ref: DialogflowFunction
Ref takes the logical ID of the resource you want to reference, in this case DialogflowFunction. Note that for a Lambda function, Ref returns the function name; to get the ARN of the function, use Fn::GetAtt with the Arn attribute instead.
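A sketch of that output using GetAtt (the logical ID DialogflowFunctionLambdaFunction is an assumption here: the Serverless Framework normally generates a CloudFormation logical ID of the form <functionName>LambdaFunction):
outputs:
  DialogflowFunctionArn:
    Value:
      # GetAtt returns the function's ARN; Ref would return its name.
      Fn::GetAtt: [DialogflowFunctionLambdaFunction, Arn]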
Modern ECMAScript-262, from version 6 onward, supports the Proxy object, which allows intercepting operations on custom objects (https://developer.mozilla.org/en/docs/Web/JavaScript/Reference/Global_Objects/Proxy).
Using a Proxy, a user can write something like obj.func1(123).func2('ABC').something, where nothing to the right of obj actually exists. Of course, with a proper Proxy declaration, this chain can still be evaluated. The simplest way is to return a new Proxy on every property access and function invocation.
But is there a way to automate this process, maybe some NPM library? The goal is to transform a custom free-form expression into a declarative description via Proxy.
The example above would transform into something like:
const result = obj.func1(123).func2('ABC').something;
result.__INVOCATION_VIEW__ === [
{ type: 'function', name: 'func1', args: [123] },
{ type: 'function', name: 'func2', args: ['ABC'] },
{ type: 'get', name: 'something' }
]
Purely theoretically, a solution is possible in any case, because every entity in the invocation chain is available as a string inside the subsequent Proxy. But the implementation is quite complex. Maybe there is an existing library for it?
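For illustration, here is a minimal sketch of the hand-rolled approach described above (the record helper and its trap behaviour are assumptions for the example, not an existing library):
// Each access returns a fresh Proxy carrying the invocation log so far.
function record(view = []) {
  const target = () => {};            // callable, so the apply trap can fire
  return new Proxy(target, {
    get(_, name) {
      if (name === "__INVOCATION_VIEW__") return view;
      // Log a property access; it is rewritten into a function entry
      // if the property turns out to be called.
      return record([...view, { type: "get", name }]);
    },
    apply(_, __, args) {
      const last = view[view.length - 1];
      return record([...view.slice(0, -1), { type: "function", name: last.name, args }]);
    },
  });
}

const obj = record();
const result = obj.func1(123).func2("ABC").something;
console.log(result.__INVOCATION_VIEW__);
// [ { type: 'function', name: 'func1', args: [ 123 ] },
//   { type: 'function', name: 'func2', args: [ 'ABC' ] },
//   { type: 'get', name: 'something' } ]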
I am using Active Model Serializers 0.10.x with Ember CLI and Rails, trying to use JSON API as the adapter. GET requests are working, but deserialization for the Active Model is not, even though I tried to implement the Rails strong_parameters solution described by jimbeaudoin here.
My latest attempt at saving a comment:
Payload:
{"data":{
"attributes": {"soft_delete":false,"soft_delete_date":null,"text":"hjfgfhjghjg","hidden":false,"empathy_level":0},
"relationships":{
"user":{"data":{"type":"users","id":"1"}},
"post":{"data":{"type":"posts","id":"1"}}},"type":"comments"}}
Console Output:
Completed 400 Bad Request in 13ms (ActiveRecord: 8.6ms)
ActionController::ParameterMissing (param is missing or the value is empty: data):
Comments Controller:
class Api::V1::CommentsController < MasterApiController
respond_to :json
...
def create
render json: Comment.create(comment_params)
end
...
def comment_params
#Deserialization issues... Waiting for #950 https://github.com/rails-api/active_model_serializers/pull/950
params.require(:data).require(:attributes).permit(:text, :user_id, :post_id, :empathy_level, :soft_delete_date, :soft_delete, :hidden)
end
end
Note that if I set the parameters to just params.permit(...), the server saves the record with everything null (I did not set any constraints on the Comment model for now):
data: {id: "9", type: "comments",…}
attributes: {soft_delete: null, soft_delete_date: null, text: null, hidden: null, empathy_level: null}
id: "9"
relationships: {post: {data: null}, user: {data: null}}
type: "comments"
You can access the full code here.
Update #2: For AMS >= 0.10.2, please check other answers.
Update #1: Answer is still valid for AMS 0.10.1.
If you use 0.10.0.rc4, you can now use the Deserialization implementation described on Active Model Serializers #1248.
def post_params
ActiveModel::Serializer::Adapter::JsonApi::Deserialization.parse(params.to_h)
# or parse(params.to_unsafe_h) in some cases, like in my example below...
end
Bonus: If you use Ember Data, then you can see an example implementation on my Fakktion Github repo.
For AMS >= 0.10.2
There was a cleanup in 0.10.2, so from 0.10.2 onward use:
def post_params
ActiveModelSerializers::Deserialization.jsonapi_parse(params)
end
Reference:
https://github.com/rails-api/active_model_serializers/commit/252f9c4ae932e6280dfe68605d495b208fe22ba7
With AMS 0.10.2+
Use the only option to create a parameter whitelist:
def post_params
ActiveModelSerializers::Deserialization.jsonapi_parse!(
params, only: [:title, :author, :tags]
)
end
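Applied to the comment payload in the question, that would look roughly like this (a sketch only; the deserializer flattens relationships into *_id keys, but I have not run this against the repo):
def comment_params
  # Parse the whole JSON:API document; add an only:/except: whitelist as needed.
  ActiveModelSerializers::Deserialization.jsonapi_parse!(params)
end

# For the payload above this yields, roughly:
# { text: "hjfgfhjghjg", hidden: false, empathy_level: 0,
#   soft_delete: false, soft_delete_date: nil,
#   user_id: "1", post_id: "1" }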
For googlers:
If you get an empty data payload, you need to add MIME type support:
https://stackoverflow.com/a/32013294/2664779
When you want to access JSON:API-formatted JSON, you should do it like this
(in your controller)
def create
user = User.new(user_params)
...
end
private
def user_params
params.require(:data).require(:attributes).permit(:email, :password)
end
Previously I would do it like this:
private
def user_params
params.require(:user).permit(:email, :password)
end
I am using shoulda-matchers to test critical routes on Rails 4.2.1 with Ruby 2.2.0 in a new application. I just moved my API namespace to a subdomain, and I can't figure out how to get the shoulda route matcher (or any other concise route test) to work.
Here is some example code:
config/routes.rb (using versionist for versioning, but that shouldn't be relevant)
namespace :api, path: '', constraints: { subdomain: 'api'} do
api_version(module: 'V1',
path: {value: 'v1'},
defaults: {format: 'json'}, default: true) do
resources :bills, only: :index
end
end
app/controllers/api/v1/bills_controller.rb
module API
module V1
class Bill < APIVersionsController
# GET api.example.com/v1/bills.json
def index
@bills = Bill.all.limit(10)
render json: @bills
end
end
end
end
test/controllers/api/v1/routing_test.rb
module API
module V1
class RoutingTest < ActionController::TestCase
setup { @request.host = 'http://api.example.com' }
should route('/v1/bills')
.to(controller: :bill, action: :index, format: :json)
end
end
end
Before I was using a subdomain, should route('/api/v1/bills').to(action: :index, format: :json) in my BillsControllerTest worked just fine.
Now, when I run rake test, I get Minitest::Assertion: No route matches "/v1/bills".
I've tried putting this in the BillsControllerTest (no change);
I've tried an integration test (route matcher doesn't work);
I've tried setting the host with setup { host! 'api.example.com' } and setup { @request.host = 'api.example.com' };
I've tried putting the full URL in the get request ( { get 'http://api.example.com/v1/bills' } );
and I've tried putting subdomain: 'api' and constraints: subdomain: 'api' anywhere that might make sense.
What is a concise way to do route testing with subdomains/what is the current best practice? Is there a way to get the shoulda route matcher to work with them?
This ended up being a simple fix. I just needed to add a subdomain constraint in the #to method and ensure the #route method had the full URL:
module API
module V1
class RoutingTest < ActionController::TestCase
should route(:get, 'http://api.example.com/v1')
.to('api/v1/data#index',
subdomain: 'api',
format: :json)
...
Or, if you are in data_controller_test.rb,
module API
module V1
class DataControllerTest < ActionController::TestCase
should route(:get, 'http://api.example.com/v1')
.to(action: :index,
subdomain: 'api',
format: :json)
...