Plug.Conn.Unfetched does not implement the Access behaviour - struct

From the code below, when I call conn.params["geo"], I get the following error:
test/plugs/geoip_test.exs:4
** (UndefinedFunctionError) function Plug.Conn.Unfetched.fetch/2 is undefined (Plug.Conn.Unfetched does not implement the Access behaviour)
stacktrace:
(plug) Plug.Conn.Unfetched.fetch(%{:__struct__ => Plug.Conn.Unfetched, :aspect => :params, "geo" => "Mountain View, US", "ip" => "8.8.8.8"}, "geo")
...
defmodule AgilePulse.Plugs.GeoIPTest do
  use AgilePulse.ConnCase

  test "returns Mountain View for 8.8.8.8" do
    conn = build_conn
    params = Map.put(conn.params, "ip", "8.8.8.8")
    conn = Map.put(conn, :params, params) |> AgilePulse.Plugs.GeoIP.call(%{})

    assert conn.params["geo"] == "Mountain View, US"
  end
end
defmodule AgilePulse.Plugs.GeoIP do
  import Plug.Conn

  def init(opts), do: opts

  def call(%Plug.Conn{params: %{"ip" => ip}} = conn, _opts) do
    geo = set_geo(ip)
    params = Map.put(conn.params, "geo", geo)
    Map.put(conn, :params, params)
  end

  def call(conn, _opts), do: conn
  ...
end
Could someone enlighten me on why this is failing and what the appropriate solution is? TY!

Short answer: Change this:
params = Map.put(conn.params, "ip", "8.8.8.8")
To:
params = %{"ip": "8.8.8.8"}
Explanation: Phoenix.ConnTest.build_conn/0 returns a Conn with params set to %Plug.Conn.Unfetched{}. By using Map.put on that struct, you don't reset the value of __struct__; you only add a new key:
%Plug.Conn{...,
  params: %{:__struct__ => Plug.Conn.Unfetched, :aspect => :params,
            "ip" => "8.8.8.8"}, ...}
When you call params["geo"] later, Elixir sees that params is a struct and tries to call the fetch/2 function on the struct's module, which doesn't exist. To reset params to a normal map (so that Elixir calls Map.get when you use the square-bracket syntax), you can just do params = %{"ip" => "8.8.8.8"} (note the => syntax, which keeps the key a string so it matches the %{"ip" => ip} pattern in the plug).
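For reference, here is a minimal sketch of the corrected test, reusing the module names from the question (untested, so treat it as illustrative):
defmodule AgilePulse.Plugs.GeoIPTest do
  use AgilePulse.ConnCase

  test "returns Mountain View for 8.8.8.8" do
    conn =
      build_conn()
      # A plain map with a string key, so it matches %{"ip" => ip} in the plug.
      |> Map.put(:params, %{"ip" => "8.8.8.8"})
      |> AgilePulse.Plugs.GeoIP.call(%{})

    assert conn.params["geo"] == "Mountain View, US"
  end
end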


Gatling Rest API Testing - retrieve a value from json response and add it to the list, iterate through list

I am new to Gatling and am trying to do performance testing for a couple of REST calls. In my scenario I need to extract a value from the JSON response of the first call and add it to a list while looping a few times. Once the loop has filled the list, I want to reuse each value in my next REST call by iterating over the list. Can anyone please suggest how to implement this? I tried something like the code below:
var datasetIdList = List.empty[String]
val datasetidsFeeder = datasetIdList.map(datasetId => Map("datasetId" -> datasetId)).iterator

def createData() = {
  repeat(20) {
    feed("").exec(http("create dataset").post("/create/data").header("content-type", "application/json")
      .body(StringBody("""{"name":"name"}"""))
      .asJson.check(jsonPath("$.id").saveAs("userId")))
      .exec(session => {
        var usrid = session("userId").as[String].trim
        datasetIdList :+= usrid
        session
      })
  }
}

def upload() = feed(datasetidsFeeder).exec(http("file upload").post("/compute-metaservice/datasets/${datasetId}/uploadFile")
  .formUpload("File", "./src/test/resources/data/File.csv")
  .header("content-type", "multipart/form-data")
  .check(status is 200))

val scn = scenario("create data and upload").exec(createData()).exec(upload())

setUp(scn.inject(atOnceUsers(1))).protocols(httpConf)
}
I am seeing an exception that the ListFeeder is empty when I try to run the above script. Can someone please help?
Updated Code:
class ParallelcallsSimulation extends Simulation {

  var idNumbers = (1 to 50).iterator
  val customFeeder = Iterator.continually(Map(
    "name" -> ("test_gatling_" + idNumbers.next())
  ))

  val httpConf = http.baseUrl("http://localhost:8080")
    .header("Authorization", "Bearer 6a4aee03-9172-4e31-a784-39dea65e9063")

  def createDatasetsAndUpload() = {
    repeat(3) {
      // create dataset
      feed(customFeeder).exec(http("create data").post("/create/data").header("content-type", "application/json")
        .body(StringBody("""{ "name": "${name}","description": "create data and upload file"}"""))
        .asJson.check(jsonPath("$.id").saveAs("userId")))
        .exec(session => {
          val name = session("name").asOption[String]
          println(name.getOrElse("COULD NOT FIND NAME"))
          val userId = session("userId").as[String].trim
          println("%%%%% User ID ====>" + userId)
          val datasetIdList = session("datasetIdList").asOption[List[_]].getOrElse(Nil)
          session.set("datasetIdList", userId :: datasetIdList)
        })
    }
  }

  // File Upload
  def fileUpload() = foreach("${datasetIdList}", "datasetId") {
    exec(http("file upload").post("/uploadFile")
      .formUpload("File", "./src/test/resources/data/File.csv")
      .header("content-type", "multipart/form-data")
      .check(status is 200))
  }

  def getDataSetId() = foreach("${datasetIdList}", "datasetId") {
    exec(http("get datasetId")
      .get("/get/data/${datasetId}")
      .header("content-type", "application/json")
      .asJson.check(jsonPath("$.dlp.dlp_job_status").optional
        .saveAs("dlpJobStatus")).check(status is 200)
    ).exec(session => {
      val datastId = session("datasetId").asOption[String]
      println("request for datasetId >>>>>>>>" + datastId.getOrElse("datasetId not found"))
      val jobStatus = session("dlpJobStatus").asOption[String]
      println("JOB STATUS:::>>>>>>>>>>" + jobStatus.getOrElse("Dlp Job Status not Found"))
      println("Time: >>>>>>" + System.currentTimeMillis())
      session
    }).pause(10)
  }

  val scn1 = scenario("create multiple datasets and upload").exec(createDatasetsAndUpload()).exec(fileUpload())
  val scn2 = scenario("get datasetId").pause(100).exec(getDataSetId())

  setUp(scn1.inject(atOnceUsers(1)), scn2.inject(atOnceUsers(1))).protocols(httpConf)
}
I see the below error when I try to execute the above script:
[ERROR] i.g.c.s.LoopBlock$ - Condition evaluation crashed with message 'No attribute named 'datasetIdList' is defined', exiting loop
var datasetIdList = List.empty[String] defines a mutable variable pointing to an immutable list.
val datasetidsFeeder = datasetIdList.map(datasetId => Map("datasetId" -> datasetId)).iterator uses that immutable list, so later changes to datasetIdList are irrelevant to datasetidsFeeder.
Mutating a global variable from your virtual users is usually not a good idea.
You can save the value into the user's session instead.
In the exec block, you can write:
val userId = session("userId").as[String].trim
val datasetIdList = session("datasetIdList").asOption[List[_]].getOrElse(Nil)
session.set("datasetIdList", userId :: datasetIdList)
Then you can use foreach to iterate them all without using a feeder at all.
foreach("${datasetIdList}", "datasetId") {
exec(http("file upload")
...
}
You should put more work into your question.
Your code is not syntax-highlighted and is formatted poorly.
You said "I am seeing an exception that ListFeeder is empty", but the word "ListFeeder" does not appear anywhere in your code.
You should post the actual error message so that it's easier to see what went wrong.
In the linked documentation there is a warning, quoted below:
Session instances are immutable!
Why is that so? Because Sessions are messages that are dealt with in a multi-threaded concurrent way, so immutability is the best way to deal with state without relying on synchronization and blocking.
A very common pitfall is to forget that set and setAll actually return new instances.
This is why the code in the updated question doesn't update the list.
session => {
  ...
  session.set("datasetIdList", userId :: datasetIdList)
  println("%%%% List =====>>>" + datasetIdList.toString())
  session
}
The updated session returned by set is simply discarded, and the original session is what the anonymous function returns.
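A minimal sketch of the corrected block (using the names from the updated question) would make the updated session the value the function returns:
.exec(session => {
  val userId = session("userId").as[String].trim
  val datasetIdList = session("datasetIdList").asOption[List[_]].getOrElse(Nil)
  // set returns a NEW session; keep it and return it instead of the original.
  val updated = session.set("datasetIdList", userId :: datasetIdList)
  println("%%%% List =====>>>" + (userId :: datasetIdList).toString())
  updated
})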

How to send total model object as a parameter of Alamofire post method in Swift3?

I have a model class like this:
class Example {
    var name: String?
    var age: String?
    var marks: String?
}
I'm adding data to that model class
let example = Example()
example.name = "ABC"
example.age = "10"
example.marks = "10"
After that I converted it to JSON and then posted it:
Alamofire.request(URL, method: .post, parameters: example)
Alamofire does not accept the model as parameters; it only accepts something key-value based, like parameters = ["key": "value"]. I tried converting the model to JSON and the JSON to a dictionary, but it still complains about the parameters. I need to send the whole model object as the parameters of the Alamofire post method, like this:
let example = Example()
Alamofire.request(URL, method:.post, parameters: example)
Since the Alamofire API only accepts dictionaries, create a dictionary yourself!
Add a method in the model class called toJSON:
func toJSON() -> [String: Any] {
return [
"name": name as Any,
"age": age as Any,
"marks": marks as Any
]
}
Then call this method when calling request:
Alamofire.request(URL,
                  method: .post,
                  parameters: example.toJSON(),
                  encoding: JSONEncoding.default,
                  headers: Defines.Api.Headers)
Alternatively, use SwiftyJSON:
func toJSON() -> JSON {
    return [
        "name": name as Any,
        "age": age as Any,
        "marks": marks as Any
    ]
}
Usage:
Alamofire.request(URL,
                  method: .post,
                  parameters: example.toJSON().dictionaryObject,
                  encoding: JSONEncoding.default,
                  headers: Defines.Api.Headers)
The best way so far is to make your model conform to Encodable, then convert your model into JSON Data like so:
let data = try! JSONEncoder().encode(example)
then use SwiftyJSON to convert it back to a dictionary:
let json = try! JSON(data: data)
let dictionary = json.dictionaryObject
As Rob said, you can also use JSONSerialization if you are not already using SwiftyJSON:
let dictionary = try! JSONSerialization.jsonObject(with: data) as! [String: Any]
Then use the dictionary in your parameters.
Also, Alamofire now supports Encodable parameters with:
let encodedRequest = try JSONParameterEncoder().encode(example, into: urlRequest)
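Putting the Encodable approach together end to end, here is a minimal, untested sketch (the struct, endpoint URL, and field values are placeholders rather than the question's actual code, and it assumes Swift 4+ with Alamofire 4, since Encodable does not exist in Swift 3):
import Foundation
import Alamofire

// Encodable stand-in for the question's Example class.
struct Example: Encodable {
    var name: String?
    var age: String?
    var marks: String?
}

let example = Example(name: "ABC", age: "10", marks: "10")

do {
    // Model -> Data -> [String: Any], the shape Alamofire's `parameters` argument expects.
    let data = try JSONEncoder().encode(example)
    let parameters = try JSONSerialization.jsonObject(with: data) as? [String: Any]

    // Hypothetical endpoint; substitute your own URL and headers.
    Alamofire.request("https://example.com/api/students",
                      method: .post,
                      parameters: parameters,
                      encoding: JSONEncoding.default)
} catch {
    print("Failed to encode model: \(error)")
}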

Spock: check the query parameter count in URI

I have just started with Spock. I am testing functionality where a Java function makes an HTTP call. Per that functionality, the URI used in the HTTP call must contain a "loc" parameter, and it must appear only once.
I am writing Spock test case. I have written below snippet.
def "prepareURI" () {
given: "Search Object"
URI uri = new URI();
when:
uri = handler.prepareURI( properties) // it will return URI like http://example.com?query=abc&loc=US
then:
with(uri)
{
def map = uri.getQuery().split('&').inject([:]) {map, kv-> def (key, value) = kv.split('=').toList(); map[key] = value != null ? URLDecoder.decode(value) : null; map }
assert map.loc != null
}
}
With the snippet above, my two checks pass:
the "loc" parameter exists
it is not null
I want to check the count of the "loc" query parameter: it should be passed exactly once. With the map as above, if I pass the "loc" parameter twice, the map overrides the old value with the second one.
Does anyone know how to access the query parameters as a list, so that I can count the entries whose name starts with "loc"?
Thanks in advance.
Perhaps an example would be the best start:
def uri = new URI('http://example.com?query=abc&loc=US')
def parsed = uri.query.tokenize('&').collect { it.tokenize('=') }
println "parsed to list: $parsed"
println "count of 'loc' params: " + parsed.count { it.first() == 'loc' }
println "count of 'bob' params: " + parsed.count { it.first() == 'bob' }
println "count of params with value 'abc': " + parsed.count { it.last() == 'abc' }
prints:
$ groovy test.groovy
parsed to list: [[query, abc], [loc, US]]
count of 'loc' params: 1
count of 'bob' params: 0
count of params with value 'abc': 1
The problem, as you correctly noted, is that you cannot put your params into a map if your intent is to count the number of params with a certain name.
In the above, we parse the params into a list of lists, where the inner lists are key-value pairs. This way we can call it.first() to get the param names and it.last() to get the param values. The Groovy List.count { } method lets us count the occurrences of a certain item in the list of params.
As for your code, there is no need to call new URI() at the beginning of your test, as you overwrite the value a few lines down.
Also, the with(uri) call is unnecessary, as you prefix every call with uri. anyway. I.e. you can either write:
def uri = new URI('http://example.com?query=abc&loc=US')
def parsed = uri.query.tokenize('&').collect { it.tokenize('=') }
or:
def uri = new URI('http://example.com?query=abc&loc=US')
uri.with {
    def parsed = query.tokenize('&').collect { it.tokenize('=') }
}
(note that we are using query directly in the second example)
but there is not much point in using with if you are still prefixing everything with uri. anyway.
The resulting test case might look something like:
def "prepareURI"() {
given: "Search Object"
def uri = handler.prepareURI( properties) // it will return URI like http://example.com?query=abc&loc=US
when:
def parsed = query.tokenize('&').collect { it.tokenize('=') }
then:
assert parsed.count { it.first() == 'loc' } == 1
}

Chef custom attributes

I'm working on a custom Chef cookbook and have defined a custom attribute, default["server"]["apikey"] = nil, in a separate attributes file within the cookbook that looks like this:
#Default Attributes
default["webhooks"]["get_response"] = ""
default["webhooks"]["put_response"] = ""
default["webhooks"]["post_response"] = ""
default["server"]["username"] = "user"
default["server"]["password"] = "123"
default["server"]["apikey"] = nil
Within my recipe I then do this:
webhooks_request "Request an API key from TPP " do
uri "172.16.28.200/sdk/authorize/"
post_data (
{ 'Username' => node["server"]["username"], 'Password' => node["server"]["password"]}
)
header_data (
{ 'content-type' => 'application/json'}
)
expected_response_codes [ 200, 201, 400 ]
action :post
end
I then follow this with a ruby_block that updates the value of the default["server"]["apikey"] attribute with the API key, like this:
ruby_block "Extract the API Key" do
block do
jsonData = JSON.parse(node["webhooks"]["post_response"])
jsonData.each do | k, v |
if k == 'APIKey'
node.overide["server"]["apikey"] = v
end
end
end
action :run
end
I can then validate it using this:
ruby_block "Print API Key" do
block do
print "\nKey = : " + node["server"]["apikey"] + "\n"
end
action :run
end
However, if I then try to use the node["server"]["apikey"] attribute in a following block like this:
webhooks_request "Get data from TPP" do
uri "127.0.0.1/vedsdk/certificates/retrieve?apikey=#{node["server"]["apikey"]}"
post_data (
{ 'data' => "NsCVcQg4fd"}
)
header_data (
{ 'content-type' => 'application/json', 'auth' => node["server"] ["username"]}
)
expected_response_codes [ 200, 201, 400, 401 ]
action :post
end
The value of node["server"]["apikey"] is always empty. Interestingly, though, the value of the node["server"]["username"] attribute is available and works as expected.
Obviously I'm missing something here but can't work out what :(
Writing it as a generic answer (it will avoid keeping it unanswered in the list too ;))
When inside a resource you may evaluate an attribute value at converge time with lazy attribute evaluation.
The correct usage is:
resource "name" do
  attribute lazy { "any value #{with interpolation} inside" }
end
A common error is to put lazy inside the interpolation, on the reasoning that only the variable needs to be lazily evaluated (and there's only one).
By design, lazy is meant to evaluate the whole attribute value; it can contain Ruby code that computes the value from something done by a previous resource, too.
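Applied to the failing resource from the question, a minimal sketch might look like this (webhooks_request is the question's own custom resource, so this is illustrative rather than tested):
webhooks_request "Get data from TPP" do
  # lazy defers evaluation to converge time, after the "Extract the API Key"
  # ruby_block has set node.override["server"]["apikey"].
  uri lazy { "127.0.0.1/vedsdk/certificates/retrieve?apikey=#{node["server"]["apikey"]}" }
  post_data({ 'data' => "NsCVcQg4fd" })
  header_data({ 'content-type' => 'application/json', 'auth' => node["server"]["username"] })
  expected_response_codes [200, 201, 400, 401]
  action :post
end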

Writing dynamic query results into file

I am trying to write a generic program in Groovy that will get the SQL from a config file, along with other parameters, and write the query results to a file.
Here is the program:
def config = new ConfigSlurper().parse(new File("config.properties").toURL())
Sql sql = Sql.newInstance(config.db.url, config.db.login, config.db.password, config.db.driver)

def fileToWrite = new File(config.copy.location)
def writer = fileToWrite.newWriter()
writer.write(config.file.headers)

sql.eachRow(config.sql) { res ->
    writer.write(config.file.rows)
}
In the config, the sql entry is something like this:
sql="select * from mydb"
and
file.rows="${res.column1}|${res.column2}|${res.column3}\n"
when I run it I get
[:]|[:]|[:]
[:]|[:]|[:]
[:]|[:]|[:]
in the file. If I change
writer.write(config.file.rows)
to
writer.write("${res.column1}|${res.column2}|${res.column3}\n")
then it outputs the actual results. What do I need to do differently to get the results?
You can accomplish this by using lazy evaluation of the GString combined with altering the delegate.
First, make the GString lazy by making the values the results of calling closures:
file.rows="${->res.column1}|${->res.column2}|${->res.column3}"
Then, prior to evaluating, alter the delegate of the closures:
config.file.rows.values.each {
    if (Closure.class.isAssignableFrom(it.getClass())) {
        it.resolveStrategy = Closure.DELEGATE_FIRST
        it.delegate = this
    }
}
The delegate must have the variable res in scope. Here is a full working example:
class Test {
    Map res

    void run() {
        String configText = '''file.rows="${->res.column1}|${->res.column2}|${->res.column3}"
sql="select * from mydb"'''

        def slurper = new ConfigSlurper()
        def config = slurper.parse(configText)

        config.file.rows.values.each {
            if (Closure.class.isAssignableFrom(it.getClass())) {
                it.resolveStrategy = Closure.DELEGATE_FIRST
                it.delegate = this
            }
        }

        def results = [
            [column1: 1, column2: 2, column3: 3],
            [column1: 4, column2: 5, column3: 6],
        ]

        results.each {
            res = it
            println config.file.rows.toString()
        }
    }
}

new Test().run()
The good news is that the ConfigSlurper is more than capable of doing the GString variable substitution for you as intended. The bad news is that it does this substitution when it calls the parse() method, way up above, long before you have a res variable to substitute into the parser. The other bad news is that if the variables being substituted are not defined in the config file itself, then you have to supply them to the slurper in advance, via the binding property.
So, to get the effect you want, you have to re-parse the properties on each pass of eachRow. Does that mean you have to create a new ConfigSlurper and re-read the file once for every row? No. You will have to create a new ConfigObject for each pass, but you can reuse the ConfigSlurper and the file text, as follows:
def slurper = new ConfigSlurper()
def configText = new File("scripts/config.properties").text
def config = slurper.parse(configText)

Sql sql = Sql.newInstance(config.db.url, config.db.login, config.db.password, config.db.driver)

def fileToWrite = new File(config.copy.location)
def writer = fileToWrite.newWriter()
writer.write(config.file.headers)

sql.eachRow(config.sql) { result ->
    slurper.binding = [res: result]
    def reconfig = slurper.parse(configText)
    print(reconfig.file.rows)
}
Please notice that I changed the name of the Closure parameter from res to result. I did this to emphasize that the slurper was drawing the name res from the binding map key, not from the closure parameter name.
If you want to reduce wasted "reparsing" time and effort, you could separate the file.rows property into its own file. I would still read that file's text once and reuse the text in the "per row" parsing.
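As a rough sketch of that idea (the rowformat.properties file name and its single file.rows line are hypothetical), only the small row-template file is reparsed for each row:
import groovy.sql.Sql

def slurper = new ConfigSlurper()

// Static settings (db connection, sql, headers, output path), parsed once.
def config = new ConfigSlurper().parse(new File("config.properties").text)

// Hypothetical small file containing only the row template, e.g.:
// file.rows="${res.column1}|${res.column2}|${res.column3}\n"
def rowTemplateText = new File("rowformat.properties").text

Sql sql = Sql.newInstance(config.db.url, config.db.login, config.db.password, config.db.driver)
def writer = new File(config.copy.location).newWriter()
writer.write(config.file.headers)

sql.eachRow(config.sql) { result ->
    // Re-parse only the tiny template, with the current row bound as "res".
    slurper.binding = [res: result]
    writer.write(slurper.parse(rowTemplateText).file.rows.toString())
}
writer.close()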
