1) My response body comes in JSON format.
2) For some reason, I have changed that JSON body to plain text using the code below, and it works as expected:
import groovy.json.*
String js = vars.get("cAccountDetails")
def data = new JsonSlurper().parseText(js)
log.info("the value is "+ data)
vars.putObject('data', data)
3) This code converts the JSON and stores the result in a variable, namely "data".
4) So my response is stored in the "data" variable.
5) From "data", how can I extract **specific data** using Groovy code or some other code?
import java.util.regex.*
import java.util.regex.Matcher
import java.util.regex.Pattern
def matches = (data =~ '{accountDetails=\\[(.*)\\],')
vars.putObject('matches', matches)
The above code is used for correlation purposes (the "matches" variable should store the extracted value),
but the above code is not working. How can I fix this issue?
Thanks in advance!!
We cannot help you unless you share your cAccountDetails variable value and indicate what you need to extract from it.
At first glance, your regular expression should look a little bit different, i.e.
def matches = (data =~ /accountDetails=[(.*)],/)
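Alternatively, since cAccountDetails already holds JSON, you can skip regular expressions and read the value straight from the parsed object. A minimal sketch for a JSR223 element, assuming the payload looks like {"accountDetails": [...]} and that you want the whole array (the field name is an assumption, adjust it to your actual response):
import groovy.json.JsonSlurper
import groovy.json.JsonBuilder

// parse the stored response; assumes cAccountDetails contains valid JSON with an "accountDetails" array
def json = new JsonSlurper().parseText(vars.get('cAccountDetails'))
def accountDetails = json.accountDetails

log.info('Extracted accountDetails: ' + accountDetails)
// store the extracted value for use in later samplers, e.g. referenced as ${matches}
vars.put('matches', new JsonBuilder(accountDetails).toString())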
More information:
Apache Groovy - Find Operator
Apache Groovy - Why and How You Should Use It
In Groovy, I am getting the below output in a List. I am using the JMeter JSR223 PostProcessor for the script. My List prints the below data as the result:
def a = [{Zip=36448, CountryID=2}]
I want to fetch only the values (36448 and 2) from this List and not the keys. How can I do that?
For a simple single-instance fetch, do this:
def zip = a.first().Zip
def countryId = a.first().CountryID
Seems pretty straightforward if those are the only known values that you want.
If you want all the Zips and CountryIDs, then you can do this:
def zips = a*.Zip
def countryIds = a*.CountryID
That will return two Lists using the spread operator: one with all the Zips and one with all the CountryIDs.
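For example, run in a JSR223 element against a list shaped like the one in the question (the sample values below are made up), the spread operator gives:
// hypothetical sample data mirroring the structure shown in the question
def a = [[Zip: '36448', CountryID: '2'], [Zip: '10001', CountryID: '5']]

def zips = a*.Zip                 // ['36448', '10001']
def countryIds = a*.CountryID     // ['2', '5']

log.info('Zips: ' + zips + ', CountryIDs: ' + countryIds)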
I don't know what the data structure inside your list is; your code is not valid Groovy code.
For a Map it would be something like:
a[0].collect {it -> it.value}
More information on Groovy scripting in JMeter: Apache Groovy - Why and How You Should Use It
I have a simple Python app and I'm trying to combine a bunch of output messages to standardize the output to the user. I've created a properties file for this, and it looks similar to the following:
[migration_prepare]
console=The migration prepare phase failed in {stage_name} with error {error}!
email=The migration prepare phase failed while in {stage_name}. Contact support!
slack=The **_prepare_** phase of the migration failed
I created a method to handle fetching messages from a Properties file... similar to:
from configparser import ConfigParser, NoOptionError, NoSectionError

def get_msg(category, message_key, prop_file_location="messages.properties"):
    """Get a string from a properties file that is utilized similar to a dictionary and used in
    subsequent messaging between console, slack and email communications."""
    message = None
    config = ConfigParser()
    try:
        dataset = config.read(prop_file_location)
        if len(dataset) == 0:
            raise ValueError("failed to find property file")
        message = config.get(category, message_key).replace('\\n', '\n')  # if it contains newline characters, i.e. \n
    except NoOptionError as no:
        print(f"Bad option for value {message_key}")
        print(f"{no}")
    except NoSectionError as ns:
        print(f"There is no section in the properties file {prop_file_location} that contains category {category}!")
        print(f"{ns}")
    return f"{message}"
The method returns the f-string fine to the calling class. My question is: if the string in my properties file contains text such as {some_value} that is intended to be interpolated in the calling class using an f-string with curly brackets, why does it return a string literal? The output is the literal text, not the interpolated value I expect:
What I get: The migration prepare phase failed while in {stage_name} stage. Contact support!
What I would like: The migration prepare phase failed while in Reconciliation stage. Contact support!
I would like the output from the method to return the interpolated value. Has anyone done anything like this?
I am not sure where you define your stage_name, but in order to interpolate in a config file you need to use ${stage_name}.
Interpolation in f-strings and in ConfigParser files is not the same.
Update: added 2 usage examples:
# ${} option using ExtendedInterpolation
from configparser import ConfigParser, ExtendedInterpolation
parser = ConfigParser(interpolation=ExtendedInterpolation())
parser.read_string('[example]\n'
'x=1\n'
'y=${x}')
print(parser['example']['y']) # y = '1'
# another option - %()s
from configparser import ConfigParser, ExtendedInterpolation
parser = ConfigParser()
parser.read_string('[example]\n'
'x=1\n'
'y=%(x)s')
print(parser['example']['y']) # y = '1'
I have a getuserdetails API where I need to send the purchased items in an array.
API: api/getuserdetails/${id}
Method: post
body: {"uname":{
"purchaseitem": ["121","11","4","12345"]
}
}
My JMeter setup looks like:
>>TP_1
>>CSVDataSet_ids
>>CSVDataSet_purchaseitem
>> ThreadGroup1
>>HTTP Req
>>JSR223 PreProcessor
>>View result tree
The purchaseitem CSV has values like:
1231
12121
312232
13
1
42435
133
I want to pass the "purchaseitem" array values fetched from the CSV in a comma-separated manner, as shown in the body.
I've tried something like this in the JSR223 PreProcessor:
def list = ${purchaseitem}
for (i=0;i<=list.size()-1;i=i+10)
{
def mylist = list.subList(i,i+10);
props.put ("mylist"+i,mylist)
}
Can someone please help? Or is there any function to solve this simple problem?
You're doing something weird:
You won't be able to use the CSV Data Set Config because it reads the next value with the next virtual user/iteration.
Your code doesn't make sense at all.
If you need to send all the values from the CSV file at once, you should rather use the JsonBuilder class. Example code which generates the JSON and prints it to the jmeter.log file:
def payload = [:]
def purchasedtem = [:]
purchasedtem.put('purchaseitem', new File('/path/to/your/purchase.csv').readLines().collect())
payload.put('uname', purchasedtem)
log.info(new groovy.json.JsonBuilder(payload).toPrettyString())
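If you then want to actually send that JSON in the HTTP Request, one option (a sketch, assuming the same JSR223 PreProcessor and CSV path as above) is to store it in a JMeter variable and reference it as ${payload} in the request's Body Data:
def payload = [:]
def purchasedtem = [:]
// read every line of the CSV into the purchaseitem array (the path is an assumption, point it to your file)
purchasedtem.put('purchaseitem', new File('/path/to/your/purchase.csv').readLines().collect())
payload.put('uname', purchasedtem)
// store the JSON in a JMeter variable so the HTTP Request body can use ${payload}
vars.put('payload', new groovy.json.JsonBuilder(payload).toString())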
More information:
Apache Groovy - Parsing and Producing JSON
Apache Groovy - Why and How You Should Use It
The function below opens and loads a JSON file. My question is: what's the best approach to test it?
import os
import json

def read_file_data(filename, path):
    os.chdir(path)
    with open(filename, encoding="utf8") as data_file:
        json_data = json.load(data_file)
    return json_data
filename and path are passed in as sys.argv's.
I figured that I would need sample data in my test case for a start, but I'm not sure how I would actually use it to test the function:
class TestMyFunctions(unittest.TestCase):

    def test_read_file_data(self):
        sample_json = {
            'name': 'John',
            'shares': 100,
            'price': 1230.23
        }
Any pointers will be appreciated.
As stated above, you do not need to re-test that the standard Python library code works correctly, so by creating a hard-coded file, as also stated, you are defeating the point of a unit test by testing outside of your unit of code.
Instead, a correct approach would be to mock the opening of the file using Python's mocking framework, and thus test that your function returns the JSON that is read in correctly.
e.g.
import unittest
import json
from unittest.mock import patch, mock_open

class TestMyFunctions(unittest.TestCase):

    # patch os.chdir as well so the hard-coded path does not have to exist
    @patch("os.chdir")
    @patch("builtins.open", new_callable=mock_open,
           read_data=json.dumps({'name': 'John', 'shares': 100, 'price': 1230.23}))
    def test_read_file_data(self, mock_file, mock_chdir):
        expected_output = {
            'name': 'John',
            'shares': 100,
            'price': 1230.23
        }
        filename = 'example.json'
        actual_output = read_file_data(filename, 'example/path')

        # Assert filename is the file that is opened (open is also called with the encoding)
        mock_file.assert_called_with(filename, encoding="utf8")
        self.assertEqual(expected_output, actual_output)
I think what you would want to do is make the JSON file, hard-code an in-memory version of that JSON file, and assert equals between the two.
Based on your code:
class TestMyFunctions(unittest.TestCase):

    def test_read_file_data(self):
        sample_json = {
            'name': 'John',
            'shares': 100,
            'price': 1230.23
        }
        # read_file_data() returns the parsed JSON as a dict, so compare against the dict directly
        path = '/path/to/file'
        filename = 'testcase.json'
        self.assertEqual(read_file_data(filename, path), sample_json)
Check these answers first:
How to use mock_open with json.load()?
Some of the answers there use json.dump instead of json.dumps interchangeably, which is wrong.
read, open and load, as stated, are from the standard Python library and have already been tested, so you'd rather think about testing some actual column values/types or some rows in your JSON file instead. If you did that, this would not be a unit test anymore; it would be an integration test, since you have dependencies coupled to your method (the JSON data in this case), and it would be meaningless to purposely decouple them via mocking.
I'm trying to pass an argument inside a function but with no success.
The purpose of this function is to return an XML tag.
This code doesn't work:
from bs4 import BeautifulSoup

def xmlTag(message):
    conf = open('timeLimit.conf').read().lower()
    for config in conf.splitlines():
        if config in conf.splitlines():
            data = BeautifulSoup(conf, "lxml")
            tag = data.message
            print(tag['msg'])
            break

xmlTag("fun2")
If I put fun2 instead of the "message" variable, like this: "tag = data.fun2", the code works.
Please help: what am I doing wrong?
Try doing:
...
tag = getattr(data, message)
...
getattr is the way of retrieving an attribute from an object when you have its name in a variable.
(Though your code has some other issues as well; that break statement, where it is, ensures your loop will terminate on the first iteration, for example.)