When I use GPT-3's Playground, I often get results formatted with numbered lists and paragraphs, like below:
Here's what the above class is doing:
1. It creates a directory for the log file if it doesn't exist.
2. It checks that the log file is newline-terminated.
3. It writes a newline-terminated JSON object to the log file.
4. It reads the log file and returns a dictionary with the following
- list 1
- list 2
- list 3
- list 4
However, when I use their API directly and extract the response from the JSON result, I get a crammed text version that is very hard to read, something like this:
Here's what the above class is doing:1. It creates a directory for the log file if it doesn't exist.2. It checks that the log file is newline-terminated.3. It writes a newline-terminated JSON object to the log file.4. It reads the log file and returns a dictionary with the following-list 1-list 2-list 3- list4
My question is: how do people preserve the formatting of GPT results so they are displayed in a neater, more readable way?
Option 1: Edits endpoint
If you run test.py below, the OpenAI API will return the input reformatted as a proper numbered list:
test.py
import openai
openai.api_key = 'sk-xxxxxxxxxxxxxxxxxxxx'
response = openai.Edit.create(
    model='text-davinci-edit-001',
    input='I have three items:1. First item.2. Second item.3. Third item.',
    instruction='Make numbered list'
)
content = response['choices'][0]['text']
print(content)
Option 2: Processing
Process the completion you get from the Completions endpoint by yourself (i.e., write Python code).
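A minimal sketch of such post-processing, assuming the completion crams numbered items together exactly as shown in the question:

```python
import re

def restore_numbering(text):
    # Re-insert a newline before each "N. " numbered item that got crammed
    # onto one line. Note: a pattern this simple would also split text like
    # "version 2. 0", so adapt it to the completions you actually get.
    return re.sub(r'(?<=\S)(\d+\.\s)', r'\n\1', text)

crammed = ("Here's what the above class is doing:"
           "1. It creates a directory for the log file if it doesn't exist."
           "2. It checks that the log file is newline-terminated.")
print(restore_numbering(crammed))
```

This prints the text with each numbered item back on its own line.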
Related
I am currently working on obtaining data from a nested JSON response called "Result"
Now, after reviewing the API documentation, they say they only return 100 records per request, which means for 425 records I would have to call requests.get at least 5 times:
/example
/example?$skip=100
/example?$skip=200
/example?$skip=300
/example?$skip=400
After that is done, it should write the response list to a CSV file. I have parsed the response from the GET with json.loads, converted the dictionary to a list, and created a for loop that writes whatever is in the "Result" dictionary.
My question is: how can I make it also loop over the requests.get calls, incrementing the URL's skip value through 100, 200, 300, 400? Hope this makes sense.
After much searching, the approach that worked best for me was:
1. Create a for loop that runs once per page that needs fetching.
2. Compute toSkip = (i+1) * 100.
3. Concatenate the URL: 'url string' + '?$skip=' + str(toSkip).
4. Create the request, passing the authorization header.
5. Parse the response with json.loads.
6. Write the result to a CSV file or to the Google Sheets API.
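The steps above can be sketched roughly as follows (the endpoint, token, and "Result" field names are placeholders from the question; here the first, unskipped request is folded into the same loop):

```python
import csv
import json
from urllib.request import Request, urlopen

def page_urls(base_url, total, page_size=100):
    # One URL per page: the first page has no $skip, the rest skip
    # multiples of page_size (425 records -> 5 pages).
    pages = -(-total // page_size)  # ceiling division
    return [base_url if i == 0 else '%s?$skip=%d' % (base_url, i * page_size)
            for i in range(pages)]

def fetch_all(base_url, total, headers):
    rows = []
    for url in page_urls(base_url, total):
        with urlopen(Request(url, headers=headers)) as response:
            data = json.loads(response.read())  # parse the GET response
        rows.extend(data['Result'])
    return rows

# Hypothetical usage -- host, token, and record count are assumptions:
# rows = fetch_all('https://host/example', 425, {'Authorization': 'Bearer xxx'})
# with open('output.csv', 'w', newline='') as f:
#     writer = csv.DictWriter(f, fieldnames=rows[0].keys())
#     writer.writeheader()
#     writer.writerows(rows)
```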
I store some data in an Excel file that I extract in JSON format. I also fetch some data with GET requests from an API I created. With all these data, I run some tests (does the data in the Excel file equal the data returned by the API?).
In my case, I may need to store in the Excel file the path for selecting the data from the JSON the API returns.
For example, the API returns:
{"countries":
[{"code":"AF","name":"Afghanistan"},
{"code":"AX","name":"Åland Islands"} ...
And in my Excel file, I store:
excelData['countries'][0]['name']
I can retrieve excelData['countries'][0]['name'] in my code just fine, as a string.
Is there a way to convert excelData['countries'][0]['name'] from a string into code that actually points at and gets the data I need from the API JSON?
Here's how I want to use it:
self.assertEqual(str(valueExcel), path)
#path is the string from the excel that tells where to fetch the data from the
# JSON api
I thought the string would be interpreted, but no:
AssertionError: 'AF' != "excelData['countries'][0]['code']"
- AF
+ excelData['countries'][0]['code']
You are looking for the built-in eval function. Try this:
self.assertEqual(str(valueExcel), eval(path))
Important: Keep in mind that eval can be dangerous, since malicious code could be executed. More warnings here: What does Python's eval() do?
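A minimal sketch of the eval approach, with a hypothetical excelData dict standing in for the parsed API response:

```python
# Hypothetical stand-in for the parsed API response from the question.
excelData = {"countries": [{"code": "AF", "name": "Afghanistan"},
                           {"code": "AX", "name": "Åland Islands"}]}

# The path string as it would be stored in the Excel cell.
path = "excelData['countries'][0]['code']"

# eval evaluates the string as a Python expression in the current scope.
print(eval(path))  # -> AF
```

Again, only do this if the spreadsheet contents are fully trusted, since eval will execute whatever expression it is given.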
I am getting a value from an HTTP request and writing it into a CSV file. Each time the program is executed, the new values overwrite the old ones instead of being appended to the CSV. I would like to append the values instead of overwriting them. I am using Regex and XPath extractors to get the values from the HTTP requests and write them to a CSV file.
new File('/Users/ddd/testgui/queueId1.csv').newWriter().withWriter { w ->
w << vars.get('queueid')
}
So this works for me, on groovysh 2.5.3 :
new File('/Users/ddd/testgui/queueId1.csv').newWriter(true).withWriter { w ->
w << vars.get('queueid')
}
The true passed to newWriter enables append mode (append == true).
You can do just:
new File('/Users/ddd/testgui/queueId1.csv') << vars.get('queueid')
Be aware that your code will only work reliably when you have 1 thread; with more, you may suffer from a race condition when 2 threads write to the file simultaneously.
If you're going to execute this code with > 1 virtual user I would rather recommend going for Sample Variables functionality.
If you add the next line to user.properties file:
sample_variables=queueid
and restart JMeter to pick up the property, then the next time you run your test the .jtl results file will have an extra column with the queueid variable's value for each thread/request.
If you want to store it in a separate file, go for the Flexible File Writer.
I'm pretty new to Python, and the overall goal of the project I am working on is to set up a SQLite DB that will allow easy entries in the future for non-programmers (this is for a small group of people who are all technically competent). The way I am trying to accomplish this right now is to have people save their new data entry as a .py file through a simple text editor, and then open that .py file within the function that enters the values into the DB. So far I have:
def newEntry(material=None, param=None, value=None):
    if param == 'density':
        print('The density of %s is %s' % (material, value))

import fileinput
for line in fileinput.input(files=('testEntry.py',)):
    process(line)
Then I have created with a simple text editor a file called testEntry.py that will hopefully be called by newEntry.py when newEntry is executed in the terminal. The idea here is that some user would just have to put in the function name with the arguments they are inputing within the parentheses. testEntry.py is simply:
# Some description for future users
newEntry(material='water', param='density', value='1')
When I run newEntry.py in my terminal nothing happens. Is there some other way to open and execute a .py file within another that I do not know of? Thank you very much for any help.
Your solution works, but as a commenter said, it is very insecure and there are better ways. Presuming your process(...) method is just executing some arbitrary Python code, this could be abused to execute system commands, such as deleting files (very bad).
Instead of using a .py file consisting of a series of newEntry(...) calls, one per line, have your users produce a CSV file with the appropriate column headers, e.g.:
material,param,value
water,density,1
Then parse this CSV file to add new entries:
import csv

with open('entries.csv') as entries:
    csv_reader = csv.reader(entries)
    header = True
    for row in csv_reader:
        if header:  # Skip header
            header = False
            continue
        material = row[0]
        param = row[1]
        value = row[2]
        if param == 'density':
            print('The density of %s is %s' % (material, value))
Your users could use Microsoft Excel, Google Sheets, or any other spreadsheet software that can export .csv files to create/edit these files, and you could provide a template to the users with predefined headers.
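As a small variant, csv.DictReader consumes the header row for you and lets you address columns by name; a sketch, writing a hypothetical entries.csv first so the snippet is self-contained:

```python
import csv

# Create a hypothetical entries.csv matching the template above.
with open('entries.csv', 'w', newline='') as f:
    f.write('material,param,value\nwater,density,1\n')

with open('entries.csv') as entries:
    for row in csv.DictReader(entries):  # first row becomes the field names
        if row['param'] == 'density':
            print('The density of %s is %s' % (row['material'], row['value']))
```

This prints "The density of water is 1", and new columns can be added later without breaking the positional indexing.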
I am running a small web server in Python (CherryPy). I wish to return data in the following format:
20100701,1.5127
20100702,1.5184
20100705,1.51075
So at the moment, to test the output, my Python handler just returns:
return """20100701,1.5127
20100702,1.5184
20100705,1.51075
"""
When I request the URL from my other end (the one supposed to consume the data, parsing it line by line), I get the following output:
20100701,1.5127 20100702,1.5184 20100705,1.51075
The line feeds have been replaced by spaces... I guess this might be because my server says it is sending HTML, so the client collapses my line feeds as whitespace...
Set the content type of the response to text/plain.
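In CherryPy that means setting cherrypy.response.headers['Content-Type'] = 'text/plain' inside the handler. Since CherryPy may not be installed everywhere, here is the same idea sketched with the standard library's http.server; the handler and port are assumptions for illustration:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class DataHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"20100701,1.5127\n20100702,1.5184\n20100705,1.51075\n"
        self.send_response(200)
        self.send_header('Content-Type', 'text/plain')  # keeps line feeds intact
        self.send_header('Content-Length', str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

# Serve on an ephemeral port and fetch the data back to show the header.
server = HTTPServer(('127.0.0.1', 0), DataHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = 'http://127.0.0.1:%d/' % server.server_address[1]
with urllib.request.urlopen(url) as resp:
    print(resp.headers['Content-Type'])          # text/plain
    print(resp.read().decode().splitlines()[0])  # 20100701,1.5127
server.shutdown()
```

With text/plain declared, a consuming client sees the three lines exactly as sent.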