Using request_mock to dynamically set response based on request - python-3.x

I am trying to mock a simple POST request that creates a resource from the request body, and returns the resource that was created. For simplicity, let's assume the created resource is exactly as passed in, but given an ID when created. Here is my code:
def test_create_resource(requests_mock):
    # Helper function to generate a dynamic response
    def get_response(request, context):
        context.status_code = 201
        # I assumed this would contain the request body
        response = request.json()
        response['id'] = 100
        return response

    # Mock the response
    requests_mock.post('test-url/resource', json=get_response)
    resource = function_that_creates_resource()
    assert resource['id'] == 100
I end up with the runtime error JSONDecodeError('Expecting value: line 1 column 1 (char 0)'). I assume this is because request.json() does not contain what I am looking for. How can I access the request body?

I had to hack up your example a little bit as there is some information missing, but the basic idea works fine for me. I think, as mentioned, something is wrong with the way you're creating the POST request.
import requests
import requests_mock
with requests_mock.mock() as mock:
    # Helper function to generate a dynamic response
    def get_response(request, context):
        context.status_code = 201
        # I assumed this would contain the request body
        response = request.json()
        response['id'] = 100
        return response

    # Mock the response
    mock.post('http://example.com/test-url/resource', json=get_response)
    # resource = function_that_creates_resource()
    resp = requests.post('http://example.com/test-url/resource', json={'a': 1})
    assert resp.json()['id'] == 100

This example is not complete and so we cannot truly see what is happening.
In particular, it would be useful to see a sample function_that_creates_resource.
That said, I think your get_response code is valid.
I believe that you are not sending valid JSON data in your post request in function_that_creates_resource.
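If that guess is right, the failure can be reproduced with nothing but the standard library. A sketch (the payload `{'a': 1}` is just an illustration): passing a dict via requests' `data=` parameter form-encodes the body, and a form-encoded body is exactly the kind of input that raises the JSONDecodeError from the question:

```python
import json
from urllib.parse import urlencode

payload = {'a': 1}

# What requests sends with json=payload: a JSON document.
json_body = json.dumps(payload)
print(json.loads(json_body))  # {'a': 1} -- request.json() would parse this fine

# What requests sends with data=payload: form encoding, not JSON.
form_body = urlencode(payload)
print(form_body)  # a=1
try:
    json.loads(form_body)
except json.JSONDecodeError as exc:
    print(exc)  # Expecting value: line 1 column 1 (char 0) -- the same error
```

So if `function_that_creates_resource` uses `data={...}` (or sends no body at all), switching to `json={...}` should make `request.json()` work inside the callback.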

Related

requests_mock multiple responses for single url

I am trying to use requests_mock in a unit test to test API calls that run in a while loop; to end the while loop I need to send a different response to each call. The URL stays the same and only the params change, but requests_mock doesn't really care about that.
My function goes like this:
def func():
    response = requests.get(
        url=<url>,
        params={"limit": 1000},
        headers=<headers>,
    ).json()
    while "next" in response["info"].keys():
        response = requests.get(
            url=<url>,
            params={"limit": 1000, "info": response["info"]},
            headers=<headers>,
        ).json()
My test looks like:
def test_url(requests_mock):
    requests_mock.get(url, json=<response_with_info_key>)
    func()
    data_request = requests_mock.request_history[0]
    assert data_request.query == "limit=1000"
What I want is for my while loop to end when the second response comes back without the "next" key. What I have already tried:
def test_url(requests_mock):
    requests_mock.get(url, json=[<response_with_info_key>, <response_without_info_key>])
    func()
    data_request = requests_mock.request_history[0]
    assert data_request.query == "limit=1000"
The simplest way to phrase the whole question would be: how do I make requests-mock send two different responses for the same API?

How to mock an API that returns a file

I have a function (api_upload) that accepts a file and HTTP headers; it sends the file along with the HTTP headers to an API service (example.com/upload), which processes the file and returns a URL of the processed file.
I am trying to write a unit test for this function (api_upload), and I am feeling a bit lost since this is my first time playing with unit tests. I have run pip install requests-mock and pip install pytest-mock. I guess my question is: how do I mock example.com/upload? I don't need to actually call example.com/upload in my testing, but I need to test that the function api_upload is "working" as expected.
def api_upload(input_file, headers):
    # convert the input_file into a dictionary, since the requests library wants a dictionary
    input_file = {'file': input_file}
    url = 'https://example.com/upload'
    files = {'file': open(input_file['file'], 'rb')}
    response = requests.post(url, headers=headers, files=files)
    saved_file_url = response.json()['url']
    return saved_file_url
What I tried is:
def test_api_upload(requests_mock):
    requests_mock.get("https://example.com/upload", text='What do I put here?')
    # I am not sure what else to write here?
The first thing you need to alter is the (HTTP) method you call. api_upload() does a POST request so that needs to be matched by requests_mock call.
The second thing is that you need to mock the response that the mock would return for your requests.post() call. api_upload() apparently expects JSON that consists of an object with url as one of the attributes. That's what you need to return then.
As to what you should do inside your test function, I'd say:
Mock the POST call and the expected response.
Call api_upload() and give it a mock file (see tmp_path that returns an instance of pathlib.Path).
Assert that the returned URL is what you mocked.
An untested example:
def test_api_upload(requests_mock, tmp_path):
    file = tmp_path / "a.txt"
    file.write_text("aaa")
    mocked_response = {"url": "https://example.com/file/123456789"}
    requests_mock.post("https://example.com/upload", json=mocked_response)
    assert api_upload(input_file=str(file), headers={}) == mocked_response["url"]

Mocking service objects in FastAPI controllers using pytest.mock and patch

I'm working on a FastAPI application where I need to test the controller behavior without actually triggering the internal service call. My goal is to implement something similar to an instance_double in RSpec.
My controller code looks like this:
# module_name.api.routers.v1.py
@router.post("/notes")
async def create(note: Note):
    s3_persister = S3Persister()
    kafka_persister = KafkaPersister()
    service = Persist.create(
        note=note,
        persisters=[s3_persister, kafka_persister]
    )
    await service.invoke()
    return JSONResponse(
        status_code=status.HTTP_200_OK,
        content=success_response(payload=jsonable_encoder(service.note)),
    )
The corresponding unit test that I have managed to get working, in a non-Pythonic way, is:
@pytest.fixture(scope='session')
def client():
    client = TestClient(app)
    yield client  # testing happens here

def test_create_success_response(client):
    """
    When the required parameters are passed
    it should return a success response
    """
    # copies some sample request body data
    data = payload.copy()
    # creates a note object using Pydantic models
    valid_note = Note(**valid_note_payload)
    # creates a service object with no persisters
    persist_svc = Persist(note=valid_note, persisters=[], event_publisher_version='1.0')
    # mock the method that I intend to avoid calling in the controller flow
    persist_svc.invoke = AsyncMock()
    # an object that will reflect on the svc_object post processing
    persist_svc.note = valid_note
    # create a mock of the factory method so it returns my modified object
    Persist.create = Mock(return_value=persist_svc)

    response = client.post('/api/v1/notes',
                           headers=default_headers,
                           data=json.dumps(data))

    assert persist_svc.invoke.called
    assert response.status_code == 200
While this works, I am unable to replicate it using the recommended @patch decorator.
I notice that the patched instance returned by the decorator isn't called, and FastAPI ends up calling the actual service methods.
I was forced to add a create class method to the Persist service purely because I couldn't obtain the same result with the default constructor.
@patch('module_name.api.services.persist.Persist')
def test_create_success_response(client, mockPersist):
    data = payload.copy()
    # I expect `persistInstance` to behave like an instance variable
    persistInstance = mockPersist.return_value
    valid_note = Note(**valid_note_payload)
    # I attempt to mock the instance method
    persistInstance.invoke = AsyncMock(return_value=valid_note)
    persistInstance.note.return_value = valid_note

    response = client.post('/api/v1/notes',
                           headers=default_headers,
                           data=json.dumps(data))

    assert persistInstance.invoke.called
    assert response.status_code == 200
The idea here is that I don't want to create another controller entirely, as recommended by the FastAPI docs, but instead mock the service method call alone.
I would like help on how to write this unit test in the recommended Pythonic way.
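One likely culprit (a guess, since the import layout isn't shown): mock.patch must target the name where it is *looked up*, not where it is defined. If the router module does `from module_name.api.services.persist import Persist`, then patching `'module_name.api.services.persist.Persist'` rebinds the name only in the services module; the router keeps its own reference to the real class, so the patch is never hit. The target should then be the router module, e.g. `'module_name.api.routers.v1.Persist'`. A stdlib-only sketch of the effect, with made-up module names standing in for the real layout:

```python
import sys
import types
from unittest.mock import patch

# Simulate `services.py` defining Persist...
services = types.ModuleType("services")

class Persist:
    @classmethod
    def create(cls, **kwargs):
        raise RuntimeError("real service called")

services.Persist = Persist
sys.modules["services"] = services

# ...and `routers.py` doing `from services import Persist`:
routers = types.ModuleType("routers")
routers.Persist = services.Persist
sys.modules["routers"] = routers

def handler():
    # Stands in for the FastAPI route body, which uses the router's own name.
    return sys.modules["routers"].Persist.create()

# Patching where the class is *defined* does not help: the router module
# still holds a reference to the real class.
with patch("services.Persist"):
    try:
        handler()
    except RuntimeError as exc:
        print(exc)  # real service called

# Patching where the class is *used* works:
with patch("routers.Persist") as mock_persist:
    mock_persist.create.return_value = "mocked service"
    print(handler())  # mocked service
```

Since the route does `await service.invoke()`, the mocked instance's invoke must still be an AsyncMock (a plain MagicMock is not awaitable), as in the working version of the test.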

Scrapy does not respect LIFO

I use Scrapy 1.5.1
My goal is to go through the entire chain of requests for each variable before moving to the next variable. For some reason Scrapy takes 2 variables, then sends 2 requests, then takes another 2 variables, and so on.
CONCURRENT_REQUESTS = 1
Here is my code sample:
def parsed(self, response):
    # inspect_response(response, self)
    search = response.meta['search']
    for idx, i in enumerate(response.xpath("//table[@id='ctl00_ContentPlaceHolder1_GridView1']/tr")[1:]):
        __EVENTARGUMENT = 'Select${}'.format(idx)
        data = {
            '__EVENTARGUMENT': __EVENTARGUMENT,
        }
        yield scrapy.Request(response.url, method='POST', headers=self.headers,
                             body=urlencode(data), callback=self.res_before_get,
                             meta={'search': search}, dont_filter=True)

def res_before_get(self, response):
    # inspect_response(response, self)
    url = 'http://www.moj-yemen.net/Search_detels.aspx'
    yield scrapy.Request(url, callback=self.results, dont_filter=True)
My desired behavior is: one value from parsed is sent to res_before_get, and then I do something with it; then another value from parsed is sent to res_before_get, and so on:
POST
GET
POST
GET
But currently Scrapy takes 2 values from parsed and adds them to the queue, then sends 2 requests from res_before_get. Thus I'm getting duplicate results:
POST
POST
GET
GET
What am I missing?
P.S.
This is an ASP.NET site. Its logic is as follows:
Make a POST request with the search payload.
Make a GET request to get the actual data.
Both requests share the same session ID, which is why it is important to preserve the order.
At the moment I'm getting POST1 and then POST2, and since the session ID is associated with POST2, both GET1 and GET2 return the same page.
Scrapy works asynchronously, so you cannot expect it to respect the order of your loops or anything.
If you need it to work sequentially, you'll have to accommodate the callbacks to work like that, for example:
def parse1(self, response):
    ...
    yield Request(..., callback=self.parse2, meta={...(necessary information)...})

def parse2(self, response):
    ...
    if (necessary information):
        yield Request(...,
                      callback=self.parse2,
                      meta={...(remaining necessary information)...},
                      )
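The pattern can be made concrete by carrying the queue of remaining rows in meta: each callback handles one row, and only once its own GET has been issued does it schedule the POST for the next row. A stdlib-only simulation (no Scrapy; the tiny crawl driver just stands in for the scheduler, and all names are illustrative):

```python
def parse(rows):
    # Instead of yielding one request per row (which lets the scheduler
    # interleave them), hand the whole queue to the first callback.
    yield ("POST", rows[0], {"pending": rows[1:]})

def res_before_get(row, meta):
    # Issue the GET for this row first...
    yield ("GET", row, meta)
    # ...and only then schedule the POST for the next row.
    if meta["pending"]:
        yield ("POST", meta["pending"][0], {"pending": meta["pending"][1:]})

def crawl(rows):
    # Tiny driver standing in for Scrapy's scheduler.
    queue = list(parse(rows))
    order = []
    while queue:
        method, row, meta = queue.pop(0)
        order.append(f"{method}{row}")
        if method == "POST":
            queue.extend(res_before_get(row, meta))
    return order

print(crawl([1, 2]))  # ['POST1', 'GET1', 'POST2', 'GET2']
```

In real Scrapy code, the list of remaining rows goes into `meta` and the GET callback yields the next POST itself, which guarantees each POST/GET pair completes before the next one starts regardless of scheduler order.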

How to Validate a Xero webhook payload with HMACSHA256 python 3

Based on the instructions here (https://developer.xero.com/documentation/webhooks/configuring-your-server) for setting up and validating the intent to receive for the Xero webhook, the computed signature should match the signature in the header for a correctly signed payload.
But, using Python 3, the computed signature doesn't match the signature in the header at all. Xero sends numerous requests to the subscribing webhook URL, both correctly and incorrectly signed; in my log, all of those requests returned 401. Below is my test code, which also proved not to match. I don't know what is missing or what I did wrong.
Don't worry about the key being shown here: I have since generated another key, but this was the key assigned to me for hashing at this point.
Based on their instructions, running this code should make the signature match one of the headers, but it's not even close.
XERO_KEY = "lyXWmXrha5MqWWzMzuX8q7aREr/sCWyhN8qVgrW09OzaqJvzd1PYsDAmm7Au+oeR5AhlpHYalba81hrSTBeKAw=="

def create_sha256_signature(key, message):
    message = bytes(message, 'utf-8')
    return base64.b64encode(hmac.new(key.encode(), message,
                                     digestmod=hashlib.sha256).digest()).decode()

# first request header (possibly the incorrect one)
header = "onoTrUNvGHG6dnaBv+JBJxFod/Vp0m0Dd/B6atdoKpM="
# second request header (possibly the correct one)
header = "onoTrUNvGHG6dnaBv+JBJxFodKVp0m0Dd/B6atdoKpM="

payload = {
    'events': [],
    'firstEventSequence': 0,
    'lastEventSequence': 0,
    'entropy': 'YSXCMKAQBJOEMGUZEPFZ'
}
payload = json.dumps(payload, separators=(",", ":")).strip()

signature = create_sha256_signature(XERO_KEY, str(payload))

if hmac.compare_digest(header, signature):
    print(True)
    return 200
else:
    print(False)
    return 401
The problem was that when I was receiving the request payload, I was using:
# flask request
request.get_json()
This automatically parses the request data into JSON, which is why the calculated signature never matched.
So I changed the way I receive the request payload to:
request.get_data()
This will get the raw data.
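Why parsing and re-serializing breaks the signature can be shown with the standard library alone. In this sketch the key and body are made-up demo values, not real Xero data; the point is that the HMAC is computed over exact bytes, and any re-serialization that changes whitespace or separators produces different bytes:

```python
import base64
import hashlib
import hmac
import json

# Demo values only: a made-up base64 key and a raw body as it might arrive
# on the wire. Real Xero payloads and keys will differ.
key = base64.b64encode(b"demo-signing-key").decode()
raw_body = b'{"events": [], "firstEventSequence": 0}'  # note the spaces

def sign(body: bytes) -> str:
    digest = hmac.new(key.encode(), body, hashlib.sha256).digest()
    return base64.b64encode(digest).decode()

provided = sign(raw_body)  # what the sender computed over the raw bytes

# Verifying against the raw bytes matches:
print(hmac.compare_digest(provided, sign(raw_body)))  # True

# Parsing and re-serializing changes the bytes (separators, whitespace),
# so the signature no longer matches:
reserialized = json.dumps(json.loads(raw_body), separators=(",", ":")).encode()
print(hmac.compare_digest(provided, sign(reserialized)))  # False
```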
I still could not get this to work even with the OP's answer, which I found a little vague.
The method I found which worked was:
key = #{your key}
provided_signature = request.headers.get('X-Xero-Signature')
hashed = hmac.new(bytes(key, 'utf8'), request.data, hashlib.sha256)
generated_signature = base64.b64encode(hashed.digest()).decode('utf-8')
if provided_signature != generated_signature:
    return '', 401
else:
    return '', 200
found on https://github.com/davidvartanian/xero_api_test/blob/master/webhook_server.py#L34
