Error integrating matcher into Logstash grok config

I am using Logstash 7.6.2. My log lines are JSON strings. Each JSON object has three fields: "msg" (text), "topic" (text), and "ts" (a float).
Here is my matching expression:
{"msg"\s*:\s*(?<msg>".*")\s*,\s*"topic"\s*:\s*(?<topic>".*")\s*,\s*"ts"\s*:\s*(?<ts>[+-]?([0-9]*[.])?[0-9]+)\s*}
Here are two example log lines:
{"msg": "2020-05-01 01:09:06,043 ERROR [luna_messaging.handlers.base] HTTP 400: {\"success\": false}\nTraceback (most recent call last):\n File \"/home/lunalife/luna_messaging/handlers/base.py\", line 238, in wrapper\n yield func(self, *args, **kwargs)\n File \"/home/lunalife/.local/lib/python2.7/site-packages/tornado/gen.py\", line 1015, in run\n value = future.result()\n File \"/home/lunalife/.local/lib/python2.7/site-packages/tornado/concurrent.py\", line 237, in result\n raise_exc_info(self._exc_info)\n File \"/home/lunalife/.local/lib/python2.7/site-packages/tornado/gen.py\", line 1021, in run\n yielded = self.gen.throw(*exc_info)\n File \"/home/lunalife/luna_messaging/handlers/device_status.py\", line 41, in get\n raise tornado.web.HTTPError(400, reason=json.dumps(reason))\nHTTPError: HTTP 400: {\"success\": false}", "topic": "com.walker.prod.luna_messaging.handlers.base", "ts": 1588295346.043578}
{"msg": "2020-05-01 01:09:06,076 ERROR [luna_messaging.handlers.base] HTTP 403: Forbidden\nTraceback (most recent call last):\n File \"/home/lunalife/luna_messaging/handlers/base.py\", line 238, in wrapper\n yield func(self, *args, **kwargs)\n File \"/home/lunalife/.local/lib/python2.7/site-packages/tornado/gen.py\", line 1015, in run\n value = future.result()\n File \"/home/lunalife/.local/lib/python2.7/site-packages/tornado/concurrent.py\", line 237, in result\n raise_exc_info(self._exc_info)\n File \"/home/lunalife/.local/lib/python2.7/site-packages/tornado/gen.py\", line 1024, in run\n yielded = self.gen.send(value)\n File \"/home/lunalife/luna_messaging/handlers/device_status.py\", line 46, in get\n raise tornado.web.HTTPError(403)\nHTTPError: HTTP 403: Forbidden", "topic": "com.walker.prod.luna_messaging.handlers.base", "ts": 1588295346.076928}```
I've tested this with a couple of grok testers (https://grokdebug.herokuapp.com/ and https://grokconstructor.appspot.com/do/match), and both show that the pattern works.
The problem is that when I integrate it into my Logstash configuration, Logstash reports a syntax error. I'm not sure what I am doing wrong.
This is the grok matcher in my logstash configuration:
grok {
match => {"msg"\s*:\s*(?<msg>".*")\s*,\s*"topic"\s*:\s*(?<topic>".*")\s*,\s*"ts"\s*:\s*(?<ts>[+-]?([0-9]*[.])?[0-9]+)\s*}
}
and this is the logstash startup error:
Expected one of [ \\t\\r\\n], \"#\", \"=>\" at line 44, column 21
I believe my matching expression is correct, but I don't know how to add it to the grok config. Any help would be appreciated.

You need to tell the grok filter on which field the pattern matching should be applied.
As you can see in the documentation (https://www.elastic.co/guide/en/logstash/current/plugins-filters-grok.html#plugins-filters-grok-match), the match setting follows this syntax:
grok {
  match => { "FIELDNAME" => "PATTERN" }
}
The default field Logstash puts the log line text into is called message, so you would adjust your code like so:
grok {
  match => { "message" => "PATTERN" }
}
Furthermore, be aware that the pattern must be quoted and that special characters have to be escaped (I haven't done the latter in the example below). Since the pattern itself contains double quotes, you need to wrap it in single quotes, as in the following:
grok {
  match => { 'message' => '{"msg"\s*:\s*(?<msg>".*")\s*,\s*"topic"\s*:\s*(?<topic>".*")\s*,\s*"ts"\s*:\s*(?<ts>[+-]?([0-9]*[.])?[0-9]+)\s*}' }
}
I hope this helps.

Related

InvalidClientTokenId error when trying to access SQS from Celery

When I try to connect to SQS using Celery, I'm not able to. The worker crashes with the following message:
[2022-10-28 14:05:39,237: CRITICAL/MainProcess] Unrecoverable error: ClientError('An error occurred (InvalidClientTokenId) when calling the GetQueueAttributes operation: The security token included in the request is invalid.')
Traceback (most recent call last):
File "/home/aditya/.local/lib/python3.10/site-packages/celery/worker/worker.py", line 203, in start
self.blueprint.start(self)
File "/home/aditya/.local/lib/python3.10/site-packages/celery/bootsteps.py", line 116, in start
step.start(parent)
File "/home/aditya/.local/lib/python3.10/site-packages/celery/bootsteps.py", line 365, in start
return self.obj.start()
File "/home/aditya/.local/lib/python3.10/site-packages/celery/worker/consumer/consumer.py", line 332, in start
blueprint.start(self)
File "/home/aditya/.local/lib/python3.10/site-packages/celery/bootsteps.py", line 116, in start
step.start(parent)
File "/home/aditya/.local/lib/python3.10/site-packages/celery/worker/consumer/tasks.py", line 38, in start
c.task_consumer = c.app.amqp.TaskConsumer(
File "/home/aditya/.local/lib/python3.10/site-packages/celery/app/amqp.py", line 274, in TaskConsumer
return self.Consumer(
File "/home/aditya/.local/lib/python3.10/site-packages/kombu/messaging.py", line 387, in __init__
self.revive(self.channel)
File "/home/aditya/.local/lib/python3.10/site-packages/kombu/messaging.py", line 409, in revive
self.declare()
File "/home/aditya/.local/lib/python3.10/site-packages/kombu/messaging.py", line 422, in declare
queue.declare()
File "/home/aditya/.local/lib/python3.10/site-packages/kombu/entity.py", line 606, in declare
self._create_queue(nowait=nowait, channel=channel)
File "/home/aditya/.local/lib/python3.10/site-packages/kombu/entity.py", line 615, in _create_queue
self.queue_declare(nowait=nowait, passive=False, channel=channel)
File "/home/aditya/.local/lib/python3.10/site-packages/kombu/entity.py", line 643, in queue_declare
ret = channel.queue_declare(
File "/home/aditya/.local/lib/python3.10/site-packages/kombu/transport/virtual/base.py", line 523, in queue_declare
return queue_declare_ok_t(queue, self._size(queue), 0)
File "/home/aditya/.local/lib/python3.10/site-packages/kombu/transport/SQS.py", line 633, in _size
resp = c.get_queue_attributes(
File "/home/aditya/.local/lib/python3.10/site-packages/botocore/client.py", line 514, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/home/aditya/.local/lib/python3.10/site-packages/botocore/client.py", line 938, in _make_api_call
raise error_class(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (InvalidClientTokenId) when calling the GetQueueAttributes operation: The security token included in the request is invalid.
The credentials are correct, and my cloud admin insists that they have the required permissions.
If it helps, this is in my settings.py:
from urllib.parse import quote

CELERY_BROKER_URL = 'sqs://{access_key}:{secret_key}#'.format(
    access_key=quote(env.str("AWS_ACCESS_KEY_ID"), safe=''),
    secret_key=quote(env.str("AWS_SECRET_ACCESS_KEY"), safe=''),
)

CELERY_BROKER_TRANSPORT_OPTIONS = {
    "region": "us-east-2",
    "polling_interval": 60,
    'sdb_persistence': False,
    "predefined_queues": {
        'myq': {
            'url': 'https://sqs.us-east-2.amazonaws.com/164782647287/myq',
            'access_key_id': env.str("AWS_ACCESS_KEY_ID"),
            'secret_access_key': env.str("AWS_SECRET_ACCESS_KEY"),
        },
    }
}

CELERY_TASK_DEFAULT_QUEUE = 'myq'
I'm using the latest version of Celery and associated dependencies:
celery[sqs]==5.2.7

How to pass multiple parameters with the same name in an aiohttp request?

The API I want to receive data from accepts several values for the same parameter name:
https://example.com/method?types=Retail&types=Office&types=Industrial......
I tried to do it like this:
params_api_sale = {
    'types': [
        'Retail', 'Office', 'Industrial',
        'Mixed Use', 'Development Site',
        'Land', 'Special Purpose', 'Other'
    ],
    'sortDirection': 'Descending',
    'sortOrder': 'ActivatedOn'
}

async def get_items():
    async with aiohttp.request(
            method='GET', url=vars_.url_api_sale,
            headers=vars_.headers, params=vars_.params_api_sale) as response:
        ...
But this gives me an error:
Traceback (most recent call last):
File "foo.py", line 85, in <module>
asyncio.get_event_loop().run_until_complete(main())
File "C:\Python37\lib\asyncio\base_events.py", line 583, in run_until_complete
return future.result()
File "foo.py", line 80, in main
await asyncio.create_task(get_items())
File "foo.py", line 44, in get_items
headers=vars_.headers, params=vars_.params_api_sale) as response:
File "D:\Dev\venv\lib\site-packages\aiohttp\client.py", line 1051, in __aenter__
self._resp = await self._coro
File "D:\Dev\venv\lib\site-packages\aiohttp\client.py", line 473, in _request
ssl=ssl, proxy_headers=proxy_headers, traces=traces)
File "D:\Dev\venv\lib\site-packages\aiohttp\client_reqrep.py", line 263, in __init__
url2 = url.with_query(params)
File "D:\Dev\venv\lib\site-packages\yarl\__init__.py", line 922, in with_query
new_query = self._get_str_query(*args, **kwargs)
File "D:\Dev\venv\lib\site-packages\yarl\__init__.py", line 886, in _get_str_query
quoter(k) + "=" + quoter(self._query_var(v)) for k, v in query.items()
File "D:\Dev\venv\lib\site-packages\yarl\__init__.py", line 886, in <genexpr>
quoter(k) + "=" + quoter(self._query_var(v)) for k, v in query.items()
File "D:\Dev\venv\lib\site-packages\yarl\__init__.py", line 864, in _query_var
"of type {}".format(v, type(v))
TypeError: Invalid variable type: value should be str or int, got ['Retail', 'Office', 'Industrial', 'Mixed Use', 'Development Site', 'Land', 'Special Purpose', 'Other'] of type <class 'list'>
I haven't found a documented way to do this. All that comes to mind is generating the URL string with these parameters myself, unless there is a more correct way. For comparison, requests handles this out of the box:
import requests

params = {
    'types': [
        'Retail', 'Office', 'Industrial',
        'Mixed Use', 'Development Site',
        'Land', 'Special Purpose', 'Other'
    ],
    'sortDirection': 'Descending',
    'sortOrder': 'ActivatedOn'
}

r = requests.get("http://www.test.com/", params=params)
print(r.url)
output:
https://www.test.com/?types=Retail&types=Office&types=Industrial&types=Mixed+Use&types=Development+Site&types=Land&types=Special+Purpose&types=Other&sortDirection=Descending&sortOrder=ActivatedOn
For aiohttp you can use a MultiDict, which is also covered in the official documentation:
For sending data with multiple values for the same key MultiDict may be used as well.
As stated in the aiohttp documentation, you can also pass a list of tuples to params, repeating the same key, to achieve your goal:
It is also possible to pass a list of 2-item tuples as parameters; in that case you can specify multiple values for each key:
params = [
    ("key1", "value1"),
    ("key1", "value2"),
    ("key2", "value3"),
]

How do I get $near to work in Flask-mongoalchemy

I have been trying to fetch events from my MongoDB database that are close to the current user's location.
I have tried reformatting my model schema to contain [type: "Point"] and even arranged my longitude and latitude into a list.
I also tried adding "2dsphere" indexes to my model via meta, based on what I saw in the MongoAlchemy documentation.
My model:
class Event(db.Document):
    meta = {
        'indexes': [
            ("*location.coordinates", "2dsphere")
        ]
    }

    user_id = db.StringField()
    uuid = db.StringField()
    name = db.StringField()
    address = db.StringField()
    start_time = db.DateTimeField(required=True, default=datetime.datetime.now())
    end_time = db.DateTimeField(required=True, default=datetime.datetime.now())
    location = db.DictField(db.AnythingField())
This is now my main query code
def get(self):
    latitude = float(request.args.get('lat'))
    longitude = float(request.args.get('long'))
    print(longitude)
    print(latitude)

    event = Event.query.filter({
        "location": {
            "$near": {
                "$geometry": {
                    "type": "Point",
                    "coordinates": [longitude, latitude]
                },
                "$maxDistance": 4000
            }
        }
    }).first()
    print(event)
[2019-01-15 23:20:59,797] ERROR in app: Exception on /v1/event [GET]
Traceback (most recent call last):
File "/home/creative_joe/.local/lib/python3.5/site-packages/flask/app.py", line 1813, in full_dispatch_request
rv = self.dispatch_request()
File "/home/creative_joe/.local/lib/python3.5/site-packages/flask/app.py", line 1799, in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
File "/home/creative_joe/.local/lib/python3.5/site-packages/flask_restplus/api.py", line 325, in wrapper
resp = resource(*args, **kwargs)
File "/home/creative_joe/.local/lib/python3.5/site-packages/flask/views.py", line 88, in view
return self.dispatch_request(*args, **kwargs)
File "/home/creative_joe/.local/lib/python3.5/site-packages/flask_restplus/resource.py", line 44, in dispatch_request
resp = meth(*args, **kwargs)
File "/media/creative_joe/3004586c-9a2d-4cb0-8a5f-d41fe99afc05/home/creative_joe/MonkeyMusic.server/app/views/main.py", line 323, in get
"$maxDistance" : 4000
File "/home/creative_joe/.local/lib/python3.5/site-packages/mongoalchemy/query.py", line 139, in first
for doc in iter(self):
File "/home/creative_joe/.local/lib/python3.5/site-packages/mongoalchemy/query.py", line 412, in next
return self._next_internal()
File "/home/creative_joe/.local/lib/python3.5/site-packages/mongoalchemy/query.py", line 416, in _next_internal
value = next(self.cursor)
File "/home/creative_joe/.local/lib/python3.5/site-packages/mongoalchemy/py3compat.py", line 41, in next
return it.next()
File "/home/creative_joe/.local/lib/python3.5/site-packages/pymongo/cursor.py", line 1189, in next
if len(self.__data) or self._refresh():
File "/home/creative_joe/.local/lib/python3.5/site-packages/pymongo/cursor.py", line 1104, in _refresh
self.__send_message(q)
File "/home/creative_joe/.local/lib/python3.5/site-packages/pymongo/cursor.py", line 982, in __send_message
helpers._check_command_response(first)
File "/home/creative_joe/.local/lib/python3.5/site-packages/pymongo/helpers.py", line 155, in _check_command_response
raise OperationFailure(msg % errmsg, code, response)
pymongo.errors.OperationFailure: error processing query: ns=heroku_c9gg06k0.EventTree: GEONEAR field=location maxdist=4000 isNearSphere=0
Sort: {}
Proj: {}
planner returned error: unable to find index for $geoNear query

With logstash how to combine lines starting with timestamp

Here is my sample log file which I need to parse using Logstash:
2016-12-27 07:54:38.621 8407 ERROR oslo_service.service Traceback (most recent call last):
2016-12-27 07:54:38.621 8407 ERROR oslo_service.service File "/usr/lib/python2.7/dist-packages/oslo_service/service.py", line 680, in run_service
2016-12-27 07:54:38.621 8407 ERROR oslo_service.service service.start()
2016-12-27 07:54:38.621 8407 ERROR oslo_service.service File "/usr/lib/python2.7/dist-packages/nova/service.py", line 428, in start
2016-12-27 07:54:38.621 8407 ERROR oslo_service.service self.binary)
2016-12-27 07:54:38.621 8407 ERROR oslo_service.service File "/usr/lib/python2.7/dist-packages/oslo_versionedobjects/base.py", line 181, in wrapper
Please give me some suggestions on how I can parse logs of this format using the grok and multiline filters, and what pattern I should use.
Thank you in advance!
What if you try something like this using grok and multiline:
input {
  file {
    path => [""]                    # <-- path to your log directory
    start_position => "beginning"
    codec => multiline {
      pattern => "^%{TIMESTAMP_ISO8601}"
      negate => true
      what => previous
    }
  }
}
filter {
  grok {
    patterns_dir => "./patterns"    # <-- the path to the patterns file
    match => ["message", "%{TIMESTAMP_ISO8601:timestamp} %{WORD:level} %{GREEDYDATA:content}"]
  }
}
The above is just a sample; you can adapt it as you wish.
With negate => true and what => previous, the multiline codec appends every line that does not match the timestamp pattern to the previous line. In other words, when Logstash reads a line of input that does not begin with a timestamp, that line is merged with the previously read event.
This SO post might be helpful as well. Hope it helps!
There are two ways to implement this:
Multiline in the Logstash input
This can use multiple threads to parse the log.
input {
  beats {
    port => 5044
    codec => multiline {
      pattern => "^[0-9]{4}-[0-9]{2}-[0-9]{2} [0-9]{2}:[0-9]{2}:[0-9]{2}[\.,][0-9]{3,7} "
      negate => true
      what => "previous"
    }
    congestion_threshold => 40
  }
}
Multiline in the Logstash filter
This can't use multiple threads, but it lets you restrict the multiline handling to special cases in the filter:
filter {
  if [@metadata][beat] =~ "xxxx" {
    multiline {
      pattern => "^%{TIMESTAMP_ISO8601}"
      negate => true
      what => "previous"
    }
  }
}

Google Adwords Traffic Estimator Service and Python

I've downloaded a code sample that looks for particular keywords and pulls some metrics. I've noticed a lot of Google AdWords API examples are compliant with Python 3.x, so I'm wondering if there is an issue with that. See the code sample below:
from googleads import adwords


def main(client):
  # Initialize appropriate service.
  traffic_estimator_service = client.GetService(
      'TrafficEstimatorService', version='v201609')

  # Construct selector object and retrieve traffic estimates.
  keywords = [
      {'text': 'mars cruise', 'matchType': 'BROAD'},
      {'text': 'cheap cruise', 'matchType': 'PHRASE'},
      {'text': 'cruise', 'matchType': 'EXACT'}
  ]
  negative_keywords = [
      {'text': 'moon walk', 'matchType': 'BROAD'}
  ]

  keyword_estimate_requests = []
  for keyword in keywords:
    keyword_estimate_requests.append({
        'keyword': {
            'xsi_type': 'Keyword',
            'matchType': keyword['matchType'],
            'text': keyword['text']
        }
    })

  for keyword in negative_keywords:
    keyword_estimate_requests.append({
        'keyword': {
            'xsi_type': 'Keyword',
            'matchType': keyword['matchType'],
            'text': keyword['text']
        },
        'isNegative': 'true'
    })

  # Create ad group estimate requests.
  adgroup_estimate_requests = [{
      'keywordEstimateRequests': keyword_estimate_requests,
      'maxCpc': {
          'xsi_type': 'Money',
          'microAmount': '1000000'
      }
  }]

  # Create campaign estimate requests.
  campaign_estimate_requests = [{
      'adGroupEstimateRequests': adgroup_estimate_requests,
      'criteria': [
          {
              'xsi_type': 'Location',
              'id': '2840'  # United States.
          },
          {
              'xsi_type': 'Language',
              'id': '1000'  # English.
          }
      ],
  }]

  # Create the selector.
  selector = {
      'campaignEstimateRequests': campaign_estimate_requests,
  }

  # Optional: Request a list of campaign-level estimates segmented by
  # platform.
  selector['platformEstimateRequested'] = True

  # Get traffic estimates.
  estimates = traffic_estimator_service.get(selector)

  campaign_estimate = estimates['campaignEstimates'][0]

  # Display the campaign level estimates segmented by platform.
  if 'platformEstimates' in campaign_estimate:
    platform_template = ('Results for the platform with ID: "%d" and name: '
                         '"%s".')
    for platform_estimate in campaign_estimate['platformEstimates']:
      platform = platform_estimate['platform']
      DisplayEstimate(platform_template % (platform['id'],
                                           platform['platformName']),
                      platform_estimate['minEstimate'],
                      platform_estimate['maxEstimate'])

  # Display the keyword estimates.
  if 'adGroupEstimates' in campaign_estimate:
    ad_group_estimate = campaign_estimate['adGroupEstimates'][0]
    if 'keywordEstimates' in ad_group_estimate:
      keyword_estimates = ad_group_estimate['keywordEstimates']
      keyword_template = ('Results for the keyword with text "%s" and match '
                          'type "%s":')
      keyword_estimates_and_requests = zip(keyword_estimates,
                                           keyword_estimate_requests)
      for keyword_tuple in keyword_estimates_and_requests:
        if keyword_tuple[1].get('isNegative', False):
          continue
        keyword = keyword_tuple[1]['keyword']
        keyword_estimate = keyword_tuple[0]
        DisplayEstimate(keyword_template % (keyword['text'],
                                            keyword['matchType']),
                        keyword_estimate['min'], keyword_estimate['max'])


def _CalculateMean(min_est, max_est):
  if min_est and max_est:
    return (float(min_est) + float(max_est)) / 2.0
  else:
    return None


def _FormatMean(mean):
  if mean:
    return '%.2f' % mean
  else:
    return 'N/A'


def DisplayEstimate(message, min_estimate, max_estimate):
  """Displays mean average cpc, position, clicks, and total cost for estimate.

  Args:
    message: str message to display for the given estimate.
    min_estimate: sudsobject containing a minimum estimate from the
        TrafficEstimatorService response.
    max_estimate: sudsobject containing a maximum estimate from the
        TrafficEstimatorService response.
  """
  # Find the mean of the min and max values.
  mean_avg_cpc = (_CalculateMean(min_estimate['averageCpc']['microAmount'],
                                 max_estimate['averageCpc']['microAmount'])
                  if 'averageCpc' in min_estimate else None)
  mean_avg_pos = (_CalculateMean(min_estimate['averagePosition'],
                                 max_estimate['averagePosition'])
                  if 'averagePosition' in min_estimate else None)
  mean_clicks = _CalculateMean(min_estimate['clicksPerDay'],
                               max_estimate['clicksPerDay'])
  mean_total_cost = _CalculateMean(min_estimate['totalCost']['microAmount'],
                                   max_estimate['totalCost']['microAmount'])

  print(message)
  print('Estimated average CPC: %s' % _FormatMean(mean_avg_cpc))
  print('Estimated ad position: %s' % _FormatMean(mean_avg_pos))
  print('Estimated daily clicks: %s' % _FormatMean(mean_clicks))
  print('Estimated daily cost: %s' % _FormatMean(mean_total_cost))


if __name__ == '__main__':
  # Initialize client object.
  adwords_client = adwords.AdWordsClient.LoadFromStorage()

  main(adwords_client)
Here is the error message:
(Money) not-found
path: "Money", not-found
(Keyword) not-found
path: "Keyword", not-found
(Keyword) not-found
path: "Keyword", not-found
(Keyword) not-found
path: "Keyword", not-found
(Keyword) not-found
path: "Keyword", not-found
(Location) not-found
path: "Location", not-found
(Language) not-found
path: "Language", not-found
<suds.sax.document.Document object at 0x03BF1D10>
Server raised fault in response.
Traceback (most recent call last):
File "C:\Users\sfroese\AppData\Local\Programs\Python\Python35-32\lib\site-packages\suds\transport\http.py", line 82, in send
fp = self.u2open(u2request)
File "C:\Users\sfroese\AppData\Local\Programs\Python\Python35-32\lib\site-packages\suds\transport\http.py", line 132, in u2open
return url.open(u2request, timeout=tm)
File "C:\Users\sfroese\AppData\Local\Programs\Python\Python35-32\lib\urllib\request.py", line 472, in open
response = meth(req, response)
File "C:\Users\sfroese\AppData\Local\Programs\Python\Python35-32\lib\urllib\request.py", line 582, in http_response
'http', request, response, code, msg, hdrs)
File "C:\Users\sfroese\AppData\Local\Programs\Python\Python35-32\lib\urllib\request.py", line 510, in error
return self._call_chain(*args)
File "C:\Users\sfroese\AppData\Local\Programs\Python\Python35-32\lib\urllib\request.py", line 444, in _call_chain
result = func(*args)
File "C:\Users\sfroese\AppData\Local\Programs\Python\Python35-32\lib\urllib\request.py", line 590, in http_error_default
raise HTTPError(req.full_url, code, msg, hdrs, fp)
urllib.error.HTTPError: HTTP Error 500: Internal Server Error
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "C:\Users\sfroese\AppData\Local\Programs\Python\Python35-32\lib\site-packages\suds\client.py", line 613, in send
reply = self.options.transport.send(request)
File "C:\Users\sfroese\AppData\Local\Programs\Python\Python35-32\lib\site-packages\suds\transport\http.py", line 94, in send
raise TransportError(e.msg, e.code, e.fp)
suds.transport.TransportError: Internal Server Error
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "C:\Users\sfroese\AppData\Local\Programs\Python\Python35-32\adwords test - Copy (2).py", line 177, in <module>
main(adwords_client)
File "C:\Users\sfroese\AppData\Local\Programs\Python\Python35-32\adwords test - Copy (2).py", line 95, in main
estimates = traffic_estimator_service.get(selector)
File "C:\Users\sfroese\AppData\Local\Programs\Python\Python35-32\lib\site-packages\googleads\common.py", line 696, in MakeSoapRequest
raise e
File "C:\Users\sfroese\AppData\Local\Programs\Python\Python35-32\lib\site-packages\googleads\common.py", line 692, in MakeSoapRequest
for arg in args])
File "C:\Users\sfroese\AppData\Local\Programs\Python\Python35-32\lib\site-packages\suds\client.py", line 521, in __call__
return client.invoke(args, kwargs)
File "C:\Users\sfroese\AppData\Local\Programs\Python\Python35-32\lib\site-packages\suds\client.py", line 581, in invoke
result = self.send(soapenv)
File "C:\Users\sfroese\AppData\Local\Programs\Python\Python35-32\lib\site-packages\suds\client.py", line 619, in send
description=tostr(e), original_soapenv=original_soapenv)
File "C:\Users\sfroese\AppData\Local\Programs\Python\Python35-32\lib\site-packages\suds\client.py", line 670, in process_reply
raise WebFault(fault, replyroot)
suds.WebFault: Server raised fault: '[AuthenticationError.CLIENT_CUSTOMER_ID_IS_REQUIRED # ; trigger:'<null>']'
You should set client_customer_id in your googleads.yaml file. You can get the client customer ID from your manager account: go to your manager account, add the client, then copy the ID from the upper-right corner of the screen. In googleads.yaml, paste that ID as client_customer_id.
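For reference, a minimal googleads.yaml sketch might look roughly like this; the values are placeholders (it assumes the usual OAuth2 installed-application credentials), with the copied ID filled in as client_customer_id:
adwords:
  developer_token: INSERT_DEVELOPER_TOKEN_HERE
  client_customer_id: 123-456-7890   # placeholder: the ID copied from the client account
  client_id: INSERT_OAUTH2_CLIENT_ID_HERE
  client_secret: INSERT_OAUTH2_CLIENT_SECRET_HERE
  refresh_token: INSERT_REFRESH_TOKEN_HERE
AdWordsClient.LoadFromStorage() will then pick up the customer ID automatically when the sample is run again.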
