Azure Python SDK Service Bus receive message - python-3.x

I am a bit confused about the Azure Python Service Bus SDK.
I have a Service Bus topic and a subscription that listens for specific messages, and I have code to receive those messages, which are then processed by AWS Comprehend.
Following the Microsoft documentation, the basic code to receive a message works and I am able to print it, but when I integrate the same logic with Comprehend it fails.
Here is an example; this is the snippet from the Microsoft documentation:
with servicebus_client:
    # get the Queue Receiver object for the queue
    receiver = servicebus_client.get_queue_receiver(queue_name=QUEUE_NAME, max_wait_time=5)
    with receiver:
        for msg in receiver:
            print("Received: " + str(msg))
            # complete the message so that the message is removed from the queue
            receiver.complete_message(msg)
and the output is this:
{"ModuleId":"123458", "Text":"This is amazing."}
Receive is done.
My first thought was that the received message was a JSON object, so I started writing code to read the data from the JSON output as follows:
servicebus_client = ServiceBusClient.from_connection_string(conn_str=CONNECTION_STR)
with servicebus_client:
    receiver = servicebus_client.get_subscription_receiver(
        topic_name=TOPIC_NAME,
        subscription_name=SUBSCRIPTION_NAME
    )
    with receiver:
        received_msgs = receiver.receive_messages(max_message_count=10, max_wait_time=5)
        for msg in received_msgs:
            # print(str(msg))
            message = json.dumps(msg)
            text = message['Text']
            # passing the text to comprehend
            result_json = json.dumps(comprehend.detect_sentiment(Text=text, LanguageCode='en'), sort_keys=True, indent=4)
            result = json.loads(result_json)  # converting json to python dictionary
            # extracting the sentiment value
            sentiment = result["Sentiment"]
            # extracting the sentiment score
            if sentiment == "POSITIVE":
                value = round(result["SentimentScore"]["Positive"] * 100, 2)
            elif sentiment == "NEGATIVE":
                value = round(result["SentimentScore"]["Negative"] * 100, 2)
            elif sentiment == "NEUTRAL":
                value = round(result["SentimentScore"]["Neutral"] * 100, 2)
            elif sentiment == "MIXED":
                value = round(result["SentimentScore"]["Mixed"] * 100, 2)
            # store the text, sentiment and value in a dictionary and convert it to JSON
            output = {'Text': text, 'Sentiment': sentiment, 'Value': value}
            output_json = json.dumps(output)
            print('Text: ', text, '\nSentiment: ', sentiment, '\nValue: ', value)
            print('In JSON format\n', output_json)
            receiver.complete_message(msg)
print("Receive is done.")
But when I run this I get the following error:
TypeError: Object of type ServiceBusReceivedMessage is not JSON serializable
Has this ever happened to anybody who can help me understand what type of object the Service Bus receiver is actually returning?
Thank you so much, everyone.

Has this ever happened to anybody who can help me understand what type of object the Service Bus receiver is actually returning?
The type of the received message is ServiceBusReceivedMessage, which is derived from ServiceBusMessage. The contents of the message can be fetched from its body property, or as a string via str(msg).
To parse the JSON payload you need json.loads rather than json.dumps, so can you please try something like:
message = json.loads(str(msg))
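Putting that together with the receiver code from the question, a minimal sketch of the fixed loop could look like the following (CONNECTION_STR, TOPIC_NAME, SUBSCRIPTION_NAME and the boto3 comprehend client are assumed to be defined as in the question):
import json

from azure.servicebus import ServiceBusClient

servicebus_client = ServiceBusClient.from_connection_string(conn_str=CONNECTION_STR)
with servicebus_client:
    receiver = servicebus_client.get_subscription_receiver(
        topic_name=TOPIC_NAME,
        subscription_name=SUBSCRIPTION_NAME
    )
    with receiver:
        for msg in receiver.receive_messages(max_message_count=10, max_wait_time=5):
            # str(msg) returns the body as a string, e.g.
            # {"ModuleId":"123458", "Text":"This is amazing."}
            payload = json.loads(str(msg))
            text = payload['Text']
            # hand the text to Comprehend for sentiment analysis
            result = comprehend.detect_sentiment(Text=text, LanguageCode='en')
            print('Text:', text, '\nSentiment:', result['Sentiment'])
            # remove the message from the subscription
            receiver.complete_message(msg)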

Related

Streaming json into bigquery using python

We have code that reads some electricity meter data, which we want to push to BigQuery so that it can be visualized in Data Studio. We tried using a Cloud Function, but it seems the code generates streaming data and the Cloud Function times out, so this may not be the right use case for Cloud Functions.
def test():
    def print_recursive(usage_dict, info, depth=0):
        for gid, device in usage_dict.items():
            for channelnum, channel in device.channels.items():
                name = channel.name
                if name == 'Main':
                    name = info[gid].device_name
                d = datetime.now()
                t = d.strftime("%x") + ' ' + d.strftime("%X")
                print(d.strftime("%x"), d.strftime("%X"))
                res = {'Gid': gid,
                       'ChannelNumber': channelnum[0],
                       'Name': channel.name,
                       'Usage': channel.usage,
                       'unit': 'kwh',
                       'Timestamp': t
                       }
                global resp
                resp = res
                print(resp)
                return resp

    devices = vue.get_devices()
    deviceGids = []
    info = {}
    for device in devices:
        if not device.device_gid in deviceGids:
            deviceGids.append(device.device_gid)
            info[device.device_gid] = device
        else:
            info[device.device_gid].channels += device.channels
    device_usage_dict = vue.get_device_list_usage(deviceGids=deviceGids,
                                                  instant=datetime.utcnow(), scale=Scale.SECOND.value, unit=Unit.KWH.value)
    print_recursive(device_usage_dict, info)
This generates electricity consumption data in real time.
Can anyone suggest which GCP service would be ideal here? Based on my research it seems to be Pub/Sub => BigQuery, but my question is: can we programmatically ingest data into Pub/Sub? If yes, what are the prerequisites?
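For what it's worth, data can be ingested into Pub/Sub programmatically with the google-cloud-pubsub client library; the main prerequisites are an existing topic and credentials with the Pub/Sub Publisher role. A minimal sketch (the project ID, topic name and reading dict below are placeholders, not taken from the code above):
import json

from google.cloud import pubsub_v1

# Placeholder project and topic; the topic must already exist and the
# caller needs the roles/pubsub.publisher permission on it.
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "meter-readings")

def publish_reading(reading):
    # Pub/Sub payloads are bytes, so JSON-encode the reading first.
    data = json.dumps(reading).encode("utf-8")
    future = publisher.publish(topic_path, data)
    print("Published message ID:", future.result())

publish_reading({"Gid": 1234, "Usage": 0.42, "unit": "kwh"})
From Pub/Sub the messages can then be written to BigQuery, for example via a BigQuery subscription or the Pub/Sub-to-BigQuery Dataflow template.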

Input block the subscriber/client of receiving message in Pub/Sub system

I am trying to build a publisher/subscriber system. In simple terms, a publisher can publish messages to all subscribers via a broker. Subscribers support two functionalities: 1) receive a message from the broker, and 2) send a message to the broker entered via input. But there is a problem: subscribers block while waiting for input from stdin, and they only receive the publisher's messages after input, even though no input is needed.
I would like to solve the problem in this direction: while waiting for input from stdin, a subscriber should still be able to receive a message from the publisher. I tried "curses" but failed.
Here is part of my code:
subscriber.py
while True:
    try:
        command = input("Enter a command: ")
        # command = sys.stdin.readline()
        if command == "quit":
            break
        command.strip()
        command_to_broker = process_command(command, sub_id)
        if command_to_broker == error_message:
            print("Command with wrong format!")
            continue
        sock.sendall(bytes(command_to_broker, ENCODING))
        received = str(sock.recv(BUFFER_SIZE), ENCODING)
        print("Received from BROKER: " + received)
    except:
        print("Error/Disconnect")
broker.py (starts a publisher thread)
def publisher_thread(connection, topics_and_subscribers, subscribers_and_ports, subscribers):
    while True:
        data = connection.recv(BUFFER_SIZE)
        print("Command from PUBLISHER {}".format(data.decode(ENCODING)))
        response = 'OK'
        command = data.decode(ENCODING).split(" ", 3)
        pub_id = command[0]
        topic = command[2]
        message = command[3]
        if topic in topics_and_subscribers:
            for s in subscribers:
                print(s.getpeername())
                sum = s.send(bytes(message, ENCODING))
                print(sum)
        if not data:
            break
        connection.sendall(bytes(response, ENCODING))
    connection.close()
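One way to stop the subscriber from blocking on input() is to multiplex stdin and the broker socket with the standard selectors module (on Unix-like systems, where stdin can be registered with a selector). A rough sketch, reusing sock, sub_id, ENCODING, BUFFER_SIZE and process_command from the snippets above:
import selectors
import sys

sel = selectors.DefaultSelector()
sel.register(sys.stdin, selectors.EVENT_READ, data="stdin")
sel.register(sock, selectors.EVENT_READ, data="broker")

while True:
    for key, _ in sel.select():
        if key.data == "stdin":
            # A full line is ready on stdin, so this read does not block.
            command = sys.stdin.readline().strip()
            if command == "quit":
                sys.exit(0)
            sock.sendall(bytes(process_command(command, sub_id), ENCODING))
        else:
            # The broker socket has data; deliver it immediately.
            received = str(sock.recv(BUFFER_SIZE), ENCODING)
            print("Received from BROKER: " + received)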

How to count messages from a subscription utilizing streaming pull with GCP Pub/Sub?

I am confused about the GCP Pub/Sub REST API.
Background: I am trying to count the number of messages in a Pub/Sub subscription, but I cannot iterate through the message object from the streaming pull.
Therefore, I will need to rely upon the REST API provided: https://cloud.google.com/pubsub/docs/pull#asynchronous-pull
Based on my understanding of the REST API, it currently pulls messages as expected, but when I try to iterate over them in a loop, the stack trace shows that the Message object is not iterable.
What I have tried: with my current implementation, only one message is repeated for each company involved.
company_name = {}
if len(message) == 0:
    logging.warning('Nothing pulled from pubsub')
else:
    logging.info('Pulled %s messages from pubsub' % str(len(message.data)))
    for msg in message:
        if msg.attributes in message:
            agency_name[message.attributes['company_name']] = 1
        else:
            agency_name[message.attributes['company_name']] += 1
        message.ack()
What is the best way of achieving this?
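As a side note (not from the original thread): if the goal is simply to tally messages per company as they arrive, a streaming pull callback feeding a Counter is one option. A rough sketch with placeholder project and subscription names:
from collections import Counter
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
# Placeholder names; replace with your own project and subscription.
subscription_path = subscriber.subscription_path("my-project", "my-subscription")
counts = Counter()

def callback(message):
    # Tally one message per company_name attribute, then ack it.
    counts[message.attributes.get("company_name", "unknown")] += 1
    message.ack()

streaming_pull_future = subscriber.subscribe(subscription_path, callback=callback)
with subscriber:
    try:
        # Pull for 30 seconds, then stop and report the tallies.
        streaming_pull_future.result(timeout=30)
    except TimeoutError:
        streaming_pull_future.cancel()
        streaming_pull_future.result()

print(dict(counts))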
In addition to what @guillaume said, you can check this GCP documentation for reading time-series data using Python: https://cloud.google.com/monitoring/docs/samples/monitoring-read-timeseries-simple#code-sample
from google.cloud import monitoring_v3
import time

client = monitoring_v3.MetricServiceClient()
project_name = f"projects/anjela"
now = time.time()
seconds = int(now)
nanos = int((now - seconds) * 10 ** 9)
interval = monitoring_v3.TimeInterval(
    {
        "end_time": {"seconds": seconds, "nanos": nanos},
        "start_time": {"seconds": (seconds - 1200), "nanos": nanos},
    }
)
results = client.list_time_series(
    request={
        "name": project_name,
        "filter": 'metric.type = "pubsub.googleapis.com/subscription/num_undelivered_messages"',
        "interval": interval,
        "view": monitoring_v3.ListTimeSeriesRequest.TimeSeriesView.FULL,
    }
)
for result in results:
    print(result)
I put this filter pubsub.googleapis.com/subscription/num_undelivered_messages as it tracks the number of unacknowledged or backlog messages. You can use this GCP documentation to alter the filter according to your purpose: https://cloud.google.com/monitoring/api/metrics_gcp#gcp-pubsub
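To turn the query result into an actual number, the newest point of each returned series can be read instead of printing the whole object; a small sketch against the query above (field names as defined on the monitoring_v3 TimeSeries message):
for result in results:
    # Points are returned newest first; the backlog metric is an INT64 gauge.
    latest = result.points[0]
    print(result.resource.labels["subscription_id"], latest.value.int64_value)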
The result can also be viewed in the Cloud Monitoring interface (screenshots of the interface and the response omitted).

Can't publish to AWS MQTT broker over websockets

I am following the AWS API to connect to MQTT over websockets. Below is my code:
credentials_provider = AwsCredentialsProvider.new_static(
    access_key_id=auth_response_dictionary['user']['accessKeyId'],
    secret_access_key=auth_response_dictionary['user']['secretKey'],
    session_token=auth_response_dictionary['user']['sessionToken']
)
event_loop_group = io.EventLoopGroup(1)
host_resolver = io.DefaultHostResolver(event_loop_group)
client_bootstrap = io.ClientBootstrap(event_loop_group, host_resolver)
mqtt_connection = mqtt_connection_builder.websockets_with_default_aws_signing(
    endpoint=auth_response_dictionary['user']['iotEndpoint'],
    region=auth_response_dictionary['user']['region'],
    credentials_provider=credentials_provider,
    client_bootstrap=client_bootstrap,
    client_id=clientId
)

print("Connecting to aws")
# Make the connect() call
connect_future = mqtt_connection.connect()
# Future.result() waits until a result is available
print('connect_future ' + str(connect_future))
x = connect_future.result()
print('connect_future ' + str(x))
print("Connected!")

future, packet_id = mqtt_connection.publish(topic=TOPIC, payload=json.dumps(message), qos=mqtt.QoS.AT_LEAST_ONCE)
future, packet_id = mqtt_connection.publish(topic='test/po', payload=json.dumps(message), qos=mqtt.QoS.AT_LEAST_ONCE)
print('future ' + str(future))
print('future ' + str(packet_id))
print('Publish End')
I am not getting any errors while connecting or publishing, but I am not receiving any messages on my AWS MQTT broker when I subscribe to that topic in the 'Test' section.
I think I have configured something wrong in either credentials_provider or client_bootstrap, or both, but I don't know what.
Here are the printed logs:
Connecting to aws
connect_future <Future at 0x7f605f942af0 state=pending>
connect_future {'session_present': False}
Connected!
future <Future at 0x7f605f8e54f0 state=pending>
future 3
Publish End
Can somebody please help?
mqtt_connection.subscribe(...) is used to subscribe to an MQTT topic for AWS IoT messages, and I can't see it anywhere in your code.
mqtt_connection.subscribe is called like below, taking in the topic name, a Quality of Service level and a callback.
received_count = 0
received_all_event = threading.Event()
...
topic = 'test/po'
print("Subscribing to topic '{}'...".format(topic))
subscribe_future, packet_id = mqtt_connection.subscribe(
    topic=topic,
    qos=mqtt.QoS.AT_LEAST_ONCE,
    callback=on_message_received)
subscribe_result = subscribe_future.result()
print("Subscribed with {}".format(str(subscribe_result['qos'])))
on_message_received can look like this:
def on_message_received(topic, payload, dup, qos, retain, **kwargs):
    print("Received message from topic '{}': {}".format(topic, payload))
    global received_count
    received_count += 1
    # Number of messages to wait for
    if received_count == 10:
        received_all_event.set()
Then in your main method, you can wait until you've received 10 messages:
# Wait for all messages to be received.
# This waits forever if count was set to 0.
if not received_all_event.is_set():
    print("Waiting for all messages to be received...")
    received_all_event.wait()
print("{} message(s) received.".format(received_count))
There's really good sample code provided by AWS, which I'd recommend you check out.

Python3 socket doesn't work - graphite

I created a function to send data to a Graphite server. It sends the metric name, value, and timestamp to the Graphite server when executed:
def collect_metric(metricname, value, timestamp):
    sock = socket.socket()
    sock.connect(("localhost", 2003))
    sock.send("%s %s %s\n" % (metricname, value, timestamp))
    sock.close()
The function above worked fine in Python 2. I had to rewrite it for Python 3, and now no data is sent to Graphite; there are no entries in the graphite/carbon logs or anywhere else:
def collect_metric(metricname, value, timestamp):
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.connect(("localhost", 2003))
    metricname = metricname.encode()
    if type(value) == "str":
        value = value.encode()
    timestamp = timestamp.encode()
    message = bytearray()
    message = bytes(metricname + b" " + value + b" " + timestamp)
    sock.sendall(message)
    print(message.decode())
    sock.close()
I receive no errors. Also, in the terminal I get the right format/output (see print(message.decode())).
Does anybody have any ideas why it doesn't work?
Thanks.
The bytearray is built without any encoding. Try this:
message = (metricname+" "+value+" "+timestamp).encode("UTF-8")
sock.send(message)
It seems like you're missing '\n' at the end of the message you're sending
message = bytes(metricname+b" "+value+b" "+timestamp)
should be:
message = bytes(metricname+b" "+value+b" "+timestamp+b'\n')
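Combining both suggestions, a corrected Python 3 version of the function (Graphite's plaintext protocol expects "metric value timestamp\n" on port 2003) could look like this:
import socket

def collect_metric(metricname, value, timestamp):
    # Build the plaintext line with the trailing newline, then encode once.
    message = "%s %s %s\n" % (metricname, value, timestamp)
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.connect(("localhost", 2003))
        sock.sendall(message.encode("utf-8"))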
