Writing to a serial port works via PuTTY but not pySerial - python-3.x

I am able to send a command string to the serial port via PuTTY and get a response. However, when I try the same exchange using pySerial's read/write, I get no response back.
PuTTY terminal:
Example 1:
<command_string>
response = Success
Example 2:
<incorrect_command_string>
response = Fail
Python code:
import serial

# Note: on pySerial 3.x the port is usually a device name such as 'COM3'
# or '/dev/ttyUSB0' rather than an integer index.
serialData = serial.Serial(port=2, baudrate=921600, parity=serial.PARITY_NONE,
                           stopbits=serial.STOPBITS_ONE, bytesize=serial.EIGHTBITS)
serialData.write(b'<command_string>')
print(serialData.in_waiting)                  # checked immediately, before the device can reply
print(serialData.read(serialData.in_waiting))
Output of the code:
0
b''
Any suggestions?

Found it:
The response string is terminated with '\r', so I can use read_until as below and eliminate the sleep:
serialData.read_until(b"\r")
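For completeness, a minimal sketch of the whole exchange, assuming the device expects commands terminated with '\r' (the character PuTTY sends when you press Enter) and that the port is named 'COM3'; both are assumptions to adjust for your setup:

import serial

# Hypothetical port name and command string; the timeout stops read_until
# from blocking forever if the terminator never arrives.
with serial.Serial('COM3', baudrate=921600, timeout=2) as ser:
    ser.write(b'<command_string>\r')   # the trailing '\r' mimics pressing Enter in PuTTY
    response = ser.read_until(b'\r')   # block until the '\r'-terminated reply (or timeout)
    print(response)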

Related

How to form DHCPDISCOVER via scapy on Python?

I created a DHCP discover request using scapy. It works fine with my eth0 interface: I send the discover and get an offer back. But it doesn't work through the wlan0 interface when I connect via Wi-Fi and try to send the packet.
Why is this happening? How can I fix it?
from scapy.all import *

conf.checkIPaddr = False
dhcp_discover = Ether(dst='ff:ff:ff:ff:ff:ff', src=RandMAC()) \
    / IP(src='0.0.0.0', dst='255.255.255.255') \
    / UDP(sport=68, dport=67) \
    / BOOTP(op=1, chaddr=RandMAC()) \
    / DHCP(options=[('message-type', 'discover'), 'end'])
# sendp(dhcp_discover, iface='eth0')  # OK
sendp(dhcp_discover, iface='wlan0')   # not working
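One common cause, offered here as an assumption rather than a confirmed fix: many Wi-Fi drivers and access points silently drop frames whose source MAC does not match the interface's own address, so the RandMAC() source that works on wired eth0 can fail on wlan0. A sketch using the interface's real MAC and the BOOTP broadcast flag:

from scapy.all import (Ether, IP, UDP, BOOTP, DHCP, conf,
                       get_if_hwaddr, mac2str, sendp)

conf.checkIPaddr = False
iface = 'wlan0'
mac = get_if_hwaddr(iface)  # the interface's real MAC address

dhcp_discover = (
    Ether(src=mac, dst='ff:ff:ff:ff:ff:ff')
    / IP(src='0.0.0.0', dst='255.255.255.255')
    / UDP(sport=68, dport=67)
    # flags=0x8000 asks the server to broadcast its reply (the client has
    # no IP yet); chaddr must be the raw 6-byte form of the MAC.
    / BOOTP(op=1, chaddr=mac2str(mac), flags=0x8000)
    / DHCP(options=[('message-type', 'discover'), 'end'])
)
sendp(dhcp_discover, iface=iface)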

How to pass dynamic inputs for a command on a Windows remote machine using the Python pypsexec.client library

I am executing commands on a remote Windows machine using the Python pypsexec.client library, from the local Windows machine where the Python script runs.
Below is the command that I am trying to execute from the Python script (the pu call in the code below). How do I pass the name and details parameters from the script?
Below is my script:
from pypsexec.client import Client

ip = '10.X.X.X'
try:
    conn = Client(ip, 'administrator', 'password', encrypt=False)  # boolean, not the string 'False'
    conn.connect()
    conn.create_service()
    print('service created for following "{}".......\n\n'.format(ip))
    callback = r"""C:\Progra~1\Nimsoft\bin\pu -u administrator -p password
/CHOSERVER1_domain/CHOSERVER1_hub/CHOSERVER1/hub getrobots"""
    stdout, stderr, rc = conn.run_executable('cmd.exe',
                                             arguments='/c {}'.format(callback),
                                             stdin=None)
    stdout = str(stdout, 'utf-8')
    stderr = str(stderr, 'utf-8')
    print(stdout)
except Exception as e:
    print('Below exception occurred .....\n')
    print(e)
    print()
While running the above script, the terminal waits for the name and details parameters mentioned in the screenshot.
Any help would be appreciated. Thank you.
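For what it's worth, pypsexec's run_executable accepts a stdin payload, so if pu reads its prompts from standard input (an assumption, not verified here), the answers can be supplied as newline-separated bytes. A sketch with hypothetical values for the two prompts:

from pypsexec.client import Client

conn = Client('10.X.X.X', 'administrator', 'password', encrypt=False)
conn.connect()
conn.create_service()
try:
    # 'myname' and 'mydetails' stand in for the real answers; this only
    # works if pu reads the prompts from stdin rather than the console.
    stdout, stderr, rc = conn.run_executable(
        'cmd.exe',
        arguments=r'/c C:\Progra~1\Nimsoft\bin\pu -u administrator -p password '
                  r'/CHOSERVER1_domain/CHOSERVER1_hub/CHOSERVER1/hub getrobots',
        stdin=b'myname\r\nmydetails\r\n')
    print(str(stdout, 'utf-8'))
finally:
    conn.remove_service()
    conn.disconnect()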

Python script which accesses GitLab works on Windows but returns 'Project Not Found' on Windows Subsystem for Linux (WSL) - using Python requests

I have a Python script which does a GET request to GitLab and stores the data from the response in an Excel file using the tablib library.
This script works fine in Windows when I execute it using python3.
I have tried to execute the same script in the Windows Subsystem for Linux (WSL) I have enabled, and the script fails.
The output when I execute with python3 script.py in WSL is the following:
RESPONSE {"message":"404 Project Not Found"}
When I execute from Windows using python .\gitlab.py where python is python3:
RESPONSE [{"id":567,"iid":22}, {"id":10,"iid":3}]
I think the problem could be related to the GET API call I am doing, because in WSL it returns Project Not Found.
I executed that request using curl in WSL to see if Linux in general has this issue, but I got back the expected response instead of the not-found one. This was the request:
curl -X GET 'https://URL/api/v4/projects/server%2Fproducts%2FPROJECT/issues?per_page=100' -H 'Content-Type: application/json' -H 'PRIVATE-TOKEN: TOKEN' --insecure
Why is Python failing in WSL if the same system can execute the GET request using curl? Should I enable/disable something in the request, perhaps?
This is the request I am doing in my Python script:
import json
import os

import requests
from requests.packages.urllib3.exceptions import InsecureRequestWarning

def get_items():
    url = "https://URL/api/v4/projects/server%2Fproducts%2FPROJECT/issues"
    payload = {}
    querystring = {"state": "closed", "per_page": "100"}
    headers = {
        'Content-Type': "application/json",
        'PRIVATE-TOKEN': os.environ.get("GITLAB_KEY")  # environment variable added in Windows
    }
    requests.packages.urllib3.disable_warnings(InsecureRequestWarning)
    response = requests.request(
        "GET", url, headers=headers, data=payload, params=querystring, verify=False)
    print("RESPONSE " + response.text)
    return json.loads(response.text)
UPDATE:
I have tried using the project id instead of the path as well, but it didn't work.
REF: https://docs.gitlab.com/ee/api/projects.html#get-single-project
GET /projects/:id
Change this:
url = "https://URL/api/v4/projects/server%2Fproducts%2FPROJECT/issues"
to this:
projectId = 1234  # or whatever your project id is: Project Page, Settings -> General
url = "https://URL/api/v4/projects/" + str(projectId) + "/issues"  # str() needed, the id is an int
Based on an answer I got to the post I made on Reddit, I found the problem.
In the Python script, I am using an environment variable which is not accessible that way (os.environ.get("GITLAB_KEY")) from WSL.
For now, I have replaced it with the hard-coded value just to check that this was really the issue. The script now works as expected.
I will find a way to access the env var again now that I know what the problem was.
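As a sketch of one way forward (my suggestion, not from the original post): Windows variables are not inherited by WSL automatically, so either export the variable in the WSL shell profile or share it from Windows via the WSLENV mechanism, and fail loudly when it is missing:

import os

def get_gitlab_token():
    # GITLAB_KEY set on the Windows side is not visible in WSL unless it is
    # exported there (e.g. in ~/.bashrc) or shared via WSLENV=GITLAB_KEY/u.
    token = os.environ.get("GITLAB_KEY")
    if token is None:
        raise RuntimeError("GITLAB_KEY is not set in this environment")
    return token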

Execute Long running jobs from bottle web server

What I am trying to do
I have a front-end system that is generating output. I am accessing this data (JSON) with a POST request using bottle. My POST handler receives the JSON without issue. I need to execute a backend Python program (Blender automation) and pass this JSON data to it.
What I have tried to do
Subprocess - calling the program with subprocess and passing the input. It appears to execute, but when I check System Monitor the program never starts, while my server keeps running as it should. The same subprocess command runs perfectly fine when executed independently of the server.
blender, script, and json are all string objects holding absolute file paths:
sub = subprocess.Popen([blender + " -b -P " + script + " -- " + json],
                       stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=False)
C-style os.fork() - same as above; I found while reading the pydoc that subprocess operates using these methods.
Double fork - from a posting on here, I tried forking from the server, calling subprocess from that fork, and terminating the parent of the subprocess to create an orphan. My subprocess command still does not execute and never shows up in System Monitor.
What I need
I need a solution that runs from the bottle server in its own process. The server handles multiple requests, so the subprocess cannot block it. The program being called is fully automated and just needs the JSON data in its execution command. The result of the subprocess program will be a string path to a file created on the server.
The above subprocess call works perfectly fine from my test driver program. I just need to connect the execution to the web service so my front end can trigger it.
My bottle post method - prints the JSON when called, without issue:
from bottle import post, request

@post('/getData')
def getData():
    json_text = request.json
    print(json_text)
I am not sure where to go from here. From what I have read thus far, subprocess should work. Any help or suggestions would be very much appreciated. If additional information is needed please let me know; I will edit with more details. Thank you.
Relevant Information:
OS: Ubuntu 16.04 LTS,
Python 3.x
EDIT
This isn't an elegant solution but my subprocess call works now.
cmd = blender
cmd += " -b -P "
cmd += script
cmd += " -- "
cmd += str(json)
sub = subprocess.Popen([cmd], shell=True)
It seems that setting shell=True and removing stdout=PIPE/stderr=PIPE let me see output where I was throwing an unhandled exception, because my json data was a list and not a string.
When using Python to execute your scripts, a process created by subprocess.Popen will unintentionally inherit and keep open file descriptors.
You need to close those so the child process can run independently (close_fds=True):
subprocess.Popen(['python', '-u', Constant.WEBAPPS_FOLDER + 'convert_file.py', src, username], shell=False, bufsize=-1, close_fds=True)
Also, you don't have to use the shell to create another process; it can have unintended consequences.
I had the exact same problem where bottle was not returning and hung. It works now.
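Putting the two pieces of advice together, a sketch of a route that spawns the job without a shell (the paths, port, and JSON handling are placeholders, not the original project's):

import json
import subprocess

from bottle import post, request, run

BLENDER = '/usr/bin/blender'       # hypothetical absolute paths
SCRIPT = '/srv/automation/job.py'

@post('/getData')
def getData():
    json_text = request.json
    # An argument vector instead of one shell string avoids quoting
    # pitfalls, and close_fds=True keeps the child from inheriting the
    # server's sockets, so it runs on independently of the request.
    subprocess.Popen(
        [BLENDER, '-b', '-P', SCRIPT, '--', json.dumps(json_text)],
        close_fds=True)
    return 'job started'

run(host='0.0.0.0', port=8080)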

Send an email from a Python3 script with localhost?

I need to send mails from my Python3 script. It does now, but my Gmail password is visible and I cannot trust every admin of this machine, so the solution I see is to set up a local mail server. To do some tests, I was trying to execute a script (this one: SMTP sink server). While that one is running, I execute my old script with some changes:
import smtplib

# server = smtplib.SMTP('smtp.gmail.com:587')
server = smtplib.SMTP('localhost:25')
# server.ehlo()
# server.starttls()
# server.ehlo()
# server.login('my_account@gmail.com', 'my_password')
server.login(None, None)
server.sendmail('Me <my_account@gmail.com>', ['to_user@gmail.com'], 'Hi!')
server.quit()
I understand the script at the link will create a file with the mail content in the folder where it lives, but nothing happens, because I get this error message:
SMTP AUTH extension not supported by server.
I googled and I think this could be sorted out by uncommenting the line server.starttls(), but that gives another error, which is supposedly solved by the server.ehlo() lines, but not in my case.
Any suggestions?
OK, I managed to send the email; all I had to do was remove this line:
server.login(None, None)
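For reference, a minimal sketch of the working exchange against a local sink or relay on port 25 that requires no authentication (the addresses are placeholders):

import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg['From'] = 'Me <my_account@gmail.com>'
msg['To'] = 'to_user@gmail.com'
msg['Subject'] = 'Test'
msg.set_content('Hi!')

# No login(): the local sink server advertises no AUTH extension.
with smtplib.SMTP('localhost', 25) as server:
    server.send_message(msg)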
