Connections to two Oracle Databases using TNS with different authentication methods (Python, cx_Oracle) - python-3.x

I have to maintain connections to two Oracle databases (12c, Python 3.5, cx_Oracle 6.0.3, Oracle drivers 12.1.0) in parallel (one database uses Kerberised authentication and the other doesn't).
I have created two separate TNS configuration folders (with their separate sqlnet.ora and tnsnames.ora) and attempted the following:
import cx_Oracle
import os

os.environ['TNS_ADMIN'] = r'D:\tns\settings1'  # settings folder for first connection
with cx_Oracle.connect("", "", "DB1") as con:  # uses Kerberos
    cursor = con.cursor()
    ret = cursor.execute("SELECT 'HELLO WORLD FROM DATABASE1!' AS msg FROM DUAL")
    for entry in cursor:
        print(entry[0])

os.environ['TNS_ADMIN'] = r'D:\tns\settings2'  # settings folder for second connection
with cx_Oracle.connect("username", "password", "DB2") as con:
    cursor = con.cursor()
    ret = cursor.execute("SELECT 'HELLO WORLD FROM DATABASE2!' AS msg FROM DUAL")
    for entry in cursor:
        print(entry[0])
While I am able to establish connections from two separate processes in isolation, the above fails when executed as a single script in one process (I get an "ORA-12631: Username retrieval failed" error, which indicates that the sqlnet.ora settings from D:\tns\settings1 are still being used to establish the second connection).
My sqlnet.ora in D:\tns\settings2 unsets all Kerberos-related values set in D:\tns\settings1\sqlnet.ora:
SQLNET.AUTHENTICATION_SERVICES = (NONE)
NAMES.DIRECTORY_PATH = (TNSNAMES, EZCONNECT)
SQLNET.KERBEROS5_CC_NAME = NONE
SQLNET.KERBEROS5_CONF = NONE
SQLNET.KERBEROS5_CONF_MIT = NONE
SQLNET.AUTHENTICATION_KERBEROS5_SERVICE = NONE
Again - in isolation both connections succeed (and the TNS settings are picked up correctly from the respective folders that TNS_ADMIN points to).
Any ideas how to get this to work?
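One thing worth experimenting with (a sketch only, with placeholder host, port and service name): cx_Oracle.connect also accepts a full connect descriptor built with cx_Oracle.makedsn instead of a tnsnames.ora alias, which removes the dependence on TNS_ADMIN for the name lookup of the second, non-Kerberos connection. Note this only bypasses the tnsnames.ora lookup; it does not change sqlnet.ora parameters such as SQLNET.AUTHENTICATION_SERVICES, so it may not be sufficient on its own.
import cx_Oracle

# Placeholder connection details; build a descriptor instead of using a TNS alias.
dsn2 = cx_Oracle.makedsn("db2-host.example.com", 1521, service_name="DB2SERVICE")
with cx_Oracle.connect("username", "password", dsn2) as con:
    cursor = con.cursor()
    cursor.execute("SELECT 'HELLO WORLD FROM DATABASE2!' AS msg FROM DUAL")
    for entry in cursor:
        print(entry[0])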

Related

relation does not exist in PostgreSQL but already exists

I've read a lot of articles about my problem but none of them solved it. You can see my code here:
import os
import psycopg2
from psycopg2 import sql

DATABASE_URL = os.environ.get('url_of_my_db')
con = None
try:
    con = psycopg2.connect(DATABASE_URL)
    cur = con.cursor()
    print('PostgreSQL database version:')
    # cur.execute('SELECT version()')
    # cur.execute('SELECT * FROM qwerty')
    # cur.execute(sql.SQL('SELECT * FROM {}').format(sql.Identifier('qwerty')))
    # cur.execute(sql.SQL("INSERT INTO {} (chat_id, username, created_on) VALUES (8985972942, vovakirdan, 2022-01-05)").format(sql.Identifier('users')))
    cur.execute("""INSERT INTO users (chat_id, username, created_on)
                   VALUES (3131,
                           vovakirdan,
                           2022-01-05)""")
    # display the PostgreSQL database server version
    db_version = cur.fetchone()
    print(db_version)
    # close the communication with the HerokuPostgres
    cur.close()
except Exception as error:
    print('Cause: {}'.format(error))
finally:
    # close the communication with the database server by calling the close()
    if con is not None:
        con.close()
        print('Database connection closed.')
and in my DB a table named "users" (created without quotes) exists, but I still get this error:
...relation "users" does not exist
All of the commented-out code fails with the same error, except SELECT version(), which works perfectly and proves that the connection works.
The problem was that PostgreSQL wants me to use SELECT column FROM schema.table instead of SELECT column FROM table. And that's all. Thanks everyone
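For illustration, a minimal sketch of the schema-qualified form with psycopg2 (the schema name "public" and the column list are assumptions; substitute your own):
from psycopg2 import sql

# Schema-qualify the table name; "public" is only an assumed schema here.
cur.execute(sql.SQL("SELECT chat_id, username, created_on FROM {}.{}").format(
    sql.Identifier("public"), sql.Identifier("users")))
print(cur.fetchall())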

Algolia search python client to upload indexes behind a firewall

We are building a search with the Algolia search product for our Moodle site.
To update the index on Algolia's side we want to use an automated process (upload the index data).
I decided to start with the Python client (Python 3.9.x). Since we are working inside a corporate network, we are behind firewalls and there is a problem accessing Algolia's servers.
I'm testing with two methods:
Search within already existing indices: index.search
update all indexes: index.replace_all_objects
Error message: 'Unreachable hosts'
Here are my code snippets:
# Imports assumed for these snippets (algoliasearch v2+ client; "constants" is our own module)
import json
from algoliasearch.search_client import SearchClient
from algoliasearch.configs import SearchConfig

# Search:
searchAppID = constants.ALGOLIA_APP_ID
searchKeyHash = constants.ALGOLIA_SEARCH_KEY

config = SearchConfig(searchAppID, searchKeyHash)
config.connect_timeout = 2
config.read_timeout = 5
config.write_timeout = 30

client = SearchClient.create_with_config(config)
index = client.init_index('myIndex')
result = index.search('SearchString')

# function to update index using json file:
def replace_IdxObj(client, index, fileName):
    try:
        if index.exists():
            with open(fileName, encoding="utf8") as f:
                data = json.load(f)
            result = index.replace_all_objects(data, {'autoGenerateObjectIDIfNotExist': True})
            return result
        else:
            print('No Index found! Cancel operation... \n')
            return None
    except Exception as err:
        print("Error: " + str(err))
When I test my code outside of the company's network, it all works fine.
I also have an SSL certificate that can be used to work with external resources, so I used it in the following test (running from the company's network behind the firewall):
response = requests.get('https://google.com/', verify=("C:\\Program Files\\Common Files\\SSL\\certs\\cert.cer"))
print(response)
This test works just fine (I'm getting a 200 response)!
Please advise: are there any ways to make the Python process (which is used by Algolia's Python client) use the certificate that I have? Any alternatives?
Many thanks!
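One avenue worth trying (a sketch, and only an assumption that the Algolia client picks these up via requests/urllib3 under the hood) is to point the standard certificate environment variables at the corporate CA bundle before creating the client:
import os

# Hypothetical path to the corporate CA bundle; adjust for your environment.
CA_BUNDLE = r"C:\Program Files\Common Files\SSL\certs\cert.cer"

# requests honours REQUESTS_CA_BUNDLE; Python's default ssl context honours SSL_CERT_FILE.
os.environ["REQUESTS_CA_BUNDLE"] = CA_BUNDLE
os.environ["SSL_CERT_FILE"] = CA_BUNDLE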

ldap3 add user to group after conn.search

Currently, I am writing an AWS Lambda function. The idea is that someone can send an email to a specific address with a username and AD group, which triggers the function and adds this person to the desired group.
I am using the Python module ldap3, and the conn.search part is working, as well as the addUsersInGroups part, but only if I run them separately. If I create a script where I already have the cn or dn of both the user and the group and use the addUsersInGroups function, it works; but if I do a conn.search somewhere before it, the script somehow can't establish the connection for the add-to-group part.
from ldap3 import Server, Connection, ALL, NTLM, SUBTREE
from ldap3.extend.microsoft.addMembersToGroups import ad_add_members_to_groups as addUsersInGroups
import email
import os
import json

email = "sample#test.com"
subject = "username,ad-group"
user = subject.split(",")[0]
group = subject.split(",")[1]
emaildomain = email.split("#")[1]
domaingroup = ["test.com"]
adgroups = ["group1", "group2"]

server = Server('serverIP', use_ssl=True, get_info=ALL)
conn = Connection(server, OU, password, auto_bind=True)

def find_user():
    user_criteria = "(&(objectClass=user)(sAMAccountName=%s))" % user
    if conn.search("OU", user_criteria):
        result = str(conn.entries)
        user_dn = result.split("-")[0].replace("[DN: ", "")
        return user_dn
    return nouser

def find_group():
    group_criteria = "(&(objectClass=group)(sAMAccountName=%s))" % group
    if conn.search("OU", group_criteria):
        result_group = str(conn.entries)
        group_dn = result_group.split("-")[0].replace("[DN: ", "")
        return group_dn
    return nogroup

def add_to_group(user, group):
    addUsersInGroups(conn, user, group)

if emaildomain in domaingroup:
    user = find_user()
    group = find_group()
    add_to_group(user, group)
Please note that I had to delete some things off the script for security reasons.
The connection to search for a user or group is working, and the add-to-group function works too, but only when I run it without any search beforehand.
Somehow I have the feeling that calling conn.search ties the connection up for anything search-related, and if I try to use the same connection for something different, e.g. adding a user to a group, that request gets blocked.
Here is the error I receive:
(error screenshot not reproduced here)
Found the solution on this website:
https://github.com/cannatag/ldap3/issues/442
You are getting this error probably because auto_referrals=True is the default in Connection. Try:
conn = Connection(server, "cn=xxx,cn=users,dc=wwww,dc=zzzz,dc=com", "my_pass", auto_bind=True, auto_referrals=False)
so that the search is not referred to another DC.

Getting owner of file from smb share, by using python on linux

I need to find out for a script I'm writing who is the true owner of a file in an smb share (mounted using mount -t cifs of course on my server and using net use through windows machines).
Turns out it is a real challenge finding this information out using python on a linux server.
I tried many SMB libraries (such as smbprotocol, smbclient and others); nothing worked.
I found a few solutions for Windows, but they all use pywin32 or another Windows-specific package.
I also managed to do it from bash using smbcacls, but couldn't do it cleanly from Python other than via subprocess.Popen('smbcacls ...').
Any idea on how to solve it?
This was unbelievably non-trivial, and unfortunately the answer isn't as simple as I hoped it would be.
I'm posting this answer in case someone gets stuck with the same problem in the future, but I hope someone posts a better solution before then.
In order to find the owner I used this library with its examples:
from smb.SMBConnection import SMBConnection

conn = SMBConnection(username='<username>', password='<password>', domain='<domain>',
                     my_name='<some pc name>', remote_name='<server name>')
conn.connect('<server name>')

sec_att = conn.getSecurity('<share name>', r'\some\file\path')
owner_sid = sec_att.owner
The problem is that the pysmb package will only give you the owner's SID, not the owner's name.
In order to get the name you need to make an LDAP query like in this answer (reposting the code):
from ldap3 import Server, Connection, ALL
from ldap3.utils.conv import escape_bytes
s = Server('my_server', get_info=ALL)
c = Connection(s, 'my_user', 'my_password')
c.bind()
binary_sid = b'....' # your sid must be in binary format
c.search('my_base', '(objectsid=' + escape_bytes(binary_sid) + ')', attributes=['objectsid', 'samaccountname'])
print(c.entries)
But of course nothing will be easy, it took me hours to find a way to convert a string SID to binary SID in python, and in the end this solved it:
# posting the needed functions and omitting the class part
import struct

def byte(strsid):
    '''
    Convert a SID into bytes
    strsid - SID to convert into bytes
    '''
    sid = str.split(strsid, '-')
    ret = bytearray()
    sid.remove('S')
    for i in range(len(sid)):
        sid[i] = int(sid[i])
    sid.insert(1, len(sid) - 2)
    ret += longToByte(sid[0], size=1)
    ret += longToByte(sid[1], size=1)
    ret += longToByte(sid[2], False, 6)
    for i in range(3, len(sid)):
        ret += longToByte(sid[i])
    return ret

def byteToLong(byte, little_endian=True):
    '''
    Convert bytes into a Python integer
    byte - bytes to convert
    little_endian - True (default) or False for little or big endian
    '''
    if len(byte) > 8:
        raise Exception('Bytes too long. Needs to be <= 8 or 64bit')
    else:
        if little_endian:
            a = byte.ljust(8, b'\x00')
            return struct.unpack('<q', a)[0]
        else:
            a = byte.rjust(8, b'\x00')
            return struct.unpack('>q', a)[0]
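The snippet above calls a longToByte helper that was omitted along with the class. A minimal sketch of what it presumably does, with the signature inferred from the calls above (value, optional endianness flag, byte size):
def longToByte(value, little_endian=True, size=4):
    '''Convert a Python integer into `size` bytes (little endian by default) - assumed helper.'''
    if little_endian:
        return struct.pack('<q', value)[:size]
    return struct.pack('>q', value)[8 - size:]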
... AND finally you have the full solution! enjoy :(
I'm adding this answer to let you know of the option of using smbprotocol, as well as to expand on some easily misunderstood terminology.
SMBProtocol Owner Info
It is possible to get the SID using the smbprotocol library as well (just like with the pysmb library).
This was brought up in the github issues section of the smbprotocol repo, along with an example of how to do it. The example provided is fantastic and works perfectly; an extremely stripped-down version is shown below.
However, this also just retrieves a SID and will need a secondary library to perform a lookup.
Here's a function to get the owner SID (it just wraps what's in the gist in a function; included here in case the gist is deleted or lost for any reason).
import smbclient
from ldap3 import Server, Connection, ALL, NTLM, SUBTREE

def getFileOwner(smb: smbclient, conn: Connection, filePath: str):
    from smbprotocol.file_info import InfoType
    from smbprotocol.open import FilePipePrinterAccessMask, SMB2QueryInfoRequest, SMB2QueryInfoResponse
    from smbprotocol.security_descriptor import SMB2CreateSDBuffer

    class SecurityInfo:
        # 100% just pulled from gist example
        Owner = 0x00000001
        Group = 0x00000002
        Dacl = 0x00000004
        Sacl = 0x00000008
        Label = 0x00000010
        Attribute = 0x00000020
        Scope = 0x00000040
        Backup = 0x00010000

    def guid2hex(text_sid):
        """convert the text string SID to a hex encoded string"""
        s = ['\\{:02X}'.format(ord(x)) for x in text_sid]
        return ''.join(s)

    def get_sd(fd, info):
        """ Get the Security Descriptor for the opened file. """
        query_req = SMB2QueryInfoRequest()
        query_req['info_type'] = InfoType.SMB2_0_INFO_SECURITY
        query_req['output_buffer_length'] = 65535
        query_req['additional_information'] = info
        query_req['file_id'] = fd.file_id
        req = fd.connection.send(query_req, sid=fd.tree_connect.session.session_id, tid=fd.tree_connect.tree_connect_id)
        resp = fd.connection.receive(req)
        query_resp = SMB2QueryInfoResponse()
        query_resp.unpack(resp['data'].get_value())
        security_descriptor = SMB2CreateSDBuffer()
        security_descriptor.unpack(query_resp['buffer'].get_value())
        return security_descriptor

    with smbclient.open_file(filePath, mode='rb', buffering=0,
                             desired_access=FilePipePrinterAccessMask.READ_CONTROL) as fd:
        sd = get_sd(fd.fd, SecurityInfo.Owner | SecurityInfo.Dacl)
        # returns SID
        _sid = sd.get_owner()
        try:
            # Don't forget to convert the SID string-like object to a string
            # or you get an error related to "0" not existing
            sid = guid2hex(str(_sid))
        except:
            print(f"Failed to convert SID {_sid} to HEX")
            raise
        conn.search('DC=dell,DC=com', f"(&(objectSid={sid}))", SUBTREE)
        # Will return an empty array if no results are found
        return [res['dn'].split(",")[0].replace("CN=", "") for res in conn.response if 'dn' in res]
to use:
# Client config is required if on linux, not if running on windows
smbclient.ClientConfig(username=username, password=password)

# Setup LDAP session
server = Server('mydomain.com', get_info=ALL, use_ssl=True)
# you can turn off raise_exceptions, or leave it out of the ldap connection
# but I prefer to know when there are issues vs. silently failing
conn = Connection(server, user="domain\\username", password=password, raise_exceptions=True, authentication=NTLM)
conn.start_tls()
conn.open()
conn.bind()

# Run the check
fileCheck = r"\\shareserver.server.com\someNetworkShare\someFile.txt"
owner = getFileOwner(smbclient, conn, fileCheck)

# Unbind ldap session
# I'm not clear if this is 100% required, I don't THINK so
# but better safe than sorry
conn.unbind()

# Print results
print(owner)
Now, this isn't super efficient. It takes 6 seconds for me to run this on a SINGLE file. So if you wanted to run some kind of ownership scan, you probably want to write the program in C++ or some other low-level language instead of trying to use Python. But for something quick and dirty this does work. You could also set up a threading pool and run batches, as in the sketch below. The piece that takes longest is connecting to the file itself, not running the ldap query, so if you can find a more efficient way to do that you'll be golden.
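A rough sketch of that batching idea (it reuses getFileOwner and conn from above; note that a single ldap3 Connection isn't guaranteed to be thread-safe, so for real workloads you may want one connection per worker):
from concurrent.futures import ThreadPoolExecutor

# Placeholder list of files to check.
files = [
    r"\\shareserver.server.com\someNetworkShare\someFile.txt",
    r"\\shareserver.server.com\someNetworkShare\otherFile.txt",
]

# Run the slow per-file SMB lookups in parallel; 8 workers is an arbitrary choice.
with ThreadPoolExecutor(max_workers=8) as pool:
    owners = list(pool.map(lambda path: getFileOwner(smbclient, conn, path), files))

print(dict(zip(files, owners)))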
Terminology Warning, Owner != Creator/Author
Last note on this: Owner != File Author. Many domain environments, and in particular SMB shares, automatically alter ownership from the creator to a group. In my case the result of the above was: (screenshot not reproduced here)
What I was actually looking for was the creator of the file. File creator and modifier aren't attributes which Windows keeps track of by default. An administrator can enable policies to audit file changes in a share, or auditing can be enabled on a file-by-file basis using the Security->Advanced->Auditing functionality for an individual file (which does nothing to help you determine the creator).
That being said, some applications store that information for themselves. For example, if you're looking at Excel files, this answer provides a method to get the creator of any xls or xlsx files (it doesn't work for xlsb due to the binary nature of the files). Unfortunately few file formats store this kind of information. In my case I was hoping to get that info for tblu, pbix, and other reporting-type files. However, they don't contain this information (which is good from a privacy perspective).
So in case anyone finds this answer trying to solve the same kind of thing I did - Your best bet (to get actual authorship information) is to work with your domain IT administrators to get auditing setup.

How to make connection in python to connect as400 and call any as400 programs with parameter

Does anyone know how to make a connection in Python to an AS/400 (IBM iSeries) system and call AS/400 programs with parameters?
For example, how do I create a library by connecting to the AS/400 through Python? I want to call "CRTLIB LIB(TEST)" from a Python script.
I am able to connect to the DB2 database through the pyodbc package.
Here is my code to connect to the DB2 database:
import pyodbc

connection = pyodbc.connect(
    driver='{iSeries Access ODBC Driver}',
    system='ip/hostname',
    uid='username',
    pwd='password')

c1 = connection.cursor()
c1.execute('select * from libname.filename')
for row in c1:
    print(row)
If your IBM i is set up to allow it, you can call the QCMDEXC stored procedure using CALL in your SQL. For example,
c1.execute("call qcmdexc('crtlib lib(test)')")
The QCMDEXC stored procedure lives in QSYS2 (the actual program object is QSYS2/QCMDEXC1) and does much the same as the familiar program of the same name that lives in QSYS, but the stored procedure is specifically meant to be called via SQL.
Of course, for this example to work, your connection profile has to have the proper authority to create libraries.
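Since the question also asks about calling programs with parameters, the same stored procedure can run a CL CALL command. A hedged sketch (MYLIB/MYPGM and the parameter value are placeholders; single quotes inside the SQL string literal must be doubled):
# Call an (assumed) program MYLIB/MYPGM with one character parameter via QCMDEXC.
c1.execute("call qcmdexc('CALL PGM(MYLIB/MYPGM) PARM(''HELLO'')')")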
It's also possible that your IBM i isn't set up to allow this. I don't know exactly what goes into enabling this functionality, but where I work, we have one partition where the example shown above completes normally, and another partition where I get this instead:
pyodbc.Error: ('HY000', '[HY000] [IBM][System i Access ODBC Driver][DB2 for i5/OS]SQL0901 - SQL system error. (-901) (SQLExecDirectW)')
This gist shows how to connect to an AS/400 via pyodbc:
https://gist.github.com/BietteMaxime/6cfd5b2dc2624c094575
A few notes: in this example, SYSTEM is the DSN you've set up for the AS/400 in the with pyodbc.connect statement. You could also switch this to SERVER and PORT with these modifications:
import pyodbc

class CommitMode:
    NONE = 0  # Commit immediate (*NONE) --> QSQCLIPKGN
    CS = 1    # Read committed (*CS) --> QSQCLIPKGS
    CHG = 2   # Read uncommitted (*CHG) --> QSQCLIPKGC
    ALL = 3   # Repeatable read (*ALL) --> QSQCLIPKGA
    RR = 4    # Serializable (*RR) --> QSQCLIPKGL

class ConnectionType:
    READ_WRITE = 0  # Read/Write (all SQL statements allowed)
    READ_CALL = 1   # Read/Call (SELECT and CALL statements allowed)
    READ_ONLY = 2   # Read-only (SELECT statements only)

def connstr(server, port, commit_mode=None, connection_type=None):
    _connstr = 'DRIVER=iSeries Access ODBC Driver;SERVER={server};PORT={port};SIGNON=4;CCSID=1208;TRANSLATE=1;'.format(
        server=server,
        port=port,
    )
    if commit_mode is not None:
        _connstr = _connstr + 'CommitMode=' + str(commit_mode) + ';'
    if connection_type is not None:
        _connstr = _connstr + 'ConnectionType=' + str(connection_type) + ';'
    return _connstr

def main():
    with pyodbc.connect(connstr('myas400.server.com', '8471', CommitMode.CHG, ConnectionType.READ_ONLY)) as db:
        cursor = db.cursor()
        cursor.execute(
            """
            SELECT * FROM IASP.LIB.FILE
            """
        )
        for row in cursor:
            print(' '.join(map(str, row)))

if __name__ == '__main__':
    main()
I cleaned up some PEP-8 as well. Good luck!
