[Question posted by a user on YugabyteDB Community Slack]
I log in to my VM using ssh. I can't connect to the VM's IP:7000 directly from a browser. How can I get read ops/sec, write ops/sec, etc. from bash?
I use this in Python (with pandas installed):
#!/usr/bin/python
# Install the pandas module with:
#   python -m pip install pandas
import pandas

master = 'http://localhost:7000'

# The first HTML table on the master's /tablet-servers page lists all tablet servers.
tablet_servers = pandas.read_html(master + '/tablet-servers')[0]

# Strip the port and UUID suffix so only the hostname remains.
tablet_servers['Server'].replace(to_replace="(:[0-9]+)?( [0-9a-f]*)?$",
                                 value="", regex=True, inplace=True)

print(tablet_servers[['Server', 'Read ops/sec', 'Write ops/sec']])
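If you want to watch the rates over time, a minimal sketch that simply re-reads the same page in a loop (the 5-second interval is an arbitrary choice):

#!/usr/bin/python
import time
import pandas

master = 'http://localhost:7000'

while True:
    t = pandas.read_html(master + '/tablet-servers')[0]
    t['Server'].replace(to_replace="(:[0-9]+)?( [0-9a-f]*)?$",
                        value="", regex=True, inplace=True)
    print(t[['Server', 'Read ops/sec', 'Write ops/sec']])
    time.sleep(5)  # poll interval; adjust as needed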
But you can also probably tunnel port 7000 through ssh (ssh -L 7000:localhost:7000 <user>@<vm>) and then open http://localhost:7000 in your local browser.
Related
Hi, I was trying to run browsermob-proxy on Google Colab but kept getting the error:
Could not read Browsermob-Proxy json
Another server running on this port?
I also tried to find code on the internet showing how to use browsermob-proxy on Google Colab, but didn't find any.
Can you provide working code or a link for the same?
This is the code that works for me:
#!/usr/bin/python
# coding=utf-8
# (run in a Colab notebook cell)
%cd path/browsermob-proxy-2.1.4/bin
!chmod -R 777 *

from browsermobproxy import Server

# Pass a non-default port to avoid the Errno 8 error when the default port is taken.
options = {'port': 8090}
server = Server('./browsermob-proxy', options=options)
server.start()
proxy = server.create_proxy()
You need to change the current working directory to the path/browsermob-proxy-2.1.4/bin folder, and then use !chmod -R 777 * to make the files executable. Point to the browsermob-proxy file without the .bat extension (the .bat launcher is for Windows).
If the local port is already in use, an Errno 8 error appears. In that case you need to pass a port in the options dict, as shown above; a quick way to check whether a port is free is sketched below.
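A minimal sketch, using only Python's standard socket module, to check whether a port is free before starting the server (the port number is just an example):

import socket

def port_is_free(port, host='127.0.0.1'):
    # If bind() succeeds, nothing else is listening on that port.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        try:
            s.bind((host, port))
            return True
        except OSError:
            return False

print(port_is_free(8090))  # True means it is safe to use as the proxy port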
I am building a DAG that starts with an SFTPSensor operator. As we can see, the SFTPSensor class takes an sftp_conn_id parameter. (Strangely, the SFTPOperator uses an SSH connection, which is what I would have expected from the sensor as well.)
So I go to the UI to create an SFTP connection like the example sftp connection that ships with Airflow.
However, this connection type does not exist among the selectable connection types. When I edit the existing example, its type shows up as Amazon Web Services.
Do I have to create this type of connection with the CLI as described here? So something like:
airflow connections -a \
    --conn_id my_sftp_connection \
    --conn_type sftp \
    --conn_login **ux** \
    --conn_port 22 \
    --conn_host **host** \
    --conn_extra '{"key_file": "**keypath**"}'
Both https://airflow.apache.org/docs/apache-airflow-providers-sftp/stable/index.html
and https://airflow.apache.org/docs/apache-airflow-providers-ssh/stable/index.html use the Paramiko implementation of the SSHv2 protocol.
As described at http://docs.paramiko.org/en/stable/api/sftp.html, the Paramiko SFTP client uses an SSH transport to perform remote file operations.
You need to install both extra provider packages to see each connection type listed in the UI:
pip install apache-airflow-providers-sftp[ssh]
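Once the provider package is installed and the connection exists, a minimal DAG sketch using the sensor could look like this (the DAG id, remote path, and schedule below are placeholders, not anything from your setup):

from datetime import datetime

from airflow import DAG
from airflow.providers.sftp.sensors.sftp import SFTPSensor

with DAG(
    dag_id="sftp_sensor_example",            # hypothetical DAG id
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    wait_for_file = SFTPSensor(
        task_id="wait_for_file",
        sftp_conn_id="my_sftp_connection",   # the connection created above
        path="/upload/incoming.csv",         # hypothetical remote path
        poke_interval=60,                    # seconds between pokes
    )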
How can I connect to an Azure PostgreSQL database, from a remote machine?
Update 2. I can connect to the database from WSL/Ubuntu using sudo psql, but I can't using plain psql. So it's a permissions issue somewhere...
Update. I've discovered I can connect from the remote machine using PgAdmin4, but I can't connect using psql. So I want to know: how should I connect using psql?
Original question. I can connect to it using psql from a VM inside Azure, so I know the database is up and accepting connections.
But when I try to connect from my home machine, using exactly the same psql command, it fails:
psql --user=UUU --host=HHH DB
psql: server closed the connection unexpectedly
This probably means the server terminated abnormally
before or while processing the request.
More information...
On the Azure database's "Connection Security" blade, I have
added a firewall rule with start IP=0.0.0.0 and end IP=255.255.255.255
set "enforce SSL connection" to disabled
turned on "allow access to Azure services".
My home machine is running Windows+WSL, and I'm trying to connect from WSL / Ubuntu 18.04
using psql version 10.11. I run into the same problem whether I try to connect from home or from work, and I'm not blocking any outgoing ports (that I know of).
The database is running PostgreSQL 10.
When I connect (successfully) from an Azure VM, using psql 10.10, it looks like this:
psql --user=UUU --host=HHH DB
Password for user UUU:
psql (10.10 (Ubuntu 10.10-0ubuntu0.18.04.1))
SSL connection (protocol: TLSv1.2, cipher: ECDHE-RSA-AES256-GCM-SHA384, bits: 256, compression: off)
Maybe your root user uses a different psql binary than your regular user. (You can find out using which psql and sudo which psql.)
I ran into the same connection issue. In my case, the root cause was a PostgreSQL major version mismatch.
I was connecting to an Azure PostgreSQL server on version 11 with my local psql on version 12. Downgrading my local machine's PostgreSQL client to 11.6 solved this for me.
Maybe your root user is using psql 10 and your default user is using psql 11 or 12. (You can check this using psql -V and sudo psql -V.)
I had the same issue. The error message is of no help.
You're probably using a different major version of psql than your Azure DB. It needs to match whatever is installed in Azure.
So if you provisioned a version 10 DB in Azure, either install version 10 of the psql tool or do a full PostgreSQL 10 install instead. The point is that the major versions need to match between psql and the target database.
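If you want to rule out the psql binary entirely and test connectivity from the remote machine another way, a minimal sketch with psycopg2 (pip install psycopg2-binary; every connection value below is a placeholder matching the UUU/HHH/DB names above):

import psycopg2

conn = psycopg2.connect(
    host="HHH",          # your server's hostname
    user="UUU",          # your username
    dbname="DB",
    password="...",
    sslmode="require",   # Azure connections are normally over TLS
)
with conn.cursor() as cur:
    cur.execute("SELECT version();")  # also shows the server's major version
    print(cur.fetchone()[0])
conn.close()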
from PyQt5.QtGui import QGuiApplication
app = QGuiApplication([])
When it starts, it gives this error:
qt.qpa.screen: QXcbConnection: Could not connect to display
Could not connect to any X display.
How can I fix this? I could not find a solution.
It looks like the DISPLAY environment variable is not set. Are you running this from a graphical session? If you are running this over SSH, you need to use X11 forwarding.
Assuming it is enabled on the server, you need to run ssh with the -X option.
You can find more information about this on the ssh man page.
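As a quick sanity check, you can verify DISPLAY from Python before constructing the application object (a minimal sketch; the error message here is mine, not Qt's):

import os

from PyQt5.QtGui import QGuiApplication

# Fail early with a readable message instead of the QXcbConnection error.
if not os.environ.get("DISPLAY"):
    raise SystemExit("DISPLAY is not set; run inside a graphical session "
                     "or connect with 'ssh -X' for X11 forwarding.")

app = QGuiApplication([])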
import paramiko
import time
import os

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect('server', port=22, username='user', password='pass123')
print("connected to the linux machine from windows machine.")

channel = ssh.invoke_shell()
channel_data = str()

while True:
    if channel.recv_ready():
        channel_data += channel.recv(9999).decode(encoding='utf_8', errors='strict')
        os.system('cls')
        print("##### Device Output #####")
        print("\n", channel_data)
        print("\n #####################")
    else:
        continue
    time.sleep(5)
    if channel_data.endswith('[root#home ~]#'):
        # if block statements are not executed, why?
        print("Hi, why not executing this statement")
        ssh.send('cd /\n')
        # stdin, stdout, stderr = ssh.exec_command('pwd')
        # output1 = stdout.readlines()
        print("My present working directory is")
    elif channel_data.endswith('[root#home /]#'):
        # Also, elif block statements are not executed, why?
        ssh.send('mkdir BB444')
        # stdin, stdout, stderr = ssh.exec_command('mkdir /pn444')
        # output1 = stdout.readlines()
        print("created pn444 directory")
I am using Paramiko for the SSH connection, and I am able to log in to the Linux machine. Then I check the condition: if channel_data.endswith('[root#home ~]#'), send the "cd /" command; else if channel_data.endswith('[root#home /]#'), send the 'mkdir BB444' command. But this script is not sending these commands. After debugging, I see that these send statements are never executed. Please let me know what mistake I am making here.
I am using Python 3.6 and Paramiko 2.1.1.
The problem might not be the Python script. I encountered similar problems while sending remote commands to a PuTTY SSH connection from a Python script. If you are on a corporate network, some administrators strip the commands from parameters like -m (which I use for PuTTY), so you can log in, but when you try to send a remote command, it never reaches the server. If you are on a corporate network, check with your administrator about remote commands over SSH connections.
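Independent of that, a quick way to see why the endswith() checks might never match is to print the raw tail of the buffer; prompts often end with a space or differ slightly from what you expect. A minimal sketch to drop into the receive branch of the loop above (the regex is only an assumption about what your prompt looks like):

import re

# Right after channel_data is updated:
print(repr(channel_data[-40:]))  # reveals trailing spaces/newlines that break endswith()

# A more tolerant prompt match, allowing trailing whitespace:
if re.search(r'\[\w+[@#]\w+ ~\]#\s*$', channel_data):
    channel.send('cd /\n')  # note: send() is a method of the channel, not the SSHClient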