Update an ini file with conf file data using a shell script - Linux

I have the two files below, a.conf and b.ini.
File a.conf contains driver paths, which need to be written into b.ini against the matching driver.
a.conf
#find driver directory and replace that
oracle=/client64/lib #bla bla
db2=/opt/db2/lib
#dvs=/opt/dvs/lib
b.ini
[SQLSERVER]
Driver = /opt/local/lib/libtdsodbc.so
HOST = 192.168.220.156
PORT = 1433
TDS_VERSION = 8.0
[ORACLE]
Driver=/usr/lib/oracle/19.5/client64/lib/libsqora.so.19.1
HOST = 192.168.220.182
PORT = 1521
I have to write a shell script that reads all the uncommented values from a.conf and updates the Driver path in the matching section of b.ini.
I am new to shell scripting; any kind of help would be appreciated.
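A minimal sketch of one way to do this, assuming GNU sed (for -i) and that each a.conf key (e.g. oracle) names the b.ini section ([ORACLE]) case-insensitively. It writes the a.conf value into the Driver line verbatim, since the question does not say how a partial path like /client64/lib should be merged into a longer existing path:

#!/bin/bash
# Sketch: for every uncommented key=value pair in a.conf,
# rewrite the Driver line inside the matching [SECTION] of b.ini.
conf=a.conf
ini=b.ini

while IFS='=' read -r key value; do
    # skip blank lines and commented-out entries
    [[ -z $key || $key == \#* ]] && continue
    # drop any trailing inline comment and surrounding whitespace
    value=${value%%#*}
    value=$(echo "$value" | xargs)
    section=$(echo "$key" | tr '[:lower:]' '[:upper:]')
    # within [SECTION] .. next [ header, replace the Driver value
    sed -i "/^\[$section\]/,/^\[/ s|^\([Dd]river[[:space:]]*=[[:space:]]*\).*|\1$value|" "$ini"
done < "$conf"

Keys with no matching section in b.ini (db2 in the sample) are simply no-ops.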

Related

How can I prevent res.sendFile from changing the file path?

I'm using Node.js and want to send a file to the frontend, so I specified the direct path to my file like:
path = "c:/app/A"
and when I run res.sendFile(path, fileName); I get:
Error: ENOENT: no such file or directory, stat '/home/projects/c:/app/A'
How can I stop it from automatically prepending the "/home/projects" part?
I want to download a file that is not in my project folder. The file is on my computer in a different folder.
Try using \\ as the path delimiter on Windows (c:\\app\\A) and read about the Node.js "path" module.
So I just needed to use new URL(`file:${absPath}`), like this:
const fs = require('fs');

let filename = "someName.com";
let absPath = "c:/app/someName.com";
// read the file from outside the project via a file: URL and copy it locally
fs.writeFileSync(filename, fs.readFileSync(new URL(`file:${absPath}`)));
// then serve the local copy
res.download(filename, filename);
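For context: Express's res.sendFile wants either an absolute path, or a relative path combined with a root option; anything else is resolved against the server's working directory, which is where the /home/projects prefix came from. A minimal sketch, with an illustrative route name and file location:

const path = require('path');
const express = require('express');
const app = express();

app.get('/download', (req, res) => {
  // option 1: hand sendFile an absolute path
  res.sendFile(path.resolve('/data/files/A'));
  // option 2: pass a file name plus a root directory
  // res.sendFile('A', { root: '/data/files' });
});

app.listen(3000);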

pySpark local mode - loading text file with file:/// vs relative path

I am just getting started with Spark and am trying out examples in local mode...
I noticed that in some examples the RDD is created from a relative path to the file, while in others the path starts with "file:///". The second option did not work for me at all: "Input path does not exist".
Can anyone explain the difference between using the plain file path and putting "file:///" in front of it?
I am using Spark 2.2 on a Mac, running in local mode.
from pyspark import SparkConf, SparkContext
conf = SparkConf().setMaster("local").setAppName("test")
sc = SparkContext(conf = conf)
# This works, using the relative path
lines = sc.textFile("code/test.csv")
# This does not work
lines = sc.textFile("file:///code/test.csv")
sc.textFile("code/test.csv") means test.csv in /<hive.metastore.warehouse.dir>/code/test.csv on HDFS.
sc.textFile("hdfs:///<hive.metastore.warehouse.dir>/code/test.csv") is equal to above.
sc.textFile("file:///code/test.csv") means test.csv in /code/test.csv on local file system.

Selenium webdriver issue with file paths

I'm having an issue with the Selenium standalone webdriver used with the webdriver-manager npm module. I'm using the Firefox Gecko driver. I need to select a file from an HTML file input element. When I try this on my local machine or on BrowserStack I get the error:
"WebDriverError: File not found: /Users/christophergrigg/a.pdf"
const requestFile = By.id('requestFile');
driver.wait(until.elementLocated(requestFile));
const requestFileEl = driver.findElement(requestFile);
driver.wait(until.elementIsVisible(requestFileEl), TIMEOUT).click();
requestFileEl.sendKeys('/Users/christophergrigg/a.pdf');
requestFileEl.sendKeys(webdriver.Key.ENTER);
On BrowserStack I'm using this path:
requestFileEl.sendKeys('C:\\Desktop\\documents\\pdf-sample2.pdf'); // Windows 7 / 8 / 8.1
You need to provide the full path to the file. If the file is not present on the machine running the remote instance, you will also have to set a file detector so the file is uploaded automatically.
On macOS:
var remote = require('selenium-webdriver/remote');
driver.setFileDetector(new remote.FileDetector());
requestFileEl.sendKeys('/Users/christophergrigg/Desktop/a.pdf');
or on Windows:
var remote = require('selenium-webdriver/remote');
driver.setFileDetector(new remote.FileDetector());
requestFileEl.sendKeys('C:\\Users\\christophergrigg\\Desktop\\a.pdf');
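Putting the two pieces together with the question's locator, the order matters: register the file detector before calling sendKeys. A sketch in the question's promise-manager style, reusing TIMEOUT and the path from above:

var remote = require('selenium-webdriver/remote');

// register the detector first so local files are uploaded transparently
driver.setFileDetector(new remote.FileDetector());

var requestFile = By.id('requestFile');
driver.wait(until.elementLocated(requestFile), TIMEOUT);
var requestFileEl = driver.findElement(requestFile);
// sendKeys on the <input type="file"> element itself, with an absolute path
requestFileEl.sendKeys('/Users/christophergrigg/Desktop/a.pdf');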

Create lnk shortcut from lua (without lfs)

I would like to write a function that creates a Windows .lnk file from my Lua script. I found a function in the LuaFileSystem library. Is there a way to do this without the library? (The reason: I am writing the script for multiple users, and it would be nice if we didn't have to install the library on every machine.)
I appreciate the help!
To make a shortcut (an .lnk file):
-- your .lnk file
local your_shortcut_name = "your_shortcut.lnk"
-- target (file or folder) with full path
local your_target_filespec = [[C:\Windows\notepad.exe]]
-- feed a one-liner to PowerShell via its stdin ("-command -")
local ps = io.popen("powershell -command -", "w")
ps:write("$ws = New-Object -ComObject WScript.Shell;$s = $ws.CreateShortcut('"..your_shortcut_name.."');$s.TargetPath = '"..your_target_filespec.."';$s.Save()")
ps:close()
To make a symlink instead of a shortcut, simply use os.execute with mklink, as sketched below.
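(A sketch; mklink is a cmd builtin, so it needs a shell with admin rights or Developer Mode enabled, and directory links need /D. The paths are illustrative.)

-- file symlink: mklink <link> <target>
os.execute([[mklink "C:\temp\notepad_link.exe" "C:\Windows\notepad.exe"]])
-- directory symlink needs /D
os.execute([[mklink /D "C:\temp\windir" "C:\Windows"]])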
Using luacom is faster than PowerShell:
local luacom = require 'luacom'

local shortcut_file_path = 'test_create_shortcut.lnk'
local target_file_path = arg[0]  -- use this script itself as the target

local shellObject = luacom.CreateObject("WScript.Shell")
local shortcut = shellObject:CreateShortcut(shortcut_file_path)
shortcut.TargetPath = target_file_path
shortcut:Save()

assert(io.open(shortcut_file_path)):close()  -- the shortcut file exists now
os.remove(shortcut_file_path)
To read shortcut info back, use the FileSystemObject COM object, or the Windows shell link file format spec for Kaitai Struct (which parses the binary file structure of various formats); that is something 'lfs' cannot do at present.
See: Create a desktop shortcut with Windows Script Host - Windows Client | Microsoft Docs
and the LuaCOM User Manual (Version 1.3).

Spark standalone without HDFS

I've been trying a simple wordcount app on Spark standalone.
I have one Windows machine and one Linux machine;
Windows runs the master and a slave,
Linux runs a slave.
Connecting them was fast and simple.
I want to avoid using HDFS, but I do want to work on a cluster.
My code so far is:
String fileName = "full path at client";
File file = new File(fileName);
// java.io.File already yields a proper file:// URI
String uri = file.toURI().toString();
SparkConf conf = new SparkConf().setAppName("stam").setMaster("spark://192.168.15.17:7077").setJars(new String[] { ..,.. });
sc = new JavaSparkContext(conf);
sc.addFile(uri);
JavaRDD<String> textFile = sc.textFile(SparkFiles.get(getOnlyFileName(fileName))).cache();
This fails with
Input path does not exist:........
or
java.net.URISyntaxException: Relative path in absolute URI
depending on what I try; the error comes from the Linux slave.
Any idea if this is possible?
The file is being copied to all the slaves' work directories.
Please help.
This cannot be done.
I've moved from standalone to YARN.
