Using node-adodb with .mdb Access Files - node.js

I am attempting to write a containerized Node application on Windows that takes in Microsoft Access databases and accesses the data within. I wish to use the npm package node-adodb to interact with Access.
My application works perfectly fine with .accdb Access files, but when I try to connect to .mdb Access files I get this error: "Spawn C:\Windows\SysWOW64\cscript.exe error, Provider cannot be found. It may not be properly installed." My code works on my local Windows computer, so I'm guessing it's something to do with how my container environment is set up.
I set up the base Dockerfile like this:
# Get base Windows OS image
FROM mcr.microsoft.com/windows/servercore:ltsc2019
# Set environment variables
ENV NPM_CONFIG_LOGLEVEL info
ENV NODEJS_VERSION 12.9.1
# Download & Install 2010 Access Driver
RUN powershell -Command "wget -Uri https://download.microsoft.com/download/2/4/3/24375141-E08D-4803-AB0E-10F2E3A07AAA/AccessDatabaseEngine.exe -OutFile AccessDatabaseEngine.exe -UseBasicParsing"
RUN powershell -Command "Start-Process -NoNewWindow -FilePath \"AccessDatabaseEngine.exe\""
# Download & Install 2016 Access Driver
RUN powershell -Command "wget -Uri https://download.microsoft.com/download/3/5/C/35C84C36-661A-44E6-9324-8786B8DBE231/accessdatabaseengine_X64.exe -OutFile accessdatabaseengine_X64.exe -UseBasicParsing"
RUN powershell -Command "Start-Process -NoNewWindow -FilePath \"accessdatabaseengine_X64.exe\""
# Download and install Node.js
RUN powershell -Command "wget -Uri https://nodejs.org/dist/v%NODEJS_VERSION%/node-v%NODEJS_VERSION%-x64.msi -OutFile node.msi -UseBasicParsing"
RUN msiexec.exe /q /i node.msi
# Run node
CMD [ "node" ]
I establish the Access connection like so. How I instantiate the connection differs depending on whether I'm in my local environment or online, and also on whether the file is .accdb or .mdb:
// Define the connection string & initialize the connection depending on whether it's a .accdb or .mdb file
const ADODB = require('node-adodb');

let access;
if ((file.path).toString().endsWith('accdb')) {
  const connStr = `Provider=Microsoft.ACE.OLEDB.12.0;Data Source=${file.path};Persist Security Info=False;`;
  access = ADODB.open(connStr); // This works
} else {
  const connStr = `Provider=Microsoft.Jet.OLEDB.4.0;Data Source=${file.path};`;
  access = ADODB.open(connStr); // This fails
}
Is there another software package that I need to install in order to work with .mdb files? Do I need to connect in a different way? Any help would be very much appreciated.

Since the error message comes from "C:\Windows\SysWOW64\cscript.exe" (the 32-bit script host), please make sure that you have the 32-bit version of the "Microsoft Access Database Engine" distribution package installed.
If you have the 64-bit engine package installed, open the database connection like this:
access = ADODB.open(connStr, true);
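As a quick diagnostic inside the container, you can enumerate the OLE DB providers that are actually registered. This is only a sketch of mine, not part of the original answer: the Jet 4.0 provider is 32-bit only, so it will only appear when you enumerate from a 32-bit process. Also note that the question's Dockerfile launches the engine installers via Start-Process without -Wait or a silent switch, so they may never finish during the image build; /quiet below is the assumed silent flag.
# List registered OLE DB providers; run from both 64-bit PowerShell and the
# 32-bit host (C:\Windows\SysWOW64\WindowsPowerShell\v1.0\powershell.exe),
# since Microsoft.Jet.OLEDB.4.0 only registers in the 32-bit view
(New-Object System.Data.OleDb.OleDbEnumerator).GetElements() |
    Select-Object SOURCES_NAME, SOURCES_DESCRIPTION
# Install the engine silently and block until setup completes, so the
# install actually lands in the image layer
Start-Process -FilePath "AccessDatabaseEngine.exe" -ArgumentList "/quiet" -Wait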

Related

Not able to login to azure using powershell when executing powershell via an API [duplicate]

I am getting the below error while executing a PowerShell script. The error occurs when I am trying to import a module:
File C:\Program Files\WindowsPowerShell\Modules\Az.Accounts\2.10.3\Az.Accounts.psm1 cannot be loaded because running scripts is disabled on this system. For more information, see about_Execution_Policies at https://go.microsoft.com/fwlink/?LinkID=135170.
C# code -
using (Automation.PowerShell ps = Automation.PowerShell.Create())
{
    ps.AddScript(script);
    ps.AddParameters(parameters);
    result = ps.Invoke();
}
It works fine when I run the script directly in a PowerShell console, but I get this error when running it via a C# web API on localhost.
Current Execution Policy -
PS C:\Program Files\WindowsPowerShell> Get-ExecutionPolicy
Unrestricted
PS C:\Program Files\WindowsPowerShell> cd .\Modules
PS C:\Program Files\WindowsPowerShell\Modules> Get-ExecutionPolicy
Unrestricted
PS C:\Program Files\WindowsPowerShell\Modules> cd Az.Accounts
PS C:\Program Files\WindowsPowerShell\Modules\Az.Accounts> Get-ExecutionPolicy
Unrestricted
PS C:\Program Files\WindowsPowerShell\Modules\Az.Accounts>
Could someone please help with resolving this? Thanks in advance!
I have tried setting the execution policy to Unrestricted and Bypass, but nothing worked.
Try setting the execution policy within your program:
using (Automation.PowerShell ps = Automation.PowerShell.Create())
{
    ps.AddCommand("Set-ExecutionPolicy").AddArgument("Unrestricted")
      .AddParameter("Scope", "CurrentUser");
    // Start a new statement so the script isn't piped into Set-ExecutionPolicy
    ps.AddStatement().AddScript(script);
    ps.AddParameters(parameters);
    result = ps.Invoke();
}
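If that doesn't help, check the policy at every scope, since the web API may run as a different user (for example, the app-pool identity) or be constrained by Group Policy. This diagnostic is my suggestion, not part of the original answer:
# Show the policy at each scope; MachinePolicy and UserPolicy come from
# Group Policy and override anything set with Set-ExecutionPolicy
Get-ExecutionPolicy -List
# Set the machine-wide policy so it also applies to the app-pool user
# (requires an elevated prompt)
Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope LocalMachine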

Alternative to New-SqlAzureKeyVaultColumnMasterKeySettings

I currently have a script that is similar to the script described in https://learn.microsoft.com/en-us/sql/relational-databases/security/encryption/configure-always-encrypted-keys-using-powershell?view=sql-server-ver15
However, the script uses PowerShell 5, which isn't available to me on the Linux agents in our Azure DevOps environment. Because of this, a majority of the Azure SQL commands aren't available to us. Is there an alternative to these SqlServer module cmdlets?
The recommended alternative is to use the cloud console, but you may be able to get away with Linux PowerShell like so:
I can't guarantee that this all works, but that specific command only creates a SqlColumnMasterKeySettings object that contains information about the location of your column master key. You can probably just create one manually, but you'll need to know the exact values. I would recommend running it from a Windows machine first to see what the values should be for your environment.
# On Windows [Optional]
$cmkSettings = New-SqlAzureKeyVaultColumnMasterKeySettings -KeyURL $akvKey.Key.Kid
$cmkSettings | Format-List KeystoreProviderName, KeyPath
# KeystoreProviderName : <take these strings>
# KeyPath              : <and use them in your script on Linux>

# Now on Linux, using the values above:
$cmkSettings = [Microsoft.SqlServer.Management.PowerShell.AlwaysEncrypted.SqlColumnMasterKeySettings]::new("YourKeystoreProviderName", "YourKeyPath")
New-SqlColumnMasterKey -Name 'CMK1' -InputObject $database -ColumnMasterKeySettings $cmkSettings
# Success!
The key settings properties are just strings that get saved to your SQL Instance, so this should work fine. The harder part is authenticating to Azure to create keys from your master key, but you can try importing the desktop version of the commands like so:
# Start a NEW powershell session without the sqlserver module:
pwsh
# Get the module directory:
$d = (Get-Item (Get-Module SqlServer).path).DirectoryName
# Import the desktop version of these assemblies:
Import-Module "$d/Microsoft.SqlServer.Diagnostics.Strace.dll"
Import-Module "$d/Microsoft.SqlServer.Management.PSSnapins.dll"
Import-Module "$d/Microsoft.SqlServer.Management.AzureAuthenticationManagement.dll"
Import-Module "$d/Microsoft.SqlServer.Management.AlwaysEncrypted.Types.dll"
# Then import the module normally (there will be errors - you can ignore these)
Import-Module SqlServer
# Now you may be able to authenticate to Azure to generate new keys:
# Required to generate new keys
# NOTE: -Interactive fails on linux
Add-SqlAzureAuthenticationContext -ClientID "YourID" -Secret "YourSecret" -Tenant "YourTenant"
# Create a key using your master key:
New-SqlColumnEncryptionKey -Name 'CEK1' -InputObject $database -ColumnMasterKey 'CMK1'
This worked on my installation of CentOS 7 / pwsh 7.1.3. Make sure you have SqlServer version 21.1.18245 (only 10 days old at the moment), as many new SQL commands were only recently ported to pwsh 7.1.
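To pin that module version on a Linux agent, standard PowerShellGet should do it (my addition, not from the original answer):
# Install the SqlServer module at the exact version mentioned above
Install-Module SqlServer -RequiredVersion 21.1.18245 -Scope CurrentUser -Force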

Transfer file from Windows to Linux without using 3rd party software and using Shell.Application only

How can I transfer a file from Windows Server to Linux without using 3rd-party software? I can only use a pure PowerShell script to transfer a zip file.
I'm using PowerShell v2.0 (I know it's pretty old, but I don't have the privilege to update to a current version; I can only use Shell.Application scripting).
Telnet connects successfully.
The destination server has the private/public key installed (which I generated from my server using PuTTYgen, but I have no privilege to install PuTTY or WinSCP).
$timestamp = (Get-Date).AddMonths(-1).ToString('yyyy-MM-dd')
$todaysDate = (Get-Date).AddDays(-1)
$source = "D:\Testing\*.csv", "D:\Testing\*.csv"
$target = "D:\Testing\bin\$timestamp.zip"
$housekeepZipFile = "D:\Testing\bin\*.zip"
$locationToTransfer = "D:\Testing\bin\*.zip"
$mftFileTransfer = "UserName#192.168.0.50:/UserName/Outbox"
Get-ChildItem -Path $locationToTransfer -Recurse | Where-Object {
    $_.LastWriteTime -gt (Get-Date).AddDays(-1)
} | Copy-Item -Destination $mftFileTransfer -Force
Is my syntax correct? I just tried it, but it seems no file is received.
I'm using Windows Server 2008.
As Ansgar already commented, keys are used with SSH/SFTP. There's no support for SSH/SFTP in PowerShell nor in Windows 2008. If you need to use SSH/SFTP, you have to use 3rd party software/library.
And as already said above, you do not need install privileges to use WinSCP or PuTTY/psftp; their portable/standalone executables run without installation.
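For example, the standalone pscp.exe can be run straight from any folder you can write to. A sketch, where the tool path, key path, and remote target are assumptions based on the question:
# Copy the zip files over SCP/SFTP using the standalone pscp.exe and the
# PuTTY-format private key generated with PuTTYgen (paths are hypothetical)
& "D:\Testing\tools\pscp.exe" -i "D:\Testing\keys\mykey.ppk" `
    "D:\Testing\bin\*.zip" "UserName@192.168.0.50:/UserName/Outbox"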

Clearing Azure Redis Cache using PowerShell during deployment

When deploying new versions of our web application to Azure App Service, I have a requirement to clear out the data in the associated Azure Redis Cache. This is to ensure that we don't return old versions of items which have schema changes in the new version.
We're deploying using Octopus Deploy, and I have previously tried executing the following PowerShell command to reset the cache:
Reset-AzureRmRedisCache -ResourceGroupName "$ResourceGroup" -Name "$PrimaryCacheName" -RebootType "AllNodes" -Force
This works successfully but it's a bit heavy-handed and we're having intermittent connection issues which I suspect are caused by the fact that we're rebooting Redis and dropping existing connections.
Ideally, I'd just like to execute a FLUSHALL command via PowerShell. Is this a better approach, and is it possible to execute in PowerShell using the StackExchange.Redis library?
The Reset-AzureRmRedisCache cmdlet restarts nodes of an Azure Redis Cache instance, which I agree is a bit overkill for your requirement.
Yes, it is possible to execute a Redis FLUSHALL command in PowerShell.
As the pre-requisite, you should install the Redis CLI and set an environment variable to point to the Redis CLI executable/binary path in your environment.
Then, you can execute the redis-cli command from PowerShell as shown below. Passing flushall on the connection line runs the command and exits, rather than opening an interactive session. (Note that plain redis-cli connects over the non-SSL port, which is disabled by default on Azure Redis.)
Invoke-Command -ScriptBlock { redis-cli -h <hostname>.redis.cache.windows.net -p <redisPort> -a <password> flushall }
The way I eventually implemented this is to call the StackExchange.Redis library via PowerShell, so you'll need to have a copy of this DLL somewhere handy. During my deployment, I have access to the connection string, so this function strips out the host and port to connect to the server. This works without the need to open the non-SSL port, and the connection string allows admin access to the cache:
function FlushCache($RedisConnString)
{
    # Extract the Host/Port from the start of the connection string (ignore the remainder)
    # e.g. MyUrl.net:6380,password=abc123,ssl=True,abortConnect=False
    $hostAndPort = $RedisConnString.Substring(0, $RedisConnString.IndexOf(","))

    # Split the Host and Port e.g. "MyUrl.net:6380" --> ["MyUrl.net", "6380"]
    $RedisCacheHost, $RedisCachePort = $hostAndPort.split(':')

    Write-Host "Flushing cache on host - $RedisCacheHost - Port $RedisCachePort" -ForegroundColor Yellow

    # Add the Redis type from the assembly
    $asm = [System.Reflection.Assembly]::LoadFile("StackExchange.Redis.dll")

    # Open a connection (allowAdmin=true is required for FLUSHALL)
    [object]$redis_cache = [StackExchange.Redis.ConnectionMultiplexer]::Connect("$RedisConnString,allowAdmin=true", $null)

    # Flush the cache
    $redisServer = $redis_cache.GetServer($RedisCacheHost, $RedisCachePort, $null)
    $redisServer.FlushAllDatabases()

    # Dispose connection
    $redis_cache.Dispose()

    Write-Host "Cache flush done" -ForegroundColor Yellow
}
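It can then be called with the full connection string, for example (made-up values matching the comment above):
# Example invocation; allowAdmin=true is appended inside the function
FlushCache "MyUrl.net:6380,password=abc123,ssl=True,abortConnect=False"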
I have used the Windows port of netcat to clear Redis cache remotely from my Windows machine, like so:
$redisCommands = "SELECT $redisDBIndex`r`nFLUSHDB`r`nQUIT`r`n"
$redisCommands | .\nc $redisServer 6379
Where $redisDBIndex is the Redis cache index you want to clear, or simply use the command FLUSHALL if you want to clear everything. $redisServer is your Redis server. Simply pipe the commands to nc.
I have also documented it here: https://jaeyow.github.io/fullstack-developer/automate-redis-cache-flush-in-powershell/#

Chef WebPI cookbook fails install in Azure

I set up a new Win2012 VM in Azure with the Chef plugin and have it connected to manage.chef.io. I added a cookbook which uses the WebPI cookbook to install ServiceBus and its dependencies. The install fails with the following error:
“Error opening installation log file. Verify that the specified log file location exists and is writable.”
After some searching, it looks like this is not new in Azure, based on this 2013 blog post: https://nemetht.wordpress.com/2013/02/27/web-platform-installer-in-windows-azure-startup-tasks/
It offers a hack to disable security on the folder temporarily, but I'm looking for a better solution.
Any ideas?
More of the log output -
Started installing: 'Microsoft Windows Fabric V1 RTM'
.
Install completed (Failure): 'Microsoft Windows Fabric V1 RTM'
.
WindowsFabric_1_0_960_0 : Failed.
Error opening installation log file. Verify that the specified log file location exists and is writable.
DependencyFailed: Microsoft Windows Fabric V1 CU1
DependencyFailed: Windows Azure Pack: Service Bus 1.1
.
..
Verifying successful installation...
Microsoft Visual C++ 2012 SP1 Redistributable Package (x64) True
Microsoft Windows Fabric V1 RTM False
Log Location: C:\Windows\system32\config\systemprofile\AppData\Local\Microsoft\Web Platform Installer\logs\install\2015-05-11T14.15.51\WindowsFabric.txt
Microsoft Windows Fabric V1 CU1 False
Windows Azure Pack: Service Bus 1.1 False
Install of Products: FAILURE
STDERR:
---- End output of "WebpiCmd.exe" /Install /products:ServiceBus_1_1 /suppressreboot /accepteula /Log:c:/chef/cache/WebPI.log ----
Ran "WebpiCmd.exe" /Install /products:ServiceBus_1_1 /suppressreboot /accepteula /Log:c:/chef/cache/WebPI.log returned -1
A Chef contact (thanks Bryan!) helped me understand this issue better. Some WebPI packages do not respect the explicit log path provided to WebPIcmd.exe. The author should fix the package to use the provided log path when it is set. So the options became:
1. Have the author fix the package.
2. Run Chef in a new scheduled task as a different user which has access to the AppData folder.
3. Edit the cookbook to perform/unperform a registry edit to temporarily move the AppData folder to a location that the System user has access to, either in my custom cookbook or a fork of the WebPI cookbook.
Obviously, waiting on the author (Microsoft in this case) to fix the package would not happen quickly.
Changing how the Azure VM runs Chef doesn't make sense, considering the whole idea is to provide the configuration at provisioning time and have it just work. Plus, changing the default setup may have unintended consequences and puts us in a non-standard environment.
In the short term, I decided to alter the registry in my custom cookbook.
registry_key 'HKEY_USERS\.DEFAULT\Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders' do
  values [{
    :name => "Local AppData",
    :type => :expand_string,
    :data => "%~dp0appdata"
  }]
  action :create
end

webpi_product 'ServiceBus_1_1' do
  accept_eula true
  action :install
end

webpi_product 'ServiceBus_1_1_CU1' do
  accept_eula true
  action :install
end

registry_key 'HKEY_USERS\.DEFAULT\Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders' do
  values [{
    :name => "Local AppData",
    :type => :expand_string,
    :data => '%%USERPROFILE%%\AppData\Local'
  }]
end
This change could also be made in the WebPI cookbook to fix this issue for all dependent cookbooks. I decided not to approach this until the WebPI team responds to a feature request for the framework to verify that packages respect the log path instead.
http://forums.iis.net/t/1225061.aspx?WebPI+Feature+Request+Validate+product+package+log+path+usage
Please go and reply to this thread to try to get the team to help protect against this common package issue.
Here is the solution with PowerShell.
I had the same error while installing the "Service Fabric SDK" during VMSS VM creation; there, too, the System user was being used.
Issue: when I connected over RDP with my "admin" user and ran the same install, it worked.
Solution: change the registry entry as above, install, and reset it back.
Here is my solution using PowerShell. I installed 2 .reg files into the %TEMP% folder. Their content is the old and the new exported key/value for "Local AppData":
plugin-sf-SDK-temp.reg
Windows Registry Editor Version 5.00
[HKEY_USERS\.DEFAULT\Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders]
"Local AppData"=hex(2):25,00,54,00,45,00,4d,00,50,00,25,00,00,00
plugin-sf-SDK-orig.reg
Windows Registry Editor Version 5.00
[HKEY_USERS\.DEFAULT\Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders]
"Local AppData"=hex(2):25,00,55,00,53,00,45,00,52,00,50,00,52,00,4f,00,46,00,\
49,00,4c,00,45,00,25,00,5c,00,41,00,70,00,70,00,44,00,61,00,74,00,61,00,5c,\
00,4c,00,6f,00,63,00,61,00,6c,00,00,00
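For reference, the hex(2) payloads are just the UTF-16LE bytes of the strings plus a null terminator; they decode to %TEMP% and %USERPROFILE%\AppData\Local. A sketch (my addition) to produce such a payload for your own .reg file:
# Encode a string as the REG_EXPAND_SZ hex(2) payload used in .reg files
$s = '%TEMP%'
$bytes = [System.Text.Encoding]::Unicode.GetBytes($s) + [byte[]](0, 0)
($bytes | ForEach-Object { $_.ToString('x2') }) -join ','
# Output: 25,00,54,00,45,00,4d,00,50,00,25,00,00,00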
Integrate the following code into your custom PowerShell script:
Write-Output "Reset LocalApp Folder to TEMP"
Start-Process "$($env:windir)\regedit.exe" `
-ArgumentList "/s", "$($env:TEMP)\plugin-sf-SDK-temp.reg"
## replace the following lines with your installation - here is my SF SDK installation via WebPICmd
Write-Output "Installing /Products:MicrosoftAzure-ServiceFabric-CoreSDK"
Start-Process "$($env:programfiles)\microsoft\web platform installer\WebPICMD.exe" `
-ArgumentList '/Install', `
'/Products:"MicrosoftAzure-ServiceFabric-CoreSDK"', `
'/AcceptEULA', "/Log:$($env:TEMP)\WebPICMD-install-service-fabric-sdk.log" `
-NoNewWindow -Wait `
-RedirectStandardOutput "$($env:TEMP)\WebPICMD.log" `
-RedirectStandardError "$($env:TEMP)\WebPICMD.error.log"
Write-Output "Reset LocalApp Folder to ORIG"
Start-Process "$($env:windir)\regedit.exe" `
-ArgumentList "/s", "$($env:TEMP)\plugin-sf-SDK-orig.reg"
