How can I safely move the CBB database? - cloudberry

I want to move the CBB database to the 'data drive' so that it is included in the regular backups.
1) Is this possible?
2) If so, how can it be done safely?
Is it as simple as copying the .db file to a new location and then pointing the CBB application to it?

Why do you need to back it up? With every backed-up file it becomes more and more outdated. If anything happens, all you need to do is create the storage account again, pick the same bucket, and rebuild the database.
Go to Tools -> Options -> Repository tab -> Synchronize
CLI:
cbb.exe account -l
cbb.exe account -s "accountname"
-l will list all accounts
To move it:
cbb.exe option -databaseLocation path
(where "path" is a new repository file destination. As a result, the CBBackup.db file will be moved to the new location.)
Or just click Tools -> Options -> Repository tab -> Shield icon
Make sure that you don't have any running plans.
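For example, to move the repository to a folder on the data drive (the path here is only a placeholder for your own location):
cbb.exe option -databaseLocation "D:\Data\CBB"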

Related

What Visual Studio Code extension API should be used to add folders and move files in a Visual Studio Code workspace?

I want to write an extension that will:
Read the folders and files in the current workspace
Reorganise those files by creating new folders and moving the files into them
So I need an API to do the above. I am not clear on whether I should use
the standard fs node module,
the File System Provider or
the workspace namespace.
I read the answer here, but it doesn't say what to use to create folders and move files within the workspace.
The FileSystemProvider is meant for serving files from non-local storage, such as FTP sites or virtual file systems on remote devices, and presenting them to Visual Studio Code as workspace directories. The FileSystemProvider is a view/controller of the remote storage; in other words, you have to implement all the file manipulation operations yourself by communicating with that storage.
If you just want to manipulate the files in the current workspace, and also be able to use URIs from FileSystemProviders, use vscode.workspace.fs.
You can also use the Node.js fs module, but that only handles local-disk workspaces (URIs with the file: scheme). I recommend using the synchronous versions of the fs methods; I had some trouble with the asynchronous fs methods in Visual Studio Code (I did not know about vscode.workspace.fs at the time).
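As a rough sketch of the vscode.workspace.fs approach (the folder name 'src' and the .ts filter are just examples, and this assumes a single-folder workspace):
import * as vscode from 'vscode';

// Create a new folder in the workspace root and move all top-level .ts files into it.
async function reorganise(): Promise<void> {
    const root = vscode.workspace.workspaceFolders?.[0]?.uri;
    if (!root) {
        return;
    }
    // createDirectory also creates intermediate folders as needed.
    const target = vscode.Uri.joinPath(root, 'src');
    await vscode.workspace.fs.createDirectory(target);
    for (const [name, type] of await vscode.workspace.fs.readDirectory(root)) {
        if (type === vscode.FileType.File && name.endsWith('.ts')) {
            // rename() also moves a file when source and target are in different folders.
            await vscode.workspace.fs.rename(
                vscode.Uri.joinPath(root, name),
                vscode.Uri.joinPath(target, name),
                { overwrite: false }
            );
        }
    }
}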
I found extension.ts, Microsoft's sample code for implementing a FileSystemProvider.
The steps below show how to create a new folder (createDirectory) and how to move files within the workspace (use copy to copy the files from the old folder into the new one, then delete if you don't want the files to remain in the old folder).
The first step in FileSystemProvider is to register the filesystem provider for a given scheme using registerFileSystemProvider:
const memFs = new MemFS();
context.subscriptions.push(
    vscode.workspace.registerFileSystemProvider('memfs', memFs, {
        isCaseSensitive: true
    }));
Next, register a command with registerCommand to perform your FileSystemProvider operations, such as readDirectory, readFile, createDirectory, copy, and delete.
context.subscriptions.push(
    vscode.commands.registerCommand('memfs.init', _ => {
        // TODO: add the required functionality
    }));
Read folders and read files
for (const [name] of memFs.readDirectory(vscode.Uri.parse('memfs:/'))) {
    memFs.readFile(vscode.Uri.parse(`memfs:/${name}`));
}
Create the new folder
memFs.createDirectory(vscode.Uri.parse(`memfs:/folder/`));
Move files to the newly created folder. Unfortunately there doesn't seem to be a separate move-file command, but you can use copy to copy files into the new folder and then delete the originals (a short sketch follows the DELETE example below):
COPY
copy(source: Uri, destination: Uri, options: {overwrite: boolean})
DELETE
context.subscriptions.push(vscode.commands.registerCommand('memfs.reset', _ => {
    for (const [name] of memFs.readDirectory(vscode.Uri.parse('memfs:/'))) {
        memFs.delete(vscode.Uri.parse(`memfs:/${name}`));
    }
}));
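Putting the two together, a "move" of a single file could look like this (the paths are illustrative, and this assumes the provider implements the optional copy method; the MemFS sample may need it added, or you can use its rename method to the same effect):
// Copy the file into the new folder, then remove the original so the net effect is a move.
memFs.copy(
    vscode.Uri.parse('memfs:/old/example.txt'),
    vscode.Uri.parse('memfs:/folder/example.txt'),
    { overwrite: true }
);
memFs.delete(vscode.Uri.parse('memfs:/old/example.txt'));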

Load Azure ML experiment run information from datastore

I have lots of run files, created by running PyTorch Estimator / ScriptRunStep experiments, that are saved in an Azure ML blob storage container. Previously I'd been viewing these runs in the Experiments tab of the ml.azure.com portal and attaching tags to them to categorise and load the desired models.
However, a coworker recently deleted my workspace. I created a new one that is connected to the previously existing blob container, so the run files still exist and can be accessed from this new workspace, but they no longer show up in the Experiment viewer on ml.azure.com, and I can no longer see the tags I'd attached to the runs.
Is there any way to load these old run files into the Experiment viewer or is it only possible to view runs created inside the current workspace?
Sample scriptrunconfig code:
data_ref = DataReference(datastore=ds,
                         data_reference_name="<name>",
                         path_on_datastore="<path>")
args = ['--data_dir', str(data_ref),
        '--num_epochs', 30,
        '--lr', 0.01,
        '--classifier', 'int_ext']
src = ScriptRunConfig(source_directory='.',
                      arguments=args,
                      compute_target=compute_target,
                      environment=env,
                      script='train.py')
src.run_config.data_references = {data_ref.data_reference_name: data_ref.to_config()}
Sorry for your loss! First, I'd make absolutely sure that you can't recover the deleted workspace; it's definitely worthwhile to open a priority support ticket with Azure.
Another thing you might try is:
create a new workspace (which will create a new storage account for you for the new workspace's logs)
copy your old workspace's data into the new workspace's storage account.
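If the old storage account still exists, one way to do that copy is with AzCopy; this is just a sketch, and the account names, container names, and SAS tokens below are placeholders:
azcopy copy "https://<old-account>.blob.core.windows.net/<container>?<SAS>" "https://<new-account>.blob.core.windows.net/<container>?<SAS>" --recursive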

Possible to edit web.config of cloud app deployed on windows Azure without redeploying app?

I would like to add URL rewrite rules to an Azure web app's web.config without redeploying the whole app. For this I am using the App Service Editor and the Kudu debug console to edit web.config; at first I couldn't save the file and got an error.
After some searching I found that, under the app settings, a key's value should be 0 instead of 1.
I changed the value from 1 to 0 and saved the app setting, and after that I was able to edit the config file. To test the code again I changed the value back from 0 to 1 and saved the setting, but when I refresh the file in the editor or Kudu, the pasted code has disappeared. The site is connected to an automatic Azure deployment pipeline.
How can I edit the web.config file without redeploying the app?
Yes, it's possible to make changes without redeploying the app.
Some details:
Check the Run From Package documentation and we can find:
1. The zip package won't be extracted to D:\home\site\wwwroot; instead it will be uploaded directly to D:\home\data\SitePackages.
2. A packagename.txt, which contains the name of the ZIP package to load at runtime, will be created in the same directory.
3. App Service mounts the uploaded package as the read-only wwwroot directory and runs the app directly from that mounted directory. (That's why we can't edit the read-only wwwroot directory directly.)
So my workaround is:
1. Navigate to D:\home\data\SitePackages via the Kudu debug console.
Download the zip (in my case 20200929072235.zip) that represents your deployed app, extract it, and make your changes to the web.config file.
2. Zip those files (select them and right-click...) into a childtest.zip. Please follow my steps carefully here; the folder structure Run From Package expects is a bit unusual.
3. Then zip childtest.zip into parenttest.zip (when you upload an xx.zip, Kudu always extracts it automatically, so we have to wrap childtest.zip inside parenttest.zip first).
4. Drag and drop the local parenttest.zip into the online SitePackages folder in the Kudu debug console; after Kudu extracts it, childtest.zip appears there.
5. Modify packagename.txt, changing its content from 20200929072235.zip to childtest.zip, and save.
Done~
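For steps 2 and 3, the nested zip can be produced locally, for example with PowerShell (the file and folder names follow the example above and are otherwise arbitrary):
# Zip the extracted, edited site files into the inner archive
Compress-Archive -Path .\extracted\* -DestinationPath .\childtest.zip
# Wrap the inner archive in an outer one, so Kudu's automatic extraction leaves childtest.zip behind
Compress-Archive -Path .\childtest.zip -DestinationPath .\parenttest.zip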
Check and test:
Now open the App Service Editor to check the changes.
In addition: though this answers the original question, I recommend using another deployment method (Web Deploy, etc.) instead; it can be much easier~

IFileProvider Azure File storage

I am thinking about implementing the IFileProvider interface on top of Azure File Storage.
What I am trying to find in the docs is whether there is a way to send the whole path to the file to the Azure API, like rootDirectory/sub1/sub2/example.file, or whether that should be mapped to some recursive function that takes the path and traverses the directory structure on the file storage.
I just want to make sure I am not missing something and reinventing the wheel for something that already exists.
[UPDATE]
I'm using the Azure Storage Client for .NET. I would not like to mount anything.
My intention is to have several IFileProviders which I could switch between based on environment and other conditions.
So, for example, if my environment is Cloud then I would use the IFileProvider implementation that uses Azure File Storage through the Azure Storage Client. If my environment is MyServer then I would use the server's local file system. A third option would be an environment someOther with its own implementation.
Now, for all of them, IFileProvider operates with a path like root/sub1/sub2/sub3. For Azure File Storage, is there a way to send the whole path at once to get the sub3 info/content, or should the path be broken into individual directories, getting a reference/content at each step?
I hope that clears the question.
To access a specific subdirectory nested several levels deep, you can pass the whole relative path to the GetDirectoryReference method to construct the CloudFileDirectory, as follows:
var fileshare = storageAccount.CreateCloudFileClient().GetShareReference("myshare");
var rootDir = fileshare.GetRootDirectoryReference();
var dir = rootDir.GetDirectoryReference("2017-10-24/15/52");
var items = dir.ListFilesAndDirectories();
To access a specific file under the subdirectory, you can likewise use the GetFileReference method with the whole path to get the CloudFile instance:
var file = rootDir.GetFileReference("2017-10-24/15/52/2017-10-13-2.png");

Rename an Azure Function

How to rename an Azure Function?
I want to replace the default 'HttpTriggerCSharp1' name with my own. At the moment, unfortunately, this name is included in the function URL and there is no option to change it:
https://functions-xxx.azurewebsites.net/api/HttpTriggerCSharp1
The UI does not directly support renaming a Function, but you can work around this using the following manual steps:
Stop your Function App. To do this, go under Function app settings / Go To App Service Settings, and click on the Stop button.
Go to Kudu Console: Function app settings / Go to Kudu (article about that)
In Kudu Console, go to D:\home\site\wwwroot and rename the Function folder to the new name
Now go to D:\home\data\Functions\secrets and rename [oldname].json to [newname].json
Then go to D:\home\data\Functions\sampledata and rename [oldname].dat to [newname].dat
Start your function app, in the same place where you stopped it above
In the Functions UI, click the refresh button in the top left corner, and your renamed function should appear
Note: doing this can lose some historical logging.
Github Issue for renaming Azure Function
Edit for new info:
To anyone who, like me, arrived here looking to rename their function: although this was the correct answer at the time, there is now a much smoother CMD-based process, detailed in the answer by SLdragon below, and an even smoother GUI-based process, detailed in the answer by Amerdeep below.
Now (October 2017) we can use the console to rename an Azure Function:
Open the Console from your Function App -> Platform features.
Rename the Function folder from the command line (see the example below).
Restart the Function.
Refresh.
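The rename itself is just a folder move under wwwroot, for example (the new name here is illustrative):
cd D:\home\site\wwwroot
move HttpTriggerCSharp1 MyRenamedFunction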
Create a new function, which gives you the option to name it, then delete the default one (HttpTriggerCSharp1).
I know it's not renaming, but it's the easiest option around.
Go to Function Apps
Click on Platform features
Click on App Service Editor
Right-click on your default function name and select the rename option
The following worked for me.
I wanted to rename my Azure Function from "HttpTriggerCSharp1" to "my-new-func1".
Go to
Function Apps >
My-Function-App >
Platform Features TAB >
Console >
Run the commands below:
cd D:\home\site\wwwroot
move HttpTriggerCSharp1 my-new-func1
Now restart the application:
Function Apps >
My-Function-App >
Overview TAB >
Restart
NOTE: The function's 'code' query parameter changes when you do this.

Resources