Cassandra restore from incremental backup

I want to use incremental backups as the main Cassandra backup type in my system, but I have some misunderstandings:
The only way to restore from an incremental backup that has worked for me is to copy the files from the backup folder to the table folder. Is that the right way to do it?
Can I somehow back up table/keyspace parameters, like indexes, replication factor, etc.?
Thanks.

If you have not dropped the table, then for restoration:
Stop the C* node.
Copy the backup folder to the table directory. You are right about that.
Start the server.
If you want the schema, check the snapshot folder. There you can find schema.cql.
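For the table/keyspace parameters, that schema.cql is plain CQL you can re-run before copying the data back: the replication factor lives in the keyspace DDL and indexes in their own statements. As a minimal sketch, a restore script looks roughly like this (keyspace, table, and index names here are made up for illustration):

-- Hypothetical DDL of the kind captured in schema.cql
CREATE KEYSPACE my_keyspace
  WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 3};

CREATE TABLE my_keyspace.my_table (
  id uuid PRIMARY KEY,
  name text
);

CREATE INDEX my_table_name_idx ON my_keyspace.my_table (name);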

Related

How to configure path for Delta Live Table in cloud_files

I am new to Databricks Delta Live Tables. I have some small doubts and need your help to understand the concept behind it. I am unable to proceed without this.
I have a file in an Azure Data Lake container, and I know that I need to give the path under "cloud_files" so that the Delta Live Table can read files from this folder and show them. But my doubt is: if I give only the path, how do I mention the storage account name and container name? Also, do I need to provide an access key in order to read the data securely?
I think I am missing something. I have gone through various articles and YouTube demo videos, and everywhere they just mention the path but do not tell me how to configure it.
Please help me to understand this concept.
Thank you.
This is my code for the Delta Live Table:
CREATE LIVE TABLE customers_raw
COMMENT "This is raw table"
AS
SELECT *
FROM cloud_files("/raw_data/customers.csv", "csv")
You need to specify the full URL for this folder, like abfss://<container>@<storage>.dfs.core.windows.net/raw_data/customers.csv. Otherwise, if you specify just /raw_data/customers.csv, it will be treated as a folder on DBFS and will fail. Please note that in this case you will need to set up the corresponding Spark properties so DLT can access the data; you can find the details in the linked answer.
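Putting that together with the table definition from the question, a minimal sketch (the container and storage account names are placeholders; cloud_files is a streaming source, so a streaming live table is used here):

CREATE OR REFRESH STREAMING LIVE TABLE customers_raw
COMMENT "This is raw table"
AS
SELECT *
FROM cloud_files("abfss://mycontainer@mystorageaccount.dfs.core.windows.net/raw_data/", "csv")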

Azure Data Factory - Recording file name when reading all files in folder from Azure Blob Storage

I have a set of CSV files stored in Azure Blob Storage. I am reading the files into a database table using the Copy Data task. The source is set to the folder where the files reside, so it's grabbing each file and loading it into the database. The issue is that I can't seem to map the file name in order to read it into a column. I'm sure there are more complicated ways to do it, for instance first reading the metadata and then reading the files in a loop, but surely the file metadata should be available to use while traversing the files?
Thanks
This is not possible in a regular copy activity. Mapping Data Flows has this possibility; it's still in preview, but maybe it can help you out. If you check the documentation, you'll find an option to specify a column to store the file name.

Getting info from an MS SQL .bak file

I am writing an Electron app that, among many other things, restores an unknown .bak file to an MS SQL server and then extracts more information. To do this successfully, I need to extract some info from that .bak file programmatically (so SSMS cannot be used). I will be using sqlcmd, since that can be run by Electron's Node.js backend. Unfortunately, I have a bit of a chicken-and-egg problem: it seems I cannot restore a .bak file without knowing things about the paths for the .mdf files specified within it (which cannot be found without first restoring it). There is a RESTORE WITH MOVE option, though this also seems to require knowledge of the paths inside the .bak, which cannot be determined from the .bak itself. How might I get this information, or is it impossible?
Read about RESTORE FILELISTONLY.
In the documentation for RESTORE you'll find further statements one can use together with RESTORE in order to fetch metadata.
The result set returned by FILELISTONLY will give you the LogicalName, the file's type (Data or Log), information about the file group, and much more.
The other statements provide other metadata. Just check it out...
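A minimal T-SQL sketch of the round trip (the backup path and logical names are placeholders):

-- Inspect the backup without restoring it
RESTORE FILELISTONLY FROM DISK = N'C:\backups\unknown.bak';

-- Use the LogicalName values it returns to relocate the files on restore
RESTORE DATABASE RestoredDb
FROM DISK = N'C:\backups\unknown.bak'
WITH MOVE N'DataLogicalName' TO N'C:\data\RestoredDb.mdf',
     MOVE N'LogLogicalName' TO N'C:\data\RestoredDb_log.ldf';

Both statements run from sqlcmd, so the app can parse the FILELISTONLY result set and build the MOVE clauses from it.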

Azure DataFactory Incremental BLOB copy

I've made a pipeline to copy data from one blob storage to another. I want an incremental copy if possible, but haven't found a way to specify it. The reason is that I want to run this on a schedule and only copy any new data since the last run.
If your blobs are named with a timestamp, you could follow this doc to copy partitioned data. You could use the Copy Data tool to set up the pipeline: select a tumbling window, then in the file path field input {year}/{month}/{day}/fileName and choose the right pattern. It will help you construct the parameters.
If your blobs are not named with a timestamp, you could use the Get Metadata activity to check the last modified time. Please reference this post.
An event trigger is just one way to control when the pipeline should run. You could also use a tumbling window trigger or a schedule trigger in your scenario.
I'm going to presume that by 'incremental' you mean new blobs added to a container. There is no easy way to copy changes to a specific blob.
So, this is not possible automatically when running on a schedule since 'new' is not something the scheduler can know.
Instead, you can use a Blob created Event Trigger, then cache the result (Blob name) somewhere else. Then, when your schedule runs, it can read those names and copy only those blobs.
You have many options to cache. A SQL Table, another blob.
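For the SQL-table option, a minimal sketch of such a cache (the table and column names are hypothetical):

-- Each row records one blob name written by the Blob created event trigger
CREATE TABLE NewBlobCache (
    BlobName NVARCHAR(1024) NOT NULL,
    CapturedAt DATETIME2 NOT NULL DEFAULT SYSUTCDATETIME(),
    Processed BIT NOT NULL DEFAULT 0
);

The scheduled pipeline then reads the rows where Processed = 0, copies those blobs, and flips the flag.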
Note: The complication here is trying to do this on a schedule. If you can adjust the parameters to merely copy every new file, it's very, very easy because you can just copy the blob that created the trigger.
Another option is to copy the blob on create using the trigger to a temporary/staging container, then use a schedule to move those files to the ultimate destination.

Where Sticky Notes are saved in Windows 10 1607

It seems like Sticky Notes are no longer saved in %AppData%\Microsoft\Sticky Notes\
I even did a search for *.SNT with no results.
It seems like Microsoft has changed the way Windows handles notes. Does anyone know where the notes are saved now and how to back up/restore them?
Use this document to transfer the Sticky Notes data file StickyNotes.snt to the new format:
http://www.winhelponline.com/blog/recover-backup-sticky-notes-data-file-windows-10/
Restore:
Go to %LocalAppData%\Packages\Microsoft.MicrosoftStickyNotes_8wekyb3d8bbwe\LocalState
Close Sticky Notes.
Create a new folder named Legacy.
Under the Legacy folder, copy your existing StickyNotes.snt and rename it to ThresholdNotes.snt.
Start the Sticky Notes app. It reads the legacy .snt file and transfers the content to the database file automatically.
Backup:
Just back up the following file:
%LocalAppData%\Packages\Microsoft.MicrosoftStickyNotes_8wekyb3d8bbwe\LocalState\plum.sqlite
It appears Microsoft now stores them in a SQLite database file called plum.sqlite located here:
C:\Users\%username%\AppData\Local\Packages\Microsoft.MicrosoftStickyNotes_8wekyb3d8bbwe\LocalState\plum.sqlite
It depends on the version of Windows 10 you're using. Starting with the Windows 10 Anniversary Update (version 1607), Sticky Notes stores its data in the following directory:
%UserProfile%\AppData\Local\Packages\Microsoft.MicrosoftStickyNotes_8wekyb3d8bbwe
If you have an older version of Windows 10, it stores the data in the following directory:
%UserProfile%\AppData\Roaming\Microsoft\StickyNotes\StickyNotes.snt
Here's what I found: C:\Users\User\AppData\Local\Packages\Microsoft.MicrosoftStickyNotes_8wekyb3d8bbwe\TempState
There are snapshots of your sticky notes in .png format. Open them and recreate your notes.
Sticky notes in Windows 10 are stored here:
C:\Users\"Username"\AppData\Roaming\Microsoft\Sticky Notes
If you want to restore your sticky notes from earlier versions of Windows, just copy the .snt file and place it in the above location.
N.B.: Replace only if you don't have any new notes in Windows 10!
If you can't find the .snt folder and the above answers don't work for you, you can simply take the plum.sqlite file and read it online or in a SQLite editor.
For the online route, go to http://inloop.github.io/sqlite-viewer/, browse to C:\Users\YOURUSERNAME\AppData\Local\Packages\Microsoft.MicrosoftStickyNotes_8wekyb3d8bbwe\LocalState,
pick the SQLite file, and run a query. Select from the Note table and you will find a row for each sticky note you lost. Copy the contents of the Text column; all your data is there.
Enjoy!
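In that viewer or any SQLite client, the query is a one-liner (the Note table and Text column are as described above):

-- One row per sticky note stored in plum.sqlite
SELECT Text FROM Note;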
In Windows 10 you can recover them this way (there is no .snt file):
Open Run.
Go to %LocalAppData%\Packages\Microsoft.MicrosoftStickyNotes_8wekyb3d8bbwe
Copy the old Microsoft.MicrosoftStickyNotes_8wekyb3d8bbwe folder
and replace the new Microsoft.MicrosoftStickyNotes_8wekyb3d8bbwe with it.
Check your sticky notes now; you will get all your data back.
It worked for me when the HDD with Windows 8.1 crashed and my new HDD had Windows 10.
Important to know:
- Create the Legacy folder mentioned in this link.
- Remember to rename StickyNotes.snt to ThresholdNotes.snt.
- Restart the app.
Find details here:
https://www.reddit.com/r/Windows10/comments/4wxfds/transfermigrate_sticky_notes_to_new_anniversary/
