Managing weekly backups [closed] - linux

I have written a script that takes a MySQL dump and uploads it to S3. I added the script to cron so it runs at 2 AM every night and uploads the dump to S3, using the date and time stamp as the file name.
My problem is that I need to keep 7 days of backups on S3 and automatically delete the 8th-day backup file. Since I am using the date and time stamp as the file name to make each file unique, I am not able to figure out how to do this.
I also have to restore the latest backup on another EC2 instance.

You can grab the XML response from your S3 bucket host, such as
http://YOUR_BUCKET.s3.amazonaws.com/
It should return XML like:
<ListBucketResult xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
<Name>...</Name>
<Prefix/>
<Marker/>
<MaxKeys>1000</MaxKeys>
<IsTruncated>true</IsTruncated>
<Contents>
<Key>xxxx.gz (if you gzip the dump)</Key>
<LastModified>2011-11-10T02:38:49.000Z</LastModified>
<ETag>"xxxxx"</ETag>
<Size>xxx</Size>
<StorageClass>STANDARD</StorageClass>
</Contents>
With the value of the LastModified node, you can determine when each file was created.
S3 has SDK APIs available in different languages;
you can download one and then do the purging programmatically.
As for replicating, with the SDK API you can grab the content from the original S3 bucket and then post it to another S3 bucket.
SDK APIs:
http://aws.amazon.com/sdkforphp/ (PHP)
http://aws.amazon.com/sdkfornet/ (.Net)
http://aws.amazon.com/sdkforjava/ (Java)
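For example, here is a minimal sketch of the purge and restore steps in Python with boto3 (the bucket name and key prefix are placeholders, not part of the original answer):
# Minimal sketch: purge backups older than 7 days and fetch the newest one with boto3.
# The bucket name and key prefix below are placeholders.
from datetime import datetime, timedelta, timezone
import boto3

s3 = boto3.client('s3')
bucket = 'your-backup-bucket'
cutoff = datetime.now(timezone.utc) - timedelta(days=7)

objects = s3.list_objects_v2(Bucket=bucket, Prefix='mysql-dumps/').get('Contents', [])

# Purge: LastModified is the same field shown in the XML listing above.
for obj in objects:
    if obj['LastModified'] < cutoff:
        s3.delete_object(Bucket=bucket, Key=obj['Key'])

# Restore: download the most recent backup (e.g. run this on the other EC2 instance).
if objects:
    latest = max(objects, key=lambda o: o['LastModified'])
    s3.download_file(bucket, latest['Key'], '/tmp/latest.sql.gz')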

You can create the backup file using a date string as the filename, and then use the date minus 7 days to work out which file needs to be deleted.
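A rough sketch of that approach in Python (the key naming scheme and bucket name are hypothetical, since the original script was not posted, and this only works if the filename contains the date but not the time of day):
# Sketch: rebuild the key for the backup that has aged out and delete just that object.
# The 'backup-YYYYMMDD.sql.gz' naming and the bucket name are hypothetical.
from datetime import datetime, timedelta
import boto3

s3 = boto3.client('s3')
old_key = 'backup-' + (datetime.utcnow() - timedelta(days=7)).strftime('%Y%m%d') + '.sql.gz'
s3.delete_object(Bucket='your-backup-bucket', Key=old_key)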

Related

Google Drive File Stream files creation date

I know this has been asked a few times already, but not in this context, I think, and the other questions were asked a few years ago, so I'm hoping maybe something has changed.
My issue is this: I am uploading files to Google Drive using Google Drive File Stream. While the uploading goes smoothly, I have a problem with the file creation date: it is always changed to the timestamp of when the file was uploaded, not the actual, local file creation date. This is a serious problem, as I am going to use this to back up huge amounts of data and preserve all the metadata I can, and the creation date is crucial. Is there a way to either upload the files with the creation date intact, or to change it after the upload? From what I've seen this seems not to be possible, but I have to try and make it work. Any help and insight will be appreciated. I'm using Drive File Stream with Python.
EDIT: I didn't make it clear enough: the issue is that I don't want to use the Google Drive API at all, but rather deal with this using only the Google Drive File Stream interface, if possible.
create
If you check the documentation for files.create, you will find that the acceptable metadata for file creation does include a createdTime.
You should then just add this to the metadata you use when uploading the file. As you did not post your code, I have grabbed the standard example from the documentation and added the created time as follows.
from googleapiclient.http import MediaFileUpload

# 'THETIME' should be an RFC 3339 timestamp, e.g. '2015-11-01T00:00:00Z'
file_metadata = {'name': 'photo.jpg', 'createdTime': 'THETIME'}
media = MediaFileUpload('files/photo.jpg', mimetype='image/jpeg')
file = drive_service.files().create(body=file_metadata,
                                    media_body=media,
                                    fields='id').execute()
print('File ID: %s' % file.get('id'))
Update
In the event that you want to update the ones you have already created you could use the following method.
If you check the documentation for files.create, you will find that the response is just a File resource.
If you check the File resource, you will see that createdTime is writable.
You should run a files.update and reset the createdTime to the proper time.
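For reference, a minimal sketch of that update call (this assumes, as stated above, that the API accepts createdTime on an update; file_id and the timestamp are placeholders, and drive_service is the same authorized service object used in the create example):
# Sketch of the files.update call described above.
# file_id and the timestamp below are placeholder values.
updated = drive_service.files().update(
    fileId=file_id,
    body={'createdTime': '2015-11-01T00:00:00Z'}
).execute()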

Adding timestamp to your amazon s3 bucket folder names [closed]

I am trying to add the system time as a timestamp to my S3 bucket folder names, so that every time I run the code it creates a separate folder with a different timestamp on S3.
How do I achieve this?
import json
import boto3
s3 = boto3.resource('s3')
s3object = s3.Object('your-bucket-name', 'your_file.json')
s3object.put(
    Body=(bytes(json.dumps(json_data).encode('UTF-8')))
)
You would use standard Python date functions to construct the folder name you want, then set that string as part of the S3 object's key. Something like this:
import json
import boto3
from datetime import datetime
s3 = boto3.resource('s3')
prefix = 'folder_' + datetime.now().strftime("%I%p") + "/"
s3object = s3.Object('your-bucket-name', prefix + 'your_file.json')
s3object.put(
    Body=(bytes(json.dumps(json_data).encode('UTF-8')))
)
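If each run needs a folder name that is unique across days as well, a fuller timestamp format works the same way (this is a variation on the answer above, not part of the original):
# Variation: include the date and seconds so repeated runs never collide
prefix = 'folder_' + datetime.now().strftime("%Y%m%d-%H%M%S") + "/"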

Linux os : Sending email using multiple attachment using python [closed]

Could someone please help me with the below requirement? I am using the following version of Linux:
Red Hat Enterprise Linux Server release 6.6 (Santiago)
Python version: 2.6.6
I need to send multiple log files to a user every day as attachments.
In my log directory I have multiple files with the *.fix extension. I need to send all these files to the user as attachments. Could you please let me know the code for it?
FYI, it's a Linux server and I am not going to use Gmail.
Appreciate your earliest help. Thanks!!
There is a Python package called email that helps you build mails.
Getting the list of *.fix files can be done using glob.
Something like this should do it:
from glob import glob
from email.mime.text import MIMEText
from email.mime.multipart import MIMEMultipart

msg = MIMEMultipart()
# Fill out the needed properties of msg, like From, To, Subject, etc.
for filename in glob("*.fix"):
    fp = open(filename)
    part = MIMEText(fp.read())
    # Name the attachment so it shows up as a file rather than inline text
    part.add_header('Content-Disposition', 'attachment', filename=filename)
    msg.attach(part)
    fp.close()
...
The msg can then be sent using smtplib as shown here.
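As a minimal sketch of that sending step, assuming a local mail server is running on the box (the server address and the From/To values are placeholders, not from the original answer):
# Sketch: fill in the message headers and send through a local SMTP server
# (e.g. sendmail/postfix on localhost). The addresses below are placeholders.
import smtplib

msg['From'] = 'backup@yourserver.example'
msg['To'] = 'user@example.com'
msg['Subject'] = 'Daily log files'

server = smtplib.SMTP('localhost')
server.sendmail(msg['From'], [msg['To']], msg.as_string())
server.quit()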

(PYTHON) Manipulating certain portions of URL at user's request [closed]

The download link I want to manipulate is below:
http://hfrnet.ucsd.edu/thredds/ncss/grid/HFR/USWC/6km/hourly/RTV/HFRADAR,_US_West_Coast,_6km_Resolution,_Hourly_RTV_best.ncd?var=u&var=v&north=47.20&west=-126.3600&east=-123.8055&south=37.2500&horizStride=1&time_start=2015-11-01T00%3A00%3A00Z&time_end=2015-11-03T14%3A00%3A00Z&timeStride=1&addLatLon=true&accept=netcdf
I want to make everything that's in bold a variable, so I can ask the user what coordinates and data set they want. This way I can download different data sets using this script. I would also like to use the same variables to name the newly downloaded file, e.g. USWC6km20151101-20151103.
I did some research and learned that I can use urllib.parse and urllib2, but when I experiment with them, it says "no module named urllib.parse".
I can use webbrowser.open() to download the file, but manipulating the URL is giving me problems.
THANK YOU!!
Instead of urllib you can use the requests module, which makes downloading content much easier. The part that does the actual work is just 4 lines long.
# first install this module
import requests

# parameters to change
location = {
    'part': 'USWC',
    'part2': '_US_West_Coast',
    'km': '6km',
    'north': '45.0000',
    'west': '-120.0000',
    'east': '-119.5000',
    'south': '44.5000',
    'start': '2016-10-01',
    'end': '2016-10-02'
}

# this is a template for the .format() method to generate links (very naive method)
link_template = "http://hfrnet.ucsd.edu/thredds/ncss/grid/HFR/{part}/{km}/hourly/RTV/\
HFRADAR,{part2},_{km}_Resolution,_Hourly_RTV_best.ncd?var=u&var=v&\
north={north}&west={west}&east={east}&south={south}&horizStride=1&\
time_start={start}T00:00:00Z&time_end={end}T16:00:00Z&timeStride=1&addLatLon=true&accept=netcdf"

# some debug info
link = link_template.format(**location)
file_name = location['part'] + location['km'] + location['start'].replace('-', '') + '-' + location['end'].replace('-', '')
print("Link: ", link)
print("Filename: ", file_name)

# try to open webpage
response = requests.get(link)
if response.ok:
    # open file for writing in binary mode
    with open(file_name, mode='wb') as file_out:
        # write response to file
        file_out.write(response.content)
Probably the next step would be running this in a loop over a list of location dicts, or reading the locations from a CSV file.
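As a quick sketch of that follow-up idea (the second entry in the list below is made up purely for illustration):
# Sketch: run the same download for several parameter sets in a loop.
locations = [
    location,  # the dict defined above
    dict(location, start='2016-10-03', end='2016-10-04'),  # hypothetical second set
]

for loc in locations:
    link = link_template.format(**loc)
    file_name = loc['part'] + loc['km'] + loc['start'].replace('-', '') + '-' + loc['end'].replace('-', '')
    response = requests.get(link)
    if response.ok:
        with open(file_name, mode='wb') as file_out:
            file_out.write(response.content)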

Backup Google Drive to .zip file with file conversion

I keep going in circles on this topic, and can't find an automated method that works for mass data on a Google Drive. Here is the goal I'm looking to achieve:
My company uses an unlimited Google Drive to store shared documents, and we are looking to backup the contents automatically. But we can't have the data stored in a backup with google documents like ".gdoc" and ".gsheet"... we need to have the documents backed up in Microsoft/Open-Office format (".docx" and ".xlsx").
We currently use Google's Takeout page to zip all the contents of the Drive and save it on our Linux server (which has redundant storage), and it does zip and export the files to the correct formats.
Here: https://takeout.google.com/settings/takeout
Now that works... but requires a bit of manual work on our part. And babysitting the zip, download and upload processes is becoming wasteful. I have searched and have read that the google API for Takeout is unavailable to use through gscript. So, that seems to be out of the question.
Using Google scripts, I have been able to convert single files.... but can't, for instance, convert a folder of ".gsheet" files to ".xlsx" format. Maybe copying and converting all the google files into a new folder on the drive could be possible. Having access to the drive and the converted "backup", we could then backup the collection of converted files via the server...
So here is the gist of it all:
Can you mass-convert all of a Google Drive and/or a specific folder on the Drive from ".gdoc" to ".docx", and ".gsheet" to ".xlsx"? Can this be done with gscript?
If it is not possible via the method in question one, is anyone familiar with a Linux or Mac app that could do such a directory conversion? (I don't believe there is one, because of Google's proprietary file types.)
I'm stuck in a bit of a hole, and any insight to this problem could help. I really wish Google would allow users to convert and export drive folders via a script selection.
#1) Can you mass-convert all of a google drive and/or a specific folder on the drive from ".gdoc" to ".docx", and ".gsheet" to ".xlsx". Can this be done with gscript?
You can try this:
How to automatically convert files in Google Apps Script
Converting a file in Google Apps Script into a blob:
// Requires the Drive advanced service (Drive API) to be enabled for the script
// so that Drive.Files.get is available.
var documentId = DocumentApp.getActiveDocument().getId();

function getBlob(documentId) {
  var file = Drive.Files.get(documentId);
  var url = file.exportLinks['application/vnd.openxmlformats-officedocument.wordprocessingml.document'];
  var oauthToken = ScriptApp.getOAuthToken();
  var response = UrlFetchApp.fetch(url, {
    headers: {
      'Authorization': 'Bearer ' + oauthToken
    }
  });
  return response.getBlob();
}
Saving file as docx in Drive
function saveFile(blob) {
  var file = {
    title: 'Converted_into_MS_Word.docx',
    mimeType: 'application/vnd.openxmlformats-officedocument.wordprocessingml.document'
  };
  file = Drive.Files.insert(file, blob);
  Logger.log('ID: %s, File size (bytes): %s', file.id, file.fileSize);
  return file;
}
Time-driven triggers
A time-driven trigger (also called a clock trigger) is similar to a cron job in Unix. Time-driven triggers let scripts execute at a particular time or on a recurring interval, as frequently as every minute or as infrequently as once per month. (Note that an add-on can use a time-driven trigger once per hour at most.) The time may be slightly randomized — for example, if you create a recurring 9 a.m. trigger, Apps Script chooses a time between 9 a.m. and 10 a.m., then keeps that timing consistent from day to day so that 24 hours elapse before the trigger fires again.
function createTimeDrivenTriggers() {
  // Trigger every 6 hours.
  ScriptApp.newTrigger('myFunction')
      .timeBased()
      .everyHours(6)
      .create();

  // Trigger every Monday at 09:00.
  ScriptApp.newTrigger('myFunction')
      .timeBased()
      .onWeekDay(ScriptApp.WeekDay.MONDAY)
      .atHour(9)
      .create();
}
Process:
List all file IDs inside the folder
Convert the files
Run the conversion code from a time-driven trigger
2) If it is not possible via the method in question one, is anyone familiar with a Linux or Mac app that could do such a directory conversion? (I don't believe there is one, because of Google's proprietary file types.)
If you want to save the files locally, try setting up a cronjob and use Download files:
The Drive API allows you to download files that are stored in Google Drive. Also, you can download exported versions of Google Documents (Documents, Spreadsheets, Presentations, etc.) in formats that your app can handle. Drive also supports providing users direct access to a file via the URL in the webViewLink property.
Depending on the type of download you'd like to perform — a file, a Google Document, or a content link — you'll use one of the following URLs:
Download a file — files.get with alt=media file resource
Download and export a Google Doc — files.export
Link a user to a file — webContentLink from the file resource
Sample Code :
$fileId = '0BwwA4oUTeiV1UVNwOHItT0xfa2M';
$content = $driveService->files->get($fileId, array(
    'alt' => 'media'));
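For a server-side script (e.g. on the Linux box mentioned in the question), here is a minimal Python sketch of the same export step using the Drive API client; the file ID, output filename, and credentials setup are placeholders, and this is an alternative to the PHP sample above rather than part of the original answer:
# Sketch: export one Google Doc as .docx via the Drive API from Python.
# 'creds' must be valid Drive API credentials; the file ID and output name are placeholders.
import io
from googleapiclient.discovery import build
from googleapiclient.http import MediaIoBaseDownload

drive_service = build('drive', 'v3', credentials=creds)
request = drive_service.files().export_media(
    fileId='YOUR_FILE_ID',
    mimeType='application/vnd.openxmlformats-officedocument.wordprocessingml.document')

with io.FileIO('backup.docx', 'wb') as out_file:
    downloader = MediaIoBaseDownload(out_file, request)
    done = False
    while not done:
        status, done = downloader.next_chunk()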
Hope this helps and answers all your questions.
