I started a trial of CloudBerry Backup Server to back up a Windows Server 2012 R2 machine to Amazon Cloud Drive. I have an upstream of up to 20 Mbit/s.
When I started the backup, I got a nice speed (between 15 and 18 Mbit/s), so I left it running. When I checked again later, I found that the speed had dropped to almost nothing: in the last 4 hours, only 4 GB had been transferred.
When I upload a file to Amazon directly (e.g., using the Amazon Drive client), I see that the full speed is possible.
Any ideas what I could check or change so that the upload uses the available bandwidth again?
Thank you in advance for your assistance.
With kind regards,
Konrad
Amazon Cloud Drive is a consumer product designed for personal use, and Amazon reduces throughput when large amounts of data are uploaded or downloaded. I assume it is deliberately made not to compete with S3.
Have you tried S3 as your backup destination?
The main reason you see this decreasing speed is that Amazon Cloud Drive does not offer multipart upload. And I agree with the other answer here: it is not designed for backups and massive uploads, just for personal data (JPEGs, documents, etc.).
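If you do move the backup destination to S3, the AWS SDK handles multipart uploads for you. Here is a minimal sketch using boto3; the bucket name and file paths are placeholders:

```python
import boto3
from boto3.s3.transfer import TransferConfig

# Files larger than multipart_threshold are split into parts and sent on
# several threads, which keeps throughput steady for large backup files.
config = TransferConfig(
    multipart_threshold=64 * 1024 * 1024,  # switch to multipart at 64 MB
    multipart_chunksize=64 * 1024 * 1024,  # 64 MB per part
    max_concurrency=8,                     # parallel part uploads
)

s3 = boto3.client("s3")
s3.upload_file(
    r"D:\Backups\server2012r2.bak",  # hypothetical local backup file
    "my-backup-bucket",              # hypothetical bucket name
    "server2012r2.bak",
    Config=config,
)
```

CloudBerry can also talk to S3 directly, so in practice you would just change the storage account in its settings rather than script the upload yourself; the sketch only illustrates what multipart upload buys you.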
I'm working on a Linux Mint VirtualBox image hosted on Windows 10, where I accidentally deleted the /etc/share directory and its subdirectories.
Now my image can't boot, and it contains sensitive data that I want to recover.
How can I do so?
First: switch the VM off and don't turn it back on.
If you made a backup or took a snapshot of the VM's image, just use it to restore the VM. Create a new Linux VM with a new disk, and attach a copy of the damaged image as an additional disk from which you will recover the data. Never try to recover deleted files/folders onto the same drive, and never boot the system from it. There are general tips available on what to do and what not to do when you lose data.
Otherwise it's very hard to recover any data from that image. It all depends on whether you made other changes afterwards or just switched the VM off (which would be the best option). If you wrote data to the disk after deleting /etc/share, you can still try to recover it; but if you have the data stored in some other location, don't waste your time.
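As a sketch of the copy-first step (assuming VBoxManage is on the host's PATH; the VM name and paths are placeholders), you can clone the damaged disk on the Windows host and attach the copy to a recovery VM:

```python
import subprocess

# Clone the damaged virtual disk so all recovery work happens on a copy.
# Paths and the VM name are hypothetical; adjust them to your setup.
subprocess.run([
    "VBoxManage", "clonemedium", "disk",
    r"C:\VMs\mint\mint.vdi",           # damaged source disk
    r"C:\VMs\recovery\mint-copy.vdi",  # untouched working copy
], check=True)

# Attach the copy as a second disk to a fresh recovery VM; that VM boots
# from its own disk and runs testdisk/photorec against the copy.
subprocess.run([
    "VBoxManage", "storageattach", "RecoveryVM",
    "--storagectl", "SATA", "--port", "1", "--device", "0",
    "--type", "hdd", "--medium", r"C:\VMs\recovery\mint-copy.vdi",
], check=True)
```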
I won't copy the exact process from other pages, but here are some useful links; the tool most often used in such cases is called testdisk.
How To Recover Deleted Files From Any Drive in Linux
How To Recover Deleted Files In Linux [Beginner’s Guide]
How to Recover Deleted Files Using TestDisk in Linux
Top 20 Best Linux Data Recovery Tools to Recover Deleted/Corrupted Files
My elderly parents have been hacked or infected with malware for the nth time.
I have an old HP server.
I thought of rebuilding it with VMware (free version) or Oracle VirtualBox and having them use Windows in a controlled environment. I would back it up, patch it, etc. Maybe they could connect to my server over Remote Desktop.
I assume I would need a Windows Server license to allow multiple connections. (I could also use the machine myself to host a Plex media server.)
At a 10,000-foot level, is this possible, or just a technology quagmire?
Super User SE might be a better place for this.
Anyway: are they using it for anything Windows-specific? My parents used to use my Linux-based computer for web browsing; now they use an Android tablet for the same. Running a virtualised Windows on top of the former could have been an alternative. Also, backing up and rolling back is easier if you use virtualisation (see the sketch below); just use something else for permanent data storage, for example remote storage with backup and rollback (against ransomware), either on your own infrastructure or in the cloud (Syncthing, ownCloud, etc.).
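If you go the VirtualBox route, the rollback workflow can be as simple as this sketch (the VM name is a placeholder and VBoxManage is assumed to be on the PATH):

```python
import subprocess

VM = "ParentsPC"  # hypothetical VM name

# Take a known-good snapshot right after installing and patching Windows.
subprocess.run(["VBoxManage", "snapshot", VM, "take", "clean-install"], check=True)

# After the next infection: power the VM off and roll back to that snapshot.
subprocess.run(["VBoxManage", "controlvm", VM, "poweroff"], check=True)
subprocess.run(["VBoxManage", "snapshot", VM, "restore", "clean-install"], check=True)
```

Anything they actually need to keep then lives outside the VM (on the remote storage mentioned above), so restoring the snapshot loses nothing.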
I'm assuming here that they don't have trade secrets or plans for a home-built nuclear plant or anything of that kind.
So I created a storage account on Azure and started using a file share for a few days, but then I noticed activity at every hour of every day, even after I had stopped using the file share and deleted it.
I'm showing table activity here because that's where there is activity every hour; other resources like queues show sporadic activity too, but I don't have anything in this storage account.
After noticing this, I tried a few things, suspecting a security hole:
I rotated the access keys, several times.
I even deleted and recreated the storage account, to no avail; it still showed the previous activity. (Could this be related to me reusing the same storage account name?)
This is not running in a production environment, but I still don't know what is going on.
Is this some background process of Azure, or do I need to worry about it?
As David Makogon said, it turned out that it was diagnostic activity. I disabled it under Diagnostic logs (classic) and the activity was gone.
Glad to know that my account wasn't hacked or anything like that.
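For reference, the classic diagnostics (Storage Analytics) can also be switched off programmatically. A minimal sketch for the blob service with the azure-storage-blob package (the connection string is a placeholder; the table and queue services expose analogous set_service_properties calls):

```python
from azure.storage.blob import (
    BlobAnalyticsLogging,
    BlobServiceClient,
    RetentionPolicy,
)

# Placeholder connection string; use your storage account's real one.
conn_str = "DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=<key>"
client = BlobServiceClient.from_connection_string(conn_str)

# Disable classic Storage Analytics logging, the source of the
# hourly "phantom" activity described above.
client.set_service_properties(
    analytics_logging=BlobAnalyticsLogging(
        version="1.0",
        read=False,
        write=False,
        delete=False,
        retention_policy=RetentionPolicy(enabled=False),
    )
)
```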
Are there any minimum server requirements for running Subversion behind Apache? If not, what are some typical server specifications for it? Any information on server capacities would be appreciated!
As long as your team is not extremely large, a fairly modest server is enough. Even a virtual server with about 1 virtual CPU and 1 GB of RAM, running on a decent physical CPU, will do. I'd say it doesn't need to be any faster than a server you'd use as a file server.
I'm using it myself on a very limited v-server and it works very well.
I have yet to find an Apache httpd-based Subversion server that's underpowered. Subversion itself doesn't take up a lot of bandwidth. I would still suggest that the server be dedicated. It isn't that Subversion sucks up a lot of power; more likely, whatever else you run will suck up too much, and that will slow Subversion down. I was at one site that kept piling more stuff onto the Subversion server (including database services) and then complained that Subversion was slow. Everything on that machine was slow.
The main concern would be bandwidth, which seems to matter much more than the server itself. Also, be careful with NFS-mounted disks (although NetApp filers seem fine).
I found this: http://subversion.apache.org/faq.html#server-requirements, but it probably doesn't go into enough detail.
I'm trying to get a firm handle on how reliable long-term data storage is with Google Docs/Google Drive. Presumably when a file gets uploaded or automatically synced, the transfer is verified using an MD5 checksum -- I see that these are saved as meta-information according to the Google Docs API. And since the file is mirrored to multiple servers, presumably each of these transfers is also verified.
But then the file sits there for years. I don't change it, so no syncing ever gets triggered. Does Google occasionally verify that the md5sum hasn't changed, to protect against silent corruption of the file -- and repair the file if an inconsistency is found? Or is the md5sum meta-information just a static value representing what the file looked like when first uploaded years ago?
I would not worry about this. We can't share the specifics, but data hosted on Google is checked against corruption and is also replicated multiple times.
This doesn't prevent you from uploading corrupted data, though. So if data consistency is critical for you, you could use the read-only MD5 checksum field after an upload to make sure that the file you just uploaded to Drive has the correct MD5.
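A minimal sketch of that post-upload check against the Drive v3 API; the file ID and local path are placeholders, and `creds` is assumed to be an already-authorized credentials object:

```python
import hashlib

from googleapiclient.discovery import build

def local_md5(path: str) -> str:
    """Compute the MD5 of a local file without loading it all into memory."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

# creds: an authorized google.oauth2 credentials object (assumed to exist).
service = build("drive", "v3", credentials=creds)

FILE_ID = "your-drive-file-id"  # placeholder: the ID Drive returned at upload

# Ask Drive for the server-side checksum of the stored bytes.
remote = service.files().get(fileId=FILE_ID, fields="md5Checksum").execute()

if remote["md5Checksum"] == local_md5("backup.tar"):
    print("Upload verified: checksums match.")
else:
    print("Checksum mismatch: re-upload the file.")
```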