Sizes of subfolders in hidden AppData folder - Excel

I want to reduce the size of the hidden AppData subfolders by making adjustments to the applications with large data folders, as I am running short of space on drive C. I searched the Microsoft Community forums and Google; some VBA code routines do not work because a "Permission denied" message comes up. I have already run Disk Cleanup on the drive, the built-in Windows 10 feature. I just want a list of subfolders and files along with their sizes so I can take remedial steps; the manual process is very time-consuming. Is there a VBA routine or function that takes due account of the permissions on hidden folders?
Can anyone help in this regard?

Related

Mirroring files from one partition to another on the second disk without RAID1

I am looking for a program for Linux that would allow me to mirror one partition to another disk (something like RAID 1). It doesn't have to be a windowed application; a console application is fine. I just want what is in one place to be mirrored to another.
It would be nice if it were possible to mirror a specific folder I care about instead of copying everything from the given partition.
I looked on the internet, but it's hard to find something offering these capabilities, hence the idea to ask this question.
I do not want to use fake RAID on Linux or hardware RAID, because I have read that if the motherboard fails, it is best to have an identical second one to recover the data.
I will be grateful for every suggestion :)
You can check my script "CopyDirFile", written in Bash, which is available on GitHub.
It can perform a replication (mirroring) task from any source folder to a destination folder (deleting a file in the source folder also deletes it in the destination folder).
The script also allows you to create copy tasks (files deleted in the source folder will not be deleted in the target folder).
Tasks are executed in the background at a specified time rather than continuously; the frequency is set by the user when creating the task.
You can also set a task to start automatically when the user logs on.
All the necessary information can be found in the README file in the repository.
If I understood you correctly, I think it meets your requirements.
Linux has standard support for software RAID: mdraid.
It allows you to bundle two disk devices into a RAID 1 device (among other things); you then create a filesystem on top of that device.
LVM offers another way to do software RAID; it doesn't seem to be very popular, but it's certainly supported.
(If your system supports hardware RAID, on the motherboard or with a separate RAID controller, Linux can use that, too, but that doesn't seem to be what you're asking here.)

Archive format that can be unpacked without copying its content?

Main question: if you want to bundle multiple files or an entire directory structure (without compression), formats like .tar or .zip can be used. However, when unpacking those archives, all of their contents need to be copied. Is there an archive format that can be unpacked 'in place', i.e. without copying its contents?
Motivation: I need to send directories containing tens of GB of compressed video data from one computer to another (relatively small number of big files). Specifically: I am responsible for packing and unpacking the data on both ends, but another business partner is responsible for the actual transport (it happens on their machines).
Asking a business partner to move a single file simplifies things a lot from a business standpoint. Answering 'did something go wrong?' and showing that something went wrong to possibly technically inexperienced managers is a lot simpler with a single file than with a directory structure of many files.
The obvious choice here is to just zip/unzip it using 7-Zip. However, on the unpacking side, this requires unpacking gigabytes of data. Worse: since these are uncompressed zip files, the unpacking mainly consists of copying gigabytes of video data that was already exactly where it needed to be, only inside an archive.
So a lot of time is spent on basically unnecessary work. This makes me wonder: is there an archive format that does something like 'pretend this directory structure is a single file' and then gives the option to say 'now stop pretending this directory structure is a single file'. I imagine this would require something like unindexing and then reindexing the individual files, but since there are relatively few files, I would imagine this would be a lot faster than copying everything.
Does anyone know an archive format that could help me here?

Avoid database open when Excel is running

I developed a VB.NET program that uses Excel files to generate some reports.
Since the program takes a long time to generate a report, I usually do other things while it is running. The problem is that sometimes I need to open other Excel files, and the files being used by the program are shown to me. I want those files to stay hidden even when I open other Excel files. Is this possible? Thanks
The FileSystem.Lock Method controls access by other processes to all or part of a file opened by using the Open function.
The My feature gives you better productivity and performance in file I/O operations than Lock and Unlock. For more information, see FileSystem.
More information here.

Excel workbook external links not updating when saved on a network drive

I have two spreadsheets: a source document where data is entered, and a destination document.
Both of these sheets are saved to a network drive, and the cells are linked through a formula of the "=SOURCE!$A$1" type.
If I have both spreadsheets open on the same computer, they work swimmingly, but as soon as I open one on one computer and the other on another, they no longer update.
Excuse my beginnerness; this is the first time I have attempted to do this, and it may be impossible, but if it works on one computer, why isn't it working on two?
I really need them to update in real time :) Both the source and destination are shared.
Any help would be greatly appreciated.
Cheers,
Logan
In your comments you clarified that File1 and File2 both need to be opened at the same time, as they both require human interaction in order to function.
This implies that File2 isn't really a "data file" per se. Data files are by definition used only for data storage and have no live interaction.
Your setup is an unusual one (two Excel files that both require interaction and depend on each other being open to function). If this works properly with both files open on the same computer, then this is likely a file-locking and/or permissions issue, and I'm not sure you'll be able to get around it as-is.
Potential Solutions:
Migrate your entire setup to Microsoft Access, which is designed to handle the record locking and database splitting necessary for multi-user environments. (More info)
Create a third Excel file (an actual data file that remains closed) and have Excel files 1 and 2 both link to it, using File3 as the "go-between".
Create an Access ACCDB file as the "go-between" data-storage location, and have Excel files 1 and 2 both link to it, since the ACCDB can have many connected computers/users regardless of whether it's open or closed.
In order to get this working with Excel, you'll need to figure out the network shares/permissions necessary to open the two files simultaneously with asynchronous file locks, which is more of a network admin topic.
Asynchronous file locking is a built-in feature of Microsoft Access.
More Information:
Wikipedia : Data File (definition)
Microsoft Docs : Asynchronous File I/O
LinkedIn : Yes, Microsoft Access works in a Multi-User Environment
Office.com : Move data from Excel to Access
Stack Exchange : Server Fault
Stack Exchange : Super User

VBA: Coordinate batch jobs between several computers

I have a VBA script that extracts information from huge text files and does a lot of data manipulation and calculations on the extracted info. I have about 1000 files, and each takes an hour to finish.
I would like to run the script on as many computers (among others, EC2 instances) as possible to reduce the time needed to finish the job. But how do I coordinate the work?
I have tried two approaches. First, I set up Dropbox as a network drive with one text file holding the number of the last job started; VBA would read it, start the next job, and update the number, but there is apparently too much lag before an update to a file on one computer propagates to the others for this to be practical. Second, I tried to find a simple "private" counter service online that would update on each visit, so a machine would access the page, read the number, and the page would update the number for the next visit from another computer. But I have found no such service.
Any suggestions on how to coordinate such tasks between different computers in VBA?
First of all, if you can, use a proper programming language, for example C#, for easy parallel processing.
If you must use VBA, then optimize your code first. Can you show us the code?
Edit:
If you must, then you could do the following. First, you need some sort of file server to store all the text files in one folder.
Then, in the macro, for each .txt file in the folder, try to open it in exclusive mode. If the file can be opened, run your code, and after your code has finished, move the file elsewhere; otherwise, move on to the next .txt file.
