Processing Multiple sub-directories and all files in each directory - c#-4.0

Hoping someone can help me here; I'm not very familiar with working with directories. My application compresses files, stores them in a media directory, and records them in the database. This works well if I create the directory and call each file separately.
But I would like to make it dynamic and process all folders under the install folder: for every directory under that folder, I'd like to process the directory and all the files within it.
What I have tried:
public class SetupFiles : ISetupFiles
{
    private readonly ISession _session;
    private readonly IFileService _fileService;
    private readonly IFileProvider _fileProvider;

    public SetupFiles(ISession session, IFileService fileService, IFileProvider fileProvider)
    {
        _session = session;
        _fileService = fileService;
        _fileProvider = fileProvider;
    }

    public void Setup()
    {
        string installPath = ("install/images/");
        _session.Transact(session =>
        {
            DirectoryInfo dir = new DirectoryInfo(installPath);
            DirectoryInfo[] subdirectories = dir.GetDirectories();
            foreach (var subdirectory in subdirectories)
            {
                string currentDirName = Directory.GetCurrentDirectory();
                var directory = new MediaCategory
                {
                    Name = subdirectory.Name.ToString().ToUpper(),
                    UrlSegment = subdirectory.Name.Trim().ToLower(),
                };
                session.Save(directory);
                string[] pngFiles = Directory.GetFiles(currentDirName, "*.png", SearchOption.AllDirectories);
                string[] jpgFiles = Directory.GetFiles(currentDirName, "*.jpg", SearchOption.AllDirectories);
                string[] gifFiles = Directory.GetFiles(currentDirName, "*.gif", SearchOption.AllDirectories);
                string[] pdfFiles = Directory.GetFiles(currentDirName, "*.pdf", SearchOption.AllDirectories);
                foreach (var currentPngFile in pngFiles)
                {
                    var fileStream = _fileProvider.GetFileInfo(currentPngFile).CreateReadStream();
                    MediaFile dbpngFile = _fileService.AddFile(fileStream, Path.GetFileName(currentPngFile), "image/png", fileStream.Length, directory);
                }
                foreach (var currentJpgFile in jpgFiles)
                {
                    var fileStream = _fileProvider.GetFileInfo(currentJpgFile).CreateReadStream();
                    MediaFile dbjpgFile = _fileService.AddFile(fileStream, Path.GetFileName(currentJpgFile), "image/jpg", fileStream.Length, directory);
                }
                foreach (var currentGifFile in gifFiles)
                {
                    var fileStream = _fileProvider.GetFileInfo(currentGifFile).CreateReadStream();
                    MediaFile dbgifFile = _fileService.AddFile(fileStream, Path.GetFileName(currentGifFile), "image/gif", fileStream.Length, directory);
                }
                foreach (var currentPdfFile in pdfFiles)
                {
                    var fileStream = _fileProvider.GetFileInfo(currentPdfFile).CreateReadStream();
                    MediaFile dbpdfFile = _fileService.AddFile(fileStream, Path.GetFileName(currentPdfFile), "file/pdf", fileStream.Length, directory);
                }
            }
        });
    }
}
But for some reason I get an error that it cannot find the directory.
Any help would be greatly appreciated.
On the line DirectoryInfo[] subdirectories = dir.GetDirectories(); the dir variable shows the directory, but GetDirectories() returns null.

I must have had a brain fart. lol With help from #ciubotariu-florin I was able to get past this issue.
For anyone else with this issue: wwwroot is an environment folder and is part of the web root from the IIS config. So to work with Directory() and/or DirectoryInfo() you must use IHostingEnvironment, which gives you access to IHostingEnvironment.WebRootPath. Changing my path from
string installPath = ("install/images/");
to
string installPath = Path.Combine(_env.WebRootPath, "install/images/").ToLower();
solved my issues.
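Beyond the path fix, there is a second issue in the original attempt worth flagging: Directory.GetFiles is pointed at Directory.GetCurrentDirectory() rather than at each subdirectory, so every MediaCategory would receive the same file set. Below is a minimal sketch of per-subdirectory enumeration using plain System.IO only (no session or file service; the names and the temp-folder demo are illustrative). Note also that "image/jpg" and "file/pdf" are not standard MIME types; "image/jpeg" and "application/pdf" are.

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;

// Collect media files per immediate subdirectory, searching inside that
// subdirectory only -- not the process's current working directory.
static Dictionary<string, List<string>> CollectMediaFiles(string rootPath)
{
    var extensions = new[] { ".png", ".jpg", ".gif", ".pdf" };
    var result = new Dictionary<string, List<string>>();
    foreach (var subdirectory in new DirectoryInfo(rootPath).GetDirectories())
    {
        result[subdirectory.Name] = subdirectory
            .GetFiles("*", SearchOption.AllDirectories)
            .Where(f => extensions.Contains(f.Extension.ToLowerInvariant()))
            .Select(f => f.FullName)
            .ToList();
    }
    return result;
}

// Demo against a throwaway directory tree under the temp folder.
string root = Path.Combine(Path.GetTempPath(), "install-images-demo");
if (Directory.Exists(root)) Directory.Delete(root, true);
Directory.CreateDirectory(Path.Combine(root, "logos"));
File.WriteAllText(Path.Combine(root, "logos", "a.png"), "");
File.WriteAllText(Path.Combine(root, "logos", "notes.txt"), "");
var mediaFiles = CollectMediaFiles(root);
Console.WriteLine(mediaFiles["logos"].Count); // notes.txt is filtered out
```

A single filtered enumeration like this also replaces the four separate GetFiles calls and four near-identical foreach loops.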

Related

Copying existing files into a SharePoint server

I'm facing difficulties changing files into CSV and saving them into the local environment. How can I achieve this? I've tried looking around, but nothing seems to be what I'm looking for.
I'm running SharePoint 2010. Before this, this code only grabbed data from SharePoint, turned it into xlsx, and uploaded it to our web.
private static void GenerateSPGroupUsersReport()
{
    Log("Generate Sharepoint Group Users Report");
    DataSet dsSecurityReport = new DataSet();
    string ConnectedWebURL = ConfigurationManager.AppSettings["SPGroupUsersWebURL"];
    //string[] strURL = ConnectedWebURL.Split(';');
    DataTable dTblSPGroupUser = new DataTable();
    dTblSPGroupUser.Columns.Add("SiteURL", typeof(string));
    dTblSPGroupUser.Columns.Add("SharepointGroup", typeof(string));
    dTblSPGroupUser.Columns.Add("User", typeof(string));
    // Hafees add 10/22/2019
    dTblSPGroupUser.Columns.Add("UserLanID", typeof(string));
    dTblSPGroupUser.Columns.Add("Email", typeof(string));
    SPSite site = new SPSite(ConnectedWebURL);
    SPWebApplication webApp = site.WebApplication;
    foreach (SPSite s in webApp.Sites)
    {
        SPGroupCollection groupCol = s.RootWeb.SiteGroups;
        foreach (SPGroup group in groupCol)
        {
            // Hafees include group.Users, user.Email
            foreach (SPUser user in group.Users)
            {
                dTblSPGroupUser.Rows.Add(s.Url, group.Name, user.Name, user.LoginName, user.Email);
            }
            //bool contains = dTblSPGroupUser.AsEnumerable().Any(rowC => group.Name == rowC.Field<string>("SharepointGroup"));
            //if (!contains)
            //{
            //    foreach (SPUser user in group.Users)
            //    {
            //        dTblSPGroupUser.Rows.Add(s.Url, group.Name, user.Name);
            //    }
            //}
        }
    }
    DataSet dsSPGroup = new DataSet();
    dsSPGroup.Tables.Add(dTblSPGroupUser);
    SaveIntoSPLibrary(site, dsSPGroup, "GroupUsers_" + ConnectedWebURL.Replace("http://", "").Replace("https://", "").Replace(":", "-").Trim());
    Log("Generate Sharepoint Group Users Report Complete");
}
// This is where I generate the group of users report.
private static void SaveIntoSPLibrary(SPSite site, DataSet ds, string fileName)
{
    string UIResourceServerRelativeWebURL = ConfigurationManager.AppSettings["UIResourceServerRelativeWebURL"];
    using (SPWeb web = site.OpenWeb(UIResourceServerRelativeWebURL))
    {
        byte[] byteArray = GenerateExcelFile(ds);
        string CustomReportLibrary = ConfigurationManager.AppSettings["CustomReportLibrary"];
        string strFileName = String.Format(fileName + ".{0}.xlsx", DateTime.Today.ToString("yyyyMMdd"));
        Log("Saving into SP Library. " + CustomReportLibrary + strFileName);
        web.AllowUnsafeUpdates = true;
        SPFile file = web.Files.Add(CustomReportLibrary + strFileName, byteArray, true);
        file.Item["Year"] = DateTime.Now.ToString("yyyy");
        file.Item["Month"] = string.Format("{0}. {1}", DateTime.Now.Month, DateTime.Now.ToString("MMMM"));
        file.Item.Update();
        file.Update();
        web.AllowUnsafeUpdates = false;
    }
}
// This is where the file is saved as xlsx and uploaded to the SharePoint library.
I tried making a copy of SaveIntoSPLibrary with a bit of modification: I changed it to produce CSV files and created a new ConfigurationManager setting pointing to my local directory. But it seems I went wrong somewhere; the files still don't reach my local directory. Please advise.
You should export the DataTable report data to a local CSV file.
Check the code below; it builds the CSV output so you can save it locally.
var dataTable = GetData();
StringBuilder builder = new StringBuilder();
List<string> columnNames = new List<string>();
List<string> rows = new List<string>();
foreach (DataColumn column in dataTable.Columns)
{
    columnNames.Add(column.ColumnName);
}
builder.Append(string.Join(",", columnNames.ToArray())).Append("\n");
foreach (DataRow row in dataTable.Rows)
{
    List<string> currentRow = new List<string>();
    foreach (DataColumn column in dataTable.Columns)
    {
        object item = row[column];
        currentRow.Add(item.ToString());
    }
    rows.Add(string.Join(",", currentRow.ToArray()));
}
builder.Append(string.Join("\n", rows.ToArray()));
Response.Clear();
Response.ContentType = "text/csv";
Response.AddHeader("Content-Disposition", "attachment;filename=myfilename.csv");
Response.Write(builder.ToString());
Response.End();
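Two caveats about that snippet for this use case: Response.Write only exists in a web context (the asker wants a local file), and the fields are joined without quoting, so a group or user name containing a comma would shift columns. Below is a hedged sketch that writes the CSV locally with File.WriteAllText and adds RFC 4180-style quoting; the table shape mirrors the report above, but the row data and output path are made up for the demo.

```csharp
using System;
using System.Data;
using System.IO;
using System.Linq;

// RFC 4180-style quoting: wrap a field in quotes when it contains a comma,
// quote, or newline, doubling any embedded quotes.
static string Escape(string field) =>
    field.IndexOfAny(new[] { ',', '"', '\n', '\r' }) >= 0
        ? "\"" + field.Replace("\"", "\"\"") + "\""
        : field;

// Header row from the column names, then one comma-joined line per data row.
static string ToCsv(DataTable table)
{
    var columns = table.Columns.Cast<DataColumn>().ToList();
    var header = string.Join(",", columns.Select(c => Escape(c.ColumnName)));
    var rows = table.Rows.Cast<DataRow>()
        .Select(r => string.Join(",", columns.Select(c => Escape(r[c]?.ToString() ?? ""))));
    return header + "\n" + string.Join("\n", rows);
}

// Same column shape as the report above; this row is made-up demo data.
var dTblSPGroupUser = new DataTable();
dTblSPGroupUser.Columns.Add("SiteURL", typeof(string));
dTblSPGroupUser.Columns.Add("SharepointGroup", typeof(string));
dTblSPGroupUser.Rows.Add("http://sp/site", "Owners, Admins");
string csv = ToCsv(dTblSPGroupUser);
File.WriteAllText(Path.Combine(Path.GetTempPath(), "GroupUsers.csv"), csv);
Console.WriteLine(csv);
```

The quoted "Owners, Admins" value stays in one column when the file is opened in Excel, which the unquoted version would not.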

.net core 2.0 - Can't download file saved in folder

I save info from a datagrid to an xlsx file located in a folder, and afterwards I want to download this file.
I have code that works correctly elsewhere, but in my project, when I try to download the file, it returns nothing. Maybe it's because of protection, or because the user role is admin.
I have tried different folders, and I'm sure the folder is not the problem.
What else can it be?
public FileResult downloadFile(string filePath, string fileName)
{
    IFileProvider provider = new PhysicalFileProvider(filePath);
    IFileInfo fileInfo = provider.GetFileInfo(fileName);
    var readStream = fileInfo.CreateReadStream();
    var mimeType = "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet";
    return File(readStream, mimeType, fileName);
}
I have tried your code and everything seems OK:
public IActionResult Index()
{
    var fileName = "SN-export.xlsx";
    string filePath = _hostingEnvironment.WebRootPath;
    string fileName2 = Path.Combine(filePath, fileName);
    FileInfo excelFile = new FileInfo(fileName2);
    if (excelFile.Exists)
    {
        excelFile.Delete();
        excelFile = new FileInfo(fileName2);
    }
    // excel.SaveAs(excelFile); Had to comment this out because I don't know which Excel NuGet package you are using.
    // For testing purposes I used the following two lines instead:
    var file = System.IO.File.Create(excelFile.ToString());
    file.Close();
    return DownloadFile(filePath, fileName);
}
public FileResult DownloadFile(string filePath, string fileName)
{
    IFileProvider provider = new PhysicalFileProvider(filePath);
    IFileInfo fileInfo = provider.GetFileInfo(fileName);
    var readStream = fileInfo.CreateReadStream();
    var mimeType = "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet";
    return File(readStream, mimeType, fileName);
}
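One difference between the failing and working versions above is how the path is built: PhysicalFileProvider requires an absolute root directory, and it is worth checking that the file actually exists before streaming it, so the action can return a 404 rather than an empty response. A small sketch of that check using plain System.IO (the helper name and the temp-folder demo are illustrative, not from the thread):

```csharp
using System;
using System.IO;

// Resolve fileName under a known absolute base folder and confirm the file
// exists before handing it to the file provider; null signals "not found",
// letting the caller return NotFound() instead of an empty download.
static string ResolveDownloadPath(string webRootPath, string fileName)
{
    string fullPath = Path.Combine(webRootPath, fileName);
    return File.Exists(fullPath) ? fullPath : null;
}

string webRoot = Path.Combine(Path.GetTempPath(), "wwwroot-demo");
Directory.CreateDirectory(webRoot);
File.WriteAllText(Path.Combine(webRoot, "SN-export.xlsx"), "stub");
Console.WriteLine(ResolveDownloadPath(webRoot, "SN-export.xlsx") != null); // True
Console.WriteLine(ResolveDownloadPath(webRoot, "missing.xlsx") == null);   // True
```

In the controller this would translate to an early `if (fileInfo.Exists == false) return NotFound();` style guard before CreateReadStream.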

How to save text files or excel files in local folder in windows phone

How do I save text files or Excel files in a local folder on Windows Phone?
I get the text files from a web service response over an HTTP URL,
but how do I save them in a local folder?
See my code below for more context.
Thanks!!
Uri uri = new Uri("https://abcd/xyz/def");
WebClient wc = new WebClient();
wc.DownloadStringCompleted += wc_DownloadStringCompleted;
wc.DownloadStringAsync(uri);

void wc_DownloadStringCompleted(object sender, DownloadStringCompletedEventArgs e)
{
    string result = e.Result.ToString();
    MessageBox.Show(result);
    IsolatedStorageFile myIsolatedStorage = IsolatedStorageFile.GetUserStoreForApplication();
    string FolderName = "localFolder\\myData\\";
    myIsolatedStorage.CreateDirectory(FolderName);
    string filepath = System.IO.Path.Combine(FolderName, "myfile.txt");
    using (StreamWriter write = new StreamWriter(new IsolatedStorageFileStream(filepath, FileMode.Create, FileAccess.Write, myIsolatedStorage)))
    {
        write.WriteLine(result);
        write.Close();
    }
}
Use the code below to write the .txt file:
void wc_DownloadStringCompleted(object sender, DownloadStringCompletedEventArgs e)
{
    String response = e.Result.ToString();
    IsolatedStorageFile myIsolatedStorage = IsolatedStorageFile.GetUserStoreForApplication();
    // Set the file path
    string FolderName = "localFolder\\myData\\";
    myIsolatedStorage.CreateDirectory(FolderName);
    string FilePath = System.IO.Path.Combine(FolderName, "myFile.txt");
    // Create the new file
    using (StreamWriter writeFile = new StreamWriter(new IsolatedStorageFileStream(FilePath, FileMode.Create, FileAccess.Write, myIsolatedStorage)))
    {
        writeFile.WriteLine(response);
        writeFile.Close();
    }
}
And check whether the file was saved:
if (myIsolatedStorage.DirectoryExists("localFolder\\myData\\"))
{
    String[] fileNames = myIsolatedStorage.GetFileNames("localFolder\\myData\\*");
}

Changing connection data in PowerPivot file

So I've run into an interesting problem. I need to specify a custom WHERE clause in a PowerPivot query. I must change it based on external conditions. I would like to edit the file and save a copy. Any idea how to do this? I opened the PowerPivot file from binary, but it appears encrypted...
You can go to existing connections and make the update there. If you open the same data source (SQL, SSRS, or anything else) again, instead of changing the parameters on the existing connection, it will hurt performance, as PowerPivot will treat them as separate connections.
Solution was to open the Excel workbook up as a Zip (using the Package class).
If you are looking to modify the queries, you can. The file at /xl/customData/item1.data is a backup file that represents the PowerPivot database (which is just an Analysis Services database running in Vertipaq mode) used to process queries. You need to restore the file to an SSAS instance running in Vertipaq mode. Once that's done, script the queries as an ALTER script. Modify the scripts (in this case, replacing #projectId with my actual projectID), then run them against the database. Once all this is done, back the database up and put it back into the Excel workbook. That modifies the queries.
The connection data is stored in the /xl/connections.xml file. Open that up, modify, and replace. Repack it all up again, and now you have a workbook again.
Here's the code I made. You will have to call the methods as you need. Basic idea is there, though...
const string DBName = "Testing";
const string OriginalBackupPath = @"\\MyLocation\BKUP.abf";
const string ModifiedBackupPath = @"\\MyLocation\BKUPAfter.abf";
const string ServerPath = @"machineName\powerpivot";

private static readonly Server srv = new Server();
private static readonly Scripter scripter = new Scripter();
private static Database db;

private static byte[] GetPackagePartContents(string packagePath, string partPath)
{
    var pack = Package.Open(packagePath, FileMode.OpenOrCreate, FileAccess.ReadWrite);
    var part = pack.GetPart(new Uri(partPath, UriKind.Relative));
    var stream = part.GetStream();
    var b = new byte[stream.Length];
    stream.Read(b, 0, b.Length);
    stream.Flush();
    stream.Close();
    pack.Flush();
    pack.Close();
    return b;
}

private static void WritePackagePartContents(string packagePath, string partPath, byte[] contents)
{
    var uri = new Uri(partPath, UriKind.Relative);
    var pack = Package.Open(packagePath, FileMode.OpenOrCreate, FileAccess.ReadWrite);
    var part = pack.GetPart(uri);
    var type = part.ContentType;
    pack.DeletePart(uri);
    pack.CreatePart(uri, type);
    part = pack.GetPart(uri);
    var stream = part.GetStream();
    stream.Write(contents, 0, contents.Length);
    stream.Flush();
    stream.Close();
    pack.Flush();
    pack.Close();
}

private static void RestoreBackup(string server, string dbName, string backupPath)
{
    srv.Connect(server);
    if (srv.Databases.FindByName(dbName) != null)
    {
        srv.Databases.FindByName(dbName).Drop();
        srv.Update();
    }
    srv.Restore(backupPath, dbName, true);
    srv.Update();
    srv.Refresh();
}

private static void WriteContentsToFile(byte[] contents, string filePath)
{
    var fileStream = File.Open(filePath, FileMode.OpenOrCreate, FileAccess.Write);
    fileStream.Write(contents, 0, contents.Length);
    fileStream.Flush();
    fileStream.Close();
}

private static byte[] ReadContentsFromFile(string filePath)
{
    var fileStream = File.Open(filePath, FileMode.Open, FileAccess.Read);
    var b = new byte[fileStream.Length];
    fileStream.Read(b, 0, b.Length);
    fileStream.Close();
    return b;
}

private static XDocument GetAlterScript(MajorObject obj)
{
    var stream = new MemoryStream();
    var streamWriter = XmlWriter.Create(stream);
    scripter.ScriptAlter(new[] { obj }, streamWriter, false);
    streamWriter.Flush();
    streamWriter.Close();
    stream.Flush();
    stream.Position = 0;
    var b = new byte[stream.Length];
    stream.Read(b, 0, b.Length);
    // Strip the UTF-8 BOM (code point 65279) before parsing.
    var alterString = new string(Encoding.UTF8.GetString(b).ToCharArray().Where(w => w != 65279).ToArray());
    var alter = XDocument.Parse(alterString);
    stream.Close();
    return alter;
}

private static void ExecuteScript(string script)
{
    srv.Execute(script);
    srv.Update();
    db.Process();
    db.Refresh();
}

private static void ProcessPowerpointQueries(string bookUrl, string projectId)
{
    byte[] b = GetPackagePartContents(bookUrl, "/xl/customData/item1.data");
    WriteContentsToFile(b, OriginalBackupPath);
    RestoreBackup(ServerPath, DBName, OriginalBackupPath);
    db = srv.Databases.GetByName(DBName);
    var databaseView = db.DataSourceViews.FindByName("Sandbox");
    var databaseViewAlter = GetAlterScript(databaseView);
    var cube = db.Cubes.FindByName("Sandbox");
    var measureGroup = cube.MeasureGroups.FindByName("Query");
    var partition = measureGroup.Partitions.FindByName("Query");
    var partitionAlter = GetAlterScript(partition);
    var regex = new Regex(@"\s#projectid=\w*[ ,]");
    var newDatabaseViewAlter = databaseViewAlter.ToString().Replace(regex.Match(databaseViewAlter.ToString()).Value.Trim(',', ' '), "#projectid=" + projectId);
    ExecuteScript(newDatabaseViewAlter);
    var newPartitionAlter = partitionAlter.ToString().Replace(regex.Match(partitionAlter.ToString()).Value.Trim(',', ' '), "#projectid=" + projectId);
    ExecuteScript(newPartitionAlter);
    db.Backup(ModifiedBackupPath, true);
    WritePackagePartContents(bookUrl, "/xl/customData/item1.data", ReadContentsFromFile(ModifiedBackupPath));
    db.Drop();
    srv.Disconnect();
}

private static void ProcessWorkbookLinks(string bookUrl, string newCoreUrl)
{
    var connectionsFile = GetPackagePartContents(bookUrl, "/xl/connections.xml");
    var connectionsXml = Encoding.UTF8.GetString(connectionsFile);
    connectionsXml = connectionsXml.Replace(
        new Regex(@"Data Source=\S*;").Match(connectionsXml).Value.Trim(';'), "Data Source=" + newCoreUrl);
    WritePackagePartContents(bookUrl, "/xl/connections.xml", connectionsXml.Replace("https://server/site/", newCoreUrl).ToCharArray().Select(Convert.ToByte).ToArray());
}
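The connections.xml rewrite step can be sketched in isolation. This is a hedged sketch only: the XML below is illustrative, not the real /xl/connections.xml payload, and the helper name is made up. Regex.Replace does the match-and-substitute in one call, whereas the Match-then-String.Replace pair above silently leaves the text unchanged if the pattern fails to match.

```csharp
using System;
using System.Text.RegularExpressions;

// Rewrite the Data Source value inside connection-string text in one pass.
// The character class stops at ';' or '"' so only the value itself is swapped.
static string RetargetDataSource(string connectionsXml, string newCoreUrl) =>
    Regex.Replace(connectionsXml, @"Data Source=[^;""]*;", "Data Source=" + newCoreUrl + ";");

string xml = "<connection str=\"Provider=MSOLAP;Data Source=machineA\\powerpivot;\" />";
string retargeted = RetargetDataSource(xml, "machineB\\powerpivot");
Console.WriteLine(retargeted);
```

In the real workbook you would feed the bytes from GetPackagePartContents through this and write the result back with WritePackagePartContents, encoding with Encoding.UTF8.GetBytes rather than the per-char Convert.ToByte projection, which corrupts any non-ASCII character.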

Copy folders when copying list items from source to destination

This is my code to copy files in a list from source to destination. Using the code below I am only able to copy files but not folders. Any ideas on how can I copy the folders and the files within those folders?
using (SPSite objSite = new SPSite(URL))
{
    using (SPWeb objWeb = objSite.OpenWeb())
    {
        SPList objSourceList = null;
        SPList objDestinationList = null;
        try
        {
            objSourceList = objWeb.Lists["Source"];
        }
        catch (Exception ex)
        {
            Console.WriteLine("Error opening source list");
            Console.WriteLine(ex.Message);
        }
        try
        {
            objDestinationList = objWeb.Lists["Destination"];
        }
        catch (Exception ex)
        {
            Console.WriteLine("Error opening destination list");
            Console.WriteLine(ex.Message);
        }
        string ItemURL = string.Empty;
        if (objSourceList != null && objDestinationList != null)
        {
            foreach (SPListItem objSourceItem in objSourceList.Items)
            {
                ItemURL = string.Format(@"{0}/Destination/{1}", objDestinationList.ParentWeb.Url, objSourceItem.Name);
                objSourceItem.CopyTo(ItemURL);
                objSourceItem.UnlinkFromCopySource();
            }
        }
    }
}
Thanks
This is what worked for me. I had to move folders from one SPWeb to another.
private static void RecursiveCopy(SPList objSourceList, SPFolder objSourceFolder, SPFolder objDestinationFolder)
{
    SPListItemCollection objItems = ((SPDocumentLibrary)objSourceList).GetItemsInFolder(objSourceList.DefaultView, objSourceFolder);
    foreach (SPListItem objItem in objItems)
    {
        // If it's a file, copy it.
        if (objItem.FileSystemObjectType == SPFileSystemObjectType.File)
        {
            byte[] fileBytes = objItem.File.OpenBinary();
            string DestinationURL = string.Format(@"{0}/{1}", objDestinationFolder.Url, objItem.File.Name);
            // Copy the file.
            SPFile objDestinationFile = objDestinationFolder.Files.Add(DestinationURL, fileBytes, true);
            objDestinationFile.Update();
        }
        else
        {
            string dirURL = string.Format(@"{0}/{1}", objDestinationFolder.Url, objItem.Folder.Name);
            SPFolder objNewFolder = objDestinationFolder.SubFolders.Add(dirURL);
            objNewFolder.Update();
            // Copy all the files in the sub folder.
            RecursiveCopy(objSourceList, objItem.Folder, objNewFolder);
        }
    }
}

public static void CopyListItems(string SourceSiteURL, string DestinationSiteURL, string ListName)
{
    string DestinationURL = string.Empty;
    using (SPSite SourceSite = new SPSite(SourceSiteURL))
    {
        using (SPWeb SourceWeb = SourceSite.OpenWeb())
        {
            using (SPSite DestinationSite = new SPSite(DestinationSiteURL))
            {
                using (SPWeb DestinationWeb = DestinationSite.OpenWeb())
                {
                    DestinationWeb.AllowUnsafeUpdates = true;
                    // Get the QA Forms document library from the source web.
                    SPList objSourceList = SourceWeb.Lists[ListName];
                    SPList objDestinationList = null;
                    try
                    {
                        objDestinationList = DestinationWeb.Lists[ListName];
                    }
                    catch
                    {
                        // Create the list in the destination web.
                        DestinationWeb.Lists.Add(ListName, string.Empty, SPListTemplateType.DocumentLibrary);
                    }
                    objDestinationList = DestinationWeb.Lists[ListName];
                    // Recursively copy all the files and folders.
                    RecursiveCopy(objSourceList, objSourceList.RootFolder, objDestinationList.RootFolder);
                    DestinationWeb.Update();
                    DestinationWeb.AllowUnsafeUpdates = false;
                }
            }
        }
    }
}
This copies all the files and folders recursively.
Hope it helps someone.
If you are copying to a destination that is located within the same SPWeb, you can try the following.
using (SPSite site = new SPSite("http://urltosite"))
{
    using (SPWeb web = site.OpenWeb())
    {
        // Get the folder from the source library.
        SPFolder sourceFolder = web.GetFolder("Documents/Folder 1");
        // Get the destination folder.
        SPFolder destinationFolder = web.GetFolder("New Library");
        sourceFolder.CopyTo(destinationFolder.ServerRelativeUrl + "/" + sourceFolder.Name);
    }
}
Sadly I don't think this works when copying a folder to a different SPWeb or SPSite.
SPList.Items only returns non-folder items. You can use SPList.Folders to iterate all of the folders in a list. So if you did the same foreach loop, only using:
foreach (SPListItem objSourceFolderItem in objSourceList.Folders)
You would then get all of the folders. To properly move the folder and all of its contents, you would use objSourceFolderItem.Folder.CopyTo(ItemUrl).
I've tried this using a list with only one level of folders (pair it with a foreach loop to get all of the items in the root folder), and it worked for me in SP2007. I believe SPList.Folders gets all of the folders in the entire list, not just the ones in the root folder, so if you end up breaking the list with a multi-level folder system, then an alternative to try might be:
foreach (SPFolder objSourceFolderItem in objSourceList.RootFolder.SubFolders)
Since those are already SPFolder objects, you can just use objSourceFolderItem.CopyTo(ItemUrl).
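Outside SharePoint, the files-then-subfolders recursion that RecursiveCopy uses maps directly onto System.IO, which makes the pattern easy to try without a farm. A self-contained sketch of the same traversal on the local file system (paths here are throwaway temp directories created just for the demo):

```csharp
using System;
using System.IO;

// Same shape as RecursiveCopy above: copy the files at the current level,
// then create each subfolder at the destination and recurse into it.
static void CopyTree(string sourceDir, string destDir)
{
    Directory.CreateDirectory(destDir);
    foreach (var file in Directory.GetFiles(sourceDir))
        File.Copy(file, Path.Combine(destDir, Path.GetFileName(file)), overwrite: true);
    foreach (var dir in Directory.GetDirectories(sourceDir))
        CopyTree(dir, Path.Combine(destDir, Path.GetFileName(dir)));
}

// Build a small source tree, copy it, and show the nested file arrived.
string src = Path.Combine(Path.GetTempPath(), "copy-src");
string dst = Path.Combine(Path.GetTempPath(), "copy-dst");
if (Directory.Exists(src)) Directory.Delete(src, true);
if (Directory.Exists(dst)) Directory.Delete(dst, true);
Directory.CreateDirectory(Path.Combine(src, "sub"));
File.WriteAllText(Path.Combine(src, "a.txt"), "a");
File.WriteAllText(Path.Combine(src, "sub", "b.txt"), "b");
CopyTree(src, dst);
Console.WriteLine(File.Exists(Path.Combine(dst, "sub", "b.txt"))); // True
```

The SharePoint version differs only in the primitives: Files.Add with OpenBinary bytes instead of File.Copy, and SubFolders.Add instead of Directory.CreateDirectory.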
