I've put together a function that creates a SharePoint folder in a document library based on the URL that's passed in as an argument. The code works and the folder shows up in SharePoint from the web application.
However, when I query the SPWeb object for the folder afterward, it says the folder doesn't exist, which makes no sense to me. Stranger still, this very same code worked not too long ago; I had been using it to create tree structures in SharePoint.
Even though the folder query fails, GetFolder still returns the folder, but when I add files to the returned folder, I get a runtime exception indicating that the file doesn't exist. I assume that really means the folder I am trying to add it to doesn't exist, since the file I am adding obviously doesn't exist yet; that's why I am adding it.
So my question is: why am I getting this error, and why does FolderExists return false when the folder actually exists? We know it exists because GetFolder actually returns it...
I've included some actual code from the app to make things clear.
If someone could have a look at the code and see if anything jumps out at them, that would be fantabulous. Thanks.
Code to build folders:
public void CreateFolder(SPUriBuilder url)
{
    try
    {
        Log.Instance.WriteToLog("CreateFolder({0})", url);
        var library = GetLibrary(url.Library);
        if (library != null)
        {
            // parse out string data
            var parent = library.RootFolder.ServerRelativeUrl;
            var segments = url.Account.Split(new char[] { '/' }, StringSplitOptions.RemoveEmptyEntries);
            var path = parent;

            // get default folder collection
            SPFolderCollection subFolders = _web.GetFolder(parent).SubFolders;

            // check for sub-folders to create
            if (segments.Length > 0)
            {
                int i = 0;
                do
                {
                    // check for folder and create if non-existent
                    var buildPath = String.Format("{0}/{1}", path, segments[i]);
                    if (_web.GetFolder(buildPath).Exists == false)
                        _web.GetFolder(path).SubFolders.Add(segments[i]);

                    // retrieve new sub-folder collection
                    subFolders = _web.GetFolder(buildPath).SubFolders;
                    path = buildPath;

                    // next folder in path
                    i++;
                }
                while (i < segments.Length);
            }

            // finally, add folder of interest
            subFolders.Add(url.Folder);
        }
    }
    catch (Exception e)
    {
        throw new SPImportException("Exception: {0}, creating folder: {1} in Library: {2}", e.Message, url.Folder, url.Library);
    }
}
Code to Query folder:
public bool FolderExists(SPUriBuilder url)
{
    return _web.GetFolder(url.Uri.LocalPath).Exists;
}
Code to Get Folder:
private SPFolder GetFolder(SPUriBuilder url)
{
    return _web.GetFolder(url.Uri.LocalPath);
}
The SPUriBuilder is a custom class I created to assemble the Uri:
public class SPUriBuilder
{
    public string SiteUrl { get; private set; }
    public string Library { get; private set; }
    public string Parent { get; private set; }
    public string Folder { get; private set; }
    public string File { get; private set; }
    public string Account { get; private set; }
    public Uri Uri { get; private set; }

    public SPUriBuilder(string siteUrl, string library, string account, string parent, string folder)
    {
        this.SiteUrl = siteUrl;
        this.Library = library;
        this.Account = account;
        this.Parent = parent.Replace("\\", "/");
        this.Parent = this.Parent.StartsWith("/") ? this.Parent.Substring(1) : this.Parent;
        this.Folder = folder;

        StringBuilder url = new StringBuilder();
        url.AppendFormat("{0}/{1}/{2}", SiteUrl, Library, Account);
        if (String.IsNullOrEmpty(Parent) == false)
            url.AppendFormat("/{0}", Parent);
        url.AppendFormat("/{0}", Folder);
        this.Uri = new Uri(url.ToString());
    }

    public SPUriBuilder(SPUriBuilder uri, string file)
        : this(uri.SiteUrl, uri.Library, uri.Account, uri.Parent, uri.Folder)
    {
        this.File = file;
        StringBuilder url = new StringBuilder();
        url.AppendFormat("{0}/{1}", this.Uri.ToString(), this.File);
        this.Uri = new Uri(url.ToString());
    }

    public override string ToString()
    {
        return Uri.ToString();
    }
}
I found the answer to this myself. The problem was in the code used to create the folder.
var parent = library.RootFolder.ServerRelativeUrl;
// This line of code is incorrect, so it returned the wrong data, thus building the folder path incorrectly.
//
var segments = url.Account.Split(new char[] { '/' }, StringSplitOptions.RemoveEmptyEntries);
var path = parent;
// This is the replacement line of code that fixed the issue.
//
var segments = url.Uri.LocalPath.Substring(parent.Length+1).Split(new char[] { '/' }, StringSplitOptions.RemoveEmptyEntries);
// as well, this line had to be removed since it was no longer needed
//
// finally, add folder of interest
//
subFolders.Add(url.Folder);
Ultimately the issue turned out to be that the folder structure I was attempting to create the file in did not exist. One or more segments in the path were missing.
So if you ever see this error, make sure the folder you are adding the file to actually exists. If it doesn't, you will certainly run into this error.
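For anyone who lands here with the same symptom, here is a minimal sketch of the kind of "ensure every segment exists" helper that the fix boils down to. The EnsureFolderPath name is hypothetical, and it assumes _web is a valid SPWeb, as in the code above:
// Minimal sketch, assuming _web is a valid SPWeb and relativePath is the part of the
// server-relative URL below an existing document library root (e.g. "A/B/C").
private SPFolder EnsureFolderPath(string libraryRootUrl, string relativePath)
{
    var current = _web.GetFolder(libraryRootUrl);
    foreach (var segment in relativePath.Split(new[] { '/' }, StringSplitOptions.RemoveEmptyEntries))
    {
        var next = _web.GetFolder(current.ServerRelativeUrl + "/" + segment);
        if (!next.Exists)
            next = current.SubFolders.Add(segment); // create the missing segment
        current = next;
    }
    return current; // every segment now exists, so adding a file here should succeed
}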
I have the following hierarchy in my Azure storage container:
Container
  -- Folder 1
     -- Folder 2
        -- Folder 2.1
           -- File 1
           -- File 2
           -- File 3
What I'm searching for is a generic function where I can pass a string, e.g. "container/Folder1/Folder2", and it returns me the hierarchy below that path, i.e.
-- Folder 2.1
   -- File 1
   -- File 2
   -- File 3
I have the following code in place, but with it I'm not able to pass the prefix as "container/Folder1/Folder2". If I add "/" to the prefix string, it throws an error that the URI string is invalid.
static void printCloudDirectories(IEnumerable<IListBlobItem> blobList, Container cont)
{
    foreach (var blobitem in blobList)
    {
        if (blobitem is CloudBlobDirectory)
        {
            var container = new Container();
            var directory = blobitem as CloudBlobDirectory;
            Console.WriteLine(directory.Prefix);
            container.Name = directory.Prefix;
            BlobContinuationToken token = null;
            var directories = directory.ListBlobsSegmentedAsync(token).Result.Results;
            printCloudDirectories(directories, container);
            cont.Containers.Add(container);
        }
        else
        {
            cont.Children.Add(blobitem.Uri.AbsoluteUri);
        }
    }
}

public static void ListClientMethod(CloudBlobClient cloudBlobClient)
{
    BlobContinuationToken token = null;
    var containerSegments = cloudBlobClient.ListContainersSegmentedAsync(token).Result;
    List<Container> containers = new List<Container>();
    foreach (var container in containerSegments.Results)
    {
        Console.WriteLine("Container: " + container.Name);
        var cont = new Container();
        cont.Name = container.Name;
        // ADD A CALL TO printCloudDirectories:
        BlobContinuationToken token1 = null;
        var blobs = container.ListBlobsSegmentedAsync(token1).Result.Results;
        printCloudDirectories(blobs, cont);
        containers.Add(cont);
    }
}

public class Container
{
    public Container()
    {
        Children = new List<string>();
        Containers = new List<Container>();
    }

    public string Name { get; set; }
    public List<string> Children { get; set; }
    public List<Container> Containers { get; set; }
}
I use C# as the coding language.
Please use the ListBlobsSegmentedAsync(String, Boolean, BlobListingDetails, Nullable<Int32>, BlobContinuationToken, BlobRequestOptions, OperationContext) method.
The 1st parameter to this method is the blob prefix, and you need to specify Folder 1/Folder 2/ there.
The 2nd parameter to this method is useFlatBlobListing, and you need to pass true for that.
It should return you a result like:
Folder 1/Folder 2/Folder 2.1/File 1
Folder 1/Folder 2/Folder 2.1/File 2
Folder 1/Folder 2/Folder 2.1/File 3
and you should be able to construct the desired treeview based on this.
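As a rough illustration, here is a minimal sketch of that call against a CloudBlobContainer, using the same client library types as the question's code; the prefix value and the ListUnderPrefixAsync method name are just placeholders for this example:
// Minimal sketch: list everything under "Folder 1/Folder 2/" as a flat listing.
// Assumes the Microsoft.WindowsAzure.Storage.Blob (or Microsoft.Azure.Storage.Blob) types
// already used in the question.
public static async Task ListUnderPrefixAsync(CloudBlobContainer container)
{
    BlobContinuationToken token = null;
    do
    {
        var segment = await container.ListBlobsSegmentedAsync(
            "Folder 1/Folder 2/",    // prefix (no leading '/', no container name)
            true,                    // useFlatBlobListing
            BlobListingDetails.None,
            null,                    // maxResults
            token,
            null,                    // options
            null);                   // operationContext
        foreach (var item in segment.Results)
        {
            if (item is CloudBlockBlob blob)
                Console.WriteLine(blob.Name); // e.g. "Folder 1/Folder 2/Folder 2.1/File 1"
        }
        token = segment.ContinuationToken;
    } while (token != null);
}
The returned blob names carry the full relative path, so splitting each name on '/' is enough to rebuild the tree on the client side.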
I want to access all files attached to the current project. I'm not able to find any files using the PXSelect statement below.
My Code
public PXSelect<UploadFile, Where<UploadFile.name, Like<Current<PMProject.contractCD>>>> Files;

string files = "";
foreach (UploadFile f in Files.Select())
{
    files += "\n" + f.FileID;
}
The static GetFileNotes method of the PXNoteAttribute returns the list of identifiers of files attached to a record. Below is a code snippet showing how to retrieve all files attached to the current project:
public class ProjectEntryExt : PXGraphExtension<ProjectEntry>
{
    public PXAction<PMProject> GetFiles;

    [PXButton]
    [PXUIField(DisplayName = "Get Files")]
    protected void getFiles()
    {
        var projectCache = Base.Caches[typeof(PMProject)];
        Guid[] files = PXNoteAttribute.GetFileNotes(projectCache, projectCache.Current);
        foreach (Guid fileID in files)
        {
            var fm = new PX.SM.UploadFileMaintenance();
            PX.SM.FileInfo fi = fm.GetFileWithNoData(fileID);
        }
    }
}
In UWP there are file and permission restrictions, so we can only access files directly from a few folders, or we can use a file picker to access them from anywhere on the system.
How can I take the files picked from the file picker and use them again any time the app runs? I tried to use them again by path, but that gives a permission error. I know about the FutureAccessList, but its limit is 1000 items, and it will also make the app slow, if I am not wrong.
Is there a better way to do this? Or can we somehow store links to the storage files in a local SQLite database?
If you need to access lots of files, asking the user to select the parent folder and then storing that is probably a better solution (unless you want to store 1,000 individually-picked files from different locations). You can store StorageFolders in the access list as well.
I'm not sure why you think it will make your app slow, but the only real way to know if this will affect your performance is to try it and measure against your goals.
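As a rough sketch of that approach, something like the following lets the user pick a parent folder once, remembers it in the FutureAccessList, and reopens it by token on later runs. The "pickedFolderToken" settings key and the helper method names are assumptions for this example:
// Minimal sketch. Assumes: using Windows.Storage; using Windows.Storage.AccessCache;
// using Windows.Storage.Pickers;
private const string FolderTokenKey = "pickedFolderToken"; // assumed settings key

public async Task<StorageFolder> PickAndRememberFolderAsync()
{
    var picker = new FolderPicker { SuggestedStartLocation = PickerLocationId.DocumentsLibrary };
    picker.FileTypeFilter.Add("*"); // required before calling PickSingleFolderAsync
    StorageFolder folder = await picker.PickSingleFolderAsync();
    if (folder != null)
    {
        // one token for the whole tree, instead of 1,000 individual file entries
        string token = StorageApplicationPermissions.FutureAccessList.Add(folder);
        ApplicationData.Current.LocalSettings.Values[FolderTokenKey] = token;
    }
    return folder;
}

public async Task<StorageFolder> ReopenRememberedFolderAsync()
{
    if (ApplicationData.Current.LocalSettings.Values.TryGetValue(FolderTokenKey, out object token))
        return await StorageApplicationPermissions.FutureAccessList.GetFolderAsync((string)token);
    return null; // nothing remembered yet
}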
Considering this method..
public async static Task<byte[]> ToByteArray(this StorageFile file)
{
    byte[] fileBytes = null;
    using (IRandomAccessStreamWithContentType stream = await file.OpenReadAsync())
    {
        fileBytes = new byte[stream.Size];
        using (DataReader reader = new DataReader(stream))
        {
            await reader.LoadAsync((uint)stream.Size);
            reader.ReadBytes(fileBytes);
        }
    }
    return fileBytes;
}
This class..
public class AppFile
{
    public string FileName { get; set; }
    public byte[] ByteArray { get; set; }
}
And this variable
List<AppFile> _appFiles = new List<AppFile>();
Just..
var fileOpenPicker = new FileOpenPicker();
IReadOnlyList<StorageFile> files = await fileOpenPicker.PickMultipleFilesAsync();

foreach (var file in files)
{
    var byteArray = await file.ToByteArray();
    _appFiles.Add(new AppFile { FileName = file.DisplayName, ByteArray = byteArray });
}
UPDATE
using Newtonsoft.Json;
using System.Linq;
using Windows.Security.Credentials;
using Windows.Storage;

namespace Your.Namespace
{
    public class StateService
    {
        public void SaveState<T>(string key, T value)
        {
            var localSettings = ApplicationData.Current.LocalSettings;
            localSettings.Values[key] = JsonConvert.SerializeObject(value);
        }

        public T LoadState<T>(string key)
        {
            var localSettings = ApplicationData.Current.LocalSettings;
            if (localSettings.Values.ContainsKey(key))
                return JsonConvert.DeserializeObject<T>((string)localSettings.Values[key]);
            return default(T);
        }

        public void RemoveState(string key)
        {
            var localSettings = ApplicationData.Current.LocalSettings;
            if (localSettings.Values.ContainsKey(key))
                localSettings.Values.Remove(key);
        }

        public void Clear()
        {
            ApplicationData.Current.LocalSettings.Values.Clear();
        }
    }
}
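To tie this back to the AppFile list above, a usage along these lines would persist and restore the picked file contents; the "appFiles" key is just an assumed name for this sketch:
// Minimal usage sketch, assuming the StateService and AppFile types above.
var stateService = new StateService();

// After picking files and filling _appFiles:
stateService.SaveState("appFiles", _appFiles);

// On a later run, restore them without touching the file system again:
List<AppFile> restored = stateService.LoadState<List<AppFile>>("appFiles") ?? new List<AppFile>();
Keep in mind that LocalSettings values have a fairly small per-value size limit, so serializing large byte arrays this way can fail; for bigger payloads, writing the JSON to a file in ApplicationData.Current.LocalFolder is the safer route.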
A bit late, but yes, the FutureAccessList will slow down your app, in that it returns StorageFile, StorageFolder, or StorageItem objects. These run via the runtime broker, which hits a huge performance barrier at about 400 objects, regardless of the host capability.
I am using svnkit-1.3.5.jar in my application. On one of my screens, on clicking a button, I need to display a jQuery dialog box containing a list of folders present at a particular path in SVN. Does SVNKit provide any method that retrieves all folder names present at a specific location? How do I achieve this in Java?
Here is the code I use for the same purpose (it uses the SVNKit library). It is a modified version of @mstrap's code, for better clarity.
public static String NAME = "svnusername";
public static String PASSWORD = "svnpass";
public final String TRUNK_VERSION_PATH = "svn://192.168.1.1/path";
public static List<String> apiVersions;

public List<String> getApiVersion() {
    logger.info("Getting API Version list....");
    apiVersions = new ArrayList<String>();
    SVNURL repositoryURL = null;
    try {
        repositoryURL = SVNURL.parseURIEncoded(TRUNK_VERSION_PATH);
    } catch (SVNException e) {
        logger.error(e);
    }
    SVNRevision revision = SVNRevision.HEAD;
    SvnOperationFactory operationFactory = new SvnOperationFactory();
    operationFactory.setAuthenticationManager(new BasicAuthenticationManager(NAME, PASSWORD));
    SvnList list = operationFactory.createList();
    list.setDepth(SVNDepth.IMMEDIATES);
    list.setRevision(revision);
    list.addTarget(SvnTarget.fromURL(repositoryURL, revision));
    list.setReceiver(new ISvnObjectReceiver<SVNDirEntry>() {
        public void receive(SvnTarget target, SVNDirEntry object) throws SVNException {
            String name = object.getRelativePath();
            if (name != null && !name.isEmpty()) {
                apiVersions.add(name);
            }
        }
    });
    try {
        list.run();
    } catch (SVNException ex) {
        logger.error(ex);
    }
    return apiVersions;
}
Cheers!!
final SVNURL url = ...
final SVNRevision revision = ...
final SvnOperationFactory operationFactory = ...

final SvnList list = operationFactory.createList();
list.setDepth(SVNDepth.IMMEDIATES);
list.setRevision(revision);
list.addTarget(SvnTarget.fromURL(url, revision));
list.setReceiver(new ISvnObjectReceiver<SVNDirEntry>() {
    public void receive(SvnTarget target, SVNDirEntry object) throws SVNException {
        final String name = object.getRelativePath();
        System.out.println(name);
    }
});
list.run();
I need to create a simple program which goes through a user-given directory on SharePoint, finds all the folders which are older than 1 month, and then copies them to a local hard drive.
Perhaps it could also create some log recording that a given folder was moved and where it was moved to...
Thanks
Jakub
I wrote this sample code which you can use to understand how it can be done, or you can just use it, because it seems to work fine.
class Program
{
    static void Main(string[] args)
    {
        MoveFolders("your_web_url", "your_doclib_url");
    }

    public static void MoveFolders(string webUrl, string listUrl)
    {
        using (SPSite site = new SPSite(webUrl))
        {
            using (SPWeb web = site.OpenWeb())
            {
                SPList targetList = web.GetList(web.Url + "/" + listUrl);
                MoveFolders(targetList.RootFolder, @"C:\test"); // path to your local storage folder
            }
        }
    }

    public static void MoveFolders(SPFolder targetFolder, string rootLocalPath)
    {
        string currentPath = Path.Combine(rootLocalPath, targetFolder.Name);
        if (!Directory.Exists(currentPath))
            Directory.CreateDirectory(currentPath);

        DateTime lastModified = (DateTime)targetFolder.Properties["vti_timelastmodified"]; // folder last modified date
        if (lastModified < DateTime.Today.AddMonths(-1))
            SaveFolderLocal(targetFolder, currentPath);

        foreach (SPFolder folder in targetFolder.SubFolders)
        {
            MoveFolders(folder, currentPath);
        }
    }

    public static void SaveFolderLocal(SPFolder folder, string localStoragePath)
    {
        foreach (SPFile file in folder.Files)
        {
            var contents = file.OpenBinary();
            using (FileStream fileStream = new FileStream(Path.Combine(localStoragePath, file.Name), FileMode.Create))
            {
                fileStream.Write(contents, 0, contents.Length);
            }
        }
    }
}
This code will save your doclib folder structure locally, with the contents of any folder modified more than one month ago. Just be careful with the recursive MoveFolders method, because it can cause a StackOverflowException on libraries with a very complex folder structure; see the iterative sketch below for one way around that.
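If deep nesting is a real concern, a minimal sketch of an iterative traversal with an explicit stack might look like this. It reuses the SaveFolderLocal helper above, and the MoveFoldersIterative name is mine, not part of the original answer:
// Minimal sketch: same traversal as MoveFolders, but with an explicit stack
// instead of recursion, so deep folder trees cannot exhaust the call stack.
public static void MoveFoldersIterative(SPFolder rootFolder, string rootLocalPath)
{
    var work = new Stack<Tuple<SPFolder, string>>();
    work.Push(Tuple.Create(rootFolder, rootLocalPath));

    while (work.Count > 0)
    {
        var item = work.Pop();
        SPFolder folder = item.Item1;
        string currentPath = Path.Combine(item.Item2, folder.Name);

        if (!Directory.Exists(currentPath))
            Directory.CreateDirectory(currentPath);

        DateTime lastModified = (DateTime)folder.Properties["vti_timelastmodified"];
        if (lastModified < DateTime.Today.AddMonths(-1))
            SaveFolderLocal(folder, currentPath);

        foreach (SPFolder child in folder.SubFolders)
            work.Push(Tuple.Create(child, currentPath));
    }
}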