The migration of the SQL Reporting component of Windows Azure from the old portal to the newer HTML 5 one has, in the process, limited the folder hierarchy to 2 levels deep (as indicated in this article).
The article does, however, state that existing reporting services can still have deeper hierarchies, whilst Business Intelligence Development Studio still allows you to deploy to sub folders.
We have preserved our hierarchy like so:
Root Level
    Client Reports
    Internal Reports
        Report Category 1
            Data Source
            Report1.rdl
            Report2.rdl
        Report Category 2
Due to the number of reports we have, it is infeasible to have every folder at root level and, thus far, the hierarchy has still been functioning correctly.
However, we have run into a problem: we can no longer update any data sources or delete reports that are more than 2 levels deep.
Rather than restructure all our reports to suit what feels like an extremely restrictive structure, is there a way of managing our SQL Reporting reports external to the portal, via either an API, BIDS or PowerShell?
OK, so I've done a bit of research into this: SQL Reporting on Windows Azure exposes the ReportService2010 SOAP interface for administering the reports programmatically.
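For reference, a proxy class can be generated with the .NET wsdl.exe tool; a minimal sketch, where the namespace and output file name are arbitrary choices rather than required values:
wsdl.exe "https://X.reporting.windows.net/reportserver/reportservice2010.asmx?wsdl" /namespace:ReportService2010Proxy /out:ReportingService2010.cs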
Using a proxy generated through the WSDL tool we can remotely connect to SQL Reporting using the below C# code:
var reportServiceUrl =
    "https://X.reporting.windows.net/reportserver/reportservice2010.asmx?wsdl";
var serviceUsername = "AdminUserName";
var servicePassword = "AdminPassword";

var reportingService = new ReportingService2010
{
    Url = reportServiceUrl,
    CookieContainer = new CookieContainer() // holds the auth cookie issued by LogonUser
};

reportingService.LogonUser(serviceUsername, servicePassword, reportServiceUrl);
The reportingService object can then be used to upload, update and delete all items on the SQL Reporting instance. If, for example, I wanted to delete a report in a sub folder that cannot be accessed on the Windows Azure portal, I could use:
reportingService.DeleteItem("/Internal Reports/Report Category 1/Report1.rdl");
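Updating data sources in deep folders works the same way; a minimal sketch using the proxy's GetItemDataSources/SetItemDataSources methods (the report and data source paths here are illustrative):
// Read the data sources referenced by a report the portal can no longer reach.
var reportPath = "/Internal Reports/Report Category 1/Report2.rdl";
var dataSources = reportingService.GetItemDataSources(reportPath);

foreach (var dataSource in dataSources)
{
    // Repoint each reference at the shared data source item in the same folder.
    dataSource.Item = new DataSourceReference
    {
        Reference = "/Internal Reports/Report Category 1/Data Source"
    };
}

reportingService.SetItemDataSources(reportPath, dataSources);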
That said, it is much easier to refactor the report folders into a 2 level hierarchy instead. The naming convention we ended up using is:
Root Level
    Internal Reports - Report Category 1
        Data Source
        Report1.rdl
        Report2.rdl
    Internal Reports - Report Category 2
Related
We are designing an Azure website which will allow users to upload content (MP4, DOCX and other MS Office files) which can then be accessed.
Some video content we will encode into several different quality formats before it is streamed (using Azure Media Services).
We need to add an intermediate step so we can scan uploaded files for potential virus risk. Is there functionality built into Azure (or third party) which will allow us to call an API to scan content before processing it? We are ideally looking for an API rather than just a background service on a VM, so we can get feedback, potentially for use in a web or worker role.
I had a quick look at Symantec Endpoint and Windows Defender, but I'm not sure these offer an API.
I have successfully done this using the open source ClamAV. You don't specify what languages you are using, but as it's Azure I'll assume .NET.
There is a .NET wrapper that should provide the API you are looking for:
https://github.com/tekmaven/nClam
Here is some sample code (note: this is copied directly from the nClam GitHub repo page and reproduced here just to protect against link rot):
using System;
using System.Linq;
using nClam;

class Program
{
    static void Main(string[] args)
    {
        var clam = new ClamClient("localhost", 3310);
        var scanResult = clam.ScanFileOnServer("C:\\test.txt");  // any file you would like!

        switch (scanResult.Result)
        {
            case ClamScanResults.Clean:
                Console.WriteLine("The file is clean!");
                break;
            case ClamScanResults.VirusDetected:
                Console.WriteLine("Virus Found!");
                Console.WriteLine("Virus name: {0}", scanResult.InfectedFiles.First().VirusName);
                break;
            case ClamScanResults.Error:
                Console.WriteLine("Woah an error occurred! Error: {0}", scanResult.RawResult);
                break;
        }
    }
}
There are also APIs available for refreshing the virus definition database. All the necessary ClamAV files can be included in the deployment package and any configuration can be put into the service start-up code.
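As a sketch of that start-up idea (assuming clamd.exe and freshclam.exe are shipped in the deployment package along with their config files; all paths here are illustrative):
// Update virus definitions first, then start the ClamAV daemon that nClam connects to.
var freshclam = System.Diagnostics.Process.Start(
    @"clamav\freshclam.exe", @"--config-file=clamav\freshclam.conf");
freshclam.WaitForExit();

System.Diagnostics.Process.Start(@"clamav\clamd.exe", @"--config-file=clamav\clamd.conf");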
ClamAV is a good idea, especially now that 0.99 is about to be released with YARA rule support - it will make it really easy for you to write custom rules and allow ClamAV to use the tons of good YARA rules in the open today.
Another route, and a bit of shameless plugging, is to check out scanii.com. It's a SaaS for malware/virus detection and it integrates quite nicely with AWS and Azure.
There are a number of options to achieve this:
Firstly, you can use ClamAV as already mentioned. ClamAV doesn't always receive the best press for its virus databases but, as others have pointed out, it's easy to use and is expandable.
You can also install a commercial scanner, such as AVG, Kaspersky etc. Many of these come with a C API that you can talk to directly, although getting access to this can often be expensive from a licensing point of view.
Alternatively you can make calls to the executable directly using something like the following to capture the output:
// requires: using System.Diagnostics;
var proc = new Process
{
    StartInfo = new ProcessStartInfo
    {
        FileName = "scanner.exe",
        Arguments = "arguments needed",
        UseShellExecute = false,
        RedirectStandardOutput = true,
        CreateNoWindow = true
    }
};

proc.Start();

while (!proc.StandardOutput.EndOfStream)
{
    string line = proc.StandardOutput.ReadLine();
    // Inspect each line of scanner output here.
}
You would then need to parse the output to get the result and use it within your application.
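With ClamAV's clamscan.exe, for instance, the documented exit codes make this trivial; a minimal sketch continuing from the snippet above:
proc.WaitForExit();

// clamscan documents exit code 0 as clean, 1 as virus found, 2+ as error.
bool infected = proc.ExitCode == 1;
Console.WriteLine(infected ? "Virus detected" : "File is clean");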
Finally, there are now some commercial APIs available to do this kind of thing, such as attachmentscanner (disclaimer: I'm related to this product) or scanii. These will provide you with an API and a more scalable option to scan specific files and receive the response from at least one virus checking engine.
New thing coming Spring/Summer 2020: Advanced Threat Protection for Azure Storage includes Malware Reputation Screening, which detects malware uploads using hash reputation analysis leveraging the power of Microsoft Threat Intelligence, which includes hashes for viruses, trojans, spyware and ransomware. Note: it cannot be guaranteed that every malware will be detected using the hash reputation analysis technique.
https://techcommunity.microsoft.com/t5/Azure-Security-Center/Validating-ATP-for-Azure-Storage-Detections-in-Azure-Security/ba-p/1068131
webClient.UploadFile("http://www.myurl.com/~/media/DCF92BB74CDA4D558EEF2D3C30216E30.ashx", @"E:\filesImage\Item.png");
I'm trying to upload images to Sitecore using the WebClient.UploadFile() method by sending my Sitecore address and the path of my local images, but I'm not able to upload them. I have to do this without any APIs or Sitecore instances.
The upload process would be the same as with any ASP.NET application. However, once the file has been uploaded you need to create a media item programmatically. You can do this from an actual file in the file system, or from a memory stream.
The process involves using a MediaCreator object and using its CreateFromFile method.
This blog post outlines the whole process:
Adding a file to the Sitecore Media Library programatically
If you're thinking simply about optimizing your developer workflow, you could use the Sitecore PowerShell Extensions with the Remoting API, as described in this blog post.
If you want to go the web service way, there are a number of options:
a) Sitecore Rocks WebService (if you are allowed to install it, or it is already available).
b) Sitecore Razl Service (third party, which needs a license).
c) Sitecore PowerShell Remoting (this needs the Sitecore PowerShell Extensions to be installed on the Sitecore server).
d) The Sitecore web service you can find under sitecore\shell\WebService\Service.asmx (but this is the legacy predecessor of the newer SitecoreItemWebAPI).
e) Last is my enhanced SitecoreItemWebAPI (this also needs SitecoreItemWebApi 1.2 as a prerequisite).
In the end, with every option except d, you need to install something extra in order to upload the image using HTTP, and you also need valid credentials to use any of the above methods.
If your customers upload images via the website, you need to create the items in your master database (which needs access and write rights on the master database). Depending on your security, you might consider not building this with custom code.
Instead, you could use the Sitecore Web Forms for Marketers module with its out-of-the-box file upload: create a form with an upload field and use the WFFM web services.
If you don't want to use the Sitecore API, then you can do the following:
Write code that uploads the images into this folder: [root]/upload/
You might need to create a folder structure that represents how the images are stored in Sitecore, e.g. images uploaded into [root]/upload/Import/ will be stored in /sitecore/media library/Import.
Sitecore will automatically import these images into the media library; see the sketch below.
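A minimal sketch of that approach, assuming the uploading code has file system access to the webroot (both paths below are illustrative):
// Files copied into [root]/upload/Import/ end up under /sitecore/media library/Import.
string uploadFolder = @"C:\inetpub\wwwroot\Sitecore\Website\upload\Import";
string localImage = @"E:\filesImage\Item.png";

System.IO.Directory.CreateDirectory(uploadFolder); // mirror the desired media library structure
System.IO.File.Copy(localImage,
    System.IO.Path.Combine(uploadFolder, System.IO.Path.GetFileName(localImage)),
    overwrite: true);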
Hope this helps
Option: You can use the Item Web API for this. No reference to any Sitecore DLL is needed. You will only need access to the host and to be able to enable the Item Web API.
References:
Upload the files using it: http://www.sitecoreinsight.com/how-create-media-items-using-sitecore-item-web-api/
Enable Item Web Api: http://sdn.sitecore.net/upload/sdn5/modules/sitecore%20item%20web%20api/sitecore_item_web_api_developer_guide_sc66-71-a4.pdf#search=%22item%22
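A rough sketch of what such an upload can look like, assuming the Item Web API is enabled as described above; the media library path, item name and credentials are illustrative, so check the linked developer guide for the exact endpoint contract:
using (var client = new System.Net.WebClient())
{
    // The Item Web API authenticates requests via these custom headers.
    client.Headers.Add("X-Scitemwebapi-Username", @"sitecore\admin");
    client.Headers.Add("X-Scitemwebapi-Password", "password");

    // POST the image to the media library folder the item should be created in.
    byte[] response = client.UploadFile(
        "http://myserver/-/item/v1/sitecore/media library/Import?name=MyImage",
        @"E:\filesImage\Item.png");
}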
I guess that is pretty much what you need but, as Jay S mentioned, adding more details to your question helps in finding the best option for your particular case.
// requires: using System.IO; using Sitecore.Data; using Sitecore.Data.Items; using Sitecore.SecurityModel;
private void CreateImageItemInSitecore()
{
    string filePath = @"C:\Sitecore\Website\ImageTemp\Pic.jpg";

    using (new SecurityDisabler())
    {
        Database masterDb = Sitecore.Configuration.Factory.GetDatabase("master");

        Sitecore.Resources.Media.MediaCreatorOptions options = new Sitecore.Resources.Media.MediaCreatorOptions();
        options.FileBased = true;
        options.AlternateText = Path.GetFileNameWithoutExtension(filePath);
        options.Destination = "/sitecore/media library/Downloads/";
        options.Database = masterDb;
        options.Versioned = false; // do not make a versioned template
        options.KeepExisting = false;

        Sitecore.Data.Items.MediaItem mediaItemImage = new Sitecore.Resources.Media.MediaCreator().CreateFromFile(filePath, options);

        Item imageItem = masterDb.GetItem(mediaItemImage.ID.ToString());
        imageItem.Editing.BeginEdit();
        imageItem.Name = Path.GetFileNameWithoutExtension(filePath);
        imageItem.Editing.EndEdit();
    }
}
This is regarding SharePoint 2010 integration with MS CRM 2011.
While creating a record in CRM, I am trying to create a custom document location for that record and a similar folder in SharePoint, so that when the user clicks the document link in the entity record it does not prompt them to create a folder in SharePoint (trying to avoid the SharePoint noise for a better user experience).
I have implemented this through a post-create asynchronous plug-in (I did this through a console program first, where it works fine), then built the plug-in and deployed it to CRM.
When creating a record it errors out with a message like "An internal server 500 error - Could not load the assembly with public key token etc…".
But when I debug the plug-in, it fails at the first line where I instantiate the SharePoint client context, with [System.Security.SecurityException] = {"That assembly does not allow partially trusted callers."}.
From what I found online, fixing this requires the AllowPartiallyTrustedCallers attribute in the assembly info file. As per my understanding, this would have to be done in the SharePoint DLLs, because the request goes from my CRM plug-in to the SharePoint DLL; I mean the SharePoint DLLs are not allowing requests from my assembly. How can we change that?
I have referenced Microsoft.SharePoint.Client.dll and Microsoft.SharePoint.Client.Runtime.dll.
What is the alternative to overcome this issue?
I'd appreciate it if someone can help me. Thanks in advance.
Here is my code for SharePoint
// assumes: using SP = Microsoft.SharePoint.Client;
ClientContext clientContext = new ClientContext(siteUrl);

CredentialCache cc = new CredentialCache();
cc.Add(new Uri(siteUrl), "NTLM", CredentialCache.DefaultNetworkCredentials);
clientContext.Credentials = cc;
clientContext.AuthenticationMode = ClientAuthenticationMode.Default;

Web web = clientContext.Web;
SP.List list = web.Lists.GetByTitle(listName);

ListItemCreationInformation newItem = new ListItemCreationInformation();
newItem.UnderlyingObjectType = FileSystemObjectType.Folder;
newItem.FolderUrl = siteUrl + "/" + folderlogicalName;
if (!relativePath.Equals(string.Empty))
    newItem.FolderUrl += "/" + relativePath;
newItem.LeafName = newfolderName;

SP.ListItem item = list.AddItem(newItem);
item.Update();
clientContext.ExecuteQuery();
Here I am passing the siteUrl, folderlogicalName, relativePath and newfolderName as parameters.
This works fine from my console application, but when converted to a CRM plug-in it gives the issue specified above.
I've seen a similar issue before.
CRM plugins run inside a sandbox, so all assemblies and .NET libraries used must allow partial trust callers (since the CRM sandbox runs under partial trust). It works in the console because you are executing the code as a full trust user in that context.
This issue is not necessarily your code; it could be that a dependency or a .NET library itself does not allow partial trust callers - in your case it sounds like the SharePoint library is the culprit (but a stack trace of the error should reveal exactly where the cause is).
Since you don't have access to the source of the library causing the problem, you will likely have to create a wrapper to overcome the error. However, the wrapper cannot directly reference the problem library or you will get the same issue. To get around this, you may have to create a web service which acts as your wrapper, and then call that web service from your CRM plugin. This way the full trust code is executed by the web service (which runs in full trust) and the result is returned to your calling CRM plugin.
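A minimal sketch of the plugin side of that pattern (the wrapper service URL and its query contract are hypothetical; the service itself would host the SharePoint client code from the question):
// Call the full-trust wrapper service over HTTP instead of referencing
// the SharePoint client DLLs from the sandboxed plugin.
var request = (HttpWebRequest)WebRequest.Create(
    "https://yourserver/FolderService.svc/CreateFolder" +
    "?site=" + Uri.EscapeDataString(siteUrl) +
    "&folder=" + Uri.EscapeDataString(newfolderName));
request.Credentials = CredentialCache.DefaultNetworkCredentials;

using (var response = (HttpWebResponse)request.GetResponse())
{
    if (response.StatusCode != HttpStatusCode.OK)
        throw new InvalidPluginExecutionException("Folder creation in SharePoint failed.");
}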
Here is more info on the error.
Thanks Jason, this works for me.
I would like to add a few additional points to the answer:
1. I added the SharePoint DLLs to the bin folder of the CRM 2011 site.
2. I also deployed the same DLLs to the folder where the async job is running, to make my async plug-in work.
Thanks once again for the cooperation.
I have some Test, Security, Project Management and other Word documents in TFS 2010 source control under a Documents folder. Does anybody know how to access them in order to download and copy them to a local path?
Those files are not physically under the $/... folder, though they have a SharePoint web server path like "http://myServer/sites/MyProyect/Test/Tests_P13_F00120.doc". I have tried to use the DownloadFiles activity without success, because it needs a path which starts with $/. Any suggestions please?
DownloadFiles is not an activity you can make use of here; it's meant to deal with files residing in source control.
Instead, you need to establish a connection to the Sharepoint Copy service of your TFS, which resides at http://<Site>/_vti_bin/Copy.asmx. We did this by adding a Service Reference in our build solution.
We then implemented a build activity that does basically the opposite of what you are after: during TFS build it uploads documents into SharePoint.
The instantiation looks like this:
BasicHttpBinding binding = new BasicHttpBinding();
binding.Security.Mode = BasicHttpSecurityMode.TransportCredentialOnly;
binding.Security.Transport.ClientCredentialType = HttpClientCredentialType.Ntlm;
binding.Security.Transport.ProxyCredentialType = HttpProxyCredentialType.None;
binding.Security.Message.ClientCredentialType = BasicHttpMessageCredentialType.UserName;

EndpointAddress endpointAddress = new EndpointAddress("http://<Site>/_vti_bin/Copy.asmx");
CopySoapClient copyService = new CopySoapClient(binding, endpointAddress);
This copy service exposes a GetItem method; this is the one you should probably be invoking.
I'm not aware whether this GetItem is capable of supporting a http://myServer/sites/MyProject/Test/*.doc kind of wildcard, though.
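For a single known document, a hedged sketch of the download side against the same generated service reference (the document URL and target path are illustrative, and the exact generated signature may vary slightly; GetItem hands back the file content through its out parameters):
copyService.ClientCredentials.Windows.ClientCredential =
    CredentialCache.DefaultNetworkCredentials;

FieldInformation[] fields;
byte[] content;

// Fetch the document's metadata and raw bytes from SharePoint.
copyService.GetItem("http://myServer/sites/MyProyect/Test/Tests_P13_F00120.doc",
    out fields, out content);

File.WriteAllBytes(@"C:\LocalDocs\Tests_P13_F00120.doc", content);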
Is there any way to do WSS 3.0 site provisioning? My client's requirement is that attributes are variables defined in XML format: organization name, logo, address, user and role information. The client should be able to install this web application on any WSS production server by just defining the attributes in the XML file.
Is it possible to write a utility to parse that well defined XML and provision the site accordingly?
It's possible to provision sites from the object model, but creating entirely customized sites is beyond the scope of a single question. To get you started, you should take a look at the SPWebCollection.Add as well as the SPSiteCollection.Add.
To create a site collection and some subsites in one of your web applications, you could use something like this:
var farm = SPFarm.Local;
var solution = farm.Solutions.GetValue<SPSolution>("YourSolution.wsp");
var application = solution.DeployedWebApplications.First();
var sites = application.Sites;

using (var site = sites.Add("/", "Root Site", "Description", 1033, "YOURTEMPLATE#1", @"YOURDOMAIN\SiteCollectionAdmin", "Site Collection Admin", "admin@yourcompany.example"))
{
    using (var rootWeb = site.RootWeb)
    {
        // Code customizing the root site goes here

        using (var subSite = rootWeb.Webs.Add("SubSite", "Sub Site", "Description", 1033, "YOURTEMPLATE#2", false, false))
        {
            // Code customizing the sub site goes here
        }
    }
}
Yes, there is more than one way.
Take a look at the SharePoint Solution Generator, which is included in Windows SharePoint Services 3.0 Tools: Visual Studio 2005 Extensions.
You may create a site with all of your requirements (pages, lists, document libraries...) and then generate a VS project that packages your whole site as a SharePoint feature. Then you may deploy that feature to any WSS production server.
You may alter the VS project to implement the logic that reads your attributes from an additional XML file.
If the structure of your site is plain, or you can save it as a template, you may also write a small console application that reads the attribute XML file and creates the site.
Create a regular solution, or use the aforementioned Solution Generator to generate the .wsp file. Then create a small console application that expects the variables you mentioned as parameters.
With the code listed above, provision the new site collection from that solution, and store the entered parameters (company name etc.) in a list in the site, or in the root web's property bag (SPWeb.Properties), from which you can then read them in custom web parts etc.
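A minimal sketch of such a console application, assuming an attributes file shaped like <Site><OrganizationName>...</OrganizationName>...</Site> (the element names, arguments and template ID are illustrative):
using System;
using System.Linq;
using System.Xml.Linq;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Administration;

class Provisioner
{
    // Usage: Provisioner.exe attributes.xml YOURDOMAIN\Admin admin@yourcompany.example
    static void Main(string[] args)
    {
        var attributes = XDocument.Load(args[0]).Root;

        var solution = SPFarm.Local.Solutions.GetValue<SPSolution>("YourSolution.wsp");
        var application = solution.DeployedWebApplications.First();

        using (var site = application.Sites.Add("/", attributes.Element("OrganizationName").Value,
            "Provisioned from XML", 1033, "YOURTEMPLATE#1", args[1], "Site Collection Admin", args[2]))
        using (var rootWeb = site.RootWeb)
        {
            // Store every attribute in the root web's property bag for later use in web parts.
            foreach (var attribute in attributes.Elements())
                rootWeb.Properties[attribute.Name.LocalName] = attribute.Value;
            rootWeb.Properties.Update();
        }
    }
}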
The SharePoint Data Population Tool available on CodePlex allows you to define sites with XML.