I have several services that I would like to deploy to Azure.
Each service has an XSD schema, which lives in the same project as the service.
In the validation module I try to load the schema this way:
XmlSchemaSet schemaSet = new XmlSchemaSet();
Uri baseSchema = new Uri(AppDomain.CurrentDomain.BaseDirectory);
string mySchema = new Uri(baseSchema, "LogInService.xsd").ToString();
XmlSchema schemaLogIn = XmlSchema.Read(new XmlTextReader(mySchema), null);
schemaSet.Add(schemaLogIn);
... but apparently the path AppDomain.CurrentDomain.BaseDirectory is incorrect, and when I try to deploy the service I get the following error:
Could not find file 'F:\sitesroot\0\LogInService.xsd'
(on dev this code works perfectly)
My question is: where am I supposed to put the XSD files, or how can I change the code above so it will work in the cloud?
I think your path is correct; what is probably happening is that your XSD is not being included in the package that is uploaded. Check the file's properties, in particular the Build Action property, to make sure it will be copied.
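For example, in the service project's .csproj, marking the schema as content that is copied to the output might look like this (a sketch; the file name is from the question, and exactly which combination of Build Action and Copy to Output Directory your project type needs may vary):

```xml
<ItemGroup>
  <!-- Ensure the schema file travels with the build output / deployment package -->
  <Content Include="LogInService.xsd">
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
  </Content>
</ItemGroup>
```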
Edit: I made a simplified repo at https://github.com/GilShalit/XMLValidation
I am building an XML editor in Blazor WebAssembly (TargetFramework=net5.0). Part of the functionality involves validating the XML for completeness and against a complex XSD schema with three includes.
These are the steps I follow:
1. Build an XmlSchemaSet and add 4 schemas to it by calling the following method for each XSD:
private async Task loadSchema(string path, string nameSpace)
{
byte[] byteArrayS = await _client.GetByteArrayAsync(path);
Console.WriteLine($"{path}: {byteArrayS.Length}");
MemoryStream streamS = new MemoryStream(byteArrayS);
XmlReader xmlSchemaReader = XmlReader.Create(streamS);
schemaSet.Add(nameSpace, xmlSchemaReader);
}
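The call sites for the four schemas might look roughly like this (the file names are illustrative placeholders, not the actual names in the repo; the TEI namespace is taken from the validation messages in the question):

```csharp
// Hypothetical call sites: one root schema plus its three includes.
// File names are placeholders; only the namespace is from the question.
await loadSchema("schemas/tei.xsd", "http://www.tei-c.org/ns/1.0");
await loadSchema("schemas/tei-header.xsd", "http://www.tei-c.org/ns/1.0");
await loadSchema("schemas/tei-text.xsd", "http://www.tei-c.org/ns/1.0");
await loadSchema("schemas/tei-core.xsd", "http://www.tei-c.org/ns/1.0");
schemaSet.Compile(); // resolves the includes and surfaces schema errors early
```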
2. Initialize an event handler with:
ValidationEventHandler eventHandler = new ValidationEventHandler(ValidationEventHandler);
3. Load the XML into an XmlDocument:
byte[] byteArrayX = Encoding.ASCII.GetBytes(await _editorTarget.GetValue());
MemoryStream streamX = new MemoryStream(byteArrayX);
XmlReader reader = XmlReader.Create(streamX);
XmlDocument document = new XmlDocument();
document.Load(reader);
4. Validate against the schemaSet:
document.Schemas = schemaSet;
document.Validate(eventHandler);
Steps 3 and 4 run inside a try...catch block. Running locally, when the XML is not well formed (a missing closing tag, for example), the document.Load(reader); line produces an error with a message like the following:
The 'publicationStmt1' start tag on line 9 position 11 does not match the end tag of 'publicationStmt'. Line 11, position 12.
which is great. But validating a similar situation in the application deployed to Azure produces the following error message: Xml_MessageWithErrorPosition, Xml_TagMismatchEx, 11, 12.
Schema validation errors are caught in the event handler when the document.Validate(eventHandler); line runs, and a typical message is:
The element 'fileDesc' in namespace 'http://www.tei-c.org/ns/1.0' has invalid child element 'publicationStmt1' in namespace 'http://www.tei-c.org/ns/1.0'. List of possible elements expected: 'editionStmt, extent, publicationStmt' in namespace 'http://www.tei-c.org/ns/1.0'.
But when run on Azure the message is Sch_InvalidElementContentExpecting.
What could be the reason for this difference in validation results between running locally and on Azure?
I tried to disable linking by adding:
<ItemGroup>
<BlazorLinkerDescriptor Include="LinkerConfig.xml" />
</ItemGroup>
But that did not make a difference in the deployed application, and running locally in Release instead of Debug did not change anything either.
I also made sure the 4 xsd files are actually loaded when running from Azure.
It looks like Blazor does not include the localised error-message templates from the System.Xml resources (System.Xml.Res). My guess is that Blazor strips them away when the app is built in your CI/CD pipeline. It's possible your dev machine and build agent have different locales.
I would suggest playing with the following project properties to try to force bundling all cultures and/or loading an invariant culture based on en_US:
<PropertyGroup>
<BlazorWebAssemblyLoadAllGlobalizationData>true</BlazorWebAssemblyLoadAllGlobalizationData>
<InvariantGlobalization>true</InvariantGlobalization> <!-- If the app doesn't require localization, you may configure the app to support the invariant culture -->
</PropertyGroup>
You also mentioned tweaking the linker, but according to the documentation it only kicks in for Release builds (you don't seem to have tried deploying a Debug version yet). So I would suggest deploying a Debug build of your app just to eliminate the linker completely.
You could also force inclusion of all i18n assemblies:
<PropertyGroup>
<BlazorWebAssemblyI18NAssemblies>all</BlazorWebAssemblyI18NAssemblies>
</PropertyGroup>
and add System.Xml to LinkerConfig.xml so hopefully it gets served to the client without further optimisations:
<linker>
<assembly fullname="System.Xml" />
</linker>
So this was a feature, not a bug...
An issue I opened on Developer Community was picked up by the dotnet/runtime team and added to the GitHub issue tracker here.
It turns out exception messages are removed to save on size.
Setting <UseSystemResourceKeys>false</UseSystemResourceKeys> re-enables exception messages, and I must say I am not seeing an increase in size.
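For reference, the property goes in the project file; a minimal sketch (any .NET 5 Blazor WebAssembly csproj):

```xml
<PropertyGroup>
  <!-- Keep real exception message strings instead of resource keys when trimming -->
  <UseSystemResourceKeys>false</UseSystemResourceKeys>
</PropertyGroup>
```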
When importing a data model from an external service, in this case an xsodata source, I see that the VDM generates a wrong DEFAULT_SERVICE_PATH. The original service path contains a "." like this: "/xsodata/Internals.xsodata/$metadata",
but when I check the service generated by the VDM after importing the external data model in SAP Web IDE, I see this:
String DEFAULT_SERVICE_PATH = "/xsodata/Internalsxsodata";
without the "."
I had to add the "." manually.
Could you please check?
Simmaco
In case manually changing the service EDMX file is not possible and you don't want to repeatedly adjust the generated source file(s), I would recommend the #withServicePath method when interacting with the service instance, e.g.:
InternalxsodataService service = new DefaultInternalxsodataService()
    .withServicePath("/xsodata/Internals.xsodata");
Problem:
I am trying to get an image out of an Azure file share for manipulation. I need to read the file as a System.Drawing.Image, but I cannot create a valid FileInfo object or Image using the UNC path (which I need to do in order to use it over IIS).
Current Setup:
Attach a virtual directory called Photos in the IIS website, pointing to the UNC path of the Azure file share (e.g. \\myshare.file.core.windows.net\sharename\pathtoimages)
This works as http://example.com/photos/img.jpg so I know it is not a permissions or authentication issue.
For some reason, though, I cannot get a reference to the file.
var imgpath = Path.Combine(Server.MapPath("~/Photos"), "img.jpg");
// resolves as \\myshare.file.core.windows.net\sharename\pathtoimages\img.jpg
var fi = new FileInfo(imgpath);
if (fi.Exists) // this returns false 100% of the time
{
    var img = System.Drawing.Image.FromFile(fi.FullName);
}
The problem is that the file is never found to exist, even though I can take that path, put it in an Explorer window, and get img.jpg back 100% of the time.
Does anyone have any idea why this would not be working?
Do I need to be using CloudFileShare object to just get a read of a file I know is there?
It turns out I needed to wrap my code in an impersonation of the Azure file share user ID, since the virtual directory is not really in play at all at this point.
using (new impersonation("UserName","azure","azure pass"))
{
//my IO.File code
}
I used this guy's impersonation script, found here.
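The linked script isn't shown, but a minimal sketch of a LogonUser-based impersonation wrapper (the class and parameter names are my own, not the actual script's) looks roughly like this on .NET Framework:

```csharp
using System;
using System.Runtime.InteropServices;
using System.Security.Principal;

// Sketch of a disposable impersonation wrapper. LOGON32_LOGON_NEW_CREDENTIALS
// means the supplied credentials are used only for outbound (network) access,
// which is the usual choice for reaching an SMB share like Azure Files.
public sealed class Impersonation : IDisposable
{
    private const int LOGON32_LOGON_NEW_CREDENTIALS = 9;
    private const int LOGON32_PROVIDER_DEFAULT = 0;

    [DllImport("advapi32.dll", SetLastError = true, CharSet = CharSet.Unicode)]
    private static extern bool LogonUser(string user, string domain, string password,
        int logonType, int logonProvider, out IntPtr token);

    [DllImport("kernel32.dll", SetLastError = true)]
    private static extern bool CloseHandle(IntPtr handle);

    private readonly IntPtr _token;
    private readonly WindowsImpersonationContext _context;

    public Impersonation(string user, string domain, string password)
    {
        if (!LogonUser(user, domain, password,
                LOGON32_LOGON_NEW_CREDENTIALS, LOGON32_PROVIDER_DEFAULT, out _token))
            throw new InvalidOperationException(
                "LogonUser failed, error " + Marshal.GetLastWin32Error());
        _context = WindowsIdentity.Impersonate(_token);
    }

    public void Dispose()
    {
        _context.Undo();
        CloseHandle(_token);
    }
}
```

For an Azure file share the user name is typically the storage account name and the password the storage account key; with LOGON32_LOGON_NEW_CREDENTIALS the domain value is largely irrelevant.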
Can you explain why DirectoryInfo.GetFiles produces this IOException?
When executing the following code against an empty Azure container, I get a file-not-found error (segments.gen; The specified blob does not exist.).
AzureDirectory azureDirectory = new AzureDirectory(account, "audiobookindex"); // <-- audiobookindex is the name of the blob storage container on my Azure account

// Create the index writer
IndexWriter indexWriter = new IndexWriter(azureDirectory, new StandardAnalyzer(), true);
It seems to be failing on the OpenInput call inside the Azure Library for Lucene.Net assembly. However, I don't understand why it's even calling that method; I would think it would just try to create the file.
Also, the assembly and code IS hitting the container because it creates a write.lock file that I can see in the container.
Any suggestions?
This should solve the problem. The examples out there were developed against older APIs and older framework versions, etc. I found the above solution, which works fine! No need to interfere with the debugger ;)
I have some Test, Security, Project Management and other Word documents in TFS 2010 source control under a Documents folder. Does anybody know how to access them in order to download and copy them to a local path?
Those files are not physically under a $/... folder; instead they have a SharePoint web server path like "http://myServer/sites/MyProyect/Test/Tests_P13_F00120.doc". I have tried the DownloadFiles activity without success, since it needs a path that starts with $/. Any suggestions, please?
DownloadFiles is not an activity you can make use of here; it's meant to deal with files residing in source control.
Instead, you need to establish a connection to the SharePoint Copy service of your TFS, which resides at http://<Site>/_vti_bin/Copy.asmx. We did this by adding a service reference in our build solution.
We then implemented a build activity that does basically the opposite of what you are after: during the TFS build it uploads documents into SharePoint.
The instantiation looks like this:
BasicHttpBinding binding = new BasicHttpBinding();
binding.Security.Mode = BasicHttpSecurityMode.TransportCredentialOnly;
binding.Security.Transport.ClientCredentialType = HttpClientCredentialType.Ntlm;
binding.Security.Transport.ProxyCredentialType = HttpProxyCredentialType.None;
binding.Security.Message.ClientCredentialType = BasicHttpMessageCredentialType.UserName;
EndpointAddress endpointAddress = new EndpointAddress("http://<Site>/_vti_bin/Copy.asmx");
CopySoapClient copyService = new CopySoapClient(binding,endpointAddress);
This Copy service exposes a GetItem method; this is the one you should probably be invoking.
I'm not aware whether GetItem is capable of supporting a http://myServer/sites/MyProject/Test/*.doc kind of pattern, though.
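A hedged sketch of what the download side might look like with the service reference instantiated above (the out-parameter shape is the Copy web service's GetItem contract; the document URL is the one from the question, and the local path is a placeholder):

```csharp
// Download a single document via the Copy service's GetItem method.
// GetItem returns the item's field information and its raw bytes.
FieldInformation[] fields;
byte[] bytes;
copyService.GetItem(
    "http://myServer/sites/MyProyect/Test/Tests_P13_F00120.doc",
    out fields,
    out bytes);

if (bytes != null)
{
    // Write the downloaded document to a local path (placeholder path).
    System.IO.File.WriteAllBytes(@"C:\Temp\Tests_P13_F00120.doc", bytes);
}
```

Note that GetItem fetches one item per call, so a wildcard like *.doc would have to be expanded by you, e.g. by first enumerating the document library.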