listItem.File.OpenBinary() not working - Remote Blob Storage / FileStreaming not enabled on SQL Server the culprit? - sharepoint

I'm moving a cmd line migration utility from DEV to QA and I'm running into a strange error. Of course, things work perfectly on dev.
This is the offending line of code:
byte[] fileBytes = sourceItem.File.OpenBinary();
"Cannot Open File Error". Inner exception shows COM HResult code is 0x81070211
I have made sure that my account has owner permissions on the library that I'm attempting to pull this file from. I have even made myself a farm admin - but I still get the same error.
I'm seeing that several other people have encountered this same error, but no solutions. One post mentions downloading the file through code as a workaround - what would that look like?
I'm also seeing that some link this problem to files larger than 100 KB and say it can be overcome by putting the assembly into the GAC. However, that would be problematic for this application.
Yes, I have also tried all of the different options parameters. I have also tried the workaround of opening a stream with OpenBinaryStream, getting its length, and reading the data into the byte[] array. The result is always the same: something is preventing me from getting access to the file so that I can read in the bytes, and the error message is just useless.
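Roughly, the stream workaround I tried looks like this (simplified; it fails in the same way):

// Simplified version of the OpenBinaryStream workaround I tried
SPFile file = sourceItem.File;
byte[] fileBytes;
using (System.IO.Stream stream = file.OpenBinaryStream())
{
    fileBytes = new byte[stream.Length];
    int offset = 0;
    int read;
    // Read in a loop because Stream.Read is not guaranteed to fill the buffer in one call
    while (offset < fileBytes.Length &&
           (read = stream.Read(fileBytes, offset, fileBytes.Length - offset)) > 0)
    {
        offset += read;
    }
}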
Thanks in Advance for any help you can provide.
I now believe that the problem may be related to Remote Blob Storage. Several of the posts have mentioned that the problem occurs only when the file size is larger than 100 KB, which happens to be the limit at which files move from the content database to the file system. I believe that the DBA must enable FileStreaming on the SQL Server database in order to fix this problem. I'm awaiting our offshore DBA to act on this suggestion and will follow up with a report if it works. In the meantime, does anybody else have experience with using OpenBinary on files with RBS?
Trey Carroll

This is just an educated guess - I don't have time to test my theory. Does the library require check out? If so, are you checking out the file before trying to open it?

The problem was due to StoragePoint. The account running the OpenBinary() call must have explicit access to the StoragePoint databases.

I also faced this problem and found that it was due to CAS permissions.
You may be able to resolve it by adding the entry below to your CAS policy:
<IPermission class="System.Data.SqlClient.SqlClientPermission, System.Data, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" version="1" Unrestricted="true" />

Related

Power Query M - An operation that uses the database driver could not be completed

From time to time when I run some queries I get this error message:
An operation that uses the database driver could not be completed. If the driver is a Microsoft driver, make sure the driver file isn't damaged
I haven't yet found how to fix it with help from the internet.
The last website I found gives a "solution", but I can't do the same:
"Well in my case it worked by giving the user access to the DB it is accessing."
I just have a worksheet that uses other files from my company network; I don't use any database ...
I can't reinstall the driver or anything else; my company has a very strict policy about IT security.
May I have some help?
I had the same issue when I tried to connect to Analysis Services in Azure. The error message is misleading. In actuality, you are missing a few client drivers that are needed to connect to Azure Analysis Services. Install all the missing drivers (OLEDB, AMO and ADOMD) from https://learn.microsoft.com/en-us/analysis-services/client-libraries?view=asallproducts-allversions#known-issues and you should be good to go.
I think installing OLEDB alone should suffice, but I installed all three to avoid future encounters related to other admin stuff. Check whether OLEDB was updated under C:\Program Files (x86)\Microsoft Analysis Services\AS OLEDB\140 by comparing the msolap.dll version. The latest version as I write this is 15.1.108.23.
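If it helps, one quick way to read the installed msolap.dll version is a small snippet like the one below (the path is the one mentioned above; checking the file's Properties dialog works just as well):

using System;
using System.Diagnostics;

class CheckMsolapVersion
{
    static void Main()
    {
        // Path mentioned above; adjust if your installation differs
        string dllPath = @"C:\Program Files (x86)\Microsoft Analysis Services\AS OLEDB\140\msolap.dll";
        FileVersionInfo info = FileVersionInfo.GetVersionInfo(dllPath);
        Console.WriteLine("msolap.dll version: " + info.FileVersion);
    }
}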
Tip: 1) Try connecting to Analysis Services from Excel by following these steps: Data Menu --> Get Data --> From Database --> From Analysis Services. Follow the prompts, filling in the appropriate info. Bingo! You are in!
2) Wait a while if you still can't connect after installing all the drivers. I don't know the actual reason why, but I was able to connect the next day.
Please mark this as an answer if it helps to solve your problem.
Good Luck!
Update - 1/5/2021
Today I faced the same ugly error again, which made me rethink what else could have gone wrong in the system. I later realized that I had restarted the machine, which removed the temporary connection file referenced by the Excel file. Following the steps to reconnect under the Tip section in the instructions above let me in successfully.

Why might I be getting an access error from an Azure WebJob using WebJobsSdk.marker file? What is this file?

I'm specifically trying to understand what the WebJobsSdk.marker file is for and how it is used. It is a black box to me, so it is hard to understand why there is an access problem here.
The Error:
The process cannot access the file '...\WebJobsSdk.marker' because it
is being used by another process.
Any help or insight on this is greatly appreciated!
The WebJobs SDK is no longer a black box; it is open source now :)
The marker file is generated here and is used by Kudu to determine whether the WebJob is using the SDK or not.
There might be a bug in that code; maybe it is not catching all the expected exceptions. Feel free to open a bug on GitHub.
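For illustration only (this is not the SDK's actual code), the kind of guarded write being suggested would look something like this: touch the marker file, and tolerate a sharing violation if another process, such as Kudu, has it open.

using System;
using System.IO;

static class MarkerFileHelper
{
    // Hypothetical helper: write a marker file but swallow transient file-locking errors
    public static void TryWriteMarker(string path)
    {
        try
        {
            File.WriteAllText(path, DateTime.UtcNow.ToString("o"));
        }
        catch (IOException)
        {
            // Another process has the file open; skipping this update is harmless
        }
        catch (UnauthorizedAccessException)
        {
            // Same idea for permission-related failures
        }
    }
}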

Resolving Mass-loading problems in WebSphere Commerce Instance creation

I am trying to create an instance using the Configuration Manager of WCS 7. I am working on a Win 7 x64 machine with the 64-bit version of DB2 9.5.
I am stuck with this massloading error when the instance creation happens.
In the createInstanceANT.log file:
[Massload] Massloading
C:\IBM\WebSphere\CommerceServer\schema\xml\wcs.keys.xml Error in
MassLoading, please check logs for details.
The error log shows the following error:
[jcc][10165][10044][4.3.111] Invalid database URL syntax:
jdbc:db2://:0/WCSDEMO. ERRORCODE=-4461, SQLSTATE=42815
C:\IBM\WEBSPH~1\COMMER~2\config\DEPLOY~1\xml\createBaseSchema.xml:185:
Error in massloading
WCSDEMO is the database name. The massloader is not able to get the host and port to connect (note the empty host and port 0 in the URL). It is supposedly getting them from the createInstance.properties file, but that is not working. The createInstance.properties file has all the details of the DB to connect.
What could be the reason for this error and how to resolve it ? Is there any configuration change that I am missing ?
Can you provide some more details?
Look inside the messages.txt file located in WC_install_dir/instances/instance_name/logs and confirm what the exact issue is. If it is related to the JDBC driver being wrong, I may be able to help you.
I've been running into massloading problems with external systems, e.g. databases not on the same machine as the WAS installation.
In these cases I look for the loaderDBName setting.
As you can see, setting loaderDBName to just the name of the database makes the loader look on the local machine. But by changing this statement so you load with the syntax
loaderDBName=[DATABASE_SERVER_NAME]:[PORT]/[DATABASE_NAME]
you'll be able to massload using the standard Commerce scripts. These changes need to be made in many scripts, both for updating fix packs and for enabling features. If you run database updates without the changes, the update will crash at first after having already applied some of the schema changes to the database, and you then need to comment those out before trying again.
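For example, with the database from the question and a hypothetical host name (50000 being DB2's default port), the line would look something like:
loaderDBName=mydb2host:50000/WCSDEMO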
IBM Software Support is your friend. They'll help you fix it.

StructureMap, IIS 7.5 and FileIOException

Howdy all. I am trying to solve a problem which is apparently not uncommon, and I'm not sure how to find out how others resolved it. When I run StructureMap on my machine through IIS I get an exception, and it looks like this:
**Description**: The application attempted to perform an operation not allowed by the security policy. To grant this application the required permission please contact your system administrator or change the application's trust level in the configuration file.
**Exception Details**: System.Security.SecurityException: Request for the permission of type 'System.Security.Permissions.FileIOPermission, mscorlib, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089' failed.
This question has come up here at SO (https://stackoverflow.com/questions/784666/), in the comments on this blog post and a year or so ago on the structuremap mailing list.
My problem is not with running it in a foreign hosting environment; I can't even get it to run on my own box (IIS 7.5, Win7 RC, .NET 3.5). I have tried configuring the site to use a custom policy file in which FileIOPermission is marked as having unrestricted access... no dice. If anyone has some tips or a link it would be greatly appreciated.
Update
So there is no way that this is the best way to solve the problem, but after digging around and looking into what Joshua mentioned, these are the things I had to do to get it working: StructureMap, Code Access Security and a Bad Solution to a Problem. A better solution would be appreciated.
For what it's worth, I ran into this same issue where I had full control over the box and even set all the permissions to full trust. With IIS 7.5, I had to change the identity used for the specific application pool to NetworkService instead of ApplicationPoolIdentity. Once I restarted IIS, it worked.
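For reference, the same app pool identity change can be made from the command line (assuming a pool named "MyAppPool"; the IIS Manager UI works just as well):

%windir%\system32\inetsrv\appcmd.exe set apppool "MyAppPool" /processModel.identityType:NetworkService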
FYI, I'm using StructureMap v2.6.1 and ran into this issue.
I do not use an XML configuration, so I added the following line to my configuration code, which fixed the problem.
IgnoreStructureMapConfig = true;
This is a bug, and has been fixed in the trunk. It will be included in the 2.6+ releases.
Some earlier versions of StructureMap would either attempt to unnecessarily write the dynamic assemblies to disk, or unnecessarily attempt to read from the filesystem.
If you are running in a restricted environment that does not allow access to full paths in the filesystem (ASP.NET), make sure to set IgnoreDefaultFile = true when you configure your container. Keep in mind this will disable the ability to load XML configuration from StructureMap.config.
Using the official StructureMap 2.5.4 build on Windows 7 with IIS 7.5 I still encountered this problem.
Mallioch's change
ObjectFactory.Initialize(x =>
{
    x.UseDefaultStructureMapConfigFile = false;
    x.IgnoreStructureMapConfig = true;
});
was necessary to resolve the FileIOPermission exception, but I then received "Request for the permission of type 'System.Web.AspNetHostingPermission, System, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089' failed.", which I resolved using Mike's solution (for which I've created a step-by-step visualization).

MOSS 404 errors for some users on certain sites, sometimes

Everything works fine for most accounts 100% of the time, but here and there some users who are able to access a subsite fine one day are greeted with a standard 404 the next. This can last for an hour or for two days; it's really inconsistent.
I checked the IIS logs, and they also show a 404 status for these requests; nothing else looks unusual. The SharePoint logs have nothing for those timestamps either.
Correct me if I am wrong, but if it were a permission issue, an access denied message would be shown.
It is not the individual computer: when a user is having this problem, I can log in with their account and I also get the 404 error, even though the exact same site works perfectly with another account in another browser at exactly the same time.
Any suggestions would be greatly appreciated. I've done a fair amount of searching but cannot find a similar situation or help anywhere.
Since the information is very limited at the moment, I will throw out a few questions that might lead you towards the problem.
Are we on a load-balanced setup? The intermittent 404 might be caused by one of the Web Front End servers not correctly serving requests.
Are we running out of disk space on the SQL Server machine? This might cause it.
Are the blob cache or site caches enabled? Some routines may break them.
Do we have Anti-Virus on the server? (it HAS to be asked :p)
Do we get very high packet loss between the servers? (The error would be different, though.)
Something that happens quite often is when you have, say, custom code in a DLL in the GAC or the bin folder, and since you could be on a load-balanced setup, the second server does not have this DLL in its own GAC or bin folder. SharePoint usually raises 404s when assemblies are not found too, not only for missing pages.
This seems like a longshot, but... could those SharePoint sites be throwing a 404 HttpException? It seems like you'd see that in the SharePoint logs, too, so it's even more unlikely.
At any rate, is your SharePoint logging level set to an appropriately verbose level to debug weird stuff like this?
Are people checking in & publishing master page changes when new CSS files and/or other includes (script files, etc.) are not yet published at all? I've seen this cause a 404 a few times when users forget to check before they publish.
I have had to write custom code within a SharePoint context and I had those 404 errors as well. The solution, in my case, was to ensure that the executing block of code had these 2 conditions met:
1) Run with elevated privileges; even though the symptom is a 404, this was part of the solution.
2) web.AllowUnsafeUpdates = true; this line of code needed to be added even though the code was already wrapped in an elevated-privileges block.
Once these were added the issue did not occur. This was happening in both load-balanced and non-load-balanced environments. A sketch of the pattern is shown below.
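For illustration, a minimal sketch of that pattern (the site URL and the work inside the block are placeholders):

SPSecurity.RunWithElevatedPrivileges(delegate()
{
    // Re-open the site so the code runs under the elevated (app pool) identity
    using (SPSite site = new SPSite("http://server/sites/somesite")) // placeholder URL
    using (SPWeb web = site.OpenWeb())
    {
        // Needed even though we are already inside the elevated block
        web.AllowUnsafeUpdates = true;
        try
        {
            // ... the work that was intermittently returning 404s ...
        }
        finally
        {
            web.AllowUnsafeUpdates = false;
        }
    }
});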
This reminds me of a strange ASP.NET bug that I had a few months back.
It was caused by a patch that had been applied (to the framework, if I remember correctly).
Try downloading this Windows update list tool and have a look at what has changed since the problem started.
I concede that if there were a problem with a patch, it's more likely that it would be a consistent error, but it's worth a look.
It was a permission issue. I had figured this out a while back: basically, a few accounts were not set up properly and could not load a fresh copy of the page, while someone else hitting the same page would be able to view the cached copy fine. The error was never surfaced as a permission issue or access denied anywhere, but fixing those accounts is what resolved the problem.
