SSIS Connection to Excel via ACE.OLEDB as Service Account

We have a process which needs to work with a series of Excel (sigh) files.
The setup is:
A SQL Agent job runs under an SSIS proxy account.
The job calls an SSIS package on a share on the server.
The package then accesses these Excel files using the ACE driver.
The process will work under my credentials.
The process will work under other people's credentials.
The process will work in debug mode (although this is not a fair test
as that would use my local machine's driver)
The process will not work using the SSIS proxy account.
The process WILL work if I make the SSIS proxy account an
administrator on the server.
I have ruled out the following:
Access to the file share: the account can load text files from
there.
32-bit/64-bit issues: the account CAN run given sufficient
permissions.
My opinion is that the service account needs some level of permission to be able to use the driver, but I can't work out what.
I have tried the LOCAL SECURITY POLICY option "Load and unload device drivers" with no success. (I did think this had done it, but then realised that I had left the account in the admin group :-( )
Finally, the error message in question:
SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER.
The AcquireConnection method call to the connection manager
"TPR_ReadReportsExcelConnection" failed with error code 0xC0202009.
There may be error messages posted before this with more information
on why the AcquireConnection method call failed.

This seems to be beyond the supported scope, depending on how you've set up your SSIS proxy account. See the Additional Information section here; the important sentence is:
provided the SSIS jobs run in the context of a logged-on user with a valid HKEY_CURRENT_USER registry hive
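A quick way to check whether the ACE provider itself can be loaded outside of an interactive session is to run a small test script under the proxy account (e.g. via runas or a scheduled task). The sketch below is only illustrative and assumes a standard ACE 12.0 install; the UNC path and sheet name are placeholders, not values from the question:

' ACE OLEDB smoke test (VBScript) - run this under the SSIS proxy account.
' The workbook path and sheet name are placeholders for your own file.
Dim conn, rs
Set conn = CreateObject("ADODB.Connection")
conn.Open "Provider=Microsoft.ACE.OLEDB.12.0;" & _
          "Data Source=\\server\share\Report.xlsx;" & _
          "Extended Properties=""Excel 12.0 Xml;HDR=YES"""
Set rs = conn.Execute("SELECT * FROM [Sheet1$]")
WScript.Echo "Connection acquired; first column: " & rs.Fields(0).Name
rs.Close
conn.Close

If this fails under the proxy account but works interactively, that points at the missing user profile / HKEY_CURRENT_USER hive described above rather than at the SSIS package itself.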

Notes cannot create automation object

I have created an agent in which I call a DLL file so that I can eventually get data from another system.
When I manually run this agent, the call to the DLL file works and I get data from it.
When I call this agent from a browser, I get:
Error Description : Cannot create automation object
The security of the agent is set to "3. Allow restricted operations with full administration rights".
Any idea why I get "Cannot create automation object" when running the agent from a browser?
When you start the agent from a browser, the agent runs on the Domino server.
So you need to install the automation object's software on that server.
When you run the agent manually, the code is executed on your client. That works because you installed the software necessary for creating the automation object on your client.
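To make the distinction concrete, a CreateObject call like the one below only succeeds on the machine that actually executes the agent, so the COM server it names must be registered there. This is a rough LotusScript-style sketch; "MySystem.Connector" is a placeholder ProgID, not the DLL from the question:

' Runs on whichever machine executes the agent: the Notes client when
' started manually, the Domino server when triggered from a browser.
Dim obj As Variant
On Error Resume Next
Set obj = CreateObject("MySystem.Connector") ' placeholder ProgID
If Err <> 0 Then
    Print "Cannot create automation object (error " & Err & ") - is it registered on this machine?"
Else
    Print "Automation object created successfully."
End If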
I ran into the same error, but in my case everything had been working properly until we restored a backup of Windows. The issue was that Office had to be reactivated.
Sometimes this also happens because of a corrupt Word document, in which case you have to rebuild it.
I hope this is helpful to other people in the future.

Automated SQL Export Failed

I have an automated export running each night through the Portal, which should back up my Azure database to blob storage as a .bacpac file; up until Friday this had been working successfully.
Each night I get an email error saying:
Automated SQL Export failed for myServer:myDatabase at 5/30/2016 11:35:39 PM. The temporary database copy was made, but this copy could not be exported to the .bacpac file.
Some tutorials suggest logging into the Portal and doing it manually. When I do this it works successfully and I am able to see the file without error, but on the following night the process fails again (performing a manual backup doesn't fix the scheduled one). Is there a way to get more information on why it is failing?
In the new Portal you can find more information via the audit log; database-level operations, including import/export, are logged there.
OK, so after further analysis I was able to pinpoint the root cause of my issue: a stored procedure.
I had a stored procedure which explicitly referenced my database by name. When Azure takes the export, it copies the database under a temporary name, and at that point the self-referencing stored procedure "breaks".
Fixing the stored procedure has resumed the automatic backups.
An example of a statement the procedure was running:
SELECT Name FROM MyDatabase.dbo.MyTable
This should be rewritten as follows to make the database exportable:
SELECT Name FROM dbo.MyTable
Note that while I was able to obtain a more meaningful error using a local copy of SQL Server Management Studio, no error was present in the Azure Portal.
Hopefully this will help someone else.

Assertion Failed getting External Data from SQL Server

When I select Get External Data -> From SQL Server from the Ribbon, I get the following Assertion Failed dialog from Excel:
I have tried this with an Admin account, and after logging off and on, with the same results. If I select -> From Analysis Services I don't get the dialog.
If I choose Ignore, I get a dialog to specify connection details, but it always fails, reporting an inability to locate the server, even though I can connect to the server without issues in SSMS and through TFS. The DB server is my local machine, though a named instance.
(Yes, I have very carefully checked that the server name is correctly specified.)
Has anyone seen this before and can assist in diagnosing the issue?
Update:
The error dialog occurs immediately on clicking on From SQL Server.
If I opt to Retry from the dialog, I get the standard "Microsoft Excel has stopped working, would you like to check online for a solution" dialog. If I attempt to debug the code I am informed that Source is not available, although symbols appear to be successfully downloaded from the Microsoft Symbol Server.

How to do remote staging in liferay 6.1.1 GA2?

I have a site where local staging worked fine when I tried it, but when I tried to connect it through a remote server it did not work, giving an error that the connection can't be established. Has anyone tried this?
This is the configuration with the error message:
This blog post (disclaimer: my own) explains how to do it with https - you can omit long parts of it if you don't want encryption. It also covers 6.0, but the general principle is still the same.
You want to pay special attention to the paragraph Allow access to webservices in that article and check if your publishing server (the "stage") has access to the live server. In general, if this is not on localhost, it requires configuration as mentioned in that article.
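For reference, the "allow access to webservices" part usually comes down to host-allowlist properties on the live server. As a rough sketch only (property names as found in the stock 6.x portal.properties; the IP is a placeholder for your staging server):

# portal-ext.properties on the live (production) server - example only.
# Allow the staging server's IP to reach the Axis and tunnel servlets.
axis.servlet.hosts.allowed=127.0.0.1,SERVER_IP,192.168.1.50
tunnel.servlet.hosts.allowed=127.0.0.1,SERVER_IP,192.168.1.50

A restart of the live server is needed for property changes to take effect.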
As you indicate that you can't connect to your production server from your staging server, check by opening a browser running on the staging server and connecting it to the production server: go to http://production-server-name:8080/api/axis and validate that you can connect. (Note: you only get an authoritative result for this test when you are not accessing localhost as the production system, so do run the browser on the staging system.) With this test you can rule out the first possibility, namely that your remote system is being disallowed. Once this succeeds, you'll need credentials for the production server to be entered on the staging server; the account that you use needs permission to change all the data it needs to change when publishing content (and pages etc.).
The error message you give in the added screenshot can appear when the current user on staging does not have access to the production system (with the credentials used). Verify that the user account you are using on your staging system (the one that gets the error message from the screenshot) also exists in your production system, and synchronize the passwords of the two.
In your comment you mention that you're using different versions for the staging and production environments. I don't expect that to work, so this might be the root cause. Test with both systems at the same version.
A couple of important points to keep in mind with remote publishing:
If you're not on LDAP (or you have different LDAPs for different environments), you should validate that your user account is exactly the same in both the source and target environments. So, if you're on the QA site and you want to remote publish to production, your screen name, email address, and password should all be the same.
Email address is uber important. Depending on which distribution (version) of Liferay you are on, the remote publish code uses your email address irrespective of whether or not you have portal-ext.properties configured to use the screen name.
You should have the Administrator role on both sides. It may not be required in every scenario, but giving that role to users that do remote publishing has saved me time and effort debugging why someone's remote publish didn't work. Debugging this process takes a very long time.
If remote publishing is causing you problems (and it probably is or you wouldn't be here), try doing LAR file exports/imports. This is useful because remote publish failures are not exactly helpful in telling you what failed; they just tell you that they failed. Surprisingly, there are often problems in the export process, and you can sometimes pinpoint some bad documents or a funky development thing you did using Global scope and portlet preferences that caused your remote publish to fail. I generally use this order in this situation: a) documents and media [exclude thumbnails or your LAR file will likely double in size; also exclude ranks if you're not using them] from the wrench icon in the control panel, b) web content from the wrench icon in the control panel, c) public pages [include data > web content display, but remove all the other data check boxes], include permissions, include categories, d) private pages [same options as public pages].
If you already have the Administrator role and it's saying you don't have permission to remote publish to the remote site, set up your user on the target environment with the "Site Administrator" or "Site Owner" role.
A little late for "first and foremost", but any time you have something that's not working (remote publishing or otherwise), check the logs before you do anything else. The Liferay code base doesn't include a lot of helpful logging, but you do occasionally get a nugget of information that helps you piece together enough to do root cause analysis.
Cheers! HTH

Querying Service Status with ADSI - what rights are needed?

I'm using VB6 and ADSI to query the status (running or not) of a Windows service. See this MS article: http://msdn.microsoft.com/en-us/library/aa746322(v=vs.85).aspx.
With a user who is a member of the USERS group, I'm receiving a thrown exception. I believe it's on the GetObject method:
Set comp = GetObject("WinNT://.,Computer")
The exception is: 80070005 "General access denied error"
Running the same code as a member of POWER USERS, however, works just fine.
Elevating all users to Power Users isn't an option. What exact rights need to be granted in order for this function to run successfully?
I've tried running procmon.exe, but wasn't able to determine from the output what or where a denial is occurring.
Thanks!
Edit: This is running on XP SP2.
Sounds like you're running into a UAC barrier. I'm not familiar with IADsService, but it is hardly necessary for determining whether a Windows service is running. Have you considered using API functions to query your service? Try QueryServiceStatus on a service opened with SERVICE_QUERY_STATUS access.
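For illustration, a minimal VB6 sketch of that approach might look like the following (an example only, not code from the question: IsServiceRunning is a helper name chosen here, and error reporting is kept to a bare minimum):

' Win32 Service Control Manager declarations (standard module).
Private Type SERVICE_STATUS
    dwServiceType As Long
    dwCurrentState As Long
    dwControlsAccepted As Long
    dwWin32ExitCode As Long
    dwServiceSpecificExitCode As Long
    dwCheckPoint As Long
    dwWaitHint As Long
End Type

Private Declare Function OpenSCManager Lib "advapi32.dll" Alias "OpenSCManagerA" _
    (ByVal lpMachineName As String, ByVal lpDatabaseName As String, _
     ByVal dwDesiredAccess As Long) As Long
Private Declare Function OpenService Lib "advapi32.dll" Alias "OpenServiceA" _
    (ByVal hSCManager As Long, ByVal lpServiceName As String, _
     ByVal dwDesiredAccess As Long) As Long
Private Declare Function QueryServiceStatus Lib "advapi32.dll" _
    (ByVal hService As Long, lpServiceStatus As SERVICE_STATUS) As Long
Private Declare Function CloseServiceHandle Lib "advapi32.dll" (ByVal hObject As Long) As Long

Private Const SC_MANAGER_CONNECT As Long = &H1
Private Const SERVICE_QUERY_STATUS As Long = &H4
Private Const SERVICE_RUNNING As Long = &H4

Public Function IsServiceRunning(ByVal ServiceName As String) As Boolean
    Dim hSCM As Long, hSvc As Long
    Dim status As SERVICE_STATUS
    ' Connect to the local Service Control Manager with minimal rights.
    hSCM = OpenSCManager(vbNullString, vbNullString, SC_MANAGER_CONNECT)
    If hSCM = 0 Then Exit Function
    ' Ask only for permission to query the service's status.
    hSvc = OpenService(hSCM, ServiceName, SERVICE_QUERY_STATUS)
    If hSvc <> 0 Then
        If QueryServiceStatus(hSvc, status) <> 0 Then
            IsServiceRunning = (status.dwCurrentState = SERVICE_RUNNING)
        End If
        CloseServiceHandle hSvc
    End If
    CloseServiceHandle hSCM
End Function

On most systems, members of the Users group can open a service with SERVICE_QUERY_STATUS, which is why this approach typically works without elevated rights.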
There is no need for heavyweight administrative services or API calls. The Shell Automation interface has offered this for some time (Win2K or later, Shell32.dll v. 5.0 or later):
With CreateObject("Shell.Application")
MsgBox .IsServiceRunning("MSMQ")
End With
Works fine for me without elevation.
