We are getting the error "Error during upload" when uploading a file in Microsoft Dynamics AX 2012 R3, which is configured to save files to SharePoint Online.
We have document management enabled and are currently saving files to a local folder on the AOS server, but we now want to save files to our SharePoint Online library in Office 365.
I created a folder called AXTest in SharePoint and granted the AX service account access to the SharePoint team site.
I set the Archive directory accordingly. I tried various versions of the URL, all of which were unsuccessful:
https://xxxxx.sharepoint.com/sites/Dev-Test/AXTest
https://xxxxx.sharepoint.com/sites/Dev-Test/AXTest/Forms
various document folders
All of them result in an error when trying to upload a file that is local to the AOS server:
Through debugging, the error is raised here.
There is no exception stack, so I don't know the exact reason. We tried various versions of the URL, we tried uploading a locally stored file, and we know the service account has access to the SharePoint site/team. Any other ideas?
Late, but the "solution" is to make sure that a persistent cookie for SharePoint Online exists locally. I have been banging my head against the wall trying to find a better solution, because getting that persistent cookie can be fairly involved, which makes it unsuitable for end users. To the point that my only option might be to build a custom connector (or to push for an upgrade...).
The key is to somehow force login.microsoftonline.com to ask the user whether to remain signed in. Only when you click Yes in that dialog is a persistent cookie created for SPO. Then the upload works fine, until the cookie expires or gets deleted.
These are the instructions for our users:
1. Make sure SPO / MS / Azure / ADFS related URLs are not in the Local intranet zone in IE, to prevent automatic login with Windows credentials (a quick way to check the zone mappings is sketched after this list).
2. Sign out of SPO.
3. Delete the browser cookies.
4. Restart IE.
5. Go to SPO.
6. You should be prompted to log in with your username and password.
7. You will then be asked whether to remain signed in. Click Yes.
Optional, if this didn't work: reset IE to its default settings.
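For step 1, a rough, read-only sketch that lists which domains IE currently maps to the Local intranet zone (zone value 1), so you can spot any SPO / Microsoft / ADFS entries; it only reads the per-user ZoneMap and changes nothing:
# List domains that IE has mapped to the Local intranet zone (zone 1).
# The sign-in domains from step 1 (sharepoint.com, microsoftonline.com, your ADFS host)
# should not show up here if you want the username/password prompt to appear.
$zoneMap = 'HKCU:\Software\Microsoft\Windows\CurrentVersion\Internet Settings\ZoneMap\Domains'
Get-ChildItem -Path $zoneMap -Recurse | ForEach-Object {
    $key = $_
    $key.GetValueNames() |
        Where-Object { $key.GetValue($_) -eq 1 } |
        ForEach-Object { '{0} ({1})' -f ($key.Name -replace '^.*\\Domains\\', ''), $_ }
}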
There's a KB included in CU13 for AX 2012 R3 that seems to address the issue (it mentions the vague "Error during upload" error). I haven't tested it myself, because I am trying to solve this in an AX 2012 R2 CU9 environment.
I just ran across the same issue and what a NIGHTMARE!
For Windows Server 2012 R2, I wrote this little PowerShell script to help clear the cookies for all users.
We had users sign out of AX and sign off the terminal server(s), and then I ran the script below to remove all user cookies. On the next connection, they were back in business.
# Set $Remove to $true to actually delete the cookie and cache files;
# with $false the script only lists what it would remove.
$Remove = $false

Get-ChildItem -Path "C:\Users" -Directory | ForEach-Object {
    $CachePath  = Join-Path -Path $_.FullName -ChildPath "AppData\Local\Microsoft\Windows\INetCache"
    $CookiePath = Join-Path -Path $_.FullName -ChildPath "AppData\Local\Microsoft\Windows\INetCookies"
    if (Test-Path -Path $CookiePath)
    {
        Write-Host $_.FullName -ForegroundColor Green
        if ($Remove) {
            Get-ChildItem -Path $CookiePath -File -Recurse | Remove-Item -Force -ErrorAction Continue
            Get-ChildItem -Path $CachePath -File -Recurse | Remove-Item -Force -ErrorAction Continue
        } else {
            Get-ChildItem -Path $CookiePath -File -Recurse | Format-Table -AutoSize
            Get-ChildItem -Path $CachePath -File -Recurse | Format-Table -AutoSize
        }
    }
}
I have a Windows Azure server where I want to mount an Azure file share. The code below works fine when I try it on my local Windows machine, but it fails with Access Denied when I try the same thing on the Azure Windows Server. What am I missing here?
$acctKey = ConvertTo-SecureString -String "<account_key>" -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential -ArgumentList "Azure\<account_username>", $acctKey
New-PSDrive -Name Z -PSProvider FileSystem -Root "\\server.name\files" -Credential $credential -Persist
P.S. It says here that I do have access.
Could you please try to mount the Azure file share from a command prompt:
net use Z: \\jasonvmdisks304.file.core.windows.net\jasonshare yjya1gkE0TK0lqx/OUh1kD4fxdhCLDjcOW6XPSF6Y4jyCxxMd45eFEvYRzKp8CMRjRpuz38RISA49qXWw3wKA== /user:Azure\jasonvmdisks304
If it still does not work, could you please check the Event Viewer for the relevant log entry and post it here.
Not sure if this was resolved, so I will chime in. One thing to check: if the client is outside the storage account's Azure region (e.g. on-premises or in another region), SMB 3.0 is required; SMB 2.1 is only allowed from within the same region. This is a security feature.
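If the client is outside the region, it is also worth confirming that outbound TCP port 445 (SMB) is not blocked between the client and the storage endpoint. A quick check, reusing the endpoint name from the net use example above (swap in your own storage account):
# Reports TcpTestSucceeded = True if port 445 is reachable
Test-NetConnection -ComputerName jasonvmdisks304.file.core.windows.net -Port 445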
We have an on-premises SharePoint 2013 farm with a team site on it.
I am trying to export a document library from this path:
https://myportal.mycompany.com/mygroup/
The name of the document library is testDoc; it has only two files with less than 10 KB of data.
The command I am using on the server where SharePoint 2013 on-premises is installed is the following:
Export-SPWeb -Identity "https://myportal.mycompany.com/mygroup/" -ItemUrl "/testDoc" -Path "E:\SPBackup\TestDoc.cmp" -NoFileCompression -IncludeVersions 4
This is the error that I receive
Yes, I have also tried the following permutations of the command:
Export-SPWeb -Identity "https://myportal.mycompany.com/" -ItemUrl "/mygroup/testDoc" -Path "E:\SPBackup\TestDoc.cmp" -NoFileCompression -IncludeVersions 4
With and without the trailing slashes, etc., and each time I get the same error.
When I try to export a document library at the root level, it works perfectly:
Export-SPWeb -Identity "https://myportal.mycompany.com/" -ItemUrl "/testDoc" -Path "E:\SPBackup\TestDoc.cmp" -NoFileCompression -IncludeVersions 4
This works like a charm.
Any ideas, suggestions or resolutions for this error?
Make sure the -Identity parameter gets the URL of your subsite and that -ItemUrl gets the server-relative path of the library to be exported:
Export-SPWeb -Identity "https://myportal.mycompany.com/mygroup/" -ItemUrl "/mygroup/testDoc" -Path "E:\SPBackup\TestDoc.cmp" -NoFileCompression -IncludeVersions 4
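If you are not sure what path to pass to -ItemUrl, one way to check it is to read the library's server-relative URL. A sketch using the SharePoint 2013 Management Shell, with the site and library names from the question:
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$web = Get-SPWeb "https://myportal.mycompany.com/mygroup/"
# Returns something like /mygroup/testDoc, which is the value -ItemUrl expects
$web.Lists["testDoc"].RootFolder.ServerRelativeUrl
$web.Dispose()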
Previously I uploaded publish files to Azure Web Apps using FileZilla, but now I am trying to upload two or more files to Azure Web Apps using PowerShell scripts. Please can anyone help me with a PowerShell script?
I assume you could use a PowerShell script to deploy your web app code over FTP (via WebClient.UploadFile()) and recursively upload your publish files, as follows:
# Upload files recursively
Set-Location $appdirectory
$webclient = New-Object -TypeName System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential($username, $password)
$files = Get-ChildItem -Path $appdirectory -Recurse | Where-Object { !($_.PSIsContainer) }
foreach ($file in $files)
{
    # Build the destination URI from the file's path relative to $appdirectory
    $relativepath = (Resolve-Path -Path $file.FullName -Relative).Replace(".\", "").Replace('\', '/')
    $uri = New-Object System.Uri("$url/$relativepath")
    "Uploading to " + $uri.AbsoluteUri
    $webclient.UploadFile($uri, $file.FullName)
}
$webclient.Dispose()
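The snippet above assumes $appdirectory, $username, $password and $url are already defined. One way to fill them in, sketched from the "Upload files to a web app using FTP" article referenced below and assuming the AzureRM module plus placeholder web app and resource group names, is to pull the FTP credentials out of the app's publishing profile:
$appdirectory  = "C:\publish\output"   # local folder with your publish files (placeholder)
$webappname    = "mywebapp"            # your web app name (placeholder)
$resourceGroup = "myResourceGroup"     # your resource group (placeholder)

# Download the publishing profile and read the FTP deployment credentials from it
$xml = [xml](Get-AzureRmWebAppPublishingProfile -Name $webappname `
        -ResourceGroupName $resourceGroup -OutputFile null)
$username = $xml.SelectNodes("//publishProfile[@publishMethod=`"FTP`"]/@userName").value
$password = $xml.SelectNodes("//publishProfile[@publishMethod=`"FTP`"]/@userPWD").value
$url      = $xml.SelectNodes("//publishProfile[@publishMethod=`"FTP`"]/@publishUrl").value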
For more details, you could refer to Upload files to a web app using FTP.
Additionally, you could use Kudu with Microsoft Azure Web Apps: click "Debug Console > PowerShell or CMD" to open a console, then cd site\wwwroot to get to the web content of your web app, and then drag your publish files and drop them directly into the Kudu console's file explorer UI to upload them. For more details, you could refer to the Kudu console documentation about uploading files.
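If you would rather script that Kudu route instead of using drag and drop, Kudu also exposes a REST API for the file system (the VFS API). A rough sketch, where the app name, file names and deployment credentials are all placeholders, that uploads a single file into site\wwwroot:
$kuduUser = 'myftpuser'       # deployment user name (placeholder)
$kuduPass = 'myftppassword'   # deployment password (placeholder)
$auth = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(('{0}:{1}' -f $kuduUser, $kuduPass)))

# PUT a local file to site/wwwroot via the Kudu VFS API.
# The "If-Match: *" header allows overwriting an existing file.
Invoke-RestMethod -Uri "https://mywebapp.scm.azurewebsites.net/api/vfs/site/wwwroot/index.html" `
    -Method Put -InFile "C:\publish\output\index.html" `
    -Headers @{ Authorization = "Basic $auth"; 'If-Match' = '*' } `
    -ContentType 'application/octet-stream'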
Does anybody know how to resolve this issue?
It can be replicated by typing the following command in PowerShell:
dir iis:\sslbindings
I have come across this page on Microsoft TechNet, but it doesn't address the problem.
Edit
When invoking the command, I get the error:
failed to enumerate SSL bindings
Apparently due to a corrupted registry?
In my case, I got the error when I had both SslCertStoreName and DefaultSslCtlStoreName in the registry. I deleted DefaultSslCtlStoreName and the error went away for a while. For some reason, DefaultSslCtlStoreName was created in the registry again, and the error came back. So I wrote a simple PowerShell script that deletes it.
This is the relevant part of my build script:
function CleanupSslBindings()
{
    $sslBindingsPath = 'hklm:\SYSTEM\CurrentControlSet\services\HTTP\Parameters\SslBindingInfo\'

    # Find every binding entry that has a DefaultSslCtlStoreName property
    $registryItems = Get-ChildItem -Path $sslBindingsPath |
        Where-Object -FilterScript { ($_.Property -eq 'DefaultSslCtlStoreName') }

    If ($registryItems.Count -gt 0) {
        ForEach ($item in $registryItems) {
            $item | Remove-ItemProperty -Name DefaultSslCtlStoreName
            Write-Host "Deleted DefaultSslCtlStoreName in " $item.Name
        }
    } Else {
        Write-Host "No DefaultSslCtlStoreName found. The SSL Bindings registry is clean."
    }
}
In my case, I had built WCF services hosted as Windows services. When I did this, I apparently didn't know (and still don't) how to assign things like appids (noticeable when you run netsh http show sslcert) and other items that crop up, including an item related to this error.
Essentially, I read the same page the OP did: https://social.technet.microsoft.com/Forums/windowsserver/en-US/87b1252d-a6a0-4251-bbb6-38e104a8c07a/enumerating-iissslbindings-gives-failure-on-one-machine-works-on-another?forum=winserverpowershell
...and using regedit, I went to the key: HKLM\System\CurrentControlSet\services\HTTP\Parameters\SslBindingInfo
I saw all the same entries I see when I run the netsh command above. However, my WCF services are listed first, followed by my IIS sites. None of my WCF services had the SslCertStoreName value (only the IIS sites had it). Following the article's explanation that the first entry needs to have that registry value (a bug, in my opinion), I ran the following PowerShell commands:
Try
{
    Get-ChildItem IIS:\SslBindings
}
Catch
{
    # Give the first SslBindingInfo entry the SslCertStoreName value it is missing,
    # then try the enumeration again.
    $1stentry = Get-ChildItem HKLM:\SYSTEM\CurrentControlSet\services\HTTP\Parameters\SslBindingInfo | Select-Object -First 1
    $1stentry | New-ItemProperty -Name "SslCertStoreName" -Value "MY"
    Get-ChildItem IIS:\SslBindings
}
This code works for me, and that article helped me get here and understand that the root cause of this error code 234 was, I assume, a self-inflicted wound from not installing my WCF services correctly. YMMV. Hope this helps.
Apologies for the delay, but I resolved the issue with the following script (see below). For some bizarre reason (I don't know why) something had added two entries to my registry, and after removing them the problem went away. I figured this out by comparing my registry to that of another machine that wasn't having the problem, which revealed the culprit.
Remove-ItemProperty -Path "HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\services\HTTP\Parameters\SslBindingInfo\" -Name "[::1]:26143" -ErrorAction SilentlyContinue
Remove-ItemProperty -Path "HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\services\HTTP\Parameters\SslBindingInfo" -Name "127.0.0.1:26143" -ErrorAction SilentlyContinue
echo "Done."
@Bewc I reckon you are onto something there, although I think it affects more than just WCF services. We have a PowerShell script that builds and deploys a website onto a machine (sounds crazy, I know). Who or what creates these entries I have no idea, but perhaps some background process in IIS?
I'm creating a PowerShell script so I can create website hosting with a single command using the IIS PowerShell Management Console.
I have the commands I need to create the IIS site and add bindings for the domain names, etc.
The one piece of the puzzle I'm missing is how to change the default logging directory from %SystemDrive%\inetpub\logs\LogFiles to my own folder that is not on the boot drive of the server.
After extensive searching, I expected to find a command along the lines of the following pseudo-PowerShell:
New-ItemProperty IIS:\Sites\MyNewSite -name logging -value @{format='W3C';directory='d:\sites\site\logs';encoding='UTF-8'}
Please could you show me, with an example, how to change the logging folder using the IIS PowerShell Management Console.
Thanks in advance
Import-Module WebAdministration
Set-WebConfigurationProperty "/system.applicationHost/sites/siteDefaults" -name logfile.directory -value $logdir
While testing the answer from this thread and toggling options via IIS Manager and PowerShell, I stumbled on something that had been hidden from me: in IIS Manager, choosing Configuration Editor and making a change allows IIS Manager to generate and display the script for that change in C#, JavaScript, AppCmd.exe and PowerShell. Just click the Generate Script option.
For changing an individual web site's logFile configuration, the original post was nearly correct. Instead of New-ItemProperty, use Set-ItemProperty, like so...
Set-ItemProperty "IIS:\Sites\$SiteName" -name logFile -value @{directory=$LogPath}
For changing the server-wide default settings, see Andy Schneider's answer.
For more information about the options available, see this IIS.net article.
This works as well, using the WebAdministration Module
Import-Module WebAdministration
$site = Get-Item IIS:\Sites\MyNewSite
$site.logFile.logFormat = 'W3C'
$site.logFile.directory = 'd:\sites\site\logs'
# Log file encoding (UTF-8 vs ANSI) is a server-wide setting on system.applicationHost/log, not a per-site one
$site | Set-Item
# Use the Microsoft.Web.Administration API directly
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.Web.Administration")
$iis = New-Object Microsoft.Web.Administration.ServerManager
$web = $iis.Sites["test"]
# Set the new log path; the folder must already exist
$web.LogFile.Directory = "F:\Logfiles\"
$iis.CommitChanges()
If you host multiple sites on a single server and want them all to log to the same log file, the process is quite different. It took some digging to find clues here and here, so I thought I would leave a description behind for anyone else with this need.
The following two statements will combine the logs for all of your websites into the folder e:\log\w3svc.
Set-WebConfigurationProperty -pspath 'MACHINE/WEBROOT/APPHOST' -filter 'system.applicationHost/log' -name CentralLogFileMode -Value 'CentralW3C'
Set-WebConfigurationProperty -pspath 'MACHINE/WEBROOT/APPHOST' -filter 'system.applicationHost/log' -name centralW3CLogFile.directory -value 'e:\log'
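To confirm the switch to central logging took effect, you can read the same properties back, a small sanity check using the same provider path:
Get-WebConfigurationProperty -pspath 'MACHINE/WEBROOT/APPHOST' -filter 'system.applicationHost/log' -name CentralLogFileMode
(Get-WebConfigurationProperty -pspath 'MACHINE/WEBROOT/APPHOST' -filter 'system.applicationHost/log' -name centralW3CLogFile.directory).Value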