I'm a newcomer to SharePoint development, and I used the following command to deploy a solution to SharePoint:
stsadm -o addsolution -filename Demo.wsp
I used WSPBuilder to build the .wsp file, which contains just one web part. What do I do to deploy a new version of the .wsp file? When I try the same command above, I get the error:
a solution with the same name "Demo.wsp" or id "blahblah" already exists in the solution store
Use stsadm -o upgradesolution -filename <localwsp> -name <solutionname> -- but remember that the solution ID must be the same for both the new and the current WSP (this is normally the case unless you're doing something a bit unusual).
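For example, with the file name from the question (a sketch; -immediate and -allowgacdeployment are optional switches you may or may not need):

```shell
stsadm -o upgradesolution -name Demo.wsp -filename Demo.wsp -immediate -allowgacdeployment
stsadm -o execadmsvcjobs
```

The second command forces the administrative timer jobs to run, so the upgrade executes right away instead of waiting for the next job window.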
-Oisin
I have a PowerShell script that works every time when I run it from my local machine (where I have AzCopy installed):
AzCopy `
/Source:C:\myfolder `
/Dest:https://mystorageaccount.blob.core.windows.net/mystoragecontainer `
/DestKey:<storage-account-access-key> `
/Pattern:"myfile.txt"
From an Azure Pipeline (Microsoft-hosted agent), this script fails with:
"AzCopy.exe : The term 'AzCopy.exe' is not recognized as the name of a cmdlet, function, script file, or operable program."
I have tried different agents, but I still get the same error.
Which agent must I use to have AzCopy available?
Am I missing something obvious?
Is there another way of doing this, still using PowerShell?
To copy files to Azure with AzCopy you can use the built-in Azure File Copy task; you don't need to use PowerShell:
In addition, you can install the Microsoft Azure Build and Release Tasks extension, which gives you another task, "Azure Copy File Extended", with more options.
I agree with Shayki Abramczyk; the task he provided can also be used to copy the files. Here is another way you can consider trying :-)
Back to this issue: according to the error message, I think the SDK is missing from the hosted agent.
To date, Microsoft does not install Azure.Storage.AzCopy on every hosted agent, so the agent you used may not support it.
Seven different hosted agents are provided for users, but only Hosted VS2017, Hosted Windows 2019 with VS2019, and Hosted Ubuntu 1604 have the SDK that includes AzCopy.exe installed.
So, try one of these three agents to execute your AzCopy command with PowerShell.
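As a sketch, picking one of those images in an azure-pipelines.yml could look like this (the vmImage names are the ones in use at the time of writing and may change; $(storageKey) is an assumed secret pipeline variable, not something from the question):

```yaml
pool:
  vmImage: 'vs2017-win2016'   # or 'windows-2019' / 'ubuntu-16.04'

steps:
- powershell: |
    cd "C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy"
    .\AzCopy.exe /Source:C:\myfolder /Dest:https://mystorageaccount.blob.core.windows.net/mystoragecontainer /DestKey:$(storageKey) /Pattern:"myfile.txt"
  displayName: 'Copy file with AzCopy'
```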
Edit:
Because the executable (AzCopy.exe) is invoked by a local path, you need to know where AzCopy.exe is located. For me, it's C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy.
So, in the script, you first need to run a cd command to change into the directory where AzCopy.exe is located.
cd "C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy"
Note: do not drop the double quotes here, or you will get an error that 'x86' is not recognized. If your AzCopy path differs from mine, just substitute yours.
Then, because you are using PowerShell, you need PowerShell syntax. Here is a complete example, modified from your script:
$source = "C:\MyFolder"
$dest = "https://mystorageaccount.blob.core.windows.net/mystoragecontainer"
$pattern = "myfile.txt"
$destkey = "<key>"
cd "C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy"
$azcopy = .\AzCopy.exe /Source:$source /Dest:$dest /DestKey:$destkey /Pattern:$pattern
Please try it.
For people like me who land on this thread because they hit this error calling AzCopy from a PowerShell script: it's confirmed that AzCopy is not on the PATH in the latest (VM2019) Windows hosted image. But according to Microsoft the binary is present in the image, so you don't have to install it; you just have to use the right path.
For more information about the packages installed (or saved) on the VM, you can check this Git repo.
I have a requirement to run web tests, which are located in a particular folder, as part of a build. Currently the tests are run from Visual Studio 2015.
I learned that I should execute the PowerShell script below as a build task, but I'm clueless about how to implement it. Is this PowerShell script enough?
param
(
    $tool = "C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE\MSTest.exe",
    $path,
    $include = "*.webtest"
)
# Collect every .webtest file under $path
$web_tests = Get-ChildItem -Path $path -Recurse -Include $include
foreach ($item in $web_tests) {
    $args += "/TestContainer:$item"
}
# Invoke MSTest with the collected test containers
& $tool $args
In this, how do I pass in the $path value? Do I need to give the path of the directory which holds all five of these web tests?
Do I need a .testsettings file to carry out this execution?
If I do need a test settings file, do I need to copy all the .webtest files to the output directory?
All I get with this PowerShell script is the message below:
No test runs are available for this build. Enable automated tests in
your build definition by adding a task that runs tests for your test
framework of choice, such as the Visual Studio Test task. If you choose
to run tests using a custom task or runner, you can publish results
using the Publish Test Results task
Could anyone please tell me what I am missing here? Thanks for your time and help.
how to pass in the $path value?
When you use the PowerShell task to execute your PowerShell script, there is an Arguments option through which you can pass the value of $path:
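For example, the Arguments box could contain something like the following ($(Build.ArtifactStagingDirectory) is a predefined pipeline variable used here only as an illustration; point -path at wherever your build actually drops the tests):

```
-path "$(Build.ArtifactStagingDirectory)" -include "*.webtest"
```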
Do I need to give the path of the directory which has all these 5
webtests?
Yes. In general, the .webtest files are all copied to the drop location, so the path usually points to the drop location.
Do I need a testsettings file to carry out this execution.
The short answer is yes.
There is a great blog about how to run WebTests as part of a VSTS vNext release pipeline; you can check whether it helps.
Hope this helps.
In Visual Studio 2010, I made a custom build step in my project which renames a DLL file and copies it to another folder. So: Alt+F7, Configuration Properties, Custom Build Step / General, command line:
copy /y $(TargetPath) $(TargetName).node
It didn't do anything. Then I also added
ping bat.femei.ro -n 1 -w 5000
It still didn't do anything. It simply flashed a command prompt window for a split second, then the window went away. I googled as much as I could and concluded that there might be a problem with the folder where the batch file is generated.
I did my best to screenshot that split second, and after a tedious F7-PrintScreen-PasteInPaint session I finally got:
C:\Users\FURAT\AppData\Local\Temp\blablablablablablablablabla.exec.cmd is not recognized as internal or external...
I double-checked the directory. It has Everyone permissions set to allow both read and write operations. What's wrong? How do I fix this?
I was unable to find any knobs to tweak the temp folder path: it's not an environment variable, and it's not in the config either.
What did work, however, was running VS2010 as Administrator. Now the custom build step works.
I want to download all these RPMs from SourceForge in one go with wget:
Link
How do I do this?
Seeing how, for example, HeaNet is one of the SourceForge mirrors hosting this project (and many others), you could find out where SF redirects you; specifically:
http://ftp.heanet.ie/mirrors/sourceforge/h/project/hp/hphp/CentOS%205%2064bit/SRPM/
... and download that entire directory with the -r option (you should probably use the "no parent" switch, too).
One of two ways:
Create a script that parses the HTML file, extracts the links that end with .rpm, and downloads each of them with wget $URL.
Or start copying and pasting those URLs and run wget $URL from the console.
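The first approach can be sketched in shell. The function and file names here are just illustrative, and the pattern assumes a plain directory listing whose links look like href="...rpm":

```shell
# Pull every link ending in .rpm out of a saved directory-listing page.
extract_rpm_links() {
  grep -oE 'href="[^"]*\.rpm"' "$1" | sed -E 's/^href="(.*)"$/\1/'
}

# Typical use, once the listing page has been saved:
#   wget -O index.html 'http://ftp.heanet.ie/mirrors/sourceforge/h/project/hp/hphp/CentOS%205%2064bit/SRPM/'
#   extract_rpm_links index.html > rpm-urls.txt
#   wget -i rpm-urls.txt
```

If the listing emits relative links, you would still need to prefix them with the directory URL before feeding them to wget.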
I was wondering, if I deploy a WSP using the stsadm command:
stsadm -o addsolution -filename myWSP.wsp
Will this also install the required DLLs (already included in the WSP) into the GAC?
Or is this another manual process?
This is determined by the DeploymentTarget attribute in the solution's manifest.xml. If you are maintaining this file yourself, using the following syntax will deploy the code to the GAC:
<Assemblies>
<Assembly DeploymentTarget="GlobalAssemblyCache"
Location="MyGAC.dll" />
</Assemblies>
If you are using a tool to create the solution, it depends on the tool. WSPBuilder defaults to deploying to the GAC however it can be configured otherwise. See the "Scoping the assembly for BIN instead of GAC (including Code Access Security generation)" section of this article by Tobias Zimmergren for steps on how to deploy to bin.
If you're building the packages via VS, open the Package and click the Advanced tab on the bottom. You'll be able to add additional assemblies and specify the Deployment Target from here. I'd strongly recommend doing this rather than updating the XML directly...but that's just me.
As the name addsolution says, that command just adds the solution to the solution store. You need to call deploysolution to put everything in place. Here is the command you need to call:
stsadm -o deploysolution -name [solutionname] -allowgacdeployment
Note that -allowgacdeployment is mandatory for placing the files in the GAC. You can get more help on this command with:
STSADM.EXE -help deploysolution
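Putting the whole flow together, a typical sequence looks like this (a sketch using the file name from the question; -immediate makes the deployment timer job run right away, and for a web-application-scoped solution you would also pass -url):

```shell
stsadm -o addsolution -filename myWSP.wsp
stsadm -o deploysolution -name myWSP.wsp -immediate -allowgacdeployment
stsadm -o execadmsvcjobs
```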
There is an alternate way to get this done through the UI: go to Central Administration -> Operations -> Solution Management, select the solution, and choose Deploy. This may be the easier way to get it done quickly.