I'm not sure how to ask this question so let me explain my situation...
I generally work remotely but travel into the office once or twice a week. When I'm working remotely I use a VPN to gain access to everything I would have as if I were in the office. When I'm writing code at home I'll grab the latest version of the code I'm working on from TFS and use a local workspace on the home PC. However, when I'm at the office I have no way of accessing that code unless I check it in from the home PC, and I'd rather not check in half-written code. What is the best possible way to have both sets of code available on both PCs? I've read about remote workspaces but I'm not sure how to set anything like that up.
Any help would be appreciated.
The simplest answer is to shelve your changes. That way they get stored on the server but don't get committed to the code base; you can then unshelve and carry on where you left off. Plus this means you'll be working with code on your local machine, negating any issues with the VPN connection, and you can also share shelvesets with other members of your team.
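For reference, a rough sketch of that workflow with the tf.exe command line (the shelveset name and comment are placeholders; the same can be done from the Pending Changes window in Visual Studio):

    REM On the home PC: park the half-written work on the server without checking it in
    tf shelve HomeWip /recursive /comment:"half-written feature, not ready for check-in"

    REM On the office PC, in a workspace mapped to the same branch: pull it back down
    tf unshelve HomeWip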
Help! The folder C:\ProgramData\Microsoft\Crypto\SystemKeys is growing out of control. It is doing this on some of our servers and some desktops. We are a medium to small business and use Active Directory (not Azure AD). I've heard that this folder is used by IIS, SQL Server, Remote Desktop Licence Server (maybe other things too?). My guess is it has something to do with Remote Desktop as I've not seen this problem with any computer that hasn't been using Remote Desktop. I've heard that some of the keys in this folder are important so you can't just delete it. Anyone have any idea what to do to get it to stop, what causes it, or how to clean it up?
Here are my file properties:
Thank you for any help.
I found an answer to this question. It turns out that in my case it was caused by an application writing to this directory. I contacted the vendor and they are implementing a fix in the next release (October 2021). In the meantime, I've deleted the extra keys in the folder. So far, no issues.
An automated Windows update this morning left my Windows Server 2012 R2 Classic Virtual Machine on Azure in a semi-crashed state. The VM is a web server, and all the files and applications in it are still accessible via the browser. In other words, IIS and a number of other services are still running. Unfortunately, however, the VM is not accessible via Remote Desktop and is unresponsive to commands from the Azure management interface on the portal.azure.com website.
This type of error is quite common and can be found reported on many other websites. The error has been happening to Windows users (not just Windows Server) for many years already, and none of the solutions online will work for Azure users, because they involve restarting from a CD, pressing shift-f8 during boot, issuing DOS commands, restoring from backup, or unchecking certain properties in VMWare or other software.
Does anybody have a real solution for this problem on Microsoft Azure?
After struggling with this for weeks, I was able to fix it with the help of Microsoft support! I decided to post the solution here in case it can help someone in the future. Here are the three things that you need to do to fix this:
1. Restore the VM from a backup taken prior to the crash. The VM with the "Undoing Changes" crash is pretty much toast at this point. Now, proceed to steps 2 and 3 to ensure that the next batch of Windows Updates won't crash it again!
2. On your new VM, ensure that the TEMP and TMP environment variables both point to C:\Windows\TEMP. In my case, they were both pointing to a temporary folder in the logged-in user's profile.
3. Ensure that C:\Windows\TEMP is always empty. I achieved this by setting up a scheduled task that runs a simple BAT file that deletes all files and folders inside C:\Windows\TEMP once a day (a sketch of such a script follows below). I spoke with a Microsoft representative who said that even though you may have plenty of space on your C: drive, the Windows TEMP folder is really not supposed to get much bigger than 500 MB. When it gets very large you may have issues with Windows Updates (mine was just under 500 MB when the updates were failing).
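For what it's worth, here is one possible version of such a cleanup script, with the scheduled-task registration included as a comment; the script path and run time are placeholders, so adjust them to your setup:

    @echo off
    REM clean-temp.bat - deletes everything inside C:\Windows\TEMP
    del /q /f /s "C:\Windows\TEMP\*" >nul 2>&1
    for /d %%d in ("C:\Windows\TEMP\*") do rd /s /q "%%d"

    REM Register it to run daily as SYSTEM (run once from an elevated prompt):
    REM schtasks /create /tn "Clean Windows TEMP" /tr "C:\Scripts\clean-temp.bat" /sc daily /st 03:00 /ru SYSTEM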
I would recommend contacting Azure support as something may have to be done by an engineer to fix the issue and unfortunately classic VMs don't have the redeploy feature.
I've added only inbound port 3389 (RDP), and it works well now.
I'm working on a website with some other people. Usually when we want to modify something, we make the change on our own machine, upload the new version with FTP, hope it works (or that nobody will notice it doesn't before we correct it), and that's it.
That's already not the best way to work alone, and even less so collaboratively, so I'm asking for advice.
I think a solution like SVN/Git/Mercurial could help. I found Bitbucket, which offers free private repositories with Mercurial. But even then, how can I upload the changes I made to the FTP server and make sure the version on my computer is the same as the one on the server?
We are all doing this in our free time (unpaid), and people come and go every year, so I'm looking for something free, easy to use (explaining to everyone why we should use a DVCS is already hard), and that doesn't rely on a specific person.
The server we use to host the website is a cheap one and doesn't allow the use of SSH, SVN, etc.
Thank you
Version control will not help with the issue you are describing - namely, uploading untested changes to a production site.
What you (and your team) need is better quality control procedures - you need a test website and a tester (QA) person. The process would be:
Make a change
Update the test website
Have the update and the whole website signed off by QA
Update the production/live site
What you will gain by using version control (CVS, SVN, Git or anything else) is recoverability - you will be able to go back to a version before any breaking change. It will still not solve the issue of "the new code broke the site".
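For instance, with the Mercurial/Bitbucket setup mentioned in the question, that recoverability looks roughly like this (the commit message and tag name are placeholders):

    # record the change locally
    hg commit -m "Reworked navigation menu"
    # back it up to the Bitbucket repository
    hg push
    # roll the working copy back to a known-good tag if the new code breaks the site
    hg update -r v1.1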
You want scheduled releases.
1. Commit and update code regularly
2. Code freeze, or develop in a branch and merge to the trunk
3. Test on a staging environment
4. If you find a bug, go to step 1
5. Release
You need to understand that what represents your latest correct working build is not what's on the server but what's in your source repository, whether that be SVN or just the file system - anything, as long as it isn't the live server! Make sure everything works locally as expected, then, unless the site is huge (I guess not, given your situation), deploy it in its entirety as a single version.
We are starting SharePoint development with a team of three and are currently setting up our development environments. We would like to avoid installing a Server 2008 for each developer, so a single terminal server has been set up, and each developer uses a remote session to run a VS2008 instance on it. Now we would like to separate the developers' testing environments (i.e. a different site collection per developer), but we have realized that the assemblies need to be installed into the GAC to show properly on the site. But since there is AFAIK only one GAC, developers wouldn't be able to test their stuff independently.
Is there any way we could create separate testing environments without installing a bunch of 2008 Servers?
So you're all going to remote in and fire up Visual Studio, compile stuff, restart IIS, etc.?
You're going to be stamping on each other's toes.
A wiser choice nowadays is to use Hyper-V (or some other virtualisation).
We use Windows Server 2008 on our laptops, and use Hyper-V to run our dev environments. We each have our own dev environment (sandbox), and these have VS2008, SVN, NUnit, etc.
Our code is tested against each other's thanks to CruiseControl running on the one shared Hyper-V machine.
This has been great for us... we distribute the load, we can work on the move, we don't step on each other's toes, and if we need to do a demo we can switch Hyper-Vs and demo from the demo Hyper-V (branched from the dev one early on so that the environments are known).
Go virtual and don't look back.
PS: I've just seen your comment about one server... just put Hyper-V on that and run 3 instances. That's also what we do ;)
I don't know about installing the server on everything, but this sounds like an ideal task for virtual machines rather than physical ones. Where I work we use VMware a whole lot for this kind of work and it does very well.
It's also useful to be able to roll back to a snapshot when it comes to testing installation processes and so on.
No. In addition to the GAC there are all the SharePoint files in the 12 hive, such as features and site templates. It's not worth what you save on server costs.
(Of course if you don't use the GAC, but deploy to the bin folder, and you don't touch anything in the 12 hive, you can give each developer their own web application on the same server. But this approach puts a lot of restrictions on what they can do. It's still not worth it.)
Virtual machines will work, but they can be slow to develop on. For instance, you'll need to restart the application pool for every GAC deploy - which means a pause of maybe 15-60 seconds to reload the application, (depending on the hardware). This will become annoying.
Virtual machines work better for test and production, where you don't restart the application so often.
I recommend a physical server for each developer. This will minimize the code-deploy-test cycle time, and make sure they don't have to worry about stepping on each other's toes.
You are on the wrong track with Terminal Services - it's just not going to give you any separation.
A lot of people do recommend developing on Windows 2003/2008 Server directly, and it does simplify some things like remote debugging.
I prefer the more traditional method of using VMware to run virtual machines. These can run on a local or remote host. Remote debugging is a little more complex to set up but still possible.
Finally - if possible then deploy to the bin dir rather than the GAC. This will make it much easier to deploy automatically after compilation.
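As a rough illustration (the virtual directory path below is just a typical example; use your own web application's folder), a Visual Studio post-build event can copy the assembly straight into the bin folder, whereas the GAC route needs gacutil plus an application pool recycle:

    REM Post-build event for bin deployment (path is illustrative)
    xcopy /y "$(TargetPath)" "C:\Inetpub\wwwroot\wss\VirtualDirectories\80\bin\"

    REM GAC alternative (requires gacutil from the SDK on the PATH):
    REM gacutil /i "$(TargetPath)"

Bear in mind that bin-deployed assemblies run under the CAS trust level configured in web.config, so you may need to raise the trust level or supply a custom policy file.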
The contributors are right that there are lots of stumbling blocks to multi-developer single server environments.
Number one, developers will be trying to attach to the same web application process (w3wp.exe), so creating separate web applications on different ports is a must unless you are prepared to take turns debugging. See: How to setup a development environment for sharepoint 2013
The second problem comes when you try to collaborate and use shared components/features. The desire to work separately is debatable; I believe the team developers should be collaborating and sharing, so combining work is desirable to ensure seamless integration into a single final solution and that no work is duplicated. The multi-developer single-server environment works perfectly until you try to collaborate: 'One common mistake is to have one “development server” used by all team developers. Unless team members are working on totally unrelated components and never need to do common things such as restart IIS or attach a debugger to an IIS process, this type of environment generally doesn’t work well.' (http://technet.microsoft.com/en-us/magazine/dn145990.aspx) We made this mistake through lack of experience and knowledge, but once you have made it, it's possible to work around it.
My first attempt to share features was to copy developer 1's project into developer 2's solution, add a reference to it in developer 2's project, and add all the features to developer 2's package. Deploying this works fine for developer 2 - until, as I discovered, developer 1 detaches their solution from the debugger, which retracts the solution (matched on the duplicated solution ID) from the farm and therefore from each developer's web application. Developer 2 then has the rug pulled out from underneath them. Although this is a partial solution and seemed to work for a while, it took me a while to work out what was happening and which combinations of dev 1 and dev 2 deployments broke or preserved each other's work.
So I found a better solution. In the project properties in Visual Studio, under the SharePoint tab, there is a checkbox called 'Auto-retract after debugging'. By default this retracts the solution when the developer stops the attached debugger and pulls the features out from underneath the other developers. Unticking this box prevents the retract and leaves each developer's individual solution deployed at farm level; reattaching the debugger just replaces the solution with minimal fuss.
In my experience recycling the IIS application pool is so fast the other developers don't even notice, but with a team larger than two this might become more noticeable, so perhaps someone else could add their experiences. I also guess that unless the other developers try to attach at exactly the moment a recycle is happening it will be fine, so there is only a really small chance of a clash, and simply detaching and reattaching will fix it if it is ever experienced.
Okay, so I'm running a small test webserver on my private network. I've got a machine running Windows 2000 Pro, and I'm trying to run an ASP.NET app through IIS.
I wrote it so that the webpage uses the registry to store certain settings (connection strings, potentially volatile locations of other web services, paths in the local filesystem where certain information is stored, etc.). Of course, it worked fine when testing with VStudio.NET 2005, because the user running the app has elevated privileges. However, running it on IIS I get "Access to the registry key 'HKEY_LOCAL_MACHINE\Software' is denied.", which suggests the IIS user doesn't have read access to that part of the registry (I only do reads through the website itself, never writes).
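For context, the reads in question are along the lines of this sketch (the key path and value name are placeholders, not the app's actual names):

    using Microsoft.Win32;

    public static class SiteSettings
    {
        // Hypothetical read-only registry access of the kind described above.
        public static string GetConnectionString()
        {
            // OpenSubKey throws the SecurityException quoted above when the
            // worker-process account lacks read permission on the key.
            using (RegistryKey key = Registry.LocalMachine.OpenSubKey(@"SOFTWARE\MyWebApp"))
            {
                return key == null ? null : (string)key.GetValue("ConnectionString");
            }
        }
    }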
I was like "okay, simple enough, I'll just go give that user rights to that part of the registry through regedit." The problem is, I don't see an option anywhere in regedit to change security settings... at all. Which got me thinking... I don't think I've ever actually had to change security settings for registry hives/keys before, and I don't think I know how to do it.
Half an hour of searching the web later, I haven't found any usable information on this subject. What I'm wondering is... how DO you change security rights to portions of the registry? I'm stumped, and it seems my ability to find the answer on Google is failing me utterly... and since I just signed up here, I figured I'd see if anyone here knew. =)
If you're having trouble with RegEdit in Windows 2000 you can try the following:
Copy the Windows XP RegEdt32.exe to the Windows 2000 Machine
Using a Windows XP Machine, connect to the Windows 2000 registry remotely: File > Connect Network Registry
You can set permissions at the key (folder) level to grant a user read/write access.
In your case, right click on the "Software" folder and select "Permissions".
You'll probably know the rest from there.
EDIT: If you still run into issues, you may want to modify your web.config file and use impersonation to have your web application run as a certain user account. Then you can keep a tighter rein on the controls.
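For example, a minimal web.config fragment for that approach might look like this (the account name and password are placeholders, and the account would need read access to the registry key):

    <configuration>
      <system.web>
        <!-- Run the application under a fixed account instead of the default IIS account -->
        <identity impersonate="true" userName="MYDOMAIN\WebRegReader" password="placeholder" />
      </system.web>
    </configuration>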
RegEdt32.exe will allow you to set permissions on registry keys.
Simply right-click on a key (folder) and click Permissions, then you can edit the permissions as you would a file system folder.
I did so, assuming that a Security setting would be available. I didn't see any "Security" option when I right-clicked on the key. =( I triple-checked just to make sure... and I just tried it on my XP machine, and it does indeed have the "Permissions" option... but the Windows 2000 machine doesn't. (How's that for weird?)
In my searching, I found:
http://www.experts-exchange.com/Programming/Languages/.NET/ASP.NET/Q_21563044.html
Which notes that RegEdit for Windows 2000 doesn't have the Security/Permissions settings... but it proposes no solution to the problem. (Whoever asked the question was using Windows XP so he was okay... but in my case, it's 2000)
Is there any way to make it happen specifically in 2000?
EDIT: Ahhhh... if worse comes to worst, I suppose I can do the impersonation as mentioned below... though if I can't set security settings for the registry in 2000, I'm left with giving that user Administrative access (I assume?) to actually get those rights, which sadly defeats the purpose. =(
Oh, let me try that! I didn't realize you could remotely connect to another registry.
(EDIT: I was wrong, it did work... it just took several minutes to respond to my request to change permissions remotely)
The remote connection idea did it! You're good! Thanks so much for your help! I never realized you could remote connect with RegEdit... you learn something new every day, they say! =) Thanks again for your assistance! =)
On another note though, about copying the XP version of RegEdit to Windows 2000... is that safe? I figured they would be coded in such a way as to be incompatible... but I could be assuming too much. =)
Just use RegEdt32.exe instead of Regedit.exe.
Go to the desired key or folder, then open the Security menu and click 'Permissions'.