We received a SharePoint backup file as part of a project. Is there any way to access the ".dat" file without SharePoint?
I've heard that some of the files created with SharePoint are .cab files, but apparently not this one.
I hadn't heard of SharePoint until yesterday, but there were 77 pages of open questions about it on here, so I'm assuming it's okay to ask (even though it doesn't seem very programmerish to me).
The .dat file is similar or identical to an MS SQL backup file; you really don't have much use for it without the server software installed. If you expect to extract pages, images, etc., keep in mind everything is serialized into the SQL Server database, so it's pretty much only browsable in SharePoint itself.
SharePoint is very programmy :)
There are a few ways of creating backup files for SharePoint. There is the SQL backup, but also the stsadm -o export and stsadm -o backup commands. Only the stsadm -o export command yields a cab, but what's inside is not very usable, I'm afraid. What would you need to get out of that backup?
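For reference, those two commands look roughly like this (the URL and file paths are placeholders):

rem full site collection backup, restorable only with stsadm -o restore
stsadm -o backup -url http://server/sites/yoursite -filename c:\backups\yoursite.bak
rem content export; produces a cab-format .cmp file
stsadm -o export -url http://server/sites/yoursite -filename c:\backups\yoursite.cmp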
If 'someone' gave you that, they should really have told you what it is!
You can always ask someone with a running instance of SharePoint to restore it in their environment and pull out any documents you might need.
The base variety of SharePoint (WSS) is a free install. You can install an eval version of the full SharePoint product (MOSS). I think MS even has pre-installed virtual machines you can download.
Our Operations team has started piloting a project where we would be migrating to ANF (Azure NetApp Files). I manage an application that would be using templates stored on the ANF share. Sometimes users leave the template files open and we are unable to update the templates.
Previously, we would go to Computer Management, check for open files, and close them. I am being told that after we move to ANF we would no longer be able to do this. Please note I have no idea how ANF is set up or anything about this technology.
Is there any way we can use PowerShell to check if files are open on the ANF share and close them?
I was given the file location in the following format: \\domain\Office\AppData\<application files>.
I'm not even sure if we can use the Get-SmbOpenFile cmdlet.
Any advice would be helpful.
Previously, for open files we could go to Computer Management and close them. We can no longer do this with ANF, or I don't know how to do it.
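For what it's worth, this is roughly what I was hoping would work, assuming the ANF share exposes the standard Windows SMB management interface (I'm told it may not); the path filter is just my guess at our folder layout:

# list files open under the template folder
Get-SmbOpenFile | Where-Object { $_.Path -like '*\Office\AppData\*' }
# force-close them so the templates can be updated
Get-SmbOpenFile | Where-Object { $_.Path -like '*\Office\AppData\*' } | Close-SmbOpenFile -Force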
I am a new web developer. I have been using Joomla with a 3rd-party template, developing it on a local MAMP server. The template is somewhat unstable and breaks easily. I would like to back up my work on a daily basis.
I'm assuming all the files and the database need to be backed up? Is there a best practice for this?
Thanks very much!
Using Akeeba Backup (https://www.akeebabackup.com/products/akeeba-backup.html) is indeed a good idea. You can schedule a command line task to run every day and create a backup of this site. To restore such a backup you can use Akeeba Kickstart (https://www.akeebabackup.com/products/akeeba-kickstart.html). Very easy, very comfortable.
But this only works as long as you don't break your Joomla! installation, and your question implies that this can happen. To do a manual backup you can simply zip the folder that contains your Joomla installation and create a database dump. You can do both every day using a command line script.
Creating the dump: https://dev.mysql.com/doc/refman/5.7/en/mysqldump-sql-format.html
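A minimal daily script could look something like this (paths, database name and credentials are placeholders; MAMP ships its own mysqldump under /Applications/MAMP/Library/bin):

#!/bin/sh
# archive the Joomla files with a date stamp
zip -rq ~/backups/joomla-files-$(date +%F).zip /Applications/MAMP/htdocs/mysite
# dump the database alongside it
/Applications/MAMP/Library/bin/mysqldump -u root -p'root' mysite_db > ~/backups/mysite_db-$(date +%F).sql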
Using Git might work as well. Instead of zipping your folder you simply commit it to your Git repository. Don't forget to add the database dump to your repo as well.
https://techjoomla.com/developers-blogs/joomla-development/deploying-joomla-projects-using-git.html
http://joomlaablog.blogspot.de/2010/11/how-to-track-your-joomla-project-with.html
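Tying that together, a bare-bones daily routine might be (the repo path is a placeholder and assumes the database dump lands inside the repository):

cd /Applications/MAMP/htdocs/mysite
git add -A
git commit -m "daily backup $(date +%F)"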
The best and easiest method is to get it done through Akeeba Backup: https://www.akeebabackup.com/download.html. Install this component in the backend, run it, and take a backup whenever you want. It backs up both files and database. You can even download the backup and extract it to run the site on another web server. To extract it you can use their software, Akeeba Extract. This is all free.
I'd like to update an existing production MOSS 2007 website, and to do that I want to recreate this website in my dev environment (VHD, Windows 2003 EE, MOSS 2007 trial version).
I was given:
some stp files, which include aspx pages, JavaScript files, etc.
some aspx pages, a master page, and a CSS file
some wsp files
source code for almost all of the wsp files.
There is no documentation on how to install all this stuff, and I'm not a SharePoint developer but an ASP.NET dev.
I set up a local dev environment (I had no problems when following http://www.pptspaces.com/sharepointreporterblog/Lists/Posts/Post.aspx?ID=28) and tried to create a website based on the given files.
My questions:
Do I need a copy of the database?
I noticed that one of the wsp files is not a web part but a project that contains some web controls and even a master page - what should I do with it?
I tried to import all the stp files. I had to use stsadm, but I can't see them. I can list them using "stsadm -o enumtemplates" - all of them have the same ID!
I renamed the stp to cab, modified the "WebUrl" in the manifest.xml, created a new cab, renamed it back to stp and imported it into SharePoint, but the result is the same: I can't see them (the custom tab is still not visible).
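For reference, the import step I ran was along these lines (file name and title are placeholders):

stsadm -o addtemplate -filename MyTemplate.stp -title "My Template"
stsadm -o enumtemplates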
I created a custom site template locally (the custom tab appeared with it) and opened its manifest.xml. I noticed that the section contains 17 parameters (the manifest.xml files from the other stp files contain only 9 parameters).
I think that my SharePoint (trial version) works in a different way and that's the problem.
Could that be it?
What is the best way to create this site locally?
Thanks in advance for any answers.
If you want to set up a development environment containing the same functionality as your production server, you can duplicate your production site collection by:
Copying and attaching the production SharePoint content database to your development farm
Backing it up and restoring it with the stsadm tool
But first you need to prepare your development environment: if you have any wsp solutions deployed on the production farm, deploy them on your development environment as well.
Then, if you prefer the first option, check this link.
If you prefer the second option, check this link.
But if you prefer the backup route, you can hit problems caused by version differences between the production and development environments - different updates installed, and therefore different assemblies in the GAC.
So I advise using the attach-database approach; it should automatically upgrade the attached database if it is at a different version.
Just google a little; maybe you'll find a more complete guide for attaching or restoring site collections. I'm just giving you a starting point.
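A rough sketch of both routes (URLs, database names and file names are placeholders):

rem option 1: attach a copy of the production content database
stsadm -o addcontentdb -url http://devserver -databasename WSS_Content_Prod -databaseserver DEVSQL
rem option 2: back up on production, restore on development
stsadm -o backup -url http://prodserver/sites/yoursite -filename yoursite.bak
stsadm -o restore -url http://devserver/sites/yoursite -filename yoursite.bak -overwrite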
I'd like to download the Trac database so I can view its tickets offline. Is there any way to achieve this? I.e. if I need to leave the office and bring my laptop with me, how can I bring the tickets with me without having to connect to the company network?
I know that Mylyn can download and sync tickets via its Trac connector, but I'd like some stand-alone viewer.
See Simple Defects (SD).
I particularly like the "One-tweet install" idea.
I’m installing #SD (http://syncwith.us)
after reading about it on #StackOverflow
curl fsck.com/sd|perl;
export PATH=~/sd/bin:$PATH; sd
Note that you can clone Trac (and other bugtrackers) in SD:
sd clone --from trac:https://trac.parrot.org/parrot
Seeing as you don't want to install a server, how about using RSS? IIRC, Trac lets you get RSS feeds for each person, so you can have a feed of things assigned to you.
All you need to do then is get a nice client that will download these tickets. You should be able to access a plaintext version without an internet connection.
If that's not flexible enough, you could write a script on the server to publish a feed using the database directly.
And if RSS isn't for you (and your email is available offline), you could mail reports home. Trac also has this built in.
The default Trac installation uses SQLite to maintain all of the data. Attachments are stored on the file system.
In the folder containing the trac site, find \db\trac.db
This file can be viewed using the SQLite Manager Firefox add-on.
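If you prefer the command line, the same data can be queried with the sqlite3 client; the column names below come from the default Trac schema, so check them against your version:

sqlite3 db/trac.db "SELECT id, status, owner, summary FROM ticket ORDER BY id;"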
Happy hunting.
And if RSS or email isn't your notification of choice, there's a trac plugin that will let you receive task notifications on your Remember The Milk todo list.
See: http://1.www.rememberthemilk.com/forums/ideas/3580/?forum=ideas&hl=bs&topic=3580
If your objective is simply to view the tickets offline, how about
Run a report with all the tickets (or all those you're interested in).
Select either the comma-delimited or tab-delimited download link at the bottom of the page.
Import the downloaded file into Excel.
You could install it on a local machine.
You can host Trac locally and set up the connection string to point to your downloaded database.
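For a SQLite-backed project that connection string lives in conf/trac.ini and looks roughly like this (the path is relative to the Trac environment directory):

[trac]
database = sqlite:db/trac.db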
Sure. Install a web server locally, install Trac, get it set up the same (or a similar) way to how it is on the live version, and then script the server to publish DB backups and write a local script to download those and restore them over your database.
It's not simple (installing Trac is a battle of its own, in my experience) but every element is highly googleable =)
The trac client FatBug (http://fat-bug.com/) listed in
https://trac.edgewall.org/wiki/Clients
seems to do exactly what was described by the OP. I bumped into it after I had just checked SD. SD seems trivial on Linux, but heavy on Windows, since it depends on Perl & CPAN.
I'm looking for something that can copy (preferably only changed) files from a development machine to a staging machine and finally to a set of production machines.
A "what if" mode would be nice as would the capability to "rollback" the last deployment. Database migrations aren't a necessary feature.
UPDATE: A free/low-cost tool would be great, but cost isn't the only concern. A tool that could actually manage deployment from one environment to the next (dev->staging->production instead of from a development machine to each environment) would also be ideal.
The other big nice-to-have is the ability to only copy changed files - some of our older sites contain hundreds of .asp files.
@Sean Carpenter: can you tell us a little more about your environment? Should the solution be free? Simple?
I find robocopy to be pretty slick for this sort of thing. Wrap it up in a batch file and you are good to go. It's a glorified xcopy, but deploying my website isn't really hard. Just copy out the files.
As far as rollbacks... you are using source control, right? Just pull the old source out of there. Or, in your batch file, ALSO copy the deployment to another folder called website yyyy.mm.dd so you have a lovely folder ready to go in an emergency.
Look at the for command for details on how to get the parts of the date.
robocopy.exe
for /?
Yeah, it's a total "hack" but it moves the files nicely.
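A minimal batch sketch of the above (paths are placeholders, and the date parsing assumes a "Ddd MM/DD/YYYY" format for %date%, so adjust the tokens for your locale):

rem deploy.bat - keep a dated copy of the current site, then mirror only changed files
for /f "tokens=2-4 delims=/ " %%a in ("%date%") do set stamp=%%c.%%a.%%b
rem snapshot the current deployment so rollback is just a copy back
robocopy \\webserver\wwwroot\mysite C:\deployments\website_%stamp% /E
rem push only the files that changed
robocopy C:\dev\mysite \\webserver\wwwroot\mysite /MIR /XD .svn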
For some scenarios I used a freeware product called SyncBack (Download here).
It provides complex, multi-step file synchronization (filesystem or FTP etc., compression etc.). The program has a nice graphical user interface. You can define profiles and group/execute them together.
You can set filters on file types, names, etc. and execute commands/programs after the job execution. There is also a job log provided as an HTML report, which can be emailed to you if you schedule the job.
There is also a professional version of the software, but for common tasks the freeware should do fine.
You don't specify if you are using Visual Studio .NET, but there are a few built-in tools in Visual Studio 2005 and 2008:
Copy Website tool -- basically a visual synchronization tool, it highlights files and lets you copy from one to the other. Manual, built into Visual Studio.
aspnet_compiler.exe -- lets you precompile websites (a rough invocation is sketched below).
Of course you can create a web deployment package and deploy as an MSI as well.
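For the precompile step, the invocation looks roughly like this (the virtual path and directories are placeholders):

rem precompile the site from C:\dev\MySite into a deployable output folder
aspnet_compiler -v /MySite -p C:\dev\MySite C:\build\MySite_precompiled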
I have used a combination of CruiseControl.NET, NAnt and MSBuild to compile, swap out configuration files for specific environments, and copy the files to a build output directory. Then we had another NAnt script to do the file copying (and run database scripts if necessary).
For a rollback, we would save all prior deployments, so theoretically rolling back just involved redeploying the last working build (and restoring the database).
We used UnleashIt (unfortunate name I know) which was nicely customizable and allowed you to save profiles for deploying to different servers. It also has a "backup" feature which will backup your production files before deployment so rollback should be pretty easy.
I've given up trying to find a good free product that works.
I then found Microsoft's SyncToy 2.0, which, while lacking in options, works well.
BUT I need to deploy to a remote server.
Since I connect with terminal services, I realized I can select my local hard drive when I connect, and then in Explorer on the remote server I can open \\tsclient\S\MyWebsite.
I then use SyncToy with that path and synchronize it with my server. It seems to work pretty well and fast so far...
Maybe rsync plus some custom scripts will do the trick.
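For example, something along these lines (host and paths are placeholders; --dry-run is the "what if" mode and can be dropped for the real copy):

rsync -avz --delete --dry-run ./site/ deploy@staging:/var/www/site/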
Try Repliweb. It handles full rollback to previous versions of files. I've used it whilst working for a client who demanded its use, and I've become a big fan of it, particularly:
Rollback to previous versions of code
Authentication and rules for different user roles
Deploy to multiple environments
Full reporting to the user via email / logs stating what has changed, what the current version is, etc.