Can you copy a website?

Can you copy a Composite C1 website? I would like to create a copy of an existing website as a new website.
I start by creating Site A. Then I want to copy it and create Site B.
For example: copy the pages, functions, data, content, layouts, and CSS from website A to website B. The only difference between the two would be the name.

It would infringe copyrights and may get you sued, but yes, it's possible with a scraper, which basically grabs the whole site and downloads it for you. Such tools are used by Google and other search engines to cache sites.
Some examples:
http://www.grepsr.com/?adwords2&gclid=CIe4rrPF57cCFURcpQodASIAgg
http://info.kapowsoftware.com/WebScrapingDefinitiveGuide.html?pi_ad_id=11920224743&gclid=CPCfxbTF57cCFWNNpgodnCQAKQ
http://scrapy.org/
Or just Google "web scrapers".
If you own the site, however, and have FTP access, just copy the files to a folder called /b and it becomes www.a.com/b, or you can set up an addon domain that points to /b and make the addon domain, say, www.b.com.
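To give a rough feel for what those scraping tools do, here is a minimal, hypothetical C# sketch that downloads a single page and whatever its src attributes reference. The URL and output folder are placeholders I made up, and a real scraper (like the tools linked above) handles links, CSS, robots.txt and error cases far more thoroughly.

    using System;
    using System.IO;
    using System.Net;
    using System.Text.RegularExpressions;

    class SimplePageScraper
    {
        static void Main()
        {
            // Hypothetical target page and output folder.
            string pageUrl = "http://www.example.com/";
            string outputDir = @"C:\SiteMirror";
            Directory.CreateDirectory(outputDir);

            using (var client = new WebClient())
            {
                // Grab the page itself...
                string html = client.DownloadString(pageUrl);
                File.WriteAllText(Path.Combine(outputDir, "index.html"), html);

                // ...then pull down anything referenced via src="..." (images, scripts, etc.).
                foreach (Match match in Regex.Matches(html, "src=\"(?<url>[^\"]+)\""))
                {
                    try
                    {
                        Uri resource = new Uri(new Uri(pageUrl), match.Groups["url"].Value);
                        string fileName = Path.GetFileName(resource.LocalPath);
                        if (fileName.Length == 0) continue;
                        client.DownloadFile(resource, Path.Combine(outputDir, fileName));
                    }
                    catch (Exception)
                    {
                        // Skip anything that cannot be resolved or downloaded.
                    }
                }
            }
        }
    }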

The answer to your question "can you copy a website?" is yes, you can.
Provided you have access to all the files and folders, it's no different than copying a bunch of folders on your computer to another folder.
So if you're using a shared host and everything is in your public_html folder, just put the whole website in one folder, then copy it over to another folder.
Then simply point your new domain to that folder through your hosting platform.
The process differs from host to host, but the actual answer to your question is: yes, you can copy a website from one folder to another.

If you have access to the files on the server, you can simply copy them to the other desired location.
But remember that you have to update links and other paths (if they are absolute).
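As a rough illustration of that path-rewriting step, here is a minimal C# sketch that swaps one absolute base URL for another across the text-based files of the copy. The folder path and URLs are placeholders, and a blunt string replace like this assumes the old base URL never appears anywhere it should be left alone.

    using System.IO;

    class PathRewriter
    {
        static void Main()
        {
            // Hypothetical values: the copied site's folder and the old/new absolute URLs.
            string siteFolder = @"C:\inetpub\siteB";
            string oldBase = "http://www.a.com";
            string newBase = "http://www.b.com";

            // Rewrite absolute links in the text-based files of the copy.
            foreach (string pattern in new[] { "*.html", "*.htm", "*.css", "*.js" })
            {
                foreach (string file in Directory.GetFiles(siteFolder, pattern, SearchOption.AllDirectories))
                {
                    string content = File.ReadAllText(file);
                    if (content.Contains(oldBase))
                        File.WriteAllText(file, content.Replace(oldBase, newBase));
                }
            }
        }
    }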
If you don't have that access, you could use the developer tools (Firebug, or F12 in Chrome or IE) and copy each file and its source code by hand. This approach is more time-consuming than the previous one, but at least it can be done.
Cheers

As far as I know, the easiest way would be to use Internet Explorer's save-for-offline-viewing function (if it is still there) - this will copy all the resources of the currently open webpage and rewrite the HTML to use them. As for an entire website, I don't think it will be easy, for legal reasons.

If it's your own site, sure, why not! Who is there to stop you?
But if it's someone else's site, of course you have to worry about copyright, and most of the time the website uses server-side scripts which are not downloadable.

You can duplicate a Composite C1 website by copying the entire file structure to a new folder and then updating the installation id in the file ~/App_Data/Composite/Configuration/InstallationInformation.xml (put in a new random GUID). Then point a new IIS site at this new folder.
If your site is using SQL Server as a backend, you also need to create a copy of your database, create a new user account with dbo access for this database, and update the connection string in ~/web.config.
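If you want to script the file-copy part of that, here is a rough C# sketch of the approach. The folder paths are placeholders, and the regex-based GUID swap is an assumption on my part: it simply replaces the first GUID-shaped value it finds in InstallationInformation.xml, so check that file by hand before relying on it.

    using System;
    using System.IO;
    using System.Text.RegularExpressions;

    class DuplicateC1Site
    {
        static void Main()
        {
            // Hypothetical source and target folders.
            string source = @"C:\inetpub\SiteA";
            string target = @"C:\inetpub\SiteB";

            CopyDirectory(source, target);

            // Swap the installation id for a fresh GUID. Assumption: the id is the
            // first GUID-shaped value in the file.
            string infoFile = Path.Combine(target,
                @"App_Data\Composite\Configuration\InstallationInformation.xml");
            string guidPattern = @"[0-9a-fA-F]{8}-([0-9a-fA-F]{4}-){3}[0-9a-fA-F]{12}";
            string xml = File.ReadAllText(infoFile);
            File.WriteAllText(infoFile,
                new Regex(guidPattern).Replace(xml, Guid.NewGuid().ToString(), 1));

            // If the site uses SQL Server, remember to also point the connection
            // string in the copied web.config at the duplicated database.
        }

        static void CopyDirectory(string from, string to)
        {
            Directory.CreateDirectory(to);
            foreach (string file in Directory.GetFiles(from))
                File.Copy(file, Path.Combine(to, Path.GetFileName(file)), true);
            foreach (string dir in Directory.GetDirectories(from))
                CopyDirectory(dir, Path.Combine(to, Path.GetFileName(dir)));
        }
    }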
If you wish to duplicate an entire page structure inside the existing instance of the CMS and share media files, templates, etc., this could be done, but no tooling is available; it would be a coding task.

Copy the directory (the website's physical path) that the website is pointing to and paste it somewhere, then create a new website and point it to that copied directory.

Related

How do I copy a website from one Azure website to another (not deployment slots)?

I have two Azure websites in the same subscription and I want to copy the site from one to another. I know I can copy the entire site down to my local machine using FTP and then upload it to the other, but it seems like there should be an easier way, especially considering the FTP hostname is the same for both sites.
These are not deployment slots, so I can't just swap them in the interface.
Thanks.
I think you can use the SiteReplicator site extension to do that. You can find it here https://www.siteextensions.net/packages/sitereplicator/.
The SCM site can be found at URL_OF_Your_SITE.scm.azurewebsites.net; go to Site Extensions there to install it if it's not visible in the portal under the Site Extensions gallery.
You can use the backup/restore feature. That will create a ZIP file of your site, store it in your storage account (including the configuration serialized to XML), and then you can restore it to a different/new site. In the end it is basically copying files around anyway, but it is more convenient than doing it manually through FTP, and another benefit is that the website's configuration is copied as well. This is a one-time operation, though; it is not clear from your question whether you want to copy the site once or periodically (for the latter I would suggest the SiteReplicator mentioned in the other response here).
Some links which might help:
Backup - http://azure.microsoft.com/en-us/documentation/articles/web-sites-backup/
Restore - http://azure.microsoft.com/en-us/documentation/articles/web-sites-restore/

Protect static files in Classic ASP website

I want to secure static files (images, .txt files) from unauthenticated users. How can I implement user authentication on the website so that the static files in a specific folder are also secured? I have used simple authentication in a login.asp file, start a session for the authenticated user, and check the session value in the protected .asp files. But I have no idea how to secure static content on a Classic ASP website.
The website is hosted on IIS 7 with Integrated pipeline mode.
You already asked this, and I answered it, and I will give you the same answer.
You will need to use BASIC AUTHENTICATION to restrict access to static files in IIS (Classic ASP). Otherwise, you need to save the static content in another format, encrypt it, and only make it viewable to users authenticated by your program.
Please don't ask this again; the answers will not be different.
If using Basic Authentication is not your cup of tea, one possibility would be to replace your static files with an ASP file that, upon authorization, outputs the correct file. If necessary, you can set the ContentType of the Response to the appropriate type. The link http://support2.microsoft.com/kb/173308 shows you how to do that with an image stored inside a database, but of course you can use whatever you want as the source of the file. In the case of .TXT files, you can even take the file directly and simply add a small section of ASP code at the beginning to do the check.
All of this requires extra work. There is no way to simply switch on session-based protection for static files.
Old question, but most Microsoft servers with Classic ASP installed have several default folders which cannot be accessed except via ASP. They are /bin, /app_code and /app_data, and there may be others; it depends on your hosting company. Windows 10 IIS (the cut-down dev & test suite) locks these by default. Using ASP code to retrieve and display text and HTML is very easy, but I'm not sure how to do images. If you have very low traffic, one way would be to copy the image file to an unlocked folder, give it a random name, access it normally in an IMG tag, and then delete it after use. (I came here looking for a better method.)
Update: The answer to loading images via ASP is here -- displaying images from sql database with classic asp ... see the bottom answer by "HeavenCore" and, instead of Response.BinaryWrite rs("ImageBlob"), get the binary of the image into your variable, e.g. BinaryImageData, and do Response.BinaryWrite BinaryImageData.

Downloading all folders and their contents (files) from a website and uploading to another site?

I am working on a problem where I have to log in to a website (through a form on the webpage) which has many folders displayed on the page, and these folders contain many files under them. Do I understand correctly that these folders are not the same as folders on our PC, which have a physical location? These folders on the website are just links that open a list of files when clicked.
So I am struggling to write code that can log in to the site and download all the folders and their contents (files), arranging them on the PC in the same hierarchy as on the website. I am thinking about using HttpWebRequest for logging in to the site, but I have no idea how to download the folders and their contents in the same structure as on the website.
Can anyone help me develop the code? I am using C# with .NET 4.0.
Before thinking about reinventing the wheel, why not ask your hosting provider about FTP access to the site? If you have that, then problem solved: use one of the numerous FTP clients to download all the website content in one click.
Good luck
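If you do end up coding it yourself instead of using an off-the-shelf FTP client, here is a rough C# (.NET 4.0) sketch of mirroring a folder tree over FTP with FtpWebRequest. The host, credentials and paths are made-up placeholders, and the directory detection assumes a Unix-style listing where directory entries start with 'd' and names contain no spaces, which not every FTP server produces.

    using System;
    using System.IO;
    using System.Net;

    class FtpMirror
    {
        // Hypothetical credentials; use the ones your hosting provider gives you.
        static readonly NetworkCredential Credentials =
            new NetworkCredential("user", "password");

        static void Main()
        {
            DownloadDirectory("ftp://ftp.example.com/site/", @"C:\SiteCopy");
        }

        static void DownloadDirectory(string ftpUrl, string localPath)
        {
            Directory.CreateDirectory(localPath);

            foreach (string line in ListDirectoryDetails(ftpUrl))
            {
                // Naive parsing: assumes a Unix-style listing where directory entries
                // start with 'd' and the file name is the last space-separated token.
                string name = line.Substring(line.LastIndexOf(' ') + 1);
                if (name == "." || name == "..") continue;

                if (line.StartsWith("d"))
                    DownloadDirectory(ftpUrl + name + "/", Path.Combine(localPath, name));
                else
                    DownloadFile(ftpUrl + name, Path.Combine(localPath, name));
            }
        }

        static string[] ListDirectoryDetails(string ftpUrl)
        {
            var request = (FtpWebRequest)WebRequest.Create(ftpUrl);
            request.Method = WebRequestMethods.Ftp.ListDirectoryDetails;
            request.Credentials = Credentials;
            using (var response = request.GetResponse())
            using (var reader = new StreamReader(response.GetResponseStream()))
                return reader.ReadToEnd().Split(
                    new[] { "\r\n", "\n" }, StringSplitOptions.RemoveEmptyEntries);
        }

        static void DownloadFile(string ftpUrl, string localFile)
        {
            var request = (FtpWebRequest)WebRequest.Create(ftpUrl);
            request.Method = WebRequestMethods.Ftp.DownloadFile;
            request.Credentials = Credentials;
            using (var response = request.GetResponse())
            using (var ftpStream = response.GetResponseStream())
            using (var fileStream = File.Create(localFile))
                ftpStream.CopyTo(fileStream);
        }
    }

Parsing FTP listings is notoriously server-dependent, which is one more reason the answer above recommends a ready-made FTP client.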

Check site for my site files

Is there a program that crawls a specified website and will report whether there is a reference to another website? I have images, video files, PDFs, etc. that I need to give to another developer to finish the port over to their new server.
I just transferred an old site to another person and they are still using my files. I don't know 100% where all the files are, and I want to be sure which files I need to give them. It would be nice to have something like linkchecker that can crawl and, if there is a reference to a website root (e.g. sub.domain.com), report information about it (what page, what the URL is).
I don't want to block the site at this point from using the files so that is out.
I'm on a Mac so any terminal program would be just fine.
You could try SiteSucker, which can be used to download all the files used on a site (and any it links to, depending on the settings). It's OS X (and iPhone) donation-ware, so that might be just what you're looking for. I believe it creates a log file of the files it downloads, so you could send that if you just want to send the URLs to your colleague instead of the actual files.
You could check out wget. It can recursively (-r option) download a website and save its content to your hard disk. It usually (i.e. if not specified otherwise) downloads everything into directories named after the host.
But be careful not to download the whole internet recursively ;) So be sure to specify the correct --domains or --exclude-domains options.

How to download a whole Sharepoint site?

I hope someone has met this need before. I have quite a bunch of documents in a SharePoint site, and I want to download all the docs as a whole instead of one by one. I have tried Teleport Pro, but it just gave an HTTP 401 Unauthorized error. Is there any way to download the whole SharePoint document-sharing site?
Many thanks.
If you have WebDAV enabled, you can just open your SharePoint site as a network folder and copy-paste the documents onto your local hard drive.
http://support.microsoft.com/kb/841215
http://hosting.intermedia.net/support/kb/default.asp?id=1603
http://insomniacgeek.com/blog/sharepoint-open-with-windows-explorer-on-windows-server-2008/
You can use DMS-Shuttle for SharePoint for this purpose. With one drag & drop (or Ctrl+C, Ctrl+V) you can download a document library or the whole site with all subsites and document libraries. You can define different filters (by modified date, size, or file extension). There is a trial version available.
