Downloading all folders and their contents (files) from a website and uploading to another site? - c#-4.0

I am working on a problem where I have to log in to a website (through a form on the webpage). The page displays many folders, and these folders contain many files. Do I understand correctly that these folders are not the same as folders on a PC, which have a physical location? The folders on the website are just links that open a list of files when clicked.
So I am struggling to write code that can log in to the site, download all the folders and their contents (files), and arrange them on the PC in the same hierarchy in which they are arranged on the website. I am thinking of using HttpWebRequest to log in to the site, but I have no idea how to download the folders and their contents in the same structure as on the website.
Can anyone help me develop this code? I am using C# with .NET 4.0.

Before reinventing the wheel, why not ask your hosting provider about FTP access to the site? If you have that, the problem is solved: use one of the numerous FTP clients to download all the website content in one click.
Good luck
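If FTP access is available, the download can also be scripted in C#/.NET 4.0 with FtpWebRequest instead of a GUI client. A minimal recursive-mirror sketch; the host, credentials, and the assumption that the server returns Unix-style directory listings (where directories start with "d") are all hypothetical:

```csharp
using System;
using System.IO;
using System.Net;

class FtpMirror
{
    // Hypothetical host and credentials -- replace with your own.
    static readonly NetworkCredential Creds = new NetworkCredential("user", "password");

    static void Main()
    {
        Download("ftp://example.com/site/", "mirror");
    }

    // Recursively mirror a remote FTP directory into localDir,
    // preserving the folder hierarchy.
    static void Download(string ftpDir, string localDir)
    {
        Directory.CreateDirectory(localDir);
        var req = (FtpWebRequest)WebRequest.Create(ftpDir);
        req.Method = WebRequestMethods.Ftp.ListDirectoryDetails;
        req.Credentials = Creds;
        using (var reader = new StreamReader(req.GetResponse().GetResponseStream()))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
            {
                bool isDir;
                string name;
                if (!TryParseListingLine(line, out isDir, out name)) continue;
                if (isDir)
                    Download(ftpDir + name + "/", Path.Combine(localDir, name));
                else
                    using (var wc = new WebClient { Credentials = Creds })
                        wc.DownloadFile(ftpDir + name, Path.Combine(localDir, name));
            }
        }
    }

    // Assumes a Unix-style listing line such as
    // "drwxr-xr-x 2 user group 4096 Jan 1 12:00 docs".
    public static bool TryParseListingLine(string line, out bool isDir, out string name)
    {
        isDir = line.StartsWith("d");
        name = line.Substring(line.LastIndexOf(' ') + 1);
        return name.Length > 0 && name != "." && name != "..";
    }
}
```

Listing formats vary between FTP servers, so the parsing here is the fragile part; a real FTP client library handles that for you.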

Related

Is there a way to integrate GitBook and SharePoint cleanly?

I have several books in GitBook and am bouncing users from the SharePoint-based intranet to documentation in GitBook. Is there a way to automatically embed the GitBook content into SharePoint so it looks integrated with the intranet?
I have successfully integrated GitBook into SharePoint. Initially I tried the answer provided above, but that rendered my GitBook inside a window within SharePoint, which looked bad to me.
Here is the way I accomplished it:
1. Using the GitBook CLI toolchain installed on a Linux computer, issue the gitbook build command.
2. Take the output of this command, a folder called _book, and upload its contents to your SharePoint documents folder. Take care to replicate the folder structure exactly. This is a bit tedious, since SharePoint doesn't allow you to upload folders (at least not my instance).
3. Rename every .html document in the _book folder to .aspx. This lets users visit a page when they click a link rather than downloading the page. If I'm not mistaken, I also had to change the links to my book's pages inside the index.aspx page from .html to .aspx as well.
4. Here comes the cool part: visit the link for the (now) index.aspx. Get the link by clicking the ... button next to the file in SharePoint. And bingo: SharePoint will serve your entire GitBook as a static site.
Hope this helps
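The rename-and-relink steps above can be scripted rather than done by hand. A sketch in C#; the assumption (as in the answer) is that internal links are plain relative references ending in ".html", so a naive string replace is enough:

```csharp
using System.IO;

class RenameForSharePoint
{
    // Rename every .html file under root to .aspx and rewrite
    // ".html" link targets inside each file to ".aspx".
    public static void Convert(string root)
    {
        foreach (var html in Directory.GetFiles(root, "*.html", SearchOption.AllDirectories))
        {
            var text = File.ReadAllText(html);
            // Naive rewrite: assumes internal links simply end in ".html".
            text = text.Replace(".html", ".aspx");
            File.WriteAllText(Path.ChangeExtension(html, ".aspx"), text);
            File.Delete(html);
        }
    }
}
```

Run it against the local _book folder before uploading, so SharePoint receives the .aspx files directly.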

How to make HTML templates I created downloadable for clients?

I have created templates for a company's website that I am working on. I have three different folders, each containing a different style. I am trying to find a tool where I can put the files and then send a link to my client so that he can see the templates I have created. Is there any tool out there where I can do this, besides a flash drive?
Try Dropbox. You can either create a folder and "share" it, or just send the client a link to the folder via email.

Stop people from browsing images directly on a website

I'm pretty much new to IIS and this is the problem that I'm trying to solve.
I have a legacy web app (compiled, so not easily changeable) that is running in IIS 7.
This web app displays images containing possibly sensitive personal data, but if the user views the source of the page they are exposed to the image URL, which can be opened directly in any browser.
What I think I need to do is remove the folder permissions (where the images live) to stop people browsing directly to them, then create another account that does have permission to view this folder, and associate this newly created account with the application pool that is running the site.
So my questions are: would this be the correct way of doing it? And if so, how would I actually achieve it (bearing in mind that I'm a complete novice with IIS and permissions)?
Any help would be much appreciated.
Thanks,
Craig
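For what it's worth, the permission change described in the question can be scripted with System.Security.AccessControl. This is a rough sketch, not a tested recipe: the folder path, the "IUSR" anonymous account, and the app-pool identity name are all assumptions about a typical IIS 7 setup.

```csharp
using System.IO;
using System.Security.AccessControl;

class LockDownImages
{
    static void Main()
    {
        const string folder = @"C:\inetpub\wwwroot\app\images"; // hypothetical path

        DirectorySecurity acl = Directory.GetAccessControl(folder);
        // Deny the anonymous IIS account direct read access...
        acl.AddAccessRule(new FileSystemAccessRule(
            "IUSR", FileSystemRights.Read, AccessControlType.Deny));
        // ...while allowing the app-pool identity, so the app itself
        // can still read and serve the images.
        acl.AddAccessRule(new FileSystemAccessRule(
            @"IIS APPPOOL\MyAppPool", FileSystemRights.Read, AccessControlType.Allow));
        Directory.SetAccessControl(folder, acl);
    }
}
```

The same rules can also be set by hand in Explorer's Security tab; the important part is pairing the deny on anonymous access with an allow for the identity the application pool runs as.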

Can you copy a website?

Can you copy a Composite C1 website? I would like to create a copy of an existing website as a new website.
I start by creating Site A. Then I want to copy it and create Site B.
For example: copy the pages, functions, data, content, layouts, css from website A to website B. The only difference between the two would be the name.
It would infringe copyrights and may get you sued, but yes, it's possible with a scraper, which basically fetches the whole site and downloads it to you. Such tools are used by Google and other search engines to build caches of sites.
Some examples:
http://www.grepsr.com/?adwords2&gclid=CIe4rrPF57cCFURcpQodASIAgg
http://info.kapowsoftware.com/WebScrapingDefinitiveGuide.html?pi_ad_id=11920224743&gclid=CPCfxbTF57cCFWNNpgodnCQAKQ
http://scrapy.org/
or just google "web scrapers"
If you own the site, however, and have FTP access, just copy the files to a folder called /b and it can become www.a.com/b, or set up an addon domain that points to /b and make the addon domain, say, www.b.com.
The answer to your question "can you copy a website?" is yes, you can.
Provided you have access to all the files and folders, it's no different than copying a bunch of folders on your computer to another folder.
So if you're using a shared host and everything is in your public_html folder, just put the whole website in one folder, copy it over to another folder, and then point your new domain at that folder through your hosting platform.
The process differs from host to host, but the actual answer to your question is: yes, you can copy a website from one folder to another.
If you have access to the files on the server, you can simply copy them to the other desired location.
But remember that you have to update links and other paths if they are absolute.
If you don't have access, you could use developer tools like Firebug (or F12 in Chrome or IE) and copy each file and the source code by hand. This approach is more time-consuming than the previous one, but at least it can be done.
Cheers
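The "copy everything, then fix the absolute paths" idea can be sketched in C#. The old and new base URLs are placeholders, and only .html/.htm/.css files get the rewrite; everything else is copied byte-for-byte:

```csharp
using System.IO;

class SiteCopier
{
    // Copy a whole site folder, rewriting absolute links
    // (oldBase -> newBase) inside text files along the way.
    public static void Copy(string src, string dest, string oldBase, string newBase)
    {
        Directory.CreateDirectory(dest);
        foreach (var dir in Directory.GetDirectories(src, "*", SearchOption.AllDirectories))
            Directory.CreateDirectory(Rebase(dir, src, dest));
        foreach (var file in Directory.GetFiles(src, "*", SearchOption.AllDirectories))
        {
            var target = Rebase(file, src, dest);
            var ext = Path.GetExtension(file).ToLowerInvariant();
            if (ext == ".html" || ext == ".htm" || ext == ".css")
                File.WriteAllText(target, File.ReadAllText(file).Replace(oldBase, newBase));
            else
                File.Copy(file, target, true);
        }
    }

    // Map a path under src to the corresponding path under dest.
    static string Rebase(string path, string src, string dest)
    {
        var relative = path.Substring(src.Length).TrimStart(Path.DirectorySeparatorChar, '/');
        return Path.Combine(dest, relative);
    }
}
```

A plain string replace on the base URL is naive (it won't catch protocol-relative or encoded links), but it covers the common case of hard-coded absolute URLs.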
As far as I know, the easiest way would be to use Internet Explorer's save-for-offline-viewing function (if it is still there); this will copy all the resources of the currently open web page and rewrite the HTML to use them. As for an entire website, I don't think it will be easy, for legal reasons.
If it's your own site, sure, why not! Who is there to stop you?
But if it's someone else's site, you of course have to worry about copyright, and most of the time the website uses server-side scripts, which are not downloadable.
You can duplicate a Composite C1 website by copying the entire file structure to a new folder and then updating the installation id in the file ~/App_Data/Composite/Configuration/InstallationInformation.xml (put in a new random GUID). Then point a new IIS site at this new folder.
If your site is using SQL Server as a backend you also need to create a copy of your database, create a new user account with dbo access for this database and update the connection string in ~/web.config.
If you wish to duplicate an entire page structure inside the existing instance of the CMS and share media files, templates, etc., this could be done, but no tooling is available; it would be a coding task.
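The installation-id step above can be automated once the file has been copied. A sketch; the element name InstallationId is an assumption about the XML layout of InstallationInformation.xml, so check your own copy of the file first:

```csharp
using System;
using System.Xml.Linq;

class NewInstallationId
{
    // Replace the installation id in InstallationInformation.xml
    // with a freshly generated random GUID.
    public static void Refresh(string xmlPath)
    {
        var doc = XDocument.Load(xmlPath);
        // Assumption: the id lives in an <InstallationId> element.
        doc.Root.Element("InstallationId").Value = Guid.NewGuid().ToString();
        doc.Save(xmlPath);
    }
}
```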
Copy the directory (the website's physical path) that the website points to and paste it somewhere else, then create a new website and point it at the copied directory.

Can I create a project from my web site's URL?

I am just starting with Aptana, and I don't have the original HTML files for my web site. Is there a way to import my whole web site as a project, or do I have to open each page in Aptana and save it under its original URL?
Thanks
I use Interarchy, which is a commercial Mac option. One open-source Windows program is HTTrack.
If your site isn't large, it's often feasible to go through page by page and save each one as source. You also need to save all the images and CSS files and reconstruct the folder directories, though it goes faster than you might think.
Good luck!
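When reconstructing the folder directories by hand (or with a small script), the local path for each saved file can be derived directly from its URL. A sketch of that mapping:

```csharp
using System;
using System.IO;

class UrlMapper
{
    // Map a page URL to a local file path that mirrors the
    // site's folder hierarchy under the given root directory.
    public static string LocalPathFor(string url, string root)
    {
        var uri = new Uri(url);
        var path = uri.AbsolutePath.TrimStart('/');
        // A bare site root or a trailing slash means a directory index page.
        if (path.Length == 0 || path.EndsWith("/"))
            path += "index.html";
        return Path.Combine(root, path.Replace('/', Path.DirectorySeparatorChar));
    }
}
```

For example, http://example.com/css/site.css lands in mirror/css/site.css, so relative links between the saved pages keep working.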
