Can I use README files in GitHub Pages?

I am working on a website to host on GitHub, and I want to add a README.md file to let people know what exactly it is.
So is it OK to do that, or would it end up in an error?

Yes, you can add a README.md file to any GitHub repo. GitHub offers extensive documentation on how this works.
Reading into your question more deeply, I think what you're really asking is: if you add a README to a project being hosted on GitHub Pages, will that affect how the hosted website is presented? The answer is no. In that case, read about how GitHub Pages hosting works.

Related

GitHub CLI Beta: How to view a README inside a folder

I recently stumbled across the GitHub CLI beta. It allows for viewing, cloning, forking, and creating repos all from the command line.
My question is the following: How can I view a readme that is inside a folder of a repo?
I can view this one:
bhristov96/example_repo/README.md
How can I view this one?
bhristov96/example_repo/src/README.md
The syntax to view the repo is the following:
gh repo view OWNER/REPO
Specifying anything further doesn't work:
gh repo view OWNER/REPO/FOLDER
Is there a different way to accomplish this, or is it just not supported yet?
Thank you for your time!
Looking at the current cli/cli command/repo_test.go # TestRepoView, I would say: not supported
You can see the repoView() implementation: it uses
readmeContent, _ := api.RepositoryReadme(apiClient, fullName)
And RepositoryReadme is just a call to the GitHub Repository "Get README" API
So for now it is limited to the official README (at the root of your repository).
But it would not be such a stretch to propose another function that would call the Get contents API with any path you want as a parameter.
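As a workaround in the meantime, you can call that contents endpoint directly from the CLI. This is a minimal sketch, assuming your gh build is recent enough to include the gh api subcommand and that jq and a base64 decoder are available; the repo and path are the ones from the question:
# Ask the "Get repository content" REST endpoint for the nested README;
# the response carries the file content base64-encoded, so decode it locally.
gh api repos/bhristov96/example_repo/contents/src/README.md | jq -r '.content' | base64 --decode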

CNAME for GitHub repository already taken, but the earlier repo was deleted?

About two years ago, I created a repository to host a website via GitHub Pages with a custom domain. As time went on, I wanted to discontinue that project, so I deleted the repository. I am currently trying to relaunch that website. However, whenever I go to the Settings page of my repository, scroll down to the GitHub Pages section, and try to enter my custom domain, I get the following warning:
Does anyone know how I can fix this? There is no other repo on GitHub that uses this CNAME, and I own the domain. When I go to the URL, it just shows GitHub's "404 - There isn't a GitHub Pages site here." page. Any help would be appreciated; in the meantime I am trying to contact GitHub Support.
EDIT: If it's any additional help, I have included the Advanced DNS info from NameCheap:

How to deploy a Node app to Azure if the app is buried in a repo subdirectory

I am trying to deploy a project to Azure via the "remote Git repo" method. But in my repo, the actual Node application is a few directories deep. As a result, Azure does not do anything when the repo is pushed.
Is there some way to configure the Azure website to run from a directory buried in the repo?
There's actually a very easy way: this scenario was anticipated by the Azure team, and it's simple to handle.
You create a text file at the root of your project called .deployment. In that file you add the following text:
[config]
project = mysubfolder
When you either Git deploy or use CI to deploy from source control, the entire repository is deployed, but the .deployment file tells Kudu (that's the engine that handles your website management) where the actual website (or node project) is.
You can find more info here.
Also, check out this post where I mention an alternative strategy for project repos in case that helps.
This isn't so much an Azure question as a Git question. What you really want to know is whether there is a way to clone only a sub-directory or branch of a project. From researching this just a couple of weeks ago, the best I could find were solutions for doing a sparse clone, which does allow you to restrict the files cloned (almost there) but does so within the entire project's directory structure (denied).
A couple of related SO questions & answers which you might find helpful:
How do I clone a subdirectory only of a Git repository?
(Short answer 'no')
Checkout subdirectories in Git?
(The answer describes the sparse-checkout ability; a rough sketch follows below.)
I would love to see if a Git guru might have a better answer based on updates to Git, etc.
Good luck with it - I ended up just putting my Node app in its own Git project, as that seemed the most straightforward approach overall, though not ideal.
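For reference, here is a rough sparse-checkout sketch. It assumes Git 2.25 or newer (which introduced the git sparse-checkout command); the URL, branch name, and path are placeholders, and note that the full history is still fetched, only the working tree is restricted:
# clone without populating the working tree, then restrict it to one subdirectory
git clone --no-checkout https://github.com/OWNER/REPO.git
cd REPO
git sparse-checkout init --cone
git sparse-checkout set path/to/node-app
git checkout main    # or whatever the default branch is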

Update a GitHub project wiki through the GitHub API

Is there a way a developer can automatically upload Doxygen documentation for his project hosted on GitHub through their API?
I didn't find anything on develop.github.com related to this. It would be nice if one could just SCP the files or something.
It's now possible to check out the wiki as a separate Git repository. You could clone the repository, add the pages to it, and push it. You can clone the repository from this URL:
git@github.com:user/project.wiki.git
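A minimal sketch of that workflow, assuming you have push access to the wiki and your generated pages are Markdown files (user/project and the file names are placeholders):
git clone git@github.com:user/project.wiki.git
cd project.wiki
cp /path/to/generated/New-Page.md .    # wiki page names come from the file names
git add New-Page.md
git commit -m "Add generated documentation page"
git push origin master    # the wiki's default branch is typically master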
There is no way, at this time, to access the GitHub wiki via the API. However, there is a much better solution already built into GitHub. Since Doxygen outputs static HTML pages, you can push them to the gh-pages branch of your project and access them at username.github.com/projectname.
For more information, see http://pages.github.com/.
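A rough sketch of that approach, assuming Doxygen has already written its HTML output to a local directory and that the gh-pages branch does not exist yet (all paths and names are placeholders):
cd /path/to/projectname
git checkout --orphan gh-pages    # start an empty branch for the generated site
git rm -rf .    # clear the working tree on this branch
cp -r /path/to/doxygen-output/html/* .    # copy in the generated pages
touch .nojekyll    # tell GitHub Pages not to run the files through Jekyll
git add .
git commit -m "Publish Doxygen documentation"
git push origin gh-pages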

Using hg repository as web site

This is somewhat related to my security question here. Is it a bad idea to use an hg / mercurial repository for a live website? If so, why?
Furthermore, we have dev, test and production installations of our website, like dev.example.com, test.example.com and www.example.com. If it's a bad idea to use a repository for a live/production website, would it be OK to use an hg repository for the dev and test sites?
I'm also concerned about ease of deployment. We have technical and less technical co-workers who will be working with the site. The technical people (software engineers) won't have any problem working with the command line or TortoiseHg. I'm more concerned about the less technical people (web designers). They won't be comfortable working on the command line, and may even find TortoiseHg daunting. These co-workers mostly upload .css files and images to the server. I'd like these files (at least the .css files) to be under version control, but I want this to be as transparent as possible for the non-technical team members.
What's the best way to achieve this?
Edit:
Our 'site' is actually a multi-site CMS setup with a main repository and several subrepositories. Mock-up of the repository structure:
/root [main repository containing core files and subrepositories]
/modules [modules subrepository]
/sites/global [subrepository for global .css and .php files]
/sites/site1 [site1 subrepository]
...
/sites/siteN [siteN subrepository]
Software engineers would work in the root, modules and sites/global repositories. Less technical people (web designers) would work only in the site1 ... siteN subrepositories.
Yes, it is a bad idea.
Do not have your repository as your website. It means that things that are checked in but not working will immediately be available. And it means that accidental check-ins (they happen) will be reflected live as well (i.e. documents that don't belong there, etc.).
I actually address this concept (source control as deployment) with a tool I've written (a few other companies are addressing this topic now as well, so you'll see it more often). Mine is for SVN at the moment, so it's not particularly relevant; I mention it only to show that I've considered this previously (not with a repository, though, but with a working copy; in that scenario the answer is the same: it's better to have a non-versioned "free" area as the website directory and automate, via a user action, the copying of the versioned data into that directory).
Many folks keep their sites in repositories, and so long as you don't have people live-editing the live-site you're fine. Have a staging/dev area where your non-revision control folks make their changes and then have someone more RCS-friendly do the commit-pull-merge-push cycle periodically.
So long as it's the conscious action of a judging human doing the staging-area -> production-repo push you're fine. You can even put a hook into the production clone that automatically does a 'hg update' of the working directory within that production clone, so that 'push' is all it takes to deploy.
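A minimal sketch of such a hook, run once inside the production clone on the web server (.hg/hgrc is the standard per-repository config file):
cat >> .hg/hgrc <<'EOF'
[hooks]
# after every incoming push, update the working directory to the new tip
changegroup = hg update
EOF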
That said, I think you're underestimating either your web team or TortoiseHg; they can get this.
Personally (I'm a team of 1), I quite like the idea of using source control as a live website, more so with hg than with svn.
The way I see it, you can load an entire site (add/remove files) with a single command,
which is much easier than "ftp/ssh this, delete that", etc.
If you are using Apache (and probably IIS as well), you can make a simple .htaccess file that will block all .hg files (or .svn if you are using svn):
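A minimal sketch of such a rule, assuming Apache with mod_alias enabled and .htaccess overrides allowed:
cat >> .htaccess <<'EOF'
# return 404 for anything under .hg or .svn so repository metadata is never served
RedirectMatch 404 /\.(hg|svn)(/|$)
EOF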
My preferred structure is:
The development site is on my local machine, running directly out of a repository (no security is really required here; do what you like and commit as required).
The staging/test machine is a separate box or VM running a recent copy of the live database
(I have a script to push committed changes to the staging server and run tests; see the sketch after this list).
The live machine
(open an SSH connection, push changes to the live server, test again; this can all be scripted reasonably easily, Google for examples).
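A rough sketch of the kind of deploy script described above (the host name, path, and test command are placeholders, not a real setup):
#!/bin/sh
set -e
# push committed changes to the staging clone, then update and test it over SSH
hg push ssh://deploy@staging.example.com//var/www/site
ssh deploy@staging.example.com 'cd /var/www/site && hg update && ./run_tests.sh'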
Because of the push/pull nature of hg, you can commit changes and test without the danger of pushing a broken build to the live website. Like you say in your comments, only specific people should have permission to push a version to the live site. (If it fails, you should easily be able to revert to the previous version via source control.)
Why not have a repo also be an active web server (for dev or test/QA environment anyway)?
Here's what I am trying to implement:
Developers have local test environments in which they can build and test their code
Developers make a clone of the dev environment on their local dev machine
Developers commit as often as they want to their local repo
When a chunk of work is done and tested, the developer pushes the working changesets to the dev repo
Changes would be merged and tested on Dev, then pushed to Test/QA, and so on.
BTW, we're using Mercurial. I believe this model would only work using a distributed source code management tool.
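A minimal sketch of that workflow in Mercurial commands (the repo URL and commit message are placeholders):
# clone the dev environment onto the local dev machine
hg clone ssh://hg@dev.example.com/site site
cd site
# ... build and test locally, committing as often as you like ...
hg commit -m "Implement feature X"
# once the chunk of work is done and tested, push the changesets to the dev repo
hg push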
