Create file in GitHub repository with PyGithub

I have the following code:
import github
token = "my gitHub token"
g = github.Github(token)
new_repo = g.get_user().create_repo("NewMyTestRepo")
print("New repo: ", new_repo)
new_repo.create_file("new_file.txt", "init commit", "file_content ------ ")
I have run this code, and this is the result:
New repo: Repository(full_name="myname/NewMyTestRepo")
Traceback (most recent call last):
...
File "/home/serega/PycharmProjects/GitProj/myvenv/lib/python3.5/site-packages/github/Requester.py", line 180, in __check
raise self.__createException(status, responseHeaders, output)
github.GithubException.UnknownObjectException: 404 {'message': 'Not Found', 'documentation_url': 'https://developer.github.com/v3'}
I think the problem may be with my token's scope; it has the repo scope. Nevertheless, I have managed to create a repo, so it seems it should also be allowed to commit a new file to that repo.
Regarding scopes, I saw this link: https://developer.github.com/v3/oauth/#scopes
And it states:
repo
Grants read/write access to code, commit statuses, repository
invitations, collaborators, and deployment statuses for public and
private repositories and organizations.
I would really appreciate it if somebody could clarify the required token scope, and what the problem could be.

repo scope is enough to create files in a repository. It would seem from this question that the problem is that your file path must have a leading slash:
new_repo.create_file("/new_file.txt", "init commit", "file_content ------ ")
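A minimal sketch of that fix, with a hypothetical helper (not part of PyGithub) that makes the normalization explicit; older PyGithub releases reportedly required the leading slash in the in-repo path:

```python
def repo_file_path(name):
    """Hypothetical helper: ensure the in-repo path starts with a slash,
    which older PyGithub versions required for Repository.create_file."""
    return name if name.startswith("/") else "/" + name

# With a Repository object such as new_repo from the question:
# new_repo.create_file(repo_file_path("new_file.txt"), "init commit", "file_content ------ ")
print(repo_file_path("new_file.txt"))  # -> /new_file.txt
```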

Related

Terraform gitlab provider changes commit message

Is it expected behavior that the GitLab provider in Terraform adds [DELETE] when the previous commit message was changed in your .tf code?
For example I had a tf file with
resource "gitlab_repository_file" "this" {
  project        = gitlab_project.foo.id
  file_path      = "meow.txt"
  branch         = "main"
  content        = base64encode("hello world")
  author_email   = "meow@catnip.com"
  author_name    = "Meow Meowington"
  commit_message = "feature: add meow file"
}
Then changed it to
commit_message = "[ci skip] terraform templating commit\n\nJob URL: ${local.gitlab_configuration_details.pipeline_job_url}"
After the change my commit message on gitlab was [DELETE]: feature: add meow file
If this is the expected behavior, is there any way to prevent the provider from altering the commit message?
Because after the change I expected the commit message on Git to read as "[ci skip] terraform templating commit\n\nJob URL: https:url.com"
Thanks!
After further investigation, it turns out the [DELETE] prefix gets inserted if you are deleting files. If you just make code changes it will not edit your commit message; this only happens when files are deleted.

How to change permissions or call a command without having to change permissions in an ONLINE Python environment?

I tried to make my bot create a directory inside one of the bot's directories to store server data.
I originally just wanted to create the directory without worrying about permissions:
@client.event
async def on_guild_join(guild):
    print(f'Recognized that Beatboxer has joined {guild.name}')
    guild_path = rf'/guilds/{guild.id}'
    if not os.path.exists(guild_path):
        os.makedirs(rf'/guilds/{guild.id}')
An error message came up that looked like this:
Ignoring exception in on_guild_join
Traceback (most recent call last):
File "/opt/virtualenvs/python3/lib/python3.8/site-packages/discord/client.py", line 312, in _run_event
await coro(*args, **kwargs)
File "main.py", line 227, in on_guild_join
os.makedirs(rf'guilds/{guild.id}')
File "/usr/lib/python3.8/os.py", line 223, in makedirs
mkdir(name, mode)
PermissionError: [Errno 13] Permission denied: 'guilds/727168023101964298'
I then tried adding os.chmod to the code but, for some reason, still got the same error message.
os.chmod("guilds", 777)

@client.event
async def on_guild_join(guild):
    print(f'Recognized that Beatboxer has joined {guild.name}')
    guild_path = rf'/guilds/{guild.id}'
    if not os.path.exists(guild_path):
        os.makedirs(rf'/guilds/{guild.id}')
Also, calling os.chdir to change into that directory first did not work either, and gave a similar error message.
os.chmod("guilds", 777)

@client.event
async def on_guild_join(guild):
    print(f'Recognized that Beatboxer has joined {guild.name}')
    guild_path = rf'/guilds/{guild.id}'
    if not os.path.exists(guild_path):
        os.chdir('/guilds')
        os.makedirs(rf'{guild.id}')
Finally, I attempted one last thing (which still didn't work): os.popen, which opens a pipe to a command so its output can be passed to a file editable by other programs (and which should therefore, I assumed, not be affected by permissions):
@client.event
async def on_guild_join(guild):
    print(f'Recognized that Beatboxer has joined {guild.name}')
    guild_path = rf'/guilds/{guild.id}'
    if not os.path.exists(guild_path):
        os.popen(os.makedirs(rf'/guilds/{guild.id}'))
All of these attempts produce very similar error messages, all with Errno 13. Changing the computer's configuration will most likely not work. Please help! Thank you!
The problem is not that the functions you use are wrong; you are giving a directory location you don't have permission to write to.
There are two kinds of paths to folders: absolute paths and relative paths. In the examples you gave, you used absolute paths. When you use an absolute path, the root directory is your starting point. The thing with online IDEs is that you often don't have direct access to the root directory, so creating new directories there gives permission errors.
So how do we fix this? I suggest using relative paths instead. Fixing this in your code is really easy; instead of this:
'/path/to/folder'
do this:
'./path/to/folder'
By using ./ instead of /, you use the current folder as your starting point instead of the root directory. As you usually have access to the current folder, this won't give permission errors.
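A minimal runnable sketch of the difference (the directory names are just placeholders, and `tempfile` stands in for wherever the bot happens to run):

```python
import os
import tempfile

# Stand-in for the bot's working directory in the online IDE.
os.chdir(tempfile.mkdtemp())

guild_id = 727168023101964298          # example ID from the traceback
guild_path = f"./guilds/{guild_id}"    # relative: resolved against the CWD

# exist_ok=True also makes a separate os.path.exists() check unnecessary.
os.makedirs(guild_path, exist_ok=True)
print(os.path.isdir(guild_path))       # -> True
```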

GCF Node10 deploy failed: "Function failed on loading user code. Error message: Provided code is not a loadable module."

After making some adjustments (a rather big PR), which basically add a Google Cloud Storage connection to this function, deployment starts to fail. Unfortunately, the error message is pretty unclear and therefore doesn't give me much of a hint. Locally and in tests things run fine, so I'm a bit lost right now about which direction to search. The logs don't provide insights either.
I can't easily share the changes in the PR, unfortunately. Worst case, I'll revert and go piece by piece from there, but that's a tedious process.
The service account used in the deployment has been given (write) access to the bucket involved, but I don't think this error hints at permissions either; otherwise I would expect the error message to be more insightful.
Command used:
gcloud beta functions deploy eventStreamPostEvent --runtime nodejs10 --memory 128MB --trigger-http --source ./dist --service-account $DEPLOY_SERVICE_ACCOUNT --verbosity debug
Deploying function (may take a while - up to 2 minutes)...
..............................failed.
DEBUG: (gcloud.beta.functions.deploy) OperationError: code=3, message=Function failed on loading user code. Error message: Provided code is not a loadable module.
Could not load the function, shutting down.
Traceback (most recent call last):
File "/usr/lib/google-cloud-sdk/lib/googlecloudsdk/calliope/cli.py", line 985, in Execute
resources = calliope_command.Run(cli=self, args=args)
File "/usr/lib/google-cloud-sdk/lib/googlecloudsdk/calliope/backend.py", line 795, in Run
resources = command_instance.Run(args)
File "/usr/lib/google-cloud-sdk/lib/surface/functions/deploy.py", line 231, in Run
enable_vpc_connector=True)
File "/usr/lib/google-cloud-sdk/lib/surface/functions/deploy.py", line 175, in _Run
return api_util.PatchFunction(function, updated_fields)
File "/usr/lib/google-cloud-sdk/lib/googlecloudsdk/api_lib/functions/util.py", line 300, in CatchHTTPErrorRaiseHTTPExceptionFn
return func(*args, **kwargs)
File "/usr/lib/google-cloud-sdk/lib/googlecloudsdk/api_lib/functions/util.py", line 356, in PatchFunction
operations.Wait(op, messages, client, _DEPLOY_WAIT_NOTICE)
File "/usr/lib/google-cloud-sdk/lib/googlecloudsdk/api_lib/functions/operations.py", line 126, in Wait
_WaitForOperation(client, request, notice)
File "/usr/lib/google-cloud-sdk/lib/googlecloudsdk/api_lib/functions/operations.py", line 101, in _WaitForOperation
sleep_ms=SLEEP_MS)
File "/usr/lib/google-cloud-sdk/lib/googlecloudsdk/core/util/retry.py", line 219, in RetryOnResult
result = func(*args, **kwargs)
File "/usr/lib/google-cloud-sdk/lib/googlecloudsdk/api_lib/functions/operations.py", line 65, in _GetOperationStatus
raise exceptions.FunctionsError(OperationErrorToString(op.error))
FunctionsError: OperationError: code=3, message=Function failed on loading user code. Error message: Provided code is not a loadable module.
Could not load the function, shutting down.
ERROR: (gcloud.beta.functions.deploy) OperationError: code=3, message=Function failed on loading user code. Error message: Provided code is not a loadable module.
Could not load the function, shutting down.
I hope anyone knows what is causing this error.
Stackdriver logs show me nothing more than:
protoPayload: {
  #type: "type.googleapis.com/google.cloud.audit.AuditLog"
  authenticationInfo: {…}
  methodName: "google.cloud.functions.v1.CloudFunctionsService.UpdateFunction"
  requestMetadata: {
    destinationAttributes: {…}
    requestAttributes: {…}
  }
  resourceName: "projects/<projectName>/locations/europe-west1/functions/eventStreamPostEvent"
  serviceName: "cloudfunctions.googleapis.com"
  status: {
    code: 3
    message: "INVALID_ARGUMENT"
  }
}
I had the same issue, and it seems the message comes from here.
When you have multiple .js files with some subfolders in the root folder of your function, then by default, without any further specification, you need to name the entry module index.js or function.js.
I found that out by deploying the function with the node8 runtime, where the error messages are clearer.
Usually (or at least for me) the cause of OperationError: code=3 is an error in importing the modules you have defined.
Fixed this by:
deleting node_modules
rm -r .\node_modules\
optional: you can do npm i after deleting node_modules and test your function locally before deploying.
then deleting .gcloudignore and deploying as usual.
For me, the problem was caused by having installed one of my node modules in the wrong directory (.., one level up). Make sure all the node_modules you need are in the right place. This can easily happen if you have multiple functions in subfolders.
Your source code must contain an entry point function that has been correctly specified in your deployment, either via Cloud console or Cloud SDK.
Source: https://cloud.google.com/functions/docs/troubleshooting#entry-point

Edit github gist through Python with pyGithub lib

This is my first question on Stack Overflow, so be kind if I am off topic or imprecise, and help me improve for next time.
I am trying to modify an existing GitHub Gist with Python 3 using PyGithub.
I created an API token and authentication works fine, but I am struggling to edit the Gist. I could not find an appropriate example that made it clear to me.
Here is my code:
from github import Github
g = Github("XXX")
test2 = {"description": "the description for this gist",
         "files": {"filter": {"content": "updated file contents"},
                   "Task": {"filename": "new_name.txt",
                            "content": "modified content"},
                   "new_file.txt": {"content": "a new file"}}}
g.get_gist(id="b2c5668fefe1f2e80252aabf4ef4e96c").edit(test2)
This is the error message I am getting:
Traceback (most recent call last):
File "gist.py", line 15, in <module>
g.get_gist(id="b2c5668fefe1f2e80252aabf4ef4e96c").edit(test2)
File "/Users/DSpreitz/ogn-silentwings/venv/lib/python3.6/site-packages/github/Gist.py", line 249, in edit
assert description is github.GithubObject.NotSet or isinstance(description, str), description
AssertionError: {'description': 'the description for this gist', 'files': {'filter': {'content': 'updated file contents'}}}
I found some description of the pygithub lib here:
pyGithub Docu
This is the Gist I am trying to modify: Gist
Any help to solve this problem is greatly appreciated.
Dominic
The main issue with this code is that it's passing a dictionary to Gist.edit. Gist.edit accepts keyword arguments.
PyGithub's documentation says:
edit(description=NotSet, files=NotSet)
so it should be called as gist.edit(description="new description", files=...). Regarding files, the same documentation says:
files – dict of string to github.InputFileContent.InputFileContent
so the files parameter could look like:
{"foo.txt": github.InputFileContent(content="bar")}
Summarized:
import github
token = "..." # https://github.com/settings/tokens
gh = github.Github(token)
gist = gh.get_gist("f04c4b19919c750602f4d0c5f7feacbf")
gist.edit(
    description="new description",
    files={"foo.txt": github.InputFileContent(content="bar")},
)
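The keyword-argument point can be seen without the library; this toy function (an illustration, not PyGithub's actual code) mimics the assertion from the traceback:

```python
NotSet = object()

def edit(description=NotSet, files=NotSet):
    """Toy stand-in for Gist.edit: description must be a str if given."""
    assert description is NotSet or isinstance(description, str), description
    return description, files

params = {"description": "new description", "files": {"foo.txt": "bar"}}

# Passing the dict positionally puts the whole dict in `description`,
# which triggers the AssertionError, just like in the question.
try:
    edit(params)
    failed = False
except AssertionError:
    failed = True

# Unpacking the dict maps its keys onto the keyword arguments -> works.
desc, files = edit(**params)
print(failed, desc)  # -> True new description
```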
If using the PyGithub lib is NOT a hard constraint, then I'd suggest using gifc, a gist client also written in Python. In your case, editing or updating the gist can be done as follows after installing it via pip (after cloning):
Update a gist
Edit all (or some) files iteratively
gifc update ffd2f4a482684f56bf33c8726cc6ae63 -i vi
You can get the gist id from the get method from earlier
Change description
gifc update ffd2f4a482684f56bf33c8726cc6ae63 -cd "New description"
Edit contents of a file interactively in an editor like nano, vim or gedit
gifc update ffd2f4a482684f56bf33c8726cc6ae63 -f file_to_update.md
Do both
gifc update ffd2f4a482684f56bf33c8726cc6ae63 -f file_to_update.md -cd "New description"

gitolite + gitweb | 'repo #all R = gitweb' not working

For some reason the following gitolite.conf does not add any repository to projects.list.
When I set 'R = gitweb' for each repository manually, they get added to projects.list.
[....]
repo aaa
repo bbb

repo @all
    RW+ = @admins
    R   = gitweb
[...]
Any hints for me? I'd really like to allow gitweb access to all repositories and then remove permissions for individual repositories via '- = gitweb' ...
I don't actually need gitweb rules or a complete projects.list in my gitweb setup:
I only make sure I have a gitweb.conf.pl which:
will be called by gitweb (through the gitweb_config.perl file, which is called if gitweb detects that it exists)
will call gitolite to see if access to a repo can be granted or should be denied.
I just ran into a similar problem, but the resolution was different:
In gitolite3, it seems that if a repository simply has a gitweb.* property set, then it is gitweb-enabled:
repo foobar
    desc     = "Foobar repository"
    category = "foobar"
    RW+ = myself
Or if you prefer:
repo foobar
    config gitweb.description = "Foobar repository"
    config gitweb.category    = "foobar"
    RW+ = myself
I don't know if it works with @all, like:
repo @all
    category = "uncategorized"
But since a description or (valid) category is not a bad thing to have, I'd say it works for me.
On the other hand, I also tried making an @almost-all group with all my repositories except gitolite-admin, but I don't know whether that works on its own or only because of the gitweb.description/category config.
