How to start the pyscaffold project?
I use this command to create the project putup sampleProject
but i don't know how to start this project?
You don't start a PyScaffold project per se -- its goal is simply to create the files and folders that you will commonly need for your project. See my structure below from "putup MyTestProject". Look at all the nice stuff already created that you now don't have to do by hand.
To get started, you need to start adding packages/code to "src/mytestproject" and run that code like you normally would.
Might I recommend the use of a good IDE, such as PyCharm? I think you will find it makes starting your journey much easier.
A second recommendation -- if you are just getting started, you might skip pyscaffold for now. While a great tool, it might add confusion that you don't need right now.
MyTestProject/
├── AUTHORS.rst
├── CHANGELOG.rst
├── docs
│   ├── authors.rst
│   ├── changelog.rst
│   ├── conf.py
│   ├── index.rst
│   ├── license.rst
│   ├── Makefile
│   └── _static
├── LICENSE.txt
├── README.rst
├── requirements.txt
├── setup.cfg
├── setup.py
├── src
│   └── mytestproject
│       ├── __init__.py
│       └── skeleton.py
└── tests
    ├── conftest.py
    └── test_skeleton.py
[Edit]
With respect to why "python skeleton.py" gives an output: the library is simply providing an example to show the user where to start adding code, and how that code relates to the tests (test_skeleton.py). The intent is that skeleton.py will be erased and replaced with your own code structure. This may be a few .py files, or packages and sub-packages containing .py files. Read it this way: "Your code goes here ... and here is an arbitrary example to get you started."
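For illustration, a minimal replacement might look like this (the module name calculator.py, the add function, and the matching test are made-up names for this example, not anything PyScaffold generates for you):

src/mytestproject/calculator.py:

def add(a, b):
    """Return the sum of two numbers."""
    return a + b

tests/test_calculator.py:

from mytestproject.calculator import add

def test_add():
    assert add(2, 3) == 5

The point is simply that the code under src/ and the tests under tests/ mirror each other, just as skeleton.py and test_skeleton.py do.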
But you have to ask yourself what you are trying to accomplish. If you are just creating a few scripts for yourself -- for nobody else in the world to see -- do you need the additional stuff (docs, setup, licensing, etc.)? If the answer is no, don't use pyscaffold; just create your scripts in a venv and be on your way. This scaffolding is meant to give you most of what you need to create a full, GitHub-worthy project to potentially share with the world. Based on what I gather your Python experience to be, I don't think you want to use pyscaffold.
But specific to your question: were I starting with pyscaffold, I would erase skeleton.py, replace it with "mytester.py", use the begins library to parse my incoming command-line arguments, then write individual methods to respond to my command-line calls.
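As a rough sketch of that idea (using argparse from the standard library instead of begins, and with mytester.py and the greet command purely as assumptions for the example):

# mytester.py -- minimal command-line entry point replacing skeleton.py
import argparse

def greet(name):
    """Respond to the 'greet' sub-command."""
    print(f"Hello, {name}!")

def main():
    parser = argparse.ArgumentParser(description="Example command-line entry point")
    subparsers = parser.add_subparsers(dest="command", required=True)

    greet_parser = subparsers.add_parser("greet", help="Print a greeting")
    greet_parser.add_argument("name")

    args = parser.parse_args()
    if args.command == "greet":
        greet(args.name)

if __name__ == "__main__":
    main()

Running "python mytester.py greet World" would then print the greeting; each new sub-command gets its own method.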
I am reading other people's code and found extensions.py in their package.
I can see that the modules imported in extensions.py are imported in __init__.py as well.
I could not figure out how extensions.py works with __init__.py, and in what situations you need to use extensions.py.
Could anyone give me some explanation or provide a link that explains it?
In __init__.py:
from flask_app.extensions import cors, guard
In extensions.py:
from flask_cors import CORS  # needed for CORS() below; missing in the original snippet
from flask_praetorian import Praetorian

cors = CORS()
guard = Praetorian()
According to Python's packaging tutorial, this is the minimal structure:
packaging_tutorial/
├── LICENSE
├── pyproject.toml
├── README.md
├── setup.cfg
├── src/
│   └── example_package/
│       ├── __init__.py
│       └── example.py
└── tests/
Seems like it's just a backup for installing requirements, or there for readability. Maybe mention where you found extensions.py, and I can take a deeper look.
You could also dig deeper into the docs and see exactly what flask-praetorian does.
I think it's just a backup for installing things like requirements.
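For what it's worth, a common reason for this split (this is an assumption about the code you are reading, based on the typical Flask app-factory layout, and not visible in the snippets above) is that __init__.py creates the app and binds the shared extension objects to it, while keeping the objects themselves in extensions.py avoids circular imports:

# extensions.py -- the one place where extension objects are created
from flask_cors import CORS
from flask_praetorian import Praetorian

cors = CORS()
guard = Praetorian()

# __init__.py -- the app factory imports them and binds them to the app
from flask import Flask
from flask_app.extensions import cors, guard

def create_app():
    app = Flask(__name__)
    cors.init_app(app)
    # guard.init_app(app, User)  # flask-praetorian is usually bound here too,
    #                            # passing the user model it authenticates against
    return app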
We have 3 different REST APIs, namely A, B and C, and they all access the same MySQL database.
Our stack is NodeJS + ExpressJS + Sequelize ORM + MySQL.
Right now, if I want to create a new db model, I need to create it in API A and then
communicate with the other developers who are working on APIs B and C so they can copy and paste the new model into their projects' models folder.
This is very inefficient and causes many mistakes along the way.
So instead of doing this manually, can we automate this task with a new repo in Bitbucket?
The idea is to create a repo in Bitbucket and somehow refer to that models folder in all 3 projects, instead of keeping a models folder in each and every project.
How do I achieve this using NodeJS, ExpressJS and Bitbucket?
I am assuming that by API A, B and C you are referring to completely different projects here. If that's the case, then I can suggest you use Git submodules. But having used submodules extensively, I would suggest this only if it is inevitable.
Project structure that I usually work on:
.(Git Root)
├── logs
├── resources
├── schema
│   └── <different-entities>
├── src
│   ├── config
│   ├── controllers
│   ├── jobs
│   │   ├── emails
│   │   └── notifications
│   ├── locales
│   ├── middlewares
│   ├── migrations
│   ├── models (You need to have a git submodule here)
│   ├── public
│   ├── seeders
│   ├── services
│   │   ├── entities
│   │   └── factories
│   ├── transformers
│   ├── types
│   ├── types-override
│   ├── util
│   └── validators
│       ├── keywords
│       └── <different-entities>
├── storage
│   ├── <date>
├── stubs
└── temp_files
This sounds easy, but keep these things in mind:
First: if your existing project already has the models directory in its git history, you cannot create a submodule on that directory (at least not in an easy way; the way I did it was to rename models to shared-models).
Second: there will now be 2 git repositories:
A. the git repository containing all the model files (you will never open this git repo in your IDE)
B. the main git repository of your project.
So there will be unnecessary merge conflicts almost every time you merge branches in your main repository, because the main repository only keeps track of the commit hash your models repository should be on at that moment. Any new commit (irrespective of whether it is even a fast-forward) will be treated as a merge conflict.
Third, and the drawback that's most tiring: suppose you just have to make a change in a model file and nothing else in the project changes (yes, that's infrequent but possible, e.g. adding another enum value to a status key). To achieve this you will have to make two commits: first one in the models repository, which stores the actual changes, then one in the main repository, which stores the new commit hash, and then push the two commits to two different repositories.
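Concretely, that two-commit cycle looks roughly like this (the file name, branch and paths are placeholders for the example):

cd my-project/src/shared-models        # inside the submodule checkout
git add status.model.js                # the model file you just edited
git commit -m "Add new enum value to status"
git push origin master                 # first push: the models repository

cd ../..                               # back to the main repository
git add src/shared-models              # records the new submodule commit hash
git commit -m "Bump shared-models"
git push origin master                 # second push: the main repository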
If I have not lost you already and you feel this is the right approach for your use case (I can't think of anything better, though), here is how to set it up:
1. Copy your complete models directory to a new place (let's say your desktop).
2. Run git init inside ~/Desktop/models.
3. Push it to a separate Bitbucket repo (I usually name it <project>-models-<lang>, e.g. facebook-models-node).
4. Come back to your main project A.
5. Remove the models directory there.
6. Run: git submodule add <HTTPS/SSH URL of bitbucket> src/shared-models
7. Replace imports of models from src/models to src/shared-models.
Repeat steps 5-7 for the other projects too.
Official Git submodules documentation: https://git-scm.com/book/en/v2/Git-Tools-Submodules
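For reference, after step 6 git records the submodule in a .gitmodules file at the root of the main repository; it would look something like this (the URL is just a placeholder):

[submodule "src/shared-models"]
    path = src/shared-models
    url = https://bitbucket.org/your-team/facebook-models-node.git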
I want to create a library in TypeScript that I can share via npm. Specifically, I want to use webpack to generate a js bundle along with a definition file to share the types with the js. So I'd have a tree of files like:
├── lib
│   ├── lib.d.ts
│   └── lib.min.js
├── test
...
├── ts
│   ├── errors
│   │   ├── CannotModifyAlteredObject.ts
│   ├── Lib.ts
│   ├── PostProcessors.ts
│   ├── Serializers.ts
├── tsconfig.json
├── typings.json
├── LICENSE
├── package.json
├── README.md
└── webpack.lib.config.js
And all the types exported by ts/Lib.ts would be collected into a single .d.ts in the lib directory, sitting next to the js bundle.
I've looked at the following questions/sources:
Writing npm modules in typescript
How to create a typescript library (and the question it duplicates)
This unanswered question
The official TypeScript guide to creating packages
This example typescript library project
And another SO question
However, none of these provide an example using webpack. Being able to bundle everything you need to use the library (apart from the nodejs runtime) into a single file is pretty important for my use case, so webpack fits this role well. I'd like to be able to generate a .d.ts file that maps to what webpack creates. But I want to avoid creating the .d.ts file manually - it should be possible to automatically extract the types without manually created .d.ts files getting out of sync with my source code. Is there a way of doing this?
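For context, part of this can be done by the TypeScript compiler itself, independently of webpack: enabling declaration output in tsconfig.json (the outDir value below is just an assumption to match the lib layout above, alongside whatever files/include settings you already have):

{
  "compilerOptions": {
    "declaration": true,
    "outDir": "lib",
    "module": "commonjs"
  }
}

Note that this emits one .d.ts per source file rather than the single bundled declaration file asked about here, so it only covers the type-extraction half of the problem.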
I'm using a package.json file to define nodejs requirements, along with npm update, and of course it is working fine.
How can I manage (update the easy way) other third-party libraries with node? For example:
https://github.com/documentcloud/backbone.git
https://github.com/twitter/bootstrap.git
In a vendor folder.
Summary: I think you want to use http://twitter.github.com/bower/
Details:
There are two ways to understand your question:
how to manage/update non-npm code?
how to manage/update client-side javascript assets?
The question is worded as the former, but from your included examples I think what you want to ask is the latter.
In the case of server-side code, just insist that all code gets shipped with an npm-style package.json manifest. If the author of the code is unresponsive, fork it and add the manifest. There is no excuse.
For client-side code, the situation is different. There is no established standard for package management; however, it's a widely recognized problem and a very active field of development. Several challengers have risen recently, trying to grab the dominant position: BPM, Jam or Ender. It's your call which to pick; they are well summarized here: A package manager for web assets
However, all of the above address a slightly too ambitious problem - they try to sort out the transport of those modules to the browser (lazy loading in the require.js style, dependency resolution, concatenation/minification, etc.). This also makes them more difficult to use.
A new entrant in the field is Bower from Twitter. It focuses just on the download/update lifecycle in your vendor folder and ignores browser delivery. I like that approach. You should check it out.
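A rough sketch of that workflow, installing straight from the git URLs in your question (the exact target directory and manifest details depend on the Bower version and your configuration):

npm install -g bower                                        # Bower itself is distributed through npm
bower install git://github.com/documentcloud/backbone.git
bower install git://github.com/twitter/bootstrap.git
bower update                                                # later, pull newer versions of what you installed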
You could go for git submodules:
http://git-scm.com/book/en/Git-Tools-Submodules
Using someone else's repo as a Git Submodule on GitHub
[UPDATE 1]
Do this at the root of your repository:
git submodule add git://github.com/documentcloud/backbone.git vendors/backbone
git submodule add git://github.com/twitter/bootstrap.git vendors/bootstrap
Check this for more: http://skyl.org/log/post/skyl/2009/11/nested-git-repositories-with-github-using-submodule-in-three-minutes/
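To pull in upstream changes later, the usual follow-up commands are along these lines:

git submodule update --init                    # after a fresh clone, fetch the pinned commits
git submodule foreach git pull origin master   # move each submodule to its latest upstream commit
git commit -am "Update vendored submodules"    # record the new submodule hashes in your repository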
Although this may not be the nodejs way, and some purists may complain, Composer will do what you want. Even though Composer is used with PHP projects, there is no reason why it cannot be used to manage 3rd-party non-npm repos for nodejs projects too. Obviously it is preferable for the 3rd-party library to include a package.json, but that's not always going to happen. I tried this on my current nodejs app and it worked perfectly for me.
Pros:
One json file specifies custom external dependencies that do not have package.json files
Customize where packages are stored in the project folder
Works with private repos and repos not originally intended for use with nodejs
Cons:
Requires the PHP CLI
An extra step for updating dependencies
An extra json file for dependencies
Here's how to do it (you need to be able to run PHP from the CLI):
1. Download Composer (directly into your root nodejs project folder)
curl -s https://getcomposer.org/composer.phar > composer.phar
2. Create a composer.json file (in the root of the project)
{
    "repositories": [
        {
            "type": "package",
            "package": {
                "name": "twitter/bootstrap",
                "version": "2.0.0",
                "dist": {
                    "url": "https://github.com/twitter/bootstrap/zipball/master",
                    "type": "zip"
                },
                "source": {
                    "url": "https://github.com/twitter/bootstrap.git",
                    "type": "git",
                    "reference": "master"
                }
            }
        }
    ],
    "require": {
        "twitter/bootstrap": "2.0.0"
    }
}
3. Run the Composer update
php composer.phar update
This will download the package into the vendor folder as you requested:
├── vendor
│   ├── ...
│   ├── composer
│   │   └── installed.json
│   └── twitter
│       └── bootstrap
│           ├── LICENSE
│           ├── Makefile
│           ├── README.md
│           ├── docs
│           │   ├── assets
│           │   └── ...
│           ├── img
│           │   ├── glyphicons-halflings-white.png
│           │   └── glyphicons-halflings.png
│           ├── js
│           │   ├── bootstrap-affix.js
│           │   └── ...
│           ├── less
│           │   ├── accordion.less
│           │   └── ...
│           └── ...
There are no installation instructions for the Tabular plugin. I tried both copying the files into the correct folders and putting it under ~/.vim/bundle to let pathogen deal with it; in both cases I get the following error message when I load up vim (if it's of any concern, the message is repeated 6 times).
AddTabularPattern: Vim(runtime):E194: No alternate file name to substitute for '#': runtime autoload/tabular#ElementFormatPattern.vim
EDIT: some more information, if it will help diagnose the problem.
Here is where the files are stored in my ~/.vim/bundles/godlygeek-tabular-b7b4d87 folder (note that I obviously have not shown all the files):
.vim/
├── [drwxrwxr-x] bundle
│   ├── [drwxrwxr-x] godlygeek-tabular-b7b4d87
│   │   ├── [drwxrwxr-x] after
│   │   │   └── [drwxrwxr-x] plugin
│   │   │       └── [-rw-rw-r--] TabularMaps.vim
│   │   ├── [drwxrwxr-x] autoload
│   │   │   └── [-rw-rw-r--] tabular.vim
│   │   ├── [drwxrwxr-x] doc
│   │   │   └── [-rw-rw-r--] Tabular.txt
│   │   └── [drwxrwxr-x] plugin
│   │       └── [-rw-rw-r--] Tabular.vim
Could you tell us a bit more about your setup? With a diagram if possible?
The AddTabularPattern command is called exactly 6 times from after/plugin/TabularMaps.vim and declared in plugin/Tabular.vim. I don't see why it would trigger the expansion of #, though.
IIRC, Tabular comes with an after directory (which contains its own plugin directory), an autoload directory, a doc directory and a plugin directory.
So just copy the contents of those directories to their counterparts in $HOME/.vim/ (creating any directory that does not already exist) and you're good to go.
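Assuming the plugin is unpacked where your listing shows it (~/.vim/bundle/godlygeek-tabular-b7b4d87), that manual copy would look roughly like this:

cd ~/.vim/bundle/godlygeek-tabular-b7b4d87
mkdir -p ~/.vim/plugin ~/.vim/autoload ~/.vim/doc ~/.vim/after/plugin
cp plugin/Tabular.vim ~/.vim/plugin/
cp autoload/tabular.vim ~/.vim/autoload/
cp doc/Tabular.txt ~/.vim/doc/
cp after/plugin/TabularMaps.vim ~/.vim/after/plugin/
vim -c 'helptags ~/.vim/doc' -c 'q'   # rebuild the help index so :help tabular works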
I realize that this does not answer your question directly (I could not replicate the error), but you could try using Vundle to manage your Vim plugins. It serves a very similar purpose to pathogen, but with one important difference: it's entirely declarative.
If you used Vundle, installing Tabular would require you only to put this line into your .vimrc:
Bundle 'Tabular'
and then issue the :BundleInstall command, and that's it.
It also supports downloading the plugins from GitHub, so alternatively you could use:
Bundle 'godlygeek/tabular'
I switched from pathogen to Vundle a while ago and haven't looked back.
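For context, a minimal .vimrc using the Bundle-era Vundle API (assuming Vundle itself has already been cloned into ~/.vim/bundle/vundle) would look something like:

set nocompatible
filetype off
set rtp+=~/.vim/bundle/vundle/
call vundle#rc()

Bundle 'gmarik/vundle'
Bundle 'godlygeek/tabular'

filetype plugin indent on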