I want to run a Kedro pipeline in the base environment from a Jupyter notebook. I do it the following way:
%reload_kedro --env=base
session.run(pipeline_name='dpfm1')
Doing this, the %reload_kedro command raises the following error:
RuntimeError: Could not find the project configuration file 'pyproject.toml' in --env=base. If you have created
your project with Kedro version <0.17.0, make sure to update your project template. See
https://github.com/kedro-org/kedro/blob/main/RELEASE.md#migration-guide-from-kedro-016-to-kedro-0170 for how to
migrate your Kedro project.
However, I have installed kedro version 0.18.2:
>>>!kedro --version
kedro, version 0.18.2
What's the matter here?
@ilja This is mentioned in the RELEASE.md: if you have an old Kedro project, i.e. 0.16.x, there is no pyproject.toml file.
You may have Kedro 0.18.2 installed, but if it is an old project, there are migration steps you need to take, which are described in the RELEASE.md.
If it is a new project, it's likely that you are not providing the right path argument; Kedro needs to find the pyproject.toml for certain metadata and to determine where the project root is.
P.S. %reload_kedro path --env --extra_params is only supported since 0.18.3; previously it did not support any argument other than path, so you may need to upgrade your Kedro version.
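For illustration, a minimal sketch of both options (the project path below is a placeholder): on 0.18.3+ the magic accepts the extra arguments directly, while on 0.18.2 you can create the session yourself and pass the environment there:
# Kedro >= 0.18.3: the magic accepts a path plus options
%reload_kedro /path/to/project --env=base

# Kedro 0.18.2: the magic only takes a path, so set the environment
# by creating the session manually (assuming a standard project layout)
from kedro.framework.startup import bootstrap_project
from kedro.framework.session import KedroSession

bootstrap_project("/path/to/project")
with KedroSession.create(project_path="/path/to/project", env="base") as session:
    session.run(pipeline_name="dpfm1")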
Related
I currently have a Dockerfile which installs Python libraries in a container that I eventually use to execute code. For every release I need to add or update dependencies, which results in rebuilding the image.
The issue is that while rebuilding, many transitive dependencies create version conflicts that affect my functionality; for example, some library can pull in a new numpy version which can cause issues in the code.
How should I handle this problem? Should I create a new base image for every release and update it in the Dockerfile?
Edit: Caching does not help me, because the moment my requirements.txt file changes, a rebuild will happen.
Also, I can't specify versions for all libraries. Transitive dependencies are the challenge here.
This is not related to Docker. You can either pin the package versions in requirements.txt or use Poetry to manage the dependencies. Poetry uses a lock file which makes sure the proper version is installed for all the dependencies, including transitive ones.
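For illustration, a minimal sketch of the Poetry workflow (the package name is a placeholder); the committed poetry.lock pins every transitive dependency, so rebuilding the image installs exactly the same versions:
poetry init                   # create pyproject.toml
poetry add pandas             # poetry.lock records pandas and every transitive
                              # dependency (numpy included) at exact versions
poetry install                # reproducible install from the lock file
In the Dockerfile, copy pyproject.toml and poetry.lock into the image before running poetry install, so the build stays deterministic as long as the lock file is unchanged.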
I have a Django project forked from GitHub, but I don't know which version of Django is used in that project. How can I find the Django version of the project? I haven't installed Django on my PC.
You can check the installed Django version with
pip show django
or
pip3 show django
In your case, it is difficult to find the Django version since it is not mentioned in any file such as requirements.txt.
You can estimate the Django version used for an existing GitHub project from the "Latest commit on [Date]" shown by GitHub: from that date you can figure out which Django version was current at the time.
Note: you can also check the log file (if one is available).
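For instance, assuming you have the fork cloned locally, a quick way to read the latest commit date is:
git log -1 --format=%cd    # date of the most recent commit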
I found a solution on my own. At the top of the settings.py file, you can see an auto-generated multiline comment like this:
"""
Django settings for remedy_server project.
Generated by 'django-admin startproject' using Django 3.2.
For more information on this file, see
https://docs.djangoproject.com/en/3.2/topics/settings/
For the full list of settings and their values, see
https://docs.djangoproject.com/en/3.2/ref/settings/
"""
In that comment, the Django version of your project is mentioned.
To check the Django version (or any other package version), open the terminal or command prompt and type
pip show django
Or, from within Python:
import django
print(django.get_version())
which prints the version, for example:
3.0.7
I really like virtual environments in Python, where you can put a whole Python environment including the interpreter into a project directory. If you dig out an old project after years you can just activate the environment and you are ready to go - this is awesome.
What is the node.js way of doing this?
Usually you mark down the package versions and the Node.js version that your code supports in the appropriate package.json directives. This means that distributed versions of your project import the same modules. Locally this doesn't matter, as npm installs your packages in the project directory by default.
However, for managing your local Node versions more efficiently, a tool such as Node Version Manager will do the trick. NVM specifically supports a .nvmrc file in the project directory to mark down the Node version.
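A minimal sketch of both conventions (the version numbers are placeholders):
// package.json: declare the supported Node version and dependencies
{
  "engines": { "node": ">=18" },
  "dependencies": { "express": "^4.18.0" }
}

# .nvmrc: pin the Node version for this project directory
echo "18" > .nvmrc
nvm use        # reads .nvmrc and switches to that Node version
Committing package-lock.json on top of this gives you reproducible installs, which is about the closest Node equivalent to freezing a virtualenv.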
I am struggling with what is surely a wrong usage of Composer.
I set up this repository: https://github.com/alle/assets-merger
I forked the project and was just trying to make it a Kohana module, including all the dependencies.
Since it needs the YUI Compressor JAR, I was trying to make just that JAR file a dependency, and I ended up declaring it in the composer.json file (please have a look at it).
When I need to add my new package to a project, I add it to the require section as follows:
...
"alle/assets-merger": "dev-master",
...
But the (latest) composer update command says:
Loading composer repositories with package information
Updating dependencies (including require-dev)
Your requirements could not be resolved to an installable set of packages.
Problem 1
- Installation request for alle/assets-merger dev-develop -> satisfiable by alle/assets-merger[dev-develop].
- alle/assets-merger dev-develop requires yui/yuicompressor 2.4.8 -> no matching package found.
Potential causes:
- A typo in the package name
- The package is not available in a stable-enough version according to your minimum-stability setting see <https://groups.google.com/d/topic/composer-dev/_g3ASeIFlrc/discussion> for more details.
And my story ends here.
How should I configure my composer.json in the https://github.com/alle/assets-merger repository, in order to include it as a fully satisfied kohana-module in other projects?
Some things I notice in your composer.json:
There is a version of that CSS minifier available on Packagist which says it is just a copy of the original Google Code hosted files, but with Composer support: natxet/cssmin. It is version 3.0.2, but I think that shouldn't make a difference.
mrclay/minify is included twice in the packages with the same version. It also is available on Packagist. You will probably already be using that (version 2.2.0 is registered, and because you didn't turn off Packagist access, it will be generally available for install unless a version requirement or conflict prevents it).
You are trying to download a JAR file (which is a Java executable containing no PHP at all), but try to get PHP classmaps out of it. That will fail for sure.
You missed the big note in the Composer documentation saying that Composer cannot resolve repositories mentioned in sub-packages, only in the root package. That means that whatever repositories you mention in your alle/assets-merger package will not be used if you use that package anywhere else. You'd have to duplicate these repositories in every project that requires the package, in addition to requiring the package itself.
What this means is that you probably got away with mrclay/minify because it is available on Packagist, you might have added cssmin by accident as well, but you definitely did not add YUICompressor.
But you shouldn't add this in the first place, because it is not PHP software. You can, however, add post-install commands to your project. All your Composer integration does is download the JAR file, and you can do that with a post-install or post-update script instead. See the documentation here.
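A minimal sketch of that approach (the download URL and target path are assumptions; adjust both to your setup):
{
    "scripts": {
        "post-install-cmd": [
            "php -r \"copy('https://github.com/yui/yuicompressor/releases/download/v2.4.8/yuicompressor-2.4.8.jar', 'vendor/yuicompressor-2.4.8.jar');\""
        ],
        "post-update-cmd": [
            "php -r \"copy('https://github.com/yui/yuicompressor/releases/download/v2.4.8/yuicompressor-2.4.8.jar', 'vendor/yuicompressor-2.4.8.jar');\""
        ]
    }
}
This keeps the JAR out of Composer's package resolution entirely; it is simply fetched again after each install or update.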
I'm currently using the dojotoolkit and its build system.
I read the new build tutorial for 1.8 at http://dojotoolkit.org/documentation/tutorials/1.8/build/.
In the tutorial it mentions that you can speed up your build by using nodejs.
The build tool itself relies on Java (and, optionally, Node.js for even faster builds), so make sure that you have that installed as well.
But it fails to mention how to do this. Anyone know how this works?
I normally run it like this:
> node dojo/dojo.js load=build --profile myprofile.profile.js --release
This would build a release for the profile contained in myprofile.profile.js. It assumes you are in a directory that contains both dojo and util as sub-directories. It also assumes that the path to node is set correctly.
If node is not configured in the path variable, you will need to use the full path to node:
> <path to node here> dojo/dojo.js load=build --profile myprofile.profile.js --release
On Windows the path is normally C:\Program Files\nodejs\ but you might have to configure it as C:\PROGRA~1\nodejs\ to get it working.
Windows Notes:
The build scripts do not work with Node on Windows (except using Cygwin). If you are using Windows you can get it to work via the following patch:
Windows Patch
Use the attached node-win.patch file to edit the files util/build/main.js and util/build/transforms/writeOptimized.js. The patch has worked for me 100% of the time, and it is a simple matter of editing a few lines of code.
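For example, assuming the patch was generated relative to the repository root (the -p strip level may need adjusting), it can be applied like this:
patch -p0 < node-win.patch    # updates util/build/main.js and
                              # util/build/transforms/writeOptimized.js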
I've personally found the alternative to Node, using Rhino, useless. It always fails to detect the build paths correctly, no matter what I set basePath to. I would strongly advise using Node over Rhino as it is more reliable and easier to set up.
The build script util/buildscripts/build.sh checks if node is in your path and, if so, uses it.
This is currently not working under Windows (http://bugs.dojotoolkit.org/ticket/15413).