How to hook a tool to multiple hook points in CherryPy? - cherrypy

I have a custom tool that I have hooked to the before_finalize hook point. I want the same tool to also run at the 'after_error_response' hook point. Is there a way to achieve this?
This is how I am creating my tool:
class MyTool(cherrypy.Tool):
    def __init__(self):
        cherrypy.Tool.__init__(self, 'before_finalize',
                               self._do_something,
                               priority=100)

cherrypy.tools.mytool = MyTool()

You could parameterize the hook point and register the tool under different names:
cherrypy.tools.mytool = MyTool('before_finalize')
cherrypy.tools.mytool2 = MyTool('after_error_response')
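A minimal sketch of what the parameterized tool could look like (the _do_something body is a placeholder, not taken from the question):
import cherrypy

class MyTool(cherrypy.Tool):
    def __init__(self, point):
        # the hook point is passed in instead of being hard-coded
        cherrypy.Tool.__init__(self, point, self._do_something, priority=100)

    def _do_something(self):
        # placeholder callable
        cherrypy.log('mytool ran')

cherrypy.tools.mytool = MyTool('before_finalize')
cherrypy.tools.mytool2 = MyTool('after_error_response')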
To register an identical tool at multiple points, you'll have to override Tool._setup itself, replacing:
cherrypy.serving.request.hooks.attach(self._point, self.callable,
                                      priority=p, **conf)
with calls for each point.
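A sketch of that override, modelled on the stock Tool._setup; the _extra_points attribute and the _do_something body are illustrative additions, not part of the CherryPy Tool API:
import cherrypy

class MultiPointTool(cherrypy.Tool):
    # illustrative: extra hook points to attach the same callable to
    _extra_points = ('after_error_response',)

    def __init__(self):
        cherrypy.Tool.__init__(self, 'before_finalize',
                               self._do_something, priority=100)

    def _setup(self):
        # mirrors Tool._setup, but attaches at every listed point
        conf = self._merged_args()
        p = conf.pop('priority', None)
        if p is None:
            p = getattr(self.callable, 'priority', self._priority)
        for point in (self._point,) + self._extra_points:
            cherrypy.serving.request.hooks.attach(point, self.callable,
                                                  priority=p, **conf)

    def _do_something(self):
        # placeholder callable
        cherrypy.log('mytool ran')

cherrypy.tools.mytool = MultiPointTool()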

Related

Dynamically generate Flask-RESTPlus routes

I am trying to abstract away some of the route class logic (i.e. I am looking to dynamically generate routes). api.add_resource seemed like the right place to do this.
So this is what I am trying to do:
# app.py
from flask import Flask
from flask_restplus import Api, Resource, fields
from mylib import MyPost

# Define my model
json_model = api.schema_model(...)

api.add_resource(
    MyPost,
    '/acme',
    resource_class_kwargs={"json_model": json_model}
)
And then in mylib:
# mylib.py
def validate_endpoint(f):
    def wrapper(*args, **kwargs):
        return api.expect(json_fprint)(f(*args, **kwargs))
    return wrapper

class MyPost(Resource):
    def __init__(self, *args, **kwargs):
        # Passed in via api.add_resource
        self.api = args[0]
        self.json_model = kwargs['json_model']

    # I can't do this because I don't have access to 'api' here...
    # @api.expect(json_model)

    # So I am trying to make this work
    @validate_endpoint
    def post(self):
        return {"data": 'some data'}, 200
I don't have access to the global api object here, so I can't call @api.expect(json_model). But I do have access to api and json_model inside of the post method, which is why I am trying to create my own validate_endpoint decorator.
This does not work, though. Is what I am trying to do here even possible? Is there a better approach I should be taking?
Stop using flask-restplus. That's the most valuable answer I can give you (and anyone else).
Ownership is not there
Flask-restplus is a fork of flask-restful. Some engineers started developing features that suited them. The core guy has ghosted the project, so it has been officially forked again as Flask-RESTX.
Poorly designed
I used to love Flask when I was a yout'. I've realized since then that having a global request, application, and config that all magically update is not a good design. Their application-factory pattern (to which flask-restplus conforms) is a style of statefully mutating the application object. First of all, it's hard to test. Second of all, it means that flask-restplus is wrapping the app and therefore all of the requests/handlers. How can anyone think that's a good thing? A library whose main feature is endpoint documentation has its filthy hands all over every one of my requests?? (By the way, this is what's leading to your problem above.) Because my post is serious and thoughtful, I'm skipping my thoughts on the Resource class pattern, as it would probably push me into the waters of ranting.
Random Feature Set
A good library has a single purpose and does that single thing well. Flask-restplus does 15 things (masking, swagger generation, Postman generation, marshaling, request arg validation). Some features you can't even tell are in the library's code by reading the docs.
My solution to your problem
If you want to document your code via function decorators and models, use a tool that does that alone and does it well, one that won't touch your handlers or affect your actual request decorators. Use oapispec for swagger generation. For the other features of flask-restplus you've got marshmallow for marshaling request/response data, pydantic for validating request objects and args, and so on.
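For instance, the request-validation piece on its own could look something like this with pydantic and plain Flask; this is a minimal sketch with assumed names (the AcmePayload model and its fields, the /acme route), not the poster's code:
from flask import Flask, request, jsonify
from pydantic import BaseModel, ValidationError

app = Flask(__name__)

class AcmePayload(BaseModel):
    # hypothetical request model
    name: str
    quantity: int = 1

@app.route('/acme', methods=['POST'])
def acme_post():
    try:
        # validate the JSON body against the model; no framework wrapping involved
        payload = AcmePayload(**request.get_json(force=True))
    except ValidationError as exc:
        return jsonify(exc.errors()), 400
    return jsonify({"name": payload.name, "quantity": payload.quantity}), 200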
By the way, I know all this because I had to build an API with it. After weeks of fighting the framework I forked it, ripped it apart, created oapispec, and trashed it from the project.

Nested Classes in Python 3

I am trying to create a nested class to perform a sum or multiplication of the arguments passed to each subclass.
The example below lets me perform the action within the class, however I am unable to find any documentation that would help me inherit the attributes from the parent class in the child.
Recently I came across an article which highlights that "nested classes can't access any members of their outer classes at compile-time." Is there a better way to pass values between classes? I tried using global variables, but I would like to avoid setting many global variables as I scale this logic to extract my entire datacenter's inventory, perform some calculations, and pass the results to another class.
class Class1:
    firstnumber = 0

    def __init__(self, arg):
        self.firstnumber = arg

    class Class2:
        def __init__(self, arg):
            self.secondnumber = arg

        def sumit(self):
            return Class1.firstnumber + Class1.Class2.secondnumber

print(Class1(5).firstnumber)
print(Class1(6).Class2(4).secondnumber)
print(Class1(4).Class2(10).sumit())
I would like to perform calculations with
Class1(variable1).Class2(variable2).Class3(variable3).sum() or
Class1(variable1).Class2(variable2).Class3(variable3).multiple(), and eventually be able to do the following:
Datacenter('DC1').GetServer('ServerName').GetStorageCapacity('NFS').Used()
Datacenter('DC1').GetServer('ServerName').GetStorageCapacity('NFS').Free()
http://momentaryfascinations.com/programming/bound.inner.classes.for.python.html
I may be wrong, but to my understanding anything you put between the class statement and the __init__ method becomes a class attribute shared by all instances. You shouldn't need to create separate classes for each number; create different instances of the same class.
class numbers:
    def __init__(self, arg):
        self.arg = arg

c1 = numbers(3)
c2 = numbers(5)
I don't know how you would add the arg variables together; maybe someone else can fill in what I'm missing.
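In the spirit of the bound-inner-classes article linked in the question, here is a minimal sketch (names assumed, not from either post) of one way to get the chained style Class1(x).Class2(y).sumit(): the outer instance hands itself to the object representing the next link in the chain, so each step can see the values that came before it.
class Class1:
    def __init__(self, firstnumber):
        self.firstnumber = firstnumber

    def Class2(self, secondnumber):
        # return a child object that remembers its parent
        return _Class2(self, secondnumber)

class _Class2:
    def __init__(self, parent, secondnumber):
        self.parent = parent
        self.secondnumber = secondnumber

    def sumit(self):
        return self.parent.firstnumber + self.secondnumber

    def multiplyit(self):
        return self.parent.firstnumber * self.secondnumber

print(Class1(4).Class2(10).sumit())      # 14
print(Class1(3).Class2(5).multiplyit())  # 15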

Correct way of updating an actor with changes applied to SetUserTransform

I have a poly actor with a vtkBoxWidget that is connected to a callback, as in the example from the docs:
def widget_callback(obj, event):
    t = vtk.vtkTransform()
    obj.GetTransform(t)
    obj.GetProp3D().SetUserTransform(t)
All works fine and I am able to move and transform the actor with the widget, but the transformation is applied to the UserTransform and not propagated down to the actor properties.
So if I call:
actor.GetPosition()
It returns the initial position prior to making the changes with the widget. And if I call:
actor.GetUserTransform().GetPosition()
I get the updated position relative to the starting point of the first interaction.
Do I have to connect it all through a vtkTransformPolyDataFilter, then update the input connection to the mapper and also calculate the coordinate-space offset, or is there a simpler way of doing it? In short:
What is the correct way of updating an actor with changes applied to SetUserTransform?
After trying many variations with vtkTransformPolyDataFilter and start/end interaction events linked to various steps in the process, I found that for my purpose the simplest way of handling this was to link the widget transformation directly to the actor properties:
def widget_callback(self, obj, event):
    t = vtk.vtkTransform()
    obj.GetTransform(t)
    self.actor.SetPosition(t.GetPosition())
    self.actor.SetScale(t.GetScale())
    self.actor.SetOrientation(t.GetOrientation())
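For context, a self-contained sketch of how that callback is typically wired up; this is standard VTK boilerplate with a placeholder sphere actor, not the poster's scene:
import vtk

def widget_callback(obj, event):
    # copy the widget transform onto the actor properties, as in the answer above
    t = vtk.vtkTransform()
    obj.GetTransform(t)
    actor = obj.GetProp3D()
    actor.SetPosition(t.GetPosition())
    actor.SetScale(t.GetScale())
    actor.SetOrientation(t.GetOrientation())

sphere = vtk.vtkSphereSource()
mapper = vtk.vtkPolyDataMapper()
mapper.SetInputConnection(sphere.GetOutputPort())
actor = vtk.vtkActor()
actor.SetMapper(mapper)

renderer = vtk.vtkRenderer()
renderer.AddActor(actor)
window = vtk.vtkRenderWindow()
window.AddRenderer(renderer)
interactor = vtk.vtkRenderWindowInteractor()
interactor.SetRenderWindow(window)

box_widget = vtk.vtkBoxWidget()
box_widget.SetInteractor(interactor)
box_widget.SetProp3D(actor)
box_widget.SetPlaceFactor(1.25)
box_widget.PlaceWidget()
box_widget.AddObserver("InteractionEvent", widget_callback)
box_widget.On()

interactor.Initialize()
window.Render()
interactor.Start()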

Groovy Script for JIRA-actions

I want to achieve the following using Adaptavist ScriptRunner in JIRA: a user comments on an issue and triggers a ScriptRunner custom script. If the issue is in the state "waiting for customer reply" and the user is a customer, trigger the workflow transition "respond to question" and move the issue into the state "customer responded".
The Adaptavist ScriptRunner plugin uses Groovy as its language of choice for custom scripts. Unfortunately I have never worked with Groovy before and thus have no idea what I have to do to make this work. Out of the examples in the ScriptRunner docs I put together the following:
import com.atlassian.jira.component.ComponentAccessor
def issue = event.issue
def workflow = ComponentAccessor.getWorkflowManager().getWorkflow(issue)
def wfd = workflow.getDescriptor()
def actionName = wfd.getAction(transientVars["actionId"] as int).getName()
This is supposed to get me the current workflow step, but it doesn't work. Would anyone be so kind as to help me write this script?
Cheers!
There's already an available script listener called "Fast-track transition an issue". You just need to create a new instance of it, bind it to your project and the Issue Commented event, add an extra condition such as issue.status.name == 'Waiting For Customer Reply' && currentUser == issue.reporter, and specify the transition. If you change the workflow, you might need to update the listener too.
Also, these listeners, post-functions, etc. are implemented as 'canned' scripts (classes implementing a certain interface) which ship as plain Groovy files inside the plugin's JAR file; they can teach you a lot.

Add renderer in @view_config from configuration?

How do I supply a configured value to a @view_config-decorated function or class?
E.g.
@view_config(route_name='example', renderer=some_config['template.name'])
class MyClass(BaseView):
    ...
Or
@view_defaults(route_name='example', renderer=some_config['template.name2'])
class MyClass2(BaseView):
    ...
Or
@view_config(route_name='example', renderer=some_config['template.name3'])
def method3(request):
    ...
It's very hard to know where to start, as I'm trying to edit a Pyramid plugin which pulls together its config in an includeme function, so it doesn't have anything obvious that I can hook into, and it's hard to know what's available to the @view_config decorator.
You can add views using declarative configuration (what you are doing now with @view_config) or, alternatively, using imperative configuration by calling the config.add_view() method.
In this case, since you need to access the Pyramid registry and the settings file, it is easier to add the views imperatively.
In your __init__.py you can do:
settings = config.registry.settings
# You also need to call config.add_route("foobar") to map the view to a URL
config.add_view('views.MyClass', route_name="foobar", renderer=settings['template.name3'])
Then in your views.py:
class MyClass(BaseView):
pass
@view_config() and add_view() take the same arguments.
I think you can also mix @view_config() and add_view() arguments for the same view, but I am not sure about this. Hope this helps.
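Putting the pieces together, a minimal sketch of the imperative variant in an application's __init__.py; the dotted view path and the 'template.name' settings key are assumptions matching the question's example, not values from the plugin being edited:
from pyramid.config import Configurator

def main(global_config, **settings):
    config = Configurator(settings=settings)
    config.add_route('example', '/example')
    # the renderer value comes from the .ini settings rather than the decorator
    config.add_view('myapp.views.MyClass',               # dotted path, assumed
                    route_name='example',
                    renderer=settings['template.name'])  # e.g. a template asset path
    return config.make_wsgi_app()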
