I saw an example of peewee where pysqlcipher was used as the connector for managing a database file rather than the sqlite module. That's great, and there's even an async version of peewee, but I don't need (or want) to use peewee's object model. In peewee, the connector is initialized like this:
from peewee import *
from playhouse.sqlcipher_ext import SqlCipherDatabase
db = SqlCipherDatabase(None)
class Entry(Model):
    class Meta:
        database = db
I want to do something similar with aiosqlite and pysqlcipher3 instead of using peewee. Maybe it could work by overriding aiosqlite.Connection, but I've never done something like that before. How can I use pysqlcipher3 with aiosqlite?
aiosqlite uses the standard library sqlite3 module -- and that appears to be hardcoded here:
https://github.com/omnilib/aiosqlite/blob/master/aiosqlite/core.py
In addition, they've sprinkled sqlite3-specific type annotations all over the place, so I'm not sure whether you can even monkey-patch it without causing issues.
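If you want to experiment anyway, an untested sketch of that monkey-patching idea might look like the following. It assumes aiosqlite.core imports sqlite3 at module level (as in the linked source) and that pysqlcipher3.dbapi2 is a drop-in replacement for the standard sqlite3 DB-API module; the open_encrypted helper and its key handling are purely illustrative:

import aiosqlite
import aiosqlite.core
from pysqlcipher3 import dbapi2 as sqlcipher

# Swap the DB-API module aiosqlite uses internally (assumption: the
# module-level name sqlite3 in aiosqlite.core is what connect() calls).
aiosqlite.core.sqlite3 = sqlcipher

async def open_encrypted(path, key):
    db = await aiosqlite.connect(path)
    # SQLCipher takes the passphrase via PRAGMA key; PRAGMA statements
    # don't accept bound parameters, so the key is interpolated here.
    await db.execute(f"PRAGMA key = '{key}'")
    return db

Even if this works at runtime, the sqlite3-specific type annotations mentioned above will still be wrong, so expect type checkers to complain.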
I am trying to use a nodejs library like uuid in my typescript app. This simple import works just fine if I want to only use it in a specific class:
import { v4 } from "uuid";
class MyClass {
...
}
However, this makes the class MyClass not "discoverable" by any other files in my solution. Sure I could export it, but that forces me to import the class in every usage, and the problem spreads like a cancer to every file. Do I really have to import/export every single class/file in my application just because I want to produce a simple UUID?
I saw that I can use require instead, but TypeScript doesn't know what that keyword is. I found this question, but neither installing @types/node nor the quick-and-dirty declare var require: any works.
It really seems like I am jumping through a lot of unnecessary hoops just to generate a uuid in typescript. Am I doing something very wrong?
Thanks
I am having strange issues with Typescript when I import things from a file which exports them. Sometimes I will export a function, then import it into another file; when I use the function, it is not a function anymore. When I define the function in the same file, all of a sudden the function is a function?!?!?
Why would a function stop being a function when it is exported? I have had similar problems with classes too.
The hard part of this issue is I can't recreate a simple example because it only happens when I am using some kind of higher level package.
For example, I had a similar issue with sequelize-typescript here: my github issue with typescript-sequelize
Below is some code showing the basic issue I'm having with one of the decorators from InversifyJS.
container.ts
import {fluentProvide} from "inversify-binding-decorators";
export const provideSingleton = (identifier: any) => {
    return fluentProvide(identifier)
        .inSingletonScope()
        .done(true);
};
test.service.ts
import {provideSingleton} from './container'
@provideSingleton(TYPES.TEST)
export default class TestService {}
The strangest thing is when I put the provideSingleton in the same file as the TestService, everything works!?!?!
Basically, to recreate the issue, simply follow the example from here: inversify-binding-decorators - Using @provideFluent multiple times. However there is an issue with the example, so please see this issue: fluentProvide example needed. The above provideSingleton reflects the changes from that issue. Then you simply import the provideSingleton function from another file instead of defining it in the same file as in the example.
Can anyone explain to me what I'm missing? Why oh why would certain exported items not be seen as the type they are? Is there a step I'm not seeing that NodeJS takes to make the item actually exported and therefore different? Can I force the function to resolve as a function so it can be used as such?
ENV:
NodeJS: 10.9.0
Typescript: 3.0.1
Mac: 10.13.16
So it looks like you can get issues like this when NodeJS can't handle a recursive import. I'm not exactly sure how to check that you are getting this error, other than that your symptoms look like what I stated above. Basically, the recursion caused my function not to load, and therefore "undefined is not a function".
It would be easy to notice if you had code like so:
a.ts
import B from './b';
export default class A extends B {}
b.ts
import A from './a';
export default class B extends A {}
In my case, I think my function provideSingleton did not like the file I put it in because of some conflicting code in that file, even though all it contained was:
import {Container} from 'inversify';
import "reflect-metadata";
import {fluentProvide} from "inversify-binding-decorators";
const container = new Container();
function ProvideSingleton(identifier: any) {
    return fluentProvide(identifier)
        .inSingletonScope()
        .done(true);
}
export {container, ProvideSingleton}
In the end, if this issue comes up, try another file for your function and pay close attention to the order in which loading happens. Although NodeJS handles recursive imports most of the time, you can still trip it up.
Does anyone know how you can write mock tests for Odoo objects?
I have these classes and methods:
my_module:
from odoo import models
class MyModel(models.Model):
    _name = 'my.model'

    def action_copy(self):
        IrTranslation = self.env['ir.translation']
        for rec in self:
            if rec.translate:
                IrTranslation.force_translation(rec)
my_module_2:
from odoo import models
class IrTranslation(models.Model):
    _inherit = 'ir.translation'

    def force_translation(self, rec):
        # do stuff
When I call it, I want to test whether IrTranslation.force_translation was called in the action_copy method, and how many times.
But this method is not imported directly; it is referenced through env.
If, let's say, force_translation were imported like:
from my_module_2.IrTranslation import force_translation
def action_copy(self):
    # do stuff.
    force_translation()
Then I could try doing something like this:
from unittest import mock
from my_module import action_copy
def test_some_1(self):
    with mock.patch('my_module.my_module_2.IrTranslation') as mocked_translation:
        action_copy()
        mocked_translation.force_translation.assert_called_once()
But because modules in Odoo are not imported directly (like you would do in plain Python), I don't understand how to specify which methods in the Odoo environment should be mocked.
P.S. I also did not see any mocked tests in standard Odoo, except for base classes that do not inherit from Model (for models you use the _inherit attribute instead of importing the class and inheriting from it directly).
Testing in Odoo does not use the concept of mocking. Instead, tests are derived from standard base classes. The standard class TransactionCase opens a transaction and never commits it, rolling it back to undo any changes.
This is obviously not the same as regular mocking, in that you can't replace other methods or classes to return fixed/expected values, or avoid side effects other than persisting changes in the database, such as sending emails or calling a remote web service.
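For illustration, a minimal sketch of such a test, using the my.model model from the question (whether an empty create({}) is enough depends on the model's required fields):

from odoo import tests

class TestMyModel(tests.TransactionCase):
    def test_action_copy(self):
        # Everything created here lives in a transaction that is rolled
        # back when the test finishes, so nothing is persisted.
        record = self.env['my.model'].create({})
        record.action_copy()
        self.assertTrue(record.exists())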
It can be done. I have been doing it since Odoo 8.0 (up to 15.0 now). The key is to know where to patch. Odoo imports your module under the odoo.addons package, so in your case you may do the following:
from odoo import tests
from mock import patch
from odoo.addons.my_module_2.models.ir_translations import IrTranslation
class TestMyModule2(tests.TransactionCase):
    def some_test_1(self):
        my_model = self.env['my.model'].create({})
        with patch.object(IrTranslation, 'force_translation') as mocked_translation:
            my_model.action_copy()
            mocked_translation.assert_called_once()
Or, using just patch, with no need to import the class:
with patch('odoo.addons.my_module_2.models.ir_translations.IrTranslation.force_translation') as mocked_translation:
    my_model.action_copy()
This patches your specific method in your specific class. This way you can also target the method of a super class.
If you need to patch a method and you don't care where it is defined or where it's overridden, just patch using Python's type() (then there is no need to import the class):
with patch.object(type(self.env['ir.translation']), 'force_translation') as mocked_translation:
    my_model.action_copy()
Some additional notes to save you some headaches:
If you use PyCharm, don't mock socket objects; it messes with PyCharm's mechanisms. Better to put your socket calls into a one-line method and mock that method instead.
datetime.datetime.now() cannot be mocked, like all builtin types, but fields.Datetime.now() can, as in the sketch below.
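A minimal sketch of that last point, assuming the code under test calls fields.Datetime.now() rather than datetime.datetime.now():

from datetime import datetime
from unittest.mock import patch

from odoo import fields, tests

class TestFrozenNow(tests.TransactionCase):
    def test_frozen_now(self):
        frozen = datetime(2024, 1, 1, 12, 0, 0)
        # datetime.datetime.now is a builtin and cannot be patched,
        # but Odoo's fields.Datetime.now helper can.
        with patch.object(fields.Datetime, 'now', return_value=frozen):
            self.assertEqual(fields.Datetime.now(), frozen)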
In our framework we have an interface with this method in the public API:
JaxbConfiguration newJaxbConfiguration(Options xjcOpts);
In the implementation, we do something like this:
import com.sun.tools.xjc.ModelLoader;
import com.sun.tools.xjc.Options;
import com.sun.tools.xjc.model.Model;
...
public JaxbConfiguration newJaxbConfiguration(Options xjcOpts) {
    Model model = ModelLoader.load(xjcOpts, ...);
    ...
}
However, both OSGi and Java 9's Jigsaw don't like that we use com.sun.tools.xjc.Options, neither in our implementation nor, especially, in our public API interface.
How can we get rid of it?
The JDeps website lists some of the JDK internal APIs and the recommended way to replace their usage. However, the use of ModelLoader.load() is not mentioned. My guess is that this use case has not come up enough to get the attention of the JDeps team.
My recommendation would be to refactor this method so that:
you pass in the data you're using to construct the Options argument, instead of passing in the Options argument itself, and
you use that data to construct your JaxbConfiguration object instead of converting from the internal Model.
You don't mention what JaxbConfiguration is or what library it's from, so it's hard for me to say exactly how to construct it. Anyway, this answer is about how to remove the use of the internal API; how to construct a JaxbConfiguration is probably a different question.
I have been setting up a scripting environment using Groovy. I have a Groovy script called FrameworkiDatabase.groovy which contains a class of the same name. This works fine. I also have another file called connections.groovy which contains maps like the following:
SUPPORT2=[
    host:"host.name",
    port:"1521",
    db:"support2",
    username:"username",
    password:"password",
    dbType:"oracle"
]
This holds a collection of database bookmarks, a bit like an Oracle tnsnames file, so I don't need to remember all the parameters when connecting to databases.
When using groovysh, I can import this using the load command, and it is available in the current scope. How can I load it as part of a script the same way? It has no class definition around it - does it need one? I have tried doing that, and adding a static import, but that didn't work...
I tried something like this, but no luck:
testFrameworkiDatabase.groovy:
import static connections
def db = new FrameworkiDatabase(SUPPORT2)
db.listInvalidObjects()
db.getDBSchemaVersion()
db.getFWiVersion()
db.getSPVersion()
db.getFileloaderVersion()
db.getAdminToolVersion()
db.getReportsVersion()
So I want to load those connections as constants - is there any way I can do this easily?
Not sure if it's the best way, but one way would be to write this into Connections.groovy
class Connections {
    static SUPPORT2 = [
        host:"host.name",
        port:"1521",
        db:"support2",
        username:"username",
        password:"password",
        dbType:"oracle"
    ]
}
Then, compile this with groovyc Connections.groovy to generate a class file
Then, in your test script or on the groovysh prompt, you can do:
import static Connections.*
println SUPPORT2
To get the output:
[host:host.name, port:1521, db:support2, username:username, password:password, dbType:oracle]
If compiling the Connections.groovy class isn't good enough, I think you're going to be looking at loading the source into a Binding object by using one of the Groovy embedding techniques.