How to resolve the error "object mockito is not a member of package org" in Databricks - mockito

I am trying to write unit tests for my Scala application from a Databricks notebook. I ran the command below to import Mockito for my session:
import org.mockito.Mockito._
But I get the following error when I run it:
object mockito is not a member of package org

Related

Unable to import module 'lambda_function': No module named 'aws_xray_sdk'

I am trying to implement this AWS Lambda REST API handler in my Lambda code to return proper response codes. For this I needed to repackage the aws_lambda_powertools library and add it as a layer in the Lambda function.
All the imports related to this library below are working:
from aws_lambda_powertools import Logger, Tracer
from aws_lambda_powertools.event_handler import APIGatewayRestResolver
from aws_lambda_powertools.logging import correlation_paths
from aws_lambda_powertools.utilities.typing import LambdaContext
But when I create an object of the Tracer class below, it gives an error (the two commented-out objects, logger and app, work fine):
tracer = Tracer()
# logger = Logger()
# app = APIGatewayRestResolver()
The error I get while declaring the tracer object is below:
Response
{
"errorMessage": "Unable to import module 'lambda_function': No module named 'aws_xray_sdk'",
"errorType": "Runtime.ImportModuleError",
"stackTrace": []
}
Function Logs
OpenBLAS WARNING - could not determine the L2 cache size on this system, assuming 256k
START RequestId: ae8b006b-e7f7-495b-99a0-eb5231c3f81c Version: $LATEST
[ERROR] Runtime.ImportModuleError: Unable to import module 'lambda_function': No module named 'aws_xray_sdk'
Traceback (most recent call last):
I tried pip install aws_xray_sdk, repackaged the layer, and re-added it, but it still gives the same error.
Can anyone help me with this? I am new to Lambda. Thanks in advance.
I fixed the error by using the AWS-managed layer ARN arn:aws:lambda:{region}:017000801446:layer:AWSLambdaPowertoolsPythonV2:18 instead of my own custom repackaged library layer.
Reference Link:
https://awslabs.github.io/aws-lambda-powertools-python/2.6.0/
Tracing (like validation and parsing) requires additional dependencies. They're not included by default in an attempt to keep the resulting package as small as possible.
When packaging with AWS SAM, I'm using this in my my_code/requirements.txt, which I then pip install in my local virtual env:
aws-lambda-powertools[parser, validation, tracer]==2.6.0
Additionally, I'm including this in tests/requirements.txt, which I also pip install locally but which is not picked up by SAM (keeping the image small again, and it's not required at runtime anyhow).
aws-lambda-powertools[aws-sdk]==2.6.0
pytest==7.2.1
In version 2.0 the X-Ray SDK is not included by default, as it would add size overhead even for users who don't use Tracer.
To solve this problem, declare the tracer dependency in your requirements.txt using aws-lambda-powertools[tracer], or all optional dependencies using aws-lambda-powertools[all].
Refer to this link:
https://github.com/awslabs/aws-lambda-powertools-python/issues/1872
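The failure mode can be confirmed before deploying: the aws_xray_sdk module only exists in the package if the [tracer] extra was installed. A stdlib-only pre-flight check, run against the built layer or the Lambda runtime, makes this visible (a debugging sketch; has_tracer_deps is a hypothetical helper name):

```python
import importlib.util

def has_tracer_deps():
    # Tracer in aws-lambda-powertools v2 needs aws_xray_sdk at import time;
    # if the [tracer] extra wasn't installed, this spec lookup returns None.
    return importlib.util.find_spec("aws_xray_sdk") is not None

if __name__ == "__main__":
    print("aws_xray_sdk present:", has_tracer_deps())
```

Running this in the same environment your layer was packaged from tells you whether the packaging step actually picked up the extra.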

Unable to install PySpark Module Error No Module Found

I'm trying to work with Microsoft's Hyperspace application.
In order to make it work with Python I need to install the module called Hyperspace.
When I run the code from hyperspace import * I get the following error:
ModuleNotFoundError: No module named 'hyperspace'
I tried the following, but still no luck:
from pyspark hyperspace import *
Can someone let me know what it will take to successfully install the module?
Thanks
The module isn't supported on Databricks.

Android build system command mm fails with this error "The import org.mockito cannot be resolved"

I am using the Android build system command mm to create an APK file for unit testing. I use Mockito in the test classes and therefore imported it in my code:
import static org.mockito.Mockito.*;
mm terminates with this error:
ERROR: : The import org.mockito cannot be resolved
ninja: build stopped: subcommand failed.
Does anyone know what caused this issue and how it can be solved?
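In the AOSP build system, mm only resolves org.mockito if the test module declares it as a static dependency in its blueprint file. A sketch of what the test module's Android.bp might include (module and library names here are assumptions; check the Mockito prebuilt names available in your source tree):

```
android_test {
    name: "MyUnitTests",
    srcs: ["src/**/*.java"],
    // Mockito ships as a prebuilt static Java library in AOSP
    static_libs: ["mockito-target"],
}
```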

Using pyspark with pybuilder

We are setting up PyBuilder on a new big data project. We have to test that some classes build the correct distributed tables. Accordingly, we built a few unit tests that pass when run from Eclipse/PyDev.
I run the independent unit tests successfully, but when I add the one using PySpark I get a long list of Java exceptions starting with:
ERROR Utils:91 - Aborting task
ExitCodeException exitCode=-1073741515:
at org.apache.hadoop.util.Shell.runCommand(Shell.java:582)
This is my build.py file:
from pybuilder.core import use_plugin
from pybuilder.core import init
import sys
import os
sys.path.append(os.path.join(os.environ['SPARK_HOME'], 'python', 'lib', 'py4j-0.10.7-src.zip'))
sys.path.append(os.path.join(os.environ['SPARK_HOME'], 'python'))
use_plugin("python.core")
use_plugin("python.unittest")
use_plugin("python.install_dependencies")
default_task = "publish"
We are using pyspark 2.3.1 and python 3.7.
What am I doing wrong?
The solution for me was to execute winutils chmod 777 -R on my workspace after installing the Microsoft Visual C++ 2010 Redistributable Package.
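Beyond the one-off chmod, a common way to make winutils reliably visible to PySpark on Windows is to export HADOOP_HOME before any Spark code runs. A minimal sketch, assuming winutils.exe was unpacked under C:\hadoop\bin (that path is an assumption; adjust to your install):

```python
import os

# Point Hadoop's native shims at the winutils install (assumed path)
# before a SparkSession is created; Spark reads these at JVM startup.
os.environ.setdefault("HADOOP_HOME", r"C:\hadoop")
os.environ["PATH"] = (
    os.path.join(os.environ["HADOOP_HOME"], "bin")
    + os.pathsep
    + os.environ["PATH"]
)
```

In the build.py above, these lines would sit next to the existing sys.path.append calls, so they run before the unit tests import pyspark.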

autodoc on readthedocs and PyQt5

I'm writing a package that wraps PyQt5 functionality and trying to put the documentation on readthedocs. Since PyQt5 is an extension module I mock the module and its classes (manually, because using unittest.mock causes metaclass conflicts):
class PyQt5:
    class QtCore:
        @staticmethod
        def qVersion():
            return '5.0.0'

        class QObject:
            pass
    # etc

sys.modules['PyQt5'] = PyQt5
This works fine locally. But although the builds pass without error on readthedocs, there is no autodoc output. What am I missing?
The project on BitBucket: https://bitbucket.org/fraca7/qtypy/
On ReadTheDocs: https://readthedocs.org/projects/qtypy/
Despite it "passing" the build, if you look carefully at your logs, you will see there are errors like ImportError: No module named 'qtypy' when it starts invoking sphinx.
When I've done this successfully in the past, I've always had a setup.py file at the top level of the repository for installing the package, which I believe is the only way that readthedocs can install the package.
I've then enabled, on readthedocs project admin -> advanced settings,
Install your project inside a virtualenv using setup.py install
This ensures your module is available to be imported when sphinx runs, so that it can automatically generate the documentation (provided you have successfully mocked PyQt5).
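As an alternative to the hand-rolled mock classes, Sphinx's autodoc can stub out the import itself via autodoc_mock_imports. A minimal conf.py sketch (assuming autodoc is the only extension you need):

```python
# docs/conf.py -- Sphinx stubs out PyQt5 so autodoc can import the package
extensions = ["sphinx.ext.autodoc"]
autodoc_mock_imports = ["PyQt5"]
```

This only helps once the qtypy package itself is importable on readthedocs, i.e. after the setup.py install option described above is enabled.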
