AWS Lambda Unable to import module 'demo': cannot import name 'windll' - python-3.x

I need a little help with AWS Lambda: has anyone come across this issue when uploading a deployment package to AWS Lambda?
Screenshot of the issue: https://i.stack.imgur.com/2QeGe.png

Your deployment package structure should be something like this:
main.py      <---------- Lambda entry/handler file (can be named anything; just configure your Lambda handler setting to use it)
demo.py
mylib/
    __init__.py
    foo.py
    bar.py
numpy/
    ...
pandas/
    ...
If demo.py is in a different folder from your main Lambda handler file, then you will need to put an __init__.py in that folder.
main.py      <---------- Lambda entry/handler file
mylib/
    __init__.py
    demo.py
    foo.py
    bar.py
numpy/
    ...
pandas/
    ...
Now, in main.py, you will need to do: from mylib.demo import .....
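For example, a minimal handler sketch (my_function and the handler name here are placeholders, not taken from the question):
main.py:
from mylib.demo import my_function


def lambda_handler(event, context):
    # configure the function's handler setting as "main.lambda_handler"
    return my_function(event)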

Related

With AWS CDK Python, how to create a subdirectory, import a .py, and call a method in there?

I am attempting to get the simplest example of creating an S3 bucket with the AWS CDK in Python, with no luck.
I want to put the code to create the bucket in another file (which file exists in a subdirectory).
What I am doing works with every other Python project I have developed or started.
Process:
I created an empty directory, aws_cdk_python/. Then, inside that directory, I ran:
$ cdk init --language python
to lay out the structure. This created another subdirectory with the same name, aws_cdk_python/, and created a single .py file within that directory where I could begin adding code in the __init__ method (constructor).
I was able to add code there to create an S3 bucket.
Next I created a subdirectory with an __init__.py and a file called create_s3_bucket.py.
I put the code to create an S3 bucket in this file, in a method called main.
file: create_s3_bucket.py
def main(self):
    <code to create s3 bucket here>
When I run the code, it will create the App Stack with no errors, but the S3 bucket will not be created.
Here is my project layout:
aws_cdk_python/
    setup.py
    aws_cdk_python/
        aws_cdk_python_stack.py
        my_aws_s3/
            create_s3_bucket.py
setup.py contains the following two lines:
package_dir={"": "aws_cdk_python"},
packages=setuptools.find_packages(where="aws_cdk_python"),
The second line here says to look in the aws_cdk_python/ directory, and search recursively in sub-folders for .py files
In aws_cdk_python_stack.py, I have this line:
from my_aws_s3.create_s3_bucket import CreateS3Bucket
then in __init__ in aws_cdk_python_stack.py, I instantiate the object:
my_aws_s3 = CreateS3Bucket()
and then I make a call like so:
my_aws_s3.main() <== code to create the S3 bucket is here
I have followed this pattern on numerous Python projects before, using find_packages() in setup.py.
I have also run:
$ python -m pip install -r requirements.txt
which should pick up the dependencies pointed to in setup.py.
Questions:
- Has anyone who uses the AWS CDK with Python done this, or does anyone have recommendations for code organization? I do not want all the code for the entire stack to be in the aws_cdk_python_stack.py __init__() method.
- Any ideas on why there is no error displayed in my IDE? All dependencies are resolved and methods are found, but when I run, nothing happens.
- How can I see any error messages? None appear with $cdk deploy; it just creates the stack but not the S3 bucket, even though I have code to call and create an S3 bucket.
This is frustrating; it should work.
I have other sub-directories that I want to create under aws_cdk_python/aws_cdk_python/<dir>, put an __init__.py there (an empty file), and import classes in the top-level aws_cdk_python_stack.py.
Any help to get this working would be greatly appreciated.
cdk.json looks like this (laid down by cdk init --language python):
{
  "app": "python app.py",
  "context": {
    "@aws-cdk/aws-apigateway:usagePlanKeyOrderInsensitiveId": true,
    "@aws-cdk/core:enableStackNameDuplicates": "true",
    "aws-cdk:enableDiffNoFail": "true",
    "@aws-cdk/core:stackRelativeExports": "true",
    "@aws-cdk/aws-ecr-assets:dockerIgnoreSupport": true,
    "@aws-cdk/aws-secretsmanager:parseOwnedSecretName": true,
    "@aws-cdk/aws-kms:defaultKeyPolicies": true,
    "@aws-cdk/aws-s3:grantWriteWithoutAcl": true,
    "@aws-cdk/aws-ecs-patterns:removeDefaultDesiredCount": true,
    "@aws-cdk/aws-rds:lowercaseDbIdentifier": true,
    "@aws-cdk/aws-efs:defaultEncryptionAtRest": true,
    "@aws-cdk/aws-lambda:recognizeVersionProps": true,
    "@aws-cdk/aws-cloudfront:defaultSecurityPolicyTLSv1.2_2021": true
  }
}
app.py looks like this
import os
from aws_cdk import core as cdk
from aws_cdk import core
from aws_cdk_python.aws_cdk_python_stack import AwsCdkPythonStack
app = core.App()
AwsCdkPythonStack(app, "AwsCdkPythonStack",
)
app.synth()
To date (Tue 2021-12-31), this has not been solved.
Not entirely sure, but I guess it depends on what your cdk.json file looks like. It contains the command to run for cdk deploy. E.g.:
{
  "app": "python main.py", <===== this guy over here assumes the whole app is instantiated by running main.py
  "context": {
    ...
  }
}
Since I don't see this entrypoint present in your project structure, it might be related to that.
Usually after running cdk init you should at least be able to synthesize. Typically you keep your main App() definition in app.py, and stacks and constructs go in subfolders. Stacks are often instantiated in app.py, and the constructs are instantiated in the stack definition files.
I hope it helped you a bit further!
Edit:
Just an example of a working tree is shown below:
aws_cdk_python
├── README.md
├── app.py
├── cdk.json
├── aws_cdk_python
│   ├── __init__.py
│   ├── example_stack.py
│   └── s3_stacks                      <= this is your subfolder with s3 stacks
│       ├── __init__.py
│       └── s3_stack_definition.py     <== file with an s3 stack in it
├── requirements.txt
├── setup.py
└── source.bat
aws_cdk_python/s3_stacks/s3_stack_definition.py:
from aws_cdk import core as cdk
from aws_cdk import aws_s3


class S3Stack(cdk.Stack):

    def __init__(self, scope: cdk.Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        bucket = aws_s3.Bucket(self, "MyEncryptedBucket",
                               encryption=aws_s3.BucketEncryption.KMS
                               )
app.py:
from aws_cdk import core
from aws_cdk_python.s3_stacks.s3_stack_definition import S3Stack
app = core.App()
S3Stack(app, "ExampleStack",
)
app.synth()
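If you do want to keep the bucket code in a helper file like your original my_aws_s3/create_s3_bucket.py, one option (a rough sketch, not your original code) is to make CreateS3Bucket a construct and pass the stack in as its scope, since CDK only synthesizes resources that are attached to the construct tree:
aws_cdk_python/my_aws_s3/create_s3_bucket.py:
from aws_cdk import core as cdk
from aws_cdk import aws_s3


class CreateS3Bucket(cdk.Construct):

    def __init__(self, scope: cdk.Construct, construct_id: str) -> None:
        super().__init__(scope, construct_id)
        # attached to the stack via "scope", so it actually gets synthesized
        aws_s3.Bucket(self, "MyBucket")
Then in aws_cdk_python_stack.py's __init__, instantiate it with the stack itself as the scope: CreateS3Bucket(self, "MyS3Bucket").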

Python import from parent directory for dockerized structure

I have a project with two applications. They both use a mongo-engine database model file. They also have to start in different Docker containers, but use the same Mongo database in a third container. Now my app structure looks like this:
app_root/
    app1/
        database/
            models.py
        main.py
    app2/
        database/
            models.py
        main.py
And it works fine, BUT I have to maintain two identical copies of database/models.py. I don't want to do this, so I made the following structure:
app_root/
    shared/
        database/
            models.py
    app1/
        main.py
    app2/
        main.py
Unfortunately it doesn't work for me, because when I try this in my main.py:
from ..shared.database.models import *
I get
Exception has occurred: ImportError
attempted relative import with no known parent package
And when I try
from app_root.shared.database.models import *
I get
Exception has occurred: ModuleNotFoundError No module named 'app_root'
What am I doing wrong?
In the file where you perform the import, try adding this:
import os
import sys
sys.path.append(os.path.abspath('../../..'))
from app_root.shared.database.models import *
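If you'd rather not depend on the directory you launch from, a variation (just a sketch, assuming main.py sits at app_root/app1/main.py as in the layout above) resolves the path from the file's own location:
import os
import sys

# two levels up from app1/main.py is the directory that contains app_root/
sys.path.append(os.path.abspath(os.path.join(os.path.dirname(__file__), "..", "..")))

from app_root.shared.database.models import *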

Python 3: Importing Files / Modules from Scattered Directories and Files

I have the following directory structure:
/home/pi
- project/
    - p1v1.py
- tools1/
    - __init__.py
    - tools1a/
        - __init__.py
        - file1.py
        - file2.py
        - tools1a1/
            - __init__.py
            - file3.py
            - file4.py
        - tools1a2/
            - __init__.py
            - file5.py
            - file6.py
I am trying to import all the modules from file1.py into my project file p1v1.py:
from file1 import *
but end up with either an
ImportError: attempted relative import with no known parent package
or a
ValueError: Attempted relative import in non-package
depending on what I use in p1v1.py because the functions in file1.py depend on file3.py and file4.py. I would like to use explicit imports (for clarity), but I'm not sure how to do this. Any advice would be appreciated.
Thank you!
Through trial and error, I eventually figured out how to solve this:
import sys
sys.path.insert(0,'..')
from tools1.tools1a.file1 import function as f1
Note: For this to work, I needed to be editing and executing my script p1v1.py out of the working directory /home/pi/project/. Hope this helps others with a similar problem!
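To drop that working-directory requirement, a small variation (just a sketch based on the layout above) anchors the path to p1v1.py itself:
import sys
from pathlib import Path

# /home/pi is the parent of both project/ and tools1/
sys.path.insert(0, str(Path(__file__).resolve().parent.parent))

from tools1.tools1a.file1 import function as f1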

Why are my custom operators not being imported into my DAG (Airflow)?

I am creating an ETL pipeline using Apache Airflow and I am trying to create generalized custom operators. There seems to be no problem with the operators but they are not being imported into my DAG python file.
This is my directory structure.
my_project\
    .env
    Pipfile
    Pipfile.lock
    .gitignore
    .venv\
    airflow\
        dags\
        logs\
        plugins\
            __init__.py
            helpers\
            operators\
                __init__.py
                data_quality.py
                load_fact.py
                load_dimension.py
                stage_redshift
This is what is present in the __init__.py file under the plugins folder.
from __future__ import division, absolute_import, print_function

from airflow.plugins_manager import AirflowPlugin

import airflow.plugins.operators as operators
import airflow.plugins.helpers as helpers


# Defining the plugin class
class SparkifyPlugin(AirflowPlugin):
    name = "sparkify_plugin"
    operators = [
        operators.StageToRedshiftOperator,
        operators.LoadFactOperator,
        operators.LoadDimensionOperator,
        operators.DataQualityOperator
    ]
    helpers = [
        helpers.SqlQueries
    ]
I'm importing these operators into my DAG file as follows:
from airflow.operators.sparkify_plugin import (StageToRedshiftOperator,
                                               LoadFactOperator,
                                               LoadDimensionOperator,
                                               DataQualityOperator)
I am getting an error as follows
ERROR - Failed to import plugin /Users/user_name/Documents/My_Mac/Projects/sparkify_etl_sql_to_sql/airflow/plugins/operators/stage_redshift.py
Can you help me understand why this is happening?
I figured out how to register my custom operators with Airflow without dedicating a Python file to the AirflowPlugin class.
I achieved this by declaring them in my __init__.py file under the plugins directory.
This is how I did it.
My project folder structure is as follows:
my_project\
    .env
    Pipfile
    Pipfile.lock
    .gitignore
    .venv\
    airflow\
        dags\
        logs\
        plugins\
            __init__.py
            helpers\
            operators\
                __init__.py
                data_quality.py
                load_fact.py
                load_dimension.py
                stage_redshift
My code in plugins/__init__.py
from airflow.plugins_manager import AirflowPlugin

import operators
import helpers


# Defining the plugin class
class SparkifyPlugin(AirflowPlugin):
    name = "sparkify_plugin"
    operators = [
        operators.StageToRedshiftOperator,
        operators.LoadFactOperator,
        operators.LoadDimensionOperator,
        operators.DataQualityOperator
    ]
    helpers = [
        helpers.SqlQueries
    ]
My code in plugins/operators/__init__.py
from operators.stage_redshift import StageToRedshiftOperator
from operators.load_fact import LoadFactOperator
from operators.load_dimension import LoadDimensionOperator
from operators.data_quality import DataQualityOperator
__all__ = [
    'StageToRedshiftOperator',
    'LoadFactOperator',
    'LoadDimensionOperator',
    'DataQualityOperator'
]
I am importing these custom operators in my DAG file (dags/etl.py) as:
from airflow.operators.sparkify_plugin import LoadDimensionOperator
sparkify_plugin is what the name attribute in the SparkifyPlugin class (stored in plugins/__init__.py) holds.
Airflow automatically registers these custom operators.
Hope it helps someone else in the future.
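For reference, a minimal sketch of how the DAG file can then use one of these operators (only task_id is a standard argument here; anything else depends on how load_dimension.py defines the operator):
# dags/etl.py -- sketch only; custom operator arguments are hypothetical
from datetime import datetime

from airflow import DAG
from airflow.operators.sparkify_plugin import LoadDimensionOperator

with DAG(dag_id="sparkify_etl",
         start_date=datetime(2021, 1, 1),
         schedule_interval="@daily",
         catchup=False) as dag:

    load_users_dim = LoadDimensionOperator(
        task_id="load_users_dim"
        # plus whatever arguments load_dimension.py expects
    )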
In case you are having some import errors, try running python __init__.py for each module, as described by @absolutelydevastated. Make sure that the one in the plugins directory runs without throwing errors.
I used PyCharm and it did throw me a few errors when running the __init__.py files in the plugins/operators directory.
Fixing the one in the plugins directory and ignoring the errors thrown by plugins/operators/__init__.py fixed my issue.
If you check out Writing and importing custom plugins in Airflow, you'll see the person there was having a similar problem with their plugin, which they fixed by including a file under airflow/plugins named for their plugin, rather than defining it in the __init__.py file.

Serverless cannot import local files (in the same directory) into a Python file

I have serverless code in Python. I am using serverless-python-requirements:^4.3.0 to deploy this to AWS Lambda.
My code imports another Python file in the same directory as itself, which is throwing an error.
serverless.yml:
functions:
  hello:
    handler: functions/pleasework.handle_event
    memorySize: 128
    tags:
      Name: HelloWorld
      Environment: Ops
    package:
      include:
        - functions/pleasework
        - functions/__init__.py
        - functions/config
(venv) ➜ functions git:(master) ✗ ls
__init__.py boto_client_provider.py config.py handler.py sns_publish.py
__pycache__ cloudtrail_handler.py glue_handler.py pleasework.py
As you can see, pleasework.py and config are in the same folder, but when I do import config in pleasework I get an error:
{
  "errorMessage": "Unable to import module 'functions/pleasework': No module named 'config'",
  "errorType": "Runtime.ImportModuleError"
}
I have been struggling with this for a few days and think I am missing something basic.
import boto3
import config
def handle_event(event, context):
    print('lol: ')
OK, so I found out my issue. The way I was importing the file was wrong.
Instead of
import config
I should be doing
import functions.config
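So the handler ends up looking something like this (a sketch; only the import line changes):
# functions/pleasework.py
import boto3

import functions.config  # was: import config


def handle_event(event, context):
    print('lol: ')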
@Pranay Sharma's answer worked for me.
An alternate way is creating and setting the PYTHONPATH environment variable to the directory where your handler function and config exist.
To set environment variables in the Lambda console
Open the Functions page of the Lambda console.
Choose a function.
Under Environment variables, choose Edit.
Choose Add environment variable.
Enter a key and value.
In our case the key is "PYTHONPATH" and the value is "functions".
