I have working Lambda code, but I have a new idea for a neural network system I am trying to implement, for which I would require Keras (Python). I tried uploading the same working code along with the Keras dependencies as a ZIP file from my desktop, with an additional "import keras" statement. Funnily enough, the same code with the additional "import keras" gives me an error on the developer portal saying the remote endpoint could not be reached, but when I remove the "import keras" statement my code works perfectly fine. Why is this happening even though I provide both the Keras dependencies and my Lambda code in the ZIP file? I understand that I am not using Keras anywhere in my existing code, but if I provide the Keras dependencies and just try to import them, logically it should still work, right?
I am using the matplotlib library in my Python project, which in turn uses numpy. I have deployed the libraries in AWS Lambda Layers and I am importing them in my AWS Lambda function. When I test my AWS Lambda function it throws the following error:
Importing the numpy C-extensions failed. This error can happen for many reasons, often due to issues with your setup or how NumPy was installed. We have compiled some common reasons and troubleshooting tips at: numpy.org/devdocs/user/troubleshooting-importerror.html
Please note and check the following:
* The Python version is: Python3.8 from "/var/lang/bin/python3.8"
* The NumPy version is: "1.18.5"
Original error was: No module named 'numpy.core._multiarray_umath'
Any idea what could be the possible reason and how to resolve it?
I am answering my own question so that if anyone faces this issue in the future, the solution below might work for them as well.
The problem was that I compiled the required packages in a Windows 10 environment and then deployed them as layers to be used by the AWS Lambda function. AWS Lambda functions and Layers run on Linux in the background, so the packages compiled in the Windows environment were not compatible with the AWS Lambda function. When I compiled the required packages again in a Linux environment, deployed them as layers, and used them with the Lambda function again, it worked like a charm!
This Medium article helped me solve my issue.
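For anyone who wants a concrete starting point, here is a minimal sketch of that approach, assuming Docker is available on your machine. The build image shown (lambci/lambda:build-python3.8) is one commonly used image matching the python3.8 runtime, and the file names are just examples; adjust both to your own setup.

    # run pip inside an Amazon Linux container so the compiled C extensions
    # (numpy, matplotlib, ...) match what the Lambda runtime expects
    docker run --rm -v "$PWD":/var/task lambci/lambda:build-python3.8 \
        pip install -r requirements.txt -t python/

    # Lambda layers expect the packages under a top-level "python/" directory
    zip -r my-layer.zip python/

The resulting ZIP can then be uploaded as a layer and attached to the function as before.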
I've included the lxml library as a layer for my lambda function, but when I go to test it I get the error: cannot import name 'etree' from 'lxml'
There are multiple posts about people having this issue, and some say that I need to build it to compile some C libraries. Most posts say to look for another file or folder named 'lxml', which I've verified is not the issue.
I'm able to run the same code I've deployed to my layer on my local Linux workstation and it runs without an issue.
Turns out my lambda was running Python version 3.8, and that version is not compatible with the version of lxml I am using, 4.5.1. Changing the runtime to 3.7 fixed the issue. Hope this helps someone.
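One quick way to check this kind of mismatch up front is to ask pip whether a prebuilt Linux wheel exists for the exact library version and runtime you are targeting. This is just a sketch using the versions from this question; swap --python-version between 3.7 and 3.8 to compare what pip can resolve for each runtime:

    # try to fetch a Lambda-compatible (manylinux) wheel for a specific Python version
    pip download lxml==4.5.1 --only-binary=:all: \
        --platform manylinux1_x86_64 --python-version 3.7 -d /tmp/wheels

If pip cannot resolve a wheel for the runtime your function uses, that is a good hint the pinned version and the runtime do not match.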
BERT is a very powerful model for text classification, but implementing BERT requires much more code than most other models. bert-text is a PyPI package that provides developers with a ready-to-use solution. I have installed it properly, but when I try to import it, it throws the error ModuleNotFoundError: No module named 'bert_text'. I have written the name bert_text correctly.
I have tried it in Kaggle, Colab and on my local machine, but the error is the same.
Hey, as this is a refactor made by Yan Sun, this issue is already pending. You can go to this link and subscribe for an update for when the developers provide a solution: https://github.com/SunYanCN/bert-text/issues/1
There are other questions similar to mine, but I don't think any of them is complete or fits/answers my case.
I'm deploying a Python 3.6 application on AWS lambda via serverless framework.
With this application I'm using diskcache to perform some small file caching (not actually using sqlite at all).
I'm using the "serverless-python-requirements" plugin in order to have all my dependencies (defined in the requirements.txt file) packaged and uploaded (diskcache in this case).
When the application is live on AWS and I send it a request, I get back a 500 error, and in my logs I can read:
Unable to import module 'handler': No module named '_sqlite3'
From the answer below, I gather that the sqlite module should not need to be installed:
Python: sqlite no matching distribution found for sqlite
So there is no need (and it won't work) to add sqlite as a requirement...
I then wonder why AWS Lambda is unable to find sqlite once deployed.
Any hint pls?
Thanks
On my AWS Lambda Python 3.6 function I'd like to use Google Firestore (Cloud Firestore BETA) for caching purposes, but as soon as I add
from google.cloud import firestore
to my Python script and upload the ZIP to the AWS Lambda function, the Lambda test comes back with the error
Unable to import module 'MyLambdaFunction': cannot import name 'cygrpc'.
AWS CloudWatch log doesn't contain any details on the error, just that same error message.
Lambda function works great on my local dev machine (Windows 10), and I can write to Firestore fine. It also works on AWS if I comment out the import and all Firestore related lines.
Any tips how I could go about solving this?
The python client for Firestore relies on the C-based implementation of GRPC. This appears not to work by default in AWS Lambda.
Node.js users have reported similar problems and they've documented a workaround of building a docker image.
This should be similar to getting any other Python package that requires native code to work. Perhaps something like this method for getting scikit to work?
I hope this is enough to get you going in the right direction, but unfortunately I don't know anything about AWS Lambda :-(.
Ran into the same issue; I solved it by using the serverless-python-requirements plugin for the Serverless Framework and passing:
custom:
  pythonRequirements:
    dockerizePip: true
Essentially this installs your C-based packages (and all other packages) in a Docker container, where the build works, and then symlinks them into your lambda fn.
A helpful guide can be found at: https://serverless.com/blog/serverless-python-packaging/
Plugin: https://github.com/UnitedIncome/serverless-python-requirements
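For reference, this is roughly the workflow I'd expect with that plugin, assuming the Serverless Framework CLI and Docker are already installed (the plugin also has to be listed under plugins: in serverless.yml):

    # add the plugin to the project
    npm install --save-dev serverless-python-requirements

    # with dockerizePip: true, packaging builds the requirements inside a
    # Lambda-compatible Docker container before uploading
    serverless deploy

The key point is that pip runs inside a Linux container during packaging, so C extensions like cygrpc come out compatible with the Lambda environment.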