How to resolve the 'e_sqlite3' loader error? - azure

I am converting a working WebApp project to use a Function App instead. While I can use SQLite with EF Core on the Web App, the Function App fails. I used the latest EntityFramework.SQLite.Core libraries. I am deploying to a local Docker container, so the folder structure is as it would be on Azure. Here is my test project showing the issue.
I copied the database to the /home directory, but to no avail. I expect the Seed process to pass the check for !Employees.Any() and, if so, seed the database from the CSV mock-data file. But the first attempt to access the database with Employees.Any() gives the error:
DllNotFoundException: Unable to load shared library 'e_sqlite3' or one of its dependencies. In order to help diagnose loading problems, consider setting the LD_DEBUG environment variable: libe_sqlite3: cannot open shared object file: No such file or directory
How do I get around this? It would be a cheap way to host an API.
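For what it's worth, a common workaround for this error on Linux hosts is to reference the SQLitePCLRaw.bundle_e_sqlite3 NuGet package explicitly and initialize the native provider before the first query. The sketch below assumes the question's AppDbContext and Employees set; it is an illustration, not a confirmed fix for this exact project:

// Requires a project reference to the SQLitePCLRaw.bundle_e_sqlite3 NuGet
// package, which ships the native libe_sqlite3 binary for linux-x64.
using System.Linq;
using SQLitePCL;

public static class Startup
{
    // AppDbContext and Employees are assumed from the question's project.
    public static void SeedIfEmpty(AppDbContext db)
    {
        // Load the bundled native e_sqlite3 before the first query, instead
        // of letting the runtime probe system paths that fail in the container.
        Batteries_V2.Init();

        if (!db.Employees.Any())
        {
            // Seed from the CSV mock-data file, as in the original Seed process.
        }
    }
}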

Related

Ignore folder or file for Node Azure Function App

I want my Node.js Azure Function App to ignore a test data file.
When I run my app, after adding the file, I see it trying to parse the file and showing the following error:
Worker was unable to load function ...
How do I tell the runtime to ignore this data file or the folder containing it?
As far as I know, the error
Worker was unable to load function ...
occurs when packages are installed outside the root node_modules folder and are not included in the runtime or deployment.
How do I tell the runtime to ignore this data file or the folder containing it?
The .gitignore file plays a major role in Azure Functions: it is used to ignore the specified files even when they have changes.
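As a concrete illustration (the file and folder names below are made up): current Azure Functions Core Tools also honour a .funcignore file in the project root, which excludes the listed paths from the deployment package:

# .funcignore (project root) - same pattern syntax as .gitignore
test-data/
sample-payloads.json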

Is there any way to access the base path of an uploaded file with Rust WASM?

Does WebAssembly have the same limitation as native JS (for very valid security reasons) in that it cannot access the base/root path of any given uploaded file or selected folder? I want to write a Rust UI app with Seed, using rfd (or its previous iterations nfd/nfd2), in which users can indicate a file path so the application can install certain files in the correct place.
Otherwise, can Seed compile to a local .exe instead of a web-based app, and therefore have proper access to the file system?
Thanks
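As a sketch of the native/browser split the question asks about: on a native desktop build, rfd returns a real filesystem path, while on the wasm32 target rfd only offers AsyncFileDialog, whose FileHandle exposes the file's name and bytes but deliberately no path:

use rfd::FileDialog;

fn main() {
    // On a native (desktop) build this yields the full path to the folder.
    // On wasm32 this synchronous dialog is unavailable; AsyncFileDialog's
    // FileHandle gives file contents and name, but never a location.
    if let Some(folder) = FileDialog::new().pick_folder() {
        println!("install target: {}", folder.display());
    }
}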

SAP Commerce Cloud Hot Folder local setup

We are trying to use the cloud hot folder functionality, and in order to do so we are modifying our existing hot-folder implementation, which was not originally implemented for use in the cloud.
Following the steps on this help page:
https://help.sap.com/viewer/0fa6bcf4736c46f78c248512391eb467/SHIP/en-US/4abf9290a64f43b59fbf35a3d8e5ba4d.html
We are trying to test the cloud functionality locally. I have an azurite Docker container running on my machine and have modified the mentioned properties in the local.properties file, but the files do not seem to be picked up by hybris in any of the cases we have tried.
First, we have in our local azurite storage a blob container called hybris. Within this container we have the folder path master/hotfolder, and according to the docs, uploading a sample.csv file there should trigger a hot folder upload.
We also have a mapping for our hot-folder import that scans the files within this folder: #{baseDirectory}/${tenantId}/sample/classifications. #{baseDirectory} is configured using a property like so: ${HYBRIS_DATA_DIR}/sample/import
Can we keep these mappings within our hot folder xml definitions, or do we need to change them?
How should the blob container be named in order for it to be accessible to hybris?
Thank you very much,
I would be very happy to provide any further information.
In the end I did manage to run cloud hot folder imports on my local machine.
It was a matter of correctly configuring a number of properties that are used by the cloudhotfolder and azurecloudhotfolder extensions.
Simply use the following properties to set the desired behaviour of the system:
cluster.node.groups=integration,yHotfolderCandidate
azure.hotfolder.storage.account.connection-string=DefaultEndpointsProtocol=http;AccountName=devstoreaccount1;AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;BlobEndpoint=http://127.0.0.1:32770/devstoreaccount1;
azure.hotfolder.storage.container.hotfolder=${tenantId}/your/path/here
cloud.hotfolder.default.mapping.file.name.pattern=^(customer|product|url_media|sampleFilePattern|anotherFileNamePattern)-\\d+.*
cloud.hotfolder.default.images.root.url=http://127.0.0.1:32785/devstoreaccount1/${azure.hotfolder.storage.container.name}/master/path/to/media/folder
cloud.hotfolder.default.mapping.header.catalog=YourProductCatalog
And that is it. If there are existing routings for the traditional hot folder import, these can also be used, but their file-name patterns must be added to the value of the
cloud.hotfolder.default.mapping.file.name.pattern
property.
I am trying the same - to set up a local dev env to test out the cloud hot folder. It seems that you have had some success. Can you share where you located the azurecloudhotfolder extension, which is called out here: https://help.sap.com/viewer/0fa6bcf4736c46f78c248512391eb467/SHIP/en-US/4abf9290a64f43b59fbf35a3d8e5ba4d.html
Thanks

Where are source files stored on Google Cloud Platform when deployed from local machine

I have just deployed a basic NodeJS Express app on Google Cloud Platform from IntelliJ IDEA. However, I cannot find or browse the source files. I have searched the Development tab and the App Engine tab; they show the project but not the actual files. I can access the application from my browser and it is running fine, and I can see the activity and every request coming into the application, but I cannot see the source files. I tried searching for them in the Google Cloud Console terminal and I cannot locate the files there either. It's puzzling because I don't know where the files are being served from.
AFAIK seeing the live app code/static content directly in the developer console is not possible (at least not yet), not even for the standard environment apps.
For apps using the flexible environment (which includes node.js apps), accessing the live app source code may be even more complex, as what's actually executed on GAE is a container image/Dockerfile (as opposed to the plain app code source files of a standard environment app). From Deploying your program:
Deploy your app using the gcloud app deploy command. This command automatically builds a container image for you by using the Container Builder service (Beta) before deploying the image to the App Engine flexible environment control plane. The container will include any local modifications you've made to the runtime image.
Since the container images are fundamentally Docker images, it might be possible to extract their content using the docker export command:
Usage: docker export [OPTIONS] CONTAINER
Export a container's filesystem as a tar archive
Options:
--help Print usage
-o, --output string Write to a file, instead of STDOUT
The docker export command does not export the contents of volumes associated with the container. If a volume is mounted on top of an existing directory in the container, docker export will export the contents of the underlying directory, not the contents of the volume.
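As a usage sketch (the container name is a placeholder), exporting and then browsing the filesystem might look like:

docker export -o gae-app.tar my-gae-container   # my-gae-container is a placeholder name
tar -tf gae-app.tar | less                      # list the exported files to find the app code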
One way of checking the exact structure of the deployed app (at least in the standard environment) is to download the app code and check it locally - this may be useful if a suspected incorrect deployment puts a question mark over the local repository from which the deployment originated. Not sure if this is possible with the flexible environment, though.
The accepted answer to the recent Deploy webapp on GAE then do changes online from GAE console post appears to indicate that reading and maybe even modifying live app code might be possible (but I didn't try it myself and it's not clear if it would also work for the flexible environment).

How can I bundle my database file into my app, ready for a side load install?

I am creating a Windows 10 Universal App which uses a local SQLite Database.
In order for the app to use the database file, it must be placed in:
C:\Users\<Username>\AppData\Local\Packages\<Name of Package>\Local State
Now, I understand this is the 'local' file structure for the application. However, I have a pre-made database that the app needs to interact with, which should therefore be bundled as part of the app on install.
Is there a method of including my database in a usable fashion when distributing my application via a side-load install?
Furthermore, this problem is of paramount importance, as this 'C:\' directory will not exist when pushing my application to a mobile phone or another Windows 10 (non-desktop) device.
You cannot package the database directly as read-write data (local state). If you only ever need to read from the database, you can just include it in your project and read it from Package.Current.InstalledLocation.
If you need to write to the database, but it contains some initial values you want to ship with your app, then you still need to include the database in your project, but then copy it from the InstalledLocation to ApplicationData.Current.LocalFolder if it doesn't exist when your app starts up.
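A minimal sketch of that first-run copy, assuming a database file named data.db shipped in the package root (the name is illustrative):

using System.Threading.Tasks;
using Windows.ApplicationModel;
using Windows.Storage;

public static class DatabaseInstaller
{
    // "data.db" is an illustrative file name; use the real database name.
    public static async Task EnsureDatabaseAsync()
    {
        StorageFolder local = ApplicationData.Current.LocalFolder;

        // TryGetItemAsync returns null when no copy exists yet in local state.
        if (await local.TryGetItemAsync("data.db") == null)
        {
            // First run: copy the read-only packaged database into LocalFolder.
            StorageFile packaged =
                await Package.Current.InstalledLocation.GetFileAsync("data.db");
            await packaged.CopyAsync(local);
        }
    }
}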
You can always export your existing database as a SQL script and save it in your project assets.
On the first run of your application you can create the SQLite file in your LocalFolder and run the script, with its CREATE and INSERT queries.
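A sketch of that approach, assuming Microsoft.Data.Sqlite as the client library (the answer does not name one; sqlite-net would look similar):

using Microsoft.Data.Sqlite;   // assumed client library

public static class DatabaseInitializer
{
    // dbPath would typically point into ApplicationData.Current.LocalFolder;
    // script holds the exported CREATE and INSERT statements.
    public static void CreateFromScript(string dbPath, string script)
    {
        // Opening a connection creates the database file if it does not exist.
        using (var connection = new SqliteConnection($"Data Source={dbPath}"))
        {
            connection.Open();
            using (var command = connection.CreateCommand())
            {
                command.CommandText = script;   // the batched statements run in order
                command.ExecuteNonQuery();
            }
        }
    }
}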
