I have deleted a service using the ArangoDB web interface. When I try to upload a zipped folder with a new service to the same mount path, I get the error:
Services: 3011. service already exists Mount path
I'm using ArangoDB for Win64, version 3.3.13.
How can I fix this?
Thank you.
Okay,
after much hassle, the issue is solved. It may have been related to some operations I did before deleting the problematic service. Here is what fixed it:
Went to C:\ProgramData\ArangoDB-apps\_db\healDB
Deleted the sub-folder with the relevant service name (it was empty)
Entered the _appbundles sub-folder
Deleted the relevant service zip file
Restarted the DB.
Voilà!
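For anyone who prefers the command prompt, the same cleanup as commands; <service-name> and <service>.zip stand in for the actual names:

rem remove the leftover (empty) folder of the deleted service
cd C:\ProgramData\ArangoDB-apps\_db\healDB
rd /s /q "<service-name>"
rem remove the stale service bundle
cd _appbundles
del "<service>.zip"
rem then restart the ArangoDB service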
I am converting a working web app project to use a Function App instead. While I can use SQLite with EF Core on the web app, the Function App fails. I am using the latest EntityFramework.SQLite.Core libraries. I am deploying to a local Docker container, so the folder structure is as it would be on Azure. Here is my test project showing the issue.
I copied the database to the /home directory, but to no avail. I expect the Seed process to pass the check for !Employees.Any() and, if true, seed the database from the CSV mock data file. But the first attempt to access the database with Employees.Any() gives the error:
DllNotFoundException: Unable to load shared library 'e_sqlite3' or one of its dependencies. In order to help diagnose loading problems, consider setting the LD_DEBUG environment variable: libe_sqlite3: cannot open shared object file: No such file or directory
How can I get around this? It would be a cheap way to host an API.
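For reference, the seed logic in question is roughly of this shape; AppDbContext, Employee, and CsvLoader are placeholder names for illustration, not the actual types from the project:

using System.Linq;

public static class DbInitializer
{
    // Seeds the database from the CSV mock data on first run.
    public static void Seed(AppDbContext context)
    {
        // First query against the SQLite database; in the Function App this
        // is the call that throws the DllNotFoundException for e_sqlite3.
        if (!context.Employees.Any())
        {
            foreach (var employee in CsvLoader.LoadEmployees("mock_data.csv"))
            {
                context.Employees.Add(employee);
            }
            context.SaveChanges();
        }
    }
}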
I was trying to deploy my web app to an App Service I created in the Azure portal. I was doing it through Visual Studio Code with the help of the Azure extensions (I don't know if that's important to mention or not). I got this error:
"The specified path, file name, or both are too long. The fully qualified file name must be less than 260 characters"
Unfortunately, I cannot change the file names or move the project to a different folder. What can I do to fix this issue? Thanks in advance :)
Please check whether the steps below help fix the issue:
If it is a Node.js web app, delete the node_modules folder within the project folder, reload the solution, then rebuild and publish to Azure. In the Azure portal, under Web App > Console, run the command npm install, which restores the dependencies into the node_modules folder.
If possible, move your projects/solutions to the root directory or to a shorter directory path.
According to this similar issue reported in the Azure Web App GitHub repo, you can include a setting in the publish profile file like:
<PublishTempFolderName>i2</PublishTempFolderName>
Check that reference when using the above setting; some global settings given in the GitHub repo need to be applied as well.
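For context, a minimal sketch of where that setting sits in the publish profile (the profile file name and the i2 value are just examples):

<!-- Properties\PublishProfiles\MyProfile.pubxml -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <!-- a short temp folder name keeps the publish paths under the 260-character limit -->
    <PublishTempFolderName>i2</PublishTempFolderName>
  </PropertyGroup>
</Project>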
(OR) Enable the Enable NTFS long paths policy in the Local Group Policy Editor and check again.
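As far as I know, that policy toggles the LongPathsEnabled registry value (Windows 10 1607 or later), so the equivalent change as a .reg sketch is:

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\FileSystem]
"LongPathsEnabled"=dword:00000001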
My Docker Compose file consists of services and volumes. The volumes are mapped to file shares. Everything worked fine for a few months, but now I have changed my file share path and the service in the container is still using the old file share path.
docker volume inspect <container_id> shows the new file share path. Even inside the container it shows the new file share path. I don't know where it is picking up the old file share path from. I have deleted and recreated the volumes and containers multiple times, but no luck. I also restarted the Docker service, but no luck. Please help.
Updating the mounted path to a different name finally worked for me.
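For anyone hitting the same thing, a sketch of what that rename looks like in a compose file; the service, image, volume, and share details are all placeholders:

services:
  app:
    image: myapp:latest
    volumes:
      - sharedata_v2:/data       # reference the renamed volume

volumes:
  sharedata_v2:                  # renamed from sharedata, so a fresh volume is created
    driver: local
    driver_opts:
      type: cifs
      o: "username=user,password=secret,vers=3.0"
      device: "//fileserver/new-share-path"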
We are trying to use the cloud hot folder functionality, and in order to do so we are modifying our existing hot folder implementation, which was not originally implemented for use in the cloud.
Following the steps on this help page:
https://help.sap.com/viewer/0fa6bcf4736c46f78c248512391eb467/SHIP/en-US/4abf9290a64f43b59fbf35a3d8e5ba4d.html
We are trying to test the cloud functionality locally. I have an azurite Docker container running on my machine and I have modified the mentioned properties in the local.properties file, but it seems that the files are not being picked up by hybris in any of the cases we are trying.
First, we have in our local azurite storage a blob container called hybris. Within this container we have the folders master > hotfolder, and according to the docs, uploading a sample.csv file into it should trigger a hot folder upload.
We also have a mapping for our hot folder import that scans the files within this folder: #{baseDirectory}/${tenantId}/sample/classifications. {baseDirectory} is configured using a property like so: ${HYBRIS_DATA_DIR}/sample/import
Can we keep these mappings within our hot folder XML definitions, or do we need to change them?
How should the blob container be named in order for it to be accessible to hybris?
Thank you very much,
I would be very happy to provide any further information.
In the end I did manage to run cloud hot folder imports on my local machine.
It was a matter of correctly configuring a number of properties that are used by the cloudhotfolder and azurecloudhotfolder extensions.
Simply set the following properties to get the desired behaviour of the system:
cluster.node.groups=integration,yHotfolderCandidate
azure.hotfolder.storage.account.connection-string=DefaultEndpointsProtocol=http;AccountName=devstoreaccount1;AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;BlobEndpoint=http://127.0.0.1:32770/devstoreaccount1;
azure.hotfolder.storage.container.hotfolder=${tenantId}/your/path/here
cloud.hotfolder.default.mapping.file.name.pattern=^(customer|product|url_media|sampleFilePattern|anotherFileNamePattern)-\\d+.*
cloud.hotfolder.default.images.root.url=http://127.0.0.1:32785/devstoreaccount1/${azure.hotfolder.storage.container.name}/master/path/to/media/folder
cloud.hotfolder.default.mapping.header.catalog=YourProductCatalog
And that is it. If there are existing routings for traditional hot folder imports, these can also be used, but their mappings should be included in the value of the
cloud.hotfolder.default.mapping.file.name.pattern
property.
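For example, with the pattern above an uploaded file named product-20190101.csv (a made-up name) would be matched and processed, while product.csv would be ignored because it lacks the -\\d+ suffix.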
I am trying the same - to set up a local dev environment to test out the cloud hot folder. It seems that you have had some success. Can you share where you located the azurecloudhotfolder extension, which is called out here: https://help.sap.com/viewer/0fa6bcf4736c46f78c248512391eb467/SHIP/en-US/4abf9290a64f43b59fbf35a3d8e5ba4d.html
Thanks
I registered with AppFog to share my web applications and logged in using Linux terminal commands, but there is one problem I couldn't find a way to fix.
I want to delete files or directories in my AppFog application using terminal commands. Is this possible?
I couldn't find any documentation regarding deletion. Since it works on the same principle as git, you could just delete the file you want to remove from your local directory and run the af update appname command. That should do it.
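In other words, something like this (the file path and app name are placeholders):

# delete the unwanted file from your local copy of the app
rm path/to/unwanted-file
# push the updated state to AppFog
af update appname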