Archive not found in the storage location: Google Function - node.js

I have a running Google Cloud Function which I use in my code, and it works fine.
But when I go to the Cloud Function in the console to see the source code, it shows:
Archive not found in the storage location
Why can't I see my source code? What should I do?
Runtime: Node.js 10

There are two possible reasons:
You may have deleted the source bucket in Google Cloud Storage. Have you perhaps deleted a GCS bucket named something like gcf-sources-xxxxxx? That is the storage location where your code is archived. If you are sure you have deleted the source bucket, there is no way to restore your source code.
Much more likely, though, is that you did not delete anything but instead effectively renamed the bucket, for example by choosing a nearby city in the location settings. If the GCS bucket's region does not match your Cloud Function's region, this error is thrown. You should check the region of both services.
You can check the Cloud Function's region under Details -> General Information.

This error had appeared for me before, when I browsed the Google Storage location used by the Cloud Function, without deleting anything there. It might have happened, though, that I changed the location / city of the bucket to Region MY_REGION (MY_CITY). In my case, the CF was likely already in the chosen region, so bullet point 2 of the answer above probably does not cover the whole issue.
I guess there is a third point that could be added to the list:
3. If you choose a region for the first time, the bucket name gets a new suffix that was not there before, that is, it changes from gcf-sources-XXXXXXXXXXXX to gcf-sources-XXXXXXXXXXXX-MY_REGION. The CF is then no longer able to find its source code at the old bucket address. That would explain this first error.
That first error put aside, the error in question now appears again, and this time I have not done anything apart from running into Google App Engine deployment fails - Error while finding module specification for 'pip' (AttributeError: module '__main__' has no attribute '__file__'). I left it alone for two days, not doing anything, only to get the error in question afterwards. Thus, you sometimes seem to just lose your deployed script out of nowhere; better keep a backup before each deployment.
Solution:
Create a new Cloud Function, or
edit the Cloud Function, choose Inline Editor as the source code option, recreate the default files for the Node.js 10 runtime manually (sketched below), and fill them with your backup code.
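For reference, this is roughly what the inline editor's default index.js looks like for an HTTP-triggered Node.js 10 function (helloWorld is just the generated placeholder name), together with a minimal package.json containing at least a name and version; paste your backup code over it:

// index.js - default HTTP handler the inline editor generates; replace the body with your backup code.
exports.helloWorld = (req, res) => {
  const message = req.query.message || req.body.message || 'Hello World!';
  res.status(200).send(message);
};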

Related

How to check if website name provided is available and free in Azure SDK for dot net

I'm trying to create a web app in Azure using:
azure.WebApps
    .Define(name)
    .WithExistingWindowsPlan(plan)
    .WithExistingResourceGroup(resourceGroupName)
    .CreateAsync()
Since the name must be globally unique, how can I check if the name exists already?
Similar to Erndob's comment there, I'll extend it by saying: catch the error and look into the specifics. If it fails with a name-not-unique error, that is your check right there. Any other failure should be treated differently.

Active Storage returns attached? as true in console but on Rails server it returns false

I am new to Rails Active Storage and facing some issues with image uploading.
When I try to upload an image, it uploads successfully, but when I try to get the image it returns attached? as false. However, when I try the same record in the console, it returns the image URL.
Rails server output:
I ran into a similar situation when I had multiple records attached to the same blob.
Not sure if that's what happened here, but if you had two companies using the same attachment and then purged that attachment from one record, the purge deletes both the blob reference and the file itself without cleaning up the other record's attachment. This means the other record will still think it has a file attached (as it's still associated with a blob), even though the file is gone.
A good way to find out is to check in the Rails console:
obj.image.blob.filename
This will show whether the actual file associated with the object still exists, rather than just its blob record. It's a bug in Active Storage that they're apparently addressing; not sure if it applies here or not.

Pushing documents (blobs) for indexing - Azure Search

I've been working with Azure Search + Azure Blob Storage for a while, and I'm having trouble indexing the incremental changes for newly uploaded files.
How can I refresh the index after uploading a new file into my blob container? These are my steps after uploading a file (I'm using the REST service to perform these actions): I'm using Microsoft Azure Storage Explorer [link].
Through this app I've uploaded my new file to a folder that was already created. After that, I used the HTTP REST API to perform a 'Run' indexer command, as you can see in this [link].
The indexer shows me that my new file was successfully added, but when I search, the content of this new file is not found.
Does anybody know how to add this new file to the index, and how to find it by searching for its content?
I'm following the Microsoft tutorials, but I couldn't find a solution for this issue.
Thanks, guys!
Assuming everything is set up correctly, you don't need to do anything special - new blobs will be picked up and indexed the next time the indexer runs according to its schedule, or when you run the indexer on demand.
However, when you run the indexer on demand, successful completion of the Run Indexer API means that the request to run the indexer has been submitted; it does not mean that the indexer has finished running. To determine when the indexer has actually finished running (and observe the errors, if any), you should use Indexer Status API.
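As an illustration, here is a rough Node.js sketch (node-fetch is an assumed dependency; the service name, indexer name, admin key and API version are placeholders) that runs the indexer on demand and then polls the Indexer Status API until the last run finishes:

const fetch = require('node-fetch'); // assumed dependency

const service = '<service-name>';
const indexer = '<indexer-name>';
const apiVersion = '2019-05-06';
const headers = { 'api-key': '<admin-api-key>', 'Content-Type': 'application/json' };
const base = `https://${service}.search.windows.net/indexers/${indexer}`;

async function runIndexerAndWait() {
  // A 202 here only means the run request was accepted, not that indexing has finished.
  await fetch(`${base}/run?api-version=${apiVersion}`, { method: 'POST', headers });

  // Poll the status API until the last run is no longer in progress, then inspect any errors.
  for (;;) {
    const res = await fetch(`${base}/status?api-version=${apiVersion}`, { headers });
    const status = await res.json();
    if (status.lastResult && status.lastResult.status !== 'inProgress') {
      console.log(status.lastResult.status, status.lastResult.errors);
      return;
    }
    await new Promise(resolve => setTimeout(resolve, 5000)); // wait 5 seconds between polls
  }
}

runIndexerAndWait().catch(console.error);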
If you still have questions, please let us know your service name and indexer name and we can take a closer look at the telemetry.
I'll try to describe how I figured out this issue.
Firstly, I created a DataSource through this command:
POST https://[service name].search.windows.net/datasources?api-version=[api-version]
https://learn.microsoft.com/en-us/rest/api/searchservice/create-data-source.
Secondly, I created the Index:
POST https://[servicename].search.windows.net/indexes?api-version=[api-version]
https://learn.microsoft.com/en-us/rest/api/searchservice/create-index
Finally, I created the Indexer. The problem happened at this step, because this is where all the configuration is set.
POST https://[service name].search.windows.net/indexers?api-version=[api-version]
https://learn.microsoft.com/en-us/rest/api/searchservice/create-indexer
After all these things are done, the index starts indexing all content automatically (once there is content in blob storage).
The crucial part comes now. While the indexer is trying to extract the text from your files, issues can occur when a file type is not 'indexable'. In particular, there are two properties you must pay attention to: excluded extensions and indexed extensions.
If you don't list the file types properly, the indexer throws an exception. The feedback message (which in my opinion is not good, and was rather misleading) says that to avoid this error you should set the indexer to '"dataToExtract" : "storageMetadata"'.
That setting means you are only indexing the metadata and no longer the content of your files, so you cannot search for and retrieve documents by their content.
After that, the same message at the bottom says that to avoid this issue you should set two properties (which solved the problem):
"failOnUnprocessableDocument": false, "failOnUnsupportedContentType": false
Now everything is working properly. I appreciate your help, @Eugene Shvets, and I hope this is useful for someone else.
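To make that concrete, here is a hedged sketch (written as a Node.js object; the indexer, data source and index names and the extension lists are placeholders) of the kind of indexer definition body that ends up working, POSTed to the Create Indexer endpoint shown above:

// Body to POST (as JSON) to https://[service name].search.windows.net/indexers?api-version=[api-version]
const indexerDefinition = {
  name: 'blob-indexer',              // placeholder
  dataSourceName: 'blob-datasource', // placeholder
  targetIndexName: 'blob-index',     // placeholder
  parameters: {
    configuration: {
      dataToExtract: 'contentAndMetadata',          // index file content, not only storage metadata
      indexedFileNameExtensions: '.pdf,.docx,.txt', // example list of types you do want indexed
      excludedFileNameExtensions: '.png,.jpg',      // example list of types to skip
      failOnUnprocessableDocument: false,           // the two properties that solved the problem
      failOnUnsupportedContentType: false
    }
  }
};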

Writing and Reading to Local Storage in Azure WebJobs

I need to use local storage in an Azure WebJob (continuous, if it matters). What is the recommended path for this? I want this to be as long-lasting as possible, so I don't want a Temp directory. I am well aware that local storage in Azure will always need to be backed by Blob storage or similar, which I will already be handling.
(To preempt questions on that last part: this is a large, infrequently changing file (it changes maybe once per week) that I want to cache in local storage for much faster startup times. When it's not there or is out of date (which I will handle checking), it will be downloaded from the source blob, and so forth.)
Related questions like Accessing Local Storage in Azure don't specifically apply to a WebJob. However, this question is closely connected, but 1) the answer relies on using Server.MapPath, which is a System.Web-dependent solution I think, and 2) I don't find that answer to have any research or definitive basis (though it is probably a good guess at the best solution). It would be nice if the Azure team gave more direction on this important issue; we're talking about nothing less than usage of the local hard drive.
Here are some environment variables worth considering, though I don't know which to use:
Environment.CurrentDirectory: D:\local\Temp\jobs\continuous\webjobname123\idididid.id0
[PUBLIC, D:\Users\Public]
[ALLUSERSPROFILE, D:\local\ProgramData]
[LOCALAPPDATA, D:\local\LocalAppData]
[ProgramData, D:\local\ProgramData]
[WEBJOBS_PATH, D:\local\Temp\jobs\continuous\webjobname123\idididid.id0]
[SystemDrive, D:]
[LOCAL_EXPANDED, C:\DWASFiles\Sites\#1appservicename123]
[WEBSITE_SITE_NAME, webjobname123]
[USERPROFILE, D:\local\UserProfile]
[USERNAME, RD00333D444333$]
[WEBSITE_OWNER_NAME, asdf1234-asdf-1234-asdf-1234asdf1234+eastuswebspace]
[APP_POOL_CONFIG, C:\DWASFiles\Sites\#1appservicename123\Config\applicationhost.config]
[WEBJOBS_NAME, webjobname123]
[APPSETTING_WEBSITE_SITE_NAME, webjobname123]
[WEBROOT_PATH, D:\home\site\wwwroot]
[TMP, D:\local\Temp]
[COMPUTERNAME, RD00333D444333]
[HOME_EXPANDED, C:\DWASFiles\Sites\#1appservicename123\VirtualDirectory0]
[APPDATA, D:\local\AppData]
[WEBSITE_INSTANCE_ID, asdf1234asdf134asdf1234asdf1234asdf1234asdf1234asdf12345asdf12342]
[HOMEPATH, \home]
[WEBJOBS_SHUTDOWN_FILE, D:\local\Temp\JobsShutdown\continuous\webjobname123\asdf1234.pfs]
[WEBJOBS_DATA_PATH, D:\home\data\jobs\continuous\webjobname123]
[HOME, D:\home]
[TEMP, D:\local\Temp]
Using the %HOME% environment variable as a base path works nicely for me. I use a subfolder to store job-specific data, but any other folder structure on top of this base path is also valid. For more details, take a look at https://github.com/projectkudu/kudu/wiki/Understanding-the-Azure-App-Service-file-system and https://github.com/projectkudu/kudu/wiki/File-structure-on-azure
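As a rough illustration of that idea (a Node.js-flavored sketch; the subfolder name is hypothetical, and in .NET you would read the same variables via Environment.GetEnvironmentVariable):

const fs = require('fs');
const path = require('path');

// %HOME% (D:\home) is the durable, shared App Service storage; %TEMP% (D:\local\Temp) is per-instance and volatile.
const persistentBase = process.env.HOME;                               // e.g. D:\home
const cacheDir = path.join(persistentBase, 'data', 'my-webjob-cache'); // hypothetical subfolder for job-specific data
fs.mkdirSync(cacheDir, { recursive: true });

const scratchDir = process.env.TEMP; // e.g. D:\local\Temp - fine for throwaway files, wiped on restarts/instance moves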

Azure Storage copy an image from blob to blob

I am using Azure Storage for Node.js, and what I need to do is copy an image from one blob to another.
First I tried getBlobToFile to get the image into a temp location on disk and then createBlockBlobFromFile from that temp location. That method did the task, but for some reason the copy didn't complete in 10% of cases.
Then I tried to use getBlobToText and put the result of that into createBlockBlobFromText; I also tried passing options to mark the blob's content as an image. That method failed completely; the image couldn't even be opened after the copy.
Perhaps there is a way to copy a blob directly to another blob, but I didn't find that method.
What else can I do?
I'm not sure what your particular copy error is, but... with getBlobToLocalFile(), you're actually physically moving blob content from blob storage to your VM (or local machine), and then with createBlockBlobFromLocalFile() you're pushing the entire contents back to blob storage, resulting in two physical network moves.
The Azure Storage system supports blob-copy as a 1st-class operation. While it's available via REST API call, it's also wrapped in the same SDK you're using, in the method BlobService.startCopyBlob() (source code here). This will instruct the storage to initiate an async copy operation, completely within the storage system (meaning no download+upload on your side). You'll be able to set source and destination, set timeouts, etc. (all parameters are fully documented in the source code).
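For example, a minimal sketch with the azure-storage Node.js SDK (container and blob names are placeholders; if the source blob is private or lives in another account, pass a SAS-authenticated source URL instead):

const azure = require('azure-storage');

// Uses the AZURE_STORAGE_CONNECTION_STRING environment variable for credentials.
const blobService = azure.createBlobService();

// Build the source blob URL, then ask the storage service to copy it server-side.
const sourceUrl = blobService.getUrl('source-container', 'image.png');

blobService.startCopyBlob(sourceUrl, 'dest-container', 'image-copy.png', (err, result) => {
  if (err) throw err;
  // The copy runs asynchronously inside the storage service; if you need to wait
  // for completion, poll getBlobProperties on the destination and check its copy status.
  console.log('Copy started for blob:', result.name);
});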
The link in the accepted answer is broken, although the method is correct: the method startCopyBlob is documented here
(Updated: Jan 3, 2020) https://learn.microsoft.com/en-us/javascript/api/azure-storage/BlobService?view=azure-node-latest#azure_storage_BlobService_createBlockBlobFromLocalFile
