I have a static website served from a $web container, and inside that container a storybook folder that holds a Storybook build. Currently, to reach the Storybook build I have to browse to: www.xyz/storybook/index.html.
The goal is to reach that content when calling www.xyz/storybook.
To achieve this I tried to set up a URL rewrite using the Rules Engine.
My rule looks like this
Somehow the rewrite is not working and I have no idea why.
I tried to reproduce the same setup in my environment and I got the same 404 error:
To resolve this issue, check whether you have provided the correct HTML error document path, as below:
In the $web container, I added abc.html as my error document, along with my image files. In the document path, try adding your image URL path, as below:
Also, when I browse my primary endpoint URL, I get the image successfully, as shown below. For this to work, make sure the container's access level is set to Container (anonymous read access for containers and blobs). I also created an Azure CDN endpoint under Security + networking.
Check whether you have given the correct origin path, as below:
To access the image, combine the endpoint hostname with your image URL path, as below; that is, append the blob URL path (/$web/abc.html) and omit the origin hostname (https://imran955.blob.core.windows.net),
i.e., https://<endpoint-hostname>/<blob-path>
Result:
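As a side note, the index and error documents for the static website can also be set from code instead of the portal. Below is a minimal sketch using the Azure.Storage.Blobs .NET SDK; the connection string and document names are placeholders:

using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

// Placeholder connection string for the storage account that hosts the $web container.
var serviceClient = new BlobServiceClient("<storage-connection-string>");

// Read the current service properties, enable static website hosting,
// and point the index and error documents at files uploaded to $web.
BlobServiceProperties properties = serviceClient.GetProperties().Value;
properties.StaticWebsite.Enabled = true;
properties.StaticWebsite.IndexDocument = "index.html";
properties.StaticWebsite.ErrorDocument404Path = "abc.html";
serviceClient.SetProperties(properties);

Note that this only controls which documents the static website endpoint serves; the URL rewrite itself still has to be configured on the CDN endpoint's rules engine.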
I am working on creating a Databricks notebook template with a company logo. Using the code below to display the image throws an error.
Code:
%md
<img src ='/test/image/MyImage.jpg'>
Error:
HTTP ERROR 403: Invalid or missing CSRF token
Please guide me.
You either need to store the image somewhere and refer to it by its full URL; for example, you could reference it from your company site.
Another way is to upload the file to the /FileStore directory on DBFS; you can then refer to it using the /files/ URL, which is supported in both HTML and Markdown (see docs):
%md
![my_test_image](files/image.jpg)
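Since the /files/ path is also supported in HTML, the equivalent image tag in a markdown cell should look roughly like this (assuming the file was uploaded as /FileStore/image.jpg):
%md
<img src="files/image.jpg">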
You can upload the image using the databricks-cli, or via the UI (if you have the DBFS File Browser enabled). Another option is the DBFS REST API, but it's cumbersome.
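For example, an upload with the databricks-cli would look something like this (the local and DBFS paths are placeholders):
databricks fs cp ./image.jpg dbfs:/FileStore/image.jpg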
I have deployed an ASP.NET Core Web API project to Azure App Service. I copied a file to /site/wwwroot using an FTP client. Now suppose the file name is xyz.jpg; it should be accessible at somename.azurewebsites.net/xyz.jpg, but it's not. I have tried putting the file in other folders too, but nothing works.
I also have a controller for uploading pictures, which works fine: it uploads the picture to the desired folder, and I can see it via the FTP client, but the picture is still not accessible via any link. What am I doing wrong here?
For a Web API application, you have to define the request and response yourself in a controller; otherwise your link can't be recognized by the application.
For example, you can add the following method to your controller. It works on my side.
[Route("myroute/{pic}")]
public IActionResult Get(string pic)
{
Byte[] b = System.IO.File.ReadAllBytes("image/"+pic);
return File(b, "image/jpeg");
}
In my code, pictures are stored in the folder called image in the root directory, and I define a route called myroute.
Here's my link to access the picture: https://myappname.azurewebsites.net/myroute/mypicname.jpg
Hope it helps.
I am using ImageResizer with the AzureReader2 plugin on an ASP.NET MVC application.
When I type http://localhost:[port]/[prefix]/[blobname] on the address bar I get redirected to [endpoint]/[blobname], and I am able to see my image.
But I can't use any query string; for example, typing http://localhost:[port]/[prefix]/[blobname]?width=200 gives me the IIS HTTP Error 404.0 - Not Found page. I've tried setting redirectToBlobIfUnmodified to both true and false, but I get the same result.
When I host the image locally, everything works fine.
I got it working by including the container name in the URL:
http://localhost:[port]/[prefix]/[containername]/[blobname]?width=200
Using this method, it's the original image that gets broken; in those cases, I would just have to use the raw blob storage URL.
I am using the Java jclouds API for access to my Rackspace cloud files account.
I can create and list containers, and upload objects, but I can't figure out how to get the public links for an uploaded object. (I can see these public links from within the Rackspace control panel, by right-clicking on the object - there are 4 types: HTTP, HTTPS, Streaming, iOS Streaming).
The closest I can get is by using object.getInfo() to get the object's metadata. This includes a URI, but this doesn't resemble the public links I find from within the control panel.
Anyone know what I'm doing wrong?
I figured it out...
First, I should get the public URI from the object's container, not from the object itself.
Then I use a CloudFilesClient object. On the container I need to use getCDNMetadata("containername").getCDNUri()
Here is more information and some sample code to get the specific file CDN address.
For more details you can checkout the Java guide:
https://developer.rackspace.com/docs/cloud-files/quickstart/?lang=java
First, get the Cloud Files API:
CloudFilesApi cloudFilesApi = ContextBuilder.newBuilder("rackspace-cloudfiles-us")
        .credentials("{username}", "{apiKey}")
        .buildApi(CloudFilesApi.class);
From there you can query the container:
CDNApi cdnApi = cloudFilesApi.getCDNApi("{region}");
CDNContainer cdnContainer = cdnApi.get("{containerName}");
Now with that CDNContainer you can get the specific web address that you need:
URI httpURI = cdnContainer.getUri();
URI httpsURI = cdnContainer.getSslUri();
This gives you the base URI for the container. To get the final address for your specific file, you need to append /{your_file_name.extension} to the end of that address. For example, if my base URI were converted to a URL and then to a String, it might look like:
http://123456asdf-qwert987653.rackcdn.com/
From here I can get a file with the name example.mp4 with the following address:
http://123456asdf-qwert987653.rackcdn.com/example.mp4
This all assumes that you have already enabled CDN on the container.
I'm trying to deploy to Azure (Silverlight application); I've migrated my DB, updated connection strings and published my application to Azure but when I click the service URL I get this:
403 - Forbidden: Access is denied.
You do not have permission to view this directory or page using the credentials that you supplied.
Any idea what I need to change?
Many thanks
If the name of your bundle matches a path in the file system, then IIS on Azure will throw the 403 Forbidden error.
So if you have a path in your solution called /Content/css, and in BundleConfig.cs you have a bundle registered as bundles.Add(new StyleBundle("~/Content/css").Include( ..., which is rendered in your _Layout.cshtml file as @Styles.Render("~/Content/css"), then you get that error.
I solved this by changing the name of my bundle from /Content/css to /Style/css
bundles.Add(new StyleBundle("~/Content/css").Include( ... in BundleConfig.cs becomes bundles.Add(new StyleBundle("~/Style/css").Include( ...
@Styles.Render("~/Content/css") in your _Layout.cshtml becomes @Styles.Render("~/Style/css") in your _Layout.cshtml
You can use any names you like; there are no specific limitations. I imagine you could also go ahead and rename the folders in your solution instead, and that should work too.
NB: The name of the bundle turns into a virtual directory that the browser can request. If it resembles a physical folder structure, it will throw back the 403.
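For reference, a minimal sketch of the renamed registration (the included stylesheet path is just an example). In BundleConfig.cs:
bundles.Add(new StyleBundle("~/Style/css").Include(
    "~/Content/site.css"));
And in _Layout.cshtml:
@Styles.Render("~/Style/css")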
I needed to use the full path to a page within the application, as I hadn't set a default document in my web.config, e.g.:
<add value="Pages/Home.aspx"/>
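For context, that entry goes inside the defaultDocument section of web.config, roughly like this:
<system.webServer>
  <defaultDocument>
    <files>
      <add value="Pages/Home.aspx"/>
    </files>
  </defaultDocument>
</system.webServer>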
I got the same error in my MVC project.
After some debugging I found that it was because I had removed all "default pages" in the Azure Portal.
I added a dummy "index.html" record in the portal, and then everything worked nicely again :)