Servers to upload images - Linux

I have two servers: one runs the application that uploads images, and the folder where the images are saved lives on the other server.
How can I do that, i.e. connect the application to the folder on the other server?

If you upload your images with PHP, one solution is to send each image over FTP from the first server to the second server (the one holding the image folders).
You need:
private function ftpConnect($ftp_server, $ftp_user_name, $ftp_user_pass)
{
    // set up a basic connection
    $conn_id = ftp_connect($ftp_server);
    if (ftp_login($conn_id, $ftp_user_name, $ftp_user_pass)) {
        echo "Logged In\n";
    } else {
        echo "Not Logged In\n";
        return false;
    }
    // passive mode usually plays better with firewalls/NAT
    ftp_pasv($conn_id, true);
    return $conn_id;
}
private function uploadOnServer($path, $saved_to, $ftp, $ftp_server_path)
{
    // upload a file and report whether it succeeded
    if (ftp_put($ftp, $ftp_server_path . $saved_to, $path, FTP_BINARY)) {
        echo "Successfully uploaded $ftp_server_path$saved_to\n";
        return true;
    }
    echo "There was a problem while uploading $ftp_server_path$saved_to\n";
    return false;
}
Then connect once:
$ftp = $this->ftpConnect($serv, $user, $pass);
For each image:
if ($this->uploadOnServer($path, $saved_to, $ftp, $ftp_server_path)) {
    unlink($path);
}
And at the end:
ftp_close($ftp);
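
Putting it together, a minimal sketch of the whole loop (the $serv, $user, $pass credentials and the $images array of local file paths are placeholders, not part of the original answer):

$ftp = $this->ftpConnect($serv, $user, $pass);
if ($ftp !== false) {
    foreach ($images as $path) {
        // keep the original file name on the remote server
        $saved_to = basename($path);
        if ($this->uploadOnServer($path, $saved_to, $ftp, $ftp_server_path)) {
            unlink($path); // remove the local copy once it is safely on the second server
        }
    }
    ftp_close($ftp);
}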


With chrome.storage, how to save a string into local storage and then test for existence?

I am writing a Chrome extension where I want to save a string (a URL) into local storage and later test for its existence.
For example:
function getURL(e) {
    let domain = e.url;
    // if domain does not exist in local storage, save domain into local storage
}
I have looked at the tutorials on both chrome.storage.local.get() and set(), but I still cannot figure out an easy way to do this.
Save the URLs in an array and then test against it:
function getURL(e) {
    let domain = e.url;
    // if domain does not exist in local storage, save domain into local storage
    // get the saved `urls`
    chrome.storage.local.get("urls", i => {
        // `urls` may not have been saved before
        let urls = i.urls || []
        // test if the domain already exists
        if (!urls.includes(domain)) {
            urls.push(domain)
            // save it into storage
            chrome.storage.local.set({urls})
        }
    })
}
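
If the extension runs on Manifest V3, where the chrome.storage methods also return a Promise when no callback is passed, the same logic reads a little flatter with async/await; a sketch:

// assumes Manifest V3, where chrome.storage methods return Promises
async function getURL(e) {
    const domain = e.url;
    // destructure with a default in case `urls` was never saved
    const { urls = [] } = await chrome.storage.local.get("urls");
    if (!urls.includes(domain)) {
        urls.push(domain);
        await chrome.storage.local.set({ urls });
    }
}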

Duplicating File Uploading Process - ASP.NET Web API

I created a Web API which allows users to send files and uploads them to Azure Storage. The way it works is: the client app connects to the API and sends one or more files to the file-upload controller, and the controller takes care of the rest, such as:
Upload the file to Azure Storage
Update the database
It works great, but I don't think it is the right way to do this, because there are now two separate transfers:
Upload the file from the client's file system to my Web API (server)
Upload the file from the API (server) to Azure Storage
It gives me the feeling that I am duplicating the upload process, as the same file first travels from the client (file system) to the API (server) and then on to Azure (the destination). I would even need to show the client two progress bars (client to server, then server to Azure) - that just doesn't make sense to me, and I feel my approach is incorrect.
My API accepts files up to 250 MB, so you can imagine the overhead.
What do you guys think?
//// API Controller
if (!Request.Content.IsMimeMultipartContent("form-data"))
{
    throw new HttpResponseException(HttpStatusCode.UnsupportedMediaType);
}
var provider = new RestrictiveMultipartMemoryStreamProvider();
var contents = await Request.Content.ReadAsMultipartAsync(provider);
int Total_Files = contents.Contents.Count();
foreach (HttpContent ctnt in contents.Contents)
{
    await storageManager.AddBlob(ctnt);
}
////// Stream
#region StreamHelper
public class RestrictiveMultipartMemoryStreamProvider : MultipartMemoryStreamProvider
{
    public override Stream GetStream(HttpContent parent, HttpContentHeaders headers)
    {
        // only accept a whitelist of extensions; everything else gets a null stream
        var extensions = new[] { "pdf", "doc", "docx", "cab", "zip" };
        var filename = headers.ContentDisposition.FileName.Replace("\"", string.Empty);
        if (filename.IndexOf('.') < 0)
            return Stream.Null;
        var extension = filename.Split('.').Last();
        return extensions.Any(i => i.Equals(extension, StringComparison.InvariantCultureIgnoreCase))
            ? base.GetStream(parent, headers)
            : Stream.Null;
    }
}
#endregion StreamHelper
///// AddBlob
public async Task<string> AddBlob(HttpContent _Payload)
{
    CloudStorageAccount cloudStorageAccount = KeyVault.AzureStorage.GetConnectionString();
    CloudBlobClient cloudBlobClient = cloudStorageAccount.CreateCloudBlobClient();
    CloudBlobContainer cloudBlobContainer = cloudBlobClient.GetContainerReference("SomeContainer");
    cloudBlobContainer.CreateIfNotExists();
    try
    {
        byte[] fileContentBytes = await _Payload.ReadAsByteArrayAsync();
        CloudBlockBlob blob = cloudBlobContainer.GetBlockBlobReference("SomeBlob");
        blob.Properties.ContentType = _Payload.Headers.ContentType.MediaType;
        blob.UploadFromByteArray(fileContentBytes, 0, fileContentBytes.Length);
        var B = await blob.CreateSnapshotAsync();
        B.FetchAttributes();
        return "Snapshot ETAG: " + B.Properties.ETag.Replace("\"", "");
    }
    catch (Exception X)
    {
        return "Error : " + X.Message;
    }
}
It gives me the feeling that I am duplicating the upload process as the same file first travels to API (server) and then Azure (destination) from the client (file system).
I think you're correct. One possible solution would be to have your API generate a Shared Access Signature (SAS) token and return that SAS token/URI to the client whenever a client wishes to upload a file.
Using this SAS URI your client can directly upload the file to Azure Storage without sending it to your API first. Once the file is uploaded successfully by the client, it can send a message to the API to update the database.
You can read more about SAS here: https://learn.microsoft.com/en-us/azure/storage/common/storage-dotnet-shared-access-signature-part-1.
I have also written a blog post, a long time back, on using SAS that you may find useful: https://gauravmantri.com/2013/02/13/revisiting-windows-azure-shared-access-signature/.
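
A minimal sketch of the server-side piece, assuming the same classic Microsoft.WindowsAzure.Storage SDK used in the question's code (KeyVault.AzureStorage.GetConnectionString() is the question's own helper; the container and blob names are placeholders):

public string GetUploadSasUri(string containerName, string blobName)
{
    CloudStorageAccount account = KeyVault.AzureStorage.GetConnectionString();
    CloudBlobClient client = account.CreateCloudBlobClient();
    CloudBlockBlob blob = client.GetContainerReference(containerName)
                                .GetBlockBlobReference(blobName);
    // short-lived, write-only token scoped to this single blob
    var policy = new SharedAccessBlobPolicy
    {
        Permissions = SharedAccessBlobPermissions.Create | SharedAccessBlobPermissions.Write,
        SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddMinutes(15)
    };
    return blob.Uri + blob.GetSharedAccessSignature(policy);
}

The client then PUTs the file bytes directly to the returned URI (for a block blob, a plain HTTP PUT with the x-ms-blob-type: BlockBlob header works) and afterwards calls the API once more to update the database.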

Can't see image that was uploaded

I have moved my XAMPP server to Raspbian for image uploading. I can access the server and the folder via the browser and can see that the image was registered in the DB, but unfortunately there is no image in that folder.
If you need more specific information, let me know.
My phone app: https://gist.github.com/DoIReallyNeedIt/a068c69958f269ad1d1a373d9ec8bcdb
Connection to server:
<?php
$user_name = "root";
$user_pass = "root";
$host_name = "localhost";
$db_name = "mydb";
$con = mysqli_connect($host_name, $user_name, $user_pass, $db_name);
if ($con)
{
    $image = $_POST["image"];
    $name = date('Y').date('m').date('d')."_".date('H').date('i')."_".$_POST["name2"]."_".$_POST["name1"];
    $sql = "insert into imageinfo(name) values ('$name')";
    $upload_path = "uploads/$name.jpg";
    if (mysqli_query($con, $sql)) {
        file_put_contents($upload_path, base64_decode($image));
        echo json_encode(array('response' => 'The image was uploaded successfully'));
    }
    else {
        echo json_encode(array('response' => 'The image upload failed'));
    }
}
mysqli_close($con);
?>
Found the problem. For some reason the chmod permissions hadn't been saved for the folders.
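
For reference, a small sketch of the kind of check that would have caught this, assuming the uploads/ directory from the script above:

$upload_dir = "uploads";
// create the directory if it is missing, then make sure the web server can write to it
if (!is_dir($upload_dir)) {
    mkdir($upload_dir, 0775, true);
}
if (!is_writable($upload_dir)) {
    // chmod from PHP only works if the PHP process owns the directory
    chmod($upload_dir, 0775);
}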

Not able to write image in shared folder in IIS

I have developed a module to save images to a Windows shared location.
The code works fine on my development machine in VS 2015 with IIS Express, but when I deploy it to my IIS server (IIS 8) and set up my app pool, the check for whether the directory exists fails and the image is not saved to the share path. I have tried accessing the shared path from the server itself and can open it without issues.
I have added logging, and it reports that the directory does not exist.
Share path example: \\atse-bs-13450.abc.xyz.com\Sharefolder\PhotoImages
My app pool is set to ApplicationPoolIdentity.
public void WriteImage(string Location, string base64Image)
{
    try
    {
        // Check if the directory exists
        if (Directory.Exists(Location))
        {
            // Location value is set in appSettings:
            // "\\atse-bs-13450.abc.xyz.com\Sharefolder\PhotoImages\"
            string strImagePath = Location;
            // Check whether the file already exists at the location
            if (!File.Exists(Location))
            {
                if (!string.IsNullOrEmpty(base64Image))
                {
                    using (FileStream stream = new FileStream(strImagePath, FileMode.OpenOrCreate, FileAccess.Write, FileShare.None))
                    {
                        BinaryWriter writer = new BinaryWriter(stream);
                        writer.Write(Base64String2Blob(base64Image));
                    }
                }
            }
            else
            {
                strBlobLogMessage = "image file could not be stored on shared location , Share path location : ";
                PathNotFound(strBlobLogMessage);
            }
        }
        else
        {
            strBlobLogMessage = "image file could not be stored on shared location as path does not exist , Share path location : ";
            PathNotFound(strBlobLogMessage);
        }
    }
    catch (Exception)
    {
        throw;
    }
}
Two options:
Create a user account and grant it read and write access to the shared location. Then set the application pool identity to "Custom account" and point it at the newly created account.
Since your app pool is using ApplicationPoolIdentity, there is a virtual user account named IIS AppPool\{application pool name}, e.g. for DefaultAppPool the account is IIS AppPool\DefaultAppPool, so you can grant that account read/write access to the shared directory (see the example command below).
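
For the second option, the grant is typically a one-time step run from an elevated command prompt on the machine hosting the folder; a sketch, assuming the DefaultAppPool pool and a hypothetical local path behind the share (note that when the share lives on a different machine than IIS, the app pool identity presents itself over the network as the IIS machine's domain computer account, which is what would need the permission instead):

icacls "D:\Sharefolder\PhotoImages" /grant "IIS AppPool\DefaultAppPool:(OI)(CI)M"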

Using Node.js, how to upload a file into an Azure container / directory (such as test-container/folder-test)?

I am able to upload a file to Azure into a specific container, but I cannot find any directory reference in the Azure documentation for Node.js.
With C# and Java we are able to upload to a specific directory (subdirectory) of a container.
Node.js:
var BlobService = azure.createBlobService(storageAccount, accessKey);
BlobService.createBlockBlobFromLocalFile(container, BlobName, FilePath,
    function(error, result, response) {
        if (error) {
            console.log(error)
        } else {
            console.log("uploaded to azure");
        }
    });
The piece of code above works fine, and I see the file arrive in the specified container, but I need to upload into a directory of the container. Has anyone encountered this before? I have found ways to do it in Java, but I need to use Node.js.
Thanks a lot.
Yes, I tried this and found the solution: just prepend the directory name to the blob name. When calling the function, build the name like this:
let subDirectory = "Upload";
let BlobName = subDirectory + "/" + "BlobName";
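
Putting it together, a minimal sketch with the same azure-storage client as above (storageAccount, accessKey, and the local file name are placeholders; blob storage has no real directories, the "/" in the blob name simply acts as one):

var azure = require('azure-storage');

var blobService = azure.createBlobService(storageAccount, accessKey);
// the "folder" is just a prefix in the blob name
var blobName = 'folder-test/' + 'photo.jpg';
blobService.createBlockBlobFromLocalFile('test-container', blobName, 'photo.jpg',
    function (error, result, response) {
        if (error) {
            console.log(error);
        } else {
            console.log('uploaded to test-container/folder-test/photo.jpg');
        }
    });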
