CInternetSession::WriteString writes the file but it is removed directly afterwards - visual-c++

I wrote a DLL for an application that uploads some information from the application via FTP to a server. The upload works fine for most of the files I upload.
There is one file that uploads completely, but right after the upload it is removed from the server (this does not always happen; sometimes the file does exist on the server after the upload). The file is around 400 KB; all the other files are smaller.
data and type are two CStrings: data contains the file content and type the first part of the filename.
CInternetSession session(_T("whtsnXt_dll"));
CFtpConnection* pServer = NULL;
CInternetFile* pFile = NULL;
LPCTSTR pszServerName = _T("servername");
CString fileName = type + L".txt";
int curPos = 0;
CString postData = data;
try
{
CString strServerName;
INTERNET_PORT nPort = 21;
pServer = session.GetFtpConnection(pszServerName, _T("username"), _T("password"), nPort, TRUE);
if (pServer->SetCurrentDirectory(L"goes") == 0) {
// MessageBox::Show("De map bestaat niet", "whtsnXt error", MessageBoxButtons::OK, MessageBoxIcon::Error);
}
pFile = pServer->OpenFile((LPCTSTR)fileName, GENERIC_WRITE);
pFile->WriteString(postData);
pFile->Close();
pServer->Close();
delete pFile;
delete pServer;
}
catch (CInternetException* pEx)
{
//catch errors from WinInet
TCHAR pszError[64];
pEx->GetErrorMessage(pszError, 64);
MessageBox::Show(gcnew String(pszError), "whtsnXt error", MessageBoxButtons::OK, MessageBoxIcon::Error);
}
session.Close();
Does someone know a way to upload that file without it being removed directly afterwards?

Try to upload the file in smaller pieces:
int i;
for (i = 0; i < postData.GetLength(); i += 1024)
{
pFile->WriteString(postData.Mid(i, min(1024, postData.GetLength() - i)));
}
Just to be sure: data is actually a multi-byte or Unicode string and holds no binary data? WriteString only writes up to the first '\0' character. To upload binary data, use pFile->Write instead; CInternetFile::Write takes a buffer pointer and an explicit byte count, so embedded zero bytes are not a problem.

Related

Upload large file using Azure Java SDK with more than 50k blocks

I'm trying to upload a file of 230 GB into an Azure block blob with the following code:
private void uploadFile(FileObject srcFile, FileObject destFile) throws Exception {
try {
BlobClient destBlobClient = blobContainerClient.getBlobClient("destFilename");
long blockSize = 4 * 1024 * 1024; // 4 MB
ParallelTransferOptions opts = new ParallelTransferOptions()
.setBlockSizeLong(blockSize)
.setMaxConcurrency(5);
BlobRequestConditions requestConditions = new BlobRequestConditions();
try (BlobOutputStream bos = destBlobClient.getBlockBlobClient().getBlobOutputStream(
opts, null, null, null, requestConditions);
InputStream is = srcFile.getContent().getInputStream()) {
byte[] buffer = new byte[(int) blockSize];
int i = 0;
for (int len; (len = is.read(buffer)) != -1; ) {
bos.write(buffer, 0, len);
}
}
}
finally {
destFile.close();
srcFile.close();
}
}
Since I am explicitly setting a 4 MB block size for each write operation, I was under the assumption that each write would be treated as a single block in Azure, but that does not seem to be the case.
For the above example, the 230 GB file took 58880 write operations and was uploaded successfully.
Can someone please explain how blocks are split internally in Azure and help me understand this better?
Thanks in advance
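For illustration, a block blob is ultimately assembled from individually staged blocks plus one final commit (at most 50,000 committed blocks per blob); getBlobOutputStream hides that staging behind buffered writes, so the number of write calls need not equal the number of committed blocks. The sketch below makes the mapping explicit using the .NET v12 SDK (Azure.Storage.Blobs), which exposes the same model as the Java client; the container client, blob name, and file path are assumptions for the example:
// Illustration only: explicit block staging with the .NET v12 SDK.
// Each StageBlock call uploads exactly one block; CommitBlockList then
// assembles the named blocks into the final blob.
using System;
using System.Collections.Generic;
using System.IO;
using System.Text;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Specialized;

class BlockStagingSketch
{
    static void Upload(BlobContainerClient container, string path)
    {
        BlockBlobClient blob = container.GetBlockBlobClient("destFilename");
        const int blockSize = 4 * 1024 * 1024; // 4 MB, as in the question
        var blockIds = new List<string>();
        var buffer = new byte[blockSize];
        using (FileStream fs = File.OpenRead(path))
        {
            int n, index = 0;
            while ((n = fs.Read(buffer, 0, buffer.Length)) > 0)
            {
                // Block ids must be base64 strings of equal length.
                string id = Convert.ToBase64String(Encoding.UTF8.GetBytes(index++.ToString("d6")));
                blob.StageBlock(id, new MemoryStream(buffer, 0, n));
                blockIds.Add(id);
            }
        }
        // One committed blob built from exactly 'index' staged blocks.
        blob.CommitBlockList(blockIds);
    }
}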

How to open an Excel file/log file through CAPL for viewing

Whenever the CAPL script is executed, I want an Excel file or a log file to be opened.
For example, there is a file named "Standard_details.xlsx" or "Logfile.txt". Once the CAPL script is executed, I need this file to be opened for the user to read. How can I open a .xlsx or .txt file from a CAPL script?
CAPL has the functions sysExec and sysExecCmd, which allow you to run external commands, so you can launch the file with its associated application.
I suggest using a parser for that file and importing it into a matrix. This is my personal solution for parsing a CSV file:
void loadSimpleCCSfile (char ccsFile [], long matrix [][]){
dword fh;
char text[1000],temp[1000];
int CCS_index,i,res,lastRes;
fh = openFileRead(ccsFile,0);
if (!fh) {
write ("ERROR CCS: Open file failed!");
return;
}else{
write("Open file %s",ccsFile);
}
write("Parsing file %s...",ccsFile);
/* read first line and check */
if (!fileGetString(text, elcount(text), fh) || strstr(text, "UCM#") < 0) {
write("ERROR: Wrong file format, 'UCM#' not found!");
}
CCS_index = 0;
while(fileGetString(text, elcount(text), fh))
{
lastRes = 0;
for(i = 0; i < elcount(matrix[0]); i++)
{
res = strstr_off(text, lastRes, ";");
substr_cpy_off(temp, 0, text, lastRes, res-lastRes, 40);
strtol(temp, matrix[CCS_index][i]);
lastRes = res+1;
}
//write("%d -> %s",CCS_index,text);
CCS_index = CCS_index + 1;
}
write("%d Elements",CCS_index);
fileClose(fh);
}

Input output stream not working in Web Forms function

Can someone tell me why I keep getting a read and write timeout in this function? It is a code-behind function for a button's click event. Everything looks good as far as the data goes until I get to the stream section; it still steps through, but when I inspect the Stream object after stepping in, it shows Read Timeout/Write Timeout: System.InvalidOperationException.
protected void SubmitToDB_Click(object sender, EventArgs e)
{
if (FileUploader.HasFile)
{
try
{
if (SectionDropDownList.SelectedValue != null)
{
if (TemplateDropDownList.SelectedValue != null)
{
// This gets the full file path on the client's machine ie: c:\test\myfile.txt
string strFilePath = FileUploader.PostedFile.FileName;
//use the System.IO Path.GetFileName method to get specifics about the file without needing to parse the path as a string
string strFileName = Path.GetFileName(strFilePath);
Int32 intFileSize = FileUploader.PostedFile.ContentLength;
string strContentType = FileUploader.PostedFile.ContentType;
//Convert the uploaded file to a byte stream to save to your database. This could be a database table field of type Image in SQL Server
Stream strmStream = FileUploader.PostedFile.InputStream;
Int32 intFileLength = (Int32)strmStream.Length;
byte[] bytUpfile = new byte[intFileLength + 1];
strmStream.Read(bytUpfile, 0, intFileLength);
strmStream.Close();
saveFileToDb(strFileName, intFileSize, strContentType, bytUpfile); // or use FileUploader.SaveAs(Server.MapPath(".") + "filename") to save to the server's filesystem.
lblUploadResult.Text = "Upload Success. File was uploaded and saved to the database.";
}
}
}
catch (Exception err)
{
lblUploadResult.Text = "The file was not updloaded because the following error happened: " + err.ToString();
}
}
else
{
lblUploadResult.Text = "No File Uploaded because none was selected.";
}
}
The ReadTimeout/WriteTimeout values you see in the debugger are a red herring: Stream.ReadTimeout and Stream.WriteTimeout throw InvalidOperationException on any stream that does not support timeouts (CanTimeout is false), and the debugger displays that exception as the property value. The more likely problem is the manual buffer handling (the array is one byte larger than the stream, and the return value of Read is never checked). Try something like this:
using (var fileStream = FileUploader.PostedFile.InputStream)
{
using (var reader = new BinaryReader(fileStream))
{
byte[] bytUpfile = reader.ReadBytes((Int32)fileStream.Length);
// SAVE TO DB...
}
}
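Wired into the original click handler, a hypothetical rewrite (reusing the asker's saveFileToDb helper, lblUploadResult label, and using directives from the question) could look like:
string strFileName = Path.GetFileName(FileUploader.PostedFile.FileName);
Int32 intFileSize = FileUploader.PostedFile.ContentLength;
string strContentType = FileUploader.PostedFile.ContentType;
using (var fileStream = FileUploader.PostedFile.InputStream)
using (var reader = new BinaryReader(fileStream))
{
    // ReadBytes sizes the array to the bytes actually read, so there is
    // no oversized buffer and no unchecked Stream.Read return value.
    byte[] bytUpfile = reader.ReadBytes((Int32)fileStream.Length);
    saveFileToDb(strFileName, intFileSize, strContentType, bytUpfile);
    lblUploadResult.Text = "Upload Success. File was uploaded and saved to the database.";
}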

C#: WPD - Downloading a Picture with meta tags

I am using the Windows Portable Device API to automatically get photos from a connected smartphone. I have it all transferring correctly. The code I use is the standard DownloadFile() routine:
public PortableDownloadInfo DownloadFile(PortableDeviceFile file, string saveToPath)
{
IPortableDeviceContent content;
_device.Content(out content);
IPortableDeviceResources resources;
content.Transfer(out resources);
PortableDeviceApiLib.IStream wpdStream;
uint optimalTransferSize = 0;
var property = new _tagpropertykey
{
fmtid = new Guid(0xE81E79BE, 0x34F0, 0x41BF, 0xB5, 0x3F, 0xF1, 0xA0, 0x6A, 0xE8, 0x78, 0x42),
pid = 0
};
resources.GetStream(file.Id, ref property, 0, ref optimalTransferSize, out wpdStream);
System.Runtime.InteropServices.ComTypes.IStream sourceStream =
// ReSharper disable once SuspiciousTypeConversion.Global
(System.Runtime.InteropServices.ComTypes.IStream)wpdStream;
var filename = Path.GetFileName(file.Name);
if (string.IsNullOrEmpty(filename))
return null;
FileStream targetStream = new FileStream(Path.Combine(saveToPath, filename),
FileMode.Create, FileAccess.Write);
try
{
unsafe
{
var buffer = new byte[1024];
int bytesRead;
do
{
sourceStream.Read(buffer, 1024, new IntPtr(&bytesRead));
targetStream.Write(buffer, 0, 1024);
} while (bytesRead > 0);
targetStream.Close();
}
}
finally
{
Marshal.ReleaseComObject(sourceStream);
Marshal.ReleaseComObject(wpdStream);
}
return pdi;
}
There are two problems with this standard code:
1) When the images are saved to the Windows machine, there is no EXIF information. This information is what I need; how do I preserve it?
2) The saved files are very bloated. For example, the source JPEG is 1,045,807 bytes, whilst the downloaded file is 3,942,840 bytes! It is similar for all of the other files. I would have thought that the code inside the unsafe{} section would output it byte for byte? Is there a better way to transfer the data? (A safe way?)
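On problem 2: the loop above writes a full 1024 bytes on every iteration even when the final Read returns fewer, which pads the output file. A minimal sketch of a safe copy loop (no unsafe block, assuming the same sourceStream and targetStream as above plus System.Runtime.InteropServices) that writes only the bytes actually read:
IntPtr pcbRead = Marshal.AllocHGlobal(sizeof(int));
try
{
    var buffer = new byte[1024];
    int bytesRead;
    do
    {
        // ComTypes.IStream.Read reports the actual count through pcbRead.
        sourceStream.Read(buffer, buffer.Length, pcbRead);
        bytesRead = Marshal.ReadInt32(pcbRead);
        targetStream.Write(buffer, 0, bytesRead); // never pad past bytesRead
    } while (bytesRead > 0);
}
finally
{
    Marshal.FreeHGlobal(pcbRead);
}
If the stream being read is the device's default resource (the raw file), a byte-for-byte copy also keeps EXIF intact, since that metadata lives inside the JPEG itself.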
Sorry about this, it works fine. It is something else that is causing these issues.

Regarding CloudBlockBlob.PutBlock and CloudBlockBlob.PutBlockList

I am aware that we can use CloudBlockBlob.PutBlock and CloudBlockBlob.PutBlockList to upload in chunks, but these methods do not have a lease id parameter.
Can I instead form the HttpWebRequest myself with the "x-ms-lease-id" header and attach it to CloudBlockBlob.PutBlock and CloudBlockBlob.PutBlockList?
Hi Gaurav, I could not fit this into a comment on your response, hence adding it here.
I tried BlobRequest.PutBlock and BlobRequest.PutBlockList with the following code:
for (int idxThread = 0; idxThread < numThreads; idxThread++)
{
tasks.Add(Task.Factory.StartNew(() =>
{
KeyValuePair<int, int> blockIdAndLength;
while (true)
{
lock (queue)
{
if (queue.Count == 0)
break;
blockIdAndLength = queue.Dequeue();
}
byte[] buff = new byte[blockIdAndLength.Value];
//copying chunks into buff from inputbyte array
Array.Copy(buffer, blockIdAndLength.Key * (long)blockIdAndLength.Value, buff, 0, blockIdAndLength.Value);
// Upload block.
string blockName = Convert.ToBase64String(BitConverter.GetBytes(
blockIdAndLength.Key));
//string blockIdString = Convert.ToBase64String(ASCIIEncoding.ASCII.GetBytes(string.Format("BlockId{0}", blockIdAndLength.Key.ToString("0000000"))));
// For small files (around 100 KB) this works fine; for large files (around 10 MB)
// it ends up uploading only 2-3 MB. Is there a better way to implement
// uploading in chunks with leasing?
string url = blob.Uri.ToString();
if (blob.ServiceClient.Credentials.NeedsTransformUri)
{
url = blob.ServiceClient.Credentials.TransformUri(url);
}
var req = BlobRequest.Put(new Uri(url), 90, new BlobProperties(), BlobType.BlockBlob, leaseId, 0);
using (Stream writer = req.GetRequestStream())
{
writer.Write(buff,0,buff.Length);
}
blob.ServiceClient.Credentials.SignRequest(req);
req.GetResponse().Close();
}
}));
}
// Wait for all threads to complete uploading data.
Task.WaitAll(tasks.ToArray());
This does not work for multiple chunks. Could you please provide your inputs?
I don't think you can. However, take a look at the BlobRequest class in the Microsoft.WindowsAzure.StorageClient.Protocol namespace. It has PutBlock and PutBlockList functions which allow you to specify a lease id.
Hope this helps.
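A minimal sketch of that approach, assuming the old Microsoft.WindowsAzure.StorageClient 1.x API; the PutBlock signature and the blockIndex variable here are assumptions reconstructed from memory, so verify them against the SDK you have:
// Hypothetical: stage one block under a lease with the protocol-level API,
// reusing url, buff, leaseId, and blob from the code above.
string blockId = Convert.ToBase64String(BitConverter.GetBytes(blockIndex));
HttpWebRequest req = BlobRequest.PutBlock(new Uri(url), 90, blockId, leaseId);
req.ContentLength = buff.Length;
// Sign before the body is sent; writing to the request stream starts the upload.
blob.ServiceClient.Credentials.SignRequest(req);
using (Stream body = req.GetRequestStream())
{
    body.Write(buff, 0, buff.Length);
}
req.GetResponse().Close();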
