Azure Table Storage API inconsistent replies

I have an HTTP-triggered Azure Function that receives a zip code. It then queries an Azure Table using the Table API and retrieves the city, state, etc. The table contains zip/postal codes for the US and Canada, so there are about a million rows. When I send a request it returns the correct value the first time, but if I keep sending it over and over it randomly switches between returning the record and returning an empty set. So it's not failing, and I'm not getting any kind of error like a timeout.
Here is an example of a successful reply:
{
  "odata.metadata": "https://storageaccount###.table.core.windows.net/$metadata#Table###",
  "value": [
    {
      "odata.etag": "W/\"datetime'2019-10-18T16%3A02%3A26.9420514Z'\"",
      "PartitionKey": "Portsmouth",
      "RowKey": "00210",
      "Timestamp": "2019-10-18T16:02:26.9420514Z",
      "AreaCode": "603",
      "City": "Portsmouth",
      "Country": "US",
      "Pref": "P",
      "State": "NH",
      "Zip": "00210"
    }
  ]
}
And here is an empty reply, received after pressing F5 right after getting the reply above:
{
  "odata.metadata": "https://storageaccount###.table.core.windows.net/$metadata#Table###",
  "value": []
}
And then if I keep pressing F5 sometimes I get the record and sometimes I don't.
Here are the Table API URL parameters (the SAS, minus the signature):
?$filter=RowKey eq '00210' and Country eq 'US'
&sv=2019-02-02
&ss=t
&srt=sco
&sp=r
&se=2099-10-18T05:27:30Z
&st=2019-10-17T21:27:30Z
&spr=https
Does anyone know why it's behaving this way or what I could look into to figure it out?
According to this page there is a 5-second timeout for querying Azure tables
(https://learn.microsoft.com/en-us/rest/api/storageservices/query-timeout-and-pagination), and a query that hits it returns a partial result set with continuation tokens. But when I look at the headers in Postman I don't see any continuation tokens.
Postman results: https://i.stack.imgur.com/hReDM.png
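For anyone hitting the same behavior: a query that filters on RowKey without a PartitionKey is a table scan, and any page of that scan can legitimately come back empty with x-ms-continuation-* headers attached, so the client has to keep following them until either an entity or no tokens come back. Below is a minimal sketch of that loop using HttpClient; the table URL, filter, and SAS token are placeholders, and the empty-page check is a whitespace-sensitive string test that is fine for a sketch but not for production.
using System;
using System.Net.Http;
using System.Threading.Tasks;

class TableContinuationSketch
{
    static async Task<string> QueryAllPagesAsync(string baseUrl, string filter, string sas)
    {
        using var client = new HttpClient();
        client.DefaultRequestHeaders.Add("Accept", "application/json");
        string continuation = "";
        while (true)
        {
            var resp = await client.GetAsync(baseUrl + "?" + continuation + filter + sas);
            resp.EnsureSuccessStatusCode();
            string body = await resp.Content.ReadAsStringAsync();
            // a non-empty "value" array means the entity was found on this page
            if (!body.Contains("\"value\":[]")) return body;
            // empty page: keep scanning only while the service hands back continuation tokens
            if (!resp.Headers.TryGetValues("x-ms-continuation-NextPartitionKey", out var pk))
                return body; // scan exhausted - the entity really is not there
            resp.Headers.TryGetValues("x-ms-continuation-NextRowKey", out var rk);
            continuation = "NextPartitionKey=" + string.Join("", pk)
                         + "&NextRowKey=" + (rk == null ? "" : string.Join("", rk)) + "&";
        }
    }
}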
Full corrected code:
public static async Task<string> Run(HttpRequest req, ILogger log)
{
    log.LogInformation("Start time = " + DateTime.Now);
    string apiResponse = "";
    string zip = req.Query["zip"];
    if (string.IsNullOrEmpty(zip)) { return "No zip code found - please provide a url parameter in the format 'zip=[code]'"; }

    string apiBaseUrl = "https://***storage.table.core.windows.net/zip**?";
    string queryFilter = "$first&$filter=RowKey eq '" + zip + "'";
    // generate auth url in storage account in Azure
    string authorization = "&sv=2019-02-02&ss=t&srt=sco&sp=r&se=2099-10-18T00:38:11Z&st=2019-10-17T16:38:11Z&spr=https&sig=7S%2BkaiTwGsZIkL***";

    Regex rx_US = new Regex(@"^\d{5}$");
    Regex rx_CA = new Regex(@"^[A-Za-z]\d[A-Za-z][ -]?\d[A-Za-z]\d$");
    if (rx_US.IsMatch(zip))
    {
        queryFilter = queryFilter + " and Country eq 'US'";
    }
    else if (rx_CA.IsMatch(zip))
    {
        // the table search is case sensitive - test for common errors
        zip = zip.ToUpper(); // make all upper case
        Regex rx_CA1 = new Regex(@"^[A-Z]\d[A-Z]-\d[A-Z]\d$"); // dash
        Regex rx_CA2 = new Regex(@"^[A-Z]\d[A-Z]\d[A-Z]\d$");  // no space
        if (rx_CA1.IsMatch(zip)) { zip = zip.Replace("-", " "); }
        if (rx_CA2.IsMatch(zip)) { zip = zip.Insert(3, " "); }
        queryFilter = "$single&$filter=RowKey eq '" + zip + "'" + " and Country eq 'CA'";
    }

    string queryUrl = apiBaseUrl + queryFilter + authorization;
    try
    {
        var httpWebRequest = WebRequest.Create(queryUrl);
        httpWebRequest.ContentType = "application/json";
        httpWebRequest.Headers.Add("Accept", "application/json"); // if this is omitted you will get xml format
        httpWebRequest.Method = "GET";
        var httpResponse = await httpWebRequest.GetResponseAsync();
        using (var streamReader = new StreamReader(httpResponse.GetResponseStream()))
        {
            var responseText = streamReader.ReadToEnd();
            apiResponse = responseText;
            log.LogInformation("Full Table Response = " + responseText);
        }

        int i = 0;
        while (httpResponse.Headers["x-ms-continuation-NextPartitionKey"] != null && apiResponse.Length < 105)
        {
            // if response is > 105 then it found something - don't keep looking
            // if there are continuation tokens then keep querying until you find something
            var partitionToken = httpResponse.Headers["x-ms-continuation-NextPartitionKey"];
            var rowToken = httpResponse.Headers["x-ms-continuation-NextRowKey"];
            var continuationUrl = "NextPartitionKey=" + partitionToken + "&NextRowKey=" + rowToken + "&";
            queryUrl = apiBaseUrl + continuationUrl + queryFilter + authorization;
            log.LogInformation("begin new httpRequest...");
            httpWebRequest = WebRequest.Create(queryUrl);
            httpWebRequest.ContentType = "application/json";
            httpWebRequest.Headers.Add("Accept", "application/json");
            httpWebRequest.Method = "GET";
            httpResponse = await httpWebRequest.GetResponseAsync();
            using (var streamReader = new StreamReader(httpResponse.GetResponseStream()))
            {
                var responseText = streamReader.ReadToEnd();
                apiResponse = responseText;
                log.LogInformation("Full Table Response = " + responseText);
            }
            i++;
            log.LogInformation("loop # " + i + " - url = " + queryUrl + " Response = " + apiResponse);
        }

        if (apiResponse.Length > 105)
        {
            // strip out extra data
            apiResponse = apiResponse.Remove(1, 101);
            apiResponse = apiResponse.Remove(apiResponse.Length - 2, 2);
        }
        else
        {
            apiResponse = "No data found for zip = " + zip + " - Ensure you have proper format and case";
        }
    }
    catch (Exception ex)
    {
        apiResponse = "error: " + ex.Message;
    }
    log.LogInformation("ZipPostal function completed and returned " + apiResponse);
    return apiResponse;
}
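One aside on the corrected code above: the zip value comes straight from the query string and is spliced into both the OData filter and the request URL, so it is worth escaping before use. A two-line sketch (OData escapes a literal single quote by doubling it):
string safeZip = Uri.EscapeDataString(zip.Replace("'", "''")); // '' is an escaped quote in OData
string queryFilter = "$filter=RowKey eq '" + safeZip + "'";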

Related

InvalidOperationException on ASP.NET Core

I am getting this error when I call this function:
InvalidOperationException: A second operation started on this context before a previous operation completed. This is usually caused by different threads using the same instance of DbContext.
I don't know what the problem is.
public async Task<bool> sendFirstOrderNotification(Order order)
{
    var users = getUsersWithrole("Admin");
    foreach (var user in users)
    {
        if (!string.IsNullOrEmpty(user.NotificationToken))
        {
            var notificationPayload = new FirebaseNotificationPayload()
            {
                Title = user.Name,
                // Arabic text: "A user placed an order for the first time - order number <order.Id>"
                Body = order.Id + "رقم الطلب " + "بإجراء طلب لأول مرة " + "قام مستخدم ",
            };
            string response = SendNotification(notificationPayload, user.NotificationToken).Result;
            Create(new NotificationLog()
            {
                CreatedDate = DateTime.Now,
                Id = 0,
                OrderId = order.Id,
                Token = user.NotificationToken,
                UserId = user.Id,
                NotificationJson = JsonConvert.SerializeObject(notificationPayload),
                ResponseJson = response
            });
        }
    }
    return true;
}
Can anyone tell me what's causing the problem?
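The exception text itself points at the usual fix: do not block with .Result while the same DbContext instance is mid-operation. A minimal sketch of the awaited call, assuming SendNotification returns Task<string>:
// await instead of .Result, so the single DbContext instance is never
// hit by a second operation while the first is still running
string response = await SendNotification(notificationPayload, user.NotificationToken);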

Create Folder and Update Title and Custom Field in SharePoint Online

I am trying to create a folder in SharePoint Online, based on a tutorial. The problem is that creating the folder does not set the "Title" column value.
I want to create the folder and also update the "Title" column.
Here is the code for creating the folder:
public string CreateDocumentLibrary(string siteUrl, string relativePath)
{
    //bool responseResult = false;
    string resultUpdate = string.Empty;
    string responseResult = string.Empty;
    if (siteUrl != _siteUrl)
    {
        _siteUrl = siteUrl;
        Uri spSite = new Uri(siteUrl);
        _spo = SpoAuthUtility.Create(spSite, _username, WebUtility.HtmlEncode(_password), false);
    }
    string odataQuery = "_api/web/folders";
    byte[] content = ASCIIEncoding.ASCII.GetBytes(@"{ '__metadata': { 'type': 'SP.Folder' }, 'ServerRelativeUrl': '" + relativePath + "'}");
    string digest = _spo.GetRequestDigest();
    Uri url = new Uri(String.Format("{0}/{1}", _spo.SiteUrl, odataQuery));
    // Set X-RequestDigest
    var webRequest = (HttpWebRequest)HttpWebRequest.Create(url);
    webRequest.Headers.Add("X-RequestDigest", digest);
    // Send a json odata request to SPO rest services to fetch all list items for the list.
    byte[] result = HttpHelper.SendODataJsonRequest(
        url,
        "POST", // reading data from SP through the rest api usually uses the GET verb
        content,
        webRequest,
        _spo // pass in the helper object that allows us to make authenticated calls to SPO rest services
    );
    string response = Encoding.UTF8.GetString(result, 0, result.Length);
    if (response != null)
    {
        //responseResult = true;
        responseResult = response;
    }
    return responseResult;
}
I already tried using CAML, but the SharePoint list is big, so I got an access-prohibited error related to the list view threshold.
Please help.
Refer to the code below to update the folder name (it sets both Title and FileLeafRef):
function renameFolder(webUrl, listTitle, itemId, name)
{
    var itemUrl = webUrl + "/_api/Web/Lists/GetByTitle('" + listTitle + "')/Items(" + itemId + ")";
    var itemPayload = {};
    itemPayload['__metadata'] = { 'type': getItemTypeForListName(listTitle) };
    itemPayload['Title'] = name;
    itemPayload['FileLeafRef'] = name;
    var additionalHeaders = {};
    additionalHeaders["X-HTTP-Method"] = "MERGE";
    additionalHeaders["If-Match"] = "*";
    return executeJson(itemUrl, "POST", additionalHeaders, itemPayload);
}

function getItemTypeForListName(name) {
    return "SP.Data." + name.charAt(0).toUpperCase() + name.slice(1) + "ListItem";
}
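For completeness, here is a hedged C# sketch of the same MERGE update in the style of the question's code. The list title 'Documents', the item id 1, and the SP.Data.DocumentsItem type are placeholders; _spo and HttpHelper are the question's own helpers:
string updateQuery = "_api/web/lists/GetByTitle('Documents')/Items(1)"; // placeholder list/item
Uri updateUrl = new Uri(String.Format("{0}/{1}", _spo.SiteUrl, updateQuery));
byte[] payload = Encoding.UTF8.GetBytes(
    "{ '__metadata': { 'type': 'SP.Data.DocumentsItem' }, 'Title': 'MyFolder' }");
var updateRequest = (HttpWebRequest)WebRequest.Create(updateUrl);
updateRequest.Headers.Add("X-RequestDigest", _spo.GetRequestDigest());
updateRequest.Headers.Add("X-HTTP-Method", "MERGE"); // update only the fields named in the payload
updateRequest.Headers.Add("If-Match", "*");          // do not require a specific item version
HttpHelper.SendODataJsonRequest(updateUrl, "POST", payload, updateRequest, _spo);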

Azure - 403 error - Copying a large number of storage account blobs to another storage account on a pay-as-you-go account

I am getting a "403 Forbidden" error when copying a large number of block blobs from one storage account to another storage account (in a different region, as a backup). After 100,000+ are copied I get the 403 Forbidden error.
I have seen answers talking about a quota, but I believe that is for free accounts. I have a client with 578,000 files that I moved to Azure from local storage and that worked fine, but I cannot make a copy to another storage account I set up to act as a backup (in case of deletions, mostly).
I am using StartCopyAsync and then checking the CopyState status to verify the copy succeeded, retrying in my code, but it appears to be failing on the StartCopyAsync.
The copy works fine until I have copied well over 100,000 of the files, and then the error occurs. I am not sure what is causing it, since the same code works fine for so many blobs first. I have added a log file that tells me which file it failed on, and I can open that file in Azure explorer.
I can post the code, but right now I am wondering if I am hitting some sort of quota/bandwidth issue I do not know about.
namespace BackupCloudContainers
{
    class Program
    {
        static string privateconnectionstring = ConfigurationManager.AppSettings["StorageConnectionString"];
        static string privatebackupconnectionstring = ConfigurationManager.AppSettings["BackupStorageConnectionString"];
        static DateTime testdate = new DateTime(2017, 8, 28, 0, 0, 0);
        static string destContainerName = "";

        static void Main(string[] args)
        {
            try
            {
                //Console.WriteLine("Starting Backup at " + DateTime.Now.ToString("hh:mm:ss.ffff"));
                Log("Starting Incremental Backup (everything since " + testdate.ToString("f") + ") at " + DateTime.Now.ToString("hh:mm:ss.ffff"));
                Backup().GetAwaiter().GetResult();
                //Console.WriteLine("Backup Created as " + destContainerName);
                Log("Backup Created as " + destContainerName);
                //Console.WriteLine("Backup ended at " + DateTime.Now.ToString("hh:mm:ss.ffff"));
                Log("Backup ended at " + DateTime.Now.ToString("hh:mm:ss.ffff"));
                Console.WriteLine("\n\nPress Enter to close. ");
                Console.ReadLine();
            }
            catch (Exception e)
            {
                //Console.WriteLine("Exception - " + e.Message);
                Log("Exception - " + e.Message);
                if (e.InnerException != null)
                {
                    //Console.WriteLine("Inner Exception - " + e.InnerException.Message);
                    Log("Inner Exception - " + e.InnerException.Message);
                }
            }
        }

        static async Task Backup()
        {
            CloudStorageAccount _storageAccount = CloudStorageAccount.Parse(privateconnectionstring);
            CloudStorageAccount _storageBackupAccount = CloudStorageAccount.Parse(privatebackupconnectionstring);
            CloudBlobClient blobClient = _storageAccount.CreateCloudBlobClient();
            CloudBlobClient blobBackupClient = _storageBackupAccount.CreateCloudBlobClient();
            foreach (var srcContainer in blobClient.ListContainers())
            {
                // skip any containers with a backup name
                if (srcContainer.Name.IndexOf("-backup-") > -1)
                {
                    continue;
                }
                var backupTimeInTicks = DateTime.UtcNow.Ticks;
                //var destContainerName = srcContainer.Name + "-" + backupTimeInTicks;
                var backupDateTime = DateTime.UtcNow.ToString("yyyyMMdd-hhmmssfff");
                destContainerName = srcContainer.Name + "-backup-" + backupDateTime;
                var destContainer = blobBackupClient.GetContainerReference(destContainerName);
                //var destContainer = blobClient.GetContainerReference(destContainerName);

                // assume it does not exist already,
                // as that wouldn't make sense.
                await destContainer.CreateAsync();

                // ensure that the container is not accessible
                // to the outside world,
                // as we want all the backups to be internal.
                BlobContainerPermissions destContainerPermissions = destContainer.GetPermissions();
                if (destContainerPermissions.PublicAccess != BlobContainerPublicAccessType.Off)
                {
                    destContainerPermissions.PublicAccess = BlobContainerPublicAccessType.Off;
                    await destContainer.SetPermissionsAsync(destContainerPermissions);
                }

                // copy src container to dest container,
                // note that this is synchronous operation in reality,
                // as I want to only add real metadata to container
                // once all the blobs have been copied successfully.
                await CopyContainers(srcContainer, destContainer);
                await EnsureCopySucceeded(destContainer);

                // ensure we have some metadata for the container
                // as this will help us to delete older containers
                // on a later date.
                await destContainer.FetchAttributesAsync();
                var destContainerMetadata = destContainer.Metadata;
                if (!destContainerMetadata.ContainsKey("BackupOf"))
                {
                    string cname = srcContainer.Name.ToLowerInvariant();
                    destContainerMetadata.Add("BackupOf", cname);
                    destContainerMetadata.Add("CreatedAt", backupTimeInTicks.ToString());
                    destContainerMetadata.Add("CreatedDate", backupDateTime);
                    await destContainer.SetMetadataAsync();
                    //destContainer.SetMetadata();
                }
            }

            // let's purge the older containers,
            // if we already have multiple newer backups of them.
            // why keep them around.
            // just asking for trouble.
            //var blobGroupedContainers = blobBackupClient.ListContainers()
            //    .Where(container => container.Metadata.ContainsKey("Backup-Of"))
            //    .Select(container => new
            //    {
            //        Container = container,
            //        BackupOf = container.Metadata["Backup-Of"],
            //        CreatedAt = new DateTime(long.Parse(container.Metadata["Created-At"]))
            //    }).GroupBy(arg => arg.BackupOf);
            var blobGroupedContainers = blobClient.ListContainers()
                .Where(container => container.Metadata.ContainsKey("BackupOf"))
                .Select(container => new
                {
                    Container = container,
                    BackupOf = container.Metadata["BackupOf"],
                    CreatedAt = new DateTime(long.Parse(container.Metadata["CreatedAt"]))
                }).GroupBy(arg => arg.BackupOf);

            // Remove the Delete for now
            //foreach (var blobGroupedContainer in blobGroupedContainers)
            //{
            //    var containersToDelete = blobGroupedContainer.Select(arg => new
            //    {
            //        Container = arg.Container,
            //        CreatedAt = new DateTime(arg.CreatedAt.Year, arg.CreatedAt.Month, arg.CreatedAt.Day)
            //    })
            //    .GroupBy(arg => arg.CreatedAt)
            //    .OrderByDescending(grouping => grouping.Key)
            //    .Skip(7) /* skip last 7 days worth of data */
            //    .SelectMany(grouping => grouping)
            //    .Select(arg => arg.Container);
            //    // Remove the Delete for now
            //    //foreach (var containerToDelete in containersToDelete)
            //    //{
            //    //    await containerToDelete.DeleteIfExistsAsync();
            //    //}
            //}
        }

        static async Task EnsureCopySucceeded(CloudBlobContainer destContainer)
        {
            bool pendingCopy = true;
            var retryCountLookup = new Dictionary<string, int>();
            while (pendingCopy)
            {
                pendingCopy = false;
                var destBlobList = destContainer.ListBlobs(null, true, BlobListingDetails.Copy);
                foreach (var dest in destBlobList)
                {
                    var destBlob = dest as CloudBlob;
                    if (destBlob == null)
                    {
                        continue;
                    }
                    var blobIdentifier = destBlob.Name;
                    if (destBlob.CopyState.Status == CopyStatus.Aborted ||
                        destBlob.CopyState.Status == CopyStatus.Failed)
                    {
                        int retryCount;
                        if (retryCountLookup.TryGetValue(blobIdentifier, out retryCount))
                        {
                            if (retryCount > 4)
                            {
                                throw new Exception("[CRITICAL] Failed to copy '"
                                    + destBlob.CopyState.Source.AbsolutePath + "' to '"
                                    + destBlob.StorageUri + "' due to reason of: " +
                                    destBlob.CopyState.StatusDescription);
                            }
                            retryCountLookup[blobIdentifier] = retryCount + 1;
                        }
                        else
                        {
                            retryCountLookup[blobIdentifier] = 1;
                        }
                        pendingCopy = true;
                        // restart the copy process for src and dest blobs.
                        // note we also have retry count protection,
                        // so if any of the blobs fail too much,
                        // we'll give up.
                        await destBlob.StartCopyAsync(destBlob.CopyState.Source);
                    }
                    else if (destBlob.CopyState.Status == CopyStatus.Pending)
                    {
                        pendingCopy = true;
                    }
                }
                Thread.Sleep(1000);
            }
        }

        static async Task CopyContainers(
            CloudBlobContainer srcContainer,
            CloudBlobContainer destContainer)
        {
            // get the SAS token to use for all blobs
            string blobToken = srcContainer.GetSharedAccessSignature(new SharedAccessBlobPolicy()
            {
                Permissions = SharedAccessBlobPermissions.Read,
                SharedAccessStartTime = DateTime.Now.AddMinutes(-5),
                SharedAccessExpiryTime = DateTime.Now.AddHours(3)
            });
            int ii = 0;
            int cntr = 0;
            int waitcntr = 0;
            string sourceuri = "";
            int datecntr = 0;
            try
            {
                //Console.WriteLine(" container contains " + srcContainer.ListBlobs(null, true).Count().ToString());
                Log(" container contains " + srcContainer.ListBlobs(null, true).Count().ToString());
                foreach (var srcBlob in srcContainer.ListBlobs(null, true))
                {
                    ii++;
                    // THIS IS FOR COUNTING Blobs that would be on the Incremental Backup
                    CloudBlob blob = (CloudBlob)srcBlob;
                    if (blob.Properties.LastModified > testdate)
                    {
                        datecntr++;
                    }
                    else
                    {
                        // We are only doing an Incremental Backup this time - so skip all other files
                        continue;
                    }
                    //if (ii > 2000)
                    //{
                    //    //Console.WriteLine(" test run ended ");
                    //    Log(" test run ended ");
                    //    break;
                    //}
                    cntr++;
                    if (cntr > 999)
                    {
                        //Console.WriteLine(" " + ii.ToString() + " processed at " + DateTime.Now.ToString("hh:mm:ss"));
                        Log(" " + ii.ToString() + " processed at " + DateTime.Now.ToString("hh:mm:ss"));
                        //Log(" EnsureCopySucceeded - finished at " + DateTime.Now.ToString("hh:mm:ss"));
                        //await EnsureCopySucceeded(destContainer);
                        //Log(" EnsureCopySucceeded - finished at " + DateTime.Now.ToString("hh:mm:ss"));
                        cntr = 0;
                    }
                    waitcntr++;
                    if (waitcntr > 29999)
                    {
                        Log(" EnsureCopySucceeded (ii=" + ii.ToString() + "- started at " + DateTime.Now.ToString("hh:mm:ss"));
                        await EnsureCopySucceeded(destContainer);
                        Log(" EnsureCopySucceeded - finished at " + DateTime.Now.ToString("hh:mm:ss"));
                        waitcntr = 0;
                    }
                    var srcCloudBlob = srcBlob as CloudBlob;
                    if (srcCloudBlob == null)
                    {
                        continue;
                    }
                    CloudBlob destCloudBlob;
                    if (srcCloudBlob.Properties.BlobType == BlobType.BlockBlob)
                    {
                        destCloudBlob = destContainer.GetBlockBlobReference(srcCloudBlob.Name);
                    }
                    else
                    {
                        destCloudBlob = destContainer.GetPageBlobReference(srcCloudBlob.Name);
                    }
                    sourceuri = srcCloudBlob.Uri.AbsoluteUri + blobToken;
                    try
                    {
                        await destCloudBlob.StartCopyAsync(new Uri(srcCloudBlob.Uri.AbsoluteUri + blobToken));
                    }
                    catch (Exception e)
                    {
                        Log("Error at item " + ii.ToString() + " Source = " + sourceuri + " Message = " + e.Message + " Time = " + DateTime.Now.ToString("F") + "\r\n");
                    }
                }
                Log("Total Items checked = " + ii.ToString() + " backed up files = " + datecntr.ToString());
                Log("TestDate = " + testdate.ToString("F") + " datecntr = " + datecntr.ToString());
            }
            catch (Exception e)
            {
                Log("Error at item " + ii.ToString());
                Log(" Source = " + sourceuri);
                Log(" Message = " + e.Message);
                Log(" Time = " + DateTime.Now.ToString("F") + "\r\n");
                //throw e;
            }
        }

        static void Log(string logdata)
        {
            Console.WriteLine(logdata);
            File.AppendAllText("c:\\junk\\dwlog.txt", logdata + "\r\n");
        }
    }
}
You mentioned that your code starts to fail after 3 hours. Well, the following lines of code are the culprit:
string blobToken = srcContainer.GetSharedAccessSignature(new SharedAccessBlobPolicy()
{
    Permissions = SharedAccessBlobPermissions.Read,
    SharedAccessStartTime = DateTime.Now.AddMinutes(-5),
    SharedAccessExpiryTime = DateTime.Now.AddHours(3)
});
If you notice, you are creating a shared access signature (SAS) that is valid for a duration of 3 hours, and you're using this SAS for all blobs. Your code works as long as the SAS is valid, i.e. has not expired. Once the SAS expires, the token is no longer authorized to perform the operation, and you start getting the 403 (Forbidden) error.
My recommendation would be to create a SAS token that is valid for a longer duration. I would recommend a SAS token valid for 15 days because that's the maximum amount of time Azure Storage will try to copy your blob from one account to another.
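For example, a minimal change along those lines (UtcNow and the 15-day window are the only differences from the original):
string blobToken = srcContainer.GetSharedAccessSignature(new SharedAccessBlobPolicy()
{
    Permissions = SharedAccessBlobPermissions.Read,
    SharedAccessStartTime = DateTime.UtcNow.AddMinutes(-5),
    SharedAccessExpiryTime = DateTime.UtcNow.AddDays(15) // outlives the longest possible cross-account copy
});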

Resumable Upload using Google Mail APIs throwing exception: "Bad Request"

I am trying to insert mail into a Google mailbox using the Gmail APIs.
I want to upload mails larger than 5 MB, so I am using a resumable upload request.
I used a POST request first to initiate a resumable upload, which gives a "200 OK" response.
POST request:
String postUrl = "https://www.googleapis.com/upload/gmail/v1/users/" + "<username>" + "/messages/send?uploadType=resumable";
HttpWebRequest httpRequest = (HttpWebRequest)WebRequest.Create(postUrl);
httpRequest.Headers["Authorization"] = "Bearer " + f_token;// AccessToken;
httpRequest.Headers["X-Upload-Content-Type"] = "message/rfc822";
httpRequest.Headers["X-Upload-Content-Length"] = f_bytes.Length.ToString();
httpRequest.Method = "POST";
httpRequest.ContentLength = 0;
var response = (HttpWebResponse)httpRequest.GetResponse(); // 200 OK
From that response I get the Location URL to upload the EML.
Location: https://www.googleapis.com/upload/gmail/v1/users//messages/send?uploadType=resumable&upload_id=AEnB2UqeNYKVyyQdL07RZcbenWOqY8a2NFVIsQrbA-S-vxwUXC_W4ORQtpPx1HG6tc4Indx8AvqDjwXII3F6OW0G3wsdUMUjHw
To upload the EML file I used the Location URL as the PUT URL to create the request.
string putUrl = "https://www.googleapis.com/upload/gmail/v1/users/<username>/messages/send?uploadType=resumable&upload_id=AEnB2UqeNYKVyyQdL07RZcbenWOqY8a2NFVIsQrbA-S-vxwUXC_W4ORQtpPx1HG6tc4Indx8AvqDjwXII3F6OW0G3wsdUMUjHw";
HttpWebRequest httpRequest1 = (HttpWebRequest)WebRequest.Create(postUrl);
httpRequest1.Method = "PUT";
httpRequest1.ContentLength = f_bytes.Length;
int EndOffset = f_bytes.Length; //5120000; 5242880
httpRequest1.Headers["Content-Range"] = "bytes " + 0 + "-" + EndOffset + "/" + f_bytes.Length;
httpRequest1.ContentType = "message/rfc822";
MemoryStream stream = new MemoryStream(f_bytes);
System.IO.Stream requestStream = httpRequest1.GetRequestStream();
{
    stream.CopyTo(requestStream);
    requestStream.Flush();
    requestStream.Close();
}
HttpWebResponse f_webResponse = (HttpWebResponse)httpRequest1.GetResponse(); // Exception
Exception:
The remote server returned an error: (400) Bad Request.
Please suggest a solution to upload an EML file to a particular folder of the mailbox.
I am able to send mail using resumable upload.
if (f_MailService == null)
{
    bool isCreated = createMailService(ref f_MailService);
}
FileStream fs = new FileStream(@p_EMLPath, FileMode.Open, FileAccess.Read);
Create the HTTP request for sending mail:
string postUrl = "https://www.googleapis.com/upload/gmail/v1/users/ab@edu.cloudcodes.com/messages/send?uploadType=resumable";
HttpWebRequest f_httpRequest = (HttpWebRequest)WebRequest.Create(postUrl);
f_httpRequest.Headers["X-Upload-Content-Type"] = "message/rfc822";
f_httpRequest.Headers["X-Upload-Content-Length"] = fs.Length.ToString();
f_httpRequest.Headers["Authorization"] = "Bearer " + f_token;
f_httpRequest.Method = "POST";
//f_httpRequest.ContentLength = 524288;
f_httpRequest.ContentType = "application/json; charset=UTF-8"; //"message/rfc822";
f_httpRequest.ContentLength = fs.Length;
f_httpRequest.Timeout = 6000000;
f_httpRequest.SendChunked = true;
Get the response for the first POST request:
try
{
    using (Stream f_ObjHttpStream = f_httpRequest.GetRequestStream())
    {
    }
}
catch (Exception EX)
{
}
try
{
    using (var response = (HttpWebResponse)f_httpRequest.GetResponse())
    {
        //data = ReadResponse(response);
        UploadUrl = response.Headers["Location"].ToString();
    }
}
catch (WebException exception)
{
    using (var response = (HttpWebResponse)exception.Response)
    {
        //data = ReadResponse(response);
    }
}
Read the EML file and send chunked data to upload:
byte[] Arrbyte = new byte[1024];
int ReadByte = 0;
while (fs.Length > ReadByte)
{
    bool ac = false;
    int ByteRead = 0;
    byte[] Data = new byte[4194304];
    byte[] LastData;
    // Read block of bytes from stream into the byte array
    //if (ReadByte == 0)
    {
        ByteRead = fs.Read(Data, 0, Data.Length);
    }
    //else
    {
        if ((ReadByte + Data.Length) > fs.Length)
        {
            //fs.Length - ReadByte-
            LastData = new byte[fs.Length - ReadByte];
            ByteRead = fs.Read(LastData, 0, LastData.Length);
            CallPUTReq(fs.Length, LastData);
            ac = true;
        }
    }
    //f_MsgRawStr = Convert.ToBase64String(f_bytes).TrimEnd(padding).Replace('+', '-').Replace('/', '_');
    ReadByte = ReadByte + ByteRead;
    if (ac == false)
    {
        CallPUTReq(fs.Length, Data);
    }
    //long pos = fs.Seek(0, SeekOrigin.Current);
    //fs.Position = ReadByte;
}
private void CallPUTReq(long p_lenth, byte[] Arrbyte)
{
    try
    {
        String postUrl = UploadUrl; //"https://www.googleapis.com/upload/gmail/v1/users/ab@edu.cloudcodes.com/messages/send?uploadType=resumable&upload_id=AEnB2UqZNtZVwWulAOhAVoFp-pZ-vTMcIXOpt_0dH_6jJecpm2Y1MNOGkE6JoDb0kn9Dt4yuHHMZWR--dBncxWQkZctF9h6jiPSL5uJDKeYE9Ut1c7-fImc";
        int EndOffset = 0;
        HttpWebRequest httpRequest1 = (HttpWebRequest)WebRequest.Create(postUrl);
        httpRequest1.Method = "PUT";
        httpRequest1.ContentLength = Arrbyte.Length;
        if (rangeStartOffset == 0)
        {
            EndOffset = Arrbyte.Length - 1;
        }
        else
        {
            EndOffset = rangeStartOffset + Arrbyte.Length - 1;
            if (EndOffset > p_lenth)
            {
                EndOffset = Convert.ToInt32(p_lenth);
                httpRequest1.ContentLength = EndOffset - rangeStartOffset;
            }
        } //5120000; 5242880
        httpRequest1.Headers["Content-Range"] = "bytes " + rangeStartOffset + "-" + EndOffset + "/" + p_lenth; //"bytes */" + p_lenth;
        httpRequest1.ContentType = "message/rfc822";
        httpRequest1.Timeout = 6000000;
        UTF8Encoding encoding = new UTF8Encoding();
        Stream stream = httpRequest1.GetRequestStream();
        stream.Write(Arrbyte, 0, Arrbyte.Length);
        stream.Close();
        try
        {
            using (Stream f_ObjHttpStream = httpRequest1.GetRequestStream())
            {
            }
        }
        catch (Exception EX)
        {
        }
        WebResponse response1 = null;
        try
        {
            using (response1 = (HttpWebResponse)httpRequest1.GetResponse())
            {
            }
        }
        catch (Exception ex)
        {
            //UploadUrl = response1.Headers["Location"].ToString();
        }
        //4194303
        rangeStartOffset = EndOffset + 1;
    }
    catch (Exception)
    {
    }
}
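A note on the Content-Range math above, since an off-by-one there is a classic cause of 400 Bad Request: byte offsets are zero-based and inclusive, so a chunk of N bytes starting at offset start covers start through start+N-1, and the total size goes after the slash. A small helper sketch:
static string ContentRange(long start, long chunkLength, long totalLength)
{
    // the end offset is inclusive and must never pass the last byte of the file
    long end = Math.Min(start + chunkLength - 1, totalLength - 1);
    return "bytes " + start + "-" + end + "/" + totalLength;
}
// e.g. the first 4 MiB chunk of a 10,000,000-byte file:
// ContentRange(0, 4194304, 10000000) -> "bytes 0-4194303/10000000"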

Why am I unable to list all my accounts with this code?

This is my first foray into Google Analytics. I created a service account and downloaded the p12 file from the developer console.
This code works, but in an incomplete way.
I have two accounts, but the code below just returns one account from the list.
How do I get all my accounts listed?
private static ServiceAccountCredential Run2()
{
    const string keyfilePath = "file.p12";
    const string serviceAccountMail = "notarealemailaddress@developer.gserviceaccount.com";
    var certificate = new X509Certificate2(keyfilePath, "notasecret", X509KeyStorageFlags.Exportable);
    var credential = new ServiceAccountCredential(new ServiceAccountCredential.Initializer(serviceAccountMail)
    {
        Scopes = new[] { AnalyticsService.Scope.Analytics, AnalyticsService.Scope.AnalyticsReadonly, AnalyticsService.Scope.AnalyticsProvision }
    }.FromCertificate(certificate));
    return credential;
}

static void Main()
{
    var cr = Run2();
    var service = new AnalyticsService(new BaseClientService.Initializer()
    {
        HttpClientInitializer = cr,
        ApplicationName = "Analytics API Sample"
    });
    var request = service.Management.Accounts.List();
    request.MaxResults = 20;
    var result = request.Execute();
    foreach (var item in result.Items)
    {
        Console.WriteLine("Account Name: {0} {1} {2}", item.Name, item.Kind, item.Id);
    }
}
This is what I ended up doing. The service account that Google creates needs to be added to every account that you need to access. I figured this out from reading the documentation.
https://developers.google.com/analytics/devguides/config/mgmt/v3/quickstart/service-py
Try this out:
ManagementResource.AccountSummariesResource.ListRequest list = service.Management.AccountSummaries.List();
list.MaxResults = 1000; // Maximum number of Account Summaries to return per request.
AccountSummaries feed = list.Execute();
List<AccountSummary> allRows = new List<AccountSummary>();
// Loop through until we arrive at an empty page
while (feed.Items != null)
{
    allRows.AddRange(feed.Items);
    // We will know we are on the last page when the next page token is null.
    // If this is the case, break.
    if (feed.NextLink == null)
    {
        break;
    }
    // Prepare the next page of results
    list.StartIndex = feed.StartIndex + list.MaxResults;
    // Execute and process the next page request
    feed = list.Execute();
}
feed.Items = allRows;

// Get the account summaries and display them.
foreach (AccountSummary account in feed.Items)
{
    // Account
    Console.WriteLine("Account: " + account.Name + "(" + account.Id + ")");
    foreach (WebPropertySummary wp in account.WebProperties)
    {
        // Web Properties within that account
        Console.WriteLine("\tWeb Property: " + wp.Name + "(" + wp.Id + ")");
        // Don't forget to check it's not null. Believe it or not, it could be.
        if (wp.Profiles != null)
        {
            foreach (ProfileSummary profile in wp.Profiles)
            {
                // Profiles within that web property.
                Console.WriteLine("\t\tProfile: " + profile.Name + "(" + profile.Id + ")");
            }
        }
    }
}
Reference: http://www.daimto.com/googleanalytics-management-csharp/
http://www.daimto.com/googleAnalytics-authentication-csharp/
