Azure autoscale MetricName values

I need to define a scale rule for my virtual machine. I have read the following:
The MetricName and MetricNamespace are not values I just made up.
These have to be precise. You can get these values from the
MetricsClient API and there is some sample code in this link to show
how to get the values.
http://rickrainey.com/2013/12/15/auto-scaling-cloud-services-on-cpu-percentage-with-the-windows-azure-monitoring-services-management-library/
But it's still not clear how I can get a list of possible MetricName values, as I didn't find any sample code for that.

Here is the code I used to get the available MetricNames for the cloud service. It was part of a unit test project, hence the [TestMethod] attribute.
[TestMethod]
public async Task GetMetricDefinitions()
{
    // Build the resource ID string.
    string resourceId = ResourceIdBuilder.BuildCloudServiceResourceId(
        cloudServiceName, deploymentName, roleName);
    Console.WriteLine("Resource Id: {0}", resourceId);

    // Get the metric definitions.
    var retrieveMetricsTask =
        metricsClient.MetricDefinitions.ListAsync(resourceId, null, null, CancellationToken.None);
    var metricListResponse = await retrieveMetricsTask;
    MetricDefinitionCollection metricDefinitions = metricListResponse.MetricDefinitionCollection;

    // Make sure something was returned.
    Assert.IsTrue(metricDefinitions.Value.Count > 0);

    // Display the metric definitions.
    int count = 0;
    foreach (MetricDefinition metricDefinition in metricDefinitions.Value)
    {
        Console.WriteLine("Metric Definition: " + count++);
        Console.WriteLine("Display Name: " + metricDefinition.DisplayName);
        Console.WriteLine("Metric Name: " + metricDefinition.Name);
        Console.WriteLine("Metric Namespace: " + metricDefinition.Namespace);
        Console.WriteLine("Is Alertable: " + metricDefinition.IsAlertable);
        Console.WriteLine("Min. Alertable Time Window: " + metricDefinition.MinimumAlertableTimeWindow);
        Console.WriteLine();
    }
}
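The test assumes a few fields (metricsClient, cloudServiceName, deploymentName, roleName) that are initialised elsewhere in the test class. Here is a minimal sketch of how the metricsClient might be constructed with the preview Microsoft.WindowsAzure.Management.Monitoring library; the subscription ID and management certificate thumbprint are placeholders, not values from the original post:

using System.Security.Cryptography.X509Certificates;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.Management.Monitoring.Metrics;

private MetricsClient CreateMetricsClient()
{
    // Placeholders: substitute your own management certificate thumbprint and subscription ID.
    var store = new X509Store(StoreName.My, StoreLocation.CurrentUser);
    store.Open(OpenFlags.ReadOnly);
    X509Certificate2 managementCert = store.Certificates
        .Find(X509FindType.FindByThumbprint, "<management-cert-thumbprint>", false)[0];
    store.Close();

    // Authenticate against the subscription with the management certificate.
    var credentials = new CertificateCloudCredentials("<subscription-id>", managementCert);
    return new MetricsClient(credentials);
}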
Here is the output of the test for my cloud service:

Related

Identify modified blobs from storage account changefeed

I'm currently consuming the change feed on an Azure storage account and would like to distinguish between blobs that are created (uploaded) and those that are just modified.
In the example below I upload a blob (agent-diag.txt) and then edit the file (add some text).
In both cases it raises 'BlobCreated'; there seems to be no concept of 'BlobUpdated'.
From MS Doc: The following event types are captured in the change feed records:
BlobCreated
BlobDeleted
BlobPropertiesUpdated
BlobSnapshotCreated
BlobPropertiesUpdated is recorded if the metadata, tags, etc. are changed. But if the file's content is modified I can't see any way to identify this. Any ideas?
Operation Name: PutBlob
Api: Azure.Storage.Blobs.ChangeFeed.BlobChangeFeedEventData
Subject: /blobServices/default/containers/myblobs/blobs/agent-diag.txt
Event Type: BlobCreated
Event Time: 17/11/2021 23:25:42 +00:00
Operation Name: PutBlob
Api: Azure.Storage.Blobs.ChangeFeed.BlobChangeFeedEventData
Subject: /blobServices/default/containers/myblobs/blobs/agent-diag.txt
Event Type: BlobCreated
Event Time: 17/11/2021 23:26:07 +00:00
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.ChangeFeed;

namespace Changefeed
{
    class Program
    {
        const string conString = "DefaultEndpointsProtocol=BlahBlah";

        public static async Task<List<BlobChangeFeedEvent>> ChangeFeedAsync(string connectionString)
        {
            // Get a new blob service client.
            BlobServiceClient blobServiceClient = new BlobServiceClient(connectionString);

            // Get a new change feed client.
            BlobChangeFeedClient changeFeedClient = blobServiceClient.GetChangeFeedClient();
            List<BlobChangeFeedEvent> changeFeedEvents = new List<BlobChangeFeedEvent>();

            // Get all the events in the change feed.
            await foreach (BlobChangeFeedEvent changeFeedEvent in changeFeedClient.GetChangesAsync())
            {
                changeFeedEvents.Add(changeFeedEvent);
            }
            return changeFeedEvents;
        }

        public static void showEventData(List<BlobChangeFeedEvent> changeFeedEvents)
        {
            foreach (BlobChangeFeedEvent changeFeedEvent in changeFeedEvents)
            {
                string subject = changeFeedEvent.Subject;
                string eventType = changeFeedEvent.EventType.ToString();
                string eventTime = changeFeedEvent.EventTime.ToString();
                string api = changeFeedEvent.EventData.ToString();
                string operation = changeFeedEvent.EventData.BlobOperationName.ToString();
                Console.WriteLine("Subject: " + subject + "\n" +
                                  "Event Type: " + eventType + "\n" +
                                  "Event Time: " + eventTime + "\n" +
                                  "Operation Name: " + operation + "\n" +
                                  "Api: " + api);
            }
        }

        public static void Main(string[] args)
        {
            Console.WriteLine("Hello World!");
            List<BlobChangeFeedEvent> feedlist = ChangeFeedAsync(conString).GetAwaiter().GetResult();
            Console.WriteLine("Feedlist: " + feedlist.Count());
            showEventData(feedlist);
        }
    }
}
Each blob has two system-defined properties, creation time and last modified time, which tell you when the blob was created and when it was last modified respectively.
When a blob is created, both of these properties have the same value. However, when the same blob is overwritten (i.e. its content is updated), only the last modified value changes.
What you could do is use these properties to identify whether a new blob is created or content of an existing blob is updated.
You would still work with BlobCreated event. One additional step you would need to do is fetch the properties of the blob and compare these two properties to make the distinction.
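A minimal sketch of that check, using the same Azure.Storage.Blobs client as the code above (the container and blob names are assumed to have been parsed out of the event Subject):

using System;
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

// For a BlobCreated event, fetch the blob's properties and compare the two
// system-defined timestamps to tell a fresh upload from an overwrite.
public static async Task<bool> IsOverwriteAsync(BlobServiceClient serviceClient, string containerName, string blobName)
{
    BlobClient blob = serviceClient.GetBlobContainerClient(containerName).GetBlobClient(blobName);
    BlobProperties props = await blob.GetPropertiesAsync();

    // CreatedOn == LastModified  -> the blob was newly created.
    // LastModified >  CreatedOn  -> an existing blob was overwritten.
    return props.LastModified > props.CreatedOn;
}

In the showEventData loop above you could call this for events whose EventType is BlobCreated to label each one as a create or an overwrite.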

Is there a way to store the last changes on real time database?

I'm using the Realtime Database with the following data structure. I can add more machines in the web app, each of which gets an autogenerated ID, and all the machines have the same data structure.
machines
  -autogeneratedID1
    -id1 : 1
    -id2 : 2
  -autogeneratedID2
    -id1 : 4
    -id2 : 3
I want to track the changes to the ids inside the autogeneratedIDs: if id1 on autogeneratedID1 changes from 1 to 3, I want something that returns the change together with a timestamp.
I'm trying to use cloud functions with the following code:
exports.foo = functions.database.ref("machines/").onUpdate((change) => {
const before = change.before.val(); // DataSnapshot before the change
const after = change.after.val(); // DataSnapshot after the change
console.log(before);
console.log(after);
return null;
but the before and after objects return the JSON of the whole database structure under machines/.
My first guess was to compare the 2 JSON objects and detect where the changes were and add a timestamp. Then I want to store the changes in the database.
Is there a way to do this?
Best Regards
I finally found the answer; here is the code I'm using:
exports.foo = functions.database.ref("machines/{machineID}/{ValueID}").onUpdate((change,context) => {
const before = change.before.val(); // DataSnapshot before the change
const after = change.after.val(); // DataSnapshot after the change
const machineID = context.params.machineID; //Machine ID
const valueID = context.params.ValueID; //Value ID
console.log("MachineID: " + machineID + " " +"IDChanged: " + valueID + " " + "BeforeValue: " + before +" "+ "AfterValue: " + after);
return null;
});

Using Filters in MS-Project Queries using Sharepoint API (PS.js / SP.js)

I'm trying to use the PS.js API to access information in Sharepoint, and I'm struggling to find a way to practically use the system.
In this case I'm trying to show upcoming resource allocation. I can easily get my hands on the list of all resources [ProjectContext.get_resources()], and from there the list of each of their assignments to projects/tasks [EnterpriseResource.get_assignments()]. But there are thousands of these, and I only want those that are booked on or after the current date.
Is there any way I can create a filter or CAML view of these using the existing SP.js or PS.js, rather than reverting to the REST API?
I'd love it if I could just modify the result of 'get_assignments' to add filter options, or add filters as arguments to 'load', but I can't seem to find any documentation that tells me whether that's possible.
Sample Below...
function ResourceListArrived(resource)
{
    // Go through every Resource in the list.
    var rEnumerator = Resources.getEnumerator();
    while (rEnumerator.moveNext()) {
        var resource = rEnumerator.get_current();
        $('#message').html('Processing Resource: ' + resource.get_name() + " / " + resource.get_id());
        log('Investigating Resource ' + resource.get_name());

        // Get the assignments for this resource.
        GetResourceAssignments(resource);
    }
    $('#messageblock').fadeOut(500);
}
function GetResourceAssignments(resource)
{
    var assignments = resource.get_assignments();
    Project.load(assignments, 'Include(Start,Stop)');
    // I'd like to filter THESE results somewhere in the previous two lines.

    // Run the request on the server.
    Project.executeQueryAsync(
        function (sender, args) {
            TotalRequestsOutstanding--;
            var aEnumerator = assignments.getEnumerator();
            while (aEnumerator.moveNext()) {
                var assignment = aEnumerator.get_current();
                log('  Assignment Found On ' + resource.get_name() + " - " + assignment.get_start());
            }
        },
        function (sender, args) {
            alert('Failed to get list of assignments. Error: ' + args.get_message());
        });
}

Why is code running on Azure so slow?

I have a web app running on Azure in shared web site mode. A simple method where I add items to a list and sort this list takes 0.3s on my machine and 10s after deployment (on the Azure machine) when the list has about 300 items.
Does anybody have any idea why Azure is so slow?
Is there any configuration I'm doing wrong? I use the default one, but I replaced FREE mode with SHARED mode because I thought this would help; it seems it does not.
UPDATE:
public ActionResult GetPosts(String selectedStreams, int implicitSelectedVisualiserId, int userId)
{
    DateTime begin = DateTime.UtcNow;
    List<SearchQuery> selectedSearchQueries = searchQueryRepository.GetSearchQueriesOfStreamsIds(selectedStreams == String.Empty ? new List<int>() : selectedStreams.Split(',').Select(n => int.Parse(n)).ToList());
    var implicitSelectedVisualiser = VisualiserModel.ToVisualiserModel(visualiserRepository.GetVisualiser(implicitSelectedVisualiserId));
    var twitterSearchQueryOfImplicitSelectedVisualiser = searchQueryRepository.GetSearchQuery(implicitSelectedVisualiser.Stream.Name, Service.Twitter, userId);
    var instagramSearchQueryOfImplicitSelectedVisualiser = searchQueryRepository.GetSearchQuery(implicitSelectedVisualiser.Stream.Name, Service.Instagram, userId);
    var facebookSearchQueryOfImplicitSelectedVisualiser = searchQueryRepository.GetSearchQuery(implicitSelectedVisualiser.Stream.Name, Service.Facebook, userId);
    var manualSearchQueryOfImplicitSelectedVisualiser = searchQueryRepository.GetSearchQuery(implicitSelectedVisualiser.Stream.Name, Service.Manual, userId);
    List<SearchResultModel> approvedSearchResults = new List<SearchResultModel>();

    if (twitterSearchQueryOfImplicitSelectedVisualiser != null || instagramSearchQueryOfImplicitSelectedVisualiser != null || facebookSearchQueryOfImplicitSelectedVisualiser != null
        || manualSearchQueryOfImplicitSelectedVisualiser != null)
    {
        // Define search text to be displayed during slideshow.
        SearchModel searchModel = new SearchModel();

        // Set slideshow settings from implicit selected visualiser.
        ViewBag.CurrentVisualiser = implicitSelectedVisualiser;

        // Load search results from selected visualisers.
        foreach (SearchQuery searchQuery in selectedSearchQueries)
        {
            approvedSearchResults.AddRange(
                SearchResultModel.ToSearchResultModel(
                    searchResultRepository.GetSearchResults(
                        searchQuery.Id,
                        implicitSelectedVisualiser.Language)));

            // Add defined query too.
            searchModel.SearchValue += " " + searchQuery.Query;
        }

        // Add defined query for implicit selected visualiser.
        if (twitterSearchQueryOfImplicitSelectedVisualiser != null)
            searchModel.SearchValue += " " + twitterSearchQueryOfImplicitSelectedVisualiser.Query;
        if (instagramSearchQueryOfImplicitSelectedVisualiser != null)
            searchModel.SearchValue += " " + instagramSearchQueryOfImplicitSelectedVisualiser.Query;
        if (facebookSearchQueryOfImplicitSelectedVisualiser != null)
            searchModel.SearchValue += " " + facebookSearchQueryOfImplicitSelectedVisualiser.Query;
        ViewBag.Search = searchModel;

        // Also add search results from implicit selected visualiser.
        if (twitterSearchQueryOfImplicitSelectedVisualiser != null)
            approvedSearchResults.AddRange(SearchResultModel.ToSearchResultModel(searchResultRepository.GetSearchResults(twitterSearchQueryOfImplicitSelectedVisualiser.Id, implicitSelectedVisualiser.Language)));
        if (instagramSearchQueryOfImplicitSelectedVisualiser != null)
            approvedSearchResults.AddRange(SearchResultModel.ToSearchResultModel(searchResultRepository.GetSearchResults(instagramSearchQueryOfImplicitSelectedVisualiser.Id, implicitSelectedVisualiser.Language)));
        if (facebookSearchQueryOfImplicitSelectedVisualiser != null)
            approvedSearchResults.AddRange(SearchResultModel.ToSearchResultModel(searchResultRepository.GetSearchResults(facebookSearchQueryOfImplicitSelectedVisualiser.Id, implicitSelectedVisualiser.Language)));
        if (manualSearchQueryOfImplicitSelectedVisualiser != null)
            approvedSearchResults.AddRange(SearchResultModel.ToSearchResultModel(searchResultRepository.GetSearchResults(manualSearchQueryOfImplicitSelectedVisualiser.Id, implicitSelectedVisualiser.Language)));

        // If the user selected to show only posts from a specific number of last days.
        var approvedSearchResultsFilteredByDays = new List<SearchResultModel>();
        if (implicitSelectedVisualiser.ShowPostsFromLastXDays != 0)
        {
            foreach (SearchResultModel searchResult in approvedSearchResults)
            {
                var postCreatedTimeWithDays = searchResult.PostCreatedTime.AddDays(implicitSelectedVisualiser.ShowPostsFromLastXDays + 1);
                if (postCreatedTimeWithDays >= DateTime.Now)
                    approvedSearchResultsFilteredByDays.Add(searchResult);
            }
        }
        else
        {
            approvedSearchResultsFilteredByDays = approvedSearchResults;
        }

        // Order search results (posts to be displayed by created datetime).
        var approvedSearchResultsOrdered = new List<SearchResultModel>();
        if (implicitSelectedVisualiser.PostsSortOrder == PostsSortOrder.CREATED_DATE_ASC)
        {
            approvedSearchResultsOrdered = approvedSearchResultsFilteredByDays.OrderBy(s => s.PostCreatedTime).ToList();
        }
        else if (implicitSelectedVisualiser.PostsSortOrder == PostsSortOrder.CREATED_DATE_DESC)
        {
            approvedSearchResultsOrdered = approvedSearchResultsFilteredByDays.OrderByDescending(s => s.PostCreatedTime).ToList();
        }
        else if (implicitSelectedVisualiser.PostsSortOrder == PostsSortOrder.RANDOM)
        {
            var rnd = new Random();
            approvedSearchResultsOrdered = approvedSearchResultsFilteredByDays.OrderBy(x => rnd.Next()).ToList();
        }

        // Load background images.
        var visualiserImages = visualiserImageRepository.GetImages(implicitSelectedVisualiser.Id);
        //foreach (SearchResultModel searchResultModel in approvedSearchResultsOrdered)
        //{
        //    searchResultModel.BackgroundImagePath = TwitterUtils.GetRandomImageBackgroundForDisplay(visualiserImages);
        //}
        ViewBag.BackgroundImagePath = TwitterUtils.GetRandomImageBackgroundForDisplay(visualiserImages);
        approvedSearchResults = approvedSearchResultsOrdered;
    }

    DateTime end = DateTime.UtcNow;
    Elmah.ErrorSignal.FromCurrentContext().Raise(new Exception(String.Format("User {0}: Preparing {1} posts for visualiser took {2} seconds", MySession.Current.LoggedInUserName, approvedSearchResults.Count(), (end - begin).TotalMilliseconds / 1000)));
    return PartialView("_DisplayPostsNew", approvedSearchResults);
}
This isn't surprising actually. The servers used in Windows Azure are currently mostly 1.6 GHz machines. The larger the machine size you use, the more cores you get, but they are all the same speed. This is likely a much slower CPU than the development machine you use.
On Windows Azure Web Sites when you move to Shared mode you are still in a multi-tenant environment, so you could be seeing some noisy neighbors here. The difference between Free and Shared is that many of the quotas for free are removed since you are paying. When you move to Standard then you are assigned a Virtual Machine dedicated to your web sites (up to 100 of them), so that is the best case scenario since you are the only one using the resources at that point.
There was a thread on this on the MSDN forums a while back : http://social.msdn.microsoft.com/Forums/windowsazure/en-US/0d0a3a88-eac4-4b9e-8b10-4a547cbf653b/performance-of-azure-servers-slow-cpus?forum=windowsazuredevelopment
They have started offering different hardware configurations with more memory for Virtual Machines and Cloud Services and such, but I'm not sure the CPUs have been changed. It's hard to find the CPU speed stated on WindowsAzure.com anymore, but the pricing calculator for Web Sites references 1.6 GHz machines when you move the slider to Standard.
Actually, I found the issue.
Locally I tested with a few hundred records in my DB, while the Azure DB has over 70,000 records in that table, which affects the performance of the algorithm...
One mistake I made in the code above: I filtered the records from the DB by date AFTER pulling them all out. By filtering directly in the LINQ query, I increased the performance from 10s to 0.3s on Azure too.
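A sketch of what that change looks like; the context and entity names here are illustrative placeholders, not the actual repository code from the project. The point is that the date predicate is part of the LINQ query, so it is translated to SQL and only the matching rows come back from the database:

using System;
using System.Collections.Generic;
using System.Linq;

// Illustrative only: 'dbContext.SearchResults' stands in for whatever the repository queries.
// The Where/OrderBy are evaluated by the database, not in application memory.
public List<SearchResult> GetRecentSearchResults(MyDbContext dbContext, int searchQueryId, int lastXDays)
{
    DateTime cutoff = DateTime.UtcNow.AddDays(-lastXDays);
    return dbContext.SearchResults
        .Where(r => r.SearchQueryId == searchQueryId && r.PostCreatedTime >= cutoff)
        .OrderByDescending(r => r.PostCreatedTime)
        .ToList();
}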

How to generate requests and responses for all operations of a WSDL in soapUI using a groovy script?

I have a WSDL which has multiple operations. For each operation I want a template .xml with its request and response.
I know how to do this manually in soapUI but I would like to generate them using a groovy script.
I googled a lot already, but it seems I'm the only one looking for this.
My service has 16 operations, so doing this manually would take too much time. Since the service gets updated every 2 months, automating this with a test step would be perfect.
I managed to do it for the requests already:
right-click on 'services' in the left tree, choose 'Generate Test Suite', then 'Single Test Case with one Request for each Operation';
then I loop through those Test Step Requests and store them on my disk.
import com.eviware.soapui.impl.wsdl.teststeps.*

for( testCase in testRunner.testCase.testSuite.getTestCaseList() )
{
    for( testStep in testCase.getTestStepList() )
    {
        if( testStep instanceof WsdlTestRequestStep )
        {
            log.info "operation name: " + testStep.getName()

            // Create the file name from the operation name and the current time.
            Date startTime = new Date();
            def cur_Time = startTime.getMonth() + "_" + startTime.getDate();
            cur_Time = cur_Time + "_" + startTime.getHours() + startTime.getMinutes() + startTime.getSeconds()
            def fileName = testStep.getName() + "_" + cur_Time
            def inputFileRequest = new File("T:\\" + "Request_" + fileName + ".txt")
            def inputFileResponse = new File("T:\\" + "Response_" + fileName + ".txt")

            // Write the request to file.
            inputFileRequest.write(testStep.getProperty("request").value)
        }
    }
}
But I haven't figured out a way to do this for the responses as well.
If I use getProperty("response") it's null, of course.
Any hint? :)
And the winner is... I figured it out myself:
map = context.testCase.testSuite.project.interfaces["services"].operations

for (entry in map)
{
    opName = entry.getKey()
    inputFileRequest = new File("T:\\" + opName + "Request.xml")
    inputFileResponse = new File("T:\\" + opName + "Response.xml")
    inputFileRequest.write(entry.getValue().createRequest(true))
    inputFileResponse.write(entry.getValue().createResponse(true))
}
This is great; I am also working on the same thing. As of now I am taking the XML requests from a folder, but I just want to get the request from the WSDL itself and get its parameters.
try {
    // Hitting the WSDLs one by one.
    wsdlList.each
    {
        wsdl ->
        wsdlToHit = wsdl
        log.info("WSDL To Hit :" + wsdlToHit)

        // Creating an interface.
        log.info("Before Interface Creation")
        iface = WsdlInterfaceFactory.importWsdl( project, wsdl, false )[0]
        //iface = WsdlInterfaceFactory.importWsdl( project, WSDLFile, false )[0]
        log.info("After Interface Creation")

        if (Operation == "xyz")
        {
            requestXML = requestXML1
            responseActual = responseActual1
            expectedActual = expectedActual1
        }
        if (Operation == "abc")
        {
            requestXML = requestXML2
            responseActual = responseActual2
            expectedActual = expectedActual2
        }

        requestXML.each
        {
            request1 ->
            def wsdlReqDir = request1
            log.info("RequestLocation : " + wsdlReqDir)
            File fl = new File(wsdlReqDir)
            File[] wsdlDirFiles = fl.listFiles()
            log.info("XML Files in Request Folder : " + wsdlDirFiles)
            if (wsdlDirFiles.size() > 0)
            {
                wsdlDirFiles.each
                {
                    wsdlFile ->
                    log.info("Request XML file to Send :" + wsdlFile)
                    // Calling the function to hit the service.
                    sendRequest(wsdlFile, iface, Operation, Report_File_LOC, requestXML, responseActual, propData)
                    reportFilewriter.flush()
                }
            }
        }

        // Removing the interface created.
        removeInterface(wsdl)
        log.info("Removed iface : " + wsdl)
        reportFilewriter.flush()
    }
}
catch (Exception e) {
    // Log any failure while importing or hitting a WSDL.
    log.error("Error while processing WSDLs: " + e.getMessage())
}
Thanks,
Hanumant
