Open Graph Insights: Definitions of "Objects" and "Actions"

When viewing insights for Open Graph, at the very bottom of the page we get the following:
1,193 App Created Actions?
The number of Open Graph Actions created by this application.
and:
50,808 App Created Objects?
The number of Open Graph Objects created by this application.
But what exactly are they, and how do they correlate? Actions are relatively obvious: they would cover the 'news.reads', and I had assumed objects would be the 'article' objects you 'read'... but logic would dictate that you'd have significantly fewer objects than actions, rather than vice versa.
You'd think that 10 people might 'read' 1 article, thus giving you 10 actions against 1 object.
Have I misunderstood?

Related

How to show tracking in a user flow diagram?

I have made a user flow diagram but just wanted to ask you guys about clarification on displaying "tracking".
Let's say these are the requirements:
User enters a web page to browse products
User selects product
The selected product is tracked by Google Analytics (making a request to Google servers)
User goes to order page to purchase product or not
What would be the correct way of showing step number 3 in my user flow diagram?
Please see screenshot below and let me know of any suggestions
Thank you in advance!
A data-flow diagram differs from an ordinary flow chart: instead of arrows showing the control flow (the sequence of actions and decisions), each arrow should correspond to data moving from one function to another. On each arrow you should be able to write the data that is transmitted.
If a product is tracked with analytics, the arrow to Google tracking would probably carry a product. The question then is:
Does Google tracking return something to selects product? In that case an arrow in the opposite direction (for the response) is missing.
Or will another function get data from Google tracking to exploit the tracking? That function would be missing.
Or is Google tracking considered to be a data store that would be read elsewhere?
Unrelated to your question
Does selects product get the selected product from the web page? Or is it something that receives data from the user? Or is it in fact part of your web page?
On the symbols: the main DFD notations use circles or rounded rectangles for processes instead of boxes. The diamond and the circle here look like flow-chart symbols and are confusing.

Recognize if object is completely or partially visible with Bing/Azure Cognitive API

I'm wondering how to recognize whether an image contains a specific object and whether that object is completely visible (not partially).
The Cognitive Services Computer Vision API provides a set of tags and a description of the image I send; however, there is no information on whether the object is completely or partially represented.
My goal is to have a service to which I can upload a picture of, say, a car, and get back whether the full car is visible or just part of it.
Unfortunately the Computer Vision API is currently unable to perform such a function.
The tags returned do have a 'score' which represents the confidence that the item is in the image. You may find some correlation between the confidence and how much of the item is in the image, but you'd need to run some experiments to see how well it matches up. If the object is obscured too much, it may not even be detected at all.
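If you want to experiment with that, here is a minimal libcurl sketch that calls the Analyze endpoint with the Tags feature and prints the JSON response, so you can compare the tag confidence scores across images. The region, API version, subscription key, and image URL are placeholders you would replace with your own; treat it as a starting point, not a definitive client.

    #include <stdio.h>
    #include <curl/curl.h>

    int main(void)
    {
        /* Placeholder endpoint: substitute your own region/resource and API version. */
        const char *url =
            "https://westus.api.cognitive.microsoft.com/vision/v2.0/analyze?visualFeatures=Tags";
        struct curl_slist *headers = NULL;
        CURL *curl;
        CURLcode res;

        curl_global_init(CURL_GLOBAL_DEFAULT);
        curl = curl_easy_init();
        if (!curl)
            return 1;

        headers = curl_slist_append(headers, "Ocp-Apim-Subscription-Key: YOUR_KEY"); /* placeholder */
        headers = curl_slist_append(headers, "Content-Type: application/json");

        curl_easy_setopt(curl, CURLOPT_URL, url);
        curl_easy_setopt(curl, CURLOPT_HTTPHEADER, headers);

        /* Analyze an image by URL; the JSON response contains a "tags" array whose
           entries each carry a "name" and a "confidence" score you can compare
           across pictures of the same object. */
        curl_easy_setopt(curl, CURLOPT_POSTFIELDS,
            "{\"url\":\"https://example.com/photos/car.jpg\"}");

        /* With no write callback configured, libcurl writes the JSON response to stdout. */
        res = curl_easy_perform(curl);
        if (res != CURLE_OK)
            fprintf(stderr, "request failed: %s\n", curl_easy_strerror(res));

        curl_slist_free_all(headers);
        curl_easy_cleanup(curl);
        curl_global_cleanup();
        return 0;
    }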
Feel free to drop a suggestion on our User Voice, if you think this would be a useful feature.

How to make Sequence Diagram for Update Inventory

I'm preparing the sequence diagram for a project. I made the following sequence diagram for a retailer updating his inventory.
It's confusing to me because this is the first time I've used this technique on a real project. I have used the database as an object here and I don't know whether that's right or wrong. Another thing I need to clarify: by 'updating' I mean both editing and adding a new item to the inventory. Is it wrong to do it that way, or should they be drawn separately?
The following image is part of the updating process; would anyone take a look and correct me if I made any mistakes? (UpdateUI = user interface.) Thanks in advance.
It does not look right. There are a couple of issues:
Your database will likely never issue any messages
Actions inside a DB are usually not exposed. You normally only call CRUD operations on a DB from outside.
You mix synchronous and asynchronous messages (likely unintentionally). Filled arrowheads are synchronous, open ones are asynchronous.
Main Page is likely the V in MVC and UpdateUI the C. So the controller will act on a click from the user and interact with the DB.
So, just from my gut, here is a more reasonable sketch:

How to ease updating inferno with web performance test scripts

Updating a performance test script, e.g. with LoadRunner, can take a lot of time and be quite frustrating. If there have been some updates to the application, you usually have to run the script, find out what has to be changed, update it, run it again, and so on. Does anyone have some concrete best practices for easing this updating inferno? One obvious thing is good communication with the developers.
It depends on the kind of updates. If the update is dramatic, like adding new fields for user to fill in, then, someone has to manually touch up the test scripts.
If, however, the update is minor, for example, some changes to the hidden fields or changes to the internal names of user-facing fields, then it's possible to write a script that checks the change and automatically updates the test script.
One of the performance test platforms, NetGend, automatically takes care of the hidden fields and the internal names of user-facing fields, so it's very easy to create a script to performance-test an HTML form. The tester only needs to fill in the values that he/she would enter in a browser, so no correlation is necessary there. Please send me a message if you need to know more about it.
There are many things you can do to insulate your scripts from build-to-build variability. The higher up the OSI stack you go, the lower the maintenance charge, but the higher the resource cost for the virtual user type. Assuming changes are limited to page-level resources and a few hidden fields here and there for web sites or applications, you can record in HTML mode. You can blast away the EXTRARES sections, since the page parser in HTML mode will automatically parse the page and load its resources even without an explicit reference; it can be a real pain to keep these sections in sync if you have developers who are experimenting quite a bit.
Next up, for forms which have a very high velocity of change, consider using web_custom_request() for that one form. You can use correlation statements to pick up all of the name|value pairs as needed and build the form submit dynamically, as in the sketch below. There will be a little more up-front work for this, but you should see the payoff at around the fourth changed build, where you would normally have been rebuilding some scripts.
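To make that concrete, here is a rough LoadRunner-style sketch of the pattern. The boundaries, URLs, and parameter names are invented for illustration and would have to match your application's actual form markup; treat it as the shape of the approach rather than a drop-in script.

    Action()
    {
        int  i, count;
        char body[4096] = "";
        char pair[512];

        /* Capture every field name and value on the form page (Ord=All builds arrays).
           If some inputs have no value attribute the two arrays can get out of step,
           so adjust the left/right boundaries to your actual markup. */
        web_reg_save_param("FieldName",  "LB=name=\"",  "RB=\"", "Ord=All", LAST);
        web_reg_save_param("FieldValue", "LB=value=\"", "RB=\"", "Ord=All", LAST);

        web_url("form_page", "URL=http://myapp.example.com/order/form", LAST);

        /* Rebuild the POST body from whatever fields the current build actually served. */
        count = lr_paramarr_len("FieldName");
        for (i = 1; i <= count; i++) {
            sprintf(pair, "%s%s=%s",
                (i == 1) ? "" : "&",
                lr_paramarr_idx("FieldName", i),
                lr_paramarr_idx("FieldValue", i));
            strcat(body, pair);   /* values may still need URL encoding (web_convert_param) */
        }
        lr_save_string(body, "pBody");

        /* Submit the form with the dynamically assembled body. */
        web_custom_request("submit_form",
            "URL=http://myapp.example.com/order/submit",
            "Method=POST",
            "Mode=HTML",
            "Body={pBody}",
            LAST);

        return 0;
    }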
Take a look at all of the hosts referenced in your code and parameterize them. I have a template that I use for web virtual users which pairs a default value with the ability to change any of the host names via the control panel extra attributes section. Take a look at the example for lr_get_attrib_string() for how you might implement the pickup, and pair that with a check for NULL and a fallback to a default value in your code; a sketch follows below.
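A minimal sketch of that host pickup, assuming an additional attribute named app_host has been defined in the run-time settings (the attribute name and default host are made up for illustration):

    vuser_init()
    {
        char *host;

        /* Read the host name passed in via the extra attributes / command line. */
        host = lr_get_attrib_string("app_host");

        /* Fall back to a default when the attribute was not supplied. */
        if (host == NULL)
            host = "myapp.example.com";

        /* Save it as a parameter so requests can reference {pHost} in their URLs. */
        lr_save_string(host, "pHost");

        return 0;
    }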
This is going to seem counterintuitive, but comment your script heavily around the parts that change often, so you know where to spend the extra up-front labor to handle a more dynamic data set.
Almost nothing you do with any tool can save you from structural changes in the design and flow of the app, such as the insertion of a new page in the workflow, but paying attention to the design of the high-change pages, of which there are typically a small number, can result in test code with a very long life.
Of course, if your application is web-services based then there is a natural long life to the use of exposed public services. Code may change on the back end of the service, but typically the exposed public interface is very stable.

Can an Excel worksheet be used as a UDF?

I'm building a network business model in Excel. A similar model is that of Gawker Media.
In my model I have a number of properties that have some overlap of audience. Each property attracts users, which in turn affords cross-promotional opportunities. In the case of Gawker, they have a series of blogs whose audiences will likely read several of the blogs in their network.
If Gawker launches a new blog, they're able to direct traffic to it from their blog network.
Creating a model for a single blog is fairly simple - although the initial assumptions are harder. The next step is to model the network effect.
Excel provides a Scenario Manager that allows me to vary the key assumptions in the basic model. This is almost perfect: I can model the launch of 10 properties, each with different launch assumptions, and see the summary.
Where I need help is figuring out how I can vary the initial number of users for the launch of each property. In other words, once the network is established, it's possible to drive people to any new property launched on the network.
I don't believe the Scenario Manager will do what I need.
So, I'm wondering if it's possible to use the model worksheet as a UDF. The UDF would need to spit out the monthly revenue and unique users given a number of input assumptions.
I would then be able to create my own summary sheet for the 10 properties and, using the total uniques for each property, get a summary for the network. This network summary would be used to determine how many people could be driven to the launch of a new property.
In effect, the only difference from the Scenario Manager is that I need one of my input variables (initial users) to be programmatically generated as a function of the number of people in the network at the time of launch.
I'm hoping it's possible to achieve something along these lines in Excel. I could drop down and create the whole model in Java, but then it's much harder to share with business colleagues!
Thanks - Matt.
You could try Data Table.
It only allows you to analyse the effect of varying 2 input parameters, but you can create several data tables, and each parameter can take hundreds of different values.
It's little known, but efficient, and it has been available since Excel 3.0.
There is a product that I have researched but never used - search for calc4web. It takes a sheet of formulas and generates code (C++) that can be compiled into an XLL add-in. Then you can call a function that does what your sheet does. But of course then you have an XLL to distribute, and a build step every time you change your logic, which defeats much of the point of using a spreadsheet.
In my case, I wound up writing some very simple VBA code to vary my sheet "inputs" using the Scenario Manager, and to capture my "outputs". This works if you have a batch of inputs that you can just point your macro at and step through.
EDIT:
See here for a VBA-only example of doing this:
using a sheet in an excel user defined function
