I am new to Zipkin and am trying to understand how to view all spans in a trace.
On the trace query page for Zipkin I am seeing a trace come through with the expected 5 spans. The problem is that when I click "SHOW" to view the individual spans, it only shows 3 spans (and now says it only has 3). There should be two spans for both the API-GATEWAY and the CURRENCY-CONVERSION services, but they seem to be combined for some reason.
Query Page: Note it says 5 spans here
Trace Page: Note it now says there are only 3 spans
I am using the stock Docker image openzipkin/zipkin:2.23 with the sampler probability set to 100% for all services. Is there some configuration that I am missing? I would really like to be able to see the timing breakdown of the individual calls.
In my Google Analytics stats I get two lines for what looks like the same URL, with different stats:
/my-page | xxx views
/my-page/ | xxxx views
So the question is: are they counted as different pages, with the statistics of each URL completely independent of the other, or does visiting the first URL also increment the views of the second one, and vice versa?
Also, is this normal behavior, or can I somehow consolidate the stats into just one URL (the first one, for example)?
Analytics treats them as two different pages, so the stats are independent of each other.
To see the aggregate statistics in Analytics, you can make a filter that includes those two pages and look at the totals row.
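For example, filtering the Pages report with a regular expression such as ^/my-page/?$ should match both variants, and the totals row then shows the combined views.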
I've been working on improving our Core Web Vitals and thought I'd check another site to compare how we're doing.
I've noticed that one website has just FCP, LCP, FID, and CLS visible, with a percentage image to represent how far away they are from the next stage. Yet, for our PageSpeed Insights Field Data we are seeing two additional areas, Speed Index and Total Blocking Time, and we don't see the percentage image to help gauge progress.
Does anyone know how we can get ours to show the 4 main areas too please? I have attached images to show what I mean. Appreciate any help, thank you :)
It sounds like your site's PSI results don't have any field data, and the two additional metrics you're seeing are actually the lab data section. The other website may have more traffic than yours and qualify for inclusion in the public Chrome UX Report dataset, which could explain why they have field data in PSI but your site doesn't.
For example, here's a screenshot of the field and lab data sections:
The field data section resembles your screenshot while the lab data section has additional metrics for SI, TTI, and TBT.
All pages tested in PSI will have lab data but only the pages/origins in the Chrome UX Report will have field data available.
For more info about the difference between lab and field data see https://developers.google.com/web/fundamentals/performance/speed-tools#understanding_lab_vs_field_data
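If you want to check programmatically whether your page has field data at all, the PageSpeed Insights v5 API returns the same two sections. Here's a rough TypeScript sketch; the target URL is a placeholder, and the exact shape of a response with no CrUX data can vary:

    // Query the PageSpeed Insights v5 API and report whether CrUX field data
    // is available for the page. The target URL below is a placeholder.
    const target = 'https://www.example.com/';
    const api = 'https://www.googleapis.com/pagespeedonline/v5/runPagespeed';

    async function checkFieldData(): Promise<void> {
      const res = await fetch(`${api}?url=${encodeURIComponent(target)}&category=performance`);
      const data = await res.json();

      // Field data (CrUX) lives under loadingExperience; lab data under lighthouseResult.
      const fieldMetrics = data.loadingExperience?.metrics;
      if (fieldMetrics && Object.keys(fieldMetrics).length > 0) {
        console.log('Field data available:', Object.keys(fieldMetrics).join(', '));
      } else {
        console.log('No field data for this page; only lab data (lighthouseResult) is returned.');
      }
    }

    checkFieldData().catch(console.error);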
I have to use JointJS to build a workflow diagram.
How much will performance be affected in the IE browser when there are more than 200 components (including arrows, nodes, and boxes) on the same page?
As you may know, there is a demo application named Rappid here.
You can test your performance requirements there.
I have tested the performance with ~150 components and one link between every two components, and there was no performance issue at all.
I am using it for more than 200 nodes, but there are problems: for example, if you want to add a description next to every block at the same time, it will hang for a few minutes. If you do those operations in batches instead (see the sketch below), it will be fine. There are also problems if you want to hide 100 nodes at once; I handle that kind of task with jQuery. I am using it for a very complex use case with more than 500 elements, including nodes, arrows, and other helper components. Tell me your exact requirements and I will be able to tell you whether it will work or not.
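To illustrate the batching point, here is a minimal TypeScript sketch assuming a recent JointJS version; the element shapes, counts, and batch name are just placeholders, not taken from the project above:

    import * as joint from 'jointjs';

    // Build the graph and paper once.
    const graph = new joint.dia.Graph();
    const paper = new joint.dia.Paper({
      el: document.getElementById('canvas') as HTMLElement,
      model: graph,
      width: 2000,
      height: 2000,
    });

    // Create all cells in memory first instead of adding them one by one,
    // so the paper is not re-rendered 200 times.
    const cells: joint.dia.Cell[] = [];
    for (let i = 0; i < 200; i++) {
      const rect = new joint.shapes.standard.Rectangle();
      rect.position(50 + (i % 20) * 90, 50 + Math.floor(i / 20) * 70);
      rect.resize(80, 40);
      rect.attr('label/text', `Node ${i}`);
      cells.push(rect);
    }

    // Single bulk insert instead of 200 separate add events.
    graph.addCells(cells);

    // For bulk updates (e.g. changing the label on every element), wrap the
    // changes in a named batch so views can defer expensive work until it ends.
    graph.startBatch('bulk-update');
    graph.getElements().forEach((el, i) => {
      el.attr('label/text', `Step ${i}`);
    });
    graph.stopBatch('bulk-update');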
The mvc-mini-profiler is a handy tool. ServiceStack has a forked version for use in services. I was thinking it would be dandy to capture the outputs of runs before and after a code change and compare the results.
I figure the steps are:
log the results to a data store or file instead of returning them in the result
compare the output of the various nodes
show the results side-by-side with diffs highlighted
bonus: configure tolerances for diffs in the amount of time spent in different areas, e.g. I may not care if time in SQL varies by 300 ms (see the sketch below)
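To make steps 2-4 concrete, this is roughly the comparison I have in mind, assuming each run is dumped to JSON as a simple tree of named timings (the TimingNode shape is just an illustration, not the profiler's actual output format):

    // Diff two captured profiler runs with a per-node time tolerance.
    interface TimingNode {
      name: string;
      durationMs: number;
      children?: TimingNode[];
    }

    interface Diff { path: string; before: number; after: number; deltaMs: number; }

    function diffTimings(before: TimingNode, after: TimingNode, toleranceMs = 300, path = ''): Diff[] {
      const here = path ? `${path} > ${before.name}` : before.name;
      const diffs: Diff[] = [];
      const delta = after.durationMs - before.durationMs;
      if (Math.abs(delta) > toleranceMs) {
        diffs.push({ path: here, before: before.durationMs, after: after.durationMs, deltaMs: delta });
      }
      // Match child nodes by name; nodes present in only one run are ignored here.
      const afterChildren = new Map((after.children ?? []).map(c => [c.name, c] as [string, TimingNode]));
      for (const child of before.children ?? []) {
        const counterpart = afterChildren.get(child.name);
        if (counterpart) diffs.push(...diffTimings(child, counterpart, toleranceMs, here));
      }
      return diffs;
    }

    // Usage: load the two JSON files captured before/after the change and print the deltas.
    // const before = JSON.parse(fs.readFileSync('run-before.json', 'utf8'));
    // const after = JSON.parse(fs.readFileSync('run-after.json', 'utf8'));
    // diffTimings(before, after, 300).forEach(d =>
    //   console.log(`${d.path}: ${d.before}ms -> ${d.after}ms (${d.deltaMs >= 0 ? '+' : ''}${d.deltaMs}ms)`));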
I did a quick search and didn't see anything.
Thanks,
Drew
I'm trying to set up a Solr search engine. I've already set up the misspelling handling and the suggestions.
However, I can't seem to find how to retrieve the top 10 most-searched words/terms/keywords in Solr/Lucene. How can I get this? I want to display them on my homepage.
Solr does not provide this kind of feature out of the box. There is the StatsComponent, which provides you with all kinds of statistics, but all of those are numeric only.
Depending on how you access Solr (directly or via your own app), you could intercept all calls and log the query string. I did this in a recent project where I logged the queries to a database. If you submit all keywords to another core on your Solr server, you can run facet queries on your search terms, as described by Hyque.
You could use a facet for retrieving the Top X words like this:
http://yourservergoeshere/solr/select?q=*:*&wt=xml&indent=true&facet=true&facet.field=message&facet.limit=10&facet.mincount=1
The value of facet.field depends on the field you'd like to facet on. With facet.limit you (obviously) limit the number of results to 10. You'll find the facet results at the end of the response, starting with "facet_counts".
Edit: I really should go to bed earlier. I didn't see the "most searched" in your question. Sorry for that.
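Putting the two suggestions together (intercept each search, log the term into a separate core, then facet on that core), here is a minimal TypeScript sketch. The host, core name (searchterms), and field name (term) are assumptions; adjust them to your own setup, and make term a plain string field if you want whole queries rather than individual words:

    // Log each intercepted user query into a dedicated "searchterms" core.
    const SOLR = 'http://localhost:8983/solr';

    async function logSearchTerm(term: string): Promise<void> {
      await fetch(`${SOLR}/searchterms/update?commit=true`, {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        // One small document per search; the id just needs to be unique.
        body: JSON.stringify([{ id: `${Date.now()}-${Math.random()}`, term }]),
      });
    }

    // Facet on the logged terms to get the 10 most frequent ones.
    async function topSearchTerms(): Promise<Array<{ term: string; count: number }>> {
      const params = new URLSearchParams({
        q: '*:*',
        rows: '0',
        wt: 'json',
        facet: 'true',
        'facet.field': 'term',
        'facet.limit': '10',
        'facet.mincount': '1',
      });
      const res = await fetch(`${SOLR}/searchterms/select?${params}`);
      const data = await res.json();
      // Solr returns facet counts as a flat [value, count, value, count, ...] array.
      const flat: Array<string | number> = data.facet_counts.facet_fields.term;
      const top: Array<{ term: string; count: number }> = [];
      for (let i = 0; i < flat.length; i += 2) {
        top.push({ term: String(flat[i]), count: Number(flat[i + 1]) });
      }
      return top;
    }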
Apache Solr does not provide any such capability as of today. There is a desire for this and a JIRA ticket corresponding to it. You can vote for it if you'd like to see it in Solr some day: https://issues.apache.org/jira/browse/SOLR-10359.
The stats component provides statistical information, but it's mostly numeric in nature. You could parse the server logs and come up with a way to build a Frequently Searched Terms report (e.g. pump those logs into SiLK or Kibana for visualization).
If you have the ability to change the front end and add some JavaScript code to the UI, or can intercept the search request and make async or batch calls to tracking APIs, you can use SearchStax Analytics, which provides search analytics covering searches, clicks, cart actions, revenue, etc.