I'm running DAST analysis with OWASP ZAP against different microservices. At one point, before the crawler runs, it gives me this:
2022-07-20 19:26:11,517 Number of Imported URLs: 7
Total of 93 URLs
Is there a way to get these two numbers into a variable? I looked in the 3 report types (XML, HTML and JSON) but I couldn't find these quantities.
Yes, ZAP maintains internal statistics, which include the number of URLs found by the 2 spiders: https://www.zaproxy.org/docs/internal-statistics/ These statistics can be accessed via the ZAP API.
The stats are also available to reports - see https://www.zaproxy.org/docs/desktop/addons/report-generation/create/
Some reports like 'traditional-html-plus' include the statistics or you can create your own reports.
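As a sketch of the API route, something like the following should work against a running ZAP instance. The endpoint path follows ZAP's JSON API convention for the stats component; the address, API key, and the exact stat key names here are assumptions — dump the full stats dict and check the internal-statistics page for the keys your ZAP version records:

```python
import json
import urllib.parse
import urllib.request

ZAP_API = "http://localhost:8080"  # default ZAP API address (assumption)
API_KEY = "changeme"               # your ZAP API key

def fetch_stats(key_prefix=""):
    """Query ZAP's stats view; returns the parsed JSON response."""
    params = urllib.parse.urlencode({"apikey": API_KEY, "keyPrefix": key_prefix})
    url = f"{ZAP_API}/JSON/stats/view/stats/?{params}"
    with urllib.request.urlopen(url) as resp:
        return json.loads(resp.read())

def spider_url_counts(stats):
    """Pull spider-related URL counters out of a flat stats mapping.

    The key names (e.g. 'stats.spider.urls.added') are assumptions --
    inspect the full dict to see what your ZAP version actually records.
    """
    return {k: int(v) for k, v in stats.items()
            if "spider" in k and "url" in k.lower()}
```

From there you can assign the counts to whatever variable your pipeline needs, e.g. `counts = spider_url_counts(fetch_stats())`.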
Related
In my Google Analytics stats I get two entries for the same URL, each with different stats:
/my-page | xxx views
/my-page/ | xxxx views
So the question is: are they counted as different pages, so that each URL's statistics are completely independent, or does visiting the first link also increment the views of the second, and vice versa?
Also, is this normal behavior, or can I somehow consolidate the stats under a single URL (the first one, for example)?
Analytics treats them as two different pages, so the stats are independent of each other.
To see the aggregate statistics in Analytics, you can create a filter that includes both pages and look at the totals row.
I am currently using Microsoft Academic's data dump for a project and am unable to identify the total number of theses & dissertations (T&D) present. According to their website, 38% of the data is categorised under the OTHERS type (which includes T&D). But their 60+ GB CSV dump doesn't explicitly flag T&D records. Can someone help me with the statistics for T&D, or with how to find them?
I tried their API as well, but couldn't find this information there either.
The Microsoft Academic Graph does not explicitly segment publication sub-types into theses or dissertations, which means that neither the API nor the website will either.
If this is something you'd like to see added, please let them know by using the "Feedback" link in the lower-right corner on the Microsoft Academic website.
I have come across a text file listing about 100 BitTorrent trackers. My question is: how were they able to generate a text file with so many trackers? Does anybody have a script that generates these tracker URLs?
An example of a BitTorrent tracker is: http://3dfreedom.ru:6969/announce
There is no central directory of trackers, but you can build a list by harvesting torrents, e.g. from big indexing sites, and then extracting the tracker lists from them.
To extract tracker URLs from a torrent file you need a library that supports bencoding, the serialization format used in the BitTorrent ecosystem.
Most torrents will have either a single announce URL as described in BEP-3 or multiple ones as in BEP-12.
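As a minimal sketch, bencoding is simple enough to decode without a library — four value types (integers, byte strings, lists, dicts) — so the extraction can be done in plain Python. This handles both the single `announce` key (BEP-3) and the tiered `announce-list` (BEP-12):

```python
def bdecode(data: bytes, i: int = 0):
    """Decode one bencoded value starting at index i; return (value, next_index)."""
    c = data[i:i + 1]
    if c == b"i":                       # integer: i<digits>e
        end = data.index(b"e", i)
        return int(data[i + 1:end]), end + 1
    if c == b"l":                       # list: l<items>e
        i += 1
        items = []
        while data[i:i + 1] != b"e":
            item, i = bdecode(data, i)
            items.append(item)
        return items, i + 1
    if c == b"d":                       # dict: d<key><value>...e
        i += 1
        d = {}
        while data[i:i + 1] != b"e":
            key, i = bdecode(data, i)
            val, i = bdecode(data, i)
            d[key] = val
        return d, i + 1
    # byte string: <length>:<bytes>
    colon = data.index(b":", i)
    length = int(data[i:colon])
    start = colon + 1
    return data[start:start + length], start + length

def trackers(torrent_bytes: bytes):
    """Collect announce URLs: 'announce' (BEP-3) plus tiers in 'announce-list' (BEP-12)."""
    meta, _ = bdecode(torrent_bytes)
    urls = []
    if b"announce" in meta:
        urls.append(meta[b"announce"].decode())
    for tier in meta.get(b"announce-list", []):
        for u in tier:
            if u.decode() not in urls:
                urls.append(u.decode())
    return urls
```

Run `trackers(open("file.torrent", "rb").read())` over a batch of downloaded torrents and deduplicate the results, and you end up with exactly the kind of tracker list you found.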
I have an Excel sheet with 583 cities as origins and 8 cities as destinations, and I have to find the distance between each origin-destination pair. Since doing this manually would be cumbersome, is there a way I can input the origins and destinations from Excel and get the distances between the cities as output?
I believe the easiest way would be to use the Google Maps Distance Matrix API web service. The documentation provides an example URL request.
A request will look like this:
http://maps.googleapis.com/maps/api/distancematrix/json?origins=Vancouver+BC|Seattle&destinations=San+Francisco|Victoria+BC&mode=bicycling&language=fr-FR&key=API_KEY
You can replace the origins and destinations within the URL with your own, separated by pipes (|). Copying them over manually would take some work, so consider exporting the file as a .csv from Excel and using a scripting language to automate the process — see, for example, Python's urllib package.
Also note that as a free user, you will be limited on the number of origin and destination pairs you can put in one URL request.
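Putting those pieces together, here is a hedged sketch: it reads origin cities from a CSV exported from Excel, builds pipe-separated request URLs with urllib, and batches origins to stay under the per-request limit. The API key, the file layout (one city per row), and the batch size of 25 are assumptions — check the current Distance Matrix quota documentation:

```python
import csv
import json
import urllib.parse
import urllib.request

API_KEY = "YOUR_API_KEY"  # assumption: you have a Distance Matrix API key

def build_request_url(origins, destinations, key=API_KEY):
    """Build a Distance Matrix request URL, joining places with pipes as the API expects."""
    params = urllib.parse.urlencode({
        "origins": "|".join(origins),
        "destinations": "|".join(destinations),
        "key": key,
    })
    return "https://maps.googleapis.com/maps/api/distancematrix/json?" + params

def distances_from_csv(path, destinations, batch_size=25):
    """Read origin cities (one per row) from a CSV and query them in batches."""
    with open(path, newline="") as f:
        origins = [row[0] for row in csv.reader(f) if row]
    results = []
    for i in range(0, len(origins), batch_size):
        url = build_request_url(origins[i:i + batch_size], destinations)
        with urllib.request.urlopen(url) as resp:
            results.append(json.loads(resp.read()))
    return results
```

With 583 origins and 8 destinations, batching keeps each request well under the free-tier element limits while still covering every pair.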
Check out my article on how to get geolocation parameters in Excel using Google services and calculate distances between addresses:
http://www.analystcave.com/excel-calculate-distances-between-addresses/
I've just gotten into the Adwords API for an upcoming project and I need something quite simple actually, but I want to go about it the most efficient way.
I need code to retrieve the Global Monthly Search Volume for multiple keywords (in the millions). After reading about BulkMutateJobService, in the Google documentation they say
If you want to perform a very large number of operations (up to 500,000) on your AdWords campaigns and child objects, use BulkMutateJobService
But later on in the page they give limits of
No more than 25 OperationStream objects are allowed.
No more than 10,000 operations are allowed per BulkMutateRequest.
No more than 100 request parts are allowed.
as well as a few others. See source here http://code.google.com/apis/adwords/docs/bulkjobs.html
Now, my questions:
What do these numbers mean? If I have 1 million keywords I need information on, do I only need to perform 2 requests of 500K keywords each?
Also, are there examples of code that does this task?
I only need the Global Monthly Search Volume and CPC for each keyword. I've searched online, but found no good example, nor anything close, that uses BulkMutateJobService.
Any links, resources, code, advice you can offer? All is appreciated.
The BulkMutateJobService only allows for mutates, or changes, to the account. It does not provide the bulk retrieval of information.
You can fetch monthly search volume for keywords using the TargetingIdeaService. If you use it in STATS mode you can include up to 2500 keywords per request.
Estimated CPC values are obtained from the TrafficEstimatorService. You can request up to 500 keywords per request.
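The actual AdWords API calls are SOAP-heavy and best taken from the official client-library examples, but the batching arithmetic itself is trivial. A small sketch, assuming the 2,500-keyword STATS-mode limit mentioned above:

```python
def batches(keywords, per_request=2500):
    """Split a keyword list into chunks no larger than the per-request limit."""
    return [keywords[i:i + per_request]
            for i in range(0, len(keywords), per_request)]

# 1,000,000 keywords at 2,500 per TargetingIdeaService STATS request
# means 400 requests (and 2,000 requests at TrafficEstimatorService's
# 500-keyword limit) -- not 2 requests of 500K, since the 500,000 figure
# applies to BulkMutateJobService mutations, not lookups.
```

Feed each chunk into one TargetingIdeaService request and aggregate the responses.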
FYI, there is an official AdWords API Forum that you can ask questions on.