diagonal (or divergence) of Jacobian matrix - PyTorch

How do I efficiently calculate the diagonal of the Jacobian matrix using PyTorch?
This operation is widely used in diffusion models.
[d z_1/d x_1, d z_2/d x_2, ..., d z_n/d x_n]
Some non-ideal alternatives are:
1. Calculate the whole Jacobian matrix first, then take out the diagonal.
2. Loop over each entry and calculate each derivative individually.
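Since the question mentions diffusion models, where what is usually needed is the divergence (the sum of the diagonal), one standard workaround is the Hutchinson trace estimator, which needs only vector-Jacobian products rather than the full Jacobian. A minimal sketch, assuming f maps a tensor to a tensor of the same shape (the function name and sampling choices are my own):

import torch

def hutchinson_divergence(f, x, n_samples=1):
    # Unbiased estimate of the divergence sum_i d z_i / d x_i (the trace
    # of the Jacobian) via E_v[v^T J v] with Rademacher vectors v; each
    # sample costs one backward pass instead of n.
    x = x.detach().requires_grad_(True)
    z = f(x)  # must have the same shape as x
    est = 0.0
    for _ in range(n_samples):
        v = torch.empty_like(x).bernoulli_(0.5) * 2 - 1  # +/-1 entries
        (vjp,) = torch.autograd.grad(z, x, grad_outputs=v, retain_graph=True)
        est = est + (vjp * v).sum()  # vjp * v is an unbiased diagonal estimate
    return est / n_samples

# For an elementwise function the Jacobian is diagonal and the estimate
# is exact even with one sample: f(x) = x**3 has divergence sum(3 x_i^2).
x = torch.randn(5)
print(hutchinson_divergence(lambda t: t**3, x))
print((3 * x**2).sum())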

Related

How to get one number after calculating the distance between 2D matrices

I'd like to calculate document similarity using word embedding models (w2v, GloVe),
so one document can be represented as a 257*300 matrix
(257 = maximum number of words per document, 300 = pretrained embedding dimension).
Now I am trying to calculate the distance between all documents.
When I use cosine similarity, Euclidean distance, or other vector methods from scikit-learn, these methods return a similarity matrix.
Is there any method to get one number from the matrix distance calculation?
Or should I calculate the average of all values in the similarity matrix? (I don't think this is the proper way to solve this problem.)
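One common approach (my suggestion, not from the thread) is to pool each document's word vectors into a single vector first, then compare the pooled vectors; the cosine of two 300-d vectors is a single number. A minimal sketch:

import numpy as np
from sklearn.metrics.pairwise import cosine_similarity

def doc_similarity(doc_a, doc_b):
    # doc_a, doc_b: (n_words, 300) arrays of word embeddings.
    # Mean-pool each document into one 300-d vector, then the cosine
    # similarity of the two pooled vectors is a single scalar.
    va = doc_a.mean(axis=0, keepdims=True)
    vb = doc_b.mean(axis=0, keepdims=True)
    return float(cosine_similarity(va, vb)[0, 0])

Mean pooling loses word order, but it avoids averaging the full word-by-word similarity matrix, which mixes many unrelated word pairs into the score.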

Algorithm to check whether the distance between any two objects occurs in the specified grid

I have a distance value between two objects. I need an algorithm to check whether the measured distance can occur as the distance between any two objects in the grid pattern shown in the image below.
Grid for verification
This is a grid with square cells. All distances on such a grid (expressed in units of the cell size) must satisfy the condition
d^2 = a^2 + b^2
for integers a and b. If the squared distance is an integer and you can represent it as a sum of two integer squares, then the objects can be placed at grid nodes.
There is a mathematical criterion (the sum of two squares theorem): a number P is not representable as a sum of two squares if its factorization into primes contains any prime factor of the form 4n+3 in an odd power.
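A minimal sketch of that check (names are my own choice): factor P by trial division and reject it if any prime of the form 4n+3 appears to an odd power.

def representable_as_two_squares(p: int) -> bool:
    # Sum-of-two-squares criterion: p = a^2 + b^2 has integer solutions
    # iff every prime factor of the form 4n+3 occurs to an even power.
    d = 2
    while d * d <= p:
        if p % d == 0:
            exp = 0
            while p % d == 0:
                p //= d
                exp += 1
            if d % 4 == 3 and exp % 2 == 1:
                return False
        d += 1
    return not (p > 1 and p % 4 == 3)  # leftover prime factor has power 1

A measured distance d then passes the test when d*d rounds to an integer (within measurement tolerance) and representable_as_two_squares(round(d*d)) is True.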

Estimate torsion for a discrete curve using four points

The curvature of a discrete space curve can be calculated from 3 successive points using the Menger curvature (see https://en.wikipedia.org/wiki/Menger_curvature and Calculate curvature for 3 Points (x,y)).
My question is: is there a similar explicit formula for the torsion (https://en.wikipedia.org/wiki/Torsion_of_a_curve) using 4 successive points?
If not an explicit formula, does someone know of an algorithm/package for calculating it? I work in Python, but anything will do.
I can imagine the basic steps. Two successive vectors define a plane, and thus 3 successive vectors define two planes. The change in angle between the plane normals is proportional to the torsion. But I need an exact formula, with the calculated torsion having the proper dimension of 1/length.
Given some parametrization of the curve r(t) (for example, by the cumulative length of the polyline chain), you can calculate three derivatives from the 4 points: r', r'', r'''.
Then the torsion is:
v = r' x r''                      // vector (cross) product
torsion = (r''' . v) / (v . v)    // . is the scalar (dot) product
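A runnable version of that recipe (my own finite-difference choices, assuming the 4 points are roughly evenly spaced along the curve):

import numpy as np

def torsion_4pt(points):
    # points: (4, 3) array of successive samples of the space curve.
    # Estimate r', r'', r''' by simple finite differences using the mean
    # chord length as the step, then apply
    # torsion = (r''' . (r' x r'')) / |r' x r''|^2.
    p = np.asarray(points, dtype=float)
    h = np.linalg.norm(np.diff(p, axis=0), axis=1).mean()  # mean spacing
    d1 = (p[2] - p[0]) / (2 * h)                     # first derivative
    d2 = (p[2] - 2 * p[1] + p[0]) / h**2             # second derivative
    d3 = (p[3] - 3 * p[2] + 3 * p[1] - p[0]) / h**3  # third derivative
    v = np.cross(d1, d2)
    return float(np.dot(d3, v) / np.dot(v, v))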

Can the cosine similarity when using Locality Sensitive Hashing be -1?

I was reading this question:
How to understand Locality Sensitive Hashing?
But then I found that the equation to calculate the cosine similarity is as follows:
Cos(v1, v2) = Cos(theta), where theta = (hamming distance / signature length) * pi = (h / b) * pi
This means that if the vectors are fully similar, the Hamming distance will be zero and the cosine value will be 1. But when the vectors are totally dissimilar, the Hamming distance will equal the signature length, and so we get cos(pi), which results in -1. Shouldn't the similarity always be between 0 and 1?
Cosine similarity is the dot product of the vectors divided by the product of their magnitudes, so it's entirely possible to have a negative value for the angle's cosine. For example, if you have unit vectors pointing in opposite directions, then you want the value to be -1. I think what's confusing you is the nature of the representation: the other post is talking about angles between vectors in 2-D space, whereas it's more common to create vectors in a multidimensional space where the number of dimensions is customarily much greater than 2 and the value for each dimension is non-negative (e.g., a word occurs in a document or not), resulting in a 0 to 1 range.
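A small numeric check of the relation (a sketch, with the hyperplane count an arbitrary choice of mine): random-hyperplane signatures recover cos(theta) from the Hamming distance, and opposite vectors indeed give -1.

import numpy as np

rng = np.random.default_rng(0)

def simhash_cosine(v1, v2, n_planes=10000):
    # Each random hyperplane separates v1 and v2 with probability
    # theta / pi, so the fraction of sign disagreements between the
    # two signatures estimates the angle; cos recovers the similarity.
    planes = rng.standard_normal((n_planes, len(v1)))
    s1 = planes @ v1 > 0
    s2 = planes @ v2 > 0
    hamming = np.count_nonzero(s1 != s2)
    return np.cos(hamming / n_planes * np.pi)

v = np.array([1.0, 2.0, 3.0])
print(simhash_cosine(v, -v))  # -1 for opposite vectors
print(simhash_cosine(v, v))   # 1 for identical vectors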

How to curve fit data in Excel to a multi variable polynomial?

I have a simple set of data, 10 values that increase.
I want to fit them to a polynomial of the form:
Z = A1 + A2*X + A3*Y + A4*X^2 + A5*X*Y+ A6*Y^2
where Z (the output) is the set of data above, A1-A6 are the coefficients I am looking for,
X is the range of inputs (10 values, of course), and Y is for the moment a constant value.
How can I curve fit to this polynomial and not the standard 2nd order one that is created using 'trendline'?
Construct a Vandermonde matrix from your data points, find its inverse with MINVERSE, then apply it to the vector of Z values with MMULT. This works when the number of data points equals the number of coefficients (here six).
Otherwise you could try polynomial regression, which will again use the Vandermonde matrix.
More math than Excel, really.
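Outside Excel, the same construction is a few lines with least squares (a sketch; note that with Y held constant the Y columns are linearly dependent on the others, so the fit only becomes well-posed once Y varies):

import numpy as np

def fit_poly2(x, y, z):
    # Design (Vandermonde-style) matrix for
    # Z = A1 + A2*X + A3*Y + A4*X^2 + A5*X*Y + A6*Y^2,
    # solved by least squares; with 6 points this matches MINVERSE/MMULT.
    V = np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])
    coeffs, *_ = np.linalg.lstsq(V, z, rcond=None)
    return coeffs  # A1..A6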
