Questions tagged [scipy]
SciPy is a Python-based ecosystem of open-source software for mathematics, science, and engineering.
348 questions
0 votes · 0 answers · 9 views
Using rank-sum test within permutation test in SciPy
I am a bit confused by the definition of the permutation_test function provided by SciPy. The following is what I wrote to calculate the p-value and null distribution
...
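A minimal sketch of the setup this question describes, using `scipy.stats.permutation_test` with a Mann-Whitney/rank-sum statistic (the sample sizes and `n_resamples` here are illustrative assumptions, not the asker's actual code):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 30)  # illustrative samples
y = rng.normal(0.0, 1.0, 30)

# rank-sum style statistic: the Mann-Whitney U of the first sample
def statistic(a, b):
    return stats.mannwhitneyu(a, b).statistic

# 'independent' shuffles observations between the two groups
res = stats.permutation_test(
    (x, y), statistic,
    permutation_type='independent',
    alternative='two-sided',
    n_resamples=2000,
)

print(res.pvalue)                  # permutation p-value
print(len(res.null_distribution))  # one statistic per resample
```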
0 votes · 0 answers · 9 views
Nonlinear Optimization of Noisy Functions w/ Bound Constraints via SciPy
Can we use scipy.optimize.minimize to find the best parameters $\mathbf{w} \in \Omega^k$,
$\Omega \subset \mathbb{R}$, of a function
$g = g(f(\mathbf{x}), \mathbf{w}...
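For bound-constrained minimisation of a noisy objective, a derivative-free method such as Powell (which accepts `bounds` in recent SciPy versions) is one common sketch; the objective, noise level, and box bounds below are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# noisy objective: quadratic bowl centred at w = 0.3 plus small noise
def g(w):
    return np.sum((w - 0.3) ** 2) + 1e-3 * rng.standard_normal()

# Powell is derivative-free, so it tolerates noisy evaluations better
# than gradient-based methods such as BFGS
res = minimize(g, x0=np.array([0.5, 0.5]), method='Powell',
               bounds=[(0.0, 1.0), (0.0, 1.0)])
print(res.x)  # stays inside the box constraints
```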
2 votes · 1 answer · 58 views
Mean of medians consistently differs from median
Disclaimer: my first question here.
Background of the problem:
I want to compare two distributions (n1 ~ 500 and n2 ~ 700), not normal and with different variances, but roughly unimodal. I decided to ...
1 vote · 1 answer · 72 views
Two-sample Kolmogorov-Smirnov test in R and/or Python, how to find degrees of freedom?
I am trying to run a Kolmogorov-Smirnov/K-S test in R using the ks.test() function, and I am trying to find the degrees of freedom when comparing across two different groups. I am also trying in ...
1 vote · 1 answer · 25 views
Standard function to quantify consistency of a sequence of predictions
Let's say I let a deep learning model classify a single object multiple times but under varying circumstances. Ideally it should predict the same class again and again. But in reality its class ...
4 votes · 2 answers · 162 views
Unexpected p-value distribution of Mann-Whitney U test under null hypothesis
I am getting very unexpected results for the p-value distribution of the Mann-Whitney U test under the null hypothesis.
I am working on real data, but I was able to replicate the results on artificial data with ...
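For reference, under the null the Mann-Whitney p-values should be roughly uniform on [0, 1] (slightly granular for small samples, since U is discrete). A minimal simulation sketch with illustrative sample sizes:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# both groups drawn from the same distribution, i.e. H0 is true
pvals = np.array([
    stats.mannwhitneyu(rng.normal(size=30), rng.normal(size=30)).pvalue
    for _ in range(2000)
])

# fraction rejected at alpha = 0.05 should be close to 0.05
frac = np.mean(pvals < 0.05)
print(frac)
```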
2 votes · 0 answers · 62 views
Negative KL Divergence estimates
I was exploring the KL divergence and came across some research about calculating it from samples. On Stack Exchange, I found out that minimising the KL divergence is equivalent to minimising the Sum ...
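For context: the exact KL divergence between two fully specified discrete distributions is non-negative by Gibbs' inequality, so negative values can only come from sample-based estimators. A minimal sketch with made-up distributions:

```python
import numpy as np
from scipy.stats import entropy

# two made-up discrete distributions on three outcomes
p = np.array([0.4, 0.4, 0.2])
q = np.array([0.3, 0.3, 0.4])

# entropy(p, q) computes KL(p || q) = sum(p * log(p / q)) in nats
kl = entropy(p, q)
print(kl)  # non-negative for exact distributions
```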
0 votes · 0 answers · 20 views
Why doesn't my numpy code for generating correlated, normally distributed variables preserve the covariance?
I'm trying to generate random variables whose correlation matches some existing data. My coding skills are good; my statistics, not so much... I'm trying to follow this answer.
I know that...
....
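A common sketch of the technique such answers describe, drawing correlated normals through a Cholesky factor of the target covariance (the covariance matrix here is a made-up example):

```python
import numpy as np

rng = np.random.default_rng(0)

# target covariance (illustrative); must be positive definite
target_cov = np.array([[2.0, 0.6],
                       [0.6, 1.0]])

# Cholesky factor L satisfies L @ L.T == target_cov
L = np.linalg.cholesky(target_cov)

# independent standard normals, one column per draw
z = rng.standard_normal((2, 100_000))
x = L @ z  # rows of x now have covariance ~ target_cov

sample_cov = np.cov(x)
print(sample_cov)
```

A common pitfall is multiplying with the wrong orientation (`z @ L`), or factoring the correlation matrix while forgetting to rescale by the standard deviations.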
1 vote · 0 answers · 115 views
Why do Welch's t-test results differ from Welch's ANOVA post-hoc tests?
I have data on elevation between three different berry types in Python (Raspberry, Sunberry, ...
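For context, Welch's t-test in SciPy is `ttest_ind` with `equal_var=False`; a minimal sketch on made-up two-group data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# two made-up groups with unequal variances and sizes
a = rng.normal(0.0, 1.0, 40)
b = rng.normal(0.5, 2.0, 60)

# equal_var=False selects Welch's t-test, which uses the
# Welch-Satterthwaite approximation for the degrees of freedom
res = stats.ttest_ind(a, b, equal_var=False)
print(res.statistic, res.pvalue)
```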
0 votes · 0 answers · 63 views
The meaning of probability density functions' product followed by an integration
SciPy's KDE object allows integrating the product of one KDE with another KDE object. I assume that this is meant to be used for the estimation of distance between two distributions. As far as I ...
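`gaussian_kde.integrate_kde` computes the integral of the product of two KDE densities; a sketch showing the value is larger when the distributions overlap (all samples here are made up):

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

kde_a = gaussian_kde(rng.normal(0.0, 1.0, 500))
kde_b = gaussian_kde(rng.normal(0.0, 1.0, 500))  # overlaps kde_a
kde_c = gaussian_kde(rng.normal(5.0, 1.0, 500))  # far from kde_a

# integral of the product of two densities: an overlap measure
overlap_close = kde_a.integrate_kde(kde_b)
overlap_far = kde_a.integrate_kde(kde_c)
print(overlap_close, overlap_far)
```

Note this overlap integral behaves as a similarity, not a distance such as KL divergence.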
2 votes · 1 answer · 49 views
Understanding complete linkage
I was trying to understand the linkage function from SciPy, and I was confused by the output of this sample code
...
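A minimal sketch of complete linkage on four made-up 1-D points: the final merge distance is the *maximum* pairwise distance between the two clusters, not the centroid distance:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage

# four points forming two tight pairs: {0, 1} and {5, 6}
x = np.array([[0.0], [1.0], [5.0], [6.0]])

# 'complete' merges clusters by their maximum pairwise distance
Z = linkage(x, method='complete')
print(Z)
# each row: [cluster_i, cluster_j, merge_distance, new_cluster_size]
```

The last row's distance is 6.0, the gap between the farthest cross-cluster pair (0 and 6), rather than the centroid distance 5.0.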
0 votes · 0 answers · 29 views
How to perform Hierarchical Clustering using centroid method and custom distance metric?
I would like to perform Agglomerative Hierarchical Clustering using the centroid method (defined on this page) and a custom distance metric, probably cosine similarity. In the SciPy docs it says you ...
0 votes · 0 answers · 16 views
Computing coordinates of points of an image after elastic deformation
My task is: given an image and a set of points of interest, elastically and randomly deform the image and save it along with the correspondingly transformed points.
example: (blue points are the points of ...
1 vote · 1 answer · 305 views
Understanding Shannon entropy and computation with scipy.stats.entropy
I am trying to understand Shannon entropy better. By definition, the Shannon entropy is calculated as H = -sum(pk * log(pk)).
I am using the scipy.stats.entropy function and I am running the ...
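A minimal check of that definition against `scipy.stats.entropy` (probabilities are made up; note that `entropy` normalises pk to sum to 1 and uses the natural log unless `base` is given):

```python
import numpy as np
from scipy.stats import entropy

pk = np.array([0.5, 0.25, 0.25])  # made-up distribution

h_bits = entropy(pk, base=2)        # H in bits
manual = -np.sum(pk * np.log2(pk))  # H = -sum(pk * log(pk)) by hand
print(h_bits, manual)               # both equal 1.5 bits
```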
6 votes · 1 answer · 364 views
Why doesn't estimating Shannon entropy with a histogram converge to its true value?
I'm following the third recipe of this answer to estimate the Shannon entropy of my samples using histograms. My expectation was, increasing the sample size should lead to a better estimation of the ...
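One reason such estimates fail to converge to the *differential* entropy is the missing log bin-width term; a sketch for N(0, 1), whose true differential entropy is 0.5·log(2πe) ≈ 1.419 nats (sample size and bin count are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
samples = rng.standard_normal(100_000)

counts, edges = np.histogram(samples, bins=100)
width = edges[1] - edges[0]  # uniform bin width
p = counts / counts.sum()
p = p[p > 0]                 # drop empty bins (0 * log 0 = 0)

# discrete plug-in entropy plus log(bin width) estimates the
# differential entropy of the underlying density
h_est = -np.sum(p * np.log(p)) + np.log(width)

h_true = 0.5 * np.log(2 * np.pi * np.e)  # ~1.4189 for N(0, 1)
print(h_est, h_true)
```

Without the `np.log(width)` correction, the discrete entropy keeps growing as bins shrink and never settles at the differential entropy.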