
Commit e17e673 (2.4.0)
1 parent 2264f85 commit e17e673
3 files changed: 198 additions & 29 deletions

Numerics/Numerics/CSharpNumerics.csproj

Lines changed: 2 additions & 2 deletions

```diff
@@ -10,8 +10,8 @@
     <PackageLicenseFile>LICENSE</PackageLicenseFile>
     <RepositoryUrl>https://github.com/backlundtransform/CSharpNumerics</RepositoryUrl>
     <RepositoryType>git</RepositoryType>
-    <PackageTags>Numerics,Statistics,ML,Physics,Math,Calculus,Matrix,Vector,FFT,Interpolation,ODE,Regression,Classification,CrossValidation,Astronomy,Kinematics,Dynamics,Integration,Tensor, GamePhysics</PackageTags>
-    <Version>2.3.0</Version>
+    <PackageTags>Numerics,Statistics,ML,Physics,Math,Calculus,Matrix,Vector,FFT,Interpolation,ODE,Regression,Classification,CrossValidation,Astronomy,Kinematics,Dynamics,Integration,Tensor, GamePhysics, MonteCarlo, Clustering</PackageTags>
+    <Version>2.4.0</Version>
    <PackageIcon>logo.png</PackageIcon>
    <PackageReadmeFile>README.md</PackageReadmeFile>
  </PropertyGroup>
```

Numerics/Numerics/ML/README.md

Lines changed: 84 additions & 0 deletions
@@ -643,3 +643,87 @@ All evaluators implement `IClusteringEvaluator` where **higher score = better**.

| Davies-Bouldin | `DaviesBouldinEvaluator` | $-DB$ (negated) | Lower DB = better separation |
| Calinski-Harabasz | `CalinskiHarabaszEvaluator` | $CH$ | Higher = better, fast |

---
### 🎲 Monte Carlo Clustering (Uncertainty Estimation)

Use the `MonteCarloClustering` class to quantify **how stable** your clustering results are via bootstrap resampling. Two analysis modes are available:

#### Bootstrap — Consensus Matrix & Score Distribution

Runs the algorithm many times on bootstrap-resampled data. Produces a **consensus matrix**, per-point **stability scores**, and full **score distributions** with confidence intervals.

```csharp
var mc = new MonteCarloClustering { Iterations = 200, Seed = 42 };
var result = mc.RunBootstrap(
    data,
    new KMeans { K = 3 },
    new SilhouetteEvaluator(),
    new StandardScaler()); // optional

// Score uncertainty
var ci = result.ScoreConfidenceInterval(); // e.g. (0.68, 0.74)
double se = result.ScoreDistribution.StandardError;
var histogram = result.ScoreDistribution.Histogram(20);

// Consensus matrix (N × N) — fraction of times each pair co-clustered
Matrix consensus = result.ConsensusMatrix;

// Per-point stability [0, 1] — how consistently each point stays in its cluster
double[] stability = result.PointStability;
double[] convergence = result.ConvergenceCurve; // running mean of score
```
#### Experiment — Optimal-K Distribution

Runs a full K-range experiment many times on bootstrap samples. Shows **how often each K value is selected as best**, revealing whether the optimal K is robust.

```csharp
var mc = new MonteCarloClustering { Iterations = 100, Seed = 42 };
var kResult = mc.RunExperiment(
    data,
    new KMeans(),
    new SilhouetteEvaluator(),
    minK: 2, maxK: 8);

// Which K values won across the 100 bootstrap runs?
foreach (var (k, count) in kResult.OptimalKDistribution.OrderByDescending(x => x.Value))
    Console.WriteLine($"K={k}: chosen {count}/100 times");

// Score distribution for the best K in each iteration
var ci = kResult.ScoreConfidenceInterval();
```
#### Fluent API Integration

Add Monte Carlo uncertainty with a single builder call:

```csharp
var result = ClusteringExperiment
    .For(data)
    .WithAlgorithm(new KMeans())
    .TryClusterCounts(2, 8)
    .WithEvaluator(new SilhouetteEvaluator())
    .WithScaler(new StandardScaler())
    .WithMonteCarloUncertainty(iterations: 200, seed: 42)
    .Run();

// Standard result
Console.WriteLine($"Best K = {result.BestClusterCount}");

// Monte Carlo result (populated automatically)
var mcResult = result.MonteCarloResult;
Console.WriteLine($"Score CI = {mcResult.ScoreConfidenceInterval()}");
Console.WriteLine($"K distribution: {string.Join(", ",
    mcResult.OptimalKDistribution.Select(kv => $"K={kv.Key}: {kv.Value}"))}");
```
**Key points:**

* Bootstrap uses **sampling with replacement** — each iteration sees ~63 % unique points
* Consensus matrix cell (i,j) = fraction of runs where points i and j co-clustered (normalized by co-occurrence)
* Point stability = average consensus with same-cluster neighbours; close to 1.0 = very stable
* `RunExperiment` requires a K-accepting algorithm (KMeans, AgglomerativeClustering) — uses reflection to set K
* All results include full `MonteCarloResult` from the statistics engine (Mean, StdDev, Percentile, Histogram, CI, StandardError)
* Reproducible when `Seed` is set
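The consensus and stability definitions above can be sketched standalone. This is a simplified, hypothetical helper for illustration only (it divides by the total number of runs, whereas `MonteCarloClustering` normalizes by how often each pair actually co-occurs in the bootstrap resamples):

```csharp
using System;

// Illustrative only — not the library implementation.
// Cell (i, j) = fraction of runs in which points i and j received the same label.
static double[,] Consensus(int[][] runs, int n)
{
    var m = new double[n, n];
    foreach (var labels in runs)
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                if (labels[i] == labels[j]) m[i, j]++;
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            m[i, j] /= runs.Length;
    return m;
}

// Three labelings of 4 points: points 0 and 1 always co-cluster; point 3 drifts.
var runs = new[]
{
    new[] { 0, 0, 1, 1 },
    new[] { 0, 0, 1, 0 },
    new[] { 1, 1, 0, 0 },
};
var c = Consensus(runs, 4);
Console.WriteLine($"c[0,1] = {c[0, 1]:F3}"); // 1.000 — co-clustered in every run
Console.WriteLine($"c[0,3] = {c[0, 3]:F3}"); // 0.333 — co-clustered in 1 of 3 runs
```

A point whose row of the consensus matrix is close to 0 or 1 everywhere is stable; intermediate values flag points that hop between clusters across resamples.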

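The "~63 % unique points" figure in the key points is the classic bootstrap coverage limit: sampling n indices with replacement from n points leaves an expected distinct fraction of 1 − (1 − 1/n)^n, which tends to 1 − 1/e ≈ 0.632. A quick standalone check (not part of the library):

```csharp
using System;
using System.Linq;

// Average the fraction of distinct indices over many bootstrap resamples.
var rng = new Random(42);
int n = 1000, trials = 200;

double avgUnique = Enumerable.Range(0, trials).Select(_ =>
{
    var seen = new bool[n];
    for (int i = 0; i < n; i++)
        seen[rng.Next(n)] = true; // one bootstrap resample of size n
    return seen.Count(s => s) / (double)n;
}).Average();

Console.WriteLine($"measured {avgUnique:F3} vs theory {1 - Math.Exp(-1):F3}");
```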
Numerics/Numerics/Numerics/README.md

Lines changed: 112 additions & 27 deletions
````diff
@@ -504,62 +504,147 @@ var solution = matrix.GaussElimination(vector);
 ```
 
 ---
+## ✨ Interpolation
 
-## 📈 Interpolation
+CSharpNumerics provides a rich interpolation toolkit — from simple piecewise methods to polynomial, spline, rational, trigonometric, and multivariate interpolation.
+
+### Piecewise (two-point) interpolation
+
+All piecewise methods are accessible via a unified dispatcher:
+
+```csharp
+double y = data.Interpolate(p => (p.Index, p.Value), 3.5, InterpolationType.Linear);
+```
+
+| Type | Enum | Description |
+|------|------|-------------|
+| Linear | `Linear` | $y = y_1 + (y_2 - y_1)\frac{x - x_1}{x_2 - x_1}$ |
+| Log–Log | `Logarithmic` | Linear in log-space for both x and y |
+| Lin–Log | `LinLog` | Linear in x, logarithmic in y |
+| Log–Lin | `LogLin` | Logarithmic in x, linear in y |
+
+### Polynomial interpolation
+
+Passes a single polynomial of degree N−1 through all N data points.
 
 ```csharp
-var linear = data.LinearInterpolation(p => (p.X, p.Y), xValue);
-var logLog = data.LogarithmicInterpolation(p => (p.X, p.Y), xValue);
-var linLog = data.LinLogInterpolation(p => (p.X, p.Y), xValue);
-var logLin = data.LogLinInterpolation(p => (p.X, p.Y), xValue);
+double[] x = { 0, 1, 2, 3 };
+double[] y = { 1, 2, 0, 5 };
+var poly = new PolynomialInterpolation(x, y);
 
-// Or with enum:
-var result = data.Interpolate(p => (p.X, p.Y), xValue, InterpolationType.Linear);
+double val = poly.Evaluate(1.5);             // Lagrange basis form
+double val2 = poly.EvaluateNewton(1.5);      // Newton divided-difference
+var (val3, err) = poly.EvaluateNeville(1.5); // Neville with error estimate
 
-// Time series interpolation:
-double value = timeSeries.LinearInterpolationTimeSerie(dateTime);
+// Or via the extension method:
+double val4 = data.Interpolate(p => (p.Index, p.Value), 1.5, InterpolationType.Polynomial);
 ```
 
----
+### Cubic Spline interpolation
+
+Piecewise cubic polynomials with C² continuity. Three boundary conditions:
+
+| Boundary | Description |
+|----------|-------------|
+| `Natural` | $S''(x_0) = S''(x_n) = 0$ (free ends) |
+| `Clamped` | First derivative specified at endpoints |
+| `NotAKnot` | Third derivative continuous at second & second-to-last knot |
+
+```csharp
+double[] x = { 0, 1, 2, 3, 4 };
+double[] y = { 0, 1, 0, 1, 0 };
+
+var spline = new CubicSplineInterpolation(x, y); // Natural
+var clamped = new CubicSplineInterpolation(x, y, SplineBoundary.Clamped, 1.0, -1.0);
+var nak = new CubicSplineInterpolation(x, y, SplineBoundary.NotAKnot);
+
+double val = spline.Evaluate(2.5);
+double dydx = spline.Derivative(2.5);      // first derivative
+double d2y = spline.SecondDerivative(2.5); // curvature
+
+// Or via extension method:
+double val2 = data.Interpolate(p => (p.Index, p.Value), 2.5, InterpolationType.CubicSpline);
+```
 
-## 📊 Statistics
+### Rational interpolation
 
-**Descriptive**
+Ratio of two polynomials — handles poles and near-singularities better than polynomials.
 
 ```csharp
-double median = data.Median(p => p.Value);
-double variance = data.Variance(p => p.Value);
-double stdDev = data.StandardDeviation(p => p.Value);
-double covariance = data.Covariance(p => (p.X, p.Y));
-double r2 = data.CoefficientOfDetermination(p => (p.Predicted, p.Actual));
+double[] x = { 0, 1, 2, 3, 4 };
+double[] y = { 1.0, 0.5, 0.333, 0.25, 0.2 }; // ≈ 1/(1+x)
+var rat = new RationalInterpolation(x, y);
+
+var (val, err) = rat.Evaluate(1.5);            // Bulirsch–Stoer + error estimate
+double val2 = rat.EvaluateFloaterHormann(1.5); // barycentric, guaranteed pole-free
 ```
 
-**Confidence Intervals**
+### Trigonometric interpolation
+
+Best for periodic functions. Builds a trigonometric polynomial from the data.
 
 ```csharp
-var (lower, upper) = data.ConfidenceIntervals(p => p.Value, 0.95);
+int N = 16;
+double[] x = new double[N], y = new double[N];
+for (int i = 0; i < N; i++)
+{
+    x[i] = 2 * Math.PI * i / N;
+    y[i] = Math.Sin(x[i]) + 0.5 * Math.Cos(2 * x[i]);
+}
+
+var trig = new TrigonometricInterpolation(x, y, period: 2 * Math.PI);
+double val = trig.Evaluate(Math.PI / 3);
+double dydx = trig.Derivative(Math.PI / 3);
+
+// Fourier coefficients
+double[] a = trig.CosineCoefficients; // a_0, a_1, ..., a_M
+double[] b = trig.SineCoefficients;   // b_0, b_1, ..., b_M
+
+// Or via extension method:
+double val2 = data.Interpolate(p => (p.Index, p.Value), 1.5, InterpolationType.Trigonometric);
 ```
 
-**Cumulative Sum**
+### Multivariate interpolation
+
+For scattered data in multiple dimensions.
+
+**Inverse Distance Weighting (IDW / Shepard)**
 
 ```csharp
-var cumsum = data.CumulativeSum(p => p.Value);
+double[][] points = {
+    new[] { 0.0, 0.0 }, new[] { 1.0, 0.0 },
+    new[] { 0.0, 1.0 }, new[] { 1.0, 1.0 }, new[] { 0.5, 0.5 }
+};
+double[] values = points.Select(p => p[0] * p[0] + p[1] * p[1]).ToArray();
+
+var interp = new MultivariateInterpolation(points, values);
+double val = interp.EvaluateIDW(new[] { 0.25, 0.75 }, power: 2);
 ```
 
-**Simple Regression**
+**Radial Basis Functions (RBF)**
 
 ```csharp
-var (slope, intercept, correlation) = data.LinearRegression(p => (p.X, p.Y));
-var expFunc = data.ExponentialRegression(p => (p.X, p.Y));
+double val = interp.EvaluateRBF(new[] { 0.25, 0.75 }, RbfKernel.Gaussian);
 ```
 
-**Normal Distribution**
+| Kernel | Formula |
+|--------|---------|
+| `Gaussian` | $\phi(r) = e^{-(r/\varepsilon)^2}$ |
+| `Multiquadric` | $\phi(r) = \sqrt{1 + (r/\varepsilon)^2}$ |
+| `InverseMultiquadric` | $\phi(r) = 1/\sqrt{1 + (r/\varepsilon)^2}$ |
+| `ThinPlateSpline` | $\phi(r) = r^2 \ln(r)$ |
+| `Cubic` | $\phi(r) = r^3$ |
+
+**Bilinear / Trilinear (regular grids)**
 
 ```csharp
-var pdf = Statistics.NormalDistribution(standardDeviation: 1, mean: 0);
-double density = pdf(0.5);
+double val2d = MultivariateInterpolation.Bilinear(xGrid, yGrid, gridValues, xi, yi);
+double val3d = MultivariateInterpolation.Trilinear(xGrid, yGrid, zGrid, gridValues, xi, yi, zi);
 ```
````

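The Bilinear/Trilinear grid methods in the README diff above follow the standard bilinear rule: blend the four corner values of a grid cell by the fractional offsets along each axis. A standalone sketch of that rule (illustrative only, independent of the library's `MultivariateInterpolation` implementation):

```csharp
using System;

// Bilinear rule on one grid cell [x0,x1] x [y0,y1].
static double Bilinear(double x0, double x1, double y0, double y1,
                       double f00, double f10, double f01, double f11,
                       double x, double y)
{
    double tx = (x - x0) / (x1 - x0); // fractional offset along x
    double ty = (y - y0) / (y1 - y0); // fractional offset along y
    return f00 * (1 - tx) * (1 - ty) + f10 * tx * (1 - ty)
         + f01 * (1 - tx) * ty + f11 * tx * ty;
}

// Corner values sampled from the plane f(x, y) = x + 2y on the unit square;
// bilinear interpolation reproduces planes exactly.
double v = Bilinear(0, 1, 0, 1, 0, 1, 2, 3, x: 0.25, y: 0.5);
Console.WriteLine(v); // 1.25 = 0.25 + 2 * 0.5
```

Trilinear interpolation applies the same blend along a third axis, weighting eight corner values instead of four.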
0 commit comments
