Mathematical prerequisites for some Clustering techniques

Thu October 14, 2021
maths machine-learning

Prerequisites

Scalar Product

Consider $u, v \in \mathbb{R}^d$.

The scalar product $\langle u, v \rangle$ (sometimes also written as $u \cdot v$) is

$$\langle u, v \rangle = \sum_{i=1}^{d} u_i v_i$$

$\langle u, v \rangle$ is a scalar.
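As a quick illustration, the scalar product can be computed directly from this definition (a minimal sketch; `scalar_product` is a hypothetical helper name, not from the text):

```python
# Minimal sketch of the definition <u, v> = sum_i u_i * v_i.
def scalar_product(u, v):
    assert len(u) == len(v), "u and v must both be in R^d"
    return sum(ui * vi for ui, vi in zip(u, v))

print(scalar_product([1, 2, 3], [4, 5, 6]))  # 1*4 + 2*5 + 3*6 = 32
```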

Cauchy-Schwarz Inequality

Consider non-zero $x, y \in \mathbb{R}^n$. The absolute value of the dot product satisfies

$$|\langle x, y \rangle| \leq \|x\| \|y\|$$

or

$$|x \cdot y| \leq \|x\| \|y\|$$

Equality holds when $x$ and $y$ are collinear.

Let $p(t) = \|ty - x\|^2 \geq 0$.

This value is non-negative since $\|z\| = \sqrt{z_1^2 + \cdots + z_m^2} \geq 0$. Also note, by the definition of the scalar product, that $\|z\|^2 = z \cdot z$.

Hence,

$$p(t) = (ty - x) \cdot (ty - x) \geq 0$$

Using the distributive property of the dot product,

$$p(t) = ty \cdot ty - x \cdot ty - ty \cdot x + x \cdot x \geq 0$$

$$p(t) = t^2 (y \cdot y) - 2(x \cdot y)t + x \cdot x \geq 0$$

If we define

$$y \cdot y = a$$

$$2(x \cdot y) = b$$

$$x \cdot x = c$$

then we obtain

$$p(t) = at^2 - bt + c \geq 0$$

For $t = \frac{b}{2a}$,

$$p\left(\frac{b}{2a}\right) = a\left(\frac{b^2}{4a^2}\right) - b\left(\frac{b}{2a}\right) + c \geq 0$$

$$\frac{b^2}{4a} - \frac{b^2}{2a} + c \geq 0$$

$$-\frac{b^2}{4a} + c \geq 0$$

$$c \geq \frac{b^2}{4a}$$

Hence

$$4ac \geq b^2$$

Therefore, substituting back,

$$4(y \cdot y)(x \cdot x) \geq [2(x \cdot y)]^2$$

$$4\|y\|^2 \|x\|^2 \geq 4(x \cdot y)^2$$

$$\|y\|^2 \|x\|^2 \geq (x \cdot y)^2$$

$$\|y\| \|x\| \geq |x \cdot y|$$

Consider the collinear case where $x = cy$:

$$\begin{aligned}
|cy \cdot y| &= |c| \, |y \cdot y| \\
&= |c| \, \|y\|^2 \\
&= |c| \, \|y\| \|y\| \\
&= \|cy\| \|y\| \\
&= \|x\| \|y\|
\end{aligned}$$
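The inequality and its collinear equality case are easy to verify numerically (a minimal sketch; the vectors chosen here are arbitrary illustrative values):

```python
import math

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def norm(x):
    # ||x|| = sqrt(x . x), as used in the derivation above
    return math.sqrt(dot(x, x))

x = [1.0, 2.0, 3.0]
y = [-2.0, 0.5, 4.0]
assert abs(dot(x, y)) <= norm(x) * norm(y)   # Cauchy-Schwarz

# Equality in the collinear case x = c*y:
c = 3.0
xc = [c * yi for yi in y]
assert math.isclose(abs(dot(xc, y)), norm(xc) * norm(y))
```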

Intermediate Value Theorem

Suppose $f$ is a function continuous at every point in the interval $[a, b]$; then for every value $N$ between $f(a)$ and $f(b)$ there exists a point $c \in [a, b]$ such that $f(c) = N$.

*(figure: intermediate value theorem)*
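The theorem is what justifies root bracketing: if a continuous $f$ changes sign on $[a, b]$, a root must lie inside. A minimal bisection sketch built on that guarantee (the function and interval below are illustrative choices, not from the text):

```python
import math

def bisect(f, a, b, tol=1e-8):
    # The IVT guarantees a root in [a, b] whenever f is continuous
    # and f(a), f(b) have opposite signs.
    fa, fb = f(a), f(b)
    assert fa * fb < 0, "f(a) and f(b) must have opposite signs"
    while b - a > tol:
        m = (a + b) / 2
        fm = f(m)
        if fa * fm <= 0:       # sign change in the left half
            b, fb = m, fm
        else:                  # sign change in the right half
            a, fa = m, fm
    return (a + b) / 2

print(round(bisect(lambda x: x - math.cos(x), 0.0, 1.0), 4))  # 0.7391
```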

Mean Value Theorem

For $f$ continuous on $[a, b]$ and differentiable over $(a, b)$, there exists a point $c \in (a, b)$ where the instantaneous rate of change equals the average rate of change.

*(figure: mean value theorem)*

Hence there exists $c$ such that

$$f'(c) = \frac{f(b) - f(a)}{b - a}$$

or

$$f'(c)(b - a) = f(b) - f(a)$$
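A quick numeric sanity check of the theorem (the function $f(x) = x^3$ and interval $[0, 2]$ are illustrative choices): the average slope is $(f(2) - f(0))/2 = 4$, and $f'(c) = 3c^2 = 4$ is solved by $c = 2/\sqrt{3}$, which indeed lies in $(0, 2)$.

```python
import math

# MVT check for f(x) = x**3 on [a, b] = [0, 2] (illustrative choice):
a, b = 0.0, 2.0
avg_slope = (b**3 - a**3) / (b - a)   # (f(b) - f(a)) / (b - a) = 4.0
c = math.sqrt(avg_slope / 3)          # solve f'(c) = 3c^2 = avg_slope
assert a < c < b                      # c lies inside (a, b), as the MVT promises
assert math.isclose(3 * c**2, avg_slope)
print(c)  # 2/sqrt(3) ≈ 1.1547
```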

Fixed Point Iteration

In solving $f(x) = 0$, we rewrite the equation as $x = g(x)$.

**Note** This is always possible; starting from $f(x) = 0$,

$$f(x) + x = x$$

so taking $g(x) = f(x) + x$ gives $g(x) = x$.

The root satisfies $r = g(r)$, where $r$ is a fixed point of $g$. Hence, with an initial guess $x_0$, we compute $g(x_0)$. The idea is that $x_1 = g(x_0)$ will be closer to $r$.

*(figure: fixed point algorithm)*

Convergence Analysis

Let $r$ be a root such that $r = g(r)$. The value of $x$ at iteration step $k+1$ is $x_{k+1} = g(x_k)$.

The error at step $k+1$ is $|x_{k+1} - r| = |g(x_k) - g(r)|$. Using the mean value theorem, there exists a point $\xi$ such that

$$|g(x_k) - g(r)| = |g'(\xi)| \, |x_k - r|$$

where $\xi \in [x_k, r]$.

Note that at each iteration the error $x_k - r$ is multiplied by $|g'(\xi)|$. Therefore,

**Convergence Condition** There exists an interval $I = [r - c, r + c]$ for some $c > 0$ such that $|g'(x)| < 1$ on $I$ and $x_0 \in I$.
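As a quick numeric check of this condition for $g(x) = \cos(x)$ (the function used in the example that follows): $|g'(x)| = |-\sin(x)|$ stays below 1 near the fixed point, so convergence is expected. The sample interval half-width $c = 0.5$ below is an arbitrary choice.

```python
import math

# For g(x) = cos(x), g'(x) = -sin(x). Near the fixed point r ≈ 0.7391,
# |g'(x)| = |sin(x)| < 1, satisfying the convergence condition.
r = 0.7391
for x in (r - 0.5, r, r + 0.5):   # sample points in I = [r - c, r + c], c = 0.5
    assert abs(-math.sin(x)) < 1
print("condition holds on I")
```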

Graphical Approach

Plotting $y = x$ and $y = g(x)$, their intersection is the solution, since there $x = g(x)$.

*(figure: fixed point convergence)*

*(figure: fixed point divergence)*

Example

Solve $f(x) = x - \cos(x) = 0$.

**Solution** $x = g(x) = \cos(x)$

Solving to 4 decimal place accuracy, selecting an initial guess of $x_0 = 1$:

$x_1 = \cos(x_0) = \cos(1) = 0.5403$
$x_2 = \cos(0.5403) = 0.8576$
$x_3 = \cos(0.8576) = 0.6543$
$\vdots$
$x_{23} = 0.7390$
$x_{24} = 0.7391$

Therefore $r \approx 0.7391$. **Convergent.**
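The iteration above can be sketched in a few lines (`fixed_point` is a hypothetical helper name; the stopping rule on successive iterates is an assumption):

```python
import math

def fixed_point(g, x0, tol=1e-8, max_iter=100):
    # Iterate x_{k+1} = g(x_k) until successive values agree to within tol.
    x = x0
    for _ in range(max_iter):
        x_next = g(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    raise RuntimeError("iteration did not converge")

print(round(fixed_point(math.cos, 1.0), 4))  # 0.7391
```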



