
Thursday, July 19, 2018

R&C Analysis-Integration of Complex functions

Let $\mu$ be a positive measure on an arbitrary measurable space $X$. Let $L^1(\mu)$ be the collection of all complex measurable functions $f$ on $X$ for which $\int_X |f|\,d\mu < \infty$; it is called the space of "Lebesgue integrable functions". Since $f$ measurable implies $|f|$ is measurable, the integral above is defined. For the next definition, recall that a real function $u$ can be split into a positive part $u^+ = \max\{u, 0\}$ and a negative part $u^- = \max\{-u, 0\}$.
Definition. If $f = u + iv$ for real measurable functions $u, v$ on $X$ and if $f \in L^1(\mu)$, we define
$$\int_E f\,d\mu = \int_E u^+\,d\mu - \int_E u^-\,d\mu + i\int_E v^+\,d\mu - i\int_E v^-\,d\mu$$
for every measurable set $E$. We know that $u^+, u^-, v^+, v^-$ are measurable, so each of the four integrals above exists. Furthermore $u^+ \le |u| \le |f|$, and similarly for the other three parts, hence each of them is finite. Clearly, from the definition above, $\int_E f\,d\mu$ is a complex number. Occasionally it is desirable to define the integral of a measurable $f$ with range in $[-\infty, \infty]$ to be
$$\int_E f\,d\mu = \int_E f^+\,d\mu - \int_E f^-\,d\mu,$$
provided at least one of the integrals on the right is finite; the left-hand side is then a number in $[-\infty, \infty]$.

Theorem 1.32. Suppose $f, g \in L^1(\mu)$ and $\alpha, \beta$ are complex numbers. Then $\alpha f + \beta g \in L^1(\mu)$ and
$$\int_X (\alpha f + \beta g)\,d\mu = \alpha\int_X f\,d\mu + \beta\int_X g\,d\mu.$$

Proof. First we need to establish that $\alpha f + \beta g$ is measurable; then we need to show that the integral of its absolute value is finite, establishing that the left-hand side above belongs to the esteemed set of Lebesgue integrable functions $L^1(\mu)$. If $f, g$ are complex measurable functions, then $f + g$ and $fg$ are measurable. The constant function $\alpha$ is measurable, so the product $\alpha f$ is measurable; similarly $\beta g$ is measurable. The sum of measurable functions $\alpha f + \beta g$ is therefore measurable. Recall also the monotonicity of the integral: if $0 \le f \le g$, then $\int_E f\,d\mu \le \int_E g\,d\mu$. We know $|\alpha f + \beta g| \le |\alpha||f| + |\beta||g|$, which implies
$$\int_X |\alpha f + \beta g|\,d\mu \le \int_X |\alpha||f|\,d\mu + \int_X |\beta||g|\,d\mu = |\alpha|\int_X |f|\,d\mu + |\beta|\int_X |g|\,d\mu < \infty.$$
Thus $\alpha f + \beta g \in L^1(\mu)$.

To prove the linearity formula we need to establish
$$\int_X (f + g)\,d\mu = \int_X f\,d\mu + \int_X g\,d\mu \quad\text{and}\quad \int_X \alpha f\,d\mu = \alpha\int_X f\,d\mu.$$
For the first, take $f, g$ real and set $h = f + g$. Then $h^+ - h^- = f^+ - f^- + g^+ - g^-$, which implies
$$h^+ + f^- + g^- = f^+ + g^+ + h^-.$$
From Theorem 1.27 we know that if $\varphi = \sum_{n=1}^\infty \varphi_n$ with each $\varphi_n \ge 0$ measurable, then $\int_X \varphi\,d\mu = \sum_{n=1}^\infty \int_X \varphi_n\,d\mu$. Applying this theorem yields
$$\int_X h^+\,d\mu + \int_X f^-\,d\mu + \int_X g^-\,d\mu = \int_X f^+\,d\mu + \int_X g^+\,d\mu + \int_X h^-\,d\mu.$$
Since each of these integrals is finite, we can rearrange terms as we like:
$$\int_X h^+\,d\mu - \int_X h^-\,d\mu = \left(\int_X f^+\,d\mu - \int_X f^-\,d\mu\right) + \left(\int_X g^+\,d\mu - \int_X g^-\,d\mu\right),$$
leading to $\int_X (f+g)\,d\mu = \int_X f\,d\mu + \int_X g\,d\mu$. The complex case follows by applying this to the real and imaginary parts separately.

For the second equation, the following was already proved earlier:
$$\int_X (\alpha f)\,d\mu = \alpha\int_X f\,d\mu \quad \text{when } \alpha \ge 0.$$
All that is left is to show that the equation also holds for $\alpha = -1$ and $\alpha = i$; every complex $\alpha$ is then handled by combining these cases.

$\alpha = -1$ case: notice $(-u)^+ = \max\{-u, 0\} = u^-$ and $(-u)^- = u^+$, which means
$$\int_X (-1)f\,d\mu = \int_X (-1)(u + iv)\,d\mu = \int_X (-u - iv)\,d\mu = (-1)\int_X f\,d\mu.$$

$\alpha = i$ case: $if = i(u + iv) = iu - v = -v + iu$, so
$$\int_X (if)\,d\mu = \int_X (-v)\,d\mu + i\int_X u\,d\mu = i\left(\int_X u\,d\mu + i\int_X v\,d\mu\right) = i\int_X f\,d\mu.$$
This shows $\int_X \alpha f\,d\mu = \alpha\int_X f\,d\mu$ for all complex $\alpha$.
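As a quick numeric illustration (not part of Rudin's proof): under a discrete measure on a finite set, the integral reduces to a weighted sum, and the linearity of Theorem 1.32 can be checked directly. The helper `integral` and the sample data below are my own illustrative names, not anything from the book.

```python
import numpy as np

# Discrete sanity check of Theorem 1.32: on a finite set X = {x_1, ..., x_n}
# with point masses mu({x_i}), the integral of f is sum_i f(x_i) * mu({x_i}).
rng = np.random.default_rng(0)
mu = rng.random(10)                       # positive point masses mu({x_i})
f = rng.random(10) + 1j * rng.random(10)  # a complex "measurable" function
g = rng.random(10) + 1j * rng.random(10)
alpha, beta = 2 - 1j, 0.5 + 3j

def integral(h, mu):
    """Integral of h over X with respect to the discrete measure mu."""
    return np.sum(h * mu)

# Linearity: int(alpha*f + beta*g) dmu == alpha * int f dmu + beta * int g dmu
lhs = integral(alpha * f + beta * g, mu)
rhs = alpha * integral(f, mu) + beta * integral(g, mu)
assert np.isclose(lhs, rhs)
```

The check is trivial here because a finite weighted sum is linear by algebra; the content of the theorem is that the same identity survives the limit processes defining the general integral.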

Wednesday, July 18, 2018

Machine Learning-Kernel Smoothing Methods

Chapter 5:
Kernel Smoothing Methods:
The set up is as follows:
You have input data, collected into a matrix X.
The dimensions of this matrix are N × p, where N is the number of rows and p is the number of features. Each row corresponds to one sample (input) from your experiment.
Your output is collected into a vector y. Most of the time y is N × 1.
Typical linear regression is expressed as y = Xβ, where the β are called the coefficients of the linear regression.
For example, you may have a function y = β₀ + β₁x.
These functions are linear in β. For example, y = β₀ + β₁x + β₂x² is still a linear function of the coefficients.
If the output is a real value, the problem is called regression. If the output is a categorical variable (e.g. Obese/NonObese), we use logistic regression and its variations.
The key term here is "localization". The idea is to fit a simple model at each query point and from there infer an overall function f(X).
This localization is achieved using "kernels". The basic setup and terminology: a kernel is Kλ(x₀, x), where x₀ is the query point, x is an arbitrary point, and λ is the size of the neighborhood.
In these models λ is a parameter.
The simplest way to understand kernels is in one dimension.
One simple approach is to select a neighborhood of width λ and simply average the outputs yᵢ of all points within distance λ of the query point x₀. Another way is to fit a linear function in each neighborhood. Either way you lose continuity as x₀ moves. This lack of continuity is resolved by the Nadaraya-Watson kernel-weighted average:
f̂(x₀) = (Σᵢ Kλ(x₀, xᵢ) yᵢ) / (Σᵢ Kλ(x₀, xᵢ)), with both sums running over i = 1, …, N.
A popular choice is the Epanechnikov kernel:
Kλ(x₀, xᵢ) = D(|x₀ − xᵢ| / λ)
D(t) = (3/4)(1 − t²) if |t| ≤ 1
D(t) = 0 otherwise
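A minimal sketch of the Epanechnikov kernel and the Nadaraya-Watson average in Python (the function names and the sine example are my own, not from the text):

```python
import numpy as np

def epanechnikov(t):
    """D(t) = 3/4 (1 - t^2) for |t| <= 1, and 0 otherwise."""
    return np.where(np.abs(t) <= 1, 0.75 * (1 - t**2), 0.0)

def nadaraya_watson(x0, x, y, lam):
    """Kernel-weighted average: sum_i K(x0, x_i) y_i / sum_i K(x0, x_i)."""
    k = epanechnikov(np.abs(x0 - x) / lam)
    return np.sum(k * y) / np.sum(k)

# Smooth noisy samples of sin(x) at a query point
rng = np.random.default_rng(0)
x = np.linspace(0, 2 * np.pi, 200)
y = np.sin(x) + 0.1 * rng.standard_normal(200)
y_hat = nadaraya_watson(np.pi / 2, x, y, lam=0.5)
```

Because the Epanechnikov kernel has compact support, points farther than λ from x₀ get exactly zero weight, and the estimate varies continuously as x₀ moves.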
These algorithms have issues at the boundaries:
At the boundaries, the local neighborhood of points may generate a curve that takes off in a different direction from the one the overall function takes.
This leads to poor predictions near the boundary.
We need to tackle this.
To tackle this, as a start, we use local linear regression for these points.
Linear regression is done separately for each neighborhood of points, but the fit is evaluated only at x₀, your query point at the boundary:
min over α(x₀), β(x₀) of Σᵢ₌₁ᴺ Kλ(x₀, xᵢ) [yᵢ − α(x₀) − β(x₀)xᵢ]²
Then your estimate at the boundary point is simply f̂(x₀) = α̂(x₀) + β̂(x₀)x₀.
Define b(x)ᵀ = (1, x), let B be the N × 2 matrix whose i-th row is b(xᵢ)ᵀ, and let W(x₀) be the N × N diagonal weight matrix with i-th diagonal element Kλ(x₀, xᵢ). Then
f̂(x₀) = b(x₀)ᵀ(BᵀW(x₀)B)⁻¹BᵀW(x₀)y = Σᵢ₌₁ᴺ lᵢ(x₀)yᵢ
where li(x0) are weights.
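The weighted least squares above can be sketched directly (a hypothetical helper of my own naming; it forms BᵀW(x₀) by broadcasting rather than building the full diagonal matrix):

```python
import numpy as np

def epanechnikov(t):
    """Epanechnikov kernel: 3/4 (1 - t^2) on |t| <= 1, else 0."""
    return np.where(np.abs(t) <= 1, 0.75 * (1 - t**2), 0.0)

def local_linear(x0, x, y, lam):
    """Solve min over (a, b) of sum_i K(x0, x_i) [y_i - a - b x_i]^2
    and return the fitted value a + b * x0 at the query point."""
    w = epanechnikov(np.abs(x0 - x) / lam)     # diagonal of W(x0)
    B = np.column_stack([np.ones_like(x), x])  # N x 2 matrix with rows b(x_i)^T
    BtW = B.T * w                              # B^T W(x0), via broadcasting
    a, b = np.linalg.solve(BtW @ B, BtW @ y)   # (alpha(x0), beta(x0))
    return a + b * x0
```

On data that are exactly linear, the fit recovers the line at any query point with at least two distinct weighted samples, which is precisely why the first-order boundary bias of the plain kernel average disappears.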

R&C Analysis-Th 1.29

Suppose $f : X \to [0, \infty]$ is measurable, and $\varphi(E) = \int_E f\,d\mu$ for every $E \in R$, where $R$ is the $\sigma$-algebra of measurable sets. Then $\varphi$ is a measure on $R$, and $\int_X g\,d\varphi = \int_X g f\,d\mu$ for every measurable function $g$ on $X$ with range in $[0, \infty]$.

In this setup, as we tour through the different sets of the $\sigma$-algebra $R$, the integral of the measurable function $f$ generates a measure.

To prove that $\varphi$ is a measure, we need to show countable additivity and that $\varphi(\emptyset) = 0$.

Let $E_1, E_2, \ldots$ be disjoint members of $R$ whose union is $E$. Then
$$\chi_E f = \sum_{i=1}^\infty \chi_{E_i} f,$$
and $\varphi(E) = \int_X \chi_E f\,d\mu$, $\varphi(E_j) = \int_X \chi_{E_j} f\,d\mu$. Then
$$\sum_{i=1}^\infty \varphi(E_i) = \sum_{i=1}^\infty \int_X \chi_{E_i} f\,d\mu.$$
Using the previous theorem on summation of integrals (Theorem 1.27),
$$\sum_{i=1}^\infty \int_X \chi_{E_i} f\,d\mu = \int_X \sum_{i=1}^\infty \chi_{E_i} f\,d\mu = \int_X \chi_E f\,d\mu = \varphi(E).$$
Thus $\sum_{i=1}^\infty \varphi(E_i) = \varphi(E)$, establishing countable additivity. Since $\emptyset \in R$ and $\varphi(\emptyset) = 0$, the requirement that at least one set in the $\sigma$-algebra have finite measure is satisfied. This shows $\varphi$ is a measure.

For the second claim, note first that for $g = \chi_E$, a characteristic (simple) function, the formula holds by definition: $\int_X \chi_E\,d\varphi = \varphi(E) = \int_X \chi_E f\,d\mu$. By linearity it then holds for every simple measurable function $h$:
$$\int_X h\,d\varphi = \int_X h f\,d\mu.$$
Now choose simple measurable functions $0 \le g_1(x) \le g_2(x) \le \cdots \le g(x)$ with $\lim_{n\to\infty} g_n(x) = g(x)$ for every $x$. For each $n$ the following is true:
$$\int_X g_n\,d\varphi = \int_X g_n f\,d\mu.$$

By the monotone convergence theorem, the left-hand side converges: $\lim_{n\to\infty} \int_X g_n\,d\varphi = \int_X g\,d\varphi$. Since $f \ge 0$, we also have $0 \le g_1(x)f(x) \le g_2(x)f(x) \le \cdots \to g(x)f(x)$, so $\int_X g_n f\,d\mu \to \int_X g f\,d\mu$, again by monotone convergence.

Hence, the equation $\int_X g\,d\varphi = \int_X g f\,d\mu$ holds.
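As a sanity check outside the proof, the two claims of Theorem 1.29 can be verified numerically on a finite set with a discrete measure; `phi` and the sample arrays below are illustrative names of my own.

```python
import numpy as np

rng = np.random.default_rng(1)
mu = rng.random(8)   # discrete measure: mass mu[i] at the point i
f = rng.random(8)    # f >= 0, the density defining phi
g = rng.random(8)    # g >= 0, an arbitrary nonnegative function

def phi(E):
    """phi(E) = integral over E of f dmu = sum over i in E of f[i] * mu[i]."""
    E = np.asarray(E, dtype=int)
    return np.sum(f[E] * mu[E])

# Additivity of phi over disjoint sets E1, E2
E1, E2 = [0, 1, 2], [5, 7]
assert np.isclose(phi(E1) + phi(E2), phi(E1 + E2))

# integral of g dphi equals integral of g*f dmu,
# where phi's point masses are phi({i}) = f[i] * mu[i]
phi_masses = np.array([phi([i]) for i in range(8)])
assert np.isclose(np.sum(g * phi_masses), np.sum(g * f * mu))
```

In this discrete picture, the theorem just says that $\varphi$ is the measure with density $f$ with respect to $\mu$, so integrating $g$ against $\varphi$ is the same as integrating $gf$ against $\mu$.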


Chain complexes on Hilbert spaces

 Chain complexes are mathematical structures used extensively in algebraic topology, homological algebra, and other areas of mathematics. Th...