Learning Notes on Computer Science and Mathematics

Introduction to the page

This page records what I have learned at university and at work.

What I have learned in mathematics

Part I : Mathematical Statistics

  1. Chapter 2 : Estimation (including the introduction) [note]

    • (P13) Methods of parameter estimation (MLE & moment estimation)
    • (P30) Criteria for goodness of estimators
    • (P53) Confidence intervals
    • (P76) Estimation of distribution functions and density functions
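
As a quick illustration of the parameter-estimation topics above, here is a minimal sketch (my own example, not taken from the notes) of the closed-form MLE for a normal distribution, where the likelihood is maximized by the sample mean and the (biased, 1/n) sample variance:

```python
import random

def normal_mle(sample):
    """Closed-form MLE for N(mu, sigma^2): sample mean and 1/n sample variance."""
    n = len(sample)
    mu_hat = sum(sample) / n
    sigma2_hat = sum((x - mu_hat) ** 2 for x in sample) / n
    return mu_hat, sigma2_hat

random.seed(0)
# Draw from N(5, 4); the estimates should land close to the true parameters.
data = [random.gauss(5.0, 2.0) for _ in range(10_000)]
mu_hat, sigma2_hat = normal_mle(data)
```

The moment estimator coincides with the MLE in this particular model, which is one reason the normal case is a standard first example.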

  2. Chapter 3 : Hypothesis Testing [note]

    • (P1) Formulation of the problem
    • (P9) The Neyman-Pearson lemma and the likelihood ratio test
    • (P19) Hypothesis testing for single-parameter cases
    • (P38) Generalized likelihood ratio tests
    • (P69) Critical values and p-values
    • (P71) Goodness-of-fit tests
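
The critical-value/p-value topic above can be sketched with a simple two-sided z-test (my own illustrative example, assuming known variance, not a procedure from the notes):

```python
import math

def z_test_p_value(sample_mean, mu0, sigma, n):
    """Two-sided z-test for H0: mu = mu0 with known sigma.
    Returns the test statistic and its p-value via the standard normal CDF."""
    z = (sample_mean - mu0) / (sigma / math.sqrt(n))
    phi = lambda t: 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))  # standard normal CDF
    p = 2.0 * (1.0 - phi(abs(z)))
    return z, p

z, p = z_test_p_value(sample_mean=5.4, mu0=5.0, sigma=2.0, n=100)
# Reject H0 at level alpha iff p < alpha, or equivalently iff |z|
# exceeds the critical value z_{1-alpha/2}.
```

The two decision rules are equivalent; the p-value just reports the smallest level at which the observed data would lead to rejection.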

  3. Chapter 4 : Regression Analysis and Linear Models [note]

    • (P1) Introduction
    • (P5) Univariate linear regression
    • (P22) Parameter estimation for linear models
    • (P52) Hypothesis testing for linear models
    • (P68) Regression analysis

  4. Chapter 5 : Design of Experiments and Analysis of Variance [note]

    • (P1) ANOVA for full factorial experiments
    • (P18) Orthogonal designs

  5. Chapter 6 : Sequential Analysis [note]

    • (P1) The importance of the sequential approach and its two key elements
    • (P6) The sequential probability ratio test

  6. Chapter 7 : Statistical Decision Theory and an Overview of Bayesian Statistics [note]

    • (P8) Overview of statistical decision problems
    • (P17) Bayesian statistics
    • (P35) Prior distributions
    • (P51) Stochastic simulation of Markov chains
    • (P65) Overview of sample surveys

Part II : Mathematical Introduction to Machine Learning

  1. Chapter 1 : Introduction [note]

  2. Chapter 2 : Linear methods for regression [note]

  3. Chapter 3 : Classification [note]

  4. Chapter 4 : Unsupervised learning [note]

  5. Chapter 5 : Gradient descent and momentum acceleration [note]

  6. Chapter 6 : Stochastic gradient descent (SGD) [note]

  7. Chapter 7 : Introduction to neural networks [note]

  8. Chapter 8 : Training of neural networks [note]

  9. Chapter 9 : Introduction to PyTorch [note]

  10. Chapter 10 : High-dimensional distribution learning [note]

  11. Chapter 11 : Diffusion models and score matching [note]

  12. Chapter 12 : Transformers and LLMs [note]

  13. Chapter 13 : Concentration inequalities [note]

  14. Chapter 14 : Uniform concentration and generalization bounds [note]

  15. Chapter 15 : Theoretical foundations of kernel methods [note]

  16. Chapter 16 : Theoretical foundations of two-layer neural networks [note]

Part III : Numerical Algebra & Analysis

  1. Chapter 1 : Direct Solution of Linear Equations [note]

    • (P4) Solving triangular systems of equations
    • (P9) Triangular (LU) decomposition and pivoted triangular decomposition
    • (P34) Cholesky decomposition method
    • (P45) Banded Gaussian elimination method
    • (P50) Block triangular decomposition

  2. Chapter 2 : Round-off Error Analysis [note]

    • (P4) Round-off error analysis for basic operations
    • (P17) Round-off error analysis for Gaussian elimination
    • (P30) Accuracy estimation and iterative refinement of computed solutions

  3. Chapter 3 : Norms and Sensitivity Analysis [note]

  4. Chapter 4 : Iterative Methods for Solving Linear Equations [note]

    • (P4) Jacobi iteration and Gauss-Seidel iteration
    • (P8) Convergence of iterative methods
    • (P52) Successive over-relaxation (SOR)
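
The Jacobi iteration listed above can be sketched in a few lines. This is my own illustrative example (not from the notes), and it assumes the matrix is strictly diagonally dominant so that convergence is guaranteed:

```python
# Minimal Jacobi iteration for A x = b (dense lists, illustrative only).
def jacobi(A, b, iters=100):
    n = len(b)
    x = [0.0] * n
    for _ in range(iters):
        # Every new component is computed from the *previous* iterate,
        # which is exactly what distinguishes Jacobi from Gauss-Seidel.
        x = [
            (b[i] - sum(A[i][j] * x[j] for j in range(n) if j != i)) / A[i][i]
            for i in range(n)
        ]
    return x

A = [[4.0, 1.0],
     [2.0, 5.0]]      # strictly diagonally dominant
b = [6.0, 12.0]       # exact solution: x = [1, 2]
x = jacobi(A, b)
```

Gauss-Seidel would reuse the freshly updated components inside the same sweep, and SOR additionally over-relaxes each update by a factor omega.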

  5. Chapter 5 : Steepest Descent Method and Conjugate Gradient Method [note]

    • (P4) Steepest descent method
    • (P15) Conjugate gradient method
    • (P29) Variants of the conjugate gradient method
    • (P42) Generalized minimal residual method (GMRES)
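
The conjugate gradient method above admits a compact sketch. This is a textbook-style illustration of my own (not from the notes), assuming A is symmetric positive definite:

```python
def conjugate_gradient(A, b, tol=1e-10):
    """Minimal CG for a symmetric positive definite A (dense lists).
    In exact arithmetic it terminates in at most n steps."""
    n = len(b)
    x = [0.0] * n
    r = b[:]          # residual b - A x for the initial guess x = 0
    p = r[:]          # first search direction
    rs_old = sum(ri * ri for ri in r)
    for _ in range(n):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rs_old / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol:
            break
        # New direction is A-conjugate to all previous ones.
        p = [r[i] + (rs_new / rs_old) * p[i] for i in range(n)]
        rs_old = rs_new
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = conjugate_gradient(A, b)
```

GMRES generalizes the same Krylov-subspace idea to nonsymmetric matrices by minimizing the residual norm instead.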

  6. Chapter 6 : V-Cycle Method (Multigrid Method) [note]

  7. Chapter 7 : Methods for Nonsymmetric Eigenvalue Problems [note]

    • (P4) Introduction
    • (P9) Power method, inverse power method, QR method & subspace iteration
    • (P33) Reduction to Hessenberg form
    • (P47) QR iteration with shifts

  8. Chapter 8 : Methods for Symmetric Eigenvalue Problems [note]

    • (P4) Important lemmas
    • (P13) Symmetric QR method, Jacobi method, bisection method & divide-and-conquer method
    • (P70) Computation of the singular value decomposition

  9. Chapter 9 : Polynomial Interpolation [note]

    • (P10) Polynomial interpolation
    • (P40) Spline interpolation
    • (P53) Thomas algorithm (chasing method) for tridiagonal systems

  10. Chapter 10 : Best Approximation [note]

    • (P4) Normed linear spaces
    • (P15) Best least-squares approximation
    • (P36) Best uniform approximation

  11. Chapter 11 : Numerical Integration [note]

    • (P4) Basic quadrature formulas
    • (P10) Composite quadrature formulas
    • (P20) Gaussian quadrature
    • (P26) Convergence acceleration (Romberg integration)
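
The composite trapezoidal rule and its Romberg acceleration above can be illustrated together; this is a minimal sketch of my own, not code from the notes:

```python
import math

def trapezoid(f, a, b, n):
    """Composite trapezoidal rule with n equal subintervals."""
    h = (b - a) / n
    s = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
    return h * s

def romberg(f, a, b, depth=5):
    """Romberg integration: Richardson extrapolation applied to
    trapezoidal sums with successively halved step sizes."""
    R = [[trapezoid(f, a, b, 2 ** i)] for i in range(depth)]
    for i in range(1, depth):
        for j in range(1, i + 1):
            # Each extrapolation cancels the leading error term in h^2.
            R[i].append(R[i][j - 1] + (R[i][j - 1] - R[i - 1][j - 1]) / (4 ** j - 1))
    return R[-1][-1]

approx = romberg(math.sin, 0.0, math.pi)  # exact value of the integral is 2
```

A base trapezoidal rule with only 16 subintervals, extrapolated a few times, already reaches far higher accuracy than refining the trapezoidal rule alone.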

  12. Chapter 12 : Numerical Solution of Nonlinear Equations [note]

    • (P7) Iterative solution of nonlinear equations
    • (P36) Iterative solution of systems of nonlinear equations
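
The scalar case above is typified by Newton's iteration; a minimal sketch of my own (not from the notes):

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Newton's method: x_{k+1} = x_k - f(x_k) / f'(x_k),
    stopping once |f(x)| falls below tol."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        x -= fx / fprime(x)
    return x

# Solve x^2 - 2 = 0 starting from x0 = 1; the root is sqrt(2).
root = newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, 1.0)
```

For systems, the same idea replaces the derivative by the Jacobian matrix and each step by a linear solve.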

  13. Chapter 13 : Numerical Solution of ODEs [note]

    • (P4) Euler method
    • (P14) Runge-Kutta methods
    • (P24) Convergence and stability of single-step methods
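
The Euler and Runge-Kutta methods above can be compared on the model problem y' = y, y(0) = 1, whose exact solution is e^t. This sketch is my own illustration, not from the notes:

```python
import math

def euler(f, y0, t0, t1, n):
    """Explicit Euler: y_{k+1} = y_k + h f(t_k, y_k) (first-order accurate)."""
    h = (t1 - t0) / n
    t, y = t0, y0
    for _ in range(n):
        y += h * f(t, y)
        t += h
    return y

def rk4(f, y0, t0, t1, n):
    """Classical fourth-order Runge-Kutta method."""
    h = (t1 - t0) / n
    t, y = t0, y0
    for _ in range(n):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h * k1 / 2)
        k3 = f(t + h / 2, y + h * k2 / 2)
        k4 = f(t + h, y + h * k3)
        y += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        t += h
    return y

f = lambda t, y: y            # y' = y, so y(1) = e
y_euler = euler(f, 1.0, 0.0, 1.0, 1000)
y_rk4 = rk4(f, 1.0, 0.0, 1.0, 1000)
```

With the same step size, RK4's error is smaller by many orders of magnitude, which is the practical content of its higher order of accuracy.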

  14. Chapter 14 : Numerical Solution of PDEs [note]

    • (P4) Difference methods for parabolic equations
    • (P21) Difference methods for hyperbolic equations

  15. Chapter 15 : Finite Element Method [note]

Part IV : Optimization Algorithms

  1. Chapter 1 : Convex Sets, Convex Functions & Convex Optimization [note1][note2][note3]

  2. Chapter 2 : LP & SDP & SOCP [note1][note2]

  3. Chapter 3 : Optimality theory [note1][note2]

  4. Chapter 4 : Optimality methods

  5. Chapter 5 : Simplex Method & Interior Point Method [note]

  6. Chapter 6 : Compressed Sensing & Sparse Recovery Guarantees [note]

  7. Chapter 7 : Compressed Sensing & Sparse Recovery Guarantees [note1][note2]

  8. Chapter 8 : Matrix Completion [note]

  9. Chapter 9 : Integer Programming [note]

  10. Chapter 10 : Submodular Function Optimization [note]

  11. Chapter 11 : SG Methods for Large-scale ML [note]

  12. Chapter 12 : Randomized Numerical Linear Algebra [note]

  13. Chapter 13 : Phase Retrieval [note]

  14. Chapter 14 : Markov Decision Processes [note]

  15. Chapter 15 : TD-learning and Q-learning [note]

  16. Chapter 16 : Policy Gradient Methods [note]

Part V : Mathematical Logic

  1. Chapter 1 : Propositional Logic: Semantics [note]

  2. Chapter 2 : Propositional Logic: Syntax [note]

  3. Chapter 3 : First-Order Logic: Model Theory [note]

  4. Chapter 4 : First-Order Logic: Proof Theory [note]

  5. Chapter 5 : Foundations of Mathematics [note]

  6. Chapter 6 : Incompleteness Theorems [note]

  7. Chapter 7 : Fundamentals of Computer Science [note]

Part VI : Differential Equations

  1. Chapter 1 : Introduction [note]

Part VII : Methods of Stochastic Simulations

  1. Chapter 1 : Introduction [note]

  2. Chapter 2 : Random Variables [note]

  3. Chapter 3 : Generation of Random Variables [note]

  4. Chapter 4 : Variance Reduction [note]

  5. Chapter 5 : Limit Theorems [note]

  6. Chapter 6 : Markov Chains [note]

What I have learned in computer science

Part I : Parallel and Distributed Computing

  1. Chapter 1 : Basic Theory [note]

    • (P4) Introduction
    • (P43) Hardware architecture
    • (P61) Parallel algorithms & programming
    • (P70) Three laws of parallel computing
    • (P83) Parallel computing models

  2. Chapter 1.2 : Basic Theory (supplement) [note]

  3. Chapter 2 : Programming & Practice: MPI [note]

  4. Chapter 2 : Programming & Practice: OpenMP [note]

  5. Chapter 2 : Programming & Practice: CUDA [note]