| Course | Homeworks & Projects | Content |
|---|---|---|
| NLP (ENSAE) | Project and paper | Intent classification in sequence labelling tasks, using contextual embeddings |
| Advanced Machine Learning (ENSAE) | Final project | Understand and fine-tune the CLIP ViT-B/32 model |
| Data Camp (IPP) | Data challenge | Solar wind classification based on in-situ spacecraft measurements |
| | Design of a data challenge | Precipitation forecasting: predict the next 18 frames from 18 consecutive satellite radar frames |
| Deep Learning II (IPP) | Final project: RBM & DBN | Implement and train a Restricted Boltzmann Machine (RBM) and a Deep Belief Network (DBN) from scratch |
| Altegrad (MVA) | Lab1 | Self-attention and the HAN (Hierarchical Attention Network) architecture |
| | Lab2 | Transfer learning with transformer architectures |
| | Lab3 | Using Fairseq and HuggingFace Transformers to fine-tune pretrained language models |
| | Lab4 | Spectral clustering for graphs; graph classification using graph kernels |
| | Lab5 | DeepWalk algorithm, node embeddings, and graph neural networks (GNN) |
| | Lab6 | Graph Attention Network (GAT) and graph classification with deep learning |
| | Lab7 | DeepSets model and protein classification with a GNN |
| | Kaggle challenge | Use sequential and structural information to classify proteins into 18 classes |
| Causal Inference (IPP) | Lectures & Labs | Notebooks from the lectures and labs; see the summary in the README |
| | Final project | Reproduction of the paper Counterfactual Fairness: studying fairness in machine learning through causal inference |
| Bootstrap (ENSAE) | TD1 | Application of the jackknife to estimate the asymptotic variance (Ex. 1) and bias (Ex. 2) of estimators (a minimal jackknife sketch appears below the table) |
| | TD2 | Application of the bootstrap to estimate the bias (Ex. 2) and variance (Ex. 1, possibly using bootstrap-of-bootstrap) of an estimator/statistic |
| | TD3 | Last year's exam |
| Sequential Monte Carlo (ENSAE) | Final project | Employ SMC methods in the dropout layers of a neural network during the adaptation stage, as a replacement for fine-tuning |
| Computer Vision (Telecom) | Lab1 | Self-attention and the HAN (Hierarchical Attention Network) architecture |
| | Lab2 | Feature detection (Harris corner detection), motion estimation (block matching), and segmentation (Otsu's algorithm + a region-growing algorithm) |
| Data Streaming (IPP) | Lab1 | Discovering River: like sklearn, but focused on online machine learning |
| | Lab2 | Using Docker and Kafka to analyse streaming tweets |
| | Final project: Continual GNN | Refactor the original code from a paper studying streaming GNNs via continual learning |
| Bayesian Statistics (ENSAE) | DM1 | Application of an MCMC Gibbs sampler to infer parameters from the a posteriori probability |
| | Final project | Using Gibbs sampling and DMC-IS (direct Monte Carlo with importance sampling) to reproduce some results from this paper |
| Practical Machine Learning (IPP) | Session 1 | Analysis of several unsupervised machine learning methods: K-means, GMM, PCA, t-SNE |
| | Session 2 | Regularised regression, variable selection, and nonlinear regression on a dataset from the Brain Computer Interface competition |
| | Session 3 | Comparison of Bayesian decision, linear, and nonlinear classification on MNIST and a diabetes dataset |
| Deep Learning (IPP) | Lab 1 | Implement an MLP from scratch |
| | Lab 2 | Implement an MLP using PyTorch |
| | Lab 3 | RNN (many-to-one) |
| | Lab 4 | A simple language model |
| | Lab 5 | Build a CNN for image recognition using PyTorch |
| | Lab 6 | Visualisation of CNNs: Deep Dream algorithm; adversarial examples |
| MAP566 Statistics in Action (X-3A) | Homework 1 | Hypothesis testing |
| | Homework 2 | Implement an MLP using PyTorch |
| | TP3 | Polynomial regression model |
| | TP4 | Nonlinear regression model |
| | TP5 | Linear mixed model |
| | TP6 | Nonlinear mixed model |
| | TP7 | Mixture models |
| | TP8 | Graph clustering: spectral and hierarchical methods |
| | TP9 | Graph clustering: stochastic block models |
| MAP556 Monte Carlo Methods (X-3A) | TP1 | Simulation of random variables, law of large numbers, central limit theorem |
| | TP2 | Several variance-reduction methods: control variates, antithetic sampling, stratification |
| | TP3 | Variance reduction through importance sampling (see the importance-sampling sketch below the table) |
| | TP5 | Using empirical regression to approximate conditional expectations (in a finance context) |
| | TP6 | Generative Adversarial Network (GAN) |
| | TP8 | Simulate Brownian-motion-driven processes (e.g. the Ornstein-Uhlenbeck process) and their Euler schemes |
| | TP9 | Multilevel Monte Carlo (MLMC) method |
| | Challenge1 | Estimate E(f(G)) for a reasonably regular f |
| | Challenge2 | Play Angry Birds! Try to control the bird's velocity in the face of a random wind |
| MAP553 Machine Learning (X) | TP1 | Implement several optimization algorithms: GD, AGD, CGD, SGD, SAG, SVRG |
| | Project | A classic tree cover type classification dataset, tackled with automated machine learning; 2nd place in the Kaggle competition |
| Reinforcement Learning (X-2A) | Lab3 | Dynamic programming: value iteration (see the value-iteration sketch below the table) |
| | Lab4 | Dynamic programming: policy iteration |
| | Lab5 | Temporal difference |
| | Lab6 | Q-table: SARSA and Q-learning |
| | Lab7 | Policy gradient |
| INF580 Large-Scale Mathematical Optimization (X-3A) | Project | Combine random projection and linear programming: retrieve solutions of the projected problem and the dual projected problem, compute a primal solution, and compare their feasibility errors and computation times |
| MAP433 Statistics (X-2A) | TP1 | Parametric estimation: a Poisson law to model the number of goals scored by a football team |
| | TP2 | Cramér-von Mises test |
| | TP3 | Variance-stabilising transformation |
| | Homework1 | Coefficient estimation and interpretation (linear regression) |
| | Homework2 | An asymptotic test on regression coefficients |
| | Homework3 | Classification with KNN |
| MAP432 Markov Chains & Martingales (X-2A) | Project | The simulated annealing algorithm for solving non-convex optimization problems, applied here to the travelling salesman problem |
| MAP435 Optimization (X-2A) | Unconstrained optimization | Fixed-step gradient algorithm |
| | | Optimal-step gradient algorithm (the quadratic case) |
| | | Nesterov's algorithm (convex functions) |
| | | Nesterov's algorithm (strongly convex functions) |
| | | Conjugate gradient algorithm |
| | | Newton's algorithm |
| | | Convergence-rate analysis and comparison of the algorithms |
| | | A few counterexamples |
| | Constrained optimization | Projected gradient algorithm |
| | | Uzawa's algorithm |
| | | Penalty method |
| | | Augmented Lagrangian algorithm |
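
A minimal jackknife sketch for the Bootstrap TD1 exercises, assuming a one-dimensional i.i.d. sample and a statistic given as a plain callable; the exponential test data below is illustrative, not from the course material:

```python
import numpy as np

def jackknife_variance(sample, statistic):
    """Jackknife estimate of the variance of `statistic` on `sample`."""
    n = len(sample)
    # Leave-one-out replications: theta_(i) = statistic(sample without point i)
    replications = np.array([statistic(np.delete(sample, i)) for i in range(n)])
    theta_bar = replications.mean()
    # Jackknife variance: (n - 1) / n * sum_i (theta_(i) - theta_bar)^2
    return (n - 1) / n * np.sum((replications - theta_bar) ** 2)

# Illustrative check on the sample mean of an exponential sample:
# the estimate should be close to var(x) / n
rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=200)
print(jackknife_variance(x, np.mean), x.var(ddof=1) / len(x))
```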
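
In the spirit of MAP556 TP3, a small sketch of variance reduction by importance sampling: estimating the rare-event probability P(G > 4) for a standard Gaussian G by sampling from a shifted proposal. The shift to N(4, 1) is one common choice of proposal, not necessarily the one used in the lab:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 100_000
a = 4.0  # target: the rare-event probability P(G > a), G ~ N(0, 1)

# Naive Monte Carlo: almost no draw exceeds a, so the estimator is very noisy
g = rng.standard_normal(n)
naive = (g > a).mean()

# Importance sampling: draw from the shifted proposal N(a, 1)
# and reweight each draw by the likelihood ratio dP/dQ
y = rng.normal(loc=a, size=n)
weights = norm.pdf(y) / norm.pdf(y, loc=a)
is_estimate = ((y > a) * weights).mean()

print(naive, is_estimate, norm.sf(a))  # exact value: norm.sf(4) ≈ 3.17e-5
```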
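
And a compact value-iteration sketch matching Reinforcement Learning Lab3, assuming a finite MDP given as a transition tensor and a reward matrix; the 2-state toy MDP at the end is made up for illustration:

```python
import numpy as np

def value_iteration(P, R, gamma=0.95, tol=1e-8):
    """Value iteration on a finite MDP.

    P: transitions, shape (S, A, S), with P[s, a, s2] = prob of s2 from (s, a)
    R: rewards, shape (S, A)
    Returns the optimal value function and a greedy policy.
    """
    S, A, _ = P.shape
    V = np.zeros(S)
    while True:
        # Bellman optimality backup: Q(s, a) = R(s, a) + gamma * sum_s2 P V(s2)
        Q = R + gamma * (P @ V)        # shape (S, A)
        V_new = Q.max(axis=1)
        if np.max(np.abs(V_new - V)) < tol:
            return V_new, Q.argmax(axis=1)
        V = V_new

# Made-up 2-state, 2-action MDP, just to exercise the function
P = np.array([[[0.9, 0.1], [0.2, 0.8]],
              [[0.5, 0.5], [0.1, 0.9]]])
R = np.array([[1.0, 0.0],
              [0.0, 2.0]])
V, policy = value_iteration(P, R)
print(V, policy)
```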