# This is our recommended Data Science curriculum

| Meetup | Title | Description |
|--------|-------|-------------|
| 1 | **Basics of Linear Algebra and Python.** | We will explain the basics of Python and linear algebra that are sufficient for the next five meetups, and hand out assignments to work on. |
| 2 | **Gradient Descent and Related Algorithms.** | Building on the assignments from Meetup 1, we will explain gradient descent and possibly other related optimization algorithms. The assignment for this meetup will be to take a loss function, randomly initialize the weights, and plot the loss using Python (a minimal sketch follows the table). |
| 3 | **Regression Techniques.** | We'll cover Linear and Logistic Regression in this session. |
| 4 | **Tree-based and Bagging Algorithms.** | Decision Trees, Random Forests, etc. |
| 5 | **Boosting Techniques.** | XGBoost, LightGBM, etc. |
| 6 | **Intro to Neural Nets.** | We'll start from linear and logistic regression and extend them step by step to explain Deep Neural Nets. If time permits, we can cover CNNs and RNNs as well. |
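
As a rough illustration of the Meetup 2 assignment, here is a minimal sketch. It assumes a simple squared-error loss on toy 1-D data, a single randomly initialized weight, and matplotlib for plotting; the actual loss function, dataset, and learning rate used in the meetup are up to the instructor.

```python
# Minimal sketch of the Meetup 2 assignment (assumed details: squared-error
# loss on toy 1-D data, a single weight, and a fixed learning rate).
import numpy as np
import matplotlib.pyplot as plt

# Toy data: y is roughly 3 * x plus a little noise.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=50)
y = 3.0 * x + rng.normal(scale=0.1, size=50)

def loss(w):
    """Mean squared error of the prediction w * x."""
    return np.mean((y - w * x) ** 2)

def grad(w):
    """Gradient of the mean squared error with respect to w."""
    return np.mean(-2 * x * (y - w * x))

# Randomly initialize the weight, then run plain gradient descent.
w = rng.normal()
learning_rate = 0.1
losses = []
for step in range(100):
    losses.append(loss(w))
    w -= learning_rate * grad(w)

# Plot how the loss falls over the iterations.
plt.plot(losses)
plt.xlabel("Iteration")
plt.ylabel("Mean squared error")
plt.title("Gradient descent on a toy linear model")
plt.show()
```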