If people aren't laughing at your dreams, your dreams aren't big enough.
The fear of death follows from the fear of life. One who lives life fully is prepared to die at any time. - Mark Twain
Welcome to the Machine Intelligence Lecture Series. I had the privilege of conducting this lecture series at the UAV Lab, Indian Institute of Technology Kanpur (IIT Kanpur). The aim of this series was to provide my teammates with an overview of various Machine Intelligence concepts and technologies. Throughout the series, I covered a range of topics, delving into both foundational principles and advanced applications.
In this section, I explored the fundamentals of statistical machine learning, including supervised and unsupervised learning algorithms. We discussed concepts such as linear regression, decision trees, k-nearest neighbors, clustering, and dimensionality reduction techniques. Practical implementations and real-world examples were also showcased to illustrate the relevance of these methods.
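As a flavor of the supervised methods covered, here is a minimal sketch of ordinary least-squares linear regression with a single feature, using the closed-form solution (slope = cov(x, y) / var(x)); the data values are illustrative, not taken from the lectures:

```python
# Fit y = slope * x + intercept by ordinary least squares (one feature).
def fit_linear(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Closed-form solution: slope = cov(x, y) / var(x)
    cov_xy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    slope = cov_xy / var_x
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Noiseless data lying exactly on y = 2x + 1.
slope, intercept = fit_linear([0, 1, 2, 3], [1, 3, 5, 7])
```

The same fit-then-predict pattern carries over to the other supervised methods discussed (decision trees, k-nearest neighbors), only the fitting rule changes.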
Deep learning, a subset of machine learning, was a central focus of this segment. We delved into the architecture and mechanics of neural networks, covering feedforward networks, backpropagation, activation functions, and gradient descent optimization. Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs) were introduced, with insights into their applications in image and sequence data analysis.
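The forward-pass/backpropagation/gradient-descent loop above can be sketched on a single sigmoid neuron; the learning rate, target, and step count below are illustrative choices, not values from the lectures:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_step(w, b, x, target, lr=0.5):
    # Forward pass: compute the neuron's output.
    z = w * x + b
    y = sigmoid(z)
    # Backward pass: chain rule through the squared-error loss
    # L = (y - target)^2 and the sigmoid activation.
    dz = 2.0 * (y - target) * y * (1.0 - y)
    # Gradient-descent update for the weight and bias.
    return w - lr * dz * x, b - lr * dz

w, b = 0.0, 0.0
for _ in range(2000):
    w, b = train_step(w, b, x=1.0, target=0.9)
loss = (sigmoid(w * 1.0 + b) - 0.9) ** 2
```

A full feedforward network repeats exactly this pattern layer by layer, with the chain rule propagating gradients backwards through each activation.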
Neural Ordinary Differential Equations (Neural ODEs) represent an exciting advancement in the field of deep learning. Our lectures on this topic delved into the concept of treating neural networks as continuous dynamical systems described by differential equations. Participants gained an understanding of how this approach can model complex processes and capture temporal dependencies in data.
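The core idea can be sketched in a few lines: instead of discrete layers, the hidden state evolves under dh/dt = f(h; θ), and the "forward pass" is numerical integration. Here f is a stand-in linear function with a known exact solution so the integrator can be checked; a trained network would take its place:

```python
import math

def f(h, k=1.0):
    # Stand-in for a learned network f(h; theta); chosen so the
    # exact solution h(t) = h0 * exp(-k * t) is known.
    return -k * h

def odeint_euler(h0, t0, t1, steps):
    # Fixed-step Euler integration of dh/dt = f(h).
    h, dt = h0, (t1 - t0) / steps
    for _ in range(steps):
        h = h + dt * f(h)  # one Euler step
    return h

approx = odeint_euler(h0=1.0, t0=0.0, t1=1.0, steps=1000)
exact = math.exp(-1.0)
```

In practice, libraries use adaptive solvers rather than fixed-step Euler, and gradients are obtained via the adjoint method, but the continuous-depth picture is the same.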
Liquid Neural Networks are an intriguing variant of recurrent neural networks that draw inspiration from biological neural systems. This part of the lecture series covered the unique architecture of Liquid Neural Networks and their applications in tasks requiring dynamic memory and complex pattern recognition. The comparison between Liquid Neural Networks and traditional RNNs shed light on the advantages of this novel approach.
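A heavily simplified sketch of a single liquid time-constant (LTC) style neuron illustrates the key difference from a standard RNN cell: the input modulates the state's effective time constant rather than just its value. All parameter names and constants below are illustrative assumptions, not the lecture's formulation:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def ltc_step(x, u, tau=1.0, A=1.0, w=2.0, b=0.0, dt=0.1):
    # Input-dependent gate: the input u changes how fast x decays,
    # i.e. dx/dt = -(1/tau + g(u)) * x + g(u) * A.
    g = sigmoid(w * u + b)
    dx = -(1.0 / tau + g) * x + g * A
    return x + dt * dx  # Euler integration step

x = 0.0
for u in [1.0] * 50:  # constant drive; state relaxes to a fixed point
    x = ltc_step(x, u)
```

Because the decay rate depends on the input, the same cell can respond quickly to some stimuli and retain information longer for others, which is the "dynamic memory" property contrasted with fixed-time-constant RNNs.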
The final segment explored neural networks inspired by biological systems. We delved into Spiking Neural Networks (SNNs) and explored how they mimic the behavior of biological neurons. Topics included neural encoding, spike trains, and SNN architectures. Participants gained insights into the potential of SNNs for neuromorphic computing and real-time event-based processing.
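A leaky integrate-and-fire (LIF) neuron, the standard building block in SNN discussions, can be sketched as follows; the membrane constants and input currents are illustrative, not values from the lectures:

```python
def simulate_lif(current, steps=100, dt=1.0, tau=10.0,
                 v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    v, spikes = v_rest, []
    for t in range(steps):
        # Leaky integration: dv/dt = (v_rest - v + current) / tau
        v += dt * (v_rest - v + current) / tau
        if v >= v_thresh:     # threshold crossing -> emit a spike
            spikes.append(t)  # record spike time (the "spike train")
            v = v_reset       # reset the membrane potential
    return spikes

# A stronger input current yields a denser spike train: information
# is carried in spike timing/rate rather than continuous activations.
weak, strong = simulate_lif(1.2), simulate_lif(3.0)
```

This event-based encoding is what makes SNNs attractive for neuromorphic hardware: computation happens only when spikes occur.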
- Lecture Slides: This folder contains the presentation slides used during the lectures for each topic.
- Code Samples: Practical code examples and implementations discussed during the series can be found here.
- Additional Resources: Supplementary reading materials, research papers, and useful links related to each topic are provided in this section.
Feel free to explore the repository and use the resources to enhance your understanding of machine intelligence concepts.
If you have any questions, feedback, or would like to engage further in discussions related to the lecture series, please feel free to contact me or my team:
- Ashish Kumar: ashikumy16@gmail.com
- UAV Lab, IIT Kanpur: UAV Lab
I hope you find this lecture series insightful and valuable in your journey to mastering machine intelligence concepts. Happy learning!
Disclaimer: This repository is for educational purposes and does not endorse any specific technologies or products.
- MIT 6.S191: Introduction to Deep Learning
- A Comprehensive Guide to Motion Estimation with Optical Flow
- James, G.; Witten, D.; Hastie, T.; Tibshirani, R. & Taylor, J. (2023), An Introduction to Statistical Learning with Applications in Python, Springer, Cham.
- Activation Functions
- Forward Pass