Just Enough Generative AI Fundamentals to Get Started with GenAI on Azure AI
January 3, 2025 @ 9:00 am – 5:00 pm UTC+5:30
About the event
Join us for an engaging hands-on workshop where participants will embark on a comprehensive journey into Deep Learning, Azure AI, and the evolution of Natural Language Processing (NLP) into Generative AI. This workshop strikes a balance between theoretical understanding and hands-on practice, serving as a stepping stone for the “Generative AI for Developers” learning path.
No prior knowledge of Generative AI is necessary to attend this event, making it accessible to all. A basic familiarity with data and Azure cloud services concepts is all you need, along with a grasp of tools like Jupyter Lab or Google Colaboratory notebooks. If you’re already acquainted with Python programming, that’s a definite plus.
If you are a developer, software engineer, or AI enthusiast intrigued by the possibilities of Generative AI, this workshop is for you. Whether you’re a seasoned AI professional looking to shape intelligent solutions or a data scientist eager to enhance your skills with OpenAI models, it offers valuable insights.
Join us to explore the forefront of AI and take your skills to the next level. Don’t miss this chance to be part of an exciting journey towards mastering Generative AI and its applications.
- Deep Learning Basics & Artificial Neural Network Overview
- Building the Vocabulary – Terms & Concepts
- Training the Neural Networks
- Key Types of Neural Networks – CNN, RNN, LSTM, GANs
- Lab(s): Working with Neural Networks
- Azure AI Framework
- Various ways to develop AI Applications on Azure
- Examples of each of the options
- Getting started with Azure AI
- Lab(s): Getting started with Azure OpenAI Studio
- Introduction to NLP
- Rule-Based Approaches: Keyword matching and grammar rules
- Statistical Methods: n-grams
- Machine Learning for tasks like part-of-speech tagging and named entity recognition
- Word embeddings: Word2Vec, GloVe for semantic relationships
- Attention Mechanisms: Machine translation, text summarization, sentiment analysis
- Transformers: Architecture, self-attention, pre-trained language models like BERT, GPT
- Skip Connections and Layer Norm
- Feed-forward Layer
- Transformer hyperparameters and why they work so well
- BERT, T5, GPT
- Lab(s): NLP use cases with key approaches
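To give a flavor of the first two NLP approaches on the agenda, here is a minimal sketch contrasting a rule-based method (keyword matching) with a statistical one (n-gram counting). The function names and keyword lists are illustrative assumptions, not workshop lab material; the labs use fuller datasets and tooling.

```python
from collections import Counter

# Rule-based approach: classify sentiment by counting keyword hits.
POSITIVE = {"great", "good", "love", "excellent"}
NEGATIVE = {"bad", "poor", "hate", "terrible"}

def keyword_sentiment(text):
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

# Statistical approach: count bigrams (n-grams with n=2) as a crude language model.
def bigram_counts(text):
    words = text.lower().split()
    return Counter(zip(words, words[1:]))

print(keyword_sentiment("I love this great workshop"))   # positive
print(bigram_counts("to be or not to be").most_common(1))
```

Rule-based systems are transparent but brittle; n-gram models generalize a little better by learning from frequency, which is the stepping stone toward the embeddings and transformers covered later in the agenda.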