Large language models for text generation

Course code: MLLLMTG

This course is intended for anyone who is fascinated by the capabilities of large language models and generative AI and wants to go beyond the level of the average user. Together we will get to know transformers, the basic building blocks of modern language models, introduce the best-known architectures, and show how large language models can be applied in practice. No third-party paid account is required for the hands-on exercises: we will be using open-source models which, used well, can rival the largest commercial models on many of the tasks covered here.


210 EUR

254 EUR including VAT

Earliest available date: 25.06.2024

Do you have a question?
+420 731 175 867 edu@edutrainings.cz

  • Professional and certified lecturers
  • Internationally recognized certifications
  • Wide range of technical and soft-skills courses
  • Great customer service
  • Courses tailored exactly to your needs

Course dates

Starting date: 25.06.2024

Type: In-person/Virtual

Course duration: 1 day

Language: cz/sk

Price without VAT: 210 EUR


Starting date: Upon request

Type: In-person/Virtual

Course duration: 1 day

Language: en/cz

Price without VAT: 210 EUR



Didn't find a suitable date?

Write to us to arrange an alternative, tailor-made date.

Contact

Course structure

  • Generative AI for text and images
  • The evolution of language modeling
  • Transformers
  • Types of transformers for language modeling (encoder, decoder, encoder-decoder)
  • Reinforcement Learning from Human Feedback (RLHF)
  • Selected transformer-based language models (BERT, GPT, LLaMA, T5, BART…)
  • A practical example of text classification with transformers, using the Hugging Face library in the Google Colab environment
  • Prompt engineering: in-context learning; zero-shot, one-shot, and few-shot prompting; the most important configuration parameters of the generation process
  • A practical example of in-context learning using the Hugging Face library in the Google Colab environment
  • Fine-tuning of large language models and parameter-efficient fine-tuning (LoRA)
  • Evaluation of generative language models (ROUGE, BLEU)
  • A practical example of parameter-efficient fine-tuning using the Hugging Face library in the Google Colab environment
  • Retrieval-Augmented Generation (RAG)
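To give a flavour of the prompt-engineering exercises, here is a minimal sketch of few-shot prompt construction for in-context learning. The example texts, labels, and helper name are illustrative assumptions, not taken from the course materials; in the actual exercises the prompt would be passed to a Hugging Face model.

```python
# Illustrative sketch of few-shot prompting (in-context learning).
# All example reviews and the helper name are hypothetical.

def build_few_shot_prompt(examples, query):
    """Assemble a few-shot prompt from labeled (text, label) examples.

    The model is expected to continue the text after the final
    "Sentiment:" line, producing the label for the query.
    """
    parts = []
    for text, label in examples:
        parts.append(f"Review: {text}\nSentiment: {label}\n")
    parts.append(f"Review: {query}\nSentiment:")
    return "\n".join(parts)

examples = [
    ("The plot was gripping from start to finish.", "positive"),
    ("I walked out halfway through.", "negative"),
]
prompt = build_few_shot_prompt(examples, "A beautifully shot film.")
print(prompt)
```

With zero examples this reduces to zero-shot prompting; with one, to one-shot. In the hands-on session the resulting prompt is sent to an open-source model, and generation parameters (temperature, top-k, top-p) control how the completion is sampled.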

Prerequisites

  • Basic knowledge of Python programming
  • Machine learning at the level of our Introduction to Machine Learning course

Do you need advice or a tailor-made course?


