Inside GPT – Large Language Models Demystified
Natural language processing with generative pre-trained transformer (GPT) models is a rapidly evolving field that offers many opportunities and challenges for application developers. But what is a generative pre-trained transformer, and how does it work? How can you leverage the latest advances in GPT models to create engaging and useful applications? Could your business benefit from a GPT-powered chatbot?
In this demo-intensive session, Alan will take a deep dive into the architecture of GPT models and the inner workings of ChatGPT. The journey begins with the fundamental concepts of natural language processing, such as tokenization, word embeddings, and vectorization. He will then demonstrate how to apply these techniques to train a GPT-2 model that generates song lyrics, showing internally how word sequences are predicted.
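To make the tokenization and vectorization concepts concrete, here is a minimal sketch in plain Python. It is purely illustrative: it splits on whitespace, whereas GPT-2 actually uses byte-pair encoding, and the corpus and function names are invented for this example.

```python
# Illustrative sketch of tokenization and vectorization.
# Note: GPT-2 uses byte-pair encoding, not whitespace word splitting.

def tokenize(text):
    # Split lowercase text into word tokens.
    return text.lower().split()

def build_vocab(corpus):
    # Map each unique token to an integer id, in order of first appearance.
    vocab = {}
    for token in tokenize(corpus):
        if token not in vocab:
            vocab[token] = len(vocab)
    return vocab

def vectorize(text, vocab):
    # Convert text into the sequence of token ids the model consumes.
    return [vocab[t] for t in tokenize(text) if t in vocab]

corpus = "the model predicts the next word"
vocab = build_vocab(corpus)
ids = vectorize("the next word", vocab)  # e.g. [0, 3, 4]
```

A real GPT pipeline then maps each token id to a learned embedding vector, but the core idea is the same: text in, integer sequence out.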
Alan will then shift the focus to larger language models, such as ChatGPT and GPT-4, demonstrating their power, capabilities, and limitations. He will explain hyperparameters such as temperature and frequency penalty and demonstrate their effect on the generated output.
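The effect of those two hyperparameters can be sketched in a few lines of plain Python. This is a simplified illustration of the underlying math, not the OpenAI API; the logits and penalty values are made up for the example.

```python
import math

# Sketch of how temperature and frequency penalty reshape
# next-token probabilities (values are illustrative).

def softmax(logits, temperature=1.0):
    # Lower temperature sharpens the distribution (more deterministic);
    # higher temperature flattens it (more varied output).
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def apply_frequency_penalty(logits, counts, penalty):
    # Subtract penalty * (times each token already appeared),
    # discouraging the model from repeating itself.
    return [l - penalty * c for l, c in zip(logits, counts)]

logits = [2.0, 1.0, 0.5]
cold = softmax(logits, temperature=0.2)  # top token dominates
hot = softmax(logits, temperature=2.0)   # closer to uniform
# Token 0 has appeared 3 times already, so penalize it:
penalized = softmax(apply_frequency_penalty(logits, [3, 0, 0], 0.5))
```

With a low temperature the highest-logit token is chosen almost every time; with a high temperature the alternatives become competitive, which is what makes the generated text more creative or more erratic.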
Join Alan for this session if you want to learn how to harness the power of GPT models in your own solutions.