Step into the world of LLMs with this practical guide that takes you from the fundamentals to deploying advanced applications using LLMOps best practices
Artificial intelligence has advanced rapidly, and Large Language Models (LLMs) are at the forefront of this revolution. This book offers practical insights into designing, training, and deploying LLMs in real-world scenarios by leveraging MLOps best practices. The guide walks you through building an LLM-powered twin that's cost-effective, scalable, and modular. It moves beyond isolated Jupyter notebooks, focusing on how to build production-grade, end-to-end LLM systems.
Throughout this book, you will learn about data engineering, supervised fine-tuning, and deployment. The hands-on approach to building the LLM Twin use case will help you implement MLOps components in your own projects. You will also explore cutting-edge advancements in the field, including inference optimization, preference alignment, and real-time data processing, making this a vital resource for anyone looking to apply LLMs in their own work.
By the end of this book, you will be proficient in deploying LLMs that solve practical problems while maintaining low-latency and high-availability inference capabilities. Whether you are new to artificial intelligence or an experienced practitioner, this book delivers guidance and practical techniques that will deepen your understanding of LLMs and sharpen your ability to implement them effectively.
This book is for AI engineers, NLP professionals, and LLM engineers looking to deepen their understanding of LLMs. Basic knowledge of LLMs, the GenAI landscape, Python, and AWS is recommended. Whether you are new to AI or looking to enhance your skills, this book provides comprehensive guidance on implementing LLMs in real-world scenarios.
The synopsis may refer to another edition of this title.
Paul Iusztin is a senior ML and MLOps engineer at Metaphysic, a leading GenAI platform, where he serves as one of the core engineers taking its deep learning products to production. With over seven years of experience, he has also built GenAI, computer vision, and MLOps solutions for CoreAI, Everseen, and Continental. Paul's passion and mission are to build data-intensive AI/ML products that serve the world and to educate others about the process. As the founder of Decoding ML, a channel for battle-tested content on designing, coding, and deploying production-grade ML, Paul has significantly enriched the engineering and MLOps community. His weekly content on ML engineering and his open-source courses covering the end-to-end ML life cycle, such as Hands-on LLMs and LLM Twin, testify to his valuable contributions.
Maxime Labonne is a Senior Staff Machine Learning Scientist at Liquid AI, serving as the head of post-training. He holds a Ph.D. in Machine Learning from the Polytechnic Institute of Paris and is recognized as a Google Developer Expert in AI/ML. An active blogger, he has made significant contributions to the open-source community, including the LLM Course on GitHub, tools such as LLM AutoEval, and several state-of-the-art models like NeuralBeagle and Phixtral. He is the author of the best-selling book "Hands-On Graph Neural Networks Using Python," published by Packt.
"About this title" may refer to another edition of this title.
EUR 9.50 shipping from Spain to Germany
EUR 8.78 shipping from the USA to Germany
Seller: La Casa de los Libros, Castellgali, BARCE, Spain
Condition: Used. Seller inventory number: 9781836200079
Quantity: 1 available
Seller: California Books, Miami, FL, USA
Condition: New. Seller inventory number: I-9781836200079
Quantity: More than 20 available
Seller: BargainBookStores, Grand Rapids, MI, USA
Paperback or softback. Condition: New. LLM Engineer's Handbook: Master the art of engineering large language models from concept to production. 1.96. Book. Seller inventory number: BBS-9781836200079
Quantity: 5 available
Seller: Ria Christie Collections, Uxbridge, United Kingdom
Condition: New. In. Seller inventory number: ria9781836200079_new
Quantity: More than 20 available
Seller: GreatBookPrices, Columbia, MD, USA
Condition: As New. Unread book in perfect condition. Seller inventory number: 48596664
Quantity: More than 20 available
Seller: GreatBookPrices, Columbia, MD, USA
Condition: New. Seller inventory number: 48596664-n
Quantity: More than 20 available
Seller: THE SAINT BOOKSTORE, Southport, United Kingdom
Paperback / softback. Condition: New. This item is printed on demand. New copy - usually dispatched within 5-9 working days. 526. Seller inventory number: C9781836200079
Quantity: More than 20 available
Seller: GreatBookPricesUK, Woodford Green, United Kingdom
Condition: New. Seller inventory number: 48596664-n
Quantity: More than 20 available
Seller: AHA-BUCH GmbH, Einbeck, Germany
Paperback. Condition: New. New stock, printed after ordering (print on demand). Step into the world of LLMs with this practical guide that takes you from the fundamentals to deploying advanced applications using LLMOps best practices. Purchase of the print or Kindle book includes a free eBook in PDF format.
"This book is instrumental in making sure that as many people as possible can not only use LLMs but also adapt them, fine-tune them, quantize them, and make them efficient enough to deploy in the real world." - Julien Chaumond, CTO and Co-founder, Hugging Face
Book Description: This book provides practical insights into designing, training, and deploying LLMs in real-world scenarios by leveraging MLOps best practices. The guide walks you through building an LLM-powered twin that's cost-effective, scalable, and modular. It moves beyond isolated Jupyter notebooks, focusing on how to build production-grade, end-to-end LLM systems. Throughout this book, you will learn about data engineering, supervised fine-tuning, and deployment. The hands-on approach to building the LLM Twin use case will help you implement MLOps components in your own projects. You will also explore cutting-edge advancements in the field, including inference optimization, preference alignment, and real-time data processing, making this a vital resource for those looking to apply LLMs in their projects.
What you will learn: Implement robust data pipelines and manage LLM training cycles; create your own LLM and refine it with the help of hands-on examples; get started with LLMOps by diving into core MLOps principles like IaC; perform supervised fine-tuning and LLM evaluation; deploy end-to-end LLM solutions using AWS and other tools; explore continuous training, monitoring, and logic automation; learn about RAG ingestion as well as inference and feature pipelines.
Who this book is for: This book is for AI engineers, NLP professionals, and LLM engineers looking to deepen their understanding of LLMs. Basic knowledge of LLMs, the GenAI landscape, Python, and AWS is recommended. Whether you are new to AI or looking to enhance your skills, this book provides comprehensive guidance on implementing LLMs in real-world scenarios.
Table of Contents: Understanding the LLM Twin Concept and Architecture; Tooling and Installation; Data Engineering; RAG Feature Pipeline; Supervised Fine-tuning; Fine-tuning with Preference Alignment; Evaluating LLMs; Inference Optimization; RAG Inference Pipeline; Inference Pipeline Deployment; MLOps and LLMOps; Appendix: MLOps Principles.
Seller inventory number: 9781836200079
Quantity: 2 available
Seller: GreatBookPricesUK, Woodford Green, United Kingdom
Condition: As New. Unread book in perfect condition. Seller inventory number: 48596664
Quantity: More than 20 available