[Book] Probabilistic Machine Learning: An Introduction

Adaptive Computation and Machine Learning | Hardcover
By Murphy, Kevin P. | MIT Press | February 1, 2022
  • List price: 140,000 KRW
    Sale price: 137,000 KRW (2%↓, 3,000 KRW discount)
  • Rewards:
    [Base points] 1,370 KRW (1%) [Extra points] an additional 2,000 KRW on purchases of 50,000 KRW or more [Member benefits] an additional 2~4% on purchases of 30,000 KRW or more, by membership tier [Review points] up to 300 KRW in e-voucher credit for writing a review
  • Additional benefits: points and book income-tax deduction
  • Shipping fee: Free
  • Delivery schedule: based on Sejong-daero, Jongno-gu, Seoul
    Same-day delivery: order now for expected arrival today (Sat, the 2nd)
  • Store pickup (Baro-Drim): order online and pick up directly at a store
    On holidays, Baro-Drim pickup is the fastest way to receive it.

Notice

  • For foreign books, only the information provided from overseas is available, so some details may be missing. If you need additional information, please use the 1:1 inquiry board.
Product Details
ISBN 9780262046824 (0262046822)
Pages 864
Language English
Size 208(W) X 236(H) X 39(T) (mm)
Binding Hardcover
Volumes 1

About the Book

A detailed and up-to-date introduction to machine learning, presented through the unifying lens of probabilistic modeling and Bayesian decision theory.
This book offers a detailed and up-to-date introduction to machine learning (including deep learning) through the unifying lens of probabilistic modeling and Bayesian decision theory. The book covers mathematical background (including linear algebra and optimization), basic supervised learning (including linear and logistic regression and deep neural networks), as well as more advanced topics (including transfer learning and unsupervised learning). End-of-chapter exercises allow students to apply what they have learned, and an appendix covers notation.

Probabilistic Machine Learning grew out of the author's 2012 book, Machine Learning: A Probabilistic Perspective. More than just a simple update, this is a completely new book that reflects the dramatic developments in the field since 2012, most notably deep learning. In addition, the new book is accompanied by online Python code, using libraries such as scikit-learn, JAX, PyTorch, and TensorFlow, which can be used to reproduce nearly all the figures; this code can be run inside a web browser using cloud-based notebooks, and provides a practical complement to the theoretical topics discussed in the book. This introductory text will be followed by a sequel that covers more advanced topics, taking the same probabilistic approach.
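To give a flavor of the kind of probabilistic model the book and its companion notebooks cover, here is a minimal sketch (not taken from the book's code; the dataset and model choice are illustrative assumptions) of fitting a logistic regression classifier with scikit-learn and reading off class probabilities rather than hard labels:

# Minimal probabilistic-classifier sketch using scikit-learn (illustrative only).
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Load a small benchmark dataset and hold out a test split.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Fit logistic regression, a probabilistic discriminative classifier.
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# predict_proba returns p(y = c | x) for each class c, i.e. a probabilistic prediction.
print(clf.predict_proba(X_test[:3]))
print("test accuracy:", clf.score(X_test, y_test))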

Table of Contents

1 Introduction 1

I Foundations 29
2 Probability: Univariate Models 31
3 Probability: Multivariate Models 75
4 Statistics 103
5 Decision Theory 163
6 Information Theory 199
7 Linear Algebra 221
8 Optimization 269

II Linear Models 315
9 Linear Discriminant Analysis 317
10 Logistic Regression 333
11 Linear Regression 365
12 Generalized Linear Models * 409

III Deep Neural Networks 417
13 Neural Networks for Structured Data 419
14 Neural Networks for Images 461
15 Neural Networks for Sequences 497

IV Nonparametric Models 539
16 Exemplar-based Methods 541
17 Kernel Methods * 561
18 Trees, Forests, Bagging, and Boosting 597

V Beyond Supervised Learning 619
19 Learning with Fewer Labeled Examples 621
20 Dimensionality Reduction 651
21 Clustering 709
22 Recommender Systems 735
23 Graph Embeddings * 747
A Notation 767

Publisher's Review

"The deep learning revolution has transformed the field of machine learning over the last decade. It was inspired by attempts to mimic the way the brain learns but it is grounded in basic principles of statistics, information theory, decision theory, and optimization. This book does an excellent job...

Exchange / Return / Out-of-Stock Notice

※ If the product description contains its own return/exchange guidance, that guidance takes precedence. (It may vary depending on the vendor.)

Return/exchange method: My Room > Order Management > Order/Delivery History > Order Inquiry > Return/Exchange Request,
[1:1 Consultation > Return/Exchange/Refund], or the customer center (1544-1900)

※ For open-market, overseas-delivery, and gift orders, use [1:1 Consultation > Return/Exchange/Refund]
    or the customer center (1544-1900)
Return/exchange period: within 7 days of receipt for change-of-mind returns;
within 30 days of discovering the problem if the product is defective or differs from the contract terms
Return/exchange cost: return shipping for returns/exchanges due to a change of mind or purchase error is paid by the customer
Cases where return/exchange is not possible
  • The product has been lost or damaged for a reason attributable to the consumer
    (damage to the packaging solely to inspect the product is excluded)
  • The value of the product has been significantly reduced through the consumer's use or opening of the packaging
    e.g., cosmetics, food, home appliances (including accessories)
  • The packaging of a product that can be copied has been damaged
    e.g., music/DVD/video, software, comic books, magazines, photo books
  • The product is individually made to order at the consumer's request ((1) overseas-ordered books)
  • Digital content such as an eBook or audiobook has been downloaded at least once
  • The value of the product has decreased so much over time that resale is impractical
  • The case falls under the restrictions on withdrawal of purchase set by the Act on Consumer Protection in Electronic Commerce, Etc.
(1) Overseas-ordered books: these are individually ordered at the user's request, so a cancellation/exchange/return due to a simple change of mind or error incurs an 'overseas order return/cancellation fee' paid by the customer (overseas order return/cancellation fee: ① Western books, 12% of the list price; ② Japanese books, 7% of the list price)
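As a rough worked example (assuming the fee is computed on this book's 140,000 KRW list price rather than the discounted sale price), cancelling this Western-book overseas order would incur a fee of 140,000 × 12% = 16,800 KRW.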
Out of stock: the product may go out of stock or be delayed depending on the supplier's (publisher's) inventory; if it goes out of stock, we will notify you of the details by email and text message.
Consumer damage compensation / compensation for delayed refunds
  • Matters concerning exchanges, after-sales service, refunds, quality assurance, and damage compensation for defective products are handled in accordance with the Consumer Dispute Resolution Standards (Fair Trade Commission notification)
  • The conditions and procedures for refunds, and for compensation due to delayed refunds, are handled in accordance with the Act on Consumer Protection in Electronic Commerce, Etc.

An overseas-orderable edition of this book is available.
