Probabilistic Graphical Models

Below are the top discussions from Reddit that mention this online Coursera specialization from Stanford University.

Probabilistic graphical models (PGMs) are a rich framework for encoding probability distributions over complex domains: joint (multivariate) distributions over large numbers of random variables that interact with each other.
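
To make that concrete, here is a minimal sketch (not taken from the specialization; the network, variable names, and probabilities below are all illustrative) of how a tiny Bayesian network factorizes a joint distribution and supports inference by brute-force enumeration:

```python
# Minimal sketch: a three-variable Bayesian network (Rain -> WetGrass <- Sprinkler)
# encodes the joint P(R, S, W) as P(R) * P(S) * P(W | R, S).
# All numbers below are made up for illustration.
from itertools import product

P_rain = {True: 0.2, False: 0.8}
P_sprinkler = {True: 0.1, False: 0.9}
# P(WetGrass = True | Rain, Sprinkler)
P_wet_given = {
    (True, True): 0.99,
    (True, False): 0.90,
    (False, True): 0.80,
    (False, False): 0.05,
}

def joint(r, s, w):
    """Joint probability read off the factorization the graph encodes."""
    p_w = P_wet_given[(r, s)] if w else 1.0 - P_wet_given[(r, s)]
    return P_rain[r] * P_sprinkler[s] * p_w

# Inference by enumeration: P(Rain = True | WetGrass = True)
num = sum(joint(True, s, True) for s in (True, False))
den = sum(joint(r, s, True) for r, s in product((True, False), repeat=2))
print(f"P(Rain | WetGrass) = {num / den:.3f}")
```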

Topics covered: Inference, Bayesian Network, Belief Propagation, Graphical Model, Markov Random Field, Gibbs Sampling, Markov Chain Monte Carlo (MCMC) Algorithms, Expectation–Maximization (EM) Algorithm

Accessible for free. Completion certificates are offered.

Taught by
Daphne Koller
Professor

Offered by
Stanford University

This specialization includes these 3 courses.

Reddit Posts and Comments

3 posts • 40 mentions • top 14 shown below

r/MachineLearning • post
72 points • The_Man_of_Science
Coursera relaunched the Probabilistic Graphical Models by Daphne Koller
r/math • comment
19 points • control_09

Graph Theorists should make excellent programmers because everything in data structures can be thought of through graph theory.

Basically it's like someone has a PhD in this. https://www.coursera.org/specializations/probabilistic-graphical-models

r/MachineLearning • comment
8 points • redditor_87

https://www.coursera.org/specializations/probabilistic-graphical-models

r/MachineLearning • comment
13 points • wind_of_amazingness

  • Part of the "Statistics with R" specialization. I can recommend it to someone who has a fair knowledge of confidence intervals, hypothesis testing, etc., since it does a great job of comparing classical statistical methods with their Bayesian counterparts: https://www.coursera.org/learn/bayesian/home/welcome

  • Nice class that teaches you the basics of how MCMC works and lets you play with it in JAGS (see the short sampler sketch after this list): https://www.coursera.org/learn/mcmc-bayesian-statistics/home/welcome

  • This is a big, quite complex specialization about graphical models, which use knowledge engineering, priors, and Bayesian inference as their primary ways of building and training models. It does go over MCMC. I would not recommend this specialization to someone who is just starting out, but someone who is fairly familiar with MCMC and variational inference will find a lot of cool stuff in PGMs, which were "the best thing" before the deep learning revolution: https://www.coursera.org/specializations/probabilistic-graphical-models

  • Bayesian Methods for Hackers is an easy-to-read book (available online as a GitHub repo with all the source code) that shows some of the tricks that are extremely difficult to pull off if you are using more commonplace MLE methods. Highly recommended: https://github.com/CamDavidsonPilon/Probabilistic-Programming-and-Bayesian-Methods-for-Hackers
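
For readers who want to see what "how MCMC works" means at the code level, here is a minimal random-walk Metropolis–Hastings sketch in plain Python/NumPy; the target density, step size, and sample count are arbitrary illustrative choices, and the class linked above uses JAGS rather than a hand-rolled sampler like this:

```python
# Minimal Metropolis-Hastings sketch: draw samples given only an
# unnormalized target density. The standard-normal target and the
# proposal step size are illustrative choices, not from the course.
import numpy as np

def unnormalized_target(x):
    return np.exp(-0.5 * x**2)  # proportional to a standard normal density

def metropolis_hastings(n_samples=10_000, step=1.0, seed=0):
    rng = np.random.default_rng(seed)
    samples = np.empty(n_samples)
    x = 0.0
    for i in range(n_samples):
        proposal = x + rng.normal(scale=step)   # symmetric random-walk proposal
        accept_prob = unnormalized_target(proposal) / unnormalized_target(x)
        if rng.random() < accept_prob:          # accept, otherwise keep current state
            x = proposal
        samples[i] = x
    return samples

draws = metropolis_hastings()
print(f"mean = {draws.mean():.2f}, std = {draws.std():.2f}")  # should land near 0 and 1
```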

r/datascience • comment
2 points • maxmoo

Just took a look at the link; it's hard for me to judge the rigor of the courses without looking into them more closely, but the heavy emphasis on SAS is a pretty big red flag for me (it suggests they're not focusing on modern research and techniques).

I think if you're restricted to online-only, you're better off just picking and choosing from free/cheap stats courses through Coursera, Stanford Online, etc. https://www.coursera.org/specializations/probabilistic-graphical-models is awesome; you might want to do a more basic stats course first to get the background.

I don't think you need to study more CS if you already have a bachelor's. You can't really learn deep learning through a CS qualification yet; the field is too new, so you're better off self-teaching.

r/MLQuestions • comment
1 point • karlpoppery

I haven't taken it yet, but this course has a pretty stellar rating on Coursera, and it's probably a good place to start: https://www.coursera.org/specializations/probabilistic-graphical-models

r/statistics • comment
1 point • Liorithiel

I recommend the Coursera class on the topic. It was great. Took it because of that lecture ;-)

r/datascience • post
1 point • ratterstinkle
Probabilistic Graphical Models MOOC: Thoughts?

Has anyone taken Daphne Koller's Coursera courses on probabilistic graphical models? It is a mini specialization that consists of three courses and I was curious what people thought of it. The courses have high ratings, but the ratings for almost all of the courses are high, rendering the metric saturated.

r/computerscience • comment
1 point • made-it

I had to get that book for one of my AI classes. It's way too dense for someone new to the topic (but it's great as a reference *after* you're already familiar with the topic).

At least the Coursera class explains things in a more beginner-friendly way imo (https://www.coursera.org/specializations/probabilistic-graphical-models).

r/statistics • comment
1 point • doct0r_d

Dr. Koller (one of the founders of Coursera) also has a MOOC on graphical models on Coursera:

https://www.coursera.org/specializations/probabilistic-graphical-models

r/statistics • comment
1 point • k3rv1n

Also, for PGMs, check out Daphne Koller's Probabilistic Graphical Models. I thought her book was also good, though I think I'm in the minority on that one.

r/MachineLearning • comment
7 points • PM_UR_LOSS_FUNCTIONS

Here is a list, off the top of my head, of non-superficial online courses that I truly feel are equivalent in difficulty to upper-division undergraduate or graduate courses; most of these are taught by some of the best professors you could possibly find for the topic:

Probability Theory: https://www.edx.org/course/probability-the-science-of-uncertainty-and-data

Optimization: https://lagunita.stanford.edu/courses/Engineering/CVX101/Winter2014/

AI overview (e.g., reviews algorithms and presents an introductory overview of the field): http://ai.berkeley.edu

Machine Learning: https://www.edx.org/course/machine-learning-columbiax-csmm-102x-0

PGMs: https://www.coursera.org/specializations/probabilistic-graphical-models

Deep RL: http://rail.eecs.berkeley.edu/deeprlcourse/

Robotics (Perception/Navigation or topics relevant to CS): https://www.edx.org/course/autonomous-mobile-robots

Cryptography: https://www.coursera.org/learn/crypto

Compilers: https://lagunita.stanford.edu/courses/Engineering/Compilers/Fall2014/about

Graphics: https://www.edx.org/course/computer-graphics-uc-san-diegox-cse167x-3

There is also a ton of material you can find just by googling, e.g., "berkeley 184" or "cmu 10708", but those aren't technically MOOCs.

r/ItalyInformatica • comment
1 point • diego-user

[Data Science]

Good morning,

I am a master's student in Computer Science on the AI track. I would like to supplement my study plan with some additional statistics courses (taken at my university or from Stanford's offerings) so that I can later move into a data scientist role, combining computer science skills with statistical ones.

During my bachelor's degree I took the following Probability and Statistics course: http://magistrale.educ.di.unito.it/index.php/offerta-formativa/insegnamenti/elenco-completo/elenco-completo/scheda-insegnamento?cod=MFN0600&codA=&year=2019&orienta=Z

What do you think of the following path:

  1. [Stanford] Machine Learning by Andrew Ng
  2. [Uni] Information Theory (Teoria dell'Informazione): http://magistrale.educ.di.unito.it/index.php/offerta-formativa/insegnamenti/elenco-completo/elenco-completo/scheda-insegnamento?cod=INF0095&codA=&year=2019&orienta=XFH
  3. [Uni] Machine Learning (Apprendimento Automatico): http://magistrale.educ.di.unito.it/index.php/offerta-formativa/insegnamenti/elenco-completo/elenco-completo/scheda-insegnamento?cod=INF0091&codA=&year=2019&orienta=XH
  4. [Stanford] Statistical Learning: https://www.edx.org/course/statistical-learning
  5. [Uni] Bayesian Networks: Reasoning under uncertainty. The methodological part covers probabilistic reasoning, Bayesian networks and inference methods over Bayesian networks, and temporal probabilistic models. On the practical side, lab sessions illustrate the use of tools for modeling and inference with Bayesian networks and with other, more complex probabilistic models. Model-Based Diagnosis. The main model-based diagnosis algorithms for static and dynamic systems are presented. Building on reasoning under uncertainty, these algorithms are then compared with probabilistic diagnosis of both static and dynamic systems.
  6. [Stanford] Probabilistic Graphical Models Specialization: https://www.coursera.org/specializations/probabilistic-graphical-models

What do you think about the order in which to take them? Is anything missing?

Thank you for your attention

r/learnmachinelearning • comment
1 point • Responsible_Text9102

# 1. Probability and Statistics by Stanford Online

[See course materials](https://online.stanford.edu/courses/gse-yprobstat-probability-and-statistics)

This wonderful, self-paced course covers basic concepts in probability and statistics spanning four fundamental aspects of machine learning: exploratory data analysis, producing data, probability, and inference.

Alternatively, you might want to check out this excellent course in statistical learning: “[An Introduction to Statistical Learning with Applications in R](https://www.r-bloggers.com/in-depth-introduction-to-machine-learning-in-15-hours-of-expert-videos/)”.

# 2. 18.06 Linear Algebra by MIT

[See course materials](https://ocw.mit.edu/courses/mathematics/18-06-linear-algebra-spring-2010/)

The best linear algebra course I’ve seen, taught by the legendary professor Gilbert Strang. I’ve had students describe this as “life-changing”.

# 3. CS231N: Convolutional Neural Networks for Visual Recognition by Stanford

[See video lectures (2017)](https://www.youtube.com/playlist?list=PLzUTmXVwsnXod6WNdg57Yc3zFx_f-RYsq) [See course notes](https://cs231n.github.io/)

Whether you’re into computer vision or not, CS231N will help you become a better machine learning researcher/practitioner. CS231N balances theory with practice. The lecture notes are well written, with visualizations and examples that explain difficult concepts such as backpropagation, gradient descent, losses, regularization, dropout, batch norm, etc.

# 4. Practical Deep Learning for Coders by fast.ai

[See course materials](https://course.fast.ai/)

With the ex-president of Kaggle as one of its co-founders, this hands-on course focuses on getting things up and running. It has a forum with helpful discussions about the latest best practices in machine learning.

# 5. CS224N: Natural Language Processing with Deep Learning by Stanford

[See video lectures (2017)](https://www.youtube.com/playlist?list=PLU40WL8Ol94IJzQtileLTqGZuXtGlLMP_) [See course materials](http://web.stanford.edu/class/cs224n/syllabus.html)

Taught by one of the most influential (and most down-to-earth) researchers, Christopher Manning, this is a must-take course for anyone interested in natural language processing. The course is well organized, well taught, and up to date with the latest NLP research.

# 6. Machine Learning by Coursera

[See course materials](https://www.coursera.org/learn/machine-learning)

Originally taught at Stanford, Andrew Ng’s course is probably the most popular machine learning course in the world. Its Coursera version has enrolled more than 2.5M people as of this writing. This course is theory-heavy, so students would benefit more from it if they have already taken more practical courses such as CS231N, CS224N, and Practical Deep Learning for Coders.

# 7. Probabilistic Graphical Models Specialization by Coursera

[See course materials](https://www.coursera.org/specializations/probabilistic-graphical-models)

Unlike most AI courses that introduce small concepts one by one or add one layer on top of another, this specialization tackles AI top-down: it asks you to think about the relationships between different variables, how you represent those relationships, what independence assumptions you’re making, and what exactly you’re trying to learn when you say “machine learning”. This specialization will change the way you approach machine learning. Warning: this specialization isn’t easy. You can also consult the detailed notes written by Stanford CS228’s TAs [here](https://ermongroup.github.io/cs228-notes/).
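
To make the "what independence you're assuming" point concrete, here is a small sketch (with made-up numbers of my own, not material from the specialization) verifying that a chain-structured model A -> B -> C renders A independent of C given B:

```python
# Sketch: in a chain A -> B -> C, the factorization P(a) P(b|a) P(c|b)
# implies A is conditionally independent of C given B.
# All probabilities below are illustrative only.
from itertools import product

P_a = {0: 0.3, 1: 0.7}
P_b_given_a = {(0, 0): 0.9, (1, 0): 0.1, (0, 1): 0.4, (1, 1): 0.6}    # key: (b, a)
P_c_given_b = {(0, 0): 0.8, (1, 0): 0.2, (0, 1): 0.25, (1, 1): 0.75}  # key: (c, b)

def joint(a, b, c):
    return P_a[a] * P_b_given_a[(b, a)] * P_c_given_b[(c, b)]

# Check P(c | a, b) == P(c | b) for every assignment.
for a, b, c in product((0, 1), repeat=3):
    p_ab = sum(joint(a, b, cc) for cc in (0, 1))
    p_c_given_ab = joint(a, b, c) / p_ab
    assert abs(p_c_given_ab - P_c_given_b[(c, b)]) < 1e-12
print("A is independent of C given B under this factorization")
```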

# 8. Introduction to Reinforcement Learning by DeepMind

[See lecture videos](https://www.youtube.com/watch?v=2pWv7GOvuf0&list=PLqYmG7hTraZDM-OYHWgPebj2MfCFzFObQ)

Reinforcement learning is hard. Luckily, David Silver comes to the rescue. This course provides a great introduction to RL with intuitive explanations and fun examples, taught by one of the world’s leading RL experts.

# 9. Full Stack Deep Learning Bootcamp by Berkeley

[See course materials](https://fullstackdeeplearning.com/march2019)

Most courses only teach you how to train and tune your models. This is the only one I’ve seen that shows you how to design, train, and deploy models from A to Z. This is also a great resource for those struggling with the machine learning system design questions in interviews.

# 10. How to Win a Data Science Competition: Learn from Top Kagglers by Coursera

[See course materials](https://www.coursera.org/learn/competitive-data-science/home/welcome)

With all the knowledge we’ve learned, it’s time to head over to Kaggle to build some machine learning models to gain experience and win some money. Warning: Kaggle grandmasters might not necessarily be good instructors.

# 11. Full Stack Deep Learning: Deploy ML Projects

[Lecture 1: Introduction to Deep Learning - Full Stack Deep Learning - March 2019 - YouTube](https://www.youtube.com/watch?v=5AjG5OPQuBM&list=PLbcQZcJKzjYWRD2LB8N2I8bFNWg3W_J3n&index=2&t=0s)