Week 3 pre-lecture reading

More on Bayes' Rule

The plan for week 3

We are going to model language learning as a process of inference: a learner observes data and infers a language from that data. Specifically, we are working with models that treat language learning as a process of Bayesian inference: learning involves calculating the posterior probability of each candidate language given the observed data, based on that language's prior probability and the likelihood of the data under that language. We will generally be working with very simple models, where the learner gets data of a very simple sort and infers one of a small number of possible languages – in fact, many of the models we will be using are very similar or identical to the coin-tossing models discussed in the readings below.
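To make that concrete, here is a minimal sketch of this kind of inference in Python. Everything in it – the two candidate languages, their priors, and the probabilities they assign to utterances – is invented for illustration rather than taken from the course models, but the calculation is exactly Bayes' Rule over a small hypothesis space, just like the coin-tossing examples in the readings.

```python
# A minimal sketch of Bayesian inference over a small set of candidate
# "languages". The languages, priors, and probabilities below are invented
# for illustration; they are not the models used in the course.

# Each candidate language assigns a probability to producing utterance "a"
# (and produces "b" otherwise) -- analogous to a coin's bias towards heads.
prob_a = {
    "language_1": 0.8,  # mostly produces "a"
    "language_2": 0.2,  # mostly produces "b"
}

# Prior probability of each language, before seeing any data.
prior = {"language_1": 0.5, "language_2": 0.5}

# The learner's observed data: a sequence of utterances.
data = ["a", "a", "b", "a"]

def likelihood(language, data):
    """P(data | language), assuming utterances are produced independently."""
    p = prob_a[language]
    result = 1.0
    for utterance in data:
        result *= p if utterance == "a" else (1 - p)
    return result

# Bayes' Rule: P(language | data) is proportional to
# P(data | language) * P(language); dividing by the sum of these
# numerators normalises the posteriors so they sum to 1.
unnormalised = {lang: likelihood(lang, data) * prior[lang] for lang in prior}
evidence = sum(unnormalised.values())
posterior = {lang: value / evidence for lang, value in unnormalised.items()}

print(posterior)
```

Running this gives a posterior of about 0.94 for language_1: three "a"s and one "b" are much more probable under a language that produces "a" 80% of the time than under one that produces it 20% of the time.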

The aim for the pre-reading this week is to give you a basic understanding of Bayesian inference. Understanding Bayes' Rule involves a little bit of basic arithmetic, plus some notation for talking about probabilities, conditional probabilities and so on. I have tried to find a couple of relatively basic introductions, and flagged up additional resources you might find useful if you are new to this stuff or need a refresher. There is potentially quite a lot of material to get through – minimally one short chapter and a couple of quizzes if you are already somewhat familiar with this stuff and just want the basics, but more if you end up watching tutorial videos or tackling the advanced readings. I recommend making a start on these materials before the week 3 lecture, but if you don't get through them or want to return and re-read these materials, you can do that as preparatory reading/revision for the next couple of lectures.

You should do at least one of the following two reading options, followed by the quizzes linked below.

For the basics (if in doubt, take this option)

Read Chapter 1 of Stone (2013). This will equip you with just enough to understand the lectures and do stuff with the code. Don't worry about the section titled "Model Selection, Posterior Ratios and Bayes Factors"; we won't be using any of that. Your enjoyment of the "fork handles" example may be increased by watching this:

If you find the Stone reading OK and want to get a little more in-depth, you could read Chapters 2-4 of the same book (there are a couple of copies in the library), or try the “for the keen” option below.

If, on the other hand, this is new to you and you want some help with the basics of probability, you could check out these short Khan Academy video lectures, which are quite helpful:

You could also google "introduction to Bayes' Theorem", "introduction to Bayes' Rule", or "introduction to probability theory" – there are dozens of intro texts and videos out there, so if the ones I like don't work for you, you'll be able to find something that does.

For the keen

For an introduction that covers a lot more ground in a lot more detail, you could read Chapters 2, 4, 5 and 6 of Kruschke (2015) – the whole book is available online via the library (if you are on the University network those direct links should work; otherwise follow this link to access via the library, and you'll have the option to log in when you follow one of the links to the full text). Kruschke's focus is on Bayesian data analysis rather than cognitive modelling; this reading involves more maths than the first chapter of the Stone book and is generally a lot less gentle. But if you are interested, I think it'll give you a really solid understanding of Bayesian inference and make the stuff we do on the rest of the course fairly straightforward. Note that the book is designed around the R programming language – we aren't using R, so don't mess with the R programming exercises at the end of each chapter, and skip over the little R snippets that crop up occasionally. Alternatively you could do Chapters 2-6 of the 1st edition, Kruschke (2011), but that's not available online.

After the reading, take the quizzes!

Do these three short quizzes, which will set you some simple puzzles to do with probability, likelihood, priors, and eventually doing Bayesian inference. Most of the course will be no more conceptually complex than these examples, so if you can work through these you'll be in good shape – and if you can't, they'll help you figure out whether you need to watch some of the tutorial content, or ask for help or clarification in lectures and/or labs.

References

Kruschke, J. K. (2011). Doing Bayesian Data Analysis: A Tutorial with R and BUGS, 1st Edition. London: Academic Press.

Kruschke, J. K. (2015). Doing Bayesian Data Analysis: A Tutorial with R, JAGS, and Stan, 2nd Edition. London: Academic Press.

Stone, J. V. (2013). Bayes' Rule: A Tutorial Introduction to Bayesian Analysis. Sheffield: Sebtel Press.

Re-use

This page was written by Kenny Smith. All aspects of this work are licensed under a Creative Commons Attribution 4.0 International License.


