Sorry, that link does not open up a lecture page. But what you are calculating seems to be somewhat wrong. Phase noise is a broadband noise: it is typically measured from 100 Hz to 10 MHz offset from the carrier, and the level at each frequency offset point varies.
So if you have a phase noise measurement and want to turn it into RMS jitter, you have to INTEGRATE it over the frequency limits of 100 Hz to 10 MHz, do a little more math, and then you have an answer.
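As a rough sketch of that integrate-then-convert procedure (the function name, the example 100 MHz carrier, and the flat -150 dBc/Hz curve are all made up for illustration, not from any real measurement):

```python
import numpy as np

def rms_jitter_from_phase_noise(offsets_hz, l_dbc_hz, carrier_hz):
    """Integrate a single-sideband phase noise curve L(f) in dBc/Hz
    over the given offset range and convert to RMS jitter in seconds."""
    # Convert dBc/Hz to a linear power ratio per Hz
    l_lin = 10.0 ** (np.asarray(l_dbc_hz, dtype=float) / 10.0)
    offsets = np.asarray(offsets_hz, dtype=float)
    # Trapezoidal integration over offset frequency (total SSB power, ratio)
    area = np.sum(0.5 * (l_lin[1:] + l_lin[:-1]) * np.diff(offsets))
    # Factor of 2 counts both sidebands; sqrt gives RMS phase in radians
    phi_rms = np.sqrt(2.0 * area)
    # Divide by angular carrier frequency to get time jitter
    return phi_rms / (2.0 * np.pi * carrier_hz)

# Hypothetical 100 MHz oscillator with a flat -150 dBc/Hz noise floor
offsets = np.logspace(2, 7, 200)          # 100 Hz to 10 MHz offsets
l_curve = np.full_like(offsets, -150.0)   # dBc/Hz at each offset
print(rms_jitter_from_phase_noise(offsets, l_curve, 100e6))  # ~0.23 ps
```

A real curve would of course have different slopes in each offset decade (flicker, white FM, floor), so you integrate the measured points rather than a flat line.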
Possibly the example you were looking at was not phase noise at all, but was relating a discrete-frequency spurious sideband to the jitter that one tone would cause?
Here is a spreadsheet, "Allan variance from phase noise", that might help your understanding:
http://www.wenzel.com/documents/spread1.htm
There are other spreadsheets out there for jitter-to-phase-noise calculation as well.
Rich
Maguffin Microwave