[Reference request] Article by scientist giving lower and upper bounds on the probability of superintelligence

https://www.lesswrong.com/posts/FYBWCQbwMXwJm6Jdy/reference-request-article-by-scientist-giving-lower-and

A few months back, somebody posted an article by a scientist giving lower and upper bounds on the probability of superintelligence. He structured the estimate as a Fermi calculation with three parts (EDIT: see LocustBeanGum's answer). Does anybody remember this article, and if so, can you provide a link?

Comment

https://www.lesswrong.com/posts/FYBWCQbwMXwJm6Jdy/reference-request-article-by-scientist-giving-lower-and?commentId=p5qwNahGhn48dqhBH

The original article; the LW link post.

Perhaps you mean this, though the probabilities involved in the argument are different (that human-level AI will be built within 100 years; that the AI will be able to undergo recursive self-improvement in intelligence; and that this intelligence explosion will unpredictably transform our world).
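
For concreteness, the bound structure of such a three-factor Fermi estimate looks roughly like the sketch below. The factor names follow the comment above, but the numeric ranges are made-up placeholders for illustration, not numbers taken from the article.

```python
# Minimal sketch of a three-factor Fermi estimate with lower/upper bounds.
# The per-factor ranges are hypothetical placeholders, assuming the factors
# are treated as independent so the bounds simply multiply.

factors = {
    "human-level AI within 100 years": (0.1, 0.9),  # (lower, upper) bound
    "recursive self-improvement occurs": (0.1, 0.9),
    "intelligence explosion transforms the world": (0.1, 0.9),
}

lower, upper = 1.0, 1.0
for name, (lo, hi) in factors.items():
    lower *= lo
    upper *= hi

print(f"Overall probability roughly between {lower:.3f} and {upper:.3f}")
```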

Comment

https://www.lesswrong.com/posts/FYBWCQbwMXwJm6Jdy/reference-request-article-by-scientist-giving-lower-and?commentId=rc3LcyXtrmNc48B23

Thanks, this is what I was looking for.