Over the past several decades, growing awareness of how oral health affects overall physical wellbeing has made regular visits to the dental clinic a routine part of life for North Americans.