Dentists experience the satisfaction, privilege and joy of positively transforming their patients' lives by improving their oral health and making their communities healthier. Dentistry is a rewarding and challenging career that offers a wealth of benefits, including owning your own business, exercising creativity in daily work, working as part of a team, and helping to eliminate oral health disparities.