Teachers educate, and their work brings vast benefits, both economic and social. The more educated a society is, the more power its members have to affect their economic circumstances. Employers also value education, so whatever job is pursued, the educated applicant will stand out and will be more likely to succeed. Education also makes people aware of the possibilities available to them, meaning they are more likely to be proactive in building a responsible, caring society in which to live.