I live in a fairly small town and know a guy who's a manager for the local town parks.
Seems like he makes a decent living, and basically gets to drive around all day and manage guys doing maintenance work such as trash pickup, mowing, etc., as well as enforce city ordinances (e.g., making residents remove trash and other violations from their yards).
Being able to drive around and work with other guys in the summer sun sure seems enviable IMO compared to being stuck in some cramped cubicle or office, surrounded mainly by gossipy female coworkers.
I think it's a little disappointing how the conventional "office job" is touted today as the ultimate goal for college grads. Anyone know of any good job opportunities like this, just in general?
I'm aware of apprenticeships and certifications in trades like welding, electrical work, plumbing, etc., which primarily center around guys working with their hands in a male-oriented environment.