Anonymous

what do you think of college degrees? are they necessary or unnecessary?

12 Answers

  • Greg
    Lv 7
    4 weeks ago
    Best answer

    For some jobs they are critical, for others they are important, and for others they are unnecessary.

  • 4 weeks ago

    To be honest with you... universities are businesses. I strongly believe that the school systems are extremely unfair. I've been in school for 20 years, had wonderful teachers who motivated me to excel, and sure, I've learned along the way. But the system is so unfair, because it only lets in honors/AP/high-GPA students, and that's so impossible to reach if you're not tech savvy or are a visual learner like me. So, to better answer your question, I think that college degrees are extremely difficult to obtain because of external factors that unfortunately make them harder and harder to even obtain.

    • Thelonious
      4 weeks ago

      Great answer.  Honestly, I think that colleges are kinda jokes.  They're only really important if you want to be a doctor, lawyer, or teacher.  But why do I need to go to college to learn that managers "plan, organize, lead, and control"?  I could learn that by just picking up a textbook!

  • Anonymous
    4 weeks ago

    Depends what you want to do.  If you want a job in a lot of fields, it's a necessity.  A degree in 14th century French poetry is likely not going to be a huge help in terms of finding a job.

  • 4 weeks ago

    A degree is necessary for many occupations; however, I'm an "on the job training" kind of guy.  Had great success and a career in sales, formed a company, and eventually sold it to employees who were HS grads.  Just was not a formal-education guy.  Good luck.

  • 4 weeks ago

    In some professions, they are necessary. For everyone? Not necessary.

  • 4 weeks ago

    Maybe not an actual degree for everyone, but becoming educated is never in vain: it may clear out nonfactual beliefs and outright wrong assumptions; give you a better appreciation for things like architecture, art, history, literature, and music; expand your way of thinking and your tolerance; give you an understanding of your own body and health; allow you to make more informed decisions; etc.

  • 4 weeks ago

    I think it can help the people who want them.

  • oikoσ
    Lv 7
    4 weeks ago

    If you want to work on a garbage truck, unnecessary. If you want a high-paying (legal) job, necessary. If you are a Delta or Epsilon, don't clutter up the campus trying to get a degree. Leave the classes for the Alpha-plusses, even the ones who had alcohol in their blood-surrogates.

  • 4 weeks ago

    I think most college degrees show only that you are willing to work hard for four years (or six years) and finish your degree even though you're sick of it by the end.  That's what employers want, someone who can jump through hoops, someone persistent and resourceful and perspicacious, who won't throw up his hands and say 'F*ck it!' when things get rough.

    This is one reason why many jobs specify a college degree, -any- college degree.  Some of the dumbest, most incompetent people I've worked with had college degrees.

    But in STEM fields, you really need all the education you can get.  For a research scientist, a top-level engineer, or a physician, a Master's or a PhD is really only the start of your learning, the entry level!

  • Anonymous
    4 weeks ago

    I "think" this is a chat question, and thus, a violation.
