I have just graduated and am now looking for a good job. All my relatives have been advising me to only accept a job at a company that provides medical insurance benefits. Is it really that important for an employer to provide medical insurance? What are the benefits?