I don't know what your plan is with the nursing field, but I'm guessing that in the long run you want to be some sort of doctor, right?
Obviously you could also become an optometrist, a dentist, or a psychiatrist, or even get into pharmaceuticals. Now here's the thing: if you see yourself doing this eventually, after you graduate and finish med school (or whatever the equivalent is), then you should get the nursing experience. Sure, there's still a big stigma attached to being a male nurse, but in the long run it will be better to have that kind of experience than restaurant experience. Or do you plan to finish school and buy a restaurant?
Also, back to the nursing job. Are you sure you want to get involved in the nursing industry when you finish college, or the medical industry for that matter? Or are you still undecided? The difference is that this job will give you an eye-opening look into the nursing industry, and if you don't like it, you'll be better off, because then you can switch majors before it's too late. There's also the possibility of finishing school with a degree in nursing while working in a restaurant. The thing is, a few years down the road when you're trying to find a job, potential employers will see that you don't have any experience, so even though you've learned the theory in school, you haven't lived the reality of the job. If you finish college but can't find a nursing job for a long time, you'll regret not taking the nursing job you were offered, and with little money in your pocket you'll keep saying "I hate my life, I hate my life."
Yet if you do take the job and finish school with good grades, employers are going to give you a shot, because it will be that much easier for them to train you.