Do you need to take supplements? Probably not. Studies consistently show that vitamins and minerals are better absorbed from food than from pills, and that supplements don't do the average person much good. Food also provides phytonutrients and other compounds you simply can't get from a capsule.
Keep in mind that the supplement industry is for-profit: its goal is to make money by convincing you that you need its products. And in the United States, our "more is better" attitude leads people to blindly take excessive doses.
My other beef with the supplement industry is that it isn't regulated the way drugs are. There's no guarantee that what the label says is actually in the bottle, or that the product is safe, especially if you're taking other medications.
So eat food. Ditch the supplements. Stop worrying. Live vibrantly!