Why Skin Gets Dry in the Winter

Episode #139 / Dec 28, 2009
Skin gets drier in the winter. But before learning how to best hydrate dry skin (covered in another DermTV episode), it's important to understand what makes it dry in the first place. Dr. Schultz explains.
Donna DeSantis on December 28, 2009 at 6:37pm

This is such a GREAT site... I miss seeing Dr. Schultz in person, maybe someday... if I ever have medical insurance again! Till then I'm a fan of this site!