California Goes Green
Winter in Northern California brings snow to the Sierra Nevada and rain to the lands stretching west to the Pacific. And even though nature takes its course year in and year out, it's always a little surreal when evidence of new life carpets the land in the middle of winter. So it's official: winter has arrived in California, and the hills are alive with green!