I don't know why people can't put this together: WV (and the rest of the South) hasn't just gotten more Republican, it's gotten more conservative, too. I know we like to pretend the old Dems in Dixie were just right-wing racists and it took a while for those poor, stupid Southerners to switch to being Republicans (ya know, when the parties literally switched platforms!), but the fact is much of the South used to genuinely embrace liberal politics, specifically on economic issues. The real heart of American conservatism originated in the Midwest.
Tell that to the Byrd Machine of Virginia.
Not sure what your point is. I said "much of the South," and it's pretty clear that VA (along with TX, FL and OK) embraced conservatism much more quickly than the rest of the South in the latter half of the 20th century. I also never said that there weren't any conservative Southern Democrats, simply that the South has gotten much more conservative even in the last 20 years, which I'm not sure how you can refute.
So again, not sure what your point is...
You'd also have to add South Carolina to that list of states quick to embrace conservatism, at a minimum.