If I may, let me give my perspective, having grown up in the 50s and 60s.
Starting in the 60s and accelerating in the 70s, white society went through a major transformation in its attitude toward race relations in general and toward black people in particular. I saw it in my family and in the people I met and worked with over the last many decades.
Given that there are no absolutes (one can always find an example of a single white person acting in a racist manner toward blacks), it is not a bold statement to say that white racism is a thing of the past.
Since the 1960s, white society has done everything possible to promote racial harmony, even instituting 'reverse' racism (against white citizens who had nothing to do with slavery, and whose ancestors in many cases had nothing to do with it either) under a series of federal and state programs grouped under the heading of 'affirmative action'.
Has black society followed along the same path as white society toward racial harmony over the last 50 plus years? The simple answer is No.
This is the case even though the great majority of blacks have never experienced slavery and almost all have never experienced real racism.
Many blacks would differ and say they have experienced racism, but that is because they take any disagreement between themselves and a white person as racism. They seem to forget that we all have disagreements with others from time to time.
I still remember the black racism against whites (I could include my own examples) into the 70s. I am also aware of the continued racial harmony between whites and blacks since that time (again, from a white perspective) over an extended period.
I have also noticed a growing disharmony from blacks toward whites over the last 6 years. I don't know if it is because Obama and his administration, acting so often in a racist manner, have made it easier for black racists to become more vocal, or because more blacks are becoming racist.
Regardless, I see the attitudes of whites toward blacks starting to change. Why?
It is simply a reaction of whites, after 50-plus years of promoting racial harmony, to having to deal with a black culture in America that, in general, cares more for racial disharmony than harmony.
To me, the biggest mistake blacks have made over the last 50-plus years was putting all their marbles, and their faith, in the Democratic Party. That is a whole other discussion, but it is an argument easily made and justified.
So, getting back to the OP: if there is a race war, then the history books (if written correctly) will say that the Democratic Party created the conditions and Obama lit the match.