I am so stuck here!
I can't honestly think of a book that I have read that has changed my views.
I thought of the Bible ... but I was raised reading the Bible, so I can't say it changed my views.
Anything other than that ... I just don't know!
The last thing I read was the Twilight series ... but no views changed there!
So we will skip this one ...