Hello folks. I'm a Canadian looking to possibly winter in the southern states (preferably Arizona), but I have reservations given the recent news about gun violence in the US. I don't carry a firearm; hell, I don't even own one. Is it really that bad, or is the media just sensationalizing it? Thanks in advance for the honest replies.