Science only teaches fear.


I can't think of a single time where a scientific study left me feeling secure, or safe, or confident.

The more I think about that, the more alarming, and honestly sad, it seems.

Science is always talking about what's out there to get us. Viruses, animals, natural disasters, other people.

Where is the scientific study that shows people how humans affect the world? Other than saying how humans are destroying the planet through global warming. Climate change? I think I've even heard it called global cooling a few times.


"But bro even animals experience fear!"

I'm so sick of that hippy argument suggesting that if animals are doing something, then it should be okay for humans to do it as well.

It's just not logical.

Fear is something that should be taken out of the equation for everyone.

I'm really going to try to live my life without fear. Fear is just a waste of energy. I'm not saying I'm gonna go balls to the wall and start trying to run through walls. I'm just gonna try to function and live without that emotion holding me back.