In his recent book Thinking, Fast and Slow, psychologist Daniel Kahneman tells the story of observing army recruits out on exercises and his belief that he could spot the potential leaders among them. Years later, it turned out he'd been almost entirely wrong. His confident judgment had been a morass of bias, heuristics, and narrative fallacies.
This got me thinking about a more-or-less complete non sequitur, vaguely related to open atmosphere and extravagant diversity.
As humans, we have brains built for pattern matching, so we find patterns. We fold these patterns into systems of heuristics and apply them all day long: "Have I been in this position before? What did I do then? Did it work?" Actions that match our heuristics feel right because they work as expected and match previous experience.
We're making more of these patterns every day, and we are remarkably good at matching situations against them.
Problem: we're not good at evaluating whether our past decisions were correct. This introduces systematic bias, and the bias grows directly with the homogeneity of your peer group.
Fixes? Think different. Use data to evaluate your decisions. Interact outside of your comfortable peer group. Be aware that you're taking shortcuts, and take the long way every so often to see if it really is longer. Think.