|It's a good thing I know where the starting point is.
Pen to paper. Or fingers to keyboard, as it were.
Then - more data comes in, spurring additional thoughts that make tangential subjects relevant, which, in turn, alters the soon-to-be-written article's original premise. Mental version 1 of the article becomes a fuzzy backdrop by the time mental version 26 comes around.
A busy Monday to Friday schedule, coupled with family time on weekends, makes the focus required to write a coherent article seem like an unreachable concept; spare time has become a privilege.
The absence of a distraction-free environment and the continuing onslaught of everyday commitments unfailingly get in the way.
Something else also happens: the mind wanders between primary tasks (i.e., work-related activities, family-raising duties and select social engagements) and has little, if any, willpower left for activities that require special focus and attention.
The brain's default mode network eats up the day's free hours - reacting to external stimuli brought on by the environment fritters away any chance of producing quality material.
Writing is put away for another day.
To me - this all sounds like an excuse.
No one, after all, is exempt from everyday life and the trials set forth by the human condition.
To get time, one must then make time.
Still - time will not be enough. Depending solely on time will yield suboptimal results, mostly because time cannot always be assured.
|I'm sure we can figure out something.
Mental maps and processes help the organized mind execute everyday activities, taking both the short-term and long-term consequences into account. Decisions as to which sequence of activities to execute, i.e., which processes or systems to employ, depend on both the data that is available and the heuristically inspired intuitive reasoning all human beings possess.
Decision-making as a cognitive process is present in all human beings, and it is a trait that signifies general intelligence. Notwithstanding those who consider themselves infallible, the rest of us have come to understand that not all decisions deliver the best outcomes. Whether the decision-maker blames himself or others for a good or bad outcome is beside the point: trying to understand how and why a decision is made is a truly necessary, yet hopelessly labyrinthine pursuit. Thankfully, much research has been done on how to make better decisions, and significant progress has been made through the past and ongoing contributions that scientists, psychologists, economists, businesspeople and other gifted minds have shared with the world.
Choosing which decision-making model, inspiring scientific insight or approach to study and apply becomes an impossible task - the amount of rigorous analysis produced on the subject is overwhelming. Students of optimal decision-making fall prey to analysis paralysis through information overload.
Edward Tufte, a statistician and professor at Yale University, argues that information overload is usually a symptom of organizational underload. Tufte is an infographics pioneer and has proven his thorough understanding of sound decision-making through his data-analysis work. He has disputed the neat-freak belief that messy desks equal disorganized minds, instead attributing chaotic office environments to minds that have yet to file and organize the ideas in their heads. I empathize completely with my fellow messy-desk disorder sufferers and will choose to now
|Where did I leave that donut?
Norbert Wiener, a professor of mathematics at MIT, derived his own take on decision-making (or on steering decisions, as a cybernetician might say) through cybernetics, a unique systems-led approach that manages complexity through control and communication, a theory he first stated in his book "Cybernetics: Or Control and Communication in the Animal and the Machine". Stafford Beer, a cybernetician who taught at the Manchester Business School, applied cybernetics to management and subsequently developed the Viable System Model, a system of subsystems that functions as an autonomous entity, responding to a changing environment through information-fed feedback loops.
All in all, institutions, governments, businesses and other types of entities, whether individual, group-led, large or small, have benefited to some degree thanks to these men's contributions.
Yet, having referenced these academics is akin to barely skimming the surface when it comes to the work done on matters that directly and indirectly affect the science of decision-making.
Therein lies a possible roadblock - even though society at large has benefited from the aforementioned work, a gap has yet to be filled, primarily between those who can successfully implement and execute the processes and systems that deal with complexity and lead to better decisions, and those who are not ready to employ such solutions or who simply, out of ignorance or corruption, choose not to.
In other words: managing expectations is key.
For the sake of simplicity, I'll leave the reader to interpret what "managing expectations" means here. I'll try to sum it up in one thought, though: managing expectations in this context means making sure people understand the work required to accomplish the task in question and the reward that comes with that work, and ensuring that those involved will, at the end of the day, be able to see light at the end of the tunnel (i.e., they'll maintain their motivation).
As always, when it comes to managing expectations, the jury is still out on how to best accomplish this.
Given this fact, I've always been a fan of keeping things simple, or as simple as possible.
To exemplify a theory grounded in simplicity: Einstein's famous early-20th-century theory of special relativity was notably free of the "detail work" most scientific papers of the time were chock full of, the theory having been argued mostly through Einstein's own thought experiments and well-thought-out abstract notions. Yet everyone (in the scientific community at the time) was at the very least able to understand where he was coming from once the paper was published. Once others confirmed Einstein's theories through experiments, he became a celebrity overnight. People now understood - or understood enough. Or... understood more. And those who understood Einstein's theories deeply (like Stephen Hawking) have been able to further explore the universe and the matter (or energy - E = mc²; Einstein helped us understand that both are one and the same) it consists of. The rest is history.
I'm almost certain Richard Feynman wrote or said something like the following (paraphrased): if you cannot teach something complex to a teenager - as evidenced by the teenager not understanding what you're trying to teach - then it's hard to believe the idea is true or, alternatively, that you truly understand it.
Meaning: truly understanding something should give a rational person the ability to explain an idea clearly to someone else and to have that person fully understand the concept - which should result in the clarity needed to perfectly manage expectations.
This revelation has made it difficult for me to implement theoretical and abstract models, like optimal decision-making models, after having supposedly "learned" them. Most of the time, I've realized that I did not truly understand the models I was looking to implement - which, funnily enough, has turned out to be a blessing in disguise. Knowing that you know nothing, as Socrates (by way of Plato) is said to have observed, turns out to be quite advantageous: one gets to truly learn what one is looking to implement through further study and experimentation.
|At least one of them has a sense of humor.
So, amidst all this complexity, it's advisable to subscribe to the following tenets (as mentioned in this very entertaining History Channel video regarding Einstein's life), to keep the faith and motivation as one looks to implement the best systems and processes possible.
These are the tenets Einstein followed to readily answer the question "Why have faith that something is right?", as he did with his thought-led, then-unproven theories:
1. What is proposed must be simple and beautiful
2. Understand that most scientists (or the respective professionals that will be on the receiving end, depending on the situation) may not or will not initially see it that way
3. Disregard how long it will take to prove your theory; justifiably keep at it

The third point is key - the level of discipline needed to keep moving must be extraordinary, regardless of whether the world is with you or against you.
So, in terms of the initial baby steps the business world can take to make better decisions, I'd advise executives to start with the following: read the latest (March/April 2014) MIT Technology Review Business Report.
An article titled "Scientific Thinking in Business" summarizes how, by employing the scientific method - a set of defined principles meant to apply reason to problem solving (i.e., making a decision regarding a specific question) - businesses can benefit more from these age-old reasoning techniques than by relying solely on intuition. The article explains, very clearly, how relying solely on data can be equally dangerous: over-reliance on data can lead to the wrong results, whereas successfully employed creativity can help decision-makers ask the right questions and interpret the data correctly. Big data has proven to be a minefield for those who misinterpret it or impulsively derive quick results without applying scientific-method-led inquiry and experimentation to their analysis.
The article is realistic in that it acknowledges the scientific method is most useful when employed by academics, who live in a world where a decision based on carefully perused data is subject to careful examination by others - whereas in the political, business, policy and advertising worlds, strategic decisions have to be made rapidly, according to the company in question's strategy (which effectively makes following point #3 quite challenging).
Yet maintaining healthy skepticism, mixed with a livelier sense of intuition that spurs creativity, is completely necessary - and can be accomplished by relying on the scientific method: ask the right questions, test them with data, and repeat the process if need be. Blindly following the data, or ignoring it completely and substituting hunches for its validity, is, in the end, equally damaging.
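The ask-test-repeat loop just described can be sketched in a few lines of code. Everything in this sketch is hypothetical: the "question" is a guessed conversion rate, the "data" is a stand-in scoring function, and the refinement step is invented purely to illustrate the shape of the cycle - a real analysis would replace all three.

```python
# Hypothetical sketch of the scientific-method loop: ask a question,
# test it with data, refine the question, and repeat. All names and
# numbers are invented for illustration.

TRUE_RATE = 0.30  # the unknown quantity a real team would estimate from data

def experiment(guess):
    """Test the current question against the 'data': score support in [0, 1]."""
    return 1.0 - abs(guess - TRUE_RATE)

def refine(guess):
    """Ask a sharper question: nudge the guess toward better-supported values."""
    step = 0.05
    if experiment(guess + step) > experiment(guess - step):
        return guess + step
    return guess - step

def run_inquiry(guess, threshold=0.99, max_rounds=20):
    """Ask, test with data, and repeat until the evidence is strong enough."""
    for rounds in range(1, max_rounds + 1):
        if experiment(guess) >= threshold:
            return round(guess, 2), rounds
        guess = refine(guess)
    return round(guess, 2), max_rounds
```

Starting from a poor first question (e.g., `run_inquiry(0.10)`), the loop converges on the well-supported answer in a handful of rounds - the point being the cycle itself, not the toy arithmetic: neither the first hunch nor the raw data alone decides; the iteration between them does.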
So, after all is said and done, it must be acknowledged that humans and their decision-making abilities have been and will be studied under the biological, scientific, business, and overall system-based microscope. In the end, though, whether a brilliant idea is derived by experimental deduction or confidently through intuition, the success behind the idea's implementation rests mostly on what lies within the person proposing it, and not entirely on the idea itself.
Let us then look for better ways to make decisions by better preparing ourselves first, instead of blindly following someone else's instructions.