Saturday, November 28, 2009

In the Cave

I've been tracking some of the online discussions concerning the hacked emails between various scientists associated with the Climate Research Unit at the University of East Anglia (UK). This group is one of many contributing data and statistical analyses to climate studies, including those compiled by the IPCC (the Intergovernmental Panel on Climate Change), which has acted as a centralized source for climate science. The emails released by the hackers stretch from 1999 to this month and concern a number of issues. The hackers apparently thought they had the smoking gun that would prove climate change (or man-made-global-warming theory, as they like to call it) is all a hoax. The RealClimate site has a good summary of the issue. George Monbiot's blog entry has links to the emails as well as a droll take on the thinking behind the climate change deniers.



What has been really interesting, to me, is the heat generated in the blogosphere, especially when a news story or recent study pops up about climate change. The first comments address the science itself; then new voices start to emerge, heaping scorn on anyone who does not see that the hacked emails change everything (whether the emails have anything to do with the topic or not). Their arguments are interesting on their own, quite apart from the questions of what the hacked emails do or do not prove.


Most obvious, to me, is the assertion that the climate change "hoax" is driven by the huge amounts of money these scientists are making. If you really think that money is ipso facto the corrupting influence, it should be clear (a "no-brainer"?) where the money is, and it ain't with the academics. The largest corporate players on the planet all depend on business as usual (BAU), especially in regards to their control and continued market for fossil fuels, the main source of CO2 emissions. We know what kinds of profits are at stake for Exxon, Shell, BP, Chevron, et al., let alone Monsanto or GE or Boeing. We know what they've been spending on directed research, PR efforts, and lobbying in every government that matters. So, please, don't insult our intelligence with references to fat-cat scientists.


Perhaps not so obvious, but more telling, is the deniers' emphasis on temperature data and whether or not they show "warming." For one thing, most of the denier-bloggers show a deep distrust of the statistics without much attempt to illuminate what the numbers do tell us. Nor do they answer the solid statistical evidence from other sources.

But then, the case for climate change does not rest on temperatures alone. Atmospheric and sea temperatures were only meant to be indicators of coming changes. The models developed over the last 40 years were meant to give us a warning, to tell us we need to change before the really big, planet-wide systems, lagging some years behind observed temperatures, swing to a new balance point.


At this point, the proof is now distressingly clear -- rising sea levels, thinning ice at both poles, melting glaciers at all latitudes, disappearing permafrost throughout the far north: Alaska, Canada, Siberia. We can debate the temperature data and models all we want, but the planet-level swing has already happened. The game has changed. And though we could see it, we didn't see it. Why not?


This is where the image of Plato's Cave came to mind. Plato developed this allegory to make a point about the unreliability of sense-derived information, and the need to pursue the truth of the forms that lie behind what we see or hear.

To summarize, Plato (in the character of Socrates) painted a picture of a group of people who have grown up in a cave, restrained so they can only see a wall a little way in front of them. Behind them, out of their sight, is a big fire. And in between the fire and the watchers there is a constant stream of people, moving and talking, though the watchers can only hear the echoes that bounce off the wall and only see the shadows cast upon it. Plato's point was that the two-dimensional world of the shadows on the wall is, for the watchers, the sum total of reality. If by some chance one of the watchers could escape his restraints and turn to see the three-dimensional world behind him, and then return to tell the others what he had learned, he would be ridiculed, if not ostracized or killed.

I sometimes think that the world we think we live in is actually a highly edited version of the real world, for the simple reason that we can't really process all the information out there (it's chaos!), so we are necessarily selective. Humans need to break a huge amount of input into the signals that really matter, ignoring the static. But the more we learn about the world, the more we learn that it is all signal, because chaos is not random. Chaos is simply too much to comprehend, so we watch the shadow it projects on our walls and try to cope as best we can.

"Chaos" (Wiki) describes a dynamic (changing) system that is sensitive to initial conditions -- systems where a tiny change in an input can result in a huge systemic change. The textbook real-world example is the weather. Humans have been gathering weather data for decades. We have powerful computers. But the horizon for accurate forecasting is still hours, not weeks, let alone months. The complexity of the system makes it unlikely we can ever capture and input the right information, even if we had the computational power to crunch it. That's the challenge that comes from trying to predict the local effects of systemic processes by looking at individual inputs. Chaos -- the world --  just doesn't work like that.

Perhaps we are left with the realization that some people will deny a complex truth in favor of a simple answer that doesn't require close, sustained analysis. Change, sometimes inconvenient, is built into the system. We just have to figure out if change, especially when it's not convenient, is built into us.

Wednesday, November 25, 2009

The Efficiency Con

Some part of my income over the past couple of decades has been driven by the demand, on the part of large corporations, to train their staff in the principles of effective staff management. It has been my good fortune in the past seven or eight years to do this for a company whose history has been built on a foundation that in key ways differs from what many professionals accept as proven management principles. To understand the difference it helps to look at the careers of two key players in building the structure of scientific management: Frederick Winslow Taylor (Wiki) and W. Edwards Deming (Wiki).

Born in 1853, Taylor was a mechanical engineer who saw an opportunity, in the rapidly expanding industrial world, to improve efficiency by close studies of how work was actually done. He was the original "efficiency expert," armed with a stopwatch and a clipboard.  He became famous as the inventor of "scientific management." His basic approach was built on four principles:

  1. Replace rule-of-thumb work methods with methods based on a scientific study of the tasks.
  2. Scientifically select, train, and develop each employee rather than passively leaving them to train themselves.
  3. Provide "Detailed instruction and supervision of each worker in the performance of that worker's discrete task" (Montgomery 1997: 250).
  4. Divide work nearly equally between managers and workers, so that the managers apply scientific management principles to planning the work and the workers actually perform the tasks.
However, as indicated in a recent New Yorker article, in practice it worked rather differently. For one thing, there was a wide gap between workers and management, based largely on the understanding (or prejudice) that workers were incapable of understanding the scientific principles that would yield greater productivity. In general, this approach also exalted the role of managers, making them a kind of priesthood in the quest for efficiency and profits. This approach reflects a militaristic attitude that was common at the turn of the 20th century: it was the job of the superior classes to command (and control) the inferior ones.

Another key aspect of Taylor's approach is that it was his business. Even though it was presented as an academic approach, he was selling scientific management to clients, and the only meaningful evidence of success was a contract for his services. When confronted with deadlines and payrolls he often ended up using the same "rule of thumb" measurements he was supposed to be replacing. And there were times he didn't get paid.

Scientific management, or "Taylorism" as it was often called, has been impressively influential. It is reasonably accurate to say that the American industrial colossus was built on the principles of efficiency Taylor espoused. And it worked just fine until the American industrial model, rich in resources and markets, was challenged by a system developed in a war-ravaged society with scarce, expensive raw materials and huge barriers to success, guided by an American whose ideas differed in fundamental ways from Taylor's.

W. Edwards Deming was born in 1900.  His career was largely academic (PhD from Yale in math and physics) and then in government, specializing in statistical measurements of quality. The Second World War gave him an opportunity to test different approaches to industrial efficiency and quality control. In the immediate post-war period he worked in Japan in the occupation government to rebuild the industrial sector. The Japanese, fully aware that they needed to exploit any competitive advantage possible, worked to integrate Deming's teachings in various industries. From my point of view, the culmination of his work was his collaboration with the Toyota Motor Company in the 50s and 60s.

The key difference between Taylor's scientific management and Deming's approach, in my view, is who owns the work. As opposed to Taylor, Deming taught that every worker owns the work he (or she) does and has full responsibility for its quality. This includes not just doing the job right, but taking responsibility for the best way to do the job. At the core of this is a fundamental re-ordering of the role of manager.

Instead of "management by command and control," Deming set the standard for "management by objective." In working with Toyota, he was working with a production system that has already made huge strides in the same direction. Where Taylor looked at individual production steps and sought to streamline each one, driven by efficiency, Toyota had already come to understand the value of looking at the entire system, focused on minimizing waste (starting with overproduction).

The Toyota Production System (TPS) benefited from committed, consistent leadership that lived the principles they preached. Specifically, managers understood that their role was to give the workers the tools and support they need, not only to do their jobs, but to improve them. In some ways, it's the workers who tell the managers how to do the job. Deming provided an overall structure, and statistical evidence, that allowed Toyota to compete with the big boys.

Toyota was clearly successful. Virtually every other automaker has tried to adopt their approach, which has been a great boon for management consultants. And though they may be trying to replicate the results generated by Deming, these consultants seem to be channeling Taylor and scientific management.

To me, the key thing is that the culture of management by command and control is deeply ingrained in virtually every corporate culture in the world. That is the default position: "to justify the title and salary of a manager, it's my job to know more than the workers do and to show it by telling them what to do and how to do it." And because that's the expectation going in, the average management consultant is going to have to align with that if he (or she) wants to keep working. Or fall back on talking about "tools" or "best practices" or "targets": all anathema to someone immersed in Deming and TPS.

As the New Yorker article points out, even the acolytes of Taylor came to realize it could not deliver what it promised. But something about the dream of "efficiency" continues to keep Taylorism alive and well. And not just in corporate culture.

Tuesday, November 24, 2009

Why Free Enterprise Isn't

After thinking about Nick Hanauer's TED presentation the last few days I went back to some of my older blog posts. The following was first posted in October of 2009. It looks at the idea that successful entrepreneurs leverage a system that rewards them for driving prices down by subsidizing the cost of the goods and services they sell to us -- McDonald's is able to sell cheap burgers because of farm and tax subsidies, in an over-simplified but meaningful summary.


In other words -- no free lunch. McDonald's, of course, doesn't need a strong middle class, at least until falling revenues make it impossible to keep the subsidies flowing.


Among the inescapable lessons drawn (by me) from the current economic tsunami are:
  1.  Globalization is absolutely necessary: if we want to keep every ambitious or desperate third-worlder from coming to where the jobs are (here), we have to send at least some of the jobs to where they are. 
  2. Globalization means that the American working middle class is history: it was their jobs that got sent wherever.

This is not happenstance. There is a key human, behavioral imperative in operation. It is a trait, an instinct perhaps, which has allowed us to compete so successfully with the other species in our environment. We (you and me) would not be here without it. And it could easily be the imperative that causes the collapse of our global society.

In virtually every human culture we know about, humans demonstrate competitive behaviors, played out in the acquisition of money and/or power. Not every individual is driven by this, certainly, but most cultures reward the minority of the most competitive personalities with large amounts of control over everyone else. You can argue that ethnic and political cultures gained strength to the degree that they formalized the ascent to power (to prevent endless battles) and balanced the interests of the power elite (who may be less competitive than their founders) with the interests of the plebes (some of whom may be more).

If that balance is lost, the natural urge to dominate runs unbridled, even manic. The American economy has gone through several cycles of this excited, almost pathological activity in its history. We have been shrugging it off as a necessary aspect of an efficient free market. Sounds good. But the definition of "free" is what is at the center of this.

It was a blog post by Robert Singer on disinfo.com that got me thinking about what a free market really means (at least in this version of capitalist democracy), with the statement:
"You don’t eat the hamburger at McDonalds because it’s a dollar: It’s a dollar to get you to eat it."
Singer's point is that Ray Kroc could sell a hamburger cheap because the grain (which made the bun and fed the cattle ground into the patty) was subsidized -- farmers could sell it for less than it cost to grow. That was because somebody -- taxpayers -- paid them a subsidy that made up the difference.

Now, Ray Kroc could have sold the burger for a higher price and gotten rich that way. But the genius of Ray Kroc is that the burger he sold bought him both your dollar and a powerful piece of the market. This was about Ray versus everybody else slingin' burgers. Which was mostly diners and cafes run by moms and pops and small entrepreneurs -- gone. The dollars he got from us let him buy a lot of real estate -- the asset value of McDonald's is primarily in the land the stores sit on, not the number of burgers served. The burgers pay the mortgage. Or perhaps we all do.

Singer sees this as an example of the "downward manipulation of prices," a deliberate strategy supported by the Federal Reserve and big finance from its inception:
"Butler’s investigation has identified JP Morgan Chase, one of the founding members of the Federal Reserve, as the prime suspect, in the “ongoing intentional, not accidental” great crime of keeping the price of commodities low so the middle class can afford the American dream, a nightmare for the planet."
It's the same strategy employed by Standard Oil (and all its offspring) and the agribusiness monoliths, in terms of domestic policy and building reliance on chemical fertilizers and engineered seed. And it's the same strategy, now leveraging global IMF inequities, that WalMart employs: low-paid, off-shore labor supplies the goods that will purchase our domestic dollars. A subsidy here, a controlled wage there, every little wrinkle lets me buy more customers.

But for now, we should know that every piece of food, virtually every consumer item, is paid for in unseen ways -- by manipulated commodity pricing, and by the use of virtual slave labor to work our farms and factories.

The lesson: if you want to win, you've got to own the market, and one way to do that is to decouple everything from value, make it only about price. It's anything but a "free market," unless you mean that by becoming the dominant player you are now free of pesky rivals, other than those that play by your rules. And the players are now global, concentrating huge capital wealth among a tiny fraction of the world's population. Local communities, national societies and cultural ties mean virtually nothing to them. If you buy the competitive thing, at that level it may only be about dueling with the other big players, mano a mano.

Along the way, in our particular culture, we have undervalued skill and knowledge, and have created a glutted workforce that will take slave wages rather than no work at all. And Wall Street continues to reward those companies that add to the unemployment rolls, because workers are simply costs, and costs must always be cut.

There was a time when slash-and-burn agriculture was probably key to human survival. It provided sunlight for earth that was rich with the ash of the burned plants. But after a year or two the fertilizer was consumed. It took decades for the biomass on that patch to build to the same level of nutrients. Not a big problem in a big forest with few people.

As the forest fills up with hungry people, slash and burn is probably not a good strategy, except it's always worked before. We know how to do it. It's someone else's problem.

At some point, we may change our short-term tactics to match long-term imperatives. Or the winners will just keep fighting over the remaining forest. For that kind of social disconnect, think London in 1870: toffs in the clubs, corpses in the East End.  I've seen articles/ads on Newsmax.com promising to give you the secret to being a "Robber Baron" in this financial crisis. Something to think about.

Sunday, November 1, 2009

To get the shot, or not

Anyone with young children in their extended family has probably come up against the childhood vaccination question, and the debate about causes of autism. And we are all, parents or not, hearing the cases for and against the H1N1 vaccines (and flu vaccines in general).


Flying back this week from New York, and a visit with my grandson who is recovering from flu (assumed by the health professionals involved to be H1N1, though no tests were done), I read Amy Wallace's article in Wired. Her strongly worded defense of the science behind childhood vaccinations for mumps, measles, rubella, pertussis, etc. was already known to me from a Twitter storm involving various sides in the debate.


The next day I read an article in The Atlantic that called into doubt the efficacy of flu vaccines, for both seasonal flu and H1N1. The article focused on a few serious scientists who are questioning widely accepted claims for vaccines, at the risk of the opprobrium of other health professionals.


So, who are we supposed to trust? And, most importantly, what are we supposed to do if we want to improve our odds in a world that seems to threaten our fragile health on so many levels? 


Looking at the questions raised by the Wired article, it seems telling to me that the opponents of childhood vaccination often get lost in personal attacks on one or another of the scientists arguing in favor of vaccinations. Such ad hominem attacks are often used by people who may be so convinced of their position that they can overlook the facts in favor of finding a villain to blame. In this case there may indeed be a certifiable villain in the mix: big pharma.


It's a bit like OJ. The LA cops from the time of Bill Parker had so thoroughly lost the trust of black Angelenos that there was no way a jury of peers was going to believe anybody representing LA, let alone a nut case like Mark Fuhrman, who was out to get a black man, any black man. By the same kind of reasoning, because vaccines come from big pharma, one of the least credible industries in the US, there's no way I would believe anything they claim.


Except. Sometimes the bad guys are not the story. For one thing, big pharma doesn't really make much from vaccines. Only a few hundred million bucks a year. What they're focused on is the big payoffs, the one-or-two-a-day-for-the-rest-of-your-life drugs like Lipitor or Cialis. The others, like childhood vaccines, are just chump change.


The most incendiary charge against the childhood vaccines is the issue of autism. And this could easily be a case of post hoc ergo propter hoc, the term for a classical logical fallacy: just because thing A happens and then thing B happens does not prove A causes B. Autism has become our syndrome du jour, a diagnosis that seems to be growing more common. Like many, I think it may be just a case of different diagnostics applied to a wide range of human behaviors, especially a range of behaviors that is understood to be a continuum anyway.


But the post hoc argument also calls into question most statistical analyses, which only point to correlation, not causation. It was Benjamin Disraeli (and later Mark Twain) who said "there are three kinds of lies: lies, damned lies, and statistics." Statistics can certainly be slanted, but a close examination should be able to discover the bias. And that is the case with flu vaccines.


The argument for getting an annual flu vaccination, it seems, may be based on a conflation of proximity (like post hoc) and causation. The statistical evidence says that people who get vaccinated are half as likely to die. However, a closer look also says that people who get vaccinated are more likely, for a number of socio-economic reasons, to be healthier in the first place. Where statistics indicate a correlation, scientific testing should support or question it. However, the case for the efficacy of flu vaccines has never actually been tested, with controls and placebos and the whole scientific method.
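To make the confounding argument concrete, here is a toy simulation of my own (the numbers are invented, not taken from the Atlantic piece). The "vaccine" in this model has zero effect by design; only underlying health determines who dies, yet because healthier people are also more likely to get the shot, the vaccinated group still shows roughly half the death rate.

```python
import random

random.seed(42)

def simulate(n=100_000):
    # Confounder: overall health drives both vaccination and survival.
    # The shot itself changes nothing in this model.
    deaths = {True: 0, False: 0}   # keyed by "was vaccinated?"
    counts = {True: 0, False: 0}
    for _ in range(n):
        healthy = random.random() < 0.5
        vaccinated = random.random() < (0.8 if healthy else 0.3)
        death_risk = 0.01 if healthy else 0.04   # depends only on health
        died = random.random() < death_risk
        counts[vaccinated] += 1
        deaths[vaccinated] += died
    print("death rate, vaccinated:  ", deaths[True] / counts[True])
    print("death rate, unvaccinated:", deaths[False] / counts[False])

simulate()
```

The vaccinated group comes out looking markedly safer even though the shot does nothing here, which is exactly why a correlation in observational data needs a controlled test before it can be read as causation.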


I believe our challenge is to see the world as it really is, not just as conventional wisdom tells us the world should be. Which also means that we should not be overly trusting of "experts." Add to that the deep distrust (or misunderstanding) of science and the scientific method that seems to be a part of our culture, and we are naturally set up to distrust those who claim any kind of truth that doesn't align with our beliefs.


My point is that our beliefs must be continuously tested -- and the scientific method is the best model for a way to discern the world as it really is by rigorous examination. It is highly unlikely that any truth is THE truth. But a single isolated truth can be more significant than a passionate belief. 


It's all about the testing, and the willing suspension of belief.