Fairhaven, The River


Analysis of Heartbleed and IHE ATNA effectiveness

 


Table of Contents

1. Nature of the Attack
2. The Risks
3. Responses
4. Commentary

1. Nature of the Attack

Heartbleed is a high profile flaw in, and attack on, the OpenSSL implementation of TLS. This post analyzes how well the IHE ATNA rules mitigated the flaw, what should change, and what future sensitivities to expect. The flaw permitted a malicious counterparty on an established connection to read the in-memory contents of the client or server. That leak could in turn expose current, past, and perhaps future encrypted traffic.
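
For readers who want the flavor of the flaw: the TLS heartbeat extension echoes a payload back to its sender, and the vulnerable code trusted the payload length claimed by the peer rather than the number of bytes actually received. The sketch below shows that bug class in Python. It is a schematic, not OpenSSL's actual C code; the names and the simulated adjacent memory are invented for illustration.

    # Schematic of the Heartbleed bug class.  NOT OpenSSL's code; names
    # and the simulated adjacent memory are invented for illustration.

    def heartbeat_reply(request: bytes, adjacent_memory: bytes) -> bytes:
        claimed_len = int.from_bytes(request[:2], "big")  # peer-supplied length
        payload = request[2:]
        # Vulnerable: echoes claimed_len bytes without checking that the
        # peer actually sent that many.  Whatever sits past the payload
        # (keys, passwords, documents) leaks back to the peer.
        return (payload + adjacent_memory)[:claimed_len]

    def heartbeat_reply_fixed(request: bytes, adjacent_memory: bytes) -> bytes:
        claimed_len = int.from_bytes(request[:2], "big")
        payload = request[2:]
        if claimed_len > len(payload):  # the missing bounds check
            return b""                  # silently drop malformed heartbeats
        return payload[:claimed_len]

    # A malicious peer sends a 3-byte payload but claims 64 KB:
    evil = (65535).to_bytes(2, "big") + b"hi!"
    print(heartbeat_reply(evil, b"...secret key material..."))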

The following analysis applies only to systems that use or used the vulnerable versions of software.  Systems that used different software, or non-vulnerable versions, will not be directly affected by this bug.

The assets that ATNA needs to protect are:

  • The encrypted traffic. This could be past, present, and future traffic; most especially, it is the documents being exchanged.
  • Private authentication data. This could be private certificates, passwords, etc.

Both may be exposed while memory-resident in the server, so both are at risk. The probability of exposure depends upon both static hardware and software characteristics and on the dynamic history of activity. It is data that was in memory that is at risk, and that data could expose either asset.
 
Significant correction: examination of the attack/test code shows that OpenSSL does not require an initial negotiation before turning on heartbeat support. So the IHE ATNA bi-directional authentication does not provide protection.

2. The Risks

The risks to ATNA protected systems are the same as for any server. The analysis below is wrong; it was based on the incorrect belief that an initial TLS negotiation had to succeed.

ATNA requires the use of bi-directional authentication. So ATNA connections are exposed to this attack when one or both sides of a connection are malicious. The organization need not be malicious, but the secure node must have been penetrated and made malicious.

Other, non-ATNA, connections to the same server also expose ATNA connections to this attack. ATNA requires "appropriate" security measures for secure nodes and for secure applications. This is open to interpretation by those deploying systems. If the same secure node was used for both ATNA connections and non-ATNA connections, then those other connections may have exposed private data.

Most public servers use only server authentication. They do not authenticate the client, and will accept connections from any client. Given the number of potential malicious clients, it is a near certainty that a public server was penetrated by Heartbleed.

A server that requires bi-directional authentication for all connections has a lower probability of penetration. It drops to the probability that one of the known systems is penetrated and malicious. ATNA assumes that validation of acceptability for authentication is based on a review of security practices, so these systems are better protected than the typical system.

This makes the odds of penetration by Heartbleed lower. For a system with only a few well protected partners, it is quite low. For a system supporting a large number of partners, it is higher, but probably still much lower than for a public server.
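
As a toy illustration of this probability argument (my own, not from any IHE document; the per-partner compromise probability is invented), the exposure odds grow with the number of authenticated partners:

    # Toy model: each authenticated partner independently has probability
    # p of itself being compromised.  The value of p is invented purely
    # for illustration.
    p = 0.01
    for n in (3, 30, 300):
        print(f"{n:>3} partners: {1 - (1 - p) ** n:.1%} chance at least one is malicious")
    #   3 partners: 3.0%
    #  30 partners: 26.0%
    # 300 partners: 95.1%

Even at hundreds of partners the attack surface stays bounded and auditable; a public server faces an effectively unbounded number of potential attackers.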

Significant risk factors external to ATNA are:

  • Was the system used exclusively for bi-directionally authenticated transactions? Even one open public https port could expose the system to attack. ATNA services on a shared server were probably exposed.  (Details of memory access are very implementation dependent, so exposure is hard to predict.)
  • Are other connections protected by other means from general public access? VPNs and VLANs are common alternative protections.
  • Was traffic recordable for later attack? This is highly dependent upon implementation details. It changes the scope of the assets at risk in the event that the system was penetrated.

3. Responses

Some responses are obvious:

  • Update to remove the bug. (Don’t waste time reading this. Do that now.) Get patches distributed and installed.
  • Consider replacing passwords, revoking and replacing certificates. The urgency of this is very much dependent upon the number and type of potential communication partners. A system with public access was almost certainly penetrated and information that was available in memory is very much at risk. A system with just a few known well protected partners is at much lower risk.
  • Consider negotiating Forward Secrecy. This was always permitted by ATNA as part of TLS negotiation, but support is not required. Some systems were doing this already, because most of the libraries that support ATNA also support Forward Secrecy; if offered and supported, it is used. Forward Secrecy did not protect against this bug. It just reduces the amount of past and future network traffic that is exposed. (See the sketch after this list.)
  • Consider partitioning systems so that public facing systems are fully separated (at the hardware level) from internal facing systems. In this context, public facing means systems that accept connections from any client. Many organizations use VPNs, TLS, SSH, etc. with configurations that require bi-directional authentication at all times. That is not public facing. Those systems deny network connections to unknown clients. (This authentication must be at the connection level, not as a later password or token interaction.)
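
As a concrete sketch of what negotiating Forward Secrecy looks like, here is a server-side TLS configuration using Python's ssl module as a stand-in for whatever TLS stack an ATNA node actually uses. The file paths are hypothetical; real deployments should follow their own library's documentation.

    import ssl

    # Sketch only: Python's ssl module standing in for an ATNA node's
    # real TLS stack.  Certificate/key paths are hypothetical.
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.verify_mode = ssl.CERT_REQUIRED                  # bi-directional auth
    ctx.load_cert_chain("node-cert.pem", "node-key.pem")
    ctx.load_verify_locations("trusted-partners.pem")
    # Restrict key exchange to ephemeral (EC)DHE suites, so session keys
    # are never derivable from the long-lived private key.  A later key
    # compromise then cannot decrypt recorded past traffic.
    ctx.set_ciphers("ECDHE+AESGCM:DHE+AESGCM")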

Internal facing does not ensure safety. Depending upon the nature of your internal systems the risk of internal malicious systems ranges from low to high. There are many ways that malicious software gets into internal systems. The difference between internal and public facing is probability. It is certain that public facing systems are subject to constant attack from thousands of systems using a wide variety of methods. Internal facing systems are subject to attack from a much smaller number of systems using a smaller variety of methods.

4. Commentary

Some thoughts on IHE response:

  • Perhaps IHE should explicitly call out the permissibility of Forward Secrecy. I suspect that many readers don’t realize that it is available as an option, since it is not listed. ATNA only lists the minimum necessary, not all the possible options.
  • Perhaps make some stronger statement, such as identifying Forward Secrecy as an IHE option. This doesn’t protect against penetration, but it does reduce the exposure from a penetration.

Some thoughts on public security perceptions:

  • Five years ago it was hard to get the public interested in TLS protection, and tools like "HTTPS Everywhere" were limited to the techno-geeks. Now, a widespread flaw in TLS is major news. That’s quite a change for five years.
  • I expect some changes to authentication technologies:

    • New approaches that incorporate bi-directional authentication in ordinary consumer transactions will spread. Right now they are very rare and often poorly implemented. Banks are starting to use them, and corporate VPNs are pushing the technology and education into the general public. The effect of Heartbleed would have been somewhat smaller if these were in use. Instead of a near certainty of penetration for all public facing servers, it would be the likelihood that one of the server’s customer/client systems had been penetrated. That would still be a major penetration, but it is a smaller risk.
    • One time password and related ID systems will spread. I rather like the Yubikey system. There are various others like it with different hardware and software requirements. They vary quite a bit at the moment, ranging from expensive smartcard ID systems like that used in some US government systems, to very simple systems like the Yubikey.
    • I expect bio-metric IDs to flower and die for public authentication. The problem is that bio-metric IDs can be stolen. (I’ve done device drivers for fingerprint scanners. I know how to steal and copy a fingerprint. It’s harder than stealing and copying a certificate, but it can be done. Unlike a certificate, you can’t revoke a fingerprint.)

April 13, 2014 in Current Affairs, Healthcare, Standards

Water and Desalination

A recent conference on water at MIT brought some interesting concepts around water in the Middle East into better focus.

  1. The cost of transportation cannot be ignored.  For example, the nominal cost of desalinated water on the Israeli coast is $0.60/ton.  The cost of transporting water from the central hills to Gaza is $0.40/ton.  So there is no point in Gaza paying more than $0.20/ton for water from the hills.  The speaker argues that this reality makes water wars more hype than reality.  People may use water as an excuse, but most of the situations don't involve enough money to justify a war.  The available capacity in the hills is 100 million tons/yr, so after all the emotional rhetoric is done, arguments about that water are arguments over $20 million.  That's all it's worth.  (The arithmetic is spelled out after this list.)
  2. Another speaker pointed out that the nominal number was from 2010.  The actual cost for desalinated water is presently $0.50, and it's falling.  So the arguments are over even smaller amounts of money.
  3. Transportation limits are also important.  Amman gets its water from the Jordan river.  Alongside the river, water is about $0.15/ton.  In Amman it's as much as $2.00/ton during dry periods.  It climbs very rapidly in cost as demand increases.  This is because the pipelines between the Jordan river valley and Amman are usually running at near full capacity.  The alternatives are limited, so during scarcity the price climbs dramatically.  That high price mostly serves to eliminate forms of water use.
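
A quick check of the speaker's arithmetic, using only the numbers quoted above:

    # Back-of-the-envelope check of the talk's water-economics numbers.
    desal_coast = 0.60    # $/ton, desalinated water on the Israeli coast
    transport = 0.40      # $/ton, central hills -> Gaza
    ceiling = desal_coast - transport    # most Gaza should pay at the source
    capacity = 100e6      # tons/yr available in the hills
    print(f"hill water is worth at most ${ceiling:.2f}/ton")
    print(f"total value in dispute: ${ceiling * capacity / 1e6:.0f} million/yr")
    # -> $0.20/ton and $20 million/yr: real money, not war-sized money.
    # At the more recent $0.50/ton desalination cost, the ceiling falls
    # to $0.10/ton and the dispute to $10 million/yr.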

Equally interesting, after about 25 years of discussion, Israel, Jordan, and Palestine have finally reached an agreement on the Red Sea - Dead Sea project.  It's only an agreement to proceed past the preliminary stage, but there is significant economic pressure to keep moving.  The concept is simple.  Sea water will be taken from the Red Sea and used to generate hydro power (using the 400m drop down to the Dead Sea), which is used to desalinate the water.  The resulting waste brine is dumped into the Dead Sea.  They've agreed to examine two plans, both sharing the same locations for hydro-power, desalination, canals, and pipelines.  One plan takes full capacity sea water from the start, dumping the excess into the Dead Sea.  The other plan increases the water flow as desalination plants come on line.  It dumps only concentrated brine into the Dead Sea.

The project includes a pipeline to Amman, which deals with their skyrocketing water costs during high demand.  It's less clear how it would affect the water extraction in the Jordan river valley.

Update: I did the arithmetic on the energy requirements.  The potential energy of one ton of water with a 400m drop is about 1.2 kWh.  The chemical energy of the salt mixed in 1 ton of sea water is just under 1 kWh.  Hydro energy conversion is not 100% efficient, and desalination is not 100% energy efficient, so there isn't enough potential energy to do full desalination without outside energy sources.  Further, much of the fresh water will be left at a high elevation so that it can be sent to Jordan.  The brine does go all the way down, allowing full energy recovery, and extracting a lower percentage of fresh water per ton of sea water reduces the separation energy needed, but that's probably not enough for the system to be fully self-powering.  This scheme probably needs further outside power sources.
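
Here is that arithmetic spelled out.  The only assumptions are g = 9.81 m/s^2 and the size of the drop; the plan quotes 400 m, and the Dead Sea surface itself sits roughly 430 m below sea level.

    # Potential energy of one metric ton of water over the drop.
    g = 9.81                   # m/s^2
    mass = 1000.0              # kg, one ton of water
    joules_per_kwh = 3.6e6
    for drop_m in (400, 430):
        kwh = mass * g * drop_m / joules_per_kwh
        print(f"{drop_m} m drop: {kwh:.2f} kWh/ton")
    # 400 m -> 1.09 kWh/ton; 430 m -> 1.17 kWh/ton.  Against roughly
    # 1 kWh/ton of separation energy, and with conversion losses on both
    # the hydro and desalination sides, the drop alone cannot power full
    # desalination.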

January 29, 2014 in Current Affairs, Eco-policy

BYOD Security rears an ugly head

The Wall Street Journal has noticed the growing issues around securing data on personal devices.  The BYOD movement has some serious downsides. Companies are implementing phone wipes (reset to factory default) for employee and contractor phones as part of their BYOD programs.  When contract/employment ends the phone is wiped.  Very few people have backups for their phone, and very few people notice or recall the terms of the BYOD program.  Lots of personal information is getting destroyed as a result.

I practice maximum corporate/personal separation.  I'm now fanatic to the point of bringing a personal and a corporate laptop while traveling.  I've been through one hostile corporate takeover where I saw what can happen to company laptops.  The separation of function has only been directly beneficial once.  It made recovery from a failed disk drive while traveling easy.  Corporate could recover their stuff onto a replacement laptop that reached me the next day.  The rest of the time it's a nuisance.  But a data wipe is too traumatic to risk.

January 22, 2014 in Current Affairs, Travel, Web/Tech

Book Review, Due Diligence

 


Table of Contents

1. History
2. Service and Operations
3. Regulatory breakdowns
4. New Technology
5. Summary

Due Diligence, by David Roodman, is an examination of the results of various microfinance efforts. Microfinance is a larger category that includes micro-lending. It also includes micro-insurance, micro-saving, etc. Roodman examines the widespread claim that micro-finance reduces poverty.

His results are:

  • There is not sufficient evidence to claim that microfinance reduces or increases poverty levels. There are myriad anecdotes about the subject, but with millions of people involved it’s easy to find great anecdotes. When economic statistics are examined, the data does not show a reduction in poverty.
  • Microfinance does meet a significant need of the poor. If instead of asking "does it reduce poverty" you ask "does it provide a financial service that is valued by the poor", the answer is yes. There are important regulatory requirements for microfinance to succeed. Without these, it will break down. With these, it is a valuable service for the poor.
  • Microfinance can be one part of building an economic infrastructure for development and poverty reduction. On its own, it is insufficient. It is mostly an urban solution, due to the operational realities, and it needs other developmental elements to make a significant change.

The book is well written, easy to read, and appears to be very carefully sourced.

1. History

Microfinance has a much longer history than I realized. Jonathan Swift, writer of Gulliver’s Travels, was a major organizer of microfinance services to the poor in Ireland in the 1700’s. Swift is credited with starting the Irish Loan Funds (pdf), a major microlending operation that had widespread use and lasted about 200 years. Its history is also informative regarding flaws and regulatory requirements. The Prudential Insurance company began as a micro-insurance operation in the US during the 1800’s.

Micro-finance also exists in many more times and locations than the current press excitement would indicate. It is found all around the world. This means that there are enough examples to provide good statistical analysis and a variety of implementation variations.

2. Service and Operations

The root of micro-finance is two-fold:

  • The poor have the same needs for financial services as the rich and middle class. They need the same kinds of saving, borrowing, and insurance as the middle class and rich.
  • The transaction sizes of the poor are tiny by comparison. Micro-finance radically restructures the transactions so that the administrative overhead is correspondingly small. Examples of what this means are scattered through the book.

Social factors, cash flow factors, and resource differences make micro-lending the easiest to simplify. A micro-loan can be structured and standardized to make the administrative cost tiny. For example, a loan can be standardized to a fixed amount, like ten dollars. This is the only available loan amount. A pre-printed index card with 40 boxes is filled out with the borrower’s name, and the borrower is given nine dollars. Then every week, the lender visits the borrower, gets 25 cents, and checks one box. When all the boxes are checked, the loan is paid.

Note all the cost reductions. The total infrastructure requirement is a pre-printed piece of paper and an indelible pen. The infrastructure is auditable. There is no time spent negotiating amounts. The labor time needed for the lender remains significant. They must pay someone to visit each borrower every week. But each individual transaction time is very small, since it’s just collect a coin and check a box. In an urban environment, the travel time can be kept small.

When the interest per transaction is one cent, the collection cost must be kept to around 100 millicents (a tenth of a cent). This is done by employing staff who are themselves paid poverty-level wages. When the tracking is simple check boxes, a minimally educated person can do the job.
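
To put a number on the borrower's cost (my own illustration, not a figure from the book): receiving $9 and repaying $0.25 a week for 40 weeks works out to roughly half a percent per week.

    # Implied interest rate of the index-card loan described above:
    # receive $9 now, repay $0.25/week for 40 weeks.  Illustration only.
    principal, payment, weeks = 9.00, 0.25, 40

    def npv(rate):  # net present value of the payments at a weekly rate
        return sum(payment / (1 + rate) ** t for t in range(1, weeks + 1)) - principal

    lo, hi = 0.0, 1.0   # bisect for the internal rate of return
    for _ in range(60):
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if npv(mid) > 0 else (lo, mid)

    weekly = (lo + hi) / 2
    print(f"weekly rate ~{weekly:.2%}, annualized ~{(1 + weekly) ** 52 - 1:.0%}")
    # ~0.53%/week, ~32%/yr -- steep by rich-country standards, and almost
    # all of it consumed by the weekly collection labor.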

There are many variations and adjustments to this approach for different cultures and economic environments.

3. Regulatory breakdowns

Micro-lending has broken down and become severely abusive in some countries. Borrowers have been hounded into suicide or flight by abusive lenders. Local lenders have been known to break into borrowers’ homes and steal as needed to cover missed payments. Lenders have used violent enforcers to coerce borrowing.

In some countries the abuses have led to prohibitions on micro-lending. In others, there have been government takeovers, management replacement, etc. This is sometimes combined with intrusive local politics and corruption, where it’s unclear where the real abuse and corruption resides.

There is a strong pattern observed here:

  • The worst abuses have been in countries where more than 50% of the lending funds are from outside the country. The correlation between percentage outside funds and abuse is strong. This has significant implications for foreign aid policies that try to encourage micro-lending.
  • Appropriate regulation is crucial. Traditional banking regulations are far too expensive for the tiny transactions needed by micro-finance. But, absence of proper financial regulation has led to frauds, bankruptcy, and many other abuses. Designing appropriate regulatory structures requires considerable creativity.

4. New Technology

There is lots of press about new technology, which is too new to be analyzed in this book. Roodman does mention it, and early indications are that the changes will be important. Some of the systems mentioned are:

  1. M-PESA is radically revising all sorts of financial transactions in Kenya. This goes far beyond traditional micro-finance. The combination of cell phone transactions, micro-banking relationships between local shops and traditional banks, and regulatory changes has allowed the cell phone to be used to transfer money, pay bills, save money, etc. This has a bigger impact on the upper poor and lower middle class, but the transactional flexibility does flow through to the really poor who lack phones.
  2. Brazil has used satellite links to allow local post offices and corner stores to act as "correspondent banks" and provide services to thousands of small towns.
  3. South Africa and Namibia are using Net1 and an advanced smart card system for financial transactions. It is designed to operate despite erratic power and intermittent communications connections. This is a complex hybrid of dispersed and centralized control, so that financial integrity is preserved while providing access to small and remote locations.

5. Summary

The overall conclusion is that the various forms of micro-finance do meet important financial needs for the poor. They do not eliminate poverty, but that’s not a reason to deny the poor these services. These services do significantly improve their quality of life.

The proper regulation of micro-finance is a challenge, but it is needed. The long established banking regulations are much too burdensome for the tiny transactions involved, so substantial changes are needed. This challenge is one that can be met if regulators, politicians, and lenders can work together.

The impact of cheap communications (the mobile revolution) was not covered. It’s still very early. The impact can be large if the costs can be reduced to a level competitive with the highly cost optimized paper systems that are traditional.

January 01, 2014 in Current Affairs, Politics

Culture Conflict, FDA vs 23andMe

 

Table of Contents

1. The medical device culture
2. 23andMe specifics
3. 23andMe competitors
4. How enthusiasm becomes a lie

The FDA conflict with 23andMe reflects a cultural clash that is important for new players who want to create new kinds of medical devices. This is a clash between the Internet culture and the FDA medical device culture. (Both of those terms are shorthand for larger groups. The "Internet culture" behavior can be traced back to the 60’s and mainframe computers, but it is best known today by Internet behaviors.)

  1. Hype, deception, and lies are normal in the Internet culture. Nobody expects advertising to be truthful. Dilbert cartoons point out the need to find the engineers and talk to them, because only the engineers will tell the truth.
  2. Medical device claims are expected to be accurate, truthful, and based on scientific evidence. The non-functional claims (like price and delivery) may be exaggerated, but the functional claims can be trusted.

23andMe ran into deep legal problems because they acted like an Internet company. They made claims that could not be backed up with scientific evidence. This is not a big surprise. They came from the Silicon Valley Internet culture. They thought that the argument "Well, nobody could possibly take that claim to be literal truth" was an excuse.

In the long term I expect the Internet culture will be forced to adapt to the medical device culture. The medical device culture has the law on its side, and it can point to past abuses by fraudulent device makers. The experience is that desperate patients will believe the most absurd claims, and that it’s only by harsh penalties for hype and lies that these frauds were stamped out.

The FDA was created in response to the fraudulent sale of snake oil solutions, monkey gland cures, and a huge variety of useless medical devices. The public will believe the most ridiculous of claims. The FDA rules are a form of substantial fraud prevention. If you can prove your claim, you can get approval to sell your device. If not, you cannot sell or market it.

1. The medical device culture

The FDA regulators have a split personality, combined with police powers that often surprise Internet newcomers.

  1. The FDA wants new devices to succeed. The FDA staff is rewarded for helping newcomers. They provide extensive education and advice. They will not do the device development work for you, but they will explain the rules and motivations in great detail if a newcomer is willing to listen and learn.
  2. The FDA staff will not take the slightest risk of approving something without solid scientific evidence. They know that their career will be destroyed if they ever make a mistake and approve a significantly unsafe device. They want a complete defense of scientific evidence to protect their career in case an unsafe device escapes detection.

This leads to a schizophrenic behavior at times, with staff both encouraging newcomers and demanding extraordinary documentation and justification for claims.

For new kinds of devices this poses a serious problem. The FDA cannot simply re-use old procedures and criteria from known devices. Those old tests don’t apply to the new kind of device. So the staff face an internal conflict between the desire to encourage new devices and the desire to avoid any risk of failure. Their response is to dump the problem on the innovator and say "provide proof".

As a result, the regulatory discussion and burden of dealing with new kinds of devices is a slow and difficult process. The Internet approach of "try it, fail, fix it, fail, fix it again, fail again, until it works" is not acceptable to the FDA or the law. The FDA does accept that approach in the context of organized clinical trials, but the Internet culture has grown up using the general public as its trial subjects. Recruiting subjects and performing scientifically valid clinical trials is new to them.

An example of the kind of documentation needed can be found in the AHA and ACC recommendations (pdf). This is the backup documentation for a clinical practice recommendation, not for a medical device, but it’s publicly available; most medical device filings are confidential. It identifies the strength of different kinds of backup data, the risks and alternatives examined, and, for each individual claim, the supporting data. For example, the claim that a 38.7 mg/dL reduction in LDL levels will result in a risk reduction of 27% is justified by evidence identified for that claim (see page 65, item 31): 14 different clinical trials.

The Internet experience does not prepare people to back up their hype and exaggerations with this level of scientific evidence. The FDA can be very tolerant and helpful for newcomers who are learning to do this, but they are very rigid. No device will be approved until:

  1. the scientific evidence is in place, or
  2. the claims are withdrawn and removed from all advertising, marketing material, and public or private discussion. These unsubstantiated claims must never be mentioned by any means.

2. 23andMe specifics

Almost all of the details surrounding 23andMe are confidential, so I can only note scattered details that have been made public.

  • 23andMe stopped responding to FDA requests. I can understand being frustrated by the level of detailed backup that the FDA requests. I’ve done medical device approvals. But, the FDA can be persuaded if you do a thorough scientific job. Refusing to talk switches the FDA from helpful to hostile. The FDA has police powers just like a uniformed policeman. Making the relationship hostile is never a good idea.
  • 23andMe included claims of performing a BRCA1 test. There are existing BRCA1 diagnostic tests approved by the FDA. These existing tests identify about 700 different variations of BRCA1. The 23andMe marker tests could only identify a few classes of BRCA1. The FDA will normally apply the standards of approved tests to new devices. To get a lower quality test approved requires scientific evidence that the difference between the many variations is not clinically significant, or that the less accurate test has some significant value. That would require clinical trials. 23andMe should have withdrawn all these claims, but didn’t.
  • 23andMe made a variety of claims for "increased risk of X", etc. These claims need the same kind of risk definition and scientific backup as the LDL claim example above. The FDA documents indicate that there was no such evidence.
  • 23andMe was given several years to resolve the lack of evidence. The FDA noted that they had not even begun preparations for verifying some of their claims, despite having had several years of postponements.

This list is incomplete because the details of FDA complaints remain confidential. Any one of these mistakes would result in an FDA shutdown order, which is what happened to 23andMe.

3. 23andMe competitors

There are 23andMe competitors. They’ve taken several different paths in dealing with the FDA requirements.

  • Sequencing labs continue to offer their sequencing services. The costs are steadily dropping. A personal genome sequencing is now around $1K, for which you get a few hundred megabytes of individual sequence data. When asked for justification, they can provide FDA and other reviewers with extensive scientific evidence that the machines that they use are accurate. They do not make any further diagnostic claims. Interpreting the sequence data is your problem. The FDA is quite happy with this. The claims are completely justified and the results are medically valuable in the hands of someone who knows how to interpret them.
  • Some vendors abandoned the device market and offer services interpreting the resulting sequences. This makes them not a device, so the FDA doesn’t get involved. They are now consultants who are practicing medicine, and are subject to the kind of regulation that applies to doctors.
  • Some vendors decided to get out of the business. The technology and science might not be ready for making diagnostic claims at such a low cost. Rephrasing the advertising to be truthful would kill their market. How many people will pay $100 for a meaningless report with no scientific validity?

4. How enthusiasm becomes a lie

I work at a medical device company and have brought devices from initial idea through FDA approval to production and end of life. I was also involved with various Silicon Valley startups back in the workstation era (including Sun Microsystems). It’s very easy for a Silicon Valley enthusiasm to become a lie.

We had a new medical device that we were bringing to market, and had a new marketing staffer. The new device would replace an existing device, which meant that our FDA issues were much easier:

  • We only had to demonstrate that it was just as good as the old device. This was mostly easy. The new system was much better in many ways, and the only difficulty we had was explaining why some of the evaluation methods used with the old device no longer made sense.

Getting FDA approval still took many months, but there were no surprises and it got done on schedule. The new marketing guy didn’t understand why we didn’t bring up all the new capabilities of the new device. Had we made those extra claims, we would have had to provide a complete scientific basis for why the improvements made a difference in patient outcomes. This would have meant a huge increase in pre-market R&D for clinical trials.

He heard the arguments but I don’t think he ever really accepted them. The huge cost and schedule saving drove the decision. Proving the value of the improvements would have made the product so late and so expensive that it would have failed in the market.

The new system was faster, less expensive, and more reliable. Those are not clinical claims, and we could easily demonstrate those with low cost engineering trials. So we limited the product claims to:

  • It’s faster
  • It’s less expensive
  • It’s more reliable
  • It can do everything that the old system did.

The marketing guy wanted to add claims about how this would result in improved patient care, etc. It seems obvious that faster, less expensive, and more reliable will improve the healthcare system and patient results. But the whole management team squashed him and made sure that he knew that he would be fired if he even hinted at that claim in private customer conversations.

The reason gets back to medical claims and FDA regulations. You cannot claim an improvement without proof. We could easily provide scientific evidence for faster and more reliable. I’m not sure how you prove improved healthcare delivery based on those.

In fact, it’s easy to see that these improvements might have had no measurable difference in patient results:

  • Faster. A faster system can be used to reduce delays, improve the overall process, or serve more people with fewer systems. These need not result in any patient health improvements.
  • Less expensive. A lower cost is good and easy to sell. But so many costs are increasing that total patient costs will still go up. It’s hard to argue that a slightly smaller cost increase will result in improved patient health.
  • More reliable. A more reliable system is very popular with technicians, nurses, and doctors. It permits a variety of process improvements. But again, there is no clear causal relationship between better processes, happier staff, and healthier patients.

We had no trouble selling the system based on it being faster, less expensive, and more reliable. We could explain all the ways that the customers could take advantage of this. The customers could make their own assumptions about the resulting patient outcomes. We never made patient outcome claims.

I don’t think our marketing guy ever really understood why we did this.

December 27, 2013 in Current Affairs, Healthcare

Risks and Management Accounting

I listened to a presentation on management accounting this weekend.  It was a report on an analysis of risk management practices in businesses that operate in risk sensitive areas.  These included banks, insurance, healthcare, aviation, and petroleum businesses.  Most of the analysis covered issues unrelated to healthcare.  But the following caught my eye as healthcare related:

Red flag warning issues:

  • Having a target orientation for operations.  The example here was the British NHS.  It set targets for various metrics and the NHS staff were managed to meet those targets.  The result was significant patient harm.  The methods used were chosen to maximize success in meeting targets, not success in helping patients.
  • Using spreadsheets, especially risk registers.  Holistic data gathering, measurement uniformity, and process analysis worked better.  Spreadsheets were a strong negative indicator.

A positive indicator and counter agent for red flags:

  • Systemic analysis for critical moments:
    • When and where do risk decisions take place?
    • When and where do risk actions take place?
    • Who is involved?
    • What information is available at that time?
    • How can the who, what, where, when be changed to reduce resulting risk?

Critical moments are those times where behavior has an immediate and direct effect on the risk situation.  Examples range from call center responses to problem reports, to loan application interviews.

One of the problematic NHS target driven behaviors was prioritization of ambulance dispatch.  The goal was to have an ambulance arrive within 15 minutes of a call.  This was optimized by dropping the dispatch priority of any call that was already going to miss the 15 minute target.  These late calls would be postponed in favor of new calls that could still be served within the target.  It's not optimal for patients, but it optimizes for meeting the target goal.  This kind of counterproductive behavior is very widespread when targets are used in management.  It's not a problem unique to healthcare, but it's one that definitely affects healthcare.

July 29, 2013 in Current Affairs, Healthcare

Book Reviews: Who Owns the Future, and Roots of Radicalism

I read two oddly related books this past month:

  • Roots of Radicalism: Tradition, the Public Sphere, and Early Nineteenth-Century Social Movements, by Craig Calhoun, a very academic history of the origins of the radical movements.  It primarily covers US and UK radicalism, emphasizing that the traditional left/right political analysis approach is misleading and obscures some important aspects of the origins.  Only a history nerd will wade through the dense academic prose, so I've pulled the highlights out below.
  • Who Owns the Future, by Jaron Lanier, a current commentary book.  It's much more readable by the ordinary techie.  It's a grab bag of ideas.  I found myself alternately reacting "yes, I've seen that effect" and "interesting idea, but what about this and this?".  It's worth reading primarily because it includes useful observations that are much too often ignored by the techno-enthusiasts.

Roots of Radicalism

Much recent historical work has attempted to force-fit 19th century radicalism into the left-right mold of modern politics.  This has misleading results.

Radicalism originated in the early 19th century social movements in Europe and the US.  These were a response to the social disruption from the industrial revolution.

The primary participants were the artisans and craftsmen, not wage slaves.  Prior to the industrial revolution artisans made things.  Blacksmiths made metal things, potters made pottery, shoes and clothes were made by hand, and so forth.  The industrial revolution destroyed careers that had been safe and predictable; the whole artisanal segment of society was wiped out.
  • These people were not poor.  They had reliable food, clothing, and housing.  They had reliable work.  They had an education.
  • The radical groups were at least as much conservative as progressive.  They did not want change.  They wanted to preserve their place in an artisanal society. 

The radical movements included groups that espoused nativism and other conservative solutions.  These later morphed into nationalism and fascism. Some radical attempts to preserve a craftsman oriented social structure evolved into syndicalism, which has since faded from the political landscape.  Syndicalism never fit within the left-right paradigm.

The Tea Party and Occupy movements have a great deal in common with these early radicals.

The cultural evolution into wage slavery and the need for labor unions took place later.  The completion of the industrial revolution and emergence of organizational dominance took place after the destruction of the first wave of radical organizations during the 1848 revolutions (Europe) and the Civil War (US).

The role of religion was mixed.  The Second Great Awakening in the US was a powerful force that created communities with a goal of eliminating corruption and improving society.  Their goals included abolition of slavery and temperance.  The movement was split between the transcendentalists and the evangelicals, who had profoundly different views of God and man's relationship with God.

The transcendentalist and evangelical branches in the US diverged and became political enemies in the later 19th century.  They agreed on abolition, but after the Civil War ended slavery, other issues like temperance, women's suffrage, and social structures were areas of great dispute.  This split never fit within a left-right paradigm.  The religious component was just as important as the worker-owner split emphasized by the left-right paradigm.

The education level and financial status of the early radicals was much better than the traditional left-right paradigm indicates.  This is apparent in some of the ways that the establishment in the UK worked to suppress the radicals.
  • The stamp tax and other measures increased the cost of paper and printing.  It became too expensive for the radicals to reach their audience, but remained affordable for the upper classes.  This indicates two things.  The radical audience could read, because pamphlets and papers were effective at reaching them.  And the radical audience had some spare cash to cover publishing and distribution costs.  The stamp tax was designed to make this communication path too expensive.

In the UK the merchants, lawyers, etc. were split away from the radicals by the 1832 Reform Act.  Prior to that, voting was based on real estate ownership.  You needed to own significant real estate (land and buildings) to qualify to vote.  The Act expanded this to include financial resources.  The result was that middle class occupations got the vote.  Prior to this, the emerging middle class had been supporting the radicals.  Once they got the vote, they opposed many of the other radical goals, like de-industrialization.

In the US, voting and political participation had always included the merchants, lawyers, etc.  The US did not have a similar splitting prior to the bloodbaths of the religious conflicts and the Civil War that shattered the early radical structures.

None of these groups were unitary or isolated.  There was a strong web of overlapping group memberships within affinity groups.  This is apparent in both the 19th century movements and the New Social Movements (NSM) of the 1960's and 70's.

The New Social Movements were not that new.  The radicals of the early 19th century had a lot in common with them.  The 60's hippies and the 19th century Transcendentalists are close companions.  Both periods had extensive social ferment with a multiplicity of rapidly evolving groups.  The NSM activism and direct action efforts have direct parallels with 19th century radical activism and direct action (e.g., Luddites, John Brown, the Underground Railroad).

Who Owns the Future

Much of what Lanier observes reminds me of the changes that drove the Radical movements.  His examples of the network destroying ways of working are similar to the destruction of the artisanal culture in the early nineteenth century.  The computer and network are destroying a way of life for many people.  You see reactions to this in the Occupy movement and the Tea Party movement.

Lanier's response suggestions are interesting.  They are very incomplete.

I liked his term "siren servers".  The analogy to the Greek Sirens is apt.  It's a better metaphor than the walled garden.  Whether it's Facebook, Google, or some other niche area, the siren server pulls in its victims who are blind to the negative effects until they are already on the rocks.

July 15, 2013 in Books, Current Affairs, Web/Tech

GPGtools for MacOS - Small Review

I helped with the install of GPGTools onto a Mac this weekend.  It brings PGP email within the reach of an advanced non-programmer Mac user.   It should also support S/MIME email, but that looks beyond the reach of a non-developer at present.

Good points:

  • Painless, vanilla install.  It was no different than any other Mac application.
  • Reasonably good integration with the default Mac Mail app.  It adds some buttons for signing and encrypting.  It integrates the key storage dialogs into the default app in a reasonably intuitive manner.
  • Correct PGP implementation.  This is based on very limited testing, but it seems to do the job.
  • Reasonably good integration with the public key servers.

Bad points:

  • No manual.  You are expected to figure it out by trying things in the menus and reading the FAQs.  This is a failing shared by many Mac applications.  It makes it extremely hard to provide assistance by telephone unless you also have a Mac; I will have that problem, because I don't have a Mac with email.
  • FAQ does not cover very much.  The rudiments of install, send, and receive are OK.  Things like "how can I use S/MIME?" or "how can I exchange keys other than through a public keyserver?" are not well covered.
  • It's easy to make mistakes and misunderstand things, because basic PGP terminology, e.g., "fingerprint", is used but not explained.

In net, it's a really good start, but it needs some serious help from good writers to fill out the FAQ and provide a simple manual.

June 30, 2013 in Current Affairs, Politics, Web/Tech

IEA Carbon Emissions Report

The IEA has announced current numbers for carbon emissions and energy efficiency, and they emphasize the importance of energy efficiency.  The numbers come from this report, comparing 2011 with 2010.  The US is down 200 million tons (3.8%), Europe is down 50 million tons (1.4%), but the world total is up 1.4% to 31.6 billion tons.  A further analysis shows that energy efficiency efforts are much more cost effective than alternative energy projects.

The report includes the annual spending on energy efficiency:

  • About $200 billion world wide.
  • About $20 billion in the US.

I always evaluate the sources and accuracy for statistics.  Lots of published statistics are based on very poor data and flimsy analysis.

The carbon emissions numbers for the US and Europe are fairly accurate.  They are based upon fuel production, sale, and use data.  This will miss some sources, and there will be some inaccuracies, but these are unlikely to change dramatically year to year, so the 3.8% and 1.4% figures are credible.  The worldwide number is much less likely to be accurate, even year to year, due to data gathering issues.

The efficiency numbers are highly dubious.  The underlying data gathering is highly subjective and difficult to measure.  When someone buys a CFL, how much of that spending is towards energy efficiency?  When someone buys a more efficient refrigerator how much is towards energy efficiency?  When an office building is renovated, how much is towards energy efficiency?

The numbers themselves look wrong.  Johnson Controls' building efficiency division reports annual revenues of $14 billion worldwide.  It's not credible that one building HVAC vendor is 7% of world spending all on its own.

It appears that the efficiency numbers are based primarily on reports by government programs about spending that is related to that government program.  This is consistent with public statements by the Kyoto statistics team that they do not include any efficiency spending that is motivated by profit or cost saving.  Kyoto reports are purely for government and regulation driven activities.

The 3.4% annual reduction in US carbon emissions is primarily the result of cost saving efforts.  The largest single one is the switch from coal and oil to natural gas.  This changes much more than the C:H ratio in the fuels.  The mid-west power companies closed their old least efficient coal fired generators and replaced them with new very high efficiency combined cycle turbines.  The heat to electricity efficiency improved from about 30% to about 60%. 

There are many other sources of improvement.  The switch to efficient light bulbs is nearing completion.  Over 80% of lighting power consumption is now from fluorescent or LED lights.  The ongoing improvements are primarily from better lighting design and controls, like occupancy sensors and task oriented lighting levels.  Freight traffic continues to shift from truck to rail at 1-2% annually.  Warehousing, shipping, and logistics strategies take time but provide a significant improvement.

Compare this with renewables, where the US spent $52 billion in 2011 to increase the share of non-hydro renewable electricity by 0.3 percentage points.  That contributes about 0.2% of the 3.8% decrease in carbon.  The lifespan of efficiency improvements is similar to the lifespan of wind power and solar cells.  So in terms of cost effectiveness, US efficiency projects provide 1.7% reduction per $10 billion, and US alternative energy projects provide 0.04% reduction per $10 billion.
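
Spelling that comparison out, taking the post's attribution of 3.4 percentage points of the US reduction to efficiency and cost-saving efforts, and 0.2 points to renewables:

    # Carbon reduction per $10 billion, using the figures above.
    eff_spend, eff_cut = 20.0, 3.4   # $bn spent, % reduction (efficiency)
    ren_spend, ren_cut = 52.0, 0.2   # $bn spent, % reduction (renewables)
    print(f"efficiency: {eff_cut / eff_spend * 10:.2f}% per $10bn")
    print(f"renewables: {ren_cut / ren_spend * 10:.2f}% per $10bn")
    # -> 1.70% vs 0.04% per $10bn, a roughly 40x gap -- if the $20bn
    # efficiency spending estimate were believable, which it isn't.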

If the efficiency investment numbers were believable, this would be evidence that alternative energy efforts are profoundly wasteful to a degree that is harmful to the world climate.  But, the numbers lack credibility.

The claims by the various energy efficiency vendors and sponsors are that energy efficiency investments have a typical ROI of 25-100%, while alternative energy investments have a typical ROI of 3-5%.  (This ROI uses actual costs, not cost minus subsidy.)  If you assume that energy efficiency provides ten times more carbon reduction per dollar, the observed reduction in carbon implies an efficiency investment of around $100 billion during 2011.  That's a lot more believable than the IEA estimate of $20 billion.

Still, this does indicate that the push for alternative energy sources is a diversion.  There is a lot that can be done to improve energy efficiency, and those improvements will have a substantially greater benefit to both the economy and the climate.

June 25, 2013 in Current Affairs, Eco-policy

Aviation News

Today's Aviation Week had three relevant articles.

  1. Another article on upgrading aircraft to use digital telemetry, GPS, etc.  This time it was a discussion of the steps that Delta is taking with some of their aircraft.  Nothing new or radical is reported.  It's just another article about how to upgrade at reduced cost.  Long term (assuming the FAA mis-managers don't completely ruin things) the digital upgrades should reduce fuel use, air pollution, noise pollution, and flight times.  Switching from voice to text messaging makes sense for all the routine flight control.  There are two pilots, and messaging doesn't interfere with flying the way it does with driving.  It's faster and avoids the confusion over exact numbers that sometimes affects voice control.  It also carries a lot more messages per second over the limited radio channels.
  2. A report on a prototype effort by British Airways, as part of a joint effort with Solena Fuels and GreenSky London, to build a $500 million plant to convert waste biomass into fuels.  It's to be built in Tilbury and go operational in 2016.  The plant will take 565,000 metric tons of sorted municipal waste and generate 50,000 tons of jet fuel, 50,000 tons of diesel, 20,000 tons of naphtha, and 50 megawatts of excess electrical power.

    The feedstock is dry sorted municipal waste, meaning metals, glass, and other recyclables have been removed.  (Technically it's called refuse derived fuel - RDF.)  Part of the deal's attraction is using something that somebody else collects and pays you to take; airline commitments to reduce carbon impact are another driver.  These fuels count as bio-fuels with no carbon impact.

    It's a plasma torch gasification, so plastic, tires, etc. can be processed.  It uses the usual syngas Fischer-Tropsch (F-T) processes, with the latest chemical reactor and catalyst designs.

  3. An article about the complaints over the latest idiocy on carbon taxes for aviation.  The European Parliament has clearly said that it will collect tax on aviation travel outside Europe, has created a bureaucratic monster, and all the non-EU countries (US, Russia, China, etc.) have reacted with immense hostility.  This choice abrogates promises to use ICAO international processes for aviation CO2 controls.  One of the absurdities is that a charter airline that flies one 747 per week is considered a de minimis carbon contributor and avoids most of the paperwork, while a business jet operator that flies one business trip per month is not considered de minimis and must follow the full bureaucratic procedures.

    The annual paperwork cost (filings, people, etc.) is estimated at $100K/yr.  For an airline this is just another piece of the regulatory burden.  For business jet operators this is a big extra cost.  There is a series of "free" allowances.  Again, even for small airlines the cost of filing and qualifying is justified by the value of the allowances.  For business jet operators, the filing cost exceeds the value of the allowances.

    The law's stated justification was CO2 emissions, but the regulations are clearly designed to eliminate private aviation and business jet operations in favor of commercial airlines.  Extending the reach outside Europe is a simple power and money grab by the EU Parliament.

May 22, 2013 in Current Affairs, Eco-policy, Travel
