Thursday, April 3, 2008

Climbing New Hampshire

Why do humans climb mountains?

Because they're there? To regain a sense of challenge and opportunity for heroism that we've lost in the modern world? To get away from cell phones, email and the non-stop flow of information that is part of our internet culture?

On March 15, 2008, I finished ascending all the New Hampshire peaks over 4000 feet high during winter. Winter hiking/climbing poses some unique challenges, such as how to stay warm at the top of Mt. Washington when it's -20F and the wind is blowing 50 mph. How to avoid avalanches. How to stay hydrated when even boiling water freezes over the course of a hike. How to ascend 12-foot snow drifts for 15 miles at a time. And the most dangerous - how to drive from Boston to New Hampshire on ice-covered freeways without getting hit by a skidding bus.

Like everything I do, there is method in this madness. I think of alpine climbing as a kind of puzzle - an outdoor version of Sudoku - which cleanses my mind of the concerns of the work week. What gear is needed to stay warm but not too warm, since sweat freezes solid and can rapidly cause hypothermia? What route is safest? What techniques are best to ascend steep ice, deep snow, and tree-covered terrain? What pace is best to manage time and energy, ensuring a successful trip? Reaching the summit is optional, but returning to the car is mandatory.

In New Hampshire there are 48 mountains above 4000 feet. Records have been kept for the past 50 years and I'm the 360th person to have completed the winter ascents of all the New Hampshire peaks.

Here's a typical schedule. Pack the night before with just the right amount of gear to be safe. I typically carry about 9 pounds of food, clothing, water and rescue equipment that I've refined over the years. Here's my gear list. I get up at 5am, eat a bowl of oatmeal, and fill a liter bottle with boiling water for drinking on the ascent. I pick up my climbing partner, who is also a physician, and drive to the White Mountains, typically a 2-3 hour commute depending on the mountain destination. All of the gear has to be carefully laid out so that when we arrive at the trailhead, clothing layers can be added without losing body warmth and we can rapidly get started with the hike/climb. I generally like to start off a bit chilled so that the initial run up the trail brings my body temperature to just the right level. Along the trail, hats, gloves, and insulating layers are added or removed as needed to stay at just the right temperature.

Our typical journeys are 10-20 miles with 4000 feet of vertical gain. Depending on the depth of the snow drifts, the ice, and the bushwhacking through tree canopies, it can be quite taxing. I typically carry light snowshoes for deep snow and crampons for traversing ice. I wear a boot within a boot (Scarpa Alpha double plastic boots) to keep my feet warm.

Near the summit, the temperatures and the wind are so severe that goggles and facemasks are needed to keep your eyeballs from freezing. We typically summit, stay just a few minutes and then descend. Remember that the summit is only halfway back to the car.

Each year, several people die while winter hiking in the White Mountains. Most are reckless or significantly underprepared. My climbing partner and I are very safe and will not take risks. If we feel avalanche danger is too high or weather conditions are too severe, we will turn around. The good news is that ascending 48 peaks prepares you for a variety of conditions and events. We recently ascended Wildcat A through chest-deep snow, hiking straight up the mountain because the trails were completely invisible in this year's heavy snowfall. The last mile took us 4 hours of hard work, with 3 steps backward for every 4 steps forward. At the end of the hike, we were exhausted by the effort. Only later did we find out that we were the first to ascend the mountain in the past 30 days due to extreme conditions. I'm glad we did not know that the ascent was impossible before we started!

Wednesday, April 2, 2008

After Hours Pay for IT Professionals

I was recently asked how we fund "on call" pay and subsidize remote access for our IT staff.

At BIDMC, our policy is to provide standardized on-call pay for those who carry a beeper and support our critical systems.

We reimburse some IT employees for half of their monthly home internet service cost (approximately $30/month) if it is deemed necessary to do their job, on the assumption that only 50% of a home internet connection will be used for business. We also reimburse for cell phones and Blackberries by adding the amount of an appropriate monthly plan (decided by their Director/Manager) to employee paychecks. Employees pay the bills themselves, eliminating the administrative burden of reimbursements.

I asked my IT colleagues at other hospitals in Boston for permission to publish their policies.

At Boston Medical Center, they provide on-call pay, but employees must pay for their own internet access. They previously funded internet access but dropped the reimbursement as home connections became more ubiquitous. They currently pay for a "team" on-call cell phone but may ask employees to use their own phones in the future.

At Partners Healthcare, they currently pay for on-call support and offer a stipend to staff to cover their home internet access. They are investigating best practices at other healthcare IT organizations.

At Children's Hospital of Boston, they have two hourly rates for on-call support. The higher rate (Tier I) is paid to on-call staff who are paged more frequently. The second rate (Tier II) is paid to anyone who participates in the on-call rotation but is paged infrequently. Children's pays for home internet access for on-call staff who frequently log in remotely to perform systems management.

As IT staffing becomes increasingly virtual, it's clear that our policies on paying for beeper call, remote access, mobility technologies, and home office equipment will evolve.

Tuesday, April 1, 2008

Electronic Health Records for Non-Owned Doctors - Implementation Order

This is the ninth entry in my series about providing electronic health records for non-owned clinicians. We'll call this one "triaging the practices". Since we have 300 non-owned clinicians who need electronic health records, where do we begin? If new clinicians join the Beth Israel Deaconess Physician's Organization during our rollout, how do their practices fit into the rollout?

We need specific triage rules to decide on the order of implementation.

In a for profit business, some metric like referral volume might be used, but in the non-profit healthcare world, such an approach would be a violation of Stark anti-kickback rules.

In our case, we want to ensure the highest quality care, coordinated through the use of interoperable electronic records. We want to ensure decision support is enabled for those who need it most. We want to invest our effort in those practices which require the most clinical integration with the hospital to ensure high performance medicine. Based on a quality/safety approach, a rational implementation order would be:

1. Primary Care Physicians are the first priority - PCPs see a high volume of patients and are the "air traffic controllers" for care, ensuring coordination among all the clinicians a patient sees. An EHR enables an accurate problem list, an up-to-date medication list, and alerts/reminders for wellness care. We want every patient's PCP to have the benefits of an EHR. Yes, we know that the first few months of using an EHR will impact a PCP's productivity, but our experience with other EHR implementations is that with appropriate training and a "model office" configuration, productivity rapidly returns to baseline.

2. Specialists who serve as a kind of primary caregiver, managing diseases such as congestive heart failure, cancer, and diabetes, are also a priority to be early EHR users. Tracking diabetes care requires data coordination among endocrinologists, ophthalmologists, and vascular surgeons. Ob/Gyns are primary caregivers. Chronic diseases such as COPD and CHF require coordination among pulmonologists, cardiologists, and PCPs. Specialties that require significant care coordination with primary caregivers or deliver primary care themselves include Cardiology, Ophthalmology, OB/GYN, Dermatology, Orthopaedics, Urology, Gastroenterology, Surgery, Pulmonary, Neurology, Endocrinology, Vascular, and Rheumatology.

3. As we are rolling out EHRs to these PCPs and specialists, it's likely that new clinicians will become affiliated with BIDMC. As we plan our rollout calendar, we will need to stay flexible so that new PCPs get priority and the specialists who most benefit from care coordination are placed ahead in the queue.

This approach to triage ensures that patients and providers get the maximal benefit from our efforts as we roll out 6 practices per month starting this summer. We may need to refine our rules even further as we learn more from our rollouts (see the sketch after this list):

* PCPs located closer to BIDMC or a local hospital go first, since they have the greatest data interoperability needs.
* Clinicians near retirement may choose not to be early adopters and may want to stay on paper.
* Some practices may adapt to new technology more easily than others and should go sooner.
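
To make the ordering concrete, here is a minimal sketch in Python of how practices might be ranked once the rules are finalized. The field names, weights, and sample practices are hypothetical illustrations, not our actual triage criteria.

```python
# A minimal sketch of practice triage scoring. The field names and weights
# below are hypothetical illustrations, not BIDMC's actual rules.

PRIORITY_SPECIALTIES = {
    "Cardiology", "Ophthalmology", "OB/GYN", "Dermatology", "Orthopaedics",
    "Urology", "Gastroenterology", "Surgery", "Pulmonary", "Neurology",
    "Endocrinology", "Vascular", "Rheumatology",
}

def triage_score(practice):
    """Higher score = earlier in the rollout queue."""
    score = 0
    if practice["specialty"] == "Primary Care":
        score += 100  # PCPs are the first priority
    elif practice["specialty"] in PRIORITY_SPECIALTIES:
        score += 50   # care-coordination specialists are next
    if practice["miles_from_hospital"] < 10:
        score += 10   # nearby practices have the greatest interoperability needs
    if practice["ready_for_technology"]:
        score += 5    # easier adopters can go sooner
    return score

practices = [
    {"specialty": "Primary Care", "miles_from_hospital": 4, "ready_for_technology": True},
    {"specialty": "Cardiology", "miles_from_hospital": 15, "ready_for_technology": True},
    {"specialty": "Dermatology", "miles_from_hospital": 8, "ready_for_technology": False},
]

# Sort the rollout queue, highest priority first, six practices per month.
for p in sorted(practices, key=triage_score, reverse=True):
    print(p["specialty"], triage_score(p))
```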

Over the next few months, the hospital and the physician organization will finalize the triage rules based on quality, safety, and data sharing benefits, so that we are clearly Stark compliant and can explain to every non-owned physician, based on objective criteria, when their EHR will be implemented. I'll let you know how it goes!

Sunday, March 30, 2008

Educational Technology Priorities

In my role as CIO of Harvard Medical School, I oversee administrative, research, and educational technologies for the school. Recently, Information Technology and the Program in Medical Education agreed to enhance our IT governance by refining the charter for the Educational Technology oversight committee. I've included the complete charter below to give you insight into the technology priorities of the Harvard Medical School teaching faculty.

Since 2001, Harvard Medical School has used MyCourses, a single portal for all educational content and applications. For a brief overview of MyCourses, you can click on Take a Tour. The website includes dozens of simulations, hundreds of videos, and thousands of PDFs, representing every handout for every course. We have an enterprise-wide directory of all students and faculty, a self-built student information system called MADRIS, and numerous applications which support faculty, staff, and student needs.

Here's the Educational Technologies Committee charge, identifying the priorities that will make MyCourses even better.

Committee Charge:
* Oversee educational technology applications at HMS - existing and new - including the areas listed below
* Identify annual academic and administrative priorities for educational technology, including new features (e.g., enhanced user interfaces, curriculum database, on-line grading) and improvements to existing applications (e.g., surveys, test delivery)
* Develop an annual work plan and establish and monitor quarterly goals
* Meet bimonthly throughout the academic year

Areas of focus:
* Curriculum management
Improving and expanding the capabilities of MyCourses, making it more user-friendly
Developing a search tool for curricular content across all four years of the curriculum, including information from MyCourses and the Course Catalogue

* Student information
Continued development of the MADRIS Student Information System

* Evaluation
Enhanced evaluation of students by faculty
Enhanced evaluation of faculty by students
Enhanced course evaluation

* Virtual applications and simulation
Enhanced virtual microscopy for pathology and histology
Enhanced virtual patients
Enhanced simulation

* Faculty information
Comprehensive teaching effort reporting
Tracking participation in faculty development
Tracking participation in evaluation

* Innovation
Support applications for external and HU funding for educational technology projects that enhance medical student education
Consider establishing HMS innovation funds to seed new initiatives that enhance the MD curriculum

Educational Technology Executive Committee Membership:
Director of Educational Technology and Software Development
Head Master, Academic Societies and Vice Chair, Curriculum Committee
Executive Assistant, PME Administration
Dean for Medical Education
Chief Information Officer
Associate Dean, Medical Education Planning and Administration
Registrar
Executive Director of Curriculum Programs
Senior Designer for Educational Technology
Educational Computing Software Support Specialist
Director, Primary Care Experience at BIDMC
Associate Master, Cannon Society
Master, Cannon Society

This governance committee will not only ensure that our work is well aligned with the priorities of stakeholders, but also enhance communication about our progress among IT staff, faculty, staff, and students.

I'm very interested in the priorities and experiences of other schools. If you'd like to share your priorities, please submit your experiences via our educational technology survey.

I'll summarize the responses, keeping all identities confidential, in a future blog entry.

Thursday, March 27, 2008

Cool Technology of the Week

I have an "-ology" problem.

Radiology, Cardiology, Pulmonology, Gastroenterology, Gynecology, and Endocrinology all have image management needs that require high-bandwidth networks, short-term high-speed storage, and long-term archival storage.

Radiology has an industrial-strength GE Centricity PACS. The other "ologies" have heterogeneous applications from multiple vendors. As a CIO, I can no longer let 1000 wildflowers bloom in the world of image management. Why?

1. Each department would use its own image viewing software
2. Each department would need its own disaster recovery strategy
3. Each department would use its own records management/image retention rules
4. Each department would need its own capital budget for storage
5. Clinicians would not be able to see a unified list of all imaging studies for a patient or consolidate images across multiple institutions using different medical record numbers

How do I solve this problem? The answer is long-term archiving that is standards-compatible and supports all the ologies. Teramedica's Evercore is one solution. IBM's Grid Medical Archive Solution is another. This concept is the cool technology of the week.

The idea behind these systems is simple. Each department can purchase the applications which interface to its imaging devices and support its workflow. The departments own the "front end".

Each of these imaging systems supports a DICOM exchange to a long-term archive. In the past, I've used content-addressable storage with a proprietary API and DICOM broker for radiology, DVDs for echo, CDs for vascular, MODs for radiation oncology, etc.

All of this will be replaced with an enterprise image archiving approach which can:

1. Provide one place for all images in the enterprise to be stored. IS can provide any storage hardware it wants - NAS, SATA disk, Data Domain archiving appliances, etc.

2. Provide unified metadata for every image, which can support a single application for consolidated viewing of all studies from all the "ologies" at different institutions.

3. Provide records management rules which enable deletion, information lifecycle management, and compression based on image type or department. For example, digital mammography needs to be kept for 10 years, but we can move it from fast storage to slow storage when appropriate. CT images could be compressed after a year and deleted after 5 years, as sketched below.
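
Here is a minimal sketch, in Python, of how such lifecycle rules might be expressed. The retention periods, tier transitions, and rule table are hypothetical illustrations, not our actual retention policy.

```python
from datetime import date

# Hypothetical records-management rules for an enterprise image archive.
# The retention periods and actions below are illustrative, not actual policy.
RETENTION_RULES = {
    # image type: (years on fast storage, years before deletion, compress?)
    "digital_mammography": (2, 10, False),
    "ct":                  (1, 5, True),
}

def lifecycle_actions(image_type, study_date, today=None):
    """Return the lifecycle actions due for a study of the given type."""
    today = today or date.today()
    fast_years, keep_years, compress = RETENTION_RULES[image_type]
    age_years = (today - study_date).days / 365.25
    actions = []
    if age_years >= keep_years:
        return ["delete"]                  # past its retention period
    if age_years >= fast_years:
        actions.append("move_to_slow_storage")
        if compress:
            actions.append("compress")     # e.g., CT compressed after a year
    return actions

print(lifecycle_actions("ct", date(2003, 1, 15), today=date(2008, 3, 27)))
# -> ['delete']
print(lifecycle_actions("digital_mammography", date(2005, 6, 1),
                        today=date(2008, 3, 27)))
# -> ['move_to_slow_storage']
```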

Having unified storage, unified viewing, and unified management means that IS can now own the backend of image management and treat it as a utility, just as we do with other central file architectures.

The end result of this utility approach to long-term image management is a win/win. Departments select the applications they need and the workflow they want. IS manages the security, integrity, and cost of storage centrally. The total cost of operating an enterprise image infrastructure is lower, the service levels are higher, and compliance with records retention policies is simplified.

I've been pursuing this concept for the past 5 years, but now the products are mature enough to make it a reality and I plan to do this as an FY09 project.

Wednesday, March 26, 2008

Playing the Japanese Flute

At the end of the day, when my email queue is empty, my Blackberry is silent, and my desk is cleared, I retreat to a quiet space to play the Japanese flute, called the Shakuhachi. This blog entry is a personal statement about playing the Shakuhachi and maintaining my mental health while living the life of a CIO.

As I've said in previous blog entries, being a CIO is a lifestyle, not a job. It's a balancing act of keeping projects moving, customers happy, and budgets frugal. The average tenure of a CIO in most organizations is about 2-3 years. When I leave the office, I want to clear away the frustrations of the day and arrive home with the optimism and enthusiasm my family deserves. Having avocations outside of the office helps me maintain my mental clarity and positive outlook. Rock/ice climbing are great weekend activities to recharge the soul, and I'll blog about those later. Playing the Shakuhachi is a daily meditative experience that has been called "Blowing Zen".

The Shakuhachi is an end-blown bamboo flute, which is a bit like a recorder without a mouthpiece. It's created by hollowing out the root end of a single piece of bamboo. Sound is made by passing air over the blowing edge, enabling the player to produce 4 octaves using only the lips. Notes are fingered using 5 holes, and sharps/flats are made by varying the position of the head or by partially covering the holes.

Shakuhachi sheet music is written entirely in Japanese Hiragana characters and not in western musical notation.

The instrument came into Japan from China at the end of the 7th century. From this period until the 12th century it was chiefly used in court music. From the 12th to 16th century it was a popular instrument among mendicant monks, who banded together in the 17th century to form the Fuke sect of Zen Buddhism and used the Shakuhachi as a spiritual tool. Their songs are called honkyoku and are meditation pieces rather than folk songs.

Learning to play the Shakuhachi requires years of study under a licensed Master. The Japanese say that it takes 3 years to learn to move your head appropriately and 7 years to make a note perfectly. One such Shakuhachi master, Phil Nyokai James, lives in Portland, Maine and offers lessons in the Boston area to 10 students every other week.

I play two flutes - a 1.8 (the traditional 1.8 feet long) in the key of D made by Kobayashi Ichijo of Osaka and a 2.4 in the key of A made by Yamaguchi Shugetsu of Nara. In addition to playing the flute each night, I bring it to mountaintops (such as the summit of Mt. Fuji) and forests. The rich sound echoes throughout natural settings and sounds truly spiritual.

A recent book about Mt. Monadnock in New Hampshire describes the mysterious flute player of the mountain, who is often heard but never seen. If you're hiking in New England and you hear a flute in the distance, it just might be the wandering CIO monk.

Tuesday, March 25, 2008

Standards for Secondary Uses of Data

Health Information Exchanges and Regional Health Information Organizations typically focus on the exchange of patient data for clinical care. Use cases include providing clinical histories to Emergency Departments, pushing laboratory/radiology results from hospitals to physician offices, and supporting referral workflow between primary care clinicians and specialists.

However, secondary uses of data such as public health reporting, pharmacovigilance, biosurveillance, and quality reporting can be equally important. The Social Security Administration disability evaluation process provides a great example of data exchange for secondary use that improves quality, saves money, and leads to increased patient satisfaction.

Today, the Social Security Administration pays over $500 million per year to retrieve paper records and purchase consultative examinations when it is unable to obtain existing records in support of a disability application. Here's how it works:

1. A patient applies to the Social Security Administration for benefits related to a disability
2. The patient signs an authorization to release medical records at a local SSA office or submits the form via mail
3. The SSA forwards the authorization and a medical record request to hospitals via mail
4. Health Information Management staff at the hospital copy paper records or print electronic records, then send those records to the SSA. It's a manual, costly process. Records are generally sent via mail, but some providers use fax or the SSA website to upload non-standard file formats, like Word, which the SSA converts to images for use in its current system. These images are just pictures, not data, and therefore are not searchable.
5. Staff at the SSA manually review the paper records to verify diagnoses, medications, lab results, and other observations which document disability
6. The application is manually reviewed, then an administrative decision is made. The entire process takes about 6 months.

With interoperable data standards, the new process could be:

1. A patient applies to the Social Security Administration for benefits related to a disability
2. The application is entered into SSA's disability claims system
3. The patient authorization is digitized by a local SSA office and stored centrally
4. The case is transferred electronically to the State Disability Determination Service, which determines whether the claimant is medically disabled according to SSA's rules.
5. At the same time as the electronic case is transferred, SSA's system automatically sends the digitized authorization to a hospital along with an electronic query to verify that patient records are present - without any human intervention.
6. The hospital verifies the authorization and securely sends an electronic clinical summary to the SSA.
7. SSA receives the clinical summary, formats it as a document, and automatically saves it in the electronic disability folder - again, without any human intervention. At the same time, a rules engine reviews the data. Depending upon the data received, the system alerts the case adjudicator to take appropriate steps (see the sketch after this list).

For example, the hospital includes an ICD-9 diagnosis code of 153.2, malignant neoplasm of the descending colon, in the clinical summary. The summary also includes a secondary ICD-9 diagnosis code of 197.7, secondary malignant neoplasm of the liver. The rules engine automatically creates an alert to advise the adjudicator to consider the listing for colon cancer with distant metastasis. The adjudicator sees the alert when he/she opens the case for the first time.

8. The adjudicator finds the associated operative and pathology reports in the clinical summary and makes a decision on the case immediately. A process that used to take weeks will be accomplished in a matter of days.
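
As a minimal illustration of the rules engine in step 7, here is a sketch in Python. The rule table and function name are hypothetical; SSA's actual adjudication rules are far richer.

```python
# A hypothetical sketch of the alerting step in the disability rules engine.
# The rule table below is illustrative, not SSA's actual system.

# Map (primary ICD-9 code, secondary ICD-9 code) -> listing to consider.
ALERT_RULES = {
    ("153.2", "197.7"): "Consider the listing for colon cancer with distant metastasis",
}

def review_clinical_summary(diagnosis_codes):
    """Return adjudicator alerts triggered by the codes in a clinical summary."""
    alerts = []
    for (primary, secondary), advice in ALERT_RULES.items():
        if primary in diagnosis_codes and secondary in diagnosis_codes:
            alerts.append(advice)
    return alerts

# The clinical summary includes colon cancer (153.2) with a secondary
# malignant neoplasm of the liver (197.7).
print(review_clinical_summary({"153.2", "197.7"}))
# -> ['Consider the listing for colon cancer with distant metastasis']
```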

Perhaps some day, with sophisticated enough clinical summaries and rules engines, a system could be developed to automatically make a decision on some disability claims based upon electronic health records.

Although the specific use case for exchange of data between hospitals and the SSA has not been in scope for HITSP, the interoperability specifications developed for biosurveillance, another secondary use of data, work very well. Specifically (a sketch of how these pieces fit together follows the list):

1. The PIX/PDQ transaction can be used to transfer patient demographic information and verify that patient records are present
2. The Continuity of Care Document (CCD) provides a clinical summary of problems, medications, allergies, and laboratory results
3. The XDR standard provides secure transport of the CCD from the hospital to the SSA
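
To show how these three pieces might fit together on the hospital side, here is a rough sketch in Python. Every function and the endpoint URL below are hypothetical placeholders - a real implementation would use an IHE PIX/PDQ transaction, a CDA/CCD document builder, and an XDR endpoint.

```python
# A hypothetical end-to-end sketch of the hospital side of the SSA exchange.
# Every function and the endpoint URL are illustrative placeholders,
# not real library calls.

def pdq_find_patient(demographics):
    """Placeholder for a PIX/PDQ query verifying that records exist."""
    ...

def build_ccd(patient_id):
    """Placeholder for assembling a Continuity of Care Document:
    problems, medications, allergies, and laboratory results."""
    ...

def xdr_send(document, endpoint):
    """Placeholder for submitting the CCD over XDR's secure transport."""
    ...

def handle_ssa_request(demographics, authorization_verified):
    # 1. Verify the patient is known to the hospital (PIX/PDQ).
    patient_id = pdq_find_patient(demographics)
    if patient_id is None or not authorization_verified:
        return "no records found / authorization not accepted"
    # 2. Assemble the clinical summary (CCD).
    ccd = build_ccd(patient_id)
    # 3. Transmit securely to the SSA (XDR), only after identity verification.
    xdr_send(ccd, endpoint="https://ssa.example.gov/xdr")  # hypothetical URL
    return "clinical summary sent"
```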

Of course, patient privacy must always be protected with any data exchange. Technical security standards enforce privacy policy, and the Social Security Administration workflow is predicated on patient authorization, acceptance of the signed authorization by the hospital, and transmission of records to SSA only after patient identity has been verified.

The fact that HITSP interoperability specifications have been recognized by Secretary Leavitt means that standards for labs, medications, clinical summaries, transport, and security are available to meet the interoperability requirements of clinicians, patients, hospitals, labs, pharmacies, and government agencies. 2008 is the tipping point for interoperability, now that standards are available, government and hospital stakeholders are aligned, and the business case for data exchange is clear.

BIDMC is currently working on a pilot with SSA to implement the HITSP standards and the workflow described above. A successful pilot could lead to wide adoption of data sharing in support of the disability process and integration of these workflows into the Nationwide Health Information Network. Best of all, the enhanced service to patients will likely result in lower overall costs, making implementation fundable from the savings of eliminating paper record transfer.

We'll be live later this year and I'll share all our experiences.