
Making the Right Choice, or How I Learned to Live with Limiting My Own Technical Debt and Just Be Happy

One of the things that comes up in my day job is trying to make sure that the reports we create are correct, not only from a data perspective but from an architectural perspective. There are hundreds of legacy reports with legacy SQL code, written by tens of developers (some actual developers and some not-so-actual developers) over the last 10+ years.

Today a request came in to update a productivity report to include a new user. The request included their user ID from the application where their productivity is being tracked.

This request looked exactly like another report and request I’ve seen before, one that involved running productivity from the same system for the same kind of work (authorizations).

I immediately thought that the reports were the same and set out to add the user ID to a table called ReferralsTeam, which includes the fields UserID and UserName.

I also thought that documenting what needed to be done would be worthwhile.

I documented the fix and linked the Confluence article to the JIRA issue, and then I realized my mistake. This wasn’t the same report. It wasn’t even the same department!

OK, so two things:

  1. There are two reports that do EXACTLY the same thing for two different departments
  2. The report for the other department has user IDs hard-coded in the SQL

What to do?

The easy way is to just add the new user ID to the hard-coded list in the stored procedure and call it a day.

The right way:

  1. Update the ReferralsTeam table to have a new column called Department … or, better yet, create a second table called Departments with fields DepartmentID and DepartmentName and add the DepartmentID to the ReferralsTeam table.
  2. Populate the new column with the correct department for the team that already has records in the table
  3. Update the various stored procedures that use the ReferralsTeam table to take a parameter that filters on the new column, keeping the data consistent (a rough sketch of steps 1 – 3 follows this list)
  4. Add the user IDs from the report that has them hard-coded, i.e. the new department
  5. Update the report that uses the hard-coded user IDs to use the dynamic stored procedure
  6. Verify the results
  7. Build a user interface to allow the data to be updated outside of SQL Server Management Studio
  8. Give access to that user interface to the Department Managers so they can manage it on their own
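
Purely as a thought exercise, here is a minimal T-SQL sketch of what steps 1 – 3 could look like. The ReferralsTeam table (with UserID and UserName) is real; the Departments table, the sample department names, and the procedure name usp_GetProductivity are assumptions made up for illustration, not our actual objects.

```sql
-- Step 1: a Departments lookup table plus a DepartmentID link on ReferralsTeam.
-- (Departments and usp_GetProductivity are hypothetical names for illustration.)
CREATE TABLE dbo.Departments (
    DepartmentID   INT IDENTITY(1,1) PRIMARY KEY,
    DepartmentName VARCHAR(100) NOT NULL
);
GO

ALTER TABLE dbo.ReferralsTeam
    ADD DepartmentID INT NULL
        CONSTRAINT FK_ReferralsTeam_Departments
        REFERENCES dbo.Departments (DepartmentID);
GO

-- Step 2: backfill the department for the team that already has records.
INSERT INTO dbo.Departments (DepartmentName)
VALUES ('Referrals'), ('Authorizations');

UPDATE dbo.ReferralsTeam
SET DepartmentID = (SELECT DepartmentID
                    FROM dbo.Departments
                    WHERE DepartmentName = 'Referrals')
WHERE DepartmentID IS NULL;
GO

-- Step 3: the stored procedures filter on the new column via a parameter
-- instead of hard-coded user IDs.
CREATE OR ALTER PROCEDURE dbo.usp_GetProductivity
    @DepartmentID INT
AS
BEGIN
    SET NOCOUNT ON;

    SELECT rt.UserID, rt.UserName
    FROM dbo.ReferralsTeam AS rt
    WHERE rt.DepartmentID = @DepartmentID;
END;
GO
```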

So, which one would you do?

In this case, I updated the hard-coded stored procedure to include the new user ID to get that part of the request done. This satisfies the requester and keeps their downtime to a minimum.

I then also created a new JIRA issue so that we can look at doing steps 1 – 6 above and assigned it to the report developer. Steps 7 and 8 are in a separate JIRA issue that is assigned to the web developers.

Doing things the right way will sometimes take longer to implement in the short run, but in the long run we’ve removed the need for Managers in these departments to ask to have the reports updated, we prevent bad/stale filters from being used, and we can eliminate a duplicative report!

One interesting note: the reason I caught the duplication was a project we’ve been working on to document all of the hundreds of reports we have. I searched Confluence for the report name and its recipients weren’t what I expected. That led me to question everything I had done and really evaluate the best course of action. While I kind of went out of order (which is why I ended up documenting a process I didn’t mean to), I was still able to catch my mistake and rectify it.

Documentation is a pain in the ass in the moment, but holy crap it can really pay off in unexpected ways in the long run.


The Technical Debt of Others


Technical Debt, as defined on Techopedia, is:

a concept in programming that reflects the extra development work that arises when code that is easy to implement in the short run is used instead of applying the best overall solution.

In the management of software development we have to make these types of easy-to-implement-and-we-need-to-ship versus need-to-do-it-right-but-it-will-take-longer decisions all of the time.

These decisions can lead to the dreaded “working as designed” answer to a bug report.

This is infuriating.

It’s even more infuriating when you are on the receiving end of this.

A recent feature enhancement in the EHR we use touted an

Alert to let prescribing providers know that a medication is a duplicate.

Anyone in the medical field knows what a nightmare it can be, from a patient safety perspective, to prescribe a duplicate medication, so we obviously wanted to have this feature on.

During our testing we noticed that if a medication was prescribed in a dose, say 75mg, and stopped and then started again at a new dose, say 50mg, the Duplicate Medication Alert would be presented.

We dutifully submitted a bug report to the vendor, and they responded:

The Medication is considered a true duplicate as when a medication is stopped it is stopped for that day it is still considered active till (sic) the end of the day due to the current application logic, which cannot be altered or changed. What your providers/users may do is enter a DUR Reason and Acknowledge with something along the lines of "New Prescription". These DUR reasons can be added via Tools > Preferences > Medications > DUR > Override Reason tab – type in the desired DUR Override Reason > Select Add > OK to save.

If functionality and logic outside of this is desired this will need to be submitted as an Idea as well since this is currently functioning off of development's intended design.

Then the design is broken.

From a technical perspective I know exactly what is going on. This particular vendor stores date values as varchar(8) but stores datetime values as datetime. There may be some really good reasons for making this design decision.

However, when the medication tables were designed, the developers asked the question, "Will we EVER care about the time a medication is started or stopped?"

They answered no and decided that the start date (and, by extension, the end date) for a medication would not respect the time a prescription started or stopped, and therefore stored them as varchar(8) and not as DATETIME.

But now they’ve rolled out this awesome feature. A feature that would actually allow providers to recognize duplicate medications, potentially saving lives. But because they don’t store the time a medication was stopped, their logic can only look at the date. When it sees the same medication (in different doses) active on the same date, a warning appears letting the provider know that they have a duplicate medication (even though they don’t).
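
To be clear, I don’t have the vendor’s actual code; this is just a minimal T-SQL sketch of the kind of date-only check their response implies, assuming the start and stop dates are stored as varchar(8) in YYYYMMDD form. The table, column names, and sample data are all made up for illustration.

```sql
-- Hypothetical medication list: dates stored as varchar(8), no time component.
DECLARE @Meds TABLE (
    MedName   VARCHAR(50),
    Dose      VARCHAR(20),
    StartDate VARCHAR(8),      -- YYYYMMDD
    StopDate  VARCHAR(8) NULL  -- NULL = still active
);

INSERT INTO @Meds VALUES
    ('Ibuprofen', '600mg', '20170601', '20170705'),  -- stopped earlier today
    ('Ibuprofen', '800mg', '20170705', NULL);        -- started later today

-- A date-only duplicate check: the stopped 600mg order is still "active"
-- for all of 20170705, so the new 800mg order gets flagged as a duplicate
-- even though it was written after the old one was stopped.
SELECT old_rx.MedName,
       old_rx.Dose AS ExistingDose,
       new_rx.Dose AS NewDose
FROM @Meds AS old_rx
JOIN @Meds AS new_rx
  ON  new_rx.MedName = old_rx.MedName
  AND new_rx.Dose   <> old_rx.Dose
  AND new_rx.StopDate IS NULL                                    -- the new prescription
  AND ISNULL(old_rx.StopDate, '99991231') >= new_rx.StartDate;   -- overlap at date granularity
```

If the stop time were stored as a real DATETIME, the overlap test could compare full timestamps and the alert would only fire for medications that were genuinely active at the same moment.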

Additionally, this warning serves no purpose other than to be one more damned click from a provider’s perspective, because the vendor is not storing (i.e., is ignoring) the time.

When clinicians complain about the impact of EHRs on their ability to deliver effective care … when they complain about EHRs not fulfilling their promise of increased patient safety, these are the types of things that they are complaining about.

I think this response from one of the clinicians sums up the issue:

I don't see the logic with the current "intended design" in considering a medication that has just been inactivated STILL ACTIVE until the end of the day. A prescriber would stop current and start new meds all in one sitting (which includes changing doses of the same med), not wait until the next day to do the second step. It decreases workflow efficiency to have to enter a reason when no real reason exists (since there IS no active entry on med list). The whole point is to alert a prescriber to an existing entry of a medication and resolve it by inactivating the duplicate, if appropriate (otherwise, enter reason for having a duplicate), before sending out a new Rx.

While it's relatively easy to follow and resolve the duplication alert if the inactivation and new prescribing is done by the same prescriber, I can see a scenario where prescriber A stops an old ibuprofen 600mg Rx[^2] (say PCP) and patient then goes to see prescriber B (say IC[^3]) who then tries to Rx ibuprofen 800mg…. and end up getting this duplication alert. The second prescriber would almost be lost as to why that message is showing up.

The application logic should augment the processes the application was designed to facilitate, but right now it is a hindrance. (emphasis added)

I know that sometimes we need to build it fast so that we can ship, but developers need to remember, forever is a long freaking time.

When you make a forever decision, be prepared for pushback from the users of your software when those decisions are markedly ridiculous. And be prepared to be scoffed at when you answer their bug report with a Working-as-Designed response.

[^2]: Rx = prescription

[^3]: IC = Immediate Care


Getting CPHIMS(R) Certified – Part III

I walked into the testing center at 8:30 (a full 30 minutes before my exam start time as the email suggested I do).

I signed in and was given a key for a locker for my belongings and offered use of the restroom.

I was then asked to read some forms and then was processed. My pockets were turned out and my glasses inspected. I signed in (again) and had the signature on my ID scrutinized against how I signed on test day. It only took three tries … apparently 19-year-old me doesn’t sign his name like 39-year-old me.

Now it was test time … even if I could remember any of the questions I wouldn’t be able to write about them … but I can’t remember them so it’s not a problem.

It took me 80 minutes to get through the real test of 115 questions (15 are there as ‘test’ questions that don’t actually count). The only real issues I had were:

  • construction noise outside the window to my left
  • the burping guy to my right … seriously bro, cut down on the breakfast burritos
  • one question that I read incorrectly 4 different times. On the fifth read I finally realized my mistake and was able to answer correctly (I think). As it turned out, the calculation gave me the same answer I would have guessed, but it was still a good feeling to get the number through a calculation instead of just guessing it

When the test was completed and my questions scored the results came back. A passing score is 600 out of 800. I scored 669 … I am officially CPHIMS. The scoring breakdown even shows areas where I didn’t do so well, so I know what to focus on for the future. For reference, they are:

  • Testing and Evaluation (which is surprising for me)
  • Analysis (again, surprising)
  • Privacy and Security (kind of figured this as it’s not part of my everyday job)

Final Thoughts

When I set this goal for myself at the beginning of the year it was just something that I wanted to do. I didn’t really have a reason for it other than I thought it might be neat.

After passing the exam I am really glad that I did. I’ve heard myself say things and think about things differently, like implementation using Pilots versus Big Bang or By Location versus By Feature.

I’m also asking questions differently of my colleagues and my supervisors to help ensure that we are doing things for the right reason at the right time.

I can’t wait to see what I try to do next.


Getting CPHIMS(R) Certified – Part II

Signing up for the actual exam may have been the most difficult and confusing part. I had to be verified as someone that could take the test, and then my membership needed to be verified (or something).

I received my confirmation email that I could sign up for the exam and read through it to make sure I understood everything. Turns out, when you sign up for the CPHIMS you need to use your FULL name (and I had just used my middle and last name).

One email to the HIMSS people and we’re all set (need to remember that for next time … this exam is the real deal!)

I was going to be in Oceanside for the Fourth of July Holiday and decided to sign up to take the exam in San Diego on the fifth. With a test date in hand I started on my study plan.

Every night when I got home I would spend roughly 45 minutes reading the study book, and going over Flash Cards that I had made with topics that I didn’t understand. Some nights I took off, but it was a solid 35 days of studying for 45 minutes.

Now, 2 things I did not consider:

  1. Scheduling an exam on the fifth is a little like scheduling an exam on Jan 1 … not the best idea in the world
  2. The place my family and I go to in Oceanside always has a ton of friends and family for the weekend (30+) and it would be a less than ideal place to do any last minute studying / cramming

I spent some of the preceding weekend reading and reviewing flash cards, but once the full retinue of friends and family arrived it was pretty much over. I had some chances to read on the beach, but for the most part my studying stopped.

The morning of the fifth came. I made the 40-minute drive from Oceanside to the testing center to take the CPHIMS exam for real.


Getting CPHIMS(R) Certified – Part I

One of my professional goals for 2017 was to get my CPHIMS (Certified Professional in Healthcare Information and Management Systems). The CPHIMS certification is offered through HIMSS which “Demonstrates you meet an international standard of professional knowledge and competence in healthcare information and management systems”.

There was no requirement for my job to get this certification; I just thought that it would be helpful for me if I better understood the Information and Management Systems part of Healthcare.

With not much more than an idea, I started on my journey to getting certification. I did some research to see what resources were available to me and found a Practice Exam, a Book and a multitude of other helpful study aids. I decided to start with the Practice Exam and see what I’d need after that.

In early March I signed up for the Practice Exam. I found all sorts of reasons to put off taking the exam, but then I noticed that my Practice Exam had an expiration date in May. One Sunday, figuring “what the hell, let’s just get this over with,” I sat down at my iMac and started the exam.

I really had no idea what to expect other than 100 questions. After about 20 minutes I very nearly stopped. Not because the exam was super difficult, but because I had picked a bad time to take a practice exam. My head wasn’t really in the game, and my heart just wanted to go watch baseball.

But I powered on through. The practice exam was nice in that it would give you immediate feedback if you got the question right or wrong. It wouldn’t be like that on test day, but it was good to know where I stood as I went through this practice version.

After 50 minutes I completed the exam and saw that I had a score of 70. I figured that wouldn’t be a passing score, but then saw that the cutoff point was 68. So I passed the practice test.

OK, now it was time to get serious. Without any studying or preparation (other than the 8+ years in HIT) I was able to pass what is arguably a difficult exam.

The next thing to do was to sign up for the real thing …


Updating my LinkedIn Profile

I've been trying to update my LinkedIn Profile for a couple of weeks now (maybe a couple of months) and I keep hitting a roadblock. Not really sure why …

Since being 'promoted' from Director of NextGen Support Services to Director of Business Informatics, I've wanted to update the Profile but haven't really had the 'time' to do it.

So a couple of weeks ago I decided to start in earnest on the update. I've done more research than I can stand, but I don't feel like I'm any closer to an update that I like.

I think part of the problem is that I don't really know who the summary is for. Is it for me, or for the other people reading my summary (for whatever reason people read LinkedIn summaries)?

If it's for me then I guess I'd write about the things that I really like to do, like data analysis and bits of programming to get to solutions to hard problems. If it's for other people then I guess I need to be genuine about who I am while also 'selling' myself to the prospective others.

Maybe the best thing is to write it for me and then hope for the best. I kind of like that. Besides, if someone else reads it and they don't like it then that's a good indication about how well I would get along with that person in a professional setting anyway and might be best to avoid them.

And if they do like it then all the better that they will also like me … the real me.


HIMSS Recap

I’ve gone through all of my notes, reviewed all of the presentations and am feeling really good about my experience at HIMSS.

Takeaways:

  1. We need to get ADT enabled for the local hospitals
  2. We need to have a governance system set up for a variety of things, including data, reporting, and IT based projects

Below are the educational sessions (in no particular order) I attended and my impressions. Mostly a collection of interesting facts (I’ve left the Calls to Action for my to do list).

Choosing the Right IT Projects to Deliver Strategic Value, presented by Tom Selva and Seth Katz, really hit home the idea that there is a relationship between culture and governance. The culture of the organization has to be ready to accept the accountability that will come with governance. They also indicated that process is the most important part of governance. Without process you CANNOT have governance.

In addition to great advice, they had great implementation strategies, including the idea of requiring all IT projects to have an elevator pitch and a more formal 10-minute presentation on why the project should be done and in what way it aligned with the strategy of the organization.

Semantic data analysis for interoperability, presented by Richard E. Biehl, Ph.D., showed me an aspect of data that I hadn’t ever had to think about: what to do when multiple systems are brought together and define the same word or concept in different ways. Specifically, “Semantic challenge is the idea of a shared meaning or the data that is shared”. The example of relating the concept of a migraine from ICD to SNOMED, and how the two can result in mutually exclusive definitions of the same ‘idea’, was something I hadn’t ever really considered before.

Next Generation IT Governance: Fully-Integrated and Operationally-Led, presented by Ryan Bosch, MD, MBAEHS and Fran Turisco, MBA, hit home the idea of Begin with the End in Mind. If you know where you’re going, it’s much easier to know how to get there. This is something I’ve always instinctively felt; however, distilling it into this short, easy-to-remember statement was really powerful for me.

Link to HIMSS Presentation

Developing a “Need-Based” Population Management System presented by Rick Lang and Tim Hediger hammered home the idea that “Collaboration and Partnering are KEY to success”. Again, something that I know but it’s always nice to hear it out loud.

Link to HIMSS Presentation

Machine Intelligence for Reducing Clinical Variation presented by Todd Stewart, MD and F.X. Campion, MD, FACP was one of the more technical sessions I attended. They spoke about how Artificial Intelligence and Machine Learning don’t replace normal analysis, but instead allow us to focus on what hypothesis we should test in the first place. They also introduced the idea (to me anyway) that data has shape and that shape can be analyzed to lead to insight. They also spoke about ‘Topological Data Analysis’ which is something I want to learn more about.

Link to HIMSS Presentation

Driving Patient Engagement through mobile care management, presented by Susan Beaton, spoke about using Health Coaches to help patients learn to implement parts of the care plan. They also spoke about how “Mobile engagement can lead to increased feeling of control for members”. These are aspects that I’d like to see my organization look to implement in the coming months/years.

Link to HIMSS Presentation

Expanding Real time notifications for care transitions, presented by Elaine Fontaine, spoke about using demographic data to determine the best discharge plan for the patient. In one of the presentations I saw (Connecticut Hospitals Drive Policy with Geospatial Analysis, presented by Pat Charmel) the presenter had indicated that as much as 60% of healthcare costs are determined by demographics. If we can keep this in mind we can help control healthcare costs much more effectively, but it led me to ask:

  • how much do we know?
  • how much can we know?
  • what aspects of privacy do we need to think about before embarking on such a path?

Link to HIMSS Presentation

Your Turn: Data Quality and Integrity was more of an interactive session. When asked the question “What would a National Patient Identifier be useful for?”, most attendees in the audience felt that it would help with information sharing.

Predictive Analytics: A Foundation for Care Management was presented by Jessica Taylor, RN and Amber Sloat, RN. I saw that while California has been thinking about and preparing for value-based care for some time, the rest of the country is just coming around to the idea. The hospital these nurses work for is doing some very innovative things, but they’re things that we’ve been doing for years. The one thing they did seem to have that we don’t is an active HIE that helps keep track of patients in near real time, which I would love to have! One of the benefits of a smaller state perhaps (they were from Maine)?

Link to HIMSS Presentation

A model of data maturity to support predictive analytics presented by Daniel O’Malley, MS was full of lots of charts and diagrams on what the University of Virginia was doing, but it was short on how they got there. I would have liked to have seen more information on roadblocks that they encountered during each of the stages of the maturity. That being said, because the presentation has the charts and diagrams, I feel like I’ll be able to get something out of the talk that will help back at work.

Link to HIMSS Presentation

Emerging Impacts of Artificial Intelligence on Healthcare IT was presented by James Golden, Ph.D. and Christopher Ross, MBA. They had a statistic that 30% of all data in the world is healthcare data! That was simply amazing to me. They also had data showing that medical knowledge doubles every THREE years. This means that between the time you started medical school and the time you were a full-fledged doctor, the amount of medical knowledge could have increased 4 to 8 fold! How can anyone keep up with that kind of knowledge growth? The simple answer is that they can’t, and that’s why AI and ML are so important for medicine. But equally important is how the AI/ML models are trained.

Link to HIMSS Presentation


HIMSS review

I had meant to do a write-up of each day of my HIMSS experience, but time got away from me, as did the time zone change, and here I am at the end of the HIMSS experience with only my day 0 notes down on paper.

Day 1 started with a rousing keynote by Ginni Rometty, the CEO of IBM. The things that struck me most about her keynote were her sense of optimism about the future, sprinkled with some caution about AI, Machine Learning, and Big Data. She reminded us that the computers we are using for our analyses are tools to help, not replace, people, and that it is incumbent upon us, the leaders of HIT, to keep in the front of our minds how these Big Data AI/ML algorithms were trained. As the old saying goes, “Garbage In, Garbage Out.”

I also was able to record a bit of her keynote speech just in case I need to find and listen to it later.

I tweeted a couple of times during the keynote (and even got some likes and retweets … not something I’m used to getting)

Transparency in the Era of Cognition with the help of @ibmwatson #himss17

Artificial intelligence is out of its winter ... I sure hope so, but time will tell #himss17

Integration in workflow is the key to adoption #himss17

Don't let others define you. Great words from @GinniRometty #himss17

Growth and comfort never coexist. Another great gem from @GinniRometty #himss17

I spent almost all of my time on day 1 in educational sessions. One thing that I noticed from my first class was just how FULL it was 15 minutes before the session even started!

The Emerging Impacts of AI on HIT was full 15 minutes before the session started! Something tells me lots of ppl interested in AI #HIMSS17

Sometimes the session titles were a bit misleading, but eventually most of them would come around. A class with the title Connecticut Hospitals Drive Policy with Geospatial Analysis was more about the Connecticut hospitals and less about the geospatial analysis, but in the end I saw what I was hoping to see: people using geospatial analysis to help identify, and perhaps risk-stratify, patients to give the best care possible.

My tweet when the class was over:

Great talk on #geospatial analysis. So many ideas floating through my head now on potential actions and analysis #HIMSS17

I ended my HIMSS 2017 experience on a high note with a great session titled Choosing the Right IT Projects To Deliver Strategic Value. I’m still processing everything that came out of that session, but it left me feeling very positive about the future. It was nice to have the same, or at least very similar, feeling of optimism at the end of HIMSS as I had at the beginning after Mrs. Rometty’s Keynote.

I’ll be writing up my notes and linking to the presentations later this week (maybe whilst I’m flying back home to California tomorrow).

This is a conference I am overwhelmed by, but one I am glad I came to.

While it’s fresh in my mind, strategies for next year:

  • Pick 1 – 3 strategic challenges you want to solve. Then identify 10 – 20 vendors that can help solve that problem. Talk to them, schedule appointments with them. Get more information than you know what to do with
  • Work on being a presenter. It will help check off that ‘Speak in front of large groups of people’ item on your Bucket List

HIMSS 2017 – Day 0

I'm in Orlando for HIMSS17 and am pretty pumped for my day one session tomorrow, which is titled Business Intelligence Best Practices: A Strong Foundation for Organizational Success.

Conferences are always a bit overwhelming, but this one is more overwhelming than most. More than 40,000 people all gathered in one convention center to discuss Healthcare Tech. Kind of awesome and scary!

I'm looking forward to visiting some booths in the exhibition hall, and wandering around and stumbling onto some great new things / ideas.

I'm going to write up my impressions of each day's events, hopefully including notes and links to tweets, because the tweets will be my raw and mostly uncensored impressions of what I'm seeing / hearing.

Here's to HIMSS 2017!


Making Better Meetings … maybe

To say that I attend a lot of meetings is a bit of an understatement. However, as a manager, that is part of my job, and I accept that it is something I need to do.

What I have been trying to do at my office is lead more effective meetings, but also to encourage my colleagues to have more effective meetings as well.

It's been challenging as the organization I work for is large and all I can do is lead by example with the meetings that I am in.

Until now … maybe

I read an article on LinkedIn titled Tired of wasting time in meetings? Try this, and it offered several suggestions for better meetings, some of which I already knew:

  1. Define the purpose of the meeting
  2. Define the outcome of the meeting
  3. Have a timed agenda and someone in charge

And others that I didn't:

  4. Facts – not opinions!
  5. Keep people on-point. (Only talk about matters relating to their job)

I think that number 4 is a key idea for any meeting (that isn't a brainstorming meeting), but number 5 is a bit too much. Keeping people on point is an important aspect of any meeting, but only allowing people to talk about matters related to their job … what is the dividing line between 'my job' and 'not my job'?

This seems like it wouldn't actually have the intended outcome. I think people who are already quiet will be encouraged to stay quiet because the topic isn't related to their job (even if it might be), and those that talk too much already will assume that everything is related to their job, so they will still contribute inappropriately.

I think that point 5 is much better when restated as:

  5. Keep people on-point, only talk about the current agenda item

The article did include a nice diagram that you can download (you need to provide an email address first).