Lost in Translation

As a software engineer, I’ve discovered what seems to be a nearly universal problem in trying to deliver applications that meet our customers’ needs. Maybe this is a problem you find in other fields as well, but the problem of clearly communicating with someone who doesn’t speak your language is by far the most frustrating thing I encounter.

There are two major variations on this problem, both of which are frustrating to no end.

No Speak Computer

The most immediate communication error we experience in this field is trying to speak to someone who just doesn’t get computers. Sure, they use one, probably every day, but at a high level. The same way that I use my car. I can drive my car just fine, day to day. I have a general understanding that when the one thing points to the E, I put gas in the other thing to make all the things happy. I can turn a steering wheel, I can push a brake pedal, and I can even fold the seats back. What I can’t do, however, is tell you what’s causing the rattling noise.

I’m sure mechanics the world over feel my pain on this. Trying to get useful, specific, and accurate information from someone not familiar with the topic at hand is hard at best, and dangerous at worst.

I’ve had sessions with customers lasting days, going back and forth: me pressing and probing for information that might be useful, them struggling to understand what I’m even asking about. I don’t blame them, in the same way that I hope my mechanic doesn’t blame me for not knowing the internal workings of a combustion engine. The best we can do is muddle through and try to get by, though some training could fix this. An argument can be made that since our users sit at these machines for hours on end, day in and day out, for their jobs, they should be required to have a basic understanding of how they work. While I don’t know where every nut and bolt goes under the hood, I have a basic understanding of my car and how it operates. But good luck mandating basic computer competence in the workplace when management’s general feeling on the matter is “that’s why we have you people, right?”

The flip side to the user who knows nothing and can’t help you is the user who knows nothing but thinks they do. I’ve had customers with a fundamental misunderstanding of how the software works and what technologies are being used feed me bad information when reporting bugs or requesting new features. Which brings us to the far more insidious and destructive communication problem.

I Don’t Know What I Want

It is an unfortunate adage that the customer is always right. Let’s just be clear: the customer is not always right.
Customers have needs that software can fill, yes, and our jobs exist to give them the product they need. The main problem is that, lacking an understanding of the technologies used in creating software and the frameworks on which it is built, customers sometimes ask us to build some stupid stuff, and sometimes the wrong stuff.

Big Words

Sometimes the problem is that our customers try to use the same jargon we do. Which is understandable; no one wants to sound dumb. But it isn’t a good idea when talking about a subject where words have very specific meanings and actions have very specific consequences. Nothing in computer science just happens. Computers don’t do anything we don’t explicitly tell them to do. Computers never just know what it is you want. If you’re dictating what kind of software I’m to make for you, you can’t leave out details because they’re obvious to you. The programmer, who isn’t an expert in your specific field of business, and thus the computer, will not know that some values mean the same thing or that groupings should be made by column A instead of column B.
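A tiny sketch of why that “obvious” detail matters so much. The data, column names, and numbers here are invented, but they show that the same records grouped by one column versus another tell two entirely different stories:

```python
from collections import defaultdict

# Invented sales data: same rows, two equally plausible groupings.
rows = [
    {"region": "East", "product": "gadget", "amount": 100},
    {"region": "East", "product": "widget", "amount": 250},
    {"region": "West", "product": "widget", "amount": 175},
]

def totals_by(column, rows):
    # Sum the amounts, grouped by whichever column the customer meant.
    totals = defaultdict(int)
    for row in rows:
        totals[row[column]] += row["amount"]
    return dict(totals)

print(totals_by("region", rows))   # {'East': 350, 'West': 175}
print(totals_by("product", rows))  # {'gadget': 100, 'widget': 425}
```

Unless the customer says which column they mean, the programmer has a coin flip between two correct-looking reports.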

Design and Analysis

The most dangerous, but hardest to detect, error in communication comes from the users that have some say in how the software behaves. This may sound ideal to most managers, but in most cases, it is not. Especially when the user starts to assume they know how the software is structured.

I’ve seen feature requests, sometimes detailed in their description of information flow and behavioral components, that are completely and utterly dumb. Not because the person or group requesting the feature is dumb, but because they lack an understanding of how the software is structured. On more than one occasion, I’ve received requests that would take inordinate amounts of time, resources, or code changes to complete, and when we asked what the most important component of the feature was, we found we could make the change for a fraction of the cost of implementing the feature as designed. When asked why it was planned in such a roundabout way, the answer is almost invariably, “We thought that would be easier for you.”

Their intentions are in the right place, but rarely will someone who is not intimately familiar with the code base have more knowledge about what would be easiest for us to implement. I really feel that the whole paradigm of design by use case has quite a bit of merit, though getting use cases for everything you want to implement in a large project can be tedious.

But really, just tell us what you want the software to do, not how you think the software should do it. We’re good at figuring that stuff out. That’s why we get educated and train for years at school or in the workplace. Trust us.

Where Automated Healthcare Has Gone Wrong… And How To Fix It

I somewhat kinda almost work in the healthcare industry. I’ve also talked to many doctors and made observations while at my own appointments. I think we can all agree that while American medicine is some of (if not the) best in the world, we can also all see that there is squandered potential. Doctors, nurses, staff, and patients are all stretched thin.

Automation To The Rescue?

You’ve probably experienced the same things I have. Interdepartmental confusion, where the left hand of accounting doesn’t know what the right hand of healing is doing and vice versa. Filling out the same form in triplicate. And I’ve never had a doctor’s visit where they didn’t ask me, again, for a list of my medications and dosages.

Most people seem to think the holy grail in healthcare management right now is a computerized system that can get the right information to the right doctor instantly, automating all the paperwork that used to require an entire staff to manage effectively. This is only half right. An automated system like this is, I suppose, the end goal, but we’re going about it the wrong way.

I’ve heard it said that what will finally push us to computerized medical systems will be a big push by the government to design such a system. Surely they are the only ones capable of implementing such a system, right?


No. And guess what: it isn’t going to be the private sector, either. It is going to have to be both.

Foot In The Door Mentality

The main problem, as I see it, is that everyone is competing to deliver the best healthcare management system, which is a good thing. But it also leads to some of the problems we have right now, the most egregious of which is that every system out there is a closed system. If a hospital or practice buys System A, they’re stuck with it. There is no interoperability with System B. As a software engineer, this makes me want to scream. What if they need to move to System B? What if they have a patient who needs a procedure done at a facility that runs System B? What if a patient moves? What if System A’s vendor suddenly jacks up its licensing fees?

You’re stuck, that’s what.

This kind of lock-in shifts the competition from vendors trying to produce the most useful product for their customers to trying to be the first system a customer buys. If you can get in on the ground floor, those suckers will be hooked, no matter how much or how little effort you put into the product. And lucky for the vendor, few institutions will put the effort into examining the technology under the hood of whatever solution they’re looking at. This kind of sales decision is usually made in a board room after a fancy slide show presentation showing stock images of happy doctors and patients in gowns, obviously smiling from all the great care they’re getting thanks to whatever integrated system the vendor is peddling.

I do, however, feel that there is a solution to this current state of affairs. One that involves both the private sector and government agencies. One that takes advantage of the strengths of both.

Open Standards, Baby!

I propose that rather than the government create a competing integrated system that tries to be everything to everyone, and rather than companies competing to be the first to plant roots at a practice, that we split the job. The government, along with a consortium of business and healthcare provider interests, should develop a single, standardized API for healthcare data management. Define a standard for the types of information needed by all institutions. Define a method to store custom information that may be exclusive to certain practices or certain types of facilities that could be read and interpreted by other institutions that may or may not want or need to implement that part of the API. Define a common data transfer method. Legislate the use of such a system.
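To make the idea concrete, here is a minimal sketch of what such a standardized record might look like. Everything here is hypothetical (the `PatientRecord` type, its fields, and JSON as the transfer format are all my own illustration, not any real healthcare standard): a core set of fields every system must support, an open “extensions” map for practice-specific data that other systems can safely ignore, and a common serialization so records survive the trip from System A to System B.

```python
import json
from dataclasses import asdict, dataclass, field

@dataclass
class PatientRecord:
    # Core fields the standard would require of every vendor.
    patient_id: str
    name: str
    medications: list  # standardized (drug, dosage) pairs
    # Practice-specific data other systems may ignore but must preserve.
    extensions: dict = field(default_factory=dict)

    def to_json(self) -> str:
        # The common transfer format every system could emit and consume.
        return json.dumps(asdict(self), sort_keys=True)

    @classmethod
    def from_json(cls, payload: str) -> "PatientRecord":
        return cls(**json.loads(payload))

# System A exports; System B imports, even if it ignores the extensions.
record = PatientRecord("p-001", "Jane Doe",
                       [["lisinopril", "10 mg"]],
                       extensions={"podiatry_notes": "flat arches"})
transferred = PatientRecord.from_json(record.to_json())
print(transferred.medications)  # the core data survives the transfer intact
```

The point isn’t the particular fields; it’s that once the envelope is standardized, vendors compete on what they do with the data, not on holding it hostage.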

This creates one standard that private vendors can build around. It shifts the focus from being the first to get your system in the door to being the best system for the types of customers you’re courting. If a vendor faces the threat of a customer simply moving to another system when the current one doesn’t meet their demands, the incentive to deliver a superior product is created.

Institutions will be able to store data and retrieve it. More importantly, institutions will be able to share their data. Transferring records would be easy, since you could be assured that every system available would be able to receive and interpret your records.

This is a problem that could be fixed with the right approach and some forethought by all the various interests. It’s not an impossible problem. It has been proven that standardized systems agreed upon by multiple parties can be very beneficial without harming the economic interests of said parties. These vendors are already competing for your business. I’m just suggesting we shift the competition to inspire improvements to the actual technology itself rather than improving sales pitches.

Engineers Are Funny

My team at work is having a team outing tomorrow, where we will all leave at lunch to go eat and bond over a round of disc golf. A typical team-building exercise, I’m sure. Except we’re engineers. A quick conversation about where we should eat for lunch led to a list of about 10 possibilities and an hour-long conversation about the best algorithms for properly ranking choices over a small vote set.
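For the curious, one of the simpler candidates for ranking choices over a small vote set is a Borda count: each ballot ranks the options, an option earns points by position, and the highest total wins. This is just one of the algorithms we argued about, and the restaurants are made up:

```python
from collections import defaultdict

def borda_winner(ballots):
    # Each option earns (n - position - 1) points per ballot of length n:
    # first place on a 3-option ballot is worth 2 points, last is worth 0.
    scores = defaultdict(int)
    for ballot in ballots:
        n = len(ballot)
        for position, option in enumerate(ballot):
            scores[option] += n - position - 1
    return max(scores, key=scores.get)

ballots = [
    ["tacos", "burgers", "thai"],
    ["thai", "tacos", "burgers"],
    ["tacos", "thai", "burgers"],
]
print(borda_winner(ballots))  # tacos, with 2 + 1 + 2 = 5 points
```

Whether Borda beats pairwise runoffs for a lunch vote was, of course, the subject of the other fifty minutes of the conversation.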


I was thinking back today and I remembered something totally awesome from my childhood. And by childhood I mean high school.

When I was a freshman and sophomore, I volunteered for the summer science camp held at our middle school. It was some of the most fun I had in high school. The campers spanned various ages; though my memory fails me, I believe we were dealing with elementary school students. The camp consisted of a week of daily science themes, where the science teachers from the school system would each lead a class of kids in a science experiment relating to the day’s theme. Space, materials and matter, Earth sciences: all presented in age-appropriate ways, with several experiments to teach the basic principles of each topic. Each class lasted about an hour, which is good, because attention spans at that age are not sizable.

As a student volunteer, my responsibilities included helping run lunchtime and snack time, and helping with whichever experiments happened to be the most involved that day and needed the most hands-on help for the kids. Until the last day.

The last day of the camp was Forensics Day. All the student volunteers were assigned a role in a crime mystery. This usually involved the theft of another character’s beloved toy or stuffed animal. The kids would all be gathered in the morning to hear the bad news and that they would be tasked with helping catch the thief! All the classes that day revolved around different forensic methods they could use to identify the thief, be it via fingerprint found at the scene, footprints in the mud, or depositions and alibis. At the end of the day, we’d all gather again and the kids would get to question us with their new knowledge and evidence and eventually the culprit would be brought to justice.

I just checked the school website from my home town. Looks like they’ve expanded their summer camp program. They even have two Engineer camps! Maybe all hope isn’t lost for the future generations.


The New “Thing”

I like new technologies as much as the next software engineer, but I have my limits. Maybe I’m just more conservative than some of my colleagues. Maybe I’m a Luddite. Maybe I’m too cautious. But some things drive me crazy.

The Problem(s)

It starts rather innocently. You have a new project. You have an upgrade to an existing system. Your team needs to implement a process solution to support your developers. All are perfectly legitimate projects and all can be solved by any number of platforms and systems.

However, this is where some developers get overzealous, because they remember that one cool thing they read about in a tech magazine, the thing that just came out in pre-beta and has the potential to be the perfect salve for everything that ails this project. The only problem is that these developers have never used the platform or system before. No problem, we’re all smart people. We can teach ourselves along the way, right?



I’ve seen this a few times now in my professional career. I thought we’d have left this behind in college as an industry, but I guess not. And I’m willing to bet that you’ve seen it where you work as well. It invariably causes more problems than it solves. Sure, you may end up with a solution to your original problem, but now you have a much worse one: a monstrosity of a system that operates with all the stability of a tightrope walker, and that you and your team have to support for the next x number of years.

I did this in school, sure. I was still learning. Having an actual project presented to me was a great excuse to learn something new. Of course, that’s the point of college: to learn. Most of the time, professors didn’t care so much about the functionality of the overall project; they were looking to see whether you understood the concepts of the lessons. Sure, your project crashes if you click that one button with a property set to null, but your sorting algorithm was spot on. Bravo. And you learned some new platform/paradigm/system along the way. Double bravo.

No More A For Effort

But businesses need things to work. That’s why they hired you. Your place of employ is not in the business of helping you learn the latest thing. Your place of employ is in the business of making widgets, selling widgets, or servicing someone who does one or the other. You owe it to your employer to give them the best solution to their problem that you can, which means working with what you’re familiar with.

Don’t fall prey to the marketing gimmicks and hype surrounding whatever the new thing happens to be. It may be great, but not if you use it to create a terrible mound of code that only barely does what it is supposed to do.

Engineer, Innovate Thyself

I’m not a crank, a Luddite, or a developer particularly stuck in his ways. OK, so I might need a little more convincing about the latest and greatest than some of my peers, but I’m definitely not anti-progress.

You should absolutely continue to invest in yourself and your skill set. You should push yourself and train and be a good citizen in the universe of computer science. In fact, I feel that employers should invest in the further training of engineers.

Just don’t do it on the job. First projects are learning experiences. First projects exist for you to learn what to do, and more importantly, what not to do. That means you’re going to write some bad code. That is OK.

As long as you keep it to yourself.

PC Infrastructure Needs an Update

There is a lot of hoopla (and rightly so) surrounding parallel computing today. CPUs aren’t getting much faster in terms of clock speed these days, and anymore it’s all about the cloud, man. Distributed computing has been a thing in academic circles for quite a while. Much like the Internet, which also started in the world of academia and made its way to the general populace, distributed computing is in its public infancy with projects such as Folding@home and SETI@home, and more recently the computing-as-a-service offerings of Amazon and Microsoft. But we’re already seeing a creep into the homes of the average Joe Blow. Many-core GPUs are pretty common now. The GeForce GTX 680 from NVIDIA, for example, has 1536 CUDA cores.

My biggest beef with the state of things right now is that heavy processing is the job of the graphics chip, even when what you’re processing has nothing to do with graphics. I think it’s about time we move away from the current paradigm of a single CPU with anywhere between 1 and 16 cores plus a GPU with a gazillion cores, to a setup more applicable to what is certainly coming in the future. Namely, we need a cluster of general-purpose cores dedicated to general processing. With the prevalence of data processing in today’s applications, even local ones, I feel it’s time to let the GPU do its thing and have a separate cluster for everything else. If for no other reason than to allow games to fully parallelize both graphics-specific tasks and game or AI tasks.

Good thing that Microsoft is moving us (the business world, at least) in this direction with the new asynchronous language features in C# 5.0. Really, this is going to have to happen. We need to integrate threading and parallel concepts into the languages we use, because in the near future the ability to write a good parallel algorithm is going to be just as important to a developer as the ability to write a good for loop.
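The core skill looks something like this toy sketch (in Python here rather than C#, and using threads purely to keep the example self-contained; CPU-bound Python code would want processes, and C# would use Tasks, but the decomposition is the point): split a big job into independent chunks, farm the chunks out to a pool of workers, and combine the partial results.

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(bounds):
    # Each worker sums one independent slice of the range.
    lo, hi = bounds
    return sum(range(lo, hi))

def parallel_sum(n, workers=4):
    # Decompose [0, n) into roughly equal, non-overlapping chunks.
    step = n // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    chunks[-1] = (chunks[-1][0], n)  # last chunk absorbs any remainder
    # Map the chunks over the pool, then reduce the partial results.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))

print(parallel_sum(1_000_000) == sum(range(1_000_000)))  # True
```

Getting the decomposition right (no shared mutable state, no overlapping chunks, a clean reduce step) is exactly the kind of thinking that needs to become as routine as writing a for loop.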

I had just one college class that emphasized the concepts of threading, deadlocks, and parallel algorithms, and that was a class on operating systems. I hope things are changing in computer science curricula, because starting developers out with, and keeping them on, single-threaded C and C++ applications won’t be sufficient in the future.

I’m just sayin’, it’s time to get on the parallel train. Choo choo.