Presentation Matters

As a professional software developer, my job is to plan, design, and implement software solutions to problems. This involves learning about a business problem and the process that is followed, then designing software that makes that process easier or even automatic. Almost all software is written this way and for this purpose. And most developers do a good job of fulfilling goal #1: make it work. However, I’ve noticed a disturbing lack of attention to what should be goal #2: presentation.

Of course, I’m generalizing. I’m lumping some facets of software into making it work and lumping others into presentation. Let’s, for the purpose of this post, go with the following definitions:

  • Making It Work – The software is functional and performs expected calculations accurately. The average user can operate the software to successfully complete a task.
  • Presentation – The layout of the software is intuitive and optimized for common flows. The performance of calculations is optimized. Common tasks are automated as much as possible.

In business, function is everything. This is unfortunate, because form matters, both in interface and structural design.

Interfacing

The most obvious, and public, example of form is in your user interface. The control layout, the data grouping, and the flow all coalesce into an important feature of your software. A good user interface can greatly improve the productivity of your users, which is, by the way, the reason you exist as a software developer. Conversely, a poor user interface can harm your users’ productivity. Taking advantage of the built-in functionality of your preferred development platform (which happens to be C# .NET for me) can do wonders for your project; it makes the difference between good software and great software. Simple touches, such as setting intuitive tab stops on forms or writing clear and specific error messages, make the lives of your users easier.

Optimization

Most overlooked, and most difficult to solve, is the problem of merely acceptable performance. I see code all the time that is never optimized because at some point an algorithmic solution to a problem was found, and even though it isn’t as fast as it could be, it just works. Sometimes this code exists for legitimate reasons; there are times when the simplest code is not as performant as a convoluted equivalent. That is rarely the case, though. In my experience, it has usually been an excuse by people not interested in spending the time to correctly structure or comment their code. I distinctly remember being told in college to always go with the simplest coding option because memory and processing time are so cheap now. To this I say bah. This is the kind of thinking that has led to database calls, file processing, and a myriad of other processes taking minutes to complete when they could have run in seconds.

Why Should Anyone Care?

You may ask why this matters. It matters because the interaction between the user and the software is a critical determinant of your software’s success. Your users probably won’t be able to tell you that a database call is a little laggy if they have never made a database call themselves. Your users won’t be able to let you know that you should change the tab stop settings on a form, or that the workflow for that thing they do over and over all day could be tweaked or automated. Users aren’t, for the most part, trained to analyze their workflow in this manner and watch for inefficiencies. Hell, you’re lucky if you find out about legitimate bugs before entire business processes spring up to work around them, until you’re left with a breaking change in the fix for the original bug.

What your users will know is when you get it right. The right interface and optimization of backend processes can increase productivity. More importantly, perhaps, is the increase in user morale that the right balance can bring. Happy users get more done. Happy users don’t dread working with your software. Happy users make your life easier.

A Tale Of Two Designs

The perfect example of good and bad presentation, I feel, can be seen in the current mobile OS wars between Apple and Google. Despite any perceived shortcomings in iOS’s feature set, I think one must admit that the presentation and performance of its existing features are top notch. The iPhone took off the way it did because everything about iOS was a piece of art. Apple has sacrificed numbers and features in exchange for perfect presentation.

Google, on the other hand, seems to side more with a rich feature set, but with less desirable presentation. Android feels like it’s been designed with the average engineer in mind, whereas iOS feels like it has been designed for the average non-engineer. This is one of the reasons I’m against owning a phone running Android. I have yet to see an Android phone behave as intuitively and fluidly as an iPhone; the scrolling animation, for example, is always jittery. When I slide a control on the iPhone, I feel like I’m interacting with a physical entity. When I slide a control on an Android phone, I feel like I’m sending a command to a computer that is having a hard time keeping up with me. That’s a purchase (or even a series of purchases) that Google and the respective cellular carriers are losing out on because of a lack of attention to presentation.

These things matter. A lot. There are costs in productivity, morale, and sales associated with presentation. Don’t lose out. If nothing else, just consider it another engineering challenge: optimizing your users.

Tornado Warning

Standing in the bathroom waiting for the tornado warning to pass. I think the people in the hall are missing the point of taking shelter. It doesn’t matter that you’re standing out in the hall if the hall is wide open and has two large open rooms on either side with window walls.

1/2 inch glass isn’t going to stop that tree hurtling toward you. Just sayin’.

How I Wasted Two Days Learning SSIS Against My Will

So for two days I’ve been beating my head against the wall at work. The task was simple enough:

  1. Gather some data.
  2. Cull the data.
  3. Dump said data to a file.

Easy enough, right? Not so much.

The Beginning

I started this task like I would any other data-centered task: with a script to get the data I needed. Easy-peasy. I had to gather data from several different tables and do some filtering based on data present in other tables. Nothing too complicated. I ended up with something along the lines of:

  1. Select values that are in a valid state into a temp table.
  2. Select values that have been in the valid state at some point after a given date into the same temp table.
  3. Filter out invalid rows in a first pass.
  4. Filter out other invalid rows in a second pass.
  5. Transform the temp table into useful form in a second temp table.
  6. Get the return results from various tables based on the second temp table.

Didn’t take long to crank out this script once I worked out all the requirements and figured out which buckets to dump the preliminary data into. So now on to the easy part. All I have to do is run my query in SSIS and dump it to a text file.
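To make those steps concrete, here’s a minimal sketch of the shape that script took. Every table, column, and variable name here is invented for illustration; only the staging structure mirrors the steps above.

```sql
-- Hypothetical sketch of the staged query; all names are made up.
DECLARE @CutoffDate DATETIME = '2011-01-01';

-- Steps 1-2: collect currently valid rows, plus rows that were valid
-- at some point after the cutoff date, into one temp table.
SELECT ItemId, StatusCode, StatusDate
INTO #Stage
FROM dbo.Items
WHERE StatusCode = 'VALID';

INSERT INTO #Stage (ItemId, StatusCode, StatusDate)
SELECT h.ItemId, h.StatusCode, h.StatusDate
FROM dbo.ItemHistory h
WHERE h.StatusCode = 'VALID'
  AND h.StatusDate > @CutoffDate;

-- Steps 3-4: filter out invalid rows in two passes.
DELETE s
FROM #Stage s
JOIN dbo.Exclusions e ON e.ItemId = s.ItemId;

DELETE FROM #Stage WHERE StatusDate IS NULL;

-- Step 5: reshape into a second temp table.
SELECT ItemId, MAX(StatusDate) AS LastValidDate
INTO #Final
FROM #Stage
GROUP BY ItemId;

-- Step 6: join back out to the real tables for the return set.
SELECT i.ItemId, i.Name, f.LastValidDate
FROM #Final f
JOIN dbo.Items i ON i.ItemId = f.ItemId;
```

The temp tables are the important part of the shape here, because, as it turns out, they’re exactly what SSIS refuses to swallow.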

So easy.

If At First You Don’t Succeed

So I’m new to SSIS (though I’ve learned more about it in the last two days than I’d have liked). I found the Data Flow task and thought I would be done by lunch. I threw in an OLE DB source and pasted in my magic query. Of course, this angered SSIS. I was quickly told in no uncertain terms: F off.

Problem: Turns out you cannot use DECLARE or CREATE syntax inside the SQL command for an OLE DB source.

Maybe If I…

So I couldn’t use my script directly. Fine. I’ll make it a stored procedure and return my data set. I set up the procedure and we’re in business. My somewhat complicated query runs on our test box in about two minutes. So I plug this into the OLE DB source via an EXEC in the SQL command. This proceeds to destroy Visual Studio. I give up on the process recovering after 15 minutes.

Problem: SSIS, in the interest of being easy, makes you rely on magic to discover metadata about your queries and automagically populate column and transform information for you. Unfortunately, part of this magic involves adding a preamble to your query to pull metadata and a preview of the rows. This can lead to some terrible query plans for the metadata retrieval call.
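For what it’s worth, older versions of SSIS implement that preamble by wrapping your command in SET FMTONLY ON. One workaround I’ve seen suggested is to disable it explicitly in the command text so the procedure runs with a normal plan; the procedure name here is hypothetical:

```sql
-- SQL command text for the OLE DB source. Assumes an SSIS version
-- that issues SET FMTONLY ON for metadata discovery.
SET FMTONLY OFF;
EXEC dbo.usp_GetExportData @CutoffDate = ?;
```

The trade-off is that the procedure actually executes in full during design-time validation, so you’re swapping a pathological metadata plan for one real (possibly slow) run.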

Surely This Will Work

The thought occurred to me that if SSIS was searching for metadata on the return results of the stored procedure, I’d bypass this magic by turning the stored procedure into a table-valued function. After some trouble compiling the T-SQL as a function, I quickly realize that temp tables are not allowed: I have to use table variables, due to the data integrity contracts for SQL Server user-defined functions. No problem. Quick fix. Back to the OLE DB source to plug in a query against my function, passing in the parameter in fnMyFunction( ? ) form. SSIS laughs in my face.

Problem: SSIS is perfectly capable of parsing your query, as long as you don’t try to pass a parameter to a function. Oh, so your parser understands ( ‘hard coded value’ ) but not ( ? ). You’re a dick, SSIS.
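In case it saves someone else the same dead end, here is the distinction as I hit it; the date literal is made up:

```sql
-- Rejected by the SSIS parser: a parameter marker inside the call.
SELECT * FROM dbo.fnMyFunction( ? );

-- Parsed without complaint: the same call with a hard-coded literal.
SELECT * FROM dbo.fnMyFunction( '2011-01-01' );
```

One escape hatch, if you can live with it, is to build the whole statement in an SSIS string variable (data access mode: SQL command from variable), so the value is substituted into the text before the parser ever sees a question mark.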

If You Want To Play Dirty, I’ll Play Dirty

I was now on a mission. It was time to go rogue and break some rules. So I broke out of the data flow and into the control flow. I added a script with CREATE/DROP statements for a “temporary” table (read: not a temp table or table variable) that was actually going into the schema, then selected the result set returned by the stored procedure version of the script into it. I then piped into the data flow and just ran a select all from this table. SSIS was perfectly happy with this, since I turned off design-time validation on the OLE DB source. I funneled this into my flat file, then piped from the data flow into another script that dropped the table before finishing execution.
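Roughly, the rule-breaking version looks like this, spread across the control flow. Every name is invented, and the column list would have to match whatever the procedure actually returns.

```sql
-- Control flow, step 1: create a real table in the schema and fill it
-- from the stored procedure. (All names here are illustrative.)
IF OBJECT_ID('dbo.ExportStaging') IS NOT NULL
    DROP TABLE dbo.ExportStaging;

CREATE TABLE dbo.ExportStaging
(
    ItemId        INT,
    Name          VARCHAR(100),
    LastValidDate DATETIME
);

INSERT INTO dbo.ExportStaging (ItemId, Name, LastValidDate)
EXEC dbo.usp_GetExportData @CutoffDate = '2011-01-01';

-- Step 2: the data flow's OLE DB source (validation off) just runs
--   SELECT * FROM dbo.ExportStaging;
-- and funnels the rows into the flat file destination.

-- Step 3: a final control flow task cleans up after the export.
DROP TABLE dbo.ExportStaging;
```

It’s ugly, and it leaves a window where a real table exists in the schema, but SSIS has no idea any rules were broken.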

Apparently I’m Lucky

While in the middle of this giant mess, my boss, while laughing at me for having the bad fortune to get this hellacious bug, told me that I was actually lucky that I was getting to work with SSIS rather than DTS. He then launched into a list of all the ways SSIS was superior to DTS.

That’s all fine and dandy, but if I’m looking at two cars, it doesn’t matter that one has a 400-horsepower engine, leather interior, and jetpack wings if I can’t start either one and get from point A to point B.

Who knows. Maybe SSIS will grow on me when I have some simpler queries to run. You didn’t make a good impression, though, SSIS. I’ve got my eyes on you.

Zombies

So I’ve been powering through an audio drama recently. We’re Alive is an amazing story about a zombie outbreak, more complex than anything I’ve seen in the past. In typical contemporary style, the zombies are fast. What this story does that I haven’t seen before is change how the zombies operate. To avoid spoilers, I’ll just let you listen and find out for yourself, slowly, along with the other survivors.

Definitely check it out.

Computer Science Epiphany

It is a little embarrassing that it has taken me this long, but I recently had an epiphany related to my career in computer science. It was kind of an earth-shattering notion, really, and it hit home while I was attending the CodepaLOUsa conference: just because someone is an authority on a subject does not mean they’re right.

Don’t judge me. It may seem simple, but after twenty-some-odd years of existing and thriving in an environment where memorization and acceptance of best practices are expected and rewarded, I had come to see this as just the way the world worked. It wasn’t until recently, when I started looking into some methodologies and technologies on my own, that, surprise, surprise, I didn’t agree with what some people were espousing. Not only that, but I wasn’t necessarily wrong for disagreeing with the authority.

I think I’ll start speaking up in meetings at work, now.