Outlook Support

Friday, 28 January 2011

Testing & Programming = Oil & Water

Posted on 22:56 by Unknown
I was watching a science program just now and it occurred to me that Testing is very much a science. And then I wondered about Programming.

I started in IT over 22 years ago doing programming.  For me, the process of programming broke down into three parts: figuring out the algorithm to solve the problem, implementing/coding the solution, and cleaning up the code (for whatever reason - e.g. maintainability, usability of the UI, etc.).  It gets more complicated than that of course, but I think that about sums up the major activities as I saw them. (SIDE NOTE: I didn't write those to mirror TDD's Red-Green-Refactor, but it does align nicely that way.)

When I think back on my experiences in programming, I don't see a lot of overlap with my experiences in Science (~ 8 years studying, researching and doing Physics & Environmental Science + teaching Science on top of that).  Science is about answering questions.  The Scientific Method provides a framework for asking and answering questions.  Programming isn't about that.  Building software isn't about that.  I'm having difficulty at the moment trying to see how testing and programming go together.

It occurs to me that schools and universities don't have any courses that teach students how to build software.  It also occurs to me that schools and universities do provide students with the opportunities to learn and develop the skills required to build software well.  The schools just don't know they're doing that, and consequently the students don't get that opportunity intentionally.

I'm not talking about learning to program.  That's trivial.  Building software isn't about programming.


Building software starts with an idea - an idea that someone will pay money for.  School courses to watch for here include - Economics, Entrepreneurship.

Building software requires people to work together. School courses that may apply - Business, Math/Finances, Psychology.

Building software requires people to work under constraints. "Project Management" is not really taught in schools, but there are many courses available to the public. I found this really painful to learn by doing. A whole world of insights opened up when I took my first PjM course. Reinventing the wheel really is dumb here.  I believe this should be formally taught in schools - in High School actually (the earlier, the better).

Building software requires people to solve difficult problems creatively.

This one is interesting.  I think there are many opportunities for people to learn this skill in school.  I know that we definitely covered this in Science.  I also know that Engineering programs teach students how to do this.  There are many more faculties and programs that this would apply to and they all have one thing in common - there's some formula or method for solving problems for some purpose.

The thing is, that purpose is different in each case.  See, here's where my mind is doing flip-flops.

In Science, when we solve a problem, the outcome is usually more information.  This information feeds back into the original idea to help us check the validity of our initial premise/hypothesis.  Sure, there are moments of free-form exploratory investigation, but I don't believe that happens a lot.  An experiment without a particular question in mind can go down an infinite number of paths, so unless you are intentionally trying to waste time and money, you will start with some question in mind before you begin your experiment or investigation.

In Engineering, solving a problem is different in that the outcome is usually something real, something tangible, some application or system that fills a need.

The purpose of Science is not to build things but to answer questions.  The purpose of Engineering is to build things - and to do so safely, ethically, within desired parameters for intended purpose, and so on.

So where does that leave us in building software?  *Building* software is definitely an Engineering task... and then some. I am temporarily over-simplifying the process of building software to focus only on the requirements gathering, design, coding and deployment phases.

"Testing" comes from and is an integral part of Science, so how does it fit in with these software engineering/development phases of requirements gathering, design, coding and deployment?  Well, it doesn't.  It has nothing to do with them.  And yet, it has everything to do with them.

That is, from one perspective, at no time do you ever need to test anything to get through any of those phases.

That last statement, while true, kind of goes against everything I ever learned in school.  Whenever I did math problems, I always checked that I got the right answers against the solutions in the back of the textbooks.  Why did I do this? Why did I care?

I did it so that I could tell myself that *how* I solved the problem was correct - it got me the same answer that the textbook and my teacher cared about.  I discovered there were exceptions, of course.  That is, there were times when I got the correct answer but my method was wrong in some way.  Dumb luck does play a role in life and that was when I first discovered the evil twins named Type I and Type II errors.  (Side Note: I wonder if that's where Dr. Seuss got his idea for Thing 1 and Thing 2?  Hmm..)

So, the process of checking answers with the "approved" solutions, and handing in assignments for grading by teachers is a feedback mechanism to tell me that I've learned how to solve certain kinds of problems in ways that provide the desired results.  Let's assume for a moment that's a good thing.
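That back-of-the-textbook check is essentially what testers call an oracle: an independent source of the expected answer. A minimal sketch, using a hypothetical math problem (the function and the values are my own illustration, not from any actual textbook):

```python
# Hypothetical sketch: the "answer in the back of the book" as a test oracle.
def solve_quadratic_root(a, b, c):
    """My method: return the larger root of ax^2 + bx + c = 0 (assumes real roots)."""
    disc = (b * b - 4 * a * c) ** 0.5
    return (-b + disc) / (2 * a)

# The oracle: the textbook says the larger root of x^2 - 5x + 6 = 0 is 3.
expected = 3.0
assert solve_quadratic_root(1, -5, 6) == expected  # passes: method and oracle agree
```

Note that a passing check only tells you the answers agree; as the Type I/II aside above suggests, it can't by itself prove the *method* was right.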

Getting back to building software, you can go through requirements gathering, design, coding and deployment without ever once checking that you are producing the desired solution or results.  In the end, this is a monumental waste of time and money, and is completely incongruous with the initial premise that you are building a product/service/solution that someone will pay money for.  That's bad economics.  That's psychotic.

So how do we fix this?  What's the problem here?

Well, one problem is that we now have a question, we have doubt at the end of each of these phases.  We have a desire to learn if the end result of each stage and for the whole process is meeting [our/someone's] expectations.  The answer at the back of the textbook here will be provided by the people who are choosing to pay you and not your competitor for what you produce/release/ship.

Hey! Wait a minute!  Science helps us answer questions!  Testing is a small part of that process.  The bigger process starts with a question that stems from some research or exploration of the initial area of interest.  The hypothesis is the part we really care about.  The "how do we go about gathering enough information to answer this question" part is something different, and we should get people who know how to do this to either (a) do it for us, or (b) help us do it for ourselves. Then there is the analysis of that data or observations in the context of the initial hypothesis.

But this is a different layer now!  We're adding a layer of "science" on top of "engineering".  That's weird.  That's like trying to mix oil and water together.  That is, they don't mix.  If you shake them up together you only end up with some cloudy mess that eventually will separate out again.

So what does this mean for building software? We need people who are skilled at engineering solutions, and we need people who are skilled at identifying and answering questions about the solutions being engineered.  I believe these are two, very separate skill sets required to be successful.

However, my experience in the Software/IT industry over the last 20 years has been that only one of those skill sets has really been identified as important or relevant -- that of the programmer or engineer in building the solutions.

This is a problem.  There's a huge knowledge gap here.

Schools don't teach you how to "test" in the context of software development.  Every single Testing "certification" agency I have met to date misses the mark. They don't teach you the correct skills. They "teach" you superficial documentation skills that produce information *like* the kind of information that is required.  That would be like me handing out Plumbing certificates to anyone who successfully completes the Mario & Luigi video games because these characters are plumbers in the games.  That's just not right.  Likewise, there is no actual "science" performed by teaching people to create scientific-like reports. (Although it happens sometimes that testers learn testing skills accidentally if they pay attention to what they're really doing.)

So, where are we here? Building software requires layers of intelligent, creative effort and problem-solving abilities.  These layers are complementary and require different skill sets.  Just like you wouldn't hire an accountant to deploy your systems, I don't think it's wise to hire a programmer to provide valuable testing insights into the development process.  It's the wrong skill set.

Oil and water, or oil and vinegar - the analogy holds for me. Testing is something completely different from Programming and building software.  It's a layer on top to help you know that what you are doing is on target for what your paying customers are expecting.  Some might call that value.
Posted in building software, certification, engineering, programming, science, TDD, testing

Monday, 10 January 2011

Software Testing "Popcorn" button

Posted on 09:17 by Unknown
I made myself some microwave popcorn for a snack just now.  Placed the popcorn bag in the microwave, pressed the 'popcorn' button and then 'start'. Someone next to me said: "There's a popcorn button?" Um, yes, there is.  In fact, there has been a 'popcorn' button on every microwave oven I've ever seen.

I explained to my colleague that the recommended time on the bag (in this case it was 2 min 30 sec) doesn't work on every oven.  Different ovens have different power output and so the actual cook time may vary.  If I go with the default time, it might burn or be under-done and leave too many unpopped kernels in the bag.  You could figure out the correct time in a few ways.


Method 1: Math
Start by taking a look at the power output of the oven. Full-size ovens deliver 1,000 - 1,600 watts of power, and mid-size ovens yield 800 - 1,000 watts. Higher wattage heats food more quickly.  If it's not explicitly written on the bag, assume the default popcorn popping time is for a 1,000 watt oven.

Use a time/power relation along the lines of: ( t_your_micro_oven x P_your_micro_oven ) = ( 150 sec x 1000 W ), which rearranges to t_your_micro_oven = 150 sec x ( 1000 W / P_your_micro_oven ).

I converted the 2:30 recommended time on the bag to 150 seconds for convenience.  The power of your microwave oven should be written on the back somewhere, probably close to the plug.  This leaves your popping time as the only unknown variable in the equation, so it should be straightforward to solve.

That might work.
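Since higher wattage heats food more quickly, the math comes down to scaling the bag's recommended time down as your oven's power goes up. A sketch of that calculation (the wattage figures are just examples):

```python
# Sketch: scale the bag's recommended popping time to your oven's wattage.
# Assumes the energy delivered (power x time) should match the reference oven.
def popping_time(oven_watts, ref_seconds=150, ref_watts=1000):
    """Return the adjusted popping time in seconds."""
    return ref_seconds * ref_watts / oven_watts

# A higher-wattage oven needs less time than the 1000 W reference:
print(popping_time(1200))  # 125.0 seconds
print(popping_time(800))   # 187.5 seconds
```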

Method 2: Brute Force/Iterative
Put the bag in the oven and follow the instructions exactly.  Make no changes.  When the time is complete, take the bag out and put the contents in a bowl.

If you are happy with the result, congratulations! You are done.  If not, we will need to change the cooking time for the next bag.

On a piece of paper (or in a document on your computer if you prefer), make some observations to capture things like: time settings used, quality of the popcorn output, taste, number of unpopped kernels, and so on.  Keep the note next to the microwave oven or on the container where you keep more popcorn bags, so that you can reference these 'test results' for comparison the next time.

Basically, you know where I'm going with this.  Pop the first bag with the default recommendations and keep varying the subsequent popping times until you are happy with the result.  This will require "n" bags to iterate through until you are happy with the end result.

This should eventually produce high-quality results.  It may take you some time and will be costly to do it this way as you may have to go through many bags until you get it just right.
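The record-and-adjust loop above could be sketched like this (the adjustment rule and the thresholds are my own invention for illustration, not a recommendation):

```python
# Sketch of the iterative approach: record each trial, then adjust the
# time until the result is acceptable.
def next_time(seconds, unpopped, burnt):
    """Naive adjustment rule: add time if too many kernels were left
    unpopped, remove time if anything burnt."""
    if burnt:
        return seconds - 10
    if unpopped > 15:  # arbitrary threshold for "too many"
        return seconds + 10
    return seconds  # happy with the result: keep this setting

trials = []            # the 'test results' notes
seconds = 150          # start from the bag's default time
for bag in range(3):   # each pass through the loop costs one real bag
    # ...pop a bag for `seconds` seconds, then record what you observe...
    unpopped, burnt = 20, False  # example observation: too many unpopped
    trials.append((seconds, unpopped, burnt))
    seconds = next_time(seconds, unpopped, burnt)

print(trials)   # each entry: (time used, unpopped kernels, burnt?)
print(seconds)  # the time to try on the next bag
```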


Method 3: Popcorn button

Put the bag in the oven, press the 'popcorn' button, serve and enjoy.

I'm not certain how this works.  I shall call it "magic" for now.  I can speculate that perhaps the microwave oven manufacturer has a team of engineers dedicated to the "Perfect Popcorn Production" using their equipment and are responsible for programming the correct power and time settings into their ovens.

Maybe they use Method 1 above.  Maybe they use a combination of methods 1 and 2 above.  Maybe it's something else.

The point here is that someone has already done the thinking for me so that I can focus on the quality of experience of using their oven.

Wow.  I have so many comparisons and analogies back to Software Testing running through my head right now.

One that jumps to mind is development & testing terminology/jargon.

  • When is a test not a test? When it is a check. When it is an inspection. When it is a question. And so on.
  • Are your [agile] development sprints 2 weeks? No? Why not? That's the default value, so it should work for you, right?

What if software organisations were responsible for their own 'popcorn' buttons? That is, what if the 'popcorn button' were a way to communicate the methods and models used within an organisation that produce the desired 'quality' result?

As a consultant, one of the first things I do is watch and listen to the development and project team members. I need to understand their terminology and way of doing things.  That helps set a reference frame for me.  If I want to make an improvement or change somewhere, I need to have an understanding of how to do that on their terms, not terms according to some industry standard or certification terminology dictionary that may not apply.

What would be the point?  Well, I think it would be handy if we could abstract out some of the desired practices from the terminology and implementation details that are specific and custom to every organisation, team or project.

Now that I think about it, maybe we as consultants are the "Popcorn Programming Facilitators."  That is, the company wants to implement an automated regression testing activity.  What does that look like? How can that work for us? What kind of output do we get?

Some days we identify the required popcorn buttons (e.g. Regression Testing, Bug Tracking System, Produce Status/Progress Reports). Some days we help program them.

What do you think?
Posted in context-driven, development, software, testing
