Outlook Support


Friday, 12 November 2010

Fishing for Wisdom

Posted on 16:34 by Unknown
I just came back from a week at the AYE Conference. My head is full of new ideas swimming around and stirring up half-baked old ones - which is a good thing.

One of the thoughts causing my head to spin came from a session one evening where we discussed ideas to improve the conference moving forward. Johanna Rothman led the session and at one point she mentioned that the AYE workshop sessions included pure [Virginia] Satir [ideas, models, etc.] and applied Satir. This got me thinking about some of the subtle differences I had noticed about the sessions and what they meant to me.

In particular, some of the ideas and models I learned from the AYE sessions appear to dwell longer in my mind and apply to a broader spectrum of situations while others seem to be more specific - i.e. an application of a model in a particular context. Don't get me wrong, whether you choose to attend a pure Satir or applied Satir workshop at AYE (and the sessions aren't labelled as such because it doesn't really matter), it's a win-win scenario. :) Sure, different hosts have different styles, but each session is different every time so you sometimes see people attend the same session again to see what new insights they get.

So, what's the big deal here? Why did I get stuck on a small point like this? Well, it reminded me of the time when I was in Teacher's College in the mid-90's, preparing to become a High School Science teacher.

I had 2 main professors in Teacher's College - one for each of my 'teachable' subjects of Physics and Chemistry. Their styles were very different. One professor seemed to keep me busy while the other made me think a lot.

I'll be honest, coming from a university environment to Teacher's College, I expected to be told what I needed to do to be a teacher. You know - I expected a lecture-style learning environment like most of my previous undergraduate courses. What I got didn't match that expectation so I was a bit frustrated at first until I discovered the secret that no one clearly explains to you.

What's the secret? Okay, I'll tell you. The secret is that when you go to Teacher's College, *you* are the teacher, not the student. So, by going there acting like a student expecting to be taught, I had the perspective all wrong.

I don't recall when the paradigm shift happened for me but I'm glad it did. After that point, I didn't consider my professors to be the ones teaching me to be a teacher; I saw them as guides to help me learn the things I needed to be a better teacher.

The real teacher here is experience. And that you can't get unless you are doing what you want to be doing, not sitting in a classroom somewhere talking about what you want to be doing. So what did I get from the Teacher's College experience? I got access to many different 'teachers' to help me deconstruct my experiences so that I could learn from them.

I need to pause for a second and think about that last sentence again. That sounds suspiciously a lot like the AYE conference experience to me. Hmm.

So what was different between my two main professor/guides? I think (now) it was something similar to the difference between the 'pure Satir' and 'applied Satir' AYE sessions. One professor offered tips and ideas that applied in certain situations - the ones we said mattered to us - while the other discussed models and ideas to help us learn from our own experiences in broader situations. Looking back, those things weren't clearly stated that way at the time. I think I understand a bit more now about how they were trying to help us, in different ways, to become better teachers. Of course, both professors provided us with lots of opportunities to practice demos and teach short topics in an environment where we could safely receive feedback from our peers. (Hmm, more AYE conference and PSL familiarity here.)

I recall that someone once mentioned the old Chinese proverb while we were at Teacher's College:
Give a man a fish and you feed him for a day. Teach a man to fish and you feed him for a lifetime.

The quote makes me think of the difference between information and knowledge. I think both professors were trying to teach us to fish (i.e. give us knowledge), just in different ways. They both wanted us to leave the college more confident, with knowledge that we could use to help ourselves become better teachers moving forward.

Back to AYE and the present day. Reflecting upon this past week, I learned new models and ideas that I can apply in many ways - some at work and some in life beyond the workplace. I think I left with a few fish and a few new fishing techniques.

The old proverb bugs me though. I don't think it completely captures the full experience and feeling of what happened. There's something missing, something meta.

And then it came to me today. It's not the fish. It's not learning to fish either. It's the fishing.

If you pay attention to what you are doing while you are fishing, I think that leads to something other than information or knowledge; it leads to wisdom.

Continuing with this proverb as an analogy, if I learn deep-sea fishing while on vacation, I don't think that will help me very much if I decide to go fly-fishing at a local river. There are many different ways to fish - for the different kinds of fish, the environments in which they live, and the purpose of fishing (e.g. food vs sport).  If we pay attention to more than just the types of fish, their environments and techniques to catch them, we can learn something more, something bigger.  I find it hard to think in broad ideas like that sometimes.  I also find those are the most rewarding moments though.

Attending the AYE conference (and PSL this past Spring) was like that for me.  My head doesn't stop thinking about ways to apply the models and ideas we experience at AYE.  Meeting wonderful, intelligent, kind practitioners from all over the world helps enrich the shared experiences in ways that bring us closer together.  We talk about fish (individual experiences) and techniques to help find solutions to problems we think we see.  And then the hosts/speakers go and show us things that help us solve problems we didn't even think about or see!

For me, attending AYE is an opportunity to meet old friends, learn about myself, learn about how to interact better with others (both at work and in personal life), share experiences and knowledge with colleagues, learn some new problem-solving techniques, make new friends, and pause for a moment to reflect upon where I am in life and where I'd like to be.  It's a moment to notice that I'm fishing - I'm learning and growing.  And that others are fishing too.  And while some are fishing for similar things and others for different things, we all recognise that we are fishing so we have that in common.

It's going to take me a bit of time to unpack all of the ideas I was exposed to this past week because learning happened on many different levels.  I could post some notes, and I plan to, but I don't believe the notes alone can convey the experience of learning that happened there.  Much learning happened between conference sessions too.  You meet so many people with similar or related interests and different experiences that once you start talking in the hallways, over dinner or lounging about somewhere, you can't help but continue to learn and think about things in new ways.

If you want, that is.  If you're into that sort of thing.  If learning, growing, and working better with other human beings isn't your cup of tea, then this conference is definitely *not* for you.

To everyone else, I highly recommend the experience.  This conference, and the PSL workshop, are opportunities that shouldn't be passed up.  It might even change the way you think about things.  It has for me.

To the AYE and PSL hosts, Jerry, Don, Esther, Johanna and Steve, to my Teacher's College professors, Peter and Tom, and to my family, friends, and colleagues who have all provided me with helpful feedback and information to help me grow and be a better person, I thank you.
Posted in AYE, conference, learning, people, Satir

Thursday, 30 September 2010

Using MS Outlook to support SBTM

Posted on 20:56 by Unknown
Okay, to recap, Session-Based Test Management (SBTM) is a test framework to help you manage and measure your Exploratory Testing (ET) effort. There are 4 basic elements that make this work: (1) Charter or mission (the purpose that drives the current testing effort), (2) Time-boxed periods (the 'sessions'), (3) Reviewable result, and (4) Debrief. There are many different ways that you might implement or apply these elements in your team or testing projects.
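
As a rough sketch of those four elements in one place (the field names here are my own for illustration, not part of any SBTM specification), a session record might look like this:

```ruby
# A minimal session record carrying the four SBTM elements described
# above. Field names are illustrative, not taken from the SBTM papers.
Session = Struct.new(:charter, :start_time, :duration_minutes,
                     :notes, :debriefed, keyword_init: true)

session = Session.new(
  charter:          'Explore the import wizard with malformed CSV files',
  start_time:       Time.new(2010, 9, 30, 9, 0, 0),
  duration_minutes: 90,      # the time box
  notes:            'Reviewable result goes here: bugs, coverage, issues',
  debriefed:        false    # flipped to true after the debrief
)

puts session.charter
```

However your team implements the framework, each session should be able to answer: what was the mission, when and how long did we test, what did we find, and did we debrief?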

Let's take a look at tracking the testing effort strictly from a Project Management perspective. Years ago, when I first became a test manager, I was introduced to the idea of the 60% 'productive' work day as a factor to consider when estimating effort applied to project schedules. That is, in a typical 8-hour workday you don't really get 8 complete, full hours of work from someone. I don't believe it's mentally possible to get that. The brain needs a break, as does the body, and there are many natural distractions in the workplace (meetings, email, breaks, support calls, stability of the code or environments, and so on), so the reality is that the number of productive working hours for each employee is actually something less than the total number of hours they're physically present in the workplace.

That 'productivity' factor changes with each person, their role and responsibilities, the number of projects in the queue, and so on. Applying some statistical averaging to my past experiences, I find that 60% seems about right for a tester dedicated to a single project. I have worked with some teams that have been more productive and some much less.

So what does this look like? If we consider an 8-hour day, 60% is 4.8 hours. I'm going to toss in an extra 15-minute break or distraction and say that it works out to about 4.5 hours of productive work from a focussed employee in a typical 8-hour day. Again, it depends on the person and the tasks that they're performing, so this is just an averaging factor.


4.5 hours is an interesting number. In the SBTM framework, the "Time box" length for the test session revolves around a normalised value. The original SBTM presentation ("How to Measure Ad Hoc Testing") suggests that 1 "Normal" session = 90 minutes (+/- 15 minutes). Your "normal" time box session may be 90 minutes, it may be less. Customise it to suit your team's needs and fit.

I'll use 90 minutes for now because it's the default recommended value to have a focussed test effort to get some solid testing done. 90 minutes. That's 1.5 hours. So 3 * Normal = 4.5 hours.

Therefore, from a project management perspective, 3 Normal sessions is an average tester's target productivity level for completed test sessions in an average 8-hour day.
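
That arithmetic can be sketched in a few lines of Ruby. The defaults below are the averaging assumptions from this post (8-hour day, 60% productivity, an extra 15-minute allowance, 90-minute sessions), not fixed rules - tune them to your own team:

```ruby
# Estimate how many "normal" SBTM sessions fit into a productive day.
# Defaults reflect the rough averages discussed above.
def sessions_per_day(workday_hours: 8.0, productivity: 0.60,
                     extra_break_minutes: 15, session_minutes: 90)
  productive_minutes = workday_hours * 60 * productivity - extra_break_minutes
  (productive_minutes / session_minutes).floor
end

puts sessions_per_day  # 480 * 0.60 - 15 = 273 minutes -> 3 full 90-minute sessions
```

Shorten the time box (say, 60-minute sessions) and the same productive time yields more, smaller sessions - the total focussed testing time stays the same.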

It doesn't seem like a lot, three 90-minute sessions, but you'd be surprised at the number of distractions throughout the day and the development sprint/cycle that can prevent a tester from completing 3 sessions in a day.

Recently, I've been working with a tester on an Agile team who has been facing a high distraction level. As a result of all the interruptions for quick checks, reviews, design meetings, and so on, he has been struggling to complete 3 sessions in a day.

I should mention that I *AM NOT* asking him or anyone else on my team to complete 3 sessions in a day. I am not concerned with the numbers or the math of tracking the sessions completed, especially because the development team environment is very integrated and highly collaborative.

What I am concerned about is trying to find ways to block off his time so that he can get some solid testing done, uninterrupted. One of the powerful aspects of the time-boxed session is to reduce or minimise the distractions so that your brain can maintain the focus on the testing problems at hand. The more the distractions and interruptions, the more likely that something important or interesting will be missed.

Enter MS Outlook. When I look at my Outlook calendar it is riddled with meeting requests throughout the week. I have recurring meetings, impromptu meetings, lunch and learns, scheduled debriefs, sprint planning, retrospectives, demos and so on. I'm the Test Manager/Lead, so that's expected. When I look at my tester's calendar, it is fairly blank/open, save for the daily stand-ups and a few other weekly team meetings. It seems counter-intuitive that someone with an open calendar, so much available 'free' time, should be unable to block off time-boxed periods dedicated to testing specific features and charters.

So I've started to schedule his test sessions in MS Outlook by scheduling Appointments. At the start of the day, we'll sit down and come up with 3 important charters that we'd like to cover in the day. We block off one session/appointment in the morning and two in the afternoon. The "Subject" is a summary of the charter and priority. If the session involves another Dev or team member, they can be included in the appointment, which is now a 'meeting request'.

Outlook is a convenient tool since it (a) is available on everyone's desktop, (b) has built-in reminders, and (c) allows the tester to *see* all the free time available between scheduled test sessions. It's a way for him to try and regain control of the *uninterrupted* test sessions.
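
If you'd rather generate those appointments than click through Outlook by hand, one portable approach is to emit an iCalendar (.ics) file that Outlook can import. This is a stripped-down sketch with a made-up charter; a production-quality file would also carry UID, DTSTAMP and PRODID fields:

```ruby
# Minimal sketch: build an .ics appointment for a time-boxed test session.
# The charter text and times are invented examples.
def session_appointment(charter:, start_time:, minutes: 90)
  finish = start_time + minutes * 60                 # Time + seconds
  stamp  = ->(t) { t.strftime('%Y%m%dT%H%M%S') }     # iCalendar local time
  <<~ICS
    BEGIN:VCALENDAR
    VERSION:2.0
    BEGIN:VEVENT
    DTSTART:#{stamp.call(start_time)}
    DTEND:#{stamp.call(finish)}
    SUMMARY:#{charter}
    END:VEVENT
    END:VCALENDAR
  ICS
end

ics = session_appointment(charter: 'HIGH: Explore login error handling',
                          start_time: Time.new(2010, 9, 30, 9, 0, 0))
puts ics
```

Putting the charter and priority in the SUMMARY line mirrors the "Subject" convention described above, so the session reads like any other meeting on the calendar.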

SBTM isn't intended to be a time-tracking tool. It's a productivity-enhancing tool. By keeping focussed on the charter, you give your mind time to think about the important things -- to learn about the system and the software, to apply the appropriate tools and techniques, to make the best observations you can in the time you have available. Performing good Exploratory Testing is a delicate and complex thought process.

I see my job as doing whatever I can to help provide the best environment possible for good testing to succeed. I never would have thought that Outlook would factor into that. Who knew?
Posted in ET, management, SBTM, testing, time

Thursday, 16 September 2010

Test-Driven Development isn't new

Posted on 09:10 by Unknown
I used TDD as an analogy with a tester today to explain how logging bugs in a bug tracking system drives the development. A bug report represents a failing test (once you verify that it's really a bug, that is) according to some stakeholder need/want.

In Test-Driven Development, the programmer first writes/automates a test that represents the user story the customer/user wants. The test fails. The programmer then writes just enough code to pass the test and moves on, refactoring code along the way.

It's much the same with regular system testing (i.e. in the absence of agile/TDD practices) where a tester identifies and logs a bug in the bug tracking system. One difference is that these bug reports/tests aren't always automated. (Okay, I've never seen anyone automate these bug reports/tests before, but I like to believe that some companies/dev teams out there actually do.) That doesn't change the fact that a bug report is the failing test. Even if it's a manual test, it drives the development change, and then the bug report is checked/retested to see that the fix works as expected.

Bug regression testing, then, is a requirement for good testing and system/software development, not an option.

So, while the agile practices of TDD and others may seem new, I see this one as a retelling of a common tester-programmer practice. If anything, I see TDD as an opportunity to tighten/shorten/quicken the loop between testing feedback and development. With practice, TDD helps programmers develop the skills and habits they need to create code and systems with confidence -- to know that as the system grows, the specific needs of the customers are being met every step along the way. No one gets left behind.

How can we, as testers, help? If your programmers don't practice TDD or automate tests, start investigating ways that you can do this. Investigate Open Source scripting languages. Engage your programmers in discussions of testability of the interfaces. There are many articles and presentations on the internet on the topics of test/check automation, frameworks and Domain Specific Languages (DSL).

Start reading. Participate in discussions (in real life and online). Start developing scripting skills (I recommend Ruby, of course, especially to the tester newbie). If you don't feel confident with your programming skills, help hire someone onto your test team that can help all the testers advance their skills, knowledge, and productivity in that area.

Be the Quality Advocate by putting your words into practice. You want your programmers to start practicing TDD? Show them how you can do it. You are already doing it - scripting/automating the checks that demonstrate a bug failure is just the next step.

Start by automating a single bug failure. Take it from there.
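
Here's what that first step might look like. Everything in this sketch is invented for illustration - the bug number, the helper, and the defect (a discount routine that used to return negative prices for discounts over 100%):

```ruby
# Sketch: turn one bug report into an automated check.
def discounted_price(price, percent_off)
  result = price - price * percent_off / 100.0
  [result, 0.0].max  # the fix: clamp at zero; the check below failed before this line
end

# The check that encodes hypothetical bug report #1234. Before the fix
# it fails (the "red" state); after the fix it passes. Retesting the
# bug is now just rerunning this check.
def bug_1234_fixed?
  discounted_price(10.0, 150) >= 0.0
end

puts bug_1234_fixed?  # true once the fix is in
```

The check is deliberately tiny: it asserts exactly the behaviour the bug report complained about, nothing more, so it can live on as a regression check.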
Posted in bugs, development, TDD, testing

Thursday, 2 September 2010

Why New Year's Resolutions Fail

Posted on 21:41 by Unknown
Someone recently said something to me that made me think. He said that all New Year's resolutions fail because they come at the wrong time.

You know what I mean by New Year's resolutions, right? It's those promises you make to yourself, and maybe to others, right around the end of December that you will change or improve yourself in some way in the new year.

The sentiment may not be wrong, but the timing certainly is. The argument made was that January 1st isn't really the start of the new year - September is. You see, here in North America, whether you are in school or not, most businesses revolve around a "school year" structure of September to June, with July and August being the summer holiday months.

So, if September is the start of the year, we can't make promises to change something in January. That's like starting a 2-week sprint (in Agile Development) and saying half-way through that you are going to have completely new objectives. It doesn't work that way. You already committed to delivering certain goals during the Sprint Planning session at the start.

What's that? What if you didn't set any goals at the beginning of the Sprint/Year in September? Doesn't matter. The Sprint/year started anyway and you are in the middle of it. There's no way you are easily going to shift your life in a totally new direction half way through.

So, the moral of the story is: if you want to make New Year's resolutions, make them in August, not in December. That way you are more likely to follow through with them as the year progresses.

Hm, interesting.

Of course, life changing events can happen any time. You don't need to make a resolution of any kind to change yourself and how you get along in the world. You just need to see yourself how you want to be, and live like you've already reached that goal.

Saturday, 10 July 2010

Testing Software is like Making Love

Posted on 06:58 by Unknown
Work with me for a minute here.

One of the things I dislike about a recent trend in the software testing profession is the lack of analogies and examples that relate to me as an adult. Yes, it's interesting that children are innocent, curious and natural explorers, but they are also naive and inexperienced, and cannot reason or abstract like adults can. I don't find it helpful, for myself or when training new testers, to say you need to be more like children. I'm an adult, so how can you help me now? Do you have an analogy that relates to me as an adult?

Here's an analogy that I think might help.

Testing software is like making love.

What does that mean? For a start, what's the difference between 'making love' and 'having sex'? I think the big difference is caring about the person you are with and wanting to satisfy their needs.

Is there just one way to do it? No. Every person's needs are different, just like every project we help test is unique.

I think people/consultants who specialise in certain things like an automation tool or a process improvement model are kind of like porn stars. They've mastered a technique and go around showing everyone just how good they are. Often, the perceived satisfaction in their results is just superficial and temporary because we are so impressed with their performance and showmanship. Is there real satisfaction or addressing of customer/stakeholder needs in specialists like these? I don't think so. Well, maybe sometimes, but I highly doubt it's the silver bullet/best practice they make it out to be.

Is there a need for specialists? Yes, I think so. We can (usually? often? sometimes?) learn from them. I believe that people (customers, here) fall into high-level patterns and that we (who want to address their needs) should learn from those who have experienced successes in working with certain kinds of customer needs. Does that mean that we should only look to one specialist in our lifetime? That's up to you. That's like saying you only want to learn to use a hammer and will look at all of life's situations with the one tool you know how to use.

Is there value in getting a certificate to show that you've learnt a particular sexual technique? Would you be proud of it? Would you post it up on your wall at work? Would you put it on a t-shirt, business card or advertise that you have this knowledge and/or experience?

There are so many books out there on sex, love and relationships - there are so many different needs and patterns to try and address. If this analogy holds true then the Software Testing industry has a long way to go to catch up on the number of different books published to help people be successful and happy in testing software and systems.

Perhaps we need something like the Kama Sutra of Software Testing. (ooo, I like that. just made it up.)

Does making love involve intercourse? What about tantric sex? What about just cuddling? What if you're alone? Maybe sometimes sex is all you really need at that moment - get in, get the job done, get out. Maybe it's what the customer both needs and wants. Don't think too much about it then. Do your best and move on. Porn stars don't need to build relationships to be famous or successful.

What's the difference between providing someone with what they want compared with what they need? Here's a good follow-up question to ponder: if you provide your customer with what they need instead of what they want, will they still be happy? Will they tell you that? Are you sure that you've really identified what they needed? How do you know? How are you certain about what you think you know?

Talk to your customers. Get to know them. Try to understand both what they are really asking for and what you think they need. Do your best to try and address their needs. Provide everyone on the project with the information they need to be successful in the project constraints. We are only human and therefore make mistakes and miss things sometimes. People can be forgiving too and may provide you with other opportunities to show that you've learned from past experiences and that you care about growing and doing a better job next time.

If the client/organisation isn't forgiving, then take the one-night stand as a learning opportunity and move on. Seek out what you desire. I believe that the best software testers are truly the ones who care about the people we work with and for. They care about doing a good job, about learning, growing and improving their performance and satisfaction level. It's unfortunate that not all of the organisations we work for see that or care. It doesn't stop us though. We're a caring bunch. We're weird that way. ;)

Good software testers are lovers.

Wednesday, 12 May 2010

Happy Limerick Day! (May 12th)

Posted on 14:39 by Unknown
The CEO where I currently work sent around the following note by email at the start of today:
Today is Limerick day. A limerick is a five-line poem with a strict form (AABBA rhyming), which intends to be witty or humorous, and is sometimes obscene with humorous intent.

Here is one from me (if you respond in kind let's stay away from the "obscene" part of the definition)!!
[..snip..]


Of course, that challenge was answered and we have had a steady stream of limericks all day!  :)

Here were some of the ones I thought up between meetings and test sessions...


(The first one is on hiring testers:)

What is Quality?
How do you check the functionality?
Provide info in a timely fashion
Prefer those with passion
Most important, you need personality!

Doing good work takes time and thought
If you think testing is easy, it is not
We apply test techniques
To see what reeks
And help the stakeholders know what's hot.

In Exploratory Testing we Learn, Design, Test.
We believe no practices are "the best"
Context helps us decide
Which approaches are tried
Until we solve the problem we do not rest.

Jerry Weinberg is an important man
Of his work and ideas I am a big fan
He helped me to learn
People are the concern
So make helping them succeed the plan


Okay, so I'm not a poet. ;-)

Can you do better?  Leave some limericks in the comments and have fun! =)

Cheers!  Paul.

Wednesday, 7 April 2010

SBTM ET.XLS spreadsheet with DMY format

Posted on 21:33 by Unknown
Ha ha!  I finally got some time to figure out how to get the ET.XLS spreadsheet to support DMY format instead of the default MDY (US) format.  It turned out to be a small change to the macros, but unfortunately required hard-coding the number of columns in the input files to make it work.  As long as you aren't changing the number of columns in the TXT files, this should work for you.  I also had to remove the forced "m/d" format on one of the worksheet tabs.
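
The actual fix lives in the spreadsheet's Excel VBA macros (not reproduced here), but the underlying idea is simple and worth showing: parse dates with an explicit format string instead of trusting the locale default, because the same text means two different dates under MDY and DMY rules:

```ruby
require 'date'

# The same string parsed two ways - this ambiguity is exactly why the
# spreadsheet needed separate MDY and DMY handling.
raw = '07/04/2010'
mdy = Date.strptime(raw, '%m/%d/%Y')  # US default:  4 July 2010
dmy = Date.strptime(raw, '%d/%m/%Y')  # DMY format:  7 April 2010

puts mdy  # 2010-07-04
puts dmy  # 2010-04-07
```

Any day value above 12 would make the wrong interpretation blow up immediately; it's the 1st through 12th that silently swap month and day.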

You can download a copy of the zipped spreadsheet here: et2_DMY_dates.zip.  I also updated my www.staqs/sbtm page to include this file.

Last July I posted the latest tools-ruby scripts and received some feedback that I wasn't the only one with this problem.  I fixed the date formatting problem using Excel 2003, so if anyone is using an older version of Excel, too bad. ;)

As for Excel 2007, I just started using it this week.  I noticed that I had to play with the Macro security settings (once you make the 'Developer' toolbar visible), and then I got it to work.  I'll be playing with this a bit more in the coming days so if an update is required I'll post it here to let you know.

If you find this updated file helpful, please drop me a line to let me know. Thanks.

Cheers!

Friday, 26 February 2010

TEDxWaterloo

Posted on 13:02 by Unknown
I attended the first independent TED event in Waterloo (TEDxWaterloo) yesterday, 25 February 2010. The theme was "Tomorrow StarTED Yesterday." The web site and twitter account page have lots of great info if you're interested to know more about the event and speakers. There's even a nice photo blog of the event at http://www.longexposure.ca/2010/02/tedx-waterloo/

So what can I add that hasn't already been said? Well, I can tell you what the event meant to me.


I wasn't sure what to expect. "TED" stands for "Technology, Entertainment, Design," but that tells me nothing. I've seen and enjoyed a number of the videos that I have come across - i.e. that were recommended to me in one way or another (email, web page links, twitter, passing conversations, and so on). However, separate videos alone hadn't quite made the connection for me.

The opening remarks helped set the context for the event. In those few minutes, it all came together - it clicked. Aha! This event is about more than the sum of the individual words above. Even the motto "Ideas worth spreading" makes sense to me now.

The people who spoke at the conference and the selected TED videos that were shown hit me somewhere I didn't expect - in my heart. Most of the conferences that I have attended speak to my mind; they help me try to understand or learn something or other. That's nice, but that's not all there is to life.

There's only one conference that I have ever attended that I would compare to the TED event - the AYE Conference. The difference here is that AYE is built from a series of interactive workshops intended to help us understand and work better with each other - human beings, not machines. TED is a perfect complement in that the speakers share stories and ideas which inspire us.

Inspired. Yes, that's a word I would use to describe how I felt during and after the TEDx event.

I spoke with someone at the event whom I hadn't seen in over a year. She described to me how her current project is making her feel dumber every day, dealing with processes and bureaucracy that only serves to confuse and make the project more difficult. At the TEDx event she said she felt like her brain was expanding, opening up again. I think that's another good description of how I felt.

Following the TEDx event, I now have a better understanding of what the words mean when I see them. I have a context for this event and videos - which helps me know what to expect and where/how to process the information.

Here are some notes I took from some of the speakers at the event:

Terry O'Reilly
  • The Age of Persuasion
  • idea of friction - sometimes you need to slow down in order to sell and market new ideas. Brilliant examples
  • He referenced the book "The Checklist Manifesto" by Atul Gawande. (Good example that has come up a few times in testing. I think we need more good, practical examples in our profession.)
Paul Saltzman
  • (I was really surprised and blown away by this talk. Really left me with a lot to think about afterwards.)
  • "magic" - that which is real but you do not yet understand. (I've seen magic used a number of times to help demonstrate certain testing concepts. Paul used it in a different sense here and I think there is a lot of magic in what we do every day.)
  • "humility" - not making yourself feel small, rather, seeing yourself in perspective of the larger universe.
  • "There is no end to love. Love is infinite. There is no end to creativity. Creativity is infinite."
  • "Nothing changes until you do."
Caroline Disler
  • "Western civilisation" - There is no Eastern vs. Western civilisation (unless you are speaking strictly geographically). We are one world.
  • "Those who are ignorant of the past are prisoners of the present."
  • she gave a really interesting summary of where knowledge and language (and even numbers!) originated and how every part of the world has helped bring us to where we are today.
Madhur Anand
  • Heh. I didn't know about Sudbury and the environmental restoration projects happening there. Cool! (My university degree was in Environmental Science, so I had an interest in this talk. As a mining town, Sudbury hasn't had the best rep over the years, so I'm pleased to see that it is leading the way in environmental restoration projects. The world needs more of this.)
  • she used some interesting quotes. One that was new to me was: "To live in a place, you must first imagine it." by Jay Ruzesky
  • The environmental crisis is also a crisis of imagination.
Darren Wershler
  • First one to mention Marshall McLuhan (he said he pulled the short straw backstage before the event. ha ha.)
  • The roles of different kinds of media (untimely, conceptual and impossible) and their influences on us. Something doesn't have to be real or even possible to have an effect/impact on us, to inspire us to new ideas.
  • e.g. transporter technology from Star Trek, or the many inventions by Reed Richards of the Fantastic Four. (Nice comic book reference!)
Marty Avery
  • Be soft to be strong. Namaste.
  • Kayaking story really touching. What does it mean to be a hero?
  • There is some of us in everyone we meet. "You am I and I am you."
Amy Krouse Rosenthal
  • (Google search doesn't do her justice. This was [yet] another amazing talk! Here's a link to reach her: http://www.whoisamy.com/ - watch the videos!)
  • The 7 [musical] notes on life:
    • Always Trust Magic (ATM)
    • Beckon the lovely. Whatever you beckon (attract, look for) will eventually find you.
    • Connected. We are all connected.
    • Do
    • Empty space
    • Figure it out as you go
    • Go do it. Howard Thurman quote: "Don't ask what the world needs. Ask what makes you come alive, and go do it. Because what the world needs is people who have come alive."
  • The [musical reference] 'key' is you
  • "Make the most of your time here."
Wow. And that doesn't cover all the speakers and videos we watched!

This event was certainly something I needed! I was moved and feel like I want to do something even bigger now! Well done! I can't wait to attend the next one or create my own independent TED event! =)

If an independent TEDx event is coming near you, I highly recommend you take the time to attend. If you do, please tell me what you thought and felt.

Cheers!

Wednesday, 24 February 2010

Now with minty-fresh visitor counter

Posted on 11:39 by Unknown
Someone suggested to me this past weekend that I add a visitor counter to this blog. It's one of the most common suggestions made to me over the years and I don't know why.

Back in the mid-90's I had a web site with a counter. It was novel then. I played around with different fonts and features and watched it go up over time. I don't have that site anymore. I set up a new web site about 7 years ago, but adding a counter wasn't one of the important things on my to-do list. Foolish? Dunno. Maybe. Maybe not.

Is Quality measured in numbers?

Anyhoo, I won't go there today. ;-) I'll reserve the philosophical thinking and judgment about the visitor counter for a later date.

This is just a placeholder note to indicate that the counter started today.

Cheers! Paul.

Monday, 8 February 2010

SBTM is not ET

Posted on 20:48 by Unknown
There's a subtle but important distinction that I'd like to talk about. Session-Based Testing is *not* Exploratory Testing. Please stop using those terms interchangeably, because they're not the same thing.

Exploratory Testing (ET) is a testing approach that puts the emphasis on real-time learning, test design and test execution, as opposed to a more "scripted" approach that puts the emphasis on the separation of these activities - separated in time, space, and usually with copious amounts of documented artifacts.

When I first started in I.T. over 20 years ago, any testing I did as part of my programming contracts was exploratory in nature. I didn't call it 'ET' at the time and I certainly didn't approach it with the same discipline and formality that I do today. Back then, programming was my main focus and testing was just something I did as required along the way. Ten years later (or about 12 years ago depending on your perspective), I took a workshop class on "Test Case Design" with Ross Collard. That was an amazing class that opened my eyes to a whole new world of analysis and problem solving that I didn't know before. Cool!

After that workshop, I had plenty of opportunities to practice what I learned, try new techniques and tools, and explore additional testing ideas thrown out onto the just-budding software testing mailing lists. One of the things we discussed in Ross' class was the role of "ad hoc" or informal testing. I don't have access to the data, but some study-or-other at the time (90's sometime?) showed that ad hoc testing failed to produce the same amount of testing coverage that formal test design analysis would.

Okay, I buy that. To paraphrase: guessing ideas off the top of your head consistently produced less coverage than having some structured analytical approaches/techniques/heuristics/models at your disposal. Okay. I don't need a formal study to tell me that.


So what's different with Exploratory Testing? Well, when I first learned about ET at the turn of this century, it instantly clicked with me. Rather than the "guessing" attitude normally associated with "ad hoc" testing, ET clearly defined the testing approach in a way that made you think. You learn something; you design something; you test and observe something; repeat. Note that nowhere in there does it say "take a wild-ass guess and call it good, complete or even 'good enough' testing coverage."

Before I was introduced to ET, I had spent several years practising and training other testers on test design techniques. That helped me fill in the "test design" step of ET. That step is the weakest link with most of the testers I have met and spoken with over the years who have tried and given up on (i.e. failed with) ET. You can't really fake your way through test design. That's why I make it an important part of my hiring/interviewing process (you can read the article online).

So, what *don't* I like about ET? There's just one thing really. The ET approach formalised the learning, test design and test execution aspects of testing, but not the interpersonal communication aspect of it.

The 'scripted' (waterfall) approach to testing relies on the documenting (and maintenance) of hundreds or thousands of test cases, each with their own set of pre-conditions, steps, expected results, and so on. While the value of these documented test cases may be questionable, one thing going for it is that you can share these test ideas with other people quite easily. (They're documented; pass it on.)

In ET, not so much. If the important parts of testing take place in your head as you process all of the inputs and information, and compare them with explicit and implicit requirements and expectations, in order to assess the quality of the A/SUT, then when/how do you share those test ideas with other people (testers, developers, business analysts, etc.)? Well, you don't. Or rather, ET alone doesn't give you any advice for communicating test ideas or testing coverage with others.

Enter "Session-Based Test Management" (SBTM) or just '"Session-Based Testing".

Aha! After a year or two of using ET, I instantly found the merit in SBTM. SBTM provides the framework that you can wrap around an ET approach. It is a way that you can manage the testing effort. It has four main elements: develop specific charters, time-box an uninterrupted work session, create a reviewable result, and review/debrief the session afterwards.
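The four elements above can be sketched as a simple data structure. This is only an illustration of the concepts, not the actual SBTM session sheet format - the `Session` struct and its fields are my invention here:

```ruby
# Hypothetical model of an SBTM session (illustrative only; the real
# session sheet format used by SBTM differs from this sketch).
Session = Struct.new(:charter, :timebox_minutes, :notes, :debriefed) do
  # Every session must produce a reviewable result; here it is a
  # simple text summary that a test lead could read during a debrief.
  def reviewable_result
    "CHARTER: #{charter}\n" \
    "TIMEBOX: #{timebox_minutes} min\n" \
    "NOTES:\n#{notes.join("\n")}"
  end
end

session = Session.new(
  "Explore the login form with malformed input",   # 1. specific charter
  90,                                              # 2. uninterrupted time-box
  ["Tried a 300-char email address: field truncated silently",
   "SQL metacharacters rejected with a generic error message"],
  false
)

puts session.reviewable_result                     # 3. reviewable result
session.debriefed = true                           # 4. debriefed afterwards
```

The point of the sketch is that none of these four elements say anything about *how* the testing inside the time-box is done - which is exactly the distinction this post is about.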

Here's the catch: it is *not* a testing approach! It is a test management framework. Actually, when I teach/describe it to others, I sometimes refer to it as "Session-Based Task Management."

I have taught SBTM to programmers as a way to help them manage their time and reduce the number of interruptions during a work day. I have also successfully implemented SBTM in a waterfall organisation where very little ET was ever performed.

Yes, you read that correctly. I have even wrapped SBTM around a *scripted* testing approach.

Eek! Egad! Gadzooks! Isn't that blasphemy?

Well, actually, no.

You see, I have found that SBTM is an incredibly powerful tool for a test manager. It gives you insights into aspects of testing that you might never have without it.

The four main SBTM elements provide a solid foundation to managing your work, and can be transferred to activities other than just ET. For example: programming, writing, organising/cleaning your basement, any consulting work, and so on.

The original SBTM framework included some Perl scripts that I have long since stopped using. The original archive included a session sheet template, but like any template you can modify it and tailor it to your needs. (If in doubt, just ask James Bach for his thoughts on Test Plan templates! :)) That's the main reason I rewrote the SBTM scripts in Ruby - so that I could customise the session sheets to the needs of the projects I worked on. So, for one project I added a section to the session sheet, and for another project I completely removed the TBS metrics; my Ruby scripts are flexible and can handle the changes easily. (ASIDE: I haven't made this customisable script publicly available on my site yet. Send me a note if you are interested in trying it.)
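As a rough illustration of that kind of customisation (this is not my actual script - just a sketch of the idea, with invented section names), a session sheet can be built from an adjustable list of sections:

```ruby
# Hypothetical customisable session sheet builder. Each project picks
# which sections appear on the sheet; the section names are invented
# for this example and don't match any particular template.
DEFAULT_SECTIONS = %w[CHARTER AREAS START TESTER TASK_BREAKDOWN
                      DATA_FILES TEST_NOTES BUGS ISSUES].freeze

def session_sheet(sections: DEFAULT_SECTIONS)
  sections.map { |s| "#{s}\n-----------------\n\n" }.join
end

# Project A: add a custom ENVIRONMENT section to every sheet.
sheet_a = session_sheet(sections: DEFAULT_SECTIONS + ["ENVIRONMENT"])

# Project B: drop the task-breakdown metrics section entirely.
sheet_b = session_sheet(sections: DEFAULT_SECTIONS - ["TASK_BREAKDOWN"])

puts sheet_a
```

Because the template is just data, adding a section for one project or removing the metrics for another is a one-line change rather than a rewrite.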

In fact, if you follow the intent of SBTM, you don't need to use the session sheet template or scripts at all - as long as you have some agreed-upon reviewable result that you can later debrief. In this way, I have heard of some test teams that have implemented SBTM using wikis, and others that have integrated old Test Case Management systems into the process. Sounds cool and innovative to me!

So, what's my gripe? In the last several weeks, I have read several times that Exploratory Testing includes time-boxed, chartered sessions with reviewable results. Umm, no, I'm pretty sure you're confusing the framework with the approach, the wrapper with the content, the book format with the story.

If you have implemented SBTM on a project, I can make no assumptions about what testing approach you are following. Likewise, if you include ET in your overall testing strategy, I won't assume you are using SBTM to manage that effort.

If you want to talk about ET or SBTM, please try to describe them in the correct context. It will make it less confusing for beginners and other interested parties. Granted, together you have a very powerful combination. But Superman is still super in a different suit. =)

Wednesday, 3 February 2010

Time - Bane or Innovation Catalyst?

Posted on 09:08 by Unknown
Time. What time is it? How much time do we have? When do you want/need it? What's the deadline? I need more time!

If we had all the time in the world for software development, would the delivered results really be of better quality?

A co-worker at a past employer wrote the following when someone sent an email submission for a fun, internal contest the day after the deadline:
The contest ended a long time ago. Trying to submit something now is like submitting your late university assignment.
One of my profs told me:
"I don't care if you have something that's better than all the works of Shakespeare. If you can't get it in before the deadline it's worth nothing to me."

Ha, ha. It was intended as a funny remark at the time but there's some truth in there too.

So, if someone submits an assignment "on time" but of lesser value/quality than they might produce if they had more time, would they still continue to work on their opus or would they give it up to move onto the next project? Do we (as a collective group of intelligent human beings) lose out by putting Time ahead of Quality?


The traditional "Project Management Triangle" puts the emphasis on: functionality/scope, cost and schedule. An experienced consultant can tell the employer: pick/fix any two and we can estimate the third.

I noticed years ago that "quality" isn't in this "triangle". As a novice, I took "scope" and "quality" to be part of the same point. Clearly I was mistaken. When people are focussed on delivering something, on time, at a fixed cost, everyone interprets "quality" in different ways.

I think the Agile manifesto/movement is an interesting response to the "traditional" (a.k.a. Waterfall) approach to software development. It takes the same 3 constraints (of scope, cost and time) and changes up the order of activities to integrate quality into the deliverable products. This is done by embedding customer involvement (via collaboration, user stories, automated acceptance tests) and rapid delivery releases to allow for quicker feedback into the design and implementation. For example, in a traditional/waterfall project, it may take anywhere from 6-18 months to find out your interface/implementation fails to meet the needs of the customers. Or, using agile methods, it might take anywhere from 2-14 days. Your choice.

So what about software testing?

In every waterfall project I have worked on, development always delivered software late into the "test" phase. This meant less time to provide feedback, because the release deadline was fixed. Time is my bane here. I've got less of it and need more of it! ... or do I?

If I stick to a waterfall approach to testing - i.e. develop & document test plans, test strategies, test cases, execute the tests, log the results and communicate the summaries - then, no, time is not my friend here.

But is it a requirement to do testing this way? Whose requirement? How much does their opinion really matter?

I watched my son play a game recently and describe the "glitches" (his lingo, not mine) to his younger brother so that he could try and work around them. I'm pretty sure my boys don't care whether the software team used waterfall or agile methods, or how well their test cases and processes were documented. They found bugs in their game, are annoyed by them, and figured out ways to work around them. Sometimes they just give up on a game altogether.

Personally, I'd say that the customer doesn't really care about how you do your testing - as long as the end result has good enough quality that it doesn't interfere with their intended use of the software or system.

Here's a secret: Nobody cares.

Some lawyers may pretend to care when they are paid to do so, but the reality is that I don't know of a single tester who has ever been charged with manslaughter for failure to document critical test cases that may have caught the bugs that resulted in loss of life.

The FDA doesn't care. Their lawyers tell them that they should care about documented tests and results, so they impose regulations. But the FDA doesn't really care about your documented test cases or test processes. What they really care about is that a minimum standard of due diligence has been performed to demonstrate that a particular product will not harm anyone. That's it in a nutshell. You may not even need testers to achieve that level of quality either.

I could go on, but I think I made the point - nobody cares how you do your testing as long as the collective development effort produces a quality product. You remember "quality" - it's that thing that project managers leave off their project management triangle.

So, if we disregard the premise that testing needs to happen in a "waterfall" fashion, what's left? Well, what do we know? We know that (1) we don't have a lot of time, and (2) we have a lot of features to cover. Oh, and it's also very likely that (3) you have a limited number of resources and people - most likely less than what you'd probably like. (Hey, if we're screwed on the 'time' factor, why not get screwed on the 'cost' factor too, right? ;))

So where does that leave us? Time to innovate! Time to become agile! Talk to your customers; collaborate with your developers and business analysts/product managers; learn the software and functionality as you design and execute the tests because there really isn't time to do those things separately.

Risk-based testing (RBT) works on the premise that there might be something bad/undesirable that could happen, so why don't we start by looking in those places first. RBT is also an appropriate response to the statistical impossibility of complete testing coverage for any useful software program with more than 2 lines of code. That is, if it will take an infinite amount of time to test something, how about if we narrow it down to just some of the areas that we think might be risky in some way (i.e. popular, critical, complex, and so on).
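One way to picture that narrowing-down is to score each area on a few risk factors and test the highest-scoring areas first. This is a toy sketch only - the features, factors and weights below are all made up:

```ruby
# Toy risk-based prioritisation: rate each feature area on a few
# invented risk factors (1-5 scale), sum the scores, and test the
# riskiest areas first. All names and numbers are illustrative.
features = {
  "checkout"       => { popularity: 5, criticality: 5, complexity: 4 },
  "search"         => { popularity: 4, criticality: 2, complexity: 3 },
  "profile_avatar" => { popularity: 1, criticality: 1, complexity: 2 },
}

# Sort descending by total risk score.
ranked = features.sort_by { |_name, risk| -risk.values.sum }

ranked.each do |name, risk|
  puts "#{name}: risk score #{risk.values.sum}"
end
# "checkout" lands at the top, so it gets tested before "profile_avatar".
```

A real risk model would weight the factors and involve the stakeholders in the scoring, but even a crude ranking like this turns an infinite testing problem into an ordered to-do list.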

What else can you do? You have a lot of features to cover in a short amount of time. Well, start by ditching all the test documentation requirements and focus on what is necessary to establish a minimum level of understanding of what's going on.

Do you really need all those documented steps for every test case? No, you don't. Unintelligent automated systems and robots need step-by-step instructions; humans don't. And most humans don't follow the steps consistently either, so just let that one go. Instead, describe the scope of the testing you want to do using checklists and decision tables. The important things need to be discussed in person to ensure clarity of requirements and information, but everything else should be fine in point form.
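For example, a decision table can compress dozens of fully scripted test cases into a few rows of conditions and expected outcomes. Here's a hypothetical login-screen example (the conditions and outcomes are invented for illustration):

```ruby
# Hypothetical decision table for a login screen. Each row replaces a
# fully scripted test case: conditions on the left, expectation on the
# right. All values here are invented for illustration.
DECISION_TABLE = [
  { valid_user: true,  valid_password: true,  locked: false, expect: :logged_in },
  { valid_user: true,  valid_password: false, locked: false, expect: :error },
  { valid_user: true,  valid_password: true,  locked: true,  expect: :account_locked },
  { valid_user: false, valid_password: false, locked: false, expect: :error },
].freeze

# Look up the expected outcome for a given combination of conditions.
def expected_outcome(valid_user:, valid_password:, locked:)
  row = DECISION_TABLE.find do |r|
    r[:valid_user] == valid_user &&
      r[:valid_password] == valid_password &&
      r[:locked] == locked
  end
  row && row[:expect]
end

puts expected_outcome(valid_user: true, valid_password: true, locked: true)
```

Four rows, no step-by-step scripts - and the table doubles as a conversation piece when you review the expected behaviour with the business analyst.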

Worried about how you will capture the test results if you are denied the Pass/Fail test status column? Work it out! Figure out a solution that fits your project's (and organisation's) needs. There are a number of far more useful alternatives out there - e.g. application logging, screen captures, note taking, and so on.

If you don't have enough time to complete a project using the same approach you've used in the past, it's time to try something new. Time to think up new solutions, new processes, and identify/create new tools to help you reach those goals.

The end goal is a high quality product... or maybe just "good enough" quality depending on your situation. The end goal is not to produce sparkling, publishable test documentation. (If it is, consider changing your title from "tester" to "test biographer".)

Don't lose sight of what's important. What will you do with the time you've been given? How will you choose to react to the situation?

Thursday, 21 January 2010

What I learned about Testing from a crazy ex-girlfriend

Posted on 19:58 by Unknown
I was reminiscing with a tester colleague today about how our mothers used to mess with our stuff when we were younger and how it really got on our nerves.

Picture the scene: you have a desk in your room that's plastered with papers and stuff everywhere. And you know precisely where everything is. It's your mess after all.

Enter the mom. She looks around, maybe she's come in to drop off some laundry or to complain about the state of your room or whatever. You aren't around. She starts to tidy. She tidies the papers on your desk and arranges your action figures/books/pencils/Lego/rubber band collection/whatever into a neat arrangement of some kind.

You return. "Ahhhh! Where's my stuff?!?! You changed the order! I can't find anything now! Don't touch my stuff!!"

Your mom, now hurt because she was "only trying to help," vows to never touch your stuff again unless someone's life depends on it. Maybe. We'll see next week.


We chuckled over the memories, but the connection my colleague made was how that ability to memorise tiny details and the placement of certain pieces of information in a messy desk was perhaps already the mark of a good tester. When you look at a computer screen, you take in all the details and it becomes a new mess of our own design that we track in our heads. We notice when details are moved or changed. If we think there might be something different, if we have a hunch, we can use tools to help us verify it or we can check with an oracle of some kind. It's that ability to make a connection, develop a hunch and act on it, that makes a good tester.

So, you're probably wondering "where does the crazy ex-girlfriend come into the story here?" Good question. No, I haven't forgotten. This is the spot.

In university I had... er, how would I describe it now... a short-lived relationship with a girl who was definitely the outgoing/extrovert type. One morning, I met her before classes started and didn't meet up with her again until after lunch when we both had a free spot in our schedules. When I saw her after lunch, I did a double-take but I wasn't sure what I was noticing. Something was different but I couldn't put my finger on it.

Then it hit me! Her earrings were different. She had two piercings in each ear and in the morning she had studs and stars (in that order) and in the afternoon they were reversed - stars and studs.

Being the attentive boyfriend, I asked her if her earrings had changed their places or if I was just losing my mind. She looked at me and didn't say anything for a minute. Then she said 'yes', she was bored in one of her morning classes and decided to switch the order. Then she got mad at me. She was upset that I had noticed because she didn't want anyone to notice that she had changed them.

Umm, really? That includes me? So, I don't get any points for noticing you and paying attention? Okay, I don't get this relationship stuff. I think our relationship lasted perhaps another 48 hours. Oh well. C'est la vie.

In retrospect, that was another example where somewhere in my head I had made the connection - something was different even if I didn't know what. Then I began the process of methodically going through the list of possibilities and checking them off one by one - was it her hair? her eyes? makeup? lipstick? top? something she was carrying? a scent? a mark? necklace? wait, did I check the ears? hey, there are 2 thingies there - could they have changed? Spider sense tingling.. better consult the oracle and check if we have a match. Bingo!

Success in finding the difference! Failure at love. =( You win some, you lose some.

I didn't know it at the time but the best relationship was still to come! =) And on that note, sometimes you notice the little things, and sometimes someone has to hit you over the head with a frying pan to tell you to open your eyes and see what's right in front of you!

Ah, but love makes you blind, doesn't it? ;)

Monday, 11 January 2010

What skill does Exploratory Testing require?

Posted on 12:47 by Unknown
I've just been challenged with a sobering reality.

I've heard the term "Exploratory Testing" used many times over the last few years by developers and testers at various gatherings. I've practised it myself for over 6 years in various black-box system testing efforts. When training new testers on my team, I provided them with foundational concepts in context, risk, scientific method, test techniques and communication. Then over the course of several weeks, I reviewed their test sessions and provided feedback during debrief sessions to improve their understanding and application of the various testing skills required to be efficient and effective.

People have told me that I have really high standards, and perhaps I do. To me, testing is a passion and fun, and quality is an ideal achieved through effective communication and interactions with all the stakeholders on a project.

But that's all beside the point. If the question is "what is Exploratory Testing and how do you do it?" then my standards and expectations of team members are irrelevant.

ET is simply an approach to testing software where the tests are not predefined and the focus is on iterative learning, test design and execution (to paraphrase a simplified definition).

How someone learns, how someone designs tests, how someone executes those tests - these things are not defined by any standard; they are applied differently by different people. ET can be performed by anyone. There aren't any requirements for how well or thoroughly someone should perform it.

To quote from the animated movie "Ratatouille": "Anyone can cook. But I realize, only now do I truly understand what he meant. Not everyone can become a great artist, but a great artist can come from anywhere."

So, when I hear the term ET thrown around, I have about as much understanding of how they're testing as I do from a development shop that uses the term "Agile". That is, I don't know anything about what it means to them, how they're applying it, how effective it is, or how it compares to my standards/expectations.

I've been reading articles and research lately comparing ET and Test Case-driven Testing (TCT) approaches, and it never ceases to amaze me how stats and research may be twisted to support everyone's beliefs about which is better than the other.

Developers and Product Managers who have worked with me understand the quality of the information and feedback that my testing style provides. They have said that it is a whole new level of testing feedback they've never seen before. It makes me feel good to hear that - that I'm providing a valuable service.

But when I read one of these comparison articles, I have to assume that the ET applied in the research studies isn't at the same level that I apply it. I have to accept that. I may not like it, but that's the reality. To me, the same research applied would likely show that Agile and Waterfall aren't really all that different in terms of produced output. Sigh.

Am I missing something?