Outlook Support


Wednesday, 27 July 2011

Quality Center Must Die

Posted on 23:56 by Unknown
It is not a matter of "if" -- it is a matter of "when" HP's Quality Center software will die. And you, my dear readers, will help make that happen.

"How?" you may ask? Simple. There are two things you should do: (1) think, and (2) don't put up with crap that gets in the way of delivering value to the customer and interacting intelligently with other human beings.

But I am getting ahead of myself. Let's rewind the story a bit...

Several months ago I was hired by my client to help train one of the test teams on agile and exploratory testing methods. The department has followed a mostly Waterfall development model until now and wants to move in the Agile direction. (A smart choice for them, if you ask me.) Why am I still there after all this time? That's a good question.

After attending the Problem Solving Leadership course last year, and a few AYE conferences, I changed my instructional style to be more like that of a consultant who empowers the client with whatever they need to help themselves learn and grow. It's a bit of a slower pace, but the results are more positive and longer-lasting.

I am part of a "pilot" agile/scrum team and am working closely with one of the testers (I will call him "Patient Zero") to coach him on good testing practices to complement the agile development processes. I have done this several times now at different clients, so this is nothing new to me. One of the surprises that cropped up this time was that this development team is not an end-to-end delivery team, so when they are "done" with their work, the code moves into a Waterfall Release process and it all kind of falls apart. There are still some kinks to work out here, and I am happy to see some really bright, caring people trying to solve these problems. So that's okay.


Patient Zero and I are part of a larger test team, and the rest of the test team all work on Waterfall-style projects, use Waterfall-compatible tools, and generally don't get how we work. :) Unfortunately, one of the tools mandated for our team's use is HP's Quality Center (HPQC). I hadn't seen that tool in about a decade, and it looked very similar to how I last remembered it.

To my agile coach/practitioner friends I should clarify that at no time during our sprint development work does anyone ever touch HPQC! However, once the code is deployed and falls into the Waterfall Release process, regression test cases are created in HPQC and it is used for defect tracking. It is mandated, and so shall it be done. I can live with that. It's just a tool at this point, and the impact on our ability to deliver a good solution is eliminated by the fact that we don't touch it until after we are "done". (Communication and collaboration FTW!)

Two days ago.

Our whole test team took part in a 2-day HPQC training workshop on something HP calls "Business Process Testing" or BPT. Being naturally curious to learn something new, I wanted to know what BPT was and how it fits into the bigger testing picture. Here we go.

We were given a handout with some "test scenarios" to be used for training. The test scenarios fell into this pattern:
  • Scenario name/title
  • Requirement description
  • Test Situation (I am staring at this right now and I still don't know what this means)
  • Role (kind of system user this requirement/situation applies to)
  • Steps

That's okay information. The "Steps" are what you might typically expect to see if you have been testing for a while. Here is an example for working with a sample web app:
  1. Log in to the system with (a certain type of user)
  2. Navigate to some module in the app
  3. Click the "Create" button on the tool-bar
  4. Enter mandatory field values and save
  5. Search for the information you created in the previous step
  6. Log out of the system

And there were 6 of these scenarios.

I read through these tests and then tried to follow them using the system. I quickly encountered a half-dozen bugs -- some with the system, some with the test scenarios/cases, and some were open questions that I would follow up on with the Product Owner for requirement clarification.

But, whoa-whoa-whoa, hey... wait a minute. We only need to worry about *these* documented test scenarios! I struggled hard to keep my mouth shut about the value of time and the many different kinds of tests I would happily engage in at this point if I could leave the tool alone. But I left that to my "inner voice", and by now we were one hour into the first day's training.

At this point, we were given an overview of the HPQC modules and told to (1) enter the Requirements, (2) create BPT test scenarios, and (3) "componentize" the test scenarios into groupings of related steps. This last part required some explanation since it was new to me, and once we got it, we set to work on the task. Since we could see (1) and (2) on the handout sheets in front of us, we went straight to work on part (3). It was kind of fun working in a small group of four, looking at these scenarios and trying to come up with solutions.

And then someone went and spoiled the fun.  We were "told" that we weren't supposed to do part (3) until after we had entered the information for (1) and (2) into the tool.  I was all like "really?" and then a team lead came by and repeated the exact same thing.  This may be summed up as: Enter data into the tool first, and think later.

I was kind of shocked by this comment and attitude, and in retrospect it foreshadowed the rest of the training experience -- i.e. tool and process first; think later; maybe. Okay. I'll play along and see where this goes.

During this exercise, I began to grasp this idea of "components" as HP uses them. I think they are like Page Objects - chunks of code (or in this case, test steps) that perform a certain function, promote reuse, and reduce duplication. Although it was never described in this way, I believe the HP "BPT" module is a proprietary Do-It-Yourself DSL. Aha! I have experience with those. I get that stuff.
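If it helps to make the Page Object analogy concrete, here is a rough Ruby sketch -- entirely my own; the class and method names are invented and have nothing to do with HP's actual API -- of what a "component" amounts to: a named, reusable chunk of steps that can be dropped into many scenarios.

```ruby
# Illustrative sketch only -- names are invented, not HP's API.
# A "component" is a named, reusable chunk of test steps, much like a
# Page Object is a reusable chunk of page-interaction code.
class Component
  attr_reader :name, :steps

  def initialize(name)
    @name  = name
    @steps = []
  end

  def step(description)
    @steps << description
    self  # return self to allow chaining
  end

  # Render the component as numbered manual-test steps.
  def to_script
    steps.each_with_index.map { |s, i| "#{i + 1}. #{s}" }.join("\n")
  end
end

login = Component.new("Login")
login.step("Navigate to the login page")
     .step("Enter username and password")
     .step("Click the Login button")

puts login.to_script
```

Once you see it this way, "componentizing" the six scenarios is just factoring out the repeated step sequences (login, logout, search) into shared chunks.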

So, once I got it, I started to explain the concept of YAGNI to the other members of my group. That is, let's not overthink or over-engineer these components. Let's build/write them based on the needs of the requirements in front of us. We will modify the components in the future as new requirements appear. This idea was well received, and we quickly came to an efficient solution for our small group.

When we took up the exercise, I found I had to explain the YAGNI concept to the instructor/trainer (and the rest of the class/team), as he proposed that we try to abstract these components to allow for compatibility with other features and system elements. What a waste of time! We cannot know what we will need beyond our immediate needs, so that kind of abstraction is a pointless exercise that leads to more headaches than you need.

Eventually, I started to get that the point of writing the test scenarios using BPT was that these components form a basic vocabulary that may then be automated at some point in the future -- yes, BPT integrates with HP's QTP automation tools.  Now, I'm all in favour of consistency, clarity, reuse, and automating tests that humans should never have to do more than once (and if there is value in re-running the test), so I struggled to understand why it was never explained to us this way.  As long as I kept the DSL/automation model in my head, I understood what we were doing and can see the potential benefits of it.

I saw many testers struggle with the exercises and models we were presented with. End of day one.

Day two began with a quick recap, and then we were introduced to the bureaucracy that is Waterfall and HPQC. That is, there are review processes and workflows to cover each requirement, BPT, and component. Welcome to Wasteland. (I mean the Lean Development concept of "waste" here, although other interpretations of "wasteland" may be just as valid.) Easily a third of the two-day training was spent on the processes surrounding the management and review of the various HPQC objects. Sigh.

We then moved on to the concept of "parameters" for the components we created yesterday. Okay, I get method parameters when I am scripting with Ruby, so this was no sweat. Given how long I spent parametrizing my components compared to everyone else in the class, I think I may be one of the few who really got it.
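To illustrate what parametrization buys you -- in plain Ruby, not HPQC; all the field names here are made up for the example -- a parametrized component is essentially a method with arguments:

```ruby
# Illustrative sketch -- not HP's API. A parametrized component is a
# component whose steps contain placeholders filled in per invocation,
# exactly like a Ruby method with (optionally defaulted) keyword arguments.
def create_profile_steps(name:, email:, country: "Canada")
  [
    "Click the 'Create' button on the toolbar",
    "Enter Name: #{name}",
    "Enter Email: #{email}",
    "Enter Country: #{country}",
    "Save"
  ]
end

puts create_profile_steps(name: "Ada Lovelace", email: "ada@example.com")
```

The same component can now back any number of scenarios just by varying the arguments, which is the whole point of extracting it in the first place.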

I learned a few more things about the main instructor. One was that he didn't know how to identify web page elements using commonly available browser tools. Umm, aren't these tools supposed to interact with web pages? This never came up before? You have never wondered how to find the name/id for an element on a web page? really?

The other was that he had a really bad sense of humour.  He made a reference to a "QA" joke that I shall not repeat here.  Needless to say that I found it insulting, offensive, it made my skin crawl and my blood boil.  Many unpleasant feelings and ideas arose in me and it took all my willpower and strength not to react to the blatant stupidity of insulting the profession of the students in your class and the market that the HPQC tool represents.

The final blow of the day came when we tried to "execute" these BPT test scenarios using the HPQC tool. THEN I discovered that these "parameters" can have two kinds of values -- fixed/hard-coded and run-time. Anyone who has done intelligent automation knows NOT to hard-code values in their scripts. Data-driven is way better, and as an exploratory tester, I may not know what value I will choose until moments before.

Here's where the HPQC tool gets stupid..er. Regardless of which parameter type you choose, values may ONLY be defined BEFORE test execution! You cannot leave a parameter empty so that when you reach a particular step, the tester can decide what value to input and feed it back into the tool.

What does this mean?  As a tester, let's say you want to create a new user profile in an app of some kind. The test parameters for things like Name, Email, Address, Country, and so on, must all be defined and set before the test is executed. You cannot decide while you are actually testing!

Why does this matter to me? It matters because the humans who execute these BPT tests manually have no choice in what values they may input. Testing techniques like Equivalence Classes and BVA (Boundary Value Analysis) that help guide our choices toward interesting paths are completely cut off! It turns out that HPQC treats humans worse than their automated counterparts. In a discussion with an automation "expert" at the end of the class, I learned that you can at least code some variability into the QTP automation scripts. This is not possible when the same test scenarios are executed manually by human beings.
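For contrast, here is a sketch -- my own, hypothetical, and nothing to do with HPQC's internals -- of what run-time parameters look like in a data-driven runner: the value source is resolved when the step executes, so a data table or a thinking human can supply the value moments before the step runs.

```ruby
# Hypothetical sketch of a data-driven step runner. The point: the parameter
# value is resolved when the step executes, not when the script is authored.
def run_step(template, value_source)
  template.gsub(/\{(\w+)\}/) { value_source.call(Regexp.last_match(1)) }
end

# Fixed values, decided before execution (the only option HPQC allows):
fixed = ->(field) { { "Name" => "Test User", "Email" => "t@example.com" }[field] }

# Run-time values -- e.g. a tester making a boundary-value choice on the spot:
boundary = ->(field) { field == "Name" ? "X" * 255 : "a@b.co" }

puts run_step("Enter {Name} and {Email}", fixed)
puts run_step("Enter {Name} and {Email}", boundary)
```

Swapping the value source is trivial here; HPQC's design makes exactly that swap impossible for a manual tester.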

So. Much. Wrong.

So after my first ever QC training session, here are some of my take-aways:

  • It took me 2 days to "script" 6 test scenarios in this tool and they were rotten test cases to begin with! I suspect that outside of the training environment, it will actually take longer to complete since you won't have a "reviewer" sitting next to you waiting for you to finish your piece.
  • And they weren't even automated! Who knows how long it would take to tweak the "components" to make them work with a particular automation strategy.
  • HP QC will never be a useful tool for any agile or rapid development efforts
  • (HP best practice) Put the tool, data, and review processes first, before you think -- and maybe instead of thinking, too.
  • BPT is a DSL framework for test scripting
  • BPT component parameters cannot be customised during test execution. They may only be set/defined before you start testing. => No thinking allowed while testing.
  • The instructor didn't appear to be knowledgeable on anything outside of the tool itself. This includes how we might actually want to use the tool. No, no, no. And I quote: "Testers must change how they work to use the tool in the way it was designed."
  • As long as there are people pushing these kinds of horrible tools that suck the life and intelligence out of people, inject mountains of wasteful activities that provide no value to the customers or end users, and continue to create barriers between testers and their developer counterparts, I will always have job security in helping organisations recover from these cancers.

At the end of the day, several people were interested in my idea of randomising tests. The automation expert in the class insisted that it couldn't be done with automation, so I called up my "Unscripted Automation" slides from a presentation I gave a few years ago. He said that what I proposed was not a "best practice" and that everyone, the whole industry, was using the tools in the way he described they should be used.
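For the curious: the randomising idea is nothing exotic. A minimal sketch (mine, tool-agnostic -- the data and names are made up) is to sample a random representative from each equivalence class on every run, so repeated runs explore different inputs without any new scripting:

```ruby
# Minimal sketch of randomised test-input selection: each run picks a random
# equivalence class and a random representative from it, so the same "script"
# exercises different data every time it executes.
EMAIL_CLASSES = {
  valid:      ["user@example.com", "first.last@test.org"],
  missing_at: ["userexample.com"],
  empty:      [""]
}.freeze

def random_email_case(rng = Random.new)
  klass  = EMAIL_CLASSES.keys.sample(random: rng)
  sample = EMAIL_CLASSES[klass].sample(random: rng)
  [klass, sample]
end

klass, value = random_email_case
puts "Testing #{klass} email: #{value.inspect}"
```

Seed the random generator and every "random" run is also reproducible, which answers the usual objection about repeatability.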

My response was to simply say "they are all wrong."
Posted in agile, bad training, exploratory testing, Quality Center, testing, Waterfall

Friday, 15 July 2011

Visual Studio: Remote Debugging

Posted on 06:57 by Unknown

Given the task of remotely debugging UAT code from my workstation, I read many articles on how to get it working. Though all of them contributed something, there was no single silver-bullet post that nailed the solution for me. The approach that worked was a concoction of tips and tricks from multiple posts. Let me chart out the steps I took to get there, in the hope that for some this works as a one-stop shop. Please note my IDE is VS 2005 Team Edition.


1) For remote debugging, the remote debugging monitor service (msvsmon) must be running on the remote machine. For 64-bit processes, the service executable is available at <install path>\Microsoft Visual Studio 8\Common7\IDE\Remote Debugger\x64.
2) Double-click the executable on the remote machine. This opens a window with a timestamp and the message "Msvsmon started a new server named <login name>@<remote machine name>. Waiting for new connections." -- i.e. the service has started. Please note that <login name> is the user you logged onto the machine with, and may be qualified with the name of the domain you are currently logged onto.
3) Build the solution on the host machine and place the assemblies, along with the .pdb files, onto the remote machine's file system. GAC these assemblies from the remote file system into the remote assembly cache.
4) Restart the appropriate service/process so that it picks up the newly GACed assemblies. In my case it was the BizTalk service that I had to bounce.
5) On the host machine IDE menu, open Debug --> Attach to Process. Set Transport to "Default" and Qualifier to <login name>@<remote machine name>, i.e. the details shown in the msvsmon window. Please do not omit the domain name; you can simply copy-paste the <login name> from the msvsmon window.
Hit [Refresh] to see processes hosted on the remote machine and attach to the required process.
6) In step (5) you may get an error like:
         Unable to connect to the Microsoft Visual Studio Remote Debugging Monitor named '<login name>@<remote machine name>'. A security package specific error occurred.
Check whether <login name> is an admin on the remote machine, or has been added as a remote user on the remote machine.


If yes, then use the following work-around:
a) On the remote machine, create a user "RemoteDebug" (any name will do) and add the user as a local administrator.
b) On the host machine, also create a user "RemoteDebug" and add the user as a local administrator.
c) On the remote machine, run the msvsmon service under the RemoteDebug user by doing a "run as".
d) On the host machine IDE, open Debug --> Attach to Process. With Transport set to "Default", set Qualifier to RemoteDebug@<remote machine name>. Hit [Refresh], which will display all the processes hosted on the remote machine.
e) Go back to the remote machine and exit the msvsmon service. Start the service again, but this time simply double-click the executable and do not do a "run as". This will run the service under your login name.
f) Go to the host machine IDE and open Debug --> Attach to Process. With Transport set to "Default", set Qualifier to <login name>@<remote machine name>. Hit [Refresh], and this should now display all the processes hosted on the remote machine.
Please note that if you try to debug by attaching to the process in step (d), debugging will not work, as the IDE is running under your login name and not under RemoteDebug.


Do not hesitate to drop me an e-mail with any particular observations you have had whilst trying to setup a remote debug environment. Enjoy!