February 17, 2005 - Needs and Wants

Recently, while chatting with Alberto, the phrase "what the user needs and wants" came up. That got me thinking a bit: what's the difference? It finally struck me that this is the core difference between a market success and a failure.

In the automation world - things like search engines, or even automated testing tools - a "need" is what the computer programmatically determines is important to offer the user. A "want", on the other hand, is what the user actually selects as useful from what the computer offered them.

Agitator, for instance, offers a lot of interesting observations about the code being agitated - equivalent to what the computer determined the user might "need". Users will typically select some of those as what they "want".

A search engine, for instance, will offer many web page links based on the search term - programmatically determining what the user "needs". The user performing the search, though, will click on only a few of them before settling on the one they "want".

Market success comes from the "need" most closely matching the "want". Google has this figured out, in my opinion: it tunes the "need" results based on what users generally "want" - and users go "ooh - that's a smart search engine - it figured out what I want!".

All automation tools need to support this: some form of learning that improves the quality of the results they offer over time. In the end, the goal is to reduce the time the user takes to perform the operation they originally set out to perform, while offering them a great (read "simple") user experience in the process. We're going to do some work on bringing the "need" and "want" spaces closer together in Agitator.
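
To make that concrete, here is a rough sketch of the kind of learning loop I have in mind. It is purely illustrative - the Result class, the score blend, and the 0.1 weighting below are made up for this post, not a description of how Google or Agitator actually works:

import java.util.*;

// Illustrative sketch only: Result and the 0.1 weighting are invented here.
class Result {
    String id;
    double relevance;   // what the tool computed the user "needs"
    int selections;     // how often users actually "wanted" it

    Result(String id, double relevance) {
        this.id = id;
        this.relevance = relevance;
    }

    double score() {
        return relevance + 0.1 * selections;
    }
}

class Ranker {
    // Sort so that results users have picked in the past float upward.
    void rank(List<Result> results) {
        Collections.sort(results, new Comparator<Result>() {
            public int compare(Result a, Result b) {
                return Double.compare(b.score(), a.score());
            }
        });
    }

    // Record a click so the next ranking reflects the user's "want".
    void recordSelection(Result r) {
        r.selections++;
    }
}

The point is simply that the ranking (the "need") gets nudged over time by the selections (the "want").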

Have you toyed with this concept in your company or project?


Posted by Ashish Kumar at February 17, 2005 07:25 AM




Comments

Sounds good. Now the question is "To what extent can software be intelligent?" Software these days tries to learn from patterns, user preferences, etc. But I wonder how a testing tool like Agitator can do that! Maybe my imagination is not that broad :)


When we talk in terms of Agitator, there are perhaps two important things we can put effort into in order to narrow that gap between the want and the need.

1. "To show more useful observations".

The complexity and usefulness of observations is itself an interesting topic, since we are doing unit testing. The question here is 'what is the unit': just the class you test, or a few other associated classes as well? I have seen people define the unit as anything from the method under test to the group of all the classes participating in a call stack.

2. "To have more knowledgeable and intelligent auto factories to increase coverage".

Auto factories would read the code, like a human does, and supply more useful values so that coverage goes up. This requires the factory to be really aware of the API methods being called from your source code.

Now the problem is, we can make Agitator aware of the standard Java API, but what do we do about third-party APIs?
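
To make the factory idea concrete, here is a rough sketch. The ValueFactory interface below is invented just for illustration; it is not an actual Agitator API:

import java.util.*;

// Hypothetical sketch of an "auto factory": a pluggable source of
// interesting test values for a type. Invented for illustration only.
interface ValueFactory<T> {
    // Values worth trying for the type, including boundary cases.
    List<T> interestingValues();
}

class StringFactory implements ValueFactory<String> {
    public List<String> interestingValues() {
        return Arrays.asList(null, "", " ", "a", "a much longer string value");
    }
}

class IntegerFactory implements ValueFactory<Integer> {
    public List<Integer> interestingValues() {
        return Arrays.asList(Integer.MIN_VALUE, -1, 0, 1, Integer.MAX_VALUE);
    }
}

A plug-in point like this is also one possible answer to the third-party API question: let the user, or the library vendor, register a factory for their own types.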

Posted by: Chidambaram Danus on April 11, 2005 04:37 AM

There's more than need and want. There are also "can use" and "thinks (s)he wants". We encounter a lot of these when doing specs. "Thinks (s)he wants" comes in two flavours: flavour one is codifying existing processes (that could be improved). Flavour two is "I want a candy": requests that are out of the toy box. "Can use" covers functions users never asked for but that empower them to achieve goals easier/faster/safer, or to reach goals they like but hadn't had in view.
my 2c
;-) stw

Posted by: Stephan H. Wissel on April 16, 2005 07:50 AM

Chidambaram -

Thanks for your comments. You're right on both counts - we need to improve observations based on "smart" knowledge of the code under test, as well as build auto-factories that help improve coverage. With some new technology that we will be introducing in our 3.0 release, you will see some of these "smarts" show up.

Related to the "needs and wants" topic, though, I do believe there's some learning Agitator can do based on how the user interacts with it. From a set of observations offered by Agitator, the user will generally convert a few into assertions. Agitator can learn from that, and when it needs to offer observations about other code that matches similar patterns, it can know that the user prefers that "style" of observation and perhaps prioritize those ahead of the rest of the list.
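
As a rough sketch of the bookkeeping involved (an illustration only, not the real implementation - the string "pattern" key below is an invented stand-in for however Agitator might classify observations):

import java.util.*;

// Sketch only: the pattern key is a made-up stand-in for an observation
// kind, e.g. "return value is never null".
class ObservationRanker {
    private final Map<String, Integer> promotions = new HashMap<String, Integer>();

    // Called when the user turns an observation of this kind into an assertion.
    void recordPromotion(String pattern) {
        Integer n = promotions.get(pattern);
        promotions.put(pattern, n == null ? 1 : n + 1);
    }

    // Offer the styles of observation the user has historically promoted first.
    void prioritize(List<String> observationPatterns) {
        Collections.sort(observationPatterns, new Comparator<String>() {
            public int compare(String a, String b) {
                return count(b) - count(a);
            }
        });
    }

    private int count(String pattern) {
        Integer n = promotions.get(pattern);
        return n == null ? 0 : n;
    }
}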

While that's only step 1, I can see some significant work in making this much smarter at learning from the user's interactions. Of course, some of that work will only show over time, and it is strictly under wraps at the moment.

Thanks again,

Ashish

Posted by: ashishkumar on April 18, 2005 10:05 PM

Stephan,

Lots and lots of traditional software has been designed with the view that users don't know much, so of course computers, with their great processing capability, can offer the better solutions users really "need". I don't subscribe to that view - not any more, not after seeing Google.

They've proved, I think, that intelligently learning from the match between what the software thought the user "needs" (i.e. the initial search results) and what the user "wants" (i.e. the links the user actually clicks on), and then positioning those clicked links higher the next time, is the right way to go.

In the automated testing world, here's my take on the two concepts you introduce, using an example method:


public void addFoo(Foo element) {
    if (element.isTypeOne()) {
        typeOneList.add(element);
    } else if (element.isTypeTwo()) {
        typeTwoList.add(element);
    }
}


Thinks (s)he wants: This is what the user wants to test with. They want to pass in two parameters - one of typeOne and the other of typeTwo - to verify that the code adds them properly. Fairly reasonable.
Can use: The tool can quickly determine (as Agitator does) that other outcomes are possible in the method: 1) an NPE because element is null, 2) nothing gets added because the element is neither typeOne nor typeTwo, and 3) an NPE if either list is null.


The tool can offer all of these - thus supplementing the limited thought the user applied with the compute power the tool has access to. But maybe the element passed in can never be null, because the callers ensure that in all the other code (it's a contract the domain offers). The tool ought to learn from the fact that the user doesn't care about the case where an NPE is triggered because the parameter is null.
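
To make the "can use" cases concrete, here is what they might look like written out by hand as JUnit tests. Foo and FooContainer below are minimal stand-ins invented so the example compiles; this is not Agitator output:

import java.util.*;
import junit.framework.TestCase;

// Minimal stand-ins for the code in the addFoo() example above.
class Foo {
    private final boolean typeOne, typeTwo;
    Foo(boolean typeOne, boolean typeTwo) { this.typeOne = typeOne; this.typeTwo = typeTwo; }
    boolean isTypeOne() { return typeOne; }
    boolean isTypeTwo() { return typeTwo; }
}

class FooContainer {
    final List<Foo> typeOneList = new ArrayList<Foo>();
    final List<Foo> typeTwoList = new ArrayList<Foo>();

    public void addFoo(Foo element) {
        if (element.isTypeOne()) {
            typeOneList.add(element);
        } else if (element.isTypeTwo()) {
            typeTwoList.add(element);
        }
    }
}

public class AddFooCanUseTest extends TestCase {

    // Outcome 1: a null element triggers an NPE.
    public void testNullElementThrowsNpe() {
        FooContainer container = new FooContainer();
        try {
            container.addFoo(null);
            fail("expected NullPointerException");
        } catch (NullPointerException expected) {
            // the domain contract may rule this out; see the text above
        }
    }

    // Outcome 2: an element of neither type is silently dropped.
    public void testElementOfNeitherTypeIsNotAdded() {
        FooContainer container = new FooContainer();
        container.addFoo(new Foo(false, false));
        assertEquals(0, container.typeOneList.size());
        assertEquals(0, container.typeTwoList.size());
    }
}

If the user deletes or ignores the null-element case, that is exactly the kind of signal the tool could learn from.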

Thanks for the two concepts - these make it easier even for me to think about needs and wants and how they map to our world.

Ashish

Posted by: ashishkumar on April 18, 2005 10:29 PM
