User interface design testing evaluates how well a design takes care of its users, offers clear direction, delivers feedback, and maintains consistency of language and approach. Subjective impressions of ease of use and look and feel are carefully considered in UI design testing. Issues pertaining to navigation, natural flow, usability, commands, and accessibility are also assessed in UI design testing.
During UI design testing, you should pay particular attention to the suitability of all aspects of the design. Look for areas of the design that lead users into errors or that do not clearly indicate what is expected of users.
Consistency of aesthetics, feedback, and interactivity directly affects an application's usability and should therefore be carefully examined. Users must be able to rely on the cues they receive from an application to make effective navigation decisions and understand how best to work with an application. When cues are unclear, communication between users and applications can break down.
It is essential to understand the purpose of the software under test (SUT) before beginning UI testing. The two main issues to consider are: (1) the profile of the application's target user, and (2) the design approach employed by the application.
With answers to these questions, you will be able to identify program functionality and design that do not behave as a reasonable target user would expect. Keep in mind that UIs serve users, not designers or programmers. As testers, we represent users and must be conscious of their needs.
Profiling the Target User
Gaining an understanding of a Web application's target user is central to evaluating the design of its interface. Without knowing the user's characteristics and needs, it can be a challenge to assess how effective the UI design is.
User interface design testing involves the profiling of two target-user types: (1) server-side users and, more important, (2) client-side users. Users on the client side generally interact with Web applications through a Web browser. More than likely they do not have as much technical and architectural knowledge as users on the server side of the same system. Additionally, the application features that are available to client-side users often differ from the features that are available to server-side users (who are often system administrators).
Therefore, client-side UI testing and server-side UI testing should be evaluated by different standards. When creating a user profile, consider the following four categories of criteria (for both client-side and server-side users).
How long has the intended user been using a computer? Do they use a computer professionally or only casually at home? What activities are they typically involved with? What assumptions does the SUT make about user skill level, and how well do the expected user's knowledge and skills match those assumptions?
For client-side users, technical experience may be quite limited, but the typical user may have extensive experience with a specific type of application, such as a spreadsheet, word processor, desktop presentation program, drawing program, or instructional development software. In contrast, system administrators and information services (IS) personnel who install and set up applications on the server side probably possess significant technical experience, including in-depth knowledge of system configuration and script-level programming. They may also have extensive troubleshooting experience, but limited experience with typical end-user application software.
How long has the user been using the Web system? Web systems occasionally require client-side users to configure browser settings. Therefore, some experience with Web browsers will be helpful. Is the user familiar with Internet jargon and concepts, such as Java, ActiveX, HyperText Markup Language (HTML), proxy servers, and so on? Will the user require knowledge of related helper applications such as Acrobat Reader, File Transfer Protocol (FTP) clients, and streaming audio/video clients? How much Web knowledge is expected of server-side users? Do they need to modify Practical Extraction and Reporting Language (Perl) or Common Gateway Interface (CGI) scripts?
Is the user familiar with the subject matter with which the application is associated? For example, if the program involves building formulas into spreadsheets, it is certainly targeted at client-side users with math skills and some level of computing expertise. It would be inappropriate to test such a program without the input of a tester who has experience working with spreadsheet formulas.
Another example is the testing of a music notation-editing application. Determining whether the program is designed for experienced music composers who understand the particulars of musical notation, or for novice musicians who may have little to no experience with music notation, is critical to evaluating the effectiveness of the design. Novice users want elementary tutorials, and expert users want efficient utilities. Is the user of an e-commerce system a retailer who has considerable experience with credit card-processing practices? Is the primary intended user of an online real estate system a realtor who understands real estate listing services, or is it a first-time home buyer?
Will users be familiar with the purpose and abilities of the program because of past experience? Is this the first release of the product, or is there an existing base of users in the marketplace who are familiar with the product? Are there other popular products in the marketplace that have a similar design approach and functionality? Keep in mind that Web applications are still a relatively new class of application. It is possible that you are testing a Web application that is the first of its kind to reach the marketplace.
Consequently, target users may have substantial domain knowledge but no application-specific experience.

With answers to these questions, you should be able to identify the target user for whom an application is designed. There may be several different target users. With a clear understanding of the application's target users, you can effectively evaluate an application's interface design and uncover potential UI errors.
The accompanying table offers a means of grading the four attributes of target-user experience. User interface design should be judged, in part, by how closely the experience and skills of the target user match the characteristics of the SUT.
Once we have a target-user profile for the application under test, we will be able to determine if the design approach is appropriate and intuitive for its intended users. We will also be able to identify characteristics of the application that make it overly difficult or simple. Overly simplistic design can result in as much loss of productivity as an overly complex design can. Consider the bug-report screen in the sample application. It includes numerous data-entry fields. Conceivably, the design could have broken up the functionality of the bug-report screen over multiple screens. Although such a design might serve novice users, it would unduly waste the time of more experienced users—the application's target.
Evaluating Target-User Experience
Testing the Sample Project
Consider the target user of the sample application. The sample application is designed to support the efforts of software development teams. When we designed the sample application, we assumed that the application's target user would have, at a minimum, intermediate computing skills, at least beginning-level Web experience, and intermediate experience in the application's subject matter (bug tracking). We also assumed that the target user would have at least beginning experience with applications of this type.
Beyond these minimum experience levels, we knew it was also possible that the target user might possess high experience levels in any or all of the categories. The accompanying table shows how the sample application's target user can be rated.
Evaluating Sample Application Target User
Considering the Design
The second step in preparing for UI design testing is to study the design employed by the application. Different application types and target users require different designs. For example, in a program that includes three branching options, a novice computer user might be better served by delivering the three options over the course of five interface screens, via a wizard. An information services (IS) professional, on the other hand, might prefer receiving all options on a single screen, so that he or she could access them more quickly.
TOPICS TO CONSIDER WHEN EVALUATING DESIGN
Design metaphors are cognitive bridges that help users understand the logic of UI flow by relating it to experiences they may have had in the real world or in other applications. Examples of effective design metaphors include Web directory sites that utilize a design reminiscent of a library card catalog, and scheduling applications that visually mirror the layout of a desktop calendar and address book. Microsoft Word uses a document-based metaphor for its word processing, a metaphor that is common to many types of applications.
EXAMPLES OF TWO DIFFERENT DESIGN METAPHORS
TWO DIFFERENT APPROACHES TO CONVEY IDENTICAL INFORMATION AND
Neither design approach is more correct than the other; they are simply different. Regardless of the design approach employed, it is usually not our role as testers to judge which design is best. That does not mean, however, that we should overlook design errors, especially if we work for an organization that cares about subjective issues such as usability. Our job is to point out as many design deficiencies as possible, as early in testing as possible. Certainly, it is our job to point out inconsistency in the implementation of the design. That is, if the approach uses a pull-down menu rather than radio buttons, then a pull-down menu should be used consistently in all views.
Think about these common issues:
Navigation options via radio buttons
Ask yourself these questions:
Navigation options via pull-down menu
User Interaction (Data Input)
Users can perform various types of data manipulation through keyboard and mouse events. Data manipulation methods are made available through on-screen UI controls and other technologies, such as cut-and-paste and drag-and-drop.
User Interface Controls
User interface controls are graphic objects that enable users to interact with applications. They allow users to initiate activities, request data display, and specify data values. Controls, commonly coded into HTML pages as form elements, include radio buttons, check boxes, command buttons, scroll bars, pull-down menus, text fields, and more. Figure 9.5 includes a standard HTML text box that allows limited text input from users, and a scrolling text box that allows users to enter multiple lines of text. Clicking the Submit button beneath these boxes submits the entered data to a Web server. The Reset buttons return the text boxes to their default state.
Radio buttons are mutually exclusive—only one radio button in a set can be selected at a time. Check boxes, on the other hand, allow multiple options in a set to be selected simultaneously. The figure also includes a pull-down menu that allows users to select one of several predefined options. Clicking the Submit button submits the user's selection to the Web server. The Reset button resets the menu to its default state. The pushbuttons (Go Home and Search) initiate actions such as executing CGI scripts, running search queries, submitting data to a database, or following hyperlinks.
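When testing these control semantics, a small helper can verify the exclusivity rule described above. This is a minimal sketch, assuming group state is modeled as an array of booleans (one entry per option); the representation is an illustration, not part of any standard API:

```javascript
// A radio-button group is valid only if at most one option is selected;
// a check-box group may have any number of options selected.
// Group state is modeled as an array of booleans, one per option.
function radioGroupValid(selected) {
  return selected.filter(Boolean).length <= 1;
}

function checkboxGroupValid(selected) {
  // Any combination of true/false values is legal for check boxes.
  return selected.every(v => typeof v === "boolean");
}
```

A test case asserting that two simultaneously selected radio buttons are invalid would catch a page that mistakenly renders radio buttons where check boxes were intended.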
The figure also includes examples of images (commonly referred to as graphics or icons) that can serve as hyperlinks or simulated pushbuttons.
Form-based HTML UI controls, including a standard HTML text box and a scrolling text box
Form-based HTML UI controls: including a pull-down menu
The figures illustrate the implementation of several standard HTML UI controls on a Web page. One shows the objects (a graphic link, mouse-over link titles or ALT text, and a text link) as they are presented to users; another shows the HTML code that generates these objects.
Standard HTML controls, such as tables and hyperlinks, can be combined with images to simulate conventional GUI elements such as those found in Windows and Macintosh applications (navigation bars, command buttons, dialog boxes, etc.). The left side of the figure (taken from the sample application) shows an HTML frame that has been combined with images and links to simulate a conventional navigation bar.
Dynamic User Interface Controls
Scripts are programming instructions that are executed by browsers when HTML pages load or when they are called based on certain events. Some scripts are a form of object-oriented programming, meaning that program instructions identify and send instructions to individual elements of Web pages (buttons, graphics, HTML forms, etc.), rather than to pages as a whole. Scripts do not need to be compiled and can be inserted directly into HTML pages. Scripts are embedded into HTML code with <SCRIPT> tags. Scripts can be executed on either the client side or the server side. Client-side scripts are often used to dynamically set values for UI controls, modify Web page content, validate data, and handle errors.
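As a concrete illustration of client-side data validation, the sketch below shows the kind of check a script might run before a form is submitted to the server. The field names ("email", "age") and validation rules are hypothetical examples, not taken from any particular application:

```javascript
// Minimal client-side validation sketch: checks a required email field
// and a numeric range before the form would be submitted to the server.
function validateForm(fields) {
  const errors = [];
  if (!fields.email || !fields.email.includes("@")) {
    errors.push("Please enter a valid email address.");
  }
  const age = Number(fields.age);
  if (!Number.isInteger(age) || age < 0 || age > 150) {
    errors.push("Age must be a whole number between 0 and 150.");
  }
  return errors; // an empty array means the form may be submitted
}
```

Catching such errors on the client avoids a round-trip to the server, which is one reason client-side scripts are commonly used for input validation.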
Graphic links, mouse-over text, and text links
HTML code for graphic links, mouse-over text, and text links
Tables, forms, and frames simulating Windows-based UI controls
Java is a computing language developed by Sun Microsystems that allows applications to run over the Internet (though Java objects are not limited to running over the Internet).
Java is a compiled language, which means that it must be run through a compiler to be translated into a language that computer processors can use. Unlike other compiled languages, Java produces a single compiled version of itself, called Java bytecode. Bytecode is a series of tokens and data that are normally interpreted at runtime. By compiling to this intermediate language rather than to binaries that are specific to a given type of computer, a single Java program can be run on several different computer platforms for which there is a Java Virtual Machine (Java VM). Once a Java program has been compiled into bytecode, it is placed on a Web server. Web servers deliver bytecode to Web browsers, which interpret and run the code.
Java programs designed to run inside browsers are called applets. When a user navigates to a Web site that contains a Java applet, the applet automatically downloads to the user's computer. Browsers require Java bytecode interpreters to run applets; Java-enabled browsers, such as Netscape Navigator and Internet Explorer, have bytecode interpreters built into them. Precautions are taken to ensure that Java programs do not download viruses onto users' computers. Java applets must go through a verification process when they are first downloaded to users' machines, to ensure that their bytecode can be run safely. After verification, bytecode is run within a restricted area of RAM on users' computers.
ActiveX is a Windows custom control that runs within ActiveX-enabled browsers (such as Internet Explorer), rather than off servers. Similar to Java applets, ActiveX controls support the execution of event-based objects within a browser.
One major benefit of ActiveX controls is that they are components. Components can be easily combined with other components to create new, feature-rich applications. Another benefit is that once a user downloads an ActiveX control, he or she will not have to download it again in the future; ActiveX controls remain on users' systems, which can speed up load time for frequently visited Web pages.
Some disadvantages of ActiveX are that it is dependent on the Windows platform, and some components are so big that they use too much system memory. ActiveX controls, because they reside on client computers and generally require an installation and registration process, are considered by some to be intrusive. Figure 9.10 shows a calendar system ActiveX control; a companion figure shows the HTML code that generated the page. An HTML <OBJECT> tag gives the browser the ActiveX control's class ID so that it can search the registry to determine the location of the control and load it into memory.
Calendar system ActiveX control
HTML code that generated the ActiveX control
Sometimes, multiple ActiveX controls are required on the same HTML page. In such instances, controls may be stored on the same Web server, or on different Web servers.
Server-side includes (SSIs) are directives to Web servers that are embedded in HTML comment tags. Web servers can be configured to examine HTML documents for such comments and to perform appropriate processes when they are detected. The SSIs are typically used to pull additional content from other sources into Web pages—for example, the addition of current date and time information. Following is an example of an SSI (enclosed between HTML comment tags) requesting that the Web server call a CGI script named mytest.cgi:

<!--#exec cgi="/cgi-bin/mydir/mytest.cgi"-->
Style sheets are documents that define style standards for a given set of Web pages. They are valuable in maintaining style consistency across multiple Web pages. Style sheets allow Web designers to define design issues such as fonts and colors from a central location, thus freeing designers from concerns over inconsistent graphic presentation that might result from browser display differences or developer oversight.
Style sheets set style properties for a variety of HTML elements: text style, font size and face, link colors, and more. They also define attribute units such as length, percentage, and color. The problem with traditional style sheets is that they do not take the dynamic nature of Web design into account. Web pages themselves offer multiple means of defining styles without the use of style sheets—for example, style properties can be defined in an HTML page's header, or inline in the body of an HTML document. Such dynamic style definition can lead to conflicting directives.
Cascading style sheets (CSS) is the most common and most mature style sheet language. Cascading style sheets offer a system for determining priority when multiple stylistic influences are directed onto a single Web page element.
Cascading style sheets dictate the style rules that are to be applied when conflicting directives are present. They allow Web designers to manage multiple levels of style rules over an unlimited number of Web pages. For example, a certain line of text on a Web page might be defined as blue in the page's header, as red in the page's body text (inline), and as black in an external style sheet. In this scenario, CSS could establish a hierarchy of priority for the three conflicting style directives: it could be set up to dictate that inline style commands take priority over all other style commands, followed by "page-wide" style commands (located in page headers), with external style sheet commands holding the least influence of the three. There are different means of referencing style sheets. The browser takes all style information (possibly conflicting) and attempts to interpret it. The figure shows a mixture of styles applied to a page; some of the approaches may be incompatible with some browsers.
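The priority scheme described above can be sketched as a simple resolution function. This is an illustration of the cascade idea only, not an implementation of any browser's actual algorithm; the source names and property values are hypothetical:

```javascript
// Sketch of cascade priority: given possibly conflicting style sources,
// pick the value from the highest-priority source that defines the
// property. The order (inline > page header > external sheet) follows
// the example in the text.
const PRIORITY = ["inline", "header", "external"]; // highest first

function resolveStyle(property, sources) {
  for (const level of PRIORITY) {
    const sheet = sources[level];
    if (sheet && property in sheet) return sheet[property];
  }
  return undefined; // no source defines the property
}
```

In the text's example, a color defined inline would win over conflicting definitions in the page header and in an external sheet.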
Some errors that you should look for include:
Navigation methods dictate how users navigate through a program—from one UI control to another within the same page (screen, window, or dialog box), and from one page to the next. User navigation is achieved through input devices, such as keyboard and mouse. Navigation methods are often evaluated by how easily they allow users to get to commonly used features and data.
Ask yourself these questions:
Testing the Sample Application
User navigation within the sample application is achieved via standard UI controls (keyboard and mouse events). Data updates are submission based, meaning that they are achieved by clicking action buttons, such as Submit. The accompanying figure diagrams how users navigate through the sample application's trend metrics and distribution metrics features.
Sample application navigation
Mouse/Keyboard Action Matrices
Appendices D and E contain test matrices that detail mouse and keyboard actions. These matrices can be customized to track navigation test coverage for the Web system under test.
Occasionally, the names of on-screen commands are not used consistently throughout an application. This is partially attributable to the fact that the meaning of command names often varies from one program to the next. If the nomenclature of certain commands varies within a single program, user confusion is likely to result. For example, if a Submit command is used to save data in one area of a program, then the Submit command name should be used for all saving activities throughout the application.
Consideration should be given to the action commands that are selected as the default commands. Default action commands should be the least risky of the available options (the commands least likely to delete user-created data).
Table lists a number of common confirming-action and canceling-action commands, along with their meanings and the decisions that they imply.
Confirming and Canceling Commands
Feedback and Error Messages
Consistency in audible and visible feedback is essential for maintaining clear communication between users and applications. Messages (both visible and audible), beeps, and other sound effects must remain consistent and user friendly to be effective. Error messaging in particular should be evaluated for clarity and consistency.
Examine how interface components are used to deliver feedback, looking for unusual or haphazard implementations. Each computing platform has commonly accepted guidelines for standard placement of UI elements, such as placing OK and Cancel buttons in the bottom right corner of dialog boxes. Alternative designs may make user interaction unnecessarily difficult.
As a general rule, simple errors such as invalid inputs should be detected and handled on the client side. The server, of course, has to detect and handle error conditions that do not become apparent until they interfere with some process being executed on the server side. Another consideration is that the client might not understand the error condition reported by the server; it might therefore ignore the condition, display the wrong message, or display a message that no human can understand.
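The last point above, translating server-reported conditions into messages a user can act on, can be sketched as a simple lookup. The error codes and message text here are hypothetical examples, not from any real protocol:

```javascript
// Sketch: map server-reported error codes to user-friendly messages,
// falling back to a generic message rather than showing a raw code
// the user cannot act on. Codes and wording are hypothetical.
const SERVER_ERROR_MESSAGES = {
  DUPLICATE_KEY: "A record with this ID already exists.",
  DB_UNAVAILABLE: "The database is temporarily unavailable. Please try again.",
};

function describeServerError(code) {
  return (
    SERVER_ERROR_MESSAGES[code] ||
    "An unexpected error occurred (" + code + "). Please contact support."
  );
}
```

A tester can probe this boundary deliberately: trigger server-side failures and verify that the client displays an accurate, comprehensible message rather than ignoring the condition.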
Browser-based error message
Additionally, the client might not switch to the appropriate state or change the affected data items in the right way unless it understands the error condition reported by the server. Some errors to look for include the following:
Ask yourself these questions:
Data Presentation (Data Output)
In Web applications, information can be communicated to users via a variety of UI controls (e.g., menus, buttons, check boxes, etc.) that can be created within an HTML page (frames, tables, simulated dialog boxes, etc.).
The figures illustrate three data presentation views that are available in the sample application. Each view conveys the same data through a different template built using HTML frames and tables.
In this sample application example, there are at least three types of potential errors: (1) data errors (incorrect data in records caused by write procedures), (2) database query errors, and (3) data presentation errors. A data error or database query error will manifest itself in all presentations, whereas a presentation error in a server-side script will manifest itself only in the presentation with which it is associated. The accompanying figure illustrates the data presentation process. Where errors manifest themselves depends on where they occur in the process.
Single issue report presented in Full View
Same issue report presented in Edit View
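The error-localization reasoning above can be expressed as a small classification helper: render the same record through each presentation view, compare against the expected value, and use the pattern of mismatches to decide where to look. This is a sketch only; the view names are hypothetical:

```javascript
// Sketch: classify an observed failure by comparing one field's value
// across presentation views. If every view is wrong, suspect a data or
// query error; if only some views are wrong, suspect those views'
// presentation (server-side script) logic.
function classifyError(expected, observedByView) {
  const views = Object.keys(observedByView);
  const wrongViews = views.filter(v => observedByView[v] !== expected);
  if (wrongViews.length === 0) return "no error";
  if (wrongViews.length === views.length) return "data or query error";
  return "presentation error in: " + wrongViews.join(", ");
}
```

For example, if an issue's status shows correctly in Full View but incorrectly in Edit View, the query and stored data are probably fine and the Edit View template deserves scrutiny.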
Analyze the application to collect design and architectural information. One of the most effective ways to do this is to interview your developer. Once the information is collected, use it to develop test cases that are more focused at the unit level, as well as at the interoperability level.