Wednesday, June 17, 2009

Q/A1

What is Software Quality Assurance?

Software QA covers the complete software development process - monitoring and improving the process, making sure that all standards and procedures are followed, and guaranteeing that issues are found and dealt with. SQA is oriented to defect 'prevention'.

Which is better - black, gray or white box testing?

It's impossible to declare one testing approach to be better than another. The choice depends on the Quality Assurance Engineer's skill set, the type of project, and what the testing is trying to achieve.

What is gray box testing?

In recent years the term gray box testing has come into common usage. Gray box testing is a software testing procedure that uses a combination of black box and white box testing techniques.

With the gray box testing approach, the Quality Assurance Engineer has knowledge of some of the internal structure of the application under test. In gray box testing, the Quality Assurance Engineer creates some test cases that exercise the internal mechanisms of the application under test. For the rest of the test cases, the Quality Assurance Engineer uses a black box approach, applying inputs to the application under test and validating the outputs.

What is white box testing?

White box testing, sometimes called glass box testing or clear box testing in different Quality Assurance organizations, uses an internal perspective of the application under test to design test cases based on knowledge of its internal structure. In order to work as a white box tester, the tester has to work with the application code and therefore needs to possess knowledge of coding and logic.

What is the difference between QA and testing?

The main difference between QA and testing is that software quality assurance is oriented to defect 'prevention', while software testing is oriented to defect 'detection'. In other words, testing measures the quality of a developed software application, while QA measures the quality of the processes used to create a quality software application.

What is Ad Hoc Testing?

Ad hoc software testing is testing executed without documentation or planning. Ad hoc tests are intended to be run only once, unless a defect is discovered. Ad hoc testing is part of exploratory testing.

What is Acceptance Testing?

Acceptance Testing is black box testing performed by the customer to determine whether to accept a software product. It is normally performed prior to delivery to validate that the software application meets a set of agreed acceptance criteria.

What is test strategy?

The test strategy is a defined set of methods and objectives that direct test design and execution. The test strategy describes the overall approach for testing the application under test, including stages of testing, completion criteria, and general testing techniques. The test strategy forms the basis for test plans.

What is test harness?

In software quality assurance, a test harness is the software and test data configured to exercise an application under test and verify its behaviour within a given test environment.
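As a minimal illustration (not a production framework), a harness can be as small as a runner that executes test functions against the application under test and collects results; all names below are hypothetical:

```python
# Minimal test-harness sketch: run each test function, catch failures,
# and report a pass/fail summary.

def run_harness(tests):
    """Run each test callable and return a name -> outcome summary."""
    results = {}
    for test in tests:
        try:
            test()
            results[test.__name__] = "PASS"
        except AssertionError as exc:
            results[test.__name__] = "FAIL: %s" % exc
    return results

def add(a, b):
    """Stand-in for the application under test."""
    return a + b

def test_add_positive():
    assert add(2, 3) == 5

def test_add_negative():
    assert add(-2, -3) == -5

if __name__ == "__main__":
    for name, outcome in run_harness([test_add_positive, test_add_negative]).items():
        print(name, outcome)
```

Real harnesses (unittest, JUnit, and friends) add fixtures, discovery and reporting on top of this same run-and-collect loop.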

What is traceability matrix?

In software quality assurance, a traceability matrix can be used to show relationships between software requirements and test cases.
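A toy sketch of the idea (all requirement and test case IDs are made up): represent the matrix as a mapping from requirements to covering test cases, and use it to spot gaps in coverage:

```python
# Hypothetical traceability matrix: requirement -> covering test cases.
traceability = {
    "REQ-001 login":    ["TC-01", "TC-02"],
    "REQ-002 logout":   ["TC-03"],
    "REQ-003 password": [],          # no test case covers this yet
}

def uncovered(matrix):
    """Return the requirements that have no covering test case."""
    return [req for req, cases in matrix.items() if not cases]

print(uncovered(traceability))  # the gap the matrix makes visible
```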

What is test suite?

In software quality assurance, a test suite is a collection of test cases used to validate that a software program exhibits a defined set of behaviours. Usually a test suite holds prerequisite steps, clear goals, and instructions for each collection of test cases, in addition to information on the system and environment configuration to be used during testing and validation.
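For example, Python's built-in unittest module lets you group related test cases into one suite; the test classes below are hypothetical placeholders:

```python
import unittest

class LoginTests(unittest.TestCase):
    """Hypothetical test cases for a login feature."""
    def test_valid_login(self):
        self.assertTrue(True)  # placeholder check

class LogoutTests(unittest.TestCase):
    """Hypothetical test cases for a logout feature."""
    def test_logout(self):
        self.assertTrue(True)  # placeholder check

def build_suite():
    """Collect related test cases into a single test suite."""
    suite = unittest.TestSuite()
    suite.addTest(unittest.defaultTestLoader.loadTestsFromTestCase(LoginTests))
    suite.addTest(unittest.defaultTestLoader.loadTestsFromTestCase(LogoutTests))
    return suite

if __name__ == "__main__":
    unittest.TextTestRunner().run(build_suite())
```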

What tests shouldn't we automate?

Most types of testing benefit from automation, but some types need real human attention and intelligence. It is possible, though difficult, to automate the GUI even with agile-friendly tools like Selenium. Usability testing, exploratory testing, and tests that will never fail should not be considered targets for test automation.

What tests should we automate?

Some types of testing need human attention and intelligence, but most benefit from automation. At the same time, a QA Engineer should automate only what needs automating. No one can automate 100 percent of the testing work, but in certain areas like performance testing, load testing, stress testing and regression testing, your team may have a chance of reaching close to 100 percent test automation. Other areas of easy automation are API testing and test data setup and creation.

Do you recommend using test automation in agile environment?

Proper test automation should be a core agile practice. Successful agile projects depend on test automation. Thriving agile teams expect to have working software all the time, which allows them to build and deploy a production-ready software application as often as the customer needs. Agile teams cannot accomplish this goal without constant and proper testing. The following list contains the main reasons for test automation in an agile process:

Manual testing is a slow process
Manual testing is not repeatable
Manual testing is error prone
Automation frees software engineers' time
Automated regression tests offer a safety net
Automated tests provide feedback as often as needed
Automated tests become living documentation

What is stress testing?

Stress testing is used to evaluate the application's behaviour when it is pushed beyond normal or peak load conditions. The main goal of stress testing is to discover application issues that appear only under high load conditions. These can include issues such as synchronization problems, race conditions, and memory leaks. Graceful performance degradation under high load, leading to non-catastrophic failure, is the desired result. The load test engineer can use the same scripts and tools that were used for performance testing, but with a much higher level of simulated load.
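A tiny illustration of the kind of issue stress testing targets: many threads hammering a shared counter will expose a synchronization problem if the lock below is removed (the thread and increment counts are arbitrary choices for the sketch):

```python
import threading

class Counter:
    """Shared state exercised concurrently by many threads."""
    def __init__(self):
        self.value = 0
        self.lock = threading.Lock()

    def increment(self):
        # Without this lock, concurrent increments can be lost -
        # exactly the race condition high load tends to surface.
        with self.lock:
            self.value += 1

def stress(counter, threads=50, increments=1000):
    """Drive the counter from many threads and return the final value."""
    def worker():
        for _ in range(increments):
            counter.increment()
    workers = [threading.Thread(target=worker) for _ in range(threads)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    return counter.value

print(stress(Counter()))  # expected: 50 * 1000 = 50000
```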

What is performance testing?

Performance testing is used to determine the response time/latency, throughput, resource utilization (CPU, RAM, network I/O, disk I/O) and workload of a software application. The main goal of performance testing is to identify how well your application performs in relation to your performance objectives. The intent of performance testing is not to break the application under test; the intent is to observe and document performance under expected usage conditions. There are performance testing tools available to help simulate load, for example Apache JMeter, WebLOAD, LoadRunner and so on. Using test automation tools, a Load Test Engineer can simulate load in terms of users, connections, data, and in other ways.
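A minimal sketch of the measurement side, without any real load tool: time a stand-in operation repeatedly and summarize the observed latencies (the operation and sample count are placeholders, not a recommendation):

```python
import time
import statistics

def operation():
    """Stand-in for a request to the application under test."""
    time.sleep(0.001)

def measure(fn, samples=20):
    """Run fn repeatedly and return simple latency statistics."""
    latencies = []
    for _ in range(samples):
        start = time.perf_counter()
        fn()
        latencies.append(time.perf_counter() - start)
    return {"mean": statistics.mean(latencies), "max": max(latencies)}

print(measure(operation))
```

Real tools like JMeter add concurrent virtual users, ramp-up schedules and percentile reporting on top of this same timing loop.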

Why does test automation project fail?

The following problems are often encountered during test automation projects and may cause them to fail:

  • Management doesn't treat test automation as software development, assuming anyone can test and automation is easy – just record and playback.
  • The QA team selects the wrong set of test cases for automation, while management aims for 100% automation of all test cases.
  • QA Engineers split their time between manual and automated testing instead of concentrating on one task.
  • No one realizes that automated test cases are difficult to maintain and manage.
  • The development, maintenance, and management of automated test scripts often require more time and resources than manual execution of the same tests by an inexperienced tester.

How do you know when to stop testing?

There are several ways for a Quality Assurance Engineer to decide when testing stops:

  • The release, testing or customer deadline has been met
  • All test cases have been executed and a predetermined percentage of them passed
  • A predetermined code coverage percentage has been met
  • The number or severity of the bugs found falls below a certain point
  • No money is left to continue testing
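The quantitative criteria above can be expressed as a simple exit-criteria check; the thresholds here are illustrative, not standard values:

```python
def can_stop(passed, total, open_severe_bugs,
             min_pass_rate=0.95, max_severe_bugs=0):
    """Return True when both predetermined exit criteria are met:
    the pass rate is high enough and severe open bugs are few enough."""
    pass_rate = passed / total if total else 0.0
    return pass_rate >= min_pass_rate and open_severe_bugs <= max_severe_bugs

print(can_stop(passed=97, total=100, open_severe_bugs=0))   # True
print(can_stop(passed=90, total=100, open_severe_bugs=2))   # False
```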

Could you name a few testing activities?

A QA Engineer's role usually includes a variety of testing activities:

  • Create test plans and test cases
  • Execute tests
  • Log bugs
  • Communicate with various team members, such as testers, developers and managers
  • Make crucial decisions on whether something is a bug or a design constraint
  • Schedule projects
  • Allocate human and technical resources
  • Make crucial decisions about the software applications
  • Automate testing

What makes a good QA Engineer?

A good QA Engineer should be able to perform the following tasks successfully in any environment:

Verification: A good QA Engineer can officially confirm that certain tasks can be accomplished.
Detection: A good QA Engineer seeks issues that exist, either in the process or the product.
Prevention: A good QA Engineer recognizes potential issues before they become visible.
Reflection: A good QA Engineer looks back at how problems and bugs ended up in the product and analyzes this data to find out how to make the process better in the future.

Define test automation requirements for developing web application?

Imagine that you were asked to evaluate a web application from a test automation friendliness point of view. The following criteria could be used to call a web application automation friendly, so that it can be tested with SilkTest, QTP, Selenium or any other test automation tool:

  • All web pages should have names
  • Similar objects should have consistent names
  • Unique names should be used for distinct objects
  • All images should have ALT text assigned
  • Dynamic content should have a proper name or HTML id
  • All tables displaying data should have names
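A rough sketch of how a couple of these criteria could be checked automatically, using Python's built-in HTML parser; the rules and the sample markup are illustrative only:

```python
from html.parser import HTMLParser

class FriendlinessChecker(HTMLParser):
    """Flag markup that would be hard for automation tools to locate:
    images without ALT text, and inputs without a name or id."""
    def __init__(self):
        super().__init__()
        self.problems = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "alt" not in attrs:
            self.problems.append("img without alt")
        if tag == "input" and not ("name" in attrs or "id" in attrs):
            self.problems.append("input without name or id")

checker = FriendlinessChecker()
checker.feed('<img src="logo.png"><input type="text" name="user">')
print(checker.problems)  # ['img without alt']
```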

How do you keep your testing skills updated?

  • Read testing websites, magazines and books to follow the latest trends in the software testing industry.
  • Read about new testing tools available on the market.
  • Practice by trying out different testing tools.
  • Attend professional testing conferences.

Does your team use continuous integration?

If the tester doesn't understand what the interviewer means by continuous integration, the tester probably hasn't worked in a good software environment. How can a QA Engineer get a steady code build for testing if there is no bulletproof method of building and deploying code to the testing and production environments? If there is no continuous integration process in place, QA Engineers will most likely spend their time finding and reporting "show-stopper" and unit-level bugs. The interviewee should be prepared to say which source control (also known as version control or (source) code management (SCM)) systems they have used. There are plenty of them around; among the most popular are SVN, Perforce and VSS. The interviewee also needs to know about continuous integration software like CruiseControl, Bamboo or Hudson.

What's in your Testers Tool Box?

The test interview is not only a test of the interviewee's specific knowledge, but also an opportunity for knowledge exchange. As an interviewer I have to spend at least half an hour interviewing a potential Quality Assurance Engineer, and I want to use these minutes wisely. For example, I like to ask testers about the various tools they use during preparation and actual testing. Here are some wonderful tools I use in my day-to-day testing routine:

Firebug - an extension for the Mozilla Firefox browser that allows the debugging, editing, and monitoring of any website's CSS, HTML, DOM, and JavaScript;

Selenium - a free software testing framework for web applications;

Windows Virtual PC - a virtualization suite for Microsoft Windows operating systems, and an emulation suite for Mac OS X on PowerPC-based systems. Virtual PC allows you to create separate virtual machines on your Windows desktop;

Cygwin - a Unix-like environment and command-line interface for Microsoft Windows;

OpenSTA - a GUI-based web server benchmarking utility that can perform scripted HTTP and HTTPS heavy-load tests with performance measurements;

WinSCP - an open source SFTP and FTP client for Microsoft Windows;

HttpWatch – an HTTP Viewer and HTTP Sniffer for IE and Firefox;

What do you hate about testing?

Tester interview questions usually focus on positive results; the most obvious test interview question is "What do you like about testing?" But should that be the case? I believe asking the reverse question would reveal the real mind of a candidate for a Test Engineer position, and would perfectly describe the software development organization where the Test Engineer works now.

The most hated term among testers is "UI automation", together with the misunderstanding from management around it: thinking it is the silver bullet for all software development problems. As a result, the company spends money on unproductive test automation software.

The next most hated issue is developers. Some developers know how to test, when to test and what to test; others just throw the code over the fence with issues so bad that a basic sanity test would have caught them as blocking.

Testers also dislike managers, because every time a customer raises a defect in a shipped product, management questions the testing team about why the defect was missed during the testing cycle and who missed it, instead of doing a root cause analysis of the defect. Some managers continuously call a Test Engineer a Quality Assurance Engineer (while Quality Assurance is a process, not a title) and ask for the software application to be "QA'd" when they mean tested.

Of course, test engineers hate themselves too. Some testers get comfortable with what they already know and stop pushing themselves to learn more; others do the same manual tasks again and again, when basic automation should be applied.

Why do you want to leave your current job?

"I've been working with my wonderful company to advance the state of testing. My management has reached a point where they are satisfied with the state of the quality assurance team, while I am still striving to improve in the art of quality assurance. I feel that I can no longer add value at my present company, and it is time for me to start a new life."

"I'm not sure I want to leave my company, but at the same time your job posting interested me, and I would really like to talk about the opportunity your company has available."
