FSU Researchers can contribute their experience with experimentation


Computer Science

Frostburg State University

Description

INTRODUCTION

Researchers can contribute their experience with experimentation to develop improved techniques for interface evaluation and the user experience. Guidance in conducting pilot studies, acceptance tests, surveys, interviews, and discussions would benefit large-scale development groups, but additional attention needs to be given to smaller projects and incremental changes. Strategies are also needed to cope with evaluation for the many specific populations of users and the diverse forms of disabilities that users may have. In Project 2, you are the experts helping to design and construct psychological tests, which can help in preparing validated and reliable test instruments for the subjective evaluation of varying types of interfaces, from small mobile devices to very large displays, including specialized interfaces such as gaming. Such standardized tests would allow independent groups to compare the acceptability of interfaces.

SCOPE: You are working for an independent company and are tasked with designing an evaluation instrument used to profile users' skill levels with interfaces; such a profile would be helpful in job-placement and training programs.

STEP ONE (1): Using PowerPoint (or another suitable MS Office or drawing tool), design (draw) your HCI interface and an evaluation instrument (test tool) to validate an interface for a small mobile device or a very large display, including specialized interfaces such as gaming. Show in your design how you would incorporate quality features, e.g., usability, universality, and usefulness, using an AI and/or machine-learning approach.
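As a rough illustration of the AI/machine-learning part of this step, the sketch below (a minimal example with hypothetical interaction-log features and usability ratings, not a required implementation) trains a regression model that relates logged behaviour to a usability score, which is one possible way to fold a learned quality measure into the evaluation instrument.

```python
# Minimal sketch: predicting a usability score from logged interaction features.
# The feature names and the data are hypothetical; a real study would collect
# many more sessions and validated ratings.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

# Per-session features: task time (s), error count, help requests, taps per task.
X = np.array([
    [42.0, 1, 0, 18],
    [95.0, 4, 2, 40],
    [30.0, 0, 0, 12],
    [70.0, 2, 1, 25],
    [55.0, 1, 0, 20],
    [120.0, 5, 3, 48],
])
# Target: a usability rating for each session on a 0-100 scale.
y = np.array([85, 40, 92, 65, 78, 30])

model = RandomForestRegressor(n_estimators=200, random_state=0)
# Cross-validated R^2 gives a rough sense of how well behaviour predicts the rating.
print("cross-validated R^2:", cross_val_score(model, X, y, cv=3, scoring="r2").mean())

model.fit(X, y)
print("predicted usability for a new session:", model.predict([[60.0, 2, 1, 22]])[0])
```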

OBJECTIVE: Project 2 should show how you best incorporate and evaluate qualitative data and dimensions such as fun, pleasure, joy, affect, challenge, or realism.
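One way to turn such qualitative dimensions into a validated, reliable instrument is a short Likert-scale questionnaire with an internal-consistency check. The sketch below uses assumed item wordings and illustrative ratings; Cronbach's alpha at the end is a standard reliability estimate for the scale.

```python
# Minimal sketch: a subjective-rating instrument for fun, pleasure, challenge,
# and realism, with Cronbach's alpha as a basic reliability check.
import numpy as np

ITEMS = [
    "Using the interface was fun.",
    "The interface gave me a feeling of pleasure.",
    "The tasks offered an appropriate level of challenge.",
    "The experience felt realistic.",
]

# Rows = respondents, columns = items, rated on a 1-7 Likert scale (illustrative data).
ratings = np.array([
    [6, 5, 6, 5],
    [7, 6, 6, 6],
    [4, 4, 5, 3],
    [5, 5, 4, 4],
    [6, 6, 7, 6],
])

def cronbach_alpha(scores: np.ndarray) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

print("mean rating per item:", ratings.mean(axis=0))
print("Cronbach's alpha:", round(cronbach_alpha(ratings), 3))
```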

STEP TWO: Answer each of the following questions, including why, as it relates to evaluation and the user experience.

  • Would benchmark datasets and task libraries help standardize evaluation?
  • How useful can researchers make automated testing against requirements documents?
  • How many users are needed to generate valid recommendations?
  • How can we better explain the differences between users’ perceptions of a task and the objective measures?
  • How do we select the best measure for a task?
  • How can life-critical applications for experienced professionals be tested reliably?
  • Is there a single usability metric that can be used and compared across types of interfaces?
  • Can we combine performance data and subjective data and create a single meaningful result?
  • Is there a scorecard that can be used to aid in the interpretation of usability results?
  • Is there a theory to explain and understand the relationship between measures?


Explanation & Answer

Attached.

Assignment
Research

STEP ONE

[Interface design diagram: the user manipulates the user interface and receives feedback from it; the user interface sends requests to the application and receives responses.]

STEP TWO


Would benchmark datasets and task libraries help standardize evaluation?

Yes. Benchmark datasets and task libraries can help standardize evaluation by providing a shared set of tasks and metrics for assessing the performance of different interfaces. Researchers and developers can then compare the effectiveness of interfaces directly and pinpoint areas for improvement.
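A minimal sketch of what such a task library could look like (the task names, descriptions, and time limits below are hypothetical): every interface is run against the same fixed tasks and scored with the same success criterion, which is what makes results comparable across groups.

```python
# Minimal sketch: a shared benchmark task library with a fixed success criterion,
# so independent groups can score different interfaces on identical tasks.
from dataclasses import dataclass

@dataclass
class BenchmarkTask:
    name: str
    description: str
    max_seconds: float  # time limit used as the pass/fail criterion

TASK_LIBRARY = [
    BenchmarkTask("compose_message", "Send a short message to a saved contact", 60.0),
    BenchmarkTask("change_setting", "Enable dark mode from the settings screen", 45.0),
    BenchmarkTask("find_item", "Locate a named item using search", 30.0),
]

def success_rates(results: dict[str, list[float]]) -> dict[str, float]:
    """results maps task name -> observed completion times; returns success rate per task."""
    summary = {}
    for task in TASK_LIBRARY:
        times = results.get(task.name, [])
        passed = sum(t <= task.max_seconds for t in times)
        summary[task.name] = passed / len(times) if times else 0.0
    return summary

# The same library applied to each interface makes the comparison direct.
interface_a = {"compose_message": [40, 55, 70], "change_setting": [30, 50], "find_item": [20, 25]}
print(success_rates(interface_a))
```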


How useful can researchers make automated testing against requirements
documents?

Automated testing against requirements documents can be very helpful when assessing interfaces, because it checks that the interface complies with the stated requirements. It can speed up the testing process, conserve resources, and help ensure that the interface is functional and user-friendly.
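A minimal sketch of what requirements-driven automated checks could look like (the requirement IDs are hypothetical, and the measurement functions are stubs standing in for a real UI-automation tool): each testable requirement is paired with a machine-checkable condition, and the report shows which requirements the interface currently meets.

```python
# Minimal sketch: mapping requirements from a requirements document to automated
# checks. Requirement IDs are invented and the measurements are stubbed.
from typing import Callable

def measured_launch_seconds() -> float:
    # Stub; in practice this would come from instrumented UI automation.
    return 1.4

def smallest_touch_target_px() -> int:
    # Stub; in practice this would be read from the interface layout.
    return 44

REQUIREMENT_CHECKS: dict[str, Callable[[], bool]] = {
    "REQ-001 App launches in under 2 seconds": lambda: measured_launch_seconds() < 2.0,
    "REQ-002 Touch targets are at least 44 px": lambda: smallest_touch_target_px() >= 44,
}

def run_requirement_tests() -> None:
    for requirement, check in REQUIREMENT_CHECKS.items():
        print(("PASS" if check() else "FAIL"), requirement)

run_requirement_tests()
```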



How many users are needed to generate valid recommen...

