Patient Education Technology Guide to a Mobile Health Application PowerPoint

Description

See the assignment directions, rubric, and article below. This is a PowerPoint (slideshow-only) assignment. Thanks in advance.


Unformatted Attachment Preview

Directions

• You are required to complete this assignment using the productivity tools required by Chamberlain University: Microsoft Office 2013 (or a later version) for Windows, or Office 2011 (or a later version) for Mac. You must save the file in the ".pptx" format. A later version of the productivity tool is Office 365, which is available to Chamberlain students for free by downloading it from the student portal at http://my.chamberlain.edu. Click on the envelope at the top of the page.
• You are required to cite your source(s) as they relate to your application slide. Other citations are permitted, but they are not a requirement for the assignment.
• Title slide (first slide): Include a title slide with your name and the title of the presentation.
• Scenario slide (one slide): This slide should include a brief scenario, then identify a patient who is experiencing a specific disease process or diagnosis and who would benefit from an already developed and reliable mHealth app. Alternatively, it could identify a person who is currently healthy and would like to maintain or improve health and prevent illness. Be sure to include the nurse's assessment of the patient's learning needs and readiness to learn. Be specific.

Example: Scenario for Ms. Ellis
Jane Doe (your name here)
• Jennifer Ellis, a 62-year-old African American woman, has been recently diagnosed with chronic kidney disease (CKD). She has been prescribed several medications she must take every day.
• The nephrologist has stressed the importance of leading a healthy lifestyle to slow or stop the progression of CKD.
• She is interested in ways in which she can better track her health and make healthier choices.
• She is a high school graduate and an iPhone user, mostly sending text messages to family and friends.
• She is eager to learn how to use an app that can help her manage her CKD.

Prepare the following slides as if you are presenting them to the patient.

• mHealth application slide (one to three slides): Identify a developed and reliable mHealth app that could benefit the patient. Describe the app, including the following:
  • Name
  • Purpose
  • Intended audience
  • Mobile device(s) upon which it will operate
  • Where to download or obtain it (include a working link if it is to be downloaded from a website)
  • Any other information you believe would be pertinent to this situation
  Be sure to cite all sources you use.
• Teaching slides (one to three slides): Prepare slides that contain important points about the app that you want to teach to the patient, such as how to use the app safely and effectively (including how to interpret and act on the information that is provided).
• Evaluation slide (one to three slides): Describe how you would determine the success of the patient's use of this app. For example, include ways to evaluate the effectiveness of the teaching plan that are a good fit for the type of app, or focus on specific ways that this app benefits the patient's health and wellness.
• References (last slide): List any references for sources that were used or cited in the presentation.
• Writing and design: There should be no spelling or grammatical errors. Writing is concise and clear. Avoid words that the patient may not understand. Slides are visually appealing, incorporating graphics, photographs, colors, and themes. Review the section on the Academic Integrity Policy found in the RNBSN Policies. All work must be original (in your own words) unless properly cited.
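The slide order above is fixed by the directions. Not required by the assignment, but for anyone who prefers to scaffold the deck programmatically before polishing it in PowerPoint, the sketch below lays out that required slide sequence. It assumes the third-party python-pptx library is installed; the placeholder text and output filename are illustrative only.

```python
# Optional sketch: scaffold the required slide order with python-pptx (pip install python-pptx).
# Placeholder text and filename are illustrative; the assignment itself only requires PowerPoint.
from pptx import Presentation

REQUIRED_SLIDES = [
    ("Patient Education Technology Guide to a Mobile Health Application", "Your Name"),
    ("Scenario", "Brief scenario, learning needs, and readiness to learn"),
    ("mHealth Application", "Name, purpose, audience, devices, where to obtain it, citation"),
    ("Teaching Points", "How to use the app safely and effectively"),
    ("Evaluation", "How success of the patient's use of the app will be determined"),
    ("References", "Sources used or cited in the presentation"),
]

prs = Presentation()
title_layout = prs.slide_layouts[0]    # built-in Title Slide layout
content_layout = prs.slide_layouts[1]  # built-in Title and Content layout

for i, (title, note) in enumerate(REQUIRED_SLIDES):
    layout = title_layout if i == 0 else content_layout
    slide = prs.slides.add_slide(layout)
    slide.shapes.title.text = title
    slide.placeholders[1].text = note  # subtitle on slide 1, body placeholder elsewhere

prs.save("mhealth_patient_education.pptx")  # saved in the required .pptx format
```

The script only guarantees that every required slide is present and that the file is saved as .pptx; the graphics, colors, citations, and patient-friendly wording described above still need to be added by hand.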
Best Practices in Preparing PowerPoint Slideshows

• Be creative but realistic.
• Incorporate graphics, color, themes, or photographs to increase interest.
• Make it easy to read with short bullet points and large font.
• Review directions thoroughly.
• Cite all sources within the slides with (author, year), as well as on the reference slide.
• Proofread prior to final submission.
• Check for spelling and grammar errors prior to final submission.
• Abide by the Chamberlain academic integrity policy.
• Tutorial: For those not familiar with the development of a PowerPoint slideshow, the following link to the Microsoft website may be helpful: http://office.microsoft.com/enus/support/training-FX101782702.aspx. The Chamberlain Student Success Strategies (CCSSS) offers a module on Computer Literacy that contains a section on PowerPoint. The link to SSPRNBSN may be found in your student portal.

**Academic Integrity Reminder**
Chamberlain College of Nursing values honesty and integrity. All students should be aware of the Academic Integrity policy and follow it in all discussions and assignments. By submitting this assignment, I pledge on my honor that all content contained is my own original work except as quoted and cited appropriately. I have not received any unauthorized assistance on this assignment. Please see the grading criteria and rubrics on this page.

Systematic review of the types of methods and approaches used to assess the effectiveness of healthcare information websites

Contents
1. Review of healthcare information websites
2. Introduction
3. Methods
4. Results
5. Discussion
6. Limitations of the review
7. Conclusion
8. Conflicts of interest
9. Acknowledgements
10. What is known about the topic?
11. What does this paper add?
12. Table 1. Number and percentage of studies according to measurement focus
13. Table 2. Number and percentage of studies by stage of website development and by evaluation emphasis
14. References

The aim of this systematic review was to identify types of approaches and methods used to evaluate the effectiveness of healthcare information websites. Simple usage data may not be sufficient to assess whether desired healthcare outcomes were achieved or to determine the relative effectiveness of different web resources on the same health topic. To establish the state of the knowledge base on assessment methods used to determine the effectiveness of healthcare websites, a structured search of the literature was conducted in Ovid Medline, resulting in the retrieval of 1611 articles, of which 240 met the inclusion criteria for the present review. The present review found that diverse evaluation methods were used to measure the effectiveness of healthcare websites. These evaluation methods were used during development, before release and after release. Economic assessment was rare and most evaluations looked at content issues, such as readability scores. Several studies did try to assess the usefulness of websites, but few studies looked at behaviour change or knowledge transfer following engagement with the designated health website. To assess the effectiveness of the knowledge transfer of healthcare information through the online environment, multiple methods may need to be used to evaluate healthcare websites and may need to be undertaken at all stages of the website development process.
Research

Review of healthcare information websites

Introduction

Given the diversity of information needed by health professionals and health consumers within a range of care settings, the online environment offers a powerful means by which to disseminate and maintain information currency, as well as encourage engagement with healthcare knowledge. Information seekers now have an unprecedented ability to access vast amounts of information on any health issue via a keyboard or touch screen. Online healthcare information dissemination is powerful because of its reach, its relative cost advantages and its immediate availability. As a result, there is growing interest in the role of e-health and telehealth within the healthcare system, particularly within primary healthcare (Australian Government 2010).

Statistics provide evidence of the rapid uptake and use of the Internet by all age groups, geographic regions and countries for accessing information, including healthcare information. Figures from the Internet World Stats website showed that, in June 2012, there were 2.45 billion people, or 34.3% of the world's population, with access to the Internet (Internet World Stats 2013). According to the Australian Bureau of Statistics, during the same time period in Australia there were over 12 million Internet users, with the number of users increasing at a rate of around 10% annually (Australian Bureau of Statistics 2012). A significant proportion of this Internet activity relates to health issues. A study by Eysenbach and Kohler (2003) examined the prevalence of health-related searches on the web by analysing search terms entered into popular search engines. These authors found that an estimated 4.75% of all searches were health related (Eysenbach and Kohler 2003). The Pew Internet and American Life Project report Health Online 2013 indicated that 72% of Internet users have looked online for health information of one kind or another within the past year; three-quarters of these searches began at a search engine such as Google or Bing (Fox and Duggan 2013).

This demand for online health information highlights the importance of information being derived from credible sources. Although advances in technological capabilities are making it increasingly easier to create online resources, developing high-quality content that is readily accessible draws upon a complex range of skills and knowledge. The types of skills and knowledge required can be most readily seen in the US usability guidelines (US Department of Health and Human Services 2006) or the Australian Government's web publishing guidelines (Australian Government 2012).

An integral part of determining the 'value' of a website is to formally investigate the resource in a structured, purposeful manner. This form of investigation, or evaluation, can support design integrity, successful development and delivery, and appropriate modification and recognition of the website (or other form of online information dissemination). Patton (2008, p. 27) has described evaluation as:

The systematic collection of information about the activities, characteristics, and outcomes of program services, policy or processes, in order to make judgements about the program or process, improve effectiveness, and/or inform decision about future development.

Evaluation activities in the online environment are particularly important for enhancing the functionality of resources for users and maximising the contribution of online information to outcomes in the health system.
By doing so, evaluation can demonstrate the value of the online information and the delivery platform to policy makers, funders and health organisations. Nevertheless, there are challenges in evaluating online health resources. These challenges include different users having different requirements and perspectives on what is a successful outcome (Greenhalgh and Russell 2010; Pawson et al. 2011). In addition, there are human, organisational and technical factors relating to system design, development and use that may also impact on appropriate evaluation (Pagliari 2007; Yusof et al. 2008; Catwell and Sheikh 2009). Regardless of these obstacles, there is an increasing recognition of the value of such studies both to the individual resource and to the body of knowledge that supports online resource development (Eysenbach 2011). However, the landscape of approaches used in evaluating health information websites during design, development and production remains unclear. The present systematic review maps the different evaluation approaches being used to assess the effectiveness of healthcare websites and, in so doing, provides baseline knowledge not only of the evaluation methods being used, but also of aspects of website and information development that appear to be underevaluated.

Methods

The study was conducted between January and April 2012. A literature search of the Ovid SP database was conducted on 7 February 2012. The search comprised two constructs: (1) website (Internet/ OR website$.mp.); and (2) evaluation (Evaluation Studies as a Topic/ OR 'Outcome and Process Assessment (Health Care)'/ OR assessment.mp. OR 'Outcome Assessment (Health Care)'/ OR Quality Indicators, Health Care/ OR Quality Assurance, Health Care/ OR 'Quality of Health Care'/ OR quality.mp. OR Quality Improvement/). The following limits were applied: English language, year = 1995-current. Retrievals were restricted to the following publication types: evaluation studies, case reports, clinical trials, comparative study, meta-analyses or randomised controlled trials.

The literature search retrieved 1611 articles that were downloaded into the electronic reference manager Endnote X4 (Thomson Reuters, Carlsbad, CA, USA) for screening. To be included in the present review, articles needed to: discuss a website; describe a study that was related to evaluation; and be about a website that provides healthcare information. Exclusion criteria included not being an online resource, not being a website (social media platforms were excluded), describing online healthcare professional education, not being a study, not relating to health and/or not being able to retrieve the article.

Abstracts of the 1611 retrieved articles were screened by a research assistant (SLB) against the inclusion and exclusion criteria. Three sets of randomly selected articles (n = 70) were screened by a second rater (JT) to determine inter-rater reliability, which was found to range from 76% to 92% for the three sets of articles. After each exercise, the two raters met to discuss the sources of variability and refine decision making. Finally, 240 articles were identified that met the criteria for inclusion. A flow chart of retrievals, exclusions and studies included is outlined in Fig. 1.
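The agreement figures quoted above (76% to 92%) come from comparing the two raters' include/exclude decisions on the same sets of abstracts. The article does not publish its calculation; the snippet below is a minimal sketch of simple percent agreement for one such set, with the decision lists invented purely for illustration.

```python
# Minimal sketch of two-rater percent agreement on screening decisions.
# The decisions below are invented for illustration; the review screened
# three sets of 70 randomly selected abstracts in this way.

def percent_agreement(rater_a, rater_b):
    """Share of items on which both raters made the same include/exclude call."""
    assert len(rater_a) == len(rater_b)
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return 100.0 * matches / len(rater_a)

# "I" = include, "E" = exclude (hypothetical decisions for 10 abstracts)
rater_a = ["I", "E", "E", "I", "I", "E", "I", "E", "E", "I"]
rater_b = ["I", "E", "I", "I", "I", "E", "I", "E", "E", "E"]

print(f"Agreement: {percent_agreement(rater_a, rater_b):.0f}%")  # -> Agreement: 80%
```

Percent agreement is the simplest possible reliability measure; chance-corrected statistics such as Cohen's kappa are often preferred, but the review reports only the raw agreement range.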
The data extraction form comprised a set of commonly recorded items, such as author and citation details, abstract, name of online resource and web address (where provided), healthcare content area, intended audience, year the study was conducted, description of the evaluation, methods of data collection and details of any statistical analysis undertaken. In addition, four specific review categories were developed (Website Type, Measurement Focus, Stage of Development, and Evaluation Emphasis), as described below. A trial extraction of data from a random selection of 10% of included studies was jointly undertaken (JT, SLB) to finalise the data extraction form and to determine its workability. Weekly meetings were held across the project, allowing for review of coding issues in data extraction.

As noted above, four additional extraction categories were created to provide a further basis for organising the included studies:

* Website Type. This categorisation indicated whether the included study looked at one or more websites and whether it was evaluating the same or different content areas at a single or multiple time points.

* Measurement Focus. This item categorised the included study by the focus of measurement reported in the article. The categories represented specific components of evaluation activity and focus across the design, development, implementation and maturity phases of a website's life cycle. The initial categories were developed from reports in the literature that covered developmental and technical perspectives and user needs (Cunliffe 2000; Calero et al. 2005; Elling et al. 2007; Tankeleviviene and Damasevicius 2009). In a joint data extraction exercise, the categories and descriptors for the Measurement Focus items were expanded, refined and documented. The categories reflect the outcome measures being examined in the evaluation method (e.g. usability testing, web metrics, behaviour change and economic assessment).

* Stage of Development. This categorisation relates to the phase of website development, with three pre-release categories and three post-release categories. The Australian Government Information Office's range of Better Practice Checklists & Guidance materials highlights the need for consideration of the life cycle of a website from concept to decommissioning (Australian Government Information Office 2013).

* Evaluation Emphasis. This categorisation uses an evaluation framework developed to guide evaluation activities for the CareSearch website (http://www.caresearch.com.au.chamberlainuniversity.idm.oclc.org, verified 18 August 2013). The framework was developed following a program logic activity to outline the role of evaluation within the development of the CareSearch website (Tieman and Martin 2009). The three levels of emphasis relate to enhancing access, measuring use and assessing usefulness or impact.

Descriptive statistical analysis was conducted using SPSS version 19.0.01 (SPSS Inc. 2010).

Results

The review found that over half the studies evaluated a particular group of websites on a single topic (e.g. breast cancer information) at a single time point (59.2%; 142/240). Forty-four studies compared the performance of a group of websites across different healthcare topics (18.3%) and 10 studies looked at a particular group of websites (e.g. mental health advice) at different time points (4.2%). Fifteen studies looked at one website at a single time point (6.3%) and a further 31 looked at a single website at multiple time points (12.9%).
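For readers checking the figures, the percentages in the paragraph above are simply each category count divided by the 240 included studies. A small sketch (category labels paraphrased from the text):

```python
# Sketch: the website-type percentages reported above are each category count
# divided by the 240 studies included in the review (labels paraphrased from the text).
website_type_counts = {
    "Group of websites, single topic, single time point": 142,
    "Group of websites, different healthcare topics": 44,
    "Group of websites, different time points": 10,
    "Single website, single time point": 15,
    "Single website, multiple time points": 31,
}

TOTAL_INCLUDED = 240  # studies that met the inclusion criteria

for label, count in website_type_counts.items():
    share = 100 * count / TOTAL_INCLUDED
    print(f"{label}: {count}/{TOTAL_INCLUDED} = {share:.1f}%")
# Note: the paper reports 15/240 as 6.3% (rounding half up); {:.1f} formatting yields 6.2%.
```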
Cancer, mental health, reproductive health and paediatrics were the most common health content areas of the websites included in evaluation studies. When appraising the measurement focus of the evaluation studies, content quality (such as accuracy and currency of the information; 68.8%; 165/240) and structural elements (such as menu systems, navigation or hyperlinks; 43.3%; 104/240) were the two most commonly assessed areas, as indicated in Table 1. Project evaluation and economic assessment were the least common types of measurement focus.

Most of the studies reviewed were conducted after the release of the designated website (80.8%; 194/240). Approximately 30% of the studies were conducted before release, during the concept analysis and/or needs analysis phase (14.2%; 34/240) or the development phase (17.5%; 42/240), as indicated in Table 2. Although more than 80% of the studies (199/240) addressed issues relating to enhancing access to the website, less than one-third attempted to assess the usefulness of the resource in terms of whether it had actually changed practice or outcomes (Table 2).

Discussion

Throughout the screening and analysis, it became clear that the evaluation methods being described were extensive and variable. Understanding the range and nature of approaches being used, and possible issues associated with implementing particular approaches, is important to ensure that the most appropriate types of studies are undertaken to meet different evaluation purposes, and that the studies reflect the different stages of website development. Eysenbach (2011) has highlighted the need to improve and standardise the quality of reporting in this field to facilitate the use and dissemination of this published research.

The significant number of articles reporting on the findings of evaluations of health information websites suggests that the evaluation of online resources is feasible and of interest. Even though there was considerable diversity in terms of the focus of the measurement of the studies, very few studies addressed the economics of online health information provision, or the costs and benefits of information transfer within this environment, either independently or compared with other forms of information transfer. Most studies also described work that was conducted at a single time point, limiting the ability to assess the impact of changes to individuals or to websites over time. Studies around the assessment of the quality of the information content provided in the website and readability levels, reports on changes to structural aspects of the website, and issues in search and search engine retrieval were much more commonly published than studies addressing behaviour change or knowledge transfer. Although ensuring functionally accessible websites is extremely important, more emphasis is needed on assessing the impact that engagement with these online resources has on individuals and on the health system.

Relatively few studies focused on program evaluations. Hence, there is only a limited amount of published material available to web developers and project managers that assesses the performance of the online resource against specified criteria from funding agencies or policy makers.
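Readability scoring is one of the content-quality checks mentioned above. As a concrete (if simplified) example, the widely used Flesch Reading Ease formula combines average sentence length with average syllables per word. The sketch below uses a deliberately naive syllable counter purely for illustration, so treat its output as approximate; the review itself does not prescribe any particular readability tool.

```python
# Illustrative sketch of the Flesch Reading Ease score, one common readability measure
# used in website content-quality studies. The syllable counter is deliberately naive,
# so scores are approximate; real evaluations use validated readability tools.
import re

def count_syllables(word: str) -> int:
    # Rough heuristic: count groups of vowels; every word has at least one syllable.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text: str) -> float:
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (len(words) / len(sentences)) - 84.6 * (syllables / len(words))

sample = "Take your blood pressure pill every morning. Call the clinic if you feel dizzy."
print(round(flesch_reading_ease(sample), 1))  # higher scores indicate easier-to-read text
```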
The apparent paucity of evaluation research in this field may also reflect the reality of decision-making processes involved in committing to creating an online resource, which may restrict the time and capacity to articulate and define the purposes and intended contributions of the resource to health outcomes. For groups commissioning or creating websites or webpages, the results of the present review would indicate that more needs to be done in articulating desired outcomes of the project, not just outlining the product specification. Partners in web planning and development need to consider who the intended users are, the capacities and experience of these users and the circumstances under which they will seek, find and use the health information provided (Ekeland et al. 2012). The information needs of the intended audience should guide planning and inform decision making about formats, presentation, design and navigation to enhance knowledge transfer and knowledge use.

There were some indications that different methods have been used at different stages of the website development process. User testing activities and usability studies were reported in several studies, suggesting that formative evaluation before launch has been seen as a valuable aspect of the web development process. However, it is worth noting that although there were many studies looking at issues around the quality and accessibility of the online content, very few explicitly explored issues around access for marginalised groups, such as online options for non-English-speaking groups or enhancements for use of information by intellectually disabled groups. Previous research on a 'digital divide' highlights the importance of not only providing resources in appropriate forms, but also of facilitating and supporting access by the whole community (Reinfeld-Kirkman et al. 2010; Kruse et al. 2012; Choi and Dinitto 2013). Given that many see online information provision as a remedy that enables equitable distribution of health information, this remains an area for further study.

Most post-release studies reported on visitor numbers and usage statistics, or provided the results of visitor surveys and user satisfaction scales. However, there was often little interpretation of the possible meaning of these metrics other than as trend indicators of use. The possibility of web metrics acting as surrogates of individual and health system actions needs to be explored. Further research around the meaning of usage patterns could add great value to these readily available metrics. For example, commercial enterprises will assess the relationship between product views and orders, and then use these web metrics to assess the impact of marketing strategies or product releases. Identifying and evaluating possible relevant metrics for health information would be a valuable piece of research. For example, does time of use correlate with different environmental circumstances for users, such as no colleagues being available for advice during night duty? Or what number of page views correlates to actual visitor action, such as booking an appointment with a general practitioner? Glynn et al. (2011) have already used a web metric system (Google Insights) to show a relationship between an annual breast cancer awareness campaign and online breast cancer activities.
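The commercial view-to-order comparison described above has a direct analogue for health websites: pair a usage metric with a downstream action and track the ratio over time. The sketch below is a hypothetical illustration in that spirit (the metric names and figures are invented), not a method reported in the review.

```python
# Hypothetical sketch: pairing a web metric (views of an appointment-booking page)
# with a downstream action (appointments actually booked), in the spirit of the
# product-views-to-orders comparison used by commercial sites. All figures are invented.
monthly_metrics = [
    {"month": "Jan", "booking_page_views": 1200, "appointments_booked": 84},
    {"month": "Feb", "booking_page_views": 1350, "appointments_booked": 95},
    {"month": "Mar", "booking_page_views": 1100, "appointments_booked": 66},
]

for row in monthly_metrics:
    rate = row["appointments_booked"] / row["booking_page_views"]
    print(f'{row["month"]}: {rate:.1%} of booking-page views led to a booked appointment')
```

Interpreting such a ratio still requires care: a change could reflect campaign timing, seasonal illness patterns or a site redesign rather than the quality of the information itself, which is exactly the kind of contextual research the authors call for.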
Similarly, comparative data on usage rates and penetration are needed for health websites to provide background information against which to assess the effectiveness of strategies and approaches used in website development and delivery.

It is interesting to note that many of the evaluation studies were not undertaken by the agencies responsible for the online resource. This suggests that the study of online health information has become an area of research interest in its own right. For many health agencies, assessing the quality of online resources is important because patients, carers and families are using this information for self-diagnosis, decision making about treatment options or as part of their engagement with health professionals (McMullan 2006; Sillence et al. 2007; Boucher 2010; Rubenstein 2012).

Finally, this review developed several evaluation variables to differentiate between the focus measures being used, the website types under review, the stage of website development and the actual emphasis of the evaluation. This highlights the need for a common language to describe not only the design characteristics of evaluation studies, but also the contribution of the evaluation research in terms of stage of development and the focus of the evaluation with respect to the information users, funders and the health system.

Limitations of the review

The present review only included articles published after 1995 from a single biomedical bibliographic database. It is likely that there are substantial numbers of unpublished reports and conference presentations looking at evaluations of health websites. It should be noted that no quality assessment of the individual studies was undertaken. The focus of the analysis was on the purpose of the study, not on the conduct of the study. This study used several evaluation schemas developed for the review. These categories have not been independently validated and assessed as website evaluation scales, so additional examination of the value of these schemas is warranted. The present study only looked at health information provided through a website. The reality is that the web is no longer a vast library of web pages accessed through discrete websites; it is a complex mix of information sources and formats, online interfaces, searching tools and brokers, and participation and management gateways. It is unknown whether similar approaches to evaluation can be used for other types and forms of online information dissemination and exchange.

Conclusion

Healthcare information is no longer the province of the local doctor or nurse. Just as the nature of medical technology has changed, so too has provision of healthcare information. The online environment has changed the ways in which health consumers and health professionals seek and engage with health information, but our understanding of how effectively information is being provided and used through this medium remains limited. There is a need for further research that looks beyond the creation and access of health websites to the impact that health websites have on outcomes for health consumers and their effects on health professionals and health services. Evaluation activities and studies of evaluation processes are an essential part of the process of understanding the contribution these resources can make. Evaluation activities undertaken during planning and development can assist in developing accessible and usable websites.
Evaluation undertaken following release of a website can not only help demonstrate the use of these online resources, but also help assess the effect of these online resources on individuals and, potentially, on services and organisations and the health system. Commissioning agencies need to ensure that web developers and content providers are aware of best practice requirements and encourage research to direct the effective preparation and use of web-based healthcare information.

Conflicts of interest

JT and SLB are part of the CareSearch Project staff. CareSearch is funded by the Australian Government Department of Health and Ageing.

Acknowledgements

The authors acknowledge the contribution of Amanda Adams, who created the SPSS file and was responsible for data entry, data cleansing and data outputs.

What is known about the topic?
* There is increasing use of health websites.
* Studies on website usage and their impact can be used to improve their functionality and contribution.

What does this paper add?
* This paper reviews methods used to evaluate health website effectiveness.
* Diverse methods are used and at different stages of development.
* Few studies looked at economic assessment and behaviour change.

Table 1. Number and percentage of studies according to measurement focus
Measurement focus | No. of studies | % | Description
Feasibility | 29 | 12.1 | Formative evaluation identifying need, consideration of audience, items for inclusion etc.
Heuristic | 22 | 9.2 | Systematic inspection of a user interface design for usability by an expert
User testing | 32 | 13.3 | Feedback of the prototype website by intended users
Text content assessment | 56 | 23.3 | Includes readability assessment, literacy testing, text analysis
Content quality | 165 | 68.8 | Studies of the accuracy, currency and quality of the website content; can include automated assessment
Structural elements | 104 | 43.3 | Measures structural elements of the website, such as navigability, menu systems, hyperlinks
Visuals/graphic identity | 47 | 19.6 | Studies looking at presentation of the website (e.g. inclusion of high-quality pictures)
Metric analysis | 37 | 15.4 | Retrieval and analysis of site metrics, such as visitor numbers, referrals
Search engine optimisation | 61 | 25.4 | Studies that assess the effectiveness of page tagging, search term analysis etc. that lead a user to the website
Visitor satisfaction surveys | 25 | 10.4 | Online/offline surveys of satisfaction with the resource
Knowledge transfer | 29 | 12.1 | Studies that assess whether access and engagement with the website has led to an increase in knowledge or understanding by the web visitor
Behaviour change | 32 | 13.3 | Studies that assess whether visitors' health behaviours have changed due to engagement with a website (e.g. stopped smoking, anxiety reduced etc.)
Project evaluation | 7 | 2.9 | Assessment by funders, policy makers of the value of their website project
Economic assessment | 7 | 2.9 | Cost-benefit analysis, economic analysis, cost pricing of an individual website

Table 2. Number and percentage of studies by stage of website development and by evaluation emphasis
Category | No. of studies | %
Stage of website development
Concept analysis and/or needs assessment | 34 | 14.2
Development phase | 42 | 17.5
Release/launch | 8 | 3.3
Post-release effectiveness | 194 | 80.8
Iterative enhancements | 6 | 2.5
Redesign | 5 | 2.1
Other | 1 | 0.4
Evaluation emphasis
Access (facilitating the ability of users to be able to access the resource) | 199 | 82.9
Use (tracking if, and how, the resource is being used) | 68 | 28.3
Usefulness (addressing whether the resource has made a difference to use or practice) | 72 | 30.0

Fig. 1. Schematic representation of article screening.

References

Australian Bureau of Statistics (2012) 'Internet activity, Australia.' (Australian Bureau of Statistics: Canberra) Available at http://www.abs.gov.au/AUSSTATS/abs@.nsf/Lookup/8153.0Main+Features1Jun%202012?OpenDocument [Verified 21 February 2013]
Australian Government (2010) 'Building a 21st century primary health care system: Australia's first national primary health care strategy.' (Department of Health and Ageing) Available at http://www.yourhealth.gov.au/internet/yourhealth/publishing.nsf/Content/report-primaryhealth/$File/NPHCS-Foreword-intro.pdf [Verified 21 February 2013]
Australian Government (2012) 'The Australian Government web guide.' (Department of Finance and Deregulation) Available at http://webguide.gov.au/ [Verified 25 February 2013]
Australian Government Information Office (2013) 'Better practice checklists & guidance.' Available at http://agict.gov.au/policy-guides-procurement/better-practice-checklists-guidance/ [Verified 27 July 2013]
Boucher J (2010) Technology and patient-provider interactions: improving quality of care, but is it improving communication and collaboration? Diabetes Spectrum 23(3), 142-144. doi:10.2337/diaspect.23.3.142
Calero C, Ruiz J, Piattini M (2005) Classifying web metrics using the web quality model. Online Information Review 29(3), 227-248. doi:10.1108/14684520510607560
Catwell L, Sheikh A (2009) Evaluating eHealth interventions: the need for continuous systemic evaluation. PLoS Medicine 6(8), e1000126. doi:10.1371/journal.pmed.1000126
Choi NG, Dinitto DM (2013) The digital divide among low-income homebound older adults: Internet use patterns, eHealth literacy, and attitudes toward computer/Internet use. Journal of Medical Internet Research 15(5), e93. doi:10.2196/jmir.2645
Cunliffe D (2000) Developing usable websites: a review and a model. Internet Research 10(4), 295-308. doi:10.1108/10662240010342577
Ekeland AG, Bowes A, Flottorp S (2012) Methodologies for assessing telemedicine: a systematic review of reviews. International Journal of Medical Informatics 81(1), 1-11. doi:10.1016/j.ijmedinf.2011.10.009
Elling S, Lentz L, de Jong M (2007) Website evaluation questionnaire. Development of a research-based tool for evaluating informational websites. In 'Proceedings of the 6th International Conference EGOV 2007, LNCS 4656'. (Eds MA Wimmer, HJ Scholl, A Grönlund) pp. 293-304.
Eysenbach G (2011) CONSORT-EHEALTH: improving and standardizing evaluation reports on web-based and mobile health interventions. Journal of Medical Internet Research 13(4), e126. doi:10.2196/jmir.1923
Eysenbach G, Kohler C (2003) What is the prevalence of health-related searches on the World Wide Web? Qualitative and quantitative analysis of search engine queries on the Internet. AMIA Annual Symposium Proceedings 2003, 225-229.
Fox S, Duggan M (2013) 'Health online 2013.' (Pew Internet & American Life Project) Available at http://pewinternet.org/Reports/2013/Health-online.aspx [Verified 21 February 2013]
Glynn RW, Kelly JC, Coffey N, Sweeney KJ, Kerin MJ (2011) The effect of breast cancer awareness month on internet search activity: a comparison with awareness campaigns for lung and prostate cancer. BMC Cancer 11(1), 442. doi:10.1186/1471-2407-11-442
Greenhalgh T, Russell J (2010) Why do evaluations of ehealth programs fail? An alternative set of guiding principles. PLoS Medicine 7(11), e1000360. doi:10.1371/journal.pmed.1000360
Internet World Stats (2013) 'World Internet usage and population statistics, June 30, 2012.' Available at http://www.internetworldstats.com/stats.htm [Verified 21 February 2013]
Kruse RL, Koopman RJ, Wakefield BJ, Wakefield DS, Keplinger LE, Canfield SM, Mehr DR (2012) Internet use by primary care patients: where is the digital divide? Family Medicine 44(5), 342-347.
McMullan M (2006) Patients using the Internet to obtain health information: how this affects the patient-health professional relationship. Patient Education and Counseling 63(1-2), 24-28. doi:10.1016/j.pec.2005.10.006
Pagliari C (2007) Design and evaluation in eHealth: challenges and implications for an interdisciplinary field. Journal of Medical Internet Research 9, e15. doi:10.2196/jmir.9.2.e15
Patton MQ (2008) 'Utilization-focused evaluation', 4th edn. (Sage: Thousand Oaks, CA)
Pawson R, Wong G, Owen L (2011) Known knowns, known unknowns, unknown unknowns: the predicament of evidence-based policy. The American Journal of Evaluation 32, 518-546. doi:10.1177/1098214011403831
Reinfeld-Kirkman N, Kalucy E, Roeger L (2010) The relationship between self-reported health status and the increasing likelihood of South Australians seeking Internet health information. Australian and New Zealand Journal of Public Health 34(4), 422-426. doi:10.1111/j.1753-6405.2010.00576.x
Rubenstein EL (2012) 'Things my doctor never told me': Bridging information gaps in an online community. American Society for Information Science and Technology. Available at https://www.asis.org/asist2012/proceedings/Submissions/126.pdf [Verified 27 July 2013]
Sillence E, Briggs P, Harris PR, Fishwick L (2007) How do patients evaluate and make use of online health information? Social Science & Medicine 64(9), 1853-1862. doi:10.1016/j.socscimed.2007.01.012
SPSS Inc. (2010) 'IBM SPSS statistics (Version 19.0.0.1).' (IBM Corporation: Somers, NY)
Tankeleviviene L, Damasevicius R (2009) Characteristics of domain ontologies for web based learning and their application for quality evaluation. Informatics in Education 8(1), 131-152.
Tieman J, Martin P (2009) Metrics, measures and meanings: evaluating the CareSearch website. In 'Positioning the profession: the Tenth International Congress on Medical Librarianship', Brisbane, Australia, 31 August-4 September 2009. Available at http://espace.library.uq.edu.au/eserv/UQ:179705/n2%5F2%5FWed%5FTieman%5F77.pdf [Verified 27 July 2013]
US Department of Health and Human Services (2006) 'The research-based web design and usability guidelines, enlarged/expanded edition.' (US Government Printing Office: Washington DC)
Yusof MM, Papazafeiropoulou A, Paul RJ, Stergioulas LK (2008) Investigating evaluation frameworks for health information systems. International Journal of Medical Informatics 77, 377-385. doi:10.1016/j.ijmedinf.2007.08.004

Received 28 February 2013, accepted 6 August 2013, published online 5 September 2013
By Jennifer Tieman and Sandra L. Bradley, Palliative and Supportive Services, School of Medicine, Flinders University, Health Sciences Building, Repatriation General Hospital Campus, Daw Park, SA 5041, Australia. Corresponding author. Email: Jennifer.Tieman@flinders.edu.au

Copyright of Australian Journal of Primary Health is the property of CSIRO Publishing and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use.

Hebda, T., & Czar, P. (2013). Handbook of informatics for nurses & healthcare professionals (5th ed.). Boston, MA: Pearson.
• Chapter 4: The Internet and the World Wide Web: An Overview (pp. 78–79 and 83–95)
• Chapter 6: Healthcare Information Systems (pp. 114–127)
• Chapter 23: Integrating Technology, Informatics, and the Internet into Nursing Education (pp. 472–475)
• Chapter 24: Consumer Education and Informatics (pp. 483–489)

Article (required): Boudreaux, E. D., Waring, M. E., Hayes, R. B., Sadasivam, R. S., Mullen, S., & Pagoto, S. (2014). Evaluating and selecting mobile health apps: Strategies for healthcare providers and healthcare organizations. Translational Behavioral Medicine, 4(4), 363-371. Retrieved from https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4286553/

Article (suggested): Tieman, J., & Bradley, S. L. (2013). Systematic review of the types of methods and approaches used to assess the effectiveness of healthcare information websites. Australian Journal of Primary Health, 19(4), 319–324. doi:10.1071/PY13030. Retrieved from https://chamberlainuniversity.idm.oclc.org/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=a9h&AN=93284343&site=eds-live&scope=site

Website (suggested): National Patient Safety Foundation (NPSF). (2015). Roadmap for patient education on electronic health records. Retrieved from http://c.ymcdn.com/sites/www.npsf.org/resource/resmgr/PSAW Resources 2015/Moore Roadmap.pdf