Development and Validation of an Assessment Instrument for Course Experience in a General Education Integrated Science Course
Published in: | Journal of geoscience education, 2017-11, Vol. 65 (4), p. 435-454 |
---|---|
Main Authors: | Liu, Juhong Christie; St. John, Kristen; Courtier, Anna M. Bishop |
Format: | Article |
Language: | English |
container_end_page | 454 |
container_issue | 4 |
container_start_page | 435 |
container_title | Journal of geoscience education |
container_volume | 65 |
creator | Liu, Juhong Christie; St. John, Kristen; Courtier, Anna M. Bishop |
description | Identifying instruments and surveys to address geoscience education research (GER) questions is among the high-ranked needs in a 2016 survey of the GER community (St. John et al., 2016). The purpose of this study was to develop and validate a student-centered assessment instrument to measure course experience in a general education integrated science course. A one-shot case study of pre-experimental design (Creswell, 2009, 2014) was used to understand student experiences in a large-enrollment course with digital content integration, including out-of-class video, online presentations, and warm-up questions, as well as in-class video paired with discussion and small group activities. In two sections taught by the same geoscientist, 209 students accessed course content in an online learning management system prior to classroom instruction. We adapted the 36-item Course Experience Questionnaire (CEQ; Wilson et al., 1997) to a Web-based survey. We conducted statistical analysis on the CEQ responses, including item factor analysis, examination of communalities and measures of association, confirmatory factor analysis, and reliability and stability testing through bootstrap resampling. The statistical analyses indicate that the results in this study are comparable to those from other cultural contexts and subject areas. Given the moderate fit of the model and reasonably stable results, we propose that the core indicators of the CEQ constructs, including Appropriate Workload, Clear Goals and Standards, Generic Skills, Good Teaching, and Emphasis on Independence are sufficient to assess students' course experiences in general education science settings. These results provided moderately to strongly valid information that can help instructors in designing their technology-integrated classes, although further study with a larger sample population would be beneficial. From this study we conclude that students' perceptions of their course experiences are closely related to their development of problem-solving and analytical skills, clear course expectations to direct their own study plans, their preference for a motivating instructor, the opportunity to have a variety of learning choices, and their workload. We propose the use of a 25-item version of the CEQ that is appropriate for formative assessment in discipline-specific STEM introductory classes, including geoscience classes. |
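The record does not include the authors' analysis scripts, so the following is only a minimal illustrative sketch of the kind of reliability-and-stability check the abstract describes: bootstrap resampling of respondents to estimate Cronbach's alpha for a Likert-style questionnaire. The choice of Python/NumPy, the function names, and the simulated 209 × 36 response matrix are all assumptions standing in for the study's actual CEQ data and procedure.

```python
import numpy as np

def cronbach_alpha(responses: np.ndarray) -> float:
    # responses: (n_respondents, n_items) matrix of Likert-scale answers.
    k = responses.shape[1]
    item_vars = responses.var(axis=0, ddof=1)
    total_var = responses.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

def bootstrap_alpha_ci(responses: np.ndarray, n_boot: int = 2000, seed: int = 0) -> np.ndarray:
    # Resample respondents with replacement; return a 95% percentile interval for alpha.
    rng = np.random.default_rng(seed)
    n = responses.shape[0]
    alphas = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)
        alphas[b] = cronbach_alpha(responses[idx])
    return np.percentile(alphas, [2.5, 97.5])

if __name__ == "__main__":
    # Hypothetical stand-in for 209 students x 36 CEQ items on a 5-point scale (not the study's data).
    rng = np.random.default_rng(1)
    latent = rng.normal(size=(209, 1))                 # shared "course experience" factor
    noise = rng.normal(scale=0.8, size=(209, 36))
    data = np.clip(np.round(3 + latent + noise), 1, 5)
    print("Cronbach's alpha:", round(cronbach_alpha(data), 3))
    print("95% bootstrap CI:", bootstrap_alpha_ci(data).round(3))
```

Resampling respondents (rather than items) mirrors the stability testing through bootstrap resampling mentioned in the abstract; the confirmatory factor analysis step would require a dedicated SEM/CFA library and is not sketched here.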
doi_str_mv | 10.5408/16-204.1 |
format | article |
publisher | Bellingham: National Association of Geoscience Teachers |
fulltext | fulltext |
identifier | ISSN: 1089-9995 |
ispartof | Journal of geoscience education, 2017-11, Vol.65 (4), p.435-454 |
issn | 1089-9995 2158-1428 |
language | eng |
recordid | cdi_eric_primary_EJ1161360 |
source | Education Collection (Proquest) (PQ_SDU_P3); Taylor & Francis; Social Science Premium Collection |
subjects | assessment instrument; CAI; Case studies; Classroom Desegregation; Classrooms; College Science; Computer assisted instruction; Construction standards; Core curriculum; Course Content; course experience; Distance learning; Earth Science; Education; Educational Environment; Educational Research; Educational technology; Electronic Learning; Experimental design; Factor Analysis; General Education; Higher education; Information literacy; Learning Processes; Learning Strategies; Learning Theories; Mathematics Education; Measures (Individuals); Nonmajors; On-line systems; Online Surveys; Polls & surveys; Population (statistical); Problem solving; Quality; Questionnaires; Reliability analysis; Reliability aspects; Resampling; Researchers; Science Curriculum; Science education; Skills; Stability analysis; Statistical Analysis; STEM Education; Student Attitudes; Student Experience; Student Surveys; Students; Studies; Teachers; Teaching; Teaching Methods; Test Construction; Test Reliability; Test Validity; Tourism; Undergraduate Students; Undergraduate Study; Validity; Working conditions; Workload; Workloads |
title | Development and Validation of an Assessment Instrument for Course Experience in a General Education Integrated Science Course |