Guest (guest) - Posted October 5, 2000

Reliance on exams is one of the follies of present-day education. Currently there is a giant dialogue going on between the Presidential candidates over testing in the public schools. Tests are wonderful at measuring your retention of the material you've been taught in preparation for the test. In other words, if they teach the TAAS test, which all of them do, then the student will be able to answer the questions on the TAAS test but not make change when my checkout total is $1.13, they've already rung up $2.00 tendered, and I suddenly present $5.25 in payment instead, so the 87 cents the register computed no longer applies and they have to figure the $4.12 of change on their own.

Nowhere do we adequately measure critical thinking and performance abilities except in structured scenario practice, which takes lots of time, personpower, and money. But if we really want to know how people are going to perform in the field, how can we NOT employ the only measurements we know work? (How do we know they work? Actually, I can't prove they do! I think they do, but I'm not on any better ground than my CQI friends who rely upon reported interventions to develop statistical suppositions.)

I expect that there will be several who are quite familiar with standardized testing who will disagree with me and assert that standardized multiple-choice testing can report a highly correlated prediction of performance in the field. I don't buy it for one minute. I know of NO standardized tests measured against field performance. I also know of practically no valid measurements of field performance for an individual, compared against anything. The existing CQI programs are so primitive and subjective that none can be relied upon. CQI managers, please reply and prove me wrong.

All educators want to know how our students will perform in the field once certified/licensed. To date, I know of no valid predictors of performance based upon written exams. If anybody can tell me of any, I'd appreciate it. My hypothesis is that enhanced education produces better medics, but I can't prove it. Can anybody prove or disprove it?

Gene Gandy
Guest (guest) - Posted October 5, 2000

In a message dated 10/5/00 11:18:12 PM Central Daylight Time, wegandy@... writes:

<< OK, now here starts the commercial> We at TJC want to work with ALL people in the above category to help them get the educational credentials they deserve. We can do lots for you. We don't care who nor where you are. We're currently working with a person on an aircraft carrier to get a degree in Paramedicine. >>

That is awesome!
Guest (guest) - Posted October 6, 2000

You got me, . Give yourself two on the next quiz!

Chris
Guest (guest) - Posted October 6, 2000

Gene Gandy wrote: "My hypothesis is that enhanced education produces better medics, but I can't prove it. Can anybody prove or disprove it?"

Gene - I hope that no one doubts this. Are we still getting those that do? (I ask this knowing that you and I have had some folks in years past who doubted it, but I'm just wondering whether anyone now can truthfully say they don't think enhanced education improves any field of endeavor.) It's just kind of a no-brainer.

>>> wegandy@... 10/05/00 11:39PM >>>
Guest (guest) - Posted October 6, 2000

Gene - I just realized mentioning "a no-brainer" might actually be a rather poor pun regarding education!

>>> ems_elbd@... 10/06/00 09:17AM >>>
Guest (guest) - Posted October 7, 2000

In a message dated 10/6/00 8:23:41 AM Central Daylight Time, Clgrote126@... writes:

<< You got me, . Give yourself two on the next quiz! >>

Thanks Chris! I need all the extra help I can get!
Guest (guest) - Posted October 8, 2000

Gene,

The subject of testing has been discussed at work recently, and several of my colleagues agreed with your position that testing is no clear indicator of field performance. I agree, but only in a limited sense. We have all worked with the partner whom we trust and respect, despite the fact that we know they made a 70 on their state exam and 10 of their points came from lucky guesses. These tactile learners can "do" a heck of a lot better than they can prove on an exam. The other end of the spectrum is the egghead who made a 98 on his EMT-P exam but can't read a cardiac monitor. Face it, the test score only quantifies what percentage of the information tested the candidate was able to recall at the time he took the test.

I realize that from what I have written so far, one could draw the conclusion that I am against written and practical testing. Not hardly. I am against the current system that requires us to spend time and money every four years on an exam that isn't even called an exam and means absolutely nothing. In regard to the poor test takers, it has been my experience that for every good medic who can't test, there are 20 who also failed the exam and about whom you say a prayer of thanks that they won't be in the field. To put it another way, for every good test-taker/bonehead medic who passes the test, there are probably 20 bad test-takers/boneheads who fail and deserve to do so.

I guess the best way to consider this issue is to measure the candidate by the one rule that is accepted universally in this line of work: would you trust the person in question to care for you or your kids? If all other factors are equal, I'll take the guy who made a 90 over the guy who made a 70 every time.

Steve Pike

Re: [texasems-L] Thoughts on eMail writing