Selectivity of schools based on 2012-13 standardized test scores

<p>Affluent Students Have an Advantage - Graphic - NYTimes.com</p>

<p>Here’s a graphic from the NYTimes showing (unsurprisingly) that the wealthier the family, the more likely a student is to graduate from college.</p>

<p>Yes, I looked at the graphic. Here is a link to an article in CNNMoney that suggests income matters:
<a href="http://money.cnn.com/2011/11/21/news/economy/income_college/index.htm">Income gap continues to affect college graduation rates - Nov. 21, 2011</a></p>

<p>I don’t really doubt that income is correlated with college education success. But these are correlations. I think one has to use reason to figure out the likely causes behind the correlations.</p>

<p>Is the problem poverty and financial struggles? Or is there something else limiting the student’s achievement? The only way to find out for sure is to provide every student with the very best parenting and education from day 1. Then see where that leads them. Some will still not succeed because it simply is not in their nature. Others will succeed. </p>

<p>Students who score high on the SATs have both natural ability and favorable parenting and education prior to college, so they are well prepared. High SAT scores reflect the fact that all the relevant factors have worked in the student’s favor, and such students are more likely to succeed in college.</p>

<p>Students who have been born into a difficult environment need help from day 1. Throwing money at the problem when they are 18 years old…well, for some the help comes too late.</p>

<p>It could very well be that the parents earn less because the relevant factors were not in their favor either and so the cycle continues.</p>

<p>I think the most important predictor of success in college is the SATs or ACTs because they measure preparedness for college.</p>

<p>I wish the USDOE collected data about high school GPA and rank as well as SAT scores.</p>

<p>Most studies indicate SAT tests are POOR at predicting college success and contribute very little to predicting any outcomes. </p>

<p>[Change</a> Magazine - Back to the Basics: In Defense of Achievement (and Achievement Tests) in College Admissions](<a href=“http://www.changemag.org/Archives/Back%20Issues/January-February%202009/full-back-to-basics.html]Change”>http://www.changemag.org/Archives/Back%20Issues/January-February%202009/full-back-to-basics.html)</p>

<p>That is a great article, barrons. Thanks for providing the link. I’ve read about the UC research on SATs in the past. I was very interested in the note at the end on methodology.</p>

<p>Without going into detail, I’ll just say that the article reflects a common misconception about SATs and graduation rates. The conclusions the researchers came to in their studies are an artifact of their methods. In multiple regression, variables are entered in a certain order, and the variables entered first absorb the variance they share with the variables entered later… sometimes until little “power” is left for the last variable, in their case the SATs. The reason SATs had little “predictive power” in their research is the choice they made about the order of entry of the variables in their statistical analysis.</p>
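<p>To see the order-of-entry effect concretely, here is a toy sketch with synthetic data (made-up numbers, not the UC data, assuming numpy is available). When two predictors are correlated, the incremental R-squared credited to the one entered last shrinks dramatically:</p>

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
# Two correlated predictors (think HS GPA and SAT) and an outcome driven by both.
gpa = rng.normal(size=n)
sat = 0.8 * gpa + 0.6 * rng.normal(size=n)  # SAT correlates strongly with GPA
grad = 0.5 * gpa + 0.5 * sat + rng.normal(size=n)

def r2(predictors, y):
    """In-sample R-squared of an OLS fit with an intercept."""
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

sat_alone = r2([sat], grad)                             # SAT entered first
sat_after_gpa = r2([gpa, sat], grad) - r2([gpa], grad)  # SAT entered last
print(sat_alone, sat_after_gpa)  # the second number is far smaller
```

<p>Same data, same SAT; only the order of entry changed.</p>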

<p>Statistics don’t “lie” but people can “lie” with statistics. It isn’t really a lie; it is more like failing to make full disclosure in order to support one’s bias.</p>

<p>The College Board is actually correct when they say that SATs are powerful predictors of college success. I just calculated the correlations between graduation rates and SATs for hundreds of universities and colleges. The correlations were very high. For public universities, the correlation between SAT Math 25th percentile and graduation rates was +.72. Among liberal arts colleges, the correlation was +.79. Among private universities, the correlation was +.82. </p>
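<p>For anyone who wants to check correlations like these themselves, the calculation is a one-liner. A minimal sketch with invented illustrative numbers (not my actual IPEDS figures), assuming numpy:</p>

```python
import numpy as np

# Invented data: SAT Math 25th percentile and 6-year graduation rate (%).
sat_25 = np.array([500, 550, 600, 650, 700, 750])
grad_rate = np.array([55, 62, 71, 78, 88, 93])

r = np.corrcoef(sat_25, grad_rate)[0, 1]  # Pearson correlation coefficient
print(round(r, 2))  # close to +1 for this toy data
```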

<p>I’ll tell you another reason for the misconception about the predictive power of SATs: it is much harder to predict outcomes for individual students based on SATs than to predict aggregate outcomes for entire freshmen classes. What this means is that colleges will be able to reliably predict that their graduation rates will increase if they raise their average SAT scores but it will be harder for them to predict which particular students will graduate.</p>
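<p>That aggregation effect is easy to demonstrate with simulated data (made-up parameters, assuming numpy): the individual-level correlation stays modest, but averaging over whole freshman classes washes out the student-level noise:</p>

```python
import numpy as np

rng = np.random.default_rng(1)
n_schools, n_students = 200, 500
# Each school's students cluster around a school-level mean SAT.
school_mean_sat = rng.normal(1100, 100, n_schools)
sat = school_mean_sat[:, None] + rng.normal(0, 80, (n_schools, n_students))
# A weak individual-level link from SAT to a graduation propensity, plus noise.
grad = 0.002 * sat + rng.normal(0, 1, sat.shape)

individual_r = np.corrcoef(sat.ravel(), grad.ravel())[0, 1]
aggregate_r = np.corrcoef(sat.mean(axis=1), grad.mean(axis=1))[0, 1]
print(individual_r, aggregate_r)  # the aggregate correlation is far higher
```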

<p>By the way, I also found high INVERSE correlations between the percent of students receiving Pell grants and graduation rates. So, family income is evidently a good predictor of success, too.</p>

<p>It is very important to help students overcome financial challenges and succeed in college. Nothing I have said above suggests otherwise. I hope I am not misunderstood.</p>

<p>Your entire thread is misleading. Again, fix your data and stop implying selectivity based on SAT. It’s not an absolute scale. Is Vandy or Columbia more selective than Stanford based on SAT scores? It’s definitely not, but by your metric and misleading titles, it is.</p>

<p>What you have is a classic faulty conclusion by the CEEB, because SAT scores are highly correlated with income and related factors like cultural development. And higher income yields higher SAT scores and graduation success.</p>

<p>[No</a> Rich Child Left Behind - NYTimes.com](<a href=“http://opinionator.blogs.nytimes.com/2013/04/27/no-rich-child-left-behind/]No”>No Rich Child Left Behind - The New York Times)</p>

<p>blah, on what should selectivity be based? And, is that data available? Are you saying the numbers are incorrect? I think the numbers are correct. Not sure what you are saying…</p>

<p>barrons, another great article, thanks. It would be an interesting experiment to identify 200 equally at-risk families and give half of them a financial boost from birth along with intensive mentoring. I bet you’d see an effect on their outcomes. It would encourage early investment in struggling families. But, would it be ethical?</p>

<p>collegehelp, you have probably seen this link, but in case you haven’t, I find it interesting: <a href="http://www.gradeinflation.com/SATcomparison.html">SAT Comparison, 1966 versus 2006</a>.</p>

<p>Thanks JuniorMint. That was very interesting. Some schools saw major swings! It would be interesting to see more datapoints over time. Maybe the average SAT has been cyclical. More students take the SAT now. I wonder how that might be a factor.</p>

<p>I find it interesting too. :)</p>

<p>The number of high scorers has increased, but there aren’t that many scoring in the 1500-1600 range (<a href="http://talk.collegeconfidential.com/college-admissions/413821-sat-score-frequencies-freshman-class-sizes.html">http://talk.collegeconfidential.com/college-admissions/413821-sat-score-frequencies-freshman-class-sizes.html</a>), which means that schools with 1966 scores in the upper level had to fight an ever-widening field of competition to retain their caliber of student.</p>

<p>Carleton, for instance, was able to remain about the same (1390 to 1386), but it didn’t see the same boost as Grinnell, for example, which went up to the same score (1384), because Carleton has been fighting for the top cross-admits with Pomona, Middlebury, Haverford, etc. (WAS are missing from the list.) The same situation can be seen with Kenyon (+6) and Macalester (+66).</p>

<p>On the other hand, consider Smith, which dropped from 1410 to 1249, a 161-point difference. I am assuming Smith, as a Seven Sisters school, probably lost some of its high-scoring admits to the gradual integration of top schools.</p>

<p>UW-Seattle, which is still a research powerhouse, dropped 110 points. I would assume, yet again, because the primary applicant pool (above average students in the Northwest) broadened their horizons with the decline of regionalism.</p>

<p>Alternatively, some schools embraced holistic review and don’t appear to care about the difference. Stanford, for example, dropped 63 points.</p>

<p>collegehelp, would you please post the URL of the tool you used to access/sort the IPEDS data? Sorry if all of you know this already.</p>

<p>Thank you!</p>

<p>The 2013 EA season for Northeastern continued the trend. My own son, with 2300+ SATs, is strongly considering the school due to excellent merit aid, the very strong co-op program, and lovely city campus. And we keep hearing about the masses of great kids with high stats who were deferred. I would look for Northeastern to continue rising rapidly in the rankings.</p>

<p>RainCityMom-</p>

<p>[IPEDS</a> Data Center](<a href=“http://nces.ed.gov/ipeds/datacenter/SelectVariables.aspx?stepId=1]IPEDS”>http://nces.ed.gov/ipeds/datacenter/SelectVariables.aspx?stepId=1)</p>

<p>The tool is rather difficult to use. I selected institutions by variable. Under Institutional Characteristics, Directory Information, I used Carnegie Classification Basic to select various categories of Liberal Arts Colleges, Masters Universities, and Research Universities. Under Admissions and Test Scores, I selected the SAT scores that I wanted and the percent who took the SAT. Then I selected schools above a certain minimum SAT Math 25th percentile, as I recall. Then I added the public/private variable Control of Institution under the Directory Information heading. Finally, I downloaded a comma-delimited file and imported it into an Excel spreadsheet to sort.</p>
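<p>If Excel is a pain, the filter-and-sort step can also be done in a few lines of pandas. A sketch with a toy stand-in for the download (the real IPEDS export headers differ, so treat these column names as placeholders):</p>

```python
import pandas as pd

# Toy stand-in for an IPEDS comma-delimited export; real headers differ.
df = pd.DataFrame({
    "institution": ["A College", "B University", "C College"],
    "sat_math_25": [620, 580, 700],
    "control": ["Private", "Public", "Private"],
})

# Keep schools above a minimum SAT Math 25th percentile, then sort descending.
selective = df[df["sat_math_25"] >= 600].sort_values("sat_math_25", ascending=False)
print(selective["institution"].tolist())  # ['C College', 'A College']
```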

<p>Good luck.</p>

<p>One could have just looked up an SAT-based ranking here:
College Rankings - Top 500 Ranked Colleges - Highest SAT 75th Percentile Scores - StateUniversity.com.
This is sorted on 75th percentile M+CR, which gives a somewhat different result than sorting on 25th percentile M alone. In the OP’s list, ties are sorted alphabetically by school name; in the stateuniversity list, somehow there are no ties in the rank.</p>


<p>The thread title is “Selectivity of schools based on 2012-13 standardized test scores”. Of course a different ranking would result if we measured selectivity on some other basis, or even if we measured it based on a different way of treating the scores. The OP clearly indicated his basis for ranking by selectivity. He clearly indicated his information source. He’s not responsible for fixing individual discrepancies between the DOE data and some other source. He’s certainly not responsible for fixing the influence of family income on test scores. </p>

<p>Even though average SAT scores count for only ~8% of the US News ranking, they generally are good predictors of the overall ranking. Colleges with the highest average test scores also tend to have the biggest endowments per student, smaller average class sizes, higher faculty salaries, better 4 and 6 year graduation rates, etc. etc. It may indeed be the case that in admissions, these colleges are cherry-picking students “born on third base”. That wouldn’t necessarily mean they aren’t cherry-picking the best of everything else and as a result delivering a higher quality undergraduate education.</p>

<p>But if you think you’ll get a better education at a college with, say, greater student diversity or higher levels of research spending, then don’t use test scores as a proxy for college quality. Pick a different metric.</p>