What This Test Measures
The STEM Thinking Skills Assessment is the single most important component of AOS and AET admissions. It's also the most misunderstood.
Parents tend to assume this is a science and math content test — something like the Virginia SOL exams but harder. It's not. The test is developed by Insight Assessment, and it's designed to measure how your student thinks, not what they've memorized.
Think of it this way: a typical school test asks, "Do you know the Pythagorean theorem?" The STEM assessment asks, "Can you figure out a problem you've never seen before using logic, spatial reasoning, and algebraic thinking — without a calculator, under time pressure?"
The test has 33 questions and a 50-minute time limit. That works out to roughly 90 seconds per question. Some questions take 30 seconds; others take three minutes. Students who can't manage their time will run out before finishing.
The core skill being tested: Can your student reason through an unfamiliar problem using logic and quantitative thinking? Not speed. Not memorization. Not "tricky" math. Clear, flexible thinking under mild time pressure.
The Five Domains
The STEM Thinking Skills Assessment covers five distinct areas. Each domain tests a different kind of thinking, and a strong overall score requires competence across all of them.
Overall Critical Reasoning
This is the broadest domain. It tests whether your student can evaluate an argument, identify assumptions, draw valid conclusions from given information, and spot logical flaws. These questions often present a short passage or scenario and ask the student to determine what follows logically — or what doesn't.
Students who read carefully and think before answering tend to do well here. Students who rush or rely on gut feelings often get tripped up by answer choices that seem right but don't hold up under scrutiny.
Out-of-the-Box Algebra
Don't let the word "algebra" mislead you. This isn't about solving 3x + 7 = 22. These questions test whether a student can see algebraic relationships in unfamiliar contexts — patterns in sequences, relationships between variables, unknown quantities that can be deduced from constraints.
A student who's strong at school algebra but has never done competition-style or puzzle-based math may find this domain surprisingly difficult. It rewards creative problem-solving, not procedural fluency. The math itself isn't advanced, but the presentation is intentionally non-standard.
Spatial-Relational Thinking
This domain tests the ability to visualize and manipulate objects mentally — rotating shapes, folding and unfolding patterns, understanding three-dimensional relationships from two-dimensional representations. For many students, this is either a natural strength or a significant weakness.
Spatial reasoning is trainable, but it takes specific practice. Working through mental rotation exercises, origami-based puzzles, cross-section problems, and visualization tasks builds exactly the skills this domain tests. This isn't something school typically develops, which is why it catches many otherwise well-prepared students off guard.
Tech Logic
Tech logic questions involve algorithmic thinking, conditional reasoning ("if this, then that"), flowcharts, and systematic problem-solving. These are the kinds of questions that feel natural to students who've done any programming, but they don't require actual coding knowledge.
The test might present a set of rules and ask what happens when you follow them in a specific order, or ask the student to trace through a logical sequence to find an outcome. Students who enjoy puzzles, logic games, or coding challenges will recognize the thinking style, even if the specific format is new.
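For the programming-inclined, the flavor of these questions can be sketched in a few lines of code. The example below is hypothetical, not taken from the actual test, and no coding is required on test day; it just shows what "follow the rules in order" reasoning looks like:

```python
# Hypothetical rule-tracing question (invented for illustration --
# the real test requires no programming at all):
#
#   Start with the number 5, then apply these rules in order:
#     Rule 1: if the number is odd, multiply it by 3.
#     Rule 2: if the number is greater than 10, subtract 4.
#     Rule 3: if the number is even, divide it by 2.
#   What number do you end with?

def trace(n):
    if n % 2 == 1:   # Rule 1: odd -> triple it
        n = n * 3
    if n > 10:       # Rule 2: greater than 10 -> subtract 4
        n = n - 4
    if n % 2 == 0:   # Rule 3: even -> halve it
        n = n // 2
    return n

print(trace(5))  # 5 -> 15 -> 11, and rule 3 never fires: answer is 11
```

Notice the catch: the answer depends on applying the rules in the listed order. After rule 2 the number is odd again, so rule 3 never applies. That kind of order-sensitivity is exactly what these questions probe.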
Scientific Thinking
This domain doesn't test scientific facts — it tests whether a student can think like a scientist. That means interpreting data from charts and graphs, evaluating experimental designs, identifying variables, and drawing evidence-based conclusions.
A question might present two graphs and ask what conclusion is supported by both data sets, or describe an experiment and ask what's wrong with the methodology. The student needs to think critically about evidence, not recall information from their science textbook.
Scoring (260-300)
The STEM Thinking Skills Assessment is scored on a scale of 260 to 300. This is not a percentile — it's a scaled score derived from the number of correct answers and the statistical properties of each question.
Common error: Many parent forums and prep resources incorrectly state the scale is 250-300. The bottom of the scale is 260, not 250.
LCPS does not publish a minimum score for admission. There's no bright line where everyone above X gets in and everyone below X doesn't. The STEM score is combined with the Writing Assessment score (0-10 scale) and the student's academic record to produce a composite score. Decisions are based on the composite, not on any single component.
That said, the STEM score is widely understood to be the most heavily weighted of the three factors. A student with a very strong STEM score has a significant advantage — though a weak writing score or weak grades can still drag the composite down.
What do the numbers mean?
Without LCPS publishing detailed score distributions, it's hard to say exactly what a "good" score is. Here's what we can say based on the Insight Assessment framework and the 260-300 scale:
- Scores near 260 indicate performance at or near the bottom of the measured range. The student may have strong skills in one or two domains but struggled significantly in others.
- Mid-range scores (around 275-280) suggest solid reasoning ability across most domains. Whether this is competitive for admission depends on the writing score, grades, and the overall applicant pool that year.
- Scores approaching 290-300 indicate very strong critical thinking and reasoning across all five domains. These are the scores that tend to anchor competitive applications.
The score your student needs depends on the total package. A 285 STEM score with strong writing and grades is a very different application from a 285 with a 4/10 writing score. The composite matters.
Test Day Logistics
Knowing what to expect on test day reduces anxiety and lets your student focus on the questions. Here's what happens.
| Detail | What You Need to Know |
|---|---|
| Location | The test is administered at your student's own middle school during the school day. Your student doesn't need to travel to a testing center. |
| Format | Taken on LCPS-provided laptops using a secure online platform. This is not a paper test. Students cannot bring their own devices. |
| Duration | 50 minutes for 33 questions. No scheduled breaks. The clock runs continuously once the test begins. |
| Calculator | Not allowed. All computation must be done mentally or on the scratch paper provided. This is by design — the test measures reasoning, not calculation. |
| Timing | Typically administered in October or November during the fall admissions cycle. The exact date is communicated by the school. |
| Retakes | Students take the STEM assessment once per admissions cycle. There are no retake opportunities within the same cycle. |
On the same day or separate days: The STEM Thinking Skills Assessment and the Writing Assessment may be administered on the same day or on different days, depending on how your school schedules them. Make sure your student knows whether they're sitting for one test or two.
How to Prepare
This is the section most parents want. Here's what actually works.
Spatial reasoning practice
This is the domain where targeted practice makes the biggest difference, because most students have the least experience with it. Work on mental rotation exercises, paper folding problems, cube net identification, and 3D-to-2D visualization tasks. There are workbooks and apps specifically for spatial reasoning — use them. Even 15 minutes a day of spatial practice over two to three months builds real capability.
Mental math fluency
Since calculators aren't allowed, your student needs to be comfortable with arithmetic, fractions, percentages, and basic algebraic manipulation without a device. This doesn't mean they need to be a human calculator — they need to be comfortable enough that computation doesn't eat up their reasoning time. Practice mental math daily: estimate, simplify, approximate. Speed matters less than confidence.
Logic puzzles and constraint problems
Logic grid puzzles, Sudoku variants, KenKen, and "if A then B" style deductions all build the kind of systematic reasoning the test rewards. The goal is for your student to get comfortable with problems where they have to chain multiple logical steps together, not just apply a single formula.
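To see what "chaining multiple logical steps" means concretely, here's a hypothetical three-clue puzzle worked by brute force in Python. The puzzle and names are invented for illustration; your student would solve it by deduction on paper, not by code:

```python
from itertools import permutations

# Hypothetical puzzle (invented for illustration):
# Ava, Ben, and Cara each own exactly one pet: a cat, a dog, or a fish.
#   Clue 1: Ava does not own the cat.
#   Clue 2: Ben does not own the dog.
#   Clue 3: Cara owns the fish.

people = ["Ava", "Ben", "Cara"]
solutions = []

# Try all six ways to hand out the pets and keep only the
# assignments that satisfy every clue.
for pets in permutations(["cat", "dog", "fish"]):
    owner = dict(zip(people, pets))
    if owner["Ava"] != "cat" and owner["Ben"] != "dog" and owner["Cara"] == "fish":
        solutions.append(owner)

print(solutions)  # [{'Ava': 'dog', 'Ben': 'cat', 'Cara': 'fish'}]
```

By hand, the chain is: clue 3 fixes Cara's pet, clue 1 then forces Ava to the dog, and Ben is left with the cat. Exactly one of the six possible assignments survives all three clues, and stringing those deductions together is the skill the test rewards.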
Timed practice
The 50-minute time limit is real. Students who've never practiced under time pressure often underperform because they spend too long on hard questions and don't reach the easier ones at the end. Practice with a timer. Get used to the rhythm of "spend 90 seconds, make your best call, move on." Coming back to a tough question is always better than stalling on it.
Data interpretation
Practice reading charts, graphs, and data tables. Given a bar chart with three variables, can your student identify trends, outliers, and relationships? Given an experiment described in text, can they evaluate whether the conclusion follows from the data? This skill transfers directly to the Scientific Thinking domain.
Non-standard algebra
Work through math competition problems at the AMC 8 level. Not because the STEM test is a math competition — it's not — but because competition problems train the "seeing the trick" skill that the Out-of-the-Box Algebra domain rewards. Pattern recognition, working backwards, and testing cases are all useful techniques.
Start with diagnostics
Before diving into preparation, figure out where your student is strong and where they're weak. Spatial reasoning? Logic? Data interpretation? A diagnostic approach lets you focus preparation time where it actually matters instead of covering everything equally.
Practice the format, not just the content
The STEM test is computer-based, timed, and multiple-choice. If your student has only ever practiced on paper with unlimited time, they'll face two adjustment challenges at once on test day. Practice under realistic conditions: on a screen, with a timer, without a calculator.
Build stamina
Fifty minutes of concentrated reasoning is mentally exhausting for a 13-year-old. Work up to full-length practice sessions gradually. A student who's done twenty 50-minute sessions will perform better than one who's only practiced in 15-minute chunks.
What Doesn't Help
Just as important as knowing what to do is knowing what to avoid. Some common preparation approaches are essentially wasted effort for this particular test.
These preparation strategies won't move the needle:
- Memorizing science facts, vocabulary, or formulas
- Doing more standard school algebra homework
- Generic standardized test prep (SAT/ACT-style)
- Studying from textbooks or class notes
- Flashcard-based memorization of any kind
- Reading science articles without analyzing the data
Why standard test prep doesn't transfer
Most test prep is designed for content-based exams. The SAT tests math that students have learned in school. The SOL tests Virginia curriculum standards. The STEM Thinking Skills Assessment tests none of that. It's a reasoning assessment, not a knowledge assessment.
A tutoring center that has your student drilling algebra equations is preparing them for a different test. Unless the practice specifically targets critical reasoning, spatial thinking, logic, and data interpretation — under timed conditions, without a calculator — it's not aligned to what the STEM assessment actually measures.
Why more school homework doesn't help
Being an A student in math is necessary but not sufficient. The STEM test doesn't ask, "Can you solve the type of problem your teacher showed you?" It asks, "Can you figure out a problem type you've never seen?" These are different skills. Your student can be perfect in school math and still score in the mid-range on the STEM assessment if they haven't developed flexible, non-routine reasoning.
Why memorization is counterproductive
There's nothing to memorize. The test is designed so that every question contains all the information needed to answer it; no formula you could study in advance will appear. Students who try to cram facts or formulas into memory are spending time on something that will have zero payoff on test day. That time would be better spent doing logic puzzles or spatial reasoning practice.
When to Start
This depends on your student, but here's a general framework.
The long game (starting in 6th or 7th grade)
If your student is a year or two away from applying, you have the luxury of building foundational skills without test pressure. Spatial reasoning games, logic puzzles at bedtime, math competitions like AMC 8, coding classes — all of these build the thinking muscles the STEM test measures. This isn't "test prep." It's developing the underlying capabilities.
Focused preparation (summer before 8th grade)
The summer before 8th grade is the most common starting point for targeted STEM test preparation. This gives your student 3-4 months of focused practice before the October/November testing window. Start with a diagnostic to identify weak domains, then build a practice schedule that addresses those gaps while maintaining strengths.
Last-minute (September of 8th grade)
If you're starting in September with testing in October, you have 4-6 weeks. That's enough time to familiarize your student with the test format, do timed practice, and shore up the most glaring weaknesses — but it's not enough time to build spatial reasoning skills from scratch. Focus on test strategy: pacing, skipping and returning to hard questions, eliminating obviously wrong answers, and managing test anxiety.
Honest assessment: Students who start preparing 6-12 months in advance tend to do meaningfully better than those who start 4-6 weeks before the test. The reasoning skills this test measures take time to develop. You can't cram your way to a strong STEM score the way you might cram for a vocabulary test.
Whatever your timeline, the key is specificity. Generic "enrichment" is fine for general development, but targeted practice — spatial reasoning, logic, data interpretation, mental math, timed conditions — is what moves STEM scores. Know what the test measures, practice those specific skills, and don't waste time on things that won't show up in the score.
Related reading: For the full picture of how the STEM score fits into the admissions decision, read our guide to what ACL actually looks for. To understand which program is the best fit for your student, see our AOS vs AET vs MATA comparison.