Author: Michigan Mathematics and Science Leadership Network (MMSLN)

The Michigan Mathematics and Science Leadership Network provides leadership, curriculum support, professional development, and student services to educators in local school districts and works to foster community involvement in the areas of mathematics and science. The Mathematics and Science Leadership Network supports the implementation of high-quality mathematics and science education for the students of Michigan.

Federal funds to support high-quality science education for educators as well as students

The CS3 ESSA Title II and IV Toolkit on the Council of State Science Supervisors website points to actions state and local leaders can take to propose uses of the funding to support high-quality science education for educators as well as students.

From the website:

Resources for Work with Federal Title Programs

CS3 ESSA Title II and IV Toolkit – Apr 2019 (PDF)

CS3 ESSA Title II and IV Toolkit – Apr 2019 (Word)

This document was developed to help science leaders understand and access federal funds under two US Department of Education programs. Title II is typically used for professional development and Title IV is for Student Support and Academic Enrichment – both pools being highly useful for improving science education opportunities for teachers, principals and students.

This particular tool provides information for state science supervisors as well as local science education leaders such as district supervisors, lead teachers and school leaders. Planning and working together at the state and local levels may maximize the impact of these funds.

Please make this document available to science education leaders in your state.

NSTA Webinar Archive – Grant Funds in ESSA for Science and STEM, April 3, 2019

In this webinar, Jodi Peterson (NSTA) and Larry Plank (NSELA Past President) share details about the funding available from the Every Student Succeeds Act (ESSA) for STEM, what district leaders and classroom teachers can do to access these funds, and some emerging best practices in STEM (and computer science) that all schools should consider.

Global Climate Change and the Science Denial Presentations by Joe Levine

At NSTA, I attended two very thought provoking presentations by Joseph Levine, Global Climate Change and the Science Denial. The presentations, links to videos, and additional resources are contained in this post, as well as a description of how they may be used.

James

________

You are welcome (and, in fact, encouraged!) to use these files for educational presentations, using your personal/professional judgment. Please note that although most images here are open-access, some are from sources that allow free educational use ONLY … so please don’t sell any of these materials or post them on social media. Please reference and credit either Dr. Camille Parmesan (for some slides in the climate change presentation) or Dr. Naomi Oreskes (for slides taken during her AAAS keynote).

Keynote and PowerPoint versions of both presentations are available for download from this folder:

https://www.dropbox.com/sh/juqvyoxltns92iz/AACa4NfyHFGMdTJ6cOdvlY-Ea?dl=0

Now and then, people have problems with my presentation files, because I have the most up-to-date versions of Keynote and PowerPoint. So … if you have older versions, they may not be able to read the files. (Don’t you love it when software developers do that?) If this happens to you, and you can’t get the files to work, please let me know, and tell me which version of Keynote or PowerPoint you have.

Note that both presentations include numerous slides I skipped to keep as close to the scheduled time as possible. Those skipped slides won’t show up if you "play" the presentation, but if you display it in “work” mode, skipped slides show up as thin lines which will display if you select them.

I have removed the videos from the presentations, but on the slides that carried them I have left placeholder slides that say "Insert movie "XXX" from "movies" dropbox folder" … or something close to that.

Here is the link to that "movies" folder – which contains all the videos I used and many more that are just as interesting and useful:

https://www.dropbox.com/sh/0879z79txuvd8tl/AACVNIdlHEPNGR5687kdiBTFa?dl=0

With this link, you should be able to download any files you want/need from that folder. (Note: there are MANY files in there; feel free to peruse, download, and use any of them, with the caveats about use as above.)

A couple of other great websites packed with resources:

Laboratory for Anthropogenic Landscape Ecology

Goddard Space Flight Center Scientific Visualization Service

Feel free to contact me with any questions.

As always, it was a GREAT pleasure to work with you at NSTA St Louis.

We all have our work cut out for us on this important subject.

all best — Joe

Using PSAT 8 as the Test for Grade 8 Students: An Ill-conceived, Dangerous Idea

Recently, I have seen administrators across Michigan flooding Twitter with celebratory tweets about the statewide switch to the PSAT 8 as the high-stakes accountability test for 8th grade students and teachers. Most of these tweets sound very similar: The PSAT 8 is the test we need. It provides schools, parents, and teachers with the data they need to improve. Putting aside the eerie similarity of the wording, let’s be clear about something: they are all very wrong. We need to be honest about this. We need to have an honest conversation as a state about how these ideas are misguided at best, dangerous at worst.

To begin that honest conversation, I’ll start with my contention that the PSAT 8, in fact, does none of the things these administrators and organizations claim it does. I believe this is true for at least four reasons:

1. The PSAT 8 has a fundamental purpose. It is meant to separate students.

The SAT Suite of Assessments (designed by the College Board) is designed to track one thing, and one thing only: college readiness. These tests are meant to separate students into two groups: college ready and . . . not. Here is where we hit our first snag. The policy of using the PSAT 8 for accountability purposes is in direct conflict with the purpose of the assessment. The PSAT 8 is designed to be predictive of student performance on the SAT, which is designed to give an indication of a given student’s likelihood of succeeding in entry-level college coursework. Success is, in this case, defined as the probability of receiving a B in a credit-bearing course during freshman year.

There are a number of issues at play here. First, grade 8 educators have a responsibility to prepare students well in mathematics as defined by the 8th grade mathematics content standards in Michigan. That, in no uncertain terms, is their mission. Implementing the PSAT 8 will drive schools to prepare students to perform on the PSAT 8, not on the end-of-year learning targets laid out in the standards adopted by our state Board of Education. Second, it is always dangerous to use an assessment for a purpose other than its designed and intended purpose. This assessment is meant to measure college readiness as defined by the College Board, not students’ thinking and understanding of given mathematical content. Third, this separation and labeling of students does nothing to help those who are deemed “not college ready” by this assessment. Indeed, it likely dooms them to remedial tracks throughout high school, resulting in lowered expectations for them across the board. And this is the real danger. The assessment contributes to the gaps in student performance and learning that we currently see.

2. The PSAT 8 is a norm-referenced assessment.

This concern is a fundamental one in many ways. Our education system is based, currently, on standards or criteria to be met by the end of each successive school year. In such a system, assessment is best designed to determine student understanding in those areas and students’ abilities to meet the criteria laid out for them. The PSAT 8 is not designed in this way. This assessment provides an overall score in mathematics as well as several sub-section scores and oftentimes gives students information about the percentiles in which they reside. This means that the data are compared across the national cohort of students taking the test at that time. And scores are assigned relevance based on the performance of other students, not on any static criteria.

Using normative data for high-stakes decisions within a criterion-referenced system is nonsensical. Educators’ time is better spent considering how students’ understandings match the criteria we wish them to meet than considering how students compare to the average performance or to other students in their peer group. Certainly, there are “alignment documents” that give indications of which of the standards fit into the buckets of content of the assessment—but that is not the same as using an assessment designed to assess the criteria themselves.
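The contrast can be made concrete with a small sketch. All of the scores, the cutoff, and the cohorts below are hypothetical, chosen only to show that a criterion-referenced verdict depends on a fixed standard, while a percentile depends entirely on who else happened to take the test:

```python
# Hypothetical illustration (not real PSAT data): the same raw score
# earns a different percentile depending on the cohort, while a
# criterion-referenced judgment depends only on a fixed cutoff.

def percentile_rank(score, cohort):
    """Percent of the cohort scoring below the given score."""
    below = sum(1 for s in cohort if s < score)
    return 100 * below / len(cohort)

CUTOFF = 70  # hypothetical fixed criterion: "met the standard"

student_score = 72
cohort_a = [55, 60, 65, 68, 72, 80, 90]   # weaker cohort
cohort_b = [70, 75, 78, 82, 85, 88, 95]   # stronger cohort

# Criterion-referenced: the verdict never changes.
print(student_score >= CUTOFF)                   # True against either cohort

# Norm-referenced: the very same score lands at a high percentile in
# one cohort and a low percentile in the other.
print(percentile_rank(student_score, cohort_a))
print(percentile_rank(student_score, cohort_b))
```

The student's standing against the criterion is identical in both runs; only the norm-referenced interpretation moves.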

3. The PSAT 8 provides data that is highly uninformative about classroom instruction.

One of the most detailed reports the PSAT 8 portal will provide is the Question Analysis Report. Educators love this report. They. Love. It. And for all the wrong reasons. I get it. The report gives you the performance of your students, the state’s students, and the nation’s students on a given assessment item. It shows you the percentages of students who chose each distractor. It even gives you the item to look at! The report feels like a gold mine. But ultimately, it’s fool’s gold. Here’s why.

As educators pore over this report they feel as though they are getting very accurate data about student performance. But these item-based analyses are dangerous because they easily lead to solutions designed to fix problems that may not exist. Let me illustrate. I sat with two district administrators and looked over a set of PSAT 8 data. We spent almost two hours looking at students’ performance on questions limited to the Heart of Algebra strand on the PSAT. After all of this work and my attempts to get them to see connections among the items and to come up with alternative explanations for students’ choices, the big takeaway for them was that they needed to work on systems of linear equations more.

There are two concerns that surface in the example above. First is the lack of information about students’ thinking. We certainly know which questions many students got wrong and we even know which distractors students chose most often. But any attempt to figure out why students chose those distractors is stymied by our lack of information. While it is tempting to say that the distractors were designed to take advantage of common misconceptions, that explanation is ultimately self-defeating. There are a number of potential reasons for a student to choose a given distractor, only one of which is that their misconception matches the one intended by the item writers. Without looking at student work in detail or talking to the students themselves, educators can never be certain they have even a vague idea of students’ issues. Second, when summarizing the efforts at the end of a meeting, educators are (understandably) drawn to particular examples of problems that were a struggle for students (like the systems problems in our example). But these particulars are, in all likelihood, small percentages of the kinds of problems students will likely see on a given form of the PSAT. To be clear, in the example, there were at most four items that were related to students’ understandings of systems of linear equations, a small portion of the overall assessment. This kind of item-based decision-making is dangerous and will oftentimes lead to solutions that miss the mark.

4. The PSAT 8 assesses content that grade 8 teachers and students are not responsible for.

There are two concerns associated with this issue: fairness and accountability. I would argue that it is inherently unfair to give an assessment to students which contains content that they have not learned yet, regardless of whether or not success on that content would only increase their score above the proficient mark. The PSAT 8—indeed any high-stakes assessment of this nature—contains items on which it is expected that students will not be able to perform. After all, the test has to separate students somehow, right? Here is the heart of the unfairness. The PSAT 8 contains content that is not present in the grade 8 standards because it is meant to predict performance on the SAT, not assess 8th grade content.

As for accountability, if I were an 8th grade teacher, I would be absolutely irate that I was being measured for accountability and evaluation by a tool that contained content for which I was not responsible. And because educators take their evaluations seriously, many will feel as though they have no choice but to teach the more advanced content to everyone in the hopes that it will improve their evaluation results. Systemically, this might lead to a practice of requiring all 8th grade students to take Algebra 1, a practice that we learned from California is detrimental to students. And maybe that is the most insidious problem of all: with the best of intentions, these decisions are made based on the needs of adults and not on the needs of students. Accountability. School data. Allow schools to improve. Very few of these sentiments mention students specifically. And that’s because the assessment is not designed to be friendly to students—it is designed to separate and label them.

To close out this discussion and, perhaps, make it more productive, I ask the following questions:

  1. What kinds of data do teachers actually need to improve their classroom practice on a daily basis?
  2. What kinds of data can schools collect easily that give insight into the effectiveness of their curriculum?
  3. What alternative types of assessments might allow us to get at what students know and can do?
  4. How can we have an honest conversation about how we might effectively use PSAT 8 data to help schools and teachers?
    1. What can the PSAT do for us, based on its intended purpose?
    2. What can’t it do for us, based on its intended purpose?
Article written by Jason Gauthier, Math Consultant, Allegan Area ESA

ACESSE Vision survey – respondents needed

Science Community Leaders,

Please help our research partners at University of Colorado at Boulder finalize the creation of a science vision survey tool for states to use. It should take about 18 minutes of your time. Please share broadly with state and district leaders, folks in informal ed, etc. The link is here: https://cuboulder.qualtrics.com/jfe/form/SV_cFOn6spXQQ8fNVH

Thanks for your help!

Megan Schrauben, Executive Director, MiSTEM Network

UM Public Lecture: The Unlikely Friendship of Math and Science

Ben Orlin

Abstract: On the one hand, there’s science: the clear-eyed, hard-nosed, pragmatic empiricist. On the other hand, there’s math: the poet, the dreamer, the hunter of wild abstractions. How do these two intellectual traditions regard one another? And why is it that the most useless-sounding math – from knot theory to meta-logic to non-Euclidean geometry – often turns out to be the most useful?

Prerequisites: basic human curiosity; tolerance for bad drawings; the willingness to participate in a silly debate. In short: all are welcome!

https://events.umich.edu/event/62432

WED, APR 3, 2019, 5:30pm – 6:30pm
East Hall – 1324

Math and Science education leaders gather to Advance Equity in STEM

On Thursday, March 14, Math and Science education leaders from across the state gathered in Lansing for Part 2 of Michigan Advancing Equity in STEM (MAE-STEM) hosted by the Michigan Math and Science Leadership Network (MMSLN).

We took steps to increase our Equity Literacy and build our collective stamina and skill to become a threat to inequity in our spheres of influence across the state. We commit to Recognizing, Responding to, and Redressing inequities and Sustaining equity efforts in our regions. We will work to ensure children thrive, versus survive, in our educational systems.

Follow our collective learning on Twitter at #MiSTEMEquity.

🌱When you plant lettuce, if it does not grow well, you don’t blame the lettuce. You look for reasons it is not doing well. It may need fertilizer, or more water, or less sun. You never blame the lettuce. ~Thich Nhat Hanh

Expect to hear more from us in April after Part 3.

This article was submitted by Danielle Seabold, Mathematics Education Consultant and Instructional Coach, Bold Educational Consulting LLC.


Highly Qualified Teacher Requirement Removed, Learn More

The Office of Educator Excellence within the Michigan Department of Education will be hosting a webinar on Friday, March 22, 2019 as part of the Educator Workforce Webinar Series: Preparing, Placing, Developing, and Retaining Educators. This webinar will focus on educator placement and staffing.

In Michigan, the requirements for Highly Qualified Teachers have been removed; join this webinar to review MDE placement requirements and best practices, and hear about staffing solutions for innovative programs. Updated placement guidance will be shared. Be sure to register for your opportunity to ask questions and learn solutions to support innovative courses and increase student opportunities.

We look forward to your participation on Friday, March 22, 2019 at 9:00 AM.

Victor Bugni

Appropriate Placement Consultant

Office of Educator Excellence

Michigan Department of Education

Office: 517-335-0589

Share Your Experience!

https://s.zoomerang.com/r/OEE_Service

Celebrate March is Reading Month by learning more: Read by Grade Three Parent Awareness Toolkit.

Not all models are the same, so your approach to Developing and Using Models shouldn’t be either.

Models should include the interactions between variables that explain a phenomenon. For example, consider modeling the health of an individual with diabetes, which is affected by diet. On paper (or a whiteboard), this relationship would likely take the form of an arrow connecting diet to health. That arrow does not show that the health of someone with diabetes can be impacted positively by good nutrition and negatively by poor nutrition. The same is true for many of the other variables involved in this model: environment, exercise, and so on. Another limitation of the paper model is that it doesn’t allow students to test their model.

Using a dynamic modeling tool, such as SageModeler, students can overcome the problems associated with paper models. SageModeler is a free, web-based system dynamics modeling tool that lets secondary school students construct dynamic models and validate them by comparing outputs from their own models with data from one or more other sources, including experimental data from probes or data generated by simulations. The tool engages learners in three-dimensional learning by using crosscutting concepts (systems and system models, cause and effect, and energy and matter) with various scientific practices (particularly modeling, but also analyzing and interpreting data and engaging in argument from evidence), integrated with disciplinary core ideas.

To facilitate model validation, SageModeler and all external data sources are embedded in a Common Online Data Analysis Platform (CODAP). CODAP is an intuitive graphing and data analysis platform that takes the outputs from the system dynamics models, as well as any other validating data source, and blends them into a single analytic environment.
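To make the contrast with a static paper model concrete, here is a minimal sketch of the underlying idea in Python. This is not SageModeler’s actual interface; the variable names, scales, and rates are hypothetical, chosen only to show how a dynamic model lets the diet-health relationship play out over time and be tested:

```python
# A minimal sketch of a dynamic model: instead of a static arrow from
# "diet" to "health", health is updated step by step, improving with
# good nutrition and declining with poor nutrition. All names, scales,
# and rates here are hypothetical illustrations.

def simulate_health(initial_health, nutrition, steps=10, rate=0.1):
    """Step a simple health model forward in time.

    nutrition is on a -1..1 scale (poor to good); each step nudges
    health up or down in proportion to nutrition quality, clamped to
    a 0-100 health scale.
    """
    health = initial_health
    history = [health]
    for _ in range(steps):
        health = max(0.0, min(100.0, health + rate * nutrition * health))
        history.append(health)
    return history

good = simulate_health(50.0, nutrition=0.8)   # health trends upward
poor = simulate_health(50.0, nutrition=-0.8)  # health trends downward
print(good[-1] > good[0])   # True
print(poor[-1] < poor[0])   # True
```

Unlike the arrow on paper, this model produces testable output: students can change a rate or an input, rerun it, and compare the trajectory against real data.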

Learn to use SageModeler so you can facilitate it with teachers in your region.

  • Learn to use the tool in-depth to work with teachers in your region. Better yet, bring a teacher with you. Then, try it in their classroom and learn together to become a more proficient user and facilitator of the tool.
  • Gain access to CREATE for STEM middle school and high school units that are designed for NGSS and use SageModeler. The tool can also be used to enhance most secondary units from your own curriculum.
  • Two-day workshop – May 1 & 2, plus a problems-of-practice follow-up video conference. Presented by Joe Krajcik (MSU) and Dan Damelin (Concord Consortium)

  • Supported by a grant from MSU CREATE for STEM and Concord Consortium – including hotel and travel. For up to 30 consultants or consultant-teacher teams.

  • Workshop location – AMR, 2123 University Park Drive, Suite 100, Okemos, MI 48864

  • Register today! – $45 for workshop meals.

Guidance for Science Teacher Evaluation

“[The] Science [MSTEP] in all administered grades does not have student growth data that is viable for use within educator evaluation because there are multiple year gaps between data points, and there is no student growth data available for individual teacher attribution.” – Brian Lloyd, Student Growth Consultant at MDE (memo, 9.11.2018).

How will science teachers meet the requirements of the law?
In Michigan, the law states that 60% of a teacher’s evaluation comes from a district-selected teacher evaluation instrument, and 40% of a teacher’s evaluation comes from student growth data. Jessica Ashley, Oakland Schools Science Consultant, developed the attached document to offer guidance for teachers and administrators to help facilitate conversations regarding student growth measures. This guidance includes different options to make up the 40%, approaches for measuring student growth, common assessment instrument options, collecting your students’ data, Illuminate, and other important science assessment information.
Science Teachers–Guidance for Your Evaluation.pdf
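The statutory split described above amounts to a simple weighted average. Here is a minimal sketch; the 0-100 scale and the sample scores are hypothetical, since districts define their own instruments and growth measures:

```python
# Hypothetical illustration of the statutory 60/40 weighting:
# 60% from the district-selected evaluation instrument,
# 40% from student growth data. A 0-100 scale is assumed.

def evaluation_score(instrument_score, growth_score):
    """Weighted combination: 60% instrument, 40% student growth."""
    return 0.6 * instrument_score + 0.4 * growth_score

# Example: a weak growth measure can only partly be offset by strong
# instrument scores, and vice versa.
print(evaluation_score(90, 50))  # 74.0
print(evaluation_score(50, 90))  # 66.0
```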

2019 Modeling Workshops

Four summer Modeling Instruction in Science workshops have been fully funded. Two of them are very new and exciting workshops we are offering in 2019: Biology 2 and Modeling with MiSTAR!

Please share this information (and the attached flyer) with all of the science teachers in your region.

1. Biology 2. Four Biology Modeling Facilitator/Teachers (3 in Oakland: Fawn P., Allison M., and Andrea B.) and AMTA are working on the Biology Modeling Framework to bring it across the finish line. This is the second version we started using, out of Ohio State. They are not only finishing out all units but building in a much better concept flow and storyline. This workshop will help past biology modelers understand the new flow and experience the remaining units that were not provided in the Biology 1 workshop. There will also be a design challenge incorporated to deploy a student-developed model and better support the NGSS-based MMS.

2. Modeling with MiSTAR. Three Middle School Modeling Facilitator/Teachers (2 in Oakland: Andrea W. and Nell B., and Scott S. in Macomb County), AMTA, and the MiSTAR team have collaborated on designing this workshop, which will infuse Modeling Instruction pedagogy into two sixth grade and two seventh grade units. The workshop preparation involves some unit revisions so that model building (as employed by Modeling Instruction) can occur.

I’m sending out our annual flyer describing the Modeling in Michigan workshops we have scheduled. It’s coming out a bit later than usual because we’ve been deliberating on our dates, in part due to the impact of snow days. But since we shouldn’t wait too much longer I’m sending out a flyer with June dates that may end up getting shifted a week later. If anyone needs an overview of what our program is about I’ve also attached our Modeling One Pager.

Also, we did not win a STEM Advisory Council grant, so at this time only 4 workshops have guaranteed funding. We are pursuing other funds, and if we are successful I will update our flyer and let you all know. We decided to set up the application links anyway to allow people to at least get their names on our list. We’ll keep them informed on where we stand in the coming weeks. As always, check the website for additional information and to apply: Modeling Instruction in Michigan Workshops

We rely on you to get this flyer and our news out to all your science teachers.

Appreciated!

Mike, Jessica, James and the Modeling in Michigan programeers

Mike Gallagher

Oakland MiSTEM Region Director

Mike.Gallagher

248.209.2234

Summer2019Workshops.pdf

Modeling-One Pager-2019.pdf