Bootstrap lessons align with several important teaching standards. Each standard below is grouped by framework and followed by the Bootstrap lessons that address it.

Common Core ELA Standards


Initiate and participate effectively in a range of collaborative discussions (one-on-one, in groups, and teacher-led) with diverse partners on grades 9-10 topics, texts, and issues, building on others' ideas and expressing their own clearly and persuasively. [See: Introduction to Computational Data Science.]

Common Core Math Standards


Construct and interpret scatter plots for bivariate measurement data to investigate patterns of association between two quantities. Describe patterns such as clustering, outliers, positive or negative association, linear association, and nonlinear association. [See: Data Displays and Lookups; Defining Functions; Scatter Plots; Correlations; Linear Regression.]


Know that straight lines are widely used to model relationships between two quantitative variables. For scatter plots that suggest a linear association, informally fit a straight line, and informally assess the model fit by judging the closeness of the data points to the line. [See: Scatter Plots; Correlations; Linear Regression.]


Use the equation of a linear model to solve problems in the context of bivariate measurement data, interpreting the slope and intercept. [See: Linear Regression.]
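As an illustrative sketch of this standard (in Python, not the programming environment used in the Bootstrap lessons themselves, and with hypothetical data), the code below fits a line y = mx + b to bivariate data by least squares, then uses the slope and intercept to make a prediction:

```python
# Illustrative sketch, not from the Bootstrap materials: fit a linear
# model y = m*x + b by least squares, then use it to solve a problem.
def linear_fit(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # slope = sum of (x-deviation * y-deviation) / sum of squared x-deviations
    m = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    b = mean_y - m * mean_x
    return m, b

# Hypothetical data: hours studied vs. test score.
hours = [1, 2, 3, 4, 5]
scores = [52, 55, 61, 64, 70]
m, b = linear_fit(hours, scores)
print(m, b)          # slope: points gained per extra hour; intercept: baseline score
print(m * 6 + b)     # predicted score for a student who studies 6 hours
```

Interpreting the output in context is the heart of the standard: the slope is the predicted change in score per additional hour of study, and the intercept is the model's predicted score at zero hours.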


Write a function that describes a relationship between two quantities. [See: Defining Functions.]


Use function notation, evaluate functions for inputs in their domains, and interpret statements that use function notation in terms of a context. [See: Applying Functions.]


Recognize the purposes of and differences among sample surveys, experiments, and observational studies; explain how randomization relates to each. [See: Randomness and Sample Size.]


Evaluate reports based on data. [See: Threats to Validity.]


Represent data with plots on the real number line (dot plots, histograms, and box plots). [See: Histograms; Visualizing the “Shape” of Data; Spread of a Data Set.]


Use statistics appropriate to the shape of the data distribution to compare center (median, mean) and spread (interquartile range, standard deviation) of two or more different data sets. [See: Histograms.]


Interpret differences in shape, center, and spread in the context of the data sets, accounting for possible effects of extreme data points (outliers). [See: Histograms; Visualizing the “Shape” of Data.]


Represent data on two quantitative variables on a scatter plot, and describe how the variables are related. [See: Scatter Plots; Correlations.]


Fit a function to the data; use functions fitted to data to solve problems in the context of the data. Use given functions or choose a function suggested by the context. Emphasize linear, quadratic, and exponential models. [See: Visualizing the “Shape” of Data.]


Fit a linear function for a scatter plot that suggests a linear association. [See: Linear Regression.]


Interpret the slope (rate of change) and the intercept (constant term) of a linear model in the context of the data. [See: Linear Regression.]


Compute (using technology) and interpret the correlation coefficient of a linear fit. [See: Correlations; Linear Regression.]
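The "using technology" portion of this standard can be sketched as follows (Python with hypothetical data; the Bootstrap lessons use their own tools). The code computes Pearson's correlation coefficient r from its definition:

```python
# Illustrative sketch: Pearson's correlation coefficient r for a linear
# fit, computed from its definition. Data values are hypothetical.
import math

def correlation(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # r = covariance term divided by the product of the deviation norms
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sy = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sx * sy)

xs = [1, 2, 3, 4, 5]
ys = [2, 4, 5, 4, 6]
r = correlation(xs, ys)
print(r)  # r near +1 suggests a strong positive linear association
```

Interpretation is the other half of the standard: r close to +1 or -1 indicates a strong linear association (positive or negative), while r near 0 indicates a weak one.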


Distinguish between correlation and causation. [See: Correlations; Linear Regression.]

CSTA Standards


Create clearly named variables that represent different data types and perform operations on their values. [See: Grouped Samples.]


Decompose problems and subproblems into parts to facilitate the design, implementation, and review of programs. [See: Defining Table Functions; Method Chaining.]


Create procedures with parameters to organize code and make it easier to reuse. [See: Defining Functions; Defining Table Functions.]


Systematically test and refine programs using a range of test cases. [See: Defining Functions; Defining Table Functions; Method Chaining; Checking Your Work.]


Document programs in order to make them easier to follow, test, and debug. [See: Defining Functions; Defining Table Functions; If-Expressions.]


Represent data using multiple encoding schemes. [See: Introduction to Computational Data Science; Starting to Program; Displaying Categorical Data.]


Collect data using computational tools and transform the data to make it more useful and reliable. [See: Displaying Categorical Data; Table Methods; If-Expressions; Randomness and Sample Size; Grouped Samples.]


Refine computational models based on the data they have generated. [See: Randomness and Sample Size; Grouped Samples; Correlations.]


Design and iteratively develop computational artifacts for practical intent, personal expression, or to address a societal issue by using events to initiate instructions. [See: Choosing Your Dataset; Ethics and Privacy.]


Decompose problems into smaller components through systematic analysis, using constructs such as procedures, modules, and/or objects. [See: Defining Table Functions; Method Chaining.]


Create artifacts by using procedures within a program, combinations of data and procedures, or independent but interrelated programs. [See: Defining Table Functions; Method Chaining.]


Document design decisions using text, graphics, presentations, and/or demonstrations in the development of complex programs. [See: Choosing Your Dataset.]


Create interactive data visualizations using software tools to help others better understand real-world phenomena. [See: Displaying Categorical Data; Data Displays and Lookups; Histograms; Visualizing the “Shape” of Data; Spread of a Data Set; Scatter Plots; Linear Regression.]


Construct solutions to problems using student-created components, such as procedures, modules, and/or objects. [See: Choosing Your Dataset; Histograms; Visualizing the “Shape” of Data.]


Develop and use a series of test cases to verify that a program performs according to its design specifications. [See: Checking Your Work.]


Use data analysis tools and techniques to identify patterns in data representing complex systems. [See: If-Expressions; Scatter Plots; Correlations; Linear Regression.]


Evaluate the ability of models and simulations to test and support the refinement of hypotheses. [See: Correlations; Threats to Validity.]

K-12CS Standards

6-8.Algorithms and Programming.Control

Programmers select and combine control structures, such as loops, event handlers, and conditionals, to create more complex program behavior. [See: Method Chaining.]

6-8.Algorithms and Programming.Modularity

Programs use procedures to organize code, hide implementation details, and make code easier to reuse. Procedures can be repurposed in new programs. Defining parameters for procedures can generalize behavior and increase reusability. [See: Defining Functions; Defining Table Functions.]

6-8.Algorithms and Programming.Variables

Programmers create variables to store data values of selected types. A meaningful identifier is assigned to each variable to access and perform operations on the value by name. Variables enable the flexibility to represent different situations, process different sets of data, and produce varying outputs. [See: Defining Functions.]

6-8.Computing Systems.Troubleshooting

Comprehensive troubleshooting requires knowledge of how computing devices and components work and interact. A systematic process will identify the source of a problem, whether within a device or in a larger system of connected devices. [See: Checking Your Work.]

6-8.Data and Analysis.Collection

People design algorithms and tools to automate the collection of data by computers. When data collection is automated, data is sampled and converted into a form that a computer can process. For example, data from an analog sensor must be converted into a digital form. The method used to automate data collection is influenced by the availability of tools and the intended use of the data. [See: Threats to Validity.]

6-8.Data and Analysis.Inference and Models

People transform, generalize, simplify, and present large data sets in different ways to influence how other people interpret and understand the underlying information. Examples include visualization, aggregation, rearrangement, and application of mathematical operations. [See: Data Displays and Lookups; If-Expressions; Measures of Center; Spread of a Data Set.]

6-8.Data and Analysis.Visualization and Transformation

Computer models can be used to simulate events, examine theories and inferences, or make predictions with either few or millions of data points. Computer models are abstractions that represent phenomena and use data and algorithms to emphasize key features and relationships within a system. As more data is automatically collected, models can be refined. [See: Scatter Plots; Correlations.]

9-12.Algorithms and Programming.Control

Programmers consider tradeoffs related to implementation, readability, and program performance when selecting and combining control structures. [See: Method Chaining; If-Expressions.]

9-12.Algorithms and Programming.Modularity

Complex programs are designed as systems of interacting modules, each with a specific role, coordinating for a common overall purpose. These modules can be procedures within a program; combinations of data and procedures; or independent, but interrelated, programs. Modules allow for better management of complex tasks. [See: Defining Functions; Defining Table Functions; Method Chaining.]

9-12.Computing Systems.Troubleshooting

Troubleshooting complex problems involves the use of multiple sources when researching, evaluating, and implementing potential solutions. Troubleshooting also relies on experience, such as when people recognize that a problem is similar to one they have seen before or adapt solutions that have worked in the past. [See: Checking Your Work.]

9-12.Data and Analysis.Collection

Data is constantly collected or generated through automated processes that are not always evident, raising privacy concerns. The different collection methods and tools that are used influence the amount and quality of the data that is observed and recorded. [See: Ethics and Privacy.]

9-12.Data and Analysis.Inference and Models

The accuracy of predictions or inferences depends upon the limitations of the computer model and the data the model is built upon. The amount, quality, and diversity of data and the features chosen can affect the quality of a model and ability to understand a system. Predictions or inferences are tested to validate models. [See: Linear Regression; Threats to Validity.]

9-12.Data and Analysis.Visualization and Transformation

Data can be transformed to remove errors, highlight or expose relationships, and/or make it easier for computers to process. [See: Data Displays and Lookups; Visualizing the “Shape” of Data; Spread of a Data Set; Scatter Plots.]

9-12.Impacts of Computing.Culture

The design and use of computing technologies and artifacts can improve, worsen, or maintain inequitable access to information and opportunities. [See: Ethics and Privacy.]

9-12.Impacts of Computing.Safety, Law, and Ethics

Laws govern many aspects of computing, such as privacy, data, property, information, and identity. These laws can have beneficial and harmful effects, such as expediting or delaying advancements in computing and protecting or infringing upon people’s rights. International differences in laws and ethics have implications for computing. [See: Ethics and Privacy.]


Fostering an Inclusive Computing Culture [See: Ethics and Privacy; Threats to Validity.]


Recognizing and Defining Computational Problems [See: Method Chaining; If-Expressions; Grouped Samples.]


Developing and Using Abstractions [See: Defining Functions; Defining Table Functions.]


Creating Computational Artifacts [See: Displaying Categorical Data; Histograms; Spread of a Data Set; Scatter Plots; Correlations.]


Testing and Refining Computational Artifacts [See: Checking Your Work.]


Communicating About Computing [See: Introduction to Computational Data Science; Choosing Your Dataset.]

Next-Gen Science Standards


Ask questions to determine relationships, including quantitative relationships, between independent and dependent variables. [See: Choosing Your Dataset.]


Ask and/or evaluate questions that challenge the premise(s) of an argument, the interpretation of a data set, or the suitability of the design. [See: Threats to Validity.]


Evaluate merits and limitations of two different models of the same proposed tool, process, mechanism, or system in order to select or revise a model that best fits the evidence or design criteria. [See: Histograms.]


Make directional hypotheses that specify what happens to a dependent variable when an independent variable is manipulated. [See: Linear Regression.]


Analyze data using tools, technologies, and/or models (e.g., computational, mathematical) in order to make valid and reliable scientific claims or determine an optimal design solution. [See: Data Displays and Lookups; Method Chaining.]


Apply concepts of statistics and probability (including determining function fits to data, slope, intercept, and correlation coefficient for linear fits) to scientific and engineering questions and problems, using digital tools when feasible. [See: Spread of a Data Set.]


Consider limitations of data analysis (e.g., measurement error, sample selection) when analyzing and interpreting data. [See: Randomness and Sample Size; Threats to Validity.]


Evaluate the impact of new data on a working explanation and/or model of a proposed process or system. [See: If-Expressions; Grouped Samples.]


Analyze data to identify design features or characteristics of the components of a proposed process or system to optimize it relative to criteria for success. [See: Table Methods.]


Apply techniques of algebra and functions to represent and solve scientific and engineering problems. [See: Defining Functions; Defining Table Functions.]


Use simple limit cases to test mathematical expressions, computer programs, algorithms, or simulations of a process or system to see if a model “makes sense” by comparing the outcomes with what is known about the real world. [See: Checking Your Work.]


Make a quantitative and/or qualitative claim regarding the relationship between dependent and independent variables. [See: Scatter Plots; Correlations.]


Compare and evaluate competing arguments or design solutions in light of currently accepted explanations, new evidence, limitations (e.g., trade-offs), constraints, and ethical issues. [See: Ethics and Privacy.]

Oklahoma Standards


Select and modify an existing algorithm in natural language or pseudocode to solve complex problems. [See: Starting to Program.]


Develop programs that utilize combinations of nested repetition, compound conditionals, procedures without parameters, and the manipulation of variables representing different data types. [See: Starting to Program.]


Incorporate existing code, media, and libraries into original programs of increasing complexity and give attribution. [See: Defining Functions.]


Analyze multiple methods of representing data and choose the most appropriate method for representing data. [See: Displaying Categorical Data; Data Displays and Lookups.]


Use knowledge of solving equations with rational values to represent and solve mathematical and real-world problems (e.g., angle measures, geometric formulas, science, or statistics) and interpret the solutions in the original context. [See: Defining Functions.]


Identify the dependent and independent variables as well as the domain and range given a function, equation, or graph. Identify restrictions on the domain and range in real-world contexts. [See: Applying Functions.]


Write linear functions, using function notation, to model real-world and mathematical situations. [See: Defining Functions.]


Break down a solution into procedures using systematic analysis and design. [See: Defining Table Functions; Method Chaining.]


Create computational artifacts by systematically organizing, manipulating and/or processing data. [See: Table Methods.]


Evaluate and refine computational artifacts to make them more user-friendly, efficient and/or accessible. [See: Visualizing the “Shape” of Data.]


Use tools and techniques to locate, collect, and create visualizations of small- and large-scale data sets (e.g., paper surveys and online data sets). [See: Choosing Your Dataset.]


Show the relationships between collected data elements using computational models. [See: Scatter Plots; Correlations; Linear Regression.]


Evaluate the ways computing impacts personal, ethical, social, economic, and cultural practices. [See: Ethics and Privacy.]


Test and refine computational artifacts to reduce bias and equity deficits. [See: Randomness and Sample Size; Grouped Samples; Checking Your Work; Threats to Validity.]


Identify, describe, and analyze linear relationships between two variables. [See: Randomness and Sample Size; Grouped Samples.]


Explain how outliers affect measures of central tendency. [See: Measures of Center.]


Collect, display and interpret data using scatterplots. Use the shape of the scatterplot to informally estimate a line of best fit, make statements about average rate of change, and make predictions about values not in the original data set. Use appropriate titles, labels and units. [See: Scatter Plots; Correlations; Linear Regression.]