Logistic Regression Calculator
Compute logistic regression probabilities from coefficients and inputs.
1. Paste your dataset
Format: each row = 1 observation. Last column = target (0 or 1). Columns separated by comma, semicolon, tab, or space. First row can be a header.
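To make the expected format concrete, here is a minimal Python sketch of how pasted text like this could be parsed. The helper name `parse_dataset` is illustrative, not the page's actual code; it assumes numeric features, a numeric 0/1 target in the last column, and skips a non-numeric first row as a header.

```python
import re

def parse_dataset(text):
    """Parse pasted rows: last column is the 0/1 target; comma,
    semicolon, tab, or space may separate columns. A row that
    fails numeric conversion (e.g. a header) is skipped.
    (Hypothetical helper for illustration only.)"""
    rows = []
    for line in text.strip().splitlines():
        parts = [p for p in re.split(r"[,;\t ]+", line.strip()) if p]
        try:
            rows.append([float(p) for p in parts])
        except ValueError:
            continue  # header (or malformed) row
    X = [r[:-1] for r in rows]  # feature columns
    y = [r[-1] for r in rows]   # target column
    return X, y
```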
2. Training parameters
3. Model output
No model trained yet.
4. Predict with the trained model
After training, we’ll show the feature list. Enter values in order and get the probability \( P(y=1 \mid x) \).
Logistic regression essentials
Logistic regression models the log-odds (logit) of the positive class as a linear combination of predictors:
logit(p) = ln( p / (1 − p) ) = β₀ + β₁x₁ + β₂x₂ + … + βₖxₖ
p = 1 / (1 + e^(−(β₀ + β₁x₁ + … + βₖxₖ)))
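The two formulas above translate directly into code. This is a minimal sketch, assuming coefficients are stored as a flat list `[β₀, β₁, …, βₖ]`; the function names are illustrative, not the calculator's internals.

```python
import math

def sigmoid(z):
    # Overflow-safe logistic function 1 / (1 + e^(-z)).
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    ez = math.exp(z)
    return ez / (1.0 + ez)

def predict_proba(beta, x):
    """P(y=1 | x) for coefficients beta = [b0, b1, ..., bk]
    and feature vector x = [x1, ..., xk]."""
    z = beta[0] + sum(b * xi for b, xi in zip(beta[1:], x))
    return sigmoid(z)
```

Note that `sigmoid` branches on the sign of `z` so that `exp` is only ever called on a non-positive argument, which avoids overflow for large |z|.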
During training, we minimize the logistic loss (cross-entropy) over all observations. Here we do it with batch gradient descent.
Good to know
- Features on wildly different scales can slow or destabilize training. Normalizing helps.
- If your target isn’t 0/1, map it first.
- This page does a basic, educational implementation—no regularization, no auto class-weight.
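Since the first point recommends normalizing, here is one common approach, z-score standardization, as a small sketch; the helper name is illustrative and the zero-variance fallback is an assumption of this sketch.

```python
def standardize(X):
    """Z-score each feature column: (x - mean) / std.
    A zero-variance column falls back to std = 1.0, so it is
    centered but not scaled."""
    cols = list(zip(*X))
    means = [sum(c) / len(c) for c in cols]
    stds = []
    for c, m in zip(cols, means):
        var = sum((x - m) ** 2 for x in c) / len(c)
        stds.append(var ** 0.5 or 1.0)  # guard against std == 0
    return [[(x - m) / s for x, m, s in zip(row, means, stds)]
            for row in X]
```

Remember that coefficients fitted on standardized features are on the standardized scale; apply the same means and standard deviations to any new input before predicting.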
Formula (LaTeX) + variables + units
\( \operatorname{logit}(p) = \ln\!\left(\dfrac{p}{1-p}\right) = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \dots + \beta_k x_k \)
\( p = \dfrac{1}{1 + e^{-(\beta_0 + \beta_1 x_1 + \dots + \beta_k x_k)}} \)
- \( p \) — probability that \( y = 1 \) given \( x \) (dimensionless, between 0 and 1)
- \( \beta_0 \) — intercept; \( \beta_1, \dots, \beta_k \) — coefficients on the log-odds scale
- \( x_1, \dots, x_k \) — predictor values, in whatever units your dataset uses
Last code update: 2026-01-19