# Logistic Regression Complexity

## 1. Linear Complexity

Logistic Regression has O(N) linear time complexity, so it scales very well. This is one of the main reasons it is commonly preferred in data science projects, particularly when big data or high-dimensional data is involved.

The ability to output class probabilities is another big advantage of the Logistic Regression algorithm, and more of its advantages are discussed below.
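As a quick illustration of the probability outputs mentioned above, here is a minimal scikit-learn sketch (the tiny dataset is invented purely for demonstration):

```python
# Sketch: Logistic Regression's probability outputs in scikit-learn.
# The four training points below are an invented toy example.
from sklearn.linear_model import LogisticRegression

X = [[0.0], [1.0], [2.0], [3.0]]
y = [0, 0, 1, 1]

model = LogisticRegression()
model.fit(X, y)

# predict_proba returns one probability per class for each sample;
# each row sums to 1.
probs = model.predict_proba([[1.5]])
print(probs)
```

This is what distinguishes Logistic Regression from classifiers that only output hard labels: the probabilities can feed directly into thresholding or risk-based decisions.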

## 2. Logistic Regression Runtime Performance

Logistic Regression is one of the fastest machine learning algorithms, along with Naive Bayes, and some of its popularity is attributable to its speed and how well it handles big data.

We ran some runtime performance tests to give an idea of how long real-world Logistic Regression training takes.

For binary classification, Logistic Regression has O(N*P) time complexity, where N is the number of rows (sample size) and P is the number of features.

A more complete expression of Logistic Regression's time complexity is O(N*P*C), where N and P are as before and C relates to the number of prediction classes (C=1 for binary classification, C=2 for predicting 3 classes, and so on). Additionally, the time complexity of a Logistic Regression model may vary based on the regularization technique and solver algorithm used with it.

Since the class count is bounded and constant factors don't change an algorithm's asymptotic complexity, Logistic Regression can be said to have O(N) linear complexity in most cases, which is very favorable for scaling. There are, however, situations where the prediction class count and the dimensionality of the dataset push the time complexity toward O(N^2) quadratic behavior. But what about runtime performance in practice?
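The linear-scaling claim above can be checked empirically: doubling the sample size should roughly double the fit time. A minimal sketch, using synthetic data (exact timings are machine-dependent):

```python
# Sketch: empirically observing near-linear scaling of Logistic Regression
# fit time in the number of rows N, on synthetic data.
import time
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

times = []
for n in (10_000, 20_000, 40_000):
    X, y = make_classification(n_samples=n, n_features=20, random_state=0)
    start = time.perf_counter()
    LogisticRegression(solver="lbfgs", max_iter=1000).fit(X, y)
    elapsed = time.perf_counter() - start
    times.append(elapsed)
    print(f"N={n:>6}: {elapsed:.3f} s")
```

On most machines the printed times grow roughly in proportion to N, though solver convergence can add noise at small sizes.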

Our performance tests with various Logistic Regression models produced the following runtime results, which can help you make an educated guess when applying Logistic Regression in machine learning and data science projects.

| Features | Solver    | 100K rows | 250K rows | 500K rows | 1M rows |
|----------|-----------|-----------|-----------|-----------|---------|
| 56       | lbfgs     | 1.61 s    | 2.68 s    | 3.86 s    | 4.16 s  |
| 2        | lbfgs     | 0.21 s    | 0.34 s    | 0.67 s    | 0.98 s  |
| 2        | liblinear | 0.06 s    | n/a       | n/a       | 0.44 s  |
| 56       | liblinear | 2.46 s    | n/a       | n/a       | 9.38 s  |
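The benchmark setup above can be approximated as follows. The original datasets are not specified, so this sketch substitutes synthetic data from `make_classification` at the 100K-row, 56-feature configuration:

```python
# Sketch: approximating the 100K-row, 56-feature benchmark with
# synthetic data (the original datasets are not specified).
import time
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=100_000, n_features=56, random_state=0)

for solver in ("lbfgs", "liblinear"):
    clf = LogisticRegression(solver=solver, max_iter=1000)
    start = time.perf_counter()
    clf.fit(X, y)
    print(f"{solver}: {time.perf_counter() - start:.2f} s")
```

Absolute numbers will differ from the table depending on hardware and data, but the relative gap between solvers at this feature count should be visible.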

Please note: the tests were run on consumer hardware (8th-gen Intel i7, 16GB RAM) and may not be very precise, especially at the very fast end. Data reading and loading times are excluded from the reported results.

The results show the efficiency and high performance of Logistic Regression models. Interestingly, the liblinear solver shows an advantage on simple, low-dimensional data with 2 features; however, it struggles when the data becomes higher-dimensional with 56 features, since the linearity of the data becomes questionable at that point.

This is likely why the Scikit-Learn implementation of Logistic Regression uses lbfgs as its default solver: it performs favorably across a wide range of situations.
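You can confirm the default yourself; when no `solver` argument is passed, scikit-learn's `LogisticRegression` falls back to lbfgs (the default since scikit-learn 0.22):

```python
# Sketch: LogisticRegression's default solver in recent scikit-learn versions.
from sklearn.linear_model import LogisticRegression

clf = LogisticRegression()  # no solver argument passed
print(clf.solver)           # the stored default solver name
```
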

## 3. Data Size

Logistic Regression is well suited to big data as well as more moderately sized datasets. It can handle millions of samples in a matter of seconds in most situations and scales well even with high-dimensional data.

This makes Logistic Regression one of the go-to machine learning algorithms for classification problems where training time and performance are critical criteria.

#### Rows

Logistic Regression can handle millions, and even billions, of rows without much difficulty if its hyperparameters are carefully tuned.

See: Tuning Logistic Regression for more detail and solver related settings.

#### Features

Logistic Regression also handles high-dimensional data well. For this purpose, specific solvers can be assigned to the solver parameter, making Logistic Regression friendlier to big data or high-dimensional data.
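As one example of such a solver assignment (this is an illustrative choice, not the only option): the saga solver accepts sparse input and supports L1 penalties, which can zero out uninformative coefficients when there are many features.

```python
# Sketch: using the 'saga' solver with an L1 penalty on sparse,
# high-dimensional synthetic data (illustrative configuration).
import scipy.sparse as sp
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=5_000, n_features=500,
                           n_informative=50, random_state=0)
X_sparse = sp.csr_matrix(X)  # saga works directly on sparse matrices

clf = LogisticRegression(solver="saga", penalty="l1", C=1.0, max_iter=2000)
clf.fit(X_sparse, y)
print("nonzero coefficients:", (clf.coef_ != 0).sum())
```

Note that saga converges fastest when features are on a similar scale, so standardizing the input first is generally advisable on real data.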

Another fast and scalable classification algorithm is Naive Bayes, which also produces probability outputs but works quite differently from Logistic Regression. If you've never tried it, it's definitely worth a look, as it's commonly used in machine learning and data science implementations.