Linear regression and logistic regression are two important backbone algorithms in data science and machine learning. Most models and algorithms use these regression fundamentals in the background.
What is Regression?
Regression analysis is a predictive modeling technique that investigates the relationship between a dependent variable Y and one or more independent variables X. In other words, regression shows how the dependent variable (y-axis) changes with respect to the independent variable (x-axis).
Regression analysis fits a line through a set of data points that best describes the data as a whole.
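To make the idea of a best-fit line concrete, here is a minimal sketch using NumPy's `polyfit` (least squares); the experience/salary numbers are made up for illustration:

```python
import numpy as np

# Hypothetical data: years of experience (X) vs. salary in $1000s (Y)
X = np.array([1, 2, 3, 4, 5], dtype=float)
Y = np.array([35, 42, 50, 58, 65], dtype=float)

# np.polyfit with degree 1 returns the slope m and intercept c
# of the least-squares best-fit line Y = m*X + c
m, c = np.polyfit(X, Y, 1)

print(f"best-fit line: Y = {m:.2f}*X + {c:.2f}")
```

The returned slope and intercept describe the line that minimizes the squared vertical distance to all the data points at once.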
Applications of Regression:
 Weather forecast
 Trend forecast
 Sales and marketing, e.g. the relationship between age and income.
 Trend and sales estimation.
 Analyzing the impact of price changes.
 Classifying email as spam or not.
 Assessing risk in the financial services and insurance domains.
 Return on investment: suppose a company invests funds to promote a brand; it can use regression to calculate the return on that brand-marketing investment.
 A company can use regression to find out which departments (HR, marketing, sales, admin, R&D) account for the largest share of salary costs relative to overall revenue.
 Regression analysis is useful for customer surveys and feedback forms to check the quality of service.
 A company can use a regression model to predict future employee salaries with respect to experience.
Linear Regression vs Logistic Regression

| Basic | Linear Regression | Logistic Regression |
|---|---|---|
| Core concept | Data points follow a straight-line trend. | Data points do not follow a straight line; they cluster between two outcomes. |
| Used with | Continuous variables | Categorical variables |
| Output prediction | The value of the dependent variable; coefficient interpretation of the independent variables is quite straightforward. | The probability of occurrence of an event; coefficient interpretation differs and depends on the family (binomial, Poisson, etc.) and link function (log, logit, inverse-log, etc.). |
| Accuracy and goodness of fit | Measured by loss, R-squared, adjusted R-squared, etc. | Measured by accuracy, precision, recall, F1 score, ROC curve, confusion matrix, etc. |
| Pattern | Maps a continuous X to a continuous Y. | Maps a continuous X to a binary Y, e.g. categories such as true/false or yes/no. |
| Formula | Y = mX + c | p = 1 / (1 + e^-(mX + c)) |
| Graph type | Straight line | Sigmoid (S-shaped) curve |
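The contrast in the table above can be seen in code. The sketch below, which assumes scikit-learn is installed and uses made-up hours-studied data, fits a linear regression to a continuous target (exam score) and a logistic regression to a categorical target (pass/fail):

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

# Hypothetical data: hours studied (one feature per row)
X = np.array([[1], [2], [3], [4], [5], [6]], dtype=float)

# Continuous target (exam score) -> linear regression predicts a value
scores = np.array([52, 58, 61, 70, 74, 81], dtype=float)
lin = LinearRegression().fit(X, scores)
print("predicted score for 7 hours:", lin.predict([[7]])[0])

# Categorical target (pass = 1 / fail = 0) -> logistic regression
# predicts a probability of the event
passed = np.array([0, 0, 0, 1, 1, 1])
log = LogisticRegression().fit(X, passed)
print("probability of passing with 7 hours:", log.predict_proba([[7]])[0, 1])
```

The linear model outputs an unbounded continuous value, while the logistic model outputs a probability between 0 and 1, matching the "output prediction" row of the table.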
The regression line, also called the best-fit line, shows the relationship between the independent variable and the dependent variable.
A positive regression line is one where the dependent values on the y-axis are directly proportional to the independent values on the x-axis; the slope of the line is positive (+ve): Y = mx + c.
A negative regression line is one where the dependent values on the y-axis decrease as the independent values on the x-axis increase, so the Y values are inversely proportional to the X values; the slope of the regression line is negative (-ve): Y = mx + c.
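A quick sketch (hypothetical data, using NumPy) shows how the sign of the fitted slope m distinguishes the two cases:

```python
import numpy as np

x = np.array([1., 2., 3., 4.])

# Positive relationship: Y grows as X grows -> slope m is positive
m_pos, _ = np.polyfit(x, np.array([2., 4., 6., 8.]), 1)   # data from Y = 2x

# Negative relationship: Y falls as X grows -> slope m is negative
m_neg, _ = np.polyfit(x, np.array([8., 6., 4., 2.]), 1)   # data from Y = -2x + 10

print("positive slope:", m_pos, "negative slope:", m_neg)
```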
Logistic regression works like a switch: the output is either 0 or 1. The data is not linearly distributed; instead, it is divided into two groups.
We can derive the logistic regression formula by putting the linear regression output y into a sigmoid function.
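The transformation above can be sketched directly: the sigmoid squashes the unbounded linear output m*x + c into the (0, 1) range. The slope and intercept values here are arbitrary, chosen only for illustration:

```python
import numpy as np

def sigmoid(z):
    # squashes any real number into the (0, 1) range
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical linear-regression parameters
m, c = 1.5, -4.0
x = np.array([0.0, 2.0, 4.0, 6.0])

# Linear output y = m*x + c is unbounded ...
y_linear = m * x + c
# ... but sigmoid(y) is a probability between 0 and 1
p = sigmoid(y_linear)
print(p)
```

Thresholding the probability at 0.5 (equivalently, the linear output at 0) gives the 0/1 switch behavior described above.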
Wrapping up: By looking at the data pattern in linear regression vs logistic regression, we can easily see which regression will work well with which kind of dataset. Linear regression works great with continuous data points and provides good accuracy when predicting unseen data points. On the other hand, logistic regression (used to predict a binary outcome such as 1/0, yes/no, true/false) works very accurately on data points that fall into groups. We also saw their graphs and how the logistic regression equation is derived from linear regression using the sigmoid function. In a further article we will study linear and logistic regression in detail, select a dataset, drill down into the algorithms using Python, and also look at their advantages and disadvantages.