# Lesson - #1521 Regression Interaction Effects

An interaction effect is present in statistics as well as in marketing. In marketing, this same concept is referred to as the synergy effect. An interaction effect means that two or more feature variables combined have a significantly larger effect on the response than the sum of the individual variables alone. This effect is important to understand in regression, as we try to study the effect of several variables on a single response variable.

A linear regression equation can be expressed as follows:

Y = β₀ + β₁X₁ + β₂X₂ + ε
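As a sketch of this setup, we can simulate data from a model of the form Y = β₀ + β₁X₁ + β₂X₂ + ε and recover the coefficients by ordinary least squares. The coefficient values, sample size, and NumPy-based fitting here are illustrative assumptions, not part of the lesson's data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# True coefficients, chosen for illustration
b0, b1, b2 = 1.0, 2.0, -0.5

X1 = rng.normal(size=n)
X2 = rng.normal(size=n)
eps = rng.normal(scale=0.1, size=n)  # small error term
Y = b0 + b1 * X1 + b2 * X2 + eps

# Design matrix with an intercept column, then ordinary least squares
X = np.column_stack([np.ones(n), X1, X2])
coef, *_ = np.linalg.lstsq(X, Y, rcond=None)

print(coef)  # estimates of (b0, b1, b2), close to (1.0, 2.0, -0.5)
```

With a small error term and a few hundred observations, the least-squares estimates land very close to the true coefficients.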

Here, we try to find the linear relation between the independent variables (X₁ and X₂) and the response variable Y, where ε is a small error term. To check whether there is any statistically significant relation between the predictor and response variables, we conduct hypothesis testing. Here, we will have two hypotheses.

Suppose we conduct this test for the predictor variable X₁:

Null hypothesis (H₀): There is no relationship between X₁ and Y (β₁ = 0)

Alternative hypothesis (H₁): There is a relationship between X₁ and Y (β₁ ≠ 0)

We then decide whether or not to reject the null hypothesis based on the p-value. The p-value is the probability of obtaining test results at least as extreme as the observed results, assuming the null hypothesis is true. For example, if we get a non-zero value of β₁ in our test results, this suggests that there is a relationship between X₁ and Y. But if the p-value is large, there is a high probability that we would get a non-zero value for β₁ even when the null hypothesis is actually true. In such a case, we fail to reject the null hypothesis and conclude that there is no significant relation between the predictor and response variable. But if the p-value is low (the cutoff is generally taken to be 0.05), then even a small non-zero value of β₁ indicates a significant relation between the predictor and response variable.
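This test can be sketched directly with NumPy and SciPy (both assumed available here): fit the model by least squares, compute standard errors from the residuals, and turn the t-statistics into two-sided p-values. In this simulation X₁ truly affects Y while X₂'s coefficient is zero, so the null hypothesis holds for X₂:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 100

# X1 truly affects Y; X2 does not (its true coefficient is 0)
X1 = rng.normal(size=n)
X2 = rng.normal(size=n)
Y = 1.0 + 3.0 * X1 + 0.0 * X2 + rng.normal(size=n)

X = np.column_stack([np.ones(n), X1, X2])
coef, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Residual variance and standard errors of the coefficients
resid = Y - X @ coef
dof = n - X.shape[1]
sigma2 = resid @ resid / dof
se = np.sqrt(sigma2 * np.diag(np.linalg.inv(X.T @ X)))

# Two-sided t-test of H0: beta_j = 0 for each coefficient
t_stat = coef / se
p_values = 2 * stats.t.sf(np.abs(t_stat), dof)
print(p_values)
```

The p-value for X₁ comes out far below 0.05 (reject H₀), while the p-value for X₂ is much larger (fail to reject H₀), matching the reasoning above.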

If we conclude that there is a relationship between X₁ and Y, we consider that for each unit increase of X₁, Y increases (or decreases) by β₁ units. In the linear equation above, we assume that the effect of X₁ on Y is independent of X₂. This is also called the additive assumption in linear regression.
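The additive assumption can be made concrete with a tiny sketch (the coefficient values here are made up for illustration): under Y = β₀ + β₁X₁ + β₂X₂, a one-unit increase in X₁ changes the prediction by exactly β₁, no matter what value X₂ takes:

```python
def predict(x1, x2, b0=1.0, b1=2.0, b2=-0.5):
    # Additive linear model with illustrative coefficients
    return b0 + b1 * x1 + b2 * x2

for x2 in (0.0, 10.0, -3.0):
    delta = predict(5.0, x2) - predict(4.0, x2)
    print(x2, delta)  # delta is always b1 = 2.0, regardless of x2
```

An interaction term breaks exactly this property: with an interaction, the effect of X₁ would itself change as X₂ changes.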

But what if the effect of X₁ on Y also depends on X₂? We can see such relations in many business problems. Consider, for example, that we want to find the return on investment (ROI) for two different investment types. The linear regression equation for this example will be:

ROI = β₀ + β₁ · investment1 + β₂ · investment2 + ε

In this example, there is a possibility of greater profit if we invest in both types of investments partially rather than investing in one fully. For example, if we have 1000 units of money to invest, putting 500 units into each investment can lead to greater profit than investing all 1000 units in either investment type alone. In such a case, investment1's relation with ROI will depend on investment2. This relation can be included in our equation as follows:

ROI = β₀ + β₁ · investment1 + β₂ · investment2 + β₃ · (investment1 × investment2) + ε

In the equation above, we have included the "interaction" between investment1 and investment2 for the prediction of total return on investment. We can include similar interactions in any linear regression equation, which in general can be written as:

Y = β₀ + β₁X₁ + β₂X₂ + β₃X₁X₂ + ε

Here, β₃ is the coefficient of the interaction term. Again, to verify the presence of an interaction effect in regression, we conduct a hypothesis test and check the p-value for the interaction coefficient (in this case, β₃).
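In practice, fitting an interaction model just means adding the product column X₁·X₂ to the design matrix. A minimal sketch with simulated investment data (the coefficient values and units are made-up assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 300

inv1 = rng.uniform(0, 10, size=n)
inv2 = rng.uniform(0, 10, size=n)

# Simulated ROI with a true interaction (coefficient 0.3) between the investments
roi = (0.5 + 1.0 * inv1 + 2.0 * inv2
       + 0.3 * inv1 * inv2
       + rng.normal(scale=0.5, size=n))

# The interaction term is simply the product inv1*inv2, added as an extra column
X = np.column_stack([np.ones(n), inv1, inv2, inv1 * inv2])
coef, *_ = np.linalg.lstsq(X, roi, rcond=None)
b0, b1, b2, b3 = coef

print(b3)  # estimate of the interaction coefficient, near the true 0.3
```

A clearly non-zero estimate of β₃ (confirmed by its p-value, computed as in the earlier hypothesis test) indicates that the return on one investment depends on the level of the other.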