{"id":266963,"date":"2024-11-25T07:22:32","date_gmt":"2024-11-25T07:22:32","guid":{"rendered":"https:\/\/imarticus.org\/blog\/?p=266963"},"modified":"2024-11-25T12:49:23","modified_gmt":"2024-11-25T12:49:23","slug":"simple-linear-regression","status":"publish","type":"post","link":"https:\/\/imarticus.org\/blog\/simple-linear-regression\/","title":{"rendered":"A Step-by-Step Guide to Simple Linear Regression"},"content":{"rendered":"<p><span style=\"font-weight: 400;\">Simple <\/span><span style=\"font-weight: 400;\">linear regression<\/span><span style=\"font-weight: 400;\"> is a statistical method used to model the relationship between two variables: a dependent variable and an independent variable. It helps us understand how changes in the independent variable affect the dependent variable. This technique is widely used in various fields, including finance, economics, and social sciences.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Enrol in Imarticus Learning\u2019s holistic <\/span><a href=\"https:\/\/imarticus.org\/chartered-financial-analyst-certification-program\/\"><b>CFA course<\/b><\/a><span style=\"font-weight: 400;\"> to become a chartered financial analyst.<\/span><\/p>\n<h2><span style=\"font-weight: 400;\">Linear Regression Explained for Beginners<\/span><span style=\"font-weight: 400;\">: Understanding the Model<\/span><\/h2>\n<p><span style=\"font-weight: 400;\">A simple <\/span><span style=\"font-weight: 400;\">linear regression<\/span><span style=\"font-weight: 400;\"> model can be expressed as:<\/span><\/p>\n<p><b><i>Y = \u03b2\u2080 + \u03b2\u2081X + \u03b5<\/i><\/b><\/p>\n<p><span style=\"font-weight: 400;\">Where:<\/span><\/p>\n<ul>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Y:<\/b><span style=\"font-weight: 400;\"> Dependent variable<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>X:<\/b><span style=\"font-weight: 400;\"> Independent variable<\/span><\/li>\n<li style=\"font-weight: 400;\" 
aria-level=\"1\"><b>\u03b2\u2080:<\/b><span style=\"font-weight: 400;\"> Intercept<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>\u03b2\u2081: <\/b><span style=\"font-weight: 400;\">Slope<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>\u03b5:<\/b><span style=\"font-weight: 400;\"> Error term<\/span><\/li>\n<\/ul>\n<p><span style=\"font-weight: 400;\">The goal of regression analysis is to estimate the values of \u03b2\u2080 and \u03b2\u2081, which represent the intercept and slope of the regression line, respectively.<\/span><\/p>\n<h2><span style=\"font-weight: 400;\">Linear Regression Tutorial<\/span><span style=\"font-weight: 400;\">: Steps in Simple Linear Regression<\/span><\/h2>\n<p><span style=\"font-weight: 400;\">Here is a comprehensive <\/span><span style=\"font-weight: 400;\">linear regression tutorial<\/span><span style=\"font-weight: 400;\"> so that it is easier for you to understand the steps involved in this process.<\/span><\/p>\n<h3><span style=\"font-weight: 400;\">Data Collection<\/span><\/h3>\n<ul>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Identify Variables:<\/b><span style=\"font-weight: 400;\"> Determine the dependent and independent variables for your analysis.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Collect Data:<\/b><span style=\"font-weight: 400;\"> Gather relevant data for both variables. 
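The least-squares estimates of \u03b2\u2080 and \u03b2\u2081 can be computed directly from the data. Here is a minimal sketch in plain Python with numpy, using made-up experience/salary figures (all numbers are illustrative only):

```python
import numpy as np

# Hypothetical data: X = years of experience, Y = salary in thousands
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([30.0, 35.0, 42.0, 48.0, 55.0])

# Least-squares slope: beta1 = cov(X, Y) / var(X)
beta1 = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
# Intercept: the regression line passes through the point of means
beta0 = Y.mean() - beta1 * X.mean()
```

Statistical packages such as Statsmodels or Scikit-learn produce the same estimates, along with standard errors and R\u00b2.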
Ensure the data is accurate and reliable.<\/span><\/li>\n<\/ul>\n<h3><span style=\"font-weight: 400;\">Data Cleaning and Preparation<\/span><\/h3>\n<ul>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Missing Values:<\/b><span style=\"font-weight: 400;\"> Handle missing values using techniques like imputation or deletion.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Outliers:<\/b><span style=\"font-weight: 400;\"> Identify and handle outliers, which can significantly impact the regression results.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Data Transformation:<\/b><span style=\"font-weight: 400;\"> If necessary, transform the data (e.g., log transformation) to meet the assumptions of linear regression.<\/span><\/li>\n<\/ul>\n<h3><span style=\"font-weight: 400;\">Model Specification<\/span><\/h3>\n<ul>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Linear Relationship:<\/b><span style=\"font-weight: 400;\"> Assume a linear relationship between the variables.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Error Term Assumptions:<\/b><span style=\"font-weight: 400;\"> Assume that the error term is normally distributed with a mean of zero and constant variance.<\/span><\/li>\n<\/ul>\n<h3><span style=\"font-weight: 400;\">Model Estimation<\/span><\/h3>\n<ul>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Least Squares Method:<\/b><span style=\"font-weight: 400;\"> Use the least squares method to estimate the coefficients \u03b2\u2080 and \u03b2\u2081.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Statistical Software:<\/b><span style=\"font-weight: 400;\"> Utilise statistical software like R, Python, or Excel to perform the calculations.<\/span><\/li>\n<\/ul>\n<h3><span style=\"font-weight: 400;\">Model Evaluation<\/span><\/h3>\n<ul>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Coefficient of Determination (R\u00b2): <\/b><span style=\"font-weight: 
400;\">Measures the proportion of the variance in the dependent variable explained by the independent variable.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Standard Error of the Estimate:<\/b><span style=\"font-weight: 400;\"> Measures the variability of the observed values around the regression line.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Hypothesis Testing:<\/b><span style=\"font-weight: 400;\"> Test the significance of the regression coefficients using t-tests.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Residual Analysis:<\/b><span style=\"font-weight: 400;\"> Examine the residuals to check for patterns or outliers.<\/span><\/li>\n<\/ul>\n<h3><span style=\"font-weight: 400;\">Interpretation of Results<\/span><\/h3>\n<ul>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Intercept:<\/b><span style=\"font-weight: 400;\"> The value of Y when X is zero.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Slope:<\/b><span style=\"font-weight: 400;\"> The change in Y for a one-unit change in X.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>R\u00b2:<\/b><span style=\"font-weight: 400;\"> The proportion of the variation in Y explained by X.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Statistical Significance:<\/b><span style=\"font-weight: 400;\"> Assess the statistical significance of the regression coefficients.<\/span><\/li>\n<\/ul>\n<h2><span style=\"font-weight: 400;\">Applications of Simple Linear Regression<\/span><\/h2>\n<ul>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Financial Analysis:<\/b><span style=\"font-weight: 400;\"> Predicting stock prices, forecasting sales, or estimating costs.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Economics:<\/b><span style=\"font-weight: 400;\"> Analysing the relationship between economic variables, such as GDP and unemployment.<\/span><\/li>\n<li 
style=\"font-weight: 400;\" aria-level=\"1\"><b>Marketing: <\/b><span style=\"font-weight: 400;\">Predicting customer behaviour, measuring the effectiveness of marketing campaigns, or optimising pricing strategies.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Social Sciences:<\/b><span style=\"font-weight: 400;\"> Studying the impact of social factors on various outcomes, such as education, health, and crime.<\/span><\/li>\n<\/ul>\n<h2><span style=\"font-weight: 400;\">Limitations of Simple <\/span><span style=\"font-weight: 400;\">Linear Regression<\/span><\/h2>\n<ul>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Linear Relationship:<\/b><span style=\"font-weight: 400;\"> Assumes a linear relationship between the variables.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Outliers and Influential Points:<\/b><span style=\"font-weight: 400;\"> Outliers can significantly affect the regression results.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Multicollinearity:<\/b><span style=\"font-weight: 400;\"> Not an issue with a single predictor, but once the model is extended to multiple regression, highly correlated independent variables can lead to unstable estimates.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Causation:<\/b><span style=\"font-weight: 400;\"> Correlation does not imply causation.<\/span><\/li>\n<\/ul>\n<h2><span style=\"font-weight: 400;\">Multiple <\/span><span style=\"font-weight: 400;\">Linear Regression Explained for Beginners<\/span><\/h2>\n<p><span style=\"font-weight: 400;\">Multiple linear regression extends the simple linear regression model to include multiple independent variables. It is used to analyse the relationship between a dependent variable and two or more independent variables.
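As a rough illustration of how a model with several predictors is estimated, the sketch below uses numpy's least-squares solver on made-up data (the variable names and numbers are purely hypothetical, constructed so the true coefficients are known):

```python
import numpy as np

# Hypothetical data: sales predicted from advertising spend (x1) and price (x2)
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
x2 = np.array([9.0, 8.0, 8.0, 7.0, 7.0, 6.0])
y = 2.0 + 3.0 * x1 - 1.0 * x2   # built from known coefficients [2, 3, -1]

# Design matrix: a column of ones for the intercept, then one column per predictor
A = np.column_stack([np.ones_like(x1), x1, x2])

# Least-squares estimates of [beta0, beta1, beta2]
coeffs, residuals, rank, _ = np.linalg.lstsq(A, y, rcond=None)
```

Because the response was constructed exactly from the predictors, the solver recovers the coefficients 2, 3 and \u22121 up to floating-point error.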
The general form of the multiple linear regression model is:<\/span><\/p>\n<p><b><i>Y = \u03b2\u2080 + \u03b2\u2081X\u2081 + \u03b2\u2082X\u2082 + &#8230; + \u03b2\u209aX\u209a + \u03b5<\/i><\/b><\/p>\n<p><span style=\"font-weight: 400;\">Where:<\/span><\/p>\n<ul>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Y:<\/b><span style=\"font-weight: 400;\"> Dependent variable<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>X\u2081, X\u2082, &#8230;, X\u209a:<\/b><span style=\"font-weight: 400;\"> Independent variables<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>\u03b2\u2080:<\/b><span style=\"font-weight: 400;\"> Intercept<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>\u03b2\u2081, \u03b2\u2082, &#8230;, \u03b2\u209a:<\/b><span style=\"font-weight: 400;\"> Coefficients for each independent variable<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>\u03b5:<\/b><span style=\"font-weight: 400;\"> Error term<\/span><\/li>\n<\/ul>\n<h3><span style=\"font-weight: 400;\">Key Concepts<\/span><\/h3>\n<ul>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Multiple R-squared:<\/b><span style=\"font-weight: 400;\"> Measures the proportion of the variance in the dependent variable explained by all the independent variables.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Adjusted R-squared:<\/b><span style=\"font-weight: 400;\"> Adjusts R-squared for the number of independent variables, penalising for overfitting.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>F-test:<\/b><span style=\"font-weight: 400;\"> Tests the overall significance of the regression model.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>t-tests:<\/b><span style=\"font-weight: 400;\"> Test the significance of individual regression coefficients.<\/span><\/li>\n<\/ul>\n<h2><span style=\"font-weight: 400;\">Polynomial Regression<\/span><\/h2>\n<p><span 
style=\"font-weight: 400;\">Polynomial regression is used to model non-linear relationships between variables. It involves adding polynomial terms (e.g., squared, cubed) of the independent variable to the regression equation.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">For example, a quadratic regression model can be expressed as:<\/span><\/p>\n<p><b><i>Y = \u03b2\u2080 + \u03b2\u2081X + \u03b2\u2082X\u00b2 + \u03b5<\/i><\/b><\/p>\n<p><span style=\"font-weight: 400;\">Polynomial regression can capture more complex relationships than simple linear regression. However, it&#8217;s important to avoid overfitting the model by adding too many polynomial terms.<\/span><\/p>\n<h2><span style=\"font-weight: 400;\">Time Series Regression<\/span><\/h2>\n<p><span style=\"font-weight: 400;\">Time series regression is used to analyse time-series data, where the observations are ordered chronologically. It involves modelling the relationship between a dependent variable and time.<\/span><\/p>\n<h3><span style=\"font-weight: 400;\">Key Concepts<\/span><\/h3>\n<ul>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Autocorrelation: <\/b><span style=\"font-weight: 400;\">The correlation between observations at different time points.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Stationarity:<\/b><span style=\"font-weight: 400;\"> The property of a time series whose mean, variance, and autocorrelation remain constant over time.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Trend:<\/b><span style=\"font-weight: 400;\"> A long-term pattern in the data.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Seasonality:<\/b><span style=\"font-weight: 400;\"> Regular fluctuations that occur at specific intervals.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Cyclical Patterns:<\/b><span style=\"font-weight: 400;\"> Long-term fluctuations that are not
regular.<\/span><\/li>\n<\/ul>\n<h2><span style=\"font-weight: 400;\">Diagnostic Checks<\/span><\/h2>\n<p><span style=\"font-weight: 400;\">To ensure the validity of a regression model, it&#8217;s important to perform diagnostic checks:<\/span><\/p>\n<ul>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Normality of Residuals:<\/b><span style=\"font-weight: 400;\"> The residuals should be normally distributed.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Homoscedasticity:<\/b><span style=\"font-weight: 400;\"> The variance of the residuals should be constant across all values of the independent variable.\u00a0<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Independence of Errors:<\/b><span style=\"font-weight: 400;\"> The residuals should be independent of each other.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Multicollinearity:<\/b><span style=\"font-weight: 400;\"> The independent variables should not be highly correlated.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Outliers and Influential Points:<\/b><span style=\"font-weight: 400;\"> Identify and handle outliers that can significantly affect the regression results.<\/span><\/li>\n<\/ul>\n<h2><span style=\"font-weight: 400;\">Model Selection and Evaluation<\/span><\/h2>\n<h3><span style=\"font-weight: 400;\">Model Selection Criteria<\/span><\/h3>\n<ul>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Adjusted R-squared:<\/b><span style=\"font-weight: 400;\"> A modified version of R-squared that penalises for the number of predictors, helping to avoid overfitting.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Akaike Information Criterion (AIC):<\/b><span style=\"font-weight: 400;\"> Measures the relative quality of statistical models for a given set of data.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Bayesian Information Criterion (BIC):<\/b><span style=\"font-weight: 
400;\"> Similar to AIC, but with a stronger penalty for model complexity.<\/span><\/li>\n<\/ul>\n<h3><span style=\"font-weight: 400;\">Cross-Validation<\/span><\/h3>\n<ul>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>k-fold Cross-Validation:<\/b><span style=\"font-weight: 400;\"> Splits the data into k folds, trains the model on k-1 folds, and evaluates it on the remaining fold, rotating through all k folds.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Leave-One-Out Cross-Validation:<\/b><span style=\"font-weight: 400;\"> A special case of k-fold cross-validation where k equals the number of observations, so each observation serves once as the validation set.<\/span><\/li>\n<\/ul>\n<h3><span style=\"font-weight: 400;\">Regularisation Techniques<\/span><\/h3>\n<ul>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Ridge Regression: <\/b><span style=\"font-weight: 400;\">Adds a penalty term to the least-squares objective to shrink large coefficients.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Lasso Regression:<\/b><span style=\"font-weight: 400;\"> Shrinks some coefficients exactly to zero, effectively performing feature selection.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Elastic Net Regression: <\/b><span style=\"font-weight: 400;\">Combines the penalties of Ridge and Lasso regression.<\/span><\/li>\n<\/ul>\n<h2><span style=\"font-weight: 400;\">Robust Regression<\/span><\/h2>\n<p><span style=\"font-weight: 400;\">Robust regression techniques are designed to handle outliers and non-normality in the data.
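To show what the ridge penalty actually does, here is a hand-rolled closed-form sketch on a toy dataset (illustrative only; in practice scikit-learn's Ridge and Lasso estimators are the usual tools, and lasso has no closed-form solution):

```python
import numpy as np

def ridge(A, y, lam):
    # Closed-form ridge estimate: (A'A + lam*I)^-1 A'y
    # (simplified sketch: the penalty here applies to every coefficient)
    p = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(p), A.T @ y)

# Toy data where y = x exactly, so unpenalised least squares gives a slope of 1
A = np.array([[1.0], [2.0], [3.0]])
y = np.array([1.0, 2.0, 3.0])

ols_slope = ridge(A, y, 0.0)[0]     # lam = 0 reduces to ordinary least squares
ridge_slope = ridge(A, y, 14.0)[0]  # a large penalty shrinks the slope towards zero
```

With these numbers A'A = 14, so a penalty of lam = 14 halves the slope from 1.0 to 0.5, making the shrinkage effect explicit.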
They are less sensitive to the influence of outliers than ordinary least squares regression.<\/span><\/p>\n<ul>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Least Absolute Deviation (LAD) Regression:<\/b><span style=\"font-weight: 400;\"> Minimises the sum of absolute deviations rather than the sum of squared errors.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>M-Estimators:<\/b><span style=\"font-weight: 400;\"> A class of robust estimators that downweight the influence of outliers.<\/span><\/li>\n<\/ul>\n<h2><span style=\"font-weight: 400;\">Time Series Regression Models<\/span><\/h2>\n<p><span style=\"font-weight: 400;\">Time series regression models are used to analyse data collected over time. They account for factors like trend, seasonality, and autocorrelation.<\/span><\/p>\n<ul>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Autoregressive (AR) Models:<\/b><span style=\"font-weight: 400;\"> Model the relationship between a variable and its own lagged values.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Moving Average (MA) Models: <\/b><span style=\"font-weight: 400;\">Model the relationship between a variable and past forecast errors.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Autoregressive Integrated Moving Average (ARIMA) Models:<\/b><span style=\"font-weight: 400;\"> Combine AR and MA components with differencing to model non-stationary series; seasonal patterns require the seasonal extension (SARIMA).<\/span><\/li>\n<\/ul>\n<h2><span style=\"font-weight: 400;\">Generalised Linear Models (GLMs)<\/span><\/h2>\n<p><span style=\"font-weight: 400;\">GLMs extend linear regression to accommodate non-normal response variables.
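As a sketch of the logistic case, the toy example below fits pass/fail outcomes against hours studied by plain gradient ascent on the log-likelihood (made-up data; in practice Statsmodels' GLM or scikit-learn's LogisticRegression would be used):

```python
import numpy as np

# Hypothetical data: hours studied vs pass (1) / fail (0), invented for illustration
hours = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0, 4.5, 5.0])
passed = np.array([0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 1.0, 1.0, 1.0, 1.0])

X = np.column_stack([np.ones_like(hours), hours])  # intercept + slope columns
beta = np.zeros(2)

# Maximise the log-likelihood with plain gradient ascent
for _ in range(5000):
    p = 1.0 / (1.0 + np.exp(-X @ beta))            # predicted pass probabilities
    beta += 0.1 * X.T @ (passed - p) / len(passed)  # gradient of the log-likelihood

predicted = 1.0 / (1.0 + np.exp(-X @ beta))
```

Because the outcomes mostly switch from 0 to 1 as hours increase, the fitted slope is positive and the predicted probability rises with study time.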
They are useful for modelling count data, binary outcomes, and other non-normally distributed data.<\/span><\/p>\n<ul>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Poisson Regression: <\/b><span style=\"font-weight: 400;\">Models count data, such as the number of events occurring in a fixed time interval.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Logistic Regression:<\/b><span style=\"font-weight: 400;\"> Models binary outcomes, such as whether a customer will churn or not.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Negative Binomial Regression:<\/b><span style=\"font-weight: 400;\"> Models count data with overdispersion, where the variance exceeds the mean.<\/span><\/li>\n<\/ul>\n<h3><span style=\"font-weight: 400;\">Wrapping Up<\/span><\/h3>\n<p><span style=\"font-weight: 400;\">Simple linear regression is a powerful tool for understanding the relationship between two variables. You can effectively apply this technique to various real-world problems by following the steps outlined in this guide. However, it&#8217;s important to remember the limitations of the model and to use it judiciously.<\/span><\/p>\n<h3><span style=\"font-weight: 400;\">Frequently Asked Questions<\/span><\/h3>\n<p><b>What are the differences between simple linear regression and multiple linear regression?<\/b><\/p>\n<p><span style=\"font-weight: 400;\">Simple linear regression models the relationship between one dependent variable and one independent variable, while multiple linear regression models the relationship between one dependent variable and two or more independent variables.\u00a0<\/span><\/p>\n<p><b>What is a <\/b><b>linear regression example<\/b><b>?<\/b><\/p>\n<p><span style=\"font-weight: 400;\">A classic <\/span><span style=\"font-weight: 400;\">linear regression example<\/span><span style=\"font-weight: 400;\">: a real estate agent might use linear regression to predict the price of a house based on its square footage.
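A hedged sketch of that house-price example with scikit-learn (every listing figure below is invented for illustration, not real market data):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical listings: square footage vs sale price in thousands
sqft = np.array([[1100.0], [1400.0], [1650.0], [1800.0], [2100.0], [2500.0]])
price = np.array([210.0, 260.0, 300.0, 320.0, 370.0, 440.0])

model = LinearRegression().fit(sqft, price)   # estimates intercept and slope
r_squared = model.score(sqft, price)          # proportion of variance explained
estimate = model.predict(np.array([[2000.0]]))[0]  # predicted price for 2,000 sq ft
```

The fitted slope is the estimated price increase per additional square foot, and R\u00b2 summarises how much of the price variation square footage explains.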
In this case, the dependent variable (house price) is predicted by the independent variable (square footage). The regression model would estimate the relationship between these two variables, allowing the agent to make more accurate price predictions.<\/span><\/p>\n<p><b>How can I assess the goodness of fit of a regression model?<\/b><\/p>\n<p><span style=\"font-weight: 400;\">The goodness of fit of a regression model can be assessed using statistical measures like R-squared, adjusted R-squared, and the F-statistic. These measures help determine how well the model fits the data and how much of the variation in the dependent variable is explained by the independent variables.<\/span><\/p>\n<p><b>How to use <\/b><b>linear regression analysis in Python<\/b><b>?<\/b><\/p>\n<p><span style=\"font-weight: 400;\">To use <\/span><span style=\"font-weight: 400;\">linear regression in Python<\/span><span style=\"font-weight: 400;\">, you can leverage libraries like Statsmodels or Scikit-learn. You&#8217;ll first import the necessary libraries and load your data into a suitable format (e.g., pandas DataFrame). Then, you&#8217;ll define your dependent and independent variables, train the model using the fit() method, and evaluate the model&#8217;s performance using various metrics.<\/span><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Simple linear regression is a statistical method used to model the relationship between two variables: a dependent variable and an independent variable. It helps us understand how changes in the independent variable affect the dependent variable. This technique is widely used in various fields, including finance, economics, and social sciences. 
Enrol in Imarticus Learning\u2019s holistic [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":266964,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"_mo_disable_npp":"","_lmt_disableupdate":"","_lmt_disable":"","footnotes":""},"categories":[22],"tags":[4977],"class_list":["post-266963","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-finance","tag-simple-linear-regression"],"acf":[],"aioseo_notices":[],"modified_by":"Imarticus Learning","_links":{"self":[{"href":"https:\/\/imarticus.org\/blog\/wp-json\/wp\/v2\/posts\/266963","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/imarticus.org\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/imarticus.org\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/imarticus.org\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/imarticus.org\/blog\/wp-json\/wp\/v2\/comments?post=266963"}],"version-history":[{"count":1,"href":"https:\/\/imarticus.org\/blog\/wp-json\/wp\/v2\/posts\/266963\/revisions"}],"predecessor-version":[{"id":266965,"href":"https:\/\/imarticus.org\/blog\/wp-json\/wp\/v2\/posts\/266963\/revisions\/266965"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/imarticus.org\/blog\/wp-json\/wp\/v2\/media\/266964"}],"wp:attachment":[{"href":"https:\/\/imarticus.org\/blog\/wp-json\/wp\/v2\/media?parent=266963"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/imarticus.org\/blog\/wp-json\/wp\/v2\/categories?post=266963"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/imarticus.org\/blog\/wp-json\/wp\/v2\/tags?post=266963"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}