Note:

-An R Notebook is an R Markdown document with chunks that can be executed independently and interactively, with output visible immediately beneath the input.

-Notebook output is available as HTML, PDF, Word, or LaTeX.

-This Notebook, as HTML, is best opened with Google Chrome.

-The R code can be extracted as an Rmd file via the “Code” button in the notebook.

-This Notebook uses iterative development: the process starts with a simple implementation of a small set of requirements and iteratively enhances the evolving versions until the complete version is implemented.

What is Logistic Regression?

  • Regression analysis used when the dependent variable is dichotomous (binary), such as gender (male or female).
  • Like all regression analyses, logistic regression is a predictive analysis.
  • It models the relationship between one dependent binary variable and one or more nominal, ordinal, interval or ratio-level independent variables.

Note: We don’t use Linear Regression for binary classification because its linear function results in probabilities outside [0,1] interval, thereby making them invalid predictions.
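
As a quick illustration of this point, here is a minimal sketch with simulated data (not part of the use case below): a linear model fitted to a 0/1 outcome can produce “probabilities” outside [0, 1], while the logistic model keeps its predictions inside (0, 1).

# Simulate a binary outcome driven by a single predictor
set.seed(1)
x <- seq(-4, 4, length.out = 200)
y <- rbinom(200, size = 1, prob = plogis(2 * x))   # true logistic relationship

lin_fit <- lm(y ~ x)                                # linear probability model
log_fit <- glm(y ~ x, family = binomial(link = "logit"))

range(predict(lin_fit))                             # can fall below 0 or above 1
range(predict(log_fit, type = "response"))          # always inside (0, 1)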

Types of questions that a binary logistic regression can examine:

  • How does the probability of getting lung cancer (yes vs. no) change for every additional pound a person is overweight and for every pack of cigarettes smoked per day?
  • Do body weight, calorie intake, fat intake, and age have an influence on the probability of having a heart attack (yes vs. no)?

Types of Logistic Regression

1. Multinomial Logistic Regression:

Let’s say our target variable has K = 4 classes. This technique handles the multi-class problem by fitting K-1 independent binary logistic classifier models. To do this, it chooses one target class as the reference class and fits K-1 regression models that compare each of the remaining classes to the reference class.

Due to its restrictive nature, it isn’t used widely because it does not scale very well in the presence of a large number of target classes. In addition, since it builds K - 1 models, we would require a much larger data set to achieve reasonable accuracy.
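
For reference, a minimal hedged sketch of a multinomial fit in R, using the built-in iris data and the nnet package purely for illustration (neither is part of this tutorial’s use case):

# Multinomial logistic regression: K = 3 classes, so K - 1 = 2 sets of
# coefficients are estimated against a reference class (the first level of Species)
library(nnet)

multi_fit <- multinom(Species ~ Sepal.Length + Sepal.Width, data = iris)
summary(multi_fit)                        # coefficient rows for versicolor and virginica vs. setosa
head(predict(multi_fit, type = "probs"))  # per-class predicted probabilities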

2. Ordinal Logistic Regression:

This technique is used when the target variable is ordinal in nature. Let’s say we want to predict years of work experience (1, 2, 3, 4, 5, etc.), so there exists an order in the values, i.e., 5 > 4 > 3 > 2 > 1. Unlike the multinomial model, where we train K-1 models, Ordinal Logistic Regression builds a single model with multiple threshold values.

If we have K classes, the model will require K -1 threshold or cutoff points. Also, it makes an imperative assumption of proportional odds. The assumption says that on a logit (S shape) scale, all of the thresholds lie on a straight line.
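
A minimal hedged sketch of an ordinal (proportional-odds) fit in R, using made-up data and the MASS package for illustration only:

# Ordinal logistic regression: one set of slopes plus K - 1 threshold (intercept)
# estimates. The response must be an ordered factor.
library(MASS)

set.seed(42)
exp_years <- factor(sample(1:5, 300, replace = TRUE), ordered = TRUE)  # hypothetical ordinal target
age       <- 20 + as.integer(exp_years) * 3 + rnorm(300, sd = 4)       # hypothetical predictor

ord_fit <- polr(exp_years ~ age, Hess = TRUE)
summary(ord_fit)   # one slope for age, four threshold estimates (K = 5 classes)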

Note: Logistic Regression is not a great choice for solving multi-class problems, but it’s good to be aware of its types. In this tutorial we’ll focus on Logistic Regression for the binary classification task.

How does it work?

Logistic Regression assumes that the dependent (or response) variable follows a binomial distribution with the following characteristics:

  • There must be a fixed number of trials denoted by n, i.e. in the data set, there must be a fixed number of rows.
  • Each trial can have only two outcomes; i.e., the response variable can have only two unique categories.
  • The outcome of each trial must be independent of each other; i.e., the unique levels of the response variable must be independent of each other.
  • The probability of success (p) and failure (q) should be the same for each trial.
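
To make the link to the binomial distribution concrete, here is a small simulation sketch (with assumed coefficients, not the adult data): a Bernoulli outcome is generated through the logistic link and glm() recovers the coefficients.

# Each row is one Bernoulli trial; glm() with the binomial family recovers
# the coefficients used to generate the data.
set.seed(123)
n  <- 5000
x1 <- rnorm(n)
x2 <- rnorm(n)
p  <- plogis(-1 + 0.8 * x1 - 0.5 * x2)   # true success probability per trial
y  <- rbinom(n, size = 1, prob = p)      # binary outcome (0/1)

coef(glm(y ~ x1 + x2, family = binomial(link = "logit")))  # close to -1, 0.8, -0.5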

Major assumptions

  • The dependent variable should be dichotomous in nature (e.g., presence vs. absence)
  • There should be no outliers in the data
  • There should be no high correlations (multicollinearity) among the predictors
  • Mathematically, logistic regression estimates a multiple linear regression function on the log-odds (logit) scale: $\text{logit}(p) = \ln\left(\frac{p}{1-p}\right) = \beta_0 + \beta_1 x_1 + \dots + \beta_k x_k$

Overfitting. When selecting the model for the logistic regression analysis, another important consideration is the model fit. Adding independent variables to a logistic regression model will always increase the amount of variance explained in the log odds (typically expressed as R²). However, adding more and more variables to the model can result in overfitting, which reduces the generalizability of the model beyond the data on which the model is fit.

Reporting the R2. Numerous pseudo-R2 values have been developed for binary logistic regression. These should be interpreted with extreme caution as they have many computational issues which cause them to be artificially high or low. A better approach is to present any of the goodness of fit tests available; Hosmer-Lemeshow is a commonly used measure of goodness of fit based on the Chi-square test.
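
As one way to report goodness of fit instead of a pseudo-R², the Hosmer-Lemeshow test is available, for example, via the ResourceSelection package. This is a hedged sketch and assumes the logitMod object fitted in step 5 below.

# Hosmer-Lemeshow goodness-of-fit test: a large p-value gives no evidence of poor fit
#install.packages("ResourceSelection")
library(ResourceSelection)

hoslem.test(logitMod$y, fitted(logitMod), g = 10)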

Evaluate model and accuracy

  1. Akaike Information Criterion (AIC) (or BIC)
  2. Null Deviance and Residual Deviance
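
Both criteria can be read directly off a fitted glm object; the sketch below assumes the logitMod model built in step 5.

# AIC / BIC and deviances for a fitted binomial glm (logitMod from step 5);
# lower AIC/BIC is better when comparing candidate models.
AIC(logitMod)
BIC(logitMod)
logitMod$null.deviance   # deviance of the intercept-only model
logitMod$deviance        # residual deviance of the fitted model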

Use case:

Predict if an individual will earn more than $50K using logistic regression based on demographic variables data from https://github.com/itsmecevi/adult-csv

1. Import the data

inputData <- read.csv("adult.csv")
head(inputData)

A list of all variables:

names(inputData)
 [1] "AGE"           "WORKCLASS"     "FNLWGT"        "EDUCATION"     "EDUCATIONNUM" 
 [6] "MARITALSTATUS" "OCCUPATION"    "RELATIONSHIP"  "RACE"          "SEX"          
[11] "CAPITALGAIN"   "CAPITALLOSS"   "HOURSPERWEEK"  "NATIVECOUNTRY" "ABOVE50K"     

2. Check for class bias

Ideally, the proportion of events (Yes) and non-events (No) in the Y variable should be approximately the same. Let’s check the proportion of classes in the dependent variable ABOVE50K.

table(inputData$ABOVE50K)

    0     1 
24720  7841 

Clearly, there is a class bias (the proportion of events is much smaller than the proportion of non-events). We need to sample the observations in approximately equal proportions to get better models.

3. Create training and test samples

# Create Training Data
input_ones <- inputData[which(inputData$ABOVE50K == 1), ]  # all 1's
input_zeros <- inputData[which(inputData$ABOVE50K == 0), ]  # all 0's
set.seed(100)  # for repeatability of samples
input_ones_training_rows <- sample(1:nrow(input_ones), 0.7*nrow(input_ones))  # 1's for training
input_zeros_training_rows <- sample(1:nrow(input_zeros), 0.7*nrow(input_ones))  # 0's for training. Pick as many 0's as 1's
training_ones <- input_ones[input_ones_training_rows, ]  
training_zeros <- input_zeros[input_zeros_training_rows, ]
trainingData <- rbind(training_ones, training_zeros)  # row bind the 1's and 0's 
# Create Test Data
test_ones <- input_ones[-input_ones_training_rows, ]
test_zeros <- input_zeros[-input_zeros_training_rows, ]
testData <- rbind(test_ones, test_zeros)  # row bind the 1's and 0's 
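
A quick sanity check (not in the original code) that the down-sampled training data is now roughly balanced, while the test data keeps the original imbalance:

# Class counts after down-sampling the 0's to match the number of 1's
table(trainingData$ABOVE50K)
prop.table(table(trainingData$ABOVE50K))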

4. Compute information value to find out important variables

The smbinning::smbinning function converts a continuous variable into a categorical variable using recursive partitioning. We will first bin the continuous variables into categories and then capture the information values for all variables in iv_df.

#install.packages("simbinning")
#library(smbinning)
# segregate continuous and factor variables
factor_vars <- c ("WORKCLASS", "EDUCATION", "MARITALSTATUS", "OCCUPATION", "RELATIONSHIP", "RACE", "SEX", "NATIVECOUNTRY")
continuous_vars <- c("AGE", "FNLWGT","EDUCATIONNUM", "HOURSPERWEEK", "CAPITALGAIN", "CAPITALLOSS")
iv_df <- data.frame(VARS=c(factor_vars, continuous_vars), IV=numeric(14))  # init for IV results
# compute IV for categoricals
for(factor_var in factor_vars){
  smb <- smbinning.factor(trainingData, y="ABOVE50K", x=factor_var)  # WOE table
  if(class(smb) != "character"){ # check if an error occurred
    iv_df[iv_df$VARS == factor_var, "IV"] <- smb$iv
  }
}
# compute IV for continuous vars
for(continuous_var in continuous_vars){
  smb <- smbinning(trainingData, y="ABOVE50K", x=continuous_var)  # WOE table
  if(class(smb) != "character"){  # any error while calculating scores.
    iv_df[iv_df$VARS == continuous_var, "IV"] <- smb$iv
  }
}
iv_df <- iv_df[order(-iv_df$IV), ]  # sort
iv_df

For more information: https://www.r-bloggers.com/woe-and-iv-variable-screening-with-information-in-r/

5. Build logit models and predict on test data

logitMod <- glm(ABOVE50K ~ RELATIONSHIP + AGE, data=trainingData, family=binomial(link="logit"))
predicted <- plogis(predict(logitMod, testData))  # predicted scores
# or
predicted <- predict(logitMod, testData, type="response")  # predicted scores

Why did we use only RELATIONSHIP and AGE (rather than, say, RELATIONSHIP + AGE + CAPITALGAIN + OCCUPATION + EDUCATIONNUM)? Predictors are chosen from the variables with the highest information value computed in step 4; here the model is kept deliberately small for illustration.

Next, decide on the optimal prediction probability cutoff for the model.

The default cutoff prediction probability score is 0.5 or the ratio of 1’s and 0’s in the training data. But sometimes, tuning the probability cutoff can improve the accuracy in both the development and validation samples. The InformationValue::optimalCutoff function provides ways to find the optimal cutoff to improve the prediction of 1’s, of 0’s, or of both, and to reduce the misclassification error. Let’s compute the optimal score that minimizes the misclassification error for the above model.

#install.packages("InformationValue")
#library(InformationValue)
optCutOff <- optimalCutoff(testData$ABOVE50K, predicted)[1] 
optCutOff
[1] 0.8771106

6. Do model diagnostics

The summary(logitMod) gives the beta coefficients, standard error, z value and p value. If your model has categorical variables with multiple levels, you will find a row entry for each category of that variable. That is because each individual category is treated as an independent binary variable by glm(). In this case it is ok if a few of the categories in a multi-category variable don’t turn out to be significant in the model (i.e., the p value turns out greater than the significance level of 0.05).

summary(logitMod)

Call:
glm(formula = ABOVE50K ~ RELATIONSHIP + AGE, family = binomial(link = "logit"), 
    data = trainingData)

Deviance Residuals: 
    Min       1Q   Median       3Q      Max  
-2.0477  -0.7334   0.1133   0.8200   2.5855  

Coefficients:
                              Estimate Std. Error z value            Pr(>|z|)    
(Intercept)                  0.0003401  0.0899831   0.004               0.997    
RELATIONSHIP Not-in-family  -1.8881838  0.0569038 -33.182 <0.0000000000000002 ***
RELATIONSHIP Other-relative -3.1274740  0.2370408 -13.194 <0.0000000000000002 ***
RELATIONSHIP Own-child      -3.7651374  0.1567731 -24.016 <0.0000000000000002 ***
RELATIONSHIP Unmarried      -2.4644369  0.0936906 -26.304 <0.0000000000000002 ***
RELATIONSHIP Wife            0.1215044  0.0915656   1.327               0.185    
AGE                          0.0218334  0.0019637  11.119 <0.0000000000000002 ***
---
Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

(Dispersion parameter for binomial family taken to be 1)

    Null deviance: 15216  on 10975  degrees of freedom
Residual deviance: 11417  on 10969  degrees of freedom
AIC: 11431

Number of Fisher Scoring iterations: 5

VIF:

As with linear regression, we should check for multicollinearity in the model. A VIF well below 4 for every predictor is commonly taken as acceptable.

For more information: https://www.r-bloggers.com/collinearity-and-stepwise-vif-selection/

#install.packages("VIF")
#library(VIF)
vif(logitMod)
argument is not numeric or logical: returning NAError in as.vector(y) - mean(y) : non-numeric argument to binary operator

Misclassification Error:

Misclassification error is the percentage mismatch of predicted vs. actual values, irrespective of 1’s or 0’s. The lower the misclassification error, the better your model.

misClassError(testData$ABOVE50K, predicted, threshold = optCutOff)
[1] 0.0892

ROC (Receiver Operating Characteristics):

The Receiver Operating Characteristics curve traces the percentage of true positives accurately predicted by a given logit model as the prediction probability cutoff is lowered from 1 to 0. For a good model, as the cutoff is lowered, it should mark more of the actual 1’s as positives and fewer of the actual 0’s as 1’s. So for a good model the curve should rise steeply, indicating that the TPR (Y-axis) increases faster than the FPR (X-axis) as the cutoff score decreases. The greater the area under the ROC curve, the better the predictive ability of the model.

plotROC(testData$ABOVE50K, predicted)

The above model has an area under the ROC curve of 88.78%, which is pretty good.

Concordance:

In simpler words, of all possible pairs of one actual 1 and one actual 0, concordance is the percentage of pairs in which the predicted score of the actual 1 is greater than the predicted score of the actual 0. For a perfect model this will be 100%, so the higher the concordance, the better the quality of the model.

Concordance(testData$ABOVE50K, predicted)
$`Concordance`
[1] 0.7969542

$Discordance
[1] 0.2030458

$Tied
[1] 0.00000000000000002775558

$Pairs
[1] 45252896

The above model, with a concordance of 79.7%, is indeed a good quality model.

Specificity and Sensitivity:

Sensitivity (or True Positive Rate) is the percentage of 1’s (actuals) correctly predicted by the model, while specificity is the percentage of 0’s (actuals) correctly predicted. Specificity can also be calculated as 1 − False Positive Rate.


#http://r-statistics.co/Logistic-Regression-With-R.html
# Use the InformationValue namespace explicitly; otherwise sensitivity()/specificity()
# from packages such as caret may mask these functions and expect factor inputs
InformationValue::sensitivity(testData$ABOVE50K, predicted, threshold = optCutOff)
InformationValue::specificity(testData$ABOVE50K, predicted, threshold = optCutOff)

Confusion Matrix:

# Explicit namespace avoids caret::confusionMatrix, which expects factor inputs
InformationValue::confusionMatrix(testData$ABOVE50K, predicted, threshold = optCutOff)

Change log update

  • 02.02.2019

License

MIT
