Logistic Regression Statistics Excel Project

User Generated

xvzv1213

Business Finance

Description

the data and instruction of this assignment are attached below. please read the instruction and follow the instruction. please dont bid the question if you have no idea what this assignment about. please dont waste each other's time thank you.

Unformatted Attachment Preview

This assignment requires reading a primer on “Logistic Regression” You will need to download an Excel Add-Ins – Real Statistics Resource Pack for this assignment. The site contains instructions on how to download and install the Add-Ins. For information on how to use Real Statistics for this assignment see “Real Statistics Data Analysis Tool” (you need to scroll down until the subtopic above). Part 1 (30 points) The data used in this assignment is from Andrew Ng’s Machine Learning course on Coursera. The data is provided here. The data consists of marks of two exams for 100 applicants. The target value (last column) takes on binary values 1 (“1” means the applicant was admitted to the university whereas “0” means the applicant didn't get an admission. The objective is to build a classifier that can predict whether an application will be admitted to the university or not. You will need to copy the dataset below into a text editor i.e. Notepad as a .csv (comma delimited value) file before you can load into Excel. 34.62365962451697,78.0246928153624,0 79.0327360507101,75.3443764369103,1 45.08327747668339,56.3163717815305,0 61.10666453684766,96.51142588489624,1 75.02474556738889,46.55401354116538,1 76.09878670226257,87.42056971926803,1 84.43281996120035,43.53339331072109,1 95.86155507093572,38.22527805795094,0 75.01365838958247,30.60326323428011,0 82.30705337399482,76.48196330235604,1 69.36458875970939,97.71869196188608,1 39.53833914367223,76.03681085115882,0 53.9710521485623,89.20735013750205,1 69.07014406283025,52.74046973016765,1 67.94685547711617,46.67857410673128,0 70.66150955499435,92.92713789364831,1 76.97878372747498,47.57596364975532,1 67.37202754570876,42.83843832029179,0 89.67677575072079,65.79936592745237,1 50.534788289883,48.85581152764205,0 34.21206097786789,44.20952859866288,0 77.9240914545704,68.9723599933059,1 62.27101367004632,69.95445795447587,1 80.1901807509566,44.82162893218353,1 93.114388797442,38.80067033713209,0 61.83020602312595,50.25610789244621,0 38.78580379679423,64.99568095539578,0 61.379289447425,72.80788731317097,1 85.40451939411645,57.05198397627122,1 52.10797973193984,63.12762376881715,0 52.04540476831827,69.43286012045222,1 40.23689373545111,71.16774802184875,0 54.63510555424817,52.21388588061123,0 33.91550010906887,98.86943574220611,0 64.17698887494485,80.90806058670817,1 74.78925295941542,41.57341522824434,0 34.1836400264419,75.2377203360134,0 83.90239366249155,56.30804621605327,1 51.54772026906181,46.85629026349976,0 94.44336776917852,65.56892160559052,1 82.36875375713919,40.61825515970618,0 51.04775177128865,45.82270145776001,0 62.22267576120188,52.06099194836679,0 77.19303492601364,70.45820000180959,1 97.77159928000232,86.7278223300282,1 62.07306379667647,96.76882412413983,1 91.56497449807442,88.69629254546599,1 79.94481794066932,74.16311935043758,1 99.2725269292572,60.99903099844988,1 90.54671411399852,43.39060180650027,1 34.52451385320009,60.39634245837173,0 50.2864961189907,49.80453881323059,0 49.58667721632031,59.80895099453265,0 97.64563396007767,68.86157272420604,1 32.57720016809309,95.59854761387875,0 74.24869136721598,69.82457122657193,1 71.79646205863379,78.45356224515052,1 75.3956114656803,85.75993667331619,1 35.28611281526193,47.02051394723416,0 56.25381749711624,39.26147251058019,0 30.05882244669796,49.59297386723685,0 44.66826172480893,66.45008614558913,0 66.56089447242954,41.09209807936973,0 40.45755098375164,97.53518548909936,1 49.07256321908844,51.88321182073966,0 80.27957401466998,92.11606081344084,1 66.74671856944039,60.99139402740988,1 32.72283304060323,43.30717306430063,0 64.0393204150601,78.03168802018232,1 72.34649422579923,96.22759296761404,1 60.45788573918959,73.09499809758037,1 58.84095621726802,75.85844831279042,1 99.82785779692128,72.36925193383885,1 47.26426910848174,88.47586499559782,1 50.45815980285988,75.80985952982456,1 60.45555629271532,42.50840943572217,0 82.22666157785568,42.71987853716458,0 88.9138964166533,69.80378889835472,1 94.83450672430196,45.69430680250754,1 67.31925746917527,66.58935317747915,1 57.23870631569862,59.51428198012956,1 80.36675600171273,90.96014789746954,1 68.46852178591112,85.59430710452014,1 42.0754545384731,78.84478600148043,0 75.47770200533905,90.42453899753964,1 78.63542434898018,96.64742716885644,1 52.34800398794107,60.76950525602592,0 94.09433112516793,77.15910509073893,1 90.44855097096364,87.50879176484702,1 55.48216114069585,35.57070347228866,0 74.49269241843041,84.84513684930135,1 89.84580670720979,45.35828361091658,1 83.48916274498238,48.38028579728175,1 42.2617008099817,87.10385094025457,1 99.31500880510394,68.77540947206617,1 55.34001756003703,64.9319380069486,1 74.77589300092767,89.52981289513276,1 What to submit: The Workbook with your analysis: The worksheet containing the Box plots. The worksheet of the Logistic Regression output with the statistic (the cell highlighted in yellow) that indicates the % of the observed cases that are predicted accurately by the model. Your Workbook should have the following worksheet as Part2 (end of Part2 below): Part 2 (70 points) It’s almost graduation time and you are thinking of applying for a PH.D. program. You wonder what criteria predicts admission to a Ph.D. program. How can you predict whether a student will get an admit or not? What are the parameters for selection? Can it be mathematically expressed? All these questions started popping up, so you decided to answer the question. In order to solve this problem, you will follow a structured approach. 1. Define the problem Write down the problem statement and understand what you’re trying to solve. In this case, your objective is to predict whether a student will get an admit or not. So, it means, that this is a binary classification problem. 2. Generate your own hypothesis Next, list down all the things that you think can affect your objective viz. List down all the possible features with respect to your target feature. In your case, you ask this question — What are the factors that can affect a student’s admission? Take a piece of paper and write down your own hypothesis. Examples may be the following features - GRE Score - TOEFL Score - Statement of Purpose (SOP) - Letter of Recommendation (LOR) - Academic Performance (GPA) - Extra Curricular Activities (Sports, Math Olympiad etc..) - Outstanding Achievements - Projects and Research Make sure that you write down at least 10–20 for any problem statement. This helps you get a deeper understanding of what you are trying to solve and prompts you to think beyond the available dataset. 3. The Dataset You will be using UCLA hypothetical data for graduate admissions (see Excel workbook provided). You can now map your hypothesis with the given dataset. Identify each feature as either (a) Continuous Variable or (b) Categorical Variable (Source: Stack Exchange) 4. Data Cleaning Most of the time, the dataset will have lots of anomalies like missing values, outliers and so on. Always preprocess the data before moving on to the next step. - Missing Value Treatment: You can use Mean-imputation (for continuous variables) or Modeimputation (for categorical variables) - Outlier treatment: An outlier is an observation that lies an abnormal distance from other values in a random sample. There are four ways to treat it — Deleting observations, Imputing, Creating Bins and Treat Separately. - Feature Engineering: Combining, adding, deleting, splitting, scaling features in order to increase the accuracy of the model 5. Exploratory Data Analysis (EDA) Now it’s time to get your hands dirty. Let us explore the data and understand the given features. This dataset has a binary response (outcome, dependent) variable called admit. There are three predictor variables: gre, gpa and rank. You will treat the variables gre and gpa as continuous. The variable rank takes on the values 1 through 4. Institutions with a rank of 1 have the highest prestige, while those with a rank of 4 have the lowest. EDA gives you a clear idea about your features. It also helps to capture any trend or seasonality of data points. Use Box plot to show the outliers in your variables (gre, gpa). In your case, you can see that the higher the GRE and GPA, the better the chances of getting admit. 6. Predictive Modeling This is a binary classification problem. The output has only two possibilities either Yes (1) or No (0). You will use Logistics Regression in Real Statistics (Add-in) to conduct your analysis. Use Binary Logistic Regression Model in Real Statistics Add-in to predict the dependent variable (DV – Admit = 1, or Not Admit =0, to Ph.D. program), whether a student will be admitted with the three independent variables (IV – gre, gpa, rank) What to submit: The Workbook with your analysis: The worksheet containing the Box plots. The worksheet of the Logistic Regression output with the statistic (the cell highlighted in yellow) that indicates the % of the observed cases that are predicted accurately by the model. Your Workbook should have the following worksheet: admit gre 0 1 1 1 0 1 1 0 1 0 0 0 1 0 1 0 0 0 0 1 0 1 0 0 1 1 1 1 1 0 0 0 0 1 0 0 0 0 1 1 0 1 1 0 0 1 gpa 380 660 800 640 520 760 560 400 540 700 800 440 760 700 700 480 780 360 800 540 500 660 600 680 760 800 620 520 780 520 540 760 600 800 360 400 580 520 500 520 560 580 600 500 700 460 rank 3.61 3.67 4 3.19 2.93 3 2.98 3.08 3.39 3.92 4 3.22 4 3.08 4 3.44 3.87 2.56 3.75 3.81 3.17 3.63 2.82 3.19 3.35 3.66 3.61 3.74 3.22 3.29 3.78 3.35 3.4 4 3.14 3.05 3.25 2.9 3.13 2.68 2.42 3.32 3.15 3.31 2.94 3.45 3 3 1 4 4 2 1 2 3 2 4 1 1 2 1 3 4 3 2 1 3 2 4 4 2 1 1 4 2 1 4 3 3 3 1 2 1 3 2 3 2 2 2 3 2 3 1 0 0 0 0 0 0 1 0 1 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 1 0 0 0 0 1 0 1 0 580 500 440 400 640 440 740 680 660 740 560 380 400 600 620 560 640 680 580 600 740 620 580 800 640 300 480 580 720 720 560 800 540 620 700 620 500 380 500 520 600 600 700 660 700 720 800 3.46 2.97 2.48 3.35 3.86 3.13 3.37 3.27 3.34 4 3.19 2.94 3.65 2.82 3.18 3.32 3.67 3.85 4 3.59 3.62 3.3 3.69 3.73 4 2.92 3.39 4 3.45 4 3.36 4 3.12 4 2.9 3.07 2.71 2.91 3.6 2.98 3.32 3.48 3.28 4 3.83 3.64 3.9 2 4 4 3 3 4 4 2 3 3 3 3 2 4 2 4 3 3 3 2 4 1 1 1 3 4 4 2 4 3 3 3 1 1 4 2 2 4 3 2 2 2 1 2 2 1 2 0 1 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 1 0 1 0 1 1 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 1 580 660 660 640 480 700 400 340 580 380 540 660 740 700 480 400 480 680 420 360 600 720 620 440 700 800 340 520 480 520 500 720 540 600 740 540 460 620 640 580 500 560 500 560 700 620 600 2.93 3.44 3.33 3.52 3.57 2.88 3.31 3.15 3.57 3.33 3.94 3.95 2.97 3.56 3.13 2.93 3.45 3.08 3.41 3 3.22 3.84 3.99 3.45 3.72 3.7 2.92 3.74 2.67 2.85 2.98 3.88 3.38 3.54 3.74 3.19 3.15 3.17 2.79 3.4 3.08 2.95 3.57 3.33 4 3.4 3.58 2 2 2 4 2 2 3 3 3 4 3 2 2 1 2 3 2 4 4 3 1 3 3 2 2 1 3 2 2 3 3 3 4 1 4 2 4 2 2 2 3 2 3 4 3 2 1 0 1 0 0 0 0 0 0 1 0 1 0 1 0 0 1 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 0 0 0 0 1 0 0 0 640 700 620 580 580 380 480 560 480 740 800 400 640 580 620 580 560 480 660 700 600 640 700 520 580 700 440 720 500 600 400 540 680 800 500 620 520 620 620 300 620 500 700 540 500 800 560 3.93 3.52 3.94 3.4 3.4 3.43 3.4 2.71 2.91 3.31 3.74 3.38 3.94 3.46 3.69 2.86 2.52 3.58 3.49 3.82 3.13 3.5 3.56 2.73 3.3 4 3.24 3.77 4 3.62 3.51 2.81 3.48 3.43 3.53 3.37 2.62 3.23 3.33 3.01 3.78 3.88 4 3.84 2.79 3.6 3.61 2 4 4 3 4 3 2 3 1 1 1 2 2 3 3 4 2 1 2 3 2 2 2 2 2 1 4 3 3 3 3 3 3 2 4 2 2 3 3 3 3 4 2 2 4 2 3 0 0 0 1 0 0 0 1 0 0 1 0 0 0 1 1 0 1 1 0 1 0 0 0 0 0 0 1 1 0 1 0 1 0 0 1 0 0 1 0 0 0 1 0 0 0 0 580 560 500 640 800 640 380 600 560 660 400 600 580 800 580 700 420 600 780 740 640 540 580 740 580 460 640 600 660 340 460 460 560 540 680 480 800 800 720 620 540 480 720 580 600 380 420 2.88 3.07 3.35 2.94 3.54 3.76 3.59 3.47 3.59 3.07 3.23 3.63 3.77 3.31 3.2 4 3.92 3.89 3.8 3.54 3.63 3.16 3.5 3.34 3.02 2.87 3.38 3.56 2.91 2.9 3.64 2.98 3.59 3.28 3.99 3.02 3.47 2.9 3.5 3.58 3.02 3.43 3.42 3.29 3.28 3.38 2.67 2 2 2 2 3 3 4 2 2 3 4 3 4 3 2 1 4 1 3 1 1 3 2 4 2 2 3 2 3 1 1 1 2 3 3 1 3 2 3 2 4 2 2 4 3 2 3 1 0 1 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 1 1 1 0 1 1 0 0 0 0 1 1 1 0 0 1 1 0 1 0 1 0 0 1 0 1 1 1 0 800 620 660 480 500 700 440 520 680 620 540 800 680 440 680 640 660 620 520 540 740 640 520 620 520 640 680 440 520 620 520 380 560 600 680 500 640 540 680 660 520 600 460 580 680 660 660 3.53 3.05 3.49 4 2.86 3.45 2.76 3.81 2.96 3.22 3.04 3.91 3.34 3.17 3.64 3.73 3.31 3.21 4 3.55 3.52 3.35 3.3 3.95 3.51 3.81 3.11 3.15 3.19 3.95 3.9 3.34 3.24 3.64 3.46 2.81 3.95 3.33 3.67 3.32 3.12 2.98 3.77 3.58 3 3.14 3.94 1 2 2 2 4 3 2 1 3 2 1 3 2 2 3 3 4 4 2 4 4 3 2 3 2 2 2 2 3 3 3 3 4 3 2 3 2 3 2 1 2 2 3 1 4 2 2 0 0 0 1 0 1 1 0 0 1 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 1 0 0 0 0 0 0 1 0 1 1 1 1 0 0 0 0 0 0 0 0 1 360 660 520 440 600 800 660 800 420 620 800 680 800 480 520 560 460 540 720 640 660 400 680 220 580 540 580 540 440 560 660 660 520 540 300 340 780 480 540 460 460 500 420 520 680 680 560 3.27 3.45 3.1 3.39 3.31 3.22 3.7 3.15 2.26 3.45 2.78 3.7 3.97 2.55 3.25 3.16 3.07 3.5 3.4 3.3 3.6 3.15 3.98 2.83 3.46 3.17 3.51 3.13 2.98 4 3.67 3.77 3.65 3.46 2.84 3 3.63 3.71 3.28 3.14 3.58 3.01 2.69 2.7 3.9 3.31 3.48 3 4 4 2 4 1 4 4 4 2 2 2 1 1 3 1 2 2 3 2 3 2 2 3 4 1 2 2 3 3 2 3 4 4 2 2 4 4 1 3 2 4 2 3 1 2 2 0 0 0 0 0 0 1 1 0 0 0 1 0 1 0 0 0 0 0 0 0 0 1 0 1 0 1 1 0 0 1 0 1 1 0 0 1 0 0 0 0 0 1 1 1 1 0 580 500 740 660 420 560 460 620 520 620 540 660 500 560 500 580 520 500 600 580 400 620 780 620 580 700 540 760 700 720 560 720 520 540 680 460 560 480 460 620 580 800 540 680 680 620 560 3.34 2.93 4 3.59 2.96 3.43 3.64 3.71 3.15 3.09 3.2 3.47 3.23 2.65 3.95 3.06 3.35 3.03 3.35 3.8 3.36 2.85 4 3.43 3.12 3.52 3.78 2.81 3.27 3.31 3.69 3.94 4 3.49 3.14 3.44 3.36 2.78 2.93 3.63 4 3.89 3.77 3.76 2.42 3.37 3.78 2 4 3 3 1 3 3 1 3 4 1 3 4 3 4 2 3 3 2 2 2 2 2 3 3 2 2 1 2 1 3 3 1 1 2 2 1 3 3 3 1 2 2 3 1 1 2 0 0 1 0 0 0 1 0 0 1 0 1 0 0 0 1 1 1 1 1 0 0 0 0 0 560 620 800 640 540 700 540 540 660 480 420 740 580 640 640 800 660 600 620 460 620 560 460 700 600 0 3.49 3.63 4 3.12 2.7 3.65 3.49 3.51 4 2.62 3.02 3.86 3.36 3.17 3.51 3.05 3.88 3.38 3.75 3.99 4 3.04 2.63 3.65 3.89 0 4 2 2 3 2 2 2 2 1 2 1 2 2 2 2 2 2 3 2 3 2 3 2 2 3 0
Purchase answer to see full attachment
User generated content is uploaded by users for the purposes of learning and should be used following Studypool's honor code & terms of service.

Explanation & Answer

Attached.

Logistic Regression
gre
220
300
300
300
340
340
340
340
360
360
360
360
380
380
380
380
380
380
380
380
400
400
400
400
400
400
400
400
400
400
400
420
420
420
420
420
420
420
440
440
440
440
440

gpa
2.83
2.84
2.92
3.01
2.9
2.92
3
3.15
2.56
3
3.14
3.27
2.91
2.94
3.33
3.34
3.38
3.43
3.59
3.61
2.93
3.05
3.08
3.15
3.23
3.31
3.35
3.36
3.38
3.51
3.65
2.26
2.67
2.69
2.96
3.02
3.41
3.92
2.48
2.76
2.98
3.13
3.15

rank

Success
3
2
4
3
1
3
2
3
3
3
1
3
4
3
4
3
2
3
4
3
3
2
2
2
4
3
3
2
2
3
2
4
3
2
1
1
4
4
4
2
3
4
2

Failure
0
1
0
0
0
0
1
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
1
1
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0

Total
1
0
1
1
1
1
0
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
0
0
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1

p-Obs
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1

0
1
0
0
0
0
1
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
1
1
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0

p-Pred
0.08121
0.157798
0.061077
0.10884
0.273628
0.110975
0.188678
0.129871
0.089916
0.122091
0.322145
0.146417
0.071967
0.122012
0.097044
0.159398
0.255099
0.168993
0.116244
0.189553
0.126153
0.217187
0.221176
0.230687
0.094291
0.162445
0.166719
0.260907
0.263914
0.184712
0.306629
0.048792
0.109921
0.180054
0.321659
0.331915
0.111393
0.157053
0.059898
0.195335
0.141268
0.095498
0.247372

Suc-Pred
0.08121
0.157798
0.061077
0.10884
0.273628
0.110975
0.188678
0.129871
0.089916
0.122091
0.322145
0.146417
0.071967
0.122012
0.097044
0.159398
0.255099
0.168993
0.116244
0.189553
0.126153
0.217187
0.221176
0.230687
0.094291
0.162445
0.166719
0.260907
0.263914
0.184712
0.306629
0.048792
0.109921
0.180054
0.321659
0.331915
0.111393
0.157053
0.059898
0.195335
0.141268
0.095498
0.247372

440
440
440
440
440
460
460
460
460
460
460
460
460
460
460
460
460
460
460
480
480
480
480
480
480
480
480
480
480
480
480
480
480
480
480
500
500
500
500
500
500
500
500
500
500
500
500

3.17
3.22
3.24
3.39
3.45
2.63
2.87
2.93
2.98
3.07
3.14
3.15
3.44
3.45
3.58
3.64
3.64
3.77
3.99
2.55
2.62
2.67
2.78
2.91
3.02
3.13
3.39
3.4
3.43
3.44
3.45
3.57
3.58
3.71
4
2.71
2.79
2.81
2.86
2.93
2.97
2.98
3.01
3.03
3.08
3.13
3.17

2
1
4
2
2
2
2
3
1
2
3
4
2
3
2
1
3
3
3
1
2
2
3
1
1
2
4
2
2
3
2
2
1
4
2
2
4
3
4
4
4
3
4
3
3
2
3

0
0
0
1
1
0
0
0
0
0
0
0
0
1
0
1
1
0
1
0
1
1
0
1
1
0
0
0
0
0
0
0
1
1
0
0
0
0
0
0
0
0
0
0
0
1
0

1
1
1
0
0
1
1
1
1
1
1
1
1
0
1
0
0
1
0
1
0
0
1
0
0
1
1
1
1
1
1
1
0
0
1
1
1
1
1
1
1
1
1
1
1
0
1

1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1

0
0
0
1
1
0
0
0
0
0
0
0
0
1
0
1
1
0
1
0
1
1
0
1
1
0
0
0
0
0
0
0
1
1
0
0
0
0
0
0
0
0
0
0
0
1
0

0.250277
0.377951
0.10314
0.283698
0.293266
0.186814
0.216809
0.142123
0.345505
0.244353
0.163201
0.100937
0.301227
0.198814
0.324607
0.468534
0.223377
0.241389
0.274057
0.283515
0.192673
0.198788
0.133722
0.343583
0.363108
0.261834
0.124064
0.304351
0.309309
0.204954
0.312638
0.333017
0.46835
0.153702
0.410849
0.21133
0.085113
0.141942
0.089445
0.093974
0.096654
0.158803
0.099402
0.164062
0.16946
0.270798
0.179531

0.250277
0.377951
0.10314
0.283698
0.293266
0.186814
0.216809
0.142123
0.345505
0.244353
0.163201
0.100937
0.301227
0.198814
0.324607
0.468534
0.223377
0.241389
0.274057
0.283515
0.192673
0.198788
0.133722
0.343583
0.363108
0.261834
0.124064
0.304351
0.309309
0.204954
0.312638
0.333017
0.46835
0.153702
0.410849
0.21133
0.085113
0.141942
0.089445
0.093974
0.096654
0.158803
0.099402
0.164062
0.16946
0.270798
0.179531

500
500
500
500
500
500
500
500
500
520
520
520
520
520
520
520
520
520
520
520
520
520
520
520
520
520
520
520
520
520
520
520
520
540
540
540
540
540
540
540
540
540
540
540
540
540
540

3.23
3.31
3.35
3.53
3.57
3.6
3.88
3.95
4
2.62
2.68
2.7
2.73
2.85
2.9
2.93
2.98
3.1
3.12
3.15
3.19
3.25
3.29
3.3
3.35
3.51
3.65
3.74
3.74
3.81
3.9
4
4
2.7
2.81
3.02
3.04
3.12
3.13
3.16
3.17
3.19
3.2
3.28
3.28
3.33
3.38

4
3
2
4
3
3
4
4
3
2
3
3
2
3
3
4
2
4
2
3
3
3
1
2
3
2
4
2
4
1
3
1
2
2
3
4
1
1
2
3
1
2
1
1
3
3
4

0
0
0
0
0
1
0
0
0
0
1
0
0
0
0
0
0
0
0
0
1
0
0
1
0
0
1
1
1
1
1
1
1
0
0
0
0
0
0
0
1
0
0
0
0
0
0

1
1
1
1
1
0
1
1
1
1
0
1
1
1
1
1
1
1
1
1
0
1
1
0
1
1
0
0
0
0
0
0
0
1
1
1
1
1
1
1
0
1
1
1
1
1
1

1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1

0
0
0
0
0
1
0
0
0
0
1
0
0
0
0
0
0
0
0
0
1
0
0
1
0
0
1
1
1
1
1
1
1
0
0
0
0
0
0
0
1
0
0
0
0
0
0

0.115788
0.196116
0.305841
0.141871
0.229927
0.23408
0.178304
0.186413
0.294296
0.207349
0.135359
0.137188
0.221749
0.151576
0.15664
0.097954
0.25707
0.110261
0.278388
0.184043
0.188757
0.195999
0.435279
0.307335
0.208533
0.343117
0.159666
0.384445
0.169274
0.535864
0.287731
0.572323
0.433227
0.225668
0.153489
0.108674
0.399219
0.414217
0.289293
0.192237
0.423674
0.298972
0.429376
0.444668
0.207132
0.213585
0.138879

0.115788
0.196116
0.305841
0.141871
0.229927
0.23408
0.178304
0.186413
0.294296
0.207349
0.135359
0.137188
0.221749
0.151576
0.15664
0.097954
0.25707
0.110261
0.278388
0.184043
0.188757
0.195999
0.435279
0.307335
0.208533
0.343117
0.159666
0.384445
0.169274
0.535864
0.287731
0.572323
0.433227
0.225668
0.153489
0.108674
0.399219
0.414217
0.289293
0.192237
0.423674
0.298972
0.429376
0.444668
0.207132
0.213585
0.138879

540
540
540
540
540
540
540
540
540
540
540
540
540
560
560
560
560
560
560
560
560
560
560
560
560
560
560
560
560
560
560
560
560
560
560
560
580
580
580
580
580
580
580
580
580
580
580

3.39
3.46
3.49
3.49
3.5
3.51
3.55
3.77
3.78
3.78
3.81
3.84
3.94
2.42
2.52
2.65
2.71
2.95
2.98
3.04
3.07
3.16
3.19
3.24
3.32
3.33
3.36
3.36
3.43
3.48
3.49
3.59
3.61
3.69
3.78
4
2.86
2.88
2.93
3.02
3.06
3.12
3.2
3.25
3.29
3.3
3.32

3
4
1
2
2
2
4
2
2
4
1
2
3
2
2
3
3
2
1
3
2
1
3
4
4
4
1
3
3
2
4
2
3
3
2
3
4
2
2
2
2
3
2
1
4
2
2

1
0
1
1
0
0
1
1
1
0
1
1
0
0
0
1
0
0
1
0
0
0
0
0
0
0
1
0
0
1
0
1
0
1
0
0
1
0
0
0
0
1
1
0
0
0
1

0
1
0
0
1
1
0
0
0
1
0
0
1
1
1
0
1
1
0
1
1
1
1
1
1
1
0
1
1
0
1
1
1
0
1
1
0
1
1
1
1
0
0
1
1
1
0

1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
2
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1

1
0
1
1
0
0
1
1
1
0
1
1
0
0
0
1
0
0
1
0
0
0
0
0
0
0
1
0
0
1
0
0.5
0
1
0
0
1
0
0
0
0
1
1
0
0
0
1

0.22152
0.146481
0.485238
0.349987
0.351757
0.353531
0.155442
0.400944
0.402812
0.180373
0.547254
0.414076
0.303757
0.197084
0.209669
0.143571
0.1494
0.270359
0.399041
0.184989
0.28914
0.433005
0.20321
0.131526
0.138791
0.139722
0.471482
0.225437
0.235076
0.358706
0.155345
0.378595
0.261149
0.27332
0.413896
0.323667
0.10556
0.268683
0.276385
0.290587
0.297036
0.201835
0.32024
0.461628
0.14151
0.337385
0.340868

0.22152
0.146481
0.485238
0.349987
0.351757
0.353531
0.155442
0.400944
0.402812
0.180373
0.547254
0.414076
0.303757
0.197084
0.209669
0.143571
0.1494
0.270359
0.399041
0.184989
0.28914
0.433005
0.20321
0.131526
0.138791
0.139722
0.471482
0.225437
0.235076
0.358706
0.155345
0.75719
0.261149
0.27332
0.413896
0.323667
0.10556
0.268683
0.276385
0.290587
0.297036
0.201835
0.32024
0.461628
0.14151
0.337385
0.340868

580
580
580
580
580
580
580
580
580
580
580
580
580
580
580
580
580
580
600
600
600
600
600
600
600
600
600
600
600
600
600
600
600
600
600
600
600
600
600
600
620
620
620
620
620
620
620

3.34
3.36
3.4
3.4
3.4
3.46
3.46
3.46
3.5
3.51
3.57
3.58
3.69
3.77
3.8
4
4
4
2.82
2.98
3.13
3.15
3.22
3.28
3.31
3.32
3.35
3.38
3.4
3.47
3.48
3.54
3.56
3.58
3.59
3.62
3.63
3.64
3.89
3.89
2.85
3.05
3.07
3.09
3.17
3.18
3.21

2
2
2
3
4
2
3
4
2
2
3
1
1
4
2
1
2
3
4
2
2
2
1
3
4
2
2
3
3
2
2
1
2
1
2
3
3
3
1
3
2
2
2
4
2
2
4

0
0
0
0
0
1
0
0
0
0
0
1
0
0
0
0
0
0
0
1
0
1
0
0
0
0
0
1
0
1
0
1
1
1
0
0
0
1
1
0
0
0
0
0
1
1
0

1
1
1
1
1
0
1
1
1
1
1
0
1
1
1
1
1
1
2
0
1
0
1
1
1
1
1
0
1
0
1
0
0
0
1
1
1
0
0
1
1
1
1
1
0
0
1

1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
2
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1

0
0
0
0
0
1
0
0
0
0
0
1
0
0
0
0
0
0
0
1
0
1
0
0
0
0
0
1
0
1
0
1
1
1
0
0
0
1
1
0
0
0
0
0
1
1
0

0.344368
0.347885
0.354969
0.239158
0.152215
0.365714
0.247744
0.158329
0.372953
0.374772
0.264014
0.525634
0.546886
0.193124
0.42887
0.605628
0.467282
0.33379
0.106965
0.293647
0.318389
0.321771
0.467242
0.230647
0.149137
0.35125
0.35658
0.244722
0.247606
0.378246
0.380075
0.529322
0.394827
0.537058
0.40041
0.280804
0.282376
0.283953
0.596133
0.325045
0.282343
0.314867
0.318229
0.133953
0.335319
0.337053
0.145144

0.344368
0.347885
0.354969
0.239158
0.152215
0.365714
0.247744
0.158329
0.372953
0.374772
0.264014
0.525634
0.546886
0.193124
0.42887
0.605628
0.467282
0.33379
0.213931
0.293647
0.318389
0.321771
0.467242
0.230647
0.149137
0.35125
0.35658
0.244722
0.247606
0.378246
0.380075
0.529322
0.394827
0.537058
0.40041
0.280804
0.282376
0.283953
0.596133
0.325045
0.282343
0.314867
0.318229
0.133953
0.335319
0.337053
0.145144

620
620
620
620
620
620
620
620
620
620
620
620
620
620
620
620
620
620
620
620
620
620
640
640
640
640
640
640
640
640
640
640
640
640
640
640
640
640
640
640
640
640
640
660
660
660
660

3.22
3.23
3.3
3.33
3.37
3.37
3.4
3.43
3.45
3.58
3.61
3.63
3.63
3.69
3.71
3.75
3.78
3.94
3.95
3.99
4
4
2.79
2.94
3.12
3.17
3.19
3.3
3.35
3.38
3.5
3.51
3.52
3.63
3.67
3.73
3.76
3.81
3.86
3.93
3.94
3.95
4
2.91
3.07
3.14
3.31

2
3
1
3
1
2
2
3
2
2
1
2
3
3
1
2
3
4
3
3
1
2
2
2
3
2
4
2
3
3
2
2
4
1
3
3
3
2
3
2
2
2
3
3
3
2
4

0
1
0
0
1
1
0
0
1
0
1
0
0
0
1
1
0
0
2
0
1
0
0
1
0
0
1
0
0
0
0
0
0
1
0
0
0
0
0
0
1
1
0
1
0
1
0

1
0
1
1
0
0
1
1
0
1
0
1
1
1
0
0
1
1
0
1
0
1
1
0
1
1
0
1
1
1
1
1
1
0
1
1
1
1
1
1
0
0
1
0
1
0
1

1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
2
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1

0
1
0
0
1
1
0
0
1
0
1
0
0
0
1
1
0
0
1
0
1
0
0
1
0
0
1
0
0
0
0
0
0
1
0
0
0
0
0
0
1
1
0
1
0
1
0

0.344033
0.231896
0.494205
0.246024
0.507802
0.370792
0.376247
0.26072
0.385407
0.409593
0.55421
0.41902
0.291765
0.301491
0.573317
0.441875
0.316419
0.230412
0.345657
0.35272
0.627318
0.490176
0.282193
0.306389
0.224919
0.345621
0.148949
0.368808
0.257594
0.262077
0.405662
0.407536
0.184455
0.569329
0.307918
0.317942
0.323018
0.464795
0.340237
0.488049
0.48999
0.491932
0.365063
0.205132
0.226147
0.350743
0.167459

0.344033
0.231896
0.494205
0.246024
0.507802
0.370792
0.376247
0.26072
0.385407
0.409593
0.55421
0.41902
0.291765
0.301491
0.573317
0.441875
0.316419
0.230412
0.691314
0.35272
0.627318
0.490176
0.282193
0.306389
0.224919
0.345621
0.148949
0.368808
0.257594
0.262077
0.405662
0.407536
0.184455
0.569329
0.307918
0.317942
0.323018
0.464795
0.340237
0.488049
0.48999
0.491932
0.365063
0.205132
0.226147
0.350743
0.167459

660
660
660
660
660
660
660
660
660
660
660
660
660
660
660
660
660
660
660
680
680
680
680
680
680
680
680
680
680
680
680
680
680
680
680
680
680
680
680
700
700
700
700
700
700
700
700

3.32
3.33
3.34
3.44
3.45
3.47
3.49
3.59
3.6
3.63
3.67
3.67
3.7
3.77
3.88
3.94
3.95
4
4
2.42
2.96
3
3.08
3.11
3.14
3.19
3.27
3.31
3.34
3.46
3.48
3.64
3.67
3.7
3.76
3.85
3.9
3.98
3.99
2.88
2.9
2.94
3.08
3.27
3.28
3.45
3.52

1
2
3
2
4
3
2
3
3
2
2
3
4
3
2
2
2
1
2
1
3
4
4
2
2
4
2
2
2
2
3
3
2
2
3
3
1
2
3
2
4
2
2
2
1
3
2

0
0
0
1
0
1
1
0
1
1
0
1
1
0
1
0
1
0
1
1
1
1
0
0
0
0
1
0
0
1
0
0
1
0
1
1
0
1
0
0
0
0
0
0
0
0
0

1
1
1
0
1
0
1
1
0
0
1
0
0
1
0
1
0
1
0
0
0
0
1
1
1
1
0
1
1
0
1
1
0
1
0
0
1
0
1
1
1
1
1
1
1
1
1

1
1
1
1
1
1
2
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1

0
0
0
1
0
1
0.5
0
1
1
0
1
1
0
1
0
1
0
1
1
1
1
0
0
0
0
1
0
0
1
0
0
1
0
1
1
0
1
0
0
0
0
0
0
0
0
0

0.521017
0.385056
0.264949
0.405483
0.183178
0.285082
0.414882
0.30446
0.306108
0.441509
0.449187
0.317781
0.214045
0.334859
0.489805
0.501459
0.503401
0.648508
0.513111
0.361395
0.219293
0.142004
0.149748
0.355899
0.361261
0.160959
0.38488
0.392264
0.397835
0.420371
0.296141
0.322694
0.460562
0.466359
0.3434
0.359336
0.641221
0.520687
0.384743
0.326064
0.138171
0.336391
0.36109
0.395798
0.536136
0.300867
0.44306

0.521017
0.385056
0.264949
0.405483
0.183178
0.285082
0.829763
0.30446
0.306108
0.441509
0.449187
0.317781
0.214045
0.334859
0.489805
0.501459
0.503401
0.648508
0.513111
0.361395
0.219293
0.142004
0.149748
0.355899
0.361261
0.160959
0.38488
0.392264
0.397835
0.420371
0.296141
0.322694
0.460562
0.466359
0.3434
0.359336
0.641221
0.520687
0.384743
0.326064
0.138171
0.336391
0.36109
0.395798
0.536136
0.300867
0.44306

700
700
700
700
700
700
700
700
700
700
700
720
720
720
720
720
720
720
720
720
720
720
740
740
740
740
740
740
740
740
740
740
760
760
760
760
760
780
780
780
780
780
800
800
800
800
800

3.52
3.56
3.56
3.65
3.72
3.82
3.83
3.92
4
4
4
3.31
3.4
3.42
3.45
3.5
3.64
3.77
3.84
3.88
3.94
4
2.97
3.31
3.34
3.37
3.52
3.54
3.62
3.74
3.86
4
2.81
3
3.35
3.35
4
3.22
3.63
3.8
3.87
4
2.78
2.9
3.05
3.15
3.22

4
1
2
2
2
3
2
2
1
2
3
1
3
2
4
3
1
3
3
3
3
3
2
1
4
4
4
1
4
4
2
3
1
2
2
3
1
2
4
3
4
2
2
2
2
4
1

1
1
1
0
0
0
0
0
2
0
0
0
0
1
0
1
1
0
0
0
0
0
1
0
0
0
1
0
0
0
1
1
1
1
1
0
1
1
1
1
0
1
0
0
1
0
1

0
0
0
2
1
1
1
1
1
1
1
1
1
0
1
0
0
1
1
1
1
1
0
1
1
1
0
1
1
1
0
1
0
0
0
1
0
0
0
0
1
0
1
1
0
1
0

1
1 0.206064 0.206064
1
1 0.589612 0.589612
1
1 0.450742 0.450742
2
0 0.468108 0.936217
1
0 0.481671 0.481671
1
0 0.364548 0.364548
1
0 0.503031 0.503031
1
0 0.520502 0.520502
3 0.666667 0.669129 2.007386
1
0 0.535991 0.535991
1
0 0.397518 0.397518
1
0 0.553293 0.5532...


Anonymous
Very useful material for studying!

Studypool
4.7
Trustpilot
4.5
Sitejabber
4.4

Related Tags