How to input a CSV and get min and max values in Python - python-3.x

Read in the same hsv_2020_climo_data.csv file into Pandas DataFrame with the Date column as the index (similar to examples before).
Answer the following questions with formatted print statements.
What are the data types of each column? (e.g. 'MaxTemperature' dtype is int64)
What is the highest maximum temperature for the entire year?
What is the lowest minimum temperature for the entire year?
How much total rain did we get for 2020? (Hint: you will need to handle the "Trace" values first.)
Make a plot with Maximum Temperature, Minimum Temperature, and Average Temperature with labels and title.
Data below:
Date MaxTemperature MinTemperature AvgTemperature Precipitation Snowfall SnowDepth
2020-01-01 53 31 42.0 0.00 0.0 0
2020-01-02 54 45 49.5 3.42 0.0 0
2020-01-03 59 53 56.0 0.32 0.0 0
2020-01-04 56 31 43.5 0.08 0.0 0
2020-01-05 55 29 42.0 0.00 0.0 0
2020-01-06 60 35 47.5 0.03 0.0 0
2020-01-07 55 35 45.0 T 0.0 0
2020-01-08 61 30 45.5 0.00 0.0 0
2020-01-09 60 36 48.0 0.00 0.0 0
2020-01-10 65 58 61.5 T 0.0 0
2020-01-11 72 50 61.0 1.08 0.0 0
2020-01-12 59 46 52.5 0.00 0.0 0
2020-01-13 62 43 52.5 0.08 0.0 0
2020-01-14 64 60 62.0 0.96 0.0 0
2020-01-15 67 60 63.5 0.61 0.0 0
2020-01-16 60 42 51.0 0.03 0.0 0
2020-01-17 59 40 49.5 0.00 0.0 0
2020-01-18 57 45 51.0 0.17 0.0 0
2020-01-19 45 27 36.0 0.00 0.0 0
2020-01-20 28 22 25.0 T T 0
2020-01-21 37 23 30.0 0.00 0.0 0
2020-01-22 46 20 33.0 0.00 0.0 0
2020-01-23 47 39 43.0 0.72 0.0 0
2020-01-24 54 40 47.0 0.19 0.0 0
2020-01-25 41 33 37.0 T 0.0 0
2020-01-26 49 29 39.0 0.03 0.0 0
2020-01-27 58 37 47.5 0.01 0.0 0
2020-01-28 60 31 45.5 0.00 0.0 0
2020-01-29 50 44 47.0 0.03 0.0 0
2020-01-30 60 37 48.5 0.00 0.0 0
2020-01-31 52 45 48.5 T 0.0 0
2020-02-01 50 37 43.5 0.06 0.0 0
2020-02-02 68 31 49.5 0.00 0.0 0
2020-02-03 71 40 55.5 T 0.0 0
2020-02-04 67 55 61.0 0.18 0.0 0
2020-02-05 68 62 65.0 1.16 0.0 0
2020-02-06 64 36 50.0 1.49 0.0 0
2020-02-07 41 33 37.0 T T 0
2020-02-08 53 32 42.5 0.10 T 0
2020-02-09 61 33 47.0 T 0.0 0
2020-02-10 57 49 53.0 1.78 0.0 0
2020-02-11 66 45 55.5 1.12 0.0 0
2020-02-12 70 44 57.0 1.31 0.0 0
2020-02-13 60 36 48.0 0.38 0.0 0
2020-02-14 41 26 33.5 0.00 0.0 0
2020-02-15 54 22 38.0 0.00 0.0 0
2020-02-16 58 42 50.0 0.00 0.0 0
2020-02-17 56 38 47.0 0.00 0.0 0
2020-02-18 62 46 54.0 1.32 0.0 0
2020-02-19 52 43 47.5 T 0.0 0
2020-02-20 47 33 40.0 0.85 T 0
2020-02-21 44 26 35.0 0.00 0.0 0
2020-02-22 56 24 40.0 0.00 0.0 0
2020-02-23 53 34 43.5 0.01 0.0 0
2020-02-24 55 44 49.5 0.62 0.0 0
2020-02-25 62 45 53.5 0.00 0.0 0
2020-02-26 48 36 42.0 0.04 T 0
2020-02-27 46 31 38.5 T T 0
2020-02-28 51 34 42.5 T 0.0 0
2020-02-29 55 36 45.5 0.00 0.0 0
2020-03-01 66 34 50.0 0.00 0.0 0
2020-03-02 60 52 56.0 0.44 0.0 0
2020-03-03 69 55 62.0 0.33 0.0 0
2020-03-04 60 51 55.5 0.04 0.0 0
2020-03-05 59 42 50.5 0.15 0.0 0
2020-03-06 56 37 46.5 0.00 0.0 0
2020-03-07 58 30 44.0 0.00 0.0 0
2020-03-08 65 35 50.0 0.00 0.0 0
2020-03-09 68 45 56.5 T 0.0 0
2020-03-10 70 56 63.0 0.27 0.0 0
2020-03-11 66 51 58.5 0.12 0.0 0
2020-03-12 76 57 66.5 0.26 0.0 0
2020-03-13 67 54 60.5 0.14 0.0 0
2020-03-14 74 50 62.0 0.56 0.0 0
2020-03-15 61 44 52.5 1.07 0.0 0
2020-03-16 59 44 51.5 0.02 0.0 0
2020-03-17 74 53 63.5 0.24 0.0 0
2020-03-18 77 52 64.5 0.00 0.0 0
2020-03-19 78 64 71.0 T 0.0 0
2020-03-20 72 60 66.0 1.11 0.0 0
2020-03-21 59 43 51.0 0.01 0.0 0
2020-03-22 66 43 54.5 0.07 0.0 0
2020-03-23 64 57 60.5 1.80 0.0 0
2020-03-24 76 57 66.5 2.96 0.0 0
2020-03-25 66 51 58.5 0.00 0.0 0
2020-03-26 81 47 64.0 0.00 0.0 0
2020-03-27 85 63 74.0 0.00 0.0 0
2020-03-28 82 66 74.0 0.00 0.0 0
2020-03-29 75 53 64.0 0.41 0.0 0
2020-03-30 70 52 61.0 0.02 0.0 0
2020-03-31 56 43 49.5 0.65 0.0 0
2020-04-01 62 39 50.5 0.00 0.0 0
2020-04-02 70 38 54.0 0.00 0.0 0
2020-04-03 75 42 58.5 0.00 0.0 0
2020-04-04 78 54 66.0 0.00 0.0 0
2020-04-05 81 54 67.5 0.00 0.0 0
2020-04-06 83 52 67.5 0.00 0.0 0
2020-04-07 74 62 68.0 0.00 0.0 0
2020-04-08 80 63 71.5 0.24 0.0 0
2020-04-09 71 57 64.0 0.32 0.0 0
2020-04-10 60 40 50.0 0.00 0.0 0
2020-04-11 71 37 54.0 0.00 0.0 0
2020-04-12 66 54 60.0 3.02 0.0 0
2020-04-13 66 42 54.0 T 0.0 0
2020-04-14 59 39 49.0 0.00 0.0 0
2020-04-15 61 34 47.5 0.00 0.0 0
2020-04-16 69 36 52.5 0.00 0.0 0
2020-04-17 76 45 60.5 0.07 0.0 0
2020-04-18 62 45 53.5 0.21 0.0 0
2020-04-19 63 46 54.5 1.41 0.0 0
2020-04-20 72 51 61.5 0.11 0.0 0
2020-04-21 76 50 63.0 0.00 0.0 0
2020-04-22 68 42 55.0 0.29 0.0 0
2020-04-23 70 54 62.0 0.92 0.0 0
2020-04-24 73 56 64.5 0.01 0.0 0
2020-04-25 74 53 63.5 0.21 0.0 0
2020-04-26 61 41 51.0 0.00 0.0 0
2020-04-27 72 38 55.0 0.00 0.0 0
2020-04-28 77 53 65.0 T 0.0 0
2020-04-29 72 53 62.5 0.13 0.0 0
2020-04-30 71 48 59.5 T 0.0 0
2020-05-01 76 43 59.5 0.00 0.0 0
2020-05-02 83 51 67.0 0.00 0.0 0
2020-05-03 85 58 71.5 0.00 0.0 0
2020-05-04 84 60 72.0 T 0.0 0
2020-05-05 76 56 66.0 T 0.0 0
2020-05-06 65 44 54.5 0.00 0.0 0
2020-05-07 71 39 55.0 0.00 0.0 0
2020-05-08 61 48 54.5 0.86 0.0 0
2020-05-09 64 40 52.0 0.00 0.0 0
2020-05-10 73 39 56.0 0.00 0.0 0
2020-05-11 68 43 55.5 0.00 0.0 0
2020-05-12 66 48 57.0 T 0.0 0
2020-05-13 79 55 67.0 0.05 0.0 0
2020-05-14 85 61 73.0 0.00 0.0 0
2020-05-15 84 67 75.5 0.00 0.0 0
2020-05-16 86 62 74.0 0.00 0.0 0
2020-05-17 79 65 72.0 0.23 0.0 0
2020-05-18 82 60 71.0 T 0.0 0
2020-05-19 71 54 62.5 T 0.0 0
2020-05-20 77 54 65.5 0.25 0.0 0
2020-05-21 79 57 68.0 0.00 0.0 0
2020-05-22 82 63 72.5 1.52 0.0 0
2020-05-23 86 64 75.0 0.28 0.0 0
2020-05-24 88 65 76.5 T 0.0 0
2020-05-25 88 67 77.5 0.00 0.0 0
2020-05-26 76 67 71.5 0.31 0.0 0
2020-05-27 79 61 70.0 0.74 0.0 0
2020-05-28 83 62 72.5 0.21 0.0 0
2020-05-29 83 64 73.5 0.18 0.0 0
2020-05-30 84 63 73.5 0.00 0.0 0
2020-05-31 83 59 71.0 0.00 0.0 0
2020-06-01 85 53 69.0 0.00 0.0 0
2020-06-02 89 67 78.0 0.00 0.0 0
2020-06-03 88 71 79.5 0.06 0.0 0
2020-06-04 87 68 77.5 T 0.0 0
2020-06-05 90 69 79.5 0.41 0.0 0
2020-06-06 91 68 79.5 0.00 0.0 0
2020-06-07 91 71 81.0 0.00 0.0 0
2020-06-08 84 75 79.5 0.43 0.0 0
2020-06-09 87 75 81.0 0.11 0.0 0
2020-06-10 92 65 78.5 0.00 0.0 0
2020-06-11 85 61 73.0 0.00 0.0 0
2020-06-12 88 61 74.5 0.00 0.0 0
2020-06-13 90 58 74.0 0.00 0.0 0
2020-06-14 92 62 77.0 0.00 0.0 0
2020-06-15 83 61 72.0 0.00 0.0 0
2020-06-16 81 60 70.5 0.00 0.0 0
2020-06-17 80 63 71.5 0.00 0.0 0
2020-06-18 85 61 73.0 0.00 0.0 0
2020-06-19 91 64 77.5 0.00 0.0 0
2020-06-20 94 66 80.0 0.00 0.0 0
2020-06-21 88 69 78.5 0.14 0.0 0
2020-06-22 88 68 78.0 0.14 0.0 0
2020-06-23 87 70 78.5 0.25 0.0 0
2020-06-24 75 68 71.5 0.70 0.0 0
2020-06-25 85 70 77.5 0.00 0.0 0
2020-06-26 75 68 71.5 0.15 0.0 0
2020-06-27 82 68 75.0 0.30 0.0 0
2020-06-28 90 73 81.5 0.25 0.0 0
2020-06-29 90 71 80.5 0.01 0.0 0
2020-06-30 88 70 79.0 0.84 0.0 0
2020-07-01 82 68 75.0 0.73 0.0 0
2020-07-02 89 68 78.5 0.00 0.0 0
2020-07-03 94 71 82.5 0.00 0.0 0
2020-07-04 92 71 81.5 0.00 0.0 0
2020-07-05 93 71 82.0 0.47 0.0 0
2020-07-06 88 71 79.5 0.00 0.0 0
2020-07-07 90 73 81.5 0.07 0.0 0
2020-07-08 87 73 80.0 T 0.0 0
2020-07-09 90 71 80.5 0.00 0.0 0
2020-07-10 92 73 82.5 0.00 0.0 0
2020-07-11 92 67 79.5 0.00 0.0 0
2020-07-12 82 68 75.0 1.38 0.0 0
2020-07-13 89 69 79.0 0.00 0.0 0
2020-07-14 91 70 80.5 0.00 0.0 0
2020-07-15 93 70 81.5 0.00 0.0 0
2020-07-16 91 70 80.5 0.00 0.0 0
2020-07-17 94 73 83.5 0.00 0.0 0
2020-07-18 95 73 84.0 0.00 0.0 0
2020-07-19 95 73 84.0 0.00 0.0 0
2020-07-20 95 73 84.0 0.00 0.0 0
2020-07-21 94 74 84.0 T 0.0 0
2020-07-22 92 73 82.5 0.19 0.0 0
2020-07-23 92 71 81.5 0.00 0.0 0
2020-07-24 90 73 81.5 0.00 0.0 0
2020-07-25 94 72 83.0 0.07 0.0 0
2020-07-26 94 71 82.5 0.00 0.0 0
2020-07-27 91 73 82.0 T 0.0 0
2020-07-28 90 72 81.0 T 0.0 0
2020-07-29 92 73 82.5 0.02 0.0 0
2020-07-30 90 74 82.0 0.14 0.0 0
2020-07-31 92 74 83.0 0.25 0.0 0
2020-08-01 87 70 78.5 T 0.0 0
2020-08-02 86 66 76.0 0.00 0.0 0
2020-08-03 91 67 79.0 0.00 0.0 0
2020-08-04 90 70 80.0 0.01 0.0 0
2020-08-05 92 68 80.0 0.00 0.0 0
2020-08-06 92 71 81.5 0.00 0.0 0
2020-08-07 94 69 81.5 0.00 0.0 0
2020-08-08 97 68 82.5 0.00 0.0 0
2020-08-09 96 71 83.5 0.00 0.0 0
2020-08-10 98 74 86.0 0.00 0.0 0
2020-08-11 95 73 84.0 0.49 0.0 0
2020-08-12 93 74 83.5 0.01 0.0 0
2020-08-13 94 71 82.5 T 0.0 0
2020-08-14 90 74 82.0 T 0.0 0
2020-08-15 92 71 81.5 0.00 0.0 0
2020-08-16 93 67 80.0 T 0.0 0
2020-08-17 91 67 79.0 0.00 0.0 0
2020-08-18 93 64 78.5 0.24 0.0 0
2020-08-19 91 68 79.5 1.24 0.0 0
2020-08-20 87 67 77.0 T 0.0 0
2020-08-21 82 68 75.0 0.10 0.0 0
2020-08-22 85 64 74.5 0.00 0.0 0
2020-08-23 88 68 78.0 0.00 0.0 0
2020-08-24 88 72 80.0 T 0.0 0
2020-08-25 82 72 77.0 0.15 0.0 0
2020-08-26 85 70 77.5 1.83 0.0 0
2020-08-27 91 75 83.0 0.22 0.0 0
2020-08-28 86 72 79.0 0.92 0.0 0
2020-08-29 90 74 82.0 0.02 0.0 0
2020-08-30 91 71 81.0 0.23 0.0 0
2020-08-31 87 71 79.0 0.94 0.0 0
2020-09-01 89 71 80.0 0.05 0.0 0
2020-09-02 89 74 81.5 0.00 0.0 0
2020-09-03 89 73 81.0 0.00 0.0 0
2020-09-04 90 67 78.5 T 0.0 0
2020-09-05 88 59 73.5 0.00 0.0 0
2020-09-06 86 57 71.5 0.00 0.0 0
2020-09-07 86 60 73.0 0.00 0.0 0
2020-09-08 87 64 75.5 0.00 0.0 0
2020-09-09 88 65 76.5 0.00 0.0 0
2020-09-10 90 66 78.0 0.00 0.0 0
2020-09-11 93 68 80.5 0.00 0.0 0
2020-09-12 90 73 81.5 0.01 0.0 0
2020-09-13 91 71 81.0 T 0.0 0
2020-09-14 90 69 79.5 0.00 0.0 0
2020-09-15 83 69 76.0 0.00 0.0 0
2020-09-16 74 68 71.0 0.12 0.0 0
2020-09-17 87 70 78.5 0.00 0.0 0
2020-09-18 79 61 70.0 0.00 0.0 0
2020-09-19 76 59 67.5 0.00 0.0 0
2020-09-20 81 58 69.5 0.00 0.0 0
2020-09-21 76 53 64.5 0.00 0.0 0
2020-09-22 75 51 63.0 T 0.0 0
2020-09-23 71 53 62.0 0.79 0.0 0
2020-09-24 66 55 60.5 2.65 0.0 0
2020-09-25 73 64 68.5 T 0.0 0
2020-09-26 76 62 69.0 0.00 0.0 0
2020-09-27 83 61 72.0 0.00 0.0 0
2020-09-28 82 56 69.0 0.42 0.0 0
2020-09-29 70 49 59.5 0.00 0.0 0
2020-09-30 77 47 62.0 0.00 0.0 0
2020-10-01 76 51 63.5 0.00 0.0 0
2020-10-02 69 44 56.5 0.00 0.0 0
2020-10-03 71 40 55.5 0.00 0.0 0
2020-10-04 76 50 63.0 0.00 0.0 0
2020-10-05 76 48 62.0 0.00 0.0 0
2020-10-06 80 48 64.0 0.00 0.0 0
2020-10-07 82 52 67.0 0.00 0.0 0
2020-10-08 82 49 65.5 0.00 0.0 0
2020-10-09 73 63 68.0 0.47 0.0 0
2020-10-10 74 64 69.0 1.35 0.0 0
2020-10-11 75 68 71.5 0.23 0.0 0
2020-10-12 80 64 72.0 0.00 0.0 0
2020-10-13 76 52 64.0 0.00 0.0 0
2020-10-14 82 45 63.5 0.00 0.0 0
2020-10-15 80 54 67.0 0.00 0.0 0
2020-10-16 66 39 52.5 0.01 0.0 0
2020-10-17 68 37 52.5 0.00 0.0 0
2020-10-18 76 50 63.0 0.00 0.0 0
2020-10-19 80 56 68.0 0.00 0.0 0
2020-10-20 81 59 70.0 0.00 0.0 0
2020-10-21 81 58 69.5 0.00 0.0 0
2020-10-22 83 62 72.5 0.00 0.0 0
2020-10-23 83 63 73.0 0.03 0.0 0
2020-10-24 66 55 60.5 0.44 0.0 0
2020-10-25 69 55 62.0 0.00 0.0 0
2020-10-26 75 58 66.5 0.00 0.0 0
2020-10-27 75 58 66.5 T 0.0 0
2020-10-28 74 69 71.5 2.87 0.0 0
2020-10-29 72 48 60.0 0.58 0.0 0
2020-10-30 57 42 49.5 0.00 0.0 0
2020-10-31 68 40 54.0 0.00 0.0 0
2020-11-01 68 43 55.5 0.00 0.0 0
2020-11-02 57 33 45.0 0.00 0.0 0
2020-11-03 66 34 50.0 0.00 0.0 0
2020-11-04 71 39 55.0 0.00 0.0 0
2020-11-05 70 44 57.0 0.00 0.0 0
2020-11-06 76 46 61.0 0.00 0.0 0
2020-11-07 75 48 61.5 T 0.0 0
2020-11-08 79 59 69.0 0.00 0.0 0
2020-11-09 78 62 70.0 0.00 0.0 0
2020-11-10 74 65 69.5 T 0.0 0
2020-11-11 77 58 67.5 0.04 0.0 0
2020-11-12 68 44 56.0 0.00 0.0 0
2020-11-13 71 42 56.5 0.00 0.0 0
2020-11-14 73 41 57.0 0.00 0.0 0
2020-11-15 69 40 54.5 0.02 0.0 0
2020-11-16 63 32 47.5 0.00 0.0 0
2020-11-17 62 34 48.0 0.00 0.0 0
2020-11-18 64 31 47.5 0.00 0.0 0
2020-11-19 66 38 52.0 0.00 0.0 0
2020-11-20 72 40 56.0 0.00 0.0 0
2020-11-21 73 42 57.5 0.00 0.0 0
2020-11-22 69 46 57.5 0.02 0.0 0
2020-11-23 57 35 46.0 0.00 0.0 0
2020-11-24 65 32 48.5 0.00 0.0 0
2020-11-25 65 53 59.0 0.48 0.0 0
2020-11-26 62 40 51.0 0.00 0.0 0
2020-11-27 66 38 52.0 0.58 0.0 0
2020-11-28 57 41 49.0 T 0.0 0
2020-11-29 55 39 47.0 0.73 0.0 0
2020-11-30 44 30 37.0 0.08 T 0
2020-12-01 41 25 33.0 0.00 0.0 0
2020-12-02 52 20 36.0 0.00 0.0 0
2020-12-03 58 25 41.5 0.16 0.0 0
2020-12-04 48 35 41.5 0.82 0.0 0
2020-12-05 56 28 42.0 0.00 0.0 0
2020-12-06 59 30 44.5 T 0.0 0
2020-12-07 47 28 37.5 0.00 0.0 0
2020-12-08 49 25 37.0 0.00 0.0 0
2020-12-09 64 28 46.0 0.00 0.0 0
2020-12-10 71 35 53.0 0.00 0.0 0
2020-12-11 66 37 51.5 0.00 0.0 0
2020-12-12 63 46 54.5 0.30 0.0 0
2020-12-13 60 34 47.0 0.80 0.0 0
2020-12-14 44 35 39.5 0.81 0.0 0
2020-12-15 48 30 39.0 T 0.0 0
2020-12-16 50 35 42.5 0.26 0.0 0
2020-12-17 43 26 34.5 0.00 0.0 0
2020-12-18 50 23 36.5 0.00 0.0 0
2020-12-19 53 27 40.0 0.03 0.0 0
2020-12-20 51 40 45.5 0.16 0.0 0
2020-12-21 61 38 49.5 0.00 0.0 0
2020-12-22 60 31 45.5 0.00 0.0 0
2020-12-23 61 35 48.0 0.12 0.0 0
2020-12-24 52 28 40.0 1.13 T 0
2020-12-25 32 20 26.0 0.00 0.0 0
2020-12-26 50 18 34.0 0.00 0.0 0
2020-12-27 60 26 43.0 0.00 0.0 0
2020-12-28 57 37 47.0 0.00 0.0 0
2020-12-29 61 33 47.0 0.00 0.0 0
2020-12-30 69 41 55.0 0.00 0.0 0
2020-12-31 59 50 54.5 0.03 0.0 0

First of all, notice that some columns contain the value "T" (a trace amount). To solve some of the questions, you'll have to replace those first:
import pandas as pd

df = pd.read_csv(r"C:\users\....\DATA.csv", sep=";")
df.replace('T', 0, inplace=True)
To get the datatypes:
df.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 366 entries, 0 to 365
Data columns (total 7 columns):
# Column Non-Null Count Dtype
--- ------ -------------- -----
0 Date 366 non-null object
1 MaxTemperature 366 non-null int64
2 MinTemperature 366 non-null int64
3 AvgTemperature 366 non-null float64
4 Precipitation 366 non-null object
5 Snowfall 366 non-null object
6 SnowDepth 366 non-null int64
dtypes: float64(1), int64(3), object(3)
memory usage: 20.1+ KB
To get all the information you asked for, you need to convert the object columns to float. They are strings because the columns contained "T" values instead of numbers:
df['Precipitation'] = df.Precipitation.astype(float)
df['Snowfall'] = df.Snowfall.astype(float)
df['SnowDepth'] = df.SnowDepth.astype(float)
Note now that
df.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 366 entries, 0 to 365
Data columns (total 7 columns):
# Column Non-Null Count Dtype
--- ------ -------------- -----
0 Date 366 non-null object
1 MaxTemperature 366 non-null int64
2 MinTemperature 366 non-null int64
3 AvgTemperature 366 non-null float64
4 Precipitation 366 non-null float64
5 Snowfall 366 non-null float64
6 SnowDepth 366 non-null float64
dtypes: float64(4), int64(2), object(1)
memory usage: 20.1+ KB
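As an aside, an equivalent and arguably more robust cleanup (a sketch, not part of the original answer's approach) is pd.to_numeric with errors='coerce', which turns anything non-numeric, such as the "T" values, into NaN in a single step:
# Coerce non-numeric entries (like "T") to NaN, then count traces as 0.
for col in ['Precipitation', 'Snowfall', 'SnowDepth']:
    df[col] = pd.to_numeric(df[col], errors='coerce').fillna(0.0)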
Now to answer all the questions:
df.describe()
returns:
       MaxTemperature  MinTemperature  AvgTemperature  Precipitation  Snowfall  SnowDepth
count      366.000000      366.000000      366.000000     366.000000     366.0      366.0
mean        73.021858       52.229508       62.625683       0.192678       0.0        0.0
std         14.496086       14.966696       14.317651       0.471406       0.0        0.0
min         28.000000       18.000000       25.000000       0.000000       0.0        0.0
25%         61.250000       40.000000       51.000000       0.000000       0.0        0.0
50%         74.000000       53.000000       63.250000       0.000000       0.0        0.0
75%         86.000000       67.000000       75.500000       0.140000       0.0        0.0
max         98.000000       75.000000       86.000000       3.420000       0.0        0.0
This gives you the max, min, and the other summary statistics for all variables.
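For the specific questions, the formatted print statements could look like this (a minimal sketch, assuming the column names above and temperatures in degrees Fahrenheit):
# Highest max, lowest min, and total precipitation for the year.
print(f"Highest maximum temperature: {df['MaxTemperature'].max()} F")
print(f"Lowest minimum temperature: {df['MinTemperature'].min()} F")
print(f"Total rain for 2020: {df['Precipitation'].sum():.2f} in")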
As for the plot:
lines = df.plot.line()
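That one-liner plots every column, though. A minimal sketch that meets the labels-and-title requirement (assuming matplotlib is installed; if Date was set as the index, as the exercise asks, it appears on the x-axis automatically):
import matplotlib.pyplot as plt

# Plot only the three temperature columns, then label the axes.
ax = df[['MaxTemperature', 'MinTemperature', 'AvgTemperature']].plot.line()
ax.set_xlabel('Date')
ax.set_ylabel('Temperature (F)')
ax.set_title('HSV 2020 Daily Temperatures')
plt.show()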

Related

Creating a dataframe with multi-index columns from a dictionary

Suppose I have the following:
a data frame with multi-index columns
a dictionary that's been created from the data frame with multi-index columns in order to efficiently manipulate it.
import numpy as np
import pandas as pd

df = pd.DataFrame(
    [[100, 90, 80, 70, 36, 45],
     [101, 78, 65, 88, 55, 78],
     [92, 77, 42, 79, 43, 32],
     [103, 98, 76, 54, 45, 65]],
    index=pd.date_range(start='2022-01-01', periods=4)
)
df.columns = pd.MultiIndex.from_tuples(
    (("mkf", "Open"),
     ("mkf", "Close"),
     ("tdf", "Open"),
     ("tdf", "Close"),
     ("ghi", "Open"),
     ("ghi", "Close"))
)
df
           mkf        tdf        ghi
           Open Close Open Close Open Close
2022-01-01  100    90   80    70   36    45
2022-01-02  101    78   65    88   55    78
2022-01-03   92    77   42    79   43    32
2022-01-04  103    98   76    54   45    65
df_dict = {c:df[c].assign(r=np.log(df[(c, 'Close')]).diff()) for c in df.columns.levels[0]}
df_dict
{'ghi': Open Close r
2022-01-01 36 45 NaN
2022-01-02 55 78 0.550046
2022-01-03 43 32 -0.890973
2022-01-04 45 65 0.708651,
'mkf': Open Close r
2022-01-01 100 90 NaN
2022-01-02 101 78 -0.143101
2022-01-03 92 77 -0.012903
2022-01-04 103 98 0.241162,
'tdf': Open Close r
2022-01-01 80 70 NaN
2022-01-02 65 88 0.228842
2022-01-03 42 79 -0.107889
2022-01-04 76 54 -0.380464}
What is the best way to transform the dictionary back to a data frame in its original form (i.e. with multi-index columns)?
mkf tdf ghi
Open Close r Open Close r Open Close r
2022-01-01 100 90 NaN 80 70 NaN 36 45 NaN
2022-01-02 101 78 0.55 65 88 -0.14 55 78 0.23
2022-01-03 92 77 -0.89 42 79 -0.12 43 32 -0.10
2022-01-04 103 98 0.71 76 54 0.24 45 65 -0.38
pd.concat([df_dict[c] for c in df_dict.keys()], axis=1,
          keys=df_dict.keys())
ghi mkf tdf
Open Close r Open Close r Open Close r
2022-01-01 36 45 NaN 100 90 NaN 80 70 NaN
2022-01-02 55 78 0.550046 101 78 -0.143101 65 88 0.228842
2022-01-03 43 32 -0.890973 92 77 -0.012903 42 79 -0.107889
2022-01-04 45 65 0.708651 103 98 0.241162 76 54 -0.380464
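Note that the groups come back in alphabetical order (ghi, mkf, tdf) because df.columns.levels[0] — which the dictionary comprehension iterated over — is kept sorted by pandas, and the dictionary preserves that order. If the original top-level order matters, pass it explicitly (a small sketch using the order from the question):
order = ['mkf', 'tdf', 'ghi']  # original top-level column order
pd.concat([df_dict[k] for k in order], axis=1, keys=order)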

Missing values in .dat file(empty) leading to error while reading the file

I have a .dat file which I tried to do analysis upon. This is the code
from astropy.io import ascii

catalog = ascii.read("table6.dat", Reader=ascii.NoHeader, guess=False,
                     fast_reader=False, delimiter='\s')
The problem is that there are missing (empty) values within the file, which does not allow me to do analysis on the data.
Output:
astropy.io.ascii.core.InconsistentTableError: Number of header columns (23) inconsistent with data columns (24) at data line 3
Changing the delimiter from '\s' to '\n' gives me this:
col1
-------------------------------------------------------------------------------------------------------------------------------------
1 33 Psc 28 00 05 20.1 -05 42 27 93.73 -65.93 111 -6.6 -13 89 (44) -3 45 -101 -16.7 37.4 24.6
2 ADS 48A 38 00 05 41.2 45 48 35 114.64 -16.32 11 -9.0 886 -207 (737) -4 10 -3 -33.6 -31.1 -15.4
3 5 Cet 352 00 08 12.0 -02 26 52 98.32 -63.23 140 -0.4 6 -4 (77) -9 62 -125 -2.1 -4.1 -1.4
4 BD Cet 1833 00 22 46.7 -09 13 49 100.84 -70.86 71 -4.8 3 -51 (409) -4 23 -67 8.1 -15.9 -0.9
5 13 Cet A 3196 00 35 14.8 -03 35 34 112.87 -66.15 21 10.6 410 -21 (409) -3 8 -19 -36.0 -19.3 -12.7
6 FF And 00 42 47.3 35 32 50 120.95 -27.29 24 -0.5 250 90 (380) -11 18 -11 -26.3 -11.6 8.6
7 zeta And 4502 00 47 20.3 24 16 02 121.73 -38.60 31 -23.7 -100 -83 (737) -13 21 -19 26.5 -14.0 5.2
8 CF Tuc 5303 00 52 58.3 -74 39 07 302.81 -42.48 54 0.5 19 28 (409) 22 -33 -36 -6.6 1.0 -5.5
9 BD+25 161 6286 01 04 07.1 26 35 13 126.44 -36.20 55 -20.0 -12 -18 (737) -26 36 -32 13.7 -13.5 7.7
10 AY Cet 7672 01 16 36.2 -02 30 01 137.72 -64.65 67 -30.1 -108 -59 (409) -21 19 -60 46.6 -2.7 15.6
...
196 IM Peg 216489 22 53 02.3 16 50 28 86.36 -37.48 50 -12.8 -19 -24 (737) 3 40 -30 6.3 -11.9 6.0
197 AZ Psc 217188 22 58 52.7 00 18 58 73.71 -51.46 260 -20.5 39 16 (409) 45 156 -203 -54.2 -12.3 5.5
198 TZ PsA 217344 23 00 27.7 -33 44 34 10.64 -65.25 46 36.9 -44 -132 (409) 19 4 -42 32.1 -21.4 -28.2
199 KU Peg 218153 23 05 29.3 26 00 33 95.03 -31.05 950 -80.4 51 -9 (737) -71 811 -490 -171.4 -159.1 -78.5
200 KZ And 218738 23 09 57.4 47 57 30 105.90 -11.53 23 -6.9 157 -5 (737) -6 22 -5 -12.7 -12.2 -5.5
201 RT And 23 11 10.0 53 01 33 108.06 -6.92 95 20.0 -12 -18 (737) -29 90 -11 1.5 20.8 -7.9
202 SZ Psc 219113 23 13 23.8 02 40 32 80.66 -51.96 125 12.0 12 29 (737) 13 76 -98 -13.5 17.2 -3.5
203 EZ Peg 23 16 53.4 25 43 09 97.58 -32.45 83 -27.2 -70 13 (409) -9 69 -45 24.8 -10.9 28.1
204 lambda And 222107 23 37 33.9 46 27 29 109.90 -14.53 23 6.8 162 -421 (737) -8 21 -6 -1.8 -6.7 -49.2
205 KT Peg 222317 23 39 31.0 28 14 47 104.22 -32.00 25 -3.1 299 226 (737) -5 21 -13 -41.9 -6.0 13.8
206 II Peg 224085 23 55 04.0 28 38 01 108.22 -32.62 29 -18.1 574 27 (737) -8 24 -16 -66.5 -48.1 -3.8
but the header cannot be separately allocated to the columns.
There are missing values in rows 6, 201, and 203 in the third column (of the values shown).
The problem could be solved if placeholder values could be given to these missing, empty fields.
I can't find any documentation relating to this...
The problem is that there is fundamentally no way for the table parser to unambiguously know where the column boundaries are for your data file. Your table data are in fixed-width format, meaning that each column lives within certain character bounds in each line. You need to specify those bounds in some way.
This is documented here with examples:
https://docs.astropy.org/en/latest/io/ascii/fixed_width_gallery.html#fixed-width-gallery
If you can modify the file, the easiest way is to add a header line which tells the parser what the column boundaries are. For example:
Col1      Col2 Col3 Col4
---- --------- ---- ----
 1.2   "hello"    1    a
 2.4 's worlds    2    2
If you cannot modify the file itself, then you can explicitly specify the column starts and stops, as shown in the second example in this section: https://docs.astropy.org/en/latest/io/ascii/fixed_width_gallery.html#fixedwidthnoheader
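A minimal sketch of that explicit approach (format='fixed_width_no_header', col_starts, and col_ends are real astropy options, but the boundary positions below are hypothetical and must be adjusted to the actual character columns of table6.dat):
from astropy.io import ascii

# Hypothetical boundaries for the first four columns; empty fields
# inside a column's bounds typically come back as masked values.
catalog = ascii.read("table6.dat",
                     format='fixed_width_no_header',
                     col_starts=(0, 4, 14, 23),
                     col_ends=(3, 13, 22, 34))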

Excel INDEX MATCH column format into row-column format

I have a huge dataset with ID, MEAN, and DATE (in day of the year). I'm subsetting it below just as an example:
OBJECTID MEAN DATE
1 0.960337524 27
2 1.024530873 27
3 1.07565201 27
4 1.32608937 27
5 1.115863256 27
6 0.738648832 27
7 1.209547088 27
8 1.190287749 27
1 1.311272704 43
2 1.421150386 43
3 1.341622942 43
4 1.343600738 43
5 1.322288454 43
6 1.057037145 43
7 1.262514248 43
8 1.148541133 43
1 1.141311572 75
2 1.12654984 75
3 1.125632558 75
4 1.128487158 75
5 1.181200445 75
6 0.820567181 75
7 0.973662794 75
8 0.903646102 75
In this example, the first date is DAY 27 (Jan 27th). I want to reformat this in Excel so DATE is the horizontal header and ID is the first vertical column like this:
OBJECTID 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 65 66 67 68 69 70 71 72 73 74 75 76 77 78 79 80 81 82 83 84 85 86 87 88 89 90 91 92 93 94 95 96 97 98 99 100 101 102 103 104 105 106 107 108 109 110 111 112 113 114 115 116 117 118 119 120 121 122 123 124 125 126 127 128 129 130 131 132 133 134 135 136 137 138 139 140 141 142 143 144 145 146 147 148 149 150 151 152 153 154 155 156 157 158 159 160 161 162 163 164 165 166 167 168 169 170 171 172 173 174 175 176 177 178 179 180 181 182 183 184 185 186 187 188 189 190 191 192 193 194 195 196 197 198 199 200 201 202 203 204 205 206 207 208 209 210 211 212 213 214 215 216 217 218 219 220 221 222 223 224 225 226 227 228 229 230 231 232 233 234 235 236 237 238 239 240 241 242 243 244 245 246 247 248 249 250 251 252 253 254 255 256 257 258 259 260 261 262 263 264 265 266 267 268 269 270 271 272 273 274 275 276 277 278 279 280 281 282 283 284 285 286 287 288 289 290 291 292 293 294 295 296 297 298 299 300 301 302 303 304 305 306 307 308 309 310 311 312 313 314 315 316 317 318 319 320 321 322 323 324 325 326 327 328 329 330 331 332 333 334 335 336 337 338 339 340 341 342 343 344 345 346 347 348 349 350 351 352 353 354 355 356 357 358 359 360 361 362 363 364 365
1
2
3
4
5
6
7
8
How do I use INDEX and MATCH to populate the cells in the blank table above with the values in the dataset? Not all dates in the table will have a value, so I need to populate those with zero, like this:
OBJECTID 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 65 66 67 68 69 70 71 72 73 74 75 76 77 78 79 80 81 82 83 84 85 86 87 88 89 90 91 92 93 94 95 96 97 98 99 100 101 102 103 104 105 106 107 108 109 110 111 112 113 114 115 116 117 118 119 120 121 122 123 124 125 126 127 128 129 130 131 132 133 134 135 136 137 138 139 140 141 142 143 144 145 146 147 148 149 150 151 152 153 154 155 156 157 158 159 160 161 162 163 164 165 166 167 168 169 170 171 172 173 174 175 176 177 178 179 180 181 182 183 184 185 186 187 188 189 190 191 192 193 194 195 196 197 198 199 200 201 202 203 204 205 206 207 208 209 210 211 212 213 214 215 216 217 218 219 220 221 222 223 224 225 226 227 228 229 230 231 232 233 234 235 236 237 238 239 240 241 242 243 244 245 246 247 248 249 250 251 252 253 254 255 256 257 258 259 260 261 262 263 264 265 266 267 268 269 270 271 272 273 274 275 276 277 278 279 280 281 282 283 284 285 286 287 288 289 290 291 292 293 294 295 296 297 298 299 300 301 302 303 304 305 306 307 308 309 310 311 312 313 314 315 316 317 318 319 320 321 322 323 324 325 326 327 328 329 330 331 332 333 334 335 336 337 338 339 340 341 342 343 344 345 346 347 348 349 350 351 352 353 354 355 356 357 358 359 360 361 362 363 364 365
1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0.960337524 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1.311272704 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1.141311572 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
2
3
4
5
6
7
8
This is what I came up with so far:
=IFERROR(INDEX($B$2:$B$2617, MATCH(0, COUNTIF($E2:E2,$B$2:$B$2617)+IF($A$2:$A$2617<>$E2, 1, 0), 0)), 0)
But it doesn't account for the dates with no values. It put the MEAN value for Day 27 into the cell for Day 1.
OBJECTID MEAN DATE OBJECTID 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 65 66 67 68 69 70 71 72 73 74 75 76 77 78 79 80 81 82 83 84 85 86 87 88 89 90 91 92 93 94 95 96 97 98 99 100 101 102 103 104 105 106 107 108 109 110 111 112 113 114 115 116 117 118 119 120 121 122 123 124 125 126 127 128 129 130 131 132 133 134 135 136 137 138 139 140 141 142 143 144 145 146 147 148 149 150 151 152 153 154 155 156 157 158 159 160 161 162 163 164 165 166 167 168 169 170 171 172 173 174 175 176 177 178 179 180 181 182 183 184 185 186 187 188 189 190 191 192 193 194 195 196 197 198 199 200 201 202 203 204 205 206 207 208 209 210 211 212 213 214 215 216 217 218 219 220 221 222 223 224 225 226 227 228 229 230 231 232 233 234 235 236 237 238 239 240 241 242 243 244 245 246 247 248 249 250 251 252 253 254 255 256 257 258 259 260 261 262 263 264 265 266 267 268 269 270 271 272 273 274 275 276 277 278 279 280 281 282 283 284 285 286 287 288 289 290 291 292 293 294 295 296 297 298 299 300 301 302 303 304 305 306 307 308 309 310 311 312 313 314 315 316 317 318 319 320 321 322 323 324 325 326 327 328 329 330 331 332 333 334 335 336 337 338 339 340 341 342 343 344 345 346 347 348 349 350 351 352 353 354 355 356 357 358 359 360 361 362 363 364 365
1 0.960337524 27 1 0.960337524
2 1.024530873 27 2
3 1.07565201 27 3
4 1.32608937 27 4
5 1.115863256 27 5
6 0.738648832 27 6
7 1.209547088 27 7
8 1.190287749 27 8
1 1.311272704 43
2 1.421150386 43
3 1.341622942 43
4 1.343600738 43
5 1.322288454 43
6 1.057037145 43
7 1.262514248 43
8 1.148541133 43
1 1.141311572 75
2 1.12654984 75
3 1.125632558 75
4 1.128487158 75
5 1.181200445 75
6 0.820567181 75
7 0.973662794 75
8 0.903646102 75
Any advice or push in the right direction would be appreciated!
Provided that there is only one MEAN in your data for each combination of ID and DATE, the formula below will do the job. (Input is the sheet with your data in it.)
=SUMIFS(Input!$B$2:$B$2617,Input!$A$2:$A$2617,$A2,Input!$C$2:$C$2617,B$1)
However, this formula makes the range management error prone. Luckily, most mistakes in the range settings will result in a formula error, but the process can be simplified considerably by separating range management from data extraction.
I assigned the name Data to the range Input!$A$2:$C$2617. In practice you would probably construct this range to be dynamic, so that it adjusts automatically to the number of data rows you have, but that is an extra benefit of properly managing ranges and outside the scope of your present question.
Using the range Data in the formula above, you arrive at this:
=SUMIFS(INDEX(Data,,2),INDEX(Data,,1),$A2,INDEX(Data,,3),B$1)
Either this formula or the one first introduced above can be copied to the right and down to cover your entire output table. If the zeros bother you (as they did me), you can suppress their display in the sheet or in individual cells, using the sheet settings or the cell format.
Alternatively, you can create another unique ID using the simple "&" (concatenation) operator. Add a fourth column which has the formula
=A2&"_"&C2
This assumes that your first entry is in row 2 and that the formula is for row 2 as well; you can drag it down from there. Once done, you can apply INDEX/MATCH on this new ID, as sketched below.
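For example, assuming the helper IDs were added as column D on the Input sheet (Input!$D$2:$D$2617 is a hypothetical range), the lookup for the output table could be:
=IFERROR(INDEX(Input!$B$2:$B$2617, MATCH($A2&"_"&B$1, Input!$D$2:$D$2617, 0)), 0)
The IFERROR wrapper fills dates with no matching record with zero, as required.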
Hope this helps clear up your query.

TypeError while looping through dictionary key and items

I have a dictionary, and when I try to loop through its keys and values, it throws an error:
TypeError: items() takes no arguments (1 given)
import numpy as np

BDT_param_grid1 = {"learning_rate": np.arange(0.1, 1.0, 0.1),
                   "n_estimators": np.arange(1, 1000, 10),
                   "base_estimator__min_samples_split": np.arange(0.1, 1.0, 0.1),
                   "base_estimator__min_samples_leaf": np.arange(1, 60, 1),
                   "base_estimator__max_leaf_nodes": np.arange(2, 60, 1),
                   "base_estimator__min_weight_fraction_leaf": np.arange(0.1, 0.4, 0.1),
                   "base_estimator__max_features": np.arange(0.1, 1, 0.1),
                   "base_estimator__max_depth": np.arange(1, 28, 1)}

for key, items in dict.items(BDT_param_grid1):
    print(key, items)
My expected result is:
learning_rate [0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9]
n_estimators [ 1 11 21 31 41 51 61 71 81 91 101 111 121 131 141 151 161 171
181 191 201 211 221 231 241 251 261 271 281 291 301 311 321 331 341 351
361 371 381 391 401 411 421 431 441 451 461 471 481 491 501 511 521 531
541 551 561 571 581 591 601 611 621 631 641 651 661 671 681 691 701 711
721 731 741 751 761 771 781 791 801 811 821 831 841 851 861 871 881 891
901 911 921 931 941 951 961 971 981 991]
base_estimator__min_samples_split [0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9]
base_estimator__min_samples_leaf [ 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24
25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48
49 50 51 52 53 54 55 56 57 58 59]
base_estimator__max_leaf_nodes [ 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25
26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49
50 51 52 53 54 55 56 57 58 59]
base_estimator__min_weight_fraction_leaf [0.1 0.2 0.3 0.4]
base_estimator__max_features [0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9]
base_estimator__max_depth [ 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24
25 26 27]
Quite strange, as I was able to get the result without any error earlier in the same code.
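A likely cause (an observation, since this exact message only appears under one condition): somewhere earlier in the code the name dict was rebound to a dictionary instance, so dict.items(BDT_param_grid1) now calls a bound method and passes BDT_param_grid1 as an unexpected extra argument. Iterating over the dictionary directly sidesteps the shadowed name:
# Call .items() on the dictionary itself instead of going
# through the (possibly shadowed) builtin name `dict`.
for key, value in BDT_param_grid1.items():
    print(key, value)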

Resizing .svg image using 'convert' function of ImageMagick suite produces empty images

I have created a quick .svg of the StackOverflow logo, with the following data in the .svg file itself:
<?xml version="1.0" standalone="no"?>
<!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 20010904//EN"
"http://www.w3.org/TR/2001/REC-SVG-20010904/DTD/svg10.dtd">
<svg version="1.0" xmlns="http://www.w3.org/2000/svg"
width="256.000000pt" height="256.000000pt" viewBox="0 0 256.000000 256.000000"
preserveAspectRatio="xMidYMid meet">
<metadata>
Created by potrace 1.14, written by Peter Selinger 2001-2017
</metadata>
<g transform="translate(0.000000,256.000000) scale(0.100000,-0.100000)"
fill="#000000" stroke="none">
<path d="M1095 2549 c-464 -62 -875 -402 -1029 -850 -57 -168 -78 -379 -58
-579 44 -421 317 -806 706 -995 187 -91 340 -125 566 -125 222 0 378 34 558
120 142 68 239 138 352 250 113 113 184 213 250 353 95 199 127 366 117 622
-8 221 -61 406 -167 584 -70 118 -118 177 -225 279 -178 170 -382 278 -618
326 -95 20 -356 28 -452 15z m695 -466 c0 -5 22 -135 49 -291 27 -155 46 -284
42 -286 -3 -2 -27 -7 -53 -11 -44 -7 -47 -6 -53 16 -17 74 -95 552 -91 557 5
5 65 18 94 21 6 0 12 -2 12 -6z m-240 -344 c80 -117 153 -224 163 -240 l18
-28 -42 -31 -43 -31 -18 23 c-10 13 -84 121 -165 241 l-148 219 35 29 c19 16
39 29 45 29 5 0 75 -95 155 -211z m-195 -219 c132 -78 242 -143 244 -145 2 -2
-9 -25 -23 -50 -25 -41 -29 -44 -49 -34 -51 26 -467 281 -472 289 -7 11 43 92
53 87 4 -3 115 -69 247 -147z m-107 -221 c152 -40 279 -73 281 -75 5 -5 -30
-104 -37 -104 -12 0 -526 140 -543 148 -15 6 -15 12 -3 55 8 26 17 47 20 47 3
0 130 -32 282 -71z m-458 -384 l0 -265 405 0 405 0 0 265 0 265 40 0 40 0 0
-310 0 -310 -495 0 -495 0 0 310 0 310 50 0 50 0 0 -265z m390 165 c124 -11
245 -23 269 -27 l43 -6 -5 -50 c-4 -27 -8 -51 -9 -53 -2 -1 -134 9 -294 23
l-291 26 4 46 c7 66 10 72 35 66 13 -2 124 -14 248 -25z m300 -255 l0 -55
-295 0 -295 0 0 55 0 55 295 0 295 0 0 -55z"/>
</g>
</svg>
And I'm attempting to resize it to 30x30 like so:
convert stackoverflow-4-xxl.svg -resize 30x30 stackoverflow-4-xxl_s.svg
But this produces an empty image (even though the path data is still present, as seen in the .svg file itself, below):
<?xml version="1.0" standalone="no"?>
<!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 20010904//EN"
"http://www.w3.org/TR/2001/REC-SVG-20010904/DTD/svg10.dtd">
<svg width="30" height="30">
<g style="" transform="scale(1.25,1.25)">
<g style="fill:#000000;stroke:none;" transform="matrix(0.1 0 0 -0.1 0 256)">
<g style="" transform="matrix(0.1 0 0 -0.1 0 256)">
<path d="M1095 2549 c-464 -62 -875 -402 -1029 -850 -57 -168 -78 -379 -58 -579 44 -421 317 -806 706 -995 187 -91 340 -125 566 -125 222 0 378 34 558 120 142 68 239 138 352 250 113 113 184 213 250 353 95 199 127 366 117 622 -8 221 -61 406 -167 584 -70 118 -118 177 -225 279 -178 170 -382 278 -618 326 -95 20 -356 28 -452 15z m695 -466 c0 -5 22 -135 49 -291 27 -155 46 -284 42 -286 -3 -2 -27 -7 -53 -11 -44 -7 -47 -6 -53 16 -17 74 -95 552 -91 557 5 5 65 18 94 21 6 0 12 -2 12 -6z m-240 -344 c80 -117 153 -224 163 -240 l18 -28 -42 -31 -43 -31 -18 23 c-10 13 -84 121 -165 241 l-148 219 35 29 c19 16 39 29 45 29 5 0 75 -95 155 -211z m-195 -219 c132 -78 242 -143 244 -145 2 -2 -9 -25 -23 -50 -25 -41 -29 -44 -49 -34 -51 26 -467 281 -472 289 -7 11 43 92 53 87 4 -3 115 -69 247 -147z m-107 -221 c152 -40 279 -73 281 -75 5 -5 -30 -104 -37 -104 -12 0 -526 140 -543 148 -15 6 -15 12 -3 55 8 26 17 47 20 47 3 0 130 -32 282 -71z m-458 -384 l0 -265 405 0 405 0 0 265 0 265 40 0 40 0 0 -310 0 -310 -495 0 -495 0 0 310 0 310 50 0 50 0 0 -265z m390 165 c124 -11 245 -23 269 -27 l43 -6 -5 -50 c-4 -27 -8 -51 -9 -53 -2 -1 -134 9 -294 23 l-291 26 4 46 c7 66 10 72 35 66 13 -2 124 -14 248 -25z m300 -255 l0 -55 -295 0 -295 0 0 55 0 55 295 0 295 0 0 -55z"/>
</g>
</g>
</g>
</svg>
So, just to see if resizing to the same size produces the original image (or thereabouts), I did:
convert stackoverflow-4-xxl.svg -resize 256x256 stackoverflow-4-xxl_m.svg
And this still produced an empty image, with the .svg data identical except for:
<svg width="256" height="256">
Any idea where I'm going wrong?
You don't need to use a program.
Either:
Hand-edit the SVG: change the width and height to 30px.
<svg version="1.0" xmlns="http://www.w3.org/2000/svg"
width="30px" height="30px" viewBox="0 0 256.000000 256.000000"
preserveAspectRatio="xMidYMid meet">...
Or if you want to style its size with CSS, just remove the width and height.
<svg version="1.0" xmlns="http://www.w3.org/2000/svg"
viewBox="0 0 256.000000 256.000000"
preserveAspectRatio="xMidYMid meet">...
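It is also worth noting why the converted output renders empty: in the resized file, the matrix(0.1 0 0 -0.1 0 256) transform appears twice in nested groups, which appears to push the artwork outside the 30x30 viewport. If a command-line tool is still wanted, an SVG-native renderer can produce a correctly sized raster instead, e.g. with librsvg (assuming rsvg-convert is installed):
rsvg-convert -w 30 -h 30 stackoverflow-4-xxl.svg > stackoverflow-4-xxl_s.png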
