I have received the warning message in the title and have read through posts such as this one.
I would like to understand how this feature can perfectly separate the target variable, because I had assumed that this kind of warning relates more to categorical features, where some level contains only the true or only the false target class.
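For intuition, a minimal made-up illustration (not from the post): a continuous feature can also separate the classes perfectly when some threshold splits them, and the same warning appears. The names x_cont and y_sep below are hypothetical.

set.seed(1)
x_cont <- runif(100)                                 # hypothetical continuous feature
y_sep  <- factor(ifelse(x_cont > 0.5, "X1", "X0"))   # classes split exactly at 0.5
m_sep  <- glm(y_sep ~ x_cont, family = binomial)     # warns: fitted probabilities numerically 0 or 1 occurred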
The context is website conversion (the event is making a purchase: True = X1, or not: False = X0). I wanted to understand the effect of the average page load time during a given website session. After removing the other features, such as device type and traffic source, I noticed that I get the warning only with the feature Avg_Load_Time, which is a numeric (dbl) feature.
My next thought was that perhaps all the sessions with an average load time of 0 were causing the perfect separation. However, I have no zeros, only some values close to zero:
> summary(x$Avg_Load_Time)
   Min. 1st Qu.  Median    Mean 3rd Qu.    Max.
   0.24    2.32    4.27   10.18    8.73  484.62
Then I looked at the summary of the average load time only for the sessions with a transaction, i.e. where the target is X1:
> summary(y %>% filter(target == "X1") %>% select(Avg_Load_Time))
 Avg_Load_Time
 Min.   : 0.780
 1st Qu.: 2.478
 Median : 3.785
 Mean   : 4.253
 3rd Qu.: 4.815
 Max.   :16.410
Here I can see that although the minimum is higher, it is not 0.
How can I find the cause of my perfect separation now that I have narrowed it down to a single feature?
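One way to start narrowing this down, a sketch assuming the full data frame is called x with the columns target and Avg_Load_Time used above, is to compare the distribution of the feature between the two classes:

library(dplyr)
x %>%
  group_by(target) %>%
  summarise(n      = n(),
            min    = min(Avg_Load_Time),
            median = median(Avg_Load_Time),
            p99    = quantile(Avg_Load_Time, 0.99),
            max    = max(Avg_Load_Time))
boxplot(Avg_Load_Time ~ target, data = x, log = "y")   # compare the long right tail across classes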
Here is a sample of 1000 rows in case it helps. Any tips for understanding the separation are appreciated:
dput(x %>% sample_n(1000)) structure(list(target = structure(c(1L, 1L, 1L, 1L, 1L, 1L, 1L, 2L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 2L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 2L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 2L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 2L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 2L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 2L, 2L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 2L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 2L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 2L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 2L, 2L, 1L, 1L, 1L, 1L, 1L, 1L, 2L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 2L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 2L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 2L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 2L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 2L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 2L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 2L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 2L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 
1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 2L, 1L, 1L, 1L, 1L, 1L, 2L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 2L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L), .Label = c("X0", "X1"), class = "factor"), Avg_Load_Time = c(0.77, 39.1, 5.34, 5.45, 1.74, 2.18, 9.19, 4.73, 9.37, 2.45, 4.33, 1.86, 1.93, 4.32, 18.13, 6.93, 3.57, 13.93, 130.38, 4.47, 26.67, 14.48, 19.54, 9.41, 6.51, 3.78, 1.91, 2.98, 5.47, 2.24, 3.07, 27.9, 8.8, 65.66, 10.23, 3.32, 1.81, 5.02, 2.71, 1.04, 11.76, 5.73, 2.32, 3.54, 2.3, 63.9, 4.5, 0.78, 1.44, 4.06, 0.7, 1.79, 7.7, 4.3, 33.25, 1.44, 0.79, 6.39, 4.17, 0.6, 3.58, 16.84, 11.07, 16.05, 28.29, 9.22, 4.1, 7.81, 0.55, 64.88, 3.32, 10.44, 3.22, 1.57, 1.01, 7.16, 3.41, 5.74, 3.73, 2.62, 4.39, 17.92, 5.05, 1.94, 6.95, 1.86, 27.07, 7.69, 4.05, 2.96, 8.03, 3.21, 5.33, 1.62, 17.03, 8.37, 1.7, 5.08, 4.96, 0.83, 4.65, 16.36, 7.04, 4.9, 22.98, 6.08, 4.3, 2.91, 1.52, 1.81, 11.28, 16.71, 4.17, 9.62, 3.18, 2.66, 0.78, 9.3, 25.39, 5.84, 1.13, 58.03, 1.45, 10.45, 19.5, 1.25, 1.06, 30.49, 2.9, 7.31, 3.61, 4.64, 0.68, 10.43, 8.84, 1.78, 17.16, 6.68, 4.61, 7.43, 5.03, 2.98, 2.89, 4.15, 9.47, 3.68, 2.16, 2.09, 41.78, 3.06, 113.4, 30.13, 5.37, 14.83, 2.1, 2.03, 13.51, 3.1, 5.54, 4.61, 18.09, 23.82, 34.64, 4.99, 8.35, 7.45, 3.98, 3.44, 1.01, 34.45, 64.03, 2.82, 13.63, 13.34, 0.66, 4.15, 2.06, 19.7, 1.38, 2.16, 10.65, 5.89, 57.27, 17.51, 3.5, 10.97, 2.2, 9.38, 2.06, 5.25, 4.11, 72.22, 0.93, 3.65, 5.71, 4.79, 3.01, 0.95, 6.6, 15.35, 1.05, 3.31, 3.44, 8.31, 11.35, 6.63, 4.87, 4.83, 10.05, 1.01, 25.35, 3.79, 11.14, 24.26, 9.71, 1.76, 3.75, 1.66, 7.02, 6.41, 3.72, 3.58, 35.16, 3.24, 2.29, 9.61, 9.31, 0.67, 0.63, 7.08, 10.85, 2.65, 4.35, 5.86, 3.24, 4.32, 3.34, 2.37, 4.23, 1.97, 1.83, 15.42, 4.17, 5.18, 2.37, 8.91, 0.71, 20.18, 5.96, 1.41, 3.11, 26.85, 2.47, 5.99, 2.53, 1.86, 2.67, 13.66, 8.28, 5.7, 8.1, 3.95, 139.35, 15.37, 2.55, 2.85, 5.46, 2.55, 17.16, 2.87, 23.42, 1.58, 62.58, 7.5, 14.41, 1.57, 4.42, 5.41, 4.62, 12.5, 3.3, 4.37, 3.91, 3.35, 7.27, 1.11, 24.86, 18, 8.83, 7.87, 2.68, 2.77, 32.58, 12.66, 2.64, 9.89, 30.86, 10.17, 3.49, 37.99, 4.99, 12.98, 1.75, 11.92, 45.36, 3.35, 2.28, 2.83, 19.92, 9.33, 4.98, 19.76, 2.92, 3.84, 4.8, 205.98, 4.53, 8.82, 3.74, 21.8, 3.56, 3.9, 2.29, 7.85, 79.96, 3.56, 2.78, 5.9, 2.93, 3.76, 1.79, 12.94, 2.34, 25.17, 22.71, 4.15, 6.87, 147.62, 6.1, 3.23, 93.41, 12.91, 4.93, 3.22, 5.84, 8.73, 17.73, 79.63, 182.45, 2.36, 1.62, 1.22, 1.09, 3.75, 0.93, 1.82, 12.14, 4.38, 2.1, 0.88, 4.36, 1.33, 3.74, 2.85, 2.34, 13.2, 5.44, 9.94, 6.6, 2.79, 7.7, 10.99, 11.43, 19.7, 3.79, 2.26, 1.68, 23.24, 7.41, 3.13, 5.22, 2.4, 4.48, 2.35, 10.36, 1.25, 34.14, 7.37, 3.46, 18.84, 8.32, 4.9, 2.37, 1.03, 4.56, 9.7, 20.95, 1.01, 17.42, 9.29, 0.88, 3.84, 13.82, 0.52, 4.51, 11.74, 1, 6.28, 5.49, 6.13, 5.62, 0.53, 6.72, 2.08, 3.38, 68.72, 4.56, 2.45, 15.21, 5.54, 5.13, 3.86, 4.89, 1.21, 3.88, 4.83, 4.97, 8.22, 5.76, 4.07, 6.83, 1.94, 120.71, 3.26, 7.38, 4.21, 5.95, 3.7, 1.28, 3.43, 1.42, 1.63, 3.97, 10.57, 8.98, 2.37, 21.73, 8.04, 5.18, 2.48, 5.74, 4.65, 1.85, 6.75, 0.98, 1.72, 4, 6.08, 7.21, 8, 10.98, 1.94, 0.75, 30.3, 7.29, 3.31, 4.3, 66.62, 3.87, 3.01, 1.56, 3.37, 5.44, 6.76, 6.21, 1.39, 8.02, 2.95, 9.56, 1.62, 2.28, 0.46, 2, 12.55, 4.66, 15.48, 
1.76, 5.81, 1.94, 4.25, 2.65, 1.51, 2.7, 27.43, 46.24, 2.67, 16.77, 0.7, 0.4, 6.07, 11.3, 1.49, 3.45, 3.2, 22.74, 1.5, 0.7, 2.6, 7.89, 2.57, 3.42, 2.46, 1.7, 2.45, 2.12, 7.97, 9.4, 3.58, 7.2, 12.18, 15.27, 2.94, 5.19, 7.33, 7.54, 5.01, 5.08, 10.65, 16.13, 2.46, 5.28, 3.02, 2.82, 10.84, 0.53, 4.22, 3.51, 10.69, 4.31, 2.55, 7.58, 19.3, 4.97, 9.39, 1.66, 0.45, 2.71, 0.82, 0.7, 8.76, 21.98, 1.95, 1.09, 3.78, 2.71, 2.55, 1.69, 17.2, 6.37, 11.42, 2.33, 0.98, 52.6, 1.67, 1.32, 21.99, 34.11, 4.99, 4.52, 6.84, 2.45, 0.7, 1.16, 9.52, 21.73, 2.32, 5.26, 7.34, 3.55, 2.6, 4.29, 9.48, 0.48, 7.22, 1.94, 4.25, 6.62, 6.76, 3.39, 1.67, 3.81, 38.39, 3.49, 65.29, 3.59, 11.54, 1.87, 4.21, 6.6, 7.3, 8.97, 9.82, 2.65, 4.99, 2.03, 4.81, 3.08, 6.41, 1.29, 1.04, 3.53, 1.29, 4.07, 2.92, 2.91, 3.82, 4.94, 2.25, 10.05, 8.87, 1.51, 3.26, 3.4, 0.68, 7.64, 0.6, 0.78, 6.25, 2.89, 17.56, 4.83, 5.55, 9.6, 3.31, 2.43, 6.96, 5.05, 5.95, 6.96, 15.06, 45.99, 1.74, 3.48, 1.83, 2.76, 6.35, 24.95, 1.96, 2.23, 2.23, 17.25, 5.2, 12.57, 11.58, 10.85, 2.91, 1.1, 3.2, 6.4, 3.15, 5.55, 1.72, 2.34, 1.83, 49.76, 1.87, 5.72, 3.59, 0.81, 8.8, 6.76, 2.06, 3.15, 9.06, 15.15, 1.64, 4.92, 9.64, 3.7, 1.78, 1.88, 3.98, 4.93, 3.37, 10.57, 4.41, 4.67, 6.39, 3.51, 21.83, 2.33, 0.68, 1.66, 2.89, 4.57, 360.7, 5.89, 6.63, 8.59, 0.48, 8.08, 2.01, 1.59, 12.45, 0.99, 2.3, 2.79, 1.47, 2.78, 2.05, 3.12, 17.84, 185.53, 3.71, 0.8, 1.82, 12.42, 31.16, 2.27, 19.23, 1.48, 7.22, 0.24, 11.73, 1.25, 14.06, 11.55, 1.48, 1.73, 5.01, 1.66, 2.25, 3.26, 6.73, 4.66, 1.8, 5.25, 8.15, 3.94, 2.72, 1.69, 25.96, 4.46, 1.51, 1.61, 1.67, 2.16, 5.24, 22.86, 3.64, 10.68, 4.65, 0.62, 0.64, 7.69, 3.63, 37.52, 9.98, 3.27, 10.94, 1.92, 2.4, 1.04, 6.05, 5.34, 3.4, 4.08, 72.08, 3.95, 5.1, 1.44, 17.06, 2.14, 4.17, 3.39, 7.79, 5.71, 19.87, 2.54, 2.49, 3.44, 3.85, 12.06, 12.18, 1.7, 3.12, 17.3, 4.41, 4.4, 0.82, 57.91, 124.91, 5.35, 5.41, 20.75, 13.54, 0.82, 0.84, 8.62, 10.04, 1.08, 10.49, 7.05, 2.72, 1.18, 2.05, 6.87, 3.51, 20.66, 4.69, 31.9, 4.64, 6.04, 1.71, 6.91, 70.11, 2.83, 9.88, 2, 10.48, 4.25, 12.24, 1.27, 50.22, 0.85, 3.51, 5.47, 0.69, 1.45, 2.97, 1.58, 2.2, 6.79, 15.88, 3.52, 1.75, 18.68, 3.81, 2.87, 4.06, 69.44, 91.15, 0.79, 1.15, 6.57, 1.18, 4.33, 7.3, 42.46, 40.83, 6.48, 32.34, 3.16, 41.11, 4.61, 1.57, 2.22, 1.2, 2.35, 10.48, 6.82, 5.38, 5.51, 3.34, 57.3, 51.9, 10.52, 1.85, 3.37, 4.42, 1.09, 29.53, 1.76, 2.48, 2.54, 10.22, 11.62, 59.79, 176.17, 7.18, 4.36, 1.76, 7.34, 4.55, 8.21, 3.94, 9.64, 1.62, 19.5, 5.53, 5.28, 1.59, 43.85, 24.02, 5.95, 6.34, 4.54, 3.71, 1.48, 9.18, 5.56, 6.08, 15.67, 24.48, 0.8, 12.53, 4.14, 29.11, 19.85, 2.54, 92.42, 44.65, 8.07, 2.44, 3.93, 3.79, 13.65, 17.64, 3.67, 9.42, 3.43, 1.81, 11.76, 1.63, 4.27, 5.87, 11.66, 3.77, 1.62, 3.58, 15.66, 4.46, 8.12, 7.35, 8.62, 6.24, 4.28, 1.68, 3.93, 3.27, 2.67, 2.93, 161.22, 3.54, 2.62, 40.6, 1.09, 2.3, 9.57, 1.1, 3.33, 17.41, 7.63, 4.01, 16.9, 3.8, 2.8, 3.56, 2.51, 6.26, 1.84, 2.98, 4.92, 2.12, 6.35, 11.74, 2.64, 14.35, 452.01, 1.7, 1.91, 4.79, 2.49, 7.61, 1.54, 8.19, 7.95, 2.81, 7.08, 9.06, 5.17, 2.08, 7.92, 4.39, 22.12, 3.42, 3.82, 3.17, 17.41, 3.29, 10.66, 31.54, 3.62, 26.38, 3.43, 10.32, 1.32, 10.71, 2.75, 0.95)), row.names = c(6184L, 2551L, 2196L, 1039L, 2202L, 2513L, 6486L, 916L, 4414L, 2131L, 4485L, 48L, 4451L, 428L, 82L, 2537L, 3385L, 862L, 1963L, 4647L, 5071L, 2291L, 2995L, 3809L, 2285L, 1515L, 327L, 3483L, 65L, 3061L, 3869L, 3477L, 3101L, 2373L, 2719L, 3135L, 4565L, 1753L, 3063L, 6430L, 6003L, 2311L, 4421L, 1644L, 4624L, 3624L, 5539L, 5660L, 6346L, 2726L, 1827L, 4540L, 1783L, 6390L, 3L, 5930L, 4033L, 
389L, 4441L, 4337L, 5426L, 4693L, 1528L, 1651L, 1031L, 6197L, 1658L, 1607L, 3984L, 169L, 5577L, 3275L, 4969L, 2540L, 4156L, 6473L, 5848L, 3533L, 3060L, 3899L, 1891L, 4948L, 6339L, 3585L, 720L, 4000L, 1086L, 145L, 1657L, 3040L, 3259L, 201L, 6284L, 40L, 4519L, 3823L, 3223L, 5009L, 5800L, 5318L, 6275L, 1786L, 2839L, 6337L, 1608L, 209L, 5153L, 6367L, 4579L, 354L, 4555L, 5648L, 4864L, 5039L, 1677L, 6116L, 5098L, 1642L, 4770L, 2200L, 6191L, 3071L, 450L, 3636L, 4081L, 2510L, 5294L, 1727L, 2803L, 2432L, 1601L, 3750L, 1342L, 1631L, 4963L, 5250L, 1706L, 4321L, 2363L, 5493L, 1785L, 1871L, 4915L, 3863L, 2609L, 3569L, 5090L, 6215L, 776L, 5994L, 3678L, 2258L, 2520L, 5860L, 4978L, 571L, 1565L, 4433L, 2162L, 4047L, 4313L, 6357L, 4122L, 5517L, 6401L, 709L, 2926L, 3962L, 5218L, 3417L, 4282L, 6511L, 4401L, 308L, 6254L, 2895L, 1322L, 3314L, 1255L, 3496L, 2530L, 1512L, 2848L, 4397L, 6493L, 4089L, 2933L, 3121L, 5843L, 4478L, 2383L, 799L, 3954L, 1881L, 6246L, 6538L, 5655L, 3924L, 6358L, 598L, 6321L, 2812L, 1495L, 2279L, 1566L, 1571L, 3243L, 3463L, 3446L, 4494L, 5554L, 2408L, 3205L, 1415L, 503L, 4475L, 2991L, 6206L, 3917L, 3783L, 579L, 4765L, 5490L, 2332L, 3855L, 334L, 279L, 4344L, 2040L, 3374L, 5118L, 5522L, 943L, 1384L, 4601L, 4265L, 1661L, 4688L, 4689L, 4901L, 5189L, 3486L, 5768L, 2838L, 1224L, 5894L, 797L, 64L, 5550L, 71L, 4872L, 3641L, 4625L, 3234L, 4074L, 4193L, 4694L, 4910L, 6064L, 711L, 5573L, 2679L, 435L, 3532L, 1943L, 5559L, 3315L, 3558L, 1329L, 3639L, 1315L, 3333L, 1385L, 969L, 4171L, 4913L, 6416L, 3509L, 1493L, 3441L, 4746L, 5616L, 4951L, 3169L, 4749L, 831L, 2960L, 1296L, 16L, 2343L, 1135L, 3011L, 1561L, 2271L, 6274L, 174L, 3444L, 6017L, 3905L, 2256L, 6176L, 2010L, 4810L, 390L, 1249L, 2519L, 5377L, 6018L, 5639L, 5085L, 2620L, 5812L, 4687L, 1585L, 1728L, 2769L, 3270L, 4024L, 4315L, 423L, 1338L, 2607L, 4817L, 2097L, 870L, 6315L, 904L, 2440L, 4453L, 361L, 57L, 499L, 592L, 261L, 2635L, 2813L, 529L, 2855L, 5575L, 2611L, 577L, 2758L, 4659L, 3844L, 460L, 5323L, 1192L, 2380L, 272L, 381L, 4215L, 1872L, 5269L, 4364L, 897L, 5692L, 147L, 1357L, 5217L, 5735L, 300L, 6237L, 2495L, 105L, 446L, 2340L, 998L, 4142L, 612L, 6281L, 1582L, 1222L, 1890L, 166L, 1640L, 5590L, 58L, 3018L, 142L, 3891L, 3186L, 4745L, 299L, 4523L, 5641L, 784L, 1204L, 1686L, 1584L, 3400L, 2020L, 1845L, 1339L, 2362L, 3775L, 4993L, 3140L, 6136L, 3744L, 3660L, 4153L, 2724L, 2882L, 606L, 4553L, 2163L, 1866L, 6542L, 3836L, 439L, 1593L, 4147L, 1863L, 1478L, 1836L, 5330L, 2317L, 6407L, 4020L, 6340L, 5530L, 4834L, 4014L, 5586L, 6277L, 1131L, 4902L, 1407L, 5960L, 6548L, 5643L, 4351L, 905L, 4831L, 1502L, 619L, 4279L, 6394L, 128L, 2750L, 933L, 2526L, 4238L, 3399L, 659L, 1480L, 2368L, 2682L, 5147L, 6000L, 416L, 1817L, 5850L, 2734L, 4140L, 6131L, 6076L, 5482L, 5680L, 2259L, 2351L, 4757L, 4151L, 289L, 859L, 5292L, 5635L, 1138L, 3254L, 798L, 2505L, 4556L, 1551L, 3940L, 4871L, 5242L, 418L, 6498L, 260L, 5817L, 4388L, 4007L, 3834L, 5505L, 5628L, 6338L, 761L, 5450L, 5683L, 285L, 6111L, 5526L, 3037L, 4L, 2593L, 3748L, 1503L, 4305L, 3995L, 2808L, 5340L, 723L, 5026L, 3815L, 780L, 5079L, 4068L, 819L, 5578L, 5309L, 5343L, 4748L, 5907L, 6230L, 750L, 4398L, 1132L, 608L, 6299L, 42L, 5876L, 3563L, 2357L, 4928L, 4651L, 3820L, 6556L, 2657L, 1072L, 6177L, 5854L, 1055L, 3019L, 3226L, 1947L, 2649L, 2658L, 3980L, 4411L, 4809L, 5374L, 6171L, 2297L, 4886L, 1136L, 3304L, 5831L, 6033L, 3996L, 5566L, 2274L, 5844L, 4357L, 4184L, 3931L, 1742L, 1906L, 584L, 1180L, 5983L, 2034L, 3948L, 2299L, 1073L, 4888L, 2482L, 5282L, 1443L, 2127L, 4934L, 4823L, 5775L, 1885L, 1196L, 148L, 6078L, 6388L, 
6283L, 6387L, 4507L, 2845L, 6058L, 3802L, 6417L, 6221L, 2099L, 5433L, 2409L, 4856L, 4206L, 6222L, 2927L, 2702L, 456L, 4939L, 4571L, 5468L, 5040L, 2424L, 5272L, 6453L, 5051L, 4724L, 5896L, 2916L, 1310L, 5210L, 5510L, 646L, 5657L, 814L, 6170L, 676L, 6462L, 5444L, 1140L, 5464L, 5277L, 845L, 4103L, 6037L, 3394L, 5133L, 4308L, 6330L, 3808L, 3992L, 5485L, 3267L, 2779L, 1673L, 3759L, 540L, 63L, 3328L, 5014L, 6502L, 1702L, 183L, 2793L, 1387L, 1509L, 1104L, 6117L, 2521L, 1616L, 1915L, 5086L, 2052L, 980L, 1808L, 3238L, 1065L, 3380L, 5700L, 627L, 5914L, 2915L, 3048L, 3623L, 1123L, 6095L, 1816L, 5820L, 4345L, 834L, 4729L, 4228L, 4196L, 4470L, 1279L, 5591L, 1570L, 2116L, 4849L, 4395L, 226L, 476L, 1626L, 5747L, 3529L, 2431L, 1781L, 6031L, 2284L, 3319L, 1572L, 258L, 3268L, 3450L, 1602L, 6434L, 5241L, 3211L, 1457L, 973L, 5836L, 4221L, 5546L, 511L, 1494L, 4660L, 4740L, 6022L, 3065L, 4671L, 1235L, 4859L, 5285L, 6085L, 1835L, 246L, 3957L, 2888L, 6273L, 4354L, 6334L, 1819L, 5608L, 5737L, 2086L, 1058L, 2646L, 816L, 4892L, 962L, 6487L, 2038L, 4419L, 5027L, 1894L, 3495L, 587L, 3206L, 2829L, 4782L, 3643L, 1092L, 4123L, 5749L, 2676L, 2893L, 3014L, 38L, 1912L, 5211L, 2243L, 4058L, 1213L, 2605L, 2442L, 1232L, 5918L, 4185L, 3302L, 1337L, 6362L, 5555L, 307L, 2301L, 2233L, 937L, 3907L, 5225L, 5638L, 975L, 2251L, 1050L, 1491L, 6382L, 5216L, 2451L, 5973L, 5968L, 5662L, 502L, 5915L, 2422L, 4802L, 3790L, 3299L, 2436L, 2277L, 2446L, 1261L, 6100L, 3587L, 2741L, 1789L, 3988L, 2954L, 673L, 5694L, 2920L, 3473L, 578L, 5383L, 3635L, 2474L, 4929L, 2527L, 2379L, 2749L, 2919L, 4747L, 1568L, 2770L, 3580L, 4304L, 5181L, 463L, 3725L, 3582L, 6360L, 3340L, 3527L, 2487L, 5010L, 4628L, 3698L, 3776L, 1653L, 1242L, 755L, 6249L, 4548L, 4715L, 2907L, 3603L, 5111L, 3679L, 4719L, 5415L, 3942L, 3701L, 5062L, 6464L, 3886L, 4970L, 5863L, 4053L, 3203L, 2152L, 5063L, 558L, 4078L, 1168L, 3739L, 1542L, 3839L, 3160L, 6303L, 2109L, 1773L, 5431L, 2239L, 4065L, 4771L, 6126L, 478L, 1101L, 4449L, 889L, 1234L, 2784L, 1710L, 453L, 1939L, 4598L, 5976L, 3052L, 2723L, 1453L, 144L, 1011L, 347L, 2381L, 5726L, 1098L, 3801L, 2205L, 5924L, 5627L, 4158L, 1323L, 2716L, 6020L, 5811L, 2453L, 2576L, 1343L, 1320L, 599L, 4175L, 2525L, 4167L, 728L, 2376L, 3965L, 5238L, 3838L, 5333L, 6010L, 3692L, 6235L, 1547L, 6061L, 4914L, 523L, 6040L, 3971L, 5140L, 470L, 6180L, 5213L, 1000L, 5703L, 464L, 17L, 2573L, 2548L, 4077L, 6232L, 4488L, 4627L, 2826L, 5015L, 4984L, 1940L, 6304L, 1287L, 4968L, 4008L, 4960L, 6471L, 3094L, 2265L, 3780L, 5842L, 1355L, 4387L, 1961L, 3508L, 5247L, 1715L, 4510L, 2579L, 5276L, 1884L, 2056L, 572L, 4258L, 5438L, 3359L, 4644L, 2303L, 322L, 5600L, 688L, 569L, 1143L, 4504L, 1109L, 2366L, 2628L, 513L, 6001L, 3407L, 5020L, 1613L, 5690L, 5180L, 4863L, 2050L, 2599L, 2516L, 3648L, 2714L, 4472L, 5454L, 2338L, 3966L, 903L, 1241L, 2971L, 4947L, 4792L, 3717L, 3221L, 5182L, 1006L, 6137L, 2480L, 1403L, 3797L, 5872L, 4249L, 195L, 6063L, 1898L), class = "data.frame")
Edit: Here is the full code I run to fit the model:
library(caret)
library(dplyr)   # for select()

## custom evaluation metric function
my_summary <- function(data, lev = NULL, model = NULL) {
  a1 <- defaultSummary(data, lev, model)
  b1 <- twoClassSummary(data, lev, model)
  c1 <- prSummary(data, lev, model)
  out <- c(a1, b1, c1)
  out
}

## tuning & parameters
set.seed(123)
train_control <- trainControl(
  method          = "cv",
  number          = 5,
  savePredictions = TRUE,
  verboseIter     = TRUE,
  classProbs      = TRUE,
  summaryFunction = my_summary
)

linear_model <- train(
  x = select(training_data, Avg_Load_Time),
  y = target,
  trControl = train_control,
  method = "glm",       # logistic regression
  family = "binomial",
  metric = "AUC"
)
After running this, I get the warning message.
Comments
- What is the full model you are fitting? Are there interactions with other variables? Also, how do you know that it is this feature causing the problem?
- @Glen I have now added it to the post.
- Do you get the error if you fit the whole dataset without CV/training splits? The classes look very imbalanced, and I wonder whether some folds end up with only 1 or even 0 observations of the minority class. Have you tried stratifying the fold selection by class to make sure each fold has enough of the minority class?
- @EdM "Have you tried stratifying the fold selection by class to make sure each fold has enough of the minority class" – How would I do that?
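Regarding that last comment, a sketch of one way to do it with caret: createFolds samples within the levels of a factor outcome, so the folds it builds are stratified by class, and they can be passed to trainControl via index. Here training_data$target and my_summary are assumed from the code in the question.

library(caret)
set.seed(123)
## build class-stratified folds explicitly and hand them to trainControl
folds <- createFolds(training_data$target, k = 5, returnTrain = TRUE)
train_control <- trainControl(
  method          = "cv",
  index           = folds,        # training-row indices for each fold
  savePredictions = TRUE,
  classProbs      = TRUE,
  summaryFunction = my_summary
)
table(training_data$target[folds[[1]]])   # check the class counts within one training fold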
Answer
I looked at your data and it is heavily skewed, with outliers. So you do not actually have perfect separation; the warning occurs because some of the extreme observations have predicted probabilities that are numerically indistinguishable from 0 or 1.
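A sketch of how to check this directly on the posted sample, assumed below to have been recreated from the dput output into a data frame called smp:

m <- glm(target ~ Avg_Load_Time, data = smp, family = binomial)
range(fitted(m))                                  # any fitted probabilities at ~0 or ~1?
smp[fitted(m) < 1e-8 | fitted(m) > 1 - 1e-8, ]    # the sessions driving the warning, if any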
If you fit the model on the log of Avg_Load_Time, you do not get the warning (I tested this on your sample data).
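For example, reusing the smp data frame from above; the natural log is safe here because the summary shows no zero load times:

m_log <- glm(target ~ log(Avg_Load_Time), data = smp, family = binomial)
summary(m_log)   # compresses the long right tail; per the answer, this fit gives no separation warning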
This answer explains well what is going on: the problem of complete separation in logistic regression (in R).