# Some results obtained

## Avg. Probability distribution for Logistic Regression : 1D

No. of elements used to train = 400, no. of iterations = 150
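
The experiment above can be sketched as follows. The dataset is not specified in these notes, so the value ranges and the 1D layout here are assumptions; only the hyperparameters (400 training elements, 150 iterations) come from the text.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical 1D data: 400 elements, half labelled as set members.
# The member range [1000, 2000) flanked by non-members on both sides
# is an assumption consistent with the observations below.
m = 400
in_set = rng.uniform(1000, 2000, size=m // 2)
out_set = np.concatenate([rng.uniform(0, 1000, size=m // 4),
                          rng.uniform(2000, 3000, size=m // 4)])
X = np.concatenate([in_set, out_set]).reshape(-1, 1)
y = np.concatenate([np.ones(m // 2), np.zeros(m // 2)])

# 150 iterations, as in the notes.
clf = LogisticRegression(max_iter=150).fit(X, y)

# Average predicted membership probability for members vs. non-members.
probs = clf.predict_proba(X)[:, 1]
print("avg P(in set) for members:    ", probs[y == 1].mean())
print("avg P(in set) for non-members:", probs[y == 0].mean())
```

Because a 1D logistic regression has a single monotone decision boundary, it cannot separate an interval flanked on both sides, so both averages come out close to 0.5, matching the observation below.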

## Avg. Probability distribution for Decision Tree : 1D

No. of elements, m = 400, no. of iterations = 150, max. depth of decision tree = 10
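
The decision-tree variant can be sketched the same way. The data layout is again an assumed stand-in; only m = 400 and max depth = 10 come from the notes.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Same hypothetical 1D layout as above: members in [1000, 2000),
# non-members in [0, 1000) and [2000, 3000).
m = 400
in_set = rng.uniform(1000, 2000, size=m // 2)
out_set = np.concatenate([rng.uniform(0, 1000, size=m // 4),
                          rng.uniform(2000, 3000, size=m // 4)])
X = np.concatenate([in_set, out_set]).reshape(-1, 1)
y = np.concatenate([np.ones(m // 2), np.zeros(m // 2)])

# Max depth 10, as in the notes; two axis-aligned splits already
# suffice to carve out the member interval exactly.
tree = DecisionTreeClassifier(max_depth=10).fit(X, y)

# Predicted membership probability drops to zero outside the interval.
queries = np.array([[500.0], [1500.0], [2500.0]])
print(tree.predict_proba(queries)[:, 1])
```

Unlike the logistic regression, the tree's axis-aligned splits isolate the member interval, which is why the observations below report zero probability (and hence no false positives) in 0-1000 and 2000-3000.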

## Observations

- With logistic regression, the predicted probabilities for elements both inside and outside the set are very close to each other (around 0.5), so only indices 2 and 3 of the bit array are ever set. False-positive elements map to those same indices, which drives the false positive rate up. So although the combined size of the classifier and the bit array can be very small, the false positive rate is very high.
- With the decision tree, the predicted probability is zero for all elements in 0-1000 and 2000-3000, so there are no false positives in those ranges.
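
The bit-array mechanism the first observation describes can be sketched as follows. The quantization rule (floor of probability times array size) and the bit-array size of 8 are assumptions chosen to reproduce the "only indices 2 and 3 get set" behaviour; the notes do not specify the actual mapping.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical data matching the experiments above (ranges assumed).
in_set = rng.uniform(1000, 2000, size=200)
out_set = np.concatenate([rng.uniform(0, 1000, size=100),
                          rng.uniform(2000, 3000, size=100)])
X = np.concatenate([in_set, out_set]).reshape(-1, 1)
y = np.concatenate([np.ones(200), np.zeros(200)])

clf = LogisticRegression(max_iter=150).fit(X, y)

BITS = 8  # assumed bit-array size
bit_array = np.zeros(BITS, dtype=bool)

def index_of(x):
    # Quantize the predicted membership probability into a bit index
    # (an assumed scheme, not taken from the notes).
    p = clf.predict_proba(np.array([[x]]))[0, 1]
    return min(int(p * BITS), BITS - 1)

# Insert: set the bit for every true member.
for x in in_set:
    bit_array[index_of(x)] = True

# Query non-members: any hit is a false positive.
fp = sum(bit_array[index_of(x)] for x in out_set)
print("bits set:", np.flatnonzero(bit_array))
print(f"false-positive rate on non-members: {fp / len(out_set):.2f}")
```

Because all probabilities cluster near 0.5, members set only a couple of adjacent indices, and non-members quantize to the same ones, which is exactly the high false-positive rate the observation reports.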