How do I interpret two-class prediction models?


A two-class prediction model can be interpreted using the XAI operator, which provides both global and local explanations.

A global explainer such as Feature Importance shows how much each feature contributes to the model's predictions overall, while a local explanation helps you understand why the model made a particular prediction for a single instance.
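As a rough illustration of these two ideas, here is a minimal sketch using scikit-learn rather than the XAI operator itself: permutation importance stands in for a global Feature Importance explainer, and a simple mean-substitution sensitivity check stands in for a local explanation. The dataset, model, and helper logic are all illustrative assumptions, not the platform's implementation.

```python
# Sketch only: approximates global/local explanation with scikit-learn;
# it is NOT the XAI operator described above.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

# Synthetic two-class dataset and a fitted classifier (illustrative).
X, y = make_classification(n_samples=500, n_features=5, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X, y)

# Global explanation: permutation feature importance over the whole dataset.
result = permutation_importance(model, X, y, n_repeats=10, random_state=0)
for i in np.argsort(result.importances_mean)[::-1]:
    print(f"feature {i}: global importance {result.importances_mean[i]:.3f}")

# Local explanation: for one instance, measure how the predicted probability
# shifts when each feature is replaced by its dataset mean.
instance = X[0:1]
base_prob = model.predict_proba(instance)[0, 1]
for i in range(X.shape[1]):
    perturbed = instance.copy()
    perturbed[0, i] = X[:, i].mean()
    delta = base_prob - model.predict_proba(perturbed)[0, 1]
    print(f"feature {i}: local contribution {delta:+.3f}")
```

The global loop ranks features by their average effect on accuracy across all rows; the local loop explains a single row, which is the distinction the answer above draws.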
