How do I interpret multi-class prediction models?

A model can be interpreted using the XAI operator, which provides both global and local explanations.

A global explainer such as Feature Importance shows how much each feature contributes to the model's predictions overall, while a local explanation shows why the model made its prediction for a particular instance.
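For illustration, here is a minimal sketch of the same two ideas using common open-source tools (scikit-learn's permutation importance for the global view and SHAP for the local view) on a standard three-class dataset. This is an assumed open-source equivalent for demonstration purposes, not the XAI operator itself.

```python
# Illustrative only: open-source stand-ins (scikit-learn, SHAP) for the
# global and local explanations described above, not the XAI operator.
import shap
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Train a simple multi-class classifier (3 iris species).
X, y = load_iris(return_X_y=True, as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# Global explanation: permutation feature importance ranks features by
# how much shuffling each one degrades held-out accuracy.
result = permutation_importance(model, X_test, y_test,
                                n_repeats=10, random_state=0)
for name, score in zip(X.columns, result.importances_mean):
    print(f"{name}: {score:.3f}")

# Local explanation: SHAP values attribute a single instance's
# prediction to each feature, separately for each class.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_test.iloc[[0]])
print(shap_values)  # one set of feature attributions per class
```

Note that for a multi-class model the local attributions are produced per class, so each class gets its own set of feature contributions for the instance being explained.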
