AI has advanced considerably in recent years, with models reaching human-level performance on a wide range of tasks. However, the real challenge lies not only in developing these models but in deploying them efficiently in practical settings. This is where AI inference comes into play, emerging as a key focus for researchers and practitioners alike.