Microsoft, in partnership with OpenAI, recently introduced a new AI language model called Orca, and it is already making waves in the technology world.
Orca is designed to address the limitations of smaller models by emulating the reasoning processes of larger models such as GPT-4. Despite its compact size, Orca does not compromise on performance, and it requires far fewer computing resources to run, making it a highly efficient tool.
The unveiling of Orca has sparked debate among AI enthusiasts, particularly over whether it could rival OpenAI's popular AI product, ChatGPT. What sets Orca apart is its ability to learn explanations, step-by-step reasoning traces, and complex instructions generated with the help of GPT-4, demonstrating its considerable learning capacity.

Orca, also referred to as Orca 13B, is a 13-billion-parameter model: it contains 13 billion adjustable weights, which it uses to imitate the reasoning process of large foundation models (LFMs). Despite its smaller size, Orca has shown impressive performance. It matches ChatGPT on the Big-Bench Hard (BBH) benchmark and outperforms conventional AI models on both BBH and the human-centric AGIEval benchmark. It also performs competitively on professional and academic examinations such as the SAT, LSAT, GRE, and GMAT, marking a significant milestone in AI development.

There is currently no official statement from Microsoft about whether Orca 13B will become open source. If it does, it could be a game-changer, allowing developers to create and train their own models and even enhance Orca based on user feedback.
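To make the "fewer computing resources" claim concrete, here is a rough back-of-envelope sketch of the memory needed just to store a model's weights at common numeric precisions. This is an illustration under the simplifying assumption that weights dominate memory use (ignoring activations, optimizer state, and KV caches); since GPT-4's parameter count is not public, the comparison uses the publicly documented GPT-3 scale of 175 billion parameters.

```python
# Back-of-envelope estimate of weight storage for models of different sizes.
# Assumption: memory is dominated by the weights themselves.

def weight_memory_gb(num_params: float, bytes_per_param: int) -> float:
    """Approximate weight storage in gigabytes (1 GB = 2**30 bytes)."""
    return num_params * bytes_per_param / 2**30

# Orca 13B vs. a GPT-3-class 175B model (GPT-4's size is undisclosed).
for name, params in [("Orca 13B", 13e9), ("GPT-3-class 175B", 175e9)]:
    for precision, nbytes in [("fp32", 4), ("fp16", 2), ("int8", 1)]:
        print(f"{name} @ {precision}: ~{weight_memory_gb(params, nbytes):.0f} GB")
```

At fp16, a 13B model fits in roughly 24 GB, within reach of a single high-end GPU, whereas a 175B-class model needs hundreds of gigabytes spread across many accelerators, which is the practical sense in which smaller models like Orca are cheaper to operate.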
Microsoft's investment in AI projects like Orca shows its commitment to driving the evolution of AI technology and products. The introduction of Orca not only adds to Microsoft's suite of AI-powered products and services but also signals a major step forward in the advancement of AI, and perhaps future competition with its current partner, OpenAI.