OpenAI has made o1-pro available via its developer API after unveiling an upgraded version of its o1 “reasoning” AI model. If you have been following developments in AI, you will know that OpenAI has been focusing on efficiency, and this release is clearly aimed at giving developers stronger reasoning for workloads that demand something smarter and (one would hope) more reliable.
More Power, More Precision
What’s new? More computational muscle. According to OpenAI, o1-pro applies more compute per request, an increase that is said to improve the model’s accuracy and the reproducibility of its answers.
Exclusivity and a Premium Price Tag
There is a catch, however: access is restricted to developers already using OpenAI’s API services. If you are not part of OpenAI’s ecosystem yet, you will have to wait your turn.
And then there is the pricing: $150 per million input tokens and $600 per million output tokens, making o1-pro the most expensive model OpenAI has ever released. It is clearly aimed at enterprises and professionals willing to pay a premium for improved reasoning. But does it actually deliver?
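To put those rates in perspective, here is a minimal sketch of what a single request would cost at the listed prices. The helper name and rounding are illustrative only and not part of any OpenAI SDK; the rates are the ones quoted above.

```python
# Published o1-pro API rates (USD per one million tokens), per the article.
INPUT_RATE_USD_PER_M = 150.0
OUTPUT_RATE_USD_PER_M = 600.0


def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimate the dollar cost of a single o1-pro request.

    Hypothetical helper for illustration; it simply scales the
    per-million-token rates by the token counts used.
    """
    cost = (input_tokens * INPUT_RATE_USD_PER_M
            + output_tokens * OUTPUT_RATE_USD_PER_M) / 1_000_000
    return round(cost, 4)


# Example: a 10,000-token prompt that produces a 5,000-token response.
print(estimate_cost(10_000, 5_000))  # 4.5 (dollars)
```

At these prices, even a modest prompt-and-response pair costs several dollars, which is why the model is pitched at high-value, accuracy-critical workloads rather than casual use.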
Performance: Users Are Divided
Ask ChatGPT Pro users, who have had access to o1-pro since December 2024, and you are likely to hear one of two responses.
Some say o1-pro is excellent, particularly for technical documents and complex language tasks. One developer praised its ability to handle dense, industry-specific terminology with ease and called it a worthwhile investment for critical applications.
But then there is the flip side. Some users report inconsistent output quality, with responses that feel shallow or incomplete compared to previous versions. The reasons remain unclear; some suspect OpenAI has adjusted performance parameters in a way that degrades response quality. Discussions in community forums, particularly on Reddit, have been mixed, with some doubting that the extra money buys consistently better output.