Use cases
A batch API is useful whenever you do not need instantaneous output for your queries. It is most valuable when you need to process large amounts of data, such as:
- LLM as a judge: use Llama 3.1 70B to evaluate smaller models on specific applications, such as a RAG pipeline
- Create synthetic datasets: generate large amounts of synthetic data to fine-tune smaller models
- Classify large datasets: handle large volumes of data, such as sorting emails or documents into categories (see the sketch after this list)
- Analyze conversations: extract insights, identify trends, or summarize chat logs
- Process financial documents: analyze large volumes of financial documents for compliance or insights
- Generate periodic reports: schedule and process detailed reports based on large datasets
- Translate documents in bulk: translate large volumes of text, such as books, documents, or web content
- Analyze sentiment on social media: classify sentiment across large volumes of social media posts or reviews
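To make the classification use case above concrete, here is a minimal sketch of how a batch input file might be prepared. It assumes an OpenAI-compatible batch format, where each line of a JSONL file holds one chat-completion request tagged with a `custom_id`; the exact field names, endpoint URL, and model identifier are placeholders and may differ for your provider.

```python
# Sketch: build a JSONL batch file of email-classification requests.
# Field names follow the OpenAI-style batch format; adapt to your provider.
import json

emails = [
    {"id": "email-001", "text": "Your invoice for March is attached."},
    {"id": "email-002", "text": "Congratulations, you won a free cruise!"},
]

with open("classification_batch.jsonl", "w", encoding="utf-8") as f:
    for email in emails:
        request = {
            "custom_id": email["id"],        # used to match results back to inputs
            "method": "POST",
            "url": "/v1/chat/completions",   # assumed endpoint; check your provider's docs
            "body": {
                "model": "llama-3.1-70b",    # placeholder model name
                "messages": [
                    {
                        "role": "system",
                        "content": "Classify this email as spam, billing, or other. Reply with one word.",
                    },
                    {"role": "user", "content": email["text"]},
                ],
                "max_tokens": 5,
            },
        }
        f.write(json.dumps(request) + "\n")
```

You would then upload this file to the batch endpoint, let the job run asynchronously, and download the results once it completes, matching each answer back to its `custom_id`.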