SGP provides a comprehensive suite of capabilities for building, deploying, and managing AI applications at scale. Our platform combines powerful tools for development, evaluation, and production deployment.
Documentation Index
Fetch the complete documentation index at: https://docs.gp.scale.com/llms.txt
Use this file to discover all available pages before exploring further.
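A minimal sketch of working with that index once fetched. The exact contents of `llms.txt` are not shown here, so the sample text and markdown-link layout below are assumptions; the parser simply collects every URL it finds.

```python
import re

def parse_index(text: str) -> list[str]:
    """Collect every page URL listed in an llms.txt-style index."""
    return re.findall(r"https?://[^\s)]+", text)

# Hypothetical excerpt of the index file (real contents may differ).
sample = "# SGP Docs\n- [Workflows](https://docs.gp.scale.com/workflows)\n"
print(parse_index(sample))  # ['https://docs.gp.scale.com/workflows']
```

In practice you would fetch `https://docs.gp.scale.com/llms.txt` first (e.g. with `urllib.request`) and feed the response body to `parse_index`.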
📄 Document Understanding
Advanced document processing and information extraction capabilities. Learn more about Document Understanding →
Key Features:
- Universal Format Support: PDFs, images, spreadsheets, presentations
- Intelligent Parsing: Convert documents to structured JSON
- Custom Extraction: Define schemas for specific data needs
- High Accuracy OCR: Advanced text recognition and extraction
- Project Management: Secure, isolated document processing
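To make "Custom Extraction: define schemas for specific data needs" concrete, here is a small sketch of checking a structured-JSON parsing result against a schema. The schema format, field names, and validator are all illustrative assumptions, not SGP's actual interface.

```python
# Hypothetical extraction schema (field names are illustrative only).
INVOICE_SCHEMA = {
    "invoice_number": str,
    "total_amount": float,
    "vendor_name": str,
}

def validate_extraction(result: dict, schema: dict) -> bool:
    """Check that a result has every schema field with the expected type."""
    return all(
        field in result and isinstance(result[field], expected)
        for field, expected in schema.items()
    )

# The kind of structured JSON that intelligent parsing might return.
sample = {"invoice_number": "INV-001", "total_amount": 1250.0, "vendor_name": "Acme"}
print(validate_extraction(sample, INVOICE_SCHEMA))  # True
```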
⚙️ Workflows
AI-powered low-code workflow builder that enables enterprise operators to import data, transform it, call LLMs or agents, analyze or evaluate the output, & automate the entire workflow. Learn more about Workflows →
Key Features:
- Importing data: Read data from various sources such as agent traces, CSVs, blob storage, or queryable data sources.
- Transforming data: Transform your data using custom Python code or simple functions. Get structured data from PDFs.
- Call LLMs or Agents: Generate output and traces against input from each row of your dataset using LLMs or Agentex agents.
- Evaluate data: Evaluate your outputs using LLM as Judge and access the evaluation anytime on SGP.
- Visualize data: Generate visualizations like charts & build custom dashboards.
- Export the data: Export the data as a CSV, an SGP dataset, or an SGP Evaluation.
- Automate & monitor: Schedule the workflow to run periodically. Monitor logs & get alerts.
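A sketch of the "Transforming data" step above: a custom Python function applied to each row of imported CSV data. The row-in/row-out signature is an assumption about how a transform step might look, not SGP's actual workflow API.

```python
import csv
import io

def transform(row: dict) -> dict:
    """Hypothetical custom transform: derive a total from price and quantity."""
    row["total"] = float(row["price"]) * int(row["qty"])
    return row

# Imported CSV data, inlined here so the example is self-contained.
raw = "price,qty\n10.50,3\n4.25,2\n"
rows = [transform(r) for r in csv.DictReader(io.StringIO(raw))]
print([r["total"] for r in rows])  # [31.5, 8.5]
```

In a real workflow the platform would supply the rows and collect the outputs; only the function body would be yours to write.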
Training
Train custom models on your own data without managing cloud infrastructure. Submit training jobs through a single API and Train handles the rest, whether your workload runs on GCP, AWS, or Azure. Learn more about Training →
Key Features:
- Any cloud backend: Run jobs on Vertex AI, SageMaker, or Azure ML with a single API
- Bring your own image: Push a Docker image once and reference it in any job
- Portable data access: Mount GCS, S3, or Azure Blob Storage into your container using consistent environment variables
- Simple job lifecycle: Track progress from submission to completion with a consistent status model across all backends
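The "Portable data access" point above can be sketched as follows: training code resolves its input paths from environment variables, so the same container works whether the backend mounted GCS, S3, or Azure Blob Storage. The variable name `TRAIN_DATA_DIR` is hypothetical; consult the Training docs for the actual variables the platform sets.

```python
import os

def input_path(filename: str) -> str:
    """Resolve a data file against the platform-mounted data directory.

    TRAIN_DATA_DIR is a hypothetical variable name standing in for
    whatever the platform injects; the fallback is likewise illustrative.
    """
    data_dir = os.environ.get("TRAIN_DATA_DIR", "/mnt/data")
    return os.path.join(data_dir, filename)

# Simulate the platform mounting a GCS bucket before the job starts.
os.environ["TRAIN_DATA_DIR"] = "/mnt/gcs-bucket"
print(input_path("train.jsonl"))  # /mnt/gcs-bucket/train.jsonl
```

The design point is that the training script never hard-codes a cloud-specific URI; switching backends changes only the environment, not the code.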

