The most common cause of hitting the API rate limit is calling trigger() on a task in a loop. Instead, use batchTrigger(), which triggers multiple tasks in a single API call. You can include up to 100 tasks in a single batch trigger call.
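Because a single batch call accepts at most 100 tasks, larger workloads need to be split into batches first. A minimal sketch of that batching logic (the `chunk` helper and the commented-out task usage are illustrative, not part of the SDK):

```typescript
// Split an array of payloads into batches of at most 100 items,
// the maximum accepted by a single batchTrigger() call.
function chunk<T>(items: T[], size: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

// Hypothetical usage with a task defined via the SDK:
// for (const batch of chunk(payloads, 100)) {
//   await myTask.batchTrigger(batch.map((payload) => ({ payload })));
// }
```

Each iteration of the loop is one API call, so 500 payloads cost 5 requests instead of 500.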
When attaching schedules to tasks, we strongly recommend adding them in our dashboard if they're "static". That way you can control them easily per environment. If you add them dynamically using code, make sure you include a deduplicationKey so you don't attach the same schedule to a task multiple times. If you don't, your task will get triggered multiple times, it will cost you more, and you will hit the limit. If you're creating schedules for your users, you will definitely need to request more schedules from us.
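The important property of a deduplicationKey is that it is deterministic: re-running the same code for the same user and task must produce the same key, so the existing schedule is updated rather than duplicated. A sketch of one way to build such a key (the key format and the commented-out SDK usage are illustrative):

```typescript
// Build a stable deduplicationKey from the task id and the user id.
// Any string that is deterministic per (task, user) pair works;
// this particular format is just an example.
function scheduleDedupeKey(taskId: string, userId: string): string {
  return `${taskId}-${userId}`;
}

// Hypothetical usage when creating a schedule in code:
// await schedules.create({
//   task: "daily-reminder",
//   cron: "0 9 * * *",
//   externalId: userId,
//   deduplicationKey: scheduleDedupeKey("daily-reminder", userId),
// });
```

Running this twice for the same user yields the same key both times, so only one schedule is attached.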
Payloads and outputs that exceed 512KB will be offloaded to object storage, and a presigned URL will be provided to download the data when calling runs.retrieve. However, you don't need to do anything to handle this in your tasks; we transparently upload and download the data during operation.
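If you want to know ahead of time whether a payload will cross the 512KB threshold, you can measure its serialized size. A sketch, assuming the payload is JSON-serializable (the helper name is illustrative):

```typescript
// Payloads larger than 512KB are offloaded to object storage.
const OFFLOAD_THRESHOLD_BYTES = 512 * 1024;

// Estimate whether a payload will be offloaded, based on the size
// of its JSON serialization in bytes.
function willBeOffloaded(payload: unknown): boolean {
  const bytes = Buffer.byteLength(JSON.stringify(payload), "utf8");
  return bytes > OFFLOAD_THRESHOLD_BYTES;
}
```

This is only an estimate on the client side; the platform makes the actual decision when the payload is received.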
An alert destination is a single email address, Slack channel, or webhook URL that you want to send alerts to. If you’re on the Pro plan and need more than the plan limit, you can request more by contacting us via email or Discord.
The default machine is small-1x, which has 0.5 vCPU and 0.5 GB of RAM. You can optionally configure a higher-spec machine, which will increase the cost of running the task but can also improve its performance if the task is CPU or memory bound. See the machine configurations for more details.
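The machine can be selected per task in its definition. A sketch, assuming the v3 SDK's task-level machine option (the preset name, task id, and payload shape here are illustrative; check the machine configurations page for the presets actually available):

```typescript
import { task } from "@trigger.dev/sdk/v3";

// Request a larger machine for a CPU-bound task.
// "large-1x" is an illustrative preset name.
export const resizeImages = task({
  id: "resize-images",
  machine: { preset: "large-1x" },
  run: async (payload: { urls: string[] }) => {
    // CPU- and memory-heavy work runs here on the larger machine.
  },
});
```

Tasks without a machine option fall back to the small-1x default.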