Import data
There are two ways to import the data you want to label:
Upload the data to Segments.ai's asset storage service via the web interface or Python SDK.
Keep the data in your own cloud bucket and submit the URLs to Segments.ai via the Python SDK.
The maximum file size for our asset storage service is 100MB.
Within a dataset, click the "Add samples" button or drag and drop files onto the page. The uploaded assets (e.g. image or point cloud files) are stored in Segments.ai's AWS S3 bucket. The asset URLs are public but unguessable, making them accessible only to dataset collaborators and anyone they share the URLs with.
See the Python SDK documentation for uploading assets programmatically.
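As an illustration, the sketch below uses the Segments.ai Python SDK (`segments-ai`) to upload a local file to the asset storage service and register it as a sample. The API key, file name, and dataset identifier are placeholders.

```python
from segments import SegmentsClient

# Authenticate with your Segments.ai API key (placeholder value).
client = SegmentsClient("YOUR_API_KEY")

# Upload a local file to Segments.ai's asset storage service.
# The returned asset contains a public-but-unguessable URL.
with open("image.jpg", "rb") as f:
    asset = client.upload_asset(f, filename="image.jpg")

# Create a sample in your dataset that references the uploaded asset.
attributes = {"image": {"url": asset.url}}
sample = client.add_sample("your-org/your-dataset", "image.jpg", attributes)
print(sample.uuid)
```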
If you want to keep the data in your own cloud bucket or on your own file server, you can use the Python SDK to submit the asset URLs to Segments.ai. In this case, no data is copied to our storage system; only a reference (URL) to the data is stored in our database. This can be done in three ways (minimal sketches follow the list below):
Keep the data in a cloud bucket whose content can be publicly accessed but not listed. You store the assets in this bucket with unguessable file names (containing a random uuid) such that they can only be accessed by third parties who you've shared the URLs with.
Keep the data in a private cloud bucket or server, and generate proxied or pre-signed URLs on your end to retain full control of the access permissions. These URLs can have custom restrictions: expiry time, maximum number of accesses, IP whitelisting, rate limits, etc.
Keep the data in a private cloud bucket or server, and grant us cross-account access. In this case, we generate temporary pre-signed URLs whenever the images need to be displayed in the frontend. For setting this up, see the documentation on configuring cross-account bucket access.
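For the first approach, here is a minimal sketch of uploading a file to your own S3 bucket under an unguessable key and then registering the resulting URL with Segments.ai. It assumes boto3 and the Segments.ai Python SDK are installed; the bucket name, region, API key, and dataset identifier are placeholders.

```python
import uuid

import boto3
from segments import SegmentsClient

s3 = boto3.client("s3")
client = SegmentsClient("YOUR_API_KEY")

# Upload the file under an unguessable key (containing a random UUID),
# so it can only be reached by someone who knows the full URL.
key = f"assets/{uuid.uuid4()}/image.jpg"
s3.upload_file("image.jpg", "your-bucket", key)

# Register the public (but unguessable) URL as a sample; only this URL
# is stored in the Segments.ai database.
url = f"https://your-bucket.s3.eu-west-1.amazonaws.com/{key}"
client.add_sample("your-org/your-dataset", "image.jpg", {"image": {"url": url}})
```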
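For the second approach, a sketch of generating a pre-signed URL with boto3 and submitting it. The labeling frontend needs the URL to remain valid, so choose the expiry time accordingly; bucket, key, and dataset names are placeholders.

```python
import boto3
from segments import SegmentsClient

s3 = boto3.client("s3")
client = SegmentsClient("YOUR_API_KEY")

# Generate a pre-signed URL that grants temporary read access to a
# private object (here: valid for 7 days).
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "your-bucket", "Key": "assets/image.jpg"},
    ExpiresIn=7 * 24 * 3600,
)

# Submit the pre-signed URL as a sample reference; the data itself
# never leaves your bucket.
client.add_sample("your-org/your-dataset", "image.jpg", {"image": {"url": url}})
```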