`multipart/form-data` requests. The response will include the following fields:

- `id`: The ID of the file used for getting info, updating, or deleting
- `name`: The name of the file or the provided name in the `addMetadata` method
- `cid`: A cryptographic hash based on the contents of the file
- `size`: The size of the file in bytes
- `number_of_files`: The number of files in a reference
- `mime_type`: The mime type of the uploaded file
- `group_id`: The group the file was uploaded to, if applicable
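
For reference, a successful upload response has roughly the following shape (every value below is a made-up placeholder):

```typescript
// Illustrative shape of an upload response; all values are placeholders.
const exampleUploadResponse = {
  id: "0191f3e1-0000-7000-8000-000000000000",   // file ID used for get/update/delete
  name: "example.png",                          // original or overridden file name
  cid: "bafybeibv7examplecidexamplecidexample", // hash of the file contents
  size: 4021,                                   // size in bytes
  number_of_files: 1,                           // number of files in this reference
  mime_type: "image/png",                       // detected mime type
  group_id: null,                               // group the file was uploaded to, if any
};
```
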
Additional metadata can be added with the `name` or `keyvalues` methods after the selected upload method. This can include an optional name override or keyvalue pairs that can be used to search for the file later on.
Files can also be uploaded to a group by using the `group` method (see the sketch below).
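
As a rough sketch (assuming the SDK is initialized with a JWT and that `public` is the target network; the group ID is a placeholder), the three methods chain onto an upload like this:

```typescript
import { PinataSDK } from "pinata";

const pinata = new PinataSDK({ pinataJwt: process.env.PINATA_JWT! });

const file = new File(["hello world"], "hello.txt", { type: "text/plain" });

// Chain name/keyvalues/group onto the chosen upload method.
const upload = await pinata.upload.public
  .file(file)
  .name("custom-name.txt")    // optional name override
  .keyvalues({ env: "prod" }) // searchable key-value pairs
  .group("0192868e-6144-7685-9fc5-af68a1e48f29"); // placeholder group ID
```
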
When uploading with a signed upload URL in the SDK, use the `.url()` parameter on any of the upload methods and pass in the signed upload URL there. If you are using the API, you can simply make the upload request using the signed URL as the endpoint.
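
A minimal sketch of the API route, assuming the signed URL was created ahead of time and that the endpoint accepts a `file` form field:

```typescript
// Sketch: upload a file directly to a previously created signed upload URL.
async function uploadWithSignedUrl(signedUploadUrl: string, file: File) {
  const form = new FormData();
  form.append("file", file); // assumed field name

  const res = await fetch(signedUploadUrl, { method: "POST", body: form });
  if (!res.ok) throw new Error(`Upload failed with status ${res.status}`);
  return res.json();
}
```
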
In the meantime, upload progress can be tracked by making the request with an HTTP client that exposes progress events, such as `got`. Better support for upload progress will come in later versions of the SDK!
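
A sketch of that approach using `got`'s `uploadProgress` event; the endpoint, form fields, and auth header here are assumptions based on the hosted uploads endpoint:

```typescript
import fs from "node:fs";
import got from "got";
import FormData from "form-data";

// Sketch: stream a multipart upload and log progress as it goes.
const form = new FormData();
form.append("file", fs.createReadStream("./video.mp4")); // illustrative path
form.append("network", "public"); // assumed field

await got
  .post("https://uploads.pinata.cloud/v3/files", {
    body: form,
    headers: {
      ...form.getHeaders(),
      Authorization: `Bearer ${process.env.PINATA_JWT}`,
    },
  })
  .on("uploadProgress", (progress) => {
    console.log(`Uploaded ${(progress.percent * 100).toFixed(1)}%`);
  });
```
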
Most uploads are passed in as a `File` object.
In Node.js you can use `fs` to feed a file into a `Blob`, then turn that `Blob` into a `File`. Due to the buffer limit in Node.js you may have issues going beyond 2GB with this approach; using Resumable Uploads with a client-side file picker will help you get past this limit.
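
A minimal sketch of that flow (the file path is illustrative):

```typescript
import fs from "node:fs";
import { PinataSDK } from "pinata";

const pinata = new PinataSDK({ pinataJwt: process.env.PINATA_JWT! });

// Read the file into memory, wrap it in a Blob, then a File.
// Because the whole file is buffered, this is where the ~2GB ceiling comes from.
const buffer = fs.readFileSync("./video.mp4"); // illustrative path
const blob = new Blob([buffer]);
const file = new File([blob], "video.mp4", { type: "video/mp4" });

const upload = await pinata.upload.public.file(file);
```
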
When uploading multiple files through the API, a common mistake is using the same `file` entry and having multiple files for that one entry, which does not work; each file needs to be appended as its own entry (see the sketch below).
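
As a rough illustration (the field name, the folder-prefixed file names, and how the target endpoint interprets them are assumptions), each file gets its own `append` call:

```typescript
// Sketch: build form data where every file is its own entry.
function buildMultiFileForm(files: File[]): FormData {
  const form = new FormData();
  for (const file of files) {
    // One append per file; never attach several files to a single entry.
    form.append("file", file, `my-folder/${file.name}`); // assumed naming scheme
  }
  return form;
}
```
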
To upload a file from an external URL, fetch the contents into an `arrayBuffer`, which then gets passed into a new `Blob` that can then be uploaded to Pinata. This has been abstracted in the SDK using the `url` method.
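
A sketch of both the manual flow and the SDK shortcut (the source URL is a placeholder):

```typescript
import { PinataSDK } from "pinata";

const pinata = new PinataSDK({ pinataJwt: process.env.PINATA_JWT! });

// Manual flow: fetch the remote content, buffer it, wrap it in a Blob/File.
const res = await fetch("https://example.com/image.png"); // placeholder URL
const arrayBuffer = await res.arrayBuffer();
const blob = new Blob([arrayBuffer], { type: "image/png" });
const file = new File([blob], "image.png", { type: "image/png" });
const manualUpload = await pinata.upload.public.file(file);

// SDK shortcut: the url method wraps the same steps.
const sdkUpload = await pinata.upload.public.url("https://example.com/image.png");
```
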
For base64 content, decode it into a `buffer` that is passed into a `Blob`. Alternatively you can use the SDK for this as well using the `base64` method.
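
A sketch of both routes (the base64 string is a placeholder):

```typescript
import { PinataSDK } from "pinata";

const pinata = new PinataSDK({ pinataJwt: process.env.PINATA_JWT! });

const base64Data = "SGVsbG8gZnJvbSBQaW5hdGEh"; // placeholder content

// Manual route: decode the base64 string into a Buffer, then a Blob/File.
const buffer = Buffer.from(base64Data, "base64");
const blob = new Blob([buffer], { type: "text/plain" });
const file = new File([blob], "hello.txt", { type: "text/plain" });
const manualUpload = await pinata.upload.public.file(file);

// SDK shortcut using the base64 method.
const sdkUpload = await pinata.upload.public.base64(base64Data);
```
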
The upload endpoint `https://uploads.pinata.cloud/v3/files` is fully TUS compatible, so it can support larger files with the ability to resume uploads. Any file upload larger than 100MB needs to be uploaded through the TUS method or through the legacy `/pinFileToIPFS` endpoint. The SDK handles this automatically when you use `pinata.upload.<network>.file()` by checking the file size before uploading.
If you want to use either `group_id` or `keyvalues`, these can be passed directly into the upload metadata when using a TUS client such as `tus-js-client` (see the example below).
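
A sketch with `tus-js-client`; the chunk size, retry delays, and the exact metadata keys (including serializing `keyvalues` as a JSON string) are assumptions:

```typescript
import * as tus from "tus-js-client";

const file = new File(["hello world"], "hello.txt", { type: "text/plain" });

const upload = new tus.Upload(file, {
  endpoint: "https://uploads.pinata.cloud/v3/files",
  headers: { Authorization: `Bearer ${process.env.PINATA_JWT}` },
  chunkSize: 50 * 1024 * 1024, // assumed 50MB chunks
  retryDelays: [0, 3000, 5000, 10000],
  metadata: {
    filename: "hello.txt",
    filetype: "text/plain",
    group_id: "0192868e-6144-7685-9fc5-af68a1e48f29", // placeholder group ID
    keyvalues: JSON.stringify({ env: "prod" }),       // assumed serialization
  },
  onProgress: (bytesUploaded, bytesTotal) => {
    console.log(`${((bytesUploaded / bytesTotal) * 100).toFixed(1)}%`);
  },
  onSuccess: () => console.log("Upload complete"),
  onError: (error) => console.error("Upload failed:", error),
});

upload.start();
```
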
Pinning by CID will return a `request_id`, and Pinata will start looking for the file. Progress can be checked by using the `queue` method.
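
A rough sketch of checking on a pin request (the SDK method paths used here, `upload.<network>.cid` and `files.<network>.queue`, and the queue filter are assumptions beyond the `queue` method named above):

```typescript
import { PinataSDK } from "pinata";

const pinata = new PinataSDK({ pinataJwt: process.env.PINATA_JWT! });

// Assumed method for pinning by CID; the CID below is a placeholder.
const pinRequest = await pinata.upload.public.cid(
  "bafybeibv7examplecidexamplecidexample"
);
console.log(pinRequest); // should include the request_id

// Assumed method and filter for checking the queue.
const queue = await pinata.files.public.queue({ status: "searching" });
console.log(queue);
```
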
A pin by CID request can have one of the following statuses:

- `prechecking`: Pinata is running preliminary validations on your pin request.
- `searching`: Pinata is actively searching for your content on the IPFS network. This may take some time if your content is isolated.
- `retrieving`: Pinata has located your content and is now in the process of retrieving it.
- `expired`: Pinata wasn’t able to find your content after a day of searching the IPFS network. Please make sure your content is hosted on the IPFS network before trying to pin again.
- `backfilled`: Pinata can only search 250 files at a time per account, so if you have more than 250 items in your queue then the extra items will sit in a backfilled status. Once the queue goes down it will automatically start working on the next items in the backfill queue.
- `over_free_limit`: Pinning this object would put you over the free tier limit. Please add a credit card to continue pinning content.
- `over_max_size`: This object is too large to pin. If you’re seeing this, please contact us for a more custom solution.
- `invalid_object`: The object you’re attempting to pin isn’t readable by IPFS nodes. Please contact us if you receive this, as we’d like to better understand what you’re attempting to pin.
- `bad_host_node`: You provided a host node that was either invalid or unreachable. Please make sure all provided host nodes are online and reachable.
If you want to predetermine the CID before uploading, you can do so with the `ipfs-unixfs-importer` and `blockstore-core` libraries.
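
A sketch of computing a CID locally with those libraries; the importer options shown (CIDv1 with raw leaves) are assumptions about matching the CIDs Pinata produces:

```typescript
import { importer } from "ipfs-unixfs-importer";
import { MemoryBlockstore } from "blockstore-core";

// Chunk and hash the content the same way an IPFS node would, in memory.
async function predetermineCid(content: Uint8Array): Promise<string> {
  const blockstore = new MemoryBlockstore();
  let lastCid = "";

  for await (const entry of importer([{ content }], blockstore, {
    cidVersion: 1,   // assumed: CIDv1
    rawLeaves: true, // assumed: raw leaves for single-block content
  })) {
    lastCid = entry.cid.toString();
  }

  return lastCid;
}

const cid = await predetermineCid(new TextEncoder().encode("hello world"));
console.log(cid);
```
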