Google Cloud Storage
Read and write files from a Google Cloud Storage bucket
Introduction to Google Cloud Storage
Google Cloud Storage offers secure and highly scalable object storage on Google Cloud. It is ideal for a variety of uses, including serving website content, storing data for archival and disaster recovery, or distributing large data objects to users via direct download.
Signing up for Google Cloud Storage
To integrate Google Cloud Storage with ByteNite, an active Google Cloud account is necessary. Follow these steps to set up your account and storage bucket:
- Visit the Google Cloud website: Open the Google Cloud homepage in your browser.
- Create a Google Cloud account: Click on "Get started for free" and follow the instructions to set up your account.
- Access the Google Cloud console: Once your account is set up, log in to the Google Cloud Console.
- Create a storage bucket: In the Google Cloud Console, navigate to Cloud Storage > Buckets, click "Create bucket", and complete the setup process.
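If you prefer to script this step, the bucket can also be created with the official `google-cloud-storage` Python client. The sketch below is illustrative: the bucket name and region are placeholders, and the naming check mirrors the basic published GCS bucket-name rules (3–63 characters; lowercase letters, digits, dashes, dots, underscores; must start and end with a letter or number).

```python
import re

def is_valid_bucket_name(name: str) -> bool:
    """Check a name against the basic GCS bucket-naming rules
    (3-63 characters; lowercase letters, digits, '-', '_', '.';
    must start and end with a letter or number)."""
    if not 3 <= len(name) <= 63:
        return False
    return re.fullmatch(r"[a-z0-9][a-z0-9._-]*[a-z0-9]", name) is not None

def create_input_bucket(name: str, location: str = "us-central1"):
    """Create the bucket with the official client. Requires
    `pip install google-cloud-storage` and valid credentials."""
    from google.cloud import storage  # imported here so the name check stays dependency-free
    if not is_valid_bucket_name(name):
        raise ValueError(f"invalid bucket name: {name}")
    return storage.Client().create_bucket(name, location=location)
```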
Generating Google Cloud Storage credentials
To use Google Cloud Storage with ByteNite, you will need to generate appropriate credentials:
- Access the Google Cloud Console: Sign in to the Google Cloud Console.
- Navigate to IAM & Admin: Click on "IAM & Admin" in the main menu.
- Create a service account: Select "Service Accounts" in the sidebar, then click "Create Service Account".
- Enter account details: Provide a name and description for the service account.
- Set permissions: Assign roles such as "Storage Admin" to give the account access to your storage buckets.
- Create key: Once the account is created, click on it, select "Keys", and then "Add Key" > "Create new key". Choose JSON as the key type and download it. This file contains your credentials.
- (If using an access key and secret key) Create an HMAC key: The accessKey and secretKey parameters below correspond to a Cloud Storage HMAC key pair rather than the JSON key file. You can generate an HMAC key for your service account in the console under Cloud Storage > Settings > Interoperability.
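Before wiring the downloaded key file into a job, it can be useful to sanity-check its contents. The helper below is a minimal sketch (the function name is ours, not part of any SDK); the field names it checks are the standard ones present in every Google service-account JSON key.

```python
import json

# Standard fields found in a Google service-account JSON key file.
REQUIRED_KEY_FIELDS = {"type", "project_id", "private_key_id",
                       "private_key", "client_email"}

def validate_key_file(text: str) -> list[str]:
    """Return the problems found in a service-account key file's
    JSON contents (an empty list means the file looks usable)."""
    data = json.loads(text)
    problems = sorted(REQUIRED_KEY_FIELDS - data.keys())
    if data.get("type") not in (None, "service_account"):
        problems.append("type is not 'service_account'")
    return problems
```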
Now that you have set up your Google Cloud Storage bucket and generated the necessary credentials, you are ready to use Google Cloud Storage as a data source and destination for ByteNite processing jobs.
For comprehensive information on Google Cloud Storage, check out the official Google Cloud documentation.
Data Origin ⬇️
You can use input files from a Google Cloud bucket. If your bucket is private, make sure you have the necessary permissions and credentials. Specifically, for data origins, you must grant your credentials at least read access to the designated bucket.
Configuration
API Endpoints
dataSource object:
| Key | Value | Mandatory |
| --- | --- | --- |
| dataSourceDescriptor | "gcp" | * |
| params → @type | "type.googleapis.com/bytenite.data_source.HttpDataSource" | * |
| params → name | The input file path within your Google Cloud Storage bucket. | * |
| params → accessKey | Your Google Cloud Access Key ID. | * |
| params → secretKey | Your Google Cloud Secret Access Key. | * |
| params → bucketName | The name of your Google Cloud Storage bucket. | * |
| params → cloudRegion | The Google Cloud region where your bucket is located. | * |
Example
{
  "dataSource": {
    "dataSourceDescriptor": "gcp",
    "params": {
      "@type": "type.googleapis.com/bytenite.data_source.HttpDataSource",
      "name": "/path/to/file.obj",
      "accessKey": "your_access_key",
      "secretKey": "your_secret_key",
      "bucketName": "your_bucket_name",
      "cloudRegion": "selected_region"
    }
  }
}
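The payload above can also be assembled programmatically. The helper below is a sketch (the function name is ours, not part of the ByteNite SDK): it fills in the fixed fields from the table and refuses to emit a payload with an empty mandatory parameter.

```python
# Fixed @type value from the dataSource table above.
GCP_SOURCE_TYPE = "type.googleapis.com/bytenite.data_source.HttpDataSource"

def make_gcp_data_source(name: str, access_key: str, secret_key: str,
                         bucket_name: str, cloud_region: str) -> dict:
    """Build the dataSource object described in the table above."""
    params = {
        "@type": GCP_SOURCE_TYPE,
        "name": name,
        "accessKey": access_key,
        "secretKey": secret_key,
        "bucketName": bucket_name,
        "cloudRegion": cloud_region,
    }
    empty = [k for k, v in params.items() if not v]
    if empty:
        raise ValueError(f"mandatory params left empty: {empty}")
    return {"dataSource": {"dataSourceDescriptor": "gcp", "params": params}}
```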
Data Destination ⬆️
You can direct the output files from ByteNite to a Google Cloud Storage bucket that you control. Ensure that your bucket is appropriately configured for access. If it's a private bucket, confirm that your Google Cloud credentials have the necessary write permissions for the intended bucket.
Configuration
API Endpoints
dataDestination object:
| Key | Value | Mandatory |
| --- | --- | --- |
| dataSourceDescriptor | "gcp" | * |
| params → @type | "type.googleapis.com/bytenite.data_source.S3DataSource" | * |
| params → name | The output folder path within your Google Cloud Storage bucket. Note: if the path doesn't exist, it will be created. | * |
| params → accessKey | Your Google Cloud Access Key ID. | * |
| params → secretKey | Your Google Cloud Secret Access Key. | * |
| params → bucketName | The name of your Google Cloud Storage bucket. | * |
| params → cloudRegion | The Google Cloud region where your bucket is located. | * |
Example
{
  "dataDestination": {
    "dataSourceDescriptor": "gcp",
    "params": {
      "@type": "type.googleapis.com/bytenite.data_source.S3DataSource",
      "name": "/output/folder/",
      "accessKey": "your_access_key",
      "secretKey": "your_secret_key",
      "bucketName": "your_bucket_name",
      "cloudRegion": "selected_region"
    }
  }
}
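A matching helper can build the destination payload. Again, the function name is ours rather than part of any SDK, and the trailing-slash normalization is a convenience assumption based on the "/output/folder/" form used in the example above.

```python
# Fixed @type value from the dataDestination table above.
GCP_DEST_TYPE = "type.googleapis.com/bytenite.data_source.S3DataSource"

def make_gcp_data_destination(folder: str, access_key: str, secret_key: str,
                              bucket_name: str, cloud_region: str) -> dict:
    """Build the dataDestination object described in the table above."""
    if not folder.endswith("/"):
        folder += "/"  # the example payload uses a trailing slash for folders
    return {
        "dataDestination": {
            "dataSourceDescriptor": "gcp",
            "params": {
                "@type": GCP_DEST_TYPE,
                "name": folder,
                "accessKey": access_key,
                "secretKey": secret_key,
                "bucketName": bucket_name,
                "cloudRegion": cloud_region,
            },
        }
    }
```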