Chapter 83 – Ultimate Guide to Google Cloud Storage CRUD Using Python

File storage, such as videos and images, often takes up the largest share of the servers behind a brand's site. And how well a cloud storage service interacts with your business applications is one of the most important criteria for judging whether that storage is any good.

Google Cloud Storage's per-GB pricing is comparatively inexpensive, and its integration with applications is flexible and friendly. So in this piece, I will walk through Google Cloud Storage CRUD. By the end, you will be able to refer to and apply these methods to set up applications on top of Google Cloud Storage.

Table of Contents: Ultimate Guide to Google Cloud Storage CRUD Using Python

Create a Bucket
Upload Files
Rename Files
Update Files
Delete Files
Create Folders
Delete Folders
Enable Public Access to GCS Buckets
Fetch Files Hosted in Google Cloud Storage
Full Python Scripts of Google Cloud Storage CRUD
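Every snippet in this chapter needs an authenticated Storage client. Here is a minimal setup sketch; the key-file path 'key.json' and the variable names service_account_info and client are assumptions of this sketch, so adapt them to your own project:

import json
from google.cloud import storage
from google.oauth2 import service_account

# Load your downloaded service-account key file into a dict
with open('key.json') as f:  # assumed path; point this at your own key file
    service_account_info = json.load(f)

credentials = service_account.Credentials.from_service_account_info(service_account_info)
client = storage.Client(credentials=credentials)

The snippets below reuse service_account_info and client from this setup.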

Create a Bucket

Buckets in GCP are the basic containers where you store data in the cloud: every object you upload lives inside a bucket. You cannot use buckets the same way you use directories or folders, though. The creation process is more restrictive (names must be globally unique), buckets cannot be nested, and a bucket can only be deleted after it has been emptied. So if you would like to store data in GCP for a web app, machine learning, and so on, creating a bucket is the first step. Here is the sample code:

from google.cloud import storage
from google.oauth2 import service_account

def create_bucket(bucket_name):
    # service_account_info is the key-file dict from the setup snippet above
    credentials = service_account.Credentials.from_service_account_info(service_account_info)
    client = storage.Client(credentials=credentials)
    bucket = client.create_bucket(bucket_name, location='US-EAST1')
    return f"Bucket {bucket.name} created."

# Bucket names must be lowercase and free of spaces
bucket_name = "buyfromlo-bucket"
create_bucket(bucket_name)
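Note that bucket names live in a single global namespace: a name must be unique across all of Google Cloud, 3 to 63 characters long, and limited to lowercase letters, numbers, dashes, underscores, and dots. That is why the sample above uses buyfromlo-bucket rather than the invalid "Buyfromlo Bucket".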

Upload Files

Below is the sample code that continues from the bucket creation and uploads a file using a Python script.

from google.cloud import storage
from google.oauth2 import service_account

def upload_file(bucket_name):
    credentials = service_account.Credentials.from_service_account_info(service_account_info)
    client = storage.Client(credentials=credentials)
    bucket = client.bucket(bucket_name)
    file_name = 'my-file.txt'
    blob = bucket.blob(file_name)
    # Read the local file and upload its bytes to the blob
    with open(file_name, 'rb') as f:
        contents = f.read()
    blob.upload_from_string(contents)
    return f'File {file_name} uploaded to {blob.public_url}'

bucket_name = "buyfromlo-bucket"
upload_file(bucket_name)
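Reading the whole file into memory is wasteful for large files. As an alternative sketch, the client library's upload_from_filename streams the local file straight to the bucket; the names below are just the ones from the snippet above:

bucket = client.bucket('buyfromlo-bucket')
blob = bucket.blob('my-file.txt')
blob.upload_from_filename('my-file.txt')  # streams the file without holding it all in memory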

Rename Files

Naming is one of the most important parts of file handling, especially when you deal with large datasets, as in machine learning. Here is the one-liner that renames a file with Python, followed by a fuller sketch:

    # Renames the existing blob inside the bucket to new_file_name
    bucket.rename_blob(blob, new_file_name)
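For context, here is a minimal end-to-end sketch; the bucket and file names are just the ones used earlier in this chapter, and note that rename_blob copies the object to the new name and then deletes the original, so it is not atomic:

bucket = client.bucket('buyfromlo-bucket')
blob = bucket.blob('my-file.txt')
# rename_blob returns the new Blob; under the hood it is a copy followed by a delete
new_blob = bucket.rename_blob(blob, 'my-file-renamed.txt')
print(f'File renamed to {new_blob.name}')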

Update Files

Updating a file here means updating its custom metadata; patch() persists just the changed fields to GCS:

# Attach custom metadata to an existing blob
blob = bucket.blob(file_name)
metadata = {'description': 'This file metadata is updated via HandsOnCloud Tutorial'}
blob.metadata = metadata
blob.patch()  # sends only the modified fields to the API
print(f'Metadata for file {file_name} updated.')
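To confirm the change, you can reload the object from the server; a minimal sketch reusing bucket and file_name from above:

blob = bucket.get_blob(file_name)  # fetches the current server-side metadata
print(blob.metadata)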

Delete Files

# Look up the object by name and delete it from the bucket
blob = bucket.blob(file_name)
blob.delete()
print(f'File {file_name} deleted.')
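delete() raises an error when the object does not exist, so in application code you may want to guard it; a minimal sketch:

from google.cloud.exceptions import NotFound

try:
    blob.delete()
except NotFound:
    print(f'File {file_name} was already gone.')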

Create Folders
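Google Cloud Storage has a flat namespace, so there are no real folders: a "folder" is just a zero-byte placeholder object whose name ends with a slash, and other objects appear inside it when their names share that prefix.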

# The trailing slash is what makes GCS treat this object as a folder placeholder
folder_name = 'Buyfromlo Images/'
folder = bucket.blob(folder_name)
folder.upload_from_string('')  # zero-byte object that acts as the folder
print(f'Folder {folder_name} created.')

Delete Folders

GCP runs a pay-as-you-go model, which implies it doesn't have any upfront cost, and that can hugely help a business reduce its recurring fixed costs. That said, it's a monthly billing model. For instance, if you keep 5 GB of data always-on in GCP, it still costs you recurring fees. Therefore, you need to know how to delete data you no longer use, so you don't waste money.

Here is the sample code:

# GCS folders are just name prefixes, so deleting a folder means
# deleting every object whose name starts with that prefix
folder_name = 'buyfromlo/'
for blob in client.list_blobs(bucket_name, prefix=folder_name):
    blob.delete()
print(f'Folder {folder_name} deleted.')
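The library also offers a bucket-level helper that issues the deletes for you; a minimal sketch using delete_blobs, which raises NotFound for a missing object unless you pass an on_error callback:

bucket = client.bucket(bucket_name)
blobs = list(client.list_blobs(bucket_name, prefix='buyfromlo/'))
bucket.delete_blobs(blobs, on_error=lambda blob: None)  # skip objects that vanished mid-run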

Enable Public Access to GCS Buckets

By default, new files and datasets created and stored on GCP are not open to the public, which implies that anonymous users, including any app or script without credentials, can't access the files. You must explicitly enable public access. Fortunately, the way to do that is very straightforward. Here is the code sample:

from typing import List
from google.cloud import storage
from google.oauth2 import service_account

def make_bucket_public(bucket_name: str, members: List[str] = ["allUsers"]):
    credentials = service_account.Credentials.from_service_account_info(service_account_info)
    client = storage.Client(credentials=credentials)
    bucket = client.bucket(bucket_name)
    # Fetch the current IAM policy and grant object-read access to the members
    policy = bucket.get_iam_policy(requested_policy_version=3)
    policy.bindings.append({"role": "roles/storage.objectViewer", "members": members})
    bucket.set_iam_policy(policy)
    return f"Bucket {bucket.name} is now publicly readable"
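Calling it is then a one-liner. Keep in mind that granting roles/storage.objectViewer to allUsers makes every object in the bucket readable by anyone on the internet, so only do this for assets that are genuinely meant to be public:

make_bucket_public('buyfromlo-bucket')  # bucket name from the earlier snippets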

Fetch Files Hosted in Google Cloud Storage

For more details on fetching datasets stored in Google Cloud Storage, please refer to the following article:

https://www.easy2digital.com/automation/data/chapter-78-fetching-media-files-using-google-cloud-storage-and-python/

Full Python Scripts of Google Cloud Storage CRUD, Including Bulk Interaction and Image Byte Data

If you are interested in Chapter 83 – Ultimate Guide to Google Cloud Storage CRUD Using Python, please subscribe to our newsletter by adding the message 'Chapter 83 + Full scripts of Google Cloud Storage CRUD'. We would send you the script when the up-to-date app script is live.

I hope you enjoy reading Chapter 83 – Ultimate Guide to Google Cloud Storage CRUD Using Python. If you did, please support us by doing one of the things listed below, because it always helps out our channel.