Tags: python, google-cloud-storage, google-cloud-platform, google-cloud-datalab

Write a Pandas DataFrame to Google Cloud Storage or BigQuery


Hello and thanks for your time and consideration. I am developing a Jupyter Notebook in Google Cloud Platform / Datalab. I have created a Pandas DataFrame and would like to write it to Google Cloud Storage (GCS) and/or BigQuery. I have a bucket in GCS and have created the following objects via this code:

import gcp
import gcp.storage as storage

# Look up the current project and point at an existing bucket
project = gcp.Context.default().project_id
bucket_name = 'steve-temp'
bucket_path = bucket_name
bucket = storage.Bucket(bucket_path)
bucket.exists()  # confirms the bucket is reachable

I have tried various approaches based on the Google Datalab documentation, but they continue to fail. Thanks.


Solution

  • Try the following working example:

    from google.datalab import Context
    import google.datalab.storage as storage
    import google.datalab.bigquery as bq
    import pandas as pd
    
    # DataFrame to write (use lists rather than sets so column order is preserved)
    simple_dataframe = pd.DataFrame(data=[[1, 2, 3], [4, 5, 6]], columns=['a', 'b', 'c'])
    
    sample_bucket_name = Context.default().project_id + '-datalab-example'
    sample_bucket_path = 'gs://' + sample_bucket_name
    sample_bucket_object = sample_bucket_path + '/Hello.txt'
    bigquery_dataset_name = 'TestDataSet'
    bigquery_table_name = 'TestTable'
    
    # Define storage bucket
    sample_bucket = storage.Bucket(sample_bucket_name)
    
    # Create storage bucket if it does not exist
    if not sample_bucket.exists():
        sample_bucket.create()
    
    # Define BigQuery dataset and table
    dataset = bq.Dataset(bigquery_dataset_name)
    table = bq.Table(bigquery_dataset_name + '.' + bigquery_table_name)
    
    # Create BigQuery dataset
    if not dataset.exists():
        dataset.create()
    
    # Create or overwrite the existing table if it exists
    table_schema = bq.Schema.from_data(simple_dataframe)
    table.create(schema=table_schema, overwrite=True)
    
    # Write the DataFrame to GCS (Google Cloud Storage)
    %%gcs write --variable simple_dataframe --object $sample_bucket_object
    
    # Write the DataFrame to a BigQuery table
    table.insert(simple_dataframe)
    
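    Note that %%gcs write is a cell magic, so it has to be executed at the top of its own notebook cell, separate from the preceding Python code.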

    I used this example and the _table.py file from the datalab GitHub repository as references. The rest of the datalab source code can be found in that same repository.
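
    If you are not working inside Datalab, a similar result is possible with the standalone Google Cloud client libraries. The following is a minimal sketch, not the Datalab approach above: it assumes the google-cloud-storage, google-cloud-bigquery, and pyarrow packages are installed, that application-default credentials are configured, and that the bucket and table names (my-example-bucket, TestDataSet.TestTable) are placeholders to replace with your own.

    from google.cloud import bigquery, storage
    import pandas as pd
    
    # DataFrame to write
    df = pd.DataFrame(data=[[1, 2, 3], [4, 5, 6]], columns=['a', 'b', 'c'])
    
    # Write the DataFrame to GCS as a CSV object
    storage_client = storage.Client()
    bucket = storage_client.bucket('my-example-bucket')  # hypothetical bucket name
    blob = bucket.blob('Hello.csv')
    blob.upload_from_string(df.to_csv(index=False), content_type='text/csv')
    
    # Load the DataFrame into a BigQuery table (requires pyarrow)
    bq_client = bigquery.Client()
    load_job = bq_client.load_table_from_dataframe(df, 'TestDataSet.TestTable')
    load_job.result()  # block until the load job completes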