
Migrating Patient Files From Vendor's S3 Files to Databricks Volume

Requirement


Create a Databricks PySpark script that transfers patient data files from the vendor's S3 bucket to a Databricks Volume.

 

Databricks secret information: the AWS credentials "access_key" and "secret_key" are stored in Databricks secrets under the scope "aws_keys".

 

Volume path: /Volumes/agilisium_playground/purgo_playground/patient_files

 

S3 file path: s3://agilisium-playground-dev/filestore/vendor/patient_files/patient_data_202502.csv
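A minimal sketch of the requested transfer, using the secret scope, Volume path, and S3 path stated above. Everything else (the `to_s3a` and `target_path` helpers, the use of `dbutils.fs.cp` for the copy, and the Hadoop S3A credential configuration) is an assumption about how the script could be structured; `spark` and `dbutils` are only available inside a Databricks notebook or job, so the Databricks-specific calls live in a function.

```python
# Sketch only: paths and secret scope come from the requirement above;
# helper names and the copy mechanism are illustrative assumptions.

S3_FILE = "s3://agilisium-playground-dev/filestore/vendor/patient_files/patient_data_202502.csv"
VOLUME_DIR = "/Volumes/agilisium_playground/purgo_playground/patient_files"


def to_s3a(path: str) -> str:
    """Spark's Hadoop connector uses the s3a:// scheme rather than s3://."""
    return "s3a://" + path[len("s3://"):] if path.startswith("s3://") else path


def target_path(s3_path: str, volume_dir: str) -> str:
    """Destination path inside the Volume, keeping the source file name."""
    return volume_dir.rstrip("/") + "/" + s3_path.rsplit("/", 1)[-1]


def transfer_patient_file(spark, dbutils):
    """Runs inside Databricks: pull AWS keys from the secret scope,
    configure S3A access, and copy the vendor file into the Volume."""
    access_key = dbutils.secrets.get(scope="aws_keys", key="access_key")
    secret_key = dbutils.secrets.get(scope="aws_keys", key="secret_key")

    # Session-level Hadoop configuration for the S3A connector
    # (cluster-level spark.hadoop.* settings are an alternative).
    hconf = spark._jsc.hadoopConfiguration()
    hconf.set("fs.s3a.access.key", access_key)
    hconf.set("fs.s3a.secret.key", secret_key)

    # Byte-for-byte copy of the CSV into the Volume.
    dbutils.fs.cp(to_s3a(S3_FILE), target_path(S3_FILE, VOLUME_DIR))
```

Using `dbutils.fs.cp` keeps the file intact as a single CSV; if the data instead needs to be validated or reshaped on the way in, the copy step could be replaced with a `spark.read.csv(...)` / `df.write` pair.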

Purgo AI Agentic Code
