Top Talent like Sai Srikanth is on Pangea

Pangea, a YC company, connects companies with fractional talent. Fractional hiring allows companies to move faster and work with more specialized talent, while giving talent more flexibility and independence. If you’re talent open to fractional work, apply here. If you’re a company looking for high-quality fractional talent, learn more here.

Sai Srikanth Bezawada

AI & Machine Learning Engineer · Data Engineer · Data Analyst
Chadstone, VIC, AU
Amazon Web Services
Google Cloud Platform
NoSQL
SQL
Python
Node.js
Microsoft Azure
Data Analysis
Ads Manager
GPT-3
Jupyter Notebook
Computer Science
Azure DevOps
TensorFlow
Programming
C Programming Language
Software Development
Elasticsearch
Databases
Available for hire from: Negotiable
Contracts
Full-Time Roles
Passionate Cloud Data Engineer / Analyst with 10 years of industry experience across cloud platforms and tools, including AWS, Azure, and GCP.
I'm a seasoned Cloud Data Engineer with a passion for transforming raw data into actionable insights. With 10 years in the industry, I've honed my skills to bridge the gap between data and decision-making, using cloud technologies to unlock the true potential of data. Throughout my journey, I've contributed to diverse projects, from architecting scalable data pipelines to designing robust ETL processes. My expertise spans AWS, Azure, and GCP, and I take pride in my ability to optimize data storage, implement efficient data models, and ensure seamless data integration across the cloud ecosystem.

Work History

Lead Data Engineer

Optus Sport

Working as a Lead Data Engineer at Optus Sport, maintaining and improving the data pipeline (built on AWS) that distils multiple terabytes of raw data into accurate, actionable insights for the business. My role and responsibilities:
• Manage and orchestrate hundreds of millions of data points per hour during matches, across dozens of tech layers and systems
• Use AWS Kinesis and Kafka to connect real-time live data via APIs to the existing data layers in the AWS cloud
• Design and implement real-time API-streamed data pipelines with Kinesis Firehose (see the sketch after this entry)
• Design, build, and orchestrate cloud data warehouse architecture in real time: Athena, Glue, Lambda, EMR, Kinesis
• Expert experience with SQL and NoSQL databases (big data systems)
• Establish and promote best practices for data storage, retrieval, and usage across wider teams
• Blend data engineering and analytics to achieve the best outcome and maintain overall project visibility
• Designed and implemented effective database solutions and models to store and retrieve data
• Identify data inefficiencies, build dashboards, analyze data, and recommend best business practices
• Expert experience in R and Python
• Experience with traditional data warehouse architecture: Teradata, MS SQL Server, Oracle
• Build DevOps and data pipelines across AWS and GCP
• Manage a heavy workload, using judgement to prioritize work that aligns with the overall business strategy and moves the needle
• Productivity tools: experience with Slack, JIRA, Confluence, etc.
• Leveraged mathematical techniques to develop engineering and scientific solutions
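A minimal sketch of the real-time ingestion pattern described above: a producer pushing live match events into a Kinesis Data Firehose delivery stream via boto3, which buffers records into S3 for Glue and Athena. The stream name, region, and record schema are illustrative assumptions, not Optus Sport's actual setup.

```python
import json

import boto3

# Illustrative names only; the real streams, region, and record
# schema are assumptions for this sketch.
firehose = boto3.client("firehose", region_name="ap-southeast-2")
DELIVERY_STREAM = "match-events-delivery"  # hypothetical stream name


def publish_match_event(event: dict) -> None:
    """Push one live match event into a Kinesis Data Firehose delivery
    stream. Firehose buffers records and lands them in S3, where Glue
    crawlers and Athena queries can pick them up."""
    firehose.put_record(
        DeliveryStreamName=DELIVERY_STREAM,
        # Newline-delimited JSON keeps the landed S3 objects easy to
        # query from Athena.
        Record={"Data": (json.dumps(event) + "\n").encode("utf-8")},
    )


publish_match_event(
    {"match_id": "m-123", "metric": "concurrent_viewers", "value": 180000}
)
```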
Senior Data Consultant

CyberCX · Feb 2021 - Jan 2022 · 1 yr

Worked as a Cloud Consultant building end-to-end data pipelines and architectures using cloud services such as AWS, Azure, and GCP, aligned with business requirements. My role and responsibilities:
• Understood, researched, and analyzed cloud system requirements; evaluated and tested programs using Python and Node.js, and ran several test cases
• Worked in several client-facing roles, assisting in identifying and effectively managing the problems that need to be solved to achieve a business outcome
• Identified data inefficiencies, built dashboards, analyzed data, and recommended best business practices
• Worked with different teams across the business to analyze systems and create product development plans based on technical needs
• Wrote reports documenting completed work and ensured the client's work was delivered in accordance with the system architecture and guidelines
• Liaised with customers during initial development, installation, and testing of software solutions and products, and built business insight from the results
• Applied Agile procedures, standards, and methods to scope project plans and estimations for client consideration
• Analyzed deployment models, built project documentation, ensured the quality of data products, and had them approved by clients and end users
• Trained and certified Apache Spark programmer on Databricks, with advanced training on AWS- and Azure-based Databricks
• Worked on several PoCs for project pitching, delivered technical presentations to clients, and helped them understand data engineering and cloud infrastructure projects
• Performed ad-hoc analysis, presented results clearly, and built dashboards that clearly explain the business problem
Cloud Data Engineer

University of Melbourne · Apr 2020 - Feb 2021 · 11 mos

Worked as an IoT AWS Data Engineer, optimizing IoT data pipelines, architectures, and data sets, including IoT sensor data. My role and responsibilities:
• Built, tested, debugged, and managed AWS cloud application failures relating to sensor information and data acquisition; built the software processes and testing protocols and ensured quality standards for the Smart Campus application services
• Analyzed the usage matrix of each IoT sensor, mapped it to business and systems, and built mapping charts for sensors
• Demonstrated and articulated the business intentions behind the project, and built the sensor usage plan for the Smart Campus project
• Understood, researched, and analyzed cloud system requirements; evaluated and tested programs using Python and ran several test cases
• Designed, built, and optimized IoT data pipelines, architectures, and data sets, including IoT sensor data (see the sketch after this entry)
• Worked in an agile environment with a core understanding of DevOps and CI/CD principles
• Designed and built data integration, including real-time data analytics, ETL, data modelling, and SQL
• Experience with open-source frameworks including Apache Spark
• Wrote Python and Node.js code, updated modules, and maintained programs, with end-user evaluation of all new and existing Smart Campus application procedures and orchestration of data pipelines
• Processed files in various image, audio, and video media formats
• Built an integration pipeline between enterprise applications and the Smart Campus platform
• Extensive experience using the AWS cloud platform in an enterprise environment to deliver world-class services and capabilities
• Strong background in data integration: real-time data analytics, ETL, data modelling, and SQL
• Extensive development on AWS technologies with a strong focus on big data and DevOps-related technologies
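One plausible shape for the sensor-data processing described above: an AWS Lambda handler consuming IoT readings delivered through a Kinesis stream. The field names and the omitted downstream storage are assumptions for illustration, not the Smart Campus platform's actual schema.

```python
import base64
import json


def lambda_handler(event, context):
    """Minimal AWS Lambda handler for Kinesis-delivered IoT sensor
    readings: decode each record, keep the well-formed ones, and
    report how many were processed."""
    readings = []
    for record in event.get("Records", []):
        # Kinesis payloads arrive base64-encoded in the Lambda event.
        payload = base64.b64decode(record["kinesis"]["data"])
        reading = json.loads(payload)
        # These field names are hypothetical; real campus sensors
        # would define their own schema.
        if isinstance(reading, dict) and {"sensor_id", "timestamp", "value"} <= reading.keys():
            readings.append(reading)
    # Writing to downstream storage (e.g. S3 or a time-series store)
    # is omitted for brevity.
    return {"processed": len(readings)}
```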
Cloud Data Consultant

NCS Australia · Apr 2019 - Apr 2020 · 1 yr 1 mo

Worked as a Data Consultant helping several clients with technologies such as predictive analytics, Tableau, AWS, machine learning, and data engineering. My role and responsibilities:
• Understood the commercial and business value of each problem statement and provided appropriate software solutions
• Worked with several clients, including Belong and Telstra, to enhance their data collection procedures and deliver information relevant to building software and analytics systems
• Designed GCP data pipelines, built modern cloud data warehouse techniques, and migrated data to Google BigQuery
• Used BigQuery to load and export data from different source systems and to query and view data; interacted with BigQuery via the web UI console and the bq command-line tool (see the sketch after this entry)
• Processed, cleansed, and verified the integrity of data used for analysis, and performed data preparation using Tableau Prep
• Performed ad-hoc analysis, presented results clearly, and built dashboards that clearly explain the business problem
• Worked on several cloud data migration and data engineering projects that help clients build strong, secure cloud ecosystems
• Extensive experience with cloud-based ETL tools such as Talend and AWS Glue, and with tool-free cloud data migration (e.g., Python-based data migration techniques and ML algorithms using AWS services such as Lambda and SageMaker)
• Used serverless techniques to minimize execution and load times, reducing the time required to run batches
• Developed AWS solutions that read the Salesforce Metadata API to track data model changes on the Salesforce side, automating data model governance end to end
• Developed several AWS Lambda functions (in Python) to extract bulk data from Salesforce
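A minimal sketch of the BigQuery load-and-query pattern mentioned above, using the google-cloud-bigquery Python client (roughly what `bq load` and `bq query` do from the command line). The bucket, dataset, and table names are placeholders, not a real client's resources.

```python
from google.cloud import bigquery

# Assumes application default credentials; all resource names below
# are placeholders for illustration.
client = bigquery.Client()

# Load newline-delimited JSON from Cloud Storage into a table, the
# client-library equivalent of:
#   bq load --source_format=NEWLINE_DELIMITED_JSON --autodetect ...
load_job = client.load_table_from_uri(
    "gs://example-bucket/exports/customers.json",
    "example_dataset.customers",
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
        autodetect=True,
    ),
)
load_job.result()  # block until the load job finishes

# Query the loaded table and read back a single aggregate row.
rows = client.query(
    "SELECT COUNT(*) AS n FROM example_dataset.customers"
).result()
print(next(iter(rows)).n)
```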
BI Analyst

NSW Transport · Oct 2015 - Apr 2019 · 3 yrs 7 mos

• Integrated BI assets into customer relationship management tasks, improving the intelligence available to service personnel
• Drafted a strategic business intelligence roadmap, complete with data governance policies and tactical information safeguards
• Developed automated data analytics solutions from data sources using (but not limited to) SQL, Spark, and Python for the National Asset Intelligence System (see the sketch after this entry)
• Performed predictive analytics within the AWS ecosystem
• Visualized insights via reporting tools including Tableau
• Developed patterns for ingestion, transformation, and manipulation of data across different source systems
• Provided support, guidance, and specialist advice to colleagues on delivering value through analytical tools and techniques
• Documented work in adherence to data governance guidelines
• Extensive experience developing and coding ETL jobs and deploying code
• Built several jobs and sub-jobs in Talend using components of Talend Data Integration
• Worked with development methodologies for credit card data and deployed them into the required environment through Control-M scripts
• Analyzed client business requirements and collaborated with the data modeller to identify a suitable data model for the ODS
• Built the loader framework using ETL mechanisms, and prepared data flows and user-experience diagrams
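A minimal sketch of the kind of Spark-based automated analytics described above: a PySpark job that summarizes asset data by region for a reporting dashboard. The paths, column names, and overdue-inspection rule are hypothetical, not the National Asset Intelligence System's real schema.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("asset-analytics").getOrCreate()

# Placeholder path and schema; real asset data would differ.
assets = spark.read.parquet("s3://example-bucket/assets/")

# Flag assets overdue for inspection (assumed rule: > 365 days since
# last inspection) and summarize by region, the kind of automated
# output that could feed a Tableau dashboard.
summary = (
    assets
    .withColumn("overdue", F.col("days_since_inspection") > 365)
    .groupBy("region")
    .agg(
        F.count("*").alias("asset_count"),
        F.sum(F.col("overdue").cast("int")).alias("overdue_count"),
    )
)

summary.write.mode("overwrite").parquet(
    "s3://example-bucket/reports/asset_summary/"
)
```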

Education

Deakin University

Master of Data Science and Analytics, Information Technology · Jul 2017 - Mar 2019

How Pangea Works

Effortlessly discover top talent

We’ve distilled the candidate search from endless hours down to just a few minutes. Using Pangea’s AI-powered search tools, you can find top fractional talent able to take on your next project. Our system looks at your company’s niche and your needs to find the perfect match faster than any traditional hiring platform.

Start working with talent today

The top talent on Pangea is ready to get started with you right now. You can message or hire a candidate right from their profile page and start assigning work as soon as they respond. And the best part? Pangea’s fractional contract structure lets you start small and ramp up as your needs change, keeping your costs manageable and your team’s capabilities flexible.

Track work and invoices in one place

Assign tasks, track progress, and complete invoices all on Pangea. We’ve combined every part of the hiring process into one platform to eliminate the miscommunication that’s unavoidable on other freelance platforms. We even send out 1099s to your contractors at the end of the year!

Talk with a Talent Expert

Members of our team are available to help you speed through the hiring process.
Available Now
Book a Call
Pangea empowers fractional work across the world for marketing and design roles.