We are looking for a skilled Data Engineer with hands-on experience in API integration and building scalable data pipelines on GCP.

Responsibilities
1. Analyze inbound inventory data flows and define integration requirements
2. Design and implement API integrations to process and enrich incoming data
3. Build scalable, cost-efficient data pipelines using GCP tools
4. Set up storage solutions for both structured and unstructured data
5. Develop QA and monitoring systems to ensure data quality and performance
6. (Optional) Integrate prompting logic for Generative AI content (text, image, video)

What’s Required
1. Proven experience with third-party API integrations in complex environments
2. Strong background in cloud-native data engineering (preferably with GCP and BigQuery)
3. Solid focus on monitoring, cost management, and data quality
4. Bonus: Familiarity with Generative AI workflows

How to Apply
1. Submit a proposal with clear pricing
2. Include relevant examples of past work involving API integration and scalable data projects
3. Communicate authentically — AI-generated proposals will not be considered
4. Note that this role may evolve into an ongoing partnership focused on GCP and BigQuery

Note: Proposals must include a short video answering the listed screening questions to be considered.

Budget: $35.00
Project Type: Hourly
Cloud Categories: GCP
Skills: Backup and Restore, Data Management and Storage, Database Management, GCP
Project Category: Other