Senior Data Engineer – Cloud Infrastructure & ETL/ELT Pipeline Development (Remote)
Posted 2026-05-06
About arenaflex
Welcome to arenaflex, where innovation meets imagination! We are a forward-thinking technology company dedicated to transforming raw data into powerful insights that drive business decisions. At arenaflex, we believe that data is the cornerstone of modern innovation, and our team of talented professionals works diligently to build robust data infrastructure that supports our dynamic business needs.
Our culture thrives on collaboration, creativity, and continuous learning. We embrace remote work flexibility, allowing our team members to contribute their best work from the comfort of their homes while staying connected through cutting-edge collaboration tools. At arenaflex, you'll find an environment that values diversity, encourages professional growth, and rewards excellence. Join us as we continue to push the boundaries of what's possible in data engineering and cloud technology.
Position Overview
We are currently seeking a highly skilled and motivated Senior Data Engineer to join our growing data team. This is a fully remote position that offers the opportunity to work from anywhere in the United States. As a Senior Data Engineer at arenaflex, you will be responsible for designing, building, and maintaining our enterprise-level data pipelines and infrastructure. You will play a critical role in ensuring data quality, reliability, and accessibility across multiple departments and projects.
This position requires a strategic thinker who can translate complex business requirements into elegant technical solutions. You will collaborate closely with cross-functional teams including data analysts, business stakeholders, and software developers to deliver scalable data solutions that power our organization's decision-making processes.
Key Responsibilities
As a Senior Data Engineer at arenaflex, you will be entrusted with the following core responsibilities:
- Design and Development: Architect and implement robust ETL/ELT data pipelines using modern data integration tools and frameworks. Create efficient data workflows that can handle large-scale data processing requirements.
- Database Management: Design and maintain database schemas, tables, views, and indexes. Ensure optimal database performance through proper indexing, query optimization, and regular maintenance procedures.
- Cloud Infrastructure: Build and manage cloud-based data infrastructure using AWS services including S3, EC2, and other relevant cloud technologies. Ensure scalability, security, and cost-effectiveness of all cloud solutions.
- Data Integration: Develop and maintain data integration processes that connect various data sources across the organization. Work with structured and unstructured data from multiple internal and external sources.
- Collaboration: Partner with business analysts and stakeholders to understand data requirements and translate them into technical specifications. Participate in requirements gathering sessions and provide technical guidance.
- Code Quality: Write clean, maintainable, and well-documented code. Implement best practices for version control using GitLab/GitHub and ensure proper code review processes.
- Performance Optimization: Monitor pipeline performance, identify bottlenecks, and implement optimization strategies. Ensure data processing jobs meet SLA requirements for speed and reliability.
- Documentation: Create and maintain comprehensive technical documentation for all data pipelines, schemas, and processes. Ensure knowledge transfer and team collaboration.
- Innovation: Stay current with emerging technologies and industry trends. Propose and implement innovative solutions that improve our data infrastructure and capabilities.
Required Qualifications
To be considered for this role, candidates must meet the following essential requirements:
- Educational Background: Bachelor's degree in Computer Science, Information Technology, Engineering, Mathematics, or a related technical field.
- Professional Experience: At least 2 years of hands-on experience in a Data Engineering role, with demonstrated expertise in building and maintaining production-level data pipelines.
- SQL Expertise: Strong proficiency in SQL for data manipulation, query optimization, and database design. Must have 2+ years of experience writing complex queries, stored procedures, and optimizing database performance.
- Programming Skills: Solid programming skills in Python with experience in data processing libraries and frameworks.
- Cloud Platform Experience: At least 2 years of experience working with cloud platforms, specifically AWS (Amazon Web Services). Familiarity with S3, EC2, and other AWS data services.
- Data Pipeline Development: 2+ years of experience designing, building, and maintaining ETL/ELT data pipelines from conception to production.
- Data Warehousing: Experience working with data lakes, data warehouses, and application databases. Understanding of data modeling principles and best practices.
- Version Control: Proficiency in GitLab/GitHub for source code management and collaborative development.
- Containerization: Experience with Docker for containerizing applications and data processes.
- Big Data Technologies: Familiarity with big data technologies and experience handling large datasets.
Preferred Qualifications
While the following qualifications are not mandatory, they will significantly enhance your candidacy:
- Advanced degree (Master's or PhD) in Computer Science, Data Science, Engineering, or a related field.
- Experience with Apache Airflow for workflow orchestration and pipeline scheduling.
- Proficiency in Snowflake cloud data warehouse platform.
- Experience with Databricks for big data analytics and processing.
- Background in streaming data processing and real-time analytics.
- Knowledge of media, entertainment, or digital advertising industries.
- Experience with Looker or other business intelligence tools.
- Understanding of PostgreSQL and other relational database systems.
- Experience with data visualization and reporting tools.
Technical Skills Required
Candidates must demonstrate proficiency in the following technical areas:
- Database Systems: SQL, PostgreSQL, and other relational database platforms
- Programming Languages: Python (required), with familiarity in other languages a plus
- Cloud Platforms: AWS (Amazon Web Services) – required; Azure or GCP – preferred
- Data Tools: Apache Airflow, Snowflake, Databricks, Looker
- Version Control: GitLab/GitHub
- Containerization: Docker
- Storage Solutions: AWS S3 and equivalent cloud storage technologies
What We Offer
At arenaflex, we value our employees and are committed to providing a comprehensive benefits package that supports both professional and personal well-being:
- Competitive Compensation: $30 per hour, with opportunities for performance-based increases.
- Remote Work Flexibility: Work from home with flexible hours that accommodate your lifestyle.
- Health & Wellness: Comprehensive health insurance coverage including medical, dental, and vision plans.
- Retirement Benefits: 401(k) retirement plan with company matching contributions.
- Professional Development: Generous budget for training, certifications, and conference attendance.
- Career Growth: Clear advancement paths and opportunities to take on leadership roles.
- Work-Life Balance: Paid time off, holidays, and flexible scheduling options.
- Equipment & Tools: Company-provided laptop and necessary software licenses.
- Collaborative Culture: Regular team meetings, virtual events, and cross-functional projects.
Career Development Opportunities
At arenaflex, we invest in your growth and career advancement. As a Senior Data Engineer, you will have access to:
- Mentorship from senior technical leaders and architects.
- Training opportunities in emerging technologies and cloud certifications.
- Exposure to cutting-edge data engineering projects and challenges.
- Potential transition paths to Lead Data Engineer, Data Architect, or Management positions.
- Participation in internal innovation labs and technical initiatives.
- Conference attendance and industry networking opportunities.
Work Environment
Our remote work environment is designed to foster productivity and collaboration. You'll be part of a distributed team that uses modern communication tools to stay connected. We believe in results-oriented work rather than micromanagement, giving you the autonomy to manage your schedule while meeting project deadlines.
At arenaflex, we maintain a culture of transparency, respect, and continuous improvement. Regular stand-ups, sprint planning sessions, and team retrospectives ensure everyone stays aligned and motivated. We celebrate achievements, learn from challenges, and support each other's growth.
How to Apply
If you're passionate about data engineering, thrive in remote work environments, and want to be part of a dynamic team that's shaping the future of data infrastructure, we want to hear from you!
To apply for this position, please submit your updated resume along with a cover letter highlighting your relevant experience and why you're excited about joining arenaflex. Our hiring team will review applications on a rolling basis and reach out to qualified candidates for interviews.
Join arenaflex and be part of something extraordinary! Apply today to take the first step toward an exciting career in data engineering.