Job Description
AgileEngine is an Inc. 5000 company that creates award-winning software for Fortune 500 brands and trailblazing startups across 17+ industries. We rank among the leaders in areas like application development and AI/ML, and our people-first culture has earned us multiple Best Place to Work awards.
If you're looking for a place to grow, make an impact, and work with people who care, we'd love to meet you! :)
WHAT YOU WILL DO
- Architect, build, and maintain modern and robust real-time and batch data analytics pipelines;
- Develop and maintain declarative data models and transformations;
- Implement data ingestion integrations for streaming and traditional sources such as Postgres, Kafka, and DynamoDB;
- Deploy and configure BI tooling for data analysis;
- Work closely with product, finance, legal, and compliance teams to build dashboards and reports to support business operations, regulatory obligations, and customer needs;
- Establish, communicate, and enforce data governance policies;
- Document and share best practices with regard to schema management, data integrity, availability, and security;
- Protect and limit access to sensitive data by implementing a secure permissioning model and establishing data masking and tokenization processes;
- Identify and communicate data platform needs, including additional tooling and staffing;
- Work with cross-functional teams to define requirements, plan projects, and execute on the plan;
- Maintain at least 4 hours of working-time overlap with EST (required).
MUST HAVES
- 5+ years of engineering and data analytics experience;
- Strong SQL and Python skills for complex data analysis;
- Experience with modern data pipeline and warehouse tools such as Databricks, Spark, or AWS Glue;
- Proficiency with declarative data modeling and transformation tools such as dbt;
- Experience configuring and maintaining data orchestration platforms such as Airflow;
- Experience with infrastructure-as-code tools (e.g., Terraform);
- Upper-intermediate English level.
NICE TO HAVES
- Familiarity with real-time data streaming (e.g., Kafka, Spark);
- Background working with cloud-based data lakes and secure data practices;
- Hands-on experience building automation tooling and pipelines using Python, Go, or TypeScript;
- Familiarity with container orchestration (e.g., Kubernetes);
- Prior experience managing external data vendors;
- Exposure to Web3 / Crypto data systems;
- Background working cross-functionally with compliance, legal, and finance teams;
- Experience driving company-wide data governance or permissioning frameworks;
- Strong bias for simplicity, speed, and avoiding overengineering;
- Ability to work autonomously and drive projects end-to-end.
THE BENEFITS OF JOINING US
- Professional growth: Accelerate your professional journey with mentorship, TechTalks, and personalized growth roadmaps.
- Competitive compensation: We match your ever-growing skills, talent, and contributions with competitive USD-based compensation and budgets for education, fitness, and team activities.
- A selection of exciting projects: Join projects with modern solutions development and top-tier clients that include Fortune 500 enterprises and leading product brands.
- Flextime: Tailor your schedule for an optimal work-life balance, with the option of working from home or going to the office, whichever makes you the happiest and most productive.
Your application doesn't end here! To unlock the next steps, check your email and complete your registration on our Applicant Site. Incomplete registration will result in the termination of your application process.