Job Description:
- Drives successful solution adoption and implementation for portfolios of medium to high complexity.
- Works on several large and enterprise-wide projects.
- Participates in strategy design and leads initiatives.
- Designs solutions for large-scale initiatives.
- Has intermediate to advanced skills in Python and in deep learning frameworks such as PyTorch and TensorFlow.
- Is an expert in working with large databases, BI applications, data quality, and performance tuning.
- Has expert knowledge of developing end-to-end business intelligence solutions: data modeling, ETL, and reporting.
- Has a deep understanding of data gathering, inspecting, cleansing, transforming, and modeling techniques.
- Has a deep understanding of, and hands-on experience with, microservices architecture.
- May act as an escalation point for others.
- Has outstanding written and verbal communication skills.
- Identifies and drives process improvement.
- Responsible for improving availability, security, compliance, interoperability, performance, and reengineering activities.
- Grows into the role of a recognized subject matter expert in one or more functions.
- Works effectively both independently and in broader, geographically dispersed teams.
Qualifications:
- Bachelor's degree required; Master's degree preferred.
- 8 years of relevant experience, including work with technology solutions such as Java, Big Data technologies, and data management tools.
- 5+ years of DataOps and DevOps experience, building solutions with Big Data technologies (Pig, Spark, Kafka) and Python, and maintaining version-controlled CI/CD pipelines using tools such as Jenkins and GitHub.
- 3-5 years of experience designing, developing, and implementing Google Cloud and AWS solutions.
- 5+ years of experience with relational database concepts (star schema, SQL, PL/SQL, SQL tuning, OLAP) on Oracle, plus Big Data technologies including Snowflake and Apache NiFi.
- 5+ years of experience in building data pipelines in data lake setups.
- 3 years of experience architecting, designing, and implementing enterprise-scale projects/products and data management solutions.
- 3-4 years of experience with Cerner, Epic, and Lawson systems.
- 5 years of experience in secure data engineering and scripting (Shell, Python).
- 5 years of experience with healthcare data, building clinical and non-clinical solutions that improve patient outcomes.