Job Purpose: This position is responsible for developing, integrating, testing, and maintaining existing and new applications; requires proficiency in one or more programming languages and one or more development methodologies / delivery models, alongside domain expertise in Healthcare. This position requires extensive data and integration experience. An integration or data architecture background is preferred, but not required.
Required Job Qualifications:
*Bachelor's Degree and 5 years Information Technology experience OR Technical Certification and/or College Courses and 7 years Information Technology experience OR 9 years Information Technology experience. Master's degree (in a technical subject) preferred but not required.
*Ability to manage workload, multiple priorities, and conflicts with customers, employees, and managers, as applicable. Additionally, ability to direct and manage a team of integration designers, developers, and testers in building large-scale, complex integrations throughout a modern data ecosystem.
*Must have experience with NoSQL databases such as HBase, MongoDB, Cosmos DB, Cassandra, or graph databases.
*Must have working knowledge of cloud computing infrastructure (e.g., Amazon Web Services EC2, Azure, Pivotal Cloud Foundry (PCF)) and considerations for scalable, distributed systems.
*Must have extensive experience working with version control tools such as Git and SVN.
*Hands-on experience with the Talend suite on PCF.
*Experience implementing complex business rules in Talend by creating reusable transformations and robust mappings/mapplets. Experience in loading data and in troubleshooting, debugging, and tuning Talend mappings.
*Hands-on experience in performance tuning of Talend and Informatica ETL jobs, integrations, and queries.
*Demonstrates broad knowledge of technical solutions, design patterns, and code for medium-to-complex applications deployed in production Hadoop environments. Additionally, experience designing for sustainability: minimizing impact on operations crews in the event of system outages, unexpected data, etc., as well as engineering the code base for straightforward extensions and expansions.
*Must have working experience with data warehousing and Business Intelligence systems. Additionally, experience building data quality analysis inline into integration flows, and experience working with metadata across the integration landscape in support of data governance and operational needs.
*Participate in design reviews, code reviews, unit testing and integration testing.
*Assume ownership and accountability for the assigned deliverables through all phases of the development lifecycle.
*SDLC Methodology (Agile / Scrum / Iterative Development).
*System performance management.
*Systems change / configuration management.
*Business requirements management.
*Problem solving /analytical thinking.
*Ability to execute.
Preferred Job Qualifications:
*Master’s Degree in Computer Science or Information Technology.
*Cloud experience (we use PCF on premises and Azure cloud).