Location: Chicago, IL
This position is responsible for developing, integrating, testing, and maintaining existing and new applications. It requires proficiency in one or more programming languages and one or more development methodologies / delivery models, along with domain expertise in healthcare. The position requires extensive data and integration experience; an integration or data architecture background is preferred but not required.
Required Job Qualifications:
*Bachelor's degree and 5 years of Information Technology experience, OR technical certification and/or college courses and 7 years of Information Technology experience, OR 9 years of Information Technology experience. Master's degree (in a technical subject) preferred but not required.
*Ability to manage workload, multiple priorities, and conflicts with customers, employees, and managers, as applicable. Additionally, the ability to direct/manage a team of integration designers, developers, and testers building large-scale, complex integrations across a modern data ecosystem.
*Must have extensive hands-on experience designing, developing, and maintaining software solutions in a Hadoop cluster, with extensive experience in integration (including ETL, message-based, streaming, and API styles of integration) and data warehousing, preferably using Talend Data Integration / Talend Big Data Platform Edition 6.2.1 or comparable toolsets. Talend is the preferred tool for data integration; candidates with extensive experience in another tool are expected to transfer those skills to Talend tools within 30-60 days. The company is committed to placing experienced resources and as such uses a CodeVue test for potential candidates.
*Must have strong hands-on experience with UNIX shell scripting, Sqoop, Eclipse, and HCatalog.
*Must have experience with NoSQL databases such as HBase, MongoDB, Cosmos DB, Cassandra, or graph databases.
*Must have experience developing Pig scripts, HiveQL, and UDFs for analyzing structured, semi-structured, and unstructured data flows.
*Must have working experience developing MapReduce programs running on the Hadoop cluster using Java or Python.
*Experience with Spark and Scala, or another JVM-based language, together with data integration experience.
*Must have working knowledge of cloud computing infrastructure (e.g., Amazon Web Services EC2, Azure) and design considerations for scalable, distributed systems.
*Must demonstrate integration best practices with a focus on Talend.
*Must have extensive experience working with version control tools such as Git and SVN.
*Hands-on experience with PCF using the Talend suite.
*Experience implementing complex business rules in Talend by creating reusable transformations and robust mappings/mapplets. Experience loading data and troubleshooting, debugging, and tuning Talend mappings.
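To illustrate the MapReduce development skills listed above, here is a minimal sketch of map and reduce phases in pure Python. This is an illustrative word-count example only, not code from the role; a production job would run on the Hadoop cluster via Hadoop Streaming or the Java API, with Hadoop handling the shuffle/sort between the two phases.

```python
from collections import defaultdict

def map_phase(lines):
    """Mapper: emit (word, 1) pairs, as a Hadoop Streaming mapper would write to stdout."""
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def reduce_phase(pairs):
    """Reducer: sum the counts for each key (the framework groups keys before this step)."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

if __name__ == "__main__":
    # Sample input standing in for lines read from HDFS.
    data = ["claims data feed", "claims api feed"]
    print(reduce_phase(map_phase(data)))
```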
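The Hive UDF requirement above can also be met with a streaming-style script invoked through Hive's TRANSFORM clause, which pipes tab-delimited rows through an external program. The sketch below uses hypothetical column names (`member_id`, `claim_amount`) chosen for illustration; they are not from the posting.

```python
import sys

def transform_row(line):
    """Parse one tab-delimited Hive row (member_id, claim_amount) and
    emit the member_id with a derived cost flag. Threshold is illustrative."""
    member_id, amount = line.rstrip("\n").split("\t")
    flag = "HIGH" if float(amount) > 1000 else "LOW"
    return "\t".join([member_id, flag])

if __name__ == "__main__":
    # Hive would invoke this script with something like:
    #   SELECT TRANSFORM(member_id, claim_amount)
    #   USING 'python transform_claims.py' AS (member_id, cost_flag)
    #   FROM claims;
    for row in sys.stdin:
        print(transform_row(row))
```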