Enterprise Architect - Big Data Technologies
Responsible for providing top-down strategic direction for big data architecture in support of the overall enterprise architecture initiative. Guides the organization in creating a comprehensive big data technology solution portfolio and in defining big data solution patterns that become reusable assets within the enterprise architecture. Anticipates future IT and industry directions and relates trends and best practices to future big data architecture requirements and projects. Interfaces across the enterprise, acting as a visionary who proactively helps define the direction for the use of big data technologies. May be called upon to provide solution architecture guidance at any level for single or multiple projects.

Major Responsibilities:
- Design and implement big data technology prototypes, proofs of concept (POCs), and solution design patterns across emerging platforms.
- Produce artifacts in support of Big Data reference architecture advocacy and implementation, including documentation, presentations, and diagrams for technical and business audiences.
- Design and build complex, high-performance platforms to support Big Data ingestion, curation, and analytics.
- Stay abreast of and evaluate emerging big data technologies.
- Develop and publish technology standards, such as use of Hadoop technologies.
- Translate user requirements into effective big data solution architectures.
- Monitor regulatory guidelines/laws and emerging industry standards to determine impact on enterprise solution architectures and technology strategic direction.
Required Qualifications:
- A bachelor's degree in computer science.
- Demonstrated knowledge of Big Data Architecture, Data Warehousing, Advanced Analytics, or Data Science implementations.
- At least 10 years of experience in designing and implementing big data management solutions.
- At least 5 years of experience in implementing solutions based upon big data platform technologies (e.g., Cloudera, Hortonworks).
- At least 5 years of experience in Big Data Components/Frameworks such as Hadoop, Spark, Storm, HBase, HDFS, Pig, Hive, Sqoop, Flume, etc.
- Ability to work closely with users and translate user requirements into solution designs.
- Demonstrated success facilitating design workshops and influencing others.
- Knowledge and experience in developing solutions compliant with regulatory and legal requirements within the Pharmaceutical Industry is a plus.
- Lead the adoption of big data technologies across IT organizations.
- Perform architecture design, data modeling, and implementation of big data solutions.
- Develop guidelines to ensure big data standardization and consistency.