AT&T
This role will focus on Big Data while supporting our overall Entertainment group data practice, working in partnership with multiple domain architects and with development and implementation teams. As a part of the team, you will be responsible for architecting, designing, and implementing new capabilities and features across multiple projects in our organization. We are looking for a candidate who seeks big challenges as part of a strong agile team, who has great collaboration skills, and who can deliver well-thought-out technology solutions to entertainment business problems.
Responsibilities:
- Work with domain architects, scrum team architects, product owners, data scientists, and big data development/test engineers to bring big data and data science R&D projects into production
- Work with the data science delivery lead, Operations, product owners and business stakeholders to create the team’s backlog
- Create and maintain a data architecture roadmap that preserves the integrity of our big data practice and satisfies the business need for fast value delivery
- Engage with all teams in a collaborative environment
- Architect, design, deliver, test, and troubleshoot complex data implementations
- Work with domain architects to improve data strategy, quality and governance
- Guide, mentor and influence architectural direction and principles adoption
- Work closely with both business and external users to quickly deliver high-quality applications
- Work collaboratively with our business leaders to drive adoption of our solutions
- Mentor and develop junior architects in emerging (data architecture) technology areas
Required Qualifications
- 2-5 years' experience in Big Data, data science, and machine learning
- 5+ years' experience in database administration, design, and development
- 10+ years' experience in software architecture
- Experience with private/hybrid/public cloud and Big Data (Hortonworks) ecosystems
- Experience with AWS managed services for data ingestion and processing
- Experience designing a big data lake for data integration from enterprise-wide applications/systems
- Experience with various ingestion patterns for large data sets
- Experience with open source and NoSQL technologies (e.g., Couchbase, Cassandra) at an enterprise level
- Experience with visualization tools such as Tableau and MicroStrategy
- Experience with Lambda architecture for ingestion, including real-time streaming technologies (Kafka, Spark Streaming)
- Knowledge and understanding of ETL design and data processing mechanisms
- Knowledge of data replication, data masking, performance factors, etc.
- Ability to present potential business impacts and calculate ROI for use case realization
- Ability to formulate solutions for new situations involving the integration of disparate systems with the warehouse
- Ability to combine strong analytical and technical skills with business skills to engage with a wide range of stakeholders
- Experience in Agile/Scrum Methodologies
- Experience preparing business case evaluations and financial analyses of solutions
- Ability to contribute SME skills to enterprise governance and develop emerging technology solutions for enterprise adoption
- Excellent verbal and written communication skills
Preferred Qualifications
- Database specific qualifications or certifications
- Experience with: Java, Python, Node.js, RESTful services, Hadoop, Spark, Linux, Hive, Kafka, Cassandra, HBase, OpenTSDB, Tableau, MicroStrategy
- Certified Scrum Master (CSM)
- Project Management Certification (PMP or equivalent experience)
Education
- Degree in computer science or software engineering, and/or equivalent work experience
Job ID: 1928179 | Date posted: 05/31/2019