
AT&T International Careers

Join a network that helps you succeed

At AT&T, our talents work together to make a difference. Think you’d like to help us connect more people than ever before? Check out our latest opportunities below, or browse for other recent postings.


Hadoop Admin


Bangalore, India

Impact and Influence:
This position will interact on a consistent basis with other developers, business/functional analysts, programmer/analysts, database/data warehouse administrators, and architects. It will typically advise and counsel staff on the implementation of Big Data/Hadoop. A significant background in, and experience with, Hadoop and Java is required. Results from this role will include improved uptime of the data lake environments and faster turnaround of patching and upgrades. The role also involves interaction with both customers and vendors to test proofs of concept ("POCs") in a non-production space, ensuring that the final deployment in the production space is stable, performs well, and provides uptime that meets the needs of the platform and its customers.

Roles and Responsibilities:

Key Responsibilities:
• Work to improve the lifecycle of the Hadoop environment, using a software development background to help tune and troubleshoot the Hadoop/Big Data platform.
• Work in an Enterprise BigData/Hadoop environment tasked to build solutions using federated/virtualization technologies.
• Work in conjunction with others on the team to enhance the availability of the Hadoop platform, both on premises and in the cloud (AWS, Azure, etc.).
• Troubleshoot and help client teams improve performance of various jobs running in the Hadoop environment.
• Work in conjunction with others to define best practices and standards for existing and future implementations.
• Provide support in scheduling jobs, troubleshooting job errors, identifying issues in unusually long running jobs, etc.
• Work with other development and client teams in implementing best practices.
• Monitor nightly jobs and troubleshoot any issues that may arise.
• Maintain technical support documentation and runbooks for the production jobs.
• Other related duties as assigned.
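As a rough illustration of the job-monitoring duties above, the sketch below flags unusually long-running jobs from a simple "job_name runtime_minutes" listing. The listing format and the threshold are assumptions for illustration, not AT&T specifics; in a real cluster the listing might be derived from the YARN CLI or scheduler logs.

```shell
#!/bin/sh
# Flag jobs whose runtime exceeds a threshold, from a plain
# "job_name runtime_minutes" listing (format is an assumption).

flag_long_jobs() {
  # $1 = listing file, $2 = threshold in minutes
  awk -v max="$2" '$2 > max { print $1, "ran", $2, "min" }' "$1"
}
```

A nightly wrapper could feed the output of such a check into the team's alerting or runbook workflow.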

Key Competencies and Skills:

Technical Skills Required:
• Full life-cycle experience on enterprise software development projects. Able to work collaboratively with customers to aid in the evolution towards an SDLC environment for Hadoop.
• Hands-on experience with software development concepts, including Java performance tuning, to aid not only the platform but also the jobs running on it.
• Experience in project life cycle activities
• Experience in developing best practices, unit testing and developer code reviews.
• Strong hands-on experience with Linux systems, file systems, and shell scripting.
• Experience with databases, data marts, data warehouses, and complex SQL, including databases such as Vertica, Teradata, and Oracle.
• Strong background in Java and JavaScript.
• Strong written and verbal communication skills including the ability to express ideas concisely and clearly to both technical and non-technical audiences.
• Good problem solving and analytical skills used to resolve technical problems.
• Must possess a good understanding of business requirements and IT strategies.
• Ability to work independently while remaining a team player.

Education and Qualifications:

University degree (international) in Computer Science and/or Analytics

3-15 years of experience in relational database design/development and Java