
AT&T International Careers

Join a network that helps you succeed

At AT&T, our people work together to make a difference. Think you’d like to help us connect more people than ever before? Check out our latest opportunities below, or browse for other recent postings.


Big Data Developer


Bangalore, India

Impact and Influence:
This position interacts on a consistent basis with architects, leads, data engineers, and data scientists to support cutting-edge analytics in the design and development of data acquisition, data ingestion, and data products. Developers will work with Big Data technologies and platforms both on premises and in the cloud, following and contributing to best practices for software development in the Big Data ecosystem. Our Big Data platforms are based on the Hortonworks distribution, so candidates should expect to be heavily involved with all components of the HDP and HDF packages.

Roles and Responsibilities:

• Develop high-performance, distributed computing tasks using Big Data technologies such as Hadoop, NoSQL, text mining, and other distributed-environment technologies, based on the needs of the Big Data organization.
• Use Big Data programming languages and technologies to write code, complete programming and documentation, and perform testing and debugging of various applications.
• Analyze, design, program, debug, and modify software enhancements and/or new products used in distributed, large-scale analytics solutions.
• Interact with data scientists and industry experts to understand how data needs to be converted, loaded, and presented.
• Provide rich insight into consumer behaviors, preferences, and experiences in order to improve the customer experience across a broad range of vertical markets.

Key Competencies and Skills:

Technical Skills Required:

• In-depth knowledge of and experience with the Hadoop ecosystem (HDP, HDF, NiFi, MapReduce, Hive, Pig, Spark/Scala, Kafka, HBase; Elasticsearch and Logstash a plus)
• Expert in working on Linux/Unix
• Cloud-based experience (AWS)
• Good understanding of and experience with performance tuning for complex software projects, particularly large-scale, low-latency systems
• Experience with data flow design and architecture
• Experience with NoSQL databases such as MongoDB, or relational databases such as PostgreSQL
• Hadoop or AWS certifications a plus
• Experience reading and troubleshooting Java programs
• Excellent communication skills
• Ability to work in a fast-paced, team-oriented environment

Required / Desired Skills
• Unix/Linux shell scripting (Required 4 years)
• JavaScript (Desired 2 years)
• Spark (Desired 2 years)
• AWS experience (Desired 1 year)

Education and Qualifications:

Education: University Degree in Computer Science and/or Analytics

Minimum Experience Required: 3–5 years of experience in Big Data design and development

Additional Information:

Afternoon shift