
Technical Performance Specialist, DATA
- India
- Permanent
- Full-time
- Technical skills of new mentors: Responsible for selecting technically strong mentors who can provide superlative services (project reviews and answers on the Knowledge platform) to students
- Technical quality of Services: Responsible for deploying processes to ensure that new and existing mentors are providing high-quality, technically accurate services to the students
- Mentor classification: Direct ownership of validating whether any mentor is a low, medium, or high performer. The validation exercise encompasses all data related to mentor performance, including students' ratings, P2P audit scores, and data collected through manual quality checks of services
- Technical Mentor Ops processes: Ensure timely execution of the parts of Mentor Ops processes that require a technical understanding of concepts, including driving higher-quality mentor answers to students and conducting plagiarism investigations
- Select and onboard Session leads: Identifying high-performing and technically strong Session leads who will lead virtual connect sessions for Enterprise and Scholarship clients
- New Initiatives: Work closely with the rest of the Mentor Operations, Student Services, Product, Engineering, and Content teams to launch new programs and opportunities for enterprise customers, students, mentors, and more
- A technically strong bent of mind with an eagerness to learn
- Intermediate proficiency in Python or a similar language, along with development experience
- Proficiency in data analysis, visualization, and engineering; SQL; and data warehouses, data lakes, and data pipelines on both AWS and Azure
- Exceptional ability to collect, analyze, and report on data
- Experience in designing, launching, and managing operational processes at scale
- Proven experience in prioritizing and handling multiple urgent needs simultaneously
- Intuition to guide Udacity mentors by encouraging best practices for tutoring and giving feedback
- Flexibility in using or learning to use different methods for tracking and conveying information, such as Zendesk, Google Docs, forums, online chat programs, and email
- Strong expertise in Python, SQL, NoSQL databases (e.g., MySQL, MongoDB), data visualization tools (e.g., Tableau), statistical analysis (using Python, R, scipy, statsmodels), and data manipulation libraries (e.g., pandas, dplyr).
- Proficient in data integration concepts, tools for building data pipelines (like Apache Airflow), and cloud platforms (e.g., AWS, Azure) for data services deployment.
- Knowledgeable in big data frameworks (e.g., Apache Hadoop, Apache Spark) for large-scale data processing and analysis.
- Strong problem-solving skills
- Good at researching and troubleshooting technical issues
- Empathetic and a great mentor
- Excellent organizational skills and ability to meet deadlines
- Driven by data to constantly iterate and improve
- An excellent communicator who excels in listening, writing, and responding professionally
- A team player
- Flexible working hours
- Paid time off
- Comprehensive medical insurance coverage for you and your dependents
- Employee wellness resources and initiatives (access to wellness platforms like Headspace and Modern Health)
- Quarterly wellness day off
- Personalized career development
- Unlimited access to Udacity Nanodegrees