Guavus, a Thales company, is seeking a Data Scientist Intern. This is an exciting, technical role based in our Montreal office, focused on enabling our customers to create real, practical value through advanced analytics on their streaming data. Our customers – primarily large telecom service providers – use Guavus’ analytics to increase revenues or reduce costs through better network optimization, customer experience optimization, churn management, marketing, and more.
We offer a 4-month internship with the possibility of extending to 8 months. Full-time positions are available for interns who stand out, once their internship and classes have ended.
Guavus solves the world’s most complex data problems. Proven across Tier 1 service providers, Guavus provides a new generation of analytically powered big data applications that address specific business problems in next-generation service assurance, next-generation customer experience management, and the Internet of Things. The Company uniquely breaks down the barriers between Operational Support Systems and Business Support Systems, enabling customers to deliver a better customer experience, improve service operations, and plan network capacity more efficiently. Guavus’ operational intelligence applications correlate and analyze massive amounts of streaming and stored business, operational, and sensor data from multiple, disparate source systems in real time. Guavus’ products currently process more than two trillion transactions per day.
Skills & Qualifications:
- Bachelor’s or master’s degree in an Engineering or Science field with coursework in Linear Algebra and Statistics
- Analytical skills – ability to quickly analyze data to identify and visualize key insights – as evidenced by coursework or research requiring the analysis of large datasets applying machine learning (regression, clustering, decision trees, neural networks, etc.)
- Algorithmic understanding of standard machine learning methods
- Good understanding of CS fundamentals such as data structures and algorithms, functional programming, cluster computing, etc. and the ability to quickly translate ideas to efficient, elegant code
- Working knowledge of at least one programming language: Java, Scala, or Python
- Experience with high-volume data scenarios and hands-on knowledge of distributed systems such as Hadoop and Spark are a big plus
- Excellent oral and written communication skills, including the ability to present effectively to both business and technical audiences
- Excellent attention to detail, organizational and time-management skills
Roles & Responsibilities:
- Implement ETL & machine learning algorithms and benchmark their performance
- Perform data validation exercises, tracing data from source to applications to confirm implementation designs
- Analyze data and present insights
- Document data analysis work and results