Machine Learning Engineer, MLE II, QuickSight at Amazon

Cambridge, GB


Interested in applied ML using the latest developments in Large Language Models and Natural Language Processing? We are a team building continuous waves of new products to help our customers in this space. Ask a question about your data and get an answer in seconds: that's the magic of Q! Amazon Q in QuickSight is a machine learning-powered natural language query (NLQ) capability that allows business users to ask any question about their data in natural language and get the answer in seconds. Help us build the next evolution of Generative BI using the latest Large Language Models (LLMs) and applied Machine Learning.

As a Machine Learning Engineer, you will work on projects that are ambiguous, interesting, and high impact for our customers. You will use machine learning to solve real-life problems our customers face and enable them to make data-driven decisions. You will also envision solutions that help our customers understand how Q answers their questions while creating new avenues for them to further explore their data. The opportunities are endless!

If this is you, we look forward to having you join our team to design and build innovative products and help lead a team working toward fundamental changes in the industry!

Amazon QuickSight is a fast, cloud-powered BI service that makes it easy to build visualizations, perform ad-hoc analysis, and quickly get business insights from your data. QuickSight is revolutionizing Business Intelligence by empowering anyone to use the power of machine learning and Amazon AI to enhance their understanding of data.


Inclusive Team Culture
Here at AWS, we embrace our differences. We are committed to furthering our culture of inclusion. We have ten employee-led affinity groups, reaching 40,000 employees in over 190 chapters globally. We have innovative benefit offerings, and host annual and ongoing learning experiences, including our Conversations on Race and Ethnicity (CORE) and AmazeCon (gender diversity) conferences. Amazon’s culture of inclusion is reinforced within our 16 Leadership Principles, which remind team members to seek diverse perspectives, learn and be curious, and earn trust.

Work/Life Balance
Our team puts a high value on work-life balance. It isn’t about how many hours you spend at home or at work; it’s about the flow you establish that brings energy to both parts of your life. We believe striking the right balance between your personal and professional life is critical to life-long happiness and fulfillment. We offer flexibility in working hours and encourage you to find your own balance between your work and personal lives.

Mentorship & Career Growth
Our team is dedicated to supporting new members. We have a broad mix of experience levels and tenures, and we’re building an environment that celebrates knowledge sharing and mentorship. We care about your career growth and strive to assign projects based on what will help each team member develop into a better-rounded professional and enable them to take on more complex tasks in the future.


Key job responsibilities
- Understand business objectives and product requirements, and develop ML algorithms that achieve them.
- Build prototypes and proofs of concept (POCs) to determine feasibility.
- Run experiments to assess performance and improvements.
- Provide ideas and alternatives to drive a product or feature.
- Define data and feature validation strategies.
- Deploy models to production systems and operate them, including monitoring and troubleshooting.

We are open to hiring candidates to work out of one of the following locations:

Cambridge, GBR | London, GBR
