Can you teach a computer to smell? Osmo is trying

The company’s work, supported by the Amazon Alexa Fund, has relevant applications for areas from perfumes to disease detection.

At the age of 12, Alex Wiltschko bought his first perfume, Azzaro pour Homme. He’d read about it in his favorite book — Perfumes: The Guide, by Luca Turin — and was thrilled to find it at a knock-down price at his local TJ Maxx. It would be the first in a large collection.

For as long as he can remember, Wiltschko has been obsessed by scent. “It’s how I’m wired,” he says. His other obsession? Computers. “An interest in perfumes and computers was not the recipe for social success as an adolescent,” he adds.

It was, however, the recipe for a life trajectory that took Wiltschko deep into the neuroscience of olfaction and cutting-edge machine learning. This combination has placed Wiltschko at the forefront of the nascent science of digital olfaction — a.k.a. giving computers a sense of smell.

Wiltschko is now the CEO of Osmo, a Google Research spinout based in Cambridge, Massachusetts. In September 2022, the company hit the ground running with $60 million in initial funding, including an investment from the Amazon Alexa Fund.

In the short term, Osmo aims to unlock a new era of commercial fragrance innovation. Longer term, the company envisions its technologies having the potential to save lives through the development of better insect repellents and even digital diagnostic tools for detecting serious illnesses on a person's breath.

The Principal Odor Map

The keystone to all this is the team’s breakthrough advance: the creation of what it calls the Principal Odor Map (POM).

Before vision could be digitized, a map called RGB was required: It shows how every color is made up of varying proportions of red, green, and blue. Before Osmo was spun out, Wiltschko’s team did something similar — and remarkable — with odor. They used machine learning to map the structure of a molecule directly to how humans perceive the smell of that molecule. In other words, they built a model that can tell you what a molecule smells like just by looking at it. This is the POM.

That was an ‘a-ha!’ moment for us, akin to passing a Turing test for odor. We'd built something with real commercial value that was sufficiently validated to bring into the world.
Jon Hennek

Here’s how they created POM and, crucially, how they proved it worked. They first trained a graph neural network (GNN) on about 5,000 molecules from several flavor and fragrance databases. The smells of all these molecules were well-documented with multiple human-judged odor labels such as beefy, floral, or minty. From this, the model was able to learn connections between molecular structure and odor, without needing any knowledge of what actually happens in the nose or brain of a person sniffing an odor.
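The training setup described above, many molecules, each tagged with human-judged odor words, can be sketched as a multi-label encoding problem. This is a minimal, hypothetical illustration; the label set and descriptors are invented, not Osmo's actual vocabulary:

```python
# Hypothetical odor vocabulary; real datasets use dozens of descriptors.
ODOR_LABELS = ["beefy", "floral", "minty", "woody", "citrus"]

def encode_labels(descriptors):
    """Turn a set of human-judged odor words into a 0/1 target vector,
    the typical target format for multi-label model training."""
    return [1 if label in descriptors else 0 for label in ODOR_LABELS]

# e.g. a molecule rated "floral" and "citrus" by panelists:
target = encode_labels({"floral", "citrus"})
# target == [0, 1, 0, 0, 1]
```

A model like the one described would learn to map a molecule's structure to a vector like `target`, one entry per odor descriptor.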

That’s great, as far as it goes. The crucial question then was, could POM generalize to predict the smell of molecules it had never seen before, based solely on their molecular structure? And could it do that as well as trained human raters, the gold standard for odor characterization? To find out, the team took a diverse set of more than 400 odor molecules previously unseen by POM and had the model blindly predict their characteristics. Then a panel of trained human raters sniffed and labeled those same odors.

When the Osmo team compared the results, they were delighted. Not only had the model successfully predicted the odor of these unseen molecules as well as trained humans, but its predicted odor profiles were closer to the average results of the panel than any of the individual panelists themselves.
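The comparison behind that result can be sketched with toy numbers: check whether the model's predicted odor profile lies closer to the panel's average than any single panelist's ratings do. All values below are invented for illustration:

```python
import math

def distance(a, b):
    """Euclidean distance between two odor-rating profiles."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def mean_profile(ratings):
    """Average the panel's per-label ratings into a consensus profile."""
    n = len(ratings)
    return [sum(r[i] for r in ratings) / n for i in range(len(ratings[0]))]

# Hypothetical ratings for one molecule over three odor labels:
panel = [[0.9, 0.1, 0.0], [0.7, 0.3, 0.1], [0.8, 0.0, 0.2]]
model_prediction = [0.82, 0.12, 0.09]

consensus = mean_profile(panel)
model_err = distance(model_prediction, consensus)
panelist_errs = [distance(r, consensus) for r in panel]

# The headline result: the model sits closer to the consensus
# than any individual panelist does.
beats_every_panelist = all(model_err < e for e in panelist_errs)
```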

“That was an ‘a-ha!’ moment for us, akin to passing a Turing test for odor,” says Jon Hennek, chief product officer at Osmo. “We'd built something with real commercial value that was sufficiently validated to bring into the world.”

Islands of odor

POM is not a map in the typical sense, but it can nevertheless be compared to the RGB map. Pick two points at random on a two-dimensional color map. The closer those two points are to each other, the more similar the color. The same is true for odors in POM, though this map exists in a mind-bending 256 dimensions. All of the tulip-smelling molecules are close to each other, for example. Ditto for the brandy-smelling molecules.
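The nearness intuition can be illustrated with a toy version of the map. Here, hypothetical 4-dimensional coordinates stand in for POM's 256 dimensions; the molecule names and values are invented:

```python
import math

def nearness(u, v):
    """Higher value = closer in the map = predicted to smell more similar."""
    return -math.dist(u, v)

# Invented 4-d stand-ins for POM's 256-d coordinates:
embeddings = {
    "molecule_tulip_A":  [0.90, 0.10, 0.00, 0.20],
    "molecule_tulip_B":  [0.85, 0.15, 0.05, 0.18],
    "molecule_brandy_A": [0.10, 0.80, 0.70, 0.10],
}

def most_similar(name):
    """Find the nearest other molecule in the map."""
    others = [m for m in embeddings if m != name]
    return max(others, key=lambda m: nearness(embeddings[name], embeddings[m]))

# The two tulip-smelling molecules land next to each other:
# most_similar("molecule_tulip_A") == "molecule_tulip_B"
```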

“Zooming out a little, all the flowers are next to each other. There's a whole floral Pangaea in this odor map! We didn't tell it to do that,” says Wiltschko. This sort of grouping is also true for woods, bakery-type smells, alcoholic smells, you name it. Our brain seems to organize smells in nested hierarchies, says Wiltschko, so the rose odor is inside the rose category, inside the flowers category, inside the plants category, inside the pleasant category.

“The fact that we were able to observe this in the POM without telling it is astounding,” he says.

In this color map (the CIE 1931 color space chromaticity diagram), similar colors lie near each other. Likewise, in Osmo’s Principal Odor Map, individual molecules (grey points) are found nearer to each other if they are predicted to smell similar.
Courtesy of Osmo

While Wiltschko has bold ideas for the future of Osmo’s technology, the first order of business is putting the company on a solid commercial footing. For now, Osmo is concentrating on developing new ingredients for the global fragrance category.

The Osmo team is using POM to explore the world of odor molecules — several billion of them — and homing in on molecules that POM predicts to have an interesting and strong olfactory character.

“We're much better at that, I believe, than anybody else in the world,” says Hennek. “Because rather than start with rules of thumb and chemical intuition, we are starting with an odor prediction for every molecule we could possibly synthesize. It lets us find molecules that a chemist might never have considered.”

The team is working with advisors, including Christophe Laudamiel, a French master perfumer, and potential customers include fragrance houses and packaged goods companies.


“We've had repeated feedback that our ingredients have the potential to be very successful, commercially,” says Wiltschko. “That smells like product/market fit.” The principal idea is to license those molecules to fragrance houses.

It’s a timely endeavor. The global fragrance category is valued at more than $10 billion and growing steadily. But sourcing some traditional ingredients, such as sandalwood oil, can drive over-harvesting and other ecological harm, while other ingredients increasingly fall short as demand grows for safer, more biodegradable products.

With POM, Osmo is paving the way for palettes of safe, synthetic fragrances that recreate natural odors using environmentally friendly and easily synthesized molecules. To that end, Osmo is looking at combinations of just a handful of elements: carbon, hydrogen, oxygen, nitrogen, phosphorus, and sulfur.

“Then we bring them into our lab for a process akin to a drug discovery pipeline,” says Hennek. “We are working towards regulatory approval of those molecules.”

Rise of the graph neural networks

All of this has only become possible in the last six years or so. The core insight that started this scientific project, says Wiltschko, was that machine learning was “getting really good at molecules,” thanks to the recent rise of GNNs.


Previously, machine learning approaches primarily converted inputs — images or data arrays, say — into rectangles or data grids to process them. Molecules didn’t fit this mold: a molecule might be two atoms, or it might be 20 atoms, with wildly different structure and connectivity. They are simply not reducible to rectangles or grids.

Instead, the atoms in a molecule can be considered as nodes, and the chemical bonds between them as edges, forming a graph structure. This representation allows GNNs to model and process molecular data.
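This atoms-as-nodes, bonds-as-edges representation can be sketched in a few lines, along with a single round of the neighbor-summing "message passing" that GNNs repeat to build up structural understanding. The molecule is ethanol's heavy atoms (hydrogens omitted), and the per-element features are invented for illustration:

```python
# Toy molecule graph: ethanol's heavy-atom skeleton, C-C-O.
atoms = ["C", "C", "O"]                          # nodes
bonds = [(0, 1), (1, 2)]                         # edges (undirected)
features = {"C": [1.0, 0.0], "O": [0.0, 1.0]}    # invented per-element features

# Build adjacency lists from the bond list.
neighbors = {i: [] for i in range(len(atoms))}
for a, b in bonds:
    neighbors[a].append(b)
    neighbors[b].append(a)

def message_pass(h):
    """One round of message passing: each atom's new state is its own
    state plus the sum of its neighbors' states."""
    return [
        [h[i][d] + sum(h[j][d] for j in neighbors[i]) for d in range(len(h[i]))]
        for i in range(len(atoms))
    ]

h0 = [features[a] for a in atoms]
h1 = message_pass(h0)
# After one round, h1[1] (the middle carbon) mixes in both its C and O neighbors.
```

Real GNNs use learned transformations instead of a plain sum, and stack many such rounds, but the graph-shaped flow of information is the same.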

“Some of this technology was developed by friends of mine at Google. So, it was a fantastic, fertile ground to start exploring this idea,” says Wiltschko.

This ongoing exploration is creating some exciting possibilities. Wiltschko reasoned that, just as the sun has shone on Earth since before life began, leading many creatures to evolve similar visual systems, the composition of Earth’s atmosphere has been broadly stable over evolutionary time. Could POM, then, also be used to understand the olfactory responses of other species, even those separated from humans by millions of years of evolution?

Life-saving potential

Take mosquitoes. Could POM be used to work out which odors repel these disease-carrying insects?

To find out, the Osmo team augmented POM with additional data sources. The first was a long-forgotten U.S. government report, published in the 1940s, that featured the results of testing 19,000 compounds for their mosquito repellency. The second was information provided by TropIQ, a Dutch company that develops malaria-control technology. The augmented model was soon able to predict entirely new molecules with repellency at least as powerful as DEET, the active ingredient in the most effective mosquito repellents.
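The screening step this enables can be sketched as a simple filter-and-rank over predicted repellency scores against a baseline. All molecule names and scores below are hypothetical:

```python
# Hypothetical model outputs: predicted repellency score per candidate.
predicted = {"candidate_A": 0.91, "candidate_B": 0.55, "candidate_C": 0.88}
DEET_SCORE = 0.80  # invented baseline standing in for DEET's repellency

# Keep only candidates predicted to be at least as repellent as the
# baseline, strongest first; these would go on to real-world testing.
hits = sorted(
    (m for m, s in predicted.items() if s >= DEET_SCORE),
    key=lambda m: predicted[m],
    reverse=True,
)
# hits == ["candidate_A", "candidate_C"]
```

In the workflow the article describes, lab results for such candidates are then fed back into the model to refine the next round of predictions.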

Osmo digitized mosquito-repellency data for 19,000 compounds reported on by the United States Department of Agriculture and used that to refine its model (left). The team then predicted candidate molecules that would be most repellent to mosquitoes, produced the most viable options, tested them on real mosquitoes, and fed those results back into the model to further refine it.
Courtesy of Osmo

The development of cheaper, more effective, and safer insect repellents could have a huge impact on global health. Wiltschko has nothing to announce yet, but says this research is ongoing in collaboration with the Bill & Melinda Gates Foundation.

Applying POM to mosquitos is also a proof of concept, says Hennek. “We can picture applying our product not just to what mosquitoes don’t like, but to what roaches don’t like. Or any number of agricultural pests.”

Capturing smell forever

Looking further down the road, Wiltschko’s vision is to digitize our sense of smell. The idea is not as far-fetched as it sounds. Consider how things stood several hundred years ago: the idea that a visual moment, the fleeting expression on your child’s face or an orchard of apple trees in blossom, could be instantly captured and preserved forever in perfect color would have been nothing short of magical thinking.

The 1820s brought the first photographs, and with them the first steps toward human mastery of the world of light. Today, it feels like a fundamental right to freeze those visual memories and hold on to them forever. The same goes for the auditory world.

“We know what’s required to digitize a human sense,” says Wiltschko. “And we don't have to wait for any of the inventions that vision did — particularly integrated circuits.”

Indeed, with modern computing power and machine learning, Wiltschko reckons computers will have a “sense of smell” within a decade or two. Three stages are required: “reading” smell, understanding it, and “writing” it. Osmo wants to understand smell, and ultimately to curate a wide palette of safe, synthetic molecules that can recreate the entire human smellscape. Reading (sensing) odorous molecules currently requires bulky, expensive lab equipment, such as a gas chromatography-mass spectrometry system, while writing (producing) smells on demand remains, for now, science fiction at the consumer level, says Wiltschko.

A window to the inside

Sensing and understanding odor at a high level may be sufficient to herald powerful health applications, says Wiltschko. For example, it is well established that serious illnesses, including some cancers, can be detected through their effect on your breath. Being able to take a snapshot of that odor profile — an “Osmograph,” in Wiltschko’s words — could reveal a great deal about what’s going on inside our bodies.

“We don't know if that technology will ultimately have a transformative effect on healthcare, but I am betting that it will,” he says.


It’s very important to Wiltschko that, down the line, Osmo grows to develop clinical diagnostics applications. “That's the North Star for me, and it's very important that we get there. But the sheer cost and the talent that's required is rare and expensive,” he says. “So, it can’t be the first beach that we storm.”

As Osmo grows, it will be looking for similarly passionate people to push the mission forward. “We've been finding that there are people out there who are secret scent lovers, who secretly aspire to work in the field of machine olfaction,” says Wiltschko. “Just to put it out there: there's one place to do this, and it's Osmo.”

Talking to Wiltschko and those inspired to work alongside him, it is clear to see that Osmo is the culmination of his lifelong passions. For him, it’s emotional. “Once you smell a thing, you cannot stop the feelings that you get from it. There's a very fundamental feeling and emotional component,” he says, “and I think that’s beautiful.”

Research areas

Related content

US, VA, Arlington
Do you want a role with deep meaning and the ability to have a global impact? Hiring top talent is not only critical to Amazon’s success – it can literally change the world. It took a lot of great hires to deliver innovations like AWS, Prime, and Alexa, which make life better for millions of customers around the world. As part of the Intelligent Talent Acquisition (ITA) team, you'll have the opportunity to reinvent Amazon’s hiring process with unprecedented scale, sophistication, and accuracy. ITA is an industry-leading people science and technology organization made up of scientists, engineers, analysts, product professionals, and more. Our shared goal is to fairly and precisely connect the right people to the right jobs. Last year, we delivered over 6 million online candidate assessments, driving a merit-based hiring approach that gives candidates the opportunity to showcase their true skills. Each year we also help Amazon deliver billions of packages around the world by making it possible to hire hundreds of thousands of associates in the right quantity, at the right location, at exactly the right time. You’ll work on state-of-the-art research with advanced software tools, new AI systems, and machine learning algorithms to solve complex hiring challenges. Join ITA in using cutting-edge technologies to transform the hiring landscape and make a meaningful difference in people's lives. Together, we can solve the world's toughest hiring problems. Within ITA, the Global Hiring Science (GHS) team designs and implements innovative hiring solutions at scale. We work in a fast-paced, global environment where we use research to solve complex problems and build scalable hiring products that deliver measurable impact to our customers. We are seeking selection researchers with a strong foundation in hiring assessment development, legally-defensible validation approaches, research and experimental design, and data analysis. 
Preferred candidates will have experience across the full hiring assessment lifecycle, from solution design to content development and validation to impact analysis. We are looking for equal parts researcher and consultant, who is able to influence customers with insights derived from science and data. You will work closely with cross-functional teams to design new hiring solutions and experiment with measurement methods intended to precisely define exactly what job success looks like and how best to predict it. Key job responsibilities What you’ll do as a GHS Research Scientist: • Design large-scale personnel selection research that shapes Amazon’s global talent assessment practices across a variety of topics (e.g., assessment validation, measuring post-hire impact) • Partner with key stakeholders to create innovative solutions that blend scientific rigor with real-world business impact while navigating complex legal and professional standards • Apply advanced statistical techniques to analyze massive, diverse datasets to uncover insights that optimize our candidate evaluation processes and drive hiring excellence • Explore emerging technologies and innovative methodologies to enhance talent measurement while maintaining Amazon's commitment to scientific integrity • Translate complex research findings into compelling, actionable strategies that influence senior leader/business decisions and shape Amazon's talent acquisition roadmap • Write impactful documents that distill intricate scientific concepts into clear, persuasive communications for diverse audiences, from data scientists to business leaders • Ensure effective teamwork, communication, collaboration, and commitment across multiple teams with competing priorities A day in the life Imagine diving into challenges that impact millions of employees across Amazon's global operations. As a GHS Research Scientist, you'll tackle questions about hiring and organizational effectiveness on a global scale. 
Your day might begin with analyzing datasets to inform how we attract and select world-class talent. Throughout the day, you'll collaborate with peers in our research community, discussing different research methodologies and sharing innovative approaches to solving unique personnel challenges. This role offers a blend of focused analytical time and interacting with stakeholders across the globe.
US, WA, Seattle
We are looking for a researcher in state-of-the-art LLM technologies for applications across Alexa, AWS, and other Amazon businesses. In this role, you will innovate in the fastest-moving fields of current AI research, in particular in how to integrate a broad range of structured and unstructured information into AI systems (e.g. with RAG techniques), and get to immediately apply your results in highly visible Amazon products. If you are deeply familiar with LLMs, natural language processing, computer vision, and machine learning and thrive in a fast-paced environment, this may be the right opportunity for you. Our fast-paced environment requires a high degree of autonomy to deliver ambitious science innovations all the way to production. You will work with other science and engineering teams as well as business stakeholders to maximize velocity and impact of your deliverables. It's an exciting time to be a leader in AI research. In Amazon's AGI Information team, you can make your mark by improving information-driven experience of Amazon customers worldwide!
US, MA, N.reading
Amazon Industrial Robotics is seeking exceptional talent to help develop the next generation of advanced robotics systems that will transform automation at Amazon's scale. We're building revolutionary robotic systems that combine cutting-edge AI, sophisticated control systems, and advanced mechanical design to create adaptable automation solutions capable of working safely alongside humans in dynamic environments. This is a unique opportunity to shape the future of robotics and automation at unprecedented scale, working with world-class teams pushing the boundaries of what's possible in robotic manipulation, locomotion, and human-robot interaction. This role presents an opportunity to shape the future of robotics through innovative applications of deep learning and large language models. At Amazon Industrial Robotics we leverage advanced robotics, machine learning, and artificial intelligence to solve complex operational challenges at unprecedented scale. Our fleet of robots operates across hundreds of facilities worldwide, working in sophisticated coordination to fulfill our mission of customer excellence. We are pioneering the development of robotics foundation models that: - Enable unprecedented generalization across diverse tasks - Enable unprecedented robustness and reliability, industry-ready - Integrate multi-modal learning capabilities (visual, tactile, linguistic) - Accelerate skill acquisition through demonstration learning - Enhance robotic perception and environmental understanding - Streamline development processes through reusable capabilities The ideal candidate will contribute to research that bridges the gap between theoretical advancement and practical implementation in robotics. You will be part of a team that's revolutionizing how robots learn, adapt, and interact with their environment. Join us in building the next generation of intelligent robotics systems that will transform the future of automation and human-robot collaboration. 
Key job responsibilities As an Applied Science Manager in the Foundations Model team, you will: - Build and lead a team of scientists and developers responsible for foundation model development - Define the right ‘FM recipe’ to reach industry ready solutions - Define the right strategy to ensure fast and efficient development, combining state of the art methods, research and engineering. - Lead Model Development and Training: Designing and implementing the model architectures, training and fine tuning the foundation models using various datasets, and optimize the model performance through iterative experiments - Lead Data Management: Process and prepare training data, including data governance, provenance tracking, data quality checks and creating reusable data pipelines. - Lead Experimentation and Validation: Design and execute experiments to test model capabilities on the simulator and on the embodiment, validate performance across different scenarios, create a baseline and iteratively improve model performance. - Lead Code Development: Write clean, maintainable, well commented and documented code, contribute to training infrastructure, create tools for model evaluation and testing, and implement necessary APIs - Research: Stay current with latest developments in foundation models and robotics, assist in literature reviews and research documentation, prepare technical reports and presentations, and contribute to research discussions and brainstorming sessions. - Collaboration: Work closely with senior scientists, engineers, and leaders across multiple teams, participate in knowledge sharing, support integration efforts with robotics hardware teams, and help document best practices and methodologies.
CA, QC, Montreal
Join the next revolution in robotics at Amazon's Frontier AI & Robotics team, where you'll work alongside world-renowned AI pioneers to push the boundaries of what's possible in robotic intelligence. As an Applied Scientist, you'll be at the forefront of developing breakthrough foundation models that enable robots to perceive, understand, and interact with the world in unprecedented ways. You'll drive independent research initiatives in areas such as perception, manipulation, scene understanding, sim2real transfer, multi-modal foundation models, and multi-task learning, designing novel algorithms that bridge the gap between state-of-the-art research and real-world deployment at Amazon scale. In this role, you'll balance innovative technical exploration with practical implementation, collaborating with platform teams to ensure your models and algorithms perform robustly in dynamic real-world environments. You'll have access to Amazon's vast computational resources, enabling you to tackle ambitious problems in areas like very large multi-modal robotic foundation models and efficient, promptable model architectures that can scale across diverse robotic applications. 
Key job responsibilities - Design and implement novel deep learning architectures that push the boundaries of what robots can understand and accomplish - Drive independent research initiatives in robotics foundation models, focusing on breakthrough approaches in perception, and manipulation, for example open-vocabulary panoptic scene understanding, scaling up multi-modal LLMs, sim2real/real2sim techniques, end-to-end vision-language-action models, efficient model inference, video tokenization - Lead technical projects from conceptualization through deployment, ensuring robust performance in production environments - Collaborate with platform teams to optimize and scale models for real-world applications - Contribute to the team's technical strategy and help shape our approach to next-generation robotics challenges A day in the life - Design and implement novel foundation model architectures, leveraging our extensive compute infrastructure to train and evaluate at scale - Collaborate with our world-class research team to solve complex technical challenges - Lead technical initiatives from conception to deployment, working closely with robotics engineers to integrate your solutions into production systems - Participate in technical discussions and brainstorming sessions with team leaders and fellow scientists - Leverage our massive compute cluster and extensive robotics infrastructure to rapidly prototype and validate new ideas - Transform theoretical insights into practical solutions that can handle the complexities of real-world robotics applications About the team At Frontier AI & Robotics, we're not just advancing robotics – we're reimagining it from the ground up. Our team is building the future of intelligent robotics through ground breaking foundation models and end-to-end learned systems. 
We tackle some of the most challenging problems in AI and robotics, from developing sophisticated perception systems to creating adaptive manipulation strategies that work in complex, real-world scenarios. What sets us apart is our unique combination of ambitious research vision and practical impact. We leverage Amazon's massive computational infrastructure and rich real-world datasets to train and deploy state-of-the-art foundation models. Our work spans the full spectrum of robotics intelligence – from multimodal perception using images, videos, and sensor data, to sophisticated manipulation strategies that can handle diverse real-world scenarios. We're building systems that don't just work in the lab, but scale to meet the demands of Amazon's global operations. Join us if you're excited about pushing the boundaries of what's possible in robotics, working with world-class researchers, and seeing your innovations deployed at unprecedented scale.
US, NY, New York
The Sponsored Products and Brands team at Amazon Ads is re-imagining the advertising landscape through cutting-edge generative AI technologies, revolutionizing how millions of customers discover products and engage with brands across Amazon.com and beyond. We are at the forefront of re-inventing advertising experiences, bridging human creativity with artificial intelligence to transform every aspect of the advertising lifecycle from ad creation and optimization to performance analysis and customer insights. We are a passionate group of innovators dedicated to developing responsible and intelligent AI technologies that balance the needs of advertisers, enhance the shopping experience, and strengthen the marketplace. If you're energized by solving complex challenges and pushing the boundaries of what's possible with AI, join us in shaping the future of advertising. Key job responsibilities Participate in the Science hiring process as well as mentor other scientists - improving their skills, their knowledge of your solutions, and their ability to get things done. Identify and devise new video related solutions following a customer-obsessed scientific approach to address customer or business problems when the problem is ill-defined, needs to be framed, and new methodologies or paradigms need to be invented at the product level. Articulate potential scientific challenges of ongoing or future customers’ needs or business problems, and present interventions to address them. Independently assess alternative video related technologies, driving evaluation and adoption of those that fit best A day in the life As an Applied Scientist on the Sponsored Products Video team, you will work with a team of talented and experienced engineers, scientists, and designers to help bring new products to market and ensure that our customers are delighted by what we create. 
The Sponsored Products Video team is responsible for the design, development, and implementation of Sponsored Products Video experiences worldwide. About the team The Sponsored Products Video team within Sponsored Products and Brands creates relevant and engaging video experiences, connecting advertisers and shoppers. We are on a mission to make Amazon the best in class destination for shoppers to discover, engage and build affinity with brands, making shopping delightful, & personal.
IN, TS, Hyderabad
We're seeking an Applied Scientist to lead and innovate in applying advanced AI technologies that will reshape how businesses sell on Amazon. Our team is passionate about leveraging Machine Learning, GenAI, and Agentic AI to help B2B sellers optimize their operations and drive growth. Join Amazon Business 3P (Third Party - Sellers) - a rapidly growing global organization where we innovate at the intersection of AI technology and B2B commerce. We're reimagining how sellers reach and serve business customers, creating intelligent solutions that help them grow their B2B business on Amazon. From AI-powered Seller Central tools to smart business certifications, dynamic pricing capabilities, and advanced analytics, we're transforming how B2B selling happens. As an Applied Scientist II on our AB 3P Tech team, you'll drive the development and implementation of state-of-the-art algorithms and models for supervised fine-tuning and reinforcement learning. You'll work with highly technical, entrepreneurial teams to: - Design and implement AI models that power the B2B selling experience - Lead the development of GenAI products that can handle Amazon-scale use cases - Drive research and implementation of advanced algorithms for human feedback and complex reasoning - Make strategic AI technology decisions and mentor technical talent - Own critical AI systems spanning from Seller Central to Amazon Business detail pages Join us in shaping the future of B2B selling - we're building applied AI solutions that businesses love and trust for their day-to-day success. If you are scrappy and bias for action is your favorite Leadership Principle, you'll fit right in as we innovate across the seller experience to create significant impact in this fast-growing business. 
Key job responsibilities Key job responsibilities: - Collaborate with cross-functional teams of engineers, product managers, and scientists to identify and solve complex problems in Gen AI - Design and execute experiments to evaluate the performance of different algorithms and models, and iterate quickly to improve results - Think big about the arc of development of Gen AI over a multi-year horizon, and identify new opportunities to apply these technologies to solve real-world problems - Communicate results and insights to both technical and non-technical audiences About the team At Amazon Business Third Party (AB3P) Tech, we're revolutionizing B2B e-commerce by empowering sellers in the business marketplace. Our scope spans the complete B2B selling journey, from Seller Central to Amazon Business detail pages, cart, and checkout for merchant-fulfilled offers. Our entrepreneurial culture and global reach define us. We develop features across seller experience, delivery, certifications, fees, registration, and analytics, collaborating with worldwide teams and leveraging advanced AI technologies to continuously innovate. Working in true Day 1 spirit, we build next-generation solutions that shape the future of B2B commerce. Join us in building next-generation solutions that shape the future of B2B commerce.
GB, London
Come build the future of entertainment with us. Are you interested in shaping the future of movies and television? Prime Video is a premium streaming service that offers customers a vast collection of TV shows and movies, all with the ease of finding what they love to watch in one place. We offer customers thousands of popular movies and TV shows, including Amazon Originals, exclusive licensed content, and exciting live sports events. Prime Video is a fast-paced, growth business, available in over 200 countries and territories worldwide. The Video Content Research team works in a dynamic environment where innovating on behalf of our customers is at the heart of everything we do. We are seeking a Data Scientist to develop scalable models that uncover key insights into how, why, and when customers engage with Prime Video marketing.

Key job responsibilities
In this role you will work closely with business stakeholders and technical peers (data scientists, economists, and engineers) to develop causal marketing measurement models, analyze experiments, and investigate the customer, marketing, and content factors that drive engagement with Prime Video. You will create mechanisms and infrastructure to deploy complex models and generate insights at scale. You will have the opportunity to work with large datasets and use AWS to build and deploy machine learning models that shape Prime Video's marketing decisions.

About the team
The Video Content Research team uses machine learning, econometrics, and data science to optimize Amazon's marketing and content investments. We generate insights for Amazon's digital video strategy, partnering with finance, marketing, and content teams. We analyze customer behavior on Prime Video (marketing impressions, clicks on owned channels) to identify optimization opportunities.
US, MA, Boston
AI is the most transformational technology of our time, capable of tackling some of humanity's most challenging problems. That is why Amazon is investing in generative AI (GenAI) and the responsible development and deployment of large language models (LLMs) across all of our businesses. Come build the future of human-technology interaction with us. We are looking for a Research Scientist with strong technical skills, including coding and natural language processing experience in dataset construction, model training and evaluation, and automatic processing of large datasets. You will play a critical role in driving innovation and advancing the state of the art in natural language processing and machine learning. You will work closely with cross-functional teams, including product managers, language engineers, and other scientists.

Key job responsibilities
Specifically, the Research Scientist will:
• Ensure quality of speech/language/other data throughout all stages of acquisition and processing, including data sourcing/collection, ground truth generation, normalization, transformation, cross-lingual alignment/mapping, etc.
• Clean, analyze, and select speech/language/other data to achieve goals
• Build and test models that elevate the customer experience
• Collaborate with colleagues from science, engineering, and business backgrounds
• Present proposals and results in a clear manner backed by data and coupled with actionable conclusions
• Work with engineers to develop efficient data-querying infrastructure for both offline and online use cases
US, VA, Arlington
The People eXperience and Technology Central Science (PXTCS) team uses economics, behavioral science, statistics, and machine learning to proactively identify mechanisms and process improvements that simultaneously benefit Amazon and improve the lives, wellbeing, and value of work for Amazonians. PXTCS is an interdisciplinary team that combines the talents of science and engineering to develop and deliver solutions that measurably achieve this goal.

PXTCS is looking for an economist who can apply economic methods to address business problems. The ideal candidate will work with engineers and computer scientists to estimate models and algorithms on large-scale data, design pilots and measure impact, and transform successful prototypes into improved policies and programs at scale. PXTCS is looking for creative thinkers who can combine a strong technical economic toolbox with a desire to learn from other disciplines, and who know how to execute and deliver on big ideas as part of an interdisciplinary technical team.

Ideal candidates will work in a team setting with individuals from diverse disciplines and backgrounds. They will work with teammates to develop scientific models and conduct the data analysis, modeling, and experimentation necessary for estimating and validating those models. They will work closely with engineering teams to develop scalable data resources to support rapid insights, and will take successful models and findings into production as new products and services. They will be customer-centric, communicating scientific approaches and findings to business leaders, listening to and incorporating their feedback, and delivering successful scientific solutions.

A day in the life
The Economist will work with teammates to apply economic methods to business problems.
This might include identifying the appropriate research questions, writing code to implement a difference-in-differences (DID) analysis or estimate a structural model, or writing and presenting a document with findings to business leaders. Our economists also collaborate with partner teams throughout the process: from understanding their challenges, to developing a research agenda that will address those challenges, to helping them implement solutions.

About the team
PXTCS is a multidisciplinary science team that develops innovative solutions to make Amazon Earth's Best Employer.
US, WA, Bellevue
Amazon is looking for world-class Principal Applied Scientists to join its AWS Fundamental Research Team, working within a variety of machine learning disciplines. This group is entrusted with developing core machine learning solutions for AWS services. On the AWS Fundamental Research Team you will invent, implement, and deploy state-of-the-art machine learning algorithms and systems. You will build prototypes and explore large-scale ML solutions across different domains and computation platforms. You will interact closely with our customers and with the academic community. You will be at the heart of a growing and exciting focus area for AWS and work with other acclaimed engineers and world-famous scientists.

About the team
Diverse Experiences
AWS values diverse experiences. Even if you do not meet all of the qualifications and skills listed in the job description, we encourage candidates to apply. If your career is just starting, hasn't followed a traditional path, or includes alternative experiences, don't let that stop you from applying.

Why AWS?
Amazon Web Services (AWS) is the world's most comprehensive and broadly adopted cloud platform. We pioneered cloud computing and never stopped innovating; that's why customers from the most successful startups to Global 500 companies trust our robust suite of products and services to power their businesses.

Inclusive Team Culture
Here at AWS, it's in our nature to learn and be curious. Our employee-led affinity groups foster a culture of inclusion that empowers us to be proud of our differences. Ongoing events and learning experiences, including our Conversations on Race and Ethnicity (CORE) and AmazeCon (gender diversity) conferences, inspire us to never stop embracing our uniqueness.

Mentorship & Career Growth
We're continuously raising our performance bar as we strive to become Earth's Best Employer.
That's why you'll find endless knowledge-sharing, mentorship, and other career-advancing resources here to help you develop into a better-rounded professional.

Work/Life Balance
We value work-life harmony. Achieving success at work should never require sacrifices at home, which is why we strive for flexibility as part of our working culture. When we feel supported in the workplace and at home, there's nothing we can't achieve in the cloud.