
Amazon intern Qing Guo explores the interface between statistics and machine learning


When you ask Alexa for something, say, to buy more peanut butter, you probably have a specific outcome in mind. As in a game of 20 questions, Alexa tries to pin down the context and specifics and fulfill your request in as few turns as possible.

Last summer, as an intern with the Amazon Alexa AI team in Sunnyvale, California, Qing Guo worked on a project to help Alexa understand user intent. The project drew on Guo’s background as a PhD student in statistics at Virginia Tech: she used statistical techniques to improve the training and performance of machine learning models that power Alexa.


For example, she applied statistical concepts such as importance weighting and variational inference to contrastive learning, enabling the model to focus on the most relevant answers during training, which makes training more stable and efficient. Her work is grounded in information theory, which provides a principled framework for quantifying the amount of information contained in question-answer pairs.

“By incorporating these statistical techniques, I was able to enhance the algorithm’s performance,” Guo said. “The algorithm could be trained using small batch sizes without compromising the overall performance, making it accessible and efficient for real-time interactions. This is significant for training big models, such as deep neural nets.”
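
To make that concrete, the following is a minimal, hypothetical sketch (not Guo’s code) of an importance-weighted, InfoNCE-style contrastive loss in PyTorch. The weighting rule, function name, and tensor shapes are illustrative assumptions; the point is only that re-weighting negative examples lets training concentrate on the most relevant candidates, so the gradient stays informative even with small batch sizes.

    # Hypothetical sketch: contrastive loss with importance-weighted negatives.
    # Names and the weighting rule are illustrative, not taken from the article.
    import torch
    import torch.nn.functional as F

    def weighted_contrastive_loss(query, positive, negatives, temperature=0.1):
        """query: (d,), positive: (d,), negatives: (n, d) embedding vectors."""
        q = F.normalize(query, dim=-1)
        pos = F.normalize(positive, dim=-1)
        neg = F.normalize(negatives, dim=-1)

        pos_logit = (q @ pos) / temperature    # similarity to the true answer
        neg_logits = (neg @ q) / temperature   # similarities to candidate negatives

        # Importance weights: up-weight negatives the model currently confuses
        # with the query (one simple self-normalized choice among many).
        with torch.no_grad():
            weights = torch.softmax(neg_logits, dim=0) * neg_logits.numel()

        # Weighted log-partition: log(exp(pos) + sum_i w_i * exp(neg_i)),
        # computed stably by folding the log-weights into logsumexp.
        denom = torch.logsumexp(
            torch.cat([pos_logit.unsqueeze(0), neg_logits + weights.log()]), dim=0
        )
        return denom - pos_logit

    # Toy usage with random embeddings
    q, p, negs = torch.randn(64), torch.randn(64), torch.randn(32, 64)
    print(weighted_contrastive_loss(q, p, negs))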

Now Guo is back at Amazon for a second internship linked to a fellowship awarded to Virginia Tech doctoral students through the Amazon–Virginia Tech Initiative for Efficient and Robust Machine Learning.


The initiative, launched in March 2022, focuses on research in efficient and robust machine learning. Among other things, it gives doctoral students conducting AI and ML research the opportunity to apply for Amazon fellowships.

Guo applied for the fellowship to try out her academic ideas in a real-world, industrial setting. Her submission was accepted, and her subsequent interactions with Amazon scientists, she said, provide “insights and inspirations from different perspectives to continuously strengthen my research.”

Those interactions, for example, focused her research on simple, robust solutions that can replace components of models used in real-world applications.

An introduction to machine learning

Guo studied applied statistics at Shanghai University of International Business and Economics in China. She didn’t know much about machine learning until she got to Virginia Tech, where her advisor, Xinwei Deng, asked her to code a statistical solution to a machine learning problem using Python.

Guo taught herself Python and started to collaborate with computer science students and academic colleagues, drawing on methods that use theories and concepts from statistics to improve the training and performance of machine learning models.

More broadly, Guo said, her PhD research centers on strategies for extracting the most valuable information from data while using computational resources more efficiently.

“Improving data quality or more efficiently extracting information from data has become increasingly important for machine learning, and this is what statisticians are good at,” she said. “This is formally known as experimental design in the statistical literature, which is less known to the machine learning community and has great potential for improving machine learning practices.”

For example, at Virginia Tech she and Chenyang Tao, an applied scientist who is now her mentor at Amazon, worked together on a technique that enables the use of small datasets to train machine learning models for computer vision and natural-language processing. Training models for these types of applications typically requires large datasets and abundant computing resources.

The work leveraged statistical concepts such as mutual information, which measures the dependency between variables, and variational inference, which, Guo said, “is a powerful tool because it reformulates complex, costly problems with simple, cheap, accurate statistical approximations.”
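
As an illustration of the flavor of that approach, here is a small, hypothetical sketch of one standard variational lower bound on mutual information, the InfoNCE bound; it is not the researchers’ method, and the critic and data below are made up. The idea is that an intractable dependency measure is replaced by a cheap, classification-style estimate computed over a minibatch of paired samples.

    # Illustrative only: the InfoNCE variational lower bound on mutual information.
    # The bilinear critic and random data are placeholders, not the researchers' setup.
    import torch

    def infonce_mi_lower_bound(scores: torch.Tensor) -> torch.Tensor:
        """scores[i, j] = critic value f(x_i, y_j); diagonal entries are true pairs."""
        n = scores.shape[0]
        # Average log-probability of picking the true partner, plus log(n),
        # lower-bounds the mutual information I(X; Y).
        log_probs = torch.log_softmax(scores, dim=1).diagonal()
        return log_probs.mean() + torch.log(torch.tensor(float(n)))

    # Toy usage: a bilinear critic on 128 random paired embeddings
    x, y, W = torch.randn(128, 16), torch.randn(128, 16), torch.randn(16, 16)
    print(infonce_mi_lower_bound(x @ W @ y.T))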

Their approach is eight times more efficient than the current state-of-the-art solution, she noted. The researchers presented their work at the Conference on Neural Information Processing Systems (NeurIPS) in 2022, and they have since continued to improve the technique, which is fundamental to Guo’s PhD research.

Applying statistics to machine learning at Amazon

During her first internship, Guo had weekly meetings with colleagues at Alexa AI, who helped her apply her statistical skills to real-world machine learning problems. They advised, for example, that she needed a strategy for inferring customer intent not only from incomplete information but also from incorrect information, such as the wrong actor’s name in a movie search.

“I need to think about this problem and improve my model,” Guo said. “This is a very valuable insight. Nowadays, when I do my research, I always think, ‘Is there anything else I need to think about?’”

For her second internship, Guo is helping Tao and his team with fundamental research on a next-generation machine learning technique that will align computer vision and language models so that multimodal models can answer questions using information drawn from multimedia content.

She is also involved in exploratory conversations with internal teams about new ways to train large language models with limited, targeted data, reducing the training time of generative AI systems.

Guo believes that training and fine-tuning generative AI models with only the most informative data will overcome some of the computing-resource constraints on training these types of models.
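
One way to picture that idea, purely as a hypothetical illustration and not Amazon’s approach, is to score candidate training examples by how informative they currently are to the model, for instance by their per-example loss, and keep only a small budget of the highest-scoring ones before fine-tuning.

    # Hypothetical illustration: pick the most informative examples before fine-tuning,
    # scored here by the current model's per-example loss (other scores are possible).
    import torch

    def select_most_informative(losses: torch.Tensor, budget: int) -> torch.Tensor:
        """Return the indices of the `budget` examples with the highest loss."""
        return torch.topk(losses, k=budget).indices

    # Toy usage: keep the 1,000 highest-loss examples out of 10,000
    per_example_loss = torch.rand(10_000)
    selected = select_most_informative(per_example_loss, budget=1_000)
    print(selected.shape)  # torch.Size([1000])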

“This is a very hot topic now,” Guo said. “Multimodal is a very important area to research. I’m very fortunate to be assigned this project.”

A dedication to academia

Tao noted that the industry experience that Guo gained during her internships will inform her academic research going forward.

“She will know firsthand the challenges applied scientists in industry are facing,” he said. “This will provide her a lot of opportunities for new research topics and a direct path for her research to impact people’s lives.”

Ultimately, Guo hopes to pursue a career in academia with her PhD in statistics.

“My dream is to be a professor,” she said. “I think research is very interesting, since I can solve different kinds of problems that I’m interested in.”

There, Guo noted, another lesson from interning at Amazon will prove key: the need to communicate with colleagues and teammates who approach projects from different perspectives and areas of expertise.

In her weekly meetings with mentors and managers at Amazon, she has learned how communication skills advanced their careers. As a professor, she said, it will be important to use “simple words to express technical concepts to people with different backgrounds who want to solve the same problem.”


