Thomas Randall
  • Bachelor of Science
  • PhD Candidate at Clemson University

About

Publications: 4
Reads: 204
Citations: 3
Current institution
Clemson University
Current position
  • PhD Candidate
Additional affiliations
June 2024 - August 2024
Oak Ridge National Laboratory
Position
  • OMNI
Description
  • Research with Dr. Prasanna Balaprakash
May 2023 - August 2023
Argonne National Laboratory
Position
  • GIVENS Scholar
Description
  • Research project with Dr. Xingfu Wu
May 2022 - July 2022
Argonne National Laboratory
Position
  • GIVENS Scholar
Description
  • Continuing research from 2020
Education
August 2015 - May 2019
Clemson University
Field of study
  • Computer Science

Publications

Publications (4)
Conference Paper
Full-text available
The growing need for energy-efficient computing has led to many novel system innovations, including liquid immersion cooling. While many myths about the technology have been dispelled, the actual impact of this cooling solution on thermal conditions in real computing scenarios remains under-reported and under-studied. In this work, we collate data...
Poster
Full-text available
Large Language Models (LLMs) capture a certain amount of world knowledge spanning many general and technical topics, including programming and performance. Without fine-tuning, the use of In-Context Learning (ICL) can specialize LLM outputs to perform complex tasks. In this work, we seek to demonstrate the regressive capabilities of LLMs in a perfo...
Conference Paper
Full-text available
Word2Vec remains one of the most impactful innovations in the field of Natural Language Processing (NLP), representing latent grammatical and syntactic information in human text as dense vectors in a low-dimensional space. Word2Vec has a high computational cost due to the algorithm's inherent sequentiality, intensive memory accesses, and the large vo...

Network

Cited By