

Ganesh L. Gopalakrishnan is a Professor of Computer Science in the School of Computing at the University of Utah. He is serving as the Director of Graduate Studies and Director of the Center for Parallel Computing at Utah ("CPU").

His recent work covers many aspects of the correctness of parallel/concurrent programs used to program today's supercomputers. These aspects include data race checking, system resilience, and floating-point reasoning.

He has mentored 38 undergraduate REU students, 25 PhD students (21 finished PhDs), and 4 postdoctoral researchers. He has written two textbooks: "Computation Engineering: Applied Automata Theory and Logic" (2006), and "Automata and Computability: A Programmer's Perspective" (2019). The latter book is accompanied by a software system "Jove" written in Python to interactively explore central concepts in automata and formal verification using Jupyter notebooks.
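The kind of interactive exploration Jove supports can be sketched in a few lines of plain Python. This is an illustrative sketch only, not Jove's actual API: a DFA represented as a transition dictionary, with a simple acceptance check.

```python
# Minimal DFA simulation: accepts binary strings with an even number of 1s.
# Illustrative sketch only -- Jove's real API differs.

def dfa_accepts(transitions, start, accepting, string):
    """Run a DFA given as {(state, symbol): next_state} over a string."""
    state = start
    for symbol in string:
        state = transitions[(state, symbol)]
    return state in accepting

# Two states tracking the parity of 1s seen so far.
delta = {
    ('even', '0'): 'even', ('even', '1'): 'odd',
    ('odd',  '0'): 'odd',  ('odd',  '1'): 'even',
}

print(dfa_accepts(delta, 'even', {'even'}, '1011'))  # three 1s -> False
```

In a Jupyter notebook, a student can edit `delta` and immediately re-run strings against the machine, which is the style of exploration the book's notebooks encourage.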

Prof. Gopalakrishnan is an ACM Distinguished Scientist.

Saurav Muralidharan is a senior researcher in the Programming Systems Research Group at NVIDIA. His current work focuses on optimizing deep neural networks for performance and scalability. More broadly, he is interested in research problems that lie at the intersection of systems and machine learning.

Prior to joining NVIDIA, Saurav received his Ph.D. in Computer Science from the University of Utah under the guidance of Prof. Mary Hall. While at Utah, he worked on machine learning-based techniques to improve the performance, portability, and energy efficiency of GPU programs.

Michael Garland joined NVIDIA in 2006 and is one of the founding members of NVIDIA Research. He currently leads the Programming Systems and Applications Research Group. Dr. Garland holds B.S. and Ph.D. degrees in Computer Science from Carnegie Mellon University, and was previously on the faculty of the Department of Computer Science of the University of Illinois at Urbana-Champaign. He has published numerous articles in leading conferences and journals on a range of topics including surface simplification, remeshing, texture synthesis, novice-friendly modeling, free-form animation, scientific visualization, graph mining, and visualizing complex graphs. His current research interests include computer graphics and visualization, geometric algorithms, and parallel algorithms and programming models.

Mary Hall, a professor in the U's School of Computing, was named a 2020 IEEE Fellow, recognized for her contributions to compiler optimization and performance tuning.

While growing up in her hometown of Beaumont, Texas, Mary Hall became fascinated by computing in part because her mother was an early adopter. “When I was in high school, my mother got interested in computer science, and she started taking classes at the local university,” Hall said. “She was teaching computer literacy. And though no one had computers in their home, we had one of these Radio Shack TRS-80s in our house.” Hall, who is now a professor in the University of Utah’s School of Computing, knows the importance mentors like her mother can have in influencing young students in STEM. So she has devoted her career to encouraging female students and those from underrepresented communities to become computer scientists.

“For people of color, the impoverished, those with disabilities, and gender — we don’t have enough computer scientists, and we need to step up,” Hall said. Hall is chair of the Women in Engineering Faculty Advisory Council for the U’s College of Engineering as well as chair of the School of Computing’s diversity committee. She has implemented diversity-advocacy workshops for faculty as well as a mentoring program where she connects female senior faculty members with new female assistant professors. She’s studied the booming enrollment in computer science and its impact on colleges and the diversity of the student body.

Animesh Garg is an Assistant Professor of Computer Science at the University of Toronto and a Faculty Member at the Vector Institute. Animesh leads the Toronto People, AI and Robotics (PAIR) research group. He is affiliated with Mechanical and Industrial Engineering (courtesy) and the Toronto Robotics Institute. Animesh also shares time as a senior research scientist at NVIDIA in ML and Robotics. Prior to this, Animesh was a postdoc at the Stanford AI Lab working with Fei-Fei Li and Silvio Savarese. He received an M.S. in Computer Science and a Ph.D. in Operations Research from UC Berkeley in 2016.

Animesh was advised by Ken Goldberg in the Automation Lab as a part of the Berkeley AI Research Lab (BAIR). He also worked closely with Pieter Abbeel, Alper Atamturk and UCSF Radiation Oncology.

His current research focuses on machine learning algorithms for perception and control in robotics. Animesh develops algorithmic methods to enable efficient robot learning for long-term sequential tasks through Generalizable Autonomy. The principal focus of his research is to understand representations and algorithms to enable the efficiency and generality of learning for interaction in autonomous agents.

Anima Anandkumar holds dual positions in academia and industry. She is a Bren Professor in the CMS department at Caltech and a director of machine learning research at NVIDIA. At NVIDIA, she leads the research group that develops next-generation AI algorithms. At Caltech, she is the co-director of Dolcit and co-leads the AI4Science initiative, along with Yisong Yue.

She has spearheaded the development of tensor algorithms, first proposed in her seminal paper. They are central to effectively processing multidimensional and multimodal data, and for achieving massive parallelism in large-scale AI applications.

Prof. Anandkumar is the youngest named chair professor at Caltech, the highest honor the university bestows on individual faculty. She is the recipient of several awards, such as the Alfred P. Sloan Fellowship, an NSF CAREER Award, faculty fellowships from Microsoft, Google, and Adobe, and Young Investigator Awards from the Army Research Office and the Air Force Office of Scientific Research. She has been featured in documentaries and articles by PBS, Wired, MIT Technology Review, YourStory, and Forbes.

Jean Kossaifi is currently a Senior Research Scientist at NVIDIA Research and a Research Assistant at Imperial College London. Prior to that, he was a Research Scientist at the Samsung AI Center in Cambridge, following the completion of his PhD in the Department of Computing at Imperial College London, within the iBug group. He has worked extensively on automatic facial affect estimation, a field which bridges the gap between Computer Vision and Machine Learning. After contributing to facial landmark detection using Active Appearance Models and to emotion detection from faces, he currently works on Machine Learning using tensor methods.

Jean created TensorLy, a high-level API for tensor methods and deep tensorized neural networks in Python, designed to make tensor learning simple and accessible.
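The core idea behind such tensor methods can be illustrated with plain NumPy (a hedged sketch, not TensorLy's actual API): a rank-1 tensor is the outer product of vectors, and decompositions like CP express a tensor as a sum of such rank-1 terms.

```python
import numpy as np

# A rank-1 third-order tensor is the outer product of three vectors:
# T[i,j,k] = a[i] * b[j] * c[k]. CP decomposition writes a general tensor
# as a sum of such terms; libraries like TensorLy automate this.

a, b, c = np.array([1., 2.]), np.array([3., 4.]), np.array([5., 6.])
T = np.einsum('i,j,k->ijk', a, b, c)   # shape (2, 2, 2)

# Mode-0 unfolding (matricization): rows indexed by i, columns by (j, k).
T0 = T.reshape(2, -1)

# A rank-1 tensor has rank-1 unfoldings in every mode.
print(np.linalg.matrix_rank(T0))       # 1
```

TensorLy wraps operations like this unfolding behind a uniform API so the same code can run on NumPy, PyTorch, or other backends.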

Vivek Srikumar is an assistant professor in the School of Computing at the University of Utah. He is interested in research questions arising from the need to manage, analyze, and understand large amounts of unstructured data, particularly in textual form. His research lies in the areas of machine learning and natural language processing.

In particular, he is interested in the following broad questions:

  • Text understanding: What does text understanding mean? How can we represent and transform text into a computer-understandable form to perform machine reading and textual inference?

  • Learning representations: How can we learn to represent inputs and outputs in real or discrete spaces that make it easy to build predictors, especially of the structured variety?

  • Structured learning, especially with very little supervision: Many text understanding problems can be phrased as that of predicting a structured representation of text. How do we take advantage of the structure to help to efficiently learn a predictor in spite of having very little annotated data?

  • Structured prediction: Predicting structures is a combinatorial optimization problem. How can we make this faster and scalable?
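For the last question, a chain-structured instance (e.g., sequence tagging) can be solved exactly by dynamic programming. The following is a minimal Viterbi sketch with made-up scores, offered as an illustration of the idea rather than code from his work:

```python
import numpy as np

# Structured prediction over a chain: find the label sequence maximizing
# per-position emission scores plus pairwise transition scores.

def viterbi(emit, trans):
    """emit: (T, K) per-position label scores; trans: (K, K) transition
    scores from previous label to current. Returns the best label sequence."""
    T, K = emit.shape
    score = emit[0].copy()
    back = np.zeros((T, K), dtype=int)
    for t in range(1, T):
        cand = score[:, None] + trans        # (K, K): prev label -> cur label
        back[t] = cand.argmax(axis=0)
        score = cand.max(axis=0) + emit[t]
    path = [int(score.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return path[::-1]

emit = np.array([[2., 0.], [0., 3.], [2., 0.]])   # made-up scores
trans = np.array([[1., 0.], [0., 1.]])            # slight bonus for staying
print(viterbi(emit, trans))                       # -> [0, 1, 0]
```

The dynamic program visits each position once, so decoding is O(T·K²) instead of enumerating all Kᵀ label sequences, which is the kind of structure-exploiting speedup the question above asks about.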

Aditya Bhaskara is an Assistant Professor in the School of Computing at the University of Utah. He is also a member of the data group at the university.

His research interests include:

-- Theoretical computer science: design and analysis of algorithms, approximation algorithms for optimization problems, spectral methods.

-- Machine learning: complexity of learning, design of provably efficient learning algorithms, robustness in machine learning.

For more information, please see his research page.

Rajeev Balasubramonian is a Professor at the University of Utah. His research focuses on many aspects of computer architecture. He is especially interested in studying how future technology trends influence the design of microprocessors and memory systems. In recent years, he has focused on designing memory systems that can cater to the bandwidth, latency, power, cost, security, and reliability demands of datacenter and big-data workloads. His group is also exploring neuromorphic architectures.

Current projects include:

  • Memory Systems : optimizing DRAM/NVM chips, memory controllers, data placement, and security for big-data and datacenter workloads.

Louis-Noël Pouchet is currently an Assistant Professor at Colorado State University. He works on a variety of topics in high-performance computing, and in particular he develops compiler technologies based on the polyhedral framework. He is a member of the Center for Domain-Specific Computing (NSF) and of the DSL Technology for Exascale Computing project (DoE). He also leads research on the enhancement and use of polyhedral compilation technologies for heterogeneous platforms (NSF and Intel ISRA). In the past, he was a member of the Platform-Aware Compilation Environment project (DARPA), where he contributed a polyhedral compilation engine (PolyOpt) for the ROSE compiler.

He received a Ph.D. in Computer Science in Jan. 2010 for his work in the ALCHEMY group at INRIA Saclay, advised by Albert Cohen and Cédric Bastoul. In 2006, he graduated from EPITA (a French engineering school specializing in Computer Science) and, in parallel, received a Master's degree in Computer Science (minor in Cognitive Sciences) from the University of Paris-Sud XI.

He is working on several areas of high-performance computing.

I owe every milestone of mine to Dr. Ganesh Rao. He is my role model, my mentor, and the best teacher I have ever had.

He taught me yoga, and everything else I needed to know, during my undergraduate studies in Bangalore, India. He also inspired me to go to graduate school and pursue a doctoral program in Computer Science.

Dr. Ganesh was the Dean of Telecommunication Engineering at the MSR Institute of Technology, Bangalore, India.

Tom Fletcher, Assistant Professor, School of Computing.

Tom Fletcher received his B.A. degree in Mathematics at the University of Virginia in 1999. He received an M.S. in Computer Science in 2002 followed by a Ph.D. in Computer Science in 2004 from the University of North Carolina at Chapel Hill.

Dr. Fletcher's research is focused on creating novel methods at the intersection of statistics, mathematics, and computer science to solve problems in medical image analysis. He is currently collaborating with researchers in Autism and Alzheimer's disease at the University of Utah on the statistical analysis of combined imaging modalities, including structural MRI, DTI, fMRI and PET in longitudinal studies.

Dr. Fletcher will serve as the SCI Institute's third USTAR faculty member. USTAR is an innovative, aggressive and far-reaching effort to bolster Utah's economy with high-paying jobs and keep the state vibrant in the Knowledge Age. The USTAR Support Coalition and the Salt Lake Chamber sought public and private investment to recruit world-class research teams in carefully targeted disciplines. These teams will develop products and services that can be commercialized in new businesses and industries.

Alex Aiken is the Alcatel-Lucent Professor of Computer Science at Stanford.

He received his Bachelor's degree in Computer Science and Music from Bowling Green State University in 1983 and his Ph.D. from Cornell University in 1988. Alex was a Research Staff Member at the IBM Almaden Research Center (1988-1993) and a Professor in the EECS department at UC Berkeley (1993-2003) before joining the Stanford faculty in 2003. His research interest is in areas related to programming languages.

Here is his publicly available, free, self-study compilers course. The course covers the essentials of compiler construction, plus material on language design and semantics, optimization, and a bit on the history of programming languages.

There are optional programming assignments for hard-core enthusiasts who want to build a full, functioning compiler for COOL, the Classroom Object Oriented Language.

P. (Saday) Sadayappan (Fellow, IEEE) received the B.Tech. degree from the Indian Institute of Technology Madras, Chennai, India, and the M.S. and Ph.D. degrees from Stony Brook University, Stony Brook, NY, USA.

He is a Professor of Computer Science and Engineering at The University of Utah.

His research interests include compiler optimization for parallel and heterogeneous systems, domain/pattern-specific compiler optimization, and analysis/characterization of data movement complexity of algorithms.

Matthew Flatt is a professor at the University of Utah School of Computing in Salt Lake City. He is also a member of the core development team for the Racket programming language.

Flatt received his PhD at Rice University in 1999, under the direction of Matthias Felleisen.

His dissertation is on the mechanics of first-class modules and mixin classes. His work triggered research in the ML community on mutually recursive modules and in the object-oriented community on mixins and traits.

Flatt served as one of four editors of the Revised^6 Report on the Scheme programming language. The report is influenced by his design of Racket, especially the module system, the exception system, the record system, the macro system, and library links.

Prof. Bei Wang was my Advanced Scientific Computing professor during my Master's.

She is interested in the analysis and visualization of large and complex data. Her research expertise lies in the theoretical, algorithmic, and application aspects of data analysis and data visualization, with a focus on topological techniques.

Her research interests include: topological data analysis, data visualization, computational topology, computational geometry, machine learning, and data mining. Previously, she has worked on projects related to computational biology and bioinformatics, as well as robotics.

Her vision is to tackle problems involving large and complex forms of data that require rich structural descriptions, by combining topological, geometric, statistical, data analysis and visualization techniques.

"Advice is like snow – the softer it falls, the longer it dwells upon, and the deeper it sinks into the mind." – Samuel Taylor Coleridge

Prof. Bao Wang (UCLA)

Prof. Scott Mahlke (UMich)

Dr. Peter Lindstrom (LLNL)

Alice Fox (LLNL)

More People (Objective Observers) Who Inspire Me

"We are often confident even when we are wrong, and an objective observer is more likely to detect our errors than we are." – Daniel Kahneman, Thinking, Fast and Slow

Arnab Das

Univ. of Utah

Michael Bentley

Univ. of Utah

Shoaib Ahmed Siddiqui


Seonwook Park

ETH Zurich

Ekta Prashnani


Abhishek Badki


Siva Karthik Mustikovela

Univ. of Heidelberg

Zahra Ghodsi


Beidi Chen

Rice Univ.

Arun Visweswaraiah


Adrian Spurr

ETH Zurich

Maryam Dabaghchian

Univ. of Utah

Clara De Paolis

Northwestern Univ.

Rocco Salvia

Univ. of Utah

Tan Nguyen

Rice Univ.

Benjamin Brock

UC Berkeley

Mattias van Keirsbilck


Kalyan Krishnamani


Vishal Sharma


Tailin Wu


Zhiding Yu


Shalini De Mello


Siva Hari


Steven Dalton


Abdulrahman Mahmoud


Jeremy Bernstein


Mayuri S. Rao

LBNL, Berkeley

Batmanabhan Purushothaman

X-Arm, Intel

Probir Sarkar


Priya Ranjani Das

Daily Hunt

Rajesh CM


Milinda Fernando


Tharindu Rusira


Pruthuvi Maheshakya Wijewardena

Anuj Mahajan

Oxford Univ.


Mark Van der Merwe

Keaton Evans Rowley

John Jacobson

Carlos E. Jimenez

Austin Watkins

Harshitha Manduv

Harvey Dam

Nithin Chalapathi

Emil Geisler

Sahana Kargi

Carson Parker Storm





"The Fabric of the Cosmos", a four-hour series based on the book by renowned physicist and author Brian Greene, takes us to the frontiers of physics to see how scientists are piecing together the most complete picture yet of space, time, and the universe. With each step, audiences will discover that just beneath the surface of our everyday experience lies a world we'd hardly recognize: a startling world far stranger and more wondrous than anyone expected. Brian Greene is going to let you in on a secret: We've all been deceived. Our perceptions of time and space have led us astray. Much of what we thought we knew about our universe—that the past has already happened and the future is yet to be, that space is just an empty void, that our universe is the only universe that exists—just might be wrong. Interweaving provocative theories, experiments, and stories with crystal-clear explanations and imaginative metaphors like those that defined the groundbreaking and highly acclaimed series "The Elegant Universe," "The Fabric of the Cosmos" aims to be the most compelling, visual, and comprehensive picture of modern physics ever seen on television.

Walter Hendrik Gustav Lewin (born January 29, 1936) is a Dutch astrophysicist and former professor of physics at the Massachusetts Institute of Technology. Lewin earned his doctorate in nuclear physics in 1965 at the Delft University of Technology and was a member of MIT's physics faculty for 43 years beginning in 1966 until his retirement in 2009.

Lewin's contributions in astrophysics include the first discovery of a rotating neutron star through all-sky balloon surveys and research in X-ray detection in investigations through satellites and observatories.

Lewin has received awards for teaching and is known for his lectures on physics and their publication online via YouTube, edX and MIT OpenCourseWare.

Richard Feynman lived from 1918 to 1988. He made his mark as an original genius, starting with his work on the Manhattan Project in his early twenties, through winning a Nobel Prize for his work in developing an understanding of quantum mechanics, and finally as a much-loved professor of undergraduate physics at Caltech.

His lectures continue to be available in many places, providing a deep, fundamental, intuitive way to understand physics.

The Feynman method of thought was developed by a man who refused conventional wisdom at all turns and who sought to build his mental computer from the ground up, starting with an understanding of mathematics at a very young age. (Feynman’s early notebooks are records of him deriving algebra, calculus, trigonometry, and various higher maths on his own, with original results and notation.)

This was how Feynman approached all knowledge: What can I know for sure, and how can I come to know it? It resulted in his famous quote, “You must not fool yourself, and you are the easiest person to fool.” Feynman believed it and practiced it in all of his intellectual work.

Patrick Winston, a beloved professor and computer scientist at MIT, died on July 19 at Massachusetts General Hospital in Boston. He was 76.

A professor at MIT for almost 50 years, Winston was director of MIT’s Artificial Intelligence Laboratory from 1972 to 1997 before it merged with the Laboratory for Computer Science to become MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL).

A devoted teacher and cherished colleague, Winston led CSAIL’s Genesis Group, which focused on developing AI systems that have human-like intelligence, including the ability to tell, perceive, and comprehend stories. He believed that such work could help illuminate aspects of human intelligence that scientists don’t yet understand.

“My principal interest is in figuring out what’s going on inside our heads, and I’m convinced that one of the defining features of human intelligence is that we can understand stories,” said Winston, the Ford Professor of Artificial Intelligence and Computer Science, in a 2011 interview for CSAIL. “Believing as I do that stories are important, it was natural for me to try to build systems that understand stories, and that shed light on what the story-understanding process is all about.”

He was renowned for his accessible and informative lectures, and gave a hugely popular talk every year during the Independent Activities Period called “How to Speak.”

Undergraduate Days (mentors)

Dr. Ravish


Dr. Indumati G

Cambridge Tech

Dr. Pappa M.


Dr. Suganya S


Bangalore Days (teachers)


Sri Harsha

Vishwa Teja

Aditya Verma Sagi

Vignesh Ramakrishnan

Lahari Yedla

Prem Kumar

Mishel George

TV Shows

All of us have special talents and superpowers; we need to find them and use them.

Gaia, the spirit of the planet, assembles a diverse team of "planeteers," who are able to combine their powers to summon an elemental warrior that takes on the appearance of superhero Captain Planet. He works with the planeteers to defend Earth from pollution caused by criminals and villains. As the show's theme song says, Captain Planet is "gonna take pollution down to zero" by defeating the villains, who include the likes of Hoggish Greedly, Dr. Blight and Looten Plunder. The animated series was co-created by media mogul Ted Turner, a noted environmentalist.

First episode date: September 15, 1990

Final episode date: May 11, 1996