Headshot of Jeehan (who is a brown woman with black hair) smiling at the camera, wearing a light blue shirt and black-rimmed eye glasses.

Jeehan Malik

I am a Human Factors Engineer at Apple.
Previously: PhD candidate at the University of Iowa, advised by Prof. Joe Kearney, and a member of the Hank Lab.
Research area: Human-Computer Interaction, specifically accessibility and aging. I am interested in research on accessibility and perception in mixed reality applications, as well as safer transport and mobility.

Resume | Email: jeehanmalik24@gmail.com | Twitter: @jeehanfmalik | Google Scholar



Do Simulated Augmented Reality Alerts Impact Street-Crossing Behavior in Non-Mobility Impaired Older and Younger Adults?
Human Factors 2023 

Determining the Impact of Smartphone Alerts and Warnings on Street-Crossing Behavior in Non-Mobility Impaired Older and Younger Adults
ACM CHI 2021 [promo video]

Increasing Access to Trainer-led Aerobic Exercise for People with Visual Impairments through a Sensor Mat System
Poster Paper at ASSETS 2021 

Determining a Taxonomy of Accessible Phrases During Exercise Instruction for People with Visual Impairments for Text Analysis
Poster Paper at ASSETS 2021

Mobile Tasks to Improve Art Description Accessibility for People with Visual Impairments
EAI International Conference: ArtsIT, Interactivity & Game Creation (EAI ArtsIT’21)


I am part of a multidisciplinary team that investigates sensing technologies across a wide range of Apple products. I focus on designing frameworks and studies to understand and improve users’ experience with Apple products. My work typically involves development of study methodology, providing user-study design recommendations, developing data-processing pipelines and tools, and data analysis. My work requires collaborating cross-functionally with design, engineering, marketing, and development teams.

At the Hank Lab, we used a Virtual Environment to explore the effect of Vehicle-to-Pedestrian communication technology on older pedestrian street-crossing behavior. We presented older and younger adults with smartphone and Augmented Reality alerts and warnings and examined their street-crossing behavior. We are now further exploring how pedestrian behavior can change in different contexts and with different abilities.

At the HawCHI Lab, we explored how to include people with visual impairments in exercise classes. We developed a system that uses a sensor mat and a convolutional neural network to detect foot placement, and we are designing feedback for a step-aerobics workout.

Poster Presentations

Poster titled "Mobile Applications to help Older Adults Make Safe Street-Crossing Decisions"
Safer-Sim Symposium, Nov '18


CS:1210 Computer Science Fundamentals (2018-19) 

Instructor for the discussion sections. I taught two lectures each week, with about 20 students per section.