
Researching Perception for Autonomous Vehicles

STANFORD INTELLIGENT SYSTEMS LABORATORY

RESEARCH ASSISTANT

Perception for Autonomous Vehicles

JUNE 2019 - PRESENT

SKILLS
  • Machine learning with PyTorch

  • Programming in Python and Bash

  • Interfacing with LiDAR datasets from CARLA, KITTI, and Waymo

  • Presenting and communicating at poster sessions

TAKEAWAYS
  • Autonomy over a research project, with guidance from a mentor

  • Learning cutting-edge approaches from research publications and talks

  • Collaborating with other researchers at the intersection of our work

using LiDAR data from CARLA Town 1

Shown above is a video of the occupancy maps for a car simulated in the CARLA environment.

From left to right, the figure shows:

(1) visualized LiDAR points (the input): blue points represent free space and red points are occupied,

(2) a local occupancy map, with color denoting the probability that each area is occupied,

(3) a local variance map, which provides a measure of confidence in the occupancy estimates,

(4) a global occupancy map, built by accumulating data from each local map.
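Occupancy maps like the ones above are often built with a Bayesian log-odds update: each scan nudges a cell's log-odds up when a LiDAR return hits it and down when a beam passes through it. The sketch below illustrates that general technique, not this project's actual code; the log-odds increments and cell indices are assumed values chosen for demonstration.

```python
import numpy as np

# Assumed log-odds increments for a miss (free) and a hit (occupied).
L_FREE, L_OCC = -0.4, 0.85

def update_log_odds(log_odds, hits, misses):
    """Fold one scan's evidence into a local log-odds grid."""
    log_odds = log_odds.copy()
    log_odds[hits] += L_OCC     # cells with a LiDAR return: more likely occupied
    log_odds[misses] += L_FREE  # cells the beam passed through: more likely free
    return log_odds

def occupancy_probability(log_odds):
    """Convert log-odds back to occupancy probability in [0, 1]."""
    return 1.0 / (1.0 + np.exp(-log_odds))

# Illustrative 3x3 local grid: log-odds 0 means probability 0.5 (unknown).
grid = np.zeros((3, 3))
grid = update_log_odds(grid, hits=(1, 1), misses=(0, 0))
probs = occupancy_probability(grid)
# probs[1, 1] rises above 0.5, probs[0, 0] drops below, untouched cells stay at 0.5
```

Because updates are additive in log-odds space, local maps can be accumulated into a global map simply by summing the log-odds of overlapping cells before converting back to probabilities.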

Local and Global Occupancy Maps
