Senior Software Engineer, Autonomy - Perception, Deep Learning

Summary

Location: Mountain View, United States
Salary: $180k-198k/year
Work: Full-time
Experience: 4+ years

Key Benefits

  • Company 401k
  • Public Company Equity
  • Full Health Coverage
  • 100% Paid Life & Disability
  • Flexible Vacation
  • Paid Parental Leave

About this Job

About Cyngn

Based in Mountain View, CA, Cyngn is a publicly traded autonomous technology company. We deploy self-driving industrial vehicles like forklifts and tuggers to factories, warehouses, and other facilities throughout North America. To build this emerging technology, we are looking for innovative, motivated, and experienced leaders to join us and move this field forward. If you like to build, tinker, and create with a team of trusted and passionate colleagues, then Cyngn is the place for you. Key reasons to join Cyngn:

We are small and big.

With under 100 employees, Cyngn operates with the energy of a startup. At the same time, we’re publicly traded. This means our employees not only work in close-knit teams with mentorship from company leaders—they also get access to the liquidity of our publicly traded equity.

We build today and deploy tomorrow.

Our autonomous vehicles aren’t just test concepts—they’re deployed to real clients right now. That means your work will have a tangible, visible impact.

We aren’t robots. We just develop them.

We’re a welcoming, diverse team of sharp thinkers and kind humans. Collaboration and trust drive our creative environment. At Cyngn, everyone’s perspective matters—and that’s what powers our innovation.

About this role:

We are seeking an experienced Senior Software Engineer to join our Perception team within the Autonomy Software organization. This role centers on developing advanced deep learning models that power object detection, segmentation, and tracking using camera and lidar data for Cyngn’s autonomous forklifts. A key responsibility will be designing, training, and deploying multi-modal perception models to detect and localize pallets in diverse warehouse environments, enabling reliable pallet engagement and stacking.

Additional experience with model acceleration techniques, building model development infrastructure, and ownership of the end-to-end model lifecycle—from research and training through deployment in production—is considered a strong bonus.

Responsibilities

  • Design, implement, and optimize deep learning models for object detection, segmentation, and tracking using camera and lidar data.
  • Build and maintain data pipelines, training infrastructure, and inference frameworks to support reproducible and scalable model development.
  • Develop tools and metrics for evaluating model performance and ensuring robustness across diverse warehouse environments.
  • Work with third-party annotation vendors to generate high-quality labeled datasets for training and validation.

Qualifications

  • MS/PhD in computer science, computer engineering, robotics, or a similar technical field of study.
  • 4+ years of experience writing Python software in a production environment, including unit testing, code review, and algorithm performance trade-offs.
  • Strong theoretical foundation in deep learning techniques for computer vision, with working knowledge of linear algebra, probability, and optimization.
  • Hands-on experience developing and deploying deep learning models for real-world perception tasks (e.g., detection, segmentation, multi-object tracking).
  • Proficiency with libraries such as PyTorch, TensorFlow, NumPy, SciPy, and OpenCV (Python).
  • Experience building and integrating tools and infrastructure to streamline the model development lifecycle, including model versioning, evaluation, and deployment.
  • Excellent written & verbal communication skills.

Bonus Qualifications

  • Proficiency in C++ for robotics or real-time applications.
  • Hands-on experience with model acceleration or compression (TensorRT preferred).
  • Experience utilizing foundation models (e.g., SAM, CLIP) to expedite training and development cycles.
  • Strong classical computer vision skills (e.g., image processing, feature extraction, geometry-based methods) to complement deep learning approaches.
  • Experience leveraging Isaac Sim synthetic data to augment real-world datasets and accelerate model development.
  • Exposure to industrial material handling autonomous vehicles, including forklifts and tuggers operating in dynamic warehouse environments.

Benefits & Perks

  • Health benefits (Medical, Dental, Vision, HSA and FSA (Health & Dependent Daycare), Employee Assistance Program, 1:1 Health Concierge)
  • Life, short-term, and long-term disability insurance (Cyngn funds 100% of premiums)
  • Company 401(k)
  • Commuter Benefits
  • Flexible vacation policy
  • Sabbatical leave opportunity after 5 years with the company
  • Paid Parental Leave
  • Daily lunches for in-office employees and a fully stocked kitchen with snacks and beverages

$180,000 - $198,000 a year

About the Company

Cyngn

Public Company
Logistics & Warehousing · Transportation & Autonomous Vehicles · Robotics Software & AI

Automate your material handling and repetitive workflows with our Autonomous Tuggers and Forklifts. Our self-driving technology, DriveMod, is built on trusted, heavy-duty vehicles from legacy OEMs like Motrec and BYD to ensure that our self-driving industrial vehicles can do the most demanding industrial jobs, are familiar to operate, and are a breeze to maintain. Studies have shown that our deployments reduce labor costs by 64% and have made teams 33% more productive. That’s like adding an additional team member for every three workers at your facility, instantly.

With DriveMod, your industrial vehicles can:

  • Safely navigate sites without the need for special infrastructure.
  • Leverage multiple, redundant, and intelligent layers of safety.
  • Execute missions based on a variety of flexible, programmable skills, including "auto-unhitch".
  • Be switched into manual mode and driven by a human.
  • Transport goods to any on-site location, indoors and outdoors.
  • Haul and tow thousands of pounds of heavy goods and cargo.
  • Collect data and reveal suggestions for optimization.
  • Be remotely managed and monitored via the FMS or on-vehicle display.
