StaffAttract

Spatial Perception Systems Engineer

Blue Signal - San Francisco, CA

Job Description

Overview

Spatial Perception Systems Engineer
Location: San Francisco, CA (Hybrid or Onsite Preferred)
Type: Full-Time | Autonomy / AR/VR / Robotics

Our client is on a mission to redefine how machines understand and interact with the physical world. As part of their cutting-edge research and development team, they are seeking a Spatial Perception Systems Engineer to develop the computer vision and sensor fusion technologies powering next-generation autonomous systems and immersive environments. This role is ideal for candidates eager to push the envelope in robotics, AR/VR, and spatial computing, working hands-on with advanced hardware and real-time applications.

Key Responsibilities
• Design and deploy computer vision pipelines including object detection, segmentation, tracking, and simultaneous localization and mapping (SLAM).
• Integrate vision and perception modules into physical systems such as autonomous platforms, AR headsets, and edge devices.
• Optimize the performance of algorithms on embedded systems and hardware accelerators (e.g., Jetson, Apple Vision Pro, Qualcomm chipsets).
• Collaborate closely with interdisciplinary teams in robotics, machine learning, and embedded hardware to ensure seamless end-to-end system performance.
• Manage, annotate, and preprocess large-scale image and video datasets for supervised and self-supervised learning applications.

Ideal Background
• 3+ years of experience in computer vision development for real-time or embedded environments.
• Solid foundation in deep learning (CNNs, transformers), geometric vision, and image processing techniques.
• Experience with tools and libraries such as PyTorch, TensorFlow, OpenCV, ROS, and 3D processing frameworks like Open3D or PCL.
• Familiarity with stereo imaging, camera calibration, depth estimation, and environmental variability challenges (e.g., motion blur, occlusion).
• Ability to analyze and address edge-case scenarios using robust perception techniques.

Bonus Qualifications
• Prior hands-on experience with robotics, drones, self-driving vehicles, or AR/VR platforms.
• Deployment experience using inference optimization libraries such as ONNX Runtime, Core ML, or TensorRT.
• Understanding of SLAM, real-time 3D reconstruction, and multi-view geometry.
• Exposure to synthetic training environments using simulation platforms like Unity or Unreal Engine.

Compensation
$160,000 to $230,000 base salary, with equity and a full benefits package included.

Why You Should Apply
• Join a forward-thinking team at the forefront of autonomy and immersive technology.
• Work on hardware-software integration that directly impacts emerging consumer and industrial applications.
• Thrive in a collaborative, problem-solving culture focused on real-world results and scalable innovation.

Created: 2025-09-26
