Description
STARK is a new kind of defence technology company revolutionizing the way autonomous systems are deployed across multiple domains. We design, develop, and manufacture high-performance unmanned systems that are software-defined, mass-scalable, and cost-effective. This provides our operators with a decisive edge in highly contested environments.

We're focused on delivering deployable, high-performance systems - not future promises. In a time of rising threats, STARK is bolstering the technological edge of NATO Allies and their Partners to deter aggression and defend Europe - today.

Your mission
You will develop cutting-edge computer vision solutions and algorithms for our autonomous vehicles and bring them onto our hardware in real time.

Responsibilities
- Design and develop object detection and tracking pipelines for guidance and terminal guidance
- Work with frame-to-frame tracking, re-detection logic, and confidence estimation to prevent tracker drift or false lock
- Develop approaches to separate targets from shadows, reflections, terrain, and background noise
- Work with multi-sensor vision (daylight, night, thermal/IR), including alignment, synchronization, and fusion
- Provide uncertainty and confidence metrics to downstream guidance and control systems
- Integrate vision outputs into the guidance loop, considering latency, update rate, and coordinate frames
- Optimize and deploy models on edge hardware (e.g., Jetson-class devices)
- Analyze flight logs, investigate failure cases, and improve robustness based on real data
- Collaborate closely with GNC, autonomy, and embedded teams during development and flight testing

Qualifications
- Strong background in computer vision (classical and deep learning)
- Experience with object tracking beyond basic APIs (understanding drift, re-initialization, and failure modes)
- Hands-on experience with video-based systems in real-world conditions
- Solid programming skills in Python
- Experience deploying or optimizing models for real-time / edge environments
- Ability to reason about uncertainty, confidence, and system-level impact

Nice to have
- Experience with multi-sensor systems (day / night / thermal)
- Background in robotics, UAVs, or autonomous systems
- Familiarity with ROS / ROS2 / C++
- Experience with Kalman filters or sensor fusion
- Knowledge of ONNX, TensorRT, CUDA
- Experience working in GNSS-denied or degraded environments