
Software Development Engineer - Location Technologies, Sensing & Connectivity

Apple
Posted a month ago, valid for 20 days
Location

Cupertino, CA 95015, US

Salary

Competitive

Contract type

Full Time

By applying, a SonicJobs account will be created for you. SonicJobs' Terms & Conditions and Privacy Policy will apply.

Sonic Summary

  • The Location Context team at Apple is seeking engineers with 5+ years of experience in developing commercial software, particularly systems-level or embedded software for resource-constrained devices.
  • The role involves building location state estimators, designing machine learning models for place inference, and optimizing system performance while ensuring user privacy.
  • Candidates should possess strong programming skills in languages such as C, C++, Objective-C, or Swift, along with a solid foundation in algorithms and data structures.
  • Preferred qualifications include expertise in location technologies and on-device machine learning experience, particularly in optimizing model size and power-efficient inference.
  • The position offers a competitive salary, reflecting the complexity and impact of the work on millions of devices.
Our mission is to personalize the user experience on Apple devices based on where you go, when, and what those places mean to you. You're experiencing our work whenever you see a suggested location in Maps or Calendar, or browse your Memories in Photos or Journal. We're working for you whenever your phone engages Do Not Disturb While Driving or remembers where you parked. We're the Location Context team, and we build the location intelligence backbone powering Maps Visited Places, Siri location suggestions, and predictive features across the OS. We're looking for engineers who love solving hard problems at the intersection of location state estimation, on-device machine learning, and privacy-preserving systems. Are you excited by any of these challenges?

  • Building location state estimators that fuse GPS, WiFi, IMU, and altimeter data to understand not just where users are, but what floor of a building they're on
  • Designing ML models to infer the semantics of a place and forecast where the device will go next, entirely on-device with strict power and memory budgets
  • Developing clustering algorithms and data pipelines that process billions of location events while preserving user privacy
  • Optimizing system performance at massive scale, where a 1% edge case impacts 10 million devices and a power regression of 0.1% matters
  • Collaborating with Maps, Siri, Photos, HomeKit, Journal, and Safety teams to power features that require deep contextual understanding

If this sounds like you, read on.
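A recurring theme in the challenges above is fitting intelligence into strict power and memory budgets. As a purely illustrative sketch (not Apple's pipeline; the weight values are made up), symmetric post-training int8 quantization is one common way to shrink a model: weights are stored as 8-bit integers plus a single scale factor, roughly a 4x memory saving versus float32.

```python
def quantize_int8(weights):
    """Symmetric per-tensor int8 quantization: w is approximated by scale * q."""
    scale = max(abs(w) for w in weights) / 127.0  # map the largest weight to +/-127
    q = [round(w / scale) for w in weights]       # integers in [-127, 127]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the int8 codes."""
    return [scale * v for v in q]

# Hypothetical weight vector from some layer of a model.
w = [0.52, -1.27, 0.003, 0.9]
q, scale = quantize_int8(w)
w_approx = dequantize(q, scale)
print(q, round(scale, 4))
```

Very small weights (like 0.003 above) collapse to zero at this precision, which is exactly the accuracy-versus-size trade-off that on-device ML work has to manage.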

Description


In this role, you'll develop the next frontier of location intelligence, in partnership with teams across sensing, Siri, Maps, and system frameworks. You'll work on problems from research through production deployment:

  • Design and implement location state estimation algorithms that fuse multi-modal sensor data (GPS, WiFi positioning, accelerometer, altimeter, barometer) to build a rich understanding of user context and mobility patterns
  • Develop on-device machine learning models for place inference, route prediction, and behavioral forecasting that operate within strict power and memory constraints
  • Build data processing pipelines that aggregate, filter, and cluster real-world sensor data on mobile devices, balancing intelligence with resource constraints
  • Implement sophisticated algorithms for background location awareness and semantic understanding, then integrate them into production code running on hundreds of millions of devices
  • Collect and analyze real-world datasets to train models, validate performance, and iterate on algorithm design
  • Test rigorously. Dogfood your work. Collect metrics across diverse user populations and edge cases. An issue that affects 1% of a billion devices is a big issue.
  • Optimize for the full system: CPU, memory, power consumption, and radio usage. Our software needs to provide a high level of intelligence while sipping battery; this is one of the most exciting engineering challenges in mobile computing.

A dedication to users' privacy and security is core to how Apple does business. We want their devices to exhibit the high level of intelligence and proactivity that can only come from deep contextual understanding. We don't want their sensitive data coming back to Apple or being exposed to third parties. Other companies solve similar problems in very different ways. Our way is more work. We believe it's worth it.
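Multi-modal sensor fusion of the kind described above is classically grounded in Kalman filtering. As an illustrative sketch only (the altitude scenario and noise figures are invented, not Apple's code), a one-dimensional Kalman measurement update shows the core idea: each noisy reading pulls the estimate toward it in proportion to how trustworthy it is, and the fused estimate is more certain than either source alone.

```python
def kalman_update(x, p, z, r):
    """Fuse one noisy measurement z (variance r) into estimate x (variance p)."""
    k = p / (p + r)       # Kalman gain: how much to trust this measurement
    x = x + k * (z - x)   # move the estimate toward the measurement
    p = (1 - k) * p       # fused variance shrinks below both p and r
    return x, p

# Hypothetical altitude estimate fusing a GPS fix and a barometric reading.
x, p = 0.0, 1000.0                        # prior: unknown altitude, huge variance
x, p = kalman_update(x, p, 52.0, 25.0)    # GPS altitude, ~5 m standard deviation
x, p = kalman_update(x, p, 49.0, 4.0)     # barometric altitude, ~2 m std dev
print(round(x, 1), round(p, 2))
```

The final estimate sits nearer the barometer's reading because its variance is smaller; a full filter would add a motion-prediction step between updates.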

Minimum Qualifications


  • 5+ years of experience developing commercial software, preferably systems-level or embedded software running on resource-constrained devices
  • Strong programming skills in C, C++, Objective-C, or Swift, with a solid foundation in algorithms, data structures, and computational complexity
  • Working knowledge of statistics and probability, including comfort with histograms, probability distributions, Bayesian inference, and hypothesis testing
  • Experience evaluating and optimizing system performance: memory footprint, CPU usage, power consumption, and I/O
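To make the Bayesian-inference expectation concrete, here is a toy, hypothetical illustration (the place-inference scenario and every number are invented): a single Bayes update combining a prior belief with how likely the evidence is under each hypothesis.

```python
def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    """Posterior P(H | E) from prior P(H) and the two conditional likelihoods."""
    evidence = prior * likelihood_if_true + (1 - prior) * likelihood_if_false
    return prior * likelihood_if_true / evidence

# Hypothetical: prior belief the device is at "home" is 0.3; the home WiFi
# network is observed 90% of the time when home, 5% of the time otherwise.
posterior = bayes_update(0.3, 0.90, 0.05)
print(round(posterior, 3))
```

A single strong observation lifts the belief from 0.3 to roughly 0.885; chaining such updates over a stream of sensor evidence is the backbone of probabilistic state estimation.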

Preferred Qualifications


  • Deep expertise in location technologies: GPS/GNSS positioning, WiFi-based localization, indoor positioning, sensor fusion for state estimation, or IMU-based dead reckoning. If you've built location estimators that fuse multiple sensor modalities, we especially want to hear from you.
  • Experience with machine learning for time-series data, spatial data, or behavioral prediction. On-device ML experience (model size optimization, quantization, power-efficient inference) is a strong plus.
  • Background in signal processing, Kalman filtering, particle filters, or other probabilistic state estimation techniques.
  • Experience with clustering algorithms (DBSCAN, hierarchical clustering, etc.) and unsupervised learning applied to spatial or temporal data.
  • Track record of shipping production systems that operate at scale under resource constraints (mobile, embedded, or edge computing environments).
  • Strong collaboration skills and the ability to work effectively across teams with diverse expertise. At Apple, you'll partner closely with teams in sensing, connectivity, privacy, and application frameworks. You'll need to communicate clearly, plan collaboratively, and execute flexibly.
  • Experience with performance profiling tools (Instruments, dtrace, etc.) and systematic optimization of CPU, memory, and power usage.
  • Experience with large-scale data analysis for offline algorithm development, model validation, and performance evaluation across diverse user populations.
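DBSCAN, named above, groups points that sit in dense regions and labels isolated points as noise, which is why it suits tasks like detecting visited places from location traces. A minimal from-scratch sketch (illustrative only; the sample coordinates are invented, and real implementations use spatial indexes rather than brute-force neighbor scans):

```python
from math import dist

def dbscan(points, eps, min_pts):
    """Minimal DBSCAN: returns one cluster label per point (-1 = noise)."""
    n = len(points)
    neighbors = lambda i: [j for j in range(n)
                           if dist(points[i], points[j]) <= eps]
    labels = [None] * n
    cluster = -1
    for i in range(n):
        if labels[i] is not None:
            continue
        nbrs = neighbors(i)
        if len(nbrs) < min_pts:      # too sparse to seed a cluster: mark noise
            labels[i] = -1
            continue
        cluster += 1                 # start a new cluster from this core point
        labels[i] = cluster
        seeds = list(nbrs)
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:      # noise reachable from a core point
                labels[j] = cluster  # ...becomes a border point, not expanded
                continue
            if labels[j] is not None:
                continue
            labels[j] = cluster
            j_nbrs = neighbors(j)
            if len(j_nbrs) >= min_pts:   # j is also a core point: keep growing
                seeds.extend(j_nbrs)
    return labels

# Hypothetical visit locations (metres in a local frame): two tight clusters
# and one isolated point that should come out as noise.
pts = [(0, 0), (1, 0), (0, 1), (10, 10), (10, 11), (11, 10), (50, 50)]
labels = dbscan(pts, eps=2.0, min_pts=2)
print(labels)
```

Unlike k-means, DBSCAN needs no preset cluster count and tolerates outliers, both useful properties when the number of places a user visits is unknown in advance.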



