Samsung will “lead by following” Apple Watch
Samsung LSI marketing team head Kyushik Hong spoke at length about "Innovation for the next mobile experience," outlining plans to introduce a "Bio Processor" chip packing a series of components for recording health data. Asked when the new chip would be introduced and when Samsung expected it to become a meaningful revenue generator, Hong stated that it was expected to ship early next year and might be used in some kind of band or other activity-focused product, not necessarily from Samsung. And while his presentation discussed the "wearable device trend" and the potential for wearables to grow dramatically in shipment volumes, there was no discussion of how Samsung was actually performing in the smartwatch category it largely introduced. Samsung first partnered with Google on Android Wear, then went solo with its own Tizen-based Gear watches, all without achieving any success along the way, before being steamrolled by the arrival of Apple Watch.
At the same time, the “trends” Samsung identified for wearable devices included authentication and payment, features Samsung’s Galaxy Gear models continue to lack. Apple Watch introduced Apple Pay last fall, but the company’s own new “Samsung Pay” is a feature still confined to Samsung’s phones.
The primary unique "feature" Samsung added to its watches that Apple didn't was a small, low-quality 1.9 megapixel camera, which gave them a creepy voyeur vibe reminiscent of Google Glass while failing to capture images of any useful quality.
Samsung’s Galaxy Gear lineup hasn’t attracted many buyers. Instead, Best Buy saw more than 30 percent of its Galaxy Gear sales returned by unsatisfied customers, according to a report by Ars Technica.
Samsung unveils some existing camera technology
Focusing next on photography as a feature of smartphones, Hong introduced “fast and accurate auto focus” using phase detection. Apple calls this “Focus Pixels,” and introduced it last year as a feature of iPhone 6 (using sensors developed by Sony). Samsung had earlier introduced phase detection autofocus in its Galaxy S5, but its speed to market didn’t change the fact that the S5 was outsold by Apple’s iPhone 5s models without the feature. iPhone 6, with Focus Pixels of its own, further trounced the Galaxy S6.
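The core idea behind phase-detection autofocus can be illustrated with a toy calculation. This is a hypothetical sketch, not how any shipping sensor's silicon actually works: paired photosites view the scene through opposite halves of the lens, defocus displaces the two signals relative to each other, and the shift that best re-aligns them tells the camera how far, and in which direction, to drive the lens.

```python
# Toy sketch of phase-detection autofocus ("Focus Pixels").
# Hypothetical illustration only: real PDAF compares signals from
# masked or split photosites, but the alignment search is the same idea.

def phase_shift(left: list[float], right: list[float], max_shift: int = 4) -> int:
    """Return the integer shift that best re-aligns the two 1-D signals."""
    def sad(shift: int) -> float:
        # Mean of absolute differences over the overlapping region.
        pairs = [(left[i], right[i + shift])
                 for i in range(len(left))
                 if 0 <= i + shift < len(right)]
        return sum(abs(a - b) for a, b in pairs) / len(pairs)
    return min(range(-max_shift, max_shift + 1), key=sad)

# An edge as seen through the "left" half of the lens, and the same
# edge displaced two pixels by defocus in the "right" half:
left  = [0, 0, 0, 1, 1, 1, 1, 0, 0, 0]
right = [0, 1, 1, 1, 1, 0, 0, 0, 0, 0]
shift = phase_shift(left, right)   # -2: re-aligning undoes the displacement
```

The sign and magnitude of the recovered shift are what let phase detection focus in one movement, rather than hunting back and forth as contrast-detection systems do.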
While much attention is devoted to imagining how Apple’s innovations and technologies will be commodified by Android licensees, the reverse actually seems to be happening: any technical advantage introduced by others is eventually adopted by Apple (examples include LTE, NFC and barometers), while Apple’s technical leaps remain largely unmatched by rivals (such as Touch ID, Continuity and 3D Touch).
Other “futuristic” ideas the company addressed included compositing multiple exposures to achieve wide dynamic range, and “ISOCELL technology” that puts a barrier between pixels to increase light sensitivity and effectively “controls the absorption of electrons.” If that sounds familiar, it’s because Apple introduced the concept as “deep trench isolation” in explaining its efforts to increase the pixel count of the iPhone 6s camera sensor without also increasing the noise that commonly appears as pixels shrink to be packed more densely into a higher-resolution sensor.
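The multiple-exposure approach is essentially exposure fusion: each output pixel is a weighted average of the bracketed frames, with well-exposed values near mid-gray weighted heavily and clipped shadows or highlights weighted lightly. A minimal sketch, with made-up pixel values and a simple Gaussian weighting that is an illustrative assumption, not any vendor's actual pipeline:

```python
import math

# Toy exposure fusion: combine bracketed frames of the same scene,
# favoring whichever frame exposed each pixel best.

def exposure_weight(v: float) -> float:
    """Weight a [0, 1] pixel value by how well-exposed it is (peak at mid-gray)."""
    return math.exp(-((v - 0.5) ** 2) / 0.08)

def fuse_pixel(values: list[float]) -> float:
    """Weighted average of the same pixel across bracketed frames."""
    weights = [exposure_weight(v) for v in values]
    return sum(w * v for w, v in zip(weights, values)) / sum(weights)

def fuse_exposures(frames: list[list[float]]) -> list[float]:
    """Fuse flat frames (lists of [0, 1] values) pixel by pixel."""
    return [fuse_pixel(list(vals)) for vals in zip(*frames)]

under = [0.05, 0.10, 0.40, 0.02]   # underexposed frame: shadows crushed
over  = [0.60, 0.95, 0.90, 0.55]   # overexposed frame: highlights clipped
fused = fuse_exposures([under, over])
```

Each fused pixel lands between the two source values, pulled toward whichever frame captured that region without clipping, which is how the composite retains both shadow and highlight detail.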
Samsung rushed high resolution camera sensors to market before Apple, but their high megapixel counts didn’t result in better photos. Instead, they resulted in increased low light noise and less accurate color reproduction.
While Apple explained that it was using this new technology to increase iPhone camera resolution without losing quality, Samsung stated that its goal for the same process (under a different name) was to reduce pixel size in order to help reduce the overall thickness of its phones. Samsung stated it was reducing the pixel size of its 16MP sensor from 1.12µm to 1.0µm to achieve 1mm of reduced thickness. Apple reduced pixel size in moving from iPhone 6’s 8MP sensor at 1.5µm to iPhone 6s’ 12MP sensor at 1.22µm, not primarily to reduce device thickness, but to increase photo and video capture resolution without losing quality, while maintaining larger pixels than competing sensors. Pixel size reduction on its own simply makes each pixel less sensitive to light.
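The physics behind that last point is simple geometry: a pixel's light-gathering area scales with the square of its pitch. A rough back-of-envelope calculation using the figures cited above (treating quoted pitch as the sole determinant of sensitivity, which ignores microlenses and other real-world factors):

```python
# Back-of-envelope: per-pixel light-gathering area scales with
# the square of the pixel pitch, so shrinking the pitch cuts the
# light each pixel can collect.

def area_ratio(new_pitch_um: float, old_pitch_um: float) -> float:
    """Fraction of per-pixel area retained after shrinking the pitch."""
    return (new_pitch_um / old_pitch_um) ** 2

# Samsung: 16MP sensor, 1.12um -> 1.0um pixels
samsung = area_ratio(1.0, 1.12)
# Apple: 8MP at 1.5um (iPhone 6) -> 12MP at 1.22um (iPhone 6s)
apple = area_ratio(1.22, 1.5)

print(f"Samsung pixels retain {samsung:.0%} of their area")  # 80%
print(f"Apple pixels retain {apple:.0%} of their area")      # 66%

# Total photosensitive area (megapixels x per-pixel area, in mm^2):
apple_total_old = 8e6 * 1.5**2 * 1e-6    # ~18.0 mm^2
apple_total_new = 12e6 * 1.22**2 * 1e-6  # ~17.9 mm^2
```

The totals show the distinction: Apple's per-pixel area shrank by about a third, but it added 50 percent more pixels, keeping overall sensor area (and total light capture) roughly constant while boosting resolution. Samsung's shrink, applied to the same 16MP count, simply gives up around 20 percent of each pixel's light in exchange for a thinner module.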
It’s noteworthy that while Apple uses a custom version of Sony’s camera sensor for iPhone 6/6s, Samsung also uses Sony’s IMX240 sensor in its Galaxy S6/S6 Edge, at least in the versions it sends to reviewers. Regular users are finding that Samsung might also swap in its own ISOCELL camera sensors to save money, resulting in reduced image quality.
This all happened before
Overall, Samsung’s investor conference seemed far less ambitious and confident than its 2013 event, where JK Shin, Samsung’s president and chief executive of IT & Mobile, promised that the company would “play a key role in the premium smartphone market.”
As AppleInsider noted at the time, this was a direct contradiction of the warning Samsung had earlier given its investors of slowing profits.
It also belied the reality that most of the phones Samsung had been selling, and was still selling, were low-end devices, not premium phones. Further, Samsung has repeatedly noted, and continues to note, that its premium sales remain static (rather than experiencing the tremendous growth in demand it promised) and that its unit growth is coming from low-end devices, which are eroding its Average Selling Price.
Back in 2013, Samsung focused on screen resolutions, forecasting that by this year it would be selling smartphones with 3840×2160 displays. Instead, the company is still selling “WQHD” screens, and even those saddle Samsung’s high-end devices with excessive resolutions that its relatively anemic Application Processors aren’t quite capable of driving competitively.