Apple and You: A Publisher’s Cheat Sheet

A new iOS could unlock deeper engagement opportunities for publishers.


Regardless of Apple’s dulling sheen in recent years, its evolution of iOS and smartphones remains relevant to all media companies. But don’t let the headlines from this week’s widely watched rollout (a $1,000 iPhone X? Yikes!) distract you from the developments most material to publishers.

Augmented reality? No kidding. Apple is serious about AR, and the camera has now become the focus of future interaction. Publishers need to take seriously the ways the AR capabilities baked into iOS 11 can enhance publishing efforts. In apps (and hopefully on the web too), publishers will have the opportunity to overlay product reviews, product guides, how-tos, media enhancements, print-to-web connections and more onto the physical world. This can change our notion of publishing. As wearables and AR-enabled screens become interfaces to the physical world, media is no longer an environment unto itself but an augmentation of reality.

QR Redux. While it may be too little too late, iOS 11 finally embeds QR code support in the camera interface. For print publishers, this makes it that much easier to implement a standard, familiar 2D coding system that triggers an online extension of a print experience. For over a decade, publishers have struggled with a range of proprietary and open 2D code, watermarking, AR and image-recognition protocols that bridge analog and digital media. The codes themselves remain as ugly as ever, of course, but at least they no longer require users to find, download and launch a third-party app. If only this simple standard had been baked into the camera years ago, when it should have been. Mobile behaviors have evolved in the intervening years, however, and whipping out your phone camera while interacting with the analog world is now enough of a reflex that magazines should be able to leverage it.

Video Picture-in-Picture on iPad. This could be a big deal, at least for tablet traffic. iOS 11 on iPad lets you shrink most full-screen video views from major players (including embedded YouTube) into a persistent floating window. As many app makers have discovered, allowing viewers to multitask while watching a video is a great way to encourage streaming media use. We hope this capability comes to iPhone as well, since even a postage-stamp-sized PiP window is effective; much of video content is really about the audio track anyway.

Cookie Crumbles. Safari in iOS 11 is taking a machete to tracking cookies, using machine learning to distinguish the cookies that belong to sites you visit frequently from those set by third-party trackers, which it will drop. This is going to wreak havoc on the mobile ad ecosystem. It will favor sites with frequent visits from the same people (more power to Google and Facebook?) and undermine the value of drive-by visits, and perhaps of sites that do not enjoy a lot of direct traffic. Exactly how and whom this will affect is unknown right now, but expect it to hurt anyone relying on revenue tied to retargeting and audience extension. Conversely, it might strengthen the hands of multi-title publishers who beef up their own first-party data to leverage across their properties.




The Daily App. The App Store enjoys its most substantial redesign in years and now looks more like a magazine about apps. It highlights daily trending apps and offers back stories about an app’s development as well as curated lists, all of which show up in search results too. Publishers will want to rework their product pages, which now allow for autoplay videos, chart position, and easier in-app purchases. Will this help better apps find their audience? All we know from testing the beta for months is that we visit the App Store more often than we once did.

The Mediated Selfie. Many of the tech innovations Apple touted in its iPhone X (facial recognition, a high-resolution front-facing camera, facial AR appliques, animojis, which are animated emojis that mimic you speaking) are informed by something deeper than novelty posing as innovation. They strike a key theme emerging in mobile communication: it is all about controlling, containing, augmenting and refining how we project our sense of self onto the world. The wildly popular facial filters in Instagram and Snapchat, emojis, the selfie itself: all are at once tools of self-expression and buffers between our private selves and others.

The point here is that media companies need to consider how their content can become tools for users to establish and project their virtual identity. We have seen this in rudimentary form with branded stickers, keyboards and emojis. We have also seen some version of it in the way people share branded media in part to establish or enhance their online social identity. But as various AR techniques become baked into the mobile operating system, new vistas open up. Home décor, apparel, celebrity faces, beauty styles and items, products of all sorts (the stuff of magazine journalism) can now be overlaid onto a user’s real world, wrapped around and tied to their virtual identity, and shared and projected into the digital social universe. Mobile users spend much of their time in two or three social apps (capturing media content haphazardly along the way) because they prefer a virtual social context to traditional interfaces with media. Intercepting them during that experience with traditional news fare is not enough. Publishers need to offer tools and content that help users establish, embellish and project the virtual social self they are living on these devices.


