iOS development

There’s a buzz in the air… or is it on your arm?

Can you feel it yet?

Apple Watch

This is a guest post by Boisy Pitre, Mobile Visionary and lead iOS developer at Affectiva. You can find him on Twitter here, and see his previous guest posts here.


It is the imminent release of Apple’s latest gadget-wonder… the long-awaited Apple Watch. Announced last year, the wearable device is due to hit stores in a few months; its launch, sale price, and subsequent success or failure are the basis of immense speculation in and around the techno news websites and journals.

Last Fall’s announcement of the eagerly anticipated watch was true to Apple’s style of introducing something new: bold yet gentle, glamorous yet modest, confident yet demure. Touted as something truly personal to the wearer, the Apple Watch wooed and wowed the event’s audience and the wider general public. It was easy to see that this wasn’t just another technology device trying to act like a watch, but perhaps the very redefinition of the watch itself.

Predictably, it didn’t take long after the announcement for the questions to follow. How briskly will it sell? Who will buy it? And who wears a watch anymore?

What’s Old is New Again

The concept of wearable devices isn’t necessarily new; it’s been around for some time and has existed in various incarnations. Thinking back 15 years ago, I can distinctly remember attending PalmSource (yes, I’m talking about the Palm Pilot for those of you who can remember) in 2000, and witnessing an attendee walking around the show floor with Palm devices strapped to his forearms. It was reminiscent of Locutus of Borg in an episode of Star Trek: The Next Generation.

Thankfully, we’ve come a bit further in style today. From arm bands with iPods to smartwatches like the Pebble and offerings from Samsung, you don’t have to look like a cyborg to have technology up close and personal to your body. And all indications are that with its myriad of colors, band styles, and body types, the Apple Watch will be as much of a fashion statement as a technology wearable.

Of course, as developers we are the spark and fuel that moves the pistons of Apple’s engines. Seeking new opportunities and pathways for our work is a constant motivation. So what does the Apple Watch mean for the developer?

A Totally New Platform

Just like the iPhone spurred the creation of the amazing “app economy” in 2008 with the release of the native iPhone SDK, the debut of the Apple Watch brings a whole new set of creative potential to the table. Although it has some utility on its own as a timepiece, where the Apple Watch really shines is in its integration with the iPhone: the watch is only really complete when paired with one. The iPhone acts as the deliverer of both content and apps to the watch via Bluetooth. In essence, your Apple Watch becomes an extensible and conveniently accessible accessory to your iPhone.

This means that if you already have an iOS app, you can extend it to bring its functionality to the Apple Watch (assuming some aspect of your app makes sense appearing on someone’s wrist). Your iPhone is the “carrier” of the smarts that your Apple Watch will use, and through it you gain whole new ways to extend the usefulness of your iOS apps.
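In WatchKit terms – the framework Apple has announced for exactly this – your extension runs on the phone and drives interface controllers shown on the watch. Here is a minimal sketch of what such a controller might look like; the label outlet and the sharedSummary() helper are invented for illustration, standing in for whatever content your existing iPhone app already produces:

```swift
import WatchKit

// A minimal WatchKit interface controller. The outlet and the
// sharedSummary() helper are hypothetical stand-ins for whatever
// content your existing iPhone app already produces.
class SummaryInterfaceController: WKInterfaceController {

    @IBOutlet weak var summaryLabel: WKInterfaceLabel!

    override func awakeWithContext(context: AnyObject?) {
        super.awakeWithContext(context)
        // Push a short piece of the iPhone app's content to the watch UI.
        summaryLabel.setText(sharedSummary())
    }

    // Hypothetical helper: in a real app this would read data shared
    // by the paired iPhone, e.g. from a shared app group container.
    private func sharedSummary() -> String {
        return "3 items need your attention"
    }
}
```

The point is the division of labour: the heavy lifting stays on the phone, and the controller simply pushes small, glanceable results onto the watch screen.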

Think Different

A watch is not a phone, and a phone is not a watch. We carry our phones in our pockets and on our hips, but our watches adorn our wrists. As something that you will wear on your arm, the Apple Watch becomes a very convenient, immediate, and intimate place to view and interact with data. It opens up a whole new world of ideas for apps.

Not only is the Apple Watch as a platform more physically accessible, but its screen is considerably smaller than that of any previous iOS device. Given that the largest Apple Watch is 42mm tall (the other option is an even smaller 38mm), you have to think carefully about your app idea and how it will “fit” into such a targeted space.

The smaller design space of the Apple Watch, along with the intimacy and complete accessibility that it offers, is certain to inspire creative app extensions. It’s going to be interesting to see where developers will lay stake in this brave new world.

And It Will Get Better

Like all technology, the Apple Watch is bound to get “smarter” over subsequent revisions and generations. The perceived limitation of its tethering to the iPhone will become less and less pronounced, eventually to the point where the Apple Watch may become a truly stand-alone, Dick Tracy-style futuristic device. Think full audio and video interaction… a complete communications experience right on your wrist.

Challenges certainly remain to get there. The increased processing horsepower and capacity needed to drive more features will demand more battery, and that will challenge Apple in interesting ways. There’s not a lot of room to put larger and larger batteries on your wrist.

Are You Ready?

Wearables are about to get a lot more popular, and the apps that will empower them are going to be more and more in demand. If you’re an iOS developer with an existing app, I encourage you to look at how your app might augment your users’ experience on their wrists with their Apple Watch. Not all apps will find that crossover, but many will, and with it will come the opportunity for you to become closer and more familiar to your users.


Swift London

Are you interested in iOS development? Swift London is a group for iOS and OS X developers of all abilities who want to learn how to use Swift, and who meet regularly at Skills Matter. You can join them for their next meetup on Tuesday 17 February – full details here.

The organisers of the Swift London Meetup group have also put together an impressive line-up for a two-day Swift Summit which is taking place in London on 21 & 22 March. The programme includes speakers such as Chris Eidhof, Daniel Steinberg & Ayaka Nonaka. See the full agenda here.

While It’s Compiling: Skills Matter Interviews Boisy Pitre

While It’s Compiling is a continuing series of interviews with experts across a range of bleeding-edge technologies and practices, exclusive to Skills Matter. Be sure to subscribe to this blog for future interviews, or follow us on Twitter.

Find out who we’ll be interviewing next, and get a chance to put your questions forward with the hashtag #whileitscompiling.

Boisy Pitre at iOSCon 2014

We had a fantastic start to the year when we hosted the first ever iOSCon here at our headquarters in London, bringing together some of the world’s leading iOS experts including Boisy Pitre, Affectiva’s Mobile Visionary and lead iOS developer.

Boisy’s work has led to the creation of the first mobile SDK for delivering emotion analysis to mobile devices, built for Affectiva – the leading emotion technology company and a spin-off of the MIT Media Lab. We were delighted to get the opportunity to interview Boisy while he was here.

You can find the link to his talk from iOSCon at the bottom of this post, and all the talks here.


Hi Boisy, thanks for joining us for this year’s iOSCon. Can you tell us a little about yourself and the work you’ve been doing with Affectiva?

Sure. Currently I’m with Affectiva, an MIT Media Lab start-up based in Boston. We have an interesting technology which analyzes people’s facial expressions to determine their emotional state. The technology was developed based on research that one of the co-founders, Rana El Kaliouby, had pioneered in the affective computing field. The applicability of that technology was originally targeted towards the market research industry, to help measure consumers’ emotional connections to brands and media.

About a year ago, Affectiva decided to expand their technology to mobile devices and tap into other industries beyond their current market – including gaming, healthcare, education and others. So I came on board to lead this mobile initiative, and worked with some brilliant engineers to shrink the existing technology, which had a significant server component, down to the iOS platform. The Affdex Mobile SDK is the outcome of that effort. It does all the processing of emotional data right on the device, reporting results back to the app on a frame-by-frame basis – eliminating the need to connect to a server.
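The overall shape of such an on-device, frame-by-frame pipeline can be sketched roughly as below. To be clear, every type and method name here is invented for illustration – this shows the general delegate-callback pattern of a per-frame detector, not the actual Affdex API:

```swift
import UIKit

// Hypothetical types sketching a per-frame, on-device emotion detector.
// None of these names are the real Affdex Mobile SDK API.
protocol EmotionDetectorDelegate: class {
    // Called once per processed camera frame, entirely on-device.
    func detector(detector: EmotionDetector, didMeasureJoy joy: Float, surprise: Float)
}

class EmotionDetector {
    weak var delegate: EmotionDetectorDelegate?

    // Feed each camera frame in; classification happens locally,
    // so no server round-trip is needed.
    func processFrame(frame: UIImage) {
        // ... run the on-device classifiers here, then report back
        // (placeholder values shown):
        delegate?.detector(self, didMeasureJoy: 0.8, surprise: 0.1)
    }
}
```

An app adopts the delegate protocol, feeds camera frames to the detector, and reacts to each callback – the same pattern UIKit uses throughout, which is what makes a per-frame, on-device SDK feel natural to iOS developers.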

So it’s built on a lot of research – was iOS the natural progression and the natural platform to go to? What sets it apart from other platforms?

iOS was the initially targeted platform. In hindsight, I believe this was the right choice, as targeting iOS devices has been a bit easier due to the commonality of hardware and software, and it allowed us to get the SDK to market pretty quickly. Although we initially focused on iOS, I knew we were going to eventually develop an Android piece as well, which we’re almost done with, in fact. For the Android SDK, we hired team members who love and play in that sandbox. My philosophy is that for a company to be successful with a mobile strategy, they should have experts who specialize in each particular platform.

In terms of applications going beyond the obvious marketing and advertising aspects, what are the real-world applications that exist now? Is there anything particularly interesting or exciting that Affectiva is working on right now?

As far as I’m concerned, it’s the wild west for apps that want to take advantage of emotion technology. It reminds me of the introduction of the iPhone in 2007. The idea of touching your device was not new, but Apple started democratizing it with the iPhone. That was the first real breakthrough moment in mobile. The second important breakthrough moment in mobile was the introduction of voice as input – again, Apple democratized how we interact with our phones and our devices when they offered Siri. I see emotional analysis having that same potential in mobile. Like voice, it gives us a way of controlling the device, and a way for the device to understand you better and offer you more choices.

So what type of apps can take advantage of this technology? Well, the obvious low-hanging fruit is games: you might want your emotions to control the game, or have the game respond to your emotions in some way to adjust the intensity of play.

Health is another big opportunity where I think this technology can bring value. Take emotional health and well-being, for instance… there’s so much research pointing to the fact that our emotions have an impact on our health for better or for worse. So there’s a whole avenue of possibilities in that regard.

Then there’s the fun stuff. Imagine an app that analyzes the photos on your device to determine their emotional content and get an overall feel of your pictures. Or an app that changes music or colors on the screen as it watches your facial expressions. Approaches like that can certainly lead to some interesting applications.

Of course, Affectiva is pursuing app ideas based on this technology, but I cannot comment on them at the moment.

You mentioned the introduction of the first device in 2007, and how things have moved on since, especially with Siri. For someone such as myself, as a user, are things going to continue arriving in stages, or is there anything around the corner that’s going to be as big and as ground-breaking as touch or voice – anything that’s going to jump out?

Emotion recognition technology has the potential to be that huge leap which brings in a completely new way of interacting with our devices, whether we’re sensing emotions using the camera or through some other sensor or mechanism. Having technology understand us better and gather deeper insights into our own emotions, through analysis at specific points in the day as we’re using apps, is a significant breakthrough in computer-human interaction.

And it’s a different level of interaction that liberates us, just as touch liberated us from typing on tiny keyboards and added a new paradigm of natural, direct manipulation with swiping. Emotional expressions on our faces are instinctive; they too can be a form of input and control, but they can also be a great form of feedback to us. I really think this will raise awareness of how we see ourselves in the world, as well as how we interact with others.

We hosted Droidcon last year, with devices such as the interactive mirror that could recognise your emotions in the morning. There is a huge interest currently in the Internet of Things, in connected devices and so on. Going beyond the iPhone or Android devices themselves, do you do much in terms of reaching out into connected devices?

Certainly. This technology can reach beyond just the device in your hand. For example, the automotive industry has expressed interest in our technology. That industry may be easier to break into on the Android side of things than with iOS, as iOS is a lot more compartmentalised and controlled by Apple. But certainly that’s one industry which could benefit from emotional analysis – just imagine driving along and your car wanting to know whether you’re falling asleep, paying attention, or distracted. There are applications there for safety and, again, for health and well-being.

You touched on the fact that Apple and iOS are compartmentalised and controlled a lot more than Android. Do you think that’s a drawback? Is this holding developers back on iOS, or does it create an environment that focuses ideas and energies?

Keep in mind that I’m coming from the Apple perspective as that’s the sandbox I play in. I completely understand and buy into Apple’s reasoning for why they do things. I’m also looking at this from a developer point of view.

We all know that Android exists on many, many mobile devices. It can be ported, unlike iOS, to phones, tablets, and other devices. The trade-off for such sheer ease of portability is the “fragmentation issue” which leads to complexity in development. At some point it becomes too massive for developers to support each of those devices. They must pick and choose their device support carefully.

I believe this is getting better as Android matures, but compared to Apple’s unified, streamlined hardware upgrading approach, it’s still a mess.

Apple’s approach, while certainly much more restrictive, brings a sense of order to the device chaos that permeates Android. If anything, I would argue that developers fare better in the Apple ecosystem because of these controls. But that is my opinion, of course.

Finally, in terms of Affectiva and how you work on a day-to-day basis – what’s the structure there? How does the team work?

We have two engineering teams, one dedicated to Android and the other to iOS. Each has an engineering lead. Both engineering teams interface with the science team, which concentrates specifically on the core technology of emotional classification. As science improvements are made, they are provided to engineering, which integrates the changes and improvements into the SDK code base. Agile is the foundational development methodology we use to organize and account for our work across the teams. This constant, connected cycle allows us to quickly iterate so that we can test, examine performance, and so on.


Watch Boisy’s talk from iOSCon 2014

Boisy Pitre

What if your iPad or iPhone could detect your emotional state and respond in a way that enhances your day? What if an app could deliver soothing content when you’re feeling upset, or play your favourite song when you’re feeling happy? Find out how you could achieve this in Boisy’s talk!

You can see the rest of the Skillscasts from iOSCon 2014 here.


Daniel Steinberg to give an exclusive iOS App Development workshop, 14th May


Daniel Steinberg will be joining Skills Matter this May to deliver an exclusive iOS App Development Quick Start workshop, covering all the fundamental cornerstones of developing apps for iPad, iPhone, and iPod Touch. The workshop includes Xcode, Objective-C, View Controllers, and Storyboards – everything you need to get started in just one day! The course suits developers of all levels, from beginner to advanced; as long as you know a C-style language and some object-oriented programming, this workshop is for you.

Click here to find out more!


iOScon 2014

Not only is Daniel giving this fantastic course, he’ll also be presenting the keynote talk at iOScon, our first ever iOS developer conference. And he won’t be alone; you can meet other top thinkers like Martin Pilkington, who will be demystifying Autolayout, Amy Worrall, who will talk about Key Value Coding, and Simon Whitaker on UIKit Dynamics.

iOScon will be taking place the day after the workshop from May 15th – 16th, followed by a free weekend hackathon on May 17th – 18th where you can meet and collaborate with other iOS developers, create a new app or game, and even win a prize!

If you’re interested in learning more about iOS, meeting other developers, and enhancing your skills, we’ve got a great week coming up for you in May – take your pick!

While It’s Compiling: Skills Matter Interviews Martin Pilkington

While It’s Compiling is a series of interviews with experts across a range of bleeding-edge technologies and practices, exclusive to Skills Matter. Be sure to subscribe to this blog for future interviews, or follow us on Twitter.

Find out who we’ll be interviewing next, and get a chance to put your questions forward with the hashtag #whileitscompiling.

This week we chatted to Martin Pilkington, freelance iOS developer and founder of M Cubed Software, who has been writing for Apple’s platforms for 10 years. He started tinkering with Autolayout when it was first released and fell in love with it straight away – so much so that he is currently writing a book on the subject called The Autolayout Guide (due out Spring 2014).


1. What drew you towards mobile platforms?

I started out developing for the Mac well before the iPhone came about. One of the first apps I developed for the Mac was actually a tool to create linked collections of notes for the early iPods, so I’ve been working with portable devices for quite a while. I bought an iPod touch when they first became available and fell in love with it. When the iPhone SDK came out I jumped at the chance to play around with it, especially as all my skills from developing for the Mac were transferable.

2. Is there anything in particular about iOS that appeals to you?

The big thing is that it appeals to me as a user. There are lots of plusses as a developer, but the key question I have before developing for any platform is: “do I want to use this platform myself?”

3. What new projects are you working on right now?

I have various client projects I’m working on, which I can’t really talk about. The main project I can talk about, which is quite relevant for my talk, is a book I’m working on called The Autolayout Guide. I see a lot of people struggling with Autolayout, just as I did when I started, and the resources available now are just as lacking as they were back then. I wanted to write the definitive guide on Autolayout to help people learn to get the most out of it. It’s also a bit of an experiment, as I’ll be releasing it as an iBook, so I’m hoping to add more and more to it as time goes on.
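For readers who haven’t tried it yet, here’s the flavour of Autolayout expressed in code rather than Interface Builder – two constraints pinning a label near the top of its superview and centring it horizontally (the view names are illustrative, and the API shown is the Swift of the time):

```swift
import UIKit

// Two simple Autolayout constraints: pin a label 20pt below the top
// of its superview and centre it horizontally. Names are illustrative.
let container = UIView(frame: CGRectMake(0, 0, 320, 480))
let label = UILabel()
label.setTranslatesAutoresizingMaskIntoConstraints(false) // opt in to Autolayout
container.addSubview(label)

container.addConstraint(NSLayoutConstraint(item: label, attribute: .Top,
    relatedBy: .Equal, toItem: container, attribute: .Top,
    multiplier: 1.0, constant: 20.0))
container.addConstraint(NSLayoutConstraint(item: label, attribute: .CenterX,
    relatedBy: .Equal, toItem: container, attribute: .CenterX,
    multiplier: 1.0, constant: 0.0))
```

Each constraint is a linear relation between two view attributes – once you read them that way, the layout engine’s behaviour stops feeling like magic.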

4. Where do you see iOS in the next five years?

I don’t know where it will be, but I know where it would like to be. There are two big problems I have with iOS that I’d like to see solved.

The first is the iPad. A lot of apps are designed to take full advantage of the iPad’s screen space, really making you feel the difference between the iPad and iPhone. Unfortunately, the OS is still stuck in the “big iPhone” world. This has become even more apparent with iOS 7, where the iPad feels like an afterthought. Despite the complaints Windows 8 gets, I believe it has far more interesting features on tablets than iOS (such as the split screen) and I’d like to see Apple catch up.

The second is better integration between the Mac and iOS. iCloud tabs is one small step towards this, but I want to see it in more places. If I’m writing an email, I want to be able to pick up my iPad and finish that email there. I want to be able to copy something on iOS and paste it on the Mac. And most importantly I want to be able to send files, photos, etc between the two without having to resort to email or syncing the device. Before they got shut down, Palm was starting to make inroads into this area, but no-one else has really picked it up.

5. What do you wish you’d known when you first started out?

How to use Instruments. I still haven’t mastered it, but it’s one of the most important tools in your arsenal.

6. If you could ask the iOS community anything, what would it be?

To file more radars asking for an official plugin API for Xcode. That’s probably the biggest change that could happen to our tools, and radars are the only way to get Apple to act.


Got a question for Martin? Leave us a comment below!

Martin will be giving a talk about mastering Autolayout at iOScon, our first ever iOS developer conference. Check Skills Matter for updates and tickets!