
There’s a buzz in the air… or is it on your arm?

Can you feel it yet?

Apple Watch

This is a guest post by Boisy Pitre, Mobile Visionary and lead iOS developer at Affectiva. You can find him on Twitter here, and see his previous guest posts here.

It is the imminent release of Apple’s latest gadget-wonder… the long-awaited Apple Watch. Announced last year, the wearable device is due to hit stores in a few months; its launch, sale price, and subsequent success or failure are the basis for immense speculation in and around the techno news websites and journals.

Last Fall’s announcement of the eagerly anticipated watch was true to Apple’s style of introducing something new: bold yet gentle, glamorous yet modest, confident yet demure. Touted as something truly personal to the wearer, the Apple Watch wooed and wowed the event’s audience and the wider general public. It was easy to see that this wasn’t just another technology device trying to act like a watch, but perhaps the very redefinition of the watch itself.

Predictably, it didn’t take long after the announcement for the questions to follow. How briskly will it sell? Who will buy it? And who wears a watch anymore?

What’s Old is New Again

The concept of wearable devices isn’t necessarily new; it’s been around for some time and has existed in various incarnations. Thinking back 15 years ago, I can distinctly remember attending PalmSource (yes, I’m talking about the Palm Pilot for those of you who can remember) in 2000, and witnessing an attendee walking around the show floor with Palm devices strapped to his forearms. It was reminiscent of Locutus of Borg in an episode of Star Trek: The Next Generation.

Thankfully, we’ve come a bit further in style today. From armbands with iPods to smartwatches like the Pebble and offerings from Samsung, you don’t have to look like a cyborg to have technology up close and personal to your body. And all indications are that with its myriad of colors, band styles, and body types, the Apple Watch will be as much a fashion statement as a technology wearable.

Of course, as developers we are the spark and fuel that moves the pistons of Apple’s engines. Seeking new opportunities and pathways for our work is a constant motivation. So what does the Apple Watch mean for the developer?

A Totally New Platform

Just like the iPhone spurred the creation of the amazing “app economy” in 2008 with the release of the native iPhone SDK, the debut of the Apple Watch brings a whole new set of creative potential to the table. Although it has some utility on its own as a timepiece, where the Apple Watch really shines is in its integration with the iPhone, which acts as the deliverer of both content and apps to the watch via Bluetooth. In essence, your Apple Watch becomes an extensible and conveniently accessible accessory to your iPhone.

This means that if you have an iOS app already written, you can extend it to bring its functionality to the Apple Watch (assuming there is some aspect of your app that makes sense on someone’s wrist). Your iPhone is the “carrier” of the smarts that your Apple Watch will use, giving you whole new ways to extend the usefulness of your iOS apps.

Think Different

A watch is not a phone, and a phone is not a watch. We carry our phones in our pockets and on our hips, but our watches adorn our wrists. As something that you will wear on your arm, the Apple Watch becomes a very convenient, immediate, and intimate place to view and interact with data. It opens up a whole new world of ideas for apps.

Not only is the Apple Watch as a platform more physically accessible, but its screen is considerably smaller than that of any previous iOS device. Given that the largest Apple Watch is 42mm tall (the other option is an even smaller 38mm in height), you have to think carefully about your app idea and how it will “fit” into such a constrained space.

The smaller design space of the Apple Watch, along with the intimacy and complete accessibility that it offers, is certain to inspire creative app extensions. It’s going to be interesting to see where developers will lay stake in this brave new world.

And It Will Get Better

Like all technology, the Apple Watch is bound to get “smarter” over subsequent revisions and generations. The perceived limitation of its tethering to the iPhone will become less and less pronounced, eventually to the point where the Apple Watch may become a truly stand-alone, Dick Tracy-style futuristic device. Think full audio and video interaction… a complete communications experience right on your wrist.

Challenges certainly remain to get there. The increased processing horsepower and capacity required to drive more features will demand more battery power, and that will challenge Apple in interesting ways. There’s not a lot of room to put larger and larger batteries on your wrist.

Are You Ready?

Wearables are about to get a lot more popular, and the apps that will empower them are going to be more and more in demand. If you’re an iOS developer with an existing app, I encourage you to look at how your app might augment your user’s experience on their wrist with their Apple Watch. Not all apps may find that crossover, but many will, and with it will come the opportunity for you to get closer to, and more familiar with, your users.

Swift London

Are you interested in iOS development? Swift London is a group for iOS and OS X developers of all abilities who want to learn the language, and they meet regularly at Skills Matter. You can join them for their next meetup on Tuesday 17 February – full details here.

The organisers of the Swift London Meetup group have also put together an impressive line-up for a two-day Swift Summit which is taking place in London on 21 & 22 March. The programme includes speakers such as Chris Eidhof, Daniel Steinberg & Ayaka Nonaka. See the full agenda here.

A Look at Swift: Strings and Things


This is a guest post by Boisy Pitre, Mobile Visionary and lead iOS developer at Affectiva. Here he talks to us about string manipulation and interpolation, the simplicity of Swift and the power of Apple’s compilers – and how with Swift, less is more.

This is a follow-up post to Boisy’s While It’s Compiling interview from iOSCon 2014, which you can read here.

One of the very cool things about Swift is probably the most mundane: string manipulation and interpolation. The fact that, as developers, we can juggle strings as naturally as numeric types is a huge deal. In this article, I’ll show you why this is important, and how it can be not only a time saver, but a bug saver too.

Back In My Day…

We “old timers” love to start our stories with that phrase. In this case, “back in my day” is not that long ago, when we were (and in most cases still are) using Objective-C as our primary development language. Objective-C inherits a lot from C, and one of the things it tried to improve on was strings. If you’ve ever worked with C, you’ll remember the obtuse functions we had to use just to concatenate two strings:

char s[255];
strcpy(s, "This is a string. ");
char *p = strcat(s, "And this is another!");

No kidding. If this is foreign to you, then be thankful. Such code is fraught with potential security issues, not to mention the real possibility of outright crashes. Granted, functions like strncat and strncpy came along to limit the security risk by enforcing a maximum number of bytes to copy, but they remained unwieldy to use and slightly unnatural in appearance.
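To make that concrete, here is a minimal sketch of the bounds-checked idiom those functions impose; the helper name and buffer size are mine, purely for illustration:

```c
#include <stdio.h>
#include <string.h>

/* A sketch of the bounds-checked idiom: copy, then append, never
   writing past the destination buffer. The helper name is ours. */
void build_greeting(char *dst, size_t cap) {
    strncpy(dst, "This is a string. ", cap - 1);
    dst[cap - 1] = '\0';  /* strncpy may leave the buffer unterminated */
    strncat(dst, "And this is another!", cap - strlen(dst) - 1);
}
```

Note all the manual bookkeeping: the caller must track capacity, and strncpy’s refusal to guarantee a terminating NUL is a classic source of bugs.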

Objective-C did its best to make things better. Here’s the C code above, rewritten in the simplest Objective-C style:

NSString *s = @"This is a string. ";
NSString *p = [NSString stringWithFormat:@"%@And this is another!", s];

… still not elegant, but fairly descriptive, and safe from the dangers of accessing memory in a more direct fashion in C. And by God, we loved it. And still do.


Swift’s Response

If you think the above code is easy to understand, then you’ve probably seen it one thousand times over. If you think it’s way too complicated, then you’re probably right in line with the new crop of developers who have cut their teeth on scripting languages like Ruby and Python. No matter which camp you’re in, you have to appreciate the simplicity of this:

let s = "This is a string. "
let p = s + "And this is another!"

Come on. This is just too easy, isn’t it? That’s all it takes to add two strings in Swift. I have to tell you, when you come from the old way, it almost feels like cheating. Deep down, you don’t trust the compiler and think all of this pretty, sugary stuff is eating up tons of CPU cycles. Where are the good old “bare metal” programming days going to?

You can relax, because Apple’s compiler wizards have worked hard to optimize these statements under the covers. Just bask in the enjoyment of all that typing you’re going to save.

Format Specifiers… Ugh!

Another win we get in Swift is with the printing of constants and variables. How many times in your code do you do this? A lot. And it’s a lot of typing… and tedious to boot. Here’s a simple contrived example:

int x = 42;
char *s = "life";
printf("The answer to %s is %d!\n", s, x);

Now multiply that by a hundred lines!

This has been the bane of C and Objective-C for a long time. Keeping track of what format specifiers to use for what types is a pain, but what’s even more of a pain is if you have a ton of them in one printf statement, and forget a couple, or worse, transpose the variables so that the types don’t match. Compilers got smarter over time and warned you of such transgressions, but what a time killer nonetheless.
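The bookkeeping in question looks like this in practice; a small sketch using snprintf, where each specifier must line up with its argument (the helper function is mine, for illustration):

```c
#include <stdio.h>
#include <string.h>

/* Each %-specifier must match its argument in both type and order;
   swap s and x below and the types no longer line up. Modern
   compilers catch this via format-string checking, but only because
   printf-family functions get special treatment. */
void format_answer(char *out, size_t cap) {
    int x = 42;
    const char *s = "life";
    snprintf(out, cap, "The answer to %s is %d!", s, x);
}
```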

C++ tried to get it right, and did a decent job, but it still didn’t have that “natural” syntax flavor:

cout << "The answer to " << s << " is " << x << "!\n";

Nice try, C++, but Swift cuts through this problem like a hot knife through butter:

let x = 42
let s = "life"
println("The answer to \(s) is \(x)!")

You may have to get used to typing a backslash followed by open and closing parentheses around your variable names, but this is clear and concise and has minimal chance of collision with your string characters.

Less is More

Strings are just one of many things that Apple has really made fun and easy to use in Swift. The theme of the language certainly seems to be “less is more.” Less typing and gaining more functionality. After all, computer languages are about us humans expressing ourselves more naturally in our work so that things can get done quicker.

If you haven’t started learning Swift yet, you should. If you have started, hold on to your bucket seats, because it’s going to be a wild but great ride!

While It’s Compiling: Skills Matter Interviews Boisy Pitre

While It’s Compiling is a continuing series of interviews with experts across a range of bleeding-edge technologies and practices, exclusive to Skills Matter. Be sure to subscribe to this blog for future interviews, or follow us on Twitter.

Find out who we’ll be interviewing next, and get a chance to put your questions forward with the hashtag #whileitscompiling.

Boisy Pitre at iOSCon 2014

We had a fantastic start to the year when we hosted the first ever iOSCon here at our headquarters in London, bringing together some of the world’s leading iOS experts including Boisy Pitre, Affectiva’s Mobile Visionary and lead iOS developer.

Boisy’s work has led to the creation of the first mobile SDK for delivering emotion sensing to mobile devices, built for the leading emotion technology company and spin-off of the MIT Media Lab. We were delighted to get the opportunity to interview Boisy while he was here.

You can find the link to his talk from iOSCon at the bottom of this post, and all the talks here.

Hi Boisy, thanks for joining us for this year’s iOSCon. Can you tell us a little about yourself and the work you’ve been doing with Affectiva?

Sure. Currently I’m with Affectiva, an MIT Media Lab start-up based in Boston. We have an interesting technology which analyzes people’s facial expressions to determine their emotional state. The technology was developed based on research that one of the co-founders, Rana El Kaliouby, had pioneered in the affective computing field. The applicability of that technology was originally targeted towards the market research industry to help measure consumers’ emotional connections to brands and media.

About a year ago, Affectiva decided to expand their technology to mobile devices and tap into other industries beyond their current market – including gaming, healthcare, education and others. So, I came on board to lead this mobile initiative, and worked with some brilliant engineers to shrink the existing technology, which had a significant server component, down to the iOS platform. The Affdex Mobile SDK is the outcome of that effort. It does all the processing on the device and reports emotional data back to the app on a frame-by-frame basis – eliminating the need to connect to a server.

So it’s built on a lot of research – was iOS the natural progression and the natural platform to go to? What sets it apart from other platforms?

iOS was the initially targeted platform. In hindsight, I believe this was the right choice, as targeting iOS devices has been a bit easier due to the commonality of hardware and software; and it allowed us to get the SDK to market pretty quickly. Although we initially focused on iOS, I knew we were going to eventually develop an Android piece as well; which we’re almost done with, in fact. For the Android SDK, we hired team members who love and play in that sandbox. My philosophy is that for a company to be successful in a mobile strategy they should have experts that specialize in a particular platform.

In terms of applications going beyond the obvious marketing and advertising aspects, what are the real-world applications that exist now? Is there anything particularly interesting or exciting that Affectiva is working on right now?

As far as I’m concerned, it’s the wild west for apps that want to take advantage of emotion technology. It reminds me of the introduction of the iPhone in 2007. The idea of touching your device was not new, but Apple started democratizing it with the iPhone. That was the first real breakthrough moment in mobile. The second important breakthrough moment in mobile was the introduction of voice as input – again, Apple democratized how we interact with our phones and our devices when they offered Siri. I see emotional analysis having that same potential in mobile. Like voice, it gives us a way of controlling the device, and a way for the device to understand you better and offer you more choices.

So what type of apps can take advantage of this technology? Well obviously the low-hanging fruit would be games, where you’re interacting with the game – you want to have your emotions maybe control the game or have the game respond to your emotions in some way to adjust the level of intensity of play.

Health is another big opportunity where I think this technology can bring value. Take emotional health and well-being, for instance… there’s so much research pointing to the fact that our emotions have an impact on our health for better or for worse. So there’s a whole avenue of possibilities in that regard.

Then there’s the fun stuff. Imagine an app that analyzes the photos on your device to determine their emotional content and get an overall feel for your pictures. Or an app which changes music or colors on the screen as it watches your facial expressions. Approaches like that can certainly lead to some interesting applications.

Of course, Affectiva is pursuing app ideas based on this technology, but I cannot comment on them at this moment.

You mentioned that 2007 was the introduction of the first device, and how things have moved on, especially with Siri. Do you think that for someone such as myself, as a user of this device, things are going to continue coming out in stages, or is there anything around the corner that’s going to be as big and as ground-breaking as touch or voice? Is there anything that’s going to jump out?

Emotion recognition technology has the potential to be that huge leap which brings in a completely new way of interacting with our devices, whether we’re sensing emotions using the camera or through some other sense or mechanism. Having technology understand us better and gather deeper insights into our own emotions, through analysis at specific points in the day as we’re using apps, is a significant breakthrough in computer-human interaction.

And it’s a different level of interaction that liberates us. Just like touch liberated us from typing on tiny keyboards, and added a new paradigm of full natural touch with swiping. Emotional expressions in our face are instinctive; and they too can be a form of input and control, but they can also be a great form of feedback to us. I really think this will raise awareness of how we see ourselves in the world, as well as how we interact with others.

We hosted Droidcon last year, with devices such as the interactive mirror that could recognise your emotions in the morning. There is a huge interest currently in the Internet of Things, in connected devices and so on. Going beyond the iPhone or Android devices themselves, do you do much in terms of reaching out into connected devices?

Certainly. This technology can reach beyond just the device in your hand. For example, the automotive industry has expressed interest in our technology. That industry may be easier to break through on the Android side of things than it is with iOS, as iOS is a lot more compartmentalised and controlled by Apple. But certainly that’s one industry which could benefit from emotional analysis – just imagine driving along and your car wants to know if you’re falling asleep or paying attention or distracted; it’s looking at locations for safety, and again, health and well-being.

You touched on the fact that Apple and iOS are compartmentalised and controlled a lot more than Android. Do you think that’s a drawback? Is this holding developers back on iOS, or does it create an environment to focus ideas and energies?

Keep in mind that I’m coming from the Apple perspective as that’s the sandbox I play in. I completely understand and buy into Apple’s reasoning for why they do things. I’m also looking at this from a developer point of view.

We all know that Android exists on many, many mobile devices. It can be ported, unlike iOS, to phones, tablets, and other devices. The trade-off for such sheer ease of portability is the “fragmentation issue” which leads to complexity in development. At some point it becomes too massive for developers to support each of those devices. They must pick and choose their device support carefully.

I believe this is getting better as Android matures, but compared to Apple’s unified, streamlined hardware upgrading approach, it’s still a mess.

Apple’s approach, while certainly much more restrictive, brings a sense of order to the device chaos that permeates Android. If anything, I would argue that developers fare better in the Apple ecosystem because of these controls. But that is my opinion, of course.

Finally, in terms of Affectiva and how you work on a day-to-day basis – what’s the structure there? How does the team work?

We have two engineering teams, one dedicated to Android and the other to iOS. Each has an engineering lead. Both engineering teams interface with the science team, which concentrates specifically on the core technology of emotional classification. As science improvements are made, they are provided to engineering, which integrates the changes and improvements into the SDK code base. Agile is the foundation development methodology we use to organize and account for our work across the teams. This constant, connected cycle allows us to quickly iterate so that we can test, examine performance, etc.

Watch Boisy’s talk from iOSCon 2014

Boisy Pitre

What if your iPad or iPhone could detect your emotional state and respond in a way that enhances your day? What if an app could deliver soothing content when you’re feeling upset, or play your favourite song when you’re feeling happy? Find out how you could achieve this in Boisy’s talk!

You can see the rest of the Skillscasts from iOSCon 2014 here.