<?xml version="1.0" encoding="utf-8"?><feed xmlns="http://www.w3.org/2005/Atom" ><generator uri="http://jekyllrb.com" version="3.4.3">Jekyll</generator><link href="https://blog.mosheberman.com/feed.xml" rel="self" type="application/atom+xml" /><link href="https://blog.mosheberman.com/" rel="alternate" type="text/html" /><updated>2017-09-18T19:44:33-04:00</updated><id>https://blog.mosheberman.com/</id><title type="html">Blog.MosheBerman</title><subtitle>Expert advice and opinions on iOS, watchOS, and tvOS development.</subtitle><entry><title type="html">Your App’s Layout and Localization APIs</title><link href="https://blog.mosheberman.com/app-layout-and-localization/" rel="alternate" type="text/html" title="Your App's Layout and Localization APIs" /><published>2017-08-23T14:45:00-04:00</published><updated>2017-08-23T14:45:00-04:00</updated><id>https://blog.mosheberman.com/notes-ios-localization</id><content type="html" xml:base="https://blog.mosheberman.com/app-layout-and-localization/">&lt;style type=&quot;text/css&quot;&gt;

table, tr, td  
{
	border-collapse: collapse;
	border: 1px solid #dedede;
}

tr:nth-child(1) td
{
	padding: 5px;
}

tr:nth-child(even) {
    background-color: #efefef;
}

tr:not(:first-child) td
{
	padding: 5px;
}

tr:not(:first-child) td, tr:not(:first-child) td a code
{
	font-size: 12px;
}

tr:not(:first-child) td:nth-child(3)
{
	font-style: italic;

}

&lt;/style&gt;

&lt;p&gt;Let’s discuss five techniques for dealing with right-to-left layout across Apple platforms, using 11 different APIs. Be sure to check out the handy dandy chart at the end, and Apple’s &lt;a href=&quot;https://developer.apple.com/internationalization/&quot;&gt;Building Apps for the World&lt;/a&gt; webpage. Let’s get started, in order of oldest API to newest.&lt;/p&gt;

&lt;h3 id=&quot;inference-by-writing-direction-using-nslocale&quot;&gt;Inference By Writing Direction using NSLocale&lt;/h3&gt;
&lt;p&gt;The first technique is inferring the device’s direction, using two class methods on &lt;a href=&quot;https://developer.apple.com/documentation/foundation/nslocale&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;NSLocale&lt;/code&gt;&lt;/a&gt;. &lt;a href=&quot;https://developer.apple.com/documentation/foundation/nslocale/1417681-characterdirectionforlanguage&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;characterDirection(forLanguage:)&lt;/code&gt;&lt;/a&gt; and &lt;a href=&quot;https://developer.apple.com/documentation/foundation/nslocale/1414007-linedirectionforlanguage&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;lineDirection(forLanguage:)&lt;/code&gt;&lt;/a&gt; take an ISO language code, such as &lt;code class=&quot;highlighter-rouge&quot;&gt;en-US&lt;/code&gt; or &lt;code class=&quot;highlighter-rouge&quot;&gt;he-IL&lt;/code&gt; and return a direction. You can get a valid value to pass into these methods from &lt;code class=&quot;highlighter-rouge&quot;&gt;NSLocale&lt;/code&gt;’s &lt;a href=&quot;https://developer.apple.com/documentation/foundation/nslocale/1643026-languagecode&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;languageCode&lt;/code&gt;&lt;/a&gt; property.&lt;/p&gt;

&lt;p&gt;These class methods are supported as far back as iOS 4.0 and are safe to use in app extensions. They also exist on other Apple platforms: macOS 10.6, tvOS 9.0, and watchOS 2.0. I wouldn’t base the majority of my app’s layout on them, but they’re certainly useful to have in your toolkit.&lt;/p&gt;

&lt;p&gt;It’s also notable that, unlike the other APIs mentioned here, these methods can return top-to-bottom and bottom-to-top, in addition to the expected left-to-right and right-to-left values. English, for example, has a character direction of left-to-right and a line direction of top-to-bottom.&lt;/p&gt;

&lt;p&gt;In my &lt;a href=&quot;/assets/localization-api/rtl-playground.zip&quot;&gt;experiments&lt;/a&gt; with Xcode Playgrounds, I did not find any language on macOS that reported a line direction other than top-to-bottom.&lt;/p&gt;
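To make this concrete, here’s a minimal sketch of both direction checks, assuming Swift 4-era Foundation (the bridged `Locale` spellings of the `NSLocale` methods above):

```swift
import Foundation

// languageCode can be nil for some locales, so fall back to "en".
let language = Locale.current.languageCode ?? "en"

let characterDirection = Locale.characterDirection(forLanguage: language)
let lineDirection = Locale.lineDirection(forLanguage: language)

if characterDirection == .rightToLeft {
    // Mirror any manually computed frames here.
    print("\(language) lays out characters right to left.")
}

// Hebrew flows right to left, with lines stacking top to bottom.
assert(Locale.characterDirection(forLanguage: "he") == .rightToLeft)
assert(Locale.lineDirection(forLanguage: "he") == .topToBottom)
```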

&lt;h3 id=&quot;explicit-layout-direction-with-uiapplication--nsapplication&quot;&gt;Explicit Layout Direction with UIApplication &amp;amp; NSApplication&lt;/h3&gt;
&lt;p&gt;The second technique is to reference &lt;a href=&quot;https://developer.apple.com/library/ios/DOCUMENTATION/UIKit/Reference/UIApplication_Class/Reference/Reference.html&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;UIApplication&lt;/code&gt;&lt;/a&gt;’s &lt;a href=&quot;https://developer.apple.com/documentation/uikit/uiapplication/1623025-userinterfacelayoutdirection&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;userInterfaceLayoutDirection&lt;/code&gt;&lt;/a&gt; property. This is probably the earliest API intended explicitly for view layout. Available since iOS 5.0, this property reflects the general layout direction of the app. &lt;a href=&quot;https://developer.apple.com/documentation/appkit/nsapplication&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;NSApplication&lt;/code&gt;&lt;/a&gt; has this property, too, as of macOS 10.6.&lt;/p&gt;

&lt;p&gt;On both platforms, &lt;code class=&quot;highlighter-rouge&quot;&gt;userInterfaceLayoutDirection&lt;/code&gt; is either &lt;code class=&quot;highlighter-rouge&quot;&gt;.rightToLeft&lt;/code&gt; or &lt;code class=&quot;highlighter-rouge&quot;&gt;.leftToRight&lt;/code&gt;, based on the user’s systemwide preferences. The catch here is that since this property is accessible on a &lt;code class=&quot;highlighter-rouge&quot;&gt;sharedApplication&lt;/code&gt; instance, it won’t work in app extensions.&lt;/p&gt;
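In code, that check is a one-liner. The sketch below assumes a full app target (not an extension), and the image names are made up for illustration:

```swift
import UIKit

// UIApplication.shared is unavailable in app extensions, so this
// helper only belongs in a full app target.
func appIsRightToLeft() -> Bool {
    return UIApplication.shared.userInterfaceLayoutDirection == .rightToLeft
}

// Example: pick an image variant for manually mirrored artwork.
let arrowImageName = appIsRightToLeft() ? "arrow-rtl" : "arrow-ltr"
```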

&lt;h3 id=&quot;another-writing-direction-trick-with-nsparagraphstyle&quot;&gt;Another Writing Direction Trick, with NSParagraphStyle&lt;/h3&gt;
&lt;p&gt;&lt;a href=&quot;https://developer.apple.com/documentation/uikit/nsparagraphstyle&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;NSParagraphStyle&lt;/code&gt;&lt;/a&gt; gives us another way to check the default text layout direction. &lt;a href=&quot;https://developer.apple.com/documentation/uikit/nsparagraphstyle/1535327-defaultwritingdirection&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;defaultWritingDirection(forLanguage:)&lt;/code&gt;&lt;/a&gt; takes an ISO language code, and returns one of three &lt;a href=&quot;https://developer.apple.com/documentation/uikit/nswritingdirection&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;NSWritingDirection&lt;/code&gt;&lt;/a&gt; values: &lt;code class=&quot;highlighter-rouge&quot;&gt;.natural&lt;/code&gt;, &lt;code class=&quot;highlighter-rouge&quot;&gt;.leftToRight&lt;/code&gt;, or &lt;code class=&quot;highlighter-rouge&quot;&gt;.rightToLeft&lt;/code&gt;. This was added to iOS 6 with a slew of typesetting improvements, and has been around on macOS since 10.2. You can override the writing direction of an &lt;a href=&quot;https://developer.apple.com/documentation/uikit/nsmutableparagraphstyle&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;NSMutableParagraphStyle&lt;/code&gt;&lt;/a&gt; instance by setting its &lt;a href=&quot;https://developer.apple.com/documentation/uikit/nsparagraphstyle/1527354-basewritingdirection&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;baseWritingDirection&lt;/code&gt;&lt;/a&gt; yourself.&lt;/p&gt;
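A short sketch of both sides of this API, checking the default for a language and then overriding it on a mutable style:

```swift
import UIKit

// .rightToLeft for Hebrew; .leftToRight for English.
let direction = NSParagraphStyle.defaultWritingDirection(forLanguage: "he")

// Override the base writing direction on a mutable style:
let style = NSMutableParagraphStyle()
style.baseWritingDirection = .rightToLeft

// Apply it to an attributed string, e.g. for a UILabel.
let text = NSAttributedString(
    string: "שלום",
    attributes: [.paragraphStyle: style]
)
```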

&lt;h3 id=&quot;uiview--nsview-semantic-content-attributes-watchos&quot;&gt;UIView, NSView &amp;amp; WKInterfaceObject Semantic Content Attributes&lt;/h3&gt;
&lt;p&gt;Next up is &lt;a href=&quot;https://developer.apple.com/library/ios/documentation/UIKit/Reference/UIView_Class/&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;UIView&lt;/code&gt;&lt;/a&gt;’s &lt;a href=&quot;https://developer.apple.com/documentation/uikit/uiview/1622480-userinterfacelayoutdirection&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;userInterfaceLayoutDirection(for:)&lt;/code&gt;&lt;/a&gt; method. Added in iOS 9, this class method lets you check the layout direction that a given semantic content attribute resolves to. You can set a view’s &lt;a href=&quot;https://developer.apple.com/documentation/uikit/uiview/1622461-semanticcontentattribute&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;semanticContentAttribute&lt;/code&gt;&lt;/a&gt; manually, using values such as &lt;code class=&quot;highlighter-rouge&quot;&gt;.forceLeftToRight&lt;/code&gt; and &lt;code class=&quot;highlighter-rouge&quot;&gt;.forceRightToLeft&lt;/code&gt;, to make specific views defy the rest of your app’s layout. The documentation says you should use this API for media playback controls. Another use case might be providing a manual language override for your app. (Really, though, just localize properly.)&lt;/p&gt;

&lt;p&gt;There is a similar property on &lt;a href=&quot;https://developer.apple.com/library/mac/documentation/Cocoa/Reference/ApplicationKit/Classes/NSView_Class/&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;NSView&lt;/code&gt;&lt;/a&gt; called &lt;a href=&quot;https://developer.apple.com/documentation/appkit/nsview/1483254-userinterfacelayoutdirection&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;userInterfaceLayoutDirection&lt;/code&gt;&lt;/a&gt;. You can set it to an &lt;a href=&quot;https://developer.apple.com/documentation/appkit/nsuserinterfacelayoutdirection&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;NSUserInterfaceLayoutDirection&lt;/code&gt;&lt;/a&gt; value: &lt;code class=&quot;highlighter-rouge&quot;&gt;.leftToRight&lt;/code&gt; or &lt;code class=&quot;highlighter-rouge&quot;&gt;.rightToLeft&lt;/code&gt;. On watchOS 2.1 and later, &lt;a href=&quot;https://developer.apple.com/documentation/watchkit/wkinterfaceobject&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;WKInterfaceObject&lt;/code&gt;&lt;/a&gt; has a &lt;a href=&quot;https://developer.apple.com/documentation/watchkit/wkinterfaceobject/1628136-setsemanticcontentattribute&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;setSemanticContentAttribute(_:)&lt;/code&gt;&lt;/a&gt; method to set (but not get) the semantic content attribute.&lt;/p&gt;
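Here’s a sketch of forcing a direction on a single control and reading back the resolved direction on iOS:

```swift
import UIKit

// Per the documentation, playback controls should not flip in
// right-to-left layouts, so force them left to right:
let scrubber = UISlider()
scrubber.semanticContentAttribute = .forceLeftToRight

// Resolve the direction a given attribute maps to (iOS 9 and up):
let resolved = UIView.userInterfaceLayoutDirection(
    for: scrubber.semanticContentAttribute
)

if resolved == .rightToLeft {
    // Flip any direction-sensitive drawing here.
}
```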

&lt;h3 id=&quot;the-modern-approach-uitraitcollection&quot;&gt;The Modern Approach: UITraitCollection&lt;/h3&gt;
&lt;p&gt;Finally, there’s &lt;a href=&quot;https://developer.apple.com/documentation/uikit/uitraitcollection&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;UITraitCollection&lt;/code&gt;&lt;/a&gt;, which was added in iOS 8 to help deal with the wide variety of iOS device screen sizes, resolutions, and orientations. Trait collections gained a property in iOS 10 called &lt;a href=&quot;https://developer.apple.com/documentation/uikit/uitraitcollection/1648355-layoutdirection&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;layoutDirection&lt;/code&gt;&lt;/a&gt;. Since every &lt;code class=&quot;highlighter-rouge&quot;&gt;UIView&lt;/code&gt; conforms to &lt;a href=&quot;https://developer.apple.com/documentation/uikit/uitraitenvironment&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;UITraitEnvironment&lt;/code&gt;&lt;/a&gt;, you can access this property on any &lt;code class=&quot;highlighter-rouge&quot;&gt;UIView&lt;/code&gt; on iOS 10 or higher.&lt;/p&gt;
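Since the trait collection is available on any view, a direction-sensitive view can consult its own traits. A minimal sketch:

```swift
import UIKit

class DirectionalArrowView: UIView {
    override func traitCollectionDidChange(_ previous: UITraitCollection?) {
        super.traitCollectionDidChange(previous)
        // layoutDirection requires iOS 10 or later.
        if traitCollection.layoutDirection == .rightToLeft {
            // Mirror direction-sensitive artwork.
            transform = CGAffineTransform(scaleX: -1, y: 1)
        } else {
            transform = .identity
        }
    }
}
```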

&lt;p&gt;So there you have it. Five techniques to get and set layout direction on iOS, macOS, tvOS, and watchOS. Oh, here’s the chart:&lt;/p&gt;

&lt;table&gt;
  &lt;tbody&gt;
    &lt;tr&gt;
      &lt;td&gt;Class&lt;/td&gt;
      &lt;td&gt;Method&lt;/td&gt;
      &lt;td&gt;Version Added&lt;/td&gt;
    &lt;/tr&gt;
    &lt;tr&gt;
      &lt;td&gt;&lt;a href=&quot;https://developer.apple.com/documentation/foundation/nslocale&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;NSLocale&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
      &lt;td&gt;&lt;a href=&quot;https://developer.apple.com/documentation/foundation/nslocale/1417681-characterdirectionforlanguage&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;characterDirection(forLanguage:)&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
      &lt;td&gt;iOS 4, macOS 10.6, tvOS 9, watchOS 2&lt;/td&gt;
    &lt;/tr&gt;
    &lt;tr&gt;
      &lt;td&gt;&lt;a href=&quot;https://developer.apple.com/documentation/foundation/nslocale&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;NSLocale&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
      &lt;td&gt;&lt;a href=&quot;https://developer.apple.com/documentation/foundation/nslocale/1414007-linedirectionforlanguage&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;lineDirection(forLanguage:)&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
      &lt;td&gt;iOS 4, macOS 10.6, tvOS 9, watchOS 2&lt;/td&gt;
    &lt;/tr&gt;
    &lt;tr&gt;
      &lt;td&gt;&lt;a href=&quot;https://developer.apple.com/library/ios/DOCUMENTATION/UIKit/Reference/UIApplication_Class/Reference/Reference.html&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;UIApplication&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
      &lt;td&gt;&lt;a href=&quot;https://developer.apple.com/documentation/uikit/uiapplication/1623025-userinterfacelayoutdirection&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;userInterfaceLayoutDirection&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
      &lt;td&gt;iOS 5, tvOS 9&lt;/td&gt;
    &lt;/tr&gt;
    &lt;tr&gt;
      &lt;td&gt;&lt;a href=&quot;https://developer.apple.com/documentation/appkit/nsapplication&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;NSApplication&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
      &lt;td&gt;&lt;a href=&quot;https://developer.apple.com/documentation/appkit/nsapplication/1428556-userinterfacelayoutdirection&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;userInterfaceLayoutDirection&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
      &lt;td&gt;macOS 10.6&lt;/td&gt;
    &lt;/tr&gt;
    &lt;tr&gt;
      &lt;td&gt;&lt;a href=&quot;https://developer.apple.com/documentation/uikit/nsparagraphstyle&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;NSParagraphStyle&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
      &lt;td&gt;&lt;a href=&quot;https://developer.apple.com/documentation/uikit/nsparagraphstyle/1535327-defaultwritingdirection&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;defaultWritingDirection(forLanguage:)&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
      &lt;td&gt;iOS 6, macOS 10.2, tvOS 9, watchOS 2&lt;/td&gt;
    &lt;/tr&gt;
    &lt;tr&gt;
      &lt;td&gt;&lt;a href=&quot;https://developer.apple.com/documentation/uikit/nsparagraphstyle&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;NSParagraphStyle&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
      &lt;td&gt;&lt;a href=&quot;https://developer.apple.com/documentation/uikit/nsparagraphstyle/1527354-basewritingdirection&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;baseWritingDirection&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
      &lt;td&gt;iOS 6, macOS 10.2, tvOS 9, watchOS 2&lt;/td&gt;
    &lt;/tr&gt;
    &lt;tr&gt;
      &lt;td&gt;&lt;a href=&quot;https://developer.apple.com/library/ios/documentation/UIKit/Reference/UIView_Class/&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;UIView&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
      &lt;td&gt;&lt;a href=&quot;https://developer.apple.com/documentation/uikit/uiview/1622480-userinterfacelayoutdirection&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;userInterfaceLayoutDirection(for:)&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
      &lt;td&gt;iOS 9, tvOS 9&lt;/td&gt;
    &lt;/tr&gt;
    &lt;tr&gt;
      &lt;td&gt;&lt;a href=&quot;https://developer.apple.com/library/ios/documentation/UIKit/Reference/UIView_Class/&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;UIView&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
      &lt;td&gt;&lt;a href=&quot;https://developer.apple.com/documentation/uikit/uiview/1622461-semanticcontentattribute&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;semanticContentAttribute&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
      &lt;td&gt;iOS 9, tvOS 9&lt;/td&gt;
    &lt;/tr&gt;
    &lt;tr&gt;
      &lt;td&gt;&lt;a href=&quot;https://developer.apple.com/library/mac/documentation/Cocoa/Reference/ApplicationKit/Classes/NSView_Class/&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;NSView&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
      &lt;td&gt;&lt;a href=&quot;https://developer.apple.com/documentation/appkit/nsview/1483254-userinterfacelayoutdirection&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;userInterfaceLayoutDirection&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
      &lt;td&gt;macOS 10.6&lt;/td&gt;
    &lt;/tr&gt;
    &lt;tr&gt;
      &lt;td&gt;&lt;a href=&quot;https://developer.apple.com/documentation/watchkit/wkinterfaceobject&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;WKInterfaceObject&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
      &lt;td&gt;&lt;a href=&quot;https://developer.apple.com/documentation/watchkit/wkinterfaceobject/1628136-setsemanticcontentattribute&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;setSemanticContentAttribute(_:)&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
      &lt;td&gt;iOS 8.2, watchOS 2.1&lt;/td&gt;
    &lt;/tr&gt;
    &lt;tr&gt;
      &lt;td&gt;&lt;a href=&quot;https://developer.apple.com/documentation/uikit/uitraitcollection&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;UITraitCollection&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
      &lt;td&gt;&lt;a href=&quot;https://developer.apple.com/documentation/uikit/uitraitcollection/1648355-layoutdirection&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;layoutDirection&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
      &lt;td&gt;iOS 10, tvOS 10&lt;/td&gt;
    &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;

&lt;p&gt;In case you missed the link above, here’s &lt;a href=&quot;/assets/localization-api/rtl-playground.zip&quot;&gt;a link to download the bonus playground&lt;/a&gt;.&lt;/p&gt;</content><author><name></name></author><summary type="html"></summary></entry><entry><title type="html">Learning Machine Learning with CoreML</title><link href="https://blog.mosheberman.com/learning-machine-learning-with-coreml/" rel="alternate" type="text/html" title="Learning Machine Learning with CoreML" /><published>2017-08-19T22:00:00-04:00</published><updated>2017-08-19T22:00:00-04:00</updated><id>https://blog.mosheberman.com/learning-machine-learning</id><content type="html" xml:base="https://blog.mosheberman.com/learning-machine-learning-with-coreml/">&lt;h1 id=&quot;coreml&quot;&gt;CoreML&lt;/h1&gt;

&lt;p&gt;This post is my notes collected while watching Apple’s WWDC 2017 &lt;a href=&quot;https://developer.apple.com/videos/play/wwdc2017/703&quot;&gt;Session 703&lt;/a&gt;, “Introducing Core ML” and &lt;a href=&quot;https://developer.apple.com/videos/play/wwdc2017/710&quot;&gt;Session 710&lt;/a&gt;, “Core ML in Depth.”&lt;/p&gt;

&lt;h1 id=&quot;introducing-coreml&quot;&gt;Introducing CoreML&lt;/h1&gt;

&lt;h2 id=&quot;apple-uses&quot;&gt;Apple Uses&lt;/h2&gt;
&lt;ul&gt;
  &lt;li&gt;Photos: People &amp;amp; Scene Recognition&lt;/li&gt;
  &lt;li&gt;Keyboard: Next word prediction, smart responses&lt;/li&gt;
  &lt;li&gt;Watch: Smart responses &amp;amp; handwriting recognition&lt;/li&gt;
&lt;/ul&gt;

&lt;h2 id=&quot;examples&quot;&gt;Examples:&lt;/h2&gt;

&lt;ul&gt;
  &lt;li&gt;Real Time Image recognition&lt;/li&gt;
  &lt;li&gt;Text prediction&lt;/li&gt;
  &lt;li&gt;Entity recognition&lt;/li&gt;
  &lt;li&gt;Handwriting recognition&lt;/li&gt;
  &lt;li&gt;Style transfer&lt;/li&gt;
  &lt;li&gt;Sentiment analysis&lt;/li&gt;
  &lt;li&gt;Search ranking&lt;/li&gt;
  &lt;li&gt;Machine translation&lt;/li&gt;
  &lt;li&gt;Image captioning&lt;/li&gt;
  &lt;li&gt;Personalization&lt;/li&gt;
  &lt;li&gt;Face detection&lt;/li&gt;
  &lt;li&gt;Emotion detection&lt;/li&gt;
  &lt;li&gt;Speaker identification&lt;/li&gt;
  &lt;li&gt;Music tagging&lt;/li&gt;
  &lt;li&gt;Text summarization&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Whoa, that’s a lot of examples.&lt;/p&gt;

&lt;h2 id=&quot;example&quot;&gt;Example&lt;/h2&gt;

&lt;p&gt;Recognizing a rose:&lt;/p&gt;

&lt;ol&gt;
  &lt;li&gt;Start by color&lt;/li&gt;
  &lt;li&gt;Try shape&lt;/li&gt;
  &lt;li&gt;… It gets complicated fast&lt;/li&gt;
&lt;/ol&gt;

&lt;blockquote&gt;
  &lt;p&gt;Rather than describing what a rose looks like programmatically, we will describe a rose empirically.&lt;/p&gt;

  &lt;p&gt;– Gaurav Kapoor&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2 id=&quot;two-steps-to-machine-learning&quot;&gt;Two Steps to Machine Learning&lt;/h2&gt;
&lt;ol&gt;
  &lt;li&gt;Training&lt;/li&gt;
  &lt;li&gt;Inference&lt;/li&gt;
&lt;/ol&gt;

&lt;h3 id=&quot;training-a-model&quot;&gt;Training a Model&lt;/h3&gt;

&lt;ol&gt;
  &lt;li&gt;Collect sample data (“roses, sunflowers, lilies”)&lt;/li&gt;
  &lt;li&gt;Pass through a learning algorithm&lt;/li&gt;
  &lt;li&gt;Generate a model&lt;/li&gt;
&lt;/ol&gt;

&lt;h3 id=&quot;inference&quot;&gt;Inference&lt;/h3&gt;
&lt;ol&gt;
  &lt;li&gt;Pass an image into the model&lt;/li&gt;
  &lt;li&gt;Get back a result and confidence level.&lt;/li&gt;
&lt;/ol&gt;

&lt;h2 id=&quot;challenges&quot;&gt;Challenges:&lt;/h2&gt;
&lt;ul&gt;
  &lt;li&gt;Prove correctness&lt;/li&gt;
  &lt;li&gt;Performance&lt;/li&gt;
  &lt;li&gt;Energy Efficiency&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;There are three frameworks here: Vision and NLP sit on top of CoreML.&lt;/p&gt;

&lt;p&gt;CoreML is built on top of Accelerate and Metal Performance Shaders (MPS).&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;Domain agnostic&lt;/li&gt;
  &lt;li&gt;Inputs: Images, text, dictionaries, raw numbers.&lt;/li&gt;
  &lt;li&gt;Accelerate is good for math functionality.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3 id=&quot;advantages-to-running-locally&quot;&gt;Advantages to Running Locally&lt;/h3&gt;
&lt;ul&gt;
  &lt;li&gt;Privacy&lt;/li&gt;
  &lt;li&gt;Data Cost&lt;/li&gt;
  &lt;li&gt;No server cost&lt;/li&gt;
  &lt;li&gt;Always available&lt;/li&gt;
&lt;/ul&gt;

&lt;h3 id=&quot;real-time-image-recognition&quot;&gt;Real Time Image Recognition&lt;/h3&gt;
&lt;ul&gt;
  &lt;li&gt;No latency&lt;/li&gt;
&lt;/ul&gt;

&lt;h2 id=&quot;overview&quot;&gt;Overview&lt;/h2&gt;
&lt;ul&gt;
  &lt;li&gt;Xcode integrations&lt;/li&gt;
&lt;/ul&gt;

&lt;h2 id=&quot;models&quot;&gt;Models&lt;/h2&gt;
&lt;p&gt;A model is a function that “happens to be learned from data.” Each takes an input and gives an output.&lt;/p&gt;

&lt;h3 id=&quot;neural-network-types&quot;&gt;Neural Network Types:&lt;/h3&gt;
&lt;ul&gt;
  &lt;li&gt;Feed Forward Neural Networks (image/video)&lt;/li&gt;
  &lt;li&gt;Convolutional Neural Networks&lt;/li&gt;
  &lt;li&gt;Recurrent Neural Networks (text based applications)&lt;/li&gt;
  &lt;li&gt;Tree Ensembles&lt;/li&gt;
  &lt;li&gt;Support Vector Machines&lt;/li&gt;
  &lt;li&gt;Generalized Linear Models&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Focus on the use-case and let CoreML handle the details.
Models are single documents.&lt;/p&gt;
&lt;ul&gt;
  &lt;li&gt;Inputs, types, outputs&lt;/li&gt;
  &lt;li&gt;Structure of neural network&lt;/li&gt;
  &lt;li&gt;training parameters&lt;/li&gt;
&lt;/ul&gt;

&lt;h3 id=&quot;where-do-models-come-from&quot;&gt;Where do models come from?&lt;/h3&gt;
&lt;ul&gt;
  &lt;li&gt;Developer.apple.com has some ready-to-use models.&lt;/li&gt;
  &lt;li&gt;The machine learning community:
    &lt;ul&gt;
      &lt;li&gt;Caffe&lt;/li&gt;
      &lt;li&gt;Keras&lt;/li&gt;
      &lt;li&gt;dmlc XGBoost&lt;/li&gt;
      &lt;li&gt;scikit-learn&lt;/li&gt;
      &lt;li&gt;Turi&lt;/li&gt;
      &lt;li&gt;LIBSVM&lt;/li&gt;
    &lt;/ul&gt;
  &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;For converting trained models to the Core ML format, use Apple’s &lt;a href=&quot;https://pypi.python.org/pypi/coremltools&quot;&gt;Core ML Tools Python package&lt;/a&gt;.&lt;/p&gt;

&lt;h2 id=&quot;development-flow&quot;&gt;Development Flow&lt;/h2&gt;

&lt;ul&gt;
  &lt;li&gt;Collect data&lt;/li&gt;
  &lt;li&gt;Train model&lt;/li&gt;
  &lt;li&gt;Drag model into Xcode.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Xcode shows name, filesize, author, license, inputs and outputs. It also generates Swift code asynchronously, for loading and predicting against the model.&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;
    &lt;p&gt;Model sizes: How does compression work?&lt;/p&gt;
  &lt;/li&gt;
  &lt;li&gt;Type of file is abstracted.&lt;/li&gt;
  &lt;li&gt;Strongly typed inputs.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3 id=&quot;generated-source&quot;&gt;Generated Source&lt;/h3&gt;
&lt;ul&gt;
  &lt;li&gt;Input, output, and classifier classes.&lt;/li&gt;
  &lt;li&gt;Offers access to the underlying &lt;code class=&quot;highlighter-rouge&quot;&gt;MLModel&lt;/code&gt; for programmatic use.&lt;/li&gt;
  &lt;li&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;MLModel&lt;/code&gt; has an &lt;code class=&quot;highlighter-rouge&quot;&gt;MLModelDescription&lt;/code&gt; and another conformance-based (?) prediction method.&lt;/li&gt;
  &lt;li&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;MLModel&lt;/code&gt; is JSON based.&lt;/li&gt;
&lt;/ul&gt;
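A hypothetical sketch of what using the generated source looks like. The `FlowerClassifier` class name and its prediction signature are made up for illustration; Xcode derives the actual names from your `.mlmodel` file:

```swift
import CoreML
import CoreVideo

// Assumes Xcode generated a FlowerClassifier class from a model
// file named FlowerClassifier.mlmodel (hypothetical names).
func classify(pixelBuffer: CVPixelBuffer) {
    do {
        let model = FlowerClassifier()
        let output = try model.prediction(image: pixelBuffer)
        // The generated output exposes strongly typed properties,
        // e.g. a class label and a probability dictionary.
        print(output.classLabel)
    } catch {
        print("Prediction failed: \(error)")
    }
}
```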

&lt;h1 id=&quot;core-ml-depth&quot;&gt;Core ML in Depth&lt;/h1&gt;

&lt;ul&gt;
  &lt;li&gt;CoreML provides a functional abstraction for machine learning models.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2 id=&quot;types-of-coreml-inputs&quot;&gt;Types of CoreML Inputs&lt;/h2&gt;

&lt;ul&gt;
  &lt;li&gt;Numeric: &lt;code class=&quot;highlighter-rouge&quot;&gt;Double&lt;/code&gt;, &lt;code class=&quot;highlighter-rouge&quot;&gt;Int64&lt;/code&gt;&lt;/li&gt;
  &lt;li&gt;Categories: &lt;code class=&quot;highlighter-rouge&quot;&gt;String&lt;/code&gt;, &lt;code class=&quot;highlighter-rouge&quot;&gt;Int64&lt;/code&gt;&lt;/li&gt;
  &lt;li&gt;Images: &lt;code class=&quot;highlighter-rouge&quot;&gt;CVPixelBuffer&lt;/code&gt;&lt;/li&gt;
  &lt;li&gt;Arrays: &lt;code class=&quot;highlighter-rouge&quot;&gt;MLMultiArray&lt;/code&gt; (New type - why?)&lt;/li&gt;
  &lt;li&gt;Dictionaries: &lt;code class=&quot;highlighter-rouge&quot;&gt;[String: Double]&lt;/code&gt;, &lt;code class=&quot;highlighter-rouge&quot;&gt;[Int64: Double]&lt;/code&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2 id=&quot;working-with-text&quot;&gt;Working With Text&lt;/h2&gt;

&lt;p&gt;Sentiment analysis example: the app takes text and passes it to the model, and the model returns an emoji (happy/ok/sad).&lt;/p&gt;
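The word-count featurization step might be sketched like this, using NSLinguisticTagger to tokenize (assuming the pre-iOS 11 `enumerateTags(in:scheme:options:using:)` API):

```swift
import Foundation

// Tokenize the text and build the word-count dictionary that a
// sentiment model like the one described here could consume.
func wordCounts(for text: String) -> [String: Double] {
    let tagger = NSLinguisticTagger(tagSchemes: [.tokenType], options: 0)
    tagger.string = text
    var counts = [String: Double]()
    let range = NSRange(location: 0, length: (text as NSString).length)
    tagger.enumerateTags(in: range,
                         scheme: .tokenType,
                         options: [.omitWhitespace, .omitPunctuation]) { _, tokenRange, _, _ in
        let word = (text as NSString).substring(with: tokenRange).lowercased()
        counts[word, default: 0] += 1
    }
    return counts
}
```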

&lt;ul&gt;
  &lt;li&gt;Approach: operate on word counts.&lt;/li&gt;
  &lt;li&gt;Use NSLinguisticTagger to tokenize and count words.&lt;/li&gt;
  &lt;li&gt;A “pipeline classifier” does a few things before returning a prediction: it takes a dictionary of word counts, and returns a sentiment label along with sentiment scores between 0 and 1.&lt;/li&gt;
&lt;/ul&gt;</content><author><name></name></author><summary type="html">CoreML</summary></entry><entry><title type="html">Modernizing MBCalendarKit</title><link href="https://blog.mosheberman.com/modernizing-mbcalendarkit/" rel="alternate" type="text/html" title="Modernizing MBCalendarKit" /><published>2017-08-18T09:00:00-04:00</published><updated>2017-08-18T09:00:00-04:00</updated><id>https://blog.mosheberman.com/modernizing-mbcalendarkit</id><content type="html" xml:base="https://blog.mosheberman.com/modernizing-mbcalendarkit/">&lt;p&gt;&lt;img src=&quot;/assets/modernizing-mbcalendarkit/Banner.png&quot; alt=&quot;MBCalendarKit&quot; /&gt;&lt;/p&gt;

&lt;p&gt;MBCalendarKit was written in 2013, to emulate the original iPhone’s calendar app. It featured the classic grid/month view, a week view, and a day view. All three modes had a table below the calendar, listing events for the currently selected day.&lt;/p&gt;

&lt;p&gt;When iOS 7 brought a new calendar app to iOS, MBCalendarKit was quickly left in the past. I was slow to adopt iOS 6 technologies in this project, which made it harder to keep up. Issues began to pile up on GitHub. Swift came out, and many new calendar libraries sprang up to take MBCalendarKit’s place.&lt;/p&gt;

&lt;p&gt;Recently, I started working on some features that needed a calendar UI. I looked around, but found that the libraries that were simple to integrate were difficult to customize. The easily customizable ones had inaccessible documentation. I took another look at MBCalendarKit and set out to modernize it.&lt;/p&gt;

&lt;p&gt;I’m proud to announce a nearly complete rewrite, &lt;a href=&quot;https://github.com/MosheBerman/MBCalendarKit&quot;&gt;MBCalendarKit 5&lt;/a&gt;, with these great new features:&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;Integration into your project as a dynamic framework&lt;/li&gt;
  &lt;li&gt;Swift interoperability improvements&lt;/li&gt;
  &lt;li&gt;Autolayout support&lt;/li&gt;
  &lt;li&gt;Custom Cell Support&lt;/li&gt;
  &lt;li&gt;Rendering in Interface Builder as an IBDesignable&lt;/li&gt;
  &lt;li&gt;Enhanced support for localization. Specifically, appropriate layout in right-to-left environments&lt;/li&gt;
  &lt;li&gt;An updated sample app to try out different parts of the framework&lt;/li&gt;
  &lt;li&gt;Thorough Documentation&lt;/li&gt;
  &lt;li&gt;An updated changelog&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This update resolved roughly half of the outstanding issues and brings the library to a place where it can stand on its own again as a first-class software library. I have a lot going on right now, but I also have a lot on the roadmap for MBCalendarKit. I hope you like it.&lt;/p&gt;

&lt;p&gt;I have a personal project that I want to publish on GitHub without sensitive data, while preserving the rest of my code &lt;em&gt;and my commit history&lt;/em&gt;. Although the project itself is long dead and its passwords are useless, there are personal email addresses and API keys that shouldn’t be published.&lt;/p&gt;

&lt;h1 id=&quot;two-approaches&quot;&gt;Two Approaches&lt;/h1&gt;

&lt;ul&gt;
  &lt;li&gt;&lt;strong&gt;Initial Approach:&lt;/strong&gt; Since I’m moving the code to a new repository, I could just copy the files, sanitize them by hand, and push a single “initial commit” to the public repository.&lt;/li&gt;
  &lt;li&gt;&lt;strong&gt;Alternate Approach:&lt;/strong&gt; I can try to sanitize my repository by altering my &lt;code class=&quot;highlighter-rouge&quot;&gt;git&lt;/code&gt; history.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The initial approach accomplishes my objective to publish the code without the sensitive parts, but won’t preserve commit history. Let’s investigate the alternate approach.&lt;/p&gt;

&lt;h1 id=&quot;an-alternate-approach&quot;&gt;An Alternate Approach&lt;/h1&gt;

&lt;p&gt;The native way is a git command called &lt;a href=&quot;https://git-scm.com/docs/git-filter-branch&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;git-filter-branch&lt;/code&gt;&lt;/a&gt;, used to run filters on git trees. This command is available wherever &lt;code class=&quot;highlighter-rouge&quot;&gt;git&lt;/code&gt; is installed, but &lt;a href=&quot;https://git-scm.com/docs/git-filter-branch#_notes&quot;&gt;the documentation&lt;/a&gt; actually suggests a tool called &lt;a href=&quot;https://rtyley.github.io/bfg-repo-cleaner/&quot;&gt;BFG Repo-Cleaner&lt;/a&gt; for what I want to do here, so let’s look at that.&lt;/p&gt;

&lt;p&gt;Judging by its documentation, BFG is simpler to use than &lt;code class=&quot;highlighter-rouge&quot;&gt;git-filter-branch&lt;/code&gt;. BFG provides an easy way to filter strings and files from your commit history without having to write scripts yourself.&lt;/p&gt;

&lt;h1 id=&quot;altering-history&quot;&gt;Altering History&lt;/h1&gt;

&lt;p&gt;With a tool in hand, let’s give this a try. Here’s how I did it:&lt;/p&gt;

&lt;ol&gt;
  &lt;li&gt;Before using BFG to clean up the project history, I had to put the repo’s latest commit into my desired state, with the sensitive strings already removed. Use a standard &lt;code class=&quot;highlighter-rouge&quot;&gt;git clone&lt;/code&gt; or &lt;code class=&quot;highlighter-rouge&quot;&gt;git pull&lt;/code&gt; to ensure you have the latest content, then make those changes.&lt;/li&gt;
  &lt;li&gt;Once the changes are made, committed, and pushed back to the git server, check out a “&lt;a href=&quot;https://git-scm.com/docs/git-clone#git-clone---mirror&quot;&gt;mirror&lt;/a&gt; copy” of the repository, using the &lt;code class=&quot;highlighter-rouge&quot;&gt;--mirror&lt;/code&gt; flag: &lt;code class=&quot;highlighter-rouge&quot;&gt;git clone --mirror {repo_url}&lt;/code&gt;&lt;/li&gt;
  &lt;li&gt;Next, it’s time to run BFG: &lt;code class=&quot;highlighter-rouge&quot;&gt;java -jar ~/path/to/bfg.jar --replace-text {text_file_name.txt} {repo.git}&lt;/code&gt;. The file &lt;code class=&quot;highlighter-rouge&quot;&gt;{text_file_name.txt}&lt;/code&gt; contains strings to redact, each on its own line. (See below.)&lt;/li&gt;
  &lt;li&gt;Finally, push the contents of &lt;code class=&quot;highlighter-rouge&quot;&gt;repo_name.git&lt;/code&gt; back to the server:&lt;/li&gt;
&lt;/ol&gt;

&lt;div class=&quot;highlighter-rouge&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;$&amp;gt; cd repo_name.git
$&amp;gt; git push
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;
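Those four steps can be rehearsed end-to-end against a throwaway local repository before touching the real server. In this sketch the bare repo `server.git` stands in for the remote, the `bfg.jar` path is a placeholder, and the BFG invocation itself is left as a comment:

```shell
#!/bin/sh
# Rehearse the mirror-clone / rewrite / push cycle against a throwaway
# local bare repo standing in for the real server.
set -e
tmp=$(mktemp -d)
git init -q --bare "$tmp/server.git"        # pretend this is the git server

git clone -q "$tmp/server.git" "$tmp/work"  # step 1: normal clone...
cd "$tmp/work"
git config user.email dev@example.com
git config user.name dev
echo "apiKey = someAPIKey123" > Secrets.txt
git add .
git commit -qm "initial commit"
git push -q origin HEAD                     # ...latest state is on the server

git clone -q --mirror "$tmp/server.git" "$tmp/repo.git"  # step 2: mirror copy
# step 3 (needs the jar):
# java -jar ~/path/to/bfg.jar --replace-text replacements.txt "$tmp/repo.git"
cd "$tmp/repo.git"
git push -q                                 # step 4: push rewritten refs back
```

Once the dry run makes sense, substitute the real repository URL in step 2 and actually run BFG in step 3.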

&lt;h2 id=&quot;notes&quot;&gt;Notes&lt;/h2&gt;

&lt;h1 id=&quot;bare-vs-mirror&quot;&gt;Bare vs Mirror&lt;/h1&gt;
&lt;ul&gt;
  &lt;li&gt;
    &lt;p&gt;A “bare” copy is your repository containing its &lt;code class=&quot;highlighter-rouge&quot;&gt;git&lt;/code&gt; state, but &lt;em&gt;without&lt;/em&gt; your actual source files. Think of this as the contents of &lt;code class=&quot;highlighter-rouge&quot;&gt;{repo_name}/.git&lt;/code&gt; in &lt;code class=&quot;highlighter-rouge&quot;&gt;{repo_name}&lt;/code&gt;.&lt;/p&gt;
  &lt;/li&gt;
  &lt;li&gt;
    &lt;p&gt;A “mirror” copy includes whatever &lt;a href=&quot;https://git-scm.com/docs/git-clone#git-clone---bare&quot;&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;--bare&lt;/code&gt;&lt;/a&gt; does, plus some extra information. (Following the BFG documentation, I used &lt;code class=&quot;highlighter-rouge&quot;&gt;--mirror&lt;/code&gt;.)&lt;/p&gt;
  &lt;/li&gt;
&lt;/ul&gt;

&lt;h1 id=&quot;customizing-text-replacements&quot;&gt;Customizing Text Replacements&lt;/h1&gt;

&lt;p&gt;The file passed into &lt;code class=&quot;highlighter-rouge&quot;&gt;--replace-text&lt;/code&gt; may look like this:&lt;/p&gt;

&lt;div class=&quot;highlighter-rouge&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;password
someAPIKey123
aUsernameABC
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;By default, each string is replaced with &lt;code class=&quot;highlighter-rouge&quot;&gt;***REMOVED***&lt;/code&gt;. &lt;a href=&quot;https://stackoverflow.com/a/15730571/224988&quot;&gt;According to BFG’s author&lt;/a&gt;, you can customize replacements by adding an arrow and the replacement string, like so:&lt;/p&gt;

&lt;div class=&quot;highlighter-rouge&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;password==&amp;gt;***REDACTED***
someAPIKey123==&amp;gt;{api_key}
aUsernameABC==&amp;gt;
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;h1 id=&quot;removing-files&quot;&gt;Removing Files&lt;/h1&gt;

&lt;p&gt;Remove files with:&lt;/p&gt;

&lt;p&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;$&amp;gt; java -jar ~/path/to/bfg.jar --delete-files {file_names} {repo_name.git}&lt;/code&gt;&lt;/p&gt;

&lt;h1 id=&quot;sanity-check&quot;&gt;Sanity Check&lt;/h1&gt;

&lt;p&gt;Double check your work by comparing any commits where a redacted string was modified or introduced. If you see the replacement strings, you’re all set.&lt;/p&gt;</content><author><name></name></author><summary type="html">The Problem</summary></entry><entry><title type="html">Vision</title><link href="https://blog.mosheberman.com/vision-framework/" rel="alternate" type="text/html" title="Vision" /><published>2017-07-04T00:00:00-04:00</published><updated>2017-07-04T00:00:00-04:00</updated><id>https://blog.mosheberman.com/vision-framework</id><content type="html" xml:base="https://blog.mosheberman.com/vision-framework/">&lt;p&gt;These are my notes on &lt;a href=&quot;https://developer.apple.com/videos/play/wwdc2017/506/&quot;&gt;WWDC 2017 Session 506&lt;/a&gt;, called “Vision Framework: Building on Core ML.” I’ve linked to Wikipedia technical terms that I wasn’t familiar with.&lt;/p&gt;

&lt;h2 id=&quot;what-is-it&quot;&gt;What is it?&lt;/h2&gt;

&lt;blockquote&gt;
  &lt;p&gt;High-level on-device solutions to computer vision problems through one simple API.&lt;/p&gt;

  &lt;p&gt;– Brett Keating, WWDC 2017&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2 id=&quot;what-can-it-do&quot;&gt;What can it do?&lt;/h2&gt;

&lt;p&gt;Vision framework can detect faces, based on deep learning. Apple says this gives &lt;a href=&quot;https://en.wikipedia.org/wiki/Precision_and_recall&quot;&gt;higher precision and higher recall&lt;/a&gt; than previous technologies, such as Core Image or AVCapture. This allows for better detection of smaller faces, side views (“strong profiles”), and partially blocked (“occluded”) faces, including faces obscured by hats and glasses.&lt;/p&gt;

&lt;p&gt;The full feature list:&lt;/p&gt;
&lt;ul&gt;
  &lt;li&gt;Face landmarks, a “constellation of points” on the face. Essentially, tracing eyes, nose, mouth, and the chin.&lt;/li&gt;
  &lt;li&gt;Photo Stitching (“&lt;a href=&quot;https://en.wikipedia.org/wiki/Image_registration&quot;&gt;Image Registration&lt;/a&gt;”) with two techniques: “translation only” and full “&lt;a href=&quot;https://en.wikipedia.org/wiki/Homography_(computer_vision)&quot;&gt;homography&lt;/a&gt;.”&lt;/li&gt;
  &lt;li&gt;Rectangle, barcode, and text detection.&lt;/li&gt;
  &lt;li&gt;Object tracking, for faces or other rectangles in video.&lt;/li&gt;
  &lt;li&gt;Automatically integrate CoreML models directly into Vision. (Apple showed a demo using an &lt;a href=&quot;http://yann.lecun.com/exdb/mnist/&quot;&gt;MNIST&lt;/a&gt; handwriting recognition model, &amp;amp; Core Image filters to read the number four from a sticky note.)&lt;/li&gt;
&lt;/ul&gt;

&lt;h2 id=&quot;three-steps-to-vision&quot;&gt;Three Steps To Vision&lt;/h2&gt;

&lt;ol&gt;
  &lt;li&gt;Use a &lt;code class=&quot;highlighter-rouge&quot;&gt;VNRequest&lt;/code&gt; subclass to ask Vision for something. For example, &lt;code class=&quot;highlighter-rouge&quot;&gt;VNDetectBarcodesRequest&lt;/code&gt; for barcodes.&lt;/li&gt;
  &lt;li&gt;Pass the request to one of two kinds of request handlers, along with a completion block.&lt;/li&gt;
  &lt;li&gt;In the completion block, we get back the initial request with its &lt;code class=&quot;highlighter-rouge&quot;&gt;results&lt;/code&gt; array populated with “observations,” such as &lt;code class=&quot;highlighter-rouge&quot;&gt;VNClassificationObservation&lt;/code&gt; or &lt;code class=&quot;highlighter-rouge&quot;&gt;VNDetectedObjectObservation&lt;/code&gt;.&lt;/li&gt;
&lt;/ol&gt;
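As a sketch of those three steps in Swift (the choice of face detection, and the file path, are stand-ins, not anything prescribed by the session):

```swift
import Vision

let imageURL = URL(fileURLWithPath: "/path/to/photo.jpg") // placeholder path

// Step 1: a VNRequest subclass describing what we want Vision to find.
let request = VNDetectFaceRectanglesRequest { request, error in
    // Step 3: the completion block receives the request back,
    // with its results array populated with observations.
    guard let faces = request.results as? [VNFaceObservation] else { return }
    for face in faces {
        print("Face at \(face.boundingBox)") // normalized coordinates
    }
}

// Step 2: hand the request to a request handler for a single still image.
let handler = VNImageRequestHandler(url: imageURL, options: [:])
try? handler.perform([request])
```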

&lt;h3 id=&quot;request-handlers&quot;&gt;Request Handlers&lt;/h3&gt;

&lt;p&gt;Vision offers two request handlers:&lt;/p&gt;

&lt;ol&gt;
  &lt;li&gt;
    &lt;p&gt;Image Request Handlers: For “interactive” exploration of a still image. Image Request Handlers retain images for their lifecycle, as a performance optimization.&lt;/p&gt;
  &lt;/li&gt;
  &lt;li&gt;
    &lt;p&gt;Sequence Request Handler: For tracking movement across video. Sequence request handlers don’t have the same optimization. (Imagine how many frames would need to be cached for a 30-second video clip at 24 frames per second: 720.)&lt;/p&gt;
  &lt;/li&gt;
&lt;/ol&gt;

&lt;h2 id=&quot;best-practices&quot;&gt;Best Practices&lt;/h2&gt;

&lt;p&gt;Apple discussed three areas for best practices:&lt;/p&gt;

&lt;h3 id=&quot;1-image-type&quot;&gt;1. Image Type&lt;/h3&gt;

&lt;p&gt;Vision supports a wide variety of image types/sources: &lt;code class=&quot;highlighter-rouge&quot;&gt;CVPixelBuffer&lt;/code&gt;, &lt;code class=&quot;highlighter-rouge&quot;&gt;CGImageRef&lt;/code&gt;, &lt;code class=&quot;highlighter-rouge&quot;&gt;CIImage&lt;/code&gt;, &lt;code class=&quot;highlighter-rouge&quot;&gt;NSURL&lt;/code&gt;, and &lt;code class=&quot;highlighter-rouge&quot;&gt;NSData&lt;/code&gt;.&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;Use &lt;code class=&quot;highlighter-rouge&quot;&gt;CVPixelBuffer&lt;/code&gt;s for streaming. (You get &lt;code class=&quot;highlighter-rouge&quot;&gt;CVPixelBuffer&lt;/code&gt;s from the &lt;code class=&quot;highlighter-rouge&quot;&gt;CMSampleBuffer&lt;/code&gt;s that a camera stream’s &lt;code class=&quot;highlighter-rouge&quot;&gt;AVCaptureVideoDataOutput&lt;/code&gt; delivers.) These are a low-level format for in-memory RGB data.&lt;/li&gt;
  &lt;li&gt;Use &lt;code class=&quot;highlighter-rouge&quot;&gt;URL&lt;/code&gt; for accessing images that are saved to disk, or &lt;code class=&quot;highlighter-rouge&quot;&gt;NSData&lt;/code&gt; for images from the web. For URL-based images, you don’t need to pass EXIF orientation data. (But you can specify it if you want to override the default.)&lt;/li&gt;
  &lt;li&gt;You can pass in &lt;code class=&quot;highlighter-rouge&quot;&gt;CIImage&lt;/code&gt;s from Core Image.&lt;/li&gt;
  &lt;li&gt;Finally, if you already have a &lt;code class=&quot;highlighter-rouge&quot;&gt;UIImage&lt;/code&gt; or &lt;code class=&quot;highlighter-rouge&quot;&gt;NSImage&lt;/code&gt;, get the &lt;code class=&quot;highlighter-rouge&quot;&gt;CGImageRef&lt;/code&gt; and pass that into Vision. Easy.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3 id=&quot;2-what-am-i-going-to-do-with-the-image&quot;&gt;2. What am I going to do with the image?&lt;/h3&gt;

&lt;ul&gt;
  &lt;li&gt;Use the appropriate handler. (&lt;code class=&quot;highlighter-rouge&quot;&gt;VNImageRequestHandler&lt;/code&gt; or &lt;code class=&quot;highlighter-rouge&quot;&gt;VNSequenceRequestHandler&lt;/code&gt;.)&lt;/li&gt;
  &lt;li&gt;Don’t pre-scale images.&lt;/li&gt;
  &lt;li&gt;Do pass in EXIF orientation data. (Except for URL-based images.)&lt;/li&gt;
&lt;/ul&gt;

&lt;h3 id=&quot;3-performance&quot;&gt;3. Performance&lt;/h3&gt;

&lt;p&gt;Dispatch to a background queue, so Vision doesn’t block your UI. In the completion handler, remember to dispatch back to the main queue if you’re updating UI.&lt;/p&gt;

&lt;h2 id=&quot;demo&quot;&gt;Demo&lt;/h2&gt;

&lt;p&gt;Apple ended the session with a &lt;a href=&quot;https://developer.apple.com/sample-code/wwdc/2017/ImageClassificationwithVisionandCoreML.zip&quot;&gt;demo&lt;/a&gt;.&lt;/p&gt;</content><author><name></name></author><summary type="html">These are my notes on WWDC 2017 Session 506, called “Vision Framework: Building on Core ML.” I’ve linked to Wikipedia technical terms that I wasn’t familiar with.</summary></entry><entry><title type="html">Apple Car: Getting There With ARKit</title><link href="https://blog.mosheberman.com/apple-car-getting-there-with-arkit/" rel="alternate" type="text/html" title="Apple Car: Getting There With ARKit" /><published>2017-06-13T02:00:00-04:00</published><updated>2017-06-13T02:00:00-04:00</updated><id>https://blog.mosheberman.com/arkit-inverted</id><content type="html" xml:base="https://blog.mosheberman.com/apple-car-getting-there-with-arkit/">&lt;p&gt;&lt;img src=&quot;/assets/2017-06-13-arkit-inverted/car.png&quot; alt=&quot;ARKit or CARKit?&quot; height=&quot;128px&quot; style=&quot;float:left; padding: 10px;&quot; /&gt;&lt;/p&gt;

&lt;p&gt;I watched the intro to the &lt;code class=&quot;highlighter-rouge&quot;&gt;ARKit&lt;/code&gt; session today, and the whole time I’m thinking: That’s great for a car. That’s great for a car. It wasn’t just my gut telling me this. The talking points that Apple brought up all line up with the feature set you’d want out of a car.&lt;/p&gt;

&lt;p&gt;Mike Buerli, one of the Apple engineers who ran the &lt;code class=&quot;highlighter-rouge&quot;&gt;ARKit&lt;/code&gt; session, described the tech as “the illusion that virtual objects are placed in the physical world […] based on what your camera sees.” But why? So you can re-model your home? What about games, you say?&lt;/p&gt;

&lt;p&gt;Maybe I’m wrong, but I also bet that &lt;code class=&quot;highlighter-rouge&quot;&gt;ARKit&lt;/code&gt;-powered games, especially on iPad, won’t be great at long immersive play. Quick encounters in Pokémon Go? Sure. Pokémon Blue in the real world? Nope. I’d get tired holding up an iPad for too long, for the same reasons the idea of a touch-screen Mac has been maligned in the past. It’s tiresome, and just not a great interface for consuming large amounts of useful data. I think the key part of that quote is “based on what your camera sees.”&lt;/p&gt;

&lt;h2 id=&quot;arkit-or-carkit&quot;&gt;ARKit or CARKit?&lt;/h2&gt;

&lt;p&gt;If augmented reality is cool for people, it’s downright useful for machines. Let’s pretend we’re writing some autonomous car software.&lt;/p&gt;

&lt;p&gt;We probably want to map out the terrain, so we can move the car safely. We’re going to want to make sure other drivers on the road don’t collide with our drivers, and we’re going to want to be able to adapt to new environments based on movement and light. We’re going to want to be able to respond quickly.&lt;/p&gt;

&lt;p&gt;&lt;code class=&quot;highlighter-rouge&quot;&gt;ARKit&lt;/code&gt; provides a partial solution for this, with what Apple is calling “World Tracking.” Buerli describes some of this functionality as one of the three components of &lt;code class=&quot;highlighter-rouge&quot;&gt;ARKit&lt;/code&gt;:&lt;/p&gt;

&lt;p&gt;“With World Tracking we provide you the ability to get your device’s relative position in the physical environment,” he said. He also stressed orientation as a huge benefit of using &lt;code class=&quot;highlighter-rouge&quot;&gt;ARKit&lt;/code&gt;. &lt;code class=&quot;highlighter-rouge&quot;&gt;ARKit&lt;/code&gt; is about where your device, and by extension you, are in the world. Your device can be your phone or your car.&lt;/p&gt;

&lt;p&gt;So this is already getting us part of the way there. We can read the terrain with &lt;code class=&quot;highlighter-rouge&quot;&gt;ARKit&lt;/code&gt; and we can tell where you are. What about movements and low light?&lt;/p&gt;

&lt;h2 id=&quot;having-a-vision&quot;&gt;Having a Vision&lt;/h2&gt;

&lt;p&gt;The Vision framework isn’t just a fresh take on Apple’s previous software offerings for face detection. Fewer false positives, smaller faces, occlusion detection, and strong profiles (such as the side of a face) were all things Apple touted in their Vision framework session. (You know that game people play where they pretend the fronts of cars look like faces?)&lt;/p&gt;

&lt;p&gt;I imagine each of these features directly corresponds to a challenge one could conceivably face while developing autonomous car software. Accuracy is important if you want to detect an approaching car in the distance, moving at highway speeds.&lt;/p&gt;

&lt;p&gt;Apple wouldn’t re-write something just to make it better. There’s usually a deeper, long-term motivation involved. Metal enabled a lot of this year’s technology. This kind of wheel re-inventing means that something is going on, and I think that thing is putting the pieces together for the now-confirmed car.&lt;/p&gt;

&lt;p&gt;Fewer false positives and smaller faces deal with the vanishing point. Occlusion detection and strong profiles deal with stop signs and intersections. (Cars often approach at 45 degree angles.) These pain points sound exactly like the kinds of problems I’d want to have solved.&lt;/p&gt;

&lt;h2 id=&quot;detours-ahead&quot;&gt;Detour(s) Ahead&lt;/h2&gt;

&lt;p&gt;During the &lt;code class=&quot;highlighter-rouge&quot;&gt;ARKit&lt;/code&gt; session, Apple outlined some of the scenarios where &lt;code class=&quot;highlighter-rouge&quot;&gt;ARKit&lt;/code&gt; won’t work well, and they all sound like impediments to shipping a safe system: low light, temporarily blocked cameras, and drift. I think that’s where machine learning can help at least a little bit.&lt;/p&gt;

&lt;p&gt;Low light is a problem around the world. Can Apple make car software work with an alternate band of light, such as infrared? Or is a design goal of the project to work with stock hardware, so that adoption can spread easily across partner manufacturers around the world? That would be a reason to stick with standard components rather than another kind of light.&lt;/p&gt;

&lt;p&gt;Temporarily blocked cameras are going to be a problem on any camera-based system. It’s the same thing as someone waving something in front of your face or blindfolding you. It happens, and that’s a reason not to rely on cameras 100% for driving. That said, I don’t think this is a problem worth solving.&lt;/p&gt;

&lt;p&gt;Lastly, Apple said that objects in an &lt;code class=&quot;highlighter-rouge&quot;&gt;ARKit&lt;/code&gt; scene can start to drift if they are moving. That would seemingly pose a problem for detecting other drivers. This is why I think a combination of &lt;code class=&quot;highlighter-rouge&quot;&gt;ARKit&lt;/code&gt; and &lt;code class=&quot;highlighter-rouge&quot;&gt;Vision&lt;/code&gt; is going to be critical to the success of a car project. &lt;code class=&quot;highlighter-rouge&quot;&gt;ARKit&lt;/code&gt; for terrain and world. &lt;code class=&quot;highlighter-rouge&quot;&gt;Vision&lt;/code&gt; for tracking nonstationary obstacles.&lt;/p&gt;

&lt;h2 id=&quot;thoughts-on-hardware&quot;&gt;Thoughts on Hardware&lt;/h2&gt;

&lt;p&gt;Apple has made some interesting hardware advances recently, and one is particularly notable, given how accurate &lt;code class=&quot;highlighter-rouge&quot;&gt;ARKit&lt;/code&gt; is with just a single camera. The dual lens on iPhone 7 Plus isn’t just great for photography; it also more closely resembles how our brains process depth. That’s potentially a component of this whole picture as well.&lt;/p&gt;

&lt;p&gt;The contrast to the one or two camera system is the panoramic 360 degree cameras we’ve seen on almost every self-driving car prototype to be secretly photographed outside someone’s backyard in Silicon Valley. If Apple has such accurate terrain mapping capable of running in real time on an A9, what are those contraptions on cars doing?&lt;/p&gt;

&lt;h2 id=&quot;what-about-coreml&quot;&gt;What about CoreML?&lt;/h2&gt;

&lt;p&gt;I think that machine learning is going to be useful for complementing all of the vision based technology. I wouldn’t rely on machine learning if the cameras go out, but it might be useful for learning traffic patterns or roadwork patterns. There are a number of interesting applications surrounding driving habits as well.&lt;/p&gt;

&lt;p&gt;I initially thought that machine learning might be useful for kicking in when a car’s cameras fail. That could be, but even familiarity with the terrain doesn’t guarantee that another crazy driver isn’t passing through on a perpendicular road at 80 miles an hour.&lt;/p&gt;

&lt;h2 id=&quot;from-here-to-there-apple-car&quot;&gt;From Here To There: Apple Car&lt;/h2&gt;

&lt;p&gt;Augmented Reality. Vision. CoreML. Are you getting it yet? These aren’t three devices. It’s one device and we’re calling it Apple Car.&lt;/p&gt;</content><author><name></name></author><summary type="html"></summary></entry><entry><title type="html">WWDC Wishlist: The Next Apple TV</title><link href="https://blog.mosheberman.com/my-wwdc-2017-wishlist/" rel="alternate" type="text/html" title="WWDC Wishlist: The Next Apple TV" /><published>2017-05-24T12:00:00-04:00</published><updated>2017-05-24T12:00:00-04:00</updated><id>https://blog.mosheberman.com/wwdc-wishlist</id><content type="html" xml:base="https://blog.mosheberman.com/my-wwdc-2017-wishlist/">&lt;p&gt;With WWDC just around the corner, here’s my wishlist: “Hey, Siri” for Apple TV, with a smarter Siri.&lt;/p&gt;

&lt;p&gt;I think this is all Apple needs to compete with Google Home and Amazon Echo. Everyone else is doing dedicated devices for voice-driven home assistants because they did dongles for television.&lt;/p&gt;

&lt;p&gt;Apple TV is unique in that it’s the only one of the three devices that actually rests on a television stand and stares back at you in all of its beauty. Apple can add a display and a camera to the front and compete with the Amazon Echo Show and Google Home.&lt;/p&gt;

&lt;p&gt;A television, a gaming system, and a voice-controlled home assistant. Are you getting it yet?&lt;/p&gt;</content><author><name></name></author><summary type="html">With WWDC just around the corner, here’s my wishlist: “Hey, Siri” for Apple TV, with a smarter Siri.</summary></entry><entry><title type="html">Swift is not like Kotlin</title><link href="https://blog.mosheberman.com/swift-is-not-like-kotlin/" rel="alternate" type="text/html" title="Swift is not like Kotlin" /><published>2017-05-19T11:00:00-04:00</published><updated>2017-05-19T11:00:00-04:00</updated><id>https://blog.mosheberman.com/you-wish-kotlin-was-like-swift</id><content type="html" xml:base="https://blog.mosheberman.com/swift-is-not-like-kotlin/">&lt;p&gt;Someone just showed me the “&lt;a href=&quot;http://nilhcem.com/swift-is-like-kotlin/&quot;&gt;Swift is Like Kotlin&lt;/a&gt;” blog post. The page compares Swift and Kotlin code samples side by side, with captions, and I think the post completely missed the point of Swift.&lt;/p&gt;

&lt;p&gt;I want to talk about how the stated design goals of each language influence the comparison, and how a syntactical comparison of the two languages is misleading. Then, I’d like to explain why Swift is more like Python than Kotlin. I’ll wrap this up with some (slightly snarky) commentary about iOS vs Android.&lt;/p&gt;

&lt;p&gt;Before we begin, I should disclose that I’d never looked at Kotlin before today, and it seems like an interesting language. That said, the simple syntactical comparison feels like it’s in really poor taste, so let’s start by comparing Apple’s and JetBrains’ stated goals and go from there.&lt;/p&gt;

&lt;p&gt;&lt;a href=&quot;https://swift.org&quot;&gt;Swift.org&lt;/a&gt; outlines several goals for Swift as a language. From the Swift homepage:&lt;/p&gt;

&lt;blockquote&gt;
  &lt;p&gt;Swift makes it easy to write software that is incredibly fast and safe by design. […] For students, learning Swift has been a great introduction to modern programming concepts and best practices.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Speed, and safety from errors and crashes. There’s another stated goal that’s also worth highlighting. Swift is about expressiveness, and ease of learning.&lt;/p&gt;

&lt;p&gt;In contrast, Kotlin’s website has this tag line in big letters:&lt;/p&gt;

&lt;blockquote&gt;
  &lt;p&gt;Statically typed programming language for modern multiplatform applications.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Ok, that’s great. It’s a programming language for modern applications, with data types. The only thing I learn from this blurb is that it has a stronger typing system than Python. (It also suggests that Kotlin is &lt;em&gt;not&lt;/em&gt; aimed at ease of learning, or its marketing might have emphasized that a teeny bit more.)&lt;/p&gt;

&lt;p&gt;One of the requirements for “a great introduction” to a language is a syntax that is easy to learn, read, and write. To say that Kotlin is like Swift is to say that both languages share this attribute.&lt;/p&gt;

&lt;p&gt;The code samples demonstrate quite the opposite to me. Everything from keywords to collection declarations are geekier in Kotlin, and often harder to remember for beginners. &lt;code class=&quot;highlighter-rouge&quot;&gt;println&lt;/code&gt; is a programmer-speak abbreviation for &lt;code class=&quot;highlighter-rouge&quot;&gt;print line&lt;/code&gt;, whereas &lt;code class=&quot;highlighter-rouge&quot;&gt;print&lt;/code&gt; is a fully spelled out verb. &lt;code class=&quot;highlighter-rouge&quot;&gt;val&lt;/code&gt; is for value and &lt;code class=&quot;highlighter-rouge&quot;&gt;let&lt;/code&gt; for, well, let.&lt;/p&gt;

&lt;p&gt;As a computer science student and a self-learner, I found professors’ examples in C, and languages like Ruby with abbreviations galore, frustrating to understand. Swift makes a noticeable and successful effort to improve in that area.&lt;/p&gt;

&lt;p&gt;Kotlin still doesn’t seem to have array or dictionary/hashmap literal syntax, compared to Swift’s &lt;code class=&quot;highlighter-rouge&quot;&gt;[]&lt;/code&gt;. Which is easier? Are they “like” each other?&lt;/p&gt;

&lt;p&gt;Another comparison is the minimum viable program in each language. Swift’s is one line: &lt;code class=&quot;highlighter-rouge&quot;&gt;print(&quot;Hello World!&quot;)&lt;/code&gt;. Kotlin looks more like this:&lt;/p&gt;

&lt;div class=&quot;highlighter-rouge&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;package hello

fun main(args: Array&amp;lt;String&amp;gt;) {
    println(&quot;Hello world!&quot;)
}

&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;I don’t know enough about Kotlin, but if my Java memory serves, &lt;code class=&quot;highlighter-rouge&quot;&gt;interface&lt;/code&gt;s don’t have optional functions. Adopting an interface is all-or-nothing. Swift and Objective-C both allow for &lt;code class=&quot;highlighter-rouge&quot;&gt;protocols&lt;/code&gt; to declare their contents as optional for adopters.&lt;/p&gt;

&lt;p&gt;As a final example, classes in Kotlin are &lt;code class=&quot;highlighter-rouge&quot;&gt;final&lt;/code&gt; by default. (I mean this in the informal sense.) You need to mark a class as &lt;code class=&quot;highlighter-rouge&quot;&gt;open&lt;/code&gt; for Kotlin to allow subclasses.&lt;/p&gt;
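A quick illustration of that default (the class names here are made up):

```kotlin
open class Vehicle        // `open` explicitly allows subclassing
class Car : Vehicle()     // compiles, because Vehicle is open

class Sedan               // no `open`, so Sedan is final
// class Coupe : Sedan()  // error: "This type is final, so it cannot be inherited from"
```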

&lt;p&gt;Kotlin is very much &lt;em&gt;not&lt;/em&gt; like Swift in these respects. Python, like Swift, is easy to learn, and might have been a better analogy. To be honest, I’m not sure what the purpose of making this comparison is.&lt;/p&gt;

&lt;p&gt;If this post was in the vein of “Android for iOS engineers,” or “Kotlin for Swift developers,” it would make more sense to me. “These are things that you’ll be familiar with.” Sure, there are valid comparisons to make between Swift and Kotlin. (For example, I spy templating in the Kotlin code sample.)&lt;/p&gt;

&lt;p&gt;It feels like the intention was to stoke fanboy-ish and fangirl-ish enthusiasm for Kotlin and Android. “Rah rah! Kotlin is as good as Apple’s language.” In fact, someone linked the original article at the &lt;a href=&quot;https://en.wikipedia.org/wiki/Kotlin_(programming_language)&quot;&gt;top of Kotlin’s Wikipedia page&lt;/a&gt;! (Oh, while we’re here, it wasn’t an accident that Google’s Stephanie Saad Cuthbertson used Steve Jobs’ famous “one more thing” formula when announcing a “new” language for Android.)&lt;/p&gt;

&lt;p&gt;Judging by &lt;a href=&quot;https://github.com/JetBrains/kotlin/commit/3e4dce385331c91c9059fcdcea3eae2394f34942&quot;&gt;Kotlin’s first commit&lt;/a&gt; in 2010, it’s about the same age as Swift, which also &lt;a href=&quot;https://en.wikipedia.org/wiki/Swift_(programming_language)#History&quot;&gt;began that year&lt;/a&gt;. Two languages, two platforms, two communities. That’s totally ok. We can retain our identities, yet still be friends.&lt;/p&gt;

&lt;p&gt;Kotlin is like Java. Swift is like Objective-C. They’re probably both “like” C++.&lt;/p&gt;</content><author><name></name></author><summary type="html">Someone just showed me the “Swift is Like Kotlin” blog post. The page compares Swift and Kotlin code samples side by side, with captions, and I think the post completely missed the point of Swift.</summary></entry><entry><title type="html">Swapping App Icons on iOS and tvOS</title><link href="https://blog.mosheberman.com/swapping-app-icons-on-ios-and-tvos/" rel="alternate" type="text/html" title="Swapping App Icons on iOS and tvOS" /><published>2017-05-08T12:00:00-04:00</published><updated>2017-05-08T12:00:00-04:00</updated><id>https://blog.mosheberman.com/ios-icon-swap</id><content type="html" xml:base="https://blog.mosheberman.com/swapping-app-icons-on-ios-and-tvos/">&lt;p&gt;In this post, I’m going to cover the &lt;code class=&quot;highlighter-rouge&quot;&gt;Info.plist&lt;/code&gt; changes required to enable your app to locate alternate icons and checking for the availability of alternate icons. I’m also going to provide an Objective-C code sample, and discuss the &lt;code class=&quot;highlighter-rouge&quot;&gt;supportsAlternateIcons&lt;/code&gt; API and consider what it might be hinting at. I cover the prompt that users &lt;em&gt;may&lt;/em&gt; see when invoking your implementation of this API and how to check if your user is actually using an alternate icon. Finally, we’ll talk about the callback block, and resetting the custom icon back to the original.&lt;/p&gt;

&lt;p&gt;I implemented the new icon swapping API available on iOS 10.3, as part of Ultimate Zmanim 11.2’s new Color Themes for premium users. Here’s what I learned:&lt;/p&gt;

&lt;p&gt;First, the documentation is a little confusing to read, but if you want to implement this feature, you want to add a dictionary with the &lt;code class=&quot;highlighter-rouge&quot;&gt;CFBundleIcons&lt;/code&gt; key to &lt;code class=&quot;highlighter-rouge&quot;&gt;Info.plist&lt;/code&gt;. &lt;code class=&quot;highlighter-rouge&quot;&gt;CFBundleIcons&lt;/code&gt; has a few sub-keys that you might be familiar with from iOS 5’s Newsstand. (See &lt;code class=&quot;highlighter-rouge&quot;&gt;UINewsstandIcon&lt;/code&gt;.) The ones we care about here are &lt;code class=&quot;highlighter-rouge&quot;&gt;CFBundlePrimaryIcon&lt;/code&gt; and &lt;code class=&quot;highlighter-rouge&quot;&gt;CFBundleAlternateIcons&lt;/code&gt;. Both of these are dictionaries. &lt;code class=&quot;highlighter-rouge&quot;&gt;CFBundlePrimaryIcon&lt;/code&gt; contains two keys: a boolean &lt;code class=&quot;highlighter-rouge&quot;&gt;UIPrerenderedIcon&lt;/code&gt; and an array with the key &lt;code class=&quot;highlighter-rouge&quot;&gt;CFBundleIconFiles&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;To add alternate icons, though, we need to add &lt;code class=&quot;highlighter-rouge&quot;&gt;CFBundleAlternateIcons&lt;/code&gt;. The keys in this dictionary are names of icons. Each key contains a dictionary, which matches the contents of the primary icon. That is, each icon key is going to have a &lt;code class=&quot;highlighter-rouge&quot;&gt;UIPrerenderedIcon&lt;/code&gt; key and an array of &lt;code class=&quot;highlighter-rouge&quot;&gt;CFBundleIconFiles&lt;/code&gt; strings.&lt;/p&gt;

&lt;p&gt;&lt;img src=&quot;/assets/2017-5-8-icon-swap/info-plist.png&quot; alt=&quot;A screenshot of info.plist&quot; /&gt;&lt;/p&gt;

&lt;hr /&gt;
&lt;p&gt;&lt;strong&gt;Note:&lt;/strong&gt; On tvOS, there are no prerendered icons, so the &lt;code class=&quot;highlighter-rouge&quot;&gt;CFBundleAlternateIcons&lt;/code&gt; key simply contains an array of icon file name strings.&lt;/p&gt;

&lt;hr /&gt;

&lt;p&gt;The alternate icon files themselves have to be &lt;code class=&quot;highlighter-rouge&quot;&gt;png&lt;/code&gt; files in your app bundle. If you put them into an asset catalog, iOS will not find them. (I tried alternate icon entries, and regular image entries. Neither worked.) Remember to name your icons with the retina display naming convention. For example, if my icon file name is &lt;code class=&quot;highlighter-rouge&quot;&gt;AlternateIconBlue&lt;/code&gt;, I want to have a dictionary in &lt;code class=&quot;highlighter-rouge&quot;&gt;CFBundleIcons&lt;/code&gt; called &lt;code class=&quot;highlighter-rouge&quot;&gt;CFBundleAlternateIcons&lt;/code&gt;. On iOS that’s going to have a key called &lt;code class=&quot;highlighter-rouge&quot;&gt;AlternateIconBlue&lt;/code&gt; with &lt;code class=&quot;highlighter-rouge&quot;&gt;UIPrerenderedIcon&lt;/code&gt; (set to &lt;code class=&quot;highlighter-rouge&quot;&gt;YES&lt;/code&gt; in my case) and then an array called &lt;code class=&quot;highlighter-rouge&quot;&gt;CFBundleIconFiles&lt;/code&gt; containing &lt;code class=&quot;highlighter-rouge&quot;&gt;AlternateIconBlue&lt;/code&gt; again. (On tvOS, it’s a little simpler, as noted above.)&lt;/p&gt;
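Sketched as raw plist source, the structure described above might look like this on iOS (the icon names are examples from this post, not required values):

```xml
&amp;lt;key&amp;gt;CFBundleIcons&amp;lt;/key&amp;gt;
&amp;lt;dict&amp;gt;
    &amp;lt;key&amp;gt;CFBundlePrimaryIcon&amp;lt;/key&amp;gt;
    &amp;lt;dict&amp;gt;
        &amp;lt;key&amp;gt;CFBundleIconFiles&amp;lt;/key&amp;gt;
        &amp;lt;array&amp;gt;
            &amp;lt;string&amp;gt;AppIcon&amp;lt;/string&amp;gt;
        &amp;lt;/array&amp;gt;
        &amp;lt;key&amp;gt;UIPrerenderedIcon&amp;lt;/key&amp;gt;
        &amp;lt;true/&amp;gt;
    &amp;lt;/dict&amp;gt;
    &amp;lt;key&amp;gt;CFBundleAlternateIcons&amp;lt;/key&amp;gt;
    &amp;lt;dict&amp;gt;
        &amp;lt;key&amp;gt;AlternateIconBlue&amp;lt;/key&amp;gt;
        &amp;lt;dict&amp;gt;
            &amp;lt;key&amp;gt;CFBundleIconFiles&amp;lt;/key&amp;gt;
            &amp;lt;array&amp;gt;
                &amp;lt;string&amp;gt;AlternateIconBlue&amp;lt;/string&amp;gt;
            &amp;lt;/array&amp;gt;
            &amp;lt;key&amp;gt;UIPrerenderedIcon&amp;lt;/key&amp;gt;
            &amp;lt;true/&amp;gt;
        &amp;lt;/dict&amp;gt;
    &amp;lt;/dict&amp;gt;
&amp;lt;/dict&amp;gt;
```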

&lt;hr /&gt;
&lt;p&gt;Steve Troughton-Smith has &lt;a href=&quot;https://github.com/steventroughtonsmith/AlternateIconTest&quot;&gt;a great demo project on GitHub&lt;/a&gt; illustrating how it all works.&lt;/p&gt;

&lt;hr /&gt;

&lt;p&gt;Lastly, remember that this API is new in iOS 10.3, and using it without checking for availability will crash your app on older systems. There are two checks to make: first, verify that the &lt;code class=&quot;highlighter-rouge&quot;&gt;supportsAlternateIcons&lt;/code&gt; property exists on &lt;code class=&quot;highlighter-rouge&quot;&gt;UIApplication&lt;/code&gt; at all, and then read the property itself to ensure that the current environment supports alternate icons.&lt;/p&gt;

&lt;div class=&quot;highlighter-rouge&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;
  
- (void)setAppIconToUseIconWithFilename:(NSString *)appIconFileName
{
    // Guard against older systems, where the property does not exist.
    if ([[UIApplication sharedApplication] respondsToSelector:NSSelectorFromString(@&quot;supportsAlternateIcons&quot;)])
    {
        // Confirm that this environment actually supports alternate icons.
        if (UIApplication.sharedApplication.supportsAlternateIcons)
        {
            [[UIApplication sharedApplication] setAlternateIconName:appIconFileName completionHandler:^(NSError * _Nullable error) {
                if (error) {
                    NSLog(@&quot;Filename: %@, Error swapping icons: %@&quot;, appIconFileName, error);
                }
            }];
        }
    }
}
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;I’m unsure why we need &lt;code class=&quot;highlighter-rouge&quot;&gt;supportsAlternateIcons&lt;/code&gt;, since iOS 10.3 and tvOS 10.2 both support alternate icons. App extensions do not support this API, but &lt;code class=&quot;highlighter-rouge&quot;&gt;setAlternateIconName:completionHandler:&lt;/code&gt; is already marked as unavailable to extensions, and we already have runtime checks to account for the availability of methods and properties. Perhaps it’s useful future-proofing in case Apple introduces other iOS-based environments that don’t support alternate icons. That’s the best explanation I can think of, but if you have an additional perspective, &lt;a href=&quot;https://twitter.com/bermaniastudios&quot;&gt;let me know on Twitter&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;This code worked for me, and oddly, it didn’t prompt me during testing. Most developers report seeing an alert informing the user that the app changed its icon via this API. I don’t know if the missing alert is specific to development builds, so I’m planning to test this by installing the app over TestFlight and trying it out.&lt;/p&gt;

&lt;p&gt;Lots of use cases depend on this alert being suppressed for a seamless experience, but the user-comfort argument is a pretty strong one: if apps change icons without telling the user, that can get pretty annoying pretty fast.&lt;/p&gt;

&lt;p&gt;I noticed a few things while looking at &lt;code class=&quot;highlighter-rouge&quot;&gt;UIApplication.h&lt;/code&gt;. You can check whether your app is currently using an alternate icon by querying &lt;code class=&quot;highlighter-rouge&quot;&gt;UIApplication&lt;/code&gt;’s &lt;code class=&quot;highlighter-rouge&quot;&gt;alternateIconName&lt;/code&gt; property. If it’s nil, your app is using the default icon. That’s a neat trick.&lt;/p&gt;
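&lt;p&gt;As a quick sketch of that check (the log messages are just for illustration):&lt;/p&gt;

&lt;div class=&quot;highlighter-rouge&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;// nil means the app is showing its primary (default) icon.
NSString *currentIconName = UIApplication.sharedApplication.alternateIconName;

if (currentIconName == nil) {
    NSLog(@&quot;Using the default icon.&quot;);
} else {
    NSLog(@&quot;Using alternate icon: %@&quot;, currentIconName);
}
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;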

&lt;p&gt;Also, per the documentation, “the completion handler will be invoked asynchronously on an arbitrary background queue; be sure to dispatch back to the main queue before doing any further UI work.” I think this is a fairly common pattern at this point; most of Apple’s privacy alerts that I’ve seen behave similarly. It’s a simple thing to account for, but having the documentation confirm it is important.&lt;/p&gt;
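&lt;p&gt;Here’s a sketch of that pattern, reusing the hypothetical &lt;code class=&quot;highlighter-rouge&quot;&gt;AlternateIconBlue&lt;/code&gt; name from earlier; the dispatch back to the main queue is the important part:&lt;/p&gt;

&lt;div class=&quot;highlighter-rouge&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;[UIApplication.sharedApplication setAlternateIconName:@&quot;AlternateIconBlue&quot; completionHandler:^(NSError * _Nullable error) {
    // The handler runs on an arbitrary background queue;
    // hop back to the main queue before touching any UI.
    dispatch_async(dispatch_get_main_queue(), ^{
        if (error) {
            // Present an alert, update state, etc.
        }
    });
}];
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;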

&lt;p&gt;One more thing: if you want to reset your app icon to the default, you can pass nil to &lt;code class=&quot;highlighter-rouge&quot;&gt;setAlternateIconName:completionHandler:&lt;/code&gt;, and &lt;code class=&quot;highlighter-rouge&quot;&gt;UIKit&lt;/code&gt; will clear &lt;code class=&quot;highlighter-rouge&quot;&gt;alternateIconName&lt;/code&gt; and reset your app’s icon to its original asset-catalog-based icon.&lt;/p&gt;
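&lt;p&gt;That reset is a one-liner:&lt;/p&gt;

&lt;div class=&quot;highlighter-rouge&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;// Passing nil restores the primary icon and clears alternateIconName.
[UIApplication.sharedApplication setAlternateIconName:nil completionHandler:nil];
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;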

&lt;p&gt;Thanks for reading, and happy coding.&lt;/p&gt;</content><author><name></name></author><summary type="html">In this post, I’m going to cover the Info.plist changes required to enable your app to locate alternate icons and checking for the availability of alternate icons. I’m also going to provide an Objective-C code sample, and discuss the supportsAlternateIcons API and consider what it might be hinting at. I cover the prompt that users may see when invoking your implementation of this API and how to check if your user is actually using an alternate icon. Finally, we’ll talk about the callback block, and resetting the custom icon back to the original.</summary></entry><entry><title type="html">Hello Jekyll!</title><link href="https://blog.mosheberman.com/hello-jekyll/" rel="alternate" type="text/html" title="Hello Jekyll!" /><published>2017-03-27T22:18:16-04:00</published><updated>2017-03-27T22:18:16-04:00</updated><id>https://blog.mosheberman.com/welcome-to-jekyll</id><content type="html" xml:base="https://blog.mosheberman.com/hello-jekyll/">&lt;p&gt;This blows my mind. I was able to deploy an entire brand new blog to my server from my iPhone
just by using Prompt and jekyllrb. Unreal!&lt;/p&gt;</content><author><name></name></author><summary type="html">This blows my mind. I was able to deploy an entire brand new blog to my server from my iPhone just by using Prompt and jekyllrb. Unreal!</summary></entry></feed>