Author Archives: Rex

App Strategy

On the subject of app planning & strategy, I recently came across this post from Rob Caraway: http://robcaraway.com/blog/index.php/2017/02/12/how-i-overcame-crippling-perfectionism-and-made-200k-on-the-saturated-app-store/

Parts of it really resonated with me. He says:

Our strategy was basically “Let’s brainstorm ideas and ship massive features and hope people want them”.

That has been my naive strategy so far. Acting as my own ideal user.

Then he talks about validating an MVP:

  • using “Traffic, as indicated by Google Trends”
  • a landing page to capture e-mails
  • building a prototype in a week
  • validating the demand for the prototype

This all seems standard or obvious when you look at it. But I can say that in reality, I have various app ideas that I think are worth making. When it comes to picking the next one, my current process might as well be rolling dice with bad odds. It’s 1000% obvious, but building a neat app with good UX in 2017 doesn’t count for much. Having a solid marketing strategy in a validated niche is significantly more important than building the best app ever.

I’m currently at a point where I’ve released 3 iOS apps. One of them has done decently and the other two have not. I have to decide between prioritizing new features for my current apps and creating a new app. For the sake of learning new iOS tools (like the camera), it’s probably better for me to work on a new app. Hopefully I can properly validate my idea before I spend months building it this time.

DSLRs

With the way the world is going, it’s too convenient to shoot photos on your smartphone. Charging your DSLR battery pack(s), making sure the memory cards are cleared, and lugging around a backpack full of lenses is a lot of work.

I have an old Canon 40D camera with both EF (full frame / crop) & EF-S (crop only) lenses. I’ll be traveling in a few months and I want something nicer than my iPhone for taking photos.

It seems like the two main options are: get a crop DSLR body or get a full frame DSLR body. Staying within Canon’s ecosystem would be the most convenient. Leaving Canon opens up a can of worms (Nikon, Sony, Pentax, etc?).

If I were to just suck it up, it seems like the answer would be to get a new full frame DSLR & L glass (24-70 EF lens). But I’m leaning towards getting a relatively cheap new crop DSLR body and just making do with the lenses I have (as the economical choice).

With a camera, I care about low-light sensitivity (noise at high ISO) and maybe shutter speed. I don’t care about video options as I don’t intend to shoot and edit movies.

Even though the standalone camera market seems to be dwindling, the big lenses & big image sensors of DSLRs will always deliver image quality that mobile phones cannot match.

iOS 10 Locales and Currency Symbols – Sample App

While working on adding localization to my tip calculator, one thing that seems obvious in retrospect is the difference between a device’s language & region. iOS lets you set the language & region separately. For example, you might want to read text in English, but you could be in Asia. This is relevant to tipping since you could travel to a country where tipping is expected, but the country your phone’s language is associated with doesn’t traditionally tip.
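
To make that concrete, here’s a small sketch (Swift 3) of reading the language and region pieces separately from the current locale; the values in the comments are examples, not guaranteed output:

import Foundation

let locale = Locale.current
// The locale identifier combines language and region, e.g. "en_HK"
print(locale.identifier)
// Language and region can also be read separately (both are optional)
print(locale.languageCode ?? "no language code")   // e.g. "en"
print(locale.regionCode ?? "no region code")       // e.g. "HK"
// The ordered list of languages the user prefers, independent of region
print(Locale.preferredLanguages)                   // e.g. ["en-US", "zh-Hant-HK"]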

While exploring locales and currency symbols, I whipped together a basic demo app that lets you scroll between all the known locales and their currency symbol in iOS 10. This is pretty useful since you can quickly see what the currencySymbol is for each known iOS locale.

Below is the full implementation of the very hacked-together (quick and dirty) code. All you need to do is:

  • Create a new Single View Application project in Xcode
  • Replace ViewController.swift with the code below (written for Swift 3)
  • Run the app in Xcode
import UIKit

class ViewController: UIViewController, UITableViewDelegate {

    let cellIdentifier = "Cell"
    let currentLocaleHeight = CGFloat(80)

    // All locale identifiers known to iOS, sorted alphabetically
    let locales = Locale.availableIdentifiers.sorted { $0.localizedCaseInsensitiveCompare($1) == ComparisonResult.orderedAscending }

    override func viewDidLoad() {
        super.viewDidLoad()

        // The table view sits below the "current locale" label
        let tableView = UITableView()
        tableView.frame = CGRect(x: 0, y: currentLocaleHeight, width: view.frame.width, height: view.frame.height - currentLocaleHeight)
        tableView.dataSource = self
        tableView.delegate = self

        view.addSubview(tableView)

        addCurrentLocaleLabel()
    }

    // Shows the device's current locale identifier at the top of the screen
    func addCurrentLocaleLabel() {
        let localeIdentifier = Locale.current.identifier

        let width = view.frame.width
        let label = UILabel(frame: CGRect(x: 0, y: 0, width: width, height: currentLocaleHeight))
        label.text = "Current locale: " + localeIdentifier
        label.textAlignment = .center
        view.addSubview(label)
    }

}

extension ViewController: UITableViewDataSource {

    func numberOfSections(in tableView: UITableView) -> Int {
        return 1
    }

    func tableView(_ tableView: UITableView, numberOfRowsInSection section: Int) -> Int {
        return locales.count
    }

    func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
        // Quick and dirty: creates a new cell every time instead of dequeuing reusable cells
        let cell = UITableViewCell(style: .value1, reuseIdentifier: cellIdentifier)

        let localeString = locales[indexPath.row]

        // A NumberFormatter configured with the locale exposes its currency symbol
        let numberFormatter = NumberFormatter()
        numberFormatter.locale = Locale(identifier: localeString)

        cell.textLabel?.text = localeString
        cell.detailTextLabel?.text = numberFormatter.currencySymbol

        return cell
    }

}

Note that the “¤” symbol means the currency is unspecified.
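
As an aside, for a single locale you don’t strictly need a NumberFormatter; the Locale value itself exposes a currencySymbol property (a quick sketch, with "fr_FR" as an arbitrary example):

let symbol = Locale(identifier: "fr_FR").currencySymbol   // Optional("€")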

Using Fastlane Snapshot to generate screenshots with UIPickerViews

This week, I released an update for my Tip Solver calculator to add Chinese localization. I had to generate 5 screenshots for 5 devices (iPhone and iPad) across 3 languages. I spent more time automating the process with Fastlane Snapshot than it would have taken to do it all by hand, but the good news is that I’ve now set myself up to painlessly generate screenshots for new languages. Snapshot takes a while to run, but it’s still a huge improvement over producing screenshots manually.

It took me a lot longer than I would have liked to set up my Snapshot process due to my usage of UIPickerView. Tip Solver makes heavy use of UIPickerView, and I ran into many issues with UITest.

Your mileage may vary, but I found I had to do the following to be able to use UITest and UIPickerViews:

  • disable Ads (which run over the network)
  • drastically reduce the number of UIPickerView rows (in numberOfRowsInComponent)
  • use titleForRow instead of viewForRow for UITest running

The last one (using titleForRow) was a complete non-starter since I rely heavily on UIPickerView visual customization. Generating screenshots with incorrect picker views defeats the whole point of the exercise.

I tried using Xcode’s UITest recorder, but I ran into many issues. One glaring issue is that while recording, I was able to swipe the UIPickerView up, but when I played the recording back, it ended up swiping up Control Center instead of adjusting the picker. There is a method (adjustToPickerWheelValue), but I found that it only works with titleForRow (which I don’t use). What I would like is an expansion of the XCUIElement API that adds a simple way to increment a picker or move it up or down by one row.

My final solution (aka workaround) was to use a combination of Fastlane launch arguments & brute-forcing the UIView state (via UIViewController’s viewDidAppear) to generate my screenshots. My workaround isn’t ideal, but it gets the job done.

In my Fastlane Snapfile, I was able to define arguments:

launch_arguments([
 "-screenshot 1",
 "-screenshot 2",
 "-screenshot 3",
 "-screenshot 4",
 "-screenshot 5"
])

In my ViewController (running Swift 3), I was able to handle them accordingly (launch arguments passed as “-key value” pairs automatically show up through UserDefaults):

let screenshot = UserDefaults.standard.string(forKey: "screenshot")
if screenshot == "1" {
    // do something
} else if screenshot == "2" {
    // do something
} else if screenshot == "3" {
    // do something
} else if screenshot == "4" {
    // do something
} else if screenshot == "5" {
    // do something
}

Once everything is set up, generating screenshots is simply a matter of running snapshot on the command line.
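
For completeness, the UI test that snapshot drives is mostly boilerplate; here’s a rough sketch (setupSnapshot and snapshot come from the SnapshotHelper.swift file that fastlane provides for your UI test target — the test class and screenshot name here are just placeholders):

import XCTest

class ScreenshotUITests: XCTestCase {

    func testGenerateScreenshots() {
        let app = XCUIApplication()
        setupSnapshot(app)   // configures the language/locale and launch arguments for the run
        app.launch()

        // The app reads the -screenshot launch argument in viewDidAppear and sets up its own
        // state, so the test only needs to capture the screen.
        snapshot("01Screen")
    }

}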

I’m sure there’s room for improvement in the code (using an enum, etc.), but I left it at that since it’s only for screenshot generation.
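
If I ever revisit it, an enum-based version might look roughly like this (a hypothetical cleanup, not what’s in the shipping app):

enum ScreenshotScene: String {
    case one = "1", two = "2", three = "3", four = "4", five = "5"
}

if let value = UserDefaults.standard.string(forKey: "screenshot"),
    let scene = ScreenshotScene(rawValue: value) {
    switch scene {
    case .one:
        break // configure state for screenshot 1
    case .two:
        break // configure state for screenshot 2
    case .three:
        break // configure state for screenshot 3
    case .four:
        break // configure state for screenshot 4
    case .five:
        break // configure state for screenshot 5
    }
}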

If you’ve made it all the way down here, thanks for reading. I just wanted to share my experience with UITest and UIPickers. UITest probably needs more love from Apple as it was not pleasant to work with.

 

How to open phone app from iMessage extension and pass data

This weekend, I spent some time digging into iOS 10 iMessage app extensions.

Creating an iMessage app or sticker pack is relatively easy. Here are a couple of resources for getting started: tutsplus and medium. Apple also has their own example iMessage app.

For a new iMessage-only app, you can choose File > New > Project and either ‘Sticker Pack Application’ or ‘iMessage Application’ in Xcode. To add an iMessage extension to your existing containing phone app, you can use File > New > Target and either ‘Sticker Pack Extension’ or ‘iMessage Extension’.

I wanted to understand the current state of the user workflow between the iOS (main/phone) app and the iMessage extension app. Apple has its work cut out for it. The iMessage App Store is an awkward modal triggered from an individual iMessage conversation. Users seem to have trouble locating the store and managing iMessage apps, so apps are getting bad reviews.

The good news is that you can launch your phone app from the iMessage app.

From the iMessage extension app, you can use this (with your own AppName):

guard let url: URL = URL(string: "AppName://?myParam=myValue") else { return }

self.extensionContext?.open(url, completionHandler: { (success: Bool) in

})

Calling the above code will open the phone app. One issue I ran into (under the extension scheme) is that the phone app will crash in the simulator when opened this way. Xcode shows a SIGKILL since the iMessage app connected to Xcode is quit while the phone app is being opened. This appears to only be a simulator issue.

You can access the URL params in your phone app’s App Delegate (this assumes the phone app has registered the AppName URL scheme under URL Types in its Info.plist):

func application(_ app: UIApplication, open url: URL, options: [UIApplicationOpenURLOptionsKey : Any] = [:]) -> Bool {
    return true
}
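
The method above just returns true; a minimal sketch of pulling myParam out of the incoming URL inside that method might look like this (using Foundation’s URLComponents; the parameter name matches the example URL above):

// Inside application(_:open:options:)
let components = URLComponents(url: url, resolvingAgainstBaseURL: false)
let myValue = components?.queryItems?.first(where: { $0.name == "myParam" })?.value
print("myParam = \(myValue ?? "nil")")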

The bad news is that you cannot launch your iMessage app from your phone app. Apple must have been short on time, as their usual modus operandi is to launch headline features that are incomplete and hopefully iterate later if their business still cares. Radar(s) have been filed.

When asked, “Is it possible for my app to open the Messages app with my iMessage extension activated?”, an Apple staff member replied, “There currently isn’t a way to do this.”

I think opening your iMessage app from your phone app would be an excellent, natural use case. The user does something in the containing app, then wants to share that app’s state with a friend over iMessage. Why rely on the user going to Messages, finding your iMessage app in the correct conversation, and then recreating the app state to send it over?

So to recap: iMessage apps have a lot of potential since they are living apps inside Messages conversations, and they can communicate with their containing app. But I have reservations about the UX, including discoverability.

How to transfer photos to Apple TV (4th gen) and use photos as screensaver

Here is a quick guide for saving photos onto your 4th generation Apple TV so that you can use them as a screensaver (without constantly having your computer on for Home Sharing).

  1. On your computer, open iTunes and turn on Home Sharing
    1. File > Home Sharing
    2. If applicable, select ‘Choose Photos to Share with Apple TV…’
  2. On your Apple TV, select the ‘Computers’ app icon from the home screen
    1. In your Library, select Photos & choose your album
    2. Select ‘Set as Screensaver’ in the top right & select ‘Yes’
  3. You’re done

That hopefully wasn’t too complicated to do. I wanted to post this since it wasn’t clear to me from googling whether you could save photos to your Apple TV or whether you always had to stream them via Home Sharing.

As for the Apple TV, it feels like Apple Watch territory. Something that is nice to have, but nowhere near necessary. Their app stores are still early and widespread developer support is uncertain.

Use NSLocalizedString for user facing strings

Just a quick iOS tip I wish I knew earlier. Would’ve saved me some time & headache:

Always wrap user-facing strings with NSLocalizedString.

Even if you don’t plan to localize your app into any other languages, there is immense utility in being able to easily review all of the strings that a user will see. And if localization is in the cards, it’s significantly easier to NSLocalize your strings as you go along the first time than to try to find all of them after-the-fact.

via http://nshipster.com/nslocalizedstring/

The advice is sound. It won’t cost you any time to use NSLocalizedString the first time, but it will help save time if/when you localize.
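
For reference, usage is a one-liner; a minimal sketch (totalLabel and the “Total” key are just placeholders):

// The key doubles as the returned string if no translation is found
totalLabel.text = NSLocalizedString("Total", comment: "Label shown next to the bill total")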

Visual iOS guide on switching from Chinese to English

I wanted to document for my own future reference how to switch an iPhone (in the iOS Simulator) back to English from Chinese. My Chinese literacy is not very high, so I’m sure this guide will be handy.

Note: follow the red ovals and you will be able to change your iOS language back to English

[Screenshots 1–8: each step in Settings, with the control to tap circled in red]

Why would I need to do this? I’m working on language localization for different apps, and testing localization in the simulator is often easier than on actual hardware.

iOS Universal iPad Portrait Gotcha

I wanted to share a small tip that reinforces the necessity of on-device (non-simulator) testing.

While finalizing my latest iOS app (universal for both iPhone & iPad), I found an issue through manual QA on an actual iPad. I had only left Portrait checked in the Project > General section of Xcode, but my app was somehow running in landscape mode on the iPad.

[Screenshot: the Project > General settings with only Portrait checked under Device Orientation]

Confused as to why it was rendering in both landscape & portrait mode on my iPad, I found a handy Stack Overflow post.

For one reason or another, you have to update your Info.plist to only specify portrait settings for iPads. Below is my Info.plist after I updated it to only target Portrait mode.

[Screenshot: Info.plist updated to target only Portrait mode on iPad]
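
In source (XML) form, the relevant entries look roughly like this; I’m reconstructing them from Apple’s documented key names rather than pasting the actual plist, so treat it as a sketch:

<key>UISupportedInterfaceOrientations</key>
<array>
    <string>UIInterfaceOrientationPortrait</string>
</array>
<key>UISupportedInterfaceOrientations~ipad</key>
<array>
    <string>UIInterfaceOrientationPortrait</string>
</array>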

It’s confusing that the Device Orientation checkboxes in the project’s General section are not sufficient to force portrait-only orientation and that you also have to update the Info.plist.

Through simulator testing alone, it’s not likely that I would have caught this portrait vs. landscape issue. I relied mostly on my primary iPhone and copious amounts of simulator testing for the other iOS universal devices.

For the highest level of quality control, you would need an iPhone 4s, iPhone 5, iPhone 6, iPhone 6+, iPad, and iPad Pro. That’s a lot of devices and I certainly don’t have all of those. Sidenote: if you do have all those devices, you would also be positioned to record App Preview videos for all devices natively (AKA lots of work).

Tip Solver launches

This week, my future-inspired tip calc, Tip Solver, launched on the App Store. Not only does it help you calculate your tip, it also helps you solve for how much you are really tipping.


As a tip calc, it’s easy to use and hopefully sleek/easy on the eyes. A key feature of this tip calculator is that it allows you to solve for your tip %. When you adjust the total or tip, you can see how much you actually tipped right away. This comes in handy when you pay $100 instead of $95, and so on.

While it’s customary to go from top to bottom when looking at a receipt (for the items ordered, tax, tip, and total lines), I felt that it was better to invert the direction (to use a bottom-to-top approach). The reason is that your thumbs are often better able to reach the bottom (not the top) of the device. I wanted the most common action (setting the bill amount) to be in the easy-to-reach thumb zone.

Please check out my app if you have a moment. It’s free (with ads) and works on iPhones & iPads. If you have any feedback, you can let me know at rexfeng@gmail.com

https://itunes.apple.com/us/app/tip-solver-premier-gratuity/id1130814051