Tag Archives: iphone

Apple Vision Pro Demo Impressions

I tried out the Apple Vision Pro (AVP) hardware in an Apple Store today. The ball was in Apple’s court: I really wanted the hardware to impress me and push me over the edge to pick one up and develop apps for it. I’ve released Day 1 apps for the Apple Watch & ARKit (iOS 11), and I believe in the future of AR for productivity.

Not Sharp

Unfortunately, when I wore the AVP, the content (text, images, etc.) was not razor sharp. I could use the device and navigate the OS without an issue, but I was expecting next-gen sharpness on the AVP displays.

My guess is I could try different distances (closer or farther) from the displays to find the sweet spot where all the content is sharp and crisp, and probably try different light seals too. Still, I couldn’t achieve the level of sharpness I expect from any 2020s device (phone, HD monitor, etc.).

Also, there was an opening at the bottom of the headset (towards my nose). I thought the light seals were supposed to block light 360 degrees around the headset, not leave a small gap. According to the Apple rep, that gap is normal.

Demo

Apple did a great job with the demo. The demo was seated (smart) and focused on VR content, not passthrough use cases.

The OS (visionOS) was simple to use. Pinching to drag things around or press buttons worked fine with hand gestures. However, when I tried to resize windows (from the bottom corners) or pull content apart with two hands, certain apps simply wouldn’t respond.

Content (2D vs 3D)

My imagined ideal use case would be having several large macOS screens in front of me to get work done. However, Apple’s marketing seems focused on entertainment (a big TV in front of you) as the selling point.

The problem (in my opinion) is that the content was not great. Spatial content, shot on what I presumed were iPhone 15 Pro Maxes or AVPs, seemed low resolution to me. Enlarging an iPhone photo to fill an entire wall of your room doesn’t work that well; it lacks detail. Even the panorama (shot on iPhone? I’m not sure) didn’t hold up when viewed at such a large size.

Part of the demo included immersive environments. The environments were impressive since they were built natively for the device and rendered in 3D. Viewing a photo from within the Moon environment was great, since the nearby 3D rocks on the ground really sell the illusion.

I personally felt the other content fell apart. Spatial videos (shot with an iPhone 15 Pro Max?) were fun, but they didn’t feel immersive to me: moving around lacked the convincing parallax you get from viewing things in everyday life.

While internet AVP users seem to enjoy viewing 2D movies on a giant virtual screen, I think there is a huge opportunity for companies to build 3D immersive environments or games for users to be in (and interact with). Using the AVP’s state-of-the-art hardware to view 2D images is like watching television without sound: a missed opportunity.

Takeaway

Despite the hardware issues (I suspect the light seal), I’d be interested in making AR apps for the AVP. However, paying almost $4K to buy what is effectively a dev kit is a tough sell for an indie developer. I honestly think Apple should have a program that lets developers borrow AVPs to build apps.

Manage Sandbox Account on iPhone (iOS 14)

While working with in-app purchases (IAPs) on iOS 14, I was looking for the Sandbox Account management section in the Settings app. Older guides online reference “iTunes & App Stores”, but it has since moved.

From what I’ve read online, it seems like the first step is to attempt an IAP on your iPhone while running your development build (via Xcode). You’ll be prompted to sign in, so use your App Store Connect Sandbox Tester credentials.
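If your purchase flow isn’t wired up yet, here’s a minimal sketch of triggering an IAP with StoreKit so that the sign-in prompt appears. The product identifier below is a placeholder (use one from your own App Store Connect configuration), and a real app would also register an SKPaymentTransactionObserver to finish the transaction.

import StoreKit

// Minimal sketch: fetch a product and add a payment so iOS asks for an Apple ID.
// On a development build, signing in here with Sandbox Tester credentials is
// what creates the Sandbox account on the device.
final class SandboxPurchaseTrigger: NSObject, SKProductsRequestDelegate {
    private var request: SKProductsRequest?

    func start() {
        request = SKProductsRequest(productIdentifiers: ["com.example.myapp.pro"]) // placeholder ID
        request?.delegate = self
        request?.start()
    }

    func productsRequest(_ request: SKProductsRequest, didReceive response: SKProductsResponse) {
        guard let product = response.products.first else { return }
        SKPaymentQueue.default().add(SKPayment(product: product))
    }
}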

Afterwards, I was able to find Sandbox at the following location:
* open Settings app
* scroll down & select ‘App Store’
* scroll down & find ‘SANDBOX ACCOUNT’

Here, you can manage or sign out of your Sandbox Apple ID.

How to install iOS 13 beta (for developers)

With an Apple developer account, you can install the iOS 13 beta. Warning: back up your existing device before installing the iOS 13 beta, and “install only on systems you are prepared to erase if necessary.”

From the Apple Developer Download page, you can find the Restore Images. Find your device in the list at https://developer.apple.com/download/#ios-restore-images-iphone-new and download the .ipsw file.

Next, use iTunes and follow Apple’s guide (Installation Using the Restore Image):

  • Make sure you are running the latest version of iTunes on your Mac.
  • Open iTunes on your Mac.
  • Connect your iOS device to your computer with the cable that came with your device.
  • If you’re prompted for your device passcode or to Trust This Computer, follow the onscreen steps. If you forget your passcode, help is available.
  • Select your iOS device when it appears in iTunes.
  • In the Summary panel, hold the Option key and click the Check for Update button.
  • Select the iOS beta software restore image and click Open to start the installation.
  • After installing the beta, your device will reboot and will require a network connection to complete activation.

Follow the on-screen instructions and you’ll have the iOS 13 beta running on your device.

Note: I ran into the error “Can’t install the software because it is not currently available from the Software Update server”. After my iPhone fully booted up, I was able to use the iOS 13 beta despite that error message.

Overall, the experience wasn’t too bad. This is a developer beta and not intended for widespread public distribution. If you are interested in the public beta for iOS 13, you can sign up here https://beta.apple.com/sp/betaprogram/

iOS Developer iPhone (Dec 2018)

As an indie iOS app developer, I find that keeping up with Apple’s hardware, from Macs to iPhones and iPads, can get expensive fast.

I’m focusing on native iPhone apps and currently use an iPhone 7 as my daily driver. I’m considering getting a new iPhone and want to find the right balance between 1) a phone size I want to use daily and 2) a phone that is optimal for App Store Connect previews (videos) & screenshots.

Since I’m an AR app developer, having iPhone hardware is essential (the Simulator doesn’t cut it).

Looking at the state of iPhone hardware today (Dec 2018), some quick Googling shows that the iPhone 7, 7+/8+, and X form factors are the most common in the US.

Apple’s app preview & screenshot guidelines tell us that the 5.5-inch form factor (iPhone 8+, etc.) is required for screenshots and recommended for app previews.

For 2019, my guess is that supporting the 5.5-inch (8+) and 5.8-inch (X/XS) form factors on App Store Connect would give me the most bang for my buck. It would be nice to have both an XS & XS Max to test with, but that’s way out of my budget.

Curiously enough, the app preview video resolutions are the same across the X line (X, XS, XS Max, XR) at 886 x 1920 pixels in portrait. The video resolution is larger, 1080 x 1920 pixels, for the Plus line (8+, etc.).

Using App Store Connect, I was able to manually verify the screenshot upload resolutions for the iPhone XS Max (1242 × 2688), iPhone XS (1125 × 2436), and iPhone 8+ (1242 × 2208). It seems like there is no point in taking or uploading iPhone XR screenshots.

In summary, the iPhone plus (8+, etc) line is the most important for app previews & screenshots. After that, the XS & XS Max (in that order) will give you more App Store Connect coverage with diminishing returns.

iOS 12 Siri Shortcuts

The latest update (v1.3.6) of my iPhone app, Power Focus, passed app review today! The App Store review turnaround is amazing nowadays. Super quick.

My app includes minimal iOS 12 Siri Shortcuts support. The WWDC session shows there are two ways to add Siri Shortcuts support: NSUserActivity & Intents. I went with the former since I didn’t need custom Siri UI.

Of the different online resources, Anton’s Medium post was really helpful, as it covers the essentials of using NSUserActivity. With NSUserActivity, the important parts are donating shortcuts during app usage and handling them in your App Delegate. That’s it.
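For reference, here’s a rough sketch of both pieces. The activity type string and phrases are placeholders rather than my actual identifiers, and the activity type would also be listed under NSUserActivityTypes in Info.plist.

import UIKit
import Intents

// Build the activity that describes the key action.
func makeStartTimerActivity() -> NSUserActivity {
    let activity = NSUserActivity(activityType: "com.example.powerfocus.startTimer") // placeholder
    activity.title = "Start Focus Timer"
    activity.isEligibleForSearch = true
    activity.isEligibleForPrediction = true              // iOS 12: lets Siri suggest it as a Shortcut
    activity.suggestedInvocationPhrase = "Start focusing"
    return activity
}

// Donate it from the relevant screen: assigning to userActivity donates the
// activity while that view controller is frontmost, e.g.
//   self.userActivity = makeStartTimerActivity()

// Handle it in your App Delegate when the user runs the Shortcut:
func application(_ application: UIApplication,
                 continue userActivity: NSUserActivity,
                 restorationHandler: @escaping ([UIUserActivityRestoring]?) -> Void) -> Bool {
    guard userActivity.activityType == "com.example.powerfocus.startTimer" else { return false }
    // Navigate to the timer screen and start it here.
    return true
}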

Implementing minimal iOS 12 Siri Shortcuts support was painless, and I’d recommend using NSUserActivity in your app to inform iOS when key actions occur.

How to use child View Controllers in Swift 4.0 programmatically

I’ve just released my Learn to read Korean app for iPhone. It uses a number of child View Controllers on the home screen. While child View Controllers are not a new thing, they were a new experience for me, and I highly recommend them for reducing the clutter in your View Controllers.

Here’s how my home screen is laid out:

The main view controller consists of a vertical UIScrollView with multiple horizontally scrolling UICollectionViews below it. While it’s possible to do it all in one massive View Controller, it’s much better to delegate UICollectionView events to their individual child View Controllers.

The good news is that using child UIViewControllers is super easy. You can use your Storyboard or do it programmatically in your UIViewController files. I opted for the latter as I find it easier to reproduce across Xcode projects.

All you need to do to add a child View Controller is shown below. I’ve included an optional constraints section.

// Create the child VC
let childVC = UIViewController()

// Add the child VC to the parent
self.addChildViewController(childVC)

// Add the child VC's view to the parent's view hierarchy
self.view.addSubview(childVC.view)

// Notify the child VC that the move to the parent is complete
childVC.didMove(toParentViewController: self)

// Optional: set up Auto Layout constraints
// (heroView and height come from my own layout and will differ in yours)
childVC.view.translatesAutoresizingMaskIntoConstraints = false
childVC.view.topAnchor.constraint(equalTo: heroView.bottomAnchor).isActive = true
childVC.view.leftAnchor.constraint(equalTo: self.view.leftAnchor).isActive = true
childVC.view.widthAnchor.constraint(equalTo: self.view.widthAnchor).isActive = true
childVC.view.heightAnchor.constraint(equalToConstant: height).isActive = true

With multiple child VCs (each handling its own UICollectionView events), the code base becomes manageable. In each child View Controller, you can handle customization, such as background color, UILabels, UIButtons, etc.
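To make that concrete, here’s a minimal sketch (my own simplified naming, not the exact code from the app) of a child View Controller that owns its UICollectionView and keeps the data source and delegate callbacks to itself:

import UIKit

final class SectionViewController: UIViewController, UICollectionViewDataSource, UICollectionViewDelegate {
    private let items: [String]
    private let collectionView: UICollectionView

    init(items: [String]) {
        self.items = items
        let layout = UICollectionViewFlowLayout()
        layout.scrollDirection = .horizontal
        self.collectionView = UICollectionView(frame: .zero, collectionViewLayout: layout)
        super.init(nibName: nil, bundle: nil)
    }

    required init?(coder aDecoder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    override func viewDidLoad() {
        super.viewDidLoad()
        view.backgroundColor = .white
        collectionView.dataSource = self
        collectionView.delegate = self
        collectionView.register(UICollectionViewCell.self, forCellWithReuseIdentifier: "Cell")
        collectionView.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(collectionView)
        NSLayoutConstraint.activate([
            collectionView.topAnchor.constraint(equalTo: view.topAnchor),
            collectionView.leftAnchor.constraint(equalTo: view.leftAnchor),
            collectionView.rightAnchor.constraint(equalTo: view.rightAnchor),
            collectionView.bottomAnchor.constraint(equalTo: view.bottomAnchor)
        ])
    }

    // UICollectionView events stay inside this child VC
    func collectionView(_ collectionView: UICollectionView, numberOfItemsInSection section: Int) -> Int {
        return items.count
    }

    func collectionView(_ collectionView: UICollectionView, cellForItemAt indexPath: IndexPath) -> UICollectionViewCell {
        return collectionView.dequeueReusableCell(withReuseIdentifier: "Cell", for: indexPath)
    }

    func collectionView(_ collectionView: UICollectionView, didSelectItemAt indexPath: IndexPath) {
        print("Selected \(items[indexPath.item])")
    }
}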

Another tip I have is to use UIView’s convert(_:to:) method as necessary. You may need to get a child subview’s position relative to your parent View Controller’s view (such as for a UIViewControllerTransitioningDelegate). The code for that is simple too:

// Contrived example: a label inside the child VC whose frame we need in the parent's coordinate space
let label = UILabel()
// Convert the label's own bounds into the parent VC's view coordinates
// (parentVC is the containing view controller, e.g. self.parent from inside the child)
let frameInParent = label.convert(label.bounds, to: parentVC.view)

That’s all I wanted to share for today. Don’t be afraid of using child View Controllers to break up your massive View Controllers!

Supporting the iPhone X with Storyboard

There are a ton of guides out there for updating your app(s) to support the iPhone X.

If you create your views programmatically, you can use iOS 11’s safeAreaLayoutGuide. If your app still targets iOS 10 or below, you can wrap it in the availability condition #available().
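For example, inside a view controller, a top constraint might look roughly like this (contentView is just a stand-in for whatever view you are pinning):

// Pin a view below the top safe area on iOS 11+, falling back to the
// top layout guide on iOS 10 and earlier.
if #available(iOS 11.0, *) {
    contentView.topAnchor.constraint(equalTo: view.safeAreaLayoutGuide.topAnchor).isActive = true
} else {
    contentView.topAnchor.constraint(equalTo: topLayoutGuide.bottomAnchor).isActive = true
}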

With Storyboards, one thing I appreciate is that Apple made the safe area layout guide backwards deployable.

Apple told us in WWDC 2017 Session 412 that Storyboards using safe areas are backwards deployable. This means you can switch to using the safe area layout guide in Interface Builder even if you still target iOS 10 and older.

via https://useyourloaf.com/blog/safe-area-layout-guide/

I don’t always use the storyboard for my layouts, but for apps that I need to update, this backwards deployability helps a lot.

Visual iOS guide on switching from Chinese to English

I wanted to document for my own future reference how to switch an iPhone (in the iOS Simulator) back to English from Chinese. My Chinese literacy is not very high, so I’m sure this guide will be handy.

Note: follow the red ovals in the screenshots below and you will be able to change your iOS language back to English.

[Screenshots 1–8: navigating Settings (设置) → General (通用) → Language & Region (语言与地区) → iPhone Language → English]

Why would I need to do this? I’m working on language localization for different apps, and testing localization in the simulator is often easier than on actual hardware.

iOS Universal iPad Portrait Gotcha

I wanted to share a small tip that reinforces the necessity of on-device (non-simulator) testing.

While finalizing my latest iOS app (universal for both iPhone & iPad), I found an issue through manual QA on an actual iPad. I had left only Portrait checked under Device Orientation in the Project > General section of Xcode, but my app was somehow running in landscape mode on the iPad.

[Screenshot: Xcode Project > General settings with only Portrait checked under Device Orientation]

Confused as to why it was rendering in both landscape & portrait mode on my iPad, I found a handy Stack Overflow post.

For one reason or another, you have to update your Info.plist to specify only portrait orientations for iPad. Below is my Info.plist after I updated it to target only Portrait mode.

[Screenshot: Info.plist with the iPad orientation entries limited to Portrait]

It’s confusing that the Device Orientation checkboxes in the project’s General section aren’t sufficient to force Portrait-only and that you also have to update the Info.plist.
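As an aside (and not what I ended up doing here), if you’d rather enforce this in code than in the Info.plist, the App Delegate can also return the app-wide orientation mask at runtime. A rough sketch:

// In the App Delegate: if implemented, this method takes precedence over the
// Info.plist orientation entries (the result is still intersected with each
// view controller's supportedInterfaceOrientations).
func application(_ application: UIApplication,
                 supportedInterfaceOrientationsFor window: UIWindow?) -> UIInterfaceOrientationMask {
    return .portrait
}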

Through simulator testing alone, it’s not likely that I would have caught this portrait vs. landscape issue. I relied mostly on my primary iPhone and copious amounts of simulator testing for the other universal iOS devices.

For the highest level of quality control, you would need an iPhone 4s, iPhone 5, iPhone 6, iPhone 6+, iPad, and iPad Pro. That’s a lot of devices, and I certainly don’t have all of them. Side note: if you do have all those devices, you would also be positioned to record App Preview videos for every device natively (AKA lots of work).

iOS Simulator

When previewing a site at iPhone resolution (1136 x 640), I wasn’t satisfied with using my iPhone 5 or the Window Resizer Chrome extension.

It turns out it’s super simple to use the iOS Simulator bundled with Xcode on OS X (10.8.5).

Select Xcode > Open Developer Tool > iOS Simulator


You’ll be presented with a simulated iPhone to navigate. Open Safari and you can browse to a site like localhost:3000 for development.

 