Tag Archives: apple

Apple Vision Pro Demo Impressions

I tried out the Apple Vision Pro (AVP) hardware in an Apple store today. The ball was in Apple’s court: I really wanted the hardware to impress me and push me over the edge to pick one up and develop apps for it. I’ve released Day 1 apps for the Apple Watch & ARKit (iOS 11), and I believe in the future of AR for productivity.

Not Sharp

Unfortunately, when I wore the AVP, the content (text, images, etc.) was not razor sharp. I could use the device and navigate the OS without an issue, but I was expecting next-gen sharpness on the AVP displays.

My guess is that I could try different distances (closer or farther) from the displays, and probably different light seals, to find the spot where all the content is sharp and crisp. As it was, I couldn’t achieve the level of sharpness I expect from any 2020s device (phone, HD monitor, etc.).

Also, there was an opening at the bottom of the headset (towards my nose). I thought the light seals were supposed to block light 360 degrees around the headset, not leave a small gap. According to the Apple rep, that gap was normal.

Demo

Apple did a great job with the demo. The demo was seated (smart) and focused on VR content, not passthrough use cases.

The OS (visionOS) was simple to use. Pinching to drag windows around or press buttons worked fine with hand gestures. However, when I tried to resize windows (from the bottom corners) or pull content apart with two hands, certain apps simply wouldn’t respond.

Content (2D vs 3D)

My imagined ideal use case would be having several large macOS screens in front of me to get work done. However, Apple marketing seems to be focused on entertainment (a big TV in front of you) as its selling point.

The problem (in my opinion) is that the content was not great. Spatial content, shot on what I presume were iPhone 15 Pro Maxes or AVPs, looked low resolution to me. Enlarging an iPhone photo to fill your entire room’s wall doesn’t work that well; it lacks detail. Even a panorama (shot on iPhone? not sure) didn’t hold up when viewed at such a large size.

Part of the demo included immersive environments. The environments were impressive since they were built natively for the device and rendered in 3D. Viewing a photo from within the moon environment was great since the nearby 3D rocks on the ground really sold the illusion.

I personally felt the other content fell apart. Spatial videos (shot with an iPhone 15 Pro Max?) were fun, but they didn’t feel immersive to me since moving around lacked the convincing parallax you get from viewing things in everyday life.

While internet AVP users seem to enjoy viewing 2D movies on a giant, virtual screen, I think there is a huge opportunity for companies to build 3D immersive environments or games for users to be in (and interact with). Using the AVP’s state-of-the-art hardware to view 2D images is like watching television without sound – a missed opportunity.

Takeaway

Despite the hardware issues (I suspect the light seal), I’d be interested in making AR apps for the AVP. Paying almost $4K to buy a dev kit and develop for Apple is a tough sell for an indie developer. I honestly think Apple should have a program for developers to borrow AVPs and build apps.

Manage Sandbox Account on iPhone (iOS 14)

While working with iOS 14 and IAPs, I was looking for the Sandbox Account management section in iOS Settings. Older guides online reference “iTunes & App Stores”, but it is somewhere else now.

From what I’ve read online, the first step is to attempt an IAP on your iPhone running your development build (with Xcode). You’ll be prompted to sign in, so use your App Store Connect Sandbox Tester credentials.
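
If you need something to kick off that sign-in prompt, a minimal StoreKit sketch like the one below works. The product identifier is a placeholder for whatever IAP you’ve configured in App Store Connect, and a real app would also register an SKPaymentTransactionObserver to finish transactions.

```swift
import StoreKit

// Minimal sketch: trigger a purchase so iOS prompts for the Sandbox sign-in.
// "com.example.pro_upgrade" is a placeholder product identifier.
final class PurchaseTrigger: NSObject, SKProductsRequestDelegate {
    private var request: SKProductsRequest?

    func start() {
        let request = SKProductsRequest(productIdentifiers: ["com.example.pro_upgrade"])
        request.delegate = self
        request.start()
        self.request = request // keep a strong reference while the request runs
    }

    func productsRequest(_ request: SKProductsRequest, didReceive response: SKProductsResponse) {
        guard let product = response.products.first else { return }
        // Adding the payment shows the purchase sheet (and the Sandbox sign-in prompt).
        // A real app would also observe the payment queue and finish transactions.
        SKPaymentQueue.default().add(SKPayment(product: product))
    }
}
```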

Afterwards, I was able to find Sandbox at the following location:
* open Settings app
* scroll down & select ‘App Store’
* scroll down & find ‘SANDBOX ACCOUNT’

Here, you can manage or sign out of your Sandbox Apple ID.

How to install iOS 13 beta (for developers)

With an Apple developer account, you can install the iOS 13 beta. Warning: you should back up your existing device before installing the iOS 13 beta, and “install only on systems you are prepared to erase if necessary.”

From the Apple Developer Download page, you can find the Restore Images. Find your device in the list at https://developer.apple.com/download/#ios-restore-images-iphone-new and download the .ipsw file.

Next, use iTunes and follow Apple’s guide (Installation Using the Restore Image):

  • Make sure you are running the latest version of iTunes on your Mac.
  • Open iTunes on your Mac.
  • Connect your iOS device to your computer with the cable that came with your device.
  • If you’re prompted for your device passcode or to Trust This Computer, follow the onscreen steps. If you forget your passcode, help is available.
  • Select your iOS device when it appears in iTunes.
  • In the Summary panel, hold the Option key and click the Check for Update button.
  • Select the iOS beta software restore image and click Open to start the installation.
  • After installing the beta, your device will reboot and will require a network connection to complete activation.

Follow the on-screen instructions and you’ll have the iOS 13 beta running on your device.

Note: I ran into an error: “Can’t install the software because it is not currently available from the Software Update server”. I waited for my iPhone to fully boot up, and I’m able to use the iOS 13 beta despite that error message.

Overall, the experience wasn’t too bad. This is a developer beta and not intended for widespread public distribution. If you are interested in the public beta for iOS 13, you can sign up here: https://beta.apple.com/sp/betaprogram/

iOS Developer iPhone (Dec 2018)

As an indie iOS app developer, keeping up with Apple’s hardware, from the Mac to the iPhones and iPads, can get expensive fast.

I’m focusing on native iPhone apps and currently use an iPhone 7 as my daily driver. I’m considering getting a new iPhone and want to find the right balance between 1.) a phone size I want to use daily and 2.) a phone that is optimal for App Store Connect previews (videos) & screenshots.

Since I’m an AR app developer, having iPhone hardware is essential (the Simulator doesn’t cut it).

Looking at the state of iPhone hardware today (Dec 2018), some quick Googling shows that the iPhone 7, 7+/8+, and X form factors are the most common in the US.

When we look at Apple’s app preview & screenshot guidelines, the 5.5 inch form factor (iPhone 8+, etc.) is required for screenshots and recommended for app previews.

For 2019, my guess is that supporting the 5.5 inch (8+) form factor and the 5.8 inch (X/XS) on App Store Connect would give me the most bang for my buck. It would be nice to have both an XS & XS Max to test with, but that’s way out of my budget.

Curiously enough, the app preview video resolutions are the same across the X line (X, XS, XS Max, XR) at 886 x 1920 pixels in portrait. The video resolution is larger, 1080 x 1920 pixels, for the Plus line (8+, etc.).

Using App Store Connect, I was able to manually verify the different screenshot upload resolutions for the iPhone XS Max (1242 × 2688), iPhone XS (1125 × 2436), and iPhone 8+ (1242 × 2208). It seems like there is no point in taking or uploading iPhone XR screenshots.

In summary, the iPhone plus (8+, etc) line is the most important for app previews & screenshots. After that, the XS & XS Max (in that order) will give you more App Store Connect coverage with diminishing returns.

My Aging MacBook Situation

My personal daily driver is a 2011 MacBook Air (MBA). I’ve shipped 6 iPhone apps from it. For a computer bought in 2011, I’m happy with how long it has lasted.

I am interested in buying a new MacBook Pro (MBP) to replace my aging MacBook Air, but I’m not sure what to do. The possible choices I see are:

1.) the current MBP (June 2017 version)
2.) wait ? months for an updated MBP (most likely a minor CPU refresh)
3.) a 2015 MBP version (older hardware style with IMHO better keyboard)

Reasons to upgrade sooner:
* Xcode runs poorly on my MBA. Storyboard, Simulator, and Playgrounds are barely usable.
* macOS Mojave will not run on my MacBook Air. It’s only a matter of time before I’m locked out of macOS & Xcode updates.
* Apple announced a Keyboard Service Program.
* As a professional software developer, I can easily justify 2-3 year upgrade cycles.
* My MBA is showing its age; the battery is virtually gone.

Reasons to upgrade later:
* My MBA is able to run Xcode 9 (current) and will hopefully run Xcode 10 GM.
* Buying after a new hardware refresh (most likely a minor CPU bump) maximizes how long the purchase stays current. This may not be rational, but it’s a factor nonetheless.
* My iPhone app development is primarily dependent on iPhone hardware updates & Xcode, not my Mac.
* Indecision – since none of the current MBP options (2015 or 2017) are very appealing, I can wait it out.

Reasons that don’t make a difference:
* I don’t like typing on the current generation MBP keyboard, but the next significant MBP hardware refresh is probably a few years away (too long).
* USB-C – I’ve found a Multi-Port Adapter (dongle) that works for me.

Inconclusion

In retrospect, I should have bought a decently equipped 2015 MBP in 2015.

If Xcode 10 GM doesn’t work on my Mac, then I’ll be forced to buy a new Mac right away. Otherwise I will wait around hoping Apple decides to update the MBP.

Intro to Computer Vision

I’m new to computer vision, and a lot of the basic concepts are very interesting. As an iOS developer, my interest comes from using CoreML & Apple’s Vision framework in apps to improve the user experience.

Two common tasks are classification and object detection. Classification detects the dominant objects present in an image. For example, classification can tell you that a photo is probably of a car.

Object detection is much more difficult since it not only recognizes what objects are present, but also detects where they are in the image. This means that object detection can tell you that there is probably a car within these bounds of the image.

What’s important is that the machine learning model runs in an acceptable amount of time, either asynchronously in the background or in real time. Apple provides a listing of sample classification models at https://developer.apple.com/machine-learning/.
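
As a rough sketch of what image classification looks like with Vision, here’s how I’d run one of those sample models asynchronously. MobileNet stands in for whichever classification model you download; Xcode generates the MobileNet class when you add the .mlmodel file to your project.

```swift
import UIKit
import CoreML
import Vision

// Minimal sketch: classify a UIImage in the background with a bundled Core ML model.
func classify(_ image: UIImage, completion: @escaping (String?) -> Void) {
    guard let cgImage = image.cgImage,
          let model = try? VNCoreMLModel(for: MobileNet().model) else {
        completion(nil)
        return
    }

    let request = VNCoreMLRequest(model: model) { request, _ in
        // Classification returns labels for the whole image, ranked by confidence.
        let best = (request.results as? [VNClassificationObservation])?.first
        completion(best.map { "\($0.identifier) (\(Int($0.confidence * 100))%)" })
    }

    DispatchQueue.global(qos: .userInitiated).async {
        let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
        try? handler.perform([request])
    }
}
```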

For real-time object detection, TinyYOLO is an option, even if the frame rate is not near 60 fps today. Other real-time detection models like YOLO or R-CNN are not going to provide a sufficient experience on mobile devices today.

One other interesting thing I came across is the PASCAL Visual Object Classes (VOC). These are common objects used for benchmarking object classification.

For 2012, the twenty object classes that were selected are:

  • Person: person
  • Animal: bird, cat, cow, dog, horse, sheep
  • Vehicle: aeroplane, bicycle, boat, bus, car, motorbike, train
  • Indoor: bottle, chair, dining table, potted plant, sofa, tv/monitor

These are common objects used to train classification models.

Computer vision combined with machine learning has a tremendous amount of potential. Whether used with AR or other use cases, it can provide a compelling user experience beyond Not Hotdog.

Supporting the iPhone X with Storyboard

There are a ton of guides out there for updating your app(s) to support the iPhone X.

If you create your views programmatically, you can use iOS 11’s safeAreaLayoutGuide. If your app still targets iOS 10 or below, you can guard it with the availability condition, #available().
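
Here’s a minimal sketch of what that looks like in code; the label is just a stand-in view.

```swift
import UIKit

class SafeAreaViewController: UIViewController {
    let label = UILabel()

    override func viewDidLoad() {
        super.viewDidLoad()
        label.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(label)

        // Pin to the safe area on iOS 11+, fall back to the top layout guide on iOS 10.
        if #available(iOS 11.0, *) {
            label.topAnchor.constraint(equalTo: view.safeAreaLayoutGuide.topAnchor, constant: 16).isActive = true
        } else {
            label.topAnchor.constraint(equalTo: topLayoutGuide.bottomAnchor, constant: 16).isActive = true
        }
        label.centerXAnchor.constraint(equalTo: view.centerXAnchor).isActive = true
    }
}
```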

With the Storyboard, one thing I appreciate from Apple is making the safe area layout guide backwards deployable.

Apple told us in WWDC 2017 Session 412 that Storyboards using safe areas are backwards deployable. This means you can switch to using the safe area layout guide in Interface Builder even if you still target iOS 10 and older.

via https://useyourloaf.com/blog/safe-area-layout-guide/

I don’t always use the storyboard for my layouts, but for apps that I need to update, this backwards deployability helps a lot.

Multiple UIDynamicAnimators

In past apps, I tended to have one UIDynamicAnimator in my ViewController and that was that. UIDynamicAnimator lets you apply UIKit Dynamics effects to your UIViews.

The issue that I ran into was that removeBehavior(_:), which “Removes a specified dynamic behavior from a dynamic animator”, didn’t seem to work. I would keep track of specific UIDynamicBehavior instances and pass them as the argument to removeBehavior(_:), but it didn’t appear to remove the behavior.

What does work is calling removeAllBehaviors() on the UIDynamicAnimator. This is fine if you only have one UIView, but most likely you have multiple UIViews & behaviors, and calling removeAllBehaviors() on your only animator isn’t a good idea: it could leave other UIViews frozen out of place.

Recently, I released a fun weekend app, Fun Faces. While browsing Stack Overflow, it occurred to me to use multiple UIDynamicAnimators, one for each UIView I wanted to animate. This worked for my use case, since calling removeAllBehaviors() on one animator doesn’t interrupt the other UIViews’ behaviors (if any).
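
Here’s a rough sketch of that approach; the snap behavior and the view names are placeholders for whatever you’re animating.

```swift
import UIKit

class FacesViewController: UIViewController {
    // One animator per animated view, keyed by the view it drives.
    private var animators: [UIView: UIDynamicAnimator] = [:]

    func addSnap(to pieceView: UIView, at point: CGPoint) {
        // Reuse the view's animator if it already has one.
        let animator = animators[pieceView] ?? UIDynamicAnimator(referenceView: view)
        animators[pieceView] = animator
        animator.addBehavior(UISnapBehavior(item: pieceView, snapTo: point))
    }

    func stopBehaviors(for pieceView: UIView) {
        // Only resets this view's behaviors; views driven by other animators keep going.
        animators[pieceView]?.removeAllBehaviors()
    }
}
```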

Using multiple UIDynamicAnimators isn’t the answer if your UIViews need to interact with each other through UICollisionBehavior or similar effects, since those views have to share the same animator.

How to transfer photos to Apple TV (4th gen) and use photos as screensaver

Here is a quick guide for saving photos on your 4th generation Apple TV so that you can use them as a screensaver (without constantly having your computer on for Home Sharing).

  1. On your computer, open iTunes and turn on Home Sharing
    1. File > Home Sharing
    2. If applicable, select ‘Choose Photos to Share with Apple TV…’
  2. On your Apple TV, select the ‘Computers’ app icon from the home screen
    1. In your Library, select Photos & choose your album
    2. Select ‘Set as Screensaver’ in the top right & select ‘Yes’
  3. You’re done

That hopefully wasn’t too complicated. I wanted to post this since it wasn’t clear to me from googling whether you could save photos to your Apple TV or had to always stream them via Home Sharing.

As for the Apple TV, it feels like Apple Watch territory. Something that is nice to have, but nowhere near necessary. Their app stores are still early and widespread developer support is uncertain.

WKInterfacePicker Attributes Illustrated

The WatchKit Framework’s WKInterfacePicker has some level of customization in Interface Builder.

As of this writing, there are 4 main attributes to configure:

  • Style
  • Focus
  • Indicator
  • Enabled

Style is the most important attribute since it heavily influences the type of picker. The options for Style are List, Stack, and Sequence. List is the standard iOS-style 3D scrolling list of text. Stack lets you flip through images as if they were a deck of cards. Sequence lets you move between images without any intermediate transition effect. For a good look at each, I’d recommend Big Nerd Ranch’s blog post.

Focus presents an outline around the currently focused/selected picker. This is helpful if you have multiple WatchKit pickers or multiple elements for selection (think “Customize” for your watch face). The options are None, Outline, and Outline with Caption. The last option, “Outline with Caption”, comes into effect if your Style is Stack (or possibly Sequence); see an example here.

[Screenshot: Plain, without focus]

[Screenshot: Left picker is focused]

Indicator has two options: Disabled or Shown While Focused. The documentation wasn’t very clear: “A value indicating whether the picker uses an indicator to convey context about the number of picker items and which item is selected.” As far as I can tell, Disabled means the standard look, and the “Shown While Focused” adds a scroll bar helper on the right side of the watch.

[Screenshot: Indicator – Shown While Focused]

Enabled is a simple on/off option. Toggling it doesn’t change the picker’s appearance; it controls whether the user can interact with the picker. This attribute can also be set or unset programmatically with -setEnabled.
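
For reference, here’s a minimal sketch of configuring a picker and its Enabled state in code; the outlet, item titles, and action name are placeholders for your own setup.

```swift
import WatchKit

class PickerInterfaceController: WKInterfaceController {
    @IBOutlet var picker: WKInterfacePicker!

    override func awake(withContext context: Any?) {
        super.awake(withContext: context)

        // Build the picker's items; the titles are what the List style displays.
        let items: [WKPickerItem] = ["Small", "Medium", "Large"].map { title in
            let item = WKPickerItem()
            item.title = title
            return item
        }
        picker.setItems(items)
        picker.setEnabled(true) // same switch as the Enabled checkbox in Interface Builder
    }

    // Action connected to the picker in Interface Builder; fires as the Digital Crown turns.
    @IBAction func pickerChanged(_ value: Int) {
        print("Selected index: \(value)")
    }
}
```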

Hopefully this quick explanation of the different WKInterfacePicker attributes helps you out. The Digital Crown (AKA the dial) & the WKInterfacePicker provide an extremely powerful, convenient input method for your watch app users.