Learning to develop more accessible iOS games

Sun, 22 Feb 2026

I started my journey in iOS accessibility about 9 years ago, while working at the BBC. I even dared to give a talk about it at App Dev Con in 2018. Years later, I’m pretty sure I only managed to overcome my terror of public speaking thanks to the Dunning-Kruger effect. I was at that point where you’ve just started learning about something, become passionate about it, and vastly overestimate how much you actually know. It only takes digging a little deeper to realise how vast any topic is, and that you’ll probably never master it, but can only hope to be in a continuous process of learning. And accessibility is no different. Not to discourage anyone! The truth is that a few basic tools in your toolbox can take you a very long way towards offering a very good experience. As someone once said, we are not in need of experts, but in need of basic knowledge.

Illustration of an Apple Watch on the left wrist displaying a retro racing game. The right hand is using the Digital Crown as the control for the game. An arrow points to the watch with the name of the game: RetroRapid! 
About the same time, the Apple Watch had been around for a few years, and I thought it would be cool to do something with it. There were not that many good games for it, I thought, so I decided to start making one. It would also be the perfect opportunity to start learning about an aspect of accessibility I hadn’t explored before: video game accessibility.

And so, RetroRapid! was born. A game inspired by a cheap old handheld LCD video game I had when I was a kid. There are three lanes, cars come your way, and you have to move left or right to overtake them without crashing. Simple enough to be playable on the Apple Watch, and simple enough for a newbie in the field to explore ideas for making it accessible. It is a long story, but it turns out I just released that game. It had been in a drawer for all those years (about 8?!) and, unexpectedly, I released it last week.

I’m very much in the process of exploring ways to make the game more accessible, but I thought it would be cool to share some of my first approaches. I should really do more research on the topic and learn from actual experts. So far, my approach has been the same thought process I would have followed for any app, plus having observed the truly inspirational journey of a really awesome indie game: Art of Fauna.

Controls

The first question I asked myself was: how many different ways can I give users to control this game?

One of my principles when approaching accessibility is that the user experience should be abstracted to provide multiple input and output mechanisms. Here, we’ll focus on the input mechanisms. When thinking about accessibility for iOS apps, one of the first things that might come to mind is VoiceOver. That’s a great example of this. VoiceOver allows for an alternative input mechanism: a series of special gestures to explore and interact with apps. And output: the device speaking to you, instead of relying on the visuals on the screen.

For a game, providing a multitude of control mechanisms feels essential.

As mentioned, this was always going to be a Watch-first game. So the first thing I wanted to try was whether it was possible to control the game with the Digital Crown. I also wanted to support tap and swipe controls (more on this later), but the Digital Crown felt like a must because it lets you play without covering the screen, and the posture of the hand resting on it feels more natural than interacting with touch controls on such a small screen. In SwiftUI, the rotation can be read with the digitalCrownRotation modifier:

// The view reading the crown must be focusable for this to take effect.
.focusable()
.digitalCrownRotation(
    $crownValue,        // bound state driving the car's position
    from: -100,
    through: 100,
    by: 0.1,
    sensitivity: .low,  // low sensitivity avoids accidental lane changes
    isContinuous: true, // the crown can keep rotating past the bounds
    isHapticFeedbackEnabled: true
)
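To get a feel for how this could drive the game, here is a minimal sketch of mapping the accumulated crown value to one of the three lanes. The names and thresholds are illustrative, not the game's actual code:

```swift
enum Lane: Int, CaseIterable {
    case left, middle, right
}

// Split the crown's -100...100 range into three equal bands, one per lane.
func lane(forCrownValue value: Double) -> Lane {
    switch value {
    case ..<(-33.3): return .left
    case ..<33.3: return .middle
    default: return .right
    }
}
```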

But just because it is Watch-first doesn’t mean this game shouldn’t be playable on the iPhone. I want it to be truly multi-platform. And the iPhone has many advantages when it comes to alternative input methods. One of them is that it is possible to support a one-handed input mechanism. And so, I thought it would be cool to control the game with swipe gestures. Swipe left moves the car to the left, and swipe right moves it to the right. Simple! But it works!
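A sketch of that swipe logic, assuming the horizontal translation comes from something like a SwiftUI DragGesture (the threshold value is made up for illustration):

```swift
enum Move { case left, right, none }

// A swipe only counts once its horizontal travel passes a threshold,
// so small accidental touches don't move the car.
func move(forTranslationWidth width: Double, threshold: Double = 20) -> Move {
    if width <= -threshold { return .left }
    if width >= threshold { return .right }
    return .none
}
```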

Of course, I should also support taps. And I thought, why restrict the tap gesture to a small area? Let’s make half the screen a giant left button, and the other half a giant right button. I’d still add graphics to provide the affordance, and this would make it super fun and easy to play holding the device in landscape mode (more on this later).
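The two-giant-buttons idea boils down to comparing the tap's x position with the screen's midpoint. A hedged sketch, with illustrative names rather than the game's implementation:

```swift
enum TapZone { case left, right }

// Any tap on the left half of the screen moves left; the right half moves right.
func zone(forTapX x: Double, screenWidth: Double) -> TapZone {
    x < screenWidth / 2 ? .left : .right
}
```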

The last control that I implemented is the keyboard. Bringing the game to iPadOS, and with more and more of these devices being attached to covers with keyboards, it just made sense that you could simply use the left/right arrow keys to move the car too.

override var keyCommands: [UIKeyCommand]? {
    let left = UIKeyCommand(input: UIKeyCommand.inputLeftArrow, modifierFlags: [], action: #selector(handleLeft))
    let right = UIKeyCommand(input: UIKeyCommand.inputRightArrow, modifierFlags: [], action: #selector(handleRight))
    // On iOS 15 and later, the focus system claims the arrow keys by default,
    // so these commands ask for priority over the system behaviour.
    [left, right].forEach { $0.wantsPriorityOverSystemBehavior = true }
    return [left, right]
}

And I’m not done! Video game controllers will be next. And I have to give making it accessible with Switch Control and Voice Control a shot too.

And what about VoiceOver? Yes! That was the first thing that I thought about when starting the game. Just as it happens with any app. Let’s talk modes.

Modes

After giving some thought to controls, it was time to think about how to make the game perceivable.

When creating apps, we should not rely on a single mode, like color, to convey information. Instead, we should use a combination of modes (shape, text, sound, haptics…) so everyone can perceive that information. So what about games, which have such a strong visual component? Audio is just as important an aspect of games as the visuals! I figured: could the game be made playable using audio cues? I gave each lane a note: C for the left lane, E for the middle, and G for the right. The idea was that, even with your eyes closed, you could build a mental map of the road. I implemented three alternative audio cue modes:

  1. Lane pulse: plays a note for each safe lane you can move to (and an equal period of silence for an unsafe lane).
  2. Arpeggio: plays a sequence of notes, one for each safe lane.
  3. Chord: plays the notes for all safe lanes at once.

Then I had to provide audio cues for lane changes too, and I decided to offer three styles there as well: lane confirmation, which plays the note of the lane you moved to; success, which plays a characteristic sound letting you know whether you moved to a safe or unsafe lane; and a combination of both.
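A sketch of the lane-note mapping and how the safe-lane cues could be derived from it (illustrative types; the real game plays actual audio):

```swift
enum Note: String { case c = "C", e = "E", g = "G" }

// Left, middle, and right lanes mapped to their notes.
let laneNotes: [Note] = [.c, .e, .g]

// The notes to play for the currently safe lanes: one after another
// for the arpeggio style, or all at once for the chord style.
func cueNotes(safeLanes: [Bool]) -> [Note] {
    zip(laneNotes, safeLanes).compactMap { note, isSafe in isSafe ? note : nil }
}
```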

And what about the input mechanism for VoiceOver? The classic select-and-double-tap interaction wouldn’t cut it in a game where you need to act fast. So I decided to allow direct touch, so the gestures are the same as when VoiceOver is not enabled.

.accessibilityDirectTouch(true, options: [.silentOnTouch])

The number of options comes from a place of not knowing what would be most intuitive, but also from a place of giving choice. Another principle of mine when approaching accessibility is that customization is at its core: it is about meeting users where they are. And it doesn’t seem like giving choice would harm anyone.


We build things for users, and whenever we can, we should get them involved in the process. So I posted on the AppleVis forum to request feedback. The community was extremely helpful and supportive! The consensus was that the approach could work, but that a tutorial was needed to make it clearer from the start. I took that on board and implemented one covering the game, the controls, and the audio cues. The results have been encouraging. Some users are already scoring a fair number of overtakes, which suggests this approach might actually be viable.

Settings

As mentioned, customization is at the very core of accessibility. Not all options work for all users, and what might be useful for someone might make the experience very hard for others.

When I was a kid, the sound of the game was essential. The repetitive beep was super useful for getting into the rhythm of the game and knowing exactly when to move the car to avoid crashing. But what if someone can’t perceive audio (or finds it annoying or unbearable)? It felt like a job for haptic feedback (back to the Modes idea!). If you think about it, most users probably favour haptics over sounds these days. But what if haptics are very distracting for someone? Simple: let’s add a toggle so they can disable them.

// The feedback generators are created once and reused (assumed setup).
private let notificationGenerator = UINotificationFeedbackGenerator()
private let lightImpactGenerator = UIImpactFeedbackGenerator(style: .light)
private let mediumImpactGenerator = UIImpactFeedbackGenerator(style: .medium)

/// A distinctive error haptic when the car crashes.
public func triggerCrashHaptic() {
    notificationGenerator.notificationOccurred(.error)
}

/// A subtle tick for grid updates.
public func triggerGridUpdateHaptic() {
    lightImpactGenerator.impactOccurred()
}

/// A slightly stronger tap when the player changes lane.
public func triggerMoveHaptic() {
    mediumImpactGenerator.impactOccurred()
}

And I also wanted to provide a volume slider for the sound effects, and a font selector because the retro font I chose seems like it might be difficult to read for many users.

Some other settings are more specific. Like the VoiceOver announcements that let you know when the speed is about to increase (it happens every 100 points). But I can see how the announcement might take you out of the game once you know the deal.


private func announceSpeedIncreaseIfNeeded(oldValue: Bool, newValue: Bool) {
    // Only announce on the false-to-true transition of the flag.
    guard oldValue == false, newValue else { return }
    // Respect the user's setting for in-game announcements.
    guard inGameAnnouncementsEnabled else { return }
    UIAccessibility.post(
        notification: .announcement,
        argument: GameLocalizedStrings.string("speed_increase_announcement")
    )
}

And some settings don’t have toggles. I wanted to support Dark Mode, Dynamic Type, and very important: portrait and landscape orientations!

Levels (& language)

Finally, I first developed the game at a speed that felt fun to me. But what is fun for me might be incredibly tricky for some people. So I added three speeds so that more people can enjoy the game.

My first instinct was to call those levels easy/medium/hard, but I immediately thought that that naming could be very unfair. Easy might still be very difficult for some users, so it feels like an inappropriate name. Language is important! I might not have found the perfect names (naming things turns out to be very complicated). But for now, I settled on Cruise/Fast/Rapid.
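The three speeds could be modelled as a simple enum that keeps the user-facing names separate from the logic (the multiplier values here are made up for illustration):

```swift
enum GameSpeed: Double, CaseIterable {
    case cruise = 0.75
    case fast = 1.0
    case rapid = 1.5

    // The user-facing name, free of "easy/hard" judgements.
    var displayName: String {
        switch self {
        case .cruise: return "Cruise"
        case .fast: return "Fast"
        case .rapid: return "Rapid"
        }
    }
}
```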

Again, it’s a small thing, but language shapes how people feel before they even start playing.

Wrapping up

And that’s it! I have more plans for the future. But that’s what I’ve done so far, and I hope you found it interesting.

These are just some of the thoughts and approaches I’ve followed. Take them with a pinch of salt. I’m not an expert at all, but thought it would be interesting simply as an example of my thought process when I approach the accessibility of any app in general. I promise I’ll learn from others’ experiences, and will come back with an update in the future.

My biggest surprise has been that my general understanding of accessibility seems to have brought me quite far. I genuinely think the most important thing is learning to ask yourself the right questions. The code, as these snippets show, tends to follow naturally from there. I hope this post shows a bit of that. And unsurprisingly, making the game more accessible makes it better for everyone. Some people have praised the haptics implementation, for example. My intention was to provide an alternative for those who can’t experience audio, but it turns out others appreciated it too.

If you have any feedback, comments, questions, etc., please hit me up on the socials anytime. Thanks for reading!

Where to find the game?

For now, if you’d like to play the game, you can find it in the App Store for iOS, iPadOS, and watchOS. macOS and tvOS should come soon, and I would love to port it to visionOS too, even if it’s unlikely I’ll ever get to test it on a real device.

You can play 5 times a day for free, and unlimited play costs £2.99. But there is a special launch offer, and you can get it for £0.99 with the code ARCTICCONF till 28th Feb 2026. And if you enjoy it, an App Store review genuinely makes a difference in helping others discover it. It’s one of the best ways to support an indie developer.

Advent of iOS Accessibility

Fri, 06 Dec 2024

Starting something new: The Advent of iOS Accessibility. Twenty-four days of exploring some of the most common accessibility issues I’ve encountered, how to identify them, and, most importantly, how to fix them. Hope you enjoy the series!

Illustration of Santa Claus above a calendar with 24 days. The first day in the calendar is crossed. At the bottom, some text that says: Advent of iOS Accessibility.

Day 1

One of the accessibility issues I see most often in iOS apps, believe it or not, is unlabelled elements. This happens especially with buttons that have an icon but no title. In those cases, you need to configure an accessibility label manually.

Calendar of the Advent of iOS Accessibility. Day 1. Buttons with no labels (especially buttons with an icon, but no title). Button with a bell, that intends to let the user see the notifications. If the button has no accessibility label, VoiceOver could say things like: button, bellIcon, possibly notification… far from ideal! A good accessibility label would be “Notifications”. In UIKit, you can just assign a localized string to the accessibilityLabel property for the button. In SwiftUI, you can use the accessibilityLabel modifier and pass in a Localized String Key.

Day 2

Some recommendations for improving your accessibility labels: don’t add the element type, avoid redundancy and verbosity, localize…

Calendar of the Advent of iOS Accessibility. Day 2. Better accessibility labels. A music player with some examples of how to improve your accessibility labels. Don’t add the element type: for the like button, if the label is “Like button”, VoiceOver will say “Like button, button”. So just “Like” is perfectly fine. Avoid redundancy and verbosity. For the repeat button, the label “Repeat This Is the Last Time by Keane” would be too long, and if we specified the song and band for every action, it would be too verbose. Localize. For the next track button, try to use a localized string instead of a plain string so VoiceOver speaks in the correct language. Start with a capital letter and don’t end with a period. This helps with VoiceOver’s inflection.

Day 3

Anything representing a heading in the app should have the header trait. It allows for a faster way of exploring a screen and quickly jumping to the part of the app you are interested in. Screens should also start with a header.

Calendar of Advent of iOS Accessibility. Day 3. Missing header trait for headings. BBC News app shows two sections, Technology and Science & Environment. These sections consist of a horizontally scrollable carousel of at least 5 elements. If the titles of the sections have the header trait, a single swipe down brings you from one section to the other, compared to the 6 swipes to the right needed otherwise. At the top of the app, there’s the BBC logo. Screens should start with a heading, so the logo could also have the header trait. In UIKit, you can insert the header accessibility trait into the accessibilityTraits property of the UI element representing the heading. In SwiftUI, you can use the accessibilityAddTraits modifier and pass isHeader as a parameter.

Day 4

Important information is often conveyed visually through icons, badges, or progress bars. These details can easily be overlooked. Please make sure they’re part of your UI components’ accessibility labels or values for a more inclusive experience.

Calendar of Advent of iOS Accessibility. Day 4. Visual cues missing from VoiceOver’s announcement. Four examples. The first one is a button with a person silhouette and a checkmark. VoiceOver announces it as “Collaborate, button”. It is missing the fact that you are already collaborating with someone, represented by the checkmark. The second example shows a tab with the title “Coming soon” and a badge with the number 6. VoiceOver announces it as “Coming soon, Tab 2 of 5”. It is not conveying that there are 6 new movies and shows, represented by the badge. The third example shows a chart’s title: “Daily Average 1h 55m”, an arrow pointing down, and “55% from last week”. VoiceOver says “Daily average, one hour fifty-five minutes. 55% from last week”. The user wouldn’t know whether it went up or down, represented by the arrow pointing down. Fourth example, episode download. The stop button is surrounded by a circular progress bar. VoiceOver says “Stop downloads, button” but not how much is left.

Day 5

Images should either be decorative or have a proper accessibility label or alt text that describes them. If they’re decorative, you can make it so they get skipped by assistive tech, so they don’t get in the way of the experience.

Calendar of Advent of iOS Accessibility. Day 5. Images: with a description or decorative. iOS app with Disney+ showing the “How I Met Your Mother” screen. The header has two images, one of them in the background, with the characters, that could be considered decorative. The other one is an image with the title of the series, which needs a description. In UIKit, images are not accessible by default, so for images that need to be accessible, you need to make them accessible elements. In SwiftUI, images are accessible by default. There is an Image constructor with a decorative parameter, where you pass the name of the image.

Quick clarification on this one. It is probably not the best example of decorative images. I think these images should have alt text. But in my experience, most APIs won’t give you one for movie posters, music artwork, and things like that… And the point is that a random name just adds noise.

And another nuance: VoiceOver has a feature called Image Explorer. VoiceOver can recognise text, people, and objects in images. So if you think users can get value from this feature, you might want to consider exposing the image.

Oh! And avoid images of text please!

Day 6

With regular buttons from UIKit or SwiftUI, you are all set. With complex views, headings, or table/collection view cells that, when selected, take the user somewhere else in the app or perform an action, you’ll have to add the button accessibility trait yourself to convey to users that it is an interactive component.

Calendar of Advent of iOS Accessibility. Day 6. Missing button accessibility trait. Four examples of not-so-obvious elements that require the button trait to be added by developers. First example, the collection view cells representing shortcuts in the Shortcuts app. Runs the shortcut when selected. Second example, the table view cell with a post in the Mastodon app. It opens the detail view and thread when selected. Third example, the What to Watch header in the Apple TV+ app. It can be selected and it will open the What to Watch screen. Fourth example, a view that is more complex than a regular button like a notification in Notification Centre that opens the notification.

Day 7

Grouping elements, when it makes sense, can make a huge impact on easing navigation with some assistive technologies like VoiceOver, Switch Control, or Full Keyboard Access. It also helps reduce redundancy.

Calendar of Advent of iOS Accessibility. Day 7. Grouping elements for easier navigation. There is an example using the Foursquare app. There is a list of restaurants. Each restaurant has a name, type, location, and rating. By default, you’d need four swipes to the right with VoiceOver to go from one restaurant to the next. That’s 32 swipes to go through 8 restaurants. If we group all these, it is just a single swipe to go from one item to the next, easing navigation a lot. There is another example showing Nextdoor. The post in the example would require 9 swipes, so you can see how things can quickly get worse for more complex views. In that case, each post has More Options, Like, Reply, and Share buttons that would repeat for every single item, causing lots of redundancy.

In UIKit, this process consists of three steps:

  1. Setting the parent view as an accessible element: https://iosdev.space/@dadederk/109693895401281036
  2. Configure relevant accessibility label, value, traits and hints on the parent view: https://iosdev.space/@dadederk/109806337876803727
  3. If there are secondary actions within the grouped view, configure custom actions: https://iosdev.space/@dadederk/1097016

In SwiftUI, it might be as simple as using the .accessibilityElement(children:) modifier with the .combine accessibility child behaviour: https://iosdev.space/@dadederk/109932750048041110 If that doesn’t quite work as expected, try tweaking the internal views. If not, you can fall back to UIKit’s three-step process.

Day 8

If a view has isAccessibilityElement set to true, assistive tech won’t look for any of its subviews. That means that if there are any buttons inside, they won’t be accessible. You can add custom actions to be able to ‘interact’ with them.

Calendar of Advent of iOS Accessibility. Day 8. Inaccessible buttons within accessible views. Example of the Mastodon app. If we make the table view cell representing a post an accessible element so it is easier to navigate, the subviews, including the buttons, won’t be accessible anymore. So how can a VoiceOver, Keyboard, Voice Control, or Switch Control user interact with them? Custom actions to the rescue. In SwiftUI there is a new modifier, since iOS 16, called accessibilityActions. Before, you had to add the actions one by one. In UIKit, there is an accessibilityCustomActions property, which is an array of UIAccessibilityCustomAction.

Day 9

If you have interactions that are hidden, require complex gestures, or may conflict with VoiceOver’s gestures, you need to provide alternative ways of executing those actions. Custom actions can help a lot of the time, but not always.

Calendar of Advent of iOS Accessibility. Day 9. Hidden actions not being accessible. Find alternative ways of triggering interactions that are hidden, require complex gestures, or use gestures that conflict with VoiceOver’s. Two examples. The first one is swiping left on a table view cell to unveil a delete option. That could be fixed by adding delete as a custom action. The second example is a bottom sheet menu with a grid of options. You need to swipe down to dismiss it. You’d probably need to implement the escape gesture for VoiceOver, and it would be a good idea to add a close button anyway.

Day 10

Toggles or UISwitches are often found separated from the label that precedes (and describes) them; with an unclear label; missing a value, trait, or hint; or even not actionable at all.

Calendar of Advent of iOS Accessibility. Day 10. Bad experience with toggles. Very often, toggles or UISwitches are not grouped together with the label preceding them, or lack the right values, traits, and hints. Three examples. With the first one, VoiceOver just says “Shared accommodation” and it is not even actionable. The second one focuses first on the label and then on the switch. VoiceOver says “Event cancellations” and then “Switch button, on, double-tap to toggle setting”. The third one does a similar thing but treats the switch as a button. VoiceOver says: “Like” and then “Button”. All quite confusing. With UIKit you can configure your UISwitch as the accessoryView for a table view cell. With SwiftUI you can use a named Toggle that lets you specify an associated label.

Day 11

Have you ever seen VoiceOver randomly focusing on elements of the previous view when presenting a custom modal view? That can be fixed by letting the system know that the presented view is modal in terms of accessibility.

Calendar of Advent of iOS Accessibility. Day 11. Confusing custom modals. There is a groceries app. It has some text that says “Sorry, we don’t deliver there yet” a search field, a search button, a text that says “or” and a login button. When presenting a modal view on top of it, VoiceOver still focuses through the elements of the previous screen. To avoid that with UIKit, you can set accessibilityViewIsModal to true for the modal view. With SwiftUI there is an accessibility trait for that, .isModal.

Day 12

Sometimes we may fail to inform users of changes on the screen in a perceivable way. Toasts and the like should be announced. We may want to make clear that some content on the screen changed, or give updates on a task’s progress.

Calendar of Advent of iOS Accessibility. Day 12. Failing to update the user of changes. Three examples over the Apple TV app on the Ted Lasso page. The first one has a HUD saying “Episode added to Up Next”. Toasts, snackbars, and HUDs can be conveyed to VoiceOver users using accessibility announcements. The second one is a drop-down to select the season, which changes the episodes visible in the carousel underneath it. A layout-changed notification can help with changing content on the screen. The third one shows an episode with a download indicator. The updates frequently trait could help with that.

Day 13

Sometimes, you may want to create a custom component even if there is a similar one in UIKit or SwiftUI, because you want to style it in a way the default one won’t allow, or add extra functionality. That’s fine, but please take into account that you may need to do a bit of extra work to make it accessible.

Calendar of Advent of iOS Accessibility. Day 13. Custom components can need a bit of work. If you develop a custom tab bar, you need to set the tab bar accessibility trait on the container, then set up the container type with the semantic group value, then give it a localised string, “Tab bar”, as an accessibility label. Also, you need to set up the large content viewer by setting show large content viewer to true and adding the interaction to the buttons… If you use SwiftUI, the .accessibilityRepresentation(representation:) modifier can help you create more accessible custom components.

Day 14

iOS and Xcode provide a wide variety of tools and options for dealing with color, and they help us provide good color contrast ratios: from system colors that automatically support Increase Contrast, to high contrast (and light and dark mode) color asset variants, automatic checks with the Audit feature in the Accessibility Inspector, and even a built-in contrast calculator.

Calendar of Advent of iOS Accessibility. Day 14. Color contrast. Some of the tools available in Xcode that help you create apps with great contrast are the asset catalog, which lets you provide high contrast variants for color sets and assets, and the color contrast calculator. Some examples show how text size also affects contrast ratios. White on black passes for all sizes. A grey with a 4.5 to 1 ratio over black will fail on small text sizes. A grey with a 2.9 to 1 ratio over black fails for all text sizes. You can also use system colors and make sure that your app works well with the High Contrast setting.
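For reference, the ratio these tools compute is defined by WCAG 2.x. A minimal sketch of the relative luminance and contrast ratio formulas (this is the standard WCAG math, not Xcode's internal code):

```swift
import Foundation

// Relative luminance of an sRGB color, channels in 0...1 (WCAG 2.x).
func relativeLuminance(r: Double, g: Double, b: Double) -> Double {
    func linearize(_ c: Double) -> Double {
        c <= 0.03928 ? c / 12.92 : pow((c + 0.055) / 1.055, 2.4)
    }
    return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b)
}

// Contrast ratio between two luminances: (lighter + 0.05) / (darker + 0.05).
func contrastRatio(_ l1: Double, _ l2: Double) -> Double {
    (max(l1, l2) + 0.05) / (min(l1, l2) + 0.05)
}
```

White on black works out to the maximum possible ratio of 21:1, which is why it passes for all text sizes.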

Day 15

Touch target sizes are recommended to be at least 44 x 44 points for better usability. Buttons in the navigation bar (especially when not using nav bar button items), dismiss buttons, and custom toolbars, are common examples that often fall below this size.

Calendar of Advent of iOS Accessibility. Day 15. Small touch target sizes. Three examples of common cases with buttons that tend to have small touch target areas. The first one is buttons in the navigation bar. The second one is dismiss buttons for modal or inline popups. The third one is toolbars, especially custom ones.
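One common fix is to keep the control's visuals as they are but grow its hit area to the 44-point minimum. A sketch of that calculation (in UIKit you might use the resulting rect in a point(inside:with:) override; the names here are illustrative):

```swift
import Foundation

// Grows a control's hit area to at least `minimum` points per side,
// centred on the original bounds. Rects already big enough are untouched.
func expandedHitArea(for bounds: CGRect, minimum: CGFloat = 44) -> CGRect {
    let dx = min(0, (bounds.width - minimum) / 2)
    let dy = min(0, (bounds.height - minimum) / 2)
    return bounds.insetBy(dx: dx, dy: dy) // negative insets expand the rect
}
```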

Day 16

A reminder that the more modes we use to convey important information, the more likely it is that all users will perceive it. Consider a combination of color, icons, messages, sound, haptics, animations, etc.

Calendar of Advent of iOS Accessibility. Day 16. Multimodal information. One example shows an app where you need to introduce a 6-digit pin. When it is the wrong pin, it does a shake animation on the pin field. It is using one mode: animation. If Reduce Motion is enabled, or for VoiceOver users, this information will not be perceived. The second example shows the same app, but it adds a warning icon and a message that says “Incorrect pin” in red, and the device also vibrates and plays a sound to indicate the error. It uses several modes: animation, iconography, messaging, color, sound, and haptics. This is preferred.

Day 17

Check the traversal order of elements in your app. Sometimes the default top-left to bottom-right order might not be the most logical one. Sometimes you may consciously want to tweak the order. Other times, grouping is the answer.

Three examples of traversal order for three pieces of data in three columns. The first column says Followers with 550 underneath, the second Following with 340 underneath, and the third Posts with 750 underneath. In the first example, the order is: Followers, Following, Posts, 550, 340, 750. This order is incoherent. In the second, the order is: Followers, 550, Following, 340, Posts, 750. This one has a logical order. The third one’s order is: Followers 550, Following 340, Posts 750, where the name and number for each piece of data are grouped. It is clearer, and the navigation is easier.

If you want to find out more, we had an interesting conversation on Bluesky with Paul J. Adam about the nuances of how to implement this.

Day 18

When building custom components, or if not relying on UIControl’s attributes to configure state, it can be easy to forget to specify the right accessibility traits. These are indispensable for a good experience with VoiceOver, Switch Control, Voice Control, Full Keyboard Access…

Calendar of Advent of iOS Accessibility. Day 18. Missing traits. Three examples. The first one shows a custom segmented control. These tend to be missing the selected trait for the selected option. The second one shows a custom page control. These should configure the adjustable trait. The third one shows a button that indicates its disabled state by changing the color of the text instead of setting isEnabled to false. This means the button will lack the not enabled trait.

Day 19

Accessibility labels might not be the best input labels, used for example to find or interact with elements with Voice Control or Full Keyboard Access. In those cases, you can provide accessibility user input labels.

Calendar of Advent of iOS Accessibility. Day 19. User input labels. One example shows Apple’s Podcast app in the 13 letters podcast page. There is a button in the top right corner with a + icon. The default label is “Follow”. That might not be everyone’s first guess on how that button is named. You could add “Plus” and “Add” as alternative labels. The second example is some episodes of that same podcast. The default label is very long consisting of a few details about each episode like the publication date and title. That might be a bit long for Voice Control. You can use just the title, or the word “Episode” as possible input alternatives.

To know more about accessibilityUserInputLabels, check out “Improving Accessibility: Voice Control” by Bas Broek.

Day 20

There is an option for the user to request an experience with Reduce Motion, and we should honor it. If your app has animations, make sure to check if the user has this setting on. Here are three examples where Apple does a great job.

Calendar of Advent of iOS Accessibility. Day 20. Reduce Motion. Three examples. In the App Store app, Apple sometimes has tiles with icons from apps or games that continuously move to give the user a sneak peek of all the icons. The second one is the Weather app, where there are quite realistic animations of the weather conditions in the background. The third one is the Stocks app. If you open one of your symbols, there is a banner at the top where your symbols keep automatically scrolling, mimicking what you see on TV. In all three cases, Apple completely stops the animations if Reduce Motion is on.

You can support Reduce Motion in UIKit and SwiftUI.
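The pattern boils down to collapsing or skipping animations when the preference is on. Here is a minimal sketch; the helper below is hypothetical, and in a real app the flag would come from UIAccessibility.isReduceMotionEnabled (UIKit) or the accessibilityReduceMotion environment value (SwiftUI):

```swift
// Hypothetical helper: collapse an animation's duration to zero when the
// user has Reduce Motion enabled. In a real app, read the flag from
// UIAccessibility.isReduceMotionEnabled (UIKit) or the
// \.accessibilityReduceMotion environment value (SwiftUI).
func animationDuration(reduceMotion: Bool, normal: Double = 0.3) -> Double {
    reduceMotion ? 0 : normal
}

print(animationDuration(reduceMotion: true))   // 0.0
print(animationDuration(reduceMotion: false))  // 0.3
```

Passing a zero duration (or choosing a cross-fade instead of a movement-based transition) keeps the state change visible without the motion.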

Day 21

There are a few accessibility settings you can check for, or get notified about in case these preferences change. This is especially important when developing custom components, as these settings mostly just work with native UIKit and SwiftUI controls.

Calendar of Advent of iOS Accessibility. Day 21. Honoring users’ settings. Display & Text Size Accessibility Settings Screen. There are two examples, Apple’s podcast app playing an episode of Swift by Sundell and FaceTime. There are arrows pointing from some of these settings to how they take effect in these apps. When Reduce Transparency is on, in FaceTime you don’t see the front camera being previewed and blurred in the background, instead, it is just opaque. When bold text is on, all text in both apps is bold. When Increase Contrast is on, the New FaceTime button is darker. When Button Shapes is on, the button for the podcast show or the playback speed (both consisting of just text with the app’s tint color) is underlined so it is easier to be identified as buttons. The sleep timer button, which is an icon, gets some borders around it too. When smart invert is on, the artwork of the podcast doesn’t get inverted.

Check out UIKit’s Accessibility Capabilities and SwiftUI’s Accessibility environment variables.
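The usual shape of this is observe-and-react: listen for the ‘did change’ notification and re-read the setting. The notification name below is a stand-in so the sketch is self-contained; in UIKit you would observe a real notification such as UIAccessibility.boldTextStatusDidChangeNotification and re-read UIAccessibility.isBoldTextEnabled in the handler:

```swift
import Foundation

// Sketch of the observe-and-react pattern. The notification name is a
// hypothetical stand-in; in UIKit you would observe e.g.
// UIAccessibility.boldTextStatusDidChangeNotification and re-read the
// corresponding setting (UIAccessibility.isBoldTextEnabled) here.
let settingDidChange = Notification.Name("settingDidChange")
var isBoldTextEnabled = false

let token = NotificationCenter.default.addObserver(
    forName: settingDidChange, object: nil, queue: nil) { _ in
    // Re-read the real setting and update your UI here.
    isBoldTextEnabled = true
}
_ = token  // keep the observer alive for as long as you need updates

NotificationCenter.default.post(name: settingDidChange, object: nil)
print(isBoldTextEnabled)  // true
```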

Day 22

Make sure you support Dynamic Type up to the largest text size available. Take into account that there are five extra accessibility sizes available from the Accessibility Settings. It can make a huge difference for lots of users.

Calendar of Advent of iOS Accessibility. Day 22. Dynamic Type Support. The Mastodon app in two sizes, the default one (large) and the fifth accessibility text size (the largest). Some things you can do to support Dynamic Type: support it even when using custom fonts, preserve assets’ vector data, scale icons and buttons too, adjust sizes automatically, tweak the layout if necessary, implement the large content viewer. You can test with the SwiftUI variants preview, with the Accessibility Inspector, in the simulator, using environment overrides…

Day 23

Sometimes your UI will just not scale optimally for large text sizes. Simple changes at accessibility sizes, like composing elements vertically instead of horizontally, reducing the number of columns, and allowing more lines of text, can do the trick most of the time.

Calendar of Advent of iOS Accessibility. Day 23. Adapt your UI. Example of Apple’s Stocks app. The first screenshot has the default text size. The second uses the largest possible text size. In the first one, the symbol and name are at the left of the row, there is a small graph in the middle, and the value and percentage change are to the right. In the second one, the symbol, name, value, and percentage change are at the left, taking most of the row width, and the only thing to the right is the graph. That gives plenty more horizontal space for the text.
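That kind of decision can be sketched as a simple function of whether an accessibility text size is active. The names below are hypothetical; in UIKit the flag comes from traitCollection.preferredContentSizeCategory.isAccessibilityCategory, and in SwiftUI from dynamicTypeSize.isAccessibilitySize:

```swift
// Hypothetical sketch: pick a layout axis depending on whether one of the
// five accessibility text sizes is active. In UIKit, read
// traitCollection.preferredContentSizeCategory.isAccessibilityCategory.
enum LayoutAxis { case horizontal, vertical }

func preferredAxis(isAccessibilityCategory: Bool) -> LayoutAxis {
    isAccessibilityCategory ? .vertical : .horizontal
}

print(preferredAxis(isAccessibilityCategory: true))   // vertical
print(preferredAxis(isAccessibilityCategory: false))  // horizontal
```

With UIStackView or SwiftUI stacks, switching the axis this way is usually a one-line layout change.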

Day 24

Test manually. Familiarise yourself with different assistive technologies. I find it useful to start with VoiceOver, but also check out Voice Control, Full Keyboard Access, and others. Remove friction; configuring shortcuts can help.

Calendar of Advent of iOS Accessibility. Day 24. Test Manually. Go to Settings, Accessibility, scroll to the bottom, and tap Accessibility Shortcut. Select the accessibility features you’d like in the accessibility shortcuts menu. VoiceOver, Switch Control, and Voice Control are selected in the example. Now triple-clicking the side button on your iPhone (or the home button if you still have one) will show an action sheet with the features selected previously.

Happy festive season everyone!

Thanks for following the series till the end! Here’s to more accessible iOS apps in 2025!

Popular logo used for representing accessibility consisting of a stick man inside a circle. But the stick man is dressed as Santa Claus and there is some text around it that says “Accessibility. Have you been naughty or nice?”.

You can also follow the series on:

]]>
Traits of a good accessible iOS app
https://accessibilityupto11.com/post/2021-01-21-01 — Wed, 20 Jan 2021 12:00:00 +0000

You may know that you can configure a UI component with an accessibility label. The accessibility label is the name of the component. You can also configure an accessibility trait. The accessibility trait is the role of the component; it gives the user information about how they can interact with it. When using VoiceOver, the trait is usually vocalised after the accessibility label.

Illustration with three trait examples. The first one is cells on a collection representing podcasts, which is a button. The second one is showing a section heading in a news app, “Technology”, which is a heading. The third one shows a rating component, from one to five thumbs up, which is adjustable.

Every UI control in your app has an accessibilityTraits property (which is part of the UIAccessibility protocol). Under the hood, it is a bitmask that defines which of the available accessibility traits best describe the UI component. Bear with me, it is easier than it sounds.

At the time of writing this post, there are 17 different traits that you can use to define your UI controls. Button, Selected, Not Enabled, Adjustable… are some examples. You can find the full list in Apple’s documentation. But it is not always straightforward to work out from the documentation what these traits really mean, or how they affect the user experience, so I thought it would be a good idea to write a bit about some of them.

You can combine more than one of these traits to define a single component. And for convenience when operating with accessibility traits in Swift, the property conforms to the OptionSet protocol, which in turn conforms to SetAlgebra. That means you can use operations like insert and remove to change the traits that define a component.
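To see that set algebra in action without UIKit, here is a minimal stand-in OptionSet; the Traits type is hypothetical, but UIAccessibilityTraits behaves the same way:

```swift
// Minimal stand-in OptionSet illustrating the same bitmask algebra that
// UIAccessibilityTraits uses under the hood. The Traits type is hypothetical.
struct Traits: OptionSet {
    let rawValue: UInt64
    static let button   = Traits(rawValue: 1 << 0)
    static let selected = Traits(rawValue: 1 << 1)
    static let header   = Traits(rawValue: 1 << 2)
}

var traits: Traits = .button
traits.insert(.selected)  // combine traits: now [.button, .selected]
traits.remove(.selected)  // back to just .button

print(traits.contains(.button), traits.contains(.selected))  // true false
```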

It might be easy to see why you would insert, or add, a trait. But what about removing them? Why would you do that? We’ll see an example with the ‘Selected’ trait.

And before we start, I think it is a good moment to mention that if you use UIKit components, a lot of times things will just work and no extra accessibility traits need to be configured. So please, rely on native UI components as much as you can.

Button

It lets the user know that, when the component has VoiceOver’s focus, they can interact with it by double-tapping anywhere on the screen. It also tells Switch Control that it is an interactive component. A UIButton has the button trait by default, so why would you ever need to give the button trait to a control?

One of the most common examples where the button trait is often missed is for some cells in table/collection views. Cells are, a lot of times, ‘buttons’ that trigger an action, like playing some music, or bringing the user to a different screen in your app.

cell.accessibilityTraits.insert(.button)

Two examples: 1.Podcasts in a Collection View (Grid). 2.Tracks in Table View. Both have the button trait configured.

There is no need to add the button trait if the cell has the disclosure indicator accessory type, though. In that case, iOS adds the trait for you… magic!

Settings screen. Cells have the disclosure indicator accessory type. No need to add the accessibility button trait.

There are also some apps that create custom button components from plain UIViews. One reason they might do that is to animate the button when it is tapped. In that case, the view needs the button trait configured manually.

Header

Probably one of the most important ones, as it often helps users navigate an app more intuitively and quickly.

Visually, headers help us quickly identify groups of information so we can jump straight to what interests us. VoiceOver users can do the same. Using the rotor, you can navigate through headings and jump right to the next/previous heading on the screen with a single swipe down/up.

To enable the rotor when VoiceOver is on, rotate two fingers on the screen, like trying to turn an invisible knob. The rotor will appear on screen, letting you choose between a number of navigation modes and customisations.

Drawing of how the VoiceOver Rotor looks when selecting navigation through Headings.

Imagine you are using VoiceOver to navigate through the screen represented in the picture. There is a topic cloud, with a header that says “Podcasts”, with 11 items on it. The next topic cloud is for Artists. If no headers are configured, the user will have to do 12 flicks to the right to get to artists. With headers configured, the user will get there with a single swipe down.

Screen with Podcasts heading and 11 topics followed by Artists heading  with 2 topics. 12 swipes to the right or 1 swipe down.

And to let the user navigate through the headings in your app, all you have to do is insert the .header trait into any label or view that represents a heading. That’s what I call a quick win!

Link

This one can be a bit confusing, especially if you come from the web development world, where there is a clear distinction between what a button and a link are; that distinction doesn’t always seem to apply in iOS apps. The button trait is usually configured for something that triggers an action, but also for something that brings the user somewhere else in your app. The link trait usually applies to something that opens some web content; it typically appears inline in the content, represented by underlined text.

Search Field

It differs from a text field in that it not only lets the user know they can type some text in, but also that typing will probably trigger changes, showing new results as they type.

Adjustable

For UI components that can change a value they’re holding. A good example is a rating component: its value can be adjusted to 1 thumb up, 2 thumbs up… 5 thumbs up. It is sometimes used in carousels too, so you can swipe up/down to select one of the items in the carousel and swipe right to jump to the next element on the screen. Most of the time, though, I think headings are a better solution for carousels. There is a good blog post about the topic, by Hannah Billingsley-Dadd, in the BBC’s Design & Engineering Medium publication.

Custom rating component. From 1 to 5 thumbs up. Swipe up to increase rating and swipe down to decrease it.

Adjustable is also a bit special, because just configuring the trait may not be enough. There is an extra step needed for it to work properly: you need to override and implement the accessibilityIncrement() and accessibilityDecrement() functions to specify the behaviour when the user swipes up/down to increment/decrement the value of the component.

override func accessibilityIncrement() {
  guard rating < maxRating else { return }
  rating = rating + 1
  accessibilityValue = "\(rating)"
  sendActions(for: .valueChanged)
}

override func accessibilityDecrement() {
  guard rating > 1 else { return }
  rating = rating - 1
  accessibilityValue = "\(rating)"
  sendActions(for: .valueChanged)
}

UISliders are adjustable by default, but you can override accessibilityIncrement() and accessibilityDecrement() to specify how much the value of the slider should change. Imagine a slider that goes from 0 to 1000: you probably don’t want to increment it 1 by 1, and 100 by 100 might be more reasonable.
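The stepping logic inside those overrides boils down to clamped addition. A sketch with hypothetical helper names for the 0–1000 slider:

```swift
// Hypothetical clamped-stepping helpers for a 0...1000 slider that moves in
// increments of 100. Inside accessibilityIncrement()/accessibilityDecrement()
// you would assign the result back to the slider's value and send a
// .valueChanged action.
func incremented(_ value: Double, by step: Double = 100, upTo maxValue: Double = 1000) -> Double {
    min(value + step, maxValue)
}

func decremented(_ value: Double, by step: Double = 100, downTo minValue: Double = 0) -> Double {
    max(value - step, minValue)
}

print(incremented(950))  // 1000.0 (clamped at the maximum)
print(decremented(50))   // 0.0 (clamped at the minimum)
```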

Selected

There are elements that can be selected or not. Cells in table views can be selected, for example. In those cases you can use the selected trait to let the user know that the element is selected. It is often used for buttons that act as toggles and have an on/off state.

An unfilled like button has the button trait. A filled like button has the button and selected traits.

But remember, if it gets unselected, you need to remove the selected trait too.

if likeButton.isSelected {
  accessibilityTraits.insert(.selected)
} else {
  accessibilityTraits.remove(.selected)
}

Updates Frequently

This one is useful for views that change their label or value while focus is on them, when you want to keep the user updated. I think a downloading progress bar is a good example. Without this trait, the user would just hear the value the component had when they landed on it. With this trait, VoiceOver polls for updates every now and then and announces any changes to the user.

Drawing of a downloading progress bar at 15%. VoiceOver can update the user at 5% and 10% when using updatesFrequently trait.

None

Not a very interesting one, just useful for resetting the accessibility traits of a component.

aView.accessibilityTraits = .none

And more!

Not enabled: It causes VoiceOver to say “dimmed” after the accessibility label, indicating to the user that the component is disabled and that they can’t interact with it at the moment.

Starts media session: If the purpose of a button is to start playing some music, or a video, this trait causes the VoiceOver announcement to stop as soon as you interact with it. Plays sound causes a similar effect.

Keyboard key: Can be used for custom keyboard keys, like a custom PIN keyboard with just numbers. Note that it stops VoiceOver from saying “button”, which might not be desirable.

Allows direct interaction: If you have a control that doesn’t play well with VoiceOver’s gestures, you can allow the user to interact with it directly. I imagine it is something similar to what happens when the user has the direct touch typing mode enabled: touching a key on the keyboard activates the key, instead of needing to select it first and then double-tap it.

For Interface Builder lovers

If you use Interface Builder to build the UI of your apps, you can configure the accessibility traits, like many other accessibility properties, in the Identity Inspector.

Interface Builder menu to configure accessibility properties such as label, hint, identifier, and traits.

It’s a wrap!

I hope this helps and that you’ve identified in this post some UI controls in your app that could be great candidates to get some of these traits configured. I’d like to hear from you. Would you like to share any examples from your app? Do you think I’ve missed anything important? Was there any accessibility trait in the post that you didn’t know about and may start using now? You can write your comments/questions/feedback in the Medium post or reach me on Twitter with the @dadederk handle. Thanks for reading!

If you are one of the lucky ones already using SwiftUI, you usually have equivalent attributes and modifiers to add and remove traits on an element. To know more, I totally recommend this blog post by Rob Whitaker.

What to Read Next

]]>
Tips for testing your iOS app’s accessibility
https://accessibilityupto11.com/post/2020-08-11-01 — Tue, 11 Aug 2020 18:30:00 +0000

The cycle for creating great accessible apps starts and ends with testing. Sure, it is always better to get feedback from users to create an experience that holds up to their expectations. But that’s not always possible. Until you can, I think the way to go is to do your best: build it as well as you can, then gather all the feedback you can get, apply the suggestions, test, repeat.

Illustration shows how to enable Accessibility Shortcuts in Settings. Then select your technologies from the list. Finally, triple click the side (or Home) button to see the Action Sheet menu with the technologies you selected before.

Therefore, you’ll probably have to start by self-auditing your app to find any potential issues or improvements, and after making any changes, check that your fixes worked as expected. I’m going to focus on manual testing in this article. It is possible to automate some bits, but to get a real sense of the experience you are offering, you have to try it yourself.

I think it is actually a very fun process. It is a great experience to use your app and device in ways you probably wouldn’t in your day-to-day life. It helps you look at your app through a different lens that will surely make you think about its UX from a different perspective, and maybe even give you some ideas on how to improve it.

But I know it is also a process that some people may perceive as complicated or difficult to approach at first. So I thought I’d share some of my favorite tips for testing the accessibility of the apps I develop.

The first thing for me is to make switching into ‘testing mode’ as easy and quick as possible. If it is easy to toggle VoiceOver, Voice Control, Switch Control, or any of the multiple assistive technologies iOS lets you create a shortcut for, you’ll find yourself testing more, and more often. To enable Accessibility Shortcuts on your device, go to Settings, Accessibility, scroll to the bottom, and tap Accessibility Shortcut. The final step is to select any of the options you’d like to have quick access to.

Example of the Accessibility Shortcuts menu.

From now on, they’ll be just a triple-click of the Home button (or the side button, if your device no longer has one) away. I recommend enabling more than one option, because actioning the shortcut will then show a menu with the options you enabled. If you don’t, you risk accidentally enabling VoiceOver in meetings. True story.

Control Center

Triple-clicking buttons is not always convenient, so you can also have quick access to Accessibility Shortcuts in Control Centre. Just go to Customize Controls in Control Centre’s Settings and make sure you add Accessibility Shortcuts to the included controls. I also like to have a Text Size control in Control Centre, and enable Larger Accessibility Sizes, to test my apps for Dynamic Type.

Accessibility Shortcuts menu in Control Centre. Text Size menu in Control Centre.

Screen Curtain

One of my favorite tricks. Do you think you can use your app even with your eyes closed? Yes? Are you sure? Let’s try it out. A triple-tap with three fingers on the screen, when VoiceOver is on, will let you navigate your app with the screen off. You know what your app looks like, so if you are struggling to complete some of the most common flows, it is probably time to go back to Xcode and improve your app’s accessibility.

Accessibility Inspector

It is, for some reason, a somewhat unknown tool, but it comes pre-installed with Xcode. You can use the Accessibility Inspector to quickly inspect accessibility properties in your simulator, or even run an audit that finds issues like small hit areas, contrast ratio problems, non-accessible elements, and more. Go to Xcode’s top menu, Open Developer Tools, Accessibility Inspector. Select the simulator and start inspecting, auditing, or modifying system settings in your app.

Accessibility Inspector: Inspect mode. Accessibility Inspector: Audit mode.

Environment Overrides

New in Xcode 11. It lets you test several accessibility configurations in the simulator, including Reduce Transparency, Bold Text, Smart Invert Colors, etc. It is similar to the settings you can find in the Accessibility Inspector, but right there in Xcode, and with even more options, like Dark Mode. You’ll find the Environment Overrides in the Debug Area’s toolbar, next to the Debug Memory Graph button.

Environment Overrides menu in Xcode.

Voice Control

Pro tip: you can use Voice Control to find issues with your accessibility labels. Turn it on and say “Show names”. You’ll see the accessibility labels of your actionable elements on screen. Handy, especially for finding mistakes such as buttons you forgot to add an accessibility label to.

Apple Maps app being used with Voice Control after saying “Show Names”.

Switch Control

You can test Switch Control by connecting a Bluetooth keyboard, or even by using your selfie camera and triggering the switch by moving your head! Go to Settings, Accessibility, Switch Control, Switches. You can configure, for example, a right head movement to Move To Next Item and a left head movement to Select Item.

Color Contrast Calculator

There are plenty of color contrast calculators online, but just as with the Accessibility Inspector, Xcode includes its own, so look no further. With the Accessibility Inspector open, go to Window in the top menu and click Show Color Contrast Calculator (or press Option+Command+C).

Color Contrast Calculator tool in Xcode.

VoiceOver Caption Panel

It is a great way to visualize what VoiceOver is saying. Not only is it good for testing, it can also be an alternative way to perform remote VoiceOver demos. Sharing VoiceOver’s audio in video calls can sometimes be challenging, but this way you, or the people on the call, can read the captions instead. To enable it, as usual, go to Settings, Accessibility, VoiceOver, and toggle on the Caption Panel option.

VoiceOver Caption Panel.

Automation and 3rd Party Frameworks

OK, I know I said I was going to focus on manual testing this time. But we all love some help, so here’s a bonus track. There are a few tools out there developed to help us create more accessible apps. They can help us catch issues early, like missing accessibility labels, text that doesn’t support Dynamic Type, color contrast issues, small interactive areas, etc. Here are some of the projects I would have a look at:

  • A11yUITests: https://github.com/rwapp/A11yUITests
  • UBKAccessibilityKit: https://github.com/NAB/UBKAccessibilityKit
  • AccessibilitySnapshot: https://github.com/cashapp/AccessibilitySnapshot
  • GTXiLib: https://github.com/google/GTXiLib

It’s a wrap!

Thanks for reading! I really hope this post helps you test your app’s accessibility and create more accessible apps. Did you know these tips? Any you found particularly useful? Do you have any others I didn’t mention? I’d be super interested in hearing about them. You can find me on Twitter with the handle @dadederk.

What to Read Next

]]>
Improving your App’s Accessibility with iOS 13
https://accessibilityupto11.com/post/2020-05-13-01 — Wed, 13 May 2020 18:30:00 +0000

I know, iOS 13 has been with us for quite some time now, WWDC 2020 is just around the corner, and we hope Apple will again present a ton of new accessibility features and improvements coming with iOS 14.

iOS 13 icon (classic squircle with the number 13 inside), a plus icon, and the classic universal accessibility logo (stick man within a circle with open arms and legs).

But it is almost Global Accessibility Awareness Day! And it is still a great time to catch up and implement everything new that was presented around accessibility in iOS 13 - it was a packed year! - so you are better prepared to adapt your apps and fully embrace iOS 14. So I thought it was a great moment to quickly recap all we can do to improve our apps’ accessibility with iOS 13.

We all know how much Apple cares about accessibility. Last year was packed with new features, and accessibility was one of the core topics they covered in the Platforms State of the Union session.

Platforms State of the Union (WWDC19). Apple talked about Accessibility, Privacy, Machine Learning, Siri, Augmented Reality, and Metal.

The first thing they talked about was discoverability. Apple really wants everyone to be able to use any of their devices right out of the box, so they made it possible to configure your accessibility preferences in iOS’s Quick Start onboarding process. They also moved Accessibility to the top level of Settings, which gives it visibility and makes it easier to find.

But the star of the show was Voice Control. Apple presented this new mechanism that lets you use your device with a series of voice commands. You’ll probably immediately wonder what is new about it if we already have Siri. But Voice Control is a more generic tool that lets you interact completely with any part of the system and with any app you have installed. More importantly, it runs fully on-device, so it works even if you are offline.

Navigating the Spotify Player with Voice Control using Names (left), Numbers (centre) or a Grid (right).

You can refer to items on the screen by name or by number, or you can use a grid-based layout for finer control. You can edit text, you can speak gestures… It uses the camera to check whether you are speaking to the device or looking somewhere else. It works like magic!

If your app supports VoiceOver well, chances are that it will already be pretty good with Voice Control. But we are not happy with “pretty good”, we want it to be excellent, so we’ll see how you can improve your app’s experience with Voice Control in a minute.

What’s new?!

Voice Control

A great thing I quickly realised about Voice Control is that it is a great way to test your apps for accessibility. Open your app, go to a screen you want to test, and say “Show names”. Voice Control will overlay all the accessibility labels you’ve configured. A quick way to check if they make sense or if any are missing.

What can you do to further support Voice Control? If your app is accessible, you’ll see that it already works quite well. But the accessibility labels you’ve configured might not be the best for Voice Control. You may want them to be shorter and easier to say. Or, to make things a bit more intuitive, you may want to provide different alternatives for what the user can say, so they don’t have to remember a specific label.

For this, you can provide an array of alternative input labels like this:

// Provide input alternatives for Voice Control
settingsButton.accessibilityUserInputLabels = ["settings", "configuration", "cog"]

This also comes in handy for cells in table and collection views, where accessibility labels are pretty descriptive. That is fine, but you may want to provide a more concise label for selecting them. For example, a banking app may show the merchant, amount, and date for each transaction. A good accessibility label describes all that information. But for easily selecting one transaction with Voice Control, you can provide an accessibility user input label that is just the merchant.

Apple documentation: https://developer.apple.com/documentation/objectivec/nsobject/3197989-accessibilityuserinputlabels
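The banking example above can be sketched with plain values; the Transaction type and the label strings here are hypothetical:

```swift
// Hypothetical model: a descriptive accessibility label for VoiceOver, and a
// concise user input label (just the merchant) for Voice Control. In a real
// cell you would assign these to accessibilityLabel and
// accessibilityUserInputLabels respectively.
struct Transaction {
    let merchant: String
    let amount: String
    let date: String
}

let transaction = Transaction(merchant: "Coffee House", amount: "£3.50", date: "12 May")

let accessibilityLabel = "\(transaction.merchant), \(transaction.amount), \(transaction.date)"
let userInputLabels = [transaction.merchant]

print(accessibilityLabel)  // Coffee House, £3.50, 12 May
print(userInputLabels[0])  // Coffee House
```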

Large Content Viewer

In iOS 11, Apple introduced the Large Content Viewer. When using larger accessibility text sizes, even if an app supports Dynamic Type, the content in navigation bars and tab bars will not grow; that would take too much real estate from the screen. But if you hold your finger on those elements, a large preview shows in the middle of the screen. This used to work only for native iOS components, but iOS 13 lets you implement it for your custom components too.

You can support it by adding a UILargeContentViewerInteraction to your custom view and setting the showsLargeContentViewer property to true on the elements within your custom view that can be selected and that you want to provide the preview option for.

Assuming customBarView is your custom view that contains a button called customBarTabButton and another custom view called customBarTabView, as subviews, the code would look something like this:

// Create a Large Content Viewer Interaction
let largeContentViewerInteraction = UILargeContentViewerInteraction()

// Add it to your view
customBarView.addInteraction(largeContentViewerInteraction)

// Enable Large Content Viewer in the buttons or labels you want
customBarTabButton.showsLargeContentViewer = true

// If it is a custom view that you want to enable for Large Content Viewer,
// you can provide an image and title
customBarTabView.showsLargeContentViewer = true
customBarTabView.largeContentImage = UIImage(systemName: "play.circle")
customBarTabView.largeContentTitle = "Play"

If you configure the Large Content Viewer for a button, the image and title of the icon will automatically be used to preview the element. If it is a label, the title will be used too. If it is a custom view, you can configure both the large content image and the large content title.

Example of how Large Content Viewer looks when holding your finger on the Apps tab, in the tab bar of the App Store app, if using Accessibility Text Sizes.

Apple documentation: https://developer.apple.com/documentation/uikit/uilargecontentviewerinteraction

Differentiate Without Color

You should not rely on color alone to signal something in your app. Some users may not be able to notice differences between the colors you are using, so you should accompany color with proper messaging, for example. But if you really need color to carry meaning - as can happen with some games - you can now use this new option in iOS 13.

// Check if the user needs differentiation without color
if UIAccessibility.shouldDifferentiateWithoutColor {
  card.colorSymbolImageView.isHidden = false
} else {
  card.colorSymbolImageView.isHidden = true
}

Imagine you are developing an UNO-like game where each card has a number and a color, and the next player chooses their card based on the number or color of the previous player’s card. You could check whether the user has enabled the Differentiate Without Color option and show an extra symbol on the card that represents the color. UNO actually has a color-blind-friendly version, UNO ColorADD, that uses the Colorblind Alphabet.

UNO Colorblind Cards. UNO® ColorADD

Apple documentation: https://developer.apple.com/documentation/uikit/uiaccessibility/3043553-shoulddifferentiatewithoutcolor

Auto-Play Video Previews

In a similar way, a user can now decide to disable auto-play for video previews. You should always try to honor the user’s preferences.

// Check if the user has Video Autoplay enabled
if UIAccessibility.isVideoAutoplayEnabled {
  video.play()
} else {
  video.pause()
}

Apple documentation: https://developer.apple.com/documentation/uikit/uiaccessibility/3238036-isvideoautoplayenabled

Note: For both Differentiate Without Color and Auto-Play Video Previews, you can listen to their corresponding ‘did change’ notifications.

Custom Actions with action handlers

You can add custom actions to views. This improves navigation with VoiceOver and Switch Control. A common use case is when a cell in a table view has multiple buttons that let you execute multiple actions. It used to be that you had to provide the name of the action, and also a target and selector, for each one of these custom actions. In iOS 13, Apple introduced a Swiftier way of doing it where you can just provide a closure (instead of the target and action) with the code that will be executed when you trigger the custom action.

// Create Custom Actions
let artistCustomAction = UIAccessibilityCustomAction(name: userName) { _ in
  self.navigationController?.present(artistViewController, animated: true, completion: nil)
  return true
}

let likeCustomAction = UIAccessibilityCustomAction(name: "like") { _ in
  likedSongs.append(song)
  return true
}

// Add Custom Actions to your view
accessibilityCustomActions?.append(contentsOf: [artistCustomAction, likeCustomAction])

Apple documentation: https://developer.apple.com/documentation/uikit/uiaccessibilitycustomaction/3043557-init

Environment Overrides & Accessibility Inspector

Testing your iOS apps for accessibility became much easier too.

The Accessibility Inspector added a preview option for VoiceOver. Just click on the speaker button and use the player controls to play/pause or go to the previous/next element.

The Accessibility Inspector can now offer a Quicklook to preview VoiceOver in the simulator.

And Xcode added an Environment Overrides menu that lets you customize a ton of options when running the app in the simulator, including accessibility configurations like Increase Contrast, Reduce Transparency, Reduce Motion, Smart Invert, Differentiate Without Color, and many more.

Environment Overrides lets you customise a bunch of user preferences in the simulator.

SwiftUI

Of course, SwiftUI was also presented at WWDC19. Like everything that Apple does, accessibility is built in right from the beginning. You've surely noticed already that a lot of things in iOS are accessible by default. But this is even more true with SwiftUI, thanks to its declarative nature and because it is state-driven. This probably deserves its own post, or even a series of posts, so I totally recommend Rob Whitaker's great SwiftUI Accessibility Series: https://rwapp.co.uk/2019/11/06/SwiftUI-Accessibility/
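Just as a tiny taste of how SwiftUI maps onto the UIKit concepts above, here is a sketch of a row that combines its children into one VoiceOver element and exposes a custom action (the view and its contents are made up for illustration; modifier names are the iOS 13 SwiftUI API as I recall it):

```swift
import SwiftUI

struct SongRow: View {
    let song: String
    let artist: String

    var body: some View {
        HStack {
            Text(song)
            Text(artist)
        }
        // Combine the children into a single VoiceOver element...
        .accessibilityElement(children: .combine)
        // ...and expose a custom action, much like UIAccessibilityCustomAction.
        .accessibilityAction(named: Text("Like")) {
            // like the song
        }
    }
}
```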

And there is more…

As I said, it was one of the WWDC editions that I remember with the most news around accessibility. Apart from everything in this post, Apple also presented Full Keyboard Access for iPadOS, SF Symbols (making it easier to support Dynamic Type with glyphs), the ability to specify high-contrast variants when using colour asset catalogues, etc.

Have I missed anything? Please let me know, together with any feedback, questions, etc. on Twitter: @dadederk

Let’s now hope for another WWDC full of accessibility improvements!

Thanks for reading!

Resources

  • Rob Whitaker’s post in Mobile A11y “Writing Great iOS Accessibility Labels”: https://mobilea11y.com/blog/writing-great-labels/
  • My previous post “Improving your App’s Accessibility with iOS 11”: https://medium.com/bbc-design-engineering/improving-your-apps-accessibility-with-ios-11-db8bb4ee7c9f
  • WWDC Accessibility Videos: https://developer.apple.com/videos/frameworks/accessibility

Improving your App’s Accessibility with iOS 11
https://accessibilityupto11.com/post/2018-02-14-01
Wed, 14 Feb 2018 18:30:00 +0000

Apple introduced iOS 11 at their Worldwide Developers Conference (WWDC) in June 2017. WWDC is Apple’s showcase of new tools and developer APIs covering iOS, macOS, watchOS, and tvOS. In this post we’d like to show you what’s new in accessibility in iOS 11, and how you can incorporate these features to make your app more accessible to your users.

iOS 11 icon (classic squircle with the number 11 inside), a plus icon, and the classic universal accessibility logo (stick man within a circle with open arms and legs).

If it is your first time reading about Accessibility in iOS, Apple has a pretty good definition for app accessibility:

An accessible app is one that can be used by everyone — including those with a disability or physical impairment — while retaining its functionality and usability.

iOS 11 introduced some great accessibility improvements. Dynamic Type is better supported throughout the system, and bigger, bolder fonts, which are easier to read, are now used extensively too. One of the ways Apple has done this is by adapting all of the standard text styles to the accessibility sizes.

Title, Headline and Body dynamic types before and after switching to an Accessibility size in iOS 10

In past versions of iOS, only the Body style would adapt. You can see in the screenshots above (iOS 10) and below (iOS 11) that there is a difference in how the app is rendered when using the larger accessibility sizes. So don’t be surprised if your app looks different on iOS 11 for users of this feature; that is expected. Notice how even the Settings screen itself adapts much better in iOS 11 too.

Title, Headline and Body dynamic types before and after switching to an Accessibility size in iOS 11

If you are using native iOS components in a standard way you will get lots of these improvements either ‘for free’ or with very little effort.

iOS 11 simplifies the adoption of larger accessibility sizes. If you are using one of the built-in text styles, you should get the new functionality without additional work. If you’ve checked the box in Interface Builder for ‘Automatically Adjust Fonts’, you’ll also pick up changes if the user changes their accessibility preferences in iOS Settings.

Automatically Adjust Font option in Xcode

If you aren’t using Interface Builder and prefer to keep things in code, in the example of a UILabel called titleLabel you can use:

titleLabel.adjustsFontForContentSizeCategory = true

Oddly this isn’t the default option!
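Pairing that flag with a built-in text style, the complete in-code setup looks something like this (a minimal sketch with a hypothetical label):

```swift
import UIKit

let titleLabel = UILabel()
// Use a built-in text style so the font participates in Dynamic Type...
titleLabel.font = UIFont.preferredFont(forTextStyle: .headline)
// ...and opt in to live updates when the user changes their preferred text size.
titleLabel.adjustsFontForContentSizeCategory = true
```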

And if your app uses custom fonts? Good news again: iOS 11 provides the UIFontMetrics class to scale your font based on the user’s text size preference. For our titleLabel it’s as simple as:

titleLabel.font = UIFontMetrics.default.scaledFont(for: customFont)

You can also map how your font scales to the default text styles by doing something like:

titleLabel.font = UIFontMetrics(forTextStyle: .title1).scaledFont(for: customFont)

Larger Navigation Elements

You may have noticed that not all elements in iOS adapt to font size accessibility settings. Such elements include the iOS Navigation Bar and Tab Bar — text sizes remain the same regardless of these settings. This is a conscious design decision from Apple to avoid taking up too much screen space. The good news is that in iOS 11, if you long-press Tab Bar icons, a larger version of the icon and tab name will show in the middle of the screen. The feature isn’t widely mentioned in WWDC videos, but it also works with elements of a UINavigationBar and UIToolbar. You can see this in the BBC News app in both our Tab Bar and the Navigation Bar of our built-in web browser (used in our ‘LIVE’ pages).

Large Preview example in BBC News

For it to work, you just need to add your Tab Bar assets in PDF format. Yes, it’s not a typo — Apple decided to use PDFs for vector-based images. When adding the PDF to your Asset Catalog in Xcode, simply tick the box for preserving the vector data of the asset.

How to Preserve the Vector Data in Xcode

If it’s not easy to get a PDF asset, you can still get the same effect with a larger PNG asset along with the following property on the UITabBarItem:

tabBarItem.largeContentSizeImage = image

Apple don’t have a specific size in their documentation, but 75x75 pixels for an @1x size seems to give a reasonable result.

Smart Inverted Colours

Have you ever tried the Inverted Colours setting on your iPhone or iPad? Some people find it difficult or tiring reading dark text on a light, bright background — Inverted Colours is an accessibility option from Apple designed to help.

Classic Inverted Colours vs Smart Inverted Colours

The image on the left shows how the BBC News app looks with the Inverted Colours option before iOS 11. Light backgrounds and dark text are inverted, but images appear as if they are in ‘negative’ and are hard to see. The fact that you prefer or need light text over dark backgrounds shouldn’t come at the price of an awful experience with images, so in iOS 11 you can stop a view (a UIImageView, for example) from inverting its colours with this property:

self.accessibilityIgnoresInvertColors = true

This allows the app to look like the image on the right, giving a much better experience. It also offers a pseudo-‘dark mode’ while we wait for official Apple APIs to support it, and iPhone X users may even get some improved battery life from their OLED screens!

And Much More…

  • Dynamic spacing constraints: you can add dynamic spacing between the baselines of labels.
headerLabel.firstBaselineAnchor.constraintEqualToSystemSpacingBelow(titleLabel.lastBaselineAnchor, multiplier: 1.0)
  • Scale images based on the larger accessibility font sizes (you can use vector images for these too):
imageView.adjustsImageSizeForAccessibilityContentSizeCategory = true
  • Check whether the user is using a font size in the accessibility category. Useful for adapting the UI to fit a font size that big:
traitCollection.preferredContentSizeCategory.isAccessibilityCategory
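For example, a common pattern with that last check is switching a horizontal layout to a vertical one at accessibility sizes. A minimal sketch (the cell and stack view here are illustrative, not from the BBC News app):

```swift
import UIKit

final class ArticleCell: UITableViewCell {
    let stackView = UIStackView()

    override func traitCollectionDidChange(_ previousTraitCollection: UITraitCollection?) {
        super.traitCollectionDidChange(previousTraitCollection)
        // At accessibility sizes, stack the content vertically so long labels
        // get the full width instead of being squeezed side by side.
        let isAccessibilitySize = traitCollection.preferredContentSizeCategory.isAccessibilityCategory
        stackView.axis = isAccessibilitySize ? .vertical : .horizontal
    }
}
```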

This post is the result of research carried out in the BBC News Apps team to improve the accessibility of our app. We hope to implement these features in the near future and are always aiming to offer a more accessible experience to all our users.

Hope you found this post useful, and please let us know about your experience implementing these or other accessibility features!


Special thanks to Barry Kidney and Marta Marti for their help with this post.
