ML recommendation app with Create ML on iOS 15

Introduction

WWDC21 brought us a lot of cool new things, such as the new Swift concurrency features, Xcode Cloud, and new frameworks, as well as updates to the existing ones. One of the highlights for me was the porting of Create ML to iOS. Create ML is Apple’s framework for creating machine learning models. Previously, it was available as a Mac app that can be used for training and testing Core ML models. However, models produced that way are trained ahead of time and bundled into the app. They can’t easily be customised or personalised based on the user’s selections and preferences.

This changed at WWDC21 – Create ML on iOS was introduced as part of the session about creating dynamic iOS apps with Create ML (I highly recommend watching the session). This means that we can now train Core ML models dynamically, based on how the user uses the app. This opens up a lot of possibilities for recommendations and suggestions in recipe, food, music, and movie apps, and much more. Your app can react to the user’s taste and selections, thus providing a more personalised experience.

What’s cool about all of this is that everything happens on the device (no internet connection is required), so you can protect the user’s privacy.

In this post, we will build a simple music album recommendation app. There will be a list of albums, and the user will mark some of them as favorites. Based on that input, the app will show similar music albums.

This Spotify competitor looks very exciting, so let’s build it!

Implementation

Prerequisites

In order to follow along or run the code yourself, you will need at least Xcode 13 beta 2 (at the moment of writing). You will also need a device running the iOS 15 beta – Create ML is not available in the iOS simulator.

Building the app

The app will consist of one screen with two parts. First, the user can browse albums from a list. For the sake of this tutorial, the albums are stored in a local JSON file. Each album has a favorite button, which adds the album to the user’s list of favorites. The favorites are stored in the user defaults. The albums are simple structs, containing the name of the album, the artist, the album cover, as well as a list of keywords (strings).
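The post doesn’t show the model type itself, but based on the description, a minimal sketch of the Album struct might look like this (the exact field names are assumptions):

struct Album: Codable, Identifiable, Hashable {
    // Hashable is needed because albums are later used as dictionary keys;
    // Identifiable is needed for SwiftUI's ForEach.
    let id: String
    let name: String
    let artist: String
    let cover: String        // name of the album cover image asset
    let keywords: [String]   // genre-based keywords driving the recommendations
}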

When a favorite is added, the app tries to find similar albums, based on the keywords specified in each album entry. The keywords are genre-based and possibly subjective; they can easily be updated in the albums.json file in the app if they don’t fit your musical taste. The suggestions appear at the top of the screen.

We will not go into much detail about the SwiftUI implementation of the screen. In general, it uses a scrollable LazyVStack containing headers, a horizontally scrollable LazyHStack for the recommendations, and a LazyVGrid for all albums.

var body: some View {
        NavigationView {
            ScrollView {
                LazyVStack(alignment: .leading) {
                    if !viewModel.recommendedAlbums.isEmpty {
                        VStack(alignment: .leading) {
                            HeaderView(title: "Suggested for you")
                            
                            ScrollView(.horizontal) {
                                LazyHStack {
                                    ForEach(viewModel.recommendedAlbums) { album in
                                        AlbumCard(viewModel: viewModel, album: album)
                                    }
                                }
                            }
                        }
                    }
                    
                    HeaderView(title: "Browse albums")
                    
                    LazyVGrid(columns: columns) {
                        ForEach(viewModel.allAlbums) { album in
                            AlbumCard(viewModel: viewModel,
                                      album: album)
                        }
                    }

                }
            }
            .navigationTitle("Albums")
            .onAppear {
                // Load the albums asynchronously when the view appears.
                Task {
                    await viewModel.loadAlbums()
                }
            }
        }
        .navigationViewStyle(.stack)
    }

This view, called AlbumsView, uses a view model (AlbumsViewModel). The view model is created with three services. The first, AlbumService, is used for loading the albums from the local JSON file; the second, FavoritesService, manages the user’s list of favorites.
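The post doesn’t show these two protocols; a plausible sketch, with signatures inferred from how the view model uses them, might be:

protocol AlbumService {
    // Loads and decodes albums.json from the app bundle.
    func loadAlbums() async throws -> [Album]
}

protocol FavoritesService {
    // Persists the user's favorites in UserDefaults.
    func addToFavorites(album: Album)
    func removeFromFavorites(album: Album)
}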

The most interesting parts are in the third service, MLRecommendationService. In this service we do the on-device training and the predictions for the suggested albums. Let’s dive deeper into its implementation.

RecommendationService

Our use case revolves around finding and returning predictions, based on input consisting of the favorite music albums and all available music albums. Therefore, the only public method in the RecommendationService protocol will look like this:

protocol RecommendationService {
    
    func prediction(for favoriteAlbums: [Album], allAlbums: [Album]) async throws -> [Album]
    
}

Next, let’s see how we can implement this method in our MLRecommendationService implementation. Our code will use the Create ML framework on iOS, which supports training classifiers and regressors. Classifiers learn to predict particular classes from the data in a training dataset. Regressors are similar, with the difference that they learn to predict a numerical value instead of a discrete class label. Everything will become clearer when we analyse the output of the predictions later in the post.

func prediction(for favoriteAlbums: [Album], allAlbums: [Album]) async throws -> [Album] {
        // Extract the full set of keywords, build the training data,
        // train the regressor, and derive the recommendations from its predictions.
        let allKeywords = allKeywords(from: allAlbums)
        let trainingData = prepareTrainingData(from: favoriteAlbums, allKeywords: allKeywords)
        let regressor = try await trainLinearRegressor(data: trainingData)
        let maxValues = try maxValues(for: favoriteAlbums, allAlbums: allAlbums, regressor: regressor)
        let average = computeAverage(from: maxValues)
        return sortAndFilter(from: maxValues, average: average)
    }

First, we need to extract the keywords from all the albums. We will need them in the next method, which prepares the training data.
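The allKeywords(from:) helper isn’t shown in the post; a minimal sketch would simply collect the union of keywords across all albums, without duplicates:

private func allKeywords(from albums: [Album]) -> [String] {
    // An assumed implementation: gather every distinct keyword.
    var keywords = Set<String>()
    for album in albums {
        keywords.formUnion(album.keywords)
    }
    return Array(keywords)
}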

private func prepareTrainingData(from albums: [Album], allKeywords: [String]) -> TrainingData {
        var trainingKeywords = [[String: Double]]()
        var trainingTargets = [Double]()
        
        for album in albums {
            // Positive example: the keywords of a favorite album, with target 1.0.
            let features = featuresFromAlbumAndKeywords(album: album.name,
                                                        keywords: album.keywords)
            trainingKeywords.append(features)
            trainingTargets.append(1.0)
            
            // Negative example: all keywords *not* associated with the album,
            // with target -1.0.
            let negativeKeywords = allKeywords.filter { keyword in
                !album.keywords.contains(keyword)
            }
            
            trainingKeywords.append(featuresFromAlbumAndKeywords(album: album.name,
                                                                 keywords: negativeKeywords))
            trainingTargets.append(-1.0)
        }
        
        return TrainingData(trainingKeywords: trainingKeywords, trainingTargets: trainingTargets)
    }
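Note that TrainingData is just a small container for the two arrays; the post doesn’t show it, but judging from the initializer call it is presumably something like:

struct TrainingData {
    // The feature dictionaries and their target values, row by row.
    let trainingKeywords: [[String: Double]]
    let trainingTargets: [Double]
}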

In this method, we go through the list of favorite albums the user has selected. For each album, we create features from the album and its keywords. Basically, we take the keywords associated with each album and combine them with the current album, to create new keywords that enable the model to capture the interaction between the keywords and the album. We set a value of 1.0 in the dictionary to indicate that a particular keyword is present in the data entry. However, this alone is not enough for the model to learn about the music albums that the user doesn’t like.

Therefore, we also need to find all the other keywords – let’s call them negative keywords – and create features for them as well, this time with a target value of -1.0. With this, the linear regressor will be able to differentiate the keywords that represent music albums the user likes from those that don’t.
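The featuresFromAlbumAndKeywords helper isn’t shown in the post either; a plausible sketch matching the description above might be the following (the exact key format is an assumption):

private func featuresFromAlbumAndKeywords(album: String,
                                          keywords: [String]) -> [String: Double] {
    // Assumed implementation: each keyword becomes a feature on its own,
    // and combining it with the album name adds an interaction feature.
    var features = [String: Double]()
    for keyword in keywords {
        features[keyword] = 1.0                 // the keyword itself
        features["\(album) \(keyword)"] = 1.0   // album-keyword interaction
    }
    return features
}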

After the training data is prepared, we can create the linear regressor:

private func trainLinearRegressor(data: TrainingData) async throws -> MLLinearRegressor {
        return try await withCheckedThrowingContinuation { continuation in
            // Training is synchronous, so run it off the main thread.
            DispatchQueue.global(qos: .userInitiated).async {
                // DataFrame and Column come from the TabularData framework.
                var trainingData = DataFrame()
                trainingData.append(column: Column(name: "keywords",
                                                   contents: data.trainingKeywords))
                trainingData.append(column: Column(name: "target",
                                                   contents: data.trainingTargets))
                
                do {
                    let model = try MLLinearRegressor(trainingData: trainingData, targetColumn: "target")
                    continuation.resume(returning: model)
                } catch {
                    // Propagate the original error instead of swallowing it.
                    continuation.resume(throwing: error)
                }
            }
        }
    }

The regressor is created with the training data from above (the trainingKeywords and the trainingTargets), which are added as columns in a DataFrame. The creation of the regressor is synchronous, and at the moment there are no async APIs for this purpose. Therefore, we move the creation to a background queue and wrap it in an async method with the withCheckedThrowingContinuation function, so we can use it with the new Swift concurrency features. Without this, there would be a noticeable delay (around a second) the first time we tap the favorite button.

After the regressor is trained, we can use it to make predictions.

private func maxValues(for favoriteAlbums: [Album],
                           allAlbums: [Album],
                           regressor: MLLinearRegressor) throws -> [Album: Double] {
        var maxValues = [Album: Double]()
        
        for album in allAlbums {
            // Only score albums the user hasn't already favorited.
            if !favoriteAlbums.contains(album) {
                // One row per keyword, matching the training feature format.
                let keywordData = album.keywords.map { keyword in
                    [keyword: 1.0]
                }
                var inputData = DataFrame()
                inputData.append(column: Column(name: "keywords", contents: keywordData))
                let predictions = try regressor.predictions(from: inputData)
                // Keep the strongest keyword match for this album.
                var maxValue: Double = 0
                for prediction in predictions {
                    if let value = prediction as? Double, value > maxValue {
                        maxValue = value
                    }
                }
                
                maxValues[album] = maxValue
            }
        }
        
        return maxValues
    }

Before going into the details of the method, we first need to see what the regressor returns – a type-erased AnyColumn. The Column type can have different underlying value types. We specified this to be Double when we created the regressor – by setting the targetColumn to “target”, which in turn contained an array of Double values.

trainingData.append(column: Column(name: "target",
                                   contents: data.trainingTargets))
                
...
let model = try MLLinearRegressor(trainingData: trainingData, targetColumn: "target")

Now that we know what type of values is returned, let’s see what they mean and how we can use them to create the recommended albums. Our goal is to find albums which are not favorites, but are similar to the ones which are. Therefore, we go through the list of all albums and run the prediction against each album’s keywords. What we get as a result is a list of Double values telling us how well each keyword in the album matches the taste the user has developed through their previous selections. Since the regressor was trained with targets of 1.0 and -1.0, the values fall roughly in that range – the closer a value is to 1, the stronger the match.

There are several ways to do the math here and define the threshold. The approach I took is to take the maximum value for each album and put it in an array. Then, out of all maximums, we compute the average, and that becomes the threshold. All the albums whose value is above it go into the recommended albums; the others don’t. For example, if the maximum values for three albums are 0.9, 0.5, and 0.1, the threshold is their average (0.5), and only the first album gets recommended. The reasoning here is that if there’s at least one strong match in the keywords (e.g. rock), then the user might be interested in this album.

Another approach might be to take the average of all keyword values (instead of the maximum) and compute the threshold from there. Feel free to experiment to see what works best for you.

Once the albums are filtered, we just sort them and return the result.
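These two helpers aren’t shown in the post; minimal sketches based on the description above might look like this:

private func computeAverage(from maxValues: [Album: Double]) -> Double {
    // The average of the per-album maximums serves as the threshold.
    guard !maxValues.isEmpty else { return 0 }
    return maxValues.values.reduce(0, +) / Double(maxValues.count)
}

private func sortAndFilter(from maxValues: [Album: Double], average: Double) -> [Album] {
    return maxValues
        .filter { $0.value > average }      // keep albums above the threshold
        .sorted { $0.value > $1.value }     // strongest match first
        .map { $0.key }
}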

Finishing touches

Now that we have the RecommendationService ready, it’s pretty straightforward to use it in our view model. Whenever the user taps one of the favorite buttons, we add the album to (or remove it from) the list of favorites. Based on the updated list, we re-run the recommendations.

func favoriteButtonTapped(for album: Album) {
        if isFavorite(album: album) {
            favoritesService.removeFromFavorites(album: album)
        } else {
            favoritesService.addToFavorites(album: album)
        }
        updateFavorites()
        Task {
            try await makeRecommendations()
        }
    }

The makeRecommendations method is also simple: it just collects the list of favorite albums and sends it to the recommendation service:

private func makeRecommendations() async throws {
        // Collect the albums the user has marked as favorites.
        var favoriteAlbums = [Album]()
        for (index, value) in favorites.enumerated() where value {
            favoriteAlbums.append(allAlbums[index])
        }
        do {
            self.recommendedAlbums = try await recommendationService.prediction(for: favoriteAlbums,
                                                                                allAlbums: allAlbums)
        } catch {
            print(error)
            self.recommendedAlbums = []
        }
    }

And that’s everything we need to do in order to have a simple recommendation app. Some details that are not relevant to the main purpose of the post have been omitted – feel free to look at those in the source repo.

Conclusion

In this post, we have built a simple music recommendation app and seen how powerful Create ML on iOS can be. There are a lot of new opportunities and tools to make our apps smarter, while not compromising the user’s privacy.

The source code for this post can be found here. I hope you find it useful. If you have feedback or suggestions on how to improve the post, or general thoughts about on-device ML, you can use the comments section below or ping me on Twitter.

Comments

  1. Hi, how did you import CreateML.framework? Should it be embedded or not? If I embed it, there is a “code sign Bundle format unrecognized, invalid, or unsuitable” build issue. If I don’t embed it, there is a runtime crash: “Library not loaded: @rpath/libswiftCreateML.dylib”.
    I am stuck here. Thanks for your help anyway.


    1. Hey there, the CreateML.framework doesn’t need to be embedded explicitly. The runtime crash you are seeing is probably because you are running on the simulator (which still doesn’t support CreateML). Hope this helps!

