on-demand-restaurant-recommendation-demo(Python)

On-demand features - Restaurant recommendation demo

In this example, a restaurant recommendation model takes as input a JSON string containing a user's location and a restaurant ID. The restaurant's location is looked up from a pre-materialized feature table published to an online store, and an on-demand feature computes the distance from the user to the restaurant. That distance is passed as an input to the model.

Requirements:

  • A cluster running Databricks Runtime for ML 13.3 LTS or above.
  • The cluster access mode must be Single user.

Helper functions and notebook variables
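The original helper cell is not reproduced here; below is a minimal sketch of the notebook-wide variables the later cells assume. All names (catalog, schema, feature table, function, model, and endpoint names) are hypothetical placeholders; substitute your own.

    # Hypothetical notebook-wide names; adjust to your workspace.
    catalog = "main"
    schema = "on_demand_demo"
    feature_table = f"{catalog}.{schema}.restaurant_features"
    function_name = f"{catalog}.{schema}.distance_to_restaurant"
    registered_model_name = f"{catalog}.{schema}.restaurant_recommender"
    endpoint_name = "restaurant-recommendation-demo"

    # Make sure the target catalog and schema exist.
    spark.sql(f"CREATE CATALOG IF NOT EXISTS {catalog}")
    spark.sql(f"CREATE SCHEMA IF NOT EXISTS {catalog}.{schema}")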

Setup
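The setup cell defines the on-demand feature as a Python UDF in Unity Catalog. Below is a minimal sketch, assuming the hypothetical names above and a simple Euclidean distance on raw coordinates (the actual notebook may use a different distance formula).

    # Register the on-demand feature as a Python UDF in Unity Catalog.
    # The UDF parses the request-time JSON blob and computes the distance
    # between the user and the restaurant.
    spark.sql(f"""
    CREATE OR REPLACE FUNCTION {function_name}(
      restaurant_x_coord DOUBLE,
      restaurant_y_coord DOUBLE,
      json_blob STRING
    )
    RETURNS DOUBLE
    LANGUAGE PYTHON
    COMMENT 'On-demand feature: distance from the user to the restaurant'
    AS $$
    import json, math
    user = json.loads(json_blob)
    dx = restaurant_x_coord - user["user_x_coord"]
    dy = restaurant_y_coord - user["user_y_coord"]
    return math.sqrt(dx * dx + dy * dy)
    $$
    """)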

You can call the Python UDF from SQL, as shown in the next cell.
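A sketch of that call, assuming the function created above; the hard-coded coordinates are illustrative only.

    # Quick sanity check: call the UDF from SQL with hard-coded inputs.
    display(spark.sql(f"""
    SELECT {function_name}(
      37.774,
      -122.419,
      '{{"user_x_coord": 37.79, "user_y_coord": -122.39}}'
    ) AS distance
    """))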

Generate and publish feature data

To access the feature table from Model Serving, you must publish the table.
You must first set up access to DynamoDB by following the steps in the next cell.
Then, create the online store and publish the feature table to it as shown in Cmd 16.
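A minimal sketch of the "generate" half, assuming the databricks-feature-engineering client available on ML Runtime 13.3 and above and the hypothetical names defined earlier; the restaurant coordinates are made-up sample data. Publishing follows the secret setup below.

    from databricks.feature_engineering import FeatureEngineeringClient

    fe = FeatureEngineeringClient()

    # Made-up restaurant locations, keyed by restaurant_id.
    restaurant_df = spark.createDataFrame(
        [
            (1, 37.7749, -122.4194),
            (2, 37.7946, -122.3999),
            (3, 37.8078, -122.4103),
        ],
        schema="restaurant_id INT, restaurant_x_coord DOUBLE, restaurant_y_coord DOUBLE",
    )

    # Create the offline (Delta) feature table in Unity Catalog.
    fe.create_table(
        name=feature_table,
        primary_keys=["restaurant_id"],
        df=restaurant_df,
        description="Restaurant locations for the on-demand feature demo",
    )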

Set up DynamoDB Access Key

In this section, you take some manual steps to make DynamoDB accessible to this notebook. The following steps create a new AWS IAM user with the required permissions. You can also use an existing IAM user or role.

Create an AWS IAM user and download secrets

  1. Go to the AWS console at http://console.aws.amazon.com, navigate to IAM, and click "Users".
  2. Click "Add users" and create a new user with "Access Key".
  3. Click Next and select policy AmazonDynamoDBFullAccess.
  4. Click Next until the user is created.
  5. Download the "Access key ID" and "Secret access key".

Provide online store credentials using Databricks secrets

Note: For simplicity, the commands below use predefined names for the scope and secrets. To choose your own scope and secret names, follow the process in the Databricks documentation.

  1. Create a secret scope in Databricks.

    databricks secrets create-scope --scope feature-store

  2. Create secrets in the scope. Note: the keys must follow the format <prefix>-access-key-id and <prefix>-secret-access-key. Again, for simplicity, these commands use predefined names. When each command runs, you are prompted to paste the secret value into an editor.

    databricks secrets put --scope feature-store --key dynamo-access-key-id
    databricks secrets put --scope feature-store --key dynamo-secret-access-key


Now the credentials are stored with Databricks Secrets. You will use them below to create the online store.
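A sketch of creating the online store spec and publishing the table, assuming the AmazonDynamoDBSpec shipped with the feature engineering client and the feature-store/dynamo secret prefix created above; the region is an assumption.

    from databricks.feature_engineering.online_store_spec import AmazonDynamoDBSpec

    # "feature-store/dynamo" matches the scope and key prefix created above:
    # feature-store/dynamo-access-key-id and feature-store/dynamo-secret-access-key.
    online_store = AmazonDynamoDBSpec(
        region="us-west-2",  # assumption: use your AWS region
        read_secret_prefix="feature-store/dynamo",
        write_secret_prefix="feature-store/dynamo",
    )

    # Publish the feature table to the DynamoDB online store.
    fe.publish_table(
        name=feature_table,
        online_store=online_store,
        mode="merge",
    )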

Create a TrainingSet with on-demand features
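A sketch of combining a batch FeatureLookup with the on-demand FeatureFunction, assuming the names defined earlier; the label DataFrame is made-up sample data.

    from databricks.feature_engineering import FeatureFunction, FeatureLookup

    # Made-up training labels: how well each (user, restaurant) pair matched.
    label_df = spark.createDataFrame(
        [
            (1, '{"user_x_coord": 37.79, "user_y_coord": -122.39}', 1.0),
            (2, '{"user_x_coord": 37.77, "user_y_coord": -122.42}', 0.0),
        ],
        schema="restaurant_id INT, json_blob STRING, label DOUBLE",
    )

    training_set = fe.create_training_set(
        df=label_df,
        feature_lookups=[
            # Batch feature: restaurant coordinates looked up from the feature table.
            FeatureLookup(
                table_name=feature_table,
                lookup_key="restaurant_id",
                feature_names=["restaurant_x_coord", "restaurant_y_coord"],
            ),
            # On-demand feature: distance computed by the Unity Catalog Python UDF,
            # using the looked-up coordinates and the request-time JSON blob.
            FeatureFunction(
                udf_name=function_name,
                input_bindings={
                    "restaurant_x_coord": "restaurant_x_coord",
                    "restaurant_y_coord": "restaurant_y_coord",
                    "json_blob": "json_blob",
                },
                output_name="distance",
            ),
        ],
        label="label",
    )
    training_df = training_set.load_df()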

Log a simple model using the TrainingSet

For simplicity, this notebook uses a hard-coded model. In practice, you'll log a model trained on the generated TrainingSet.
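A sketch of such a hard-coded model: a pyfunc that scores a restaurant by the inverse of its distance, packaged with fe.log_model so that the feature lookup and the on-demand function travel with the model. The registered model name is the hypothetical one defined earlier.

    import mlflow

    mlflow.set_registry_uri("databricks-uc")  # register the model in Unity Catalog

    class DistanceRanker(mlflow.pyfunc.PythonModel):
        """Hard-coded model: closer restaurants get higher scores."""

        def predict(self, context, model_input):
            return 1.0 / (1.0 + model_input["distance"])

    fe.log_model(
        model=DistanceRanker(),
        artifact_path="model",
        flavor=mlflow.pyfunc,
        training_set=training_set,
        registered_model_name=registered_model_name,
    )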

Score the model using score_batch
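A sketch, assuming the model version just registered is version 1.

    # Batch scoring: score_batch looks up the restaurant coordinates, evaluates the
    # on-demand distance UDF, and passes the result to the logged model.
    scored_df = fe.score_batch(
        model_uri=f"models:/{registered_model_name}/1",  # assumption: version 1
        df=label_df.drop("label"),
    )
    display(scored_df)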

Serve the Feature Store packaged model
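A sketch using the serving endpoints REST API with credentials from the notebook's own context; the endpoint name and model version are assumptions carried over from the earlier cells.

    import requests

    # Workspace URL and API token taken from the notebook context.
    ctx = dbutils.notebook.entry_point.getDbutils().notebook().getContext()
    host = ctx.apiUrl().get()
    token = ctx.apiToken().get()
    headers = {"Authorization": f"Bearer {token}"}

    # Create a serving endpoint for version 1 of the registered model.
    resp = requests.post(
        f"{host}/api/2.0/serving-endpoints",
        headers=headers,
        json={
            "name": endpoint_name,
            "config": {
                "served_models": [
                    {
                        "model_name": registered_model_name,
                        "model_version": "1",  # assumption: the version logged above
                        "workload_size": "Small",
                        "scale_to_zero_enabled": True,
                    }
                ]
            },
        },
    )
    resp.raise_for_status()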

Wait for the model serving endpoint to be ready.
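A simple polling loop against the same REST API, reusing host and headers from the previous cell.

    import time

    # Poll until the endpoint reports READY (deployment usually takes several minutes).
    while True:
        status = requests.get(
            f"{host}/api/2.0/serving-endpoints/{endpoint_name}",
            headers=headers,
        ).json()
        state = status.get("state", {})
        if state.get("ready") == "READY":
            break
        if state.get("config_update") == "UPDATE_FAILED":
            raise RuntimeError(f"Endpoint deployment failed: {status}")
        time.sleep(30)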

Query the endpoint
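A sketch of a programmatic request, reusing host and headers from the earlier cells; the payload matches the UI example below.

    # The same request body shown below, sent programmatically.
    payload = {
        "dataframe_records": [
            {
                "restaurant_id": 2,
                "json_blob": '{"user_x_coord": 37.79122896768446, "user_y_coord": -122.39362610820227}',
            }
        ]
    }

    response = requests.post(
        f"{host}/serving-endpoints/{endpoint_name}/invocations",
        headers=headers,
        json=payload,
    )
    print(response.json())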

Alternatively, use the Serving query endpoints UI to send a request:

{
  "dataframe_records": [
    {
          "restaurant_id": 2,
          "json_blob": "{\"user_x_coord\": 37.79122896768446, \"user_y_coord\": -122.39362610820227}"   
    }
  ]
}

Cleanup
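A sketch of the teardown, assuming the resources created above; note that the published DynamoDB table itself must be removed separately in AWS.

    # Delete the serving endpoint.
    requests.delete(f"{host}/api/2.0/serving-endpoints/{endpoint_name}", headers=headers)

    # Drop the offline feature table and the on-demand UDF.
    # The published DynamoDB table is not removed here; delete it in the AWS console.
    spark.sql(f"DROP TABLE IF EXISTS {feature_table}")
    spark.sql(f"DROP FUNCTION IF EXISTS {function_name}")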