Test Drive Scenario

Here is a sample scenario for testing the AI Data Engineer in your environment.

There are a couple of steps to complete in Fabric before kicking off the scenario.

Step 1: Add a Lakehouse

i. Log in to Microsoft Fabric: go to app.fabric.microsoft.com.

ii. Create a Lakehouse folder and name it SalesOps.

iii. Create the Destination Table(s)*

a. Below is a sample Lakehouse table schema for this scenario. Run the SQL script in a notebook to generate your table. Note that there are various other methods for creating a schema.
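A minimal sketch of such a notebook cell is shown below, run while the notebook is attached to your Lakehouse. The column names and types here are assumptions derived from the ingestion instructions later in this scenario; adjust them to match your actual source data.

```python
# Minimal sketch: create the destination table in the attached Lakehouse.
# Column names and types are assumptions for this scenario; adjust as needed.
# `spark` is the SparkSession that Fabric notebooks provide by default.
spark.sql("""
    CREATE TABLE IF NOT EXISTS Sales_Orders_Daily (
        OrderID            STRING,
        OrderDate          DATE,
        CustomerName       STRING,
        Address            STRING,
        City               STRING,
        State              STRING,
        ZipCode            STRING,
        Phone              STRING,
        PartDescription    STRING,
        PartClassification STRING,
        WarehouseID        STRING,
        Price              DECIMAL(10, 2)
    )
    USING DELTA
""")
```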

*You can also instruct the Engineer to create tables for you. For help with creating a table, see the Microsoft Fabric documentation.

Step 2: Add Data

i. Upload your source data files to the SalesOps folder in your Lakehouse.

a. Two files are provided below: one source file and one table for lookups and joins.

Download: AI Data Engineer Test Drive Scenario (23 KB)

# Destination Table or Tables:

Sales_Orders_Daily

# Source files:

All files in the "SalesOps" folder. You may need to join data from the sales order file and the warehouse file.

# Ingestion instructions:

  1. If the Order Date field is blank, set the date to 01/01/1900.

  2. Remove $ from the Price.

  3. Extract or infer City, State, and Zip Code from the Address field.

  4. Phone numbers should be in the (XXX) XXX-XXXX US phone number format. You can skip the country code.

  5. Assign the Warehouse ID by joining on Part Description from Sample Orders and the Warehouse Part List.

  6. Part Classification can only take one of these two values: Product or Service. Figure out how to map any values in the source to one of these two values.
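These rules translate into ordinary DataFrame transformations. The sketch below shows, in PySpark, one way rules 1-6 could be implemented. It is only an illustration, not part of the instructions you enter: the column names (OrderDate, Price, Address, Phone, PartDescription, PartClassification, WarehouseID) and the date and address formats are assumptions about the sample data, and the notebook the Engineer generates may structure this differently.

```python
from pyspark.sql import DataFrame
from pyspark.sql import functions as F


def apply_ingestion_rules(orders: DataFrame, warehouse_parts: DataFrame) -> DataFrame:
    """Apply ingestion rules 1-6 to the raw orders (column names are assumed)."""
    return (
        orders
        # Rule 1: a blank Order Date defaults to 01/01/1900 (source date format assumed).
        .withColumn(
            "OrderDate",
            F.coalesce(F.to_date("OrderDate", "MM/dd/yyyy"), F.to_date(F.lit("1900-01-01"))),
        )
        # Rule 2: strip $ (and thousands separators) from Price and cast to a number.
        .withColumn("Price", F.regexp_replace("Price", r"[$,]", "").cast("decimal(10,2)"))
        # Rule 3: extract City, State, and Zip Code, assuming "street, city, ST 12345" addresses.
        .withColumn("City", F.regexp_extract("Address", r",\s*([^,]+),\s*[A-Z]{2}\s+\d{5}", 1))
        .withColumn("State", F.regexp_extract("Address", r"([A-Z]{2})\s+\d{5}", 1))
        .withColumn("ZipCode", F.regexp_extract("Address", r"(\d{5})(?:-\d{4})?\s*$", 1))
        # Rule 4: normalize phone numbers to (XXX) XXX-XXXX, dropping any country code.
        .withColumn("phone_digits", F.expr("right(regexp_replace(Phone, '[^0-9]', ''), 10)"))
        .withColumn(
            "Phone",
            F.format_string(
                "(%s) %s-%s",
                F.substring("phone_digits", 1, 3),
                F.substring("phone_digits", 4, 3),
                F.substring("phone_digits", 7, 4),
            ),
        )
        .drop("phone_digits")
        # Rule 5: assign WarehouseID by joining on PartDescription.
        .join(warehouse_parts.select("PartDescription", "WarehouseID"), "PartDescription", "left")
        # Rule 6: map source classifications onto the two allowed values, Product or Service.
        .withColumn(
            "PartClassification",
            F.when(F.lower("PartClassification").contains("serv"), F.lit("Service"))
             .otherwise(F.lit("Product")),
        )
    )
```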

3. Select Generate Notebook

  1. Select Go to Notebook to review the ready-to-run Python notebook based on your configuration instructions, source files, and destination schemas.

  2. Run the Notebook to write the data to the destination schema.
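The notebook the Engineer generates will be specific to your files and schema, but its overall shape is usually: read the uploaded source files, apply the ingestion rules, and append the result to the destination table. Below is a minimal sketch, assuming the uploads are CSV files with hypothetical names sample_orders.csv and warehouse_part_list.csv in Files/SalesOps, and reusing the apply_ingestion_rules function sketched above.

```python
# Minimal sketch: read the uploaded source files, apply the ingestion rules,
# and append the result to the destination table. File names are hypothetical;
# `apply_ingestion_rules` is the function from the sketch above.
orders_raw = spark.read.option("header", True).csv("Files/SalesOps/sample_orders.csv")
warehouse_parts = spark.read.option("header", True).csv("Files/SalesOps/warehouse_part_list.csv")

sales_orders_daily = apply_ingestion_rules(orders_raw, warehouse_parts)

# Keep only the columns defined in the destination schema, then append.
destination_columns = spark.table("Sales_Orders_Daily").columns
(
    sales_orders_daily.select(*destination_columns)
    .write.format("delta")
    .mode("append")
    .saveAsTable("Sales_Orders_Daily")
)
```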
