Video overview and demonstration:
The Retail Demo - Instore Analytics provides the ability to identify product inventory conditions using AI on the Edge and Machine Learning. It also alerts on shelf inventory levels and customer arrival (for item pickup) via Microsoft Teams.
The AI on Edge pattern uses the Azure IoT Edge runtime with a custom Vision AI model and Azure Stream Analytics to deliver product detection and counting. The Vision AI model has been trained to identify unlabeled cans to avoid trademark / branding issues :-). The ASA module aggregates and buffers the data so that not every frame generates a message.
Messages are sent to IoT Central where business rules have been configured to alert when inventory levels are low.
Store employees use a Power App to simulate an Inventory System and place orders.
The demo also includes a Machine Learning scenario in which inventory levels and trends can be analyzed to improve inventory forecasting and help managers make more accurate inventory decisions.
A Location Analytics pattern is implemented through Azure Maps geofencing and indoor mapping. The demo walks you through simulating a customer arriving at the store which triggers the geofence alert. An indoor map of a retail location is also provided that can help with customer wayfinding.
After deploying the demo, a sample demonstration script can be found here.
- Demo targets Windows 64-bit Operating System.
- Visual Studio Code (required extensions are outlined in the VS Code environment setup section).
- Docker Community Edition.
- Git & Git LFS.
- Azure Subscription. You can set up a trial account here.
Git will be used to copy all the files for the demo to your local computer.
- Install Git from here.
- Install Git LFS from here
IMPORTANT: Ensure you have Git LFS installed before cloning the repo or large files will be corrupted. If you find you have corrupt files, you can run `git lfs pull` to download the files tracked in LFS.
- Open a command prompt and navigate to a folder where the repo should be downloaded.
- Issue the command `git clone https://github.com/Azure-Samples/IoTDemos.git`.
An Azure Resource Manager (ARM) template will be used to deploy all the required resources for the solution. Click on the link below to start the deployment. Note that you may need to register Azure Resource Providers in order for the deployment to complete successfully. If you get an error that a Resource Provider is not registered, you can register it by following the steps in this link.
Follow the steps to deploy the required Azure resources.
Resources include:
- API Connections.
- App Service.
- App Insights.
- Azure Maps Account.
- Azure Maps Creator Resource.
- Container Registry.
- Key Vault.
- Logic Apps.
- Machine Learning Workspace.
- SQL Server.
- SQL Database.
- Storage Account.
- Stream Analytics Job (Edge).
BASICS
- Subscription: Select the Subscription.
- Resource group: Click on 'Create new' and provide a unique name for the Resource Group.
- Location: Select the Region to deploy the resources. All resources will be deployed to this region.
NOTE: This template has been confirmed to work within the West US 2 region. If you wish to change the region, please ensure it supports the resources above.
SETTINGS
- Team Name: Your desired Microsoft Teams name (default is `Retail on the Edge`). IMPORTANT: There are instructions later in the setup that outline how to create a new Team. Remember this name as it will need to match your newly created Team!
- Team Channel Name: Your desired Microsoft Teams channel name (default is `Front Line Workers`). NOTE: This channel will be created by the Logic App if it does not exist.
- Customer Email: Enter your email address. This will be used by the Logic App to notify the customer that their order is ready to be picked up.
IMPORTANT: Remember the above values as you will not be able to retrieve them after submitting the template.
- Read and accept the TERMS AND CONDITIONS by checking the box.
- Click the Purchase button and wait for the deployment to finish.
- Review the output values:
  - Go to your Resource group and click Deployments from the left navigation.
  - Click Microsoft.Template.
  - Click Outputs from the left navigation.
  - Save the values for future use.
IMPORTANT: You will need these values later in the setup.
NOTE: Connection to the SQL server is allowed from all IP addresses by default. To update this rule, follow the instructions in the Optional Steps section.
Some resources require extra configuration.
Here we will run the script that creates the tables required by the solution.
- In the Azure portal select the Resource Group you created earlier.
- Select the SQL database resource.
- Click the Query editor option in the left menu.
- Enter the username `theadmin` and password `M1cro$oft2020`, then click the Connect button.
- Select Open Query from the top navigation and select `sql/script.sql` from the repo.
- Click Run.
- You should be able to see the created tables and stored procedures in the database. (An optional scripted check is sketched below.)
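If you'd like to confirm the tables from a script instead of the Query editor, here is a minimal sketch. It assumes the `pyodbc` package and the Microsoft ODBC Driver 17 for SQL Server are installed locally; the server name placeholder comes from your deployment outputs.

```python
# Hypothetical verification script -- not part of the demo setup.
# Replace <sql Server> with the SQL server name from your deployment outputs.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=<sql Server>.database.windows.net;"
    "DATABASE=rote;"
    "UID=theadmin;"
    "PWD=M1cro$oft2020"
)

# List the tables created by sql/script.sql.
for (name,) in conn.execute("SELECT name FROM sys.tables ORDER BY name"):
    print(name)

conn.close()
```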
Follow the next steps to set up the Edge Stream Analytics job.
- In the Azure portal select the Resource Group you created earlier.
- Take note of the Storage Account resource name.
- Select the Stream Analytics Job resource.
- Click Storage account settings from the left navigation.
- Click Add storage account.
- Select the storage account created in the ARM setup and add a new container named `edgemodules`.
- Click Save and confirm by clicking the Yes button.
- Select Publish from the left navigation.
- Click Publish and confirm by clicking the Yes button.
- Wait for the operation to complete.
- Save the SAS URL as you will need it later in the device deployment.
This setup is required if the team has not already been created, and is done using the Microsoft Teams app.
- Open the Microsoft Teams app.
- Log in to the app with the account that you will use to authenticate the Logic Apps.
- Click the Teams option in the left menu.
- Click the Join or create a team button at the bottom of the Teams list.
- Click the Create team button.
- Click Build a team from scratch.
- Click the Private option button.
- Enter the name for the team; this should be the same name you used during the ARM deployment for the Team Name setting. NOTE: The default value is `Retail on the Edge`.
- Click the Create button and wait for the creation to finish.
- Click the Skip button.
Here we need to authorize the connection for the Teams API resource with the account that will be used to send the alert notifications to Teams via the Logic Apps.
- In the Azure portal select the Resource Group you created earlier.
- Select the teams API connection resource.
- Click the Edit API connection option in the left menu.
- Modify the default Display Name if desired.
- Click the Authorize button to start the authorization process.
- Follow the steps with the account you want to use.
- When the process is finished, click the Save button.
Here we need to authorize the connection for the Office 365 API resource with the account that will be used to send the alert notification emails via the Logic Apps.
- In the Azure portal select the Resource Group you created earlier.
- Select the office365 API connection resource.
- Click the Edit API connection option in the left menu.
- Modify the default Display Name if desired.
- Click the Authorize button to start the authorization process.
- Follow the steps with the account you want to use.
- When the process is finished, click the Save button.
Here we will set up an event subscription for the Azure Maps account so that geofence events are sent to our Logic App. (A hedged smoke-test sketch follows these steps.)
- In the Azure portal select the Resource Group that you are using.
- Select the Azure Maps Account resource.
- Click the Events option in the left menu.
- Click the + Event Subscription button in the top of the panel.
- Enter `logicappalerts` in the Name input field.
- Leave `Event Grid Schema` as the Event Schema.
- For the System Topic Name field, enter the value `rotegeofencealert`.
. - Uncheck the Geofence Result and Geofence Exited options in the Filter to Event Types dropdown. Ensure that ONLY the Geofence Entered event is selected.
- For the Endpoint Type select the Web Hook option.
- Click the Select an endpoint link.
- In the new panel update the Subscriber Endpoint field with the value from the deployment output named Customer Arrived Alert Logic App Endpoint.
- Click the Confirm Selection button.
- Click the Create button.
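Before running the full demo, you can optionally smoke-test the Logic App wired up above. The sketch below posts a simplified Geofence Entered style event to the Customer Arrived Alert Logic App Endpoint from your deployment outputs; the payload shape is an assumption for testing only, since the real events are generated by Azure Maps and delivered through Event Grid.

```python
# Hypothetical smoke test -- the real geofence events come from Azure Maps via Event Grid.
# The endpoint URL is the "Customer Arrived Alert Logic App Endpoint" deployment output.
from datetime import datetime, timezone

import requests

LOGIC_APP_ENDPOINT = "<Customer Arrived Alert Logic App Endpoint>"

# Simplified, assumed shape of an Event Grid geofence event.
event = [{
    "id": "00000000-0000-0000-0000-000000000000",
    "eventType": "Microsoft.Maps.GeofenceEntered",
    "subject": "rotegeofencealert",
    "eventTime": datetime.now(timezone.utc).isoformat(),
    "dataVersion": "1.0",
    "data": {"geometries": []},
}]

response = requests.post(LOGIC_APP_ENDPOINT, json=event)
print(response.status_code)
```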
In this section we will use the IndoorMapSetup console app to create an indoor map with Azure Maps Creator.
NOTE: The portable executable provided is targeted for Windows 64-bit environments and .net 5.0. If the .exe file below does not run on your PC, consider creating a VM and running the exe there. The steps to create a VM are outside the scope of this document.
If you would like to do this manually, you can follow the tutorial here: Create Indoor Map.
- Open the folder `maps\IndoorMapSetupAppExecutable` in your repo.
- Execute the IndoorMapSetupApp.exe file.
- Paste the Azure Maps subscription key that you saved from the deployment with the name Maps Account Key and press enter.
NOTE: this process may take a few minutes.
- The following output will be displayed.
```
***********************************************************
*                Azure Maps Creator Setup                 *
*                                                         *
***********************************************************
Setup started...
Enter the Azure Maps subscription key and press enter: {subscriptionKey}
Upload Udid: {uploadUdid}
Conversion Id: {conversionId}
Dataset Id: {datasetId}
Tileset Id: {tilesetId}
Stateset Id: {statesetId}
Press any key to exit
```
- Copy and save the `Dataset Id`, `Tileset Id` and `Stateset Id` values for future use. NOTE: These values are also saved to file.

NOTE: The source code for the above application is available here: `maps/IndoorMapSetupApp`. You will need .NET Core 3.1 or later to run the application.
In this section we will set up the Inventory Power App.
- Log in to Power Apps.
- Click the Data option in the left menu.
- Click the Connections sub option from the list.
- Click the + New connection button.
- Click the SQL Server option.
- Select the SQL Server Authentication option for the Authentication Type.
- Enter the following values from the outputs of the template deployment:
  - SQL Server Name: `<sql Server>` (obtain the value from the deployment output or the Azure Portal).
  - SQL Database name: `<database Name>` (default value is `rote`).
  - Username: `<sql Server Username>` (default value is `theadmin`).
  - Password: `<sql Server Password>` (default value is `M1cro$oft2020`).
  - Choose a gateway: leave blank (not needed).
- Click the Create button and wait for the connection to be created.
- Click the Flows option in the left menu.
- Click the + New option.
- Select the Create from template option.
- Select the PowerApps button option.
- Click the New step button.

NOTE: There is an accompanying animation at the end of this section that outlines the upcoming steps.

- Search for the SQL Server option and find the Execute stored procedure (V2) option.
- Click `...` in the top right of the step and ensure the SQL connection is correct under My Connections.
- Select the following options from the dropdown lists:
  - Server name: `Use connection settings (<sql Server>)`.
  - Database name: `Use connection settings (rote)`.
  - Procedure name: `[dbo].[CreateSupplierOrder]`.
- For the following 4 input fields, click the text box and select the Ask in PowerApps option under the Dynamic content tab. IMPORTANT: You must do this in the order below or it will fail. If the Ask in PowerApps option is not displayed, click See more.
  - CreationDateStr.
  - ProductCode.
  - Quantity.
  - Total.
- Add a new step, search for the `HTTP` type, and select it.
- Configure the step with the following details:
  - Method: `POST`.
  - URI: This is the URI from the deployment output named Order Ready to Picked Up Logic App Endpoint.
  - Body: Enter the following in the body input field: `{ "name": "" }`. Click inside the double quotes ("") and select Executestoredprocedure(V2)_ProductCode. (A test sketch for this endpoint follows below.)
- Click the Save button.
NOTE: The animation below outlines the above process.
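The HTTP step above simply POSTs a small JSON payload to a Logic App. If you want to test that endpoint outside of Power Automate, a minimal sketch is below, assuming the `requests` package; the URL placeholder is the Order Ready to Picked Up Logic App Endpoint from your deployment outputs and the product code is an example value.

```python
# Hypothetical test of the "order ready" Logic App endpoint outside of Power Automate.
import requests

LOGIC_APP_ENDPOINT = "<Order Ready to Picked Up Logic App Endpoint>"  # from deployment outputs

# Same body shape as the HTTP step in the flow; the product code is an example value.
response = requests.post(LOGIC_APP_ENDPOINT, json={"name": "<ProductCode>"})
print(response.status_code)
```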
- Click the Apps option in the menu.
- Click the Import canvas app button on the top.
- Click the Upload button and select `powerapp/Inventory.zip`.
- Click the Configure button next to the App resource.
- Select Create as new under Setup and click the Save button.
- Click the Configure button next to the SQL Server Connection resource.
- Select the connection created earlier and click the Save button.
- Select the Configure button next to the Flow resource.
- Select the flow created earlier and click the Save button.
- Click the Import button and wait for the app to be imported.
- Click the Apps option in the menu.
- Open the Inventory App and allow the use of the SQL Server connection.
- Install Visual Studio Code (VS Code) if required.
- Install 64 bit Anaconda with Python version 3.7.
- Install Docker Community Edition (CE) if required. Don't sign in to Docker Desktop after Docker CE is installed.
- Install the following extensions for VS Code:
- Azure Machine Learning (the Azure Account and Microsoft Python extensions will be installed automatically)
- Azure IoT Hub Toolkit
- Azure IoT Edge
- Docker Extension
- Restart VS Code.
- Select [View > Command Palette…] to open the command palette box, then enter [Python: Select Interpreter] command in the command palette box to select your Python interpreter.
- Select the Anaconda interpreter.
- Enter the [Azure: Sign In] command in the command palette box to sign in to your Azure account and select your subscription.
- Launch Visual Studio Code and open the `/device` folder in your repo.
- Update the .env file with the values for your container registry:
  - In the Azure portal select the Resource Group you created earlier.
  - Select the Container Registry resource.
  - Select Access Keys from the left navigation.
  - Update the following in `device/.env` with the corresponding values from the Access Keys within the Container Registry:
    - CONTAINER_REGISTRY_ADDRESS=`<Login Server>` (ensure this is the login server and NOT the Registry Name)
    - CONTAINER_REGISTRY_USER_NAME=`<Username>`
    - CONTAINER_REGISTRY_PASSWORD=`<Password>`
    - ASA_BLOB_URL=`<ASA Blob Url>` (the SAS URL you obtained earlier in the Edge Stream Analytics Job setup step)
  - Save the file.
- Sign in to your Azure Container Registry by entering the following command in the Visual Studio Code integrated terminal (replace <CONTAINER_REGISTRY_ADDRESS>, <CONTAINER_REGISTRY_USER_NAME>, and <CONTAINER_REGISTRY_PASSWORD> with the container registry values set in the .env file):
docker login -u <CONTAINER_REGISTRY_USER_NAME> -p <CONTAINER_REGISTRY_PASSWORD> <CONTAINER_REGISTRY_ADDRESS>
By default, the app uses videos to simulate an empty or stocked shelf when consuming the model. If you want to use an RTSP stream instead, do the following (a quick stream-connectivity check is sketched after the note below):
- Open `device/deployment.test-amd64.template.json`.
- Update the following desired property for the CameraCapture module: `"VideoPath": "<RTSP Stream Address>"`
- Save the file.
NOTE: The model on the device works with unlabeled silver cans. Four or more cans represents a stocked shelf; any fewer will trigger an alert.
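If you do point the module at an RTSP stream, you can sanity-check that the stream is reachable before deploying. This is a minimal sketch assuming the `opencv-python` package is installed; the address is the same value you placed in `VideoPath`.

```python
# Hypothetical RTSP sanity check -- replace the address with the value set in "VideoPath".
import cv2

stream_address = "<RTSP Stream Address>"

capture = cv2.VideoCapture(stream_address)
ok, frame = capture.read()
if ok:
    print(f"Stream is readable, frame size: {frame.shape[1]}x{frame.shape[0]}")
else:
    print("Could not read a frame; check the address and network access.")
capture.release()
```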
The ImageClassifierService module comes with a Custom Vision object detection model that has been trained to recognize cans with the tag "GroceryItem".
If you would like to use a different product, you can update the model, however, you must tag the item with "GroceryItem" or the Edge Streaming Analytics Job will not work correctly.
- Replace `device/modules/ImageClassifierService/app/model.pb` with your new model.
- Increment the version in the `device/modules/ImageClassifierService/module.json` file.
- Complete the steps in the next section.
- Right-click on `device/deployment.test-amd64.template.json`.
- Select Build and Push IoT Edge Solution.
- Wait for the process to complete.
- Go to `device/config` and save `deployment.test-amd64.json` as you will need it for the IoT Central setup later in the guide.
Creating a new IoT Central environment is outside the scope of this document. Follow the instructions here to set up an IoT Central Retail application.
Once your environment is created, complete the following steps to configure it for the demo.
Click here and here to learn how to customize IoT Central.
- Change the image on the dashboard to use the Contoso Market logo here.
- Set the Browser colors to #ff566C.
- Sign in to your IoT Central environment.
- Click Device Templates
- Click + New.
- Click Azure IoT Edge.
- Click Next: Customize.
- Device template name: This can be anything but you'll need to remember it for use later.
- Click Browse.
- Upload the `device/config/deployment.test-amd64.json` file.
- Click Next: Review.
- Click Create.
- Click on Module ImageClassifierService.
- Click Delete (and then Delete again).
NOTE: This module doesn't have any properties or telemetry, so it is considered invalid during publishing.
- Under Module GroceryItemsEdgeStreamJob, click Manage.
- Using + Add capability, add the following items. Each is of capability type Telemetry:
  - Name: `message_type`, schema: `String`
  - Name: `event_type`, schema: `String`
  - Name: `timestamp`, schema: `DateTime`
  - Name: `count`, schema: `Double`

  NOTE: All other values can be left as their defaults.
- Click Save.
- Click Views.
- Click Visualizing the device.
- Check count.
- Click Add tile.
- Click Save.
- Click Publish (and then Publish again).
- Click Rules.
- Click + New.
- Enter a Rule Name
- Select your new Device Template from the drop down.
- Add the following conditions (an illustrative telemetry message that matches this rule is sketched after these steps):
  - `message_type` Equals `alert`
  - `event_type` Does not equal `update`
  - `count` Is greater than or equal to `0`
- Under Action, click + Webhook.
- Enter `Logic App` for the Display name.
- Insert the Stock Level Alert Logic App Endpoint from the outputs of the ARM template.
- Click Done.
- Click Save.
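For reference, the rule above fires on telemetry shaped roughly like the example below. The field names come from the module capabilities added earlier; the exact payload emitted by the edge Stream Analytics job may differ, so treat this as an illustrative shape with assumed example values only.

```python
# Illustrative telemetry shape only -- field names match the module capabilities above,
# but the values are assumed examples, not the actual edge Stream Analytics output.
import json
from datetime import datetime, timezone

low_stock_alert = {
    "message_type": "alert",      # rule condition: Equals "alert"
    "event_type": "low_stock",    # rule condition: Does not equal "update"
    "timestamp": datetime.now(timezone.utc).isoformat(),
    "count": 2.0,                 # rule condition: greater than or equal to 0
}

print(json.dumps(low_stock_alert, indent=2))
```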
- Click Devices.
- Click + New.
- Choose the template created earlier from Template type.
- Optionally customize the device ID or name.
- Click Create.
- Click on your new Device ID (in the device list).
- Click Connect.
- Copy the following values:
- ID scope
- Device ID
- Primary key
NOTE: these will be used to connect the device shortly using DPS.
- Click Administration.
- Click API Tokens.
- Click + Generate Token.
- Enter a Token name.
- Leave Role as Administrator.
- Save the token for later use in the Web App Setup.
- On the home page of the IoT Central app, click Edit.
- Under the Add a tile section:
  - For Device group, choose the IoT Edge device group created earlier.
  - For Devices, choose the device that has been onboarded.
  - For Telemetry, check count.
  - Click Add tile.
- Move the tile so it is prominently displayed.
- Click Save.
The easiest option to create a device is to use the IoT Edge template available in Azure.
- Provision a new VM using the template. Consider the following when provisioning:
  - You can use the existing resource group you have created or create a new one.
  - Ensure port 22 is open (this is the default).
  - Select a machine type with at least two vCPU cores.
  - If you haven't used SSH before/recently, password-based auth might be an easier option.
- Once provisioning is complete, open your preferred shell. NOTE: Ensure you have SSH installed. If you're not sure, you can always use the Cloud Shell.
- Run `ssh <your-username>@<your-machine-ip-address>`. NOTE: Your username will be the one you specified during provisioning and your machine IP address will be available from the Azure Portal (by viewing the VM you just created).
- Once connected to the machine, run `sudo nano /etc/iotedge/config.yaml`.
- Comment out the following section (i.e. add `#` before each line):

  ```yaml
  provisioning:
    source: "manual"
    device_connection_string: "<ADD DEVICE CONNECTION STRING HERE>"
  ```

- Uncomment the following section (i.e. remove the `#`, and the space, before each line). NOTE: This is the bottom DPS code block.

  ```yaml
  provisioning:
    source: "dps"
    global_endpoint: "https://global.azure-devices-provisioning.net"
    scope_id: "{scope_id}"
    attestation:
      method: "symmetric_key"
      registration_id: "{registration_id}"
      symmetric_key: "{symmetric_key}"
  ```

- Update `{scope_id}`, `{registration_id}`, and `{symmetric_key}` with the values you copied from the previous section. NOTE: `scope_id` refers to the ID scope, `registration_id` refers to the device ID, and `symmetric_key` refers to the primary key. IMPORTANT: Make sure to remove the `{}` around the placeholders.
- Press `CTRL + X`, then `Y`, then `Enter` to save and quit.
- Run `sudo systemctl restart iotedge`. NOTE: Your `config.yaml` file should now look like the example below.
- Run `sudo iotedge list` to see if your modules are running (these may take a while to download). You can also run `sudo iotedge check` to see if there are any issues with your configuration.
Learn more here.
If you would like to onboard an actual device, you can follow instructions here:
Next we will update our web app configuration settings.
- In the Azure portal select the Resource Group you created earlier.
- Select the App Service resource.
- Click the Configuration option in the left menu.
- Update the following settings by clicking Edit then Save:
  - Name: `Azure:Maps:DatasetId`, Value: `<DatasetId from Maps Output>`
  - Name: `Azure:Maps:StateSetId`, Value: `<StateSetId from Maps Output>`
  - Name: `Azure:Maps:TilesetId`, Value: `<TileSetId from Maps Output>`
  - Name: `Azure:IoTCentral:DeviceId`, Value: `<IoT Central Device Id>`
  - Name: `Azure:IoTCentral:IoTCentralDomain`, Value: `<IoT Central Domain>` (e.g. `<instance>.azureiotcentral.com`)
  - Name: `Azure:IoTCentral:IoTCentralApiToken`, Value: `<IoT Central API token>`
- Click Save to commit the changes.
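If you want to sanity-check the IoT Central values (domain, device ID, API token) before relying on the web app, you can call the IoT Central REST API directly. The route and api-version below are assumptions based on the public IoT Central REST API and may need adjusting for your environment.

```python
# Hypothetical sanity check of the IoT Central settings used by the web app.
# The route and api-version are assumptions; consult the IoT Central REST API reference.
import requests

iot_central_domain = "<IoT Central Domain>"  # e.g. <instance>.azureiotcentral.com
device_id = "<IoT Central Device Id>"
api_token = "<IoT Central API token>"        # passed as the Authorization header

url = f"https://{iot_central_domain}/api/devices/{device_id}?api-version=2022-07-31"
response = requests.get(url, headers={"Authorization": api_token})
print(response.status_code, response.json())
```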
In this section, we will set up our notebook with the required files for the generation of the model.
- In the Azure portal select the Resource Group you created earlier.
- Select the Storage account resource.
- Click the Containers option in the left menu under Blob service.
- Click the azureml-blobstore-GUID container.
- Click the Upload button in the top.
- Click the Select a file input and select `ml\data\ntest_data.parquet` from your repo.
- Click the Upload button and wait for the upload to finish.
- Click the Upload button in the top again.
- Click the Select a file input and select `ml\data\train_data.parquet` from your repo.
- Click the Upload button and wait for the upload to finish. (A scripted alternative to these portal uploads is sketched below.)
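If you prefer to script these two uploads instead of using the portal, here is a minimal sketch using the `azure-storage-blob` package. The connection string comes from the Storage Account's Access keys blade, and the container name is the `azureml-blobstore-<GUID>` container referenced above.

```python
# Hypothetical scripted upload -- equivalent to the portal steps above.
from azure.storage.blob import BlobServiceClient

connection_string = "<Storage Account connection string>"  # Storage account > Access keys
container_name = "azureml-blobstore-<GUID>"                # the container used above

service = BlobServiceClient.from_connection_string(connection_string)
container = service.get_container_client(container_name)

for local_path in [r"ml\data\ntest_data.parquet", r"ml\data\train_data.parquet"]:
    blob_name = local_path.split("\\")[-1]
    with open(local_path, "rb") as data:
        container.upload_blob(name=blob_name, data=data, overwrite=True)
    print(f"Uploaded {blob_name}")
```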
- Select the Azure Active Directory option from the main navigation in the Azure Portal.
- Copy the Tenant Id value from the overview as you will need this value later.
- Go back to the Resource Group you created earlier.
- Select the Machine Learning resource.
- Take note of the following values to be used later in the deployment.
- Click the Launch now button to open the Machine Learning workspace.
- Click the Notebooks option in the left menu under Author.
- Click the Upload files button.
- Select the file `rote-forecast.ipynb` inside the `ml` folder.
- If prompted, check the **I trust contents of this file** option.
- Select your username folder from the target directory list.
- Click the Upload button.
We need to configure values within the notebook before being able to execute it:
- Click `rote-forecast.ipynb` in the My files navigation.
- Click the New Compute button.
- Enter the name `compute-{your-initials}`.
- Select CPU (Central Processing Unit) from the Virtual machine type dropdown.
- Select the virtual machine size Standard_D12_v2.
- Click the Create button and wait for the compute to be created. Note: This process can take several minutes; wait until the status of the compute is `Running`.
- Click the Edit dropdown and select the Edit in Jupyter option. Note: If required, log in with your Azure credentials.
- Replace the values within the Setup Azure ML cell with the values you obtained in the Notebook files upload section:

  ```python
  interactive_auth = InteractiveLoginAuthentication(tenant_id="<tenant_id>")

  # Get instance of the Workspace and write it to config file
  ws = Workspace(
      subscription_id = '<subscription_id>',
      resource_group = '<resource_group>',
      workspace_name = '<workspace_name>',
      auth = interactive_auth)
  ```
- Click File > Save and Checkpoint from the menu.
- Close the browser tab being used to edit the Jupyter notebook.
- In the tab with the Machine Learning Workspace:
  - Start to execute the cells with the following considerations:
    - After creating the utils.py file, you MUST restart the kernel.
    - For the Setup Azure ML cell, observe the output to authenticate via the URL provided (https://microsoft.com/devicelogin).
    - From here, run the remaining cells sequentially until you have executed the entire notebook.
    - Executing the Monitor Experiment step takes around 17 minutes on average.
IMPORTANT: Remember to wait for each cell to execute before continuing.
The steps below cover how to demonstrate the functionality of the demo. To perform the demonstration in a business context, we've created a walkthrough with Retail personas. Once you're familiar with the demo architecture and steps, consider using the retail version of the demo walkthrough located here.
These steps show how AI + IoT on the Edge can help monitor inventory.
- Open the Contoso Market Web Application.
- Click on Canned Beans.
- Click Buy.
- Click Cart on the navigation. Increasing the quantity purchased is not needed.
- Click on Proceed to Checkout.
- Click Proceed to Pay. At this point in the demo, we can show that customer demand is diminishing stock. We can use AI on the Edge to react faster:
- Triggering the Low Stock Alert:
  - Default Behavior: An IoT Edge module property update occurs after the customer makes the order to set the video on the module to show Low Stock.
  - Browsing the IoT Edge endpoint on port 5012 will show the current stream being analyzed in a web browser, e.g. http://192.168.1.1:5012.
  - RTSP Stream: Make sure there are fewer than 4 cans in the camera's vision to trigger the alert.
- Go to Microsoft Teams and observe the Low Stock Alert from the device in the Front Line Workers channel.
- OPTIONAL: If you want to restock the shelves you can do the following (note this will also happen when you reset the demo):
  - RTSP Stream: Add more than four cans in the camera's vision and observe the restocked alert in Teams.
  - Videos:
    - Go to your IoT Central environment.
    - Click Devices.
    - Click on your device.
    - Click Manage from the tabs.
    - Update the CameraCapture property VideoPath to `./StockedShelf.mp4`.
    - Observe the restocked alert in Teams.
- OPTIONAL: Go to IoT Central and show that the tile added for the IoT Edge device is reporting the stock level.
These steps show the integration capabilities of the Power Platform.
- Go to Power Apps.
- Click Apps.
- Open the Inventory application.
- At the end of the Canned Beans row, click to make an order.
- Select the Quantity. This is optional and does not impact the demo.
- Click Make Order.
- Observe the new order.
- Triggering the Customer Email notification. This is a hidden trigger to make the demo simpler to execute, and it happens when you make the supplier order. The customer (you) should receive an email notification saying that your product is ready to be picked up!
These steps show the integration with Azure Maps.
- Go back to the Contoso Market Web Application.
- Click Account in the navigation. This hidden trigger fires a geofence alert that creates a notification via a Logic App.
- Go to Microsoft Teams and observe the Customer Has Arrived Alert in the Front Line Workers channel.
- Click the Notification to get directions to the store.
- Click the Floor Map to give it focus (this may need two clicks). You will see floor levels in the top right.
- You will see Contoso Markets at store 252 is highlighted.
- Run the notebook you created in the previous section and follow the necessary steps. You may want to explain what each step is performing when demoing the notebook.
To reset the demo, manually navigate to `http://<webapp-url>/reset`. This will do the following updates:
- Reset the database into its original state.
- If you are using the video, it will put it back to the stocked shelf state.
- Reset the user's geofence location.
It is recommended to refresh the web application after this step before running through the demo again.
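If you would rather reset from a script than the browser, a minimal sketch is below, assuming the `requests` package; `<webapp-url>` is your deployed App Service URL.

```python
# Hypothetical scripted reset -- hits the same /reset endpoint described above.
import requests

webapp_url = "https://<webapp-url>"  # your deployed App Service URL

response = requests.get(f"{webapp_url}/reset")
print(response.status_code)
```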
In this section we will describe some steps that are not required for the demo but allow for further customization if required.
Follow the next steps to update or remove the rule that allows connections from all IP addresses.
- In the Azure portal select the Resource Group that you are using.
- Find the SQL server resource and click it to see the detail.
- Click the Firewalls and virtual networks option in the menu on the left under the Security section.
- Check the AllowAllIps rule.
- If you want to remove the rule, click the ... button next to it and then click the Delete button.
- If you want to update the rule, click and update the Start IP and End IP columns with the values you want.
Note: For a single IP set both values with the required IP.
- Click the Save button.