Build Better Customer Applications with Multiexperience Development

The promise of multiexperience applications is phenomenal. With multiexperience you can align and connect the optimal user experience for each user touchpoint through fit-for-purpose applications that make every interaction across the user journey effortless. Attend this session to see why Mendix is a leader in multiexperience development platforms. Simon Black, Team Lead Evangelist, and David Brault, Product Marketing Manager, demonstrate applications that utilize several modalities across a customer journey, including:

  • chatbots
  • augmented reality
  • voice assistants
  • progressive web applications
  • native mobile
  • TV applications

Transcript

    [00:00:00.000]
    (upbeat music)

    [00:00:15.280]
    Simon: Hello, and welcome to this session

    [00:00:16.441]
    on building multiexperience applications.

    [00:00:19.690]
    My name is Simon Black, team lead for the Evangelists.

    [00:00:23.099]
    Today I’m broadcasting from Ely, Cambridgeshire

    [00:00:26.280]
    in the UK.

    [00:00:27.410]
    And today I’m joined by Dave Brault,

    [00:00:29.380]
    who will talk to us

    [00:00:30.480]
    about what a multiexperience development platform is.

    [00:00:34.080]
    Hi Dave, and can you introduce yourself?

    [00:00:36.620]
    Dave: Thanks, Simon.

    [00:00:37.453]
    My name is David Brault,

    [00:00:38.286]
    product marketing manager here at Mendix.

    [00:00:40.340]
    A little bit about myself.

    [00:00:41.620]
    I relocated to Austin, Texas about six years ago

    [00:00:45.080]
    where I’m doing this broadcast live

    [00:00:46.626]
    and I’m fully assimilated now

    [00:00:48.800]
    to the point where I spend way too much time

    [00:00:51.190]
    on the hunt for the best barbecue in Texas.

    [00:00:53.790]
    Anyway, let’s get back to why you’re here,

    [00:00:55.320]
    building better customer applications

    [00:00:57.350]
    with multiexperience development.

    [00:00:59.050]
    During this session, we’re going to take a look

    [00:01:00.940]
    at how MXDP is changing the development landscape

    [00:01:03.910]
    and how Mendix can help.

    [00:01:05.395]
    Then we’ll demonstrate

    [00:01:06.498]
    several different kinds of experiences

    [00:01:08.960]
    and let Simon pull back the covers

    [00:01:10.495]
    so we can see how they were built.

    [00:01:13.120]
    Let’s start off with a quote.

    [00:01:14.757]
    “Good design is actually a lot harder to notice

    [00:01:17.307]
    than poor design,

    [00:01:18.547]
    in part because good designs fit our needs so well

    [00:01:21.867]
    that the design is invisible.”

    [00:01:23.650]
    And I absolutely love this quote

    [00:01:25.210]
    because it’s so on point with the topic at hand.

    [00:01:28.990]
    The primary purpose of MXDP or

    [00:01:30.480]
    multiexperience development platforms

    [00:01:32.950]
    is to enable companies to create applications

    [00:01:35.300]
    that deliver sophisticated user experiences

    [00:01:37.550]
    across many devices and modalities

    [00:01:39.860]
    like the ones you see on the screen right now.

    [00:01:42.320]
    Now the promise of multiexperience is phenomenal.

    [00:01:45.540]
    Connect the optimal user experience

    [00:01:47.740]
    to each customer touchpoint

    [00:01:49.510]
    with fit for purpose applications

    [00:01:51.720]
    that make every user interaction efficient and effortless,

    [00:01:55.400]
    or invisible, as Don would say.

    [00:01:58.120]
    Now, the Mendix platform delivers on this promise

    [00:02:00.650]
    with a truly integrated solution

    [00:02:02.830]
    that uses a single skillset

    [00:02:04.600]
    to build rich and engaging applications

    [00:02:06.925]
    and experiences for any situation.

    [00:02:10.110]
    Let’s take a deep dive

    [00:02:11.210]
    into how the platform supports

    [00:02:12.650]
    the rapid development of multiexperience solutions.

    [00:02:17.170]
    At the foundation layer, the platform services

    [00:02:19.590]
    and cloud native architecture of Mendix

    [00:02:21.860]
    do all the heavy lifting.

    [00:02:23.600]
    It handles the complexity of dealing

    [00:02:25.290]
    with loosely coupled applications

    [00:02:27.330]
    and services running on a service-based architecture.

    [00:02:30.749]
    It also handles all the core services

    [00:02:32.370]
    like logging and security and backup.

    [00:02:35.809]
    Mendix applications and services are completely portable,

    [00:02:39.120]
    which means they can be moved or distributed

    [00:02:40.960]
    across cloud providers at will.

    [00:02:44.490]
    Now at the next level

    [00:02:45.323]
    integrations with any service or data source

    [00:02:47.905]
    can be consumed and published inside Studio Pro.

    [00:02:50.880]
    So that’s REST, SOAP, OData, SQL, JSON, XML,

    [00:02:54.980]
    even proprietary sources, all with no coding.

    [00:02:59.340]
    They can be packaged up

    [00:03:00.380]
    as re-usable connectors inside of Data Hub

    [00:03:03.040]
    or used in app services published in the Mendix marketplace.

    [00:03:08.430]
    App services, what they do, they combine UI

    [00:03:11.180]
    building blocks, widgets, logic, connectors, and services

    [00:03:14.900]
    into packaged business capabilities,

    [00:03:17.270]
    which can be used for building experiences

    [00:03:19.426]
    higher up in the development chain.

    [00:03:22.850]
    Now with Data Hub, these same reusable connectors

    [00:03:25.164]
    can be exposed as virtualized data entities

    [00:03:28.635]
    in a searchable catalog, which is great

    [00:03:30.870]
    because now any developer

    [00:03:32.223]
    has the ability to access rich metadata

    [00:03:34.721]
    and, equally important, with built-in governance

    [00:03:37.500]
    and security access.

    [00:03:40.150]
    Now at the peak of this pyramid,

    [00:03:41.860]
    developers stand atop a mountain of technology

    [00:03:45.110]
    abstraction and reuse that allows them to focus

    [00:03:47.970]
    on designing compelling user experiences.

    [00:03:51.580]
    Which means development

    [00:03:53.050]
    is no longer constrained by technology.

    [00:03:55.519]
    Okay, so now that you’ve seen the architecture,

    [00:03:57.850]
    let’s see what multiexperience applications look like

    [00:04:00.480]
    and how they’re built.

    [00:04:02.580]
    Now for the rest of this session,

    [00:04:03.661]
    we’re going to follow a customer

    [00:04:05.830]
    through a journey of buying a car,

    [00:04:07.510]
    from ordering it, to getting it delivered

    [00:04:09.465]
    with a small hiccup along the way.

    [00:04:12.900]
    Let’s start with researching and buying the car,

    [00:04:14.880]
    which involves a combination of progressive web apps,

    [00:04:17.296]
    chatbots, augmented reality,

    [00:04:19.890]
    and leveraging native mobile applications

    [00:04:22.060]
    and their device features all built with Mendix.

    [00:04:25.900]
    The customer journey begins here

    [00:04:27.370]
    with this progressive web app.

    [00:04:28.780]
    It’s responsive so it runs on any form factor

    [00:04:31.410]
    and it’s fast because most of the application

    [00:04:33.830]
    runs locally on the device.

    [00:04:36.054]
    Now instead of calling or emailing the dealership,

    [00:04:38.692]
    the next experience is an inline chatbot

    [00:04:41.280]
    to schedule a test drive for 3:00 p.m.

    [00:04:43.820]
    Using a combination of both typing

    [00:04:45.600]
    and voice-to-text capabilities.

    [00:04:48.510]
    Now, after the test drive,

    [00:04:49.411]
    the customer uses augmented reality to configure their car

    [00:04:53.160]
    by overlaying different paint and wheel colors

    [00:04:56.410]
    because dealerships rarely stock

    [00:04:58.017]
    every single color combination.

    [00:05:02.180]
    Deciding to buy the car, the customer harnesses

    [00:05:04.680]
    the power of the phone’s location services

    [00:05:07.060]
    to populate their address

    [00:05:08.440]
    and uses a credit card scanner

    [00:05:09.743]
    to populate their credit card details all without typing.

    [00:05:15.070]
    And last in between commercial breaks,

    [00:05:17.336]
    the customer uses a native TV application

    [00:05:19.820]
    to check on the status of their car

    [00:05:21.690]
    as it moves through the various stages

    [00:05:23.326]
    of the manufacturing process.

    [00:05:26.460]
    So at this point in time,

    [00:05:27.293]
    I’m going to let Simon share with us

    [00:05:29.610]
    how he used Mendix to build some of these experiences.

    [00:05:33.200]
    Simon.

    [00:05:34.410]
    Simon: Thanks, Dave.

    [00:05:35.680]
    In my sections, I’ll be taking a deeper dive

    [00:05:38.070]
    into how those experiences

    [00:05:39.590]
    were built using the Mendix platform.

    [00:05:41.830]
    In this section, we’ll cover

    [00:05:43.190]
    how we built the chatbot experience using AWS Lex,

    [00:05:47.300]
    how we built out the AR experience

    [00:05:49.580]
    using our React Native platform,

    [00:05:51.520]
    and finally how we built out the experience

    [00:05:54.010]
    for our Fire TV Stick app.

    [00:05:56.730]
    So let’s take a look at how those experiences are built.

    [00:05:59.680]
    First of all, inside this progressive web application,

    [00:06:02.660]
    we can purchase a number of vehicles,

    [00:06:04.399]
    but also ask certain questions

    [00:06:06.690]
    using this chatbot feature here.

    [00:06:09.170]
    This particular chatbot is using AWS Lex

    [00:06:11.790]
    as its chatbot engine.

    [00:06:13.750]
    And we can configure it to use a number of dialogues

    [00:06:16.908]
    and understand what our customer is asking it.

    [00:06:20.210]
    We can also add certain context data

    [00:06:22.160]
    from our Mendix application.

    [00:06:24.500]
    The way we train those bots is using the bot trainer

    [00:06:27.830]
    inside the AWS Lex interface here.

    [00:06:30.710]
    And all bots work in a similar manner.

    [00:06:33.150]
    You build them using intents, slots and entities.

    [00:06:37.520]
    An intent is something that you want the bot to do.

    [00:06:40.570]
    And here we have a number of bots that we’ve created,

    [00:06:43.600]
    but we have this scheduled test drive,

    [00:06:45.700]
    the one we showed in that video earlier.

    [00:06:48.570]
    So here we can see we have an intent to make an appointment,

    [00:06:51.642]
    and with this, we have to give it a number of utterances.

    [00:06:55.280]
    Essentially, an utterance is a sentence,

    [00:06:57.420]
    an example sentence that we wanna train this bot on.

    [00:07:00.500]
    It will recognize those patterns

    [00:07:02.430]
    and trigger an action based on this particular intent.

    [00:07:05.888]
    We can also pick out certain key data

    [00:07:08.193]
    from that particular utterance.

    [00:07:09.950]
    So things like the booking type, the time

    [00:07:12.550]
    and also the car that we wanna book for.

    [00:07:15.930]
    So all chatbots work in a very similar manner

    [00:07:18.710]
    and we’ll show you more

    [00:07:19.710]
    as we go through these demonstrations.

    [00:07:22.810]
    So, first we can see here we have the booking type,

    [00:07:25.610]
    the car model, date/time,

    [00:07:27.490]
    and these are stored in slot values.

    [00:07:29.300]
    These are the things that we want to keep and store

    [00:07:31.690]
    inside our application.

    [00:07:33.170]
    Because the chatbot is very dumb.

    [00:07:34.910]
    It doesn’t actually store any data,

    [00:07:37.400]
    it simply acts on certain information

    [00:07:39.290]
    and sends that back to the requester.

    [00:07:41.917]
    So let’s go ahead and have a look

    [00:07:44.240]
    at how that is built inside the Mendix model.

    [00:07:47.500]
    So here we have the Mendix model for our application

    [00:07:49.907]
    and we first have our progressive web app

    [00:07:51.950]
    where we can see the different details.

    [00:07:53.800]
    And we also have a microflow

    [00:07:56.033]
    that is being used to send the data

    [00:07:58.350]
    to that particular Lex service.

    [00:08:00.600]
    Now, this is using the AWS Lex connector

    [00:08:02.820]
    available in the Mendix app store.

    [00:08:04.890]
    And inside that particular connector,

    [00:08:06.870]
    you can set up the keys and identities

    [00:08:09.870]
    as well as the utterance

    [00:08:10.883]
    that we’re gonna send to this particular chatbot.

    [00:08:14.970]
    And as I said, that utterance is like a message.

    [00:08:17.015]
    So it will interpret that message

    [00:08:18.850]
    and come back with a response.

    [00:08:20.670]
    And inside that response will be a number of slots.

    [00:08:24.210]
    Those are how we store the actual values,

    [00:08:26.610]
    the information like the time, the date,

    [00:08:28.607]
    and also the car type.

    [00:08:30.830]
    And we’re storing those inside the Mendix application

    [00:08:33.040]
    so that we can remember that conversation

    [00:08:36.040]
    and we can also create that booking.
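
    The Lex connector wraps this exchange in a microflow action, so no hand-written code is needed in the demo. Purely to make the round trip concrete, here is a minimal TypeScript sketch of the same call against the Lex V2 runtime; the bot ID, alias, region, and session IDs are placeholders, not values from the demo.

    ```typescript
    // Minimal sketch: send one utterance to an Amazon Lex V2 bot and read back
    // the matched intent and its slot values. All IDs below are placeholders.
    import {
      LexRuntimeV2Client,
      RecognizeTextCommand,
    } from "@aws-sdk/client-lex-runtime-v2";

    const lex = new LexRuntimeV2Client({ region: "eu-west-1" });

    async function sendUtterance(utterance: string) {
      const response = await lex.send(
        new RecognizeTextCommand({
          botId: "BOT_ID_PLACEHOLDER",
          botAliasId: "BOT_ALIAS_PLACEHOLDER",
          localeId: "en_GB",
          sessionId: "customer-1086", // keeps the conversation context per user
          text: utterance,
        })
      );

      // The bot itself stores nothing: the slot values (booking type, car model,
      // date/time in the demo) come back on the matched intent, and the calling
      // application is responsible for persisting them and creating the booking.
      const intent = response.sessionState?.intent;
      console.log("Matched intent:", intent?.name);
      console.log("Slots:", JSON.stringify(intent?.slots, null, 2));
      console.log("Bot reply:", response.messages?.map((m) => m.content).join(" "));
      return intent?.slots;
    }

    sendUtterance("Book a test drive for 3 pm").catch(console.error);
    ```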

    [00:08:40.360]
    Inside the UI of this application for this chatbot,

    [00:08:42.950]
    we’ve just simply used a list view.

    [00:08:45.130]
    And inside that list view,

    [00:08:46.150]
    we can showcase all of those messages we’ve sent

    [00:08:49.050]
    and also received from that chatbot.

    [00:08:52.060]
    So very quickly, that’s an overview

    [00:08:54.110]
    as to how we’ve built that integration.

    [00:08:56.038]
    Let’s take a look at our next integration,

    [00:08:58.600]
    which was to build an AR experience.

    [00:09:01.760]
    So to do so, we actually used a library

    [00:09:04.560]
    that’s available for React Native.

    [00:09:06.770]
    This is called ViroReact.

    [00:09:08.602]
    And ViroReact allows you to create VR and AR experiences

    [00:09:12.840]
    leveraging the AR capabilities of the device,

    [00:09:15.790]
    whether that be ARCore or ARKit.

    [00:09:19.757]
    And by using this,

    [00:09:21.000]
    we can actually start to build those visualizations.

    [00:09:23.890]
    And the way we did this is inside the modeler,

    [00:09:26.229]
    we built out some custom widgets.

    [00:09:29.195]
    Inside these widgets,

    [00:09:30.770]
    we can set up certain markers to track.

    [00:09:34.940]
    So here we have a tracking widget.

    [00:09:39.381]
    And inside this particular tracking widget,

    [00:09:41.630]
    we can set a particular image we wanna use to identify

    [00:09:45.183]
    and place the object in a 3D space.

    [00:09:48.510]
    So here we can see, we can select the image.

    [00:09:51.060]
    And in this case, we’re gonna use a Mendix logo.

    [00:09:53.650]
    This has some unique characteristics of it,

    [00:09:56.280]
    so that we can easily identify it in the 3D space.

    [00:10:00.530]
    We can also set some of the properties such as actions

    [00:10:03.410]
    to be triggered when we detect certain items

    [00:10:05.950]
    in the 3D space.

    [00:10:08.948]
    Inside this pluggable widget,

    [00:10:11.040]
    we then have a number of additional widgets

    [00:10:13.300]
    to show the objects, and in this case the object is a car

    [00:10:17.220]
    and a number of spheres

    [00:10:18.053]
    and the spheres are the icons we saw

    [00:10:19.610]
    at the top of that particular car to change the color.

    [00:10:23.038]
    If we drill down into the object,

    [00:10:24.750]
    we can select the material that is being used,

    [00:10:27.039]
    we can choose the interaction

    [00:10:28.785]
    and also the events that are used

    [00:10:30.870]
    when we actually interact with this particular item.

    [00:10:34.430]
    So when the tracker detects that particular marker,

    [00:10:38.050]
    it will take this particular object

    [00:10:40.240]
    and place it in the 3D space.

    [00:10:42.380]
    We can then interact with it, walk around it,

    [00:10:44.446]
    and we can get more information from it as well.
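
    Those pluggable widgets generate the underlying ViroReact components. As a rough illustration of what sits beneath them (not the demo widget source), an image-marker scene in ViroReact looks something like the sketch below; the asset paths and sizes are placeholders.

    ```tsx
    // Illustrative ViroReact scene: detect an image marker (a logo in the demo)
    // and anchor a 3D car model to it. Asset paths and sizes are placeholders.
    import React from "react";
    import {
      ViroARScene,
      ViroARImageMarker,
      ViroARTrackingTargets,
      Viro3DObject,
      ViroAmbientLight,
    } from "@viro-community/react-viro";

    // Register the marker image to track, with its real-world width in metres,
    // so ARKit/ARCore can estimate its position and scale.
    ViroARTrackingTargets.createTargets({
      mendixLogo: {
        source: require("./assets/mendix-logo.png"),
        orientation: "Up",
        physicalWidth: 0.15,
      },
    });

    export const CarConfiguratorScene = () => (
      <ViroARScene>
        <ViroAmbientLight color="#ffffff" />
        <ViroARImageMarker
          target="mendixLogo"
          onAnchorFound={() => console.log("Marker found, placing car")}
        >
          <Viro3DObject
            source={require("./assets/car.obj")}
            type="OBJ"
            scale={[0.05, 0.05, 0.05]}
            onClick={() => console.log("Car tapped: swap paint material here")}
          />
        </ViroARImageMarker>
      </ViroARScene>
    );
    ```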

    [00:10:49.940]
    So it’s a really powerful way of being able to preview

    [00:10:53.300]
    and look at certain goods like a car

    [00:10:55.840]
    or another product such as a light bulb,

    [00:10:57.910]
    being able to interact with it

    [00:10:59.320]
    without actually purchasing it beforehand.

    [00:11:02.930]
    So let’s move on to our next experience.

    [00:11:04.770]
    The last one that we showed was a TV application

    [00:11:08.100]
    running on a Fire TV Stick.

    [00:11:10.010]
    And actually this particular interaction

    [00:11:13.100]
    and this particular device is very easy to integrate into.

    [00:11:18.420]
    And this is because the application is built using Android.

    [00:11:23.760]
    So all applications that are deployed onto a Fire TV Stick

    [00:11:27.400]
    run on the Android platform.

    [00:11:29.408]
    And because the Mendix Make It Native application

    [00:11:32.197]
    deploys onto Android,

    [00:11:34.210]
    we can simply install it onto the Fire TV Stick.

    [00:11:37.520]
    And to do so, we just need to use this guide here.

    [00:11:40.530]
    This guide uses ADB, which is the Android Debug Bridge,

    [00:11:45.380]
    which allows you to connect to a device

    [00:11:47.440]
    on your local network and install certain applications.

    [00:11:51.160]
    So all we did is make our Fire TV Stick

    [00:11:53.410]
    available on our network

    [00:11:55.210]
    and using a few commands,

    [00:11:57.190]
    we could install it onto that particular device.
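
    The exact steps come from that guide, but the shape of an ADB sideload is simple. A small Node/TypeScript helper that mirrors the commands is sketched below; the IP address and APK path are placeholders, and it assumes the adb binary from the Android platform tools is installed and ADB debugging is enabled on the Fire TV.

    ```typescript
    // Sideload an APK onto a Fire TV Stick over the local network with ADB.
    // Assumes ADB debugging is enabled on the stick (Developer options) and the
    // adb binary from the Android platform tools is on the PATH.
    import { execSync } from "node:child_process";

    const FIRE_TV_IP = "192.168.1.50";      // placeholder: the stick's IP address
    const APK_PATH = "./MakeItNative.apk";  // placeholder: the app to install

    function run(cmd: string): void {
      console.log(`$ ${cmd}`);
      console.log(execSync(cmd, { encoding: "utf8" }));
    }

    run(`adb connect ${FIRE_TV_IP}:5555`); // 5555 is ADB's default TCP port
    run(`adb install -r ${APK_PATH}`);     // -r reinstalls, keeping existing data
    run("adb devices");                    // confirm the stick is listed
    ```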

    [00:12:01.500]
    Now, the way we built that particular application

    [00:12:03.970]
    isn’t anything fancy.

    [00:12:05.850]
    All we needed to do is build out a native application

    [00:12:08.500]
    in a separate app here.

    [00:12:10.440]
    And here we have the carousel that we saw earlier.

    [00:12:13.020]
    So the user was able to see pictures

    [00:12:14.829]
    as to what stage of production they were in,

    [00:12:18.020]
    and they can swipe through those using their buttons

    [00:12:20.603]
    on their particular Fire TV Stick.

    [00:12:25.560]
    There was one thing that we did need to change though.

    [00:12:27.810]
    The Fire TV Stick runs on a particular TV,

    [00:12:31.550]
    and this is a landscape view.

    [00:12:34.110]
    We needed to make sure that

    [00:12:35.740]
    instead of opening up the application in portrait,

    [00:12:38.270]
    it opened up in landscape.

    [00:12:40.460]
    So to do so it’s very easy.

    [00:12:42.360]
    Inside the application we have an option

    [00:12:46.880]
    to be able to configure the screen orientation.

    [00:12:49.820]
    This is the code behind the native app

    [00:12:52.750]
    that we built for this particular Fire TV Stick app.

    [00:12:56.157]
    And this is based on the base template

    [00:12:58.080]
    available from the platform.

    [00:12:59.960]
    And all we’ve configured is this option here

    [00:13:02.720]
    to switch it from portrait to landscape.

    [00:13:04.639]
    This is to ensure that when we open up the application,

    [00:13:08.070]
    it doesn’t first open up in portrait

    [00:13:10.010]
    and then flip it to landscape.

    [00:13:11.700]
    It makes sure that we open it up

    [00:13:13.200]
    and it goes in landscape first.
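
    The setting Simon describes is a launch-time option in the native template (on Android this corresponds to the activity opening directly in landscape rather than rotating after launch). As an illustration only, an equivalent runtime lock in React Native code could look like the sketch below, assuming the react-native-orientation-locker package; this is not necessarily the mechanism the Mendix template uses.

    ```tsx
    // Illustrative runtime orientation lock for a React Native shell. To avoid
    // the app opening in portrait and then flipping, the launch-time setting in
    // the native project (the activity opening directly in landscape) is what
    // actually matters; this runtime lock only makes the idea concrete.
    import React, { useEffect } from "react";
    import Orientation from "react-native-orientation-locker";

    export const TvAppRoot: React.FC<{ children: React.ReactNode }> = ({ children }) => {
      useEffect(() => {
        Orientation.lockToLandscape();                    // keep the TV profile in landscape
        return () => Orientation.unlockAllOrientations(); // release on unmount
      }, []);

      return <>{children}</>;
    };
    ```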

    [00:13:15.950]
    So by changing a few configurations like this,

    [00:13:19.290]
    it gives us another device profile,

    [00:13:21.324]
    another experience for the user

    [00:13:23.510]
    that we might not have even considered beforehand.

    [00:13:27.044]
    So in the last few minutes,

    [00:13:28.640]
    we’ve seen a demonstration of how we can use AWS Lex

    [00:13:32.410]
    to communicate with a chatbot.

    [00:13:34.880]
    We’ve then seen an AR experience on our native mobile,

    [00:13:39.460]
    and the final experience is we saw a TV application

    [00:13:42.292]
    using the native platform from Mendix

    [00:13:45.000]
    and deploying onto a Fire TV Stick.

    [00:13:48.470]
    So, I’m gonna hand back over to Dave now

    [00:13:50.650]
    who’s gonna take you on the next stage

    [00:13:52.630]
    of our customer journey.

    [00:13:55.360]
    Dave: Alright, Simon, thanks.

    [00:13:56.500]
    Very impressive.

    [00:13:57.520]
    Let’s move on to the next phase

    [00:13:59.410]
    where the customer finds out

    [00:14:01.210]
    that there’s an issue with their order,

    [00:14:02.850]
    and now they need to speak to an agent to get it resolved.

    [00:14:06.940]
    So in this next experience,

    [00:14:08.190]
    the customer receives an automated text message

    [00:14:10.510]
    and a push notification

    [00:14:11.601]
    that a manufacturing delay has occurred

    [00:14:14.606]
    and it needs their immediate attention.

    [00:14:18.790]
    So what they do, they decide to call customer service

    [00:14:21.470]
    and they’re actually greeted by a virtual agent.

    [00:14:24.090]
    Let’s listen in.

    [00:14:32.830]
    Virtual Agent: Hello there,

    [00:14:33.670]
    welcome to Tesla customer care.

    [00:14:35.630]
    How can I help you today?

    [00:14:37.276]
    Customer: I’d like to know the status of my order.

    [00:14:39.750]
    Virtual Agent: No problem.

    [00:14:40.670]
    Can you tell me your order number

    [00:14:42.380]
    so I can check that for you?

    [00:14:44.340]
    Customer: 1086.

    [00:14:45.640]
    Virtual Agent: Your order 1086

    [00:14:48.070]
    is currently in the chassis manufacturing stage

    [00:14:50.850]
    and will be moving onto body manufacturing soon.

    [00:14:53.510]
    Would you like me to put you through

    [00:14:54.720]
    to a member of our team who can help you further?

    [00:14:56.883]
    Customer: Yes, please.

    [00:14:58.500]
    Virtual Agent: Not a problem at all.

    [00:14:59.484]
    I’m connecting you now to a member of our team

    [00:15:02.437]
    who has been briefed on your details, please hold.

    [00:15:05.599]
    (upbeat music)

    [00:15:06.432]
    (dialing tone beeps)

    [00:15:08.120]
    Agent: Hi, Mr Black.

    [00:15:08.953]
    This is Tesla customer care.

    [00:15:10.400]
    My name is Allister, how can I help you today?

    [00:15:13.040]
    Dave: So the virtual agent

    [00:15:13.873]
    successfully gathered all the information

    [00:15:16.200]
    required to route the caller to the appropriate person

    [00:15:19.070]
    and prepare that employee for the call.

    [00:15:22.430]
    Now, from there, the agent was able to resolve the issue.

    [00:15:25.290]
    So at this point, I’ll let Simon take control again

    [00:15:27.910]
    so he can show you how to build

    [00:15:29.500]
    a virtual agent application with Mendix.

    [00:15:32.430]
    Simon, back to you.

    [00:15:34.460]
    Simon: Thanks, Dave.

    [00:15:35.293]
    What we saw there is a customer interacting with a bot

    [00:15:39.930]
    using voice recognition.

    [00:15:41.970]
    This particular bot was trained

    [00:15:43.531]
    using the Twilio Autopilot service.

    [00:15:46.420]
    Inside Twilio, we can train and build a number of tasks.

    [00:15:50.920]
    These tasks are like intents,

    [00:15:52.660]
    which we saw in our AWS Lex interface.

    [00:15:56.590]
    From here, we can train it on a number of samples

    [00:15:59.696]
    and these samples are like utterances,

    [00:16:02.040]
    the same as we had inside our AWS Lex interface.

    [00:16:05.750]
    Sample words and sentences that we want to trigger.

    [00:16:11.370]
    Inside each of these,

    [00:16:12.250]
    we also have the ability to program what happens

    [00:16:15.370]
    when these particular key words and sentences are triggered.

    [00:16:19.370]
    And then in this case, we’re doing a redirect to a URL.

    [00:16:22.587]
    And this URL is a service hosted on a Mendix application.

    [00:16:27.450]
    So all we’ve done is we’ve published a REST API

    [00:16:30.350]
    from the Mendix application,

    [00:16:31.724]
    which will get called and executed

    [00:16:34.390]
    when one of these particular sentences is issued.

    [00:16:38.060]
    So let’s switch into the model now

    [00:16:39.910]
    and see how that experience is built out.

    [00:16:43.340]
    Inside the model of this application,

    [00:16:45.130]
    we can see here, we have this REST API

    [00:16:47.010]
    that has been published.

    [00:16:48.617]
    Inside that particular API call, we have a microflow.

    [00:16:52.670]
    And this microflow is executed

    [00:16:54.710]
    every time we get that API call.

    [00:16:57.980]
    In this particular microflow, we have a number of steps,

    [00:17:00.250]
    which pick up the current task and its information

    [00:17:03.204]
    and then finally, it’s making a lot of decisions

    [00:17:05.753]
    around where it should direct the customer

    [00:17:08.630]
    based on the certain input that it is getting.

    [00:17:12.020]
    Now we could have created multiple different API endpoints

    [00:17:15.360]
    depending on the different type of interaction,

    [00:17:18.048]
    but we wanted one central microflow

    [00:17:20.140]
    so we could show you

    [00:17:21.360]
    the complexity of logic that’s going on

    [00:17:23.760]
    behind the scenes in the Mendix application.

    [00:17:27.100]
    So in this case,

    [00:17:27.933]
    it’s detecting whether a redirect is needed or not.

    [00:17:31.280]
    And if a redirect is needed, what it will do

    [00:17:33.880]
    is it will then send a custom response back to Twilio

    [00:17:37.136]
    to redirect them to a certain number.

    [00:17:40.780]
    So in our scenario, we were redirected to Allister

    [00:17:44.060]
    in the customer services team,

    [00:17:45.910]
    who was able to then help us and start to fix the issue.

    [00:17:49.425]
    And to do that, we actually submitted back some XML.

    [00:17:53.380]
    This XML structure defines what phone number

    [00:17:57.050]
    we need Twilio to dial to actually talk to the customer.

    [00:18:00.714]
    And we can do all the things inside this XML.

    [00:18:03.010]
    This is a very common structure.

    [00:18:05.235]
    AWS uses a similar structure where you can embed it

    [00:18:08.840]
    with more content rich information,

    [00:18:11.280]
    things like phone numbers, pictures, audios, and so on.

    [00:18:16.200]
    For the other messages, we just simply use plain text

    [00:18:18.910]
    to interact with those.
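
    In the demo that endpoint is a published REST service backed by a Mendix microflow, so nothing is hand-coded. To make the contract concrete, here is a hedged Express sketch of an HTTP endpoint that answers with the kind of XML (TwiML) described above; the route, decision logic, and phone number are placeholders.

    ```typescript
    // Hedged sketch of the redirect target: Twilio calls this URL, the handler
    // decides whether to hand the caller to a human, and answers with TwiML.
    import express from "express";
    import twilio from "twilio";

    const app = express();
    app.use(express.urlencoded({ extended: false })); // Twilio posts form-encoded data

    app.post("/api/voice/route-call", (req, res) => {
      const response = new twilio.twiml.VoiceResponse();

      // In the demo a microflow inspects the task and slot data to make this call.
      const needsAgent = true; // placeholder for the real decision logic

      if (needsAgent) {
        response.say(
          "Connecting you to a member of our team who has been briefed on your details."
        );
        response.dial("+441223000000"); // placeholder number for the agent
      } else {
        response.say("Your order is currently in the chassis manufacturing stage.");
      }

      res.type("text/xml").send(response.toString());
    });

    app.listen(3000, () => console.log("Voice routing endpoint listening on :3000"));
    ```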

    [00:18:23.360]
    So we’ve seen in the last few minutes,

    [00:18:25.300]
    an overview as to how we dealt with those conversations.

    [00:18:29.640]
    We used Autopilot from Twilio

    [00:18:31.860]
    to be able to handle those conversations

    [00:18:34.610]
    and recognize those key utterances,

    [00:18:37.060]
    and then pass those back to Mendix to get the key information

    [00:18:40.800]
    such as the status of the order and other information.

    [00:18:45.550]
    So let’s hand back to Dave now

    [00:18:47.270]
    for our final part of the customer journey.

    [00:18:50.180]
    Dave: Thanks, Simon.

    [00:18:51.013]
    During the last part of the buyer’s journey,

    [00:18:53.790]
    we’ll take a look at a couple of different experiences used

    [00:18:56.870]
    while the car is out for delivery.

    [00:18:58.648]
    Let’s pick up with the customer asking Alexa

    [00:19:01.450]
    for a status update.

    [00:19:03.580]
    Customer: So, Alexa, ask connected car for the status of my order.

    [00:19:10.000]
    Alexa: If you let me know your order number,

    [00:19:12.020]
    I can look up the status for you.

    [00:19:14.644]
    Customer: 1086.

    [00:19:18.320]
    Alexa: Your car has been built

    [00:19:20.057]
    and is out for delivery.

    [00:19:21.960]
    It will be with you by 3:13 p.m.

    [00:19:25.600]
    Dave: Now when the driver arrives at the customer,

    [00:19:27.650]
    they use a native mobile application

    [00:19:29.440]
    to walk through a checklist to release the car.

    [00:19:32.745]
    Native apps are perfect

    [00:19:35.040]
    for when workers need to interact

    [00:19:36.430]
    with customers face to face.

    [00:19:38.180]
    They can capture photographic proof

    [00:19:39.658]
    of a successful delivery,

    [00:19:41.169]
    or unfortunately catalog any damages

    [00:19:43.887]
    so that a problem can be resolved as quickly as possible.

    [00:19:48.250]
    The native app eliminates paper based processes

    [00:19:50.530]
    by digitally capturing all this information,

    [00:19:53.100]
    including the customer’s signature

    [00:19:54.920]
    once they’re satisfied with the delivery.

    [00:19:58.097]
    Okay, for one last time I’ll pass control to Simon.

    [00:20:01.030]
    He’s going to show you how he used Mendix

    [00:20:02.940]
    to build the Alexa app and the native mobile application.

    [00:20:06.330]
    So Simon, back over to you.

    [00:20:09.410]
    Simon: Thanks, Dave.

    [00:20:10.300]
    In this next section, we’ll take a look

    [00:20:11.900]
    at how we built that integration into our Alexa device.

    [00:20:16.460]
    To build an integration into Alexa,

    [00:20:18.390]
    you first need to build a skill.

    [00:20:20.159]
    A skill is like an app on the app store

    [00:20:23.000]
    but it’s personalized for Alexa.

    [00:20:25.130]
    It uses voice rather than touch for interaction.

    [00:20:29.480]
    Here, we have a skill that we’ve created

    [00:20:31.360]
    for our connected car journey.

    [00:20:33.300]
    And if we drill down on it, we can start to configure it

    [00:20:35.776]
    to meet our particular needs.

    [00:20:40.260]
    Inside here, we have, first of all, an invocation word.

    [00:20:43.890]
    This is the key word or skill name

    [00:20:46.400]
    that you want to give to this particular skill.

    [00:20:49.230]
    And this will get triggered

    [00:20:50.245]
    when you ask Alexa to do something.

    [00:20:54.120]
    Next, we have the interaction model.

    [00:20:56.740]
    And again, you can see some very similar principles here

    [00:21:00.720]
    to what we were doing with AWS Lex.

    [00:21:03.310]
    We can use certain intents, train them on certain utterances

    [00:21:07.060]
    and pick up certain slots.

    [00:21:09.940]
    So here we can see the particular dialogue

    [00:21:12.740]
    for the status of our order.

    [00:21:15.100]
    We can give it some utterances,

    [00:21:16.220]
    some slot data that we want to capture,

    [00:21:19.000]
    and that can then get triggered

    [00:21:20.300]
    inside our Mendix application.
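
    For reference, the interaction model edited in the Alexa console is ultimately a JSON document. A trimmed-down fragment with one intent, one numeric slot, and a few sample utterances might look like the sketch below; the names and samples are illustrative rather than the demo model.

    ```typescript
    // Illustrative Alexa interaction model fragment: one invocation name, one
    // intent with a numeric slot, and a few sample utterances.
    const interactionModel = {
      interactionModel: {
        languageModel: {
          invocationName: "connected car",
          intents: [
            {
              name: "OrderStatusIntent",
              slots: [{ name: "orderNumber", type: "AMAZON.NUMBER" }],
              samples: [
                "what is the status of my order",
                "status of order {orderNumber}",
                "where is order {orderNumber}",
              ],
            },
            // Built-in intents such as AMAZON.HelpIntent would normally be added too.
          ],
        },
      },
    };

    export default interactionModel;
    ```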

    [00:21:23.770]
    So really as long as you know

    [00:21:26.050]
    how to build one type of chatbot or a chat interface,

    [00:21:30.330]
    you can very easily switch to other types of platforms.

    [00:21:33.487]
    They have some slight tool differences,

    [00:21:35.510]
    but you can see there’s a lot of similarities

    [00:21:37.398]
    across them.

    [00:21:40.315]
    Inside the Alexa information here,

    [00:21:43.010]
    we can also set an end point.

    [00:21:45.080]
    And the end point is where we’re actually

    [00:21:45.913]
    going to get that data from.

    [00:21:48.540]
    So when we trigger a certain intent,

    [00:21:50.800]
    we then want to be able to process it

    [00:21:52.680]
    using this Mendix application.

    [00:21:55.550]
    So let’s go inside that model

    [00:21:57.220]
    and take a look at how that is implemented.

    [00:22:00.730]
    So we open up the same model as our Twilio example.

    [00:22:05.090]
    We can then start to see the information

    [00:22:07.168]
    from our Alexa device.

    [00:22:09.550]
    So here we actually register certain handlers

    [00:22:12.620]
    for those intents.

    [00:22:14.380]
    So in the after startup flow,

    [00:22:16.190]
    when we start up our application,

    [00:22:18.110]
    we can register certain intents and microflows

    [00:22:22.298]
    to get triggered

    [00:22:23.450]
    when we actually see this particular intent.

    [00:22:26.100]
    So in this case, when we see the intent status,

    [00:22:28.440]
    it will trigger this particular microflow.

    [00:22:30.920]
    Now the one we showed in the example

    [00:22:32.470]
    was for the status of our order.

    [00:22:35.127]
    And inside this particular microflow,

    [00:22:37.480]
    we can see we can get the information about the request

    [00:22:40.156]
    and we can also get information from the slots,

    [00:22:43.280]
    so the actual data that we wanna capture.

    [00:22:45.150]
    And in this case it’s the actual order number

    [00:22:47.070]
    that’s important.

    [00:22:48.530]
    We wanna be able to capture whose order it is,

    [00:22:51.660]
    look it up in the Mendix application

    [00:22:53.980]
    and respond to that particular chatbot and to Alexa

    [00:22:58.920]
    with the information that we need.

    [00:23:01.840]
    So here we can see, we have a check

    [00:23:04.130]
    to see whether the order number was found or not.

    [00:23:05.900]
    And if it is found,

    [00:23:07.604]
    we will respond with a certain message to it.

    [00:23:11.630]
    So here we can see we have conditional options

    [00:23:15.110]
    based on the status of our order.

    [00:23:17.520]
    So if the order is delayed,

    [00:23:19.180]
    then we will send a certain message to them.

    [00:23:21.830]
    If it’s in finalization or manufacturing,

    [00:23:24.730]
    we’ll send a different message.

    [00:23:27.070]
    So you can really customize those experiences

    [00:23:29.730]
    and those messages you provide back to your users.

    [00:23:33.280]
    And again, like we had for Twilio,

    [00:23:35.730]
    you can respond with plain text or with SSML.

    [00:23:40.020]
    And SSML is an XML format structure

    [00:23:42.974]
    which allows you to embed audio, images

    [00:23:45.840]
    and additional information to your Alexa device

    [00:23:49.480]
    because some Alexa devices have screens.

    [00:23:51.290]
    So if you look at the Echo Show,

    [00:23:53.274]
    you can actually show information

    [00:23:55.006]
    and also play audio at the same time as well.
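
    To make the response side concrete, the JSON an Alexa skill endpoint returns carries the speech as either plain text or SSML. The sketch below builds such a response for a few order statuses; it only illustrates the shape, not the Mendix connector implementation, and the status values and wording are assumptions based on the demo dialogue.

    ```typescript
    // Hedged sketch of an Alexa skill response carrying SSML. In the demo the
    // Mendix Alexa connector assembles this; shown here only to make the shape
    // of the response concrete. Status values and wording are assumptions.
    type OrderStatus = "Delayed" | "Manufacturing" | "Finalization" | "OutForDelivery";

    function buildAlexaResponse(orderNumber: string, status: OrderStatus) {
      const messages: Record<OrderStatus, string> = {
        Delayed: `Your order ${orderNumber} is delayed. An agent will contact you shortly.`,
        Manufacturing: `Your order ${orderNumber} is currently in the manufacturing stage.`,
        Finalization: `Your order ${orderNumber} is in finalization.`,
        OutForDelivery: "Your car has been built and is out for delivery.",
      };

      return {
        version: "1.0",
        response: {
          outputSpeech: {
            type: "SSML",
            // SSML allows pauses, audio clips, and so on; a plain text reply
            // would use { type: "PlainText", text: "..." } instead.
            ssml: `<speak>${messages[status]} <break time="300ms"/> Anything else?</speak>`,
          },
          shouldEndSession: false,
        },
      };
    }

    console.log(JSON.stringify(buildAlexaResponse("1086", "OutForDelivery"), null, 2));
    ```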

    [00:24:00.870]
    So it’s a really easy to use connector.

    [00:24:03.140]
    You can simply download this connector

    [00:24:04.715]
    into your application,

    [00:24:06.310]
    and this is actually using the AWS SDK for Java

    [00:24:10.470]
    and uses those services and APIs

    [00:24:12.840]
    to be able to communicate with them.

    [00:24:16.770]
    So let’s take a look at our next experience,

    [00:24:18.900]
    which is how did we build out the native application

    [00:24:22.300]
    for our field service delivery drivers?

    [00:24:24.910]
    And again, this was a different application

    [00:24:26.940]
    and different module.

    [00:24:28.297]
    And what we tended to do for all of these experiences,

    [00:24:31.280]
    we tried to break them out

    [00:24:32.113]
    into the smallest applications possible

    [00:24:34.473]
    and share data across them.

    [00:24:36.990]
    A really key point about an MXDP

    [00:24:39.330]
    is that your users should be able

    [00:24:41.750]
    to seamlessly go through different applications,

    [00:24:44.602]
    but also different experiences.

    [00:24:47.940]
    So inside this model, we’re actually using data

    [00:24:51.510]
    that’s exposed from those systems through Data Hub.

    [00:24:55.440]
    And there’s some sessions

    [00:24:56.410]
    that are gonna be covering what Data Hub is.

    [00:24:59.070]
    But inside this model, you’ll see entities in three different colors.

    [00:25:03.710]
    The first kind is these gray entities here.

    [00:25:06.760]
    And these are known as virtual entities.

    [00:25:09.040]
    These are not stored in the Mendix application.

    [00:25:11.460]
    These are simply retrieved from the source system,

    [00:25:14.200]
    whether it be OData, SQL, or GraphQL.

    [00:25:17.610]
    The idea is that these can be queried dynamically

    [00:25:20.670]
    on any page or any interface

    [00:25:23.030]
    whether it be on native or web.

    [00:25:25.480]
    And this allows you to combine that data together

    [00:25:27.780]
    to build new experiences

    [00:25:29.240]
    and share data across different modalities

    [00:25:32.270]
    and different experiences.
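
    Inside Studio Pro these virtual entities are consumed without any code, but on the wire they are ordinary OData resources. Purely as an illustration, a direct query against such an entity could look like this; the service URL, entity set, and field names are hypothetical.

    ```typescript
    // Illustrative OData query against a virtualized entity exposed through
    // Data Hub. Nothing is stored locally: each call goes to the source system.
    const serviceUrl = "https://example.mendixcloud.com/odata/orders/v1"; // hypothetical

    async function getOrderStatus(orderNumber: number) {
      const query =
        `${serviceUrl}/Orders` +
        `?$filter=OrderNumber eq ${orderNumber}` +
        `&$select=OrderNumber,Status,DeliveryTime`;

      const res = await fetch(query, { headers: { Accept: "application/json" } });
      if (!res.ok) throw new Error(`OData request failed: ${res.status}`);

      const body = await res.json();
      return body.value[0]; // OData wraps result sets in a "value" array
    }

    getOrderStatus(1086).then(console.log).catch(console.error);
    ```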

    [00:25:37.280]
    So inside this application,

    [00:25:38.580]
    it was a very straightforward native mobile application.

    [00:25:41.580]
    We had some experiences

    [00:25:42.850]
    where we could view the next appointments,

    [00:25:45.790]
    we could see the tasks that needed to be completed.

    [00:25:48.860]
    But some of the interesting items

    [00:25:50.010]
    were things like we could do barcode scanning

    [00:25:52.490]
    to be able to check

    [00:25:53.490]
    that the VIN number was correct for the car,

    [00:25:55.908]
    as well as being able to do native signatures.

    [00:25:58.660]
    So using a signature widget,

    [00:26:00.600]
    we can interact with that particular user

    [00:26:03.440]
    and get them to confirm

    [00:26:04.620]
    that they have received that particular vehicle.

    [00:26:09.661]
    So in the last few minutes, I’ve gone through very quickly

    [00:26:13.040]
    some of the experiences that we’ve built

    [00:26:15.050]
    using the Mendix platform

    [00:26:16.820]
    and given you a flavor of what is really possible

    [00:26:19.373]
    when you push Mendix to the edge

    [00:26:22.740]
    and be able to leverage it to its full complexity

    [00:26:26.140]
    and full potential.

    [00:26:27.928]
    I’ll now hand over to Dave to give our final remarks

    [00:26:31.119]
    and wrap up this particular session.

    [00:26:33.540]
    Dave: Thanks Simon, great job as usual.

    [00:26:35.470]
    So during the last 20 minutes,

    [00:26:37.310]
    we’ve demonstrated

    [00:26:38.580]
    what a multiexperience customer journey can look like

    [00:26:41.160]
    if you use Mendix.

    [00:26:42.346]
    We utilized PWAs, chatbots, virtual agents,

    [00:26:45.930]
    native mobile apps, augmented reality,

    [00:26:48.240]
    TV apps, virtual assistants,

    [00:26:50.590]
    even an Alexa conversational application.

    [00:26:53.560]
    So this, my friend,

    [00:26:54.393]
    is what the future of development looks like.

    [00:26:57.300]
    So moving forward, you’re gonna go way beyond

    [00:26:59.690]
    just your typical web and mobile applications.

    [00:27:02.900]
    In fact, Gartner predicts that by 2024,

    [00:27:05.810]
    one out of three enterprises will use an MXDP

    [00:27:08.950]
    to accelerate the speed of IT and business fusion teams

    [00:27:11.960]
    to deliver successful digital products.

    [00:27:15.169]
    And Mendix is here to help.

    [00:27:16.652]
    Gartner selected us as a leader

    [00:27:18.920]
    in the multiexperience development category

    [00:27:21.530]
    and we really excel at delivering

    [00:27:24.200]
    these types of applications at speed and scale.

    [00:27:28.100]
    For example, Mendix is the only leader in the Magic Quadrant

    [00:27:31.270]
    that supports all four mobile architectures.

    [00:27:33.750]
    And we’re the only one that supports native,

    [00:27:35.840]
    which allows you to deliver the best application

    [00:27:37.977]
    for every single situation.

    [00:27:40.500]
    Also multiexperience is much more than just web and mobile.

    [00:27:43.690]
    And we can help you deliver additional experiences

    [00:27:46.200]
    like immersive, conversational and recreational.

    [00:27:50.274]
    So besides thanking Simon

    [00:27:52.180]
    for all of his great work during the demos,

    [00:27:53.880]
    I want to leave you with this final thought today.

    [00:27:56.920]
    To build great experiences,

    [00:27:59.070]
    the end user must be top of mind.

    [00:28:01.770]
    And the goal is to deliver applications

    [00:28:04.270]
    that are so effortless or invisible, as Don would say,

    [00:28:07.348]
    that they don’t even realize

    [00:28:08.791]
    the technology that they’re using.

    [00:28:11.590]
    So make Don proud and start building

    [00:28:13.590]
    some multiexperience applications today.

    [00:28:16.180]
    So with that, I’d like to say thanks for attending

    [00:28:18.740]
    and have a great day

    [00:28:19.625]
    (upbeat music)