>> Hi, all. Good morning
or good evening,
depending on where you're located.
Thank you for joining our Power BI webinar.
My name is Deepak Shankar, and I'm a Community Manager
in the Microsoft Business Applications Group.
Today we have Mihail, an MVP,
who will walk through the
new Power BI features and
how to build modern solutions
with Cosmos and Power BI.
So hi, Mihail, welcome to the webinar session.
It's all yours now.
>> Thank you for the introduction.
Hello, everyone.
I will say a few more words about myself.
I am a Microsoft MVP in two areas,
Microsoft Azure and Data Platform,
and also a member of the Microsoft Regional Directors program.
I have experience mostly with large solutions
related to Azure and IoT,
working as a Solution Architect and Enterprise Architect
for big customers, usually multi-billion-dollar enterprises.
For the last 10 years I've focused on Microsoft Azure,
and for the last five years I have worked on solutions
tightly related to the Internet of Things.
This presentation is mostly about how you can integrate
Cosmos DB and Power BI in the context
of modern Internet of Things solutions.
So we will go through the agenda.
We will consider a few things about Internet of Things solutions:
best practices, some important points related to reference architecture,
and some scenario-centric recommendations.
After that, we'll move to scenarios where we can use
Cosmos DB and Power BI in Internet of Things solutions,
and then go deeper into some patterns for integrating
Cosmos DB and Power BI.
In the end, we will have a few short demos
that illustrate these integration patterns,
showing how we can combine exactly these awesome products
in modern Internet of Things solutions.
So that's the plan, actually.
To begin with, you probably know that the Internet of Things
is a real hype in the modern software industry.
Not only software, but that is the part we will focus on.
The number of solutions and connected devices keeps growing;
we already have billions of devices,
and that number will multiply several times over in the next years.
The Internet of Things covers almost all areas of modern industry:
automotive, energy, agriculture, logistics,
smart cities, healthcare, almost everything.
Our devices are becoming more and more connected.
So this is a huge part of our industry,
where we face many challenges.
Internet of Things scenarios involve almost all parts
of modern solutions.
We will talk a little bit about this.
We can bring the whole power of services like
Cosmos DB and Power BI into the context
of these Internet of Things solutions.
I will explain a few things related to common recommendations,
because in IoT solutions there are good practices
that apply to all solutions if we want
a good design and a reliable solution.
There is also important knowledge specific to the platform,
in our context, Microsoft Azure.
We'll talk about all these details in this presentation.
The common practices are mostly architectural,
and they are quite relevant for other solutions as well.
A big part of these recommendations is to keep solutions
simple if possible, which means trying to reduce complexity.
We also have a specific layered architecture here,
which is a little different from the architecture
we consider for other solutions;
we will talk about this a bit later.
We need security by design everywhere.
We need to be very precise about the data model.
We need to automate all operations related to
the development of these solutions.
Scalability is very important,
given the fast-growing number of devices and the workload.
Many solutions come with specific requirements
for response time and other metrics, which we will consider.
Of course, the Cloud is one of the main reasons
for this kind of growth in Internet of Things solutions.
The high-level design is actually split into two parts:
the on-site part, meaning different sensors or managed objects,
and the devices connected to the backend over the Internet.
More or less complex analysis happens in the backend.
The IoT back office is the major focus
of what we will talk about during this webinar.
But devices are also important, because we need to
collect all this information from sensors in a specific way
and ingest this data into our storage in the backend.
Considering layered architecture in the context
of the Internet of Things, we have three layers,
which, as I explained, we treat a little differently
than in other solutions.
We have the backend, which is mostly Cloud-related,
and we will talk about the Cloud.
We have the edge, which means all the processing,
filtering, and aggregation done on site.
And we have the devices themselves:
sensors, gateways, and other equipment needed
for the whole Internet of Things solution.
A very important part is security.
This webinar is not focused on security,
but it is important to know that
every component needs to be secured separately.
If one part of an Internet of Things solution is compromised,
there is a high probability that this leads to
compromising the whole solution.
So we need to treat every component as a part
that must be secured independently,
not rely only on the security of the solution as a whole.
Another important topic is the data model.
Modern microservices approaches keep
each service's data separate,
but in the Internet of Things we have several huge storages,
huge in terms of the amount of data,
which in many designs are also shared
between different services.
It depends on the specific case, but on one side
we have storage that needs to collect all the data,
and on the other we have different storages
for aggregated data and for configuration data.
From the beginning, it is good to know where the data comes from.
We have data generated by devices.
We have data generated by the users who manage the system.
We have data from outside systems;
for example, we can import data from a hardware supplier
about the devices.
We also have additional data related to authentication,
user identity, and different kinds of configuration.
Regarding the data model, we can split our storages
into several groups.
We have raw data storage, which usually collects
unprocessed data; the data may be processed later,
but it is kept in a raw state, without specific aggregations,
which are better kept in separate storage.
We have metadata storage, which is quite often needed
to optimize our traffic: the raw data is enriched with metadata
to make the collected data more readable during analysis.
We have aggregated data storage, which can hold the results
of real-time or near-real-time processing,
or of some additional analysis.
And we have configuration data storage,
which is quite important for keeping
the whole configuration of our solution.
For raw data storage in the context of Azure,
we have different options.
Usually we are looking for cheap storage
that can handle a huge amount of data,
and it is very important that this storage is scalable.
One natural option is Azure Blob Storage,
because it is cheap and easy to use.
But in this context we have some limitations:
Azure Storage does not support many indexes,
so search will be limited,
or we need additional services responsible for indexing,
which means additional storage for the index data
and additional computing for maintaining those indexes.
Depending on the requirements, it is also possible to use
different types of databases for raw data.
Quite often, if we have a good understanding
of the amount of data
and can optimize the structure of our raw data,
we can use a NoSQL database like Cosmos DB, which is fast,
supports different kinds of search and analysis,
and provides very good scalability.
But this is a general approach, and it should be
considered case by case based on the requirements.
It is also possible to have more complex solutions
using time-series databases with complex logic,
but such solutions usually have higher costs,
and we need a business case to justify them.
About metadata: metadata usually has a structured format,
so quite often we persist it in a relational database.
Azure SQL Database is one of the best options for this.
But we can also consider document-based solutions
like Cosmos DB, which can work in this context as well,
depending on how our data is structured
and how we manage it for specific purposes.
Some NoSQL databases are also a good option here.
Configuration storage is considered in two different cases.
On one side, the different parts of the solution
have their own configuration;
on the other side, there is one very important component
of the whole Internet of Things solution,
named Device Configuration Management.
This part usually manages all information
about provisioned devices: their state,
where these devices are mounted,
the organizations that own them,
and additional case-specific data.
Configuration storage is usually also a SQL database,
but for specific cases we can consider document-based
NoSQL solutions like Cosmos DB as an alternative.
Aggregated data storage usually holds data
we have already processed and keep in an aggregated state.
This data is used for scheduled reports,
reports on demand, or other analysis.
Here the data usually has structure, so again
a relational database, or some NoSQL databases,
Cosmos DB is one good example,
can be considered as storage for this kind of data.
To summarize, for most of these storages,
depending on the specific case,
Cosmos DB can be considered as an option.
But again, it is difficult to speak at a very general level;
with a specific case and a more precise analysis,
it becomes clearer whether Cosmos DB is the best option.
Still, it is applicable for all these areas.
I will not cover all parts of the Internet of Things,
but it is good to mention several things that are
important in the overall design of these solutions.
First, in the Internet of Things we have
networking and connectivity concerns,
which are quite important,
because we have distributed systems and we collect our data
from remote devices in different locations.
That means there is always communication between
devices that send data and services that collect it.
We need to consider how to implement that connection,
and beyond the common understanding,
it is worth spending a few minutes
to go deeper into the details.
Usually, in Internet of Things solutions,
the weakest part is the one that is
the most difficult to protect.
In the context of protection,
we usually use encrypted connections.
But with this kind of security, we need computing power
to encrypt and decrypt our information.
Backend services, especially in the Cloud,
usually have enough computing power for this.
When we talk about devices,
sometimes we already have quite powerful devices,
but sometimes it is more difficult to have
the same options as on the backend.
So devices usually need to be protected more carefully.
Years ago, Internet of Things solutions were designed
so that devices would not accept inbound connections;
they would always initiate the connection to the backend.
This approach left the backend with
one potential security hole: the gateway service,
the service that accepts connections.
After several stages of evolution,
we now have much better security based on
the service-assisted communication approach,
about which I will say a few words.
We have several types of communication
between the backend and devices.
The most straightforward is Telemetry:
we receive information from devices
and do not send anything back.
In some cases, the backend needs to send a response
based on specific information from devices,
so we have Inquiries, and the response finally
has to be received by the devices.
The same pattern in the opposite direction,
where the backend sends information
and receives a response from the devices,
we consider as Commands.
The last one is Notifications, where the backend
notifies devices about specific things
without expecting any response.
Historically, devices were protected with a specific firewall.
With that kind of design, as I already explained,
we have more issues on the backend side.
This is solved in modern solutions by the so-called
Service Assisted Communication approach.
In Service Assisted Communication,
the gateway holds two different queues
which accept information from the devices
and from the backend,
and there is never direct communication
between devices and the backend.
Devices send information to one queue
in the gateway service;
in the context of Azure, this gateway service
is usually Azure IoT Hub.
The backend sends information to another queue
in the same gateway service.
Each side writes information to one queue
and receives information from the other queue,
which carries information from the opposite side
of the whole solution.
In this case, communication is very secure,
and only the gateway service accepts connections.
So even if a potential attack succeeds
against the gateway service,
the other parts of our solution stay quite secure,
because the gateway service never initiates connections
to other services or devices.
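As a rough illustration of this pattern, here is a minimal in-process sketch, not Azure IoT Hub itself; the class and method names are hypothetical. The gateway owns both queues, each side only ever writes to one queue and reads from the other, and neither side accepts inbound connections from the other:

```python
import queue

class GatewaySketch:
    """Toy model of Service Assisted Communication: the gateway owns
    both queues, so devices and the backend never connect directly."""

    def __init__(self):
        self.device_to_cloud = queue.Queue()  # telemetry, inquiry responses
        self.cloud_to_device = queue.Queue()  # commands, notifications

    # Called by a device: it only initiates outbound calls to the gateway.
    def device_send(self, message):
        self.device_to_cloud.put(message)

    def device_receive(self):
        return self.cloud_to_device.get_nowait()

    # Called by the backend: it also talks only to the gateway.
    def backend_send(self, message):
        self.cloud_to_device.put(message)

    def backend_receive(self):
        return self.device_to_cloud.get_nowait()


gateway = GatewaySketch()
gateway.device_send({"type": "telemetry", "windSpeed": 12.3})
gateway.backend_send({"type": "command", "action": "reboot"})

telemetry = gateway.backend_receive()  # telemetry reaches the backend
command = gateway.device_receive()     # command reaches the device
```

The point of the design is visible in the shape of the API: there is no method by which the backend calls a device or a device calls the backend, only the two queues in the middle.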
One very important thing in modern Cloud design,
and of course also in IoT solutions,
is that the Cloud is a moving target.
Good practices today can become obsolete after a few years;
everything we consider now will be relevant for a few years,
but Cloud design is improved year after year.
New services become available,
and existing services are extended
to work better, with more features.
So everything we discuss now will probably be relevant
for the next few years, and after that,
anything is possible.
Usually, when we work on the design
of a specific Cloud solution,
we consider the latest features,
because by the time the solution is ready,
with a life cycle of one or two years, for example,
these features will actually be at their [inaudible],
and there will be a period of a few years
during which the solution remains current
before an update based on changes
in the Cloud platform needs to be considered.
One good thing about Microsoft Azure and IoT solutions
is that there are very good examples
of reference architecture.
That means we have very good blueprints for starting
to design these solutions without making big mistakes.
This helps architects and developers start easily
and have a relatively good design from the beginning.
We have different components,
and what matters most in this webinar
is where we place, within the whole blueprint,
our database, Cosmos DB, and Power BI
as the BI and reporting tool used for
different kinds of analysis and visualization.
If you look at the schema here,
you can see that we have the Solution UX,
which is also related to Power BI,
but Power BI also covers part of the analysis and logic.
On the other hand, we have several different storages.
I already explained that Cosmos DB can be considered
for almost every type of storage in general.
What is interesting is that we usually need
some additional components to connect these services
in the whole solution design.
All data usually comes through one common kernel,
and in our case that is Azure IoT Hub.
Azure IoT Hub is a service that provides
messaging functionality,
but it also provides additional security options
to manage devices, options for multi-tenancy support,
and options to be configured in different ways,
automatically or on demand.
We need some glue between the devices that send data
and our storage and analysis services.
In this context, we can have different messaging services,
as I already explained.
IoT Hub fills this role only partially,
but it is the most important component
for communication, as the gateway service.
Additionally, we can have different messaging services
as part of our flows, which we will consider
a little later: services like Service Bus
when we need more complex message logic,
when we want topics, or when we need to add
additional rules for our messages
related to multi-tenancy and other things.
We have Event Hubs for mass distribution of information.
But Event Hubs is usually a service to which
other services subscribe and pull data.
In some scenarios we need an eventing service
that pushes data,
and Event Grid is a very good option for this.
Usually, when we design modern solutions,
we look for the simplest way to implement them
and for optimal use of resources.
In this context, for many scenarios,
we use serverless components.
I am thinking here of Azure Functions, for example,
because serverless solutions usually
consume fewer resources,
and if the functional and non-functional
requirements allow it,
it is recommended in most solutions.
One good thing about Cosmos DB is that you have
several different options for event-driven design,
that is, for calling functions from Cosmos DB.
It is possible to work at a lower level
using the change feed.
It is also possible to use the triggers
that Cosmos DB supports
and to call functions from those triggers.
Cosmos DB is quite well optimized for event-driven design
and fits very well in solutions where we have
Azure Functions, letting us build lightweight
solutions for the Internet of Things.
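To illustrate the change-feed idea at a conceptual level, here is a pure-Python sketch, not the actual Cosmos DB SDK; the document shape and function name are made up for illustration. A consumer keeps a continuation point and receives only the documents written after it, which is what makes it possible to react to changes instead of re-scanning the container:

```python
# Conceptual sketch of change-feed consumption: each document carries a
# monotonically increasing logical sequence number (Cosmos DB keeps an
# internal one per partition); the consumer remembers where it stopped.

def read_change_feed(documents, continuation):
    """Return documents newer than `continuation` plus a new continuation."""
    changes = [d for d in documents if d["_lsn"] > continuation]
    new_continuation = max((d["_lsn"] for d in changes), default=continuation)
    return changes, new_continuation


store = [
    {"id": "dev-1", "windSpeed": 10.5, "_lsn": 1},
    {"id": "dev-2", "windSpeed": 7.2, "_lsn": 2},
]

changes, token = read_change_feed(store, continuation=0)  # both documents
store.append({"id": "dev-1", "windSpeed": 11.0, "_lsn": 3})
changes, token = read_change_feed(store, token)           # only the new write
```

In a real solution you would not poll like this yourself: an Azure Function with a Cosmos DB trigger receives these batches for you, which is what makes the lightweight event-driven design described above practical.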
Another interesting approach is when we need analyses
that are straightforward, but with specific logic
that must be triggered in specific cases.
In these situations, we usually use Stream Analytics,
the most popular service in Azure for stream processing.
We can trigger Azure Functions from this service directly,
or implement more complex logic via an additional service.
For stream processing, you can of course use
different solutions: open-source solutions,
Databricks, Apache Storm.
But Azure Stream Analytics is probably the easiest way
to start working on such solutions
in the context of stream processing.
There are several scenarios we can discuss
in a bit more detail regarding the use of
Power BI and Cosmos DB in IoT.
If we look at this schema, we see the common components
of a typical solution.
First, we have devices, as mentioned,
different IoT devices with or without edge processing.
We have communication via a Cloud gateway, or IoT Hub.
We have near-real-time stream processing.
We can use Cosmos DB for the warm path,
that is, for processed data.
But we can also consider it for Cloud storage,
that is, for raw data,
depending on the amount of data and on when and how
we will process this raw data in the future.
For UI, reporting, and tools, Power BI is one of
the most powerful components we can use.
It can be integrated with stream processing,
but we can also have additional business logic
between the storages and Power BI
to present information based on analysis on demand.
To be more complete, I will try to add a few details
about all the options we have for IoT architecture,
again with alternative services.
For stream ingestion, you can use different services
with this functionality in place of IoT Hub.
IoT Hub is simply the most optimized,
the easiest to use, and the most developed,
but you can use Apache Kafka or Event Hubs.
Most of these services do not have the embedded security
or the automated device management functionality
that IoT Hub provides.
So this is probably the most important part
of what makes IoT Hub so attractive.
About stream processing, again, we have many
different options in the open-source space.
We use Azure Stream Analytics because it is easy to use,
it is a pay-per-use service,
and it gives you low costs and a short time to market
if you design solutions that include it.
Azure Stream Analytics is also quite often used
as glue between other services.
That is not always the best pattern,
but because it is easy and relatively cheap,
you see many solutions where Azure Stream Analytics
is used as input and output for different services.
Still, the major reason to use Stream Analytics
is stream processing.
I kindly recommend avoiding this practice
and using serverless solutions like Azure Functions
when all you need is to connect different services.
When we consider storages, as I mentioned,
Cosmos DB is applicable almost anywhere.
We will not focus on the other storage options.
All of them are interesting, but this webinar
is focused on Power BI and Cosmos DB.
Regarding presentation and analysis,
you can have a very fast, relatively cheap,
depending on the kind of analysis and the number of users,
and fairly well-optimized solution using Power BI.
So it is a very natural choice for analysis
and presentation in IoT solutions.
If we need to connect different components,
quite often you can also use Azure Storage,
because Azure Storage supports subscriptions
for Azure Functions,
so you are able to call serverless solutions in this case.
Even though it is a service used mostly for
the cold path, for raw data,
we can also use it in some scenarios
in the context of the whole integration.
It is useful because of the events it raises
when we actually change our data.
Now we will continue with some patterns
for how to integrate our solutions
with Cosmos DB and Power BI.
One typical solution, which was not very easy
to implement in the past,
is when our solution has raw storage,
logic triggered via serverless components,
processed data in Cosmos DB,
and output from Cosmos DB to Power BI.
Now it is possible, because Power BI
has Cosmos DB connectors.
The Cosmos DB connectors are still in preview,
but they now give us a relatively easy way
to keep processed data in Cosmos DB on one side
and output directly from Cosmos DB to Power BI.
If we need data based on raw-data analysis
from Azure Storage, that is another case,
and we can have different patterns:
we can do it with serverless components
like Azure Functions,
or, if our data has a clear structure,
for example JSON documents in Azure Storage,
it is possible to access it from Power BI
for that kind of analysis.
We can have the same design with different kinds
of messaging services in front.
I already explained that IoT Hub
is the preferred option,
but we can use different messaging services
instead of IoT Hub.
The data storage and UI analysis parts
will usually stay the same,
at least until the moment we get Power BI support
directly from IoT Hub.
For now, that is not possible.
Another option is when we need to integrate Cosmos DB,
or Power BI, or both, directly with IoT Hub.
As I explained, this integration is not possible
without additional components;
for now, we need some component in the middle.
Azure Functions is probably the most natural option
for this, because it is cheap and you can control it,
but it requires some development.
If development is not an option,
it is possible to do it with a service
like Stream Analytics.
Again, my note is that Stream Analytics is best used
when we actually have data processing,
not just to connect services,
so that it is used more efficiently, of course.
We can have much more complex scenarios
where our data is persisted in different services,
and it takes several steps before the data
reaches Power BI.
We usually have such solutions when we need
processing and enrichment of our data in several steps;
it depends on the specific use cases
we have in our solutions.
We can also use different processing services
in the middle, when there is a real need for processing.
Usually these services can output to an HTTP endpoint,
which, as we will see, is possible for both
Cosmos DB and Power BI.
We can also have integration between
Cosmos DB and Power BI in the context of AI components,
because quite often we need analysis that calls
services published via Azure Cognitive Services,
or Azure Machine Learning models.
As you see, there are many different scenarios.
But usually the devil is in the details.
That is the reason to go a little deeper
into some patterns at the application level:
how we will actually use Power BI and Cosmos DB
in these solutions,
and which specific things we need to keep in mind.
Regarding Power BI, one of the most interesting topics
is real-time datasets.
Static dashboards work the way you already know
from Power BI and from other types of analysis
and dashboard creation.
But streaming dashboards have specific details,
and it is good to have an overview of them.
For live data dashboards in Power BI,
we have several implementation options
using several different types of datasets.
These datasets have small differences in their behavior,
but in general we consider all of them streaming datasets.
I will go into some detail.
We have the option to push data to Power BI
from services like Azure Stream Analytics.
In this case we have a Push dataset.
A Push dataset gives us the option to have live data
while keeping history.
We can use another specific dataset type,
the so-called Streaming dataset,
which keeps data in a cache for a short time;
after that, the data is no longer accessible,
it is deleted.
We can use API calls to feed these streaming datasets.
In specific cases, we can configure these datasets
to keep historical data as well, like a Push dataset.
But in general, the idea is to hold data
only for a short period of time.
This data is needed to represent live changes
of specific metrics, and after that there is no need
to store it in a Power BI dataset.
A specific additional feature is
the so-called PubNub dataset.
PubNub is a software-as-a-service offering
that can be used to publish data from different devices.
It is considered mostly a service for IoT solutions,
but it can also be used for other purposes,
chat, for example.
Power BI can subscribe to such datasets
and display the data coming from such a solution.
What else is important? Streaming datasets
used via the API need some configuration;
we will go through these steps during our demos.
Push datasets, if you are using
Azure Stream Analytics for example,
can be set up without any knowledge
of the API configuration.
A Push dataset, as I mentioned, keeps historical data,
and for PubNub you need some knowledge
of how to work with that service.
If we consider the common streaming dataset architecture,
this schema shows how all these flows actually work.
As you see, all these services except PubNub
work with the Power BI REST API;
for PubNub, there is an additional, specific SDK
that has to be used to manage it.
There are specific limitations for the different
streaming datasets, which you can see here,
related to how much data you can have
and whether it is possible to store data or not.
This is part of the Microsoft documentation,
so you can go into detail at docs.microsoft.com,
and you will also have access to this slide deck.
About API integration, the good thing is
that you can build custom solutions
configured to work with the Power BI API
and send live data,
with this data treated as a streaming dataset.
There are several specific things to see
in how we call our Power BI endpoint
from an Azure Function.
It is one of the demos we will see;
we will have a very short demonstration of this.
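As a sketch of what such a call can look like, assuming an API streaming dataset, the push URL below is a placeholder for the one you copy from the dataset's settings in Power BI, and the column names are invented, the body of an Azure Function could build and POST the rows payload roughly like this:

```python
import json
import urllib.request

# The push URL (including its key) is generated by Power BI when you
# create an API streaming dataset; this value is a placeholder.
PUSH_URL = "https://api.powerbi.com/beta/<workspace>/datasets/<dataset-id>/rows?key=<key>"

def build_rows_payload(rows):
    """Power BI expects a JSON object with a 'rows' array whose fields
    match the columns defined on the streaming dataset."""
    return json.dumps({"rows": rows}).encode("utf-8")

def push_rows(rows, url=PUSH_URL):
    """POST one batch of rows to the streaming dataset endpoint."""
    request = urllib.request.Request(
        url,
        data=build_rows_payload(rows),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status


# Build (but do not send) a sample payload with made-up columns.
payload = build_rows_payload(
    [{"deviceId": "dev-1", "windSpeed": 12.3, "ts": "2019-01-01T00:00:00Z"}]
)
```

Calling `push_rows` with a real push URL is all the "configuration" the function side needs; the dataset's column definitions on the Power BI side must match the row fields.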
I already have the whole setup deployed,
and we will also consider a few things
related to Cosmos DB.
As I already explained, Cosmos DB has no direct
connector for IoT Hub for now;
we need a separate service or custom logic.
Its connectors for Power BI are still in preview,
but you can use them.
You cannot use the PowerBI.com site directly;
you need to create projects using Power BI Desktop,
and then you can have diagrams based on Cosmos DB.
But you cannot have live diagrams
directly from Cosmos DB.
You can use live data, or data that changes in Cosmos DB,
but then you need a solution where some logic
subscribes to the change feed, for example,
and triggers an Azure Function,
which calls the Power BI API
to fill a streaming dataset.
These are the scenarios we can have.
Now I will try to demonstrate a few demos.
For this purpose, I will change the program
I am sharing.
First, I will demonstrate an example where a device
sends information to IoT Hub using a device simulator.
It behaves the same as a real device would,
and in this very simple simulator we send
information about wind speed for specific devices,
along with the device ID.
In this case we have just one device,
but we could have many,
and IoT Hub and Azure Stream Analytics
are able to add extra information,
such as additional metadata.
Azure Stream Analytics usually adds to each message
information about when the message was received
and when it was processed.
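A minimal sketch of the kind of message such a simulator produces, assuming only the fields mentioned here, a device ID and a wind-speed reading; in the actual demo the message is sent to Azure IoT Hub via the device SDK, which is omitted:

```python
import json
import random

DEVICE_ID = "wind-device-01"  # hypothetical device identity

def build_telemetry():
    """Build one telemetry message like the simulator in the demo:
    a device ID plus a random wind-speed reading."""
    return {
        "deviceId": DEVICE_ID,
        "windSpeed": round(random.uniform(0.0, 25.0), 2),
    }

# Serialize the message the way it would go over the wire to IoT Hub.
message = json.dumps(build_telemetry())
```

IoT Hub and Stream Analytics then enrich each such message with system metadata (for example, arrival and processing timestamps), which is why the simulator itself can stay this small.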
I will start the simulator and show you the output screen.
The output screen will start printing information
about the messages we are sending.
The next component we need to see is
our Azure Stream Analytics service,
which is in Microsoft Azure.
I will move to our presentation screen
and show one Azure Stream Analytics job,
which is already created.
The idea here is just to show how we can
send information to Power BI,
so I did not add specific processing logic to the query.
But again, the good pattern, if you are using
Azure Stream Analytics, is to use it when
you have a reason to do processing.
So this example is mostly to demonstrate
a possible approach.
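For reference, a pass-through query of this kind typically looks like the following sketch; the input and output aliases are whatever you named them when configuring the job, here assumed to be `IoTHubInput` and `PowerBIOutput`:

```sql
-- Minimal pass-through Stream Analytics query: every field from the
-- IoT Hub input goes straight to the Power BI output, with no
-- aggregation or filtering.
SELECT
    *
INTO
    PowerBIOutput
FROM
    IoTHubInput
```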
We start the job.
It will take several seconds for the job to start,
and after that I will show you how all this
information arrives in Power BI.
In advance, we can see the output here.
Outputs can be configured for different services,
and it is not possible to manage this
while the job is already running.
What we cannot see here are the connection settings,
which we need to add when authenticating
this job to Power BI.
After a few seconds, we will be able to see
what we have in Power BI.
I will hide this screen and show you the Power BI screen,
which, after a few seconds,
will start to visualize information
from our simulated device.
At the moment we see the old information,
and it will start changing once our job
has actually started and information
is being received in Power BI.
For this kind of information, we need a dashboard
in a specific workspace.
Yes, I think we have started to receive this live data.
We can now add a tile with custom streaming data.
If this data comes from Azure Stream Analytics,
the output will already be available.
We can use different types of representations:
several chart types, or just a simple card.
For example, for a simple card, we can configure
the details, and we can also create
a phone view if needed.
Now, for presentation purposes,
I will not configure this.
So we can build this kind of solution
using IoT Hub and Azure Stream Analytics,
and it works quite easily.
Now I will demonstrate how we can take
the same approach with Azure Functions.
I will demonstrate the project with Azure Functions
that we have.
This is a simple project with one function,
which is already deployed.
I use information that is already available
in another service, the Bitcoin price.
So this pattern mostly shows how we can connect
Power BI with serverless,
that is, with Azure Functions.
I will demonstrate, in this case,
how it is implemented in our deployed Azure Function.
We start the function,
and it will take probably a few seconds to start.
An infinite loop then sends information
about the Bitcoin price to Power BI.
I am showing again a live dashboard,
which was already created.
We can see the tile that visualizes the price
of Bitcoin via this Azure Function.
Meanwhile, here is another tile
which represents the same approach via PubNub.
So this is the most important thing
I wanted to demonstrate:
how we can integrate live dashboards
in several different ways.
We could go into many edge cases,
but this webinar is mostly meant to present
the major functionality and how Power BI and Cosmos DB
can serve as useful components
in modern Azure IoT solutions.
So thank you very much for your attention,
and I will be glad to answer
any questions you may have.
>> Thank you so much, Mihail.
There are no questions for now.
Mihail?
>> Yes, I'm here. Yes.
>> All right. So we don't have any questions,
and it's been a wonderful webinar session.
Do you have any other input you'd like to give?
If not, we can close the webinar.
>> Thank you very much for
the great opportunity.
>> Thank you so much.
>> Have a nice day.
