The Eventstream item in Fabric is an easy-to-use service for getting data from streaming sources such as IoT devices, telemetry, webhooks, etc.
The default approach of having the Eventstream processor listen for data is highly scalable and versatile in many ways. Below I dive into the details of generating an Event Hub in Real-Time Intelligence and begin using it for ingestion.
First of all, we need an Eventstream to manage the data streams from the source.
This is done by selecting the [+ New item] menu at the top of Fabric and finding the Eventstream item in the list.
The Eventstream item can be found in three areas in Fabric.
Any of these will take you to the next screen, where you type in a name for the Eventstream. For the remainder of this post, I've used "Eventstream_demo" as the name.
In the next window, you have the option of getting data from an external source, playing around with the sample data sets, or using a custom endpoint.
To create an Event Hub in Fabric, you must select "Use custom endpoint" as shown below:
Give the endpoint a name and click "Add" in the bottom left corner.
You should end with something like this:
Until now, nothing has been deployed - but if you click the "Publish" button in the top right corner, the Eventstream will be published along with an Event Hub that you can use to send data to.
Here I have selected the CustomEndpoint node in my topology, and below you can see the different options for sending data to the Eventstream.
I have three options: Event Hub, AMQP, and Kafka.
For each option, I can get the information needed to send data to that specific endpoint.
Below is my information for the Event Hub service:
You will also see that I have both the Primary Key and Secondary Key with the corresponding connection strings.
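For orientation, an Event Hubs-style connection string generally follows this shape (the values here are placeholders, not real secrets - the exact string to use is the one shown in your endpoint details):

```
Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=<key-name>;SharedAccessKey=<key>;EntityPath=<event-hub-name>
```

The EntityPath part typically matches the Event Hub name, which the notebook below also asks for.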
I've made a short YouTube video to guide you through this solution:
If you want to try sending some arbitrary data to the Event Hub, I've made the Notebook attached to this post.
You can import the notebook into your Fabric workspace and modify it as described below.
In the notebook, you will find a section with two parameters: Connection string and Event Hub Name.
You can find your specific name and connection string in the image above.
TIP: To get your connection string, you must first click the "eye" icon to view it.
Above you can see the information pasted in - I've replaced parts of the text with XX, as these connection strings and keys are secrets.
The notebook uses the Faker library to generate arbitrary data. The data configured in the notebook looks like web activity from specific network machines and log events based on that activity. It is not real data - it is only there to play around with for demo purposes.
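To illustrate what such a notebook roughly does - this is a sketch, not the exact attached notebook, and it assumes the Python Faker and azure-eventhub packages are available (e.g. via %pip install faker azure-eventhub):

```python
import json
import random
import time

from faker import Faker
from azure.eventhub import EventHubProducerClient, EventData

# Placeholders - paste your own values from the endpoint details (masked as XX above)
CONNECTION_STR = "Endpoint=sb://XX.servicebus.windows.net/;SharedAccessKeyName=XX;SharedAccessKey=XX;EntityPath=XX"
EVENTHUB_NAME = "XX"

fake = Faker()

def make_event():
    # Fabricated web-activity-style record - demo data only, nothing real
    return {
        "timestamp": fake.iso8601(),
        "machine": fake.hostname(),
        "ip": fake.ipv4(),
        "url": fake.uri(),
        "status": random.choice([200, 200, 200, 301, 404, 500]),
        "user_agent": fake.user_agent(),
    }

producer = EventHubProducerClient.from_connection_string(
    conn_str=CONNECTION_STR, eventhub_name=EVENTHUB_NAME
)

with producer:
    for _ in range(60):                  # send for about a minute
        batch = producer.create_batch()  # events are sent in batches
        for _ in range(10):
            batch.add(EventData(json.dumps(make_event())))
        producer.send_batch(batch)
        time.sleep(1)                    # pace the stream
```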
Once you are done with this small change, you can execute the entire Notebook.
After a few seconds, the notebook will begin sending data to the Eventstream, and you can take a peek at the data by clicking the middle area in your Eventstream.
This will give you an example data set as shown below:
You will also see that I've added a destination to my Eventstream to an Eventhouse, and from here I can begin to analyse and visualize the data as needed.
The two other options for sending data to the Eventstream are AMQP and Kafka. AMQP (Advanced Message Queuing Protocol) is an open standard, described here: Advanced Message Queuing Protocol - Wikipedia
The Kafka endpoint follows the Apache Kafka standard, described here: Apache Kafka - Wikipedia
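As an illustration of the Kafka route: Event Hubs-compatible Kafka endpoints are typically reached over SASL_SSL on port 9093, with "$ConnectionString" as the username and the connection string itself as the password. Here is a sketch assuming the confluent-kafka Python package - the actual bootstrap server, topic, and credentials come from the Kafka details of your custom endpoint:

```python
import json

from confluent_kafka import Producer

# Placeholder values - take the real ones from the custom endpoint's Kafka details
conf = {
    "bootstrap.servers": "XX.servicebus.windows.net:9093",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "$ConnectionString",
    "sasl.password": "Endpoint=sb://XX.servicebus.windows.net/;SharedAccessKeyName=XX;SharedAccessKey=XX;EntityPath=XX",
}

producer = Producer(conf)

# The topic name corresponds to the Event Hub name
producer.produce("XX", value=json.dumps({"hello": "eventstream"}))
producer.flush()
```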
No matter which of the above messaging options you use, the same Eventstream I've built here can be used to ingest data into Fabric.