In this session I will show how to train and use the Face API to recognize people. I will also show an example of how to determine emotions using the APIs. After this session attendees will have a basic understanding of how to set up the APIs, and will know how to use and implement the Face and Emotion APIs on the Cognitive Services stack.
Recently, Microsoft added Dutch language support to the Text Analytics API in the Cognitive Services stack. In this blog post I will show how to determine the sentiment of a Dutch newsfeed.
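As a minimal sketch of what such a call looks like, the snippet below builds the request body for the Text Analytics v2.0 sentiment endpoint with documents tagged as Dutch (`"language": "nl"`) and parses the scores from a response. The endpoint region and subscription key are placeholders you would replace with your own; the exact URL depends on where your Cognitive Services resource is deployed.

```python
import json

# Assumed endpoint; the region (here westeurope) and API version may
# differ for your own Cognitive Services subscription.
ENDPOINT = "https://westeurope.api.cognitive.microsoft.com/text/analytics/v2.0/sentiment"
SUBSCRIPTION_KEY = "<your-subscription-key>"


def build_sentiment_request(texts):
    """Build the JSON body for the sentiment endpoint.

    Each document is tagged with language 'nl' so the service
    applies the Dutch sentiment model."""
    documents = [
        {"id": str(i), "language": "nl", "text": text}
        for i, text in enumerate(texts, start=1)
    ]
    return {"documents": documents}


def request_headers():
    """Headers required by the Cognitive Services REST API."""
    return {
        "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
        "Content-Type": "application/json",
    }


def parse_scores(response_body):
    """Map document ids to sentiment scores (0.0 = negative, 1.0 = positive)."""
    return {doc["id"]: doc["score"] for doc in response_body["documents"]}


if __name__ == "__main__":
    body = build_sentiment_request(["Wat een geweldig artikel!"])
    print(json.dumps(body, indent=2))
    # A (hypothetical) response from the service would look like this:
    sample_response = {"documents": [{"id": "1", "score": 0.96}], "errors": []}
    print(parse_scores(sample_response))
```

You would POST the body with the headers above (for example with `requests.post(ENDPOINT, headers=request_headers(), json=body)`) and feed each newsfeed item in as one document.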
The importance of language support
Most development articles are written in English. Most documentation is written in English. I’m Dutch and even I am blogging in English!
With the rise of chatbots and other dialogue-based UX apps, native language support becomes ever more important. Think of an insurance company wanting to let users talk to its chatbot, or a banking app that lets you choose a mortgage for your house. In these situations most companies and clients want to communicate with the apps in their native language.
A lot of the Cognitive Services examples and samples are UWP applications. But sometimes you have to support older operating systems, which forces you to fall back on techniques like WPF. Webcam support is not included in the WPF framework by default.
In this blog post I will explain how to implement the Emotion API in a WPF application that captures your webcam.
The following screenshot shows a sample application you can create. It displays a realtime spline chart of emotions, and a bubble chart of the averages for the recorded session.
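To give an idea of the shape of the data behind such charts: the Emotion API accepts an image (for example a captured webcam frame posted as `application/octet-stream`) and returns, per detected face, a set of emotion scores. The sketch below is in Python rather than the WPF/C# of the actual app, and the endpoint URL and key are placeholder assumptions; it shows how you could reduce one face result to its dominant emotion, which is the value you would plot on the spline chart each tick.

```python
# Assumed endpoint; region and path may differ for your subscription.
ENDPOINT = "https://westus.api.cognitive.microsoft.com/emotion/v1.0/recognize"
SUBSCRIPTION_KEY = "<your-subscription-key>"


def request_headers():
    """Headers for posting a raw image frame to the Emotion API."""
    return {
        "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
        "Content-Type": "application/octet-stream",
    }


def dominant_emotion(face_result):
    """Return the emotion with the highest score for one detected face."""
    scores = face_result["scores"]
    return max(scores, key=scores.get)


if __name__ == "__main__":
    # A (hypothetical) per-face result as the Emotion API would return it:
    sample_face = {
        "faceRectangle": {"left": 68, "top": 97, "width": 64, "height": 64},
        "scores": {
            "anger": 0.01, "contempt": 0.0, "disgust": 0.0, "fear": 0.0,
            "happiness": 0.93, "neutral": 0.05, "sadness": 0.01, "surprise": 0.0,
        },
    }
    print(dominant_emotion(sample_face))
```

In the WPF application you would post each captured frame's bytes with these headers, then feed the scores of the first face into the chart series; averaging the scores over all frames gives the data for the session's bubble chart.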
Following the presentation at Microsoft Heroes Academy #5, themed "Artificial Intelligence in a nutshell", here are the slides in PDF format of the session 'Technical introduction to Cognitive Services' by Peter Rombouts.
The last page of the slides contains links to the various packages and tooling used.