11-integrating-with-function-calling/README.md (+7 −7)
@@ -10,7 +10,7 @@ The above mentioned problems are what this chapter is looking to address.

 This lesson will cover:

-- Explain what is function calling and its use cases.
+- Explain what function calling is and its use cases.
 - Creating a function call using Azure OpenAI.
 - How to integrate a function call into an application.

@@ -22,7 +22,7 @@ By the end of this lesson, you will be able to:
 - Setup Function Call using the Azure OpenAI Service.
 - Design effective function calls for your application's use case.

-## Scenario: improving our chatbot with functions
+## Scenario: Improving our chatbot with functions

 For this lesson, we want to build a feature for our education startup that allows users to use a chatbot to find technical courses. We will recommend courses that fit their skill level, current role and technology of interest.

@@ -38,7 +38,7 @@ To get started, let's look at why we would want to use function calling in the f

 Before function calling, responses from an LLM were unstructured and inconsistent. Developers were required to write complex validation code to make sure they are able to handle each variation of a response. Users could not get answers like "What is the current weather in Stockholm?". This is because models were limited to the time the data was trained on.

-Function Calling is a feature of the Azure OpenAI Service to overcome to the following limitations:
+Function Calling is a feature of the Azure OpenAI Service to overcome the following limitations:

 - **Consistent response format**. If we can better control the response format we can more easily integrate the response downstream to other systems.
 - **External data**. Ability to use data from other sources of an application in a chat context.
@@ -189,7 +189,7 @@ There are many different use cases where function calls can improve your app lik
 The process of creating a function call includes 3 main steps:

 1. **Calling** the Chat Completions API with a list of your functions and a user message.
-2. **Reading** the model's response to perform an action ie execute a function or API Call.
+2. **Reading** the model's response to perform an action i.e. execute a function or API Call.
 3. **Making** another call to Chat Completions API with the response from your function to use that information to create a response to the user.
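To make those three steps concrete, here is a rough, self-contained sketch using the `openai` v1.x Python package against an Azure OpenAI deployment. The endpoint, API version, deployment name, function schema, and the local `search_courses` helper are placeholders for illustration, not the lesson's actual code:

```python
import json
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",  # example API version
)
DEPLOYMENT = "gpt-35-turbo"  # placeholder deployment name

# Placeholder function schema; the shape of these definitions is described in a later hunk.
functions = [
    {
        "name": "search_courses",
        "description": "Finds courses for a given learner role",
        "parameters": {
            "type": "object",
            "properties": {
                "role": {"type": "string", "description": "The learner's role, e.g. student or developer"}
            },
            "required": ["role"],
        },
    }
]

def search_courses(role):
    """Placeholder for a real catalog lookup; returns a canned JSON string."""
    return json.dumps([{"title": f"Getting started with Azure for {role}s", "level": "beginner"}])

messages = [{"role": "user", "content": "Find me a beginner Azure course. I'm a student."}]

# 1. Calling the Chat Completions API with the list of functions and the user message.
response = client.chat.completions.create(
    model=DEPLOYMENT, messages=messages, functions=functions, function_call="auto"
)
response_message = response.choices[0].message

# 2. Reading the model's response and executing the requested function locally.
if response_message.function_call:
    args = json.loads(response_message.function_call.arguments)
    result = search_courses(**args)

    # 3. Making a second call with the function's output so the model can
    #    compose the final answer for the user.
    messages.append(response_message)  # echo the assistant's function-call turn
    messages.append(
        {"role": "function", "name": response_message.function_call.name, "content": result}
    )
    final = client.chat.completions.create(model=DEPLOYMENT, messages=messages)
    print(final.choices[0].message.content)
```

Newer API versions expose the same flow through `tools`/`tool_calls`; the legacy `functions` form is sketched here because it matches the wording of the steps above.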
@@ -247,7 +247,7 @@ Let's describe each function instance more in detail below:

 - `name` - The name of the function that we want to have called.
 - `description` - This is the description of how the function works. Here it's important to be specific and clear.
-- `parameters` - A list of values and format that you want the model to produce in its response. The parameters array consists of items where item have the following properties:
+- `parameters` - A list of values and format that you want the model to produce in its response. The parameters array consists of items where the items have the following properties:
   1. `type` - The data type of the properties will be stored in.
   1. `properties` - List of the specific values that the model will use for its response
   1. `name` - The key is the name of the property that the model will use in its formatted response, for example, `product`.
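As a sketch of that shape, a definition with a `name`, a `description`, and a `parameters` object whose `properties` each carry a `type` and a description might look like the following. The function and property names here are illustrative, not necessarily those used in the lesson's code:

```python
functions = [
    {
        # `name` - the function we want the model to request.
        "name": "search_courses",
        # `description` - be specific and clear about what the function does.
        "description": "Retrieves courses from a catalog based on the parameters provided",
        # `parameters` - the values and format the model should produce in its response.
        "parameters": {
            "type": "object",
            "properties": {
                "role": {
                    "type": "string",
                    "description": "The role of the learner, e.g. developer or student",
                },
                "product": {
                    "type": "string",
                    "description": "The product or technology the course covers, e.g. Azure",
                },
                "level": {
                    "type": "string",
                    "description": "The learner's experience level, e.g. beginner or advanced",
                },
            },
            "required": ["role"],
        },
    }
]
```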
@@ -305,7 +305,7 @@ After we have tested the formatted response from the LLM, now we can integrate t

 To integrate this into our application, let's take the following steps:

-1. First, let's make the call to the Open AI services and store the message in a variable called `response_message`.
+1. First, let's make the call to the OpenAI services and store the message in a variable called `response_message`.

    ```python
    response_message = response.choices[0].message
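Once `response_message` is stored, a common follow-up pattern (an assumption here, not necessarily the lesson's exact code) is to map the returned function name onto a local implementation, parse the arguments, and only then execute:

```python
import json

# Functions the model is allowed to trigger; search_courses is assumed to be
# defined elsewhere in the application.
available_functions = {"search_courses": search_courses}

function_call = response_message.function_call
if function_call and function_call.name in available_functions:
    function_to_call = available_functions[function_call.name]
    function_args = json.loads(function_call.arguments)  # arguments arrive as a JSON string
    function_response = function_to_call(**function_args)
else:
    function_response = None  # the model answered directly; no function was requested
```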
@@ -455,4 +455,4 @@ Hint: Follow the [Learn API reference documentation](https://learn.microsoft.com

 After completing this lesson, check out our [Generative AI Learning collection](https://aka.ms/genai-collection?WT.mc_id=academic-105485-koreyst) to continue leveling up your Generative AI knowledge!

-Head over to Lesson 12 where we will look at how to [design UX for AI applications](../12-designing-ux-for-ai-applications/README.md?WT.mc_id=academic-105485-koreyst)!
+Head over to Lesson 12, where we will look at how to [design UX for AI applications](../12-designing-ux-for-ai-applications/README.md?WT.mc_id=academic-105485-koreyst)!