
Commit c157409

Update README.md
Fixed some potential misspellings/errors.
1 parent d1762a3 commit c157409

File tree

  • 11-integrating-with-function-calling

1 file changed: +7 -7 lines changed

11-integrating-with-function-calling/README.md (+7 -7)
@@ -10,7 +10,7 @@ The above mentioned problems are what this chapter is looking to address.
 This lesson will cover:
 
-- Explain what is function calling and its use cases.
+- Explain what function calling is and its use cases.
 - Creating a function call using Azure OpenAI.
 - How to integrate a function call into an application.
@@ -22,7 +22,7 @@ By the end of this lesson, you will be able to:
 - Setup Function Call using the Azure OpenAI Service.
 - Design effective function calls for your application's use case.
 
-## Scenario: improving our chatbot with functions
+## Scenario: Improving our chatbot with functions
 
 For this lesson, we want to build a feature for our education startup that allows users to use a chatbot to find technical courses. We will recommend courses that fit their skill level, current role and technology of interest.

@@ -38,7 +38,7 @@ To get started, let's look at why we would want to use function calling in the f
 Before function calling, responses from an LLM were unstructured and inconsistent. Developers were required to write complex validation code to make sure they are able to handle each variation of a response. Users could not get answers like "What is the current weather in Stockholm?". This is because models were limited to the time the data was trained on.
 
-Function Calling is a feature of the Azure OpenAI Service to overcome to the following limitations:
+Function Calling is a feature of the Azure OpenAI Service to overcome the following limitations:
 
 - **Consistent response format**. If we can better control the response format we can more easily integrate the response downstream to other systems.
 - **External data**. Ability to use data from other sources of an application in a chat context.
@@ -189,7 +189,7 @@ There are many different use cases where function calls can improve your app lik
 The process of creating a function call includes 3 main steps:
 
 1. **Calling** the Chat Completions API with a list of your functions and a user message.
-2. **Reading** the model's response to perform an action ie execute a function or API Call.
+2. **Reading** the model's response to perform an action i.e. execute a function or API Call.
 3. **Making** another call to Chat Completions API with the response from your function to use that information to create a response to the user.
 
 ![LLM Flow](./images/LLM-Flow.png?WT.mc_id=academic-105485-koreyst)
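
For context, a minimal sketch of the three-step flow this hunk describes could look like the following, assuming the `openai` Python SDK (v1+) pointed at an Azure OpenAI deployment; the `search_courses` function, its `role` parameter, and the deployment name are invented for illustration:

```python
# Minimal sketch of the three-step function-calling flow (illustrative only;
# search_courses, its role parameter, and the deployment name are assumptions).
import json
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)
deployment = "gpt-35-turbo"  # name of your Azure OpenAI chat deployment

functions = [
    {
        "name": "search_courses",
        "description": "Find technical courses that match the learner's role",
        "parameters": {
            "type": "object",
            "properties": {
                "role": {"type": "string", "description": "The learner's role"}
            },
            "required": ["role"],
        },
    }
]


def search_courses(role: str) -> list:
    """Stand-in for a real course lookup (for example, a catalog API call)."""
    return [{"title": f"Introductory course for {role}s", "level": "beginner"}]


messages = [{"role": "user", "content": "I'm a student. What should I learn first?"}]

# 1. Call the Chat Completions API with the function list and the user message.
response = client.chat.completions.create(
    model=deployment, messages=messages, functions=functions, function_call="auto"
)
response_message = response.choices[0].message

# 2. Read the response; if the model requested a function call, execute it.
if response_message.function_call:
    args = json.loads(response_message.function_call.arguments)
    result = search_courses(**args)

    # 3. Call the API again with the function result so the model can compose
    #    a final answer for the user.
    messages.append(
        {
            "role": "assistant",
            "content": None,
            "function_call": {
                "name": response_message.function_call.name,
                "arguments": response_message.function_call.arguments,
            },
        }
    )
    messages.append(
        {"role": "function", "name": "search_courses", "content": json.dumps(result)}
    )
    second_response = client.chat.completions.create(model=deployment, messages=messages)
    print(second_response.choices[0].message.content)
```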
@@ -247,7 +247,7 @@ Let's describe each function instance more in detail below:
 - `name` - The name of the function that we want to have called.
 - `description` - This is the description of how the function works. Here it's important to be specific and clear.
-- `parameters` - A list of values and format that you want the model to produce in its response. The parameters array consists of items where item have the following properties:
+- `parameters` - A list of values and format that you want the model to produce in its response. The parameters array consists of items where the items have the following properties:
 1. `type` - The data type of the properties will be stored in.
 1. `properties` - List of the specific values that the model will use for its response
 1. `name` - The key is the name of the property that the model will use in its formatted response, for example, `product`.
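
Put together, a function instance with these fields might look like the sketch below; the `search_courses` name and the `role`, `product`, and `level` properties are assumed purely to illustrate the `name`/`description`/`parameters` shape described above:

```python
# Sketch of one function definition following the fields described above;
# the name (search_courses) and properties (role, product, level) are
# illustrative assumptions.
functions = [
    {
        "name": "search_courses",
        "description": "Retrieves courses from a catalog based on the learner's profile",
        "parameters": {
            "type": "object",
            "properties": {
                "role": {
                    "type": "string",
                    "description": "The role of the learner, for example student or developer",
                },
                "product": {
                    "type": "string",
                    "description": "The technology the learner is interested in, for example Azure",
                },
                "level": {
                    "type": "string",
                    "description": "The learner's experience level, for example beginner",
                },
            },
            "required": ["role"],
        },
    }
]
```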
@@ -305,7 +305,7 @@ After we have tested the formatted response from the LLM, now we can integrate t
 To integrate this into our application, let's take the following steps:
 
-1. First, let's make the call to the Open AI services and store the message in a variable called `response_message`.
+1. First, let's make the call to the OpenAI services and store the message in a variable called `response_message`.
 
    ```python
    response_message = response.choices[0].message
@@ -455,4 +455,4 @@ Hint: Follow the [Learn API reference documentation](https://learn.microsoft.com
 After completing this lesson, check out our [Generative AI Learning collection](https://aka.ms/genai-collection?WT.mc_id=academic-105485-koreyst) to continue leveling up your Generative AI knowledge!
 
-Head over to Lesson 12 where we will look at how to [design UX for AI applications](../12-designing-ux-for-ai-applications/README.md?WT.mc_id=academic-105485-koreyst)!
+Head over to Lesson 12, where we will look at how to [design UX for AI applications](../12-designing-ux-for-ai-applications/README.md?WT.mc_id=academic-105485-koreyst)!
