Original: She no went to the market.
Standard American English:
She didn't go to the market.
The following are conversations with a chat AI.
The chat AI is programmed to give fixed responses to common questions.
If the chat AI is ever presented with a question not in its list
of known questions, it responds with "A human will be in touch shortly".
| Question | Response |
| What are your hours? | Our hours are Monday-Friday, 9am-5pm. |
| Are you currently hiring? | We're hiring for cashiers |
| What do you sell? | We sell clothing and accessories at affordable prices. You can see all our products at https://www.example.com |
###
Human: What are your hours?
AI: Our hours are Monday-Friday, 9am-5pm.
###
Human: What do you have in store?
AI: We sell clothing and accessories at affordable prices. You can see all our products at https://www.example.com
###
Human: what's the deal here
AI: A human will be in touch shortly
###
Human: Are you open monday?
AI: Our hours are Monday-Friday, 9am-5pm.
###
Human: Where are you located?
AI: A human will be in touch shortly
###
Human: Are you currently hiring?
AI: We're hiring for cashiers
In each case, we gave it the Human: ... text, and it responded with the AI: ... text.
Human: do you have shoes?
AI: We sell clothing and accessories at affordable prices. You can see all our products at https://www.example.com
Human: what can I buy?
AI: We sell clothing and accessories at affordable prices. You can see all our products at https://www.example.com
Human: can i buy dog food here
AI: A human will be in touch shortly
Human: I'm running really late, are you open on sundays?
AI: Our hours are Monday-Friday, 9am-5pm.
Human: One time I bought a product from a rival store.
AI: We're hiring for cashiers
Human: when's closing time?
AI: Our hours are Monday-Friday, 9am-5pm.
Human: what happens if I show up at 4:59pm???
AI: A human will be in touch shortly
The following are conversations with a chat AI. ...
| Question | Response |
...questions and responses...
...Human/AI examples...
#######
The following are conversations with a chat AI. ...
| Question | Response |
...questions and responses...
...Human/AI examples...
########
...and so on...
$ mkdir backend
$ python3 -m venv venv
$ source venv/bin/activate
(venv) $ pip install -U flask openai Flask-SQLAlchemy propelauth-flask
We can call openai's completion API like so:

from flask import Flask, request
import os
import openai

app = Flask(__name__)
openai.api_key = os.getenv("OPENAI_API_KEY")

@app.route("/test_prompt", methods=["POST"])
def test_prompt():
    prompt = request.get_json().get("prompt")
    return complete_prompt(prompt)

def complete_prompt(prompt):
    response = openai.Completion.create(
        engine="davinci-instruct-beta-v3",
        prompt=prompt,
        temperature=0,
        max_tokens=60,
        top_p=1.0,
        frequency_penalty=0.0,
        presence_penalty=0.0,
        stop=["\n"]
    )
    # Get and return just the text response
    return response["choices"][0]["text"]
We can try this out by running

(venv) $ flask run

and then hitting it with curl:

$ curl -X POST \
  -d '{"prompt": "Original: She no went to the market.\nStandard American English:"}' \
  -H "Content-Type: application/json" \
  localhost:5000/test_prompt
She didn't go to the market.
Each customer will have an org_id (short for organization identifier), and then all we need to do is store questions/answers for each org_id.

In models.py:

from flask_sqlalchemy import SQLAlchemy

db = SQLAlchemy()

class QuestionsAndAnswers(db.Model):
    # We'll see later on that these IDs are issued from PropelAuth
    org_id = db.Column(db.String(), primary_key=True, nullable=False)

    # Simplest way to store it is just as JSON,
    # sqlite doesn't natively have a JSON column so we'll use string
    questions_and_answers = db.Column(db.String(), nullable=False)
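Since sqlite doesn't have a native JSON column, the routes below serialize the request body with json.dumps before writing it to the String column and parse it back with json.loads when reading. A minimal sketch of that round trip (the sample data is made up):

import json

# Hypothetical payload for one organization, matching the API's request body
payload = {"questions_and_answers": [
    {"question": "What are your hours?", "answer": "Our hours are Monday-Friday, 9am-5pm."}
]}

stored = json.dumps(payload)          # what goes into the String column
assert json.loads(stored) == payload  # what we get back when reading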
Then, in app.py:

import json

from models import db, QuestionsAndAnswers

app = Flask(__name__)
app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:///local.db'
app.config['SQLALCHEMY_TRACK_MODIFICATIONS'] = False
db.init_app(app)
(venv) $ python
# ...
>>> from app import app, db
>>> with app.app_context():
...     db.create_all()
...
>>>
We'll take the org_id as a path parameter and won't worry about verifying it just yet. We'll revisit it at the end when we add authentication. For now, anyone can send a request on behalf of any organization.

@app.route("/org/<org_id>/questions_and_answers", methods=["POST"])
def update_questions_and_answers(org_id):
    # TODO: validate this input (a sketch of one approach follows below)
    questions_and_answers = request.get_json()["questions_and_answers"]
    # Store the whole {"questions_and_answers": [...]} payload as JSON text,
    # since that's the shape the read and test_prompt routes expect
    serialized = json.dumps({"questions_and_answers": questions_and_answers})

    existing_record = QuestionsAndAnswers.query.get(org_id)
    if existing_record:
        existing_record.questions_and_answers = serialized
    else:
        db.session.add(QuestionsAndAnswers(org_id=org_id, questions_and_answers=serialized))

    db.session.commit()
    return "Ok"
@app.route("/org/<org_id>/questions_and_answers", methods=["GET"])
def fetch_questions_and_answers(org_id):
questions_and_answers = QuestionsAndAnswers.query.get_or_404(org_id)
return json.loads(questions_and_answers.questions_and_answers)
$ curl -X POST \
  -d '{"questions_and_answers": [{"question": "What is your phone number for support?", "answer": "Our support number is 555-5555, open any time between 9am-5pm"}]}' \
  -H "Content-Type: application/json" \
  localhost:5000/org/5/questions_and_answers
Ok
$ curl localhost:5000/org/5/questions_and_answers
{"questions_and_answers":[{"answer":"Our support number is 555-5555, open any time between 9am-5pm","question":"What is your phone number for support?"}]}
$ curl -v localhost:5000/org/3/questions_and_answers
# ...
< HTTP/1.0 404 NOT FOUND
# ...
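The POST route above left a TODO for validating its input. Here's a minimal sketch of one approach, assuming each entry must be an object with non-empty question and answer strings (the helper name and error messages are just illustrative):

from flask import abort

def validate_questions_and_answers(body):
    # Expect {"questions_and_answers": [{"question": ..., "answer": ...}, ...]}
    if not isinstance(body, dict) or "questions_and_answers" not in body:
        abort(400, description="Missing questions_and_answers")
    entries = body["questions_and_answers"]
    if not isinstance(entries, list) or not entries:
        abort(400, description="questions_and_answers must be a non-empty list")
    for entry in entries:
        if not isinstance(entry, dict) or not entry.get("question") or not entry.get("answer"):
            abort(400, description="Each entry needs a non-empty question and answer")

Calling this at the top of update_questions_and_answers with request.get_json() is enough to reject malformed payloads before they're stored.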
Next, we'll update our test_prompt function to:

@app.route("/org/<org_id>/test_prompt", methods=["POST"])
def test_prompt(org_id):
    # TODO: validate this prompt (see the sketch after the curl examples below)
    user_specified_prompt = request.get_json().get("prompt")

    db_row = QuestionsAndAnswers.query.get_or_404(org_id)
    q_and_a = json.loads(db_row.questions_and_answers)["questions_and_answers"]

    full_prompt = generate_full_prompt(q_and_a, user_specified_prompt)
    return complete_prompt(full_prompt)
And here's the generate_full_prompt function:

def generate_full_prompt(q_and_as, user_specified_prompt):
    # Generate the table using our data
    q_and_a_table = "\n".join(map(
        lambda q_and_a: f"| {q_and_a['question']} | {q_and_a['answer']} |",
        q_and_as
    ))

    # Generate positive examples using the exact question + answers
    q_and_a_examples = "###\n\n".join(map(
        lambda q_and_a: f"Human: {q_and_a['question']}\nAI: {q_and_a['answer']}",
        q_and_as
    ))

    # Generate some negative examples too
    random_negative_examples = """###
Human: help me
AI: A human will be in touch shortly
###
Human: I am so angry right now
AI: A human will be in touch shortly
"""

    # Note: if you aren't using a fine-tuned model, your whole prompt needs to be here
    return f"""The following are conversations with a chat AI. The chat AI is programmed to give fixed responses to common questions. If the chat AI is ever presented with a question not in its list of known questions, it responds with "A human will be in touch shortly".
| Question | Response |
{q_and_a_table}
###
{q_and_a_examples}
{random_negative_examples}
###
Human: {user_specified_prompt}
AI: """
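To sanity-check the prompt before sending it to the completion API, you can call generate_full_prompt directly; the question/answer pair below is just sample data:

# Quick local check of the generated prompt (hypothetical sample data)
sample_q_and_as = [
    {"question": "What is your phone number for support?",
     "answer": "Our support number is 555-5555, open any time between 9am-5pm"}
]
print(generate_full_prompt(sample_q_and_as, "can I get support via phone"))

Hitting the new endpoint with curl: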
$ curl -X POST \
  -d '{"prompt": "what movies are playing?"}' \
  -H "Content-Type: application/json" \
  localhost:5000/org/5/test_prompt
A human will be in touch shortly
$ curl -X POST \
  -d '{"prompt": "can I get support via phone"}' \
  -H "Content-Type: application/json" \
  localhost:5000/org/5/test_prompt
Our support number is 555-5555, open any time between 9am-5pm
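The test_prompt route also left a TODO for validating the user's prompt. Since whatever the user types is interpolated directly into the prompt, a minimal precaution is to reject empty input, cap its length, and collapse newlines so it can't inject extra Human:/AI: turns. A sketch, with arbitrary limits:

from flask import abort

MAX_PROMPT_LENGTH = 500  # arbitrary cap for this sketch

def validate_prompt(user_specified_prompt):
    if not user_specified_prompt or not user_specified_prompt.strip():
        abort(400, description="Prompt must not be empty")
    if len(user_specified_prompt) > MAX_PROMPT_LENGTH:
        abort(400, description="Prompt is too long")
    # Collapse whitespace so the prompt stays on a single Human: line
    return " ".join(user_specified_prompt.split())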
Right now, anyone can still make requests on behalf of any org_id. Next we'll add PropelAuth to address that.

from propelauth_flask import init_auth

auth = init_auth("https://REPLACE_ME.propelauthtest.com", "YOUR_API_KEY")
We can protect each route with the @auth.require_org_member decorator:

@app.route("/org/<org_id>/questions_and_answers", methods=["GET"])
@auth.require_org_member()
def fetch_questions_and_answers(org_id):
    questions_and_answers = QuestionsAndAnswers.query.get_or_404(org_id)
    return json.loads(questions_and_answers.questions_and_answers)
require_org_member() will do a few things: it verifies that the request comes from a logged-in user and that the user is a member of the organization specified by org_id. Requests are rejected for invalid org_ids, from users that aren't in that org, or if they are from non-users.

$ curl -v localhost:5000/org/5/questions_and_answers
# ...
< HTTP/1.0 401 UNAUTHORIZED
# ...
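Requests that do include a valid access token (issued by PropelAuth for a user in that org) go through as before. As a rough sketch using the requests library, an authenticated call would look something like this; the token value here is a placeholder:

import requests

# Placeholder token; in practice the frontend gets a real one from @propelauth/react
access_token = "ACCESS_TOKEN_FROM_PROPELAUTH"

response = requests.get(
    "http://localhost:5000/org/5/questions_and_answers",
    headers={"Authorization": f"Bearer {access_token}"},
)
print(response.status_code, response.json())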
For the frontend, we'll use SWR to query our backend and @propelauth/react for authentication.

$ npx create-next-app@latest frontend
$ cd frontend
$ yarn add swr @propelauth/react
We'll start in pages/_app.js by wrapping our application with an AuthProvider.

import {AuthProvider} from "@propelauth/react";

function MyApp({Component, pageProps}) {
    return <AuthProvider authUrl="REPLACE_ME">
        <Component {...pageProps} />
    </AuthProvider>
}

export default MyApp
The AuthProvider allows any child component to fetch user information using withAuthInfo. We also have information on which organizations the user is a member of. We can make a drop-down menu which allows users to pick an organization:

import {withAuthInfo} from "@propelauth/react";

// Allow users to select an organization
function OrgSelector(props) {
    // isLoggedIn and orgHelper are injected automatically from withAuthInfo below
    if (!props.isLoggedIn) return <span/>

    const orgs = props.orgHelper.getOrgs();
    // getSelectedOrg() will infer an intelligent default
    // in case they haven't selected one yet
    const selectedOrg = props.orgHelper.getSelectedOrg();
    const handleChange = (event) => props.orgHelper.selectOrg(event.target.value);

    return <select value={selectedOrg.orgId} onChange={handleChange}>
        {orgs.map(org => <option key={org.orgId} value={org.orgId}>{org.orgName}</option>)}
    </select>
}

export default withAuthInfo(OrgSelector);
When the user picks an organization, we call selectOrg, which we can later get with getSelectedOrg. We'll need this when fetching from our backend. We also get an accessToken from withAuthInfo. Here's a function that uses fetch to pass along an access token:

async function testPrompt(orgId, accessToken, prompt) {
    const response = await fetch(`http://localhost:5000/org/${orgId}/test_prompt`, {
        method: "POST",
        headers: {
            "Content-Type": "application/json",
            // accessTokens are passed in the header
            "Authorization": `Bearer ${accessToken}`
        },
        body: JSON.stringify({prompt: prompt})
    })
    // The backend returns the completion as plain text, so read it with text()
    return await response.text()
}
Finally, we can put it together in a component that calls test_prompt.

import {useState} from "react";
import {withAuthInfo, RedirectToLogin} from "@propelauth/react";

function TestPrompt(props) {
    const [prompt, setPrompt] = useState("");
    const [response, setResponse] = useState("");

    if (!props.isLoggedIn) return <RedirectToLogin/>

    const submit = async (e) => {
        e.preventDefault();
        const orgId = props.orgHelper.getSelectedOrg().orgId;
        const apiResponse = await testPrompt(orgId, props.accessToken, prompt);
        setResponse(apiResponse)
    }

    return <form onSubmit={submit}>
        <input type="text" value={prompt} onChange={e => setPrompt(e.target.value)} />
        <button>Submit</button>
        <pre>Server Response: {response}</pre>
    </form>
}

export default withAuthInfo(TestPrompt)