Today, we’re going to explore the capabilities of OpenAI’s powerful GPT technology and how we can use it to create an assistant capable of summarizing CVs with a single command. OpenAI Assistants harness the power of Generative Pre-trained Transformer (GPT) models, advanced machine-learning models designed to generate human-like text. Within an OpenAI Assistant, the GPT model powers a conversational interface that produces useful outputs based on the context it is given.

OpenAI GPT Assistants: How It Works
Integrating AI into applications has become far more straightforward with OpenAI’s Assistants API. The API lets you create specialized assistants that can be customized to your needs. Here’s how the pieces fit together:
- Start by creating an ‘Assistant’. This entity can be tailored with custom instructions and paired with a specific model. To further extend its abilities, you can enable tools such as Code Interpreter, Knowledge Retrieval, and Function Calling.
- Next comes the ‘Thread’. Think of it as a conversation between a user and your assistant. A thread starts as soon as the user or your AI application begins a conversation.
- Then comes the ‘Message’. Within a thread, users or applications add messages containing their questions or directives for the assistant. There is no hard limit on the number of messages; once the conversation grows beyond the model’s context window, older context is automatically truncated to fit.
- The final step is the ‘Run’. Once the user messages are part of the thread, we set the assistant loose by creating a run. This triggers the assistant to generate and deliver a response using its associated model and tools.
To illustrate, imagine setting up an Assistant as a personal math tutor with the Code Interpreter tool enabled. A conversation with a user is initiated, the user’s messages pose mathematical questions, and the assistant, drawing on its tools, responds with solutions. Simple, isn’t it?
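To make those four steps concrete, here is a minimal sketch using the official openai Node.js SDK. The assistant name, instructions, and question are placeholders, and error handling is omitted:
const assistant = await openai.beta.assistants.create({
  name: "Math Tutor", // hypothetical assistant
  model: "gpt-4-1106-preview",
  instructions: "You are a personal math tutor. Write and run code to answer questions.",
  tools: [{ type: "code_interpreter" }],
});
const thread = await openai.beta.threads.create(); // start a conversation
await openai.beta.threads.messages.create(thread.id, {
  role: "user",
  content: "Solve 3x + 11 = 14 for x.", // placeholder question
});
const run = await openai.beta.threads.runs.create(thread.id, {
  assistant_id: assistant.id, // let the assistant generate a response
});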
OpenAI GPT Assistants: A Toolbox of Possibilities
The Assistants API beta introduces features that significantly enhance the capabilities of Assistants across various domains. Among these are the Code Interpreter, Knowledge Retrieval, and Function Calling tools, each designed to give Assistants advanced capabilities.
Code Interpreter:
This tool allows Assistants to write and execute code within a secure environment, enabling them to solve complex problems iteratively. Assistants can generate customized solutions by running code and producing outputs like files with data and images of graphs.
Knowledge Retrieval:
With Knowledge Retrieval, Assistants access external knowledge by indexing and retrieving content from uploaded documents. This expands their information pool, enabling more informed and accurate responses to user queries.
Function Calling:
Function Calling empowers Assistants to utilize predefined functions for specialized tasks, enhancing their ability to handle complex scenarios and provide tailored responses dynamically.
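As an illustration of Function Calling, here is a hedged sketch of how a function tool might be declared when creating an assistant; the assistant name, function name, and parameter schema below are hypothetical:
const weatherAssistant = await openai.beta.assistants.create({
  name: "Weather Bot", // hypothetical assistant
  model: "gpt-4-1106-preview",
  instructions: "Use the provided function to look up the weather.",
  tools: [
    {
      type: "function",
      function: {
        name: "getWeather", // hypothetical function exposed by your application
        description: "Get the current weather for a city",
        parameters: {
          type: "object",
          properties: { city: { type: "string" } },
          required: ["city"],
        },
      },
    },
  ],
});
When a run needs this tool, it pauses in a requires_action state; your application executes the function itself and submits the output back so the run can continue.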
Goal of Our Recruiter Assistant Demo
The assistant we are building today, named “Recruiter Assistant”, is designed to be a recruiter’s aid. Its primary job is to summarize the key aspects of a candidate’s CV, including the technologies they have worked with and how long they have used them, while preserving confidentiality by not disclosing company or candidate names. To give the assistant this knowledge, we’ll use the Knowledge Retrieval feature, with my own CV uploaded as the document to query.
Creating Your OpenAI Assistant: Tools and Setup
To get started, we need a few prerequisites:
- Node.js installed
- An OpenAI account
- The OpenAI API key, which you can get from your OpenAI account
- A CV in PDF format (the code below expects it at data/cv.pdf) for the assistant to digest
After securing the above, let’s initialize a new Node.js project and install the necessary dependencies:
mkdir openai-api-nodejs
cd openai-api-nodejs
npm init -y
npm install --save openai dotenv
npm install --save-dev typescript ts-node
Initialize a new TypeScript configuration file:
npx tsc --init
Create a new file index.ts, and let’s start writing a simple program that uses the OpenAI API.
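Before writing the handlers, index.ts needs an OpenAI client instance that the rest of the code can use. A minimal setup might look like the following sketch, assuming your API key is stored in a .env file under the conventional OPENAI_API_KEY variable:
import fs from "fs";
import path from "path";
import OpenAI from "openai";
import dotenv from "dotenv";

dotenv.config(); // load OPENAI_API_KEY from the .env file

const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
});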
Building the OpenAI Assistant
In our setup, we’ll employ three main classes:
- FileHandler, which handles file-related operations
- ThreadHandler, which manages threads
- AssistantHandler, which deals with assistant-related operations
The App class holds all the handlers and a run method that executes the main flow of our application. Its checkRun method polls the status of a run and logs it along with the resources used.
Handling File Uploads with FileHandler
Our FileHandler class is a simple one with a single createFile method:
class FileHandler {
  async createFile() {
    const file = await openai.files.create({
      file: fs.createReadStream(path.join(__dirname, "/data/cv.pdf")),
      purpose: "assistants",
    });
    console.log("Upload file for assistant: ", file.id);
    return file;
  }
}
The createFile method uploads the CV file to OpenAI’s servers and returns the file object.
Managing Threads with ThreadHandler
The ThreadHandler class has a method for creating threads:
class ThreadHandler {
  async createThread() {
    const thread = await openai.beta.threads.create({});
    console.log("Thread has been created: ", thread);
    return thread;
  }
}
In createThread, a new empty thread is created using OpenAI’s API and the thread object is returned.
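As a side note, the Assistants API also lets you seed a thread with initial messages at creation time instead of adding them afterwards. A small sketch, assuming the same openai client and placeholder content:
const seededThread = await openai.beta.threads.create({
  messages: [
    { role: "user", content: "Here is my first question..." }, // placeholder message
  ],
});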
Creating and Managing Assistants with AssistantHandler
The AssistantHandler class includes the createAssistant method, which is responsible for creating the assistant:
class AssistantHandler {
  async createAssistant(file: OpenAI.Files.FileObject) {
    const assistantName = 'Recruiter assistant';
    const assistants = await openai.beta.assistants.list();
    const existingAssistant = assistants.data.find(a => a.name === assistantName);
    if (existingAssistant) {
      console.log(`Assistant '${assistantName}' exists: `, existingAssistant);
      return existingAssistant;
    } else {
      const assistant = await openai.beta.assistants.create({...});
      console.log('Assistant has been created: ', assistant);
      return assistant;
    }
  }
}
createAssistant checks whether an assistant named ‘Recruiter assistant’ already exists. If one exists, it is returned; otherwise, a new assistant is created.
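The create call is elided above. Based on the assistant object shown in the run output later in this post, the parameters passed might look roughly like this (a sketch, not a verbatim copy of the repository code):
const assistant = await openai.beta.assistants.create({
  name: assistantName,
  instructions:
    "You are a recruiter support chatbot. Use your knowledge base to best respond to customer queries. Please also keep the answer short and concise.",
  model: "gpt-4-1106-preview",
  tools: [{ type: "retrieval" }],
  file_ids: [file.id], // the CV uploaded by FileHandler
});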
Executing the Workflow with App
The App class ties everything together with the run method:
class App {
  fileHandler: FileHandler;
  threadHandler: ThreadHandler;
  assistantHandler: AssistantHandler;

  constructor() {
    this.fileHandler = new FileHandler();
    this.threadHandler = new ThreadHandler();
    this.assistantHandler = new AssistantHandler();
  }

  async checkRun(
    thread: OpenAI.Beta.Threads.Thread,
    run: OpenAI.Beta.Threads.Runs.Run
  ) {
    return new Promise((resolve, reject) => {
      const interval = setInterval(async () => {
        const retrieveRun = await openai.beta.threads.runs.retrieve(
          thread.id,
          run.id
        );
        console.log("Run status: ", retrieveRun.status);
        if (retrieveRun.status === "completed") {
          console.log("Run completed: ", retrieveRun);
          clearInterval(interval);
          resolve(retrieveRun);
        }
      }, 3000);
    });
  }

  async run() {
    const file = await this.fileHandler.createFile();
    const thread = await this.threadHandler.createThread();
    const assistant = await this.assistantHandler.createAssistant(file);
    const message = await openai.beta.threads.messages.create(thread.id, {
      role: "user",
      content:
        "Summarize the technologies that the candidate has worked with in the past and how long they have worked with them, without mentioning the company names or the candidate's name.",
    });
    console.log("Adding message to thread: ", message);
    const run = await openai.beta.threads.runs.create(thread.id, {
      assistant_id: assistant.id,
    });
    console.log("Run has been created: ", run);
    await this.checkRun(thread, run);
    const messages = await openai.beta.threads.messages.list(thread.id);
    const answer = (messages.data ?? []).find((m) => m?.role === "assistant")
      ?.content?.[0];
    console.log("Answer: ", answer);
  }
}
This code defines a class App that orchestrates interactions with OpenAI’s API. It initializes instances of FileHandler, ThreadHandler, and AssistantHandler within its constructor. The checkRun method monitors the status of a thread’s run, polling every 3 seconds until the run is completed.
The run method executes a series of steps: creating a file, a thread, and an assistant; sending a message to the thread; creating a run with the assistant; and finally, waiting for the run to complete. Once completed, it retrieves and logs the assistant’s response from the thread’s messages. Overall, this class encapsulates the logic for interacting with OpenAI’s API to orchestrate the assistant’s behavior.
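One thing worth noting: checkRun only resolves when the run completes, so a run that ends in a terminal failure state would leave the interval polling forever. A more defensive variation might also stop on those states. Here is a sketch, assuming the same openai client and the status names used by the Assistants beta:
async function waitForRun(
  threadId: string,
  runId: string,
  intervalMs = 3000
): Promise<OpenAI.Beta.Threads.Runs.Run> {
  return new Promise((resolve, reject) => {
    const interval = setInterval(async () => {
      // Poll the run until it reaches a terminal state (error handling kept minimal)
      const run = await openai.beta.threads.runs.retrieve(threadId, runId);
      console.log("Run status: ", run.status);
      if (run.status === "completed") {
        clearInterval(interval);
        resolve(run);
      } else if (["failed", "cancelled", "expired"].includes(run.status)) {
        clearInterval(interval);
        reject(new Error(`Run ended with status: ${run.status}`));
      }
    }, intervalMs);
  });
}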
Running Our OpenAI Assistant
Now that we’ve set everything up, it’s time to put our OpenAI assistant to the test. We can do so with the following code snippet:
(async () => {
  console.log("Running job...⏳");
  const app = new App();
  await app.run();
  console.log("Successfully ran job! ✅");
})();
This part of the code is an immediately-invoked async function that kicks off the entire operation. First, it logs “Running job…⏳” to indicate the operation has started. It then creates an instance of the App class and calls its run() method to orchestrate the whole flow.
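To execute the script, you can run it with ts-node, assuming index.ts is the entry point created earlier:
npx ts-node index.ts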
Link to the GitHub repo: antonespo/open-ai-recruiter-assistant (github.com)
Once the application completes its execution, you will see a console log displaying a message generated by the assistant based on the input given in the App run method. It will look something like this:
Running job...⏳
Upload file for assistant: file-32s72Vwc2BYpn17UnkRcxGgq
Thread has been created: {
id: 'thread_bcyHijwiF8YEcuSeqMcaLavv',
object: 'thread',
created_at: 1710951432,
metadata: {}
}
Assistant 'Recruiter assistant' exists: {
id: 'asst_bMVH8sFdXukwb7FQj0lCmHFm',
object: 'assistant',
created_at: 1710949980,
name: 'Recruiter assistant',
description: null,
model: 'gpt-4-1106-preview',
instructions: 'You are a recruiter support chatbot. Use your knowledge base to best respond to customer queries. Please also keep the answer short and concise.',
tools: [ { type: 'retrieval' } ],
file_ids: [ 'file-4aOJU56PzUfDkPTvpGPx2aSt' ],
metadata: {}
}
Adding message to thread: {
id: 'msg_U7rWdRXLAvZr1pjc1T9Mo7CF',
object: 'thread.message',
created_at: 1710951433,
assistant_id: null,
thread_id: 'thread_bcyHijwiF8YEcuSeqMcaLavv',
run_id: null,
role: 'user',
content: [ { type: 'text', text: [Object] } ],
file_ids: [],
metadata: {}
}
Run has been created: {
id: 'run_PQ2Pqcfdyk8ZsLKTkfa2cfjW',
object: 'thread.run',
created_at: 1710951434,
assistant_id: 'asst_bMVH8sFdXukwb7FQj0lCmHFm',
thread_id: 'thread_bcyHijwiF8YEcuSeqMcaLavv',
status: 'queued',
started_at: null,
expires_at: 1710952034,
cancelled_at: null,
failed_at: null,
completed_at: null,
required_action: null,
last_error: null,
model: 'gpt-4-1106-preview',
instructions: 'You are a recruiter support chatbot. Use your knowledge base to best respond to customer queries. Please also keep the answer short and concise.',
tools: [ { type: 'retrieval' } ],
file_ids: [ 'file-4aOJU56PzUfDkPTvpGPx2aSt' ],
metadata: {},
usage: null
}
Run status: in_progress
Run status: in_progress
Run status: in_progress
Run status: in_progress
Run status: completed
Run completed: {
id: 'run_PQ2Pqcfdyk8ZsLKTkfa2cfjW',
object: 'thread.run',
created_at: 1710951434,
assistant_id: 'asst_bMVH8sFdXukwb7FQj0lCmHFm',
thread_id: 'thread_bcyHijwiF8YEcuSeqMcaLavv',
status: 'completed',
started_at: 1710951434,
expires_at: null,
cancelled_at: null,
failed_at: null,
completed_at: 1710951448,
required_action: null,
last_error: null,
model: 'gpt-4-1106-preview',
instructions: 'You are a recruiter support chatbot. Use your knowledge base to best respond to customer queries. Please also keep the answer short and concise.',
tools: [ { type: 'retrieval' } ],
file_ids: [ 'file-4aOJU56PzUfDkPTvpGPx2aSt' ],
metadata: {},
usage: { prompt_tokens: 2934, completion_tokens: 278, total_tokens: 3212 }
}
Answer: {
type: 'text',
text: {
value: 'The candidate has experience with the following technologies:\n' +
'\n' +
'- **Microsoft Azure**: Used in current and previous roles, as well as for various side projects. The candidate is Microsoft Azure Developer Associate Certified (AZ-204, September 2023), Microsoft Azure Data Fundamentals Certified (DP-900, October 2023), and Microsoft Azure Fundamentals Certified (AZ-900, July 2021).\n' +
'- **TypeScript & Angular**: Frequently used across different roles for full-stack development, and also in teaching within an academy.\n' +
'- **RxJs & NgRx**: Implemented in current technical leadership and development role as well as in teaching.\n' +
'- **.NET & C#**: Worked with since at least March 2019 as a full-stack developer and as an educator. The candidate is Microsoft Technology Associate: Software Development Fundamentals Certified (December 2021).\n' +
'- **Node.js**: Used in the current full-stack development role and in an academy teaching role. Also certified with a Fincons Node.js – Nest.js course (July 2022).\n' +
'- **Python**: Applied in the development of a logistic application during a previous role as a Software Developer Engineer.\n' +
'- **SQL Server, Entity Framework**: Used for full-stack development in a previous role.\n' +
'- **MATLAB**: Used for developing algorithms and models for CAE implementation in a junior engineer role.\n' +
'- **React**: Utilized in developing side-projects such as a social network for events and a chat app.\n' +
'- **SignalR**: Leveraged for real-time communication in a chat app side-project.\n' +
'- **Chart.js**: Employed in a personal web app for monitoring blood pressure data.\n' +
'\n' +
'In terms of experience duration, it ranges from at least four to five years for most of the technologies, with teaching roles indicating a deep understanding and proficiency in some of them. The candidate has been leading teams and teaching since at least 2021, suggesting a strong command of the stack and best practices in software development.',
annotations: []
}
}
Successfully ran job! ✅
Here is the result formatted in a cleaner, more readable manner:
The candidate has experience with the following technologies:
- Microsoft Azure: Used in current and previous roles, as well as for various side projects. The candidate is Microsoft Azure Developer Associate Certified (AZ-204, September 2023), Microsoft Azure Data Fundamentals Certified (DP-900, October 2023), and Microsoft Azure Fundamentals Certified (AZ-900, July 2021).
- TypeScript & Angular: Frequently used across different roles for full-stack development, and also in teaching within an academy.
- RxJs & NgRx: Implemented in current technical leadership and development role as well as in teaching.
- .NET & C#: Worked with since at least March 2019 as a full-stack developer and as an educator. The candidate is Microsoft Technology Associate: Software Development Fundamentals Certified (December 2021).
- Node.js: Used in the current full-stack development role and in an academy teaching role. Also certified with a Fincons Node.js – Nest.js course (July 2022).
- Python: Applied in the development of a logistic application during a previous role as a Software Developer Engineer.
- SQL Server, Entity Framework: Used for full-stack development in a previous role.
- MATLAB: Used for developing algorithms and models for CAE implementation in a junior engineer role.
- React: Utilized in developing side-projects such as a social network for events and a chat app.
- SignalR: Leveraged for real-time communication in a chat app side-project.
- Chart.js: Employed in a personal web app for monitoring blood pressure data.
In terms of experience duration, it ranges from at least four to five years for most of the technologies, with teaching roles indicating a deep understanding and proficiency in some of them. The candidate has been leading teams and teaching since at least 2021, suggesting a strong command of the stack and best practices in software development.
The Power of OpenAI Assistants
And there we have it! We’ve created an OpenAI assistant capable of summarizing the key technologies and experience durations from a given CV, providing recruiters with a handy tool for screening applications.
The assistant is not only efficient but also a prime example of the power of OpenAI’s GPT models. It can evaluate and summarize information from any document, not just CVs, and can vastly reduce manual reviewing time.
Looking forward, I believe there are many more ways to harness the power of OpenAI’s GPT models that are yet to be discovered. This was just a glimpse, the tip of an immense iceberg. So stay tuned for future posts as we delve deeper into the world of AI and OpenAI integrations.
Happy Coding!
