KLARNA - Language Models and Knowledge Graphs: A Systems Approach
- 2. LLMs and knowledge graphs are currently
in a state of purgatory
[Diagram: KGs x LLMs sitting between Heaven* and Hell]
* heaven = production grade applications and systems
- 3. Rebuild our knowledge graph stack from the ground up with LLMs embedded
throughout, in order to maximise their potential.
We must take a systems approach to integrating
LLMs with knowledge graphs
- 4. ● Knowledge is patterns
○ LLMs and patterns
○ Knowledge graphs and patterns
● Current state of integration between knowledge graphs and LLMs
○ The challenges of production use cases
● The future and how we can get there
○ Taking a systems approach
○ Helpful tips
Today we will cover:
- 6. How do we know a dog is a dog?
Knowledge | Current State | Future State
- 7. Pattern: Four legs, a tail, barking, a wet nose, arguing with a cat.
How do we know a dog is a dog?
Knowledge | Current State | Future State
- 8. How do we know when someone likes us?
Knowledge | Current State | Future State
- 9. Pattern: Eye contact, making time for you and showing interest. Laughing at our bad
jokes.
How do we know when someone likes us?
Knowledge | Current State | Future State
- 10. How do we detect fraud in financial transactions?
Knowledge | Current State | Future State
- 11. Pattern: Unusual spending patterns; transactions across multiple geographies in a short
space of time, transactions at unusual times, transactions on unexpected goods or
services.
How do we detect fraud in financial transactions?
Knowledge | Current State | Future State
- 12. For LLMs, pattern prediction is grounded in accurately predicting the next token. In
order to do this well, they in turn develop intelligence.
LLMs are just large scale pattern recognition and
prediction engines
Knowledge | Current State | Future State
- 13. Knowledge graphs are our opinionated version of the
patterns in the world we care about
Our knowledge graph schemas, ontologies and taxonomies are effectively the data
points and patterns in the world we care about within a certain context.
Knowledge | Current State | Future State
- 14. Pattern in a regular DB:
Dog Cat
Knowledge | Current State | Future State
- 15. Graphs are much richer:
(Dog)-[Eats]-(Cat)
Knowledge | Current State | Future State
- 16. Graphs are much richer:
(Dog)-[Loves]-(Cat)
Knowledge | Current State | Future State
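A minimal sketch (not from the talk, names are illustrative) of why the graph representation is richer: in a regular DB, "Dog" and "Cat" are just values in a row, while in a graph the relationship between them is a first-class, typed edge.

```python
# In a regular DB the relationship between the two values is implicit at best.
row = {"animal_1": "Dog", "animal_2": "Cat"}

# In a graph the relationship is explicit and typed, so the same pair of
# nodes can carry very different meanings.
edges = [
    ("Dog", "Eats", "Cat"),   # one possible pattern
    ("Dog", "Loves", "Cat"),  # a very different one
]

def relations(src, dst, edges):
    """Return every relationship type connecting src to dst."""
    return [rel for s, rel, d in edges if (s, d) == (src, dst)]

print(relations("Dog", "Cat", edges))  # ['Eats', 'Loves']
```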
- 17. By focusing on patterns, we can design systems where LLMs and knowledge graphs are
interoperable and intrinsically connected.
Both technologies are complementary. LLMs are very good at synthesizing large
quantities of unstructured data. Knowledge graphs provide frameworks for structured,
easily navigable representations of that information.
Knowledge Graphs and LLMs are similar in their focus
on patterns
Knowledge | Current State | Future State
- 18. Knowledge Graphs and LLMs:
The Current State of the Play
Knowledge | Current State | Future State
- 19. Typically involves turning a natural language query into a structured Cypher query,
which is executed against a graph. The result is then turned into a natural language
answer.
Use Case 1: Using LLMs to query graphs
Knowledge | Current State | Future State
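The three-step loop described above can be sketched as follows. This is a hypothetical outline, not Klarna's implementation: `llm` stands in for any chat-completion call and `run_graph_query` for a real graph driver (e.g. Neo4j); both are stubbed here so the shape of the pipeline is runnable end to end.

```python
def llm(prompt: str) -> str:
    # Stub: a real system would call a model here.
    if "Generate a Cypher query" in prompt:
        return "MATCH (s:Supplier {location: 'Italy'}) RETURN s.name"
    return "Firenze Inc is a supplier located in Italy."

def run_graph_query(cypher: str) -> list:
    # Stub: a real system would execute this against the graph database.
    return [{"s.name": "Firenze Inc"}]

def answer(question: str, schema: str) -> str:
    # 1. Natural language -> Cypher, grounded in the graph schema.
    cypher = llm(f"Schema:\n{schema}\nGenerate a Cypher query for: {question}")
    # 2. Execute the generated query against the graph.
    rows = run_graph_query(cypher)
    # 3. Rows -> natural language answer.
    return llm(f"Question: {question}\nRows: {rows}\nAnswer in plain English.")

print(answer("Which suppliers are in Italy?", "(:Supplier {name, location})"))
```

Note that the schema is passed into the first prompt: as the later slides argue, the quality of that context largely determines whether the generated Cypher is correct.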
- 20. Involves taking unstructured information and processing it into the more structured
form of a knowledge graph.
Use Case 2: Using LLMs to build knowledge graphs
Knowledge | Current State | Future State
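The extraction direction can be sketched the same way: unstructured text in, (subject, relation, object) triples out, then merged into the graph. This is a hypothetical sketch; `extract_triples` is stubbed where a real system would prompt an LLM against an agreed schema and validate its output.

```python
def extract_triples(text: str) -> list[tuple[str, str, str]]:
    # Stub: a real implementation prompts an LLM and validates the result.
    return [("Firenze Inc", "LOCATED_IN", "Italy"),
            ("Firenze Inc", "PRODUCES", "Knitwear")]

def to_cypher(triples):
    """Turn triples into idempotent MERGE statements."""
    return [
        f"MERGE (a {{name: '{s}'}}) MERGE (b {{name: '{o}'}}) "
        f"MERGE (a)-[:{r}]->(b)"
        for s, r, o in triples
    ]

statements = to_cypher(extract_triples("Firenze Inc makes knitwear in Italy."))
for stmt in statements:
    print(stmt)
```

Using MERGE rather than CREATE keeps repeated extraction runs from duplicating nodes, which matters once the same entity appears in many documents.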
- 23. ● Often results don’t work at production scale and graph complexity
● Querying graphs
○ LLMs don’t receive enough information to generate correct Cypher queries
○ Schemas have not been designed with querying in mind
○ Infrastructure is not optimised for LLMs
● Building graphs
○ Examples are often shown with ‘schemaless’ graphs
○ Unable to handle edge cases
○ Unreliable output when populating graphs
○ Generated schemas are of low quality
But we are encountering a lot of issues
Knowledge | Current State | Future State
- 26. So how do we get there?
Solutions → Systems
Knowledge | Current State | Future State
- 27. Objective: building schemas effectively
We should treat our LLM like a new employee in terms of the context we provide it
Building and populating graphs
Knowledge | Current State | Future State
- 28. Node 1:
* Supplier name: Firenze Inc
* Location: Italy
Node 2:
* Supplier name: Firenze Inc
* Location: Italy
* Desc: Producer of high quality knitwear and wool garments
Query: find me a jumper supplier in Italy?
Building and populating graphs with the system
in mind
Knowledge | Current State | Future State
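A small sketch (hypothetical property names) of why the description matters: the context string the LLM sees is built from node properties, so without `desc` there is nothing connecting this supplier to jumpers or knitwear.

```python
node_1 = {"supplier_name": "Firenze Inc", "location": "Italy"}
node_2 = {"supplier_name": "Firenze Inc", "location": "Italy",
          "desc": "Producer of high quality knitwear and wool garments"}

def node_context(node: dict) -> str:
    """Serialise a node's properties into the prompt context."""
    return "; ".join(f"{k}: {v}" for k, v in node.items())

# For the query "find me a jumper supplier in Italy?", only node 2's
# context gives the LLM evidence that this supplier actually makes
# jumpers; node 1 is indistinguishable from any other Italian supplier.
print("knitwear" in node_context(node_1))  # False
print("knitwear" in node_context(node_2))  # True
```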
- 29. Objective: turn the messy into the structured
Let’s lean into better documentation and structured frameworks like Pydantic to do
better extraction.
Populating graphs
Knowledge | Current State | Future State
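A minimal sketch of what leaning on Pydantic looks like (the model and field names are illustrative, not from the talk): define the schema once, then validate whatever JSON the LLM emits against it, so malformed extractions fail loudly instead of silently polluting the graph.

```python
from pydantic import BaseModel

class Supplier(BaseModel):
    name: str
    location: str
    products: list[str] = []

# In practice `raw` would be parsed from an LLM's JSON output;
# constructing the model validates types and required fields.
raw = {"name": "Firenze Inc", "location": "Italy",
       "products": ["knitwear", "wool garments"]}
supplier = Supplier(**raw)
print(supplier.location)  # Italy
```

The same model doubles as documentation: its field names and types are exactly the context you hand the LLM when asking it to extract.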
- 30. Objective: use queries and responses as feedback to improve the system
Cache questions and their Cypher, and pass them as few-shot examples to the LLM
Adjust schemas, or add descriptions and relationships, based on querying difficulties.
Querying and improving the system
Knowledge | Current State | Future State
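The caching idea above can be sketched as follows (a hypothetical outline, not a production implementation): record (question, Cypher) pairs that worked, and replay the most recent ones as few-shot examples in later prompts.

```python
few_shot_cache: list[tuple[str, str]] = []

def record_success(question: str, cypher: str) -> None:
    """Call after a generated query executed and the answer was accepted."""
    few_shot_cache.append((question, cypher))

def build_prompt(question: str, schema: str, k: int = 3) -> str:
    """Assemble a text-to-Cypher prompt with the last k cached examples."""
    examples = "\n".join(
        f"Q: {q}\nCypher: {c}" for q, c in few_shot_cache[-k:]
    )
    return (f"Schema:\n{schema}\n\nExamples:\n{examples}\n\n"
            f"Q: {question}\nCypher:")

record_success("Suppliers in Italy?",
               "MATCH (s:Supplier {location: 'Italy'}) RETURN s.name")
print(build_prompt("Knitwear suppliers in Italy?",
                   "(:Supplier {name, location})"))
```

A production version would retrieve the k most *similar* cached questions (e.g. by embedding) rather than simply the most recent, but the feedback loop is the same.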
- 31. Today we covered:
● Knowledge is patterns
○ LLMs and patterns
○ Knowledge graphs and patterns
● Current state of integration between knowledge graphs and LLMs
○ The challenges of production use cases
● The future and how we can get there
○ Taking a systems approach
○ Helpful tips
Knowledge | Current State | Future State
- 32. ● Intro to LLMs
● Pydantic is all you need
● Ilya on Dwarkesh podcast
Essential reading:
Solutions → Systems