Nice summary. For those who were not there (like me), the livestreams are available on YouTube (at least from some tracks, like keynotes, multimodality and codegen) https://www.youtube.com/@aiDotEngineer/streams
great recap! and thanks for coming again! i would say the commercial bent is partially my fault - i was trying to give people a survey of the landscape of ai tooling to take home to their own companies to discuss. next year i will make more effort to rebalance toward more applications
LangGraph aims to overcome a major limitation of traditional LangChain chains: because chains are essentially directed acyclic graphs (DAGs), they cannot loop at runtime. Modeling the workflow as a general graph removes this restriction, since a graph may contain cycles, allowing control flow to return to an earlier node while the application is running.
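To make the DAG-vs-cycle distinction concrete, here is a minimal sketch in plain Python (no LangGraph dependency) of a two-node graph with a conditional edge that loops back until a runtime condition is met. The node names (`generate`, `review`) and the approval criterion are invented for illustration:

```python
# A chain (DAG) would run generate -> review exactly once. A graph with a
# cycle can revisit "generate" until "review" approves, decided at runtime.

END = "END"

def generate(state):
    # Hypothetical "generate" node: produce another draft attempt.
    state["attempts"] += 1
    state["draft"] = f"draft v{state['attempts']}"
    return state

def review(state):
    # Hypothetical "review" node: approve after three attempts (a stand-in
    # for a real quality check, e.g. an LLM-based critique).
    state["approved"] = state["attempts"] >= 3
    return state

nodes = {"generate": generate, "review": review}

def next_node(current, state):
    # The conditional edge: review -> generate forms the cycle.
    if current == "generate":
        return "review"
    return END if state["approved"] else "generate"

def run(entry, state):
    current = entry
    while current != END:
        state = nodes[current](state)
        current = next_node(current, state)
    return state

final = run("generate", {"attempts": 0})
print(final["draft"])  # the loop ran until the review node approved
```

LangGraph's real API expresses the same idea declaratively (nodes, edges, and conditional edges on a state graph); this sketch only shows why cycles matter.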
Furthermore, graphs have proven useful for organizing knowledge bases in retrieval-augmented generation (RAG) scenarios. Specifically, they can enhance the "retrieval" stage to surface more meaningful context, ultimately improving the accuracy of generated responses. To achieve this, our strategy is to store the knowledge base in a graph database (e.g., Neo4j), leveraging the semantic capabilities of LLMs to correctly identify and map entities and relationships.
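The retrieval-side idea can be sketched without any database: store knowledge as (subject, relation, object) triples and retrieve context by expanding outward from the entities mentioned in a question. Everything here (the sample triples, the one-hop expansion, the assumption that entity linking has already happened) is illustrative; a production setup would delegate storage and traversal to a graph database such as Neo4j:

```python
# Tiny in-memory knowledge graph of (subject, relation, object) triples.
triples = [
    ("Marie Curie", "DISCOVERED", "Radium"),
    ("Marie Curie", "WON", "Nobel Prize in Physics"),
    ("Radium", "IS_A", "Chemical element"),
    ("Pierre Curie", "MARRIED_TO", "Marie Curie"),
]

def neighbors(entity):
    """Triples in which the entity appears as subject or object."""
    return [t for t in triples if entity in (t[0], t[2])]

def retrieve_context(question_entities, hops=1):
    # Start from entities detected in the question (entity linking is
    # assumed; in practice an LLM would extract them) and expand outward
    # a fixed number of hops, collecting every triple encountered.
    seen, frontier, context = set(question_entities), list(question_entities), []
    for _ in range(hops):
        next_frontier = []
        for entity in frontier:
            for s, r, o in neighbors(entity):
                if (s, r, o) not in context:
                    context.append((s, r, o))
                for node in (s, o):
                    if node not in seen:
                        seen.add(node)
                        next_frontier.append(node)
        frontier = next_frontier
    return context

ctx = retrieve_context(["Marie Curie"])
# ctx now holds every triple touching "Marie Curie" after one hop,
# ready to be formatted into the prompt for the generation stage.
```

The payoff over plain vector search is that related facts are retrieved because they are *connected* to the question's entities, not merely because their text is similar to the query.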
To realize this goal, LangChain provides a component called LLMGraphTransformer, which uses an LLM to transform unstructured text into a graph representation of entities and relationships.