
Bug:KeyError: 'tail_type' #841

Open
neverlatetolearn0 opened this issue Oct 30, 2024 · 5 comments
Labels: bug (Something isn't working)

@neverlatetolearn0

2024-10-30 22:00:48,100 - Deleted File Path: E:\Python_Code\Neo4j-llm-graph-builder\backend\merged_files\test9.txt and Deleted File Name : test9.txt
2024-10-30 22:00:48,101 - file test9.txt deleted successfully
{'message': 'Failed To Process File:test9.txt or LLM Unable To Parse Content ', 'error_message': "'tail_type'", 'file_name': 'test9.txt', 'status': 'Failed', 'db_url': 'neo4j://localhost:7687', 'failed_count': 1, 'source_type': 'local file', 'source_url': None, 'wiki_query': None, 'logging_time': '2024-10-30 14:00:48 UTC'}
2024-10-30 22:00:48,101 - File Failed in extraction: {'message': 'Failed To Process File:test9.txt or LLM Unable To Parse Content ', 'error_message': "'tail_type'", 'file_name': 'test9.txt', 'status': 'Failed', 'db_url': 'neo4j://localhost:7687', 'failed_count': 1, 'source_type': 'local file', 'source_url': None, 'wiki_query': None, 'logging_time': '2024-10-30 14:00:48 UTC'}
Traceback (most recent call last):
  File "E:\Python_Code\Neo4j-llm-graph-builder\backend\score.py", line 169, in extract_knowledge_graph_from_file
    result = await asyncio.to_thread(
  File "D:\Anaconda3\Install\envs\Neo4j_llm_bulider\lib\asyncio\threads.py", line 25, in to_thread
    return await loop.run_in_executor(None, func_call)
  File "D:\Anaconda3\Install\envs\Neo4j_llm_bulider\lib\concurrent\futures\thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
  File "E:\Python_Code\Neo4j-llm-graph-builder\backend\src\main.py", line 202, in extract_graph_from_file_local_file
    return processing_source(graph, model, file_name, pages, allowedNodes, allowedRelationship, True, merged_file_path, uri)
  File "E:\Python_Code\Neo4j-llm-graph-builder\backend\src\main.py", line 319, in processing_source
    node_count, rel_count = processing_chunks(selected_chunks,graph,file_name,model,allowedNodes,allowedRelationship,node_count, rel_count)
  File "E:\Python_Code\Neo4j-llm-graph-builder\backend\src\main.py", line 379, in processing_chunks
    graph_documents = generate_graphDocuments(model, graph, chunkId_chunkDoc_list, allowedNodes, allowedRelationship)
  File "E:\Python_Code\Neo4j-llm-graph-builder\backend\src\generate_graphDocuments_from_llm.py", line 46, in generate_graphDocuments
    graph_documents = get_graph_from_OpenAI(model, graph, chunkId_chunkDoc_list, allowedNodes, allowedRelationship)
  File "E:\Python_Code\Neo4j-llm-graph-builder\backend\src\openAI_llm.py", line 45, in get_graph_from_OpenAI
    return get_graph_document_list(llm, combined_chunk_document_list, allowedNodes, allowedRelationship, use_function_call)
  File "E:\Python_Code\Neo4j-llm-graph-builder\backend\src\llm.py", line 245, in get_graph_document_list
    graph_document = future.result()
  File "D:\Anaconda3\Install\envs\Neo4j_llm_bulider\lib\concurrent\futures\_base.py", line 451, in result
    return self.__get_result()
  File "D:\Anaconda3\Install\envs\Neo4j_llm_bulider\lib\concurrent\futures\_base.py", line 403, in __get_result
    raise self._exception
  File "D:\Anaconda3\Install\envs\Neo4j_llm_bulider\lib\concurrent\futures\thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
  File "E:\Python_Code\Neo4j-llm-graph-builder\backend\src\graph_transformers\llm.py", line 753, in convert_to_graph_documents
    return [self.process_response(document) for document in documents]
  File "E:\Python_Code\Neo4j-llm-graph-builder\backend\src\graph_transformers\llm.py", line 753, in <listcomp>
    return [self.process_response(document) for document in documents]
  File "E:\Python_Code\Neo4j-llm-graph-builder\backend\src\graph_transformers\llm.py", line 704, in process_response
    nodes_set.add((rel["tail"], rel["tail_type"]))
KeyError: 'tail_type'

I encountered this problem. Has anyone else run into the same issue?
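For context, the failing line in the traceback indexes `rel["tail_type"]` directly, so any relationship dict the LLM returns without that key raises this KeyError. A minimal sketch of the pattern, with one defensive variant; the dict shape below is illustrative, not the project's exact schema:

```python
# Mimic of the failing loop in process_response: build a set of
# (name, type) node tuples from LLM-extracted relationship dicts.
def collect_nodes(relationships):
    nodes_set = set()
    for rel in relationships:
        # rel["tail_type"] would raise KeyError when the LLM omits the key;
        # .get() with a default keeps processing alive instead.
        nodes_set.add((rel["tail"], rel.get("tail_type", "Unknown")))
    return nodes_set

rels = [
    {"head": "Alice", "head_type": "Person", "tail": "Acme", "tail_type": "Company"},
    {"head": "Alice", "head_type": "Person", "tail": "Bob"},  # 'tail_type' missing
]
print(collect_nodes(rels))
```

Whether defaulting to a placeholder type is acceptable depends on how the graph is consumed downstream; skipping the malformed relationship entirely is the other obvious option.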

@neverlatetolearn0 neverlatetolearn0 changed the title Bug Bug:KeyError: 'tail_type' Oct 30, 2024
@kartikpersistent kartikpersistent added the bug Something isn't working label Oct 31, 2024
@kaustubh-darekar (Collaborator)

Hi @yl950218, can you elaborate on your config and setup?
Which LLM did you use?
In which functionality did you encounter this error?

@neverlatetolearn0 (Author)

qwen2.5:14b

@kaustubh-darekar (Collaborator)

Hi @yl950218,
Apologies for the late reply.

We have tested the qwen model on the Fireworks platform and it works as expected. Unfortunately, we don't have access to any other platform for the qwen model.
If possible, could you try Fireworks and let us know the results?

@neverlatetolearn0 (Author)

It could be a limitation of the large language model's capability, but I solved it by modifying the code.
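The poster did not share their modification, but one plausible shape for such a fix is to validate each relationship dict before building graph nodes and skip incomplete entries. This is a guess, not the actual change; the required key names are assumed from the traceback:

```python
# Guessed defensive modification: separate well-formed relationship dicts
# from incomplete ones, so malformed LLM output is skipped rather than
# crashing the whole extraction. Key names are assumed from the traceback.
REQUIRED_KEYS = {"head", "head_type", "tail", "tail_type"}

def split_valid_relationships(relationships):
    """Return (valid, skipped) lists of relationship dicts."""
    valid, skipped = [], []
    for rel in relationships:
        if REQUIRED_KEYS <= rel.keys():  # dict views support set comparison
            valid.append(rel)
        else:
            skipped.append(rel)
    return valid, skipped

rels = [
    {"head": "Alice", "head_type": "Person", "tail": "Acme", "tail_type": "Company"},
    {"head": "Alice", "head_type": "Person", "tail": "Bob"},  # incomplete
]
valid, skipped = split_valid_relationships(rels)
print(len(valid), len(skipped))  # prints: 1 1
```

Logging the skipped entries (rather than silently dropping them) would make it easier to judge how often the model emits incomplete relationships.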

@kaustubh-darekar (Collaborator)

Thanks for the info, @neverlatetolearn0.
If possible, do let us know the modifications you made; it will help improve the application in such scenarios.
