This repository contains several experiments with transformer-based language models. All experiments were conducted exclusively by the Bigpicture Team and funded by the Ministry of SMEs and Startups.
This is a model fine-tuned to classify news articles into nine topics: ITscience, culture, economy, entertainment, health, life, politic, social, and sport. Classification accuracy varies by topic, reaching up to 96%.
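The classifier's raw output is a vector of nine logits, one per topic label. A minimal sketch of how such logits map to a predicted topic (softmax, then argmax) is below; the label order and the example logit values are assumptions for illustration, not taken from this repository:

```python
import math

# The nine topic labels of the classifier; the index order here is hypothetical.
LABELS = ["ITscience", "culture", "economy", "entertainment",
          "health", "life", "politic", "social", "sport"]

def softmax(logits):
    """Numerically stable softmax over a list of raw logits."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def classify(logits):
    """Map raw model logits to a (label, probability) pair."""
    probs = softmax(logits)
    idx = max(range(len(probs)), key=probs.__getitem__)
    return LABELS[idx], probs[idx]

# Example logits favouring index 2 ("economy" under the assumed order).
label, prob = classify([0.1, 0.2, 4.5, 0.3, 0.1, 0.0, 0.2, 0.1, 0.4])
```

The per-topic accuracy differences mentioned above would show up here as some labels winning with lower confidence than others.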
This model has been fine-tuned on 130 Jjaltoon scripts.
It takes a short text as input and generates the rest of the sequence.
> 아니 ("No")

아니 그게 뭔 개소리야! ("No, what the hell kind of nonsense is that!")

> 민수 ("Minsu")

민수에게 헥토파스칼킥, 이후 파운딩) 개새끼야! ("Hectopascal kick to Minsu, then ground-and-pound) You bastard!")

> 인공지능 ("Artificial intelligence")

인공지능 대체 왜... (고개를 확 들며 울먹이며) 남친도 있으면서 대체 왜 나한테 그렇게 이쁘게 웃어줬냐고!!! ("Artificial intelligence, why on earth... (snapping her head up, tearfully) You have a boyfriend, so why did you smile at me so sweetly?!")
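The continuation behaviour shown above (a short prompt is extended token by token) can be sketched as a greedy decoding loop. The `toy_lm` next-token function below is a stand-in for the fine-tuned language model, which is an assumption for illustration only:

```python
def generate(prompt, next_token_fn, max_new_tokens=20, eos="<eos>"):
    """Greedy autoregressive continuation: repeatedly append the model's
    most likely next token until EOS or the length budget runs out."""
    tokens = prompt.split()
    for _ in range(max_new_tokens):
        nxt = next_token_fn(tokens)
        if nxt == eos:
            break
        tokens.append(nxt)
    return " ".join(tokens)

# Toy deterministic "language model" standing in for the real one.
def toy_lm(tokens):
    table = {"the": "script", "script": "continues"}
    return table.get(tokens[-1], "<eos>")

out = generate("the", toy_lm)  # → "the script continues"
```

A real model would return a probability distribution over a vocabulary at each step; sampling or beam search would replace the greedy lookup here.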
This experiment finds an answer to a question within a given context. Its purpose is to verify that answers can be predicted from unstructured natural-language documents. Future work on this Q&A experiment involves finding answers in web documents.
The model used in this experiment is a KcBERT model fine-tuned using MiNSU. It can be exposed as a web interface for inference via ivete.
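BERT-style extractive Q&A models answer by selecting a span of the context: the model emits a start logit and an end logit for every token, and the answer is the span maximising their sum. A minimal sketch of that span selection follows; the token sequence and logit values are invented for illustration and do not come from KcBERT or MiNSU:

```python
def extract_answer(tokens, start_logits, end_logits, max_span_len=10):
    """Pick the (start, end) token span maximising
    start_logits[start] + end_logits[end], with end >= start,
    as in BERT-style extractive question answering."""
    best, best_score = (0, 0), float("-inf")
    for i, s in enumerate(start_logits):
        for j in range(i, min(i + max_span_len, len(tokens))):
            score = s + end_logits[j]
            if score > best_score:
                best_score, best = score, (i, j)
    i, j = best
    return " ".join(tokens[i:j + 1])

# Hypothetical context tokens and per-token logits.
tokens = "the model was funded by the ministry".split()
start_logits = [0, 0, 0, 0, 0, 3, 1]
end_logits   = [0, 0, 0, 0, 0, 0, 4]
answer = extract_answer(tokens, start_logits, end_logits)  # → "the ministry"
```

In practice the logits come from a fine-tuned QA head, and subword tokens are merged back into words before the span is returned.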
Preparing...