Multilingual Automatic Speech Recognition with word-level timestamps and confidence
Updated Dec 6, 2024 - Python
[CVPR 2021] Official PyTorch implementation for Transformer Interpretability Beyond Attention Visualization, a novel method to visualize classifications by Transformer based networks.
Korean named-entity recognizer built with KoBERT and CRF (BERT+CRF based Named Entity Recognition model for Korean)
Plot the vector graph of attention based text visualisation
Neat (Neural Attention) Vision is a visualization tool for the attention mechanisms of deep-learning models for Natural Language Processing (NLP) tasks. (framework-agnostic)
Train and visualize Hierarchical Attention Networks
🚀 Cross attention map tools for huggingface/diffusers
Visualizing query-key interactions in language + vision transformers
Comparatively fine-tuning pretrained BERT models on downstream text classification tasks with different architectural configurations in PyTorch.
Visualization for simple attention and Google's multi-head attention.
Summary of Transformer applications for computer vision tasks.
My code for learning attention mechanisms
(ECCV2020) Tensorflow implementation of A Generic Visualization Approach for Convolutional Neural Networks
Lightweight visualization tool for neural attention mechanisms
Implemented the image caption generation method proposed in the Show, Attend, and Tell paper using the fastai framework to describe the content of images. Achieved a BLEU score of 24 with a beam search size of 5. Designed a web application for model deployment using the Flask framework.
Easy-to-read implementation of self-supervised learning using vision transformer and knowledge distillation with no labels - DINO 😃
Attention mechanism for Keras, usable like Dense and RNN layers
CLIP GUI - XAI app ~ explainable (and guessable) AI with ViT & ResNet models
Transfer learning pretrained vision transformers for breast histopathology
PyTorch implementation of the End-to-End Memory Network with attention layer visualization support.
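Most of the tools listed above ultimately plot the same object: a matrix of attention weights where each row (one query position) sums to 1 over the key positions. A minimal, framework-free sketch of computing such a matrix with scaled dot-product attention (all array shapes and variable names here are illustrative, not taken from any specific repository above):

```python
import numpy as np

def attention_weights(q, k):
    """Scaled dot-product attention weights: softmax over key positions."""
    scores = q @ k.T / np.sqrt(k.shape[-1])           # (n_queries, n_keys)
    scores -= scores.max(axis=-1, keepdims=True)      # numerical stability
    w = np.exp(scores)
    return w / w.sum(axis=-1, keepdims=True)          # rows sum to 1

rng = np.random.default_rng(0)
q = rng.normal(size=(4, 8))   # 4 query positions, embedding dim 8
k = rng.normal(size=(6, 8))   # 6 key positions
w = attention_weights(q, k)   # (4, 6) matrix, ready to render as a heatmap
```

The resulting `w` is exactly what heatmap-style visualizers render (e.g. via `matplotlib.pyplot.imshow(w)`), with queries on one axis and keys on the other.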