GitHub Repository: HJHGJGHHG/CCF-BDCI-AQYI
Path: blob/main/models/__pycache__/bert_tdlstm.cpython-38.pyc
NOTE: The file at this path is a compiled CPython 3.8 bytecode cache
(bert_tdlstm.cpython-38.pyc), not Python source, so the share page renders
it as binary garbage. Below is a best-effort reconstruction from the symbol
and constant names that are legible in the bytecode. Tensor sizes, the body
of get_x_l, and the tail of forward are NOT recoverable and are left as
`...` placeholders rather than guessed.

import torch
import torch.nn as nn
from transformers import BertForSequenceClassification

from models.layers import DynamicLSTM


def get_x_l(character_in_text, all_hidden_states):
    # Collects the left-context hidden states for each sample. The bytecode
    # shows calls to: torch.zeros(...).to('cuda'), a per-sample loop using
    # .tolist() and .append(), torch.index_select / torch.tensor /
    # .unsqueeze(...), zero padding, torch.cat along two different dims,
    # torch.LongTensor, and a final index_select sized by args.batch_size.
    # The exact slicing logic cannot be recovered from the dump.
    ...
    return x_l_sum, x_l_len  # both names are visible in the bytecode


class Bert_TD_LSTM(BertForSequenceClassification):
    def __init__(self, config):
        super().__init__(config)
        self.bert.config.output_hidden_states = True
        # LSTM and linear sizes are not recoverable from the .pyc dump;
        # the keyword names num_layers / batch_first are visible in it.
        self.lstm_l = DynamicLSTM(..., num_layers=..., batch_first=True)
        self.lstm_r = DynamicLSTM(..., num_layers=..., batch_first=True)
        self.linear = nn.Sequential(nn.Linear(..., ...), nn.Tanh())

    # The bytecode shows forward taking eleven defaulted arguments; the
    # last is presumably character_in_text, since get_x_l consumes it.
    def forward(self, input_ids=None, attention_mask=None,
                token_type_ids=None, position_ids=None, head_mask=None,
                inputs_embeds=None, labels=None, output_attentions=None,
                output_hidden_states=None, return_dict=None,
                character_in_text=None):
        return_dict = (return_dict if return_dict is not None
                       else self.config.use_return_dict)
        outputs = self.bert(
            input_ids,
            attention_mask=attention_mask,
            token_type_ids=token_type_ids,
            position_ids=position_ids,
            head_mask=head_mask,
            inputs_embeds=inputs_embeds,
            output_attentions=output_attentions,
            output_hidden_states=output_hidden_states,
            return_dict=return_dict,
        )
        hidden = outputs[0]
        # The remainder of forward (applying lstm_l / lstm_r and the
        # classification head) is not recoverable from the bytecode dump.
        ...
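For context on what the lstm_l / lstm_r pair is for: in a TD-LSTM-style model, one LSTM reads the left context up to and including the target span, and a second LSTM reads the right context backwards toward the target. The sketch below is our own minimal illustration of that split on plain token lists, not the repository's code; the helper name `split_td_contexts` and its arguments are assumptions for the example.

```python
def split_td_contexts(tokens, target_span):
    """Split a sentence for TD-LSTM-style encoding.

    Returns (left_context, right_context) where:
      left_context  -- tokens from the start through the target (for lstm_l),
      right_context -- tokens from the end back to the target (for lstm_r).
    """
    start, end = target_span  # inclusive start index, exclusive end index
    left = tokens[:end]            # read left-to-right, ends at the target
    right = tokens[start:][::-1]   # read right-to-left, ends at the target
    return left, right


if __name__ == "__main__":
    toks = ["the", "new", "phone", "camera", "is", "great"]
    left, right = split_td_contexts(toks, (3, 4))  # target = "camera"
    print(left)   # ['the', 'new', 'phone', 'camera']
    print(right)  # ['great', 'is', 'camera']
```

Both sub-sequences end at the target token, so each LSTM's final hidden state is conditioned on the target, which is the core idea of target-dependent LSTM classification.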