Tap on "From" and enter the city you are travelling from
+
Tap on "To Where" and enter the city you are travelling to
+
Tap on "Departure Date" and from the calendar select your departure date
+
Tap on "Search Buses" button
+
From the list of options, choose a suitable bus based on timings and price
+
Choose the Seats: from the displayed seat map, choose a seat according to your preference. Seats shown in white are available, whereas grey seats are already occupied
+
Now tap on the "Proceed" option at the bottom right corner of the page. The selected seat will be highlighted in green Color
+
Tap the circular checkbox to select your preferred boarding and drop-off points
+
Tap "Proceed"
+
Enter the traveler's full name, age, and contact information in the respective boxes
+
Review your booking and tap "Proceed"
+
Tap on the "Yes I understand. Proceed to Pay"
+
Tap on "Pay Securely"
+
After your payment is successful, you will see a "Booked" status on your booking. You will receive your ticket by email and SMS, or you can download it from this page
+
+
+
+
Step 1.
+
+
Step 2.
+
+
Step 3.
+
+
Step 4.
+
+
Step 5.
+
+
Step 6.
+
+
Step 7.
+
+
Step 8.
+
+
Step 9.
+
+
Step 10.
+
+
Step 11.
+
+
Step 12.
+
+
Step 13.
+
+
Step 14.
+
+
Step 15.
+
+
Step 16.
+
+
+
+
+
+
+
+
+
\ No newline at end of file
diff --git a/Book_Bus_using_Paytm/img/1.jpg b/Book_Bus_using_Paytm/img/1.jpg
new file mode 100644
index 0000000..ec073ee
Binary files /dev/null and b/Book_Bus_using_Paytm/img/1.jpg differ
diff --git a/Book_Bus_using_Paytm/img/10.jpg b/Book_Bus_using_Paytm/img/10.jpg
new file mode 100644
index 0000000..b266fff
Binary files /dev/null and b/Book_Bus_using_Paytm/img/10.jpg differ
diff --git a/Book_Bus_using_Paytm/img/11.jpg b/Book_Bus_using_Paytm/img/11.jpg
new file mode 100644
index 0000000..428a623
Binary files /dev/null and b/Book_Bus_using_Paytm/img/11.jpg differ
diff --git a/Book_Bus_using_Paytm/img/12.jpg b/Book_Bus_using_Paytm/img/12.jpg
new file mode 100644
index 0000000..dba4a80
Binary files /dev/null and b/Book_Bus_using_Paytm/img/12.jpg differ
diff --git a/Book_Bus_using_Paytm/img/13.jpg b/Book_Bus_using_Paytm/img/13.jpg
new file mode 100644
index 0000000..0aa50ec
Binary files /dev/null and b/Book_Bus_using_Paytm/img/13.jpg differ
diff --git a/Book_Bus_using_Paytm/img/14.jpg b/Book_Bus_using_Paytm/img/14.jpg
new file mode 100644
index 0000000..fe8c931
Binary files /dev/null and b/Book_Bus_using_Paytm/img/14.jpg differ
diff --git a/Book_Bus_using_Paytm/img/15.jpg b/Book_Bus_using_Paytm/img/15.jpg
new file mode 100644
index 0000000..7a34c14
Binary files /dev/null and b/Book_Bus_using_Paytm/img/15.jpg differ
diff --git a/Book_Bus_using_Paytm/img/16.jpg b/Book_Bus_using_Paytm/img/16.jpg
new file mode 100644
index 0000000..b3f0ca7
Binary files /dev/null and b/Book_Bus_using_Paytm/img/16.jpg differ
diff --git a/Book_Bus_using_Paytm/img/2.jpg b/Book_Bus_using_Paytm/img/2.jpg
new file mode 100644
index 0000000..547c07e
Binary files /dev/null and b/Book_Bus_using_Paytm/img/2.jpg differ
diff --git a/Book_Bus_using_Paytm/img/3.jpg b/Book_Bus_using_Paytm/img/3.jpg
new file mode 100644
index 0000000..e2ae8f5
Binary files /dev/null and b/Book_Bus_using_Paytm/img/3.jpg differ
diff --git a/Book_Bus_using_Paytm/img/4.jpg b/Book_Bus_using_Paytm/img/4.jpg
new file mode 100644
index 0000000..a246768
Binary files /dev/null and b/Book_Bus_using_Paytm/img/4.jpg differ
diff --git a/Book_Bus_using_Paytm/img/5.jpg b/Book_Bus_using_Paytm/img/5.jpg
new file mode 100644
index 0000000..4632009
Binary files /dev/null and b/Book_Bus_using_Paytm/img/5.jpg differ
diff --git a/Book_Bus_using_Paytm/img/6.jpg b/Book_Bus_using_Paytm/img/6.jpg
new file mode 100644
index 0000000..ac77913
Binary files /dev/null and b/Book_Bus_using_Paytm/img/6.jpg differ
diff --git a/Book_Bus_using_Paytm/img/7.jpg b/Book_Bus_using_Paytm/img/7.jpg
new file mode 100644
index 0000000..00dbd7e
Binary files /dev/null and b/Book_Bus_using_Paytm/img/7.jpg differ
diff --git a/Book_Bus_using_Paytm/img/8.jpg b/Book_Bus_using_Paytm/img/8.jpg
new file mode 100644
index 0000000..2efd4c6
Binary files /dev/null and b/Book_Bus_using_Paytm/img/8.jpg differ
diff --git a/Book_Bus_using_Paytm/img/9.jpg b/Book_Bus_using_Paytm/img/9.jpg
new file mode 100644
index 0000000..e753680
Binary files /dev/null and b/Book_Bus_using_Paytm/img/9.jpg differ
diff --git a/Copy_of_SpanBERT_for_Coreference_Resolution.ipynb b/Copy_of_SpanBERT_for_Coreference_Resolution.ipynb
new file mode 100644
index 0000000..8fec4c3
--- /dev/null
+++ b/Copy_of_SpanBERT_for_Coreference_Resolution.ipynb
@@ -0,0 +1,1169 @@
+{
+ "nbformat": 4,
+ "nbformat_minor": 0,
+ "metadata": {
+ "colab": {
+ "provenance": [],
+ "include_colab_link": true
+ },
+ "kernelspec": {
+ "name": "python3",
+ "display_name": "Python 3"
+ }
+ },
+ "cells": [
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "view-in-github",
+ "colab_type": "text"
+ },
+ "source": [
+ ""
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "H0xPknceFORt"
+ },
+ "source": [
+ "This notebook runs the coreferecne resolution model described in [\"SpanBERT: Improving Pre-training by Representing and Predicting Spans\"](https://arxiv.org/pdf/1907.10529.pdf) by Mandar Joshi, Danqi Chen, Yinhan Liu, Daniel S. Weld, Luke Zettlemoyer, Omer Levy, and released here: https://github.com/mandarjoshi90/coref\n",
+ "\n",
+ "This Colab is by me, Jonathan K. Kummerfeld. My website is www.jkk.name\n",
+ "\n",
+ "Thank you to [Shon Otmazgin](https://github.com/shon-otmazgin) for bugfixes that address software changes since I originally made this colab.\n",
+ "\n",
+ "Note:\n",
+ "- This code does not handle text with multiple speakers, for that you will need to adjust the data preparation process.\n",
+ "- Occasionally I get a bug where either an assertion about the size of the input mask fails or a sequence is being assigned to an array element. It appears to be inconsistent across runs, so I'm not sure what is going on.\n",
+ "- The default model is not the best one. I chose it because it is much faster to download.\n",
+ "\n",
+ "If you have suggestions, please contact me at jkummerf@umich.edu"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "yWXlf3vQKDo8"
+ },
+ "source": [
+ "# Configuration\n",
+ "\n",
+ "First, specify your input. If you are just playing with this, edit the provided text. If you want to run on a larger file:\n",
+ "\n",
+ "1. Upload a file.\n",
+ "2. Set the filename."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "metadata": {
+ "id": "5_8SXJRR-oB_"
+ },
+ "source": [
+ "filename = \"optional-change-to-your-file.txt\"\n",
+ "\n",
+ "text = [\n",
+ "\"Firefly is an American space Western drama television series which ran from 2002-2003, created by writer and director Joss Whedon, under his Mutant Enemy Productions label.\",\n",
+ "\"Whedon served as an executive producer, along with Tim Minear.\",\n",
+ "\"The series is set in the year 2517, after the arrival of humans in a new star system and follows the adventures of the renegade crew of Serenity, a 'Firefly-class' spaceship.\",\n",
+ "\"The ensemble cast portrays the nine characters who live on Serenity.\",\n",
+ "\"Whedon pitched the show as 'nine people looking into the blackness of space and seeing nine different things.'\",\n",
+ "\"The show explores the lives of a group of people, some of whom fought on the losing side of a civil war, who make a living on the fringes of society as part of the pioneer culture of their star system.\",\n",
+ "\"In this future, the only two surviving superpowers, the United States and China, fused to form the central federal government, called the Alliance, resulting in the fusion of the two cultures.\",\n",
+ "\"According to Whedon's vision, 'nothing will change in the future: technology will advance, but we will still have the same political, moral, and ethical problems as today.'\",\n",
+ "\"Firefly premiered in the U.S. on the Fox network on September 20, 2002.\",\n",
+ "\"By mid-December, Firefly had averaged 4.7 million viewers per episode and was 98th in Nielsen ratings.\",\n",
+ "\"It was canceled after 11 of the 14 produced episodes were aired.\",\n",
+ "\"Despite the relatively short life span of the series, it received strong sales when it was released on DVD and has large fan support campaigns.\",\n",
+ "\"It won a Primetime Emmy Award in 2003 for Outstanding Special Visual Effects for a Series.\",\n",
+ "\"TV Guide ranked the series at No. 5 on their 2013 list of 60 shows that were 'Cancelled Too Soon.'\",\n",
+ "\"The post-airing success of the show led Whedon and Universal Pictures to produce Serenity, a 2005 film which continues from the story of the series, and the Firefly franchise expanded to other media, including comics and a role-playing game.\",\n",
+ "]\n",
+ "\n",
+ "if filename != \"optional-change-to-your-file.txt\":\n",
+ " data = [l.strip() for l in open(filename).readlines()]"
+ ],
+ "execution_count": null,
+ "outputs": []
+ },
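+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "For reference, the cell above reads the uploaded file as plain text, one sentence per line. Here is a minimal, hypothetical example (the filename is just a placeholder) of creating such a file:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "metadata": {},
+ "source": [
+ "# Hypothetical example, not part of the original notebook: write a plain-text\n",
+ "# input file with one sentence per line, matching how the configuration cell\n",
+ "# above reads it with readlines().\n",
+ "with open(\"my-document.txt\", \"w\") as f:\n",
+ "    f.write(\"Alice went to the market.\\n\")\n",
+ "    f.write(\"She bought apples there.\\n\")\n",
+ "# Then set filename = \"my-document.txt\" in the cell above and re-run it."
+ ],
+ "execution_count": null,
+ "outputs": []
+ },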
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "MmdXWnG9-ni2"
+ },
+ "source": [
+ "Next, specify the data type and model:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "metadata": {
+ "id": "P2l-YNp8KCvh"
+ },
+ "source": [
+ "genre = \"nw\"\n",
+ "# The Ontonotes data for training the model contains text from several sources\n",
+ "# of very different styles. You need to specify the most suitable one out of:\n",
+ "# \"bc\": broadcast conversation\n",
+ "# \"bn\": broadcast news\n",
+ "# \"mz\": magazine\n",
+ "# \"nw\": newswire\n",
+ "# \"pt\": Bible text\n",
+ "# \"tc\": telephone conversation\n",
+ "# \"wb\": web data\n",
+ "\n",
+ "model_name = \"spanbert_base\"\n",
+ "# The fine-tuned model to use. Options are:\n",
+ "# bert_base\n",
+ "# spanbert_base\n",
+ "# bert_large\n",
+ "# spanbert_large"
+ ],
+ "execution_count": null,
+ "outputs": []
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "bwu-ICg1QUAS"
+ },
+ "source": [
+ "# System Installation\n",
+ "Get the code:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "metadata": {
+ "id": "tlSneeenEwad",
+ "colab": {
+ "base_uri": "https://localhost:8080/"
+ },
+ "outputId": "101c8244-ec59-47c8-938c-dc91eec87a2a"
+ },
+ "source": [
+ "! git clone https://github.com/mandarjoshi90/coref.git\n",
+ "%cd coref"
+ ],
+ "execution_count": null,
+ "outputs": [
+ {
+ "output_type": "stream",
+ "text": [
+ "Cloning into 'coref'...\n",
+ "remote: Enumerating objects: 6, done.\u001b[K\n",
+ "remote: Counting objects: 100% (6/6), done.\u001b[K\n",
+ "remote: Compressing objects: 100% (6/6), done.\u001b[K\n",
+ "remote: Total 734 (delta 2), reused 0 (delta 0), pack-reused 728\u001b[K\n",
+ "Receiving objects: 100% (734/734), 4.17 MiB | 8.64 MiB/s, done.\n",
+ "Resolving deltas: 100% (441/441), done.\n",
+ "/content/coref\n"
+ ],
+ "name": "stdout"
+ }
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "hDVgvzQnMuMG"
+ },
+ "source": [
+ "Temporary hack to fix a requirement (pending pull request)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "metadata": {
+ "id": "hFety44KE6R_"
+ },
+ "source": [
+ "! sed 's/MarkupSafe==1.0/MarkupSafe==1.1.1/; s/scikit-learn==0.19.1/scikit-learn==0.21/; s/scipy==1.0.0/scipy==1.6.2/' < requirements.txt > tmp\n",
+ "! mv tmp requirements.txt\n",
+ "\n",
+ "! sed 's/.D.GLIBCXX.USE.CXX11.ABI.0//' < setup_all.sh > tmp\n",
+ "! mv tmp setup_all.sh\n",
+ "! chmod u+x setup_all.sh"
+ ],
+ "execution_count": null,
+ "outputs": []
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "hrmR50GZMszn"
+ },
+ "source": [
+ "Set some environment variables. The data directory one is used by the system, the other is so we can use the model defined above."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "metadata": {
+ "id": "d4cS1Xf0G2EL"
+ },
+ "source": [
+ "import os\n",
+ "os.environ['data_dir'] = \".\"\n",
+ "os.environ['CHOSEN_MODEL'] = model_name"
+ ],
+ "execution_count": null,
+ "outputs": []
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "a1HQYiHqEx1B"
+ },
+ "source": [
+ "Run Setup. Note, some incompatibility issues do appear, but I still find that everything installs and runs. Also, I specifically request tensorflow 2 and then uninstall it to make sure we've got a clean setup."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "metadata": {
+ "id": "w5n-XspBFHjD",
+ "colab": {
+ "base_uri": "https://localhost:8080/"
+ },
+ "outputId": "f01f81ef-a47b-46b8-d924-f56c9a6bd5c5"
+ },
+ "source": [
+ "%tensorflow_version 2.x\n",
+ "! pip uninstall -y tensorflow\n",
+ "! pip install -r requirements.txt --log install-log.txt -q\n",
+ "! ./setup_all.sh"
+ ],
+ "execution_count": null,
+ "outputs": [
+ {
+ "output_type": "stream",
+ "text": [
+ "Uninstalling tensorflow-2.4.1:\n",
+ " Successfully uninstalled tensorflow-2.4.1\n",
+ "\u001b[K |████████████████████████████████| 102kB 5.6MB/s \n",
+ "\u001b[K |████████████████████████████████| 1.2MB 10.5MB/s \n",
+ "\u001b[K |████████████████████████████████| 163kB 27.1MB/s \n",
+ "\u001b[K |████████████████████████████████| 6.6MB 26.5MB/s \n",
+ "\u001b[K |████████████████████████████████| 552kB 49.5MB/s \n",
+ "\u001b[K |████████████████████████████████| 61kB 7.6MB/s \n",
+ "\u001b[33m WARNING: Retrying (Retry(total=4, connect=None, read=None, redirect=None, status=None)) after connection broken by 'ProtocolError('Connection aborted.', ConnectionResetError(104, 'Connection reset by peer'))': /packages/e5/27/1f908ebb99c8d48a5ba4eb9d7997f5633b920d98fe712f67aaa0663f1307/grpcio-1.23.0-cp37-cp37m-manylinux1_x86_64.whl\u001b[0m\n",
+ "\u001b[K |████████████████████████████████| 2.2MB 8.5MB/s \n",
+ "\u001b[K |████████████████████████████████| 266kB 37.3MB/s \n",
+ "\u001b[K |████████████████████████████████| 890kB 52.4MB/s \n",
+ "\u001b[K |████████████████████████████████| 133kB 54.3MB/s \n",
+ "\u001b[K |████████████████████████████████| 153kB 58.2MB/s \n",
+ "\u001b[K |████████████████████████████████| 51kB 6.5MB/s \n",
+ "\u001b[K |████████████████████████████████| 51kB 6.6MB/s \n",
+ "\u001b[K |████████████████████████████████| 92kB 11.1MB/s \n",
+ "\u001b[K |████████████████████████████████| 20.3MB 1.4MB/s \n",
+ "\u001b[K |████████████████████████████████| 2.1MB 47.5MB/s \n",
+ "\u001b[K |████████████████████████████████| 1.2MB 45.9MB/s \n",
+ "\u001b[K |████████████████████████████████| 430kB 48.1MB/s \n",
+ "\u001b[K |████████████████████████████████| 71kB 9.1MB/s \n",
+ "\u001b[K |████████████████████████████████| 256kB 51.8MB/s \n",
+ "\u001b[K |████████████████████████████████| 71kB 8.9MB/s \n",
+ "\u001b[K |████████████████████████████████| 61kB 7.8MB/s \n",
+ "\u001b[K |████████████████████████████████| 194kB 45.0MB/s \n",
+ "\u001b[K |████████████████████████████████| 512kB 51.7MB/s \n",
+ "\u001b[K |████████████████████████████████| 61kB 7.6MB/s \n",
+ "\u001b[K |████████████████████████████████| 6.7MB 46.6MB/s \n",
+ "\u001b[K |████████████████████████████████| 27.4MB 102kB/s \n",
+ "\u001b[K |████████████████████████████████| 3.2MB 44.7MB/s \n",
+ "\u001b[K |████████████████████████████████| 491kB 47.1MB/s \n",
+ "\u001b[K |████████████████████████████████| 377.1MB 43kB/s \n",
+ "\u001b[K |████████████████████████████████| 748.9MB 23kB/s \n",
+ "\u001b[K |████████████████████████████████| 8.8MB 43.2MB/s \n",
+ "\u001b[K |████████████████████████████████| 327kB 49.5MB/s \n",
+ "\u001b[K |████████████████████████████████| 256kB 48.7MB/s \n",
+ "\u001b[K |████████████████████████████████| 4.1MB 33.1MB/s \n",
+ "\u001b[K |████████████████████████████████| 51kB 6.7MB/s \n",
+ "\u001b[?25h Building wheel for absl-py (setup.py) ... \u001b[?25l\u001b[?25hdone\n",
+ " Building wheel for gast (setup.py) ... \u001b[?25l\u001b[?25hdone\n",
+ " Building wheel for h5py (setup.py) ... \u001b[?25l\u001b[?25hdone\n",
+ " Building wheel for html5lib (setup.py) ... \u001b[?25l\u001b[?25hdone\n",
+ " Building wheel for JPype1 (setup.py) ... \u001b[?25l\u001b[?25hdone\n",
+ " Building wheel for mmh3 (setup.py) ... \u001b[?25l\u001b[?25hdone\n",
+ " Building wheel for msgpack-python (setup.py) ... \u001b[?25l\u001b[?25hdone\n",
+ " Building wheel for psycopg2 (setup.py) ... \u001b[?25l\u001b[?25hdone\n",
+ " Building wheel for pycparser (setup.py) ... \u001b[?25l\u001b[?25hdone\n",
+ " Building wheel for pyhocon (setup.py) ... \u001b[?25l\u001b[?25hdone\n",
+ " Building wheel for wrapt (setup.py) ... \u001b[?25l\u001b[?25hdone\n",
+ " Building wheel for PyYAML (setup.py) ... \u001b[?25l\u001b[?25hdone\n",
+ "\u001b[31mERROR: fancyimpute 0.4.3 requires tensorflow, which is not installed.\u001b[0m\n",
+ "\u001b[31mERROR: umap-learn 0.5.1 has requirement scikit-learn>=0.22, but you'll have scikit-learn 0.21.0 which is incompatible.\u001b[0m\n",
+ "\u001b[31mERROR: torchtext 0.9.0 has requirement torch==1.8.0, but you'll have torch 1.2.0 which is incompatible.\u001b[0m\n",
+ "\u001b[31mERROR: tensorflow-probability 0.12.1 has requirement gast>=0.3.2, but you'll have gast 0.2.2 which is incompatible.\u001b[0m\n",
+ "\u001b[31mERROR: tensorflow-metadata 0.28.0 has requirement absl-py<0.11,>=0.9, but you'll have absl-py 0.7.1 which is incompatible.\u001b[0m\n",
+ "\u001b[31mERROR: pyasn1-modules 0.2.8 has requirement pyasn1<0.5.0,>=0.4.6, but you'll have pyasn1 0.4.2 which is incompatible.\u001b[0m\n",
+ "\u001b[31mERROR: pandas 1.1.5 has requirement python-dateutil>=2.7.3, but you'll have python-dateutil 2.6.1 which is incompatible.\u001b[0m\n",
+ "\u001b[31mERROR: googleapis-common-protos 1.53.0 has requirement protobuf>=3.12.0, but you'll have protobuf 3.9.1 which is incompatible.\u001b[0m\n",
+ "\u001b[31mERROR: google-colab 1.0.0 has requirement astor~=0.8.1, but you'll have astor 0.8.0 which is incompatible.\u001b[0m\n",
+ "\u001b[31mERROR: google-colab 1.0.0 has requirement six~=1.15.0, but you'll have six 1.12.0 which is incompatible.\u001b[0m\n",
+ "\u001b[31mERROR: google-api-python-client 1.12.8 has requirement six<2dev,>=1.13.0, but you'll have six 1.12.0 which is incompatible.\u001b[0m\n",
+ "\u001b[31mERROR: google-api-core 1.26.1 has requirement protobuf>=3.12.0, but you'll have protobuf 3.9.1 which is incompatible.\u001b[0m\n",
+ "\u001b[31mERROR: google-api-core 1.26.1 has requirement six>=1.13.0, but you'll have six 1.12.0 which is incompatible.\u001b[0m\n",
+ "\u001b[31mERROR: flask 1.1.2 has requirement Jinja2>=2.10.1, but you'll have jinja2 2.10 which is incompatible.\u001b[0m\n",
+ "\u001b[31mERROR: flask 1.1.2 has requirement Werkzeug>=0.15, but you'll have werkzeug 0.14.1 which is incompatible.\u001b[0m\n",
+ "\u001b[31mERROR: fbprophet 0.7.1 has requirement python-dateutil>=2.8.0, but you'll have python-dateutil 2.6.1 which is incompatible.\u001b[0m\n",
+ "\u001b[31mERROR: datascience 0.10.6 has requirement folium==0.2.1, but you'll have folium 0.8.3 which is incompatible.\u001b[0m\n",
+ "\u001b[31mERROR: albumentations 0.1.12 has requirement imgaug<0.2.7,>=0.2.5, but you'll have imgaug 0.2.9 which is incompatible.\u001b[0m\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/dtypes.py:516: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " _np_qint8 = np.dtype([(\"qint8\", np.int8, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/dtypes.py:517: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " _np_quint8 = np.dtype([(\"quint8\", np.uint8, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/dtypes.py:518: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " _np_qint16 = np.dtype([(\"qint16\", np.int16, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/dtypes.py:519: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " _np_quint16 = np.dtype([(\"quint16\", np.uint16, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/dtypes.py:520: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " _np_qint32 = np.dtype([(\"qint32\", np.int32, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/dtypes.py:525: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " np_resource = np.dtype([(\"resource\", np.ubyte, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/h5py/__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.\n",
+ " from ._conv import register_converters as _register_converters\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorboard/compat/tensorflow_stub/dtypes.py:541: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " _np_qint8 = np.dtype([(\"qint8\", np.int8, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorboard/compat/tensorflow_stub/dtypes.py:542: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " _np_quint8 = np.dtype([(\"quint8\", np.uint8, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorboard/compat/tensorflow_stub/dtypes.py:543: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " _np_qint16 = np.dtype([(\"qint16\", np.int16, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorboard/compat/tensorflow_stub/dtypes.py:544: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " _np_quint16 = np.dtype([(\"quint16\", np.uint16, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorboard/compat/tensorflow_stub/dtypes.py:545: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " _np_qint32 = np.dtype([(\"qint32\", np.int32, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorboard/compat/tensorflow_stub/dtypes.py:550: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " np_resource = np.dtype([(\"resource\", np.ubyte, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/dtypes.py:516: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " _np_qint8 = np.dtype([(\"qint8\", np.int8, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/dtypes.py:517: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " _np_quint8 = np.dtype([(\"quint8\", np.uint8, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/dtypes.py:518: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " _np_qint16 = np.dtype([(\"qint16\", np.int16, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/dtypes.py:519: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " _np_quint16 = np.dtype([(\"quint16\", np.uint16, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/dtypes.py:520: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " _np_qint32 = np.dtype([(\"qint32\", np.int32, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/dtypes.py:525: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " np_resource = np.dtype([(\"resource\", np.ubyte, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/h5py/__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.\n",
+ " from ._conv import register_converters as _register_converters\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorboard/compat/tensorflow_stub/dtypes.py:541: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " _np_qint8 = np.dtype([(\"qint8\", np.int8, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorboard/compat/tensorflow_stub/dtypes.py:542: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " _np_quint8 = np.dtype([(\"quint8\", np.uint8, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorboard/compat/tensorflow_stub/dtypes.py:543: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " _np_qint16 = np.dtype([(\"qint16\", np.int16, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorboard/compat/tensorflow_stub/dtypes.py:544: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " _np_quint16 = np.dtype([(\"quint16\", np.uint16, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorboard/compat/tensorflow_stub/dtypes.py:545: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " _np_qint32 = np.dtype([(\"qint32\", np.int32, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorboard/compat/tensorflow_stub/dtypes.py:550: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " np_resource = np.dtype([(\"resource\", np.ubyte, 1)])\n"
+ ],
+ "name": "stdout"
+ }
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "knVqV4uHMyc3"
+ },
+ "source": [
+ "Get the finetuned BERT model specified above."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "metadata": {
+ "id": "WNETVn_RMxVV",
+ "colab": {
+ "base_uri": "https://localhost:8080/"
+ },
+ "outputId": "369e3838-0849-45e1-fe42-3d5a98e814bb"
+ },
+ "source": [
+ "! ./download_pretrained.sh $CHOSEN_MODEL"
+ ],
+ "execution_count": null,
+ "outputs": [
+ {
+ "output_type": "stream",
+ "text": [
+ "Downloading spanbert_base\n",
+ "--2021-03-30 14:28:26-- http://nlp.cs.washington.edu/pair2vec/spanbert_base.tar.gz\n",
+ "Resolving nlp.cs.washington.edu (nlp.cs.washington.edu)... 128.208.3.120, 2607:4000:200:12::78\n",
+ "Connecting to nlp.cs.washington.edu (nlp.cs.washington.edu)|128.208.3.120|:80... connected.\n",
+ "HTTP request sent, awaiting response... 200 OK\n",
+ "Length: 1633726311 (1.5G) [application/x-gzip]\n",
+ "Saving to: ‘./spanbert_base.tar.gz’\n",
+ "\n",
+ "spanbert_base.tar.g 100%[===================>] 1.52G 68.5MB/s in 38s \n",
+ "\n",
+ "2021-03-30 14:29:04 (41.2 MB/s) - ‘./spanbert_base.tar.gz’ saved [1633726311/1633726311]\n",
+ "\n",
+ "spanbert_base/\n",
+ "spanbert_base/checkpoint\n",
+ "spanbert_base/model.max.ckpt.index\n",
+ "spanbert_base/stdout.log\n",
+ "spanbert_base/bert_config.json\n",
+ "spanbert_base/vocab.txt\n",
+ "spanbert_base/model.max.ckpt.data-00000-of-00001\n",
+ "spanbert_base/events.out.tfevents.1561596094.learnfair1413\n"
+ ],
+ "name": "stdout"
+ }
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "EucqOJQ6QZuG"
+ },
+ "source": [
+ "# Data Preparation and Prediction\n",
+ "\n",
+ "Process the data to be in the required input format."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "metadata": {
+ "id": "q0jLV2_sHC7e",
+ "colab": {
+ "base_uri": "https://localhost:8080/"
+ },
+ "outputId": "d78ccffd-f8b9-4e8d-a452-711bdd3e66be"
+ },
+ "source": [
+ "from bert import tokenization\n",
+ "import json\n",
+ "\n",
+ "data = {\n",
+ " 'doc_key': genre,\n",
+ " 'sentences': [[\"[CLS]\"]],\n",
+ " 'speakers': [[\"[SPL]\"]],\n",
+ " 'clusters': [],\n",
+ " 'sentence_map': [0],\n",
+ " 'subtoken_map': [0],\n",
+ "}\n",
+ "\n",
+ "# Determine Max Segment\n",
+ "max_segment = None\n",
+ "for line in open('experiments.conf'):\n",
+ " if line.startswith(model_name):\n",
+ " max_segment = True\n",
+ " elif line.strip().startswith(\"max_segment_len\"):\n",
+ " if max_segment:\n",
+ " max_segment = int(line.strip().split()[-1])\n",
+ " break\n",
+ "\n",
+ "tokenizer = tokenization.FullTokenizer(vocab_file=\"cased_config_vocab/vocab.txt\", do_lower_case=False)\n",
+ "subtoken_num = 0\n",
+ "for sent_num, line in enumerate(text):\n",
+ " raw_tokens = line.split()\n",
+ " tokens = tokenizer.tokenize(line)\n",
+ " if len(tokens) + len(data['sentences'][-1]) >= max_segment:\n",
+ " data['sentences'][-1].append(\"[SEP]\")\n",
+ " data['sentences'].append([\"[CLS]\"])\n",
+ " data['speakers'][-1].append(\"[SPL]\")\n",
+ " data['speakers'].append([\"[SPL]\"])\n",
+ " data['sentence_map'].append(sent_num - 1)\n",
+ " data['subtoken_map'].append(subtoken_num - 1)\n",
+ " data['sentence_map'].append(sent_num)\n",
+ " data['subtoken_map'].append(subtoken_num)\n",
+ "\n",
+ " ctoken = raw_tokens[0]\n",
+ " cpos = 0\n",
+ " for token in tokens:\n",
+ " data['sentences'][-1].append(token)\n",
+ " data['speakers'][-1].append(\"-\")\n",
+ " data['sentence_map'].append(sent_num)\n",
+ " data['subtoken_map'].append(subtoken_num)\n",
+ "\n",
+ " if token.startswith(\"##\"):\n",
+ " token = token[2:]\n",
+ " if len(ctoken) == len(token):\n",
+ " subtoken_num += 1\n",
+ " cpos += 1\n",
+ " if cpos < len(raw_tokens):\n",
+ " ctoken = raw_tokens[cpos]\n",
+ " else:\n",
+ " ctoken = ctoken[len(token):]\n",
+ "\n",
+ "data['sentences'][-1].append(\"[SEP]\")\n",
+ "data['speakers'][-1].append(\"[SPL]\")\n",
+ "data['sentence_map'].append(sent_num - 1)\n",
+ "data['subtoken_map'].append(subtoken_num - 1)\n",
+ "\n",
+ "with open(\"sample.in.json\", 'w') as out:\n",
+ " json.dump(data, out, sort_keys=True)\n",
+ "\n",
+ "! cat sample.in.json"
+ ],
+ "execution_count": null,
+ "outputs": [
+ {
+ "output_type": "stream",
+ "text": [
+ "/usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/dtypes.py:516: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " _np_qint8 = np.dtype([(\"qint8\", np.int8, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/dtypes.py:517: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " _np_quint8 = np.dtype([(\"quint8\", np.uint8, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/dtypes.py:518: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " _np_qint16 = np.dtype([(\"qint16\", np.int16, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/dtypes.py:519: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " _np_quint16 = np.dtype([(\"quint16\", np.uint16, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/dtypes.py:520: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " _np_qint32 = np.dtype([(\"qint32\", np.int32, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/dtypes.py:525: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " np_resource = np.dtype([(\"resource\", np.ubyte, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorboard/compat/tensorflow_stub/dtypes.py:541: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " _np_qint8 = np.dtype([(\"qint8\", np.int8, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorboard/compat/tensorflow_stub/dtypes.py:542: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " _np_quint8 = np.dtype([(\"quint8\", np.uint8, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorboard/compat/tensorflow_stub/dtypes.py:543: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " _np_qint16 = np.dtype([(\"qint16\", np.int16, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorboard/compat/tensorflow_stub/dtypes.py:544: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " _np_quint16 = np.dtype([(\"quint16\", np.uint16, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorboard/compat/tensorflow_stub/dtypes.py:545: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " _np_qint32 = np.dtype([(\"qint32\", np.int32, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorboard/compat/tensorflow_stub/dtypes.py:550: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " np_resource = np.dtype([(\"resource\", np.ubyte, 1)])\n",
+ "WARNING: Logging before flag parsing goes to stderr.\n",
+ "W0330 14:29:28.621733 140693483616128 deprecation_wrapper.py:119] From /content/coref/bert/tokenization.py:125: The name tf.gfile.GFile is deprecated. Please use tf.io.gfile.GFile instead.\n",
+ "\n"
+ ],
+ "name": "stderr"
+ },
+ {
+ "output_type": "stream",
+ "text": [
+ "{\"clusters\": [], \"doc_key\": \"nw\", \"sentence_map\": [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 13], \"sentences\": [[\"[CLS]\", \"Fire\", \"##fly\", \"is\", \"an\", \"American\", \"space\", \"Western\", \"drama\", \"television\", \"series\", \"which\", \"ran\", \"from\", \"2002\", \"-\", \"2003\", \",\", \"created\", \"by\", \"writer\", \"and\", \"director\", \"Jo\", \"##ss\", \"W\", \"##hed\", \"##on\", \",\", \"under\", \"his\", \"Mu\", \"##tant\", \"Enemy\", \"Productions\", \"label\", \".\", \"W\", \"##hed\", \"##on\", \"served\", \"as\", \"an\", \"executive\", \"producer\", \",\", \"along\", \"with\", \"Tim\", \"Mine\", \"##ar\", \".\", \"The\", \"series\", \"is\", \"set\", \"in\", \"the\", \"year\", \"251\", \"##7\", \",\", \"after\", \"the\", \"arrival\", \"of\", \"humans\", \"in\", \"a\", \"new\", \"star\", \"system\", \"and\", \"follows\", \"the\", \"adventures\", \"of\", \"the\", \"re\", \"##ne\", \"##gade\", \"crew\", \"of\", \"Ser\", \"##eni\", \"##ty\", \",\", \"a\", \"'\", \"Fire\", \"##fly\", \"-\", \"class\", \"'\", \"spaces\", \"##hip\", \".\", \"The\", \"ensemble\", \"cast\", \"portrays\", \"the\", \"nine\", \"characters\", \"who\", \"live\", \"on\", \"Ser\", \"##eni\", \"##ty\", \".\", \"W\", \"##hed\", \"##on\", \"pitched\", \"the\", \"show\", \"as\", \"'\", \"nine\", \"people\", \"looking\", \"into\", \"the\", \"blackness\", \"of\", \"space\", \"and\", \"seeing\", \"nine\", \"different\", \"things\", \".\", \"'\", \"The\", \"show\", \"explores\", \"the\", \"lives\", \"of\", \"a\", \"group\", \"of\", \"people\", \",\", \"some\", \"of\", \"whom\", \"fought\", \"on\", \"the\", \"losing\", \"side\", \"of\", \"a\", \"civil\", \"war\", \",\", \"who\", \"make\", \"a\", \"living\", \"on\", \"the\", \"fringe\", \"##s\", \"of\", \"society\", \"as\", \"part\", \"of\", \"the\", \"pioneer\", \"culture\", \"of\", \"their\", \"star\", \"system\", \".\", \"In\", \"this\", \"future\", \",\", \"the\", \"only\", \"two\", \"surviving\", \"super\", \"##power\", \"##s\", \",\", \"the\", \"United\", \"States\", \"and\", \"China\", \",\", \"fused\", \"to\", \"form\", \"the\", \"central\", \"federal\", 
\"government\", \",\", \"called\", \"the\", \"Alliance\", \",\", \"resulting\", \"in\", \"the\", \"fusion\", \"of\", \"the\", \"two\", \"cultures\", \".\", \"According\", \"to\", \"W\", \"##hed\", \"##on\", \"'\", \"s\", \"vision\", \",\", \"'\", \"nothing\", \"will\", \"change\", \"in\", \"the\", \"future\", \":\", \"technology\", \"will\", \"advance\", \",\", \"but\", \"we\", \"will\", \"still\", \"have\", \"the\", \"same\", \"political\", \",\", \"moral\", \",\", \"and\", \"ethical\", \"problems\", \"as\", \"today\", \".\", \"'\", \"Fire\", \"##fly\", \"premiered\", \"in\", \"the\", \"U\", \".\", \"S\", \".\", \"on\", \"the\", \"Fox\", \"network\", \"on\", \"September\", \"20\", \",\", \"2002\", \".\", \"By\", \"mid\", \"-\", \"December\", \",\", \"Fire\", \"##fly\", \"had\", \"averaged\", \"4\", \".\", \"7\", \"million\", \"viewers\", \"per\", \"episode\", \"and\", \"was\", \"98\", \"##th\", \"in\", \"Nielsen\", \"ratings\", \".\", \"It\", \"was\", \"canceled\", \"after\", \"11\", \"of\", \"the\", \"14\", \"produced\", \"episodes\", \"were\", \"aired\", \".\", \"Despite\", \"the\", \"relatively\", \"short\", \"life\", \"span\", \"of\", \"the\", \"series\", \",\", \"it\", \"received\", \"strong\", \"sales\", \"when\", \"it\", \"was\", \"released\", \"on\", \"DVD\", \"and\", \"has\", \"large\", \"fan\", \"support\", \"campaigns\", \".\", \"It\", \"won\", \"a\", \"Prime\", \"##time\", \"Emmy\", \"Award\", \"in\", \"2003\", \"for\", \"Outstanding\", \"Special\", \"Visual\", \"Effects\", \"for\", \"a\", \"Series\", \".\", \"[SEP]\"], [\"[CLS]\", \"TV\", \"Guide\", \"ranked\", \"the\", \"series\", \"at\", \"No\", \".\", \"5\", \"on\", \"their\", \"2013\", \"list\", \"of\", \"60\", \"shows\", \"that\", \"were\", \"'\", \"Can\", \"##cell\", \"##ed\", \"Too\", \"Soon\", \".\", \"'\", \"The\", \"post\", \"-\", \"airing\", \"success\", \"of\", \"the\", \"show\", \"led\", \"W\", \"##hed\", \"##on\", \"and\", \"Universal\", \"Pictures\", \"to\", \"produce\", \"Ser\", \"##eni\", \"##ty\", \",\", \"a\", \"2005\", \"film\", \"which\", \"continues\", \"from\", \"the\", \"story\", \"of\", \"the\", \"series\", \",\", \"and\", \"the\", \"Fire\", \"##fly\", \"franchise\", \"expanded\", \"to\", \"other\", \"media\", \",\", \"including\", \"comics\", \"and\", \"a\", \"role\", \"-\", \"playing\", \"game\", \".\", \"[SEP]\"]], \"speakers\": [[\"[SPL]\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", 
\"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"[SPL]\"], [\"[SPL]\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"-\", \"[SPL]\"]], \"subtoken_map\": [0, 0, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 12, 12, 12, 13, 14, 15, 16, 17, 18, 18, 19, 19, 19, 19, 20, 21, 22, 22, 23, 24, 25, 25, 26, 26, 26, 27, 28, 29, 30, 31, 31, 32, 33, 34, 35, 35, 35, 36, 37, 38, 39, 40, 41, 42, 43, 43, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 60, 60, 61, 62, 63, 63, 63, 63, 64, 65, 65, 65, 65, 65, 65, 66, 66, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 77, 77, 77, 78, 78, 78, 79, 80, 81, 82, 83, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 95, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117, 117, 118, 119, 120, 121, 122, 123, 124, 124, 125, 126, 127, 128, 129, 130, 131, 132, 133, 134, 135, 136, 136, 137, 138, 139, 139, 140, 141, 142, 143, 144, 144, 144, 144, 145, 146, 147, 148, 149, 149, 150, 151, 152, 153, 154, 155, 156, 156, 157, 158, 159, 159, 160, 161, 162, 163, 164, 165, 166, 167, 167, 168, 169, 170, 170, 170, 170, 170, 171, 171, 172, 172, 173, 174, 175, 176, 177, 177, 178, 179, 180, 180, 181, 182, 183, 184, 185, 186, 187, 188, 188, 189, 189, 190, 191, 192, 193, 194, 194, 194, 195, 195, 196, 197, 198, 199, 199, 199, 199, 200, 201, 202, 203, 204, 205, 206, 206, 207, 207, 208, 209, 209, 209, 209, 210, 210, 211, 212, 213, 213, 213, 214, 215, 216, 217, 218, 219, 220, 220, 221, 222, 223, 223, 224, 225, 226, 227, 228, 229, 230, 231, 232, 233, 234, 235, 235, 236, 237, 238, 239, 240, 241, 242, 243, 244, 244, 245, 246, 247, 248, 249, 250, 251, 252, 253, 254, 255, 256, 257, 258, 259, 260, 260, 261, 262, 263, 264, 264, 265, 266, 267, 268, 269, 270, 271, 272, 273, 274, 275, 
276, 276, 276, 277, 277, 278, 279, 280, 281, 282, 283, 283, 284, 285, 286, 287, 288, 289, 290, 291, 292, 293, 294, 294, 294, 294, 295, 296, 296, 296, 297, 298, 298, 298, 299, 300, 301, 302, 303, 304, 304, 304, 305, 306, 307, 308, 309, 310, 310, 310, 310, 311, 312, 313, 314, 315, 316, 317, 318, 319, 320, 321, 321, 322, 323, 324, 324, 325, 326, 327, 328, 329, 329, 330, 331, 332, 333, 334, 334, 334, 335, 335, 335]}"
+ ],
+ "name": "stdout"
+ }
+ ]
+ },
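+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "As noted at the top, this notebook does not handle text with multiple speakers. The cell below is a minimal, untested sketch (assuming input lines of the form \"SPEAKER: utterance\"; speaker_text and current_speaker are placeholder names) of how per-token speaker labels could be threaded through the data preparation loop above in place of the \"-\" placeholder:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "metadata": {},
+ "source": [
+ "# Hedged sketch, not from the original notebook: split a leading speaker name\n",
+ "# off each input line so it can be used as that line's speaker label.\n",
+ "speaker_text = []\n",
+ "for line in text:\n",
+ "    name, sep, utterance = line.partition(\":\")\n",
+ "    if sep:\n",
+ "        speaker_text.append((name.strip(), utterance.strip()))\n",
+ "    else:\n",
+ "        speaker_text.append((\"-\", line))  # no speaker marker on this line\n",
+ "\n",
+ "# In the data preparation loop above, iterate over speaker_text instead of\n",
+ "# text and replace\n",
+ "#     data['speakers'][-1].append(\"-\")\n",
+ "# with\n",
+ "#     data['speakers'][-1].append(current_speaker)\n",
+ "# where current_speaker is the name recovered for the current line."
+ ],
+ "execution_count": null,
+ "outputs": []
+ },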
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "iM3yxYqiz-8m"
+ },
+ "source": [
+ "Run Prediction"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "metadata": {
+ "id": "1K_Z0h-LS8od",
+ "colab": {
+ "base_uri": "https://localhost:8080/"
+ },
+ "outputId": "e58306d8-acd5-428c-983c-a9699b3e0d0f"
+ },
+ "source": [
+ "! GPU=0 python predict.py $CHOSEN_MODEL sample.in.json sample.out.txt"
+ ],
+ "execution_count": null,
+ "outputs": [
+ {
+ "output_type": "stream",
+ "text": [
+ "/usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/dtypes.py:516: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " _np_qint8 = np.dtype([(\"qint8\", np.int8, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/dtypes.py:517: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " _np_quint8 = np.dtype([(\"quint8\", np.uint8, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/dtypes.py:518: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " _np_qint16 = np.dtype([(\"qint16\", np.int16, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/dtypes.py:519: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " _np_quint16 = np.dtype([(\"quint16\", np.uint16, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/dtypes.py:520: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " _np_qint32 = np.dtype([(\"qint32\", np.int32, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/dtypes.py:525: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " np_resource = np.dtype([(\"resource\", np.ubyte, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/h5py/__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.\n",
+ " from ._conv import register_converters as _register_converters\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorboard/compat/tensorflow_stub/dtypes.py:541: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " _np_qint8 = np.dtype([(\"qint8\", np.int8, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorboard/compat/tensorflow_stub/dtypes.py:542: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " _np_quint8 = np.dtype([(\"quint8\", np.uint8, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorboard/compat/tensorflow_stub/dtypes.py:543: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " _np_qint16 = np.dtype([(\"qint16\", np.int16, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorboard/compat/tensorflow_stub/dtypes.py:544: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " _np_quint16 = np.dtype([(\"quint16\", np.uint16, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorboard/compat/tensorflow_stub/dtypes.py:545: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " _np_qint32 = np.dtype([(\"qint32\", np.int32, 1)])\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorboard/compat/tensorflow_stub/dtypes.py:550: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n",
+ " np_resource = np.dtype([(\"resource\", np.ubyte, 1)])\n",
+ "WARNING: Logging before flag parsing goes to stderr.\n",
+ "W0330 14:29:30.837061 140695779342208 deprecation_wrapper.py:119] From /content/coref/coref_ops.py:11: The name tf.NotDifferentiable is deprecated. Please use tf.no_gradient instead.\n",
+ "\n",
+ "/usr/local/lib/python3.7/dist-packages/sklearn/utils/linear_assignment_.py:21: DeprecationWarning: The linear_assignment_ module is deprecated in 0.21 and will be removed from 0.23. Use scipy.optimize.linear_sum_assignment instead.\n",
+ " DeprecationWarning)\n",
+ "W0330 14:29:30.949380 140695779342208 deprecation_wrapper.py:119] From /content/coref/bert/optimization.py:87: The name tf.train.Optimizer is deprecated. Please use tf.compat.v1.train.Optimizer instead.\n",
+ "\n",
+ "W0330 14:29:33.780439 140695779342208 lazy_loader.py:50] \n",
+ "The TensorFlow contrib module will not be included in TensorFlow 2.0.\n",
+ "For more information, please see:\n",
+ " * https://github.com/tensorflow/community/blob/master/rfcs/20180907-contrib-sunset.md\n",
+ " * https://github.com/tensorflow/addons\n",
+ " * https://github.com/tensorflow/io (for I/O related ops)\n",
+ "If you depend on functionality not listed there, please file an issue.\n",
+ "\n",
+ "Setting CUDA_VISIBLE_DEVICES to: 0\n",
+ "Running experiment: spanbert_base\n",
+ "data_dir = \"/sdb/data/new_coref\"\n",
+ "model_type = \"independent\"\n",
+ "max_top_antecedents = 50\n",
+ "max_training_sentences = 3\n",
+ "top_span_ratio = 0.4\n",
+ "max_num_speakers = 20\n",
+ "max_segment_len = 384\n",
+ "bert_learning_rate = 2e-05\n",
+ "task_learning_rate = 0.0001\n",
+ "num_docs = 2802\n",
+ "dropout_rate = 0.3\n",
+ "ffnn_size = 3000\n",
+ "ffnn_depth = 1\n",
+ "num_epochs = 20\n",
+ "feature_size = 20\n",
+ "max_span_width = 30\n",
+ "use_metadata = true\n",
+ "use_features = true\n",
+ "use_segment_distance = true\n",
+ "model_heads = true\n",
+ "coref_depth = 2\n",
+ "coarse_to_fine = true\n",
+ "fine_grained = true\n",
+ "use_prior = true\n",
+ "train_path = \"./train.english.384.jsonlines\"\n",
+ "eval_path = \"./dev.english.384.jsonlines\"\n",
+ "conll_eval_path = \"./dev.english.v4_gold_conll\"\n",
+ "single_example = true\n",
+ "genres = [\n",
+ " \"bc\"\n",
+ " \"bn\"\n",
+ " \"mz\"\n",
+ " \"nw\"\n",
+ " \"pt\"\n",
+ " \"tc\"\n",
+ " \"wb\"\n",
+ "]\n",
+ "eval_frequency = 1000\n",
+ "report_frequency = 100\n",
+ "log_root = \".\"\n",
+ "adam_eps = 1e-06\n",
+ "task_optimizer = \"adam\"\n",
+ "bert_config_file = \"./spanbert_base/bert_config.json\"\n",
+ "vocab_file = \"./spanbert_base/vocab.txt\"\n",
+ "tf_checkpoint = \"./spanbert_base/model.max.ckpt\"\n",
+ "init_checkpoint = \"./spanbert_base/model.max.ckpt\"\n",
+ "log_dir = \"./spanbert_base\"\n",
+ "W0330 14:29:33.854637 140695779342208 deprecation_wrapper.py:119] From /content/coref/bert/modeling.py:92: The name tf.gfile.GFile is deprecated. Please use tf.io.gfile.GFile instead.\n",
+ "\n",
+ "W0330 14:29:33.938644 140695779342208 deprecation_wrapper.py:119] From /content/coref/independent.py:48: The name tf.placeholder is deprecated. Please use tf.compat.v1.placeholder instead.\n",
+ "\n",
+ "W0330 14:29:33.949810 140695779342208 deprecation_wrapper.py:119] From /content/coref/independent.py:50: The name tf.PaddingFIFOQueue is deprecated. Please use tf.queue.PaddingFIFOQueue instead.\n",
+ "\n",
+ "W0330 14:29:33.953515 140695779342208 deprecation.py:323] From /content/coref/bert/modeling.py:158: to_float (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version.\n",
+ "Instructions for updating:\n",
+ "Use `tf.cast` instead.\n",
+ "W0330 14:29:33.973174 140695779342208 deprecation_wrapper.py:119] From /content/coref/bert/modeling.py:175: The name tf.variable_scope is deprecated. Please use tf.compat.v1.variable_scope instead.\n",
+ "\n",
+ "W0330 14:29:33.973397 140695779342208 deprecation_wrapper.py:119] From /content/coref/bert/modeling.py:175: The name tf.AUTO_REUSE is deprecated. Please use tf.compat.v1.AUTO_REUSE instead.\n",
+ "\n",
+ "W0330 14:29:34.053309 140695779342208 deprecation.py:506] From /content/coref/bert/modeling.py:362: calling dropout (from tensorflow.python.ops.nn_ops) with keep_prob is deprecated and will be removed in a future version.\n",
+ "Instructions for updating:\n",
+ "Please use `rate` instead of `keep_prob`. Rate should be set to `rate = 1 - keep_prob`.\n",
+ "W0330 14:29:34.094469 140695779342208 deprecation.py:323] From /content/coref/bert/modeling.py:676: dense (from tensorflow.python.layers.core) is deprecated and will be removed in a future version.\n",
+ "Instructions for updating:\n",
+ "Use keras.layers.dense instead.\n",
+ "W0330 14:29:36.932627 140695779342208 deprecation.py:323] From /usr/local/lib/python3.7/dist-packages/tensorflow/python/ops/array_ops.py:1354: add_dispatch_support..wrapper (from tensorflow.python.ops.array_ops) is deprecated and will be removed in a future version.\n",
+ "Instructions for updating:\n",
+ "Use tf.where in 2.0, which has the same broadcast rule as np.where\n",
+ "W0330 14:29:37.013033 140695779342208 deprecation.py:323] From /content/coref/independent.py:221: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version.\n",
+ "Instructions for updating:\n",
+ "Use `tf.cast` instead.\n",
+ "W0330 14:29:37.064145 140695779342208 deprecation_wrapper.py:119] From /content/coref/util.py:117: The name tf.nn.xw_plus_b is deprecated. Please use tf.compat.v1.nn.xw_plus_b instead.\n",
+ "\n",
+ "W0330 14:29:37.640215 140695779342208 deprecation_wrapper.py:119] From /content/coref/independent.py:60: The name tf.train.init_from_checkpoint is deprecated. Please use tf.compat.v1.train.init_from_checkpoint instead.\n",
+ "\n",
+ "**** Trainable Variables ****\n",
+ " name = bert/embeddings/word_embeddings:0, shape = (28996, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/embeddings/token_type_embeddings:0, shape = (2, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/embeddings/position_embeddings:0, shape = (512, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/embeddings/LayerNorm/beta:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/embeddings/LayerNorm/gamma:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_0/attention/self/query/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_0/attention/self/query/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_0/attention/self/key/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_0/attention/self/key/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_0/attention/self/value/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_0/attention/self/value/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_0/attention/output/dense/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_0/attention/output/dense/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_0/attention/output/LayerNorm/beta:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_0/attention/output/LayerNorm/gamma:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_0/intermediate/dense/kernel:0, shape = (768, 3072), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_0/intermediate/dense/bias:0, shape = (3072,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_0/output/dense/kernel:0, shape = (3072, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_0/output/dense/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_0/output/LayerNorm/beta:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_0/output/LayerNorm/gamma:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_1/attention/self/query/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_1/attention/self/query/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_1/attention/self/key/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_1/attention/self/key/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_1/attention/self/value/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_1/attention/self/value/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_1/attention/output/dense/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_1/attention/output/dense/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_1/attention/output/LayerNorm/beta:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_1/attention/output/LayerNorm/gamma:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_1/intermediate/dense/kernel:0, shape = (768, 3072), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_1/intermediate/dense/bias:0, shape = (3072,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_1/output/dense/kernel:0, shape = (3072, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_1/output/dense/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_1/output/LayerNorm/beta:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_1/output/LayerNorm/gamma:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_2/attention/self/query/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_2/attention/self/query/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_2/attention/self/key/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_2/attention/self/key/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_2/attention/self/value/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_2/attention/self/value/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_2/attention/output/dense/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_2/attention/output/dense/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_2/attention/output/LayerNorm/beta:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_2/attention/output/LayerNorm/gamma:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_2/intermediate/dense/kernel:0, shape = (768, 3072), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_2/intermediate/dense/bias:0, shape = (3072,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_2/output/dense/kernel:0, shape = (3072, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_2/output/dense/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_2/output/LayerNorm/beta:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_2/output/LayerNorm/gamma:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_3/attention/self/query/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_3/attention/self/query/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_3/attention/self/key/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_3/attention/self/key/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_3/attention/self/value/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_3/attention/self/value/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_3/attention/output/dense/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_3/attention/output/dense/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_3/attention/output/LayerNorm/beta:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_3/attention/output/LayerNorm/gamma:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_3/intermediate/dense/kernel:0, shape = (768, 3072), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_3/intermediate/dense/bias:0, shape = (3072,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_3/output/dense/kernel:0, shape = (3072, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_3/output/dense/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_3/output/LayerNorm/beta:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_3/output/LayerNorm/gamma:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_4/attention/self/query/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_4/attention/self/query/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_4/attention/self/key/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_4/attention/self/key/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_4/attention/self/value/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_4/attention/self/value/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_4/attention/output/dense/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_4/attention/output/dense/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_4/attention/output/LayerNorm/beta:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_4/attention/output/LayerNorm/gamma:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_4/intermediate/dense/kernel:0, shape = (768, 3072), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_4/intermediate/dense/bias:0, shape = (3072,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_4/output/dense/kernel:0, shape = (3072, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_4/output/dense/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_4/output/LayerNorm/beta:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_4/output/LayerNorm/gamma:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_5/attention/self/query/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_5/attention/self/query/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_5/attention/self/key/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_5/attention/self/key/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_5/attention/self/value/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_5/attention/self/value/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_5/attention/output/dense/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_5/attention/output/dense/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_5/attention/output/LayerNorm/beta:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_5/attention/output/LayerNorm/gamma:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_5/intermediate/dense/kernel:0, shape = (768, 3072), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_5/intermediate/dense/bias:0, shape = (3072,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_5/output/dense/kernel:0, shape = (3072, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_5/output/dense/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_5/output/LayerNorm/beta:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_5/output/LayerNorm/gamma:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_6/attention/self/query/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_6/attention/self/query/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_6/attention/self/key/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_6/attention/self/key/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_6/attention/self/value/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_6/attention/self/value/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_6/attention/output/dense/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_6/attention/output/dense/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_6/attention/output/LayerNorm/beta:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_6/attention/output/LayerNorm/gamma:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_6/intermediate/dense/kernel:0, shape = (768, 3072), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_6/intermediate/dense/bias:0, shape = (3072,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_6/output/dense/kernel:0, shape = (3072, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_6/output/dense/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_6/output/LayerNorm/beta:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_6/output/LayerNorm/gamma:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_7/attention/self/query/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_7/attention/self/query/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_7/attention/self/key/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_7/attention/self/key/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_7/attention/self/value/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_7/attention/self/value/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_7/attention/output/dense/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_7/attention/output/dense/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_7/attention/output/LayerNorm/beta:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_7/attention/output/LayerNorm/gamma:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_7/intermediate/dense/kernel:0, shape = (768, 3072), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_7/intermediate/dense/bias:0, shape = (3072,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_7/output/dense/kernel:0, shape = (3072, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_7/output/dense/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_7/output/LayerNorm/beta:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_7/output/LayerNorm/gamma:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_8/attention/self/query/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_8/attention/self/query/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_8/attention/self/key/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_8/attention/self/key/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_8/attention/self/value/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_8/attention/self/value/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_8/attention/output/dense/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_8/attention/output/dense/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_8/attention/output/LayerNorm/beta:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_8/attention/output/LayerNorm/gamma:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_8/intermediate/dense/kernel:0, shape = (768, 3072), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_8/intermediate/dense/bias:0, shape = (3072,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_8/output/dense/kernel:0, shape = (3072, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_8/output/dense/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_8/output/LayerNorm/beta:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_8/output/LayerNorm/gamma:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_9/attention/self/query/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_9/attention/self/query/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_9/attention/self/key/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_9/attention/self/key/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_9/attention/self/value/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_9/attention/self/value/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_9/attention/output/dense/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_9/attention/output/dense/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_9/attention/output/LayerNorm/beta:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_9/attention/output/LayerNorm/gamma:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_9/intermediate/dense/kernel:0, shape = (768, 3072), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_9/intermediate/dense/bias:0, shape = (3072,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_9/output/dense/kernel:0, shape = (3072, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_9/output/dense/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_9/output/LayerNorm/beta:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_9/output/LayerNorm/gamma:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_10/attention/self/query/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_10/attention/self/query/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_10/attention/self/key/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_10/attention/self/key/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_10/attention/self/value/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_10/attention/self/value/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_10/attention/output/dense/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_10/attention/output/dense/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_10/attention/output/LayerNorm/beta:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_10/attention/output/LayerNorm/gamma:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_10/intermediate/dense/kernel:0, shape = (768, 3072), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_10/intermediate/dense/bias:0, shape = (3072,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_10/output/dense/kernel:0, shape = (3072, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_10/output/dense/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_10/output/LayerNorm/beta:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_10/output/LayerNorm/gamma:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_11/attention/self/query/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_11/attention/self/query/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_11/attention/self/key/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_11/attention/self/key/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_11/attention/self/value/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_11/attention/self/value/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_11/attention/output/dense/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_11/attention/output/dense/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_11/attention/output/LayerNorm/beta:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_11/attention/output/LayerNorm/gamma:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_11/intermediate/dense/kernel:0, shape = (768, 3072), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_11/intermediate/dense/bias:0, shape = (3072,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_11/output/dense/kernel:0, shape = (3072, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_11/output/dense/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_11/output/LayerNorm/beta:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/encoder/layer_11/output/LayerNorm/gamma:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = bert/pooler/dense/kernel:0, shape = (768, 768), *INIT_FROM_CKPT*\n",
+ " name = bert/pooler/dense/bias:0, shape = (768,), *INIT_FROM_CKPT*\n",
+ " name = span_width_embeddings:0, shape = (30, 20), *INIT_FROM_CKPT*\n",
+ " name = mention_word_attn/output_weights:0, shape = (768, 1), *INIT_FROM_CKPT*\n",
+ " name = mention_word_attn/output_bias:0, shape = (1,), *INIT_FROM_CKPT*\n",
+ " name = mention_scores/hidden_weights_0:0, shape = (2324, 3000), *INIT_FROM_CKPT*\n",
+ " name = mention_scores/hidden_bias_0:0, shape = (3000,), *INIT_FROM_CKPT*\n",
+ " name = mention_scores/output_weights:0, shape = (3000, 1), *INIT_FROM_CKPT*\n",
+ " name = mention_scores/output_bias:0, shape = (1,), *INIT_FROM_CKPT*\n",
+ " name = span_width_prior_embeddings:0, shape = (30, 20), *INIT_FROM_CKPT*\n",
+ " name = width_scores/hidden_weights_0:0, shape = (20, 3000), *INIT_FROM_CKPT*\n",
+ " name = width_scores/hidden_bias_0:0, shape = (3000,), *INIT_FROM_CKPT*\n",
+ " name = width_scores/output_weights:0, shape = (3000, 1), *INIT_FROM_CKPT*\n",
+ " name = width_scores/output_bias:0, shape = (1,), *INIT_FROM_CKPT*\n",
+ " name = genre_embeddings:0, shape = (7, 20), *INIT_FROM_CKPT*\n",
+ " name = src_projection/output_weights:0, shape = (2324, 2324), *INIT_FROM_CKPT*\n",
+ " name = src_projection/output_bias:0, shape = (2324,), *INIT_FROM_CKPT*\n",
+ " name = antecedent_distance_emb:0, shape = (10, 20), *INIT_FROM_CKPT*\n",
+ " name = output_weights:0, shape = (20, 1), *INIT_FROM_CKPT*\n",
+ " name = output_bias:0, shape = (1,), *INIT_FROM_CKPT*\n",
+ " name = coref_layer/same_speaker_emb:0, shape = (2, 20), *INIT_FROM_CKPT*\n",
+ " name = coref_layer/antecedent_distance_emb:0, shape = (10, 20), *INIT_FROM_CKPT*\n",
+ " name = coref_layer/segment_distance/segment_distance_embeddings:0, shape = (3, 20), *INIT_FROM_CKPT*\n",
+ " name = coref_layer/slow_antecedent_scores/hidden_weights_0:0, shape = (7052, 3000), *INIT_FROM_CKPT*\n",
+ " name = coref_layer/slow_antecedent_scores/hidden_bias_0:0, shape = (3000,), *INIT_FROM_CKPT*\n",
+ " name = coref_layer/slow_antecedent_scores/output_weights:0, shape = (3000, 1), *INIT_FROM_CKPT*\n",
+ " name = coref_layer/slow_antecedent_scores/output_bias:0, shape = (1,), *INIT_FROM_CKPT*\n",
+ " name = coref_layer/f/output_weights:0, shape = (4648, 2324), *INIT_FROM_CKPT*\n",
+ " name = coref_layer/f/output_bias:0, shape = (2324,), *INIT_FROM_CKPT*\n",
+ "W0330 14:29:38.538446 140695779342208 deprecation_wrapper.py:119] From /content/coref/independent.py:74: The name tf.train.get_or_create_global_step is deprecated. Please use tf.compat.v1.train.get_or_create_global_step instead.\n",
+ "\n",
+ "W0330 14:29:38.545189 140695779342208 deprecation_wrapper.py:119] From /content/coref/optimization.py:13: The name tf.train.polynomial_decay is deprecated. Please use tf.compat.v1.train.polynomial_decay instead.\n",
+ "\n",
+ "W0330 14:29:38.549285 140695779342208 deprecation.py:323] From /usr/local/lib/python3.7/dist-packages/tensorflow/python/keras/optimizer_v2/learning_rate_schedule.py:409: div (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version.\n",
+ "Instructions for updating:\n",
+ "Deprecated in favor of operator or tf.math.divide.\n",
+ "W0330 14:29:38.569683 140695779342208 deprecation_wrapper.py:119] From /content/coref/optimization.py:64: The name tf.train.AdamOptimizer is deprecated. Please use tf.compat.v1.train.AdamOptimizer instead.\n",
+ "\n",
+ "bert:task 199 27\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorflow/python/ops/gradients_util.py:93: UserWarning: Converting sparse IndexedSlices to a dense Tensor of unknown shape. This may consume a large amount of memory.\n",
+ " \"Converting sparse IndexedSlices to a dense Tensor of unknown shape. \"\n",
+ "/usr/local/lib/python3.7/dist-packages/tensorflow/python/ops/gradients_util.py:93: UserWarning: Converting sparse IndexedSlices to a dense Tensor of unknown shape. This may consume a large amount of memory.\n",
+ " \"Converting sparse IndexedSlices to a dense Tensor of unknown shape. \"\n",
+ "2021-03-30 14:29:48.887415: I tensorflow/stream_executor/platform/default/dso_loader.cc:42] Successfully opened dynamic library libcuda.so.1\n",
+ "2021-03-30 14:29:48.925766: E tensorflow/stream_executor/cuda/cuda_driver.cc:318] failed call to cuInit: CUDA_ERROR_NO_DEVICE: no CUDA-capable device is detected\n",
+ "2021-03-30 14:29:48.925828: I tensorflow/stream_executor/cuda/cuda_diagnostics.cc:156] kernel driver does not appear to be running on this host (0c3db3635866): /proc/driver/nvidia/version does not exist\n",
+ "2021-03-30 14:29:48.926289: I tensorflow/core/platform/cpu_feature_guard.cc:142] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 AVX512F FMA\n",
+ "2021-03-30 14:29:48.986357: I tensorflow/core/platform/profile_utils/cpu_utils.cc:94] CPU Frequency: 1999995000 Hz\n",
+ "2021-03-30 14:29:48.986627: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x55ac822dbdc0 executing computations on platform Host. Devices:\n",
+ "2021-03-30 14:29:48.986666: I tensorflow/compiler/xla/service/service.cc:175] StreamExecutor device (0): , \n",
+ "Restoring from ./spanbert_base/model.max.ckpt\n",
+ "2021-03-30 14:29:54.087705: W tensorflow/compiler/jit/mark_for_compilation_pass.cc:1412] (One-time warning): Not using XLA:CPU for cluster because envvar TF_XLA_FLAGS=--tf_xla_cpu_global_jit was not set. If you want XLA:CPU, either set that envvar, or use experimental_jit_scope to enable XLA:CPU. To confirm that XLA is active, pass --vmodule=xla_compilation_cache=1 (as a proper command-line flag, not via TF_XLA_FLAGS) or set the envvar XLA_FLAGS=--xla_hlo_profile.\n",
+ "W0330 14:29:58.155017 140695779342208 deprecation.py:323] From /usr/local/lib/python3.7/dist-packages/tensorflow/python/training/saver.py:1276: checkpoint_exists (from tensorflow.python.training.checkpoint_management) is deprecated and will be removed in a future version.\n",
+ "Instructions for updating:\n",
+ "Use standard file APIs to check for files with this prefix.\n",
+ "2021-03-30 14:30:08.798833: W tensorflow/core/framework/allocator.cc:107] Allocation of 246820000 exceeds 10% of system memory.\n",
+ "2021-03-30 14:30:12.599419: W tensorflow/core/framework/allocator.cc:107] Allocation of 246820000 exceeds 10% of system memory.\n",
+ "Decoded 1 examples.\n"
+ ],
+ "name": "stdout"
+ }
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "_9slRLDdQFDP"
+ },
+ "source": [
+ "# Output Handling\n",
+ "\n",
+ "Finally, we do a little processing to get the output to have the same token indices as our input."
+ ]
+ },
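+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "As a minimal sketch (with made-up values, not actual model output), the next cell illustrates the idea behind `subtoken_map`: for every WordPiece subtoken it stores the index of the original token it came from, so spans over subtokens can be translated back to spans over tokens."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "metadata": {},
+ "source": [
+ "# Hypothetical values chosen only for illustration; they are not taken\n",
+ "# from the model output above.\n",
+ "subtokens = ['Fire', '##fly', 'is', 'an', 'American', 'series']\n",
+ "subtoken_map = [0, 0, 1, 2, 3, 4]  # subtoken index -> original token index\n",
+ "\n",
+ "# A mention over subtokens (0, 1) covers only original token 0 ('Firefly').\n",
+ "start, end = subtoken_map[0], subtoken_map[1] + 1  # end is exclusive\n",
+ "text = ''.join(' '.join(subtokens[0:2]).split(' ##'))  # undo WordPiece splits\n",
+ "print((start, end), text)  # -> (0, 1) Firefly\n"
+ ],
+ "execution_count": null,
+ "outputs": []
+ },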
+ {
+ "cell_type": "code",
+ "metadata": {
+ "id": "FMiNJOgSUA7D",
+ "colab": {
+ "base_uri": "https://localhost:8080/"
+ },
+ "outputId": "3140683a-5f20-4031-9569-c6893393d501"
+ },
+ "source": [
+ "output = json.load(open(\"sample.out.txt\"))\n",
+ "\n",
+ "comb_text = [word for sentence in output['sentences'] for word in sentence]\n",
+ "\n",
+ "def convert_mention(mention):\n",
+ " start = output['subtoken_map'][mention[0]]\n",
+ " end = output['subtoken_map'][mention[1]] + 1\n",
+ " nmention = (start, end)\n",
+ " mtext = ''.join(' '.join(comb_text[mention[0]:mention[1]+1]).split(\" ##\"))\n",
+ " return (nmention, mtext)\n",
+ "\n",
+ "seen = set()\n",
+ "print('Clusters:')\n",
+ "for cluster in output['predicted_clusters']:\n",
+ " mapped = []\n",
+ " for mention in cluster:\n",
+ " seen.add(tuple(mention))\n",
+ " mapped.append(convert_mention(mention))\n",
+ " print(mapped, end=\",\\n\")\n",
+ "\n",
+ "print('\\nMentions:')\n",
+ "for mention in output['top_spans']:\n",
+ " if tuple(mention) in seen:\n",
+ " continue\n",
+ " print(convert_mention(mention), end=\",\\n\")"
+ ],
+ "execution_count": null,
+ "outputs": [
+ {
+ "output_type": "stream",
+ "text": [
+ "Clusters:\n",
+ "[((15, 20), 'writer and director Joss Whedon'), ((21, 22), 'his'), ((26, 27), 'Whedon'), ((78, 79), 'Whedon'), ((170, 171), \"Whedon ' s\"), ((304, 305), 'Whedon')],\n",
+ "[((2, 9), 'an American space Western drama television series'), ((36, 38), 'The series'), ((80, 82), 'the show'), ((96, 98), 'The show'), ((195, 196), 'Firefly'), ((210, 211), 'Firefly'), ((224, 225), 'It'), ((243, 245), 'the series'), ((245, 246), 'it'), ((250, 251), 'it'), ((261, 262), 'It'), ((280, 282), 'the series'), ((301, 303), 'the show'), ((320, 322), 'the series'), ((324, 325), 'Firefly')],\n",
+ "[((63, 67), \"Serenity , a ' Firefly - class ' spaceship\"), ((77, 78), 'Serenity')],\n",
+ "[((50, 54), 'a new star system'), ((134, 137), 'their star system')],\n",
+ "[((12, 13), '2002'), ((207, 208), '2002')],\n",
+ "[((12, 13), '2003'), ((268, 269), '2003')],\n",
+ "[((277, 279), 'TV Guide'), ((286, 287), 'their')],\n",
+ "\n",
+ "Mentions:\n",
+ "((0, 1), 'Firefly'),\n",
+ "((0, 2), 'Firefly is'),\n",
+ "((2, 13), 'an American space Western drama television series which ran from 2002 - 2003'),\n",
+ "((10, 11), 'ran'),\n",
+ "((12, 13), '2002 - 2003'),\n",
+ "((13, 14), 'created'),\n",
+ "((15, 26), 'writer and director Joss Whedon , under his Mutant Enemy Productions label'),\n",
+ "((21, 25), 'his Mutant Enemy Productions'),\n",
+ "((21, 26), 'his Mutant Enemy Productions label'),\n",
+ "((26, 36), 'Whedon served as an executive producer , along with Tim Minear'),\n",
+ "((27, 28), 'served'),\n",
+ "((34, 36), 'Tim Minear'),\n",
+ "((36, 54), 'The series is set in the year 2517 , after the arrival of humans in a new star system'),\n",
+ "((38, 39), 'is'),\n",
+ "((39, 40), 'set'),\n",
+ "((41, 44), 'the year 2517'),\n",
+ "((41, 54), 'the year 2517 , after the arrival of humans in a new star system'),\n",
+ "((45, 49), 'the arrival of humans'),\n",
+ "((45, 54), 'the arrival of humans in a new star system'),\n",
+ "((48, 49), 'humans'),\n",
+ "((54, 56), 'and follows'),\n",
+ "((55, 56), 'follows'),\n",
+ "((56, 67), \"the adventures of the renegade crew of Serenity , a ' Firefly - class ' spaceship\"),\n",
+ "((59, 67), \"the renegade crew of Serenity , a ' Firefly - class ' spaceship\"),\n",
+ "((65, 66), 'Firefly'),\n",
+ "((65, 66), 'Firefly -'),\n",
+ "((65, 66), 'Firefly - class'),\n",
+ "((65, 66), \"Firefly - class '\"),\n",
+ "((65, 67), \"Firefly - class ' spaces\"),\n",
+ "((67, 70), 'The ensemble cast'),\n",
+ "((67, 78), 'The ensemble cast portrays the nine characters who live on Serenity'),\n",
+ "((70, 71), 'portrays'),\n",
+ "((71, 78), 'the nine characters who live on Serenity'),\n",
+ "((78, 96), \"Whedon pitched the show as ' nine people looking into the blackness of space and seeing nine different things\"),\n",
+ "((79, 80), 'pitched'),\n",
+ "((80, 96), \"the show as ' nine people looking into the blackness of space and seeing nine different things\"),\n",
+ "((83, 96), 'nine people looking into the blackness of space and seeing nine different things'),\n",
+ "((87, 91), 'the blackness of space'),\n",
+ "((90, 91), 'space'),\n",
+ "((92, 93), 'seeing'),\n",
+ "((93, 96), 'nine different things'),\n",
+ "((95, 96), \"'\"),\n",
+ "((98, 99), 'explores'),\n",
+ "((106, 118), 'some of whom fought on the losing side of a civil war'),\n",
+ "((111, 118), 'the losing side of a civil war'),\n",
+ "((115, 118), 'a civil war'),\n",
+ "((119, 120), 'make'),\n",
+ "((123, 127), 'the fringes of society'),\n",
+ "((126, 127), 'society'),\n",
+ "((130, 137), 'the pioneer culture of their star system'),\n",
+ "((134, 135), 'their'),\n",
+ "((138, 140), 'this future'),\n",
+ "((138, 150), 'this future , the only two surviving superpowers , the United States and China ,'),\n",
+ "((140, 148), 'the only two surviving superpowers , the United States'),\n",
+ "((140, 150), 'the only two surviving superpowers , the United States and China'),\n",
+ "((140, 150), 'the only two surviving superpowers , the United States and China ,'),\n",
+ "((145, 148), 'the United States'),\n",
+ "((149, 150), 'China'),\n",
+ "((150, 151), 'fused'),\n",
+ "((150, 152), 'fused to'),\n",
+ "((150, 153), 'fused to form'),\n",
+ "((150, 160), 'fused to form the central federal government , called the Alliance'),\n",
+ "((150, 160), 'fused to form the central federal government , called the Alliance ,'),\n",
+ "((150, 161), 'fused to form the central federal government , called the Alliance , resulting'),\n",
+ "((150, 162), 'fused to form the central federal government , called the Alliance , resulting in'),\n",
+ "((150, 168), 'fused to form the central federal government , called the Alliance , resulting in the fusion of the two cultures'),\n",
+ "((150, 168), 'fused to form the central federal government , called the Alliance , resulting in the fusion of the two cultures .'),\n",
+ "((152, 153), 'form'),\n",
+ "((153, 160), 'the central federal government , called the Alliance'),\n",
+ "((162, 168), 'the fusion of the two cultures'),\n",
+ "((165, 168), 'the two cultures'),\n",
+ "((170, 172), \"Whedon ' s vision\"),\n",
+ "((172, 173), \"'\"),\n",
+ "((172, 195), \"' nothing will change in the future : technology will advance , but we will still have the same political , moral , and ethical problems as today\"),\n",
+ "((176, 178), 'the future'),\n",
+ "((176, 195), 'the future : technology will advance , but we will still have the same political , moral , and ethical problems as today'),\n",
+ "((178, 179), 'technology'),\n",
+ "((180, 181), 'advance'),\n",
+ "((182, 183), 'we'),\n",
+ "((182, 195), 'we will still have the same political , moral , and ethical problems as today'),\n",
+ "((184, 186), 'still have'),\n",
+ "((185, 186), 'have'),\n",
+ "((186, 195), 'the same political , moral , and ethical problems as today'),\n",
+ "((194, 195), 'today'),\n",
+ "((195, 197), 'Firefly premiered'),\n",
+ "((195, 200), 'Firefly premiered in the U . S .'),\n",
+ "((195, 204), 'Firefly premiered in the U . S . on the Fox network'),\n",
+ "((195, 208), 'Firefly premiered in the U . S . on the Fox network on September 20 , 2002'),\n",
+ "((195, 208), 'Firefly premiered in the U . S . on the Fox network on September 20 , 2002 .'),\n",
+ "((196, 197), 'premiered'),\n",
+ "((198, 200), 'the U . S .'),\n",
+ "((201, 204), 'the Fox network'),\n",
+ "((205, 208), 'September 20 , 2002'),\n",
+ "((210, 218), 'Firefly had averaged 4 . 7 million viewers per episode'),\n",
+ "((212, 213), 'averaged'),\n",
+ "((212, 218), 'averaged 4 . 7 million viewers per episode'),\n",
+ "((213, 218), '4 . 7 million viewers per episode'),\n",
+ "((217, 218), 'episode'),\n",
+ "((222, 224), 'Nielsen ratings'),\n",
+ "((224, 227), 'It was canceled'),\n",
+ "((225, 227), 'was canceled'),\n",
+ "((226, 227), 'canceled'),\n",
+ "((228, 234), '11 of the 14 produced episodes'),\n",
+ "((228, 236), '11 of the 14 produced episodes were aired'),\n",
+ "((230, 234), 'the 14 produced episodes'),\n",
+ "((234, 236), 'were aired'),\n",
+ "((235, 236), 'aired'),\n",
+ "((237, 245), 'the relatively short life span of the series'),\n",
+ "((237, 255), 'the relatively short life span of the series , it received strong sales when it was released on DVD'),\n",
+ "((245, 255), 'it received strong sales when it was released on DVD'),\n",
+ "((250, 255), 'it was released on DVD'),\n",
+ "((252, 253), 'released'),\n",
+ "((254, 255), 'DVD'),\n",
+ "((256, 257), 'has'),\n",
+ "((257, 261), 'large fan support campaigns'),\n",
+ "((261, 277), 'It won a Primetime Emmy Award in 2003 for Outstanding Special Visual Effects for a Series'),\n",
+ "((262, 263), 'won'),\n",
+ "((263, 267), 'a Primetime Emmy Award'),\n",
+ "((263, 269), 'a Primetime Emmy Award in 2003'),\n",
+ "((263, 277), 'a Primetime Emmy Award in 2003 for Outstanding Special Visual Effects for a Series'),\n",
+ "((270, 277), 'Outstanding Special Visual Effects for a Series'),\n",
+ "((275, 277), 'a Series'),\n",
+ "((277, 297), \"TV Guide ranked the series at No . 5 on their 2013 list of 60 shows that were ' Cancelled Too Soon\"),\n",
+ "((279, 280), 'ranked'),\n",
+ "((280, 297), \"the series at No . 5 on their 2013 list of 60 shows that were ' Cancelled Too Soon\"),\n",
+ "((286, 288), 'their 2013'),\n",
+ "((286, 297), \"their 2013 list of 60 shows that were ' Cancelled Too Soon\"),\n",
+ "((290, 297), \"60 shows that were ' Cancelled Too Soon\"),\n",
+ "((296, 297), \"'\"),\n",
+ "((301, 322), 'the show led Whedon and Universal Pictures to produce Serenity , a 2005 film which continues from the story of the series'),\n",
+ "((304, 308), 'Whedon and Universal Pictures'),\n",
+ "((304, 322), 'Whedon and Universal Pictures to produce Serenity , a 2005 film which continues from the story of the series'),\n",
+ "((306, 308), 'Universal Pictures'),\n",
+ "((309, 310), 'produce'),\n",
+ "((310, 311), 'Serenity'),\n",
+ "((310, 322), 'Serenity , a 2005 film which continues from the story of the series'),\n",
+ "((312, 313), '2005'),\n",
+ "((317, 322), 'the story of the series'),\n",
+ "((323, 326), 'the Firefly franchise'),\n",
+ "((323, 336), 'the Firefly franchise expanded to other media , including comics and a role - playing game'),\n",
+ "((326, 327), 'expanded'),\n",
+ "((328, 336), 'other media , including comics and a role - playing game'),\n",
+ "((331, 332), 'comics'),\n",
+ "((333, 336), 'a role - playing game'),\n"
+ ],
+ "name": "stdout"
+ }
+ ]
+ }
+ ]
+}
\ No newline at end of file