If you read my December 20 blog post about answering questions from long passages using BERT, you know how excited I am about the impact BERT is having on natural language processing. BERT, or Bidirectional Encoder Representations from Transformers, is a method of pre-training language representations developed by Google that obtains state-of-the-art results on a wide range of natural language processing tasks. In this post, I'll show how to find the cosine similarity between sentences using BERT-as-a-Service.
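To make the idea concrete, here is a minimal sketch of the workflow, assuming a bert-serving-server instance is already running locally with a pre-trained BERT model; the model path and the two example sentences below are placeholders, not anything specific to this post:

```python
import numpy as np
from bert_serving.client import BertClient

# Connects to a bert-serving-server on localhost (default ports 5555/5556),
# e.g. one started with:
#   bert-serving-start -model_dir /path/to/uncased_L-12_H-768_A-12 -num_worker=1
bc = BertClient()

# Encode two sentences into fixed-length vectors (768 dimensions for BERT-Base)
sentences = ["The weather is nice today.", "It is sunny and pleasant outside."]
vec_a, vec_b = bc.encode(sentences)

# Cosine similarity: dot product of the two vectors divided by their norms
similarity = np.dot(vec_a, vec_b) / (np.linalg.norm(vec_a) * np.linalg.norm(vec_b))
print(f"Cosine similarity: {similarity:.4f}")
```

Because bert-as-service returns pooled sentence embeddings that are not necessarily unit-length, dividing the dot product by both norms is what turns the raw inner product into a true cosine score between -1 and 1.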