PIER

Here is a list of Papers I Enjoyed Reading:

LinkBERT: Pretraining Language Models with Document Links
Is Cosine-Similarity of Embeddings Really About Similarity?
Deep Bidirectional Language-Knowledge Graph Pretraining
Chameleon: Plug-and-Play Compositional Reasoning with Large Language Models
RITUAL: Random Image Transformations as a Universal Anti-hallucination Lever in LVLMs
Efficiently Teaching an Effective Dense Retriever with Balanced Topic Aware Sampling
Can a Large Language Model Be a Gaslighter?
Poincaré Embeddings for Learning Hierarchical Representations