Corpus ID: 236428420

Pressure Test: Good Stress for Company Success

Sanja Scepanovic, Marios Constantinides, Daniele Quercia, Seunghyun Kim
Workplace stress is often considered negative, yet lab studies on individuals suggest that not all stress is bad. There are two types of stress: distress refers to harmful stimuli, while eustress refers to healthy, euphoric stimuli that create a sense of fulfillment and achievement. Telling the two types apart is challenging, let alone quantifying their impact across corporations. We did just that for the S&P 500 companies in the U.S. by leveraging a dataset of 440K…


Stay cool under pressure — without appearing cold. Harvard Business Review, 2020.
Results of a national study. JAMA Internal Medicine 173, 76–77, 2013.
BERTopic: Leveraging BERT and c-TF-IDF to create easily interpretable topics, 2020.
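For context, the c-TF-IDF idea behind BERTopic treats all documents assigned to one class (topic) as a single document and reweights term frequencies by how distinctive a term is across classes. A minimal sketch, assuming a plain-Python interface; the function name and toy data are illustrative, not from the paper:

```python
import math
from collections import Counter

def c_tf_idf(docs_per_class):
    """Class-based TF-IDF: merge each class's documents into one bag of words,
    then score term t in class c as tf(t, c) * log(1 + A / f(t)),
    where f(t) is t's total frequency and A is the average words per class.

    docs_per_class: dict mapping class label -> list of token lists.
    Returns: dict mapping class label -> {term: score}.
    """
    # One merged term-count bag per class
    class_counts = {c: Counter(tok for doc in docs for tok in doc)
                    for c, docs in docs_per_class.items()}
    # f(t): total frequency of each term across all classes
    total = Counter()
    for counts in class_counts.values():
        total.update(counts)
    # A: average number of words per class
    avg_words = sum(total.values()) / len(class_counts)
    scores = {}
    for c, counts in class_counts.items():
        n_words = sum(counts.values())
        scores[c] = {t: (counts[t] / n_words) * math.log(1 + avg_words / total[t])
                     for t in counts}
    return scores
```

Terms that are frequent within a class but rare elsewhere get the highest scores, which is what makes the resulting topic descriptions easy to interpret.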
Extracting medical entities from social media
A deep-learning method using contextual embeddings that outperformed existing state-of-the-art methods on two benchmark datasets: one containing annotated AskaPatient posts (CADEC) and the other containing annotated tweets (Micromed).
Modeling Organizational Culture with Workplace Experiences Shared on Glassdoor
This work operationalizes organizational culture (OC) as a word-vector representation built from multiple job descriptors, validates the construct against the language used in 650k Glassdoor reviews, and proposes a methodology for applying it to Glassdoor reviews to quantify the OC of employees by sector.
Neural Architectures for Nested NER through Linearization
Proposes two neural network architectures for nested named entity recognition (NER), a setting in which named entities may overlap and also be labeled with more than one label; both outperform the nested NER state of the art on four corpora.
Pooled Contextualized Embeddings for Named Entity Recognition
This work proposes a method that dynamically aggregates the contextualized embeddings of each unique string encountered and uses a pooling operation to distill a "global" word representation from all contextualized instances.
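The pooling idea above can be sketched in a few lines: keep a memory of every contextual embedding seen for a surface string, pool over that memory, and concatenate the pooled vector with the current context's embedding. This is a minimal illustration under assumptions, with mean pooling as one possible pooling operation and a made-up class name, not the original implementation:

```python
import numpy as np
from collections import defaultdict

class PooledEmbedder:
    """Illustrative sketch of pooled contextualized embeddings:
    a memory stores all contextual embeddings seen for a surface string,
    and a pooling op (here, the mean) distills a "global" representation
    that is concatenated with the current context's embedding."""

    def __init__(self):
        # surface string -> list of contextual embeddings seen so far
        self.memory = defaultdict(list)

    def embed(self, token, contextual_embedding):
        self.memory[token].append(contextual_embedding)
        # Pool over every instance of this string observed so far
        pooled = np.mean(self.memory[token], axis=0)
        # Final representation: [current context ; pooled global]
        return np.concatenate([contextual_embedding, pooled])
```

The benefit is for rare entities: an underspecified mention (e.g. in a short sentence) can borrow evidence from earlier, clearer contexts of the same string.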
RoBERTa: A Robustly Optimized BERT Pretraining Approach
Finds that BERT was significantly undertrained: with an optimized pretraining procedure it can match or exceed the performance of every model published after it, and the best model achieves state-of-the-art results on GLUE, RACE and SQuAD.