This paper introduces attentional activation (ATAC) units, a unification of activation functions and attention mechanisms. Each ATAC unit contains a local channel attention module that performs non-linear activation and element-wise feature refinement simultaneously by locally aggregating point-wise cross-channel feature contexts in convolutional networks. A minimal sketch of such a unit is given below.
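The following is a minimal PyTorch sketch of an ATAC unit, assuming the local channel attention module is realised as two point-wise (1x1) convolutions with a sigmoid gate; the class name, layer arrangement, and bottleneck ratio `r` are illustrative assumptions rather than a definitive reproduction of the paper's implementation.

```python
import torch
import torch.nn as nn


class ATAC(nn.Module):
    """Attentional activation unit (sketch): gates the input feature map with a
    locally aggregated, point-wise cross-channel attention signal."""

    def __init__(self, channels: int, r: int = 4):
        super().__init__()
        hidden = max(channels // r, 1)
        self.gate = nn.Sequential(
            nn.Conv2d(channels, hidden, kernel_size=1),   # point-wise cross-channel aggregation
            nn.BatchNorm2d(hidden),
            nn.ReLU(inplace=True),
            nn.Conv2d(hidden, channels, kernel_size=1),   # restore the channel dimension
            nn.BatchNorm2d(channels),
            nn.Sigmoid(),                                 # element-wise attention weights in (0, 1)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Non-linear activation and element-wise feature refinement in one step:
        # each element of x is rescaled by its learned attention weight.
        return x * self.gate(x)


if __name__ == "__main__":
    x = torch.randn(2, 64, 32, 32)
    y = ATAC(channels=64)(x)
    print(y.shape)  # torch.Size([2, 64, 32, 32])
```

Used in place of a ReLU inside a convolutional block, the unit both activates and reweights features, since the gate depends on the input itself rather than being a fixed point-wise non-linearity.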