As public research budgets in the social sciences have shrunk, large government and academic surveys that have traditionally been conducted using in-person interviews are being supplemented or replaced by less expensive online samples. Many large-scale surveys have begun exploring alternative online design approaches, despite a notable lack of research directly comparing in-person and online surveys. In this paper we leverage the 2012 and 2016 iterations of the American National Election Study (ANES), the largest and longest-running electoral survey in the US, to understand the implications of supplementing large academic and government surveys with online samples for estimates of population means, trends over time, and associations between key variables. Although the release of the in-person and online ANES samples in a single dataset might suggest few differences between modes, we find notable differences in standard measures of a variety of key indicators of voter behavior, including political knowledge, voter registration, turnout, partisanship, and ideology. Additionally, we consider differences in data quality across in-person and online samples, examining satisficing and item nonresponse. We find not only statistically significant differences in means across samples, but also substantial variation in how strongly these measures predict other quantities of central interest, such as voter turnout. These results illustrate the importance of accounting for design differences in analyses of hybrid-sample datasets and highlight a number of issues that should be considered in future large-scale data collection efforts using multiple modes.