This blog post is part of a series on SMS surveying. The second part, ‘SMS Surveying: Logistics’, can be accessed here. The third part, ‘SMS Surveying: Lessons Learnt in Tanzania’, can be accessed here.
The use of remote data collection in survey research has accelerated with the COVID-19 global pandemic. While phone surveys seem to be the preferred alternative for researchers, other innovative remote data collection methods, such as short message service (SMS) surveys, are increasingly attracting interest in the research world, as this type of survey allows for fast, low-cost implementation.
In this three-part blog series on SMS surveying, I share my thoughts and insights from my recent experience conducting an SMS survey in Tanzania. I start with some tips and best practices for designing an SMS survey:
1) Length of the SMS survey: A long SMS survey may discourage the respondent from reaching the end, may be perceived as too intrusive, or may lead to rushed responses toward the end, which reduces data quality. From our experience, an SMS survey that includes 5–10 questions can be easily completed in less than five minutes.
2) Type of questions: Closed-ended questions should be preferred over open-ended questions. Closed-ended questions are easier to answer and allow the respondent to reply with just the number corresponding to the answer (e.g. sending 1 for “Yes”, or 2 for “No”). In addition, sensitive questions should be avoided in SMS surveys, as it is challenging to establish trust over SMS. Furthermore, the setting of SMS surveys raises important ethical concerns, as privacy cannot be guaranteed, especially in households where a phone is shared among several members.
3) Length of the questions: The shorter, the better. Each question and message should not exceed the size of two SMS (320 characters), as messages that are too long may not display properly for respondents using basic or feature phones, which are used by the majority of mobile phone owners in Tanzania.
4) Other messages to be included: The introduction message and the consent question are the key starting points for engaging the respondent in continuing and completing the SMS survey. The first message should be concise while still providing enough background on the research, consent, data protection, and other information such as the length of the survey, incentives, and how SMS messages are billed in the survey. Other messages to include are a thank-you message at the end, and a question asking for an alternative phone number in case of a follow-up survey.
5) SMS programming: Some key functionalities we used with the software TextIt are presented below:
- Skips and routing allow the respondent to answer only the questions relevant to them, depending on previous answers, e.g. asking about a child’s age only if the respondent has children.
- Automated error messages can be sent to prompt the respondent to resubmit a response from the range of values indicated, e.g. “Please use the numeric codes provided to select the answer”. This message can be customised for each question, and additional validations can be used to confirm a response that seems unusual in the local context.
- SMS reminders and follow-up SMS can be sent when the respondent has not started or not finished the survey. The content and frequency of reminders can be differentiated between respondents who started the survey but did not finish it and respondents who have not started it at all. In the latter case, reminders may be sent less frequently to avoid the phone user receiving too many messages at once, for example if the phone was switched off.
- Customised SMS are possible by pre-populating specific information or previous data related to the respondent. For example, adding the respondent’s name to the introduction message can help the targeted respondent reply (especially when the phone is used by more than one person).
- Randomising the order of the questions can address the bias of incomplete SMS surveys. Some respondents answer the first questions but do not reach the end of the survey, either because they voluntarily stopped or because of technical issues (network failure, battery problems, etc.).
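The two-SMS limit from tip 3 is easy to check automatically before loading questions into a platform. The sketch below is a minimal, illustrative Python check; the question texts and the 320-character threshold are taken from the tip above, and everything else is an assumption for illustration.

```python
# Minimal sketch: flag survey questions that exceed the two-SMS limit
# (320 characters) mentioned in tip 3. Question texts are illustrative.

SMS_LIMIT = 320  # two standard 160-character SMS

def over_limit(questions, limit=SMS_LIMIT):
    """Return the questions whose text exceeds the character limit."""
    return [q for q in questions if len(q) > limit]

questions = [
    "Do you own a mobile phone? Reply 1 for Yes, 2 for No.",
    "How many people live in your household? Reply with a number.",
]

print(over_limit(questions))  # → [] (both questions fit)
```

Note that in practice the per-SMS character budget also depends on the character set used (special characters can shrink it), so a real check may need a stricter limit.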
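To make the flow functionalities above more concrete, here is a hedged sketch of the skip/routing, validation, and randomisation logic in plain Python. This is not TextIt’s API — in TextIt these behaviours are configured in the visual flow editor — and all question texts, codes, and function names are assumptions for illustration.

```python
import random

# Illustrative sketch of an SMS survey flow: routing based on previous
# answers, automated error messages on invalid input, and randomised
# question order. All questions and codes are hypothetical.

ERROR_MSG = "Please use the numeric codes provided to select the answer."

def ask(prompt, valid_codes, get_reply):
    """Keep prompting until the reply is one of the valid numeric codes."""
    while True:
        reply = get_reply(prompt).strip()
        if reply in valid_codes:
            return reply
        prompt = ERROR_MSG  # automated error message, resent until valid

def run_survey(get_reply):
    answers = {}
    # Skip/routing: only ask about a child's age if the respondent has children
    answers["has_children"] = ask(
        "Do you have children? 1=Yes 2=No", {"1", "2"}, get_reply)
    if answers["has_children"] == "1":
        answers["child_age"] = ask(
            "How old is your youngest child? Reply with a number.",
            {str(n) for n in range(19)}, get_reply)
    # Randomise the order of the remaining questions so that drop-off
    # does not always hit the same questions
    remaining = [
        ("owns_radio", "Do you own a radio? 1=Yes 2=No", {"1", "2"}),
        ("owns_tv", "Do you own a TV? 1=Yes 2=No", {"1", "2"}),
    ]
    random.shuffle(remaining)
    for key, prompt, codes in remaining:
        answers[key] = ask(prompt, codes, get_reply)
    return answers
```

A scripted respondent (a function returning canned replies) can be passed as `get_reply` to test the routing: replying 2 to the children question skips the child-age question entirely, and an invalid reply such as “abc” triggers the error message before the flow continues.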
These tips on SMS survey design constitute the first blog post in the series on SMS surveying. In the next post, I will share my experience with the logistics involved in conducting an SMS survey in Tanzania.
Additional resources:
I recommend exploring the resources available from TextIt (https://textit.com/) that include a learning centre for programming SMS surveys on their platform. TextIt is a platform developed by Nyaruka, a software firm originally founded in Rwanda that offers services for conducting text messaging surveys in over 170 countries.