Summary of Changes to Be Made for FY ’04
In addition to the 12-page report of consumer survey results, each program was asked to submit a Supplemental Program Report. The first part of the supplemental report was mandatory and consisted of basic questions regarding program information, number of unduplicated clients, number of surveys reported, and information regarding administration methodology. The second section of the report was optional and allowed providers the opportunity to give feedback regarding specific aspects of the survey instrument as well as the survey process in general.
Of the programs submitting survey results, approximately 68% submitted the Supplemental Program Report, and approximately 44% of those completed the optional feedback section.
The questions found in sections 1 and 2 of the report are noted below, followed by a summary of responses.
SECTION 1 – Mandatory
Did consumers have access to personal assistance in completing the survey?
A majority of programs reported that consumers had access to some form of personal assistance in completing the survey. Programs reported that the following types of assistance were available:
- administrative support staff
- other non-direct service staff
- direct service staff not involved with client’s treatment
- direct service staff involved with client’s treatment
- other
Please note or estimate the percentage of surveys completed in the following manner:
Overall, 22% of consumers completing the survey received assistance. Of particular interest to DMHAS was the percentage of individuals who were helped by either direct service staff or by a peer or volunteer. Of those who were assisted, about one half were assisted by direct staff and one fifth by a peer or volunteer.
Did survey administration provide for consumer privacy and anonymity?
Almost all programs (approximately 95%) reported that efforts were made to provide privacy to the extent possible for the program setting. The following reflects some of the strategies utilized by programs:
- Surveys were returned in sealed envelopes made available by the program.
- Surveys were returned to secure boxes or receptacles.
- Private offices were made available to complete the survey.
- If a separate office was not available, a specific section away from other program participants and staff was made available.
- Consumers were offered the option of completing the survey at home or another location away from the program.
Describe your plan to share survey results with program participants.
Subsequent to the provider orientation, it was decided that sharing results with program participants would not be mandatory for the FY ’03 survey. However, programs were free to share results if they wished. Many programs did plan to share results, and the following strategies were noted (some programs utilized only one, while others utilized a combination of strategies):
- Posting of results at the program site.
- Posting results along with an accompanying corrective action plan.
- Meetings, including:
  - Regularly scheduled consumer meetings.
  - Ad hoc consumer meetings scheduled specifically to discuss results.
  - Follow-up focus groups with consumers to discuss results.
  - Regularly scheduled staff and quality management meetings.
SECTION 2 – Optional Feedback
Programs were given the option of providing feedback and suggestions regarding the DMHAS consumer survey process. Forty-four percent of programs submitting a supplemental report completed this section. Questions are shown in bold below, and a summary of the major points addressed in the responses is provided.
Survey Questions
- Questions were applicable and appropriate
- Standardization across the DMHAS system is a good idea
- Assessing the various domains is helpful
- The length of the survey was quite manageable
- Consumers like multiple choice
- Consumer-friendly language
- The length of the survey was too long
- Questions are more applicable to treatment settings
- Some questions do not apply to certain programs (many social and vocational rehabilitation programs made this comment).
  - For social and vocational rehabilitation: questions 5, 6, 9, 12, 13, 15, 16
  - (It was felt that when questions are asked that don’t really apply, confusion on the part of respondents can occur, and erroneous expectations may be created about what kinds of services should be offered.)
  - For residential services: questions 6, 7
- Some addiction service programs noted that the language in some questions is geared towards MH issues and does not apply to AS. Specifically, questions 9, 12, 15, 23
- Respondents tend to judge themselves harshly and therefore results are not reflective of actual improvement
- Many variables, aside from program participation, impact outcomes
- Outcomes are not related to program participation
- Outcomes may be related to services in general and not necessarily the specific program that is being surveyed
- Outcomes are dependent upon length of time in treatment
Recommendations related to questions:
- Allow programs to fill in N/A for questions that don’t apply to a specific program type before distributing to consumers
- Allow programs to reword some questions
- Allow programs to add program-specific questions (this option was noted in the orientation and the written training material, but perhaps it needs more emphasis)
- Question 9 should be separated into 2 questions (1 about treatment and 1 about medication)
- Have separate surveys for certain types of programs
- Add a section where respondents can make comments/suggestions
- Make time intervals in the demographic section smaller so that they can capture shorter lengths of stay
- For outcome questions (17-23), add a time period, e.g., “Since receiving services at ___” or “Within the last ______.”
- Change to present tense
- Make tense consistent throughout the survey
The following recommendations related to questions were provided at an October 2002 meeting with addiction service providers; they address questions not covered above.
Demographic section: How long have you been in the program?
The shortest length of service that can be selected is 6 months. It was noted that for several types of programs the length of service is much shorter than 6 months. Without a more detailed breakdown below 6 months, programs will not be able to review survey results in relation to these shorter lengths of stay. Suggested options were to add the following possible answers:
- __ less than 7 days
- __ less than 14 days
- __ less than 30 days
Question #5 - Staff was willing to see me as often as I felt was necessary.
It was suggested that DMHAS consider adding a question that would reflect satisfaction with actual follow up with scheduled appointments.
Question #14 – Staff was sensitive to my cultural/ethnic background (race, religion, language, etc.)
Replace the word “sensitive” with the word “respectful”. It was felt that clients would more clearly understand this word.
Add more questions that would provide more specificity as to what factors might be contributing to satisfaction or dissatisfaction, e.g., sensitivity regarding gender, sexual orientation, religion, etc.
Question 15 - Staff helped me obtain information I needed so that I could take charge of managing my illness.
Replace the word “illness” with “recovery”.
Question 16 – My wishes are respected about the amount of family involvement I want in my treatment.
Replace “treatment” with “recovery”.
Some questions are in the present tense and some in the past.
It was suggested that the tense be consistent throughout the survey tool.
Time frame for conduct of survey
- Many programs responding to this question felt that the time frame was appropriate or adequate.
- However, there were also many programs that found the three-month time frame inadequate for preparing for the survey and administering it to the required number of consumers.
- The short time frame also exacerbated the problem of asking consumers enrolled in multiple programs to complete multiple surveys.
- More lead time would have been helpful.
Sample Size
- Most programs responding indicated that the sample size was reasonable/appropriate/not a problem.
- Some small programs that were required to report on all consumers (programs with fewer than 25 clients were required to survey and report on all program participants) had difficulty obtaining the required number because:
  - Not all consumers wished to complete the survey
  - For some programs, length of stay is short
  - Enrollment numbers change frequently
- Some larger programs considered the sample size requirement to be too high
Guidelines for Administration
- The majority of programs responding indicated satisfaction with the guidelines
  - Concise, clear, excellent
- Alternatively, the following comments were also made:
  - Administration should be fully prescribed by DMHAS and standardized from one program to another.
  - Administration guidelines do not allow for inter-rater reliability
  - No way to ensure results are accurate
Reporting Requirements
- Most programs reported satisfaction with the requirements
  - Not a problem, reasonable, ok
- There were several positive comments related to the data entry program that was provided to programs:
  - Outstanding, easy to use, fast data entry, good reports, instructions easy to read
- Alternatively, the following comments were also made:
  - Instructions were not clear
  - Program was hard to use
  - Reporting was too time consuming
  - Reporting requirements were redundant, i.e., having to send both hard copy reports and electronic raw data
  - OOC should have the data regarding unduplicated client count; therefore, this shouldn’t need to be asked on the program supplemental report
  - Training did not provide enough information regarding depth of reporting requirements
  - Supplemental Program Report too busy; too many questions
- Specific recommendations regarding reporting:
  - Develop a program that allows programs to produce an agency-wide report
NOTE: Many providers required assistance in utilizing the programmed disc. To a very large extent, problems were related to lack of familiarity with the program. There were a small number of technical problems related to the computer equipment or software being used by specific programs. Overall, use of the programmed disc was very successful.
What proved helpful in soliciting consumer participation?
- Posting announcements
- Providing verbal reminders
- Trusting relationship with program staff
- Emphasizing that results will have an impact upon services
- Anonymity
- Case managers encouraged consumers to complete surveys
- Staff dropped surveys at clients’ homes
- Survey was easy to fill out
- Weekly reminders to staff
- Group administration
- For residential services, easy access to consumers (but it was noted that this may also inhibit honest responses)
- Cover letter from Medical Director encouraging completion of survey
- Availability of Spanish version survey
- Staggering administration between programs so that multiple programs were not administering the survey at the same time
What were the barriers encountered in soliciting consumer participation?
- Additional work required to conduct the survey
- Consumers’ reluctance to complete multiple surveys
- Hispanic clients historically do not like to complete surveys
Comments/Recommendations Not Covered Above
Survey Design:
- Redesign format so that it is not so cluttered
- Put demographic questions at the end of the survey
- Make clearer that questions regarding Hispanic origin are in a separate section and should be filled out
- Reduce number of questions
Administration of survey:
- Several providers indicated that lack of available “neutral” staff or volunteers to assist consumers was problematic.
- Some programs did not provide assistance and some programs opted to utilize direct service staff. Both options were considered less than optimal in terms of the potential impact upon survey results.
- Outside/neutral party should administer the survey to ensure consistency and to promote honest responses
- Administration should be staggered between programs (programs do have this option)
- OOC should provide examples of how to ensure anonymity
- Centralize entire process and pay consumer representatives to provide assistance
- Some programs recommended that the survey be conducted by non-program personnel such as OOC staff or contracted personnel.
Privacy:
- Some programs reported that owing to lack of available private space, consumers completed the surveys in close proximity to their direct service provider. This was felt to be less than optimal in terms of perceived privacy.
- Many programs felt that not requiring names on the survey increased comfort in completing the survey honestly.
Scoring:
- Several programs recommended that the neutral answer option be omitted.
- One program noted that it was impossible to discern what a neutral answer means.
- A small number of programs noted that a neutral response was considered to be negative or detracted from their satisfaction rating.
Misc.
- Utilize focus groups instead of written survey
- One addiction services agency with over 15 active programs opted to conduct the survey with all programs – funded and non-funded – as it felt that the survey was an improvement over current instruments and that the information provided was useful.
IMPACT OF FY ’03 SURVEY ON PLANS FOR FY ’04 CONSUMER SURVEY
Feedback regarding use of the FY ’03 survey was greatly appreciated. It was helpful in understanding how the survey administration proceeded across the state and in identifying issues as DMHAS moves forward with the survey process. While not all recommendations could be acted upon, the changes to the survey process outlined below respond to several issues raised by programs across the state.
The survey will be conducted at the agency level rather than the program level.
For FY ’03, the focus of the survey was specific individual programs for which there was a contract requirement. For FY ’04, the focus of the survey will be upon agencies. Agencies are expected to conduct the surveys across all programs where the survey is a contract requirement; however, the results are to be reported by agency and not by individual program. This change responds to a number of issues that emerged following conduct of the FY ’03 survey, such as:
- Program participants resented having to complete multiple surveys. With the survey being conducted at an agency level, respondents will not be required to complete the survey on multiple occasions.
- Questions are too global and don’t address specific programs; respondents may not be able to focus on only one program if they are receiving multiple services from one agency; and the outcome measures (#17-23) are more reflective of services in general than of specific programs. Although agencies will be asked to conduct the survey at all programs for which there is a contract requirement, respondents will be asked to answer questions with reference to the agency and not to specific programs. This perspective should work both for respondents who are receiving only one service and for respondents who are receiving more than one service.
- Obtaining the required sample size for reporting was problematic for some agencies. The sample size will be calculated based upon agency-wide client counts rather than by individual program. As a result, the sample size may be smaller than the sum of responses that were needed when the sample size was based upon individual programs.
Sample size requirement is being increased to improve accuracy of results.
For FY ’03 the sample size corresponded to a 95% confidence interval of +/- 10%. For FY ’04 the sample size will be adjusted to provide a better estimate of overall satisfaction with DMHAS services. Despite the increase in sample size, for many agencies the number of required surveys, on an agency-wide level, may be smaller than for FY ’03.
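For illustration only, the number of surveys needed to reach a given confidence interval can be estimated with the standard sample-size formula for a proportion, combined with a finite population correction. The short Python sketch below reflects that general approach under assumed values (a 95% confidence level, worst-case proportion of 0.5, and hypothetical client counts); it is not the exact DMHAS calculation.

    import math

    def required_sample(population, margin=0.10, z=1.96, p=0.5):
        # Illustrative only: sample size for estimating a proportion at a 95%
        # confidence level (z = 1.96) with worst-case p = 0.5, then a finite
        # population correction. Assumed approach, not the DMHAS formula.
        n0 = (z ** 2) * p * (1 - p) / (margin ** 2)      # about 96 for +/- 10%
        return math.ceil(n0 / (1 + (n0 - 1) / population))

    # Hypothetical agency serving 400 unduplicated clients across four programs of 100 each:
    print(required_sample(400))          # roughly 78 surveys agency-wide at +/- 10%
    print(4 * required_sample(100))      # roughly 200 surveys if sampled program by program
    print(required_sample(400, 0.05))    # roughly 197 surveys at +/- 5%

Under these assumptions, a tighter confidence interval requires more surveys, and a single agency-wide sample is generally smaller than the sum of separate per-program samples, which is consistent with the two changes described above.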
Changes in wording of questions. The following changes are being made for the FY ’04 survey:
- For questions that reference “program”, the word “agency” will be substituted.
- #9: I felt comfortable asking questions about my treatment and medication.
  Change to: I felt comfortable asking questions about my services, treatment and/or medication.
- #13: Staff respected my wishes about who is, and who is not, to be given information about my treatment.
  Change to: Staff respected my wishes about who is, and who is not, to be given information about my treatment and/or services.
There were additional recommendations for changes in wording; however, DMHAS continues to be invested in maintaining as much consistency as possible with the MHSIP Survey so as to enable comparability on a national level and to comply with federal reporting requirements. For those questions that do not apply to certain respondents, the provision of clear instructions will be important to avoid confusion.
Changes in demographic section. On the FY ’03 survey, many Hispanic or Latino respondents did not go on to complete the second part of the demographic question. On the FY ’04 survey the question relating to Hispanic/Latino respondents has been reworded for clarity (If you are of Hispanic/Latino origin, are you: Puerto Rican / Mexican / Other Hispanic/Latino?).
Section related to length of time in service will be removed. It is being replaced by a section that asks respondents to note the type of services they are receiving within the agency, which will allow some service-type-specific data to be compiled. This change is also in keeping with the shift in focus from the program level to the agency level; for respondents receiving multiple services, a question regarding length of time in service could not be analyzed.
DMHAS does consider length of service to be an important element to examine in relation to satisfaction. However, at this point a choice had to be made as to which data would be captured; length of service may be reintroduced into the survey in the future.
Supplemental Program Report. The Supplemental Program Report will be condensed for the FY ’04 survey cycle. Agencies wishing to provide more feedback regarding the survey or the process are free to do so at the time their reports are submitted.
ADDITIONAL NOTE
Agencies/programs may add specific questions of interest, including program-specific questions, at the end of the standardized DMHAS survey; this option was also available for the FY ’03 survey. Results from any additional questions will not be reported to DMHAS. Agencies will need to analyze these questions on their own, because the report tool is not flexible enough to accommodate additional questions.
In addition, agencies are free to incorporate an additional section that allows respondents to provide narrative feedback or suggestions regarding services at either the agency or program level.