A Conceptual Model of the Factors Affecting Education Policy Implementation

Abdulghani Muthanna 1,* and Guoyuan Sang 2

1 Department of Teacher Education, Norwegian University of Science and Technology, 7491 Trondheim, Norway
2 Faculty of Education, Beijing Normal University, Beijing 100875, China
* Author to whom correspondence should be addressed.

Educ. Sci. 2023, 13(3), 260; https://doi.org/10.3390/educsci13030260

Submission received: 14 January 2023 / Revised: 20 February 2023 / Accepted: 24 February 2023 / Published: 1 March 2023


Abstract

Implementing education policy is crucial for achieving policy goals, and several factors determine the success or failure of this vital endeavour. Drawing upon a critical review of the policy document, observational research, and in-depth, audio-recorded interviews with 93 participants, this article reports on the key factors that hindered the implementation of the national strategy for the development of higher education (NSDHE) in Yemen. The main factors are the implementers’ lack of information combined with a lack of commitment by university leaders; top-down strategy planning; a lack of funding coupled with financial corruption; the absence of an institutional strategy; the presence of eco-political challenges; and a lack of basic infrastructure (e.g., classrooms, teaching aids, offices, and toilets). The article also provides a simple conceptual model of these key factors. The findings offer direct benefits that will help policy formulators and implementers enhance the formulation and implementation of present and future education policies.

1. Introduction

Education policy is a specific set of goals that academic institutions need to achieve within a specific period of time. Since there are different stages of education, there are also different education policies to prepare appropriately for these stages. In several contexts, studies have investigated the implementation of education policies and strategies to foster and achieve the sustainable development of education quality [1,2,3,4].

In Yemen, this is the first study to investigate the implementation of the National Strategy for the Development of Higher Education (NSDHE) from the perspective of key university implementers. This paper mainly focuses on answering the following overarching research question: what key factors led to success or failure in implementing the NSDHE at higher education institutions in Yemen? In particular, we seek (i) the perceptions of senior university leadership, academic staff, and students regarding the strategy’s implementation, and (ii) the main problems and challenges preventing the implementation of the strategy. Significantly, this study highlights the main factors that led to the failure of the NSDHE and conceptually frames these influential factors in a simple model for both policy formulators and implementers to consider. Other places with a context similar to Yemen’s may directly benefit from these findings and thereby enhance the formulation and implementation processes of present or future education policies. First, we give a brief presentation of the context of the study in terms of higher education and the current situation.

1.1. Higher Education and the Current Situation in Yemen

In Yemen, higher education began with the establishment of two colleges of education in Sana’a and Aden in 1970–1971. These two colleges developed into universities that included several colleges of education in other cities. Until 1990, only these two universities existed in Yemen, and their main mission was preparing teachers for basic and secondary education. Currently, there are nine state universities in Yemen: Sana’a, Aden, Taiz, Hodeida, Thammar, Amran, Ibb, Shaba’a University, and the Hadhramout University of Science and Technology. Shaba’a University was recently established in the governorate of Ma’reb. These institutions of higher education comprise several different colleges that include many departments related to both the natural and social sciences. Moreover, private universities have expanded threefold relative to public ones in recent years. (This pattern is due to the current war, which has left professors without salaries; as a result, many also teach at private universities.) Both state and private universities recruit undergraduate students for four or five years, depending on the programme. Only a few graduate and post-graduate programmes are offered at these universities. Thus, many graduate candidates are prepared outside the nation.

The system of decision-making is heavily bureaucratic in the Ministry of Higher Studies and Scientific Research, which is responsible for higher education institutions. However, universities have full autonomy with respect to teaching and administrative practices. This results in several challenges with regard to the improvement of higher education quality. These challenges include the presence of many administrative and academic problems [5,6,7,8], a high rate of unemployment [9], the migration of teachers [10,11], the absence of fair implementation of the laws concerning the appointment of academics or administrators, the use of personal power and nepotism [8,10], the absence of a research ethics code [12], the use of universities as political arenas [13], and, currently, the presence of war [14]. A further challenge is associated with the unclear roles of administrators both in the Ministry of Higher Studies and Scientific Research [7] and at universities [13]. Additionally, there is a continuous deficit in the governmental budget, which may stem from improper management and distribution of financing among sectors or from corrupt officials. It is reported that “corruption is endemic” in Yemen [9] (p. 22).

Since the Arab Spring in general and 2014 in particular, Yemen has undergone serious civilian and foreign conflicts that have posed a threat to people’s lives, especially those of students, who are targeted for recruitment by armed groups [14,15]. There are also regional and international interests [16] that prevent stability in Yemen. The instability resulting from this internal and external conflict has a strong effect not only on basic and higher education but also on the entire community. Schools and universities have been closed down in many cities. Furthermore, the higher education institutions in the north are currently governed by the so-called Huthi government. The following is a brief overview of the NSDHE policy.

1.2. A Brief Synthesis of NSDHE Policy

The NSDHE was prepared in collaboration with an external professional and included the relevant ministers, vice ministers, and chancellors of state and private universities in the nation. The inclusion of key administrators motivated the planners to use an iterative approach. However, it is surprising that teachers and students (the most important implementers, especially with regard to conducting teaching, learning, and research activities) were not included in the preparation of this strategy.

It is worth noting that the higher education ministry intended to conduct reforms that would lead to better quality in higher education in the nation. The strategy applied a strengths, weaknesses, opportunities, and threats (SWOT) analysis to identify precisely these features. Based on the SWOT analysis, the strategy’s proposers indicated many serious problems that higher education suffers from and that need to be addressed immediately. Problems related to institutional governance (autonomy and structure), financial resources (e.g., educational expenditures, salaries, overseas scholarships, and investment expenditures), human and physical resources, student numbers, equity, pedagogy, quality, research, and service are all mentioned in the strategy [17].

The NSDHE is a 5-year education policy to be implemented between 2006 and 2010 at all higher education institutions in the nation. The policy document states the vision of the Ministry of Higher Studies and Scientific Research (MHSSR) as follows:

To create a higher education system characterized by quality, broad participation, multiple and open routes vertically and horizontally, that is effective and efficient and delivers quality programs, shows excellence in teaching, research and service to society, and enhances Yemen’s quality of life.

The MHSSR translated the above vision into 16 broad mission statements, which led to the formulation of four key objectives: (1) governance and institutional governance: ensuring the cooperation of related ministries in developing higher education and the establishment of a supreme council of higher education; (2) resources: providing sufficient resources to create a high-quality system (financial and human resources as well as curriculum renewals); (3) teaching, research, and service: ensuring that high-quality teaching, research, and services are provided to meet the needs of Yemen and its people; and (4) institutional diversification: guaranteeing that higher education is developed in a diverse way and permits a variety of institutions to meet diverse needs [17] (p. 61).

The proposers of the strategy included a two-page report outlining the implementation of the strategy. Although the implementation procedures are not detailed, the main points concern major expenditures. Furthermore, the report advises the Ministry of Higher Education to prepare a plan for the future financing of tertiary education and a systematic renewal of laboratories and equipment infrastructure. Nevertheless, the strategy provides a clear statement about the implementation mechanism, stating, “the strategy will need to be followed by an implementation plan, in which the government identifies in detail the specific steps required to implement each of the recommendations” [17] (p. 114). In addition, the strategy document indicates that the strategy will need a regular review to monitor the plan’s progress and revise it [17].

Above all, the Ministry of Higher Studies and Scientific Research has produced a document in which 12 goals (with a few sub-goals) are stated and generalized for all higher education programmes in Yemen [17] (see pp. 12–13 for details). For undergraduate, graduate and post-graduate programmes, the goals are the same. One main instrument for achieving these goals is the implementation of the NSDHE. In other words, the implementation of the strategy implies the realization of the stated goals. The following is a brief definition of the policy makers and policy implementers with respect to the current study.

1.3. Policy Makers and Policy Implementers

In this case, the policy makers who proposed the NSDHE are government officials in the top-level administration. Specifically, the proposers are the ministers and vice-ministers of higher education, education, and other relevant ministries. Furthermore, university chancellors and vice-chancellors are considered policy makers, as they were engaged in the preparation of the policy. However, university chancellors and vice-chancellors are also considered policy implementers, as they have the legal authority to oversee how the new strategy is implemented. They are, in fact, the main policy implementers, as they directly lead higher education institutions. Further, these university academic administrators (e.g., chancellors, vice-chancellors, deans, and vice-deans) also teach some courses and supervise master’s and doctoral theses, and are therefore considered direct implementers of the prepared education policies.

Policy implementers are those actors who try to implement policies and strategies by interpreting and translating them into actual practices [18]. Thus, these implementers are those who are directly involved in executing a certain strategy or policy. In this study, the implementers are bureaucratic and academic administrators (university leadership: chancellors, vice-chancellors, deans, vice-deans, and chairpersons), teacher educators, and teacher students. Teachers and learners (and an institution’s administrators) are the key implementers of an education policy that mainly targets improvement in the teaching-learning and research processes. However, it is important to note that teacher educators and teacher students were recruited only from the College of Education, as this is the college that is meant to prepare teachers to teach various school subjects. The following briefly discusses the formulation processes of education policy.

1.4. Education Policy Formulation: A Brief Overview

Policy formulation involves struggles for meanings [19], i.e., a kind of linguistic action using discourse [20] that can be found in an interrelated set of texts and the practices of their production, dissemination, and reception [21] (as cited in [22] (p. 650)). Education policies are formulated in the form of documents compiled at the top level or by institutions, such as official reports, meeting minutes, plans, events, and talks [18]. Such policies are prepared for implementation in schools, colleges, and universities. Policy implementers attempt to implement policies by interpreting and translating them into actual practice. Policy interpretation (decoding) involves reading policy documents to understand them and determining how the policies relate to the implementers. Achieving this decoding leads to translation (recoding), which is the actual application of the policies by following particular strategies, such as talks, meetings, and official reports [18] (pp. 619–621). With the help of the recoding process, policies are enacted in schools and classrooms. Policy enactment is defined as the process of interpreting and translating policies into contextualized practice [23].

Previous studies have reported three major approaches to planning an education strategy or policy: top-down [24], bottom-up [25,26], and iterative approaches [27]. The top-down approach mainly prescribes certain rules for implementers to follow [28,29]. In contrast, Hill and Hupe [30] argued that interactions between variables and efforts to coordinate the process of implementation are crucial; this is known as the bottom-up approach. The purpose of the iterative approach is to link policymaking and implementing activities. Iterative studies attempt to “recognize how communication and evaluation activities create linkages … between the policy-making and the organizational environments, potentially allowing policymakers and organizational actors to reach mutual accommodation over time” [27] (p. 453).

Education policies produced in a hierarchical way do not consider the perspectives of students [31]; therefore, it is perplexing why such policies are made for students without their engagement. Smyth and Robinson [32] suggested the concept of “policy deafness”, which captures the problem of a policy that removes students from mainstream schooling and places them in different learning programmes. The authors indicate that such an inappropriate policy does not respond to the actual needs of students. Drawing on the ideas of Butler [33], Gowlett, Keddie, Mills, Renshaw, Christie, Geelan, and Monk [34] used the term “policy reception” to interpret the attitudes of teachers towards a particular policy; Gowlett et al. [34] contend that teachers’ reception of a policy is formed through pre-existing ideas and normative understandings (p. 152). Teachers’ teaching quality is a primary concern of higher education policies, and this teaching quality differs from one context to another. However, education policy should comprise at least the creation of leading and responsive knowledge, collaborative teaching efforts, and continuous training for life-long learning [35]. The following is a brief discussion of a specific education policy’s implementation and the key factors that influence it.

1.5. Education Policy Implementation and Key Influential Factors

Policy implementation is the “Achilles’ heel of human service delivery” [36] (p. vii), and this weakness indicates the difficulty of the policy implementation process. It is considered a type of research that focuses either on revealing the interactions among policies, people, and places and showing the outcomes of such interactions [37], or on how policy objectives are translated into practice [38]. Honig [37] also advises researchers on education policy implementation to begin by building knowledge about policies, people, and places and how they interact with each other. Policy implementation researchers have considered “the problem of education policy implementation as one of teacher learning”; hence, teachers need to be considered learning individuals who should think of learning as an ongoing process [39] (p. 25). In reality, policy implementation research ‘does not fully examine all implementers’; therefore, conducting critical policy implementation studies in different contexts provides new insights about policy implementation and research on implementation [40] (p. 14).

Because policy-making and implementation support each other, the implementation process is an indispensable constituent of the policy-making process [41]. In other words, to assess whether the goals of a policy are well planned and achievable, policy-makers and reformers must pay attention to the evaluations of the implementers. Preferably, implementers’ evaluations are gathered shortly after the policy is planned and distributed. Moreover, implementation studies should be conducted continuously to reform the policy goals and develop a sound education policy that increases and strengthens education quality [25,42].

Achieving the goals of an education policy demands the presence of certain instruments [43], such as mandates, incentives, and capacities [44]. Policy instruments are tools that policy implementers use to achieve a target [45,46]. These instruments can vary across nations depending on the local socioeconomic issues, and they can be either soft or hard. Whereas soft instruments include policy recommendations, guidelines, informational devices, and school evaluations, hard instruments contain legislation and regulations with a centralized, rigid, compulsory application [47] (pp. 1–3).

Previous studies have reported various factors that affect the implementation process. The conditions of the implementing institution, the clarity of the policy, and the communication of standards play major roles in the failure or success of an implementation [48]. Furthermore, researchers have reported that conflicts between policy makers’ interests and those of implementers produce failed implementations [43,49]. Unclear university missions, ambiguous roles of implementers, weak policy awareness, and conflicts of interest negatively affect both teachers’ perceptions of an education policy and its implementation [4]. It should be noted that implementers’ expertise and commitment have an impact on a policy’s implementation [50,51]. Funding plays a major role in the failure or success of education policy [1]; the wording of policy objectives [52,53] and implementers’ interpretations [53] also influence the implementation. The following section details the research data collection methods, participants, and data interpretation methods.

2. Research Design

2.1. Research Methods

This study followed a qualitative case study methodology, which is mainly used in studies of social phenomena while retaining the characteristics of the events under study [54]. Following Yin [54], this study adopts a multiple-case (embedded) design. Primarily, a single case study can focus on one organization/institution and can “involve more than one unit of analysis” [54] (p. 50) to report outcomes about all units of that organization/institution. Such units are known as embedded cases/units. In this sense, the authors used the single-case embedded design within the same higher education institution, as there are many units (e.g., deans, vice-deans, teachers, students, bureaucratic administrators, and university leaders) within the same academic institution. However, the inclusion of five participants from the Ministry of Higher Studies and Scientific Research at a later stage added another institution; therefore, this study is considered a multiple-case study.

The primary author began by critically reviewing the NSDHE document, The National Strategy for the Development of Higher Education in Yemen, published in 2007.

This critical review helped the primary author understand the policy document and prepare interview guidelines to investigate its actual implementation. Then, the primary author piloted the interview questions twice. One pilot was conducted by telephone with one teacher educator and one teacher candidate in the Arabic Teacher Program at one public higher education institution in Yemen before the defence of the research proposal. After this defence, the primary author travelled to Yemen, the context of the study. Next, the primary author piloted the interviews again with one teacher candidate and one instructor from a Psychology Department, one vice-dean, and one bureaucratic administrator. Then, the primary author employed observational research to closely observe how the teaching process is performed, how students react to it, and the availability of teaching and research resources. Moreover, the primary author visited all of the buildings of the university and observed how they are structured, as well as the availability of classrooms, teaching aids, restrooms, offices, library facilities, etc. The primary author also observed some of the lectures. While working as an inside observer, the primary author developed a greater understanding of how the teaching, research, and administrative processes are conducted in the institution. Taking notes based on these continuous observations increased the researcher’s knowledge of the issues under study, and this process continued until the end of the data collection. For example, student and teacher interviewees reported a lack of sufficient classrooms, teaching aids, offices, and other facilities. Therefore, it was necessary for the researcher to visit all of the buildings and assess the veracity of these statements. These observations provided background familiarity with the research context and also helped us reformulate the interview question guidelines. Based on the critical review of the policy document and the observations, the final draft of the in-depth interview guidelines was prepared.

Although the study participants belonged to different groups (students, teachers, and administrators), the main focus of the interviews was similar. The interview questions were divided into two parts. The first part focused on obtaining general information about the participants and their administrative, teaching, or learning experiences; it was also used to facilitate interactions [55]. Furthermore, this first part assisted in providing descriptions of the study participants, which is necessary for critical qualitative research studies. The second part of the interviews addressed two focal dimensions. The first dimension encompassed an inquiry into the participants’ perceptions of the stated objectives of the NSDHE and their implementation at the institution. The second dimension was based on the participants’ experiences and concentrated on exploring any factors that hindered the implementation.

In detail, the interview guidelines for chairpersons and instructors focused on whether the NSDHE goals were being implemented and the reasons for their status. Furthermore, our guidelines focused on investigating whether their programmes had developed their own policies and the challenges facing them. We had a similar focus in the interviews with administrators, namely, the chancellor, vice-chancellors, deans, vice-deans, and bureaucratic administrators, with a further focus on the processes of educational reform (e.g., the institutional strategy), the engagement of teachers and students in such activities, and funding issues. Regarding student teachers, the interviews focused on recording their reflections on their teaching programmes, the resources available for learning and teaching, and how the programmes prepare them for performing teaching activities. These participants were also asked whether they had information on the NSDHE or were involved in any policy preparation for the programmes, colleges, or the university. Most importantly, the interviews inquired into the main problems and challenges facing the interviewees during their studies.

After the interview questions were translated into Arabic, the mother tongue of the primary author and the interviewees, two colleagues versed in both Arabic and English checked the accuracy of the translation. The interviews were of two types, but all were face to face. One type was the one-to-one interview, and the other was the focus group interview. The focus group interviews were conducted with all of the student teachers, who were mostly female. Culturally speaking, it is unacceptable in most cases for a male interviewer to interview a female outside the university. Furthermore, as they were the top students in most programmes, the female teacher students were shy about being individually interviewed even inside the university. However, with the help of some teacher educators who allowed and encouraged the top students to leave their classes, the interviews were conducted inside the university; most interviews occurred in the university library. The interviews with students were focus groups in which the researcher acted as a moderator. For every teacher education programme, there was one focus group interview (with two to five people in a teacher student group). The focus group interviews with teacher students were conducted prior to interviewing the teacher educators and administrators. (This setting also helped with critical reflections during the interviews with teachers and administrators.) The focus group interview is particularly suitable for challenging one’s views and those of the focus group members, leading to more realistic accounts of the interviewees’ thoughts [56], especially in the case of academics. Focus group interviews also occurred with three teacher educators in the Administration and Educational Foundations Program, two educators in the Psychological Teacher Education Program, and three educators in the Physics Teacher Education Program. This interview pattern was based on the educators’ preference. Each of these focus group interviews was conducted in the afternoon at one of the group members’ houses. All of the other interviews with teacher educators and academic and bureaucratic administrators were conducted individually. On the basis of the interviewees’ preference, the interviews were conducted within the precincts of the university (mostly in offices or the library) or at interviewees’ homes. Additionally, there were many informal interviews with different participants while visiting the research site, either to take part in an interview or to arrange a new interview appointment.

2.2. Stages of the Interviews and the Selection of Participants

The interview process consisted of four stages. The first stage was used to introduce the general idea of the topic to the interviewees; their consent to participate was obtained and appointments were made for further interviews. The second stage comprised mainly in-depth interviews, which lasted for 2 h. This stage focused on ascertaining the participants’ perspectives on the objectives of the NSDHE and their implementation at the institution; it also focused on exploring any factors that hindered the implementation. The initial interpretations of the collected data demanded further inquiries, which led to the third stage. In this stage, the primary author explored specific issues and confirmed that the initial interpretations accurately reflected the interviewees’ intentions. Deeper interpretations demanded the inclusion of some participants from the MHSSR, which constituted the fourth stage. However, it is important to note that the interviews with five participants from the MHSSR were carried out through telephone calls (due to the presence of war, which precluded face-to-face interviews). The main purpose of these telephone calls was to investigate whether the newly established supreme higher council is meeting its responsibilities. (This forced procedural change is why these interviewees are not included in the table of the study participants).

The selection of interviewees was based on certain criteria. Bureaucratic administrators had previous administrative experience and were, at the time of the interviews, involved in different types of bureaucratic administration. Academics (such as the chancellor, vice-chancellors, deans, vice-deans, chairpersons, and instructors) were involved in teaching and administrative duties and held PhD qualifications. Some were also leading private institutions at that time. Furthermore, the student teachers were in the last semester of their studies. The primary author also informally interviewed five managers who were working in the MHSSR. The total number of interviewees was 98.

2.3. Research Participants

The research participants numbered 98 in total. However, the five participants from the MHSSR were informally telephoned for the sake of assessing whether the ministerial higher education council was meeting its responsibilities at that time. Therefore, they are not included in the following Table 1, which gives details of the 93 participants from the same public higher education institution in Yemen. The primary author gave all of the participants pseudonym codes to protect their anonymity. Table 1 also gives the participants’ age, gender, role, and academic ranking.

2.4. Research Interpretation Methods

Systematic data interpretation is crucial, as the whole work was performed to reach logical conclusions and report significant findings. In other words, developing ideas from the data and relating them to the literature and to broader categories and concepts [57] is the most crucial part of the entire study. Thus, reviewing the literature is also used as a method of obtaining findings. Still, developing ideas demands careful interpretation methods that lead to robust findings. As qualitative researchers deal with copious quantities of words, they need to profile them carefully. The primary author profiled every interview through verbatim manual transcription. The researcher gave a pseudonym code to each profile/participant to maintain their anonymity. The data derived from the observations and interviews form a large corpus.

The primary author critically reviewed the education policy document under study. This review process included highlighting and extracting important texts and notions that the primary author needed to investigate to comprehend how implementers perceived and implemented the policies. For example, the studied policy goals were extracted and used in the interview questions to investigate how the implementers perceived and implemented them. Further content analysis was used to compare what the policy document contains and states with what the participants think and do in real-life situations. Important policy quotes were also used while reporting the findings in order to enhance and explain them.

After transcribing the interviews into Arabic, the primary author followed the pioneers of grounded theory analysis methodology [58,59,60] and began to read the transcripts extensively and highlight certain texts and concepts. This procedure is a method of categorizing the data into smaller chunks (open coding). This open coding helped the primary author organize the highlighted concepts and texts from each transcript and combine them into a new file for comparison. After reading the selected texts and concepts, the primary author labelled each selected text (axial coding). Further extensive reviews of the data enabled comparisons of the coded segments, which led to the selection of the core categories among the many identified (selective coding).

An example makes it easier to understand how these coding processes were carried out. The primary author read the transcripts and found texts that focus on different factors behind the failure of the NSDHE implementation. These factors relate to the implementers’ lack of information on the education policy goals, neglect by the institution’s leaders, and other issues of that type. Then, the primary author highlighted these texts with a different colour for each factor. After making these notes across all of the transcripts, the texts were studied intensively to attach labels such as teachers’ dissatisfaction with the university administration, students’ and teachers’ lack of information regarding the NSDHE goals, implementers’ perceptions of the NSDHE goals, and factors in the NSDHE implementation failure. These categories were then grouped into larger categories (such as implementers’ lack of information and administrators’ lack of commitment). Further intensive reading helped uncover the core category: factors behind the non-implementation of the NSDHE. This coding technique facilitated the process of identifying the most important categories that detail the main factors in the implementation failure. The use of interpretational analysis helped summarize and interpret the selected texts and the themes emerging from them [61]. Another example of coding is the presence of texts highlighting problems related to the lack of teaching resources such as toilets, offices, classrooms, and laboratories. For instance, all of the texts relating to laboratories were gathered into one profile with the label ‘labs’. The same compilation was applied to all of the other texts related to the other mentioned problems. Then, the primary author connected all of these problem texts from one research group’s participants (for example, teacher students). The same process was also applied to the teacher educators. The third stage focused on comparing the statements of all of the participants under each particular label. This comparison led to specific categories such as lack of labs, lack of teaching aids, and lack of teaching/learning classrooms. Making comparisons across all of the texts is important and led to a final, larger category (e.g., a lack of infrastructure) that comprises all of the previous smaller categories.
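For readers who prefer a schematic view of this roll-up from open codes to a core category, the following minimal sketch (a purely illustrative Python fragment; the transcript IDs, excerpts, labels, and grouping are hypothetical and do not reproduce the authors’ actual data or tools) shows how individually coded segments can be grouped under axial labels and then subsumed under one selective core category:

from collections import defaultdict

# Hypothetical openly coded segments: (transcript ID, excerpt, open code).
# The excerpts, IDs, and labels below are illustrative, not actual study data.
open_codes = [
    ("T01", "We have no classrooms for large groups", "lack of classrooms"),
    ("T02", "The physics lab has no working equipment", "lack of labs"),
    ("T03", "There are no projectors or other teaching aids", "lack of teaching aids"),
    ("T04", "Six teachers share one office", "lack of offices"),
]

# Axial coding: group the open-coded segments under their labels for comparison.
axial_categories = defaultdict(list)
for transcript_id, excerpt, label in open_codes:
    axial_categories[label].append((transcript_id, excerpt))

# Selective coding: merge the related axial labels into one core category.
core_category = "lack of infrastructure"

print(f"Core category: {core_category}")
for label, segments in axial_categories.items():
    print(f"  {label}: {len(segments)} segment(s)")

Running this sketch prints the single core category with the axial labels and the number of segments supporting each, mirroring the manual comparison and grouping described above.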

Because it is logical that not all statements from participants reflect reality as it is, the primary researcher employed the technique of ‘reflective analysis’ [61] (backed up by field observations) to enhance the interpretation process and to provide robust evidence. Part of the data needed to be translated into English (with a word-by-word translation). Two colleagues who are versed in both Arabic and English checked the accuracy of the translation. The primary author also employed the ‘member check’ technique, which involved some participants (teachers in particular) deciding whether the interpretations of their interviews were accurate and reflected their beliefs and perceptions. Both authors also reviewed the emerging categories and interpretations in the later stages.

3. Research Evidence

The critical investigations led to the identification of many factors that contributed to the failure to implement the strategic objectives. The main factors are detailed below.

3.1. Implementers’ Lack of Information about the Strategy and University Leaders’ Lack of Commitment

The NSDHE document states that the policy needs ‘to be followed by an implementation plan, in which the government identifies in detail the specific steps required to implement each of the recommendations’ [17] (p. 114). The MHSSR (representing the government in this case) is responsible for preparing a detailed implementation plan and disseminating it to all of the relevant institutions. However, the analyses of the collected data revealed that the ministry failed to prepare such a plan; it did not provide universities with the necessary plan to implement the overall strategy. Many research participants revealed that they lacked information about the strategy and its content. In this regard, the participants made the following statements:

The strategy is not available online or at our university. Teachers read the document regarding the laws concerning higher education to know the rules about getting promotions or financial increments.