Distance learning in higher education depends on multiple factors spanning pedagogical and technical dimensions. This research focused on an assessment of the materials contained in an educational platform, showing results on the factors that significantly intervene in the ways of learning in Virtual Learning Environments. The study draws a sample from the educational platform operating at the Autonomous University of the State of Mexico. It was conducted following a cross-sectional descriptive methodology, assessing 19 Independent Study Guides and analyzing 13 fundamental indicators for each of them. For the research design, descriptive and correlational statistical techniques were used to analyze the foundation of the model. T tests were conducted to examine the discrepancy of factors that should show convergence. Results revealed areas of opportunity in contents, learning activities, activity evaluation with feedback, and the content of materials. The main inference was that, in order to build a distance educational model, the main components for learning should score a minimum of 7 out of the 10 possible points to guarantee quality education materials in educational platforms. The recommendations include prioritizing that a traditional educational model not simply be deposited in virtual form, which implies supporting the integration of educational materials with Virtual Learning Environments.
Keywords: Computer assisted learning, Distance education, Educational model, Educational platforms, Learning, Virtual learning environments.
Received: 2 February 2022 / Revised: 9 March 2022 / Accepted: 25 March 2022 / Published: 7 April 2022
This research establishes the efficiency and scope of the educational model through its virtual materials, as well as areas of opportunity that serve as a basis for the construction of the didactic pedagogical model required by universities for the evaluation of their educational programs in non-school-based modalities.
In the last decade, we have seen significant changes in higher education, with new educational models, new ways of thinking, and the boom of Distance Education (DE), where developments in Information and Communication Technologies (ICT) have fostered the transfer of knowledge, self-learning, and the understanding of multiple everyday actions (Khan & Markauskaite, 2017). Likewise, there has been an increase in the skills and attitudes improving the future of work (Faúndez, Bravo, Ramírez, & Astudillo, 2017), and these changes have generated, in every sector of society, a wide array of options allowing the emergence of previously unstudied distance models and concentrated educational platforms, thus seeking the development of educational processes through a technological scope (Gutiérrez Bonilla, 2016).
In addition, there have been new demands and challenges focused on the creation of scenarios prepared for the development of the academic process, which respond not only to technical, operative, and procedural aspects but also integrate a proposal that takes into consideration the practical elements efficiently contributing to the process. Delivered through virtual and technological scenarios, such a proposal will gain the students' acceptance of the environment and educational model (Blanco Martínez & Anta, 2016).
This research study, concerning the conception, development, and implementation of online materials, seeks to establish the foundations of the elements that Virtual Learning Environments (VLE) provide. VLE are defined as computer applications available to facilitate pedagogical communication between the parties involved in the teaching-learning process (Quiñones-Negrete, Martin-Cuadrado, & Coloma-Manrique, 2021). For these forms of education, originating in a society of knowledge and constant change, the purpose of a VLE is to ensure that all learning activities provide students, regardless of their level, rapid access to information. This is in addition to developments in the personalization of contents and, in correlation with the management of learning resources, forms an information system that produces an adequate perception for the user, allowing educational platforms to perform more efficiently (Fontalvo-Herrera, Delahoz, & Mendoza-Mendoza, 2018; Knijnenburg, Willemsen, Gantner, Soncu, & Newell, 2012).
As far as VLE are concerned, we must take into consideration that, in the learning-building process, the student is the owner and is responsible for it taking place, and students must establish the instructions and objectives. In every discipline, contents have to provide elements of such quality that computer-assisted learning integrates contents, resources, and a wide variety of tools related to management and communication. These tools can be accessed within the platform in use, so that the student finds a virtual classroom full of resources (Cedeño & Murillo, 2019; Liyanagunawardena, Williams, & Adams, 2014).
At the same time, it is important to revise, adjust, and personalize the curriculum, thus committing to the quality of the contents in the virtual classroom through educational platforms (Avello Martínez & Duart, 2016). It is important to ensure adequate rapport and communication among all education stakeholders, mainly between professor and students, as well as information integration, didactics, and the environment. In integrating pedagogical strategies, the intention must be to plan, develop, and execute actions so that the student learns and the teaching produces knowledge and assessment, thus scaffolding knowledge while keeping the student aware of his or her own evolution (Herrera, 2014).
There is a significant difference between planning a class for the traditional classroom model and planning for a learning environment with a broad conception of resources and activities in a virtual environment, one that integrates guides leading to student self-learning with the support and guidance of the professor (Rubio & Abreu, 2016).
Therefore, with the support of didactics, we can improve student productivity and increase quality, based on theories about how people learn. It is very important that, throughout their development, different components are considered, such as autonomy, revision, appraisal, and self-guidance, so that they admit adaptability with regard to the operation and application media, with the recommendation that they be goal-oriented and filtered according to authenticity, diversity, pertinence, transparency, adequacy, insertion, and knowledge (Ausín, Abella, Delgado, & Hortigüela, 2016).
Gómez et al. (2015) asserted that the technological component must be associated with the computer-based communication system and must bind it to the stakeholders of the pedagogical teaching-learning process. It has to be based on "a multidirectional form of communication, so that each student participates actively and in an autonomous manner" (Saza-Garzón, 2016), with a "flexible and dynamic system that is adapted to the environment where it takes place" (Juca, 2016), with shared resources, stability, comfort, and specific support for group activities.
Several researchers discuss the importance of assessing a DE model from different perspectives: some from an organizational standpoint, others from a pedagogical or a technological one. One perspective that is less commonly assessed, despite its connection with learning, is that of the measuring tools that gauge teachers' efficiency and the efficiency of education in virtual and digital environments for computer-assisted learning (Acón-Matamoros & Trujillo-Cotera, 2011; Aguilar, Ayala, Lugo, & Zarco, 2014; Ardila-Rodríguez, 2011; Gómez-Suárez, 2017; Rakic et al., 2020; Roig-Vila, Mengual-Andrés, & Suárez-Guerrero, 2014; Salas, Moro, & Pérez, 2020), under the premise that learning has to prevail over technological criteria.
Educational models, within their substantive functions, privilege the learning variable in students, which is ultimately what demonstrates the pertinence of this type of education. Hence the importance of assessing from different perspectives, such as that of Instructional Design (ID) and the correctness of the course structure. In online courses, the quality of the contents, the topics, the general presentation of the course, the goals and objectives, and the adequacy of the available material, among other aspects, are also important.
This research focused on presenting an evaluation of the contents for this type of educational model, by evaluating a sample of 19 Independent Study Guides (ISG) with a purpose-designed assessment instrument called "The ISG Assessment Matrix", which considers 13 fundamental aspects of ID that a course of good educational quality must contain, thus guaranteeing proper student learning.
The research was based on a quantitative paradigm and was non-experimental and descriptive cross-sectional, with a mixed focus (since we conducted some measurements and collected autonomous information). Such studies are well suited to scenarios where educational research is grounded in a real context (Roig-Vila, Mengual-Andrés, & Quinto-Medrano, 2015), allowing us to analyze the information to objectively describe reality by providing elements, specifying properties, main characteristics, and other relevant components found in people, classes, communities, or any other studied phenomena that can be incorporated into an analysis (Hernández-Sampieri & Mendoza, 2018).
Although quantitative and qualitative methodological strategies offer divergent perspectives, as the foundation for analysis the participating researchers designed a checklist to evaluate the virtual learning environments, treating the variables as a selection of present or absent elements classified into 13 groups.
The general indicators included in the design of the Checklist to Evaluate the Independent Study Guides were: 1) Program, 2) Program Indicators, 3) Schematic Contents, 4) Curricular Support Materials and Bibliography, 5) Study Methodology, 6) Specific Orientations for Study, 7) Evaluation to Pass the Course, 8) Development of Contents, 9) Writing and Presentation of the Material, 10) Use of Open Educational Resources, 11) Use of Technological Tools Available in the Educational Platform, 12) Activities Related to the Methodology, and 13) Inclusion of Evaluation Tools.
The first 7 sets of assessed elements have relatively lower values, with scores quantified between 0 and 1; they form part of the fundamental structure of the ID and of the basic structure of the Virtual Learning Unit. The remaining 6 categories take higher values, with the 13 categories adding up to a total of 10 points in the ISG assessment. The highest-value category is Writing and Presentation of the Material, representing 50% of the total score, and a minimum of 8 points is required to rate an ISG as acceptable and publishable within an educational platform. The weighting scheme is sketched below.
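For reference, the weighting scheme can be expressed compactly. The following is a minimal Python sketch, using the category weights implied by the reference denominators reported in Table 1; the helper function and its name are invented for illustration:

```python
# Category weights of the ISG Assessment Matrix (reference denominators in
# Table 1); the first 7 sum to 1.0 point and all 13 sum to 10.0 points.
WEIGHTS = {
    "Program": 0.1,
    "Program Indicators": 0.1,
    "Schematic Contents": 0.1,
    "Curricular Support Materials and Bibliography": 0.1,
    "Study Methodology": 0.2,
    "Specific Orientation for Study": 0.1,
    "Evaluation to Pass the Course": 0.3,
    "Development of Contents": 0.5,
    "Writing and Presentation of the Material": 5.0,
    "Use of Open Educational Resources": 1.0,
    "Use of Technological Tools Available in the Educational Platform": 0.5,
    "Activities Related to the Methodology": 1.0,
    "Inclusion of Evaluation Tools": 1.0,
}
assert abs(sum(WEIGHTS.values()) - 10.0) < 1e-9  # total ISG score is out of 10

def isg_total(scores):
    """Total ISG score out of 10; each category score is capped at its weight."""
    return sum(min(scores.get(category, 0.0), weight)
               for category, weight in WEIGHTS.items())
```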
This study considered a sample from the educational platform operating at the Autonomous University of the State of Mexico (Spanish: Universidad Autónoma del Estado de México, UAEM) of the Independent Study Guides of the 3 non-formal educational programs offered by the Cuautitlán Izcalli Professional Academic Unit (Spanish: Unidad Académica Profesional Cuautitlán Izcalli, UAPCI), from which we took 19 ISG to evaluate in order to determine the state of efficiency of the materials. We also attempted to find a relation between the impact on learning and the efficiency of this component of the educational model.
The measurements were conducted to evaluate the existing correlation between the variables fostering students' learning capacities, using descriptive and correlational statistical techniques. We analyzed the existing problems regarding the foundations of a model that showed control, was updated, and was adapted to learning processes with integrated ICT-mediated materials, resulting in Virtual Learning Environments for students that also take into consideration learning differentiation among people (Alshammari, 2020; Hernández, Casado, & Negre, 2016).
Once the data were collected, we examined the inferences over the sample and how each of the variables behaved, with the purpose of generalizing the state of efficiency of the educational materials and their impact on learning and the educational model.
Data analysis was conducted using the statistical software SPSS 24.0 and covered 3 aspects: first, the statistical indicators obtained from data collection; second, a contingency table (cross table) of the contribution of elements per educational program; and finally, one-sample T tests against reference importance values, providing status inferences for each set of categories.
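Although the analysis was run in SPSS, the same three steps can be reproduced with open tools. Below is a minimal Python sketch using pandas and scipy (not part of the study's toolchain); the column names and example values are invented for illustration:

```python
import pandas as pd
from scipy import stats

# Invented illustrative records: one row per ISG (column names assumed).
isg = pd.DataFrame({
    "program": ["International Business", "Logistics", "International Law",
                "International Business", "Logistics", "International Law"],
    "integrated_elements": [27, 28, 26, 30, 24, 27],
    "writing_presentation": [3.5, 4.1, 2.8, 4.4, 3.9, 3.2],
})

# 1) Descriptive indicators for a category, as in Table 1.
print(isg["writing_presentation"].agg(["min", "max", "mean", "std"]))

# 2) Contingency (cross) table of integrated elements per program, as in Table 2.
print(pd.crosstab(isg["program"], isg["integrated_elements"], margins=True))

# 3) One-sample T test of a category against its reference test value.
t, p = stats.ttest_1samp(isg["writing_presentation"], popmean=3.9)
print(f"t = {t:.3f}, two-sided p = {p:.3f}")
```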
Table 1. Statistical indicators and reference percentages of the assessment matrix per ISG category.

| ISG Category | N | Minimum | Maximum | Average Value of the Category / Reference Value | Percentage Obtained per Category (%) | Standard Deviation |
|---|---|---|---|---|---|---|
| Program | 19 | 0.100 | 0.100 | 0.100/0.1 | 100 | 0.000 |
| Program Indicators | 19 | 0.020 | 0.100 | 0.058/0.1 | 58 | 0.019 |
| Schematic Contents | 19 | 0.000 | 0.100 | 0.047/0.1 | 47 | 0.051 |
| Curricular Support Materials and Bibliography | 19 | 0.000 | 0.100 | 0.053/0.1 | 53 | 0.051 |
| Study Methodology | 19 | 0.000 | 0.200 | 0.147/0.2 | 74 | 0.090 |
| Specific Orientation for Study | 19 | 0.000 | 0.100 | 0.074/0.1 | 74 | 0.045 |
| Evaluation to Pass the Course | 19 | 0.000 | 0.300 | 0.221/0.3 | 74 | 0.136 |
| Development of Contents | 19 | 0.000 | 0.400 | 0.100/0.5 | 20 | 0.130 |
| Writing and Presentation of the Material | 19 | 2.080 | 5.000 | 3.794/5 | 76 | 0.844 |
| Use of Open Educational Resources | 19 | 0.630 | 1.000 | 0.757/1 | 76 | 0.114 |
| Use of Technological Tools Available in the Educational Platform | 19 | 0.000 | 0.500 | 0.421/0.5 | 84 | 0.187 |
| Activities Related to the Methodology | 19 | 0.170 | 1.000 | 0.894/1 | 89 | 0.224 |
| Inclusion of Evaluation Tools | 19 | 0.000 | 0.830 | 0.228/1 | 23 | 0.310 |
The diagnosis was conducted on the 3 non-formal, distance educational programs offered at the UAPCI since 2013: (1) International Business, (2) Logistics, and (3) International Law. These programs were based on feasibility studies conducted in the 4th Region of the State of Mexico, with the purpose of providing non-formal, quality, and innovative education, contextualized through the institutional educational platform SEDUCA of the Autonomous University of the State of Mexico (Autonomous Mexico State University, 2010).
We took a sample of 19 Independent Study Guides with the following participation per educational program: 31.5% International Business, 37% Logistics, and 31.5% International Law, covering different Learning Units belonging to the 3 training cores of the educational programs (basic, substantive, and integral). Data collection was planned for each variable category and summarized per educational program.
3.1. ISG Assessment Matrix
Table 1 shows the contribution of elements for the analysis of the Independent Study Guides (ISG): the average contribution per category relative to the reference value assigned to each one in the instrument. The minimum and maximum values are also shown, as well as the coverage percentage; 5 of the 13 categories did not reach the minimum required to meet the acceptance criteria for the contents integrated by the category.
3.2. Contribution of Elements per Program
The categories were integrated through a composition of subgroups, forming a total of 45 subtopics in the ISG Assessment Instrument. Table 2 shows, per program and in increasing order, the number of elements integrated in each ISG, from those integrating the fewest elements to those integrating the most. The concentrations show that the contribution of subtopics peaks at 27 of the 45 elements, reflecting the necessary minimum for the integration of virtual material in this educational modality.
Table 2. Integrated elements of the assessment categories per program.

| Program | Measure | 20 | 24 | 26 | 27 | 28 | 29 | 30 | 32 | 35 | Total |
|---|---|---|---|---|---|---|---|---|---|---|---|
| International Law | Amount | 0 | 0 | 1 | 2 | 0 | 0 | 1 | 0 | 1 | 5 |
| International Law | % | 0.0 | 0.0 | 5.3 | 10.5 | 0.0 | 0.0 | 5.3 | 0.0 | 5.3 | 26.3 |
| Logistics | Amount | 1 | 1 | 0 | 1 | 2 | 1 | 1 | 1 | 0 | 8 |
| Logistics | % | 5.3 | 5.3 | 0.0 | 5.3 | 10.5 | 5.3 | 5.3 | 5.3 | 0.0 | 42.1 |
| International Business | Amount | 1 | 1 | 1 | 4 | 3 | 2 | 3 | 2 | 2 | 19 |
| International Business | % | 5.3 | 5.3 | 5.3 | 21.1 | 15.8 | 10.5 | 15.8 | 10.5 | 10.5 | 100.0 |

Note: The numeric column headings (20 to 35) are the number of integrated elements, out of the 45 subtopics, found in an ISG.
3.3. Category Assessment per Value
Within the group of categories, we assessed each in order of importance within the contents. The category representing 50% of the total evaluation of the instrument was Writing and Presentation of the Material, composed of a subset of 12 elements.
Table 3 shows the T test for the sample, with reference values of 3.9 and 4.0 as critical values, considered on a scale of 5.0 contribution points. For an educational material to fall within an acceptable range, it must reach 80% integration for this specific category. This would indicate good material in an adequate Virtual Learning Environment, containing the fundamental contents needed to be operative and functional within the educational model for learning.
We took the critical value of 4.0 as representing the desired percentage, with a tolerance of 0.1 provided to allow for comparison. The probability that this sample matched the value of 3.9 was 59%, and only 30% for the value of 4.0. Both percentages indicated higher and lower values within an acceptance range below the standard deviation, suggesting that the hypothesis holds for the largest category and that the sampled ISG pass the proposed quality test.
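The t statistics in Table 3 follow directly from the summary values in Table 1 (mean 3.794, standard deviation 0.844, n = 19). A quick verification in Python (assuming scipy is available):

```python
from math import sqrt
from scipy import stats

n, mean, sd = 19, 3.794, 0.844          # summary statistics from Table 1

for test_value in (3.9, 4.0):
    t = (mean - test_value) / (sd / sqrt(n))   # one-sample t statistic
    p = 2 * stats.t.sf(abs(t), df=n - 1)       # two-sided p-value, df = 18
    print(f"test value {test_value}: t = {t:.3f}, p = {p:.3f}")
# Prints t ≈ -0.548, p ≈ 0.590 and t ≈ -1.065, p ≈ 0.301, matching Table 3.
```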
Table 3. T test for the sample of the category Writing and Presentation of the Material, with a value reference of 5.0.

| Category | Test Value | t | Degrees of Freedom (df) | Bilateral Significance | Difference in the Means | 95% Confidence Interval: Lower | 95% Confidence Interval: Upper |
|---|---|---|---|---|---|---|---|
| Writing and Presentation of the Material | 3.9 | -0.548 | 18 | 0.590 | -0.10614 | -0.5128 | 0.3005 |
| Writing and Presentation of the Material | 4.0 | -1.065 | 18 | 0.301 | -0.20614 | -0.6128 | 0.2005 |
3.3.1. Value Categories 1.0
Following the order of importance of the categories, three of them had a value of 1.0: Use of Open Educational Resources, Activities Related to the Methodology, and Inclusion of Evaluation Tools. These categories had a test value of 0.6 as the necessary minimum for the quality factor. Table 4 shows that, for the first two categories, 95% of the values lie within positive lower and upper intervals; hence the evaluation of the ISG sample was beyond the required minimum. The last category, by contrast, showed a negative mean difference, indicating that the Inclusion of Evaluation Tools should be addressed with clear and coherent performance criteria that describe what should be learned, facilitate the students, allow their work to be supervised and critiqued, and eliminate subjectivity in evaluation.
Table 4. T test for the sample of categories with a 1.0 value reference.

| Category | Test Value | t | Degrees of Freedom (df) | Bilateral Significance | Difference in the Means | 95% Confidence Interval: Lower | 95% Confidence Interval: Upper |
|---|---|---|---|---|---|---|---|
| Use of Open Educational Resources | 0.6 | 5.992 | 18 | 0.000 | 0.157 | 0.102 | 0.211 |
| Activities Related to the Methodology | 0.6 | 5.727 | 18 | 0.000 | 0.294 | 0.186 | 0.402 |
| Inclusion of Evaluation Tools | 0.6 | -5.223 | 18 | 0.000 | -0.372 | -0.521 | -0.222 |
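The 95% confidence intervals reported in Tables 3 through 7 can be recovered from each row's mean difference and t statistic. A sketch for the Use of Open Educational Resources row of Table 4 (assuming scipy):

```python
from scipy import stats

n = 19
mean_diff, t_stat = 0.157, 5.992              # Use of OER row in Table 4
se = mean_diff / t_stat                       # implied standard error of the mean
margin = stats.t.ppf(0.975, df=n - 1) * se    # two-sided 95% critical margin
print(f"95% CI: [{mean_diff - margin:.3f}, {mean_diff + margin:.3f}]")
# Prints roughly [0.102, 0.212], matching the table up to rounding.
```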
3.3.2. Value Categories 0.5
The sets of categories with values smaller than 1.0 were considered those forming and integrating the fundamental structure of virtual environment courses. They were not correlated to the disciplinary part of the Learning Unit but rather to the curricular structure. They provide formality to the structural elements that a learning community in a virtual environment must contain and give relevance to the information elements over ISG generalities.
Within this set of categories of curricular elements, Table 5 shows two categories, Development of Contents and Use of Technological Tools Available in the Educational Platform, with reference values of 0.5 and a test value of 0.3 as the minimum to comply with the structure. We observed that the first lacked fundamental elements, indicating that content is not developed for every topic and subtopic of the ISG indicated in the curriculum and in agreement with the Learning Unit, leaving gaps in subtopics that were consequently not considered important to integrate.
In the case of the category Use of Technological Tools Available in the Educational Platform, the mean difference and its lower and upper limits were all positive, showing an adequate relation by exceeding the required reference value. Therefore, in this category, all the tools available in the platform, such as questionnaires, e-mail, wikis, forums, and online virtual classrooms and sessions, were used adequately when required.
Table 5. T test for the sample of categories with a 0.5 value reference.

| Category | Test Value | t | Degrees of Freedom (df) | Bilateral Significance | Difference in the Means | 95% Confidence Interval: Lower | 95% Confidence Interval: Upper |
|---|---|---|---|---|---|---|---|
| Development of Contents | 0.3 | -6.697 | 18 | 0.000 | -0.200 | -0.263 | -0.137 |
| Use of Technological Tools Available in the Educational Platform | 0.3 | 2.817 | 18 | 0.011 | 0.121 | 0.031 | 0.211 |
3.3.3. Value Categories 0.2
Within the information structure of a non-formal course with virtual materials, the categories Study Methodology and Evaluation to Pass the Course grant value to an ISG as a fundamental informative part. Table 6 shows the distribution of these categories with a 0.2 reference value and a test value of 0.18. The distribution behaves normally, with lower negative values leaning towards 0 and higher positive values uniformly distributed. This indicated that there was clarity in the study methodology and in the criteria for evaluating the course.
Table 6. T test for the sample of categories with a 0.2 value reference.

| Category | Test Value | t | Degrees of Freedom (df) | Bilateral Significance | Difference in the Means | 95% Confidence Interval: Lower | 95% Confidence Interval: Upper |
|---|---|---|---|---|---|---|---|
| Study Methodology | 0.18 | -1.572 | 18 | 0.133 | -0.0326 | -0.076 | 0.011 |
| Evaluation to Pass the Course | 0.18 | 1.318 | 18 | 0.204 | 0.0410 | -0.024 | 0.106 |
3.3.4. Value Categories 0.1
The last section of the set of categories includes all the remaining ones comprising the curricular and structural requirements for the course, as part of the instructional design of an ISG and of virtual learning materials for educational platforms. Table 7 shows the contribution of the categories with a reference value of 0.1 points and a test value of 0.06, where the categories Program Indicators and Curricular Support Materials and Bibliography are accepted as being concentrated at the average test value as a minimum requirement.
The categories Schematic Contents and Specific Orientation for Study show a normal distribution around the average test value, indicating that the data are distributed and integrated uniformly with other elements, such as showing the contents of the Learning Unit and suggesting adequate study strategies and techniques. The category Program shows wide coverage across the set of collected samples.
Table 7. T test for the sample of categories with a 0.1 value reference.

| Category | Test Value | t | Degrees of Freedom (df) | Bilateral Significance | Difference in the Means | 95% Confidence Interval: Lower | 95% Confidence Interval: Upper |
|---|---|---|---|---|---|---|---|
| Program | 0.06 | 0.000 | 18 | 0.000 | 0.040 | 0.040 | 0.040 |
| Program Indicators | 0.06 | -0.490 | 18 | 0.630 | -0.002 | -0.011 | 0.007 |
| Schematic Contents | 0.06 | -1.073 | 18 | 0.297 | -0.013 | -0.037 | 0.012 |
| Curricular Support Materials and Bibliography | 0.06 | -0.626 | 18 | 0.539 | -0.007 | -0.032 | 0.017 |
| Specific Orientation for Study | 0.06 | 1.318 | 18 | 0.204 | 0.014 | -0.008 | 0.035 |
As can be seen, the state of efficiency of the ISG, in general, barely surpasses the minimum necessary to speak of quality educational materials. Writing and Presentation of the Material, covering all the topics and subtopics stated in the program, represents the highest value yet does not exceed the 4.0-point barrier that corresponds to 80% of the category requirement. Likewise, the elements measured by the Inclusion of Evaluation Tools are largely missing from the presented materials, preventing their due quantification and qualification; verification lists and rubrics, as evaluation guides, stand out as an area of opportunity and design when developing an ISG.
We adopted the following acceptance criteria in the analysis of an ISG and the grade obtained in its design by the involved parties: 1) From 0 to 5.9: complete redesign of the ISG was recommended, since it did not comply with the minimum form and structure requirements; it did not qualify as an adequate, quality ISG within the educational platform. 2) From 6 to 7.9: it was recommended to revise and update some topics, update the support material, avoid the excessive use of materials from other universities, and cite sources adequately; it did not yet qualify for publication and operation within the educational platform as an adequate, quality ISG. 3) From 8 to 10: it was recommended to address the observations of the assessment matrix; it met the quality standards for publication and operation within the educational platform as an adequate, quality ISG.
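These three acceptance bands map directly to a simple rule. A sketch in Python, with the function name invented for illustration:

```python
def isg_verdict(score: float) -> str:
    """Classify a 0-10 ISG score under the study's acceptance criteria."""
    if not 0 <= score <= 10:
        raise ValueError("ISG scores range from 0 to 10")
    if score < 6:
        return "Redesign completely: minimum form and structure requirements not met."
    if score < 8:
        return "Revise and update: not yet publishable on the educational platform."
    return "Address assessment-matrix observations: meets publication quality standards."
```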
Likewise, the ISG assessment with the verification instrument is intended to identify the present or absent elements within each assessed ISG. An ISG must comply with the first 8 categories, the required indicators for a first approval, since they constitute the substantive part of the ISG.
The study shows that, globally and locally, the educational programs fall within the second assessment criterion. The recommendation was therefore to revise and update some topics and support materials, to avoid the excessive use of materials from other universities, and to cite sources properly. This indicated that it was necessary to revise the status of every ISG in all three educational programs; the pertinence of the virtual educational materials offered cannot be guaranteed by this analysis alone. What can be fully asserted is that they did not qualify for publication and operation within the educational platform as adequate, quality ISG.
This study brought a perspective of the situation of the ISG by analyzing different elements for learning. The evaluation provided a direct reflection of the state of the educational model and of its pertinence for the educational type offered, in which priority must be given to not replicating a traditional educational model in virtual form, to correlation with innovation in emerging educational paradigms, and to a learning process that can be shown clearly and directly through the VLE, supported by the integration of the materials presented and the operating structure to date.
This research was conducted in the community of the non-formal educational platform of the Cuautitlán Izcalli Professional Academic Unit (UAPCI) of the Autonomous University of the State of Mexico, which offers the distance education professional programs. The sample revealed the scope of the operating educational model through the virtual educational materials contained in the educational platform, acknowledging the efficiency of the educational model. The purpose of this study was to highlight the current situation of program objectives, contents, learning processes, evaluation, communication between professors and students, school management, the development of the educational platform, and the Instructional Design that takes place in the platform. The latter becomes an integrating agent of educational processes and extends from analysis all the way to the implementation of the resource, in agreement with the type of program, with the unification of criteria and contents as a differentiating agent.
Results show areas of opportunity in the texts and information contained in the Independent Study Guides, which are the students' classroom. They also show that the presentation of information and the design of activities must maintain a high correlation, which is reflected in the degree of performance and learning that takes place. The development process through which educational materials are built is a determining factor in the Virtual Learning Environment, since it underpins the planning of every activity in the course, outlining the objectives required for learning to occur and providing added value to universities (Díaz, 2020).
Likewise, Vázquez, Méndez, Román, and López-Meneses (2013) stated that one of the greatest challenges in Distance Education is to produce materials that promote learning through activities, stimulate learning, and lead students to develop a massively active learning process founded on their own experiences, accompanied by a digital didactic sense and the coverage of integrating paradigms.
From the data collected, we were able to highlight that the program objectives were adequate, well supported, and that there was a coherence between the objectives and the program, the curricular area, and the Learning Units. The objectives were reached based on content design and activities included in the SEDUCA platform. Therefore, clearly defining the form and construction process of educational materials for Distance Education brought great benefits.
The teaching-learning process in the study showed a significant difference as an area that must be prioritized. It is considered the reflection of a face-to-face course undergoing a transition, which might be considered normal since those designing the materials are teachers; when facing a contingency such as a pandemic, they take better advantage of their abilities if they are technologically qualified, showing more willingness to integrate learning communities into this study modality.
We might say, then, that the university needs to train staff and implement actions to establish a model showing procedures, integrating pedagogical fundamentals and contemporary theories that can guarantee an ICT-supported model, but one that can also develop learning and communication in such a way that knowledge is constructed in students by means of Virtual Learning Environments.
In order to build a DE model for non-formal Bachelor's Degree programs, we took the evaluation aspect into consideration as an essential part of providing guidance to the educational model, resulting from the measurement of educational objectives, learning, and the functionality of the model per se. The study of the categories clearly showed the opportunity to improve the implementation of rubrics for the designed activities, the response times for their evaluation, the number of designed activities, and the degree of complexity of each one. Likewise, we must at all times avoid the impression of a face-to-face model merely placed on a virtual platform. Educational models based on instructional design generally present very generic proposals; it was therefore necessary to adjust them and, for this case in particular, to define the proper characteristics of a model contextualized to the needs of the studied population. For that reason, the bases for the evaluation of an educational model included participation and the design and development of clear measuring instruments, activities, and materials. These contributed to achieving the objectives set in the Learning Units, objective criteria from the professors, and a clear and precise understanding on the part of the students. Together, all of these elements complement each other so that DE and the educational model can make the most of quality education.
This research provided a general perspective of the most relevant aspects of the currently operating model. It showed the strengths and areas of opportunity of the elements considered in the design of a didactic pedagogical distance model that responds to needs, creates scenarios in which learning can take place in an optimal and scaffolded form, and makes dynamic use of the design of Virtual Learning Environments, contributing more than a face-to-face model reflected on a platform. This remains a challenge; nevertheless, this study paved the way for the construction of the didactic pedagogical model required by universities for the assessment of their non-formal educational programs.
Funding: This study received no specific financial support.
Competing Interests: The authors declare that they have no competing interests.
Authors' Contributions: All authors contributed equally to the conception and design of the study.
Acón-Matamoros, A. G., & Trujillo-Cotera, A. (2011). Evaluation of an online course: Quality criteria. Electronic Journal Quality in Higher Education, 2(1), 86-101. Available at: https://doi.org/10.22458/caes.v2i1.418.
Aguilar, I., Ayala, J., Lugo, O., & Zarco, A. (2014). Analysis of evaluation criteria for the quality of digital teaching materials. CTS: Ibero-American Journal of Science, Technology and Society, 25(9), 73-89.
Alshammari, M. T. (2020). Design and evaluation of an adaptive framework for virtual learning environments. International Journal of Advanced and Applied Sciences, 7(5), 39-51. Available at: https://doi.org/10.21833/ijaas.2020.05.006.
Ardila-Rodríguez, M. (2011). Quality indicators of digital educational platforms. Education and Educators, 14(1), 189-206. Available at: https://doi.org/10.5294/edu.2011.14.1.10.
Ausín, V., Abella, V., Delgado, V., & Hortigüela, D. (2016). Project-based learning through ICT: An experience of teaching innovation from university classrooms. University Education, 9(3), 31-38. Available at: https://dx.doi.org/10.4067/S0718-50062016000300005.
Autonomous Mexico State University. (2010). Agreement of the University Council of the Autonomous University of the State of Mexico, establishing the Cuautitlán Izcalli Professional Academic Unit. UAEM, XXVI, Época XIII, 87-90.
Avello Martínez, R., & Duart, J. M. (2016). New trends in collaborative learning in e-learning: Keys for its effective implementation. Pedagogical Studies (Valdivia), 42(1), 271-282. Available at: https://dx.doi.org/10.4067/S0718-07052016000100017.
Blanco Martínez, A., & Anta, F. P. (2016). An online students' perspective about virtual learning environments in higher education. Innoeduca: International Journal of Technology and Educational Innovation, 2(2), 109-116. Available at: https://doi.org/10.20548/innoeduca.2016.v2i2.2032.
Cedeño, E., & Murillo, J. (2019). Virtual learning environments and their innovative role in the teaching process. Rehuso: Journal of Humanist and Social Sciences, 4(1), 119-127.
Díaz, Z. Y. (2020). Learning and IT platforms in postgraduate programs, EVA: A proposal for learning. Iberoamerican Business Journal, 3(2), 74-95. Available at: https://doi.org/10.22451/5817.ibj2019.vol3.2.11035.
Faúndez, C., Bravo, A., Ramírez, G., & Astudillo, H. (2017). Information and communication technologies (ICT) for teaching and learning of thermodynamic concepts as a tool for future teachers. University Education, 10(4), 43-54. Available at: https://dx.doi.org/10.4067/S0718-50062017000400005.
Fontalvo-Herrera, T. J., Delahoz, E. J., & Mendoza-Mendoza, A. A. (2018). Application of data mining for the classification of high quality accredited industrial engineering university programs in Colombia. Technological Information, 29(3), 89-96. Available at: https://dx.doi.org/10.4067/S0718-07642018000300089.
Gómez-Suárez, A. (2017). The importance of the instructional script in the design of virtual learning environments. Academy and Virtuality, 10(2), 47-60. Available at: https://doi.org/10.18359/ravi.2868.
Gómez, A. M. M., Roque, L. R., Garcés, B. R. G., Mesa, Y. R., Iglesias, M. E. D., & Ganen, M. S. (2015). The communication process mediated by information technologies: Advantages and disadvantages in various spheres of social life. MediSur, 13(4), 481-493.
Gutiérrez Bonilla, L. A. (2016). Deliberation on virtual education. Interconnecting Knowledge, 1(1), 77-89.
Hernández-Sampieri, R., & Mendoza, C. (2018). Research methodology. México: McGraw-Hill.
Hernández, C. A., Casado, M. Y., & Negre, B. F. (2016). Diagnosis of needs and use of ICT for learning assessment in Physics at the University of Computer Sciences. EDUTEC: Electronic Journal of Educational Technology, (55), a326. Available at: https://doi.org/10.21556/edutec.2016.55.619.
Herrera, J. A. M. (2014). A reflective look at ICTs in higher education. Electronic Journal of Educational Research, 17(1), 1-4.
Juca, M. F. (2016). Distance education, a necessity for the training of professionals. University and Society, 8(1), 106-111.
Khan, S., & Markauskaite, L. (2017). Approaches to ICT-enhanced teaching in technical and vocational education: A phenomenographic perspective. Higher Education, 73(5), 691-707. Available at: https://doi.org/10.1007/s10734-016-9990-2.
Knijnenburg, B. P., Willemsen, M. C., Gantner, Z., Soncu, H., & Newell, C. (2012). Explaining the user experience of recommender systems. User Modeling and User-Adapted Interaction, 22(4-5), 441-504. Available at: https://doi.org/10.1007/s11257-011-9118-4.
Liyanagunawardena, T., Williams, S., & Adams, A. (2014). The impact and reach of MOOCs: A developing countries' perspective. eLearning Papers, 33, 38-46.
Quiñones-Negrete, M. M., Martin-Cuadrado, A. M., & Coloma-Manrique, C. R. (2021). Academic performance and educational factors of students in the virtual environment education program: Influence of teaching variables. University Education, 14(3), 25-36. Available at: https://dx.doi.org/10.4067/S0718-50062021000300025.
Rakic, S., Tasic, N., Marjanovic, U., Softic, S., Lüftenegger, E., & Turcin, I. (2020). Student performance on an e-learning platform: Mixed method approach. International Journal of Emerging Technologies in Learning (iJET), 15(02), 187-203. Available at: http://dx.doi.org/10.3991/ijet.v15i02.11646.
Roig-Vila, R., Mengual-Andrés, S., & Quinto-Medrano, P. (2015). Primary teachers' technological, pedagogical and content knowledge. Comunicar, 23(45), 151-159. Available at: https://doi.org/10.3916/C45-2015-16.
Roig-Vila, R., Mengual-Andrés, S., & Suárez-Guerrero, C. (2014). Evaluation of the pedagogical quality of MOOCs. Curriculum Magazine and Faculty Formation, 18(1), 27-41.
Rubio, V. I., & Abreu, P. J. (2016). Pedagogical model in distance education: Institutional actions for its implementation. Option: Journal of Human and Social Sciences, 32(12), 541-568.
Salas, R. E. M., Moro, J. C. I., & Pérez, J. G. (2020). Evaluation of virtual learning environments: A management to improve. IJERI: International Journal of Educational Research and Innovation, 13, 126-142. Available at: https://doi.org/10.46661/ijeri.4593.
Saza-Garzón, I. D. (2016). Didactic strategies in web technologies for virtual learning environments. Praxis, 12(1), 103-110. Available at: https://doi.org/10.21676/23897856.1851.
Vázquez, E., Méndez, J. M., Román, P., & López-Meneses, E. J. (2013). Design and development of the pedagogical model of the educational platform "Quantum University Project". Campus Virtuales, 2(1), 54-63.
Views and opinions expressed in this article are the views and opinions of the author(s); the International Journal of Education and Practice shall not be responsible or answerable for any loss, damage or liability etc. caused in relation to/arising out of the use of the content.