THE COMPARISON OF PIRLS, TIMSS, AND PISA EDUCATIONAL RESULTS IN MEMBER STATES OF THE EUROPEAN UNION

The PIRLS (Progress in International Reading Literacy Study), TIMSS (Trends in International Mathematics and Science Study), and PISA (Programme for International Student Assessment) have become gold standards for the international comparison of children’s performance at ages 10 and 15. This paper presents a secondary analysis of basic statistical indicators of the reading literacy (PIRLS) and the mathematical and scientific literacy (TIMSS) of pupils at 10 years of age, followed by their reading, mathematical and scientific literacy at 15 years of age (PISA). It compares pupils’ main educational results in PIRLS and TIMSS with their PISA results. PIRLS, TIMSS, and PISA help to identify key problems in pupils’ educational levels in these selected literacies and to create effective educational policy measures. One aspect of the comparison in this research paper is an aggregate indicator: the arithmetic mean of PIRLS and TIMSS results, using pupils’ PIRLS results from 2001, 2006, 2011, and 2016, and TIMSS results from 2007, 2011, and 2015. The other aspect is a second aggregate indicator: the arithmetic mean of pupils’ PISA results for 2006, 2009, 2012, and 2015. A significant relationship was found between the arithmetic means of pupils’ PIRLS, TIMSS, and PISA results. Political and professional policy decisions in schooling affect the early years of pupils’ school attendance, which in turn has a significant impact on their future education at all levels of schooling. The findings of this paper support the hypothesis on the persistence of pupils’ educational performance and the need to adopt measures to improve education in schools on an ongoing basis. UDC Classification: 338.2, DOI: https://doi.org/10.12955/pss.v1.70


Introduction
In the early 1990s, society became increasingly cognisant of the growing importance of literacy skills for the emerging generation of young people moving from education into the workplace. This coincided with the emergence of a globalised information society. Successful inclusion in society no longer requires a large store of acquired encyclopaedic knowledge, but rather the ability to work with creativity and comprehension: using a wide range of information, assessing it correctly, and applying it efficiently in both work and personal life. These are the reasons for carrying out national and international surveys of pupils' literacy in the fields of education. Such surveys are considered of the utmost importance for integrating the younger generation into today's information society, provided that their data are properly understood and interpreted in a broader sense. Surveying pupils' achievements in an international context enables participating countries to compare their own pupils' achievements with those of other countries. This comparison of pupil achievements across multiple jurisdictions reveals not just differences in the size and content of the curriculum taught, but chiefly differences in pupils' ability to utilise the knowledge acquired. For these reasons, international surveys influence not only teachers' work with pupils in classrooms, but particularly the participating countries' policy decisions. The results of international comparisons of pupil performance enable participating countries to make targeted adjustments to the organisation and content of education itself, with the aim of increasing pupils' ability to use acquired knowledge in a broader context.
It is pupils' ability to understand the curriculum content and to utilise it creatively in broader contexts that is the subject of the three international surveys: the Progress in International Reading Literacy Study (PIRLS), the Trends in International Mathematics and Science Study (TIMSS), and the Programme for International Student Assessment (PISA) (Plavčan, 2018).

Methodology and research methods
The international PIRLS and TIMSS surveys examine the reading, mathematical and scientific literacy of 9- to 10-year-olds (TIMSS in 1995, 1999, and 2003 was focused on 8th-Grade elementary school pupils or the corresponding year of a gymnasium school). This age can be overlooked in a child's education, yet it is also an age at which much can still be corrected. This is precisely the preventive element of the international PIRLS and TIMSS surveys: they give information about a child's level of knowledge and the ability to use it with understanding, relatively early in the child's development. The international PISA survey examines the reading, mathematical and scientific literacy of 15-year-olds. The international average of all three surveys, PIRLS, TIMSS, and PISA, has long been set at 500 points in each literacy; a comparison of the results of approximately 10-year-old pupils (with four years of schooling) and 15-year-old pupils is therefore of great assistance in assessing the level of their learnt knowledge and their ability to apply it in school and everyday life. A large number of countries from all over the world participate in the PIRLS, TIMSS, and PISA international surveys; the results of pupils from EU Member States are the focal point of our comparative analysis. Where the tables indicate 'England', this represents the entire United Kingdom as a Member State of the European Union; similarly, where Belgium is mentioned, this means its Flemish region. To express the relationship between the surveys, we use the correlation of the mean values of the PIRLS+TIMSS and PISA measurements.

Summary indicators on PIRLS, TIMSS and PISA
In compiling the first statistical indicator, the arithmetic mean of the PIRLS reading literacy results of 2001, 2006, 2011, and 2016 and of the TIMSS mathematical and scientific literacy results of 2007, 2011, and 2015, the total takes into account four PIRLS international surveys (4 data items) and three TIMSS international surveys with two literacy results each (6 data items). Full participation by a European Union Member State therefore yields ten data items, which enter this summary statistical indicator (the PIRLS and TIMSS average) for that Member State. Of the 28 E.U. Member States, only nine participated fully in all ten of these PIRLS reading literacy and TIMSS mathematical and scientific literacy surveys: England, the Netherlands, Hungary, Lithuania, Sweden, Germany, Italy, Slovenia, and Slovakia. Estonia has never participated in the PIRLS international surveys, and Estonia, Luxembourg and Greece have never participated in the TIMSS international surveys. Estonia is thus the only Member State of the European Union with no participation in either the PIRLS or the TIMSS international surveys, which also means that its results cannot be compared with those in PISA. In compiling the second statistical indicator, the arithmetic mean of pupils' reading, mathematical and scientific literacy results in the PISA international surveys of 2006, 2009, 2012, and 2015, four PISA surveys with three literacy results each are taken into account; full participation by a Member State therefore represents 12 data items in the summary statistical indicator for that Member State. All 28 E.U. Member States participated fully in all four PISA international surveys of reading, mathematical and scientific literacy, in contrast to the PIRLS and TIMSS surveys, where only nine E.U. Member States participated fully. Because Estonia has never participated in the international PIRLS and TIMSS surveys, its results cannot be compared with its results in PISA, which is to the detriment of that Member State, as it ranks second in the PISA international survey (Plavčan, 2019:1).
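The construction of the two summary indicators described above reduces to a simple arithmetic mean over a country's data items. The following sketch illustrates this with hypothetical placeholder scores (not published survey results): 10 data items for the PIRLS+TIMSS indicator (4 PIRLS reading cycles, plus 3 TIMSS cycles in each of mathematics and science) and 12 data items for the PISA indicator (4 cycles, 3 literacies each).

```python
# Illustrative computation of the two summary indicators described above.
# All score values below are hypothetical placeholders, not published results.

def summary_indicator(scores):
    """Arithmetic mean of a country's data items (one per survey cycle and literacy)."""
    return sum(scores) / len(scores)

# Hypothetical Member State: 4 PIRLS reading cycles (2001, 2006, 2011, 2016)
pirls_reading = [512, 519, 523, 525]
# 3 TIMSS cycles (2007, 2011, 2015), mathematics and science separately
timss_math = [505, 509, 511]
timss_science = [515, 520, 522]

# 10 data items -> the PIRLS+TIMSS summary indicator
pirls_timss_mean = summary_indicator(pirls_reading + timss_math + timss_science)

# 4 PISA cycles (2006, 2009, 2012, 2015), three literacies each = 12 data items
pisa = [498, 501, 495, 499,   # reading
        503, 506, 500, 502,   # mathematics
        508, 511, 505, 507]   # science
pisa_mean = summary_indicator(pisa)

print(round(pirls_timss_mean, 1), round(pisa_mean, 1))  # → 516.1 502.9
```

A country missing any cycle (as Estonia is for all PIRLS and TIMSS cycles) contributes fewer data items, which is why its PIRLS+TIMSS indicator cannot be formed at all.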

Selected findings on the relationship between PIRLS, TIMSS and PISA
To illustrate these points we created two statistical indicators: the "arithmetic mean of the ranking results for the summary average value of reading, mathematical and scientific literacy of pupils in the PIRLS and TIMSS international surveys" and the "arithmetic mean of the ranking results for the summary mean value of reading, mathematical and scientific literacy in the PISA international survey". We obtained the average ranking of each E.U. Member State, which shows that Member State's overall ranking across all the international surveys in this secondary analysis: PIRLS in 2001, 2006, 2011, and 2016; TIMSS in 2007, 2011, and 2015; and PISA in 2006, 2009, 2012, and 2015. Estonia did not participate in any of the PIRLS and TIMSS international surveys, so its PISA results cannot be used in this ranking of Member States; it is therefore presented at the end of Table 1.
1. In the overview of the European Union Member States, fewer than half are above the average value of the statistical indicator, i.e. the arithmetic mean of the Member States' rankings (14th place). In order from the highest-ranked Member State, these are: Finland, Ireland, the Netherlands, England, Germany, Denmark, Luxembourg, Belgium, Latvia, Poland, Sweden and Hungary. Austria and Slovenia have the same statistical value as the international average.
2. The following Member States fall below the average value (14th place) of the statistical indicator, the arithmetic mean of the results of the European Union Member States: the Czech Republic, Portugal, Lithuania, Bulgaria, Italy, France, Spain, Greece, Croatia, Slovakia, Cyprus, Malta, and Romania (the lowest summary value being Romania, at 429 points). In this group of Member States with a below-average value of this statistical indicator, Slovakia ranks 22nd, six places below the international average.
Table 1 shows (for illustrative purposes) all E.U. Member States with the summary statistical indicator, the arithmetic mean of results in the international PIRLS, TIMSS and PISA surveys of pupils' reading, mathematical and scientific literacy; the Member States' rankings; the average ranking; and the absolute difference between each Member State's positions in the two rankings. Figure 1 correlates the two statistical indicators, namely the "arithmetic mean of the ranking results for the summary average value of reading, mathematical and scientific literacy of pupils in the PIRLS and TIMSS international surveys" and the "arithmetic mean of the ranking results for the summary mean value of reading, mathematical and scientific literacy in the PISA international survey". Regarding the positions of the Member States of the European Union shown in Figure 1, the slope of the arrangement of their positions is of interest. This slope visually shows the relationship between the two statistical indicators. The relatively even distribution of the Member States' positions to the left and right of the middle axis in Figure 1 justifies a hypothesis on the relationship between the two indicators.
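The visual relationship between the two rankings can also be quantified as a rank correlation. The sketch below computes Spearman's rank correlation coefficient for two rankings of the same Member States; the five-country rankings used here are hypothetical illustrations, not the actual Table 1 data.

```python
# Spearman's rank correlation between two rankings of the same Member States.
# The rankings below are hypothetical, for illustration only.

def spearman_rho(rank_a, rank_b):
    """Spearman's rho: 1 - 6 * sum(d_i^2) / (n * (n^2 - 1)), d_i = rank difference."""
    n = len(rank_a)
    d_squared = sum((a - b) ** 2 for a, b in zip(rank_a, rank_b))
    return 1 - 6 * d_squared / (n * (n ** 2 - 1))

# Hypothetical positions of five countries in the PIRLS+TIMSS vs. PISA rankings
pirls_timss_rank = [1, 2, 3, 4, 5]
pisa_rank        = [2, 1, 3, 5, 4]

rho = spearman_rho(pirls_timss_rank, pisa_rank)
print(rho)  # → 0.8
```

A coefficient close to 1 corresponds to the even distribution of positions around the middle axis seen in Figure 1: countries ranked high in PIRLS+TIMSS tend also to rank high in PISA.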

Conclusions
The finding of a statistical relationship in this secondary analysis of PIRLS, TIMSS, and PISA reading, mathematical and scientific literacy is useful for all Member States of the European Union in deciding the direction of national school policies. A significant statistical relationship points to the need for quality assurance in each school year throughout the curriculum, since pupils' educational results in lower years significantly affect their results in later school years. In addition, decisions by political and professional bodies to increase education expenditure have a positive effect on the quality of education in schools generally, regardless of how the economy is doing.
The results of our analysis support the assertion that even under a Member State's restrictive budget, expenditure on education and creative activity in general should not be restricted (Plavčan, 2019:2). This should be considered in E.U. Member States' education tactics and strategies when formulating the principles of national educational policy, especially in those Member States whose results in the PIRLS, TIMSS, and PISA international surveys of pupils' reading, mathematical and scientific literacy have been below average over the long term.