Preprint Article (this version is not peer-reviewed; a peer-reviewed article of this preprint also exists)

Comparing Paid and Diamond Open Access: Metrics from Engineering Journals

Submitted: 17 December 2024. Posted: 18 December 2024.
Abstract
This study investigates and compares Open Access journals charging Article Processing Charges (APC) and Diamond Open Access journals in Engineering, indexed in Scopus. It analyzes metrics such as CiteScore, citations, published articles, and the percentage of cited articles from 2020 to 2023, categorized by quartiles (Q1–Q4) and the top 10%. The dataset includes 757 journals: 504 APC-charging and 253 Diamond, the latter representing only 8.4% of the active journals in this field. Findings indicate that APC journals have higher averages in CiteScore and citations, particularly in Q1 and the top 10%, although they also show greater variability. Diamond journals display consistency in relative metrics and surpass APC journals in the percentage of cited articles within the top 10%. Additionally, APC fees are highest in top-tier quartiles, posing accessibility challenges for less-funded researchers. Despite these differences, both models play complementary roles, balancing impact and accessibility. The results highlight the need for inclusive policies to strengthen Diamond Open Access while acknowledging the visibility advantages of APC models. These findings provide a foundation for future research and editorial strategies in scientific publishing, particularly in Engineering, while suggesting paths to achieve equity and sustainability in Open Access dissemination.
Subject: Engineering – Other

1. Introduction

The Open Access movement began at the start of the 21st century and emerged with the Budapest Open Access Initiative (BOAI) in February 2002. This initiative pioneered Open Access as the unrestricted availability of scientific knowledge, ushering in a new era in disseminating academic output (BOAI, 2002; BOAI, 2012; Suber, 2012). Subsequently, the Bethesda Statement on Open Access Publishing, released in June 2003, and the Berlin Declaration on Open Access to Knowledge in the Sciences and Humanities, presented on October 22, 2003, broadened the foundations established by the BOAI. These declarations positioned Open Access as a strategic goal for global academic and scientific institutions, consolidating the pillars of the BBB movement (Budapest, Bethesda, and Berlin) (Berlin Declaration on Open Access to Knowledge in the Sciences and Humanities, 2003; Max-Planck-Gesellschaft, 2003; Willinsky, 2006). With a focus on practical strategies and commitments, the BBB movement provided a structural basis for the contemporary Open Access model, highlighting its potential to transform the production and accessibility of scientific knowledge, thereby promoting significant advances in global academic sharing (Laakso & Björk, 2012; Suber, 2012).
In the scientific communication system, scholarly journals are the principal formal means of disseminating knowledge and developments in highly complex fields, such as Engineering. Among these journals, Open Access titles stand out for broadening research reach, promoting greater visibility and universal access (Harnad, 2001; Suber, 2012). However, the publishing models under this format vary significantly, notably between journals that charge Article Processing Charges (APC)—as in the Gold Open Access and Green Open Access models—and Diamond Open Access, which removes financial barriers for both authors and readers but faces sustainability challenges (Druelinger & Ma, 2023; Swan, 2015; Yoon et al., 2024).
Open access has also been shown to have a significant impact on the visibility and accessibility of scientific outputs, as evidenced by its transformative role in Chile, where it has enhanced doctoral education and improved the international visibility of local research (Segovia et al., 2024). This impact is also observed in European universities belonging to the YERUN network, where institutional open access initiatives have been associated with higher citation rates, especially for publications under the Green Open Access model (De Filippo & Mañana-Rodríguez, 2020).
The citation advantage often attributed to Open Access journals is a widely discussed but still controversial topic. Systematic studies indicate that this advantage may vary depending on disciplinary and methodological factors but is generally observed across multiple scientific areas (Langham-Putrow et al., 2021; Huang et al., 2024). Nevertheless, this advantage must be analyzed considering the associated costs. The widely adopted APC model has been criticized for exacerbating structural inequalities, as many researchers—particularly those from less advantaged institutions—cannot afford the publication fees (May, 2020; Borrego, 2023; Haug, 2019). On the other hand, the Diamond Open Access model faces financial challenges related to its reliance on public and institutional subsidies, which may compromise its long-term sustainability (Borrego, 2023; Frank et al., 2023).
Although both APC and Diamond models promote open access, their approaches reflect fundamental differences. The APC model is widely adopted by commercial publishers, who tie costs directly to authors, allowing for greater financial independence but often resulting in inequalities for researchers with limited resources. Conversely, the Diamond model removes economic barriers for authors and readers, fostering equity, but depends on institutional and public subsidies that vary significantly across regions and fields. This disparity reveals a delicate balance between financial sustainability and the democratization of scientific knowledge, especially in highly productive areas like Engineering (Andringa et al., 2024; Borrego, 2023).
To address these limitations and strengthen the relevance of the Diamond Open Access model, the Diamond OA Standard (DOAS) sets forth clear quality guidelines, covering everything from funding and governance to visibility and impact. Developed based on the Action Plan for Diamond Open Access and revised by the DIAMAS project, this standard prioritizes technical efficiency, equity, and inclusion as central elements for the sustainability of these journals (Ancion et al., 2022; Rooryck et al., 2024). Its seven fundamental components are: (1) funding; (2) legal ownership, mission, and governance; (3) open science; (4) editorial management, quality, and research integrity; (5) technical service efficiency; (6) visibility, communication, marketing, and impact; and (7) equity, diversity, inclusion, and belonging (EDIB), multilingualism, and gender equity. These components provide a comprehensive framework for strengthening the relevance and competitiveness of the Diamond Open Access model on the global stage.
The present study comparatively analyzes Open Access journals that charge APCs and Diamond Open Access journals in Engineering, using consolidated metrics from the Scopus database. These metrics include CiteScore, the number of accumulated citations, the volume of published articles, and the percentage of cited articles, organized by quartiles (Q1–Q4) and the top 10% group. This analysis offers a detailed view of the differences in impact and performance between these models, contributing to the debate on editorial policies and knowledge dissemination strategies. Although the focus is on Engineering, the methods and findings presented here can be extended and contextualized to other fields, providing a starting point for studies investigating similar patterns across various disciplines.
The findings reveal that journals using the APC model generally exhibit higher CiteScore values and citation averages, particularly in the top quartiles. In contrast, Diamond Open Access journals perform consistently and excel in the proportion of cited articles within the top 10%. These results underline the complementary roles of both models in promoting access and impact in engineering research.

2. Method

This quantitative, exploratory study can be classified as documentary in nature (Neuendorf, 2017), and it analyzed data extracted from the Scopus database during the last week of November 2024. The objective was to compare the performance of Open Access journals that charge APCs with Diamond Open Access (fee-free) journals in Engineering, categorized by quartiles (Q1–Q4) and by the top 10% group, based on 2023 metrics. Four main variables were evaluated: CiteScore, number of citations (2020–2023), articles published (2020–2023), and the percentage of cited articles (% Cited).
The 2020–2023 time frame was chosen to capture recent trends in journal performance, considering that shorter periods might not reflect structural variations, while more extended periods could include outdated metrics relative to current editorial practices. This interval also allows for identifying the potential impacts of the COVID-19 pandemic on publishing and citation behavior, especially in fields such as Engineering.
The Engineering area in Scopus encompasses 18 sub-areas, including Aerospace Engineering, Architecture, Automotive Engineering, Biomedical Engineering, Civil and Structural Engineering, Electrical and Electronic Engineering, and Mechanical Engineering. Of the 5,162 indexed journals in the area, 803 (15.56%) were classified as Open Access. After applying quartile filters, 757 titles out of the 3,012 active journals were selected. By using the “top 10%” filter, 126 journals were identified.
The journals were divided into two groups for comparative analysis: G1, consisting of 504 (66.58%) Open Access journals that charge APCs, and G2, consisting of 253 (33.42%) Diamond Open Access journals. Journals classified as “N/A” by Scopus (n=46), which did not have a quartile assigned, were excluded from the analysis.
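As an illustration only, the group split described above could be reproduced with a few lines of code. The sketch below is not the authors' workflow (the analyses were run in Statdisk); the file name and column names (OpenAccess, APC_USD, Quartile) are hypothetical placeholders for the corresponding fields in a Scopus export.

```python
# Hypothetical sketch of the G1/G2 selection step; file and column names are placeholders.
import pandas as pd

journals = pd.read_csv("scopus_engineering_2023.csv")        # assumed export file

oa = journals[journals["OpenAccess"] == True]                 # Open Access titles only
oa = oa[oa["Quartile"].isin(["Q1", "Q2", "Q3", "Q4"])]        # drop "N/A" (no quartile assigned)

g1 = oa[oa["APC_USD"] > 0]     # G1: APC-charging journals (expected n = 504)
g2 = oa[oa["APC_USD"] == 0]    # G2: Diamond (fee-free) journals (expected n = 253)

print(len(g1), len(g2))
```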
Statistical analyses were conducted using Statdisk software (version 13.0), which provides specialized tools for non-parametric statistical tests essential for this study. The choice of Statdisk was based on its simplicity, accessibility, and efficiency in performing robust exploratory analyses and its compatibility with extensive and heterogeneous datasets. These features ensured greater agility and accuracy in obtaining results.
Exploratory analyses offered a detailed view of the distributions and variability of the variables, aiding in understanding the performance differences between groups G1 and G2. Descriptive tables presented confidence intervals and dispersion measures, and the analyses were organized by sub-area to ensure representativeness and data consistency. This procedure ensured greater robustness in the comparisons, allowing an evaluation of how each access model influences performance on specific metrics.

2.1. Study Design

The research adopted a quantitative, descriptive, and comparative design. The statistical analyses were conducted with the following objectives:
  • Assess the normality of the data for each variable and quartile.
  • Compare G1 and G2 concerning the selected metrics.
  • Explore the relationships between CiteScore and accumulated citations in both groups.

2.2. Statistical Procedures

2.2.1. Normality Test

The data were subjected to the Ryan-Joiner test to verify adherence to the normal distribution. The results indicated that most variables were non-normally distributed, except for a few specific subgroups. Outliers were identified in several variables, particularly in Citations and Articles Published.
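The Ryan-Joiner statistic is essentially the correlation between the ordered sample and the corresponding normal quantiles. The test itself was run in Statdisk; the sketch below only approximates the idea in Python, since SciPy does not implement Ryan-Joiner directly, and the sample values are illustrative rather than the actual Scopus data.

```python
# Approximate Ryan-Joiner-style check: probability-plot correlation, plus
# Shapiro-Wilk as a readily available alternative. Values are illustrative.
import numpy as np
from scipy import stats

citescore_sample = np.array([9.8, 7.2, 12.4, 5.9, 21.3, 8.0, 6.4, 10.1])

(_, _), (slope, intercept, r) = stats.probplot(citescore_sample, dist="norm")
print(f"probability-plot correlation r = {r:.3f}")   # values near 1 suggest normality

w, p = stats.shapiro(citescore_sample)
print(f"Shapiro-Wilk W = {w:.3f}, p = {p:.4f}")       # p < 0.05 -> reject normality
```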

2.2.2. Tests for Comparing G1 and G2

Given the prevalence of non-normality, non-parametric tests were performed to compare the two groups:
  • Wilcoxon Rank-Sum Test for Independent Samples: used to assess differences in the distributions of variables between G1 and G2 within each quartile. Comparisons were conducted separately for CiteScore, Citations, Articles Published, and % Cited (a minimal sketch of this step follows).
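The sketch below shows this comparison with SciPy's implementation of the Wilcoxon rank-sum test rather than Statdisk; the two arrays are illustrative placeholders, not the actual CiteScore values.

```python
# Wilcoxon rank-sum comparison of one metric between G1 and G2 (illustrative data).
from scipy import stats

citescore_g1_q2 = [3.9, 4.7, 2.8, 5.1, 3.2, 4.0, 4.4]   # hypothetical G1 (APC) values
citescore_g2_q2 = [3.0, 2.1, 3.4, 1.9, 2.6, 3.1, 2.4]   # hypothetical G2 (Diamond) values

z, p = stats.ranksums(citescore_g1_q2, citescore_g2_q2)
print(f"z = {z:.2f}, p = {p:.4f}")   # p < 0.05 indicates a significant difference
```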

2.2.3. Correlation Tests

Correlation analyses were carried out to investigate the relationship between CiteScore and accumulated citations in each group and quartile:
  • Pearson or Spearman Correlation Coefficient: selected according to data normality. The analyses were accompanied by significance tests (p-values) to identify statistically significant correlations (a minimal sketch of this step follows).
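The sketch below illustrates that selection rule with SciPy; it is an assumption about how the rule could be applied (the original analysis used Statdisk), and the paired values are placeholders.

```python
# Choose Spearman when normality is rejected, Pearson otherwise (illustrative data).
from scipy import stats

citescore = [9.1, 7.7, 12.3, 5.4, 15.0, 6.8, 8.2]
citations = [2100, 1800, 5300, 900, 7600, 1200, 1900]

_, p_norm = stats.shapiro(citescore)
if p_norm < 0.05:                                  # non-normal: rank-based correlation
    r, p = stats.spearmanr(citescore, citations)
    label = "Spearman"
else:
    r, p = stats.pearsonr(citescore, citations)
    label = "Pearson"

print(f"{label} r = {r:.3f}, p = {p:.4f}, R^2 = {100 * r ** 2:.1f}%")
```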

2.2.4. Homogeneity Analysis

Homogeneity tests were applied before the comparisons to evaluate the equality of variances between groups. This ensured the appropriate choice of statistical methods used.
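The article does not name the specific homogeneity test used; as a hedged example, Levene's test (robust to non-normality) would be a natural choice and is sketched below with placeholder data.

```python
# Variance-homogeneity check between G1 and G2 using Levene's test (illustrative data).
from scipy import stats

citations_g1 = [2100, 480, 9500, 760, 310, 12400, 1500]
citations_g2 = [1150, 420, 980, 640, 300, 870, 1100]

stat, p = stats.levene(citations_g1, citations_g2)
print(f"Levene W = {stat:.2f}, p = {p:.4f}")   # p < 0.05 -> unequal variances
```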

2.3. Tools and Visualization

Analyses were conducted using Statdisk software (version 13.0) and Excel spreadsheets for initial data organization and visualization. Scatter plots and box plots were generated to facilitate the visual interpretation of results and to identify patterns or anomalies.
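A minimal matplotlib sketch of these two plot types is shown below; it stands in for the Statdisk/Excel charts and uses illustrative values rather than the Scopus figures.

```python
# Box plot of CiteScore by group and scatter plot of CiteScore vs. citations
# (illustrative placeholder data).
import matplotlib.pyplot as plt

citescore_g1 = [9.1, 3.8, 2.0, 0.7, 12.5, 4.7]
citescore_g2 = [8.2, 2.8, 1.5, 0.6, 13.2, 3.0]
citations_g1 = [2090, 682, 287, 68, 3047, 1707]

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(9, 4))
ax1.boxplot([citescore_g1, citescore_g2], labels=["G1 (APC)", "G2 (Diamond)"])
ax1.set_ylabel("CiteScore")

ax2.scatter(citescore_g1, citations_g1)
ax2.set_xlabel("CiteScore")
ax2.set_ylabel("Citations (2020-2023)")

plt.tight_layout()
plt.show()
```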

2.4. Rationale for Non-Parametric Tests

Non-parametric tests were selected because the distributions exhibited non-normality, skewness, and outliers in several variables. This approach provided greater robustness for group comparisons.

3. Results

This study analyzed Open Access journals that charge APCs (G1) and Diamond Open Access journals (G2), using impact and scientific productivity metrics in Engineering. The results are organized by quartiles (Q1–Q4) and the top 10% group based on data extracted from the Scopus database.

3.1. General Characterization of the Journals

A total of 757 journals were included in the analysis, with 504 (66.58%) classified as G1 and 253 (33.42%) as G2. The quartile distribution revealed a predominance of G1 journals in the upper quartiles (Q1 and Q2), while G2 had a higher presence in the lower quartiles (Q3 and Q4) (Table 1).

3.2. Comparison of Metrics Between G1 and G2

The statistical results revealed significant differences between the two publication models, particularly in metrics related to quality and impact. G1 performed better in absolute metrics, such as the number of accumulated citations and published articles, while G2 stood out in relative metrics, such as the consistency of correlations between CiteScore and citations (Table 2).
G1 journals demonstrated more accumulated citations and published articles, especially in the top quartiles (Q1 and Q2). However, G2 showed a more balanced performance, suggesting greater consistency in scientific impact.

3.3. Correlation Between CiteScore and Citations

G2 journals showed stronger correlations between CiteScore and the number of citations, particularly in the lower quartiles (Q3 and Q4). This may indicate greater consistency in the relationship between perceived quality and academic impact. In G1, the correlation was significant in some quartiles but less consistent (Table 3).

3.4. Analysis of Outliers and Homogeneity

The presence of outliers was identified in metrics such as the number of citations and published articles, especially in the upper quartiles. Homogeneity tests indicated unequal variances between G1 and G2, reinforcing the choice of non-parametric comparison tests.

3.5. Comparative Summary

The results indicate that G1 journals stand out in absolute metrics, such as the total number of citations and articles published, with a higher concentration of journals in the upper quartiles. In contrast, G2 journals demonstrate greater consistency in impact metrics, with more substantial and uniform correlations between CiteScore and citations, particularly in the lower quartiles. The tables with detailed descriptive statistics are provided in Appendix A for further reference. These findings reflect distinct patterns of performance and impact in the two scientific publication models. G1 focuses on volumetric metrics and caters to a broader target audience, while G2 exhibits a more distributed impact, closely correlated with article quality.

4. Discussion

This study compared the performance of Open Access journals that charge APCs (G1) with Diamond Open Access journals (G2) in Engineering, using four main variables: CiteScore, number of accumulated citations, published articles, and percentage of cited articles. The results were organized by quartiles (Q1–Q4) and the top 10% group, based on data from the Scopus database (Table 1). This is the first study to comprehensively analyze the differences in performance between these two publishing models in the context of Engineering.
Before examining the selected variables, the APC fees charged by Open Access journals were analyzed, since they reveal significant differences in average fees across quartiles and highlight disparities in pricing practices.
The APC fees charged by Open Access journals in Engineering, categorized by quartiles (Q1–Q4) and the top 10% group, revealed significant disparities. Among the 504 journals that charge APCs (66.58%), the average values were higher in the upper quartiles, such as in the top 10% (USD 2,399.79), with a median of USD 2,200.00 and a range of USD 615.00 to USD 6,730.00. In Q1, the average was slightly lower (USD 2,151.99); in Q2, the amounts decreased significantly, reaching an average of USD 1,665.16 (Table 1; Table A4). In the lower quartiles, Q3 and Q4, fees sharply declined, with averages of USD 848.35 and USD 603.04, respectively. The median in Q4 was only USD 300.00, reflecting the greater affordability of journals in this quartile. The wide variability in APC fees was demonstrated by high coefficients of variation, especially in Q4 (134.40%), indicating significant heterogeneity in charging practices. These disparities in pricing practices may directly influence journal performance on metrics such as number of citations and percentage of cited articles, reflecting structural differences between publishing models (Table 1; Table A4).
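For readers who want to reproduce the kind of descriptive statistics reported here (and in Table A4), the sketch below shows how the mean, median, coefficient of variation, and a 95% confidence interval for the mean can be computed; the fee list is an illustrative placeholder, not the actual APC data.

```python
# Descriptive statistics for a set of APC fees (illustrative values, in USD).
import numpy as np
from scipy import stats

apc_fees = np.array([93, 150, 300, 250, 520, 1800, 3500, 300, 185])

mean = apc_fees.mean()
sd = apc_fees.std(ddof=1)                               # sample standard deviation
cv = 100 * sd / mean                                    # coefficient of variation (%)
ci_low, ci_high = stats.t.interval(0.95, df=len(apc_fees) - 1,
                                   loc=mean, scale=stats.sem(apc_fees))

print(f"mean = {mean:.2f}, median = {np.median(apc_fees):.2f}, CV = {cv:.1f}%")
print(f"95% CI for the mean: {ci_low:.2f} to {ci_high:.2f}")
```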
Regarding CiteScore, G1 journals demonstrated superior performance in the intermediate quartiles (Q2 and Q3), while in Q1 and Q4, the differences between the models were less pronounced. In Q2, G1 achieved a mean of 3.84 and a median of 3.90, surpassing G2, which had a mean of 2.79 and a median of 3.00 (z = 5.03, p < 0.05). In Q3, G1’s mean value of 1.96 was significantly higher than G2’s mean of 1.50 (z = 3.54, p < 0.05). On the other hand, in Q1, which includes the highest-impact journals, G1 had a mean of 9.10 and a median of 7.75, while G2 recorded a mean of 8.17 and a median of 7.10; this difference was not statistically significant (z = 1.25, p > 0.05). In Q4, the mean values were similar, with G1 at 0.67 and G2 at 0.59 (z = 0.75, p > 0.05). These results suggest that G1 holds an advantage in the intermediate ranges, while performance is more balanced at the extremes (Q1 and Q4) (Tables 2; A5).
Data dispersion revealed important differences between the models. In Q1, both G1 and G2 showed high variability, with standard deviations of 5.15 and 5.32, respectively. In Q2, G1’s coefficient of variation (31.70%) was lower than G2’s (43.85%), indicating greater consistency among APC-charging journals in this quartile. In the lower quartiles, the picture was mixed: in Q4, G2 displayed greater homogeneity, with a lower standard deviation than G1 (0.35 versus 0.49), whereas in Q3 the dispersions were similar (0.77 for G2 versus 0.73 for G1). These findings point to a more stable performance by G2, particularly in Q4, while G1 stands out in absolute metrics in the intermediate ranges (Tables 2; A5).
G1 journals showed superior performance in the number of accumulated citations across all quartiles, particularly in the upper quartiles (Q1 and Q2), where the differences compared to G2 were more pronounced (Table 2). In Q1, G1 presented a mean of 11,389 citations and a median of 2,090, while G2 recorded a mean of 1,577 and a median of 1,154 (z = 3.93, p < 0.05) (Tables 2; A6). In Q2, G1 reached a mean of 2,389 citations, significantly higher than G2’s mean of 462 (z = 5.72, p < 0.05) (Tables 2; A6). The differences were smaller in the lower quartiles, but still favored G1. In Q3, G1’s mean was 481 citations versus G2’s 340 (z = 2.57, p < 0.05) (Tables 2; A6). In Q4, both models presented similar values, with G1 averaging 111 citations and G2 83 (z = 0.32, p > 0.05) (Tables 2; A6).
In addition to the means, the dispersion analysis revealed high variability in G1’s accumulated citations, especially in Q1, with a standard deviation of 43,885 and a coefficient of variation of 385.33% (Table A6). In contrast, G2 exhibited greater consistency in the same quartile, with a standard deviation of 1,573 and a coefficient of variation of 99.81% (Table A6). This heterogeneity in G1 reflects the presence of journals with extremely high citation counts, such as IEEE Access (484,743 citations) and Advanced Science (95,266 citations), which raise the overall average (Table A6). In Q2, G1 also showed greater variability, with a standard deviation of 6,962, while G2 maintained a standard deviation of 537, indicating greater homogeneity (Table A6).
When examining the raw number of citations, G1’s Q1 journals alone accounted for 79.86% of all citations recorded across both groups (2,414,482 of 3,023,574), while G2’s Q1 journals accumulated only 82,002 (Table A6). This difference reflects the substantially greater academic impact of APC-charging journals in the most prestigious segments. In total accumulated citations, G1 reached 2,880,490, far surpassing G2, which totaled 143,084 (Table A6). This phenomenon is not limited to Engineering and has been identified in broader studies, such as Miranda and Garcia-Carpintero (2019), who found that Q1 publications account for an average of 65% of total citations in a research area, with variations reaching up to 98% depending on the discipline. Additionally, Schvirck et al. (2024) highlight the invisibility of articles published in lower-impact journals, reinforcing that academic productivity pressures often perpetuate a cycle in which citations are concentrated in higher-prestige publications. These results indicate that G1 has a clear advantage in absolute citation metrics, particularly in the upper quartiles. However, G2’s greater homogeneity in the lower quartiles suggests a more stable and accessible approach in terms of scientific impact.
The correlation between CiteScore and accumulated citations revealed consistent patterns and significant differences between the G1 and G2 models, highlighting the interactions between perceived quality and academic impact. In Q1, the correlation was positive and significant for both models, with G1 showing a correlation coefficient of r=0.78 (p < 0.05), while G2 recorded r=0.63 (p < 0.05), indicating a stronger association in G1 between quality and impact metrics (Table A6). In Q2, this relationship was even more pronounced for G1 (r=0.84, p < 0.05), reflecting greater alignment between the two variables, whereas G2 maintained a positive but more moderate correlation (r=0.57, p < 0.05). In the lower quartiles (Q3 and Q4), correlations were weaker for both models, but G2 demonstrated greater stability, with coefficients of r=0.41 (p < 0.05) in Q3 and r=0.35 (p < 0.05) in Q4, compared to r=0.29 (p < 0.05) and r=0.21 (p > 0.05) for G1, respectively (Table A6).
These results indicate that G1 exhibits a stronger relationship between perceived quality and academic impact in the upper quartiles, while G2 stands out for a more consistent and uniform correlation in the lower quartiles. Furthermore, the variability in the correlation coefficients reflects G1’s heterogeneity in the lower quartiles, where values fluctuate more sharply, whereas G2 presents a more linear trajectory. These findings suggest that Diamond Open Access journals (G2), although less competitive in absolute metrics, can maintain a balanced relationship between CiteScore and accumulated citations, especially in less prestigious contexts.
G1 journals, on average, published more articles in all quartiles, particularly in Q1 and Q2, while G2 displayed greater consistency in the reported values. In Q1, G1 reached an average of 1,441 published articles, with a median of 1,223, while G2 recorded an average of 577 and a median of 510 (z = 4.27, p < 0.05). In Q2, G1 averaged 554, surpassing G2, which had an average of 213 (z = 3.98, p < 0.05). The differences in the lower quartiles decreased, but G1 maintained the lead: in Q3, G1 had an average of 212 articles versus G2’s 187 (z = 1.93, p < 0.05). In Q4, both models presented similar values, with G1 averaging 88 articles and G2 averaging 79 (z = 0.85, p > 0.05) (Tables 2; A7).
The variability in values reinforces the differences between the models. In Q1, G1 showed greater variability, with a standard deviation of 613 articles, compared to G2, which had a standard deviation of 198. In Q2, G1’s coefficient of variation (38.49%) was higher than G2’s (28.44%), indicating greater heterogeneity among APC-charging journals. G2 maintained greater homogeneity in the lower quartiles, with standard deviations of 79 in Q3 and 25 in Q4, while G1 presented 97 and 32, respectively (Table A7). These results suggest that G1 leads in absolute publication metrics, especially in the upper quartiles, while G2 adopts a more stable and uniform approach in terms of publication volume.
The percentage of cited articles (% Cited) revealed significant differences between the G1 and G2 models, particularly in the top 10% and Q1. In the top 10%, G2 surpassed G1 with an average of 88.75% of articles cited, compared to G1’s 83.36% (z = 3.12, p < 0.05). In Q1, G1 led with an average of 78.55%, while G2 reached 75.88% (z = 2.84, p < 0.05). In Q2, the averages were similar, with G1 at 67.92% and G2 at 66.34% (z = 1.47, p > 0.05). In the lower quartiles, the differences decreased even further: in Q3, G1 had 58.23%, and G2 had 59.01% (z = 0.93, p > 0.05). In Q4, the values were practically identical, with G1 at 43.89% and G2 at 43.77% (z = 0.12, p > 0.05) (Table A8).
The dispersion analyses highlighted the greater variability of G1 in the upper quartiles, with a coefficient of variation of 18.22% in Q1, while G2 showed lower variability, with a coefficient of 12.67% (Table A8). G2 remained highly stable in the lower quartiles, with standard deviations of 9.31 in Q3 and 7.88 in Q4, compared to G1’s 11.45 and 9.12, respectively (Table A8). These results indicate that G2 has an advantage in relative metrics in the most competitive segment (top 10%), but G1 dominates in absolute percentages in the upper quartiles. In the lower quartiles, both models show similar performance, with G2 displaying a slight advantage in terms of stability.
The comparative analysis of metrics between G1 and G2 journals revealed differences reflecting the two models' structural and functional characteristics. While G1 stands out in absolute metrics in the most prestigious segments, G2 exhibits a more balanced and consistent profile in relative metrics.
The analyses carried out highlighted marked differences between APC-charging Open Access journals (G1) and Diamond Open Access (G2) journals in the field of Engineering, considering metrics such as CiteScore, accumulated citations, published articles, and the percentage of cited articles. Overall, G1 journals demonstrated superior performance in absolute metrics, particularly in the upper quartiles (Q1 and Q2), which concentrated most of the citations and published articles. This advantage is evidenced by data such as G1’s average of 11,389 citations in Q1, compared to G2’s 1,577, and G1’s leadership in the number of articles published in the upper quartiles. However, this superiority comes with greater variability in results, reflecting the heterogeneity of editorial practices and academic impact among APC-charging journals.
The results of this study also point to specific challenges for researchers from less well-funded institutions. While the APC model is associated with greater visibility and impact in absolute metrics, it perpetuates economic barriers that impede equitable access to publishing in higher-prestige journals. This reality is especially problematic for developing countries with limited research funding. Conversely, the Diamond Open Access model offers a more accessible alternative but faces significant challenges in terms of funding and sustainability. The disparities in publication costs and impact indices underscore the need for public policies that encourage the adoption of inclusive models, such as Diamond Open Access, aligned with the goal of democratizing access to scientific knowledge. The literature highlights that solutions to reduce publication costs, as suggested by Oliveira et al. (2023) and Rodrigues et al. (2022), can significantly promote fairer and more accessible editorial practices.
Additionally, G2 journals stood out for their greater stability and consistency in relative metrics, such as the percentage of cited articles, especially in the lower quartiles and the top 10% segment. In the top 10%, G2 surpassed G1 with 88.75% of articles cited, compared to G1’s 83.36%. Moreover, in the lower quartiles (Q3 and Q4), G2 maintained lower dispersion in the results, suggesting a more homogeneous and accessible approach regarding scientific impact. This consistency is critical for democratizing scientific knowledge, particularly in contexts with limited resources.
Thus, the findings point to a trade-off between absolute impact and accessibility. While G1 leads in volumetric metrics in the most prestigious segments, G2 offers more stable and balanced performance, especially in less competitive contexts. These results underscore the importance of both publishing models to meet the diverse demands of the academic community in the field of Engineering, with G1 favoring visibility in high-impact metrics and G2 promoting greater accessibility and sustainability.
Although robust, the present study has significant limitations that must be considered. First, the analysis was limited to journals indexed in the Scopus database, excluding other widely used indexers such as the Web of Science and PubMed, which may limit the breadth and representativeness of the results. Furthermore, the focus on quantitative metrics, such as CiteScore, accumulated citations, the number of published articles, and the percentage of cited articles, did not include qualitative factors, such as societal impact or the relevance of journals in specific regional contexts, which could complement the analyses. Another relevant point is that categorizing journals by quartiles may not adequately capture the nuances between journals near the boundaries of those categories, especially in intermediate quartiles. Finally, a significant limitation is the scarcity of literature allowing direct comparisons with previous studies since comprehensive, systematic investigations comparing APC-based Open Access and Diamond Open Access journals in Engineering are practically nonexistent. This gap restricts the possibility of contextualizing the results within a broader academic landscape and reinforces the need for future studies that expand the methodological scope and explore more integrated conceptual approaches.
The findings of this study reinforce the urgency of debating access policies and publication costs in the academic arena. While APC-based journals offer greater visibility and impact in absolute metrics, charging fees perpetuates economic barriers that limit accessibility, especially in contexts with fewer resources. The promotion of the Diamond Open Access model, in turn, requires greater institutional and political support, including sustainable funding, to increase its representation in the publishing market and truly democratize access to knowledge.
Future studies are encouraged to broaden the analysis to include journals indexed in other databases, such as Web of Science and PubMed, to enhance the scope of the conclusions. Incorporating qualitative indicators, such as societal impact and regional relevance, would also be valuable, providing a more holistic perspective on the role of Open Access journals. Investigating external factors, such as editorial policies, indexing practices, and regional influences, can offer a more comprehensive understanding of the scientific publishing market dynamics. Such approaches will contribute to developing strategies that foster a more inclusive and equitable editorial system.

5. Conclusions

The two Open Access models analyzed here play complementary roles and are indispensable for advancing scientific research and disseminating knowledge globally. The Diamond Open Access model plays a central role in democratizing access to science, especially in contexts with limited financial resources. Its proposition of eliminating economic barriers for authors and readers broadens inclusion in the academic ecosystem and promotes greater equity in the global circulation of scientific knowledge. Despite challenges related to financial sustainability, adopting public policies and institutional strategies aimed at strengthening the Diamond Open Access model may establish it as a viable and relevant alternative for a fairer and more accessible future in the academic publishing landscape.

Author Contributions

Conceptualization, L.E.P.; Methodology, L.A.P.; Investigation, L.E.P.; Writing—original draft preparation, L.E.P.; Writing—review and editing, L.A.P., G.D.G.C., and L.M.M.R.; Supervision, G.D.G.C.; Project administration, L.M.M.R. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

The data supporting the findings of this study are publicly available in the Scopus database.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A

Table A4. Descriptive statistics of APC fees (in USD) for Open Access journals by quartile in the Scopus database.
STATISTIC Top 10% G1 Q1 G1 Q2 G1 Q3 G1 Q4 G1 G1
Sample Size (n) 106 212 175 94 23 504
Mean 2399.79 2151.99 1665.16 848.35 603.04 1669.13
Median 2200.00 2048.00 1675.00 529.50 300.00 1600.00
Midrange 3672.50 3440.00 2068.00 1763.00 1796.50 3373.00
Root Mean Square (RMS) 2648.79 2382.82 1884.42 1152.94 995.98 1978.50
Variance (s²) 1269083.33 1051713.76 782734.84 616129.84 656874.77 1130728.49
Standard Deviation (s) 1126.54 1025.53 884.72 784.94 810.48 1063.36
Mean Absolute Deviation 878.35 791.67 742.68 623.04 513.46 873.05
Range 6115.00 6580.00 3924.00 3494.00 3407.00 6714.00
Coefficient of Variation (%) 46.94 47.65 53.13 92.53 134.40 63.71
Minimum 615.00 150.00 106.00 16.00 93.00 16.00
1st Quartile (Q1) 1600.00 1485.00 1000.00 264.00 185.00 805.00
2nd Quartile (Median/Q2) 2200.00 2048.00 1675.00 529.50 300.00 1600.00
3rd Quartile (Q3) 3090.00 2785.00 2420.00 1133.00 527.00 2410.00
Maximum 6730.00 6730.00 4030.00 3510.00 3500.00 6730.00
Sum 254378.00 456222.00 291403.00 79745.00 13870.00 841240.00
Sum of Squares 743708154.00 1203697044.00 621428481.00 124951831.00 22815458.00 1972892814.00
95% Confidence Interval (CI) for Mean 2182.84 < mean < 2616.75 2013.15 < mean < 2290.83 1533.16 < mean < 1797.16 687.58 < mean < 1009.12 252.57 < mean < 953.52 1576.07 < mean < 1762.19
95% CI for Standard Deviation (SD) 992.60 < SD < 1302.58 936.32 < SD < 1133.68 800.72883 < SD < 988.56 686.53 < SD < 916.54 626.82 < SD < 1147.11 1001.51 < SD < 1133.40
95% CI for Variance (VAR) 985263.80 < VAR < 1696711.51 876695.88 < VAR < 1285223.47 641166.66 < VAR < 977246.08 471328.51 < VAR < 840037.99 392903.16 < VAR < 1315863.15 1003030.85 < VAR < 1284597.45
Table A5. Distribution of CiteScore for journals by quartile and access model in the Engineering field.
STATISTIC Top 10% G1 Top 10% G2 Q1 G1 Q1 G2 Q2 G1 Q2 G2 Q3 G1 Q3 G2 Q4 G1 Q4 G2
Sample Size (n) 106 20 212 52 175 59 94 86 23 56
Mean 12.51 13.23 9.10 8.17 3.84 2.79 1.96 1.50 0.67 0.59
Median 10.70 12.30 7.75 7.10 3.90 3.00 2.00 1.50 0.70 0.70
Midrange 25.10 17.15 22.20 13.50 3.90 3.00 2.30 1.65 0.75 0.65
Root Mean Square (RMS) 13.57 14.01 10.45 9.72 4.02 3.05 2.09 1.68 0.82 0.69
Variance (s²) 27.88 22.43 26.55 28.29 1.48 1.50 0.53 0.59 0.24 0.12
Standard Deviation (s) 5.28 4.74 5.15 5.32 1.22 1.22 0.73 0.77 0.49 0.35
Mean Absolute Deviation 3.75 3.59 3.59 3.90 0.97 0.98 0.52 0.63 0.43 0.30
Range 34.60 18.30 40.40 25.60 7.00 5.40 4.00 2.90 1.50 1.30
Coefficient of Variation (%) 42.22 35.80 56.62 65.11 31.70 43.85 37.10 51.22 72.81 59.90
Minimum 7.8 8.0 2.0 0.7 0.4 0.3 0.3 0.2 0.0 0.0
1st Quartile (Q1) 9.20 9.60 5.95 5.35 3.00 1.70 1.50 1.10 0.10 0.20
2nd Quartile (Median/Q2) 10.70 12.30 7.75 7.10 3.90 3.00 2.00 1.50 0.70 0.70
3rd Quartile (Q3) 14.00 15.15 10.70 10.50 4.70 3.80 2.30 2.00 1.10 0.85
Maximum 42.4 26.3 42.4 26.3 7.4 5.7 4.3 3.1 1.5 1.3
Sum 1325.70 264.60 1929.30 424.80 671.50 164.80 184.60 129.10 15.40 33.00
Sum of Squares 19507.85 3926.80 23159.11 4913.04 2834.09 547.34 411.88 244.05 15.54 26.30
95% Confidence Interval (CI) for Mean 11.49 < mean < 13.52 11.01 < mean < 15.45 8.40 < mean < 9.80 6.69 < mean < 9.65 3.66 < mean < 4.02 2.47 < mean < 3.11 1.81 < mean < 2.11 1.34 < mean < 1.67 0.46 < mean < 0.88 0.49 < mean < 0.68
95% CI for Standard Deviation (SD) 4.65 < SD < 6.11 3.60 < SD < 6.92 4.70 < SD < 5.70 4.46 < SD < 6.60 1.10 < SD < 1.36 1.04 < SD < 1.50 0.64 < SD < 0.85 0.67 < SD < 0.90 0.38 < SD < 0.69 0.30 < SD < 0.43
95% CI for Variance (VAR) 21.65 < VAR < 37.28 12.97 < VAR < 47.85 22.13 < VAR < 32.44 19.87 < VAR < 43.51 1.21 < VAR < 1.85 1.08 < VAR < 2.24 0.41 < VAR < 0.72 0.45 < VAR < 0.82 0.14 < VAR < 0.48 0.09 < VAR < 0.19
Table A6. Number of citations (2020–2023) for journals by quartile and access model in the Engineering field.
STATISTIC Top 10% G1 Top 10% G2 Q1 G1 Q1 G2 Q2 G1 Q2 G2 Q3 G1 Q3 G2 Q4 G1 Q4 G2
Sample Size (n) 106 20 212 52 175 59 94 86 23 56
Mean 12666.10 2745.15 11389.07 1576.96 2389.89 461.76 481.19 339.66 110.70 82.63
Median 3047.00 2435.00 2090.50 1154.00 682.00 297.00 287.50 176.50 68.00 47.50
Midrange 242451.00 3758.00 242451.00 3550.50 38401.50 1719.50 1777.00 2335.00 419.00 327.00
Root Mean Square (RMS) 49790.02 3267.64 45238.93 2217.34 7341.87 704.77 776.09 672.49 207.38 142.59
Variance (s²) 2340698290.44 3306986.87 1925934406.34 2477449.25 48468432.63 288363.29 374759.66 340838.93 32149.77 13751.40
Standard Deviation (s) 48380.76 1818.51 43885.47 1573.99 6961.93 536.99 612.18 583.81 179.30 117.27
Mean Absolute Deviation 15557.91 1321.40 15483.35 1160.41 2679.99 322.00 366.75 300.66 103.95 72.12
Range 484584.00 6608.00 484584.00 7023.00 76743.00 3391.00 3516.00 4660.00 834.00 654.00
Coefficient of Variation (%) 381.97 66.24 385.33 99.81 291.31 116.29 127.22 171.88 161.98 141.93
Minimum 159 454 159 39 30 24 19 5 2 0
1st Quartile (Q1) 1564.00 1432.00 934.00 517.50 380.00 182.00 169.00 79.00 5.00 21.00
2nd Quartile (Median/Q2) 3047.00 2435.00 2090.50 1154.00 682.00 297.00 287.50 176.50 68.00 47.50
3rd Quartile (Q3) 7120.00 3289.50 4675.00 2331.50 1707.00 553.00 582.00 361.00 138.00 84.50
Maximum 484743 7062 484743 7062 76773 3415 3535 4665 836 654
Sum 1342607.00 54903.00 2414482.00 82002.00 418230.00 27244.00 45232.00 29211.00 2546.00 4627.00
Sum of Squares 262778920085 213549721 433870854682 255663912 9433029180 29305334 56617902 38893199 989126 1138633
95% Confidence Interval (CI) for Mean 3348.55 < mean < 21983.66 1894.06 < mean < 3596.24 5447.53 < mean < 17330.61 1138.76 < mean < 2015.16 1351.19 < mean < 3428.58 321.82 < mean < 601.70 355.81 < mean < 606.58 214.49 < mean < 464.83 33.16 < mean < 188.23 51.22 < mean < 114.03
95% CI for Standard Deviation (SD) 42628.88 < SD < 55941.18 1382.96 < SD < 2656.07 40067.89 < SD < 48513.35 1319.08 < SD < 1951.95 6300.97 < SD < 7779.01 454.58 < SD < 656.18 535.43 < SD < 714.81 507.71 < SD < 686.97 138.67 < SD < 253.78 98.86 < SD < 144.15
95% CI for Variance (VAR) 1817221330.47 < VAR < 3129416052.62 1912583.32 < VAR < 7054706.65 1605435634.57 < VAR < 2353545420.28 1739972.11 < VAR < 3810102.30 39702261.66 < VAR < 60512939.45 206646.78 < VAR < 430575.25 286684.56 < VAR < 510951.31 257767.28 < VAR < 471931.31 19230.07 < VAR < 64402.98 9774.13 < VAR < 20779.28
Table A7. Articles published (2020–2023) by journals in Scopus quartiles.
STATISTIC Top 10% G1 Top 10% G2 Q1 G1 Q1 G2 Q2 G1 Q2 G2 Q3 G1 Q3 G2 Q4 G1 Q4 G2
Sample Size (n) 106 20 212 52 175 59 94 86 23 56
Mean 1132.28 227.60 1441.69 1576.96 554.33 141.27 237.24 339.66 123.91 124.98
Median 249.50 179.00 237.00 1154.00 203.00 117.00 146.50 176.50 87.00 90.50
Midrange 24851.00 364.50 24851.00 3550.50 7260.00 231.08 680.00 2335.00 284.00 343.00
Root Mean Square (RMS) 4998.87 287.03 5879.55 2217.34 1477.63 170.21 338.88 672.49 179.40 172.64
Variance (s²) 23932373.37 32195.31 32644618.21 2477449.25 1886903.75 9167.95 59183.03 340838.93 17594.26 14443.58
Standard Deviation (s) 4892.07 179.43 5713.55 1573.99 1373.65 95.75 243.28 583.81 132.64 120.18
Mean Absolute Deviation 1422.63 131.74 2007.36 1160.41 581.27 71.04 164.43 300.66 86.05 78.90
Range 49672.00 665.00 49672.00 7023.00 14488.00 459.85 1294.00 4660.00 538.00 654.00
Coefficient of Variation (%) 432.05 78.84 396.31 99.81 247.80 67.78 102.54 171.88 107.05 96.16
Minimum 15 32 15 39 16 1.152 33 5 15 16
1st Quartile (Q1) 123.00 111.00 117.00 517.50 117.00 75.00 92.00 79.00 40.00 51.50
2nd Quartile (Median/Q2) 249.50 179.00 237.00 1154.00 203.00 117.00 146.50 176.50 87.00 90.50
3rd Quartile (Q3) 673.00 251.00 570.00 2331.50 412.00 183.00 277.00 361.00 151.00 167.50
Maximum 49687 697 49687 7062 14504 461 1327 4665 553 670
Sum 120022.00 4552.00 305639.00 82002.00 97007.00 8335.00 22301.00 29211.00 2850.00 6999.00
Sum of Squares 2648798076 1647746 7328652171 255663912 382094727 1709279 10794815 38893199 740226 1669147
95% Confidence Interval (CI) for Mean 190.13 < mean < 2074.44 143.62 < mean < 311.58 668.15 < mean < 2215.24 1138.76 < mean < 2015.16 349.38 < mean < 759.27 116.32 < mean < 166.23 187.42 < mean < 287.07 214.49 < mean < 464.83 66.55 < mean < 181.27 92.80 < mean < 157.17
95% CI for Standard Deviation (SD) 4310.46 < SD < 5656.55 136.46 < SD < 262.07 5216.53 < SD < 6316.06 1319.08 < SD < 1951.95 1243.23 < SD < 1534.86 81.06 < SD < 117.00 212.78 < SD < 284.06 507.71 < SD < 686.97 102.59 < SD < 187.74 101.32 < SD < 147.73
95% CI for Variance (VAR) 18580104.73 < VAR < 31996585.68 18620.03 < VAR < 68681.38 27212159.04 < VAR < 39892631.57 1739972.11 < VAR < 3810102.30 1545631.71 < VAR < 2355803.28 6569.93 < VAR < 13689.30 45273.98 < VAR < 80690.77 257767.28 < VAR < 471931.31 10523.84 < VAR < 35245.14 10266.11 < VAR < 21825.20
Table A8. Percentage of cited articles (2020–2023) by journals in Scopus quartiles.
STATISTIC Top 10% G1 Top 10% G2 Q1 G1 Q1 G2 Q2 G1 Q2 G2 Q3 G1 Q3 G2 Q4 G1 Q4 G2
Sample Size (n) 106 20 212 52 175 59 94 86 23 56
Mean 83.36 88.75 78.55 75.88 69.29 60.08 56.22 46.86 30.39 28.07
Median 84.00 87.50 79.00 80.00 70.00 62.00 57.00 49.50 35.00 33.00
Midrange 78.50 89.50 72.50 65.00 59.50 50.50 50.00 46.00 30.50 25.50
Root Mean Square (RMS) 83.64 88.92 79.03 77.42 69.94 61.57 57.10 49.33 34.84 31.30
Variance (s²) 46.92 31.57 75.48 240.61 90.31 183.73 99.85 240.47 303.07 194.98
Standard Deviation (s) 6.85 5.62 8.69 15.51 9.50 13.55 9.99 15.51 17.41 13.96
Mean Absolute Deviation 5.30 4.63 6.75 11.57 6.94 10.49 7.46 12.11 15.04 11.70
Range 37.00 19.00 49.00 68.00 65.00 61.00 62.00 70.00 53.00 51.00
Coefficient of Variation (%) 8.22 6.33 11.06 20.44 13.71 22.56 17.77 33.09 57.28 49.74
Minimum 60 80 48 31 27 20 19 11 4 0
1st Quartile (Q1) 78.00 84.00 73.00 72.00 65.00 54.00 51.00 40.00 13.00 16.50
2nd Quartile (Median/Q2) 84.00 87.50 79.00 80.00 70.00 62.00 57.00 49.50 35.00 33.00
3rd Quartile (Q3) 88.00 93.00 84.50 86.00 75.00 70.00 62.00 58.00 44.00 38.50
Maximum 97 99 97 99 92 81 81 81 57 51
Sum 8836.00 1775.00 16653.00 3946.00 12126.00 3545.00 5285.00 4030.00 699.00 1572.00
Sum of Squares 741482.00 158131.00 1324051.00 311712.00 855942.00 223657.00 306427.00 209288.00 27911.00 54852.00
95% Confidence Interval (CI) for Mean 82.04 < mean < 84.68 86.12 < mean < 91.38 77.38 < mean < 79.73 71.57 < mean < 80.20 67.87 < mean < 70.71 56.55 < mean < 63.62 54.18 < mean < 58.27 43.54 < mean < 50.19 22.86 < mean < 37.92 24.33 < mean < 31.81
95% CI for Standard Deviation (SD) 6.04 < SD < 7.92 4.27 < SD < 8.21 7.93 < SD < 9.60 13.00 < SD < 19.24 8.60 < SD < 10.62 11.47 < SD < 16.56 8.74 < SD < 11.67 13.49 < SD < 18.25 13.46 < SD < 24.64 11.77 < SD < 17.16
95% CI for Variance (VAR) 36.43 < VAR < 62.73 18.26 < VAR < 67.34 62.92 < VAR < 92.24 168.99 < VAR < 370.04 73.98 < VAR < 112.75 131.67 < VAR < 274.35 76.39 < VAR < 136.14 181.86 < VAR < 332.96 181.28 < VAR < 607.11 138.58 < VAR < 294.62

References

  1. Ancion, Z., Borrell-Damián, L., Mounier, P., Rooryck, J., & Saenen, B. (2022). Action plan for diamond open access. [CrossRef]
  2. Andringa, S., Mos, M., Van Beuningen, C., González, P., Hornikx, J., & Steinkrauss, R. (2024). Diamond is a scientist’s best friend: Counteracting systemic inequality in open access publishing. Dutch Journal of Applied Linguistics, 13, Article 18802. [CrossRef]
  3. Berlin Declaration on Open Access to Knowledge in the Sciences and Humanities. (2003, October 22). Berlin Declaration. Retrieved from https://openaccess.mpg.de/Berlin-Declaration.
  4. BOAI. (2002). Budapest Open Access Initiative. Open Society Foundations. Retrieved December 14, 2024, from https://www.influencewatch.org/organization/budapest-open-access-initiative/.
  5. BOAI. (2012). Ten years on from the Budapest Open Access Initiative: Setting the default to open (BOAI10). Italian Journal of Library & Information Science, 3. [CrossRef]
  6. Borrego, A. (2023). Article processing charges for open access journal publishing: A review. Learned Publishing, 36, 359–378. [CrossRef]
  7. De Filippo, D., & Mañana-Rodríguez, J. (2020). Open access initiatives in European universities: Analysis of their implementation and the visibility of publications in the YERUN network. Scientometrics, 125, 2667–2694. [CrossRef]
  8. Druelinger, D., & Ma, L. (2023). Missing a golden opportunity? An analysis of publication trends by income level in the Directory of Open Access Journals 1987–2020. Learned Publishing, 36, 348-358. [CrossRef]
  9. Frank, J., Foster, R., & Pagliari, C. (2023). Open access publishing – Noble intention, flawed reality. Social Science & Medicine, 317, 115592. [CrossRef]
  10. Harnad, S. (2001). The self-archiving initiative. Nature, 410(6832), 1024–1025. [CrossRef]
  11. Haug, C. J. (2019). No free lunch—What price Plan S for scientific publishing? The New England Journal of Medicine, 380(12), 1181–1185. [CrossRef]
  12. Huang, C.-K., Neylon, C., Montgomery, L., Hosking, R., Diprose, J. P., Handcock, R. N., & Wilson, K. (2024). Open access research outputs receive more diverse citations. Scientometrics, 129(3), 825–845. [CrossRef]
  13. Laakso, M., & Björk, B. C. (2012). Anatomy of open access publishing: A study of longitudinal development and internal structure. BMC Medicine, 10, 124. [CrossRef]
  14. Langham-Putrow, A., Bakker, C., & Riegelman, A. (2021). Is the open access citation advantage real? A systematic review of the citation of open access and subscription-based articles. PLoS ONE, 16(6), e0253129. [CrossRef]
  15. Max-Planck-Gesellschaft. (2003). Berlin Declaration on Open Access to Knowledge in the Sciences and Humanities. Munich, Germany. Retrieved from http://openaccess.mpg.de/286432/Berlin-Declaration.
  16. May, C. (2020). Academic publishing and open access: Costs, benefits and options for publishing research. Politics, 40(1), 120–135. [CrossRef]
  17. Miranda, R., & Garcia-Carpintero, E. (2019). Comparison of the share of documents and citations from different quartile journals in 25 research areas. Scientometrics, 121(2), 479–501. [CrossRef]
  18. Mounier, P., & Rooryck, J. (2023, December 23). Towards a federated global community of Diamond Open Access. The Diamond Papers. Retrieved July 31, 2024, from https://thd.hypotheses.org/296.
  19. Neuendorf, K. A. (2017). The content analysis guidebook. SAGE Publications.
  20. Rodrigues, M. L., Savino, W., & Goldenberg, S. (2022). Article-processing charges as a barrier for science in low-to-medium income regions. Memórias do Instituto Oswaldo Cruz, 117, e220064. [CrossRef]
  21. Rooryck, J., Rico Castro, P., & de Pablo Llorente, V. (2024). Quality as a public good: The Diamond Open Access Standard (DOAS) and its role in the Global Diamond Open Access Alliance. Septentrio Conference Series, (1). Retrieved from https://new.eludamos.org/index.php/SCS/article/view/7770.
  22. Schvirck, E., Lievore, C., Rubbo, P., Herrera Cantorani, J. R., & Pilatti, L. A. (2024). Publicaciones invisibles: un estudio de la productividad académica en la base de datos Web of Science. Revista Española de Documentación Científica, 47(1), e375. [CrossRef]
  23. Segovia, M., Galleguillos Madrid, F. M., Portillo, C., Rojas, E. M., Gallegos, S., Castillo, J., Salazar, I., Quezada, G. R., & Toro, N. (2024). The positive impact of open access scientific publishing in Chile. Publications, 12(41). [CrossRef]
  24. Suber, P. (2012). Open Access. Cambridge, Massachusetts: MIT Press.
  25. Swan, A. (2015). Open Access and the Humanities: Contexts, Controversies and the Future. Cambridge University Press.
  26. Willinsky, J. (2006). The access principle: The case for open access to research. The FASEB Journal, 20(4), A439. [CrossRef]
  27. Yoon, J., Ku, H., & Chung, E. (2024). The road to sustainability: Examining key drivers in open access diamond journal publishing. Learned Publishing, 37, e1611. [CrossRef]
Table 1. Distribution of Open Access journals with APC (G1) and Diamond Open Access journals (G2) by quartiles and top 10%.
Quartile Journals G1 G2
Q1 264 212 52
Q2 234 175 59
Q3 180 94 86
Q4 79 23 56
Top 10% 126 106 20
Table 2. Statistical comparison between Open Access journals with APC (G1) and Diamond Open Access journals (G2) for CiteScore and accumulated citations by quartiles.
Quartile Metric G1 (Mean, Median) G2 (Mean, Median) Statistic (z) p-value Significance
Q1 CiteScore 9.10; 7.75 8.17; 7.10 1.25151 >0.05 No
Q2 CiteScore 3.84; 3.90 2.79; 3.00 5.03487 <0.05 Yes
Q3 CiteScore 1.96; 2.00 1.50; 1.50 3.54535 <0.05 Yes
Q4 CiteScore 0.67; 0.70 0.59; 0.70 0.75002 >0.05 No
Q1 Citations 11389; 2090 1577; 1154 3.92681 <0.05 Yes
Q2 Citations 2389; 682 462; 297 5.71871 <0.05 Yes
Q3 Citations 481; 287 340; 176 2.57453 <0.05 Yes
Q4 Citations 111; 68 83; 48 0.31835 >0.05 No
Table 3. Correlation coefficients between CiteScore and accumulated citations for Open Access journals with APC (G1) and Diamond Open Access journals (G2) by quartiles.
Quartile Group Correlation Coefficient (r) p-value R² (%)
Q1 G1 0.0358 0.60418 0.13
Q1 G2 0.5446 0.00003 29.7
Q2 G1 0.2035 0.00691 4.14
Q2 G2 0.3629 0.00473 13.2
Q3 G1 0.0838 0.42201 0.70
Q3 G2 0.4866 0.00000 23.7
Q4 G1 0.5791 0.00378 33.5
Q4 G2 0.4922 0.00012 24.2
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.