CONTEMPORARY EDUCATIONAL TECHNOLOGY
e-ISSN: 1309-517X
Enhancing logical reasoning in language models: An investigation of the Capybara dataset

Luis Eduardo Muñoz Guerrero 1, Yony Fernando Ceballos 2, Luis David Trejos Rojas 1 *

CONT ED TECHNOLOGY, Volume 17, Issue 3, Article No: ep582

https://doi.org/10.30935/cedtech/16425

Submitted: 26 July 2024, Published Online: 02 June 2025

OPEN ACCESS


Abstract

Recent progress in conversational AI has highlighted the need for language models with strong logical reasoning and extrapolation capabilities. This study investigates how effectively the Capybara dataset can improve the reasoning ability of language-based systems. Several state-of-the-art language models were fine-tuned on the Capybara corpus and then evaluated on standard benchmarks that demand sophisticated reasoning. Comparative evaluation across these benchmarks shows that the fine-tuned models improve in logical reasoning and produce stronger inferences. The paper further discusses the implications of these findings for developers seeking more human-like conversational intelligence, and the results suggest that the Capybara dataset could become a valuable resource for training reasoning-oriented language models.
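The abstract does not report implementation details. As an illustrative sketch only, the code below shows what supervised fine-tuning on the Capybara corpus might look like, assuming the publicly available Hugging Face release LDJnr/Capybara (read here as records containing a "conversation" list of input/output turns), a small placeholder base model (gpt2), and the standard transformers Trainer API. None of these choices are taken from the article.

    # Illustrative sketch, not the authors' code: supervised fine-tuning of a small
    # causal language model on the Capybara dataset.
    from datasets import load_dataset
    from transformers import (AutoModelForCausalLM, AutoTokenizer,
                              DataCollatorForLanguageModeling, Trainer, TrainingArguments)

    BASE_MODEL = "gpt2"  # placeholder; the article's base models are not named in the abstract

    tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
    tokenizer.pad_token = tokenizer.eos_token
    model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)

    def flatten(example):
        # Assumed schema: each record has a "conversation" list of {"input", "output"} turns.
        # Join the multi-turn dialogue into a single training transcript.
        lines = [f"User: {turn['input']}\nAssistant: {turn['output']}"
                 for turn in example["conversation"]]
        return {"text": "\n".join(lines)}

    def tokenize(example):
        return tokenizer(example["text"], truncation=True, max_length=1024)

    raw = load_dataset("LDJnr/Capybara", split="train")
    flat = raw.map(flatten, remove_columns=raw.column_names)  # keep only "text"
    dataset = flat.map(tokenize, remove_columns=["text"])     # keep only token ids

    trainer = Trainer(
        model=model,
        args=TrainingArguments(output_dir="capybara-sft",
                               per_device_train_batch_size=2,
                               num_train_epochs=1,
                               logging_steps=100),
        train_dataset=dataset,
        data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
    )
    trainer.train()

The fine-tuned checkpoint would then be evaluated on reasoning benchmarks of the kind the abstract refers to; the specific benchmarks and models are described in the full text.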


The articles published in this journal are licensed under the CC-BY Creative Commons Attribution International License.