Potential applications of Chat Generative Pre-trained Transformer in obstetrics and gynecology: comment

Article information

Obstet Gynecol Sci. 2024;67(3):341-342
Publication date (electronic) : 2024 April 2
doi: https://doi.org/10.5468/ogs.24027
1Private Academic Consultant, Phonhong, Lao People’s Democratic Republic
2Department of Community Medicine, Dr. D. Y. Patil Vidhyapeeth, Pune, Pimpri-Chinchwad, India
Corresponding author: Hinpetch Daungsupawong, PhD, Private Academic Consultant, Lak 52 Phonhong, Vientiane 10000, Lao People’s Democratic Republic, E-mail: hinpetchdaung@gmail.com
Received 2024 January 23; Revised 2024 March 18; Accepted 2024 March 26.

Dear Editor, we would like to share our thoughts on the publication “Potential applications of Chat Generative Pre-trained Transformer (ChatGPT) in obstetrics and gynecology in Korea: a review article” [1]. The article illustrates ChatGPT’s capacity to produce human-like responses and discusses its use in obstetrics and gynecology (OBGYN). Although the promise of ChatGPT in medicine, research, and education is mentioned, no concrete examples or supporting data justify these assertions. The benefits and drawbacks of using ChatGPT in healthcare have been extensively explored, and weighing these factors is important [2,3].

Furthermore, the article only skims over the difficulties of verifying the accuracy of chatbot-generated data and the associated ethical issues. It does not discuss possible consequences or strategies to mitigate the shortcomings and inaccuracies of artificial intelligence (AI) systems.

In the future, it will be imperative to focus on improving and verifying the reliability and accuracy of the data produced by ChatGPT. Further investigation is required to identify and resolve the ethical issues and limitations surrounding the application of chatbot technology in OBGYN and medicine more broadly. Ongoing investigations and research projects addressing the shortcomings and errors of AI systems in healthcare would also contribute to a more thorough understanding of the subject. Finally, educating users about the constraints and potential hazards of chatbot technology in healthcare is critical to guaranteeing patient safety. User ethics is an important topic that should not be ignored; the tool must be used responsibly [4].

As the study [1] notes, several machine-learning techniques have been successfully used for data acquisition in the early diagnosis of maternal-fetal disorders, as described by Ahn and Lee [5]. Ethical concerns must be considered as artificial intelligence becomes more widely used [5]. Further exploration of topics including patient confidentiality, informed consent, and potential biases in AI-generated responses is essential to provide a more thorough understanding of the ethical implications of using ChatGPT in OBGYN. Furthermore, investigating the ramifications of AI technology for sensitive medical data and decision-making procedures will help clarify the moral dilemmas that medical practitioners may encounter. Finally, considering the effect of AI on the doctor-patient relationship and ensuring that ChatGPT is used transparently will help allay ethical concerns and increase confidence in the technology.


Conflict of interest

The authors declare no conflict of interest.

Ethical approval

Not applicable.

Patient consent

Not applicable.

Funding information



1. Lee Y, Kim SY. Potential applications of ChatGPT in obstetrics and gynecology in Korea: a review article. Obstet Gynecol Sci 2024;67:153–9.
2. Lee P, Bubeck S, Petro J. Benefits, limits, and risks of GPT-4 as an AI chatbot for medicine. N Engl J Med 2023;388:1233–9.
3. The Lancet Digital Health. ChatGPT: friend or foe? Lancet Digit Health 2023;5:e102.
4. Kleebayoon A, Wiwanitkit V. Artificial intelligence, chatbots, plagiarism and basic honesty: comment. Cell Mol Bioeng 2023;16:173–4.
5. Ahn KH, Lee KS. Artificial intelligence in obstetrics. Obstet Gynecol Sci 2022;65:113–24.
