SmartMedBox: A Smart Medicine Box for Visually Impaired People Using IoT and Computer Vision Techniques

Vidula V. Meshram, Kailas R. Patil, Vishal A. Meshram, Shripad Bhatlawande

Department of Computer Engineering, Vishwakarma University, Pune 411048, India

Department of Computer Engineering, Vishwakarma Institute of Information Technology, Pune 411048, India

Electronics & Telecommunication Engineering, Vishwakarma Institute of Technology, Pune 411037, India

Corresponding Author Email: kailas.patil@vupune.ac.in
Page: 681-688 | DOI: https://doi.org/10.18280/ria.360504

Received: 18 July 2022 | Revised: 12 October 2022 | Accepted: 22 October 2022 | Available online: 23 December 2022

© 2022 IIETA. This article is published by IIETA and is licensed under the CC BY 4.0 license (http://creativecommons.org/licenses/by/4.0/).

OPEN ACCESS

Abstract: 

The Internet of Things (IoT) can easily connect real-life objects or physical things to the internet and thus has applications in many domains, of which healthcare is one of the most prominent. The proposed work aims at the design and development of a smart medicine box for visually impaired people using IoT and computer vision methods. The application comprises two modules. First, the QR code scanning module in the mobile app scans the QR code applied to the medicine strip by a pharmacist, reads the entire medicine information, and sets voice alarms according to the medicine dosage schedule. The second module comprises the medicine box, with an ultrasonic sensor and an alarm sensor connected to an Arduino microcontroller. When the user cannot find the medicine box, he presses the "locate me" button in the mobile app and the alarm starts ringing, enabling the user to easily locate the medicine box in the indoor environment by sound. When the ultrasonic sensor detects an object close to the medicine box, the alarm stops ringing, indicating the actual location of the medicine box. An experimental analysis of the system with 30 real-time beneficiaries produced 86.66% accuracy in finding the location of SmartMedBox.

Keywords: 

alarm sensor, computer vision, internet of things, mobile application, QR code generation, ultrasonic sensor, visually impaired people

1. Introduction

Uncorrected refractive error is a major cause of vision loss in adults and children, as per the key facts given by WHO. A visual acuity (VA) of 20/50 or worse indicates that a person has an uncorrected refractive error. Vision impairment severely impacts a person's quality of life. One reason refractive errors remain uncorrected is that the patient is unaware of the error and is reluctant to visit a specialist because of the mundane activities of day-to-day life. Therefore, medical issues related to vision impairment continue to grow, and people suffer. Senior citizens face the same issue, since they often cannot attend medical clinics. People are also unwilling to stand in line and wait for registration. If a person has a severe health issue and the therapy is not available locally, the person must travel to the nearest location where it is offered. In India, over 1.8 million people die from cardiac disease, with 42.7% dying en route to an emergency clinic; the cause is a lack of suitable monitoring in the rescue/transport vehicle. In today's fast-paced world, improving patient monitoring systems is critical. According to 2016 legislative figures, only one paramedic is available for every 3200 Indians, which is a troubling situation. WHO data reveal that people die every day due to a lack of access to essential health services, while other developing countries have provided their citizens with acceptable health services. This project is a step toward improving this situation in India. The work proposes the design and implementation of a reliable, low-energy-consumption system that helps blind and elderly people manage their medicine intake using the Internet of Things (IoT), as IoT has prominent applications in the healthcare domain [1]. It will help visually impaired as well as elderly people manage their medicine intake, which in turn will improve their medical condition through the timely intake of medicine.

Many people today depend heavily on medicine due to the persistence of various diseases. Diseases do not discriminate based on age; they may strike persons of any age. Many people have congenital disabilities such as blindness; nevertheless, technology now offers agile devices that remove such obstacles for these communities. Moreover, consuming medicine at the correct time and in the correct amount can speed up recovery. Many people who use prescription drugs do not follow their doctors' instructions properly while taking medicine. Some start feeling better and decide not to finish the medication, while others who do not observe relief in their symptoms stop taking the medication because they believe it is not working. The common problems faced by individuals in their daily life are:

1.1 Missing tablet consumption on time

Diseases become more severe, and the number of citizens using medicine rises dramatically due to long-term illnesses affecting people of all ages. Patients with diabetes, heart disease, or high blood pressure, and the elderly, all take medications daily for various reasons. However, in many cases, individuals fail to take the medicine on time, resulting in various adverse effects. Medicine should be consumed on time, every time, to ensure that an adequate amount of medication is always present in the body.

1.2 Tablets mismatch

Some patients are confused and modify their pattern of medicine intake owing to a lack of information about the medicine. Elderly and illiterate patients often cannot choose the proper tablet because they do not understand the tablets or the instructions of the prescribing doctor. This leads to serious health issues and the danger of drug allergies.

Thus, in this article, we introduce an intelligent medication box that assists elderly as well as visually impaired people in the proper intake of medicine using the Internet of Things (IoT) and Computer Vision (CV) techniques.

The rest of the article is organized as follows. Section 2 presents the literature review and the existing methodologies proposed by numerous researchers. Section 3 describes the research methodology, the system architecture, and the implementation flow of the model in detail. Section 4 presents the experimental setup and results, Section 5 discusses the current work, and Section 6 concludes the proposed system.

2. Literature Survey

Kripalani et al. [2] proposed an illustrated medication schedule for low-literacy persons. Low-literacy persons have trouble reading prescription pill labels and other pharmaceutical instructions. Their article details the creation, implementation, and early assessment of an illustrated dosage schedule that uses pill illustrations and iconography to convey a patient's daily drug routine. Randomly chosen participants were evaluated in a controlled environment and reported on the usage of the card and its usefulness for the successful deployment of the system. The comments were examined based on the patient's reading level and other factors. According to the experimental analysis, around 92% of respondents said it was easy to use for daily medication management, while 94% of patients commented that it made it easy to remember important medication information. UbiMeds, demonstrated by Silva et al. [3], is a medication-adherence assistance system aimed at improving accessibility. It is a smartphone application that integrates with prevailing Patient Health Information (PHI) systems to provide automatically generated scheduling, reminders, and monitoring of prescriptions. Proactive notifications about medication intake are sent to doctors and family members when the user fails to follow the prescribed regimen. The purpose was to make drug administration more accessible to elderly and disabled people and to raise caregivers' and family members' awareness of the patient's condition. A performance study with real patients who have complicated medication schedules is planned as part of the UbiMeds initiative; such an assessment would show how well UbiMeds' basic principles increase accessibility for the challenges raised in that study.

Medication self-management for visually impaired people is discussed by McCann et al. [4]. The principal objective of this research is to compare the medication self-management problems of elderly people with and without visual impairment. The study evaluation was conducted with 158 participants aged over 65, with and without visual impairment, who were assessed and compared. Older persons with visual impairment are more likely to need medication management assistance than their peers without visual impairment. Patients require accurate and timely reminders in primary care clinical practice, and strategies for successful medication self-management need to be investigated. Benjamim et al. [5] created a technique for visually challenged persons to identify medicine boxes using visual feature matching. To detect critical elements on the medication box, they employ a camera, which is available in a variety of everyday devices such as computers, TVs, and mobile phones. Following the identification of the box, audio recordings are played to provide information on the medication's dose, indications, and precautions. Using this camera system may assist many blind people in taking the correct medicine, as prescribed and at the correct time set by their doctor. Hasanuzzaman et al. [6] proposed a system combining RFID and video analysis for monitoring medicine-taking activity. This approach could help monitor older people's everyday activities in their own homes. RFID tags are attached to prescription bottles in a medicine cabinet so that each bottle has a unique ID, and the treatment data for each tag are manually entered into a database. If any of these bottles is removed from the medicine cabinet, RFID readers detect this and determine the tag linked to the medication bottle. A video camera is mounted to observe the medicine-taking activity by combining face tracking and recognition, mouth detection, and background subtraction.

Xiao et al. [7] proposed a dynamic navigation approach for elderly as well as visually impaired people. They give a high-level overview of an assisted navigation system that uses real-time localization technology to parse semantic elements in the physical and virtual worlds, and explain the operational principle of an intelligent system that allows the user to navigate alone. The authors offer a pilot study to show the system's capabilities in indoor and outdoor settings. This method resulted in a workable prototype, but it still has a long way to go. It is not easy to manage CPU resources and synchronize image processing and other calculations when dealing with a massive dataset from several sensors. It is challenging to reduce the image blur produced by the vibration of individual walking steps when a head-mounted camera is worn by a blind person. It is critical in outdoor navigation that users have constant access to GPS information. Other issues include integration of the system's components and creating and maintaining an availability database. Salgia et al. [8] proposed a smart pill box that helps blind and elderly people take the correct medicine at the correct time. The primary concept is to combine ringing with shadow-based slot sensing on a regular pillbox, with touch-sensing fields used as an alternative to the light-based sensing approach. A GSM module is incorporated to keep the box up to date by informing both the patient and the pharmacist at the appropriate time. This improved pillbox, labeled the smart pill box, adds to the solution of this problem. The pillbox's intelligence is achieved by using low-cost slot sensing methods such as capacitance-based slot detection. These basic but effective tactics are aided by innovations such as GSM technology, which bridges the communication barriers between the provider or pharmacist and the patient.

Gori et al. [9] reviewed devices for visually handicapped persons and raised critical questions concerning sensory substitution devices. Despite significant technical advances in the last decade, such systems are still far from being used in everyday life, particularly by children. Jayashree et al. [10] proposed a medicine management application for visually impaired users. This research establishes an application to make it easier and more natural for blind users to identify medications and take them according to their medical prescription, so that visually impaired persons do not need to rely on others to find their medicine. In this application, a voice reminder advises the user when to take their medication. The integrated camera of the cellphone captures images of the medication strip held in the hand; the image is analyzed, and text extraction and recognition are performed, resulting in the identification of the medicine name. The software also offers a spotter module that checks the captured medicine name against the prescription already stored on the user's mobile phone and, if the medicine is due, communicates the quantity of medicine to the user via voice output. The user then takes their medicine according to their prescription after hearing the speech output from their phone. It may also help illiterate individuals who struggle to figure out which drugs they need to take, which is especially difficult for elderly persons who cannot read the medicine names on their own.

Berger et al. [11] evaluated Google Glass as assistive technology for visually impaired people. Their article presents and primarily evaluates an application for fundamental navigation concerns that may be used by blind or visually impaired persons daily. Second, the authors focused on how the app might be used as a rudimentary recognition aid for visually impaired persons. The smart glasses and the developed mobile application can be referred to as assistive technology in this context, since they enable participants to reach any location faster and more easily without external intervention. The developed Google Glass program is promising, and further evaluation and experimentation are being carried out by the researchers to gather additional data for improving it.

Almuzaini and Abdullah-Al-Wadud [12] reviewed medication identification techniques for blind people, describing different approaches along with their advantages and disadvantages. The principal methods demonstrated in that paper include QR code generation and IoT-based systems for smart timetable management of medication. Lee and Lee [13] evaluated the use of medications and pharmacy services by blind or visually impaired people. Their study assesses how visually impaired people use drugs and pharmacy services, and the status of pharmaceutical counseling provided by community pharmacists to blind people. It is a bridging study that included visually challenged people and nine pharmacists. After a pilot study, survey questions were created for each of the ten groups. Participants with visual impairments were selected from two South Korean institutes for the visually handicapped. Thirteen pharmacists were recruited from 47 community pharmacies that had been chosen by the Seoul Metropolitan Administration for braille sticker dissemination.

Kamal et al. [14] evaluated a phone-based treatment program in low- to middle-income countries, where there are a large number of clinics and patient literacy is low. The follow-up study found no major improvements in medication adherence behavior. Farhadyar et al. [15] proposed a smart medication management system for visually impaired people. The authors presented a goal-directed technique to identify the objectives of blind users who had to utilize a medicine reconciliation method. The most significant aspect of this strategy is designing personas. The information required to create the personas came from two literature studies and conversations with 14 blind people and three professionals in the field; as a result, the individuals' various objectives were identified. Chang et al. [16] proposed an IoT-based medication identification technique using deep learning. The suggested system combines wearable electronic glasses, a waist-mounted continuous medication pill detection device, mobile device applications, and a virtualized management solution. To prevent ingestion of incorrect medications, the method recognizes the drug tablet using deep learning.

Meshram et al. [17] proposed a system named NavCane, which assists blind people with navigation and object recognition. The device is specially designed for indoor environments and accurately detects obstacles. The cane is fitted with numerous analog sensors and an RFID reader from which data are dynamically extracted. The usability and efficacy of NavCane for people with low vision, completely blind people, and the elderly were shown in a study with visually impaired participants. Almuzaini and Abdullah-Al-Wadud [18] proposed a system for medicine identification for people who are unable to see. The system aims to enable people to identify their medicine without any technology other than their smartphones. After building the framework, the authors tested its usability and accessibility with blind users.

Chang et al. [19] proposed MedGlasses, a wearable smart-glasses system for pill recognition by blind people. The device employs a convolutional neural network for pill recognition. The MedGlasses system comprises a pair of wireless wearable eyeglasses, a machine-learning-based smart medicine detection box, a mobile app, and a virtual information management platform. Test data suggest that up to 95.1% classification accuracy can be attained. As a result, the MedGlasses system may successfully limit the issue of drug interactions caused by taking the wrong medications, lowering the expense of medical treatment and creating a safe pharmaceutical environment for vision-challenged patients with chronic diseases. Awad et al. [20] 3D-printed tablets (printlets) with Braille patterns for blind people. This research illustrates the use of 3D printing to build customized dosage forms tailored to blind patients. The selective laser sintering (SLS) 3D printing process can be used to make printlets with Moon or Braille patterns on their exterior that can be recognized by a blind person. It is anticipated that this novel idea will provide a revolutionary technique for the management of visually impaired patients, boosting independent medication management and lowering medication errors.

Chang et al. [21] proposed an intelligent system for assisting visually impaired people in taking safety measures while walking on the road and crossing zebra crossings. Zebra crossings are detected in images using a deep learning method. When approaching a zebra crossing, visually impaired users wear the suggested intelligent eyewear and waist-mounted device and use the proposed cane system. Whenever a visually impaired pedestrian approaches a zebra crossing, they receive an instant message informing them of the present condition at the crossing and the state of the traffic signal at the signalized intersection.

Barontini et al. [22] created an integrated wearable device for obstacle avoidance for visually impaired people. An RGB-D sensor, a peripheral device to process the visual data for obstacle detection, and a wearable haptic device that supplies general direction signals for guidance in an unfamiliar indoor setting are all part of the proposed system. Almukainzi et al. [23] studied medicine identification using Braille labels for blind people in Saudi Arabia. The main contribution of this work is an investigation of the medication usage patterns of visually impaired and blind Saudi Arabian patients, as well as an assessment of the need to label medicines in Braille when dispensing them to visually impaired people.

Thus, from the literature survey, it was found that there are systems in place that aid visually impaired people in obstacle detection, obstacle avoidance, navigation, and medicine identification. The existing medicine identification systems use either a mobile-app-based approach, an IoT-based approach, or a Braille labeling approach, and are either costly or not user-friendly. We have designed a system consisting of a mobile app and the IoT-based SmartMedBox, which has the following characteristics:

(1) Mobile app to scan QR code.

(2) The QR code encodes the medicine information as per the doctor's prescription: the medicine name, medicine intake schedule, medicine dosage, and medicine expiry date.

(3) The mobile app has a locate button that helps the user to identify the location of the Smart medicine box which stores the medicine.

(4) The alarm facility helps the user to reach the Smart medicine box.

(5) Set up notification for medicine alerts.

3. Research Methodology

Figure 1 describes the proposed system architecture for intelligent medicine detection using IoT and computer vision methods. This application is helpful for older people as well as visually impaired patients. The methodology uses QR code generation to capture the descriptive metadata of the medicine, together with its consumption times, and stores it in the networked data set. The mobile application extracts all of this data during the QR code scanning process and automatically generates alarms of the desired type for taking the medicine. Whenever a patient cannot find the actual location of the medicine box, additional functionality is provided: the IoT module helps locate the medicine box using an alarm sensor whenever the user sends a "locate me" request to the web server. The following subsections describe each module in detail.

Figure 1. System architecture for the proposed model

3.1 QR code generation

A Japanese company, Denso Wave, devised the QR (Quick Response) Code [24]. Unlike a traditional barcode, data is encoded both vertically and horizontally. Because these codes can automatically encode kanji characters, they immediately gained worldwide appeal and were adopted by many systems, notably in Japan. QR codes hold important data and are divided into portions, each with a specific purpose. In addition to being simpler to read, QR codes store more data. Our module stores prescription-related data. The medical practitioner fills in the information regarding the drug prescribed by the doctor: the medicine name, medicine dose, expiry date of the medicine, medicine schedule, the number of days for which the medicine must be ingested, and the start and finish dates of the medicine are entered in the QR generation module. The 8-bit codewords are separated into error correction blocks based on the QR code version and encoding level. spire.Barcode.BitMatrix creates a QR code from the text data.
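As an illustration of this step, the following minimal Python sketch encodes a hypothetical prescription record as a QR code using the open-source qrcode package; the field names and values are assumptions for illustration only, since the paper's own implementation relies on the Spire.Barcode library.

import json
import qrcode

# Hypothetical prescription record entered by the medical practitioner.
prescription = {
    "medicine_name": "Metformin 500 mg",
    "dosage": "1 tablet",
    "schedule": ["08:00", "20:00"],      # intake times per day
    "duration_days": 30,
    "start_date": "2022-11-01",
    "end_date": "2022-11-30",
    "expiry_date": "2024-05-31",
}

# Serialize the record and encode it as a QR image that can be printed
# and applied to the medicine strip by the pharmacist.
img = qrcode.make(json.dumps(prescription))
img.save("medicine_qr.png")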

3.2 QR code scanning

The QR code applied to the medicine strip has to be scanned to extract the medicine information. This is done by tapping the scanQR button in the mobile app, which turns on the mobile camera. The QR code is scanned by holding the mobile phone steadily over the QR-code-labeled medicine strip until the app generates a beep event, extracting all the data encoded in the code.
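A host-side sketch of the scanning step is given below, assuming OpenCV's built-in QRCodeDetector and the JSON payload from the generation example above; the actual mobile app performs the same decoding with the phone camera and its own scanning library.

import json
import cv2

frame = cv2.imread("medicine_qr.png")        # in the app, a live camera frame
detector = cv2.QRCodeDetector()
data, points, _ = detector.detectAndDecode(frame)

if data:                                     # non-empty string means a code was decoded
    prescription = json.loads(data)
    print("Decoded:", prescription["medicine_name"], prescription["schedule"])
    # Here the app would play the confirmation beep and pass the record
    # to the alarm-scheduling module.
else:
    print("No QR code found; hold the camera steady over the strip.")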

3.3 Event generation

Alerts are set up for the specified days of drug ingestion based on the information collected from the QR code after scanning. In addition to the alarm, we have incorporated an SMS service for notification, which alerts users about medicine intake times.
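The sketch below shows how such alerts could be derived from the decoded record, assuming the Python schedule package; the real app uses the platform's own alarm service, and send_sms_alert() is only a hypothetical placeholder for the SMS notification.

import time
import schedule

prescription = {
    "medicine_name": "Metformin 500 mg",
    "dosage": "1 tablet",
    "schedule": ["08:00", "20:00"],
}

def medicine_alert(name, dosage):
    # Voice alarm on the phone plus an SMS notification in the real system.
    print(f"ALERT: take {dosage} of {name} now")
    # send_sms_alert(user_phone, f"Time to take {name}")  # hypothetical SMS hook

for intake_time in prescription["schedule"]:
    schedule.every().day.at(intake_time).do(
        medicine_alert, prescription["medicine_name"], prescription["dosage"]
    )

while True:                                  # simple polling loop
    schedule.run_pending()
    time.sleep(30)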

3.4 IoT event generation

IoT monitoring and event generation are the enhanced functionality of SmartMedBox, an advanced version of PocketMed [25]. We deploy an Arduino as the microcontroller, while an alarm sensor and an ultrasonic sensor are utilized for generating the event. When the visually impaired or elderly person presses the "locate me" button on the mobile app, the alarm attached to SmartMedBox starts ringing; upon hearing the alarm sound, the user moves toward it to locate the SmartMedBox in which the QR-code-labeled medicines are kept. As soon as the user comes near the medicine box, the ultrasonic sensor detects the object and the alarm stops ringing, thus helping the user locate SmartMedBox. The name, dosage, validity, and duration of the drug are indicated when it is scanned.
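The control logic of this "locate me" behavior is sketched below as a Python simulation; the actual device runs the equivalent loop as an Arduino sketch, and the sensor, server-flag, and buzzer functions here are stand-ins for the real hardware and web-server interfaces.

import random
import time

NEAR_THRESHOLD_CM = 30            # assumed distance at which the user has reached the box

def read_ultrasonic_cm():
    # Stand-in for the ultrasonic echo measurement performed by the Arduino.
    return random.uniform(5, 200)

def locate_me_requested():
    # Stand-in for the flag set on the web server when the app's
    # "locate me" button is pressed.
    return True

def set_buzzer(on):
    print("BUZZER", "ON" if on else "OFF")

if locate_me_requested():
    set_buzzer(True)              # alarm rings so the user can walk toward the sound
    while read_ultrasonic_cm() > NEAR_THRESHOLD_CM:
        time.sleep(0.2)           # keep ringing until something comes close to the box
    set_buzzer(False)             # object detected nearby: the user has found the box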

4. Experimental Setup

4.1 SmartMedBox

Figure 2a shows the mobile app menu screen, which has a "locate me" button, and Figure 2b shows the smart medicine box. We have developed a prototype consisting of a medicine box that stores the medicine, with a microcontroller, an alarm sensor, and an ultrasonic sensor attached to it. On pressing the "locate me" button in the mobile app, the alarm sensor placed on the SmartMedBox starts ringing, and the user moves in the direction of the alarm to locate the medicine box. As soon as the user comes closer to the box, the alarm stops ringing because the ultrasonic sensor detects the person approaching. Thus, the correct location of SmartMedBox is identified.

Figure 2. (a) PocketMed mobile app, (b) SmartMedBox

In an extensive experimental analysis, we calculate the accuracy metric for the system performance assessment. The system is built on an open-source Java three-tier architecture with Python, running on an Intel 2.8 GHz i3 CPU with 4 GB RAM. The performance of SmartMedBox was evaluated with 30 participants, of whom 10 were blind and 20 were elderly. The participants were first trained for two consecutive days in using the mobile app and in locating the SmartMedBox after hearing the alarm that rings when the "locate me" button is pressed. The experiment was conducted in an indoor environment. After training, the participants were confident in using the SmartMedBox. Only two to three of the thirty participants found it difficult to locate the SmartMedBox upon hearing the alarm sound; they needed more practice to locate it by sound. Figure 3 shows the experimental setup for the evaluation of SmartMedBox in an indoor environment. As seen in the figure, the user locates the SmartMedBox and scans the QR code to obtain the medicine information.

Figure 3. Locating SmartMedBox and scanning the QR code by the user in an indoor environment

Figure 4. Participants able to locate SmartMedBox

Figure 4 shows the graph indicating that, out of the 30 participants who took part in the evaluation of the system, 26 users were able to locate the SmartMedBox successfully. After the system was implemented, a comparison between a number of current systems and the proposed system was carried out, as shown in Table 1. From the comparison in Table 1, it can be seen that existing systems have the extra overhead of carrying an IoT device or preconfigured tags and lack user-friendliness, whereas our proposed system requires no extra hardware or preconfigured tags for locating SmartMedBox and is user-friendly.
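The reported location accuracy follows directly from this result, as the short calculation below shows (26 of 30 participants located the box; the value rounds to the figure reported in the conclusion).

located, participants = 26, 30
accuracy = 100 * located / participants
print(f"Location accuracy: {accuracy:.2f}%")   # prints 86.67%, reported as 86.66% in the paper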

5. Discussion

The proposed SmartMedBox system is a useful assistive device for visually impaired and elderly people. It comprises two modules: first, the mobile app module, which helps with medicine management, i.e., identifying the medicine name, intake dosage, intake time, expiry date, and the number of days for which the medicine has to be taken; and second, the SmartMedBox module, which helps locate the medicine. The box can be located by the user after pressing the "locate me" button in the mobile app. The current work has two limitations: first, the user has no feedback other than the alarm sound for reaching the location of the medicine box, and second, the ultrasonic sensor does not identify the person but only detects that a person has come close to the medicine box.

Table 1. Comparison between existing systems and the proposed system

Smart labeling solutions

Near Field Communication [26]

  Advantages:
  • It does not need any extra mental effort.
  • It is treated as a low-cost technology.

  Disadvantages:
  • NFC may not work on curved surfaces.
  • It is difficult for blind people to find NFC tags that have the same color and the same contacting surface as the pharmaceutical packaging.

RFID [27]

  Advantages:
  • It has a broader read range than NFC and can recognize items.
  • It is not necessary to place the item close to the RFID reader, and assistance is provided for finding the item.

  Disadvantages:
  • An extra device, such as an RFID reader, has to be carried.

Computer vision approaches

Visual tags [28]

  Advantages:
  • High precision of the retrieved information.
  • There is no need to carry any other gadget to scan the tags.

  Disadvantages:
  • A barcode is not present on every pharmaceutical package.
  • The barcode does not have a defined position.
  • To find the visual tag, the complete packaging surface may need to be scanned.

Object recognition [29]

  Advantages:
  • There is no need to buy tags or pre-tagged items.
  • The capacity to recognize an item at a variety of sizes and rotations.

  Disadvantages:
  • Different shooting conditions, such as lighting, blur, and background, may affect the matching results.
  • Medication packages may look very similar or contain only a few distinctive elements, which can fool the system.
  • The system requires pre-learning (training).

Crowdsourcing [30]

  Advantages:
  • Users can ask questions in everyday language.
  • Real-time users can provide tips and advice on how to take better photos.

  Disadvantages:
  • It jeopardizes the privacy of the users.
  • In comparison to other strategies, there is a time delay in responding to questions.

SmartMedBox (Proposed)

  Advantages:
  • The QR code gives detailed information for each medicine, including consumption time.
  • The system raises alarms at the scheduled intake times and also sounds an alarm at the location of the medicine box.
  • Hearing the alarm sound, a blind user can easily detect the indoor location of the medicine box.

  Disadvantages:
  • Web database dependency for event generation.
  • General Android applications are sometimes not understood by elderly users.

6. Conclusions and Future Work

This paper proposed SmartMedBox, a medicine management aid for visually impaired and elderly people. The approach combines IoT and computer vision techniques in a real-time environment to identify the location of the medicine and its description as per the doctor's prescription. To evaluate the product, the system was validated with 30 real-time beneficiaries. As per the performance evaluation, the system achieves 86.66% accuracy in finding the correct location of the medicine box, while QR code generation, QR code scanning, and automated alarm setting give 100% accuracy. The main contributions of the work are: 1) scanning the QR code applied to the medicine strip using a mobile app; 2) identification of the medicine name; 3) knowing the dosage and schedule of medicine intake; 4) knowing the expiry date of the medicine; 5) setting up notifications as per the medicine intake schedule; and 6) locating the SmartMedBox that stores the medicine. A future enhancement of this work is to identify medicine at a specific indoor location using deep learning techniques to find the best route to the medicine box from the user's current location, and to use a camera sensor to identify the user.

References

[1] Rahaman, A., Islam, M. M., Islam, M.R., Sadi, M.S., Nooruddin, S. (2019). Developing IoT based smart health monitoring systems: A review. Revue d'Intelligence Artificielle, 33(6): 35-440. https://doi.org/10.18280/ria.330605

[2] Kripalani, S., Robertson, R., Love-Ghaffari, M.H., Henderson, L.E., Praska, J., Strawder, A., Katz, M.G., Jacobson, T.A. (2007). Development of an illustrated medication schedule as a low-literacy patient education tool. Patient Education and Counseling, 66(3): 368-377. https://doi.org/10.1016/j.pec.2007.01.020

[3] Silva, J.M., Mouttham, A., El Saddik, A. (2009). UbiMeds: A mobile application to improve accessibility and support medication adherence. In Proceedings of the 1st ACM SIGMM International Workshop on Media Studies and Implementations That Help Improving Access to Disabled Users, pp. 71-78. https://doi.org/10.1145/1631097.1631109

[4] McCann, R.M., Jackson, A.J., Stevenson, M., Dempster, M., McElnay, J.C., Cupples, M.E. (2012). Help needed in medication self-management for people with visual impairment: Case–control study. British Journal of General Practice, 62(601): e530-e537. https://doi.org/10.3399/bjgp12X653570

[5] Benjamim, X.C., Gomes, R.B., Burlamaqui, A.F., Gonçalves, L.M. (2012). Visual identification of medicine boxes using features matching. In 2012 IEEE International Conference on Virtual Environments Human-Computer Interfaces and Measurement Systems (VECIMS) Proceedings, Tianjin, China, pp. 43-47. https://doi.org/10.1109/VECIMS.2012.6273190

[6] Hasanuzzaman, F.M., Yang, X., Tian, Y., Liu, Q., Capezuti, E. (2013). Monitoring activity of taking medicine by incorporating RFID and video analysis. Network Modeling Analysis in Health Informatics and Bioinformatics, 2(2): 61-70. https://doi.org/10.1007/s13721-013-0025-y

[7] Xiao, J., Joseph, S.L., Zhang, X., Li, B., Li. X., Zhang, J. (2015). An assistive navigation framework for the visually impaired. IEEE Transactions on Human-Machine Systems, 45(5): 635-640. https://doi.org/10.1109/THMS.2014.2382570

[8] Salgia, A.S., Ganesan, K., Raghunath, A. (2015). Smart pill box. Indian Journal of Science and Technology, 8(S2): 189-194.

[9] Gori, M., Cappagli, G., Tonelli, A., Baud-Bovy, G., Finocchietti, S. (2016). Devices for visually impaired people: High technological devices with low user acceptance and no adaptability for children. Neuroscience & Biobehavioral Reviews, 69: 79-88. https://doi.org/10.1016/j.neubiorev.2016.06.043

[10] Jayashree, D., Farhath, K.A., Amruthavarshini, R., Pavithra, S. (2016). Voice based application as medicine spotter for visually impaired. In 2016 Second International Conference on Science Technology Engineering and Management (ICONSTEM), Chennai, India, pp. 56-60. https://doi.org/10.1109/ICONSTEM.2016.7560923

[11] Berger, A., Vokalova, A., Maly, F., Poulova, P. (2017). Google glass used as assistive technology its utilization for blind and visually impaired people. International Conference on Mobile Web and Information Systems, Springer, Cham, pp. 70-82. https://doi.org/10.1007/978-3-319-65515-4_6

[12] Almuzaini, M.A., Abdullah-Al-Wadud, M. (2018). A review on medication identification techniques for visually impaired patients. In 2018 21st Saudi Computer Society National Computer Conference (NCC), Riyadh, Saudi Arabia, pp. 1-6. https://doi.org/10.1109/NCG.2018.8593193

[13] Lee, B.H., Lee, Y.J. (2019). Evaluation of medication use and pharmacy services for visually impaired persons: Perspectives from both visually impaired and community pharmacists. Disability and Health Journal, 12(1): 79-86. https://doi.org/10.1016/j.dhjo.2018.07.012

[14] Kamal, A.K, Khalid, W., Muqeet, A., Jamil, A., Farhat, K., Gillani, S.R., Zulfiqar, M., Saif, M., Muhammad, A.A., Zaidi, F., Mustafa, M. (2018). Making prescriptions “talk” to stroke and heart attack survivors to improve adherence: results of a randomized clinical trial (The Talking Rx Study). PloS One, 13(12): e0197671. https://doi.org/10.1371/journal.pone.0197671

[15] Farhadyar, K., Safdari, R., Beh-Pajooh, A (2018). User goals extraction for a mhealth-based medication management system for individuals with visual impairment. European Journal of Biomedical Informatics, 14(2).

[16] Chang, W.J., Yu, Y.X., Chen, J.H., Zhang, Z.Y., Ko, S.J., Yang, T.H., Hsu, C.H., Chen, L.B., Chen, M.C. (2019). A deep learning based wearable medicines recognition system for visually impaired people. In 2019 IEEE International Conference on Artificial Intelligence Circuits and Systems (AICAS), pp. 207-208. https://doi.org/10.1109/AICAS.2019.8771559

[17] Meshram, V.V., Patil, K., Meshram, V.A., Shu, F.C. (2019). An astute assistive device for mobility and object recognition for visually impaired people. IEEE Trans. on Human-Machine Systems, 49(5): 449-460. https://doi.org/10.1109/THMS.2019.2931745

[18] Almuzaini, M.A., Abdullah-Al-Wadud, M. (2019). Medication identification aid for visually impaired patients. In 2019 2nd International Conference on Computer Applications & Information Security (ICCAIS), Riyadh, Saudi Arabia, pp. 1-6. https://doi.org/10.1109/CAIS.2019.8769563

[19] Chang, W.J., Chen, L.B., Hsu, C.H., Chen, J.H., Yang, T.C., Lin, C.P. (2020). MedGlasses: A wearable smart-glasses-based drug pill recognition system using deep learning for visually impaired chronic patients. IEEE Access, 8: 17013-24. https://doi.org/10.1109/ACCESS.2020.2967400

[20] Awad, A., Yao, A., Trenfield, S.J., Goyanes, A., Gaisford, S., Basit, A.W. (2020). 3D printed tablets (printlets) with braille and moon patterns for visually impaired patients. Pharmaceutics, 12(2): 172. https://doi.org/10.3390/pharmaceutics12020172

[21] Chang, W.J., Chen, L.B., Sie, C.Y., Yang, C.H. (2020). An artificial intelligence edge computing-based assistive system for visually impaired pedestrian safety at zebra crossings. IEEE Transactions on Consumer Electronics, 67(1): 3-11. https://doi.org/10.1109/TCE.2020.3037065

[22] Barontini, F., Catalano, M.G., Pallottino, L., Leporini, B., Bianchi, M. (2020). Integrating wearable haptics and obstacle avoidance for the visually impaired in indoor navigation: A user-centered approach. IEEE Transactions on Haptics, 14(1): 109-122. https://doi.org/10.1109/TOH.2020.2996748

[23] Almukainzi, M., Almuhareb, A., Aldwisan, F., Alquaydhib, W. (2020). Medication use patterns in the visually impaired in Saudi Arabia and the importance of applying Braille labeling. Saudi Pharmaceutical Journal, 28(3): 274-280. https://doi.org/10.1016/j.jsps.2020.01.006

[24] Chang, J.H. (2014). An introduction to using QR codes in scholarly journals. Science Editing, 1(2): 113-117. https://doi.org/10.6087/kcse.2014.1.113

[25] Meshram, V., Patil, K., Meshram, V. Effective medicine management for visually impaired people: PocketMed. Submitted to ICIC Express Letters, International Journal of Research and Surveys, paper in review.

[26] Ervasti, M., Isomursu, M., Leibar, I. (2011). Touch- and audio-based medication management service concept for vision impaired older people. In 2011 IEEE International Conference on RFID-Technologies and Applications (RFID-TA), Sitges, Spain, September 15-16.

[27] Murad, M., Rehman, A., Shah, A.A., Ullah, S., Fahad, M., Yahya, K.M. (2011). RFAIDE—An RFID based navigation and object recognition assistant for visually impaired people. In 2011 7th International Conference on Emerging Technologies, pp. 1-4. https://doi.org/10.1109/ICET.2011.6048486

[28] Al-Quwayfili, N., Al-Khalifa, H. (2014). AraMedReader: An Arabic medicine identifier using barcodes. International Conference on Human-Computer Interaction, pp. 383-388. https://doi.org/10.1007/978-3-319-07854-0_67

[29] Jayashree, D., Afritha Farhath, K., Amruthavarshini, R., Pavithra, S. (2016). Voice based application as medicine spotter for visually impaired. Science Technology Engineering and Management (ICONSTEM), Second International Conference, Chennai, India, pp. 56-60. https://doi.org/10.1109/ICONSTEM.2016.7560923

[30] Hoonlor, A., Ayudhya, S., Harnmetta, S., Kitpanon, S. Khlaprasit, K. (2015). UCap: A crowdsourcing application for the visually impaired and blind persons on Android smartphone. Computer Science and Engineering Conference (ICSEC), Chiang Mai, Thailand, pp. 1-6. https://doi.org/10.1109/ICSEC.2015.7401406