Search results for: error metrics

794 Sporting Events among the Disabled between Excellence and Ideal in Motor Performance: Analytical Descriptive Study in Some Paralympic Sports

Authors: Guebli Abdelkader, Reguieg Madani, Belkadi Adel, Sbaa Bouabdellah

Abstract:

The identification of mechanical variables in the motor performance trajectory plays a prominent role in improving skill performance and overcoming errors, and it contributes substantially to solving certain problems of learning and training. The study aims to highlight indicators of motor performance for Paralympic athletes during sports practice, between modelling and excellence in motor performance, taking into account the distinctive behavioural skills of Paralympic athletes. We relied on the analysis of previous research on biomechanical performance indicators in selected sporting events (shooting activities in Paralympic athletics, the shooting skill in wheelchair basketball). The results highlight how disabled practitioners of the identified sporting events distinguish themselves in motor performance during practice by overcoming certain physical constraints of human movement, such as a lower centre of body weight and an increased offset distance, resistance that requires them to redouble their efforts. The results also highlighted a strong correlation between the biomechanical variables of motor performance and the achieved numerical level, similar to that of able-bodied practitioners.

Keywords: sports, the disabled, motor performance, Paralympic

Procedia PDF Downloads 270
793 Deep Learning-Based Liver 3D Slicer for Image-Guided Therapy: Segmentation and Needle Aspiration

Authors: Ahmedou Moulaye Idriss, Tfeil Yahya, Tamas Ungi, Gabor Fichtinger

Abstract:

Image-guided therapy (IGT) plays a crucial role in minimally invasive procedures for liver interventions. Accurate segmentation of the liver and precise needle placement are essential for successful interventions such as needle aspiration. In this study, we propose a deep learning-based liver 3D slicer designed to enhance segmentation accuracy and facilitate needle aspiration procedures. The developed 3D slicer leverages state-of-the-art convolutional neural networks (CNNs) for automatic liver segmentation in medical images. The CNN model is trained on a diverse dataset of liver images obtained from various imaging modalities, including computed tomography (CT) and magnetic resonance imaging (MRI). The trained model demonstrates robust performance in accurately delineating liver boundaries, even in cases with anatomical variations and pathological conditions. Furthermore, the 3D slicer integrates advanced image registration techniques to ensure accurate alignment of preoperative images with real-time interventional imaging. This alignment enhances the precision of needle placement during aspiration procedures, minimizing the risk of complications and improving overall intervention outcomes. To validate the efficacy of the proposed deep learning-based 3D slicer, a comprehensive evaluation is conducted using a dataset of clinical cases. Quantitative metrics, including the Dice similarity coefficient and Hausdorff distance, are employed to assess the accuracy of liver segmentation. Additionally, the performance of the 3D slicer in guiding needle aspiration procedures is evaluated through simulated and clinical interventions. Preliminary results demonstrate the effectiveness of the developed 3D slicer in achieving accurate liver segmentation and guiding needle aspiration procedures with high precision. The integration of deep learning techniques into the IGT workflow shows great promise for enhancing the efficiency and safety of liver interventions, ultimately contributing to improved patient outcomes.
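
The evaluation metrics named above (Dice similarity coefficient and Hausdorff distance) can be computed directly from binary segmentation masks. The sketch below is illustrative only, not the authors' code; it assumes two NumPy boolean masks of identical shape and uses SciPy's directed Hausdorff on the foreground coordinates.

```python
# Illustrative sketch (not the authors' implementation): Dice coefficient and
# symmetric Hausdorff distance between two binary segmentation masks.
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def dice_coefficient(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice = 2|A∩B| / (|A|+|B|) for boolean masks of equal shape."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    denom = pred.sum() + truth.sum()
    return 2.0 * intersection / denom if denom else 1.0

def hausdorff_distance(pred: np.ndarray, truth: np.ndarray) -> float:
    """Symmetric Hausdorff distance between the foreground pixel coordinates."""
    a = np.argwhere(pred)
    b = np.argwhere(truth)
    return max(directed_hausdorff(a, b)[0], directed_hausdorff(b, a)[0])

if __name__ == "__main__":
    truth = np.zeros((64, 64), dtype=bool); truth[20:40, 20:40] = True
    pred = np.zeros((64, 64), dtype=bool);  pred[22:42, 21:41] = True
    print(f"Dice: {dice_coefficient(pred, truth):.3f}")
    print(f"Hausdorff: {hausdorff_distance(pred, truth):.2f} px")
```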

Keywords: deep learning, liver segmentation, 3D slicer, image guided therapy, needle aspiration

Procedia PDF Downloads 39
792 System of Quality Automation for Documents (SQAD)

Authors: R. Babi Saraswathi, K. Divya, A. Habeebur Rahman, D. B. Hari Prakash, S. Jayanth, T. Kumar, N. Vijayarangan

Abstract:

Document automation is the design of systems and workflows that assemble repetitive documents to meet specific business needs. In any organization or institution, documenting employees' information is very important for both employees and management, as it shows an individual's progress to the management. Many employee documents exist only on paper, so they are difficult to organize, and retrieving the exact document for future reference takes considerable time. It is also tedious to generate reports according to specific needs, and the approval process is slow and lacks adequate security. This project addresses the above issues. By storing the details in a database and maintaining e-documents, the automation system reduces manual work to a large extent. The approval of important documents can then be carried out in a much more secure manner using digital signature and encryption techniques. Details are maintained in the database, e-documents are stored in specific folders, and various kinds of reports can be generated. Moreover, an efficient search method is implemented over the database. Automation supports document maintenance in many respects: it minimizes data entry, reduces time spent on proofreading, avoids duplication, and reduces the risks associated with manual error.
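
The approval workflow described relies on digital signatures; the SQAD system itself is not reproduced here, but as a minimal, language-neutral illustration of signing and verifying a document (in Python with the third-party `cryptography` package, not the stack used by the authors), the step looks roughly as follows.

```python
# Minimal sketch of signing and verifying a document, assuming the Python
# `cryptography` package; illustrative only, not the SQAD implementation.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

private_key = Ed25519PrivateKey.generate()      # approver's signing key
public_key = private_key.public_key()           # distributed for verification

document = b"Employee leave approval form #1234 (hypothetical contents)"
signature = private_key.sign(document)          # sign the document bytes

try:
    public_key.verify(signature, document)      # raises if tampered with
    print("Signature valid: document approved and unmodified.")
except InvalidSignature:
    print("Signature invalid: document was altered after approval.")
```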

Keywords: e-documents, automation, digital signature, encryption

Procedia PDF Downloads 384
791 Random Subspace Neural Classifier for Meteor Recognition in the Night Sky

Authors: Carlos Vera, Tetyana Baydyk, Ernst Kussul, Graciela Velasco, Miguel Aparicio

Abstract:

This article describes the Random Subspace Neural Classifier (RSC) for the recognition of meteors in the night sky. We used images of meteors entering the atmosphere at night between 8:00 p.m. and 5:00 a.m. The objective of this project is to classify meteor and star images (with stars as the image background). The monitoring of the sky and the classification of meteors are intended for future applications by scientists. The image database was collected from different websites. We worked with RGB images with dimensions of 220x220 pixels stored in the bitmap (BMP) format. Window scanning and processing were then carried out for each image. The scanning window from which the features were extracted had a size of 20x20 pixels and a scanning step of 10 pixels. Brightness, contrast and contour orientation histograms were used as inputs for the RSC. The RSC worked with two classes: 1) with meteors and 2) without meteors. Different tests were carried out by varying the number of training cycles and the number of images for training and recognition, and the percentage error of the neural classifier was calculated. The results show a good RSC classifier response, with 89% correct recognition. The results of these experiments are presented and discussed.
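
The window-scanning step described (20x20-pixel windows, 10-pixel step, with brightness, contrast and contour-orientation histograms as features) can be sketched as below. This is an illustrative reconstruction with assumed feature definitions (mean for brightness, standard deviation for contrast, a gradient-orientation histogram for contours), not the authors' RSC code.

```python
# Illustrative sketch of the 20x20 scanning window with a 10-pixel step and
# simple window features; assumed definitions, not the authors' RSC classifier.
import numpy as np

WIN, STEP, N_BINS = 20, 10, 8

def window_features(win: np.ndarray) -> np.ndarray:
    """Brightness (mean), contrast (std) and a gradient-orientation histogram."""
    gy, gx = np.gradient(win.astype(float))
    angles = np.arctan2(gy, gx)                      # contour-orientation proxy
    hist, _ = np.histogram(angles, bins=N_BINS, range=(-np.pi, np.pi))
    return np.concatenate(([win.mean(), win.std()], hist / max(hist.sum(), 1)))

def scan_image(gray: np.ndarray) -> np.ndarray:
    """Slide the window over a 220x220 grayscale image and stack the features."""
    feats = []
    for r in range(0, gray.shape[0] - WIN + 1, STEP):
        for c in range(0, gray.shape[1] - WIN + 1, STEP):
            feats.append(window_features(gray[r:r + WIN, c:c + WIN]))
    return np.asarray(feats)

if __name__ == "__main__":
    image = np.random.randint(0, 256, (220, 220))    # stand-in for a BMP frame
    print(scan_image(image).shape)                   # (441, 10) feature vectors
```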

Keywords: contour orientation histogram, meteors, night sky, RSC neural classifier, stars

Procedia PDF Downloads 132
790 Critical Evaluation and Analysis of Effects of Different Queuing Disciplines on Packets Delivery and Delay for Different Applications

Authors: Omojokun Gabriel Aju

Abstract:

A communication network is a process of exchanging data between two or more devices via some form of transmission medium using communication protocols. The data could be text, images, audio, video or numbers, which can be grouped into FTP, E-Mail, HTTP, VoIP or video applications. Such data exchange is effective only if the data are accurately delivered within a specified time. Some senders do not particularly mind when the data are actually received, as long as receipt is acknowledged by the receiver, whereas for other senders the time the data take to reach the receiver is critical, since any delay could cause serious problems or, in some cases, render the data useless. Whether data remain valid after a delay therefore depends on the type of data (information). It is thus imperative for a network device (such as a router) to be able to differentiate between packets that are time sensitive and those that are not when they pass through the same network. This is where queuing disciplines come into play: they handle network resources when a network is designed to service widely varying types of traffic, and they manage the available resources according to the configured policies. As part of the resource allocation mechanisms, a router within the network must implement a queuing discipline that governs how packets (data) are buffered while waiting to be transmitted. Queuing disciplines control the transmission of these packets by determining which packets get the highest priority, which get lower priority, and which packets are dropped; they therefore control packet latency by determining how long a packet can wait to be transmitted or be dropped. The common queuing disciplines are first-in-first-out queuing (FIFO), priority queuing (PQ) and weighted-fair queuing (WFQ). This paper critically evaluates and analyses, using the Optimized Network Evaluation Tool (OPNET) Modeller version 14.5, the effects of these three queuing disciplines on the performance of five different applications (FTP, HTTP, E-Mail, voice and video) within specified parameters, using packets sent, packets received and transmission delay as performance metrics. The paper finally suggests ways in which networks can be designed to provide better transmission performance while using these queuing disciplines.
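
The behaviour measured in OPNET can be illustrated with a toy sketch contrasting the packet release order of FIFO and strict Priority Queuing for a single burst; the burst contents and priority mapping are assumptions for illustration, WFQ is omitted, and this is not the OPNET 14.5 model.

```python
# Toy illustration (not the OPNET 14.5 model) of how FIFO and strict Priority
# Queuing (PQ) release the same burst of packets in different orders.
from collections import deque
import heapq

# (arrival order, application, priority) -- lower number = higher priority
burst = [(0, "Voice", 0), (1, "FTP", 3), (2, "Video", 1),
         (3, "HTTP", 2), (4, "Voice", 0), (5, "E-Mail", 3)]

# FIFO: transmit strictly in arrival order.
fifo = deque(burst)
fifo_order = [fifo.popleft()[1] for _ in range(len(burst))]

# PQ: always transmit the highest-priority waiting packet first.
pq = [(prio, order, app) for order, app, prio in burst]
heapq.heapify(pq)
pq_order = [heapq.heappop(pq)[2] for _ in range(len(burst))]

print("FIFO order:", fifo_order)  # ['Voice', 'FTP', 'Video', 'HTTP', 'Voice', 'E-Mail']
print("PQ order:  ", pq_order)    # ['Voice', 'Voice', 'Video', 'HTTP', 'FTP', 'E-Mail']
```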

Keywords: applications, first-in-first-out queuing (FIFO), optimised network evaluation tool (OPNET), packets, priority queuing (PQ), queuing discipline, weighted-fair queuing (WFQ)

Procedia PDF Downloads 351
789 The Use of Artificial Intelligence in Diagnosis of Mastitis in Cows

Authors: Djeddi Khaled, Houssou Hind, Miloudi Abdellatif, Rabah Siham

Abstract:

In the field of veterinary medicine, there is a growing application of artificial intelligence (AI) for diagnosing bovine mastitis, a prevalent inflammatory disease in dairy cattle. AI technologies, such as automated milking systems, have streamlined the assessment of key metrics crucial for managing cow health during milking and identifying prevalent diseases, including mastitis. These automated milking systems empower farmers to implement automatic mastitis detection by analyzing indicators like milk yield, electrical conductivity, fat, protein, lactose, blood content in the milk, and milk flow rate. Furthermore, reports highlight the integration of somatic cell count (SCC), thermal infrared thermography, and diverse systems utilizing statistical models and machine learning techniques, including artificial neural networks, to enhance the overall efficiency and accuracy of mastitis detection. According to a review of 15 publications, machine learning technology can predict the risk and detect mastitis in cattle with an accuracy ranging from 87.62% to 98.10% and sensitivity and specificity ranging from 84.62% to 99.4% and 81.25% to 98.8%, respectively. Additionally, machine learning algorithms and microarray meta-analysis are utilized to identify mastitis genes in dairy cattle, providing insights into the underlying functional modules of mastitis disease. Moreover, AI applications can assist in developing predictive models that anticipate the likelihood of mastitis outbreaks based on factors such as environmental conditions, herd management practices, and animal health history. This proactive approach supports farmers in implementing preventive measures and optimizing herd health. By harnessing the power of artificial intelligence, the diagnosis of bovine mastitis can be significantly improved, enabling more effective management strategies and ultimately enhancing the health and productivity of dairy cattle. The integration of artificial intelligence presents valuable opportunities for the precise and early detection of mastitis, providing substantial benefits to the dairy industry.
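
As an illustration of the kind of machine-learning pipeline and accuracy/sensitivity/specificity figures the review summarises, a minimal sketch is given below; it uses synthetic data and an assumed feature set (milk yield, electrical conductivity, somatic cell count), not any of the reviewed datasets.

```python
# Minimal sketch of mastitis risk classification from milking-system indicators.
# Synthetic data and assumed features; illustrative of the reviewed approach only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([
    rng.normal(30, 5, n),        # milk yield (kg/day)
    rng.normal(5.0, 0.6, n),     # electrical conductivity (mS/cm)
    rng.lognormal(4.5, 0.8, n),  # somatic cell count (x1000 cells/mL)
])
# Assumed rule for the synthetic label: high conductivity and SCC raise risk.
y = ((X[:, 1] > 5.3) & (X[:, 2] > 150)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

tn, fp, fn, tp = confusion_matrix(y_te, model.predict(X_te)).ravel()
print(f"accuracy    {(tp + tn) / (tp + tn + fp + fn):.3f}")
print(f"sensitivity {tp / (tp + fn):.3f}")
print(f"specificity {tn / (tn + fp):.3f}")
```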

Keywords: artificial intelligence, automatic milking system, cattle, machine learning, mastitis

Procedia PDF Downloads 55
788 A Model for Diagnosis and Prediction of Coronavirus Using Neural Network

Authors: Sajjad Baghernezhad

Abstract:

Meta-heuristic and hybrid algorithms perform well in modelling medical problems. In this study, a neural network was used to predict COVID-19 among high-risk and low-risk patients. Data were collected for a target population of 550 high-risk and low-risk patients from the Kerman University of Medical Sciences medical center. The memetic algorithm, which is a combination of a genetic algorithm and a local search algorithm, was used to update the weights of the neural network and improve its accuracy. The initial neural network reached an accuracy of 88%; after the memetic algorithm updated the weights, the accuracy increased to 93%. For the proposed model, sensitivity, specificity, positive predictive value, and accuracy were 97.4, 92.3, 95.8, 96.2, and 0.918, respectively; for the genetic algorithm model, the corresponding values were 87.05, 9.207, 89.45, 97.30, and 0.967; and for the logistic regression model, 87.40, 95.20, 93.79, 0.87, and 0.916. Based on the findings of this study, neural network models have a lower error rate than the regression model in diagnosing patients from individual variables and vital signs. The findings can help planners and health care providers in designing programs and in the early diagnosis of COVID-19.
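
The memetic weight update described (a genetic algorithm whose offspring are refined by a local search) can be sketched generically as below; this is a simplified illustration with an assumed toy fitness function, not the study's model or patient data.

```python
# Simplified sketch of a memetic algorithm (GA + local search) for tuning a
# weight vector; toy fitness function, not the study's COVID-19 model.
import numpy as np

rng = np.random.default_rng(42)
DIM, POP, GENS = 10, 30, 50

def fitness(w):                      # toy stand-in for validation accuracy
    return -np.sum((w - 0.5) ** 2)   # maximised at w = 0.5

def local_search(w, steps=20, sigma=0.05):
    """Hill-climbing refinement of each offspring (the 'memetic' part)."""
    best, best_f = w, fitness(w)
    for _ in range(steps):
        cand = best + rng.normal(0, sigma, DIM)
        if fitness(cand) > best_f:
            best, best_f = cand, fitness(cand)
    return best

pop = rng.uniform(-1, 1, (POP, DIM))
for _ in range(GENS):
    scores = np.array([fitness(w) for w in pop])
    parents = pop[np.argsort(scores)[-POP // 2:]]          # selection
    children = []
    for _ in range(POP - len(parents)):
        a, b = parents[rng.integers(len(parents), size=2)]
        child = np.where(rng.random(DIM) < 0.5, a, b)      # uniform crossover
        child += rng.normal(0, 0.1, DIM) * (rng.random(DIM) < 0.2)  # mutation
        children.append(local_search(child))               # memetic refinement
    pop = np.vstack([parents, children])

best = max(pop, key=fitness)
print("best fitness:", round(float(fitness(best)), 4))
```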

Keywords: COVID-19, decision support technique, neural network, genetic algorithm, memetic algorithm

Procedia PDF Downloads 63
787 Investigation of Dry Ice Mixed Novel Hybrid Lubri-Coolant in Sustainable Machining of Ti-6AL-4V Alloy: A Comparison of Experimental and Modelling

Authors: Muhammad Jamil, Ning He, Aqib Mashood Khan, Munish Kumar Gupta

Abstract:

Ti-6Al-4V has numerous applications in the medical, automobile, and aerospace industries due to its corrosion resistance, structural stability, and chemical inertness to most fluids at room temperature. These peculiar characteristics are beneficial in application but present formidable challenges during machining: machining Ti-6Al-4V under dry conditions produces cutting temperatures above 1000 °C, which accelerates tool wear and reduces product quality. There is therefore always a need to employ a sustainable and effective coolant/lubricant when machining this alloy. In this study, finite element modeling (FEM) and experimental analysis of cutting Ti-6Al-4V under a distinctly developed dry ice mixed hybrid lubri-coolant are presented. The study aims to model the milling of Ti-6Al-4V under the proposed novel hybrid lubri-coolant at different cutting speeds and feeds per tooth. The DEFORM® software package was used to conduct the FEM, and the numerical model was experimentally validated. A comparison of experimental and simulation results showed a maximum error of no more than 6% for all experimental conditions. In a nutshell, the proposed model is effective in predicting the machining temperature precisely.
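
The validation criterion quoted (a maximum deviation of no more than 6% between simulated and measured values) corresponds to a simple relative-error check; a generic sketch with placeholder temperatures, not data from the study, is shown below.

```python
# Generic relative-error check between simulated and experimental values.
# The numbers are placeholders, not data from the study.
import numpy as np

experimental = np.array([412.0, 455.0, 498.0, 531.0])  # measured cutting temps (°C)
simulated    = np.array([401.0, 463.0, 487.0, 552.0])  # FEM predictions (°C)

relative_error = np.abs(simulated - experimental) / experimental * 100
print("per-condition error (%):", np.round(relative_error, 2))
print("maximum error (%):", round(relative_error.max(), 2))
```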

Keywords: friction coefficient, heat transfer, finite element modeling (FEM), milling Ti-6Al-4V

Procedia PDF Downloads 47
786 The Need for a One Health and Welfare Approach to Animal Welfare in Industrial Animal Farming

Authors: Clinton Adas

Abstract:

Antibiotic resistance has been identified by the World Health Organisation as one of the major health threats of the 21st century. While many factors contribute to this, one of the more significant is industrial animal farming and its effect on the food chain and environment. Livestock consumes a significant portion of antibiotics sold globally, and these are used to make animals grow faster for profit purposes, to prevent illness caused by inhumane living conditions, and to treat disease when it breaks out. Many of these antibiotics provide little benefit to animals, and most are the same as those used by humans - including those deemed critical to human health that should therefore be used sparingly. Antibiotic resistance contributes to growing numbers of illnesses and death in humans, and the excess usage of these medications results in waste that enters the environment and is harmful to many ecological processes. This combination of antimicrobial resistance and environmental degradation furthermore harms the economic well-being and prospects of many. Using an interdisciplinary approach including medical, environmental, economic, and legal studies, the paper evaluates the dynamic between animal welfare and commerce and argues that while animal welfare is not of great concern to many, this approach is ultimately harming human welfare too. It is, however, proposed that both could be addressed under a One Health and Welfare approach, as we cannot continue to ignore the linkages between animals, the environment, and people. The evaluation of industrial animal farming is therefore considered through three aspects: the environmental impact, which is measured by pollution that causes environmental degradation; the human impact, which is measured by the rise of illnesses from pollution and antibiotic resistance; and the economic impact, which is measured through costs to the health care system and the financial implications of industrial farming on the economic well-being of many. These three aspects are considered in light of the Sustainable Development Goals that provide additional tangible metrics to evidence the negative impacts. While the research addresses the welfare of farmed animals, there is potential for these principles to be extrapolated into other contexts, including wildlife and habitat protection. It must be noted that while the question of animal rights in industrial animal farming is acknowledged and of importance, this is a separate matter that is not addressed here.

Keywords: animal and human welfare, industrial animal farming, one health and welfare, sustainable development goals

Procedia PDF Downloads 76
785 Transient Enhanced LDO Voltage Regulator with Improved Feed Forward Path Compensation

Authors: A. Suresh, Sreehari Rao Patri, K. S. R. Krishnaprasad

Abstract:

An ultra-low-power, capacitor-less low-dropout (LDO) voltage regulator with improved transient response using gain-enhanced feed-forward path compensation is presented in this paper. It is based on a cascade of a voltage amplifier and a transconductor stage in the feed-forward path, together with the regular error amplifier, forming a composite gain-enhanced feed-forward stage. This broadens the gain bandwidth and thus improves the transient response without a substantial increase in power consumption. The proposed LDO, designed for a maximum output current of 100 mA in UMC 180 nm, requires a quiescent current of 69 µA. It exhibits an undershoot of 153.79 mV for a load current step from 0 mA to 100 mA and an overshoot of 196.24 mV for a step from 100 mA to 0 mA. The settling time is approximately 1.1 µs for the output voltage undershoot case, and the load regulation is 2.77 µV/mA at a load current of 100 mA. The reference voltage is generated by an accurate 0.8 V band-gap reference circuit. The costly SoC resources of total chip area and power consumption are drastically reduced by using a total compensation capacitance of only 6 pF while consuming 0.096 mW.

Keywords: capacitor-less LDO, frequency compensation, transient response, latch, self-biased differential amplifier

Procedia PDF Downloads 445
784 Efficiency of Google Translate and Bing Translator in Translating Persian-to-English Texts

Authors: Samad Sajjadi

Abstract:

Machine translation is increasingly being used by academic writers, especially students and researchers whose native language is not English. There are numerous studies on machine translation, but few investigations have assessed the accuracy of machine translation from Persian to English at the lexical, semantic, and syntactic levels. Using Groves and Mundt’s (2015) model of error taxonomy, the current study evaluated Persian-to-English translations produced by two well-known online translators, Google Translate and Bing Translator. A total of 240 texts were randomly selected from different academic fields (law, literature, medicine, and mass media), with 60 texts for each domain. All texts were rendered by the two translation systems and then by four human translators. All statistical analyses were applied using SPSS. The results indicated that Google translations were more accurate than those produced by Bing Translator, especially in the domains of medicine (lexis: 186 vs. 225; semantic: 44 vs. 48; syntactic: 148 vs. 264 errors) and mass media (lexis: 118 vs. 149; semantic: 25 vs. 32; syntactic: 110 vs. 220 errors). Nonetheless, both machines are reasonably accurate in Persian-to-English translation of lexicons and syntactic structures, particularly for mass media and medical texts.

Keywords: machine translations, accuracy, human translation, efficiency

Procedia PDF Downloads 71
783 Enhancing Academic Writing Through Artificial Intelligence: Opportunities and Challenges

Authors: Abubakar Abdulkareem, Nasir Haruna Soba

Abstract:

Artificial intelligence (AI) is developing at a rapid pace, revolutionizing several industries, including education. This talk looks at how useful AI can be for academic writing, with an emphasis on how it can help researchers be more accurate, productive, and creative. The academic world now relies heavily on AI technologies like grammar checkers, plagiarism detectors, and content generators to help with the writing, editing, and formatting of scholarly papers. This study explores the particular uses of AI in academic writing and assesses how useful and helpful these applications may be for both students and scholars. By means of an extensive examination of extant literature and a sequence of empirical case studies, we scrutinize the merits and demerits of artificial intelligence tools utilized in academic writing. Important discoveries indicate that although AI greatly increases productivity and lowers human error, there are still issues that need to be resolved, including reliance, ethical concerns, and the potential loss of critical thinking abilities. The talk ends with suggestions for incorporating AI tools into academic settings so that they enhance rather than take the place of the intellectual rigor that characterizes scholarly work. This study adds to the continuing conversation about artificial intelligence (AI) in higher education by supporting a methodical strategy that uses technology to enhance human abilities in academic writing.

Keywords: artificial intelligence, academic writing, AI tools, productivity, ethics, higher education

Procedia PDF Downloads 14
782 Comparison of Sensitivity and Specificity of Pap Smear and Polymerase Chain Reaction Methods for Detection of Human Papillomavirus: A Review of Literature

Authors: M. Malekian, M. E. Heydari, M. Irani Estyar

Abstract:

Human papillomavirus (HPV) is one of the most common sexually transmitted infections and the main cause of cervical cancer. With early diagnosis and treatment in health care services, cervical cancer and its complications are considered preventable. This study aimed to compare the efficiency, sensitivity, and specificity of the Pap smear and the polymerase chain reaction (PCR) in detecting HPV. A literature search was performed in Google Scholar, PubMed and SID databases using the keywords 'human papillomavirus', 'pap smear' and 'polymerase chain reaction' to identify studies comparing the Pap smear and PCR methods for detection. No restrictions were applied, and 10 studies were included in this review. All samples that were positive by Pap smear were also positive by PCR. However, there were positive samples detected by PCR that were negative by Pap smear, and in all studies many positive samples were missed by the Pap smear technique. Although the Pap smear had high specificity, PCR-based HPV detection was the more sensitive method and had the highest sensitivity. In order to improve the quality of detection and achieve the best possible results, PCR-based diagnostic methods are needed in addition to the Pap smear, and the Pap smear should be combined with PCR techniques given the high error rate of the Pap smear in detection.
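
With PCR treated as the reference standard, the sensitivity and specificity of the Pap smear can be computed from a 2x2 table as below; the counts are hypothetical placeholders, not pooled data from the reviewed studies.

```python
# Sensitivity/specificity of the Pap smear against PCR as the reference standard.
# The 2x2 counts are hypothetical, for illustration only.
tp = 40   # Pap positive, PCR positive
fn = 25   # Pap negative, PCR positive (cases missed by the Pap smear)
fp = 0    # Pap positive, PCR negative (consistent with "all Pap+ were PCR+")
tn = 135  # Pap negative, PCR negative

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(f"Pap smear sensitivity vs PCR: {sensitivity:.1%}")
print(f"Pap smear specificity vs PCR: {specificity:.1%}")
```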

Keywords: human papillomavirus, cervical cancer, pap smear, polymerase chain reaction

Procedia PDF Downloads 125
781 Triangular Geometric Feature for Offline Signature Verification

Authors: Zuraidasahana Zulkarnain, Mohd Shafry Mohd Rahim, Nor Anita Fairos Ismail, Mohd Azhar M. Arsad

Abstract:

Handwritten signatures are widely accepted as a biometric characteristic for personal authentication. The use of appropriate features plays an important role in determining the accuracy of signature verification; therefore, this paper presents a feature based on a geometrical concept. To achieve this aim, triangle attributes are exploited to design a new feature, since a triangle possesses orientation, angles and transformations that can improve accuracy. The proposed feature uses a triangulation geometric set comprising the sides, angles and perimeter of a triangle derived from the centre of gravity of a signature image. For classification, a Euclidean classifier along with a voting-based classifier is used to detect forged signatures. This classification process is evaluated using the triangular geometric feature and selected global features. Based on an experiment validated on the Grupo de Senales 960 (GPDS-960) signature database, the proposed triangular geometric feature achieves a lower Average Error Rate (AER) of 34%, compared to 43% for the selected global features. In conclusion, the proposed triangular geometric feature proves to be a more reliable feature for accurate signature verification.
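
One possible reading of the triangular feature (sides, angles and perimeter of a triangle tied to the signature's centre of gravity) is sketched below; the exact triangle construction used here (centroid plus the leftmost and rightmost ink pixels) is an assumption made for illustration, not necessarily the authors' definition.

```python
# Illustrative sketch of triangle-based features from a binary signature image.
# The triangle construction (centroid + leftmost + rightmost ink pixels) is an
# assumption for demonstration, not necessarily the authors' definition.
import numpy as np

def triangle_features(binary_img: np.ndarray) -> dict:
    pts = np.argwhere(binary_img)                    # (row, col) of ink pixels
    centroid = pts.mean(axis=0)                      # centre of gravity
    left = pts[pts[:, 1].argmin()].astype(float)     # leftmost ink pixel
    right = pts[pts[:, 1].argmax()].astype(float)    # rightmost ink pixel
    a = np.linalg.norm(left - right)                 # side opposite the centroid
    b = np.linalg.norm(centroid - right)
    c = np.linalg.norm(centroid - left)
    # Interior angles from the law of cosines (clipped for numerical safety).
    A = np.degrees(np.arccos(np.clip((b**2 + c**2 - a**2) / (2 * b * c), -1, 1)))
    B = np.degrees(np.arccos(np.clip((a**2 + c**2 - b**2) / (2 * a * c), -1, 1)))
    C = 180.0 - A - B
    return {"sides": (a, b, c), "angles": (A, B, C), "perimeter": a + b + c}

if __name__ == "__main__":
    img = np.zeros((50, 120), dtype=bool)
    img[40, 10:110] = True                           # a crude horizontal stroke
    img[5:40, 20] = True                             # and a vertical stroke
    print(triangle_features(img))
```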

Keywords: biometrics, euclidean classifier, features extraction, offline signature verification, voting-based classifier

Procedia PDF Downloads 373
780 Bidirectional Long Short-Term Memory-Based Signal Detection for Orthogonal Frequency Division Multiplexing With All Index Modulation

Authors: Mahmut Yildirim

Abstract:

This paper proposes bidirectional long short-term memory (Bi-LSTM) network-aided deep learning (DL)-based signal detection for orthogonal frequency division multiplexing with all index modulation (OFDM-AIM), namely Bi-DeepAIM. OFDM-AIM was developed to increase the spectral efficiency of OFDM with index modulation (OFDM-IM), a promising multi-carrier technique for communication systems beyond 5G. In this paper, due to its strong classification ability, the Bi-LSTM is considered an alternative to the maximum likelihood (ML) algorithm used for signal detection in the classical OFDM-AIM scheme. The performance of Bi-DeepAIM is compared with LSTM network-aided DL-based OFDM-AIM (DeepAIM) and with classical OFDM-AIM using ML-based signal detection, in terms of bit error rate (BER) performance and computation time. Simulation results show that Bi-DeepAIM achieves better BER performance than DeepAIM and lower computation time in signal detection than ML-based OFDM-AIM.
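
A minimal PyTorch sketch of a bidirectional LSTM used as a sequence classifier is given below as a structural stand-in for the Bi-DeepAIM detector; the input/output dimensions, synthetic data and training loop are assumptions, not the paper's configuration.

```python
# Minimal Bi-LSTM sequence classifier in PyTorch, as a structural stand-in for
# the Bi-DeepAIM detector; dimensions and data are assumptions, not the paper's.
import torch
import torch.nn as nn

class BiLSTMDetector(nn.Module):
    def __init__(self, in_dim=2, hidden=64, n_classes=16):
        super().__init__()
        self.lstm = nn.LSTM(in_dim, hidden, batch_first=True, bidirectional=True)
        self.fc = nn.Linear(2 * hidden, n_classes)   # 2x for both directions

    def forward(self, x):                 # x: (batch, seq_len, in_dim)
        out, _ = self.lstm(x)
        return self.fc(out[:, -1, :])     # classify from the last time step

model = BiLSTMDetector()
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Synthetic stand-in for received (real, imag) samples and block labels.
x = torch.randn(32, 8, 2)                 # batch of 32 blocks, 8 subcarriers
y = torch.randint(0, 16, (32,))           # one of 16 candidate index patterns

for _ in range(5):                        # toy training loop
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()
print("final toy loss:", float(loss))
```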

Keywords: bidirectional long short-term memory, deep learning, maximum likelihood, OFDM with all index modulation, signal detection

Procedia PDF Downloads 62
779 Data-Driven Approach to Predict Inpatient's Estimated Discharge Date

Authors: Ayliana Dharmawan, Heng Yong Sheng, Zhang Xiaojin, Tan Thai Lian

Abstract:

To facilitate discharge planning, doctors are presently required to assign an Estimated Discharge Date (EDD) for each patient admitted to the hospital. This assignment is largely based on the doctor’s judgment, which can be difficult for cases that are complex or relatively new to the doctor. It is hypothesized that a data-driven approach would help doctors make accurate estimations of the discharge date. Making use of routinely collected data on inpatient discharges between January 2013 and May 2016, a predictive model was developed using machine learning techniques to predict the Length of Stay (and hence the EDD) of inpatients at the point of admission. The predictive performance of the model was compared with that of the clinicians using accuracy measures. Overall, the best-performing model reduced the Average Squared Error (ASE) by 38% compared with the first EDD determined by the present method. Important predictors of the EDD include the provisional diagnosis code, patient’s age, attending doctor at admission, medical specialty at admission, accommodation type, and the mean length of stay of the patient in the past year. The predictive model can be used as a tool to accurately predict the EDD.
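
A minimal sketch of the kind of length-of-stay model described is shown below, using synthetic admission features standing in for the predictors listed, not the hospital dataset; the predicted length of stay would then be added to the admission date to give an EDD.

```python
# Minimal length-of-stay regression sketch; synthetic features standing in for
# diagnosis code, age, specialty, accommodation type and past mean LOS.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(7)
n = 2000
df = pd.DataFrame({
    "age": rng.integers(18, 95, n),
    "diagnosis_code": rng.integers(0, 50, n),    # label-encoded stand-in
    "specialty": rng.integers(0, 10, n),
    "accommodation": rng.integers(0, 3, n),
    "past_mean_los": rng.gamma(2.0, 2.0, n),
})
# Assumed relationship for the synthetic target (days of stay).
los = 2 + 0.05 * df["age"] + 0.8 * df["past_mean_los"] + rng.normal(0, 1.5, n)

X_tr, X_te, y_tr, y_te = train_test_split(df, los, test_size=0.25, random_state=7)
model = GradientBoostingRegressor(random_state=7).fit(X_tr, y_tr)
ase = mean_squared_error(y_te, model.predict(X_te))   # average squared error
print(f"Average squared error on held-out admissions: {ase:.2f} days^2")
```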

Keywords: inpatient, estimated discharge date, EDD, prediction, data-driven

Procedia PDF Downloads 166
778 An Automatic Generating Unified Modelling Language Use Case Diagram and Test Cases Based on Classification Tree Method

Authors: Wassana Naiyapo, Atichat Sangtong

Abstract:

Software development with the object-oriented methodology involves many stages that take time and incur high cost. An undetected error in the system analysis process will affect the design and implementation processes, and unexpected outputs force us to revise previous stages; the more each process is rolled back, the greater the expense and delay. With a good test process from the early phases, however, the implemented software is efficient, reliable and meets the users’ requirements. The Unified Modelling Language (UML) is a tool that uses symbols to describe the work process in Object-Oriented Analysis (OOA). This paper presents an approach for automatically generating a UML use case diagram and test cases. The UML use case diagram is generated from the event table, and test cases are generated from use case specifications and Graphical User Interfaces (GUIs). Test cases are derived with the Classification Tree Method (CTM), which classifies data into nodes of a hierarchical structure. Moreover, the paper describes the program that generates the use case diagram and test cases. As a result, the approach can reduce working time and increase efficiency.
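
The Classification Tree Method step of combining one class from each classification into concrete test cases can be illustrated with a small sketch; the login-form classifications below are hypothetical, not taken from the paper.

```python
# Sketch of Classification Tree Method test-case generation: each test case
# picks one class from every classification. Hypothetical classifications.
from itertools import product

classification_tree = {
    "username": ["valid", "unknown", "empty"],
    "password": ["correct", "wrong", "empty"],
    "remember_me": ["checked", "unchecked"],
}

test_cases = [dict(zip(classification_tree, combo))
              for combo in product(*classification_tree.values())]

print(f"{len(test_cases)} test cases generated")   # 3 * 3 * 2 = 18
for tc in test_cases[:3]:
    print(tc)
```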

Keywords: classification tree method, test case, UML use case diagram, use case specification

Procedia PDF Downloads 157
777 Ambiguity Resolution for Ground-based Pulse Doppler Radars Using Multiple Medium Pulse Repetition Frequency

Authors: Khue Nguyen Dinh, Loi Nguyen Van, Thanh Nguyen Nhu

Abstract:

In this paper, we propose an adaptive method to resolve ambiguities and a ghost target removal process to extract targets detected by a ground-based pulse-Doppler radar using medium pulse repetition frequency (PRF) waveforms. The ambiguity resolution method is an adaptive implementation of the coincidence algorithm, which is implemented on a two-dimensional (2D) range-velocity matrix to resolve range and velocity ambiguities simultaneously, with a proposed clustering filter to enhance the anti-error ability of the system. Here we consider the scenario of multiple target environments. The ghost target removal process, which is based on the power after Doppler processing, is proposed to mitigate ghosting detections to enhance the performance of ground-based radars using a short PRF schedule in multiple target environments. Simulation results on a ground-based pulsed Doppler radar model will be presented to show the effectiveness of the proposed approach.
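
The range dimension of the coincidence algorithm can be sketched as follows: each medium-PRF measurement is unfolded over its ambiguity interval, and the true range is taken where the unfolded candidates from all PRFs (nearly) coincide. The PRF values, tolerance and simulated target below are assumptions for illustration, and the velocity dimension of the 2D range-velocity matrix is omitted.

```python
# Sketch of range-ambiguity resolution by coincidence over several medium PRFs.
# PRF values, tolerance and the simulated target are illustrative assumptions.
C = 3e8                                    # speed of light (m/s)

def unambiguous_range(prf):                # R_ua = c / (2 * PRF)
    return C / (2.0 * prf)

def unfold(measured, prf, r_max):
    """All candidate ranges consistent with one folded measurement."""
    r_ua, k, out = unambiguous_range(prf), 0, []
    while measured + k * r_ua <= r_max:
        out.append(measured + k * r_ua)
        k += 1
    return out

def resolve(measurements, prfs, r_max=50e3, tol=150.0):
    """Return unfolded ranges from the first PRF matched by every other PRF."""
    candidate_sets = [unfold(m, p, r_max) for m, p in zip(measurements, prfs)]
    resolved = []
    for cand in candidate_sets[0]:
        if all(any(abs(cand - c) < tol for c in s) for s in candidate_sets[1:]):
            resolved.append(cand)
    return resolved

prfs = [9e3, 12e3, 15e3]                   # three medium PRFs (Hz)
true_range = 47_500.0                      # metres
folded = [true_range % unambiguous_range(p) for p in prfs]
print(resolve(folded, prfs))               # recovers ~47500 m within r_max
```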

Keywords: ambiguity resolution, coincidence algorithm, medium PRF, ghosting removal

Procedia PDF Downloads 144
776 Encryption and Decryption of Nucleic Acid Using Deoxyribonucleic Acid Algorithm

Authors: Iftikhar A. Tayubi, Aabdulrahman Alsubhi, Abdullah Althrwi

Abstract:

Deoxyribonucleic acid (DNA) sequence text provides structural biologists with a single source of high-quality sequence data, and this work provides cryptography for it. We provide an intuitive, well-organized and user-friendly web interface that allows users to encrypt and decrypt DNA sequence text, securing it with an algorithm for encrypting and decrypting the DNA sequence. The utility of this DNA sequence text tool is that it offers a user-friendly interface for users to encrypt, decrypt and store information about DNA sequences. The interfaces created in this project satisfy the demands of the scientific community by providing full encryption of DNA sequences through the website. We have adopted a methodology using C# and Active Server Pages (ASP.NET) for programming, which is smart and secure. The tool is a useful piece of equipment for encrypting large quantities of data efficiently. Users can thus navigate between encoded and stored text, depending on their field of interest. Algorithm classification allows a user to protect the DNA sequence from change, whether an alteration or error occurred during DNA sequence data transfer, and the tool checks the integrity of the DNA sequence data during access.
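
The system described is built in C# and ASP.NET; purely as a language-neutral illustration of symmetrically encrypting and decrypting a DNA sequence string (here with the Python `cryptography` package, not the authors' algorithm), the round trip looks like this.

```python
# Illustrative round trip: encrypting and decrypting a DNA sequence string with
# symmetric (Fernet/AES-based) encryption. Not the authors' C#/ASP.NET system.
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # stored server-side in a real system
cipher = Fernet(key)

sequence = b"ATGCGTACGTTAGCCTAGGCATCGATCGTTAACGGCTA"   # hypothetical sequence
token = cipher.encrypt(sequence)     # safe to store or transmit
restored = cipher.decrypt(token)     # integrity-checked on decryption

assert restored == sequence
print("ciphertext prefix:", token[:40])
print("decrypted sequence matches the original:", restored == sequence)
```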

Keywords: algorithm, ASP.NET, DNA, encrypt, decrypt

Procedia PDF Downloads 224
775 Vertical Accuracy Evaluation of Indian National DEM (CartoDEM v3) Using Dual Frequency GNSS Derived Ground Control Points for Lower Tapi Basin, Western India

Authors: Jaypalsinh B. Parmar, Pintu Nakrani, Ashish Chaurasia

Abstract:

A Digital Elevation Model (DEM) is an important dataset in GIS-based terrain analysis for many applications and for the assessment of processes such as environmental and climate change studies, hydrologic modelling, etc. The vertical accuracy of a DEM, which varies geographically, depends on different parameters that affect model simulation outcomes. Vertical accuracy assessment in the Indian landscape, especially in low-lying coastal urban terrain such as the lower Tapi Basin, is very limited. In the present study, an attempt has been made to evaluate the vertical accuracy of the 30 m resolution, open-source Indian national Cartosat-1 DEM v3 for the Lower Tapi Basin (LTB) in western India. An extensive field investigation was carried out using a stratified random fast-static DGPS survey across the entire study region, and 117 high-accuracy ground control points (GCPs) were obtained. The open-source DEM was compared with the obtained GCPs, different statistical attributes were computed, and vertical error histograms were evaluated.
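
The vertical-accuracy statistics typically reported in such assessments (mean error, mean absolute error, RMSE, and an error histogram) reduce to simple operations on the DEM-minus-GCP differences; the elevation values below are placeholders, not the LTB survey data.

```python
# Vertical-accuracy statistics of a DEM against GNSS ground control points.
# The elevation values are placeholders, not the 117 LTB GCPs.
import numpy as np

gcp_elev = np.array([12.41, 8.73, 15.20, 9.88, 11.05, 14.67])   # DGPS heights (m)
dem_elev = np.array([13.10, 8.10, 16.05, 9.20, 11.90, 15.40])   # DEM samples (m)

error = dem_elev - gcp_elev
print(f"mean error (bias): {error.mean():.2f} m")
print(f"mean absolute error: {np.abs(error).mean():.2f} m")
print(f"RMSE: {np.sqrt(np.mean(error ** 2)):.2f} m")
counts, edges = np.histogram(error, bins=5)   # basis of a vertical error histogram
print("error histogram counts:", counts)
```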

Keywords: CartoDEM, Digital Elevation Model, GPS, lower Tapi basin

Procedia PDF Downloads 353
774 The Effect of Nursing Teamwork Training on Nursing Teamwork Effectiveness

Authors: Manar Ahmed Elbadawy

Abstract:

Background: Empirical evidence suggests that improving nursing teamwork (NTW) may be the key to reducing medical error. Functioning nursing teams require open communication, mutual respect, and shared mental models to deliver quality patient care. The complexity and the high demand for specialized nursing knowledge and skill also require nursing staff to consult with one another and work in teams regularly. The current study aimed to evaluate the effect of a nursing teamwork training program on nursing teamwork effectiveness. Design: A quasi-experimental (one group pretest-posttest) design was utilized in three medical intensive care units at a teaching hospital affiliated to Cairo University Hospital, Egypt. Subjects: A convenience sample of 48 nursing staff working at the selected units. The Nursing Teamwork Observational Checklist was used. Results: Total NTW mean scores rose markedly after program implementation compared with pre-program levels and decreased only slightly 3 months later (mean = 2.52, SD = 0.27, 51.98%; mean = 2.72, SD = 0.20, 72.45%; and mean = 2.67, SD = 0.11, 67.48%, respectively). Conclusion: Implementation of the NTW training program had a positive effect on increasing NTW effectiveness. Regular, frequent short-term teamwork training should be introduced, and sustained monitoring is required to ensure change in nursing attitudes, knowledge and skills regarding teamwork effectiveness.

Keywords: effectiveness, nursing, teamwork, training

Procedia PDF Downloads 119
773 Delineation of Green Infrastructure Buffer Areas with a Simulated Annealing: Consideration of Ecosystem Services Trade-Offs in the Objective Function

Authors: Andres Manuel Garcia Lamparte, Rocio Losada Iglesias, Marcos BoullóN Magan, David Miranda Barros

Abstract:

The biodiversity strategy of the European Union for 2030 mentions climate change as one of the key factors in biodiversity loss and considers green infrastructure one of the solutions to this problem. Along these lines, the European Commission has developed a green infrastructure strategy which commits member states to consider green infrastructure in their territorial planning. This green infrastructure is aimed at ensuring the provision of a wide number of ecosystem services to support biodiversity and human well-being by countering the effects of climate change. Yet, there are not many tools available to delimit green infrastructure. The available ones consider the potential of the territory to provide ecosystem services. However, these methods usually aggregate several maps of ecosystem service potential without considering possible trade-offs. This can lead to excluding areas with a high potential for providing ecosystem services which have many trade-offs with other ecosystem services. In order to tackle this problem, a methodology is proposed to consider ecosystem service trade-offs in the objective function of a simulated annealing algorithm aimed at delimiting green infrastructure multifunctional buffer areas. To this end, the provision potential maps of the regulating ecosystem services considered to delimit the multifunctional buffer areas are clustered into groups, so that ecosystem services that create trade-offs are excluded within each group. The normalized provision potential maps of the ecosystem services in each group are added to obtain a potential map per group, which is normalized again. The potential maps for each group are then combined into a raster map that shows the highest provision potential value in each cell. The combined map is then used in the objective function of the simulated annealing algorithm. The algorithm is run both using the proposed methodology and considering the ecosystem services individually. The results are analyzed with spatial statistics and landscape metrics to check the number of ecosystem services that the delimited areas produce, as well as their regularity and compactness. It has been observed that the proposed methodology increases the number of ecosystem services produced by the delimited areas, improving their multifunctionality and increasing their effectiveness in preventing climate change impacts.
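
The optimisation core described (a simulated annealing search whose objective rewards the combined, trade-off-aware potential of the cells selected for the buffer area) can be sketched on a toy raster as below; the grid, neighbourhood move and cooling schedule are assumptions for illustration, not the authors' GIS implementation.

```python
# Toy simulated annealing sketch for selecting buffer-area cells that maximise
# a combined ecosystem-service potential raster. Illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(3)
potential = rng.random((30, 30))          # combined (trade-off aware) potential map
N_CELLS = 60                              # size of the buffer area to delimit

def objective(cells):
    rows, cols = zip(*cells)
    return potential[rows, cols].sum()    # total potential of selected cells

def random_solution():
    idx = rng.choice(potential.size, N_CELLS, replace=False)
    return set(zip(*np.unravel_index(idx, potential.shape)))

def neighbour(cells):
    """Swap one selected cell for a random unselected one."""
    new = set(cells)
    new.remove(next(iter(new)))
    while True:
        cand = (int(rng.integers(30)), int(rng.integers(30)))
        if cand not in new:
            new.add(cand)
            return new

current = random_solution()
current_f = objective(current)
T = 1.0
for step in range(5000):
    cand = neighbour(current)
    cand_f = objective(cand)
    # Accept improvements always, and worse moves with a temperature-dependent probability.
    if cand_f > current_f or rng.random() < np.exp((cand_f - current_f) / T):
        current, current_f = cand, cand_f
    T *= 0.999                            # geometric cooling schedule

print(f"selected {len(current)} cells, total potential {current_f:.2f}")
```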

Keywords: ecosystem services trade-offs, green infrastructure delineation, multifunctional buffer areas, climate change

Procedia PDF Downloads 168
772 Municipal Asset Management Planning 2.0 – A New Framework For Policy And Program Design In Ontario

Authors: Scott R. Butler

Abstract:

Ontario, Canada’s largest province, is in the midst of an interesting experiment in mandated asset management planning for local governments. At the beginning of 2021, Ontario’s 444 municipalities were responsible for the management of 302,864 lane kilometres of roads with a replacement cost of $97.545 billion CDN. Roadways are by far the most complex, expensive, and extensive assets that a municipality is responsible for overseeing. Since adopting Ontario Regulation 588/17: Asset Management Planning for Municipal Infrastructure in 2017, the provincial government has established prescriptions for local road authorities regarding asset categories and the levels of service being provided. This provincial regulation further stipulates that asset data such as extent, condition, and life cycle costing are to be captured in a manner compliant with qualitative descriptions and technical metrics. The Ontario Good Roads Association undertook an exercise to aggregate the road-related data contained within the 444 asset management plans that municipalities have filed with the provincial government. This analysis concluded that, collectively, Ontario municipal roadways carry $34.7 billion CDN in deferred maintenance. The poor state of repair of Ontario municipal roads has lasting implications for the province’s economic competitiveness and has garnered considerable political attention. Municipal efforts to address the maintenance backlog are stymied by the extremely limited fiscal parameters municipalities must operate within in Ontario. Further exacerbating the problem are provincially designed programs that are ineffective, administratively burdensome, and not necessarily aligned with local priorities or strategies. This paper addresses how municipal asset management plans, and more specifically the data contained in these plans, can be used to design innovative policy frameworks, flexible funding programs, and new levels of service that respond to these funding challenges, as well as to emerging issues such as local economic development and climate change. Fully unlocking the potential of Ontario Regulation 588/17 will require a resolute commitment to data standardization and horizontal collaboration between municipalities within regions.

Keywords: transportation, municipal asset management, subnational policy design, subnational funding program design

Procedia PDF Downloads 89
771 Evaluation of Medication Administration Process in a Paediatric Ward

Authors: Zayed Alsulami, Asma Aldosseri, Ahmed Ezziden, Abdulrahman Alonazi

Abstract:

Children are more susceptible to medication errors than adults. The medication administration process is the last stage of the medication treatment process, and most errors are detected at this stage. Little research has been undertaken on medication errors in children in Middle Eastern countries. This study aimed to evaluate how well paediatric nurses adhere to the medication administration policy and to identify medication preparation and administration errors and any risk factors. An observational, prospective study of the medication administration process, from the nurses' preparation of patient medication through to administration (May to August 2014), was conducted in Saudi Arabia. Twelve paediatric nurses serving 90 paediatric patients were observed, and 456 administered drug doses were evaluated. The adherence rate was variable in 7 of 16 steps; patient allergy information, dose calculation, and drug expiry date were the steps with the lowest adherence rates. Sixty-three medication preparation and administration errors were identified, an error rate of 13.8% of medication administrations. No potentially life-threatening errors were witnessed. A few logistic and administrative factors were reported. The results showed that the medication administration policy and procedure need an urgent revision to be more practicable for nurses, and nurses' knowledge and skills regarding the medication administration process should be improved.

Keywords: medication safety, paediatric, medication errors, paediatric ward

Procedia PDF Downloads 388
770 Multi-Linear Regression Based Prediction of Mass Transfer by Multiple Plunging Jets

Authors: S. Deswal, M. Pal

Abstract:

The paper aims to compare the performance of vertical and inclined multiple plunging jets and to model and predict their mass transfer capacity using a multi-linear regression based approach. The multiple vertical plunging jets have a jet impact angle of θ = 90°, whereas the multiple inclined plunging jets have a jet impact angle of θ = 60°. The results of the study suggest that mass transfer is higher for multiple jets, and inclined multiple plunging jets provide up to 1.6 times higher mass transfer than vertical multiple plunging jets under similar conditions. The derived relationship, based on the multi-linear regression approach, successfully predicted the volumetric mass transfer coefficient (KLa) from the operational parameters of multiple plunging jets, with a correlation coefficient of 0.973, a root mean square error of 0.002 and a coefficient of determination of 0.946. The results suggest that the predicted overall mass transfer coefficient is in good agreement with actual experimental values, demonstrating the utility of the derived multi-linear regression relationship, which can be successfully employed in modelling mass transfer by multiple plunging jets.
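
A generic sketch of fitting and evaluating such a multi-linear relationship for KLa is shown below; the operational parameters and target are synthetic stand-ins, not the plunging-jet measurements.

```python
# Multi-linear regression sketch for a volumetric mass transfer coefficient KLa;
# synthetic operational parameters, not the plunging-jet experimental data.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(1)
n = 120
X = np.column_stack([
    rng.uniform(30, 90, n),      # jet impact angle (degrees)
    rng.uniform(0.5, 3.0, n),    # jet velocity (m/s)
    rng.uniform(2, 8, n),        # number of jets
])
# Assumed linear relationship for the synthetic target.
kla = 0.001 * X[:, 0] + 0.02 * X[:, 1] + 0.005 * X[:, 2] + rng.normal(0, 0.005, n)

model = LinearRegression().fit(X, kla)
pred = model.predict(X)
print(f"coefficient of determination R^2: {r2_score(kla, pred):.3f}")
print(f"root mean square error: {np.sqrt(mean_squared_error(kla, pred)):.4f}")
print(f"correlation coefficient: {np.corrcoef(kla, pred)[0, 1]:.3f}")
```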

Keywords: mass transfer, multiple plunging jets, multi-linear regression, earth sciences

Procedia PDF Downloads 449
769 Analytical Authentication of Butter Using Fourier Transform Infrared Spectroscopy Coupled with Chemometrics

Authors: M. Bodner, M. Scampicchio

Abstract:

Fourier Transform Infrared (FT-IR) spectroscopy coupled with chemometrics was used to distinguish between butter samples and non-butter samples, and quantification of the margarine content in adulterated butter samples was investigated. The fingerprint region (1400-800 cm⁻¹) was used to develop unsupervised pattern recognition (Principal Component Analysis, PCA), supervised modelling (Soft Independent Modelling by Class Analogy, SIMCA), classification (Partial Least Squares Discriminant Analysis, PLS-DA) and regression (Partial Least Squares Regression, PLS-R) models. PCA of the fingerprint region shows a clustering of the two sample types. All samples were classified in their rightful class by the SIMCA approach; however, nine adulterated samples (between 1% and 30% w/w of margarine) were classified as belonging both to the butter class and to the non-butter one. The two-class PLS-DA model (R² = 0.73; RMSEP, Root Mean Square Error of Prediction, = 0.26% w/w) had a sensitivity of 71.4% and a Positive Predictive Value (PPV) of 100%, and its threshold was calculated at 7% w/w of margarine in adulterated butter samples. Finally, a PLS-R model (R² = 0.84, RMSEP = 16.54%) was developed. PLS-DA was a suitable classification tool and PLS-R a proper quantification approach. The results demonstrate that FT-IR spectroscopy combined with PLS-R can be used as a rapid, simple and safe method to distinguish pure butter samples from adulterated ones and to determine the degree of adulteration with margarine in butter samples.
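
The regression arm of the workflow, a PLS-R model predicting the percentage of margarine from fingerprint-region spectra, can be sketched with scikit-learn as below; the spectra are synthetic placeholders, not the FT-IR measurements.

```python
# PLS regression sketch for predicting adulteration level from spectra;
# synthetic spectra standing in for the 1400-800 cm-1 fingerprint region.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(5)
n_samples, n_wavenumbers = 80, 300
margarine_pct = rng.uniform(0, 30, n_samples)              # % w/w adulteration
base = np.sin(np.linspace(0, 6 * np.pi, n_wavenumbers))    # butter-like spectrum
marker = np.exp(-0.5 * ((np.arange(n_wavenumbers) - 120) / 8) ** 2)  # margarine band
spectra = (base + np.outer(margarine_pct / 30, marker)
           + rng.normal(0, 0.02, (n_samples, n_wavenumbers)))

X_tr, X_te, y_tr, y_te = train_test_split(spectra, margarine_pct, random_state=5)
pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
pred = pls.predict(X_te).ravel()
print(f"R^2: {r2_score(y_te, pred):.3f}")
print(f"RMSEP: {np.sqrt(mean_squared_error(y_te, pred)):.2f} % w/w")
```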

Keywords: adulterated butter, margarine, PCA, PLS-DA, PLS-R, SIMCA

Procedia PDF Downloads 135
768 A Comparative Study of Wellness Among Sportsmen and Non Sportsmen

Authors: Jaskaran Singh Sidhu

Abstract:

Aim: The purpose of this study is to examine the relationship between wellness and sports participation by comparing sportsmen and non-sportsmen. Methodology: The present study is an experimental study of 80 senior secondary volleyball players aged 16-19 years from Ludhiana District of Punjab (India) and 80 non-sportspersons taken from senior secondary schools of Ludhiana district. The sample for this study was selected through a random sampling technique. Tools: A five-point scale with 50 items was used to assess wellness. Statistical Analysis: A t-test was used to test the significance of the difference between the means of the two groups. The mean, standard deviation, and standard error of the mean were calculated for each characteristic. Data were analyzed using SPSS (Statistical Package for the Social Sciences), and statistical significance was set at p < 0.05. Results: Substantial differences were noted at p < 0.05 in overall wellness. Sportsmen showed significant differences at p < 0.05 in three parameters of wellness, i.e., physical wellness, mental wellness, and social wellness, while non-sportsmen showed significant differences at p < 0.05 in the spiritual and emotional wellness attributes. Conclusion: The data indicate that overall wellness can be improved by participation in sports. It was further noted that participation in sports promotes the attributes of wellness, i.e., physical wellness, mental wellness, emotional wellness and social wellness.
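
The significance test used (an independent-samples t-test on the wellness scores of the two groups, with significance at p < 0.05) corresponds to the following sketch; the scores are simulated, not the survey data.

```python
# Independent-samples t-test sketch comparing wellness scores of two groups.
# Simulated scores, not the study's survey data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
sportsmen = rng.normal(200, 15, 80)        # simulated total wellness scores
non_sportsmen = rng.normal(190, 15, 80)

t_stat, p_value = stats.ttest_ind(sportsmen, non_sportsmen)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
print("significant at p < 0.05" if p_value < 0.05 else "not significant")
```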

Keywords: physical, mental, social, emotional, wellness, spiritual

Procedia PDF Downloads 83
767 Development of Requirements Analysis Tool for Medical Autonomy in Long-Duration Space Exploration Missions

Authors: Lara Dutil-Fafard, Caroline Rhéaume, Patrick Archambault, Daniel Lafond, Neal W. Pollock

Abstract:

Improving resources for medical autonomy of astronauts in prolonged space missions, such as a Mars mission, requires not only technology development, but also decision-making support systems. The Advanced Crew Medical System - Medical Condition Requirements study, funded by the Canadian Space Agency, aimed to create knowledge content and a scenario-based query capability to support medical autonomy of astronauts. The key objective of this study was to create a prototype tool for identifying medical infrastructure requirements in terms of medical knowledge, skills and materials. A multicriteria decision-making method was used to prioritize the highest risk medical events anticipated in a long-term space mission. Starting with those medical conditions, event sequence diagrams (ESDs) were created in the form of decision trees where the entry point is the diagnosis and the end points are the predicted outcomes (full recovery, partial recovery, or death/severe incapacitation). The ESD formalism was adapted to characterize and compare possible outcomes of medical conditions as a function of available medical knowledge, skills, and supplies in a given mission scenario. An extensive literature review was performed and summarized in a medical condition database. A PostgreSQL relational database was created to allow query-based evaluation of health outcome metrics with different medical infrastructure scenarios. Critical decision points, skill and medical supply requirements, and probable health outcomes were compared across chosen scenarios. The three medical conditions with the highest risk rank were acute coronary syndrome, sepsis, and stroke. Our efforts demonstrate the utility of this approach and provide insight into the effort required to develop appropriate content for the range of medical conditions that may arise.

Keywords: decision support system, event-sequence diagram, exploration mission, medical autonomy, scenario-based queries, space medicine

Procedia PDF Downloads 122
766 Introduction of Para-Sasaki-Like Riemannian Manifolds and Construction of New Einstein Metrics

Authors: Mancho Manev

Abstract:

The concept of almost paracontact Riemannian manifolds (abbr., apcR manifolds) was introduced by I. Sato in 1976 as an analogue of almost contact Riemannian manifolds. The notion of an apcR manifold of type (p,q) was defined by S. Sasaki in 1980, where p and q are respectively the numbers of the multiplicity of the structure eigenvalues 1 and -1. It also has a simple eigenvalue of 0. In our work, we consider (2n+1)-dimensional apcR manifolds of type (n,n), i.e., the paracontact distribution of the studied manifold can be considered as a 2n-dimensional almost paracomplex Riemannian distribution with almost paracomplex structure and structure group O(n) × O(n). The aim of the present study is to introduce a new class of apcR manifolds. Such a manifold is obtained using the construction of a certain Riemannian cone over it, and the resulting manifold is a paraholomorphic paracomplex Riemannian manifold (abbr., phpcR manifold). We call it a para-Sasaki-like Riemannian manifold (abbr., pSlR manifold) and give some explicit examples. We study the structure of pSlR spaces and find that the paracontact form η is closed and each pSlR manifold locally can be considered as a certain product of the real line with a phpcR manifold, which is locally a Riemannian product of two equidimensional Riemannian spaces. We also obtain that the curvature of the pSlR manifolds is completely determined by the curvature of the underlying local phpcR manifold. Moreover, the ξ-directed Ricci curvature is equal to -2n, while in the Sasaki case, it is 2n. Accordingly, the pSlR manifolds can be interpreted as the counterpart of the Sasaki manifolds; the skew-symmetric part of ∇η vanishes, while in the Sasaki case, the symmetric part vanishes. We define a hyperbolic extension of a (complete) phpcR manifold that resembles a certain warped product, and we indicate that it is a (complete) pSlR manifold. In addition, we consider the hyperbolic extension of a phpcR manifold and prove that if the initial manifold is a complete Einstein manifold with negative scalar curvature, then the resulting manifold is a complete Einstein pSlR manifold with negative scalar curvature. In this way, we produce new examples of a complete Einstein Riemannian manifold with negative scalar curvature. Finally, we define and study para contact conformal/homothetic deformations by deriving a subclass that preserves the para-Sasaki-like condition. We then find that if we apply a paracontact homothetic deformation of a pSlR space, we obtain that the Ricci tensor is invariant.

Keywords: almost paracontact Riemannian manifolds, Einstein manifolds, holomorphic product manifold, warped product manifold

Procedia PDF Downloads 202
765 Evaluating the Effect of Structural Reorientation to Thermochemical and Energetic Properties of 1,4-Diamino-3,6-Dinitropyrazolo[4,3- C]Pyrazole

Authors: Lamla Thungathaa, Conrad Mahlasea, Lisa Ngcebesha

Abstract:

1,4-Diamino-3,6-dinitropyrazolo[4,3-c]pyrazole (LLM-119) and its structural isomer 3,6-dinitropyrazolo[3,4-c]pyrazole-1,4(6H)-diamine were designed by structural reorientation of the fused pyrazole rings and their respective substituents (-NO2 and -NH2). Structural reorientation involves structural rearrangement resulting in different structural isomers; employing this approach, six structural isomers of LLM-119 were obtained. The effect of structural reorientation (isomerisation and derivatives) on the enthalpy of formation, detonation properties, impact sensitivity, and density of these molecules is studied computationally. The computational methods used are detailed in the document and yielded results close to literature values, with relative errors of 2% for the enthalpy of formation, 2% for density, 0.05% for detonation velocity, and 4% for detonation pressure. Correlating the structural reorientation with the calculated thermochemical and detonation properties of the molecules indicated that molecules with a -NO2 group attached to a carbon atom and an -NH2 group connected to a nitrogen atom maximize the enthalpy of formation and detonation velocity; the way the pyrazole rings are joined has less effect on these parameters. Density and detonation pressure improved when both the -NO2 and -NH2 functional groups were on the same side of the molecular structure. The structural reorientation gave rise to 3,4-dinitropyrazolo[3,4-c]pyrazole-1,6-diamine, which exhibited the best density and detonation performance among the molecules studied.

Keywords: LLM-119, fused rings, azole, structural isomers, detonation properties

Procedia PDF Downloads 84