Search results for: accuracy improvement
2403 E-Learning Recommender System Based on Collaborative Filtering and Ontology
Authors: John Tarus, Zhendong Niu, Bakhti Khadidja
Abstract:
In recent years, e-learning recommender systems have attracted great attention as a solution to the problem of information overload in e-learning environments, providing relevant recommendations to online learners. E-learning recommenders play an increasing educational role in helping learners find appropriate learning materials to support the achievement of their learning goals. Although general recommender systems have recorded significant success in solving the problem of information overload in e-commerce domains and providing accurate recommendations, e-learning recommender systems still face issues arising from differences in learner characteristics such as learning style, skill level and study level. Conventional recommendation techniques such as collaborative filtering and content-based filtering deal with only two types of entities, namely users and items with their ratings. These conventional recommender systems do not take learner characteristics into account in their recommendation process and therefore cannot make accurate and personalized recommendations in e-learning environments. In this paper, we propose a recommendation technique combining collaborative filtering and ontology to recommend personalized learning materials to online learners. The ontology is used to incorporate learner characteristics into the recommendation process alongside the ratings, while collaborative filtering predicts ratings and generates recommendations. Furthermore, ontological knowledge is used by the recommender system at the initial stages, in the absence of ratings, to alleviate the cold-start problem. Evaluation results show that our proposed recommendation technique outperforms collaborative filtering on its own in terms of personalization and recommendation accuracy.
Keywords: collaborative filtering, e-learning, ontology, recommender system
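As an illustration of the kind of hybrid prediction the abstract describes (not the authors' implementation), the sketch below blends the usual rating-based similarity of collaborative filtering with an ontology-derived similarity over learner characteristics; all names, the blending weight `alpha`, and the sample data are hypothetical.

```python
# Sketch: user-based collaborative filtering with an ontology-style
# learner-profile similarity blended in. All names and data are made up.
from math import sqrt

def rating_similarity(ra, rb):
    """Cosine similarity over co-rated items."""
    common = set(ra) & set(rb)
    if not common:
        return 0.0
    num = sum(ra[i] * rb[i] for i in common)
    den = sqrt(sum(ra[i] ** 2 for i in common)) * sqrt(sum(rb[i] ** 2 for i in common))
    return num / den

def ontology_similarity(pa, pb):
    """Fraction of matching learner characteristics (style, level, ...)."""
    keys = set(pa) & set(pb)
    return sum(pa[k] == pb[k] for k in keys) / len(keys) if keys else 0.0

def predict(user, item, ratings, profiles, alpha=0.5):
    """Predict user's rating of item; alpha blends the two similarities.
    With alpha=0 only the ontology is used, which is how a cold-start
    (no ratings yet) case could still produce recommendations."""
    num = den = 0.0
    for other, r in ratings.items():
        if other == user or item not in r:
            continue
        sim = (alpha * rating_similarity(ratings[user], r)
               + (1 - alpha) * ontology_similarity(profiles[user], profiles[other]))
        num += sim * r[item]
        den += abs(sim)
    return num / den if den else None

ratings = {"u1": {"a": 5, "b": 3}, "u2": {"a": 5, "b": 3, "c": 4}, "u3": {"a": 1, "c": 2}}
profiles = {"u1": {"style": "visual", "level": "beginner"},
            "u2": {"style": "visual", "level": "beginner"},
            "u3": {"style": "verbal", "level": "advanced"}}
pred = predict("u1", "c", ratings, profiles)
```

Here the prediction for `u1` leans toward `u2`, whose ratings and learner profile both match.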
Procedia PDF Downloads 392
2402 Implementation of Lean Tools (Value Stream Mapping and ECRS) in an Oil Refinery
Authors: Ronita Singh, Yaman Pattanaik, Soham Lalwala
Abstract:
In today’s highly competitive business environment, every organization is striving towards lean manufacturing systems to achieve lower production lead times, lower costs, less inventory and overall improvement in supply chain efficiency. Based on this idea, the paper presents the practical application of the Value Stream Mapping (VSM) tool and the ECRS (Eliminate, Combine, Reduce, and Simplify) technique in the receipt section of the material management center of an oil refinery. A value stream is an assortment of all actions (value added as well as non-value added) required to bring a product through the essential flows, starting with raw material and ending with the customer. To draw the current state value stream map, all relevant data of the receipt cycle were collected and analyzed. The current state map was then analyzed to determine the type and quantum of waste at every stage, which helped in ascertaining how far the warehouse is from the concept of lean manufacturing. The current VSM showed that two processes, preparation of the GRN (Goods Receipt Number) and preparation of the UD (Usage Decision), are bottleneck operations with higher cycle times. This root cause analysis of the various types of waste helped in designing a strategy for step-wise implementation of lean tools. The future state thus created a lean flow of materials at the warehouse center, reducing the lead time of the receipt cycle from 11 days to 7 days and increasing overall efficiency by 27.27%.
Keywords: current VSM, ECRS, future VSM, receipt cycle, supply chain, VSM
Procedia PDF Downloads 324
2401 Toxicological Standardization of Heavy Metals and Microbial Contamination in Haematinic Herbal Formulations Marketed in India
Authors: A. V. Chandewar, Sanjay Bais
Abstract:
Background: In India, drugs of herbal origin have been used in traditional systems of medicine such as Unani and Ayurveda since ancient times. The WHO limit for Escherichia coli is 10¹ cfu/g, for Staphylococcus aureus 10⁵ cfu/g, for Pseudomonas aeruginosa 10³ cfu/g, and for Salmonella species nil cfu. WHO specifies maximum permissible limits in raw materials only for arsenic, cadmium and lead, which amount to 1.0, 0.3 and 10 ppm, respectively. Aim: The main purpose of the investigation was to document evidence for the users and practitioners of marketed haematinic herbal formulations. In the present study, haematinic herbal formulations marketed in Yavatmal, India were examined for microbial and heavy metal content. Method: The investigations were performed using specific media and atomic absorption spectrometry. Result: The present work indicates the presence of heavy metals in the herbal formulations selected for study. The arsenic content was below the permissible limit in all formulations, while the cadmium and lead contents in six formulations were above the permissible limits. Such formulations are injurious to the health of patients if consumed regularly. Specific media were used to determine the presence of Escherichia coli (4 samples), Staphylococcus aureus (3 samples) and P. aeruginosa (4 samples). The data suggest that in-process improvement is required to provide better quality for consumer health and to be competitive in international markets. Summary/Conclusion: The presence of microbial and heavy metal content above WHO limits indicates that GMP was not followed during the manufacturing of herbal formulations marketed in India.
Keywords: toxicological standardization, heavy metals, microbial contamination, haematinic herbal formulations
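The heavy metal screening described above amounts to comparing assay results against fixed WHO thresholds; a minimal sketch of that check is below, using the limits quoted in the abstract (arsenic 1.0 ppm, cadmium 0.3 ppm, lead 10 ppm) and a hypothetical assay result.

```python
# WHO maximum permissible limits for raw materials, as quoted in the
# abstract (ppm). The sample assay values below are hypothetical.
WHO_LIMITS_PPM = {"arsenic": 1.0, "cadmium": 0.3, "lead": 10.0}

def exceeds_limits(assay):
    """Return the metals whose measured content exceeds the WHO limit."""
    return sorted(m for m, ppm in assay.items() if ppm > WHO_LIMITS_PPM[m])

sample = {"arsenic": 0.4, "cadmium": 0.7, "lead": 12.5}  # hypothetical assay
flagged = exceeds_limits(sample)
```

A formulation like `sample` would be flagged for cadmium and lead, matching the pattern the study reports for six formulations.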
Procedia PDF Downloads 454
2400 Non-Invasive Data Extraction from Machine Display Units Using Video Analytics
Authors: Ravneet Kaur, Joydeep Acharya, Sudhanshu Gaur
Abstract:
Artificial Intelligence (AI) has the potential to transform manufacturing by improving shop floor processes such as production, maintenance and quality. However, industrial datasets are notoriously difficult to extract in a real-time, streaming fashion, negating potential AI benefits. A prime example is specialized industrial controllers operated by custom software, which complicates connecting them to an Information Technology (IT) based data acquisition network. Security concerns may also limit direct physical access to these controllers for data acquisition. To connect the Operational Technology (OT) data stored in these controllers to an AI application in a secure, reliable and available way, we propose a novel Industrial IoT (IIoT) solution in this paper. In this solution, we demonstrate how video cameras can be installed on a factory shop floor to continuously capture images of the controller HMIs. We propose image pre-processing to segment the HMI into regions of streaming data and regions of fixed meta-data. We then evaluate the performance of multiple Optical Character Recognition (OCR) technologies, such as Tesseract and Google Vision, in recognizing the streaming data, and test them on typical factory HMIs under realistic lighting conditions. Finally, we use the meta-data to match the OCR output with the temporal, domain-dependent context of the data to improve the accuracy of the output. Our IIoT solution enables reliable and efficient data extraction, which will improve the performance of subsequent AI applications.
Keywords: human machine interface, industrial internet of things, internet of things, optical character recognition, video analytics
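The final step, matching OCR output against domain-dependent context, can be sketched as a post-processing pass: common glyph confusions are mapped to digits and the result is range-checked against what the meta-data says the field should contain. The confusion table and valid ranges below are hypothetical, not from the paper.

```python
# Sketch of OCR post-processing: correct common digit/letter confusions,
# then validate against the field's expected numeric range (derived, in
# the paper's terms, from the fixed meta-data region). Values are made up.
OCR_CONFUSIONS = {"O": "0", "o": "0", "l": "1", "I": "1", "S": "5", "B": "8"}

def parse_numeric_field(raw, valid_range):
    """Map confusable glyphs to digits, parse, and range-check; return
    None when the reading is implausible for this field."""
    cleaned = "".join(OCR_CONFUSIONS.get(c, c) for c in raw.strip())
    try:
        value = float(cleaned)
    except ValueError:
        return None
    lo, hi = valid_range
    return value if lo <= value <= hi else None
```

For a temperature field known to lie in 0-200, a raw OCR reading of `"12O.5"` would be recovered as 120.5, while an implausible `"9999"` would be rejected rather than forwarded to the AI application.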
Procedia PDF Downloads 113
2399 A “Best Practice” Model for Physical Education in the BRICS Countries
Authors: Vasti Oelofse, Niekie van der Merwe, Dorita du Toit
Abstract:
This study addresses the need for a unified best practice model for Physical Education across the BRICS nations, as current research primarily offers individual country recommendations. Drawing on relevant literature within the framework of Bronfenbrenner’s Ecological Systems Theory, as well as data from open-ended questionnaires completed by Physical Education experts from the BRICS countries, the study develops a best practice model based on identified challenges and effective practices in Physical Education. The proposed model incorporates flexible and resource-efficient strategies tailored to address PE challenges specific to these countries, enhancing outcomes for learners, empowering teachers, and fostering systemic collaboration among BRICS members. It comprises six key areas: “Curriculum and policy requirements”, “General approach”, “Theoretical basis”, “Strategies for presenting content”, “Teacher training”, and “Evaluation”. The “Strategies for presenting content” area addresses both well-resourced and poorly resourced schools, adapting curriculum, teaching strategies, materials, and learner activities for varied socio-economic contexts. The model emphasizes a holistic approach to learner development, engaging environments, and continuous teacher training. A collaborative approach among the BRICS countries, focusing on shared best practices and continuous improvement, is vital for the model's successful implementation, enhancing Physical Education programs and outcomes across these nations.
Keywords: BRICS countries, physical education, best practice model, ecological systems theory
Procedia PDF Downloads 20
2398 Project Knowledge Harvesting: The Case of Improving Project Performance through Project Knowledge Sharing Framework
Authors: Eng Rima Al-Awadhi, Abdul Jaleel Tharayil
Abstract:
In a project-centric organization like KOC, managing project knowledge is of critical importance to the success of the project and the organization. Due to the nature and complexity involved, each project engagement generates many 'learnings' that need to be factored in when new projects are initiated, to avoid repeating the same mistakes. Many a time, however, these learnings remain localized as 'tacit knowledge', leading to scope re-work, schedule overruns, adjustment orders, concession requests and claims. While KOC follows an asset-based organization structure, with a multi-cultural and multi-ethnic workforce, and a large chunk of the work is carried out through complex, long-term project engagements, diffusion of learnings across assets while dealing with the natural entropy of the organization is of great significance. Considering the relatively high number of mega projects, it is important that the issues raised during the project life cycle are centrally harvested and analyzed, and that the learnings from these issues are shared, absorbed and in turn utilized to enhance and refine existing processes and practices, leading to improved project performance. One of the many factors contributing to the successful completion of a project on time is a reduction in the number of variations or concessions triggered during the project life cycle. This paper discusses the knowledge harvesting methodology adopted in the project-process-integrated knowledge sharing framework, the challenges faced, the learnings acquired and their impact on project performance. The framework facilitates the proactive identification of issues that may have an impact on the overall quality of the project and improves performance.
Keywords: knowledge harvesting, project integrated knowledge sharing, performance improvement, knowledge management, lessons learned
Procedia PDF Downloads 401
2397 Compliance of Systematic Reviews in Plastic Surgery with the PRISMA Statement: A Systematic Review
Authors: Seon-Young Lee, Harkiran Sagoo, Katherine Whitehurst, Georgina Wellstead, Alexander Fowler, Riaz Agha, Dennis Orgill
Abstract:
Introduction: Systematic reviews attempt to answer research questions by synthesising the data within primary papers. They are an increasingly important tool within evidence-based medicine, guiding clinical practice, future research and healthcare policy. We sought to determine the reporting quality of recent systematic reviews in plastic surgery. Methods: This systematic review was conducted in line with the Cochrane handbook, reported in line with the PRISMA statement and registered at the Research Registry (UIN: reviewregistry18). The MEDLINE and EMBASE databases were searched for systematic reviews published in 2013 and 2014 in five major plastic surgery journals. Screening, identification and data extraction were performed independently by two teams. Results: From an initial set of 163 articles, 79 met the inclusion criteria. The median PRISMA score was 16 out of 27 items (59.3%; range 6-26, 95% CI 14-17). Compliance with individual PRISMA items was highly variable. It was poorest for items related to the use of a review protocol (item 5; 5%) and the presentation of data on the risk of bias of each study (item 19; 18%), and highest for the description of rationale (item 3; 99%), sources of funding and other support (item 27; 95%), and the structured summary in the abstract (item 2; 95%). Conclusion: The reporting quality of systematic reviews in plastic surgery requires improvement. ‘Hard-wiring’ of compliance through journal submission systems, as well as improved education, awareness and a cohesive strategy among all stakeholders, is called for.
Keywords: PRISMA, reporting quality, plastic surgery, systematic review, meta-analysis
Procedia PDF Downloads 296
2396 Assessment of Forest Above Ground Biomass Through Linear Modeling Technique Using SAR Data
Authors: Arjun G. Koppad
Abstract:
The study was conducted in Joida taluk of Uttara Kannada district, Karnataka, India, to assess land use land cover (LULC) and forest aboveground biomass using L-band SAR data. The study area contains dense, moderately dense and sparse forests. The sampled area was 0.01 percent of the forest area, with 30 randomly selected sampling plots. The point center quadrate (PCQ) method was used to select trees and collect tree growth parameters, viz. tree height, diameter at breast height (DBH), and diameter at the tree base. Tree crown density was measured with a densitometer. The biomass of each sample plot was estimated using the standard formula. In this study, LULC classification was done using the Freeman-Durden, Yamaguchi and Pauli polarimetric decompositions. The Freeman-Durden decomposition showed the best LULC classification, with an accuracy of 88 percent. An attempt was made to estimate aboveground biomass using SAR backscatter. ALOS-2 PALSAR-2 L-band fully polarimetric quad-pol SAR data (HH, HV, VV and VH) were used. A SAR backscatter-based regression model was implemented to retrieve forest aboveground biomass of the study area. Cross-polarization (HV) showed a good correlation with forest aboveground biomass. Multiple linear regression analysis was done to estimate the aboveground biomass of the natural forest areas of Joida taluk. Among the polarization combinations (HH & HV, VV & HH, HV & VH, VV & VH), the combination of HH and HV polarization shows a good correlation with field and predicted biomass. The RMSE and R² values for HH & HV and HH & VV were 78 t/ha and 0.861, and 81 t/ha and 0.853, respectively. Hence the model can be recommended for estimating AGB for dense, moderately dense and sparse forests.
Keywords: forest, biomass, LULC, backscatter, SAR, regression
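The backscatter-to-biomass regression described above can be sketched with a single-predictor ordinary least squares fit, in the spirit of the reported HV correlation; the plot-level backscatter and biomass numbers below are synthetic, not the study's data.

```python
# Sketch: fit AGB = a + b * sigma0_HV by ordinary least squares, as a
# single-predictor stand-in for the study's regression. Data are synthetic.
def fit_linear(x, y):
    """OLS for y = a + b*x; returns (intercept, slope)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

# Hypothetical plot-level HV backscatter (dB) and field biomass (t/ha)
hv = [-18.0, -16.0, -14.0, -12.0, -10.0]
agb = [60.0, 100.0, 140.0, 180.0, 220.0]
a, b = fit_linear(hv, agb)
predicted = a + b * (-13.0)  # biomass estimate for an unsampled plot
```

The study's multiple-regression variant would add a second predictor (e.g. HH backscatter) and fit both coefficients jointly.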
Procedia PDF Downloads 32
2395 European Union Health Policy and the Response to COVID-19 Pandemic: Building a European Health Union
Authors: Aikaterini Tsalampouni
Abstract:
The European Union has long been the most developed model of economic and political integration, bringing a common market, a common currency and a standardization of national policies in certain areas consistent with EU values and principles. In this direction, there is a parallel process of social integration that affects the public policy decisions of member states. Even though social policy, i.e. social protection and, moreover, healthcare policy, remains the member states' responsibility to develop, the EU applies different mechanisms to influence health policy systems, since, from a more federalist point of view, the EU ought to expand its regulatory and legislative roles in as many policy areas as possible. Recently, the pandemic has become a turning point for health care provision and has also highlighted the need to strengthen the EU’s role in coordinating health care. This paper analyses EU health policy in general, as well as the response to the COVID-19 pandemic, in an attempt to identify indications of interaction between EU policies and the promotion of sustainable and resilient health systems. More analytically, the paper investigates the EU's binding legal instruments, non-binding legal instruments, monitoring and assessment instruments, and instruments for co-financing concerning health care provision in member states, and records the evolution of health policies before and during the COVID-19 pandemic. The paper concludes by articulating some remarks regarding the improvement of health policy in the EU. Since the ability to deal with a pandemic depends on continuous and increased investment in health systems, the involvement of the EU can lead to a policy convergence necessary for the resilience of the systems, while maintaining a strong health policy framework in Europe.
Keywords: EU health policy, EU response to COVID-19, European Health Union, health systems in Europe
Procedia PDF Downloads 117
2394 High Resolution Sandstone Connectivity Modelling: Implications for Outcrop Geology and Its Analog Studies
Authors: Numair Ahmed Siddiqui, Abdul Hadi bin Abd Rahman, Chow Weng Sum, Wan Ismail Wan Yousif, Asif Zameer, Joel Ben-Awal
Abstract:
Advances in data capturing from outcrop studies have made possible the acquisition of high-resolution digital data, offering improved and economical reservoir modelling methods. Terrestrial laser scanning utilizing LiDAR (light detection and ranging) provides a new way to build outcrop-based reservoir models, which provide crucial information for understanding heterogeneities in sandstone facies through high-resolution images and data sets. This study presents the detailed application of an outcrop-based sandstone facies connectivity model, combining information gathered from traditional fieldwork with detailed digital point-cloud data from LiDAR, to develop an intermediate small-scale reservoir sandstone facies model of the Miocene Sandakan Formation, Sabah, East Malaysia. The software RiScan Pro (v1.8.0) was used for digital data collection and post-processing, with an accuracy of 0.01 m and a point acquisition rate of up to 10,000 points per second. We provide an accurate and descriptive workflow to triangulate point-clouds of different sets of sandstone facies with well-marked top and bottom boundaries, in conjunction with field sedimentology. This provides a highly accurate qualitative sandstone facies connectivity model, which is a challenge to obtain from subsurface datasets (i.e., seismic and well data). Finally, by applying this workflow, we can build an outcrop-based static connectivity model, which can serve as an analogue for subsurface reservoir studies.
Keywords: LiDAR, outcrop, high resolution, sandstone facies, connectivity model
Procedia PDF Downloads 232
2393 Analysis of Spatiotemporal Efficiency and Fairness of Railway Passenger Transport Network Based on Space Syntax: Taking Yangtze River Delta as an Example
Abstract:
Based on the railway network and the principles of space syntax, this study attempts to reconstruct the spatial relationships of passenger network connections from the space and time perspectives. Using travel time data for the main stations in the Yangtze River Delta urban agglomeration obtained from the Internet, topological drawings of the railway network under different time sections are constructed. With a comprehensive index composed of connection and integration, the accessibility and network operation efficiency of the railway network in different time periods are calculated, while the fairness of the network is analyzed with fairness indicators constructed from integration and location entropy, from the perspectives of horizontal and vertical fairness respectively. From the analysis of the efficiency and fairness of the railway passenger transport network, the study finds that: (1) there is strong regularity in regional system accessibility change; (2) the problems of efficiency and fairness differ across time periods; (3) improvements in efficiency lead to a decline in horizontal fairness to a certain extent, while from the perspective of vertical fairness the supply-demand situation has changed smoothly over time; (4) the network connection efficiency of the Shanghai, Jiangsu and Zhejiang regions is higher than that of western regions such as Anqing and Chizhou; (5) the marginalization of Nantong, Yancheng, Yangzhou and Taizhou is obvious. The study explores the application of space syntax theory in regional traffic analysis, in order to provide a reference for the development of urban agglomeration transportation networks.
Keywords: space syntax, the Yangtze River Delta, railway passenger time, efficiency and fairness
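The two space-syntax measures used above can be illustrated on a toy station graph: "connection" as node degree, and "integration" here taken as the reciprocal of mean topological depth (BFS shortest path) to all other stations. The four-station network below is hypothetical, not the Yangtze River Delta data, and real space-syntax integration involves further normalization.

```python
# Sketch of connection (degree) and a simplified integration measure
# (1 / mean BFS depth) on a toy station adjacency graph. Data are made up.
from collections import deque

def connection(graph, node):
    """Degree: number of directly connected stations."""
    return len(graph[node])

def integration(graph, node):
    """Reciprocal of mean shortest-path depth to every other station;
    higher values mean the station is topologically more central."""
    depth = {node: 0}
    queue = deque([node])
    while queue:
        cur = queue.popleft()
        for nxt in graph[cur]:
            if nxt not in depth:
                depth[nxt] = depth[cur] + 1
                queue.append(nxt)
    others = [d for n, d in depth.items() if n != node]
    return len(others) / sum(others)

net = {"A": {"B", "C"}, "B": {"A", "C", "D"}, "C": {"A", "B"}, "D": {"B"}}
```

In this toy network, station `B` is both the best connected and the most integrated, while the peripheral `D` scores lowest, mirroring the marginalization pattern the study describes.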
Procedia PDF Downloads 138
2392 Thermal Image Segmentation Method for Stratification of Freezing Temperatures
Authors: Azam Fazelpour, Saeed R. Dehghani, Vlastimil Masek, Yuri S. Muzychka
Abstract:
The study uses an image analysis technique employing thermal imaging to measure the percentage of areas at various temperatures on a freezing surface. An image segmentation method using threshold values is applied to a sequence of images recording the freezing process. The phenomenon is transient, and temperatures vary rapidly as the surface reaches the freezing point and completes the freezing process. Freezing salt water is subject to salt rejection, which makes the freezing point dynamic and dependent on the salinity at the phase interface. For a specific freezing area, nucleation starts from one side and ends at the other, which causes a dynamic and transient temperature field in that area. Thermal cameras can reveal differences in temperature owing to their sensitivity to infrared radiance. Using an experimental setup, a video is recorded with a thermal camera to monitor radiance and temperatures during the freezing process. Image processing techniques are applied to all frames to detect and classify temperatures on the surface. An image segmentation method is used to find contours of equal temperature on the icing surface. Each segment is obtained using the temperature range appearing in the image and the corresponding pixel values. Using the contours extracted from the images and the camera parameters, stratified areas with different temperatures are calculated. To observe temperature contours on the icing surface with the thermal camera, a salt water sample is dropped onto a cold surface at a temperature of -20°C. A thermal video is recorded for 2 minutes to observe the temperature field. Examining the results obtained by the method against the experimental observations verifies the accuracy and applicability of the method.
Keywords: ice contour boundary, image processing, image segmentation, salt ice, thermal image
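The stratification step amounts to binning each pixel's temperature into a band and reporting the percentage area per band; the sketch below does this on a tiny synthetic frame. The 2x4 frame and the band edges are hypothetical, not the study's data.

```python
# Sketch of threshold-based stratification: bin pixel temperatures of a
# (synthetic) thermal frame into bands and report each band's area share.
def stratify(frame, bands):
    """frame: 2D list of temperatures (deg C); bands: sorted upper edges.
    Returns the percentage of pixels in each band (last = above all edges)."""
    counts = [0] * (len(bands) + 1)
    total = 0
    for row in frame:
        for t in row:
            total += 1
            for i, edge in enumerate(bands):
                if t <= edge:
                    counts[i] += 1
                    break
            else:
                counts[-1] += 1
    return [100.0 * c / total for c in counts]

frame = [[-20.0, -20.0, -5.0, 0.0],
         [-20.0, -10.0, -2.0, 1.0]]   # synthetic 2x4 thermal frame
shares = stratify(frame, bands=[-15.0, -1.0])
```

Applied per video frame, such shares give the temporal evolution of the frozen, freezing and unfrozen area fractions.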
Procedia PDF Downloads 323
2391 Medical Diagnosis of Retinal Diseases Using Artificial Intelligence Deep Learning Models
Authors: Ethan James
Abstract:
Over one billion people worldwide suffer from some level of vision loss or blindness as a result of progressive retinal diseases. Many patients, particularly in developing areas, are incorrectly diagnosed or entirely undiagnosed due to unconventional diagnostic tools and screening methods. Artificial intelligence (AI) based on deep learning (DL) convolutional neural networks (CNN) has recently gained high interest in ophthalmology for computer-aided imaging diagnosis, disease prognosis and risk assessment. Optical coherence tomography (OCT) is a popular imaging technique used to capture high-resolution cross-sections of retinas. In ophthalmology, DL has been applied to fundus photographs, optical coherence tomography and visual fields, achieving robust classification performance in the detection of various retinal diseases, including macular degeneration, diabetic retinopathy and retinitis pigmentosa. However, there is no complete diagnostic model for analyzing these retinal images that provides a diagnostic accuracy above 90%. Thus, the purpose of this project was to develop an AI model that utilizes machine learning techniques to automatically diagnose specific retinal diseases from OCT scans. The model's neural network architecture was trained on a dataset of over 20,000 real-world OCT images and utilizes residual neural networks with cyclic pooling. This DL model can ultimately aid ophthalmologists in diagnosing patients with these retinal diseases more quickly and more accurately, thereby facilitating earlier treatment, which results in improved post-treatment outcomes.
Keywords: artificial intelligence, deep learning, imaging, medical devices, ophthalmic devices, ophthalmology, retina
Procedia PDF Downloads 184
2390 Corporate Social Responsibility: An Ethical or a Legal Framework?
Authors: Pouira Askary
Abstract:
Indeed, in our globalized world, which is facing various international crises, transnational corporations and other business enterprises have the capacity to foster economic well-being, development, technological improvement and wealth, as well as to cause adverse impacts on human rights. The UN Human Rights Council has declared that although the primary responsibility to protect human rights lies with the State, transnational corporations and other business enterprises also have a responsibility to respect and protect human rights within the framework of corporate social responsibility. In 2011, the Human Rights Council endorsed the Guiding Principles on Business and Human Rights, a set of guidelines that define the key duties and responsibilities of States and business enterprises with regard to business-related human rights abuses. In the UN's view, the Guiding Principles do not create new legal obligations but constitute a clarification of the implications of existing standards, including under international human rights law. In 2014, the UN Human Rights Council decided to establish a working group on transnational corporations and other business enterprises whose mandate is to elaborate an international legally binding instrument to regulate, in international human rights law, the activities of transnational corporations and other business enterprises. It is an extremely difficult task for the working group to codify a legally binding document regulating the behavior of corporations on the basis of the norms of international law. This paper concentrates on the origins of the human rights applicable to business enterprises. The research will argue that the social and ethical roots of CSR are much more institutionalized and elaborated than the legal roots.
Therefore, the first step is to determine whether, and to what extent, corporations have an ethical responsibility to respect human rights and, if so, by which means this ethical and social responsibility can be converted into legal commitments.
Keywords: CSR, ethics, international law, human rights, development, sustainable business
Procedia PDF Downloads 387
2389 A Two-Phase Flow Interface Tracking Algorithm Using a Fully Coupled Pressure-Based Finite Volume Method
Authors: Shidvash Vakilipour, Scott Ormiston, Masoud Mohammadi, Rouzbeh Riazi, Kimia Amiri, Sahar Barati
Abstract:
Two-phase and multi-phase flows are common flow types in fluid mechanics engineering. Among the basic and applied problems of these flow types, two-phase parallel flow is one in which two immiscible fluids flow in the vicinity of each other. In this type of flow, fluid properties (e.g. density, viscosity, and temperature) differ on the two sides of the interface between the fluids. The most challenging part of the numerical simulation of two-phase flow is determining the location of the interface accurately. In the present work, a coupled interface tracking algorithm is developed based on the Arbitrary Lagrangian-Eulerian (ALE) approach using a cell-centered, pressure-based, coupled solver. To validate this algorithm, an analytical solution for fully developed two-phase flow in the presence of gravity is derived, and the results of the numerical simulation of this flow are compared with the analytical solution at various flow conditions. The results of the simulations show good accuracy of the algorithm despite using a fairly coarse and uniform grid. Temporal variations of the interface profile toward the steady-state solution show that a greater difference between the fluids' properties (especially dynamic viscosity) results in larger traveling waves. Gravity effect studies also show that favorable gravity reduces the thickness of the heavier fluid while adverse gravity increases it with respect to the zero-gravity condition; however, the magnitude of variation under favorable gravity is much greater than under adverse gravity.
Keywords: coupled solver, gravitational force, interface tracking, Reynolds number to Froude number, two-phase flow
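The analytical benchmark mentioned above can be sketched as follows; this is a generic form for fully developed, stratified two-phase channel flow with gravity acting along the flow direction, written here as an assumed configuration, not necessarily the exact one the authors derived.

```latex
% Streamwise momentum in each layer (fully developed, so u_i = u_i(y)):
\mu_i \, \frac{d^{2} u_i}{d y^{2}} \;=\; \frac{dp}{dx} \;-\; \rho_i \, g_x ,
\qquad i = 1, 2
% No slip at both walls, plus continuity of velocity and shear stress
% at the interface y = y_I:
u_1(y_I) \;=\; u_2(y_I),
\qquad
\mu_1 \left. \frac{d u_1}{d y} \right|_{y_I}
 \;=\; \mu_2 \left. \frac{d u_2}{d y} \right|_{y_I}
```

Each $u_i$ is quadratic in $y$; the two no-slip conditions and the two interface conditions fix the four integration constants, giving a closed-form velocity profile against which the simulated profiles can be compared.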
Procedia PDF Downloads 317
2388 Examination of Forged Signatures Printed by Means of Fabrication in Terms of Their Relation to the Perpetrator
Authors: Salim Yaren, Nergis Canturk
Abstract:
Signatures are signs handwritten by a person to confirm values such as information, amounts, meaning, time and undertakings that bear on a document. By signing a document, the signer is understood to accept and approve the accuracy of the information it contains. Forged signatures are produced by a forger without knowing or seeing the original signature of the person being imitated, while the forger attempts to hide the typical characteristics of his/her own signature. Forged signatures often begin with the initials of the first and last name of the person whose signature is being forged, and any similarities to the genuine signature are completely random. Within the scope of the study, both original signatures and forged signatures referring to 5 imaginary people were collected from 100 people. These signatures were compared against 14 signature analysis criteria by 2 signature analysis experts other than the researcher. Expert 1, with 9 years of experience in the field, evaluated the signatures of 39 (39%) people correctly and of 25 (25%) people incorrectly, and could not reach a conclusion for the signatures of 36 (36%) people. Expert 2, with 16 years of experience in the field, evaluated the signatures of 49 (49%) people correctly and of 28 (28%) people incorrectly, and could not reach a conclusion for the signatures of 23 (23%) people. The forged signatures of 24 (24%) people were matched correctly by both experts, those of 8 (8%) people were matched incorrectly, and for those of 12 (12%) people neither expert could reach a decision. Signature analysis is a subjective field, so analyses and comparisons take shape according to the education, knowledge and experience of the expert.
Consequently, given that 39% success was achieved by the analysis expert with 9 years of professional experience and 49% success by the expert with 16 years, the success rate appears to be directly proportional to the knowledge and experience of the expert.
Keywords: forensic signature, forensic signature analysis, signature analysis criteria, forged signature
Procedia PDF Downloads 127
2387 Randomized, Controlled Blind Study Comparing Sacroiliac Intra-Articular Steroid Injection to Radiofrequency Denervation for Management of Sacroiliac Joint Pain
Authors: Ossama Salman
Abstract:
Background and objective: Sacroiliac joint (SIJ) pain is a common cause of chronic axial low back pain, with a prevalence rate of up to 20%. To date, no effective long-term treatment intervention has been established. The aim of our study was to compare steroid block with radiofrequency ablation for SIJ pain. Methods: A randomized, blind study was conducted in 30 patients with sacroiliac joint pain. Fifteen patients received radiofrequency denervation of the L4-5 primary dorsal rami and S1-3 lateral sacral branches, and 15 patients received steroid injection under fluoroscopy. Patients in the steroid group who did not respond to the injections were offered a cross-over to radiofrequency ablation. Results: At 1, 3 and 6 months post-intervention, 73%, 60% and 53% of patients, respectively, gained ≥50% pain relief in the radiofrequency (RF) ablation group. In the steroid group, only 20% gained ≥50% pain relief at the one-month follow-up, and no improvement was seen at the 3-month and 6-month follow-ups. Conclusions: Radiofrequency ablation at the L4 and L5 primary dorsal rami and S1-3 lateral sacral branches may provide more effective and longer-lasting pain relief than the classic intra-articular steroid injection in properly selected patients with suspected sacroiliac joint pain. Larger studies are needed to confirm our results and to define the optimal patient selection and treatment parameters for this poorly understood disorder.
Keywords: lateral branch denervation, LBD, radio frequency, RF, sacroiliac joint, SIJ, visual analogue scale, VAS
Procedia PDF Downloads 219
2386 An Integrated Label Propagation Network for Structural Condition Assessment
Authors: Qingsong Xiong, Cheng Yuan, Qingzhao Kong, Haibei Xiong
Abstract:
Deep-learning approaches based on vibration responses have attracted growing attention in rapid structural condition assessment, but obtaining sufficient measured training data with corresponding labels is costly and often inaccessible in practical engineering. This study proposes an integrated label propagation network for structural condition assessment, which diffuses labels from the continuously generated measurements of the intact structure to the unlabeled measurements of damage scenarios. The network combines damage-sensitive feature extraction by a deep autoencoder with pseudo-label propagation by optimized fuzzy clustering; its architecture and mechanism are elaborated. With a carefully designed network and specific strategies for improving performance, the approach integrates the strengths of self-supervised representation learning, unsupervised fuzzy clustering, and supervised classification for assessing damage conditions. Both numerical simulations and full-scale laboratory shaking-table tests of a two-story building structure were conducted to validate its capability for detecting post-earthquake damage. The identification accuracy of the network was 0.95 in the numerical validations and 0.86 on average in the laboratory case studies. Notably, the training procedure of all models in the network relies on no labeled data from damage scenarios, only on several samples from the intact structure, which indicates a significant advantage in model adaptability and practical applicability.
Keywords: autoencoder, condition assessment, fuzzy clustering, label propagation
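The pseudo-label propagation step described in the abstract can be illustrated with a minimal fuzzy c-means routine. This is a generic sketch, not the authors' optimized clustering: the feature matrix `X` stands in for autoencoder-extracted damage-sensitive features, and the cluster count, fuzzifier `m`, and iteration budget are illustrative assumptions.

```python
import numpy as np

def fuzzy_cmeans(X, c, m=2.0, iters=50, seed=0):
    """Minimal fuzzy c-means; returns memberships U (n x c) and cluster centers."""
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], c))
    U /= U.sum(axis=1, keepdims=True)          # memberships sum to 1 per sample
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))          # standard FCM membership update
        U = inv / inv.sum(axis=1, keepdims=True)
    return U, centers

def pseudo_labels(X, c):
    """Hard pseudo-label per sample from the fuzzy memberships."""
    U, _ = fuzzy_cmeans(X, c)
    return U.argmax(axis=1)
```

Pseudo-labels produced this way for unlabeled damage measurements could then supervise a downstream classifier, mirroring the integration of clustering and classification that the abstract describes.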
Procedia PDF Downloads 100
2385 Architectural Adaptation for Road Humps Detection in Adverse Light Scenario
Authors: Padmini S. Navalgund, Manasi Naik, Ujwala Patil
Abstract:
A road hump is a semi-cylindrical elevation built across the road at specific locations. A vehicle must maneuver over the hump at reduced speed to avoid damage and pass safely. If road humps are identified in advance, they help maintain the security and stability of vehicles, especially in adverse visibility conditions such as night scenarios. We propose a deep-learning architecture adaptation that implements the Mish activation function and a new classification loss function, called "Effective Focal Loss", for detecting Indian road humps in adverse light scenarios. We captured images of marked and unmarked road humps with two different types of cameras across South India to build a heterogeneous dataset, which enabled the algorithm to train on varied data and improve detection accuracy. The images were pre-processed and annotated for two classes, marked hump and unmarked hump, and the resulting dataset was used to train a single-stage object detection algorithm. We also synthetically generated reduced-visibility road-hump scenarios. We observed that the proposed framework effectively detected marked and unmarked humps in both clear and adverse light environments. This architectural adaptation enables early detection of Indian road humps in reduced-visibility conditions, thereby helping autonomous driving technology handle a wider range of real-world scenarios.
Keywords: Indian road hump, reduced visibility condition, low light condition, adverse light condition, marked hump, unmarked hump, YOLOv9
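The abstract does not specify how "Effective Focal Loss" differs from its baseline. As a reference point, the standard focal loss of Lin et al., which such variants typically modify, can be sketched as follows; the `alpha` and `gamma` values are the commonly used defaults, not the authors' settings.

```python
import numpy as np

def focal_loss(p, y, alpha=0.25, gamma=2.0):
    """Binary focal loss. p: predicted probability of the positive class, y: 0/1 labels."""
    p_t = np.where(y == 1, p, 1.0 - p)          # probability assigned to the true class
    a_t = np.where(y == 1, alpha, 1.0 - alpha)  # class-balancing weight
    return -a_t * (1.0 - p_t) ** gamma * np.log(p_t + 1e-12)
```

The (1 - p_t)^gamma factor down-weights easy examples, which matters in detection tasks like this one where background regions vastly outnumber humps; with gamma = 0 the loss reduces to a class-weighted cross-entropy.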
Procedia PDF Downloads 32
2384 The Implementation of Level of Service for Development of Kuala Lumpur Transit Information System using GIS
Authors: Mokhtar Azizi
Abstract:
Due to heavy traffic and congested roads, it is crucial that the most popular main public transport services in Kuala Lumpur, i.e. Putra LRT, Star LRT, KTM Commuter, KL Monorail and Rapid Bus, are continuously monitored, improved to fulfil riders' requirements, and kept updated by the transit agencies. The current status of the services was evaluated by calculating the transit supportive area (TSA) and level of service (LOS) for each transit station. This study carried out TSA and LOS mapping based on GIS techniques. Detailed census data for the region along the service lines were collected from the Department of Statistics Malaysia for this purpose. In measuring the quality of service along the service lines, service coverage was defined by a 400-meter buffer zone around bus stations and an 800-meter zone around rail stations and railways. All the required information was calculated using customized GIS software called the Kuala Lumpur Transit Information System (KLTIS). The transit supportive area was calculated using an employment density of at least 10 jobs/hectare or a household density of at least 7.5 units/hectare; the total transit supportive area covers 22,516 hectares, and the total area not supported by transit is 1,718 hectares in Kuala Lumpur. The level of service was calculated as the percentage of the transit supportive area served by transit for each station. Overall, the percentage of transit supportive area served by transit was less than 50% for all stations, which falls into a very low level-of-service category. This research provides the current transit service operators with vital information for improving existing public transport services.
Keywords: service coverage, transit supportive area, level of service, transit system
Procedia PDF Downloads 380
2383 Research the Causes of Defects and Injuries of Reinforced Concrete and Stone Construction
Authors: Akaki Qatamidze
Abstract:
Implementation of the project will be a step forward for the reliability and improvement of construction in Georgia and for the development of the construction industry. Completion of the project is expected to yield a complete body of knowledge for assessing the technical condition of reinforced concrete and stone structures. The method is based on a detailed examination of the structure in order to establish its injuries, eliminate them, and allow the structural scheme to be changed in line with new requirements and architectural preservation constraints. The systematic-analysis approach adopted in this research project on reinforced concrete and stone structures is important for optimizing the research process and developing new knowledge in neighboring areas. In addition, the problem of reconciling physical and mathematical models rests mainly on in-situ physical data and mathematical calculation models, with physical experiments used only for the specification and verification of the calculation model. To investigate the causes of defects and failures of reinforced concrete and stone construction more effectively, with maximum automation and reduced expenditure of resources, the proposed methodological concept is based on system analysis, a major particularity of modern science and technology. It allows the same work stages and procedures to be identified for whole families of structures, which makes it possible to exclude subjectivity and to address the problem in the optimal direction. The methodology of the project represents a major step forward in the construction trades and offers practical assistance to engineers, supervisors, and technical experts in settling this problem in construction.
Keywords: building, reinforced concrete, expertise, stone structures
Procedia PDF Downloads 338
2382 Graph Neural Network-Based Classification for Disease Prediction in Health Care Heterogeneous Data Structures of Electronic Health Record
Authors: Raghavi C. Janaswamy
Abstract:
In the healthcare sector, heterogeneous data elements such as patients, diagnoses, symptoms, conditions, observation text from physician notes, and prescriptions form the essentials of the Electronic Health Record (EHR). In most systems, these data, in the form of clear text and images, are stored or processed in a relational format. However, the intrinsic structural restrictions and complex joins of relational databases limit their widespread utility. In this regard, designing realistic mappings and deep connections as real-time graph objects offers unparalleled advantages. Herein, a graph neural network-based classification of EHR data has been developed. Patient conditions are predicted as a node classification task using graph-based open-source EHR data from the Synthea database stored in TigerGraph. The Synthea dataset is leveraged because it closely represents real-world data and is voluminous. The graph model is built from the heterogeneous EHR data using Python modules: pyTigerGraph to retrieve nodes and edges from the TigerGraph database, PyTorch to tensorize them, and PyTorch Geometric (PyG) to train the Graph Neural Network (GNN), adopting self-supervised learning techniques with autoencoders to generate node embeddings and eventually performing node classification using those embeddings. The model predicts patient conditions ranging from common to rare. The outcome is expected to open up opportunities for data querying toward better predictions and accuracy.
Keywords: electronic health record, graph neural network, heterogeneous data, prediction
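The pipeline above relies on PyTorch Geometric, but the core message-passing idea behind GNN node classification can be shown framework-free. The sketch below implements a two-layer graph-convolution forward pass with Kipf-and-Welling-style symmetric normalization on a toy graph; the feature sizes and weights are illustrative, not drawn from the Synthea data.

```python
import numpy as np

def gcn_forward(A, X, W1, W2):
    """Two-layer GCN forward pass: softmax(A_hat relu(A_hat X W1) W2)."""
    A_hat = A + np.eye(A.shape[0])                 # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(d ** -0.5)
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt       # symmetric normalization
    H = np.maximum(A_norm @ X @ W1, 0.0)           # hidden layer with ReLU
    logits = A_norm @ H @ W2
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)        # per-node class probabilities
```

Each node's prediction mixes in its neighbors' features through `A_norm`, which is what lets labeled patient nodes inform predictions for structurally similar, unlabeled ones.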
Procedia PDF Downloads 90
2381 Review of Numerical Models for Granular Beds in Solar Rotary Kilns for Thermal Applications
Authors: Edgar Willy Rimarachin Valderrama, Eduardo Rojas Parra
Abstract:
Thermal energy from solar radiation is widely used in power plants, food drying, chemical reactors, heating and cooling systems, water treatment processes, hydrogen production, and other applications. In power plants, one of the available technologies for transforming solar energy into thermal energy is the solar rotary kiln, in which a bed of granular matter is heated by concentrated radiation obtained from an arrangement of heliostats. Numerical modeling is a useful approach for studying the behavior of granular beds in solar rotary kilns; once validated with small-scale experiments, it can be used to simulate large-scale processes for industrial applications. This study gives a comprehensive classification of the numerical models used to simulate movement and heat transfer in beds of granular media within solar rotary furnaces. In general, there exist three categories of models: 1) continuum, 2) discrete, and 3) multiphysics modeling. Continuum modeling covers zero-dimensional, one-dimensional, and fluid-like models. Discrete element models, on the other hand, compute the movement of each particle of the bed individually; in this kind of modeling, heat transfer acts during contacts, which can occur by solid-solid and solid-gas-solid conduction. Finally, the multiphysics approach uses discrete elements to simulate the grains and a continuum model for the fluid around the particles. This classification allows the advantages and disadvantages of each kind of model to be compared in terms of accuracy, computational cost, and ease of implementation.
Keywords: granular beds, numerical models, rotary kilns, solar thermal applications
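For the discrete category, the contact heat exchange can be sketched in a few lines. This is a deliberately simplified discrete-element conduction update (explicit Euler, a single scalar contact conductance `h_c`, uniform heat capacity), not one of the reviewed models.

```python
import numpy as np

def dem_conduction_step(T, contacts, h_c, m_cp, dt):
    """One explicit time step of particle-particle contact conduction.
    T: particle temperatures; contacts: (i, j) index pairs of touching particles."""
    dQ = np.zeros_like(T)
    for i, j in contacts:
        q = h_c * (T[j] - T[i])   # heat flow from particle j into particle i
        dQ[i] += q
        dQ[j] -= q
    return T + dt * dQ / m_cp     # temperature update; total energy is conserved
```

A solid-gas-solid pathway would add a second, gap-dependent conductance term per pair, and a multiphysics model would couple this update to a continuum solver for the surrounding gas.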
Procedia PDF Downloads 49
2380 Incidence and Causes of Elective Surgery Cancellations in Songklanagarind Hospital, Thailand
Authors: A. Kaeotawee, N. Bunmas, W. Chomthong
Abstract:
Background: The cancellation of elective surgery is a major indicator of poor operating-room efficiency. Furthermore, it is recognized as a major cause of emotional trauma to patients and their families. This study was carried out to assess the incidence and causes of elective surgery cancellation in our setting and to find appropriate solutions for better quality management. Objective: To determine the incidence and causes of elective surgery cancellations in Songklanagarind Hospital. Material and Method: A prospective survey was conducted from September to November 2012. All patients whose scheduled elective operations were cancelled were assessed. Data were collected on two components: (1) patient demographics; (2) the main reasons for cancellation, grouped into patient-related factors and organization-related factors. Data are reported as the percentage of patients whose operations were cancelled. The association between cancellation status and patient demographics was assessed using univariate logistic regression. Results: 2,395 patients were scheduled for elective surgery, and 343 of these (14.3%) had their operations cancelled. Cardiothoracic surgery had the highest cancellation rate (28.7%), while ophthalmology had the lowest (10.1%). The main reasons for cancellation were organization-related in 53.6% of cases, of which 48.4% were due to the surgeon, and patient-related in 46.4% of cases, of which 32.1% were non-medical reasons. The most common surgeon-related cause was lack of theater time (21.3%), and the most common patient-related cause was the patient's non-appearance (25.1%). Cancellation was significantly associated with type of patient, health insurance, type of anesthesia, and specialty (p<0.05). Conclusion: Surgery cancellations by surgeons owing to lack of theater time were a significant problem in our setting. Appropriate solutions for better quality improvement are needed.
Keywords: elective cases, surgery cancellation, quality management, appropriate solutions
Procedia PDF Downloads 261
2379 Preliminary Studies of Transient Stability for the 380 kV Connection West-Central of Saudi Electricity Company
Authors: S. Raja Mohamed, M. H Shwehdi, D. Devaraj
Abstract:
This paper presents and discusses the performance of the newly planned 380 kV transmission line under steady-state and transient conditions. Dynamic modeling and analysis of this proposed inter-tie, intended to transfer energy from west to south and vice versa, are demonstrated and discussed. The west-central-south inter-tie links Al-Aula-Zaba-Tabuk-Tubarjal-Jawf-Hail. It is essential to investigate the transient over-voltages to ensure steady and stable transmission over such an inter-tie. The Saudi Electricity Company (SEC) has been improving its grid to make the whole country an interconnected system. The east, central and west regions are already interconnected, yet each is mostly fed by its local generation, and the SEC is planning to establish many inter-ties to strengthen the transient stability of its grid. This paper studies one of the important links, a 380 kV, 220 km line between Tabouk and Tubarjal, which is a step towards connecting the west with the south region. Modeling and analysis are carried out in software under different scenarios, and methods to stabilize the link and increase its power transmission are also discussed. The improvement of power system transients by FACTS elements such as Static Var Compensators (SVCs) has received wide interest, since many technical studies have proven their effect in damping system oscillations and enhancing stability. The transients at each main generating or load bus are examined in all inter-tie links, and possible means of solving the transient over-voltage problem using different FACTS element models are briefly reviewed.
Keywords: transient stability, static var compensator, central-west interconnected system, damping controller, Saudi Electricity Company
Procedia PDF Downloads 611
2378 Improving Our Understanding of the in vivo Modelling of Psychotic Disorders
Authors: Zsanett Bahor, Cristina Nunes-Fonseca, Gillian L. Currie, Emily S. Sena, Lindsay D.G. Thomson, Malcolm R. Macleod
Abstract:
Psychosis is ranked as the third most disabling medical condition in the world by the World Health Organization. Despite a substantial amount of research in recent years, the available treatments are not universally effective and have a wide range of adverse side effects. Since many clinical drug candidates are identified through in vivo modelling, a deeper understanding of these models, with their strengths and limitations, might help explain the difficulties of psychosis drug development. To provide an unbiased summary of the preclinical psychosis literature, we performed a systematic electronic search of PubMed for publications modelling a psychotic disorder in vivo, identifying 14,721 relevant studies. Double screening of 11,000 publications from this dataset has so far identified 2,403 animal studies of psychosis, with schizophrenia the most commonly modelled disorder (95%). Of these models, 61% are induced using pharmacological agents, and across all models only 56% of publications test a therapeutic treatment. We propose a systematic review of these studies to assess the prevalence of reporting of measures to reduce the risk of bias, and a meta-analysis to assess the internal and external validity of these animal models. Our findings are likely to be relevant to future preclinical studies of psychosis, as this generation of strong empirical evidence has the potential to identify weaknesses and areas for improvement and to suggest refinements of experimental design. Such a detailed understanding of the data that inform what we think we know will help improve the current attrition rate between bench and bedside in psychosis research.
Keywords: animal models, psychosis, systematic review, schizophrenia
Procedia PDF Downloads 292
2377 Electrochemical APEX for Genotyping MYH7 Gene: A Low Cost Strategy for Minisequencing of Disease Causing Mutations
Authors: Ahmed M. Debela, Mayreli Ortiz , Ciara K. O´Sullivan
Abstract:
The completion of the Human Genome Project (HGP) has paved the way for mapping the diversity of the overall genome sequence, which helps in understanding the genetic causes of inherited diseases and susceptibility to drugs or environmental toxins. Arrayed primer extension (APEX) is a microarray-based minisequencing strategy for screening disease-causing mutations. It is derived from Sanger DNA sequencing and uses fluorescently labelled dideoxynucleotides (ddNTPs) to terminate a DNA strand growing from a primer whose 3´-end is designed immediately upstream of a site where a single nucleotide polymorphism (SNP) occurs. The use of DNA polymerase gives APEX very high accuracy and specificity, which makes it a method of choice for multiplex SNP detection. Coupling the high specificity of this method with the high sensitivity, low cost, and suitability for miniaturization of electrochemical techniques offers an excellent platform for mutation detection as well as for sequencing DNA templates. We are developing an electrochemical APEX for the analysis of SNPs found in the MYH7 gene in a group of cardiomyopathy patients. The ddNTPs were labelled with four different redox-active compounds with four distinct potentials. Thiolated oligonucleotide probes were immobilised on gold and glassy carbon substrates, followed by hybridisation with complementary target DNA just adjacent to the base to be extended by the polymerase. Electrochemical interrogation was performed after incorporation of the redox-labelled dideoxynucleotide. The work involved the synthesis and characterisation of the redox-labelled ddNTPs, the optimisation and characterisation of surface functionalisation strategies, and the nucleotide incorporation assays.
Keywords: array based primer extension, labelled ddNTPs, electrochemical, mutations
Procedia PDF Downloads 251
2376 Development and Validation Method for Quantitative Determination of Rifampicin in Human Plasma and Its Application in Bioequivalence Test
Authors: Endang Lukitaningsih, Fathul Jannah, Arief R. Hakim, Ratna D. Puspita, Zullies Ikawati
Abstract:
Rifampicin is a semisynthetic antibiotic derivative of rifamycin B produced by Streptomyces mediterranei. Rifampicin (RIF) has been used worldwide as a first-line drug prescribed throughout tuberculosis therapy. This study aims to develop and validate an HPLC method coupled with UV detection for the determination of rifampicin in spiked human plasma and to apply it in a bioequivalence study. Chromatographic separation was achieved on an RP-C18 column (LaChrom Hitachi, 250 x 4.6 mm, 5 μm), using a mobile phase of phosphate buffer/acetonitrile (55:45, v/v, pH 6.8 ± 0.1) at a flow rate of 1.5 mL/min. Detection was carried out at 337 nm using a spectrophotometer. The developed method was statistically validated for linearity, accuracy, limit of detection, limit of quantitation, precision, and specificity. The specificity of the method was ascertained by comparing chromatograms of blank plasma and plasma containing rifampicin; the matrix and rifampicin were well separated. The limit of detection and limit of quantification were 0.7 µg/mL and 2.3 µg/mL, respectively. The standard regression curve was linear (r > 0.999) over a concentration range of 20.0-100.0 µg/mL. The mean recovery of the method was 96.68 ± 8.06%. Both intraday and interday precision data showed good reproducibility (R.S.D. 2.98% and 1.13%, respectively). Therefore, the method can be used for routine analysis of rifampicin in human plasma and in bioequivalence studies. The validated method was successfully applied in a pharmacokinetic and bioequivalence study of a rifampicin tablet in a limited number of subjects (under Ethical Clearance No. KE/FK/6201/EC/2015). The mean values of Cmax, Tmax, AUC(0-24) and AUC(0-∞) for the test formulation of rifampicin were 5.81 ± 0.88 µg/mL, 1.25 hours, 29.16 ± 4.05 µg/mL·h and 29.41 ± 4.07 µg/mL·h, respectively. For the reference formulation, the values were 5.04 ± 0.54 µg/mL, 1.31 hours, 27.20 ± 3.98 µg/mL·h and 27.49 ± 4.01 µg/mL·h.
From the bioequivalence study, the 90% CIs for the test/reference formulation ratio of the logarithmically transformed Cmax and AUC(0-24) were 97.96-129.48% and 99.13-120.02%, respectively. According to the bioequivalence test guidelines of the European Commission and the European Medicines Agency, it can be concluded that the test formulation of rifampicin is bioequivalent to the reference formulation.
Keywords: validation, HPLC, plasma, bioequivalence
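The reported 90% CIs follow the standard average-bioequivalence recipe: form a t-based confidence interval on the mean log(test/reference) difference and exponentiate. A simplified paired-ratio sketch is below; the full study analysis would use a crossover ANOVA, the t critical value must be supplied externally (e.g. from tables, since the Python standard library has no t-distribution quantile), and the example numbers are illustrative, not the study's data.

```python
import math

def be_ci90(log_ratios, t_crit):
    """90% CI for the geometric mean test/reference ratio, returned as percentages.
    log_ratios: per-subject ln(test/reference) for Cmax or AUC."""
    n = len(log_ratios)
    mean = sum(log_ratios) / n
    var = sum((x - mean) ** 2 for x in log_ratios) / (n - 1)  # sample variance
    se = math.sqrt(var / n)                                   # standard error of the mean
    lo, hi = mean - t_crit * se, mean + t_crit * se
    return math.exp(lo) * 100, math.exp(mean) * 100, math.exp(hi) * 100
```

Bioequivalence is concluded when the whole interval lies within the regulatory 80-125% acceptance window.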
Procedia PDF Downloads 295
2375 The Effort of Nutrition Status Improvement through Partnership with Early Age Education Institution on Urban Region, City of Semarang, Indonesia
Authors: Oktia Woro Kasmini Handayani, Sri Ratna Rahayu, Efa Nugroho, Bertakalswa Hermawati
Abstract:
In Indonesia, from 2007 to 2013, the prevalence of overnutrition in children under five and of school age tended to increase, and the clean and healthy life behavior of school children supporting good nutrition status remained below the determined target. On the other hand, schools are an ideal place to educate children and to form health behavior, which should be initiated as early as possible (at the Early Age Education, PAUD, level). The objective of this research was to determine the effectiveness of an education model delivered through partnership with school institutions in an urban region of the city of Semarang, Central Java Province, Indonesia. The research used a quantitative approach supported with qualitative data. The population consisted of all mothers with school children aged 3-5 years within the research region; purposive sampling was used, yielding 237 mothers. The research instruments were a clean and healthy life behavior evaluation questionnaire and a video used as the education medium. The research used an experimental design; data were analyzed using the effectiveness criteria of Sugiyono and a paired-samples t-test. The education model showed a significant effect before versus after the intervention (p < 0.05), with an effectiveness of 79% (effective), but this remains below the expected target of 80%. The education model needs to be further utilized and its implementation optimized so that the expected target can be reached.
Keywords: nutrition status, early age education, clean and healthy life behavior, education model
Procedia PDF Downloads 388
2374 Designing Garments Ergonomically to Improve Life Quality of Elderly People
Authors: Nagda Ibrahim Mady, Shimaa Mohamed Atiha
Abstract:
The actual needs of elderly people, and the changes that accompany age in eyesight, hearing, dexterity, mobility, and memory, can make aged people unable to carry out the simplest daily activities, especially those related to clothing. These needs are largely neglected in the current clothing market, which obliges aged people to wear the available choices without any consideration of their actual desires and needs. The fashion designer has the experience to combine ergonomics with the stages of the fashion design process: to determine the actual needs of aged people and to answer those needs with designs that improve the quality of life of aged people while maintaining a good appearance. The fashion designer can thus help elderly people avoid the negative impacts of aging, whether psychological, kinetic, or cognitive. Ergonomics in clothing comprises the tools and mechanisms used to satisfy aged people, supporting them in living with the least time and effort, and providing the elderly with comfort as well as a good appearance that builds self-confidence and independence. From this point of view, the research seeks to improve the life of aged people by addressing functional clothes that make the elderly independent in dressing, providing in these designs comfort, quality, practicality, and economy; suggesting suitable fabrics and materials and applying them to the designs to help the elderly perform their daily living routines; and arriving at successful designs that are acceptable to specialists and to consumers, who confirm that the clothing supplies their needs, provides aesthetic and functional performance, and therefore gives them a better life.
Keywords: ergonomic, design garments, elderly people, life quality
Procedia PDF Downloads 572