Search results for: producer's classification accuracy
899 Artificial Neural Network Modeling of a Closed Loop Pulsating Heat Pipe
Authors: Vipul M. Patel, Hemantkumar B. Mehta
Abstract:
Technological innovations in the electronics world demand novel, compact, simple, low-cost, and effective heat transfer devices. The Closed Loop Pulsating Heat Pipe (CLPHP) is a passive phase-change heat transfer device with the potential to transfer heat quickly and efficiently from source to sink. The thermal performance of a CLPHP is governed by various parameters such as the number of U-turns, orientation, heat input, working fluid, and filling ratio. The present paper is an attempt to predict the thermal performance of a CLPHP using an Artificial Neural Network (ANN). Filling ratio and heat input are considered as input parameters, while thermal resistance is set as the target parameter. The types of neural networks considered in the present paper are radial basis, generalized regression, linear layer, cascade forward back propagation, feed forward back propagation, feed forward distributed time delay, layer recurrent, and Elman back propagation. Linear, logistic sigmoid, tangent sigmoid, and radial basis Gaussian functions are used as transfer functions. Prediction accuracy is measured against the experimental data reported by researchers in the open literature as a function of the Mean Absolute Relative Deviation (MARD). The prediction of a generalized regression ANN model with a spread constant of 4.8 is found to agree with the experimental data, with a MARD in the range of ±1.81%.
Keywords: ANN models, CLPHP, filling ratio, generalized regression, spread constant
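A minimal sketch of the accuracy metric named above, assuming the common definition of MARD as the mean of |predicted − measured| / |measured| expressed as a percentage (the abstract does not spell the formula out, and the thermal-resistance values below are hypothetical placeholders):

```python
import numpy as np

def mard(measured, predicted):
    """Mean Absolute Relative Deviation (%) between measured and predicted values."""
    measured = np.asarray(measured, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return 100.0 * np.mean(np.abs(predicted - measured) / np.abs(measured))

# Hypothetical thermal-resistance values (K/W) for illustration only
measured = [0.52, 0.48, 0.61, 0.55]
predicted = [0.53, 0.47, 0.60, 0.56]
print(f"MARD = {mard(measured, predicted):.2f}%")
```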
Procedia PDF Downloads 289

898 Experimental Studies of Sigma Thin-Walled Beams Strengthen by CFRP Tapes
Authors: Katarzyna Rzeszut, Ilona Szewczak
Abstract:
A review of selected methods of strengthening steel structures with carbon fiber reinforced polymer (CFRP) tapes and an analysis of the influence of composite materials on thin-walled steel elements are presented in this paper. The study also focuses on the problem of applying fast and effective strengthening methods to steel structures made of thin-walled profiles. It is worth noting that the issue of strengthening thin-walled structures is very complex, due to the inability to make welded joints in this type of element and the limited ability to apply mechanical fasteners. Moreover, structures made of thin-walled cross-sections demonstrate a high sensitivity to imperfections and a tendency to interactive buckling, which may substantially reduce the critical load capacity. Due to the lack of commonly used and recognized modern methods of strengthening thin-walled steel structures, the authors performed experimental studies of thin-walled sigma profiles strengthened with CFRP tapes. The paper presents the experimental stand and the preliminary results of laboratory tests concerning the analysis of the effectiveness of strengthening steel beams made of thin-walled sigma profiles with CFRP tapes. The study includes six beams made of cold-rolled sigma profiles with a height of 140 mm, a wall thickness of 2.5 mm, and a length of 3 m, subjected to a uniformly distributed load. Four beams were strengthened with carbon fiber tape Sika CarboDur S, while the other two were tested without strengthening to obtain reference results. Based on the obtained results, an evaluation of the suitability of the applied composite materials for strengthening thin-walled structures was performed.
Keywords: CFRP tapes, sigma profiles, steel thin-walled structures, strengthening
Procedia PDF Downloads 301

897 A Machine Learning-Based Model to Screen Antituberculosis Compound Targeted against LprG Lipoprotein of Mycobacterium tuberculosis
Authors: Syed Asif Hassan, Syed Atif Hassan
Abstract:
Multidrug-resistant tuberculosis (MDR-TB) is an infection caused by resistant strains of Mycobacterium tuberculosis that do not respond to either isoniazid or rifampicin, which are the most important anti-TB drugs. The increase in the occurrence of drug-resistant strains of MTB calls for an intensive search for novel target-based therapeutics. In this context, LprG (Rv1411c), a lipoprotein from MTB, plays a pivotal role in the immune evasion of Mtb, leading to survival and propagation of the bacterium within the host cell. Therefore, a machine learning method will be developed for generating a computational model that could predict the potential anti-LprG activity of novel antituberculosis compounds. The present study will utilize a dataset from the PubChem database maintained by the National Center for Biotechnology Information (NCBI). The dataset comprises compounds screened against MTB, categorized as active or inactive based upon the PubChem activity score. PowerMV, a molecular descriptor generator and visualization tool, will be used to generate the 2D molecular descriptors for the active and inactive compounds present in the dataset. The 2D molecular descriptors generated from PowerMV will be used as features. We feed these features into three different classifiers, namely, a random forest, a deep neural network, and a recurrent neural network, to build separate predictive models and choose the best-performing model based on the accuracy of predicting novel antituberculosis compounds with anti-LprG activity. Additionally, the efficacy of predicted active compounds will be screened using a SMARTS filter to choose molecules with drug-like features.
Keywords: antituberculosis drug, classifier, machine learning, molecular descriptors, prediction
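A minimal sketch of the screening pipeline described above, using scikit-learn's random forest as one of the three named classifiers (the descriptor matrix and labels are synthetic placeholders, not the PowerMV output format):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Placeholder for the PowerMV output: rows = compounds, columns = 2D descriptors
rng = np.random.default_rng(42)
X = rng.normal(size=(600, 50))                 # hypothetical descriptor matrix
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # hypothetical active/inactive labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42)

model = RandomForestClassifier(n_estimators=500, random_state=42)
model.fit(X_train, y_train)
print("Accuracy:", accuracy_score(y_test, model.predict(X_test)))
```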
Procedia PDF Downloads 389

896 Development of a Regression Based Model to Predict Subjective Perception of Squeak and Rattle Noise
Authors: Ramkumar R., Gaurav Shinde, Pratik Shroff, Sachin Kumar Jain, Nagesh Walke
Abstract:
Advancements in electric vehicles have significantly reduced powertrain noise and the number of moving components in vehicles. As a result, in-cab noises have become more noticeable to passengers inside the car. To ensure a comfortable ride for drivers and other passengers, it has become crucial to eliminate undesirable component noises during the development phase. Standard practices are followed to identify the severity of noises based on subjective ratings, but it can be a tedious process to identify the severity of each development sample and make changes to reduce it. Additionally, the severity rating can vary from jury to jury, making it challenging to arrive at a definitive conclusion. To address this, an automotive component was identified to evaluate a squeak and rattle noise issue. Physical tests were carried out for random and sine excitation profiles. The aim was to subjectively assess the noise using the jury rating method and objectively evaluate the same by measuring the noise. A suitable jury evaluation method was selected for the activity, and recorded sounds were replayed for jury rating. Objective sound quality metrics, viz. loudness, sharpness, roughness, fluctuation strength, and overall Sound Pressure Level (SPL), were measured. Based on this, correlation coefficients were established to identify the sound quality metrics that contribute most to the identified noise issue. Regression analysis was then performed to establish the correlation between the subjective and objective data. A mathematical model was prepared using artificial intelligence and machine learning algorithms. The developed model was able to predict the subjective rating with good accuracy.
Keywords: BSR, noise, correlation, regression
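A minimal sketch of the regression step described above, fitting subjective jury ratings against the five named sound quality metrics (all numbers are hypothetical placeholders; the abstract does not state which regression form was used, so ordinary least squares is assumed):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical rows: [loudness, sharpness, roughness, fluctuation strength, SPL]
X = np.array([[12.1, 1.8, 2.3, 0.9, 68.0],
              [15.4, 2.1, 2.9, 1.2, 72.5],
              [ 9.7, 1.5, 1.8, 0.7, 64.2],
              [18.2, 2.6, 3.4, 1.5, 75.8]])
y = np.array([6.5, 4.8, 7.9, 3.6])   # hypothetical jury ratings

model = LinearRegression().fit(X, y)
print("R^2:", model.score(X, y))
print("Predicted rating:", model.predict([[13.0, 1.9, 2.5, 1.0, 69.5]]))
```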
Procedia PDF Downloads 78

895 Design & Development of a Static-Thrust Test-Bench for Aviation/UAV Based Piston Engines
Authors: Syed Muhammad Basit Ali, Usama Saleem, Irtiza Ali
Abstract:
Internal combustion engines have been pioneers in the aviation industry, with piston engines used for aircraft propulsion from propeller-driven biplanes to turboprop, commercial, and cargo airliners. To provide an adequate amount of thrust, a piston engine rotates the propeller at a specific rpm, allowing enough mass airflow. Thrust is the only forward-acting force of an aircraft and helps heavier-than-air bodies to fly. Test benches have been a benchmark in the aerospace industry for analysing results before a flight and hold paramount significance in reliability and safety engineering, depending on the mathematical model, the variables included in it, and correct measurement. The calculation of thrust from a piston engine also depends on environmental changes, the diameter of the propeller, and the density of air. The project is centered on piston engines used in the aviation industry for light aircraft and UAVs. A static thrust test bench involves various units, each performing a designed purpose, to monitor and display. Static thrust tests are performed on the ground, and safety concerns hold paramount importance. The execution of this study involved research, design, manufacturing, and results based on reverse engineering, initiating from virtual design, analytical analysis, and simulations. The final evaluation compared results gathered from various methods, such as the correlation between a conventional mass-spring scale and a digital load cell. On average, we recorded 17.5 kg of thrust (25+ engine run-ups, around 40 hours of engine running), with only 10% deviation from the analytically calculated thrust, providing 90% accuracy.
Keywords: aviation, aeronautics, static thrust, test bench, aircraft maintenance
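A minimal sketch of an analytical static-thrust estimate of the kind the measurements could be checked against, using the standard propeller relation T = C_T · ρ · n² · D⁴ (the thrust coefficient and the other numbers are hypothetical placeholders; the abstract does not state which analytical model the authors used):

```python
def static_thrust(ct, rho, n_rps, diameter_m):
    """Static thrust (N) from the standard propeller relation T = Ct * rho * n^2 * D^4."""
    return ct * rho * n_rps**2 * diameter_m**4

# Hypothetical values for illustration only
ct = 0.10          # static thrust coefficient (propeller-specific)
rho = 1.225        # air density at sea level, kg/m^3
n = 2700 / 60      # propeller speed: 2700 rpm -> revolutions per second
d = 0.80           # propeller diameter, m

thrust_n = static_thrust(ct, rho, n, d)
print(f"Estimated static thrust: {thrust_n:.1f} N ({thrust_n / 9.81:.1f} kgf)")
```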
Procedia PDF Downloads 408

894 Additive Weibull Model Using Warranty Claim and Finite Element Analysis Fatigue Analysis
Authors: Kanchan Mondal, Dasharath Koulage, Dattatray Manerikar, Asmita Ghate
Abstract:
This paper presents an additive reliability model using warranty data and Finite Element Analysis (FEA) data. Warranty data for any product gives insight into its underlying issues. It is often used by reliability engineers to build prediction models to forecast the failure rate of parts. But there is one major limitation in using warranty data for prediction: warranty periods constitute only a small fraction of the total lifetime of a product; most of the time they cover only the infant mortality and useful life zones of a bathtub curve. Predicting with warranty data alone in these cases does not generally provide results with the desired accuracy. The failure rate of a mechanical part is driven by random issues initially and by wear-out or usage-related issues at later stages of the lifetime. For better predictability of the failure rate, one needs to explore the failure rate behavior in the wear-out zone of the bathtub curve. Due to cost and time constraints, it is not always possible to test samples till failure, but FEA fatigue analysis can provide the failure rate behavior of a part well beyond the warranty period in less time and at lower cost. In this work, the authors propose an Additive Weibull Model, which makes use of both warranty and FEA fatigue analysis data for predicting failure rates. It involves modeling two data sets for a part, one with existing warranty claims and the other with fatigue life data. Hazard-rate-based Weibull estimation is used for modeling the warranty data, whereas S-N-curve-based Weibull parameter estimation is used for the FEA data. The two separate Weibull models' parameters are estimated and combined to form the proposed Additive Weibull Model for prediction.
Keywords: bathtub curve, fatigue, FEA, reliability, warranty, Weibull
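A minimal sketch of how two Weibull models combine additively, assuming the textbook form in which the hazard rates simply sum, so the reliability is the product of the two Weibull survival functions (the shape/scale values are hypothetical placeholders, not the paper's estimates):

```python
import numpy as np

def weibull_hazard(t, beta, eta):
    """Weibull hazard rate h(t) = (beta/eta) * (t/eta)**(beta - 1)."""
    return (beta / eta) * (t / eta) ** (beta - 1)

def additive_weibull(t, b1, e1, b2, e2):
    """Additive model: hazards sum, reliabilities multiply."""
    h = weibull_hazard(t, b1, e1) + weibull_hazard(t, b2, e2)
    r = np.exp(-(t / e1) ** b1 - (t / e2) ** b2)
    return h, r

t = np.array([500.0, 2000.0, 8000.0])      # hours in service
# Hypothetical parameters: early/random failures (beta < 1) vs. wear-out (beta > 1)
h, r = additive_weibull(t, b1=0.8, e1=20000.0, b2=3.5, e2=12000.0)
print("hazard:", h, "reliability:", r)
```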
Procedia PDF Downloads 72

893 Investigations of Bergy Bits and Ship Interactions in Extreme Waves Using Smoothed Particle Hydrodynamics
Authors: Mohammed Islam, Jungyong Wang, Dong Cheol Seo
Abstract:
The Smoothed Particle Hydrodynamics (SPH) method is a novel, meshless, Lagrangian numerical method that has shown promise in accurately predicting the hydrodynamics of water and structure interactions in violent flow conditions. The main goal of this study is to build confidence in the versatility of an SPH-based tool, to use it as a complement to physical model testing capabilities and to support the research needed for the performance evaluation of ships and offshore platforms exposed to extreme and harsh environments. In the current endeavor, an open-sourced SPH-based tool was used and validated for modeling and predicting the hydrodynamic interactions of a 6-DOF ship and bergy bits. The study involved modeling a modern generic drillship and simplified bergy bits in floating and towing scenarios and in regular and irregular wave conditions. The predictions were validated using model-scale measurements on a moored ship towed at multiple oblique angles approaching a floating bergy bit in waves. Overall, this study results in a thorough comparison between the model-scale measurements and the prediction outcomes from the SPH tool in terms of performance and accuracy. The SPH-predicted ship motions and forces were primarily within ±5% of the measurements. The velocity and pressure distributions and wave characteristics over the free surface depict realistic interactions of the wave, the ship, and the bergy bit. This work identifies and presents several challenges in preparing the input file, particularly in defining the mass properties of complex geometry, the computational requirements, and the post-processing of the outcomes.
Keywords: SPH, ship and bergy bit, hydrodynamic interactions, model validation, physical model testing
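A minimal sketch of the SPH ingredient at the core of such tools: the smoothing kernel that weights neighboring particles. The cubic spline form shown here is a common textbook choice; the abstract does not say which kernel the open-sourced tool uses, and the smoothing length is a placeholder:

```python
import numpy as np

def cubic_spline_kernel_3d(r, h):
    """Cubic spline (M4) SPH kernel in 3D; r = particle distance, h = smoothing length."""
    q = r / h
    sigma = 1.0 / (np.pi * h**3)          # 3D normalization constant
    if q < 1.0:
        return sigma * (1.0 - 1.5 * q**2 + 0.75 * q**3)
    elif q < 2.0:
        return sigma * 0.25 * (2.0 - q) ** 3
    return 0.0

# Kernel weight falls off smoothly and vanishes beyond 2h
h = 0.05                                   # smoothing length in meters (placeholder)
for r in (0.0, 0.04, 0.09, 0.12):
    print(f"W(r={r:.2f}) = {cubic_spline_kernel_3d(r, h):.1f}")
```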
Procedia PDF Downloads 130

892 Comparison of Different Hydrograph Routing Techniques in XPSTORM Modelling Software: A Case Study
Authors: Fatema Akram, Mohammad Golam Rasul, Mohammad Masud Kamal Khan, Md. Sharif Imam Ibne Amir
Abstract:
A variety of routing techniques are available to develop surface runoff hydrographs from rainfall. The selection of a runoff routing method is vital, as it is directly related to the type of watershed and the required degree of accuracy. Different modelling software packages are available to explore the rainfall-runoff process in urban areas. XPSTORM, a link-node based, integrated storm-water modelling software, has been used in this study to develop surface runoff hydrographs for a golf course area located in Rockhampton in Central Queensland, Australia. Four commonly used methods, namely SWMM runoff, Kinematic wave, Laurenson, and Time-Area, were employed to generate runoff hydrographs for the design storm of this study area. In the runoff mode of XPSTORM, the rainfall, infiltration, evaporation, and depression storage for sub-catchments were simulated, and the runoff from the sub-catchment to the collection node was calculated. The simulation results are presented, discussed, and compared. The total surface runoff generated by the SWMM runoff, Kinematic wave, and Time-Area methods was found to be reasonably close, which indicates that any of these methods can be used for developing a runoff hydrograph of the study area. The Laurenson method produces a comparatively smaller amount of surface runoff; however, it creates the highest peak surface runoff of all, which may be suitable for hilly regions. Although the Laurenson hydrograph technique is a widely accepted surface runoff routing technique in Queensland (Australia), extensive investigation with detailed topographic and hydrologic data is recommended in order to assess its suitability for use in the case study area.
Keywords: ARI, design storm, IFD, rainfall temporal pattern, routing techniques, surface runoff, XPSTORM
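A minimal sketch of the Time-Area idea named above: the runoff hydrograph is the convolution of the rainfall excess series with the catchment's time-area histogram (all numbers are hypothetical placeholders for illustration, not the golf course data):

```python
import numpy as np

# Rainfall excess per time step (mm) and the time-area histogram:
# the fraction of catchment area whose runoff reaches the outlet in each step.
rain_excess_mm = np.array([2.0, 6.0, 4.0, 1.0])
time_area_frac = np.array([0.2, 0.5, 0.3])        # must sum to 1.0

catchment_area_m2 = 50_000.0                      # placeholder catchment size
dt_s = 600.0                                      # 10-minute time step

# Convolve excess depth with the time-area fractions, convert mm -> m^3/s
runoff_depth_mm = np.convolve(rain_excess_mm, time_area_frac)
q_m3s = runoff_depth_mm / 1000.0 * catchment_area_m2 / dt_s
print("Outlet hydrograph (m^3/s):", np.round(q_m3s, 2))
```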
Procedia PDF Downloads 452

891 The Changing Landscape of Fire Safety in Covered Car Parks with the Arrival of Electric Vehicles
Authors: Matt Stallwood, Michael Spearpoint
Abstract:
In 2020, the UK government announced that sales of new petrol and diesel cars would end in 2030, and battery-powered cars made up 1 in 8 new cars sold in 2021, more than the total from the previous five years. Guidance across the UK for the fire safety design of covered car parks is changing in response to the projected rapid growth in electric vehicle (EV) use. This paper discusses the current knowledge on the fire safety concerns posed by EVs, in particular those powered by lithium-ion batteries, when considering the likelihood of vehicle ignition, fire severity, and the spread of fire to other vehicles. The paper builds on previous work that has investigated the frequency of fires starting in cars powered by internal combustion engines (ICE), the hazard posed by such fires in covered car parks, and the potential for neighboring vehicles to become involved in an incident. Historical data has been used to determine the ignition frequency of ICE car fires, whereas such data is scarce when it comes to EV fires. Should a fire occur, the fire development has conventionally been assessed to match a 'medium' growth rate and to have a 95th percentile peak heat release of 9 MW. The paper examines recent literature in which researchers have measured the burning characteristics of EVs, to assess whether these values need to be changed. These findings are used to assess the risk posed by EVs when compared to ICE vehicles. The paper examines the new design guidance being issued by various organizations across the UK, such as fire and rescue services, insurers, local government bodies, and regulators, and discusses the impact it is having on the arrangement of parking bays, particularly in residential and mixed-use buildings. For example, the paper illustrates how updated guidance published by the Fire Protection Association (FPA) on the installation of sprinkler systems has increased the hazard classification of parking buildings, which can have a considerable impact on the feasibility of a building meeting all its design intents when specifying water supply tanks. Guidance on the provision of smoke ventilation systems and structural fire resistance is also presented. The paper points to where further research is needed on the fire safety risks posed by EVs in covered car parks. This will ensure that any guidance is commensurate with the need to provide an adequate level of life and property safety in the built environment.
Keywords: covered car parks, electric vehicles, fire safety, risk
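A minimal sketch of what a 'medium' growth rate means in design-fire terms, assuming the common t-squared idealization Q = αt² with the standard medium growth coefficient α ≈ 0.01172 kW/s² (the abstract itself does not state which formulation the cited guidance uses):

```python
import math

ALPHA_MEDIUM = 0.01172   # kW/s^2, standard "medium" t-squared growth coefficient

def hrr_kw(t_s, alpha=ALPHA_MEDIUM):
    """Heat release rate Q = alpha * t^2 for a t-squared design fire."""
    return alpha * t_s**2

def time_to_reach(q_kw, alpha=ALPHA_MEDIUM):
    """Time (s) for a t-squared fire to reach a target heat release rate."""
    return math.sqrt(q_kw / alpha)

t_peak = time_to_reach(9000.0)   # the 9 MW peak cited in the abstract
print(f"9 MW reached after {t_peak:.0f} s (~{t_peak / 60:.1f} min)")
```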
Procedia PDF Downloads 72

890 Analysis of Travel Behavior Patterns of Frequent Passengers after the Section Shutdown of Urban Rail Transit - Taking the Huaqiao Section of Shanghai Metro Line 11 Shutdown During the COVID-19 Epidemic as an Example
Authors: Hongyun Li, Zhibin Jiang
Abstract:
The travel of passengers in an urban rail transit network is influenced by changes in the network structure and operational status, and the response of individual travel preferences to these changes also varies. Firstly, the influence of the suspension of an urban rail transit line section on passenger travel along the line is analyzed. Secondly, passenger travel trajectories containing multi-dimensional semantics are described based on network UD data. Next, passenger panel data based on spatio-temporal sequences is constructed to achieve frequent-passenger clustering. Then, a Graph Convolutional Network (GCN) is used to model and identify the changes in travel modes of different types of frequent passengers. Finally, taking Shanghai Metro Line 11 as an example, the travel behavior patterns of frequent passengers after the Huaqiao section shutdown during the COVID-19 epidemic are analyzed. The results showed that after the section shutdown, most passengers transferred to the nearest Anting station for boarding, while some passengers transferred to other stations for boarding or canceled their travel directly. Among the passengers who transferred to Anting station for boarding, most maintained their original regular travel mode, a small number of passengers waited a few days before transferring to Anting station for boarding, and only a few passengers stopped traveling or transferred to other stations after a few days of boarding at Anting station. The results can provide a basis for understanding urban rail transit passenger travel patterns and improving the accuracy of passenger flow prediction in abnormal operation scenarios.
Keywords: urban rail transit, section shutdown, frequent passenger, travel behavior pattern
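A minimal sketch of the GCN building block mentioned above: one graph-convolution layer propagating node features over a station adjacency matrix. The Kipf-Welling normalized form H' = σ(D^-1/2 (A+I) D^-1/2 H W) is assumed, since the abstract does not specify the variant, and all matrices are random placeholders:

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph convolution: H' = ReLU(D^-1/2 (A+I) D^-1/2 H W)."""
    A_hat = A + np.eye(A.shape[0])                 # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))  # degree normalization
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(A_norm @ H @ W, 0.0)         # ReLU activation

rng = np.random.default_rng(0)
A = (rng.random((5, 5)) > 0.6).astype(float)       # toy 5-station adjacency
A = np.maximum(A, A.T)                             # make it symmetric
H = rng.normal(size=(5, 8))                        # 8 input features per node
W = rng.normal(size=(8, 4))                        # 4 output features per node
print(gcn_layer(A, H, W).shape)                    # -> (5, 4)
```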
Procedia PDF Downloads 84

889 The Impact of BIM Technology on the Whole Process Cost Management of Civil Engineering Projects in Kenya
Authors: Nsimbe Allan
Abstract:
The study examines the impact of Building Information Modeling (BIM) on the cost management of engineering projects, focusing specifically on the Mombasa Port Area Development Project. The objective of this research is to determine the mechanisms through which BIM facilitates stakeholder collaboration, reduces construction-related expenses, and enhances the precision of cost estimation. Furthermore, the study investigates barriers to execution, assesses the impact on project transparency, and suggests approaches to maximize resource utilization. The study, selected for its practical significance and intricate nature, conducted a Systematic Literature Review (SLR) using credible databases, including ScienceDirect and IEEE Xplore. To constitute a diverse sample, 69 individuals, including project managers, cost estimators, and BIM administrators, were selected via stratified random sampling. The data were obtained using a mixed-methods approach that prioritized ethical considerations. SPSS and Microsoft Excel were used for the analysis. The research emphasizes the crucial role that project managers, architects, and engineers play in the decision-making process (47% of respondents). Furthermore, a significant improvement in cost estimation accuracy was reported by 70% of the participants. It was found that the implementation of BIM resulted in enhanced project visibility, which in turn optimized resource allocation and facilitated the budgeting process. In brief, the study highlights the positive impacts of BIM on collaborative decision-making and cost estimation, addresses challenges related to implementation, and provides solutions for the efficient assimilation and understanding of BIM principles.
Keywords: cost management, resource utilization, stakeholder collaboration, project transparency
Procedia PDF Downloads 66

888 Challenges of eradicating neglected tropical diseases
Authors: Marziye Hadian, Alireza Jabbari
Abstract:
Background: Each year, tropical diseases affect large numbers of tropical or subtropical populations and give rise to irreparable financial and human damage. Among these diseases, some are known as Neglected Tropical Diseases (NTDs) that may cause unusual dangers; however, they have not been appropriately accounted for. Taking into account the priority of eradicating these diseases, this study explored the causes of failure to eradicate neglected tropical diseases. Method: This study was a systematized review conducted in January 2021 on articles related to neglected tropical diseases in the Web of Science, PubMed, Scopus, Science Direct, Ovid, ProQuest, and Google Scholar databases. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines were followed, together with the Critical Appraisal Skills Programme (CASP) for articles and AACODS (Authority, Accuracy, Coverage, Objectivity, Date, Significance) for grey literature, which provides criteria for judging the quality of grey information. Findings: The challenges in controlling and eradicating neglected tropical diseases fall into four general themes, namely shortcomings in disease management policies and programs, environmental challenges, and executive challenges in the policy and research fields, with 36 sub-themes. Conclusion: To achieve the goals of eradicating neglected tropical diseases, it seems indispensable to free up financial, human, and research resources, manage health infrastructure properly, pay attention to migrants and refugees, set clear targets, prioritize appropriately to local conditions, and pay special attention to political and social developments. Reducing the number of diseases should free up resources for the management of neglected tropical diseases prone to epidemics, such as dengue, chikungunya, and leishmaniasis. For the purpose of global support, targeting should be accurate.
Keywords: neglected tropical disease, NTD, preventive, eradication
Procedia PDF Downloads 129

887 One Decade Later: The Conundrum of Unrecognized Asherman Syndrome
Authors: Maria Francesca Lavadia-Gumabao, Mary Antoinette Salvamante-Torallo
Abstract:
Introduction: The fibrous intrauterine adhesions that form inside the uterus and/or cervix in Asherman syndrome can obstruct the internal cervical orifice and may present as a case of outflow tract obstruction. Asherman syndrome is often overlooked, since it has no specific presentation and is undetectable by routine physical examinations or diagnostic procedures such as ultrasound. This paper highlights a delayed and elusive diagnosis of Asherman syndrome, which negatively impacted the patient's fertility and quality of life. Case presentation: A 33-year-old woman (gravida 3, para 3) who presented with secondary amenorrhea for thirteen years associated with cyclic pelvic pain and secondary infertility sought consultation at our institution for evaluation and specialty management. The patient had no other well-established risk factors for Asherman syndrome aside from pregnancy. For more than a decade, she delayed seeking medical care. At presentation, history taking, physical examination, and ultrasound were not helpful in identifying the cause of the outflow tract obstruction. Diagnostic hysteroscopy was then performed, during which extensive scarring and fibrosis completely obscuring the internal cervical orifice were observed, consistent with the diagnosis of Asherman syndrome (Grade 5B). The patient then underwent ultrasound-guided hysteroscopic outflow tract dilatation and responded well to the treatment, as she had her menstrual period a month after the procedure, no longer had cyclic pelvic pain, and a repeat ultrasound showed an unremarkable uterus. The histopathology of the retrieved tissues revealed myometrial fragments with associated old hemorrhage and benign endometrial stromal tissue, which failed to show endometrial glands. Conclusion: The delayed and elusive diagnosis of Asherman syndrome can be brought about by poor health-seeking behavior of patients and the difficulty of detecting this condition by routine physical examinations or diagnostic procedures such as ultrasound. It is, therefore, necessary to include Asherman syndrome in the differential diagnosis of secondary amenorrhea and secondary infertility. With expertise in hysteroscopy, early diagnosis, proper classification, and optimal management can improve patient outcomes.
Keywords: Asherman syndrome, outflow tract obstruction, secondary amenorrhea, infertility, hysteroscopy
Procedia PDF Downloads 9

886 A Framework on Data and Remote Sensing for Humanitarian Logistics
Authors: Vishnu Nagendra, Marten Van Der Veen, Stefania Giodini
Abstract:
Effective humanitarian logistics operations are a cornerstone of successful disaster relief operations. However, to be effective, they need to be demand-driven and supported by adequate data for prioritization. Without this data, operations are carried out in an ad hoc manner and eventually become chaotic. The current availability of geospatial data helps in creating models for predictive damage and vulnerability assessment, which can be of great advantage to logisticians in gaining an understanding of the nature and extent of the disaster damage. This translates into actionable information on the demand for relief goods, the state of the transport infrastructure, and, subsequently, the priority areas for relief delivery. However, due to the unpredictable nature of disasters, the accuracy of the models needs improvement, which can be done using remote sensing data from UAVs (Unmanned Aerial Vehicles) or satellite imagery, which again come with certain limitations. This research addresses the need for a framework to combine data from different sources to support humanitarian logistics operations and prediction models. The focus is on developing a workflow to combine data from satellites and UAVs after a disaster strikes. A three-step approach is followed: first, the data requirements for logistics activities are made explicit by carrying out semi-structured interviews with field logistics workers. Second, the limitations of current data collection tools are analyzed to develop workaround solutions following a systems design approach. Third, the data requirements and the developed workaround solutions are fitted together into a coherent workflow. The outcome of this research will provide a new method for logisticians to have immediately accurate and reliable data to support data-driven decision making.
Keywords: unmanned aerial vehicles, damage prediction models, remote sensing, data driven decision making
Procedia PDF Downloads 377

885 Document-level Sentiment Analysis: An Exploratory Case Study of Low-resource Language Urdu
Authors: Ammarah Irum, Muhammad Ali Tahir
Abstract:
Document-level sentiment analysis in Urdu is a challenging Natural Language Processing (NLP) task due to the difficulty of working with lengthy texts in a language with constrained resources. Deep learning models, which are complex neural network architectures, are well suited to text-based applications in addition to data formats like audio, image, and video. To investigate the potential of deep learning for Urdu sentiment analysis, we implemented five different deep learning models, including Bidirectional Long Short-Term Memory (BiLSTM), Convolutional Neural Network (CNN), Convolutional Neural Network with Bidirectional Long Short-Term Memory (CNN-BiLSTM), and Bidirectional Encoder Representations from Transformers (BERT). In this study, we developed a hybrid deep learning model called BiLSTM-Single Layer Multi Filter Convolutional Neural Network (BiLSTM-SLMFCNN) by fusing the BiLSTM and CNN architectures. The proposed and baseline techniques were applied to the Urdu Customer Support data set and the IMDB Urdu movie review data set using pre-trained Urdu word embeddings suitable for sentiment analysis at the document level. The results of these techniques were evaluated, and our proposed model outperforms all other deep learning techniques for Urdu sentiment analysis. BiLSTM-SLMFCNN outperformed the baseline deep learning models and achieved 83%, 79%, 83%, and 94% accuracy on the small, medium, and large sized IMDB Urdu movie review data sets and the Urdu Customer Support data set, respectively.
Keywords: urdu sentiment analysis, deep learning, natural language processing, opinion mining, low-resource language
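A minimal Keras sketch of the hybrid idea named above: a BiLSTM layer feeding a single convolutional layer that applies several filter widths in parallel (hyperparameters are hypothetical placeholders; the paper's exact architecture and pre-trained embedding are not reproduced here):

```python
from tensorflow.keras import layers, models

vocab_size, embed_dim, max_len = 50_000, 300, 400   # placeholder hyperparameters

inputs = layers.Input(shape=(max_len,))
x = layers.Embedding(vocab_size, embed_dim)(inputs)  # pre-trained weights would go here
x = layers.Bidirectional(layers.LSTM(128, return_sequences=True))(x)

# "Single layer, multi filter": parallel Conv1D branches with different widths
branches = [layers.GlobalMaxPooling1D()(layers.Conv1D(100, k, activation="relu")(x))
            for k in (3, 4, 5)]
x = layers.Concatenate()(branches)
outputs = layers.Dense(1, activation="sigmoid")(x)   # positive vs. negative review

model = models.Model(inputs, outputs)
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```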
Procedia PDF Downloads 70

884 Predicting Match Outcomes in Team Sport via Machine Learning: Evidence from National Basketball Association
Authors: Jacky Liu
Abstract:
This paper develops a team sports outcome prediction system with potential for wide-ranging applications across various disciplines. Despite significant advancements in predictive analytics, existing studies on sports outcome prediction have considerable limitations, including insufficient feature engineering and underutilization of advanced machine learning techniques, among others. To address these issues, we extend the Sports Cross Industry Standard Process for Data Mining (SRP-CRISP-DM) framework and propose a unique, comprehensive predictive system, using National Basketball Association (NBA) data as an example to test this extended framework. Our approach follows a holistic methodology in feature engineering, employing both time series and non-time-series data, as well as conducting exploratory data analysis and feature selection. Furthermore, we contribute to the discourse on the choice of target variable in team sports outcome prediction, asserting that point spread prediction yields higher profits than game-winner prediction. Using machine learning algorithms, particularly XGBoost, results in a significant improvement in the predictive accuracy of team sports outcomes. Applied to point spread betting strategies, it offers an astounding annual return of approximately 900% on an initial investment of $100. Our findings not only contribute to the academic literature but have critical practical implications for sports betting. Our study advances the understanding of team sports outcome prediction, a burgeoning area in complex system prediction, and paves the way for potential profitability and more informed decision making in sports betting markets.
Keywords: machine learning, team sports, game outcome prediction, sports betting, profits simulation
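A minimal sketch of the point-spread regression step, using the XGBoost library named in the abstract (features, targets, and hyperparameters are synthetic placeholders; the paper's engineered feature set is not reproduced):

```python
import numpy as np
from xgboost import XGBRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(7)
# Placeholder features: e.g., rolling team ratings, rest days, home advantage
X = rng.normal(size=(1000, 6))
y = X @ np.array([4.0, -2.0, 1.5, 0.5, 3.0, -1.0]) + rng.normal(0, 5, 1000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=7)
model = XGBRegressor(n_estimators=300, max_depth=4, learning_rate=0.05)
model.fit(X_tr, y_tr)
print("MAE (points):", mean_absolute_error(y_te, model.predict(X_te)))
```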
Procedia PDF Downloads 101

883 Pneumoperitoneum Creation Assisted with Optical Coherence Tomography and Automatic Identification
Authors: Eric Yi-Hsiu Huang, Meng-Chun Kao, Wen-Chuan Kuo
Abstract:
For every laparoscopic surgery, safe pneumoperitoneum creation (gaining access to the peritoneal cavity) is the first and essential step. However, closed pneumoperitoneum is usually obtained by blind insertion of a Veress needle into the peritoneal cavity, which may carry potential risks such as bowel and vascular injury. Until now, there has been no definite measure to visually confirm the position of the needle tip inside the peritoneal cavity. Therefore, this study established an image-guided Veress needle method by combining a fiber probe with optical coherence tomography (OCT). An algorithm was also proposed for determining the exact location of the needle tip through the acquisition of OCT images. Our method not only generates a series of 'live' two-dimensional (2D) images during the needle puncture toward the peritoneal cavity but can also eliminate operator variation in image judgment, thus improving peritoneal access safety. This study was approved by the Ethics Committee of Taipei Veterans General Hospital (Taipei VGH IACUC 2020-144). A total of 2400 in vivo OCT images, independent of each other, were acquired from experiments of forty peritoneal punctures on two piglets. Characteristic OCT image patterns could be observed during the puncturing process. The ROC curve demonstrates the discrimination capability of these quantitative image features, showing that the accuracy of the classifier for determining inside vs. outside of the peritoneal cavity was 98% (AUC = 0.98). In summary, the present study demonstrates the ability of the combination of our proposed automatic identification method and OCT imaging to automatically and objectively identify the location of the needle tip. OCT imaging translates the blind closed technique of peritoneal access into a visualized procedure, thus improving peritoneal access safety.
Keywords: pneumoperitoneum, optical coherence tomography, automatic identification, veress needle
Procedia PDF Downloads 133

882 Reasons for Lack of an Ideal Disinfectant after Dental Treatments
Authors: Ilma Robo, Saimir Heta, Rialda Xhizdari, Kers Kapaj
Abstract:
Background: The ideal disinfectant for surfaces, instruments, air, and skin, both in dentistry and in the fields of medicine, does not exist. This is for the sole reason that all the characteristics of an ideal disinfectant cannot be contained in one: if one of them is emphasized, it will conflict with another. A disinfectant must be stable and not affected by changes in the environmental conditions where it is kept, which means that it should not be affected by an increase in temperature or in the humidity of the environment. Both of these elements contradict another element of the ideal disinfectant, as they disrupt the solubility ratio of the base substance of the disinfectant to the diluent. Material and methods: The study aims to extract the constant of each disinfectant/antiseptic used during dental disinfection protocols, accompanied by the side effects on the surface of the skin or mucosa where it is applied as an antiseptic. In the end, attempts were made to draw conclusions about the best possible combination of disinfectants after a dental procedure, based on data extracted from the basic literature covered during the pharmacology module in the formation of a dentist, checked against data published in the literature. Results: The sensitivity of a disinfectant to changes in the atmospheric conditions of the environment where it is kept is a known fact. Care regarding this element is always accompanied by advice on the application of the specific disinfectant, in order to obtain the desired clinical result. The constants of disinfectants, according to the classification based on the data collected and presented, are: alcohols 70-120, glycols 0.2, aldehydes 30-200, phenols 15-60, acids 100, povidone iodine halogens 5-75, hypochlorous acid halogens 150, sodium hypochlorite halogens 30-35, oxidants 18-60, and metals 0.2-10. Halogens should be singled out, where specific results were obtained according to the representatives of this class, since it is these representatives that find scope for clinical application in dentistry. Conclusions: The search for the 'ideal', even when its defining criteria are established, not only for disinfectants but also for any medication or pharmaceutical product, is an ongoing search, without definitive results. In this mine of data in the published literature, if there is something fixed and calculable, such as the specific constant for disinfectants, the search for the ideal becomes more concrete. During disinfection protocols, different disinfectants are applied since the fields of action differ, including water, air, aspiration devices, and tools, with disinfectants used in full accordance with the production indications.
Keywords: disinfectant, constant, ideal, side effects
Procedia PDF Downloads 67

881 Approaches to Valuing Ecosystem Services in Agroecosystems From the Perspectives of Ecological Economics and Agroecology
Authors: Sandra Cecilia Bautista-Rodríguez, Vladimir Melgarejo
Abstract:
Climate change, loss of ecosystems, increasing poverty, increasing marginalization of rural communities, and declining food security are global issues that require urgent attention. In this regard, a great deal of research has focused on how agroecosystems respond to these challenges, as they provide ecosystem services (ES) that lead to higher levels of resilience, adaptation, productivity, and self-sufficiency. Hence, the valuation of ecosystem services plays an important role in the decision-making process for the design and management of agroecosystems. This paper aims to define the link between ecosystem service valuation methods and ES value dimensions in agroecosystems from the perspectives of ecological economics and agroecology. The method used to identify valuation methodologies was a literature review in the fields of agroecology and ecological economics, based on a strategy of information search and classification. The conceptual framework of the work is based on the multidimensionality of value, considering the social, ecological, political, technological, and economic dimensions. Likewise, the valuation process requires consideration of the ecosystem functions associated with ES, such as regulation, habitat, production, and information functions. In this way, valuation methods for ES in agroecosystems can integrate more than one value dimension and at least one ecosystem function. The results allow the ecosystem functions to be correlated with the ecosystem services valued, the specific tools or models used, and the dimensions and valuation methods. The main methodologies identified are multi-criteria valuation (1), deliberative-consultative valuation (2), valuation based on system dynamics modeling (3), valuation through energy or biophysical balances (4), valuation through fuzzy logic modeling (5), and valuation based on agent-based modeling (6). Among the main conclusions, it is highlighted that the system dynamics modeling approach has high potential for development in valuation processes, due to its ability to integrate other methods, especially multi-criteria valuation and energy and biophysical balances, and to describe through causal cycles the interrelationships between ecosystem services and the dimensions of value in agroecosystems, thus showing the relationships between the value of ecosystem services and the welfare of communities. As for methodological challenges, it is relevant to achieve the integration of the tools and models provided by different methods and to incorporate the characteristics of a complex system such as the agroecosystem, which would reduce the limitations in ES valuation processes.
Keywords: ecological economics, agroecosystems, ecosystem services, valuation of ecosystem services
Procedia PDF Downloads 122

880 Formation of the Investment Portfolio of Intangible Assets with a Wide Pairwise Comparison Matrix Application
Authors: Gulnara Galeeva
Abstract:
The Analytic Hierarchy Process is widely used in economic and financial studies, including the formation of investment portfolios. In this study, a generalized method of obtaining a vector of priorities is examined for the case where separate pairwise comparisons of expert opinion are presented as a set of several equal evaluations on a ratio scale. The author claims that this method allows solving an important and up-to-date problem: excluding the vagueness and ambiguity of expert opinion in decision-making theory. The study describes the authentic wide pairwise comparison matrix. Its application to the formation of an efficient investment portfolio of intangible assets for a small business enterprise with limited funding is considered. The proposed method has been successfully tested on the practical example of a functioning dental clinic. The result of the study confirms that the wide pairwise comparison matrix can be used as a simple and reliable method for forming enterprise investment policy. Moreover, a comparison between the method based on the wide pairwise comparison matrix and the classical Analytic Hierarchy Process was conducted. The results of the comparative analysis confirm the correctness of the method based on the wide matrix. The application of a wide pairwise comparison matrix also allows the statistical methods of experimental data processing to be widely used for obtaining the vector of priorities. The new method is accessible to ordinary users. Its application gives about the same accuracy as the classical hierarchy process. Financial directors of small and medium business enterprises get an opportunity to solve the problem of company investments without resorting to the services of analytical agencies specializing in such studies.
Keywords: analytic hierarchy process, decision processes, investment portfolio, intangible assets
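A minimal sketch of the classical AHP step that the wide-matrix method is compared against: extracting the priority vector as the normalized principal eigenvector of a pairwise comparison matrix (the 3×3 matrix here is a hypothetical example, not the dental clinic data):

```python
import numpy as np

def ahp_priorities(pairwise):
    """Priority vector = normalized principal eigenvector of the comparison matrix."""
    eigvals, eigvecs = np.linalg.eig(pairwise)
    principal = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
    return principal / principal.sum()

# Hypothetical comparisons of three intangible assets on Saaty's 1-9 ratio scale
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
print("priorities:", np.round(ahp_priorities(A), 3))
```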
Procedia PDF Downloads 264

879 Towards a Robust Patch Based Multi-View Stereo Technique for Textureless and Occluded 3D Reconstruction
Authors: Ben Haines, Li Bai
Abstract:
Patch-based reconstruction methods have been, and still are, among the top performing approaches to 3D reconstruction to date. Their local approach to refining the position and orientation of a patch, free of global minimisation and independent of surface smoothness, makes patch-based methods extremely powerful in recovering fine-grained detail of an object's surface. However, patch-based approaches still fail to faithfully reconstruct textureless or highly occluded surface regions; thus, though performing well under lab conditions, they deteriorate in industrial or real-world situations. They are also computationally expensive. Current patch-based methods generate point clouds with holes in textureless or occluded regions that require expensive energy minimisation techniques to fill and interpolate into a high-fidelity reconstruction. Such shortcomings hinder the adaptation of these methods for industrial applications, where object surfaces are often highly textureless and the speed of reconstruction is an important factor. This paper presents ongoing work towards a multi-resolution approach to address these problems, utilizing particle swarm optimisation to reconstruct high-fidelity geometry, and increasing robustness to textureless features through an adapted approach to normalised cross correlation. The work also aims to speed up the reconstruction using advances in GPU technologies and to remove the need for costly initialization and expansion. Through the combination of these enhancements, the intention of this work is to create denser patch clouds, even in textureless regions, within a reasonable time. Initial results show the potential of such an approach to construct denser point clouds with accuracy comparable to that of the current top-performing algorithms.
Keywords: 3D reconstruction, multiview stereo, particle swarm optimisation, photo consistency
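A minimal sketch of the photo-consistency measure named above: zero-mean normalised cross correlation between two image patches (the adapted variant the authors develop is not reproduced; this is the textbook baseline, with random placeholder patches):

```python
import numpy as np

def ncc(patch_a, patch_b, eps=1e-8):
    """Zero-mean normalised cross correlation of two equally sized patches, in [-1, 1]."""
    a = patch_a - patch_a.mean()
    b = patch_b - patch_b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + eps))

rng = np.random.default_rng(1)
p = rng.random((11, 11))              # reference patch
q = 0.8 * p + 0.1                     # affine intensity change: NCC should stay ~1
print(f"NCC = {ncc(p, q):.3f}")       # near 1.0 despite the brightness/contrast shift
```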
Procedia PDF Downloads 202

878 Recommendations for Teaching Word Formation for Students of Linguistics Using Computer Terminology as an Example
Authors: Svetlana Kostrubina, Anastasia Prokopeva
Abstract:
This research presents a comprehensive study of word formation processes in computer terminology in English and Russian and provides learners with a system of exercises for training these skills. Its originality is that the study takes a comparative approach, which reveals both general patterns and specific features of English and Russian computer term formation. The key point is the development of a system of exercises for training computer terminology based on Bloom's taxonomy. The data contain 486 units (228 English terms from the Glossary of Computer Terms and 258 Russian terms from the Terminological Dictionary-Reference Book). The objective is to identify the main affixation models in English and Russian computer term formation and to develop exercises. To achieve this goal, the authors employed Bloom's taxonomy as a methodological framework to create a systematic exercise program aimed at enhancing students' cognitive skills in analyzing, applying, and evaluating computer terms. The exercises are appropriate for various levels of learning, from basic recall of definitions to higher-order thinking skills, such as synthesizing new terms and critically assessing their usage in different contexts. The methodology also includes: a method of scientific and theoretical analysis for systematizing linguistic concepts and clarifying the conceptual and terminological apparatus; a method of nominative and derivative analysis for identifying word-formation types; a method of word-formation analysis for organizing linguistic units; a classification method for determining the structural types of abbreviations applicable to the field of computer communication; a quantitative analysis technique for determining the productivity of methods of forming abbreviations of computer vocabulary based on English and Russian computer terms; a technique of tabular data processing for visual presentation of the results obtained; and a technique of interlingual comparison for identifying common and distinct features of abbreviations of computer terms in Russian and English. The research shows that affixation retains its productivity in English and Russian computer term formation. Bloom's taxonomy allows us to plan a training program and predict the effectiveness of the compiled program based on an assessment of the teaching methods used.
Keywords: word formation, affixation, computer terms, Bloom's taxonomy
Procedia PDF Downloads 9

877 Obese and Overweight Women and Public Health Issues in Hillah City, Iraq
Authors: Amean A. Yasir, Zainab Kh. A. Al-Mahdi Al-Amean
Abstract:
In both developed and developing countries, obesity among women is increasing, but in different patterns and at very different speeds. It may have a negative effect on health, leading to reduced life expectancy and/or increased health problems. This research studied the age distribution among obese women, the types of overweight and obesity, the extent of the problem of overweight/obesity, and the etiological factors of obesity among women in Hillah city in central Iraq. A total of 322 overweight and obese women were included in the study; these women were randomly selected. The Body Mass Index (BMI) was used as the indicator of overweight/obesity. The incidence of overweight/obesity among age groups was estimated; the etiological factors included genetic, environmental, genetic/environmental, and endocrine disease. The overweight and obese women were screened for incidence of infection and/or disease. The study found that among the 322 overweight and obese women in Hillah city in central Iraq, the prevalence of overweight and obesity was 19.25% and 80.78%, respectively. The obesity types were recorded, based on BMI and the WHO classification, as class 1 obesity (29.81%), class 2 obesity (24.22%), and class 3 obesity (26.70%); this discrepancy was non-significant (P value < 0.05). The incidence of overweight was highest among women aged 20-29 years (90.32%), with 6.45% aged 30-39 years and 3.22% aged ≥ 60 years, while the incidence of obesity was 20.38% in the 20-29 years age group, 17.30% in the 30-39 years group, 23.84% in the 40-49 years group, 16.92% in the 50-59 years group, and 21.53% in the ≥ 60 years age group. These results confirm that age can be considered a significant factor for obesity types (P value < 0.0001). The results also showed that both genetic and environmental factors were responsible for incidents of overweight or obesity (84.78%, P value < 0.0001). The results also recorded cases of different repeated infections (skin infection, recurrent UTI, and influenza), cancer, gallstones, high blood pressure, type 2 diabetes, and infertility. Weight stigma and bias generally refer to negative attitudes; obesity can affect quality of life, and the results of this study recorded depression among overweight or obese women. This can lead to sexual problems, shame and guilt, social isolation, and reduced work performance. Overweight and obesity are real problems among women of all age groups, are associated with the risk of diseases and infection, and negatively affect quality of life. These results warrant further studies into the prevalence of obesity among women in Hillah city in central Iraq and the immune response of obese women.
Keywords: obesity, overweight, Iraq, body mass index
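A minimal sketch of the BMI indicator used above, with the standard WHO adult cut-offs that correspond to the overweight and obesity classes reported in the abstract (the subject values are hypothetical):

```python
def bmi(weight_kg, height_m):
    """Body Mass Index = weight (kg) / height (m) squared."""
    return weight_kg / height_m**2

def who_class(bmi_value):
    """WHO adult categories matching the classes used in the abstract."""
    if bmi_value < 25.0:
        return "not overweight"
    if bmi_value < 30.0:
        return "overweight"
    if bmi_value < 35.0:
        return "class 1 obesity"
    if bmi_value < 40.0:
        return "class 2 obesity"
    return "class 3 obesity"

b = bmi(92.0, 1.60)                          # hypothetical subject
print(f"BMI = {b:.1f} -> {who_class(b)}")    # BMI = 35.9 -> class 2 obesity
```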
Procedia PDF Downloads 385

876 Shifting to Electronic Operative Notes in Plastic surgery
Authors: Samar Mousa, Galini Mavromatidou, Rebecca Shirley
Abstract:
Surgeons carry out numerous operations daily in a busy burns and plastic surgery department. Writing an accurate operation note with all the essential information is crucial for communication, not only within the plastics team but also with the multi-disciplinary team looking after the patient, including other specialties, nurses, and GPs. The Royal College of Surgeons of England, in its Good Surgical Practice guidelines, states that the surgeon should ensure that there are clear (preferably typed) operative notes for every procedure. The notes should accompany the patient into recovery and to the ward and should give sufficient detail to enable continuity of care by another doctor. The notes should include the date and time, elective/emergency procedure, names of the operating surgeon and assistant, name of the theatre anaesthetist, operative procedure carried out, incision, operative diagnosis, operative findings, any problems/complications, any extra procedure performed and the reason why it was performed, details of tissue removed, added or altered, identification of any prosthesis used, including the serial numbers of prostheses and other implanted materials, details of closure technique, anticipated blood loss, antibiotic prophylaxis (where applicable), DVT prophylaxis (where applicable), detailed postoperative care instructions, and signature. Fourteen random days were chosen in December 2021 to assess the accuracy of operative notes and post-operative care. A total of 163 operative notes were examined. The average completion rate across all domains was 85.4%. An electronic operative note template was designed to cover all the domains mentioned in the Royal College of Surgeons' Good Surgical Practice. It is kept on the hospital drive for all surgeons to use.
Keywords: operative notes, plastic surgery, documentation, electronic
Procedia PDF Downloads 78

875 Human-Machine Cooperation in Facial Comparison Based on Likelihood Scores
Authors: Lanchi Xie, Zhihui Li, Zhigang Li, Guiqiang Wang, Lei Xu, Yuwen Yan
Abstract:
Image-based facial features can be classified into category recognition features and individual recognition features. Current automated face recognition systems extract a specific feature vector of some dimension from a facial image according to their pre-trained neural network. However, to improve the efficiency of parameter calculation, an algorithm generally reduces the image detail by pooling. This operation overlooks the details that forensic experts care about most. In our experiment, we adopted a variety of face recognition algorithms based on deep learning and compared a large number of naturally collected face images with known frontal ID photos of the same persons. Downscaling and manual handling were performed on the test images. The results supported the view that facial recognition algorithms based on deep learning detect structural and morphological information and rarely focus on specific markers such as stains and moles. Overall performance, the distributions of genuine scores and impostor scores, and likelihood ratios were tested to evaluate the accuracy of the biometric systems and the forensic experts. Experiments showed that the biometric systems were skilled at distinguishing category features, while forensic experts were better at discovering the individual features of human faces. In the proposed approach, fusion was performed at the score level. At the specified false accept rate, the framework achieved a lower false reject rate. This paper contributes to improving the interpretability of the objective method of facial comparison and provides a novel method for human-machine collaboration in this field.
Keywords: likelihood ratio, automated facial recognition, facial comparison, biometrics
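A minimal sketch of the likelihood-ratio idea used above: fit the genuine (same-person) and impostor (different-person) score distributions, then report LR = p(score | same) / p(score | different) for a new comparison score (the Gaussian fits and all scores are hypothetical placeholders):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
genuine = rng.normal(0.80, 0.07, 500)    # placeholder same-person scores
impostor = rng.normal(0.35, 0.10, 500)   # placeholder different-person scores

def likelihood_ratio(score):
    """LR = p(score | same person) / p(score | different persons)."""
    p_same = norm.pdf(score, genuine.mean(), genuine.std())
    p_diff = norm.pdf(score, impostor.mean(), impostor.std())
    return p_same / p_diff

for s in (0.4, 0.6, 0.8):
    print(f"score={s:.1f}  LR={likelihood_ratio(s):.2f}")
```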
Procedia PDF Downloads 128

874 GNSS Aided Photogrammetry for Digital Mapping
Authors: Muhammad Usman Akram
Abstract:
This research work is based on GNSS-aided photogrammetry for digital mapping. It focuses on the topographic survey of an area or site which is to be used in future planning and development (P&D) or for further examination, exploration, research, and inspection. Surveying and mapping in hard-to-access and hazardous areas are very difficult using traditional techniques and methodologies; they are also time consuming, labor intensive, and less precise, with limited data. In comparison, the advanced techniques save manpower and provide more precise output with a wide variety of multiple data sets. In this experimentation, the aerial photogrammetry technique is used: a UAV flies over an area and captures geocoded images to build a three-dimensional model (3D model). The UAV operates on a user-specified path or area with various parameters: flight altitude, ground sampling distance (GSD), image overlap, camera angle, etc. For ground control, a network of points on the ground is observed as Ground Control Points (GCPs) using a Differential Global Positioning System (DGPS) in PPK or RTK mode. Furthermore, the raw data collected by the UAV and DGPS are processed in various digital image processing programs and computer-aided design software. As output we obtain a dense point cloud, a Digital Elevation Model (DEM), and an orthophoto. The imagery is converted into geospatial data by digitizing over the orthophoto; the DEM is further converted into a Digital Terrain Model (DTM) for contour generation or a digital surface. As a result, we obtain a digital map of the area to be surveyed. In conclusion, we compared the processed data with exact measurements taken on site. The error is accepted if it does not breach the survey accuracy limits set by the concerned institutions.
Keywords: photogrammetry, post processing kinematics, real time kinematics, manual data inquiry
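A minimal sketch of the ground sampling distance parameter mentioned above, assuming the standard photogrammetric relation GSD = (sensor pixel size × flight height) / focal length (the camera numbers are hypothetical placeholders):

```python
def gsd_cm(pixel_size_um, flight_height_m, focal_length_mm):
    """Ground sampling distance (cm/pixel) from the standard photogrammetric relation."""
    pixel_size_mm = pixel_size_um / 1000.0
    return (pixel_size_mm * flight_height_m / focal_length_mm) * 100.0

# Hypothetical camera: 4.4 um pixels, 8.8 mm focal length, flying at 100 m
print(f"GSD = {gsd_cm(4.4, 100.0, 8.8):.1f} cm/pixel")   # -> 5.0 cm/pixel
```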
Procedia PDF Downloads 29873 Comparison of an Anthropomorphic PRESAGE® Dosimeter and Radiochromic Film with a Commercial Radiation Treatment Planning System for Breast IMRT: A Feasibility Study
Authors: Khalid Iqbal
Abstract:
This work presents a comparison of an anthropomorphic PRESAGE® dosimeter and radiochromic film measurements with a commercial treatment planning system to determine the feasibility of PRESAGE® for 3D dosimetry in breast IMRT. An anthropomorphic PRESAGE® phantom was created in the shape of a breast phantom. A five-field IMRT plan was generated with a commercially available treatment planning system and delivered to the PRESAGE® phantom. The anthropomorphic PRESAGE® was scanned with the Duke midsized optical CT scanner (DMOS-RPC), and the optical density distribution was converted to dose. Comparisons were performed between the dose distributions calculated with the Pinnacle3 treatment planning system, PRESAGE®, and EBT2 film measurements. DVHs, gamma maps, and line profiles were used to evaluate the agreement. Gamma map comparisons showed that Pinnacle3 agreed with PRESAGE®, as greater than 95% of comparison points for the PTV passed a ±3%/±3 mm criterion when the outer 8 mm of phantom data were excluded. Edge artifacts were observed in the optical CT reconstruction from the surface to approximately 8 mm depth. These artifacts produced dose differences between Pinnacle3 and PRESAGE® of up to 5% between the surface and a depth of 8 mm, decreasing with increasing depth in the phantom. Line profile comparisons between all three independent measurements yielded a maximum difference of 2% within the central 80% of the field width. For the breast IMRT plan studied, the Pinnacle3 calculations agreed with the PRESAGE® measurements to within the ±3%/±3 mm gamma criterion. This work demonstrates the feasibility of fashioning PRESAGE® into an anthropomorphic shape and establishes the accuracy of Pinnacle3 for breast IMRT. Furthermore, these data lay the groundwork for future investigations into 3D dosimetry with more complex anthropomorphic phantoms.Keywords: 3D dosimetry, PRESAGE®, IMRT, QA, EBT2 GAFCHROMIC film
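The ±3%/±3 mm gamma criterion used above combines a dose tolerance and a distance-to-agreement tolerance into a single pass/fail index. The following simplified 1-D Python sketch shows the idea on a pair of synthetic dose profiles; real 3-D gamma analysis tools used with PRESAGE® data are considerably more involved, and the profiles here are assumptions for illustration only.

```python
import numpy as np

def gamma_1d(ref_dose, eval_dose, spacing_mm, dose_tol=0.03, dist_tol_mm=3.0):
    """1-D gamma index with global dose normalization; a point passes if gamma <= 1."""
    x = np.arange(len(ref_dose)) * spacing_mm
    gammas = np.empty(len(ref_dose))
    for i, (xi, di) in enumerate(zip(x, ref_dose)):
        dose_term = (eval_dose - di) / (dose_tol * ref_dose.max())
        dist_term = (x - xi) / dist_tol_mm
        gammas[i] = np.sqrt(dose_term ** 2 + dist_term ** 2).min()
    return gammas

# Synthetic profiles: the "measured" curve is 2% hotter and shifted 0.5 mm.
xs = np.linspace(-40.0, 40.0, 161)                    # 0.5 mm spacing
planned = np.exp(-(xs / 25.0) ** 4)
measured = 1.02 * np.exp(-((xs - 0.5) / 25.0) ** 4)

g = gamma_1d(planned, measured, spacing_mm=0.5)
print(f"gamma pass rate (3%/3 mm): {np.mean(g <= 1.0):.1%}")
```

A small dose offset or spatial shift still passes because each reference point only needs some nearby evaluated point that agrees within the combined tolerance, which is what makes gamma maps forgiving of localized edge effects away from the excluded surface region.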
Procedia PDF Downloads 414872 Evaluating the Dosimetric Performance of a 3D Treatment Planning System for Wedged and Off-Axis Fields
Authors: Nashaat A. Deiab, Aida Radwan, Mohamed S. Yahiya, Mohamed Elnagdy, Rasha Moustafa
Abstract:
This study evaluates the dosimetric performance of our institution's 3D treatment planning system for wedged and off-axis 6 MV photon beams, guided by the recommended QA tests documented in AAPM TG-53, the NCS Report 15 test packages, IAEA TRS-430, and ESTRO Booklet No. 7. The study was performed on an Elekta Precise linear accelerator designed for a clinical range of 4, 6, and 15 MV photon beams, with asymmetric jaws and a fully integrated multileaf collimator that enables high conformance to the target with sharp field edges. Ten tests were applied on a solid water-equivalent phantom together with a 2D array dose detection system. Doses calculated with the 3D treatment planning system PrecisePLAN were compared with measured doses to verify that the dose calculations are accurate for simple situations such as square and elongated fields, different SSDs, beam modifiers (e.g., wedges, blocks, MLC-shaped fields), and asymmetric collimator settings. The QA results showed dosimetric accuracy of the TPS within the specified tolerance limits. For the large elongated wedged field, the errors on and outside the central axis were 0.2% and 0.5%, respectively, while for the off-planned and off-axis elongated fields, the errors in the region outside the central axis of the beam were 0.2% and 1.1%, respectively. The investigated dosimetric results therefore yielded differences within the accepted tolerance levels as recommended. Differences between dose values predicted by the TPS and values measured at the same point result from limitations of the dose calculation, uncertainties in the measurement procedure, or fluctuations in the output of the accelerator.Keywords: quality assurance, dose calculation, wedged fields, off-axis fields, 3D treatment planning system, photon beam
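The tolerance checks described above reduce, at each measurement point, to a normalized percent deviation compared against a region-specific limit. The snippet below is a minimal sketch of that bookkeeping; the point doses, the normalization, and the 3% limit are assumed values for illustration, not figures from the study.

```python
def dose_deviation(calculated, measured, normalization):
    """Percent deviation of TPS dose from measurement, normalized to a
    reference dose such as the central-axis dose of the open field."""
    return 100.0 * (calculated - measured) / normalization

# Example point in a wedged, off-axis field (illustrative values, in Gy).
calc, meas, d_ref = 1.842, 1.825, 2.000
dev = dose_deviation(calc, meas, d_ref)
tolerance_pct = 3.0  # assumed regional tolerance, in the spirit of IAEA TRS-430
print(f"deviation = {dev:.2f}% -> {'PASS' if abs(dev) <= tolerance_pct else 'FAIL'}")
```

Because the deviation is normalized to a reference dose rather than the local dose, points in low-dose regions are not unfairly penalized, which matches how such protocol tolerances are usually stated.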
Procedia PDF Downloads 444871 New Advanced Medical Software Technology Challenges and Evolution of the Regulatory Framework in Expert Software, Artificial Intelligence, and Machine Learning
Authors: Umamaheswari Shanmugam, Silvia Ronchi, Radu Vornicu
Abstract:
Software, artificial intelligence, and machine learning can improve healthcare through innovative and advanced technologies able to exploit the large amount and variety of data generated during healthcare services every day. As reported in the news, over 500 machine learning or other artificial intelligence medical devices have now received FDA clearance or approval, the first ones even preceding the year 2000. One of the big advantages of these new technologies is the ability to gain experience and knowledge from real-world use and to continuously improve their performance. Healthcare systems and institutions can benefit greatly, because the use of advanced technologies improves both the efficiency and the efficacy of healthcare. Software defined as a medical device is stand-alone software intended to be used for one or more specific medical purposes: diagnosis, prevention, monitoring, prediction, prognosis, treatment, or alleviation of a disease or other health conditions; replacing or modifying any part of a physiological or pathological process; or managing information received from in vitro specimens derived from the human body, and it does not achieve its principal intended action by pharmacological, immunological, or metabolic means. Software qualified as a medical device must comply with the general safety and performance requirements applicable to medical devices. These requirements are necessary to ensure high performance and quality and also to protect patients' safety. The evolution and continuous improvement of software used in healthcare must take into account the increase in regulatory requirements, which are becoming more complex in each market. The gap between these advanced technologies and the new regulations is the biggest challenge for medical device manufacturers. Regulatory requirements can be considered a market barrier, as they can delay or obstruct device approval, but they are necessary to ensure performance, quality, and safety; at the same time, they can be a business opportunity if the manufacturer is able to define the appropriate regulatory strategy in advance. This abstract provides an overview of the current regulatory framework, the evolution of international requirements, and the standards applicable to medical device software in potential markets all over the world.Keywords: artificial intelligence, machine learning, SaMD, regulatory, clinical evaluation, classification, international requirements, MDR, 510k, PMA, IMDRF, cyber security, health care systems
Procedia PDF Downloads 88870 High-Resolution Flood Hazard Mapping Using Two-Dimensional Hydrodynamic Model ANUGA: Case Study of Jakarta, Indonesia
Authors: Hengki Eko Putra, Dennish Ari Putro, Tri Wahyu Hadi, Edi Riawan, Junnaedhi Dewa Gede, Aditia Rojali, Fariza Dian Prasetyo, Yudhistira Satya Pribadi, Dita Fatria Andarini, Mila Khaerunisa, Raditya Hanung Prakoswa
Abstract:
Catastrophe risk management is only possible if we are able to calculate the exposed risks. Jakarta is an important city economically, socially, and politically, and at the same time it is exposed to severe floods; yet flood risk calculation in the area is still very limited. This study calculated the flood risk for Jakarta using the 2-dimensional model ANUGA. The 2-dimensional model ANUGA and the 1-dimensional model HEC-RAS were used to calculate the flood risk from 13 major rivers in Jakarta. ANUGA can simulate the physical and dynamical interaction of streamflow with river geometry and land cover to produce a 1-meter-resolution inundation map. The streamflow values used as model input were obtained from hydrological analysis of rainfall data with the hydrologic model HEC-HMS. Probabilistic streamflow was derived from probabilistic rainfall using the Log-Pearson III, Normal, and Gumbel statistical distributions, with goodness of fit checked by the Chi-square and Kolmogorov-Smirnov tests. The 2007 flood event was used as a benchmark to evaluate the accuracy of the model output. Property damage estimates were calculated from flood depth for the 1-, 5-, 10-, 25-, 50-, and 100-year return periods against housing value data from BPS-Statistics Indonesia and the Centre for Research and Development of Housing and Settlements, Ministry of Public Works Indonesia. The vulnerability factor was derived from flood insurance claims. Jakarta's flood loss estimates for the return periods of 1, 5, 10, 25, 50, and 100 years are, respectively, Rp 1.30 t, Rp 16.18 t, Rp 16.85 t, Rp 21.21 t, Rp 24.32 t, and Rp 24.67 t, against a total building value of Rp 434.43 t.Keywords: 2D hydrodynamic model, ANUGA, flood, flood modeling
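The frequency-analysis step mentioned above (fitting a Log-Pearson III distribution to flows and testing the fit with Kolmogorov-Smirnov) can be sketched in a few lines of Python with SciPy. The discharge series below is synthetic and all parameter values are assumptions; the study's actual HEC-HMS-derived flows are not reproduced here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
peaks = rng.lognormal(mean=5.5, sigma=0.4, size=40)    # synthetic annual peak flows, m^3/s

log_q = np.log10(peaks)
skew, loc, scale = stats.pearson3.fit(log_q)           # Pearson III on log-flows, i.e. Log-Pearson III

return_periods = np.array([1.01, 5, 10, 25, 50, 100])  # years; T = 1 exactly is degenerate
non_exceed = 1.0 - 1.0 / return_periods
design_q = 10 ** stats.pearson3.ppf(non_exceed, skew, loc=loc, scale=scale)

# Goodness of fit against the fitted model, as the study does with K-S.
ks = stats.kstest(log_q, lambda v: stats.pearson3.cdf(v, skew, loc=loc, scale=scale))

for t, q in zip(return_periods, design_q):
    print(f"T = {t:>6.2f} yr: design flow = {q:,.0f} m^3/s")
print(f"K-S statistic = {ks.statistic:.3f}, p-value = {ks.pvalue:.3f}")
```

The design flow for each return period would then drive the hydrodynamic model, so the loss figures per return period quoted above inherit the uncertainty of this statistical fit.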
Procedia PDF Downloads 275