Search results for: Multiple criteria decision-making (MCDM)
6261 The Network Relative Model Accuracy (NeRMA) Score: A Method to Quantify the Accuracy of Prediction Models in a Concurrent External Validation
Authors: Carl van Walraven, Meltem Tuna
Abstract:
Background: Network meta-analysis (NMA) quantifies the relative efficacy of three or more interventions from studies containing a subgroup of the interventions. This study applied the analytical approach of NMA to quantify the relative accuracy of prediction models with distinct inclusion criteria that are evaluated on a common population ('concurrent external validation'). Methods: We simulated binary events in 5000 patients using a known risk function. We biased the risk function and modified its precision by pre-specified amounts to create 15 prediction models with varying accuracy and distinct patient applicability. Prediction model accuracy was measured using the Scaled Brier Score (SBS). Overall prediction model accuracy was measured using fixed-effects methods that accounted for model applicability patterns. Prediction model accuracy was summarized as the Network Relative Model Accuracy (NeRMA) Score, which ranges from -∞ through 0 (the accuracy of random guessing) to 1 (the accuracy of the most accurate model in the concurrent external validation). Results: The unbiased prediction model had the highest SBS. The NeRMA Score correctly ranked all simulated prediction models by the extent of their bias from the known risk function. A SAS macro and an R function were created to implement the NeRMA Score. Conclusions: The NeRMA Score makes it possible to quantify the accuracy of binomial prediction models having distinct inclusion criteria in a concurrent external validation.
Keywords: prediction model accuracy, scaled Brier score, fixed-effects methods, concurrent external validation
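The Scaled Brier Score used above can be sketched in a few lines. The final rescaling is only an illustrative reading of the NeRMA anchoring described in the abstract (0 for random guessing, 1 for the best model in the network); the paper's actual fixed-effects formulation is not reproduced here.

```python
import numpy as np

def brier_score(y_true, y_prob):
    """Mean squared difference between predicted probability and outcome."""
    y_true, y_prob = np.asarray(y_true, float), np.asarray(y_prob, float)
    return np.mean((y_prob - y_true) ** 2)

def scaled_brier_score(y_true, y_prob):
    """SBS = 1 - BS / BS_ref, where BS_ref predicts the observed event
    rate for every patient (a non-informative model). 1 is perfect;
    0 matches the non-informative reference."""
    bs = brier_score(y_true, y_prob)
    p_ref = np.mean(np.asarray(y_true, float))
    bs_ref = brier_score(y_true, np.full(len(y_true), p_ref))
    return 1.0 - bs / bs_ref

def relative_accuracy(score, score_random, score_best):
    """Illustrative NeRMA-style anchoring: 0 at random guessing,
    1 at the best model in the concurrent external validation."""
    return (score - score_random) / (score_best - score_random)
```

A perfectly calibrated model attains an SBS of 1, while a model that only predicts the event rate scores 0, matching the anchoring described in the abstract.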
Procedia PDF Downloads 238
6260 Diagnostic Value of Different Noninvasive Criteria of Latent Myocarditis in Comparison with Myocardial Biopsy
Authors: Olga Blagova, Yuliya Osipova, Evgeniya Kogan, Alexander Nedostup
Abstract:
Purpose: To quantify the value of various clinical, laboratory, and instrumental signs in the diagnosis of myocarditis in comparison with morphological studies of the myocardium. Methods: In 100 patients (65 men, 44.7±12.5 years) with 'idiopathic' arrhythmias (n = 20) or dilated cardiomyopathy (DCM, n = 80), 71 endomyocardial biopsies (EMB), 13 intraoperative biopsies, 5 studies of explanted hearts, and 11 autopsies were performed, with viral investigation (real-time PCR) of the blood and myocardium. Anti-heart antibodies (AHA) were also measured, and cardiac CT (n = 45), MRI (n = 25), and coronary angiography (n = 47) were performed. The comparison group included 50 patients (25 men, 53.7±11.7 years) with non-inflammatory heart diseases who underwent open heart surgery. Results: Active or borderline myocarditis was diagnosed in 76.0% of the study group and in 21.6% of the comparison group (p < 0.001). The viral genome was observed in the myocardium more frequently in the comparison group than in the study group (65.0% vs. 40.2%; p < 0.01). The diagnostic value of the noninvasive markers of myocarditis was evaluated; the panel of anti-heart antibodies was the most informative for identifying myocarditis: sensitivity was 81.5%, and the positive and negative predictive values were 75.0% and 60.5%. Based on the diagnostic value of the noninvasive markers, a diagnostic algorithm providing an individual assessment of the likelihood of myocarditis was developed. Conclusion: AHA have the greatest significance in the diagnosis of latent myocarditis in patients with 'idiopathic' arrhythmias and DCM. The use of a complex of noninvasive criteria allows the probability of myocarditis to be estimated and the indications for EMB to be determined.
Keywords: myocarditis, 'idiopathic' arrhythmias, dilated cardiomyopathy, endomyocardial biopsy, viral genome, anti-heart antibodies
Procedia PDF Downloads 173
6259 Guillain Barre Syndrome in Children
Authors: A. Erragh, K. Amanzoui, M. Elharit, H. Salem, M. Ababneh, K. Elfakhr, S. Kalouch, A. Chlilek
Abstract:
Guillain-Barré syndrome (GBS) is the most common form of acute polyradiculoneuritis (PRNA). It is a medical emergency in pediatrics that requires rapid diagnosis and immediate assessment of severity criteria for the implementation of appropriate treatment. We conducted a retrospective, descriptive study of 24 patients under the age of 18 who presented with GBS between September 2017 and July 2021 and were hospitalized in the multipurpose pediatric intensive care unit of the Abderrahim El Harouchi children's hospital in Casablanca. The average age was 7.91 years, with extremes ranging from 18 months to 14 years, and a male predominance of 75%. After a prodromal event, most often infectious (80%), and a free interval of 12 days on average, motor disorders began in one of two forms: hypo- or areflexic flaccid paralysis of the lower limbs (45.8%), or hypo- or areflexic flaccid quadriplegia (54.2%). During GBS, the most formidable complication is respiratory distress, which can occur at any time. In our study, respiratory impairment was observed in 70.8% of cases. In addition, other signs of severity, such as swallowing disorders (75%) and dysautonomic disorders (8.33%), were also observed, which justified intensive care for all of our patients. Invasive ventilation was necessary in 76.5% of cases, and specific treatment based on immunoglobulins was administered to all our patients. Despite this, the death rate remains high (25%) and is mainly due to complications related to hospitalization. Guillain-Barré syndrome is therefore a pediatric emergency that requires rapid diagnosis and immediate assessment of severity criteria for the implementation of appropriate treatment.
Keywords: Guillain-Barré syndrome, emergency, children, medical
Procedia PDF Downloads 72
6258 Images Selection and Best Descriptor Combination for Multi-Shot Person Re-Identification
Authors: Yousra Hadj Hassen, Walid Ayedi, Tarek Ouni, Mohamed Jallouli
Abstract:
To re-identify a person is to check whether he or she has already been seen over a camera network. Recently, re-identifying people over large public camera networks has become a crucial task of great importance for ensuring public security, and the vision community has deeply investigated this area of research. Most existing studies rely only on spatial appearance information from either one or multiple person images. In practice, the real person re-identification framework is a multi-shot scenario; however, efficiently modeling a person's appearance and choosing the best samples remain challenging problems. In this work, an extensive comparison of state-of-the-art descriptors associated with the proposed frame selection method is presented. Specifically, we evaluate the sample selection approach using multiple proposed descriptors. We show the effectiveness and advantages of the proposed method by extensive comparisons with related state-of-the-art approaches on two standard datasets, PRID2011 and iLIDS-VID.
Keywords: camera network, descriptor, model, multi-shot, person re-identification, selection
Procedia PDF Downloads 279
6257 End-to-End Control and Management of Multi-AS Virtual Service Networks Using SDN and Autonomic Computing Architecture
Authors: Yong Xue, Daniel A. Menascé
Abstract:
Automated, end-to-end network resource management and provisioning for virtual service networks in a multiple-autonomous-system (multi-AS) environment is a challenging and open problem. This paper proposes a novel, scalable and interoperable high-level architecture that incorporates a number of emerging enabling technologies, including Software Defined Network (SDN), Network Function Virtualization (NFV), Service Oriented Architecture (SOA), and Autonomic Computing. The proposed architecture can be used not only to automate network resource management and provisioning for virtual service networks across multiple autonomous substrate networks, but also to provide an adaptive capability for achieving optimal network resource management and maintaining end-to-end network performance. The paper argues that this SDN- and autonomic-computing-based architecture lays a solid foundation that can facilitate the development of the future Internet based on the pluralistic paradigm.
Keywords: virtual network, software defined network, virtual service network, adaptive resource management, SOA, multi-AS, inter-domain
Procedia PDF Downloads 533
6256 The Reasons for Vegetarianism in Estonia and its Effects to Body Composition
Authors: Ülle Parm, Kata Pedamäe, Jaak Jürimäe, Evelin Lätt, Aivar Orav, Anna-Liisa Tamm
Abstract:
Vegetarianism has gained popularity across the world. It is chosen for multiple reasons, but among Estonians these have remained unknown. Previous work has paid attention to the bone health and probable nutrient deficiencies of vegetarians, and lower body mass index (BMI) and blood cholesterol levels have been found in vegetarians, but the results are inconclusive. The goal was to explain the reasons for choosing a vegetarian diet in Estonia and the impact of vegetarianism on body composition: BMI, fat percentage (fat%), fat mass (FM), and fat-free mass (FFM). The study group comprised 68 vegetarians and 103 omnivores. Body composition was determined with DXA (Hologic) in 2013. Body mass (medical electronic scale, A&D Instruments, Abingdon, UK) and height (Martin metal anthropometer, to the nearest 0.1 cm) were measured and BMI calculated (kg/m²). General data (including physical activity level) were collected with questionnaires. The main reasons vegetarianism was chosen were the healthiness of the vegetarian diet (59%) and the wish to fight for animal rights (72%). Food additives were consumed by less than half of vegetarians, more often by men. Vegetarians had lower BMI than omnivores, especially among men. Based on the BMI classification, vegetarians were less obese than omnivores. However, there were no differences in the FM, FFM and fat% figures of the two groups. Differences in physical activity level between omnivores and vegetarians might contribute to the difference in BMI. For classifying people as underweight, normal weight, overweight and obese, both BMI and fat% criteria were used. The BMI classification placed more people in the normal-weight group than the fat% classification did; using the fat% criteria, in contrast, more people were categorized as overweight.
It can be concluded that the main reasons for choosing vegetarianism in Estonia are the healthiness of the vegetarian diet and the wish to fight for animal rights, and that a vegetarian diet has no effect on body fat percentage, FM and FFM.
Keywords: body composition, body fat percentage, body mass index, vegetarianism
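The two classification schemes compared above can be sketched as follows. The BMI formula and WHO-style BMI cut-offs are standard; the fat% thresholds below are illustrative assumptions, since the abstract does not report the exact cut-offs used.

```python
def bmi(weight_kg, height_m):
    """Body mass index in kg/m^2."""
    return weight_kg / height_m ** 2

def classify_bmi(b):
    """Standard BMI categories (WHO-style cut-offs)."""
    if b < 18.5:
        return "underweight"
    if b < 25.0:
        return "normal"
    if b < 30.0:
        return "overweight"
    return "obese"

def classify_fat_pct(fat_pct, male):
    """Illustrative fat% cut-offs only (hypothetical thresholds);
    the study does not report the values it applied."""
    threshold = 25.0 if male else 32.0
    return "overweight/obese" if fat_pct >= threshold else "not overweight"
```

For example, a 70 kg person 1.75 m tall has a BMI of about 22.9 and is classified as normal weight by BMI, yet could still fall in the overweight group under a fat% criterion, which is exactly the divergence the abstract reports.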
Procedia PDF Downloads 419
6255 Water Dumpflood into Multiple Low-Pressure Gas Reservoirs
Authors: S. Lertsakulpasuk, S. Athichanagorn
Abstract:
As depletion-drive gas reservoirs are abandoned when the production rate becomes insufficient due to pressure depletion, waterflooding has been proposed to increase the reservoir pressure in order to prolong gas production. Due to its high cost, however, water injection may not be economically feasible. Water dumpflood into gas reservoirs is a promising new approach that increases gas recovery by maintaining reservoir pressure at much lower cost than conventional waterflooding. Thus, a simulation study of water dumpflood into multiple nearly abandoned or already abandoned thin-bedded gas reservoirs, commonly found in the Gulf of Thailand, was conducted to demonstrate the advantage of the proposed method and to determine the most suitable operational parameters for reservoirs having different system parameters. A reservoir simulation model consisting of several thin-layered depletion-drive gas reservoirs and an overlying aquifer was constructed in order to investigate the performance of the proposed method. Two producers were initially used to produce gas from the reservoirs. One of them was later converted to a dumpflood well after the gas production rate started to decline due to the continuous reduction in reservoir pressure. The dumpflood well was used to flow water from the aquifer into the gas reservoir to increase its pressure and drive gas towards the producer. Two main operational parameters, the wellhead pressure of the producer and the time to start water dumpflood, were investigated to optimize gas recovery for systems having different gas reservoir dip angles, well spacings, aquifer sizes, and aquifer depths. This simulation study found that water dumpflood can increase gas recovery by up to 12% of OGIP, depending on operational conditions and system parameters.
For systems having a large aquifer and a large distance between wells, it is best to start water dumpflood while the gas rate is still high, since the long distance between the gas producer and the dumpflood well helps delay water breakthrough at the producer. As long as there is no early water breakthrough, the earlier energy is supplied to the gas reservoirs, the better the gas recovery. On the other hand, for systems having a small or moderate aquifer and a short distance between the two wells, performing water dumpflood when the rate is close to the economic rate is better, because water is more likely to break through early when the distance is short. Water dumpflood into multiple nearly depleted or depleted gas reservoirs is a novel subject of study: the idea of using water dumpflood to increase gas recovery has been mentioned in the literature but never investigated in detail. This detailed study will help practicing engineers understand the benefits of the method and implement it with minimum cost and risk.
Keywords: dumpflood, increase gas recovery, low-pressure gas reservoir, multiple gas reservoirs
Procedia PDF Downloads 445
6254 The Efficacy of Vestibular Rehabilitation Therapy for Mild Traumatic Brain Injury: A Systematic Review and Meta-Analysis
Authors: Ammar Aljabri, Alhussain Halawani, Alaa Ashqar, Omar Alageely
Abstract:
Objective: Mild traumatic brain injury (mTBI), or concussion, is a common yet undermanaged and underreported condition. This systematic review and meta-analysis aims to determine the efficacy of vestibular rehabilitation therapy (VRT) as a treatment option for mTBI. Method: The review and meta-analysis were performed following the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines and included RCTs and pre-/post-VRT retrospective chart reviews. Records meeting the inclusion criteria were extracted from the following databases: Medline, Embase, and the Cochrane Central Register of Controlled Trials (CENTRAL). Results: Eight articles met the inclusion criteria, and six RCTs were included in the meta-analysis. VRT demonstrated significant improvement in decreasing perceived dizziness at the end of the intervention program, as shown by DHI scores (SMD = -0.33, 95% CI -0.62 to -0.03, p = 0.03, I² = 0%). However, no significant reduction in DHI was evident after two months of follow-up (SMD = 0.15, 95% CI -0.23 to 0.52, p = 0.44, I² = 0%). Quantitative analysis also showed a significant reduction in both VOMS (SMD = -0.40, 95% CI -0.60 to -0.20, p < 0.0001, I² = 0%) and PCSS (SMD = -0.39, 95% CI -0.71 to -0.07, p = 0.02, I² = 0%) following the intervention. Lastly, there was no significant difference between intervention groups on BESS scores (SMD = -0.31, 95% CI -0.71 to 0.10, p = 0.14, I² = 0%) or return to sport/function (95% CI 0.32 to 30.80, p = 0.32, I² = 82%). Conclusions: Current evidence on the efficacy of VRT for mTBI is limited. This review and analysis provide evidence that supports the role of VRT in improving perceived symptoms following concussion. There is still a need for high-quality trials evaluating the benefit of VRT using a standardized approach.
Keywords: concussion, traumatic brain injury, vestibular rehabilitation, neurorehabilitation
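Pooled effect sizes like the SMDs above come from standard inverse-variance pooling. A minimal sketch follows, using hypothetical per-study SMDs and standard errors rather than the review's actual data; note that with I² = 0%, fixed- and random-effects estimates coincide.

```python
import math

def pool_fixed_effect(smds, ses):
    """Inverse-variance fixed-effect pooled SMD with a 95% CI.
    Each study is weighted by 1 / SE^2."""
    weights = [1.0 / se ** 2 for se in ses]
    pooled = sum(w * d for w, d in zip(weights, smds)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

# Hypothetical example: two equally precise studies
pooled, ci = pool_fixed_effect([-0.2, -0.4], [0.1, 0.1])
```

With equal standard errors the pooled SMD is simply the mean of the study effects (-0.3 here), and the confidence interval narrows as studies are added, which is the point of pooling.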
Procedia PDF Downloads 145
6253 Ranking of the Main Criteria for Contractor Selection Procedures on Major Construction Projects in Libya Using the Delphi Method
Authors: Othoman Elsayah, Naren Gupta, Binsheng Zhang
Abstract:
The construction sector constitutes one of the most important sectors in the economy of any country. Contractor selection is a critical decision undertaken by client organizations and is central to the success of any construction project. Contractor selection (CS) is a process of investigating, screening, and determining whether candidate contractors have the technical and financial capability to be accepted to formally tender for construction work. The process should be conducted prior to the award of the contract and is characterized by many factors, such as the contractor's skills, experience on similar projects, track record in the industry, and financial stability. This paper evaluates the current state of knowledge on the contractor selection process and presents the findings from the analysis of the data collected through a Delphi questionnaire survey. The survey was conducted with a group of 12 experts working in the Libyan construction industry (LCI). The paper starts by briefly explaining the general outline of the questionnaire, including the survey participation rate, the different fields the experts came from, and the business titles of the participants. Then, the paper describes the tests used to determine when the experts had reached consensus. The paper is based on research that aims to rank contractor selection criteria with specific application to major construction projects in the Libyan context. The findings of this study will be utilized to establish the scope of work that will be used as part of a PhD research project.
Keywords: contractor selection, Libyan construction industry, decision experts, Delphi technique
Procedia PDF Downloads 332
6252 Study of the Biological Activity of a Ganglioside-Containing Drug (Cronassil) in an Experimental Model of Multiple Sclerosis
Authors: Hasmik V. Zanginyan, Gayane S. Ghazaryan, Laura M. Hovsepyan
Abstract:
Experimental autoimmune encephalomyelitis (EAE) is an inflammatory demyelinating disease of the central nervous system that is induced in laboratory animals by provoking an immune response against myelin epitopes. The typical clinical course is ascending palsy, which correlates with inflammation and tissue damage in the thoracolumbar spinal cord, although the optic nerves and brain (especially the subpial white matter and brainstem) are also often affected. In multiple sclerosis, lipid metabolism in myelin is disturbed. When membrane lipids (glycosphingolipids, phospholipids) are disturbed, their metabolites not only play a structural role in membranes but also serve as sources of secondary mediators that transmit multiple cellular signals. The purpose of this study was to investigate the effect of a ganglioside as a therapeutic agent in experimental multiple sclerosis. The biological activity of a ganglioside-containing medicinal preparation (Cronassial) was evaluated in an experimental model of multiple sclerosis in laboratory animals. The experimental model of multiple sclerosis in rats was obtained by immunization with myelin basic protein (MBP) and a homogenate of the spinal cord or brain. EAE was induced by administering an encephalitogenic mixture (EGM) with complete Freund's adjuvant. The mitochondrial fraction was isolated in a medium containing 0.25 M sucrose and 0.01 M Tris buffer, pH 7.4, by differential centrifugation on a K-24 centrifuge. Glutathione peroxidase activity was assessed by the reduction of hydrogen peroxide (H₂O₂) and lipid hydroperoxides (ROOH) in the presence of GSH. Lipid peroxidation (LPO) activity was assessed by the amount of malondialdehyde (MDA) in the total homogenate and mitochondrial fraction of the spinal cord and brain of control and EAE rats. MDA was assessed by the reaction with thiobarbituric acid.
For statistical data analysis, the SPSS (Statistical Package for the Social Sciences) package was used. The nature of the distribution of the obtained data was determined with the Kolmogorov-Smirnov test. Comparative analysis was performed using the nonparametric Mann-Whitney test; differences were considered statistically significant at p ≤ 0.05 or p ≤ 0.01. Correlation analysis was conducted using the nonparametric Spearman test. The work used a refrigerated centrifuge, an LKB Biochrom ULTROSPEC II spectrophotometer (Sweden), a PL-600 mrc pH meter (Israel), and guanosine and ATP (Sigma). The study of lipid peroxidation in the total homogenate of the brain and spinal cord of experimental animals revealed an increase in the content of malondialdehyde. With Cronassial treatment, normalization of lipid peroxidation processes was observed. Reactive oxygen species, which drive lipid peroxidation, can be toxic both to neurons and to the oligodendrocytes that form myelin, disrupting their lipid composition. The high lipid content of the brain and the uniqueness of its lipid structure determine the character of LPO processes. The lipid layer of cellular and intracellular membranes performs two main functions, barrier and matrix (structural), and damage to the barrier leads to dysregulation of intracellular processes and severe disorders of cellular functions.
Keywords: experimental autoimmune encephalomyelitis, multiple sclerosis, neuroinflammation, therapy
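The Mann-Whitney comparison used above can be sketched without any statistics library. The MDA values below are hypothetical illustrations, not the study's data; significance is judged against the tabulated two-sided 0.05 critical value (U ≤ 2 for two groups of five).

```python
def mann_whitney_u(x, y):
    """Mann-Whitney U statistic: count pairs where x > y (ties count 0.5),
    then report the smaller of U1 and U2 = n_x * n_y - U1."""
    greater = sum(1 for xi in x for yi in y if xi > yi)
    ties = sum(1 for xi in x for yi in y if xi == yi)
    u1 = greater + 0.5 * ties
    return min(u1, len(x) * len(y) - u1)

control = [2.1, 2.4, 2.2, 2.0, 2.3]   # hypothetical MDA, control rats
eae     = [3.1, 3.4, 2.9, 3.6, 3.2]   # hypothetical MDA, EAE rats

u = mann_whitney_u(control, eae)
# For n1 = n2 = 5 the two-sided 0.05 critical value is U <= 2,
# so complete separation (U = 0) indicates a significant difference.
```

This rank-based test makes no normality assumption, which is why it pairs naturally with the Kolmogorov-Smirnov distribution check mentioned above.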
Procedia PDF Downloads 93
6251 A Proposal of Multi-modal Teaching Model for College English
Authors: Huang Yajing
Abstract:
Multimodal discourse refers to the phenomenon of communicating through various senses such as hearing, vision, and touch, using multiple means and symbolic resources such as language, images, sounds, and movements. With the development of modern technology and multimedia, language and technology have become inseparable, and foreign language teaching is becoming more and more multimodal. Teacher-student communication draws on multiple senses and uses multiple symbol systems to construct and interpret meaning; the classroom is a semiotic space where multimodal discourses are intertwined. Multimodal teaching of college English rationally utilizes traditional teaching methods while mobilizing and coordinating various modern teaching methods to form a joint force that promotes teaching and learning. Multimodal teaching makes full and reasonable use of various meaning resources and can maximize the advantages of multimedia and network environments. Based on these theories of multimodal discourse and multimedia technology, this paper proposes a multimodal teaching model for college English in China.
Keywords: multimodal discourse, multimedia technology, English education, applied linguistics
Procedia PDF Downloads 69
6250 Drivetrain Comparison and Selection Approach for Armored Wheeled Hybrid Vehicles
Authors: Çağrı Bekir Baysal, Göktuğ Burak Çalık
Abstract:
Armored vehicles may have different traction layouts as a result of terrain capabilities and mobility needs. The two main categories of layout are wheeled and tracked. Tracked vehicles have superior off-road capabilities, but what they gain in terrain performance they lose on the mobility front. Wheeled vehicles, on the other hand, do not have terrain capabilities as good as those of tracked vehicles, but they have superior mobility characteristics such as top speed, range, and agility. Conventional armored vehicles employ a diesel internal combustion engine (ICE) as the main power source, mechanically connected to the powertrain. The ICE rpm is thus determined by the speed and torque requested by the driver; ICE efficiency changes drastically with the required torque and speed, and conventional vehicles suffer in fuel consumption because of this. Hybrid electric vehicles employ at least one electric motor in order to improve fuel efficiency. There are different types of hybrid vehicles, the main ones being series hybrid, parallel hybrid, and series-parallel hybrid. These vehicles introduce an electric motor for traction and may also have a generator motor for range extension. Having an electric motor as the traction power source brings the flexibility of either using the ICE as an alternative traction source while it is in its efficient range, or completely separating the ICE from traction and operating it solely with efficiency in mind. Hybrid configurations have additional advantages for armored vehicles beyond fuel efficiency: reduced heat signature, silent operation, and prolonged stationary missions become possible with the help of the high-power battery pack present in the vehicle for the hybrid drivetrain. For these reasons, hybrid armored vehicles are becoming a target area for the military and for vehicle suppliers.
In order to have a sound starting point when beginning a hybrid armored vehicle design, the hybrid drivetrain configuration has to be selected through a trade-off study. This study has to include vehicle mobility simulations and integration-level, vehicle-level and performance-level criteria. In this work, the hybrid traction configurations possible for an 8x8 vehicle are compared using such a criteria set: ease of application, cost, weight advantage, reliability, maintainability, redundancy, and performance. Performance criteria points were defined with the help of vehicle simulations and tests. The results of these simulations and tests also help determine the required tractive power for an armored vehicle, including conditions such as trench and obstacle crossing and gradient climbing. With the method explained in this study, each configuration is assigned a point for each criterion, so that the correct configuration can be selected objectively for every application. Key aspects of armored vehicles, mine protection and ballistic protection, are also considered for the hybrid configurations. Results are expected to vary for different types of vehicles, but it is observed that longitudinal differential locking capability improves mobility and that a high motor count increases complexity in general.
Keywords: armored vehicles, electric drivetrain, electric mobility, hybrid vehicles
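A selection procedure like the one described, scoring each configuration on each criterion and combining the results, can be sketched as a weighted sum, which is the simplest multi-criteria decision-making scheme. All weights and scores below are hypothetical placeholders; the actual values would come from the simulations and tests.

```python
# Hypothetical criterion weights (sum to 1) and 1-10 scores per configuration
weights = {
    "ease of application": 0.10, "cost": 0.20, "weight advantage": 0.10,
    "reliability": 0.15, "maintainability": 0.10, "redundancy": 0.10,
    "performance": 0.25,
}

configs = {
    "series":          {"ease of application": 8, "cost": 7, "weight advantage": 6,
                        "reliability": 7, "maintainability": 7, "redundancy": 6,
                        "performance": 6},
    "parallel":        {"ease of application": 6, "cost": 6, "weight advantage": 7,
                        "reliability": 7, "maintainability": 6, "redundancy": 7,
                        "performance": 8},
    "series-parallel": {"ease of application": 4, "cost": 5, "weight advantage": 5,
                        "reliability": 6, "maintainability": 5, "redundancy": 8,
                        "performance": 9},
}

def total_score(config):
    """Weighted sum of the per-criterion scores for one configuration."""
    return sum(weights[c] * config[c] for c in weights)

best = max(configs, key=lambda name: total_score(configs[name]))
```

Changing the weights (for instance, emphasizing redundancy for mine-protection reasons) can change which configuration wins, which is why the trade-off study has to fix the criteria set before scoring.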
Procedia PDF Downloads 86
6249 Analysis of Labor Behavior Effect on Occupational Health and Safety Management by Multiple Linear Regression
Authors: Yulinda Rizky Pratiwi, Fuji Anugrah Emily
Abstract:
Management of Occupational Safety and Health (OSH) should be applied properly by all workers and pekarya (contract workers) in the company. OSH management, known in Indonesia as K3 management, has also become very important for preventing accidents, yet violations of K3 rules have occurred repeatedly. By 2015 the number of K3 violations, so-called unsafe actions, tended to increase, and in January 2016 the number of unsafe actions rose drastically. The trigger for this increase is a decline in the quality of K3 management practice, while the application of K3 management by each individual is thought to be influenced by that individual's attitude and adherence to action observation guidelines. A decline in the quality of K3 management application may result in an increased likelihood of accidents and losses for the company as well as for co-workers. The large and significant jump in unsafe actions in January 2016 obliges Pertamina, as the national oil company, to closely track the implementation of K3 management by every worker and pekarya, in this case at PT Pertamina EP Cepu Field Asset IV. The effort to control the implementation of K3 management can be assessed from the attitudes and the action observation guideline adherence of the workers and pekarya. Using multiple linear regression, the influence of attitude and of action observation guidelines on K3 management application can be quantified.
The results showed that the K3 management application score of each worker and pekarya increases by 0.764 for each one-unit increase in attitude score, and by 0.754 for each one-unit increase in the action observation guideline score.
Keywords: occupational safety and health, management of occupational safety and health, unsafe action, multiple linear regression
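A regression of this shape can be reproduced schematically with ordinary least squares. The data below are simulated using the reported coefficients (0.764 and 0.754), so the fit recovers them exactly; the intercept and the score values themselves are assumptions for illustration.

```python
import numpy as np

# Hypothetical attitude and observation-guideline scores for six workers
attitude  = np.array([3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
guideline = np.array([2.0, 3.0, 5.0, 4.0, 6.0, 7.0])

# K3 application score simulated with the paper's reported coefficients
k3 = 1.0 + 0.764 * attitude + 0.754 * guideline

# Ordinary least squares fit of: K3 = b0 + b1*attitude + b2*guideline
X = np.column_stack([np.ones_like(attitude), attitude, guideline])
coef, *_ = np.linalg.lstsq(X, k3, rcond=None)
# coef[1] and coef[2] are the per-unit effects of attitude and guideline score
```

Because the simulated outcome is exactly linear in the two predictors, the recovered coefficients match 0.764 and 0.754 to floating-point precision; with real survey data they would carry standard errors as well.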
Procedia PDF Downloads 230
6248 Predictors of School Safety Awareness among Malaysian Primary School Teachers
Authors: Ssekamanya, Mastura Badzis, Khamsiah Ismail, Dayang Shuzaidah Bt Abduludin
Abstract:
With rising incidents of school violence worldwide, educators and researchers are trying to understand and find ways to enhance the safety of children at school. The purpose of this study was to investigate the extent to which the demographic variables of gender, age, length of service, position, academic qualification, and school location predicted teachers’ awareness about school safety practices in Malaysian primary schools. A stratified random sample of 380 teachers was selected in the central Malaysian states of Kuala Lumpur and Selangor. Multiple regression analysis revealed that none of the factors was a good predictor of awareness about school safety training, delivery methods of school safety information, and available school safety programs. Awareness about school safety activities was significantly predicted by school location (whether the school was located in a rural or urban area). While these results may reflect a general lack of awareness about school safety among primary school teachers in the selected locations, a national study needs to be conducted for the whole country.
Keywords: school safety awareness, predictors of school safety, multiple regression analysis, Malaysian primary schools
Procedia PDF Downloads 469
6247 Towards End-To-End Disease Prediction from Raw Metagenomic Data
Authors: Maxence Queyrel, Edi Prifti, Alexandre Templier, Jean-Daniel Zucker
Abstract:
Analysis of the human microbiome using metagenomic sequencing data has demonstrated high ability in discriminating various human diseases. Raw metagenomic sequencing data require multiple complex and computationally heavy bioinformatics steps prior to data analysis. Such data contain millions of short sequences read from the fragmented DNA sequences and stored as fastq files. Conventional processing pipelines consist in multiple steps including quality control, filtering, alignment of sequences against genomic catalogs (genes, species, taxonomic levels, functional pathways, etc.). These pipelines are complex to use, time consuming and rely on a large number of parameters that often provide variability and impact the estimation of the microbiome elements. Training Deep Neural Networks directly from raw sequencing data is a promising approach to bypass some of the challenges associated with mainstream bioinformatics pipelines. Most of these methods use the concept of word and sentence embeddings that create a meaningful and numerical representation of DNA sequences, while extracting features and reducing the dimensionality of the data. In this paper we present an end-to-end approach that classifies patients into disease groups directly from raw metagenomic reads: metagenome2vec. This approach is composed of four steps (i) generating a vocabulary of k-mers and learning their numerical embeddings; (ii) learning DNA sequence (read) embeddings; (iii) identifying the genome from which the sequence is most likely to come and (iv) training a multiple instance learning classifier which predicts the phenotype based on the vector representation of the raw data. An attention mechanism is applied in the network so that the model can be interpreted, assigning a weight to the influence of the prediction for each genome. 
Using two public real-life datasets as well as a simulated one, we demonstrated that this original approach reaches high performance, comparable with state-of-the-art methods applied directly on data processed through mainstream bioinformatics workflows. These results are encouraging for this proof-of-concept work. We believe that with further dedication, the DNN models have the potential to surpass mainstream bioinformatics workflows in disease classification tasks. Keywords: deep learning, disease prediction, end-to-end machine learning, metagenomics, multiple instance learning, precision medicine
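Step (i) above — building a k-mer vocabulary from reads and turning each read into a vector — can be sketched in a few lines. This is a rough illustration, not the authors' metagenome2vec implementation: it uses a plain bag-of-k-mers count vector as a crude stand-in for the learned embeddings of steps (i)–(ii), and all function names are hypothetical.

```python
from collections import Counter

def kmers(read: str, k: int = 4):
    """Split a DNA read into its overlapping k-mers (the 'words' of step i)."""
    return [read[i:i + k] for i in range(len(read) - k + 1)]

def build_vocab(reads, k: int = 4, min_count: int = 1):
    """Count k-mers across all reads and index those above a frequency threshold."""
    counts = Counter(km for r in reads for km in kmers(r, k))
    kept = sorted(km for km, c in counts.items() if c >= min_count)
    return {km: idx for idx, km in enumerate(kept)}

def embed_read(read, vocab, k: int = 4):
    """Bag-of-k-mers vector: a crude stand-in for a learned read embedding (step ii)."""
    vec = [0] * len(vocab)
    for km in kmers(read, k):
        if km in vocab:
            vec[vocab[km]] += 1
    return vec

reads = ["ACGTACGT", "CGTACGTT"]
vocab = build_vocab(reads, k=4)
vector = embed_read("ACGTACGT", vocab, k=4)
```

In the paper's pipeline these count vectors would be replaced by dense embeddings learned from the k-mer vocabulary, and the per-read vectors pooled by the multiple instance learning classifier of step (iv).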
Procedia PDF Downloads 126
6246 The Sexual Knowledge, Attitudes and Behaviors of College Students from Only-Child Families: A National Survey in China
Authors: Jiashu Shen
Abstract:
This study aims at exploring the characteristics of sexual knowledge, attitudes, and behaviors of Chinese college students from 'one-child' families compared with those with siblings. It utilized data from the 'National College Student Survey on Sexual and Reproductive Health 2019'. Multiple logistic regression analyses were used to assess the association between being an 'only child' and sexual knowledge, sexual attitudes, sexual behaviors, and risky sexual behaviors (RSB), stratified by sex and home region, respectively. Compared with students with siblings, 'only-child' students scored higher in sex-related knowledge (only-child students: 4.49 ± 2.28, students with siblings: 3.60 ± 2.27). Stronger associations between only-child status and more liberal sexual attitudes were found in urban areas, including the approval of premarital sexual intercourse (OR: 1.51, 95% CI: 1.50-1.65) and multiple sexual partners (OR: 1.85, 95% CI: 1.72-1.99). For risky sexual behaviors, only-child students were more likely to use condoms at first sexual intercourse, especially male students (OR: 0.68, 95% CI: 0.58-0.80). Only-child students tend to have more sexual knowledge, more liberal sexual attitudes, and less risky sexual behavior. Further health policy and sex education should focus more on students with siblings. Keywords: attitudes and behaviors, only-child students, sexual knowledge, students with siblings
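The abstract above reports associations as odds ratios with 95% confidence intervals. As a minimal sketch of how an unadjusted odds ratio and its Wald interval are computed from a 2×2 table — not the survey's adjusted multiple logistic regression, and with purely hypothetical counts:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts, not taken from the survey:
or_, lo, hi = odds_ratio_ci(40, 60, 20, 80)
```

A full analysis like the paper's would instead fit a logistic regression so the OR can be adjusted for covariates and stratified by sex and home region.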
Procedia PDF Downloads 183
6245 Software Quality Assurance in 5G Technology-Redefining Wireless Communication: A Comprehensive Survey
Authors: Sumbal Riaz, Sardar-un-Nisa, Mehreen Sirshar
Abstract:
5G, the 5th generation of mobile phone and data communication standards, is the next edge of innovation for the whole mobile industry. 5G aims at a real wireless world system, providing wireless communication all over the world without limitations. 5G builds on many 4G technologies and is expected to hit the market in 2020. This research is a comprehensive survey of the quality parameters of 5G technology. 5G promises high performance, interoperability, easy roaming, fully converged services, a friendly interface and scalability at low cost. To meet future traffic demands, fifth generation wireless communication systems will include: i) higher densification of heterogeneous networks with massive deployment of small base stations supporting various Radio Access Technologies (RATs); ii) use of massive Multiple Input Multiple Output (MIMO) arrays; iii) use of millimetre wave spectrum, where wider frequency bands are available; iv) direct device-to-device (D2D) communication; v) simultaneous transmission and reception; and vi) cognitive radio technology. Keywords: 5G, 5th generation, innovation, standard, wireless communication
Procedia PDF Downloads 447
6244 Use of Numerical Tools Dedicated to Fire Safety Engineering for the Rolling Stock
Authors: Guillaume Craveur
Abstract:
This study shows the opportunity to use numerical tools dedicated to fire safety engineering for rolling stock. Indeed, some regulatory requirements can now be demonstrated by using numerical tools. The first part of this study presents the use of an evacuation modelling tool to satisfy the evacuation time criteria for rolling stock. The buildingEXODUS software is used to model and simulate the evacuation of rolling stock. Firstly, in order to demonstrate the reliability of this tool for calculating the complete evacuation time, a comparative study was carried out between a real test and simulations done with buildingEXODUS. Multiple simulations are performed to capture the stochastic variations in egress times. Then, a new study is done to calculate the complete evacuation time of a train with the same geometry but a different interior architecture. The second part of this study shows some applications of Computational Fluid Dynamics. This work presents a multi-scale validation of numerical simulations of standardized tests with the Fire Dynamics Simulator (FDS) software developed by the National Institute of Standards and Technology (NIST). This work first addresses the cone calorimeter test, described in the standard ISO 5660, in order to characterize the fire reaction of materials. The aim of this process is to readjust measurement results from the cone calorimeter test in order to create a data set usable at the seat scale. In the second step, the modelling concerns the fire seat test described in the standard EN 45545-2. The data set obtained thanks to the validation of the cone calorimeter test was used in the fire seat test. To conclude, in the third step, after checking the data obtained for the seat from the cone calorimeter test, a larger-scale simulation with a real part of a train is achieved. Keywords: fire safety engineering, numerical tools, rolling stock, multi-scales validation
Procedia PDF Downloads 303
6243 Optimization of Personnel Selection Problems via Unconstrained Geometric Programming
Authors: Vildan Kistik, Tuncay Can
Abstract:
From a business perspective, cost and profit are two key factors. The intent of most businesses is to minimize cost and maximize profit, so as to provide the greatest benefit to the firm. However, the physical system is very complicated because of technological developments, the rapid growth of competitive environments and similar factors. In such a system it is not easy to maximize profits or to minimize costs. Businesses must decide on the competence and suitability of the personnel to be recruited, taking many criteria into consideration. There are many criteria for determining the competence and suitability of a staff member. Factors such as level of education, experience, psychological and sociological position, and the human relationships present in the field are just some of the important factors in selecting staff for a firm. Personnel selection is a very important and costly process for businesses in today's competitive market. Although many mathematical methods have been developed for personnel selection, their use is unfortunately rarely encountered in real life. In this study, unlike other methods, an exponential programming model was established based on the probability that the selected personnel fail once they start work. With the necessary transformations, the problem was converted into an unconstrained geometric programming problem, and the personnel selection problem is approached with the geometric programming technique. Personnel selection scenarios for a classroom were established with the help of the normal distribution, and optimum solutions were obtained. In the most appropriate solutions, the personnel selection process for the classroom was achieved at minimum cost. Keywords: geometric programming, personnel selection, non-linear programming, operations research
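The key transformation behind unconstrained geometric programming is the substitution x = exp(y), which turns a posynomial objective into a convex function of y. The sketch below illustrates this on a toy posynomial — it is not the paper's personnel-selection model, and the solver (plain gradient descent) is only a stand-in for a proper GP routine:

```python
import math

def gp_minimize(c, a, y0=1.0, lr=0.1, iters=500):
    """Minimize the posynomial f(x) = sum(c_i * x**a_i), x > 0, via the
    standard GP substitution x = exp(y) (which makes f convex in y),
    followed by plain gradient descent on y."""
    y = y0
    for _ in range(iters):
        # d/dy of sum(c_i * exp(a_i * y))
        grad = sum(ci * ai * math.exp(ai * y) for ci, ai in zip(c, a))
        y -= lr * grad
    return math.exp(y)

# Toy problem: f(x) = x + 1/x, whose minimum is x* = 1 with f(x*) = 2.
x_star = gp_minimize(c=[1.0, 1.0], a=[1.0, -1.0])
f_star = x_star + 1.0 / x_star
```

A realistic personnel-selection model would have several variables and cost coefficients, but the same log transform applies term by term.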
Procedia PDF Downloads 272
6242 An Integrated Web-Based Workflow System for Design of Computational Pipelines in the Cloud
Authors: Shuen-Tai Wang, Yu-Ching Lin
Abstract:
With more and more workflow systems adopting the cloud as their execution environment, various challenges need to be addressed for the cloud to be utilized efficiently. This paper introduces a method for resource provisioning based on our previous research on dynamic allocation and its pipeline processes. We present an abstraction for workload scheduling in which independent tasks are scheduled among the available processors of a distributed computing system for optimization. We also propose an integrated web-based workflow designer that takes advantage of HTML5 technology and chains multiple tools together. To allow multiple pipelines to execute in parallel in the cloud, we developed a script translator and an execution engine for workflow management in the cloud. All information is known in advance by the workflow engine, and tasks are allocated according to the prior knowledge in the repository. This effort has the potential to provide support for process definition, workflow enactment and monitoring of workflow processes. Users benefit from a web-based system that allows the creation and execution of pipelines without scripting knowledge. Keywords: workflow systems, resource provisioning, workload scheduling, web-based, workflow engine
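Scheduling independent tasks among available processors, as described above, is commonly approximated with a greedy heuristic. The sketch below shows one such heuristic — Longest-Processing-Time-first with a min-heap of processor loads. It is an illustrative stand-in, not the paper's actual scheduler:

```python
import heapq

def lpt_schedule(tasks, n_procs):
    """Longest-Processing-Time-first: sort independent tasks by decreasing
    runtime and always hand the next task to the least-loaded processor."""
    heap = [(0.0, p) for p in range(n_procs)]   # (current load, processor id)
    heapq.heapify(heap)
    assignment = {p: [] for p in range(n_procs)}
    for t in sorted(tasks, reverse=True):
        load, p = heapq.heappop(heap)           # least-loaded processor
        assignment[p].append(t)
        heapq.heappush(heap, (load + t, p))
    makespan = max(load for load, _ in heap)
    return assignment, makespan

assignment, makespan = lpt_schedule([7, 5, 4, 3, 2], n_procs=2)
```

LPT is a classic 4/3-approximation for makespan on identical machines; a cloud workflow engine would layer data dependencies and provisioning costs on top of this core idea.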
Procedia PDF Downloads 160
6241 Developing HRCT Criterion to Predict the Risk of Pulmonary Tuberculosis
Authors: Vandna Raghuvanshi, Vikrant Thakur, Anupam Jhobta
Abstract:
Objective: To design an HRCT criterion to predict the risk of pulmonary tuberculosis. Material and methods: This was a prospective study of 69 patients with clinical suspicion of pulmonary tuberculosis. We studied their medical characteristics, numerous separate HRCT findings, and a combination of HRCT findings to predict the risk of PTB by utilizing univariate and multivariate analysis. Provisional HRCT diagnostic criteria were designed on the basis of these results to determine the risk of PTB, and these criteria were tested on our patients. Results: The results of HRCT chest were analyzed, and a rank from 1 to 4 was assigned according to the HRCT chest findings. Sensitivity, specificity, positive predictive value, and negative predictive value were calculated. Rank 1: Highly suspected PTB. Rank 2: Probable PTB. Rank 3: Nonspecific or difficult to differentiate from other diseases. Rank 4: Other suspected diseases. • Rank 1 (highly suspected TB) was present in 22 (31.9%) patients, all of whom were finally diagnosed with pulmonary tuberculosis. The sensitivity, specificity, and negative likelihood ratio for Rank 1 on HRCT chest were 53.6%, 100%, and 0.43, respectively. • Rank 2 (probable TB) was present in 13 patients, of whom 12 were tubercular and 1 was non-tubercular. • The sensitivity, specificity, positive likelihood ratio, and negative likelihood ratio of the combination of Rank 1 and Rank 2 were 82.9%, 96.4%, 23.22, and 0.18, respectively. • Rank 3 (non-specific) was present in 25 patients, of whom 7 were tubercular and 18 were non-tubercular. • When all these 3 ranks were considered together, the sensitivity approached 100%; however, the specificity fell to 35.7%. The positive likelihood ratio and negative likelihood ratio were 1.56 and 0, respectively. • Rank 4 (other specific findings) was assigned to 9 patients, all of whom were non-tubercular. 
Conclusion: HRCT is useful in selecting individuals at greater risk of pulmonary tuberculosis. Keywords: pulmonary, tuberculosis, multivariate, HRCT
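The headline figures above follow directly from the counts reported in the abstract. Taking Ranks 1–2 as test-positive gives 34 of 41 tubercular patients flagged and 1 of 28 non-tubercular patients flagged; a minimal sketch of the standard 2×2 calculations reproduces the reported 82.9% / 96.4% / 23.22 / 0.18:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity and likelihood ratios from a 2x2 table."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    plr = sens / (1 - spec)      # positive likelihood ratio
    nlr = (1 - sens) / spec      # negative likelihood ratio
    return sens, spec, plr, nlr

# Counts implied by the abstract for Rank 1 + Rank 2 taken as test-positive:
# 34 of 41 tubercular patients flagged, 1 of 28 non-tubercular flagged.
sens, spec, plr, nlr = diagnostic_metrics(tp=34, fp=1, fn=7, tn=27)
```

The same function applied to Rank 1 alone (tp=22, fp=0, fn=19, tn=28) gives the reported 53.6% sensitivity and 100% specificity (the positive likelihood ratio is then undefined because of the zero false-positive count).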
Procedia PDF Downloads 172
6240 Load Management Using Multiple Sequential Load Shaping Techniques
Authors: Amira M. Attia, Karim H. Youssef, Nabil H. Abbasi
Abstract:
Demand Side Management (DSM) is an essential characteristic of current and future smart grid systems. As one of the DSM functions, load management aims to control customers' total electric consumption and the utility's load factor by using various load shaping techniques. However, applying load shaping techniques such as load shifting, peak clipping, or strategic conservation individually does not provide the desired level of improvement in load factor increment and/or customer bill reduction. In this paper, two load shaping techniques are simulated as constrained optimization problems. The purpose is to apply a combined load shifting and strategic conservation model, and a combined load shifting and peak clipping model, each applied together at the same time. The problems are formulated and solved using the disciplined convex programming package CVX under MATLAB R2013b. Simulation results are evaluated and compared to identify the multi-technique model with the greatest impact on improving the load curve. Keywords: convex programming, demand side management, load shaping, multiple, building energy optimization
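The combination of peak clipping and load shifting described above can be illustrated without a convex solver: clip the hours above a power cap and shift the clipped energy into the least-loaded hours. This pure-Python sketch is only a greedy stand-in for the paper's CVX formulation, and the load profile and cap are hypothetical:

```python
def clip_and_shift(loads, cap):
    """Peak clipping with load shifting: energy above `cap` is removed from
    peak hours and redistributed into the least-loaded hours (never exceeding
    `cap`), so that total consumption is conserved and the peak is flattened."""
    excess = sum(max(l - cap, 0) for l in loads)
    shaped = [min(l, cap) for l in loads]
    # Fill valleys in ascending order of load (ties broken by hour index).
    for i in sorted(range(len(shaped)), key=lambda i: (shaped[i], i)):
        if excess <= 0:
            break
        moved = min(cap - shaped[i], excess)
        shaped[i] += moved
        excess -= moved
    return shaped

# Hypothetical 5-hour load profile (kW) with a 6 kW cap:
shaped = clip_and_shift([3, 2, 8, 9, 4], cap=6)
```

A convex formulation would instead minimize, say, peak load or load-curve variance subject to energy-conservation and comfort constraints, which is where CVX earns its keep.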
Procedia PDF Downloads 313
6239 Neuroanatomical Specificity in Reporting & Diagnosing Neurolinguistic Disorders: A Functional & Ethical Primer
Authors: Ruairi J. McMillan
Abstract:
Introduction: This critical analysis aims to ascertain how well neuroanatomical aetiologies are communicated within 20 case reports of aphasia. Neuroanatomical visualisations based on dissected brain specimens were produced and combined with white matter tract and vascular taxonomies of function in order to address the most consistently underreported features found within the aphasic case study reports. Together, these approaches are intended to integrate aphasiological knowledge from the past 20 years with aphasiological diagnostics, and to act as prototypal resources for both researchers and clinical professionals. The medico-legal precedent for aphasia diagnostics under Canadian, US and UK case law and the neuroimaging/neurological diagnostics relative to the functional capacity of aphasic patients are discussed in relation to the major findings of the literature analysis, neuroimaging protocols in clinical use today, and the neuroanatomical aetiologies of different aphasias. Basic Methodology: Literature searches of relevant scientific databases (e.g., Ovid MEDLINE) were carried out using search terms such as aphasia case study (year) & stroke induced aphasia case study. A series of 7 diagnostic reporting criteria were formulated, and the resulting case studies were scored out of 7 alongside clinical stroke criteria. In order to focus on the diagnostic assessment of the patient's condition, only the case report proper (not the discussion) was used to quantify results. Statistical testing established whether specific reporting criteria were associated with higher overall scores and potentially inferable increases in quality of reporting. Whether criteria scores were associated with an unclear/adjusted diagnosis was also tested, as well as the probability of a given criterion deviating from an expected estimate. 
Major Findings: The quantitative analysis of neuroanatomically driven diagnostics in case studies of aphasia revealed particularly low scores for the connection of neuroanatomical functions to aphasiological assessment (10%) and for the inclusion of white matter tracts within neuroimaging or assessment diagnostics (30%). Case studies which included clinical mention of white matter tracts within the report itself were distributed among the higher-scoring cases, as were case studies which (as clinically indicated) related the affected vascular region to the brain parenchyma of the language network. Concluding Statement: These findings indicate that certain neuroanatomical functions are integrated into the patient report less often than others, despite a precedent for well-integrated neuroanatomical aphasiology also being found among the case studies sampled, and despite these functions being clinically essential in diagnostic neuroimaging and aphasiological assessment. Ultimately, therefore, the integration and specificity of aetiological neuroanatomy may contribute positively to the capacity and autonomy of aphasic patients as well as their clinicians. The integration of a full aetiological neuroanatomy within the reporting of aphasias may improve patient outcomes and sustain autonomy in the event of medico-ethical investigation. Keywords: aphasia, language network, functional neuroanatomy, aphasiological diagnostics, medico-legal ethics
Procedia PDF Downloads 67
6238 Factors Associated with Acute Kidney Injury in Multiple Trauma Patients with Rhabdomyolysis
Authors: Yong Hwang, Kang Yeol Suh, Yundeok Jang, Tae Hoon Kim
Abstract:
Introduction: Rhabdomyolysis is a syndrome characterized by muscle necrosis and the release of intracellular muscle constituents into the circulation. Acute kidney injury (AKI) is a potential complication of severe rhabdomyolysis, and the prognosis is substantially worse if renal failure develops. We tried to identify the factors that were predictive of AKI in severe trauma patients with rhabdomyolysis. Methods: This retrospective study was conducted at the emergency department of a level I trauma center. Patients older than 18 years with acute multiple trauma whose initial creatine phosphokinase (CPK) levels were higher than 1000 IU were enrolled from Oct. 2012 to June 2016. We collected demographic data (age, gender, length of hospital stay, and patient outcome), laboratory data (ABGA, lactate, hemoglobin, hematocrit, platelet, LDH, myoglobin, liver enzymes, and BUN/Cr), and clinical data (injury mechanism, RTS, ISS, AIS, and TRISS). The data were compared and analyzed between the AKI and non-AKI groups. Statistical analyses were performed using IBM SPSS 20.0 Statistics for Windows. Results: Three hundred sixty-four patients were enrolled: ninety-six in the AKI group and two hundred sixty-eight in the non-AKI group. The base excess (HCO3), AST/ALT, LDH, and myoglobin in the AKI group were significantly higher than in the non-AKI group among the laboratory data (p ≤ 0.05). The Injury Severity Score (ISS), Revised Trauma Score (RTS), and Abbreviated Injury Scale 3 and 4 (AIS 3 and 4) showed significant results among the clinical data. CPK levels increased on the first and second days but decreased slightly from the third day in both groups. Seven patients in the AKI group received hemodialysis treatment despite the bleeding risk and survived. 
Conclusion: We recommend that HCO3, CPK, LDH, and myoglobin be checked, and that ISS, RTS, and AIS together with the injury mechanism be considered, at the early stage of treatment in the emergency department. Keywords: acute kidney injury, emergencies, multiple trauma, rhabdomyolysis
Procedia PDF Downloads 339
6237 Strain Based Failure Criterion for Composite Notched Laminates
Authors: Ibrahim A. Elsayed, Mohamed H. Elalfy, Mostafa M. Abdalla
Abstract:
A strain-based failure criterion for composite notched laminates is introduced, in which the most critical stress concentration factor for an anisotropic notched laminate can be related to the failure of the corresponding quasi-isotropic laminate and the anisotropy ratio of the laminate. The proposed criterion simplifies the design of composites to meet notched failure requirements by eliminating the need for detailed specification of the stacking sequence at the preliminary design stage. The designer will be able to design based on the stiffness of the laminate and then, at a later stage, select an appropriate stacking sequence to meet the stiffness requirements. The failure strains for the notched laminates are computed using the material's Omni-strain envelope. The Omni-strain envelope is the region of average strain within which the laminate is safe regardless of ply orientation. In this work, we use Hashin's failure criteria, and the strains around the hole are computed using Savin's analytic solution. A progressive damage analysis study has been conducted in which the failure loads for the notched laminates are computed using finite element analysis. The failure strains are computed and used to estimate the concentration factor. It is found that the correlation found using Savin's analytic solution predicts the same ratio of concentration factors between anisotropic and quasi-isotropic laminates as the more expensive progressive failure analysis. Keywords: anisotropy ratio, failure criteria, notched laminates, Omni-strain envelope, Savin's solution
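The dependence of the notch stress concentration on laminate anisotropy can be illustrated with Lekhnitskii's classical closed form for an infinite orthotropic plate with a circular hole, a standard companion result to Savin's solutions. This is an illustrative sketch, not the paper's exact formulation, and the material constants below are placeholders:

```python
import math

def orthotropic_kt(e1, e2, nu12, g12):
    """Stress concentration factor at a circular hole in an infinite
    orthotropic plate loaded along direction 1 (Lekhnitskii's closed form):
    Kt = 1 + sqrt(2*(sqrt(E1/E2) - nu12) + E1/G12)."""
    return 1.0 + math.sqrt(2.0 * (math.sqrt(e1 / e2) - nu12) + e1 / g12)

# Sanity check: an isotropic plate must recover the textbook value Kt = 3.
e, nu = 70e9, 0.3
kt_iso = orthotropic_kt(e, e, nu, e / (2.0 * (1.0 + nu)))
```

As the anisotropy ratio E1/E2 grows, Kt rises above the quasi-isotropic value of 3, which is the trend the proposed criterion exploits when relating anisotropic and quasi-isotropic concentration factors.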
Procedia PDF Downloads 116
6236 A Comprehensive Review of Artificial Intelligence Applications in Sustainable Building
Authors: Yazan Al-Kofahi, Jamal Alqawasmi.
Abstract:
In this study, a systematic literature review (SLR) was conducted with the main goal of assessing the existing literature on how artificial intelligence (AI), machine learning (ML) and deep learning (DL) models are used in sustainable architecture applications and issues, including thermal comfort satisfaction, energy efficiency, cost prediction and many other issues. The search strategy used several databases, including Scopus, Springer and Google Scholar. The inclusion criteria were applied using two search strings related to DL, ML and sustainable architecture. The timeframe for the inclusion of papers was open, although most of the included papers were published in the previous four years. As a filtering strategy, conference papers and books were excluded from the database search results. Using these inclusion and exclusion criteria, the search was conducted, and a sample of 59 papers was selected for the final analysis. The data extraction phase extracted the needed data from these papers, which were then analyzed and correlated. The results of this SLR showed that there are many applications of ML and DL in sustainable buildings and that this topic is currently trendy. It was found that most of the papers focused their discussions on addressing environmental sustainability issues and factors using machine learning predictive models, with a particular emphasis on the use of decision tree algorithms. Moreover, it was found that the random forest regressor demonstrates strong performance across all feature selection groups in terms of cost prediction of the building as a machine-learning predictive model. Keywords: machine learning, deep learning, artificial intelligence, sustainable building
Procedia PDF Downloads 67
6235 Quantum Graph Approach for Energy and Information Transfer through Networks of Cables
Authors: Mubarack Ahmed, Gabriele Gradoni, Stephen C. Creagh, Gregor Tanner
Abstract:
High-frequency cables commonly connect modern devices and sensors. Interestingly, the proportion of electric components is rising fast in an attempt to achieve lighter and greener devices. Modelling the propagation of signals through these cable networks in the presence of parameter uncertainty is a daunting task. In this work, we study the response of high-frequency cable networks using both Transmission Line (TL) and Quantum Graph (QG) theories. We have successfully compared the two theories in terms of reflection spectra using measurements on real, lossy cables. We have derived a generalisation of the vertex scattering matrix to include non-uniform networks – networks of cables with different characteristic impedances and propagation constants. The QG model implicitly takes into account the pseudo-chaotic behavior, at the vertices, of the propagating electric signal. We have successfully compared the asymptotic growth of eigenvalues of the Laplacian with the predictions of Weyl's law. We investigate the nearest-neighbour level-spacing distribution of the resonances and compare our results with the predictions of Random Matrix Theory (RMT). To achieve this, we compare our graphs with the generalisation of the Wigner distribution for open systems. The problem of scattering from networks of cables can also provide an analogue model for wireless communication in highly reverberant environments. In this context, we provide a preliminary analysis of the statistics of communication capacity across cable networks, whose eventual aim is to enable detailed laboratory testing of information transfer rates using software-defined radio. We specialise this analysis in particular to the case of MIMO (Multiple-Input Multiple-Output) protocols. We have successfully validated our QG model with both the TL model and laboratory measurements. The growth of eigenvalues compares well with Weyl's law, and the level-spacing distribution agrees well with RMT predictions. 
The results we achieved in the MIMO application compare favourably with the predictions of parallel ongoing research (sponsored by NEMF21). Keywords: eigenvalues, multiple-input multiple-output, quantum graph, random matrix theory, transmission line
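The RMT benchmark for the nearest-neighbour level-spacing comparison above is, in the simplest (closed, time-reversal-invariant) case, the GOE Wigner surmise P(s) = (π/2) s exp(−πs²/4), normalised so that both the total probability and the mean spacing equal 1. A minimal numerical check of those two properties:

```python
import math

def wigner_surmise(s):
    """GOE nearest-neighbour spacing density: P(s) = (pi/2) s exp(-pi s^2 / 4)."""
    return (math.pi / 2.0) * s * math.exp(-math.pi * s * s / 4.0)

def trapezoid(f, a, b, n=100000):
    """Plain composite trapezoidal rule."""
    h = (b - a) / n
    return h * (0.5 * f(a) + sum(f(a + i * h) for i in range(1, n)) + 0.5 * f(b))

# The tail beyond s = 12 is negligible (exp(-36*pi) or so).
norm = trapezoid(wigner_surmise, 0.0, 12.0)
mean = trapezoid(lambda s: s * wigner_surmise(s), 0.0, 12.0)
```

The paper's actual comparison uses a generalisation of this distribution for open systems; the closed-system surmise above is only the reference shape that measured spacings are plotted against.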
Procedia PDF Downloads 174
6234 Optical Variability of Faint Quasars
Authors: Kassa Endalamaw Rewnu
Abstract:
The variability properties of a quasar sample, spectroscopically complete to magnitude J = 22.0, are investigated on a time baseline of 2 years using three different photometric bands (U, J and F). The original sample was obtained using a combination of different selection criteria: colors, slitless spectroscopy and variability, based on a time baseline of 1 yr. The main goals of this work are two-fold: first, to derive the percentage of variable quasars on a relatively short time baseline; secondly, to search for new quasar candidates missed by the other selection criteria and thus to estimate the completeness of the spectroscopic sample. In order to achieve these goals, we have extracted all the candidate variable objects from a sample of about 1800 stellar or quasi-stellar objects with limiting magnitude J = 22.50 over an area of about 0.50 deg². We find that > 65% of all the objects selected as possible variables are either confirmed quasars or quasar candidates on the basis of their colors. This percentage increases even further if we exclude from our lists of variable candidates a number of objects equal to that expected on the basis of 'contamination' induced by our photometric errors. The percentage of variable quasars in the spectroscopic sample is also high, reaching about 50%. On the basis of these results, we can estimate that the incompleteness of the original spectroscopic sample is < 12%. We conclude that variability analysis of data with small photometric errors can be successfully used as an efficient and independent (or at least auxiliary) selection method in quasar surveys, even when the time baseline is relatively short. Finally, when corrected for the different intrinsic time lags corresponding to a fixed observed time baseline, our data do not show a statistically significant correlation between variability and either absolute luminosity or redshift. Keywords: nuclear activity, galaxies, active quasars, variability
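A variability selection of the kind described above boils down to flagging objects whose magnitude scatter across epochs exceeds what the photometric errors alone can explain. A toy version of such a criterion (the threshold and the light curves are hypothetical, not the survey's actual cut):

```python
import statistics

def is_variable(mags, sigma_phot, k=3.0):
    """Flag an object as a variability candidate when the rms scatter of its
    magnitudes across epochs exceeds k times the typical photometric error."""
    return statistics.pstdev(mags) > k * sigma_phot

# Hypothetical 4-epoch light curves (magnitudes), photometric error 0.02 mag:
steady = [21.50, 21.52, 21.49, 21.51]      # scatter consistent with errors
flickering = [21.2, 21.9, 21.1, 22.0]      # genuine variability candidate
```

As the abstract notes, objects flagged purely through error fluctuations ('contamination') must be budgeted for statistically, which is why the threshold k is tied to the measured photometric error distribution.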
Procedia PDF Downloads 83
6233 Contactless and Multiple Space Debris Removal by Micro to Nanno Satellites
Authors: Junichiro Kawaguchi
Abstract:
Space debris problems have emerged and now threaten the use of low Earth orbit owing to the large number of spacecraft there. For debris removal, a number of studies and patents have been proposed and published so far. They assume servicing spacecraft, robots to be built for accessing the target debris objects. The robots would need to be sophisticated enough to automatically access the debris, articulating their attitude and translational motion with respect to the debris. This paper presents the idea of using a torpedo-like, unsophisticated and disposable third body, in addition to the first body of the servicing robot and the second body of the target debris. The third body is launched from the first body at a distance farther than the size of the second body. This paper presents the method and the system by which the third body is launched from the first body. The third body carries both a net and an inflatable or extendible drag deceleration device and is built small and light. This method enables even a micro to nano satellite to perform contactless and multiple debris removal in a single flight. Keywords: ballute, debris removal, echo satellite, gossamer, gun-net, inflatable space structure, small satellite, un-cooperated target
Procedia PDF Downloads 124
6232 Performance Comparison and Visualization of COMSOL Multiphysics, Matlab, and Fortran for Predicting the Reservoir Pressure on Oil Production in a Multiple Leases Reservoir with Boundary Element Method
Authors: N. Alias, W. Z. W. Muhammad, M. N. M. Ibrahim, M. Mohamed, H. F. S. Saipol, U. N. Z. Ariffin, N. A. Zakaria, M. S. Z. Suardi
Abstract:
This paper presents a performance comparison of computational software for solving problems with the boundary element method (BEM). The BEM formulation is a numerical technique with high potential for solving the advanced mathematical models that predict the production of oil wells in arbitrarily shaped, multiple-lease reservoirs. The limitation of data validation for ensuring that a program meets the accuracy of the mathematical modeling is the research motivation of this paper. Based on this limitation, three steps are involved in validating the accuracy of the oil production simulation process. In the first step, the mathematical model, a partial differential equation (PDE) of Poisson-elliptic type, is identified to perform the BEM discretization. In the second step, the simulation of the 2D BEM discretization is implemented using the COMSOL Multiphysics and MATLAB programming languages. In the last step, the numerical performance indicators for both programming languages are analyzed using validation against a Fortran implementation. The performance comparisons of the numerical analysis are investigated in terms of percentage error, comparison graphs and 2D visualization of the pressure on oil production of multiple-lease reservoirs. According to the performance comparison, structured programming in Fortran is the alternative software for implementing an accurate numerical simulation of the BEM. In conclusion, the high-level language for numerical computation and the numerical performance evaluation prove that Fortran is well suited for capturing the visualization of the production of oil wells in arbitrarily shaped reservoirs. Keywords: performance comparison, 2D visualization, COMSOL Multiphysics, MATLAB, Fortran, modelling and simulation, boundary element method, reservoir pressure
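The cross-language validation described above ultimately reduces to comparing each solver's pressure field against a reference solution via percentage error. A minimal sketch of that comparison step (the node pressures below are hypothetical, not the paper's data, and Fortran is taken as the reference only because the paper validates against it):

```python
def percentage_errors(reference, candidate):
    """Pointwise percentage error of a candidate solution against a reference
    solution (e.g. MATLAB or COMSOL pressures vs. the Fortran values)."""
    return [abs(c - r) / abs(r) * 100.0 for r, c in zip(reference, candidate)]

# Hypothetical reservoir pressures at four boundary nodes:
fortran = [250.0, 240.0, 225.0, 210.0]
matlab = [250.5, 239.0, 225.0, 211.05]
errs = percentage_errors(fortran, matlab)
```

In practice the same comparison would be run over the full 2D pressure grid, with the maximum and mean percentage errors reported alongside the comparison graphs.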
Procedia PDF Downloads 492