Search results for: multiple rare variants
4698 Dietary Pattern and Risk of Breast Cancer Among Women: A Case-Control Study
Authors: Huma Naqeeb
Abstract:
Epidemiological studies have shown a robust link between breast cancer and dietary pattern. No previous study conducted in Pakistan has specifically focused on dietary patterns among women with breast cancer. This study aims to examine the association of breast cancer with dietary patterns among Pakistani women. This case-control research was carried out in multiple tertiary care facilities. Newly diagnosed primary breast cancer patients were recruited as cases (n = 408); age-matched controls (n = 408) were randomly selected from the general population. Data on the required parameters were systematically collected using subjective and objective tools. Factor analysis and Principal Component Analysis (PCA) techniques were used to extract the women’s dietary patterns. Four dietary patterns were identified based on an eigenvalue > 1: (i) veg-ovo-fish, (ii) meat-fat-sweet, (iii) mixed (milk and milk products, and gourd vegetables), and (iv) lentils-spices. Results of the multiple regressions are displayed as adjusted odds ratios (Adj. OR) with their respective 95% confidence intervals (95% CI). After adjusting for potential confounders, the veg-ovo-fish dietary pattern was found to be robustly associated with a lower risk of breast cancer among women (Adj. OR: 0.68, 95% CI: 0.46-0.99, p < 0.01). The study findings conclude that adherence to diets composed mainly of fresh vegetables and high-quality protein sources may contribute to lowering the risk of breast cancer among women.
Keywords: breast cancer, dietary pattern, women, principal component analysis
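The pattern-extraction step described above (PCA on food-group intakes, retaining components with eigenvalue > 1, the Kaiser criterion) can be sketched as follows. The synthetic intake matrix and the number of food groups are illustrative assumptions, not the study's questionnaire data.

```python
# Sketch of dietary-pattern extraction: PCA on a standardized food-group
# intake matrix, keeping components whose eigenvalue exceeds 1 (Kaiser
# criterion). Data are synthetic stand-ins for the study's intakes.
import numpy as np

def extract_patterns(intake, threshold=1.0):
    """Return eigenvalues and loadings of components with eigenvalue > threshold.

    intake: (n_subjects, n_food_groups) array of intakes.
    """
    corr = np.corrcoef(intake, rowvar=False)   # correlation matrix of food groups
    eigvals, eigvecs = np.linalg.eigh(corr)
    order = np.argsort(eigvals)[::-1]          # sort descending
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    keep = eigvals > threshold                 # Kaiser criterion
    return eigvals[keep], eigvecs[:, keep]

rng = np.random.default_rng(0)
# two latent "dietary patterns" driving six food groups (synthetic)
latent = rng.normal(size=(200, 2))
mixing = rng.normal(size=(2, 6))
intake = latent @ mixing + 0.3 * rng.normal(size=(200, 6))
eigvals, loadings = extract_patterns(intake)
print(len(eigvals))  # number of retained "dietary patterns"
```

Each retained column of `loadings` plays the role of one dietary pattern; its largest entries indicate which food groups define it.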
Procedia PDF Downloads 123
4697 Decision Tree Based Scheduling for Flexible Job Shops with Multiple Process Plans
Authors: H.-H. Doh, J.-M. Yu, Y.-J. Kwon, J.-H. Shin, H.-W. Kim, S.-H. Nam, D.-H. Lee
Abstract:
This paper suggests a decision tree based approach for flexible job shop scheduling with multiple process plans, i.e., each job can be processed through alternative operations, each of which can be processed on alternative machines. The main decision variables are: (a) selecting an operation/machine pair; and (b) sequencing the jobs assigned to each machine. As an extension of the priority scheduling approach that selects the best priority rule combination after many simulation runs, this study suggests a decision tree based approach in which a decision tree is used to select a priority rule combination adequate for a specific system state, so that the burden of developing simulation models and carrying out simulation runs can be eliminated. The decision tree based scheduling approach consists of construction and scheduling modules. In the construction module, a decision tree is constructed using a four-stage algorithm, and in the scheduling module, a priority rule combination is selected using the decision tree. To show the performance of the suggested approach, a case study was done on a flexible job shop with reconfigurable manufacturing cells and a conventional job shop, and the results are reported by comparing it with individual priority rule combinations for the objectives of minimizing total flow time and total tardiness.
Keywords: flexible job shop scheduling, decision tree, priority rules, case study
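The scheduling module above can be illustrated with a toy version: a decision tree maps the system state to a priority rule, which then sequences the jobs. The thresholds and the three rules here are invented for illustration and do not reproduce the paper's four-stage tree-construction algorithm.

```python
# Minimal sketch: a hand-built decision tree picks a dispatching rule
# from the system state; the rule then orders the jobs on a machine.
# Thresholds and rules are illustrative assumptions.

def choose_rule(state):
    """Tiny decision tree: map a system state to a priority rule."""
    if state["mean_slack"] < 5.0:      # due dates are tight
        return "EDD"                   # earliest due date
    if state["utilization"] > 0.8:     # machine congested, clear it quickly
        return "SPT"                   # shortest processing time
    return "FIFO"

def sequence(jobs, rule):
    """Order jobs (dicts with 'proc', 'due', 'arrival') by the chosen rule."""
    key = {"SPT": lambda j: j["proc"],
           "EDD": lambda j: j["due"],
           "FIFO": lambda j: j["arrival"]}[rule]
    return sorted(jobs, key=key)

jobs = [{"id": 1, "proc": 4, "due": 10, "arrival": 0},
        {"id": 2, "proc": 2, "due": 6,  "arrival": 1},
        {"id": 3, "proc": 5, "due": 7,  "arrival": 2}]
rule = choose_rule({"mean_slack": 3.0, "utilization": 0.6})
print(rule, [j["id"] for j in sequence(jobs, rule)])  # EDD -> jobs 2, 3, 1
```

In the paper the tree is learned from simulated system states rather than hand-written, but the lookup-then-dispatch structure is the same.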
Procedia PDF Downloads 358
4696 Environmental Performance Improvement of Additive Manufacturing Processes with Part Quality Point of View
Authors: Mazyar Yosofi, Olivier Kerbrat, Pascal Mognol
Abstract:
Life cycle assessment of additive manufacturing processes has evolved significantly in recent years. Many existing studies focused mainly on energy consumption. Nowadays, new methodologies for life cycle inventory acquisition have appeared in the literature and help manufacturers take into account all the input and output flows during the manufacturing step of a product's life cycle. The environmental analysis of the phenomena that occur during the manufacturing step of additive manufacturing processes is thus becoming well characterized, and it is now possible to count and measure accurately all the inventory data during the manufacturing step. Optimization of the environmental performance of processes can therefore be considered. Environmental performance improvement can be achieved by varying process parameters. However, many of these parameters (such as manufacturing speed, the power of the energy source, or the quantity of support material) directly affect the mechanical properties, surface finish, and dimensional accuracy of a functional part. This study aims to improve the environmental performance of an additive manufacturing process without deterioration of part quality. For that purpose, the authors have developed a generic method that has been applied to multiple parts made by additive manufacturing processes. First, a complete analysis of the process parameters is made in order to identify which parameters affect only the environmental performance of the process. Then, multiple parts are manufactured by varying the identified parameters. The aim of this second step is to find the optimum values of the parameters that significantly decrease the environmental impact of the process while keeping part quality as desired. Finally, a comparison between parts made with the initial and the modified parameters is made.
In this study, the major finding claimed by the authors is a reduction of the environmental impact of an additive manufacturing process while respecting three part quality criteria: mechanical properties, dimensional accuracy, and surface roughness. Now that additive manufacturing processes can be considered mature from a technical point of view, environmental improvement of these processes can be pursued while respecting part properties. The first part of this study presents the methodology applied to multiple academic parts; the validity of the methodology is then demonstrated on functional parts.
Keywords: additive manufacturing, environmental impact, environmental improvement, mechanical properties
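The two-step search described above (screen parameter settings, keep those meeting quality constraints, pick the lowest-impact one) can be sketched schematically. The impact proxy, the feasibility window, and the parameter values are invented stand-ins, not measured AM data.

```python
# Schematic of the method: enumerate candidate process-parameter settings,
# filter by part-quality constraints, then minimize environmental impact.
# impact() and quality_ok() are illustrative assumptions.

def impact(params):
    speed, power = params
    return power / speed               # toy proxy: energy per unit length

def quality_ok(params):
    speed, power = params
    # assumed feasibility window for roughness/accuracy/strength
    return 0.8 <= power / speed <= 1.5 and power <= 250

candidates = [(s, p) for s in (40, 50, 60) for p in (60, 75, 90)]
feasible = [c for c in candidates if quality_ok(c)]
best = min(feasible, key=impact)
print(best, round(impact(best), 3))  # (60, 60) 1.0
```

In the actual study the "filter" is physical testing of manufactured parts rather than an analytic constraint, but the selection logic is the same.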
Procedia PDF Downloads 288
4695 Medical Examiner Collection of Comprehensive, Objective Medical Evidence for Conducted Electrical Weapons and Their Temporal Relationship to Sudden Arrest
Authors: Michael Brave, Mark Kroll, Steven Karch, Charles Wetli, Michael Graham, Sebastian Kunz, Dorin Panescu
Abstract:
Background: Conducted electrical weapons (CEW) are now used in 107 countries and are a common law enforcement less-lethal force option in the United Kingdom (UK), United States of America (USA), Canada, Australia, New Zealand, and elsewhere. Use of these devices is rarely temporally associated with the occurrence of sudden arrest-related deaths (ARD). Because such deaths are uncommon, few Medical Examiners (MEs) ever encounter one, and even fewer offices have established comprehensive investigative protocols. Without sufficient scientific data, the role, if any, played by a CEW in a given case is largely supplanted by conjecture, often defaulting to a CEW-induced fatal cardiac arrhythmia. In addition to hindering the investigation of individual deaths, the lack of information also detrimentally affects the ability to define and evaluate the ARD cohort generally. More comprehensive, better information leads to better interpretation in individual cases and also to better research. The purpose of this presentation is to provide MEs with a comprehensive evidence-based checklist to assist in the assessment of CEW-ARD cases. Methods: PUBMED and sociology/criminology databases were queried to find all medical, scientific, electrical, modeling, engineering, and sociology/criminology peer-reviewed literature for mentions of CEW or synonymous terms. Each paper was then individually reviewed to identify those that discussed possible bioelectrical mechanisms relating CEW to ARD. A Naranjo-type pharmacovigilance algorithm was also employed, when relevant, to identify and quantify possible direct CEW electrical myocardial stimulation. Additionally, CEW operational manuals and training materials were reviewed to allow incorporation of CEW-specific technical parameters. Results: Total relevant PUBMED citations of CEWs numbered fewer than 250, and reports of death were extremely rare. Much relevant information was available from sociology/criminology databases.
Once the relevant published papers were identified and reviewed, we compiled an annotated checklist of data that we consider critical to a thorough CEW-involved ARD investigation. Conclusion: We have developed an evidence-based checklist that can be used by MEs and their staffs to assist them in identifying, collecting, documenting, maintaining, and objectively analyzing the role, if any, played by a CEW in any specific case of sudden death temporally associated with the use of a CEW. Even in cases where the collected information is deemed by the ME as insufficient for formulating an opinion or diagnosis to a reasonable degree of medical certainty, information collected as per the checklist will often be adequate for other stakeholders to use as a basis for informed decisions. Having reviewed the appropriate materials, in a significant number of cases careful examination of the heart and brain is likely adequate. Channelopathy testing should be considered in some cases; however, it may be considered cost-prohibitive (approx. $3,000). Law enforcement agencies may want to consider establishing a reserve fund to help manage such rare cases. The expense may forestall the enormous costs associated with incident-precipitated litigation.
Keywords: ARD, CEW, police, TASER
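The Naranjo-type scoring mentioned in the Methods reduces to summing per-question scores and mapping the total to a causality category. The question list here is abbreviated to a plain score list; the cut-offs follow the standard Naranjo scale (>= 9 definite, 5-8 probable, 1-4 possible, <= 0 doubtful).

```python
# Sketch of a Naranjo-style causality assessment: sum questionnaire
# scores (each typically +2, +1, 0, or -1) and map to a category.
# The input scores are illustrative, not an actual case assessment.

def naranjo_category(answers):
    """answers: list of per-question scores; returns a causality category."""
    total = sum(answers)
    if total >= 9:
        return "definite"
    if total >= 5:
        return "probable"
    if total >= 1:
        return "possible"
    return "doubtful"

print(naranjo_category([2, 2, 1, 0, 1]))  # total 6 -> probable
```

A checklist-driven investigation would feed each answered item's score into such a sum, making the causality judgment reproducible across offices.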
Procedia PDF Downloads 346
4694 Case Report of Angioedema after Application of Botulinum Toxin
Authors: Sokol Isaraj, Lorela Bendo
Abstract:
Botulinum toxin is the most commonly used treatment to reduce the appearance of dynamic facial wrinkles. It can smooth out wrinkles and restore a more youthful appearance. Although allergic reactions after botulinum toxin injection are rare, care should be taken by the physician to diagnose the condition and provide suitable treatment in time. The authors report a case of allergic reaction with angioedema to abobotulinumtoxin A. A 50-year-old woman complaining of dynamic wrinkles was injected in a private clinic with Dysport. After two weeks, she returned to the clinic for a touch-up session. Thirty minutes after the completion of the injections in the crow’s feet area, she described mild pain and warmth in the injected area, followed by angioedema. The symptoms could not be controlled by IM corticosteroids, and the patient was referred to a hospital center. After adequate systemic treatment for four days, the symptoms resolved. Despite the reported safety of abobotulinumtoxin A, this case warns practitioners of unpredictable adverse reactions, which require rapid recognition and intravenous support.
Keywords: botulinum toxin, side effects, angioedema, injections
Procedia PDF Downloads 105
4693 Multi-Criteria Decision Making Network Optimization for Green Supply Chains
Authors: Bandar A. Alkhayyal
Abstract:
Modern supply chains are typically linear, transforming virgin raw materials into products for end consumers, who discard them after use into landfills or incinerators. Nowadays, there are major efforts underway to create a circular economy to reduce non-renewable resource use and waste. One important aspect of these efforts is the development of Green Supply Chain (GSC) systems, which enable a reverse flow of used products from consumers back to manufacturers, where they can be refurbished or remanufactured, to both economic and environmental benefit. This paper develops novel multi-objective optimization models to inform GSC system design at multiple levels: (1) strategic planning of facility location and transportation logistics; (2) tactical planning of optimal pricing; and (3) policy planning to account for potential valuation of GSC emissions. First, physical linear programming was applied to evaluate GSC facility placement by determining the quantities of end-of-life products for transport from candidate collection centers to remanufacturing facilities while satisfying cost and capacity criteria. Second, disassembly and remanufacturing processes have received little attention in the industrial engineering and process cost modeling literature. The increasing scale of remanufacturing operations, worth nearly $50 billion annually in the United States alone, has made GSC pricing an important subject of research. A non-linear physical programming model for optimization of pricing policy for remanufactured products, which maximizes total profit and minimizes product recovery costs, was formulated and solved. Finally, a deterministic equilibrium model was used to determine the effects of internalizing a cost of GSC greenhouse gas (GHG) emissions into the optimization models.
Changes in optimal facility use, transportation logistics, and pricing/profit margins were all investigated against a variable cost of carbon, using a case study system built from actual data from sites in the Boston area. As carbon costs increase, the optimal GSC system undergoes several distinct shifts in topology as it seeks new cost-minimal configurations. A comprehensive quantitative evaluation of the model's performance was done using orthogonal arrays. Results were compared to top-down estimates from economic input-output life cycle assessment (EIO-LCA) models, to contrast remanufacturing GHG emission quantities with those from original equipment manufacturing operations. Introducing a carbon cost of $40/t CO2e increases modeled remanufacturing costs by 2.7% but also increases original equipment costs by 2.3%. The assembled work advances the theoretical modeling of optimal GSC systems and presents a rare case study of remanufactured appliances.
Keywords: circular economy, extended producer responsibility, greenhouse gas emissions, industrial ecology, low carbon logistics, green supply chains
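The strategic-planning step above (allocating end-of-life product flow from collection centers to remanufacturing facilities under cost and capacity constraints) can be illustrated on a tiny instance. This uses the textbook least-cost transportation heuristic, not the paper's physical linear programming formulation; supplies, capacities, and costs are invented.

```python
# Toy allocation of end-of-life product flow: greedily fill the cheapest
# remaining center->facility lane until supplies or capacities run out.
# A heuristic sketch, not the paper's optimization model.

def least_cost_allocation(supply, capacity, cost):
    """supply[i]: units at collection center i; capacity[j]: facility j limit;
    cost[i][j]: per-unit transport cost. Returns (flow dict, total cost)."""
    supply, capacity = supply[:], capacity[:]
    flow = {}
    lanes = sorted((cost[i][j], i, j)
                   for i in range(len(supply))
                   for j in range(len(capacity)))
    for c, i, j in lanes:
        qty = min(supply[i], capacity[j])
        if qty > 0:
            flow[(i, j)] = qty
            supply[i] -= qty
            capacity[j] -= qty
    total = sum(cost[i][j] * q for (i, j), q in flow.items())
    return flow, total

# 2 collection centers, 2 remanufacturing facilities
flow, total = least_cost_allocation(supply=[100, 80],
                                    capacity=[120, 60],
                                    cost=[[4, 6], [3, 5]])
print(flow, total)  # total transport cost 760
```

An exact linear program would guarantee optimality; the greedy version only approximates it, which is why the paper uses a programming formulation for the real network.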
Procedia PDF Downloads 160
4692 The Mediating Role of Store Personality in the Relationship Between Self-Congruity and Manifestations of Loyalty
Authors: María de los Ángeles Crespo López, Carmen García García
Abstract:
The highly competitive nature of today's globalised marketplace requires that brands and stores develop effective commercial strategies to ensure their economic survival. Maintaining the loyalty of existing customers is one key strategy that yields the best results. Although the relationship between consumers' self-congruity and their manifestations of loyalty towards a store has been investigated, the role of store personality in this relationship remains unclear. In this study, multiple parallel mediation analysis was used to examine the effect of Store Personality on the relationship between consumers' Self-Congruity and their Manifestations of Loyalty. For this purpose, 457 Spanish consumers of the Fnac store completed three self-report questionnaires assessing Store Personality, Self-Congruity, and Store Loyalty. The data were analyzed using the SPSS macro PROCESS. The results revealed that three dimensions of Store Personality, namely Exciting, Close, and Competent Store, positively and significantly mediated the relationship between Self-Congruity and Manifestations of Loyalty. The indirect effect of Competent Store was the greatest. This means that a consumer with higher levels of Self-Congruity with the store will exhibit more Manifestations of Loyalty when the store is perceived as Exciting, Close, or Competent. These findings suggest that more attention should be paid to the perceived personality of stores when developing effective marketing strategies to maintain or increase consumers' manifestations of loyalty towards stores.
Keywords: multiple parallel mediation, PROCESS, self-congruence, store loyalty, store personality
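The parallel mediation logic above (X -> M_k -> Y, estimated in PROCESS) reduces to two regressions per mediator: a path a_k from M_k ~ X and a path b_k from Y ~ X + M1 + M2, with the indirect effect through M_k equal to a_k * b_k. A minimal sketch on synthetic data (the coefficients and sample are invented, not the study's questionnaire scores):

```python
# Parallel mediation indirect effects via ordinary least squares.
# Synthetic data stand in for self-congruity (x), two store-personality
# mediators (m1, m2), and loyalty (y).
import numpy as np

rng = np.random.default_rng(1)
n = 500
x = rng.normal(size=n)                           # self-congruity
m1 = 0.6 * x + rng.normal(scale=0.5, size=n)     # e.g. "exciting" personality
m2 = 0.4 * x + rng.normal(scale=0.5, size=n)     # e.g. "competent" personality
y = 0.2 * x + 0.5 * m1 + 0.7 * m2 + rng.normal(scale=0.5, size=n)

def ols(target, predictors):
    """Least-squares coefficients with an intercept prepended."""
    X = np.column_stack([np.ones(len(target))] + list(predictors))
    return np.linalg.lstsq(X, target, rcond=None)[0]

a1 = ols(m1, [x])[1]                  # a-path for mediator 1
a2 = ols(m2, [x])[1]                  # a-path for mediator 2
b = ols(y, [x, m1, m2])               # [intercept, c', b1, b2]
indirect = {"M1": a1 * b[2], "M2": a2 * b[3]}
print({k: round(v, 2) for k, v in indirect.items()})
```

PROCESS additionally bootstraps confidence intervals for each a_k * b_k; the point estimates are exactly the products computed here.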
Procedia PDF Downloads 158
4691 Investigating the Potential for Introduction of Warm Mix Asphalt in Kuwait Using the Volcanic Ash
Authors: H. Al-Baghli, F. Al-Asfour
Abstract:
The current asphalt technology applied in Kuwait's road pavement infrastructure is hot mix asphalt (HMA), using both pen-grade and polymer-modified bitumens (PMBs), produced and compacted at high temperatures ranging from 150 to 180 °C. There are currently no specifications for warm or cold mix asphalts in the asphalt standards and specifications of Kuwait's Ministry of Public Works (MPW). The conventional HMA process is energy intensive and directly responsible for the emission of greenhouse gases and other environmental hazards into the atmosphere, leading to significant environmental impacts and raising health risks to laborers on site. Warm mix asphalt (WMA) technology, a sustainable alternative preferred in multiple countries, has many environmental advantages because it requires production temperatures 20 to 40 °C lower than HMA. The temperature reduction achieved by WMA originates from multiple technologies, including foaming and chemical or organic additives, that reduce bitumen viscosity and improve mix workability. This paper presents a literature review of WMA technologies and techniques, followed by an experimental study comparing WMA samples produced using a water-containing additive (foaming process) at different compaction temperatures against the volumetric properties of an HMA control mix designed in accordance with the new MPW specifications and guidelines.
Keywords: warm-mix asphalt, water-bearing additives, foaming-based process, chemical additives, organic additives
Procedia PDF Downloads 124
4690 The Relationship between Corporate Governance and Intellectual Capital Disclosure: Malaysian Evidence
Authors: Rabiaal Adawiyah Shazali, Corina Joseph
Abstract:
The disclosure of Intellectual Capital (IC) information is increasingly vital in today’s knowledge-based economy. Companies are advised by accounting bodies to enhance IC disclosure, which complements conventional financial disclosures. There are no accounting standards for Intellectual Capital Disclosure (ICD); therefore, such disclosure is entirely voluntary. Hence, this study aims to investigate the extent of ICD and to examine the relationship between corporate governance and ICD in Malaysia. This study employed content analysis of the 2012 annual reports of the top 100 public listed companies in Malaysia. The uniqueness of this study lies in its underpinning theory: it applies institutional isomorphism theory to explain the effect of corporate governance attributes on ICD. To achieve the stated objective, multiple regression analysis was employed. From the descriptive statistics, it was concluded that public listed companies in Malaysia have increased their awareness of the importance of ICD. Furthermore, results from the multiple regression analysis confirmed that corporate governance affects a company’s ICD, with the frequency of audit committee meetings and board size positively influencing the level of ICD. Findings from this study provide an incentive for companies in Malaysia to enhance their disclosure of IC. In addition, this study may assist Bursa Malaysia and other regulatory bodies in developing a proper guideline for the disclosure of IC.
Keywords: annual report, content analysis, corporate governance, intellectual capital disclosure
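The content-analysis step above amounts to scoring each annual report against a checklist of IC items and then regressing that score on governance attributes. A minimal sketch of the scoring, with invented checklist terms (the study's actual coding instrument is not reproduced here):

```python
# Unweighted disclosure index: fraction of checklist items mentioned in
# a report's text. Checklist terms are illustrative assumptions.

IC_ITEMS = ["patents", "brand", "employee training", "customer loyalty",
            "research and development"]

def icd_score(report_text):
    """Return the fraction of IC checklist items found in the text."""
    text = report_text.lower()
    hits = sum(1 for item in IC_ITEMS if item in text)
    return hits / len(IC_ITEMS)

sample = ("The group invests in employee training and holds several "
          "patents; brand value grew this year.")
print(icd_score(sample))  # 3 of 5 items mentioned -> 0.6
```

The resulting index per company becomes the dependent variable in the multiple regression against board size, audit committee meeting frequency, and the other governance attributes.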
Procedia PDF Downloads 215
4689 The Development of an Automated Computational Workflow to Prioritize Potential Resistance Variants in HIV Integrase Subtype C
Authors: Keaghan Brown
Abstract:
The prioritization of drug resistance mutations impacting protein folding or protein-drug and protein-DNA interactions within macromolecular systems is critical to the success of treatment regimens. With a continual increase in computational tools to assess these impacts, scalability and reproducibility have become essential components of computational analysis and experimental research. Here we introduce a bioinformatics pipeline that combines several structural analysis tools in a simplified workflow, optimized for the available computational hardware and software to automate the flow of data transformations. Utilizing pre-established software tools, it was possible to develop a pipeline with a set of pre-defined functions that automates mutation introduction into the HIV-1 integrase protein structure, calculates the gain and loss of polar interactions, and calculates the change in protein folding energy. Additionally, an automated molecular dynamics analysis was implemented, which reduces the constant need for user input and output management. The resulting pipeline, Automated Mutation Introduction and Analysis (AMIA), is an open-source set of scripts designed to introduce mutations and analyse their effects on the static protein structure as well as on the multi-conformational states sampled by molecular dynamics simulations. The workflow allows the user to visualize all outputs in a user-friendly manner, thereby enabling the prioritization of variant systems for experimental validation.
Keywords: automated workflow, variant prioritization, drug resistance, HIV Integrase
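The mutation-introduction step of such a pipeline can be sketched at the sequence level: apply a variant in "A123T"-style notation (wild-type residue, 1-based position, mutant residue) to a one-letter protein sequence, with a sanity check against the expected wild type. This is an illustrative stand-in; AMIA itself performs mutagenesis on the 3D structure, not the sequence string.

```python
# Apply a point mutation in standard "A123T" notation to a protein
# sequence, verifying the wild-type residue first. Sequence and mutation
# below are invented examples, not HIV-1 integrase data.
import re

def apply_mutation(seq, mutation):
    """Apply a point mutation like 'L4F': wild-type, 1-based position, mutant."""
    wt, pos, mt = re.fullmatch(r"([A-Z])(\d+)([A-Z])", mutation).groups()
    pos = int(pos)
    if seq[pos - 1] != wt:
        raise ValueError(f"expected {wt} at position {pos}, found {seq[pos - 1]}")
    return seq[:pos - 1] + mt + seq[pos:]

seq = "MKVLAGH"
print(apply_mutation(seq, "L4F"))  # -> MKVFAGH
```

Batching such edits over a list of screened variants, then handing each mutant to the interaction and folding-energy calculators, is exactly the automation the pipeline removes from the user's hands.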
Procedia PDF Downloads 77
4688 Learning Curve Effect on Materials Procurement Schedule of Multiple Sister Ships
Authors: Vijaya Dixit, Aasheesh Dixit
Abstract:
The shipbuilding industry operates in an Engineer Procure Construct (EPC) context. The product mix of a shipyard comprises various types of ships, such as bulk carriers, tankers, barges, coast guard vessels, and submarines. Each order is unique based on the type of ship and customized requirements, which are engineered into the product from the design stage. Thus, to execute every new project, a shipyard needs to upgrade its production expertise. As a result, over the long run, holistic learning occurs across different types of projects, which contributes to the knowledge base of the shipyard. Simultaneously, in the short term, during execution of a project comprising multiple sister ships, repetition of similar tasks leads to learning at the activity level. This research aims to capture both kinds of learning and incorporate the learning curve effect into project scheduling and materials procurement to improve project performance. Extant literature supports the existence of such learning in organizations. In shipbuilding, there are sequences of similar activities that are expected to exhibit learning curve behavior, for example, the nearly identical structural sub-blocks that are successively fabricated, erected, and outfitted with piping and electrical systems. A learning curve representation can model not only a decrease in the mean completion time of an activity but also a decrease in the uncertainty of activity duration. Sister ships have similar material requirements, and the same supplier base supplies materials for all the sister ships within a project. On one hand, this provides an opportunity to reduce transportation cost by batching the order quantities of multiple ships. On the other hand, it increases the inventory holding cost at the shipyard and the risk of obsolescence. Further, due to the learning curve effect, the production schedule of each subsequent ship is compressed.
Thus, the material requirement schedule of each subsequent ship differs from that of its predecessor. As more ships are constructed, compressed production schedules increase the possibility of batching the orders of sister ships. This work aims to integrate materials management with project scheduling of long-duration projects for the manufacture of multiple sister ships. It incorporates the learning curve effect on progressively compressing material requirement schedules and addresses the above trade-off between transportation cost and inventory holding and shortage costs while satisfying the budget constraints of various stages of the project. The activity durations and item lead times are not crisp and are available in the form of probabilistic distributions. A Stochastic Mixed Integer Programming (SMIP) model is formulated and solved using an evolutionary algorithm. Its output provides ordering dates and the degree of order batching for all item types. Sensitivity analysis determines the threshold number of sister ships required in a project to leverage the learning curve effect in materials management decisions. This analysis will help materials managers gain insights into when, and to what degree, it is beneficial to treat a multiple-ship project as an integrated one by batching the order quantities, and when, and to what degree, to practice distinct procurement for individual ships.
Keywords: learning curve, materials management, shipbuilding, sister ships
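The schedule compression described above is usually modeled with the classic Wright learning curve: with learning rate r, the duration of the n-th repetition is T_n = T_1 * n^(log2 r). The 85% rate and 100-day baseline below are illustrative numbers, not shipyard data.

```python
# Wright learning curve: each doubling of cumulative output multiplies
# the unit time by the learning rate r. Numbers are illustrative.
import math

def unit_time(t1, n, rate=0.85):
    """Duration of the n-th sister ship's repeated activity block."""
    b = math.log(rate, 2)          # negative exponent for rate < 1
    return t1 * n ** b

schedule = [round(unit_time(100, n)) for n in (1, 2, 3, 4)]
print(schedule)  # [100, 85, 77, 72]: each ship's block duration compresses
```

Feeding these compressed durations into the procurement model is what shifts each subsequent ship's material requirement dates earlier and opens the batching opportunity the SMIP model exploits.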
Procedia PDF Downloads 502
4687 Comparative Mesh Sensitivity Study of Different Reynolds Averaged Navier Stokes Turbulence Models in OpenFOAM
Authors: Zhuoneng Li, Zeeshan A. Rana, Karl W. Jenkins
Abstract:
In industry, validating a case often requires a multitude of simulations, and to keep this process affordable users tend to use coarser meshes. Therefore, it is imperative to establish the coarsest mesh that can be used while keeping reasonable simulation accuracy. To date, the two most reliable, affordable, and broadly used advanced simulation approaches are hybrid RANS (Reynolds Averaged Navier Stokes)/LES (Large Eddy Simulation) and wall-modelled LES. The potential of these two approaches will continue to be developed over the next decades, mainly because of the unaffordable computational cost of DNS (Direct Numerical Simulation). In wall-modelled LES, the turbulence model is applied as a sub-grid-scale model in the innermost layer near the wall. RANS turbulence models cover the entire boundary layer region in hybrid RANS/LES (Detached Eddy Simulation) and its variants; therefore, RANS still plays a very important role in state-of-the-art simulations. This research focuses on turbulence model mesh sensitivity analysis, in which various turbulence models such as S-A (Spalart-Allmaras), SSG (Speziale-Sarkar-Gatski), K-Omega transitional SST (Shear Stress Transport), K-kl-Omega, the γ-Reθ transitional model, and v2-f are evaluated within OpenFOAM. The simulations are conducted for fully developed turbulent flow over a flat plate, where the skin friction coefficient as well as velocity profiles are obtained and compared against experimental values and DNS results. A concrete conclusion is drawn to clarify the mesh sensitivity of the different turbulence models.
Keywords: mesh sensitivity, turbulence models, OpenFOAM, RANS
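The validation target named above, skin friction on a turbulent flat plate, has a well-known empirical correlation against which simulated profiles are commonly checked: Cf ≈ 0.0592 Re_x^(-1/5), valid roughly for 5e5 < Re_x < 1e7. A minimal evaluation of that correlation (the Reynolds numbers chosen are arbitrary examples):

```python
# Empirical local skin-friction coefficient for a turbulent flat plate,
# a standard yardstick for RANS mesh-sensitivity studies.

def cf_turbulent(re_x):
    """Cf = 0.0592 * Re_x^(-1/5), the classic power-law correlation."""
    return 0.0592 * re_x ** (-0.2)

for re_x in (1e6, 5e6):
    print(f"Re_x={re_x:.0e}  Cf={cf_turbulent(re_x):.5f}")
```

In a mesh study like the one described, the simulated Cf at each streamwise station is plotted against this curve for every turbulence model and mesh level, and the coarsest mesh whose deviation stays within tolerance is retained.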
Procedia PDF Downloads 261
4686 An Overbooking Model for Car Rental Service with Different Types of Cars
Authors: Naragain Phumchusri, Kittitach Pongpairoj
Abstract:
Overbooking is a very useful revenue management technique that can help reduce costs caused by either undersales or oversales. In this paper, we propose an overbooking model for two types of cars that minimizes the total cost for a car rental service. With two types of cars, there is an upgrade possibility from the lower type to the upper type. This makes the model more complex than the one-car-type scenario. We have found that convexity can be proved in this case. Sensitivity analysis of the parameters is conducted to observe the effects of relevant parameters on the optimal solution. Model simplification is proposed using multiple linear regression analysis, which can help estimate the optimal overbooking level using appropriate independent variables. The results show that the overbooking level from the multiple linear regression model is relatively close to the optimal solution (with an adjusted R-squared value of at least 72.8%). To evaluate the performance of the proposed model, the total cost was compared with the case where the decision maker uses a naïve method to set the overbooking level. It was found that the total cost from the optimal solution is only 0.5 to 1 percent (on average) lower than the cost from the regression model, while it is approximately 67% lower than the cost obtained by the naïve method. This indicates that our proposed simplification method using regression analysis performs effectively in estimating the overbooking level.
Keywords: overbooking, car rental industry, revenue management, stochastic model
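The undersale/oversale trade-off above can be sketched with a one-car-type toy model: each booking shows up with probability p; show-ups beyond capacity incur an oversale cost each, and empty slots an undersale cost each. Brute-forcing the booking limit with the lowest expected cost mimics (in miniature) what the paper's optimization does. All numbers are illustrative, and this sketch omits the two-type upgrade mechanism entirely.

```python
# One-type overbooking toy model: minimize expected undersale + oversale
# cost over the booking limit, with binomial show-ups. Parameters are
# invented for illustration.
from math import comb

def expected_cost(limit, capacity, p, c_over, c_under):
    """Expected cost when 'limit' bookings are accepted."""
    cost = 0.0
    for k in range(limit + 1):                  # k customers show up
        prob = comb(limit, k) * p**k * (1 - p)**(limit - k)
        cost += prob * (c_over * max(k - capacity, 0)
                        + c_under * max(capacity - k, 0))
    return cost

capacity, p = 20, 0.85
best = min(range(capacity, capacity + 11),
           key=lambda m: expected_cost(m, capacity, p, 150, 60))
print(best)  # booking limit with the lowest expected cost
```

The regression simplification in the paper then replaces this search with a fitted formula in the model parameters, trading a small cost increase for instant evaluation.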
Procedia PDF Downloads 172
4685 Classification of Land Cover Usage from Satellite Images Using Deep Learning Algorithms
Authors: Shaik Ayesha Fathima, Shaik Noor Jahan, Duvvada Rajeswara Rao
Abstract:
Earth's environment and its evolution can be seen through satellite images in near real-time. Through satellite imagery, remote sensing data provide crucial information that can be used for a variety of applications, including image fusion, change detection, land cover classification, agriculture, mining, disaster mitigation, and monitoring climate change. The objective of this project is to propose a method for classifying satellite images according to multiple predefined land cover classes. The proposed approach involves collecting data in image format, pre-processing it using data pre-processing techniques, feeding the processed data into the proposed algorithm, and analyzing the result. Some of the algorithms used in satellite imagery classification are U-Net, Random Forest, DeepLabv3, CNN, ANN, ResNet, etc. In this project, we use the DeepLabv3 (atrous convolution) algorithm for land cover classification. The dataset used is the DeepGlobe land cover classification dataset. DeepLabv3 is a semantic segmentation system that uses atrous convolution to capture multi-scale context by adopting multiple atrous rates in cascade or in parallel to determine the scale of segments.
Keywords: area calculation, atrous convolution, deep globe land cover classification, deepLabv3, land cover classification, resnet 50
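Atrous (dilated) convolution, the core operation of DeepLabv3 named above, simply spaces the kernel taps "rate" samples apart, enlarging the receptive field without adding parameters. A 1-D toy version makes the mechanism concrete (real DeepLabv3 applies this in 2-D inside a deep network):

```python
# 1-D atrous (dilated) convolution: kernel taps are 'rate' samples apart.
# Valid mode, no kernel flipping; a minimal illustration of the operator.

def atrous_conv1d(signal, kernel, rate):
    span = (len(kernel) - 1) * rate          # receptive field minus one
    return [sum(k * signal[i + j * rate] for j, k in enumerate(kernel))
            for i in range(len(signal) - span)]

x = [1, 2, 3, 4, 5, 6]
print(atrous_conv1d(x, [1, 1, 1], rate=1))  # [6, 9, 12, 15]
print(atrous_conv1d(x, [1, 1, 1], rate=2))  # [9, 12], taps two apart
```

Running the same 3-tap kernel at several rates in parallel, then merging the outputs, is exactly the multi-scale "atrous spatial pyramid" idea the abstract refers to.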
Procedia PDF Downloads 140
4684 The Impact of Missense Mutation in Phosphatidylinositol Glycan Class A Associated to Paroxysmal Nocturnal Hemoglobinuria and Multiple Congenital Anomalies-Hypotonia-Seizures Syndrome 2: A Computational Study
Authors: Ashish Kumar Agrahari, Amit Kumar
Abstract:
Paroxysmal nocturnal hemoglobinuria (PNH) is an acquired clonal blood disorder that manifests with hemolytic anemia, thrombosis, and peripheral blood cytopenias. The disease is caused by the deficiency of two glycosylphosphatidylinositol (GPI)-anchored proteins (CD55 and CD59) in the hemopoietic stem cells. The deficiency of GPI-anchored proteins has been associated with somatic mutations in phosphatidylinositol glycan class A (PIGA). However, the mutations that do not cause PNH are associated with multiple congenital anomalies-hypotonia-seizures syndrome 2 (MCAHS2). To the best of our knowledge, no computational study has been performed to explore the atomistic-level impact of PIGA mutations on the structure and dynamics of the protein. In the current work, we are mainly interested in gaining insights into the molecular mechanism of PIGA mutations. In the initial step, we screened the most pathogenic mutations from the pool of publicly available mutations. Further, to get a better understanding, the pathogenic mutations were mapped onto the modeled structure and subjected to 50 ns molecular dynamics simulation. Our computational study suggests that four mutations are highly likely to alter the structural conformation and stability of the PIGA protein, which illustrates their association with the PNH and MCAHS2 phenotypes.
Keywords: homology modeling, molecular dynamics simulation, missense mutations, PNH, MCAHS2, PIGA
Procedia PDF Downloads 145
4683 Semantic Differences between Bug Labeling of Different Repositories via Machine Learning
Authors: Pooja Khanal, Huaming Zhang
Abstract:
Labeling of issues/bugs, also known as bug classification, plays a vital role in software engineering. Some known labels/classes of bugs are 'User Interface', 'Security', and 'API'. Most of the time, when a reporter reports a bug, they try to assign some predefined label to it. Those issues are reported for a project, and each project is a repository in GitHub/GitLab, which contains multiple issues. There are many software project repositories, ranging from individual projects to commercial projects. The labels assigned in different repositories may depend on various factors, like human instinct, generalization of labels, the label assignment policy followed by the reporter, etc. While the reporter of an issue may instinctively give that issue a label, another person reporting the same issue may label it differently. This way, it is not known mathematically whether a label in one repository is similar to or different from the label in another repository. Hence, the primary goal of this research is to find the semantic differences between bug labeling of different repositories via machine learning. Independent optimal classifiers for individual repositories are built first using the text features from the reported issues. The optimal classifiers may include a combination of multiple classifiers stacked together. Then, those classifiers are used to cross-test other repositories, which allows the similarity of labels to be deduced mathematically. The product of this ongoing research includes a formalized open-source GitHub issues database that is used to deduce the similarity of the labels pertaining to the different repositories. Keywords: bug classification, bug labels, GitHub issues, semantic differences
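The cross-testing idea can be sketched in plain Python (the repositories, issue texts, and labels below are invented, and a simple bag-of-words centroid classifier stands in for the stacked optimal classifiers): a classifier trained on repo A is applied to repo B's issues, and the predicted labels indicate which repo-A label each repo-B label corresponds to.

```python
from collections import Counter

def train_centroids(examples):
    """Build one bag-of-words centroid per label from (text, label) pairs."""
    cents = {}
    for text, label in examples:
        cents.setdefault(label, Counter()).update(text.lower().split())
    return cents

def classify(text, cents):
    """Assign the label whose centroid shares the most word mass with `text`."""
    words = Counter(text.lower().split())
    def overlap(c):
        return sum(min(words[w], c[w]) for w in words)
    return max(cents, key=lambda lab: overlap(cents[lab]))

# Hypothetical repositories with differently named labels
repo_a = [("login page button misaligned", "ui"),
          ("css layout broken on page", "ui"),
          ("token leak allows access", "security"),
          ("password hash exposed", "security")]
repo_b = [("button overlaps page header", "interface"),
          ("auth token stolen via leak", "vulnerability")]

cents = train_centroids(repo_a)
# Cross-test: classify repo B's issues with repo A's classifier to see
# which repo-A label each repo-B label maps onto.
mapping = {lab: classify(text, cents) for text, lab in repo_b}
print(mapping)  # {'interface': 'ui', 'vulnerability': 'security'}
```

The mapping suggests 'interface' in repo B is semantically the same label as 'ui' in repo A; with real issue corpora, agreement rates across many issues would quantify that similarity.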
Procedia PDF Downloads 201
4682 Left Cornual Ectopic Pregnancy with Uterine Rupture - A Case Report
Authors: Vinodhini Elangovan, Jen Heng Pek
Abstract:
Background: An ectopic pregnancy is defined as any pregnancy implanted outside of the endometrial cavity. Cornual pregnancy, a rare variety of ectopic pregnancy, is seen in about 2-4% of ectopic pregnancies. It develops in the interstitial portion of the fallopian tube and invades through the uterine wall. This case describes a third-trimester cornual pregnancy that resulted in a uterine rupture. Case: A 38-year-old Chinese woman was brought to the Emergency Department (ED) as a standby case for hypotension. She was 30+6 weeks pregnant (gravida 3, para 1). Her past obstetric history included a live birth delivered via lower segment Caesarean section due to non-reassuring fetal status in 2002 and a miscarriage in 2012. She developed generalized abdominal pain. There was no per vaginal bleeding or leaking liquor. There was also no fever, nausea, vomiting, constipation, diarrhea, or urinary symptoms. On arrival in the ED, she was pale, diaphoretic, and lethargic. She had generalized tenderness with guarding and rebound over her abdomen. Point-of-care ultrasound was performed and showed a large amount of intra-abdominal free fluid, and the fetal heart rate was 170 beats per minute. The point-of-care hemoglobin was 7.1 g/dL, and lactate was 6.8 mmol/L. The patient's blood pressure dropped precipitously to 50/36 mmHg, and her heart rate went up to 141 beats per minute. The clinical impression was profound shock secondary to uterine rupture. Intra-operatively, there was extensive haemoperitoneum, and the fetus was seen in the abdominal cavity. The fetus was delivered immediately and handed to the neonatal team. On exploration of the uterus, the point of rupture was at the left cornual region, where the placenta was attached. Discussion: Cornual pregnancies are difficult to diagnose pre-operatively, with low ultrasonographic sensitivity, and hence are commonly confused with normal intrauterine pregnancies. They pose a higher risk of rupture and hemorrhage compared to other types of ectopic pregnancies. In very rare circumstances, interstitial pregnancies can result in a viable fetus. Uterine rupture resulting in hemorrhagic shock is a true obstetric emergency that can cause significant morbidity and mortality for the patient and the fetus, and early diagnosis in the emergency department is crucial. The patient in this case presented with the known risk factors of multiparity, advanced maternal age, and previous lower segment cesarean section, which increased the suspicion of uterine rupture. Ultrasound assessment may benefit any patient who presents with symptoms and a history of uterine surgery, to assess the possibility of uterine dehiscence or rupture. Management of a patient suspected of uterine rupture in the emergency department should be systematic and follow an ABC approach. Conclusion: This case demonstrates the importance for an emergency physician of maintaining suspicion for ectopic pregnancy even at advanced gestational ages. It also highlights that even though not all emergency physicians may be qualified to do a detailed pelvic ultrasound, it is essential for them to be competent with point-of-care ultrasound to make a prompt diagnosis of conditions such as uterine rupture. Keywords: cornual ectopic, ectopic pregnancy, emergency medicine, obstetric emergencies
Procedia PDF Downloads 129
4681 Hybrid Weighted Multiple Attribute Decision Making Handover Method for Heterogeneous Networks
Authors: Mohanad Alhabo, Li Zhang, Naveed Nawaz
Abstract:
Small cell deployment in 5G networks is a promising technology to enhance capacity and coverage. However, unplanned deployment may cause high interference levels and a high number of unnecessary handovers, which in turn result in an increase in signalling overhead. To guarantee service continuity, minimize unnecessary handovers, and reduce signalling overhead in heterogeneous networks, it is essential to properly model the handover decision problem. In this paper, we model the handover decision with a Multiple Attribute Decision Making (MADM) method, specifically the Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS), and propose a hybrid TOPSIS method to control handover in heterogeneous networks. The proposed method adopts a hybrid weighting, which is a combination of entropy and standard deviation. A hybrid weighting control parameter is introduced to balance the impact of the standard deviation and entropy weighting on the network selection process and the overall performance. Our proposed method shows better performance, in terms of the number of frequent handovers and the mean user throughput, compared to the existing methods. Keywords: handover, HetNets, interference, MADM, small cells, TOPSIS, weight
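A minimal NumPy sketch of the hybrid-weighted TOPSIS ranking described above (the three candidate cells, their SINR/load/delay attribute values, and the 0.5 balance parameter are hypothetical, not taken from the paper):

```python
import numpy as np

def entropy_weights(X):
    """Entropy weighting: attributes whose values are more dispersed get more weight."""
    P = X / X.sum(axis=0)
    E = -(P * np.log(P)).sum(axis=0) / np.log(X.shape[0])
    d = 1.0 - E
    return d / d.sum()

def std_weights(X):
    """Standard-deviation weighting."""
    s = X.std(axis=0)
    return s / s.sum()

def hybrid_topsis(X, benefit, alpha=0.5):
    """Rank alternatives (rows) by TOPSIS closeness; `alpha` is the
    control parameter balancing std-deviation against entropy weighting."""
    R = X / np.sqrt((X ** 2).sum(axis=0))              # vector normalisation
    w = alpha * std_weights(X) + (1 - alpha) * entropy_weights(X)
    V = R * w                                          # weighted normalised matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    worst = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - worst, axis=1)
    return d_neg / (d_pos + d_neg)                     # closeness to ideal

# Hypothetical candidate cells: columns = SINR (benefit), load (cost), delay (cost)
X = np.array([[20.0, 0.3, 10.0],
              [15.0, 0.1,  5.0],
              [25.0, 0.7, 20.0]])
scores = hybrid_topsis(X, benefit=np.array([True, False, False]))
print(scores, "-> best cell:", int(scores.argmax()))
```

With these toy numbers, the second cell (lowest load, shortest delay) ranks closest to the ideal solution; in the paper's setting the attributes would be the measured handover criteria of the candidate small cells and macrocell.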
Procedia PDF Downloads 149
4680 Joint Modeling of Longitudinal and Time-To-Event Data with Latent Variable
Authors: Xinyuan Y. Song, Kai Kang
Abstract:
Joint models for analyzing longitudinal and survival data are widely used to investigate the relationship between a failure time process and time-variant predictors. A common assumption in conventional joint models in the survival analysis literature is that all predictors are observable. However, this assumption may not always hold, because unobservable traits, namely latent variables, which are indirectly observable and should be measured through multiple observed variables, are commonly encountered in medical, behavioral, and financial research settings. In this study, a joint modeling approach to deal with this feature is proposed. The proposed model comprises three parts. The first part is a dynamic factor analysis model for characterizing latent variables through multiple observed indicators over time. The second part is a random coefficient trajectory model for describing the individual trajectories of latent variables. The third part is a proportional hazard model for examining the effects of time-invariant predictors and the longitudinal trajectories of time-variant latent risk factors on the hazards of interest. A Bayesian approach coupled with a Markov chain Monte Carlo algorithm is developed to perform statistical inference. An application of the proposed joint model to a study from the Alzheimer's Disease Neuroimaging Initiative is presented. Keywords: Bayesian analysis, joint model, longitudinal data, time-to-event data
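As an illustration only, the data-generating process behind the model's third part can be sketched in plain Python: a latent trajectory eta(t) = b0 + b1*t drives a proportional hazard h(t) = h0*exp(gamma*eta(t)). All parameter values are made up, and this is a toy simulation, not the authors' Bayesian MCMC estimation procedure.

```python
import math
import random

def simulate_subject(b0, b1, gamma, h0=0.05, dt=0.1, t_max=50.0, rng=random):
    """Simulate one subject: the latent trajectory eta(t) = b0 + b1*t
    drives a proportional hazard h(t) = h0 * exp(gamma * eta(t)).
    Returns (time, event_observed); censoring occurs at t_max."""
    t = 0.0
    while t < t_max:
        eta = b0 + b1 * t                            # latent risk factor
        if rng.random() < h0 * math.exp(gamma * eta) * dt:
            return t, True                           # event occurred
        t += dt
    return t_max, False                              # administratively censored

rng = random.Random(0)
# Subjects whose latent trait starts higher and grows should fail earlier
high_risk = [simulate_subject(1.0, 0.1, 0.8, rng=rng)[0] for _ in range(300)]
low_risk = [simulate_subject(0.0, 0.0, 0.8, rng=rng)[0] for _ in range(300)]
print(sum(high_risk) / 300, "<", sum(low_risk) / 300)
```

In the proposed model the latent trajectory is not simulated but inferred from the multiple observed indicators of the first part, with the random coefficients b0 and b1 estimated per subject.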
Procedia PDF Downloads 144
4679 Zinc Oxide Nanotubes Modified by SiO2 as a Recyclable Catalyst for the Synthesis of 2,3-Dihydroquinazolin-4(1H)-Ones
Authors: Rakhshan Hakimelahi
Abstract:
In recent years, zinc oxide nanotubes have attracted much attention. The direct use of zinc oxide nanotubes modified by SiO2 as recoverable catalysts for organic reactions is very rare. The catalysts were characterized by XRD. The average particle size of the ZnO catalysts is 57 nm, and there are high-density defects on the nanotube surfaces. A simple and efficient method is described for the synthesis of quinazoline derivatives from the condensation of isatoic anhydride and an aromatic aldehyde with ammonium acetate in the presence of a catalytic amount of zinc oxide nanotubes modified by SiO2. The higher catalytic activity of the zinc oxide nanotubes modified by SiO2 is attributed to the combined effect of the small particle size and the high density of surface defects. This practical and simple method gave excellent yields of the 2,3-dihydroquinazolin-4(1H)-one derivatives under mild conditions and within short times. Keywords: 2,3-dihydroquinazolin-4(1H)-one derivatives, reusable catalyst, SiO2, zinc oxide nanotubes
Procedia PDF Downloads 372
4678 Effect of Climate Variability on Honeybee's Production in Ondo State, Nigeria
Authors: Justin Orimisan Ijigbade
Abstract:
The study was conducted to assess the effect of climate variability on honeybee production in Ondo State, Nigeria. A multistage sampling technique was employed to collect the data from 60 beekeepers across six Local Government Areas in Ondo State. The data collected were subjected to descriptive statistics and multiple regression analysis. The results showed that 93.33% of the respondents were male, with 80% above 40 years of age. The majority of the respondents (96.67%) had formal education, and 90% produced honey for commercial purposes. The results revealed that 90% of the respondents admitted that low temperature, as a result of long periods of rainfall, affected the foraging efficiency of the worker bees; 73.33% claimed that long periods of low humidity resulted in a low level of nectar flow, while 70% submitted that high temperature resulted in an improper composition of workers, drones, and the queen in the hive colony. The multiple regression results showed that beekeepers' experience, educational level, access to climate information, temperature, and rainfall were the main factors affecting honeybee production in the study area. Therefore, beekeepers should be given more education on climate variability and its adaptive strategies towards ensuring better honeybee production in the study area. Keywords: climate variability, honeybee production, humidity, rainfall, temperature
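The multiple-regression step can be illustrated with a small least-squares sketch (NumPy; the rows and factor names mirror the variables found significant, but the numbers are hypothetical and are generated from known coefficients so that the fit recovers them exactly):

```python
import numpy as np

# Hypothetical beekeepers: [experience_yrs, education_yrs, mean_temp_C, rainfall_mm]
X = np.array([[ 5.0, 10.0, 26.0, 120.0],
              [12.0,  6.0, 28.0, 180.0],
              [ 8.0, 12.0, 25.0, 100.0],
              [20.0,  9.0, 30.0, 220.0],
              [ 3.0, 14.0, 24.0,  90.0],
              [15.0,  8.0, 29.0, 200.0]])

# Honey yield (kg) generated from known coefficients so the fit is exact
true_beta = np.array([10.0, 0.8, 0.5, -0.3, -0.02])   # intercept + 4 slopes
A = np.column_stack([np.ones(len(X)), X])             # add intercept column
y = A @ true_beta

beta, *_ = np.linalg.lstsq(A, y, rcond=None)          # ordinary least squares
print(np.round(beta, 3))                              # recovers true_beta
```

With survey data the fit would not be exact, and the signs and significance of the slopes would carry the substantive conclusions (for instance, a negative rainfall coefficient matching the reported foraging effect).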
Procedia PDF Downloads 272
4677 Antidiabetic Potential of Pseuduvaria monticola Bark Extract on the Pancreatic Cells, NIT-1 and Type 2 Diabetic Rat Model
Authors: Hairin Taha, Aditya Arya, M. A. Hapipah, A. M. Mustafa
Abstract:
Plants have been an important source of medicine since ancient times. Pseuduvaria monticola is a rare montane forest species from the Annonaceae family. Traditionally, the plant was used to cure symptoms of fever, inflammation, and stomach-ache, and also to reduce elevated levels of blood glucose. In this study, we evaluated the antidiabetic potential of the Pseuduvaria monticola bark methanolic extract (PMm) in several in vitro cell-based assays, followed by an in vivo study. Results from the in vitro models showed that PMm upregulated glucose uptake and insulin secretion in mouse pancreatic β-cells. The in vivo study demonstrated that PMm down-regulated hyperglycaemia, oxidative stress, and elevated levels of pro-inflammatory cytokines in type 2 diabetic rat models. Altogether, the study revealed that Pseuduvaria monticola might be a potential candidate for the management of type 2 diabetes and its related complications. Keywords: type 2 diabetes, Pseuduvaria monticola, insulin secretion, glucose uptake
Procedia PDF Downloads 439
4676 A Framework for Designing Complex Product-Service Systems with a Multi-Domain Matrix
Authors: Yoonjung An, Yongtae Park
Abstract:
Offering a Product-Service System (PSS) is a well-accepted strategy that companies may adopt to provide a set of systemic solutions to customers. PSSs were initially provided in a simple form but now take diversified and complex forms involving multiple services, products, and technologies. With the growing interest in the PSS, frameworks for PSS development have been introduced by many researchers. However, most of the existing frameworks fail to examine the various relations existing in a complex PSS. Since designing a complex PSS involves the full integration of multiple products and services, it is essential to identify not only product-service relations but also product-product/service-service relations. It is equally important to specify how they are related for a better understanding of the system. Moreover, as customers tend to view their purchase from a more holistic perspective, a PSS should be developed based on the whole system's requirements, rather than focusing only on the product requirements or service requirements. Thus, we propose a framework to develop a complex PSS that is fully coordinated with the requirements of both worlds. Specifically, our approach adopts a multi-domain matrix (MDM). An MDM identifies not only inter-domain relations but also intra-domain relations, so it helps to design a PSS that includes highly desired and closely related core functions/features. In addition, the various dependency types and rating schemes proposed in our approach would help the integration process. Keywords: inter-domain relations, intra-domain relations, multi-domain matrix, product-service system design
Procedia PDF Downloads 641
4675 Idiopathic Gingival Fibromatosis
Authors: Bandana Koirala, Shivalal Sharma
Abstract:
Introduction: Gingival enlargements are quite common and may be inflammatory, non-inflammatory, or a combination of both. Idiopathic gingival enlargement is a rare condition, of undetermined etiology, with a proliferative fibrous lesion of the gingival tissue that causes esthetic and functional problems. Case Description: This case report addresses the diagnosis and treatment of a case of idiopathic gingival enlargement in a 9-year-old male patient. The patient presented with a generalized diffuse gingival enlargement involving the entire maxillary and mandibular arches, with extension over the occlusal, buccal, lingual, and palatal surfaces and with just parts of the occlusal surfaces of a few upper and lower molars visible, resulting in an open mouth and difficulty in mastication and speech. The biopsy report confirmed the diagnosis of fibromatosis gingivae. Gingivectomy was carried out in all four quadrants using an external bevel incision. Conclusion: Though total esthetics could not be restored due to the unusual bony enlargement, the general appearance improved satisfactorily. Treatment by complete excision, however, improved masticatory competence to a great extent. Keywords: idiopathic gingival fibromatosis, gingival enlargement, gingivectomy, medical and health sciences
Procedia PDF Downloads 329
4674 Automatic Identification and Monitoring of Wildlife via Computer Vision and IoT
Authors: Bilal Arshad, Johan Barthelemy, Elliott Pilton, Pascal Perez
Abstract:
Getting reliable, informative, and up-to-date information about the location, mobility, and behavioural patterns of animals will enhance our ability to research and preserve biodiversity. The fusion of infra-red sensors and camera traps offers an inexpensive way to collect wildlife data in the form of images. However, extracting useful data from these images, such as the identification and counting of animals, remains a manual, time-consuming, and costly process. In this paper, we demonstrate that such information can be automatically retrieved by using state-of-the-art deep learning methods. Another major challenge that ecologists face is the recounting of a single animal multiple times when it reappears in other images taken by the same or other camera traps. Nonetheless, such information can be extremely useful for tracking wildlife and understanding its behaviour. To tackle the multiple-count problem, we have designed a meshed network of camera traps, so they can share the captured images along with timestamps, cumulative counts, and dimensions of the animal. The proposed method leverages edge computing to support real-time tracking and monitoring of wildlife. This method has been validated in the field and can be easily extended to other applications focusing on wildlife monitoring and management, where the traditional way of monitoring is expensive and time-consuming. Keywords: computer vision, ecology, internet of things, invasive species management, wildlife management
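The recount-avoidance idea behind the meshed camera network can be sketched in plain Python (the species, timestamps, and 10-minute merge window are hypothetical; a real deployment would also compare the shared animal dimensions): sightings of the same species falling within the window are merged into a single individual, regardless of which camera captured them.

```python
from datetime import datetime, timedelta

def count_individuals(sightings, window=timedelta(minutes=10)):
    """Count individuals per species, merging sightings that fall within
    `window` of the previous sighting of that species (across all cameras)."""
    counts = {}
    last_seen = {}
    for s in sorted(sightings, key=lambda s: s["time"]):
        sp = s["species"]
        if sp not in last_seen or s["time"] - last_seen[sp] > window:
            counts[sp] = counts.get(sp, 0) + 1   # a new individual
        last_seen[sp] = s["time"]                # extend the merge window
    return counts

t0 = datetime(2024, 6, 1, 8, 0)
sightings = [
    {"camera": "A", "species": "deer", "time": t0},
    {"camera": "B", "species": "deer", "time": t0 + timedelta(minutes=4)},  # same deer
    {"camera": "A", "species": "fox",  "time": t0 + timedelta(minutes=5)},
    {"camera": "C", "species": "deer", "time": t0 + timedelta(hours=2)},    # new deer
]
print(count_individuals(sightings))  # {'deer': 2, 'fox': 1}
```

Because the cameras in the mesh share timestamps and counts, this deduplication can run at the edge without sending every image to a central server.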
Procedia PDF Downloads 138
4673 Avian Bioecological Status in Batna Wetlands (NE, Algeria)
Authors: Marref C., Bezzalla A., Marref S., Houhamdi M.
Abstract:
Wetlands represent ecosystems of great importance through their ecological and socio-economic functions and biological diversity, even though they are among those most threatened by anthropization. This study aimed to contribute to an inventory of bird species in Batna, Algeria, from 2020 to 2022. Counts were carried out from 8:00 to 19:00 using a telescope (20 × 60) and a pair of binoculars (10 × 50), employing absolute and relative methods. Birds were categorized by phenology, habitat, biogeography, and diet. A total of 80 species in 58 genera and 19 families were observed. Migratory birds were dominant (38%) phenologically, and birds of Palearctic origin dominated (26.25%) biogeographically. Invertivorous and carnivorous species were the most common (35%). Ecologically, the majority of species were waterbirds (73.75%), which are protected in Algeria. This study highlights the need for the preservation of ecosystem components and the enhancement of the biological resources of protected, rare, and key species. We observed 43,797 individuals of Marmaronetta angustirostris during the study and recorded the nesting of Podiceps nigricollis, Porphyrio porphyrio, and Tadorna ferruginea. For these reasons, we recommend proposing the area as a Ramsar site. Keywords: biodiversity, avifauna, ecological status, wetland, Algeria
Procedia PDF Downloads 69
4672 Object-Scene: Deep Convolutional Representation for Scene Classification
Authors: Yanjun Chen, Chuanping Hu, Jie Shao, Lin Mei, Chongyang Zhang
Abstract:
Traditional image classification is based on encoding schemes (e.g., Fisher Vector, Vector of Locally Aggregated Descriptors) with low-level image features (e.g., SIFT, HOG). Compared to these low-level local features, deep convolutional features obtained at the mid-level layers of convolutional neural networks (CNN) carry richer information but lack geometric invariance. For scene classification, scenes contain scattered objects of different sizes, categories, layouts, and numbers. It is crucial to find the distinctive objects in a scene as well as their co-occurrence relationships. In this paper, we propose a method that takes advantage of both deep convolutional features and the traditional encoding scheme while taking object-centric and scene-centric information into consideration. First, to exploit the object-centric and scene-centric information, two CNNs trained separately on the ImageNet and Places datasets are used as pre-trained models to extract deep convolutional features at multiple scales. This produces dense local activations. By analyzing the performance of different CNNs at multiple scales, it is found that each CNN works better in different scale ranges. A scale-wise CNN adaptation is reasonable since objects in a scene appear at their own specific scales. Second, a Fisher kernel is applied to aggregate a global representation at each scale, and the per-scale representations are then merged into a single vector by a post-processing method called scale-wise normalization. The essence of the Fisher Vector lies in the accumulation of the first- and second-order differences. Hence, scale-wise normalization followed by average pooling balances the influence of each scale, since different amounts of features are extracted at each scale. Third, the Fisher Vector representation based on the deep convolutional features is fed to a linear Support Vector Machine, which is a simple yet efficient way to classify the scene categories.
Experimental results show that scale-specific feature extraction and normalization with CNNs trained on object-centric and scene-centric datasets boost the results from 74.03% up to 79.43% on MIT Indoor67 when only two scales are used (compared to results at a single scale). The result is comparable to state-of-the-art performance, which shows that the representation can be applied to other visual recognition tasks. Keywords: deep convolutional features, Fisher Vector, multiple scales, scale-specific normalization
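The scale-wise normalization step can be sketched with NumPy (the two descriptor vectors are toy stand-ins for per-scale Fisher Vectors): L2-normalising each scale's aggregated descriptor before average pooling keeps a scale with larger activations from dominating the merged representation.

```python
import numpy as np

def scale_wise_merge(per_scale_descriptors):
    """L2-normalise each scale's descriptor, then average-pool into one vector."""
    normed = [v / np.linalg.norm(v) for v in per_scale_descriptors]
    return np.mean(normed, axis=0)

fine   = np.array([3.0, 4.0])      # toy small-scale Fisher Vector
coarse = np.array([400.0, 300.0])  # toy large-scale vector, much bigger magnitude
merged = scale_wise_merge([fine, coarse])
print(merged)  # [0.7 0.7] -- both scales contribute equally
```

A plain average of the raw vectors would be dominated by the large-scale descriptor (mean of the raw pair is about [201.5, 152]); normalising first gives each scale equal say in the merged representation.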
Procedia PDF Downloads 331
4671 Precious and Rare Metals in Overburden Carbonaceous Rocks: Methods of Extraction
Authors: Tatyana Alexandrova, Alexandr Alexandrov, Nadezhda Nikolaeva
Abstract:
The development of complex mineral resources is an urgent priority aimed at realizing processes for their ecologically safe exploitation; one of its components is revealing the influence of the forms of element compounds in raw materials and in the processing products. In view of the depletion of precious metal reserves at traditional deposits, large open-cast deposits localized in black shale strata have begun to play the leading role in the twenty-first century. Carbonaceous (black) shales carry a heightened metallogenic potential. Black shales with a high content of carbon are widely distributed within the Bureinsky massif. According to Academician Hanchuk's data, black shales of the Sutirskaya series generally contain PGEs in native form. The presence in crude ore of gold and PGE compounds with a high sorption affinity for carbonaceous matter decreases the extraction of valuable components because of their sorption onto dispersed carbonaceous matter. Keywords: carbonaceous rocks, bitumens, precious metals, concentration, extraction
Procedia PDF Downloads 246
4670 Targeting Calcium Dysregulation for Treatment of Dementia in Alzheimer's Disease
Authors: Huafeng Wei
Abstract:
Alzheimer's disease (AD) is the number one cause of dementia internationally, and it lacks effective treatments. Increasing evidence suggests that disruption of intracellular calcium homeostasis, primarily pathological elevation of cytosolic and mitochondrial calcium concentrations but reduction of endoplasmic reticulum (ER) calcium concentrations, plays critical upstream roles in multiple pathologies and the associated neurodegeneration, impaired neurogenesis, and synapse and cognitive dysfunction in various AD preclinical studies. The last Food and Drug Administration (FDA)-approved drug for AD dementia treatment, memantine, exerts its therapeutic effects by ameliorating N-methyl-D-aspartate (NMDA) glutamate receptor overactivation and the subsequent calcium dysregulation. More research is needed to develop other drugs targeting calcium dysregulation at multiple pharmacological sites of action for future effective AD dementia treatment. In particular, calcium channel blockers used for the treatment of hypertension, and dantrolene, used for the treatment of muscle spasm and malignant hyperthermia, can be repurposed for this purpose. In our own research work, intranasal administration of dantrolene significantly increased its brain concentrations and durations, rendering it a more effective therapeutic drug with fewer side effects for chronic AD dementia treatment. This review summarizes the progress of various studies repurposing drugs targeting calcium dysregulation for future effective AD dementia treatment as potentially disease-modifying drugs. Keywords: Alzheimer, calcium, cognitive dysfunction, dementia, neurodegeneration, neurogenesis
Procedia PDF Downloads 182
4669 Deep Reinforcement Learning Approach for Optimal Control of Industrial Smart Grids
Authors: Niklas Panten, Eberhard Abele
Abstract:
This paper presents a novel approach for real-time and near-optimal control of industrial smart grids by deep reinforcement learning (DRL). To achieve highly energy-efficient factory systems, the energetic linkage of machines, technical building equipment, and the building itself is desirable. However, the increased complexity of the interacting sub-systems, multiple time-variant target values, and stochastic influences from the production environment, weather, and energy markets make it difficult to efficiently control energy production, storage, and consumption in hybrid industrial smart grids. The studied deep reinforcement learning approach allows the solution space to be explored for control policies that minimize a cost function. The deep neural network of the DRL agent is based on a multilayer perceptron (MLP), Long Short-Term Memory (LSTM), and convolutional layers. The agent is trained within multiple Modelica-based factory simulation environments using the Advantage Actor-Critic (A2C) algorithm. The DRL controller is evaluated by means of the simulation and then compared to a conventional, rule-based approach. Finally, the results indicate that the DRL approach is able to improve the control performance and significantly reduce the energy and operating costs of industrial smart grids. Keywords: industrial smart grids, energy efficiency, deep reinforcement learning, optimal control
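One core ingredient of the A2C update can be sketched in plain Python (the rewards and critic values below are hypothetical, and the MLP/LSTM/convolutional networks are omitted): the critic's value estimates turn a rollout's rewards into one-step advantages A_t = r_t + gamma * V(s_{t+1}) - V(s_t), which then weight the actor's policy gradient.

```python
def advantages(rewards, values, gamma=0.99):
    """One-step advantage estimates for an A2C rollout.

    `values` holds the critic's V(s_t) for each visited state; the
    value of the state after the final step is taken as 0 (terminal).
    """
    next_values = values[1:] + [0.0]
    return [r + gamma * v_next - v
            for r, v, v_next in zip(rewards, values, next_values)]

# Hypothetical 3-step rollout; negative rewards stand in for energy costs
rewards = [-2.0, -1.0, -4.0]
values = [-6.5, -4.8, -3.9]
print([round(a, 3) for a in advantages(rewards, values)])  # [-0.252, -0.061, -0.1]
```

A positive advantage means the action did better than the critic expected, so the actor's probability of that action is increased; the critic itself is regressed toward the observed returns in the same update.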
Procedia PDF Downloads 195