Search results for: multiple winglets
4230 Design and Synthesis of Two Tunable Bandpass Filters Based on Varactors and Defected Ground Structure
Authors: M'Hamed Boulakroune, Mouloud Challal, Hassiba Louazene, Saida Fentiz
Abstract:
This paper presents two microstrip bandpass filter (BPF) designs at microwave frequencies. The first is an ultra-wideband (UWB) filter based on a multiple-mode resonator (MMR) and a rectangular-shaped defected ground structure (DGS). This filter, with a compact size of 25.2 x 3.8 mm², provides an in-band insertion loss of 0.57 dB and a return loss greater than 12 dB. The second structure is a tunable bandpass filter using planar patch resonators with varactor diodes. It is formed by a triple-mode circular patch resonator with two pairs of slots, in which the varactors are connected. The filter is initially centered at 2.4 GHz; its center frequency can be tuned down to 1.8 GHz simultaneously with the bandwidth, achieving a wide tuning range. Lossless simulations were compared with simulations accounting for the substrate dielectric losses, the conductor losses, and the equivalent electrical circuit model of the tuning element in order to assess their effects. Within these variations, simulation results showed an insertion loss better than 2 dB and a return loss better than 10 dB over the passband. The proposed filters present good performance, and the simulation results are in satisfactory agreement with the experimental ones reported elsewhere.
Keywords: defected ground structure, diode varactor, microstrip bandpass filter, multiple-mode resonator
Procedia PDF Downloads 309
4229 Compression Index Estimation by Water Content and Liquid Limit and Void Ratio Using Statistics Method
Authors: Lizhou Chen, Abdelhamid Belgaid, Assem Elsayed, Xiaoming Yang
Abstract:
Compression index is essential in foundation settlement calculations. The traditional method for determining it is the consolidation test, which is expensive and time-consuming. Many researchers have used regression methods to develop empirical equations for predicting compression index from soil properties. Based on a large number of compression index data collected from consolidation tests, the accuracy of several popular empirical equations was assessed. It was found that the primary compression index is significantly overestimated by some equations and underestimated by others. Sensitivity analyses of soil parameters including water content, liquid limit, and void ratio were performed. The results indicate that the compression index obtained from void ratio is the most accurate. An ANOVA (analysis of variance) demonstrates that equations with multiple soil parameters do not provide better predictions than equations with a single soil parameter; in other words, it is not necessary to develop relationships between compression index and multiple soil parameters. Meanwhile, it was noted that the secondary compression index is approximately 0.7-5.0% of the primary compression index, with an average of 2.0%. Finally, prediction equations based on a power regression technique are proposed that provide more accurate predictions than existing equations.
Keywords: compression index, clay, settlement, consolidation, secondary compression index, soil parameter
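A power-regression fit of the kind described above can be sketched by running ordinary least squares on log-transformed data. The coefficient values below (0.29 and 1.1) are illustrative stand-ins, not the ones reported in the paper:

```python
import math

def fit_power_regression(x, y):
    """Fit y = a * x**b by least squares on the log-transformed data."""
    lx = [math.log(v) for v in x]
    ly = [math.log(v) for v in y]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    b = (sum((u - mx) * (v - my) for u, v in zip(lx, ly))
         / sum((u - mx) ** 2 for u in lx))
    a = math.exp(my - b * mx)
    return a, b

# Illustrative void-ratio (e0) / compression-index (Cc) pairs that
# follow Cc = 0.29 * e0**1.1 exactly (hypothetical coefficients).
e0 = [0.6, 0.8, 1.0, 1.2, 1.5, 1.8]
cc = [0.29 * e ** 1.1 for e in e0]
a, b = fit_power_regression(e0, cc)
```

With noise-free synthetic data the fit recovers the generating coefficients exactly; with real consolidation-test data the same code yields the least-squares estimates.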
Procedia PDF Downloads 158
4228 Pharmacodynamic Enhancement of Repetitive rTMS Treatment Outcomes for Major Depressive Disorder
Authors: A. Mech
Abstract:
Repetitive transcranial magnetic stimulation (rTMS) has proven to be a valuable treatment option for patients who have failed to respond to multiple courses of antidepressant medication. In fact, the American Psychiatric Association recommends TMS after one failed course of antidepressant medication. Genetic testing has proven valuable for pharmacokinetic variables, which, if understood, could lead to more efficient dosing of psychotropic medications and improved outcomes. Pharmacodynamic testing can identify biomarkers which, if addressed, can improve patient outcomes in antidepressant therapy. Monotherapy of major depressive disorder with methylated B vitamins has been shown to be safe and effective in patients with MTHFR polymorphisms, without waiting for multiple failed medication trials, and has demonstrated remission rates similar to those of antidepressant clinical trials. Combining pharmacodynamic testing with repetitive TMS treatment using NeuroStar has shown promising potential for enhancing remission rates and the durability of treatment. In this study, an ongoing retrospective chart review showed that patients who received repetitive TMS treatment enhanced by dietary supplementation guided by pharmacodynamic testing displayed a greater remission rate (90%) than patients treated with NeuroStar TMS alone (62%).
Keywords: improved remission rate, major depressive disorder, pharmacodynamic testing, rTMS outcomes
Procedia PDF Downloads 55
4227 Revolutionizing Gaming Setup Design: Utilizing Generative and Iterative Methods to Prop and Environment Design, Transforming the Landscape of Game Development Through Automation and Innovation
Authors: Rashmi Malik, Videep Mishra
Abstract:
Generative design has become a transformative approach for efficiently producing multiple iterations of any design project. The conventional way of modeling game elements is very time-consuming and requires skilled artists. A 3D modeling tool such as 3ds Max or Blender is traditionally used to create the game library, and each asset takes considerable time to model. This study focuses on using generative design tools to increase efficiency in game development at the stage of prop and environment generation. This involves procedural level generation and customized, regulated, or randomized asset generation. The paper presents a system design approach using generative tools such as Grasshopper (visual scripting) and other scripting tools to automate game library modeling. The approach enables the generation of multiple products from a single script, creating a system that lets designers/artists customize props and environments. The main goal is to measure the efficacy of the automated system in creating a wide variety of game elements, reducing the need for manual content creation and integrating it into the workflows of AAA and indie games.
Keywords: iterative game design, generative design, gaming asset automation, generative game design
Procedia PDF Downloads 66
4226 Ectopic Mediastinal Parathyroid Adenoma: A Case Report with Diagnostic and Management Challenges
Authors: Augustina Konadu Larbi-Ampofo, Ekemini Umoinwek
Abstract:
Background: Hypercalcaemia is a common electrolyte imbalance that increases mortality if poorly controlled. Primary hyperparathyroidism often presents in this way, with a prevalence of 0.1-0.3%. Management is challenging when the cause is an ectopic parathyroid adenoma in the mediastinum, especially in a patient with a pacemaker. Case Presentation: A 79-year-old woman with a history of a previous cardiac arrest, a permanent pacemaker, ischaemic heart disease, bilateral renal calculi, rectal polyps, liver cirrhosis, and a family history of hyperthyroidism presented to the emergency department with acute back pain. Management and Outcome: The patient was diagnosed with primary hyperparathyroidism on the basis of her elevated corrected calcium and parathyroid hormone levels. Parathyroid investigations consisting of an NM MIBI scan, SPECT-CT, a 4D parathyroid scan, and an ultrasound scan of the neck and thorax confirmed an ectopic parathyroid adenoma in the mediastinum at the level of the aortic arch, along with benign thyroid nodules. The location of the adenoma warranted a thoracoscopic surgical approach; however, the presence of her pacemaker and other cardiovascular conditions predisposed her to a potentially poorer post-operative outcome. Discussion: Mediastinal ectopic parathyroid adenomas are rare and difficult to diagnose and treat, often needing a multimodal imaging approach for accurate localisation. Surgery is the definitive treatment; however, in this patient, long-term medical treatment with cinacalcet was the only suitable next option. The difficulty is that cinacalcet tackles the biochemical markers of the disease entity rather than the disease itself, leaving open the question of what happens next if hypercalcaemia becomes refractory or uncontrolled in this patient with a pacemaker.
Moreover, the coexistence of her multiple conditions raises the suspicion of an underlying multisystemic or multiple endocrine disorder, with multiple endocrine neoplasia coming to mind, necessitating further genetic or autoimmune investigations. Conclusion: Mediastinal ectopic parathyroid adenomas are rare and pose diagnostic and management challenges.
Keywords: mediastinal ectopic parathyroid adenoma, hyperparathyroidism, SPECT/CT, nuclear medicine, multimodal imaging
Procedia PDF Downloads 16
4225 Object Negotiation Mechanism for an Intelligent Environment Using Event Agents
Authors: Chiung-Hui Chen
Abstract:
With advancements in science and technology, the concept of the Internet of Things (IoT) has gradually developed. The development of the intelligent environment adds intelligence to objects in the living space by using the IoT. In a smart environment shared by multiple users, different service requirements from different users can put the context-aware system in conflicting situations when deciding which services to provide. The purpose of establishing a communication and negotiation mechanism among objects in the intelligent environment is therefore to resolve such service conflicts among users. This study proposes a decision-making methodology that uses “Event Agents” as its core. When the sensor system receives information, it evaluates a user’s current events and conditions; analyses object, location, time, and environmental information; calculates the priority of the object; and provides the user services based on the event. When events are not single but overlap, conflicts arise. This study adopts a “Multiple Events Correlation Matrix” to calculate the degree values of incidents and support values for each object. The matrix uses these values as the basis for inferring system services, and for determining appropriate services when there is a conflict.
Keywords: internet of things, intelligent object, event agents, negotiation mechanism, degree of similarity
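The conflict-resolution step can be sketched as picking the object with the highest total support across the currently overlapping events. The support table below is entirely hypothetical; the paper's actual matrix computation is more involved:

```python
def resolve_conflict(support, active_events):
    """Toy stand-in for the Multiple Events Correlation Matrix:
    score each object by summing its support over the overlapping
    events, and serve the highest-scoring object."""
    scores = {obj: sum(per_event.get(e, 0.0) for e in active_events)
              for obj, per_event in support.items()}
    return max(scores, key=scores.get)

# Hypothetical support values: object -> {event: support}
support = {
    "lamp":    {"reading": 0.9, "watching_tv": 0.2},
    "tv":      {"reading": 0.1, "watching_tv": 0.8},
    "speaker": {"reading": 0.3, "watching_tv": 0.5},
}
winner = resolve_conflict(support, ["reading", "watching_tv"])
```

Here the lamp wins (total support 1.1) when a reading event overlaps a TV-watching event, so the system would prioritize the lamp's service.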
Procedia PDF Downloads 289
4224 Dietary Pattern and Risk of Breast Cancer Among Women: A Case-Control Study
Authors: Huma Naqeeb
Abstract:
Epidemiological studies have shown a robust link between breast cancer and dietary patterns. No previous study conducted in Pakistan has specifically focused on dietary patterns among women with breast cancer. This study aims to examine the association of breast cancer with dietary patterns among Pakistani women. This case-control research was carried out in multiple tertiary care facilities. Newly diagnosed primary breast cancer patients were recruited as cases (n = 408); age-matched controls (n = 408) were randomly selected from the general population. Data on the required parameters were systematically collected using subjective and objective tools. Factor analysis and Principal Component Analysis (PCA) were used to extract the women’s dietary patterns. Four dietary patterns were identified based on eigenvalues >1: (i) veg-ovo-fish, (ii) meat-fat-sweet, (iii) mixed (milk and its products, and gourd vegetables), and (iv) lentils-spices. Results of the multiple regressions are displayed as adjusted odds ratios (Adj. OR) with their respective 95% confidence intervals (95% CI). After adjustment for potential confounders, the veg-ovo-fish dietary pattern was found to be robustly associated with a lower risk of breast cancer among women (Adj. OR: 0.68, 95% CI: 0.46-0.99, p < 0.01). The findings suggest that adherence to diets composed mainly of fresh vegetables and high-quality protein sources may contribute to lowering the risk of breast cancer among women.
Keywords: breast cancer, dietary pattern, women, principal component analysis
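The odds-ratio logic behind a case-control result like the one above can be sketched from a 2x2 exposure-outcome table with a Woolf-type confidence interval. The cell counts below are made up for illustration and are not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d):
    """Odds ratio and Woolf 95% CI for a 2x2 table:
    a = exposed cases, b = exposed controls,
    c = unexposed cases, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical counts for a protective dietary pattern (OR < 1)
or_, lo, hi = odds_ratio_ci(120, 180, 288, 228)
```

An adjusted OR as reported in the abstract additionally requires a multivariable logistic regression to control for confounders; the crude OR here shows only the direction and scale of the association.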
Procedia PDF Downloads 121
4223 Decision Tree Based Scheduling for Flexible Job Shops with Multiple Process Plans
Authors: H.-H. Doh, J.-M. Yu, Y.-J. Kwon, J.-H. Shin, H.-W. Kim, S.-H. Nam, D.-H. Lee
Abstract:
This paper suggests a decision tree based approach for flexible job shop scheduling with multiple process plans, i.e., each job can be processed through alternative operations, each of which can be processed on alternative machines. The main decision variables are: (a) selecting an operation/machine pair; and (b) sequencing the jobs assigned to each machine. As an extension of the priority scheduling approach, which selects the best priority rule combination after many simulation runs, this study suggests a decision tree based approach in which a decision tree is used to select a priority rule combination adequate for a specific system state, so that the burden of developing simulation models and carrying out simulation runs can be eliminated. The decision tree based scheduling approach consists of construction and scheduling modules. In the construction module, a decision tree is constructed using a four-stage algorithm, and in the scheduling module, a priority rule combination is selected using the decision tree. To show the performance of the suggested approach, a case study was done on a flexible job shop with reconfigurable manufacturing cells and a conventional job shop, and the results are compared with those of individual priority rule combinations for the objectives of minimizing total flow time and total tardiness.
Keywords: flexible job shop scheduling, decision tree, priority rules, case study
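The core idea, mapping a snapshot of the system state to a priority rule combination, can be sketched as a hand-built decision tree. The state features, thresholds, and rule names below are illustrative assumptions, not the tree the paper constructs:

```python
def select_rule_combination(utilization, tardiness_pressure):
    """Toy decision tree: map a shop-state snapshot to a
    (machine-selection rule, job-sequencing rule) pair.
    Features, thresholds, and rules are illustrative only."""
    if utilization > 0.8:                            # congested shop
        if tardiness_pressure > 0.5:
            return ("least_loaded_machine", "EDD")   # earliest due date
        return ("least_loaded_machine", "SPT")       # shortest processing time
    if tardiness_pressure > 0.5:
        return ("fastest_machine", "EDD")
    return ("fastest_machine", "FIFO")
```

In the paper's approach the tree itself is learned from simulation data rather than hand-written; this sketch only shows how, once built, it replaces repeated simulation runs with a cheap state lookup.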
Procedia PDF Downloads 354
4222 Environmental Performance Improvement of Additive Manufacturing Processes with Part Quality Point of View
Authors: Mazyar Yosofi, Olivier Kerbrat, Pascal Mognol
Abstract:
Life cycle assessment of additive manufacturing processes has evolved significantly in recent years. Many existing studies have focused mainly on energy consumption. New methodologies for life cycle inventory acquisition have since appeared in the literature and help manufacturers take into account all input and output flows during the manufacturing step of a product's life cycle. The environmental phenomena that occur during the manufacturing step of additive manufacturing processes are thus becoming well understood, and it is now possible to count and measure all inventory data during this step accurately, so optimization of the environmental performance of these processes can be considered. Environmental performance can be improved by varying process parameters. However, many of these parameters (such as manufacturing speed, the power of the energy source, or the quantity of support material) directly affect the mechanical properties, surface finish, and dimensional accuracy of a functional part. This study aims to improve the environmental performance of an additive manufacturing process without deteriorating part quality. For that purpose, the authors have developed a generic method applied to multiple parts made by additive manufacturing processes. First, a complete analysis of the process parameters is made in order to identify which parameters affect only the environmental performance of the process. Then, multiple parts are manufactured while varying the identified parameters. The aim of this second step is to find the parameter values that significantly decrease the environmental impact of the process while keeping part quality as desired. Finally, the parts made with the initial parameters are compared with those made with the modified parameters.
The major contribution claimed by the authors is reducing the environmental impact of an additive manufacturing process while respecting three part-quality criteria: mechanical properties, dimensional accuracy, and surface roughness. Now that additive manufacturing processes can be considered mature from a technical point of view, their environmental improvement can be pursued while respecting part properties. The first part of this study presents the methodology applied to multiple academic parts; the validity of the methodology is then demonstrated on functional parts.
Keywords: additive manufacturing, environmental impact, environmental improvement, mechanical properties
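The two-step selection logic (discard parameter settings that violate the part-quality criteria, then minimize environmental impact among the survivors) can be sketched generically. The parameter names, impact proxy, and quality constraint below are hypothetical:

```python
def pick_greenest_setting(candidates, impact, quality_ok):
    """Among candidate parameter settings, keep only those that satisfy
    the part-quality constraints, then return the one with the lowest
    environmental-impact score."""
    feasible = [c for c in candidates if quality_ok(c)]
    if not feasible:
        raise ValueError("no setting meets the quality criteria")
    return min(feasible, key=impact)

# Hypothetical settings: (build speed mm/s, source power W)
candidates = [(30, 200), (45, 200), (45, 150), (60, 150)]
impact = lambda c: c[1] / c[0]        # toy proxy: energy per unit speed
quality_ok = lambda c: c[1] >= 160    # toy constraint: enough power
best = pick_greenest_setting(candidates, impact, quality_ok)
```

In the actual study the impact and quality functions come from measured inventory data and part testing rather than closed-form formulas; the sketch only captures the constrained-minimization structure of the method.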
Procedia PDF Downloads 286
4221 The Mediating Role of Store Personality in the Relationship Between Self-Congruity and Manifestations of Loyalty
Authors: María de los Ángeles Crespo López, Carmen García García
Abstract:
The highly competitive nature of today's globalised marketplace requires brands and stores to develop effective commercial strategies to ensure their economic survival. Maintaining the loyalty of existing customers is one key strategy that yields the best results. Although the relationship between consumers' self-congruity and their manifestations of loyalty towards a store has been investigated, the role of store personality in this relationship remains unclear. In this study, multiple parallel mediation analysis was used to examine the effect of store personality on the relationship between consumers' self-congruity and their manifestations of loyalty. For this purpose, 457 Spanish consumers of the Fnac store completed three self-report questionnaires assessing Store Personality, Self-Congruity, and Store Loyalty. The data were analyzed using the SPSS macro PROCESS. The results revealed that three dimensions of store personality, namely Exciting, Close, and Competent Store, positively and significantly mediated the relationship between Self-Congruity and Manifestations of Loyalty, with the indirect effect of Competent Store being the greatest. This means that a consumer with higher self-congruity with the store will exhibit more manifestations of loyalty when the store is perceived as exciting, close, or competent. These findings suggest that more attention should be paid to the perceived personality of stores when developing effective marketing strategies to maintain or increase consumers' loyalty towards stores.
Keywords: multiple parallel mediation, PROCESS, self-congruence, store loyalty, store personality
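The a-path/b-path arithmetic behind a single mediator in such an analysis can be sketched with two OLS fits (PROCESS additionally bootstraps confidence intervals for the indirect effect, which is omitted here). The data are synthetic:

```python
def _cov(u, v):
    """Sum of centered cross-products (unnormalized covariance)."""
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    return sum((a - mu) * (b - mv) for a, b in zip(u, v))

def indirect_effect(x, m, y):
    """Simple mediation: a-path = slope of M on X; b-path = coefficient
    of M in the regression Y ~ M + X (closed-form normal equations).
    The indirect effect is a * b."""
    sxx, smm, sxm = _cov(x, x), _cov(m, m), _cov(x, m)
    smy, sxy = _cov(m, y), _cov(x, y)
    a_path = sxm / sxx
    det = smm * sxx - sxm ** 2
    b_path = (smy * sxx - sxy * sxm) / det
    return a_path * b_path

# Synthetic data built so that Y = 3*M + 1*X exactly
x = [1, 2, 3, 4, 5]
m = [2, 1, 4, 3, 6]
y = [3 * mi + xi for mi, xi in zip(m, x)]
effect = indirect_effect(x, m, y)
```

In a multiple parallel mediation model the Y-regression includes all mediators at once, so each b-path is estimated controlling for the other mediators; this two-variable sketch shows the single-mediator case only.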
Procedia PDF Downloads 156
4220 Investigating the Potential for Introduction of Warm Mix Asphalt in Kuwait Using the Volcanic Ash
Authors: H. Al-Baghli, F. Al-Asfour
Abstract:
The asphalt technology currently applied to Kuwait's road pavement infrastructure is hot mix asphalt (HMA), including both pen-grade and polymer modified bitumens (PMBs), which is produced and compacted at high temperatures ranging from 150 to 180 °C. There are currently no specifications for warm and cold mix asphalts in the asphalt standards and specifications of Kuwait's Ministry of Public Works (MPW). The conventional HMA process is energy intensive and directly responsible for the emission of greenhouse gases and other environmental hazards into the atmosphere, leading to significant environmental impacts and raising health risks for labourers on site. Warm mix asphalt (WMA) technology, a sustainable alternative preferred in multiple countries, has many environmental advantages because it requires production temperatures 20 to 40 °C lower than HMA. The temperature reduction achieved by WMA comes from multiple technologies, including foaming and chemical or organic additives, that aim to reduce the bitumen's viscosity and improve mix workability. This paper presents a literature review of WMA technologies and techniques, followed by an experimental study comparing the volumetric properties of WMA samples produced with a water-containing additive (foaming process) and compacted at different temperatures with those of an HMA control mix designed in accordance with the new MPW specifications and guidelines.
Keywords: warm-mix asphalt, water-bearing additives, foaming-based process, chemical additives, organic additives
Procedia PDF Downloads 123
4219 The Relationship between Corporate Governance and Intellectual Capital Disclosure: Malaysian Evidence
Authors: Rabiaal Adawiyah Shazali, Corina Joseph
Abstract:
The disclosure of Intellectual Capital (IC) information is becoming more vital in today's era of a knowledge-based economy. Accounting bodies advise companies to enhance IC disclosure, which complements conventional financial disclosures. There are no accounting standards for Intellectual Capital Disclosure (ICD); the disclosure is therefore entirely voluntary. Hence, this study aims to investigate the extent of ICD and to examine the relationship between corporate governance and ICD in Malaysia. This study employed content analysis of the 2012 annual reports of the top 100 public listed companies in Malaysia. The uniqueness of this study lies in its underpinning theory: it applies institutional isomorphism theory to explain the effect of corporate governance attributes on ICD. To achieve the stated objective, multiple regression analysis was employed. The descriptive statistics show that public listed companies in Malaysia have increased their awareness of the importance of ICD. Furthermore, results from the multiple regression analysis confirmed that corporate governance affects a company's ICD: the frequency of audit committee meetings and the board size positively influence the level of ICD. The findings provide an incentive for companies in Malaysia to enhance their disclosure of IC. In addition, this study would assist Bursa Malaysia and other regulatory bodies in developing proper guidelines for the disclosure of IC.
Keywords: annual report, content analysis, corporate governance, intellectual capital disclosure
Procedia PDF Downloads 215
4218 Case of A Huge Retroperitoneal Abscess Spanning from the Diaphragm to the Pelvic Brim
Authors: Christopher Leung, Tony Kim, Rebecca Lendzion, Scott Mackenzie
Abstract:
Retroperitoneal abscesses are a rare but serious condition with often delayed diagnosis, non-specific symptoms, multiple causes, and high morbidity and mortality. With the advent of more readily available cross-sectional imaging, retroperitoneal abscesses are treated earlier and better outcomes are achieved. Occasionally, a retroperitoneal abscess presents as a huge collection, as in this 53-year-old male. With a background of chronic renal disease and a left partial nephrectomy, this gentleman presented with a one-month history of left flank pain without any other symptoms, including fevers or abdominal pain. CT of the abdomen and pelvis demonstrated a huge retroperitoneal abscess spanning from the diaphragm, abutting the spleen, down to the iliopsoas muscle and the iliac vessels at the pelvic brim. This large retroperitoneal abscess required open drainage as well as drainage by interventional radiology. A long course of intravenous antibiotics and multiple drainages were required. His blood and fluid cultures grew Proteus species, suggesting a urinary source, likely his non-functioning kidney, which had undergone a partial nephrectomy. Such a huge retroperitoneal abscess has rarely been described in the literature. The learning point is that the basic principle of source control and antibiotics is paramount in treating retroperitoneal abscesses, regardless of the size of the abscess.
Keywords: retroperitoneal abscess, retroperitoneal mass, sepsis, genitourinary infection
Procedia PDF Downloads 220
4217 Learning Curve Effect on Materials Procurement Schedule of Multiple Sister Ships
Authors: Vijaya Dixit, Aasheesh Dixit
Abstract:
The shipbuilding industry operates in an Engineer Procure Construct (EPC) context. The product mix of a shipyard comprises various types of ships, such as bulk carriers, tankers, barges, coast guard vessels, and submarines. Each order is unique, based on the type of ship and customized requirements, which are engineered into the product right from the design stage. Thus, to execute every new project, a shipyard needs to upgrade its production expertise. As a result, over the long run, holistic learning occurs across different types of projects, which contributes to the knowledge base of the shipyard. Simultaneously, in the short term, during execution of a project comprising multiple sister ships, repetition of similar tasks leads to learning at the activity level. This research aims to capture both learnings of a shipyard and incorporate the learning curve effect in project scheduling and materials procurement to improve project performance. Extant literature supports the existence of such learning in organizations. In shipbuilding, there are sequences of similar activities that are expected to exhibit learning curve behavior, for example, the nearly identical structural sub-blocks that are successively fabricated, erected, and outfitted with piping and electrical systems. A learning curve representation can model not only a decrease in the mean completion time of an activity, but also a decrease in the uncertainty of the activity duration. Sister ships have similar material requirements, and the same supplier base supplies materials for all the sister ships within a project. On one hand, this provides an opportunity to reduce transportation cost by batching the order quantities of multiple ships. On the other hand, it increases the inventory holding cost at the shipyard and the risk of obsolescence. Further, due to the learning curve effect, the production schedule of each successive ship gets compressed.
Thus, the material requirement schedule of each subsequent ship differs from that of its predecessor. As more and more ships get constructed, compressed production schedules increase the possibility of batching the orders of sister ships. This work aims at integrating materials management with project scheduling of long-duration projects for the manufacture of multiple sister ships. It incorporates the learning curve effect on progressively compressed material requirement schedules and addresses the trade-off between transportation cost and inventory holding and shortage costs while satisfying the budget constraints of the various stages of the project. The activity durations and the lead times of items are not crisp and are available in the form of probability distributions. A Stochastic Mixed Integer Programming (SMIP) model is formulated and solved using an evolutionary algorithm. Its output provides ordering dates and the degree of order batching for all types of items. A sensitivity analysis determines the threshold number of sister ships required in a project to leverage the advantage of the learning curve effect in materials management decisions. This analysis will help materials managers gain insight into when, and to what degree, it is beneficial to treat a multiple-ship project as an integrated one by batching the order quantities, and when, and to what degree, to practice distinct procurement for individual ships.
Keywords: learning curve, materials management, shipbuilding, sister ships
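The schedule compression driven by repetition is conventionally modeled with Wright's learning curve, under which every doubling of the number of repetitions multiplies the activity time by a fixed learning rate. The 90% rate and 100-day first-unit time below are illustrative, not figures from the paper:

```python
import math

def unit_time(t_first, learning_rate, n):
    """Wright's learning curve: time for the n-th repetition of an
    activity, T_n = T_1 * n**b with b = log2(learning_rate), so each
    doubling of output multiplies the time by learning_rate."""
    b = math.log(learning_rate, 2)
    return t_first * n ** b

# Illustrative: the first sister ship's block fabrication takes
# 100 days with a 90% learning rate.
times = [unit_time(100.0, 0.9, n) for n in (1, 2, 4)]
```

Ships 2 and 4 then take 90 and 81 days respectively, which is exactly the progressive compression of the material requirement schedule that the procurement model has to track.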
Procedia PDF Downloads 501
4216 An Overbooking Model for Car Rental Service with Different Types of Cars
Authors: Naragain Phumchusri, Kittitach Pongpairoj
Abstract:
Overbooking is a very useful revenue management technique that can help reduce costs caused by either undersales or oversales. In this paper, we propose an overbooking model for two types of cars that minimizes the total cost for a car rental service. With two types of cars, there is a possibility of upgrading from the lower type to the upper type, which makes the model more complex than the single-type scenario. We have found that convexity can be proved in this case. A sensitivity analysis is conducted to observe the effects of the relevant parameters on the optimal solution. A model simplification is proposed using multiple linear regression analysis, which can estimate the optimal overbooking level from appropriate independent variables. The results show that the overbooking level from the multiple linear regression model is relatively close to the optimal solution (with an adjusted R-squared value of at least 72.8%). To evaluate the performance of the proposed model, the total cost was compared with the case where the decision maker uses a naïve method to set the overbooking level. The total cost of the optimal solution is only 0.5 to 1 percent lower (on average) than the cost from the regression model, while it is approximately 67% lower than the cost obtained by the naïve method. This indicates that the proposed simplification using regression analysis performs effectively in estimating the overbooking level.
Keywords: overbooking, car rental industry, revenue management, stochastic model
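The undersale/oversale trade-off can be sketched for a single car type with binomial show-ups; the paper's two-type model with upgrades is richer, so this is only the basic cost structure, with all parameter values hypothetical:

```python
from math import comb

def expected_cost(capacity, bookings, p_show, c_over, c_under):
    """Expected oversale + undersale cost when each of `bookings`
    reservations shows up independently with probability p_show."""
    cost = 0.0
    for k in range(bookings + 1):
        p = comb(bookings, k) * p_show ** k * (1 - p_show) ** (bookings - k)
        if k > capacity:
            cost += p * c_over * (k - capacity)    # oversold customers
        else:
            cost += p * c_under * (capacity - k)   # idle cars
    return cost

def best_booking_limit(capacity, p_show, c_over, c_under, slack=20):
    """Search booking limits at or above capacity for the cheapest one."""
    return min(range(capacity, capacity + slack + 1),
               key=lambda b: expected_cost(capacity, b, p_show, c_over, c_under))
```

If everyone shows up, any overbooking is pure oversale cost and the best limit equals capacity; with no-shows the optimal limit rises above capacity, which is the behavior the regression simplification in the paper is built to approximate.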
Procedia PDF Downloads 170
4215 Classification of Land Cover Usage from Satellite Images Using Deep Learning Algorithms
Authors: Shaik Ayesha Fathima, Shaik Noor Jahan, Duvvada Rajeswara Rao
Abstract:
Earth's environment and its evolution can be observed through satellite images in near real-time. Remote sensing data from satellite imagery provide crucial information that can be used in a variety of applications, including image fusion, change detection, land cover classification, agriculture, mining, disaster mitigation, and monitoring climate change. The objective of this project is to propose a method for classifying satellite images according to multiple predefined land cover classes. The proposed approach involves collecting data in image format, pre-processing the data, feeding the processed data into the proposed algorithm, and analyzing the result. Algorithms used in satellite imagery classification include U-Net, Random Forest, DeepLabv3, CNNs, ANNs, ResNet, etc. In this project, we use the DeepLabv3 (atrous convolution) algorithm for land cover classification, with the DeepGlobe land cover classification dataset. DeepLabv3 is a semantic segmentation system that uses atrous convolution to capture multi-scale context by adopting multiple atrous rates, in cascade or in parallel, to determine the scale of segments.
Keywords: area calculation, atrous convolution, DeepGlobe land cover classification, DeepLabv3, land cover classification, ResNet-50
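The atrous (dilated) convolution at DeepLabv3's core is easiest to see in one dimension: spacing the kernel taps `rate` samples apart widens the receptive field without adding weights. This is a minimal valid-mode sketch, not DeepLabv3's actual 2-D implementation:

```python
def atrous_conv1d(signal, kernel, rate):
    """Valid-mode 1-D dilated convolution (correlation form): kernel
    taps are spaced `rate` samples apart, so the effective span is
    (len(kernel) - 1) * rate + 1."""
    span = (len(kernel) - 1) * rate + 1
    return [sum(kernel[j] * signal[i + j * rate] for j in range(len(kernel)))
            for i in range(len(signal) - span + 1)]

out_r1 = atrous_conv1d([1, 2, 3, 4, 5], [1, 1], rate=1)  # adjacent pairs
out_r2 = atrous_conv1d([1, 2, 3, 4, 5], [1, 1], rate=2)  # pairs 2 apart
```

DeepLabv3 runs several such convolutions at different rates in parallel (atrous spatial pyramid pooling) so that each output pixel aggregates context at multiple scales with the same number of parameters.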
Procedia PDF Downloads 137
4214 The Impact of Missense Mutation in Phosphatidylinositol Glycan Class A Associated to Paroxysmal Nocturnal Hemoglobinuria and Multiple Congenital Anomalies-Hypotonia-Seizures Syndrome 2: A Computational Study
Authors: Ashish Kumar Agrahari, Amit Kumar
Abstract:
Paroxysmal nocturnal hemoglobinuria (PNH) is an acquired clonal blood disorder that manifests with hemolytic anemia, thrombosis, and peripheral blood cytopenias. The disease is caused by the deficiency of two glycosylphosphatidylinositol (GPI)-anchored proteins (CD55 and CD59) in the hemopoietic stem cells. The deficiency of GPI-anchored proteins has been associated with somatic mutations in phosphatidylinositol glycan class A (PIGA). However, the mutations that do not cause PNH are associated with multiple congenital anomalies-hypotonia-seizures syndrome 2 (MCAHS2). To the best of our knowledge, no computational study has been performed to explore the atomistic-level impact of PIGA mutations on the structure and dynamics of the protein. In the current work, we are mainly interested in gaining insight into the molecular mechanism of PIGA mutations. In the initial step, we screened the most pathogenic mutations from the pool of publicly available mutations. Further, to get a better understanding, the pathogenic mutations were mapped to the modeled structure and subjected to a 50 ns molecular dynamics simulation. Our computational study suggests that four mutations are likely to alter the structural conformation and stability of the PIGA protein, which illustrates their association with the PNH and MCAHS2 phenotypes.
Keywords: homology modeling, molecular dynamics simulation, missense mutations, PNH, MCAHS2, PIGA
Procedia PDF Downloads 143
4213 Semantic Differences between Bug Labeling of Different Repositories via Machine Learning
Authors: Pooja Khanal, Huaming Zhang
Abstract:
Labeling of issues/bugs, also known as bug classification, plays a vital role in software engineering. Some known labels/classes of bugs are 'User Interface', 'Security', and 'API'. Most of the time, when reporters report a bug, they try to assign some predefined label to it. Issues are reported against a project, and each project is a repository in GitHub/GitLab containing multiple issues. There are many software project repositories, ranging from individual to commercial projects. The labels assigned in different repositories may depend on various factors, such as human instinct, generalization of labels, and the label assignment policy followed by the reporter. While the reporter of an issue may instinctively give it one label, another person reporting the same issue may label it differently. Consequently, it is not known mathematically whether a label in one repository is similar to or different from a label in another repository. Hence, the primary goal of this research is to find the semantic differences between the bug labeling of different repositories via machine learning. Independent optimal classifiers for individual repositories are built first, using text features from the reported issues; an optimal classifier may be a combination of multiple classifiers stacked together. These classifiers are then used to cross-test other repositories, allowing the similarity of labels to be deduced mathematically. The products of this ongoing research include a formalized open-source database of GitHub issues that is used to deduce the similarity of labels across repositories.
Keywords: bug classification, bug labels, GitHub issues, semantic differences
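The cross-testing idea can be sketched with a deliberately tiny bag-of-words classifier: train on one repository's labeled issues, then measure label agreement on another repository's issues. All issue texts, labels, and the classifier itself are illustrative stand-ins, not the paper's stacked models or data.

```python
from collections import Counter, defaultdict

def train(issues):
    """issues: list of (text, label). Builds per-label word counts
    for a minimal bag-of-words classifier (a toy stand-in for the
    paper's stacked classifiers)."""
    counts = defaultdict(Counter)
    for text, label in issues:
        counts[label].update(text.lower().split())
    return counts

def predict(model, text):
    words = text.lower().split()
    return max(model, key=lambda lbl: sum(model[lbl][w] for w in words))

def cross_test(model, other_issues):
    """Fraction of another repository's issues on which a classifier
    trained elsewhere agrees with the locally assigned labels."""
    hits = sum(predict(model, t) == lbl for t, lbl in other_issues)
    return hits / len(other_issues)

repo_a = [("button misaligned css", "User Interface"),
          ("xss injection in login", "Security"),
          ("endpoint returns 500", "API")]
repo_b = [("css layout broken button", "User Interface"),
          ("token leak in login", "Security")]
model_a = train(repo_a)
print(cross_test(model_a, repo_b))  # agreement score between repositories
```

A high cross-test score suggests the two repositories use a label with similar semantics; a low score signals that the same label name covers different kinds of issues.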
Procedia PDF Downloads 198
4212 Hybrid Weighted Multiple Attribute Decision Making Handover Method for Heterogeneous Networks
Authors: Mohanad Alhabo, Li Zhang, Naveed Nawaz
Abstract:
Small cell deployment in 5G networks is a promising technology to enhance capacity and coverage. However, unplanned deployment may cause high interference levels and a large number of unnecessary handovers, which in turn increase the signalling overhead. To guarantee service continuity, minimize unnecessary handovers, and reduce signalling overhead in heterogeneous networks, it is essential to properly model the handover decision problem. In this paper, we model the handover decision as a Multiple Attribute Decision Making (MADM) problem, specifically using the Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS), and propose a hybrid TOPSIS method to control handover in heterogeneous networks. The proposed method adopts a hybrid weighting that combines entropy and standard deviation weights; a hybrid weighting control parameter is introduced to balance the impact of the standard deviation and entropy weightings on the network selection process and overall performance. Our proposed method shows better performance, in terms of the number of frequent handovers and the mean user throughput, than existing methods.
Keywords: handover, HetNets, interference, MADM, small cells, TOPSIS, weight
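The hybrid weighting and TOPSIS ranking described above can be sketched as follows. The decision matrix (three candidate cells scored on signal strength, bandwidth, and delay) and the blend parameter alpha are illustrative assumptions, not the paper's evaluation setup.

```python
import numpy as np

def hybrid_weights(X, alpha=0.5):
    """Blend entropy weights and standard-deviation weights with a
    control parameter alpha, as the abstract describes."""
    P = X / X.sum(axis=0)
    ent = -(P * np.log(P)).sum(axis=0) / np.log(len(X))
    w_ent = (1 - ent) / (1 - ent).sum()       # entropy weighting
    w_sd = X.std(axis=0) / X.std(axis=0).sum()  # std-dev weighting
    w = alpha * w_ent + (1 - alpha) * w_sd
    return w / w.sum()

def topsis(X, w, benefit):
    """Rank alternatives by relative closeness to the ideal solution."""
    R = X / np.sqrt((X ** 2).sum(axis=0))     # vector normalisation
    V = R * w
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    worst = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.sqrt(((V - ideal) ** 2).sum(axis=1))
    d_neg = np.sqrt(((V - worst) ** 2).sum(axis=1))
    return d_neg / (d_pos + d_neg)            # higher = better cell

# rows: candidate cells; columns: signal (dBm offset), bandwidth (MHz), delay (ms)
X = np.array([[80.0, 20.0, 5.0],
              [60.0, 30.0, 10.0],
              [70.0, 25.0, 8.0]])
benefit = np.array([True, True, False])       # delay is a cost attribute
w = hybrid_weights(X)
scores = topsis(X, w, benefit)
print(np.round(scores, 3))
```

The handover target is the cell with the highest closeness score; shifting alpha toward 1 lets entropy dominate the weighting, toward 0 the standard deviation.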
Procedia PDF Downloads 148
4211 Joint Modeling of Longitudinal and Time-To-Event Data with Latent Variable
Authors: Xinyuan Y. Song, Kai Kang
Abstract:
Joint models for analyzing longitudinal and survival data are widely used to investigate the relationship between a failure time process and time-variant predictors. A common assumption in conventional joint models in the survival analysis literature is that all predictors are observable. However, this assumption may not always hold, because unobservable traits, namely latent variables, which are indirectly observable and should be measured through multiple observed variables, are commonly encountered in medical, behavioral, and financial research settings. In this study, a joint modeling approach to deal with this feature is proposed. The proposed model comprises three parts. The first part is a dynamic factor analysis model that characterizes latent variables through multiple observed indicators over time. The second part is a random coefficient trajectory model that describes the individual trajectories of the latent variables. The third part is a proportional hazards model that examines the effects of time-invariant predictors and the longitudinal trajectories of time-variant latent risk factors on the hazards of interest. A Bayesian approach coupled with a Markov chain Monte Carlo algorithm is developed to perform statistical inference. An application of the proposed joint model to a study on the Alzheimer's Disease Neuroimaging Initiative is presented.
Keywords: Bayesian analysis, joint model, longitudinal data, time-to-event data
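The three-part structure can be written schematically as below. The notation (loading matrix Λ, latent trajectory ξ, random coefficients b, regression coefficients γ and β) is assumed here for illustration; it is a generic formulation of the submodels the abstract names, not the paper's exact specification.

```latex
\begin{align}
\mathbf{y}_{ij} &= \boldsymbol{\Lambda}\,\xi_i(t_{ij}) + \boldsymbol{\epsilon}_{ij}
  && \text{(dynamic factor analysis: indicators load on the latent variable)}\\
\xi_i(t) &= b_{0i} + b_{1i}\,t + e_i(t)
  && \text{(random coefficient trajectory of the latent variable)}\\
h_i(t) &= h_0(t)\,\exp\!\big(\boldsymbol{\gamma}^{\top}\mathbf{x}_i + \beta\,\xi_i(t)\big)
  && \text{(proportional hazards with latent risk factor)}
\end{align}
```

The key feature is that the hazard in the third equation depends on the latent trajectory ξ, which is never observed directly but identified through the repeated indicators y.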
Procedia PDF Downloads 142
4210 Effect of Climate Variability on Honeybee Production in Ondo State, Nigeria
Authors: Justin Orimisan Ijigbade
Abstract:
The study was conducted to assess the effect of climate variability on honeybee production in Ondo State, Nigeria. A multistage sampling technique was employed to collect data from 60 beekeepers across six Local Government Areas in Ondo State. The data collected were subjected to descriptive statistics and multiple regression analysis. The results showed that 93.33% of the respondents were male, with 80% above 40 years of age. The majority of the respondents (96.67%) had formal education, and 90% produced honey for commercial purposes. The results revealed that 90% of the respondents reported that low temperature resulting from long periods of rainfall affected the foraging efficiency of the worker bees, 73.33% claimed that long periods of low humidity resulted in a low level of nectar flow, and 70% stated that high temperature resulted in an improper composition of workers, drones, and queen in the hive colony. The multiple regression results showed that beekeepers' experience, educational level, access to climate information, temperature, and rainfall were the main factors affecting honeybee production in the study area. Therefore, beekeepers should be given more education on climate variability and its adaptive strategies to ensure better honeybee production in the study area.
Keywords: climate variability, honeybee production, humidity, rainfall, temperature
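A multiple regression of the kind used above can be sketched with ordinary least squares. The data here are synthetic stand-ins for the surveyed variables (experience, education, access to climate information, temperature, rainfall) with made-up coefficients; only the fitting procedure, not the numbers, reflects the study.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 60                                  # 60 beekeepers, as in the study
X = rng.normal(size=(n, 5))             # 5 illustrative predictors
beta_true = np.array([0.8, 0.5, 0.6, -0.9, -0.4])  # assumed, not estimated
y = X @ beta_true + rng.normal(scale=0.1, size=n)  # honey output proxy

Xd = np.column_stack([np.ones(n), X])   # prepend an intercept column
beta_hat, *_ = np.linalg.lstsq(Xd, y, rcond=None)
print(np.round(beta_hat[1:], 2))        # recovered slope coefficients
```

With low noise, the fitted slopes recover the assumed coefficients closely; in practice one would also report standard errors and significance tests for each predictor.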
Procedia PDF Downloads 271
4209 A Framework for Designing Complex Product-Service Systems with a Multi-Domain Matrix
Authors: Yoonjung An, Yongtae Park
Abstract:
Offering a Product-Service System (PSS) is a well-accepted strategy that companies may adopt to provide a set of systemic solutions to customers. PSSs were initially provided in a simple form but now take diversified and complex forms involving multiple services, products, and technologies. With the growing interest in the PSS, frameworks for PSS development have been introduced by many researchers. However, most of the existing frameworks fail to examine the various relations existing in a complex PSS. Since designing a complex PSS involves the full integration of multiple products and services, it is essential to identify not only product-service relations but also product-product and service-service relations. It is equally important to specify how they are related, for a better understanding of the system. Moreover, as customers tend to view their purchase from a holistic perspective, a PSS should be developed based on the whole system's requirements rather than focusing only on the product or service requirements. Thus, we propose a framework to develop a complex PSS that is fully coordinated with the requirements of both worlds. Specifically, our approach adopts a multi-domain matrix (MDM). An MDM identifies not only inter-domain relations but also intra-domain relations, helping to design a PSS that includes highly desired and closely related core functions and features. The various dependency types and rating schemes proposed in our approach also support the integration process.
Keywords: inter-domain relations, intra-domain relations, multi-domain matrix, product-service system design
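The block structure of an MDM can be made concrete with a toy example: two products and two services of a hypothetical PSS, with 1 marking a dependency. The matrix and its entries are invented for illustration; only the decomposition into intra-domain blocks (DSMs on the diagonal) and inter-domain blocks (DMMs off the diagonal) reflects the framework.

```python
import numpy as np

labels = ["P1", "P2", "S1", "S2"]       # two products, two services
mdm = np.array([
    [0, 1, 1, 0],   # P1 depends on P2 and supports S1
    [1, 0, 0, 1],   # P2 depends on P1 and supports S2
    [1, 0, 0, 1],   # S1 relies on P1 and interacts with S2
    [0, 1, 1, 0],   # S2 relies on P2 and interacts with S1
])

# Intra-domain blocks (design structure matrices) on the diagonal:
product_dsm = mdm[:2, :2]
service_dsm = mdm[2:, 2:]
# Inter-domain block (domain mapping matrix) off the diagonal:
product_service_dmm = mdm[:2, 2:]

print("product-service links:", product_service_dmm.sum())
```

Reading off the blocks separately is exactly what lets the framework reason about product-product and service-service coupling alongside the usual product-service mapping.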
Procedia PDF Downloads 638
4208 Automatic Identification and Monitoring of Wildlife via Computer Vision and IoT
Authors: Bilal Arshad, Johan Barthelemy, Elliott Pilton, Pascal Perez
Abstract:
Reliable, informative, and up-to-date information about the location, mobility, and behavioural patterns of animals will enhance our ability to research and preserve biodiversity. The fusion of infra-red sensors and camera traps offers an inexpensive way to collect wildlife data in the form of images. However, extracting useful data from these images, such as the identification and counting of animals, remains a manual, time-consuming, and costly process. In this paper, we demonstrate that such information can be retrieved automatically using state-of-the-art deep learning methods. Another major challenge that ecologists face is counting a single animal multiple times when it reappears in other images taken by the same or other camera traps. Nonetheless, such information can be extremely useful for tracking wildlife and understanding its behaviour. To tackle the multiple-count problem, we have designed a meshed network of camera traps that share the captured images along with timestamps, cumulative counts, and the dimensions of the animal. The proposed method leverages edge computing to support real-time tracking and monitoring of wildlife. It has been validated in the field and can easily be extended to other applications in wildlife monitoring and management, where the traditional way of monitoring is expensive and time-consuming.
Keywords: computer vision, ecology, internet of things, invasive species management, wildlife management
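The multiple-count fix can be sketched as a merging rule over the shared records: a new sighting of the same species with a similar body size within a short time window is folded into an existing track instead of incrementing the count. The window, size tolerance, and records below are illustrative assumptions, not the paper's field parameters.

```python
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=10)   # assumed re-sighting window
SIZE_TOL = 0.2                   # assumed relative size tolerance

def merge_sightings(records):
    """records: (species, timestamp, size) tuples shared across the
    camera-trap mesh. Returns the deduplicated animal count."""
    tracks = []                  # one entry per distinct animal
    for species, ts, size in sorted(records, key=lambda r: r[1]):
        for track in tracks:
            last_species, last_ts, last_size = track[-1]
            if (species == last_species and ts - last_ts <= WINDOW
                    and abs(size - last_size) <= SIZE_TOL * last_size):
                track.append((species, ts, size))  # same animal, merge
                break
        else:
            tracks.append([(species, ts, size)])   # genuinely new animal
    return len(tracks)

t0 = datetime(2020, 1, 1, 6, 0)
records = [("deer", t0, 1.0),
           ("deer", t0 + timedelta(minutes=4), 1.05),  # re-sighting
           ("deer", t0 + timedelta(hours=2), 1.0)]     # later sighting
print(merge_sightings(records))
```

Running at the edge, each trap only needs the neighbours' latest (species, timestamp, size) records rather than raw images, which keeps the mesh traffic small.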
Procedia PDF Downloads 137
4207 Object-Scene: Deep Convolutional Representation for Scene Classification
Authors: Yanjun Chen, Chuanping Hu, Jie Shao, Lin Mei, Chongyang Zhang
Abstract:
Traditional image classification is based on an encoding scheme (e.g., Fisher Vector, Vector of Locally Aggregated Descriptors) applied to low-level image features (e.g., SIFT, HoG). Compared to these low-level local features, deep convolutional features obtained at the mid-level layers of convolutional neural networks (CNNs) carry richer information but lack geometric invariance. In scene classification, scenes contain scattered objects of varying size, category, layout, and number, so it is crucial to find the distinctive objects in a scene as well as their co-occurrence relationships. In this paper, we propose a method that takes advantage of both deep convolutional features and the traditional encoding scheme, while taking object-centric and scene-centric information into consideration. First, to exploit object-centric and scene-centric information, two CNNs, trained on the ImageNet and Places datasets respectively, are used as pre-trained models to extract deep convolutional features at multiple scales, producing dense local activations. By analyzing the performance of different CNNs at multiple scales, we found that each CNN works better in a different scale range; a scale-wise CNN adaptation is reasonable, since objects in a scene occur at their own specific scales. Second, a Fisher kernel is applied to aggregate a global representation at each scale, and the per-scale representations are merged into a single vector using a post-processing method called scale-wise normalization. The essence of the Fisher Vector lies in the accumulation of first- and second-order differences; hence, scale-wise normalization followed by average pooling balances the influence of each scale, since different numbers of features are extracted at each scale. Third, the Fisher Vector representation based on the deep convolutional features is fed to a linear Support Vector Machine, a simple yet efficient way to classify the scene categories.
Experimental results show that scale-specific feature extraction and normalization with CNNs trained on object-centric and scene-centric datasets can boost the results from 74.03% to 79.43% on MIT Indoor67 when only two scales are used (compared to results at a single scale). The result is comparable to state-of-the-art performance, which shows that the representation can be applied to other visual recognition tasks.
Keywords: deep convolutional features, Fisher Vector, multiple scales, scale-specific normalization
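The role of scale-wise normalization can be shown with a simplified aggregation sketch: per-scale descriptor sets of very different sizes are each accumulated and L2-normalised before average pooling, so no scale dominates just because it contributed more local features. This replaces the full Fisher Vector with a plain sum for brevity; the feature counts and dimensionality are illustrative.

```python
import numpy as np

def scale_wise_pool(per_scale_feats):
    """per_scale_feats: list of (n_i, d) activation arrays, one per
    scale; n_i differs because finer scales yield more descriptors.
    Each scale is aggregated and L2-normalised before average pooling
    (a simplified stand-in for the paper's Fisher Vector step)."""
    pooled = []
    for F in per_scale_feats:
        v = F.sum(axis=0)                  # accumulate, as FV does
        pooled.append(v / np.linalg.norm(v))  # scale-wise normalisation
    return np.mean(pooled, axis=0)         # average pooling over scales

rng = np.random.default_rng(1)
# 10, 40 and 160 local descriptors of dimension 8 at three scales
feats = [rng.normal(size=(n, 8)) for n in (10, 40, 160)]
rep = scale_wise_pool(feats)
print(rep.shape)
```

Without the per-scale normalisation, the finest scale (160 descriptors here) would swamp the pooled representation purely by count rather than by informativeness.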
Procedia PDF Downloads 331
4206 Targeting Calcium Dysregulation for Treatment of Dementia in Alzheimer's Disease
Authors: Huafeng Wei
Abstract:
Alzheimer's disease (AD) is the leading cause of dementia worldwide, and no effective treatments exist. Increasing evidence suggests that disruption of intracellular calcium homeostasis, primarily pathological elevation of cytosolic and mitochondrial calcium concentrations together with reduction of endoplasmic reticulum (ER) calcium concentrations, plays critical upstream roles in multiple pathologies and the associated neurodegeneration, impaired neurogenesis, and synaptic and cognitive dysfunction in various AD preclinical studies. The most recent Food and Drug Administration (FDA)-approved drug for AD dementia treatment, memantine, exerts its therapeutic effects by ameliorating N-methyl-D-aspartate (NMDA) glutamate receptor overactivation and the subsequent calcium dysregulation. More research is needed to develop other drugs targeting calcium dysregulation at multiple pharmacological sites of action for effective AD dementia treatment. In particular, calcium channel blockers used for the treatment of hypertension and dantrolene used for the treatment of muscle spasm and malignant hyperthermia could be repurposed for this purpose. In our own research, intranasal administration of dantrolene significantly increased its brain concentration and duration, rendering it a more effective therapeutic drug with fewer side effects for chronic AD dementia treatment. This review summarizes the progress of various studies repurposing drugs that target calcium dysregulation as potential disease-modifying drugs for effective AD dementia treatment.
Keywords: Alzheimer, calcium, cognitive dysfunction, dementia, neurodegeneration, neurogenesis
Procedia PDF Downloads 180
4205 Deep Reinforcement Learning Approach for Optimal Control of Industrial Smart Grids
Authors: Niklas Panten, Eberhard Abele
Abstract:
This paper presents a novel approach for real-time and near-optimal control of industrial smart grids by deep reinforcement learning (DRL). To achieve highly energy-efficient factory systems, the energetic linkage of machines, technical building equipment, and the building itself is desirable. However, the increased complexity of the interacting sub-systems, multiple time-variant target values, and stochastic influences from the production environment, weather, and energy markets make it difficult to efficiently control energy production, storage, and consumption in hybrid industrial smart grids. The studied deep reinforcement learning approach explores the solution space for control policies that minimize a cost function. The deep neural network of the DRL agent is based on a multilayer perceptron (MLP), Long Short-Term Memory (LSTM), and convolutional layers. The agent is trained within multiple Modelica-based factory simulation environments using the Advantage Actor-Critic algorithm (A2C). The DRL controller is evaluated in simulation and compared to a conventional, rule-based approach. The results indicate that the DRL approach is able to improve control performance and significantly reduce the energy and operating costs of industrial smart grids.
Keywords: industrial smart grids, energy efficiency, deep reinforcement learning, optimal control
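The heart of the A2C update named above is the advantage estimate built from discounted n-step returns. The sketch below shows only that computation; the rewards (negated energy costs), critic values, and discount factor are placeholders, not values from the Modelica simulation.

```python
import numpy as np

gamma = 0.99                                   # assumed discount factor
rewards = np.array([-1.2, -0.8, -1.5])         # e.g. negated energy costs
values = np.array([-10.0, -9.5, -9.0])         # critic estimates V(s_t)
bootstrap = -8.5                               # V(s_T) to bootstrap the tail

# Discounted n-step returns, computed backwards through the rollout
returns = np.empty_like(rewards)
acc = bootstrap
for t in reversed(range(len(rewards))):
    acc = rewards[t] + gamma * acc
    returns[t] = acc

advantages = returns - values                  # weights the actor's log-prob loss
critic_loss = np.mean(advantages ** 2)         # drives the value-function fit
print(np.round(advantages, 3))
```

In the full algorithm, the actor is pushed toward actions with positive advantage while the critic is regressed toward the returns; both gradients come from these same two quantities.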
Procedia PDF Downloads 193
4204 Resistance Spot Welding of Boron Steel 22MnB5 with Complex Welding Programs
Authors: Szymon Kowieski, Zygmunt Mikno
Abstract:
The study involved the optimization of process parameters during the resistance spot welding of Al-coated martensitic boron steel 22MnB5, used in hot stamping, performed with a programme featuring a multiple current impulse mode and a programme with variable electrode force. The aim of this work was to determine the potential for increased welded joint strength and to identify any expansion of the welding lobe. The process parameters were adjusted on the basis of welding process simulation and confronted with experimental data. 22MnB5 steel is known for its tendency to reach high hardness values in weld nuggets, often leading to interfacial failures (observed in the study-related tests). In addition, during resistance spot welding, many production-related factors can affect process stability, e.g. welding lobe narrowing, and lead to a deterioration of quality. Resistance spot welding performed using the above-named welding programme with three levels of force made it possible to achieve an 82% extension of the welding lobe. Joints made using the multiple current impulse programme, with a total welding time below 1.4 s, revealed a change in the peeling failure mode (to full plug) and a 10% increase in weld tensile shear strength.
Keywords: 22MnB5, hot stamping, interfacial fracture, resistance spot welding, simulation, single lap joint, welding lobe
Procedia PDF Downloads 385
4203 Pinch Technology for Minimization of Water Consumption at a Refinery
Authors: W. Mughees, M. Alahmad
Abstract:
Water is the most significant resource controlling local and global development. For the Gulf region, and especially Saudi Arabia with its limited potable water resources, the fresh water problem is highly significant. This research involves the design and analysis of pinch-based water/wastewater networks. Multiple water/wastewater networks were developed using pinch analysis with the direct recycle/material recycle method, and a property-integration technique was adopted to carry out the direct recycle method. A petroleum refinery was considered as a case study. In the direct recycle methodology, minimum water discharge and minimum fresh water resource targets were estimated, and the water allocation in the networks was re-designed (retrofitted). Chemical Oxygen Demand (COD) and hardness were taken as the pollutants. Based on single-contaminant approaches for COD and hardness, the amount of fresh water was reduced from 340.0 m3/h to 149.0 m3/h (43.8%) and 208.0 m3/h (61.18%), respectively, while with the double-contaminant approach, the reduction in fresh water demand was 132.0 m3/h (38.8%). The analysis was also carried out using a mathematical programming technique: optimization software (LINGO) was used for these studies, which verified the graphical method results in a valuable and accurate way. Among the multiple water networks, one feasible water allocation network was developed based on mass exchange.
Keywords: minimization, water pinch, water management, pollution prevention
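The direct-recycle targeting idea can be sketched with a tiny greedy allocation: each sink needs a flow at or below a maximum inlet contaminant concentration, cleaner process sources are reused first, and fresh water (assumed at 0 ppm) makes up the balance. The two sinks and two sources below are invented numbers, not streams from the refinery case study, and a greedy single-contaminant allocation is a simplification of the full pinch/LINGO targeting.

```python
def min_fresh_water(sinks, sources):
    """sinks: (flow t/h, max inlet ppm); sources: (flow t/h, ppm).
    Greedy single-contaminant direct-recycle targeting: reuse the
    cleanest source water first, top up each sink with fresh water."""
    avail = [list(s) for s in sorted(sources, key=lambda s: s[1])]
    fresh = 0.0
    for G, Zmax in sorted(sinks, key=lambda s: s[1]):  # tightest sink first
        need_flow, load_room = G, G * Zmax   # allowable contaminant load
        for src in avail:
            if need_flow <= 0:
                break
            f, c = src
            # largest reuse violating neither the flow nor the load limit
            take = min(f, need_flow, load_room / c if c > 0 else need_flow)
            src[0] -= take
            need_flow -= take
            load_room -= take * c
        fresh += need_flow                   # balance met by fresh water
    return fresh

sinks = [(50.0, 20.0), (100.0, 50.0)]        # (t/h, max ppm)
sources = [(60.0, 50.0), (80.0, 100.0)]      # (t/h, ppm)
print(min_fresh_water(sinks, sources))
```

The same targeting logic, posed as a linear programme over all source-sink matches (and extended to two contaminants), is what the LINGO models in the study solve exactly.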
Procedia PDF Downloads 476
4202 Balance Control Mechanisms in Individuals with Multiple Sclerosis in a Virtual Reality Environment
Authors: Badriah Alayidi, Emad Alyahya
Abstract:
Background: Most people with multiple sclerosis (MS) report worsening balance as the condition progresses. Poor balance control is also well known to be a significant risk factor for both falling and fear of falling. The increased risk of falls with disease progression thus makes balance control an essential target of gait rehabilitation among people with MS. Intervention programs have developed various methods to improve balance control, and accumulating evidence suggests that exercise programs may help people with MS improve their balance. Among these methods, virtual reality (VR) is growing in popularity as a balance-training technique owing to its potential benefits, including better compliance and greater user satisfaction. However, it is not clear whether a VR environment induces different balance control mechanisms in people with MS compared to healthy individuals or traditional environments. Therefore, this study aims to examine how individuals with MS control their balance in a VR setting. Methodology: The proposed study takes an empirical approach to estimating and determining balance responses in persons with MS in a VR environment. It will use primary data collected through patient observations, physiological and biomechanical evaluation of balance, and data analysis. Results: A preliminary systematic review and meta-analysis indicated variability in the outcomes used to assess balance responses in people with MS. The preliminary results of these assessments have the potential to provide essential indicators of the progression of MS and to contribute to the individualization of treatment and the evaluation of intervention effectiveness. The literature describes patients who have had the opportunity to experiment in VR settings and then used what they learned in the real world, suggesting that VR settings could be more appealing than conventional settings.
The findings of the proposed study will be useful in estimating and determining the effect of VR on balance control in persons with MS. In previous studies, VR was shown to be an interesting approach to neurological rehabilitation, but more data are needed to support this approach in MS. Conclusions: The proposed study enables an assessment of balance and the evaluation of a variety of physiological implications related to neural activity, as well as biomechanical implications related to movement analysis.
Keywords: multiple sclerosis, virtual reality, postural control, balance
Procedia PDF Downloads 74
4201 Mobility Management for Pedestrian Accident Predictability and Mitigation Strategies Using Multiple Linear Regression
Authors: Oscar Norman Nekesa, Yoshitaka Kajita
Abstract:
Tom Mboya Street is a vital urban corridor within Nairobi city that experiences high volumes of pedestrian and vehicular traffic. Despite past intervention measures, accident rates have remained high, highlighting significant safety concerns that need urgent attention. This study uses a multiple linear regression model to investigate the correlation between pedestrian accidents and significant independent variables, and to predict pedestrian accidents, in order to develop effective mobility management strategies for accident mitigation. The methodology involves collecting and analyzing data on pedestrian accidents and various related independent variables. Data sources include the National Transport and Safety Authority (NTSA), the Kenya National Bureau of Statistics, and Nairobi City County records, covering five years. The study investigates whether traffic volumes (pedestrian and vehicular), vehicular speed, human factors, illegal parking, policy issues, urban land use, the built environment, traffic signal conditions, inadequate lighting, and insufficient traffic control measures significantly predict the rate of pedestrian accidents. Explanatory variables related to road design and geometry are significant predictors in the models for the Tom Mboya Road link but less influential in the junction models along the 5 km stretch. The most impactful variable across all models was vehicular traffic flow. The study recommends infrastructural improvements, enhanced enforcement, and public awareness campaigns to reduce accidents and improve urban mobility. These insights can inform policy-making and urban planning to enhance pedestrian safety along the densely packed Tom Mboya Street and in similar urban settings.
The findings will inform evidence-based interventions to enhance pedestrian safety and improve urban mobility.
Keywords: multiple linear regression, urban mobility, traffic management, Nairobi, Tom Mboya Street, infrastructure conditions, pedestrian safety, correlation and prediction
Procedia PDF Downloads 22