Search results for: modifiable areal unit problem (MAUP)
8158 Application of Fatty Acid Salts for Antimicrobial Agents in Koji-Muro
Authors: Aya Tanaka, Mariko Era, Shiho Sakai, Takayoshi Kawahara, Takahide Kanyama, Hiroshi Morita
Abstract:
Objectives: Aspergillus niger and Aspergillus oryzae are used as koji fungi in brewing. Because koji-muro (the room in which koji is made) has a low level of airtightness, microbial contamination has long been a concern in alcoholic beverage production. We therefore focused on fatty acid salts, the main components of soap, which have been reported to show antibacterial and antifungal activity. This study examined the antimicrobial activities of fatty acid salts against Aspergillus and Bacillus spp. in order to assess their effectiveness as antimicrobial agents in koji-muro. Materials & Methods: A. niger NBRC 31628, A. oryzae NBRC 5238, A. oryzae (Akita Konno store) and Bacillus subtilis NBRC 3335 were chosen as test strains. Nine fatty acid salts, potassium butyrate (C4K), caproate (C6K), caprylate (C8K), caprate (C10K), laurate (C12K), myristate (C14K), oleate (C18:1K), linoleate (C18:2K) and linolenate (C18:3K), at 350 mM and pH 10.5 were tested for antimicrobial activity. Fatty acid salts and spore suspensions were prepared in plastic tubes. The spore suspension of each fungus (3.0×10⁴ spores/mL) or the bacterial suspension (3.0×10⁵ CFU/mL) was mixed with each fatty acid salt (final concentration 175 mM). The mixtures were incubated at 25 °C. Samples were counted at 0, 10, 60, and 180 min by plating (100 µL) on potato dextrose agar. Fungal and bacterial colonies were counted after incubation for 1 or 2 days at 30 °C. The MIC (minimum inhibitory concentration) was defined as the lowest concentration sufficient to inhibit visible spore growth after 10 min of incubation. MICs against fungi and bacteria were determined using the two-fold dilution method. Each fatty acid salt was separately inoculated with 400 µL of Aspergillus spp. or B. subtilis NBRC 3335 at 3.0×10⁴ spores/mL or 3.0×10⁵ CFU/mL. Results: No obvious change was observed for the tested fatty acid salts against A. niger and A. oryzae.
However, C12K showed a 5 log-unit antibacterial effect against B. subtilis within 10 min of incubation; that is, C12K suppressed 99.999% of bacterial growth. C10K showed a 5 log-unit antibacterial effect against B. subtilis after 180 min of incubation, and C18:1K, C18:2K and C18:3K showed 5 log-unit effects within 10 min. Comparing saturated and unsaturated fatty acid salts, however, the saturated salts are lower in cost. These results suggest that C12K has potential for use in koji-muro. Antimicrobial activity against other fungi and bacteria should be evaluated in the future.
Keywords: Aspergillus, antimicrobial, fatty acid salts, koji-muro
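The 5 log-unit reduction reported above corresponds to 99.999% suppression of viable counts; a minimal sketch of the arithmetic (function names are illustrative, not from the paper):

```python
import math

def log_reduction(n0, n_t):
    """Log-unit reduction in viable counts, from initial (n0) to
    treated (n_t) counts."""
    return math.log10(n0 / n_t)

def percent_kill(log_units):
    """Percentage of cells killed for a given log-unit reduction."""
    return (1.0 - 10.0 ** (-log_units)) * 100.0
```

For example, a drop from 3.0×10⁵ CFU/mL to 3 CFU/mL is a 5 log-unit reduction, i.e. 99.999% kill.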
Procedia PDF Downloads 554
8157 An Adiabatic Quantum Optimization Approach for the Mixed Integer Nonlinear Programming Problem
Authors: Maxwell Henderson, Tristan Cook, Justin Chan Jin Le, Mark Hodson, YoungJung Chang, John Novak, Daniel Padilha, Nishan Kulatilaka, Ansu Bagchi, Sanjoy Ray, John Kelly
Abstract:
We present a method of using adiabatic quantum optimization (AQO) to solve a mixed integer nonlinear programming (MINLP) problem instance. The MINLP problem is a general form of a set of NP-hard optimization problems that are critical to many business applications. It requires optimizing a set of discrete and continuous variables with nonlinear and potentially nonconvex constraints. Obtaining an exact, optimal solution for MINLP problem instances of non-trivial size using classical computation methods is currently intractable. Current leading algorithms leverage heuristic and divide-and-conquer methods to determine approximate solutions. Creating more accurate and efficient algorithms is an active area of research. Quantum computing (QC) has several theoretical benefits compared to classical computing, through which QC algorithms could obtain MINLP solutions that are superior to current algorithms. AQO is a particular form of QC that could offer more near-term benefits compared to other forms of QC, as hardware development is in a more mature state and devices are currently commercially available from D-Wave Systems Inc. It is also designed for optimization problems: it uses an effect called quantum tunneling to explore all lowest points of an energy landscape where classical approaches could become stuck in local minima. Our work used a novel algorithm formulated for AQO to solve a special type of MINLP problem. The research focused on determining: 1) if the problem is possible to solve using AQO, 2) if it can be solved by current hardware, 3) what the currently achievable performance is, 4) what the performance will be on projected future hardware, and 5) when AQO is likely to provide a benefit over classical computing methods. Two different methods, integer range and 1-hot encoding, were investigated for transforming the MINLP problem instance constraints into a mathematical structure that can be embedded directly onto the current D-Wave architecture. 
For testing and validation, a D-Wave 2X device was used, as well as QxBranch’s QxLib software library, which includes a QC simulator based on simulated annealing. Our results indicate that it is mathematically possible to formulate the MINLP problem for AQO, but that currently available hardware is unable to solve problems of useful size. Classical general-purpose simulated annealing is currently able to solve larger problem sizes, but it does not scale well, and such methods would likely be outperformed in the future by improved AQO hardware with higher qubit connectivity and lower temperatures. If larger AQO devices are able to show improvements that trend in this direction, commercially viable solutions to the MINLP for particular applications could be implemented on hardware projected to be available in 5-10 years. Continued investigation into optimal AQO hardware architectures and novel methods for embedding MINLP problem constraints onto those architectures is needed to realize those commercial benefits.
Keywords: adiabatic quantum optimization, mixed integer nonlinear programming, quantum computing, NP-hard
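The two constraint-transformation schemes mentioned, integer-range and 1-hot encoding, both map an integer decision variable onto binary qubit variables before embedding; a schematic sketch of the two encodings (all names are illustrative, not QxLib's API):

```python
def one_hot_encode(value, lo, hi):
    """1-hot encoding: one binary variable per admissible integer value
    in [lo, hi], with the implied constraint that exactly one bit is set."""
    return [1 if v == value else 0 for v in range(lo, hi + 1)]

def one_hot_decode(bits, lo):
    """Recover the integer from a valid 1-hot bit vector."""
    return lo + bits.index(1)

def integer_range_encode(value, n_bits):
    """Integer-range (binary) encoding: value = sum_i b_i * 2**i.
    Uses fewer qubits than 1-hot, at the price of denser couplings."""
    return [(value >> i) & 1 for i in range(n_bits)]

def integer_range_decode(bits):
    return sum(b << i for i, b in enumerate(bits))
```

The qubit-count trade-off is visible directly: representing an integer in [0, 15] takes 16 qubits with 1-hot encoding but only 4 with integer-range encoding.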
Procedia PDF Downloads 525
8156 Synthesis, Characterization and Catecholase Study of Novel Bidentate Schiff Base Derived from Dehydroacetic Acid
Authors: Salima Tabti, Chaima Maouche, Tinhinene Louaileche, Amel Djedouani, Ismail Warad
Abstract:
A novel Schiff base ligand HL has been synthesized by condensation of an aromatic amine with dehydroacetic acid (DHA). It was characterized by UV-Vis, FT-IR, MS, NMR (¹H, ¹³C) and single-crystal X-ray diffraction. The crystal structure shows that the compound crystallizes in a triclinic system in the P-1 space group with two units per cell (Z = 2). The asymmetric unit contains one independent molecule, whose conformation is determined by an intramolecular N-H…O hydrogen bond with an S(6) ring motif. The molecule has an (E) conformation about the C=N bond. The dihedral angle between the phenyl and pyran ring planes is 89.37(1)°, so the two planes are approximately perpendicular. The catecholase activity of in situ copper complexes of this ligand has been investigated against catechol. The progress of the oxidation reactions was monitored over time by following the strong catechol absorption peak using UV-Vis spectroscopy. Oxidation rates were determined from the initial slopes of absorbance-time plots and then analyzed with Michaelis-Menten kinetics. Catechol oxidation reactions were carried out using different concentrations of copper acetate and ligand (L/Cu: 1/1, 1/2, 2/1). The results show that all the complexes were able to catalyze the oxidation of catechol, with the acetate complexes showing the highest activity. Catalysis is a branch of chemical kinetics that studies the influence of the physical and chemical factors determining reaction rates. It solves many problems in chemical reaction processes, especially for green, economical and less polluting chemistry. For this reason, the search for new catalysts for known organic reactions occupies an advanced place among the themes pursued by chemists.
Keywords: dehydroacetic acid, catechol, copper, catecholase activity, X-ray
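The initial-rate analysis described above fits absorbance-time slopes to the Michaelis-Menten rate law v = Vmax·[S]/(Km + [S]); a minimal sketch of such a fit via the classical Lineweaver-Burk linearization (a common choice for this step, though not necessarily the authors' exact procedure):

```python
def michaelis_menten_rate(s, vmax, km):
    """Michaelis-Menten rate law: v = Vmax * [S] / (Km + [S])."""
    return vmax * s / (km + s)

def lineweaver_burk_fit(s_vals, v_vals):
    """Estimate (Vmax, Km) from initial rates via the linearization
    1/v = (Km/Vmax) * (1/[S]) + 1/Vmax, using ordinary least squares."""
    xs = [1.0 / s for s in s_vals]
    ys = [1.0 / v for v in v_vals]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
        (x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    vmax = 1.0 / intercept
    return vmax, slope * vmax  # (Vmax, Km)
```

With exact synthetic rates the fit recovers the generating parameters; with noisy initial-slope data a nonlinear fit is usually preferred.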
Procedia PDF Downloads 110
8155 Design and Analysis of a Piezoelectric Linear Motor Based on Rigid Clamping
Authors: Chao Yi, Cunyue Lu, Lingwei Quan
Abstract:
Piezoelectric linear motors offer excellent electromagnetic compatibility, high positioning accuracy, a compact structure and no need for a deceleration mechanism, which makes them promising for micro-miniature precision drive systems. However, most piezoelectric motors employ flexible clamping, which has insufficient rigidity and is difficult to use for rapid positioning; this clamping method also seriously reduces the vibration efficiency of the vibrating unit. To solve these problems, this paper proposes a piezoelectric stack linear motor based on double-end rigid clamping. First, a piezoelectric linear motor with a length of only 35.5 mm is designed, composed mainly of a motor stator, a driving foot, a ceramic friction strip, a linear guide, a pre-tightening mechanism and a base. This structure is much simpler and smaller than most similar motors, easy to assemble, and amenable to precise control. In addition, the properties of the piezoelectric stack are reviewed and, in order to obtain an elliptical motion trajectory at the driving head, a driving scheme based on a longitudinal-shear composite stack is proposed. Finally, impedance analysis and speed performance testing were performed on the prototype. The motor reaches speeds of up to 25.5 mm/s under a signal voltage of 120 V at a frequency of 390 Hz. The results show that the proposed piezoelectric stack linear motor achieves good performance: it runs smoothly over a large speed range, making it suitable for precision control in medical imaging, aerospace, precision machinery and many other fields.
Keywords: piezoelectric stack, linear motor, rigid clamping, elliptical trajectory
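The longitudinal-shear composite drive produces an elliptical tip trajectory by superposing two orthogonal sinusoidal displacements with a phase shift; a minimal kinematic sketch (amplitudes and phase are illustrative, not the paper's values):

```python
import math

def drive_tip_position(t, freq_hz, a_long, a_shear, phase_rad):
    """Tip displacement when the longitudinal stack (x) and shear stack (y)
    are driven by sinusoids of the same frequency, phase-shifted by
    phase_rad. With a 90-degree shift the tip traces an ellipse with
    semi-axes a_long and a_shear."""
    w = 2.0 * math.pi * freq_hz
    x = a_long * math.sin(w * t)
    y = a_shear * math.sin(w * t + phase_rad)
    return x, y
```

At the phase shift of π/2, the tip starts at the top of the ellipse (x = 0, y = a_shear) and reaches the side (x = a_long, y ≈ 0) a quarter period later.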
Procedia PDF Downloads 153
8154 Multi-Objective Four-Dimensional Traveling Salesman Problem in an IoT-Based Transport System
Authors: Arindam Roy, Madhushree Das, Apurba Manna, Samir Maity
Abstract:
In this research paper, an algorithmic approach is developed to solve a novel multi-objective four-dimensional traveling salesman problem (MO4DTSP), in which different paths with various numbers of conveyances are available for travel between two cities. NSGA-II and decomposition-based algorithms are modified to solve the MO4DTSP in an IoT-based transport system. Such a system can be widely observed, analyzed, and controlled through an extensive distribution of traffic networks consisting of various types of sensors and actuators. Due to urbanization, most cities are connected by intelligent traffic management systems. In practice, a traveler can choose among multiple routes and vehicles between any two cities; thus, the classical TSP is reformulated as a multi-route, multi-vehicle problem, i.e., a 4DTSP. The proposed MO4DTSP takes traveling cost, time, and customer satisfaction as objectives. Customer satisfaction is an important parameter that depends on travel cost and time, and this dependence is reflected in the present model.
Keywords: multi-objective four-dimensional traveling salesman problem (MO4DTSP), decomposition, NSGA-II, IoT-based transport system, customer satisfaction
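In the 4DTSP, a solution assigns not only a city ordering but also a route and a conveyance to every leg of the tour; a minimal sketch of evaluating one objective (total cost) for such a solution, with an illustrative data layout not taken from the paper:

```python
def tour_cost(tour, choices, cost):
    """Total cost of one 4DTSP solution.

    tour:    city sequence, e.g. [0, 1, 2]; the tour returns to the start.
    choices: per leg, a (route_index, vehicle_index) pair.
    cost:    dict mapping (city_i, city_j, route, vehicle) -> leg cost.
    """
    total = 0.0
    for k in range(len(tour)):
        i, j = tour[k], tour[(k + 1) % len(tour)]
        r, v = choices[k]
        total += cost[(i, j, r, v)]
    return total
```

A multi-objective solver such as NSGA-II would evaluate time and customer satisfaction the same way, one table per objective, and rank solutions by Pareto dominance rather than by a single sum.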
Procedia PDF Downloads 110
8153 A Fuzzy Multiobjective Model for Bed Allocation Optimized by Artificial Bee Colony Algorithm
Authors: Jalal Abdulkareem Sultan, Abdulhakeem Luqman Hasan
Abstract:
With increasing competition among health care systems, hospitals face more and more pressure, and resource allocation has a vital effect on achieving competitive advantage. Selecting the appropriate number of beds is one of the most important decisions in hospital management. In real situations, however, bed allocation is a multiple-objective problem over different items with vagueness and randomness in the data, which makes it very complex. Research on the bed allocation problem that considers multiple departments, nursing hours, and stochastic information about patient arrival and service is relatively scarce. In this paper, we develop a fuzzy multiobjective bed allocation model that handles uncertainty and multiple departments. Fuzzy objectives and weights are applied simultaneously to help managers select suitable bed numbers for the different departments. The proposed model is solved using the Artificial Bee Colony (ABC) algorithm, a very effective metaheuristic. The paper describes an application of the model to a public hospital in Iraq. The results indicate that the fuzzy multi-objective model provides a suitable framework for bed allocation and optimal utilization.
Keywords: bed allocation problem, fuzzy logic, artificial bee colony, multi-objective optimization
Procedia PDF Downloads 324
8152 Technology of Gyro Orientation Measurement Unit (Gyro Omu) for Underground Utility Mapping Practice
Authors: Mohd Ruzlin Mohd Mokhtar
Abstract:
At present, most operators working on projects for utilities such as power, water, oil, gas, telecommunications and sewerage use technologies such as total stations, the Global Positioning System (GPS), Electromagnetic Locators (EML) and Ground Penetrating Radar (GPR) to perform underground utility mapping. With the increasing popularity of the Horizontal Directional Drilling (HDD) method among local authorities and asset owners, most newly installed underground utilities are laid using HDD. The method is seen as simple and causes little disturbance to the public and to traffic, so it has become the preferred installation method in most areas, especially urban ones. HDD bores are installed much deeper than existing utilities (some reports put the average HDD depth at 5 meters). However, this affects the accuracy and capability of existing underground utility mapping technologies: in most Malaysian underground soil conditions, these technologies are limited to a maximum depth of about 3 meters, so utilities installed deeper than that cannot be detected with existing tools. The accuracy and reliability of existing underground utility mapping technologies and work procedures are therefore in doubt, and a mitigation action plan is required. When installing a new utility by the HDD method, more accurate underground utility mapping can be achieved by using a Gyro OMU than with existing practice based on, e.g., EML and GPR. The Gyro OMU accurately records the location of the HDD bore, so the resulting map can be referenced to avoid the cost of breakdowns caused by future HDD works relying on inaccurate underground utility maps.
Keywords: Gyro Orientation Measurement Unit (Gyro OMU), Horizontal Directional Drilling (HDD), Ground Penetrating Radar (GPR), Electromagnetic Locator (EML)
Procedia PDF Downloads 140
8151 A Mean–Variance–Skewness Portfolio Optimization Model
Authors: Kostas Metaxiotis
Abstract:
Portfolio optimization is one of the most important topics in finance. This paper proposes a mean–variance–skewness (MVS) portfolio optimization model. Traditionally, the portfolio optimization problem is solved using the mean–variance (MV) framework. In this study, we formulate the proposed model as a three-objective optimization problem in which the portfolio's expected return and skewness are maximized while the portfolio risk is minimized. To solve the proposed three-objective portfolio optimization model, we apply an adapted version of the non-dominated sorting genetic algorithm (NSGA-II). Finally, we use a real dataset from the FTSE-100 to validate the proposed model.
Keywords: evolutionary algorithms, portfolio optimization, skewness, stock selection
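The three objectives of the MVS model are the sample mean, variance, and skewness of portfolio returns; a minimal sketch of computing them from historical data (a plain-Python illustration, not the authors' implementation):

```python
def portfolio_stats(weights, returns):
    """Mean, variance, and skewness of portfolio returns.

    weights:     portfolio weights, one per asset.
    returns:     returns[t][i] = return of asset i in period t.
    Skewness is the third standardized moment of the portfolio's
    per-period returns.
    """
    port = [sum(w * r for w, r in zip(weights, row)) for row in returns]
    n = len(port)
    mean = sum(port) / n
    var = sum((x - mean) ** 2 for x in port) / n
    skew = (sum((x - mean) ** 3 for x in port) / n) / var ** 1.5 if var > 0 else 0.0
    return mean, var, skew
```

In the three-objective formulation, mean and skewness are maximized while variance (risk) is minimized, so no single scalar score is formed; NSGA-II ranks candidate weight vectors by Pareto dominance over these three values.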
Procedia PDF Downloads 198
8150 Optimization and Energy Management of Hybrid Standalone Energy System
Authors: T. M. Tawfik, M. A. Badr, E. Y. El-Kady, O. E. Abdellatif
Abstract:
Electric power shortage is a serious problem in remote rural communities in Egypt. Over the past few years, the electrification of remote communities, including efficient on-site utilization of energy resources, has made considerable progress. Remote communities are usually fed from diesel generator (DG) networks because they need reliable energy and affordable fresh water. The main objective of this paper is to design an optimal, economical power supply from a hybrid standalone energy system (HSES) as an alternative energy source. It covers the energy requirements of a reverse osmosis desalination unit (DU) located at the National Research Centre farm in Noubarya, Egypt. The proposed system consists of PV panels, wind turbines (WT), batteries, and a DG as backup, supplying a DU load of 105.6 kWh/day with a 6.6 kW peak load, operating 16 hours a day. The optimization objective for the HSES is to select the suitable size of each system component and a control strategy that together provide a reliable, efficient, and cost-effective system, using net present cost (NPC) as the criterion. Harmonizing the different energy sources, energy storage, and load requirements is a difficult and challenging task, so the performance of the various available configurations is investigated economically and technically using the iHOGA software, which is based on a genetic algorithm (GA). The achieved optimum configuration is further refined by optimizing the energy extracted from renewable sources. Effectively minimizing the energy used to charge the battery ensures that most of the generated energy directly supplies the demand, increasing the utilization of the generated energy.
Keywords: energy management, hybrid system, renewable energy, remote area, optimization
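Net present cost, the selection criterion above, discounts each year's operating cost back to the present and adds it to the capital cost; a minimal sketch of the computation (a simplified form of what tools like iHOGA evaluate; parameter names are illustrative):

```python
def net_present_cost(capital, annual_cost, discount_rate, lifetime_years):
    """Net present cost of a system configuration: up-front capital plus
    annual O&M/fuel costs discounted to the present at discount_rate."""
    npc = capital
    for year in range(1, lifetime_years + 1):
        npc += annual_cost / (1.0 + discount_rate) ** year
    return npc
```

An optimizer then compares configurations (component sizes plus control strategy) by their NPC over the project lifetime, subject to the reliability constraints of the load.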
Procedia PDF Downloads 199
8149 Linear Stability of Convection in an Inclined Channel with Nanofluid Saturated Porous Medium
Authors: D. Srinivasacharya, Nidhi Humnekar
Abstract:
The goal of this research is to numerically investigate the convection of nanofluid flow in an inclined porous channel. The nanofluid model accounts for Brownian motion and thermophoresis effects, and the flow in the porous region is governed by Brinkman’s equation. The generalized eigenvalue problem for the perturbed state is obtained using normal mode analysis, and the Chebyshev spectral collocation method is used to solve it. For various values of the governing parameters, the critical wavenumber and critical Rayleigh number are calculated, and the preferred modes are identified.
Keywords: Brinkman model, inclined channel, nanofluid, linear stability, porous media
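Chebyshev spectral collocation discretizes the eigenvalue problem at Chebyshev–Gauss–Lobatto points; a minimal sketch of generating those collocation points (a standard construction, not specific to this paper):

```python
import math

def chebyshev_gauss_lobatto(n):
    """Chebyshev-Gauss-Lobatto collocation points x_j = cos(pi * j / n),
    j = 0..n, on [-1, 1]. The points cluster near the boundaries, which
    is what gives the method its accuracy for boundary-layer-like modes."""
    return [math.cos(math.pi * j / n) for j in range(n + 1)]
```

The differentiation matrices built on these points turn the linearized stability equations into a matrix generalized eigenvalue problem, whose eigenvalues yield the critical Rayleigh number.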
Procedia PDF Downloads 112
8148 Certain Results of a New Class of Meromorphic Multivalent Functions Involving Ruscheweyh Derivative
Authors: Kassim A. Jassim
Abstract:
In the present paper, we introduce and discuss a new class Kp(λ,α) of meromorphic multivalent functions in the punctured unit disk U*={z∈ℂ : 0<|z|<1} defined by the Ruscheweyh derivative. We obtain some sufficient conditions for functions to belong to the class Kp(λ,α).
Keywords: meromorphic multivalent function, Ruscheweyh derivative, Hadamard product
Procedia PDF Downloads 336
8147 Sensor Network Routing Optimization by Simulating Eurygaster Life in Wheat Farms
Authors: Fariborz Ahmadi, Hamid Salehi, Khosrow Karimi
Abstract:
A sensor network is a set of sensor nodes that cooperate to perform predefined tasks. An important problem in such networks is power consumption. In this paper, an algorithm based on the life of the eurygaster is introduced to minimize the power consumed by the nodes of these networks. In this method, the search space of the problem is divided into several partitions, and each partition is investigated separately. The evaluation results show that our approach is more efficient than other evolutionary algorithms such as the genetic algorithm.
Keywords: evolutionary computation, genetic algorithm, particle swarm optimization, sensor network optimization
Procedia PDF Downloads 428
8146 Object Oriented Fault Tree Analysis Methodology
Abstract:
Traditional safety, risk and reliability analysis approaches are problem-oriented, which creates a great workload when analyzing complicated and large systems; moreover, much repetitive work is required when the analyzed system is composed of many similar components. An object- and function-oriented approach is urgently needed to maintain close consistency with the problem domain. A new approach is proposed to overcome these shortcomings of the traditional approaches: the concepts of class, abstraction, inheritance, polymorphism and encapsulation are introduced into fault tree analysis (FTA), and a professional class library is established containing abstractions of the physical objects in the real world, with four areas of relevant information proposed as a guide for building it. Interaction between classes is carried out by internal or external methods that map attributes to basic events through a full search of the knowledge base, which yields good encapsulation. An object-oriented fault tree analysis system is set up that analyzes and evaluates system safety and reliability according to the original appearance of the problem, mapping directly from classes and objects to the problem domain of the fault tree analysis. All system failure situations can be analyzed through this bottom-up fault tree construction approach. Under this architecture, an FTA approach is developed that avoids the influence of the analyst on the analysis results. It reveals the inherent safety problems of the analyzed system itself and provides a new way of thinking for the development of safety analysis, benefiting the application and development of object-oriented technology in the safety field and encouraging innovation in safety theory.
Keywords: FTA, knowledge base, object-oriented technology, reliability analysis
Procedia PDF Downloads 248
8145 Approximation of Intersection Curves of Two Parametric Surfaces
Authors: Misbah Irshad, Faiza Sarfraz
Abstract:
The problem of approximating the intersection of two surfaces is considered very important in computer-aided geometric design and computer-aided manufacturing. Although it is a complex problem to handle, its continuing need in industry makes it an active research topic. A technique for approximating the intersection curves of two parametric surfaces is proposed, which extracts boundary points and turning points from a sequence of intersection points and interpolates them with rational cubic spline functions. The proposed approach is demonstrated with examples and analyzed by calculating the error.
Keywords: approximation, parametric surface, spline function, surface intersection
Procedia PDF Downloads 270
8144 Autophagy Defects That Modify Human Immune Cell Metabolism and Promote Aging-Associated Inflammation
Authors: Grace McCambridge, Alanna Keady, Madhur Agrawal, Dequina Nicholas Alvarado, Barbara Nikolajczyk, Leena Panneerseelan-Bharath
Abstract:
Age is a non-modifiable risk factor for the inflammation that underlies pathologies such as type 2 diabetes mellitus (T2DM). Inflammation, as indicated by circulating cytokines, rises in aging, but the mechanisms that promote this ‘inflammaging’ remain poorly defined. Furthermore, the downstream consequences of inflammaging, including the development of an inflammatory profile that predicts comorbidities like T2DM, remain speculative. We tested the possibility that natural aging-associated changes in autophagy, a process that is compromised in both aging and T2DM, regulate inflammatory profiles in older subjects. Our data showed that circulating CD4⁺ T cells from older compared to younger subjects have (i) defects in autophagy; (ii) higher mitochondrial accumulation; (iii) a failure to metabolically shift from oxidative phosphorylation to anaerobic glycolysis upon αCD3/CD28 activation; (iv) more reactive oxygen species (ROS) accumulation; and (v) a cytokine profile that recapitulates the Th17 profile that predicts T2DM. ROS scavenging in cells from older subjects restored mitochondrial mass and membrane potential (indicators of improved autophagy) and reduced Th17 cytokines to the amounts made by T cells from younger subjects. Knock-down of the autophagy protein Atg3 in T cells from younger subjects increased mitochondrial accumulation and Th17 cytokines. To begin translating these findings to clinical practice, we showed that a physiological concentration of the diabetes drug metformin (100 µM) added in vitro enhanced autophagy, prevented mitochondria and ROS accumulation, increased anaerobic glycolysis, and decreased Th17 cytokines in activated CD4⁺ T cells from older subjects. Metformin therefore improves autophagy and multiple downstream pro-inflammatory mechanisms in CD4⁺ T cells from older subjects. We conclude that autophagy improvement ameliorates the development of a T2DM-predictive Th17 profile in aging, and thus holds promise for delaying or preventing aging-associated metabolic decline.
Keywords: autophagy, mitochondrial turnover, ROS, glycolysis
Procedia PDF Downloads 164
8143 Manual Wheelchair Propulsion Efficiency on Different Slopes
Authors: A. Boonpratatong, J. Pantong, S. Kiattisaksophon, W. Senavongse
Abstract:
In this study, an integrated sensing and modeling system for measuring manual wheelchair propulsion and calculating propulsion efficiency was used to indicate the level of overuse. Seven subjects participated in the measurements. On a level surface, the propulsion efficiencies did not differ significantly as riding speed increased. By contrast, the propulsion efficiencies on a 15-degree incline were restricted to around 0.5. These results are supported by previously reported relationships between wheeling resistance and propulsion torque, implying a margin of overuse. Upper-limb musculoskeletal injuries and syndromes in manual wheelchair riders are common and chronic, and may be caused, to varying degrees, by overuse, i.e., repetitive riding on steep inclines. Quantitative analysis of the mechanical effectiveness of manual wheeling that establishes the relationship between riding difficulty, mechanical effort and propulsion output is scarce, possibly due to the challenge of measuring those factors simultaneously in conventional manual wheelchairs and everyday environments. In this study, the integrated sensing and modeling system was used to measure manual wheelchair propulsion efficiency in conventional manual wheelchairs and everyday environments. The sensing unit comprises contact pressure and inertia sensors, which are portable and universal. Four healthy male and three healthy female subjects participated in measurements on level and 15-degree inclined surfaces. Subjects were asked to ride the manual wheelchair at three different self-selected speeds on the level surface and only at their preferred speed on the 15-degree incline. Five trials were performed in each condition. The kinematic data of the subject’s dominant hand, a spoke, and the trunk of the wheelchair were collected through the inertia sensors.
The compression force applied by the thumb of the dominant hand to the push rim was collected through the contact pressure sensors, and the signals from all sensors were recorded synchronously. The subject-selected speeds for slow, preferred and fast riding on the level surface and the subject-preferred speed on the 15-degree incline were recorded. The propulsion efficiency, defined as the ratio between the pushing force in the tangential direction of the push rim and the net force resulting from the three-dimensional riding motion, was derived by solving the inverse dynamics problem in the modeling unit. The intra-subject variability of the riding speed did not differ significantly as the self-selected speed increased on the level surface. Since the riding speed on the 15-degree incline was difficult to regulate, intra-subject variability was not evaluated there. On the level surface, the propulsion efficiencies did not differ significantly as the riding speed increased. However, the propulsion efficiencies on the 15-degree incline were restricted to around 0.5 for all subjects at their preferred speed. These results are supported by the previously reported relationship between wheeling resistance and propulsion torque, in which the wheelchair axle torque increased but muscle activities did not when the resistance was high. This implies that the margin of dynamic effort at relatively high resistance is similar to the margin of overuse indicated by the restricted propulsion efficiency on the 15-degree incline.
Keywords: contact pressure sensor, inertia sensor, integrated sensing and modeling system, manual wheelchair propulsion efficiency, manual wheelchair propulsion measurement, tangential force, resultant force, three-dimensional riding motion
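The propulsion efficiency defined above is the ratio of the tangential force component at the push rim to the magnitude of the resultant three-dimensional force; a minimal sketch of that ratio (vector layout is illustrative):

```python
import math

def propulsion_efficiency(force_vec, tangent_unit):
    """Ratio of the force component along the push-rim tangent to the
    magnitude of the resultant 3D force. force_vec is the measured force
    vector; tangent_unit is the unit vector tangent to the push rim at
    the contact point. An efficiency of 1 means all force propels the rim."""
    f_tan = sum(f * t for f, t in zip(force_vec, tangent_unit))
    f_res = math.sqrt(sum(f * f for f in force_vec))
    return f_tan / f_res if f_res > 0 else 0.0
```

On this definition, the incline result above means that at 15 degrees only about half of the applied force contributed to propulsion.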
Procedia PDF Downloads 290
8142 Effects of Spent Dyebath Recycling on Pollution and Cost of Production in a Cotton Textile Industry
Authors: Dinesh Kumar Sharma, Sanjay Sharma
Abstract:
The textile manufacturing industry uses a substantial amount of chemicals, not only in its production processes but also in manufacturing its raw materials. Dyes are the most significant raw material, providing colour to fabric and yarn, and are themselves produced using large amounts of both organic and inorganic chemicals. Dyes are further classified into reactive and vat dyes, which are mostly used on cotton textiles. In applying dyes to cotton fiber, yarn or fabric, several auxiliary chemicals are also added to the solution, called the dyebath, to improve dye absorption. Absorption of the dyes and auxiliary chemicals is far from complete, and a residual amount of all these substances is released as spent dyebath effluent. Because of the wide variety of chemicals used in cotton textile dyes, there is always a risk of harmful effects that may not be apparent immediately but may have an irreversible long-term impact. The colour imparted by the dyes to the water also adversely affects its public acceptability and potability. This study was conducted with the objective of assessing the feasibility of reusing the spent dyebath. Studies were conducted in two independent plants, manufacturing dyed cotton yarn and dyed cotton fabric respectively, referred to here as Unit-I and Unit-II. The studies included assessment of the reduction in pollution levels and the economic benefits of such reuse. The study conclusively establishes that reuse of the spent dyebath results in prevention of pollution, reduction in pollution loads, and lower costs of effluent treatment and production. This pollution prevention technique presents a good proposition for pollution prevention in the cotton textile industry.
Keywords: dyes, dyebath, reuse, toxic, pollution, costs
Procedia PDF Downloads 392
8141 Divergence Regularization Method for Solving Ill-Posed Cauchy Problem for the Helmholtz Equation
Authors: Benedict Barnes, Anthony Y. Aidoo
Abstract:
A Divergence Regularization Method (DRM) is used to regularize the ill-posed Helmholtz equation with inhomogeneous boundary deflection in a Hilbert space H. The DRM incorporates a positive integer scalar that homogenizes the inhomogeneous boundary deflection in the Cauchy problem for the Helmholtz equation. This ensures the existence, as well as the uniqueness, of a solution for the equation. The DRM restores all three conditions of well-posedness in the sense of Hadamard.
Keywords: divergence regularization method, Helmholtz equation, ill-posed inhomogeneous Cauchy boundary conditions
Procedia PDF Downloads 189
8140 The DAQ Debugger for iFDAQ of the COMPASS Experiment
Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius
Abstract:
In general, state-of-the-art Data Acquisition Systems (DAQ) in high energy physics experiments must satisfy high requirements in terms of reliability, efficiency and data rate capability. This paper presents the development and deployment of a debugging tool named DAQ Debugger for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. Utilizing a hardware event builder, the iFDAQ is designed to read out data at the experiment's average maximum rate of 1.5 GB/s. In complex software such as the iFDAQ, comprising thousands of lines of code, the debugging process is absolutely essential to reveal all software issues. Unfortunately, conventional debugging of the iFDAQ is not possible during real data taking. The DAQ Debugger is a tool for identifying a problem, isolating its source, and then either correcting the problem or determining a way to work around it. It provides a layer for easy integration into any process and has no impact on process performance. Based on the handling of system signals, the DAQ Debugger represents an alternative to the conventional debuggers provided by most integrated development environments. Whenever a problem occurs, it generates reports containing all the information necessary for a deeper investigation and analysis. The DAQ Debugger was fully incorporated into all processes in the iFDAQ during the 2016 run. It helped to reveal remaining software issues and significantly improved the stability of the system in comparison with the previous run. In the paper, we present the DAQ Debugger from several perspectives and discuss it in detail.
Keywords: DAQ Debugger, data acquisition system, FPGA, system signals, Qt framework
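The report-on-signal idea can be sketched in a few lines. The following is an illustrative Python sketch, not the actual C++/Qt implementation: a handler is registered for fault signals and writes a stack-trace report instead of letting the process die silently.

```python
import signal
import sys
import traceback

def write_crash_report(signum, frame):
    """Dump a stack-trace report when a fault signal arrives (sketch of the
    report-on-signal idea; the real DAQ Debugger reports far more state)."""
    lines = [f"DAQ Debugger report: caught signal {signum}"]
    lines.extend(traceback.format_stack(frame))
    sys.stderr.write("".join(l if l.endswith("\n") else l + "\n" for l in lines))

# Register the handler for signals that usually indicate a software fault.
for sig in (signal.SIGTERM, signal.SIGABRT):
    signal.signal(sig, write_crash_report)
```

Because the handler only runs when a signal is delivered, it adds no overhead on the hot path, matching the paper's claim of no impact on process performance.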
Procedia PDF Downloads 284
8139 Heuristic Algorithms for Time Based Weapon-Target Assignment Problem
Authors: Hyun Seop Uhm, Yong Ho Choi, Ji Eun Kim, Young Hoon Lee
Abstract:
Weapon-target assignment (WTA) is the problem of assigning available launchers to appropriate targets in order to defend assets. Various algorithms for WTA have been developed over the past years for both static and dynamic environments (denoted SWTA and DWTA, respectively). Because the problem must be solved within operationally relevant computation time, WTA has suffered from poor solution efficiency; as a result, SWTA and DWTA problems have been solved only for limited battlefield situations. In this paper, the general situation under continuous time is considered as the Time based Weapon-Target Assignment (TWTA) problem. TWTA is studied using a mixed integer programming model, and three heuristic algorithms are suggested: decomposed opt-opt, decomposed opt-greedy, and greedy. Although the TWTA optimization model works inefficiently on large instances, the decomposed opt-opt algorithm, based on linearization and decomposition, extracted efficient solutions in a reasonable computation time. Because the computation time of the scheduling part is too long for the optimization model, several greedy-based algorithms are proposed; these show lower performance values than the decomposed opt-opt algorithm but require very little computation time. Hence, this paper proposes an improved method by applying decomposition to TWTA, and more practical and effectual methods can be developed for using TWTA on the battlefield.
Keywords: air and missile defense, weapon target assignment, mixed integer programming, piecewise linearization, decomposition algorithm, military operations research
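As a minimal illustration of the greedy end of the algorithm spectrum (not the paper's decomposed opt-opt method), a static greedy assignment can send each weapon to the target with the largest marginal reduction in expected surviving target value; the kill probabilities `p` and target values `v` below are assumed inputs.

```python
def greedy_wta(p, v):
    """Greedy static WTA sketch: p[w][t] is the kill probability of weapon w
    against target t, v[t] the value of target t. Each weapon is assigned to
    the target with the largest marginal reduction in expected surviving
    target value."""
    survival = [1.0] * len(v)   # probability each target survives so far
    assignment = []
    for probs in p:
        # marginal gain of committing this weapon to target t
        best_t = max(range(len(v)),
                     key=lambda t: v[t] * survival[t] * probs[t])
        survival[best_t] *= (1.0 - probs[best_t])
        assignment.append(best_t)
    return assignment, sum(v[t] * survival[t] for t in range(len(v)))
```

For example, with two weapons that each strongly favour a different target, the heuristic splits them one per target rather than doubling up on either.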
Procedia PDF Downloads 336
8138 A Comparative Study of Multi-SOM Algorithms for Determining the Optimal Number of Clusters
Authors: Imèn Khanchouch, Malika Charrad, Mohamed Limam
Abstract:
The interpretation of cluster quality and the determination of the optimal number of clusters are still crucial problems in clustering. In this paper we focus on the multi-SOM clustering method, which overcomes the problem of extracting the number of clusters from the SOM map through the use of a clustering validity index. We then test multi-SOM on real and artificial data sets with evaluation criteria not used previously, such as the Davies-Bouldin index, the Dunn index and the silhouette index. The developed multi-SOM algorithm is compared to the k-means and BIRCH methods. Results show that it is more efficient than classical clustering methods.
Keywords: clustering, SOM, multi-SOM, DB index, Dunn index, silhouette index
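As an illustration of one of the validity indices used, the silhouette index can be computed directly from pairwise distances; the sketch below is a plain-Python version for small data sets, not the paper's multi-SOM implementation.

```python
def silhouette(points, labels):
    """Mean silhouette score: for each point, a = mean intra-cluster distance,
    b = smallest mean distance to another cluster, s = (b - a) / max(a, b)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    scores = []
    for i, (pt, lab) in enumerate(zip(points, labels)):
        same = [dist(pt, q) for j, (q, l) in enumerate(zip(points, labels))
                if l == lab and j != i]
        a = sum(same) / len(same) if same else 0.0
        b = min(
            sum(dist(pt, q) for q, l in zip(points, labels) if l == other)
            / sum(1 for l in labels if l == other)
            for other in set(labels) if other != lab
        )
        scores.append((b - a) / max(a, b))
    return sum(scores) / len(scores)
```

Scores near 1 indicate compact, well-separated clusters, which is how such an index can be used to pick the number of clusters from a SOM map.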
Procedia PDF Downloads 599
8137 School Partners in Initial Teacher Education: An Including or Excluding Approach When Engaging Schools
Authors: Laila Niklasson
Abstract:
The aim of the study is to critically discuss how partner schools are engaged during initial teacher education (ITE). The background is an experiment in Sweden where the practicum organization is being reorganized in response to a need to enhance quality during practicum. It is a national initiative from the government, supported by the National Agency of Education, running from 2014 to 2019. Its main features are the concentration of students at schools with a certain number of mentors, where the mentors have a mentor education, the teachers cover the relevant subject areas, and there may be a mentor team with a leader at the school. An expected outcome is, for example, that student teachers should engage in peer learning. The schools are to be supported by extra lectures from university teachers during practicum and also by extra research projects in which the schools should be engaged. A case study of one university-based ITE programme was carried out to explore the consequences for the schools not selected. The result showed that of x schools in the region, x were engaged. The schools are in both urban and rural areas, mainly the latter; there is also a tendency for private schools not to be engaged. At the unit level, recruitment is perceived as harder for schools not engaged. In addition, they cannot market themselves as a 'selected school', which can affect parents' choice of school for their children. Also at the unit level, but with consequences for professional development, they are not selected for research projects and are thereby not fully supported during school development. The conclusion is that an earlier inclusive approach to the professions, where all teachers were perceived as possible mentors, has changed to an exclusive approach where selected schools and selected teachers are engaged.
The change could be perceived as a change in governance mentality, but also in how the professions are perceived and how development work is pursued.
Keywords: initial teacher education, practicum schools, profession, quality development
Procedia PDF Downloads 142
8136 Approximation of Geodesics on Meshes with Implementation in Rhinoceros Software
Authors: Marian Sagat, Mariana Remesikova
Abstract:
In civil engineering, there is the problem of how to industrially produce tensile membrane structures, which are non-developable surfaces. Non-developable surfaces can only be developed with a certain error, and we want to minimize this error. To that end, the non-developable surfaces are cut into plates along geodesic curves. We propose a numerical algorithm for finding approximations of open geodesics on meshes and surfaces based on geodesic curvature flow. For practical reasons, it is important to automate the choice of the time step. We propose a method for automatically setting the time step based on the diagonal dominance criterion for the matrix of the linear system obtained by discretization of our partial differential equation model. Practical experiments show the reliability of this method. Because the model is approximated by a numerical method based on classical derivatives, obstacles that occur for meshes with sharp corners must be overcome. We solve this problem for a big family of meshes with sharp corners via special rotations, which can be seen as a partial unfolding of the mesh. In practical applications, it is required that the approximation of the geodesic has its vertices only on the edges of the mesh. This problem is solved by a specially designed point-tracking algorithm. We also partially solve the problem of finding geodesics on meshes with holes. We implemented the whole algorithm in Rhinoceros (commercial 3D computer graphics and computer-aided design software). This was done using the C# language, as a C# assembly library for Grasshopper, which is a plugin for Rhinoceros.
Keywords: geodesic, geodesic curvature flow, mesh, Rhinoceros software
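The automatic time-step selection can be sketched as follows: shrink the step until the matrix of the discretized system passes the diagonal dominance check. This is a simplified Python sketch (the actual implementation is a C# library for Grasshopper), and `assemble` is a hypothetical function returning the system matrix for a given step.

```python
def is_diagonally_dominant(A):
    """Row diagonal dominance: |a_ii| >= sum of |a_ij| over j != i, every row."""
    return all(
        abs(A[i][i]) >= sum(abs(A[i][j]) for j in range(len(A)) if j != i)
        for i in range(len(A))
    )

def choose_time_step(assemble, dt, shrink=0.5, max_iter=50):
    """Shrink dt until assemble(dt) is diagonally dominant (assumed monotone)."""
    for _ in range(max_iter):
        if is_diagonally_dominant(assemble(dt)):
            return dt
        dt *= shrink
    raise ValueError("no diagonally dominant time step found")
```

Diagonal dominance is a convenient criterion here because it guarantees convergence of standard iterative solvers for the resulting linear system.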
Procedia PDF Downloads 149
8135 Effect of Open-Ended Laboratory toward Learners Performance in Environmental Engineering Course: Case Study of Civil Engineering at Universiti Malaysia Sabah
Authors: N. Bolong, J. Makinda, I. Saad
Abstract:
Laboratory activities have produced benefits in student learning. With the current drive toward new technology resources and an evolving era of education methods, the renewal of learning and teaching in laboratory methods is in progress, for both learners and educators. To enhance learning outcomes in laboratory work, particularly in engineering practices and testing, learning via hands-on instruction alone may not be sufficient. This paper describes and compares the techniques and implementation of a traditional (expository) and an open-ended (problem-based) laboratory for two consecutive cohorts studying the environmental laboratory course in the civil engineering program. The findings and effects of the transition from traditional to problem-based were investigated in terms of course assessment, student feedback surveys, course outcome measurement and student performance grades. Students demonstrated better performance in their grades and a 12% increase in the course outcome (CO) with the problem-based open-ended laboratory style than with the traditional method, although students responded less favourably in their feedback.
Keywords: engineering education, open-ended laboratory, environmental engineering lab
Procedia PDF Downloads 316
8134 Systematic Identification of Noncoding Cancer Driver Somatic Mutations
Authors: Zohar Manber, Ran Elkon
Abstract:
Accumulation of somatic mutations (SMs) in the genome is a major driving force of cancer development. Most SMs in the tumor's genome are functionally neutral; however, some cause damage to critical processes and provide the tumor with a selective growth advantage (termed cancer driver mutations). Current research on the functional significance of SMs is mainly focused on finding alterations in protein coding sequences. However, the exome comprises only 3% of the human genome, and thus, SMs in the noncoding genome significantly outnumber those that map to protein-coding regions. Although our understanding of noncoding driver SMs is very rudimentary, it is likely that disruption of regulatory elements in the genome is an important, yet largely underexplored mechanism by which somatic mutations contribute to cancer development. The expression of most human genes is controlled by multiple enhancers, and therefore, it is conceivable that regulatory SMs are distributed across different enhancers of the same target gene. Yet, to date, most statistical searches for regulatory SMs have considered each regulatory element individually, which may reduce statistical power. The first challenge in considering the cumulative activity of all the enhancers of a gene as a single unit is to map enhancers to their target promoters. Such mapping defines for each gene its set of regulating enhancers, termed its "set of regulatory elements" (SRE). Considering multiple enhancers of each gene as one unit holds great promise for enhancing the identification of driver regulatory SMs. However, the success of this approach is greatly dependent on the availability of comprehensive and accurate enhancer-promoter (E-P) maps. To date, the discovery of driver regulatory SMs has been hindered by insufficient sample sizes and statistical analyses that often considered each regulatory element separately.
In this study, we analyzed more than 2,500 whole-genome sequence (WGS) samples provided by The Cancer Genome Atlas (TCGA) and The International Cancer Genome Consortium (ICGC) in order to identify such driver regulatory SMs. Our analyses took into account the combinatorial aspect of gene regulation by considering all the enhancers that control the same target gene as one unit, based on E-P maps from three genomics resources. The identification of candidate driver noncoding SMs is based on their recurrence. We searched for SREs of genes that are "hotspots" for SMs (that is, they accumulate SMs at a significantly elevated rate). To test the statistical significance of recurrence of SMs within a gene's SRE, we used both global and local background mutation rates. Using this approach, we detected - in seven different cancer types - numerous "hotspots" for SMs. To support the functional significance of these recurrent noncoding SMs, we further examined their association with the expression level of their target gene (using gene expression data provided by the ICGC and TCGA for samples that were also analyzed by WGS).
Keywords: cancer genomics, enhancers, noncoding genome, regulatory elements
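The hotspot test can be illustrated with a simple Poisson recurrence model: given the SM count observed in a gene's SRE, the SRE's total length, and a background per-base mutation rate, the upper-tail probability flags a significantly elevated rate. This is an illustrative sketch, not the study's exact statistic (which also uses local background rates).

```python
import math

def poisson_sf(k, lam):
    """P(X >= k) for X ~ Poisson(lam), via the complement of the lower tail."""
    return 1.0 - sum(math.exp(-lam) * lam ** i / math.factorial(i)
                     for i in range(k))

def sre_hotspot_pvalue(n_mutations, sre_length_bp, background_rate_per_bp):
    """p-value that an SRE of the given length accumulates n_mutations or
    more under a uniform background mutation rate."""
    expected = sre_length_bp * background_rate_per_bp
    return poisson_sf(n_mutations, expected)
```

Pooling all enhancers of a gene into one SRE enlarges the tested region, which is exactly where the combined-unit approach gains statistical power over testing each enhancer separately.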
Procedia PDF Downloads 104
8133 A Study of Non Linear Partial Differential Equation with Random Initial Condition
Authors: Ayaz Ahmad
Abstract:
In this work, we present the effect of noise on the solution of a partial differential equation (PDE) in three different settings. We first consider random initial conditions for two nonlinear dispersive PDEs, the nonlinear Schrödinger equation and the Korteweg-de Vries equation, and analyse their effect on some special solutions, the soliton solutions. The second case considers a linear PDE, the wave equation, where random initial conditions allow us to substantially decrease the computational and data storage costs of an algorithm to solve the inverse problem based on boundary measurements of the solution of this equation. The third example considered is the linear transport equation with a singular drift term, where we show that the addition of a multiplicative noise term forbids the blow-up of solutions under a very weak hypothesis for which we have finite-time blow-up of a solution in the deterministic case. Here we consider the problem of wave propagation, which is modelled by a nonlinear dispersive equation with a noisy initial condition. As observed, noise can also be introduced directly in the equations.
Keywords: drift term, finite time blow up, inverse problem, soliton solution
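For the first setting, the effect of a noisy initial condition can be illustrated with the exact one-soliton solution of the KdV equation u_t + 6uu_x + u_xxx = 0, perturbed by small Gaussian noise (the amplitude convention and noise model below are assumptions for illustration, not the paper's setup):

```python
import math
import random

def kdv_soliton(x, t, c):
    """Exact one-soliton of KdV: u(x, t) = (c/2) sech^2( sqrt(c)/2 (x - c t) ),
    a wave of speed c and amplitude c/2."""
    arg = math.sqrt(c) / 2.0 * (x - c * t)
    return (c / 2.0) / math.cosh(arg) ** 2

def noisy_initial_condition(xs, c, eps, seed=0):
    """Soliton profile at t = 0 with additive Gaussian noise of amplitude eps."""
    rng = random.Random(seed)
    return [kdv_soliton(x, 0.0, c) + eps * rng.gauss(0.0, 1.0) for x in xs]
```

Evolving such a perturbed profile numerically and comparing it with the unperturbed soliton is the kind of experiment the first setting describes.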
Procedia PDF Downloads 215
8132 Cauda Equina Syndrome: An Audit on Referral Adequacy and its Impact on Delay to Surgery
Authors: David Mafullul, Jiang Lei, Edward Goacher, Jibin Francis
Abstract:
PURPOSE: Timely decompressive surgery for cauda equina syndrome (CES) is dependent on efficient referral pathways for patients presenting at local primary or secondary centres to tertiary spinal centres in the United Kingdom (UK). Identifying modifiable points of delay within this process is important, as minimising time between presentation and surgery may improve patient outcomes. This study aims to analyse whether adequacy of referral impacts on time to surgery in CES. MATERIALS AND METHODS: Data from all cases of confirmed CES referred to a single tertiary UK hospital between August 2017 and December 2019, via a suspected CES e-referral pathway, were obtained retrospectively. Referral adequacy was defined by the inclusion of sufficient information to determine the presence or absence of several NICE ‘red flags’. Correlation between referral adequacy and delay from referral-to-surgery was then analysed. RESULTS: In total, 118 confirmed CES cases were included. Adequate documentation for saddle anaesthesia was associated with reduced delays of more than 48 hours from referral-to-surgery [χ²(1, N=116)=7.12, p=.024], an effect partly attributable to these referrals being accepted sooner [U=16.5; n1=27, n2=4, p=.029, r=.39]. Other red flags had poor association with delay. Referral adequacy was better for somatic red flags [bilateral sciatica (97.5%); severe or progressive bilateral neurological deficit of the legs (95.8%); saddle anaesthesia (91.5%)] compared to autonomic red flags [loss of anal tone (80.5%); urinary retention (79.7%); faecal incontinence or lost sensation of rectal fullness (57.6%)]. Although referral adequacy for urinary retention was 79.7%, only 47.5% of referrals documented a post-void residual numerical value. CONCLUSIONS: Adequate documentation of saddle anaesthesia in e-referrals is associated with reduced delay-to-surgery for confirmed CES, partly attributable to these referrals being accepted sooner.
Other red flags had poor association with delay to surgery. Referral adequacy for autonomic red flags, including documentation of post-void residuals, has significant room for improvement.
Keywords: cauda equina, cauda equina syndrome, neurosurgery, spinal surgery, decompression, delay, referral, referral adequacy
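The association test reported above (e.g. χ²(1, N=116)=7.12) is a chi-square test on a 2×2 contingency table of documentation adequacy versus delay; a hand-rolled sketch is below, with made-up counts for illustration rather than the study's data.

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table
    [[a, b], [c, d]] (no continuity correction)."""
    (a, b), (c, d) = table
    n = a + b + c + d
    row1, row2, col1, col2 = a + b, c + d, a + c, b + d
    expected = [[row1 * col1 / n, row1 * col2 / n],
                [row2 * col1 / n, row2 * col2 / n]]
    observed = [[a, b], [c, d]]
    return sum((observed[i][j] - expected[i][j]) ** 2 / expected[i][j]
               for i in range(2) for j in range(2))
```

The statistic is then compared against the chi-square distribution with one degree of freedom to obtain the reported p-value.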
Procedia PDF Downloads 38
8131 Transition Metal Bis(Dicarbollide) Complexes in Design of Molecular Switches
Authors: Igor B. Sivaev
Abstract:
Design of molecular machines is an extraordinarily fast-growing and very important area of research, as recognized by the award of the 2016 Nobel Prize in Chemistry to Sauvage, Stoddart and Feringa 'for the design and synthesis of molecular machines'. Based on the type of motion performed, molecular machines can be divided into two main types: molecular motors and molecular switches. Molecular switches are molecules or supramolecular complexes having bistability, i.e., the ability to exist in two or more stable forms, between which reversible transitions can occur under external influence (heating, light, changes in medium acidity, the action of chemicals, or exposure to a magnetic or electric field). Molecular switches are the main structural element of any molecular electronics device. Therefore, the design and study of molecules and supramolecular systems capable of performing mechanical movement are an important and urgent problem of modern chemistry. There is growing interest in molecular switches and other molecular electronics devices based on transition metal complexes; therefore the choice of a suitable stable organometallic unit is of great importance. An example of such a unit is the bis(dicarbollide) complexes of transition metals [3,3’-M(1,2-C₂B₉H₁₁)₂]ⁿ⁻. Control of ligand rotation in such complexes can be achieved by introducing substituents which, on the one hand, stabilize certain rotamers through specific interactions between the ligands and, on the other hand, can participate as Lewis bases in complex formation with external metals, resulting in a change in the rotation angle of the ligands.
A series of isomeric methyl sulfide derivatives of cobalt bis(dicarbollide) containing methyl sulfide substituents at boron atoms in different positions of the pentagonal face of the dicarbollide ligands, [8,8’-(MeS)₂-3,3’-Co(1,2-C₂B₉H₁₀)₂]⁻, rac-[4,4’-(MeS)₂-3,3’-Co(1,2-C₂B₉H₁₀)₂]⁻ and meso-[4,7’-(MeS)₂-3,3’-Co(1,2-C₂B₉H₁₀)₂]⁻, was synthesized by the reaction of CoCl₂ with the corresponding methyl sulfide carborane derivatives [10-MeS-7,8-C₂B₉H₁₁]⁻ and [10-MeS-7,8-C₂B₉H₁₁]⁻. In the case of the asymmetrically substituted cobalt bis(dicarbollide) complexes, the corresponding rac- and meso-isomers were successfully separated by column chromatography as the tetrabutylammonium salts. The compounds obtained were studied by ¹H, ¹³C, and ¹¹B NMR spectroscopy, single crystal X-ray diffraction, cyclic voltammetry, controlled potential coulometry and quantum chemical calculations. It was found that in the solid state, the transoid- and gauche-conformations of the 8,8’- and 4,4’-isomers are each stabilized by four intramolecular CH···S(Me)B hydrogen bonds (2.683-2.712 Å and 2.709-2.752 Å, respectively), whereas the gauche-conformation of the 4,7’-isomer is stabilized by two intramolecular CH···S hydrogen bonds (2.699-2.711 Å). The existence of the intramolecular CH···S(Me)B hydrogen bonding in solution was supported by ¹H NMR spectroscopy. These data are in good agreement with the results of the quantum chemical calculations. The corresponding iron and nickel complexes were synthesized as well. The reaction of the methyl sulfide derivatives of cobalt bis(dicarbollide) with various labile transition metal complexes results in rupture of the intramolecular hydrogen bonds and complexation of the methyl sulfide groups with the external metal. This results in stabilization of other rotational conformations of cobalt bis(dicarbollide) and can be used in the design of molecular switches.
This work was supported by the Russian Science Foundation (16-13-10331).
Keywords: molecular switches, NMR spectroscopy, single crystal X-ray diffraction, transition metal bis(dicarbollide) complexes, quantum chemical calculations
Procedia PDF Downloads 172
8130 COVID_ICU_BERT: A Fine-Tuned Language Model for COVID-19 Intensive Care Unit Clinical Notes
Authors: Shahad Nagoor, Lucy Hederman, Kevin Koidl, Annalina Caputo
Abstract:
Doctors’ notes reflect their impressions, attitudes, clinical sense, and opinions about patients’ conditions and progress, as well as other information essential for doctors’ daily clinical decisions. Despite their value, clinical notes are insufficiently researched within the language processing community. Automatically extracting information from unstructured text data is known to be a difficult task, as opposed to dealing with structured information such as vital physiological signs, images, and laboratory results. The aim of this research is to investigate how Natural Language Processing (NLP) and machine learning techniques applied to clinician notes can assist doctors’ decision-making in the Intensive Care Unit (ICU) for coronavirus disease 2019 (COVID-19) patients. The hypothesis is that clinical outcomes like survival or mortality can be useful in influencing the judgement of clinical sentiment in ICU clinical notes. This paper introduces two contributions. First, we introduce COVID_ICU_BERT, a fine-tuned version of clinical transformer models that can reliably predict clinical sentiment for notes of COVID patients in the ICU. We train the model on clinical notes for COVID-19 patients, a type of note not previously seen by clinicalBERT or Bio_Discharge_Summary_BERT. The model, which is based on clinicalBERT, achieves higher predictive accuracy (Acc 93.33%, AUC 0.98, and precision 0.96). Second, we perform data augmentation using clinical contextual word embeddings, based on a pre-trained clinical model, to balance the samples in each class of the data (survived vs. deceased patients). Data augmentation improves the accuracy of prediction slightly (Acc 96.67%, AUC 0.98, and precision 0.92).
Keywords: BERT fine-tuning, clinical sentiment, COVID-19, data augmentation
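The class-balancing step can be sketched as follows. Here, random token dropout stands in for the paper's contextual-word-embedding substitution, which requires a pre-trained clinical model; this is an illustrative simplification only.

```python
import random

def augment_note(note, drop_prob=0.1, rng=None):
    """Perturb a note by randomly dropping tokens (a crude stand-in for
    contextual-embedding word substitution)."""
    rng = rng or random.Random(0)
    tokens = note.split()
    kept = [t for t in tokens if rng.random() > drop_prob]
    return " ".join(kept) if kept else note

def balance_classes(notes_by_label):
    """Oversample every class with augmented copies until all classes match
    the size of the largest one (e.g. survived vs. deceased)."""
    target = max(len(v) for v in notes_by_label.values())
    rng = random.Random(0)
    balanced = {}
    for label, notes in notes_by_label.items():
        extra = [augment_note(rng.choice(notes), rng=rng)
                 for _ in range(target - len(notes))]
        balanced[label] = notes + extra
    return balanced
```

Balancing before fine-tuning prevents the classifier from defaulting to the majority outcome, which matters here because deceased-patient notes are the minority class.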
Procedia PDF Downloads 206
8129 The Interdisciplinary Synergy Between Computer Engineering and Mathematics
Authors: Mitat Uysal, Aynur Uysal
Abstract:
Computer engineering and mathematics share a deep and symbiotic relationship, with mathematics providing the foundational theories and models for computer engineering advancements. From algorithm development to optimization techniques, mathematics plays a pivotal role in solving complex computational problems. This paper explores key mathematical principles that underpin computer engineering, illustrating their significance through a case study that demonstrates the application of optimization techniques using Python code. The case study addresses the well-known vehicle routing problem (VRP), an extension of the traveling salesman problem (TSP), and solves it using a genetic algorithm.
Keywords: VRP, TSP, genetic algorithm, computer engineering, optimization
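In the spirit of the case study, a compact genetic algorithm for the TSP special case can be written in a few dozen lines of Python. This is a simplified sketch with order crossover, swap mutation, and truncation selection; the distance matrix and parameters are assumptions, not the paper's implementation.

```python
import random

def tour_length(tour, dist):
    """Total length of a closed tour under distance matrix dist."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]]
               for i in range(len(tour)))

def order_crossover(p1, p2, rng):
    """OX crossover: copy a slice of p1, fill the rest in p2's order."""
    a, b = sorted(rng.sample(range(len(p1)), 2))
    child = p1[a:b]
    child += [c for c in p2 if c not in child]
    return child

def genetic_tsp(dist, pop_size=60, generations=200, mutation_rate=0.2, seed=0):
    rng = random.Random(seed)
    n = len(dist)
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda t: tour_length(t, dist))
        survivors = pop[: pop_size // 2]          # truncation selection
        children = []
        while len(children) < pop_size - len(survivors):
            p1, p2 = rng.sample(survivors, 2)
            child = order_crossover(p1, p2, rng)
            if rng.random() < mutation_rate:      # swap mutation
                i, j = rng.sample(range(n), 2)
                child[i], child[j] = child[j], child[i]
            children.append(child)
        pop = survivors + children
    return min(pop, key=lambda t: tour_length(t, dist))
```

Extending this to the VRP adds capacity constraints and multiple routes per chromosome, but the evolutionary loop is the same.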
Procedia PDF Downloads 13