Search results for: Desirability Function Approach
2038 Modelling of Energy Consumption in Wheat Production Using Neural Networks “Case Study in Canterbury Province, New Zealand”
Authors: M. Safa, S. Samarasinghe
Abstract:
An artificial neural network (ANN) approach was used to model the energy consumption of wheat production. This study was conducted over 35,300 hectares of irrigated and dry land wheat fields in Canterbury in the 2007-2008 harvest year. Several direct and indirect factors were used to create an artificial neural network model to predict energy use in wheat production. The final model predicts energy consumption from farm conditions (size of wheat area and number of paddocks), farmers' social properties (education), and energy inputs (N and P use, fungicide consumption, seed consumption, and irrigation frequency), and it can predict energy use on Canterbury wheat farms with an error margin of ±7% (±1600 MJ/ha).
Keywords: Artificial neural network, Canterbury, energy consumption, modelling, New Zealand, wheat.
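The abstract does not specify the network architecture, so the following is only a minimal sketch of how such an energy-use model could be set up with a small feed-forward network; the feature list, the synthetic data, and the use of scikit-learn's MLPRegressor are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch: a small feed-forward network relating farm inputs to
# energy use (MJ/ha).  The data below are synthetic placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
n = 200
# Assumed predictors: wheat area (ha), number of paddocks, education (years),
# N (kg/ha), P (kg/ha), fungicide (kg/ha), seed (kg/ha), irrigation count.
lows = [5, 1, 8, 50, 5, 0.5, 80, 0]
highs = [400, 12, 18, 250, 40, 5, 220, 8]
X = rng.uniform(lows, highs, size=(n, 8))
# Synthetic "true" energy use: roughly linear in the energy inputs plus noise.
y = 8000 + 30 * X[:, 3] + 90 * X[:, 4] + 200 * X[:, 5] + 12 * X[:, 6] + 600 * X[:, 7] \
    + rng.normal(0, 800, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(8,), solver="lbfgs",
                                   max_iter=5000, random_state=0))
model.fit(X_tr, y_tr)
print("MAE (MJ/ha):", round(mean_absolute_error(y_te, model.predict(X_te)), 1))
```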
2037 Supremacy of Differential Evolution Algorithm in Designing Multiplier-Less Low-Pass FIR Filter
Authors: Abhijit Chandra, Sudipta Chattopadhyay
Abstract:
In this communication, we have made an attempt to design a multiplier-less low-pass finite impulse response (FIR) filter with the aid of various mutation strategies of the Differential Evolution (DE) algorithm. The impulse response coefficients of the designed FIR filter have been represented as sums or differences of powers of two. Performance of the proposed filter has been evaluated in terms of its frequency response and associated hardware cost. The supremacy of our approach has been substantiated by comparing our result with many of the existing multiplier-less filter design algorithms of recent interest. It has also been demonstrated that the DE-optimized filter outperforms the Genetic Algorithm (GA) based design by a large margin. The hardware efficiency of our algorithm has further been validated by implementing these filters on a Field Programmable Gate Array (FPGA) chip.
Keywords: Convergence speed, Differential Evolution (DE), error histogram, finite impulse response (FIR) filter, total power of two (TPT), zero-valued filter coefficient (ZFC).
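A minimal sketch of the general idea, assuming a DE/rand/1/bin strategy, a symmetric 21-tap filter, and a simple brick-wall target response; the filter length, cutoff, DE settings, and the power-of-two quantizer are illustrative choices, not the configurations studied in the paper.

```python
# Hypothetical sketch: DE/rand/1/bin searching the independent half of a symmetric
# low-pass FIR filter; coefficients are then quantized to sums of signed powers of two.
import numpy as np

N, CUTOFF = 21, 0.25                      # filter length (odd), normalized cutoff (x pi rad/sample)
w = np.linspace(0, np.pi, 256)
desired = (w <= CUTOFF * np.pi).astype(float)

def full_taps(half):                      # mirror the half vector into a symmetric impulse response
    return np.concatenate([half, half[-2::-1]])

def response(taps):
    n = np.arange(len(taps))
    return np.abs(np.exp(-1j * np.outer(w, n)) @ taps)

def cost(half):                           # mean squared deviation from the ideal response
    return np.mean((response(full_taps(half)) - desired) ** 2)

def quantize_spt(x, bits=6):              # nearest sum of at most two signed powers of two
    grid = np.array(sorted({s1 * 2.0**-a + s2 * 2.0**-b
                            for a in range(bits) for b in range(bits)
                            for s1 in (-1, 1) for s2 in (-1, 0, 1)} | {0.0}))
    return grid[np.abs(grid[None, :] - x[:, None]).argmin(axis=1)]

rng = np.random.default_rng(1)
dim, pop_size, F, CR = (N + 1) // 2, 40, 0.7, 0.9
pop = rng.uniform(-0.5, 0.5, (pop_size, dim))
fitness = np.array([cost(p) for p in pop])
for _ in range(300):
    for i in range(pop_size):
        a, b, c = pop[rng.choice(pop_size, 3, replace=False)]
        mutant = a + F * (b - c)                          # DE/rand/1 mutation
        trial = np.where(rng.random(dim) < CR, mutant, pop[i])
        f_trial = cost(trial)
        if f_trial < fitness[i]:
            pop[i], fitness[i] = trial, f_trial

best = pop[fitness.argmin()]
print("float-coefficient cost :", cost(best))
print("power-of-two cost      :", cost(quantize_spt(best)))
```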
2036 A Method for Modeling Flexible Manipulators: Transfer Matrix Method with Finite Segments
Authors: Haijie Li, Xuping Zhang
Abstract:
This paper presents a computationally efficient method for the modeling of robot manipulators with flexible links and joints. The approach combines the Discrete Time Transfer Matrix Method with the Finite Segment Method, in which the flexible links are discretized into a number of rigid segments connected by torsion springs, and the flexibility of the joints is modeled by torsion springs. The proposed method avoids assembling the global dynamic equations and has the advantage of modeling non-uniform manipulators. Experiments and simulations of a single-link flexible manipulator are conducted to verify the proposed methodologies. Simulations of a three-link robot arm with link and joint flexibility are also performed.
Keywords: Flexible manipulator, transfer matrix method, linearization, finite segment method.
2035 The Fundamental Reliance of Iterative Learning Control on Stability Robustness
Authors: Richard W. Longman
Abstract:
Iterative learning control (ILC) aims to achieve zero tracking error of a specific command. This is accomplished by iteratively adjusting the command given to a feedback control system, based on the tracking error observed in the previous iteration. One would like the iterations to converge to zero tracking error in spite of any error present in the model used to design the learning law. First, this need for stability robustness is discussed, followed by the need for robustness of the property that the transients are well behaved. Methods of producing the needed robustness to parameter variations and to singular perturbations are presented. Then a method involving reverse-time runs is given that lets the real-world behavior produce the ILC gains in such a way as to eliminate the need for a mathematical model. Since the real world is producing the gains, there is no issue of model error. Provided the world behaves linearly, the approach gives an ILC law with both stability robustness and good transient robustness, without the need to generate a model.
Keywords: Iterative learning control, stability robustness, monotonic convergence.
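The iterative adjustment described above can be illustrated with the most basic ILC update law, u_{k+1}(t) = u_k(t) + φ·e_k(t+1); the first-order plant, learning gain, and command below are illustrative assumptions, not the paper's setup or its robust learning laws.

```python
# Hypothetical sketch of a basic ILC update law u_{k+1}(t) = u_k(t) + phi * e_k(t+1)
# applied to a simple first-order discrete-time plant.
import numpy as np

a, b, phi, T = 0.9, 0.5, 0.8, 50           # plant y(t+1) = a*y(t) + b*u(t), learning gain, horizon
t = np.arange(T)
y_des = np.sin(2 * np.pi * t / T)           # desired trajectory (the command to be tracked)

def run_plant(u):
    y = np.zeros(T)
    for k in range(T - 1):
        y[k + 1] = a * y[k] + b * u[k]
    return y

u = np.zeros(T)
for iteration in range(30):
    y = run_plant(u)
    e = y_des - y                           # tracking error observed in this iteration
    u[:-1] += phi * e[1:]                   # adjust the command using the one-step-ahead error
    if iteration % 10 == 0 or iteration == 29:
        print(f"iter {iteration:2d}  RMS error = {np.sqrt(np.mean(e**2)):.4f}")
```

For this toy plant the iteration converges because |1 - φ·b| < 1; the paper's concern is precisely what happens to such convergence when the model used to pick the gain is wrong.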
2034 Program of Health/Safety Integration and the Total Worker Health Concept in the Improvement of Absenteeism of the Work Accommodation Management
Authors: L. R. Ferreira, R. Biscaro, C. C. Danziger, C. M. Galhardi, L. C. Biscaro, R. C. Biscaro, I. S. Vasconcelos, L. C. R. Ferreira, R. Reis, L. H. Oliveira
Abstract:
Introduction: There is a worldwide trend for employers to invest in health promotion that goes beyond occupational hygiene approaches, with the implementation of a comprehensive program integrating occupational health and safety with social/psychosocial responsibility in the workplace. Work accommodation is a necessity in most companies, as it allows workers to return to their function while respecting their physical limitations. This study aimed to verify whether the integration of health and safety in companies, with the inclusion of the Total Worker Health (TWH) concept promoted by an occupational health service, had an impact on the management of absenteeism of workers in work accommodation. Method: A retrospective, paired cohort study was used, in which the impact of the implementation of the Program for Health/Safety Integration and the Total Worker Health Concept (PHSITWHC) was evaluated using the indices of absenteeism, health attestations, and days and hours of sick leave of workers who underwent job accommodation/rehabilitation. Data were collected from January to September 2017, prior to the initiation of the integration program, and compared with data obtained from January to September 2018, after the implementation of the program. For the statistical analysis, Student's t-test was used, with differences considered statistically significant at p < 0.05. Results: The results showed a 35% reduction in the absenteeism rate in 2018 compared to the same period in 2017. There was also a significant reduction in the total number of days of attestations/absences (mean of 2.8) as well as days of attestations, absences and sick leave (mean of 5.2) in the 2018 data after the implementation of the PHSITWHC, compared to means of 4.3 and 25.1, respectively, in the 2017 data prior to the program. Conclusion: It can be concluded that the inclusion of the PHSITWHC was associated with a reduction in the rate of absenteeism of workers who underwent job accommodation. It was observed that, once health and safety were approached and integrated with the inclusion of the TWH concept, it was possible to reduce absenteeism and improve workers' quality of life, wellness, and work accommodation management.
Keywords: Absenteeism, health/safety integration, work accommodation management, total worker health.
2032 Improving the Effectiveness of Software Testing through Test Case Reduction
Authors: R. P. Mahapatra, Jitendra Singh
Abstract:
This paper proposes a new technique for improving the efficiency of software testing, based on a conventional attempt to reduce the test cases that have to be tested for any given software. The approach utilizes the advantage of regression testing, where fewer test cases lessen the time consumption of the testing as a whole. The technique also offers a means to perform test case generation automatically. Compared to one of the techniques in the literature, where the tester has no option but to perform the test case generation manually, the proposed technique provides a better option. As for the test case reduction, the technique uses simple algebraic conditions to assign fixed values to variables (maximum, minimum and constant variables). By doing this, the variables' values are limited within a definite range, resulting in a smaller number of possible test cases to process. The technique can also be used in program loops and arrays.
Keywords: Software testing, test case generation, test case reduction.
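A minimal sketch of the underlying reduction idea: restricting each input variable to a few representative values (minimum, maximum, one constant) shrinks the Cartesian product of test cases. The variable ranges and the function under test are invented placeholders, not the paper's algebraic conditions.

```python
# Hypothetical sketch: restricting each input variable to representative values
# (minimum, maximum and one constant) shrinks the Cartesian product of test cases.
from itertools import product

def function_under_test(x, y, z):
    return (x + y) * z            # stand-in for the real unit under test

ranges = {"x": (0, 100), "y": (-50, 50), "z": (1, 10)}

# Exhaustive integer enumeration would give 101 * 101 * 10 = 102010 cases.
full_count = 101 * 101 * 10

# Reduced suite: min, max and a mid-range constant per variable -> 3**3 = 27 cases.
reduced_values = {name: (lo, hi, (lo + hi) // 2) for name, (lo, hi) in ranges.items()}
reduced_suite = list(product(*reduced_values.values()))

for case in reduced_suite:
    function_under_test(*case)    # execute each reduced test case
print(f"{full_count} exhaustive cases reduced to {len(reduced_suite)}")
```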
2031 The Impact of 21st Century Technology in Higher Education: The Role of Artificial Intelligence
Authors: Josefina Bengoechea, Alex Bell
Abstract:
Higher education, with its brick-and-mortar facilities and credits based on hours of study, was developed to serve the needs of a national, industrial, analogue economy. However, the ongoing process of globalization on the one hand, and the emergence of ever-changing needs of employers on the other, make this type of process-based education obsolete and exclusive to students who can afford to pay full-time tuition and dedicate four years of their lives exclusively to study. The creative destruction brought about by new technologies in the 21st century will reconfigure the labour market, as millions of jobs will be lost to Artificial Intelligence. The purpose of this paper is to consider whether the implementation of technology is the solution to the problems faced in higher education. The paper builds upon a constructivist approach, combining a literature review and research on key publications.
Keywords: Artificial intelligence, employability, labour market, new technology in higher education.
2030 Network Intrusion Detection Design Using Feature Selection of Soft Computing Paradigms
Authors: T. S. Chou, K. K. Yen, J. Luo
Abstract:
The network traffic data provided for the design of intrusion detection are typically large, contain ineffective information, and enclose only limited and ambiguous information about users' activities. We study these problems and propose a two-phase approach in our intrusion detection design. In the first phase, we develop a correlation-based feature selection algorithm to remove the worthless information from the original high-dimensional database. Next, we design an intrusion detection method to solve the problems of uncertainty caused by limited and ambiguous information. In the experiments, we choose six UCI databases and the DARPA KDD99 intrusion detection data set as our evaluation tools. Empirical studies indicate that our feature selection algorithm is capable of reducing the size of the data set. Our intrusion detection method achieves better performance than that of the participating intrusion detectors.
Keywords: Intrusion detection, feature selection, k-nearest neighbors, fuzzy clustering, Dempster-Shafer theory.
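The first phase can be illustrated with a generic correlation-based filter: keep features that correlate with the class label and drop features that are highly correlated with an already selected one. The thresholds and the synthetic data below are assumptions, not the authors' algorithm or the KDD99 features.

```python
# Hypothetical sketch of a correlation-based feature selection filter: keep features
# strongly correlated with the class label and drop features strongly correlated with
# an already selected feature.
import numpy as np

rng = np.random.default_rng(0)
n, d = 500, 10
X = rng.normal(size=(n, d))
X[:, 3] = X[:, 0] + 0.05 * rng.normal(size=n)           # feature 3 is nearly a copy of feature 0
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n) > 0).astype(float)

def select_features(X, y, relevance_min=0.1, redundancy_max=0.8):
    relevance = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
    selected = []
    for j in np.argsort(-relevance):                     # most relevant features first
        if relevance[j] < relevance_min:
            break
        if all(abs(np.corrcoef(X[:, j], X[:, k])[0, 1]) < redundancy_max for k in selected):
            selected.append(j)
    return selected

print("selected feature indices:", select_features(X, y))
```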
2029 MarginDistillation: Distillation for Face Recognition Neural Networks with Margin-Based Softmax
Authors: Svitov David, Alyamkin Sergey
Abstract:
The usage of convolutional neural networks (CNNs) in conjunction with the margin-based softmax approach demonstrates state-of-the-art performance for the face recognition problem. Recently, lightweight neural network models trained with the margin-based softmax have been introduced for the face identification task on edge devices. In this paper, we propose a distillation method for lightweight neural network architectures that outperforms other known methods for the face recognition task on the LFW, AgeDB-30 and MegaFace datasets. The idea of the proposed method is to use the class centers from the teacher network for the student network. The student network is then trained to reproduce the angles between the class centers and the face embeddings predicted by the teacher network.
Keywords: ArcFace, distillation, face recognition, margin-based softmax.
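A PyTorch sketch of the angle-matching idea described above: compute the angles between the teacher's class centers and the embeddings, and train the student to reproduce the teacher's angles. The embedding size, the toy linear backbones, and the MSE-on-angles loss are assumptions for illustration, not the paper's exact training recipe.

```python
# Hypothetical sketch of angle matching against teacher class centers.
import torch
import torch.nn as nn
import torch.nn.functional as F

emb_dim, n_classes, batch = 128, 10, 32
teacher = nn.Sequential(nn.Linear(512, emb_dim))          # stand-ins for the real CNN backbones
student = nn.Sequential(nn.Linear(512, emb_dim))
centers = F.normalize(torch.randn(n_classes, emb_dim), dim=1)   # class centers taken from the teacher head

def angles(embeddings, centers):
    cos = F.normalize(embeddings, dim=1) @ centers.t()           # cosines between embeddings and centers
    return torch.acos(cos.clamp(-1 + 1e-7, 1 - 1e-7))            # angles in radians

x = torch.randn(batch, 512)                                      # a batch of face-image features (toy data)
with torch.no_grad():
    target_angles = angles(teacher(x), centers)

optimizer = torch.optim.SGD(student.parameters(), lr=0.1)
for step in range(100):
    loss = F.mse_loss(angles(student(x), centers), target_angles)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
print("final angle-matching loss:", float(loss))
```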
2028 Heuristic Set-Covering-Based Postprocessing for Improving the Quine-McCluskey Method
Authors: Miloš Šeda
Abstract:
Finding minimal logical functions has important applications in the design of logical circuits. This task is solved by many different methods, but frequently they are not suitable for a computer implementation. We briefly summarise the well-known Quine-McCluskey method, which gives a unique computational procedure and thus can be simply implemented, but which, even for simple examples, does not guarantee an optimal solution. Since the Petrick extension of the Quine-McCluskey method does not give a generally usable method for finding an optimum for logical functions with a high number of values, we focus on the interpretation of the result of the Quine-McCluskey method and show that it represents a set covering problem that, unfortunately, is an NP-hard combinatorial problem. Therefore, it must be solved by heuristic or approximation methods. We propose an approach based on genetic algorithms and show suitable parameter settings.
Keywords: Boolean algebra, Karnaugh map, Quine-McCluskey method, set covering problem, genetic algorithm.
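A toy genetic-algorithm sketch of the covering step: each chromosome is a 0/1 vector marking which prime implicants are kept, and the fitness rewards covered minterms while penalizing the implicant count. The cover table and GA parameters are invented placeholders, not the paper's settings.

```python
# Hypothetical sketch: a tiny genetic algorithm for the set covering step.
import numpy as np

# cover[i][j] = 1 if prime implicant i covers minterm j (toy Quine-McCluskey output).
cover = np.array([[1, 1, 0, 0, 0],
                  [0, 1, 1, 0, 0],
                  [0, 0, 1, 1, 0],
                  [0, 0, 0, 1, 1],
                  [1, 0, 0, 0, 1]])
n_impl, n_minterms = cover.shape

def fitness(chrom):
    covered = (cover[chrom.astype(bool)].sum(axis=0) > 0).sum()
    return covered * 10 - chrom.sum()            # reward coverage, penalize implicant count

rng = np.random.default_rng(0)
pop = rng.integers(0, 2, (30, n_impl))
for _ in range(200):
    scores = np.array([fitness(c) for c in pop])
    parents = pop[np.argsort(-scores)[:10]]                     # truncation selection
    children = []
    while len(children) < len(pop):
        a, b = parents[rng.choice(len(parents), 2, replace=False)]
        cut = rng.integers(1, n_impl)                           # one-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        flip = rng.random(n_impl) < 0.05                        # bit-flip mutation
        children.append(np.where(flip, 1 - child, child))
    pop = np.array(children)

best = pop[np.argmax([fitness(c) for c in pop])]
print("selected implicants:", np.flatnonzero(best))
```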
2027 Inter-frame Collusion Attack in SS-N Video Watermarking System
Authors: Yaser Mohammad Taheri, Alireza Zolghadr-Asli, Mehran Yazdi
Abstract:
Video watermarking is usually considered as watermarking of a set of still images. In the frame-by-frame watermarking approach, each video frame is seen as a single watermarked image, so the collusion attack is more critical in video watermarking. If the same or a redundant watermark is used for embedding in every frame of the video, the watermark can be estimated and then removed by the watermark estimate remodulation (WER) attack. Also, if uncorrelated watermarks are used for every frame, these watermarks can be washed out with frame temporal filtering (FTF). The switching watermark system, or so-called SS-N system, has better performance against WER and FTF attacks. In this system, for each frame, the watermark is randomly picked from a finite pool of watermark patterns. At first, the SS-N system is surveyed, and then a new collusion attack for the SS-N system is proposed, using a new algorithm for separating video frames based on their watermark patterns. In this way, N sets are built, each containing frames carrying the same watermark. After that, using the WER attack in every set, N different watermark patterns can be estimated and removed later.
Keywords: Watermark estimation remodulation (WER), frame temporal filtering (FTF), switching watermark system.
2026 Stochastic Simulation of Reaction-Diffusion Systems
Authors: Paola Lecca, Lorenzo Dematte
Abstract:
Reaction-diffusion systems are mathematical models that describe how the concentration of one or more substances distributed in space changes under the influence of local chemical reactions, in which the substances are converted into each other, and diffusion, which causes the substances to spread out in space. The classical representation of a reaction-diffusion system is given by semi-linear parabolic partial differential equations, whose general form is ∂tX(x, t) = DΔX(x, t), where X(x, t) is the state vector, D is the matrix of the diffusion coefficients and Δ is the Laplace operator. If the solutes move in a homogeneous system in thermal equilibrium, the diffusion coefficients are constants that do not depend on the local concentration of solvent and solutes or on the local temperature of the medium. In this paper a new stochastic reaction-diffusion model is presented, in which the diffusion coefficients are a function of the local concentration, viscosity and frictional forces of solvent and solute. Such a model provides a more realistic description of the molecular kinetics in non-homogeneous and highly structured media such as the intra- and inter-cellular spaces. The movement of a molecule A from a region i to a region j of the space is described as a first-order reaction Ai → Aj with rate constant k, where k depends on the diffusion coefficient. Representing the diffusional motion as a chemical reaction allows a reaction-diffusion system to be assimilated to a pure reaction system and simulated with Gillespie-inspired stochastic simulation algorithms. The stochastic time evolution of the system is given by the occurrence of diffusion events and chemical reaction events. At each time step an event (reaction or diffusion) is selected from a probability distribution of waiting times determined by the specific speed of the reaction and diffusion events. Redi is the software tool developed to implement this model of reaction-diffusion kinetics and dynamics. It is free software that can be downloaded from http://www.cosbi.eu. To demonstrate the validity of the new reaction-diffusion model, the simulation results of chaperone-assisted protein folding in the cytoplasm obtained with Redi are reported. This case study is redrawing the attention of the scientific community due to current interest in protein aggregation as a potential cause of neurodegenerative diseases.
Keywords: Reaction-diffusion systems, Fick's law, stochastic simulation algorithm.
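A minimal Gillespie-style sketch of the idea of treating diffusion between two compartments as a first-order "reaction" A1 → A2 alongside an ordinary bimolecular reaction. The rate constants and initial counts are toy assumptions, and, unlike the paper's model, the diffusion rate here is kept constant rather than concentration- and viscosity-dependent.

```python
# Hypothetical sketch of a Gillespie-style simulation with diffusion as a first-order event.
import numpy as np

rng = np.random.default_rng(0)
# State vector of molecule counts: [A1, B1, C1, A2, B2, C2]
state = np.array([100, 80, 0, 20, 60, 0])
k_reac, k_diff = 0.002, 0.05

def propensities(s):
    return np.array([
        k_reac * s[0] * s[1],   # A1 + B1 -> C1
        k_reac * s[3] * s[4],   # A2 + B2 -> C2
        k_diff * s[0],          # A1 -> A2   (diffusion treated as a first-order reaction)
        k_diff * s[3],          # A2 -> A1
    ])

changes = np.array([
    [-1, -1, 1,  0,  0, 0],
    [ 0,  0, 0, -1, -1, 1],
    [-1,  0, 0,  1,  0, 0],
    [ 1,  0, 0, -1,  0, 0],
])

t, t_end = 0.0, 50.0
while t < t_end:
    a = propensities(state)
    a_total = a.sum()
    if a_total == 0:
        break
    t += rng.exponential(1.0 / a_total)            # waiting time to the next event
    event = rng.choice(len(a), p=a / a_total)      # pick a reaction or diffusion event
    state += changes[event]
print("final state [A1,B1,C1,A2,B2,C2]:", state)
```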
2025 Aircraft Selection Using Preference Optimization Programming (POP)
Authors: C. Ardil
Abstract:
A multiple-criteria decision support system is proposed for the best aircraft selection decision. Various strategic, economic, environmental, and risk-related factors can directly or indirectly influence this choice, and they should be taken into account in the decision-making process. The paper suggests a multiple-criteria analysis to aid the airline management's decision-making process when choosing an appropriate aircraft. In the suggested approach, an integrated entropic preference optimization programming (POP) method for fleet modeling risk analysis is applied. The findings of the multiple-criteria analysis indicate that the A321(neo) aircraft type is the best alternative in this particular optimization instance. The proposed methodology can be applied to other complex engineering problems involving multiple-criteria analysis.
Keywords: Aircraft selection, decision making, multiple criteria decision making, preference optimization programming, POP, entropic weight method, TOPSIS, WSM, WPM.
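A sketch of the entropy weighting step that the keywords refer to: normalize the decision matrix, compute each criterion's entropy, and derive weights from the dispersion (1 − entropy), here followed by a simple weighted-sum score. The aircraft alternatives and criterion values below are made-up placeholders, not the paper's data, so the resulting ranking is purely illustrative.

```python
# Hypothetical sketch of the entropy weight method on a made-up decision matrix
# (rows = aircraft alternatives, columns = criteria).
import numpy as np

alternatives = ["A321neo", "B737-8", "A220-300"]
X = np.array([[3500.0, 244, 18.5],        # placeholder range (nm), seats, fuel burn per seat
              [3550.0, 210, 19.2],
              [3400.0, 160, 17.9]])

P = X / X.sum(axis=0)                                     # column-normalized proportions
k = 1.0 / np.log(X.shape[0])
entropy = -k * (P * np.log(P)).sum(axis=0)                # entropy of each criterion
weights = (1 - entropy) / (1 - entropy).sum()             # more dispersion -> larger weight
print(dict(zip(["range", "seats", "fuel burn"], weights.round(3))))

# A simple weighted-sum score (WSM) with the entropy weights; fuel burn is a cost criterion.
norm = X / X.max(axis=0)
norm[:, 2] = X[:, 2].min() / X[:, 2]                      # invert the cost criterion
scores = norm @ weights
print(dict(zip(alternatives, scores.round(3))))
```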
2024 Optimizing Approach for Sifting Process to Solve a Common Type of Empirical Mode Decomposition Mode Mixing
Authors: Saad Al-Baddai, Karema Al-Subari, Elmar Lang, Bernd Ludwig
Abstract:
Empirical mode decomposition (EMD), a new data-driven method of time-series decomposition, has the advantage of not assuming that a time series is linear or stationary, as is implicitly assumed in Fourier decomposition. However, EMD suffers from the mode mixing problem in some cases. The aim of this paper is to present a solution for a common type of signal that causes the EMD mode mixing problem, namely a signal suffering from intermittency. On an artificial example, the solution shows superior performance in coping with the EMD mode mixing problem compared with the conventional EMD and the Ensemble Empirical Mode Decomposition (EEMD). Furthermore, the over-sifting problem is also completely avoided, and the computation load is reduced roughly six times compared with EEMD at an ensemble number of 50.
Keywords: Empirical mode decomposition, mode mixing, sifting process, over-sifting.
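A sketch of one sifting iteration as it is typically implemented: locate local extrema, interpolate the upper and lower envelopes with cubic splines, and subtract their mean. The test signal and the single-pass stopping are illustrative, and the paper's fix for intermittency-induced mode mixing is not reproduced here.

```python
# Hypothetical sketch of a single EMD sifting iteration.
import numpy as np
from scipy.signal import argrelextrema
from scipy.interpolate import CubicSpline

t = np.linspace(0, 1, 1000)
signal = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)

def sift_once(x, t):
    maxima = argrelextrema(x, np.greater)[0]
    minima = argrelextrema(x, np.less)[0]
    upper = CubicSpline(t[maxima], x[maxima])(t)     # upper envelope through the maxima
    lower = CubicSpline(t[minima], x[minima])(t)     # lower envelope through the minima
    return x - (upper + lower) / 2                   # candidate IMF after one sifting pass

h = sift_once(signal, t)
print("mean of envelope-corrected candidate IMF:", round(float(np.mean(h)), 5))
```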
2023 Faults Forecasting System
Authors: Hanaa E. Sayed, Hossam A. Gabbar, Shigeji Miyazaki
Abstract:
This paper presents a Faults Forecasting System (FFS) that utilizes statistical forecasting techniques in analyzing process variable data in order to forecast fault occurrences. FFS proposes a new idea in detecting faults. Current techniques used in fault detection are based on analyzing the current status of the system variables in order to check whether the current status is faulty or not. FFS uses forecasting techniques to predict the future timing of faults before they happen. The proposed model applies a subset modeling strategy and a Bayesian approach in order to decrease the dimensionality of the process variables and improve the fault forecasting accuracy. A practical experiment, designed and implemented at Okayama University, Japan, is reported, and the comparison shows that our proposed model achieves high forecasting accuracy ahead of fault occurrence.
Keywords: Bayesian techniques, fault detection, forecasting techniques, multivariate analysis.
2022 Performance Analysis of Flooding Attack Prevention Algorithm in MANETs
Authors: Revathi Venkataraman, M. Pushpalatha, T. Rama Rao
Abstract:
The lack of any centralized infrastructure in mobile ad hoc networks (MANETs) is one of the greatest security concerns in the deployment of wireless networks. Thus, communication in a MANET functions properly only if the participating nodes cooperate in routing without any malicious intention. However, some of the nodes may be malicious in their behavior, by indulging in flooding attacks on their neighbors. Others may act maliciously by launching active security attacks like denial of service. This paper addresses related work done on trust evaluation and establishment in ad hoc networks. Related work on flooding attack prevention is reviewed. A new trust approach based on the extent of friendship between the nodes is proposed, which makes the nodes cooperate and prevents flooding attacks in an ad hoc environment. The performance of the trust algorithm is tested in an ad hoc network implementing the Ad hoc On-demand Distance Vector (AODV) protocol.
Keywords: AODV, flooding, MANETs, trust estimation.
2021 Surface Roughness Prediction Model for Grinding of Composite Laminate Using Factorial Design
Authors: P. Chockalingam, C. K. Kok, T. R. Vijayaram
Abstract:
Glass fiber reinforced polymer (GFRP) laminates have been widely used because of their unique mechanical and physical properties, such as high specific strength, stiffness and corrosion resistance. Accordingly, the demand for precise grinding of composites has been increasing enormously. Grinding is one of the obligatory methods for fabricating products with composite materials, and it is usually the final operation in the assembly of structural laminates. In this experimental study, an attempt has been made to develop an empirical model to predict the surface roughness of ground GFRP composite laminate with respect to the influencing grinding parameters, using the factorial design approach of design of experiments (DOE). The significance of the grinding parameters and their three-factor interaction effects on the grinding of GFRP composite have been analyzed in detail. An empirical equation has been developed to attain minimum surface roughness in GFRP laminate grinding.
Keywords: GFRP Laminates, Grinding, Surface Roughness, Factorial Design.
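A sketch of how a 2^3 full factorial surface-roughness model (main effects plus two- and three-factor interactions) can be fitted by least squares from coded factor levels. The factor names (wheel speed, feed, depth of cut) and the response values are invented placeholders, not the experimental data of the paper.

```python
# Hypothetical sketch: fitting a 2^3 full factorial roughness model by least squares.
import numpy as np
from itertools import product

# Coded levels (-1, +1) for three assumed grinding factors: wheel speed, feed, depth of cut.
runs = np.array(list(product([-1, 1], repeat=3)), dtype=float)
Ra = np.array([1.8, 2.1, 2.4, 2.9, 1.6, 1.9, 2.2, 2.6])   # placeholder roughness values (um)

A, B, C = runs.T
design = np.column_stack([np.ones(8), A, B, C, A*B, A*C, B*C, A*B*C])
coef, *_ = np.linalg.lstsq(design, Ra, rcond=None)

terms = ["1", "A", "B", "C", "AB", "AC", "BC", "ABC"]
model = " + ".join(f"{c:+.3f}*{t}" for c, t in zip(coef, terms))
print("Ra =", model)
```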
2020 A Worst Case Estimation of the Inspection Rate by a Berthing Policy in a Container Terminal
Authors: K.H. Yang
Abstract:
After the terrorist attacks of September 11, 2001 in the U.S., the container security issue received high attention, especially from the U.S. government, which deployed many measures to promote or improve security systems. The U.S. government not only enhances its national security system but also allies with other countries against potential terrorist attacks in the future. For example, the Container Security Initiative (CSI) encourages foreign ports outside the U.S. to become CSI ports as a part of the U.S. anti-terrorism network. Although promotion of security can partly reach the goal of anti-terrorism, it influences the efficiency of the container supply chain, which is the main concern when implementing inspection measures. This paper proposes a quick estimation methodology for an inspection service rate under a berth allocation heuristic, such that the inspection activities will not affect the original container supply chain. Theoretical and simulation results show this approach is effective.
Keywords: Berth allocation, container, heuristic, inspection.
2019 Application of Gamma Frailty Model in Survival of Liver Cirrhosis Patients
Authors: Elnaz Saeedi, Jamileh Abolaghasemi, Mohsen Nasiri Tousi, Saeedeh Khosravi
Abstract:
Goals and Objectives: A typical analysis of survival data involves the modeling of time-to-event data, such as the time till death. A frailty model is a random effect model for time-to-event data, where the random effect has a multiplicative influence on the baseline hazard function. This article aims to investigate the use of the gamma frailty model with concomitant variables in order to individualize the prognostic factors that influence the survival times of liver cirrhosis patients. Methods: During the one-year study period (May 2008-May 2009), data were taken from the recorded information of patients with liver cirrhosis who were scheduled for liver transplantation and were followed up for at least seven years in Imam Khomeini Hospital in Iran. In order to determine the effective factors for cirrhotic patients' survival in the presence of latent variables, the gamma frailty distribution was applied. Parametric models, such as the Exponential and Weibull distributions, were considered for the survival time. Data analysis was performed using R software, and an error level of 0.05 was considered for all tests. Results: 305 patients with liver cirrhosis, including 180 (59%) men and 125 (41%) women, were studied. The average age of the patients was 39.8 years. At the end of the study, 82 (26%) patients had died, among them 48 (58%) men and 34 (42%) women. The main cause of liver cirrhosis was found to be hepatitis B (23%), followed by cryptogenic cirrhosis (22.6%) as the second factor. Overall, the 7-year survival was 28.44 months; for deceased patients and for censored patients it was 19.33 and 31.79 months, respectively. Using multi-parametric survival models, Exponential and Weibull models with the gamma frailty distribution were fitted to the cirrhosis data. In both models, factors including age, serum bilirubin, serum albumin, and encephalopathy had a significant effect on the survival time of cirrhotic patients. Conclusion: To investigate the effective factors for the time of death of patients with liver cirrhosis in the presence of latent variables, the gamma frailty model with parametric distributions seems desirable.
Keywords: Frailty model, latent variables, liver cirrhosis, parametric distribution.
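A numerical sketch of the standard gamma-frailty construction with a Weibull baseline: for a gamma frailty with mean 1 and variance θ, the marginal survival is S(t) = (1 + θ·H0(t)·exp(xβ))^(−1/θ). The parameter values, the covariates, and the coefficients below are illustrative assumptions, not the estimates fitted to the cirrhosis data.

```python
# Hypothetical sketch of the marginal survival function under a gamma frailty with a
# Weibull baseline hazard: H0(t) = (lambda * t) ** k.
import numpy as np

def marginal_survival(t, lam=0.02, k=1.3, theta=0.5, x=None, beta=None):
    H0 = (lam * t) ** k                              # cumulative Weibull baseline hazard
    lp = 0.0 if x is None else np.dot(x, beta)       # linear predictor of the covariates
    return (1.0 + theta * H0 * np.exp(lp)) ** (-1.0 / theta)

months = np.array([12, 36, 60, 84])
x = np.array([45.0, 2.5])                            # e.g. age (years), serum bilirubin (mg/dL)
beta = np.array([0.02, 0.30])
print(dict(zip(months.tolist(), marginal_survival(months, x=x, beta=beta).round(3))))
```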
2018 Optimal Sizing of a Hybrid Wind/PV Plant Considering Reliability Indices
Authors: S. Dehghan, B. Kiani, A. Kazemi, A. Parizad
Abstract:
The utilization of renewable energy sources in electric power systems is increasing quickly because of public apprehension about unpleasant environmental impacts and the increase in energy costs involved with the use of conventional energy sources. Although the application of these energy sources can considerably diminish system fuel costs, they can also have a significant influence on system reliability. Therefore, an appropriate combination of the system reliability level and the capital investment cost of the system is vital. This paper presents a hybrid wind/photovoltaic plant with the aim of supplying the IEEE reliability test system load pattern, while the plant's capital investment cost is minimized by applying a hybrid particle swarm optimization (PSO) / harmony search (HS) approach and the system fulfills the appropriate level of reliability.
Keywords: Distributed generation, fuel cell, HS, hybrid power plant, PSO, photovoltaic, reliability.
2017 Design Approach to Incorporate Unique Performance Characteristics of Special Concrete
Authors: Devendra Kumar Pandey, Debabrata Chakraborty
Abstract:
The advancement of various concrete ingredients, such as plasticizers, additives and fibers, has enabled concrete technologists to develop many viable varieties of special concrete in recent decades. These varieties offer significant enhancement in the green as well as the hardened properties of concrete. A prudent selection of the appropriate type of concrete can resolve many design and application issues in construction projects. This paper focuses on the usage of self-compacting concrete, high early strength concrete, structural lightweight concrete, fiber reinforced concrete, high performance concrete and ultra-high strength concrete in structures. The modified properties of strength at various ages, flowability, porosity, equilibrium density, flexural strength, elasticity, permeability, etc. need to be carefully studied and incorporated into the design of the structures. The paper demonstrates various mixture combinations and the concrete properties that can be leveraged. The selection of such products based on the end use of structures has been proposed in order to efficiently utilize the modified characteristics of these concrete varieties. The study involves mapping the characteristics to benefits and savings for the structure from a design perspective. Self-compacting concrete in the structure is characterized by high shuttering loads, better finish, and the feasibility of closer reinforcement spacing. The structural design procedures can be modified to specify higher formwork strength and height of vertical members, cover reduction and increased ductility. The transverse reinforcement can be spaced at closer intervals compared to regular structural concrete. Structural lightweight concrete allows structures to be designed for reduced dead load and increased insulation properties. Member dimensions and steel requirements can be reduced in proportion to the roughly 25 to 35 percent reduction in the dead load due to the self-weight of concrete. Steel fiber reinforced concrete can be used to design grade slabs without primary reinforcement because of its 70 to 100 percent higher tensile strength. The design procedures incorporate reduction in thickness and joint spacing. High performance concrete increases the life of structures through improvement in paste characteristics and durability by incorporating supplementary cementitious materials. Often, these concretes are also designed for slower heat generation in the initial phase of hydration. The structural designer can incorporate the slow development of strength in the design and specify a 56- or 90-day strength requirement. For designing high-rise building structures, the creep and elasticity properties of such concrete also need to be considered. Lastly, certain structures require performance under loading conditions much earlier than the final maturity of concrete. High early strength concrete has been designed to cater to a variety of usages at ages as early as 8 to 12 hours. Therefore, an understanding of concrete performance specifications for special concrete is a definite door towards a superior structural design approach.
Keywords: High performance concrete, special concrete, structural design, structural lightweight concrete.
2016 The Analogue of a Property of Pisot Numbers in Fields of Formal Power Series
Authors: Wiem Gadri
Abstract:
This study delves into the intriguing properties of Pisot and Salem numbers within the framework of formal Laurent series over finite fields, a domain where these numbers’ spectral characteristics, Λm(β) and lm(β), have yet to be fully explored. Utilizing a methodological approach that combines algebraic number theory with the analysis of power series, we extend the foundational work of Erdos, Joo, and Komornik to this setting. Our research uncovers bounds for lm(β), revealing how these depend on the degree of the minimal polynomial of β and thus offering a characterization of Pisot and Salem formal power series. The findings significantly contribute to our understanding of these numbers, highlighting their distribution and properties in the context of formal power series. This investigation not only bridges number theory with formal power series analysis but also sets the stage for further interdisciplinary research in these areas.
Keywords: Pisot numbers, Salem numbers, Formal power series, Minimal polynomial degree.
2015 Analysis of Network Performance Using Aspect of Quantum Cryptography
Authors: Nisarg A. Patel, Hiren B. Patel
Abstract:
Quantum cryptography is described as a point-to-point secure key generation technology that has emerged in recent times to provide absolute security. Researchers have started studying new innovative approaches to exploit the security of Quantum Key Distribution (QKD) for large-scale communication systems. A number of approaches and models for the utilization of QKD for secure communication have been developed. The uncertainty principle in quantum mechanics created a new paradigm for QKD. One of the approaches for the use of QKD involved network-fashioned security. The main goal was a point-to-point quantum network that exploited QKD technology for end-to-end network security via high-speed QKD. Other approaches and models equipped with QKD in a network fashion have been introduced in the literature. A different approach, which this paper deals with, is using QKD in existing protocols that are widely used on the Internet to enhance security, with the main objective of unconditional security. Our work is towards the analysis of QKD in mobile ad hoc networks (MANETs).
Keywords: QKD, cryptography, quantum cryptography, network performance.
2014 An in Silico Approach for Prioritizing Drug Targets in Metabolic Pathway of Mycobacterium Tuberculosis
Authors: Baharak Khoshkholgh-Sima, Soroush Sardari, Jalal Izadi Mobarakeh, Ramezan Ali Khavari-Nejad
Abstract:
There is an urgent need to develop novel Mycobacterium tuberculosis (Mtb) drugs that are active against drug-resistant bacteria but, more importantly, kill persistent bacteria. Our study is structured around an integrated analysis of metabolic pathways, small-molecule screening and similarity search in the PubChem database. Metabolic analysis approaches based on unified weighting were used for potent target selection. Our results suggest pantothenate synthetase (panC) and 3-methyl-2-oxobutanoate hydroxymethyltransferase (panB) as appropriate drug targets. In our study, we used pantothenate synthetase because of the existence of known inhibitors. We report the discovery of new antitubercular compounds through ligand-based approaches using computational tools.
Keywords: In silico, ligand-based virtual screening, metabolic pathways, Mycobacterium tuberculosis.
2013 Numerical Modeling of Natural Convection on Various Configuration of Rectangular Fin Arrays on Vertical Base Plates
Authors: H. R. Goshayeshi, M. Fahiminia, M. M. Naserian
Abstract:
In this research, the laminar heat transfer of natural convection on vertical surfaces has been investigated. Most studies on natural convection have considered steady-state conditions, in which the velocity and temperature fields do not change with time, whereas transient analyses are also used a lot. The governing equations are solved using a finite volume approach. The convective terms are discretized using the power-law scheme, whereas for the diffusive terms the central difference is employed. Coupling between velocity and pressure is made with the SIMPLE algorithm. The resulting system of discretized linear algebraic equations is solved with an alternating direction implicit scheme. Then a configuration of rectangular fins is placed in different ways on the surface, the natural convection heat transfer on these surfaces without sliding is studied, and finally an optimization is carried out.
Keywords: Natural convection, vertical surfaces, SIMPLE algorithm, Rectangular fins.
2012 Entropy based Expeditive Methodology for Rating Curves Assessment
Authors: D. Mirauda, M. Greco, P. Moscarelli
Abstract:
River flow forecasting represents a crucial point for improving management policies addressed to the correct use of water resources, as well as for conjugating prevention and defense actions against environmental degradation. The difficulties occurring during field activities encourage the development and implementation of operative computation and measuring methods aimed at reducing the time for data acquisition and processing while maintaining a good level of accuracy. Therefore, the aim of the present work is to test a new entropy-based expeditive methodology for the evaluation of rating curves on three gauged sections with different geometric and morphological characteristics. The methodology requires the choice of only three verticals along the measurement section and the sampling of only the maximum velocity. The results underline how, in most conditions, the rating curves drawn can replace those built with classic methodologies, thus simplifying the procedures of data monitoring and calculation.
Keywords: gauged station, entropic approach, expeditive methodology, rating curves.
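A sketch of the entropic relation typically used in such expeditive approaches (Chiu's velocity-entropy framework), in which the cross-sectional mean velocity is obtained from the sampled maximum velocity through Φ(M) = e^M/(e^M − 1) − 1/M. The entropy parameter M, the maximum velocities, and the flow areas below are invented values, not the gauged-section data of the paper.

```python
# Hypothetical sketch of the entropic mean-velocity relation u_mean = Phi(M) * u_max.
import numpy as np

def phi(M):
    return np.exp(M) / (np.exp(M) - 1.0) - 1.0 / M

M = 2.1                                        # entropy parameter of the (assumed) section
u_max = np.array([0.8, 1.4, 2.3])              # sampled maximum velocities (m/s)
area = np.array([12.0, 15.5, 21.0])            # flow areas at the same stages (m^2)

discharge = phi(M) * u_max * area              # Q = u_mean * A for each stage
for q, u in zip(discharge, u_max):
    print(f"u_max = {u:.1f} m/s  ->  Q = {q:.2f} m^3/s")
```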
2011 Words Reordering based on Statistical Language Model
Authors: Theologos Athanaselis, Stelios Bakamidis, Ioannis Dologlou
Abstract:
There are multiple reasons to expect that detecting word order errors in a text will be a difficult problem, and detection rates reported in the literature are in fact low. Although grammatical rules constructed by computational linguists improve the performance of grammar checkers in word order diagnosis, the repair task is still very difficult. This paper presents an approach for repairing word order errors in English text by reordering the words in a sentence and choosing the version that maximizes the number of trigram hits according to a language model. The novelty of this method concerns the use of an efficient confusion matrix technique for reordering the words. The comparative advantage of this method is that it works with a large set of words and avoids the laborious and costly process of collecting word order errors for creating error patterns.
Keywords: Permutations filtering, statistical language model, N-grams, word order errors.
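A sketch of the scoring idea: enumerate reorderings of a sentence and keep the permutation with the most trigram hits. The tiny trigram table stands in for a real statistical language model, and the paper's confusion-matrix filtering of permutations (its efficiency device) is not reproduced.

```python
# Hypothetical sketch: keep the word permutation with the most trigram hits.
from itertools import permutations

trigram_counts = {               # toy trigram "language model"
    ("<s>", "the", "cat"): 12,
    ("the", "cat", "sat"): 9,
    ("cat", "sat", "on"): 8,
    ("sat", "on", "the"): 10,
    ("on", "the", "mat"): 7,
    ("the", "mat", "</s>"): 6,
}

def trigram_hits(words):
    padded = ["<s>"] + list(words) + ["</s>"]
    return sum((a, b, c) in trigram_counts
               for a, b, c in zip(padded, padded[1:], padded[2:]))

sentence = ["sat", "the", "cat", "on", "the", "mat"]       # word-order error: "sat the cat ..."
best = max(permutations(sentence), key=trigram_hits)
print(" ".join(best), "| hits:", trigram_hits(best))
```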
2010 Review of the Characteristics of Mahan Garden: One Type of Persian Gardens
Authors: Ladan Tajaddini
Abstract:
Iranians' imagination of heaven, which is the reward of a person's good deeds during their life, has shown itself in pleasant and green gardens, where earthly gardens were made as representations of paradise. Iranians are also quite interested in making earthly gardens and plantations around their buildings. With Iran's hot and dry climate and the lack of sufficient water for plant coverage, it becomes noticeable how important the art of making gardens is to Iranians. This study, with regard to examples, documents and library studies, investigates the characteristics of Persian gardens. The result shows that elements such as soil, water, plants and layout have been used in forming a unique style of Persian gardens. Bagh-e Shah Zadeh Mahan (Mahan prince garden) is a typical example and has been carefully studied. In this paper, I try to investigate and evaluate the characteristics of a Persian garden by means of a descriptive approach.
Keywords: Environmental planning, Persian garden, landscape, Shah Zadeh garden, soil and water, gardening.
2009 H∞ Takagi-Sugeno Fuzzy State-Derivative Feedback Control Design for Nonlinear Dynamic Systems
Authors: N. Kaewpraek, W. Assawinchaichote
Abstract:
This paper considers an H∞ TS fuzzy state-derivative feedback controller for a class of nonlinear dynamical systems. A Takagi-Sugeno (TS) fuzzy model is used to approximate a class of nonlinear dynamical systems. Then, based on a linear matrix inequality (LMI) approach, we design an H∞ TS fuzzy state-derivative feedback control law which guarantees that the L2-gain of the mapping from the exogenous input noise to the regulated output is less than or equal to a prescribed value. We derive a sufficient condition such that the system with the fuzzy controller is asymptotically stable and the H∞ performance is satisfied. Finally, a numerical example is provided and simulated to illustrate the stability and effectiveness of the proposed controller.
Keywords: H∞ fuzzy control, LMI, Takagi-Sugeno (TS) fuzzy model, nonlinear dynamic systems, state-derivative feedback.
2008 Attention Multiple Instance Learning for Cancer Tissue Classification in Digital Histopathology Images
Authors: Afaf Alharbi, Qianni Zhang
Abstract:
The identification of malignant tissue in histopathological slides holds significant importance in both clinical settings and pathology research. This paper presents a methodology aimed at automatically categorizing cancerous tissue through the utilization of a multiple instance learning framework. This framework is specifically developed to acquire knowledge of the Bernoulli distribution of the bag label probability by employing neural networks. Furthermore, we put forward a neural network-based permutation-invariant aggregation operator, equivalent to attention mechanisms, which is applied to the multi-instance learning network. Through empirical evaluation on an openly available colon cancer histopathology dataset, we provide evidence that our approach surpasses various conventional deep learning methods.
Keywords: Attention Multiple Instance Learning, Multiple Instance Learning, transfer learning, histopathological slides, cancer tissue classification.
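A PyTorch sketch of the attention-based, permutation-invariant pooling described above: attention weights are produced by a small network, the instance embeddings are pooled with those weights, and the bag-level Bernoulli probability is predicted from the pooled embedding. The embedding sizes and the toy bag are assumptions, and the histopathology feature extractor is omitted.

```python
# Hypothetical sketch of attention-based multiple instance learning pooling.
import torch
import torch.nn as nn

class AttentionMIL(nn.Module):
    def __init__(self, in_dim=512, hid_dim=128):
        super().__init__()
        self.attention = nn.Sequential(nn.Linear(in_dim, hid_dim), nn.Tanh(),
                                       nn.Linear(hid_dim, 1))
        self.classifier = nn.Linear(in_dim, 1)

    def forward(self, bag):                              # bag: (n_instances, in_dim)
        a = torch.softmax(self.attention(bag), dim=0)    # attention weight per instance
        pooled = (a * bag).sum(dim=0)                    # permutation-invariant pooling
        p = torch.sigmoid(self.classifier(pooled))       # Bernoulli bag-label probability
        return p, a.squeeze(-1)

model = AttentionMIL()
bag = torch.randn(40, 512)                               # 40 tile embeddings from one slide (toy data)
prob, weights = model(bag)
print("P(bag is cancerous) =", float(prob), "| top instance:", int(weights.argmax()))
```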